VituixCAD

Two different network-related things may be involved. The program checks for updates when it is started. Searching for a non-existent web connection may take half a minute, and then the program starts. This check can be switched off in the Options window, but that's not recommended.
The project list in File->Recent is cleaned automatically if a project file (vxp) no longer exists. That may also take a while if the file is on a server which is not accessible. I've never tested that, but the program should start - eventually.
 
Kill the VituixCAD process or reboot the PC.
Deleting the user settings file could help. The path for the latest version is:
c:\Users\username\AppData\Local\Kimmo_Saunisto\VituixCAD2.exe_Url_...\2.0.85.5\user.config
Each old version has its own directory and file. You can delete them all.
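If there are many of them, a small helper like this shows where they live (just a sketch, not part of VituixCAD; it only lists the folders so you can delete them by hand):

```python
# Minimal sketch (not VituixCAD code): list per-version user.config folders
# under the local application data directory for manual review/deletion.
import os
from pathlib import Path

base = Path(os.environ["LOCALAPPDATA"]) / "Kimmo_Saunisto"
for cfg in sorted(base.glob("*/*/user.config")):
    print(cfg.parent)        # one folder per installed VituixCAD version
```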

After that, the directory structure recommended in the user manual should be followed without excuses or exceptions. It really helps:
[Attachment: recommended directory structure]
 
Hi Kimmo,
On a related note, from my own experience I have found that with any project files that have been sent to me from other users, 100% of the time the above folder structure has not been followed, myself included. At the very least, I find that locating all files for each driver in a single folder for that driver is what most people will do, so near, far, merged and impedance files all end up in the same driver folder. The result, as you can expect, is a project that loads with missing-file errors, so files have to be loaded and reloaded.

I would like to suggest the possibility of containing all loaded project information within a single project file. That is, creation of a unified project file that contains everything without needing external data. Each driver within a project would include its frequency response data, impedance data, enclosure configuration and diffraction configuration. The project file would become much larger of course, but it would contain everything: the entirety of the project could be shared, and files on local or network-attached drives could be relocated without creating errors.

An alternative and less labour-intensive solution I have in mind is to use relative paths for file locations, so the project file contains .\merger and .\impedance for driver file locations instead of absolute locations on the local drive.
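A rough sketch of what that could look like (hypothetical helper code, not anything in VituixCAD today): store each driver file path relative to the folder that contains the project file, and resolve it back on load.

```python
# Hypothetical sketch: store a driver file reference relative to the project
# folder and resolve it again on load (paths below are invented examples).
import os

project_file  = "C:/Users/me/Documents/VituixCAD/Projects/MySpeaker/MySpeaker.vxp"
response_file = "C:/Users/me/Documents/VituixCAD/Projects/MySpeaker/Merger/W_merged.frd"

project_dir = os.path.dirname(project_file)
stored_ref  = os.path.relpath(response_file, start=project_dir)   # e.g. Merger/W_merged.frd
resolved    = os.path.join(project_dir, stored_ref)               # absolute again at load time
print(stored_ref, resolved)
```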
 
...100% of the time the above folder structure has not been followed, myself included. ... An alternative and less labour-intensive solution I have in mind is to use relative paths for file locations.
I have the same experience as well.

Relative paths would work more conveniently, but otherwise it would already help just to point to a certain folder instead of having to do every bit manually.
 
Sub-folder names are not important in that recommendation, so data for each driver can be separated into its own folder below projectname. The relevant thing is that all "project" files (vxp, vxm, vxe, vxb, vxf) which refer to separate response files are in the same folder as, or above, the response files. Then all file references remain relative, without rooted filenames, and the whole project can be archived or distributed e.g. to a different language region with different folder naming. Saving measurement files in a different folder than Documents\VituixCAD\Projects\projectname\... is the most common mistake. Measurement data (far field, near field and impedance) is typically project-related due to the box/baffle, and may have a project-related reference time, directivity processing etc.
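As a rough illustration of that rule (hypothetical helper, not VituixCAD code): a reference can only be stored without a rooted filename if the response file sits at or below the folder that holds the project file.

```python
# Sketch (invented example paths): check whether a referenced response file
# can be stored as a relative path without ".." or a drive root.
import os

def stays_relative(project_file: str, response_file: str) -> bool:
    try:
        rel = os.path.relpath(response_file, start=os.path.dirname(project_file))
    except ValueError:            # e.g. a different drive letter on Windows
        return False
    return not rel.startswith("..")

proj = "C:/Users/me/Documents/VituixCAD/Projects/MySpeaker/MySpeaker.vxp"
print(stays_relative(proj, "C:/Users/me/Documents/VituixCAD/Projects/MySpeaker/Near/W_near.frd"))  # True
print(stays_relative(proj, "D:/Measurements/W_near.frd"))                                          # False
```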

A few users have suggested "packing" response files into the vxp to improve working with cloud or removable drives. The xml could include them for sure, but there are a few issues. Not severe, but it would double, triple or more the size of everything, and it would not help with performance when e.g. delay in the Drivers tab is adjusted and the whole data is read into memory.
 
Here is my style of directory structure. That is one project with many driver options. All drivers have their own directory for far field - of course, because there are 37 off-axis responses per driver as raw and frd. Woofers and mids need merging so they have their own directories. Near-field and impedance files are easy to find and select from a common place.
[Attachment: example directory structure]
 
I guess the main difference is that a lot of people like to work per project folder and then have subfolders,

instead of having subfolders with bits of the project scattered all over them.

So in this example they would use [SB13PFCR25-4] as the main folder.
Within that folder there would be the [Far], [Impedance], [Merger], [Near] subfolders etc.

Or most of the time, they would have the speaker name as the main project folder, followed by the used drivers, followed by the measurements, merger, impedance folders etc.

Also, I have seen many people storing their main measurement files in an external location to keep them safe and organized (with a RAID setup for redundancy), instead of diving into the Documents folder of Windows.

Therefore I would not call saving measurement files in a different folder a "mistake"; it's a very common workflow.
 
Here is my style of directory structure. That is one project with many driver options. ...
Thanks for your responses. Reading through your reply sparked a couple of ideas that may help the situation.

The first option would be to have VituixCAD automatically create the subfolders with the project file, with all response information loaded into the project automatically copied to these subfolders if it doesn't already exist there. This sort of forces a common directory structure on the user, and at least makes it easy to zip up a project folder for anyone, with consistent data organization.

The second option would be something similar within the export function of VituixCAD: an "export project" option that would create a zip file with the project file along with driver frequency response and impedance data, correctly organized so that it can be shared with someone in its entirety without worrying about the specific file locations used when the project was created. This allows the user freedom of file organization, with the ability to export a common data-set format that is easily shared with others.
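A rough sketch of what such an export could do (hypothetical, not an existing VituixCAD feature): copy the project file plus every referenced response file into a normalized layout inside a zip.

```python
# Hypothetical "export project" sketch (assumed file list; not VituixCAD code):
# pack the project file and all referenced response/impedance files into a zip
# with a normalized internal layout so the archive can be loaded anywhere.
import zipfile
from pathlib import Path

def export_project(vxp_path: str, referenced_files: list[str], out_zip: str) -> None:
    vxp = Path(vxp_path)
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(vxp, arcname=vxp.name)                     # project file at archive root
        for f in map(Path, referenced_files):
            # flattened layout; a real tool would also handle name collisions
            z.write(f, arcname=f"Measurements/{f.name}")

# export_project("MySpeaker.vxp", ["Near/W_near.frd", "Impedance/W.zma"], "MySpeaker_export.zip")
```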
 
So in this example they would use [SB13PFCR25-4] as the main folder.
Within that folder there would be the [Far], [Impedance], [Merger], [Near] subfolders etc.
Yes, that is technically equivalent to the recommendation as long as the project files (vxp, ...) are in the project directory. I could even add this alternative to the user manual.
But I'm afraid it won't help, because the main problem is that "modern" users don't read. A user could be more willing to pay for a youtube video than to read ~57 A4 pages. My offer of a few thousand euros and no e-mail support has increased motivation though.
Therefore I would not call saving measurement files in a different folder a "mistake"; it's a very common workflow.
It's a mistake if it does not support distribution without damaging the project. User Documents is one of the main directories which is / should be backed up, and raw measurement files are project-related - as already mentioned - so there are no mandatory reasons to act differently.
 
...VituixCAD automatically create the subfolders with the project file, with all response information loaded into the project automatically copied to these subfolders if it doesn't already exist there.
...An "export project" option that would create a zip file with the project file along with driver frequency response and impedance data that is correctly organized...
Both require automatic manipulation of the project file or file locations, or duplicating data, or decisions about folder naming. The risks are probably small if the project folder contains just one project, but multiple project files change that.
At least my projects usually have multiple speaker project files (vxp) for different driver combinations, so multiple projects would compete in renaming and copying/moving the same files to (possibly) different locations. Also vxm, vxe, vxb and vxf projects have references to external files, so there could be quite a mess. For example, the project with the previous folder structure has 11 project files (without vxb and vxf).
[Attachment: project folder containing 11 project files]


Duplicating data inside the vxp is still the safest option, though it multiplies the number of response files by the number of (linked) project files. One option would be to use the same packing algorithm as with saving of cal/mic files. The error is quite small.
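As an illustration of what such packing could look like (a minimal sketch assuming an XML project format; the element and attribute names are invented, not VituixCAD's actual schema): response data embedded as base64 text inside the project file.

```python
# Hypothetical sketch of embedding a response file into an XML project
# (element/attribute names are invented; not VituixCAD's real schema).
import base64
import xml.etree.ElementTree as ET

def embed_response(project_root: ET.Element, frd_path: str) -> None:
    with open(frd_path, "rb") as f:
        payload = base64.b64encode(f.read()).decode("ascii")
    node = ET.SubElement(project_root, "EmbeddedFile", name=frd_path)
    node.text = payload      # base64 alone grows the stored data by ~33 %

root = ET.Element("Project")
# embed_response(root, "Merger/W_merged.frd")
print(ET.tostring(root, encoding="unicode"))
```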
 
But I'm afraid it won't help, because the main problem is that "modern" users don't read. ...
...It's a mistake if it does not support distribution without damaging the project. ...
I totally and fully understand your frustrations.

I would have to look up the references if you really want, but quite a lot of research has been done on reading manuals and such. The conclusion is always the same: the vast majority doesn't read.
So much so that even in some court cases a manual wasn't considered a viable argument. Goes to show how harsh the reality is.
 
Is anyone following this "discussion"? Time-aligned speakers - do they make sense? Any opinions or experiences to share about very low excess group delay or minimum-phase designs vs a normal IIR 3...5-way with 4th...8th-order slopes? Are old science and the referenced designers and professor(s) reliable sources in this matter, or do you think they have missed something, or that we are interpreting a truncated message with our own bias?
 
I don't follow the "discussions" at ASR, I tried once but there is not much DIY efforts there, and too many yes-men. I will rad through that thread a little later on, maybe there is some gems of information. In the meantime, I have heard your opinions in past pages of this thread on low frequency group delay and overall speaker step response, as well your competition over at Bodzio software I believe is in agreement, with a document about perfecting the "punch" of bass through linear phase.
https://bodziosoftware.com.au/Perfecting_Punch.pdf
 
When I read Floyd Toole's research I get the impression that he tested phase adjustment of the input signal as a whole, so I am not surprised that he couldn't hear a difference. Similar to simply swapping the polarity of the entire audio signal, it will sound the same. This is a fair bit different from the phase interaction of multiple audio sources given the same input signal (as in a multi-way speaker), which I believe is the "time alignment" or "time coherency" being discussed in the ASR thread.
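As a rough illustration of that distinction (my own sketch, not Toole's formulation): write the summed on-axis pressure of an $N$-way speaker as

$$P(\omega) = X(\omega)\sum_{i=1}^{N} H_i(\omega)\,e^{-j\omega\tau_i},$$

where $X$ is the input signal, $H_i$ the acoustic response of way $i$ and $\tau_i$ its arrival delay. A phase change or polarity flip applied to the whole input only multiplies $P(\omega)$ by a unit-magnitude factor, so $|P(\omega)|$ is untouched; shifting the terms inside the sum relative to each other changes the summation itself, most strongly around the crossover frequencies.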

Clicking through eventually you may be led to this AudioXpress article:
https://audioxpress.com/article/zero-phase-in-studio-monitors

This article provides a nice excerpt from Toole's book; unfortunately, the referenced figures are not made available, and I don't have the third edition of the book, only the first edition, "Sound Reproduction: Loudspeakers and Rooms". In any case, I think reading the AudioXpress article provides more insight into Floyd Toole's research on phase interaction than anything I read in that ASR thread.
 
^I have the 3rd edition of Toole's book, but the AudioXpress article contains nothing more than two copy-pasted chapters from it.

Generally this topic is wide and it's almost impossible to know exactly what detail others mean: minimum phase, linear phase, normal group delay, excess group delay, constant distance from acoustic centers to the ear, some minor phase variation, or what. Another secret in practice is how investigators have done their listening tests. Quite hollow words such as "there are different opinions" or "detection threshold is in the range 1.6 to 2 ms... these numbers are not exceeded by normal domestic and monitor loudspeakers" and generalized bad excuses such as "recordings are not pristine" or "room or off-axis lobing damages timing" exist too. For example, some of the best-known and respected hifi and monitor manufacturers have exceeded a GD of 2 ms already at low mid, so Toole's generalization is not true.
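As a back-of-the-envelope check of that last point (my own sketch; the 200 Hz crossover frequency is just an assumed example): the acoustic sum of a textbook LR4 crossover behaves as a 2nd-order allpass whose low-frequency group delay already lands above 2 ms when the crossover sits near 200 Hz.

```python
# Sketch: group delay of the LR4 LP+HP sum (a 2nd-order allpass, Q = 1/sqrt(2)).
import numpy as np
from scipy.signal import freqs

fc = 200.0                      # assumed crossover frequency (example only)
w0 = 2 * np.pi * fc
Q = 1 / np.sqrt(2)

# H(s) = (s^2 - (w0/Q)s + w0^2) / (s^2 + (w0/Q)s + w0^2)
b = [1.0, -w0 / Q, w0**2]
a = [1.0,  w0 / Q, w0**2]

f = np.logspace(1, 4, 2000)     # 10 Hz ... 10 kHz
w = 2 * np.pi * f
_, h = freqs(b, a, worN=w)

gd_ms = -np.gradient(np.unwrap(np.angle(h)), w) * 1e3   # group delay in ms
i20 = np.argmin(np.abs(f - 20.0))
print(f"group delay at 20 Hz: {gd_ms[i20]:.2f} ms")      # ~2.3 ms for fc = 200 Hz
```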
No one (else) is talking about reduced pressure due to energy distribution and the fundamental nature of the sound. It looks like researchers don't have their own experience, opinions, or a decent theory of how and why, for example, bad timing can be detected by a human being. They just select some music or generated impulses, listeners and gear - usually headphones (which is a bad choice) - run blind tests, read the statistics, write a paper and refer to others in the end. It's difficult to trust science that leaves huge blind spots, at least in the reader's mind, and gives systematically different results compared to one's own experience over the years.
Timing is not the most important feature for sure, and it's sensitive to program material, but hifi is not just picking the most significant and easily audible features for the average citizen and ruining the rest. It's perfection of everything possible - at least in my opinion.

Truth or not, I agree with member gnarly on ASR. Many others just copy-paste the same old truncated and obscure conclusions by authorities they trust blindly and deafly, and add some of their own biases, interpretations and insults.

I've been banned from ASR since January, which will probably continue until death. Hopefully the disagreeing community / donors can sleep well thinking that the disturbing kimmosto is out and that just spinorama and preference rating still matter. Let's see how long they keep using my program for free for community projects...
 
I believe ASR gnarly is mark100 on this forum, and he has been vocal on the subject for a long time here as well.

To comment on your question, without too much experience on the subject: experimenting with a 3-way mono prototype speaker and DSP IIR filters, I've been able to get a very different sensation of sound while the magnitude graphs are about the same in the simulator - very close - while there is some difference in group delay and impulse response, for example. In both cases the sound is fine; it's just that the other is much more exciting somehow, perhaps more tactile or something. I'd like to believe it is the thing you talk about. There are some posts on this earlier in the thread; I haven't had the energy to investigate it further yet. Also, never underestimate the power of personal bias! :D

Without much experience that would create some kind of bias either way, here are some thoughts on the subject:
It is very logical to go with a DSP crossover on any speaker system, except that for cost or convenience reasons passives would sometimes be the better option and still sound very good. People seem to know or assume there is an audible difference between passive and active just because of the technology, even if the acoustic responses were the same, but it is possible to make a "much better" measured acoustic response with DSP, so there is no comparison in my view; I see no reason not to use DSP at the moment. A FIR-capable system can make a very flat response with perfect time domain, but with IIR DSP the response can also be smoothed no problem - just use enough filters. It's mostly about the physical construct, then some kind of compromise with the filter slopes, trying to get a nice magnitude response and the exciting sound quality - good sound in general. If DSP makes better sound and is easier to implement, and even allows getting both good magnitude responses and exciting sound, then why not - win-win. We can consider crossovers trivial in the sense that we can basically do whatever with them if we permit ourselves to use FIR. VituixCAD is such a good visualization of the acoustic response that there is no difficulty in tuning up any response for the measured construct. This makes the construct and how it measures the only limitation for performance, and of course the designer needs to know what he is looking at and what kind of graphs make good sound in the given situation.
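As a small illustration of the FIR point (my own sketch with assumed sample rate and crossover frequency, not a VituixCAD export): a linear-phase lowpass and its complementary highpass sum to a pure delay, i.e. flat magnitude and "perfect" time domain at the electrical summing point.

```python
# Sketch of a complementary linear-phase FIR crossover (assumed values).
import numpy as np
from scipy.signal import firwin

fs = 48000            # assumed sample rate, Hz
fc = 2000             # assumed crossover frequency, Hz
taps = 1023           # odd length -> symmetric, type I linear-phase FIR

lp = firwin(taps, fc, fs=fs)   # linear-phase lowpass
hp = -lp
hp[taps // 2] += 1.0           # complementary highpass = delta - lowpass

# Their sum is a unit impulse delayed by (taps - 1) / 2 samples,
# i.e. flat magnitude with pure linear-phase delay.
delta = np.zeros(taps)
delta[taps // 2] = 1.0
assert np.allclose(lp + hp, delta)
print("latency:", (taps - 1) / 2 / fs * 1000, "ms")   # ~10.6 ms at 48 kHz
```

In a real speaker the drivers' own responses and offsets still have to be folded into the FIR design, but the principle of the flat, linear-phase sum is the same.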
 
I don't follow the "discussions" at ASR, I tried once but there is not much DIY efforts there..

Let's see how long they use my program for free for community projects...

I don't think they do (or 95% of them don't) - I'm with DcibeL on this: very little DIY effort there; who needs VituixCAD if all you are doing is measuring (and evaluating the measurements of) commercial products?