</p>
<h4>Configuring VA</h4>
<p>
If you want to use VA, you most likely want to change the configuration to match your hardware and activate the rendering and reproduction modules you are interested in.
</p>
<h5>Configuration using an INI file</h5>
<p>
You can do this by modifying the <code>*.ini</code> files in the <code>conf</code> folder and using the provided batch start scripts, which launch the VA server with these configuration files. <code>VACore.ini</code> controls the core parameters, while the <code>VASetup.*.ini</code> files describe hardware devices and channel layouts; they are included via a line in the <code>[Files]</code> section of the configuration file. Use <code>enabled = true</code> or <code>enabled = false</code> to activate or deactivate the instantiation of sections, i.e. rendering or reproduction modules and output groups.
</p>
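<p>
As an illustrative sketch, a minimal configuration could look like the following. The <code>[Files]</code> and <code>[Paths]</code> sections and the <code>enabled</code> key are taken from this documentation; the renderer section name and the included setup file name are hypothetical placeholders for whatever your setup defines:
</p>
<pre><code>[Files]
VASetup.Desktop.ini

[Paths]
my_data = data

[Renderer:MyBinauralRenderer]
enabled = true</code></pre>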
<h5>Using search paths</h5>
<p>
Loading files from the hard drive seems trivial, but in practice a lot of time is wasted on paths that cannot be found at runtime - especially if error messages do not indicate this problem. <br />
In VA, this is a serious and recurring problem. Often, configurations and input data for scenes are created locally and are later transferred to a computer in the laboratory.
This computer is often not the computer that also controls the scene, because a remote network connection is used - which in consequence requires files to be mirrored on the hard drive of that server PC. If no precaution is taken, this usually leads to a nerve-racking trial-and-error process until all files are found - and mostly ends in absolute paths as the quick-and-dirty solution, because we are all very lazy and too busy to do it right.
<br />
<b>DO IT RIGHT</b> in this context means: <b>NEVER</b> use absolute paths, from the very beginning of working with VA. VA provides search path functionality. This means it will find any relative file path with a minimal amount of help: you provide one or more base paths where it should look for your input files. This is the easiest way to avoid problems.
<br />
<br />
Here is the best practice assuming you want to run a listening experiment:<br />
Put all your input data in one base folder, let's call it<br />
<code>C:/Users/student54/Documents/BachelorThesis/3AFCTest/InputData</code><br />
In your <code>VACore.ini</code>, add a search path to this folder: <br />
<pre><code>[Paths]
studentlab_pc3_my_data = C:/Users/student54/Documents/BachelorThesis/3AFCTest/InputData</code></pre>
Let us assume you have some subfolders <code>trial1, trial2, ...</code> with WAV files and an HRIR dataset <code>Kemar_individualized.daff</code> in the root folder. You will load them using this pseudo code <br />
<pre><code>H = va.CreateDirectivityFromFile( 'Kemar_individualized.daff' )
Sample_1_1 = va.CreateSignalSourceBufferFromFile( 'trial1/sample1.wav' )
Sample_1_2 = va.CreateSignalSourceBufferFromFile( 'trial1/sample2.wav' )
Sample_2_1 = va.CreateSignalSourceBufferFromFile( 'trial2/sample1.wav' )
...</code></pre>
When you now move to another computer in the laboratory (to conduct the listening experiment there), copy the entire <code>InputData</code> folder to the computer where the <u>VA server</u> will be running, for example to <code>D:/experiments/BA/student54/3AFCTest/InputData</code>. <br />
Now, all you have to do is add another search path to your <code>VACore.ini</code> configuration file, e.g. <br />
<pre><code>[Paths]
studentlab_pc3_my_data = C:/Users/student54/Documents/BachelorThesis/3AFCTest/InputData
hearingboth_pc_my_data = D:/experiments/BA/student54/3AFCTest/InputData</code></pre>
... and you will have no more trouble with paths. If applicable, you can also add search paths over the VA interface at runtime using the <code>AddSearchPath</code> function.
</p>
<h4>Controlling VA</h4>
<p>
The first question is: what kind of software do you usually use? There are bindings that make the VA interface available in <b>Matlab</b>, <b>Python</b> and <b>Lua</b>, and rudimentary functionality in <b>C#</b>. While many in the acoustics research area prefer Matlab, Python - especially in combination with Jupyter notebooks - is the open-source way to conveniently use VA. C# is your choice if you plan to use VA in Unity environments, which is probably the easiest entry for those who are not familiar with either Matlab or Python scripting.
<br /><br />
Let's create a simple example scene with binaural rendering. It requires a running VA server application on the same PC.<br /> The two files <code>ita_demosound.wav</code> and a head-related impulse response file <code>NeumannKU100.v17.ir.daff</code> in OpenDAFF format must be either in the data folder or in the folder where the script is executed, because the script adds this folder as a search path at runtime (see above).
</p>
<h5>Matlab</h5>
<p>
<pre><code>va = itaVA;
va.connect()
va.reset()
va.addSearchPath( pwd )
X = va.createAudiofileSignalSource( 'ita_demosound.wav' )
va.setAudiofileSignalSourcePlaybackAction( X, 'play' )
va.setAudiofileSignalSourceIsLooping( X, true );
S = va.createSoundSource( 'itaVA_Source' )
va.setSoundSourcePosition( S, [-2 1.7 -2] )
va.setSoundSourceSignalSource( S, X )
H = va.loadHRIRDataset( 'NeumannKU100.v17.ir.daff' )
L = va.createListener( 'itaVA_Listener' )
va.setListenerPosition( L, [0 1.7 0] )
va.setListenerHRIR( L, H )
va.setActiveListener( L )
va.disconnect()</code></pre>
</p>
<h5>Python</h5>
<p>
<pre><code>import va
va.connect()
va.reset()
...
va.disconnect()
</code></pre>
<h5>C#</h5>
<pre><code>using VA;

VANet VA = new VANet();
VA.Connect();
VA.Reset();
...
VA.Disconnect();</code></pre>
</p>
<h4>Sound sources, sound receivers and sound portals</h4>
<p>
In VA, you will find three different virtual entities that represent sound objects.
While the term <i>sound source</i> is self-explanatory, VA uses the term <i>sound receiver</i> instead of listener.
The reason is that <i>listener</i> would reduce the receiving entity to living creatures, while in VA those receivers can also be virtual microphones or have a completely different meaning in other contexts.
Sound portals are entities where sound can be picked up and transported and/or transformed to other portals or sound receivers. This concept is helpful for sound transmission handling in Geometrical Acoustics, for example if a door acts as a transmission object between two rooms.
It depends on the rendering module you use, but portals are mostly relevant in combination with geometry, say for room acoustics.
</p>
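<p>
The portal concept can be pictured with a small model. This is a hedged sketch in plain Python, not the VA API; all names and the transmission loss value are invented for illustration:
</p>
<pre><code>from dataclasses import dataclass

@dataclass
class SoundSource:
    name: str
    level_db: float  # emitted level in dB

@dataclass
class SoundPortal:
    name: str
    transmission_loss_db: float  # level reduction across the portal

def level_behind_portal(source: SoundSource, portal: SoundPortal) -> float:
    """Level arriving behind the portal; propagation losses are ignored."""
    return source.level_db - portal.transmission_loss_db

# A door acting as a transmission object between two rooms:
door = SoundPortal("door_kitchen_hall", transmission_loss_db=25.0)
radio = SoundSource("radio", level_db=70.0)
print(level_behind_portal(radio, door))  # 45.0</code></pre>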
<h4>Auralization mode</h4>
<p>
</p>
<h4>Signal sources</h4>
<p>
</p>
<h4>Directivities (including HRTFs and HRIRs)</h4>
<p>
</p>
<h4>Geometry meshes and acoustic materials</h4>
<p>
Geometry-aware audio rendering is the holy grail of physics-based real-time geometrical acoustics. It requires sophisticated algorithms and powerful backend processing to achieve real-time capability.
VA tries to support this by providing a simple geometry mesh class and interfaces to load and transmit geometry data. However, it is up to the implementation of the rendering modules what to do with that data.
Faces of meshes are assigned acoustic materials such as absorption, scattering and transmission coefficients. These are, for example, used (or transformed and forwarded) by special rendering instances, like the binaural room acoustics audio renderer.
</p>
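<p>
How material-tagged faces might be organized can be sketched as follows. This is an illustrative plain-Python model, not the VA mesh interface; the material name, coefficient values and band count are made up:
</p>
<pre><code>from dataclasses import dataclass

@dataclass
class AcousticMaterial:
    name: str
    absorption: list    # one coefficient per frequency band, range 0..1
    scattering: list
    transmission: list

@dataclass
class Face:
    vertex_ids: tuple           # indices into the mesh vertex list
    material: AcousticMaterial  # acoustic data assigned to this face

concrete = AcousticMaterial(
    name="concrete",
    absorption=[0.01, 0.01, 0.02, 0.02, 0.03, 0.04],
    scattering=[0.10] * 6,
    transmission=[0.0] * 6,
)
wall = Face(vertex_ids=(0, 1, 2), material=concrete)
print(wall.material.absorption[3])  # 0.02</code></pre>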
<h4>Scenes</h4>
<p>
A scene is a somewhat unspecified term in VA. Any assembled scene information is passed to the rendering modules, and it is up to the implementation what happens next.
The scene interface methods are intended for prototyping and future work; they can, for example, be used to define stratified medium definitions for air traffic noise rendering, or to load entire cities from a geo information server for a certain location.
</p>
<h4>Active listener concept</h4>
<p>
In VA, there is not one single sound receiver (or <i>listener</i>, if we speak of a human being). Instead, VA renders sound for all sound receivers, and the actual output stream that is forwarded to the reproduction module(s) can be switched dynamically by configuring the <b>active listener</b>.
This works globally for all rendering instances, but can also be controlled for each rendering instance individually, e.g. to use VA for multiple listeners in one scene.
</p>
<h4>Real-world pose (tracking)</h4>
<p>
Sound receivers have a pose, a combination of a position and an orientation in 3D space. But they also have a pose in the <i>real world</i>, meaning that a receiver can also be positioned in the reference frame of the real physical laboratory environment.
This is required for the processing of some reproduction modules, for example the binaural cross-talk cancellation reproduction <i>NCTC</i>, where the pose of the dynamic listener (this time a human being) has to be known very precisely.
</p>
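<p>
The two poses can be pictured with a small sketch. This is plain Python for illustration, not the VA API; the names and coordinate values are invented:
</p>
<pre><code>from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple     # (x, y, z) in metres
    orientation: tuple  # quaternion (w, x, y, z)

@dataclass
class SoundReceiver:
    name: str
    pose: Pose             # pose in the virtual scene
    real_world_pose: Pose  # pose in the physical laboratory frame

listener = SoundReceiver(
    name="subject_01",
    pose=Pose((0.0, 1.7, 0.0), (1.0, 0.0, 0.0, 0.0)),
    real_world_pose=Pose((0.1, 1.68, -0.05), (1.0, 0.0, 0.0, 0.0)),
)
# A tracking update only touches the real-world pose; the virtual pose
# stays as defined by the scene:
listener.real_world_pose = Pose((0.12, 1.68, -0.05), (1.0, 0.0, 0.0, 0.0))</code></pre>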
<h4>The VA struct class</h4>
<p>
VA uses a struct class that acts like an associative container. It can take basic data types like boolean, integer, floating point and strings, but also more sophisticated VA types like samples - and another struct, which means that structs can be nested.
Structs behave very much like Matlab structs or Python dicts, and these objects can be forwarded over remote interfaces, for example to update an impulse response in an FIR convolution engine.
It is very convenient to use this concept for prototyping, and it allows parameters in every corner of the VA core to be changed by very general parameter setters and getters.
</p>
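<p>
Since VA structs behave like Python dicts, a parameter update can be pictured as a nested dict. The key names below are invented for illustration:
</p>
<pre><code># A nested parameter set, as it might be passed to a general
# parameter setter:
params = {
    "enabled": True,
    "gain": 0.7,
    "filter": {                            # structs can be nested
        "type": "fir",
        "samples": [0.0, 1.0, 0.5, 0.25],  # e.g. an impulse response
    },
}

print(params["filter"]["type"])          # fir
print(len(params["filter"]["samples"]))  # 4</code></pre>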
</section>
</div>