This tutorial will teach you how to set up this project on your machine. Before you start, make sure Node, npm and Virtual Audio Cable (Lite version) are installed on your machine. You will need to:
- configure your Unreal Engine project to use the Pixel Streaming and Unreal VA plugins (the latter is only accessible within the IHTA network)
- download and configure VA
- download and configure the Streaming Dashboard git repository
- set up a way to pipe VA's audio output back into your machine as a microphone input
Configure your Unreal Engine project
Pixel Streaming plugin
To add the Pixel Streaming plugin to your project, open your project in the Unreal Engine Editor and navigate to Edit -> Plugins. Search for pixel streaming and activate the plugin. You will need to restart the editor afterwards.
source: Pixel Streaming documentation, 13.12.2022.
To enable running and testing the project from within the editor, you need to add the following as additional launch parameters:
- Navigate to Edit -> Editor Preferences and select Play in the newly opened window under Level Editor
- Paste this line -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 into the textbox labeled Additional Launch Parameters
source: Pixel Streaming documentation, 13.12.2022.
If you intend to package your UE project, you will also need to run the executable with these additional parameters. This can be done via the command line, a batch file or a symbolic link; a short example follows below. For further information, refer to the Pixel Streaming documentation.
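As a minimal sketch, a batch file placed next to the packaged executable could look like the following. The executable name is a placeholder; replace it with the name of your packaged project.

```bat
@echo off
REM Hypothetical launcher for a packaged Pixel Streaming project.
REM "MyProject.exe" is a placeholder - use your project's executable name.
MyProject.exe -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888
```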
Unreal VA Plugin
The Unreal VA plugin can be found on the IHTA GitLab. To add it to your project and start using it, refer to its wiki page on the IHTA wiki.
Note: This page is only accessible within the IHTA network; you may need to use a VPN.
Configure Virtual Acoustics
Navigate to virtualacoustics.org and download the newest version of VA for your machine. Unzip it and open conf/VACore.ini in a text editor.
You will have to add VA/ in front of all paths under the [Paths] tag. This is necessary because VA will later be launched via the dashboard's control server, which lives one folder up in the folder structure we are going to build. The top of your VACore.ini should look something like this:
[Paths]
# Any entry value will be added to the search path list, but only if existing! They will also be made available as macros (see below).
# If a file can not be found during runtime by local name, the core will try to locate it using the paths in ascending name order.
# Macros are not substituted, here. However, it is recommended to use AddSearchPath during runtime, if you have individual directories
# you want to add.
# Relative configuration path (with some more hardware setup files)
conf_dir = VA/conf
# Relative data path (with some common files)
data_dir = VA/data
# Path for TTS Voices of CereVoice
voices_dir = VA/data/Voices
[Files]
...
If you intend to use Virtual Audio Cable as your way of accessing VA's audio output, you will also need to download and configure ASIO4ALL. A detailed explanation can be found under "Streaming VA's audio output".
Streaming Dashboard
This dashboard serves as the admin frontend for the whole streaming setup. It can be used to control VA and the Signalling Server, and it also handles audio streaming from VA to the clients. To get started, first clone the git repository. You will need to have Node and npm installed on your machine.
After cloning the repository, navigate to Server and run start.bat. On first startup, this will install all dependencies and prompt you to configure a user, as sketched below. Additional users can be configured by running newUser.bat. To remove users, simply remove them from config/users.json.
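As an illustration, a first-time setup from a Windows command prompt might look like this. The repository URL and folder name are placeholders for the actual dashboard repository on the IHTA GitLab.

```bat
REM Placeholder URL - clone the Streaming Dashboard repository from the IHTA GitLab
git clone https://git.example.org/streaming-dashboard.git
cd streaming-dashboard\Server

REM First start: installs all Node dependencies and prompts you to configure a user
start.bat

REM Optionally add further users later
newUser.bat
```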
Adding VA
To run VA from within the dashboard (recommended), move your VA folder into the Server folder (same level as start.bat) and rename it to VA.
Your folder structure should look similar to this:
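A rough sketch of the expected layout (illustrative only; the exact contents of the VA folder depend on the VA version you downloaded):

```
Server/
├── start.bat
├── newUser.bat
├── config/
│   └── users.json
└── VA/
    ├── run_VAServer.bat
    ├── conf/
    │   └── VACore.ini
    └── data/
```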
Streaming VA's audio output
VA is designed to output its audio signal directly to a sound card, so in order to stream the signal, this project uses a trick (for further information, refer to the architecture documentation): VA outputs its signal to a mock sound card, and that signal is piped back into the PC as a microphone input. This microphone input can then easily be used by the streaming dashboard. There are multiple ways to achieve this - you could, for example, physically connect a sound card's output to its microphone input with a cable - but this tutorial presents two reliable methods: one using ASIO4ALL and Virtual Audio Cable, and one using only the latter.
Method 1: ASIO4ALL and Virtual Audio Cable
ASIO4All
- Navigate to asio4all.org, download the latest version and install it.
- Open your VACore.ini and under [Audio driver] select ASIO as your driver and ASIO4ALL v2 as your device. Your config file should look something like this:
[Audio driver]
# MANDATORY: Audio driver backend (ASIO|Portaudio|Virtual)
#Driver = Virtual
Driver = ASIO
#Driver = Portaudio
# MANDATORY: Audio device ( e.g. ASIO4ALL v2, ASIO Hammerfall DSP, Portaudio 'default', 0,1,2,3,..., virtual user 'Trigger' )
Device = ASIO4ALL v2
#Device = ASIO Hammerfall DSP
#Device = ASIO Fireface USB
#Device = ASIO MADIface USB
#Device = Focusrite USB 2.0 Audio Driver
#Device = M-Audio Fast Track Ultra ASIO
#Device = Yamaha Steinberg USB ASIO
#Device = default
#Device = Trigger
Virtual Audio Cable
Virtual Audio Cable is software used to digitally patch audio outputs and inputs together.
- Navigate to vac.muzychenko.net/en/ and download Virtual Audio Cable. The free Lite version is enough to stream one instance of VA.
- Unzip the archive and run the installer.
- After installing, you may need to reset Windows' default input and output devices in the system's sound settings.
- Now run VA by double-clicking run_VAServer.bat.
Note
You will probably see some errors in the VA console:
[ VAInfo ][ Config ] Could not find path 'VA/conf', removed from search path list.
[ VAInfo ][ Config ] Could not find path 'VA/data', removed from search path list.
[ VAInfo ][ Config ] Could not find path 'VA/data/Voices', removed from search path list.
Those are due to the changes we made to the paths earlier in VACore.ini and can be ignored. Once VA is launched via the dashboard, they should disappear.
- Once VA is running, you should see the ASIO4ALL icon in your system tray. Click it.
- Click the cog wheel icon to enable advanced settings, look for Virtual Audio Cable in the device list and unfold it by clicking the + icon.
- Activate Virtual Audio Cable 1 with the speaker symbol next to it and deactivate all other devices in the list.
VA now outputs its signal to Virtual Audio Cable 1, which can also be used as a microphone input by any other software running on the system.
Method 2: Only using Virtual Audio Cable
Since output devices can also be specified within VA's config file, ASIO4ALL can be skipped entirely.
- Open VACore.ini and under [Audio driver] select Portaudio:
[Audio driver]
# MANDATORY: Audio driver backend (ASIO|Portaudio|Virtual)
#Driver = Virtual
#Driver = ASIO
Driver = Portaudio
# MANDATORY: Audio device ( e.g. ASIO4ALL v2, ASIO Hammerfall DSP, Portaudio 'default', 0,1,2,3,..., virtual user 'Trigger' )
Device = ASIO4ALL v2
#Device = ASIO Hammerfall DSP
...
- After installation, Virtual Audio Cable should have registered at least one new audio device, both as input and as output (usually called Line 1). Check the exact name by opening Windows' sound settings and specify it in VACore.ini:
...
# MANDATORY: Audio device ( e.g. ASIO4ALL v2, ASIO Hammerfall DSP, Portaudio 'default', 0,1,2,3,..., virtual user 'Trigger' )
#Device = ASIO4ALL v2
#Device = ASIO Hammerfall DSP
Device = Line 1
...
Now VA should output its audio directly to Virtual Audio Cable, without needing ASIO4ALL in between.
Note
While this method is simpler than method 1, as of 10.01.2023 it has not been tested as thoroughly.
Start-up
Starting the control server
Navigate to the Server directory and run start.bat. You should see a console window like the following:
Now open a modern browser (try to avoid Edge or Internet Explorer) and navigate to http://localhost:7273. Log in with one of your configured users. You should see the following dashboard:
Note
Some browsers try to prevent you from visiting unencrypted HTTP sites, even on localhost, and may even automatically redirect you to https://localhost:7273. This won't work, so you may need to change your browser settings to allow plain HTTP connections (at least on localhost).

Clicking an accordion module expands it. The modules (VA or Signalling Server) can be started and stopped by clicking their respective buttons, and their output is piped to the text field inside the accordion module.
You can either start just the component you need, or start both VA and the Signalling Server at once by clicking Start in the top right-hand corner of the page.
Starting the Unreal Project
If you configured your Unreal project correctly (see above), you can start it once VA and the Signalling Server are running.
Starting from the editor
If you want to start your project from the Unreal Engine editor, you will need to:
- Paste this line: -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888 under Edit -> Editor Preferences -> Play -> Additional Launch Parameters
- Start your game as Standalone Game

Starting a bundled project
If you bundled your project into an executable, you will need to start it with the additional launch parameters -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888
as well. This can be done for example from the command line (./<project name>.exe AudioMixer -PixelStrea...
).
You should be able to see Streamer connected: 1
in the Signalling Servers output after you started your project.
Connecting to the stream
If the Signalling Server is running, the stream's website will be available at http://localhost.
Note
Again, note that your browser may block the plain HTTP connection (see above).
If an Unreal project is connected to the Signalling Server, it will be displayed on the streaming website and streaming can be started by clicking the large "Play" symbol. Once a user clicks that symbol, a VoIP call is relayed to the dashboard web page (http://localhost:7273). This call carries VA's audio output.
When you are on the dashboard page, your browser should ask you to allow the use of a microphone. Select Virtual Audio Cable's Line 1 as your microphone and allow access. You can also tell your browser to remember that decision so you don't need to allow it again every time you restart the setup.
Note
Firefox allows you to choose your input device when prompting for one, but other browsers might not. If yours does not, you can also configure Line 1 as the default input device in Windows' settings.
Note
Even if you clicked "remember my decision", the dashboard web page still needs to have focus to grab the input's audio stream. If your users don't receive VA audio, try just clicking into the dashboard web page to bring its browser tab back into the OS's focus.
Your streaming setup should now be reachable at http://localhost on your machine, and at http://<your machine's local IP> for everyone inside your local network. If you want it to be available over the internet, you will additionally need to get a domain linked to your public IP address and enable port forwarding on port 80.
Note
When publishing the setup to the internet, you should add an SSL certificate to the Signalling Server and configure it accordingly to enable HTTPS connections. For more information, refer to Unreal Engine's Pixel Streaming documentation.