If you’ve started to dive into making systems for your game using MetaSounds, you may have found yourself with some minor hitches or latency when playing the sounds.
These tips might help!
Try toggling the Async MetaSound Generator.
When this is on, it reduces the CPU cost during play; however, it can lead to latency with larger MetaSounds. It might be worth turning it off (Mix Universe has it off to ensure that a sound will always play at the right time).
au.MetaSound.EnableAsyncGeneratorBuilder 0
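If you want this to stick between sessions instead of typing it into the console every time, console variables can also be set from config. This is a minimal sketch; whether you use the development-only Engine/Config/ConsoleVariables.ini or your project’s DefaultEngine.ini under [SystemSettings] is up to you.
; ConsoleVariables.ini (development only)
[Startup]
au.MetaSound.EnableAsyncGeneratorBuilder=0
; or DefaultEngine.ini, which also applies to packaged builds
[SystemSettings]
au.MetaSound.EnableAsyncGeneratorBuilder=0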
Set the MetaSound BlockRate to a lower value
I’ve found 28 is a sweet spot between latency and performance for lower end machines. Lower numbers will increase latency but decrease CPU usage; higher numbers will decrease latency but increase CPU usage. The default is 100, and it is described in code as “blocks per second” when processing the audio for a MetaSound, so at the default each block covers about 10 ms of audio, while at 28 each block covers roughly 36 ms.
au.MetaSound.BlockRate 28
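You can also set these from code, which is handy if you only want the lower block rate on certain hardware. Here is a minimal sketch using IConsoleManager; the helper function and where you call it (somewhere early, like your GameInstance’s Init) are my own placeholders.
#include "HAL/IConsoleManager.h"

// Hypothetical helper: call once early on (e.g. from GameInstance Init).
static void ApplyMetaSoundPerfTweaks()
{
    // FindConsoleVariable returns null if the cvar (or the MetaSound plugin) isn't available.
    if (IConsoleVariable* BlockRate = IConsoleManager::Get().FindConsoleVariable(TEXT("au.MetaSound.BlockRate")))
    {
        // Fewer, larger blocks per second: cheaper CPU at the cost of a little latency.
        BlockRate->Set(28.0f);
    }

    if (IConsoleVariable* AsyncBuilder = IConsoleManager::Get().FindConsoleVariable(TEXT("au.MetaSound.EnableAsyncGeneratorBuilder")))
    {
        AsyncBuilder->Set(0);
    }
}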
Make sure to enable stream caching and force streaming on your audio files.
In the project settings, enabling stream caching will ensure that audio files can be processed as quickly as possible at runtime. On your wav files, you can force streaming and make them seekable to take full advantage of the wave samplers in MetaSounds.
Try choosing ADPCM or PCM for your audio compression.
PCM is uncompressed and has a high memory footprint, but will result in the fastest audio playback possible.
ADPCM gives roughly 4x compression, which may cause artifacts, but it is typically fine for sounds that don’t have a ton of high-frequency detail.
What both formats avoid is a heavier decoder, which adds CPU overhead and latency (especially on something like the Quest). You can monitor this using stat audio.
Consolidate or reduce inputs
Here is a pretty complex MetaSound I am using as an example. I have found around 30 inputs is the sweet spot, right before they start to cause real issues. It may vary depending on the input type, but these were my findings with ints and floats.
Right now, the only real way to improve this is simply to reduce the number of inputs you are sending to a MetaSound. It doesn’t really matter which ones are actually sent during play; the total count is what adds up.
Reduce usage of Compression or Delay Nodes
Similarly, any nodes that use a large buffer or keep a copy of a previous audio buffer can add up really quickly. Reducing lookahead time helps for compressors, and in general it pays not to use too many of these in your graph. I’ve found one of each is fine, but if you push it too far it will start to show when playing your sound!
Reducing Lookahead time will decrease latency and also help with performance.
If you see constructor pins, they can save you!
As MetaSounds evolve, you will notice diamond-shaped connection points. These are constructor pins, and they are very important for performance. They are only ever evaluated once and cannot change with inputs, which means they can be optimized to be very cheap. In the case of a delay, this allows you to set the max allowed delay time to keep the audio buffer allocated for the delay from ever going over a certain length. Smaller values here make the node much cheaper.
Diamonds are constructor pins.
What if none of this works?
Then it is time to dive in deeper. There are many ways to do this, but I personally get the most information initially from a simple CPU profile using stat startfile and stat stopfile:
stat startfile
stat stopfile
Then use Unreal Frontend to see what is happening when a hitch occurs!
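If the hitch happens somewhere the console is awkward to reach (for example on device), you can also trigger the same capture from code. This is a minimal sketch; the function is a made-up wrapper around whatever you actually want to measure, and the resulting stats file should land under Saved/Profiling where Unreal Frontend can open it.
#include "Engine/Engine.h"
#include "Engine/World.h"

// Hypothetical example: wrap a suspected hitch in a stats file capture.
void CaptureStatsAroundSuspectedHitch(UWorld* World)
{
    if (!GEngine || !World)
    {
        return;
    }

    GEngine->Exec(World, TEXT("stat startfile"));

    // ... trigger the MetaSound or run the code you suspect of hitching ...

    GEngine->Exec(World, TEXT("stat stopfile"));
}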
If you are new to performance profiling, I would highly recommend this talk, as it goes over the basics of the thought process when it comes to solving a performance problem!
Then finally, if that isn’t enough, you can look into Unreal Insights a bit more in depth, which will give you the most detail about what is happening in your frames!
Solution #1 – Make an Editor Utility Widget and use the widget’s tick function to drive functionality on blueprints in the world.
This is probably the easier of the two routes. The only downside is that the tick will only run while the Editor Utility Widget is open somewhere. If you need more robust and automatic tick functionality, refer to solution #2 below.
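If you like the widget route but would rather drive it from C++, the same idea should work by subclassing UEditorUtilityWidget and overriding NativeTick. This is only a rough sketch under my own assumptions (the class name is mine, and it needs to live in an editor-only module that depends on Blutility and UMG):
// MyEditorTickWidget.h
#pragma once

#include "CoreMinimal.h"
#include "EditorUtilityWidget.h"
#include "MyEditorTickWidget.generated.h"

UCLASS()
class UMyEditorTickWidget : public UEditorUtilityWidget
{
    GENERATED_BODY()

protected:
    // Runs while the widget tab is open in the editor; drive your in-world helpers from here.
    virtual void NativeTick(const FGeometry& MyGeometry, float InDeltaTime) override
    {
        Super::NativeTick(MyGeometry, InDeltaTime);
        // e.g. find your target actors and update them here.
    }
};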
Solution #2 – Make your own actor class!
In your custom actor class you can use the code below to add a bool that toggles ticking in the editor. That way, whenever you need it, you can simply check it on and get going. It also keeps the logic separated so you can have different things happen in the editor vs. in-game. Most of the time I end up with the same logic in both anyway, but it’s nice to have the option.
header file
/** Allows Tick To happen in the editor viewport*/
virtual bool ShouldTickIfViewportsOnly() const override;
UPROPERTY(BlueprintReadWrite, EditAnywhere)
bool UseEditorTick = false;
/** Tick that runs ONLY in the editor viewport.*/
UFUNCTION(BlueprintImplementableEvent, CallInEditor, Category = "Events")
void BlueprintEditorTick(float DeltaTime);
cpp file
// Separated Tick functionality, making sure that it truly can only happen in the editor.
// Might be a bit overkill, but you can easily consolidate if you'd like.
void YourActor::Tick(float DeltaTime)
{
#if WITH_EDITOR
    if (GetWorld() != nullptr && GetWorld()->WorldType == EWorldType::Editor)
    {
        BlueprintEditorTick(DeltaTime);
    }
    else
#endif
    {
        Super::Tick(DeltaTime);
    }
}
// This ultimately is what controls whether or not it can even tick at all in the editor viewport.
// But it is EVERY viewport, so it still needs to be blocked from preview windows and junk.
bool YourActor::ShouldTickIfViewportsOnly() const
{
    if (GetWorld() != nullptr && GetWorld()->WorldType == EWorldType::Editor && UseEditorTick)
    {
        return true;
    }
    else
    {
        return false;
    }
}
Then when you are done you should be able to add the new BlueprintEditorTick to your event graph and get rolling!
Why not use an Editor Utility Actor?
Editor Utility Actors don’t quite give you the functionality you’d want for ticking in the editor via an actor. They will tick in Blueprint preview windows too, which can lead to lots of head scratching when you realize the blueprint you were working on is logging while you are trying to use it in other places. Also, there are times where you actually do want the same actor at runtime as well (it’s rare, but totally a valid case).
Final Notes
This is pretty dangerous if you aren’t mindful of which nodes you use in here. Try not to use things like Add Component or any other nodes that spawn objects into the world, or if you do, make sure you store them and clean them up. Delays might work, but they could be a bit odd as well. Just be aware that it can be fragile at times, so make sure to give Unreal some cake beforehand… ❤️
This image shows an example of how to set up an undo-able function call in Unreal 4’s Blueprints. This was reported as a bug and marked “by design”, simply because you can build it yourself, 100% controllable and custom, in Blueprints.
Begin Transaction
Starts the undo stack and allows anything after this call to be grouped into one undo operation.
Transact Object
Every object you want to be undo-able has to be registered with the Transact Object node. After that, you can do whatever you want to the object, and it will properly go back to whatever it was before your changes.
End Transaction
Closes the gates and stops recording anything else to the undo stack, which then allows your custom tool’s operation to be undone properly.
This was a bit tricky to find, since typing the word “Undo” into Blueprint doesn’t give you much. Under the hood, everything undo-based is actually referred to as the Transaction System, which is why these nodes are called Transaction Nodes.
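For reference, the C++ equivalent in editor code is a scoped transaction plus Modify() on each object you touch. This is a minimal sketch with a placeholder function and editor-only code, not the exact nodes above:
#if WITH_EDITOR
#include "ScopedTransaction.h"
#include "GameFramework/Actor.h"

// Hypothetical tool function: move an actor so the user can undo it with Ctrl+Z.
void MoveActorWithUndo(AActor* TargetActor, const FVector& NewLocation)
{
    if (!TargetActor)
    {
        return;
    }

    // Begin/End Transaction: everything inside this scope becomes one undo step.
    const FScopedTransaction Transaction(NSLOCTEXT("MyTools", "MoveActor", "Move Actor With My Tool"));

    // Transact Object: record the actor's state before changing it.
    TargetActor->Modify();

    TargetActor->SetActorLocation(NewLocation);
}
#endif // WITH_EDITOR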
That is all, hope this helps, and have a great day!