studiomux 5 beta

First up, I’m not a recording artist…I’m a performer. SMUX is probably fine for recording.

Live 11.0.5 / macOS 11.5.1 / MacBook Pro mid 2015 i7 2.5GHz (the one with the discrete GPU). The main interface is an iConnectivity iConnectAUDIO4+ but I change interfaces depending on what I’m doing – I also use an Axe-Fx II and some Roland devices. 128-sample buffer settings in SMUX and Live 11. All operating at 48 kHz. Nothing amazing or shocking about my rig – except sometimes I use aggregated audio devices.

Bear in mind…IDAM works for real-time synths perfectly.
iConnectAUDIO4+ also does…and real-time FX signal processing too.

Somebody mentioned above running Overloud TH-U (iOS) on their iDevice via SMUX. I tried this too, and while it basically works it’s not playable in real-time! I find this unusual. What is the challenge that stands in the way of making this possible? I’m sure I did it successfully in the old version of SMUX with Bias FX (iOS) and MIDI Guitar 2 and an iPhone 6 in macOS Mojave.

Anyway, I can do all this with the iCA4+, but I want to add in an additional iDevice sometimes, so I use Studiomux. Sometimes I also want to run a really light setup…iRig HD 2 and a MacBook with an iPhone attached for some iOS synth/FX magic. It’s sad news if the new SMUX can’t handle such real-time applications.

Can you measure the realtime latency you are encountering? I am surprised that you find the latency with Overloud high to the point of being unusable – that sounds like more latency than I am encountering. It is easy to measure: record an audio track, add Studiomux as an effect on the track, and have the track feed the input through Overloud back to the same channel. Set another track in Live to take that track’s output as its input. Compare the two waveforms to see the offset.

Sure, I checked and it’s approx 200ms – give or take 5ms.
Can you believe it?
I’m usually ok with under 12ms…might handle up to 20ms but over 10 times that? Unusable.

It sounds like some experiments are in order to see why you are experiencing so much latency. Maybe there is something about how things are set up. My latency with a 2011 MBP, Zoom interface, and iPad gen 6 with iOS 13.7 or 14.7 is on the order of 20 ms in Reaper, Live 10 or Live 11. macOS High Sierra.

EDIT: I recommend a simple experiment to eliminate extraneous variables as a test. For the sake of a first test, have your Mac and iPad connected together directly with no other devices connected to the Mac or iPad – no hubs. For this first test, just use your Mac’s built-in audio. To establish this baseline, it is important that nothing be connected other than the Mac tethered to the iPad with the charging cable. This SHOULDN’T make a difference, but your results are so radically different from what I (and I assume others) are experiencing that we should start simple.

Reboot both machines before the test.

On the iPad:
Launch StudioMux
Set the buffer to 128

On the Mac:
Set system to use the built-in audio
In Live 11, set your buffer to 32.
Create a new project in Live 11.
Record some audio into that track.
Add the Studiomux effect as an insert on that track.
Have it send and receive to the same StudioMux channel. (On the iPad, you should now have one channel strip in StudioMux that receives from the Mac and sends back to it.)

In Live 11, set another audio track to receive its input from the track you already recorded. Have it receive the post-effect, pre-fader signal. Record that into Live and compare the two tracks.

EDIT: I just did this test on my 2016 MacBook Air (High Sierra) and Live 10.
Round-trip latency (mac’s built-in sound, Live Buffer at 64, StudioMux iPad buffer 128) was 26 ms.

Following this method at a 32-sample buffer in Live 11, latency = 19.5ms.

But as I mentioned I’m interested in realtime performance…not DAW delay compensated performance for recording. Live 11 is an audio playground for me more so than a recording device.

So, end result, if I route the mic signal directly to SMUX (in parallel with an unprocessed mic signal for comparison also @ 32-sample buffer) I get around 200ms latency.

There is no DAW delay compensation in my setup. I get the same performance live as described in the test. So, if you are getting much larger latency in realtime, let us continue experimenting to determine what in your setup causes that massive change. I have my guitar plugged into the Zoom, which is plugged into my Mac, and I can use TH-U and other effects on the iPad in realtime with a total roundtrip latency of 20-30 ms.

Please fully describe your routing and let’s try to either eliminate the delay or figure out why you are getting 10x more latency than I am. It clearly is not delay compensation.

– I wouldn’t expect a developer to be able to address an issue without more details, since that first test seems in line with what I am seeing…i.e. minimal latency.

Certainly requires investigation…
BTW, DO NOT use headphones with SMUX BETA.
I nearly just lost my hearing thanks to this half-baked app.

I can certainly say SMUX could be rebranded as a 200ms delay device.

I’ve tried it in 3 DAWs using VST3, VST2 and AU. All display the same performance.

It sounds like something is configured wrong. You reported <20 ms latency in Live. What is different in your configuration with massive latency? I think you need to spend some time reconstructing your setup before calling this half-baked. There is likely some aspect of how you have wired things up. I haven’t seen anyone else report such huge latency.

If you are getting that in Live, show how you have things set up on the Mac and iPad.

Try adding one element at a time to the baseline test setup until you encounter the latency you describe, then describe the details.

I did as you suggested and got the reported results…but this is not my use case. It might be more useful if you could please describe how you can send a realtime mic/line signal into a DAW and route it in/out of SMUX with only 26ms latency.

With all due respect, what could I have configured incorrectly? I’ve even tried different ports, cables. Tested in 4 DAWs now at various sample rates and audio I/O configs…with all possible configurations of the various plugins. I’ve looked at server settings (nothing there). Looked into the settings on the iOS device(s). I’ve tried an iPad Pro and an iPhone 7+…same same same. You might even say I’ve been beta testing it…Buffer and sample rate – no problem. So why is it happening?

I feel quite justified in calling it “half-baked” at this stage because after more than a year (right?) it is effectively unusable in this config.

You will be doing the project a big favor if you will invest some time to patiently see what factors cause a jump from 20 ms to 200 ms of latency. It has to be done in steps since your investigations so far indicate that it isn’t DAW related. It could be some combination of interface, routing and particular effects or apps. Knowing which of those are responsible will help the developer find and address the issue.

I understand that the baseline test we did is not your use case. As I explained, we start simple to establish a baseline and then bit by bit bring the configuration closer to your use case to see where the latency comes from. That first test was a first test – it just established a baseline.

Since other people aren’t reporting this kind of latency, the developer can’t know where to start looking without our help. As beta testers, that is how we help the developer make it solid.

Just saying “I get 200 ms of latency with 4 different DAWs” provides no useful information that the developer can use. Since you now know that in a simple setup, you have 20 ms of latency, you can help figure out what causes that jump in latency and provide the information so that Pascal can address the issue.

You ask “why is it happening?”

We won’t know without applying some creativity to figuring out what changes result in your going from less than 20 ms of latency to 200ms of latency.

It will require some patience and creativity to figure it out.

It could be some combination of interface, routing, the particular effects – but some time needs to be spent to figure out which of those factors are involved.

As I explained, I AM GETTING ON THE ORDER OF 25 MS ROUNDTRIP LATENCY sending my guitar signal in realtime through the iPad.

Why are you getting radically different results? You need to investigate that. The information will no doubt be valuable. I don’t doubt that you are having terrible performance, but let’s figure out why.

The baseline test is just there to show that the simple round-trip doesn’t have 200 ms latency.

There will be several steps to figuring it out.

NEXT STEP: realtime input with same setup
The next one is to use the built-in mic as your audio input and, in realtime, record both the direct signal on the track where you currently have your recording and the effected signal on another track. When I did this I had the metronome turned on.

My results were the same. So, realtime roundtrip with no effects in the way on the iPad was the same as using a recorded source.

The usefulness of this as a test is that it shows there isn’t an inherent difference between realtime input from a mic and using recorded material. This is good to know because it will make testing down the line simpler.

STEP AFTER THAT:
Use your audio interface with the same buffer size settings as used for the built-in driver. For the test, if you have an interface that allows it to be connected to multiple devices, have it connected to just the Mac.

Now repeat those first two tests and see what the latency is.

This is to determine to what extent the driver of the interface is involved.

P.S. This morning I am getting closer to 35 ms roundtrip latency with the same set-up. I also inserted TH-U and ToneBoosters EQ as effects in the StudioMux host on the iPad with no difference overall. (Using a 128 buffer on the iPad.)

I did encounter one instance where SMUX seems not to have been able to keep up and there was a crackle after which the latency was about doubled. I rebooted both devices and haven’t been able to reproduce that yet.

Hey, let me begin by saying I was quite angered, and later just frustrated, after getting my eardrums banged while trying to test the SMUX effect plugin – it’s a !!!DANGER!!! to let the app ship with this bug. Fixing it should be a high priority above anything else. If it’s not my ears it’ll be somebody else’s studio monitors or worse.

FWIW, I have provided plenty of beta feedback particularly with iOS app crashes as it crashes at least once a session. So I wouldn’t assume that I’m not helpful at this stage.

TEST GEAR: Mid-2015 MacBook Pro Retina (Big Sur 11.5.2) – tested in Live 11, Hosting AU, Logic, Mainstage. Today I was testing with the internal audio – i.e. int. mic/speakers (using Apple cabled headphones) – because I wasn’t in my studio. As I mentioned I tried all buffers and plugin types.

Worthy of note, all interfaces (except one which wasn’t connected anyway) I use are class-compliant devices – therefore no possible driver clashes. We can safely rule them out I think.

I have established that directly sending audio to the plugin/iOS device results in approx 200ms of latency. If I place the direct signal on a parallel track I can hear the two signals 200ms apart…like a single-repeat delay.

I next recorded audio (only in Live 11) on the SMUX plugin track (with a click). Next, I set another audio track to receive the processed audio from the original track. In this case the latency was approx 20ms between the original audio and the newly recorded processed audio.

I’m wondering if there could possibly be a conflicting file remnant from the old version of SMUX as I’ve upgraded from OS to OS…if that were the case could it affect performance?

That’s all I can offer for now. I hope this triggers some insight.

I am sure the dev wants to address this and I believe that you are experiencing this BUT he can’t address it without some sleuthing on your part (which I am willing to help with since such sleuthing is something that I used to do for a living).

Clearly others aren’t experiencing this (someone even posted above that they used it in a live performance) so there must be some relevant difference that we can try to sleuth out so that the developer can address it.

I outlined some next steps to try a couple of posts up.

What is different about your setup/routing when you get 200ms vs 20ms?

Are you using an aggregate device on your Mac by any chance?

Please try the steps that I outlined earlier this morning about next steps – they will help determine what aspect of your configuration results in the latency.

I’ve tried with 3 devices (built-in, Zoom U-24 and Motu82) and all are similar.

Thanks for your help.

I always look to internal audio as a baseline if/when encountering audio issues…it is fine for testing purposes.

Absolutely nothing is different about the setup from 200ms to 20ms* (*time error corrected in previous post) other than procedure as outlined in previous post/s.

I do use aggregates but I have not really tried to integrate SMUX yet because it’s not reliable enough yet. I certainly was not using an aggregate today when testing.

I’ll come back to this tomorrow when I’m in the studio and look at your other suggestions. Ciao for now.

I don’t understand. How can there be no difference in setup (which I take to mean: interface, routing, and plugins) between the situations where you have 20ms and 200ms latency?

I have no latency correction or anything turned on in Live and my results don’t change with any interface I have. They are identical if I use live input (mic or line) vs. recorded input. Adding TH-U and Toneboosters plugins doesn’t change the latency.

If when you return, you have those problems, take screenshots of everything involved (Audio setting panel in Live, your mixing board layout in Live showing whatever plugins are inserted, your StudioMux host on the iPad) – maybe something will stand out.

But really, add an element at a time to the basic project where you have 20 ms latency and see when the latency changes.

I don’t understand either…I’ve beta tested for several well-known audio software devs and this is the weirdest thing I’ve ever seen!

I was messing with it a couple of hours ago and noticed something really strange: tapping on the top right-hand corner of the iOS app displays the signal latency over USB. When the single channel is active with nothing in the FX slot and no sender/receiver it gives a reading of between -9999 and 31ms. With the plugins it is showing 215ms – this is what the metering states.

It’s the iPad Pro 10.5" (2017)…I’ll need to cross check with my iPhone 7+.

I did get some screenshots of this but I think a video would better demonstrate…so, pending…

Does it change once you establish the routing?

I also wonder if deleting the beta version and reinstalling might fix something. This is really odd.


This. Reinstalling worked.

I went through and deleted all remaining traces of anything Zerodebug related and reinstalled. This included the Touchable Pro and Studiomux servers. Restarted, reinstalled latest betas of both, allowed security permission for Studiomux, restarted again, confirmed plugins are installed and now we are off to the races!

I also reinstalled the beta iOS apps just to be sure.

I suspect, as I hypothesized before, that it was in fact something hanging around from the old Studiomux server/driver interfering with the current beta version…but who knows?

I’m experiencing good performance now. Tested the VST3 effect plugin with TH-U (iOS) against TH-U (MacOS) in parallel. Same audio input. And any latency is negligible.

Also tested the VST3 instrument plugin with OB-Xd (DiscoDSP) against the macOS version and its performance is solid.

Bug-wise: I had one loud audio pop, one crash triggered by duplicating a track with an SMUX plugin on it (this always happens), and a bit of a strange phenomenon where the plugin seems to be trying to find the right sample rate — produced audio starts out about a tone above pitch and gradually slides down to meet the actual audio input’s real pitch. I also suddenly lost audio a few times and had to restart the server, app and DAW.

N.B. Only tested with iPad Pro at this stage.

Great! That’s good news.

Hi,

Not yet. I’m currently testing and implementing it, while trying to get the numerous non-working AU effects to finally work.

Will be fixed within the next beta, though I will exclude the samplerate setting. On iOS the sample rate is bound to the hardware, as I understand it. Up- or downsampling will be done on the receiver side anyway (VST/AU if audio is sent from the mobile device; the studiomux app if audio is sent from the desktop).

Thanks for trying it out, and good to hear that it finally works. Unfortunately we still have no idea why a clean reinstall has such a major impact on the functionality.

On Windows, we will try to modify the installer so each new install properly cleans up all formerly installed files (this should already be done by overwriting the older files on install but, as you experienced, it obviously is not).

On Mac we would have to give up the very convenient (for us :)) .dmg installer and set up a proper installer.

Will try to fix it within the next update.

The pitch/playing tempo adjustment is actually not only related to the receiver’s sample rate:

  1. Each device has a unique CPU clock, meaning you will always have differences in playback tempo/pitch. For example: your desktop will play 44100 samples in 1.00000001 seconds while your iOS device will play the same number of samples in 0.999999 seconds.

  2. Receiving audio from the network and audio playback are processed in parallel. It is not a serial process like:

    process1: receive audio -> process2: play audio -> process1: receive audio -> process2: play audio …

but a parallel one:

    process1: receive audio -> receive audio -> receive audio …
    process2: play audio -> play audio -> play audio …

To ensure audio can be rendered continuously, we have to keep a certain number of samples in cache.

The pitch adjustment you hear at the beginning is due to the fact that we want to minimize the latency (the number of samples in cache) as fast as possible after starting the audio transmission: as soon as the audio receiver (plugin/app) has received samples from the network, we start the playback (usually playback starts after the receiver got around 4 buffers from the network). In case the number of cached samples is greater than 2.3 * buffersize, we increase the playback tempo to make sure the latency will be around 2.3 * buffersize at maximum.

Hope my explanation makes sense… In case it does not, feel free to ask. I attached the actual code.

When receiving audio, we calculate the clock_drift_addition value (one of the factors that determine the playback pitch).

    if (cached_samples > buffersize * 2.3) {
        clock_drift_addition = pow(((double)cached_samples - buffersize * 2.3) / (16384.0 * 4), 1.2);
    }
    else {
        clock_drift_addition = 0;
    }

Before pulling audio for playback, we calculate the number of samples we need to pull from cache (size_of_audio_needed); the clock_drift is derived from the receiver/sender CPU clock relation:

    double index_change_per_sample = (1 + clock_drift + clock_drift_addition) * sample_rate_factor;
    int size_of_audio_needed = buffersize * index_change_per_sample + rest_of_sample_index;

Process of transmitting audio from mobile device to plugin:

  1. We start sending audio from the device and notify the plugin that it may start to render audio.
  2. The plugin’s sample cache fills.
  3. As soon as the cache has enough samples, the plugin starts to render audio. When the plugin starts rendering it usually has more than 2.3 * buffersize samples in cache, so the plugin has to increase the playback tempo (sweep) to adjust the latency.
  4. As soon as the number of samples in cache is lower than 2.3 * buffersize, the tempo adjustment stops (it actually does not stop, due to the clock drift, but the adjustment is not noticeable; the clock drift is very small, ~1.0001).

Hope I could at least answer most of the questions.

Best

Pascal
