Channel bonding for live stream video

Hey thanks for sharing. I’m still feeling my way through all the ways the Pepwave can be utilized.

Forgive my ignorance, but when it comes to VLANs I have very little experience setting those up. I understand the broad strokes, but not the details. How do you determine which PCs end up on which VLAN? If you care to explain a little about that, that'd be great; in the meantime I'll be doing some more reading on setting up a VLAN.



Hi @dansherer,

Yes happy to clarify my VLAN setup!

By default, your MAX Transit will have a network of 192.168.55.x, which is a flat network (no VLAN). No 'tagging' of packets is required to use this network.

If you want other networks on the MAX Transit to help segregate traffic, VLANs can help. These are 'virtual' networks which can exist on the same Ethernet interface, and they require a managed network switch connected to the MAX Transit's LAN port to make use of them.

By default, the MAX Transit LAN port will be set up as a 'trunk', meaning it sends out your main network plus any 'tagged' VLANs on that port. You could also set it to 'access', meaning it would transmit only one of the VLANs and nothing else, if that's what you wanted. This is found in the LAN port settings on the MAX Transit.
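To make the 'tagging' idea concrete: an 802.1Q tag is just four extra bytes inserted into the Ethernet header, carrying the VLAN ID. Here's a small illustrative sketch (not anything Peplink-specific) that adds and reads such a tag on a raw frame:

```python
import struct

TPID_8021Q = 0x8100  # EtherType value that marks a frame as VLAN-tagged

def add_vlan_tag(eth_frame: bytes, vlan_id: int, priority: int = 0) -> bytes:
    """Insert an 802.1Q tag after the dst/src MAC addresses (first 12 bytes)."""
    tci = (priority << 13) | (vlan_id & 0x0FFF)   # PCP (3 bits) + DEI + VID (12 bits)
    tag = struct.pack("!HH", TPID_8021Q, tci)
    return eth_frame[:12] + tag + eth_frame[12:]

def read_vlan_id(eth_frame: bytes):
    """Return the VLAN ID if the frame is tagged, else None."""
    tpid, tci = struct.unpack("!HH", eth_frame[12:16])
    if tpid != TPID_8021Q:
        return None
    return tci & 0x0FFF

# Dummy frame: 12 bytes of MAC addresses + EtherType (IPv4) + payload
frame = bytes(12) + struct.pack("!H", 0x0800) + b"payload"
tagged = add_vlan_tag(frame, vlan_id=20)
print(read_vlan_id(frame), read_vlan_id(tagged))  # None 20
```

A trunk port carries frames with this tag intact; an access port strips the tag and delivers plain frames for one VLAN only.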

Switch Config

In your managed switch config, you would first enter the VLAN IDs of the networks created on the MAX Transit so the switch can identify them, and then 'untag' each switch port to a specific network according to your needs. The port which connects back to the MAX Transit should always be a trunk port, though, so it can carry all of the networks.


Right, it's coming back to me now. The managed switch is the part of the equation I had forgotten. Thanks. Not sure I need VLANs for my setup at this point, but I appreciate the intel!

I could achieve similar segregation without VLANs by using outbound policies based on IP addresses or on traffic destination.

Thanks for the help,


We use NDI with PTZ cameras, but we set up our own network behind our MAX 700, so we don't need to further segregate using VLANs.

But if you're integrating NDI devices on an existing network, it might help keep things contained to use a VLAN for NDI and streaming. NDI uses mDNS for auto-discovery, so it is limited to the local subnet as its broadcast domain unless a broadcast gateway is used to extend discovery to other subnets.
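The subnet point above is easy to check: mDNS discovery from one device will only reach another if both sit in the same IP network. A quick sketch using Python's standard `ipaddress` module (addresses below are illustrative, based on the MAX Transit's default 192.168.55.x network mentioned earlier):

```python
import ipaddress

def same_subnet(host_a: str, host_b: str, prefix: int) -> bool:
    """True if both hosts fall inside the same IPv4 network, i.e. mDNS
    multicast from one would reach the other without a gateway."""
    net_a = ipaddress.ip_interface(f"{host_a}/{prefix}").network
    net_b = ipaddress.ip_interface(f"{host_b}/{prefix}").network
    return net_a == net_b

# e.g. an NDI camera and a vMix PC on the default 192.168.55.x /24 network
print(same_subnet("192.168.55.10", "192.168.55.20", 24))  # True
print(same_subnet("192.168.55.10", "192.168.60.20", 24))  # False
```

If the second check is your situation (camera and PC on different subnets), NDI devices won't see each other automatically.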

I've been trying to get my Blackmagic Design ATEM Mini Pro and Web Presenter HD encoders to work in conjunction with my MAX Transit Duo, but sadly I have not been able to get it working properly.

I did a factory reset, re-enabled the SpeedFusion service, and created an outbound policy to force all connections over SpeedFusion Cloud. I also tried the Dynamic Weighted Bonding, Forward Error Correction, and WAN Smoothing settings, but no matter what I try, I keep getting the same errors.

The ATEM encoder cache keeps filling up, and the packets are not getting delivered to the Vimeo server, it seems. The encoder is set to 6 Mbps, but the SpeedFusion service is only registering around 700 kbps.

How have you guys been able to set up an ATEM RTMP stream to begin with (let alone optimising bonding settings and all that)? I wasn't expecting it all to be this complicated, to be honest.

Any help would be appreciated!


How many WANs are you bonding? What speeds are you getting when running WAN Analysis?

For live streaming, FEC is the recommended approach, I believe. WAN Smoothing will use double the data, as it replicates packets across all WANs; it is ideal for video conferencing, where latency and dropped packets are more noticeable.
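The data-usage trade-off above works out roughly like this (a back-of-the-envelope sketch; the ~15% FEC overhead figure is an assumption for illustration, as the real overhead depends on the FEC level configured):

```python
def wan_usage_mbps(stream_mbps: float, mode: str, n_wans: int = 2,
                   fec_overhead: float = 0.15) -> float:
    """Rough total upload consumed across all WANs.
    - "smoothing": every packet is replicated on each WAN
    - "fec": one copy plus parity packets (overhead assumed ~15% here)
    - anything else: plain bonding, no redundancy"""
    if mode == "smoothing":
        return stream_mbps * n_wans
    if mode == "fec":
        return stream_mbps * (1 + fec_overhead)
    return stream_mbps

print(wan_usage_mbps(6.0, "smoothing"))          # 12.0
print(round(wan_usage_mbps(6.0, "fec"), 2))      # 6.9
```

So for a 6 Mbps stream over two WANs, smoothing burns about 12 Mbps of combined upload, while FEC stays much closer to the raw stream rate.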

We stream to YouTube at 2.5 Mbps and 720p, even though our bonded speed tests usually show adequate speeds. A 1080p stream at 4.5 Mbps didn't work for us.

Look at the SpeedFusion graphs to see how the tunnel is performing.


I'm bonding 2 cellular WANs. Speed tests indicate that connection quality and speeds are more than sufficient for standard web browsing. When streaming from the ATEM, however, it looks like the device is unable to push enough bandwidth. In other words, even though the ATEM is encoding at 6,500 kbps, the Peplink is only showing around 800 kbps of throughput, even though the device is licensed for 200 Mbps. It looks like something is bottlenecking this specific type of connection.

I've been considering replacing our current line-up of LiveU bonding encoders with the Peplink line of products, but this situation is really concerning.

I'll take a look at your setup for you if you like. Would be interesting to see what's going on and how we might fix it.


Hi Martin,

This would be great. Any way we can get in touch directly?



Sure. Just sent you a private message.


Wow, I just put up a post about network switches and how to use the Transit Duo so only my laptop uses the bandwidth to stream out to Vimeo, FB, and YouTube via vMix! I am implementing PTZOptics cameras (PoE) over NDI via a 10G switch. I'm still learning, but soaking it up.

Thank you for sharing your knowledge

I finally figured it out, guys. I switched IoT SIM card providers and all is working fine now. For some reason, my previous provider was giving me all sorts of trouble. I don't know why, but at least everything seems to be working now. I won't name names in public here, so if you want to know my previous provider and which provider I switched to, send me a DM.



Glad you finally fixed it! I too had issues with ATEM products and Pepwave. I found in my testing that they didn't like the bonding or FEC tunnels, but if I used a single WAN or the WAN Smoothing tunnel, all the problems went away. Similar results for the Vidiu Pro, OBS, and the Web Presenter. My ATEM Extreme loves the WAN Smoothing tunnel!

I also finally got to the bottom of the 75 Mbps limit on my MAX Transit Duo. I don't think it was QoS at all that was causing it, as turning it off only increased the Pepwave's throughput by 5 Mbps. When @MartinLangmaid suggested a bandwidth limit on the upload, I was skeptical but gave it a go. In the tunnel settings, I applied a 150 Mbps download/upload limit, knowing my Transit Duo had a hardware limit of 100 Mbps across the PepVPN tunnel.

Now, when testing a fast WAN, I can see speeds of 130 Mbps max, where before it would cap out at 75 Mbps. Weird, but it works.

This thread is great. I have been using my MAX Transit Duo with my ATEM Mini Pro and ATEM Mini Pro Extreme ISO and it has been working very well. I assign the ATEMs to various SFC tunnels via their MAC addresses, so even if I happen to change my IP addresses, it automatically picks the traffic up and sends it where I want.

I use an SFC hot failover tunnel in conjunction with a hardline internet connection at a venue if a client purchases my "backup internet" service. This way I am truly saving my SIM/LTE data for use in a backup situation only.

If I find a location’s hardline to be insufficient to support a stream, then I will use a bonding tunnel to achieve better bandwidth for the stream and let the client know that the “backup internet” is now being used just to get their stream going.

And finally, if a client wants the best quality stream I can provide, then I use all the connections available, employ WAN smoothing, and set my stream bitrate at the highest setting the combined tunnel can achieve.
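The tiered approach described above (hot failover when the venue line is good, bonding when it isn't, smoothing for the premium tier) boils down to a simple selection rule. A sketch, with profile names purely illustrative:

```python
def pick_tunnel(venue_line_ok: bool, premium_service: bool) -> str:
    """Mirror the tiered approach described above (names illustrative):
    - premium client     -> WAN smoothing across every available WAN
    - good venue line    -> hot failover (SIM/LTE data kept in reserve)
    - weak venue line    -> bonding (add cellular bandwidth to the stream)
    """
    if premium_service:
        return "wan-smoothing"
    return "hot-failover" if venue_line_ok else "bonding"

print(pick_tunnel(True, False), pick_tunnel(False, False), pick_tunnel(False, True))
# hot-failover bonding wan-smoothing
```

The nice property of this ordering is that cellular data is only consumed when the stream actually needs it.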

So far so good!

@dansherer great news!

This is exactly what I do as well. I work in live events, but as a side project I do temporary internet connections for event Wi-Fi and livestreaming, so if the client is concerned about flaky venue Wi-Fi, I will just provide my Pepwave as an extra service! Couple that with a battery pack or UPS and you have the ultimate unbreakable internet!

I just ordered 3 Peplinks: two Balance Twos and a BR1 5G. One Balance is for my home studio, another for a permanent stadium location, and the BR1 5G for road events. I am pretty nervous about it all, but I too am looking for a more robust solution for livestreaming. I have a LiveU Solo, but I don't like that I can't also have the rest of my devices on it, and it requires way more cables than I want when setting up at an away venue with only about an hour to do so, along with everything else we have to set up.

If I am not using the LiveU to broadcast, then I generally use vMix. vMix is my preference if I can get all the bonding to work. I am really hopeful, but also confused by the differing opinions on setup. I don't care much about latency; I just want my 720p/60fps stream to reach my RTMP destination in a resilient way. I will change which source WANs I use in my BR1 5G depending on needs. The basic setup will look like this if there is no local Ethernet:

5G T-Mobile SIM in the SIM slot
5G AT&T Nighthawk via Ethernet
4G AT&T Nighthawk via Wi-Fi-as-WAN (2.4 GHz)
5G T-Mobile hotspot from my phone as a second Wi-Fi-as-WAN (5 GHz)

In a venue with Ethernet or Wi-Fi, I will switch the 5G Nighthawk to Wi-Fi, use the venue Ethernet as a WAN, and drop the 5G T-Mobile phone.

Now that all of the above has been mentioned, has anyone with a similar setup come up with a solution that works best?

NOTE: I will be using a MikroTik switch, and I use NDI and Dante pretty heavily. These don't go out to the internet, but they would be traffic flowing through the switch and to my main streaming PC.


I too have a LiveU and run vMix. I got a Peplink UBR LTE and bond one 4G SIM, one Nighthawk with a 4G SIM as a wired WAN, and a TP-Link MiFi modem with a 4G SIM as a Wi-Fi WAN, using SpeedFusion Cloud.

Can stream 1080p/50 at 8,000 kbps steadily.

Wi-Fi performance is poor, so I run hardwired.

Image shows a SpeedFusion upload test, then the vMix stream.


Steve, I'd love to see your bonding settings! We are having serious issues trying to bond out of vMix. I'm using two 5G Nighthawks over Ethernet, and the two 5G internal modems on T-Mobile. I've tried every bonding profile, and it seems like after about 5-10 minutes of solid streaming, vMix turns orange, the buffer fills up, and eventually the stream just completely fails.


Can you share your Peplink settings that have been working for you?

SpeedFusion and outbound policy settings.