Channel bonding for live video streaming

Hey everybody,
I’m testing both the MAX HD4 and the SpeedFusion Engine for live video streaming.
For the MAX HD4 I’m using only 4 SIM cards. I noticed that while using all 4 SIM cards at the same time (bonding them), there is clearly more data passing (I can see it in speed tests), but the video is no better than when using only 1 SIM.
How is this possible? What can I do to make it better?
Thanks

Hi and Welcome to the forum!
Live video streaming is always a question of getting the cleanest connection possible so that the encoded video running over the top has the best possible journey (i.e. shortest path, least packet loss). However, how you tweak and adjust things to fine-tune that connection depends on your requirements.

I’ve worked with customers who need the most reliable video stream possible with the lowest latency (<100ms) and who use WAN smoothing to duplicate the video stream 4 times across all cellular WAN links to make that happen. The compromise is that a single 4Mbps video stream consumes more than 16Mbps of cellular data.
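
As a back-of-envelope sketch of that duplication overhead (a rough model only, assuming one full copy of the stream per active WAN link):

```python
# Rough model: WAN smoothing sends a copy of the stream down each link,
# so cellular data usage scales with the number of copies.

stream_mbps = 4   # encoder output bitrate
wan_links = 4     # cellular links, each carrying a copy

total_mbps = stream_mbps * wan_links
print(f"{stream_mbps} Mbps stream x {wan_links} links = {total_mbps} Mbps of cellular data")
# -> 4 Mbps stream x 4 links = 16 Mbps of cellular data
```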

I’ve also worked with customers who want the most bandwidth possible and don’t care about latency. So we add network buffers everywhere (increasing latency to more than 2 secs), turn on Forward Error Correction, and up the bitrate on the encoder. I’ve done 100Mbps streams over cellular reliably between countries this way for live remotely produced studio shows.

What type of live stream are you doing? Is high bandwidth more important than low latency?

It sounds like you are getting congestion on your cellular links when you combine 2 or more. The first thing I would try is Dynamic Weighted Bonding as that is proving very successful for me in multi-cellular deployments. Then enable WAN smoothing if DWB is not enough.

Good luck!

And are you using SRT in your encoder?

Hi @MartinLangmaid
You always have good answers on this forum. I’ve been reading a lot of information lately, as I recently adopted the Pepwave infrastructure at the beginning of the year. And you’re from my home county of Devon too, which is awesome!

My question about livestreaming is: what settings on the Pepwave will give me the greatest resiliency, and when should I favour bandwidth over latency? For my Zoom webinars I just have a rule which sends the Zoom traffic to my FusionHub WAN smoothing tunnel via SaaS in IC2, but what about video broadcast? I stream to Facebook Live, Vimeo and YouTube and have had some luck with the FEC tunnel, but have been running into bandwidth issues and breakups using LTE WANs. Especially with OBS, where I get severe packet loss whenever I enable FEC! If I’m streaming at between 4 and 10Mbps, should I even be concerned about latency, and how would I go about prioritising the bandwidth on the FEC tunnel?

Thanks

Another Devonian! I suspect we are a rare breed on here 😉

If you have enough bandwidth, use WAN Smoothing set to High. If you don’t have enough bandwidth, everything you do after that is a compromise.

Typically I show my streaming customers how to run WAN bandwidth tests and SpeedFusion bandwidth tests to work out how much quality bandwidth they have, and then we typically use FEC and Dynamic Weighted Bonding, sometimes with a hard bandwidth limit set in the tunnel too.

You can prioritise bandwidth when latency doesn’t matter. If you are hosting a panel-based webinar with back-and-forth conversations, (low) latency will be king, since interacting in real time with anything works best when latency is low. (Which reminds me: Dave Reynolds at Realm Pictures is another Devonian, and his use case makes this point well https://www.youtube.com/watch?v=ZAeMdOWE52o)

If you are just presenting, or using a 3rd-party hardware encoder to stream video from a wedding or sports event, that encoder could well be adding 2-4 secs of latency to the video stream anyway, so adding a little more buffering to the underlying IP stream won’t matter in practical terms or to the user experience.

I most often see issues on LTE when upload becomes saturated and there is buffer bloat. Dynamic Weighted Bonding is often a cure, and a hard upload bandwidth limit set at 90% of the measured available bandwidth over the links can also help.
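
A quick sketch of that headroom calculation (the 90% figure is just my rule of thumb, and the per-link speeds below are made-up examples):

```python
# Derive hard upload caps from measured per-link upload speeds.
# The 90% headroom factor is a rule of thumb to avoid buffer bloat,
# not a Peplink-defined constant; the measured values are illustrative.

measured_upload_mbps = {"cellular1": 12.0, "cellular2": 8.5, "wan": 20.0}
HEADROOM = 0.90  # cap each link at 90% of what the bandwidth test measured

caps = {link: round(mbps * HEADROOM, 1) for link, mbps in measured_upload_mbps.items()}

for link, cap in caps.items():
    print(f"{link}: set upload limit to {cap} Mbps")
print(f"Bonded tunnel hard limit: {round(sum(caps.values()), 1)} Mbps")
```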

Hi @MartinLangmaid,

I will give that a go. If I’m at a location I will usually test the LTE WANs individually, then the combined bonded throughput using the PepVPN speed test in the router. So you reckon I should use the FEC tunnel to my local FusionHub, with FEC and DWB enabled on the tunnel, for best one-way video streams? I have noticed that OBS really hates both FEC and DWB, but I will try it with my hardware encoder! And what about the receive buffer? Would that be a useful thing? How does it work? I have tuned the DWB tunnel very well now using the cut-off and suspension values.

Why does a hard upload limit help?

I currently have my Zoom webinars on the WAN smoothing tunnel set to Normal. Zoom is a pretty light bandwidth user, so if my speed tests show I can cope with the bandwidth, you reckon I should set Smoothing to High? I thought ‘Normal’ duplicates every packet down all the links anyway?

Thanks again

I enjoy this thread as I too am using a Pepwave Max Transit Duo for video streaming. I am curious what you have found to be the most effective since this exchange. I am planning on using the Pepwave from home with good broadband as well as in the field with widely varied internet availability.

If you don’t mind sharing what you have experienced in the past 4 months since this post was last addressed, I’d love to hear what is working for you and what isn’t. I have an ATEM encoder for video streaming as well as OBS, so anything you can recommend would be appreciated.

Dan

I’m using a 10-year-old Max 700, with FusionHub on Vultr.

Bonding 2 WANs. WAN1 is fed from a GL.iNet travel router which is capturing the on-site WiFi. The on-site internet service is provided by a mobile LTE router.

The second WAN is captured as WiFi-as-WAN on the Max 700, fed from an iPhone hotspot on an LTE connection.

I bond both and use FEC on the Low setting. I have also set a custom latency cut-off of 500ms.

I have set my receive buffer on the backend FusionHub tunnel to 250ms.

So far it seems to be working OK. The latency on the iPhone hotspot can be erratic at times. I run into problems if the on-site WiFi is misbehaving.

Bonded speeds average around 7-10Mbps upload.

We use vMix and livestream to YouTube at 720p, 2500kbps. We tried 1080p but that failed miserably.

Hi @dansherer,

Since the last post on this discussion, I have used my Pepwave Max Transit on around 50 livestreams and Zoom webinars. Here is a summary of my findings.

I set up a ‘production network’ for the backstage team and myself for general internet, show content downloading etc., using the WAN port as the hot failover for most tasks and the 2 cellular links as dedicated streaming connections, or backups if the WAN fails. The WAN in this case is either venue-provided internet or a CAT 20 4G modem if I’m doing remote streaming.

If I’m just on cellular and the 4G modem connected to the WAN is fast enough, I will leave the setup as is; otherwise I will change the outbound policy rules to bond all 3 connections together using DWB over the usual bonding method. In my experience this gives slightly higher bandwidth and a more consistent connection on the bonded link.

When it comes to streaming, I have another VLAN set up for any streaming PCs, which have their own outbound rule that uses the WAN smoothing tunnel (on Normal) over both cellulars and the WAN (as before, either another attached 4G modem or venue-supplied internet). This works flawlessly for livestreaming: I’ve never had any dropouts or packet loss. The same with Zoom webinars; I either use the VLAN with the Zoom machines attached, or put them in the main production network and let the outbound policy rule detect the Zoom traffic and send it down the WAN smoothing tunnel rather than the Hot Failover tunnel.

I do not recommend FEC at all for streaming. I have tried various methods and platforms and have experienced dropouts and problems each time. I have used OBS, Wirecast, Blackmagic Web Presenter, Teradek VidiU Pro and ATEM Mini Extreme, and all have had issues when using FEC, so I have disabled it entirely. It doesn’t work for me and I’m not risking any more streams over it! Since using only WAN smoothing I have had no further problems.

The only other issue is the hardware limit of 90Mbps on the Max Transit while using the SpeedFusion VPN. One day I will upgrade to the MDX, but for now it’s fine for livestreaming.

I also had an issue using the SpeedFusion Cloud service with Zoom. Maybe it was my particular setup, but I had a slew of breakups and high packet loss when testing this. When I went back to my locally hosted FusionHub in my own city, it was flawless again. But your mileage may vary.

Any other questions, feel free to hit me up. I’ve loved using the Pepwave and it has really improved the ‘unbreakable’ event internet system I use myself and hire out to AV companies.

Cheers

Hey thanks for sharing. I’m still feeling my way through all the ways the Pepwave can be utilized.

Forgive my ignorance, but when it comes to VLANs I have very little experience setting those up. I understand the broad strokes, but not the details. How do you determine which PCs end up on which VLAN? If you care to explain a little about that, that’d be great; in the meantime I’ll be doing some more reading up on setting up a VLAN.

Cheers,

Dan

Hi @dansherer,

Yes, happy to clarify my VLAN setup!

By default, your MAX Transit will have a network of 192.168.55.x, which is a flat network (no VLAN). There is no ‘tagging’ of packets required to use this network.

If you want to have other networks on the MAX Transit to help segregate traffic, then VLANs can help. These are ‘virtual’ networks which can exist on the same Ethernet interface, and using them requires a managed network switch connected to the MAX Transit’s LAN port.

By default, the MAX Transit LAN port will be set up as a ‘trunk’, meaning it will send your main network and any ‘tagged’ VLANs out of the port. You could also set it as ‘access’, meaning it would only transmit one of the VLANs and nothing else, if that’s what you wanted. This is found in the LAN port settings on the MAX Transit.

Switch Config

In your managed switch config, you would first enter the VLAN IDs of the networks created on the MAX Transit so the switch can identify them, and then you can ‘untag’ each port to a specific network on the switch according to your needs. The port which connects back to the MAX Transit should always be a trunk port, though, so it can receive all the networks.
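
To make that concrete, here is an entirely hypothetical plan of the kind I mean (the VLAN IDs, names and subnets are made up for illustration, not taken from my actual config):

```python
# Hypothetical VLAN plan for illustration only; the IDs, names and
# subnets are invented, not from a real config.

vlan_plan = {
    10: {"name": "production network", "subnet": "192.168.10.0/24"},
    20: {"name": "streaming PCs",      "subnet": "192.168.20.0/24"},
}
# The main LAN (192.168.55.0/24) travels untagged alongside these.

# Switch port roles: the uplink back to the router carries everything
# (trunk); each other port is untagged into a single VLAN (access).
port_roles = {
    "port 1 (uplink to MAX Transit)": "trunk: untagged LAN + VLANs 10, 20",
    "port 2": "access: VLAN 10",
    "port 3": "access: VLAN 20",
}

for vid, net in vlan_plan.items():
    print(f"VLAN {vid}: {net['name']} -> {net['subnet']}")
for port, role in port_roles.items():
    print(f"{port}: {role}")
```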

Cheers

Right, it’s coming back to me now. The managed switch is the part of the equation I had forgotten. Thanks. Not sure I need VLANs for my setup at this point but I appreciate the intel!

I could achieve something similar without VLANs, using outbound policies based on IP addresses or on traffic destination.

Thanks for the help,

Dan

We use NDI with PTZ cameras, but we set up our own network behind our Max 700 so we don’t need to further segregate using VLANs.

But if integrating NDI devices on an existing network, it might help keep things contained to use a VLAN for NDI and streaming. NDI uses mDNS for auto-discovery, so it is limited to the local subnet as its broadcast domain unless a broadcast gateway is used to extend discovery to other subnets.

https://support.newtek.com/hc/en-us/articles/218109477-NDI-Discovery-and-Registration?mobile_site=true
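
If you want to sanity-check which NDI sources are visible on a given subnet, here’s a minimal sketch using the python-zeroconf library (assuming NDI’s standard `_ndi._tcp` mDNS service type):

```python
# Browse the local subnet for NDI sources over mDNS using the
# python-zeroconf library (pip install zeroconf). Assumes NDI's
# standard "_ndi._tcp.local." service type; run from a machine
# on the NDI VLAN.

import time
from zeroconf import ServiceBrowser, ServiceListener, Zeroconf

class NDIListener(ServiceListener):
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        addrs = info.parsed_addresses() if info else []
        print(f"Found NDI source: {name} at {addrs}")

    def remove_service(self, zc, type_, name):
        print(f"NDI source gone: {name}")

    def update_service(self, zc, type_, name):
        pass  # required by the listener interface

zc = Zeroconf()
browser = ServiceBrowser(zc, "_ndi._tcp.local.", NDIListener())
try:
    time.sleep(10)  # listen for announcements for ten seconds
finally:
    zc.close()
```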

I’ve been trying to get my Blackmagic Design ATEM Mini Pro and Web Presenter HD encoders to work in conjunction with my Transit Max Duo, but sadly have not been able to get it to work properly.

I did a factory reset, re-enabled the SpeedFusion service, and created an outbound policy to force all connections over SpeedFusion Cloud. I also tried the Dynamic Weighted Bonding, Forward Error Correction and WAN Smoothing settings, but no matter what I try, I keep getting the same errors.

The ATEM encoder cache keeps filling up, and it seems the packets are not getting delivered to the Vimeo server. The encoding process is set at 6Mbps, but the SpeedFusion service is only registering around 700kbps.

How have you guys been able to set up an ATEM RTMP stream to begin with (let alone optimise bonding settings and all that)? I wasn’t expecting it all to be this complicated, to be honest.

Any help would be appreciated!

Peter

How many WANs are you bonding? What speeds are you getting when running WAN analysis?

For live streaming, FEC is the recommended approach, I believe. WAN smoothing will use double the data, as it replicates packets across all WANs, and is ideal for video conferencing where latency and dropped packets are more noticeable.

We stream to YouTube at 2.5Mbps and 720p, even though our bonded speed tests usually show adequate speeds. A 1080p 4.5Mbps stream didn’t work for us.
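
For budgeting cellular data, here’s a rough sketch of what that duplication costs over an hour (it assumes two copies of every packet, per the doubling described above; the bitrate matches our 720p stream):

```python
# Rough data-budget sketch: duplicated packets multiply cellular usage.
# Assumes 2 copies of each packet (WAN smoothing across 2 WANs).

stream_mbps = 2.5   # encoder bitrate for our 720p stream
copies = 2          # one duplicate per packet doubles the data

total_mbps = stream_mbps * copies
gb_per_hour = total_mbps / 8 * 3600 / 1000  # Mbit/s -> GB over one hour

print(f"On-air rate: {total_mbps} Mbps, ~{gb_per_hour:.2f} GB of cellular data per hour")
# -> On-air rate: 5.0 Mbps, ~2.25 GB of cellular data per hour
```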

Look at the Speedfusion graphs to see how the tunnel is performing.

I’m bonding 2 cellular WANs. Performing speed checks indicates that connection quality and speeds are more than sufficient for standard web browsing activities. When streaming from the ATEM, however, it looks like the device is unable to push enough bandwidth. In other words, even though the ATEM is encoding at 6500kbps, the Peplink is only showing around 800kbps of throughput, even though the device is licensed to perform at 200Mbps. It looks like something is bottlenecking this specific type of connection.

I’ve been considering replacing our current line-up of LiveU bonding encoders with the Peplink line of products, but this situation is really concerning.

I’ll take a look at your setup for you if you like. It would be interesting to see what’s going on and how we might fix it.

Hi Martin,

This would be great. Any way we can get in touch directly?

Regards,

Peter

Sure, just sent you a private message.

Wow, I just put up a post about network switches and how to use the Transit Duo so only my laptop uses the bandwidth to stream out to Vimeo, Facebook and YouTube via vMix! I am implementing PTZOptics cameras with PoE over NDI via a 10G switch. I’m still learning, but soaking it up.

Thank you for sharing your knowledge