Vision mixers tend to involve expensive, complicated hardware. We made one for the browser that anyone can use.

The television will be revolutionised

In May 2017 we started working on one of the most challenging short-term projects we’ve ever had.

Since 2012 BBC R&D has been running a major programme to develop IP Studio, a broadcast television platform based on internet technology. Our brief was to build a front-end application that interfaces with IP Studio’s back-end services. In other words, a vision mixer that works in a web browser.

From truck to mouse

As the name suggests, vision mixing takes multiple video and audio inputs and produces a final composite output, including graphics, pre-recorded video and live feeds. This is already a significant task but, to further complicate matters, our solution had to be suitable for live television. This means it had to interface with specialist equipment that can handle 4K streams in real time with no buffering.

Traditionally, even a modest live broadcast might require half a dozen people. When you’re filming away from the studio, it also needs an outside broadcast van costing millions of pounds. BBC R&D wanted to do the same job with just a keyboard and a mouse.

Technologies: React
Duration: 3 months
Team size: 4 devs
Hosted: Private cloud

It’s about time

We soon discovered that the big issue was time. Our solution had to combine multiple flows into a single synchronised output. This is fiendishly difficult because some flows arrive later than others, so the system has to buffer and reorder them until everything is running in step.
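To make that concrete, here’s a minimal sketch, in TypeScript, of the kind of timestamp-based alignment involved: frames from each flow are buffered, and a timestamp is only released once every flow has delivered a frame for it. The names, units and data shapes are ours for illustration, not IP Studio’s.

```typescript
// Illustrative only: buffer incoming frames per flow and release a
// timestamp only when every flow has delivered a frame for it, so
// late arrivals never jump the queue.

type FlowId = string;

interface Frame {
  flowId: FlowId;
  timestamp: number;    // presentation time in ms (hypothetical unit)
  payload: ArrayBuffer; // encoded video or audio data
}

class FlowSynchroniser {
  private buffers = new Map<FlowId, Map<number, Frame>>();

  constructor(private flowIds: FlowId[]) {
    for (const id of flowIds) this.buffers.set(id, new Map());
  }

  // Store a frame, then return the aligned set for its timestamp if
  // every flow has now delivered one; otherwise return null and wait.
  push(frame: Frame): Frame[] | null {
    this.buffers.get(frame.flowId)?.set(frame.timestamp, frame);

    const aligned = this.flowIds.map((id) =>
      this.buffers.get(id)?.get(frame.timestamp),
    );
    if (aligned.some((f) => f === undefined)) return null;

    // Every flow is present for this timestamp: release it and tidy up.
    for (const id of this.flowIds) {
      this.buffers.get(id)?.delete(frame.timestamp);
    }
    return aligned as Frame[];
  }
}
```

A real mixer also has to decide when to give up on a flow that never arrives, and drop stale frames so buffers don’t grow without bound; we’ve left that out here for brevity.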

Also, as the vision mixing technician cuts to a different camera, fades out an image or superimposes a graphic, they have to see the effect of their decisions instantly and in the right order. This means showing lower resolution streams in the browser while sending the corresponding high resolution streams out for broadcast. As we said, it was challenging.
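As a rough illustration of that split, the sketch below applies a cut to the low resolution browser preview immediately, while sending the same command, stamped with a target time, to the back-end that switches the high resolution broadcast output. The command shape, endpoint and lead time are invented for the example; they aren’t the real IP Studio interface.

```typescript
// Hypothetical command path: update the preview in the browser straight
// away, and tell the back-end to make the same switch on the broadcast
// output at a slightly later target time so everything stays in order.

interface MixCommand {
  action: "cut" | "fade" | "overlay";
  source: string;     // e.g. "camera-2"
  targetTime: number; // when the back-end should apply the change
}

// Placeholder endpoint; a real app would also wait for the socket's
// "open" event before sending anything.
const backend = new WebSocket("wss://example.invalid/mixer/commands");

const previewStore = {
  activeSource: "camera-1",
  setActiveSource(source: string) {
    this.activeSource = source; // in the real app this drives a React re-render
  },
};

function cutTo(source: string): void {
  // 1. Reflect the decision in the low resolution preview instantly.
  previewStore.setActiveSource(source);

  // 2. Ask the back-end to make the same cut on the high resolution output.
  const command: MixCommand = {
    action: "cut",
    source,
    targetTime: Date.now() + 100, // small lead time, chosen arbitrarily here
  };
  backend.send(JSON.stringify(command));
}

cutTo("camera-2"); // preview updates now; broadcast output follows at targetTime
```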

The learning curve

We built the vision mixer in just eight weeks, using React. Needless to say, we learned a lot. We started by grasping the nettle.

We often work in areas that we have little to no knowledge of. A few recent examples include horse racing odds, medical curricula and epilepsy diagnosis. However, we don’t let this become a problem. Instead, we focus on designing and building products, letting our customers provide the industry expertise. We were lucky that BBC R&D had written a comprehensive brief and took us to see a traditional vision mixing gallery, so we could understand the practicalities.

Because the timescale was so tight we had to be both purists and pragmatists. Dealing with the synchronisation issues demanded perfection, because any bugs would lead to instability that was impossible to trace. On the other hand, there were some aspects of the UI that needed to perform and look excellent, but the code could be rougher around the edges. BBC R&D trusted us to juggle technical and commercial priorities and make the right decisions for the overall project.

More please

We’re really proud of this project. Our vision mixer was used at the Edinburgh Festival to broadcast real, live television, and it looks set to help revolutionise the industry. If you’ve got a similarly impossible brief, we’d love to hear from you.

See more of our work