The future of multiplexing: what’s in store?
Few consumers realise how much goes into the delivery of digital television content. The process is complex to say the least.
Two essential parts of the journey from camera to screen are encoding (also known as compression) and multiplexing, both of which take place at Arqiva's dedicated multiplexing facilities. These processes, like most others in the broadcast industry, continue to evolve – and with good reason.
So, what exactly is changing, and what can we expect for the future?
First, back to basics with multiplexing
We encode and compress for one reason: to get high quality digital pictures and audio into a realistic amount of bandwidth.
In the past, with analogue programming, each TV channel occupied its own transponder. Now that we're digital, without compression you'd need several transponders and a very high data throughput to make it work. Not only is this impractical, it doesn't fit the UK and international frequency plans. Compression, therefore, is essential.
After the channels are suitably compressed – typically by a ratio of around 100:1 or more - we aggregate them so they can fill transponder space efficiently, and be fed to the relevant distribution networks. This is called multiplexing.
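As an illustration, the aggregation step can be thought of as packing compressed channel bitrates into a fixed transponder budget. This is only a sketch – the capacity and per-channel bitrates below are assumed example figures, not Arqiva's actual numbers, and real multiplexes are planned far more carefully:

```python
# Illustrative sketch of multiplexing as bitrate packing. All figures are
# assumed examples, not real broadcast parameters.

TRANSPONDER_CAPACITY_MBPS = 40.0  # assumed usable transponder throughput

def build_multiplex(channels):
    """Add channels to the multiplex while the transponder budget allows."""
    mux, used = [], 0.0
    for name, rate_mbps in channels:
        if used + rate_mbps <= TRANSPONDER_CAPACITY_MBPS:
            mux.append(name)
            used += rate_mbps
    return mux, used

channels = [("News SD", 3.0), ("Sport HD", 8.0), ("Drama SD", 3.5),
            ("Movies HD", 8.0), ("Kids SD", 2.5), ("Docs HD", 8.0)]
mux, used = build_multiplex(channels)
print(f"{len(mux)} channels, {used} Mbps used")  # all six fit in 33.0 Mbps
```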
Further reading: Coding and multiplexing – How does multiplexing work?
Where are we now?
All in all, there are more than 80 multiplexes delivering TV to UK homes by Direct-To-Home (DTH) satellite - either via Freesat or Sky - around 40 per cent of which are provided by Arqiva. Across these multiplexes, there’s a blend of standard definition (SD) and high definition (HD) programming.
Within each of the 80 or so multiplexes, we're able to fit between ten and fifteen SD channels or around five HD channels - or a mix of both. Each service is allocated a set amount of bandwidth, based on what we, in collaboration with our customers, believe to be suitable for its programming. This works well, but challenges are inevitable.
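As a rough sanity check, those channel counts follow directly from the arithmetic. The per-channel bitrates here are assumed averages for illustration, not official figures:

```python
# Back-of-envelope check of the channel counts quoted above.
# The per-channel bitrates are assumed averages, not official figures.

MUX_MBPS = 36.0  # assumed usable multiplex bandwidth
SD_MBPS = 3.0    # assumed average SD channel bitrate
HD_MBPS = 7.0    # assumed average HD channel bitrate

sd_channels = int(MUX_MBPS // SD_MBPS)
hd_channels = int(MUX_MBPS // HD_MBPS)
print(sd_channels, hd_channels)  # 12 5 -> within "ten to fifteen SD", "around five HD"
```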
HD content is becoming the standard, and even higher-quality programming is on the horizon in the shape of ultra-high definition (UHD). We, and the multiplexing process, must adapt.
The first big step came in the form of statistical multiplexing – an approach that allows the most demanding content to borrow bandwidth, from time to time, from others in the same multiplex. Crucially, channels are both borrowers and lenders – any channel can equally lend bandwidth back to others when necessary – all working together for the best results. Within any programme, the complexity of the picture is constantly changing, so the statistical multiplexing needs to react dynamically, frequently and rapidly. A sports programme with plenty of movement, for example, will usually require more bandwidth than a panel show, but when it reverts to a simpler image, such as a close-up of a player's face, its bandwidth can be used elsewhere temporarily. All the channels within a multiplex therefore effectively form a lending/borrowing 'pool'.
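A minimal sketch of the idea (the channel names and complexity scores are hypothetical): a fixed bandwidth pool is reallocated moment to moment in proportion to each channel's current picture complexity, so demanding content borrows from simpler content and lends it back later.

```python
# Sketch of statistical multiplexing: a fixed pool is shared out in proportion
# to each channel's instantaneous picture complexity (hypothetical scores).

POOL_MBPS = 36.0  # assumed total multiplex bandwidth

def allocate(complexities):
    """Divide the pool proportionally to each channel's current complexity."""
    total = sum(complexities.values())
    return {name: POOL_MBPS * c / total for name, c in complexities.items()}

# A fast-moving sports sequence borrows bandwidth from simpler content...
busy = allocate({"Sport": 9.0, "Panel show": 2.0, "News": 1.0})
# ...and lends it back during a simple close-up shot.
calm = allocate({"Sport": 2.0, "Panel show": 2.0, "News": 1.0})
print(round(busy["Sport"], 1), round(calm["Sport"], 1))  # 27.0 vs 14.4
```

The pool total never changes; only its division between channels does, which is why a well-chosen mix of channels matters.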
This helps everything to run a lot more efficiently, and benefits all involved, from us and the broadcasters right through to the viewers, who get to enjoy subjectively better quality pictures.
The industry's use of statistical multiplexing will only continue to grow as we move forward, and as long as multiplex pools are thoughtfully and carefully loaded with an appropriate blend of channels, statistical multiplexing should continue to help solve these challenges.
We’ve also seen encoding technologies improve steadily over the years, allowing for greater compression. This means more channels per multiplex and ultimately more value for money for Arqiva’s customers.
What happens next?
Unsurprisingly, HD programming requires more bandwidth than SD programming, so UHD being the next natural step leaves us with various challenges to overcome. Compression will only go so far with the incredibly high resolution of a UHD picture, so it’s likely to be the case that even after encoding, we’re unable to fit more than two UHD channels in a single multiplex without compromising quality through overly aggressive compression. This then limits the usefulness of statistical multiplexing, which works best when there are several channels in the bandwidth pool, all working together to maximise efficiency.
What we'll need to see, therefore, is larger 'super' multiplexes of 100–200 Mbps, capable of handling more channels at one time – four, five or six channels, maybe more. Such rates are impossible to carry in a single transponder, so channel bonding will also become more important: the super multiplex will be broken down into up to three lower-rate Transport Streams, delivered over several transponders, before being re-assembled within the viewer's receiver. Viewers will be unaware of such complexities, selecting the channel they want to watch in the conventional way.
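In outline, the split-and-reassemble step looks like this. It's a simplified sketch that ignores real Transport Stream packet formats, timing and error handling – the point is only that packets fan out across several streams and are restored to their original order at the receiver:

```python
# Simplified channel-bonding sketch: split a super multiplex into up to three
# lower-rate streams, then restore the original order at the receiver.

def split(packets, n_streams=3):
    """Distribute sequence-numbered packets round-robin across the streams."""
    streams = [[] for _ in range(n_streams)]
    for seq, pkt in enumerate(packets):
        streams[seq % n_streams].append((seq, pkt))
    return streams

def reassemble(streams):
    """Merge the streams back into the original packet order."""
    merged = [item for stream in streams for item in stream]
    return [pkt for _, pkt in sorted(merged)]

packets = [f"pkt{i:03d}" for i in range(9)]
assert reassemble(split(packets)) == packets  # the viewer sees one seamless service
```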
Channel bonding enables better statistical multiplexing for HD and UHD. The way in which the statistical multiplexing engine allocates bandwidth between channels is already more sophisticated than it's ever been, and it will only become more effective – the rate at which a channel can switch from being a borrower to a lender in the pool is a particular focus, and improvements here will be crucial to maintaining picture quality for consumers.
Harnessing the power of the cloud
To achieve all of this, the technologies the industry relies on will need to work harder, quicker and more efficiently. Physical advancements will help, but the cloud is also primed to play a crucial role – much like it has already done in most other sectors.
At present, the multiplexing process happens in appliances, i.e. physical boxes, but in the future, we expect much more of it to take place in the cloud. Adopting this approach will come with various significant benefits.
First and foremost, we’ll have access to near-unlimited computing power without the huge financial investment; this means more capability and better value for our broadcast customers. We’ll also save on physical space, and will no longer be required to physically configure multiplex engines – everything can be done through software, as remotely as necessary.
The cloud will allow the industry – broadcasters in particular - to be a lot more agile too. Notably, lead times (i.e. time-to-deploy) will reduce - channels can be set up with minimal preparation, opening the door for part-time and pop-up channels to become a normal part of consumers’ viewing.
The future’s bright
For as long as multiplexing and encoding have been parts of the broadcast process, they have evolved to meet the needs of the industry and consumers at home. This will undoubtedly continue, and it’s clear that everyone involved stands to benefit.
Advancements like statistical multiplexing are already helping us deliver high-quality services effectively and efficiently, and as we move forward towards UHD and even more diverse programming, the cloud will become crucial.