Recent technological developments and the transformation of cloud solutions are changing how TV and broadcast work, both visibly and behind the scenes.
"We’re integrating our cloud-based metadata services with Alexa and Google Assistant and users will soon be able to find content through voice"
There are two parts to this. For (1) content delivery and consumption, an increased focus on on-demand IP content and use of standard internet technologies can go hand in hand with increasing cloud usage. We’re clearly seeing a shift in the way people are willing to consume content, with more people watching programmes from the likes of Amazon, Netflix and Google’s TV platforms than ever before. Amazon and Google are both significant cloud infrastructure providers, and Netflix has been an early adopter and innovator in cloud technology.
However, there is still a need for regular broadcast via terrestrial, satellite or cable, which rely on set top boxes in the home. People like watching live TV, and this is an efficient way to deliver it. The custom nature of the technologies and hardware required means there’s less use of cloud technology right now, but we’re starting to see increasing virtualization of set top box functions. YouView is running a project to do just this, where TV recordings, clash management and streaming between devices are managed in the cloud. It keeps our set top box software simple, and gives viewers consistent access to their content from any device.
For (2) content discovery–enabling people to find content they want to watch–we see rapid adoption of cloud technologies by broadcasters. This is certainly YouView’s focus. All of our backend services reside in the cloud using Amazon, and that’s been the case since we launched six years ago. We have little legacy: most of our services were built with the cloud in mind from the start, and as a business we are completely cloud-native. No lift-and-shifts. The benefit is a highly agile, scalable infrastructure we can use to adapt and personalise for viewers. For example, when a set top box starts to record a live programme, we synchronise that information back to the cloud. We can then use this across a range of devices. It’s the kind of spiky, low-latency, high-throughput workload that would be impractical without on-demand, elastic cloud infrastructure.
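The recording-sync pattern described above can be sketched as follows. This is a minimal, hypothetical illustration (the class, field names and identifiers are my own, not YouView's actual service), using an in-memory store as a stand-in for a cloud event store:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: an in-memory stand-in for a cloud-hosted event store.
@dataclass
class RecordingSyncService:
    # account_id -> list of recording events synchronised from set top boxes
    events: dict = field(default_factory=dict)

    def record_started(self, account_id: str, programme_id: str, device_id: str) -> dict:
        """Called when a set top box starts recording; syncs the event to the cloud."""
        event = {
            "programme_id": programme_id,
            "device_id": device_id,
            "started_at": datetime.now(timezone.utc).isoformat(),
        }
        self.events.setdefault(account_id, []).append(event)
        return event

    def recordings_for(self, account_id: str) -> list:
        """Any of the viewer's devices can query the same synchronised state."""
        return self.events.get(account_id, [])

svc = RecordingSyncService()
svc.record_started("acct-1", "prog-42", "stb-livingroom")
print([e["programme_id"] for e in svc.recordings_for("acct-1")])  # ['prog-42']
```

In a real deployment the write path would be the spiky, low-latency part: many boxes reporting recording starts simultaneously around popular broadcast times, which is exactly the elastic load profile described.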
We are now extending our services for others to use–starting with our BT and TalkTalk shareholders–but we hope to get other content providers using too. The cloud enables us to build services and share them in a controlled and secure way–we call this our Third-Party Toolkit. Ultimately, we believe this could enable an ecosystem of players that can share content a viewer is interested in–for example when you add a content provider’s programme to your YouView watch list, this could appear in their player, and vice-versa. It could create a seamless content discovery experience for viewers.
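A toy sketch of the cross-provider watch-list idea might look like this. All names here are hypothetical illustrations, not the actual Third-Party Toolkit API; the point is simply that adding an item notifies the owning provider so it can surface in their player too:

```python
# Hypothetical sketch of cross-provider watch-list sharing.
class WatchList:
    def __init__(self):
        self._items = []          # (provider, programme_id) pairs
        self._subscribers = {}    # provider -> callback to mirror items

    def subscribe(self, provider, callback):
        """A content provider registers to hear about its own items."""
        self._subscribers[provider] = callback

    def add(self, provider, programme_id):
        """Viewer adds a programme; the owning provider is notified."""
        self._items.append((provider, programme_id))
        if provider in self._subscribers:
            self._subscribers[provider](programme_id)

mirrored = []
wl = WatchList()
wl.subscribe("provider-a", mirrored.append)
wl.add("provider-a", "prog-7")
print(mirrored)  # ['prog-7']
```

The "controlled and secure" aspect would live in who is allowed to subscribe and what each provider can see, which this sketch omits.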
Data capture and analysis represents a large part of YouView’s cloud investment and spend. I expect it is true for other broadcasters too. We collect anonymous data on viewing habits, operational issues, and how people use our service to find content. We use it only to improve our service. It throws up some valuable and sometimes fascinating insights. For example, as a TV series is starting to air, we see quite a lot of on-demand consumption for those early episodes. But for later episodes, as viewers become interested and aware of when it is being aired, we see more consumption switch to live and playback from recordings. It has implications for how content providers promote their content, for advertising, and for how we structure our user interface.
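The early-versus-late episode shift can be illustrated with a simple aggregation over anonymised viewing events. The data and field names below are invented for the sketch:

```python
from collections import Counter, defaultdict

# Hypothetical anonymised viewing events: (episode_number, consumption_mode)
events = [
    (1, "on_demand"), (1, "on_demand"), (1, "live"),
    (2, "on_demand"), (2, "live"),
    (5, "live"), (5, "recording"), (5, "live"),
]

# Aggregate consumption mode counts per episode.
by_episode = defaultdict(Counter)
for episode, mode in events:
    by_episode[episode][mode] += 1

def on_demand_share(episode):
    """Fraction of an episode's viewing that was on-demand."""
    counts = by_episode[episode]
    return counts["on_demand"] / sum(counts.values())

print(round(on_demand_share(1), 2))  # 0.67 - early episodes skew on-demand
print(round(on_demand_share(5), 2))  # 0.0  - later episodes shift to live and recordings
```

At scale the same aggregation would run over the full anonymised dataset rather than a hand-written list, but the insight, a declining on-demand share as a series beds in, falls out of exactly this kind of grouping.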
The introduction of new devices such as home AI smart assistants, for instance Amazon’s Echo, means consumers will soon be able to find content through voice. In order to make this possible, we’re integrating our cloud-based metadata services with Alexa and Google Assistant. Voice will lead to a more convenient and personalised experience, but we don’t think it replaces our current user interface. Rather, the two can be combined. The viewer can start with an open question like “Alexa, show me something interesting”, then through a content discovery dialogue, could end up with the four or five most relevant options to select from on the screen.
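The "open question to on-screen shortlist" flow could be sketched like this. This is deliberately generic (it does not use the real Alexa or Google Assistant APIs, and the catalogue, fields and ranking are invented): a voice-derived preference set is resolved against cloud metadata and narrowed to a handful of options for the screen:

```python
# Hypothetical metadata catalogue; titles and fields are illustrative only.
CATALOGUE = [
    {"title": "Blue Planet", "genre": "nature", "popularity": 0.9},
    {"title": "Line of Duty", "genre": "drama", "popularity": 0.95},
    {"title": "QI", "genre": "comedy", "popularity": 0.7},
]

def shortlist(preferred_genres, limit=5):
    """Narrow an open-ended voice request to the most relevant on-screen options."""
    matches = [p for p in CATALOGUE if p["genre"] in preferred_genres]
    matches.sort(key=lambda p: p["popularity"], reverse=True)
    return [p["title"] for p in matches[:limit]]

# A discovery dialogue has inferred the viewer likes drama and nature.
print(shortlist({"drama", "nature"}))  # ['Line of Duty', 'Blue Planet']
```

The real dialogue would refine the preference set over several conversational turns before the final shortlist is rendered on the TV.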
In terms of what the future holds for the cloud, of course we’ll see the basics improved. For example, whilst cloud infrastructure currently offers a high degree of elasticity and scalability, this will progress so that access at high loads can be close to instantaneous. This is important for our synchronous live use cases.
Another area on which we’ll see progress is being able to join up cloud services across departments and organisations, coupled with sophisticated analytics to provide viewers with a better experience–so called omnichannel. For example, it would be possible for a call-centre operator at BT or TalkTalk to have to hand a holistic summary of any problems a customer might be facing across any of their broadband, mobile or TV services. This would enable them to better serve that customer, and maybe offer them a relevant new TV package if they’re considered at risk of churning.
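The omnichannel join amounts to merging per-service records into one customer view. A minimal sketch, with entirely hypothetical data and field names standing in for the separate broadband, mobile and TV systems:

```python
# Hypothetical per-service status records, each owned by a different system.
broadband = {"cust-1": {"status": "outage", "since": "2018-06-01"}}
mobile    = {"cust-1": {"status": "ok"}}
tv        = {"cust-1": {"status": "ok"}, "cust-2": {"status": "ok"}}

def customer_summary(customer_id):
    """Join service records into the holistic view a call-centre operator would see."""
    services = {"broadband": broadband, "mobile": mobile, "tv": tv}
    summary = {name: data.get(customer_id) for name, data in services.items()}
    # Flag the customer if any known service is not healthy.
    summary["at_risk"] = any(
        record.get("status") != "ok"
        for record in summary.values()
        if isinstance(record, dict)
    )
    return summary

print(customer_summary("cust-1")["at_risk"])  # True - broadband outage surfaces
```

In practice the hard part is organisational rather than technical: each record lives in a different department's cloud service, which is why joined-up APIs matter.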
Machine learning is an area where cloud services have been slow to support production workloads, but this will change. We’re already using it to spot patterns in our data that we think correlate with viewers having problems, and for content recommendations.
And specifically for broadcasters, we will see more content delivered over IP, more of that content on-demand, and less need for custom devices. Ultimately this will mean any device can find and consume any content, with all of the smarts in building a video service residing in the cloud.