More?!! A first look at MPEG-MORE
Sometimes standards make all the difference. In this technical post, MARK LOMAS from BBC R&D looks at some standards that seem relevant to the delivery of multi-screen experiences. The 2-IMMERSE team is not yet clear about the impact these standards will have. If nothing else, the development of standards forces someone to diligently describe, and attempt to generalise, a problem – and that’s useful.
Multiscreen experiences are unlike regular TV programmes. The components required to make multiscreen experiences are delivered, a bit like flat-pack furniture, ready to be assembled in different ways for playback across a group of devices.
Playing media at the right time on the right device is tricky; it is like conducting an orchestra. The conductor must interpret the musical score and direct the performers by meting out the musical pulse. 2-IMMERSE is investigating how to conduct ‘orchestras of devices’ in the home, in the cloud and within production environments. We have been following the evolution of MPEG-MORE, a proposed technical standard for media orchestration.
What is MPEG-MORE?
MPEG-MORE is an acronym for “Moving Picture Experts Group – Media ORchEstration”. As of 29 March 2017, the MPEG-MORE specification is at the committee draft stage. The latest documentation is available from the ISO content server.
MPEG-MORE is concerned with the orchestration of media capture, processing and presentation involving multiple devices. The standard describes a reference architecture comprising an object model and a set of control protocols for supporting orchestration scenarios in a network-independent way. It also describes many types of timed metadata, such as spatiotemporal ‘Regions of Interest’, for orchestrating media processing and playback. It specifies how timed orchestration data and timed metadata are delivered in transport formats such as ISOBMFF, MPEG-2 TS and MPEG-DASH.
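As a loose sketch of the timed-metadata idea (the class and field names here are our own, not taken from the specification), a spatiotemporal ‘Region of Interest’ sample might be modelled as a rectangle that applies to a span of the media timeline:

```python
from dataclasses import dataclass

@dataclass
class RoiSample:
    """One hypothetical timed-metadata sample: a rectangular region of
    interest that applies to a span of the media timeline."""
    start: float     # presentation time in seconds
    duration: float  # how long the region applies, in seconds
    x: float         # top-left corner, normalised to 0..1
    y: float
    width: float     # extent, normalised to 0..1
    height: float

    def active_at(self, t: float) -> bool:
        """True if this sample covers media time t."""
        return self.start <= t < self.start + self.duration
```

A sequence of such samples, delivered alongside the media in one of the transport formats above, would let a receiver crop or emphasise the right part of the picture at the right moment.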
MPEG-MORE has adopted the same DVB-CSS Inter-device Media Synchronisation standard as 2-IMMERSE, but describes how it can be generalised and used within production. Finally, MPEG-MORE suggests that multiscreen experiences are an Internet of Things (IoT) application. This has encouraged us to investigate IoT cloud platforms such as Amazon Web Services IoT and Google Cloud IoT.
How can 2-IMMERSE leverage MPEG-MORE?
2-IMMERSE and MPEG-MORE have a lot in common. The 2-IMMERSE Theatre At Home trial provides an early working demonstration of some of the MPEG-MORE concepts in action.
• 2-IMMERSE Design Patterns
When the 2-IMMERSE architecture is described in terms of the MPEG-MORE object model, it surfaces system design patterns that are otherwise obscured by implementation. This representation shows that 2-IMMERSE microservices can act as orchestrators and demonstrates the 2-IMMERSE platform’s extensibility in a new way.
• Orchestration in the Cloud
MPEG-MORE uses a formulation of DVB-CSS timing and synchronisation that allows it to be extended into the cloud and right back into production systems. It describes a timing architecture for production that mirrors the timing architecture for playback. We would like to leverage this to hoist compute-intensive operations such as video compositing into the cloud to support devices and homes with poorer bandwidth and/or compute capability. We would also like to use this to generate timeline correlations at various points throughout the system.
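DVB-CSS expresses synchronisation in terms of correlations between pairs of timelines: a point on one timeline, the corresponding point on the other, and a relative rate. A minimal sketch of that mapping (our own simplification, not the spec’s object model) shows how a wall-clock reading is converted into a position on a content timeline:

```python
from dataclasses import dataclass

@dataclass
class Correlation:
    """A correlation between a parent timeline (e.g. a shared wall
    clock) and a child timeline (e.g. a programme's media timeline)."""
    parent_time: float  # a point on the parent timeline, in seconds
    child_time: float   # the corresponding point on the child timeline
    speed: float = 1.0  # rate of the child relative to the parent

def to_child_time(corr: Correlation, parent_now: float) -> float:
    """Map a parent-timeline reading onto the child timeline."""
    return corr.child_time + (parent_now - corr.parent_time) * corr.speed
```

Because the mapping is just a correlation plus a rate, the same construction can be chained: production systems can correlate capture timelines to a house clock, the cloud can correlate that to a distribution timeline, and devices in the home can correlate onwards to local playback, which is what makes the ‘mirrored’ production and playback timing architectures possible.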
• Quality of Service
A finding from our early pilot of the Theatre At Home experience was that multiple devices in the home compete with each other for available network bandwidth. This is a result of the bit-rate adaptation algorithms used by video players and the absence of a coordination mechanism for managing bandwidth across multiple devices. We discovered that MPEG-MORE’s communication is modelled on MPEG-SAND control messages (see ‘Enhancing MPEG DASH performance via server and network assistance’). This makes MPEG-SAND of interest to 2-IMMERSE because it processes Quality of Service (QoS) information to arrange for the optimal delivery of content in multi-device ecosystems.
MPEG-SAND control messages are sent between sources, sinks and processing nodes in a network and therefore MPEG-SAND fits the MPEG-MORE object model nicely. The MPEG-MORE specification actually gives an example use case where Mean Opinion Score (MOS) timed meta-data is exchanged via control messages to orchestrate playback of different video feeds.
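As a hypothetical sketch of that use case (the function, margin and report format are ours, not from the specification), an orchestrator receiving per-feed MOS reports might switch between feeds with a hysteresis margin, so that small fluctuations in score do not cause constant flip-flopping:

```python
def select_feed(current: str, mos_reports: dict[str, float],
                margin: float = 0.5) -> str:
    """Choose which video feed to play, given the latest Mean Opinion
    Score (1..5) reported for each feed. Only switch away from the
    current feed if another feed beats it by at least `margin`."""
    best = max(mos_reports, key=mos_reports.get)
    if best != current and mos_reports[best] >= mos_reports.get(current, 0.0) + margin:
        return best
    return current
```

An orchestrator would re-run this each time fresh MOS timed metadata arrives, then issue control messages directing devices to the chosen feed.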
MPEG-SAND was designed to address a range of issues and use cases. Those that are relevant to 2-IMMERSE include:
- ‘Multiple DASH clients compete for the same bandwidth, leading to unwanted mutual interactions and possibly oscillations’.
- ‘Where a DASH client lets the delivery node know beforehand what it will request in the near future to prime the cache’
- ‘Network mobility, e.g., when the user physically moves, which makes the device switch from one network to another, but must maintain Quality of Experience (QoE).’
- ‘Inter-device media synchronization, e.g., when one or more DASH clients playback content in a synchronised manner.’
Available network bandwidth is a constraint that must be processed by the 2-IMMERSE layout service when deciding what content to lay out. A future version of the service could collaborate with an MPEG-SAND element to gather, communicate and act on QoS/QoE measurements and exchange parameters for enhanced reception and delivery.
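To illustrate the kind of coordination that is currently missing, here is a hypothetical allocator (not part of MPEG-SAND or the 2-IMMERSE layout service) that divides a household bandwidth budget across devices in proportion to a priority weight, then snaps each share down to a rung on that device’s DASH bitrate ladder:

```python
def allocate_bitrates(total_kbps: float,
                      clients: list[tuple[str, float, list[int]]]) -> dict[str, int]:
    """Split total_kbps across clients by priority weight.
    Each client is (name, priority, ladder), where ladder is an
    ascending list of available representation bitrates in kbps."""
    total_priority = sum(priority for _, priority, _ in clients)
    allocation = {}
    for name, priority, ladder in clients:
        share = total_kbps * priority / total_priority
        # Pick the highest ladder rung that fits within this share,
        # falling back to the lowest rung if none fit.
        choice = ladder[0]
        for rung in ladder:
            if rung <= share:
                choice = rung
        allocation[name] = choice
    return allocation
```

A coordinator of this sort, sitting between the layout service and the players in place of each player’s independent bit-rate adaptation, would avoid the mutual interactions observed in the Theatre At Home pilot, because no device can claim more than its agreed share.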
The architectures of 2-IMMERSE and MPEG-MORE are in excellent agreement. We are eagerly awaiting the next revision of the MPEG-MORE specification, but it has already given us plenty to think about. Right now, we are trying to understand how to blend MPEG-MORE, MPEG-SAND, IoT and 2-IMMERSE technologies together to deliver a revised architecture that can solve the many challenges of multiscreen experiences.
Image copyright: stockbroker / 123RF Stock Photo