Fine-tuning the live production tools

JIE LI of the Distributed and Interactive Systems (DIS) group at CWI Amsterdam writes about the project’s recent encounters with broadcast professionals.

2-IMMERSE is developing a production platform that enables TV professionals to create multiscreen and immersive experiences. The platform includes three software components:

  1. A preproduction tool that aims to digitise the authoring process in an object-based manner. It is a graphical tool that helps producers script a program by temporally arranging different media objects (such as video clips, graphics, audio, and subtitles) for both TV screens and companion devices.
  2. A live editing tool that enables the production team to react to live events during a television program, such as a goal in a football match or an overtake in a MotoGP race. When live events happen, the editorial team can, following the director’s instructions, quickly edit the associated media (graphics, name tags, replay clips) based on prepared templates.
  3. A live triggering tool that includes a button-based graphical interface and a keyboard launcher. Edited templates are enqueued as thumbnails on the launcher interface, from which the director can easily trigger them (see the sketch below).
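As a purely illustrative sketch of this enqueue-and-trigger flow (the names and types below are our assumptions for this post, not the project’s actual API), the launcher can be thought of as a queue of prepared templates:

```typescript
// Hypothetical sketch of the live triggering queue
// (illustrative only; not the 2-IMMERSE production API).
interface TriggerTemplate {
  id: string;                       // unique template identifier
  label: string;                    // e.g. "Goal replay", shown on the thumbnail
  thumbnailUrl: string;             // preview image rendered on the launcher
  payload: Record<string, string>;  // fields filled in by the editorial team
}

class TriggerLauncher {
  private queue: TriggerTemplate[] = [];

  // Called by the live editing tool once a template has been filled in.
  enqueue(template: TriggerTemplate): void {
    this.queue.push(template);
  }

  // Called when the director presses the button for a queued template;
  // the returned template is handed over to the delivery chain.
  trigger(id: string): TriggerTemplate | undefined {
    const index = this.queue.findIndex(t => t.id === id);
    return index === -1 ? undefined : this.queue.splice(index, 1)[0];
  }
}
```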

The live editing and triggering tools have been successfully evaluated at Wembley Stadium during the FA Cup semi-final and final, and showcased at the International Broadcasting Convention (IBC) 2018 in Amsterdam. The team is now busy further developing the preproduction tool.

Fig. 1. A demo of the live editing tool and the live triggering tool at IBC 2018

The initial wireframes of the preproduction tool were designed and iterated based on input from two rounds of interviews with a total of 20 professionals in the broadcasting industry (see the papers “Designing an Object-based Preproduction Tool for Multiscreen TV Viewing” and “A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences”). In September 2018, a further round of interviews took place to fine-tune the interface and interaction design of the tool.

The interviews were conducted with seven professionals (P1–P7; 5 male, 2 female; mean age 35.0, SD = 6.0), whose backgrounds are summarized in Table 1. They took place from 3 to 5 September 2018 at the usability lab of BBC R&D in MediaCity, Manchester.

Fig. 2. (Top) The hierarchical organization of the program chapters; (bottom) the spatial and temporal arrangement of the DMApp components.

The interviewees confirmed the basic design decisions for the preproduction tool: a hierarchical organization of the program sections, and a sequential arrangement of the media objects following the script (see Fig. 2). The former offers a clear structure and enables the use of a master layout to reduce repetitive production work. Media objects can be configured and reused to add interactive and novel functions to TV programs, for example letting viewers select different viewing angles or follow their favourite football players.

In addition, the professionals recommended a node structure linking the media objects within a section, to give a clear overview of which objects run in parallel and which run in sequence; a minimal sketch of such a model follows. Regarding the layout of the TV program, the professionals confirmed the research team’s intuition that the preproduction tool does not need to manage it.
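To make that recommendation concrete, here is one possible way to model hierarchical chapters with linked media objects; the type and field names are our own illustrative assumptions, not the tool’s actual schema:

```typescript
// Hypothetical data model for an object-based program script
// (illustrative only; not the 2-IMMERSE preproduction tool's schema).
type MediaKind = "video" | "graphic" | "audio" | "subtitle";

interface MediaObject {
  id: string;
  kind: MediaKind;
  start: number;     // seconds from the start of the enclosing chapter
  duration: number;  // seconds
  next: string[];    // ids of the objects that follow in sequence
}

interface Chapter {
  title: string;
  objects: MediaObject[];
  children: Chapter[];  // hierarchical organization of the sections
}

// Two objects run in parallel when their time ranges overlap.
function runInParallel(a: MediaObject, b: MediaObject): boolean {
  return a.start < b.start + b.duration && b.start < a.start + a.duration;
}
```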

Based on this feedback, the interaction design of the tool will be improved, a new development cycle will take place, and a final evaluation with the seven professionals will happen by the end of November. The idea is to ask the professionals to create a multiscreen, interactive TV sports program. Stay tuned!



Back of the net!

The 2-IMMERSE Football production team report on the success of our major trial at Wembley.

2-IMMERSE passed another milestone at the recent 2018 FA Cup Final at Wembley Stadium between Chelsea and Manchester United as we proved our prototype end-to-end live production system for object-based broadcasting to multiple screens.

For those that may not know, the FA Cup is the world’s oldest football challenge cup, having started back in 1871. The high-profile final is the curtain closer for the UK domestic football season and has spawned and fuelled many of the intense inter-club rivalries that give the UK league its particular character. For decades the match has also been a key television moment and a showcase for experimentation and innovation in the presentation of football on television, so 2-IMMERSE is very proud to have been a small part of a great tradition this year. The FA Cup Final is a national treasure, but it also draws a global audience estimated in the hundreds of millions.

Our team for the FA Cup was drawn from BT, CWI, Cisco, and BBC R&D – and they all worked tirelessly over the preceding weeks and months to design and develop the end-to-end system. Early ‘sighting visits’ to Wembley (see our previous blog post) helped us identify and address the key issues that can affect delivery of the service, and ensured that the test on the day of the Final went as smoothly as it did.

On the day, we arrived with our modest broadcast truck and once the feeds from all the individual cameras were in place we were set to go. After all the preparation we were able to conduct three key activities at Wembley:

  • Live orchestration of match graphics, mirroring the broadcast graphics production but using our own HTML5-focused production and delivery chain
  • Live monitoring and demonstration of prototype Football DMApp (see blog post) on site at Wembley
  • ChyronHego data capture for Virtual Placement: camera parametric data and Tracab player tracking data.

In addition to the on-site achievements, and to further illustrate the end-to-end nature of the trial, we engaged 10 remote viewing participants in the UK, Sweden and the Netherlands to experience watching our interactive, customisable and multi-screen version of the 2018 FA Cup Final.

Among the key system elements and features of our system are the following (which will be illustrated in a short video coming soon):

  • Live triggering tool with trigger launcher driven by Elgato Streamdeck
  • Real-time preview in OB van driven direct from SDI
  • TV emulator and Samsung tablet as client devices
  • Live sessions integrated within client onboarding workflow
  • Match GFX components developed by ChyronHego and implemented using a new DMApp component that integrates with their Prime universal graphics platform
  • Interactive experience on tablet included:
    • ScoreClock menu with match overview, team line-ups and replays
    • Broadcast menu with camera feed viewer and customised picture-in-picture layout on TV

Like most productions this was a massive collaboration involving people and organisations across and beyond 2-IMMERSE. We are happy to acknowledge the help of: AWS Elemental in providing real-time encoders; BT Technology Service and Operations for providing some key contacts and helping with system design; BT Sport for permission to access the stadium and for allowing our small van to take its place alongside the big trucks from BT Sport and the BBC; and BT Sport and BT Media and Broadcast for providing the guaranteed upstream bandwidth we needed.

The next step – or at least one of them – is to work with all the video and data captured at the Final to develop a stand-alone demo showing the capabilities of the system and illustrating the end-to-end process. We will present this on our stand in the Future Zone at IBC 2018 in Amsterdam in September. We look forward to seeing you there.



Making object-based CAKE at BBC R&D

Object-based media production, which is a central principle underpinning 2-IMMERSE, is being intensively developed by our colleagues at BBC R&D. Freely available online are resources that introduce the ideas behind object-based media and outline one especially neat demonstration of it in action: the Cook-Along Kitchen Experience, or CAKE.

Back in 2013 Tony Churnside wrote a BBC R&D blog post that outlines the approach and rationale for object-based broadcasting. It was updated just a few months ago and remains an essential introduction.

Complementing this is a slightly more technical post by Robert Wadge, written in 2013 and updated two years later.

Just about a year ago, Matthew Brooks and Tristan Ferne looked back over 2016 to review recent work with object-based media from BBC R&D.

This includes a nod towards the involvement of BBC R&D in 2-IMMERSE:

We got busy with 2Immerse, a European project creating a multi-screen, multi-home, interactive immersive home theatre experience. As well as finalising the architecture, we built scrolling scripts that synchronise with the performance, video chat that brings homes together during intervals, and a layout engine that can present content across multiple screens.

Earlier this year, in May, Ian Forrester outlined BBC R&D’s plans to develop a community of practice for object-based media production.

The team is putting together an impressive sequence of demonstrations and workshops around the country (and beyond), and along with outlining this the post features a rallying cry for the significance of object-based media:

We believe that the object-based approach is the key to content creation of the future, one which uses the attributes of the internet to let us all make more personal, interactive, responsive content and by learning together we can turn it into something which powers media beyond the scope of the BBC.

Perhaps BBC R&D’s most developed demonstration of the approach is the CAKE pilot, which is described in detail here. The test period for the prototype of this cooking experience has recently come to an end, but the ideas behind it are well worth exploring:

Following a recipe with CAKE is different to other cooking shows because it’s not a linear TV programme. It customises recipes based on your familiarity with ingredients and methods, your tastes or dietary preferences, and how many people you’re inviting round for dinner. The experience reacts ‘in the moment’ to your progress, allowing you to create new dishes at your own pace. Novices can level-up and experts can cut to the chase, supported by an evolving dialogue between audience and presenter.
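As a purely illustrative sketch of what this kind of object-based customisation could involve (the types and filtering logic below are our assumptions, not BBC R&D’s CAKE implementation), a recipe can be assembled from a pool of step objects filtered by a viewer profile:

```typescript
// Hypothetical sketch of object-based recipe assembly
// (illustrative only; not BBC R&D's CAKE implementation).
type SkillLevel = "novice" | "expert";

interface RecipeStep {
  id: string;
  videoUrl: string;
  noviceOnly: boolean;  // extra hand-holding that experts can skip
  tags: string[];       // e.g. ["vegetarian", "nut-free"]
}

interface ViewerProfile {
  skill: SkillLevel;
  guests: number;         // how many people are coming round for dinner
  dietaryTags: string[];  // every selected step must carry these tags
}

// Pick only the steps relevant to this viewer: novices get the full
// walkthrough, experts cut to the chase, and dietary preferences
// filter out incompatible steps.
function assembleRecipe(steps: RecipeStep[], viewer: ViewerProfile): RecipeStep[] {
  return steps.filter(step =>
    (viewer.skill === "novice" || !step.noviceOnly) &&
    viewer.dietaryTags.every(tag => step.tags.includes(tag))
  );
}
```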

Also available online is a paper prepared for the recent IBC symposium, ‘Moving Object-Based Media Production from One-Off Examples to Scalable Workflows’ [.pdf], authored by Jasmine Cox, Matthew Brooks, Ian Forrester and Mike Armstrong from BBC R&D. This is a valuable account of the team’s experiences and plans for the next stage of development, as their introduction promises:

This paper follows the creation of our most recent example of object-based media, the Cook-Along Kitchen Experience (CAKE) which was conceived and produced as an object-based experience from the outset. This paper looks at how we are applying the lessons learned from our previous work to the development of OBM data models and software tools. The paper also discusses how we intend to involve content creators from both inside and outside the BBC and build a community of practice around the development of new forms of media.



Beyond the video wall – responsive content projection

TIM PEARCE from BBC R&D writes:

In the 2-IMMERSE project we are aiming to create a platform that will enable content creators to author multimedia experiences that adapt to the different ecosystems of devices surrounding users. In the Theatre At Home and MotoGP trials, we have focused on second-screen (companion) experiences, presenting content on a phone or tablet device that synchronises with the main content on screen.

Our future prototypes, including Theatre In Schools, are designed to extend the capabilities of the platform to enable the creation of experiences that can be displayed on multiple communal screens surrounding a larger group of users. By building these features into the platform, we will enable content creators to author highly immersive experiences, including more diverse applications such as digital signage and art installations in addition to traditional broadcast content.

The BBC has been exploring ‘on-boarding’: the end-to-end user journey of joining a 2-IMMERSE experience. One of the challenges is joining many different devices in a room to a single experience, especially as some devices lack a keyboard and mouse. Multiple communal screens also pose an interesting use case for the layout service, one which needs to be captured in the on-boarding process. If we want to project our object-based content onto multiple communal screens, how do we know the spatial relationship between those screens, and how can we use this to influence the layout of content? Can components span multiple screens or animate from one screen to another?

A really elegant solution to this problem can be found in the Info Beamer project, an open-source digital signage solution for the Raspberry Pi. Each communal screen displays a unique QR code, which is scanned by a camera-equipped companion device to connect the screens to the experience and determine their relative position, size and orientation. This could be a potential solution for on-boarding a number of large-screen devices in future 2-IMMERSE scenarios.
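A minimal sketch of the idea, under our own assumptions (the names below are illustrative, not Info Beamer’s or 2-IMMERSE’s actual APIs): each screen encodes its identity and dimensions in its QR code, and each scan places that screen in a shared coordinate space that a layout engine can then query:

```typescript
// Hypothetical sketch of QR-based screen on-boarding
// (illustrative only; not the Info Beamer or 2-IMMERSE API).
interface ScreenInfo {
  screenId: string;
  widthPx: number;
  heightPx: number;
}

interface PlacedScreen extends ScreenInfo {
  // Position and rotation in a shared room coordinate space,
  // estimated from where the QR code appears in the camera image.
  x: number;
  y: number;
  rotationDeg: number;
}

class ScreenRegistry {
  private screens = new Map<string, PlacedScreen>();

  // Called once per scanned QR code on the companion device.
  register(info: ScreenInfo, x: number, y: number, rotationDeg: number): void {
    this.screens.set(info.screenId, { ...info, x, y, rotationDeg });
  }

  // A layout engine could use this to decide whether a component
  // may span two screens or animate from one to the other.
  neighbours(id: string, maxDistance: number): PlacedScreen[] {
    const origin = this.screens.get(id);
    if (!origin) return [];
    return [...this.screens.values()].filter(s =>
      s.screenId !== id &&
      Math.hypot(s.x - origin.x, s.y - origin.y) <= maxDistance
    );
  }
}
```

With a registry like this, the layout service could, for instance, allow a component to span two screens only when they are registered as close neighbours.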

We will discuss on-boarding and the challenges in designing an end-to-end multiscreen experience in greater detail in a future blog post.



HbbTV 2: a note on the state of play

MICHAEL PROBST from IRT writes:

Hybrid Broadcast Broadband TV (HbbTV), as Wikipedia details, is both an industry standard (ETSI TS 102 796) and a promotional initiative for hybrid digital TV, harmonising the broadcast, IPTV, and broadband delivery of entertainment to the end consumer through connected TVs (smart TVs) and set-top boxes.

The latest version of the HbbTV specification was released about a year and a half ago. From a public perspective, not much has happened since then, as it is still not possible to purchase an HbbTV 2-enabled TV. But in fact a great deal has been happening “behind the curtains” and HbbTV 2 is evolving steadily. In this post we highlight some of the latest developments in terms of implementations and services.

Let’s have a short look at the new elements in HbbTV 2:

  • Support for HTML5, including the media elements, replacing the “unloved” XHTML 1.0
  • Fancy UIs with CSS3 animations and transitions, as well as downloadable fonts, e.g. to support languages with exotic characters
  • Closed subtitles for all broadband-delivered media
  • Companion screen: discovery of TVs and launching HbbTV apps; discovery of special (manufacturer specific) launcher apps and launch of mobile apps; communication between HbbTV apps and mobile apps
  • Media synchronisation: the most complex feature, which allows broadcast content on the TV to be played in sync with content from the Internet, either on the TV itself or on a companion device (see the sketch after this list)
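For a flavour of what media synchronisation looks like to an application developer, here is a heavily abridged sketch based on the MediaSynchroniser embedded object that HbbTV 2 defines; treat the details, especially the timeline selectors and correlation values, as placeholders rather than copy-paste-ready code:

```typescript
// Abridged sketch of HbbTV 2 multi-stream synchronisation, based on
// the spec's MediaSynchroniser embedded object. The timeline selectors
// and correlation values below are placeholders, not working values.
const sync = document.createElement("object") as any;
sync.setAttribute("type", "application/hbbtvMediaSynchroniser");
document.body.appendChild(sync);

const broadcastVideo = document.getElementById("broadcast") as any;
const broadbandAudio = document.getElementById("foreign-audio") as any;

// The broadcast service provides the master timeline.
sync.initMediaSynchroniser(broadcastVideo, "urn:dvb:css:timeline:temi:1:1");

// A broadband-delivered audio track is slaved to the broadcast via a
// correlation timestamp mapping one timeline onto the other.
sync.addMediaObject(
  broadbandAudio,
  "urn:dvb:css:timeline:pts",
  { tlvMaster: 0, tlvOther: 0 }  // placeholder correlation timestamp
);

// Optionally open the timeline up to companion devices as well.
sync.enableInterDeviceSync(() => {
  console.log("Inter-device synchronisation is active");
});
```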

HbbTV 1.5 was a relatively small update to the first release and included only features with urgent market need, and hence it was implemented and deployed rather quickly. In Germany we can assume that all new TVs sold in 2017 implement 1.5. The market share of these devices, in relation to all HbbTV TVs, depends to a degree on who you ask, but as of summer 2017 it is quite likely around 50%.

Once HbbTV 2 is deployed, Germany’s broadcasters will have to deal with a large legacy of HbbTV 1.x devices. Other countries have the advantage of starting with HbbTV 2 but will have to migrate from former platforms like MHEG-5 (UK) and DVB-MHP (Italy). The UK industry has chosen to launch HbbTV with a sub-profile (Freeview Play) that does not include the companion screen and media sync features, but Italy could be the first country where deployed TVs have a full HbbTV 2 implementation.

The HbbTV consortium is still very active and working on a number of different topics to allow HbbTV to find new markets:

  • The test suite already supports a first set of HbbTV 2 test cases, and the HbbTV testing group is working hard to complete it.
  • ADB (Application Discovery via Broadband) is a specification that will make HbbTV services available in broadcast networks whose operators block or do not care about HbbTV. A second version of this spec will cover scenarios where people have to use the network operator’s STBs, e.g. by employing watermarks.
  • IPTV: this spec addresses the specific issues of using HbbTV in IPTV networks, i.e. where they differ from classical broadcast networks.
  • Operator apps: not yet released, but this specification will define a completely new class of applications suitable for broadcast network operators.

HbbTV 2 in action

Synchronised textbook using HbbTV 2; image shows the Royal Shakespeare Company production of Richard II (2013), © RSC.

Several companies have shown prototypes over recent years, including the BBC and IRT, who are partners in the 2-IMMERSE project.

In 2015 IRT presented a first HbbTV 2 showcase – live streaming of MPEG-DASH with EBU-TT-D subtitles – with several partners, including manufacturers of streaming encoders and TVs as well as a CDN provider and a content provider. As a result of the success of this activity, we now see the first live streaming services with MPEG-DASH and subtitles offered by ARD broadcasters. More information on this can be found here.

Also in 2015, IRT cooperated with ARD Mediathek – a catch-up TV service – to enable its mobile application to cast videos to HbbTV 2 TVs. The application has been very useful for testing the HbbTV 2.0 features for automatic discovery and application launch with a large number of TV manufacturers, and the next step will be to integrate the function into the end-user version of the application. Further information here.

The concept of HbbTV 2 media synchronisation is largely based on contributions from the BBC, with support from TNO and IRT as part of their cooperation in the EU-funded hbb-next project. The BBC have done some early HbbTV 2 implementations using their own TV emulator and have released libraries and tools as open source on github.com. At TV Connect 2017 they showcased synchronised playback of broadband media on the companion device with a broadcast service on TV, in cooperation with Opera TV.

If you have not had the chance to see any of these demos, there will be more at IFA Berlin and IBC Amsterdam, both taking place in September.

The HbbTV 2 demos of IRT this year will focus on media synchronisation for broadcast services, for example broadband-delivered foreign-language audio tracks. As a study of the new types of devices now emerging, IRT has implemented a companion application for Microsoft’s HoloLens that shows additional synchronised video feeds alongside the real picture of an HbbTV 2 TV set.

At IFA, the IRT demos can be seen at the booth of ARD digital in hall 2.2/103 (look for the IRT table at the back of the booth). At IBC, IRT’s stand is 10.F51 (in a corner of hall 10). We look forward to seeing you there.
