Fine-tuning the live production tools

JIE LI of the Distributed and Interactive Systems (DIS) group at CWI Amsterdam writes about the project’s recent encounters with broadcast professionals.

2-IMMERSE is developing a production platform that enables TV professionals to create multiscreen and immersive experiences. The production platform includes three software components:

  1. A preproduction tool that aims at digitizing the authoring process in an object-based manner. It is a graphical tool that helps producers script a program by temporally arranging different media objects (such as video clips, graphics, audio, and subtitles) for both TV screens and companion screens.
  2. A live editing tool that enables the production team to react to live events during a television program, such as a goal in a football match or an overtake in a MotoGP race. When such events occur, the editorial team can quickly edit the associated media (graphics, name tags, replay clips) from prepared templates, following the director’s instructions.
  3. A live triggering tool that includes a button-based graphical interface and a keyboard launcher. Edited templates are enqueued as thumbnails on the launcher interface, from which they can then be easily triggered by the director.
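A minimal sketch of how such a launcher queue might behave (hypothetical Python; the class and field names are invented for illustration and are not the project’s actual API):

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class EditedTemplate:
    """A prepared template filled in by the editorial team for a live event."""
    name: str                                   # e.g. "goal", "overtake"
    thumbnail: str                              # thumbnail shown in the launcher
    media: list = field(default_factory=list)   # graphics, name tags, replay clips

class TriggerLauncher:
    """Queue of edited templates awaiting the director's trigger (invented sketch)."""
    def __init__(self):
        self._queue = deque()

    def enqueue(self, template: EditedTemplate) -> None:
        # The edited template appears as a thumbnail in the launcher UI.
        self._queue.append(template)

    def trigger_next(self) -> EditedTemplate:
        # A button press or keyboard shortcut fires the oldest queued template.
        return self._queue.popleft()

launcher = TriggerLauncher()
launcher.enqueue(EditedTemplate("goal", "goal.png", ["scoreboard", "replay"]))
launcher.enqueue(EditedTemplate("substitution", "sub.png", ["name-tag"]))
fired = launcher.trigger_next()
print(fired.name)  # → goal
```

The queue keeps the director’s interface simple: editors prepare and enqueue, and the director only decides when to fire.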

The live editing and triggering tools have been successfully evaluated at Wembley Stadium during the FA Cup semi-final and final, and showcased at the International Broadcasting Convention (IBC) 2018 in Amsterdam. The team is now busy further developing the preproduction tool.

Fig. 1. A demo of the live editing tool and the live triggering tool at IBC2018

The initial wireframes of the preproduction tool were designed and iterated based on the input from two rounds of interviews with a total of 20 professionals in the broadcasting industry (see papers: “Designing an Object-based Preproduction Tool for Multiscreen TV Viewing” and “A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences”). In September 2018, a number of additional interviews took place to fine-tune the interface and interaction design of the tool.

The interviews were conducted with seven professionals (P1–P7; 5 males, 2 females; M=35.0, SD=6.0), whose backgrounds are summarized in Table 1. They took place from 3 to 5 September 2018 at the usability lab of BBC R&D in MediaCity, Manchester.

Fig. 2. (Top) The hierarchical organization of the program chapters; (Bottom) the spatial and temporal arrangements of the DMApp components.

The interviewees confirmed the basic design decisions for the preproduction tool: hierarchical organization of the program sections and sequential arrangement of the media objects following the script (see Fig. 2). The former offers a clear structure and enables the use of a master layout to reduce repetitive production work. Media objects can be configured and reused to add novel interactive functions to TV programs, for example letting viewers select different viewing angles or follow their favourite football players.
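As a rough illustration of this structure (hypothetical Python; the program, chapter, and layout names are invented, not the tool’s actual data model), chapters can inherit a master layout unless they override it, while each chapter carries its own sequence of media objects:

```python
# Invented master layout shared by all chapters of the program.
master_layout = {"tv": "full-screen", "companion": "grid"}

program = {
    "title": "FA Cup Final",
    "layout": master_layout,   # master layout, reused to avoid repetitive work
    "chapters": [
        {"name": "Pre-match", "objects": ["studio-video", "lineup-graphic"]},
        {"name": "First half", "objects": ["match-video", "score-graphic",
                                           "camera-angle-selector"]},  # interactive object
    ],
}

def layout_for(chapter: dict, program: dict) -> dict:
    # A chapter falls back to the program's master layout unless it
    # defines its own "layout" override.
    return chapter.get("layout", program["layout"])

print(layout_for(program["chapters"][0], program))
# → {'tv': 'full-screen', 'companion': 'grid'}
```

The fallback in `layout_for` is what makes the master layout save work: only chapters that genuinely differ need their own layout entry.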

In addition, the professionals recommended a node structure for linking media objects within a section, giving a clear overview of which objects run in parallel and which run in sequence. Regarding the layout of the TV program, the professionals confirmed the research team’s intuition that the preproduction tool does not need to manage it.

Based on this feedback, the tool’s original interaction design will be improved, a new development cycle will follow, and a final evaluation with the same seven professionals will take place by the end of November. The idea is to ask the professionals to create a multiscreen, interactive TV sports program. Stay tuned!


Success for 2-IMMERSE at TVX

Jie Li honoured with Best Paper Award

On 27 June, Jie Li of the Distributed and Interactive Systems (DIS) group at 2-IMMERSE partner CWI received the Best Paper Award at ACM TVX 2018, the ACM International Conference on Interactive Experiences for Television and Online Video. Li won the award for the paper ‘A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences’. Outlining research done within 2-IMMERSE, the paper has as co-authors Pablo Cesar, Maxine Glancy (BBC), Jack Jansen and Thomas Röggla.

Multiscreen TV viewing refers to a spectrum of media productions that can be watched on TV screens and companion screens such as smartphones and tablets. TV production companies are now promoting an interactive and engaging way of viewing TV by offering tailored applications for TV programs.

However, viewers are reluctant to install dozens of applications and switch between them. This is one of the obstacles that hinder companion screen applications from reaching mass audiences. To solve this, TV production companies need a standard process for producing multiscreen content, allowing viewers to follow all kinds of programs within a single application. The paper proposes a new object-based production platform for authoring and broadcasting multiscreen programs.

Besides the awarded paper, the DIS group also presented a demo at the conference, showcasing object-based live broadcasting of a sports event to both a large TV screen and companion screens. The demo covered both the producer side and the user side. The producer side was set inside an outside broadcast truck, where the director/producer could insert on-demand snippets and interactive components into the live broadcast. The user side was a home, where a viewer with a TV screen and several companion devices enjoyed a personalized and interactive experience.

CWI’s Distributed and Interactive Systems (DIS) research group focuses on facilitating and improving the way people access media and communicate with others and with the environment. They address key problems for society and science, resulting from the dense connectivity of content, people, and devices. The group uses recognized scientific methods, following a full-stack, experimental, and human-centered approach.

More at TVX from BBC and IRT

A work-in-progress paper submitted by BBC and IRT explores the potential of augmented reality as a novel way of letting viewers watch a sign language interpreter through an optical head-mounted display while following a TV programme. The paper addresses the potential of augmented reality for personalising TV access services. Based on guidelines from regulatory authorities, research on traditional sign language services on TV, and feedback from experts, the authors justify two design proposals. They describe how the content for the AR prototype applications was produced and what was learned during the process, and finally develop questions for the upcoming user studies.

Also at TVX, BBC and IRT demonstrated results of the project’s work stream that targets deploying the 2-IMMERSE apps on HbbTV 2.0 devices. In cooperation with Samsung, they showed the 2-IMMERSE MotoGP experience from the Silverstone GP 2017 on a recent consumer device running HbbTV 2.0-ready firmware.


A Day in Brussels: showcasing our production tools

PABLO CESAR, JIE LI and THOMAS RÖGGLA from project partner CWI Amsterdam report on a successful showcase in Brussels:

Vlaamse Radio- en Televisieomroeporganisatie (VRT), the national public-service broadcaster for the Flemish Region and Community of Belgium, organizes Media Fastforward, an annual networking event on media innovation. This year it took place at the beautiful Bozar venue in Brussels on 5 December. VRT Innovatie invited a number of European research projects to participate in the Future Zone.

With its focus on media innovation, Media Fastforward fits the networking and dissemination needs of 2-IMMERSE well, providing a unique opportunity to meet peers and to showcase the results of the project. So the CWI team packed their bags, taking along a demo of our object-based multiscreen broadcasting, with an emphasis on the live triggering tool intended for sports events such as football matches or MotoGP races.

Our demo showed a working prototype of our tool, interfacing with the 2-IMMERSE platform, which could trigger broadcast events in real time, showing and hiding collections of objects. The tool aims to reduce the workload during live broadcasting by providing templates for certain events (e.g., crashes and overtakes in MotoGP). A template resembles a ‘data package’, including graphics, placeholders for camera feeds, and scripts describing the sequence of contents within the event.
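A sketch of what such a ‘data package’ might look like (hypothetical Python; the field names, timings, and feed URLs are invented for illustration and are not the actual 2-IMMERSE format):

```python
# Invented event template: graphics, camera-feed placeholders, and a
# timed script describing the sequence of contents once triggered.
overtake_template = {
    "event": "overtake",
    "graphics": ["rider-name-tag", "position-change-banner"],
    "camera_feeds": [None, None],   # placeholders, bound to live feeds at trigger time
    "script": [
        {"t": 0.0, "show": "position-change-banner"},
        {"t": 2.0, "show": "replay", "feed": 0},
        {"t": 8.0, "hide": "replay"},
    ],
}

def bind_feeds(template: dict, feeds: list) -> dict:
    # Fill the camera-feed placeholders with live feed URLs, leaving
    # the original template reusable for the next occurrence of the event.
    bound = dict(template)
    bound["camera_feeds"] = list(feeds)
    return bound

ready = bind_feeds(overtake_template, ["rtmp://cam3", "rtmp://cam7"])
print(ready["camera_feeds"][0])  # → rtmp://cam3
```

Keeping the feeds as placeholders is what makes the template reusable: the same package can be fired for every overtake, with only the live camera bindings changing.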

The event was a huge success, with around one thousand registered visitors, 25 (inter)national speakers, more than 40 startups, 12 research projects and 3 impressive tech companies. It allowed us to interact with peer projects like FLAME, MOS2S and ImmersiaTV, and to discuss our production tools with entrepreneurs, media professionals, and policy makers.

Yes, we also had some extra time to relax with bike games and alcohol-free mojitos. We will certainly be back next year!