We won! “A very sophisticated application…”

DOUG WILLIAMS from BT R&D reports on a notable success for the project:

2-IMMERSE are very pleased to report that the multi-screen presentation of MotoGP, developed as part of the 2-IMMERSE project, has won the 2018 'Best multiscreen HbbTV service' award from the HbbTV Association. (A note for new readers: HbbTV is an abbreviation for 'Hybrid broadcast broadband TV'.)

The HbbTV Association and Deutsche TV-Plattform announced the winners of the HbbTV Awards 2018 at a ceremony on 14 November 2018, as part of the 7th HbbTV Symposium and Awards, which took place in Berlin under the theme 'Growing Value Through HbbTV'.

Michael Probst, from IRT, was there to receive the award on behalf of the project.

The Best Multiscreen Service Award recognises research and innovations in HbbTV services running across multiple screens. Entrants were asked to demonstrate how their service will bring value to the viewer’s experience and to the service providers.

In our submission we explained that the service concept was a multi-screen interactive presentation of the 2017 UK MotoGP race, running on an HbbTV emulator installed on an Intel NUC and, in cut-down form, directly on an HbbTV 2 TV. The application allows users to watch the race on the TV and to access additional video, graphics and data-driven content on a companion screen. The companion screen can be used to trigger replays on the main screen, customise the leader board on the TV and cast on-board bike cams to the main screen.

The Jury quote accompanying the award described the service concept as:

A very sophisticated application, HbbTV 2.0 and Multiscreen at the highest level.

The service concept was also runner-up for the Grand Prix, the prize awarded to the best of all the submissions to the HbbTV Awards.



Alternative TV companions

CHRISTOPH ZIEGLER from 2-IMMERSE partner IRT introduces a parallel project that uses HbbTV 2 technology in an innovative way. 

IRT’s project, ‘A Tangible TV Companion for a Children’s Quiz Show’, has been nominated in the category ‘Best multiscreen HbbTV service’ in the 2018 HbbTV Awards, the winners of which are announced on 14 November.

2-IMMERSE develops compelling multi-screen applications for MotoGP, football and live theatre productions. The partners invest substantial effort in designing and implementing intuitive user interfaces, as well as in developing a technically reliable platform. The applications are then evaluated in elaborate user trials. To ensure these trials yield highly relevant user feedback, the project focuses on today's mainstream end-user devices for multi-screen experiences: the TV, tablets and smartphones. In addition, the partners are also watching more recent developments in the market, not least because we do not want to miss out on potential opportunities. This is why we evaluate emerging technologies that seem exciting to us on the basis of simpler proof-of-concept prototypes, like the one presented here.

Figure 1. The fields for the answer options in the studio of the programme 1, 2 oder 3 (left) and the sensor floor pads we developed for trade fairs (right).

In cooperation with the German broadcaster ZDF, IRT developed a TV companion experience for the children's quiz show 1, 2 oder 3, which shows that the companion-screen features of HbbTV 2, the basis for all 2-IMMERSE applications, can enable innovative forms of interaction and new opportunities for viewer engagement. This companion experience does not involve a tablet or smartphone, but rather a sensor pad, which viewers could potentially build themselves at home. The sensor pad connects via a Raspberry Pi microcomputer to an HbbTV 2 application and allows viewers to play along with the TV show.

Figure 2. Prototype of a floor pad sensor built from cardboard, aluminium foil and adhesive tape.

The sensors in our playing field were originally part of cat doorbells. They are covered with printed PVC mats, and WS2812 LED strips are mounted on the front of the sensor pad; light patterns provide feedback on the quiz application's state. The sensor pads and LED strips are connected to a Raspberry Pi Zero W (RPi) microcomputer. The RPi runs the Node.js-based companion app, which uses the DIAL protocol to discover HbbTV devices on the local network. Users initiate the discovery by simply stepping on the floor pad.
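
As a rough illustration of that discovery step, the sketch below sends a DIAL/SSDP search from a Node.js process like the one running on the RPi (shown here in TypeScript). The search target follows the DIAL specification, but the response handling and timings are our own simplification rather than the actual companion app code.

    // Minimal sketch of DIAL discovery over SSDP from a Node.js companion device.
    // The search target follows the DIAL specification; response handling and
    // timings are simplified for illustration.
    import * as dgram from "dgram";

    const SSDP_ADDR = "239.255.255.250";
    const SSDP_PORT = 1900;

    const M_SEARCH = [
      "M-SEARCH * HTTP/1.1",
      `HOST: ${SSDP_ADDR}:${SSDP_PORT}`,
      'MAN: "ssdp:discover"',
      "MX: 2",
      "ST: urn:dial-multiscreen-org:service:dial:1",
      "", ""
    ].join("\r\n");

    const socket = dgram.createSocket("udp4");

    socket.on("message", (msg, rinfo) => {
      // A DIAL-capable device answers with an HTTP-over-UDP response whose
      // LOCATION header points at its device description, from which the
      // companion can learn the TV's application endpoints.
      const location = /^LOCATION:\s*(.+)$/im.exec(msg.toString());
      if (location) {
        console.log(`DIAL device at ${rinfo.address}: ${location[1].trim()}`);
      }
    });

    socket.bind(() => {
      socket.send(M_SEARCH, SSDP_PORT, SSDP_ADDR, (err) => {
        if (err) console.error("M-SEARCH failed", err);
      });
      // Give devices a couple of seconds to answer, then close the socket.
      setTimeout(() => socket.close(), 3000);
    });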

The HbbTV app is synchronised to the broadcast programme via the App-to-AV synchronisation feature of HbbTV 2, so questions and answer options are displayed by the HbbTV app at the same time as the quiz master mentions them in the show. When viewers select an answer by stepping on a field of their connected sensor pads, their choices are displayed on the TV. When the quiz master reveals the correct answer in the show, it is also relayed to the floor pad via the App-to-App communication channel, and the segment of the LED strip that corresponds to the correct answer is illuminated. The HbbTV app displays the viewers' aggregated scores.
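
To give an idea of what this exchange can look like on the companion side, here is a hedged TypeScript sketch. In HbbTV 2 the remote app-to-app base URL is obtained from the terminal during discovery and both ends pair on a shared application endpoint string; the URL, endpoint name and message format below are invented for illustration only.

    // Hedged sketch of the companion side of the app-to-app exchange. The URL,
    // endpoint name and message format are invented for illustration; a real
    // companion obtains the remote app-to-app base URL during discovery.
    import WebSocket from "ws";

    const APP2APP_REMOTE_BASE = "ws://192.168.0.10:8900/hbbtv/"; // assumption
    const APP_ENDPOINT = "org.2immerse.quiz";                    // shared with the TV app

    const socket = new WebSocket(APP2APP_REMOTE_BASE + APP_ENDPOINT);

    socket.on("open", () => {
      // Tell the TV app which answer field the viewer stepped on (1, 2 or 3).
      socket.send(JSON.stringify({ type: "answer", field: 2 }));
    });

    socket.on("message", (data) => {
      const msg = JSON.parse(data.toString());
      if (msg.type === "correctAnswer") {
        // Light the LED segment behind the correct field (GPIO handling omitted).
        console.log(`Correct answer is field ${msg.field}`);
      }
    });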

Figure 3. Board game version of the companion hardware.

In addition to the sensor pad, we have also built a board game version of the companion hardware. The playing field of the board game is made up of three fields of acrylic glass. In addition to LED strips, reed switches are installed under the three fields. The reed switches react to the magnetic field released by the permanent magnets in the playing piece.

We believe our demo could stand prototypically for a range of broadcast services fulfilling an educational mission. Information technology (IT) is omnipresent, and educational programmes such as the BBC's micro:bit project [1] aim to enable children to make conscious use of IT instead of just consuming IT products. Our TV app could motivate children to explore IT concepts: a simple version of the sensor playing field can be built by children at home from cardboard and aluminium foil [2]. Our board game shows that the design possibilities are manifold and that there are no limits to the creativity of tinkerers.

We have shown the demo with great success at IFA and IBC. At IFA our demo appeared at two stands: ARD's 'Digitale Welt' and the stand of Deutsche TV-Plattform. The latter showing was featured on the front page of IFA's daily exhibition newspaper 'IFA heute'. However, we believe that our demo is more than an attraction at trade fairs. We see many opportunities for broadcasters and third parties.

Figure 4. Piet, the official mascot of the show '1, 2 oder 3', presents our demo at the stand of Deutsche TV-Plattform at IFA 2018.

Figure 5. Our demo promoted on the cover of the 'IFA heute' exhibition magazine.

References

[1] BBC micro:bit website, online: https://microbit.org/, accessed 1 October 2018.

[2] Jason Poel Smith, 'Use a DIY pressure plate switch to automate your haunted house', online: https://www.instructables.com/id/Use-a-DIY-Pressure-Plate-Switch-to-Automate-Your-H/, accessed 1 October 2018.



The 2-IMMERSE Layout Engine Packing Algorithm

In a technical post, AVIVA VAKNIN from 2-IMMERSE partner Cisco outlines the operation of the algorithm behind the project’s layout engine.

The motivation for the 2-IMMERSE layout service is to provide a cloud-based, highly available layout engine that can compute a layout for a dynamically varying set of digital application components over a varying set of viewing devices. The service manages the set of devices and running components and provides the layout engine with a full layout specification for each layout computation. The engine then returns a list of components per device, with region, position and size.

At its core, the layout engine packs a set of rectangles onto a set of rectangular display regions. The primary use case is to compute layout arrangements for digital components over a set of viewing devices, e.g. several video playbacks and a control panel running over a video wall made up of many monitors. The service comprises a REST interface and a layout engine. The layout engine is unique in that (1) it computes a layout over a set of rectangular display regions and (2) the rectangles, or digital components, may be heavily constrained.

The input to the packer algorithm includes a list of rectangular regions (logical rectangular display areas, mapped onto underlying physical devices), their associated device information, and a list of rectangular components and associated constraints to be packed.

The output is a list of components per device, with region, position and size for each placed component; all coordinates are relative to the region's top-left corner, which is at (0,0).
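
To make the interface concrete, the following TypeScript sketch shows one possible shape for the packer's input and output. The field names are illustrative assumptions, not the layout service's actual JSON schema.

    // Illustrative shapes for the packer's input and output; the field names
    // are assumptions for this sketch, not the layout service's actual schema.

    interface Region {
      id: string;
      deviceId: string;   // physical device the logical region maps onto
      width: number;      // pixels
      height: number;
    }

    interface ComponentConstraints {
      priority: number;            // higher-priority components are packed first
      prefWidth?: number;          // preferred size, if any
      prefHeight?: number;
      minWidth?: number;           // below this the component is not placed
      minHeight?: number;
      aspectRatio?: number;        // e.g. 16 / 9
      anchor?: "top-left" | "top-right" | "bottom-left" | "bottom-right";
    }

    interface ComponentRequest {
      id: string;
      constraints: ComponentConstraints;
    }

    interface Placement {
      componentId: string;
      deviceId: string;
      regionId: string;
      x: number;                   // relative to the region's top-left corner (0,0)
      y: number;
      width: number;
      height: number;
    }

    // One layout computation: regions and components in, placements out.
    type LayoutResult = Placement[];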

The algorithm is multi-pass. Since the primary application is digital media, the number of components and displays is small, so efficiency is not the prime consideration; generating optimal, aesthetically pleasing layouts is.

Pass One begins by sorting the regions in decreasing size order and the components in decreasing priority order, using size as a secondary sort key. The component sizes are not fixed, so the packing engine computes a first approximation to the number of components that will fit into the given region set and then sorts the highest-priority components that fit in decreasing size order.

The destination regions are stored in a list of nodes, initialised with one node per region, all marked as unoccupied. When the packer places a component into a region and the component does not occupy an entire dimension (e.g. due to aspect ratio or maximum size constraints), the node is split into three regions rather than two, as seen in Fig. 1.

If the packer does not find an unoccupied node appropriate for the component, it attempts to split an occupied node and populate the second half with the new component. The packer uses the preferred size, minimum size, anchor and aspect ratio constraints to determine how to fit the components.
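
The node split can be pictured with a small sketch. The function below is a simplified model assuming top-left anchoring, not the engine's actual code: placing a component that does not fill the node leaves a free strip to the right and/or below it, so one node becomes two or three.

    // Simplified model of the node split, assuming top-left anchoring. Placing
    // a component that does not fill the node leaves a free strip to the right
    // and/or below it, so one node becomes two or three.
    interface LayoutNode {
      x: number;
      y: number;
      width: number;
      height: number;
      occupied: boolean;
    }

    function placeInNode(node: LayoutNode, compW: number, compH: number): LayoutNode[] {
      const placed: LayoutNode = { x: node.x, y: node.y, width: compW, height: compH, occupied: true };
      const free: LayoutNode[] = [];
      if (compW < node.width) {
        // Free strip to the right of the placed component.
        free.push({ x: node.x + compW, y: node.y, width: node.width - compW, height: compH, occupied: false });
      }
      if (compH < node.height) {
        // Free strip below, spanning the node's full width.
        free.push({ x: node.x, y: node.y + compH, width: node.width, height: node.height - compH, occupied: false });
      }
      return [placed, ...free];
    }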

Pass Two attempts to fit any components that were left unplaced in the first pass due to lack of space: if not all components were laid out, the packer successively reduces the size of all the components using a decreasing reduction factor. The packer then chooses the layout that fits the highest number of components and, if several are equal, the one with the least white space.
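
The TypeScript fragment below sketches the Pass Two idea, again as an illustration rather than the engine's actual code: the packing is retried with progressively smaller components, and the attempt that places the most components (with least white space as a tie-breaker) wins.

    // Sketch of the Pass Two idea only; packOnce() stands in for the internal
    // packer loop described above and is supplied by the caller.
    interface PackAttempt {
      placedCount: number;
      whiteSpace: number;   // unused area left by this attempt
    }

    function passTwo(
      packOnce: (scale: number) => PackAttempt,
      totalComponents: number
    ): { scale: number; result: PackAttempt } {
      let bestScale = 1.0;
      let best = packOnce(bestScale);
      // Successively reduce all component sizes until everything fits or we give up.
      for (let scale = 0.9; scale >= 0.5 && best.placedCount < totalComponents; scale -= 0.1) {
        const attempt = packOnce(scale);
        // Prefer the attempt that places more components; break ties on least white space.
        if (attempt.placedCount > best.placedCount ||
            (attempt.placedCount === best.placedCount && attempt.whiteSpace < best.whiteSpace)) {
          best = attempt;
          bestScale = scale;
        }
      }
      return { scale: bestScale, result: best };
    }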

Pass Three re-arranges the layout so that it is more aesthetic, i.e. it avoids holes in the centre of the displays, minimises white space, and collates white space towards the right and bottom of the rectangle. The packer sorts the resulting nodes in position order, starting at the top left and moving right, then down, and packs the components again using the internal packer loop, choosing the layout that maximises real-estate coverage as well as the number of placed components.

The result is that the packing algorithm always adheres to the defined constraints, provides a good distribution of the components over the regions, and generates aesthetic layouts, particularly when constraints do not include maximum sizes.

Sample video layouts

Video component constraints: aspect ratio = 16:9 and minimum width = 50%



Fine-tuning the live production tools

JIE LI of the Distributed and Interactive Systems (DIS) group at CWI Amsterdam writes about the project’s recent encounters with broadcast professionals.

2-IMMERSE is developing a production platform for creating multiscreen and immersive experiences by TV professionals. The production platform includes three software components:

  1. A preproduction tool that aims at digitising the authoring process in an object-based manner. It is a graphical tool that helps producers script programmes by temporally arranging different media objects (such as video clips, graphics, audio and subtitles) for both TV screens and companion screens.
  2. A live editing tool that enables the production team to react to live events happening during a television programme, such as a goal during a football match or an overtake during a MotoGP race. When live events happen, the editorial team can quickly edit associated media (graphics, name tags, replay clips) based on prepared templates, following the director’s instructions.
  3. A live triggering tool that includes a button-based graphical interface and a keyboard launcher. The edited templates are enqueued as thumbnails on the interface of the launcher and can then be easily triggered by the director (a sketch of such a trigger event follows this list).
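
The sketch below illustrates, in TypeScript, one possible shape for such a trigger and its queue. The field names and the queue class are assumptions made for illustration; they are not the project’s actual API.

    // Illustrative shape of a trigger passed from the triggering tool to the
    // playout chain. Field names and the queue class are assumptions made for
    // this sketch; they are not the project's actual API.
    interface TriggerEvent {
      templateId: string;                   // prepared template chosen by the editorial team
      parameters: Record<string, string>;   // e.g. { player: "...", minute: "63" }
      firedAt?: string;                     // set when the director fires the trigger
    }

    class TriggerQueue {
      private pending: TriggerEvent[] = [];

      // The editorial team enqueues an edited template; it appears as a thumbnail.
      enqueue(event: TriggerEvent): void {
        this.pending.push(event);
      }

      // The director fires the next queued trigger from the launcher.
      fireNext(send: (event: TriggerEvent) => void): void {
        const next = this.pending.shift();
        if (next) {
          send({ ...next, firedAt: new Date().toISOString() });
        }
      }
    }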

The live editing and triggering tools have been successfully evaluated at Wembley Stadium during the FA Cup semi-final and final and showcased at the International Broadcasting Convention (IBC) 2018 in Amsterdam. The team is now busy further developing the preproduction tool.

Fig. 1. A demo of the live editing tool and the live triggering tool at IBC2018

The initial wireframes of the preproduction tool were designed and iterated based on input from two rounds of interviews with a total of 20 professionals in the broadcasting industry (see the papers ‘Designing an Object-based Preproduction Tool for Multiscreen TV Viewing’ and ‘A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences’). In September 2018, a number of additional interviews took place to fine-tune the interface and interaction design of the tool.

The interviews were conducted with seven professionals (P1-P7; 5 male, 2 female; M = 35.0, SD = 6.0), whose backgrounds are summarised in Table 1. They took place from 3 to 5 September 2018 at the usability lab of BBC R&D at MediaCityUK, Manchester.

Fig. 2. (Top) The hierarchical organization of the programme chapters; (Bottom) the spatial and temporal arrangements of the DMApp components.

The interviewees confirmed the basic design decisions for the preproduction tool: hierarchical organization of the programme sections and sequential arrangement of the media objects following the script (see Fig. 2). The former offers a clear structure and enables the use of a master layout to reduce repetitive production work. The media objects can be configured and reused to add interactive and novel functions to TV programmes; these can, for example, enable users to select different viewing angles or to follow their favourite football players.

In addition, the professionals recommended a node structure to link media objects within a section, giving a clear overview of which objects run in parallel and which run in sequence. Regarding the layout of the TV programme, the professionals confirmed the research team’s intuition that the preproduction tool does not need to manage it.

Based on the feedback, the original interaction design of the tool will be improved, a new development process will take place, and a final evaluation with the seven professionals will happen by the end of November. The idea is to ask the professionals to create a multiscreen and interactive TV sports program. Stay tuned!



2-IMMERSE in IBC’s Future Zone

DOUG WILLIAMS of BT R&D offers some first thoughts after IBC 2018:

2-IMMERSE was well represented in the Future Zone at IBC – the world’s leading media technology exhibition – from 13 to 18 September. Team members had the chance to meet, address and learn from more than 57,000 of the most engaged power brokers and technical analysts in media and entertainment. In addition, we launched the open source software that underpins the multi-screen personalised TV demos that 2-IMMERSE has developed.

Apart from the stand in the Future Zone, results from the project, including a compelling rationale extolling the benefits of the object-based media approach used by the project, were also delivered at the IBC conference by Ian Kegel on behalf of the wider team.

From left, Fergus Garber, Gemma Knight and Jamie Hindhaugh with Ian Kegel learning about the Football at Home prototype.

Many important industry stakeholders were there, including Jamie Hindhaugh, COO of BT Sport, who said, ‘This is the best thing I’ve seen at IBC; you just know that has been designed with the fans in mind. I think it’s beautiful.’ We also rubbed shoulders with producers, directors, technologists, analysts and broadcasters from all over Europe and around the world.

Visitors to the stand were particularly encouraged to see that the project’s software contributions are being made available under open source licences, and every time we showed the Fan Zone demo they smiled. We lost count of the number of times we were told ‘I like that’ and ‘this is the future’. As the teams disperse back to their workplaces we are encouraged, feeling that our focus on exploitation is paying dividends.

The Football at Home prototype – a personalised, interactive multi-screen experience delivered over IPTV

The Football Fan Zone prototype showing a new representation of BT Sport suitable for viewing in public spaces



IBC launch for 2-IMMERSE open source software

2-IMMERSE is approaching the end of its three-year term. With three and a half of our four prototype services developed, and with them all running on the same common platform, it is a good time to share the progress we have made with the wider media community. That is why we are spending a lot of time and effort developing our stand and our presence more generally at IBC. IBC (the International Broadcasting Convention) is held each September at the RAI Amsterdam and is regarded as the world’s most influential media, entertainment and technology show. The exhibition dates this year are Friday 14 to Tuesday 18 September.

2-IMMERSE will have its own stand in the Future Zone (Hall 8, F46), but you’ll also see 2-IMMERSE project outputs on the stands of ChyronHego (Hall 7, C21) and IRT (Hall 10, F51), and in the conference programme (Sunday 16 at 16.45 in the Forum), where Ian Kegel is presenting a description of the technical platform used to deliver object-based productions.

Through a number of channels, including this blog post, we are inviting friends in the industry to come and see what we have achieved and to learn specifically about the software we will be launching at the end of the project. 2-IMMERSE has recognised a number of key challenges related to the delivery of object-based media productions across multiple screens and has sought to develop common, extensible solutions to these that will be made available as open source software. In addition, we will be describing a reference architecture that helps explain the technology components required to deliver our object-based experiences.

The software components we are developing all play valuable roles in the delivery of our multi-screen experiences. They will:

  • Enable frame accurate multi-screen synchronisation
  • Manage layouts across multiple screens
  • Manage timelines across multiple screens
  • Enable unified sign-on for multi-screen experiences
  • Provide bandwidth orchestration capability to help deliver the “best” experience even when bandwidth is limited

2-IMMERSE has worked closely with rights holders throughout the project, setting – and meeting – the challenge of delivering one of its prototypes as a live end-to-end production. We chose the high-profile 2018 FA Cup Final as the object of this trial, and at IBC we will be showing the production tools we developed to do this. We will also be available to discuss how the production chain needs to evolve to support object-based delivery.

2-IMMERSE set out to show that an object-based delivery approach can be used to deliver immersive multi-screen experiences that over-achieve compared to their single-screen alternatives. What ‘over-achieve’ means depends upon the context. For the sports examples it means delivering more on the brand objective for the rights holders; in the case of BT Sport that means helping viewers get to the heart of sport. You can see what we have done towards this objective with our MotoGP at Home demo.

If you come to IBC you’ll be among the first to see our new FanZone experience for football – a prototype service that highlights how the object-based approach can be used to create partisan representations of games in public spaces. This exceptional demo is also based on the 2018 FA Cup Final.

‘Over-achieving’ for theatre companies with education objectives means supporting their ambition to allow equal provision and equal access to the best theatre-based education resources through connected screens.  For our Theatre At Home demonstration it was about delivering a more social and engaging experience of watching filmed theatre in the home.  These experiences will all be on show at IBC and we are keen to share our enthusiasm for our achievements with the wider broadcasting and production industries.

We hope to see you in Amsterdam!

In the Future Zone, Hall 8, Stand F46.



Object-based media: the ‘next big thing’

“BT Sport: Object-based delivery ‘next big thing’” – so reads the title of a recent article on the website Advanced Television. Doug Williams from BT argues that this illustrates that we are making progress on exploitation – although perhaps not in the way you might expect.

A key objective of an Innovation Action like 2-IMMERSE is to be able to show that the project has an impact, and most particularly that it leads to commercial impact. Simplistically, it would be nice to think that 2-IMMERSE designed and proved the viability of a widget, and that following the project a new company was started that made, marketed and sold said widget. That would be an easy story to tell – but things don’t often work like that.

2-IMMERSE is a collaborative project involving, among others, the BBC, BT, Cisco and IRT, and in this blog post I want to tell you a little about some of the progress we are making towards delivering our exploitation objectives. I am not telling the whole story, just relating something of what is happening in BT.

As in the suggested ‘easy to tell’ story, 2-IMMERSE is designing and proving the viability of something, though that something is not a widget. We are proving, through the work done on the MotoGP experience and on the football trials, for example, that using the object-based delivery approach we can create multi-screen TV-based experiences, and that those experiences can be appealing. We have worked closely with rights holders BT Sport and Dorna Sports as our experiments progressed and have shared our progress with them along the way.

A fundamental aspect of this project is that the delivery of the TV experiences is based on objects, where those objects can be video streams, graphics, audio streams or interactive elements that are assembled, according to rules, across the devices that are available to present the experience. There might be an element of “so what?” to that description: it does not highlight any benefits and gives no obvious reason why such an approach is useful to a broadcaster. But there is a simple benefit, addressing something that is distinctly missing from current TV services: the ability to personalise or customise the experience to better suit the context in which the TV experience is being consumed.
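
To make the idea concrete, here is a purely illustrative TypeScript sketch of an experience described as media objects plus simple assembly rules, personalised per viewer. The names and structure are invented for this sketch and are not the 2-IMMERSE data model.

    // Purely illustrative: an experience described as media objects plus simple
    // assembly rules, personalised per viewer. Names and structure are invented
    // for this sketch and are not the 2-IMMERSE data model.
    type MediaObject =
      | { kind: "video"; id: string; src: string }
      | { kind: "audio"; id: string; src: string; gainDb: number }
      | { kind: "graphics"; id: string; size: "normal" | "large" };

    interface ViewerPreferences {
      clearCommentary: boolean;   // e.g. a hard-of-hearing viewer
      largeGraphics: boolean;     // e.g. watching on a small TV across the room
    }

    // Apply simple rules to the object set before it is presented on a device.
    function assemble(objects: MediaObject[], prefs: ViewerPreferences): MediaObject[] {
      return objects.map((obj) => {
        if (obj.kind === "audio" && obj.id === "crowd" && prefs.clearCommentary) {
          return { ...obj, gainDb: obj.gainDb - 12 };        // duck the crowd, keep the commentary
        }
        if (obj.kind === "graphics" && prefs.largeGraphics) {
          return { ...obj, size: "large" as const };         // bigger, easier-to-read graphics
        }
        return obj;
      });
    }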

TV is great. I am writing this the morning after nearly half the UK population were together watching England play, and disappointingly lose, in the semi-final of the World Cup. Television is the only medium that can achieve such moments, where such a large fraction of the population are all sharing the same experience. And that’s magic.

But.

• What if I was hard of hearing and wanted to quieten that crowd volume and increase the commentary volume so I could hear the commentary more clearly?  I couldn’t do that last night, but I could with the object-based delivery approach.

• What if I was watching on a small TV and found the graphics a bit hard to read and wanted to increase the size of the graphics so I could read them more easily?  I couldn’t do that last night, but I could do that with the object-based delivery approach.

• What if I was profoundly deaf and wanted to pull up a signed commentary on the TV screen?  I couldn’t do that last night, but with the object-based delivery approach that is possible.

• What if I wanted to actively track the relative possession enjoyed by each of the teams throughout the game?  I couldn’t do that last night, but I could do so with the object-based delivery approach.

• What if I was interested in seeing at all times, the expressions on each of the different managers’ faces?  I couldn’t do that last night, but with the object-based delivery approach I could.

The object-based delivery approach provides the ability to personalise the TV experience to better address the needs of a large number of niche audiences, whether they are characterised by a need for more data, clearer graphics or a different audio mix. We’ve been discussing these options with our rights-holder partners for some time now, and together we are beginning to focus on these benefits and to work out how we could take them from the lab to the living room.

BT Sport have a fantastic innovation record. BT Sport were the first to deliver live UHD TV in Europe. BT Sport were among the first to provide 360° video coverage of live sporting events. BT Sport were the first to use Dolby Atmos to deliver an even more compelling soundscape to accompany the pictures. And now? Well, now it’s good to see reports of Jamie Hindhaugh, COO of BT Sport and the one who has driven innovation in the company, ruminating on what innovations are next in the pipeline.

According to an article penned by Colin Mann of the Advanced Television website, reporting on the Love Broadcasting Summit:

[Jamie Hindhaugh] didn’t see VR and 3D as necessarily the future for sports viewing, as they lacked the social aspect, with BT preferring to offer such options as different camera angles, and instead suggested that object-based delivery would grow with the spread of IP: “We will start giving you objects, packages, templates, so you can create your own version of what you are watching.”

Hindhaugh said that hardcore fans of MotoGP could benefit from additional graphics and stats available to the broadcaster. BT would curate the data and allow the viewer to choose their preferences from the programme feed: “That’s where we’re going. That’s where the next big initiative is coming through, combined with 5G mobile.”

We have always seen our greatest potential for innovation impact in large companies as being the ability to affect the way they choose to develop their services.  We have set out to work with the parts of our business that run key services and to effect innovative change.  We are pleased to see the ideas we are developing and demonstrating are appealing not only to users but also to decision makers and that innovations developed in this project could be the ‘next big thing’.



Success for 2-IMMERSE at TVX

Jie Li honoured with Best Paper Award

On 27 June, Jie Li of the Distributed and Interactive Systems (DIS) group at 2-IMMERSE partner CWI received the Best Paper Award at ACM TVX 2018, the ACM International Conference on Interactive Experiences for Television and Online Video. Li won the award for the paper ‘A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences’. Outlining research done within 2-IMMERSE, the paper has as co-authors Pablo Cesar, Maxine Glancy (BBC), Jack Jansen and Thomas Röggla.

Multiscreen TV viewing refers to a spectrum of media productions that can be watched on TV screens and companion screens such as smartphones and tablets. TV production companies are now promoting an interactive and engaging way of viewing TV by offering tailored applications for TV programs.

However, viewers are reluctant to install dozens of applications and switch between them, and this is one of the obstacles that hinder companion-screen applications from reaching mass audiences. To solve this, TV production companies need a standard process for producing multiscreen content, allowing viewers to follow all kinds of programmes in one single application. The paper proposes a new object-based production platform for authoring and broadcasting programmes for multiple screens.

Besides the award-winning paper, the DIS group also presented a demo at the conference, showcasing object-based live broadcasting of a sports event to both a large TV screen and companion screens. The demo covered both the producer and the user side. The producer side takes place inside an outside broadcast truck, where the director/producer can insert on-demand snippets and interactive components into the live broadcast. The user side is a home, where a viewer with a TV screen and several companion devices enjoys a personalised and interactive experience.

CWI’s Distributed and Interactive Systems (DIS) research group focuses on facilitating and improving the way people access media and communicate with others and with the environment. They address key problems for society and science, resulting from the dense connectivity of content, people, and devices. The group uses recognized scientific methods, following a full-stack, experimental, and human-centered approach.

More at TVX from BBC and IRT

A work-in-progress paper submitted by BBC and IRT explores the potential of augmented reality technology as a novel way to allow users to view a sign language interpreter through an optical head-mounted display while watching a TV programme. We address the potential of augmented reality for personalisation of TV access services. Based on guidelines of regulatory authorities and research on traditional sign language services on TV, as well as feedback from experts, we justify two design proposals. We describe how we produced the content for the AR prototype applications and what we have learned during the process. Finally, we develop questions for our upcoming user studies.

Also at TVX, BBC and IRT demonstrated results of our work stream targeting the deployment of the 2-IMMERSE apps on HbbTV 2.0 devices. In cooperation with Samsung, we showed the 2-IMMERSE MotoGP experience from the 2017 Silverstone GP on a recent consumer device running HbbTV 2.0-ready firmware.



MotoGP roars out on an HbbTV 2 television

MICHAEL PROBST from 2-IMMERSE partner IRT introduces an impressive video showcasing our MotoGP prototype running on an HbbTV 2 television.

The 2-IMMERSE architecture has been built around the HbbTV 2 specification, mainly using the new protocols defined for interaction and media synchronisation with multiple screens in the home network.

During the service trials, the project has made use of custom prototype units as client devices. For the most recent trials an Intel NUC-based Linux PC was used to emulate the main TV and feed the screen in the home.

In a parallel track, however, the 2-IMMERSE implementation of the MotoGP trial has been tested and validated with televisions supporting HbbTV 2 that became available in 2018. Watch the video, hosted by IRT’s Florian Bachmann, to see the MotoGP showcase on one of the first HbbTV 2 televisions and learn more about which HbbTV 2 features 2-IMMERSE actually uses.

You can view the video here.

You can see this demonstration at IFA in Berlin (31 August – 5 September) and at IBC 2018 in Amsterdam (14-18 September) at the IRT stands (Hall 2.2 at IFA; Hall 10, F51 at IBC), in addition to the main 2-IMMERSE presentation in the IBC Future Zone.



Back of the net!

The 2-IMMERSE Football production team report on the success of our major trial at Wembley.

2-IMMERSE passed another milestone at the recent 2018 FA Cup Final at Wembley Stadium between Chelsea and Manchester United as we proved our prototype end-to-end live production system for object-based broadcasting to multiple screens.

For those that may not know, the FA Cup is the world’s oldest football challenge cup, having started back in 1871. The high-profile final is the curtain-closer for the UK domestic football season and has spawned and fuelled many of the intense inter-club rivalries that give the UK league its particular character. For decades the match has also been a key television moment and a showcase for experimentation and innovation in the presentation of football on television, so 2-IMMERSE is very proud this year to have been a small part of a great tradition. The FA Cup Final is a national treasure, but it also draws a global audience estimated to be in the hundreds of millions.

Our team for the FA Cup was drawn from BT, CWI, Cisco and BBC R&D, and they all worked tirelessly over the preceding weeks and months to design and develop the end-to-end system. Early ‘sighting visits’ to Wembley (see our previous blog post) helped us identify and address the key issues that could affect the delivery of the service, and helped the test on the day of the Final go as smoothly as it did.

On the day, we arrived with our modest broadcast truck and once the feeds from all the individual cameras were in place we were set to go. After all the preparation we were able to conduct three key activities at Wembley:

  • Live orchestration of match graphics, mirroring the broadcast graphics production but using our own HTML5-focused production and delivery chain
  • Live monitoring and demonstration of prototype Football DMApp (see blog post) on site at Wembley
  • ChyronHego data capture for Virtual Placement: camera parametric data and Tracab player tracking data.

In addition to the on-site achievements, and to further illustrate the end-to-end nature of the trial, we engaged 10 remote viewing participants in the UK, Sweden and the Netherlands to experience watching our interactive, customisable and multi-screen version of the 2018 FA Cup Final.

Among the key system elements and features of our system are the following (which will be illustrated in a short video coming soon):

  • Live triggering tool with trigger launcher driven by Elgato Streamdeck
  • Real-time preview in OB van driven direct from SDI
  • TV emulator and Samsung tablet as client devices
  • Live sessions integrated within client onboarding workflow
  • Match GFX components developed by ChyronHego and implemented using a new DMApp component which integrates with their Prime universal graphics platform
  • Interactive experience on tablet included:
    • ScoreClock menu with match overview, team line-ups and replays
    • Broadcast menu with camera feed viewer and customised picture-in-picture layout on TV

Like most productions this was a massive collaboration involving people and organisations across and beyond 2-IMMERSE. We are happy to acknowledge the help of: AWS Elemental, for providing real-time encoders; BT Technology, Service and Operations, for providing some key contacts and helping with system design; BT Sport, for permission to access the stadium and for allowing our small van to take its place alongside the big trucks from BT Sport and the BBC; and BT Sport and BT Media and Broadcast, for providing the guaranteed upstream bandwidth we needed.

The next step – or at least one of them – is to work with all the video and data captured at the Final to develop a stand-alone demo showing the capabilities of the system and illustrating the end-to-end process. We will present this on our stand in the Future Zone at IBC 2018 in Amsterdam in September. We look forward to seeing you there.
