Back of the net!

The 2-IMMERSE Football production team report on the success of our major trial at Wembley.

2-IMMERSE passed another milestone at the recent 2018 FA Cup Final at Wembley Stadium between Chelsea and Manchester United as we proved our prototype end-to-end live production system for object-based broadcasting to multiple screens.

For those who may not know, the FA Cup is the world’s oldest football challenge cup, having started back in 1871. The high-profile final is the curtain-closer for the UK domestic football season and has spawned and fuelled many of the intense inter-club rivalries that give the UK league its particular character. For decades the match has also been a key television moment and a showcase for experimentation and innovation in the presentation of football on television. So 2-IMMERSE is very proud to have been a small part of a great tradition this year. The FA Cup Final is a national treasure, but it also draws a global audience estimated in the hundreds of millions.

Our team for the FA Cup was drawn from BT, CWI, Cisco, and BBC R&D, and everyone worked tirelessly over the preceding weeks and months to design and develop the end-to-end system. Early ‘sighting visits’ to Wembley (see our previous blog post) helped us identify and address the key issues that can affect delivery of the service, and helped the test on the day of the Final go as smoothly as it did.

On the day, we arrived with our modest broadcast truck and once the feeds from all the individual cameras were in place we were set to go. After all the preparation we were able to conduct three key activities at Wembley:

  • Live orchestration of match graphics, mirroring the broadcast graphics production but using our own HTML5-focused production and delivery chain
  • Live monitoring and demonstration of prototype Football DMApp (see blog post) on site at Wembley
  • ChyronHego data capture for Virtual Placement: camera parametric data and TRACAB player tracking data.

In addition to the on-site achievements, and to further illustrate the end-to-end nature of the trial, we engaged 10 remote viewing participants in the UK, Sweden and the Netherlands to experience watching our interactive, customisable and multi-screen version of the 2018 FA Cup Final.

Among the key elements and features of our system are the following (which will be illustrated in a short video coming soon; a rough sketch of the triggering flow follows the list):

  • Live triggering tool with trigger launcher driven by an Elgato Stream Deck
  • Real-time preview in OB van driven direct from SDI
  • TV emulator and Samsung tablet as client devices
  • Live sessions integrated within client onboarding workflow
  • Match GFX components developed by ChyronHego and implemented using a new DMApp component that integrates with their Prime universal graphics platform.
  • Interactive experience on tablet included:
    • ScoreClock menu with match overview, team line-ups and replays
    • Broadcast menu with camera feed viewer and customised picture-in-picture layout on TV
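
To make the triggering flow concrete, here is a minimal TypeScript sketch of what a live trigger message, sent from the triggering tool to a client device, might look like. The message shape, field names and the handleTrigger function are our own illustration under assumed names, not the actual 2-IMMERSE or ChyronHego Prime APIs.

```typescript
// Hypothetical shape of a live trigger message; the real 2-IMMERSE
// triggering tool and DMApp component interfaces may differ.
interface GraphicTrigger {
  triggerId: string;          // unique id for this trigger instance
  componentId: string;        // DMApp component to drive, e.g. "scoreClock"
  action: "show" | "hide" | "update";
  parameters: Record<string, string | number>; // data bound into the graphic
  timestamp: string;          // ISO 8601 wall-clock time of the trigger
}

// Example: an operator presses a Stream Deck key to update the score clock.
const goalTrigger: GraphicTrigger = {
  triggerId: "trig-0042",
  componentId: "scoreClock",
  action: "update",
  parameters: { homeScore: 1, awayScore: 0, matchClock: "22:14" },
  timestamp: new Date().toISOString(),
};

// A client device would route the trigger to the named DMApp component.
function handleTrigger(trigger: GraphicTrigger): void {
  console.log(`Applying ${trigger.action} to ${trigger.componentId}`, trigger.parameters);
}

handleTrigger(goalTrigger);
```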

Like most productions this was a massive collaboration involving people and organisations across and beyond 2-IMMERSE. We are happy to acknowledge the help of: AWS Elemental for providing real-time encoders; BT Technology Service and Operations for providing some key contacts and helping with system design; BT Sport for permission to access the stadium and for allowing our small van to take its place alongside the big trucks from BT Sport and the BBC; and BT Sport and BT Media and Broadcast for providing the guaranteed upstream bandwidth we needed.

The next step – or at least one of them – is to work with all the video and data captured at the Final to develop a stand-alone demo showing the capabilities of the system and illustrating the end-to-end process. We will present this on our stand in the Future Zone at IBC 2018 in Amsterdam in September. We look forward to seeing you there.



2-IMMERSE at NAB 2018

The trade fair, exhibition and conference NAB is one of the fixtures on the calendars of all the key influencers in the broadcasting industry. NAB, which took place 7-12 April in Las Vegas, and its sister conference IBC, held in Amsterdam 13-18 September, are great litmus tests for what is important to the industry. This year the idea of immersion featured strongly, and there was a great deal of interest in high-resolution VR technologies.

ChyronHego, a partner in 2-IMMERSE and a provider of sports graphics and data to the broadcasting industry, took our project’s interpretation of what immersion can mean to NAB. ChyronHego used the multi-screen object-based broadcasting concept demonstrated by the MotoGP service prototype (which can be seen in this video) to showcase the way we believe we can immerse viewers in an experience across multiple screens.

With private demonstrations in meeting rooms on the ChyronHego stand (the main stand is above, our demo set-up to the right), the work assumed a slightly mysterious hue. Because it remains a prototype, it would have been inappropriate to place our MotoGP demo front and centre on the ChyronHego stand; that space is reserved for today’s products.

In private presentations, however, the work was introduced to many key influencers from more than 15 different broadcasters based in Germany, Switzerland, Sweden, the USA, Norway and the UK. ChyronHego’s Director of Software Development Stefan Fjellsten, who led the interactions, was delighted with the response:

“It exceeded my expectations. Everyone was really positive, with some asking for exclusive rights to the technology in their territory, and all of them trying to work out how the capability could be applied to the content rights they have.”

The feedback and interest we garnered through this comparatively low-key display of our ideas at NAB is really encouraging. Alongside the feedback we are getting from users, it strongly suggests we are following a path that interests providers, rights holders and their viewing public. It also prompts us to be bold as we plan to provide a more comprehensive display of the project’s results at this year’s IBC in the early autumn.

Thanks to Stefan Fjellsten for the photographs.



Immersive statistics and advanced data capture

STEFAN FJELLSTEN, Chief Architect at 2-IMMERSE partner ChyronHego, writes about the data visualisation possibilities of the project’s football trial:

The third trial in the 2-IMMERSE project will target football broadcasts to create an immersive multi-device experience for the viewer. One of the goals is to provide the audience with more control over the graphics elements presented, giving the viewer the opportunity to focus on what is relevant for them. This control allows presentation of more data than ever since the viewers themselves determine what and how much to visualise.

Traditionally, broadcasters of live football coverage have been conservative in their use of on-screen graphics, opting instead for a minimalist approach where graphics are shown sparingly and the focus stays on the video coverage. But with a growing demographic of younger audiences who are accustomed to data-rich content and increasingly expect it to be personalised to their individual viewing preferences, the need to control graphics at a far more granular level is now clear.
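
As a rough illustration of what per-viewer, granular control over graphics could look like, the sketch below models a hypothetical set of graphics preferences; the field names are assumptions made for illustration only, not the settings model used in the trial.

```typescript
// Hypothetical per-viewer graphics preferences; illustrative only.
interface GraphicsPreferences {
  showScoreClock: boolean;
  showLineUps: boolean;
  showLiveStats: boolean;                       // data-rich overlays, e.g. pass counts
  statsDetail: "minimal" | "standard" | "full"; // how much data to visualise
}

// A data-hungry tablet viewer versus a minimalist TV viewer.
const tabletViewer: GraphicsPreferences = {
  showScoreClock: true,
  showLineUps: true,
  showLiveStats: true,
  statsDetail: "full",
};

const tvViewer: GraphicsPreferences = {
  showScoreClock: true,
  showLineUps: false,
  showLiveStats: false,
  statsDetail: "minimal",
};
```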

But what is there to show?

Of course, there is a wealth of statistical information that could be visualised if desired. But this becomes more interesting, and more technically challenging, with dynamic data generated in-game. What is available depends on the level of data capture associated with a particular football game. There are several options for capturing football data, which can be combined or used in isolation. The most common are listed below, with an illustrative data sketch after the list:

  • Manual scouting: one or more people, mostly on site, follow the game with digital tools for capturing the most basic events, such as goals, substitutions, free-kicks, corner kicks, offsides, and red and yellow cards. This allows close to real-time distribution of these events; the only delay is the human interaction with the digital tool.
  • Video-based scouting with manual input: similar to the above, but more events can be collected because the work is assisted by video recording. Events like passes can be captured, along with their type, such as a through-ball. This information is not distributed in real time, but is typically available less than a minute after the event.
  • Automatic capture using optical computer vision or wearable technology: these systems collect positional data on the players and the ball (optical only) in real time, much more quickly, objectively and consistently than a human can. The raw data can be used to create many derived metrics, the best known being distance travelled and speed, along with hundreds of other physical or spatio-temporal metrics and visualisations.
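
To make the differences between these capture methods concrete, here is an illustrative TypeScript model of a captured match event; the type names, fields and latency figures are assumptions based on the descriptions above, not the actual ChyronHego data schema.

```typescript
// Illustrative event model for the capture methods listed above.
type EventType =
  | "goal" | "substitution" | "free-kick" | "corner"
  | "offside" | "yellow-card" | "red-card" | "pass";

type CaptureMethod = "manual-scouting" | "video-scouting" | "automatic-tracking";

interface MatchEvent {
  type: EventType;
  matchClockSeconds: number;      // time within the match
  team: "home" | "away";
  playerShirtNumber?: number;
  qualifier?: string;             // e.g. "through-ball" for a pass
  capturedBy: CaptureMethod;
  captureLatencySeconds: number;  // seconds for manual scouting, under 60 for video
}

// A pass annotated by video-based scouting, available within a minute.
const throughBall: MatchEvent = {
  type: "pass",
  matchClockSeconds: 1334,
  team: "home",
  playerShirtNumber: 10,
  qualifier: "through-ball",
  capturedBy: "video-scouting",
  captureLatencySeconds: 45,
};
```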

As ChyronHego is a partner in the 2-IMMERSE project, the football trial will give the viewer access to data captured by the TRACAB™ player tracking system, which tracks all the players, the referees and the ball in real time, 25 times per second, throughout a football game. This provides raw positional data, as well as a wealth of statistics that can be derived and visualised.
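
As a simple illustration of how derived metrics such as distance travelled and speed can be computed from 25 Hz positional samples, here is a minimal TypeScript sketch; the sample format is our own assumption rather than the real TRACAB feed format.

```typescript
// Deriving distance travelled and speed from 25 Hz positional samples.
interface PositionSample {
  t: number;  // time in seconds
  x: number;  // pitch coordinates in metres
  y: number;
}

const SAMPLE_RATE_HZ = 25;

// Total distance travelled over a sequence of samples for one player, in metres.
function distanceTravelled(track: PositionSample[]): number {
  let distance = 0;
  for (let i = 1; i < track.length; i++) {
    const dx = track[i].x - track[i - 1].x;
    const dy = track[i].y - track[i - 1].y;
    distance += Math.hypot(dx, dy);
  }
  return distance;
}

// Instantaneous speed between consecutive samples, in metres per second.
function speeds(track: PositionSample[]): number[] {
  const result: number[] = [];
  for (let i = 1; i < track.length; i++) {
    const dx = track[i].x - track[i - 1].x;
    const dy = track[i].y - track[i - 1].y;
    result.push(Math.hypot(dx, dy) * SAMPLE_RATE_HZ);
  }
  return result;
}
```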

Examples of visualisations of this data on a web page include:

  • Statistics and analysis derived from player and ball positions
  • A pitch radar visualisation showing players, referees and the ball, together with Voronoi spatial analysis (sketched below)
  • Virtual graphics visualisation on top of live or recorded video
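
As an indication of the kind of spatial analysis behind the pitch radar, the sketch below computes Voronoi regions from player positions using the open-source d3-delaunay library; this is one possible approach chosen for illustration, not necessarily the stack used to produce the trial's visualisations.

```typescript
// Voronoi "space control" regions for a pitch radar, via d3-delaunay.
import { Delaunay } from "d3-delaunay";

// Player positions in pitch coordinates (metres), e.g. taken from tracking data.
const playerPositions: [number, number][] = [
  [10, 30], [25, 12], [40, 55], [52, 34], [70, 20], [88, 60],
];

const PITCH_LENGTH = 105; // metres
const PITCH_WIDTH = 68;

// Each Voronoi cell is the region of the pitch closest to one player,
// a common basis for space-control analysis on a pitch radar.
const delaunay = Delaunay.from(playerPositions);
const voronoi = delaunay.voronoi([0, 0, PITCH_LENGTH, PITCH_WIDTH]);

playerPositions.forEach((_, i) => {
  const cell = voronoi.cellPolygon(i); // array of [x, y] vertices (closed polygon)
  console.log(`Player ${i} controls a region with ${cell ? cell.length - 1 : 0} vertices`);
});
```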
