Immersive statistics and advanced data capture

STEFAN FJELLSTEN, Chief Architect at 2-IMMERSE partner ChyronHego, writes about the data visualisation possibilities of the project’s football trial:

The third trial in the 2-IMMERSE project will target football broadcasts to create an immersive multi-device experience for the viewer. One of the goals is to give the audience more control over the graphics elements presented, allowing viewers to focus on what is relevant to them. This control makes it possible to present more data than ever before, since the viewers themselves determine what, and how much, to visualise.

Traditionally, broadcasters of live football coverage have been conservative in their use of on-screen graphics, opting instead for a minimalist approach in which graphics are shown sparingly and the focus remains on the video coverage. But with younger audiences growing accustomed to data-rich content, and increasingly expecting content personalised to their individual viewing preferences, there is now a clear need to control graphics at a far more granular level.

But what is there to show?

Of course, there is a wealth of statistical information to visualise if desired. But this becomes more interesting, and more technically challenging, with data generated dynamically during the game. What is available depends on the level of data capture associated with a particular football game. There are several options for capturing football data, which can be combined or used in isolation. The most common are listed below:

  • Manual scouting: one or more people, usually on site, follow the game with digital tools for capturing the most basic events, such as goals, substitutions, free kicks, corner kicks, offsides, and red and yellow cards. This allows close to real-time distribution of these events; the only delay is the human interaction with the digital tool.
  • Video-based scouting with manual input: similar to the above, but more events can be collected since the work is assisted by video recording. Events such as passes can be captured, along with their type (for example, a through-ball). This information is not distributed in real time, but is typically available less than a minute after the event.
  • Automatic capture using optical computer vision or wearable technology: these systems collect the positions of the players and the ball (the ball with optical systems only) in real time, far more quickly, objectively and consistently than a human can. The raw data can be used to derive many new metrics; the best known and most used are distance travelled and speed, but hundreds of other physical or spatio-temporal metrics and visualisations are possible.

As ChyronHego is a partner in the 2-IMMERSE project, the football trial will give the viewer access to data captured by the player tracking system TRACAB™, which tracks all the players, the referees and the ball 25 times per second in real time during a football game. This provides the raw positional data, as well as many statistics that can be derived and visualised.
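As an illustrative sketch (not the TRACAB API), the two best-known derived metrics, distance travelled and speed, can be computed directly from a sequence of 25 Hz positional samples. The function name and the metres-based coordinates below are assumptions for the example:

```python
import math

FPS = 25  # sampling rate: 25 positional frames per second

def distance_and_peak_speed(track):
    """Given a list of (x, y) player positions in metres sampled at
    FPS Hz, return (total distance in metres, peak speed in m/s)."""
    total = 0.0
    peak = 0.0
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        step = math.hypot(x1 - x0, y1 - y0)  # metres moved in one frame
        total += step
        peak = max(peak, step * FPS)         # metres/frame -> metres/second
    return total, peak

# A player sprinting at 8 m/s along the x axis for one second:
sprint = [(i * 8.0 / FPS, 0.0) for i in range(FPS + 1)]
total, peak = distance_and_peak_speed(sprint)
# total ≈ 8.0 m, peak ≈ 8.0 m/s
```

In practice the raw positions would be smoothed before differencing, since frame-to-frame noise inflates instantaneous speed, but the principle is the same.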

Examples of visualising this data on a web page:

  • Statistics and analysis derived from position.
  • Pitch radar visualisation with players, referees and ball, as well as Voronoi spatial analysis.
  • Virtual graphics visualisation on top of live or recorded video.