IBC launch for 2-IMMERSE open source software

2-IMMERSE is approaching the end of its three-year term.  With three and a half of our four prototype services developed, and with them all running on the same common platform, it is a good time to share the progress we have made with the wider media community. That is why we are spending a lot of time and effort developing our stand and our presence more generally at IBC.  IBC (the International Broadcasting Convention) is held each September at the RAI Amsterdam and is regarded as the world’s most influential media, entertainment and technology show. The exhibition dates this year are Friday 14 to Tuesday 18 September.

2-IMMERSE will have its own stand, in the Future Zone (Hall 8 F46), but you’ll also see 2-IMMERSE project outputs on the stands of ChyronHego (Hall 7, C21) and of IRT (Hall 10, F51), and in the conference programme (Sunday 16 at 16.45 in the Forum), where Ian Kegel is presenting a description of the technical platform used to deliver object-based productions.

Through a number of channels, including this blog post, we are inviting friends in the industry to come and see what we have achieved and to learn specifically about the software we will be launching at the end of the project.  2-IMMERSE has recognised a number of key challenges related to the delivery of object-based media productions across multiple screens and has sought to develop common, extensible solutions to these that will be made available as open source software.  In addition, we will describe a reference architecture that helps explain the technology components required to deliver our object-based experiences.

The software components we are developing all play valuable roles in the delivery of our multi-screen experiences.  They will:

  • Enable frame accurate multi-screen synchronisation
  • Manage layouts across multiple screens
  • Manage timelines across multiple screens
  • Enable unified sign-on for multi-screen experiences
  • Provide bandwidth orchestration capability to help deliver the “best” experience even when bandwidth is limited
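To give a flavour of the first capability above: frame-accurate synchronisation can be thought of as every device mapping a shared wall-clock time onto the media timeline and presenting the frame that mapping implies. The sketch below is a minimal Python illustration of that idea only, not the actual 2-IMMERSE component; the names `TimelineMapping` and `frame_for` are invented for this example.

```python
# Illustrative sketch only (not the 2-IMMERSE code): frame-accurate sync can
# be framed as each device mapping a shared wall-clock time to a media
# timeline. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class TimelineMapping:
    """Maps shared wall-clock time to a media timeline (in seconds)."""
    wallclock_at_start: float  # shared clock value when media position was 0
    speed: float = 1.0         # playback speed multiplier

    def media_time(self, wallclock_now: float) -> float:
        return (wallclock_now - self.wallclock_at_start) * self.speed

def frame_for(mapping: TimelineMapping, wallclock_now: float, fps: float = 25.0) -> int:
    """Frame number every device should be presenting at this instant."""
    return int(mapping.media_time(wallclock_now) * fps)

# Two devices holding the same mapping, with synchronised clocks, agree on
# which frame to show without ever talking to each other directly:
m = TimelineMapping(wallclock_at_start=1000.0)
assert frame_for(m, 1010.0) == 250   # 10 s into the media at 25 fps
```

In practice the shared clock and the timeline mapping have to be distributed and corrected for network delay, which is where the real synchronisation work lies.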

2-IMMERSE worked closely with rights holders throughout the project, setting – and meeting – the challenges of delivering one of its prototypes as a live end-to-end production.  We chose the high-profile 2018 FA Cup Final as the object of this trial, and at IBC we will be showing the production tools we developed to do this. We will also be available to discuss how the production chain needs to evolve to support object-based delivery.

2-IMMERSE set out to show that an object-based delivery approach can effectively deliver immersive multi-screen experiences that ‘over-achieve’ compared to their single-screen alternatives.  What ‘over-achieve’ means depends upon the context.  For the sports examples it means delivering more on the brand objective for the rights holders; in the case of BT Sport that means helping viewers get to the heart of sport – you can see what we have done towards this objective with our MotoGP at Home demo.

If you come to IBC you’ll be amongst the first to see our new FanZone experience for football – a prototype service that will highlight how the object-based approach can be used to create partisan representations of games in public spaces. This exceptional demo is also based on the 2018 FA Cup Final.

‘Over-achieving’ for theatre companies with education objectives means supporting their ambition to allow equal provision and equal access to the best theatre-based education resources through connected screens.  For our Theatre At Home demonstration it was about delivering a more social and engaging experience of watching filmed theatre in the home.  These experiences will all be on show at IBC and we are keen to share our enthusiasm for our achievements with the wider broadcasting and production industries.

We hope to see you in Amsterdam!

In the Future Zone, Hall 8, Stand F46.


Object-based media: the ‘next big thing’

“BT Sport: Object-based delivery ‘next big thing’ ” – so reads the title of a recent article on the web site Advanced Television. Doug Williams from BT argues that this illustrates that we are making progress on exploitation – although perhaps not in the way you might expect.

A key objective of an Innovation Action like 2-IMMERSE is to be able to show that the project has an impact, and most particularly that it leads to commercial impact. Simplistically, it would be nice to think that 2-IMMERSE designed and proved the viability of a widget, and that following the project a new company was started that made, marketed and sold said widget.  That would be an easy story to tell – but things don’t often work like that.

2-IMMERSE is a collaborative project involving, among others, BBC, BT, Cisco and IRT, and in this blog post I want to tell you a little about some of the progress we are making towards delivering our exploitation objectives.  I am not telling the whole story, just relating something of what is happening in BT.

As in the suggested ‘easy to tell’ story, 2-IMMERSE is designing and proving the viability of something, though that something is not a widget.  We are proving, through the work done on the MotoGP experience and on the football trials, for example, that using the object-based delivery approach we can create multi-screen TV-based experiences, and that those experiences can be appealing.  We have worked closely with rights holders BT Sport and Dorna Sports as our experiments progressed and we have shared our progress with those rights holders along the way.

A fundamental aspect of this project is that the delivery of the TV experiences is based on objects – where those objects can be video streams, graphics, audio streams or interactive elements that are assembled, according to rules, across the devices that are available to present the experience.  There might be an element of “so what?” to that description.  It does not highlight any benefits and gives no obvious reason why such an approach is useful to a broadcaster.  But there is a simple benefit that addresses something distinctly missing from current TV services: the ability to personalise or customise the experience to better suit the context in which the TV experience is being consumed.
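By way of illustration only – this is not the project’s actual data model, and every name below is invented – the rule-based assembly of objects across available devices can be sketched like this:

```python
# Hypothetical sketch of the core idea: media objects carry placement rules,
# and a layout step assigns each object to a suitable available device.
objects = [
    {"id": "main-video",  "type": "video",   "prefers": "tv"},
    {"id": "commentary",  "type": "audio",   "prefers": "tv"},
    {"id": "stats-panel", "type": "graphic", "prefers": "tablet"},
]

def assign(objects, devices):
    """Place each object on its preferred device if present, else the first device."""
    return {o["id"]: (o["prefers"] if o["prefers"] in devices else devices[0])
            for o in objects}

# With only a TV, everything lands on the TV; add a tablet and the stats
# graphic moves there, with no change to the content itself.
assert assign(objects, ["tv"])["stats-panel"] == "tv"
assert assign(objects, ["tv", "tablet"])["stats-panel"] == "tablet"
```

The point of the sketch is that the same set of objects yields a different presentation depending on the devices present – which is exactly the property the paragraph above describes.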

TV is great. I am writing this the morning after nearly half the UK population were together watching England play, and disappointingly lose, in the semi-final of the World Cup. Television is the only medium that can achieve such moments where such a large fraction of the population are all sharing the same experience.  And that’s magic.

But a shared broadcast is also one-size-fits-all. Consider:

• What if I was hard of hearing and wanted to quieten that crowd volume and increase the commentary volume so I could hear the commentary more clearly?  I couldn’t do that last night, but I could with the object-based delivery approach.

• What if I was watching on a small TV and found the graphics a bit hard to read and wanted to increase the size of the graphics so I could read them more easily?  I couldn’t do that last night, but I could do that with the object-based delivery approach.

• What if I was profoundly deaf and wanted to pull up a signed commentary on the TV screen?  I couldn’t do that last night, but with the object-based delivery approach that is possible.

• What if I wanted to actively track the relative possession enjoyed by each of the teams throughout the game?  I couldn’t do that last night, but I could do so with the object-based delivery approach.

• What if I was interested in seeing, at all times, the expressions on each of the different managers’ faces?  I couldn’t do that last night, but with the object-based delivery approach I could.

The object-based delivery approach provides the ability to personalise the TV experience to better address the needs of a large number of niche populations, whether they are characterised by a need for more data, clearer graphics or a different audio mix.  We’ve been discussing these options with our rights-holder partners for some time now, and together we are beginning to focus on these benefits and to work out how we could take them from the lab to the living room.

BT Sport have a fantastic innovation record.  BT Sport were the first to deliver live UHD TV in Europe.  BT Sport were among the first to provide 360-degree video coverage of live sporting events.  BT Sport were the first to use Dolby Atmos to deliver an even more compelling soundscape to accompany the pictures.  And now? Well now it’s good to see reports of Jamie Hindhaugh, COO of BT Sport, and the one who has driven innovation in the company, ruminating on what innovations are next in the pipeline.

According to an article penned by Colin Mann of the Advanced Television web site, reporting on the Love Broadcasting Summit:

[Jamie Hindhaugh] didn’t see VR and 3D as necessarily the future for sports viewing, as they lacked the social aspect, with BT preferring to offer such options as different camera angles, instead suggesting that object-based delivery would grow with the spread of IP. “We will start giving you objects, packages, templates, so you can create your own version of what you are watching.”

Hindhaugh said that hardcore fans of MotoGP could benefit from additional graphics and stats available to the broadcaster. BT would curate the data, and allow the viewer to choose their preferences from the programme feed. “That’s where we’re going. That’s where the next big initiative is coming through, combined with 5G mobile.”

We have always seen our greatest potential for innovation impact in large companies as being the ability to affect the way they choose to develop their services.  We have set out to work with the parts of our business that run key services and to effect innovative change.  We are pleased to see the ideas we are developing and demonstrating are appealing not only to users but also to decision makers and that innovations developed in this project could be the ‘next big thing’.


Success for 2-IMMERSE at TVX

Jie Li honoured with Best Paper Award

On 27 June Jie Li of the Distributed and Interactive Systems (DIS) group at 2-IMMERSE partner CWI was awarded the Best Paper Award at ACM TVX 2018, the ACM International Conference on Interactive Experiences for Television and Online Video. Li won the award for the paper ‘A New Production Platform for Authoring Object-based Multiscreen TV Viewing Experiences’. Outlining research done within 2-IMMERSE, the paper has as co-authors Pablo Cesar, Maxine Glancy (BBC), Jack Jansen and Thomas Röggla.

Multiscreen TV viewing refers to a spectrum of media productions that can be watched on TV screens and companion screens such as smartphones and tablets. TV production companies are now promoting an interactive and engaging way of viewing TV by offering tailored applications for TV programs.

However, viewers are reluctant to install dozens of applications and switch between them. This is one of the obstacles that hinder companion screen applications from reaching mass audiences. To solve this, TV production companies need a standard process for producing multiscreen content, allowing viewers to follow all kinds of programs in one single application. The paper proposes a new object-based production platform for authoring and broadcasting programs for multiple screens.

Besides the awarded paper, the DIS group also presented a demo at the conference, showcasing object-based live broadcasting of a sports event to both a large TV screen and companion screens. The demo included both the producer and the user side. The producer side takes place inside an outside broadcast truck, where the director/producer can insert on-demand snippets and interactive components into the live broadcast. The user side is a home, where a viewer with a TV screen and several companion devices enjoys a personalized and interactive experience.

CWI’s Distributed and Interactive Systems (DIS) research group focuses on facilitating and improving the way people access media and communicate with others and with the environment. They address key problems for society and science, resulting from the dense connectivity of content, people, and devices. The group uses recognized scientific methods, following a full-stack, experimental, and human-centered approach.

More at TVX from BBC and IRT

A work-in-progress paper submitted by BBC and IRT explores the potential of augmented reality technology as a novel way to allow users to view a sign language interpreter through an optical head-mounted display while watching a TV programme. We address the potential of augmented reality for personalisation of TV access services. Based on guidelines of regulatory authorities and research on traditional sign language services on TV, as well as feedback from experts, we justify two design proposals. We describe how we produced the content for the AR prototype applications and what we have learned during the process. Finally, we develop questions for our upcoming user studies.

Also at TVX, BBC and IRT demonstrated results of our work stream which targets deploying the 2-IMMERSE apps on HbbTV 2.0 devices. In cooperation with Samsung, we showed the 2-IMMERSE MotoGP experience from the 2017 Silverstone GP on a recent consumer device running HbbTV 2.0-ready firmware.


Back of the net!

The 2-IMMERSE Football production team report on the success of our major trial at Wembley.

2-IMMERSE passed another milestone at the recent 2018 FA Cup Final at Wembley Stadium between Chelsea and Manchester United as we proved our prototype end-to-end live production system for object-based broadcasting to multiple screens.

For those who may not know, the FA Cup is the world’s oldest football challenge cup, having been started back in 1871. The high-profile final is the curtain closer for the UK domestic football season and has spawned and fuelled many of the intense inter-club rivalries that give the UK league its particular character. For decades the match has also been a key television moment and a showcase for experimentation and innovation in the presentation of football on television. So 2-IMMERSE is very proud this year to have been a small part of a great tradition. The FA Cup Final is a national treasure, but it also draws a global audience estimated to be in the hundreds of millions.

Our team for the FA Cup was drawn from BT, CWI, Cisco, and BBC R&D – and they all worked tirelessly over the preceding weeks and months to design and develop the end-to-end system. Early ‘sighting visits’ to Wembley (see our previous blog post) helped us identify and address the key issues that can affect the delivery of the service and helped the test on the day of the Final to go as smoothly as it did.

On the day, we arrived with our modest broadcast truck and once the feeds from all the individual cameras were in place we were set to go. After all the preparation we were able to conduct three key activities at Wembley:

  • Live orchestration of match graphics, mirroring the broadcast graphics production but using our own HTML5-focused production and delivery chain
  • Live monitoring and demonstration of prototype Football DMApp (see blog post) on site at Wembley
  • ChyronHego data capture for Virtual Placement: camera parametric data and Tracab player tracking data.

In addition to the on-site achievements, and to further illustrate the end-to-end nature of the trial, we engaged 10 remote viewing participants in the UK, Sweden and the Netherlands to experience watching our interactive, customisable and multi-screen version of the 2018 FA Cup Final.

Among the key system elements and features of our system are the following (which will be illustrated in a short video coming soon):

  • Live triggering tool with trigger launcher driven by Elgato Streamdeck
  • Real-time preview in OB van driven direct from SDI
  • TV emulator and Samsung tablet as client devices
  • Live sessions integrated within client onboarding workflow
  • Match GFX components developed by ChyronHego and implemented using new DMApp component which integrates with their Prime universal graphics platform.
  • Interactive experience on tablet included:
    • ScoreClock menu with match overview, team line-ups and replays
    • Broadcast menu with camera feed viewer and customised picture-in-picture layout on TV

Like most productions this was a massive collaboration involving people and organisations across and beyond 2-IMMERSE. We are happy to acknowledge the help of: AWS Elemental in providing real-time encoders; BT Technology, Service and Operations for providing some key contacts and helping with system design; BT Sport for permission to access the stadium and allowing our small van to take its place alongside the big trucks from BT Sport and the BBC; and BT Sport and BT Media and Broadcast for providing the guaranteed upstream bandwidth we needed.

The next step – or at least one of them – is to work with all the video and data captured at the Final to develop a stand-alone demo showing the capabilities of the system and illustrating the end-to-end process. We will present this on our stand in the Future Zone at IBC 2018 in Amsterdam in September. We look forward to seeing you there.


Wins for Chelsea and 2-IMMERSE at Wembley

A report from our man on the terraces, DOUG WILLIAMS from BT R&D:

2-IMMERSE is developing a live service prototype based on football that we aim to test at the FA Cup final on 19 May this year.  Several 2-IMMERSE project participants gave up this last weekend to take part in key technical tests to prepare for that event.  The test took place at Wembley Stadium during the FA Cup semi-final between Chelsea and Southampton.

Preparing for the Wembley trial involves a number of fundamental and non-trivial milestones, including knowing what we have to do to get access to Wembley with our own outside broadcast truck, what we have to do to accept all the live feeds from the cameras in the stadium, and what we have to do to make the uplink work.

For most of the 73,000 in attendance the key question was, “Who would book their place in the final against Man United?” Our team members were much less concerned about the result on the field. They were worried by the result on the screens.

The truck we used, thanks to the invaluable assistance of BT Sport, allowed us to test in a live environment a range of tools and systems that have been developed by the project. We learned an enormous amount by being in the middle of a major outside broadcast and through meeting a number of suppliers and colleagues who were centrally involved in delivering it.

We were thrilled that 2-IMMERSE was recognised and supported – and importantly that we are now better known to some of the outside broadcast teams whose support we will rely on for our next two live tests. Inevitably, and as you would expect from such a test, we spotted a range of bugs and problems, and we are now busy examining the logs to see if we can identify their causes – and then mend them.

While our team went home with a bug list and deeper knowledge of how the outside broadcast process interacts with our system, Chelsea went home victorious.  So the FA Cup Final that we will be using in our live trials on Saturday 19 May will be between Manchester United and Chelsea.  Prior to that, as the footballers get measured for their match-day suits, we have some bug fixing to attend to and, thankfully, another live test to complete before we conduct a pioneering object-based broadcast test of the 137th FA Cup Final.

With thanks to Martin Trimby for the photographs.


2-IMMERSE at NAB 2018

The trade fair, exhibition and conference NAB is one of the fixtures on the calendars of all the key influencers in the broadcasting industry.  NAB, which took place 7-12 April in Las Vegas, and its sister conference IBC, in Amsterdam 13-18 September, are great litmus tests for what is important for the industry. This year the idea of immersion was strongly featured and there was a great deal of interest in high resolution VR technologies.

ChyronHego, partner in 2-IMMERSE and provider of sports graphics and data to the broadcasting industry, took to NAB our project’s interpretation of what immersion can mean. ChyronHego used the multi-screen object-based broadcasting concept demonstrated by the MotoGP service prototype (which can be seen in this video) to showcase the way we believe we can immerse viewers in an experience across multiple screens.

With private demonstrations in meeting rooms on the ChyronHego stand (the main stand is above, our demo set-up to the right), the work assumed a slightly mysterious hue. Because it remains a prototype, it was inappropriate to place our MotoGP demo front-and-centre of the ChyronHego stand; such space is reserved for today’s products.

In private presentations, however, the work was introduced to many key influencers from more than 15 different broadcasters based in Germany, Switzerland, Sweden, USA, Norway and the UK. ChyronHego’s Director of Software Development Stefan Fjellsten, who led the interactions, was delighted with the response:

It exceeded my expectations. Everyone was really positive with some asking for exclusive rights to the technology in their territory, and all of them trying to work out how the capability could be applied for the content rights they have.

The feedback and interest we garnered through this comparatively low-key display of our ideas at NAB is really encouraging.  Alongside the feedback we are getting from users, it suggests strongly that we are following a path that interests providers, rights holders and their viewing public. It also prompts us to be bold as we plan to provide a more comprehensive display of the project’s results at this year’s IBC in the early autumn.

Thanks to Stefan Fjellsten for the photographs.


A Day in Brussels: showcasing our production tools

PABLO CESAR, JIE LI and THOMAS RÖGGLA from project partner CWI Amsterdam report on a successful showcase in Brussels:

Vlaamse Radio- en Televisieomroeporganisatie (VRT), the national public-service broadcaster for the Flemish Region and Community of Belgium, organizes a networking event on media innovation every year: Media Fastforward. This year it took place in the beautiful location of Bozar in Brussels on 5 December. VRT Innovatie invited a number of European research projects to participate in the Future Zone.

Media Fastforward, focusing on media innovation, fits well with the networking and dissemination needs of 2-IMMERSE, providing a unique opportunity to meet with others and to showcase the results of the project. So the CWI team packed their bags, taking along a demo of our object-based multi-screen broadcasting, with an emphasis on the live-triggering tool intended for sports events such as football or MotoGP races.

Our demo showed a working prototype of our tool, interfacing with the 2-IMMERSE platform, that could trigger broadcast events in real time, showing and hiding collections of objects. The tool aims to reduce the workload during live broadcasting by providing templates for certain events (e.g., crashes or overtakes in MotoGP). A template resembles a ‘data package’, including graphics, placeholders for camera feeds, and scripts describing sequences of contents within the event.
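To give a flavour of what such a template might look like – this is a hypothetical sketch, not the tool’s real format, and all field names and values are invented:

```python
import json

# Hypothetical shape of an event template ("data package"): graphics,
# placeholders for camera feeds, and a scripted sequence of steps.
crash_template = {
    "event": "crash",
    "graphics": ["crash-banner", "rider-info"],
    "feeds": ["{trackside_cam}", "{onboard_cam}"],  # filled in at trigger time
    "sequence": [
        {"show": "crash-banner", "at": 0.0},
        {"show": "{trackside_cam}", "at": 0.5},
        {"show": "rider-info", "at": 2.0},
    ],
}

def instantiate(template, feeds):
    """Fill feed placeholders so the director can fire the event with one press."""
    text = json.dumps(template)
    for name, value in feeds.items():
        text = text.replace("{" + name + "}", value)
    return json.loads(text)

event = instantiate(crash_template, {"trackside_cam": "cam3", "onboard_cam": "cam7"})
assert event["sequence"][1]["show"] == "cam3"
```

The template carries everything the event needs except the live specifics, which is what lets a single button press stand in for several manual graphics and vision-mixing operations.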

The event was a huge success with around one thousand registered visitors, 25 (inter)national speakers, more than 40 startups, 12 research projects and 3 impressive tech companies. It allowed us to interact with peer projects like FLAME, MOS2S and ImmersiaTV, and to show and discuss our production tools with entrepreneurs, media professionals, and policy makers.

Yes, we also had some extra time to relax with bike games and alcohol-free mojitos; we will certainly be back next year!


Making object-based CAKE at BBC R&D

Object-based media production, which is a central principle underpinning 2-IMMERSE, is being intensely developed by our colleagues at BBC R&D. Freely available online are resources that introduce the ideas behind object-based media, and which outline one especially neat demonstration of it in action, the Cook-Along Kitchen Experience, or CAKE.

Back in 2013 Tony Churnside wrote a BBC R&D blog post that outlines the approach and rationale for object-based broadcasting. This was updated just a few months ago, and remains an essential introduction.

Complementing this is a slightly more technical post by Robert Wadge, written in 2013 and updated two years later.

Just about a year ago, Matthew Brooks and Tristan Ferne looked back over 2016 to review recent work with object-based media from BBC R&D.

This includes a nod towards the involvement of BBC R&D in 2-IMMERSE:

We got busy with 2Immerse, a European project creating a multi-screen, multi-home, interactive immersive home theatre experience. As well as finalising the architecture, we built scrolling scripts that synchronise with the performance, video chat that brings homes together during intervals, and a layout engine that can present content across multiple screens.

Earlier this year, in May, Ian Forrester outlined BBC R&D’s plans to develop a community of practice for object-based media production.

The team is putting together an impressive sequence of demonstrations and workshops around the country (and beyond), and along with outlining this the post features a rallying cry for the significance of object-based media:

We believe that the object-based approach is the key to content creation of the future, one which uses the attributes of the internet to let us all make more personal, interactive, responsive content and by learning together we can turn it into something which powers media beyond the scope of the BBC.

Perhaps BBC R&D’s most developed demonstration of the approach is the CAKE pilot, which is described in detail here (and illustrated above). The test period for the prototype of this cooking experience has recently come to an end but the ideas behind it are well worth exploring:

Following a recipe with CAKE is different to other cooking shows because it’s not a linear TV programme. It customises recipes based on your familiarity with ingredients and methods, your tastes or dietary preferences, and how many people you’re inviting round for dinner. The experience reacts ‘in the moment’ to your progress, allowing you to create new dishes at your own pace. Novices can level-up and experts can cut to the chase, supported by an evolving dialogue between audience and presenter.
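As a loose illustration of how that kind of object-based adaptation might work – invented data and logic, not the CAKE implementation – a recipe can be treated as a bundle of objects: scalable ingredient quantities plus steps tagged by skill level.

```python
# Illustrative only: an object-based recipe adapting to guest count and skill.
# The data and the adaptation rules below are hypothetical.
recipe = {
    "serves": 2,
    "ingredients": {"flour_g": 200, "eggs": 2},
    "steps": [
        {"text": "Sift the flour", "level": "novice"},
        {"text": "Separate the eggs", "level": "novice"},
        {"text": "Fold in the whites", "level": "all"},
    ],
}

def adapt(recipe, guests, expert=False):
    """Scale quantities to the guest count; let experts skip novice steps."""
    scale = guests / recipe["serves"]
    ingredients = {k: v * scale for k, v in recipe["ingredients"].items()}
    steps = [s for s in recipe["steps"] if not (expert and s["level"] == "novice")]
    return ingredients, steps

# Four guests and an expert cook: quantities double, novice steps drop out.
ingredients, steps = adapt(recipe, guests=4, expert=True)
assert ingredients["flour_g"] == 400
assert len(steps) == 1
```

The same objects thus yield different renderings of the recipe per household, which is the property the CAKE description above is pointing at.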

Also available online is a paper prepared for the recent IBC symposium, ‘Moving Object-Based Media Production from One-Off Examples to Scalable Workflows’ [.pdf], authored by Jasmine Cox, Matthew Brooks, Ian Forrester and Mike Armstrong from BBC R&D. This is a valuable account of the team’s experiences and plans for the next stage of development, as their introduction promises:

This paper follows the creation of our most recent example of object-based media, the Cook-Along Kitchen Experience (CAKE) which was conceived and produced as an object-based experience from the outset. This paper looks at how we are applying the lessons learned from our previous work to the development of OBM data models and software tools. The paper also discusses how we intend to involve content creators from both inside and outside the BBC and build a community of practice around the development of new forms of media.