Context:
The 2-IMMERSE project started from the premise that current television services are being challenged by new forms of OTT content and digital media. The project sought to develop new entertainment experiences that were more engaging than conventional linear broadcasts and that commanded more of the users’ attention (thus deepening the sense of immersion), and it pursued four avenues to deliver this. The first was using multiple screens; the second was enabling personalisation; the third was encouraging interaction with the content; and the fourth was co-opting facets of social media services (sharing, contributing and communicating).
In carrying out this project the participants learned many valuable lessons, and this page captures those experiences so that you can benefit from the 2-IMMERSE experience. You can also read the introduction to the project achievements here: 2immerse-project-guide.
Approach:
2-IMMERSE’s key focus was to develop a robust and scalable multiscreen platform architecture for supporting object based broadcasting, which could be showcased and tested through a number of sports and theatre use cases. The emphasis was on ensuring the platform was reusable and useful across a number of use cases rather than on perfecting the service being trialled in any specific use case.
As it was, the 2-IMMERSE project successfully built and tested four multi-screen service prototypes, two based on theatre and two based on sport, using the same platform architecture and many of the same components in each case, despite the very different nature of the services.
Theatre at Home Trial
The 2IMMERSE project, working with content from the Royal Shakespeare Company, developed a multiscreen Theatre At Home service prototype that used coordinated content: a main broadcast feed accompanied by additional text-based content (the script, scene synopses and actor biographies), plus real-time text chat and real-time video chat enabling friends to watch together in a virtual box.
This prototype was evaluated in 12 homes (23 participants) in early 2017. When polled, the users were positive about some of the design concepts that steered the development of the experience, including the use of metaphors from attending theatre in real life (the theatre box, the interval, the bell, etc.) and the primacy, in terms of screen real estate, of the main broadcast performance.
MotoGP service trial
The 2IMMERSE project, in collaboration with Dorna Sports, generated a multiscreen as-live proof of concept across a TV screen, a tablet and a phone using MotoGP content. This used an “as-live” data stream (timing data), the as-live broadcast feed, VoD assets and replays to generate a multiscreen experience that allowed interaction and supported responsive design features.
The experience was evaluated by 85 users in early 2018. 70% of the viewers said they felt the experience was better than the existing single screen experience. The remaining 30% expressed a preference for the non-interactive version.
In the autumn of 2018 the experience won an award for best multiscreen experience from the HbbTV Awards panel.
Football at Home & Football in the Fanzone
Working with BT Sport, and following months of preparation and exploratory tests, the 2IMMERSE project completed a live end-to-end technical test that was transmitted from Wembley Stadium to 10 households of project participants who enjoyed and interacted with a multiscreen version of the 2018 FA Cup Final.
This trial illustrated how the production chain for live football would need to be adapted to deliver live interactive multi-screen experiences and showed how differential delay paths for data driven graphics and encoded video can be overcome.
Assets collected from this pioneering live trial were subsequently used to generate two further “as-live” demos, one for an at-home experience and one for a FanZone presentation where the multiscreen approach could be shared in a public space.
Theatre in Schools
The 2IMMERSE project worked with the education department of Donmar Warehouse to create and deliver new interactive lesson formats for schools based on a film version of William Shakespeare’s Julius Caesar.
Two lessons were developed using a common Watch / Make / Share framework and delivered to three schools in London and Suffolk. The lessons involved many screens controlled via a teacher’s screen while allowing the students to be proactively involved.
The opinion-poll evaluation of the prototype suggested the lessons were more enjoyable and fun in the eyes of the students, while the teachers felt that learning objectives were achieved better with the interactivity provided by the prototype service than would have been the case with a single-screen experience.
Service prototype design
2-IMMERSE’s key focus was to develop a robust and scalable multiscreen platform architecture which could be showcased and tested through a number of sports and theatre use cases. As such, the multiscreen UXs developed across the use cases had to serve as a demonstration of key platform capabilities whilst being easy to use for one-time operation in user trials.
The development of the initial UX designs came from a team of experienced UX/UI design experts working on the project. 2-IMMERSE followed established user-centred design processes to develop first-stage prototypes, which were then tested in field trials to inform the technical design priorities for the platform. The aim of a user-centred design process is to ensure that the design of a product or service remains focused on who will use it, in what context, and with what aim. The UX/UI designers worked with a number of key user groups in mind, including content producers, broadcasters, orchestrators (such as a teacher or landlord) and audience members.
The UX/UI design insights gained from adopting this approach were used to iteratively develop and refine the initial proofs of concept to a state where they were ready to be trialled ‘in the wild’. It was agreed that once the design had been developed to the point where it could be trialled, we would not further iterate on the UX/UI through formal lab and contextual user tests and evaluation procedures. Of course, for a product that was to be launched it would be standard practice to continue iterating on the UX/UI through such user tests and evaluations.
Service prototype evaluation
The use cases in 2-IMMERSE were all distinct, and were evaluated in different ways to gain insights from the perspectives of platform capabilities, production tools and the audience experience. This was immensely useful for the project team as 2-IMMERSE primarily sought to build, test and prove a capable and relevant platform that supports greater immersion through the use of multiple screens, personalisation, and interaction with content and other people.
Rather than undertake iterative trials to perfect any single use case, 2-IMMERSE instead sought greater value by experimenting across genres, device configurations and social spaces to validate the platform across a variety of use cases. This approach was considered prudent given the key ‘platform focused’ goals of the project.
2-IMMERSE primarily used qualitative evaluation methods including small-scale trials, interviews with stakeholders and end-users, and contextual observation in the field, to test and qualify key aspects of the design and implementation of the platform. This evaluation provided sufficient insights and evidence to the project team and stakeholders alike to suggest that object based broadcasting (OBB) has many potential benefits which warrant further exploration beyond the conclusion of the 2-IMMERSE project.
Qualitative techniques were primarily employed in the evaluation as these suited the limited scale of the trials undertaken in 2-IMMERSE. Quantitative evaluation would have required running trials at significantly larger scale for a longer period of time, with more instrumentation and supporting post-trial analysis. 2-IMMERSE believes that it would not have been possible to undertake significant quantitative evaluation given the project’s scope, time and resource constraints.
Adopt a microservice approach
The microservice infrastructure that we used for the delivery of object-based multi-screen presentations served us well. Because microservices can be complex to set up, you are invited to re-use the open source platform we used.
During the development of our platform we used the industry-standard approach based on microservices, believing this would bring the scalability, re-usability and extensibility that the project sought. Our experience within the project has borne this out. We have been able to extend and reuse components and have achieved significant re-use between the different use cases, and we are pleased that we adopted the microservices approach. However, in adapting our service stack to various deployment scenarios, we realised that it could be tricky to find the right order and settings for each scenario. Building Distributed Media Applications (DMApps) can also be challenging in such a context, as multiple services work together, each with its own API and set of capabilities. To address this we developed documentation to help.
To help you adopt microservices, you are invited to re-use the 2IMMERSE platform:
We recommend that you should adopt a microservice approach if you want to achieve scalability, extensibility and reusability of components. You can use the extensive platform that we developed, based on open source software and including some components developed by 2-IMMERSE. Please access the “Get Started” GitHub repository, which includes everything you need to deploy a standalone 2-IMMERSE platform within a few minutes. It also contains a 2-IMMERSE overview, documentation and examples of DMApps to help you get started.
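To give a flavour of what building against several cooperating services looks like, the sketch below shows a client coordinating two hypothetical services, a layout service and a timeline service, each with its own REST API. The endpoints, payload shapes and service names are assumptions made purely for illustration and are not the actual 2-IMMERSE APIs; consult the documentation in the “Get Started” repository for the real interfaces.

```typescript
// Minimal sketch: a client-side helper that talks to two hypothetical
// microservices, each with its own API. Endpoints and payload shapes are
// illustrative assumptions, not the real 2-IMMERSE interfaces.

interface LayoutRequest {
  contextId: string;        // identifies the multi-screen session
  deviceId: string;         // identifies the device asking for a layout
  capabilities: string[];   // e.g. ["video", "audio", "touch"]
}

async function requestLayout(baseUrl: string, req: LayoutRequest): Promise<unknown> {
  // The layout service decides which components go on which screen.
  const res = await fetch(`${baseUrl}/layout/context/${req.contextId}/devices`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Layout service error: ${res.status}`);
  return res.json();
}

async function fetchTimelineEvents(baseUrl: string, contextId: string): Promise<unknown> {
  // The timeline service tells clients when components should start and stop.
  const res = await fetch(`${baseUrl}/timeline/context/${contextId}/events`);
  if (!res.ok) throw new Error(`Timeline service error: ${res.status}`);
  return res.json();
}

// Usage: coordinate the two services for one device joining an experience.
async function joinExperience(layoutUrl: string, timelineUrl: string): Promise<void> {
  const layout = await requestLayout(layoutUrl, {
    contextId: "demo-context",
    deviceId: "tv-1",
    capabilities: ["video", "audio"],
  });
  const events = await fetchTimelineEvents(timelineUrl, "demo-context");
  console.log({ layout, events });
}
```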
Ensure the visual quality and aesthetics of the presentation
Value the quality of the presentation you develop if you want to influence the TV industry.
We found it important to demonstrate how OBB can support the visual quality needed to deliver the look and feel of well-established content brands. As the TV production and broadcasting industry places significant value on presentation, OBB needs to at the very least replicate the aesthetic quality of existing productions. This includes on-screen graphics, animations, transitions, etc. If it is not possible to offer equivalence, then consider hybrid options that mix together OBB and traditionally rendered effects.
We recommend that you should work hard to ensure that the visual quality and aesthetics of an OBB presentation are not inferior to the standard currently delivered via traditional production methods. Although OBB offers many new and exciting opportunities to producers and broadcasters, these should not come at the expense of the visual look and feel or the editorial narratives of established practices.
Be sensitive to the needs of the brand
The project sought to better understand the tension between offering users control over layout and retaining that control for producers. In the first service prototype we advocated a feature, dubbed a component switcher, which would allow the user to decide which feature should appear in which portion of which screen. We never fully implemented the component switcher, but it is instructive to contrast the appetite for this design idea in the first service prototype with the approaches that emerged for the MotoGP at Home, Football and Theatre in Schools service prototypes. In those later examples, which were more design-led, the approach adopted was to use templates: users could select from a range of pre-determined layouts but could not build their own layout from scratch.
Our reflection is that, for most branded and treasured content, rights holders’ sensitivity about how the content will appear means that templated solutions will be preferred. Templated solutions allow users to select from a range of options that have all been crafted by professionals sensitive to the needs of the brand, and our experience is that this is more likely to be accepted by rights holders. As we worked with our content partners it became clear that the way the content looks now, in established frames on TV or mobile, is not an accident. Media professionals have been working to optimise the ways in which they present their content and tell stories with it for years, sometimes decades; this history should be embraced and adopted.
We recommend that you should work with the existing mores and tropes used in traditional production. Working with the established design approaches will make your interactive design ideas easier to sell to rights holders, will require a less onerous change in the overall production effort, and will be more easily assimilated by your end users already familiar with the look and feel of the existing designs.
Do not disrupt the story
The story really is the thing.
Theatre, MotoGP and Football all showcase great drama and tell good stories. In the design of multi-screen experiences it is possible to lose sight of the story in a forest of options, interactions and added extras. Be careful: the story really is the thing. Do not disrupt the story in fundamental ways. If the heart of the story is the commentary, be very careful as you consider options that may interrupt it. When offering optional replays, consider how you will allow the main story to remain present. In MotoGP we made the live feed a picture-in-picture while the replay was playing. In the football prototypes we never allowed the replay to usurp the main broadcast feed on the shared TV; it could sit alongside it or appear on a companion device, but the main live feed was never deposed and the live commentary was always dominant. In Theatre we did not have replays to contend with, but we consciously forced the presentation of the live play to be the biggest item on the shared screen and, like a theatre production, each extract could not be interrupted. In this way the film retained its primacy over other content forms and helped to keep the story the main focus of the experience.
We recommend that you should think about where the story is being driven from and be very wary of allowing interactions that would significantly limit, obstruct or demote that storytelling feature.
Design rules are guided by the content
When it comes to design rules for multi-screen experiences we are not able to offer prescriptive advice about font sizes, the number of items in menus or where to place interactive menus, nor about how many features are too many. Each case is different, and must be approached as such.
The way MotoGP is presented on screens today is already data and graphics heavy. Football employs fewer graphics and theatre is presented with no additional graphics or data at all except in the closing credits. Design rules are not common across genres but rather are guided by the brand and the form of content in question.
Do not expect simple one-size-fits-all content rules.
We recommend that you should not expect the relatively “busy” nature of a screen design used in one genre to be appropriate for another content genre. Reflect on the style and emphasis used for traditional broadcasts in your genre and develop your content-appropriate designs from there.
Develop a strategy for using object based broadcasting
We explored many ideas in the design phase for how OBB could offer new opportunities for presentation and interaction. Some of these ideas were developed and implemented so that we could assess and identify which features were most attractive and which had the greatest impact with audiences, broadcasters and producers. The features implemented varied significantly in both impact and cost. For example, users reported that alternative video views (particularly the onboard bike cams in MotoGP) had the greatest impact, but it should be acknowledged that this feature is expensive to produce and requires the most capable target devices. As a consequence, the addressable base for such features will be small. As you develop ideas for multi-screen experiences these complexities need to be acknowledged, and you will need to negotiate through them and, inevitably, make compromises.
Be clear about the objectives you have for using object based broadcasting.
We recommend that you should develop a cogent strategy based on a clear rationale for using the OBB approach. There are several possible rationales that could shape your response, including the following (a sketch of how such a choice might surface in client logic follows the list):
- A maximum addressable market strategy; where you design using technologically simple features that require only modest capabilities in the end clients and are therefore most likely to be achievable across a range of target devices.
- An inclusion strategy; where you choose to focus on improving the inclusive design of your content. In this case subtitles, text description, responsive design and on-screen signing options may be paramount and targeted, even though such features may not work on simpler devices.
- A premium experience strategy; where you focus on delivering the most impactful experience possible, accepting that doing so is likely to limit your addressable market. Developing such an option may help position the experience as a premium or ‘halo’ product that will showcase differentiation, drive interest in your product and brand, and increase uptake.
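The sketch below shows one way such a strategy choice might gate optional components in client code. The strategy names mirror the list above, but the feature flags, capability fields and thresholds are hypothetical illustrations rather than anything in the 2-IMMERSE platform.

```typescript
// Illustrative sketch: gating optional OBB features by strategy profile.
// Feature names and capability flags are hypothetical.

type Strategy = "maximum-reach" | "inclusion" | "premium";

interface DeviceCapabilities {
  maxConcurrentVideoDecodes: number; // how many video streams the client can decode
  supportsSigning: boolean;          // can render an on-screen signing video
}

interface FeatureSet {
  alternativeCameraFeeds: boolean;
  onScreenSigning: boolean;
  dataDrivenGraphics: boolean;
}

function selectFeatures(strategy: Strategy, device: DeviceCapabilities): FeatureSet {
  switch (strategy) {
    case "maximum-reach":
      // Only technologically simple features that modest clients can handle.
      return { alternativeCameraFeeds: false, onScreenSigning: false, dataDrivenGraphics: true };
    case "inclusion":
      // Prioritise accessibility features, even if simpler devices miss out.
      return {
        alternativeCameraFeeds: false,
        onScreenSigning: device.supportsSigning,
        dataDrivenGraphics: true,
      };
    case "premium":
      // Most impactful experience; accept a smaller addressable base.
      return {
        alternativeCameraFeeds: device.maxConcurrentVideoDecodes > 1,
        onScreenSigning: device.supportsSigning,
        dataDrivenGraphics: true,
      };
  }
}
```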
Personalise the audio experience too
Personalised audio experiences have high impact.
Audio is an object too; as user experience designers consider options for interactivity and choice, it is possible to overlook audio, but doing so would be a mistake. Using audio as an object is impactful and relatively low cost. In the MotoGP trials the ability to change the relative volume of the commentary and the ambient crowd noise was the feature gaining the second highest number of spontaneous recalls from users.
Audio does not require high bandwidth and is relatively easy to decode and render, and so can easily form a component of a maximum addressable market strategy.
Providing control over some aspects of the audio mix can also offer accessibility enhancement for those with hearing difficulties.
We recommend that you should ask, at many points of your design journey: “How am I using audio?”
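To show how lightweight audio personalisation can be in an HTML5 client, the sketch below uses the standard Web Audio API to give the user a single control that biases the mix between a commentary track and an ambient crowd track. The element IDs and the two-track split are assumptions for illustration and do not reflect the actual MotoGP implementation.

```typescript
// Sketch: independent gain control for commentary and ambient crowd audio
// using the standard Web Audio API. Element IDs are hypothetical, and the
// two <audio> elements are assumed to be kept in sync with the main video.

const ctx = new AudioContext();

const commentaryEl = document.getElementById("commentary") as HTMLAudioElement;
const ambienceEl = document.getElementById("ambience") as HTMLAudioElement;

const commentaryGain = ctx.createGain();
const ambienceGain = ctx.createGain();

ctx.createMediaElementSource(commentaryEl).connect(commentaryGain).connect(ctx.destination);
ctx.createMediaElementSource(ambienceEl).connect(ambienceGain).connect(ctx.destination);

// Called from a UI slider (0.0 to 1.0): bias the mix towards commentary or crowd.
function setCommentaryBalance(balance: number): void {
  commentaryGain.gain.value = balance;
  ambienceGain.gain.value = 1 - balance;
}
```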
Audio in multi-screen environments
One issue that 2IMMERSE identified, but for which we do not have a clear answer, is how to deal with audio presentation when multiple screen devices are capable of delivering audio. In replays for Football and MotoGP and in the Theatre At Home and Theatre In Schools service prototypes we enabled features that would play video with audio on the companion device.
During the live event both Football and MotoGP adopted a stance of muting audio from the replay so that the narrative from the live broadcast was never lost. Post-event, however, we did enable the audio from both the main-screen TV and the companion device to play out concurrently, which on occasion caused confusion and frustration as neither could be heard clearly. This multiple-audio approach was not ideal, but since the large screen is a shared experience and the companion screen a personal device, it did not seem right to silence the audio on the main screen either: others may be watching and listening to it.
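As a deliberately simple illustration of the policy described above, the sketch below mutes the companion-device replay while the event is live and allows its audio afterwards. The function, flags and types are hypothetical and capture the behaviour only in outline.

```typescript
// Sketch of the companion audio policy: during a live event the replay on the
// companion device is muted so the live commentary is never lost; after the
// event both may play. Names and flags are hypothetical.

interface PlaybackState {
  eventIsLive: boolean;        // true while the live broadcast is in progress
  replayEl: HTMLVideoElement;  // replay playing on the companion device
}

function applyCompanionAudioPolicy(state: PlaybackState): void {
  // Mute replay audio during the live event; allow it afterwards, accepting
  // that concurrent audio from two devices can be confusing.
  state.replayEl.muted = state.eventIsLive;
}
```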
We advise you to consider the impact of second screen audio and to look for compromises or good design rules that may help you and others navigate this issue.
Constraints on audio or video chat
In the Theatre At Home service prototype we attempted to allow people in different homes to enjoy watching theatre together in a virtual theatre box and to use video chat during the interval and before and after the play itself. This remains a seductive idea, but it proved difficult to implement well at low cost. Mainstream video chat implementations have focused on optimising the experience for single-person devices, on which audio can be handled quite well. In our use case, where the ideal was to place the audio and video on the main screen, it was much harder to capture both audio and video well. Built-in microphones and cameras are optimised for a presumed user orientation which may not apply in these shared-screen experiences. Harnessing efficient beam forming and echo cancellation is essential to develop good audio chat in an arbitrary environment. Be aware that implementing audio and video chat in this context is a complex task. However, watch developments with smart speakers closely, as the efficient functioning of these devices depends on addressing such audio challenges.
Audio for communications between open spaces remains difficult to implement well, but the technology employed by smart speakers may help.
We recommend that you should be very careful if considering including an audio or video chat feature in your experience. Attractive as they may seem, current implementations may not be optimised for a TV-based set-up.
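If you do experiment with chat in a shared-screen setting, browser media-capture APIs at least let you request echo cancellation and noise suppression. The sketch below uses the standard getUserMedia constraints; whether the underlying hardware and drivers honour them well in a living-room set-up is exactly the open problem described above.

```typescript
// Sketch: requesting capture with echo cancellation and noise suppression via
// the standard getUserMedia API. Browsers and devices may only partially
// honour these constraints, which is the core difficulty in a shared space.

async function startChatCapture(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
    audio: {
      echoCancellation: true,
      noiseSuppression: true,
      autoGainControl: true,
    },
  });
}
```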
On-boarding is key
It is relatively easy to imagine a user within a multi-screen experience and to conceive the ways they will interact with the media objects made available to them. It was always harder to envisage how they would join a multi-screen experience in the first place.
Within the project, defining the process by which people and devices are on-boarded (brought into a multi-screen experience) received considerable attention, and by the end of our work we had defined a robust and extensible method for on-boarding.
Three iterations of on-boarding development yielded a standards-based approach that works for all the use cases developed in 2-IMMERSE and there appears to be no reason why it should not also work for other use-cases.
We have built into the on-boarding process ways of defining the type of screen (shared screen, e.g. TV, tablet or phone) and the role of the user (teacher, student, venue manager) as they are added to the experience. This became critical towards the end of the project: in the Football FanZone development we had five potential large-screen roles, and in Theatre in Schools we had three roles for the tablets.
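To make the idea concrete, the sketch below shows the kind of information a device might declare when it is on-boarded: its screen type and the role of its user. The type names, endpoint and registration call are hypothetical illustrations, not the 2-IMMERSE on-boarding API; refer to our open source implementation for the real mechanism.

```typescript
// Illustrative sketch of an on-boarding registration: the device declares
// what kind of screen it is and what role its user plays. Types and endpoint
// are hypothetical, not the 2-IMMERSE API.

type ScreenType = "shared-tv" | "tablet" | "phone";
type UserRole = "teacher" | "student" | "venue-manager" | "audience";

interface OnboardingRequest {
  experienceId: string;   // the multi-screen experience being joined
  deviceId: string;
  screenType: ScreenType;
  role: UserRole;
}

async function onboardDevice(baseUrl: string, req: OnboardingRequest): Promise<void> {
  const res = await fetch(`${baseUrl}/onboarding/devices`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`On-boarding failed: ${res.status}`);
}
```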
We recommend that you should think carefully about how devices and people become associated with a multi-screen experience.
The 2IMMERSE “on-boarding” implementation works well and you are welcome to re-use it.
Simplify Workflows
It is comparatively easy to see OBB experiences from the perspective of the end user, but it is critical to recognise that OBB also requires changes to production workflows. The best implementations of object-based productions we can imagine demand a clear separation of the underlying video from the graphics, audio and alternative video that may augment it. Production workflows must be able to deliver both clean assets (video, graphics and audio) and the existing dirty feed that has all graphics and audio burned in. Achieving these different outcomes should not require distinct workflows from the perspective of the operator. Ideally a single action should trigger two outcomes: one for the existing broadcast world and one for an object-based broadcast.
We recommend that you should work with partners in the production chain to enable this twin-output solution. In this project, Chyron Hego set out on this journey by developing an adaptation to their PRIME graphics authoring tools that outputs graphics suited for embedding in a video stream and also outputs HTML5-based code describing the graphics, which can be rendered over a clean feed using a client-side browser plug-in.
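The “single action, two outcomes” idea can be sketched as follows: one operator trigger drives both the conventional broadcast keyer and a declarative graphics description for OBB clients. The interfaces shown are hypothetical and are not the PRIME tool’s API.

```typescript
// Sketch of "one operator action, two outputs": the same trigger drives the
// conventional broadcast keyer and emits a declarative description that OBB
// clients can render over a clean feed. All interfaces are hypothetical.

interface GraphicEvent {
  template: string;                   // e.g. "lower-third"
  fields: Record<string, string>;     // e.g. { name: "A. Rider", team: "..." }
  startTime: number;                  // seconds on the broadcast timeline
}

interface BroadcastKeyer {
  render(event: GraphicEvent): void;  // burns the graphic into the dirty feed
}

interface ObbPublisher {
  publish(event: GraphicEvent): void; // sends an HTML5/JSON description to clients
}

function triggerGraphic(event: GraphicEvent, keyer: BroadcastKeyer, obb: ObbPublisher): void {
  keyer.render(event);   // existing broadcast world
  obb.publish(event);    // object-based broadcast world
}
```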
Manage Bandwidth
2IMMERSE discovered during the Theatre At Home trials that, due to the nature of TCP, traffic patterns could emerge that would favour one service or content stream at the expense of others, effectively starving certain features of our multi-screen experience of bandwidth. To unlock the largest addressable market possible it is important to manage the available bandwidth intelligently so that, even in situations where bandwidth is limited, users are provided with the best possible experience.
2IMMERSE has implemented a bandwidth orchestration tool that allows content producers and broadcasters to specify how the experience should respond to situations where bandwidth is limited.
We recommend that you should use a bandwidth management and orchestration approach to ensure the experience for the user remains optimal even as the supply of bandwidth may become limited. You can reference and use the 2IMMERSE bandwidth orchestration tool.
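The kind of producer-specified policy this implies can be pictured as a priority list that the client walks through within a bandwidth budget. The sketch below illustrates the concept only, with hypothetical component names and bitrates; for the real mechanism, see the 2IMMERSE bandwidth orchestration tool.

```typescript
// Sketch: selecting which components to enable within a bandwidth budget,
// following a producer-specified priority order. Component names and
// bitrates are hypothetical; see the 2IMMERSE tool for the real mechanism.

interface ComponentDemand {
  id: string;       // e.g. "main-broadcast", "onboard-cam", "data-graphics"
  kbps: number;     // estimated bandwidth requirement
  priority: number; // lower number = more important
}

function selectWithinBudget(demands: ComponentDemand[], budgetKbps: number): string[] {
  const selected: string[] = [];
  let remaining = budgetKbps;
  // Highest-priority components get bandwidth first.
  for (const d of [...demands].sort((a, b) => a.priority - b.priority)) {
    if (d.kbps <= remaining) {
      selected.push(d.id);
      remaining -= d.kbps;
    }
  }
  return selected;
}

// Example: with 8 Mbit/s available, the main broadcast and graphics fit,
// but an additional onboard camera feed does not.
const enabled = selectWithinBudget(
  [
    { id: "main-broadcast", kbps: 6000, priority: 1 },
    { id: "data-graphics", kbps: 500, priority: 2 },
    { id: "onboard-cam", kbps: 4000, priority: 3 },
  ],
  8000,
);
console.log(enabled); // ["main-broadcast", "data-graphics"]
```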
Synchronisation is essential
Multi-screen experiences can be rendered unattractive or even incomprehensible when the presentation of objects is not correctly synchronised. There are many reasons why screens can fall out of sync, and we have worked within the project to provide mechanisms that allow screens to be synchronised frame-accurately within a room or across multiple sites. It is extremely important from a production perspective that the directorial intent for content presentation is maintained as, without this assurance, OBB presentation would not be considered fit for purpose.
Synchronisation is essential, and you can use the 2IMMERSE open source synchronisation code.
You should be aware of the risk that a lack of screen synchronisation creates. You must either design your experience so that synchronisation is not a critical facet, or implement a facility to ensure screens are always in sync. The 2IMMERSE network-based synchronisation service is available as an open source component for exactly this purpose.
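As a simplified illustration of what a synchronisation client does, the sketch below nudges a local video element towards a shared timeline position. The shared clock source (getSharedTimelinePosition) is a hypothetical placeholder; the real, standards-based mechanism is provided by the 2IMMERSE open source synchronisation component.

```typescript
// Simplified sketch of client-side sync: compare the local playback position
// with a shared timeline position and correct any drift. The shared clock
// source is a hypothetical placeholder; the real 2IMMERSE component uses a
// standards-based, network-distributed clock.

const MAX_DRIFT_SECONDS = 0.04; // roughly one frame at 25 fps

function correctDrift(video: HTMLVideoElement, sharedPositionSeconds: number): void {
  const drift = video.currentTime - sharedPositionSeconds;
  if (Math.abs(drift) > MAX_DRIFT_SECONDS) {
    // Hard seek for large drift; a production implementation would also use
    // small playback-rate adjustments for gentler corrections.
    video.currentTime = sharedPositionSeconds;
  }
}

// Example usage: poll the shared clock periodically and correct the local player.
declare function getSharedTimelinePosition(): number; // hypothetical shared clock
const player = document.querySelector("video")!;
setInterval(() => correctDrift(player, getSharedTimelinePosition()), 1000);
```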
Where to find the 2IMMERSE results
There are multiple aspects to the 2IMMERSE results, and you can access them according to the stage you are at in considering Object Based Broadcasting and the degree of technical information you want to see.
For high-level overviews and an introduction to our trials and services, please have a look at our videos.
For detailed descriptions of our implementations and results, please have a look at our published deliverables.
For the open source software and components produced by the 2IMMERSE project please look at our GitHub repository.