Designing Production Tools for Interactive Multi-Platform Experiences

BRITTA MEIXNER, JEI LI and PABLO CESAR from CWI Amsterdam write about one of the key challenges for the 2-IMMERSE project:

Recent technical advances make authoring and broadcasting of interactive multi-platform experiences possible. Most of the efforts to date, however, have been dedicated to delivery and transmission technology (such as HbbTV 2.0), and not to the production process. Media producers face the following problem: there is a lack of tools for crafting interactive productions that can span several screens.

Currently, each broadcast service (media + application) is created in an ad-hoc manner, for specific requirements, and without offering the creative director sufficient control over the overall experience. Our intention as a contribution to 2-IMMERSE is to provide appropriate authoring tools for multi-screen experiences that can reshape the existing workflow to accommodate the new viewing reality.

We have been working to identify new requirements for multi-platform production tools. The requirements for traditional broadcast productions are clear and well established, and are fulfilled by conventional broadcast mixing galleries such as the one above. But it is far from clear how multi-platform experiences will be produced and authored, as only a few such experiences are available so far. Each of these has been treated as an independent project and, as a consequence, was implemented on demand for a specific setting. The next generation of production tools must be designed specifically for interactive multi-platform experiences. These new tools are intended for broadcasters and cover both pre-recorded and live selection of content.

To find out about specific requirements for such tools, we conducted semi-structured interviews with seven technical and five non-technical participants. The interview guidelines covered several sections. The first section sought to identify state-of-the-art knowledge and current challenges in creating interactive multi-platform experiences, to learn how past experiences were authored, and to establish common ground between interviewer and interviewee(s). The second section aimed to find out who will use the system in the future and for what purpose, and included questions like:

  • Who will be users of the system?
  • What level of education or training do users have?
  • What technical platforms do they use today? What tools do they use to produce (immersive) experiences?
  • What other IT systems does the organization use today that the new system will need to link to?
  • What training needs and documentation do you expect for the future system?

Functional and non-functional requirements were then gathered. Example questions for functional requirements were:

  • What does the production process for live experiences look like?
  • Is spatial and temporal authoring desired?
  • Is the spatial design based on templates or can elements be arranged freely? How should layout support be realised, if at all?
  • Should the application be able to preview the presentation? If so, to what degree of detail?
  • Which data formats do you use for video/audio/images that have to be processed by the authoring environment?

Example questions for non-functional requirements were:

  • What are your expectations for system performance?
  • Are there any legal requirements or other regulatory requirements that need to be met?

After conducting the interviews, the transcripts were analysed; user characteristics, general and environmental constraints, assumptions and dependencies related to live broadcasts, and open questions and issues were identified and noted. We also differentiated between functional and non-functional (i.e., technical and user) requirements.

Fig. 1

Figure 1 above shows a subset of the initial collection of requirements, open questions, and issues. These were then rearranged according to the phases of the production process, as shown in Figure 2 below.

Fig. 2

Especially for the planning phase, a large number of open questions were identified. The production, distribution, and consumption phases revealed some technical questions that still need to be solved. We identified a set of requirements that served as the basis for first screen designs for the authoring tool. Based on the most relevant requirements, four concepts for the production tool interfaces were designed: a Chapter-based IDE (Integrated Design Environment), a Mixed IDE, a Workflow Wizard, and a Premiere Plugin.

Fig. 3

Fig. 4

The Chapter-based IDE concept (Figure 3) divides a program into several chapters (e.g., for a sports event such as MotoGP: pre-race, main race, post-race). Each chapter contains (dozens of) components such as a leaderboard, a course map, etc. The authoring process starts from newly created or predefined templates, which assign each component to a specific region on one of the screens. The start and end time of each component is authored on a timeline.
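
As a rough illustration of this structure, the sketch below models chapters, regions, and timed component placements in TypeScript; all type and field names are hypothetical assumptions, not taken from the 2-IMMERSE implementation.

  // Hypothetical sketch of the Chapter-based IDE's data model.
  // All names are illustrative assumptions, not the actual 2-IMMERSE types.

  interface Region {
    id: string;                    // e.g. "tv-main", "tablet-sidebar"
    device: "tv" | "companion";    // which screen the region lives on
  }

  interface ComponentPlacement {
    componentId: string;           // e.g. "leaderboard", "course-map"
    regionId: string;              // region the template assigns it to
    start: number;                 // start time on the chapter timeline (s)
    end: number;                   // end time on the chapter timeline (s)
  }

  interface Chapter {
    title: string;                 // e.g. "pre-race", "main race", "post-race"
    template: string;              // template that defines the regions
    regions: Region[];
    placements: ComponentPlacement[];
  }

  // A MotoGP-style program is then simply a sequence of chapters:
  const preRace: Chapter = {
    title: "pre-race",
    template: "two-screen-default",
    regions: [
      { id: "tv-main", device: "tv" },
      { id: "tablet-sidebar", device: "companion" },
    ],
    placements: [
      { componentId: "course-map", regionId: "tv-main", start: 0, end: 600 },
      { componentId: "leaderboard", regionId: "tablet-sidebar", start: 120, end: 600 },
    ],
  };
  const program: Chapter[] = [preRace /* , mainRace, postRace */];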

The Mixed IDE concept (Figure 4) does not divide a program into phases/chapters. Instead, it offers a collection of re-usable Distributed Media Application (DMApp) components, including components that play audio and video, present text and image content, and provide real-time video communication and text chat.

The limited collection of DMApp components (twelve have been developed so far) reduces the diversity and complexity of the components. Dragging and dropping DMApp components into the defined regions on the screens allows program producers to author a multi-screen experience with a coherent look and feel. The sequence of the applied DMApp components is editable on a timeline.
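
To make this concrete, here is a minimal TypeScript sketch of how such a component collection and the drag-and-drop assignment to regions might be modelled; the component kinds follow the list above, while the type and function names are assumptions for illustration only.

  // Hypothetical sketch of the Mixed IDE's component collection.
  // The component kinds mirror the article; the API itself is assumed.

  type DMAppKind =
    | "video-player" | "audio-player"
    | "text" | "image"
    | "video-chat" | "text-chat";

  interface DMAppComponent {
    kind: DMAppKind;
    label: string;                 // name shown in the authoring palette
  }

  interface TimelineEntry {
    component: DMAppComponent;
    regionId: string;              // screen region the component was dropped on
    start: number;                 // seconds on the experience timeline
    end: number;
  }

  // Dropping a component onto a region appends an entry to the timeline.
  function dropComponent(
    timeline: TimelineEntry[],
    component: DMAppComponent,
    regionId: string,
    start: number,
    end: number,
  ): TimelineEntry[] {
    return [...timeline, { component, regionId, start, end }];
  }

  let timeline: TimelineEntry[] = [];
  timeline = dropComponent(
    timeline,
    { kind: "video-player", label: "Main race feed" },
    "tv-main", 0, 3600,
  );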

Fig. 5

The Workflow Wizard concept (Figure 5) gives a program author an overview of the authoring process and guides the authoring step by step. It allows the assignment of work to different collaborators and facilitates a check on everyone’s progress.
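
The assignment-and-progress aspect of the wizard could be modelled roughly as follows; this is a purely illustrative TypeScript sketch, and the step names, statuses, and helper function are invented for the example.

  // Hypothetical sketch of the Workflow Wizard's task model.
  // Step names and statuses are assumptions, not the actual design.

  type StepStatus = "not-started" | "in-progress" | "done";

  interface WizardStep {
    name: string;                  // e.g. "define layout templates"
    assignee: string;              // collaborator responsible for the step
    status: StepStatus;
  }

  // Overall progress as the fraction of completed steps.
  function progress(steps: WizardStep[]): number {
    if (steps.length === 0) return 0;
    return steps.filter((s) => s.status === "done").length / steps.length;
  }

  const steps: WizardStep[] = [
    { name: "define layout templates", assignee: "alice", status: "done" },
    { name: "place DMApp components", assignee: "bob", status: "in-progress" },
    { name: "preview on devices", assignee: "carol", status: "not-started" },
  ];
  console.log(`Progress: ${Math.round(progress(steps) * 100)}%`);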

Fig. 6

The Premiere Plugin (Figure 6) is very similar to the Mixed IDE concept, but is based on the interfaces of Adobe Premiere. Since program authors are assumed to be expert users of Adobe Premiere, the idea behind this concept is to increase their feeling of familiarity and ease of use.

In the future, further evaluations of these four concepts will be conducted, and new concepts will be formulated based on the feedback.
