Getting in Sync with Shakespeare

We know our audiences like to use their smartphones and tablets (companion devices) while they watch television. Sometimes they use their companion devices to do things related to what they are watching: to tell their friends how excited they are about their favourite team’s performance in the World Cup, to look up the name of the actor playing the latest Doctor Who assistant, or to learn more about what a narwhal eats.

What if we could anticipate what our audiences want to do while they watch television and make it as simple as possible to do? What if their companion devices were able to interact with their television and figure out what they were watching and where they were along the programme timeline? If their companion devices knew exactly which scene they were watching on the television, what then? What would they want their companion devices to do about it?

Over the past few years, we have worked with colleagues to write and publish the technical standards that will be needed to make this work. We have been building and demonstrating prototypes to showcase the different types of experiences enabled when companion devices can interact with televisions. It was all very cool and interesting work, but we really wanted to know whether it would add any value to our audiences’ viewing experience.

As part of the 2-IMMERSE project, we are focusing on four pilots to build multi-screen experiences of drama and sport. To get the synchronisation across devices right, we planned to conduct some user experience studies in the BBC’s user testing labs (pictured below). We wanted to ask members of our audience (potential users) to evaluate a companion screen experience.

The BBC’s UX testing facilities at South Lab, London

Christoph Ziegler of IRT came along to work in the BBC’s R&D office in London. As part of the first pilot (an in-home theatre experience), we decided to test the concept of a synchronised script app on a companion device. The app highlights the line currently being spoken as a Shakespearean play, Richard II, plays on the television. The concept was inspired by the use of subtitles in theatres and by our conversations with John Wyver of Illuminations Media.
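
To give a flavour of the mechanics, here is a minimal sketch of how such an app might keep the highlighted line in step with the television. It assumes the companion app already receives the television’s playback position (for example via the companion screen synchronisation standards mentioned above) and that each line of the script has authored start and end times; the timing values, element ids and function names below are purely illustrative and are not taken from the pilot itself.

```typescript
// Minimal sketch: map the television's playback position to a script line
// and highlight it in the companion app. Timing data and element ids are
// illustrative, not taken from the actual pilot.

interface ScriptLine {
  id: string;        // id of the line's element in the rendered script
  startTime: number; // seconds from the start of the programme
  endTime: number;
}

const lines: ScriptLine[] = [
  { id: "line-1", startTime: 0.0, endTime: 4.2 },
  { id: "line-2", startTime: 4.2, endTime: 9.8 },
  // ... one entry per line of the script
];

let currentLineId: string | null = null;

// Call this whenever a new playback position arrives from the television.
function highlightLineAt(playbackTime: number): void {
  const line = lines.find(
    (l) => playbackTime >= l.startTime && playbackTime < l.endTime
  );
  if (!line || line.id === currentLineId) {
    return; // nothing is spoken right now, or the highlight is already correct
  }
  if (currentLineId) {
    document.getElementById(currentLineId)?.classList.remove("highlighted");
  }
  const el = document.getElementById(line.id);
  el?.classList.add("highlighted");
  el?.scrollIntoView({ behavior: "smooth", block: "center" });
  currentLineId = line.id;
}
```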

Three versions of the synchronised script app on a companion device

As part of the study, we asked potential users to view a series of short clips of Richard II on a television alongside a companion device running the synchronised script app. While some of the clips were perfectly synchronised with the script, others had a pre-determined delay injected. We wanted to see a) whether users noticed these injected delays and b) how far users were prepared to tolerate and ‘forgive’ perceived faults in the synchronised experience as a whole.
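
For the delayed clips, the simplest way to inject such a fault is to offset the playback position before looking up the line to highlight, so the script lags the television by a fixed amount. The sketch below builds on the hypothetical highlightLineAt function above; the offset value is illustrative and is not one of the delays used in the study.

```typescript
// Sketch of injecting a pre-determined delay for a test clip: the companion
// app deliberately lags the television by a fixed offset. The value shown is
// illustrative only.

const injectedDelaySeconds = 1.5; // hypothetical per-clip setting

function onPlaybackPositionUpdate(reportedTime: number): void {
  // Highlight the line that was being spoken injectedDelaySeconds ago.
  highlightLineAt(reportedTime - injectedDelaySeconds);
}
```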

We also presented users with three variations of the synchronised script app. The basic version presented a non-interactive script in which the line being spoken was highlighted in time with the scene on the television. The second version allowed users to click on the actor icons at the top of the app to get more information about the cast involved in the scene. The final version prompted users to take part in a quiz at certain points in the programme. We hope to get an idea of how much interactivity users feel comfortable with while watching an engaging theatrical play.

We also asked users to tell us how well they thought the experience worked and what they thought of the design of the synchronised script app. We showed them alternative versions of the app to see whether they thought one design was better than another.

Currently, we are analysing all the feedback users have been kind enough to give us and we hope to have some results to share with you in the near future.
