The IBC Accelerator programme began in 2019 to foster innovation in the broadcast industry. The programme encourages companies to collaborate on solutions to contemporary industry challenges.
Figure 1: IBC Accelerator Billboard
– Champions: ITN, YLE, BBC, TV2, TRANSMIXR, TALK TV
– Participants: Tinkerlist, Singular.live, Techex, Grass Valley, ZIXI
The goal of the project was to explore ways in which live broadcast and production workflows could be executed in a device- and gallery-agnostic fashion. The aim was to place an existing production workflow at the centre of the solution and to enable that workflow to be executed using multiple hardware devices across a variety of locations. An important aspect of this was that the workflow should remain universal: the control interface should remain constant, and the backend should be flexible enough to communicate with any studio software system that exposes an API.
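To make this architecture concrete, the sketch below shows one possible shape for such a backend: a constant control interface delegating to per-tool adapters. This is a minimal, hypothetical sketch; the class and method names are our own illustrative assumptions, not the project's actual code.

```python
# Minimal, hypothetical sketch of a device- and gallery-agnostic backend:
# one constant control interface, with an adapter per studio tool that
# exposes an API. All names here are illustrative assumptions.
from abc import ABC, abstractmethod


class StudioToolAdapter(ABC):
    """One subclass per studio software system (e.g. a mixer or rundown tool)."""

    @abstractmethod
    def launch_item(self, item_id: str) -> None:
        """Trigger a clip or graphic on the underlying tool."""


class HttpMixerAdapter(StudioToolAdapter):
    """Example adapter for a vision mixer controlled over HTTP."""

    def __init__(self, base_url: str) -> None:
        self.base_url = base_url

    def launch_item(self, item_id: str) -> None:
        # A real adapter would call the tool's documented API here.
        print(f"POST {self.base_url}/launch/{item_id}")


class ControlInterface:
    """The constant front end: identical calls regardless of device or backend."""

    def __init__(self, adapters: list[StudioToolAdapter]) -> None:
        self.adapters = adapters

    def launch(self, item_id: str) -> None:
        # Fanning a trigger out to every adapter is also what lets two
        # or more devices see the same playback simultaneously.
        for adapter in self.adapters:
            adapter.launch_item(item_id)


control = ControlInterface([HttpMixerAdapter("http://127.0.0.1:8088")])
control.launch("clip-42")
```

Because the control interface never changes, swapping the HMD, or the studio tool behind an adapter, leaves the operator's workflow intact.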
TRANSMIXR's specific role in the project was to investigate the use of XR devices in the production workflow. As part of this investigation, three proof-of-concept demos were created using the HoloLens 2, Meta Quest Pro, and XReal Light mixed reality head-mounted displays (HMDs).
Motivations
Our motivations for taking part in the IBC Accelerator were to:
● increase awareness of TRANSMIXR and the consortium partners.
● gather feedback on our POC, evaluate the solution, and inform the direction of the next phase of development.
● find new collaborators who could help us to develop our POC.
We also hoped our presence at IBC would lead to further opportunities to showcase our work at other conferences and events.
Demos
Three proof-of-concept demos were developed over the course of the project: one using the Meta Quest Pro headset, another using the XReal Light glasses, and a third using the HoloLens 2. The demos focused on three production tools that would be used in a broadcast context:
– TinkerList Automator
– TinkerList Cuez
– vMix
Cuez is fully cloud-based and is accessed through the Cuez web platform. The three chosen tools are components within a control room workflow that allow the operators to view the rundown in Cuez, preview and launch clips and graphics through the Automator, and view the playback in vMix.
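As an example of the kind of API-driven control this workflow relies on, vMix exposes an HTTP API that can trigger playback. The sketch below is a hedged illustration, assuming a local vMix instance on its default web controller port (8088) and an example input number; it is not the project's actual integration code.

```python
# Hedged sketch: triggering playback in vMix via its HTTP API.
# Assumes a vMix instance running on the local machine with the
# default web controller port (8088); the input number is an example.
import urllib.request

VMIX_API = "http://127.0.0.1:8088/api/"


def cut_to_input(input_number: int) -> None:
    """Cut the given vMix input to the programme output."""
    url = f"{VMIX_API}?Function=Cut&Input={input_number}"
    with urllib.request.urlopen(url) as response:
        response.read()  # vMix replies with a short status body


cut_to_input(2)  # e.g. take input 2 to air
```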
Figure 2: Quest Pro demo showing the HMD paired with the laptop allowing for control of virtual screens with the trackpad
The Meta Quest Pro was paired with the laptop (see Figure 2) using Meta’s Remote Desktop (running on the laptop) and Horizon Workrooms (running in the HMD). This allowed three virtual monitors to be set up, mirroring the laptop. In this way, all three tools could be viewed simultaneously without switching between tabs on the laptop or using three physical monitors. The interaction method for this implementation involved using the physical trackpad and keyboard on the laptop. This approach was intended to demonstrate the convergence of physical and virtual elements, namely the physical trackpad enabling interaction with the virtual screens.
The XReal glasses were tethered to a Samsung Galaxy S21 Ultra 5G smartphone over a USB-C connection. The smartphone powered the glasses and rendered all audio and graphical content. The production tools were all accessed through the browser built into XReal’s Nebula platform.
Figure 3: XReal glasses with tethered smartphone being used as a controller.
There were two interaction methods used with the XReal glasses. The first utilised the phone as a pointer, with a virtual ray cast from the top of the device to the virtual content (see Figure 3); the phone screen then acted as a touchpad. In this way, the user could point to a clip in the Automator and tap the phone screen to launch it in vMix. It was also possible to rearrange elements in the Cuez rundown. The second interaction method used the physical keyboard and trackpad on the laptop: the user selected and launched clips from the Automator with the standard mouse cursor while viewing the laptop screen, and viewed the Cuez rundown and vMix windows on virtual monitors in the glasses.
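For readers unfamiliar with how such pointer interactions work under the hood, the sketch below illustrates the basic geometry: a ray cast from the device is intersected with the plane of a virtual screen to find the pointed-at location. This is our own illustrative maths, not XReal's Nebula implementation.

```python
# Hedged geometry sketch: mapping a phone-as-pointer ray onto a flat
# virtual screen. Illustrative only; not XReal's actual implementation.
import numpy as np


def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Return the ray/plane intersection point, or None if the ray is
    parallel to the plane or the plane lies behind the ray origin."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-6:
        return None  # ray is parallel to the screen plane
    t = np.dot(plane_point - origin, plane_normal) / denom
    if t < 0:
        return None  # screen is behind the pointer
    return origin + t * direction


# Example: phone at the origin pointing forward (+z), virtual screen 2 m away.
hit = ray_plane_hit(
    origin=np.array([0.0, 0.0, 0.0]),
    direction=np.array([0.0, 0.0, 1.0]),
    plane_point=np.array([0.0, 0.0, 2.0]),
    plane_normal=np.array([0.0, 0.0, -1.0]),
)
print(hit)  # -> [0. 0. 2.], the point on the screen the user is targeting
```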
The HoloLens solution was built entirely from virtual elements, including three virtual monitors and a virtual keyboard. Hand tracking was the method of interaction with the virtual elements; in contrast to the Quest Pro and XReal implementations, no hardware control devices were used. The three virtual monitors displayed TinkerList Cuez, TinkerList Automator, and OBS via a web browser.
The main intention in presenting three proof-of-concept implementations incorporating different HMDs and interaction methods was to demonstrate that the solution proposed by the project participants can be implemented using a multitude of devices; that is, the proposed solution is not tied to any particular piece of hardware. In addition, when two or more devices were used simultaneously, triggering items in the Automator affected the playback on all of them. This demonstrates the potential for collaborative work, remote collaboration, and integration of XR devices with traditional production control room devices.
Survey
A short post-experience survey was prepared by Gareth Young and Grace Dinan. We conducted this survey at the pod in order to gather the opinions of industry professionals on the use of XR technology in their work. The following questions were asked:
1. What is your role in the broadcast industry?
2. How would XR technology be helpful in your job?
3. How can XR technology contribute to environmental sustainability or reduce resource consumption?
The survey was concluded with an open-ended question asking the interviewee if they wanted to add anything else. The responses were recorded using a Roland R-09HR field recorder. We are currently in the process of analysing this data.
Outcomes
The project and demos were very well received throughout the event. We were located in the Accelerator Zone in Hall 3, at an event that attracted approximately 43,000 visitors over four days. A video of the TRANSMIXR XR Control Room played on a continuous loop on a large monitor mounted on a wall at our pod (see Figure 4).
We showcased two of the solutions at the event: the Quest Pro and XReal glasses implementations. Footage of the third proof of concept, using the HoloLens, was included in the video demo playing on the monitor at the pod, as well as on stage during our panel discussion.
The devices and laptop were arranged on a desk in front of the wall-mounted monitor. We found that the headsets attracted the initial attention of passers-by and often provided the impetus for a conversation, which then led to them trying the demo. We gathered feedback on the demos in the form of a verbal questionnaire and recorded the answers for further analysis. Anecdotally, we found that users preferred the physical laptop and trackpad as interaction methods over hand tracking or using the phone as a virtual pointer. In the case of the XReal glasses, users found the phone pointer with trackpad quite cumbersome, as the virtual elements sometimes did not respond correctly to input, and pointing the ray accurately was difficult to master. In contrast, they found using the laptop trackpad to be faster and more robust. They often commented that the tactile response of physical control devices, combined with virtual displays, was a more comfortable way to interact with the content than using only hand-tracked gestures or virtual pointers.
There were also several comments highlighting the fidelity of the virtual screens. Many users preferred the sharpness of the virtual screens as displayed on the Quest Pro, and commented that text on the screens viewed through the XReal glasses was less clear. This is perhaps due to the design of the headsets: the XReal glasses are a see-through XR device, which results in slightly transparent virtual content, similar to the HoloLens, whereas the Quest Pro display consists of LCD screens on which the virtual content is combined with an RGB camera feed of the environment, allowing for a wider field of view and fully opaque virtual content. Regarding the form factor of the HMDs, we received feedback that the lightweight XReal glasses would be more conducive to longer sessions than the heavier Quest Pro.
Our Accelerator team gave a one-hour presentation on the Innovation Stage in Hall 3, during a prime slot on Saturday at 10.45am (see Figure 5). All seats and additional standing areas were filled. Our TRANSMIXR video was played during the stage presentation, and the panel discussed the benefits of collaborating with TRANSMIXR and the benefits of XR for the broadcast sector. After the stage presentation, visitors came to the pod specifically to ask about the XR solution; we often found that this was the element of the project that sparked their interest.
We networked with the other IBC Accelerator teams, as well as with visitors and vendors. Should we continue with the IBC Accelerator in 2024, we already have a number of interested collaborators who could help us achieve our goals. While at IBC, we also had the opportunity to research and explore the latest technology and industry trends, and we had numerous discussions about the future of the broadcast industry and where XR could play a role.
Conclusion
We are now in the process of transcribing the answers from our survey at the pod and analysing the data in detail. We received valuable insights into the negatives and potential pitfalls of XR solutions for broadcast. These were mainly related to the HMDs themselves, the user experience, and the effects of wearing an HMD for extended periods of time. Hand tracking and XR controllers were also flagged as potential negatives and areas to be addressed; we need to ensure the end-user experience is intuitive and comfortable. However, the mixed reality solution that integrated a physical keyboard and mouse was well received. In general, the XR Control Room received overwhelmingly positive feedback, particularly for the virtual screens and the benefits in terms of sustainability and remote production. The general feedback we received was that the XR Control Room is easier, faster, and cheaper to deploy than traditional solutions, and we were asked on numerous occasions whether the solution was available to purchase right now.