Live virtual flights around Wimbledon - Venue Vu
The Wimbledon coverage on BBC2 on Tuesday evening (21st June) featured the first ever use on air of 'Venue Vu' - a new system we've been developing to help visualise live events happening across a wide area. The system was used to produce a virtual 'flight' from Centre Court to Court 3, created using a pre-generated 3D model of Wimbledon, with live video from these two courts being projected into the model.
A snapshot from the first onscreen use of Venue Vu from this year's Wimbledon coverage. Court 2 is visible on the left of the image, with a real TV camera feed of the court embedded in the 3D model.
The use of live video to 'bring to life' parts of the model allows the generation of a seamless flight that starts and ends with the live camera feeds. This allows the relationship between the areas visible in the live images and the rest of the 3D model to be seen much more clearly than it would be if the model were shown in isolation. Our colleagues in BBC Sport are trialling the system to see how it can help explain to viewers the layout of Wimbledon, and to test its robustness in a live programme, and we expect it to make appearances throughout this year's coverage.
This system has come out of our work in VSAR - a collaborative project part-funded by the Technology Strategy Board in which we are working with other UK partners to investigate approaches to merging 3D models with live images and other data to help people interpret events across a large area. Our interest is in portraying events such as Wimbledon, whilst other partners are looking at applications in fields like security and surveillance. We worked with one of the project partners, CAST (the Centre for Advanced Software Technology in north Wales), to produce the interactive visualisation system. We also worked closely with Crystal CG, creators of high-end 3D models (such as those used for virtual flights around the Beijing Olympic venues), who built the model of Wimbledon and rendered the virtual flights.
The system uses two PCs and a video router, set up in the back of one of the outside broadcast trucks at Wimbledon. When the operator selects the courts to fly between, the video router passes the camera feeds from those courts to video capture cards on the PCs. One PC runs software to track the motion of each camera by reference to the court lines (based on the technology we developed for the award-winning Piero system), so the system knows each camera's current pan, tilt and field of view. The other PC plays the appropriate pre-rendered animation from a RAID disk array, and uses a graphics card to project the video from the cameras onto the relevant courts, taking account of the current position and orientation of both the live cameras and the 'virtual' camera used when rendering the animation. A carefully-crafted arrangement of soft-edge masks ensures that the live video 'illuminates' the court area without turning the crowd into a mixture of real and virtual people. As the video is only a 2D representation of the scene, it cannot produce an accurate rendition if viewed from a location far from the real camera position, but by carefully choosing the animation paths it is possible to produce a surprisingly realistic result.
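To give a flavour of the geometry involved, here is a minimal, self-contained sketch of the two core calculations described above: projecting a point on a court into the live camera's image (given only the tracked pan, tilt and field of view), and a soft-edge blend weight that fades the live picture out at the court boundary. This is an illustrative model only - the function names, axis conventions (y up, z along the pan-zero axis) and the simple rectangular 'feather' mask are our assumptions, not the actual Venue Vu implementation.

```python
import math

def project_to_image(point_world, cam_pos, pan_deg, tilt_deg, fov_deg,
                     width=1920, height=1080):
    """Project a 3D world point (x, y, z) into pixel coordinates of a live
    camera described by its position, pan, tilt and horizontal field of view.
    Returns None if the point lies behind the camera."""
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    # Offset from the camera to the point, in world axes.
    dx = point_world[0] - cam_pos[0]
    dy = point_world[1] - cam_pos[1]
    dz = point_world[2] - cam_pos[2]
    # Pan: rotate about the vertical axis.
    x1 = math.cos(p) * dx - math.sin(p) * dz
    z1 = math.sin(p) * dx + math.cos(p) * dz
    # Tilt: rotate about the camera's horizontal axis.
    y2 = math.cos(t) * dy - math.sin(t) * z1
    z2 = math.sin(t) * dy + math.cos(t) * z1
    if z2 <= 0:
        return None  # behind the camera, cannot be textured from this feed
    # The horizontal field of view fixes the focal length in pixels.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    u = width / 2 + f * x1 / z2
    v = height / 2 - f * y2 / z2
    return u, v

def soft_edge_weight(x, z, half_width, half_length, feather=2.0):
    """Blend weight for the projected video: 1.0 well inside the court
    footprint, fading to 0.0 over a 'feather' margin at the edges, so the
    live picture lights up the court without hard seams into the model
    (and without pasting real spectators over virtual ones)."""
    inside = min(half_width - abs(x), half_length - abs(z))
    return max(0.0, min(1.0, inside / feather))
```

In a real renderer these per-point results would be computed on the graphics card for every textured fragment; the point here is simply that pan, tilt and field of view are enough to map model coordinates to live-video pixels.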
We hope to develop the system further, for example to allow interactive use online, and will work with BBC Sport and Crystal CG to explore other events that it could be used for, possibly including the 2012 Olympics.
I'd like to thank my colleagues who worked on the system, particularly Bruce Weir, as well as our partners at CAST, and our collaborators at Crystal CG, for all the effort they have put into getting the trial on air.