The Day of the Doctor: a 3D case study

David Wigram, stereo3D consultant on The Day of the Doctor, takes us through the process of creating a 3D show

The first episode of Doctor Who was broadcast on 23 November 1963, and to celebrate the 50th anniversary of the show, the BBC commissioned their biggest-ever episode. The Day of the Doctor is 75 minutes long and was conceived and shot in 3D for simultaneous transmission into homes and cinemas worldwide. It is the first major TV drama to be shot in 3D, and it was a serious undertaking for the production team, as only the VFX department had any experience of working in the format. Producer Marcus Wilson embraced the shift from 2D to 3D, successfully overseeing both the full production team at BBC Wales and key London partner Milk, who supplied the majority of the VFX. Those effects, combined with contributions from the in-house team and additional external suppliers, were managed by long-standing Doctor Who post-production supervisor Nerys Davies. The 3D approach was led by specialists Adam Sculthorp (stereo supervisor) and David Wigram (stereo3D consultant), who were involved with the show from the R&D phase in December 2012 until delivery in October 2013. In this article David takes us through the process of creating a 3D show.

How 3D works

For a programme maker, using 3D is all about giving a more natural and exciting representation of the world. Away from the TV, wherever we look we see two slightly different views of the world, one through each eye, and our brain melds those views together into a clear and detailed experience of our environment. A well-produced show made in 3D (or stereoscopic, as it's technically called) will give the audience a vivid and lifelike sense of being there, seeing for real what we present to them.

Using two cameras we capture a pair of images (one for the left eye and one for the right), and using a special screen with matched glasses we can present each image discretely to the appropriate eye of each viewer. The production team felt that the freewheeling, adventurous style of Doctor Who would be well suited to the stereoscopic format, and made the commitment to change their production process to accommodate the needs of 3D.

The equipment used

The requirements for a 3D camera setup are in many ways the same as for any other: high resolution, clean optics, a low-compression recording system and good ergonomics. In addition, though, we need to get the cameras very close to each other. In order to reproduce a scene exactly as it would be seen by a viewer on the ground, the lens centres need to be placed as close together as the centres of a human's eyes. This means either using very small cameras indeed (and accepting the low quality that results from that) or bouncing one camera's view off a mirror to simulate the narrow alignment. These mirror rigs are motorised so that the distance between the cameras, and also their angle, can be adjusted live during a take. They are expensive bits of kit.

On The Day of the Doctor we used pairs of Alexa-M cameras, with one lightweight rig (P+S Technik Freestyle) for handheld work and one fully motorised rig (3Ality TS5) for grip-supported shots.

The 3Ality rig on set

The difference from filming in 2D 

Shooting with 3D rigs can be a little slower but, as with any equipment, an experienced team will work a lot faster than a novice one. We found that once the crew was trained up, the shooting process was around five percent slower, averaging 20 setups a day. Directors and directors of photography (DoPs) new to the visual grammar will find that shot choices which have one effect in 2D will have a different kind of impact in 3D. Our team did what many filmmakers do as they become comfortable with stereo: they used fewer over-the-shoulder shots, shorter focal lengths, and more specific staging of actors and props. The filming process itself is largely unchanged. There are a couple of extra people on the shooting team: a 3D rig technician, who sets up the rig with the camera assistants, and the stereographer, who advises the director and DoP on 3D choices. The stereographer is also hands-on during takes, using remote controllers to adjust the angle of the cameras relative to each other (known as convergence) and the distance between the cameras (the interaxial). These two metrics are crucial for capturing good 3D.
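
To make the relationship between those two controls concrete, here is a rough back-of-the-envelope sketch, with purely illustrative numbers and a simplified pinhole model (not the production's actual tool or settings), of how interaxial and convergence distance translate into parallax on the viewer's screen:

    # Illustrative sketch only: simple pinhole model, convergence set by
    # horizontal image shift, all figures hypothetical.

    def screen_parallax_mm(focal_mm, interaxial_mm, convergence_mm, subject_mm,
                           sensor_width_mm, screen_width_mm):
        """Approximate horizontal parallax on the final screen, in mm.

        Positive parallax = behind the screen, negative = in front of it,
        zero at the convergence distance.
        """
        # Disparity recorded on the sensor (small-angle approximation)
        sensor_parallax = focal_mm * interaxial_mm * (
            1.0 / convergence_mm - 1.0 / subject_mm)
        # Scale up from the sensor to the presentation screen
        return sensor_parallax * (screen_width_mm / sensor_width_mm)

    if __name__ == "__main__":
        # Hypothetical setup: 32 mm lens, 40 mm interaxial, converged at 3 m,
        # Super 35-ish sensor (~24.9 mm wide), shown on a 1.2 m-wide TV.
        for subject_m in (1.5, 3.0, 10.0, 1000.0):
            p = screen_parallax_mm(32, 40, 3000, subject_m * 1000, 24.9, 1200)
            print(f"subject at {subject_m:6.1f} m -> parallax {p:+6.1f} mm on screen")
        # Positive parallax beyond roughly 65 mm (adult eye separation) forces
        # the eyes to diverge, which is why these controls are ridden live.

In this simple model the figures scale linearly with screen width, so the same shot on a cinema screen ten times as wide produces roughly ten times the parallax, which is one reason a simultaneous TV and cinema transmission keeps the stereographer's choices tight.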

Stereo supervisor Adam Sculthorp setting the interaxial and convergence, taking into account the depth of the VFX to be added later

Technical challenges and common problems

The large physical size of the rig does have implications for TV crews used to modern digital video cameras (film crews used to 35mm gear tend to be unfazed). A scene between actors Jenna Coleman and Jemma Redgrave was altered from a driving sequence to a static car scene once the DoP realised he couldn't fit the rig inside even a large Range Rover. Grip gear needs to accommodate the extra weight, so the smallest cranes, dollies and sliders can't be used. Lens flares, where a bright light in front of the camera causes rings and glows across the frame, are problematic, as they affect each lens differently and produce false depth effects and painful errors for the viewer. The way light responds on the rig mirror can cause problems when photographing shiny objects like wet roads or Daleks, so costume and makeup tests are crucial at the beginning of production, and the art department needs to be trained up too.

There are quite a few little tricks that work in 2D but not in 3D, which trip up newcomers to the process. For example, when we used split-screen to duplicate actor Joanna Page, in one shot the eyelines were correct for 2D but not for 3D, and the shot had to be corrected with VFX. Reflections in glass and mirrors can appear very strange unless tightly controlled.

Effect on production and post 

The additional cost of shooting in 3D varies hugely, and although it is commonly expressed as a percentage of budget, the larger the production the smaller that extra percentage will be. Monitoring on set and during editing needs to be 3D, and of course for every angle twice as much video data is created, meaning backup and archive systems need to be able to cope. We used a workstation on set that performed checks on the data and the 3D choices before material was sent to editorial for backup and ingest. Filename and metadata conventions need to be strictly adhered to, as the conform process is more complex and less able to cope with problems.
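
As an illustration of the kind of automated check involved, here is a minimal sketch, assuming a hypothetical "_L"/"_R" filename convention and MXF clips (not the production's actual tooling), that flags any clip recorded for only one eye before a card goes off for backup and ingest:

    # Hypothetical data-wrangling check: every left-eye clip must have a
    # matching right-eye clip, and vice versa.

    from pathlib import Path

    def find_unpaired_clips(card_root: str) -> list[str]:
        """Return clip names that are missing their opposite eye."""
        lefts, rights = set(), set()
        for clip in Path(card_root).rglob("*.mxf"):
            stem = clip.stem
            if stem.endswith("_L"):
                lefts.add(stem[:-2])
            elif stem.endswith("_R"):
                rights.add(stem[:-2])
        # Symmetric difference = clips that exist for one eye only
        return sorted(lefts ^ rights)

    if __name__ == "__main__":
        # Hypothetical mount point for an offloaded camera card
        for name in find_unpaired_clips("/media/card_A012"):
            print(f"WARNING: {name} has only one eye recorded")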

Offline editing proceeds largely unchanged; our editor was typical in that she worked through the day viewing in normal 2D, then popped on the glasses to review sequences and assemblies in 3D. The software needs to be 3D-capable, but most packages are nowadays. An additional stage during the online editing process is the alignment and fixing of 3D errors that weren't avoided during filming. This effectively turns every shot into an effects shot, requiring a compositing pass on every frame of the show, but there are fast hardware/software systems dedicated to the task, speeding it up considerably.
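
Those dedicated systems are proprietary, but the core idea of geometric alignment can be sketched in a few lines of OpenCV (a simplified illustration, not the production pipeline): measure the residual vertical disparity between the two eyes and shift one eye to cancel it, since any vertical offset between left and right images is uncomfortable for viewers.

    # Simplified illustration of one alignment fix: removing vertical disparity.

    import cv2
    import numpy as np

    def vertical_offset_px(left_bgr, right_bgr):
        """Estimate the median vertical misalignment between the eyes, in pixels."""
        orb = cv2.ORB_create(1000)
        kp_l, des_l = orb.detectAndCompute(
            cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY), None)
        kp_r, des_r = orb.detectAndCompute(
            cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY), None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des_l, des_r)
        # Vertical difference of each matched feature pair (right minus left)
        dy = [kp_r[m.trainIdx].pt[1] - kp_l[m.queryIdx].pt[1] for m in matches]
        return float(np.median(dy))

    def align_right_eye(right_bgr, dy):
        """Shift the right eye vertically to cancel the measured offset."""
        h, w = right_bgr.shape[:2]
        shift = np.float32([[1, 0, 0], [0, 1, -dy]])
        return cv2.warpAffine(right_bgr, shift, (w, h))

In practice the dedicated tools correct rotation, zoom and keystone differences as well, and colour-match the two eyes, which is why every shot effectively becomes an effects shot.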

Director Nick Hurran wearing 3D glasses looking at the monitor on set

The impact on special effects 

Practical SFX on set are largely unchanged by 3D, and can benefit greatly from the extra dimension; flame, smoke and rain all look significantly more lifelike. Pre-production testing led the director to request fluffy pollen in the barn sequences, and dust in the Arcadia battle sequences, to fill the stereo depth of the shot. In-camera visual effects can become problematic, particularly forced perspective, used on model shots and to simulate performers' proximity to explosions, where the illusion is broken by the addition of depth perception. The scene with David Tennant and Joanna Page on the horse was curtailed slightly because the initial plan was to cheat the horse behind them for dialogue; in the end it was achieved on the horse and looked great.

Digital VFX are more difficult, with matte paintings, wire removals and other standard tricks of the trade needing to be upgraded to work in a 3D environment. Fully rendered 3D CG such as environments, creatures and vehicles works extremely well, but rendering time and storage requirements are necessarily doubled.

How to learn 3D & what the future holds 

The best way to learn more about 3D is to buy a good book on the subject (such as 3D Movie Making by Bernard Mendiburu), get hold of a cheap 3D camcorder to experiment with, and watch all the 3D you can. A course from a reputable institution will accelerate the learning process considerably, and give you contact with working professionals who have the latest knowledge. Being at the forefront of technology, 3D is constantly evolving; at the time of writing, high frame rates (HFR), as experimented with by Peter Jackson in his Hobbit movies, are causing much discussion, and glasses-free screens are coming out of the lab and onto the market. It's an exciting time.

What does the future hold? Nobody really knows, but over the short term expect the top-grossing films of each year to remain 3D while the majority of releases stay 2D. Expect further innovations along the lines of HFR as the theatrical industry invests in keeping the cinema a high-value experience, to compete with TV, tablets and phones as ways of consuming media. But of course experience has shown many times that where cinema leads, TV follows.

After a night of celebration for Doctor Who's 50th anniversary, the show was awarded a Guinness World Record for the largest ever simulcast of a TV drama, with The Day of the Doctor broadcast in 94 countries across six continents.