Visualising Material World: the ins and outs
[Editor's note: We've previously heard from Operations Manager Tony Ward about Material World and the visualisation trial. Now it's the turn of Radio 4's technical guru Richard Courtice to tell his side of the story.]
The BBC's Audio and Music Interactive department works in the clouds.
No really, it does. When I first got involved with Visualising Radio 4's Material World, I was presented with an A4 sheet of paper. On it was a diagram showing various fluffy looking clouds connected by arrows. The clouds had generic labels like "The Internet", "SMS Content", "Servers" and "The Studio". None of the arrows were labelled. It became clear that the cloud labelled "The Studio" and the arrow pointing towards "The Internet" was my bit to sort out.
On a side note, I also noticed that the clouds are the fluffy, Cumulus kind. The ones that make you think of a blue sky, a refreshing breeze and clean air. The sort of clouds that the Orb wrote about. But, I've been around for a while and I often suspect that Cumulonimbus might be more appropriate - dark, thunderous and anvil shaped, ready to dump on you from a great height, like the endlessly doomed Wile E. Coyote of the Roadrunner cartoons. ..... But I digress.
For the viewer/listener (the "visualisee" perhaps? *shudders*), visualisation is a window that can be added to a webpage or TV screen. It contains a small rectangle for video or pictures and a section to the right for messages from the listeners or for bullet points of information about the programme. Along the bottom of this window runs automated data from either BBC News or, for the pop music stations, the "now playing..." information from the computer playout systems. All this is known as the "Visualisation Console". The purpose of the trial is to see if the Console works technically and if it works as an addition to the audience's "listening" experience.
The engineers and producers of the Interactive team in Audio and Music had already achieved a great deal. The Console had been designed and the IP distribution had already been worked out, so despite only being a small part of the console, creating the video was going to involve the most work.
That weekend I was telling my partner, Kate, about the Visualisation Console.
"Isn't this television?" she asked.
"I think the idea is that it's not TV, because the visual bit only adds to the audio. One mustn't do anything that leaves the radio audience feeling left out. The sound is in charge" I said.
"Oh I see," Kate said, "because what you've described sounds just like that stupid football programme you sometimes watch on Saturday afternoons. You know, the one with ex footballers wearing headphones watching TV screens and it has the latest scores displayed on the right and scrolling news stories along the bottom."
"Yes I suppose." I said, suddenly very unsure of whether I was working on radio, TV or some point in between.
Another digression: one of my pet theories is that in any broadcasting project you will be tripped up by one tiny detail that you haven't thought of, so the trick is to think of it and it won't trip you up. Trouble is, you can't think of everything, so you will always be tripped up at least once... Anyway, back to Visualisation.
Material World broadcasts from Studio 50B on the 5th floor of Broadcasting House. Next door is 50D, and this seemed the best place to put the Visualisation equipment and people. They "won't get in the way then."
With me in the technical team, was Abdi Ismail, a project engineer; Chris Price from DV Solutions who supplied the video equipment, Terry O'Leary & Phil Watson, producers in the Interactive team; and Ilika Copeland, the Executive Producer for the Visualisation Project.
The plan was to put two remote-control Sony cameras on tripods in two adjacent corners of the studio. Also attached to these tripods would be two Sony A1 DV cameras. These would be locked off on wide shots, leaving the two remote-control cameras to pick up the single shots and close-ups.
A Sony Anycast would be used to control the cameras, vision mix, add the sound and generate an AV feed in DV format. The AV stream would feed into a laptop via a Firewire cable. The laptop, running Windows XP, would use Adobe Flash Encoder to encode the AV. This laptop would then connect to the Internet via an SDSL link in the basement to send the video to a distribution company.
The word laptop worries me. Radio and television work best on the basis that when one pushes the button, the desired outcome happens. The phrase "hang on - just need to reboot" is not what you want to hear when the producer says "Go". The rig is also full of single points of failure - another thing that gets us technical folk twitching. But on the other hand, this is only a trial and spending should be kept to a minimum.
So, Abdi had a list of engineering jobs: to rustle up four video tielines from 50C to 50D terminating in BNC connectors, plus a CAT5 tieline for the remote control unit. Then find a CAT5 cabling route from 50D to the SDSL modem in the basement.
For the audio, I decided to use a feed of Radio 4 network for the digital platforms, post-processor. This is not normally available in the studios, but a quick chat with Control Room showed that it could easily be got and routed to an Outside Source in 50D. The reasoning behind this choice was a desire for the audio on the Visualisation Console to match as closely as possible the audio of the BBC iPlayer streams. These internet streams are fed with the same distribution as the Digital TV and DAB platforms and so it made sense to use this feed for the Visualisation Console.
The next minor issue was finding an analogue feed of this in the studio. When Broadcasting House was refurbished in 2006, the whole infrastructure became digital. Thankfully we were in a well-equipped studio and the Studer Vista mixing desk was easily configured to spit out an analogue feed on some tielines.
Then I pondered the question of sound and vision sync. The audio from Studio 50B would go through a mixing desk in Radio 4 continuity, then through the Network Switcher, then through the audio processor, then through the 50D desk and the digital to analogue conversion before finally arriving at the Sony Anycast.
I guessed the delay could be as much as 80 milliseconds. But that was nothing compared to the delay on the video after it had been mixed in the Sony Anycast. Chris and I crunched some numbers and our starting point was to delay the audio by 4 frames. In the end, a 3.5-frame delay on the audio gave the best result - so I was 12ms or so out in my calculations.
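The frame arithmetic is easy to sketch, assuming PAL video at 25 frames per second (40ms per frame) - the figures here are illustrative conversions, not measurements from the rig:

```python
# Back-of-envelope sync arithmetic, assuming 25 fps PAL video.
# The rig's exact latencies varied; these are just the conversions.

FRAME_MS = 1000 / 25        # one video frame = 40 ms at 25 fps

audio_guess_ms = 80         # my estimate of the audio chain delay
starting_frames = 4         # first attempt: delay the audio by 4 frames
final_frames = 3.5          # what actually looked in sync

print(starting_frames * FRAME_MS)   # 160.0 ms
print(final_frames * FRAME_MS)      # 140.0 ms
```

In other words, the video path through the Anycast added far more latency than the whole audio chain, so it was the audio that needed holding back.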
Finally, there are the encoding options. For the trial, we're using an FLV container and the RTMP protocol to stream via Flash Media Server 3. The video coding is On2 VP6-E at 384kbps and the audio coding is MPEG-1 Audio Layer 3 (MP3) at 128kbps joint stereo.
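For a rough sense of the bandwidth involved, the advertised bitrates add up as below; the container/protocol overhead figure is my assumption for illustration, not a measurement:

```python
# Rough stream bandwidth: On2 VP6-E video at 384 kbps plus
# MP3 audio at 128 kbps. The 10% FLV/RTMP overhead is an
# assumed figure, not measured from the trial.

video_kbps = 384
audio_kbps = 128
overhead = 0.10

total_kbps = (video_kbps + audio_kbps) * (1 + overhead)
mb_per_hour = total_kbps * 3600 / 8 / 1024

print(round(total_kbps, 1))   # 563.2 kbps
print(round(mb_per_hour))     # roughly 248 MB for an hour-long stream
```

Comfortably within the SDSL link's capacity, with headroom for the laptop's other traffic.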
A week or so after the first meeting, the team met in 50B to work out lighting and camera angles. Joining us was Luke Finn, an expert in lighting.
"Pah", said Luke, grimacing at the ceiling, "low ceilings - can't stand low ceilings". He reached up and dug his fingers into a cavity between the ceiling tiles and the wall. Finding a purchase point, he pulled sharply down. The ceiling flexed a fraction and Luke let go.
"Weeelll" he mused. "We might get a clamp in there but there's nowhere to bond it."
He prowled round the studio table looking at it from various angles.
"Are those staying?" he asked pointing at the table. "Are you going to be using those microphones?"
The microphones in question were AKG C414s - the workhorse of Broadcasting House. I've never heard a bad one.
"Yes" I said, "They're staying."
"All of them?" he asked.
"Yes, all of them." I said, firmly.
Luke seemed to accept this and took up another position looking at the table. There then followed a flurry of activity. By applying some scrim, he used the fluorescent strip lights around the edge of the room to backlight the presenter. A small Dedo spot with some frost to soften the light was attached to a bar on the wall (the bar normally holds video monitors). Then a Diva 400 was used to key-light the presenter.
This gives a soft shadow and produces very little heat. The stand for the Diva was carefully hidden behind a panel showing the Radio 4 logo. That'll please the marketing department, I thought.
It transformed the pictures. Just two lamps and some deftly applied filters had made all the difference. Luke had also designed a lighting rig that could be set by our studio technicians well within the one hour period we would have every Thursday lunchtime to rig the studio.
We were ready for the first broadcast.
For a broadcast, a producer and a member of the radio production team "drive the console". They cut in either video from the studio or still shots taken whilst recording on location. They also add in pre-prepared text about the programme, or comments from the audience.
For the Material World trial, Philip from BBC TV Resources operates the cameras, lights and vision mixer. I set up the links and manage the studio rig and transmission. As I write, we have now visualised two episodes of Material World and the results have been very pleasing.
Already, we have thought of things we would like to change. On the 3rd of July broadcast, we made the terrible discovery that the text on the Visualisation Console couldn't reproduce apostrophes. The moment the phrase "... a virtual cows bottom" appeared on the screen, I winced. The producer was horrified.
"I typed an apostrophe" he cried.
Within seconds, many of the audience took advantage of the "message the studio" button on the console. Being the Radio 4 audience, they were all very polite, but their scorn was plain to read, and rightly so.
It's that tiny detail that you never think of, you see?
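The apostrophe incident looks like a classic case of sanitising when you should be escaping. I haven't seen the console's actual code, so this is a hypothetical sketch of the trap: a filter that strips "unsafe" characters silently eats the apostrophe, whereas escaping for the display format keeps it intact.

```python
import html

def strip_unsafe(text):
    # Hypothetical over-eager sanitiser: dropping anything that isn't
    # alphanumeric, space, full stop or comma silently eats apostrophes.
    return "".join(ch for ch in text if ch.isalnum() or ch in " .,")

def escape_for_display(text):
    # Escaping instead of stripping preserves the punctuation:
    # html.escape with quote=True turns ' into &#x27;
    return html.escape(text, quote=True)

caption = "... a virtual cow's bottom"
print(strip_unsafe(caption))        # ... a virtual cows bottom  (the on-air blunder)
print(escape_for_display(caption))  # ... a virtual cow&#x27;s bottom
```

The safe habit is to escape on output for whatever format the console renders, rather than deleting characters on input.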
Richard Courtice is a Senior Studio Manager and Digital Operations Specialist at the BBC.