The Space: Building a Broadcaster in a Box
Back in August last year, Tony Ageh asked us a question: "How would you deliver a 'pop-up' television channel to desktops, mobiles, tablets and connected TVs?"
The typical response, particularly within the BBC, would be to suggest re-purposing much of the infrastructure we already have: media ingest, metadata management, transcoding, web publication, device targeting.
There was a snag, though. In fact, there were a couple. First, this wasn't just a pop-up TV channel - this was a "broadcaster in a box", which could later be handed over to arts organisations to pick up and run with. Second, we had to have as little impact upon the rest of the BBC as possible (it turns out that 2012 is quite a busy year for the Olympic Broadcaster).
And it had to go live on 1st May 2012.
At the BBC we generally rely upon existing infrastructure, procedures, and operational support. This was to be a project which didn't come with any of that out of the box: we had to build it from the ground up, and with a very modest budget.
We had to figure out not just how we could get this thing up and running, but how we could do it - and document it - well enough that unskilled yet motivated people who'd never been anywhere near a traditional broadcasting operation would be comfortable running it, allowing arts organisations to play a part in an emerging digital cultural space.
My answer to the question was to build a website targeting what the BBC tends to call these "four screens", but with room for some native applications to augment it.
I didn't plan on actually doing it, though. I was just giving him an opinion. I figured that this would be one of a range of options considered and, given the stretched resources, it would end up being outsourced to a video-on-demand specialist to rapidly put together and operate.
That didn't happen. It turned out that in that brief question/answer exchange, the phrase "'pop-up' television channel" didn't really capture the breadth and depth of what was going to be attempted. This wasn't just going to be another video-on-demand service: it needed to be inherently flexible and able to take a range of media along with specialist propositions from cradle to screen. The ultimate objective wasn't "to run another BBC service", but to have a toolkit containing a "broadcaster in a box". We weren't out to create the next BBC channel, we were capturing the essence of what a broadcaster is and does.
A group of us sat in a small meeting room on the seventh floor of Television Centre to discuss the options we had. We'd spoken to video-on-demand specialist suppliers, and at that time the combination of timescales, required platform flexibility and potential audience reach meant that this route just wasn't feasible. If we weren't going to use an existing platform, somebody would have to build one - and at that point I still didn't think it would be us.
As we reached a quiet moment in the meeting, with everyone in the room considering our limited options, Jake Berger bit the bullet: "why don't we just do it ourselves?"
Another tricky pause, before Tony asked me, "so… if we were to do it your way, how would it work?"
And so, in that room, I sketched out how I would go about building what came to be named "The Space". It went a little something like this:
Take a well-known easy-to-use open source content management system (WordPress), and put it onto Linux machines firewalled to the hilt. Add plugins to it in order to generate a completely static version of the site, which is then sent over to public-facing Apache web servers or a CDN (content delivery network), and build a set of templates to present the different kinds of media involved.
So long as the CMS is extensible, we shouldn't have too much trouble storing all of the required metadata and relationships. Because the public-facing servers are dealing solely with static resources, making it scale cost-effectively is relatively easy, and because it's not built on any BBC-specific systems, it can be hosted anywhere and subsequently handed over when "our part" of the project ends. And there, in the space (no pun intended) of a few minutes, I'd sketched out the basis of how The Space would - and now does - work.
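The publish step at the heart of that sketch can be illustrated in a few lines. This is a minimal, hypothetical rendering of the idea - the page records, template and paths here are invented for illustration, not what The Space's WordPress plugins actually produce:

```python
import os
import tempfile

# Hypothetical page records, as they might come out of the CMS;
# in the real system these would be WordPress posts plus custom metadata.
PAGES = [
    {"path": "index.html", "title": "The Space", "body": "<p>Welcome</p>"},
    {"path": "events/opening-night/index.html", "title": "Opening Night",
     "body": "<p>Live from 1 May</p>"},
]

TEMPLATE = """<!DOCTYPE html>
<html><head><title>{title}</title></head>
<body>{body}</body></html>"""


def publish_static_site(pages, out_dir):
    """Render each CMS record to a static HTML file under out_dir.

    The resulting directory tree is what would then be pushed out to
    the public-facing Apache servers or a CDN origin.
    """
    written = []
    for page in pages:
        target = os.path.join(out_dir, page["path"])
        os.makedirs(os.path.dirname(target) or out_dir, exist_ok=True)
        with open(target, "w", encoding="utf-8") as fh:
            fh.write(TEMPLATE.format(title=page["title"], body=page["body"]))
        written.append(target)
    return written


if __name__ == "__main__":
    site_root = tempfile.mkdtemp(prefix="thespace-static-")
    files = publish_static_site(PAGES, site_root)
    print(f"{len(files)} static pages written under {site_root}")
```

Because the output is nothing but files on disk, the hand-off to the public servers can be as dumb as an rsync - which is precisely what makes the front end cheap to scale.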
Jake, Dirk-Willem van Gulik and I were asked to come up with some numbers and a plan that was a bit more tangible than my handwaving in the meeting room. Between us, we came up with estimates of which kinds of servers we'd need, when (and for how long), how we'd handle the design and markup of the templates, the video encoding, the technical operations, and committed our planned architecture to paper.
From my rather vague idea, Jake added his own project management expertise and Dirk his in-depth knowledge of BBC operational processes and we ended up with something we believed to be realistic and achievable, and that we could send on to the project board for consideration. A little later on, we tasked Jon Stuart, drawing on his experience with the BBC's audience-facing online systems, to refine the numbers and figure out ways by which we could deploy and manage the proposition.
We produced a small working model which gave us an indication of cost against projected audience size, the number of live events and the anticipated hours scheduled. Tony then brought in Vibeke Hansen, who had designed the original iPlayer with him, to come up with the look and feel of The Space. After some very quick work by Vibeke, Caroline Smith and Nick Clement we had a design we could let our client-side developers loose on. Paul Coghlan and Stephen Collings went to work building and testing templates across a range of connected TVs, mobile devices and desktop browsers which would put the Blue Room to shame, while Aaron Dey, Steve Allen, Robert Gummeson and Dagmara Kodlubanski started putting together an ingest and transcoding chain. Meanwhile, we had the help and advice of Brandon Butterworth and colleagues in BBC Research and Development, particularly around live streaming, and Alex Russell from BBC Distribution to help us with our Freeview HD interface.
One of the dirty secrets of broadcast engineering is that you always assume that things will go wrong - because, in reality, no matter what you do, they always do. That's why everything at the BBC has five layers of backups and contingency planning… but having that degree of fall-back also requires the twenty-four-hours-a-day seven-days-a-week operational support of a major broadcaster. The Space certainly wouldn't have that. It needed to be able to stand alone.
We solved this by doing two things.
First, we captured the essence of our existing online practice and experience, by liberally (and in some cases literally) copying a lot of the provisioning and deployment scripts, approaches and processes that keep the BBC on the Internet. At the same time we have also automated (and now documented) a lot of the knowledge needed to run all of that.
Second, we took the opportunity to refine and rebase our approach. The BBC has been on the Web for quite a while now, and over time its legacy infrastructure has grown, and grown… and grown. Now, this isn't only about hardware - there aren't huge city-sized data-centres dedicated to keeping the BBC online - but anybody who's gone digging through bbc.co.uk will know that it sometimes feels like cutting through an old tree and counting the rings. The older pages were developed to entirely different processes - and are still hosted today using quite different setups from those of the present systems.
As an early mover, the BBC has invented a lot of complex technology and approaches. The most successful of these are now appearing in hosted services or with cloud providers, with features and refinements which we often find we're unable to adopt ourselves. In short: we were able to cherry-pick the best parts of BBC online infrastructure and get rid of the bits we didn't need, and properly explore how something which technologically looks not a million miles away from the stuff powering bbc.co.uk today can be operated and managed in an off-the-shelf cloud hosting environment.
While we did this, Bill Thomson and Simon Still kept a constant feedback loop going with those commissioning work and mentoring the commissioned artists.
Today, we have a small back-office system which lets us create and transcode content (for all four screens and a multitude of browsers) and send it off to cloud-based storage acting as a CDN origin. The content management system allows arts organisations themselves to submit metadata for their pieces, and for The Space's editorial team to review, edit, arrange and publish it.
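Transcoding one master file into a rendition per screen is, at its core, a small encoding ladder. A sketch of the idea follows - the sizes, bitrates and codec choices here are illustrative assumptions, not the actual profiles our ingest chain uses, and nothing is executed, it just builds the command lines:

```python
# Illustrative encoding ladder for the "four screens"; the real
# profiles, codecs and bitrates used by The Space may well differ.
RENDITIONS = {
    "connected-tv": {"size": "1280x720", "vbitrate": "3000k"},
    "desktop":      {"size": "1024x576", "vbitrate": "1500k"},
    "tablet":       {"size": "768x432",  "vbitrate": "800k"},
    "mobile":       {"size": "480x270",  "vbitrate": "400k"},
}


def ffmpeg_commands(source, dest_prefix):
    """Build one ffmpeg invocation per target screen.

    Returns a list of argv lists suitable for handing to a process
    runner; nothing is executed here, so the sketch stays side-effect
    free.
    """
    commands = []
    for screen, profile in RENDITIONS.items():
        commands.append([
            "ffmpeg", "-i", source,
            "-s", profile["size"],
            "-b:v", profile["vbitrate"],
            "-c:v", "libx264", "-c:a", "aac",
            f"{dest_prefix}-{screen}.mp4",
        ])
    return commands
```

Once the renditions exist, pushing them to the cloud-based storage that acts as the CDN origin is just another upload step in the same chain.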
Yet even after such a breakneck journey, this is still just the beginning (and I have to keep reminding myself of that): The Space is operational within the BBC until October, at which point the real hand-over happens, and then in many ways it's down to Arts Council England to continue operations if they wish.
In the next few weeks, I'll be following up on this post with a more detailed and technical look at how we customised WordPress to handle a content model not a million miles from /programmes, how we bent Apache to our will, and - of course - whether our approach paid off.
Meanwhile, you can visit The Space on your desktop, tablet, mobile, or connected TV with a web browser by visiting http://thespace.org. If you have a compatible Internet-connected Freeview HD television or set-top box, you can access our MHEG service - powered by the same content management system, and developed by specialists S&T - on channel 117 (you may need to re-scan to pick up the new channel).
Mo McRoberts is a Data Analyst in the Digital Public Space project and Technical Lead of The Space. Tony Ageh has also blogged about this partnership with Arts Council England.