Posted by Richard Salmon
In November 1967, a seminal report was published by BBC Research Department (as it then was). Digital methods in television by George Monteath and Victor Devereux set out how one might go about digitising a television signal, and the challenges, advantages and reasons for doing so.
Throughout industry there is a growing tendency to eliminate routine tasks by the use of more sophisticated equipment, advance in this direction being accelerated by a continuous improvement in the reliability of electronic devices together with a reduction in their cost.
Since broadcasting, and particularly television, is based entirely on electronic devices, it might be expected to be in the forefront of the advance towards automatic methods and the elimination of routine work, but in fact broadcasting seems rather backward in this respect. Engineers, who must be sufficiently well-qualified to recognise and deal with difficult problems, spend far too much of their time performing menial tasks - for example, setting up equipment. This work demands unremitting care if the quality of the transmitted pictures is to be maintained.
To say this report was ahead of its time sounds like a cliché, but in this case it was clearly true. The report concluded that the introduction of such digital techniques warranted further investigation, but noted that the speed and capacity of the components required for television applications had not yet been reached, though they soon would be.
It first considered the required sampling rate, and suggested that “about 13 MHz would be used”, when considering monochrome or colour component (Red, Green and Blue) signals. It suggested that composite colour signals might be encoded digitally as well, and indeed, whilst initial digital encoding was done in this way, it was to be nearly 20 years before digital component systems became commonplace, with 13.5 MHz as the actual sampling frequency.
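The eventual choice of 13.5 MHz, later standardised in ITU-R BT.601, was no accident: it is an exact integer multiple of the line frequency of both the 625/50 and 525/60 television systems, so the same sampling rate yields a whole number of samples per line in either standard. A quick check (a sketch, not from the original report) confirms the arithmetic:

```python
from fractions import Fraction

# Line frequencies of the two main analogue standards:
# 625/50 systems: exactly 15 625 Hz
# 525/60 systems: 4.5 MHz / 286, approximately 15 734.27 Hz
line_625 = Fraction(15_625)
line_525 = Fraction(4_500_000, 286)

fs = Fraction(13_500_000)  # the Rec. 601 luma sampling rate

# Both divisions come out as exact integers:
print(fs / line_625)  # 864 samples per line (625-line system)
print(fs / line_525)  # 858 samples per line (525-line system)
```

This line-locked property is what made a single worldwide component sampling frequency practical, where the 13 MHz suggested in 1967 would not have divided evenly into either line rate.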
Again the work done on required bit-depth was prescient. Studies concluded that 7 or possibly 8 bits per sample would be required to prevent the visibility of contours, and that “it might prove desirable to work with more than 8 bits per sample during certain processes to avoid rounding-off errors.” 8 bits per component for distribution to the home, and the use of 10 bits within the production environment, has held good even with the introduction of High Definition TV over the last decade. It is only now, as we think about Ultra-High Definition (UHD) and High Dynamic Range (HDR) television, that we find an advantage in moving beyond 8 bits outside the production environment.
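The contouring the report worried about is easy to demonstrate: quantising a smooth luminance ramp to too few bits leaves visible steps, while each extra bit halves the maximum quantisation error. The sketch below (my illustration, not code from the report) quantises a ramp at several bit depths:

```python
import numpy as np

def quantise(signal, bits):
    """Quantise a signal in the range 0..1 to 2**bits uniform levels."""
    levels = (1 << bits) - 1          # e.g. 255 steps for 8 bits
    return np.round(signal * levels) / levels

# A smooth horizontal luminance ramp across a 1920-sample line
ramp = np.linspace(0.0, 1.0, 1920)

for bits in (6, 8, 10):
    q = quantise(ramp, bits)
    distinct = len(np.unique(q))       # how many "contour" steps survive
    max_err = np.max(np.abs(q - ramp)) # worst-case rounding error
    print(f"{bits} bits: {distinct} distinct levels, max error {max_err:.5f}")
```

At 6 bits the ramp collapses to 64 visible bands; at 8 bits the 256 levels are fine enough for display, but repeated processing rounds at every stage, which is exactly why the report's suggestion of working with more than 8 bits during production (10 bits in practice) proved sound.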
The carriage of the digital signals in parallel and (when transmission and processing speeds allowed) serial form within and between equipment was explained, and indeed over the following 20 years was the journey taken by developments.
The authors acknowledged that the idea of using digital techniques for the communication of television signals was not entirely new. What they foresaw however was the gradual digitisation of certain pieces of television production equipment to take advantage of digital processing techniques. The question was asked as to whether “a general purpose digital computer” might help, and the conclusion was that they were far too slow to be useful in the shorter term, so dedicated hardware would, for a time, generally be needed. 40 years later, BBC R&D became a pioneer of introducing commodity computer hardware into video production processes, and that development is by no means complete!
The use of digital storage to provide line and field stores was considered likely to replace the use of solid-state delay lines, enabling digital techniques to be used for filtering and equalisation, and standards conversion (work on which started in 1970).
The report foresaw the likely introduction of digital techniques initially within the more complex pieces of equipment as (what were later to become known as) digital islands connected by analogue lines, and indeed that was the way in which the broadcast industry developed. Very wisely, the authors saw that there were huge developments of digital techniques ahead in far larger industries than our own specialism, and just as we do today, advised not competing with such industries, but developing techniques which took advantage of developments made elsewhere.
Digital video recording on magnetic tape was considered advantageous in terms of preserving image quality, and in 1974 BBC Research demonstrated the world’s first such recorder at IBC in Brighton. One of the first projects I worked on, when I joined the BBC 30 years ago, involved testing the first commercially produced digital video recorder, the D1, which had emerged from earlier collaborative work on the topic with Sony. Vic Devereux was one of the senior engineers in the section in which I was working in 1988, and, ever modest, never showed how proud I’m sure he must have felt at seeing so much of his far-sighted research from 20 years earlier coming to pass.