UK Broadcast Transmission



Being fairy-stories told to the author as a young engineer
© Ray Cooper, 2005 (2nd revised edition, Jan 2006)

Testing Times

Most large transmitting stations need test signal generators. At Sutton, signals were needed not only to test transmitters, but also as drive waveforms for caption scanners, monoscopes, film channels and so on.

The Waveform Generator originally provided in 1949 was the most fearsome beast. It stood nearly six feet high and occupied two bays of equipment - a Mighty Wurlitzer of an instrument, larger than the average sideboard. It is a solemn thought that, thirty years later, most of its functions could have been replicated by a sixteen-pin integrated circuit. But it was built on a heroic scale. It was almost certainly the only waveform generator ever provided for a transmitting site that ran on three-phase mains. The reasons for this were always rather obscure. The original Alexandra Palace waveform generator, which principally fed the studios, was also a three-phase beastie, but at least it had an excuse - its master oscillator was locked to mains frequency by a control system which involved a three-phase synchronous motor. In the Sutton contraption, phase locking was done all-electronically and there was no need for three-phase.

This phase-locked system was one of the very earliest, wasn't tremendously stable, but made up for that by being faintly entertaining. Once a shift, it had to be set manually to the middle of its control range. To assist with this, a small CRT screen was provided, displaying an interesting shape. Basically, this was a wide flat ellipse, drawn out by the 50 Hz mains frequency, superimposed on which was a rectangular pulse derived from the field drive output waveform of the generator, also at 50 Hz. When everything was set up properly, the pulse sat at the top centre of the ellipse, giving the display a very rudimentary resemblance to a submarine. If the conning tower was too far forwards, or too far back, a quick tweak on the master oscillator control knob would bring it back to centre - 'setting the sub'.

In addition to all the drive waveforms needed by the caption scanner (line and field drive; line, field and mixed syncs) there was also a range of outputs for the transmitter. 'Black and Syncs' - just plain mixed syncs - would keep the transmitter happy if the input signal disappeared. 'Sawtooth' - a rising ramp waveform - was indispensable for setting up the transmitter linearity. 'Flagpole' - a narrow white bar on a black background - gave its users the haziest of ideas about the pulse response of the transmitter. And, my personal favourite - 'Art Bars' (short for Artificial Bars) - a black St. George cross on a white background, the very first of all the electronically-generated tuning signals.

With the passing of years, rather more sophisticated signals became available for circuit testing. 'Sawtooth' was largely replaced by 'Staircase', a stepped waveform that was easier to analyse quantitatively. 'Flagpole' was replaced by 'Pulse and Bar', which at last gave some real information about frequency and phase response.

Unfortunately, although there were some perfectly good versions of these waveform generators available (the Post Office made their own, which were very satisfactory) the BBC insisted on going its own way and designing its own. The pulse and bar generator was not too bad, but the linearity generator, used to produce 'Staircase', must have ranked amongst the worst bits of test gear that I ever had to use. Not that, when working, it didn't work well: unfortunately, it was seldom working. Unlike the Post Office gear, which used delay lines to set up all of its internal timings, the BBC version used interconnected multivibrators - all done with valves, of course. The thing was just not stable. If it produced an output at all, succeeding lines of the waveform could well have different numbers of risers in the staircase: or occasionally the sync pulse would be half-way up the waveform. Fixing it took patience, (usually) a complete re-valve, and days.

The unit was, in the truest sense of the phrase, a complete waste of time. Later BBC versions of the gear used transistors and a completely different design philosophy, and worked well.

Having got some test signals, you now need measuring gear of some sort. There must have been some sort of test demodulator supplied with the 405-line transmitters, but it seemed not to have survived long. Certainly, after only a few years a BBC-designed test demodulator in the RC1 series was being used - all valve, of course, but its performance was pretty good - probably an order of magnitude better than the transmitter it was measuring. Other than that, test gear seems to have been pretty thin on the ground, in the early days.

FM radio measurements centred on a distinctly non-portable bit of gear supplied with the original installation, the 'Station Monitor' (believed to be by Marconi - it certainly didn't feel like S.T.&C. anyway). This was a bayful of equipment seven feet high, but it did enable you to measure almost anything. Apart from being a high-quality demodulator, it let you check carrier frequencies, carrier shift under modulation, FM and AM noise, and deviation. The deviation measurements were done with a Bessel Zero technique, which I'm not going to describe in these pages since it is highly mathematical and I might not understand it. But potentially it was a highly accurate technique, though difficult and subject to operator error.
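The author lets the maths off the hook, but the arithmetic at the heart of the Bessel Zero method is brief enough to sketch. Modulate the carrier with a single clean tone and wind the deviation up until the carrier component nulls out on a selective receiver: at that point the modulation index is pinned to a zero of the Bessel function J0. The snippet below is a minimal illustration of that relationship, not the Station Monitor's actual procedure:

```python
# First zeros of the Bessel function J0 (standard mathematical constants).
# The carrier component of an FM signal vanishes when the modulation
# index (peak deviation / tone frequency) hits one of these values.
J0_ZEROS = [2.40483, 5.52008, 8.65373]

def deviation_at_null(tone_hz: float, null_number: int = 1) -> float:
    """Peak deviation (Hz) at the nth carrier null for a given tone."""
    return J0_ZEROS[null_number - 1] * tone_hz

# Example: with a tone of about 31.2 kHz, the first carrier null falls
# at roughly 75 kHz - the standard peak deviation for FM broadcasting -
# so hearing the carrier disappear confirms the deviation setting.
print(round(deviation_at_null(31_200)))  # prints 75031
```

The accuracy comes from the fact that the tone frequency and the null condition can both be determined precisely; the operator error the author mentions creeps in through picking the wrong null or an impure modulating tone.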

Almost all fault-finding work was done using AVO multimeters and cathode-ray oscilloscopes. Of the latter, there were two: a BBC-designed and -built unit, the Type B, and an EMI waveform monitor (WM series), again probably supplied with the transmitters. The Type B was a general-purpose 'scope slanted towards TV usage, and had the benefit of a reasonably large (six-inch) screen. Its performance was adequate and it made no pretensions to accuracy.

The EMI unit was intended to be a bit of precision measuring gear, and to a large extent it succeeded in this. It had a quite distinctive appearance. It was trolley-mounted, in two units: the power supplies at the trolley base, and the display unit, which intriguingly was trunnion-mounted on top of the trolley in such a manner that it could be tilted to the optimum viewing position. Measurements were done with calibrated X- and Y-shifts: the Y-shift used a centre-zero cirscale meter movement displaying volts, but the X-shift, whilst similar in appearance, was in fact only a pointer mechanism travelling over graduated scales, coupled to the X-shift control via a belt system. The display tube was a 3.5-inch-diameter flat-faced type with good focus. It had a separated pair of graticules to enable measurements to be taken without parallax error, and these were illuminated by four lamps enclosed in bulbous extensions of the display bezel: the appearance was of a stylised letter 'X', and with the circular shift meters on either side, the whole front-plate simply screamed the word 'OXO' at you. Accordingly, the machine was known on site as the 'Oxometer' -

"oxo'meter n. Instrument for measuring bullshit [f. OE ox bovine animal; see -o-, - METER ]

- as the OED would have had it.

These instruments were adequate for 405-line use, but completely outclassed when 625 lines and later colour came along. Later instruments included a Tektronix of unknown provenance and a Marconi TF1277 'scope, both of which had vastly improved performance, but were still 'valve gear', meaning heavy power supplies, and so were still trolley-mounted. But by the 'seventies transistorised gear was becoming available, smaller and capable of being carried about (by one hand!). These were mostly from Tektronix, Hewlett-Packard and Philips.

Carrier frequency checking for TV was originally done by an intriguing bit of BBC gear, which used a standard-frequency transmission (Droitwich 200kHz) as a reference: via suitable multipliers, this drove the deflection systems of a small CRT display. The frequency to be measured brightness-modulated the display, to produce a series of dots, which would be stationary if there was no error, but would slowly chase themselves round a circuit on the display if there was one. By noting the speed and direction of creep, the error could be estimated. It was of most use, of course, for setting the frequency spot-on.
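The relationship between dot creep and frequency error is simple enough to sketch. On the assumption that the circular trace ran at a sub-multiple of the nominal carrier frequency, an exactly on-frequency carrier paints N stationary dots, and an error of delta Hz makes the whole pattern rotate at delta/N revolutions per second. This is my reconstruction of the arithmetic, not the BBC's design notes:

```python
# Reconstruction (assumed, not from BBC documentation) of the dot-creep
# arithmetic: N dots on the circular trace mean the measured carrier is
# nominally N times the sweep frequency, and a carrier error of delta Hz
# slips delta cycles per second past the trace - i.e. the pattern creeps
# round at delta / N revolutions per second.

def creep_rate(delta_hz: float, n_dots: int) -> float:
    """Pattern rotation in revolutions per second for a carrier error."""
    return delta_hz / n_dots

def error_from_creep(revs_per_second: float, n_dots: int) -> float:
    """Invert the observation: estimate the carrier error in Hz."""
    return revs_per_second * n_dots

# A pattern of 100 dots drifting one full revolution every 20 seconds
# would imply the carrier is about 5 Hz off nominal.
print(error_from_creep(1 / 20, 100))  # prints 5.0
```

This also explains why the display was at its best for setting the frequency spot-on: a stationary pattern is far easier to judge than a slow creep is to time.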

Much more impressive was a bit of genuine frequency-measuring gear that was provided towards the end of the 'fifties. This was by either H-P or Tektronix (and I can't remember which) and was again valve-operated, massive and trolley-mounted. It covered the range 0 to 100 MHz or thereabouts, and the display was by neon tubes illuminating transparent numbers from behind. There were columns of these lamps. When counting, the neon lamps chased one another about in a most diverting manner, and when the count stopped, the result could be read out. The device was known, of course, as the 'Fruit Machine'.

In the 'sixties, this device was used to measure not only Sutton's own carrier frequencies, but also those of other stations. This came about because somebody realised that the new Band I aerial array would, when not in use, make an excellent receiving aerial by virtue of its great height. The idea was to connect it to a sensitive communications receiver (an Eddystone was used) and then beat the remote signal to be measured with a local signal generator, the frequency of which was then measured with the Fruit Machine. Each Monday morning before 9 a.m., 'phone calls were made to ensure that no high-power TV stations were radiating (they would have trampled the fainter ones). Then the beat-frequency technique was applied. Because stations sharing the same channel had in fact different offsets (small divergences of the order of two-thirds of line frequency, about 6.7 kHz, made to minimise the visibility of co-channel interference), they could be sorted out by this technique and measured. A surprising number of distant stations were audible - Divis, in Northern Ireland, came romping in - and nearer low-power stations could also be measured. Results were then 'phoned to the stations concerned.
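The arithmetic of the beat-frequency method can be sketched as follows. The channel and generator figures below are illustrative assumptions, not a real survey record; only the 405-line line frequency (10,125 Hz) and the two-thirds-line-frequency offset step come from the text:

```python
# Sketch (assumed figures) of the Monday-morning measurement: beat the
# received carrier against a local signal generator, read the generator
# frequency on the counter, then identify co-channel stations by offset.

LINE_FREQ_405 = 10_125.0             # 405-line system line frequency, Hz
OFFSET_STEP = 2 * LINE_FREQ_405 / 3  # 6.75 kHz, the offset granularity

def carrier_frequency(gen_hz: float, beat_hz: float, gen_below: bool) -> float:
    """Received carrier from the counter reading and the audible beat note."""
    return gen_hz + beat_hz if gen_below else gen_hz - beat_hz

def offset_from_nominal(carrier_hz: float, nominal_hz: float) -> float:
    """Offset in units of two-thirds line frequency, to tell stations apart."""
    return (carrier_hz - nominal_hz) / OFFSET_STEP

# Illustrative: a generator tuned 10 kHz below the heard carrier reads
# 51.74675 MHz on the counter, putting the carrier 6.75 kHz above an
# (assumed) nominal vision carrier - i.e. the +1 offset station.
nominal = 51_750_000.0
measured = carrier_frequency(51_746_750.0, 10_000.0, gen_below=True)
print(round(offset_from_nominal(measured, nominal), 2))  # prints 1.0
```

Since co-channel stations sat at distinct offsets, each audible beat note could be attributed to one station, which is what made a single receiving aerial good enough to survey half the country.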

Things like spectrum analysers were completely absent from the equipment roster of earlier days. They were exceedingly expensive, heavy, delicate and easily damaged. They were also regarded (by Head Office) as unnecessary, so far as station staff were concerned. They were for the use of specialist departments only. Should unusual circumstances dictate the use of one, it would arrive from London, together with a trained operator, and afterwards be promptly returned. Station staff were not permitted to touch the thing.

Eventually, the proliferation of UHF relay sites maintained by stations like Sutton meant that the 'unnecessary' viewpoint had to be reversed. Additionally, analysers were getting smaller, cheaper and more robust. When our first, very own H-P analyser arrived, it felt almost like the paper bag had been removed from your head - folk wondered just how we had managed for all those years without one. For example, with an analyser in one hand (you could just about lift one without risk of rupture), a frequency counter slung over one shoulder, and a suitcase of leads and adaptors to counterbalance you in the other hand, you were completely equipped to performance-test a small TV relay site. Previously, a vanload of gear would have been needed.

