UBC's copy, or at least the copy in my hands, comes from the library of Dr. J. S. Milsum, who seems like an interesting guy in his own right.
Given a choice of which numbers of The Engineer and Engineering I was going to epitomise last week, I ended up choosing the ones that didn't have reviews of James, Nichols and Phillips, Theory of Servomechanisms (Internet Archive entry), which came out, as Volume 25 of the MIT Radiation Laboratory Series, in June of 1948. The review is in the 18 June issue of Engineering, which means that it's not up as a pdf yet at Grace's Guide, so you'll have to trust me that it's quite a nice notice.
You will have noted that all three of the British technical periodicals I follow have notices about the upcoming summer schools on the theory of servomechanisms, to be held around the North. (No comment needed.) It has now been a year since the Institution of Electrical Engineers' special session on "automatic regulators and servo mechanisms," which was held in May of 1947. As far as I can put a finger on it, it would be the publication of the proceedings of that conference in the Journal of the Institution of Electrical Engineers, as much as anything, that inspired the summer schools. These are huge developments in the history of information technology; the problem is that our received history of same is so shallow and episodic that I might be the lone voice in the wilderness of the history of science who is even aware that the conference happened. I don't blame the profession. It's just too small to cover this enormous subject. I do blame the people making policy based on bad history, but I say that every week, and I haven't changed the world yet.
Also to be caught up on here are the intersection of race and natural disaster at Vanport, Oregon, and the Berlin Blockade. The latter is pretty closely connected to the history of servomechanisms via the problems of air navigation, while the former . . . is not.
Black soldiers on flood control at Vanport, 1948, from the files of the Oregon History Project
Vanport, the Oregon History Project says, was built by Henry Kaiser to house the people flocking to the Vancouver, Washington-Portland area to work in his burgeoning shipyards. Frustrated by a real estate "code of practice" that restricted Black housing to a tiny area in Portland, he bought some meadow land outside the city and beyond the city's control, and erected a ramshackle housing project on it. A little less than ten thousand housing units were provided, in fourteen-unit apartment buildings, two-story blocks with one-story wings, of wood on wood foundations. The levees consisted of the railbed of the Northern Pacific. The community was informally segregated and never majority Black, with about 6000 Black residents in a peak population of 40,000, but in the wake of the flood, 25% of the newly homeless from a community of 18,000 were Black.
I reproduce this map of Vanport from Michael McGregor's short article over at the Oregon History Project, which notes that Portland remained a heavily segregated city into the 1950s, in spite of a revision of the real estate code. The Vanporters who remained in Oregon did so by moving into the existing Black community of Albina. I've also confirmed that the story went uncovered in Time. I suspect that the story gets a great deal more sinister when examined more closely. The flood cannot have been entirely unexpected. Henry Kaiser appears to have evaded any consequences, so I assume that at some point he was able to unload the project on the Federal Government.
In the end, the story of Vanport, secreted behind its inadequate walls, was one of an unwanted imposition on a community that probably even the burghers of Albina did not welcome. Its erasure by the rains of the spring of 1948 is a human outrage, but contemporaries' refusal to be outraged is telling.
The blockade of Berlin, beginning less than two months later, is another matter. I can't say that I've done a heroic job of researching the secondary literature, but Daniel Harrington's Berlin on the Brink is a 2012 book with a shiny cover that stands out in the DD881 range amongst books about many other aspects of Berlin's long and complicated history.
. . . Okay, it's mainly Cabaret and the Wall, but Harrington has been working in the field for a generation, and he's perfectly right that the Airlift literature is only less enormous than other, very large literatures. Harrington takes issue with much of that literature, arguing that an implicitly political science-inspired literature is far too concerned with overarching theories of decision-making to grapple with the improvised nature of the airlift, which developed gradually out of Washington's desperate desire to avoid escalation ahead of the November Presidential election. It would be hard to argue that the Airlift was over the hump (see what I did there?), but things were still held sufficiently in suspense for the newly-reelected President to hold off from anything drastic while the world waited for what the winter would bring.
As it happens, it was a mild winter, in which the airlift went from strength to strength. By May of 1949, it had become an obvious Allied victory in the emerging Cold War. As a pragmatic reality, it was mainly a victory for hard currencies. The occupation mark (from March 1949, the Deutschmark) pried open the blockade, and would have destroyed the Group of Soviet Forces in Germany had it not leaked.
That's an interesting lesson, even if the historiography doesn't grapple with it adequately, but it doesn't change the fact that the airlift was also a technical victory for airpower, and specifically a victory in a battle of Berlin. Recently defeated in a bombing offensive by Berlin's terrible flying weather, the RAF was, as junior partner, now victor in an even more technically-demanding effort, eventually flying more than 6000 tons of air cargo into the city every day, in planes that hardly exceeded a disposable lift of 10 tons each. Six hundred landings a day, on two-and-a-half going on three airfields, is something. When we realise that, on 24 June, none of those fields had a finished concrete runway, it gets even more amazing. The airlift would have been impossible without the almost-complete concrete runway at RAF Gatow; let that stand as a condemnation of the ramshackle nature of the Nazi regime, which neglected, for all its airmindedness, to provide its capital city with a hard-surfaced landing field in twelve years of rule.
In terms of vintage tin, the airlift began with Dakotas, well-suited to half-assed PSP fields and to the radio ranges and D/F beacons that were the USAF-in-Europe's main navigational aids in the spring of 1945, ill-suited to cargo lift or to a city with worse flying weather than Pittsburgh, notoriously the worst in the continental United States. (I suspect some exaggeration . . .) The airlift continued with the underrated C-54 Skymaster, somewhat hampered by an inadequate supply of aircraft. "Only" 1200 C-54s were built, compared with more than 10,000 DC-3s/C-47s/Dakotas, and the Military Air Transport Service could not support the occupation of Japan without them, although it pitched its attempts to hold C-54s back from the airlift, which really is something that happened, in terms of its responsibility to deploy atomic bombs to operational commands in event of war.

The Handley Page Hastings might have had an old-fashioned tail wheel undercarriage, but it could lift 10 tons. Too bad those sleeve valves needed to be reground by special machine tools at each overhaul. Mike Freer - Touchdown-aviation - Gallery page http://www.airliners.net/photo/UK---Air/Handley-Page-HP-67/0909783/L Photo http://cdn-www.airliners.net/aviation-photos/photos/3/8/7/0909783.jpg
In the end, the Berlin Airlift was held to under 250 Skymasters. The key to beating the lift's tonnage targets lay in turning the fleet around and achieving 2.5 flights per plane, per day. It should be noted that Don Bennett achieved that sortie rate all by his lonesome, flying a Tudor V (possibly G-AKBY Star Girl); but he did so by flying at night, the half or more of the day that the Airlift left less than fully utilised because, while it was an all-out effort of the Free World, it was an all-out effort within a limited budget. It was not until the last two months of the Airlift that BEA executives were brought in to coordinate the efforts of the various subcontracting charter airlines, whose contributions, save for Alan Cobham's flying tankers, had been rather disappointing up to that point. As for employing airline aircraft or highly experienced airline pilots, well, this is only an existential battle for Western Civilisation. What do you think we are? Made of money?
Apart from observing that it was the airliftable, highly valuable, lightweight products of the Siemens and AEG factories that kept West Berlin's economy afloat (apart from the much larger volume of exports to East Berlin and surrounding Brandenburg through the blockade), that's enough said about the Airlift for now. It's got almost a year to run in postblogging, and I've no doubt that it will get boring before the end. That anecdote, however, I cannot resist. The age of electronics really is at hand!
The Airlift started out as a story of 1930s-era navigational aids, albeit supplemented by Ground Controlled Approach, although Harrington notes that RAF Transport Command pilots could rely on GEE and Eureka, at least in the Yorks and, subsequently, the Hastings, though not in the C-47s, with their tiny cockpits. Things changed in November with the arrival of an AN/CPS-5, an L-band (30 to 15cm, for those wanting a familiar reference) search radar developed at GE from 1945. Controllers found it magical, at least once a way of filtering out fixed-structure returns was introduced, and used it to enforce a three-minute interval between planes, which was quite the thing in low visibility. Now that's how you run a railroad!
Which brings me to Theory of Servomechanisms, and if you think I've been procrastinating because it is a difficult and highly mathematical text, whose importance is hard to explain, well, I have a confession to make, and one more procrastinating manoeuvre to pull off.
A literal screenshot from Simon Lavington, Moving Targets: Elliott-Automation and the Dawn of the Computer Age. UBC hasn't collected a physical copy, and the email-to-self option is disabled, so. . .
This is the guts of the Elliott 152 computer, developed at the Elliott experimental works at Borehamwood, nestled in the heart of London's movie studio district, on the grounds of an old silk mill. A digital, stored-program computer with a cathode-ray tube memory of 256 double bits, it was developed under the direction of John Coales as the calculating element of the "MRS5" specification for a medium-range naval antiaircraft gunnery system, and the memory was needed to deal with clutter. The Admiralty decided that it was a nightmare of electronic unreliability (the circuits had to be retraced with a punch and soldering gun every minute or so), but paid up for a single working machine as the guts of a tracking radar for high-speed experimental drones. Exhaustive work on the MRS5 began in 1945, and Coales' proposal for a digital solution had been accepted in the first flush of enthusiasm for ENIAC in 1947.
It is, therefore, like the AN/CPS-5, a pre-Theory of Servomechanisms conception that would fall far behind the public understanding of what automation and computers could do, when the dry-as-dust work of the servomechanism community was overtaken by the now-imminent publication of Wiener's Cybernetics. (A frustrating web search for the exact date of the first imprint turns up reviews from the spring of 1949, so I expect I'll be postblogging about it in a few months.)
On the other hand, Wiener's Cybernetics is based on his work on the Mk 56, Anglicised as the "interim" MRS3, which the MRS5 was to replace in due course. As so often in the history of computing, public expectations of the possibilities of computing ran vastly ahead of reality, perhaps because official expectations were equally unreal --the Admiralty was also working on "Long Range Solution 1," an anti-aircraft missile that could not be realistically expected before the MRS5 was available to tell scientists what happened when high-speed drones were fired, apart from, "It went that way!"
I was on about control theory last time. James et al. emphasise that the theory of servomechanisms has two mathematical aspects. The first is, as Engineering puts it, the "analytical theory of harmonics"--tackling the simple harmonic motion ordinary differential equation, in other words. That's your control theory.
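For the curious, this is the equation in question for a simple positional servo, written in the most common modern notation (mine, not the book's or the magazine's):

```latex
% Error-driven positional servo: the output shaft angle \theta_o chases the
% input angle \theta_i, with inertia J, viscous damping f, and a restoring
% torque proportional (gain K) to the error between them.
J\,\ddot{\theta}_o + f\,\dot{\theta}_o + K\,\theta_o = K\,\theta_i
% Writing \omega_n^2 = K/J and 2\zeta\omega_n = f/J puts it in damped
% simple-harmonic form, whose overshoot and ringing are read straight off
% the damping ratio \zeta:
\ddot{\theta}_o + 2\zeta\omega_n\,\dot{\theta}_o + \omega_n^2\,\theta_o = \omega_n^2\,\theta_i
```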
The other is the statistical method, in which a servomechanism is treated as a species of filter, with the problem being one of measuring and minimising the error between input and output signal. This might seem irrelevant, but it really, really isn't. Modern digital computing takes it for granted that the high frequency electrical impulses transmitted through our little devices can be treated as error-free, but that's because our ancestors solved this problem. You'll see below that I don't trust myself to give a solid, accurate account of how they solved it, but we're not going to understand the history of computing without engaging the problem, and I don't see anyone else lining up to give no doubt hopelessly misleading layman's explanations!
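To make the filter picture concrete, here is a toy sketch of my own, nothing from the book: the input signal, the noise level, and the crude first-order smoother standing in for the "servo" are all invented, and the score is the mean-square error between what comes out and what should have come out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy version of the "servo as filter" view: the true input is a slow ramp
# (a target moving at a constant rate), the measured input is that ramp plus
# noise, and the "servo" is a first-order exponential smoother. The figure
# of merit is the mean-square error between output and truth.
t = np.arange(0.0, 10.0, 0.01)
true_signal = 2.0 * t                                 # degrees, say
measured = true_signal + rng.normal(0.0, 1.0, t.size)

def smooth(signal, alpha):
    """First-order lag: heavier smoothing (small alpha) rejects more noise
    but lags further behind a moving input."""
    out = np.empty_like(signal)
    out[0] = signal[0]
    for i in range(1, signal.size):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

for alpha in (0.02, 0.1, 0.5):
    mse = np.mean((smooth(measured, alpha) - true_signal) ** 2)
    print(f"alpha = {alpha:4}: mean-square error = {mse:6.3f}")
```

Heavy smoothing rejects the noise but lags a moving input; light smoothing follows faithfully but passes the noise. Picking the best compromise, given a statistical description of both signal and noise, is exactly the problem the statistical treatment is aimed at.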
The key method upon which James and his colleagues rely is the touchingly old-fashioned Nyquist stability criterion, the kind of thing that old-time engineers like, because they can sketch it.
(Once all the theory is done, the math isn't too perplexing, either.)
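Here is roughly what "sketching it" amounts to, in a numerical doodle of my own; the loop transfer function and the numbers are invented for illustration, not taken from the book.

```python
import numpy as np

# A numerical doodle of the Nyquist idea (transfer function and numbers
# invented): trace the open-loop frequency response G(jw) of a servo,
# K / (jw (1 + jw*T1)(1 + jw*T2)), find where the locus crosses the
# negative real axis, and ask whether the crossing lies inside -1.
K, T1, T2 = 10.0, 0.1, 0.05        # assumed loop gain and time constants
w = np.logspace(-2, 4, 20_000)     # frequency sweep, rad/s
s = 1j * w
G = K / (s * (1 + s * T1) * (1 + s * T2))

phase = np.unwrap(np.angle(G))     # runs from about -90 to -270 degrees
crossings = np.where(np.diff(np.sign(phase + np.pi)) != 0)[0]
if crossings.size:
    x = G[crossings[0]].real       # real part at the -180 degree point
    verdict = "unstable" if x < -1 else f"stable, gain margin about {-1 / x:.1f}"
    print(f"locus crosses the real axis at {x:.2f}: {verdict}")
else:
    print("no -180 degree crossing: stable at any gain of this shape")
```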
Skipping through many, many pages of math, let's look at Ralph Phillips' treatment of the "well-known example of an automatic tracking radar," in "The Statistical Properties of Real-Time Data" (262--307). An automatic tracking radar works by looking at a radar signal from an object, rotating a specified distance, comparing what it sees with what it expects to see, and using the difference between real and predicted signal to adjust the action of the servomechanism moving the antenna. The difference here is the "error signal," and if the error signal falls towards zero, we can say that the tracking problem is converging towards a solution, or alternatively, looking at the motion of the servo system, which is steady or dying away, speak of a "stable system."
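A bare-bones caricature of that loop, with made-up numbers rather than Phillips' model: the target drifts at a steady angular rate, the antenna is nudged by a fraction of the error each scan, and the error signal settles down to a small steady lag instead of running away.

```python
import numpy as np

# Caricature of an automatic tracking loop (invented numbers, not Phillips'
# model): each scan the servo turns the antenna by a fixed fraction (the
# "loop gain") of the error between target bearing and antenna bearing.
rate = 0.5                   # target angular rate, degrees per scan
gain = 0.3                   # fraction of the error corrected each scan
target, antenna = 0.0, 5.0   # start with a 5-degree pointing error

for scan in range(25):
    error = target - antenna          # the "error signal"
    antenna += gain * error           # servo acts on the error
    target += rate                    # target keeps moving
    if scan % 5 == 0:
        print(f"scan {scan:2d}: error = {error:6.2f} deg")

# With 0 < gain < 2 the loop is stable; the error settles to the steady
# lag rate/gain rather than to zero, the classic "velocity lag".
```

Crank the gain past 2 in this toy and the error grows scan by scan: that is the instability the Nyquist sketch is there to rule out before you build the thing.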
Although we are talking about a statistical model of a real world system, we can see that "error" means something quite different from the familiar data-sampling methods used in the humanities. The error signal is data. Confusing!
Since the radar doesn't know the target's course, all received data is sampled as an input, or, rather, a range of inputs, of variable intensity and phase angle. We can then speak of a "spectral density," which is a probability density of predicted possible future locations, and solve statistically for the centre of the density distribution. I won't trouble you with the mathematics of this, as it would ask me to understand it, but the result is an "autocorrelation" that gives the probability density of the target's angular velocity, and hence of its future location. In an antiaircraft control system, this is good enough to spray proximity-fuzed shells at the target, at least until you run out of ammunition, Admiral Cunningham. (To be fair, Cunningham managed to unload responsibility on Vice-Admiral E. L. King, so it's all good. Beevor has a readable account.)
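Here is a toy version of the payoff, well short of Phillips' actual autocorrelation machinery: fit an angular rate to noisy bearings and look at how tightly the predicted future bearing clusters. All the numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy version of the prediction problem (not Phillips' treatment): a target
# crosses at a constant angular rate, the radar reports noisy bearings, and
# we ask how widely the predicted bearing a few seconds ahead is scattered
# -- the "density of possible future locations" the gunnery problem needs.
true_rate = 3.0        # degrees per second
dt, n_obs = 0.1, 50    # ten bearings a second for five seconds
lead_time = 3.0        # predict three seconds ahead (a shell's flight time)

predictions = []
for run in range(2000):
    t = np.arange(n_obs) * dt
    bearings = true_rate * t + rng.normal(0.0, 0.5, n_obs)  # noisy data
    rate, intercept = np.polyfit(t, bearings, 1)            # fit the rate
    predictions.append(intercept + rate * (t[-1] + lead_time))

predictions = np.array(predictions)
truth = true_rate * (n_obs - 1) * dt + true_rate * lead_time
print(f"true future bearing {truth:.1f} deg, "
      f"predicted {predictions.mean():.1f} +/- {predictions.std():.1f} deg")
```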
In a 1947-era automatic tracking radar, all of this computing is done by analogue systems, mostly consisting of electrical circuits, which is a bit magical. Coales' 152 does it with arithmetic, which seems even more magical, but is actually not that hard if you're willing to do enough arithmetic, fast enough --the point of computers! The AN/CPS-5 has only a clutter filter, a much simpler problem, but the problem, and the solution, are of the same kind. In this case, clutter is detected by the fact that the error is zero --it's always where you expect it to be, because it doesn't move!
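The clutter trick is easy to caricature (this is the principle only, nothing to do with the AN/CPS-5's actual circuitry): subtract one scan's returns from the next, and whatever hasn't moved cancels itself out.

```python
import numpy as np

# Principle of a clutter filter, sketched: returns that sit in the same
# range bin scan after scan cancel when successive scans are subtracted;
# a moving echo does not.
n_bins = 20
scan_a = np.zeros(n_bins)
scan_b = np.zeros(n_bins)

scan_a[[3, 7, 12]] = 1.0          # fixed structures: hills, buildings
scan_b[[3, 7, 12]] = 1.0          # ...still there on the next scan
scan_a[15] = 1.0                  # aircraft at bin 15 on the first scan
scan_b[16] = 1.0                  # ...and at bin 16 on the next

difference = scan_b - scan_a      # "error" between expectation and reality
moving = np.nonzero(difference)[0]
print("bins showing change between scans:", moving)   # -> [15 16]
```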
It should finally be noted that you are not going to have reliable electronic computations, analogue or digital, until the output signal lies within an acceptable amplitude variation of the expected output of the specified operation on the input signal. This doesn't sound like it ought to be hard, but pushing a high-frequency signal through a specified inductance-resistance-capacitance circuit will only produce a bright, narrow output signal if the input signal isn't smeared. As we've seen, NPL is currently working on quartz [silicon dioxide] crystal-controlled high-frequency signal inputs for various calculating elements. This is pretty recursive. Once again, as we sidle up towards the semiconductor transistor, we encounter semiconductors already in use in the industry.