Wednesday, April 29, 2020

A Technological But Also Pure Science Appendix to Postblogging Technology, January, 1944, II: The Secret Life of Stars

I was not quite 19 when Ronald Reagan made his "Star Wars" speech on 23 March 1983, perfectly prepared by far too much libertarian science fiction to be convinced by his case that it was about time that scientists stopped working on MAD and started working on ABM, on defending civilisation against the atomic horror instead of exacerbating it. In its immediate wake, the argument against "Star Wars" that got the most attention seemed ludicrously pessimistic: computer scientists emerged to argue that it was just too hard to programme ABM guidance software.

It turns out, as we saw in February, that ABM wasn't something that people just thought of in 1983 after seeing Star Wars IV. The first ABM proposal was put on the table before WWII even ended, and involved shooting down V2s with antiaircraft guns. This week, it turns out that ABM efforts are intimately tied to the H-bomb debate. If the Communists are going to shoot down a large proportion of the atomic bombs aimed at them, the ones that get through should be corkers, and also much cheaper. 

This is why, as January comes to an end, President Truman is hearing conflicting advice from his experts. Some believe, correctly as it turns out, that the H-bomb will be significantly cheaper, in bangs for the buck, than atom bombs. This argument does not, however, get anywhere near as much public play as the "corker" argument. This is, admittedly, in part because the arguments are related: if only one in ten weapons gets through, better that Moscow be targeted with 10 H-bombs, so that one can level it, than with 100 atom bombs, etc.

For the scientists who object to this second argument, the objection is mostly to the grotesque barbarism of aiming to hit "Moscow" with a 10 megaton bomb, in order to destroy all Moscow-related strategic targets, but also, of course, well, Moscow. The fact that, in the background, there are some ghoulish rumblings about bombs of unlimited yield --the gigaton bomb-- underlines this. But there's a reason that Edward Teller is talking about a gigaton bomb that, it seems to me, gets obscured in the rearview mirror. Nuclear physicists had a lot to learn about atomic physics in 1950.

The January 1950 argument flows from the fact that Teller is still arguing for his "classical super" H-bomb. Others believe that the classical super is highly problematic. We have two accounts. One is that the difficulty is economic and industrial: there is not enough tritium for the classical super. The other is that the classical super is seen as physically impossible. It is here that we have to ask ourselves how so many physicists are ranging themselves against one of the greatest theoretical physicists of the day. The usual explanation is that Teller was a colossal, raging asshole. And this is true! Moreover, he is quite likely being driven by all kinds of motivated reasoning. But that doesn't mean that he can't defend the classical super, however weak his argument actually is! So what is going on, in January of 1950, months before the Teller-Ulam design surfaces? This is a very interesting question, because it invites us to peer through telescopes and into cloud chambers with Big Science, and investigate the way that "pure science" is wrapped up with big booms.
(All old time adventure movies set in Old LA have to have their climaxes at the Griffith Observatory. If I were the LAPD, I'd just stake the place out permanently, and arrest the master villains as they arrive.)

The proton-proton chain dominates in our sun. (Image by Borb, CC BY-SA 3.0.)
Teller's "classical super" is more gestured at than explained, but the idea was proposed, by Enrico Fermi, as early as 1941. At that point, the fission device envisioned would have been a uranium-235 "gun" device, and Fermi proposed placing it adjacent to a tank of liquid deuterium. The fireball temperature, at hundreds of millions of kelvin, would be propagated by high energy neutrons into the deuterium, causing a self-sustaining fusion "burn" as the deuterons combined to form helium-3, tritium, or, more rarely, helium-4, presumably based on the number of spare neutrons, but I'm not typing out all of those super- and subscripts.
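For the record, the reactions being gestured at here, written out in modern notation with modern energy-release values (a sketch, not Fermi's 1941 numbers), are:

```latex
\begin{align*}
\mathrm{D} + \mathrm{D} &\rightarrow {}^{3}\mathrm{He} + n + 3.27\ \mathrm{MeV} \\
\mathrm{D} + \mathrm{D} &\rightarrow \mathrm{T} + p + 4.03\ \mathrm{MeV} \\
\mathrm{D} + \mathrm{T} &\rightarrow {}^{4}\mathrm{He} + n + 17.6\ \mathrm{MeV}
\end{align*}
```

The two D-D branches occur with roughly equal probability; direct production of helium-4 from two deuterons is the rare channel. The D-T branch, with its outsized energy release, is why tritium matters so much to the rest of this story.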

Teller, we're told, argued that the deuterium burn would not be self-sustaining due to energy lost as "useless" photons rather than as ejected nuclear particles. However, Teller's collaborator, Emil Konopinski, showed that the addition of tritium to the mix would propagate the burn, raising temperatures in a tank of pure deuterium to the point where the deuterium burn would take place. Moving forward, historians of the programme launch into a discussion of computing, since demonstrating that the burns would happen required the numerical analysis of partial differential equations via the kind of tedious, iterative calculations for which the early mathematical computers were developed in the first place.
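To give a flavour of what those iterative calculations looked like, here is a toy cartoon in Python, with every number invented for illustration: an explicit finite-difference march of a one-dimensional temperature profile, with heat diffusing between cells, a radiative loss term, and a crude "burn" energy deposit wherever the fuel is above an ignition threshold. The question the cartoon asks is the one the real calculations asked: does the hot region propagate, or is it quenched by losses?

```python
# Toy illustration (NOT the Los Alamos calculation; all parameters are
# made up): march a 1-D temperature profile forward in time by explicit
# finite differences.  Heat diffuses between cells, every cell loses
# energy "radiatively," and any cell hotter than an ignition threshold
# receives a fixed "burn" energy deposit.

def step(T, dt, dx, kappa, loss, ignition, burn_yield):
    """Advance the temperature profile one explicit time step."""
    T_new = T[:]
    for i in range(1, len(T) - 1):        # boundaries held at T = 0
        diffusion = kappa * (T[i - 1] - 2 * T[i] + T[i + 1]) / dx ** 2
        burning = burn_yield if T[i] > ignition else 0.0
        T_new[i] = T[i] + dt * (diffusion - loss * T[i]) + burning
    return T_new

def run(loss, steps=400):
    n, dx, dt = 50, 1.0, 0.1              # dt*kappa/dx**2 <= 0.5: stable
    T = [0.0] * n
    for i in range(20, 30):               # hot "initiator" in the middle
        T[i] = 10.0
    for _ in range(steps):
        T = step(T, dt, dx, kappa=1.0, loss=loss, ignition=5.0,
                 burn_yield=0.8)
    return T

# With small losses the burn front spreads to the edges of the tank;
# with large losses the hot region cools below ignition and dies out.
front = run(loss=0.05)
dead = run(loss=2.0)
print("propagated:", max(front[:5]) > 5.0, "quenched:", max(dead) < 1.0)
```

The point of the exercise, then as now, is that the answer is buried in the competition between terms, and falls out only after thousands of arithmetic operations per run: exactly the work ENIAC-era machines were built to do.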

The results of these calculations did not become available until 1946, by which time Los Alamos had gone through a long detour as, first, plutonium became available as a much cheaper fission material, and, second, the plutonium gun was demonstrated to be impractical and the more complex and familiar symmetrical-compression device was substituted for it.

What we are not getting here is the plausibility argument, which, in the postblogging approach, we can afford to dawdle on for a bit. Teller (and, presumably, originally Fermi) were persuaded that this would work because they were aiming to produce a supernova on Earth. Or, actually, because this discussion is so early, perhaps a super-nova. The first "upper class novae" had only been identified fifteen or so years before, and while our buddies at Wikipedia say that "supernova" was accepted by 1938 on the basis of Baade and Zwicky's 1931 lectures, things are a bit more complicated. Zwicky and Baade proposed the existence of a "super-nova" based on 1885 observations of a supernova in the Andromeda galaxy, but the distance to Andromeda, and the very existence of galaxies, remained somewhat controversial in the 1920s. Baade and Zwicky took Ritchey's 1917 measurement of the distance to Andromeda, used it to estimate the energy released by the event, and concluded that it could only have been caused by a star's collapse into a neutron star --an entity for which more conventional minds had no time whatsoever.

Such an event, Zwicky believed, was the likeliest source of the incredibly energetic cosmic rays observed entering the Earth's atmosphere and showering the planet with nuclear particles and still more exotic phenomena. Zwicky, an eccentric even by the lofty standards of nuclear physicists, was an uncannily successful one, and married the daughter of a California state senator, who stepped up to secure the funding of the Palomar Observatory, and also, perhaps, his son-in-law's job. Meanwhile, and speaking of the outer reaches of eccentricity, a quite different perspective was developing at George Washington University, in Washington, D.C., where George Gamow and a young Edward Teller were pursuing their own, outlandish version of a now-accepted cosmological oddity --the Big Bang. The salience here is not modern science's cosmogony, or mostly not. Gamow invoked the impossibly hot and dense conditions of the first moments after the Big Bang to explain the existence of matter. According to Gamow, all matter in the modern universe was produced in these first moments by the condensation of a primordial cloud of neutrons. One after another, neutrons and protons accreted to form each element in the periodic table, in the same proportions as exist today. In other words, Gamow's intellectual tradition rejected Arthur Eddington's off-hand idea that the stuff of the modern universe had been produced by "nucleosynthesis" in stars. Indeed, Gamow developed an approximation for estimating the rates of nuclear reactions in stars, the Gamow Factor, which was the basis for Teller's estimates of the rates of nuclear reactions at stellar temperatures.
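In modern textbook notation (not Gamow's original), the Gamow factor is the probability of two charged nuclei tunnelling through their mutual Coulomb barrier, roughly:

```latex
\begin{equation*}
P(E) \sim \exp\!\left(-\sqrt{E_G/E}\right),
\qquad
E_G = 2 m_r c^2 \left(\pi \alpha Z_1 Z_2\right)^2 ,
\end{equation*}
```

where \(E\) is the collision energy, \(m_r\) the reduced mass of the two nuclei, \(\alpha\) the fine-structure constant, and \(Z_1, Z_2\) their charges. That exponential is why thermonuclear reaction rates are so ferociously sensitive to temperature, and why extrapolating from "stellar" to "bomb" conditions was such delicate business.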

None of this is to say that Teller (and Gamow) were wrong, or even not useful, for the proton-proton, deuterium, deuterium-tritium and other light-element fusion processes that take place in stars. They were, however, superseded by Hans Bethe, who developed Carl Friedrich von Weizsacker's earlier work into a description of the so-called "carbon-nitrogen-oxygen" cycle, an unbelievably complicated series of nucleosyntheses.
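For concreteness, the main branch of the cycle, in modern notation, runs:

```latex
\begin{align*}
{}^{12}\mathrm{C} + p &\rightarrow {}^{13}\mathrm{N} + \gamma \\
{}^{13}\mathrm{N} &\rightarrow {}^{13}\mathrm{C} + e^{+} + \nu_e \\
{}^{13}\mathrm{C} + p &\rightarrow {}^{14}\mathrm{N} + \gamma \\
{}^{14}\mathrm{N} + p &\rightarrow {}^{15}\mathrm{O} + \gamma \\
{}^{15}\mathrm{O} &\rightarrow {}^{15}\mathrm{N} + e^{+} + \nu_e \\
{}^{15}\mathrm{N} + p &\rightarrow {}^{12}\mathrm{C} + {}^{4}\mathrm{He}
\end{align*}
```

The net effect is that four protons become one helium-4 nucleus, plus positrons and neutrinos, with the carbon acting purely as a catalyst; it comes out the far end of the cycle just as it went in.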

You will have heard it told, phrased as a typical bit of eccentricity, that in advance of the Trinity test, someone was set to calculating whether it would set the atmosphere on fire. For once, the peculiarities of the community are overstated, in the name of minimising its irresponsibility. It was hardly unreasonable to worry that firing off a fission bomb would jumpstart the Earth's atmosphere into the middle of the CNO cycle, and turn the sweet breezes that cool the green hills of Earth into the galaxy's tiniest and most perfect supernova.

Bethe, like Teller, worked in the Manhattan Project, pipping Teller (and Felix Bloch, who resigned and headed east to join the Radiation Laboratory) for the position of chief of the theoretical division at Los Alamos, where his attention was quickly diverted from the zoo of subatomic particles to the hydrodynamics of symmetrically converging plutonium shells. Someone who didn't was the Freiherr von Weizsacker, who remained in Germany, supposedly sabotaging the German nuclear weapons programme from the inside. Embarrassingly, it seems likely that the Germans were hampered as much by mistaken ideas about subatomic physics as by their advanced political consciousness. Heisenberg and Weizsacker had gone haring off after "isospin," a real thing explained by quark phenomena, but in the Thirties looking like it might be a fundamental state condition differentiating protons and neutrons, so that by changing a particle's isospin, one could turn a proton into a neutron. (Boom? BIG boom.)

As with possible early work on muon-catalysed fusion, I have no idea how far this kind of wild goose chase led nuclear physicists astray. Where we are in 1950 is that Teller has a bomb+deuterium/tritium+pure deuterium device that he has persuaded himself will undergo a self-sustaining chain reaction. The death blow was apparently not struck against the super until the summer, when Ulam and von Neumann, using more accurately measured nuclear cross-sections produced by "Tuck and his group," showed that it was not feasible as it stood, even with astronomical amounts of tritium. James L. Tuck turns out to have shown up around here before. He is the British physicist who left Oxford in 1949, first for the University of Chicago, and then Los Alamos. Tuck himself arrived back in America (he had previously worked at Los Alamos during the war) with the "pinch" tucked under his arm and a yen to work on nuclear fusion as a power source. The "pinch" is a magnetic device for compressing fusion fuel, and compression is what the Teller-Ulam device will, eventually, have over the Classical Super. Discussions of Tuck's work seem to vault right over this phase and into his "perhapsatron" prototype fusion reactor.

I have no idea what, if anything, magnetic confinement might have contributed to paper-napkin sketches of a working "classical super." So I should probably straightforwardly deny that I'm even speculating that it crossed Teller's mind. All that is really interesting here, apart from the search for fusion power that was about to ramp up around the world thanks to the one-and-only Juan Peron, is that Teller was working with poor nuclear cross-section approximations. The whole thing is rather reminiscent of the scuttling of the plutonium gun design, when, after no shortage of warnings, Emilio Segre's P-5 group demonstrated that the spontaneous fission of contaminant Pu-240 in reactor-produced Pu-239 was frequent enough to cause a gun-type plutonium bomb to fizzle. There's an interesting sidelight here, in that the P-5 group had previously produced Panglossian reports reconciling the very different decay rates in samples produced by the cyclotrons at Washington University in St. Louis and Berkeley by resort to cosmic rays. (The crucial measurements to be explained were obtained at Los Alamos, which, due to its higher altitude, received more cosmic radiation.) Given how egregiously Segre had been mistreated during the period, one can only wonder how much longer he might have gone on telling his superiors what they wanted to hear, had he not been able to establish that the cause was contamination, and not mistaken estimates of the Pu-239 decay rate.

Given that the decision to pursue the hydrogen bomb was made without any assurance that it would work, it is over-egging the cake to say that the President was persuaded by bad science. There is something weird going on here, however. It should have been clear by then that the Classical Super would not work. Someone, or something, was creating mystery where there ought not to have been any mystery. Loading it all off on Edward Teller's Svengali-like influence seems a bit rich to me, and it seems more likely that, in the winter of 1950, the community was less certain of its models and explanations than it presents itself as being in hindsight. Supernovas happened in nature. Leaving Zwicky's gravitational collapse theories aside, it seemed that they happened in conditions far less extreme than those in the vicinity of fission devices. If science said they wouldn't, science was presumably missing something, and Teller was gambling on finding it.

Sadly, in some ways, he didn't. The thermonuclear bomb turns out to be a matter of engineering, and not of something missing in the Standard Model --the thing that the community is still looking for, at ever higher energies, seventy years later. On the other hand, it's something to remember that this was actually a possibility, and that the world had to get to sleep in the Fifties with the thought that a bomb that could set the world on fire was at least conceivable. 
