Countdown to The Cold War: February 1945 February 18, 2015Posted by Lofty Ambitions in Science.
Tags: Books, Nuclear Weapons, Physics, Radioactivity, WWII
In February 1945, the end of the war in the European theatre of operations was still a few months off. Nonetheless, Allied leaders felt that the war’s end was close enough that they could begin to anticipate the post-war era. To that end, Churchill, Roosevelt, and Stalin met in Yalta—a city on the Crimean peninsula overlooking the Black Sea—on February 4-11 to discuss the shape of post-war Europe. Because of the tense relations between the United States and Britain on one hand and the Soviet Union on the other, which were reinforced during the meetings, the Yalta Conference is the oft-cited start of the Cold War.
In our “Countdown to the Cold War: October 1944” post, we detailed the struggles associated with the Hanford nuclear reactors, then known as atomic piles. In the last months of 1944 and in January 1945, engineers and scientists working on Hanford’s problems ironed out the kinks of the plutonium production process. Sometime between February 2nd and 7th—sources vary on the exact date—the first weapons-grade plutonium began making its way from Hanford to Los Alamos.
In the book, Hanford and the Bomb: An Oral History of World War II, author S. L. Sanger has this to say about the event:
[T]he first Hanford-produced plutonium was handed over by Du Pont to the Army. The next morning, Col. F.T. Matthias took it to Portland by car with a military intelligence escort. From there, Matthias and an agent went by train to Los Angeles where the package was given to an officer from Los Alamos. Matthias described the container as a wooden box wrapped in brown paper about 14 inches on a side and 18 inches high. It had a carrying handle and the syrupy plutonium, weighing about 100 grams, was carried in a flask suspended between shock absorbers.
The next time you’re about to board an airplane and TSA agents in the security area shout reminders of the restriction to 3-ounce containers of liquids and gels, think about how times have changed. During World War II, one of the most hazardous substances ever present on the face of the earth was carried on a regular passenger train. In a wooden box. Wrapped in brown paper.
Sanger’s book describes the meeting between Matthias and the officer from Los Alamos in what was almost certainly Los Angeles’s Union Passenger Terminal. Apparently, Matthias discovered that the officer was traveling back to Los Alamos in an upper berth, a means of rail travel that offered privacy by way of curtains, but no real security, not even a door. Matthias also learned that the officer didn’t know exactly what he was being entrusted to carry back to Los Alamos. Matthias told the officer that the item had cost $350 million to produce and suggested that he get a compartment with a locking door. The man did as Matthias instructed.
As revealed in Critical Assembly by Lillian Hoddeson, et al., the Los Alamos contingent was very pessimistic about the quality and amount of the plutonium they expected to receive from Hanford: “Oppenheimer was not optimistic about the ease of interacting with Hanford.” Ultimately, the quality and quantity of the Hanford plutonium were deemed sufficient to carry out the metallurgical research necessary so that plutonium could be used in the Fat Man weapon.
While the arrival of the Hanford plutonium in February 1945 was a huge event in the run-up to the Trinity test of a Fat Man type of atomic weapon, other activities related to Fat Man were taking place at Los Alamos at the time as well.
In December 1944, several new advisory boards and standing committees were created at Los Alamos. Chaired by physicist Samuel K. Allison, the Technical and Scheduling Conference was responsible for oversight and coordination of the transition from research to implementation. On Saturday, February 17, the Technical and Scheduling Conference met for four hours to discuss competing designs for the Fat Man-type weapon.
J. Robert Oppenheimer, the Manhattan Project’s director, argued throughout the day for simpler, more conservative design decisions. As Bruce Cameron Reed describes it in his excellent book The History and Science of the Manhattan Project, the final outcome of that committee meeting wouldn’t be decided until an end-of-the-month visit by General Leslie Groves:
On February 28, just eleven days after the TSC meeting, Oppenheimer and Groves decided provisionally on the Christy-core design with explosive lenses made of Comp B and Baratol. Characteristic of so many decisions in the Manhattan Project, their choice was a gamble: few implosion lenses had by then been tested[…].
With this end-of-February meeting between Groves and Oppenheimer, the design for the Trinity test was effectively fixed, and the lab could then focus on fashioning the numerous technologies into the world’s first atomic bomb.
Lyon Air Museum (Photos!) February 11, 2015Posted by Lofty Ambitions in Aviation.
Tags: Museums & Archives, WWI, WWII
Lyon Air Museum, founded by Major General William Lyon and opened in 2009, is our local aviation museum. It’s located just across the runways from the terminals at John Wayne Airport in Santa Ana, and it’s open 10am-4pm every day except Thanksgiving and Christmas. On March 9, at 10am, the museum will open the cockpit of their Douglas DC-3 flagship. On March 21, at 10:30am, Tuskegee Airmen will share their stories.
We finally made our first visit this past weekend. We’re sure to go back, and here’s why.
- 7 aircraft
- 8 automobiles (General Lyon is a long-time collector!)
- lots of motorcycles
It’s small, incredibly well kept, and filled with surprising treasures. And planes are taking off and landing just outside the windows. Here’s a sampling of what we saw.
Countdown to the Cold War: September 1944 September 24, 2014Posted by Lofty Ambitions in Science.
Tags: Books, Cancer, Countdown to The Cold War, Nuclear Weapons, Physics, Radioactivity, WWII
In the last couple of posts, we’ve begun our Countdown to the Cold War by talking about the reorganization at Los Alamos in the fall of 1944 to develop a method known as implosion. You can read the last post in the series by clicking HERE.
The next step on the Manhattan Project’s Countdown to the Cold War occurred on September 22, 1944, and was known as the RaLa experiment. Very early in the implosion research program, it became obvious that being able to systematically verify the success or failure of an implosion would be crucial. But very few experimental measures of implosion existed at the time.
In particular, for a successful atomic weapon, it was imperative that the scientists be able to engineer a symmetric implosion. Early attempts at creating implosion revealed a wide range of asymmetric behaviors that scattered material unevenly. In order to measure the symmetry of implosion, it became necessary to observe implosion events with instruments. One technique that was developed for observing implosion was known as RaLa.
RaLa is a shorthand for the active ingredient in a RaLa test: radiolanthanum. Radiolanthanum (La-140) is a manmade radioactive isotope of lanthanum. According to Critical Assembly (by Hoddeson, et al.), Robert Serber first outlined what would become the RaLa method on November 1, 1943. Serber was arguably Robert Oppenheimer’s right-hand man at Los Alamos, known to folks there for the Los Alamos Primer, the introductory lectures that kicked off the Manhattan Project’s bomb design effort.
The RaLa method depended upon the use of gamma radiation given off by the radiolanthanum isotope. Gamma radiation—or just gamma rays—is a very energetic type of electromagnetic radiation. The EPA.gov website devoted to radiation protection has this to say about gamma rays:
Gamma photons have about 10,000 times as much energy as the photons in the visible range of the electromagnetic spectrum. Gamma photons have no mass and no electrical charge. They are pure electromagnetic energy.
Highly energetic gamma rays travel at the speed of light and easily pass through most materials. It is this set of properties that made them useful in characterizing the implosion necessary for setting off an atomic bomb.
Serber hypothesized that by placing an amount of radiolanthanum in the center of the metal sphere to be compressed by implosion, the strength of the gamma rays emitted during that implosion would vary in such a way that the scientists could use instruments to understand how symmetrical the implosion was. Serber knew that, as an implosion event progressed in a metallic core (uranium or plutonium for the atom bomb), there would be significant changes in the density of the material being compressed. These changes in density would retard the gamma rays in predictable ways. In addition, because the gamma rays would radiate out from the center of the sphere, the scientists would be able to collect information about the implosion in three physical dimensions.
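Serber’s density argument can be sketched with the standard exponential attenuation law for photons passing through matter, I = I₀·e^(−(μ/ρ)·ρ·x). The numbers below are placeholders chosen purely for illustration—not measured values for plutonium or La-140 gamma energies—and the point is only that increasing the density of the material along the path measurably dims the gamma signal reaching a detector:

```python
import math

def transmitted_fraction(mass_atten_cm2_per_g, density_g_cm3, thickness_cm):
    """Exponential (Beer-Lambert) attenuation: fraction of gamma rays
    surviving a straight path through the material."""
    return math.exp(-mass_atten_cm2_per_g * density_g_cm3 * thickness_cm)

# Hypothetical illustration-only numbers:
mu_over_rho = 0.05   # cm^2/g, placeholder mass attenuation coefficient
thickness = 4.0      # cm of metal between source and detector

uncompressed = transmitted_fraction(mu_over_rho, 15.0, thickness)
compressed = transmitted_fraction(mu_over_rho, 30.0, thickness)  # density doubled

# Compression doubles the exponent, so far fewer gammas get through --
# detectors placed around the sphere would register the dimming, and
# asymmetries in density would show up as asymmetries in signal.
print(f"uncompressed: {uncompressed:.4f}, compressed: {compressed:.4f}")
```

In the real experiments, several detectors ringed the test sphere; comparing their readings over time is what revealed whether the compression was symmetric.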
Given that the radiolanthanum material would be at the center of an explosion, there would of course be radioactive debris and dispersal of that debris. Gamma radiation is ionizing—it knocks electrons loose from atoms—and therefore has biological effects on human bodies. And because gamma rays penetrate materials, they can be very dangerous. In this way, the RaLa experiments constitute the world’s first production of radioactive fallout, a waft of the Cold War to come. In order to minimize human exposure to the radiation that would be released, the RaLa experiments were held offsite in Bayo Canyon, located about two miles east of Los Alamos—a sort of lab away from lab. Checking the wind direction or measuring fallout, however, wasn’t much of a priority for these early radioactive test explosions.
Countdown to The Cold War: August 1944 (3) September 10, 2014Posted by Lofty Ambitions in Science.
Tags: Books, Countdown to The Cold War, Nuclear Weapons, Radioactivity, WWII
In our post two weeks ago, we mentioned implosion as an assembly method for a critical mass. The critical mass is the amount of fissile material—in the form of uranium or plutonium—necessary to set up the uncontrolled fission chain reaction that’s at the heart of a nuclear weapon. Implosion was one of three original assembly methods evaluated during the Manhattan Project: autocatalysis, the gun method, and implosion. The scientists at Los Alamos, however, had no experience using explosives to systematically create the symmetric, spherical blast wave necessary to compress solid materials for implosion. Indeed, in one of the official histories of the Manhattan Project, David Hawkins says the following:
[T]he behavior of solid matter under the thermodynamical conditions created by an implosion went far beyond current laboratory experience. As even its name implies, the implosion seemed “against nature.”
Physicist Seth Neddermeyer was an early advocate of the implosion method, and he began a serious investigation of the process in 1943. By mid-1944, because of plutonium’s propensity for spontaneous fission, it became clearer that, if there was to be an atomic bomb that used plutonium, then implosion was the only viable assembly method. The progress that Neddermeyer’s team had made on the implosion problem was deemed to be inadequate, though, and Neddermeyer was replaced. The realization that implosion was an extremely complicated problem set off a reorganization of Los Alamos that saw the creation of entirely new research groups, promotion or hiring of scientists to lead those groups, and realignment within existing research groups.
What’s remarkable about the Los Alamos reorganization is the breadth of the changes and the speed with which they were executed in the fall of 1944. A letter in mid-June, a series of meetings in July, and final approval on July 20th, 1944—1, 2, 3, go. The changes required by the reorganization were considered to be in effect on August 14th, 1944. The gun design was considered to be making acceptable progress under the leadership of Navy Captain William “Deak” Parsons. Parsons had been in charge of the Ordnance Division, and perhaps the biggest change it underwent was becoming O Division. The two most important of the newly created divisions were X Division and G Division. X Division—X for Explosives—was headed by Harvard physical chemist George Kistiakowsky. Kisti’s group was responsible for every engineering and development aspect of creating the explosive system used to produce the implosion. G Division—G for Gadget—was led by Robert Bacher and became responsible for all of the aspects of the bomb that had to do with its nuclear core, the so-called plutonium pit. In addition, because of G Division’s responsibility for the pit, it was also charged with developing various experimental methodologies for evaluating the effectiveness of the implosion—in particular, measures for validating the compression of solid materials. Importantly, this series of organizational changes enhanced the overall understanding of the implosion-based atomic bomb. Existing divisions such as R Division (Research, the Experimental Physics Division prior to the reorganization) and T Division (Theory) also adjusted as the focus on implosion took hold across the laboratory at Los Alamos.
The Manhattan Project’s leadership, spurred on by J. Robert Oppenheimer, saw a problem and worked effectively to address that problem. This speedy, drastic effort that reorganized the Manhattan Project reminds us of an engineering analogy that used to come up in computer systems development: replacing a car’s engine as you’re going down the highway at 70 miles per hour. Just over two months’ time elapsed from the proposed changes to their implementation, with research continuing all the while. The development of the implosion device, the Gadget, was the primary focus of the laboratory from this reorganization in August 1944 until the Trinity test of the first atomic weapon on July 16, 1945. The Countdown to the Cold War was well underway 70 years ago today.
Countdown to The Cold War: August 1944 (2) August 27, 2014Posted by Lofty Ambitions in Science.
Tags: Books, Countdown to The Cold War, Nuclear Weapons, Radioactivity, WWII
Our first “Countdown to The Cold War” post appeared LAST WEEK, so you may want to start there.
In the vernacular of the Manhattan Project scientists and engineers, assembly is the process of transforming a subcritical mass of either uranium or plutonium into a supercritical mass, an uncontrolled nuclear chain reaction resulting in an explosion. In the earliest days of the project, most of the effort was spent on developing what was called the gun-type assembly method. This is essentially the act of slamming together two subcritical masses by firing one at the other. As a means of setting off an atomic explosion, this process has always struck the Lofty Duo as the equivalent of one of our very distant ancestors stumbling across two stones, banging them together, and wiping out the entire forest in which they lived.
The initial designs for a gun-type weapon were essentially navy cannons with one end containing a near-critical mass of fissionable material to be shot at from the other end by a smaller mass of fissionable material. The first attempts were thought to require a ten-thousand-pound, seventeen-foot-long cannon. These designs were known as the Thin Man, after the Dashiell Hammett novel of the same name.
Scientists and engineers hoped that this design would work for both uranium and plutonium. While enriched uranium—enrichment being the process used to increase the proportion of desirable U-235 vs. undesirable U-238 in a given amount of uranium (see last week’s post)—had suitable physical properties for a gun-type weapon, the enrichment process was complex and expensive. During the Manhattan Project, electromagnetic separation, gaseous diffusion, and thermal diffusion were all used as enrichment processes, and gas centrifugation was explored before being abandoned. In fact, these processes of enriching uranium were so difficult that there were serious questions about whether enough uranium could be produced to build a bomb.
Plutonium, on the other hand, could be produced by transmuting—transmuting being changing one element or isotope into another—uranium in nuclear reactors (atomic piles at the time). Once produced, its purification and separation could be handled chemically, as opposed to the complicated means necessary for uranium. Plutonium is a fiendish metal to manipulate, and it’s been called the most dangerous substance known to humankind. In the early days of the Manhattan Project, it was also in short supply. As more of it became available in April 1944 and subjected to experiment, scientists at Los Alamos, particularly physicist Emilio Segrè and his group, discovered that reactor-produced plutonium (as opposed to previous plutonium samples, which had been created in cyclotrons) suffered from an alarming problem.
As Segrè and his group discovered in their Forest Service cabin deep in Pajarito Canyon, the plutonium produced in atomic piles has two isotopes: Pu-239 and Pu-240. The presence of the second isotope, Pu-240, caused the plutonium that Los Alamos was receiving to undergo spontaneous fission. In nature, fissionable elements can undergo a nuclear reaction known as spontaneous fission, a somewhat different process from nuclear fission artificially induced by a neutron. Richard Rhodes, in his Pulitzer Prize-winning tome The Making of the Atomic Bomb, gives a footnote definition of spontaneous fission: “a relatively rare nuclear event, differs from fission caused by neutron bombardment; it occurs without outside stimulus as a natural consequence of the instability of heavy nuclei.” Spontaneous fission was not what the Manhattan Project wanted in its nuclear material.
The unplanned-for nuclear reaction was occurring to such an extent that, as two subcritical pieces of plutonium were brought in proximity to one another, the assembling mass of plutonium would be subject to pre-detonation. In short, the plutonium produced in Hanford’s reactors couldn’t be used in a gun-type assembly method. So the scientists and engineers needed to figure out what kind of bomb assembly would work if they wanted to use plutonium.
It was relatively quickly realized that, in order to make use of plutonium and to avoid pre-detonation, the subcritical mass would have to be assembled fast. Very fast. The only method that was available to Los Alamos was implosion. We’ll discuss that and its implications for the Manhattan Project next in our “Countdown to The Cold War.”
In the meantime, for more on uranium, plutonium, and fission, see our post called “Uranium & Plutonium & Fission.”
Countdown to The Cold War: August 1944 August 20, 2014Posted by Lofty Ambitions in Uncategorized.
Tags: Countdown to The Cold War, Physics, Radioactivity, WWII
Over the last few years, your Lofty Duo has had an inordinate amount of interest in the Manhattan Project. If you were to draw a Venn diagram of our many overlapping interests in this historical event, it’s likely that somewhere in the shaded region at the center of the diagram would be a man named Henry Cullen. Henry was Anna’s grandfather. In his professional life, he was a Pullman Conductor on the Santa Fe Chief. The stories that Henry told about his train dropping off men with foreign-sounding names and accents in-the-middle-of-nowhere New Mexico are a part of Anna’s family lore.
That middle-of-nowhere spot was Lamy, New Mexico, situated about ten miles south of Santa Fe. During the years 1943-1945, the Lamy railway station was the disembarkation point for thousands of American scientists, engineers, soldiers, and their families as they made their way to the heart of the Manhattan Project: Site Y, more popularly known as Los Alamos. Site Y was one of the thirty locations that made up the Manhattan Engineer District, an administrative organization for the atomic bomb project that was created within the Army Corps of Engineers.
The military director of the Manhattan Engineer District was General Leslie M. Groves, who received the assignment to manage the Manhattan Engineer District as a result of his success with building the Pentagon. As Groves contemplated the necessity of moving so many valuable technical people around the country, he became concerned by the possibility of airplane crashes. As a result, trains like the Santa Fe Chief became the primary mode of cross-country transportation for the people working on the Manhattan Project. If it weren’t for the General’s fears, it’s unlikely that Henry Cullen would have crossed paths with so many individuals who were in the process of changing the course of history.
Henry Cullen’s outsider-looking-in stories about the then secret world of the Manhattan Project have given rise to a number of projects here at Lofty Ambitions. We’ve made trips to Santa Fe and Los Alamos numerous times. We’ve visited a number of atomic-themed museums. And we’re academics, so we’ve turned what we learned into conference papers and presentations. Doug is also using parts of Henry’s story in the novel he’s writing this summer.
As we mentioned earlier this month, over the next year, we’re going to be taking a look at the last year (August 1944-1945) of the Manhattan Project. Our starting point is a sequence of events that led to a massive reorganization of the laboratory at Site Y seventy years ago in August of 1944. That reorganization centered on a new design, a new model for the atomic bomb called implosion. This new design was necessary in order for the project to make use of the element plutonium, about which we’ve written. To understand this shift in August 1944, it’s helpful to keep in mind how the Manhattan Project scientists had initially thought they might go about designing an atomic bomb.
Hungarian physicist Leo Szilard is the scientist credited with first recognizing the possibility of using the energy released by the splitting of an atom—the process of nuclear fission—to create a weapon. In the late 1930s, much of the research in the area of nuclear fission was focused on the radioactive element uranium.
In uranium, the fission process begins with the absorption of a neutron (a subatomic particle with no electric charge, and one of the three constituents of atoms along with electrons and protons). This new neutron introduced to the uranium atom adds to the protons and neutrons in the nucleus, a process that excites the atom and makes it unstable. As a result of this instability, the uranium atom breaks apart into lighter elements (krypton and barium), three more neutrons, and energy.
However, this set of byproducts is the result of the fission in a specific uranium isotope, U-235. Naturally occurring uranium has two isotopes: U-235 and U-238. The element uranium has 92 protons in its nucleus. Isotopes are alternative configurations of a chemical element that differ in the number of neutrons in the nucleus. U-235 has 143 neutrons in its nucleus, and U-238 has 146 neutrons. The number after the chemical symbol—235 or 238—indicates the total number of protons and neutrons for that isotope (e.g., U-235: 92 + 143 = 235).
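The bookkeeping behind those mass numbers is simple enough to check mechanically. A trivial, purely illustrative sketch:

```python
# Mass number A = protons (Z) + neutrons (N) in the nucleus.
isotopes = {
    "U-235": (92, 143),  # protons, neutrons
    "U-238": (92, 146),
}

for name, (protons, neutrons) in isotopes.items():
    mass_number = protons + neutrons
    # The number after the chemical symbol is exactly this sum.
    assert mass_number == int(name.split("-")[1])
    print(f"{name}: {protons} + {neutrons} = {mass_number}")
```

Both isotopes share 92 protons—that’s what makes them the same element—and differ only in neutron count.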
The nuclear fission described above for U-235 releases three new neutrons. Each of those neutrons can then go on to fission more uranium atoms. As this process repeats cycle after cycle, it produces what is known as a chain reaction. In nuclear engineering, a controlled chain reaction is a nuclear reactor, a machine that can be used to generate power. An uncontrolled chain reaction is a weapon, and that was the goal of the Manhattan Project. Get that fission started and let it run wild.
U-238, the other naturally occurring isotope of uranium, has a nuclear reaction that generates only a single new neutron. So, one neutron is needed to cause fission, and one neutron is produced by the fission. That’s just not enough to sustain a chain reaction. So the Manhattan Project needed U-235.
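The difference between three neutrons per fission and one can be made concrete with a back-of-the-envelope generation count. This toy model assumes every released neutron goes on to cause exactly one new fission—real devices lose many neutrons to escape and capture, so this wildly overstates the growth—but it shows why a one-for-one neutron yield can never snowball:

```python
def neutron_population(neutrons_per_fission, generations, start=1):
    """Idealized neutron count per generation, assuming every neutron
    causes exactly one new fission (no losses)."""
    population = start
    history = [population]
    for _ in range(generations):
        population *= neutrons_per_fission
        history.append(population)
    return history

# U-235-like: each fission yields 3 neutrons -> exponential runaway.
print(neutron_population(3, 10))   # 1, 3, 9, ..., 59049

# U-238-like: one neutron in, one neutron out -> population never grows.
print(neutron_population(1, 10))   # 1, 1, 1, ..., 1
```

After only ten generations the three-neutron case has multiplied nearly sixty-thousand-fold, while the one-neutron case is exactly where it started.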
Naturally occurring uranium, however, is found in an isotope mix that is about 99.3% U-238 (which the scientists and engineers didn’t want) and about 0.7% U-235 (which they did want). They worked as best they could to separate out the isotope they wanted. As their work proceeded, though, they wondered whether plutonium might be used instead of uranium. As they began to think about how plutonium might work, they realized that the bomb design under development for uranium wasn’t suitable for plutonium.
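To get a feel for what that 0.7% means in practice, consider an idealized calculation. The 50 kg target below is a hypothetical round number, not the actual quantity used in any weapon, and the math assumes perfect separation, which no real enrichment process achieves:

```python
U235_FRACTION = 0.007   # roughly 0.7% of natural uranium is U-235
target_kg = 50          # hypothetical amount of U-235 wanted

# Even with a perfect separation process, every kilogram of U-235
# requires processing ~143 kilograms of natural uranium.
natural_uranium_needed = target_kg / U235_FRACTION
print(f"{natural_uranium_needed:,.0f} kg of natural uranium ore feed")
```

That works out to more than seven metric tons of feed material for this hypothetical target, before accounting for the steep inefficiencies of the real separation processes.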
So while the Manhattan Project continued to pursue a weapon that used uranium, they refocused efforts on plutonium and began developing another design.
For the next post in “Countdown to the Cold War,” click HERE.
Palomar Observatory (Part 3) September 18, 2013Posted by Lofty Ambitions in Science.
Tags: Museums & Archives, Palomar Observatory, WWII
Longtime readers of Lofty Ambitions know that we’ve devoted a number of blog posts to the Manhattan Project and its legacy. We’ve made several treks to Los Alamos. We visited and wrote about the Nevada Test Site, that enormous expanse of the American west where the government tested, both above- and below-ground, several generations of the nuclear weapons designed at Los Alamos National Lab. We each have writing projects—Doug a novel and Anna a memoir—that involve the Manhattan
Project and America’s legacy of atomic energy, nuclear weapons, and our irradiated environment. That was a project often labeled Big Science.
Defining Big Science has long been a loose, intuitive, “I know it when I see it” endeavor. Roughly, it denotes a project so large in scope and aims that it requires collaboration between universities, government, and industry. The Manhattan Project is the prototypical Big Science project, and it is sometimes referenced as the tipping point between the era when science was an individual or small-team practice and the more large-scale, industrialized practice that exists today. In the book The Manhattan Project: Big Science and the Atom Bomb, author Jeff Hughes devotes a chapter (Chapter 2: “Long Before the Bomb”) to the origins of Big Science. In his explanation, he mentions the role of astronomy and observatories in the creation of this phenomenon. Palomar Observatory, then, and particularly its 200-inch Hale Telescope, fits squarely into the tradition of Big Science.
The initial idea for what would become the Hale Telescope was put forward in a Harper’s magazine article by George Ellery Hale in 1928. Later that year, the Rockefeller Foundation gave Hale a $6M grant—the largest scientific grant that had ever been awarded at that time—to begin construction of the telescope. It would be twenty years before the project was completed—twice as long as the construction phase of an earlier Hale telescope, the 100-inch at Mount Wilson Observatory—and Hale wouldn’t live to see the project through, dying at the halfway point in 1938. His colossal masterpiece would, however, be named in his honor.
In earlier posts, we recounted some of the outsized numbers associated with this project. The one that matters most, however, is 200—the 200-inch mirror. In doubling the mirror’s diameter over the previous largest telescope, Hale’s new telescope had four times (4x) the surface area, and in telescopes, surface area determines how much light you can gather. The more light, the farther the telescope can see and the smaller the objects that it can resolve.
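That 4x figure follows directly from the geometry: a circular mirror’s light-gathering area scales with the square of its diameter.

```python
import math

def mirror_area(diameter_inches):
    """Light-gathering area of a circular primary mirror, in square inches."""
    return math.pi * (diameter_inches / 2) ** 2

hale = mirror_area(200)    # Palomar's 200-inch Hale Telescope
hooker = mirror_area(100)  # Mount Wilson's 100-inch

# Doubling the diameter quadruples the collecting area.
print(f"area ratio: {hale / hooker:.1f}x")
```

The same scaling is why every jump in telescope size has been so hard-won: each doubling of light-gathering power demands a mirror four times the area.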
Constructing the telescope’s primary mirror was a gargantuan project of its own. Hale first worked with General Electric in an attempt to build the mirror out of fused quartz. As our docent on the Palomar tour pointed out, “The only thing Hale learned was GE didn’t know how to do it.” Reports vary, but Hale spent at least $600K—10% of his grant—on this failed effort.
The backup plan involved working with Corning Glass and Pyrex, its low-thermal-expansion glass developed in 1915. For telescopes, it’s extremely important that flexing and expansion due to temperature change be minimized so that the mirror maintains its shape. Corning’s first attempt at pouring the 200-inch mirror ended in failure when some of the mounting brackets melted in the heat. Despite the fact that that mirror would never be usable, it was used to develop engineering models of cooling. In a testament to the dictum “there’s a sucker born every minute” (oft attributed to P.T. Barnum, but likely said by someone else), Corning Glass put the failed mirror on display and charged to see it. It now resides in the Corning Museum of Glass, and the company has a lovely website dedicated to the mirror’s development.
The engineering and development of a useable mirror required pouring several test “blanks” for working out the process. It’s interesting to note that one test mirror, itself a not-insignificant 120 inches in diameter, would later become the primary mirror for the Lick Observatory’s C. Donald Shane telescope. When it began operation in 1959—astronomers call such an event First Light—the Shane 120-inch telescope was the second largest in the world, behind the Hale Telescope.
The supporting structure and mount developed for the big Hale mirror are also enormous. Engineered by Westinghouse and manufactured in its South Philadelphia factory, the steel beams, tubes, and gearing required to support and aim the telescope weigh in at 530 tons.
Since its First Light in 1949, Hale has been in operation roughly 300 nights every year. Over the history of those long nights, the Hale Telescope has dramatically increased our understanding of the universe. An important part of this work was the discovery of “quasi-stellar objects,” more popularly known as quasars. Initially discovered through radio astronomy, the light spectra of quasars defied characterization until astronomers Alan Sandage and Maarten Schmidt used the Hale Telescope to identify 3C 273, an astronomical object that had previously been described only as a radio source.
The funding, constructing, and operation of Palomar Observatory’s Hale Telescope tracks the evolution through the 20th century of astronomy into Big Science. For a large portion of the 20th century (1948-1976), the Hale Telescope was the largest optical telescope in the world. It remains the largest one we’ve seen in person.
Keep reading with PART 4.
A Lucky Disaster, or Canada’s Loss, NASA’s Gain (Part 2) March 13, 2013Posted by Lofty Ambitions in Aviation, Space Exploration.
Tags: Apollo, WWII
Also see PART 1 of “A Lucky Disaster, or Canada’s Loss, NASA’s Gain.”
For the last 40 years, at least in the public’s eyes, Florida’s Space Coast and Houston have been the homes of American manned space flight. But in the earliest days of America’s space program, a select group of engineers calling themselves the Space Task Group (STG) made their home in rural Virginia at the Langley Research Center. Langley is NASA’s oldest research home, founded in 1917 by NASA’s predecessor, the National Advisory Committee for Aeronautics (just as you would think, NACA). The STG at Langley, inaugurated on November 5, 1958, came into existence little more than a month after NACA became NASA. These name changes and group birthings were all of a piece. Fifty-five years ago, the nation was obsessed with space—and the nation remains intrigued.
In our February 20th post, we hinted that the February 20th, 1959, cancellation of AVRO’s CF-105 Arrow aircraft—less than six months after NASA was itself born—wound up being a boon for America’s fledgling space program. America’s first human spaceflight program, Project Mercury, was announced to the world six days after NASA was born, but that ambitious program was struggling to get its legs under it. The STG, with its single-minded view of putting an American in space, also had trouble finding its footing and was viewed with skepticism by the airplanes-only culture of Langley’s old guard.
Aeronautics was becoming Aerospace, but not everyone was excited by the changes that this shift implied. In part, resistance was only logical. The American aviation industry had achieved remarkable successes since the end of World War II. The nascent American efforts in space didn’t have a record of success. Not only had the Russians beaten the Americans into space with Sputnik, but they had done it spectacularly. Sputnik had been followed less than a month later by Sputnik-2, and that second Sputnik had carried a living creature, a dog named Laika. America’s side of the space-race equation was also spectacular, but mostly spectacular failures. The nationally televised explosion of America’s first attempted satellite launch—the Vanguard mission on December 6, 1957—earned it the derisive nickname Kaputnik.
Into this environment came the opportunity for NASA’s STG to add significant engineering talent. Arguably, AVRO’s Arrow was the most advanced aircraft in active engineering and development at that time, and it was cancelled. The United States’ most advanced interceptor aircraft of that moment, the North American Aviation XF-108 Rapier—with delta wings and predicted Mach 3 performance, it was quite similar to the Arrow—was also cancelled in 1959. Both were victims of the coming age of ballistic missiles and pushbutton warfare. But whereas the American XF-108 project was limited to engineering drawings and a single wooden mock-up, the CF-105 Arrow knew the feel of air beneath its wings.
In all, AVRO designed, manufactured, and flight-tested six Arrow aircraft. This effort had given a talented young cadre of AVRO engineers experience at the leading edge of aeronautical engineering. The Arrow was the first aircraft designed to use a fly-by-wire system, a means of controlling the aircraft’s flight surfaces with electronic systems. The Arrow was designed in great part on computers. An IBM 704 mainframe computer at AVRO Canada’s headquarters in Malton, Ontario (near Toronto), was used not only for design purposes, but also for simulation and modeling. In fact, data collected during the Arrow flight test program was analyzed on the 704 and then fed back into the simulator. In sum, the young AVRO engineers had just the sort of experience that NASA’s STG needed for Project Mercury.
Ultimately, the AVRO engineers wound up in the STG because of the Arrow’s chief designer, Jim Chamberlin. Chamberlin was a known quantity to engineers at Langley from the collaborative work between AVRO and NACA on wind-tunnel testing for the Arrow and because of an earlier project, the AVRO VZ-9 Avrocar (a saucer-shaped jet).
As the layoffs took hold, Chamberlin and others jumped into action. Arrows to the Moon, Chris Gainor’s comprehensive look at the contributions that AVRO engineers made to the American space program, indicates that the original idea was for a two-year exchange that would bring engineers from the cancelled Arrow project to the STG at Langley. NASA benefited by getting an immediate injection of talent for Project Mercury. AVRO hoped to get returns from sending its best-and-brightest off for two years for the equivalent of a graduate degree, a U.S.-funded, on-the-job school that was essentially the only program in space systems design and engineering in the free world.
When all was said and done, 32 AVRO engineers joined the STG. Another fantastic book that touches on this subject, Charles Murray and Catherine Bly Cox’s Apollo: The Race to the Moon, recounts a story in which Robert Gilruth, first head of the STG, told one of the AVRO engineers, Tec Roberts, “We thought about taking more of your crowd from AVRO…but we figured twenty-five percent aliens in the American space program was sufficient.”
Those aliens would make contributions to the American space program that are still being felt to this day.
On This Date January 9, 2013Posted by Lofty Ambitions in Aviation.
Tags: Armstrong/Dryden Flight Research Center, Art & Science, Museums & Archives, Wright Brothers, WWII
Today is the birthday—first flight day—of two aircraft that share some background but also differ significantly. A good portion of the world was at war in the 1940s, and that gave rise to these two aircraft in different places. The AVRO Lancaster first took to the war-torn skies of England seventy-two years ago, in 1941, when test pilot Bill Thorn coaxed prototype BT308 off of the tarmac and into the air at Manchester’s Ringway Airport. Two years later, in 1943, the prototype L-049 Constellation made its first flight, a short hop really, from Burbank, CA, to Muroc Army Air Field (later to become Edwards Air Force Base and also current home to NASA’s Dryden Flight Research Center).
Large, four-engined, and born during World War II are among the very limited set of characteristics that the Lancaster and the Constellation had in common. That said, both aircraft followed architect Louis Sullivan’s “form ever follows function” dictum to a tee and turned out very differently.
The Lancaster was designed as a bomber. Utilitarian, slab-sided, and broad-winged, the Lancaster is not easily mistaken for anything but a military aircraft. The Lancaster began military service in February 1942, and more than 7,000 would be built before the last “Lanc” was retired in 1963. During WWII, Lancasters flew nearly 160,000 missions. The Lancaster gained particular fame during the war for its use of bouncing bombs in missions against dams.
While the Lanc was decidedly of its time, the Lockheed Constellation—affectionately known as the “Connie”—had an art deco design, a blend of organic shapes and machine grace, that was ahead of its time. Much larger than the Lanc—early Connies had a takeoff weight of 137,500 lb versus the Lanc’s 68,000 lb—the Lockheed design was curved and sinuous. Many mid-twentieth-century trains, planes, and automobiles were shaped to cheat the wind, and a designer’s eyeball of that era served as a wind-tunnel test. The Connie looks like it’s going fast even when it is sitting still.
Much is often made of Howard Hughes’s involvement in the design of the Connie. In reality, Hughes’s TWA simply issued the specification for the Connie, and Lockheed engineered an aircraft to satisfy that spec. Once the Connie was flying, though, Hughes, ever the promoter and master showman, made headlines with the aircraft. Because of his close relationship to Lockheed, Hughes managed to finagle the use of an early Constellation. Once he had it, he repainted it in TWA colors and promptly set a speed record while flying it across the country. Passengers on that trip included Hughes’s gal-pal Ava Gardner and Lockheed engineer (and Upper Peninsula native) Kelly Johnson. On his return trip, Hughes garnered more press by giving Orville Wright what would be the aviation pioneer’s last flight.
Despite its obvious style and speed—the Connie was faster than a number of WWII fighter aircraft—the Connie had a short and somewhat difficult career. Its Wright 3350 engines had a reputation for inflight fires, leading to uncomfortable jokes about the Connie, which had four engines, being the world’s fastest trimotor. On top of that, the first generation of jet airliners arrived just as the Connie began to hit its stride. Although Connies survived for a number of years in the military and in passenger service outside of the United States, this aircraft made its final domestic revenue flight in 1967.
As we’ve written elsewhere, we have a fondness for visiting small airports just to see what’s sitting on the ramp. We developed this ritual while we were both professors at our alma mater, Knox College, in the late-1990s. Years later, on a return trip to Galesburg, we visited the local airport—call sign KGBG—for old times’ sake. Sitting there in all of its shapely, aluminum glory was a Constellation.
The first Constellation that we saw in the metal was the so-called MATS Connie, one of the handful still flying and once owned by John Travolta. We’ve also seen the military variant at Chanute-Rantoul, just outside of Champaign, IL, where our colleague Richard Bausch once served. President Eisenhower flew on a Constellation; he had two in service at the time.
Only two Lancasters remain airworthy, one in the United Kingdom and one at the Canadian Warplane Heritage Museum. There’s a Lanc near us, though, in Chico, CA, that folks are planning to restore to flying condition. A reminder that we haven’t yet thoroughly investigated the aviation history that’s right in our own back yard here in Southern California.
In the Footsteps: Jean Dayton (Part 14) December 26, 2012Posted by Lofty Ambitions in Science.
Tags: In the Footsteps, Nuclear Weapons, Serendipity, WWII
add a comment
As frequent readers of Lofty Ambitions well know, we’re big believers in serendipity–that chance meeting with an idea, a place, or a person (or even better, a combination of those). Afterwards, your thoughts move in a new, unexpected direction. Last week’s post was about recent serendipity, and this week’s is about serendipity from our past.
In May 2003, while he was a graduate student at Oregon State University, Doug had just such a chance collision while attending a lecture about the Cold War and nuclear weapons. Peter Galison, Pellegrino University Professor in History of Science and Physics at Harvard University, was speaking about a documentary that he had recently completed, The Ultimate Weapon: The H-bomb Dilemma. The title of Galison’s talk was “Filming and Writing History: The H-bomb Debate.” Doug had just started doing research for a historical novel set during the Manhattan Project, and the talk seemed to dovetail neatly with this new project.
One of the 20th century’s most controversial scientific figures, Edward Teller, is often referred to as the father of the hydrogen bomb (H-bomb). The H-bomb came into the world’s consciousness in 1952, less than ten years after the atomic bomb. Much distinguishes the two types of weapons, not the least of which is that they operate on different physical principles: atomic bombs use fission, while H-bombs use fusion. The most important difference between the two weapons is their destructive power. Atomic bombs have a practical upper limit in explosive yield based on the size of their uranium or plutonium core (see more HERE). Hydrogen bombs (more commonly called thermonuclear weapons in the latter stages of the Cold War) are nearly unlimited in their destructive potential. The primary requirement for increasing their power is adding more fuel (see more HERE).
Galison’s documentary–which aired on the History Channel in August 2000–gave voice to a number of the people associated with the development and deployment of thermonuclear weapons. In his talk, Galison made it seem as if he had a particular fondness for, or at least was intrigued by, the nuclear weapons designer Theodore “Ted” Taylor. Taylor had a reputation for being a particularly innovative thinker, a producer of remarkably elegant designs, although perhaps elegant isn’t quite the right term when the context is nuclear weapons. In the late 1950s, Taylor worked with physicist Freeman Dyson on Project Orion, an extravagantly ambitious plan to create a spacecraft capable of deep-space travel. At a time when NASA had yet to place a man in orbit, the mavens behind Orion were proposing a ship that could scoot easily past Mars and make its way to the outer planets: Jupiter, Saturn, Uranus, Neptune, and even Pluto (back in the days when Pluto was punching above its weight and still held planet status). Potential multi-generational missions involving dozens of scientists, their families, and a small menagerie of farm animals gallivanting off to Alpha Centauri were considered. The magical elixir that would power the enormous Orion? Not Star Trek’s dilithium crystals or ion drives. No, Orion was designed to ride a steady stream of H-bomb explosions. Megaton-class H-bombs would be ejected from the rear of Orion, detonated at a so-called safe distance, and the resulting stream of radiation and shock waves would push against a gigantic metal plate–logically enough called a pusher plate–fixed to Orion’s backside. Orion, of course, never went beyond the drawing board.
In his later years, Taylor became an ardent critic of the nation’s nuclear weapons program and its potential for nuclear proliferation. Not so with Edward Teller. Teller remained a passionate defender of nuclear weapons and his role in the creation of the H-bomb until the end of his life in 2003.
At the end of Galison’s talk, Doug went up to the speaker’s lectern to ask him a question about Ted Taylor. Doug was not the only person in the audience whose personal interests weren’t fully addressed in the short Q&A; there were a half-dozen people in line to speak with Galison. Standing quietly in front of Doug was a tiny, elderly woman. When it was her turn to speak with Galison, the woman stepped forward and began to tell her story. She had been a part of the Manhattan Project. There, she worked as a kind of assistant to Edward Teller. Though she was not a physicist (she’d studied biology at Cornell University), Teller valued her unorthodox problem-solving strategies–what we’d today call outside-the-box thinking–and often gave her problems to work on: thorny, unusual problems that were stymieing the physicists.
The woman’s interaction with Galison was economical. She did the majority of the speaking, and after a brief moment of silence, she turned and left. Even after standing in line and while still wanting to ask Galison a question, Doug made a very easy decision: he followed the woman. As she reached the lecture hall’s doorway, Doug tapped her gently on the shoulder. Introducing himself and explaining his interest in the Manhattan Project, Doug asked her if he might interview her about her experiences. With a look that suggested she was taken aback by this turn of events, she thought for a moment and ultimately said in a soft voice, “Yes.” Again she turned to leave, and again Doug tapped her lightly on the shoulder. “Your name. I need your name.” This time, the frown on her face indicated that she hadn’t anticipated this question as a part of the bargain. After a brief pause, she relented and said, “Jean Dayton.”