History often rewards great breakthroughs but ignores the preparatory steps that made those achievements possible. The Apollo program, for instance, has been documented in great detail and still receives ample attention, but what of the extraordinary labors that led to that summit? How was flight to the moon realized in practical terms after Jules Verne & Co. designed the blueprints?
When President John F. Kennedy approved the program soon after taking office, he proclaimed that humans would be on the moon before the close of the 1960s. As a result, NASA's technicians saw their budget suddenly increased tenfold and were able to draw on a tremendous body of technological knowledge. The American public had already embraced the supposedly infinite possibilities of the Space Age. Of course, technological progress doesn’t just happen. In an earlier era, the railroad had promised to diminish travel time like no other means of transport had done before. It became a metaphor for the changing relationship of humans to the material world—for the domination of the landscape by technology. In fact, at a time when fast movement was uncommon, travelers found rail travel to be a mind-altering experience, and often a confusing one. The railroad became an essential asset in the American Civil War, and its influence was still critical to rapid delivery of weapons and matériel during the First World War. Airplanes were quickly put to military use, too, and soon proved highly effective as weapons. In fact, military concerns drove much technological progress in the nineteenth and twentieth centuries.
In the human imagination the way to reach the moon is simply to fly there, but from a purely technical perspective, moon “flight” converged with the history of rocketry at the end of the nineteenth century, when it became clear that rocket propulsion would be needed to transport humans into space. Technology then evolved more or less independently in Russia, the United States, and Germany. Given the backwardness of czarist Russia—an atmosphere that hindered experimentation—it is all the more surprising that Konstantin Eduardovich Tsiolkovsky (1857–1935), a teacher of mathematics, developed what is now seen as the basis of theoretical astronautics. The American Robert Goddard (1882–1945), the Frenchman Robert Esnault-Pelterie (1881–1951), and the German-Romanian Hermann Oberth (1894–1989) also independently worked on effective rocket systems.
Tomorrow’s geniuses are often today’s lunatics, and initially, such endeavors were regarded as esoteric. When Oberth wrote a dissertation on rockets in space and submitted it to the University of Heidelberg, the professors initially rejected it. But by 1923 his thesis, “By Rocket into Planetary Space,” inspired wide interest and encouraged further research on the topic of space flight. Oberth was vindicated.
The space program was a pastiche of new and established technologies. As Walter A. McDougall has asserted, it combined “four great inventions: Britain’s radar, Germany’s ballistic rocket, and the United States’ electronic computer and atomic bomb,” each “the product of humankind’s most destructive conflict—World War II.” Immediately after the war, before the start of the space race, these discoveries were applied to the development of intercontinental missiles. The German V-2 rocket program had already established that flight outside the atmosphere was possible. Given the American advance in bombers, the Soviet Union felt particularly compelled to compensate for its lag in warhead technology with a rocket force.
Research on the moon and the idea of flying there attracted people who combined visionary and technical inclinations. Wernher von Braun was a key figure in this effort. From his background as the chief rocket engineer of the Third Reich and a member of the SS, he ultimately became head of launch-vehicle development for the Apollo moon program—an odyssey requiring not only exceptional skills and intelligence but also an opportunistic grasp of diplomacy. As his biographer Michael J. Neufeld has put it, few individuals have “shaken the hand of Eisenhower, Kennedy, Johnson and Nixon—but also Hitler, Himmler, Göring and Goebbels.” It is known that the German underground rocket factory that produced the V-2 rockets—called Vengeance Weapon 2 by the Nazi Propaganda Ministry—used slave laborers from a nearby concentration camp. It is also known that von Braun was involved in the facility’s planning and operation. He clearly benefited from war crimes and bears at least moral if not legal guilt. Even under the most adverse circumstances at the end of a lost war, von Braun managed to control the outcome of events. Realizing that rocket research could best be continued in the United States rather than in Great Britain or Russia, he allowed himself to be captured by the Americans and brought more than one hundred key members of his team along. He was the world’s most experienced rocket engineer and soon developed a broader vision of man’s venture into space. He suffered a setback in 1954, while working on the advanced rockets program in Huntsville, Alabama, when he was denied the necessary financial support to launch a satellite to ensure that the United States would be the first country in space, but the success of the Soviet Union’s Sputnik program boosted his promotion of space exploration.
Humans would not have set foot on the moon so early without the fierce antagonism and competition between the two superpowers during the Cold War. Without the political will and, as Roger D. Launius has put it, “the desire to demonstrate the technological superiority of one form of government over another,” the Apollo program would not have been possible. After the USSR launched the Sputnik 1 satellite in October 1957 and, a month later, Sputnik 2 with its dog passenger, Laika, the space race gained momentum. Suddenly, Americans found themselves in second place. Because the Soviet Union possessed nuclear weapons, its small orbiting orb was seen as a potential military threat to targets inside the United States. The extreme secrecy surrounding the Soviet space program—neither the launch site (the Baikonur Cosmodrome, in the desert of what is now the Republic of Kazakhstan) nor the name of the chief designer of the rockets (Sergei Korolev) was known at that time—certainly contributed to the sense of urgency of the American efforts. This perception translated to a massive infusion of public money into the space race. The Apollo program cost approximately $25 billion and was driven by Cold War politics rather than important scientific goals. It became, as Launius reminds us, “the largest nonmilitary technological endeavor ever undertaken by the United States.”
It is easy to forget that President Kennedy not only failed to define a clear purpose for the program, but also, recognizing the vast costs of the effort, repeatedly tried to persuade Soviet Premier Nikita Khrushchev to pursue a joint expedition to the moon—to no avail. The Apollo program had critics from the beginning. According to polls taken in the year leading up to the Apollo 11 launch, there was never a clear majority in favor of it. Some objections were based on the belief that “God never intended us to go into space,” in the words of one poll respondent. This was a minority opinion in the population as a whole, but it reminds us of the metaphysical feelings the moon has always evoked. Should this eternal symbol in the sky be touched, its secrets unveiled? In fact, the Apollo program assumed various quasi-religious reverberations. Norman Mailer, for example, characterized the space capsule itself as a sacred object. He also speculated about the extent to which Nazi ideology might have been enmeshed in the space program, mediated through Wernher von Braun and other former German officers.
Most objections, though, had nothing to do with metaphysics and everything to do with economics. As early as 1964, the sociologist Amitai Etzioni characterized the race to the moon as a “monumental misdecision” in his book The Moondoggle. The space program, Etzioni argued, produced neither major economic development nor a better understanding of the universe. “Some of the claims are safely projected into a remote and dateless future, others should have never been made; still others exaggerated out of proportion to their real value.” All the scientific manpower devoted to space, he wrote, should instead be put into health care or education. “Above all the space race is used as an escape. By focusing on the Moon, we delay facing ourselves, as Americans and as citizens of the Earth.” Etzioni later served as senior adviser to the White House under Jimmy Carter. For the eminent science historian and independent humanist thinker Lewis Mumford (1895–1990), the Apollo program was simply a waste of money, “an extravagant feat of technological exhibitionism.” He likened the manned space capsule “to the innermost chambers of the great pyramids, where the mummified body of the Pharaoh, surrounded by the miniaturized equipment necessary for magical travel to Heaven, was placed.” Still, the Apollo program received consistently favorable media coverage. In the wake of the 1968 assassinations of Martin Luther King, Jr., and Robert F. Kennedy, as well as ever more distressing reports from Vietnam, the push toward the moon acted as a counterbalance for the national psyche.
The technological wonders of yesterday rapidly become the banalities of today. The late 1950s and the 1960s held the promise of a world in which nuclear power would be the key to solving problems, replacing scarce energy resources, sharply reducing pollution, and even relieving poverty. The mere threat of nuclear annihilation would put an end to war. Hypersonic air travel would shrink our world while settlements on the moon would expand its capacity. Half a century later, the dreams that became reality have shown a nightmarish side. Implementation of nuclear power has been hamstrung by grave doubts about safety. The supersonic Concorde, a plane conceived in 1962 that flew at more than twice the speed of conventional aircraft, was eventually retired, a few years after a spectacular crash in 2000.
The live transmission of the shaky images of the astronauts’ ghostlike steps and their metallic voices from the moon, which triggered a global wave of excitement at the time, has become part of our cultural memory, a relic of a century past. Perhaps the most significant lasting image was that captured when the cameras were turned toward home. For the first time, the entirety of our planet was no longer an abstraction. At last we perceived the Earth in context, not merely as our own specific location. Traveling to the moon gave us an unprecedented sense of our own uniqueness in space and of the limits of the oasis we inhabit. Even as we continued to reach farther outward beyond our world, much of our imagination was turned inward.
Well into the nineteenth century, the moon, understood as a space of the imagination, had assumed a role similar—metaphorically, at least—to those of earlier unknown landscapes and continents. But conquest of the moon was driven by completely different motives than those of the Portuguese and Spanish seafarers in the fifteenth century.
Was the lunar mission really more than a historical accident? It cannot be simply dismissed as a propaganda coup, as an example of hubris, an American ego trip; those characterizations are not false, but they’re no more than half-true. The program was not motivated by a desire for wealth, and innovations it inspired still benefit people who never thrilled to Neil Armstrong’s “small step.” The fuel cell, based on a controlled hydrogen-oxygen reaction, has various applications and may yet prove to be the most useful “clean” alternative to the internal-combustion engine. It was developed in part to power the life support system in a space capsule. Tiny diodes measuring the astronauts’ pulse and blood pressure were forerunners of the medical telemetry equipment used today. Freeze-drying made it possible to preserve and to condense such foods as potatoes, peas, carrots, and minced meat, which could later be restored to their original forms with the help of water and a microwave. In addition to these specific developments, Apollo sparked a general interest in engineering, the results of which later benefited various industries. Some of the gadgets and activities associated with moon flight also found their way into popular culture. Michael Jackson’s moonwalk was inspired by the astronauts’ mode of moving on the lunar surface, and the Italian designer Giancarlo Zanatta’s moonboots would not have made sense without their real-life predecessors.
But to find the most ubiquitous consequence of Apollo-related research, most of us need look no farther than our desktops. The NASA program drove the miniaturization of information technology crucial to the development of modern computers. The Apollo Guidance Computer developed at MIT—a seventy-pound on-board device with a capacity less than that of a cell phone today—made the safe landing on the moon possible. As David A. Mindell writes in Digital Apollo, “Apollo began in a world when hardware and electronics were suspect and might fail anytime. It ended with the realization that as electronics became integrated, computers could become reliable, but that software held promise and peril. It could automate the flying, eliminate black boxes from the cramped cabin, and make the subtlest maneuvers seem simple. Yet it was hideously complex and difficult to manage. If it went wrong at a bad time, it could abort a mission or kill its users.”
After 1968, about ninety thousand people registered for Pan American World Airways’ First Moon Flights Club. Departure was scheduled for the year 2000; Ronald Reagan was one of the first to reserve a seat. The fare for the trip was projected to be fourteen thousand dollars. In retrospect, it is doubtful that a moon flight could have been provided at such a low price, but Pan Am went out of business in 1991 and never had to fulfill its lunar obligation. A few years later, enthusiasm for space exploration had faded, superseded by more urgent cultural and political issues. America’s manned lunar landing project ended in December 1972, when Apollo 17 ended its flight in the Pacific Ocean. In Washington, D.C., NASA officials, astronauts, scientists, and business managers celebrated with what the Washington Post called “the last splashdown party.”
Depending on political perspectives, the story of the U.S. space program can be portrayed as a feel-good triumph or as a waste of money that should have gone to social programs. But there is a third, less common perspective that has stirred considerable emotion over the decades. According to this revisionist narrative, the Apollo moon landings never happened at all, and what we saw was faked at a terrestrial staging area. According to Roger D. Launius, such theories began to be spun early on, “almost from the point of the first spaceflight missions.” The explanation offered most commonly for the contrarian view is simple ignorance: for some people this technological accomplishment just didn’t fit their worldview, and it was easier for them to imagine a complex hoax than to accept the challenge to their assumptions. For some moon-landing deniers, though, the issue was more complex. They had an almost messianic belief in a conspiracy and often would aggressively resist any discussion of it.
In 1974, at the height of the Watergate scandal, when trust in the American political system was at a low point, Bill Kaysing published We Never Went to the Moon: America’s Thirty Billion Dollar Swindle. Since then, discussion has raged—often repeating the same arguments ad nauseam. At one point, the former Apollo astronaut Buzz Aldrin was so provoked by Bart Sibrel—who made two films claiming the landings were a fraud and labeled Aldrin “a coward, a liar, and a thief”—that he punched the much younger Sibrel in the face.
Conspiracy theorists disagree about the extent to which the moon landings were faked. A minority believes that the crew actually reached the moon, but that the images were faked to obfuscate the technical details of their journey. Others think that Stanley Kubrick, who directed 2001: A Space Odyssey, was commissioned by NASA to produce some of the Apollo 11 and 12 footage. According to this scenario, a dummy was launched and allowed to splash down into the ocean. All the other images transmitted into hundreds of millions of homes are said to have been fake footage, produced on a quickly improvised movie set in some remote spot in the Nevada desert. That Kubrick hired former NASA employees for 2001 is, to conspiracy theorists, further evidence. The issue created quite a stir when the Fox television network aired Conspiracy Theory: Did We Land on the Moon? in 2001. This hourlong documentary gave a platform to several hoax advocates but offered very little refuting evidence.
Even though a number of independent sources have stressed that the sheer, overwhelming body of physical evidence proves that humans did walk on the moon, the minority view provides admittedly fascinating fodder for publishers, and the nicely packaged story is always guaranteed an audience. If we begin with an open mind, the arguments of the conspiracy theorists may initially seem plausible, but they don’t hold up to rigorous scrutiny.
The arguments have been discussed extensively elsewhere, but a taste of the contrarian stance is still in order. Believers in a NASA conspiracy—the Apollo Simulation Project, as some call it—often cite the absence of stars in the jet-black sky photographed by the astronauts. They disregard the explanation that the exposure times were too short to have captured the faint stars. The unusual play of light and shadows in some of the photos, such as the “man on the moon” image of Buzz Aldrin, also draws the ire of conspiracy theorists. Why does it look like a spotlight is directed at him although the sun is clearly behind him or shining from the side? What the conspiracy theorists fail to consider is that the lunar surface has a tendency to reflect light back in the direction of the source. This phenomenon results in a specific glow: a halo or aureole. Often, the deniers are also skeptical about the astronauts’ ability to survive exposure to radiation during the trip or the high temperatures on the moon’s surface. Although radiation and temperature were indeed threats to the astronauts, program scientists charged with astronaut safety clearly resolved the issues. The intensity of denial in the face of all evidence to the contrary reminds us that, for a number of people, the moon landings may have been traumatic. The intrusion of humans onto this numinous orb somehow violated their idea of the natural order. A trip to the moon had been a dream for centuries. Few events are more disturbing than a dream come true.
The issue of the denial also touches upon other, deeper questions. We live in a culture in which the deliberate blurring of boundaries between fact and fiction has achieved a certain artistic legitimacy in popular culture. And while many of us still remember the live images and sounds from the first moon landing, more than half of the world’s population is too young to have such an intimate connection with this event. Not remembering doesn’t necessarily translate to denying, but those without the memory may be less inclined to take the event’s truth for granted.
For those who accept that men have been to the moon, the visits have raised a host of more mundane issues. For example, some people claim to own parts of the moon. But which property laws apply there? And who is authorized to define such laws in the first place? Who—if anyone—should have the right to change the moon’s surface? In 1967 the United States, the United Kingdom, and the Soviet Union signed the Outer Space Treaty, which declared the moon to be terra nullius, a world belonging to no one. About one hundred nations adhere to this agreement today. In 1979 this treaty was supplemented by a more comprehensive one. The Agreement Governing the Activities of States on the Moon and Other Celestial Bodies specifies that the moon should be used for the benefit of all states and peoples and not, for example, as a testing ground for military purposes. The “Moon Treaty” also precludes any state from claiming sovereignty over any territory of a celestial body. However, the treaty has never been ratified by any of the major space-faring powers and remains unsigned by most of them, so it carries no legal weight. Nor is it clear how environmental ethics for the moon can be established and put into practice.
Future projects involving the moon still face major challenges and require piecemeal technological solutions. Companies under contract to NASA are exploring lunar logistics, mining, and spacesuit design. An inflatable test habitat in Antarctica serves as a base for lunar research and a testing station for Mars exploration. Locations such as the Canadian Arctic, the Arizona desert, and the underwater Aquarius habitat serve as key proving grounds. Of course, these extreme terrestrial environments provide only distant approximations of the many challenges that we would confront on future trips to the moon.
The perils of the moon go well beyond the mere terrain. Even though the moon doesn’t have the aggressive gases of Venus or radiation contamination of Jupiter, exposure to cosmic rays and solar wind and flares poses much more risk than on Earth, where the atmosphere and the magnetic field act as protective shields. The fine lunar dust, moreover, is highly abrasive and potentially harmful not only to the lungs of the astronauts and to their spacesuits but also to the joints and bearings of lunar robots—a problem that some propose solving by outfitting the robots with disposable coveralls. In any case, robotics will play a major role in lunar exploration, with lightweight vehicles or robotractors moving about the surface and performing tasks that include sampling the lunar soil and extracting any water or oxygen-rich minerals it might contain. Once the water is broken down into hydrogen and oxygen, it might be used to fuel rocket propellants and to create air for breathing.
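The water-splitting step described above is ordinary electrolysis; written out as standard stoichiometry (not a figure from the text), it shows why a single lunar resource can supply both breathable oxygen and the two components of a classic rocket propellant:

```latex
2\,\mathrm{H_2O} \;\xrightarrow{\text{electrolysis}}\; 2\,\mathrm{H_2} + \mathrm{O_2}
```

The hydrogen and oxygen can be stored cryogenically and recombined as propellant, or the oxygen diverted to life support.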
Champions of the moon’s commercial exploitation, such as the Arizona-based Lunar Research Institute, like to stress that even the bulk of the resources needed to build an industrial complex on the moon would not have to be taken there but could be mined on-site. Larry Clark, senior manager for Lockheed Martin’s spacecraft technology development laboratory, calculates that processing just the top two inches of soil from an area half the size of a basketball court could yield enough oxygen to keep four astronauts alive for seventy-five days. Silicon can be used to make cells to harvest solar energy, iron to build structures, aluminum, titanium, and magnesium for the construction of spacecraft, and carbon and nitrogen to help grow food. Given the moon’s comparably low gravitational pull, transport back to the Earth would be much cheaper than the other way around.
The moon may be a place of infinite possibilities, and some of these projects are reminiscent of the enthusiasm of a gold rush. At the core of these dreams is helium-3, the light variant of a noble gas that is hardly present on Earth but appears to exist in large quantities on the moon. Earth’s atmosphere and magnetic field deflect the solar wind that carries helium-3 through space; the moon, lacking both, has absorbed it into its soil. Shipped to Earth, the moon’s helium-3 could be processed with deuterium in fusion reactors to create helium-4, the energy source of the sun and stars, which could provide power without producing nuclear waste. But such fusion reactors are far from ready for operation; some experts claim that this technology is decades away from commercial viability.
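The reaction the paragraph describes in words can be written out explicitly; the energy yield given here is the standard textbook value, not a figure from the text. Deuterium and helium-3 fuse into helium-4 and a proton, with no neutron among the products, which is why proponents say the process would leave no radioactive waste:

```latex
\mathrm{{}^2H} + \mathrm{{}^3He} \;\longrightarrow\; \mathrm{{}^4He} + \mathrm{p} \;+\; 18.3\ \mathrm{MeV}
```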
One undeniable resource of the moon is an abundant supply of solar energy, which could be harnessed to power lunar bases or even as an energy source for Earth, thus freeing future generations from dependence on fossil and nuclear fuels. To make the moon’s solar energy available on Earth, scientists would have to find a way to convert it into electricity and then into microwaves that could be beamed across space. It is impossible to gauge the effects of sending such concentrations of microwaves through the Earth’s atmosphere. Paul D. Spudis, chief of the Clementine team (named for a spacecraft launched in 1994 to study the moon’s surface by means of special cameras), considers the “Mountain of Eternal Light” near the moon’s south pole to be “the most valuable piece of extraterrestrial real estate in the solar system.” “The big advantage of this place on the Moon is that it allows you to survive the 14-day lunar night with solar power. You cannot do this on the equator. Moreover, it is close to deposits of excess hydrogen, allowing us to make water, air and rocket propellant. If the Moon is a desert, the poles are its oases,” Spudis writes.
All such futuristic projects resonate uneasily on an Earth that more than ever seems on the brink of ecological disaster. Working on an offshore oil platform entails many privations, but who would want to spend a substantial part of his or her life on the moon? Who would choose the moon if he or she could have green landscapes, the ocean’s shore, and the mountains? Are we really supposed to live somewhere other than Earth? In addition, living in an environment with less gravity than the Earth’s has severe side effects. A period of rehabilitation is now standard practice after long stays in space. How can detrimental consequences for the circulatory and musculoskeletal systems be limited and psychological and social well-being be maintained? At this point, there still are a lot of open questions.
With no spaceship in sight, the European Space Agency (ESA) can only dream about a manned space mission. Recognizing the immense cost involved, U.S. President Barack Obama also put the brakes on moon euphoria. The Russians are not in a hurry, either—no cosmonaut will set foot on the moon before 2025. And China doesn’t foresee trying to get there before 2030. But there are parts of the world where the notion of a lunar mission still provokes strong emotions.
A case in point is India. When an unmanned spacecraft called Chandrayaan-1 (moon craft) went to the moon in October 2008, it was a matter of national pride and received extensive coverage in the Indian media. The main aims of this mission were to prepare a three-dimensional atlas in extremely high resolution of both the near and far sides, and to map the lunar surface for the distribution of various elements. For the first time, imaging radar was flown to the moon, resulting in valuable data on the lunar poles. Currently India is planning a manned mission for 2020.
Just before the launch of Chandrayaan, millions of Hindu women fasted until the moment when they could first discern the moon’s reflection in a bowl of oil—a rite that they usually perform to safeguard the welfare of their husbands. For many Indians no real contradiction seems to exist between their ancient beliefs and the contemporary scientific pursuits in space. The spacecraft of the Indian Space Research Organisation carried the following verse from the Rig Veda:
We should be able to know you through our intellect,
You enlighten us through the right path.
From Moon: A Brief History by Bernd Brunner; published by Yale University Press in 2010. Reproduced by permission.
Bernd Brunner is a freelance writer who often explores the intersection of cultural history and the history of science in his writings. He divides his time between Istanbul, Turkey, and Berlin, Germany.
Featured Image: “US postage stamp, 1969 issue: First Man on the Moon,” placed in the public domain by the Bureau of Engraving and Printing.