You are a special generation: born just after the Cold War, the first of the digital age, and fortunate to have enjoyed the longest economic expansion in American history.
It may be hard to believe, but the threat of nuclear war once cast a long shadow over our lives. Your parents probably recall huddling under their school desks in civil defense drills. Thankfully, that horrible specter has receded. We did not win the Cold War, but the Soviet Union lost it for reasons that are relevant today.
Communism denied individuals freedom to speak out, stifled information and monopolized power. It was mind-numbing, suffocating and once covered about half the planet -- hundreds of millions of people suffered through it in the Soviet Union, China and elsewhere.
Imagine: there were special rooms in the Soviet Union where photocopying machines were kept under lock and key so people could not share information on a page. Statistics about life expectancy were considered a state secret. In libraries, there were special drawers and whole rooms where forbidden literature was locked up by a paternalistic state. No Facebook, no Kindle, no freedom.
The Soviet Union expired for many reasons: over-militarization, a dysfunctional system of economic central planning, and a lack of civil society and rule of law. But one factor we can see more clearly in retrospect is that, as a closed society, it could not compete with the wave of innovation, communications and new technology blossoming in the West. The advent of the personal computer in the 1980s empowered individuals to control and distribute information -- an idea that made Soviet bosses shudder. Later, the widespread connection of computers to networks triggered another explosion of innovation and prosperity, born and nurtured in societies that prized freedom and rewarded innovation. Mikhail Gorbachev's last-ditch bid for glasnost was certainly the right idea, but too late. Would Steve Jobs or Bill Gates have succeeded had they lived in Moscow in the 1980s? Probably not.
Today, the digital revolution has become a powerful liberating force for millions of people. China's burgeoning middle class is rife with ferment, making it harder and harder to sustain the Great Firewall. Events like the Zhejiang bullet train crash, once hushed up, are now shared with lightning speed on microblogs and provoke popular fury. In Russia, a single Facebook page was critical in organizing tens of thousands of people to protest Vladimir Putin's return to power and last December's fraudulent parliamentary elections. Russia now has 53 million people online, more than any other country in Europe. The Arab world was convulsed by demonstrations for democracy which spread like wildfire on the winds of social media and satellite television. The life sciences are in a period of discovery as exciting as physics was at the dawn of the nuclear age. The digital upheaval has transformed music, photography, news and literature, and a new video entertainment boom is around the corner. Already, 60 hours of video are uploaded to YouTube every minute; more video is uploaded to YouTube in one month than the three major U.S. broadcast networks created in 60 years. We live in an age of person-to-person communications that are more fluid and accessible than at any time in human history. We routinely search across oceans of data in a fraction of a second.
And you can hold a device to do this in the palm of your hand.
But there are danger signs. The world is now totally dependent on connectivity. Finance, medicine, education, science, news, national security and culture are all reliant on networks. What if the power in a major American city is abruptly switched off by a single command from a remote server that no one can trace? Or a dam sabotaged? Or the wrong signal sent, causing stock markets to crash? Computers have been such an impressive force for good that it may be hard to think about the underside, about an arms race in cyberspace, but it is plausible. The United States, China, Russia and others are now investing in offensive cyber weapons, and doing it largely in secret, without public debate.
In the early years of the atomic age, nuclear bombs were huge and unwieldy -- they weighed 5,000 pounds and had to be lofted across the oceans by airplanes that would take five hours to reach their target. Technology relentlessly improved the "absolute weapon" so that by the end of the Cold War, a nuclear-armed missile could fly 4,000 nautical miles in 30 minutes and hit a target in a circle with a radius of 560 feet. No doubt the threat of warfare in cyberspace will arrive long before we are prepared for it. The commander of U.S. Cyber Command said recently that we have a better chance of detecting an incoming ballistic missile than we do a cyber attack.
The digital revolution is also upending our politics. It has enabled every one of us to effortlessly choose the sources we want for information, and to custom-build them. Inevitably and inexorably, this is breaking down the middlemen or gatekeepers who often sifted and synthesized in an earlier time: the newspapers, the book publishers, the broadcast radio and television networks. To an older generation, it is painful and disorienting to see these institutions suffer, but that is not the real problem. Until there are new gatekeepers (if they rise at all), we have to fend for ourselves in the realm of information. Sure, it can be exhilarating: new products, smart start-ups, and relentless competition that stimulates ideas and opportunity. The levels of participation are astounding. If Facebook users were citizens of a nation, it would be the third most populous on earth. Yet, in the United States we are fragmenting into ever-smaller and narrower niches. We have lost the ability to form consensus on the big issues, such as our fiscal future, or climate change. Clearly, the first wave of the digital age has been chaotic and disruptive. It will fall to you and future generations to guide it to something more coherent.
As you take this immense revolution in your hands, don't be passive. Climb out of your foxhole and look at the world broadly. It will be a terrible disappointment if the technology and creativity of recent years result in a new isolation -- everyone looking down at their smartphones without looking up at the horizon. Our problems right now are too daunting.
And just as your parents and grandparents fought to liberate millions of people from an ideology that locked up photocopiers and books, so too must you be ready to take action, perhaps under entirely different circumstances. Freedom, competition, openness, democracy and innovation are treasured values for any age. They helped bring us to this moment. Don't lose sight of what you inherited and what you must do to nurture and protect it.
What's the value of a nuclear warhead today? Not the monetary value, but as a deterrent?
Nuclear explosions are frightfully destructive, and that's the point: to inspire fear, to deter an adversary. Atomic bombs still appeal to some nations and terrorists, making proliferation a constant risk. Fortunately, there are fewer nuclear warheads in the world than during the Cold War: down from about 60,000 to about 22,000 today, most of which remain in the United States and Russia. But the deterrent value of the arsenals isn't what it used to be. Both the U.S. and Russia face new threats -- terrorism, proliferation, economic competition, pandemics -- for which these long-range or strategic nuclear weapons are of little value.
Earlier this year, scientists at the J. Craig Venter Institute reported in the journal Science that they had designed and created a synthetic chromosome, which they transplanted into a living cell. The living cell created new cells that are controlled only by the synthetic chromosome. The experiment was another reminder of the possibilities of synthetic biology, and of its complexities.
The revolution in the life sciences is filled with promise for improvements in health, medicine, energy, and the environment. Yet the field known as synthetic biology is relatively new, and many hurdles remain. It turns out it is a lot easier to synthesize some bits of genetic material than it is to make them work inside the body. For an interesting look at the difficulties, see this piece from Nature, in January.
The knowledge of biology is dual use: that which can make our lives better can also be used for ill. With this in mind, on May 20 President Obama asked his new Presidential Commission for the Study of Bioethical Issues to undertake a study of synthetic biology, looking at the “potential medical, environmental, security, and other benefits of this field of research, as well as any potential health, security or other risks.” He asked for the study to be complete in six months.
Last week, The Scientist published an interview with Amy Gutmann, president of the University of Pennsylvania and chair of the 12-member commission.
Gutmann says the benefits of the new field could range from “better production of vaccines to environmentally friendly biofuels to developing, in the near term, semi-synthetic antimalarial drugs.” But there are risks as well, she added, all of them still in the future. The primary risk “that needs to be overseen is introducing novel organisms into the environment, [and] how they will react with the environment.”
She says the panel will recommend some kind of middle ground between unfettered scientific discovery and stopping all scientific research until the risks are known.
Can biological science police itself? This is the question Amy E. Smithson has asked in a new article for the journal Survival. Smithson, a senior fellow at the James Martin Center for Nonproliferation Studies at the Monterey Institute, says recent experiments have sparked a debate over the need for increased oversight, whether by scientists themselves, by the government, or by others. The article gives a good overview of the options. Smithson concludes that we haven’t found the best answer yet, and both government and those outside it need to do more to find the right mix of oversight. While the government can’t leave it all to the private sector, Smithson says the biotech industry is not likely to tolerate those who would misuse biology for malevolent purposes. “Companies do not want to see products designed and produced for legitimate purposes hijacked for malign ones,” she writes, “if only because such misuse could cause a company’s fortunes to plummet.”
A related phenomenon is the rise of the do-it-yourself bio community. According to another recent piece in Nature, potential “bio-hackers” around the world “are setting up labs in their garages, closets and kitchens -- from professional scientists keeping a side project at home to individuals who have never used a pipette before.” For now, they are weekend hobbyists, the article says, but security concerns have been raised about dabbling in dangerous pathogens. The FBI has taken a sort of “neighborhood watch” approach to the hobbyists, relying on the biohackers to monitor their own community and report any behavior they find threatening, the article says.
David E. Hoffman is a Pulitzer Prize-winning author and a contributing editor to Foreign Policy.