Competing Visions of a Computer-Controlled Future
Computers dominate how we live, work and think. For some, the technology is a boon and promises even better things to come. But others warn that there could be bizarre consequences and that humans may be on the losing end of progress.
Federico Faggin has lived in the United States for more than 40 years, but he’s still living la dolce vita in classic Italian style in his magnificent house on the edge of Silicon Valley. The elderly Faggin answers the phone with a loud “pronto” and serves wine and antipasti to guests. Everything about him is authentic. The only artificial thing in Faggin’s world is what he calls his “baby.” It has 16 feet--eight on each side--and sits wrapped in cotton in a cigarette case.
About four decades ago, Faggin was one of the first employees at Intel when he and his team developed the world’s first mass-produced microprocessor, the component that would become the heart of the modern era. Computer systems are ubiquitous today. They control everything, from mobile phones to Airbus aircraft to nuclear power plants. Faggin’s tiny creation made new industries possible, and he has played a key role in the progress of the last few decades. But even the man who triggered this massive revolution is slowly beginning to question its consequences.
“We are experiencing the dawn of a new age,” Faggin says. “Companies like Google and Facebook are nothing but a series of microprocessors, while man is becoming a marginal figure.”
This week, when German Chancellor Angela Merkel and Google chairman Eric Schmidt opened CeBIT--the digital industry’s most important annual trade fair--in the northern German city of Hanover, there was a lot of talk of the mobile Internet once again, of “cloud computing,” of “consumer electronics” and of “connected products.” The overarching motto of this convention is “Trust”--in the safety of technology, in progress and in the pace at which progress unfolds.
This effort to build trust seems more necessary than ever, now that those who place their confidence in progress are being joined by skeptics who also see something dangerous about the rapid pace of development.
In his book “The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future,” American computer scientist Martin Ford paints a grim picture. He argues that the power of computers is growing so quickly that they will be capable of operating with absolutely no human involvement at some point in the future. Ford believes that 75-percent unemployment is a possibility before the end of the century.
“Economic progress ultimately signifies the ability to produce things at a lower financial cost and with less labor than in the past,” says Polish sociologist Zygmunt Bauman. As a result, he says, rising efficiency goes hand in hand with rising unemployment, and the unemployed merely become “human waste.”
Likewise, in their book “Race Against the Machine,” Erik Brynjolfsson and Andrew McAfee, both scholars at the Massachusetts Institute of Technology (MIT), argue that, for the first time in its history, technological progress is creating more jobs for computers than for people.
The information-technology sector is indeed becoming increasingly important. In the 34 countries of the Organization for Economic Co-operation and Development (OECD), the club of industrialized nations, some 16 million people work in this field--a figure that does not include important manufacturing countries such as India and China. Worldwide IT sales, already measured in the trillions, exceed those of other key industrial sectors, such as the chemical-pharmaceutical and auto industries.
At the same time, more and more jobs are being lost in traditional industries. According to Brynjolfsson and McAfee, the most recent economic crisis in the United States and the collapse of many markets in 2008 forced US companies to make massive layoffs. Although production had returned to pre-crisis levels by the fall of 2011, it did so with 7 million fewer workers.
The new technology is shaking up all old industries, changing their products, revolutionizing work processes and transforming companies. Digitization isn’t just changing work; it is also profoundly altering the way people think, act and live in their daily lives.
Germany’s most unusual television studio is a place where the computer has already outpaced human beings. The VIP tour of the Frankfurt Stock Exchange costs €125 ($165). From the gallery, the viewer looks down at the trading floor where the traders sit behind a wall of computer monitors. It’s an image of absolute control--but it’s also an illusion.
Some 4,500 traders are registered with Deutsche Börse AG, the company that owns and operates the exchange, but only about 100 are physically present in the room. They are part of a living stage set for the TV cameras, specialists in types of securities that are traded in such small quantities that it doesn’t make sense to have computers handle them.
In recent years, hardly any other area has been changed as dramatically by digitization as the financial markets. Ever since the first stock exchanges were established in the Middle Ages, speed has been the measure of all things there.
Today, the financial markets consist of a digital network in which machines communicate with machines. Traders have long been unable to keep up with the speed of computers. The blink of a human eye takes about 150 milliseconds. Computers on German stock exchanges can fill about 300 orders in the same amount of time.
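To put those figures into perspective, here is a quick back-of-the-envelope calculation that uses only the numbers cited above:

```python
# Back-of-the-envelope arithmetic using the figures cited above.
blink_ms = 150            # approximate duration of a human eye blink, in milliseconds
orders_per_blink = 300    # orders an exchange computer can reportedly fill in that time

print(f"{orders_per_blink / blink_ms:.1f} orders per millisecond")       # 2.0
print(f"{orders_per_blink / blink_ms * 1000:,.0f} orders per second")    # 2,000
print(f"{blink_ms / orders_per_blink:.2f} milliseconds per order")       # 0.50
```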
Within milliseconds, the super-fast computers of so-called high-frequency traders buy and sell stocks around the world when they detect price fluctuations of only fractions of a cent. At the same time, the work of these computers increases the risk of an unwanted crash because buying and selling is no longer controlled by human reason alone, but also by the programmed logic of machines.
This occasionally leads to bizarre consequences. On May 6, 2010, the Dow Jones Industrial Average index on the New York Stock Exchange plummeted by almost 1,000 points in less than 20 minutes. In that time, close to 1.3 billion shares were traded, or six times the average trading volume. Some stocks lost up to 99 percent of their value.
The so-called “flash crash” was over after a short time, and the index recovered. But, even today, the causes still have experts scratching their heads. It is likely that the chaos was partly due to an avalanche of sell orders from computer-controlled trading programs that dump a share as soon as its price falls to a pre-defined level. Indeed, the incident illustrates a fundamental problem: The supposedly intelligent computers all think in the same direction and, like lemmings, can promptly plunge over the cliff together.
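How such an avalanche feeds on itself can be sketched in a few lines. The following is only a toy illustration of the pre-defined-price mechanism described above; the thresholds and the price impact of each forced sale are invented for the example, not drawn from the actual events of May 2010.

```python
# Toy sketch of a sell-order cascade: each program dumps its shares once the
# price falls to its pre-defined trigger, and every forced sale pushes the
# price lower still, setting off the next program in line.
# All numbers are illustrative assumptions, not market data.

price = 100.0
impact_per_sale = 0.5                                  # assumed price drop per forced sale
triggers = [99.5, 99.0, 98.5, 98.0, 97.5, 97.0]        # pre-defined sell thresholds

price -= 0.6                                           # an ordinary small dip starts the chain
for trigger in triggers:
    if price <= trigger:                               # this program's threshold is hit
        price -= impact_per_sale                       # its sale pushes the price down further
        print(f"trigger at {trigger:5.1f} fired, price now {price:6.2f}")

# An initial dip of 0.6 is amplified into a total fall of 3.6 points, because
# the supposedly independent programs all react to the same signal in the
# same direction.
```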
A British government study on the future of computerized trading concludes that the financial markets are on the verge of radical change and predicts that the number of traders will drastically decline over the next decade. “The simple fact is that we humans are made from hardware that is just too bandwidth-limited, and too slow, to compete with coming waves of computer technology,” the study says.
As long ago as 1965, Gordon Moore, who would later go on to cofound Intel, predicted that the number of components on a chip--and with it processor performance--would grow rapidly. Over the last four decades, the number of transistors in a processor has doubled about once every 18 to 24 months, driving rapid gains in processing speed. The first microprocessor ran at a maximum clock speed of 740 kilohertz; today's standard is about 3 gigahertz, a more than 4,000-fold increase.
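The arithmetic behind those figures is easy to verify; the sketch below simply takes the two clock speeds and the doubling interval from the paragraph above at face value:

```python
# Checking the figures cited above. The doubling factors show what the stated
# 18-to-24-month interval implies over four decades; they are not measured
# transistor counts for any particular chip.
first_chip_hz = 740_000           # first microprocessor: 740 kilohertz
modern_chip_hz = 3_000_000_000    # typical modern CPU: about 3 gigahertz

print(f"clock-speed increase: {modern_chip_hz / first_chip_hz:,.0f}-fold")   # ~4,054

for months in (18, 24):
    doublings = round(40 * 12 / months)               # doublings over 40 years
    print(f"doubling every {months} months: about {doublings} doublings, "
          f"a factor of roughly {2 ** doublings:,}")
```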
While this leads to the creation of new jobs in the digital economy, jobs in traditional industries are being cut. Granted, up to 6 million new IT jobs are expected between 2010 and 2014. But Foxconn, a Taiwan-based manufacturer to which IT companies outsource much of their production, plans to purchase a million robots over the next three years to replace some of its more than one million employees.
It’s a paradox. On the one hand, digitization increases growth and prosperity. On the other, write MIT scholars Brynjolfsson and McAfee, “There is no economic law that says that everyone, or even most people, automatically benefit from technological progress.”
Life in the digital world doesn’t just change our behavior; it also changes how we learn and think. Children are growing up in a world in which the distinctions between real and simulated life, as well as between machines, humans and animals, are starting to disappear, concludes Sherry Turkle, a professor of Social Studies of Science and Technology at MIT.
Indeed, the behavior of small children can reveal whether their parents own iPhones and iPads. These are the children who spread their fingers across paper photo albums when they want to enlarge the images or drag their fingers across television screens when they’re bored by a cartoon they’re watching.
According to a recent study by Columbia University psychologist Betsy Sparrow, a person who knows that he or she can readily look up a piece of information online doesn’t remember it as well as someone without Internet access. The study finds that the human brain treats the Internet as an extension of itself, as a kind of external memory. Ideally, this means that trivial knowledge can be stored in this external memory, freeing up brain space for creativity. But, in the worst case, the computer becomes a prosthetic brain.
“In terms of IT and computer use, there is an enormous divide between those born before 1970 and after 1980,” Moshe Rappoport of IBM’s European research center concluded in 2008. “The former will remain digital immigrants for the rest of their lives.” According to Rappoport, most young people have already logged thousands of computer-game hours by the time they’re 20, thereby acquiring skills and thought patterns completely foreign to the older generation.
Rappoport also argues that this change in the use of technology has had an immense impact on established companies and economic sectors. In computer games, one can quickly reach the goal through risky behavior and then simply start over again. In a similar way, the younger generation is characterized by a willingness to take risks. “Nowadays, 25-year-olds who have already established six or seven companies are no longer a rarity,” Rappoport said. “In the past, a business idea was considered a failure if it stopped working after two years. But, today, it’s much more about trying out ideas, implementing them and then discarding them again.”
Nowadays, games and their user interfaces have found their way into almost every part of the economy. For instance, cars like the hybrid Honda Insight display flowers and medals on a screen as rewards for energy-conscious driving.
The Gartner market research company predicts that, within a few years, the economic importance of game-based advertising will be similar to that of Facebook today. But, in the 21st century, the real question is: Exactly who is playing with whom?
Helmut Dubiel is leaning back into the sofa in the living room of his row house in the Bockenheim district of Frankfurt. The sociologist is 65 and incurably ill. He has Parkinson’s disease.
Every movement on the sofa is difficult for Dubiel, and his voice is barely more than a whisper. “I’ve had Parkinson’s since the early 1990s, when I was 46,” Dubiel says. At the time, he was at the height of his career as the director of the city’s Institute for Social Research.
At first, he ignored the disease, but then he began fighting it by taking up to 30 pills a day. Dubiel eventually faced the question of “whether I should have a brain pacemaker inserted.”
In 2003, two electrodes were implanted deep into Dubiel’s brain in a 10-hour operation, during which he was fully conscious. Wires connect the electrodes to a subdermal control unit on the right side of his chest. Using a small remote-control device, Dubiel can modify the pulse strength, thereby stimulating the affected parts of the brain. He can decide whether to improve his ability to walk or speak. The more comprehensible Dubiel’s speech becomes, the more he loses control over the rest of his body, and vice versa.
Dubiel’s brain pacemaker is known as a neuro-implant. It isn’t the only device of its kind currently in use. Cochlear implants for the deaf, for example, convert tones and sounds into electric signals and transmit them through an electrode to the auditory nerve. Millimeter-sized chips are implanted under the retina of blind people, converting light into nerve impulses. Neuro-stimulators implanted into patients with chronic pain can temporarily switch their nerves off. Conversely, the nerves of paralyzed patients can control artificial prostheses.
Computer technologies have been a boon to medicine and of great benefit to human beings. But the advances also illustrate how the divide between man and machine is narrowing. Neuro-implants mark this boundary because they entail having a machine penetrate the human body. Although today’s instruments are still relatively crude, brain pacemakers are already being used in patients with depression and obsessive-compulsive disorder. In this way, machines are no longer just intervening in the body’s mechanical functions, but also in its emotional life.
After his operation, Dubiel wrote a book about his disease. He criticized the surgery at first because he subsequently had trouble speaking and his handwriting became illegible. He had to learn to control the intervention into his own brain. Today, he says that he would have the operation again if given the chance. “We have to learn how to deal with this technology,” he says.
This also applies to politics, in particular. Digitization has been changing everyday life, affluence, work, living, loving and recreation for more than 40 years. But the political world has often failed to keep up when it comes to creating adequate basic rules, laws, supervision and management of the changes.
The issue isn’t something senior politicians address in Berlin, either. Cornelia Rogall-Grothe, a state secretary in the German Interior Ministry and the government’s coordinator for information technology, says things like: “The Federal Ministry of the Interior is currently developing the draft of a federal e-government law with the objective of facilitating electronic communication with the administration for citizens and the economy.”
Thus, while we download music and do our banking online, and computers control entire factories, the German government is merely developing draft legislation.
Politicians should be careful not to be overrun by developments, just as some companies have been. In the early 1970s, shortly after the invention of his microchip, Federico Faggin paid a visit to Nixdorf, a computer manufacturer in the northern German city of Paderborn. He wanted to present his innovative integrated processor to the Germans and offer it to them. But the German engineers declined, telling Faggin that there was no money to be made with his little toy. Today, there are no longer any computers under the brand name Nixdorf.