Self-driving cars should leave us all unsettled. Here’s why.

It is a warm autumn morning, and I am walking through downtown Mountain View, Calif., when I see it. A small vehicle that looks like a cross between a golf cart and a Jetson-esque, bubble-topped spaceship glides to a stop at an intersection. Someone is sitting in the passenger seat, but no one seems to be sitting in the driver's seat. How odd, I think. And then I realize I am looking at a Google car. The technology giant is headquartered in Mountain View, and the company is road-testing its diminutive autonomous cars there.

This is my first encounter with a fully autonomous vehicle on a public road in an unstructured setting.

The Google car waits patiently as a pedestrian passes in front of it. Another car across the intersection signals a left turn, but the Google car has the right of way. The automated vehicle takes the initiative and smoothly accelerates through the intersection. The passenger, I notice, appears preternaturally calm.

I am both amazed and unsettled. I have heard from friends and colleagues that my reaction is not uncommon. A driverless car can challenge many assumptions about human superiority to machines.

Even though I live in Silicon Valley, a driverless car remains one of the most startling manifestations of the future unknowns we all face in this age of rapid technological development. Learning to drive is a rite of passage for people in materially rich nations (and is becoming so in the rest of the world): a symbol of freedom, of power, and of the agency of adulthood; a parable of how brains can overcome physical limitations to expand the boundaries of what is possible. The act of driving a car is one that, until very recently, seemed a problem only the human brain could solve.

Driving is a combination of continuous mental risk assessment, sensory awareness, and judgment, all adapting to extremely variable surrounding conditions. Not long ago, the task seemed too complicated for robots to handle. Now, robots can drive with greater skill than humans — at least on the highways. Soon the public conversation will be about whether humans should be allowed to take control of the wheel at all.

This paradigm shift will not be without costs or controversies. To be sure, widespread adoption of autonomous vehicles will eliminate the jobs of the millions of Americans whose living comes from driving cars, trucks, and buses (and eventually all those who pilot planes and ships). We will begin sharing our cars, in a logical extension of Uber and Lyft. But how will we handle the inevitable software faults that result in human casualties? And how will we program the machines to make the right decisions when faced with impossible choices — such as whether an autonomous car should drive off a cliff to spare a busload of children at the cost of killing the car’s human passenger?

I was surprised, upon my first sight of a Google car on the street, at how mixed my emotions were. I’ve come to realize that this emotional admixture reflects the countercurrents these technologies set in motion, their bow waves rocking all of us: trends toward efficiency, instantaneity, networking, accessibility, and multiple simultaneous media streams, with consequences that include unemployment, cognitive and social inadequacy, isolation, distraction, and cognitive and emotional overload.

Once, technology was a discrete industry, dominated by business systems and some cool gadgets. Slowly but surely, though, it crept into more corners of our lives. Today, that creep has become a headlong rush. Technology is taking over everything: every part of our lives, every part of society, every waking moment of every day. Increasingly pervasive data networks and connected devices are enabling rapid communication and processing of information, ushering in unprecedented shifts — in everything from biology, energy and media to politics, food and transportation — that are redefining our future. Naturally we’re uneasy; we should be. The majority of us, and our environment, may receive only the backlash of technologies chiefly designed to benefit a few. We need to feel a sense of control over our own lives, and that necessitates actually having some.

The perfect metaphor for this uneasy feeling is the Google car. We welcome a better future, but we worry about the loss of control, of pieces of our identity, and most importantly of freedom. What are we yielding to technology? How can we decide whether technological innovation that alters our lives is worth the sacrifice?

The noted science-fiction writer William Gibson, a favorite of hackers and techies, said in a 1999 radio interview (though apparently not for the first time): “The future is already here; it’s just not very evenly distributed.” Nearly two decades later, Gibson’s observation remains valid, even though the potential now exists for most of us, including the very poor, to participate in informed decision-making about that distribution, and even about bans on the use of certain technologies.

I make my living thinking about the future and discussing it with others, and am privileged to live in what to most people is the future. I drive an amazing Tesla Model S electric vehicle. My house, in Menlo Park, close to Stanford University, is a “passive” home that draws virtually no electricity from the grid and expends minimal energy on heating or cooling. My iPhone is cradled in a case fitted with electronic sensors that I can place against my chest to generate a detailed electrocardiogram and send it to my doctors from anywhere on Earth.

Many of the entrepreneurs and researchers I talk with about breakthrough technologies such as artificial intelligence and synthetic biology are building a better future at a breakneck pace. One team built a fully functional surgical-glove prototype to deliver tactile guidance for doctors during examinations — in three weeks. Another team’s visualization software, which can tell farmers the health of their crops using images from video cameras flown on off-the-shelf drones, took four weeks to build.

The distant future, then, is no longer distant. Rather, the institutions we expect to gauge and perhaps forestall new technologies’ hazards, to distribute their benefits, and to help us understand and incorporate them are drowning in a sea of change as the pace of technology outstrips them.

The shifts and the resulting massive ripple effects will, if we choose to let them, change the way in which we live, how long we live, and the very nature of being human. My futuristic life may sound unreal, yet within a decade we may look back on its current state and laugh at how primitive it was — because our technologists now have the tools to enable the greatest alteration of our experience of life since the dawn of humankind. As with all previous epochal shifts — from the use of fire to the rise of agriculture and the development of sailing vessels, internal-combustion engines, and computing — this one will arise from breathtaking advances in technology. It is far larger, though, is happening far faster, and may be far more stressful to those living through this new epoch. Inability to understand it will make our lives and the world seem even more out of control.

A broad range of technologies, from artificial intelligence and genomics to robotics and synthetic biology, is now advancing at an exponential pace. These technologies are making amazing and scary things possible at the same time. Broadly speaking, we will, jointly, choose one of two possible futures. The first is a utopian “Star Trek” future in which our wants and needs are met, and in which we focus our lives on the attainment of knowledge and the betterment of humankind. The other is a “Mad Max” dystopia: a frightening and alienating future in which civilization destroys itself.

These are both worlds of science fiction created by Hollywood, but either could come true. We are already capable of creating a world of tricorders, replicators, remarkable transportation technologies, general wellness, and an abundance of food, water, and energy. On the other hand, we are also now capable of ushering in a jobless economy; the end of all privacy; invasive medical-record keeping; eugenics; and an ever-worsening spiral of economic inequality: conditions that could create an unstable, Orwellian, or violent future that might undermine the very technology-driven progress we so eagerly anticipate. And we know that it is possible to inadvertently unwind civilization’s progress. That is precisely what happened in Europe when, after the fall of the Roman Empire, humanity slid into the Dark Ages, a period during which significant chunks of knowledge and technology that the Romans had won through hard trial and error disappeared from the face of the Earth. Merely cataclysmic instability would suffice to unwind our own civilization’s amazing progress.

It is the choices we all make that will determine the outcome. Technology will surely create upheaval and destroy industries and jobs. It will change our lives for better and for worse simultaneously. But we can reach “Star Trek” if we share the prosperity we are creating and soften its negative impacts; ensure that the benefits outweigh the risks; and gain greater autonomy rather than becoming dependent on technology.

The oldest technology of all is probably fire, even older than the stone tools our ancestors invented. It could cook meat and provide warmth; it could also burn down forests. Every technology since has had the same bright and dark sides. Technology is a tool; it is how we use it that makes it good or bad. There is a continuum limited only by the choices we make jointly. And all of us have a role in deciding where the lines should be drawn.

This is an excerpt from my new book, “The Driver in the Driverless Car: How Our Technology Choices Will Create the Future.”
