Let's get the semantic nit-picking out of the way first. You would not drive an autonomous car. You would simply be a passenger in the car. Now let's get on to the meat of the matter.
The article points out ethical and legal issues surrounding decisions the technology must make in determining how to handle situations where some people are likely to die. Should it try to preserve the lives of the car's occupants at all costs or should it be more altruistically programmed to try to save the highest number of lives? A linked video asks, "If your robot commits murder, should you go to jail?"
While these are interesting questions, Google consultant Brad Templeton argues in this blog post that they are largely the domain of philosophy class debate. Such questions, he contends, are so far from reality that they don't rank "anywhere high on the list of important issues and questions." He notes that most drivers never face such decisions, thus implying that the same will be true of the vast majority of autonomous cars.
Judging from how things have worked in the past, social acceptance, ethical viewpoints, and legal interpretations will evolve as these questions arise in real time. I do not believe all of these things need to be fully settled in advance, nor do I believe it is even possible to adequately anticipate many of them until the issues arise in the context of their own day and age.
Besides, we regularly turn our safety over to much more fallible human "machines" today. Every time you are a passenger in any kind of vehicle operated by a human, you are at the mercy of that person's fallible faculties. Perhaps even more importantly, you are at the mercy of every other vehicle operator you encounter along the way. This is true of travel by ground, sea, or air. I don't see the shift to more technology as a hugely different issue.
Technological advancement has always been both welcomed and feared by humans. The term Luddite is commonly used to refer to those who fear technological developments. (This Smithsonian article explains that the historical Luddites were fine with machinery; they just wanted to preserve high wages for machine operators. Still, the term has come to be used the way it is used today.)
In my (admittedly limited) experience, Luddite aptly describes the initial reaction most people have to autonomous cars. When the subject is brought up, people seem to respond with the following fears:
- The loss/reduction of personal freedom.
- The imperfect technology will cause some crashes, injuries, and probably deaths.

On the other hand, the potential benefits are enormous:

- A massive reduction in driver error, the #1 factor in the vast majority of crashes. (SmartMotorist.com reports that "Over 95% of motor vehicle accidents ... involve some degree of driver behavior ....") More on this later.
- Getting problem drivers (elderly, distracted, impaired, novice, etc.) out from behind the wheel without limiting their mobility.
- Freedom for people with driving limitations to get around. Frankly, I'm hoping that autonomous cars are ubiquitous by the time I am no longer capable of driving safely.
- An increase in Uber-like services that let people get rides when needed and pay only for what they use, instead of paying 100% of the cost of a car that sits parked 95+% of the time. This will mean that most places with parking lots today will need smaller lots but perhaps larger dropoff/pickup zones.
- The ability to spend your commute doing something other than driving the car and worrying about other drivers. How would it be to sleep through a long trip to a vacation spot?
Pretty much everyone agrees that self-driving cars will radically cut the number of crashes over time. But most people speaking from fear seem to demand zero crashes caused by the new technology. That is not even remotely realistic. With systems designed by humans to move humans around other humans, some crashes will occur. But demanding zero crashes from new technology while accepting some 5.5 million crashes involving human drivers each year makes no logical sense, whatever level of freedom one thinks operating a car brings.
Most experts agree that crashes involving autonomous cars will be highest during the crossover years, when there are still lots of human-operated vehicles on the roads. At first, autonomous cars will be very unusual. But just as gasoline-powered cars overtook the horse and buggy, autonomous cars will eventually become the rule. The time will come when human-driven cars are considered unacceptably dangerous on public roads. As with horses today, there will be places where people can go to drive cars, but those places will mostly be off the public roads. As this change occurs, infrastructure will morph to address the new realities.
Don't worry, this change isn't going to happen all at once. We will be eased into it a little at a time. Automobile manufacturers have been adding "driver assist" features for years. We've had cruise control since the 1970s. You can already buy high-end cars that find a parking spot and park for you once you pull into a parking lot.
More and more features will become available, first in high-end cars, then moving down to mid-level cars, and finally pushing their way into low-end cars. People will use these features for the convenience they bring. Then one day they will be sitting there using their mobile devices as the car hauls them somewhere, thinking how glad they are that they no longer have to pay attention to traffic.
Autonomous cars are coming. It's not a matter of if; it's a matter of when. You can fear it. But that won't stop it from coming. And like our ancestors, you will eventually find yourself using the new technology, even if you continue to express misgivings about what it is doing to society.