Forum Index > Full Moon Saloon > Self driving cars...when the code decides, who is responsible?
MtnGoat
Member

Joined: 17 Dec 2001
Posts: 11992 | TRs | Pics
Location: Lyle, WA
Posted: Sat Nov 25, 2017 10:19 am
Interesting article on the ethical issues of self driving cars..
Quote:
Who dies when the car is forced into a no-win situation? “There will be crashes,” said Van Lindberg, an attorney in the Dykema law firm's San Antonio office who specializes in autonomous vehicle issues. “Unusual things will happen. Trees will fall. Animals, kids will dart out.” Even as self-driving cars save thousands of lives, he said, “anyone who gets the short end of that stick is going to be pretty unhappy about it.”

Few people seem to be in a hurry to take on these questions, at least publicly. It’s unaddressed, for example, in legislation moving through Congress that could result in tens of thousands of autonomous vehicles being put on the roads. In new guidance for automakers by the U.S. Department of Transportation, it is consigned to a footnote that says only that ethical considerations are "important" and links to a brief acknowledgement that "no consensus around acceptable ethical decision-making" has been reached.

Whether the technology in self-driving cars is superhuman or not, there is evidence that people are worried about the choices self-driving cars will be programmed to take. Last year, for instance, a Daimler executive set off a wave of criticism when he was quoted as saying its autonomous vehicles would prioritize the lives of its passengers over anyone outside the car. The company later insisted he’d been misquoted, since it would be illegal “to make a decision in favor of one person and against another.”

Last month, Sebastian Thrun, who founded Google’s self-driving car initiative, told Bloomberg that the cars will be designed to avoid accidents, but that “If it happens where there is a situation where a car couldn’t escape, it’ll go for the smaller thing.” But what if the smaller thing is a child? How that question gets answered may be important to the development and acceptance of self-driving cars.
Azim Shariff, an assistant professor of psychology and social behavior at the University of California, Irvine, co-authored a study last year that found that while respondents generally agreed that a car should, in the case of an inevitable crash, kill the fewest number of people possible regardless of whether they were passengers or people outside of the car, they were less likely to buy any car “in which they and their family member would be sacrificed for the greater good.”
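The "fewest casualties" policy from Shariff's study could be sketched as a toy decision function. This is purely illustrative; the function name, labels, and numbers are hypothetical, not any manufacturer's actual logic:

```python
# Toy sketch of the "fewest casualties" policy described in the study
# quoted above. Purely illustrative -- real systems do not reduce to
# a one-liner like this.

def choose_outcome(options):
    """Pick the unavoidable outcome with the fewest casualties,
    regardless of whether they are passengers or bystanders.
    `options` is a list of (label, casualties) pairs."""
    return min(options, key=lambda o: o[1])

# Example: swerving harms one bystander; staying on course harms
# three passengers. The utilitarian policy chooses the swerve,
# which is exactly the choice survey respondents endorsed in the
# abstract but disliked in a car they would buy.
print(choose_outcome([("swerve", 1), ("stay", 3)]))
```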

Will your car's code kill you for the 'greater' good? To me, this is very cut and dried: the party making decisions at the 'wheel' holds responsibility for the vehicle's control. If it's a human, it's the human; if it's code, it's the manufacturer.

Diplomacy is the art of saying 'Nice doggie' until you can find a rock. - Will Rogers
HermitThrush
Member

Joined: 14 Jan 2016
Posts: 384 | TRs | Pics
Location: Brainerd Lakes Area, MN
Posted: Sat Nov 25, 2017 11:17 am
Self-driving? No thanks.

MtnGoat
Member

Joined: 17 Dec 2001
Posts: 11992 | TRs | Pics
Location: Lyle, WA
Posted: Sat Nov 25, 2017 11:34 am
If people want to use them it's fine with me, but if your car can't handle the road with human drivers and the rest of the world going on around it, that's on your manufacturer and not everyone else.

Diplomacy is the art of saying 'Nice doggie' until you can find a rock. - Will Rogers
Randito
Snarky Member

Joined: 27 Jul 2008
Posts: 8831 | TRs | Pics
Location: Bellevue at the moment.
Posted: Sat Nov 25, 2017 12:17 pm
So far, in all the collisions between AI-driven vehicles and human-driven vehicles, the human driver has been at fault. Many of these were rear-end collisions in which the AI-driven vehicle was struck from behind by the human-driven vehicle; one case was a delivery truck that backed into an AI-driven mini-bus. As AI-driven vehicles become more numerous and more municipalities permit them on their streets, more collisions will occur. There will be lawsuits -- just as there are many lawsuits every day when two human-driven vehicles collide and the respective insurance companies duke it out over who pays.

Many people feel uncomfortable about "not being in control" -- e.g. many people have a greater fear of flying on commercial jets than of driving, even though the actual risk of death is higher in a car (in terms of fatal accidents per mile traveled). I know some people who take the train to the east coast rather than fly because their fear of flying is uncomfortably intense. I suspect it will be the same with AI-driven vehicles -- adoption will be slow at first, then expand rapidly, and eventually it will be "normal," but a few outliers will still drive their own vehicles. This will be a long process -- there are still vehicles from the '60s and earlier on the roads today -- so I would fully expect that in 2050 the roads will still carry a mixture of AI-driven and human-driven vehicles. By 2100 I think human-driven vehicles will be a rare exception.

Personally, I'm ready for an AI-driven vehicle as soon as they are permitted and available in non-luxury brands. I'd love to roll out of bed and into (my/a) car, grumble "Phelps Creek Trailhead," and snooze for another few hours. Even better: after returning tired from a long hike, utter "home" and be taken there without fighting to stay awake.

moonspots
Happy Curmudgeon

Joined: 03 Feb 2007
Posts: 2454 | TRs | Pics
Location: North Dakota
Posted: Sat Nov 25, 2017 3:39 pm
HermitThrush wrote:
Self-driving? No thanks.

Absolutely!

"Out, OUT you demons of Stupidity"! - St Dogbert, patron Saint of Technology
Slugman
It’s a Slugfest!

Joined: 27 Mar 2003
Posts: 16700 | TRs | Pics
Posted: Sat Nov 25, 2017 3:41 pm
The lack of self-driving cars causes tens of thousands of traffic deaths every year in this country alone. While early models might not be great, and transitions are always awkward, eventually they will be a godsend. Heavy drinkers will no longer be killers, people with visual impairments will have the same mobility options as everyone else, smartphone addicts will no longer be dangerous morons (the dangerous part, anyway), and traffic deaths will likely drop to a tiny percentage of what they are now.

Jaberwock
Member

Joined: 30 Jan 2013
Posts: 721 | TRs | Pics
Location: Bellingham
Posted: Sat Nov 25, 2017 5:10 pm
I am all for self-driving cars. People kill so many people with cars it's not even funny -- something like 30,000 per year in the USA. What really made me realize how much of an impact self-driving cars will have was an article I read on how organ-donation organizations are scrambling to come up with a new source, since there will be so many fewer organs available because of the reduction in deaths. Think about that. Nuts!

Schroder
Member

Joined: 26 Oct 2007
Posts: 6295 | TRs | Pics
Location: on the beach
Posted: Sat Nov 25, 2017 5:23 pm
I took an advanced driving instructor's course some years back from General Motors at their proving grounds. I was training to teach emergency vehicle drivers. The number one rule taught was never hit a pedestrian under any circumstances, even if it meant a head-on collision.

Get Out and Go
Member

Joined: 13 Nov 2004
Posts: 2055 | TRs | Pics
Location: Leavenworth
Posted: Sat Nov 25, 2017 5:55 pm
Really looking forward to checking out a self-driving big-rig semi-truck up close and personal in its beta stage...spoken from one who can't even quite trust the backup camera.

"These are the places you will find me hiding'...These are the places I will always go." (Down in the Valley by The Head and The Heart) "Sometimes you're happy. Sometimes you cry. Half of me is ocean. Half of me is sky." (Thanks, Tom Petty)
Alpendave
Member

Joined: 01 Aug 2008
Posts: 898 | TRs | Pics
Posted: Sat Nov 25, 2017 6:13 pm
As long as they have a ready override capability (where the driver can override the vehicle) and it is illegal to take a nap while your vehicle gets you there (GTH if you think that drunks should be able to just crawl in and go), then I suppose they would be fine. The driver should still be the ultimate one in control, and should have to make some good-faith effort at being alert. I don’t care about alcoholics who think they have a right to navigate the road. Stay put, or go to jail. If you do drive, I hope you only kill yourself.

Addendum: I felt kinda bad for being so severe. But after so many years of trauma radiography experience, I’m a wee bit jaded against drunks who drive. Seen ‘em ruin (and end) too many people’s lives. But I’m not a perfect driver myself so...

The highest form of dissent is to love those who will not give you the freedom to disagree with them. To genuinely love your enemies is the purest form of freedom from their power. Life is too short to take too seriously.
Brian Curtis
Trail Blazer/HiLaker

Joined: 16 Dec 2001
Posts: 1635 | TRs | Pics
Location: Silverdale, WA
Posted: Sat Nov 25, 2017 10:26 pm
MtnGoat, I'm struggling with your initial post. You brought up a subject (liability) that wasn't addressed in the article and isn't a particularly difficult question. The article was asking how autonomous cars make ethical decisions (who dies in a crash if someone must be sacrificed?) and how programmers account for other difficult decisions that humans can make fairly easily but that are more difficult for computers. These are all very interesting questions with no easy answers.

We just bought a new car with some automated driving features such as adaptive cruise control, emergency braking, and lane departure assist that steers the car back into the lane if you drift too close to the line. It is very different to drive down the road without touching the brake or accelerator, and it is very strange to have the steering wheel actively turn. But I'm all for more highway safety, and that is one of the promises of self-driving cars. Bring 'em on.

that elitist from silverdale wanted to tell me that all carnes are bad--Studebaker Hoch
moonspots
Happy Curmudgeon

Joined: 03 Feb 2007
Posts: 2454 | TRs | Pics
Location: North Dakota
Posted: Sat Nov 25, 2017 11:37 pm
Brian Curtis wrote:
...and that is one of the promises of self driving cars. Bring 'em on.

And here is why I'm against them: they'll be run by computers, which will be built and programmed by the lowest bidder. In my experience going back 30+ years, without fail, every instance of on-board electronics I've run into has exhibited anything from intermittent or erratic "quirks" to complete failure with no explanation or reason. Sometimes just a re-boot (shut the car off, re-start) is all that's needed; sometimes a replacement of the suspected module does it. Good grief, even NASA can't build failsafe computers and mechanical devices. No, I want no part of this, regardless of any and all of the marketing departments' claims.

"Out, OUT you demons of Stupidity"! - St Dogbert, patron Saint of Technology
Randito
Snarky Member

Joined: 27 Jul 2008
Posts: 8831 | TRs | Pics
Location: Bellevue at the moment.
Posted: Sun Nov 26, 2017 12:39 am
moonspots wrote:
In my experience going back 30+ years, so far, without fail, every instance of on-board electronics I've run into has exhibited anything from intermittent or erratic "quirks" to complete online failure with no explanation or reason.

Of course there will be failures of various sorts, sometimes with bad collisions as a result. But the pertinent question is how that failure rate compares to the "screw-up" rate of human drivers. Cars currently use computerized controls for ignition and fuel-injection timing, anti-lock brakes, and traction control. Yes, all these things fail periodically. Yet the vast majority of collisions are due to human error, not the failure of computer components.

uww
Member

Joined: 16 Dec 2015
Posts: 259 | TRs | Pics
Posted: Sun Nov 26, 2017 1:00 am
We are going to be 15 years away from self-driving cars forever. We barely have self-driving trains.

moonspots
Happy Curmudgeon

Joined: 03 Feb 2007
Posts: 2454 | TRs | Pics
Location: North Dakota
Posted: Sun Nov 26, 2017 9:09 am
RandyHiker wrote:
But the pertinent question is how that failure rate compares to the "screw-up" rate of human drivers.

Understood. But if the failure rate approached the "glitch" rate of the radio electronics in my Dodge pickup, it would be scrapped in the first week of implementation!

However, I believe a better answer would be to require prospective drivers to attend a comprehensive driving school, at their own expense, prior to obtaining a license to drive. And by driving school I mean they learn how to drive, not just how to operate the car and memorize a few rules of the road. This should also apply at every license renewal: learn to drive before being allowed to do so. Further, any collision would result in serious fines, perhaps starting at $1-3K, pro-rated as to who caused the collision. Yeah, I know, that one would be tough to administer, and no legislative body has the moxie to implement either idea, so it's just that: my idea.

Anyway, I'm solidly against self-driving cars unless they operate on their own "slot-car" track, away from the rest of us. That's my well-thought-out opinion, and it is well worth all that you paid for it.

"Out, OUT you demons of Stupidity"! - St Dogbert, patron Saint of Technology