Self Driving Cars (Topic Closed)

Man With Hat
Collaborator
Jazz-Rock/Fusion/Canterbury Team

Joined: March 12 2005
Location: Neurotica
Status: Offline
Points: 166178
Posted: June 29 2016 at 15:46
Originally posted by Finnforest:

I just read that "car morality" is one of the big issues that developers are grappling with.  In the event of a sudden pedestrian jumping in front of your car, they are trying to decide if the car should hit the person, or swerve into an oak tree to the side (assuming there is nowhere safe to swerve and stop is not possible). 

In other words, should the car be programmed to kill the pedestrian(s), or kill the occupants.  Most people polled say the car should kill the occupants.  However, most admitted if they had a choice they would buy a car programmed to save occupants at all costs.  LOL

I recently read about this too and find it a bit fascinating if I'm honest. I'd have to say, however cold-hearted it sounds, if the pedestrian does just jump out in front of the car, especially in a non-crosswalk area, I'd be OK with the pedestrian being hit. I'm not saying to target them of course...but you know...killing the occupant of the car because of someone else being an idiot is not a path I want to be on.

Other than that, I like the idea of self driving cars, if for nothing else it would (theoretically) make travel easier.
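
The two policies people were polled on really just differ in how harm gets ranked. A toy sketch in Python (purely illustrative; the Outcome numbers and the choose() helper are invented for this post, not anyone's real software):

    from dataclasses import dataclass

    @dataclass
    class Outcome:
        action: str            # e.g. "brake straight" or "swerve into tree"
        occupant_harm: float   # expected harm to people in the car, 0..1
        pedestrian_harm: float # expected harm to people outside, 0..1

    def choose(outcomes, protect_occupants):
        # "Save occupants at all costs": rank by occupant harm first.
        if protect_occupants:
            return min(outcomes, key=lambda o: (o.occupant_harm, o.pedestrian_harm))
        # Utilitarian: rank by total expected harm.
        return min(outcomes, key=lambda o: o.occupant_harm + o.pedestrian_harm)

    options = [
        Outcome("brake straight", occupant_harm=0.05, pedestrian_harm=0.9),
        Outcome("swerve into oak tree", occupant_harm=0.8, pedestrian_harm=0.0),
    ]
    print(choose(options, protect_occupants=True).action)   # brake straight
    print(choose(options, protect_occupants=False).action)  # swerve into oak tree

Same inputs, opposite answers - which is exactly why the polling result is so awkward for the manufacturers.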
Dig me...But don't...Bury me
I'm running still, I shall until, one day, I hope that I'll arrive
Warning: Listening to jazz excessively can cause a laxative effect.
*frinspar*
Forum Senior Member

Joined: May 27 2008
Location: Arizona
Status: Offline
Points: 463
Posted: June 29 2016 at 18:08
No more road trip movies. "Oh, here's a story about 2 people going cross-country in a safe bubble, talking. Buy popcorn!" LOL
Magnum Vaeltaja
Special Collaborator
Honorary Collaborator

Joined: July 01 2015
Location: Out East
Status: Offline
Points: 6777
Posted: June 29 2016 at 18:21
Originally posted by *frinspar*:

No more road trip movies. "Oh, here's a story about 2 people going cross-country in a safe bubble, talking. Buy popcorn!" LOL

I suppose that once self-driving cars are universally (or near-universally) adopted they could make a road trip movie that plays on the novelty of someone who still drives their own car going cross-country.
when i was a kid a doller was worth ten dollers - now a doller couldnt even buy you fifty cents
Finnforest
Special Collaborator
Honorary Collaborator

Joined: February 03 2007
Location: .
Status: Offline
Points: 16913
Posted: June 29 2016 at 21:29
Originally posted by Man With Hat:

Originally posted by Finnforest:

I just read that "car morality" is one of the big issues that developers are grappling with.  In the event of a sudden pedestrian jumping in front of your car, they are trying to decide if the car should hit the person, or swerve into an oak tree to the side (assuming there is nowhere safe to swerve and stop is not possible). 

In other words, should the car be programmed to kill the pedestrian(s), or kill the occupants.  Most people polled say the car should kill the occupants.  However, most admitted if they had a choice they would buy a car programmed to save occupants at all costs.  LOL

I recently read about this too and find it a bit fascinating if I'm honest. I'd have to say, however cold-hearted it sounds, if the pedestrian does just jump out in front of the car, especially in a non-crosswalk area, I'd be OK with the pedestrian being hit. I'm not saying to target them of course...but you know...killing the occupant of the car because of someone else being an idiot is not a path I want to be on.

Other than that, I like the idea of self driving cars, if for nothing else it would (theoretically) make travel easier.



Agree.  The moron who steps in front of moving vehicles is the one who should perish, not the guy being transported by an autonomous vehicle.  One of the earliest things my parents ever beat into me: look both ways before you step into the street.


Edited by Finnforest - June 29 2016 at 21:30

ClemofNazareth
Special Collaborator
Prog Folk Researcher

Joined: August 17 2005
Location: United States
Status: Offline
Points: 4659
Posted: June 29 2016 at 21:44
Originally posted by Finnforest:

I just read that "car morality" is one of the big issues that developers are grappling with.  In the event of a sudden pedestrian jumping in front of your car, they are trying to decide if the car should hit the person, or swerve into an oak tree to the side (assuming there is nowhere safe to swerve and stop is not possible). 

In other words, should the car be programmed to kill the pedestrian(s), or kill the occupants.  Most people polled say the car should kill the occupants.  However, most admitted if they had a choice they would buy a car programmed to save occupants at all costs.  LOL



Google appears to have come up with a solution to this dilemma (and found a way to profit from it):

http://www.theverge.com/2016/5/19/11711738/google-self-driving-cars-patent-sticky-flypaper
"Peace is the only battle worth waging."

Albert Camus
Dean
Special Collaborator
Retired Admin and Amateur Layabout

Joined: May 13 2007
Location: Europe
Status: Offline
Points: 37575
Posted: June 30 2016 at 08:03
Originally posted by Finnforest:

Very interesting, thanks Dean as always for your technical explanations that are easy enough for a non techie like me to follow. 

You seem much less confident than the media, who, in my reading about this, are very confident this is just around the corner... from the Guardian:

Self-driving cars: from 2020 you will become a permanent backseat driver



Assuming they are correct and that there are millions of fully autonomous cars on the road in a decade, do you think they will allow the consumer to choose to human-drive?  Or do you think self-drive will be the only eventual option, mandated by the folks who have largely taken away my beloved swings, diving boards, and other sources of (dangerous) fun I had growing up (while still selling the tobacco and alcohol products which kill far more humans than most of the issues we hand-wring about)?  I have read some who believe they will not give the consumer the choice to drive...and that is really the only problem I have with self-driving cars.  If others want to crack a book and just ride, that's awesome.  But if I want to drive myself I think I should be able to.  I don't believe safety should be the only factor in the decision.  The people who think that way are the reason you can't find decent swings or Giganta anymore.  Tongue
[sarcasm]Welll... now we're out of Europe the UK is apparently free from all that hand-wringing "Health and Safety gone mad" nonsense, so we'll continue to drive on the wrong side of the road and do none of that other safety-conscious stuff imposed on us by faceless bureaucrats who we have been told we didn't vote for (but we did), therefore the UK at least will not be adopting the self-drive-only option.[/sarcasm]

Joking aside... it will happen eventually - and when it does, all fully manual (non-self-drive) cars will be banned from the roads, or at least from all urban and suburban areas and major trunk roads and highways. Since all traffic deaths are, to all intents and purposes, accidental deaths, there are no magic fixes or panaceas that can be used to prevent them other than removing the greatest potential cause of the accident from the equation. Prevention is generally preferable to cure - airbags and seat belts save lives, but they are not preventatives.

As Pat said in his post, anyone who wants to experience the joy of driving themselves will have to hire a track by the hour - which isn't my cup of tea.
Originally posted by Finnforest:


(For those too young to remember real swings, this is what a decent swing looks like. Check out the two who are standing!)


Ah yes, I remember it like it was only half a century ago...

You'll notice that in the following pictures kids are also standing instead of sitting, as this was the best way of getting them to work. The first looks innocuous, but with a couple of bigger kids on board it could get pretty scary; however, the worst injury any of us ever received was mild concussion ... ah, happy daze ;-)

[pictures: a 6-person rocking horse; a cheese-cutter (the one in our village park was much bigger than this and had running boards); and the witches-hat]
All of these pieces of equipment are now "banned" from playgrounds, somewhat ironically because the founder of the UK company that made them wanted safe playgrounds set up so that kids in urban areas didn't have to play in the street. While he made his money from selling his equipment to borough councils and municipal corporations, the theme park he established himself was the first and largest free theme park in Europe.


Edited by Dean - June 30 2016 at 08:05
What?
Dean
Special Collaborator
Retired Admin and Amateur Layabout

Joined: May 13 2007
Location: Europe
Status: Offline
Points: 37575
Posted: July 01 2016 at 04:31
I expect by now some of us are aware of the news story that appeared in the media yesterday regarding a fatal accident two months ago involving a Tesla Model S being driven on Autopilot.

Tesla claim that this is the first fatality in 130 million miles of Autopilot driving, compared to the US national average of one fatality per 95 million miles... which means that driving a Tesla on Autopilot in the USA is about as safe as normal driving in Germany (but still not as safe as in the UK or a Nordic country) - if that Autopilot fatality rate were a real statistic (which, being derived from a single data point, it isn't). And for self-drive to become the only viable option for personal transport, the actual numbers have to be magnitudes better. I'd realistically put a factor of 10 there, but that would still mean 3,500 deaths per year on US roads - ideally the number needs to come down by a factor of 100 or 1,000 before everyone decides that self-drive is the only option.
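
Back-of-envelope check on those numbers (assumed round figures: roughly 35,000 US road deaths and about 3.2 trillion vehicle-miles per year - approximations, not official statistics):

    us_deaths_per_year = 35_000
    us_miles_per_year = 3.2e12

    # Implied US average: roughly one death per 91 million miles,
    # consistent with the ~95 million figure quoted above.
    print(us_miles_per_year / us_deaths_per_year / 1e6)  # ~91 (million miles)

    # Deaths that would remain after a given improvement factor:
    for factor in (10, 100, 1000):
        print(factor, "x better:", us_deaths_per_year // factor, "deaths/year")
    # 10x still leaves 3,500 a year; only at 100x or 1000x does the
    # number start to look acceptable to the public.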

Originally posted by local police report:

In a separate crash on May 7 at 3:40 p.m. on U.S. 27 near the BP Station west of Williston, a 45-year-old Ohio man was killed when he drove under the trailer of an 18-wheel semi.

The top of Joshua Brown’s 2015 Tesla Model S vehicle was torn off by the force of the collision. The truck driver, Frank Baressi, 62, of Tampa, was not injured in the crash.

The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A.

When the truck made a left turn onto NE 140th Court in front of the car, the car’s roof struck the underside of the trailer as it passed under the trailer. The car continued to travel east on U.S. 27A until it left the roadway on the south shoulder and struck a fence. The car smashed through two fences and struck a power pole. The car rotated counter-clockwise while sliding to its final resting place about 100 feet south of the highway. Brown died at the scene.

It is believed that the car's software perceived the radar image of the trailer across the carriageway to be an overhead gantry and so failed to apply the brakes. It also seems that the brakes weren't applied at any time before the impact, so the driver did not override the system, and we'll never know why. It has been speculated that the driver didn't see the truck silhouetted against the sky - except he was travelling east in mid-afternoon, so the sun was behind him... I think it is more likely that he didn't see the truck because he wasn't looking: no matter how bad the visibility, if he had been looking he would have hit the brakes in the seconds before impact.
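
My reading of the reported failure mode, as a toy clearance check (invented for illustration - this is not Tesla's code, and the heights are guesses):

    CAR_HEIGHT_M = 1.45   # roof height of a Model S, roughly

    def should_brake(obstacle_underside_height_m, margin_m=0.5):
        # Brake only if the detected object is too low to pass beneath.
        return obstacle_underside_height_m - CAR_HEIGHT_M < margin_m

    print(should_brake(5.0))   # False: a sign gantry 5 m up is correctly ignored
    print(should_brake(1.2))   # True: a trailer bed ~1.2 m up should trigger braking

If the radar return off the trailer's high, flat side got binned as an overhead structure, the braking check would never fire and the car would sail on - which matches what apparently happened.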

The trajectory of the car after the impact also raises more questions, since it is apparent that neither Autopilot nor driver was in control of the vehicle, so the philosophical dilemma of minimising collateral damage in the event of an accident could never be addressed. Once the system fails to be in control, the car is a non-discriminating, 2-tonne, unguided projectile travelling at 100 km/h.

So, as I intimated in my first post - self-drive software isn't smart enough to deal with all possible scenarios and probably never will be, because every situation is different and no level of programming can deal with that. Software cannot predict the future - it can assess all available data, formulate some possible outcomes based upon what it has been programmed to do, and then choose the safest course of action from those. No doubt the Tesla programmers will now change their software and will continue to refine it as more accident data is accrued, but there is a finite limit to how much real-time processing can be done with current technology.
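
That "formulate outcomes, choose the safest" loop is, at heart, nothing more exotic than this (a generic pattern, not any vendor's algorithm; the risk numbers are invented):

    def safest_action(risk_by_action):
        # risk_by_action maps an action to its estimated probability of harm.
        # The software can only rank the futures it was programmed to imagine.
        return min(risk_by_action, key=risk_by_action.get)

    print(safest_action({"continue": 0.02, "brake": 0.01, "swerve left": 0.30}))  # brake

Everything hangs on where those risk estimates come from, and an unanticipated scenario simply never makes it into the dictionary.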

A truly safe risk-assessing self-drive car would never leave the garage because it would logically deduce that to be the safest course of action.


Edited by Dean - July 01 2016 at 04:43
What?