PowerSwitch Main Page
PowerSwitch
The UK's Peak Oil Discussion Forum & Community
 

Driverless cars
biffvernon



Joined: 24 Nov 2005
Posts: 18541
Location: Lincolnshire

PostPosted: Sun Mar 06, 2016 7:19 pm    Post subject: Reply with quote

johnhemming2 wrote:
I worry about the lack of jobs for taxi drivers (including private hire).

The purpose of automation is to give folk the time to listen to jazz quartets. Politicians need to ensure that wealth is distributed fairly to all, including those who no longer need to work now that we have machines.
_________________
http://biffvernon.blogspot.co.uk/
clv101
Site Admin


Joined: 24 Nov 2005
Posts: 8496

PostPosted: Sun Mar 06, 2016 7:50 pm    Post subject: Reply with quote

adam2 wrote:
Driverless HGVs also.
http://www.bbc.co.uk/news/uk-politics-35737104


Indeed. Driverless trains would be far easier than driverless trucks. I guess the train drivers have a better union than the truckers.
_________________
PowerSwitch on Facebook | The Oil Drum | Twitter | Blog
clv101
Site Admin


Joined: 24 Nov 2005
Posts: 8496

PostPosted: Sun Mar 06, 2016 8:01 pm    Post subject: Reply with quote

Little John wrote:
clv101 wrote:
I don't think that's the right framework to look at this. ABS has algorithms to decide when to apply, airbags have algorithms to decide when to deploy, AI has algorithms to decide which way to swerve to avoid collision. There's nothing magic about AI, it's just software....


I shall repeat, here, my answer to biff Vernon. The "decision" by the air bags is not a moral one, it is a mechanistic one. You did read the word "moral" next to the word "decision" littered throughout my previous posts, right? Or are you trying to imply that you do not understand the difference between mechanistic decisions and moral ones?


Yeah, I am struggling to see the difference here. How is the mechanistic algorithm leading to the decision whether or not to deploy an airbag any different from the mechanistic algorithm leading to a decision whether to swerve left (into the pedestrian) or right (into the oncoming traffic)? Just because a human making that decision might base it on a moral judgement doesn't mean the computer does. It would definitely be a mechanistic algorithm; computers don't do anything else. Now, the computer may be programmed to, when faced with that situation, always choose the oncoming traffic (or the pedestrian), but only because the data will have shown one to have a statistically better outcome than the other...

In short I don't think driverless cars are making moral decisions and we shouldn't put this specific technology in a special box.
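To make that concrete, here's a minimal sketch (the function name, manoeuvre labels and harm numbers are all invented for illustration) of the kind of purely mechanistic rule being described: the software just picks whichever option the data scores as least harmful, with no concept of morality anywhere in the logic.

```python
def choose_manoeuvre(options):
    """Return the manoeuvre with the lowest expected-harm score.

    `options` maps a manoeuvre name to a pre-computed expected-harm
    estimate (e.g. derived from crash statistics). The function has
    no notion of *why* one number is lower than another.
    """
    return min(options, key=options.get)

# Example: the (invented) statistics rate swerving right as less harmful.
decision = choose_manoeuvre({"swerve_left": 0.9, "swerve_right": 0.4})
print(decision)  # prints "swerve_right"
```

The point being: whether the numbers came from airbag sensor thresholds or collision statistics, the computation is the same kind of thing.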

Little John wrote:
Tell me, would you euthanise all children born with certain congenital conditions, thus freeing up much needed resources which, if deployed elsewhere, would lead to the "greater good" of the health of the rest of the population?

I wouldn't euthanise all children born with certain congenital conditions as I'd like to think every individual should have equal treatment - even when they've been dealt a bad hand.
AutomaticEarth



Joined: 08 Nov 2010
Posts: 823

PostPosted: Sun Mar 06, 2016 8:09 pm    Post subject: Reply with quote

clv101 wrote:
adam2 wrote:
Driverless HGVs also.
http://www.bbc.co.uk/news/uk-politics-35737104


Indeed. Driverless trains would be far easier than driverless trucks. I guess the train drivers have a better union than the truckers.


A lot of truckers aren't members of unions and tend to come in from abroad, and many of them flout laws designed to stop them doing excessive hours on the road. I'm hoping these driverless trucks are safer than ones driven by folks half asleep at the wheel... as long as the truck at the front is not being driven by such a driver.
Little John



Joined: 08 Mar 2008
Posts: 7765
Location: UK

PostPosted: Sun Mar 06, 2016 8:24 pm    Post subject: Reply with quote

clv101 wrote:
Little John wrote:
clv101 wrote:
I don't think that's the right framework to look at this. ABS has algorithms to decide when to apply, airbags have algorithms to decide when to deploy, AI has algorithms to decide which way to swerve to avoid collision. There's nothing magic about AI, it's just software....


I shall repeat, here, my answer to biff Vernon. The "decision" by the air bags is not a moral one, it is a mechanistic one. You did read the word "moral" next to the word "decision" littered throughout my previous posts, right? Or are you trying to imply that you do not understand the difference between mechanistic decisions and moral ones?


Yeah, I am struggling to see the difference here. How is the mechanistic algorithm leading to the decision whether or not to deploy an airbag any different from the mechanistic algorithm leading to a decision whether to swerve left (into the pedestrian) or right (into the oncoming traffic)? Just because a human making that decision might base it on a moral judgement doesn't mean the computer does. It would definitely be a mechanistic algorithm; computers don't do anything else. Now, the computer may be programmed to, when faced with that situation, always choose the oncoming traffic (or the pedestrian), but only because the data will have shown one to have a statistically better outcome than the other...

In short I don't think driverless cars are making moral decisions and we shouldn't put this specific technology in a special box.

Little John wrote:
Tell me, would you euthanise all children born with certain congenital conditions, thus freeing up much needed resources which, if deployed elsewhere, would lead to the "greater good" of the health of the rest of the population?

I wouldn't euthanise all children born with certain congenital conditions as I'd like to think every individual should have equal treatment - even when they've been dealt a bad hand.
So, in short, you are trying to argue that a computerised device which must potentially make a real time decision on whether to more or less directly cause the death of one human over the death of another, in other words dealing one of those humans a very bad hand indeed, is not involved in making a moral choice? All of which is completely morally distinct from a device, such as an ABS system, which is triggered to respond, not to a complex moral dilemma involving the choice of killing one person over another, but to a simple, morally irrelevant loss of traction on the wheels of a car.

And, yet, in the same post, you balk at the suggestion of applying exactly the same logic in another context.

In which case, I'm calling your response for what it is.

Bullshit.

It's not bullshit because of its extreme utilitarianism. It's bullshit because of your completely disingenuous attempt to dodge the central issue by pleading ignorance of the philosophical underpinnings of your position and its wider philosophical implications.

Now, where have I regularly seen that kind of dishonest debating strategy before?
clv101
Site Admin


Joined: 24 Nov 2005
Posts: 8496

PostPosted: Sun Mar 06, 2016 8:39 pm    Post subject: Reply with quote

Little John wrote:
So, in short, you are trying to argue that a computerised device which must potentially make a real time decision on whether to more or less directly cause the death of one human over the death of another, in other words dealing one of those humans a very bad hand indeed, is not involved in making a moral choice?

Absolutely. Computers don't make moral choices; their mechanistic algorithms don't have any concept of morals. Just because the outcome of a decision is the likelihood of one human's death over another's doesn't make what goes on in the computer any different from an automatic trading programme deciding which stock to buy, for example.

I think you're anthropomorphising the driverless car.
Little John



Joined: 08 Mar 2008
Posts: 7765
Location: UK

PostPosted: Sun Mar 06, 2016 8:46 pm    Post subject: Reply with quote

clv101 wrote:
Little John wrote:
So, in short, you are trying to argue that a computerised device which must potentially make a real time decision on whether to more or less directly cause the death of one human over the death of another, in other words dealing one of those humans a very bad hand indeed, is not involved in making a moral choice?

Absolutely. Computers don't make moral choices, their mechanistic algorithms don't have any concept of morals. Just because the outcome of a decision is the likelihood of one human's death over another's doesn't make what goes on in the computer any different from an automatic trading programme deciding which stock to buy for example.

I think you're anthropomorphising the driverless car.
Now I know that the apple does not fall far from the tree. You are well aware that I am well aware that a computer is incapable of being held responsible for a choice that has moral consequences, and so you are deliberately feigning ignorance of that. Given that such a device is incapable of being morally responsible, it is completely morally untenable to have a computer make such a decision. The only way the use of such a device could be morally tenable would be if its manufacturers were held morally responsible by proxy. But this leads to my original point in this thread: that the number of moral variables (assuming universal agreement could even be achieved as to what range of variables was acceptable) would be impossible to predict. Consequently, manufacturers will never allow themselves to be liable in the absence of a government edict forcing such liability on them. In which case, they would not produce such devices.
johnhemming2



Joined: 30 Jun 2015
Posts: 2159

PostPosted: Sun Mar 06, 2016 9:20 pm    Post subject: Reply with quote

In terms of tort the owner and manufacturer would have some legal responsibility.
Little John



Joined: 08 Mar 2008
Posts: 7765
Location: UK

PostPosted: Sun Mar 06, 2016 11:40 pm    Post subject: Reply with quote

johnhemming2 wrote:
In terms of tort the owner and manufacturer would have some legal responsibility.
The death of a pedestrian could quite easily amount to far more than mere tort; it is a potentially criminal wrong as opposed to merely a civil one. Secondly, so far as the car owner's liability goes, I would surmise this would be similar to the liability of someone who hired a driver to drive them, as a passenger, in their own car. Logically following that, though, the "driver's" responsibility (the "driver", in this case, being a computer) would pass on to the manufacturer of the "driver". However, this then raises my initial observation in this thread: namely, that the decisions made by the computer, in extremis, may occur in the context of highly complex moral dilemmas. In which case, I would contend it is all but impossible to codify that into any AI algorithm.

In other words, I am saying that AI is just that: artificial. A facsimile of intelligence. And even if its intelligence amounted to more than that, this does not mean it is conscious of its choices or of the moral implications of them. Nevertheless, a decision with very real moral implications will, potentially, have been made by such a device. In which case, either the manufacturers must assume (in my view, completely untenable) full liability for such decisions, or the devices must not be used, or, sadly more likely, the economic imperative will be so strong that the manufacturers will lobby governments to fudge the issue and grant them some kind of limited liability. In which case we will end up with a moral quagmire on our hands.
johnhemming2



Joined: 30 Jun 2015
Posts: 2159

PostPosted: Sun Mar 06, 2016 11:45 pm    Post subject: Reply with quote

There is no reason to limit liability. The key question is what the insurance companies want to charge.
Little John



Joined: 08 Mar 2008
Posts: 7765
Location: UK

PostPosted: Sun Mar 06, 2016 11:54 pm    Post subject: Reply with quote

johnhemming2 wrote:
There is no reason to limit liability. The key question is what the insurance companies want to charge.
This is basically a restatement, with reasons, of the second of the outcomes I have just outlined. That is to say, I think the premiums would be massive, since insurance companies are not stupid and will surely predict that same quagmire. I should perhaps have made it clearer that this was one of the underlying reasons for my second proposed outcome.

However, assuming, for the sake of argument, that the premiums are within the bounds of the reasonable, this still leaves the issue of who gets to decide the moral algorithms. That is to say, going back to my original scenario of a road traffic accident, outlined at the start of this thread, how does the "driver" "decide" which course of action to take? Does it save the pedestrian, or does it save the car owner and the occupants of any other oncoming traffic? Does that decision get mitigated if the pedestrian is a child or an elderly person? The moral permutations are effectively endless and are particular to the specific time and place. This could never be codified into any kind of device that exists now. Nor, arguably, should it be. In short, AI should either be deployed only where there are no complex moral implications of the type outlined or, where they do exist, they are of an unambiguous, and therefore unarguable, binary nature.
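To sketch why that matters (every name and weighting here is invented purely for illustration), the same cost-minimising rule gives opposite answers depending on which harm weights someone codes in beforehand. The "moral" content lives entirely in those numbers, and somebody has to choose them:

```python
def pick_outcome(scenario, weights):
    """Choose the action whose affected party carries the lowest harm weight.

    `scenario` maps each possible action to the party it endangers;
    `weights` maps each party to a pre-chosen harm weight.
    """
    return min(scenario, key=lambda action: weights[scenario[action]])

scenario = {"swerve_left": "pedestrian", "swerve_right": "occupants"}

# Two different, equally "mechanistic" weightings give opposite answers:
print(pick_outcome(scenario, {"pedestrian": 1.0, "occupants": 2.0}))  # prints "swerve_left"
print(pick_outcome(scenario, {"pedestrian": 2.0, "occupants": 1.0}))  # prints "swerve_right"
```

The algorithm itself never changes; only the weight table does, which is exactly where the question of who decides comes in.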
kenneal - lagger
Site Admin


Joined: 20 Sep 2006
Posts: 11588
Location: Newbury, Berkshire

PostPosted: Mon Mar 07, 2016 6:04 am    Post subject: Reply with quote

The law will be changed to protect big business as it usually is nowadays because it is big business that pays for our political system and provides politicians with their retirement income/post electoral failure jobs. Jaywalking would probably be made illegal, as it is in the US, I think, so that the liability, in most cases, would be with the pedestrian.

We can't be allowed to get in the way of growth/progress.
_________________
Action is the antidote to despair - Joan Baez
Little John



Joined: 08 Mar 2008
Posts: 7765
Location: UK

PostPosted: Mon Mar 07, 2016 9:14 am    Post subject: Reply with quote

That sounds about right, Ken.
biffvernon



Joined: 24 Nov 2005
Posts: 18541
Location: Lincolnshire

PostPosted: Fri Apr 01, 2016 7:57 pm    Post subject: Reply with quote

The driverless bike.

https://www.youtube.com/watch?v=LSZPNwZex9s
PS_RalphW



Joined: 24 Nov 2005
Posts: 5713
Location: Cambridge

PostPosted: Fri Jul 01, 2016 12:29 pm    Post subject: Reply with quote

http://www.bbc.co.uk/news/technology-36680043

Tesla Autopilot 0, Truck 1.
Page 2 of 5

 


Powered by phpBB © 2001, 2005 phpBB Group