Every one of us has heard about Tesla's Autopilot, the best-known Level 2 automation system. But have any of you ever wondered about the legal aspects of letting autonomous cars onto public roads?
Questions:
1. Would you like to buy a car that protects you, or one that chooses the safest action overall, even if that means hurting you?
2. Who should be responsible for an accident caused by an autonomous car? The manufacturer, the owner, or the person sitting behind the wheel during the crash? Maybe someone else?
3. Would you sacrifice your family's safety for the greater good, i.e. fewer traffic-related deaths?
Comments
I wouldn't, because almost every person is egoistic. My safety and my family's safety are more important than anything else. And no matter what people might say, if they were in the driver's seat, in most cases they wouldn't risk themselves.
I think the driver should be responsible for an accident, because we should be conscious that autonomous cars are not perfect these days and manufacturers are taking their first steps with this kind of technology, so we should be smart enough to know that we need to supervise such a car and take control in dangerous situations.
As I mentioned before, I would never sacrifice my family's safety for the 'greater good', because they matter more to me than other people. Of course I would prefer to save everyone, but if that were impossible, I would save my family.
The second question is a philosophical one, because everyone bears some responsibility for a car. It may be the fault of the manufacturer, or of a pedestrian who was chatting on FB while crossing the road. I don't really know.
The driver should be responsible; it's his/her car, and if he/she doesn't like driving, why buy such a car at all? A bicycle would be better :)
On the third question, I don't know what to say...
If all road traffic became autonomous, accidents would only be caused by software/hardware malfunctions, in which case liability would lie with the manufacturer.
I think the companies should be responsible for an accident caused by an autonomous car. And I would always choose my family. I would never forgive myself if I did otherwise.
I think that when it comes to responsibility, it should fall on the person who was driving during the accident. But I'm not sure what the best option would be.
As for the last question, I'm sure I would do everything to keep my family safe. Maybe that isn't a huge feat, but when it comes to my family, I know my emotions wouldn't let me choose the other option.
I would not buy an autonomous car, because AI as it is built today (based on machine learning algorithms) will not solve all the challenges facing autonomous cars.
It is a question of how much involvement and control you have over an autonomous car. If we are to be made legally responsible for every one of the manufacturer's software bugs, why risk having an autonomous car at all? I would prefer to drive myself.
In general, I believe it is a complex problem and we do not have the legal and technological means to solve it just yet.
I would not sacrifice my family for a greater good, and I personally do not know anyone who would, so yes, I think we have a problem here.
2. That depends entirely on the situation.
3. Never.
Interesting question. I would say that both the driver and the manufacturer would be responsible for a car accident. For example, pedestrians or another driver can also be responsible for a crash.
I agree with the statement above that every person is egoistic and that my family's safety is the most important thing.
But I like the idea of the eject button from the video :) In my opinion, it's impossible to resolve the dilemma without inventing something new, for example the eject button or something similar.
It's likely that we'll have autopilot in trams first, because they always travel the same route, they have schedules and rails, so it would be easier and safer to build. Maybe they even already exist (I've heard of something like this, but don't remember exactly).
There is no generic way of investigating accidents like that. Each and every one of them should be judged individually.
I wouldn't treat my family's safety as less important than anything else.
Autonomous cars are/were a great hope for insurance companies and health systems, because in theory they would lower the number of traffic-related accidents to almost zero. But we would have to get rid of human drivers for it to work out like that, and I don't think that's possible in the next, let's say, 30 years. So yes, such an autonomous car sounds interesting, but for now something that assists the driver rather than taking over is the thing to consider in terms of safety. I've seen a video on the internet of a Tesla warning its driver that an accident was taking place one car ahead of him. The driver couldn't see it, but the car knew the instant it was happening. That's something very useful.
The topic of who's responsible for an autonomous car's error is too hard for me, and I worry that software developers/engineers could be held responsible ;)
No, because sometimes the car would choose to sacrifice me even when I was not at fault for causing the accident.
2. Who should be responsible for an accident caused by an autonomous car? The manufacturer, the owner, or the person sitting behind the wheel during the crash? Maybe someone else?
There is no driver, so I think just the insurance. Or if a manufacturer error were proven, then they should be responsible and pay the insurance company.
3. Would you sacrifice your family's safety for the greater good, i.e. fewer traffic-related deaths?
There is no greater good for me than my family's safety, so the answer is no.
I think it all depends on the type of accident. But personally: the manufacturer, if the accident was caused by the system/software; the driver, if it was caused by negligence/inadvertence.
I would not sacrifice my family's safety; that's how the world works.
The second question is very difficult. Probably the manufacturer, because they are responsible for the car working correctly.
As a part of my family, I'm responsible for the safety of its members, so I would not sacrifice my family's safety.
2. It all depends on the situation, ehh...
3. Heck no, what kind of question is that?
2. I guess the most responsible party should be the government that allows such machines on the roads, because they should be verified very carefully. The acceptance process should be controlled very precisely to avoid all unpredictable situations. In fact this is impossible to do, but the authorities should do their best, everything they can, to approach it.
3. I would never take the risk of sacrificing my family's safety for any other good. For me this is the highest value I have. I'd rather move to another country than stay in a place where I'm required to accept such a "higher priority".
I think the responsible one should still be the one driving the car. After all, there will always be a "use at your own risk" clause in the car's manual, and it will surely always include some kind of manual override so the driver can take control, for example in case of a possible accident, so ultimately the one "driving" the car will always be at fault.
I like the notion of a social dilemma in this presentation. Each of us thinks in a different way and makes different decisions. I think technology should be developed in such a way that such situations do not arise at all, or so that the safety system can save every life.
I love driving a car. I don't understand how some people can ride on autopilot. That kills all the pleasure!
I consider the manufacturer guilty. I would blame them for the accident first, but it all depends on the situation...
I wouldn't sacrifice my family for a greater good. For me, the greater good is the good of my family!
The first question is tough. On the one hand, I would like to feel safe in my own car, but a car that will hit a pedestrian because there is a cat on the road is a bad idea. On the other hand, a car that will choose to protect everyone but me is insane. Given such options, I would pick a third one: none of the above.
I think it is necessary to have a switch in such cars, so the driver can take control whenever he wants (though during a crash he wouldn't manage to switch in time). I think both the manufacturer and the person behind the wheel should be responsible: the first because they claimed the car was safe, and the second because they weren't paying enough attention.
I wouldn't sacrifice my family for the greater good, as there is no greater good than my family.
Apart from that, I think I would buy such a car. The point everyone seems to be missing here is that even if the car had a 0.0000001% chance of killing you, you should compare that to the probability of dying while driving a conventional car.
Regarding the second question, it is difficult to say, because it depends on the guarantee from the manufacturer. If you follow the procedures while driving, you shouldn't be responsible for an accident. I would never sacrifice my family for the greater good.
I like those answers. Well, you are right, for now the situation is simple: if you don't want to drive, don't drive. I believe that in the future, people who don't like driving will use autonomous cars.
1. No, I wouldn't buy an autonomous car. I think we have to wait a while for AI to get better algorithms. AI is just not smart enough for now.
2. Again, a hard one. You know, we can say it's the manufacturer's responsibility, but then why are we taking the risk of having an autonomous car at all? That's why I'm not really into it.
3. No, definitely not. My family safety is the most important thing for me.