
Week 11 [15.01.2018 - 21.01.2018] Autonomous driving

Every one of us has heard about Tesla's Autopilot, the best-known Level 2 automation system. But have any of you ever wondered about the legal aspects of letting autonomous cars onto public roads?

Questions:
1. Would you like to buy a car that protects you, or one that chooses the safest action, even if that means hurting you?
2. Who should be responsible for an accident caused by an autonomous car? The manufacturer, the owner, or the person sitting behind the wheel during the crash? Maybe someone else?
3. Would you sacrifice your family's safety for the greater good, which is fewer traffic-related deaths?

Comments

Filip Sawicki said…
The first question is really hard to answer. Obviously, personal safety is a priority for everyone, but at the same time there should be some compromise. Intelligent cars must have advanced logic and reasoning to predict the outcomes of a potential danger and steer decisions towards a sensible action. Whether it hurts us or someone else doesn't matter; the point is to minimize casualties and damage. Regarding accidents, I guess it will depend on the situation; the manufacturer should therefore provide access to logs (for example, video) in order to find the guilty party.
Wow, all of your questions are impossible to answer. First of all, I don't know if I would like a self-driving car at all - I really like driving a car :D I think every accident should be judged individually and all sides should be taken into account. To be honest, I would probably choose my family's safety first - it's just the self-preservation instinct.
Unknown said…
Would you sacrifice your family's safety for the greater good, which is fewer traffic-related deaths?

I wouldn't, because almost every person is egoistic. My safety and my family's safety are more important than anything else. And it doesn't matter what people might say; if they were in the driver's place, in most cases they wouldn't risk themselves.
Unknown said…
Right now I wouldn't buy any autonomous car, because I think our artificial intelligence algorithms are not 'smart' enough and I wouldn't trust them. Hypothetically, I would prefer a car that would protect me and my passengers.

I think that the driver should be responsible for an accident, because we should be conscious that autonomous cars are not perfect these days and manufacturers are taking their first steps with this kind of technology, so we should be smart enough to keep an eye on such a car and take control in a dangerous situation.

As I mentioned before, I would never sacrifice my family's safety for the 'greater good', because they are more important to me than other people. Of course, I would prefer to save everyone, but if that were impossible, I would prefer to save my family.
A very hard matter. Personally, I don't believe I should sacrifice myself or my family just because somebody acted like a total moron and put his entire family in danger, even if that means more people would die or be injured than otherwise. I'd never buy such a car if it didn't have a clear off switch, as I don't want to be held responsible for the mistakes of others, and I strongly believe I'm not alone in this.
Unknown said…
Well, I agree that this matter is very difficult. Firstly, I would not buy such a car, because I enjoy driving and I am inclined to believe that driving safely myself is the best decision in this situation. However, if I had to choose from your question, I would buy the car which protects me.
The second question is a philosophical one, because everyone bears some responsibility for a car. It may be the fault of the manufacturer or of a pedestrian who was chatting on FB while crossing the road. I don't really know.
Unknown said…
I would not buy a car because for me it is not interesting. I prefer walking to work or university instead of driving; it's good for your health.
The driver should be responsible: it's his or her car, and if he or she doesn't like driving, why buy such a car at all? A bicycle would be better :)
On the third question, I don't know what to say...
Unknown said…
I can't answer your questions, and probably many other people can't either. That's the reason we shouldn't use autonomous cars yet. Computers are dumb, because they only do what we tell them to do. If we are not sure what is right and what is not, how can autonomous cars know? I think I would probably save myself and my family in the first place during an accident, but I've never had an accident and I don't plan to have one.
I would never buy a car that would compromise my safety or my family's safety for the "greater good"; I don't care. If the situation on the road requires me to hurt other people in order for my family to stay intact, I will do it. The key issue in all of this is whose fault the road situation is in the first place. If it's mine, let it be on me, but why would I care if a reckless driver gets killed by my hands when he chose the risk himself, for example by speeding?
If all road traffic becomes autonomous, there would be almost no accidents, and any accidents that remain would be caused by software or hardware malfunctions, in which case liability lies with the manufacturer.
Unknown said…
I know that autonomous cars are the future, but it will take a lot of time until they fully work. We will also have to change everything, not only cars; people should build roads and buildings to match autonomous systems. I've seen how a Tesla got stuck on a railroad crossing because it was under renovation. We have a lot of accidental situations which no one can predict. Tesla is the only company which gives us everything it makes. Other companies release their products and innovations slowly, because that is a way to earn more money. I think we could have been using the phones we have today ten years ago. Only Tesla gives us everything; if you have doubts, you should try a Tesla car. When all electric cars had a maximum range of maybe 50 km, Tesla suddenly released the Model S, which has a range of 500 km and an autonomous driving system. I think that Elon Reeve Musk, who owns Tesla, is a big and brave man. I think he does a lot for the world and will do much more. I wait impatiently for his mission to Mars.
I think that manufacturers should be responsible for an accident caused by an autonomous car. And I always choose my family; I would never forgive myself if I did otherwise.
Unknown said…
I think that if I could turn off this function, I would do it. I can't decide whether my life is more valuable than someone else's.
As for responsibility, I think it should fall on the person who was the driver during the accident, but I'm not sure what the best option would be.
As for the last question, I'm sure I would do everything to keep my family safe. Maybe it isn't a huge feat, but when it comes to my family, I know my emotions wouldn't let me choose the other option.
Alicja said…
I agree with Tomasz on this one.

I would not buy an autonomous car, because their AI as it is built today (based on machine learning algorithms) will not solve all the challenges facing autonomous cars.

It is a question of how much involvement and control you have over an autonomous car. If we are to be made legally responsible for each of the manufacturer's software bugs, why risk having an autonomous car at all? I would prefer to drive myself.

In general, I believe it is a complex problem and we do not have legal and technological means to solve it just yet.

I would not sacrifice my family for a greater good, and I personally do not know any person that would, so yes, I think we have a problem here.
Unknown said…
1. I would buy the second one only if I could choose how it behaves, for example if I could set an option to save my passengers even if it meant that I would be hurt more.
2. That depends entirely on the situation.
3. Never.
Vladlen Kyselov said…
I would definitely buy an autonomous car, but at the same time I don't want to, because while driving by myself I get a lot of positive emotions and start feeling better. Driving a car myself makes me feel safer, because since my childhood I have not trusted anyone else who is driving the car.
I would like to buy a car that protects me; that's the default. Everybody would make this choice.
Interesting question. I would say that both the driver and the manufacturer would be responsible for a car accident. For example, pedestrians or another driver can also be responsible for a crash.
I agree with the statement above that every person is egoistic, and the safety of my family is the most important thing.
Foodocado said…
All of your questions are really hard to answer. I think these kinds of cars should choose the safest possible action to minimize casualties and damage. Manufacturers should also provide access to logs from the cameras and all sensors in those cars; that would be helpful in deciding who is responsible for an accident. I will not give an answer to the last question...
Unknown said…
Oh, I've just googled "Tesla's autopilot function die". So, no, I wouldn't buy such a car. One of the articles says that Tesla wants drivers to stay alert no matter what and not to take their hands off the steering wheel, even when the Autopilot is on.

But I like the idea of the eject button from the video :) In my opinion, it's impossible to come up with a solution without inventing something new, for example the eject button or something like it.

It's likely that we'll have autopilot in trams first, because they always ride the same route, they have schedules and also rails. It'd be easier and safer to build. Maybe they even already exist (I've heard of something like this, but I don't exactly remember).
Jakub Lisicki said…
I don't think that anyone would buy a car which wouldn't choose its passengers' safety as its top priority. If there were different manufacturers of such cars offering different degrees of safety for the "driver", the choice would not be so tough.
There is no generic way of investigating accidents like that. Each and every one of them should be judged individually.
I wouldn't treat my family's safety as less important than anything else.
Unknown said…
Those are some tough questions to answer. Let me try, though.
Autonomous cars are, or were, a great hope for insurance companies and health systems, because in theory they would lower the number of traffic-related accidents to almost none. But we would have to get rid of human drivers for it to work out like that, and I don't think that's possible in the next, let's say, 30 years. So yes, such an autonomous car sounds interesting, but for now something that assists the driver rather than takes over is what to consider, in terms of safety. I've seen a video on the internet of a Tesla signalling its driver that an accident was taking place a car ahead of him. The driver couldn't notice it, but the car knew right when it was happening, instantly. That's something very useful.
The topic of who's responsible for an autonomous car's error is too hard for me, and I worry that software developers/engineers could be held responsible ;)
1. Would you like to buy a car that protects you, or one that chooses the safest action, even if that means hurting you?
No, because sometimes the car would choose to sacrifice me even where the accident was not my fault at all.
2. Who should be responsible for an accident caused by an autonomous car? The manufacturer, the owner, or the person sitting behind the wheel during the crash? Maybe someone else?
There is no driver, so I think just the insurance. Or if a manufacturer error were proven, then the manufacturer should be responsible and pay the insurance company for it.
3. Would you sacrifice your family's safety for the greater good, which is fewer traffic-related deaths?
There is no greater good for me than my family's safety, so the answer is no.
Obviously, I would choose the car that protects me. But, honestly, I am not a big fan of driving and cars.
I think it all depends on the type of accident. Personally: the manufacturer, if the accident was caused by the system or software; the driver, if the accident was caused by negligence or inadvertence.
I would not sacrifice my family's safety; that's how the world works.
I would probably choose a car that will protect me, but I prefer to drive my car on my own.
The second question is very difficult; probably the manufacturer, because they are responsible for the car working correctly.
As a part of my family, I'm responsible for the safety of its members, so I would not sacrifice my family's safety.
1. Oh, a tough one. I have no idea; there are too many questions, like how badly it will hurt me, etc. I would buy a car like that if it somehow connected to your mind, so it's more you who takes this decision.
2. It all depends on the situation, ehh...
3. Heck no, what kind of question is that?
1. I am definitely not ready for either a car that "protects" me or one which chooses the safest action. I'd be frightened in either of them. We simply haven't tested it yet, and I suppose we have to go through several situations to understand how it works. However, it will be very difficult to give up control over the car's actions.
2. I guess the most responsible party should be the government which allows such machines on the roads, because they should be verified very carefully. The acceptance process should be controlled very precisely to help avoid all unpredictable situations. This is in fact impossible to do, but the authorities should do their best, everything they can, to get close to it.
3. I would never risk sacrificing my family's safety for any other good. For me this is the highest value I have. I'd rather move to another country than stay in a place where I am required to accept such a "higher priority".
Unknown said…
I would probably choose a car that would protect me and my family rather than one that would go for the greater good, even though I know I should do the opposite. Also, as previously mentioned, in some situations the car would kill me even if the accident were someone else's fault.
I still think the responsible one should be the person driving the car. After all, there will always be a "use at your own risk" clause in the car's manual, and there will surely always be some kind of manual override for the driver to take control, for example in case of a possible accident, so ultimately the one "driving" the car will always be at fault.
The first question is really hard to answer honestly, because frankly I just don't know. I think that if there were an accident caused by an autonomous car, it would have to be investigated whose fault it really was, and only then could we decide who is responsible. The third question is also very hard, so I can't really answer it.
Unknown said…
I remember that there is only one state in the USA where autonomous cars are legal. As I recall, the first autonomous cars should be ready to drive in 2020. I would not like to get hurt, so I would choose the way that is safest for me. I think that when automatic cars start driving there shouldn't be any insurance; we should pay for the repairs with our own money. What is the point of buying insurance against a machine's mistake? It would be like buying insurance against a computer making a mistake. Maybe there would be some point in punishing developers who programmed bad algorithms for cars?
Unknown said…
Robots are created by humans, and humans also define their algorithms. Therefore, we are in charge of the ethical side of artificial intelligence. Even though such technologies haven't fully entered our lives yet, we already have to understand how they should behave. Robots are devoid of compassion, and one can only guess what they could do in the hands of a ruthless man.
I like the notion of a social dilemma in this presentation. Each of us thinks in a different way and makes different decisions. I think that the development of technology should proceed in such a way that such situations will not arise at all, or that the safety system can save every life.
Just have the car aim for self-preservation; it's what human drivers would do. If a group of 5 people ended up causing an unavoidable accident, it is 100% their fault. The self-driving car itself will not be making deliberate mistakes.
Yevhen Shymko said…
The most obvious solution for me is to give the car buyer a choice of what type of AI they want in the car. It might, for example, take the form of a test that gives a general picture of their personality. Trying to regulate this is not acceptable in my opinion, since, for example, I would like my car to care about my safety first.
Of course I would not buy such a car. Autonomous cars are not so smart at the moment, so I think that the vision of cars without drivers is a bit too far ahead of its time. People should build cars which help drivers in many different ways but do not replace them. Who should be responsible for an accident caused by an autonomous car? To be honest, I have no idea, but I'm 100% sure that I would not sacrifice my family for the greater good.
Unknown said…
I'm not a supporter of autonomous driving cars.
I love driving a car. I don't understand how some people can drive on autopilot; that kills all the pleasure!
I consider the manufacturer to be guilty. I think I would blame them for the accident first, but it all depends on the situation...
I wouldn't sacrifice my family for the greater good. For me, the greater good is the good of my family!
Unknown said…
Wow, those are serious questions, and they are not easy ones.
The first question is tough. On the one hand, I would like to feel safe in my own car, but having a car which will hit a pedestrian if there is a cat on the road is a bad idea. On the other hand, having a car which will choose to protect others but not me is insane. I think, given such options, I would pick a third one: none of the above.
I think it is necessary to have a switch in such cars, so the driver may take control whenever he wants to (although during a crash he wouldn't switch it in time). I think the responsibility should lie with the manufacturer and the person behind the wheel: the first, because they claimed it was safe, and the other, because they weren't paying enough attention.
I wouldn't sacrifice my family for the greater good, as there is no greater good than my family.
I like the ejection seat option the most.

Apart from that, I think I would buy such a car. The point everyone seems to be missing here is that even if the car had a 0.0000001% chance of killing you, you should compare that to the probability of dying while driving a conventional car.
Patryk Pohnke said…
That's why I think that legal problems will be the ones preventing fully autonomous cars from driving anytime soon.
Unknown said…
It is a really hard question to answer, but if I were to buy a car, I would rather buy one that is safe for me and my family. To be honest, I wouldn't choose an autonomous car, because I believe in my driving skills.
Regarding the second question, it is difficult to say, because it depends on the manufacturer's guarantee. If you follow the procedures while driving, you shouldn't be responsible for an accident. I would never sacrifice my family for the greater good.
Patryk Pohnke said…
They aren't trivial. I like driving too, but at some point having an autonomous car in your garage will be the only option :) When it comes to accidents, everyone will be treated the same way; it will be one algorithm that is responsible for the actions and their results.
Patryk Pohnke said…
Haha, you kept this topic very simple :)
I like those answers. Well, you are right, for now the situation is simple: if you don't want to drive, don't drive. I believe that in the future, people who don't like to drive will use autonomous cars.
Patryk Pohnke said…
The ejection option was awesome. Imagine this feature working during a traffic jam in a tunnel, where incoming cars can't brake in time, so there are multiple ejections straight into the tunnel's ceiling :)
Patryk Pohnke said…
A lot of time will pass before there are fully autonomous cars on the roads, and even more time will pass before there are only fully autonomous cars on the roads. It is a nice prospect that roads will ultimately be safe, but on the other hand I love driving...
Of course, we will never teach machines the human factor, but as far as I know, the autopilot in modern cars is good enough to use.
Unknown said…
Such hard questions! I don't think I could sacrifice the safety of my family for whatever it is, because I just couldn't. That is why I just want everything to stay in its place. I don't want autonomous cars, because they would lead to chaos on the roads and in society. We can't always keep the current situation under control even in ordinary conditions; I can't even imagine what it would be like if the system of autonomous cars failed. I think I would no longer use the car. Well, ok, I love long walks...
Patryk Pohnke said…
I would choose the same option. I think there is nothing wrong with wanting to survive...
Unknown said…
Those questions are tough, man.
1. No, I wouldn't buy an autonomous car. I think we have to wait a while for AI to get better algorithms; it is just not smart enough for now.
2. Again, a hard one. You know, we can say that it's the manufacturer's responsibility, but then why take the risk of having an autonomous car at all? That's why I'm not really into it.
3. No, definitely not. My family's safety is the most important thing for me.
Wojtek Kania said…
No, I wouldn't buy a car that chooses the safest action if that means hurting me. I really like the idea of AI in cars, but it should be only a driver-support feature. I think the owner shouldn't be responsible for an accident caused by an autonomous car, because why should they be? I think the manufacturer bears the greatest responsibility for an accident caused by an autonomous car, like a programmer who writes the software.
