Week 4 [16.04-22.04.2018] Autonomous cars are among us. But are their decisions autonomous?
Not long ago, autonomous cars were hard to imagine. Nowadays they are real, and each of us can own one (provided, of course, you have enough $$$).
These cars are packed with CPUs, sensors, algorithms, and so on. As we can see, it is possible to program them to handle everyday situations, even difficult ones. But there seems to be a tough edge case to consider. Each of us sometimes has to choose the lesser of two evils. Imagine that you are driving and the car in front of you brakes hard. Obviously you will try to avoid hitting it, but on your left there is a car whose passenger is not wearing a seatbelt, and on your right there is a motorcyclist without a helmet. What do you do?! There are many social dilemmas like this. Should your car do everything to protect you, or should it also think about others?! Watch this short video and share your opinion with us!
Comments
I believe that no robot should have the ability to harm a human, especially not for the 'common good'. Not likely, but still, tech can always be exploited. Somehow, I have a feeling that the speaker would like to live in a place where nobody dies or is killed, which is pretty abstract and, in my opinion, delusional. I hope he doesn't.
I agree we need to research and improve. As for myself, I wouldn't drive such a car, because I simply enjoy driving. If the distance is too far to drive, I make stops, fly by plane, or take the train. One good application of this technology is that it could minimize the costs and losses of companies in the trucking business, but I've heard plenty of opinions from people who know something about this work, and they always list countless cases where today's tech wouldn't do well.
All in all, I think that autonomous cars will dramatically decrease the number of car accidents.
I do not believe in autonomous cars. For me, they have too many hidden pitfalls. A computer will never replace a human being. Who will be responsible for the accidents?
In my opinion, this is a mistake of today's science, which is heading in the wrong direction.
Even though I like driving on my own, I wouldn't want to pay money for something that could potentially kill me. That makes no sense.
But on the other hand, the car would still have to decide who lives and who dies. This means that such cars would somehow have to compare human lives. But how? Should it take into consideration people's age, material status, sex, weight, height, grades in school? What should the criteria for comparison be?
The cause of most accidents is human error: alcohol consumption, fatigue from sleepiness at the wheel, momentary thrill-seeking, or panicking at speed.
Autonomous cars do not have these momentary emotional human behaviors and can apply the best self-control logic at the moment of an accident.
When it comes to the scenario the author mentioned, I think the vehicle should calculate the probability of death or serious injury to its passengers based on speed and other conditions; if that probability were below some threshold, it should decide to hit the car in front of it, and otherwise it should, of course, swerve.
That's my opinion.
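The decision rule this commenter describes could be sketched roughly as follows. This is only a toy illustration: the `injury_probability` formula, the input variables, and the 0.2 threshold are all invented assumptions for the sake of the example, nothing like what a real vehicle would compute.

```python
# Toy sketch of the commenter's threshold rule.
# The risk model and all numbers are invented assumptions for illustration;
# a real vehicle would use far more sophisticated estimation.

def injury_probability(speed_kmh: float, braking_distance_m: float) -> float:
    """Crude stand-in for an injury-risk estimate: risk grows with speed
    and shrinks with the distance available to brake. Clamped to [0, 1]."""
    risk = speed_kmh / 200.0 - braking_distance_m / 100.0
    return min(1.0, max(0.0, risk))

def choose_maneuver(speed_kmh: float, braking_distance_m: float,
                    threshold: float = 0.2) -> str:
    """Brake straight ahead if the estimated passenger risk is below the
    threshold; otherwise swerve (the rule proposed in the comment above)."""
    if injury_probability(speed_kmh, braking_distance_m) < threshold:
        return "brake"
    return "swerve"

print(choose_maneuver(50, 40))   # low speed, room to brake
print(choose_maneuver(120, 10))  # high speed, little room
```

Of course, the hard part the thread is debating is not the comparison itself but where the threshold comes from and who is accountable for it.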
All the regulations on the street were invented so that even the dumbest person who passed the test and got a license can drive. Speed limits were created so that drivers can easily manage their cars. 99% of accidents are caused by human error: people get distracted and lose control of the situation. But I'm not going to dive deeper into this topic. My point is: are we going to trust autonomous cars and eliminate human faults (like distraction), or stop this idea because we don't know how a car would react during an accident? Like I said, regulations, laws, and limits are made so that even an idiot can stick to them and feel safe. I think that if autonomous cars stick to the limits, the probability of a serious and dangerous accident should be low.
A few weeks ago there was a fatal accident; even with all those sensors and security systems, the car couldn't avoid tragedy.
I think you missed the part where a human can't even make such a decision about who to "hurt". I think we approach autonomous driving with the wrong attitude. This technology enables us to even have such a decision to make. Is it a simple one? NO! It never is, but at least now we have a choice! And we hate that choice; oh, we do. We fear that we won't be able to use our usual excuses like "I lost control of the car", "I didn't react fast enough", etc. We want to make a devil out of autonomous cars just because we fear being even more liable for what we are doing.
Let's look at this from the other side: what if there were no non-autonomous cars on the road? Who would be there to brake uncontrollably? Who would be there to cause an emergency on the road? People don't like this question, because the answer is "other people".
I'm not sure that autonomous cars can prevent all accidents of that kind. As you said, they include a lot of algorithms, which can have errors. In my opinion, a computer can't replace a human being.
On the other hand, there are a lot of overconfident drivers on the road who drive very dangerously. The same goes for older people whose reflexes are very slow. That's why I give autonomous cars a thumbs up.
Anyway: introducing driverless cars (even with the principle of saving the owner at all costs) will reduce accidents. So I support this idea.
I hope we will be able to maximise the optimisations and do the best we can to protect human society, but as mentioned in the video above, we must agree on some tradeoffs to be able to cover all cases.