Tier10 Lab

Quantifying Spock: How Vehicle Autonomy Presents an Ethical Quandary

Nathan Whipple by Nathan Whipple
October 7, 2016
in Automotive, Design, Technology

It’s a crisp fall afternoon, and you’re being driven through the city by your new autonomous car. As you marvel at how far technology has come, a group of children detaches from their tour group and crosses the road. In a stroke of bad luck, your car’s brake system has malfunctioned. You’d swerve out of the way, but a large dump truck occupies the other lane, and colliding with it would spell certain death for you and your fiancée in the passenger seat (did I mention she’s pregnant?). You’re now faced with a decision: do you save yourself and your loved ones, or the children who are (illegally, I might add) crossing the street?

While the specificity of the above scenario may seem a bit extraneous, it is exactly the type of situation that researchers, programmers, and consumers alike will need to consider in the coming years, as self-driving cars steadily increase in popularity across the nation. As their usage rises, the debate over the morality of autonomy intensifies, and automakers and software designers must now decide whether human life can be quantified for the sake of convenience.

I’ll throw it in reverse for a second. To say that self-driving cars are safe is an understatement: of the more than 130 million miles driven on Tesla’s Autopilot, there has been only one fatality, and if you believe Elon Musk, it was simply the culmination of a series of rare circumstances. The nation’s roadways can and will become safer as autonomous vehicles grow more prevalent, and while machine learning makes it possible for the software in these cars to predict and anticipate the decisions it needs to make, that doesn’t make those decisions any less consequential.

It’s not as though the public isn’t being polled, either. Through a series of Amazon Mechanical Turk surveys, participants were asked to assess situations in which they had to choose between saving pedestrians or the person in the car barreling toward them. In hopefully unsurprising fashion, more than 75 percent of respondents decided that the needs of the many outweigh the needs of the few, and elected to “sacrifice” the person in the car. In other words, people overwhelmingly believe that self-driving cars should embrace a utilitarian mentality. That is, of course, until the question became which autonomous vehicle they would buy. In that case, survey participants selected whichever car would protect them at all costs.
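The “needs of the many” rule those respondents endorsed can be sketched in a few lines of code. This is purely an illustration of the utilitarian mentality described above, assuming hypothetical action names and fatality estimates; it is not any real vehicle’s decision logic.

```python
# Hypothetical sketch of a purely utilitarian decision rule: given each
# possible action and the number of fatalities it is predicted to cause,
# choose the action with the fewest deaths. All names and numbers below
# are illustrative assumptions.

def utilitarian_choice(outcomes):
    """Return the action predicted to minimize total lives lost.

    `outcomes` maps an action name to its predicted fatality count.
    """
    return min(outcomes, key=outcomes.get)

# A dilemma like the one in the opening scenario: staying the course
# kills five pedestrians, swerving kills the two occupants.
dilemma = {"stay_course": 5, "swerve": 2}
print(utilitarian_choice(dilemma))  # -> swerve
```

Note that this rule always sacrifices the occupants whenever they are the smaller group, which is exactly the behavior survey respondents endorsed in the abstract but rejected when shopping for their own car.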

So, how do we progress from here?

[Screenshot: MIT’s Moral Machine]

The fact of the matter is, the choice to sacrifice your own life to save another predates the automobile by millennia; what’s new is that our faith will now have to be placed in the hands of an algorithm. There is some good news, though: you get to help create it! The nerds over at MIT have created a “game” of sorts, in which visitors to the site are faced with a number of situations that could arise if the brakes of a self-driving car were to fail. Your results are then compared with those of other users, and you are given the chance to design your own scenario if you feel the need. The authors describe the site as their attempt at “building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas,” and they hope to learn more not only about how people view each situation, but about machine learning in general. This author tended to steer toward utilitarianism, opting to save as many people as possible at the risk of my own life, which I believe accurately reflects my “fight or flight” instincts.

Unless, of course, the person crossing the street was wearing a Cowboys jersey.

Sources: Moral Machine, “The Social Dilemma of Self-Driving Cars,” PBS, Science Magazine, Popular Mechanics

Tags: Automated Vehicles, Automation, Automobile Automation, Autonomous Vehicles, Self-Driving Cars
Copyright © 2008-2021 Tier10. All Rights Reserved
