http://moralmachine.mit.edu/
post your results
http://moralmachine.mit.edu/results/-2042988537
stay safe, pupper
Some questions are really dumb, for instance number 6, where it's either 2 large men + 1 woman + 1 man or 3 male athletes + 1 female athlete
There's not enough information, and how the fuck does being an athlete affect anything?
[quote=RipTide]Some questions are really dumb, for instance number 6, where it's either 2 large men + 1 woman + 1 man or 3 male athletes + 1 female athlete. There's not enough information, and how the fuck does being an athlete affect anything?[/quote]
i would guess it's supposed to measure your internal biases to see if sex, level of fitness, age, job, etc. matter to you and how much they matter
[quote=netwrkjrk]this is really stupid[/quote]
Will it still be stupid when your own car decides to kill you because you're the less 'valuable' option?
i hope my car puts me through a very painful death to save 2 cats
As someone who strongly supports a self-driving future:
Anyone violating road laws should understand that they receive low precedence in these scenarios.
Anyone willingly choosing a self driving car should understand that they take responsibility and consequences of a scenario like this.
Teaching AI to intervene by actively killing anyone should be avoided.
Societal value should only come into play when deciding between two otherwise equal outcomes.
[spoiler]The trick is to create a future transport system where this doesn't happen, e.g. underground/airborne transport with ground level foot traffic[/spoiler]
[quote=MikeMat][quote=netwrkjrk]this is really stupid[/quote]
Will it still be stupid when your own car decides to kill you because you're the less 'valuable' option?[/quote]
pretty sure this is just to determine your bias based off of different criteria like age, sex, etc
this doesn't have any practical application with real self driving cars
[quote=MikeMat]pretty sure this is just to determine your bias based off of different criteria like age, sex, etc
this doesn't have any practical application with real self driving cars[/quote]
Hence it's dumb. Because the correct answer to all of these questions is "neither", and no reasonable person would act out any of these choices.
my ethics teacher used screenshots from this and never told us about it
how about instead we don't become increasingly sedentary and keep operating vehicles under our own power, so if need be we can swerve out of the way to avoid pedestrian deaths at all costs
also if self-driving vehicles in the future don't have emergency manual overrides for cases like this then that would be absolutely fucking retarded
this issue always gets overhyped when really it's mostly just an interesting moral conundrum.
self-driving cars will be inherently safer than regular cars, and obviously there will be lots of internal and external safeguards against this issue (constant brake failure checks, street runoffs, separating vehicles and pedestrians, etc)
the site even specifically puts barriers on the sides of the roads to force you into this contrived scenario lmao
[quote=zx37]this issue always gets overhyped when really it's mostly just an interesting moral conundrum.
self-driving cars will be inherently safer than regular cars, and obviously there will be lots of internal and external safeguards against this issue (constant brake failure checks, street runoffs, separating vehicles and pedestrians, etc)
the site even specifically puts barriers on the sides of the roads to force you into this contrived scenario lmao[/quote]
It's really funny, because all the research into fully autonomous vehicles that I've done over the years for classes and presentations suggests that accidents like the ones in the link would generally be caused by humans rather than the AI.
This is stupid because there are other ways for a car to stop without deciding who (not) to kill.
[quote=shoras]This is stupid because there're other ways for a car to stop without deciding who (not) to kill.[/quote]
That's not the point.
[quote=TimTum007]That's not the point[/quote]
What is the point?
im the fighter of crime and saviour of fat ppl
http://moralmachine.mit.edu/results/-438186881
http://moralmachine.mit.edu/results/1991433661
Was surprised I wasn't farther on the right on the Upholding the Law part.
I'm also all the way to the right or left on four of the nine spectrums: saving more folks, hoomans over pets, fit people over unfit, and higher social status...I tended to get the passengers killed, so I should go back and look at the distribution of fit/fat and well-to-do/not-well-to-do in the vehicle and on the street.
Edit: Criminals being among those lower on the social scale probably contributed.
Not once could I choose to run over the executive who decided to fit shitty brake systems to all these failing cars. I feel like the moral spectrum of the questions isn't broad enough.
13 questions doesn't seem like enough; I took this a few times and got wildly varying results for the stuff I didn't care about (mostly around societal status/fitness; I don't think a car needs to be able to check people's criminal records before it runs them over).
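The "wildly varying results" complaint checks out as simple sampling noise. As a rough back-of-the-envelope sketch (not how Moral Machine actually scores answers): if each dilemma is modelled as one independent coin-flip read on a single preference, a quick simulation shows how spread out a 13-question estimate is compared to a longer quiz:

```python
import random
import statistics

def estimate_spread(true_bias: float, n_questions: int, trials: int = 10_000) -> float:
    """Std-dev of the measured bias across many simulated quiz runs.

    Each run answers n_questions dilemmas; each answer favours the biased
    option with probability true_bias, and the run's 'result' is the
    observed fraction of biased answers.
    """
    estimates = []
    for _ in range(trials):
        hits = sum(random.random() < true_bias for _ in range(n_questions))
        estimates.append(hits / n_questions)
    return statistics.stdev(estimates)

# For a respondent with a true 60% preference:
print(estimate_spread(0.6, 13))   # around 0.13-0.14: huge spread on a 0-1 scale
print(estimate_spread(0.6, 130))  # roughly 3x tighter with 10x the questions
```

So two honest runs of a 13-question quiz can easily land a quarter of the scale apart on a trait you only weakly care about, which matches the varying results described above.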
The scenarios imo are poorly done because you know the outcome is that the pedestrians/passengers are going to die, which isn't always the case irl. I imagine that without that certainty a lot more people would crash the car, since the passengers are way more likely to survive a collision than the pedestrians.