Blame (re-sending)
NB: I sent this out last week and for some reason very few people actually received it. I’m not sure what’s up with Substack, but I’m resending it. Also: I was already writing something that basically disagrees with this post, so if you don’t like my opinion below, next week you can see me debating myself. Always a fun prospect. Have a great weekend!
I was reading about the unfortunate five-year project called Smart Columbus, which was intended to modernize Columbus, Ohio with more electric vehicles, autonomous rides, and a variety of other improvements. They paused their driverless ride program, run by EasyMile, when a passenger fell out of her seat during a sudden braking. It’s a shame… the program as a whole kind of failed, and they still don’t have autonomous rides (which I would love to be able to take here in Los Angeles).
I realized: many people aren’t comfortable with the idea that they might get injured in a self-driving car. But why, especially if there are actually fewer accidents when a computer is driving the car as opposed to human beings?
I think it’s because there isn’t someone to blame (except, I guess, the people creating the driving AI)… and that’s linked to the fact that there’s no human-centered way for me to avoid an accident if a computer is driving. It reinforces the idea that bad things (on the road) happen to bad people (or rather, bad drivers).
So much of our culture is driven by this… some branch of the “just world fallacy”: the belief that for anything bad that happens to us, there was some way we could have avoided it. But this creates a nasty mirror argument: that when bad things happen to people, they’ve done something to deserve them.
But sometimes life is just random. And that seems the hardest of all concepts for us to integrate. Because the only way you can totally avoid an accident in a driverless car is to never get into it at all.
And even today, knowing that we will very likely get to a place where driverless AI is significantly safer than human driving (if we’re not there already), I think there are many people who don’t want to run that risk.
I, however, would like to get in a cute little car with no one, and talk to no one, and arrive where I want to go. And I’m willing to accept a very tiny probability that something bad might happen.
(And: I’ll buckle my seatbelt so if the car stops suddenly I won’t faceplant on the windshield.)