By the time they put something like this on the streets, they’ll have done some palatability passes on it. Heck, some percentage of the voters will praise it.
“It’s almost always correct in who it shoots!”
“Higher good-kill ratio than human police.”
Honestly, I hope that when we get robot police, it'll be because they're much better at de-escalating than any human could be trained to be. They don't have to worry about the consequences of extending kindness to the wrong person at the wrong time. Human fear is tough to suppress no matter how much training you have, and it's even harder to suppress when you have very little training and none of it focused on that.
And while robots can't empathize or sympathize with emotion, they can now recognize it and respond the way someone who does sympathize would. More importantly, they can talk to a psychopath or sociopath in ways that resonate with them, rather than trying and failing to appeal to their empathy or emotions. It's tough for a neurotypical person to even understand a lack of emotions or empathy.
The first models won't be great, but we already have the tech to start building something helpful in that field, and it'll only get better as we collect more data. Still, it's tough to put robots in any situation where the fail state could be loss of life, even if on paper they avoid that outcome more often than humans do. It just doesn't feel right.
I’m kinda worried we’ll see these on the streets one day.
at least ED-209 looked cool