And that’s why I’m against ALL such things.
Not because they can’t be done right, or because people can’t be taught to use them.
But because there’s a slippery slope in human nature: people want to offload the burden of decision onto a machine, an oracle, a die, a set of bird entrails. The genie is out of the bottle and they will do it again and again, but in a professional organization like the police, one can decide to create fewer opportunities for such catastrophes.
The rule is that people shouldn’t place machines above their own brains, as one commenter says, and should only use them in a logical OR with their own prior judgment, as another commenter says. But the problem lies in human nature, and I’d rather not introduce this particular point of failure into policing, politics, or anything judicial or military.
What is that?
It’s the initial diagnosis from the hospital scene in the movie Idiocracy.
Here’s this part of the scene: https://youtu.be/LXzJR7K0wK0
It’s 2505 and the average man from 2005 is now by far the smartest man in the world.
It’s a doctor’s diagnostic desk from the film “Idiocracy”.
Absolutely, ACAB
Cops are still necessary. Giving humans a machine on which to blame any failure is the truly bad thing.
I personally think these "AI"s are backed by governments. There was a lot of talk 10-15 years ago about how many government officials’ functions could be replaced by AI (without quotes), since those functions require no agenda and are not even particularly fuzzy, but do require semantic understanding. So "AI"s (with quotes) are being used like a vaccine: the broad mass of humans, having experienced them, will come to hate the very idea (EDIT: and won’t want actual semantic reasoning systems). Why? Because people who work in government love power and hate transparency, and they also hate the idea of being replaced by machines.
Or maybe it’s a conspiracy theory and they all really believe in accelerationism.
Some political groups engage in mismanagement on purpose to make people dislike the government; that’s hardly a conspiracy. But it’s a little weird to think they’re propping up the misuse of LLMs, rather than that being a natural consequence of stupid capitalism.
No, I meant governments doing certain things on purpose to discourage people from trusting that whole direction.