A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if customers send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. But the account is particularly notable because it plainly and openly demonstrates one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.
I have no sympathy for the people who are being scammed here; I hope they lose hundreds to it. Making fake porn of somebody else without their consent, particularly porn that others could mistake for real, is awful.
I wish everyone involved in this use of AI a very awful day.
Imagine hiring a hit man and then realizing he hired another hit man at half the price. I think the government should compensate them.
Hitman hires hitman who hires hitman who hires hitman who hires hitman who tells police - Oct ‘19
Nested hit man scalpers taking advantage of overpaying client.