THE DARK SIDE OF AI: An AI Image Generator’s Exposed Database Reveals What People Really Used It For.

Tens of thousands of explicit AI-generated images, including AI-generated child sexual abuse material, were left open and accessible to anyone on the internet, according to new research seen by WIRED. An open database belonging to an AI image-generation firm contained more than 95,000 records, including some prompt data and images of celebrities such as Ariana Grande, the Kardashians, and Beyoncé de-aged to look like children.

The exposed database, which was discovered by security researcher Jeremiah Fowler, who shared details of the leak with WIRED, is linked to South Korea–based website GenNomis. The website and its parent company, AI-Nomis, hosted a number of image generation and chatbot tools for people to use. More than 45 GB of data, mostly made up of AI images, was left in the open.

The exposed data provides a glimpse at how AI image-generation tools can be weaponized to create deeply harmful and likely nonconsensual sexual content of adults and child sexual abuse material (CSAM).

Disturbing stuff, but also fringe: 45 GB of photos isn't actually all that many images.

A more common ailment is the one I experienced on Sunday, when an AI tech-support chatbot instructed me to try using a function on its own website that didn't exist. When I informed it that it had hallucinated, it apologized and then did it again.