Subject: Re: and in the US ... crickets
Umm. How is creating child sexual abuse material (CSAM) not a violation of the terms of service of both Google and Apple? Grok creating CSAM should trigger an immediate removal of X from every app store.

It might not meet the definition of CSAM. "Sexualized images" derived from photos of children wouldn't necessarily meet that legal definition unless they depict very specific acts or specific body parts. From the article, it appears that Grok is creating highly sexualized but still clothed depictions of children, which might land in the 'awful but lawful' sphere.

https://www.vktr.com/ai-ethics...