Shrewd'm.com 
A merry & shrewd investing community


Halls of Shrewd'm / US Policy
Author: albaby1 (Honorary Shrewd)

Subject: Re: and in the US ... crickets
Date: 01/07/26 2:01 PM
No. of Recommendations: 4
Umm. How is creating child sexual abuse material (CSAM) not a violation of the terms of service of both Google and Apple? Grok creating CSAM should trigger an immediate removal of X from every app store.

It might not meet the definition of CSAM. "Sexualized images" derived from photos of children wouldn't necessarily meet that legal definition unless they depict very specific acts or involve depictions of specific body parts. From the article, it appears that Grok is creating highly sexualized but still clothed depictions of children, which might land in the "awful but lawful" sphere.

https://www.vktr.com/ai-ethics-law-risk/groks-spic...

