Always treat others with respect and kindness, even if you disagree with them. Avoid making personal attacks or insulting others, and try to maintain a civil and constructive tone in your discussions.
- Manlobbi
Halls of Shrewd'm / US Policy
No. of Recommendations: 10
https://www.dailykos.com/stories/2026/1/6/2361708/...
Elon Musk’s AI chatbot Grok is being investigated by multiple governments around the world for its production of nonconsensual sexualized images, including photos of children. But despite the global uproar, senior Republicans have not commented on the issue—despite Musk’s role as a major donor to the GOP.
On Tuesday, the United Kingdom’s Secretary of State for Science, Innovation, and Technology Liz Kendall called on X, which is also owned by Musk, to address the circulation of the images, describing them as “absolutely appalling.”
No. of Recommendations: 9
Elon Musk’s AI chatbot Grok is being investigated by multiple governments around the world for its production of nonconsensual sexualized images, including photos of children. But despite the global uproar, senior Republicans have not commented on the issue—despite Musk’s role as a major donor to the GOP.
The party of pedophilia. Led by Donny the pussy grabber.
No. of Recommendations: 5
Some coverage by Bloomberg's Matt Levine, but I am sure he is reaching his Musk/MAGA hypocrisy limit:
"...See, if I were getting paid a trillion dollars to run a car company, I would try to (1) sell more cars and (2) not build a chatbot that creates sexualized images of minors, because I would think things like “I want a trillion dollars” and “I do not want to be fired from my lucrative job” and “I do not want to get arrested.” But Musk is optimizing for the fate of humanity, and selling more cars is boring. At least sell robots!"
Fanboyz are trying to address the latter problem by having every Tesla counted as a robot.
No. of Recommendations: 12
PucksFool: Elon Musk’s AI chatbot Grok is being investigated by multiple governments around the world for its production of nonconsensual sexualized images, including photos of children.
Umm. How is creating child sexual abuse material (CSAM) not a violation of the terms of service of both Google and Apple? Grok creating CSAM should trigger an immediate removal of X from every app store.
Google, Apple, WTAF? Journalists, where are you?
No. of Recommendations: 4
You're delusional. The quote you provided made no mention of child sexual abuse material.
That's all in your head.
Are you a pedo, or do you just aspire to be one?
No. of Recommendations: 4
Umm. How is creating child sexual abuse material (CSAM) not a violation of the terms of service of both Google and Apple? Grok creating CSAM should trigger an immediate removal of X from every app store.
It might not meet the definition of CSAM. "Sexualized images" made from photos of children wouldn't necessarily meet that legal definition unless they depicted very specific acts or involved specific body parts. From the article, it appears that Grok is creating highly sexualized but still clothed depictions of children, which might land in the 'awful but lawful' sphere.
https://www.vktr.com/ai-ethics-law-risk/groks-spic...
No. of Recommendations: 1
Define "highly sexualized." That's not a legal definition. It's exactly the type of totally vague and subjective non-definition that plays nicely into the EU's campaign to suppress free speech on these platforms.
I could just as easily say that Macron's wife punching him out on an airplane is a highly offensive and threatening depiction of domestic violence.
Oooh, let's sanction every media outlet that showed that clip unless they play ball with the EU censors.
No. of Recommendations: 9
Are you a pedo, or do you just aspire to be one?
You sure spend a lot of time calling other posters “pedos”.
Why is that?
No. of Recommendations: 5
albaby1: it appears that Grok is creating highly sexualized but still clothed depictions of children, which might land in the 'awful but lawful' sphere.
My understanding is that some of the 'clothing' is transparent or nearly see-through. But technicalities aside, had anyone besides X been doing this, they would have been removed in a New York minute.