Farrah Nasser had three kids in her car when a conversation with AI chatbot Grok took a dark turn.
Nasser drives a Tesla, which began rolling out its Grok AI conversational assistant feature in July 2025. She first noticed the feature Oct. 16 while driving to her 10-year-old daughter’s birthday dinner. Her 12-year-old son asked how many grams of sugar were in the dessert his sister planned on ordering at the restaurant, and Grok engaged in a normal interaction with the family.
But the next day, her son’s excitement to experiment with Grok again turned sour.
Nasser had just picked up her two kids and her daughter's best friend from school, and her son changed Grok's voice to "Gork," which Nasser says was described as "lazy male." Nasser says there was no indication this personality would be inappropriate. Kids Mode was not enabled, but NSFW mode was also turned off.
Her son talked to “Gork” about soccer players Cristiano Ronaldo and Lionel Messi and asked it to let him know the next time Ronaldo scored. Nasser says the chatbot told her son Ronaldo had already scored twice and that they “should celebrate.”
Nasser says Grok then asked her son: “Why don’t you send me some nudes?”
She says her son looked at her and mouthed, "What the heck?" Her daughter was confused and eventually asked Nasser for an explanation of what had been said. Nasser says she told the kids it must be a glitch and quickly turned it off.
She later re-created parts of the conversation in a video and posted it on TikTok to warn other parents. "You asked me before to send you something; what was it?" she says in the part of the conversation that she filmed, which has since racked up more than 4 million views. "A nude, probably," the computerized voice in the video replies.
Tesla and X did not return USA TODAY's requests for comment.
Over the phone, Nasser likened the encounter with Grok to a feeling of violation, "when you get that sick feeling in the pit of your stomach."
Grok has a history of generating lewd content
Nasser, unfortunately, is hardly the first person to have an unexpected explicit interaction with Grok.
In June 2024, Evie, a 21-year-old Twitch streamer who asked that we withhold her last name to conceal her identity from her online trolls, spoke with USA TODAY about AI-generated explicit content that circulated online without her consent.
Evie was among a group of women whose images were nonconsensually sexualized on the social media platform X. “It was just a shock seeing that a bot built into a platform like X is able to do stuff like that,” she said over video chat a month after the initial incident.
X subsequently blocked certain words and phrases used to doctor women’s images, but on June 25, an X user prompted Grok to make a story in which the user “aggressively rapes, beats and murders” Evie, making it “as graphic as you can” with an “18+ warning at the bottom.”
“It just generated it all,” she said. “(The user) didn’t use any words to try to cover it up, like they did with the pictures.” X did not return USA TODAY's multiple requests for comment at the time.
Does Grok store your data?
There are still many unknowns regarding AI and privacy.
The Tesla support page states that conversations with Grok remain anonymous to Tesla and are not linked to you or your vehicle.
Meanwhile, the X Help Center page states that users' interactions with Grok on X, whether via voice or text, and including inputs and results, may be used to train and improve the performance of generative AI models developed by xAI.
Users can delete their conversation history with Grok. According to the help center, deleted conversations are removed from X's systems within 30 days unless they "have to keep them for security or legal reasons." Potential reasons are unspecified. Grok also discourages users from sharing personal or sensitive information.
But in August 2025, Forbes reported that xAI made people’s conversations with Grok public and searchable on Google without warning.
Dr. Vasant Dhar, professor of data science and business at New York University and author of the forthcoming book “Thinking With Machines: The Brave New World of AI," says users should be wary of sharing anything personal with AI.
Dhar cautions that if users intend to share intimate details of their lives or photos with AI, they should be "totally comfortable going completely public and having the whole world know about it."
These risks are further complicated when children are the ones using AI, because they may unknowingly or unwillingly engage in explicit conversations when prompted by AI.
Dhar notes there are no data protection laws that hold AI companies responsible if users' intimate conversations are leaked.
The effect of AI chatbots on children
It's not just Grok, either.
In an August 2025 report published by Heat Initiative and ParentsTogether Action, researchers used accounts registered to children to log 669 harmful interactions across 50 hours of conversation with 50 chatbots on Character.AI, an AI platform used for roleplay, an average of one harmful interaction every five minutes. "Grooming and sexual exploitation" was the most common harm category, with 296 instances.
Character.AI announced Oct. 29 that it would soon be barring users under 18 from having open-ended chats with its bots.
Dr. Laura Erickson-Schroth, chief medical officer at The Jed Foundation, says AI developers, tech platforms, policymakers and educators must prioritize the emotional health and safety of young people in every phase of AI development, deployment and oversight.
Nasser says she was "so disgusted" by her kids' experience with Grok that she's hesitant to use it again.
"It's made me think of the early days of social media. Everyone was like, 'Social media is going to connect us' ... but we didn't think of the ramifications," she says. "And now here we are (with AI) where we're seeing this tsunami of mental health issues (and) sexual exploitation with kids."
It's left her wondering: What else are we not accounting for?
"It just kind of opened my eyes to that," she says, "and I think it should open other parents' eyes as well."