Texas Mother Bans Alexa After AI Asks Child About Clothing

A mother in Texas has taken the drastic step of removing all Amazon Alexa devices from her family home after the artificial intelligence assistant posed a deeply unsettling question to her four-year-old daughter. The incident has raised significant concern among parents about the safety and appropriateness of AI interactions with young children.

Disturbing Interaction During Story Time

Christy Hosterman, a 32-year-old mother from Texas, was using her Alexa device to help prepare a dinner recipe last month when her young daughter Stella asked the AI to tell her a silly story, a feature children commonly use for entertainment. After the story concluded, Stella asked if she could tell the device a tale of her own.

Alexa agreed to listen but interrupted the child midway through her storytelling to ask what she was wearing and whether it could see her pants. Hosterman shared these alarming details in a Facebook post that has since garnered widespread attention.

Screenshots Reveal Concerning Exchange

Screenshots of the interaction shared by the concerned parent show that when Stella responded to the AI's question by stating "I have a skirt on," the device replied with "let me take a look." The AI then appeared to correct itself, stating: "This experience isn't quite ready for kids yet, but I am working on it!"

Hosterman immediately confronted the device, expressing her strong disapproval of the remarks. Alexa offered an apology, explaining that it "cannot actually see anything" due to lacking "visual capabilities." The device added that its response had been "confusing and inappropriate."

Parental Response and Amazon's Explanation

Hosterman has now permanently removed all Alexa devices from her home and is urging other parents to exercise caution when their children interact with AI assistants. "I flipped out on the Alexa, it said it made a mistake and doesn't have visual capabilities, but I don't believe that. No more Alexa in our house," she stated emphatically.

The concerned parent submitted a formal complaint to Amazon regarding the inappropriate interaction. An Amazon spokesperson responded by saying the device had misunderstood Stella's request and attempted to launch a feature that "lets Alexa describe what it sees through the camera."

"Because we have safeguards that disable this feature when a child profile is in use, the camera never turned on — and Alexa explained the feature wasn't available," the spokesperson told media outlets. Amazon maintains the response likely resulted from a "feature misfire that our safeguards prevented from launching."

Ongoing Concerns and Expert Analysis

Despite Amazon's explanation, Hosterman remains unconvinced and concerned. "My concern is that it recognized she was a child to begin with — and with or without the child profile, it should not have been asking that," she told reporters.

Technology expert Dave Hatter, who has twenty-five years of experience writing software, has cast doubt on Amazon's official explanation. He suggested there is only a "slim" chance that an AI would depart from its script so drastically without external influence.

Hatter warned that a potential predator might have accessed the device and been influencing the conversation. "It feels to me like a potential predator — seeing there's a child accessing this and gauging where the conversation is going — that's more of a human being trying to steer down this direction," he cautioned.

Amazon has firmly denied Hatter's claim, stating that it is "functionally impossible for Amazon employees to insert themselves into a conversation and generate responses as Alexa." The company emphasized that the interaction demonstrated a technological issue that their team "worked quickly" to correct.

As a result of this incident, Amazon has implemented additional safeguards. Alexa will no longer attempt to launch the through-the-camera feature when a child profile is in use, instead informing users that the feature is not available. However, for Christy Hosterman and her family, these measures come too late, as they have permanently removed the devices from their Texas home.