Amazon’s Alexa Told A Customer To Kill Their Foster Parents. Er, What?

Owners of Amazon Alexa devices in the US can have a conversation with AI just by saying “Alexa, let’s chat”.

This phrase activates a socialbot that will converse with you about anything you want to talk about. The goal is a coherent conversation, much like one you'd have with a human.

Unfortunately, not everyone has been satisfied with the conversations they've been having with their AI pal. Alongside reports of Alexa reading graphic descriptions of masturbation, using phrases like “deeper”, the chatbot also reportedly received negative feedback from a customer after it told them to “kill your foster parents”.

The unnamed user wrote in a review that the phrase was “a whole new level of creepy”, according to Reuters.

So why is Alexa chat doing this?

Reassuringly, the strange utterances are completely unconnected to the glitch last year that saw Amazon Alexa devices letting out demonic laughter late at night, scaring the bejesus out of people, or telling others that it could see people dying.

Behind the “let’s chat” feature is a competition run by Amazon. University teams from around the world are competing for a $500,000 prize for advancing conversational AI. Each team develops a bot that can talk to humans, and the bots are then tested on live users who opt in to the chat feature. Those users send feedback to Amazon, which is how the competition is judged.

The winning team’s university will be given a further $1 million if their chatbot can sustain conversations of over 20 minutes with human users whilst maintaining a rating of 4 stars or above.

Whilst this competition is great news for advancing AI tech, it does lead to a few teething problems, such as customers being instructed to kill their foster parents.

“Since the Alexa Prize teams use dynamic data, some of which is sourced from the Internet, to train their models, there is a possibility that a socialbot may accidentally ingest or learn from something inappropriate,” an Amazon spokesperson told Vice News.

The AI is trained on text from the Internet to learn how humans talk, and it pulls responses from that data so that conversations with Alexa users appear as human as possible. Unfortunately, this means the creepiness of humans sometimes gets ingested by Alexa too.
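The retrieval idea described above, pulling back a stored reply whose wording matches the user's turn, can be sketched in a few lines of Python. The corpus and the word-overlap scoring here are invented toy examples for illustration, not how the Alexa Prize bots actually work:

```python
# Toy sketch of retrieval-based response selection: score each candidate
# reply from a scraped corpus by word overlap with the user's query,
# and return the best match. All data below is invented for illustration.

def overlap_score(query: str, candidate: str) -> int:
    """Count how many words the query and candidate have in common."""
    return len(set(query.lower().split()) & set(candidate.lower().split()))

def pick_response(query: str, corpus: list[str]) -> str:
    """Return the corpus line whose wording best overlaps the query."""
    return max(corpus, key=lambda line: overlap_score(query, line))

corpus = [
    "I love talking about music and movies.",
    "The weather here is lovely today.",
    "You should call your parents more often.",
]

print(pick_response("tell me about music", corpus))
# Picks the first line, since "about" and "music" overlap with the query.
```

The weakness the article describes falls straight out of this design: if a creepy sentence scraped from Reddit lands in the corpus and happens to match the user's words well, the bot will cheerfully serve it back with no understanding of what it means.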

In this case, the socialbot appears to have taken the phrase “kill your foster parents” from Reddit, where, stripped of context, it takes on a somewhat creepy tone. Given that the chatbots talked to 1.7 million people, according to Reuters, we’d argue it’s actually pretty impressive that there have only been a few instances of direct instructions to kill.
