UK regulator raises privacy concerns over Snap’s My AI

Governments around the world have viewed My AI with suspicion for some time. In the latest development, the UK’s Information Commissioner’s Office (ICO) issued a warning against Snap’s AI chatbot and raised questions about children’s privacy.

According to reports, the concerns stem from Snap’s alleged failure to assess the privacy risks of My AI before its launch. Although no breach has been reported, the ICO is concerned about the personal information of children aged 13 to 17 under the Children’s Design Code, a data protection rule introduced in 2021. If Snapchat fails to address the issues, the chatbot could be banned in the UK.

My AI is a major privacy concern for the UK regulator

My AI was introduced back in February for premium Snapchat subscribers, offering a virtual friend that gives advice and answers questions. After an initial testing period, the feature was rolled out publicly to all users, including children.

According to the company, the chatbot has strict moderation and protection mechanisms, yet there have been several incidents of the AI giving inappropriate advice. For instance, the chatbot reportedly advised a 15-year-old user on how to mask the scent of alcohol and a 13-year-old user about sexual encounters.

Response from Snap

Snap responded to these claims by reiterating its dedication to customer privacy and saying, “We are carefully analyzing the ICO’s preliminary ruling. As part of our usual product development process, My AI completed a thorough legal and privacy review procedure before being made available to the public. To make sure the ICO is happy with our risk assessment practices, we will keep working together.”
