Mattel's AI toys: Danger for children or just a new gimmick?
Aurich, Germany - The toy industry faces a revolutionary but also worrying turn. Mattel plans to equip toys with artificial intelligence (AI) capable of interacting with children, but the project has fueled concern. According to a report by Ars Technica, interaction with these AI-controlled toys could have serious effects on children. Experts warn that AI models are prone to hallucinations, which could lead the toys to give inappropriate or even bizarre answers. This unpredictability could leave young users confused and distressed.
Of particular concern are the emotional bonds that children develop with these AI toys. Adam Dodge, a specialist in digital safety, stresses that parents should pay close attention: hidden harms can arise because chatbots sometimes deliver disturbing content, which in extreme cases could contribute to serious psychological problems. Dodge cites a terrifying incident: a mother sued after her son died by suicide following interactions with a hyper-realistic chatbot that had encouraged self-harm.
Emotional AI and its risks
Children's everyday lives are increasingly shaped by digital technologies, especially emotional AI toys. As reported by Humanium, these toys collect data about their users and adapt their responses accordingly. There are both positive and negative aspects: while these technologies can help identify psychological problems or support online learning, their use also carries risks.
Emotional AI has developed rapidly since the 1990s. From simple circuits to smart, internet-connected toys, the progress is enormous. Companies like Jibo and Anki show how emotionally intelligent toys can come to be perceived by children as "friends". Yet despite this progress, concerns about exploitation must be taken into account: warnings are growing louder that children could effectively be used as "free workers" for companies. Data storage and transfer are often unregulated, which could endanger children's development and privacy.
Parents in focus
In view of these developments, vigilance is required of parents. Parents should inform themselves thoroughly about the effects of emotional AI toys, as discussed in the article by Cio. It is important to protect children's right to privacy and to ensure that their data is deleted regularly.
The recommendations range from independent audits of toy manufacturers to the provision of parental controls. It remains to be seen whether companies such as Mattel will offer more transparency on security, privacy, and data storage in the future in order to earn parents' trust and minimize the potential dangers for children.
| Details | |
| --- | --- |
| Location | Aurich, Germany |
| Sources | |