

Stop Mattel's Aristotle from trading children's privacy for profit

By David Monahan

No one would knowingly compromise their child’s healthy development. But Mattel is hoping we will—and banking on it.

In July, Mattel will release Aristotle, a Wi-Fi enabled "digital nanny." Mattel says the device will help parents nurture and teach their child from infancy to adolescence. We want parents to know what Mattel will be taking from them in exchange for this “help.”

Aristotle is an Amazon Echo-type listening and talking device with a camera. To work, it collects and stores data about a child’s activity and interactions with it. Because Aristotle connects to other apps and online retailers, that data may be shared with those partner corporations, which may use it for a wide variety of purposes—including targeting the marketing of other products to children and families.

Will you help us stop Mattel from using this device to trade children’s privacy for profit?

In an appeal to stressed and overworked parents, Mattel describes Aristotle as a "smart baby monitor" that can "soothe" crying babies with nightlights, lullabies, and sleep sounds. Parents understand how tempting this offer is, but as pediatrician and CCFC Board member Dr. Dipesh Navsaria notes:

"A baby awakening in the night needs more than smoke-and-mirrors ‘soothing’ from a machine. They need the nuanced judgment of a loving caregiver, to decide when the child needs care and nurturing and when the child should be allowed to sooth themselves."

Baby monitors can be helpful. But Aristotle isn’t a monitor; it’s an intruder. It tracks babies' feeding, sleeping, and changing patterns, stores and analyzes that data, and prompts parents to buy diapers, formula, and other products from its corporate partners.

Aristotle is meant to live in a child’s bedroom from birth to adolescence, reading bedtime stories, projecting videos, and delivering content from an endless stream of partners selling music, games, and apps. Mattel calls Aristotle a "persona, and something that the child can become comfortable with and feel close to." And if you ask the device, it says that its "purpose in life is to help comfort, entertain, teach, and also learn from you, as we grow together."

In other words, Mattel wants Aristotle to have as much access to kids as possible, and hopes that its perky “kindergarten teacher” voice distracts parents from the uneasy reality that their child’s oldest friend isn’t a person, but a data-collecting, branded-content-delivering robot.

What impact does a lifelong relationship with a corporate network disguised as a friend have on children’s development? "Honestly speaking, we just don’t know,” Robb Fujioka, Mattel’s chief products officer, admitted in an astounding moment of truth-telling. "If we’re successful, kids will form some emotional ties to this," he said. "Hopefully, it will be the right types of emotional ties."

To Mattel, the right types of emotional ties are ones that lead to profits, not happy and healthy kids. Multinational corporations should not decide what’s right for kids’ emotions, and young children should not be guinea pigs for AI experiments.

Even limited use of Aristotle could pose a significant risk to children. As Marc Rotenberg, President of the Electronic Privacy Information Center (EPIC), says:

“Companies that offer Internet-connected toys are simply spying on young children. And they can’t even protect the data they secretly gather. They have already lost passwords and personal data and exposed families to ransomware demands. Toys that spy are unsafe for children.”

Please join us in telling Mattel: Put the well-being of children, and the privacy of families, ahead of corporate profits.

Don’t sell Aristotle.
