We street proof our kids. Why aren’t we data proofing them?

My new post on the insecurity of children’s data and the need for data proofing is now up on The Conversation.

“Google recently agreed to pay a US$170 million fine for illegally gathering children’s personal data on YouTube without parental consent, a violation of the Children’s Online Privacy Protection Act (COPPA).

The United States Federal Trade Commission and the New York State Attorney General — who together brought the case against Google — now require YouTube to obtain consent from parents before collecting or sharing personal information. In addition, creators of child-directed content must self-identify to restrict the delivery of targeted ads.

The $170 million fine is a pittance given Alphabet Inc.’s (Google’s holding company) valuation of more than US$700 billion.

Our digital identities are assembled from data collected across all of our activities, which makes the narrow legal category of personal or identifying information increasingly irrelevant. Children today are subjected to data collection and targeting at a scale we cannot fathom. Right now, we also have no clue about the consequences, and regulatory protections to data-proof their futures are far from certain.

My ongoing research examines how big tech and media conglomerates use dark pattern design to bypass privacy regulations protecting personal information. It has revealed how vulnerable children are to data collection, and how Canada’s legislation in particular is failing them.

Incomprehensible scale

For adults and children alike, Google has access to everything from search queries to online purchases to any app or website associated with a Gmail account – including deleted accounts – or linked via cross-browser fingerprinting.

As a parent, you create a network of cross-connections whenever you enter information to make online purchases for your child or set up accounts for them on apps and websites. Added to this is all of your child’s activity on YouTube and YouTube Kids, from search queries to clicks on recommended videos to rewinds and duration of play time.

Then add cross-browser fingerprinting and, most recently, Google’s “GDPR workaround”: secret, buried web-tracking pages that act as pseudonymous markers to follow user activity across the web.

This latter violation of data privacy was revealed in a complaint filed with the Irish Data Protection Commission on the same day Google’s fine was made public.

We are talking about vast fields of data, the scale of which is difficult to comprehend. This data feeds Google’s artificial intelligence recommendation algorithms, which now steer everything from employment application processes to dating apps …”

Read the full post on The Conversation.