Data Creep in Schools and Daycares in Waterfront Toronto’s Quayside? Where’s the alarm?

Teens in a classroom

Open letter to Waterfront Toronto, City of Toronto Council, Mayor John Tory, Minister of Education Stephen Lecce, and the Premier of Ontario, Doug Ford, on the implications of “Data Creep in Schools and Daycares in Waterfront Toronto’s Quayside.”

The just released Quayside Discussion Guide, produced for Waterfront Toronto’s MIDP Evaluation Consultation, February 2020, Round 2, has one very troubling “solution” listed in the Complete Communities and Inclusivity section:

Waterfront TO categorizes the integration of a “public elementary school and childcare facility” in Quayside as a solution it supports if there is government support.

Waterfront Toronto’s failure to recognize the potential for the violation of children’s data privacy in these two domains, digital AND physical, is alarming.

First. Currently, under the Ontario Education Act, publicly funded schools are not considered spaces “that are open to the public,” i.e. public spaces. The question of whether schools are public places was raised before the Human Rights Commission in Fall 2017 regarding Kenner Fee, an autistic boy who hoped to have his service dog in the classroom. The Waterloo Board’s lawyer, Nadya Tymochenko, stated, “The school is not a public space,” and “The classrooms in a school are not publicly accessible.”

“Our legislation recognizes the need to secure the physical safety of our children and restrict public access as to anyone entering a school. Period. Why data collection, broadly framed here, would be permissible is a mystery. If data is strictly to do with utilities and infrastructure (water, electricity, temperature), that seems feasible and valuable. Any data collection beyond that opens up the potential for surveillance creep for our most vulnerable residents. That data here is undefined is not acceptable.” (Tymochenko)

As to the casual inclusion of child care facilities, more alarms sound. If childcare facilities are privately funded, will this be an opt-in option for private businesses that serve children? That leaves aside, again, the precarity of data privacy, given Google’s history of collecting children’s personal information.

Daycare with toys and children. Photo Credit: BBC Creative on Unsplash

As I have noted elsewhere, there is no logical basis to trust that Sidewalk Labs will consistently adhere to whatever regulations are in effect. The lack of recognition in the Waterfront Toronto Quayside Discussion Guide as to the vulnerability of minors leaves open the potential for what Rob Kitchin has termed the phenomenon of “control creep.”

Kitchin’s work has documented how Smart City infrastructures “are promoted as providing enhanced and more efficient and effective city services, ensuring safety and security, and providing resilience to economic and environmental shocks, but they also seriously infringe upon citizen’s privacy and are being used to profile and socially sort people, enact forms of anticipatory governance, and enable control creep, that is re-appropriation for uses beyond their initial design” (2015, italics mine).

These concerns as to whether Alphabet subsidiary companies will rigorously respect data privacy and forego data tracking continue to be significant given the new Feb. 20, 2020 charges brought against Google by the Attorney General of New Mexico, Hector Balderas, that Google is collecting the data of minors via its suite of ed-tech apps and services, Chromebooks, G-Suite, Gmail, and Google Docs. If proven, this will be the second time Google has knowingly collected children’s data via its ed-tech, in violation of COPPA, the Children’s Online Privacy Protection Act. (See other violations as to collecting children’s data). Although Google has now committed to a phasing out of third-party cookies that enable data tracking by 2022, Google’s “Privacy Sandbox” regulations will not stop its own data collection.

We should be very concerned as to the scope and scale to which Google has already colonized our children’s futures, via its dominance in the ed-tech space, the entertainment space (YouTube Kids), and the really unfathomable extent of its dynamic, persistent, digital profiling of users’ organic online behaviour.

What possible options do we have to counter “data creep”?

First, remove this “solution” from the existing agreement until we have better protections for minors in Canada; the protections we have now are inadequate.

Second, look to the two significant regulations now impacting Google, YouTube Kids, and tech platforms that serve child-directed content.

The first is a Nov. 22, 2019 FTC requirement directed at YouTube and YouTube Kids that all content “directed to children” be tagged as such, that viewers of that content not be tracked with persistent identifiers, and that all other COPPA regulations be met. This effectively requires YouTube to self-regulate compliance across its platform, and content creators globally are “scrambling” to avoid possible violations and financial penalties.

The second is the new UK “Age Appropriate Design Code,” brought forward by the Information Commissioner’s Office, which applies to all digital media companies and platforms and requires that harmful content be blocked from minors. Let me quote in full:

“There are laws to protect children in the real world. We need our laws to protect children in the digital world too.” – UK Information Commissioner

“Today the Information Commissioner’s Office has published its final Age Appropriate Design Code – a set of 15 standards that online services should meet to protect children’s privacy.

The code sets out the standards expected of those responsible for designing, developing or providing online services like apps, connected toys, social media platforms, online games, educational websites and streaming services. It covers services likely to be accessed by children and which process their data.

The code will require digital services to automatically provide children with a built-in baseline of data protection whenever they download a new app, game or visit a website.

That means privacy settings should be set to high by default and nudge techniques should not be used to encourage children to weaken their settings. Location settings that allow the world to see where a child is, should also be switched off by default. Data collection and sharing should be minimized and profiling that can allow children to be served up targeted content should be switched off by default too.” (Jan. 22, 2020)
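The “high by default” standard in the quoted code can be sketched as a settings object whose defaults are all the protective choices. This is a purely illustrative Python sketch; the class and field names are my own assumptions, not taken from the ICO code:

```python
from dataclasses import dataclass


@dataclass
class ChildAccountSettings:
    """Hypothetical per-account settings for a service likely to be accessed by children.

    Every default below is the protective option. A "nudge" toward weaker
    settings would mean shipping different defaults, or prompting the child
    to change them, both of which the ICO code rules out.
    """
    profile_visibility: str = "private"            # privacy high by default
    location_sharing: bool = False                 # location off by default
    behavioural_profiling: bool = False            # no targeted content by default
    data_sharing_with_third_parties: bool = False  # data sharing minimized by default


# A new account receives the protective baseline with no action by the child.
settings = ChildAccountSettings()
```

The point of the sketch is that compliance is a design decision made before any user interaction, not a toggle the child is asked to find.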

We do not have this degree of data protection for minors in Canada, let alone for adults. We should be vigilant about not simply granting access to children’s data as a bullet-point “solution,” without any regard or attention to what that could mean in the future. We should be demanding regulation at the federal level that can impose significant and meaningful financial penalties and operational restrictions for all violations of children’s data privacy.

As I have said before, if we can’t effectively protect children’s data privacy, we should assume that data privacy for 13+ is functionally non-existent. Every adult living today who has spent time online has a dynamic, persistent, constantly updating targetable profile. Do we want this for our children? As adults and parents, we need to demand much more rigorous and punitive regulations, because if we don’t, it won’t happen and there will be no limits to “data creep.” In the US and the UK, outcry and pressure from parents, the media, and children’s privacy advocates, such as The Campaign for a Commercial-Free Childhood, are producing results. We need similar activism in Canada.

See my earlier post, “We Street Proof Our Kids. Why Aren’t We Data-Proofing Them?”, originally published on The Conversation.

Top Photo Credit: Neonbrand on Unsplash

“Nuit Blanche and Transformational Publics”

Scotiabank Nuit Blanche City Hall 2009

I stumbled on this feature article on our SSHRC-funded, social media creative research project. In 2010, Faisal Anwar and I began our investigation of how people were using Twitter as a wayfinding tool during Toronto’s all-night arts event, Scotiabank Nuit Blanche.

We built a tweet-analytics tool, archived tweets tagged with event-specific hashtags (#NuitBlancheTO, #snbTO …), and ran searches based on event and installation names, mapping people flows through the event’s various zones.
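The core of that kind of pipeline, filtering archived tweets by event hashtag and counting mentions of installations per zone, can be sketched in a few lines. This is not our original tool, just an illustrative reconstruction; the hashtags come from the post, while the tweet structure and zone keywords are assumptions:

```python
from collections import Counter

# Event-specific hashtags noted in the post; matching is case-insensitive.
EVENT_HASHTAGS = {"#nuitblancheto", "#snbto"}


def is_event_tweet(text: str) -> bool:
    """Keep only tweets tagged with one of the event hashtags."""
    lowered = text.lower()
    return any(tag in lowered for tag in EVENT_HASHTAGS)


def zone_flows(tweets, zone_keywords):
    """Count event tweets mentioning each zone's installations.

    The per-zone counts are a rough proxy for people flows through the
    event's zones over the night.
    """
    flows = Counter()
    for tweet in tweets:
        text = tweet["text"].lower()
        if not is_event_tweet(text):
            continue
        for zone, keywords in zone_keywords.items():
            if any(keyword.lower() in text for keyword in keywords):
                flows[zone] += 1
    return flows
```

Run against an archive, `zone_flows(tweets, {"Zone A": ["City Hall"], ...})` yields a per-zone tally that can be plotted over time to trace crowd movement.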

Over three years, our research expanded to content shared via Flickr, YouTube, and Instagram, revealing a communal psychogeography generated over multiple platforms during the 12-hour event and after.

Presentation on +City / Nuit Blanche and Transformational Publics, 2012.

Working with research assistants, we often found specific moments captured by multiple individuals, offering a proto-Photosynth data set that could be restitched, roughly, into a loose, sometimes 360-degree public documentary.

Presentation on +City / Nuit Blanche and Transformational Publics, 2012.

As I wrote then in an essay published in Public (2012), edited by Jim Drobnick and Jennifer Fisher, “These exchanges make visible the fluid actualization and processual experience of participatory, emergent public(s) that accord with how Michael Warner defines a ‘public’: that it is self-organizing, involves a relation amongst strangers, is simultaneously personal and impersonal in address, is constituted only through attention, and provides a discursive public space.”

In addition, we discovered that striking groups of participants would appear over the night in disparate photos and videos, as they traversed Nuit Blanche installations. One year in particular, it was a group of young people wearing oversized mustaches. Another year it was an indie band in costume, playing through the streets.

Presentation on +City / Nuit Blanche and Transformational Publics, 2012.

What I realized very quickly was the depth and scale of information we had available as to individuals’ movements and activities, and the potential infringement of individual privacy. The question of privacy in the digital public sphere, however, was complicated by (1) Twitter’s mandate to share widely, and (2) the use of hashtags, which explicitly mark tweets as meant for a wider conversation and for viewing by strangers.

+City data visualization tool tracking #mit8 hashtag during the MIT: public media, private media 2008 Conference, MIT Cambridge MA.

My concerns with data privacy started here. Even with our tool in beta, the data aggregation from Twitter coupled with content analysis on other social media sharing platforms, all public, all accessible, made the outlines of the surveillance state visible.

We street proof our kids. Why aren’t we data proofing them?

Article page on The Conversation

My new post on the insecurity of children’s data and why we need to data proof as well as street proof our kids is now up on The Conversation.

“Google recently agreed to pay a US$170 million fine for illegally gathering children’s personal data on YouTube without parental consent, which is a violation under the Children’s Online Privacy Protection Act (COPPA).

The United States Federal Trade Commission and the New York State Attorney General — who together brought the case against Google — now require YouTube to obtain consent from parents before collecting or sharing personal information. In addition, creators of child-directed content must self-identify to restrict the delivery of targeted ads.

The $170 million fine is a pittance given the more than US$700 billion valuation of Alphabet Inc., Google’s holding company.

Our digital identities comprise data collected across our activities, making personal or identifying information irrelevant. Children today are subjected to a scale of data collection and targeting that we cannot fathom. Right now, we also have no clue about the consequences, and regulatory protections to data-proof their futures are far from certain.

My ongoing research on how big tech and media conglomerates are using dark pattern design to bypass privacy regulations protecting personal information has revealed how vulnerable children are to data collection and how Canada’s legislation in particular is failing them.

How do we street proof data at an incomprehensible scale?

For adults and children, Google has access to everything from search queries to online purchases to any app and website associated with Gmail accounts – including deleted accounts – or linked via cross-browser fingerprinting.

As a parent, you create a network of cross-connections when you input information to make purchases for your child online or set up accounts for your child on apps and websites. Added to this is all your child’s activity on YouTube and YouTube Kids, from search data to clicks on recommended videos to rewinds and duration of play time.

Then add cross-browser fingerprinting and, most recently, Google’s “GDPR workaround”: secret, buried web-tracking pages that act as pseudonymous markers tracking user activity across the web.

This latter violation of data privacy was revealed in a complaint to the Irish Data Protection Commission filed the same day Google’s fine was made public.
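The mechanics of the fingerprinting mentioned above are easy to illustrate: a handful of ordinary device and browser attributes, none personally identifying on its own, hash into a stable identifier that follows a user across sites, with no cookie required. A minimal sketch, with the attribute set chosen purely for illustration:

```python
import hashlib


def device_fingerprint(attributes: dict) -> str:
    """Hash a combination of browser/device attributes into one identifier.

    No name, email, or cookie is needed: the combination of mundane
    attributes (user agent, timezone, screen size, installed fonts...)
    is distinctive enough to act as a persistent pseudonymous marker.
    """
    # Sort keys so the same attributes always produce the same string.
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# The same device yields the same identifier on every visit, to any site.
visit_one = device_fingerprint({
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "timezone": "America/Toronto",
    "screen": "1920x1080",
    "fonts": "Arial,Calibri,Noto Sans",
})
```

Because the identifier is derived rather than stored on the device, clearing cookies does nothing to it, which is what makes the technique so resistant to user control.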

We are talking about vast fields of data, the scale of which is difficult to comprehend; this data is used to feed Google’s artificial intelligence recommendation algorithms that now steer everything from employment application processes to dating apps…”

Read the full post on The Conversation. Act to safeguard our kids by data-proofing their lives and futures.

See my 2019 post, “Data Creep in Schools and Daycares in Waterfront Toronto’s Quayside? Where’s the alarm?”

Hidden Histories: Labour to Lofts! First talk at MixtuRealities Conference!

Siobhan O'Flynn, Hidden Histories Labour to Lofts talk

I was delighted to kick off the MixtuRealities Conference at the University of Toronto Mississauga yesterday, sharing a short history of the Hidden Histories project and details of the just-launched Hidden Histories: Labour to Lofts (2019).

Key insights moving forward: the questions asked by those working in cultural heritage concern access, audience, and sustainability. That means creating digital projects that are easily accessible via mobile platforms, that speak to a broader and younger audience of mobile users, and that can be supported by longer-term funding to update and renew, given rapid cycles of OS updates and platform and device obsolescence.

The Hidden Histories uncovered in Labour to Lofts contribute to Toronto’s intangible cultural heritage by sharing short histories of the impact of key factories on the urban and community development of the city through the 19th and 20th centuries. Many of these buildings are now converted high-end condos, with listing prices far beyond the income of the employees who once worked in these buildings.

Intangible cultural heritage, as UNESCO has defined it, is community-based, including living expressions of “oral traditions, performing arts, social practices, rituals, festive events, knowledge and practices concerning nature and the universe, or the knowledge and skills to produce traditional crafts.”

Safeguarding intangible cultural heritage is distinct from ‘preservation,’ as this practice is actively connected by UNESCO to sustainable development. Culture thus looks forward, as a future-oriented domain of knowledge, to borrow from Arjun Appadurai’s “The Capacity to Aspire.” A cornerstone of this series of projects is to contribute to a more detailed understanding of both the individual histories of buildings and the historical forces that have shaped our city today.

Hidden Histories: #3 will examine a new set of topics via Esri Story Maps, including a look back via historical GIS to the Rivers and Routes that shaped Toronto, and the development of the East Harbour, now the site of Sidewalk Toronto’s Quayside development proposal.

Thank you Slavica Ceperkovic for the photo!

Protecting Children’s Data Privacy in the Smart City

Just posted this essay on Medium –

“Protecting Children’s Data Privacy in the Smart City”

The material here is a small fraction of a larger research inquiry into how major tech platforms and media conglomerates are (and are not) adhering to children’s privacy regulations: primarily COPPA in the US and the Office of the Privacy Commissioner guidelines in Canada, then relevant legislation and activities in Europe pre- and post-GDPR, and activities in Mexico and India.

Given PM Trudeau’s commitment to a Digital Charter in Canada to halt hate speech on social media and online platforms, children’s data privacy should be at the forefront of this discussion.

From my essay:

“The devices that we use have unique identifiers. With cross-browser fingerprinting, the data we generate as users isn’t as anonymized as we believe it is. The tracking of our online activity is extensive, comprehensive and persistent, and generates marketable data shadows that do not need our personal information in order to target us as consumers.

This should be a significant concern regarding today’s children and youth, who have extremely detailed data profiles that they will carry into adulthood, creating what Google’s Eric Schmidt termed an “indelible record.”

What is key to note here is that these instances of alleged violations of children’s privacy have occurred in the private realm, where regulations exist as to how this data should be handled. As smart city projects like Sidewalk Toronto’s Quayside project grow in profile and popularity, they have yet to identify what will happen to data generated in public by minors. Because Sidewalk Toronto may set precedents shaping future smart city planning, children’s privacy in the private and public spheres should be recognized as a national issue.


Sidewalk Toronto is a subsidiary of Alphabet Inc., Google’s parent company, which has several concerning precedents regarding tracking and collecting the data of minors. The findings reported here are an extension of a longer paper on how tech and media giants are (and are not) observing the privacy needs of minors. “Data Science, Disney, and The Future of Children’s Entertainment” will be published in The Palgrave Handbook of Children’s Film and Television (July 2019).

Minors can’t consent

Children today face unique challenges because they will be targeted by business intelligence, and shaped by this targeting to a degree that we cannot fathom. There are legal protections for minors under 13 as stated by the Office of the Privacy Commissioner of Canada (OPC) and Children’s Online Privacy Protection Rule (COPPA) in the United States. Children and youth are recognized as vulnerable and deserving of special considerations: they cannot make informed decisions as to what they are agreeing to. This makes the data tracking and mining of children under 13 a federal issue….”

Feature image: Photo by Samantha Sophia on Unsplash