This morning I attended the Waterfront Toronto Board of Directors Meeting and while the initial summary of ‘realignments’ from the original Sidewalk Labs / SidewalkTO Master Innovation and Development Plan (MIDP) seemed a positive move forward, one point in the summary handout is deeply problematic.
Dark Pattern Designs
The underlined text is a clear example of “dark pattern” design – in this instance, language that obfuscates and/or manipulates attention and perception to the advantage of the corporate interest. Here, ‘commercially reasonable efforts’ defers the ethical treatment of data to established, current practices in the commercial sector. The legal loopholes and exceptions this phrasing enables mean you might as well say: whatever we can get away with legally, as per our Terms of Service, we will.
Let me give examples from a current *live* Privacy Policy that demonstrate how much personal and non-personal data is legally collected and used. The following sections are from Calm, the #1 Sleep app, Apple’s Best of 2018 Award Winner, Apple’s 2017 App of the Year, and ‘The Happiest App in the World,’ according to the Center for Humane Technology. Most striking (see below) is how user data is explicitly described as a ‘business asset’ that can be disclosed or transferred in the event of a bankruptcy.
You can read the full privacy statement here (downloaded, it’s a 22-page PDF). Note that you have to navigate to the statement from the Terms of Service page, a deliberate second step designed to deter users from reading the privacy policy.
Data Collection
Note below the range of data collected automatically and that none of this data is ‘personal information.’
In this section, “commercially reasonable” includes accessing personal information from other sources:
Note below the extensive collection of non-personal data: device identifier, user settings, location information, mobile carrier, and operating system of your device.
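To make that scope concrete, here is a minimal sketch, in Python, of the kind of ‘non-personal’ payload an app client could assemble automatically. The field names, values, and the single-JSON-payload framing are my illustrative assumptions, not Calm’s actual schema.

```python
import json
import platform
import uuid

# Hypothetical sketch of an automatic "non-personal" telemetry payload.
# Field names and values are illustrative assumptions, not Calm's schema;
# the point is how much a device reveals with no name or email attached.
payload = {
    "device_identifier": str(uuid.uuid4()),      # stands in for a persistent per-install ID
    "user_settings": {"reminders": True, "theme": "dark"},
    "location": {"lat": 43.64, "lon": -79.38},   # coarse, city-level fix
    "mobile_carrier": "ExampleTel",
    "operating_system": f"{platform.system()} {platform.release()}",
}

# None of these fields is a name or an email address, yet together they can
# single out a device -- and, in practice, the person holding it.
print(json.dumps(payload, indent=2))
```

Each field on its own looks innocuous; it is the combination, collected continuously and tied to one device identifier, that does the profiling work.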
Anonymized Data
Note below how anonymized Personal Information is aggregated, encompassing de-identified demographic data and de-identified location information, for further use. As such, “Anonymized and aggregated information is not Personal Information, and we may use such information in a number of ways…”
The security of anonymized data is tenuous: in July 2019, researchers at Imperial College London and UCLouvain “published a method they say is able to correctly re-identify 99.98% of individuals in anonymized data sets with just 15 demographic attributes.”
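The published result relies on a statistical generative model; the sketch below is only a toy substitute, a brute-force uniqueness count over synthetic records, showing how quickly a handful of ‘non-personal’ attributes becomes a near-unique fingerprint. The population size and attribute choices are arbitrary assumptions for illustration.

```python
import random
from collections import Counter

# Toy illustration (not the researchers' generative model): count how many
# synthetic records are unique on just a few "non-personal" attributes.
random.seed(0)
people = [
    (
        random.randint(1920, 2015),         # birth year
        random.choice(["F", "M", "X"]),     # gender marker
        random.randint(1, 500),             # postal-area-like code
        random.choice(["iOS", "Android"]),  # device OS
        random.randint(0, 23),              # typical hour of app use
    )
    for _ in range(100_000)
]

counts = Counter(people)
unique = sum(1 for person in people if counts[person] == 1)
print(f"{unique / len(people):.1%} of records are unique on just 5 attributes")
```

On this synthetic population the script reports that the large majority of records are already unique on five attributes; the real study needed only fifteen to re-identify nearly everyone in supposedly anonymized data.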
“Commercially Reasonable” & SidewalkTO
All of the above data collection and data use is “commercially reasonable.” The second major flag, in the continuation of the sentence I underlined, is the “process[ing] of non-personal data.” As a data category, this functionally includes anything and everything that is not personal: web-browsing and search history, and any other online activity you engage in, across devices and across platforms.
Suffice it to say, this particular phrasing gives Sidewalk Labs and SidewalkTO a firehose of data to analyze and add to pre-existing digital profiles of user activity, which we all have as Google/YouTube ad targets. Waterfront Toronto should be deeply concerned about what this statement legally allows. I find any assurance of data privacy protection here laughable.
If you haven’t read my prior posts on data privacy and children, a demographic more heavily regulated than adults, you can find them here:
“Data Creep in Schools and Daycares in Waterfront Toronto’s Quayside? Where’s the Alarm?” March 9, 2020.
“We street-proof our kids. Why aren’t we data proofing them?” Sept. 29, 2019.
“Can We Trust Alphabet & Sidewalk Toronto with Children’s Data? Past Violations Say No.” June 6, 2019.
“Protecting children’s data privacy in the smart city.” May 15, 2019.