I'm more beautiful than 40% of the Spice Girls.
A smoky house kicks off noisy alarms, and The Google Lady asks if we need help. A consideration of surveillance.
Read time: about 9 minutes. This week: They’re watching. Just trying to be helpful. Next week: Hoarding eggs, a rerun of a post from the pandemic, when toilet paper and eggs were scarce. That one lands on Thanksgiving Day (US).
Just in time for cooler weather, we got a fireplace insert. Cast iron. Heavy as a pregnant sow. Fitted with a catalytic combustion thingy to reduce pollutants. It’s going to keep us warmer when winter hits.
We had to break it in. “We’re supposed to have three small fires in it to burn off oils and residues from the casting and such,” Bond Girl Bride told me. I built a fire successfully, after a few tries that produced some smoke in the living room, which we blew around with a fan. I thought the fire was “small.”
Bond Girl Bride disagreed. “Like usual. Everything to excess,” BGB said.
The fire, maybe larger than it was supposed to be, threw off good heat. Abundant heat. Evaporating (ahem, burning) casting residues added more smoke to already hazy air.
On cue, of course, the smoke alarms chimed in, a startling unison. BEEP-BEEP-BEEP! Towel thrashing took care of that for a few seconds, but then the din resumed. Loudly. Endlessly. Piercingly. Rosie the Dog and Guy the Fish watched, worried in their creaturely ways. Omar the Cat was amused and no doubt awaited what would happen next.
“Go get another fan from the garage,” BGB barked.
I fetched it and positioned it, Leaning Tower of Pisa-like, against the back door. A Kleenex box kept the door slightly ajar, a tissue in the wind waving Omar off. I thought maybe the fan would pull some of the haze away from the smoke alarm, tornado-like.
Screaming through all of this: BEEP-BEEP-BEEP-BEEP!
Then, as if by magic, The Google Lady chimed in from the living room. “Do you want to call 9-1-1?” I thought she said.
I yelled to Bond Girl, “Did the fondle-slab just ask if we should call 9-1-1?”
Immediately, I thought perhaps that was exactly the wrong thing to say — Google Lady might interpret it as a command to call 9-1-1, which would only add distant sirens to the din.
Bond Girl yelled back, “No, it asked if we need help.”
“I’m always watching you, Wazowski. Always watching…”
How obliging that my Samsung tablet fondle-slab would think of us! How weird, too, that it did. At least that’s what I thought. I’m not convinced that my children or my students see weirdness there, only convenience.
After the smoke (literally) cleared, the more I thought about the automated offer of help, the less I appreciated it. It seemed an intrusion. In the hubbub of the moment, I didn’t fully understand what could have prompted the Google Lady to speak up, even given all the noise. Besides, I thought I had turned off as many of the fondle-slab’s surveillance features as I could find.
Of course, I wondered about being watched. And I especially wondered how concerns about surveillance shift, since I don’t think that concepts such as privacy have a rock-solid definition or, for that matter, even a modestly stable one. You only need to look at the Supreme Court’s decision overturning Roe v. Wade to see that.
Evan Selinger and Judy Rhee dove into questions about the process of “normalizing surveillance” and published an article bearing those words as its title (a free preprint is available here: https://papers.ssrn.com/abstract=3883551). Selinger and Rhee “categorized worries” this way:
the risk that selective attention leads people to overemphasize and overvalue the benefits of surveillance;
the risk that seemingly temporary surveillance measures become enduring and, possibly, more potent over time;
the risk that habituation leads people to view surveillance as unremarkable; and
the risk that people believe surveillance is acceptable and possibly desirable because widespread use makes it appear normal.
Of these worries, the third and fourth operate in my complex relationship with my fondle-slab. Its behavior while our smoke alarms screeched implied, of course, that some measure of surveillance was going on: a sensing action behind the scenes that I had not noticed or had forgotten, even though the watching was “unremarkably” and “normally” persistent. That surveillance was indeed “necessary” for the Google Lady to chime in at what Google judged the appropriate time, a programmed actuation kicked off by the sensing.
I was being watched all the time, evidently. For the sake of my convenience, among other reasons.
Surveillance as an on-off switch. And as an intrusion.
Perhaps my unease with the Google Lady’s offer in the din is misplaced, and I think this is where judgments of surveillance sometimes become difficult. It’s the connection of sensing and actuation that tangles us up as the connections grow more complex and opaque. If sensing (constant watching) drives only a very simple actuation, its “surveillance” functions as a switch, well bounded and quite transparent in what it turns on or off. Hardly surveillance at all, you could say. Think of an old-style thermostat. When the temperature drops below a threshold, the thermostat fires up the furnace. Simple. Hardwired. A machine that “watches” room temperature.
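The old-style thermostat’s whole sensing-and-actuation loop fits in a few lines. As a sketch (the function name and parameters here are illustrative, not any real device’s interface):

```python
# A minimal sketch of an old-style thermostat: sensing wired
# directly to a single, transparent actuation. Nothing is
# retained; the "surveillance" is just a threshold check.

def thermostat_step(room_temp_f: float, setpoint_f: float,
                    hysteresis_f: float = 1.0,
                    furnace_on: bool = False) -> bool:
    """Return the furnace state after one sensing cycle."""
    if room_temp_f < setpoint_f - hysteresis_f:
        return True          # too cold: fire up the furnace
    if room_temp_f > setpoint_f + hysteresis_f:
        return False         # warm enough: shut it off
    return furnace_on        # inside the dead band: leave it be

print(thermostat_step(64.0, 68.0))  # → True
```

The entire “actuation” is visible at a glance, which is exactly why it hardly registers as surveillance.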
Contrast that with the Google Nest, a bestseller and an extremely sophisticated sensing device. Google claims that using it can shave up to fifteen percent off heating and air-conditioning bills. The Nest on the wall does a lot more than the old 1940s Honeywell device on your grandmother’s hallway wall.1 It retains data (somewhere) and learns from it, enabling it to modulate temperature to fit your personal habits. It senses your presence, so that it can light up its readings and, presumably, record your presence as part of your habits. The data goes to Google while the Nest keeps your rooms comfortable.
The Google Nest Learning Thermostat does the same thing for you as the old Honeywell did for your grandmother, but it does it better and offers some useful features that come from data analysis. It’s also a great sensor for Google to use for its own purposes.
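The difference between the two devices is easy to caricature in code. Here is a hypothetical sketch, not Google’s actual implementation: the actuation end is the same simple threshold, but every cycle quietly lands in a retained history that can be analyzed later.

```python
# A hypothetical "learning" thermostat: the same threshold
# actuation as the old Honeywell, plus a retained log of
# temperature and presence. Illustrative only, not the Nest's
# actual behavior.

from dataclasses import dataclass, field

@dataclass
class LearningThermostat:
    setpoint_f: float
    history: list = field(default_factory=list)  # the retained data

    def step(self, room_temp_f: float, occupant_present: bool) -> bool:
        # The familiar, transparent actuation...
        furnace_on = room_temp_f < self.setpoint_f
        # ...but every cycle is recorded: when you're home,
        # how warm you keep it, and so on.
        self.history.append((room_temp_f, occupant_present, furnace_on))
        return furnace_on

    def occupancy_rate(self) -> float:
        """One small 'analysis': the fraction of cycles you were home."""
        if not self.history:
            return 0.0
        present = sum(1 for _, p, _ in self.history if p)
        return present / len(self.history)

nest = LearningThermostat(setpoint_f=68.0)
nest.step(64.0, occupant_present=True)
nest.step(71.0, occupant_present=False)
print(nest.occupancy_rate())  # → 0.5
```

Nothing in the `step` method’s visible behavior hints at the growing `history` list, which is the point: the actuation you see stays simple while the data quietly accumulates.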
And there’s the rub.
Part of how we might gauge the “creepiness” of surveillance has to do with its transparency: clarity about how things work. That is, who retains (or maybe even owns) data and for what purposes, including “downstream” uses by third parties or the kinds of “fusions” that data may undergo with other datasets. Transparency, I think, also includes matters of data type, capacity, and volume, because merely knowing that data is retained and used is easy to misinterpret and underestimate. More creepiness slithers in as the volume and variety of collected data increase and broaden. Transparency has to do with algorithmic manipulations, too, which can be quite challenging to untangle. “Small data surveillance” differs substantively from “big data cyber-surveillance.” Big data is by nature difficult for mere mortals to comprehend.2
In part, what’s “creepy” is opaque and therefore suspicious. The Nest unnerves some people, because they don’t know the real underlying terms of the trade-off of their data for the sake of comfortable room temperatures. They often focus on the present and immediate — a willed blindness, perhaps — in order to allay concerns, but that doesn’t resolve the opacity.
“How Normal Am I” — We underestimate the data that’s collected
Finding the edges of the data that’s collected is often mere guesswork, in part because collected data can undergo significant analysis and expansion. So while a website might collect, say, your interactions with its elements and declare as much “in the interest of transparency,” the downstream uses of apparently innocuous data can transform it into other kinds of data and judgments.
Tijmen Schep, who led the development of the website “How Normal Am I” (https://hownormalami.eu/), formulates part of the problem this way:
On dating websites you may only be matched to people who are about equally attractive. An insurance agency might have bought profiles from companies that use algorithms to judge your health or mental stability. It could be that your job application was judged by an algorithm which determined that you weren’t social enough for the company. The list is growing.
He adds, “One issue with this is that most people simply don’t know that this is happening.”
I’ve used “How Normal Am I” as part of my seminar, and I was particularly interested and amused by the AI’s assessment of “beauty.” Last year, I got the report that I was more beautiful than “forty percent of the Spice Girls.” This year, I got even more information — I ranked a “5.8” on a beauty scale and, despite growing a year older, I still kept pace with the Spice Girls. One student asked in class whether the beauty ranking was normalized according to age, which was a little deflating for my Spice Girl comparison but still a good question. (I am, alas, no longer a “Baby Spice.”) I actually wondered which two of the Spice Girls I outranked, beauty-wise. Poor ladies, whoever they are.
In a few amusing minutes, the site
estimated my age (shaving a couple years off, thank goodness),
estimated the number of years I had to live (twenty, more or less),
estimated my BMI (overshooting a bit),
assigned me a “beauty score” (maybe a bit generous),
captured a digital faceprint for re-identification,
determined whether I was happy, sad, or “neutral,”
determined with high accuracy where I looked at the screen and for how long,
figured out whether I was distracted, and
followed the movements of my mouse cursor.
In the few minutes I spent, the system captured 16,709 data points, which the website noted was a bit above average. It also used external data sources, such as geolocation data based on IP address, to triangulate and bolster results.
Schep’s narration in a small window on the screen is both informative and fun, so it’s easy to be flip about the information that “How Normal Am I” provides, but it’s also clearly a warning to us. We reveal far more than we know as we cavalierly surf the web, sign up for services, post updates. And what we think is innocent and benign information can be leveraged and transformed to serve purposes that we never expected or welcomed.
By the way, the website is served up in the European Union, and is subject to the GDPR. I had much more confidence using it than I would a similar website based in the US. You should give it a whirl!
Got a comment? We’re watching you.
Tags: surveillance, data collection, data fusion, personal data, GDPR, big data
Links, cited and not, some just interesting
Selinger, Evan, and Hyo Joo (Judy) Rhee. “Normalizing Surveillance.” SATS 22, no. 1 (July 27, 2021): 49–74. https://doi.org/10.1515/sats-2021-0002. The publisher wants $42 for a download, which is crazy! You can get the preprint, submitted version free from SSRN: https://papers.ssrn.com/abstract=3883551
Hu, Margaret. “Small Data Surveillance v. Big Data Cybersurveillance.” Pepperdine Law Review 42, no. 4 (2015): 773–844.
From the “daily missives”:
York, Joanna. “How ‘non-Verbal Communication’ Is Going Digital.” BBC Worklife, November 8, 2022. https://www.bbc.com/worklife/article/20221104-how-non-verbal-communication-is-going-digital.
Smith, Patti. “Things I’ve Seen, by Patti Smith.” The New Yorker, November 10, 2022. https://www.newyorker.com/culture/culture-desk/things-ive-seen.
Keegan, Jon. “How Political Campaigns Use Your Phone’s Location to Target You – The Markup.” The Markup, November 8, 2022. https://themarkup.org/privacy/2022/11/08/how-political-campaigns-use-your-phones-location-to-target-you.
Cheng, Michelle. “KFC Blamed a Bot for Its Kristallnacht Marketing Fail in Germany.” Quartz, November 10, 2022. https://qz.com/kfc-blamed-a-bot-for-its-kristallnacht-marketing-fail-i-1849769594.
Allyn, Bobby. “Google Pays Nearly $392 Million to Settle Sweeping Location-Tracking Case.” NPR, November 14, 2022, sec. Technology. https://www.npr.org/2022/11/14/1136521305/google-settlement-location-tracking-data-privacy.
Vallor, Shannon. “We Used to Get Excited about Technology. What Happened?” MIT Technology Review, October 21, 2022. https://www.technologyreview.com/2022/10/21/1061260/innovation-technology-what-happened/.
Bajaj, Simar. “A New, Transparent AI Tool May Help Detect Blood Poisoning.” Undark Magazine, October 12, 2022. https://undark.org/2022/10/12/a-new-transparent-ai-tool-may-help-detect-blood-poisoning/.
And, indeed, the term “Nest” today denotes more than just a fancy thermostat. It covers a class of home automation products, including surveillance cameras, doorbells, speakers, security features, and even a device that does “sleep sensing” at your bedside.
Margaret Hu follows Kate Crawford and Jason Schultz, who coined the “3Vs” of data collection (Volume, Velocity, and Variety) and characterize big data as high-volume, high-velocity, and wide-variety. There’s also “Veracity” as a fourth “V.” These terms are relative, of course, to the capacities of computational and sensing infrastructure; what was big in 2010 is no longer big in 2022. Crawford, Kate, and Jason Schultz. “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.” Boston College Law Review 55 (2014): 93.
Fascinating that the Spice Girls remain the beauty standard. Also, congrats on being so good looking!
Love the idea that an old-style thermostat “watches” the room--so normal and so easy to invite a modern gadget that performs the same function into your home. But how do we get it to stop gathering data, and how do we even know all that it gathers? Nice way to pose these issues, Mark.