The process of natural selection had no particular interest in us after we reproduced ourselves, so body parts such as eyes and ears did not come with extended warranties. As a consequence of surviving past my “sell by” date, I wear hearing aids. I use a company called Phonak, and their product is quite good. One of the attractions of Phonak is that my hearing aids are connected to a website on which I can adjust the volume, directionality, and other features that may be handy under different ambient conditions. I purchased my current pair of hearing aids a year ago, expecting to continue using the full functionality afforded by the adjustable settings on the website. Six months after I made the purchase, a pop-up appeared on the website, informing me that, in order to continue to have access to the website and its features, I needed to give Phonak permission to collect data from my usage. There was no option to decline this authorization, and I would not be able to access the site until I gave them permission.
I caught myself just as I was about to click “Accept.” Then I thought about what they were requesting, which was permission to stick a probe into my brain and collect any data they happened to find. I realized I was reluctant to authorize this intrusion into my cranium. I spoke to my audiologist, who expressed surprise because nobody had ever objected before. She was nevertheless willing to try to accommodate my orneriness. Unable to discover a way to decline Phonak’s non-negotiable demand, she called the company. The company representative said that users did not have the option to decline and that I had to accept or I would lose the functionality of the website. She assured me that the company would only use aggregate data and not personal information. I went home, re-read 1984, and decided that I needed a new brand of hearing aids that would respect the sanctity of my brain. This intrusion into my most private part, uninvited, unwanted, and uncontrollable, was not something that I could accept.
Privacy issues come up in all aspects of life, and we have no guarantees about how the data collected will be used. The “internet of things” (which includes hearing aids) is at the forefront of this issue because IoT devices are often made by small, lightly regulated companies that collect intimate and unusual data, with little consequence for its misuse. In addition, the practice of issuing a non-negotiable demand for intimate personal information obtained from a human brain is a slippery slope. When the next level of AI is created, what would stop a company from using the technology for purposes that are not benign? At that point it would be too late; they would already have bullied me into surrendering my privacy to them.
And that’s not all.
They also assured me that all of the information would be anonymous and no personal information would be collected or used. And that Santa Claus would bring me a baseball glove if I wrote him a polite letter and mailed it to the North Pole. What’s more, they insisted, 50 million websites already use Google Analytics, so it must be acceptable. I suggested that they read 1984 and then come back to me. I was not and am not willing to be a party to the acquisition of personal data from humans without their knowledge or consent. I respect the privacy of other humans to the same degree that I insist on my privacy being respected.
I now have no hearing aids, and my marketing team is despairing of its ability to market my book effectively. I decided I would prefer to be deaf and never sell another copy of my book if the alternative was enabling the unwanted collection of data from my brain, and then becoming part of the problem myself by collecting personal information on people without their knowledge or consent.
In full disclosure, I acknowledge that my children will probably find this blog naïve and meaningless since nobody has any privacy left to lose. Don Quixote, they will call me. Nevertheless, I carry on, accompanied perhaps by Sancho Panza. Who will join me?