“Among the most sensitive categories of data collected by connected devices are a person’s precise location and information about their health. Smartphones, connected cars, wearable fitness trackers, “smart home” products, and even the browser you’re reading this on are capable of directly observing or deriving sensitive information about users. Standing alone, these data points may pose an incalculable risk to personal privacy. Now consider the unprecedented intrusion when these connected devices and technology companies collect that data, combine it, and sell or monetize it. This isn’t the stuff of dystopian fiction. It’s a question consumers are asking right now,” Kristin Cohen, Acting Associate Director, US Federal Trade Commission (FTC) Division of Privacy & Identity Protection, wrote in a blog post dated July 11.
The FTC is “committed to using the full scope of its legal authorities to protect consumers’ privacy. We will vigorously enforce the law if we uncover illegal conduct that exploits Americans’ location, health, or other sensitive data,” Cohen stated.
Why does this matter? Ever since the US Supreme Court overturned federal abortion protections, privacy advocates have raised concerns over how data collected by period tracking apps and other online platforms can be misused and weaponised against those seeking abortions in states where the procedure is now outlawed. This has prompted the US federal government to increase the protection of sensitive health data. But the misuse of sensitive health and location data is not a US-specific issue. Many of the concerns highlighted below apply to other jurisdictions as well, including India. As we reported earlier, “sensitive health data of 12.5 million pregnant women were left unprotected by the health department of a north Indian State in 2019, while 2018 saw the Andhra Pradesh government wantonly leak health data pertaining to individuals’ reproductive health, ambulance requests, pharmacy purchases, and abortions. In 2019, the fertility tracking app Maya, with 7 million downloads at the time, shared sensitive reproductive health data with Facebook.” The exercise undertaken by the US FTC to strengthen the protection of location and health data sheds light on what other countries can and must do to prevent the misuse of such sensitive data.
How can sensitive data be misused?
“The conversation about technology tends to focus on benefits. But there is a behind-the-scenes irony that needs to be examined in the open: the extent to which highly personal information that people choose not to disclose even to family, friends, or colleagues is actually shared with complete strangers. These strangers participate in the often shadowy ad tech and data broker ecosystem where companies have a profit motive to share data at an unprecedented scale and granularity.” — Kristin Cohen, Acting Associate Director, US FTC Division of Privacy & Identity Protection
Location and health data end up in ‘murky marketplaces’: Cohen explains that connected devices constantly ping cell towers, interact with WiFi networks, and capture GPS signals, creating a comprehensive record of the users’ whereabouts. “This location data can reveal a lot about people, including where we work, sleep, socialize, worship, and seek medical treatment,” Cohen said. Furthermore, connected devices are used by millions of people to generate their own sensitive data, such as apps to test blood sugar, record sleep patterns, monitor blood pressure, or track fitness. “The potent combination of location data and user-generated health data creates a new frontier of potential harms to consumers,” Cohen remarked.
“The marketplace for this information is opaque and once a company has collected it, consumers often have no idea who has it or what’s being done with it. After it’s collected from a consumer, data enters a vast and intricate sales floor frequented by numerous buyers, sellers, and sharers,” Cohen said. “There are the mobile operating systems that provide the mechanisms for collecting the data. Then there are app publishers and software development kit (SDK) developers that embed tools in mobile apps to collect location information and provide the data to third parties. The next stop in the murky marketplace may be data aggregators and brokers – companies that collect information from multiple sources and then sell access to it (or analyses derived from it) to marketers, researchers, and even government agencies. These companies often build profiles about consumers and draw inferences about them based on the places they have visited,” Cohen explained.
“The amount of information they collect is staggering,” Cohen remarked while giving the example of a 2014 study, in which the FTC reported that data brokers use data to make sensitive inferences, such as categorizing a consumer as ‘Expectant Parent.’ “According to the study, one data broker bragged to shareholders in a 2013 annual report that it had 3,000 points of data for nearly every consumer in the United States. In many instances, data aggregators and brokers have no interaction with consumers or the apps they’re using. So people are left in the dark about how companies are profiting from their personal information,” Cohen stated.
Reproductive health data can be misused to target women seeking abortions: A sensitive subset at the intersection of location and health is reproductive health data. Examples include products that track women’s periods, monitor their fertility, oversee their contraceptive use, or even target women considering abortion. “The concerns many have expressed about the risk of misuse are more than just theoretical. In 2017, for example, the Massachusetts Attorney General reached a settlement with marketing company Copley Advertising, LLC, and its principal for using location technology to identify when people crossed a secret digital “fence” near a clinic offering abortion services. Based on that data, the company sent targeted ads to their phones with links to websites with information about alternatives to abortion. The Massachusetts AG asserted that the practice violated state consumer protection law,” Cohen explained. “And just recently, the FTC reached a settlement with Flo Health, alleging the company shared with third parties – including Google and Facebook – sensitive health information about women collected from its period and fertility-tracking app, despite promising to keep this information private,” Cohen added.
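The “digital fence” in the Copley case is, in technical terms, ordinarily just a geofence: a test of whether a device’s reported coordinates fall within a radius of a point of interest. A minimal sketch of how such a check works is below; the clinic coordinates, ping locations, and 100-metre radius are all hypothetical, invented for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(ping, fence_centre, radius_m):
    """True if a device's (lat, lon) ping falls within the circular fence."""
    return haversine_m(ping[0], ping[1], fence_centre[0], fence_centre[1]) <= radius_m

# Hypothetical fence around a clinic, with a 100 m radius
clinic = (42.3601, -71.0589)
print(inside_geofence((42.3605, -71.0590), clinic, 100))  # ping a few dozen metres away
print(inside_geofence((42.3700, -71.0589), clinic, 100))  # ping roughly a kilometre away
```

Once a ping tests true, the ad-tech pipeline can flag that device’s advertising identifier for targeting, which is why regulators treat the crossing of such a fence near a medical facility as sensitive in itself.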
Health and location data can be misused by criminals, stalkers, and other bad actors: In addition to the above harms, criminals can use location or health data to facilitate phishing scams or commit identity theft, and stalkers can use the same data to inflict physical and emotional injury. “The exposure of health information and medical conditions, especially data related to sexual activity or reproductive health, may subject people to discrimination, stigma, mental anguish, or other serious harms. Those are just a few of the potential injuries – harms that are exacerbated by the exploitation of information gleaned through commercial surveillance,” Cohen wrote.
What companies should keep in mind when collecting location and health data
Cohen lays out the following points that companies should consider when thinking about the collection of confidential consumer information, including location and health data:
- Sensitive data is protected by numerous federal and state laws: Cohen pointed out that there are numerous state and federal laws that govern the collection, use, and sharing of sensitive consumer data. “The FTC has brought hundreds of cases to protect the security and privacy of consumers’ personal information, some of which have included substantial civil penalties,” Cohen wrote.
- Claims that data is ‘anonymous’ or ‘has been anonymized’ are often deceptive: “Companies may try to placate consumers’ privacy concerns by claiming they anonymize or aggregate data. Firms making claims about anonymization should be on guard that these claims can be a deceptive trade practice and violate the FTC Act when untrue. Significant research has shown that ‘anonymized’ data can often be re-identified, especially in the context of location data. One set of researchers demonstrated that, in some instances, it was possible to uniquely identify 95% of a dataset of 1.5 million individuals using four location points with timestamps,” Cohen explained. “Companies that make false claims about anonymization can expect to hear from the FTC,” Cohen added.
- The FTC cracks down on companies that misuse consumers’ data: Cohen made it clear that “FTC does not tolerate companies that over-collect, indefinitely retain, or misuse consumer data,” pointing out recent cases in which companies like OpenX, Weight Watchers, and CafePress were penalised for improper collection and retention of data.
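The re-identification research Cohen cites above found that as few as four location points with timestamps could uniquely identify 95% of 1.5 million people. The intuition can be shown on toy data: even with names stripped, a handful of (place, hour) observations often matches exactly one trace. The traces and tower identifiers below are entirely invented for illustration.

```python
# Toy "anonymised" dataset: each pseudonym maps to a set of
# (cell_tower_id, hour_of_day) observations.
traces = {
    "user_a": {("t1", 8), ("t2", 9), ("t7", 13), ("t3", 19)},
    "user_b": {("t1", 8), ("t2", 9), ("t5", 13), ("t3", 19)},
    "user_c": {("t4", 8), ("t2", 10), ("t7", 13), ("t9", 19)},
}

def reidentify(known_points, traces):
    """Return every pseudonym whose trace contains all the known points."""
    return [uid for uid, pts in traces.items() if known_points <= pts]

# An attacker who observed the target at only three tower/hour points
# narrows the "anonymous" dataset down to a single pseudonym.
observed = {("t1", 8), ("t7", 13), ("t3", 19)}
print(reidentify(observed, traces))  # prints ['user_a']
```

Because daily movement patterns are highly distinctive, adding even one more observed point tends to collapse the candidate set to one, which is why regulators view “anonymised” location data claims with scepticism.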
What else is the US doing to protect sensitive health data?
In addition to the FTC’s assurance that it will go after those who misuse health data, here are the other major developments in the US to protect sensitive health data:
- Executive order on abortion care and patient privacy: A few days before the FTC blog post, US President Joe Biden signed an executive order directing federal agencies to protect abortion access and the online privacy of patients seeking reproductive healthcare.
- Two bills were introduced by US lawmakers: On June 2, Congresswoman Sara Jacobs introduced the My Body, My Data Act which aims to protect a person’s reproductive health data by limiting the personal reproductive health data collected, used, or retained by a service, and allows users to access, view, and delete their personal reproductive health data. A month later, Senator Elizabeth Warren introduced the Health and Location Data Protection Act, which outright bans the sale and transfer of sensitive health and location data by data brokers.
This post is released under a CC-BY-SA 4.0 license. Please feel free to republish on your site, with attribution and a link. Adaptation and rewriting, though allowed, should be true to the original.
Update (15 July, 10:20 am): Added attribution in “Why does it matter” section and updated last paragraph based on editorial input.