It’s no secret that every move you make on the internet can be tracked. Even when you are not actively searching, simply scrolling through a social media feed or using your phone to navigate to a new local restaurant, your digital behavior can be collected, analyzed, stored, marketed, and sold. But is that simply the price we now pay for convenience and connectedness? Maybe, but companies should at least tell you what they are doing with your valuable data.
We previously wrote about amendments to the California Consumer Privacy Act (CCPA) that require online businesses that sell consumers’ personal information to notify them of their right to opt out of having that information sold. The amendments also prohibit companies from using “dark patterns,” or methods designed to impair a consumer’s choice to opt out. In Illinois, another state known for its pioneering privacy law and the consumer-driven litigation it has generated, a proposed class of students brought a claim against an online testing company for collecting their biometric data without their informed consent. Even when consent is given, companies must provide detailed disclosures that comply with state regulations, as we described in a post about employers collecting and storing new categories of health information, such as vaccination status and COVID-19 test results. In the Metaverse, an inconspicuous username or avatar may seem like a strategic path toward anonymity, but that is simply not enough to protect your digital identity.
Enter the Federal Trade Commission (FTC), the government entity tasked with protecting consumers from deceptive and unfair business practices, which now includes protection against the illegal use, sharing, and selling of consumer data, such as location and health information. “[M]illions of people also actively generate their own sensitive data, including by using apps to test their blood sugar, record their sleep patterns, monitor their blood pressure, track their fitness, or sharing face and other biometric information to use app or device features,” the FTC explains. “The potent combination of location data and user-generated health data creates a new frontier of potential harms to consumers.”
To help combat those harms, the FTC took action against Flo Health, the developer of a period- and fertility-tracking app used by more than 100 million consumers, alleging the company shared the health information of its users with third-party analytics providers without users’ consent. More recently, the FTC penalized health and wellness app Kurbo for indefinitely storing consumer data and collecting the personal information of children without parental permission.
And the FTC is not stopping there. In a recent blog post, the FTC suggested that companies look at past enforcement actions as a “roadmap” to inform compliance with privacy and consumer protection laws. For companies that fail to comply with the law, the FTC made clear that it “will vigorously enforce the law if we uncover illegal conduct that exploits Americans’ location, health, or other sensitive data.”
The FTC provided the following guidance for companies that collect sensitive consumer information:
- Understand both federal and state laws that govern the collection, use, and sharing of consumer information. Companies that handle consumer data should consider Section 5 of the FTC Act, the FTC’s Safeguards Rule, the Health Breach Notification Rule, and the Children’s Online Privacy Protection Act (COPPA) Rule, among others, in addition to the privacy and consumer-protection laws of individual states. Retaining privacy counsel who is available to review policies and procedures is always a good idea.
- Question whether the data collected from consumers is truly anonymized. The FTC warns that data can often be re-identified, especially in the context of location data, and in some instances a few pieces of seemingly anonymized data, when combined, can be enough to identify an individual. Companies must be careful not to deceive consumers about the anonymization, or lack thereof, of their data.
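The re-identification risk the FTC describes can be made concrete with a short illustration. The sketch below uses entirely hypothetical data: a set of “anonymized” records that still carry quasi-identifiers (ZIP code, birth date, sex) is linked against a public record containing the same fields plus a name. When a combination of quasi-identifiers is unique, the match re-identifies the person.

```python
# Hypothetical "anonymized" records: names removed, but quasi-identifiers remain.
anonymized_health_records = [
    {"zip": "02138", "birth_date": "1945-07-29", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "60601", "birth_date": "1980-01-15", "sex": "M", "diagnosis": "diabetes"},
]

# Hypothetical public record (e.g., a voter roll) with the same fields plus a name.
public_voter_roll = [
    {"name": "Jane Doe", "zip": "02138", "birth_date": "1945-07-29", "sex": "F"},
    {"name": "John Roe", "zip": "60622", "birth_date": "1975-03-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def reidentify(health_records, public_records):
    """Link records on quasi-identifiers; a unique match defeats 'anonymization'."""
    matches = []
    for record in health_records:
        key = tuple(record[f] for f in QUASI_IDENTIFIERS)
        candidates = [p for p in public_records
                      if tuple(p[f] for f in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # exactly one person shares this combination
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymized_health_records, public_voter_roll))
```

Here the single shared combination of ZIP code, birth date, and sex links “Jane Doe” to the hypertension record, even though her name never appeared in the health data set. This is why the FTC cautions companies against representing data as anonymized when it can be combined with other sources to identify individuals.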
Compliance with privacy laws is an ever-present obligation for companies that collect, use, and share consumer information. The FTC provides general guidance on the various laws that apply to businesses and how they can comply, but as regulations change and business priorities shift, companies must be proactive in frequently evaluating and re-evaluating their policies, their practices, and the current requirements under the law.
Finally, companies with a global presence must also be mindful of privacy laws that apply to their collection and use of personal data relating to consumers in other jurisdictions. The EU GDPR has extra-territorial reach and will apply to companies providing products and services to EU-based consumers, subject to limited exceptions. Countries around the world are introducing their own privacy laws, in some cases based on the EU GDPR. Keeping track of and complying with privacy laws and obligations in key jurisdictions where customers are based must also be a priority.