In today’s News of Note: growing anxieties over AI-generated art, effective cybersecurity for the high-tech era, and the impact of facial recognition and gunshot detection technology on human rights.
Identity can be hard to define. In the real world, we don different (often overlapping) masks depending on the situation—family, work, public service or private play. Online, the distance between the “real you” and these masks is often more pronounced. We adopt pseudonyms, handles, avatars and personas—each associated with a different reputation, a different level of trust from the community, and different data (profile pictures, posts, etc.). While some may be closer to what you might consider your “core” identity than others, they are all part of your overall digital identity. As the concept of the metaverse evolves, and with the prospect of avatars that span multiple virtual environments, identity becomes more complicated and protecting it becomes all the more important.
For years, website owners have leveraged the federal Computer Fraud and Abuse Act (CFAA) as a tool to combat unauthorized scraping of data and other content from their websites. Due to a circuit split over the interpretation of the CFAA’s “exceeds authorized access” provision, there has long been a legal gray area around the widespread practice of web scraping and whether scraping data from publicly accessible websites can give rise to liability under the CFAA. A set of closely watched, high-profile court cases, however, may soon offer some long-awaited clarification on the reach of the CFAA to web scraping.
This week the European Data Protection Board (EDPB), a body that represents European data protection authorities, set up a new cookie banner taskforce. The new taskforce will coordinate the response to over 400 complaints concerning cookie banners filed by a nonprofit organization founded by Max Schrems, None of Your Business (NOYB).
The commercial real estate industry is increasingly adopting proptech to unearth savings and business insights. But companies need to be careful. Security and privacy are two foundational components of a successful data analytics initiative. Ensuring the information is stored securely while adhering to the complex framework of privacy laws will be instrumental to a real estate organization’s success with data. Why? If the information is not kept safe or is used contrary to law or the commitments a business has made to consumers, companies will face fines, regulatory investigations and customer ire.
As California reopens from the COVID-19 pandemic and workers begin returning to work in person, many employers have begun requesting that their employees provide, sometimes on an ongoing basis, certain health information before returning to the workplace. This includes information such as temperature readings, health survey responses, COVID-19 test results, and proof of vaccination status. Given the likelihood that collecting this information will trigger certain requirements under the California Consumer Privacy Act (CCPA), employers should take certain measures to ensure they remain in compliance with the CCPA as their workplaces reopen.
COVID-19 accelerated digital transformation across every industry. From the growth of e-commerce and food delivery services to virtual workspaces and online learning, a seismic shift toward digitalizing our day-to-day activities has become the new normal.
Fingerprints. Retinas. Facial symmetry itself. We frequently address the problems raised as new technology brings new privacy concerns for customers and businesses alike. In “Check Your Policies for Privacy Claim Coverage: New York City’s New Biometrics Law Is Now in Effect,” Sandra Kaczmarczyk examines New York City’s recent statute that imposes two limitations on the use of “biometric identifier information” and why businesses operating in New York City should consider both their potential liability under these new requirements and whether their current insurance program protects them against associated risks.
As the use of biometric information such as fingerprints, iris scans, facial scans, and voice prints becomes more and more common, so, too, have the number of lawsuits brought for the unauthorized use of private information and for the violation of privacy laws—including class action lawsuits. In “The Duty to Defend a Privacy Claim Arises from Even Limited Publication of Biometric Identifiers,” our colleague Sandra Kaczmarczyk examines an important recent Illinois Supreme Court decision that is “likely to be at the forefront of future coverage litigation as other state courts grapple with the coverage afforded by business insurance policies for privacy claims.”
On April 1, 2021, the U.S. Supreme Court resolved a long-standing issue plaguing providers of text message services and the companies engaged in text message marketing. Lower courts had split over what constitutes an “automatic telephone dialing system,” or autodialer: some limited the definition to equipment whose capacity to generate, store, and dial telephone numbers was confined to random or sequential numbers, while others extended it to any device with the capacity to store and automatically dial stored numbers using, for example, a speed-dial function.