In 2021, the Department of Homeland Security began the process of adopting regulations for mobile driver’s licenses. The Transportation Security Administration (TSA) has since begun accepting mobile driver’s licenses as identification at airports, and several states have followed suit, offering mobile driver’s licenses through state-sponsored apps or via Apple and Google Wallet. Now the TSA has proposed new regulations that would waive REAL ID requirements for state-issued mobile driver’s licenses, but privacy advocates, including the American Civil Liberties Union (ACLU) and the Electronic Frontier Foundation (EFF), warn the move may put consumers’ personal information at risk.
New and emerging technologies have always carried a host of risks alongside their often dazzling potential. Just as dependably, those risks have been ignored, glossed over or simply missed as public enthusiasm waxes and companies race to bring a product to market first and most effectively. Automobiles promised to move people (and products) from one place to another at life-changing speeds, but they also endangered life and limb while imposing new burdens on existing infrastructure. Even as technological leaps have shifted from appliances and aircraft to computers, connectivity and large language models (LLMs), new and untested technologies continue to outpace the ability of governments and the public to moderate them. One can debate what constitutes an acceptable gap between the practical and the ideal when it comes to regulating, mandating and evaluating the pros and cons of new technology, but societies tend to generate their own methods of informing the public and reining in the more harmful aspects of the latest thing.
In today’s News of Note, anxieties continue to grow over AI-generated art, effective cybersecurity for the high-tech era, and the impact of facial recognition and gunshot detection technology on human rights.
Identity can be hard to define. In the real world, we don different (often overlapping) masks depending on the situation—family, work, public service or private play. Online, the distance between the “real you” and these masks is often more pronounced. We adopt pseudonyms, handles, avatars and personas—each associated with a different reputation, a different level of trust from the community, and different data (profile pictures, posts, etc.). While some may be closer to what you might consider your “core” identity than others, they are all part of your overall digital identity. As the concept of the metaverse evolves, and with the prospect of avatars that span multiple virtual environments, identity becomes more complicated and protecting it becomes all the more important.
For years, website owners have leveraged the federal Computer Fraud and Abuse Act (CFAA) as a tool to combat unauthorized scraping of data and other content from their websites. Due to a circuit court split on the interpretation of the CFAA’s “exceeds authorized access” provision, there has long been a legal gray area around the widespread practice of web scraping and whether scraping data from publicly accessible websites can give rise to liability under the CFAA. A set of closely watched, high-profile court cases, however, may soon offer long-awaited clarification on the reach of the CFAA as applied to web scraping.
This week the European Data Protection Board (EDPB), a body that represents European data protection authorities, set up a new cookie banner taskforce. The taskforce will coordinate the response to more than 400 complaints concerning cookie banners filed by None of Your Business (NOYB), a nonprofit organization founded by Max Schrems.
The commercial real estate industry is increasingly adopting proptech to unearth savings and business insights. But companies need to be careful. Security and privacy are two foundational components of a successful data analytics initiative. Ensuring the information is stored securely while adhering to the complex framework of privacy laws will be instrumental to a real estate organization’s success with data. Why? If the information is not kept safe or is used contrary to law or the commitments a business has made to consumers, companies will face fines, regulatory investigations and customer ire.
As California reopens from the COVID-19 pandemic and employees return to in-person work, many employers have begun requesting that employees provide certain health information, sometimes on an ongoing basis, before returning to the workplace. This includes temperature checks, health surveys, COVID-19 test results and proof of vaccination status. Because collecting this information will likely trigger certain requirements under the California Consumer Privacy Act (CCPA), employers should take measures to ensure they remain in compliance with the CCPA as their workplaces reopen.
COVID-19 accelerated digital transformations across every industry. From the growth of e-commerce and food delivery services to virtual workspaces and online learning, a seismic shift towards digitalizing our day-to-day activities has become the new normal.
Fingerprints. Retinas. Facial symmetry itself. We frequently address the problems raised as new technology brings new privacy concerns for customers and businesses alike. In “Check Your Policies for Privacy Claim Coverage: New York City’s New Biometrics Law Is Now in Effect,” Sandra Kaczmarczyk examines New York City’s recent statute that imposes two limitations on the use of “biometric identifier information” and why businesses operating in New York City should consider both their potential liability under these new requirements and whether their current insurance program protects them against associated risks.