As the use of biometric information such as fingerprints, iris scans, facial scans, and voice prints becomes increasingly common, so, too, has the number of lawsuits brought for the unauthorized use of private information and for the violation of privacy laws—including class action lawsuits. In “The Duty to Defend a Privacy Claim Arises from Even Limited Publication of Biometric Identifiers,” our colleague Sandra Kaczmarczyk examines an important recent Illinois Supreme Court decision that is “likely to be at the forefront of future coverage litigation as other state courts grapple with the coverage afforded by business insurance policies for privacy claims.”
On April 1, 2021, the U.S. Supreme Court resolved a long-standing issue plaguing providers of text message services and the companies engaging in text message marketing. Lower courts had been split on what constitutes an “automatic telephone dialing system,” or auto-dialer: some limited the definition to equipment with the capacity to generate, store, and dial random or sequential telephone numbers, while others extended it to any device with the capacity to store and automatically dial stored numbers using, for example, a speed-dial function.
On March 15, amendments to the California Consumer Privacy Act (CCPA) banned companies from using “dark patterns” that confuse or delay consumers trying to opt out of the sale of their personal information.
Your company wants to use a picture taken outside of your office at an event you are hosting or sponsoring. Perhaps the image shows someone wearing your clothing or other product or using something showing your brand. Possibly you participated in a parade and want some images showing your company’s float or views from the float along the parade route. Maybe the image shows the outside of your building or the immediately surrounding area. You may have hired a photographer to take the pictures, they may have been taken by an employee, or someone may have found them on a third-party website or in social media posts. The pictures may depict people who were on the street or present at the event, and they may include images of one or more buildings or local landmarks.
Building upon the California Consumer Privacy Act (CCPA), on November 3, 2020, Californians voted to approve Proposition 24: the California Privacy Rights Act (CPRA). The CPRA does not replace the CCPA but rather adds to and modifies the language of the CCPA to strengthen consumer privacy rights and perhaps, in the future, form a basis for General Data Protection Regulation (GDPR) data transfer adequacy. While the CPRA is a landmark legislative accomplishment for privacy rights, it creates new problems for blockchain-based technologies, particularly those provisions regarding the right of correction and principles of data minimization and storage limitation.
“One who invites another to his home or office takes a risk that the visitor may not be what he seems, and that the visitor may repeat all he hears and observes when he leaves. But he does not and should not be required to take the risk that what is heard and seen will be transmitted by photograph or recording, or in our modern world, in full living color and hi-fi to the public at large or to any segment of it that the visitor may select.” When Ninth Circuit Judge Shirley M. Hufstedler wrote these words in 1971 about surreptitious recordings made by newsmen, she probably had no idea that a global pandemic would give new meaning to her words.
The 2020 election will be unprecedented in many respects. More people will be voting by mail, and there will likely be more democratic participation online than ever before. Internet platforms and communication services will become more influential forums as people are restricted from in-person conventions and debates. But even before the pandemic pushed these operations online, some lawmakers were already seeking to monitor misinformation in political discourse on the internet, specifically in the context of deepfakes. Deepfakes are manipulated photo, video or audio clips generated by computers, often with the assistance of artificial intelligence algorithms. They can make someone appear to say or do something that never actually happened or create realistic images of people who do not exist.
Have you ever been startled by the buzzing sound of a passing swarm of angry mechanical bees as you work from home? Have you ever looked out your window and noticed an agile device zipping through your property? If so, drones might be adversely affecting your lifestyle. These little devices once lived only in the imaginations of science fiction writers, but nowadays, they are popular gadgets that many parents routinely buy for their kids during Christmas. The popularity of drones has exploded due to cheaper production costs, advancements in camera and wireless technologies, and the appeal of high-quality bird’s-eye view footage popularized by aspiring vloggers looking to create impressive visual content. Recently, the COVID-19 pandemic has further fueled drone popularity due to their potential in the context of robotic delivery services. However, despite its advantages, drone technology poses a significant threat to property and privacy rights; luckily, the law offers several grounds for obtaining legal remedies if such rights are infringed.
As the world continues to deal with the unprecedented challenges caused by the COVID-19 pandemic, Artificial Intelligence (AI) systems have emerged as a potentially formidable tool in detecting and predicting outbreaks. In fact, by some measures the technology has proven to be a step ahead of humans in tracking the spread of COVID-19 infections. In December 2019, it was a website leveraging AI technology that provided one of the key early warnings of an unknown form of pneumonia spreading in Wuhan, China. Soon after, information sharing among medical professionals followed as experts tried to understand the extent of the unfolding public health crisis. While humans eventually acted on these warnings, the early detection enabled through the use of AI-supported data aggregation demonstrates both the promise of and potential concerns associated with these systems.
I am not at all embarrassed to admit that I love working in my pajamas. A lot of us are working from home now to help flatten the curve, and while social distancing has created a lot of challenges for most people, one of the perks is the ability to socially distance yourself from your hairbrush and roll into the office in your sweatpants. I don’t think I’m alone on this. According to one survey, 60% of office professionals report a better work-life balance when working from home, and 74% of workers would like to telecommute more often after social distancing restrictions are lifted. Not coincidentally, the wide-scale adoption of telecommuting has resulted in a corresponding uptick in the use of employee monitoring tools.