Apple gets around to AR, the NHL enters esports, the Internet of Things may bring new meaning to “workers unite,” so many medical records, and more …
We’ve previously written about doxing and how it can be used by both vigilante social activists and malicious cyber bullies. Recently, in a first-of-its-kind ruling, the U.S. District Court for the District of Columbia concluded that white supremacists using social media to target and harass American University’s first female African-American student body president were liable to her for over $725,000 in damages.
Efforts to regulate cross-device tracking have increased since we last addressed the topic in 2017, following the release of the FTC’s Staff Report. Significant developments include the implementation and enforcement of the EU’s General Data Protection Regulation (GDPR) and the fast-approaching implementation deadline for the California Consumer Privacy Act (CCPA). These regulations, while not targeting cross-device tracking specifically, seek to limit the ways in which consumer data is tracked and sold.
Much like humans, bots come in all shapes and sizes. In social media networks, these bots can like what you post and even increase your followers. Companies use bots for all types of things—from booking a ride to giving makeup tutorials. Some bots can even solve your legal problems. Besides saving time and money, bots have the potential to reduce errors and increase a business’s customer base. But what happens when bots spy on users and share personal information? Or when they make racial slurs and offensive comments?
The UK Data Protection Authority, the Information Commissioner’s Office (ICO), has published an update report on privacy issues around real-time bidding (RTB) and programmatic advertising. The report is a progress update on the ICO’s investigation into the AdTech industry, which it says is one of its regulatory priorities.
For any company that has tackled GDPR compliance, the new privacy rights introduced by the California Consumer Privacy Act of 2018 (CCPA) will seem pretty familiar. It might even be tempting to assume that by being GDPR compliant, one is already most of the way there in terms of preparing for the CCPA. In “Countdown to CCPA #2: GDPR Compliance Does Not Equal CCPA Compliance,” colleagues Catherine D. Meyer, Steven Farmer, Fusae Nara and Rafi Azim-Khan explain how, similarities aside, there are significant differences between the two privacy laws.
Protecting consumer data privacy in the age of artificial intelligence and increased digital commerce is a growing concern. Enacted in June 2018, the California Consumer Privacy Act (CCPA) introduced provisions to protect consumers and became the first U.S. law that can be viewed as a response to the GDPR. Going into effect on January 1, 2020, legislation of this scope has far-reaching tendrils that may breed unintended consequences.
No one knows your face as well as your iPhone does. All the unique variances that make your face yours and yours alone are data points your iPhone uses to unlock the device, with a face taking the place of a thumbprint. This same data the iPhone collects can be used by the underlying technology—facial recognition—in a vast array of applications, from border control to photo tagging to law enforcement. But is this data (the measurement of the space between the eyes, the texture of the skin, etc.) open data? Or do individuals have a right to protection of an image of their face?
Do you like getting your news online, sharing videos or tweeting memes? A little piece of legislation known as the European Union Directive on Copyright in the Digital Single Market may signal the end of some of the internet’s simple pleasures. On September 13, the European Parliament approved new legislation that would overhaul the region’s approach to copyright law. As with the EU’s privacy regulations, the legislation could have an impact far beyond Europe, redrawing the lines of liability among posters, publishers and platforms. Not surprisingly, technology companies and platforms such as Google, Amazon and Wikipedia strongly opposed the legislative changes.
The European Parliament adopted a resolution earlier this month calling for suspension of the EU-U.S. Privacy Shield agreement. The Privacy Shield is a framework that provides for the exchange of personal data between the EU and the United States for commercial purposes. Adopted in 2016 after the European Court of Justice invalidated the Safe Harbor arrangement, the shield is intended to safeguard the “fundamental privacy rights” of European citizens with respect to data transfers between signatory countries.