Be clear. Be open. Be upfront. That’s what influencers need to do to build a following. But those same standards could just as easily describe the legal guidelines applicable to influencers. Fall short, and influencers may violate the law.
CBD, CBG, CBA, CBN, THC—the race to find the holy grail of cannabinoid production is in full swing. Money is flowing in, and unicorn-hungry investors looking to capture market share are swirling around promising frontrunners with lucrative IP. One interesting segment of cannabis IP gaining traction focuses on cannabinoid synthesis from microorganisms such as yeast.
For any company that has tackled GDPR compliance, the new privacy rights introduced by the California Consumer Privacy Act of 2018 (CCPA) will seem pretty familiar. It might even be tempting to assume that by being GDPR compliant, one is already most of the way there in terms of preparing for the CCPA. In “Countdown to CCPA #2: GDPR Compliance Does Not Equal CCPA Compliance,” colleagues Catherine D. Meyer, Steven Farmer, Fusae Nara and Rafi Azim-Khan explain how, similarities aside, there are significant differences between the two privacy laws.
In November 2018, the U.S. Department of Justice rolled out the China Initiative. This new policy includes plans to “identify priority Chinese trade theft cases, ensure we have enough resources dedicated to them, and … bring them to an appropriate conclusion quickly and effectively.” The new Attorney General, who has a master’s degree in Chinese Studies, supports the Initiative and intends to continue to advance it.
With the transition of software from physical, boxed merchandise to web-based, downloadable content, the question of how and whether to tax this less tangible manifestation of goods and services has been taken up by courts and legislators alike. In “The Evolution of Software as a Service Taxes Post-Wayfair,” colleagues Marc A. Simonetti, Dmitrii Gabrielov and William L. Bennett examine the evolution of tax laws regarding Software as a Service (SaaS) in the wake of the U.S. Supreme Court’s South Dakota v. Wayfair Inc. decision.
“For a bunch of hairless apes, we’ve actually managed to invent some pretty incredible things.”
—Ernest Cline, Ready Player One
It’s an incredible time to be alive. The Digital Age has helped us reach levels of efficiency and connectivity that were unimaginable just a few decades ago. In his award-winning novel, Ready Player One, Ernest Cline paints a picture of a not-so-distant future where people spend the majority of their time experiencing life in the “Oasis,” a realistic virtual world where users interact with one another in amazing virtual environments that mimic reality in many ways, but where the rules of physics and nature are malleable, allowing the game publisher to create wildly entertaining games where virtually anything can happen. While we may not have the Oasis yet, today’s video games are rapidly evolving into similar immersive social platforms where users can play, compete and express themselves in settings that seem inevitably headed toward something that looks increasingly like Cline’s Oasis. One way that video game makers are able to make game backdrops more realistic, and thus enrich the overall user experience, is to incorporate real-world ideas and content to more closely emulate reality in the game. In response, an increasing number of intellectual property owners who object when their “property” gets incorporated into video games are bringing lawsuits that will help define the boundaries of intellectual property law in this new arena.
(Note, this post has spoilers for Avengers: Endgame.)
Perhaps one of the most mesmerizing scenes in Avengers: Endgame is where all the MCU superheroes (including those on Titan) come through Dr. Strange’s portals to enter the battle against Thanos. In Avengers: Infinity War, Dr. Strange didn’t use these portals to send Iron Man and the others on Titan back to Earth before everyone got dusted, but fans certainly might have enjoyed that alternative storyline. Understandably, one enormous limiting factor on alternative storylines is cost—especially when $600-800 million was spent to create the two movies as they are. Future advances in artificial intelligence technologies may change that. Indeed, a number of large tech companies are already interested in creating interactive content to personalize storytelling (e.g., Black Mirror’s “Bandersnatch” episode), and recent developments in machine learning algorithms (including those fueling the creation of photorealistic images) may bring us closer to that reality sooner rather than later. If so, under what circumstances will companies own, and be able to collect on, the copyright?
When it comes to photos destined for the web, I’d rather be behind the camera than in front of it. However, on a recent trip to Tokyo I was reminded that photos of me, and specifically my face, are often being captured and processed by systems that are increasingly being embedded in our modern life.
We’ve previously written about “tweet-less, picture-less,” computer-operated accounts, or bots, that make one appear more popular—a.k.a. influential on social media—than one actually is. Recently, legislators and law enforcement agencies have moved to crack down on bots, their evil cousins known as sock puppets, and other deceptive social engagement practices. Specifically, California passed a law, effective July 2019, banning the undisclosed use of bots to communicate or interact with a person with the intent to knowingly deceive that person in order to influence a commercial transaction or a vote in an election. Meanwhile, after the media exposed its deceptive activities, New York and Florida announced settlements with Devumi LLC, a company that grossed over $15 million in revenue by creating, packaging and selling fake social media likes, followers and posts. The Devumi settlements mark the first of their kind, indicating that such activity constitutes illegal deception of the public and, to the extent Devumi used stolen identities for its online activities, illegal impersonation.
Protecting consumer data privacy in the age of artificial intelligence and increased digital commerce is a growing concern. In June 2018, California enacted the California Consumer Privacy Act (CCPA), introducing provisions to protect consumers and becoming the first U.S. law that can be viewed as a response to GDPR. Going into effect on January 1, 2020, legislation of this scope has far-reaching implications and may breed unintended consequences.