In “The Case of Prince, a Dancing Baby and the DMCA Takedown Notice,” we discussed the potential impact of the Ninth Circuit decision in Lenz v. Universal Music Corp., 801 F.3d 1126 (9th Cir. 2015), a.k.a. the “dancing baby case,” in which the appeals court held that under the Digital Millennium Copyright Act (DMCA), copyright holders have a “duty to consider—in good faith and prior to sending a takedown notification—whether allegedly infringing material constitutes fair use.” In so holding, however, the court was “mindful of the pressing crush of voluminous infringing content that copyright holders face in a digital age.” To deal with this reality, the court affirmed that computers may be leveraged to support the fair use analysis.
We have written previously about the application of traditional discovery rules to “newer” platforms, and how social media content can be discoverable and used in litigation. What about using information from social media in jury selection? U.S. District Court Judge William Alsup says no.
At its heart, social media’s purpose is sharing content; however, fair use can only take one so far. A recent case serves as yet another reminder to exercise caution when reposting content, and that, in a litigious society, it is advisable to take the conservative approach and secure permission before reposting another’s content, even when there has been some modification of that content.
In our recent post, Living in a Nonmaterial World: Determining IP Rights for Digital Data, we discussed the potential impact of the Federal Circuit decision in ClearCorrect v. ITC, No. 2014-1527, in which the appeals court held that the “articles that infringe” are limited to “material things” and thus do not include “electronic transmission of digital data.” The decision limited the regulatory jurisdiction of the U.S. International Trade Commission (ITC) to articles that are considered physical products. The implications of the decision are far-reaching, since the Internet of Things touches on most industry sectors. As previously noted, the decision has been praised by open-Internet advocacy groups, which characterize it as a “win for the Internet,” while other observers (including the dissent) see the decision as a significant setback in the fight against overseas piracy of patented and copyrighted works.
Hours. Days. Weeks. Months. When it comes to acting on copyright infringement takedown notices, just how fast is fast enough for social media platforms? Some recent (and not-so-recent) cases reveal how difficult the question has proven for the courts.
Last month, Google announced a groundbreaking policy that may help shift the balance of power between copyright claimants and those who upload YouTube videos that may be covered by fair use. According to Google’s Public Policy Blog, users upload more than 400 hours of video every minute. Those uploads sometimes make use of existing video or music clips in new and transformative ways. When an upload transforms the original work in this way (as in a parody or critique), it adds social value beyond that contained in the original work. In the United States, a transformative use is generally considered a fair use and exempted from copyright infringement liability.
As user-generated content explodes across the Internet, intellectual property disputes over posting or uploading such content without the owner’s consent continue to escalate. As we touched on in a recent post, social media platforms, hosting websites and other online service providers (OSPs) may be ensnared in these disputes based on the infringing actions of their users. In such a situation, the Digital Millennium Copyright Act (DMCA) offers OSPs a safe harbor under the Online Copyright Infringement Liability Limitation Act (OCILLA). This provision, found at 17 U.S.C. § 512(c), protects service providers from liability for “infringement of copyright by reason of the storage at the direction of a user of material that resides” on the provider’s system or network, if certain requirements are met.
Stories of interest this week include discussions of “melt your brain” VR at YouTube; the resurrecting of deceased loved ones via social media history; transforming that key fob or piece of jewelry into a payment device; and more…
Brand companies have come to view user-generated content as one of the most effective and authentic ways to advertise their products or services, a practice known as “user-generated content marketing.” With the ubiquitous selfie, for example, brand companies have discovered a rich supply of user-generated content. Consider a consumer who takes a selfie wearing a favorite pair of jeans, posts the photo on Instagram, and then tags the photo with #brandname. The jean company sees and likes the photo, re-posting it on the company website. Legal issues? If the consumer was hoping to get attention from the brand for the photo and opinions shared online, not at all—this is how many digital influencers get their start. But if the user was not seeking such attention? Then problems can arise.
With the live-streaming apps Periscope and Meerkat becoming increasingly popular, the introduction of a “live” element into the social media game is creating unique business and legal concerns. While most videos streamed on Periscope or Meerkat merely let users share real-time snippets of everyday life with their followers (a walk through the park, a birthday celebration), legal complications can arise when users give viewers a glimpse into highly anticipated and publicized live events.