In a move that underscores the escalating tension between the music industry and artificial intelligence (AI), several of the world’s largest music publishers have filed a joint lawsuit against AI startup Anthropic over song lyrics. The suit alleges that Anthropic’s chatbot, Claude, scrapes lyrics from the publishers’ catalogs without permission and thereby infringes on copyrighted material. It is yet another example of a generative AI company facing mounting pressure over its use of intellectual property to develop the technology.
Companies use a variety of causes of action to protect their websites from competitors or others seeking to “scrape” data from their sites using automated tools. Over the years, legal doctrines such as copyright infringement, misappropriation, unjust enrichment, breach of contract, and trespass to chattels have all been asserted, though many have limited applicability or are otherwise imperfect options for site owners.

One of the most commonly used tools against scraping has been a federal statute: the Computer Fraud and Abuse Act (CFAA). The CFAA is a cybersecurity law passed in 1986 as an amendment to the Comprehensive Crime Control Act of 1984. Originally drafted to address more traditional computer “hacking,” the CFAA prohibits intentionally accessing a computer without authorization, or in excess of authorization. Because it imposes both criminal and civil liability, the CFAA has been an effective deterrent to scraping, with website operators arguing that by simply stating on the site that automated scraping is prohibited, any such activity is unauthorized and gives rise to CFAA liability. An ongoing case between data analytics company hiQ Labs Inc. and LinkedIn tests the extent to which companies may invoke the CFAA against the scraping of publicly available data.
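As a technical aside, the conventional way a site operator “states on the site” that automated scraping is prohibited is a robots.txt file under the Robots Exclusion Protocol. The file is advisory rather than a technical barrier, and whether ignoring it makes access “unauthorized” under the CFAA is precisely what cases like hiQ v. LinkedIn contest. The sketch below, using Python’s standard library, shows how a well-behaved crawler would check a hypothetical robots.txt (the bot name, domain, and paths are illustrative, not taken from any party in these cases):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a site operator might publish to signal
# that automated access to certain paths is disallowed.
robots_txt = """\
User-agent: *
Disallow: /profiles/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler checks permission before fetching each URL.
print(parser.can_fetch("ExampleBot", "https://example.com/profiles/jane"))  # False
print(parser.can_fetch("ExampleBot", "https://example.com/about"))          # True
```

Nothing in the protocol enforces these answers; a scraper can simply ignore the file, which is why operators have reached for legal theories like the CFAA instead.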