We’ve previously written about “tweet-less, picture-less,” computer-operated accounts, or bots, that make one appear more popular—a.k.a. influential on social media—than one actually is. Recently, legislators and law enforcement agencies have moved to crack down on bots, their evil cousins known as sock puppets, and other deceptive social engagement practices. Specifically, California passed a law, going into effect in July 2019, banning the undisclosed use of bots to communicate or interact with a person with the intent to knowingly deceive that person in order to influence a commercial transaction or a vote in an election. Meanwhile, after the media exposed Devumi LLC’s deceptive activities, New York and Florida announced settlements with the company, which grossed over $15 million in revenue by creating, packaging and selling fake social media likes, followers and posts. The Devumi settlements mark the first of their kind indicating that such activity constitutes illegal deception of the public and, to the extent Devumi used stolen identities for its online activities, illegal impersonation.
Protecting consumer data privacy in the age of artificial intelligence and increased digital commerce is a growing concern. Enacted in June 2018, the California Consumer Privacy Act (CCPA) introduced provisions to protect consumers and became the first U.S. law that can be viewed as a response to the GDPR. Going into effect on January 1, 2020, legislation of this scope has far-reaching tendrils that may breed unintended consequences.
In another case of the law trying to keep pace with evolving technology, legislators are introducing bills to punish those who attempt to create false images that purport to be real. Targeting the rise of automated computer-generated imagery that has become increasingly accessible to the public, on February 14, 2019, California Assemblyman Marc Berman introduced a bill that would criminalize making or distributing a “deepfake.” Deepfakes are multimedia—often audiovisual recordings—that seem real but are generated by computers, often using artificial intelligence-enhanced algorithms.
The Committee on Foreign Investment in the U.S. (CFIUS) has effectively ordered the divestiture of Beijing Kunlun’s ownership of the online dating site, Grindr, just as the company was preparing for an IPO. The case is important for three reasons. It emphasizes the importance of a CFIUS risk assessment before an investment or acquisition is negotiated. It highlights the risks of deciding not to make a voluntary CFIUS filing before the deal closes. And it sends a message to parties who have already closed: the U.S. government is watching, so assess your CFIUS exposure now.
Given the growth of investments in and shift of regulatory views regarding cannabis-related products, many companies in industries like medicine, lifestyle and foods/beverages are looking to carve out niches and be leaders in the relatively new space. As with any new technology space, it is essential to have a robust intellectual property protection strategy to both establish and preserve one’s position as a dominant player in an emerging market. One important step that a company may take when creating such a strategy is applying for patents.
Back in September, we looked at the concerns and implications surrounding a proposed new copyright law being considered by EU legislators. Yesterday, perhaps faster than many expected, the European Parliament passed the new law. Many tech companies, digital rights activists and academic researchers found common ground in opposing the legislation, which they claim will stifle information sharing and enable censorship. Supporters of the law see it as a means to protect creative content. As written, the legislation is not quite as restrictive in all areas as initially feared—memes and gifs are “safe,” as are uploads to noncommercial and open-source sites—but nonetheless, now that it has been passed, and after inevitable legal challenges lead to further adjustments in the language, we’ll see who was right.
Over the past several years, cannabis has been one of the hottest areas of investment and innovation, with many states introducing legislation to legalize cannabis use in some form. Correspondingly, many companies have entered the U.S. market and are even listed on the Nasdaq or the New York Stock Exchange, leading to much interest on Wall Street. Many nascent industries have budded in the cannabis space, ranging from growing the cannabis plant itself to extraction processes to consumer products like vapes.
No one knows your face as well as your iPhone does. All the unique variances of your face that make it yours and yours alone are data points that your iPhone uses to unlock the device, substituting a face for a thumbprint. This same data can be used by the underlying technology—facial recognition—in a vast array of applications, from border control to photo tagging to law enforcement. But is this data (the measurement of the space between the eyes, the texture of the skin, etc.) open data? Or do individuals have a right to protection of an image of their face?
On the heels of a January 2019 announcement that it was charging nine persons with participating in a scheme to hack into its database of corporate filings, commonly known as EDGAR, the SEC on February 28 named Gabriel Benincasa as its first-ever Chief Risk Officer (CRO). Although the two events have no direct causal link, they serve as useful reminders that the SEC is determined to re-emphasize its mission to ensure the smooth operation of the U.S. securities markets and to root out and punish instances of fraud and market manipulation, be it by traditional methods or where digital tools are implicated and databases are compromised. The position of CRO is a new one at the SEC. Created by SEC Chairman Jay Clayton to strengthen the agency’s risk management and cybersecurity efforts, Benincasa’s office will help coordinate efforts to identify, monitor and address risks facing the agency.
Fortnite is the most popular video game in the world. So popular that it was last year’s highest-earning video game, grossing more than $2.4 billion in 2018 alone. So popular, in fact, that its fans successfully convinced Sony to reverse its longstanding policy against cross-platform gaming, thus allowing PlayStation Fortniters to play with their PC, mobile and other console-owning friends. Fortnite is also free.