While we’ve devoted ample time to discussing areas of potential concern regarding the application of algorithms—and algorithmic bias in particular—it’s also worth remembering that algorithmic technology is poised to make our lives better, often in ways we’ll never know about.
Just before Thanksgiving, Dave Chappelle posted to his Instagram account an 18-minute stand-up set titled “Unforgiven.” As expected, Chappelle’s fans ate it up, and the original video has been viewed more than six million times in the two weeks since its posting. Chappelle opens the set by sharing stories and lessons learned from his start in comedy at the age of 14 before turning to a very public airing of grievances with Viacom over the streaming rights to the early-2000s Comedy Central hit Chappelle’s Show, which Viacom owns and had recently started streaming on both HBO Max and Netflix.
Many restaurants that object to food delivery platforms delivering their food without permission will have their concerns addressed by a new California law enacted on September 24, 2020. AB-2149, known as the Fair Food Delivery Act of 2020, prohibits food delivery platforms from arranging the delivery of food orders without the express authorization of the food facility. Food delivery platforms will need to obtain agreements from any restaurants from which they want to take orders and deliver meals.
The sweeping use of facial recognition software across public and private sectors has raised alarm bells in communities of color, for good reason. The data that feed the software, the photographic technology in the software, the application of the software—all these factors work together against darker-skinned people.
As research continues to show that AI is not an impartial arbiter of who’s who (or who’s what), various mechanisms are being devised to mitigate the collateral damage from facial recognition software.
We’ve written frequently about distributed ledger technology (DLT) and blockchain—on interesting variations of the technology, its ability to bolster other technologies, and its potential applications to everything from team giveaways to trading platforms (be they for cryptocurrency or energy commodities). In “Blockchain-Based Tokenization of Commercial Real Estate,” colleagues Josh D. Morton and Matt Olhausen examine the real-world application of tokenization—the process of representing a fractional ownership interest in an asset with a blockchain-based digital token—in commercial real estate.
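To make the tokenization concept concrete, the core bookkeeping can be sketched as a ledger mapping holders to token balances, where the total token supply represents 100 percent of the asset. This is a simplified, hypothetical illustration only—the class and names below (`PropertyToken`, “Office Tower LLC”) are invented for this sketch and do not reflect any actual platform discussed in the article:

```python
from dataclasses import dataclass, field

@dataclass
class PropertyToken:
    """Hypothetical token ledger: fractional ownership of one asset."""
    asset_name: str
    total_supply: int  # all tokens together = 100% ownership
    balances: dict = field(default_factory=dict)

    def issue(self, owner: str, amount: int) -> None:
        # Never issue more tokens than the asset's total supply.
        if sum(self.balances.values()) + amount > self.total_supply:
            raise ValueError("cannot issue more than total supply")
        self.balances[owner] = self.balances.get(owner, 0) + amount

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # A transfer of tokens is a transfer of a fractional interest.
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def ownership_share(self, owner: str) -> float:
        return self.balances.get(owner, 0) / self.total_supply

# Usage: 1,000,000 tokens represent the whole building.
building = PropertyToken("Office Tower LLC", total_supply=1_000_000)
building.issue("alice", 250_000)
building.issue("bob", 100_000)
building.transfer("alice", "carol", 50_000)
print(building.ownership_share("alice"))  # 0.2, i.e., a 20% interest
```

On a real blockchain the balances and transfer rules would live in a smart contract rather than an in-memory object, but the ledger logic—issue, transfer, compute fractional share—is the same idea.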
We’ve previously touched on some of the issues caused by AI bias. We’ve described how facial recognition technology may result in discriminatory outcomes, and more recently, we’ve addressed a parade of “algorithmic horror shows” such as flash stock market crashes, failed photographic technology, and egregious law enforcement errors. As use of AI technology burgeons, so, too, do the risks. In this post, we explore ways to allocate the risks caused by AI bias in contracts between developers/licensors of the products and the customers purchasing the AI systems. Drafting a contract that incentivizes the AI provider to implement non-biased techniques may be a means to limit legal liability for AI bias.
Say what you want about the digital ad you received today for the shoes you bought yesterday, but research shows that algorithms are a powerful tool in online retail and marketing. By some estimates, 80 percent of Netflix viewing hours and 33 percent of Amazon purchases are prompted by automated recommendations based on the consumer’s viewing or buying history.
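A common way such automated recommendations work is co-purchase analysis: items frequently bought alongside what a customer already has are surfaced first. The following is a minimal sketch of that idea, assuming a toy order history—the function name and sample data are hypothetical, and real retail recommenders are far more sophisticated:

```python
from collections import Counter
from itertools import combinations

def recommend(history, orders, top_n=3):
    """Suggest items most often co-purchased with items in `history`."""
    # Count how often each pair of items appears in the same order.
    co_counts = Counter()
    for order in orders:
        for a, b in combinations(set(order), 2):
            co_counts[(a, b)] += 1
            co_counts[(b, a)] += 1
    # Score candidate items by co-purchases with the customer's history.
    scores = Counter()
    for item in history:
        for (a, b), n in co_counts.items():
            if a == item and b not in history:
                scores[b] += n
    return [item for item, _ in scores.most_common(top_n)]

orders = [
    ["shoes", "socks"],
    ["shoes", "socks"],
    ["shoes", "laces"],
]
print(recommend(["shoes"], orders))  # "socks" ranks first
```

This pairwise-count approach is the simplest form of collaborative filtering; production systems at the scale of Netflix or Amazon layer far richer signals on top, but the underlying intuition—recommend what similar histories led to—is the same.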
But algorithms may be even more powerful where they’re less visible—which is to say, everywhere else. Between 2015 and 2019, the use of artificial intelligence technology by businesses grew by more than 270 percent, and that growth certainly isn’t limited to the private sector.
“One who invites another to his home or office takes a risk that the visitor may not be what he seems, and that the visitor may repeat all he hears and observes when he leaves. But he does not and should not be required to take the risk that what is heard and seen will be transmitted by photograph or recording, or in our modern world, in full living color and hi-fi to the public at large or to any segment of it that the visitor may select.” When Ninth Circuit Judge Shirley M. Hufstedler wrote these words in 1971 about surreptitious recordings made by newsmen, she probably had no idea that a global pandemic would give new meaning to her words.