The Forecast for Increased Litigation Around AI Adoption

Clyde & Co, a global law firm, predicts a sharp increase in AI-related cases, particularly class action lawsuits, in 2024 and beyond.

Generative artificial intelligence (AI) tools like OpenAI's ChatGPT created quite the buzz last year, but now it's time to pay the piper. The global law firm predicts a sharp increase in AI-related cases, particularly class action lawsuits, in 2024 and beyond.

Companies that use generative AI tools are opening themselves up to liability on several fronts. “One of the areas we identified was discrimination and bias,” says Janice Holmes, Washington, D.C.-based senior counsel at Clyde & Co. The harbinger arrived in August with the settlement of the first lawsuit brought by the U.S. Equal Employment Opportunity Commission (EEOC) over bias in AI software. The lawsuit alleged that China-based tutoring company iTutorGroup Inc. had used AI software that discriminated against older job applicants. iTutorGroup agreed to pay $365,000 to more than 200 job applicants.

“We expect to see more of these types of claims, as applicants start looking behind the hiring process and asking, 'What tools were used? How were they used? And what data was used to determine the right candidate?'” Holmes says.

Another concern with AI usage is privacy. “These AI tools are handled by third parties,” Holmes continues. “What kind of contracts are in place to make sure information is protected?”

For independent insurance agents, that means any commercial client who uses AI to aid hiring or to process data is at risk. “These are going to be considerations for companies in any sector that uses AI tools,” she warns.

A company using AI tools can also be dragged into litigation by the creators of the AI program, a scenario increasingly likely with the rise of AI litigation, Holmes notes. “If someone sues the third-party developer of the tool, who's to say that the third party can't bring in the company that's using the tool? The creator will probably argue the issue came with how the tool was used, not how it was designed.”

“Ultimately, it would be up to the court to determine the strength and scope of contractual obligations, waivers and risk transfer agreements, but no matter the defense, a company is at risk of being brought into the lawsuit for the courts to decide,” says Holmes, who points out that whatever agreements are in place, “it's going to take a court ruling” to settle these issues.

The insurance industry, in turn, will “probably see some application changes, and even some tailored endorsements on policies about these risks,” Holmes says. In fact, based on the industry's response to past issues, “we think the insurance industry can help drive some reforms,” she continues.

“Governments right now are trying to create regulations and parameters around the use of AI with privacy and discrimination,” she says. “But, at the end of the day, once the third-party developers start to be sued, if their policies are broad, it's going to be a whole frontier of litigation. At the insurance level, the insurers are going to say, 'Hold on, we can't keep having these claims.' It will drive the third-party creators to say, 'Maybe we need to actually operate in some parameters because we have our insurance carrier to answer to.'”

Until that happens, independent insurance agents can help commercial clients mitigate AI usage risks by encouraging companies to audit their software contracts and to put risk transfer agreements in place, such as additional insured endorsements on third-party policies.

“It's always important to protect from potential issues by making sure the third party that created the tool has some skin in the game to make sure the tools are being operated in accordance with regulatory guidelines and laws as they come out,” Holmes adds.

AnneMarie McPherson Spears is IA news editor. 

Monday, April 1, 2024