How much do legal leaders trust artificial intelligence in high-stakes decisions? New study sheds light

Only 37% of legal leaders trust the use of generative artificial intelligence in high-stakes decisions, showing limited confidence in its ability to interpret complex issues, according to a new study of 500 legal and business leaders.
The study by Paragon Legal, a legal services company that advises businesses and corporate legal departments, also reveals that:
• 39% say their organizations are adopting AI too quickly.
• 36% have used AI-generated insights that they do not fully trust.
• 37% have restricted or disabled AI tools because of concerns over compliance.
“As artificial intelligence becomes more deeply integrated into legal practice, leaders are increasingly navigating where to draw the line between innovation and integrity,” according to the study. “Trust in AI remains cautious, especially when human judgment and accountability are at stake.”
Nearly two-thirds (65%) of the legal and business leaders who responded to the survey say AI should be used for legal support rather than for attorney-led decisions. However, the study also shows that 41% admit that they have reclassified tasks to ease AI adoption.
A key takeaway for legal leaders is to “establish clear accountability frameworks before allowing AI to influence substantive legal decisions,” according to the study. It also suggests that “human sign-off should remain the standard, especially where regulatory interpretation or client impact is involved.”
