The Role of AI in Mediation: Where We Are and Where We Are Headed
by James South on 24/07/25
We have spent years speculating about the future of artificial intelligence in mediation. But that future is no longer theoretical: it's already here. The conversation has shifted from "What might happen?" to "How are we already using it, and how will it develop?" AI is beginning to embed itself into our mediation practices in ways both subtle and transformative.
Yet mediators are divided on this topic. Some have embraced AI tools. Others are wary, either due to unfamiliarity or concern that AI undermines the human core of mediation.
AI is already part of our field. As its capabilities grow and society becomes more comfortable with its use, mediators must ask not if, but how they will engage with it, for the benefit of the process and the parties.
Practical Applications: AI in the Mediator’s Toolkit
The following are ways I know AI is already being used in mediation, though the list is not exhaustive. For fuller guidance, I recommend the newly published IBA guideline on the use of generative AI in mediation as a starting point.
- Mediator Selection Made Smarter
At CEDR, we are developing an AI-powered mediator selection engine to assist client advisers. This system analyses the nature of a dispute, the mediator profile preferred by the parties (e.g. sector expertise, professional background, style), and suggests suitable candidates. Eventually, it may even check the availability of these mediators using integrated calendar tools.
Why it matters: It enhances fairness, efficiency and precision, ensuring that the best-fit mediators are consistently put forward.
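To make the idea concrete, the sketch below shows one way such a selection engine might rank candidates: a simple weighted match between a dispute brief and mediator profiles. The field names and weights are illustrative assumptions only, not CEDR's actual system.

```python
from dataclasses import dataclass

@dataclass
class MediatorProfile:
    name: str
    sectors: set              # e.g. {"construction", "energy"}
    style: str                # e.g. "facilitative" or "evaluative"
    years_experience: int

@dataclass
class DisputeBrief:
    sector: str
    preferred_style: str
    min_experience: int = 0

def match_score(m: MediatorProfile, brief: DisputeBrief) -> float:
    """Weighted match between a mediator profile and the parties' stated preferences."""
    score = 0.0
    if brief.sector in m.sectors:
        score += 0.5          # sector expertise weighted most heavily
    if m.style == brief.preferred_style:
        score += 0.3          # preferred mediation style
    if m.years_experience >= brief.min_experience:
        score += 0.2          # minimum seniority threshold
    return score

def suggest_mediators(pool, brief, top_n=3):
    """Return the top-N best-fit candidates, ranked by match score."""
    return sorted(pool, key=lambda m: match_score(m, brief), reverse=True)[:top_n]
```

A production system would of course draw on richer data (availability, past feedback, conflict checks), but the ranking principle is the same.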
- AI-Enhanced Preparation
This is where AI is currently having the most impact. Personally, I’ve used it in recent mediations to:
- Summarise key issues and timelines
- Highlight divergences between parties
- Suggest areas worth exploring
- Provide deep dives into legal concepts
Because this use is private and involves mediators working behind the scenes, it feels less controversial and seems to be more widely accepted.
- Body Language and Emotional Cues (Experimental Use)
Some mediators are experimenting with AI tools that analyse online participants’ facial expressions and tone of voice to infer emotions or intent. These insights are kept confidential to the mediator.
Personally, I haven't used this and remain sceptical. Ethical concerns and practical limitations abound, and at a minimum I would have thought that the use of these tools should be disclosed to the parties.
- Negotiation Support for Parties
One particularly promising application involves AI-driven option generation. Some mediators, with party consent, input anonymised case summaries and each party’s interests into an AI. The AI suggests settlement options, which are jointly reviewed.
The benefit? These AI-generated ideas are perceived as neutral and can spark creative solutions neither side had considered. Maintaining party self-determination is key here: the AI serves only as an aid to generating and considering options.
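In practice, this can be as simple as assembling an anonymised prompt and asking a model for options that the mediator then reviews jointly with the parties. The sketch below illustrates the idea; `send_to_model` is a hypothetical placeholder for whichever service is used, and any real use would need the consent and confidentiality safeguards discussed later.

```python
def send_to_model(prompt: str) -> str:
    """Hypothetical placeholder for an approved AI service, ideally a locally
    hosted or private deployment (see the confidentiality discussion below)."""
    raise NotImplementedError("Connect this to an approved, confidential AI service.")

def build_options_prompt(case_summary: str, interests_a: str, interests_b: str) -> str:
    """Assemble an anonymised prompt asking the model to propose settlement options.
    Inputs are assumed to be already anonymised (Party A / Party B, no real names)."""
    return (
        "You are assisting a mediator. Based on the anonymised case summary and the "
        "parties' stated interests, suggest five possible settlement options. "
        "Present them neutrally; do not recommend one over another. The parties decide.\n\n"
        f"Case summary: {case_summary}\n"
        f"Party A's interests: {interests_a}\n"
        f"Party B's interests: {interests_b}\n"
    )

def generate_options(case_summary: str, interests_a: str, interests_b: str) -> str:
    """Return the model's suggested options for joint review by the parties."""
    return send_to_model(build_options_prompt(case_summary, interests_a, interests_b))
```

Note that the prompt itself reinforces self-determination by instructing the model not to favour any option.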
- Reality Testing and Risk Analysis
Some law firms are already using proprietary AI tools to estimate litigation outcomes. Mediators must now navigate conversations where one party cites an AI’s prediction of their likelihood of success.
There’s also potential for parties to jointly use such tools during mediation to objectively assess case strength, although adoption here remains limited due to strategic concerns.
Addressing the Concerns
Mediators have raised valid concerns about AI. But none of these are insurmountable:
- Losing the Essence of Mediation
The human element is central to mediation. Critics argue AI risks stripping away the interpersonal essence.
But let’s remember: online mediation once felt like a threat to “real” mediation too. Now it’s a valued option. Likewise, AI is best viewed as an assistant, not a replacement. Used wisely, it enhances rather than replaces human connection.
- Party Autonomy
Mediation is voluntary and party-led. Any AI use must preserve that. The key is consent and transparency, particularly when AI is used with the parties in the mediation.
- Privacy and Confidentiality
This is the most frequently cited concern. Uploading confidential materials to tools like ChatGPT can risk data exposure. These risks can be managed. Solutions include:
- Anonymising sensitive data (see the sketch after this list)
- Using locally hosted or private AI systems
- Clear disclosure and consent
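As a simple illustration of the first point, the sketch below (a minimal example, not a complete anonymisation tool) replaces known party names and email addresses with neutral placeholders before any text is sent to an external model.

```python
import re

def anonymise(text: str, party_names: dict) -> str:
    """Replace known party names and email addresses with neutral placeholders.

    party_names maps real names to labels, e.g. {"Acme Ltd": "Party A"}.
    A real workflow would also handle addresses, account numbers and other identifiers.
    """
    for real, label in party_names.items():
        text = re.sub(re.escape(real), label, text, flags=re.IGNORECASE)
    # Strip email addresses as a catch-all for contact details.
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "[email redacted]", text)
    return text

# Example: anonymise a position statement before summarising it with an AI tool.
safe_text = anonymise(
    "Acme Ltd claims £40,000 from Jane Doe (jane@example.com) for late delivery.",
    {"Acme Ltd": "Party A", "Jane Doe": "Party B"},
)
# -> "Party A claims £40,000 from Party B ([email redacted]) for late delivery."
```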
- Neutrality and Impartiality
AI outputs are only as good as their inputs. Mediators must take care not to unintentionally bias AI by how information is framed. Again, mediator training in neutrality and framing of language can extend to how we engage with these tools.
- Obsolescence
Will AI replace mediators entirely? Not soon. But futurists predict dramatic workforce shifts. Like all professionals, mediators must adapt. Rather than fear obsolescence, we should ensure that AI amplifies our human value rather than replaces it.
Final Thoughts
AI is not a distant innovation. It is already influencing mediation practices, from preparation and case analysis to option generation and beyond.
Mediation has always been flexible and continues to adapt. Going forward we should consider how to use AI thoughtfully and ethically and for the benefit of the parties we serve.