This year’s ILTACON in Washington was heavy on AI, but the conversation with vendors has shifted. Legal IT Insider’s briefings weren’t about potential use cases or speculative roadmaps. Instead, they focused on how AI is now being embedded into the tools lawyers use every day — and, crucially, how those tools are starting to talk to each other.
Taken together, these briefings point to an inflection point where agentic workflows, data integration, and open ecosystems define the agenda. But amid the latest buzzwords it’s important to remember that agents are only as good as the tools they have to work with, and AI only as good as its underlying data. And while the industry talks about autonomous AI, end users are still struggling with cloud implementations and infrastructure challenges; they need vendors to be business partners that help them make progress at speed.
Assistants that act, not just surface information
If 2023 and 2024 were about copilots, 2025 is about agents. Rather than passively surfacing suggestions, a number of AI tools are, in theory, capable of taking the initiative and executing tasks across connected systems.
Having access to the right tools and connected systems is key. Litera’s chief product officer, Adam Ryan, stressed that Litera One is now an integrated product (it’s been a long time coming) and introduced Lito, Litera’s new AI agent. He sketched a scenario where a lawyer receives an email about energy sector experience. Instead of manually digging into databases, Lito can read the request, determine it’s an “experience search” use case, and query Litera’s Foundation database. Within the same interface, it surfaces relevant matters, complete with details on who worked on them and connected narratives.
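To make the pattern concrete, here is a minimal, hypothetical sketch of how such an agentic routing step might work: the agent classifies the incoming request, picks the right tool, and returns structured matters rather than free text. The intent labels, the Matter fields and the in-memory store are illustrative assumptions, not Litera’s actual implementation.

```python
# Hypothetical sketch of an agentic "experience search" step: classify the
# incoming email, route it to the right tool, and return structured matters.
# Intent labels, the Matter fields and the in-memory store are illustrative,
# not Litera's actual API.
from dataclasses import dataclass


@dataclass
class Matter:
    name: str
    sector: str
    team: list[str]
    narrative: str


def classify_request(email_body: str) -> str:
    """Stand-in for an LLM call that maps an incoming email to a known use case."""
    if "experience" in email_body.lower():
        return "experience_search"
    return "general_enquiry"


def search_experience(sector: str, experience_db: list[Matter]) -> list[Matter]:
    """Query the firm's experience store for matters in a given sector."""
    return [m for m in experience_db if m.sector.lower() == sector.lower()]


def handle_email(email_body: str, experience_db: list[Matter]) -> list[Matter]:
    # The agent decides which tool to invoke rather than just summarising text.
    # A fuller version would also extract the sector from the email; "energy"
    # is hard-coded here to keep the sketch short.
    if classify_request(email_body) == "experience_search":
        return search_experience("energy", experience_db)
    return []
```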
At NetDocuments, chief product officer Dan Hauck described the company’s new editing tool to Legal IT Insider, commenting: “The agent has access to it and can edit your documents in the way you want. You give it instructions of the changes that you want to see directly in Microsoft Word and it will surface the appropriate suggestions that you can review and accept or reject, which speeds up the process and is driven by an agentic workflow.”
AI output is only as good as the data it has access to, and at the conference NetDocuments also unveiled new AI profiling capabilities, which run in the background to ensure that documents have correct and complete metadata.
Generally speaking, vendor conversations now carry a pragmatism that was missing in the earlier GenAI years.
It’s worth noting here that most law firms we speak to are not yet ready for truly agentic capabilities. Vendors including the big research players Thomson Reuters and LexisNexis, as well as NetDocuments and smaller vendors such as Definely, are solving for this by laying out the various steps that agents are taking and including a human-in-the-loop approval process. By opening up the AI black box, they are gaining trust as we move towards a new era of autonomous AI.
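As a rough illustration of that human-in-the-loop pattern, the sketch below has the agent expose each planned step and wait for explicit approval before executing it. The step structure and approval prompt are assumptions for illustration, not any vendor’s actual workflow.

```python
# Illustrative human-in-the-loop loop: the agent lays out its planned steps
# and pauses for explicit approval before executing each one.
from dataclasses import dataclass
from typing import Callable


@dataclass
class AgentStep:
    description: str            # what the agent intends to do, in plain language
    execute: Callable[[], str]  # the action, only run once approved


def run_with_approval(steps: list[AgentStep]) -> list[str]:
    results = []
    for step in steps:
        answer = input(f"Agent proposes: {step.description}. Approve? [y/n] ")
        if answer.strip().lower() != "y":
            print(f"Skipped: {step.description}")
            continue
        results.append(step.execute())
    return results


# Example plan: a two-step drafting workflow with a reviewer in the loop.
plan = [
    AgentStep("Search the DMS for recent energy sector engagement letters",
              lambda: "3 documents found"),
    AgentStep("Insert suggested edits into the draft as tracked changes",
              lambda: "12 suggestions added for review"),
]
# run_with_approval(plan) prompts the reviewer before each step executes.
```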
A word of warning: it doesn’t take a rocket scientist to see that agentic AI has for some vendors become the latest marketing tool, and it is important to be clear about what is being promised.
Data integration across platforms
The second theme of ILTA was unmistakable: data integration is a make-or-break factor.
Harvey’s co-founder and CEO Winston Weinberg cut to the chase at their company briefing: “The first problem we’re trying to solve is how to expand product sets so you can access all the tools and data that a human can.” Without that, he warned, you end up with “two million context window prompts” and no clear answers.
Harvey’s roadmap is all about expanding its surface area — connecting to systems like iManage, LexisNexis, and more recently publishing giant Wolters Kluwer — so that a lawyer can issue a single query and get synthesised, contextualised answers directly within their workflow. Weinberg said: “What we’re trying to do is get all of the surface area of all of the context that a lawyer needs to complete a task and we’re expanding the product surface so you can enter a search, search all resources, and apply that to the document automatically.”
The common thread: no one is talking about AI in isolation anymore. It’s about orchestration — pulling together multiple data sources into a workflow that actually reflects how lawyers practice.
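As a purely hypothetical illustration of that orchestration idea, the sketch below fans a single query out to several data-source connectors and merges the results before handing them to a model, rather than stuffing everything into one enormous prompt. The connector names and functions are placeholders, not Harvey’s actual integrations.

```python
# Hypothetical orchestration sketch: one query fans out to several connectors
# and the merged context is handed to the model, instead of one giant prompt.
# Connector names and results are placeholders, not Harvey's integrations.
from concurrent.futures import ThreadPoolExecutor


def search_dms(query: str) -> list[str]:
    return [f"DMS result for '{query}'"]        # e.g. documents from a DMS


def search_research(query: str) -> list[str]:
    return [f"Research result for '{query}'"]   # e.g. legal research content


def search_knowhow(query: str) -> list[str]:
    return [f"Know-how result for '{query}'"]   # e.g. published practice material


CONNECTORS = [search_dms, search_research, search_knowhow]


def orchestrate(query: str) -> str:
    # Query every source in parallel, then synthesise one contextual answer.
    with ThreadPoolExecutor() as pool:
        hits = [r for result in pool.map(lambda f: f(query), CONNECTORS) for r in result]
    context = "\n".join(hits)
    return f"Answer synthesised from {len(hits)} sources:\n{context}"
```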
MCP: The open ecosystem era
If there was a buzz across the briefings, it was MCP — Model Context Protocol, or the equivalent of APIs for large language models.
During ILTACON, iManage announced that its customers will be able to connect MCP-compatible AI applications to its cloud – a major development, depending on how far this goes.
At our briefing with NetDocuments, CTO John Motz told Legal IT Insider: “MCP eliminates the need to grab a document, understand it and grab the metadata. It brings the context back in the MCP workflow so it can hit the LLM with that context,” but he warned, “MCP without semantic search is just key word search. You need deep enrichment.”
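For readers wondering what MCP looks like in practice, here is a minimal sketch of an MCP server exposing a single document-search tool, loosely following the quickstart pattern of the Model Context Protocol Python SDK. The search_documents tool, its placeholder corpus and the idea of swapping in a semantic index are assumptions for illustration, not NetDocuments’ or iManage’s actual integrations.

```python
# Minimal sketch of an MCP server exposing a single document-search tool,
# loosely following the Model Context Protocol Python SDK quickstart.
# The tool, its placeholder corpus and field names are illustrative only.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("dms-search")


@mcp.tool()
def search_documents(query: str, top_k: int = 5) -> list[dict]:
    """Return the most relevant documents, with metadata, for a natural
    language query. A real implementation would hit a semantic (vector)
    index rather than the keyword match below: the 'deep enrichment'
    Motz refers to."""
    corpus = [  # placeholder; in practice this would query the DMS index
        {"id": "doc-1", "title": "Energy sector SPA", "client": "Acme"},
        {"id": "doc-2", "title": "Gas storage JV agreement", "client": "Beta"},
    ]
    hits = [d for d in corpus if query.lower() in d["title"].lower()]
    return hits[:top_k]


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio for any MCP-compatible client
```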
We’ll be bringing you a lot more on this in the coming weeks, including end user reactions on how MCP will impact their search strategies. The general view is that it is huge.
The next stage of maturity
Perhaps the most striking difference at ILTACON 2025 was tone. Legal tech vendors are no longer selling what AI will be able to do. They are showing what AI is already doing in live environments, whether drafting edits, profiling documents, or retrieving experience data.
That said, big questions remain. The debate over the growing functionality of mainstream tech and its impact on buyers’ choices looms large. We will shortly bring you the ILTACON feedback of a raft of senior legal tech leaders, and Gil Perez, chief innovation officer at Freshfields, says: “What we’re seeing as we move forward is that the big cloud providers are providing AI baked in. A year ago, summarisation or creating a chronology was a big deal, but today it’s table stakes.” Legal tech vendors are having to run fast to keep up. Many law firms are now habitually using GenAI technology, and if you had to name just one tool in everyday use, it would be Copilot.
There is also a disconnect in where law firms are on the cloud adoption curve, even though cloud is now a conversation around ‘when’ not ‘if’. To take advantage of much of the new technology, law firms need to be in the cloud, but Tony McKenna, former president of ILTA, tells us at the end of his term, in an interview shortly to be released:
“Andy Powell’s research on cloud adoption is always well received and one of the biggest takeaways of the G100 and 200 is that our aspirations are always higher than our actual circumstances. There are lots of good reasons for that but it’s a year-on-year statistic and it creates a bit of a chuckle in the room. With regard to the big ticket things, especially around practice management systems, our aspirations are always higher than our actual achievements. Even though we know that getting GenAI in the cloud is easier, it’s still hard to get your DMS and PMS to the cloud. It’s not just about cost. One of the big things has been functionality. The view is that cloud products have started to have more functionality, and that is a key differentiator this year.”
McKenna encourages vendors to provide support, observing:
“The critical difference between business partners and vendors is that a business partner assists you and supports you commercially and with resources and skills. Say you move your DMS to the cloud but your PMS is on-premises. You still can’t change what your staff do and start to realign staff to work more on the front end. That is where a business partner is critical, because they recognise that you have a lot of stuff going on and are there to help.”
Regardless of the challenges, it is clear that this year at ILTACON, the AI conversation has matured. The focus is shifting from novelty to utility, from pilots to production. The next twelve months will test whether the orchestration on display in Washington translates into tangible gains for law firms and their clients.