Q&A: Neil Araujo and Shawn Misquitta discuss iManage’s Copilot integration 

iManage today (20 May) announced the launch of an integration with Copilot, bridging the user's Microsoft 365 environment and the content stored in the iManage Knowledge Work platform. This is a long-anticipated integration for iManage customers who are trialling or have already rolled out Copilot.

iManage says that the integration enables the AI engine to ground its responses in the specific context of the organisation's prior work product, while also respecting the security context of the user.

Legal IT Insider's editor Caroline Hill spoke with iManage CEO Neil Araujo and EVP of product management Shawn Misquitta about what the integration means for customers.

Caroline: It’s great to see the Copilot integration, it’s funny because this is a long-awaited integration in the world of gen AI, where everything moves at 100 miles an hour. Tell us about the work you’ve been doing behind the scenes, including the key priorities. 

Neil: There has been a fair amount of anticipation for this integration, because Copilot is, think of it as being, a personal assistant. It helps you answer questions and get things done. And if it can answer questions based on information and data that is sitting in iManage, those answers become more meaningful. They are safer, because you can access only what you've got access to, and it certainly adds value to what Copilot can do. So yes, it has been anticipated. But Copilot only came out a few months back, so it's not like it's been years, and whenever we build something, I think the expectation, particularly in the legal market, is you've got to get it right rather than get it out fast, and that's always been our philosophy.

Caroline: You say that the integration will respect the security context of the user, so can firms be assured that users won't be able to access anything they shouldn't?

Neil: Yes, absolutely. It will respect all policies and the security context of the end user that’s making the request from Copilot.

Caroline: While law firms have been awaiting this integration with the DMS, equally they aren't going to just unleash people in terms of enabling them to pull up and use any and all content. How do you envisage this being used sensibly?

Neil: First of all, it allows you to interact with the DMS using natural language, right? So you can ask a question like 'Can you find me the last NDA we had with Caroline Hill.' And it will do a search and say 'I found three NDAs, is this the one you're looking for?' And you can then say 'What is the expiry date on the latest NDA' and it will look at the document and tell you what the expiry date is, so the user doesn't have to go and do a search, find the document, wait for it to open, and then answer the one question they are looking for. It will be much faster.

Caroline: In terms of knowledge, from the conversations I'm having, because the DMS has so much content in it, people aren't sure how much they will ever be able to trust it for pulling up precedents, and particularly the right precedents. Would you agree with them?

Neil: Yes, I agree, and that is why we have iManage Insight+, which allows you to curate precedents and documents, and the 10,000 documents are more important than the 10 million. At ConnectLive one of the keynote speakers is a customer who will be talking about their experience of using it and showing how they went from a big pile of documents to about 50 different categories, so that you know where your NDAs are and where your service agreements are, and all of that classification is done by a nice little robot called iManage AI. It's magical.

Caroline: So for people to get the most value from the integration will they need Insight+? 

Neil: The integration that we have is through our API, and it goes against the entire repository as well as Insight+, so it depends on the context. One of the things that I think is nice about the integration is that it gives you attributions when it gives you a response, saying, 'Hey, I found this answer here in this document.' Why is that important? It's because that way you know it's not picking up some garbage that has just been lying around. It allows you to have a conversation, so 'Find me the most current leases we have with this particular landlord.' It works pretty well in how it constructs that query and brings the document back, and if it can't find something, it will tell you 'I couldn't find that, can you maybe formulate your query differently?'

Caroline: That's interesting, so while a lot of the content in the DMS will be around the practice of law, the ability to search in that way makes it very relevant for the business of law and being able to gain extra insights.

Neil: Yes absolutely. And for corporate legal departments, that’s part of why they use the DMS. It’s slightly different than a law firm.

Caroline: You made a point, in your press release announcing the integration, of saying that responses are grounded. Is there any risk of hallucination?

Neil: We've done a fair amount of work to reduce hallucinations and to provide attribution so that the work can be checked. I don't think anyone can say you can eliminate hallucinations at this point. But it's something that our engineers have taken into account, and we'll be watching for it very closely.

Shawn: There are two things we've done here. When we approached AI, one of the things we did was, for example, the piece Neil was talking about around the automatic classification and categorisation of information. If we are able to guide the large language model by saying, 'Look at this, we already know that this is a type of lease agreement,' and you provide that context to the large language model, we reduce the risk of hallucination. It's not eliminated a hundred percent. But we reduce it.

We use a combination of AI technologies, the classification and enrichment, and what we have is custom built, not something off the shelf. It's not OpenAI and it's all within the iManage Cloud. Using that to contextualise and limit the amount of information that's sent to the LLM produces better results in what we've seen. Now, of course, as customers use it they will deal with different types of content and different languages. I think we'll learn more, but that's the approach we've used to help knowledge workers.

Caroline: Sadly we've run out of time, but is there anything I haven't asked you that's important about this announcement?

Neil: We’re presenting this week at Build, which is Microsoft’s big developer conference, and Microsoft thought it was cool enough to bring us in front of their entire audience. The reason that we’ve been able to move this quickly is the years of investment that we’ve made in Azure and the way we’ve embraced Azure as a platform. That’s enabled us to deliver on what we’re doing with the Office 365 integration as well as what we’re doing with Copilot, in very robust ways that give you the functionality with the security and with the operational SLAs that you need. This is not a flash in the pan. This is an evolution based on years of investment in Azure and Microsoft.