Agiloft launches Gen AI Prompt Lab including ‘bring your own LLM’ capability 

Contract lifecycle management vendor Agiloft today (2 May) announced the launch of its Generative AI Prompt Lab, which includes pre-configured prompt templates that can leverage customers' own data – and their own large language model – across multiple use cases.  

This is available as a commercial add-on to Agiloft's CLM platform and is part of Agiloft's AI Platform, which also includes the generative AI redlining capabilities launched earlier this year. 

At launch, the Prompt Lab has five prebuilt templates: 

  • Contract summarisation – Generates a summary from an attached document, which customers can configure for their own particular needs, whether those of the legal or the procurement team. 
  • Supporting document analysis – Analyses and summarises any supporting documents related to the contract within the same contract PDF. This might be helpful, for example, during the approval process for a complex contract with many supporting documents. 
  • Clause comparison – Summarises the differences between an extracted contract clause and the corresponding clause in your standard clause library, which may be useful when reviewing revisions of a particular agreement.  
  • Termination letter generation – Produces a skeleton draft of a termination letter that the user can then complete and send out to the customer.
  • Contract description – A three-sentence summary of the contract to use as the contract description. 

In a demonstration of the new capability to Legal IT Insider, Agiloft's chief product officer Andy Wishart said that when running the contract summary, a user can select which large language model to use. "Customers can either choose to utilize the GPT models that we are connecting to within Agiloft directly, or they can bring their own model. So if they have a model running in either Azure OpenAI Services or within OpenAI, they can bring their model keys and can authenticate with that model.  

“That’s important for two reasons. When we’ve been working with customers over the last few months, there are some that are forward thinking and experimenting with large language models. They’re standing up their own instances of GPT models running in Azure or OpenAI and some of them are doing some fine tuning of that model. So being able to connect to their fine-tuned models could give them more contextually relevant results for their organization. It also means that if they’ve got some special terms with OpenAI or Microsoft that are really important to them under their agreement, then their use of generative AI within Agiloft is covered then by those special terms as well.” 
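For readers wanting a concrete picture of what "bringing your model keys" involves, the sketch below maps the choice Wishart describes onto the publicly documented authentication conventions of the two services (a bearer token for OpenAI's hosted API, an `api-key` header and a customer-specific endpoint and deployment for Azure OpenAI Service). It is an illustrative assumption, not Agiloft's actual integration code; the function name and parameters are hypothetical.

```python
# Hypothetical sketch of routing a request to a customer-supplied LLM.
# Based only on OpenAI's and Azure OpenAI Service's public HTTP conventions,
# not on Agiloft's internal implementation.
def build_request(provider: str, api_key: str,
                  deployment: str = "", endpoint: str = ""):
    """Return (url, auth_headers) for the customer's chosen provider."""
    if provider == "openai":
        # OpenAI's hosted API authenticates with a bearer token.
        return ("https://api.openai.com/v1/chat/completions",
                {"Authorization": f"Bearer {api_key}"})
    if provider == "azure":
        # Azure OpenAI Service uses the customer's own endpoint and
        # deployment name, authenticated with an `api-key` header.
        url = (f"{endpoint}/openai/deployments/{deployment}"
               f"/chat/completions?api-version=2024-02-01")
        return (url, {"api-key": api_key})
    raise ValueError(f"unsupported provider: {provider!r}")
```

Because the fine-tuned model lives in the customer's own OpenAI or Azure account, any contractual terms they have negotiated with those providers travel with the key, which is the second point Wishart makes above.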

Users can select where the output is stored. A test mode lets no-code administrators on the Agiloft platform experiment with generative AI before deciding to put a prompt into production, and they can also modify the prompt itself. 

Wishart said: “Once they’ve created and tested a prompt, either a copy of one of the templates that we’ve created or their own prompts, you can take any data, infuse that data into the prompt, take the output, store it in a data field within Agiloft, and then you can use those answers to report or display to the user, or to trigger a workflow within the overall Agiloft platform.”
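The flow Wishart describes – infuse record data into a prompt, capture the model's answer in a data field, then report on it or trigger a workflow – can be sketched in a few lines. This is a minimal illustration using a stand-in model function; the names and structure are assumptions, not Agiloft's API.

```python
from string import Template

def run_prompt(template: str, record: dict, llm, output_field: str) -> dict:
    """Fill the template with record data, call the model,
    and store the answer in a data field on the record."""
    prompt = Template(template).safe_substitute(record)
    record[output_field] = llm(prompt)  # answer becomes reportable data
    return record

# Usage with a stand-in model in place of a real LLM call:
fake_llm = lambda prompt: f"SUMMARY OF: {prompt[:40]}"
record = {"title": "MSA with Acme"}
updated = run_prompt("Summarise the contract titled $title.",
                     record, fake_llm, "description")
# `updated["description"]` can now be displayed to the user or used
# as the condition that triggers a downstream workflow step.
```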

He added: "We think this opens up a whole world of possibility for our customers. With these no code capabilities they can take the power of AI into their own hands, so we're excited to see where customers will go with this. We're hoping that customers will be sharing some of their prompt templates within our community to help Agiloft accelerate use cases across the gen AI space. This is one of the most exciting things that we've done recently and we're excited to see the potential that the community can bring."