From the September Orange Rag…
If you’ve heard about one exciting development in the AI space recently, it may well be MCP, the Model Context Protocol launched by Anthropic last year, which provides a standardised way for large language models (LLMs) to connect with external data sources and tools.
Extremely cool and also misunderstood, MCP is being looked at for everything from reimagining enterprise search strategies to making you tea in bed, so we spoke with Yannic Kilcher, chief technology officer of AI search platform DeepJudge, who has almost three hundred thousand followers on his YouTube channel, for a cut-the-crap chat. FYI Kilcher’s YouTube channel is his own personal account, and you can find it here: https://www.youtube.com/c/YannicKilcher
Starting with some back to basics on what it actually is, Kilcher said: “MCP is a standard for different systems to communicate with each other, so it’s a protocol. The P stands for protocol, right? So in that sense, it’s just a standard of how systems communicate.
“We have had these standards before and as you already know, they are generally called an API. An API is where one system exposes a set of functionality to the outside world, and so you can think of MCP as just an API. It’s how things are commonly agreed upon. Maybe one good analogy is email. I can send almost every person in the world an email because the email format is standardised. So no matter whether you use Gmail or Outlook or Apple iCloud Mail or whatever, the format of email is standardised so I can send you an email and it will be received. MCP is the equivalent for LLMs and means developers don’t have to write API adapters for every single system.”
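For developers, the practical upshot is that a system which “speaks” MCP can be very small. The sketch below uses the FastMCP helper from the official MCP Python SDK to expose a single document search tool; the server name, the tool and its placeholder results are our own hypothetical illustration, not anything from DeepJudge or any particular product.

```python
# Minimal sketch of an MCP server, assuming the official Python SDK ("mcp" package).
# The server name, tool and results are hypothetical placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("document-search")  # hypothetical server name

@mcp.tool()
def search_documents(query: str, limit: int = 10) -> list[str]:
    """Search the document store and return matching titles."""
    # A real integration would call your existing search API here;
    # placeholder results keep the sketch self-contained.
    return [f"Result {i} for '{query}'" for i in range(1, limit + 1)]

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio, so any MCP-capable client can discover and call it
```

Any chat client or agent that understands MCP can now find and use that search tool without a bespoke adapter being written for it.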
What MCP doesn’t do is add any new capabilities. Kilcher says: “That’s really important. A lot of places will imply that now you can do all these great things, but what MCP actually is, is just an API adapter that makes it so that you don’t have to write the API adapter. So, for example, take SharePoint. If I wanted a large language model to interact with SharePoint, MCP effectively tells the LLM how to use my API, but SharePoint isn’t magically more powerful because of that, and I can’t do more things than I could before. It’s just that the work of writing that adapter has been done for me.”
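Under the hood, that “telling the LLM how to use my API” happens over a JSON-RPC exchange: the client first asks the server which tools it exposes, then calls one by name. The message shapes below follow the MCP specification as we understand it; the search tool and its arguments are invented for illustration.

```python
# Sketch of the MCP wire exchange as Python dicts (JSON-RPC 2.0 messages).
# The tool name and arguments are hypothetical.
list_request = {
    "jsonrpc": "2.0", "id": 1,
    "method": "tools/list",            # client asks: what can you do?
}

list_response = {
    "jsonrpc": "2.0", "id": 1,
    "result": {
        "tools": [{
            "name": "search_documents",                 # hypothetical tool
            "description": "Search the document library",
            "inputSchema": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        }]
    },
}

call_request = {
    "jsonrpc": "2.0", "id": 2,
    "method": "tools/call",            # client invokes the tool it discovered
    "params": {"name": "search_documents", "arguments": {"query": "NDA template"}},
}
```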
In terms of search, Kilcher says: “What it can do is use the regular search that the system exposes to read the 10 search results and summarise them for you, or something like this.”
It’s important to note that much like an API, developers choose which functionalities to expose.
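On the client side, a chat application or agent only ever sees the tools the server has chosen to expose. A minimal sketch, assuming the official MCP Python SDK and a hypothetical search server launched over stdio, might look like this:

```python
# Sketch of an MCP client, assuming the official Python SDK.
# The server command, tool name and query are hypothetical.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="python", args=["search_server.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()            # only what the server exposes
            print([t.name for t in tools.tools])
            result = await session.call_tool(
                "search_documents", {"query": "engagement letter"}
            )
            print(result.content)  # the search hits an LLM would then summarise

asyncio.run(main())
```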
In terms of who has to care about this, Kilcher says: “I don’t think someone at CTO level necessarily needs to know about it, because if their strategy is ‘I’m going to connect these two things on an application level’ then nobody cares whether that’s done via MCP or via API. What can change is how you make users interact with enterprise search.”
LLMs work in tandem with, not in place of, search engines, but MCP may change your requirements of them. Kilcher says: “Whereas before you might have been looking for something that has a really good UI and you might have been looking for an enterprise search engine that itself has a good chat interface, what you can do now is you can effectively say ‘we can use our own chat.’
“Communication and the UI are taken care of, so it’s more a case of ‘what capabilities do you have?’ Because that’s what matters.”