MCPs are APIs for LLMs

March 20, 2025

Released by Anthropic last November, the Model Context Protocol is described as “a new standard for connecting AI assistants to the systems where data lives, including content repositories, business tools, and development environments.” But even that description is a bit jargony. Here’s the simple version:

An MCP server exposes a bunch of endpoints, like any other API server, but it must also expose endpoints that list all the available functions on the server in a standard way an MCP client can understand.
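That discovery step can be sketched as follows. This is a minimal illustration, assuming the MCP spec's JSON-RPC 2.0 framing and its `tools/list` method; the `get_weather` tool and its schema are hypothetical examples, not part of any real server.

```python
# A sketch of MCP tool discovery, assuming JSON-RPC 2.0 framing
# and the spec's "tools/list" method. The tool shown is hypothetical.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# A server might respond with something like this: every tool has a
# name, a human-readable description, and a JSON Schema for its input.
list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",  # hypothetical example tool
                "description": "Look up the current weather for a city",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# The client now knows every tool's name and argument schema.
for tool in list_response["result"]["tools"]:
    print(tool["name"], "-", tool["description"])
```

The key point is that the listing format is standardized, so any MCP client can parse any MCP server's tool catalog without custom integration code.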

MCP clients (usually LLM-powered), like Anthropic’s Claude Desktop, can then connect to MCP servers and immediately know what tools are available for them to use. LLMs connected via MCP can call those tools using the specs the server provides.
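The call step looks similar. Again, this is a minimal sketch assuming the spec's JSON-RPC `tools/call` method; the `get_weather` tool, its arguments, and the response text are all hypothetical.

```python
# A sketch of an MCP tool call, assuming JSON-RPC 2.0 framing and the
# spec's "tools/call" method. Tool name and arguments are hypothetical.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Paris"},
    },
}

# The server runs the tool and returns its output as content blocks,
# which the client hands back to the LLM as the tool result.
call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "content": [{"type": "text", "text": "18°C and sunny"}]
    },
}

print(call_response["result"]["content"][0]["text"])
```

Because the arguments must match the `inputSchema` advertised during discovery, the LLM can construct valid calls for tools it has never seen before.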

That’s it! It’s incredibly simple: a standard to enable the Web 2.0 era for LLM applications, giving models plug-and-play access to tools, data, and prompt libraries.

Source: MCPs are APIs for LLMs | Drew Breunig

Suddenly, talk of MCP is everywhere in AI Engineering circles. Drew Breunig has this straightforward introduction.