
🤖 Using AI & AI Agents

Chapter 19 of 24

⚖️ Chapter 19: MCP vs API

How MCP differs from traditional APIs: discovery, interoperability, and integration

A traditional API is a fixed set of endpoints: you code which URLs to call, what parameters to send, and how to parse responses. Adding a capability usually means a new endpoint or new fields, then updating every client. MCP is built for LLM-driven tools: the client discovers at runtime what tools and resources a server offers (name, description, parameters), and the server can add or change tools without any client code changing. You get a single protocol with a consistent shape (tools + resources) that any LLM app can consume.
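A minimal sketch of the discovery difference, using a simulated in-process server. The `tools/list` method and the tool shape (name, description, input schema) follow MCP's JSON-RPC convention; the server and its tools here are hypothetical:

```python
import json

# Hypothetical in-process "server": handles MCP-style JSON-RPC requests.
SERVER_TOOLS = [
    {
        "name": "search_docs",
        "description": "Full-text search over project documentation.",
        "inputSchema": {"type": "object", "properties": {"query": {"type": "string"}}},
    }
]

def handle_request(raw: str) -> str:
    req = json.loads(raw)
    if req["method"] == "tools/list":
        return json.dumps(
            {"jsonrpc": "2.0", "id": req["id"], "result": {"tools": SERVER_TOOLS}}
        )
    raise ValueError(f"unknown method: {req['method']}")

def discover_tools() -> list[dict]:
    # The client asks what exists instead of hardcoding endpoints.
    request = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
    response = json.loads(handle_request(request))
    return response["result"]["tools"]

print([t["name"] for t in discover_tools()])  # → ['search_docs']

# The server adds a capability; the client code above is unchanged.
SERVER_TOOLS.append({
    "name": "read_file",
    "description": "Read a file from the workspace.",
    "inputSchema": {"type": "object", "properties": {"path": {"type": "string"}}},
})
print([t["name"] for t in discover_tools()])  # → ['search_docs', 'read_file']
```

Note that the second discovery call picks up `read_file` without the client being edited or redeployed — that is the contrast with a hardcoded endpoint list.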

MCP vs traditional API

| Aspect | Traditional API | MCP |
| --- | --- | --- |
| Discovery | You code which endpoints to call; static. | Client asks the server what tools/resources exist; dynamic list at runtime. |
| Integration | Each app writes its own client for each API (auth, request shape, errors). | One MCP client talks to any MCP server; same protocol, same shape. |
| Change | A new endpoint or parameter often means updating app code and redeploying. | Server adds a new tool; clients discover it on next connect; no app code change. |
| LLM use | You manually map API endpoints to tool definitions for the LLM. | Server advertises tools (name, description, params); client passes them to the LLM as-is. |
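The "LLM use" row comes down to a mechanical reshaping: whatever the server advertises is handed to the model unchanged. A sketch, assuming an OpenAI-style function-calling format on the LLM side (the MCP tool dict here is a made-up example):

```python
# Hypothetical tool as advertised by an MCP server via tools/list.
mcp_tool = {
    "name": "search_docs",
    "description": "Full-text search over project documentation.",
    "inputSchema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

def to_llm_tool(tool: dict) -> dict:
    # Pass-through conversion: name, description, and parameter schema
    # go to the LLM as-is; nothing is hand-written per endpoint.
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool["description"],
            "parameters": tool["inputSchema"],
        },
    }

print(to_llm_tool(mcp_tool)["function"]["name"])  # → search_docs
```

With a traditional API, you would write and maintain this tool definition by hand for every endpoint; here it is derived from what the server already advertises.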

Example: When to use which

Use an API when you have a fixed, app-to-app integration and don’t need dynamic tool discovery (e.g. your backend calling a payment provider). Use MCP when you want AI apps and agents to discover and use capabilities at runtime — files, search, DB, your custom tools — without hardcoding each integration.
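For contrast, here is what the fixed, app-to-app case looks like: a hypothetical payment-provider client where the URL, fields, and response shape are baked into the code (the endpoint and field names are invented for illustration):

```python
from urllib.parse import urlencode

# Hypothetical fixed contract: this exact path and these exact fields
# are hardcoded in the client. A new capability means editing this code
# and redeploying — which is fine for a stable backend integration.
CHARGE_ENDPOINT = "https://api.example-payments.com/v1/charges"

def build_charge_request(amount_cents: int, currency: str, token: str) -> dict:
    return {
        "url": CHARGE_ENDPOINT,
        "method": "POST",
        "body": urlencode(
            {"amount": amount_cents, "currency": currency, "source": token}
        ),
    }

req = build_charge_request(1999, "usd", "tok_test")
print(req["method"], req["url"])  # → POST https://api.example-payments.com/v1/charges
```

That rigidity is a feature here: the contract is known at build time and nothing needs to be discovered. It only becomes a cost when many AI clients need many capabilities that change often — which is the case MCP targets.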