
🤖 Using AI & AI Agents

Chapter 20 of 24

βš™οΈ Chapter 20: MCP: Client, Server, Host, Resources & Tools

Client, server, host; exposing resources and tools

MCP has three main roles. The client is the app that hosts the LLM (Cursor, Claude Desktop, your agent). It connects to one or more MCP servers, discovers their tools and resources, and passes that list to the model. When the model asks to run a tool or read a resource, the client routes the request to the right server and returns the result to the model.

The server is the process that exposes capabilities: it implements the MCP protocol and registers tools (callable actions with a name and parameters) and resources (read-only content addressed by URI). The host is where the server runs: local (stdio) or remote (e.g. SSE).

Resources are read-only (files, docs, DB rows); tools can have side effects (search, run code, call APIs). The client discovers everything at connection time, so the LLM can use tools and resources by name without knowing which server implements them.
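The role split above can be sketched in plain Python. This is a toy model, not the real MCP SDK: the class and method names are illustrative, but the shape matches the text: a server registers tools (callable, may have side effects) and resources (read-only by URI), and the client discovers and calls them by name.

```python
# Toy sketch of the MCP role split (illustrative, not the real SDK).

class ToyServer:
    """Exposes capabilities: callable tools and read-only resources."""

    def __init__(self):
        self.tools = {}       # name -> function (may have side effects)
        self.resources = {}   # URI -> read-only content

    def tool(self, name):
        """Decorator that registers a function as a named tool."""
        def register(fn):
            self.tools[name] = fn
            return fn
        return register

    def list_tools(self):
        return list(self.tools)

    def call_tool(self, name, **kwargs):
        return self.tools[name](**kwargs)

    def read_resource(self, uri):
        return self.resources[uri]


server = ToyServer()
server.resources["file:///notes.txt"] = "MCP notes"

@server.tool("search_flights")
def search_flights(destination):
    return [f"Flight to {destination}"]

# The client discovers capabilities at connection time, then routes
# the model's requests to the right server by name.
print(server.list_tools())                          # ['search_flights']
print(server.call_tool("search_flights", destination="Tokyo"))
print(server.read_resource("file:///notes.txt"))    # MCP notes
```

The key point the sketch shows: the model only ever sees names and parameters; the client does the routing, and the server does the work.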

MCP conversation flow

1. User asks in the app (e.g. Cursor): "Search for flights to Tokyo"
2. The app (MCP client) already has the tool list from the connected MCP server(s)
3. The app sends the user message + tool list to the LLM
4. The LLM returns: call tool search_flights(destination: Tokyo)
5. The app sends the tool call to the MCP server → the server runs it (e.g. calls an airline API) → returns the result
6. The app sends the observation back to the LLM; the LLM may call more tools or give a final answer
7. The user sees: "Here are 3 flights to Tokyo…"
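The steps above form a loop on the client side. Here is a minimal sketch of that loop with toy stand-ins for the LLM and the MCP server (both function names and message shapes are illustrative, not a real API):

```python
# Client-side loop from the flow above, with fake LLM and server.

def fake_llm(messages, tools):
    """Stand-in LLM: asks for a tool first, answers once it sees a result."""
    tool_msgs = [m for m in messages if m["role"] == "tool"]
    if tool_msgs:
        results = tool_msgs[-1]["content"]
        return {"type": "answer",
                "text": f"Here are {len(results)} flights to Tokyo."}
    return {"type": "tool_call", "name": "search_flights",
            "args": {"destination": "Tokyo"}}

def mcp_server_call(name, args):
    """Stand-in MCP server: runs the tool (e.g. calls an airline API)."""
    assert name == "search_flights"
    return [f"Flight {i} to {args['destination']}" for i in range(1, 4)]

TOOLS = ["search_flights"]   # discovered once, at connection time (step 2)

messages = [{"role": "user", "content": "Search for flights to Tokyo"}]
while True:
    reply = fake_llm(messages, TOOLS)        # step 3: message + tool list
    if reply["type"] == "answer":            # step 7: final answer
        print(reply["text"])
        break
    result = mcp_server_call(reply["name"], reply["args"])   # step 5
    messages.append({"role": "tool", "content": result})     # step 6
```

Real clients do the same thing: loop until the model stops asking for tools, feeding each observation back in.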

MCP: Client, Server, Host

  • Client: The app that has the LLM (e.g. Cursor, Claude Desktop, your chatbot). It connects to one or more MCP servers, discovers their tools/resources/prompts, and passes them to the model. When the model wants to call a tool or read a resource, the client sends the request to the right server.
  • Server: The process that exposes capabilities. It implements the MCP protocol: list tools, list resources, run a tool, read a resource, etc. One server can expose many tools and resources (e.g. a "filesystem" server with read_file, write_file, list_dir).
  • Host: The machine or environment where the server runs. It can be local (same machine as the client) or remote. The connection is typically stdio (local) or SSE (network). (Note: the MCP specification itself uses "host" for the application that embeds the client; this chapter uses it more loosely for where the server runs.)
Client (App)
↔ MCP protocol ↔
Server (Tools + Resources)
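In practice, the client learns about servers from a configuration entry. A minimal sketch of what registering a local stdio server can look like, with field names modeled on a common desktop-client config (check your client's documentation; the exact keys vary by app):

```python
# Sketch of a client-side server registration (field names assumed
# from a common desktop-client config format; verify against your app).
import json

config = {
    "mcpServers": {
        "filesystem": {
            # Local server launched as a child process, spoken to over stdio.
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
    }
}
print(json.dumps(config, indent=2))
```

The client starts the listed command, speaks MCP over the process's stdin/stdout, and discovers its tools and resources; a remote server would be reached over the network (e.g. SSE) instead.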

Resources = read-only content (e.g. file contents, doc by URI). Tools = callable actions (e.g. search, run script). The server advertises both; the client lets the LLM use them.
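For the resource side, the wire exchange is a JSON-RPC 2.0 request/response pair. A hedged sketch, with the method name taken from the MCP spec and the payloads simplified:

```python
# Simplified shape of a resource read over MCP (JSON-RPC 2.0).
# Payload fields are trimmed; see the MCP spec for the full schema.

request = {
    "jsonrpc": "2.0", "id": 3,
    "method": "resources/read",
    "params": {"uri": "file:///docs/readme.md"},
}
response = {
    "jsonrpc": "2.0", "id": 3,
    "result": {"contents": [
        {"uri": "file:///docs/readme.md", "text": "# Readme"},
    ]},
}

# The client hands the returned text to the model as context.
print(response["result"]["contents"][0]["text"])   # # Readme
```

Note there is no "arguments" field and nothing runs on the server beyond the read: that is what makes it a resource rather than a tool.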

MCP: Model Context Protocol

A standard way for an LLM app (client) to talk to external capabilities (server).

Client (App / IDE)
↔
MCP Server

What the server can expose

  • MCP Server: exposes tools, resources, and prompts to the client
  • Tools: actions the model can invoke (e.g. search, run code)
  • Resources: read-only data (files, docs) the model can read
  • Prompts: pre-defined prompt templates from the server
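Discovery of these capabilities is itself a protocol call. A sketch of roughly what a tools/list exchange looks like (method and field names follow the MCP spec; the payload is simplified, and the example tool is made up):

```python
# Simplified tools/list exchange (JSON-RPC 2.0; payload trimmed).

request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
response = {
    "jsonrpc": "2.0", "id": 1,
    "result": {"tools": [
        {
            "name": "search",                 # hypothetical example tool
            "description": "Web search",
            "inputSchema": {                  # JSON Schema for the arguments
                "type": "object",
                "properties": {"query": {"type": "string"}},
            },
        },
    ]},
}

# The client passes these names, descriptions, and schemas to the LLM.
names = [t["name"] for t in response["result"]["tools"]]
print(names)   # ['search']
```

The inputSchema is what lets the model produce well-formed arguments: the client includes it in the tool list it hands to the LLM.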

Example: Cursor IDE + MCP

When you ask Cursor to "open the docs for React", the app might use an MCP server that exposes a tool.

Client β†’ Server (request)

Call tool: fetch_url with url: "https://react.dev"

Server β†’ Client (response)

Returns content or summary of the page. The app injects this into the LLM context so it can answer using the docs.
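That request/response pair can be sketched as a tools/call exchange. The method name and the content-block result shape follow the MCP spec; the fetch_url tool and the returned text are illustrative:

```python
# Simplified tools/call exchange for the Cursor example (JSON-RPC 2.0).

request = {
    "jsonrpc": "2.0", "id": 7,
    "method": "tools/call",
    "params": {
        "name": "fetch_url",                      # tool from the example
        "arguments": {"url": "https://react.dev"},
    },
}
response = {
    "jsonrpc": "2.0", "id": 7,
    "result": {"content": [
        {"type": "text", "text": "Summary of the React docs page"},
    ]},
}

# The app injects the returned text blocks into the LLM's context.
for block in response["result"]["content"]:
    if block["type"] == "text":
        print(block["text"])
```

Because every server answers in this same shape, the client needs no fetch_url-specific code: it just forwards content blocks to the model.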

MCP standardizes the shape of these requests and responses so any compatible app can use any MCP server (files, DB, search, custom APIs) without custom code per integration.