To expose your own MCP server, you implement the MCP protocol, typically with the official SDK for your language. Your server runs as a process that the client connects to (via stdio for local use, or SSE for remote). You register tools (name, description, parameter schema) so the client can list them and the LLM can request them. You register resources (URIs, and optionally their content) so the client can read them when the model needs context. You can also expose prompts: pre-defined prompt templates the client can list and invoke. When the client sends "run tool X with args Y", your server executes the logic and returns the result; when the client requests a resource by URI, your server returns the content. The client then passes results back to the LLM. This way, any MCP-compatible app (Cursor, Claude Desktop, your own) can use your capabilities without custom integration code.
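Concretely, registering a tool means advertising structured metadata the client can list. A sketch of what a server might declare for one hypothetical tool (the `name`/`description`/`inputSchema` fields follow the MCP tool shape; the tool itself is made up for illustration):

```python
import json

# Hypothetical tool registration: the metadata a server advertises so the
# client (and ultimately the LLM) knows the tool exists and how to call it.
get_weather_tool = {
    "name": "get_weather",              # identifier the model uses in tool calls
    "description": "Fetch current weather for a city.",
    "inputSchema": {                    # JSON Schema describing the arguments
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

print(json.dumps(get_weather_tool, indent=2))
```

The description matters as much as the schema: it is what the LLM reads when deciding whether to call the tool.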
How to expose your MCP server
- Implement an MCP server (e.g. in TypeScript/Python) that speaks the MCP protocol.
- Expose Tools: register callable actions (name, description, parameters schema).
- Expose Resources: register read-only URIs (e.g. file://, custom scheme) and return content when the client requests them.
- Optionally expose Prompts: pre-defined prompt templates the client can list and invoke.
- Run the server (stdio or SSE). The client connects and discovers tools/resources/prompts.
- The client passes the discovered capabilities to the LLM; when the model asks to call a tool or read a resource, the client forwards the request to your server and relays the result back to the model.
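The discover-then-call loop above travels as JSON-RPC 2.0 messages. A simplified sketch of the wire exchange (the `tools/list` and `tools/call` method names come from the MCP spec; payload details are trimmed for illustration):

```python
import json

# Client -> server: discover the available tools.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Client -> server: the model asked to call a tool, so the client forwards it.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_time", "arguments": {}},
}

# Server -> client: a simplified result envelope (a list of content items).
call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "2024-01-01T12:00:00Z"}]},
}

for msg in (list_request, call_request, call_response):
    print(json.dumps(msg))
```

The SDK handles this framing for you; seeing the raw messages just makes clear that "the client sends the request to your server" is an ordinary request/response pair.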
Use the official MCP SDK for your language (e.g. @modelcontextprotocol/sdk for TypeScript). The server can run locally (stdio) or over the network (SSE). Clients like Cursor or Claude Desktop discover capabilities at connection time.
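For a local stdio server, the client typically launches your process from a config entry. A sketch in the Claude Desktop style (the `mcpServers` key is from its config format; the server name and paths are placeholders, and the exact file location and schema vary by client):

```json
{
  "mcpServers": {
    "my-server": {
      "command": "node",
      "args": ["./build/server.js"]
    }
  }
}
```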
Example: Minimal tool exposure
You implement one tool, get_time, which takes no arguments. The server returns the current time. The client discovers it, the LLM calls it when the user asks "What time is it?", and your server responds with the time. The same pattern scales to many tools (search, databases, APIs) and resources (files, docs).
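An SDK-free sketch of the server side of this example: a single dispatch function that answers `tools/list` and `tools/call` for get_time. A real server would use the official SDK and a stdio or SSE transport; this only illustrates the request handling, and the envelope shapes are simplified.

```python
import json
from datetime import datetime, timezone

# The one tool this server advertises (MCP-style metadata, simplified).
TOOLS = [{
    "name": "get_time",
    "description": "Return the current UTC time.",
    "inputSchema": {"type": "object", "properties": {}},
}]

def handle(request: dict) -> dict:
    """Dispatch a single JSON-RPC request to the matching handler."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif (request["method"] == "tools/call"
          and request["params"]["name"] == "get_time"):
        now = datetime.now(timezone.utc).isoformat()
        result = {"content": [{"type": "text", "text": now}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# Example round trip: the client asks for the time, the server answers.
response = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/call",
                   "params": {"name": "get_time", "arguments": {}}})
print(json.dumps(response))
```

Adding a second tool is just another entry in `TOOLS` plus another branch (or a handler table) in `handle`, which is exactly how the pattern scales.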