MCP Documentation and SDK Reference Guide

Where to find the MCP docs, the llms-full.txt trick for learning with AI, and official SDKs for TypeScript, Python, and more.


Where to find the MCP docs that matter

The official MCP documentation lives at modelcontextprotocol.io. It covers the protocol specification, server and client architecture, transport layers, and every primitive type (tools, resources, prompts). But the docs site is only one of six resources you should know about, and most people never find the other five.

Here is the full map of what's available and where to find it.

| Resource | URL | What it gives you |
| --- | --- | --- |
| Official docs site | modelcontextprotocol.io | Protocol spec, architecture guides, integration docs |
| llms-full.txt | modelcontextprotocol.io/llms-full.txt | Entire docs in one text file, built for pasting into AI |
| Quick start guide | modelcontextprotocol.io/quickstart | Step-by-step weather server tutorial |
| TypeScript SDK | github.com/modelcontextprotocol/typescript-sdk | Official SDK for building servers and clients in TS/Node |
| Python SDK | github.com/modelcontextprotocol/python-sdk | Official SDK for Python-based servers |
| Example servers repo | github.com/modelcontextprotocol/servers | Reference implementations and community servers |

The introduction section is the right starting point. It walks through what MCP is, why it exists, and how the client-server architecture works. If you've already read through our earlier guides on what MCP does, the introduction section on the docs site fills in the protocol-level details that sit underneath those concepts.

Once the "what" and "why" make sense, move directly to the quick start guide for server developers. That's where you start building.

The llms-full.txt trick

This is the single most useful thing the MCP docs offer, and almost nobody talks about it. The file at modelcontextprotocol.io/llms-full.txt contains the entire MCP documentation in plain text. One page. Every section, every code example, every specification detail.

The whole point of this file is to paste it into an AI chat and ask questions.

# Copy the entire contents of llms-full.txt, then paste into Claude:

You: [paste full contents of llms-full.txt]

You: Based on this documentation, explain how MCP tool
     definitions work and show me the schema format
     for registering a new tool.

This works because modern AI models handle large context windows well. The full MCP docs fit comfortably inside Claude's context. Once the text is in the conversation, you can ask anything about the protocol, request code examples, or troubleshoot errors at your own pace.
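To give a sense of what that question gets you back: in the MCP spec, a tool definition is a name, a description, and a JSON Schema describing the tool's input. A sketch of that shape, with illustrative values borrowed from the quick start's weather server:

```python
import json

# Illustrative MCP tool definition, following the shape the spec uses for
# entries returned by tools/list: a name, a description, and a JSON Schema
# ("inputSchema") describing the tool's arguments. The "get-forecast"
# fields here are example values, not copied from any real server.
tool_definition = {
    "name": "get-forecast",
    "description": "Get the weather forecast for a location",
    "inputSchema": {
        "type": "object",
        "properties": {
            "latitude": {"type": "number"},
            "longitude": {"type": "number"},
        },
        "required": ["latitude", "longitude"],
    },
}

print(json.dumps(tool_definition, indent=2))
```

The pasted docs let the model produce (and explain) exactly this kind of structure for whatever tool you're designing.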

Three high-value ways to use this:

  • Paste the docs and ask for a plain-language explanation of any concept that feels unclear from reading alone
  • Paste the docs alongside your own server code and ask the model to check whether your implementation follows the spec correctly
  • Paste the docs plus the source code from an example server and ask the model to explain how that server works line by line

> Tip: You don't need to copy-paste manually every time. Save the llms-full.txt file locally. Then you can reference it in Claude Code or attach it in any AI chat interface whenever you need it. The file updates as the docs change, so re-download it periodically.
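If you're calling a model through an API rather than a chat window, the same trick is a few lines of scripting. A minimal sketch, assuming you've saved llms-full.txt locally (`build_prompt` is a hypothetical helper name, and the character budget is a rough stand-in for your model's context limit):

```python
from pathlib import Path


def build_prompt(docs_path: str, question: str, max_chars: int = 800_000) -> str:
    """Prepend the saved llms-full.txt to a question so both can be sent
    as a single prompt. Truncates the docs past a rough character budget."""
    docs = Path(docs_path).read_text(encoding="utf-8")[:max_chars]
    return f"{docs}\n\n---\n\nBased on this documentation: {question}"


# Stand-in file so the sketch is self-contained; use your real download.
Path("llms-full.txt").write_text("MCP documentation...", encoding="utf-8")
prompt = build_prompt("llms-full.txt", "How do MCP tool definitions work?")
print(prompt.splitlines()[-1])
```

The same prompt string works pasted into a chat UI or passed to an API client.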

SDKs for every language that matters

The MCP project maintains official SDKs across multiple languages. Each SDK implements the full MCP specification, so you get the same capabilities regardless of which one you pick. The choice comes down to what language you're already working in.

The TypeScript SDK is the most mature option. It ships server libraries (tools, resources, prompts, transport handlers), client libraries (transport adapters, OAuth helpers), and middleware packages for frameworks like Express and Hono. Adding a tool to your MCP server with the TypeScript SDK takes a few lines of code, and resources and prompts follow the same pattern.

The Python SDK is equally capable and includes its own example servers. If you're already comfortable with Python, it's a perfectly solid choice.

Additional SDKs exist for Swift, Ruby, Rust, and PHP. The full list lives at modelcontextprotocol.io/docs/sdk.
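Whichever SDK you pick, tool registration boils down to the same idea: a registry mapping tool names to handler functions plus their metadata. A dependency-free conceptual sketch of that pattern (this is not the real SDK API, just the shape of what the SDKs do for you):

```python
from typing import Callable

# Toy registry mimicking what an MCP SDK does when you register a tool:
# store the handler plus enough metadata to answer tools/list and to
# dispatch tools/call. Conceptual sketch only, not the SDK's actual API.
TOOLS: dict[str, dict] = {}


def tool(name: str, description: str):
    """Decorator that records a function in the tool registry."""
    def register(fn: Callable) -> Callable:
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register


@tool("get-forecast", "Get the forecast for a city")
def get_forecast(city: str) -> str:
    return f"Forecast for {city}: sunny"  # stub result for illustration


def call_tool(name: str, **arguments):
    """Dispatch a tools/call-style request to the registered handler."""
    return TOOLS[name]["handler"](**arguments)


print(call_tool("get-forecast", city="Berlin"))  # → Forecast for Berlin: sunny
```

The real SDKs add schema validation, transports, and protocol plumbing on top, but the register-then-dispatch core is the mental model to carry into the quick start.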

> Tip: Whichever SDK you choose, check the /examples folder in that repository before writing your first server. Most SDKs include working sample servers that show the exact patterns you need for registering tools, handling requests, and connecting to clients.

The quick start guide and example servers

The quick start at modelcontextprotocol.io/quickstart walks you through building a weather server from scratch. It's the same weather server referenced throughout these guides. The tutorial covers creating the project, setting up your environment, registering tools (get-forecast and get-alerts), and connecting to MCP clients like Claude Desktop and Cursor.

This is the right first project. The weather server is small enough to finish in a single sitting but complete enough to teach you the core patterns: tool registration, request handling, and client configuration. Every MCP server you build afterward follows the same structure.
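The client-configuration step in the quick start comes down to a small JSON entry telling the client how to launch your server. For Claude Desktop it looks roughly like this (the server name, command, and path are placeholders for wherever your build lands; check the quick start for the exact values for your setup):

```json
{
  "mcpServers": {
    "weather": {
      "command": "node",
      "args": ["/absolute/path/to/weather/build/index.js"]
    }
  }
}
```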

After the quick start, your next stop is the example servers repository at github.com/modelcontextprotocol/servers. This repo contains reference implementations that demonstrate more advanced patterns. The filesystem server, for instance, shows how to expose file operations as MCP tools. Other servers demonstrate database access, API integrations, and multi-tool setups.

The source code in these servers is open. You can read it, copy patterns from it, and paste it into an AI conversation alongside the llms-full.txt docs. Ask the model to explain how a specific server works, or have it combine patterns from multiple examples into a starting point for your own.

> Tip: When you find a community server that does something close to what you need, don't start from zero. Copy the relevant source code, paste it into Claude alongside the MCP docs, and describe what you want to change. The model can adapt existing server code to your specific use case faster than you can write it by hand.

Using the docs and AI together to learn faster

The traditional way to learn a new protocol is to read the docs top to bottom, then try to build something, then go back to the docs when you get stuck. That cycle works, but it's slow.

The faster approach is to use the docs and an AI model as a combined learning system. Paste the llms-full.txt into a conversation and treat the model like a technical co-pilot who has memorized the entire specification.

Step 1: Start with the quick start guide. Follow it step by step. Build the weather server. Get it connected to a client. This gives you a working mental model of how the pieces fit together.

Step 2: Paste the full docs into Claude. Ask specific questions about things you noticed during the quick start. "What other transport options exist besides stdio?" or "How do I add a resource alongside my tools?"

Step 3: Read example server source code. Find a server that does something interesting, read through it, and ask the model to explain the parts you don't recognize.

Step 4: Start your own server. When you're ready to build something real, paste the docs and your planned architecture into a conversation. Ask the model to scaffold the server structure before you write any code.

This loop compresses weeks of documentation study into a few focused sessions. The docs are thorough. The AI can interpret them for your specific situation. Used together, they get you from "I've never built an MCP server" to "I have a working custom server" faster than either resource can alone.

> Start here: Open modelcontextprotocol.io/llms-full.txt in your browser. Save the file locally. Paste it into your next Claude conversation and ask one question about something you want to build. That single action will teach you more about MCP in ten minutes than an hour of reading docs on your own.
