
MCP: How the 'USB-C for AI' Became the Standard Nobody Expected

Model Context Protocol went from Anthropic's November 2024 release to 97 million monthly SDK downloads and adoption by ChatGPT, Gemini, and VS Code. Technical deep-dive and getting started guide.

Tags: mcp, model-context-protocol, ai-tools, anthropic, developer-tools, artificial-intelligence

In November 2024, Anthropic quietly released the Model Context Protocol (MCP)—an open standard for connecting AI models to external tools and data. Fourteen months later, it has become the de facto standard for AI integrations, with 97 million+ monthly SDK downloads, 10,000+ active servers, and adoption by ChatGPT, Gemini, Microsoft Copilot, and VS Code. In December 2025, Anthropic donated MCP to the Linux Foundation, cementing its status as an industry standard. Here's everything you need to know.


What MCP Actually Does

Think of MCP as USB-C for AI—a universal connector that lets any AI model talk to any tool or data source.

The Problem MCP Solves

Before MCP, connecting AI to tools meant:

  • Custom integrations for each model × each tool
  • N models × M tools = N×M integrations
  • Inconsistent interfaces, duplicated effort

With MCP:

  • Each tool implements MCP once
  • Each model supports MCP once
  • Everything connects to everything

With ten models and fifty tools, that is roughly 60 MCP implementations (10 + 50) instead of 500 bespoke integrations (10 × 50).

The Architecture

text
┌─────────────┐     MCP Protocol     ┌─────────────┐
│   AI Host   │◄────────────────────►│  MCP Server │
│  (Claude,   │                      │  (Tool/DB/  │
│   ChatGPT)  │                      │   Service)  │
└─────────────┘                      └─────────────┘
       │                                    │
       │       Standardized JSON-RPC        │
       │      over stdio/SSE/WebSocket      │
       ▼                                    ▼
   Any AI App                          Any Data Source
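
On the wire, every MCP exchange is a JSON-RPC 2.0 message. The sketch below is a rough illustration of the handshake and tool discovery from the host's side, assuming a stdio server launched with npx, newline-delimited framing, and the 2024-11-05 protocol revision; the filesystem server and /tmp path are just placeholders.

python
import json
import subprocess

# Launch an MCP server as a child process speaking JSON-RPC over stdio.
proc = subprocess.Popen(
    ["npx", "-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def send(message: dict) -> None:
    # The stdio transport frames each JSON-RPC message as one line of JSON.
    proc.stdin.write(json.dumps(message) + "\n")
    proc.stdin.flush()

# 1. Handshake: the host announces its protocol version and capabilities.
send({
    "jsonrpc": "2.0", "id": 1, "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "demo-host", "version": "0.1"},
    },
})
print(proc.stdout.readline())  # server's initialize result

send({"jsonrpc": "2.0", "method": "notifications/initialized"})

# 2. Discovery: ask the server what tools it exposes.
send({"jsonrpc": "2.0", "id": 2, "method": "tools/list"})
print(proc.stdout.readline())  # tool names plus their input schemas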

The Adoption Explosion

Timeline

| Date | Milestone |
|------|-----------|
| Nov 2024 | Anthropic releases MCP |
| Jan 2025 | First 1,000 community servers |
| Mar 2025 | OpenAI adds MCP support |
| Jun 2025 | Google Gemini integration |
| Aug 2025 | Microsoft Copilot adoption |
| Oct 2025 | VS Code native support |
| Dec 2025 | Donated to Linux Foundation |
| Jan 2026 | 10,000+ servers, 97M+ downloads |

Who's Using It

| Company | Integration |
|---------|-------------|
| Anthropic | Claude Desktop, Claude.ai |
| OpenAI | ChatGPT, Codex CLI |
| Google | Gemini, AI Studio |
| Microsoft | Copilot, VS Code |
| Cursor | Native MCP support |
| JetBrains | IDE integrations |
| Sourcegraph | Cody AI |

By some industry estimates, 90% of organizations will be using MCP by the end of 2026.

Core Concepts

Resources

Resources are data exposed to AI models. Examples:

  • File contents
  • Database records
  • API responses
  • Screenshots

json
{
  "uri": "file:///path/to/document.md",
  "name": "Project README",
  "mimeType": "text/markdown"
}

Tools

Tools are actions the AI can take. Examples:

  • Run shell commands
  • Query databases
  • Send messages
  • Create files

json
{
  "name": "run_query",
  "description": "Execute a SQL query",
  "inputSchema": {
    "type": "object",
    "properties": {
      "query": {"type": "string"}
    }
  }
}
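
When a model invokes this tool, the host sends a tools/call request whose arguments must satisfy the inputSchema above, and the server replies with content blocks the model can read. Here is a sketch of that round trip, written as Python dictionaries; the query value and result text are illustrative.

python
# JSON-RPC request the host sends to invoke the run_query tool
request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "run_query",
        "arguments": {"query": "SELECT count(*) FROM users"},
    },
}

# Typical JSON-RPC result: a list of content blocks plus an error flag
response = {
    "jsonrpc": "2.0",
    "id": 7,
    "result": {
        "content": [{"type": "text", "text": "[{\"count\": 42}]"}],
        "isError": False,
    },
}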

Prompts

Prompts are pre-built templates for common tasks:

json
{
  "name": "explain_code",
  "description": "Explain a code snippet",
  "arguments": [
    {"name": "code", "required": true}
  ]
}

Building Your First MCP Server

Python Example

python
from mcp import Server, Resource, Tool

server = Server("my-server")

@server.resource("config://app")
async def get_config():
    """Expose application configuration."""
    return Resource(
        uri="config://app",
        name="App Configuration",
        text=open("config.json").read(),
    )


@server.tool("search_logs")
async def search_logs(query: str, limit: int = 100):
    """Search application logs."""
    results = log_search(query, limit)
    return {"matches": results}


if __name__ == "__main__":
    server.run()

TypeScript Example

typescript
import { Server } from "@modelcontextprotocol/sdk";
import * as fs from "node:fs/promises";

const server = new Server({ name: "my-server", version: "1.0.0" });

server.addResource({
  uri: "docs://readme",
  name: "Documentation",
  async read() {
    return { text: await fs.readFile("README.md", "utf-8") };
  },
});

server.addTool({
  name: "deploy",
  description: "Deploy to production",
  inputSchema: {
    type: "object",
    properties: {
      environment: { type: "string", enum: ["staging", "production"] },
    },
  },
  async execute({ environment }) {
    return await deployTo(environment);
  },
});

server.start();


Popular MCP Servers

Official Servers

| Server | Function | Use Case |
|--------|----------|----------|
| @mcp/filesystem | File operations | Local file access |
| @mcp/git | Git operations | Repository management |
| @mcp/postgres | PostgreSQL | Database queries |
| @mcp/sqlite | SQLite | Local databases |
| @mcp/puppeteer | Browser automation | Web scraping |
| @mcp/fetch | HTTP requests | API integration |

Community Favorites

| Server | Function | Stars |
|--------|----------|-------|
| mcp-server-github | GitHub API | 2,400+ |
| mcp-server-notion | Notion integration | 1,800+ |
| mcp-server-slack | Slack operations | 1,500+ |
| mcp-server-linear | Linear issues | 1,200+ |
| mcp-server-stripe | Payment processing | 900+ |

Connecting MCP to Claude

Claude Desktop Configuration

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (the macOS location; on Windows the file lives under %APPDATA%\Claude\):

json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "mcp-server-github"],
      "env": {
        "GITHUB_TOKEN": "your-token-here"
      }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": {
        "DATABASE_URL": "postgresql://..."
      }
    }
  }
}

VS Code Configuration

json
{
  "mcp.servers": {
    "project": {
      "command": "node",
      "args": ["./mcp-server/index.js"]
    }
  }
}

Security Considerations

The Risks

MCP servers can:

  • Read sensitive files
  • Execute arbitrary commands
  • Access databases
  • Make network requests

This is powerful and dangerous.

Best Practices

| Risk | Mitigation |
|------|------------|
| Credential exposure | Use environment variables, not config files |
| Overly broad access | Scope servers to specific directories/resources |
| Untrusted servers | Only use audited, trusted servers |
| Command injection | Validate and sanitize all inputs |
| Data exfiltration | Monitor and log all tool invocations |
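
As a minimal sketch of the "overly broad access" and "command injection" rows above, here is one way a server could scope file reads to a single directory. It reuses the illustrative @server.tool decorator from the Python example earlier; ALLOWED_ROOT and the read_file tool are hypothetical, not part of any official server.

python
from pathlib import Path

# Hypothetical scoped root: everything outside it is refused.
ALLOWED_ROOT = Path("/safe/directory/only").resolve()

def resolve_safely(user_path: str) -> Path:
    """Resolve a user-supplied path, rejecting anything outside ALLOWED_ROOT."""
    candidate = (ALLOWED_ROOT / user_path).resolve()
    # Blocks path-traversal attempts such as "../../etc/passwd"
    if not candidate.is_relative_to(ALLOWED_ROOT):
        raise ValueError(f"Path escapes the allowed directory: {user_path}")
    return candidate

@server.tool("read_file")
async def read_file(relative_path: str):
    """Read a file, but only from within the scoped directory."""
    path = resolve_safely(relative_path)
    return {"content": path.read_text()}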

Server Isolation

json
{
  "mcpServers": {
    "safe-fs": {
      "command": "npx",
      "args": [
        "@modelcontextprotocol/server-filesystem",
        "/safe/directory/only"
      ],
      "sandbox": true
    }
  }
}

Anti-Patterns to Avoid

1. The "God Server"

Don't: Create one server that does everything.
Do: Create focused, single-purpose servers.

2. Leaking Secrets

Don't: Hardcode tokens in server code.
Do: Use environment variables or secret managers.

3. Unrestricted File Access

Don't: server-filesystem /
Do: server-filesystem /project/specific/path

4. No Input Validation

Don't: Pass user input directly to shell commands.
Do: Validate, sanitize, and constrain all inputs (see the sketch below).

5. Missing Error Handling

Don't: Let errors propagate uncaught.
Do: Handle errors gracefully, provide useful messages.
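
A hedged sketch tying anti-patterns 4 and 5 together, again using the illustrative @server.tool decorator from the earlier Python example; the git_command tool and its allow-list are hypothetical.

python
import asyncio

# Allow-list specific subcommands instead of handing raw input to a shell.
ALLOWED_SUBCOMMANDS = {"log", "status", "diff"}

@server.tool("git_command")
async def git_command(subcommand: str, args: list[str] | None = None):
    """Run a small, allow-listed subset of git commands."""
    if subcommand not in ALLOWED_SUBCOMMANDS:
        # Graceful, useful error instead of an uncaught exception
        return {"error": f"Subcommand not permitted: {subcommand!r}"}
    try:
        # exec-style invocation: arguments are never interpreted by a shell,
        # so metacharacters like ';' or '&&' cannot inject extra commands.
        proc = await asyncio.create_subprocess_exec(
            "git", subcommand, *(args or []),
            stdout=asyncio.subprocess.PIPE,
            stderr=asyncio.subprocess.PIPE,
        )
        out, err = await proc.communicate()
        if proc.returncode != 0:
            return {"error": err.decode(errors="replace").strip()}
        return {"output": out.decode(errors="replace")}
    except FileNotFoundError:
        return {"error": "git is not installed on this host"}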

The Linux Foundation and Agentic AI Foundation

In December 2025, Anthropic donated MCP to the Linux Foundation, which created the Agentic AI Foundation to govern it.

What This Means

  • Neutral governance: No single company controls MCP
  • Industry collaboration: Competitors cooperate on standards
  • Long-term stability: Foundation ensures continuity
  • Accelerated adoption: Enterprise confidence increases

Foundation Members

  • Anthropic (founding)
  • OpenAI
  • Google
  • Microsoft
  • Amazon
  • Meta
  • Plus 50+ other organizations

Building Production MCP Systems

Observability

python
import logging
import time

from opentelemetry import trace

tracer = trace.get_tracer(__name__)

@server.tool("query_database")
async def query_database(sql: str):
    with tracer.start_as_current_span("mcp_query_database") as span:
        span.set_attribute("sql.query", sql[:100])

        start = time.time()
        result = await db.execute(sql)

        span.set_attribute("result.count", len(result))
        logging.info(f"Query completed in {time.time() - start:.2f}s")

        return result

Rate Limiting

python
from asyncio import Semaphore

rate_limiter = Semaphore(10)  # Max 10 concurrent calls

@server.tool("expensive_operation")
async def expensive_operation(params):
    async with rate_limiter:
        return await do_expensive_thing(params)

Caching

python
import json
from functools import lru_cache

@lru_cache(maxsize=1000)
def cached_lookup(key: str):
    return database.get(key)

@server.resource("data://{key}")
async def get_data(key: str):
    return Resource(
        uri=f"data://{key}",
        text=json.dumps(cached_lookup(key)),
    )


What's Next for MCP

2026 Roadmap

| Feature | Timeline | Impact |
|---------|----------|--------|
| Streaming resources | Q1 2026 | Real-time data feeds |
| Server discovery | Q1 2026 | Automatic connection |
| Permission scopes | Q2 2026 | Granular access control |
| Multi-modal resources | Q2 2026 | Images, audio, video |
| Server federation | Q3 2026 | Distributed architectures |

Emerging Patterns

  • Agent-to-agent communication: MCP as inter-agent protocol
  • Enterprise catalogs: Managed MCP server registries
  • Compliance frameworks: Audit-ready server implementations

Getting Started Checklist

Week 1: Explore

  • [ ] Install Claude Desktop
  • [ ] Configure filesystem server
  • [ ] Try GitHub and/or Notion servers
  • [ ] Read 3 popular server implementations

Week 2: Build

  • [ ] Identify a data source to expose
  • [ ] Create minimal MCP server
  • [ ] Add 2-3 tools
  • [ ] Test with Claude

Week 3: Deploy

  • [ ] Add proper error handling
  • [ ] Implement logging
  • [ ] Security audit
  • [ ] Share with team

Conclusion

MCP's journey from a quietly released Anthropic project to industry standard demonstrates the power of open protocols. By solving the N×M integration problem, MCP has:

  • Enabled tools to work with any AI model
  • Reduced integration effort by orders of magnitude
  • Created a thriving ecosystem of 10,000+ servers
  • Established trust through Linux Foundation governance

For developers, MCP is now table stakes. Understanding how to build and consume MCP servers isn't optional—it's essential for working with AI in 2026 and beyond.

The "USB-C for AI" analogy isn't just marketing. Like USB-C unified charging and data transfer, MCP is unifying how AI connects to the world. Get connected.


Sources:
  • Anthropic MCP Documentation
  • Linux Foundation Agentic AI Foundation
  • Thoughtworks Technology Radar
  • Community server registry (mcp.so)

Written by Vinod Kurien Alex