By Connie · This article contains affiliate links. We may earn a commission at no extra cost to you if you sign up through our links.
- MCP crossed 97 million monthly SDK downloads on March 25, 2026 — the fastest adoption of any AI infrastructure standard in history.
- Anthropic donated the protocol to the Linux Foundation's Agentic AI Foundation (AAIF) in December 2025. OpenAI, Google, Microsoft, Amazon are founding members.
- 10,000+ active public MCP servers for GitHub, Slack, PostgreSQL, AWS, and thousands of other integrations.
- MCP is now the universal standard for AI agent tool use — supported by ChatGPT, Claude, Cursor, Gemini, Microsoft Copilot, and VS Code.
- Think of MCP as USB for AI agents: one connector that works with everything, so you never write custom integration code again.
On March 25, 2026, Anthropic's Model Context Protocol crossed 97 million monthly SDK downloads. For context: npm, the package manager at the heart of modern web development, took years to reach comparable download volume. MCP did it in 16 months from its November 2024 launch, making it the fastest-adopted AI infrastructure standard to date.
This isn't just a developer milestone. MCP is the reason AI agents in 2026 can connect to your company's tools, databases, and services without a team of engineers writing custom glue code. It's the plumbing that makes agentic AI practical at scale — and understanding it is the difference between knowing AI as a chatbot and knowing AI as a system.
What Is Model Context Protocol — The Simple Explanation
The core problem before MCP: every AI agent needed custom code to talk to every tool. If you wanted Claude to read your GitHub issues and update your Slack channel, an engineer had to write a custom GitHub connector and a custom Slack connector. Multiply that by thousands of tools and hundreds of AI platforms, and you have a combinatorial explosion of integration work.
MCP solves this with a single standard. Any AI client (ChatGPT, Claude, Cursor, your custom agent) can talk to any MCP server (GitHub, Slack, PostgreSQL, your internal database). One protocol. Any-to-any connectivity.
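The scale of that combinatorial explosion is easy to quantify: with M platforms and N tools, bespoke integration needs M×N connectors, while a shared protocol needs only M clients plus N servers. A toy calculation (the platform and tool counts below are illustrative, not figures from this article):

```python
def custom_connectors(platforms: int, tools: int) -> int:
    """Without a standard, every platform needs its own connector per tool."""
    return platforms * tools

def mcp_implementations(platforms: int, tools: int) -> int:
    """With MCP, each platform ships one client and each tool one server."""
    return platforms + tools

# 100 AI platforms and 1,000 tools:
print(custom_connectors(100, 1000))    # bespoke: 100000 integrations
print(mcp_implementations(100, 1000))  # shared protocol: 1100 implementations
```

The integration work drops from quadratic to linear, which is why a single standard becomes more valuable the larger the ecosystem grows.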
MCP is to AI agents what HTTP is to the web — a universal communication protocol that lets any AI model connect to any external tool or data source through a consistent, standardized interface.
The protocol defines how clients and servers exchange three types of resources:
- Tools: Functions the AI can call (run a database query, create a GitHub issue, send a Slack message)
- Resources: Data the AI can read (file contents, database records, API responses)
- Prompts: Reusable instruction templates that server owners define for common workflows
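On the wire, MCP messages are JSON-RPC 2.0. The published specification defines methods such as `tools/list`, `tools/call`, `resources/read`, and `prompts/get`. The sketch below shows the shape of a tool call; the tool name (`create_issue`) and its arguments are hypothetical examples, not from any specific server:

```python
import json

# A client asks a server to run a tool via the "tools/call" method.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_issue",
        "arguments": {"title": "Fix login bug", "repo": "acme/webapp"},
    },
}

# The server replies with content blocks the model can read.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "Created issue #42 in acme/webapp"}]
    },
}

# Requests and responses are matched by id, as in any JSON-RPC system.
assert response["id"] == request["id"]
print(json.dumps(request, indent=2))
```

Resources and prompts follow the same request/response pattern, just with different method names and payloads.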
The 97 Million Number: What It Actually Means
The 97 million figure counts monthly SDK downloads across Python and TypeScript — the two official MCP SDKs. These aren't unique users; a single developer downloading the SDK for CI/CD counts as a download every month. But the growth trajectory tells the real story.
| Milestone | Date | Monthly Downloads |
|---|---|---|
| MCP launch | November 2024 | ~100K |
| First major ecosystem signal | February 2025 | ~5M |
| Donated to Linux Foundation (AAIF) | December 2025 | ~40M |
| 10,000 active public servers | January 2026 | ~65M |
| 97M milestone | March 25, 2026 | 97M+ |
For comparison: Kubernetes took 3 years to reach similar developer mindshare. Docker took 2. MCP's 16-month trajectory from experimental spec to universal standard reflects how quickly the AI agent ecosystem is moving.
The Agentic AI Foundation: Who's Running MCP Now
In December 2025, Anthropic donated MCP to the Linux Foundation, which established the Agentic AI Foundation (AAIF) as a directed fund to provide neutral governance. Founding members include OpenAI, Google, Microsoft, and Amazon, spanning every major AI and cloud platform.
The governance model mirrors how Linux Foundation manages Kubernetes: strategic investments and budget oversight go through AAIF, while day-to-day technical direction stays with the MCP maintainers. This gives the protocol the credibility of neutral stewardship while preserving technical agility.
Also donated to AAIF: goose (Block's open-source local-first AI agent framework built on MCP) and AGENTS.md (the emerging standard for defining AI agent behaviors in repositories — think README.md, but for AI agents).
What You Can Do With MCP Today
With 10,000+ active public MCP servers, the protocol has moved well beyond developer experimentation. Here are the most popular categories:
| Category | Key MCP Servers | What Your AI Agent Can Do |
|---|---|---|
| Developer Tools | GitHub, GitLab, Jira, Linear | Create PRs, file issues, review code, update sprint boards |
| Databases | PostgreSQL, MySQL, SQLite, Supabase | Query, insert, and update records in natural language |
| Communication | Slack, Gmail, Outlook, Notion | Read channels, draft replies, create docs, search messages |
| Cloud Infrastructure | AWS, GCP, Azure, Cloudflare | Provision resources, check logs, manage deployments |
| Search & Research | Brave Search, Perplexity, arXiv | Retrieve live web results, academic papers, news |
| Productivity | Google Calendar, Airtable, Zapier | Schedule meetings, update records, trigger workflows |
| File Systems | Local files, Google Drive, Dropbox | Read, write, search, and organize files |
| Custom Internal Tools | Any API with MCP wrapper | Your own databases, internal APIs, proprietary data |
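Connecting a client to one of these servers is usually a short config entry rather than code. Claude Desktop, for example, reads an `mcpServers` map from its JSON configuration file. The sketch below wires up the official filesystem server; the package name and directory path follow the public MCP quickstart docs, but verify both against current documentation before relying on them:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/projects"]
    }
  }
}
```

Each entry tells the client how to launch (or reach) a server; the client then discovers the server's tools, resources, and prompts automatically over the protocol.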
Which AI Platforms Support MCP
MCP support has become table stakes for major AI platforms in 2026. Here's the current landscape:
| Platform | MCP Support | Notes |
|---|---|---|
| Claude (Anthropic) | Native, full spec | Original creator. Best-in-class MCP implementation; Claude Desktop has built-in server management |
| ChatGPT (OpenAI) | Full support (Dec 2025) | Evolved from ChatGPT plugins; GPT-5.4 has first-class MCP tool calling |
| Cursor | Native, full spec | MCP is central to Cursor's agentic coding workflow; 10M+ developer users |
| Gemini (Google) | Full support | Integrated into Gemini 3.1 Pro and Cloud Vertex AI; managed remote MCP servers launched March 2026 |
| Microsoft Copilot | Full support | Azure-managed MCP servers for enterprise and internal tools |
| VS Code | Built-in (1.97+) | Native MCP server management; install and configure from settings UI |
| Windsurf (Codeium) | Full support | AI code editor; tight MCP integration for agentic coding tasks |
Why MCP Reaching 97M Installs Changes Everything
The 97M milestone isn't just a vanity metric. It represents a threshold: MCP has achieved the network effects needed to become self-sustaining as a standard. Here's why that matters:
- It locks in the protocol, not the model. Before MCP, switching AI providers meant rebuilding all your integrations. With MCP, your tools work with any model. Vendor lock-in is broken at the integration layer.
- It sets the floor for agent capability. Any AI model that doesn't support MCP is a less capable agent. Every competitor must now support the standard to stay competitive.
- It enables the enterprise AI market to scale. Enterprise AI adoption was bottlenecked by integration complexity. MCP removes that bottleneck — any enterprise tool with an MCP server is instantly available to any AI agent.
- It shifts AI competition from integration to reasoning. When every AI can connect to the same tools, differentiation shifts entirely to how well the model reasons over tool outputs. This is why model capability benchmarks are becoming more important, not less.
Kubernetes became the universal container orchestration standard not because it was technically perfect, but because it achieved critical mass — once major cloud providers adopted it, everyone else had to. MCP is at that same inflection point in 2026. The Linux Foundation donation, AAIF founding members, and 97M downloads together represent "critical mass." The standard is now set.
What MCP Means if You're Not a Developer
If you use AI tools daily but don't write code, MCP still directly affects your experience. Every AI assistant that can connect to your calendar, your email, your project management tool, your databases — that capability comes from MCP. The difference between an AI assistant that's just a chatbot and an AI assistant that can actually do things in your digital environment is, largely, whether it supports MCP.
When you see an AI tool advertised as "agentic," "autonomous," or capable of "using tools," that usually means it supports MCP servers. The 10,000+ public servers cover almost every mainstream tool a knowledge worker uses.
Related Coverage
- OWASP Agentic AI Top 10: the biggest security risks from MCP-powered agents
- Google Ironwood vs NVIDIA: the AI chip race powering MCP-scale workloads
- Meta's 75% AI coding mandate: how MCP-connected agents are replacing human engineers
- Anthropic $380B valuation: the company that built MCP just raised $30B
Sources
- Anthropic: "Donating the Model Context Protocol and establishing the Agentic AI Foundation" (December 9, 2025)
- MCP Blog: "MCP joins the Agentic AI Foundation" (December 9, 2025)
- Linux Foundation: "Linux Foundation Announces the Formation of the Agentic AI Foundation (AAIF)" (December 9, 2025)
- AI Unfiltered / Artur Markus: "Anthropic's Model Context Protocol Hits 97 Million Installs on March 25" (March 28, 2026)
- GitHub Blog: "MCP joins the Linux Foundation" (December 9, 2025)
- modelcontextprotocol.io: Protocol specification and statistics (March 2026)