MCP and the innovation paradox: Why open standards will save AI from itself

The next wave of AI innovation will not come from bigger models. The real disruption is deeper: standardization.

Launched by Anthropic in November 2024, the Model Context Protocol (MCP) standardizes how AI applications interact with the world beyond their training data. Just as HTTP and REST standardized how web applications connect to services, MCP standardizes how AI models connect to tools.

You may have read a dozen articles explaining what MCP is. But what most of them miss is the boring (and powerful) part: MCP is a standard. Standards don't just organize technology; they create growth flywheels. Adopt them early, and you ride the wave. Ignore them, and you fall behind. This article explains why MCP matters now, what challenges it introduces and how it is already reshaping the ecosystem.

How MCP moves us from chaos to context

Meet Lily, a product manager at a cloud infrastructure company. She juggles projects across half a dozen tools: Jira, Figma, GitHub, Slack, Gmail and Confluence. Like many, she is drowning in updates.

By 2024, Lily had seen how good large language models (LLMs) had become at synthesizing information. She spotted an opportunity: if she could feed all of her team's tools into a model, she could automate updates, draft communications and answer questions on demand. But every model had its own custom way of connecting to services, and each integration pulled her deeper into a single vendor's platform. When she needed to pull transcripts from Gong, it meant building yet another bespoke connection, making it even harder to switch to a better LLM later.

Then Anthropic launched MCP: an open protocol for standardizing how context flows to LLMs. MCP quickly picked up support from OpenAI, AWS, Azure and Microsoft Copilot Studio, with Google soon to follow. Official SDKs are available for Python, TypeScript, Java, C#, Rust, Kotlin and Swift; community SDKs for Go and other languages followed. Adoption was swift.

Today, Lily runs everything through Claude, connected to her work applications via a local MCP server. Status reports draft themselves. Leadership updates are one prompt away. As new models arrive, she can swap them in without losing any of her integrations. When she writes code on the side, she uses Cursor with a model from OpenAI and the same MCP server she uses in Claude. Her IDE already understands the product she is building. MCP made this easy.
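To make the mechanics concrete, here is a minimal sketch of the request/response shape an MCP server deals with. MCP is built on JSON-RPC 2.0; the `tools/list` and `tools/call` methods below follow the protocol's naming, but the tool, its schema and the stubbed response are hypothetical, and a real server would be built with one of the official SDKs rather than hand-rolled dispatch.

```python
import json

# Hypothetical tool table: one tool a Jira-facing MCP server might expose.
TOOLS = [
    {
        "name": "get_ticket_status",
        "description": "Return the status of a Jira ticket by its key.",
        "inputSchema": {
            "type": "object",
            "properties": {"key": {"type": "string"}},
            "required": ["key"],
        },
    }
]

def handle(request: dict) -> dict:
    """Dispatch a JSON-RPC 2.0 request the way an MCP server would."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        args = request["params"]["arguments"]
        # Stubbed lookup; a real server would call the Jira API here.
        result = {"content": [{"type": "text",
                               "text": f"{args['key']}: In Progress"}]}
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

resp = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}})
print(json.dumps(resp["result"]["tools"][0]["name"]))
```

Because every client speaks this same wire format, Lily's Jira server works unchanged whether the client is Claude, Cursor or something that doesn't exist yet.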

The power and effects of the standard

Lily’s story shows a simple truth: nobody likes using fragmented tools. No user wants to be locked into a vendor. And no company wants to rewrite its integrations every time it changes models. People want the freedom to use the best tools. MCP delivers that.

But standards come with consequences.

First, SaaS providers without strong public APIs are vulnerable to obsolescence. MCP tools depend on those APIs, and customers will demand support for their AI applications. With a de facto standard emerging, there are no excuses.

Second, AI application development cycles are about to accelerate dramatically. Developers no longer need to write custom code to test simple AI applications. Instead, they can integrate MCP servers with readily available MCP clients, such as Claude Desktop, Cursor and Windsurf.
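In practice, wiring a server into a desktop client is usually a matter of a small configuration entry rather than code. The snippet below follows the general `mcpServers` shape used by Claude Desktop's configuration file; the server name, command and path are hypothetical placeholders.

```json
{
  "mcpServers": {
    "jira-tools": {
      "command": "python",
      "args": ["/path/to/jira_mcp_server.py"]
    }
  }
}
```

Once the client restarts, the server's tools show up alongside the model with no integration code written on the client side.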

Third, switching costs collapse. Because integrations are decoupled from specific models, organizations can migrate from Claude to OpenAI to Gemini, or mix models, without rebuilding infrastructure. Future LLM providers will benefit from the existing ecosystem around MCP, letting them focus on better price-performance.

Navigating Challenges with MCP

Every standard introduces new friction points or leaves existing ones unresolved. MCP is no exception.

Trust is critical: Dozens of MCP registries have appeared, offering thousands of community-maintained servers. But if you don't control the server, or trust the party that does, you risk leaking secrets to an unknown third party. If you are a SaaS company, provide official servers. If you are a developer, seek out official servers.

Quality is variable: APIs evolve, and poorly maintained MCP servers can easily fall out of sync. LLMs rely on high-quality metadata to decide which tools to use. There is no authoritative MCP registry yet, which intensifies the need for official servers from trusted parties. If you are a SaaS company, maintain your servers as your APIs evolve. If you are a developer, seek out official servers.

Large MCP servers increase costs and lower utility: Packing too many tools into one server raises costs by consuming tokens and overwhelms models with too much choice. LLMs are easily confused when they have access to too many tools. It is the worst of both worlds. Smaller, task-focused servers will matter. Keep this in mind as you build and distribute servers.

Permission and identity challenges persist: These problems existed before MCP, and they still exist with MCP. Imagine Lily gives Claude permission to send emails and issues a well-meaning instruction such as: “Quickly send Chris a status update.” Instead of emailing her boss, Chris, the LLM emails every Chris in her contact list to make sure Chris gets the message. Humans will need to stay in the loop for high-judgment actions.

Looking forward

MCP is not hype: it is a foundational shift in the infrastructure of AI applications.

And, like every well-adopted standard before it, MCP is creating a self-reinforcing flywheel: every new server, every new integration, every new application compounds the momentum.

New tools, platforms and registries are already emerging to simplify building, testing, deploying and discovering MCP servers. As the ecosystem evolves, AI applications will offer simple interfaces to plug into new capabilities. Teams that embrace the protocol will ship products faster with better integration stories. Companies offering public APIs and official MCP servers can be part of that integration story. Late adopters will have to fight for relevance.

Noah Schwartz is head of product at Postman.
