
MCP = Function Calling + Standardization (Just for Tools): MCP Client, Part 1

The Equation That Changes Everything

December 20, 2025 · 12 min read

Let me share a simple equation that explains the Model Context Protocol (MCP) revolution: MCP = Function Calling + Standardization

That’s it. If you understand function calling from OpenAI or Anthropic, you already understand 80% of MCP. The remaining 20%? That’s the standardization magic that transforms isolated tools into a universal ecosystem.

While MCP offers other features like resources and prompts for servers, and sampling and elicitation for clients, we’re going to focus exclusively on tools — because tools are where the real revolution happens. With just tools/list and tools/call, you can build anything.

The Before and After: A Revolution in Access

Before MCP: The Developer Bottleneck

Imagine Lina, a marketing manager. She wants Claude Desktop to:

  • Send personalized emails to her clients
  • Update her CRM with conversation notes
  • Schedule follow-up meetings in Google Calendar
  • Generate reports from her analytics dashboard

The Reality: Impossible. Each integration required a developer to write custom code, understand different APIs, handle authentication, and manage errors. Lina would submit a ticket, wait weeks, and get something that half-worked.

Developers weren’t happy either. Every AI integration was unique:

  • OpenAI had its function calling format
  • Anthropic had its tool use specification
  • LangChain had its own tool syntax, and so did CrewAI (and others)
  • Each framework required learning new patterns and syntax for defining and calling tools
  • Nothing was reusable: a tool written for OpenAI couldn’t work with Anthropic, and a LangChain tool couldn’t work with another framework (see the example just after this list)
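
To make that fragmentation concrete, here is the same hypothetical get_weather tool defined twice, once per vendor (sketched as Python dicts; the field names follow each vendor’s published tool format):

# The SAME tool, defined two different ways.
# OpenAI Chat Completions: nested under "function", schema key is "parameters".
openai_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Anthropic Messages API: flat structure, schema key is "input_schema".
anthropic_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

Same tool, same underlying JSON Schema, incompatible envelopes. Multiply that by every framework and every service and you get the pre-MCP maintenance burden.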

After MCP: The Democratization

Same Lina, same needs. But now she adds a few lines to her Claude Desktop configuration:

{
  "mcpServers": {
    "gmail": {"command": "npx", "args": ["@modelcontextprotocol/server-gmail"]}
  }
}

That’s it. No coding. No waiting. She can immediately tell Claude: “Send follow-up emails to everyone who opened our newsletter but didn’t click through.”

The Magic: big companies like Google, Slack, and Notion now provide MCP servers. Lina doesn’t need to understand their APIs; she just connects and uses them (and reads some docs).

For developers? Instead of learning 10 different APIs, they learn one protocol. Instead of maintaining custom integrations, they configure standard servers. Hours of work become minutes of configuration.
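
Scaling up is just more entries in the same file. A sketch of a multi-server setup (the package names here are illustrative; check each server’s docs for the real install command):

{
  "mcpServers": {
    "gmail": {"command": "npx", "args": ["@modelcontextprotocol/server-gmail"]},
    "slack": {"command": "npx", "args": ["@modelcontextprotocol/server-slack"]},
    "github": {"command": "npx", "args": ["@modelcontextprotocol/server-github"]}
  }
}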

The Architecture: Understanding the Symphony

MCP creates a beautiful orchestration between three key players:

The MCP Host: The Conductor

This is your AI application — Claude Desktop, Cursor, or any AI-powered tool. Like a conductor leading an orchestra, the host doesn’t play instruments directly but coordinates all the musicians. When you ask Claude to “email the team and update the spreadsheet,” the host orchestrates multiple servers to fulfill your request.

The MCP Client: The Interpreter

For each service the host wants to talk to, it creates a dedicated interpreter — an MCP client. If your host connects to Gmail, Slack, and GitHub, it creates three separate clients, each maintaining a one-to-one conversation with their assigned server. This ensures clean separation — your Gmail client only talks to Gmail, never mixing signals.

The MCP Server: The Specialist

Each MCP server is an expert in its domain. The Gmail server knows email, the GitHub server knows code, the database server knows data. They wait for requests, execute them expertly, and return results. The beauty? These servers can run anywhere — on your laptop (local) or in the cloud (remote).
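
If it helps to see those relationships in code, here is a deliberately tiny Python sketch (hypothetical class names, not a real SDK):

# Hypothetical sketch of host/client/server wiring, not a real SDK.
# The host owns exactly one client per configured server; clients
# never share connections or mix signals.

class MCPClient:
    """The interpreter: a one-to-one session with a single MCP server."""
    def __init__(self, name, command, args):
        self.name = name
        self.command = command  # how to launch or reach the server
        self.args = args

class MCPHost:
    """The conductor: coordinates one MCPClient per server."""
    def __init__(self, config):
        self.clients = {
            name: MCPClient(name, spec["command"], spec.get("args", []))
            for name, spec in config["mcpServers"].items()
        }

host = MCPHost({"mcpServers": {
    "gmail": {"command": "npx", "args": ["@modelcontextprotocol/server-gmail"]},
    "github": {"command": "npx", "args": ["@modelcontextprotocol/server-github"]},
}})
print(list(host.clients))  # ['gmail', 'github'] -> one dedicated client each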

Our Testing Laboratory: Three Transports, One Protocol (the Hard Way)

Building MCP servers isn’t trivial — especially when you add authentication, security, and production concerns. But here’s the real challenge: if you want to integrate MCP into your own product, you can’t just use Claude Desktop — you need your own client. Your product needs its own way to connect to MCP servers, and that’s where things get interesting.

Client development has lagged behind server development. While there are plenty of MCP servers available, building a client that fits into your specific product requires deep understanding of the protocol. You can’t just embed Claude Desktop into your app — you need to build your own client that matches your product’s needs, UI, and workflow.

That’s why we’re taking a unique approach: instead of jumping into SDKs and frameworks, we’re using the terminal itself as our client. By simulating a client with just curl commands and raw JSON, we’ll understand the exact workflow between client and server. This raw understanding is essential for anyone who wants to integrate MCP into their own products.

Once you grasp this raw communication, building your own client becomes achievable. You’ll know exactly what your client needs to do, how to handle sessions, how to manage multiple servers, and how to integrate it all into your product. (And yes, I’m building a full MCP client from scratch with Python and a beautiful interface — because every product needs its own tailored MCP integration!)

Let me show you exactly how MCP works by walking through our real server conversations using just the terminal as our client. No SDKs, no abstractions, just pure protocol communication. The hard way? Yes. The only way to build your own client? Absolutely.

Transport 1: stdio (Standard Input/Output)

For our weather server, we run it with mcp.run(transport="stdio"). This creates a direct communication channel, like having a conversation with someone sitting next to you.

In our case, we start the server with:

uv run weather.py

(Note: Other servers might use different commands like node server.js or python app.py—the principle remains the same)
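
For reference, here is roughly what weather.py contains, following the official MCP Python SDK (FastMCP) quickstart pattern; the real tool bodies fetch live data, stubbed here for brevity:

# weather.py - a minimal sketch matching the tools we'll discover below.
# Mirrors the official MCP Python SDK (FastMCP) quickstart; the real
# implementations would call a live weather API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
async def get_alerts(state: str) -> str:
    """Get weather alerts for a US state."""
    return "No active alerts for this state."

@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Get weather forecast for a location."""
    return f"Forecast for {latitude},{longitude}: sunny."

if __name__ == "__main__":
    mcp.run(transport="stdio")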

Now, in the same terminal where the server is running, we type our client commands directly. The server listens on standard input, and we become the client by typing JSON messages:

Step 1: Initialize the Connection

{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{},"clientInfo":{"name":"bash-client","version":"1.0"}}}

Our Weather Server Responds:

{"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2025-06-18","capabilities":{"experimental":{},"prompts":{"listChanged":false},"resources":{"subscribe":false,"listChanged":false},"tools":{"listChanged":false}},"serverInfo":{"name":"weather","version":"1.12.4"}}}

The server just introduced itself! It’s our weather server, version 1.12.4, and it supports tools.

Step 2: Confirm Everything is Ready

{"jsonrpc":"2.0","method":"notifications/initialized","params":{}}

No response expected — this just tells the server we’re ready to work.

Step 3: Discover Available Tools

{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}

Our Server Shows Its Toolbox:

{"jsonrpc":"2.0","id":2,"result":{"tools":[{"name":"get_alerts","description":"Get weather alerts for a US state","inputSchema":{"properties":{"state":{"title":"State","type":"string"}},"required":["state"]}},{"name":"get_forecast","description":"Get weather forecast for a location","inputSchema":{"properties":{"latitude":{"type":"number"},"longitude":{"type":"number"}},"required":["latitude","longitude"]}}]}}

Step 4: Use a Tool

{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"get_alerts","arguments":{"state":"NY"}}}

Our Server Executes and Returns:

{"jsonrpc":"2.0","id":1,"result":{"content":[{"type":"text","text":"No active alerts for this state."}],"isError":false}}

That’s a complete MCP conversation with our stdio server! Initialize, discover, execute. Simple and instant.
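
Once you’ve typed this by hand, automating it is mechanical. Here is a minimal Python client sketch that scripts the same four steps over stdio (assuming the weather.py sketch from above is on disk):

# A minimal stdio client: scripts the exact handshake we just typed by hand.
import json
import subprocess

proc = subprocess.Popen(
    ["uv", "run", "weather.py"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

def send(msg):
    """Write one JSON-RPC message as a single line on the server's stdin."""
    proc.stdin.write(json.dumps(msg) + "\n")
    proc.stdin.flush()

def recv():
    """Read one JSON-RPC message from the server's stdout."""
    return json.loads(proc.stdout.readline())

# Steps 1-2: initialize, then confirm we're ready
send({"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {
    "protocolVersion": "2025-06-18", "capabilities": {},
    "clientInfo": {"name": "python-client", "version": "1.0"}}})
print(recv()["result"]["serverInfo"])
send({"jsonrpc": "2.0", "method": "notifications/initialized", "params": {}})

# Steps 3-4: discover tools, then execute one
send({"jsonrpc": "2.0", "id": 2, "method": "tools/list", "params": {}})
print([t["name"] for t in recv()["result"]["tools"]])
send({"jsonrpc": "2.0", "id": 3, "method": "tools/call",
      "params": {"name": "get_alerts", "arguments": {"state": "NY"}}})
print(recv()["result"]["content"][0]["text"])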

Transport 2: Streamable HTTP

For the same weather server, we can run it with mcp.run(transport="streamable_http"). This adds network capabilities and session management—perfect for remote access.

Step 1: Initialize and Get Session ID

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
      "protocolVersion": "2025-06-18",
      "capabilities": {},
      "clientInfo": {"name": "my-client", "version": "1.0"}
    }
  }'

Our Server Responds with the Magic Session ID:

HTTP/1.1 200 OK
date: Tue, 12 Aug 2025 21:47:12 GMT
server: uvicorn
cache-control: no-cache, no-transform
connection: keep-alive
content-type: text/event-stream
mcp-session-id: <your_session_id>
x-accel-buffering: no
Transfer-Encoding: chunked
event: message
data: {"jsonrpc":"2.0","id":1,"result":{"protocolVersion":"2025-06-18","capabilities":{"experimental":{},"prompts":{"listChanged":false},"resources":{"subscribe":false,"listChanged":false},"tools":{"listChanged":false}},"serverInfo":{"name":"weather","version":"1.12.4"}}}

Notice it’s the same weather server, just accessed over HTTP! The session ID becomes our ticket for all future requests.

Step 2: Confirm Initialization

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <your_session_id>" \
  -d '{"jsonrpc":"2.0","method":"notifications/initialized","params":{}}'

Server Acknowledges:

HTTP/1.1 202 Accepted

Step 3: List Available Tools

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <your_session_id>" \
  -d '{"jsonrpc":"2.0","id":3,"method":"tools/list","params":{}}'

Our Server Streams Back the Same Tools:

HTTP/1.1 200 OK
date: Tue, 12 Aug 2025 21:48:05 GMT
server: uvicorn
cache-control: no-cache, no-transform
connection: keep-alive
content-type: text/event-stream
mcp-session-id: <your_session_id>
x-accel-buffering: no
Transfer-Encoding: chunked
event: message
data: {"jsonrpc":"2.0","id":3,"result":{"tools":[{"name":"get_alerts","description":"Get weather alerts for a US state.\n\n    Args:\n        state: Two-letter US state code (e.g. CA, NY)\n    ","inputSchema":{"properties":{"state":{"title":"State","type":"string"}},"required":["state"],"title":"get_alertsArguments","type":"object"},"outputSchema":{"properties":{"result":{"title":"Result","type":"string"}},"required":["result"],"title":"get_alertsOutput","type":"object"}},{"name":"get_forecast","description":"Get weather forecast for a location.\n\n    Args:\n        latitude: Latitude of the location\n        longitude: Longitude of the location\n    ","inputSchema":{"properties":{"latitude":{"title":"Latitude","type":"number"},"longitude":{"title":"Longitude","type":"number"}},"required":["latitude","longitude"],"title":"get_forecastArguments","type":"object"},"outputSchema":{"properties":{"result":{"title":"Result","type":"string"}},"required":["result"],"title":"get_forecastOutput","type":"object"}}]}}

Step 4: Call a Tool

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "Mcp-Session-Id: <your_session_id>" \
  -d '{
    "jsonrpc":"2.0",
    "id":4,
    "method":"tools/call",
    "params":{"name":"get_alerts","arguments":{"state":"NY"}}
  }'

Our Server Executes:

event: message
data: {"jsonrpc":"2.0","id":4,"result":{"content":[{"type":"text","text":"No active alerts for this state."}]}}

Same server, same tools, different transport. The protocol remains constant!
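
And just like the stdio walkthrough, this can be scripted. A minimal Python sketch using the requests library; the server frames each reply as SSE ("event: ..." / "data: ..."), so we pull the JSON out of the "data:" line (assumes the server is running on localhost:8000 as above):

# A minimal streamable HTTP client sketch.
import json
import requests

URL = "http://localhost:8000/mcp"
BASE_HEADERS = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

def rpc(payload, session_id=None):
    """POST one JSON-RPC message; return (parsed reply, session id)."""
    headers = dict(BASE_HEADERS)
    if session_id:
        headers["Mcp-Session-Id"] = session_id
    resp = requests.post(URL, headers=headers, json=payload)
    sid = resp.headers.get("mcp-session-id", session_id)
    for line in resp.text.splitlines():
        if line.startswith("data: "):
            return json.loads(line[len("data: "):]), sid
    return None, sid  # notifications come back as a bare "202 Accepted"

# Step 1: initialize and capture the magic session id
init, sid = rpc({"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {
    "protocolVersion": "2025-06-18", "capabilities": {},
    "clientInfo": {"name": "python-client", "version": "1.0"}}})

# Step 2: confirm; Steps 3-4: discover and execute, reusing the session id
rpc({"jsonrpc": "2.0", "method": "notifications/initialized", "params": {}}, sid)
tools, _ = rpc({"jsonrpc": "2.0", "id": 3, "method": "tools/list", "params": {}}, sid)
result, _ = rpc({"jsonrpc": "2.0", "id": 4, "method": "tools/call",
                 "params": {"name": "get_alerts", "arguments": {"state": "NY"}}}, sid)
print(result["result"]["content"][0]["text"])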

Transport 3: SSE with DeepWiki’s Remote Server

For SSE testing, we used DeepWiki’s production remote server. SSE is the older approach, now mostly replaced by streamable HTTP, but it’s important to understand since some servers still use it.

Terminal 1 — Open the Listening Channel:

curl -N https://mcp.deepwiki.com/sse

DeepWiki Server Immediately Gives You a Session Endpoint:

event: endpoint
data: /sse/message?sessionId=f785892ce8821982ee91bbddd3ebc6d22a2d81120995c5bcf6e5cbd0ea4dfb56

Important: With SSE, there’s no explicit initialization! The GET request itself initializes the session — very different from our weather server’s stdio and HTTP transports.

Terminal 2 — Send Commands Using the Session:

curl -X POST "https://mcp.deepwiki.com/sse/message?sessionId=f785892ce8821982ee91bbddd3ebc6d22a2d81120995c5bcf6e5cbd0ea4dfb56" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}'

Response Appears in Terminal 1:

event: message
data: {"jsonrpc":"2.0","id":2,"result":{"tools":[{"name":"read_wiki_structure","description":"Get documentation topics for a GitHub repository"},{"name":"read_wiki_contents","description":"View documentation about a GitHub repository"},{"name":"ask_question","description":"Ask any question about a GitHub repository"}]}}

Using DeepWiki’s Tools:

# Terminal 2
curl -X POST "https://mcp.deepwiki.com/sse/message?sessionId=f785892ce8821982ee91bbddd3ebc6d22a2d81120995c5bcf6e5cbd0ea4dfb56" \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc":"2.0",
    "id":3,
    "method":"tools/call",
    "params":{
      "name":"read_wiki_structure",
      "arguments":{"repoName":"facebook/react"}
    }
  }'

Response in Terminal 1:

event: message
data: {"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"Available pages for facebook/react:\n\n- 1 Overview\n  - 1.1 Core React Types\n- 2 Build System\n  - 2.1 Feature Flags\n- 3 React Reconciler\n  - 3.1 Fiber Architecture\n- 4 Server-Side Rendering\n  - 4.1 React Server Components"}]}}

The Transport Reality Check

After extensive experimentation with our weather server and DeepWiki’s remote server, here’s what actually matters:

stdio (our weather server with transport="stdio"):

  • Lightning fast (~2 ms in our tests)
  • Perfect for local tools
  • No network overhead
  • Best for development

Streamable HTTP (our weather server with transport="streamable_http"):

  • Network-capable (50–200 ms)
  • Handles sessions elegantly
  • Works everywhere
  • Best for production

SSE (DeepWiki’s remote server):

  • Complex debugging (request in one terminal, response in another)
  • No explicit initialization
  • Legacy but still in production
  • With recent MCP revisions, streamable HTTP is preferred because it handles both request-response and streaming more cleanly

The beauty? Our weather server works identically with both stdio and streamable HTTP — we just change the transport parameter. The protocol stays the same, only the delivery mechanism changes.
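
In code, that swap is literally one argument (transport names as our server uses them throughout this article):

# Same FastMCP server object; pick one delivery mechanism:
mcp.run(transport="stdio")            # local: JSON-RPC over stdin/stdout
mcp.run(transport="streamable_http")  # remote: HTTP + SSE on localhost:8000/mcp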

The Brutal Truth About MCP

MCP isn’t perfect. Let’s be honest:

The Good:

  • Universal protocol for all AI tools
  • Non-developers can configure integrations
  • Write once, works with any AI client
  • Growing ecosystem of servers

The Challenges:

  • MCP is still not mature — we’re in the early days of the protocol
  • Security isn’t bulletproof yet — researchers have found vulnerabilities that need addressing
  • Latency is higher than direct function calling — if you compare OpenAI’s function calling to MCP, direct function calling is faster (but that’s expected, MCP adds a standardization layer)
  • The ecosystem is just beginning — we’ll see performance improvements and security hardening as it matures

Conclusion: From Overengineering to Revolution

When I first saw MCP, I thought: “This is overengineering. Everything MCP does, we can already do with function calling.”

I was right about the first part — you CAN do everything with traditional function calling. But I was wrong about it being overengineering.

MCP equals function calling plus standardization. That simple equation changes everything because of what standardization enables:

Time Savings: Instead of spending weeks integrating each service, it’s now hours of configuration.

Accessibility: Non-developers can now configure and use AI tools. This was impossible before.

Democratization: Every company, every developer, every enthusiast can contribute. The community builds together.

Richness: When companies build their own MCP servers, they expose capabilities that external developers would never know about. Internal teams understand their products deeply — they add details, features, and tools that make their MCP servers incredibly powerful. Slack’s team knows messaging better than any external developer ever could.

True AI Power: Here’s what really matters — with a well-designed, secure protocol, AI can access anything. The more context, tools, and actions available to AI, the more use cases we can create. An AI with access to 100 MCP servers isn’t just 100x more useful — it’s exponentially more capable because it can combine those tools in ways we haven’t imagined.

We showed you the raw protocol through our weather server experiments — the actual JSON messages flying back and forth across stdio and streamable HTTP. We explored DeepWiki’s SSE implementation to understand the full spectrum of transports. No magic, just standardization. Initialize, discover tools with tools/list, execute with tools/call.

The same weather server works with both stdio and streamable HTTP: just change transport="stdio" to transport="streamable_http". The protocol remains constant. That's the power of standardization.

Yes, MCP has growing pains. Yes, it adds some latency. Yes, security needs work. But we’re witnessing the birth of something transformative: AI that can actually DO things, not just talk about them. AI that anyone can enhance, not just programmers. AI that gets more powerful every day as the community builds.

Start with stdio for local tools. Graduate to streamable HTTP for cloud services. Understand SSE for legacy systems. But most importantly, start building. The ecosystem is waiting for your contribution.

The MCP revolution isn’t coming — it’s here. What seemed like overengineering is actually the foundation for AI that can truly act in the world. Build your server, create your client, pick a transport, and watch your AI come alive. The protocol is simple, the community is growing, and the possibilities are endless.
