
In this tutorial, we explore the Model Context Protocol (MCP) and demonstrate how to use it to address one of the defining challenges in modern AI systems: enabling real-time interaction between AI models and external data or tools. Traditional models operate in isolation, limited to their training data, but through MCP, we create a bridge that lets models access live resources, run specialized tools, and adapt dynamically to changing context. We walk through building an MCP server and client from scratch, showing how each component contributes to this ecosystem of intelligent collaboration.
import asyncio
import json
import random
from dataclasses import dataclass, asdict
from datetime import datetime
from typing import Dict, List, Any, Optional, Callable


@dataclass
class Resource:
    uri: str
    name: str
    description: str
    mime_type: str
    content: Any = None


@dataclass
class Tool:
    name: str
    description: str
    parameters: Dict[str, Any]
    handler: Optional[Callable] = None


@dataclass
class Message:
    role: str
    content: str
    timestamp: str = None

    def __post_init__(self):
        # Auto-stamp the message if the caller did not supply a timestamp
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()
We begin by defining the fundamental building blocks of MCP: resources, tools, and messages. We design these data structures to represent how information flows between AI systems and their external environments in a clean, structured way.
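As a quick standalone sanity check (the `Message` class here is a trimmed copy of the dataclass above), we can confirm that `Message` auto-fills its timestamp via `__post_init__`:

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Message:
    role: str
    content: str
    timestamp: str = None

    def __post_init__(self):
        # Auto-stamp the message if the caller did not supply a timestamp
        if not self.timestamp:
            self.timestamp = datetime.now().isoformat()


msg = Message(role="user", content="Hello MCP")
print(msg.timestamp)  # ISO-8601 string such as '2024-05-01T12:00:00.000000'
```

Because `timestamp` defaults to `None`, every message gets a creation time without the caller having to think about it.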
class MCPServer:
    def __init__(self, name: str):
        self.name = name
        self.resources: Dict[str, Resource] = {}
        self.tools: Dict[str, Tool] = {}
        self.capabilities = {"resources": True, "tools": True, "prompts": True, "logging": True}
        print(f"✓ MCP Server '{name}' initialized with capabilities: {list(self.capabilities.keys())}")

    def register_resource(self, resource: Resource) -> None:
        self.resources[resource.uri] = resource
        print(f"  → Resource registered: {resource.name} ({resource.uri})")

    def register_tool(self, tool: Tool) -> None:
        self.tools[tool.name] = tool
        print(f"  → Tool registered: {tool.name}")

    async def get_resource(self, uri: str) -> Optional[Resource]:
        await asyncio.sleep(0.1)  # simulate I/O latency
        return self.resources.get(uri)

    async def execute_tool(self, tool_name: str, arguments: Dict[str, Any]) -> Any:
        if tool_name not in self.tools:
            raise ValueError(f"Tool '{tool_name}' not found")
        tool = self.tools[tool_name]
        if tool.handler:
            return await tool.handler(**arguments)
        return {"status": "executed", "tool": tool_name, "args": arguments}

    def list_resources(self) -> List[Dict[str, str]]:
        return [{"uri": r.uri, "name": r.name, "description": r.description} for r in self.resources.values()]

    def list_tools(self) -> List[Dict[str, Any]]:
        return [{"name": t.name, "description": t.description, "parameters": t.parameters} for t in self.tools.values()]
We implement the MCP server that manages resources and tools while handling execution and retrieval operations. We ensure it supports asynchronous interaction, making it efficient and scalable for real-world AI applications.
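The heart of `execute_tool` is a name-to-coroutine dispatch. Here is a minimal, self-contained sketch of that same pattern in isolation (the `echo` tool is a stand-in for illustration, not part of the tutorial's server):

```python
import asyncio
from typing import Any, Callable, Dict


async def echo(text: str) -> Dict[str, str]:
    # Trivial stand-in handler: returns its input
    return {"echo": text}


# Registry mapping tool names to async handlers
handlers: Dict[str, Callable] = {"echo": echo}


async def execute(name: str, args: Dict[str, Any]) -> Any:
    # Look up the handler and await it with the supplied keyword arguments
    handler = handlers.get(name)
    if handler is None:
        raise ValueError(f"Tool '{name}' not found")
    return await handler(**args)


result = asyncio.run(execute("echo", {"text": "hi"}))
print(result)  # {'echo': 'hi'}
```

Keeping handlers in a plain dictionary is what makes the server pluggable: registering a tool is just a dictionary insert, and unknown names fail fast with a `ValueError`.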
class MCPClient:
    def __init__(self, client_id: str):
        self.client_id = client_id
        self.connected_servers: Dict[str, MCPServer] = {}
        self.context: List[Message] = []
        print(f"\n✓ MCP Client '{client_id}' initialized")

    def connect_server(self, server: MCPServer) -> None:
        self.connected_servers[server.name] = server
        print(f"  → Connected to server: {server.name}")

    async def query_resources(self, server_name: str) -> List[Dict[str, str]]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        return self.connected_servers[server_name].list_resources()

    async def fetch_resource(self, server_name: str, uri: str) -> Optional[Resource]:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        resource = await server.get_resource(uri)
        if resource:
            self.add_to_context(Message(role="system", content=f"Fetched resource: {resource.name}"))
        return resource

    async def call_tool(self, server_name: str, tool_name: str, **kwargs) -> Any:
        if server_name not in self.connected_servers:
            raise ValueError(f"Not connected to server: {server_name}")
        server = self.connected_servers[server_name]
        result = await server.execute_tool(tool_name, kwargs)
        self.add_to_context(Message(role="system", content=f"Tool '{tool_name}' executed"))
        return result

    def add_to_context(self, message: Message) -> None:
        self.context.append(message)

    def get_context(self) -> List[Dict[str, Any]]:
        return [asdict(msg) for msg in self.context]
We create the MCP client that connects to the server, queries resources, and executes tools. We maintain a contextual memory of all interactions, enabling continuous, stateful communication with the server.
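The client's context bookkeeping boils down to appending `Message` dataclasses and serializing them with `asdict`. A minimal standalone sketch (using a trimmed `Message` for brevity):

```python
from dataclasses import dataclass, asdict
from typing import Any, Dict, List


@dataclass
class Message:
    role: str
    content: str


context: List[Message] = []
context.append(Message(role="system", content="Fetched resource: sales-2024"))
context.append(Message(role="system", content="Tool 'summarize_text' executed"))

# asdict turns each dataclass into a plain dict, ready for JSON serialization
history: List[Dict[str, Any]] = [asdict(m) for m in context]
print(history[-1])  # {'role': 'system', 'content': "Tool 'summarize_text' executed"}
```

Because `asdict` produces plain dictionaries, the accumulated context can be handed directly to `json.dumps` or to a model's message API without custom serialization code.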
async def analyze_sentiment(text: str) -> Dict[str, Any]:
    await asyncio.sleep(0.2)  # simulate model latency
    sentiments = ["positive", "negative", "neutral"]
    return {"text": text, "sentiment": random.choice(sentiments),
            "confidence": round(random.uniform(0.7, 0.99), 2)}


async def summarize_text(text: str, max_length: int = 100) -> Dict[str, Any]:
    await asyncio.sleep(0.15)
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary,
            "compression_ratio": round(len(summary) / len(text), 2)}


async def search_knowledge(query: str, top_k: int = 3) -> List[Dict[str, Any]]:
    await asyncio.sleep(0.25)
    mock_results = [{"title": f"Result {i + 1} for '{query}'", "score": round(random.uniform(0.5, 1.0), 2)}
                    for i in range(top_k)]
    return sorted(mock_results, key=lambda x: x["score"], reverse=True)
We define a set of asynchronous tool handlers, including sentiment analysis, text summarization, and knowledge search. We use them to simulate how the MCP system can execute diverse operations through modular, pluggable tools.
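Because each handler is an ordinary coroutine, it can also be exercised on its own with `asyncio.run`. For example, `summarize_text` (reproduced here so the snippet runs standalone) truncates the input and reports a compression ratio:

```python
import asyncio


async def summarize_text(text: str, max_length: int = 100) -> dict:
    await asyncio.sleep(0.15)  # simulate processing latency
    summary = text[:max_length] + "..." if len(text) > max_length else text
    return {"original_length": len(text), "summary": summary,
            "compression_ratio": round(len(summary) / len(text), 2)}


result = asyncio.run(summarize_text(
    "The Model Context Protocol bridges models and tools.", max_length=20))
print(result["summary"])  # The Model Context Pr...
```

Testing handlers in isolation like this is the payoff of the pluggable design: no server or client is needed to verify a tool's behavior.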
async def run_mcp_demo():
    print("=" * 60)
    print("MODEL CONTEXT PROTOCOL (MCP) – ADVANCED TUTORIAL")
    print("=" * 60)

    print("\n[1] Setting up MCP Server...")
    server = MCPServer("knowledge-server")

    print("\n[2] Registering resources...")
    server.register_resource(Resource(
        uri="docs://python-guide",
        name="Python Programming Guide",
        description="Comprehensive Python documentation",
        mime_type="text/markdown",
        content="# Python Guide\nPython is a high-level programming language...",
    ))
    server.register_resource(Resource(
        uri="data://sales-2024",
        name="2024 Sales Data",
        description="Annual sales metrics",
        mime_type="application/json",
        content={"q1": 125000, "q2": 142000, "q3": 138000, "q4": 165000},
    ))

    print("\n[3] Registering tools...")
    server.register_tool(Tool(
        name="analyze_sentiment",
        description="Analyze sentiment of text",
        parameters={"text": {"type": "string", "required": True}},
        handler=analyze_sentiment,
    ))
    server.register_tool(Tool(
        name="summarize_text",
        description="Summarize long text",
        parameters={"text": {"type": "string", "required": True},
                    "max_length": {"type": "integer", "default": 100}},
        handler=summarize_text,
    ))
    server.register_tool(Tool(
        name="search_knowledge",
        description="Search knowledge base",
        parameters={"query": {"type": "string", "required": True},
                    "top_k": {"type": "integer", "default": 3}},
        handler=search_knowledge,
    ))

    client = MCPClient("demo-client")
    client.connect_server(server)

    print("\n" + "=" * 60)
    print("DEMONSTRATION: MCP IN ACTION")
    print("=" * 60)

    print("\n[Demo 1] Listing available resources...")
    resources = await client.query_resources("knowledge-server")
    for res in resources:
        print(f"  • {res['name']}: {res['description']}")

    print("\n[Demo 2] Fetching sales data resource...")
    sales_resource = await client.fetch_resource("knowledge-server", "data://sales-2024")
    if sales_resource:
        print(f"  Data: {json.dumps(sales_resource.content, indent=2)}")

    print("\n[Demo 3] Analyzing sentiment...")
    sentiment_result = await client.call_tool(
        "knowledge-server", "analyze_sentiment",
        text="MCP is an amazing protocol for AI integration!")
    print(f"  Result: {json.dumps(sentiment_result, indent=2)}")

    print("\n[Demo 4] Summarizing text...")
    summary_result = await client.call_tool(
        "knowledge-server", "summarize_text",
        text="The Model Context Protocol enables seamless integration between AI models and external data sources...",
        max_length=50)
    print(f"  Summary: {summary_result['summary']}")

    print("\n[Demo 5] Searching knowledge base...")
    search_result = await client.call_tool(
        "knowledge-server", "search_knowledge",
        query="machine learning", top_k=3)
    print("  Top results:")
    for result in search_result:
        print(f"    - {result['title']} (score: {result['score']})")

    print("\n[Demo 6] Current context window...")
    context = client.get_context()
    print(f"  Context length: {len(context)} messages")
    for i, msg in enumerate(context[-3:], 1):
        print(f"  {i}. [{msg['role']}] {msg['content']}")

    print("\n" + "=" * 60)
    print("✓ MCP Tutorial Complete!")
    print("=" * 60)
    print("\nKey Takeaways:")
    print("• MCP enables modular AI-to-resource connections")
    print("• Resources provide context from external sources")
    print("• Tools enable dynamic operations and actions")
    print("• Async design supports efficient I/O operations")


if __name__ == "__main__":
    import sys
    if 'ipykernel' in sys.modules or 'google.colab' in sys.modules:
        # Notebooks already run an event loop, so schedule the coroutine on it
        # rather than calling asyncio.run(), which would raise RuntimeError.
        asyncio.get_event_loop().create_task(run_mcp_demo())
    else:
        asyncio.run(run_mcp_demo())
We bring everything together into a complete demonstration where the client interacts with the server, fetches data, runs tools, and maintains context. We witness the full potential of MCP as it seamlessly integrates AI logic with external knowledge and computation.
In conclusion, the problem we solve here is breaking the boundaries of static AI systems. Instead of treating models as closed boxes, we design an architecture that enables them to query, reason, and act on real-world data in structured, context-driven ways. This dynamic interoperability, achieved through the MCP framework, represents a major shift toward modular, tool-augmented intelligence. By understanding and implementing MCP, we position ourselves to build the next generation of adaptive AI systems that can connect beyond their original confines.