A Code Implementation for Building a Context-Aware AI Assistant in Google Colab Using LangChain, LangGraph, Gemini Pro, and Model Context Protocol (MCP) Principles with Tool Integration Support



In this hands-on tutorial, we bring the core principles of the Model Context Protocol (MCP) to life by implementing a lightweight, context-aware AI assistant using LangChain, LangGraph, and Google’s Gemini language model. While full MCP integration typically involves dedicated servers and communication protocols, this simplified version demonstrates how the same ideas, context retrieval, tool invocation, and dynamic interaction can be recreated in a single notebook using a modular agent architecture. The assistant can respond to natural language queries and selectively route them to external tools (like a custom knowledge base), mimicking how MCP clients interact with context providers in real-world setups.

!pip install langchain langchain-google-genai langgraph python-dotenv
!pip install google-generativeai

First, we install essential libraries. The first command installs LangChain, LangGraph, the Google Generative AI LangChain wrapper, and environment variable support via python-dotenv. The second command installs Google’s official generative AI client, which enables interaction with Gemini models.

import os
os.environ["GEMINI_API_KEY"] = "Your API Key"

Here, we set your Gemini API key as an environment variable so the model can securely access it without hardcoding it into your codebase. Replace “Your API Key” with your actual key from Google AI Studio.
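Optionally, you can fail fast if the key was never set, rather than hitting a cryptic authentication error later when the model is first invoked. The helper below is an illustrative pattern, not part of the original tutorial:

```python
import os

def require_api_key(name: str = "GEMINI_API_KEY") -> str:
    """Return the API key from the environment, or raise a clear error."""
    key = os.getenv(name)
    if not key:
        raise RuntimeError(f"{name} is not set; add it before creating the model.")
    return key
```

Calling `require_api_key()` right after setting the variable gives you an immediate, readable failure in Colab instead of a delayed one deep inside a model call.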

from langchain.tools import BaseTool
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.messages import HumanMessage, AIMessage
from langgraph.prebuilt import create_react_agent
import os


model = ChatGoogleGenerativeAI(
    model="gemini-2.0-flash-lite",
    temperature=0.7,
    google_api_key=os.getenv("GEMINI_API_KEY")
)

class SimpleKnowledgeBaseTool(BaseTool):
    name: str = "simple_knowledge_base"
    description: str = "Retrieves basic information about AI concepts."

    def _run(self, query: str):
        knowledge = {
            "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic designed to connect AI assistants with external data sources, enabling real-time, context-rich interactions.",
            "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by dynamically retrieving relevant external documents."
        }
        return knowledge.get(query, "I don't have information on that topic.")

    async def _arun(self, query: str):
        return self._run(query)

kb_tool = SimpleKnowledgeBaseTool()
tools = [kb_tool]
graph = create_react_agent(model, tools)

In this block, we initialize the Gemini language model (gemini-2.0-flash-lite) using LangChain’s ChatGoogleGenerativeAI, with the API key securely loaded from environment variables. We then define a custom tool named SimpleKnowledgeBaseTool that simulates an external knowledge source by returning predefined answers to queries about AI concepts like “MCP” and “RAG.” This tool acts as a basic context provider, similar to how an MCP server would operate. Finally, we use LangGraph’s create_react_agent to build a ReAct-style agent that can reason through prompts and dynamically decide when to call tools, mimicking MCP’s tool-aware, context-rich interactions principle.
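Before wiring the tool into the agent, it can help to sanity-check its retrieval logic in isolation. The snippet below reproduces the tool's dict lookup as a plain function (no LangChain required), so you can verify the exact-match-with-fallback behavior on its own:

```python
# Stand-in for SimpleKnowledgeBaseTool._run: a case-sensitive exact-match
# lookup with a fallback message, identical in behavior to the tool above.
knowledge = {
    "MCP": "Model Context Protocol (MCP) is an open standard by Anthropic "
           "designed to connect AI assistants with external data sources.",
    "RAG": "Retrieval-Augmented Generation (RAG) enhances LLM responses by "
           "dynamically retrieving relevant external documents.",
}

def lookup(query: str) -> str:
    return knowledge.get(query, "I don't have information on that topic.")

print(lookup("MCP"))          # known topic: returns the stored summary
print(lookup("Transformers"))  # unknown topic: returns the fallback message
```

Note that the lookup is an exact, case-sensitive match; in practice the agent's reasoning step is what maps a free-form question like "What is the Model Context Protocol?" to the tool query "MCP".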

import nest_asyncio
import asyncio

nest_asyncio.apply()

async def chat_with_agent():
    inputs = {"messages": []}

    print("🤖 MCP-Like Assistant ready! Type 'exit' to quit.")
    while True:
        user_input = input("\nYou: ")
        if user_input.lower() == "exit":
            print("👋 Ending chat.")
            break

        inputs["messages"].append(HumanMessage(content=user_input))

        async for state in graph.astream(inputs, stream_mode="values"):
            last_message = state["messages"][-1]
            if isinstance(last_message, AIMessage):
                print("\nAgent:", last_message.content)

        inputs["messages"] = state["messages"]

await chat_with_agent()

Finally, we set up an asynchronous chat loop to interact with the MCP-inspired assistant. Using nest_asyncio, we enable support for running asynchronous code inside the notebook’s existing event loop. The chat_with_agent() function captures user input, feeds it to the ReAct agent, and streams the model’s responses in real time. With each turn, the assistant uses tool-aware reasoning to decide whether to answer directly or invoke the custom knowledge base tool, emulating how an MCP client interacts with context providers to deliver dynamic, context-rich responses.
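The key detail in the loop is how `inputs["messages"]` accumulates the full conversation across turns, so the agent always sees the prior context. This minimal sketch traces that flow with plain dicts standing in for `HumanMessage`/`AIMessage` and a stub reply function standing in for the model, so no API call is needed:

```python
# Illustrative sketch of the state-accumulation pattern in chat_with_agent():
# each turn appends the user message, gets a reply, and appends that too.
def run_turn(state, user_text, reply_fn):
    state["messages"].append({"role": "human", "content": user_text})
    reply = reply_fn(user_text)  # stub for the agent's streamed response
    state["messages"].append({"role": "ai", "content": reply})
    return reply

state = {"messages": []}
run_turn(state, "What is MCP?", lambda q: "MCP connects assistants to context sources.")
run_turn(state, "And RAG?", lambda q: "RAG retrieves documents at query time.")
print(len(state["messages"]))  # 4: two human and two AI messages, in order
```

Because the whole list is passed back in on every turn, a follow-up like "And RAG?" is answered with the earlier exchange in view, which is what makes the assistant context-aware.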

In conclusion, this tutorial offers a practical foundation for building context-aware AI agents inspired by the MCP standard. We’ve created a functional prototype demonstrating on-demand tool use and external knowledge retrieval by combining LangChain’s tool interface, LangGraph’s agent framework, and Gemini’s powerful language generation. Although the setup is simplified, it captures the essence of MCP’s architecture: modularity, interoperability, and intelligent context injection. From here, you can extend the assistant to integrate real APIs, local documents, or dynamic search tools, evolving it into a production-ready AI system aligned with the principles of the Model Context Protocol.
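As a starting point for that extension, the sketch below shows one way to register multiple tools behind a common dispatch function, echoing MCP's modular context-provider idea. The tool names and bodies (`search_docs`, `calculator`) are purely illustrative stand-ins, not part of the tutorial's code:

```python
# Hypothetical multi-tool registry: each tool is a callable taking a query
# string, and route() dispatches by tool name, much as the ReAct agent
# chooses among the tools it was given.
def search_docs(query: str) -> str:
    # Placeholder for a real document-search backend.
    return f"[doc snippet about {query}]"

def calculator(query: str) -> str:
    # Toy arithmetic evaluator; builtins are stripped for safety.
    return str(eval(query, {"__builtins__": {}}))

TOOLS = {"search": search_docs, "calc": calculator}

def route(tool_name: str, query: str) -> str:
    tool = TOOLS.get(tool_name)
    return tool(query) if tool else "Unknown tool."

print(route("calc", "2 + 3"))  # 5
print(route("search", "MCP"))
```

In the LangChain setup above, each entry in such a registry would instead become its own `BaseTool` subclass appended to the `tools` list passed to `create_react_agent`.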

Here is the Colab Notebook.


Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent endeavor is the launch of an Artificial Intelligence Media Platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.


