How to Build Your First MCP Server from Scratch | Model Context Protocol Hands-On Tutorial
In this masterclass, πππ²ππππ© ππ‘ππ€π«ππππ«ππ², Director of AI in Tech at Piramal Capital & Housing Finance Limited, demonstrates how to set up and build your first MCP server.
Whether youβre building AI agents or exploring MCP for the first time, this tutorial gives you a solid foundation for MCP Architecture and Capabilities.
In this video, we cover:
✅ Recap of how MCP empowers AI agents with context
✅ Host, client & server roles in MCP explained
✅ JSON-RPC communication flow in MCP
✅ Step-by-step: building your first MCP server in Python (see the sketch after this list)
✅ Setting up VS Code, Python, UV, and Claude Desktop
✅ Defining tools, prompts, and resources inside your server
✅ Testing your MCP server using MCP Inspector
✅ A real example: the "Creative Addition" tool in action
✅ Tips to avoid hallucinations & ensure accurate outputs
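For orientation, here is a minimal sketch of the kind of server the video builds, assuming the official MCP Python SDK (the `mcp` package with its FastMCP helper). The exact behaviour of the video's "Creative Addition" tool is not reproduced here, so the implementation below is a hypothetical stand-in; the resource and prompt are likewise illustrative.

```python
# server.py - a minimal first MCP server, sketched with the official MCP
# Python SDK's FastMCP helper. The "creative_addition" tool is a hypothetical
# stand-in for the video's example: it adds two numbers and wraps the exact
# sum in a playful sentence.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("first-mcp-server")

@mcp.tool()
def creative_addition(a: float, b: float) -> str:
    """Add two numbers and return the exact sum in a playful sentence.

    A precise docstring like this is what the client LLM reads when deciding
    whether and how to call the tool, which helps avoid hallucinated usage.
    """
    return f"Drumroll... {a} plus {b} is exactly {a + b}!"

@mcp.resource("config://about")
def about() -> str:
    """Read-only context the client can pull in on demand."""
    return "A demo MCP server exposing one tool, one resource, and one prompt."

@mcp.prompt()
def summarize(topic: str) -> str:
    """A reusable prompt template the client can fill in and send to the LLM."""
    return f"Summarize the following topic in three bullet points: {topic}"

if __name__ == "__main__":
    # stdio transport is what Claude Desktop and MCP Inspector expect by default
    mcp.run(transport="stdio")
```

With the CLI extra installed (e.g. `uv add "mcp[cli]"`), running `uv run mcp dev server.py` should launch MCP Inspector against this server, and the same file can then be registered in Claude Desktop's configuration, mirroring the steps covered in the chapters below.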
00:00 Introduction
10:00 JSON-RPC and message flow in MCP
15:40 Tools, Prompts, and Resources
22:30 Setting up the dev environment (Python, UV, VS Code, Claude Desktop)
30:10 Writing your first MCP server in Python
38:40 βCreative Additionβ tool implementation
47:20 Testing with MCP Inspector
54:10 Registering the server in Claude Desktop
01:02:00 Handling hallucinations with docstrings
01:02:46 Conclusion
In this masterclass, πππ²ππππ© ππ‘ππ€π«ππππ«ππ², Director of AI & Tech at Pyramal Finance, introduces the Model Context Protocol (MCP), a framework developed by Anthropic in 2024, designed to address the challenges of context and integration in AI systems.
Jaydeep emphasizes understanding the "why" behind MCP, drawing parallels to innovations like blockchain, to leverage its potential creatively. The session begins by highlighting the importance of context in AI, explaining that large language models (LLMs) like ChatGPT, Gemini, and Claude require specific and detailed input to generate useful outputs.
Examples include specifying "Garfield the cat" to get targeted responses or providing detailed prompts for image generation, demonstrating that generic input often leads to generic outputs.
Jaydeep then discusses the limitations of early LLM interactions and introduces OpenAI's function calling, which allowed LLMs to autonomously invoke external APIs or functions, such as fetching weather data.
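To make that step concrete, here is a minimal sketch assuming the OpenAI Python SDK's Chat Completions `tools` parameter; the `get_weather` function and its schema are hypothetical stand-ins for whatever API the developer actually wraps.

```python
# Sketch of the function-calling pattern described above, using the OpenAI
# Python SDK's Chat Completions "tools" interface. get_weather is a
# hypothetical function the model can choose to invoke; the developer still
# has to execute it and feed the result back.
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Mumbai?"}],
    tools=tools,
)

# If the model decides the tool is needed, it returns a structured call
# (function name plus JSON arguments) instead of a plain-text answer.
print(response.choices[0].message.tool_calls)
```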
While function calls extended LLM capabilities, organizations encountered integration complexity and maintenance challenges (the classic N×M problem) when multiple LLMs and APIs were combined. Changes in API endpoints or service migrations often required manual code updates, making systems brittle and difficult to scale.
The Model Context Protocol addresses these challenges by introducing a client-server model where the MCP client (LLM or AI agent) connects to MCP servers that expose tools, resources, or prompt templates. MCP servers advertise three main primitives: tools (to perform actions via APIs), resources (read-only data sources for context), and prompt templates (predefined prompts for standardized interaction).
This architecture enables modularity, scalability, and dynamic self-discovery, allowing AI systems to autonomously access and use available capabilities without hardcoding. Jaydeep compares MCP to traditional APIs, noting that while both follow a client-server model, MCP is purpose-built for AI, supports runtime discovery, and allows seamless integration of multiple tools with minimal developer effort.
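To illustrate that runtime self-discovery, the following is a simplified sketch of the JSON-RPC 2.0 exchange between an MCP client and server, shown as Python dicts. The method names follow the MCP specification's `tools/list` and `tools/call`, but the payloads are trimmed down for readability and reuse the hypothetical creative_addition tool from the earlier sketch.

```python
# Simplified JSON-RPC 2.0 messages exchanged between an MCP client and server,
# shown as Python dicts. Fields are illustrative, not the full specification.

# 1. Discovery: the client asks which tools the server advertises.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [{
            "name": "creative_addition",
            "description": "Add two numbers and return the exact sum in a playful sentence.",
            "inputSchema": {
                "type": "object",
                "properties": {"a": {"type": "number"}, "b": {"type": "number"}},
                "required": ["a", "b"],
            },
        }]
    },
}

# 2. Invocation: the client calls a discovered tool by name with arguments.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "creative_addition", "arguments": {"a": 2, "b": 3}},
}

call_response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {"content": [{"type": "text", "text": "Drumroll... 2 plus 3 is exactly 5!"}]},
}
```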
Practical demonstrations include connecting ChatGPT to Gmail via MCP to autonomously fetch purchase details, showcasing how MCP gives an AI agent hands, eyes, and tools while the LLM itself remains the brain.
The session concludes with an overview of MCP's advantages (simplified integration, reduced redundancy, dynamic context management, and modularity) and sets the stage for future modules, which will cover hands-on coding, building MCP servers, and advanced MCP applications, encouraging participants to explore why MCP exists and how it can transform AI-driven systems.
Jaydeep Chakrabarty
Jaydeep Chakrabarty, Director of AI in Tech at Piramal Capital & Housing Finance Limited, is a technologist, open-source contributor, and thought leader in Artificial Intelligence. Previously at Thoughtworks, he led Generative AI engagements, R&D, and tech communities. As a speaker and author, Jaydeep shares insights on AI, innovation, and the future of technology.