LLMs and AI agents like ChatGPT and Claude are powerful, but they hallucinate when they can't access real tools. In this hands-on tutorial, you'll learn how to build an MCP (Model Context Protocol) server in Python that gives your LLMs and agents real capabilities, such as unit conversions, math operations, and file reading.

🎯 Whether you're working with VS Code, GitHub Copilot, or Claude Desktop, you'll see step by step how to:
- Create and decorate tools using the FastMCP library (see the sketch below)
- Integrate them into ChatGPT/Claude or any LLM agent
- Test your MCP server with the MCP Inspector
- Connect your tool to VS Code's MCP plugin
- Avoid hallucinations and make your LLMs accurate and reliable

⏱️ Timestamps:
00:00 - Intro and motivation
00:57 - Big-picture story of what we'll be doing
01:41 - Intro to "uv", the package and project manager for Python
03:26 - Building tools for the MCP server
04:53 - What if there were no MCP? Is there an alternative?
06:22 - How to develop the MCP server
13:34 - Testing our MCP server with the "MCP Inspector"
15:56 - How to integrate our MCP server into VS Code's GitHub Copilot
21:40 - Is this in its best shape to publish? No! See the next video :)

📦 Tools & Frameworks: FastMCP, uv / uvx, Python 3.10+, VS Code MCP, GitHub Copilot, MCP Inspector
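
To give a flavor of what the video builds, here is a minimal sketch of an MCP server with decorated tools. It assumes the official MCP Python SDK, which exposes FastMCP via `mcp.server.fastmcp` (the standalone `fastmcp` package imports it as `from fastmcp import FastMCP` instead); the tool names and bodies are illustrative, not the exact ones from the video.

```python
# server.py -- minimal MCP server sketch (assumes the official MCP Python SDK)
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-tools")


@mcp.tool()
def km_to_miles(km: float) -> float:
    """Convert kilometers to miles."""
    return km * 0.621371


@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers exactly, instead of letting the model guess."""
    return a + b


@mcp.tool()
def read_file(path: str) -> str:
    """Return the text content of a local file."""
    return Path(path).read_text(encoding="utf-8")


if __name__ == "__main__":
    # stdio transport is the default, which is what locally spawned clients
    # (VS Code, Claude Desktop, MCP Inspector) typically expect.
    mcp.run()
```

With the SDK's CLI extras installed (e.g. `uv add "mcp[cli]"`), you can usually launch the MCP Inspector against this file with `uv run mcp dev server.py`, and point a VS Code MCP entry at `uv run server.py`; the exact commands depend on your setup and are walked through in the video.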