Empowering AI Interactions with Model Context Protocol

I gave Claude root access to my server... Model Context Protocol explained

Estimated read time: 1:20


    Summary

    The video by Fireship dives into the rising popularity and application of the Model Context Protocol (MCP), a new approach to building APIs tailored for AI applications. The protocol, developed by Anthropic, standardizes interaction between servers and language models like Claude. Throughout the video, the creator explains how MCP can enhance AI-driven coding and automate complex tasks involving databases and cloud infrastructure. It also explores a playful project called "HorseTender," where the protocol ties together databases, REST APIs, and server resources, seamlessly integrating AI into the workflow. Despite MCP's potential risks, such as AI errors or misuse, the video highlights its potential to reshape software development and calls for responsible use.

      Highlights

      • Model Context Protocol: a game-changer for AI application APIs, described as a 'USB-C port' for context sharing. πŸ”Œ
      • Anthropic's CEO predicts most coding will soon be AI-generated, envisioning an AI-dominated future. 🌟
      • MCP enables playful projects like the Claude-powered HorseTender, merging AI with whimsical applications. 🐎
      • Sevalla supports MCP projects with straightforward cloud deployment and predictable pricing plans. πŸ’‘
      • Creative uses of MCP already include automated trading and complex web tasks, pushing innovation boundaries. 🌐
      • Fireship recommends coding responsibly with MCP to avoid disastrous consequences from AI actions. πŸ’₯

      Key Takeaways

      • Model Context Protocol is revolutionizing API development for AI, resembling a 'USB-C port' for AI apps. πŸš€
      • Designed by Anthropic, it is crucial for building APIs that let AI access and manipulate data effectively. πŸ€–
      • The protocol simplifies data fetching and action execution by language models on servers. πŸ’»
      • MCP enables unique projects like "HorseTender," which integrates AI for matchmaking horses. 🐴
      • Sevalla's platform provides easy cloud infrastructure setup for MCP projects, with predictable pricing. ☁️
      • Responsible use of the Model Context Protocol is urged, as its errors could have massive impacts. ⚠️
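
The "data fetching and action execution" split in the takeaways above is MCP's resource/tool distinction. As a rough, dependency-free TypeScript toy (all names here are invented for illustration; this is not the real SDK API), a resource only reads state, like a GET in REST, while a tool mutates it, like a POST:

```typescript
// Toy model of MCP's two primitives (all names invented for illustration).
// A resource is read-only context for the model, like a GET request in REST;
// a tool performs an action with side effects, like a POST request.

const horses: string[] = ["Seabiscuit", "Secretariat"]; // stand-in for a database

// Resource: fetches context and never mutates anything.
const singleHorsesResource = {
  uri: "horses://looking-for-love",
  read: (): string => JSON.stringify(horses),
};

// Tool: executes an action the model requested, mutating application state.
const createMatchTool = {
  name: "create_match",
  run: (a: string, b: string): string => {
    horses.push(`${a} + ${b}`); // side effect, hence a tool rather than a resource
    return "match created";
  },
};
```

Keeping reads and writes in separate primitives is what lets a client safely pull context without triggering side effects, and gate the side-effecting calls behind user permission.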

      Overview

      In a world where technology is continually shaping the future, the Model Context Protocol (MCP) emerges as a transformative approach to building APIs for AI applications. Conceptualized as a 'USB-C port' for AI models, MCP is pivotal for developers looking to create seamless, efficient, and powerful interactions between language models and digital resources. The video by Fireship explores how this protocol simplifies the process of coding with AI, making it accessible and efficient.

        Developed by Anthropic, the protocol allows for the standardized exchange of context between AI clients like Claude and server resources. This method promotes creativity and automation in software development, pushing the boundaries of what's possible. Notably, the video features the quirky project 'HorseTender', a delightful blend of AI integration and horse matchmaking, illustrating the whimsical yet impactful capabilities of MCP. Fireship passionately discusses the impending AI-driven future, though it also urges developers to approach this powerful tool with caution due to potential risks.

          While the Model Context Protocol heralds a new era of coding, emphasizing AI's role in software development, it also carries potential risks that should not be overlooked. Fireship urges scrutiny when using such technology, especially given the potential for AI-induced errors or misuse. Supported by platforms like Sevalla, which offer easy deployment and predictable pricing, MCP is poised to reshape the landscape of software engineering, but a balanced approach is needed to embrace its full potential responsibly.
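
Concretely, the standardized client-server exchange the overview describes travels as JSON-RPC 2.0 messages over a transport layer (such as stdio). As a rough, SDK-free illustration, the `tools/call` method and params shape below follow the MCP specification, while the tool name and arguments are hypothetical examples:

```typescript
// Builds the JSON-RPC 2.0 envelope an MCP client sends to invoke a server tool.
// The "tools/call" method and params shape follow the MCP specification;
// the tool name and arguments are hypothetical examples.
function toolCallRequest(id: number, tool: string, args: object): string {
  return JSON.stringify({
    jsonrpc: "2.0",                          // required JSON-RPC version marker
    id,                                      // correlates the response with this request
    method: "tools/call",                    // MCP method for tool invocation
    params: { name: tool, arguments: args },
  });
}

const wire = toolCallRequest(1, "create_match", { horseA: "h1", horseB: "h2" });
```

The server replies with a response carrying the same `id`, which is how the client matches results to requests over a long-lived connection.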

            Chapters

            • 00:00 - 00:30: Introduction to Model Context Protocol (MCP) The chapter discusses the rising popularity of the Model Context Protocol (MCP) among developers worldwide as a new way to build APIs. It highlights a striking example of MCP in action, where someone used it to get Claude to design 3D art in Blender, driven entirely by 'vibes.' It also mentions MCP's formal adoption as a standard in the OpenAI Agents SDK. The chapter briefly refers to other API protocols familiar to seasoned software developers, such as REST, GraphQL, RPC, and SOAP.
            • 00:30 - 01:00: The Shift to Vibe Coding The chapter titled 'The Shift to Vibe Coding' explores the changing landscape of web development. It discusses how traditional gatekeepers no longer hold power over aspiring developers, who previously needed to know technical details like architectures and protocols. Instead, there's a new approach termed 'vibe coding,' where the focus shifts to using large language models (LLMs) effectively to achieve desired outcomes. It's emphasized that being a true vibe coder involves understanding the Model Context Protocol, a universal standard for building APIs likened to a USB-C port for AI applications.
            • 01:00 - 01:30: Building an MCP Server The chapter opens with the ambitious expectations for large language models expressed by the CEO of Anthropic, who forecasts that AI will be writing virtually all code by the end of the year. It then begins a tutorial on building an MCP server, asking whether this technology could truly eliminate all white-collar jobs. Set against the backdrop of a video filmed on March 31st, 2025, the chapter also introduces the project's ingredients: a storage bucket, a Postgres database, and a standard REST...
            • 01:30 - 02:00: Connecting MCP with Cloud Infrastructure The chapter discusses integrating the Model Context Protocol (MCP) with cloud infrastructure. It begins by explaining how connecting APIs through MCP extends Claude's capabilities, providing access to new data and allowing code execution on servers. This capability is already being used for applications such as automated trading, large-scale web scraping, and cloud infrastructure management (for example, Kubernetes clusters). The chapter highlights Sevalla as an excellent platform on which to build an MCP server.
            • 02:00 - 02:30: Understanding MCP Architecture The chapter introduces Sevalla, the sponsoring platform, which is powered by Google Kubernetes Engine and Cloudflare under the hood; it is highlighted as more user-friendly than platforms like AWS, with predictable linear pricing and a free tier that makes it ideal for this project. The chapter then outlines the MCP architecture: a client-server relationship in which the client (here, Claude Desktop) and the server maintain a connection to exchange information over a transport layer.
            • 02:30 - 03:30: Adding Resources and Tools to the Server The chapter titled 'Adding Resources and Tools to the Server' contrasts MCP's two main primitives with the HTTP verbs of a REST API. Resources are elements such as files or database queries that provide context for the model, akin to a GET request in REST. Tools, in contrast, are actions performed by the model, such as writing data to a database, similar to a POST request.
            • 03:30 - 04:30: Running and Using the Server This chapter discusses defining tools and resources on the server so that large language models (LLMs) can automatically identify and use them when a prompt calls for them. The author shares their experience working on an app project called 'HorseTender', which initially included an impractical swiping feature due to a misunderstanding of the end users (horses). In response to this failure, the project pivots toward artificial intelligence, leveraging existing data and server infrastructure, such as a storage bucket containing all user-uploaded photos.
            • 04:30 - 05:30: Future of AI and Coding The chapter elaborates on the use of a Postgres database to store the profile data and relationship information of the horses using the app. The system includes a REST API written in TypeScript that retrieves this data for the web, iOS, and Android applications. A significant focus is the development process: the code lives in a git repository hooked up to a CI/CD (Continuous Integration/Continuous Deployment) pipeline, so code can be pushed to development or staging branches for testing before moving into production. Overall, the chapter emphasizes the importance of a robust development pipeline and modern coding practices in the future of AI and coding.
            • 05:30 - 06:00: Conclusion and Thanks This chapter notes the convenience of the platform handling deployments and cache busting automatically. The author introduces a Deno project and demonstrates the 'McpServer' class from the official SDK, noting that SDKs are also available for other languages such as Python and Java. The chapter also highlights the use of Zod for schema validation, which constrains the language model to a specified data shape to prevent it from hallucinating irrelevant output. After the server is set up, the subsequent step is adding resources to it.
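
The Zod validation mentioned in the last chapter can be approximated without any dependency to show what the schema buys you. This is a hand-rolled sketch, not the video's actual code, and the argument names (`horseA`, `horseB`) are invented for illustration:

```typescript
// Dependency-free stand-in for the Zod validation described above: reject
// LLM-supplied arguments that don't match the shape the tool expects.
type MatchArgs = { horseA: string; horseB: string };

function parseMatchArgs(input: unknown): MatchArgs {
  if (typeof input !== "object" || input === null) {
    throw new Error("arguments must be an object");
  }
  const { horseA, horseB } = input as Record<string, unknown>;
  if (typeof horseA !== "string" || typeof horseB !== "string") {
    throw new Error("horseA and horseB must be strings");
  }
  return { horseA, horseB }; // safely typed from here on
}
```

With Zod itself this collapses to roughly `z.object({ horseA: z.string(), horseB: z.string() })`, whose parse step performs the same checks, and the schema's field descriptions double as hints that help the model supply correct arguments.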

            I gave Claude root access to my server... Model Context Protocol explained Transcription

            • 00:00 - 00:30 it seems like every developer in the world is getting down with MCP right now model context protocol is the hot new way to build APIs and if you don't know what that is you're ngmi people are doing crazy things with it like this guy got Claude to design 3D art in Blender powered entirely on vibes and just a few days ago it became an official standard in the OpenAI Agents SDK if you're an OG subscriber to this channel you probably know what a REST API is you might even know about GraphQL or RPC or maybe many years ago you used SOAP when I was a kid the software engineering
            • 00:30 - 01:00 gatekeepers told me I couldn't be a web developer unless I could explain the difference between these architectures and protocols well now the turns have tabled and these gatekeepers have been utterly demolished because we're all just vibe coders now embracing the exponentials pretending code doesn't even exist and just chilling with LLMs until we get the end result we're looking for that being said you can't call yourself a true vibe coder unless you know about model context protocol which is basically a new standard for building APIs that you can think of like a USB-C port for AI applications it was
            • 01:00 - 01:30 designed by Anthropic the team behind Claude and provides a standard way to give large language models context and they're so bullish on this technology that the CEO of Anthropic said he expects virtually all code to be written by AI by the end of the year in today's video we'll actually build an MCP server and find out if it can truly make the world a better place by eliminating all white collar jobs it is March 31st 2025 and you're watching the Code Report contrary to popular belief Fireship is still a tutorial channel in today's video we'll take a storage bucket a Postgres database and a regular REST
            • 01:30 - 02:00 API and then connect them all together with the model context protocol not only will this allow Claude to have access to data it didn't have before but it can also execute code on our server like write to the database or upload files and people of the internet are already using it to do crazy stuff like automated stonk and shitcoin trading industrial scale web scraping and as a tool to manage cloud infrastructure like your Kubernetes cluster speaking of which to build our own MCP server we'll need some cloud infrastructure and one of the best places to do that is Sevalla
            • 02:00 - 02:30 which itself is powered by Google Kubernetes Engine and Cloudflare under the hood they were nice enough to sponsor this video but the reason I like their platform so much is that it's far easier to use than something like AWS but provides linear predictable pricing unlike most of the application and database hosting startups out there and it's free to get started which makes it perfect for this project now like other API architectures MCP has a client and a server the client in our case will be Claude Desktop then we'll develop a server that maintains a connection with that client so the client and server can pass information back and forth via the
            • 02:30 - 03:00 transport layer now in a REST API you have a bunch of different HTTP verbs that you can send requests to via different URLs but in the model context protocol we're really only concerned with two main things resources and tools a resource might be a file a database query or some other information the model can use for context conceptually you can think of it like a GET request in REST meanwhile a tool is an action that can be performed like writing something to a database so that'd be more like a POST request in REST what
            • 03:00 - 03:30 we do as developers is define tools and resources on the server so the LLM can automatically identify and use them when it has a prompt that needs them in my life I've been working on an app I consider my magnum opus called HorseTender but as it turns out swiping left and right was a bad feature idea because horses don't have fingers so like every other failing startup in Silicon Valley right now we're going to pivot to artificial intelligence luckily we can leverage our existing data and servers like here in Sevalla I have a storage bucket and it contains all of the photos
            • 03:30 - 04:00 that my users uploaded in addition to user data we have a Postgres database it has all the profile data for each one of our horses as well as the relationships they form together and then finally I have a traditional REST API written in TypeScript that fetches this data for my web iOS and Android apps and what's especially cool about my code is that it's in a git repo hooked up to a CI/CD pipeline that means after we write our model context protocol server we can push our code to the dev or staging branches to test it before it actually goes into production which
            • 04:00 - 04:30 automatically handles all the deployments and cache busting for us and now we're ready to jump into some code here I have a Deno project and the first thing you'll notice is that I'm importing a class called McpServer this comes from the official SDK but if you're not using TypeScript they have a bunch of other languages like Python Java and so on we'll also be using Zod here which is a tool used for schema validation which allows us to provide a specific data shape to the LLM so it doesn't just hallucinate a bunch of random crap now after we create a server we can start adding resources to it the resource will
            • 04:30 - 05:00 first have a name like horses looking for love and then the second argument is a URI for the resource then finally the third argument is a callback function that we can use to fetch the data in this example I'm writing a query to our Postgres database which is hosted in the cloud on Sevalla then accessed on the server with the Postgres.js library but you could access any data here when something is a resource though it should only be used for fetching data where there's no side effects or computations if you do have a side effect or computation you should instead use a tool like for HorseTender we might want
            • 05:00 - 05:30 the AI to automatically create matches and set up dates between horses we already have a RESTful API endpoint that can handle that and we could actually leverage that code here essentially creating an API for our API in fact many of these MCP servers are actually just APIs for APIs and that sounds like dumb overengineering but having a protocol like this makes it a lot easier to plug and play between different models and just makes LLM apps more reliable in general case in point notice how I'm using Zod here to validate the shape of
            • 05:30 - 06:00 the data going into this function that prevents the LLM from hallucinating random stuff here basically when you prompt Claude it's going to need to figure out the proper arguments to this function so providing data types along with a description will make your MCP server far more reliable and then the final step is to run the server in this case I'm going to use standard IO as the transport layer to use it locally but if deployed to the cloud you can also use server-sent events or HTTP congratulations you just built an MCP server but now the question is how do we actually use it you'll now
            • 06:00 - 06:30 need a client that supports the model context protocol like Claude Desktop there are many other MCP clients out there if you don't want to use Claude Desktop like Cursor and Windsurf for example and you could even develop your own client but that's an entirely separate topic altogether once installed you can go to the developer settings which will bring you to a config file where you can add multiple MCP servers in the config file all you have to do is provide a command to run the actual server which in our case would be the deno command for the main.ts file where we find our server code you'll need to
            • 06:30 - 07:00 restart Claude but then it should show your MCP server is running in this case my horse is running which means I should probably go and catch it then you can go back to the prompt screen to attach it that's going to fetch the resource from the server so Claude can use it as context in the next prompt and because Claude is multimodal you could also add PDFs images or anything else to the context really like all the horse images in our Sevalla storage bucket and now magically you can prompt Claude about things specific to your application like if we want to find
            • 07:00 - 07:30 out which horses are single and ready to mingle we can make a prompt like this and it will use the context that we just fetched from our database then if we want Claude to write to the database we could make a prompt like this where it'll connect two horses from the context on a date you'll need to grant it permission to do this and then Claude will automatically figure out which data to send based on the schema we validated with Zod and it'll use our server tool to mutate data in the actual application I can't imagine anything ever possibly going wrong here and Anthropic is extremely bullish on this being the future like their CEO just
            • 07:30 - 08:00 said that 90% of coding will be done entirely by AI within the next 6 months and nearly all code will be AI generated within a year I'm going to go ahead and press X to doubt there and I think it's only a matter of time before some AI agent accidentally wipes out billions of dollars in customer data or becomes self-aware and just deletes the data for fun that being said though there's all kinds of amazing tools being built with MCP right now and you can check those out on the awesome MCP repo just please make sure to vibe code responsibly huge thanks to Sevalla for making this video possible and enjoy this $50 stimulus
            • 08:00 - 08:30 check to try out their awesome platform this has been the Code Report thanks for watching and I will see you in the next one
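
For reference, the Claude Desktop config file mentioned around the 06:00 mark uses an `mcpServers` map; for the Deno server in this video it might look roughly like the following sketch (the server name and permission flags here are illustrative, not taken from the video):

```json
{
  "mcpServers": {
    "horse-tender": {
      "command": "deno",
      "args": ["run", "--allow-all", "main.ts"]
    }
  }
}
```

After editing the file, restart Claude Desktop so it launches the command and connects to the server over stdio.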