Using MCP with LangGraph agents


Estimated read time: 1:20

    Summary

    In this video by LangChain, the process of connecting large language models (LLMs) to various data sources and tools using the Model Context Protocol (MCP) is explored. MCP standardizes how tools and data sources are exposed, allowing them to be connected to LangGraph agents with minimal glue code. The video demonstrates a basic setup involving a math server with add and multiply tools, showing how they are registered and used by a LangGraph agent. It also highlights a multi-server client that can connect to several MCP servers at once, ultimately enhancing tool accessibility and functionality for applications through a standardized open-source protocol.

      Highlights

      • Introduction of the Model Context Protocol (MCP), which standardizes how LLMs connect to tools and data sources. 📚
      • Demonstration of a simple MCP server setup with math tools and its integration with a LangGraph agent. 🧮
      • The potential for using a multi-server client to organize and access tools from a variety of MCP servers. 🌐

      Key Takeaways

      • MCP standardizes connections between LLMs and various data sources and tools, simplifying integrations. 🤖
      • LangGraph agents can seamlessly connect with MCP servers, turning server-defined tools into LangChain tools. 🔗
      • The protocol supports multiple servers, making it easy to organize and access tools from different domains efficiently. 🗂️

      Overview

      The video introduces the Model Context Protocol (MCP) as a standardized way to connect large language models (LLMs) with different data sources and tools. Integrating these elements is often challenging, but MCP eases the process with an open-source implementation. By leveraging MCP, tools defined within servers can be connected to LangGraph agents with little effort, ensuring seamless functionality across systems.

        An example implementation is showcased: a simple server exposing math tools for addition and multiplication is connected to a LangGraph agent. The walkthrough covers importing the MCP client, defining server parameters, and initializing the connection. Once connected, tool calls execute seamlessly within the agent, creating an efficient environment for developers working with LLMs.

          Furthermore, the video discusses the advantages of a multi-server client, which makes it easy to organize and access tools from several MCP servers at once. By connecting multiple servers, users can manage numerous tools and expand their applications' capabilities. This standardized protocol not only simplifies integration but also opens the door to a growing ecosystem of reference, official third-party, and community servers, providing a robust framework for developers.

            Chapters

            • 00:00 - 00:30: Introduction to MCP and LangGraph — Discusses the challenge of connecting LLMs to sources of context such as tools and data sources, and introduces Anthropic's Model Context Protocol (MCP), an open-source implementation meant to standardize these connections. Also announces a new capability: any MCP server, with its data sources and tool definitions, can now be connected as tools to a LangGraph agent.
            • 00:30 - 02:30: Example Implementation of MCP Server — Presents a basic MCP server, referred to as a math server, which registers simple tools like `add` and `multiply`. Servers are an efficient way to encapsulate tools with similar functionality and surface them to LLMs via the MCP protocol; connecting the server to a LangGraph agent requires only a few imports from the MCP library.
            • 01:30 - 03:00: Connecting MCP Server with LangGraph Agent — Covers configuring an MCP client by pointing server parameters at a file in a local directory, then initializing a session and establishing the connection. A recent addition converts the tools defined in an MCP server into LangChain tools, so they can be passed directly into applications such as `create_react_agent`.
            • 03:00 - 04:00: Working with Multiple Servers — Shows tool execution end to end using LangGraph's pre-built ReAct implementation: the converted tools are passed in, the model makes a tool call, the server executes the task (such as a multiplication) and returns a tool message, and the loop continues until a final answer.
            • 04:00 - 05:00: Using Different Models and Tools — Inspects a trace in LangSmith, showing that the tools in the math server are discovered automatically when `load_mcp_tools` runs and are then bound to the LLM in the `create_react_agent` step.
            • 05:00 - 05:30: Clients and Hosts — Walks through the agent steps in the overall trace, following tool calls and executions until an answer is derived. A notable addition is the multi-server MCP client, which loads tools from multiple servers and pulls them into one workflow.
            • 05:30 - 06:30: Applications and Third-Party Integrations — Notes that the agent is not restricted to a specific model (Claude 3.5 Sonnet is used here); the protocol works with any tool-calling LLM. Demonstrates a math server and a weather server (with mock weather tools) connected to the same agent.
            • 06:30 - 07:00: Conclusion and Documentation — Shows both servers connected at once, including a successful tool call to retrieve weather information, and explains conceptually, with a diagram, how MCP servers provide context, tools, and prompts to clients.

            Using MCP with LangGraph agents Transcription

            • 00:00 - 00:30 this is Lance from LangChain connecting LLMs to different sources of context like tools like data sources is notoriously challenging Anthropic's Model Context Protocol is an interesting open-source implementation to standardize the way this can be done now we've recently added the ability to take any MCP server which contains data sources tool definitions and connect them as tools to any LangGraph agent and I want to show an
            • 00:30 - 01:00 example of that right now this is a super simple example implementation of an MCP server so you can see this server we're just going to call math we're going to register a few different tools add multiply and that's it so servers are a really nice way to encapsulate some set of tools with similar functionality and surface them to LLMs using the MCP protocol now to connect that server with a LangGraph agent all I need to do is a few imports from MCP I'm going to specify a model and I'll supply
            • 01:00 - 01:30 some server parameters and this is just simply pointing to that file which is in my directory here I pass those parameters to an MCP client I define here and all I need to do is start my session initialize the connection and this step is what we recently added the ability to take some set of tools defined in an MCP server and convert them into LangChain tools then those tools can be added directly to for example create_react_agent a
            • 01:30 - 02:00 pre-built ReAct implementation in LangGraph that's quite popular we can see we take those tools that have been converted and pass them right in as normal and we can go ahead and run and we can see the messages out so everything looks pretty good so here is our input Claude makes a tool call the tool message is automatically handled so the server is going to basically execute the tool pass back a tool message the model receives that makes a second tool call to perform multiply and we get our
            • 02:00 - 02:30 final answer now I want to show you something else that's pretty interesting I can look at the trace here in LangSmith and let me go to that first model call you can see the tools in that math server are all discovered automatically by our LLM so basically all the tools in the server are available and discoverable when we run this load_mcp_tools they're all loaded for us which is great and then we go ahead and bind them to our LLM in this create react
            • 02:30 - 03:00 agent step and they're all here and this you can see is the overall trace that follows the execution of the tool calls and then the tool executions and so forth down until we get the answer now one thing I like about this is we actually can use this really nicely to organize different tools that we want to work with so we added this multi-server MCP client that lets you load tools from various servers really easily and then you can very simply just pull in all
            • 03:00 - 03:30 those corresponding tools to your agent another important clarification here is that the model supplied currently is Claude 3.5 Sonnet but you can also use other models so there's no restriction on what particular model you use this protocol connects with any tool-calling LLM so in this case we have our math server we have a weather server math server has some math tools weather server has in this case just some mock tools that return the weather in a given location but the point is we can
            • 03:30 - 04:00 really nicely separate our various tool servers and easily connect to them so in this case you can see we connect to two different servers and there we go in this particular case we can see the tool call to get weather works as expected we return the result and so just to kind of flesh out what's happening under the hood here a bit you can think about this as laid out in this diagram so MCP servers provide context tools and prompts to clients and we saw the
            • 04:00 - 04:30 ability to create a simple math server we create a simple get weather server but there's many other servers available which I'll show you in a minute and each server can have many different tools kind of topically organized based on that server's role so it's a really nice way to organize tools now clients maintain connections to servers inside different host apps and so for example what are those apps well one of the main reasons that Anthropic open-sourced this was because the Claude desktop app can then get access to all sorts of
            • 04:30 - 05:00 different MCP servers giving it access to many different tools and many different data sources so it's obviously really useful if you're using the Claude desktop app but it's open source so we actually use the Python client to connect it to a LangGraph agent that's exactly what we just did and we use load_mcp_tools to convert the tools as defined in the MCP server into LangChain tools that can be used directly in a LangGraph agent so it's just another example of an application that can be built on top of this open-source protocol so we'll link to the library in
            • 05:00 - 05:30 this video description and I do want to call out something that's pretty interesting if you poke around the documentation on Model Context Protocol you can see a large set of servers that are available to you and this is where the power really comes in you can see there's a whole bunch of reference servers for different tools and a large number of official third-party integrations as well which is quite cool we also see some other community servers there's actually quite a bit here so with this open protocol to expose various types of MCP servers and the
            • 05:30 - 06:00 ability to load any of them as tools and pass them to a LangGraph agent it provides a very nice way to access lots of different tools from agents you're trying to build so it provides a really nice way to bind many different tools via a common protocol to agents that you're trying to build so it's worth a careful look at this it's a really neat open-source protocol and I'm sure it's going to be growing quite a bit in the coming days weeks and months and feel free to leave any comments or questions below thanks