📄️ Introduction to tools
Understand how tools work in Portia and how to add your own.
📄️ Tool selection
Learn how to select the tools that the LLM can use to answer a user query.
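As a rough sketch of what tool selection can look like in code, the snippet below passes an explicit tool registry to Portia so the planner can only draw on those tools when answering a query. It assumes the SDK's `Portia` class accepts a `tools` argument and that `example_tool_registry` is importable as in the SDK examples; check the SDK reference for the exact names.

```python
from portia import Portia, example_tool_registry

# Restrict the LLM to an explicit set of tools by passing a tool registry.
# LLM / Portia API keys are expected to be configured in the environment.
portia = Portia(tools=example_tool_registry)

# The planner can now only select tools from the registry supplied above.
plan_run = portia.run("Add 1 and 2 and then write me a haiku about the result")
print(plan_run.model_dump_json(indent=2))
```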
📄️ Add custom tools
Let's build two custom tools that allow an LLM to write content to and read content from a local file. We'll create these custom tools in a separate folder called mycustomtools at the root of the project directory, adding a filewritertool.py and a filereadertool.py file within it.
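As a minimal sketch, filewritertool.py might look something like the following. It assumes the SDK's `Tool` base class with `id`, `name`, `description`, `args_schema` and `output_schema` fields and a `run` method that receives a `ToolRunContext`; the exact import paths and field names should be checked against the SDK reference.

```python
from pydantic import BaseModel, Field
from portia.tool import Tool, ToolRunContext


class FileWriterToolSchema(BaseModel):
    """Inputs for the file writer tool."""

    filename: str = Field(..., description="Path of the file to write to")
    content: str = Field(..., description="Content to write to the file")


class FileWriterTool(Tool[str]):
    """Writes content to a local file."""

    id: str = "file_writer_tool"
    name: str = "File writer tool"
    description: str = "Writes the given content to a local file"
    args_schema: type[BaseModel] = FileWriterToolSchema
    output_schema: tuple[str, str] = ("str", "A confirmation of where the content was written")

    def run(self, _: ToolRunContext, filename: str, content: str) -> str:
        """Write the content and return a confirmation message."""
        with open(filename, "w") as f:
            f.write(content)
        return f"Content written to {filename}"
```

The filereadertool.py file mirrors this shape, opening the file in read mode and returning its contents instead.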
📄️ Use clarifications in custom tools
You can raise a Clarification in any custom tool definition to prompt a plan run to interrupt itself and solicit input (SDK reference ↗).
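A minimal sketch of the pattern is shown below, extending the file writer tool so that it interrupts the plan run when the target file already exists. The clarification class used here (`InputClarification`) and its `plan_run_id`, `argument_name` and `user_guidance` fields are assumptions, and in this sketch the clarification is returned from `run` rather than raised; check the SDK reference for the exact mechanism.

```python
import os

from pydantic import BaseModel, Field
from portia.clarification import InputClarification
from portia.tool import Tool, ToolRunContext


class FileWriterToolSchema(BaseModel):
    """Inputs for the clarifying file writer tool."""

    filename: str = Field(..., description="Path of the file to write to")
    content: str = Field(..., description="Content to write to the file")


class FileWriterTool(Tool[str]):
    """Writes content to a file, pausing the plan run if the file already exists."""

    id: str = "file_writer_tool"
    name: str = "File writer tool"
    description: str = "Writes the given content to a local file"
    args_schema: type[BaseModel] = FileWriterToolSchema
    output_schema: tuple[str, str] = ("str", "A confirmation of where the content was written")

    def run(self, ctx: ToolRunContext, filename: str, content: str) -> str | InputClarification:
        # If the file already exists, hand control back to the plan run and ask
        # the user for a different filename (field names are assumptions).
        if os.path.exists(filename):
            return InputClarification(
                plan_run_id=ctx.plan_run_id,
                argument_name="filename",
                user_guidance=f"{filename} already exists. Please provide a different filename.",
            )
        with open(filename, "w") as f:
            f.write(content)
        return f"Content written to {filename}"
```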
📄️ Integrating an MCP Server
The Model Context Protocol (MCP) makes it easy to integrate third-party tools into your Portia AI project. To find out more, visit the official MCP docs (↗). We support both methods currently available for interacting with MCP servers.
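As an illustrative sketch only, connecting to a locally running MCP server over stdio might look like the snippet below. `McpToolRegistry`, its import location and the `from_stdio_connection` parameters are assumptions based on how the SDK exposes MCP tool registries, so check the SDK reference for the exact names.

```python
from portia import McpToolRegistry, Portia

# Connect to an MCP server over stdio. The server name, command and arguments
# here are illustrative assumptions, not a prescribed configuration.
mcp_registry = McpToolRegistry.from_stdio_connection(
    server_name="filesystem",
    command="npx",
    args=["-y", "@modelcontextprotocol/server-filesystem", "./"],
)

# Hand the MCP-backed tools to Portia like any other tool registry.
portia = Portia(tools=mcp_registry)
plan_run = portia.run("List the files available to you")
```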