CrewAI Tools
CrewAI Tools is a Python library that provides a diverse collection of pre-built and customizable tools designed to extend the capabilities of agents within the CrewAI framework. These tools equip agents for tasks such as web searching, data analysis, file management, web scraping, and database interactions. The library is actively maintained, currently at version 1.14.1, with frequent releases covering new features, bug fixes, and security patches.
Warnings
- breaking Changes within the core CrewAI framework (e.g., in output types for tasks/crews, event system refactors) can indirectly affect how tools are integrated and how their outputs are processed. Always review CrewAI's changelog when upgrading `crewai-tools`.
- gotcha Many powerful tools (e.g., `SerperDevTool`, `GithubSearchTool`, various RAG tools) require API keys or specific credentials to be set as environment variables (e.g., `SERPER_API_KEY`, `GITHUB_PAT`) or passed directly. Failure to configure these will lead to runtime errors.
- deprecated CVEs (Common Vulnerabilities and Exposures) in underlying dependencies (e.g., `cryptography`, `transformers`, `litellm`) are frequently patched in `crewai-tools` releases. Running outdated versions may expose your application to known security risks.
- gotcha When creating custom tools, if your tool involves asynchronous operations (e.g., network requests), you must implement the `_arun` method instead of the synchronous `_run` method. Using `_run` with `async` logic will lead to blocking behavior or errors.
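Because credential problems surface only at runtime, a small preflight check before building a crew gives a clearer failure than a mid-run tool error. A minimal standard-library sketch; the variable names (`SERPER_API_KEY`, `GITHUB_PAT`) follow the warning above, so adjust the list to the tools you actually use:

```python
import os

def check_required_env(required: list[str]) -> list[str]:
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

# SERPER_API_KEY is needed by SerperDevTool; GITHUB_PAT by GithubSearchTool.
missing = check_required_env(["SERPER_API_KEY", "GITHUB_PAT"])
if missing:
    print("Missing credentials:", ", ".join(missing))
```

Running this once at startup turns a cryptic tool failure deep inside a crew run into an immediate, readable message.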
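The sync/async split in the last warning follows a general pattern: blocking work belongs in `_run`, awaitable work in `_arun`. The sketch below illustrates that shape with plain Python and `asyncio` rather than crewai's actual `BaseTool` class, so the class here is illustrative, not the library API:

```python
import asyncio

class WebFetchTool:
    """Illustrative tool shape: _run for synchronous callers, _arun for async ones."""

    name = "web_fetch"
    description = "Fetches a page body (simulated here)."

    def _run(self, url: str) -> str:
        # Synchronous path: safe to block (e.g., a plain requests call).
        return f"fetched {url}"

    async def _arun(self, url: str) -> str:
        # Asynchronous path: never block the event loop; await the I/O instead.
        await asyncio.sleep(0)  # stands in for an async HTTP call
        return f"fetched {url}"

tool = WebFetchTool()
print(tool._run("https://example.com"))
print(asyncio.run(tool._arun("https://example.com")))
```

The key point is that `_arun` awaits its I/O; wrapping blocking calls in an `async def` without awaiting anything would still stall the event loop.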
Install
pip install crewai-tools
Imports
- FileReadTool
from crewai_tools import FileReadTool
- SerperDevTool
from crewai_tools import SerperDevTool
- ScrapeWebsiteTool
from crewai_tools import ScrapeWebsiteTool
Quickstart
import os
from crewai import Agent, Task, Crew, Process
from crewai_tools import FileReadTool, SerperDevTool
# Set your Serper API key as an environment variable or pass directly
# os.environ["SERPER_API_KEY"] = "YOUR_SERPER_API_KEY"
# Initialize tools
read_tool = FileReadTool(file_path='./my_document.txt')
search_tool = SerperDevTool()
# Define Agents
researcher = Agent(
role='Senior Researcher',
goal='Uncover the latest trends in AI and provide a summary',
backstory='An expert in AI research, known for insightful analysis.',
verbose=True,
allow_delegation=False,
tools=[search_tool] # Agent has access to the search tool
)
writer = Agent(
role='Content Writer',
goal='Craft engaging content based on research findings',
backstory='A talented writer who transforms complex data into compelling narratives.',
verbose=True,
allow_delegation=False,
tools=[read_tool] # Agent has access to the file read tool
)
# Define Tasks
research_task = Task(
description='Conduct a comprehensive search on current AI trends and identify key developments.',
expected_output='A bullet-point summary of 3-5 major AI trends.',
agent=researcher
)
write_task = Task(
description='Write a short blog post (approx. 300 words) based on the AI trends summary provided.',
expected_output='A well-structured and engaging blog post.',
agent=writer
)
# Assemble a Crew
crew = Crew(
agents=[researcher, writer],
tasks=[research_task, write_task],
process=Process.sequential,
verbose=True  # recent crewai versions expect a boolean here, not an integer level
)
# Ensure 'my_document.txt' exists for the FileReadTool example
with open('./my_document.txt', 'w') as f:
f.write('Initial context about AI: AI is rapidly advancing, with new models and applications emerging constantly.')
# Kick off the crew's work
print("Crew starting to work...")
result = crew.kickoff()
print("\n\nCrew finished with results:")
print(result)
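Beyond printing, persisting each run's output makes iterative prompt and tool tweaks easier to compare. A minimal standard-library sketch; it assumes `result` converts cleanly to `str`, and `save_run_output` and the `runs` directory are illustrative names, not part of CrewAI:

```python
from pathlib import Path
from datetime import datetime, timezone

def save_run_output(result: object, out_dir: str = "runs") -> Path:
    """Write the stringified crew result to a timestamped file and return its path."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = Path(out_dir) / f"crew_result_{stamp}.txt"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(str(result), encoding="utf-8")
    return path

# e.g. after the kickoff above: save_run_output(result)
```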