FloTorch CrewAI Plugin Overview

The FloTorch CrewAI Plugin provides CrewAI-compatible implementations that integrate with FloTorch’s Gateway and Workspace services. It enables you to build multi-agent AI systems using CrewAI while leveraging FloTorch’s infrastructure.

Key Features:

  • Distributed Tracing - Built-in observability with configurable tracing for production monitoring
  • Configurable Logging - Flexible logging configuration for debugging and monitoring (see SDK Logging Configuration)

Exports:

  • FlotorchCrewAIAgent – Main agent manager with FloTorch integration
  • FlotorchCrewAILLM – CrewAI-compatible LLM wrapper
  • FlotorchCrewAIMemory – Memory service for CrewAI
  • FlotorchCrewAISession – Session management for CrewAI

Installation
pip install "flotorch[crewai]"

You can configure credentials in one of two ways:

Option A: Environment variables (recommended)

Note: To create and manage your API keys, see the API Keys documentation.

export FLOTORCH_API_KEY="<your_api_key>"
export FLOTORCH_BASE_URL="https://gateway.flotorch.cloud"
# Optional: Configure logging
export FLOTORCH_LOG_DEBUG=true
export FLOTORCH_LOG_PROVIDER="console" # or "file"
export FLOTORCH_LOG_FILE="flotorch_logs.log" # used when provider is "file" (default: "flotorch_logs.log")

For comprehensive logging configuration details, see the SDK Logging Configuration guide.
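To make the variables above concrete, here is an illustrative sketch of how a configuration loader might consume them. The `load_config_from_env` helper and `FlotorchEnvConfig` class are hypothetical, not part of the SDK; they simply mirror the documented variable names and defaults.

```python
import os
from dataclasses import dataclass

@dataclass
class FlotorchEnvConfig:
    """Illustrative container for the environment variables documented above."""
    api_key: str
    base_url: str
    log_debug: bool
    log_provider: str
    log_file: str

def load_config_from_env() -> FlotorchEnvConfig:
    # Hypothetical loader: reads the documented variable names, applying
    # the same defaults the docs state where one is given.
    return FlotorchEnvConfig(
        api_key=os.environ["FLOTORCH_API_KEY"],  # required
        base_url=os.environ.get("FLOTORCH_BASE_URL", "https://gateway.flotorch.cloud"),
        log_debug=os.environ.get("FLOTORCH_LOG_DEBUG", "false").lower() == "true",
        log_provider=os.environ.get("FLOTORCH_LOG_PROVIDER", "console"),
        log_file=os.environ.get("FLOTORCH_LOG_FILE", "flotorch_logs.log"),
    )
```

In practice the SDK reads these variables itself; the sketch only shows which values are required and which fall back to defaults.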

Note: When creating an agent in the FloTorch Console, the goal field maps to the CrewAI agent’s goal and the system prompt field maps to its backstory. See Creating an Agent for detailed steps.

from flotorch.crewai.agent import FlotorchCrewAIAgent

agent = FlotorchCrewAIAgent(
    agent_name="weather-agent",
    base_url="https://gateway.flotorch.cloud",
    api_key="<your_api_key>",
)

from crewai import Agent, Crew, Task

# Managed setup: load the agent configuration from the FloTorch Console.
# weather_tool is assumed to be a CrewAI tool defined elsewhere.
agent_manager = FlotorchCrewAIAgent(
    agent_name="weather-agent",
    base_url="https://gateway.flotorch.cloud",
    api_key="<your_api_key>",
    custom_tools=[weather_tool],
)
agent = agent_manager.get_agent()
task = agent_manager.get_task()

workflow = Crew(
    agents=[agent],
    tasks=[task],
)

# Equivalent manual CrewAI setup. model is assumed to be a configured LLM,
# e.g. a FlotorchCrewAILLM instance.
agent = Agent(
    role="Weather Agent",
    goal="Provide accurate weather information using the weather tool. Return only the tool's output without modification. user_query:{query}",
    backstory="A professional weather specialist that provides precise weather data. Always uses the weather tool to get current conditions and forecasts.",
    tools=[weather_tool],
    allow_delegation=False,
    verbose=False,
    llm=model,
)
task = Task(
    description="Get weather information using the weather tool. Return the tool's output without modification. user_query: {query}",
    expected_output="Weather data from the weather tool",
    agent=agent,
)
workflow = Crew(
    agents=[agent],
    tasks=[task],
)

FlotorchCrewAIAgent

The main entry point for loading agent configurations from the FloTorch Console. It provides fully configured, CrewAI-compatible agents with minimal setup code, supporting multi-agent workflows and task-based collaboration.

FlotorchCrewAILLM

A CrewAI-compatible LLM wrapper that integrates with FloTorch Gateway for model inference, providing seamless access to managed language models with support for multi-agent Crew workflows.

FlotorchCrewAIMemory

Two memory backends are provided:

  • FlotorchMemoryStorage - External memory storage backing CrewAI’s ExternalMemory for long-term persistent storage
  • FlotorchCrewAISession (as Memory Backend) - Short-term memory storage backing CrewAI’s ShortTermMemory for conversation context

FlotorchCrewAISession

Manages persistent session storage through FloTorch Gateway, enabling conversation continuity across multiple interactions and supporting multi-agent Crew workflows.
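Conceptually, session storage keeps an ordered message history keyed by session so an agent can resume where it left off. The following in-memory sketch is illustrative only; the actual FlotorchCrewAISession persists this history through the FloTorch Gateway rather than in a local dictionary.

```python
from collections import defaultdict

class SessionStore:
    """Sketch of session storage: messages grouped by session_id so a
    conversation can be resumed. Illustrative, not the FloTorch API."""

    def __init__(self) -> None:
        self._sessions: dict[str, list[dict[str, str]]] = defaultdict(list)

    def append(self, session_id: str, role: str, content: str) -> None:
        # Record one turn of the conversation for this session.
        self._sessions[session_id].append({"role": role, "content": content})

    def history(self, session_id: str) -> list[dict[str, str]]:
        # Return a copy of the full ordered history for this session.
        return list(self._sessions[session_id])
```

Separate session IDs isolate separate conversations, which is what allows multiple crews or users to share the same storage service.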

Explore the detailed documentation for each component: