# FloTorch CrewAI Plugin Overview
The FloTorch CrewAI Plugin provides CrewAI-compatible implementations that integrate with FloTorch’s Gateway and Workspace services. It enables you to build multi-agent AI systems using CrewAI while leveraging FloTorch’s infrastructure.
Key Features:
- Distributed Tracing - Built-in observability with configurable tracing for production monitoring
- Configurable Logging - Flexible logging configuration for debugging and monitoring (see SDK Logging Configuration)
Exports:

- `FlotorchCrewAIAgent` – Main agent manager with FloTorch integration
- `FlotorchCrewAILLM` – CrewAI-compatible LLM wrapper
- `FlotorchCrewAIMemory` – Memory service for CrewAI
- `FlotorchCrewAISession` – Session management for CrewAI
## Installation

```bash
pip install flotorch[crewai]
```

## Configuration
You can configure credentials in one of two ways:
### Option A: Environment variables (recommended)

Note: To create and manage your API keys, see the API Keys documentation.
```bash
export FLOTORCH_API_KEY="<your_api_key>"
export FLOTORCH_BASE_URL="https://gateway.flotorch.cloud"

# Optional: Configure logging
export FLOTORCH_LOG_DEBUG=true
export FLOTORCH_LOG_PROVIDER="console"        # or "file"
export FLOTORCH_LOG_FILE="flotorch_logs.log"  # if provider is "file" (default: "flotorch_logs.log")
```

For comprehensive logging configuration details, see the SDK Logging Configuration guide.
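The two options compose in the usual way: values passed to a constructor take precedence, and the SDK falls back to the `FLOTORCH_*` environment variables otherwise. The helper below is a hypothetical sketch of that precedence, not the SDK's actual code:

```python
import os

# Hypothetical sketch (illustrative only, not FloTorch SDK internals):
# explicit constructor arguments win over FLOTORCH_* environment variables.
def resolve_credentials(api_key=None, base_url=None):
    return {
        "api_key": api_key or os.environ.get("FLOTORCH_API_KEY"),
        "base_url": base_url
        or os.environ.get("FLOTORCH_BASE_URL", "https://gateway.flotorch.cloud"),
    }

os.environ["FLOTORCH_API_KEY"] = "env-key"
print(resolve_credentials()["api_key"])                    # falls back to the environment
print(resolve_credentials(api_key="ctor-key")["api_key"])  # explicit argument wins
```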
### Option B: Pass to constructors explicitly

Note: When creating an agent in the FloTorch Console, use `goal` for writing the goal and `system prompt` for writing the backstory. See Creating an Agent for detailed steps.
```python
from flotorch.crewai.agent import FlotorchCrewAIAgent

agent = FlotorchCrewAIAgent(
    agent_name="weather-agent",
    base_url="https://gateway.flotorch.cloud",
    api_key="<your_api_key>",
)
```

## Agent Creation: FloTorch vs CrewAI
## Core Components
### FlotorchCrewAIAgent

The main entry point for loading agent configurations from the FloTorch Console. It provides fully configured, CrewAI-compatible agents with minimal code setup, supporting multi-agent workflows and task-based collaboration.
### FlotorchCrewAILLM

A CrewAI-compatible LLM wrapper that integrates with FloTorch Gateway for model inference, providing seamless access to managed language models with support for multi-agent Crew workflows.
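Conceptually, such a wrapper accepts CrewAI-style chat messages and forwards them to the gateway for inference. The toy class below illustrates that shape only; the class name, constructor parameters, and `call` interface are illustrative stand-ins, not the actual `FlotorchCrewAILLM` API, and the network transport is stubbed out:

```python
# Illustrative sketch of a gateway-backed LLM wrapper. Not the real
# FlotorchCrewAILLM, which also handles auth, retries, and model routing.
class GatewayLLM:
    def __init__(self, model, base_url, api_key, transport=None):
        self.model = model
        self.base_url = base_url
        self.api_key = api_key
        # `transport` lets tests inject a fake; a real wrapper would POST
        # the payload to the gateway's inference endpoint instead.
        self.transport = transport or self._http_transport

    def _http_transport(self, payload):
        raise NotImplementedError("network transport omitted in this sketch")

    def call(self, messages):
        # Package the chat history the way a gateway endpoint would expect it.
        payload = {"model": self.model, "messages": messages}
        return self.transport(payload)

# Usage with a fake transport standing in for the gateway:
fake = lambda payload: "echo:" + payload["messages"][-1]["content"]
llm = GatewayLLM("model-x", "https://gateway.flotorch.cloud", "<key>", transport=fake)
print(llm.call([{"role": "user", "content": "hi"}]))  # → echo:hi
```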
### Memory Services

- `FlotorchMemoryStorage` – External memory storage backing CrewAI's `ExternalMemory` for long-term persistent storage
- `FlotorchCrewAISession` (as Memory Backend) – Short-term memory storage backing CrewAI's `ShortTermMemory` for conversation context
### FlotorchCrewAISession

Manages persistent session storage through FloTorch Gateway, enabling conversation continuity across multiple interactions and supporting multi-agent Crew workflows.
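The key property is that session state lives server-side, keyed by a session identifier, so any run that presents the same identifier resumes the same conversation. The toy below models that with a dict standing in for the Gateway store; it is a conceptual sketch only, not the `FlotorchCrewAISession` API:

```python
# Conceptual sketch of session continuity. The module-level dict stands in
# for FloTorch Gateway's server-side session storage.
_GATEWAY_STORE = {}

class ToySession:
    def __init__(self, session_id):
        self.session_id = session_id

    def append(self, role, content):
        # Persist one conversation turn under this session's id.
        _GATEWAY_STORE.setdefault(self.session_id, []).append(
            {"role": role, "content": content}
        )

    def history(self):
        # Any session object with the same id sees the same history.
        return list(_GATEWAY_STORE.get(self.session_id, []))

ToySession("user-42").append("user", "What's the weather?")
resumed = ToySession("user-42")  # a later run resuming the same session id
print(resumed.history()[0]["content"])  # → What's the weather?
```

Because the store is shared, conversation context survives across separate agent objects and runs, which is what enables multi-turn continuity in Crew workflows.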
## Next Steps

Explore the detailed documentation for each component:
- Agent Configuration - Comprehensive agent setup and usage patterns
- LLM Configuration - Language model setup and customization
- Memory Integration - Implement persistent memory capabilities
- Session Management - Configure session persistence and state management
## References

- Official CrewAI documentation: CrewAI docs