Triform
Triform is a visual platform for building, running, and monitoring AI-powered systems. Users compose logic on a Canvas using Nodes (Agents, Flows, Actions) connected by Edges. Triton, the built-in AI assistant, helps build, modify, and debug Projects through natural conversation.
Core Concepts
Projects are complete, deployable AI systems containing Actions, Agents, and Flows. Agents are LLM-powered components with prompts, model configurations, and tools. Actions are atomic Python functions with typed inputs and outputs (see the sketch after these notes). Flows are orchestration graphs whose edges define data flow between components. Important notes:
- The Canvas is the visual workspace; the Chat Panel (Cmd/Ctrl+K) accesses Triton
- All components live within Organizations → Projects → Library hierarchy
- Executions are tracked with logs, traces, and metrics for debugging
- Projects deploy to staging/production and expose API endpoints
- Dependencies must be pure Python (no binary installs)
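To make the Action concept concrete, here is a minimal sketch of an atomic Python function with typed inputs and outputs. The `run` entry point, the TypedDict contracts, and the function name are assumptions for illustration only; Triform's actual Action structure and I/O conventions are documented in the Actions page below.

```python
from typing import TypedDict

# Hypothetical typed I/O contract for an Action.
# Triform's real conventions (entry-point name, schema declaration) may differ.
class SummarizeInput(TypedDict):
    text: str
    max_words: int

class SummarizeOutput(TypedDict):
    summary: str
    word_count: int

def run(payload: SummarizeInput) -> SummarizeOutput:
    """Atomic Action: trim the input text to at most `max_words` words."""
    words = payload["text"].split()
    kept = words[: payload["max_words"]]
    return {"summary": " ".join(kept), "word_count": len(kept)}
```

Note that the sketch uses only the standard library, consistent with the pure-Python dependency rule above.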
Getting Started
- Quickstart: 5-minute guide to sign in, create a Flow with Triton, and execute it
- Workspace Overview: Canvas, Properties Panel, Chat Panel, and navigation
- Login: Authentication via Discord or GitHub
Core Concepts
- Agents: LLM logic with prompts, tools, and observability patterns
- Flows: Graph building blocks (Input/Output nodes, edges) and patterns (linear, branching, parallel)
- Actions: Python structure, I/O contracts, and workflow
- Glossary: Comprehensive definitions of all Triform terms
- Projects: Organization, structure, and deployment
- Executions: Running components, viewing results, and debugging
- Payloads: JSON input data format and schema matching
- Triggers: Webhooks, schedules, and manual execution
Triton AI Assistant
- Triton Overview: What Triton can do, how to use effectively, workflows, and limitations
- Building with Triton: Creating Agents, Actions, Flows from natural language
- Defining Projects: Conversational Project creation and structure
- Editing Components: Modifying, debugging, and optimizing existing work
Workspace Interface
- Canvas Overview: Main visual workspace and navigation
- Flow View Basics: Nodes, edges, and graph visualization
- Create and Connect: Adding nodes and wiring components
- Node Interactions: Selection, configuration, and manipulation
- Properties Panel Overview: Component configuration interface
- Execute Panel: Running components with payloads and viewing results
- Chat Panel: Interacting with Triton
Tutorials
- Build a New Project: End-to-end guide to creating a complete Project
- Edit an Existing Project: Modifying and extending Projects
- Edit a Specific Component: Focused changes to Actions, Agents, or Flows
- Integrate Project into Your App: API keys, endpoints, and external integration
API Reference
- API Introduction: Authentication with Bearer tokens (see the sketch after this list)
- OpenAPI Specification: Complete API schema
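As a rough illustration of Bearer-token authentication, the snippet below shows the general shape of an authenticated request. The base URL, path, and payload fields are placeholders, not the real Triform API; consult the API Introduction and the OpenAPI Specification for the actual endpoints and schema.

```python
import requests

TRIFORM_API_KEY = "your-api-key"          # issued via API Keys (see Optional section)
BASE_URL = "https://api.triform.example"  # placeholder base URL, not the real host

# Hypothetical execution endpoint and payload shape, for illustration only.
response = requests.post(
    f"{BASE_URL}/v1/executions",
    headers={"Authorization": f"Bearer {TRIFORM_API_KEY}"},
    json={"payload": {"text": "Hello, Triform"}},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```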
Optional
- Project Variables: Environment-specific configuration and secrets
- API Keys: Creating and managing authentication tokens
- Deployments: Publishing to staging and production
- Quick Reference: Cheat sheet for common operations
- Organization Admin: Managing Organizations
- Members and Roles: Team collaboration and permissions
- Quotas Overview: Account limits and resource usage
- Security Overview: Data protection and compliance
- Changelog: Platform updates and new features
- Community: Discord server and support channels
About this file
This page follows the llms.txt specification, a proposed standard for providing LLM-friendly information to help AI assistants use website documentation at inference time. The plain-text version is available at /llms.txt.