Quick Start
Let's build a quick AI-powered solution on Triform in a couple of minutes. For this example, we'll build a tool that answers a question about a website passed in as a URL, using Jina.ai's reader API and an LLM provided by Berget.ai.
Creating a project
To create your first project, navigate to the dashboard and click the plus button in the top left. Give your project a suitable name and intention.
Creating your first flow
Once you've created a project, we'll have to place the first building block in it: a flow. A flow is just a DAG (directed acyclic graph), or, even more simply, a container of individual actions and other flows. Each project consists of top-level flows; think of each top-level flow as, for example, an API endpoint or a single branch of your project. In this case, our only top-level flow will be the tool that handles all questions.
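To make the DAG idea concrete, here is a minimal sketch (plain Python, not Triform code; the node names are just illustrations of the flow we're about to build) that represents a flow as a mapping from each node to the nodes its output feeds into, and computes the order the nodes would run in:

```python
from collections import deque

# A hypothetical flow: the input feeds a scraper action, whose output feeds an answerer.
flow = {"input": ["scraper"], "scraper": ["answerer"], "answerer": []}

def execution_order(dag: dict[str, list[str]]) -> list[str]:
    """Kahn's algorithm: repeatedly run nodes whose dependencies are all done."""
    indegree = {node: 0 for node in dag}
    for targets in dag.values():
        for target in targets:
            indegree[target] += 1
    ready = deque(node for node, degree in indegree.items() if degree == 0)
    order: list[str] = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for target in dag[node]:
            indegree[target] -= 1
            if indegree[target] == 0:
                ready.append(target)
    return order

print(execution_order(flow))  # ['input', 'scraper', 'answerer']
```

Because the graph is acyclic, there is always a valid order, which is what lets Triform execute a flow deterministically.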
Creating your first action
Now that we have a flow, we'll build our tool inside it. Let's start by creating an action that scrapes the website, which will act as the source of truth for the question asked. You can create any type of node with the node selector that appeared when you entered your (so far empty) flow. The input to the action will be the input to the parent flow, as it is already connected to the "input" node (the blue one with an arrow). You can open the node selector at any time by dragging from any available handle on a node and dropping it anywhere; this gesture will feel familiar if you have ever used, for example, the 3D software Blender.
Select the node in the canvas and various properties of the node will appear on the right. Expand the code editor to see the action's code.
We've provided the code for this action below in case you don't want to write it yourself. The function decorated with @triform.entrypoint is the one that will be executed. Notice that we pass through the original question, as it will be needed by the next action. Don't forget to save afterwards!
"""
Scraper
Purpose: This module provides a simple web scraping functionality using the r.jina.ai service.
It takes a URL and a question as input, scrapes the website content in Markdown format,
and returns the scraped content along with the original question.
Args:
scraper_input (ScraperInputModel): Contains the URL to scrape and a question to pass through.
Returns:
ScraperOutputModel: Contains the scraped Markdown content and the original question.
"""
from __future__ import annotations
import requests # For making HTTP requests to the Jina API
from pydantic import BaseModel, ConfigDict
# --- Input Models ---
class ScraperInputModel(BaseModel):
model_config = ConfigDict(populate_by_name=True)
url: str
question: str
# --- Output Models ---
class ScraperOutputModel(BaseModel):
model_config = ConfigDict(populate_by_name=True)
markdown: str
question: str
@triform.entrypoint
def main(scraper_input: ScraperInputModel) -> ScraperOutputModel:
"""
Main function for the code
Args:
scraper_input: Input for the Scraper component
Returns:
Output from the Scraper component
"""
api_url: str = f"https://r.jina.ai/{scraper_input.url}"
headers: dict[str, str] = {
"Accept": "text/markdown", # Request Markdown content
"X-Return-Format": "markdown", # Jina specific header for markdown
}
# It is good practice to set a timeout for network requests.
response = requests.get(api_url, headers=headers, timeout=30)
response.raise_for_status() # Raise HTTPError for bad responses (4xx or 5xx)
markdown_content: str = response.text
return ScraperOutputModel(
markdown=markdown_content, question=scraper_input.question
)
Great! If you'd like to try out your newly created action, you can use the Execution panel right below the Code panel where we placed the code. For the payload, you can use, for example:
```json
{
  "url": "https://docs.triform.ai",
  "question": "What is Triform?"
}
```
Finishing your flow
Now that we have our scraped markdown data, we need to actually answer the user's question. For this, we'll need another action that calls an external LLM API. We'll use Llama 3.1 8B Instruct, provided by Berget.ai. You know the drill by now: create an action and write your code in it, or just copy the code below.
"""
Answerer
Purpose: This component answers questions using Berget.ai's OpenAI-compliant API.
It takes a question and markdown content as input, then queries the LLM to generate an answer.
The API key is loaded from the LLM_KEY environment variable.
"""
from __future__ import annotations
import json
import os
import urllib.request
from pydantic import BaseModel, ConfigDict
# --- Environment Variables ---
# LLM_KEY: The API key for the Berget.ai LLM API (required)
# --- Input Models ---
class AnswererInputModel(BaseModel):
"""
Input parameters for Answerer component
"""
model_config = ConfigDict(populate_by_name=True)
question: str
markdown: str
# --- Output Models ---
class AnswererOutputModel(BaseModel):
"""
Answer output from Answerer component
"""
model_config = ConfigDict(populate_by_name=True)
answer: str
def _call_llm_api(api_key: str, question: str, context: str) -> str:
"""
Calls the Berget.ai LLM API to get an answer to the question based on the context.
Args:
api_key: The API key for Berget.ai.
question: The question to ask the LLM.
context: The markdown content to provide as context.
Returns:
The answer from the LLM.
Raises:
ValueError: If the API key is missing.
RuntimeError: If the API call fails or returns an unexpected response.
"""
if not api_key:
raise ValueError(
"API key for Berget.ai is missing. Set the LLM_KEY environment variable."
)
api_url = "https://api.berget.ai/v1/chat/completions"
system_prompt = (
"You are an AI assistant. Answer the user's question based on the provided markdown content. "
"If the answer cannot be found in the content, state that clearly. "
"Be concise and stick to the information given in the markdown."
)
user_prompt = (
f"Markdown Content:\n```markdown\n{context}\n```\n\nQuestion: {question}"
)
payload = {
"model": "llama-3.1-8b-instruct",
"messages": [
{"role": "system", "content": system_prompt},
{"role": "user", "content": user_prompt},
],
"max_tokens": 500,
"temperature": 0.2, # Low temperature for factual, grounded answers
}
headers = {
"Authorization": f"Bearer {api_key}",
"Content-Type": "application/json",
}
data = json.dumps(payload).encode("utf-8")
req = urllib.request.Request(api_url, data=data, headers=headers, method="POST")
with urllib.request.urlopen(req) as response:
if response.status == 200:
response_body = json.loads(response.read().decode("utf-8"))
if response_body.get("choices") and len(response_body["choices"]) > 0:
message = response_body["choices"][0].get("message")
if message and message.get("content"):
return message["content"].strip()
raise RuntimeError(
f"LLM API response is not in the expected format: {response_body}"
)
else:
raise RuntimeError(
f"LLM API call failed with status {response.status}: {response.read().decode()}"
)
@triform.entrypoint
def main(answerer_input: AnswererInputModel) -> AnswererOutputModel:
"""
Main function for the code
Args:
answerer_input: Input parameters for Answerer component
Returns:
Answer output from Answerer component
"""
api_key = os.environ.get("LLM_KEY")
if not api_key:
# In a real application, consider a more robust way to handle missing API keys,
# perhaps by raising a custom exception or returning a specific error response.
raise ValueError(
"LLM_KEY environment variable not set. Please set it before running the application."
)
answer_text = _call_llm_api(
api_key=api_key,
question=answerer_input.question,
context=answerer_input.markdown,
)
return AnswererOutputModel(answer=answer_text)
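The trickiest part of the action above is unpacking the chat-completions response. If it helps, here is the same extraction logic isolated and run against a hand-written example body (the answer string is made up for illustration):

```python
def extract_answer(response_body: dict) -> str:
    # Mirrors the checks in _call_llm_api: take the first choice's message content.
    if response_body.get("choices") and len(response_body["choices"]) > 0:
        message = response_body["choices"][0].get("message")
        if message and message.get("content"):
            return message["content"].strip()
    raise RuntimeError(
        f"LLM API response is not in the expected format: {response_body}"
    )

# A hand-written body following the OpenAI-compatible response shape.
example_body = {
    "choices": [
        {"message": {"role": "assistant", "content": "  An example answer.  "}}
    ]
}
print(extract_answer(example_body))  # An example answer.
```

The nested `.get()` checks mean a malformed body raises a descriptive error instead of an opaque `KeyError`.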
Now you might be thinking: how do we give the action access to a secret API token to interact with the LLM API?
Brilliant question! Open the last panel in the right sidebar, conveniently labelled "Environment Variables", and attach a new secret variable with the key LLM_KEY and your Berget.ai API key as the value.
Try it out
And that's it! Let's try running the entire flow now. Go up a level by clicking the blue "input" node, select your newly created flow, and execute it. You can use the same example payload as before:
```json
{
  "url": "https://docs.triform.ai",
  "question": "What is Triform?"
}
```
Pretty easy, right?
By the way, all the code in this example was generated by the AI Builder.