# AI Agent

Biom’s AI agent is a conversational interface that lets you analyze data, run models, generate pipelines, organize files, and execute custom Python code — all through natural language.

## Opening the AI panel

Click the AI icon in the sidebar, or use the prompt bar at the bottom of the screen. The AI panel opens as a chat interface alongside your viewer.

## File references
Reference files in your prompts using the @ mention syntax:

- Type @ in the prompt bar
- A file picker popover appears showing your files and folders
- Select a file or type to filter
- The file appears as an inline badge in your prompt
## Smart suggestions

When you drag and drop a file onto a pane, Biom suggests relevant actions based on the file type:

| File Type | Suggested Actions |
|---|---|
| Scientific TIFF (.tif, .ome.tif) | SAM3 segmentation, PlantCV analysis, Suite2p calcium imaging |
| Microscopy (.czi, .nd2, .lif) | SAM3 segmentation, PlantCV analysis, Suite2p calcium imaging |
| Standard images (.png, .jpg) | Segmentation, PlantCV, SExtractor |
| Video (.mp4, .avi, .mov) | DeepLabCut tracking, SAM3 video segmentation |
| DICOM (.dcm) | Segmentation, description |
| Neural data (.nwb, .h5, .bin) | SpikeInterface spike sorting |
| Astronomy (.fits) | SExtractor source extraction |
| All files | Generate a processing pipeline |
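The table above amounts to an extension lookup with a universal fallback. Here is a minimal sketch of that logic; the `suggest_actions` helper and dictionary structure are illustrative assumptions, not Biom's actual API:

```python
# Hypothetical sketch of the file-type -> suggested-actions table above.
# The helper name and data structure are illustrative, not Biom's real API.
SUGGESTIONS = {
    # Multi-part extensions must come before their single-part suffixes.
    ".ome.tif": ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    ".tif": ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    ".czi": ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    ".nd2": ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    ".lif": ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    ".png": ["Segmentation", "PlantCV", "SExtractor"],
    ".jpg": ["Segmentation", "PlantCV", "SExtractor"],
    ".mp4": ["DeepLabCut tracking", "SAM3 video segmentation"],
    ".avi": ["DeepLabCut tracking", "SAM3 video segmentation"],
    ".mov": ["DeepLabCut tracking", "SAM3 video segmentation"],
    ".dcm": ["Segmentation", "description"],
    ".nwb": ["SpikeInterface spike sorting"],
    ".h5": ["SpikeInterface spike sorting"],
    ".bin": ["SpikeInterface spike sorting"],
    ".fits": ["SExtractor source extraction"],
}

def suggest_actions(filename: str) -> list[str]:
    """Return suggested actions for a dropped file.

    Every file type also gets the universal pipeline-generation option.
    """
    name = filename.lower()
    for ext, actions in SUGGESTIONS.items():
        if name.endswith(ext):
            return actions + ["Generate a processing pipeline"]
    return ["Generate a processing pipeline"]
```

For example, dropping `scan.ome.tif` would surface the scientific-TIFF actions, while an unrecognized file type falls back to pipeline generation alone.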
## Chat features
- Markdown rendering — responses are formatted with headings, lists, code blocks, and tables
- Inline images — analysis results appear as images in the chat
- Action buttons — suggested follow-up actions appear as clickable buttons
- Analysis badges — results like cell counts display as inline badges
- Chart visualizations — bar charts and histograms render directly in chat
- Segmentation overlays — segmentation masks appear on the canvas
## LLM providers

Biom supports multiple LLM backends:

| Provider | Default Model | Notes |
|---|---|---|
| OpenAI | gpt-4o-mini | Default provider |
| Anthropic | Claude Sonnet | Direct API integration |
| Ollama | Llama 3.1 | Self-hosted, local models |