AI Agent

Biom’s AI agent is a conversational interface that lets you analyze data, run models, generate pipelines, organize files, and execute custom Python code — all through natural language.

Opening the AI panel

Click the AI icon in the sidebar, or use the prompt bar at the bottom of the screen. The AI panel opens as a chat interface alongside your viewer.

File references

Reference files in your prompts using the @ mention syntax:
  1. Type @ in the prompt bar
  2. A file picker popover appears showing your files and folders
  3. Select a file or type to filter
  4. The file appears as an inline badge in your prompt
You can reference multiple files in a single prompt. The AI agent will use the referenced files as context for your request.
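The mention flow above can be sketched in a few lines of Python. This is a hypothetical illustration of how @-mentions might be extracted from a prompt and matched against a workspace file list; the regex, the `extract_mentions` helper, and the suffix-matching rule are assumptions, not Biom's actual implementation.

```python
import re

# Match "@" followed by a run of non-whitespace characters (the mentioned name).
MENTION = re.compile(r"@(\S+)")

def extract_mentions(prompt: str, workspace_files: list[str]) -> list[str]:
    """Return workspace files referenced via @ in the prompt."""
    mentioned = MENTION.findall(prompt)
    # A file counts as referenced if its path ends with any mentioned name.
    return [f for f in workspace_files if any(f.endswith(m) for m in mentioned)]
```

A prompt like `"segment @scan.tif and compare with @results.csv"` would resolve both badges against the workspace before the request is sent to the model.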

Smart suggestions

When you drag and drop a file onto a pane, Biom suggests relevant actions based on the file type:
File Type                        | Suggested Actions
Scientific TIFF (.tif, .ome.tif) | SAM3 segmentation, PlantCV analysis, Suite2p calcium imaging
Microscopy (.czi, .nd2, .lif)    | SAM3 segmentation, PlantCV analysis, Suite2p calcium imaging
Standard images (.png, .jpg)     | Segmentation, PlantCV, SExtractor
Video (.mp4, .avi, .mov)         | DeepLabCut tracking, SAM3 video segmentation
DICOM (.dcm)                     | Segmentation, description
Neural data (.nwb, .h5, .bin)    | SpikeInterface spike sorting
Astronomy (.fits)                | SExtractor source extraction
All files                        | Generate a processing pipeline
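The mapping in the table above amounts to an extension-to-actions lookup. Here is a minimal sketch of that dispatch in Python; the `SUGGESTIONS` table and `suggest_actions` helper are illustrative names, not Biom's internal API.

```python
# Map file-extension groups to the suggested actions from the table above.
SUGGESTIONS = {
    (".ome.tif", ".tif"): ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    (".czi", ".nd2", ".lif"): ["SAM3 segmentation", "PlantCV analysis", "Suite2p calcium imaging"],
    (".png", ".jpg"): ["Segmentation", "PlantCV", "SExtractor"],
    (".mp4", ".avi", ".mov"): ["DeepLabCut tracking", "SAM3 video segmentation"],
    (".dcm",): ["Segmentation", "description"],
    (".nwb", ".h5", ".bin"): ["SpikeInterface spike sorting"],
    (".fits",): ["SExtractor source extraction"],
}

def suggest_actions(filename: str) -> list[str]:
    """Return type-specific suggestions, plus the pipeline action offered for all files."""
    name = filename.lower()
    for extensions, actions in SUGGESTIONS.items():
        if name.endswith(extensions):  # str.endswith accepts a tuple of suffixes
            return actions + ["Generate a processing pipeline"]
    return ["Generate a processing pipeline"]
```

Unrecognized file types still get the catch-all "Generate a processing pipeline" suggestion, matching the "All files" row.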

Chat features

  • Markdown rendering — responses are formatted with headings, lists, code blocks, and tables
  • Inline images — analysis results appear as images in the chat
  • Action buttons — suggested follow-up actions appear as clickable buttons
  • Analysis badges — results like cell counts display as inline badges
  • Chart visualizations — bar charts and histograms render directly in chat
  • Segmentation overlays — segmentation masks appear on the canvas

LLM providers

Biom supports multiple LLM backends:
Provider  | Default Model | Notes
OpenAI    | gpt-4o-mini   | Default provider
Anthropic | Claude Sonnet | Direct API integration
Ollama    | Llama 3.1     | Self-hosted, local models
The provider is selected automatically based on configured API keys, or can be set explicitly.