
Ollama

Leverage Ollama to run LLMs locally, directly on your device.

Provides access to locally-run Ollama models for AI agent nodes without external API calls.

Connect this to AI agents when you want to use local, privacy-focused LLMs running on your machine via Ollama.
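As a quick illustration, here is a minimal sketch of a chat request against a locally running Ollama server, assuming a default install on port 11434 and the official `ollama` Python package; the model name `llama3` is only an example.

```python
# pip install ollama -- the client talks to the local server on port 11434
import ollama

# "llama3" is an example; use any model you have pulled with `ollama pull <name>`.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain what a local LLM is in one sentence."}],
)
print(response["message"]["content"])
```

Because the request never leaves localhost, no external API key is involved.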

Setting | Notes
------- | -----
Model | Source-backed field from the node schema.
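How the Model field gets its options is defined by the node schema, but a source-backed list can be derived from the local Ollama server itself, which reports installed models at `/api/tags`. A rough sketch, assuming the default port and the `requests` library:

```python
import requests

# Ollama's local REST API lists installed models at /api/tags.
resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print(models)  # e.g. ["llama3:latest", "mistral:latest"]
```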

Returns a chat model dependency that can be connected to AI agent nodes (a hypothetical sketch of such a dependency follows the note below).

  • No explicit credential or node dependency is declared in the node description.
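The exact shape of the dependency is defined by the host tool and not documented here; as a hypothetical sketch (all names below are illustrative), a chat-model dependency might wrap Ollama behind a single method that agent nodes call:

```python
from typing import Protocol

import ollama


class ChatModel(Protocol):
    """Hypothetical contract an AI agent node might consume."""

    def chat(self, prompt: str) -> str: ...


class OllamaChatModel:
    """Wraps a local Ollama model behind the ChatModel interface."""

    def __init__(self, model: str) -> None:
        self.model = model

    def chat(self, prompt: str) -> str:
        response = ollama.chat(
            model=self.model,
            messages=[{"role": "user", "content": prompt}],
        )
        return response["message"]["content"]
```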

Connect Ollama to an AI agent or dependency input that accepts this dependency type, then run the agent with data from previous nodes.
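For example, the agent might pass the previous node's output to the local model as context. A minimal sketch, where the variable `previous_node_output` and the model name are illustrative stand-ins:

```python
import ollama

# `previous_node_output` stands in for whatever the upstream node produced,
# e.g. scraped page text. The name is illustrative only.
previous_node_output = "text collected by an earlier node"

response = ollama.chat(
    model="llama3",  # example model name
    messages=[
        {"role": "system", "content": "You are a helpful summarizer."},
        {"role": "user", "content": f"Summarize:\n{previous_node_output}"},
    ],
)
print(response["message"]["content"])
```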

  • Check that required settings are present before running the node (see the preflight sketch after this list).
  • If the node uses browser page data, run it on the target tab after the page has loaded.
  • If it calls an external service, verify credentials, permissions, and rate limits.
  • No dedicated source test was found next to this node; verify behavior manually when changing this page.
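A minimal preflight sketch for the first checklist item, assuming a default local install on port 11434; for Ollama there are no credentials or rate limits to verify, only server reachability and model availability:

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default; adjust if Ollama runs elsewhere
MODEL = "llama3"                       # the model chosen in the node's Model setting

# Confirm the local server is reachable before the node runs.
try:
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
except requests.RequestException as exc:
    raise SystemExit(f"Ollama server not reachable at {OLLAMA_URL}: {exc}")

# Confirm the configured model is actually installed.
installed = {m["name"].split(":")[0] for m in resp.json().get("models", [])}
if MODEL.split(":")[0] not in installed:
    raise SystemExit(f"Model {MODEL!r} is not installed; run `ollama pull {MODEL}`.")
print("Preflight OK")
```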