Ollama

ConsoleX can connect to any model served through Ollama, whether it is running locally or on a remote machine.

Prerequisites

  1. Make sure your Ollama endpoint is deployed and reachable over the network.
  2. If the model is running locally, expose it to the public internet through a tunnel such as Cloudflare Tunnel, ngrok, or a similar solution, so that ConsoleX can reach it.
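One way to verify the endpoint is reachable before configuring ConsoleX is to query Ollama's `/api/tags` endpoint, which lists the models the server has installed. A minimal sketch (the function name and the example URL are illustrative, not part of ConsoleX):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    /api/tags is Ollama's endpoint for listing installed models;
    a 200 response confirms the server is up and reachable.
    """
    url = base_url.rstrip("/") + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: not reachable.
        return False
```

If this returns False for your tunnel URL, fix the tunnel or firewall before adding the integration.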

Add an Ollama model in ConsoleX

In model management, click Add New Integration, choose Ollama, and fill in:

  • Model Name: the name of the model as served by Ollama (for example, `llama3`)
  • Base URL: the publicly reachable URL of your Ollama endpoint
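The two fields map directly onto Ollama's HTTP API: requests go to the Base URL's `/api/generate` path with the Model Name in the JSON body. A minimal sketch of that mapping (the function name and the example tunnel URL are illustrative assumptions):

```python
import json

def build_generate_request(base_url: str, model_name: str, prompt: str):
    """Compose the Ollama completion request the two fields translate to.

    base_url corresponds to the Base URL field; model_name to the
    Model Name field. /api/generate is Ollama's completion endpoint.
    """
    url = base_url.rstrip("/") + "/api/generate"
    payload = {"model": model_name, "prompt": prompt, "stream": False}
    return url, json.dumps(payload).encode()

url, body = build_generate_request(
    "https://my-tunnel.example.com", "llama3", "Hello"
)
print(url)  # https://my-tunnel.example.com/api/generate
```

A trailing slash on the Base URL is stripped here so the path joins cleanly; ConsoleX may or may not normalize this for you, so entering the URL without a trailing slash is the safer choice.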

After saving, test the connection to verify that the integration works correctly.
