
How to integrate and use Ollama models?

Written by Quentin
Updated over 6 months ago

ConsoleX supports integrating and invoking any model deployed via Ollama, giving users great flexibility. Whether the model runs locally or on a remote server, ConsoleX can connect to it seamlessly.

Preparations

1. Ensure that you have deployed an Ollama model endpoint that is accessible via the web.

2. If your model is deployed locally, you will need to expose the endpoint to the public internet through a tunnel, for example with Cloudflare Tunnel (Zero Trust) or a similar solution. A quick way to confirm the endpoint is reachable is sketched below.
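Before integrating, it can help to confirm that the endpoint is reachable from outside your network. The following is a minimal sketch, assuming your Ollama API is exposed at a public URL (replace the example hostname with your own tunnel URL); it calls Ollama's standard /api/tags route, which lists the models available on the server.

```python
import json
import urllib.request

# Replace with the public URL of your Ollama endpoint
# (e.g. the hostname created by your Cloudflare Tunnel).
BASE_URL = "https://ollama.example.com"

# GET /api/tags lists the models the Ollama server currently has available.
with urllib.request.urlopen(f"{BASE_URL}/api/tags", timeout=10) as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model["name"])
```

If this prints the models you expect, the endpoint is ready to be added in ConsoleX.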

Integrating Ollama Models in ConsoleX

In the model management section, click on "Add New Integration," select Ollama as the provider, and configure the settings in the popup interface as follows:

Model Name: Enter the name of the Ollama model.

Base URL: Enter the endpoint URL.

After adding the model, you can run a connectivity test to verify that it was integrated successfully. If you also want to check the endpoint independently of ConsoleX, see the sketch below.
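The following is a minimal sketch for checking that the model actually responds, assuming the same public base URL as above and a model name (here llama3, as an example) that is already pulled on the server; it calls Ollama's /api/generate route with streaming disabled.

```python
import json
import urllib.request

BASE_URL = "https://ollama.example.com"  # your public Ollama endpoint (example value)
MODEL = "llama3"                         # any model already pulled on the server (example value)

# POST /api/generate with "stream": false returns a single JSON response.
payload = json.dumps({
    "model": MODEL,
    "prompt": "Reply with the single word: pong",
    "stream": False,
}).encode("utf-8")

req = urllib.request.Request(
    f"{BASE_URL}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=30) as resp:
    print(json.load(resp)["response"])
```

A successful reply here indicates both the endpoint and the model name are correct, so the same values should work in the integration settings.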

References

Ollama Supported Model List

Cloudflare Tunnel Usage Guide
