Ollama Models
ConsoleX supports integrating and calling any model deployed through Ollama. Whether the model runs on your local machine or on a remote server, ConsoleX can connect to it through its endpoint URL.
Prerequisites
- Deploy an Ollama endpoint that is reachable over the network.
- If the model is deployed locally, expose the endpoint to the public internet through a tunnel such as Cloudflare Tunnel or Ngrok (a quick reachability check is sketched after this list).
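Before adding the endpoint to ConsoleX, you may want to confirm it is actually reachable from outside your machine. The following is a minimal sketch, assuming a placeholder URL (`https://ollama.example.com`); it queries Ollama's `/api/tags` route, which lists the models the endpoint serves.

```python
# Sketch: verify an Ollama endpoint is reachable before adding it to ConsoleX.
# The base URL below is a placeholder; substitute your own tunnel or server address.
import json
import urllib.request

OLLAMA_BASE_URL = "https://ollama.example.com"  # hypothetical tunnel URL


def list_models(base_url: str) -> list[str]:
    """Return the model names the Ollama endpoint reports via GET /api/tags."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=10) as resp:
        data = json.load(resp)
    return [model["name"] for model in data.get("models", [])]


if __name__ == "__main__":
    print(list_models(OLLAMA_BASE_URL))
```

If this prints the models you expect, the endpoint is ready to be registered in ConsoleX.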
Integrating Ollama Models on ConsoleX
In the model management section, click "Add New Integration", select Ollama as the provider, and configure the following settings in the dialog that appears:
- Model Name: Enter the name of the Ollama model, exactly as it appears in the Ollama library or in the output of `ollama list` (for example, `llama3`).
- Base URL: Enter the URL of your Ollama endpoint (for a locally deployed model, the public tunnel URL from the prerequisites).
After adding the model, run the connectivity test to verify that the integration works. You can also query the Ollama endpoint directly, as in the sketch below.
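As a minimal sketch of what such a check exercises, the snippet below sends a single chat request to Ollama's `/api/chat` route. The base URL and the model name `llama3` are placeholders; use the values you configured in ConsoleX.

```python
# Sketch: send a minimal chat request directly to the Ollama endpoint to
# confirm the model name and base URL are valid. URL and model name are
# placeholders; use the values you configured in ConsoleX.
import json
import urllib.request

OLLAMA_BASE_URL = "https://ollama.example.com"  # hypothetical tunnel URL
MODEL_NAME = "llama3"                            # example model from the Ollama library

payload = json.dumps({
    "model": MODEL_NAME,
    "messages": [{"role": "user", "content": "Say hello in one word."}],
    "stream": False,  # request a single JSON response instead of a stream
}).encode("utf-8")

request = urllib.request.Request(
    f"{OLLAMA_BASE_URL}/api/chat",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request, timeout=60) as resp:
    reply = json.load(resp)

print(reply["message"]["content"])
```

If this returns a response from the model, the same model name and base URL should work in ConsoleX.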
References
- List of models supported by Ollama: https://ollama.com/library
- Cloudflare Tunnel usage guide: https://developers.cloudflare.com/cloudflare-one/connections/connect-networks/get-started/create-remote-tunnel