Custom Models
ConsoleX AI lets you connect your own model endpoints through BYOK (Bring Your Own Key). This guide explains how to configure and use private model endpoints inside ConsoleX.
Supported model types
ConsoleX AI supports model endpoints from the following providers and platforms:
- OpenAI
- Azure OpenAI
- Anthropic
- Google AI Studio
- Google Vertex AI
- AWS Bedrock
- Ollama
- DeepSeek
- xAI
- Groq
- Qwen
- OpenRouter
- Mistral
- Any endpoint compatible with the OpenAI API
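In practice, "compatible with the OpenAI API" means the endpoint accepts the standard chat completions request shape: a `POST` to `<base URL>/chat/completions` with a Bearer token and a JSON body containing `model` and `messages`. The sketch below builds such a request; the helper name and example values are illustrative, not part of ConsoleX.

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Build the URL, headers, and JSON body that an
    OpenAI-compatible chat completions endpoint expects."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Placeholder endpoint and key for illustration
url, headers, body = build_chat_request(
    "https://models.example.com/v1",
    "sk-your-key",
    "my-model",
    [{"role": "user", "content": "Hello"}],
)
```

Any server that answers this request shape (self-hosted or third-party) can be added as a custom model.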
How to add a custom model
To connect your own model endpoint:
1. In the lower-right corner of the message input area, open the model dropdown and go to the Model Hub.
2. Click Add Custom Model in the upper-right corner.
3. Choose a provider from the list.
4. Fill in the required endpoint details, such as the API key and base URL.
5. Click Save.
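Before saving, it can help to sanity-check the details you entered. A minimal sketch of such a check is below; the field names (`api_key`, `base_url`) are assumptions for illustration, not the exact ConsoleX form fields.

```python
from urllib.parse import urlparse

def validate_endpoint_config(config):
    """Return a list of problems with a custom model config
    (empty list means the basic fields look usable)."""
    errors = []
    if not config.get("api_key", "").strip():
        errors.append("api_key is empty")
    parsed = urlparse(config.get("base_url", ""))
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        errors.append("base_url must be an absolute http(s) URL")
    return errors

# Placeholder values for illustration
print(validate_endpoint_config(
    {"api_key": "sk-example", "base_url": "https://models.example.com/v1"}
))  # → []
```

A check like this catches the most common mistakes (a pasted key with stray whitespace, a base URL missing its scheme) before the first real request fails.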
Usage limits
- Calls made through private model endpoints do not consume your ConsoleX balance.
- Free users can connect up to 3 private model endpoints.
- Paid users can connect an unlimited number of private model endpoints.