ConsoleX AI provides a unified interface for accessing more than 200 models from all mainstream providers, including OpenAI, Anthropic, Google Gemini, Amazon Bedrock, and Ollama, either through out-of-the-box shared endpoints or by integrating your own endpoints with API keys.
Access shared endpoints
To make LLMs easier to use, ConsoleX AI provides a set of out-of-the-box shared endpoints that you can use directly without any configuration.
Using shared endpoints consumes your Credits on ConsoleX. The prices for shared endpoints are typically 1.3 times the official model prices, which helps keep the service sustainable. We reserve the right to adjust our pricing strategy within a reasonable range in the future.
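As a rough illustration of the 1.3x multiplier, here is a minimal Python sketch. The per-million-token prices and token counts below are hypothetical placeholders, not real rates, and how the resulting amount converts into Credits is not covered here.

```python
# Rough sketch of the 1.3x shared-endpoint multiplier.
# Official prices and token counts below are hypothetical placeholders.

SHARED_ENDPOINT_MULTIPLIER = 1.3

def shared_endpoint_cost(input_tokens: int, output_tokens: int,
                         official_input_price_per_m: float,
                         official_output_price_per_m: float) -> float:
    """Estimate the cost of one request on a shared endpoint."""
    official_cost = (
        input_tokens / 1_000_000 * official_input_price_per_m
        + output_tokens / 1_000_000 * official_output_price_per_m
    )
    return official_cost * SHARED_ENDPOINT_MULTIPLIER

# Example: 2,000 input tokens and 500 output tokens against a model whose
# hypothetical official prices are $3 / $15 per million tokens (~0.018).
print(shared_endpoint_cost(2_000, 500, 3.0, 15.0))
```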
Integrate your own endpoints
You can also integrate your own endpoints with API keys. Using your own endpoints on ConsoleX does not consume your Credits; usage is billed by your endpoint provider instead.
To integrate your own endpoints, you can follow the steps below:
Click the "Endpoints" link on the bottom left corner of the page.Click the "Add Private Endpoint" button on the top right of the page.
Choose a model provider from the list.
Enter the necessary information for the endpoint, such as the API key and base URL (a quick way to sanity-check these values is sketched after these steps).
Click the "Save" button.
Endpoint management
You can choose which endpoints are enabled on the "Endpoints" page. Only the endpoints you have enabled will be available in the chat interface.
Set default endpoint
You can set a default endpoint and preset parameters for your account on the "Settings" page. ConsoleX AI will then use this endpoint as the default for each new chat session.
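The exact presets available are defined on the Settings page; as a general illustration only, such presets typically correspond to standard sampling parameters as used in an OpenAI-style request. The model name and values below are hypothetical, not ConsoleX defaults.

```python
# Illustration only: common sampling parameters that a default-endpoint preset
# might cover, shown as they would appear in an OpenAI-style request body.
# The model name and values are hypothetical, not ConsoleX defaults.
preset = {
    "model": "gpt-4o-mini",   # hypothetical default model
    "temperature": 0.7,       # randomness of sampling
    "top_p": 1.0,             # nucleus-sampling cutoff
    "max_tokens": 1024,       # cap on generated tokens per reply
}
```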