Model Usage Overview

ConsoleX AI provides a unified, easy-to-use conversational interface for accessing cutting-edge models from mainstream model providers and cloud vendors such as OpenAI, Anthropic, Google Gemini, Amazon Bedrock, and Ollama. You can use ready-to-use shared models or integrate your own private model endpoints through API keys and other configurations.

Using Shared Models

For convenient access to large language models (LLMs), ConsoleX AI provides a set of ready-to-use shared models that you can use directly without any configuration. Using shared models consumes your Credits balance on ConsoleX. Shared models are divided into essential models and premium models: any user can use essential models with their credits, while premium models are available to subscription members only. View Details

Using Custom Private Models

You can also integrate your own private model endpoints through API keys. Using your own endpoints does not consume your ConsoleX balance; costs are billed directly by the model provider. ConsoleX AI supports an extensive range of model providers and applies secure encryption measures to protect your API keys. View Details
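
For context, a private model endpoint of the kind you register here typically consists of a base URL, an API key, and a model name. Below is a minimal, illustrative sketch of calling such an endpoint directly with the OpenAI-compatible Python SDK; the base URL, key, and model name are placeholders, and this is not ConsoleX-specific code.

```python
# Illustrative only: an OpenAI-compatible private endpoint is defined by
# a base URL, an API key, and a model name. All values are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.your-provider.example/v1",  # hypothetical provider URL
    api_key="YOUR_PROVIDER_API_KEY",                   # the key you would register in ConsoleX
)

response = client.chat.completions.create(
    model="your-model-name",  # placeholder model identifier
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

When you register such an endpoint in ConsoleX AI, the platform makes the call on your behalf, so requests are billed by the provider rather than against your Credits.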

Model Endpoint Management

On the "Model Hub" page you can view all models, including shared models and your private model endpoints, and enable or disable any of them at any time. Only enabled model endpoints are available in the chat interface. On the model endpoint management page, you can also set an alias for any model for easier identification.

Artifacts Preview

Click the icon below the message input box to enable Artifacts preview for the current conversation and view generated results in a more visual way in a dedicated window. View Details

Scheduled Chat

Let AI execute preset conversation tasks on a schedule, automating your daily workflows. View Details

More Features

  • Switch Models Anytime

    When initiating an AI conversation, you can see the list of enabled models at the bottom right of the input box and switch the model used for your next message at any time.

  • Adjust Parameter Options

    Click the icon in the top right corner of the page to open more parameter options for the current model in the right sidebar, such as temperature, maximum output tokens, Top-p, Top-k, frequency penalty, presence penalty, thinking mode, reasoning depth, and structured output. The supported options and adjustment ranges vary by model; see the sketch after this list for what these parameters typically mean in an API request.

  • Image and File Upload

    By clicking the icon below the message input box, you can directly upload files or use files from the file box. Various common image formats, PDF, Word/Excel/PPT/Markdown document formats, and common program files are supported.

  • Compare Different Model Generation Outputs

    Below the model response, click the regenerate icon to switch to another model and regenerate the result for easy comparison.

  • Set Default Endpoint

    You can set the default model and preset parameters for your account under "Settings > Chat Settings" in the bottom left menu. ConsoleX AI will then use this model as the default endpoint in every new chat session.

  • Manage Conversations with Folders

    Organize your conversations effortlessly by creating color-coded folders. Categorize and save important chats with custom color identifiers to keep your message history neatly arranged and easily accessible. View Details
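
As referenced in "Adjust Parameter Options" above, here is a minimal, illustrative sketch of what parameters such as temperature, Top-p, and maximum output tokens correspond to in a typical OpenAI-compatible chat request. This is not ConsoleX-specific code, and the model name and values are placeholders; Top-k, thinking mode, reasoning depth, and structured output are exposed differently by different providers.

```python
# Illustrative only: common sampling parameters as they appear in an
# OpenAI-compatible chat request. Model name and values are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",    # placeholder model
    messages=[{"role": "user", "content": "Summarize this in one sentence."}],
    temperature=0.7,        # randomness: lower values give more deterministic output
    top_p=0.9,              # nucleus sampling cutoff
    max_tokens=512,         # maximum output tokens
    frequency_penalty=0.0,  # discourage repeating the same tokens
    presence_penalty=0.0,   # encourage introducing new topics
)
print(response.choices[0].message.content)
```

In ConsoleX AI you adjust these same knobs from the right sidebar instead of writing code; the sidebar only shows the options and ranges that the currently selected model actually supports.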