Applicability
This topic applies only to OCP Enterprise Edition. OCP Community Edition does not support this feature.
Background information
OceanBase Database is a native distributed database system. Its complex distributed architecture, multi-node collaboration, and data distribution pose significant challenges for management and technical support. Administrators must not only be familiar with routine database operations and maintenance, but also deeply understand distributed system principles, including complex issues such as data consistency and load balancing. When a failure occurs, searching for relevant documents and troubleshooting the issue is difficult, because factors such as network, node, and software configurations must all be considered together. Given the professionalism and complexity of the technology, technical support teams need strong technical expertise and extensive experience, which can be a significant burden for many enterprises.
To address these challenges in managing and supporting OceanBase, OCP leverages the capabilities of PowerRAG and large language models to provide intelligent Q&A functionality that can be activated from any interface. This helps users better understand and manage OceanBase, effectively reducing database maintenance costs and improving service quality and efficiency.
Features
OceanBase AI Assistant integrates advanced large language models (LLMs) and Retrieval-Augmented Generation (PowerRAG) technology, along with a built-in OceanBase knowledge base and native vector and full-text indexing capabilities, to form a core intelligent Q&A engine. Through PowerRAG's multi-round retrieval mechanism, the LLM can accurately retrieve technical documents, best practices, and failure cases from the knowledge base, providing systematic, professional solutions for core database operations and maintenance scenarios such as system configuration optimization, failure diagnosis and repair, and performance bottleneck analysis.
For large model configuration, OCP allows you to flexibly integrate third-party large model services such as Tongyi Qianwen, Baidu ErnieBot, and DeepSeek, or to deploy a self-built large model, through the System Management > External Integration > AI Model feature. You can then configure the LLM, embedding, and (optionally) reranking models to quickly activate AI Assistant capabilities.
Technical principles
In OCP, AI Assistant stores documents as vectors in OceanBase Database. When you ask AI Assistant a question through Smart Chat, it uses the embedding model to convert your question into a vector, retrieves similar vectors and their associated documents from the database, and then sends the retrieved documents together with your question to the large model, which generates a more accurate answer grounded in those documents.
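The retrieve-then-generate flow above can be sketched as follows. This is a minimal illustration, not OCP's implementation: the embedding function is a toy bag-of-words stand-in, and the in-memory document list stands in for the knowledge base stored in OceanBase; a real deployment uses the configured embedding model and the database's vector indexes.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    # Toy embedding: a term-frequency vector. A real system calls the
    # configured embedding model and stores the result in a vector index.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


# Stand-in for documents vectorized into the knowledge base.
KNOWLEDGE_BASE = [
    "To diagnose slow queries, check the GV$OB_SQL_AUDIT view for execution statistics.",
    "Load balancing in OceanBase is handled by the root service across zones.",
    "Backup and restore are configured per tenant via log archiving.",
]


def retrieve(question: str, top_k: int = 1) -> list[str]:
    # Embed the question, then return the most similar documents.
    q_vec = embed(question)
    ranked = sorted(KNOWLEDGE_BASE, key=lambda d: cosine(q_vec, embed(d)), reverse=True)
    return ranked[:top_k]


def build_prompt(question: str) -> str:
    # The retrieved documents and the question are sent to the LLM together,
    # so the answer is grounded in the knowledge base.
    context = "\n".join(retrieve(question))
    return f"Answer using the context below.\nContext:\n{context}\nQuestion: {question}"


prompt = build_prompt("How do I diagnose slow queries?")
```

In production, the similarity search runs inside OceanBase via its vector indexing capability rather than in application code, and the prompt is sent to the LLM configured in the system model settings.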
Prerequisites
Before using AI-powered intelligent chat, ensure that your OCP meets the following conditions:
- The MetaDB of OCP must be V4.3.5 BP2 or later.
- If OCP is newly deployed, the OAT used to deploy OCP must be V4.3.2 or later.
Procedure
Before enabling AI-powered intelligent chat, you need to perform the following configurations:
- Connect to an AI model vendor. After you connect to a built-in AI model vendor, all of that vendor's models become selectable in the system model settings.
- (Optional) Add a model. If the built-in AI model vendor or model cannot meet your needs, you can add a custom AI model vendor or model.
- Set the system model. From the models available in the system, select the models that AI Assistant will use.
- Enable AI-powered intelligent chat.
Disclaimer
For specific information about how a third party processes your personal data, refer to that third party's privacy policy documents. If you do not agree to the transmission and processing of your data by a third-party large model, do not use this feature.
All content generated by this application is produced by the AI large model you select. We cannot guarantee the accuracy or completeness of the generated content; it is for reference only.