This topic introduces how to build typical AI-integrated applications using OceanBase. By combining OceanBase's vector search capabilities with a variety of AI frameworks and services, you can quickly develop intelligent solutions.
AI integration use cases
Vector databases are essential infrastructure for building smart applications. OceanBase's vector search features can be seamlessly connected with leading AI services and frameworks. Below are some common scenarios:
Integration with OpenAI API
OpenAI provides advanced language models and embedding services. By integrating the OpenAI API, you can:
- Use OpenAI's embedding models to generate vector representations of text
- Store these vectors in OceanBase for efficient similarity search
- Build intelligent Q&A systems powered by GPT models
- Enable sophisticated features like semantic search
Typical scenarios include:
- Smart customer service platforms
- Document search systems
- Personalized recommendation engines
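The sketch below illustrates this flow end to end: generate an embedding with the OpenAI API, store it in OceanBase over the MySQL-compatible protocol, and run a semantic search. It assumes an OceanBase cluster with vector search enabled and the `openai` and `pymysql` packages installed; the table name, the `VECTOR(1536)` column, and the `cosine_distance()` function are illustrative assumptions, so check the vector search syntax for your OceanBase version.

```python
# A minimal sketch: embed text with OpenAI, store and search it in OceanBase.
# Assumes an OceanBase cluster with vector search support, reachable over the
# MySQL protocol. Table/column names, the VECTOR(1536) DDL, and the
# cosine_distance() function are illustrative; adjust them to your version.
import pymysql
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(text: str) -> list[float]:
    """Generate a 1536-dimensional embedding with OpenAI."""
    resp = openai_client.embeddings.create(model="text-embedding-3-small", input=text)
    return resp.data[0].embedding

conn = pymysql.connect(host="127.0.0.1", port=2881, user="root@test", password="", database="test")
with conn.cursor() as cur:
    # Illustrative DDL: a document table with a vector column.
    cur.execute("CREATE TABLE IF NOT EXISTS docs (id INT PRIMARY KEY, content TEXT, embedding VECTOR(1536))")

    # Store a document together with its embedding, serialized as '[v1,v2,...]'.
    text = "OceanBase supports vector search."
    vec = "[" + ",".join(str(x) for x in embed(text)) + "]"
    cur.execute("INSERT INTO docs (id, content, embedding) VALUES (%s, %s, %s)", (1, text, vec))
    conn.commit()

    # Semantic search: order documents by distance to the query embedding.
    qvec = "[" + ",".join(str(x) for x in embed("Which database supports vector search?")) + "]"
    cur.execute("SELECT content FROM docs ORDER BY cosine_distance(embedding, %s) LIMIT 3", (qvec,))
    for (content,) in cur.fetchall():
        print(content)
conn.close()
```

For a GPT-based Q&A system, the retrieved rows would then be passed to a chat model as context; the storage and retrieval steps stay the same.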
Integration with Tongyi Qianwen API
Tongyi Qianwen (Qwen) is Alibaba Cloud's large language model service, with strong natural language processing capabilities in Chinese. By connecting with the Tongyi Qianwen API, you can:
- Leverage Tongyi Qianwen's text embedding capabilities
- Support semantic understanding and search in Chinese
- Build intelligent applications tailored to specific industries
Suitable scenarios include:
- Enterprise knowledge base search
- Smart business assistants
- Multilingual document management
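The sketch below shows the enterprise knowledge base scenario: generate a Chinese query embedding with Tongyi Qianwen through the DashScope SDK and search an existing vector table in OceanBase. The model name, response fields, and the `kb_docs` table (same illustrative schema as the `docs` table above) are assumptions; confirm them against the DashScope documentation and your OceanBase version.

```python
# A minimal sketch: embed a Chinese query with Tongyi Qianwen (DashScope SDK)
# and run a semantic search against a knowledge-base table in OceanBase.
# Assumes the `dashscope` and `pymysql` packages and a DASHSCOPE_API_KEY
# environment variable; the model name and response field names may differ
# across DashScope SDK versions.
import dashscope
import pymysql

def embed_zh(text: str) -> list[float]:
    """Call a Tongyi Qianwen text-embedding model for one piece of text."""
    resp = dashscope.TextEmbedding.call(model="text-embedding-v2", input=text)
    return resp.output["embeddings"][0]["embedding"]

# Search an existing knowledge-base table: kb_docs(id, content, embedding VECTOR(1536)).
conn = pymysql.connect(host="127.0.0.1", port=2881, user="root@test", password="", database="test")
with conn.cursor() as cur:
    qvec = "[" + ",".join(str(x) for x in embed_zh("如何申请报销?")) + "]"
    cur.execute(
        "SELECT content FROM kb_docs ORDER BY cosine_distance(embedding, %s) LIMIT 5",
        (qvec,),
    )
    for (content,) in cur.fetchall():
        print(content)
conn.close()
```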
Integration with LangChain
LangChain is a framework for developing applications powered by large language models. OceanBase can serve as the vector storage backend for LangChain, enabling you to:
- Load documents and convert them into vectors
- Build conversational search chains
- Implement agent-based systems
- Create Q&A solutions for knowledge bases
Integration benefits include:
- Streamlined development process for LLM applications
- Access to a wide range of components and tools
- Flexible customization to meet diverse application needs
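The sketch below shows OceanBase used as a LangChain vector store: load a few texts, embed them, and retrieve them with a similarity search. The import path, class name, and connection parameters of the OceanBase integration are assumptions based on the langchain-oceanbase package; check that package's documentation for the exact interface. The retrieval calls themselves are the standard LangChain `VectorStore` methods.

```python
# A minimal sketch of using OceanBase as a LangChain vector store.
# Assumes the langchain-oceanbase integration package (import path, class name,
# and connection parameters below are assumptions) plus langchain-openai
# for the embedding model.
from langchain_openai import OpenAIEmbeddings
from langchain_oceanbase.vectorstores import OceanbaseVectorStore  # assumed import path

embeddings = OpenAIEmbeddings(model="text-embedding-3-small")

# Hypothetical connection arguments for a local OceanBase instance.
connection_args = {"host": "127.0.0.1", "port": "2881", "user": "root@test", "password": "", "db_name": "test"}

# Load documents into OceanBase as vectors.
vector_store = OceanbaseVectorStore.from_texts(
    texts=["OceanBase supports vector search.", "LangChain simplifies LLM app development."],
    embedding=embeddings,
    connection_args=connection_args,
)

# Standard LangChain retrieval: run a similarity search, or expose the store
# as a retriever for conversational chains and agents.
docs = vector_store.similarity_search("Which database can store embeddings?", k=2)
retriever = vector_store.as_retriever()
print([d.page_content for d in docs])
```

The same `retriever` object can then be plugged into a retrieval chain or an agent tool, which is how the conversational search and knowledge-base Q&A patterns above are typically assembled.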
Integration with LlamaIndex
LlamaIndex is a data framework designed for building LLM applications. Integrating it with OceanBase lets you:
- Efficiently manage and index structured data
- Handle complex data queries and searches
- Build data-intensive AI applications
Key features include:
- Support for multiple data sources
- Mechanisms for data updates and synchronization
- Enhanced query performance
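The sketch below shows the typical LlamaIndex flow with OceanBase as the backing vector store: load documents, build an index, and query it. The OceanBase-specific class and its constructor are assumptions based on the llama-index-vector-stores-oceanbase integration package and the pyobvector client; the surrounding calls use the standard LlamaIndex APIs.

```python
# A minimal sketch of using OceanBase as the vector store behind a LlamaIndex
# query engine. The OceanBase-specific classes below (OceanBaseVectorStore,
# ObVecClient) and their parameters are assumptions; the rest is standard
# LlamaIndex usage.
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.oceanbase import OceanBaseVectorStore  # assumed import path
from pyobvector import ObVecClient  # assumed client; connects over the MySQL protocol

# Hypothetical connection to a local OceanBase instance.
client = ObVecClient(uri="127.0.0.1:2881", user="root@test", password="", db_name="test")
vector_store = OceanBaseVectorStore(client=client)

# Index local documents: LlamaIndex loads, chunks, embeds, and writes the vectors to OceanBase.
documents = SimpleDirectoryReader("./data").load_data()
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Query: retrieve the most relevant chunks from OceanBase and synthesize an answer.
query_engine = index.as_query_engine()
print(query_engine.query("What does the documentation say about vector search?"))
```

Re-running the indexing step with new or changed documents updates the vectors stored in OceanBase, which is how the data update and synchronization workflow is usually handled.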