
OceanBase

A unified distributed database ready for your transactional, analytical, and AI workloads.

DEPLOY YOUR WAY

OceanBase Cloud

The best way to deploy and scale OceanBase

OceanBase Enterprise

Run and manage OceanBase on your own infrastructure

TRY OPEN SOURCE

OceanBase Community Edition

The free, open-source distributed database

OceanBase seekdb

An open-source, AI-native search database

Customer Stories

Real-world success stories from enterprises across diverse industries.

View All
BY USE CASES

Mission-Critical Transactions

Global & Multicloud Applications

Elastic Scaling for Peak Traffic

Real-time Analytics

Active Geo-redundancy

Database Consolidation

Resources

Comprehensive knowledge hub for OceanBase.

Blog

Live Demos

Training & Certification

Documentation

Official technical guides, tutorials, API references, and manuals for all OceanBase products.

View All
PRODUCTS

OceanBase Cloud

OceanBase Database

Tools

Connectors and Middleware

QUICK START

OceanBase Cloud

OceanBase Database

BEST PRACTICES

Practical guides for using OceanBase more effectively

Company

Learn more about OceanBase – our company, partnerships, and trust and security initiatives.

About OceanBase

Partner

Trust Center

Contact Us


All Products
    • Databases
    • OceanBase Database
    • OceanBase Cloud
    • OceanBase Tugraph
    • Interactive Tutorials
    • OceanBase Best Practices
    • Tools
    • OceanBase Cloud Platform
    • OceanBase Migration Service
    • OceanBase Developer Center
    • OceanBase Migration Assessment
    • OceanBase Admin Tool
    • OceanBase Loader and Dumper
    • OceanBase Deployer
    • Kubernetes operator for OceanBase
    • OceanBase Diagnostic Tool
    • OceanBase Binlog Service
    • Connectors and Middleware
    • OceanBase Database Proxy
    • Embedded SQL in C for OceanBase
    • OceanBase Call Interface
    • OceanBase Connector/C
    • OceanBase Connector/J
    • OceanBase Connector/ODBC
    • OceanBase Connector/NET

OceanBase Cloud

  • Product Updates & Announcements
    • What's new
      • Release notes for 2026
      • Release notes for 2025
      • Release notes for 2024
      • Release history
    • Product announcements
      • Data development module deprecation notice
      • Optimization of Backup and Restore commercialization strategy
      • Cross-AZ data transfer billing (OceanBase Cloud on AWS)
      • Database Proxy pricing update
      • AWS instance pricing adjustment
  • Product Introduction
    • Overview
    • Management mode and scenarios
    • Core features
      • High availability with cross-cloud active-active architecture
      • High availability with cross-cloud primary-standby databases
      • Multi-level caching in shared storage
      • Multi-layer online scaling and on-demand adjustment
    • Deployment modes
    • Storage architecture
    • Product specifications
    • Product billing
      • Overview
      • Instance billing
        • Tencent Cloud instance billing
        • Alibaba Cloud instance billing
        • Huawei Cloud instance billing
        • AWS instance billing
        • GCP instance billing
      • Backup and restore billing
      • SQL audit billing
      • Migrations billing
      • Database proxy billing
      • Binlog service billing
      • Overview of OceanBase Cloud support plans
      • Read-only replica billing
    • Supported database versions
  • Get Started
    • Get started with a transactional instance
    • Get started with an analytical instance
    • Get started with a Key-Value instance
  • Work with Transactional Instances
    • Overview
    • Create an instance
      • Overview
      • Create via OceanBase Cloud official website
      • Create via AWS Marketplace
      • Create via GCP Marketplace
      • Create via Huawei Cloud Marketplace
      • Create via Alibaba Cloud Marketplace
      • Create via Azure Marketplace
    • Connect to an instance
      • MySQL compatible mode
        • Overview
        • Get connection string
          • Overview
          • Connect using AWS PrivateLink
          • Connect using Azure Private Link
          • Connect using Google Cloud Private Service Connect
          • Connect using Huawei Cloud VPC Endpoint
          • Connect using Alibaba Cloud VPC
          • Connect using a public IP address
          • Connect using a Huawei Cloud peering connection
        • Connect with clients
          • Connect to OceanBase Cloud by using Client ODC
          • Connect to OceanBase Cloud by using a MySQL client
          • Connect to OceanBase Cloud by using OBClient
        • Connect with drivers
          • Java
            • Connect to OceanBase Cloud using SpringBoot
            • SpringBatch sample application for connecting to OceanBase Cloud
            • spring-jdbc
            • SpringDataJPA sample application for connecting to OceanBase Cloud
            • Hibernate application development with OceanBase Cloud
            • Sample program for connecting to OceanBase Cloud
            • connector-j
            • Use TestContainers to connect to and use OceanBase Cloud
          • Python
            • Connect to OceanBase Cloud using mysqlclient
            • Connect to OceanBase Cloud using PyMySQL
            • Use the MySQL-connector-python driver to connect to and use OceanBase Cloud
            • Use SQLAlchemy to connect to an OceanBase Cloud database
            • Connect to an OceanBase Cloud database using Django
            • Connect to an OceanBase Cloud database by using peewee
          • C
            • Use MySQL Connector/C to connect to OceanBase Cloud
          • Go
            • Connect to OceanBase Cloud using the Go-SQL-Driver/MySQL driver
            • Connect to OceanBase Cloud using GORM
          • PHP
            • Use the EXT driver to connect to OceanBase Cloud
            • Connect to OceanBase Cloud by using the MySQLi driver
            • Use the PDO driver to connect to OceanBase Cloud
          • Rust
            • Rust application example for connecting to OceanBase Cloud
            • SeaORM example for connecting to OceanBase Cloud
          • Ruby
            • ActiveRecord sample application for OceanBase Cloud
            • Connect to OceanBase Cloud by using mysql2
            • Connect to OceanBase Cloud by using Sequel
        • Use database connection pool
          • Database connection pool configuration
          • Connect to OceanBase Cloud by using a Tomcat connection pool
          • Connect to OceanBase Cloud by using a C3P0 connection pool
          • Connect to OceanBase Cloud by using a Proxool connection pool
          • Connect to OceanBase Cloud by using a HikariCP connection pool
          • Connect to OceanBase Cloud by using a DBCP connection pool
          • Connect to OceanBase Cloud by using Commons Pool
          • Connect to OceanBase Cloud by using a Druid connection pool
      • Oracle compatible mode
        • Overview
        • Get connection string
          • Overview
          • Connect using AWS PrivateLink
          • Connect using Azure Private Link
          • Connect using Google Cloud Private Service Connect
          • Connect using Huawei Cloud VPC Endpoint
          • Connect using a public IP address
        • Connect with clients
          • Connect to OceanBase Cloud by using OBClient
          • Connect to OceanBase Cloud by using Client ODC
        • Connect with drivers
          • Java
            • Connect to OceanBase Cloud using OceanBase Connector/J
            • Connect to OceanBase Cloud by using Spring Boot
            • SpringBatch application example for connecting to OceanBase Cloud
            • Connect to OceanBase Cloud using Spring JDBC
            • Connect to OceanBase Cloud by using Spring Data JPA
            • Connect to OceanBase Cloud by using Hibernate
            • Use MyBatis to connect to OceanBase Cloud
            • Use JFinal to connect to OceanBase Cloud
          • Python
            • Python Driver for Oracle Mode
          • C
            • Connect to OceanBase Cloud using OceanBase Connector/C
            • Connect to OceanBase Cloud using OceanBase Connector/ODBC
            • Use SqlSugar to connect to OceanBase Cloud
        • Use database connection pool
          • Database connection pool configuration
          • Sample program that uses a Tomcat connection pool to connect to OceanBase Cloud
          • C3P0 connection pool connects to OceanBase Cloud
          • Connect to OceanBase Cloud using Proxool connection pool
          • Sample program that uses HikariCP to connect to OceanBase Cloud
          • Use DBCP connection pool to connect to OceanBase Cloud
          • Connect to OceanBase Cloud by using Commons Pool
          • Connect to OceanBase Cloud by using a Druid connection pool
    • Developer guide
      • MySQL compatible mode
        • Plan database objects
          • Create a database
          • Create a table group
          • Create a table
          • Create an index
          • Create an external table
        • Write data
          • Insert data
          • Update data
          • Delete data
          • Replace data
          • Generate test data in batches
        • Read data
          • Single-table queries
          • Join tables
            • INNER JOIN queries
            • FULL JOIN queries
            • LEFT JOIN queries
            • RIGHT JOIN queries
            • Subqueries
            • Lateral derived tables
          • Use operators and functions in queries
            • Use arithmetic operators in queries
            • Use numerical functions in queries
            • Use string concatenation operators in queries
            • Use string functions in queries
            • Use datetime functions in queries
            • Use type conversion functions in queries
            • Use aggregate functions in queries
            • Use NULL-related functions in queries
            • Use the CASE conditional operator in queries
            • Use the SELECT ... FOR UPDATE statement to lock query results
            • Use the SELECT ... LOCK IN SHARE MODE statement to lock query results
          • Use a DBLink in queries
          • Set operations
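The query topics above (single-table queries, joins, and set operations) boil down to ordinary SQL. Below is a minimal, hypothetical sketch using Python's built-in sqlite3 module as a stand-in engine; the SQL itself is plain syntax that OceanBase's MySQL compatible mode also accepts, and the table and column names are purely illustrative.

```python
import sqlite3

# Stand-in engine: an in-memory SQLite database. The same SQL statements can be
# run against an OceanBase MySQL-mode tenant through any MySQL-compatible driver.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount INTEGER)")
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Bob")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", [(10, 1, 250), (11, 1, 80)])

# LEFT JOIN keeps every customer, even Bob, who has no orders; COALESCE turns
# the NULL sum for unmatched rows into 0.
cur.execute("""
    SELECT c.name, COUNT(o.id), COALESCE(SUM(o.amount), 0)
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""")
print(cur.fetchall())  # [('Ada', 2, 330), ('Bob', 0, 0)]
```

Swapping `LEFT JOIN` for `INNER JOIN` would drop Bob's row entirely, which is the key practical difference between the join types listed above.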
        • Manage transactions
          • Overview
          • Start a transaction
          • Savepoints
            • Mark a savepoint
            • Roll back a transaction to a savepoint
            • Release a savepoint
          • Commit a transaction
          • Roll back a transaction
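The transaction workflow listed above (start a transaction, mark a savepoint, roll back to it, release it, then commit or roll back) can be sketched end to end. This is a minimal example using Python's built-in sqlite3 as a stand-in engine; the `SAVEPOINT`, `ROLLBACK TO SAVEPOINT`, and `RELEASE SAVEPOINT` statements shown are the same ones accepted in MySQL compatible mode, while the table and savepoint names are illustrative.

```python
import sqlite3

# isolation_level=None disables sqlite3's implicit transaction handling,
# so BEGIN / COMMIT are issued explicitly, as in the workflow above.
conn = sqlite3.connect(":memory:", isolation_level=None)
cur = conn.cursor()
cur.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")

cur.execute("BEGIN")                              # start a transaction
cur.execute("INSERT INTO accounts VALUES (1, 100)")
cur.execute("SAVEPOINT before_bonus")             # mark a savepoint
cur.execute("UPDATE accounts SET balance = balance + 50 WHERE id = 1")
cur.execute("ROLLBACK TO SAVEPOINT before_bonus") # undo work after the savepoint
cur.execute("RELEASE SAVEPOINT before_bonus")     # discard the savepoint marker
cur.execute("COMMIT")                             # commit the remaining work

cur.execute("SELECT balance FROM accounts WHERE id = 1")
print(cur.fetchone()[0])  # 100: the +50 update was rolled back, the insert kept
```

Rolling back to a savepoint undoes only the statements issued after it; the enclosing transaction stays open, so the insert still commits.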
      • Oracle compatible mode
        • Plan database objects
          • Create a table group
          • Create a table
          • Create an index
          • Create an external table
        • Write data
          • Insert data
          • Update data
          • Delete data
          • Replace data
          • Generate test data in batches
        • Read data
          • Single-table queries
          • Join tables
            • INNER JOIN queries
            • FULL JOIN queries
            • LEFT JOIN queries
            • RIGHT JOIN queries
            • Subqueries
            • Lateral derived tables
          • Use operators and functions in queries
            • Use arithmetic operators in queries
            • Use numerical functions in queries
            • Use string concatenation operators in queries
            • Use string functions in queries
            • Use datetime functions in queries
            • Use type conversion functions in queries
            • Use aggregate functions in queries
            • Use NULL-related functions in queries
            • Use CASE functions in queries
            • Use the SELECT ... FOR UPDATE statement to lock query results
          • Use a DBLink in queries
          • Set operations
        • Manage transactions
          • Overview
          • Start a transaction
          • Savepoints
            • Mark a savepoint
            • Roll back a transaction to a savepoint
          • Commit a transaction
          • Roll back a transaction
    • Manage instances
      • Manage instances
        • View the instance list
        • Instance overview
        • Stop and restart instances
        • Unit migration
      • Manage tenants
        • Tenant overview
        • Create a tenant
        • Modify tenant specifications
        • Modify tenant names
        • Add an endpoint
        • Resource isolation
          • Overview
          • Manage resource groups
            • Create a resource group
            • View a resource group
            • Edit a resource group
            • Delete a resource group
          • Manage isolation rules
            • Create an isolation rule
            • View isolation rules
            • Edit an isolation rule
            • Delete an isolation rule
        • Modify primary zone
        • Modify the maximum number of connections for a tenant proxy
        • Monitor tenant performance
          • Overview
          • View performance and SQL monitoring details
          • View transaction monitoring details
          • View storage and cache monitoring details
          • View Binlog service monitoring
          • Customize a monitoring dashboard for a tenant
        • Diagnostics
          • Real-time diagnostics
            • SQL diagnostics
              • Top SQL
              • Slow SQL
              • Suspicious SQL
              • High-risk SQL
            • SQL audit
        • Manage tenant parameters
          • Manage tenant parameters
          • Parameters for tenants
          • Parameter template overview
        • Delete a tenant
        • Manage databases and accounts
          • Create accounts
          • Manage accounts
          • Create a database (MySQL compatible mode)
          • Manage databases (MySQL compatible mode)
      • Monitor instance performance
        • Overview
        • Monitor the performance of databases in an instance
        • Monitor multi-dimensional metrics of an instance
        • Monitor the performance of hosts in an instance
        • Monitor database proxy
        • Monitor database proxy hosts
        • Monitor cross-cloud network performance
        • Customize a monitoring dashboard for an instance
      • Manage major compactions
        • Initiate a major compaction
        • View compaction records
        • Update time for compactions
      • Manage instance parameters
        • Manage parameters
        • Parameters for cluster instances
      • Change instance configurations
        • Enable storage auto-scaling
        • View history of configuration changes
        • Change configuration
        • Change configuration temporarily
        • Switch the deployment mode
      • Manage standby instances
        • Overview
        • Create a standby instance
        • Create a cross-cloud standby instance
        • Create a standby instance for an Alibaba Cloud primary instance
        • View details of primary and standby instances
        • Configure global endpoint
        • Enable automatic write request forwarding for standby databases
        • Primary-standby instance switchover
        • Initiate failover
        • Detach a standby instance
        • Release a standby instance
      • Release an instance
      • Database proxy
        • Overview
        • Manage database proxy
        • Direct load
      • Manage alerts
        • Overview
        • Manage alert rules
          • Create an alert rule
          • View an alert rule
          • Edit an alert rule
          • Delete an alert rule
        • View alert history
        • Manage alert templates
          • Create an alert template
          • View an alert template
          • Edit an alert template
          • Copy an alert template
          • Delete an alert template
        • Manage muting rules
          • Create an alert muting rule
          • View an alert muting rule
          • Edit an alert muting rule
          • Delete an alert muting rule
        • Manage alert notification templates
          • Create an alert notification template
          • View an alert notification template
          • Edit an alert notification template
          • Copy an alert notification template
          • Delete an alert notification template
        • Manage alert contacts
          • Add an alert contact
          • Add an alert contact group
          • View an alert contact
          • Edit an alert contact
          • Delete an alert contact
          • Obtain a webhook URL
        • Monitoring metrics for alerts
      • Backup and restore
        • Overview
        • Backup strategy
        • Initiate a backup immediately
        • Data backup
        • Initiate a restore
        • Data restore
        • Restore data from the instance recycle bin
      • Diagnostics
        • View performance monitoring data
        • Capacity diagnostics
        • One-click diagnostics
          • Initiate one-click diagnostics
          • View one-click diagnostic report
            • Exceptions
            • Real-time diagnostics
            • Optimization suggestions
            • Capacity management
            • Security management
        • Real-time diagnostics
          • SQL diagnostics
            • Top SQL
            • Slow SQL
            • Suspicious SQL
            • High-risk SQL
            • SQL details
            • SQL monitoring metrics list
          • Session management
            • Session management
          • Request analysis
            • Request analysis
        • Root cause diagnostics
          • Exception handling
          • Enable system autonomy
        • SQL audit
        • Materialized view analysis
        • Optimization center
          • Optimization suggestions
          • Manage active outlines
          • SQL review
          • View the optimization history
      • Manage tags
      • Manage read-only replicas
        • Overview
        • Instance read-only replicas
          • Add a read-only replica to an instance
          • View read-only replicas of an instance
          • Manage read-only replicas of an instance
          • Delete a read-only replica of an instance
        • Tenant read-only replicas
          • Add a read-only replica to a tenant
          • View read-only replicas of a tenant
          • Manage read-only replicas of a tenant
          • Delete a read-only replica of a tenant
      • Manage JVM-dependent services
    • Data source management
      • Create a data source
      • Manage data sources
      • User privileges
        • User privileges for compatibility assessment
        • User privileges for data migration
        • User privileges for performance assessment
        • User privileges for data archiving
        • User privileges for data cleanup
      • Connect via private network
        • AWS
        • Huawei Cloud
        • Alibaba Cloud
        • Google Cloud
        • Azure
        • Private IP address segments
      • Connect via public network
        • AWS
        • Huawei Cloud
        • Alibaba Cloud
        • Google Cloud
        • Azure
    • Data lifecycle management
      • Archive data
      • Clean up data
    • Manage recycle bin
      • Instance recycle bin
      • Manage databases and tables in recycle bin
        • Overview
        • Instance-level recycle bin
        • Tenant-level recycle bin
  • Work with Analytical Instances
    • Overview
    • Core features
    • Create an instance
    • Connect to an instance
      • Overview
      • Get connection string
        • Overview
        • Connect using AWS PrivateLink
        • Connect using a public IP address
      • Connect with clients
        • Connect to OceanBase Cloud by using Client ODC
        • Connect to OceanBase Cloud by using a MySQL client
        • Connect to OceanBase Cloud by using OBClient
      • Connect with drivers
        • Java
          • Connect to OceanBase Cloud by using Spring Boot
          • Connect to OceanBase Cloud by using Spring Batch
          • Connect to OceanBase Cloud by using Spring Data JDBC
          • Connect to OceanBase Cloud by using Spring Data JPA
          • Connect to OceanBase Cloud by using Hibernate
          • Connect to OceanBase Cloud by using MyBatis
          • Connect to OceanBase Cloud using MySQL Connector/J
        • Python
          • Connect to OceanBase Cloud by using mysqlclient
          • Connect to OceanBase Cloud by using PyMySQL
          • Connect to OceanBase Cloud using MySQL Connector/Python
        • C
          • Connect to OceanBase Cloud using MySQL Connector/C
        • Go
          • Connect to OceanBase Cloud using Go-SQL-Driver/MySQL
        • PHP
          • Connect to OceanBase Cloud using PHP
      • Use database connection pool
        • Database connection pool configuration
        • Connect to OceanBase Cloud by using a Tomcat connection pool
        • Connect to OceanBase Cloud by using a C3P0 connection pool
        • Connect to OceanBase Cloud by using a Proxool connection pool
        • Connect to OceanBase Cloud by using a HikariCP connection pool
        • Connect to OceanBase Cloud by using a DBCP connection pool
        • Connect to OceanBase Cloud by using Commons Pool
        • Connect to OceanBase Cloud by using a Druid connection pool
    • Data table design
      • Table overview
      • Best practices
        • Unit 1: Best practices for optimizing storage structures and query performance
        • Unit 2: Best practices for creating special indexes
    • Export data
    • OceanBase data processing
    • Query acceleration
      • Statistics
      • Materialized views for query acceleration
      • Select a query parallelism level
    • Manage instances
      • Instance overview
      • Change configuration
      • Modify primary zone
      • Manage parameters
      • Backup and restore
        • Backup overview
        • Backup strategies
        • Immediate backup
        • Data backup
        • Initiate restore
        • Data restore
      • Monitor instance performance
        • Overview
        • Monitor the performance of databases in an instance
        • Monitor the performance of hosts in an instance
      • Manage major compactions
        • Initiate a major compaction
        • View compaction records
        • Update time for compactions
      • Database proxy
        • Overview
        • Manage database proxy
        • Direct load
      • Manage alerts
        • Overview
        • Manage alert rules
          • Create an alert rule
          • View an alert rule
          • Edit an alert rule
          • Delete an alert rule
        • View alert history
        • Manage alert templates
          • Create an alert template
          • View an alert template
          • Edit an alert template
          • Copy an alert template
          • Delete an alert template
        • Manage muting rules
          • Create an alert muting rule
          • View an alert muting rule
          • Edit an alert muting rule
          • Delete an alert muting rule
        • Manage alert notification templates
          • Create an alert notification template
          • View an alert notification template
          • Edit an alert notification template
          • Copy an alert notification template
          • Delete an alert notification template
        • Manage alert contacts
          • Add an alert contact
          • Add an alert contact group
          • View an alert contact
          • Edit an alert contact
          • Delete an alert contact
          • Obtain a webhook URL
        • Monitoring metrics for alerts
      • Diagnostics
        • View performance monitoring data
        • Capacity diagnostics
        • Real-time diagnostics
          • SQL diagnostics
            • Top SQL
            • Slow SQL
            • Suspicious SQL
            • High-risk SQL
            • SQL details
            • SQL monitoring metrics list
          • Session management
            • Session management
          • Optimization management
            • Manage active outlines
            • View the optimization history
          • Request analysis
            • Request analysis
      • Stop and restart instances
      • Release instances
      • Manage databases and accounts
        • Create and manage accounts
        • Create a database
        • Manage databases
      • Manage tags
    • Data lifecycle management
      • Archive data
      • Clean up data
    • Performance diagnosis and tuning
      • Use the DBMS_XPLAN package for performance diagnostics
      • Use the GV$SQL_PLAN_MONITOR view for performance analysis
      • Views related to AP performance analysis
    • Performance testing
    • Product integration
    • Manage recycle bin
      • View instance recycle bin
      • Manage databases and tables in recycle bin
        • Overview
        • Instance recycle bin
  • Work with Key-Value Instances
    • Try out Key-Value instances
      • Create an instance
      • Create a tenant
      • Create an account for a database user
      • OBKV HBase data operation examples
    • Use Table model
      • Create an instance
      • Manage instances
        • Manage instances
          • View the instance list
          • Instance overview
          • Stop and restart instances
          • Release an instance
        • Manage tenants
          • Create a tenant
          • Modify tenant specifications
          • Modify tenant names
          • Delete a tenant
          • Tenant overview
          • Resource isolation
            • Overview
            • Manage resource groups
              • Create a resource group
              • View a resource group
              • Edit a resource group
              • Delete a resource group
            • Manage isolation rules
              • Create an isolation rule
              • View isolation rules
              • Edit an isolation rule
              • Delete an isolation rule
          • Monitor tenant performance
            • Overview
            • View performance and SQL monitoring details
            • View transaction monitoring details
            • View storage and cache monitoring details
            • OBKV-Table
            • Customize a monitoring dashboard for a tenant
          • Diagnostics
            • Top SQL
          • Manage tenant parameters
            • Manage tenant parameters
            • Parameters for tenants
          • Manage databases and accounts
            • Create and manage accounts
            • Create a database
            • Manage databases
          • Switch primary zone
        • Monitor instance performance
          • Overview
          • Monitor the performance of databases in an instance
          • Monitor multi-dimensional metrics of an instance
          • Monitor the performance of hosts in a cluster
          • Customize monitoring dashboards for an instance
        • Manage major compactions
          • Initiate major compactions
          • View compaction records
          • Update time for compactions
        • Manage instance parameters
          • Parameter management overview
          • Parameters for cluster instances
        • Change instance configurations
          • View history of configuration changes
          • Change configuration
          • Switch the deployment mode
        • Database proxy
          • Overview
          • Manage database proxy
        • Manage alerts
          • Overview
          • Manage alert rules
            • Create an alert rule
            • View an alert rule
            • Edit an alert rule
            • Delete an alert rule
          • View alert history
          • Manage alert templates
            • Create an alert template
            • View an alert template
            • Edit an alert template
            • Copy an alert template
            • Delete an alert template
          • Manage muting rules
            • Create an alert muting rule
            • View an alert muting rule
            • Edit an alert muting rule
            • Delete an alert muting rule
          • Manage alert contacts
            • Add an alert contact
            • Add an alert contact group
            • View an alert contact
            • Edit an alert contact
            • Delete an alert contact
            • Obtain a webhook URL
          • Monitoring metrics for alerts
        • Backup and restore
          • Backup overview
          • Backup strategies
          • Immediate backup
          • Data backup
          • Initiate restore
          • Data restore
        • Diagnostics
          • View performance monitoring data
          • Top SQL
          • Capacity diagnostics
          • Request analysis
        • Manage tags
        • Manage recycle bin
          • View instance recycle bin
          • Manage databases and tables in recycle bin
            • Overview
            • Instance-level recycle bin
            • Tenant-level recycle bin
    • Use HBase model
      • OBKV-HBase Overview
      • Create an instance
      • Develop in HBase model
        • Connect to an instance by using the OBKV-HBase client
      • Manage instances
        • Manage instances
          • View the instance list
          • Instance overview
          • Stop and restart instances
          • Release an instance
        • Manage tenants
          • Create a tenant
          • Modify tenant specifications
          • Modify tenant names
          • Delete a tenant
          • Tenant overview
          • Resource isolation
            • Overview
            • Manage resource groups
              • Create a resource group
              • View a resource group
              • Edit a resource group
              • Delete a resource group
            • Manage isolation rules
              • Create an isolation rule
              • View isolation rules
              • Edit an isolation rule
              • Delete an isolation rule
          • Monitor tenant performance
            • Overview
            • View performance and SQL monitoring details
            • View transaction monitoring details
            • View storage and cache monitoring details
            • OBKV-HBase
            • Customize a monitoring dashboard for a tenant
          • Diagnostics
            • Top SQL
          • Manage tenant parameters
            • Manage tenant parameters
            • Parameters for tenants
          • Manage databases and accounts
            • Create and manage accounts
            • Create a database
            • Manage databases
          • Switch primary zone
        • Monitor instance performance
          • Overview
          • Monitor the performance of databases in an instance
          • Monitor multi-dimensional metrics of an instance
          • Monitor the performance of hosts in a cluster
          • Customize monitoring dashboards for an instance
        • Manage major compactions
          • Initiate major compactions
          • View compaction records
          • Update time for compactions
        • Manage instance parameters
          • Parameter management overview
          • Parameters for cluster instances
        • Change instance configurations
          • View history of configuration changes
          • Change configuration
          • Switch the deployment mode
        • Database proxy
          • Overview
          • Manage database proxy
        • Manage alerts
          • Overview
          • Manage alert rules
            • Create an alert rule
            • View an alert rule
            • Edit an alert rule
            • Delete an alert rule
          • View alert history
          • Manage alert templates
            • Create an alert template
            • View an alert template
            • Edit an alert template
            • Copy an alert template
            • Delete an alert template
          • Manage muting rules
            • Create an alert muting rule
            • View an alert muting rule
            • Edit an alert muting rule
            • Delete an alert muting rule
          • Manage alert contacts
            • Add an alert contact
            • Add an alert contact group
            • View an alert contact
            • Edit an alert contact
            • Delete an alert contact
            • Obtain a webhook URL
          • Monitoring metrics for alerts
        • Backup and restore
          • Backup overview
          • Backup strategies
          • Immediate backup
          • Data backup
          • Initiate restore
          • Data restore
        • Diagnostics
          • View performance monitoring data
          • Top SQL
          • Capacity diagnostics
          • Request analysis
        • Manage tags
        • Manage recycle Bin
          • View instance recycle bin
          • Manage databases and tables in recycle bin
            • Overview
            • Instance-level recycle bin
            • Tenant-level recycle bin
      • Performance test
    • Connect Key-Value instances
      • Overview
      • Connect using a public IP address
  • Migrations
    • Data migration and import solutions
    • Data assessment and migration quick start
    • Assess compatibility
      • Overview
      • Perform online assessment
      • Perform offline assessment
      • Manage compatibility assessment tasks
        • View a compatibility assessment task
        • View and download a compatibility assessment report
        • Stop a compatibility assessment task
        • Delete a compatibility assessment task
      • Obtain files for upload
      • Configure PrivateLink
      • Add an IP address to an allowlist
    • Migrate data
      • Overview
      • Migrations specification
      • Purchase a data migration instance
      • Migrate data from a MySQL database to a MySQL-compatible tenant of OceanBase Database
      • Migrate data from a MySQL-compatible tenant of OceanBase Database to a MySQL database
      • Migrate data between OceanBase database tenants of the same compatibility mode
      • Migrate data between OceanBase database tenants of different compatibility modes
      • Migrate data from an Oracle database to an Oracle-compatible tenant of OceanBase Database
      • Migrate data from an Oracle-compatible tenant of OceanBase Database to an Oracle database
      • Configure a two-way synchronization task
      • Migrate data from an OceanBase database to a Kafka instance
      • Migrate data from a TiDB database to a MySQL-compatible tenant of OceanBase Database
      • Migrate incremental data from a MySQL-compatible tenant of OceanBase Database to a TiDB Database
      • Migrate data from a PostgreSQL database to an OceanBase database
      • Migrate incremental data from an OceanBase Database to a PostgreSQL database
      • Manage data migration tasks
        • View details of a data migration task
        • Rename a data migration task
        • View and modify migration objects
        • View and modify migration parameters
        • Configure alert monitoring
        • Manage data migration tasks by using tags
        • Start, stop, and resume a data migration task
        • Clone a data migration task
        • Terminate and release a data migration task
      • Features
        • Custom DML/DDL configurations
        • DDL synchronization scope
        • Use SQL conditions to filter data
        • Rename a migration object
        • Set an incremental synchronization timestamp
        • Instructions on schema migration
        • Configure and modify matching rules
        • Wildcard rules
        • Import migration objects
        • Download conflict data
        • Change a topic
        • Column filtering
        • Data formats
      • Authorize an Alibaba Cloud account
      • SQL statements for querying table objects
      • Online DDL tools
      • Create a trigger
      • Modify the log level of a self-managed PostgreSQL instance
      • Supported DDL statements for synchronization and their limitations
        • DDL synchronization from Aurora MySQL DB clusters to MySQL-compatible tenants of OceanBase Database
        • DDL synchronization from MySQL-compatible tenants of OceanBase Database to Aurora MySQL DB clusters
        • DDL synchronization between MySQL-compatible tenants of OceanBase Database
        • DDL synchronization from Oracle databases to Oracle-compatible tenants of OceanBase Database
        • DDL synchronization from Oracle-compatible tenants of OceanBase Database to Oracle databases
        • DDL synchronization between Oracle-compatible tenants of OceanBase Database
        • DDL synchronization from OceanBase databases to Kafka instances
    • Data subscription
      • Create a data subscription task
      • Manage data subscription tasks
        • View details of a data subscription task
        • Configure subscription information
        • Modify the name of a data subscription task
        • View and modify subscription objects
        • View data subscription parameters
        • Set up data subscription alerts
        • Start, stop, and resume data subscription tasks
        • Clone a data subscription task
        • Release a data subscription task
      • Manage private connections for data subscriptions
      • Configure consumer subscription
      • Message formats
    • Data validation
      • Overview
      • Create a data validation task
      • Manage data validation tasks
        • View details of a data validation task
        • View and modify validation objects
        • View and modify validation parameters
        • Manage data validation tasks with tags
        • Start, pause, and resume data validation tasks
        • Clone a data validation task
        • Release a data validation task
      • Features
        • Import validation objects
        • Rename the validation object
        • Filter objects by using SQL conditions
        • Configure the matching rules for the validation object
    • Assess performance
      • Overview
      • Obtain traffic files from a database instance
      • Create a full performance assessment task
      • Create an SQL file parsing task
      • Create an SQL file replay task
      • Manage performance assessment tasks
        • View the details of a performance assessment task
        • View a performance assessment report
        • Retry and stop a performance assessment task
        • Delete a performance assessment task
      • Obtain a database instance
      • Create an access key
    • Import data
      • Import data
      • Direct load
      • Supported file formats and encoding formats for Data Import
      • Sample data introduction
    • Binlog service
      • Overview
      • Purchase the Binlog service
      • Manage Binlog Service
        • View details of the Binlog service
        • Change configuration
        • Modify the auto-scaling strategy for storage space
        • Modify the elasticity strategy for compute units
        • Disable the Binlog service
  • Security
    • OceanBase Cloud account settings
      • Modify login password
      • Multi-factor authentication
      • Manage AccessKeys
      • Time zone settings
      • Manage cloud marketplace accounts
      • Account audit
    • Organizations and projects
      • Overview
      • Manage organization information
      • Project management
        • Manage projects
        • Cross-project bidirectional authorization
        • Subscribe to project messages
      • Manage members
      • Permissions for roles
      • Cost management
        • Overview
        • Cost details
        • Manage cost units
      • Operation audit
    • Database accounts and privileges
      • Account privileges
      • Authorize cloud vendor accounts
      • AWS KMS key management
      • Support access control
    • Security and encryption
      • Set allowlist groups
      • SSL encryption
      • Transparent Data Encryption (TDE)
    • Monitoring dashboard
    • Events
  • SQL Console
    • Overview
    • Access SQL Console
    • SQL editing and execution
    • PL compilation
    • Result set editing
    • Execution analysis
    • Database object management
      • Create a table
      • Create a view
      • Create a function
      • Create a stored procedure
      • Create a program package
      • Create a trigger
      • Create a type
      • Create a sequence
      • Create a synonym
    • Session variable management
    • Functional keys in SQL Console
  • Integrations
    • Overview
    • Schema evolution
      • Liquibase
      • Flyway
    • Data ingestion
      • Canal
      • dbt
      • Debezium
      • Flink
      • Glue
      • Informatica Cloud
      • Kafka
      • Maxwell
      • SeaTunnel
      • DataWorks
      • NiFi
    • SQL development
      • DataGrip
      • DBeaver
      • Navicat
      • TablePlus
    • Orchestration
      • DolphinScheduler
      • Linkis
      • Airflow
    • Visualization
      • Grafana
      • Power BI
      • Quick BI
      • Superset
      • Tableau
    • Observability
      • Datadog
      • Prometheus
    • Database management
      • Bytebase
    • AI
      • LlamaIndex
      • Dify
      • LangChain
      • Tongyi Qianwen
      • OpenAI
      • n8n
      • Trae
      • SpringAI
      • Cline
      • Cursor
      • Continue
      • Toolbox
      • CamelAI
      • Firecrawl
      • Hugging Face
      • Ollama
      • Google Gemini
      • Cloudflare Workers AI
      • Qoder
      • OpenCode
      • Claude Code
      • GitHub Copilot
      • Codex
      • Jina AI
      • Augment Code
      • Claude Code
      • Kiro
    • Development tools
      • Cloudflare Workers
      • Vercel
  • Best practices
    • Best practices for achieving high availability through cross-cloud active-active deployment
    • High availability through cross-cloud primary-standby databases (1:1)
    • High availability through cross-cloud primary-standby databases (1:n)
    • High host CPU usage
    • Best practices for read/write splitting in OceanBase Cloud
  • References
    • System architecture
    • System management
    • Database object management
    • Database design and specification constraints
    • SQL reference
    • System views
    • Parameters and system variables
    • Error codes
    • Performance tuning
    • Open API References
      • Overview
      • Service endpoints
      • Using API
      • Open APIs
        • Cluster management
          • DescribeInstances
          • DescribeInstance
          • CreateInstance
          • DeleteInstance
          • ModifyInstanceName
          • describe-node-options
          • StopCluster
          • StartCluster
          • ModifyInstanceSpec
          • DescribeInstanceTopology
          • DescribeReadonlyInstances
          • CreateReadonlyInstance
          • ModifyReadonlyInstanceSpec
          • ModifyReadonlyInstanceDiskSize
          • ModifyReadonlyInstanceNodeNum
          • DeleteReadonlyInstance
          • DescribeInstanceAvailableRoZones
          • DescribeInstanceParameters
          • UpdateInstanceParameters
          • DescribeInstanceParametersHistory
          • ModifyInstanceTagList
          • ModifyInstanceNodeNum
        • Tenant management
          • DescribeTenants
          • DescribeTenant
          • CreateTenants
          • DeleteTenants
          • ModifyTenantName
          • ModifyTenant
          • ModifyTenantUserDescription
          • ModifyTenantUserStatus
          • GetTenantCreateConstraints
          • ModifyTenantPrimaryZone
          • GetTenantCreateCpuConstraints
          • GetTenantCreateMemConstraints
          • GetTenantModifyCpuConstraints
          • GetTenantModifyMemConstraints
          • CreateTenantSecurityIpGroup
          • DescribeTenantSecurityIpGroups
          • ModifyTenantSecurityIpGroup
          • DeleteTenantSecurityIpGroup
          • DescribeTenantPrivateLink
          • DeletePrivatelinkConnection
          • CreatePrivatelinkService
          • ConnectPrivatelinkService
          • AddPrivatelinkServiceUser
          • BatchKillProcessList
          • DescribeProcessStatsComposition
          • DescribeTenantAvailableRoZones
          • DescribeTenantAddressInfo
          • ModifyTenantReadonlyReplica
          • DescribeTenantParameters
          • UpdateTenantParameters
          • DescribeTenantParametersHistory
          • ModifyTenantTagList
        • Tenant user management
          • CreateTenantUser
          • DescribeTenantUsers
          • DeleteTenantUsers
          • ModifyTenantUserPassword
          • ModifyTenantUserRoles
        • Database management
          • CreateDatabase
          • DescribeDatabases
          • DeleteDatabases
          • ModifyDatabaseUserRoles
        • Backup and restore
          • DescribeDataBackupSet
          • DescribeRestorableTenants
          • ModifyBackupStrategy
          • CreateTenantRestoreTask
          • CreateDataBackupTask
          • DescribeOneDataBackupSet
        • Database proxy management
          • CreateTenantAddress
          • CreateTenantSingleTunnelSLBAddress
          • DeleteTenantAddress
          • DescribeTenantAddress
          • ModifyOdpClusterSpec
          • ModifyTenantAddressPort
          • ModifyTenantAddressDomainPrefix
          • ConfirmPrivatelinkConnection
          • DescribeTenantAddressInfo
        • Monitoring management
          • DescribeTenantMetrics
          • DescribeMetricsData
          • DescribeNodeMetrics
        • Diagnostic management
          • DescribeOasTopSQLList
          • DescribeOasAnomalySQLList
          • DescribeOasSlowSQLList
          • DescribeOasSQLText
          • DescribeSqlAudits
          • DescribeOutlineBinding
          • DescribeSampleSqlRawTexts
          • DescribeSQLTuningAdvices
          • DescribeOasSlowSQLSamples
          • DescribeOasSQLTrends
          • DescribeOasSQLPlanGroup
        • Security management
          • CreateSecurityIpGroup
          • DescribeInstanceSSL
          • ModifyInstanceSSL
          • DescribeTenantEncryption
          • ModifyTenantEncryption
          • ModifySecurityIps
          • DeleteSecurityIpGroup
          • DescribeTenantSecurityConfigs
          • DescribeInstanceSecurityConfigs
        • Tag management
          • DescribeTags
          • CreateTags
          • UpdateTag
          • DeleteTag
        • Historical event management
          • DescribeOperationEvents
      • Differences between ApsaraDB for OceanBase APIs and OceanBase Cloud APIs
    • Download OBClient
      • Download OBClient
      • Download OceanBase Connector/J
      • Download client ODC
      • Download OceanBase Connector/ODBC
      • Download OBClient Libs
    • Metrics References
      • Cluster database
      • Cluster hosts
      • Binlog service
      • Cross-cloud network channel connection
      • Performance and SQL
      • Transactions
      • Storage and caching
      • Proxy database
      • Proxy host
    • ODC User Guide
      • What is ODC?
        • What is ODC?
        • Limitations
      • Quick Start
        • Client ODC
          • Overview
          • Install Client ODC
          • Use Client ODC
        • Web ODC
          • Overview
          • Use Web ODC
      • Data Source Management
        • Create a data source
        • Data sources and project collaboration
        • Database O&M
          • Session management
          • Global variable management
          • Recycle bin management
      • SQL Development
        • Edit and execute SQL statements
        • Perform PL compilation and debugging
        • Edit and export the result set of an SQL statement
        • Execution analysis
        • Generate test data
        • System settings
        • Database objects
          • Table objects
            • Overview
            • Create a table
          • View objects
            • Overview
            • Create a view
            • Manage views
          • Materialized view objects
            • Overview
            • Create a materialized view
            • Manage materialized views
          • Function objects
            • Overview
            • Create a function
            • Manage functions
          • Stored procedure objects
            • Overview
            • Create a stored procedure
            • Manage stored procedures
          • Sequence objects
            • Overview
            • Create a sequence
            • Manage sequences
          • Package objects
            • Overview
            • Create a program package
            • Manage program packages
          • Trigger objects
            • Overview
            • Create a trigger
            • Manage triggers
          • Type objects
            • Overview
            • Create a type
            • Manage types
          • Synonym objects
            • Overview
            • Create a synonym
            • Manage synonyms
      • Import and Export
        • Import schemas and data
        • Export schemas and data
      • Database Change Management
        • User Permission Management
          • Users and roles
          • Automatic authorization
          • User permission management
        • Project collaboration management
        • Risk levels, risk identification rules, and approval processes
        • SQL check specifications
        • SQL window specification
        • Database change management
        • Batch database change management
        • Online schema changes
        • Synchronize shadow tables
        • Schema comparison
      • Data Lifecycle Management
        • Partitioning Plan Management
          • Manage partitioning plans
          • Set partitioning strategies
          • Examples
        • SQL plan task
      • Data Desensitization and Auditing
        • Desensitize data
        • Operation records
      • Notification Management
        • Overview
        • View notification records
        • Manage Notification Channel
          • Create a notification channel
          • View, edit, and delete a notification channel
          • Configure a custom channel
        • Manage notification rules
      • Best Practices
        • Tips for SQL development
        • Explore ODC team workspaces
        • Understanding real-time SQL diagnostics for OceanBase AP
        • OceanBase historical database solutions
        • ODC SQL check for automatic identification of high-risk operations
        • Manage and modify sharded databases and tables via ODC
        • Data masking and control practices
        • Enterprise-level control and collaboration: Safeguard every database change
    • Data Development
      • Overview
      • Workspace management
      • Worksheet management
      • Compute node pool management
      • Workflow management
      • Dashboard management
      • Manage Git repositories
      • SQL development
        • SQL editing and execution
        • Result set editing
        • Execution analysis
        • Database object management
          • Create a table
          • Create a view
          • Create a function
          • Create a stored procedure
        • Session variable management
        • Git integration
      • Sample datasets
      • Data development terms
  • Manage Billing
    • Access billing
    • View monthly bills
    • View payment details
    • View orders
    • Use vouchers for payment
    • View invoices
  • Legal Agreements
    • OceanBase Cloud Services Agreement
    • Service Level Agreement
    • OceanBase Data Processing Addendum
    • Service Level Agreement for OceanBase Cloud Migration Service

Download PDF

Release notes for 2026 Release notes for 2025 Release notes for 2024 Release history Data development module deprecation notice Optimization of Backup and Restore commercialization strategy Cross-AZ data transfer billing (OceanBase Cloud on AWS) Database Proxy pricing update AWS instance pricing adjustment Overview Management mode and scenarios High availability with cross-cloud active-active architecture High availability with cross-cloud primary-standby databases Multi-level caching in shared storage Multi-layer online scaling and on-demand adjustment Deployment modes Storage architecture Product specifications Overview Backup and restore billing SQL audit billing Migrations billing Database proxy billing Binlog service billing Overview of OceanBase Cloud support plans Read-only replica billing Supported database versions Get started with a transactional instance Get started with an analytical instance Get started with a Key-Value instance Overview Overview Create via OceanBase Cloud official website Create via AWS Marketplace Create via GCP Marketplace Create via Huawei Cloud Marketplace Create via Alibaba Cloud Marketplace Create via Azure Marketplace Release an instance Manage tags Manage JVM-dependent services Create a data source Manage data sources Archive data Clean up data Instance recycle bin Overview Core features Create an instance Overview Table overview Export data OceanBase data processing Statistics Materialized views for query acceleration Select a query parallelism level Instance overview Change configuration Modify primary zone Manage parameters Stop and restart instances Release instances Manage tags Archive data Clean up data Use the DBMS_XPLAN package for performance diagnostics Use the GV$SQL_PLAN_MONITOR view for performance analysis Views related to AP performance analysis Performance testing Product integration View instance recycle bin Create an instance Create a tenant Create an account for a database user OBKV HBase data operation 
examples Create an instance OBKV-HBase Overview Create an instance Performance test Overview Connect using a public IP address Data migration and import solutions Data assessment and migration quick start Overview Perform online assessment Perform offline assessment Obtain files for upload Configure PrivateLink Add an IP address to an allowlist Overview Migrations specification Purchase a data migration instance Migrate data from a MySQL database to a MySQL-compatible tenant of OceanBase Database Migrate data from a MySQL-compatible tenant of OceanBase Database to a MySQL database Migrate data between OceanBase database tenants of the same compatibility mode Migrate data between OceanBase database tenants of different compatibility modes Migrate data from an Oracle database to an Oracle-compatible tenant of OceanBase Database Migrate data from an Oracle-compatible tenant of OceanBase Database to an Oracle database Configure a two-way synchronization task Migrate data from an OceanBase database to a Kafka instance
    SpringBatch sample application for connecting to OceanBase Cloud

    Last Updated: 2026-04-07 08:08:33

    This topic describes how to use the Spring Batch framework with OceanBase Cloud to build an application that performs basic operations such as creating tables, inserting data, and querying data.

    Download the Java OceanBase SpringBatch sample project

    Prerequisites

    • You have registered an Alibaba Cloud account and created an instance and a MySQL-compatible tenant. For more information, see Create an instance and Create a tenant.
    • You have installed JDK 1.8 and Maven.
    • You have installed IntelliJ IDEA.

    Note

    The code examples in this topic are run in IntelliJ IDEA 2021.3.2 (Community Edition). You can also use your preferred tool to run the code examples.

    Procedure

    Note

    The operations described in this topic are performed on Windows. If you use another operating system or development tool, the steps may vary slightly.

    1. Obtain the connection string of the OceanBase Cloud database.
    2. Import the java-oceanbase-springbatch project into IntelliJ IDEA.
    3. Modify the database connection information in the java-oceanbase-springbatch project.
    4. Run the java-oceanbase-springbatch project.

    Step 1: Obtain the connection string of the OceanBase Cloud database

    1. Log in to the OceanBase Cloud console. In the instance list, find the target instance and choose Connect > Get Connection String for the target tenant.

      For more information, see Obtain the connection string.

    2. Construct the following URL from the connection information of your OceanBase Cloud database.

      Note

      The URL information is required in the application.properties file.

      jdbc:oceanbase://host:port/schema_name?user=$user_name&password=$password&characterEncoding=utf-8
      

      Parameter description:

      • host: the endpoint of the OceanBase Cloud database, for example, t********.********.oceanbase.cloud.
      • port: the port of the OceanBase Cloud database. The default value is 3306.
      • schema_name: the name of the schema to be accessed.
      • user_name: the username for accessing the database.
      • password: the password for the account.
      • characterEncoding: the character encoding.

    For more information about the URL parameters, see Database URL.
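If you prefer to assemble the URL in code, the following standalone sketch builds it from the same parameters. The `JdbcUrlBuilder` class and the example values are illustrative and are not part of the sample project:

```java
// Illustrative helper (not part of the sample project): assembles the JDBC URL
// from the host, port, schema, account, and encoding values gathered in this step.
public class JdbcUrlBuilder {
    public static String build(String host, int port, String schemaName,
                               String userName, String password) {
        return String.format(
            "jdbc:oceanbase://%s:%d/%s?user=%s&password=%s&characterEncoding=utf-8",
            host, port, schemaName, userName, password);
    }

    public static void main(String[] args) {
        // Example values only; substitute your own connection information.
        System.out.println(build("t1234.example.oceanbase.cloud", 3306,
                                 "test", "mysql001", "pw"));
    }
}
```

Note that if the password contains URL-reserved characters, encode it first (see the FAQ on special characters below).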

    Step 2: Import the java-oceanbase-springbatch project into IntelliJ IDEA

    1. Open IntelliJ IDEA and choose File > Open....

      file

    2. In the Open File or Project window that appears, select the project file and click OK.

    3. IntelliJ IDEA automatically identifies the files in the project and displays the project structure, file list, module list, and dependencies in the Project tool window. This window usually appears on the left side of the IDE and is open by default. If it is closed, choose View > Tool Windows > Project from the menu bar or press Alt + 1 to reopen it.

      Note

      When you import a project, IntelliJ IDEA automatically detects the pom.xml file, downloads the dependency libraries it declares, and adds them to the project.

    4. View the project.

    springbatch

    Step 3: Modify the database connection information in the java-oceanbase-springbatch project

    Modify the database connection information in the application.properties file based on the information obtained in Step 1: Obtain the connection string of the OceanBase Cloud database.

    Here is an example:

    • The name of the database driver is com.mysql.cj.jdbc.Driver.
    • The endpoint of the OceanBase Cloud database is t5******.********.oceanbase.cloud.
    • The port is 3306.
    • The name of the schema to be accessed is test.
    • The username of the tenant is mysql001.
    • The password is ******.

    Here is the sample code:

    spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
    spring.datasource.url=jdbc:oceanbase://t********.********.oceanbase.cloud:3306/test?characterEncoding=utf-8
    spring.datasource.username=mysql001
    spring.datasource.password=******
    
    spring.jpa.show-sql=true
    spring.jpa.hibernate.ddl-auto=update
    
    spring.batch.job.enabled=false
    
    logging.level.org.springframework=INFO
    logging.level.com.example=DEBUG
    
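Spring Boot binds the spring.datasource.* keys automatically at startup, so the project needs no extra parsing code. Purely as a sanity check, the same key=value syntax can be read with java.util.Properties; the class below is an illustrative standalone sketch, not part of the sample project:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Illustrative only: parse spring.datasource.* keys with java.util.Properties
// to confirm the file uses plain key=value syntax that Spring Boot can bind.
public class PropsCheck {
    public static Properties parse(String text) {
        Properties p = new Properties();
        try {
            p.load(new StringReader(text));
        } catch (IOException e) {
            throw new RuntimeException(e); // StringReader never actually throws here
        }
        return p;
    }

    public static void main(String[] args) {
        String cfg = "spring.datasource.username=mysql001\n"
                   + "spring.datasource.url=jdbc:oceanbase://host:3306/test\n";
        // Prints: mysql001
        System.out.println(parse(cfg).getProperty("spring.datasource.username"));
    }
}
```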

    Step 4: Run the java-oceanbase-springbatch project

    • Run the AddDescPeopleWriterTest.java file.

      1. In the project structure, go to src > test > java and find the AddDescPeopleWriterTest.java file.
      2. In the tool menu bar, choose Run > Run... > AddDescPeopleWriterTest.testWrite, or click the green triangle in the upper-right corner to run.
      3. View the log information and output results in the console of IntelliJ IDEA.
      Data in the people_desc table:
      PeopleDESC [name=John, age=25, desc=This is John with age 25]
      PeopleDESC [name=Alice, age=30, desc=This is Alice with age 30]
      Batch Job execution completed.
      
    • Run the AddPeopleWriterTest.java file.

      1. In the project structure, go to src > test > java and find the AddPeopleWriterTest.java file.
      2. In the tool menu bar, choose Run > Run... > AddPeopleWriterTest.testWrite, or click the green triangle in the upper-right corner to run.
      3. View the log information and output results in the console of IntelliJ IDEA.
      Data in the people table:
      People [name=zhangsan, age=27]
      People [name=lisi, age=35]
      Batch Job execution completed.
      

    FAQ

    1. Connection timeout

    If you encounter a connection timeout issue, you can configure the connection timeout parameter in the JDBC URL:

    jdbc:mysql://host:port/database?connectTimeout=30000&socketTimeout=60000
    

    2. Character set

    To ensure the correct character encoding, set the appropriate character set parameter in the JDBC URL:

    jdbc:mysql://host:port/database?characterEncoding=utf8&useUnicode=true
    

    3. SSL connection

    To enable an SSL connection to OceanBase Cloud, add the following parameter to the JDBC URL:

    jdbc:mysql://host:port/database?useSSL=true&requireSSL=true
    

    4. Special characters in the account password

    If the username or password contains special characters (such as #), you need to URL-encode them:

    String encodedPassword = URLEncoder.encode(password, "UTF-8");
    
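    As a runnable illustration of this step, the password can be percent-encoded before it is embedded in the connection URL. The host, database, credentials, and helper name below are placeholders, not values from this project; note that URLEncoder performs application/x-www-form-urlencoded encoding, which is sufficient for passwords without spaces:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class PasswordEncodingExample {
    // Builds a JDBC URL with a percent-encoded password.
    // Host, database, and credentials are placeholders for illustration.
    static String buildJdbcUrl(String host, int port, String database,
                               String user, String rawPassword) {
        // '#' and other special characters must be encoded before they
        // are embedded in the URL; '#' becomes "%23".
        String encoded = URLEncoder.encode(rawPassword, StandardCharsets.UTF_8);
        return "jdbc:mysql://" + host + ":" + port + "/" + database
                + "?user=" + user + "&password=" + encoded;
    }

    public static void main(String[] args) {
        // "pass#word" becomes "pass%23word" inside the URL.
        System.out.println(buildJdbcUrl("localhost", 3306, "test", "app_user", "pass#word"));
    }
}
```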

    Notice

    When using MySQL Connector/J 8.x, ensure that the account password does not contain the # character. Otherwise, you may encounter a connection error.

    Project code

    Click java-oceanbase-springbatch to download the project code, which is a compressed file named java-oceanbase-springbatch.

    After decompressing it, you will find a folder named java-oceanbase-springbatch. The directory structure is as follows:

    │  pom.xml
    │
    ├─.idea
    │
    ├─src
    │  ├─main
    │  │  ├─java
    │  │  │  └─com
    │  │  │      └─oceanbase
    │  │  │          └─example
    │  │  │              └─batch
    │  │  │                  ├─BatchApplication.java
    │  │  │                  │
    │  │  │                  ├─config
    │  │  │                  │   └─BatchConfig.java
    │  │  │                  │
    │  │  │                  ├─model
    │  │  │                  │   ├─People.java
    │  │  │                  │   └─PeopleDESC.java
    │  │  │                  │
    │  │  │                  ├─processor
    │  │  │                  │   └─AddPeopleDescProcessor.java
    │  │  │                  │
    │  │  │                  └─writer
    │  │  │                      ├─AddDescPeopleWriter.java
    │  │  │                      └─AddPeopleWriter.java
    │  │  │
    │  │  └─resources
    │  │      └─application.properties
    │  │
    │  └─test
    │      └─java
    │          └─com
    │              └─oceanbase
    │                  └─example
    │                      └─batch
    │                          ├─config
    │                          │   └─BatchConfigTest.java
    │                          │
    │                          ├─processor
    │                          │   └─AddPeopleDescProcessorTest.java
    │                          │
    │                          └─writer
    │                              ├─AddDescPeopleWriterTest.java
    │                              └─AddPeopleWriterTest.java
    │
    └─target
    

    File description:

    • pom.xml: the configuration file of the Maven project, which contains information about the project's dependencies, plugins, and build process.
    • .idea: a directory used by the IDE (Integrated Development Environment) to store project-related configuration information.
    • src: a directory typically used to store the source code of the project.
    • main: a directory that stores the main source code and resource files.
    • java: a directory that stores the Java source code.
    • com.oceanbase.example.batch: the package name.
    • BatchApplication.java: the entry class of the application, which contains the main method of the application.
    • config: a directory that stores the configuration classes of the application.
    • BatchConfig.java: the configuration class of the application, used to configure some properties and behaviors of the application.
    • model: a directory that stores the data model classes of the application.
    • People.java: a data model class for people.
    • PeopleDESC.java: a data model class for people's DESC information.
    • processor: a directory that stores the processor classes of the application.
    • AddPeopleDescProcessor.java: a processor class for adding people's DESC information.
    • writer: a directory that stores the writer classes of the application.
    • AddDescPeopleWriter.java: a writer class for writing people's DESC information.
    • AddPeopleWriter.java: a writer class for writing people's information.
    • resources: a directory that stores the configuration files and other static resources of the application.
    • application.properties: the configuration file of the application, used to configure the properties of the application.
    • test: a directory that stores the test code and resource files.
    • BatchConfigTest.java: the test class of the application's configuration class.
    • AddPeopleDescProcessorTest.java: the test class of the add-people-DESC processor.
    • AddDescPeopleWriterTest.java: the test class of the writer for writing people's DESC information.
    • AddPeopleWriterTest.java: the test class of the writer for writing people's information.
    • target: a directory that stores the compiled Class files, JAR packages, and other files.

    Introduction to the pom.xml file

    Note

    If you only want to verify the example, you can use the default code without any modifications. You can also modify the pom.xml file according to your needs as described below.

    The content of the pom.xml configuration file is as follows:

    1. File declaration statement.

      This statement declares that the file is an XML file using XML version 1.0 and character encoding UTF-8.

      Sample code:

      <?xml version="1.0" encoding="UTF-8"?>
      
    2. Configure the namespaces and POM model version.

      1. Use xmlns to set the POM namespace to http://maven.apache.org/POM/4.0.0.
      2. Use xmlns:xsi to set the XML namespace to http://www.w3.org/2001/XMLSchema-instance.
      3. Use xsi:schemaLocation to set the POM namespace to http://maven.apache.org/POM/4.0.0 and the location of the POM's XSD file to https://maven.apache.org/xsd/maven-4.0.0.xsd.
      4. Use the <modelVersion> element to set the POM model version used by the POM file to 4.0.0.

      Sample code:

       <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
           <modelVersion>4.0.0</modelVersion>
      </project>
      
    3. Configure the parent project information.

      1. Use <groupId> to set the parent project identifier to org.springframework.boot.
      2. Use <artifactId> to set the parent project artifact to spring-boot-starter-parent.
      3. Use <version> to set the parent project version to 2.7.11.
      4. Use an empty <relativePath/> element so that Maven resolves the parent from the repository rather than from a local path.

      Sample code:

       <parent>
           <groupId>org.springframework.boot</groupId>
           <artifactId>spring-boot-starter-parent</artifactId>
           <version>2.7.11</version>
           <relativePath/>
       </parent>
      
    4. Configure the basic information.

      1. Use <groupId> to set the project identifier to com.oceanbase.
      2. Use <artifactId> to set the project artifact to java-oceanbase-springboot.
      3. Use <version> to set the project version to 0.0.1-SNAPSHOT.
      4. Use <name> to set the project name to java-oceanbase-springbatch, and use <description> to describe the project as Demo project for Spring Batch.

      Sample code:

       <groupId>com.oceanbase</groupId>
       <artifactId>java-oceanbase-springboot</artifactId>
       <version>0.0.1-SNAPSHOT</version>
       <name>java-oceanbase-springbatch</name>
       <description>Demo project for Spring Batch</description>
      
    5. Configure the Java version.

      Set the Java version used by the project to 1.8.

      Sample code:

        <properties>
            <java.version>1.8</java.version>
        </properties>
      
    6. Configure the core dependencies.

      1. Set the organization to org.springframework.boot, the name to spring-boot-starter, and use this dependency to access the components supported by Spring Boot, including Web, data processing, security, and Test.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        
      2. Set the organization to org.springframework.boot, the name to spring-boot-starter-jdbc, and use this dependency to access the JDBC-related features provided by Spring Boot, including connection pools and data source configurations.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
        
      3. Set the organization to org.springframework.boot, the name to spring-boot-starter-test, and the scope to test. Use this dependency to access the testing framework and tools provided by Spring Boot, including JUnit, Mockito, and Hamcrest.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        
      4. Set the organization to com.oceanbase, the name to oceanbase-client, and the version to 2.4.12. Use this dependency to access the client features provided by OceanBase, including connections, queries, and transactions.

        Sample code:

            <dependency>
                <groupId>com.oceanbase</groupId>
                <artifactId>oceanbase-client</artifactId>
                <version>2.4.12</version>
            </dependency>
        
      5. Set the organization to org.springframework.boot, the name to spring-boot-starter-batch, and use this dependency to access the batch processing features provided by Spring Boot.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-batch</artifactId>
        </dependency>
        
      6. Set the organization to org.springframework.boot and the name to spring-boot-starter-data-jpa. Use this starter dependency to bring in the dependencies and configurations required for data access with JPA.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        
      7. Set the organization to org.apache.tomcat, the name to tomcat-jdbc, and use this dependency to access the JDBC connection pool features provided by Tomcat, including connection pool configurations, connection acquisition and release, and connection management.

        Sample code:

        <dependency>
            <groupId>org.apache.tomcat</groupId>
            <artifactId>tomcat-jdbc</artifactId>
        </dependency>
        
      8. Set the organization to junit, the name to junit, the version to 4.10, and the scope to test. Use this dependency to add the JUnit unit test dependency configuration.

        Sample code:

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
            <scope>test</scope>
        </dependency>
        
      9. Set the organization to javax.activation, the name to javax.activation-api, and the version to 1.2.0. Use this dependency to import the Java Activation Framework (JAF) library.

        Sample code:

        <dependency>
            <groupId>javax.activation</groupId>
            <artifactId>javax.activation-api</artifactId>
            <version>1.2.0</version>
        </dependency>
        
      10. Set the organization to jakarta.persistence, the name to jakarta.persistence-api, and the version to 2.2.3. Use this dependency to add the Jakarta Persistence API dependency configuration. Sample code:

        <dependency>
            <groupId>jakarta.persistence</groupId>
            <artifactId>jakarta.persistence-api</artifactId>
            <version>2.2.3</version>
        </dependency>
        
    7. Configure the Maven plugin.

      Set the organization to org.springframework.boot, the name to spring-boot-maven-plugin, and use this plugin to package Spring Boot applications into executable JAR or WAR packages that can be directly run.

      Sample code:

       <build>
           <plugins>
               <plugin>
                   <groupId>org.springframework.boot</groupId>
                   <artifactId>spring-boot-maven-plugin</artifactId>
               </plugin>
           </plugins>
       </build>
      

    application.properties file

    The application.properties file is used to configure database connections and other related settings. This includes database drivers, connection URLs, usernames, and passwords. It also contains configurations for JPA (Java Persistence API) and Spring Batch, as well as log level settings.

    1. Database connection configuration.

      • Use spring.datasource.driver-class-name to specify the database driver as com.mysql.cj.jdbc.Driver, which is used to connect to the OceanBase Cloud database.
      • Use spring.datasource.url to specify the URL for connecting to the database.
      • Use spring.datasource.username to specify the username for connecting to the database.
      • Use spring.datasource.password to specify the password for connecting to the database.

      Sample code:

      spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
      spring.datasource.url=jdbc:oceanbase://host:port/schema_name?characterEncoding=utf-8
      spring.datasource.username=user_name
      spring.datasource.password=******
      
    2. JPA configuration.

      • Use spring.jpa.show-sql to specify whether to display SQL statements in the log, set to true to display SQL statements.
      • Use spring.jpa.hibernate.ddl-auto to specify the Hibernate DDL operation behavior, set to update to automatically update the database structure when the application starts.

      Sample code:

      spring.jpa.show-sql=true
      spring.jpa.hibernate.ddl-auto=update
      
    3. Spring Batch configuration:

      Use spring.batch.job.enabled to specify whether to enable Spring Batch jobs, set to false to disable automatic execution of batch jobs.

      Sample code:

      spring.batch.job.enabled=false
      

      Note

      In Spring Batch, the spring.batch.job.enabled property controls the execution behavior of batch jobs.

      • spring.batch.job.enabled=true (default): Indicates that all defined batch jobs are automatically executed when the Spring Boot application starts. This means that Spring Batch automatically discovers and executes all defined jobs when the application starts.
      • spring.batch.job.enabled=false: Indicates that automatic execution of batch jobs is disabled. This is typically used in development or testing environments, or when you want to manually control job execution. When set to false, jobs will not be automatically executed when the application starts. You can manually trigger jobs using other methods such as REST APIs or command-line interfaces.
      In summary, setting spring.batch.job.enabled=false helps prevent jobs from being automatically executed when the application starts, providing greater flexibility in controlling when batch jobs are executed.

    4. Log configuration:

      • Use logging.level.org.springframework to set the log level for the Spring framework to INFO.
      • Use logging.level.com.example to set the log level for custom application code to DEBUG.

      Sample code:

      logging.level.org.springframework=INFO
      logging.level.com.example=DEBUG
      

    BatchApplication.java file

    The BatchApplication.java file is the entry point of the Spring Boot application.

    The code in the BatchApplication.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the interfaces and classes included in the current file:

      • SpringApplication class: used to start the Spring Boot application.
      • SpringBootApplication annotation: used to mark the class as the entry point of the Spring Boot application.

      Sample code:

          import org.springframework.boot.SpringApplication;
          import org.springframework.boot.autoconfigure.SpringBootApplication;
      
    2. Define the BatchApplication class.

      Use the @SpringBootApplication annotation to mark the BatchApplication class as the entry point of the Spring Boot application. In the BatchApplication class, define a static main method as the entry point of the application. In this method, use the SpringApplication.run method to start the Spring Boot application. Also, define a method named runBatchJob to run the batch job.

      Sample code:

      
      
          @SpringBootApplication
          public class BatchApplication {
              public static void main(String[] args) {
                  SpringApplication.run(BatchApplication.class, args);
              }
      
              public void runBatchJob() {
              }
          }
      

    Introduction to the BatchConfig.java file

    The BatchConfig.java file is used to configure components such as steps, readers, processors, and writers for batch processing jobs.

    The code in the BatchConfig.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      The following interfaces and classes are declared in this file:

      • People class: used to store personnel information read from the database.
      • PeopleDESC class: used to store description information after the personnel information is converted or processed.
      • AddPeopleDescProcessor class: an implementation class of the ItemProcessor interface. It converts the People object read to the PeopleDESC object.
      • AddDescPeopleWriter class: an implementation class of the ItemWriter interface. It writes the PeopleDESC object to the target location.
      • Job interface: represents a batch processing job.
      • Step interface: represents a step in a job.
      • EnableBatchProcessing annotation: a Spring Batch configuration annotation used to enable and configure the Spring Batch processing feature.
      • JobBuilderFactory class: used to create and configure jobs.
      • StepBuilderFactory class: used to create and configure steps.
      • RunIdIncrementer class: a Spring Batch run ID incrementer that increments the run ID each time the job is run.
      • ItemProcessor interface: used to process or convert the read items.
      • ItemReader interface: used to read items from the data source.
      • ItemWriter interface: used to write the processed or converted items to the specified target location.
      • JdbcCursorItemReader class: used to read data from the database and return the cursor result set.
      • Autowired annotation: used for dependency injection.
      • Bean annotation: used to create and configure beans.
      • ComponentScan annotation: used to specify the package or class to be scanned for components.
      • Configuration annotation: used to mark a class as a configuration class.
      • EnableAutoConfiguration annotation: used to enable Spring Boot auto-configuration.
      • SpringBootApplication annotation: used to mark the class as the entry point of the Spring Boot application.
      • DataSource interface: used to represent the database connection.

      Code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import com.oceanbase.example.batch.processor.AddPeopleDescProcessor;
      import com.oceanbase.example.batch.writer.AddDescPeopleWriter;
      import org.springframework.batch.core.Job;
      import org.springframework.batch.core.Step;
      import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
      import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
      import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
      import org.springframework.batch.core.launch.support.RunIdIncrementer;
      import org.springframework.batch.item.ItemProcessor;
      import org.springframework.batch.item.ItemReader;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.batch.item.database.JdbcCursorItemReader;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
      import org.springframework.boot.autoconfigure.SpringBootApplication;
      import org.springframework.context.annotation.Bean;
      import org.springframework.context.annotation.ComponentScan;
      import org.springframework.context.annotation.Configuration;
      import org.springframework.jdbc.core.BeanPropertyRowMapper;
      
      import javax.sql.DataSource;
      
    2. Define the BatchConfig class.

      This is a simple Spring Batch batch processing job. It defines the methods for reading, processing, and writing data, and encapsulates these steps into a job. By using Spring Batch annotations and auto-configuration features, you can create corresponding component instances through the @Bean methods in the configuration class and use these components in step1 to complete data reading, processing, and writing.

      • Use @Configuration to indicate that this class is a configuration class.
      • Use @EnableBatchProcessing to enable the Spring Batch processing feature. This annotation automatically creates necessary beans, such as JobRepository and JobLauncher.
      • Use @SpringBootApplication as the main class annotation for Spring Boot applications, which is the starting point of the Spring Boot application.
      • Use @ComponentScan to specify the package to be scanned for components. It tells Spring to scan and register all components in this package and its subpackages.
      • Use @EnableAutoConfiguration to automatically configure the infrastructure of the Spring Boot application.

      Code:

       @Configuration
       @EnableBatchProcessing
       @SpringBootApplication
       @ComponentScan("com.oceanbase.example.batch.writer")
       @EnableAutoConfiguration
       public class BatchConfig {
       }
      
      1. Define the @Autowired annotation.

        Use the @Autowired annotation to inject JobBuilderFactory, StepBuilderFactory, and DataSource into the member variables of the BatchConfig class. JobBuilderFactory is a factory class used to create and configure jobs (Job), StepBuilderFactory is a factory class used to create and configure steps (Step), and DataSource is an interface used to obtain the database connection.

        Code:

        @Autowired
        private JobBuilderFactory jobBuilderFactory;
        
        @Autowired
        private StepBuilderFactory stepBuilderFactory;
        
        @Autowired
        private DataSource dataSource;
        
      2. Define the @Bean annotation.

        Use the @Bean annotation to define several methods for creating the reader, processor, writer, step, and job components of the batch processing job.

        • Use the peopleReader method to create an ItemReader component instance. This component uses JdbcCursorItemReader to read People object data from the database. Set the data source dataSource, set the RowMapper to map database rows to People objects, and set the SQL query statement to SELECT * FROM people.

        • Use the addPeopleDescProcessor method to create an ItemProcessor component instance. This component uses AddPeopleDescProcessor to process People objects and returns the converted PeopleDESC objects.

        • Use the addDescPeopleWriter method to create an ItemWriter component instance. This component uses AddDescPeopleWriter to write PeopleDESC objects to the target location.

        • Use the step1 method to create a Step component instance. The step name is step1. Obtain the step builder through stepBuilderFactory.get, set the reader to the ItemReader component, set the processor to the ItemProcessor component, set the writer to the ItemWriter component, set the chunk size to 10, and finally call build to build and return the configured Step.

        • Use the importJob method to create a Job component instance. The job name is importJob. Obtain the job builder through jobBuilderFactory.get, set the incrementer to RunIdIncrementer, set the initial step of the job flow to Step, and finally call build to build and return the configured Job.

          Code:

          @Bean
          public ItemReader<People> peopleReader() {
              JdbcCursorItemReader<People> reader = new JdbcCursorItemReader<>();
              reader.setDataSource(dataSource);
              reader.setRowMapper(new BeanPropertyRowMapper<>(People.class));
              reader.setSql("SELECT * FROM people");
              return reader;
          }
          
          @Bean
          public ItemProcessor<People, PeopleDESC> addPeopleDescProcessor() {
              return new AddPeopleDescProcessor();
          }
          
          @Bean
          public ItemWriter<PeopleDESC> addDescPeopleWriter() {
              return new AddDescPeopleWriter();
          }
          
          @Bean
          public Step step1(ItemReader<People> reader, ItemProcessor<People, PeopleDESC> processor,
                          ItemWriter<PeopleDESC> writer) {
              return stepBuilderFactory.get("step1")
                      .<People, PeopleDESC>chunk(10)
                      .reader(reader)
                      .processor(processor)
                      .writer(writer)
                      .build();
          }
          
          @Bean
          public Job importJob(Step step1) {
              return jobBuilderFactory.get("importJob")
                      .incrementer(new RunIdIncrementer())
                      .flow(step1)
                      .end()
                      .build();
          }
          
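    The chunk-oriented flow configured above (read, process, and write in chunks of 10) can be sketched without Spring. This simplified stand-in uses hypothetical names and plain Java to show the control flow that a chunk size of 10 implies: the writer receives buffered batches of at most 10 processed items, plus a final partial batch.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;

public class ChunkFlowSketch {
    // Simulates one chunk-oriented step: read items, process each one,
    // and hand the "writer" a buffer of at most chunkSize items.
    static <I, O> List<List<O>> runStep(Iterator<I> reader,
                                        Function<I, O> processor,
                                        int chunkSize) {
        List<List<O>> writes = new ArrayList<>();
        List<O> buffer = new ArrayList<>();
        while (reader.hasNext()) {
            buffer.add(processor.apply(reader.next()));
            if (buffer.size() == chunkSize) {   // chunk is full: "write" it
                writes.add(new ArrayList<>(buffer));
                buffer.clear();
            }
        }
        if (!buffer.isEmpty()) {                // flush the final partial chunk
            writes.add(buffer);
        }
        return writes;
    }

    public static void main(String[] args) {
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < 25; i++) items.add(i);
        // 25 items with a chunk size of 10 produce 3 write calls (10, 10, 5).
        List<List<String>> result = runStep(items.iterator(), n -> "item-" + n, 10);
        System.out.println(result.size());
    }
}
```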

    Introduction to the People.java file

    The People.java file defines a People class that stores information about a person. The class contains two private member variables, name and age, and corresponding getter and setter methods. It also overrides the toString method to print the information of an object. In this class, name indicates the name of a person, and age indicates the age of a person. You can use the getter and setter methods to obtain and set the values of these attributes.

    The People class is used to store and pass data in the input and output of a batch program. In the read and write operations of a batch program, you use the People object to store data, use the setter method to set data, and use the getter method to obtain data.

    Sample code:

        public class People {
            private String name;
            private int age;
    
            // Getters and setters
    
            public String getName() {
                return name;
            }
    
            public void setName(String name) {
                this.name = name;
            }
    
            public int getAge() {
                return age;
            }
    
            public void setAge(int age) {
                this.age = age;
            }
            @Override
            public String toString() {
                return "People [name=" + name + ", age=" + age + "]";
            }
        }
    

    Introduction to the PeopleDESC.java file

    The PeopleDESC.java file defines a PeopleDESC class that stores information about a person. The PeopleDESC class has four attributes: name, age, desc, and id, which indicate the name, age, description, and ID of a person, respectively. The class contains corresponding getter and setter methods to access and set the values of these attributes. It also overrides the toString method to return the string representation of the class, which contains the name, age, and description.

    Similar to the People class, the PeopleDESC class is used to store and pass data in the input and output of a batch program.

    Sample code:

        public class PeopleDESC {
            private String name;
            private int age;
            private String desc;
            private int id;
    
            public String getName() {
                return name;
            }
    
            public void setName(String name) {
                this.name = name;
            }
    
            public int getAge() {
                return age;
            }
    
            public void setAge(int age) {
                this.age = age;
            }
    
            public String getDesc() {
                return desc;
            }
    
            public void setDesc(String desc) {
                this.desc = desc;
            }
    
            public int getId() {
                return id;
            }
    
            public void setId(int id) {
                this.id = id;
            }
    
            @Override
            public String toString() {
                return "PeopleDESC [name=" + name + ", age=" + age + ", desc=" + desc + "]";
            }
        }
    

    Introduction to the AddPeopleDescProcessor.java file

    The AddPeopleDescProcessor.java file defines an AddPeopleDescProcessor class that implements the ItemProcessor interface. This class is used to convert a People object to a PeopleDESC object.

    The code in the AddPeopleDescProcessor.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      The code declares that the current file contains the following interfaces and classes:

      • People class: stores the information of a person read from a database.
      • PeopleDESC class: stores the description of a person after the information is converted or processed.
      • ItemProcessor interface: processes or converts the read items.

      Sample code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.springframework.batch.item.ItemProcessor;
      
    2. Define the AddPeopleDescProcessor class.

      The AddPeopleDescProcessor class implements the ItemProcessor interface. It converts a People object to a PeopleDESC object and implements the logic for processing input data in a batch program.

      In the process method of this class, first, a PeopleDESC object desc is created. Then, the name and age attributes of the People object are obtained from the item parameter, and the values of these attributes are set to the desc object. The desc attribute of the desc object is also set to a description generated based on the attributes of the People object. Finally, the processed PeopleDESC object is returned.

      Sample code:

      public class AddPeopleDescProcessor implements ItemProcessor<People, PeopleDESC> {
          @Override
          public PeopleDESC process(People item) throws Exception {
              PeopleDESC desc = new PeopleDESC();
              desc.setName(item.getName());
              desc.setAge(item.getAge());
              desc.setDesc("This is " + item.getName() + " with age " + item.getAge());
              return desc;
          }
      }
      
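    Outside of Spring Batch, the same transformation can be exercised directly. The following minimal sketch inlines a simplified stand-in for the model class (the names PersonRecord and describe are hypothetical, for illustration only) and reproduces the description string built in the process method above:

```java
// Simplified stand-in for the project's People model class;
// the name PersonRecord is illustrative only.
class PersonRecord {
    final String name;
    final int age;
    PersonRecord(String name, int age) { this.name = name; this.age = age; }
}

public class ProcessorSketch {
    // Mirrors the description string built in AddPeopleDescProcessor.process.
    static String describe(PersonRecord p) {
        return "This is " + p.name + " with age " + p.age;
    }

    public static void main(String[] args) {
        // prints "This is John with age 25"
        System.out.println(describe(new PersonRecord("John", 25)));
    }
}
```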

    Introduction to AddDescPeopleWriter.java

    The AddDescPeopleWriter.java file defines the AddDescPeopleWriter class, which implements the ItemWriter interface. This class is used to write PeopleDESC objects to the database.

    The code in the AddDescPeopleWriter.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare that the current file contains the following interfaces and classes:

      • PeopleDESC class: used to store description information after conversion or processing of personnel information.
      • ItemWriter interface: used to write processed or converted items to the specified target location.
      • @Autowired annotation: used for dependency injection.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • List interface: used to hold the batch of items passed to the write method.

      Code:

      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.jdbc.core.JdbcTemplate;
      
      import java.util.List;
      
    2. Define the AddDescPeopleWriter class.

      1. Use the @Autowired annotation to automatically inject the JdbcTemplate instance. This instance is used to perform database operations when writing data.

        Code:

            @Autowired
            private JdbcTemplate jdbcTemplate;
        
      2. In the write method, iterate through the input List<? extends PeopleDESC> and extract each PeopleDESC object. First, execute the SQL statement DROP TABLE people_desc to delete the people_desc table if it exists. Then, execute the SQL statement CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255)) to create a people_desc table with columns id, name, age, and description. Finally, use the SQL statement INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?) to insert the attribute values of each PeopleDESC object into the people_desc table.

        Code:

            @Override
            public void write(List<? extends PeopleDESC> items) throws Exception {
                // Drop the table if it exists; ignore the error on the first run,
                // when the table does not exist yet.
                try {
                    jdbcTemplate.execute("DROP TABLE people_desc");
                } catch (Exception e) {
                    // The people_desc table did not exist.
                }
                // Create the table
                String createTableSql = "CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255))";
                jdbcTemplate.execute(createTableSql);
                for (PeopleDESC item : items) {
                    String sql = "INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?)";
                    jdbcTemplate.update(sql, item.getId(), item.getName(), item.getAge(), item.getDesc());
                }
            }
        

    Introduction to the AddPeopleWriter.java file

    The AddPeopleWriter.java file implements the AddPeopleWriter class, which implements the ItemWriter interface. This class is used to write People objects to a database.

    The AddPeopleWriter.java file contains the following code:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • People class: used to store personnel information read from a database.
      • ItemWriter interface: used to write processed or converted items to a specified target location.
      • Autowired annotation: used for dependency injection.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • Component annotation: used to mark the class as a Spring component.
      • List interface: used to operate on query result sets.

      Code:

      import com.oceanbase.example.batch.model.People;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.stereotype.Component;
      
      import java.util.List;
      
    2. Define the AddPeopleWriter class.

      1. Use the @Autowired annotation to automatically inject the JdbcTemplate instance, which is used to execute database operations when writing data.

        Code:

            @Autowired
            private JdbcTemplate jdbcTemplate;
        
      2. In the write method, traverse the input List<? extends People> and extract each People object. First, execute the SQL statement DROP TABLE people to delete an existing table named people. Then, execute the SQL statement CREATE TABLE people (name VARCHAR2(255), age INT) to create a table named people with two columns: name and age. Finally, use the SQL statement INSERT INTO people (name, age) VALUES (?, ?) to insert the attribute values of each People object into the people table.

        Code:

        @Override
        public void write(List<? extends People> items) throws Exception {
            // Drop the table if it exists; ignore the error on the first run,
            // when the table does not exist yet.
            try {
                jdbcTemplate.execute("DROP TABLE people");
            } catch (Exception e) {
                // The people table did not exist.
            }
            // Create the table
            String createTableSql = "CREATE TABLE people (name VARCHAR2(255), age INT)";
            jdbcTemplate.execute(createTableSql);
            for (People item : items) {
                String sql = "INSERT INTO people (name, age) VALUES (?, ?)";
                jdbcTemplate.update(sql, item.getName(), item.getAge());
            }
        }
        

    Introduction to the BatchConfigTest.java file

    The BatchConfigTest.java file defines a JUnit test class that verifies the Spring Batch job configuration.

    The BatchConfigTest.java file contains the following code:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • Assert class: used to assert test results.
      • Test annotation: used to mark a method as a test method.
      • RunWith annotation: used to specify a test runner.
      • Job interface: represents a batch processing job.
      • JobExecution class: used to represent the execution of a batch processing job.
      • JobParameters class: used to represent the parameters of a batch processing job.
      • JobParametersBuilder class: used to build the parameters of a batch processing job.
      • JobLauncher interface: used to start a batch processing job.
      • Autowired annotation: used for dependency injection.
      • SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • SpringRunner class: used to specify the test runner as SpringRunner.

      Code:

      import org.junit.Assert;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.batch.core.Job;
      import org.springframework.batch.core.JobExecution;
      import org.springframework.batch.core.JobParameters;
      import org.springframework.batch.core.JobParametersBuilder;
      import org.springframework.batch.core.launch.JobLauncher;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import org.springframework.batch.core.BatchStatus;
      import java.util.UUID;
      
    2. Define the BatchConfigTest class.

      Use the SpringBootTest annotation and SpringRunner runner to perform integration tests for Spring Boot. In the testJob method, use the JobLauncherTestUtils helper class to start a batch processing job and use assertions to verify the job's execution status.

      1. Create the JobLauncherTestUtils instance. Because JobLauncherTestUtils is defined below as a private inner class rather than as a Spring bean, it is instantiated directly instead of being injected with @Autowired.

        Code:

        private final JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
        
      2. Use the @Test annotation to mark the testJob method as a test method. In this method, first create a JobParameters object, then use the jobLauncherTestUtils.launchJob method to start the batch processing job, and use the Assert.assertEquals method to assert that the job's execution status is COMPLETED.

        Code:

        @Test
        public void testJob() throws Exception {
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("jobParam", "paramValue")
                    .toJobParameters();
        
            JobExecution jobExecution = jobLauncherTestUtils.launchJob(jobParameters);
        
            Assert.assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
        }
        
      3. Use the @Autowired annotation to automatically inject the JobLauncher instance.

        Code:

        @Autowired
        private JobLauncher jobLauncher;
        
      4. Use the @Autowired annotation to automatically inject the Job instance.

        Code:

        @Autowired
        private Job job;
        
      5. Define an inner class named JobLauncherTestUtils to assist in starting the batch processing job. Its launchJob method starts the job by calling jobLauncher.run and returns the job's execution result. (Spring Batch's spring-batch-test module provides a full-featured JobLauncherTestUtils helper; this sample defines a simplified version of its own.)

        Code:

        private class JobLauncherTestUtils {
            public JobExecution launchJob(JobParameters jobParameters) throws Exception {
                return jobLauncher.run(job, jobParameters);
            }
        }
        

    Introduction to the AddPeopleDescProcessorTest.java file

    The AddPeopleDescProcessorTest.java file defines a JUnit test class that verifies the processing logic of AddPeopleDescProcessor.

    The code in the AddPeopleDescProcessorTest.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • People class: stores the personnel information read from the database.
      • PeopleDESC class: stores the description information after the personnel information is converted or processed.
      • Assert class: verifies whether the expected results and actual results in the test are consistent.
      • Test annotation: marks the test method.
      • RunWith annotation: specifies the test runner.
      • Autowired annotation: performs dependency injection.
      • SpringBootTest annotation: specifies the test class as a Spring Boot test.
      • SpringRunner class: specifies the test runner as SpringRunner.

      Code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.junit.Assert;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.test.context.junit4.SpringRunner;
      
    2. Define the AddPeopleDescProcessorTest class.

      Use the SpringBootTest annotation and SpringRunner runner for Spring Boot integration testing.

      1. Use the @Autowired annotation to automatically inject the AddPeopleDescProcessor instance.

        Code:

        @Autowired
        private AddPeopleDescProcessor processor;
        
      2. Use the @Test annotation to mark the testProcess method as a test method. In this method, first create a People object, then use the processor.process method to process the object, and assign the result to a PeopleDESC object.

        Code:

        @Test
        public void testProcess() throws Exception {
            People people = new People();
            people.setName("John");
            people.setAge(25);

            PeopleDESC desc = processor.process(people);

            // Verify the generated attributes.
            Assert.assertEquals("John", desc.getName());
            Assert.assertEquals(25, desc.getAge());
            Assert.assertEquals("This is John with age 25", desc.getDesc());
        }
        

    Introduction to the AddDescPeopleWriterTest.java file

    The AddDescPeopleWriterTest.java file defines a JUnit test class that verifies the write logic of AddDescPeopleWriter.

    The AddDescPeopleWriterTest.java file contains the following code:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • PeopleDESC class: used to store description information after conversion or processing of personnel information.
      • Assert class: used to assert test results.
      • Test annotation: used to mark a method as a test method.
      • RunWith annotation: used to specify the test runner.
      • Autowired annotation: used for dependency injection.
      • SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • SpringRunner class: used to specify the test runner as SpringRunner.
      • ArrayList class: used to create an empty list.
      • List interface: used to operate on the query result set.

      Code:

      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.junit.Assert;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import java.util.ArrayList;
      import java.util.List;
      
    2. Define the AddDescPeopleWriterTest class.

      Use the SpringBootTest annotation and the SpringRunner runner to perform Spring Boot integration testing.

      1. Use the @Autowired annotation to inject the AddDescPeopleWriter and JdbcTemplate instances.

        Code example:

        @Autowired
        private AddDescPeopleWriter writer;
        @Autowired
        private JdbcTemplate jdbcTemplate;
        
      2. Use the @Test annotation to mark the testWrite method as a test method. In this method, a peopleDescList list is created and two PeopleDESC objects are added to it. The list is then written to the database by using the writer.write method. Next, jdbcTemplate executes a query against the people_desc table, and an assertion verifies the record count. Finally, the query result is output to the console, followed by a job completion message.

        1. Insert data into the people_desc table. First, an empty list of PeopleDESC objects, peopleDescList, is created. Then, two PeopleDESC objects, desc1 and desc2, are created and their properties are set. The desc1 and desc2 objects are added to the peopleDescList list. The write method of writer is then called to write the objects in the peopleDescList list to the people_desc table in the database. The JdbcTemplate is used to execute a query statement SELECT COUNT(*) FROM people_desc to obtain the number of records in the people_desc table and assign the result to the variable count. Finally, the Assert.assertEquals method is used to perform an assertion to verify whether the value of count is equal to 2.

          The code is as follows:

             List<PeopleDESC> peopleDescList = new ArrayList<>();
             PeopleDESC desc1 = new PeopleDESC();
             desc1.setId(1);
             desc1.setName("John");
             desc1.setAge(25);
             desc1.setDesc("This is John with age 25");
             peopleDescList.add(desc1);
             PeopleDESC desc2 = new PeopleDESC();
             desc2.setId(2);
             desc2.setName("Alice");
             desc2.setAge(30);
             desc2.setDesc("This is Alice with age 30");
             peopleDescList.add(desc2);
             writer.write(peopleDescList);
          
             String selectSql = "SELECT COUNT(*) FROM people_desc";
             int count = jdbcTemplate.queryForObject(selectSql, Integer.class);
             Assert.assertEquals(2, count);
          
        2. Output the data of the people_desc table. First, the JdbcTemplate executes the statement SELECT * FROM people_desc and uses a lambda expression to handle the query result. In the lambda expression, the methods such as rs.getInt and rs.getString are used to obtain the field values from the result set and set them to a new PeopleDESC object. The new PeopleDESC object is added to a result list resultDesc. Then, a prompt message people_desc table data: is printed, followed by a for loop to traverse each PeopleDESC object in the resultDesc list. The System.out.println method is used to print each object. Finally, an execution completion message is printed.

          The code is as follows:

          List<PeopleDESC> resultDesc = jdbcTemplate.query("SELECT * FROM people_desc", (rs, rowNum) -> {
             PeopleDESC desc = new PeopleDESC();
             desc.setId(rs.getInt("id"));
             desc.setName(rs.getString("name"));
             desc.setAge(rs.getInt("age"));
             desc.setDesc(rs.getString("description"));
             return desc;
          });
          
          System.out.println("people_desc table data:");
          for (PeopleDESC desc : resultDesc) {
             System.out.println(desc);
          }
          
          // Output information after the job finishes.
          System.out.println("Batch Job execution completed.");
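          The (rs, rowNum) -> ... lambda is simply an implementation of Spring's RowMapper functional interface, which is invoked once per result-set row. The idea can be sketched without Spring; the RowMapperSketch class below is an illustrative stand-in, not part of the sample project:

          ```java
          import java.util.ArrayList;
          import java.util.List;

          public class RowMapperSketch {
              // Minimal stand-in for org.springframework.jdbc.core.RowMapper:
              // maps one raw row (plus its index) to an object.
              interface RowMapper<T> {
                  T mapRow(String[] row, int rowNum);
              }

              // Stand-in for JdbcTemplate.query: applies the mapper to every row.
              static <T> List<T> query(List<String[]> resultSet, RowMapper<T> mapper) {
                  List<T> out = new ArrayList<>();
                  for (int i = 0; i < resultSet.size(); i++) {
                      out.add(mapper.mapRow(resultSet.get(i), i));
                  }
                  return out;
              }

              public static void main(String[] args) {
                  List<String[]> rs = new ArrayList<>();
                  rs.add(new String[] {"1", "John", "25", "This is John with age 25"});
                  // The lambda maps each raw row to a value, like the PeopleDESC mapping.
                  List<String> mapped = query(rs, (row, n) -> row[1] + "/" + row[2]);
                  System.out.println(mapped);
              }
          }
          ```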
          

    Introduction to the AddPeopleWriterTest.java file

    The AddPeopleWriterTest.java file defines a JUnit test class that verifies the write logic of the AddPeopleWriter class.

    The code in the AddPeopleWriterTest.java file is as follows:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • People class: used to store personnel information read from the database.
      • Test annotation: used to mark a method as a test method.
      • RunWith annotation: used to specify the test runner.
      • Autowired annotation: used for dependency injection.
      • SpringBootApplication annotation: marks the class as the entry point of a Spring Boot application.
      • SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • ComponentScan annotation: specifies the packages or classes to scan for components.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • SpringRunner class: used to specify the test runner as SpringRunner.
      • ArrayList class: used to create an empty list.
      • List interface: used to operate on the query result set.

      Code:

      import com.oceanbase.example.batch.model.People;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.autoconfigure.SpringBootApplication;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.context.annotation.ComponentScan;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import java.util.ArrayList;
      import java.util.List;
      
    2. Define the AddPeopleWriterTest class.

      Use the SpringBootTest annotation and the SpringRunner runner to perform integration testing in Spring Boot. Use the @ComponentScan annotation to specify the package paths to be scanned.

      1. Use the @Autowired annotation to inject the AddPeopleWriter and JdbcTemplate instances.

        Sample code:

        @Autowired
        private AddPeopleWriter addPeopleWriter;
        @Autowired
        private JdbcTemplate jdbcTemplate;
        
      2. Insert and output the test data.

        1. Insert data into the people table. First, create an empty peopleList list of People objects. Then, create two People objects, person1 and person2, and set their name and age attributes. Add both objects to the peopleList list, and call the write method of addPeopleWriter with peopleList as the parameter to write these objects to the database.

          Here is the code:

             List<People> peopleList = new ArrayList<>();
             People person1 = new People();
             person1.setName("zhangsan");
             person1.setAge(27);
             peopleList.add(person1);
             People person2 = new People();
             person2.setName("lisi");
             person2.setAge(35);
             peopleList.add(person2);
             addPeopleWriter.write(peopleList);
          
        2. Output data from the people table. The JdbcTemplate object executes the query statement SELECT * FROM people and processes the query result with a lambda expression. In the lambda expression, the rs.getString and rs.getInt methods read the field values from the result set into a newly created People object, and each object is added to a result list named result. A prompt message people table data: is then printed, a for loop traverses the result list and prints each People object with the System.out.println method, and finally a completion message is printed.

          Code example:

             List<People> result = jdbcTemplate.query("SELECT * FROM people", (rs, rowNum) -> {
                 People person = new People();
                 person.setName(rs.getString("name"));
                 person.setAge(rs.getInt("age"));
                 return person;
             });
          
             System.out.println("people table data:");
             for (People person : result) {
                 System.out.println(person);
             }
          
             // Output information after the job is completed.
             System.out.println("Batch Job execution completed.");
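    Taken together, the sample implements a read-process-write pipeline: rows are read from the people table, converted into descriptions by the processor, and written to the people_desc table in chunks. As a rough illustration (not part of the sample project), the flow can be simulated in plain Java with in-memory collections standing in for the two tables:

    ```java
    import java.util.ArrayList;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    // In-memory simulation of the importJob read-process-write flow.
    // PipelineSketch and runPipeline are illustrative names, not part of the sample.
    public class PipelineSketch {
        public static List<String> runPipeline(Map<String, Integer> peopleTable) {
            List<String> peopleDescTable = new ArrayList<>();
            // Reader + processor: one description per people row.
            for (Map.Entry<String, Integer> row : peopleTable.entrySet()) {
                String desc = "This is " + row.getKey() + " with age " + row.getValue();
                // Writer: append to the target "table".
                peopleDescTable.add(desc);
            }
            return peopleDescTable;
        }

        public static void main(String[] args) {
            Map<String, Integer> people = new LinkedHashMap<>();
            people.put("zhangsan", 27);
            people.put("lisi", 35);
            System.out.println(runPipeline(people));
        }
    }
    ```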
          

    Full code

    pom.xml
    application.properties
    BatchApplication.java
    BatchConfig.java
    People.java
    PeopleDESC.java
    AddPeopleDescProcessor.java
    AddDescPeopleWriter.java
    AddPeopleWriter.java
    BatchConfigTest.java
    AddPeopleDescProcessorTest.java
    AddDescPeopleWriterTest.java
    AddPeopleWriterTest.java
    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>2.7.11</version>
            <relativePath/> <!-- lookup parent from repository -->
        </parent>
        <groupId>com.oceanbase</groupId>
        <artifactId>java-oceanbase-springboot</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <name>java-oceanbase-springbatch</name>
        <description>Demo project for Spring Batch</description>
        <properties>
            <java.version>1.8</java.version>
        </properties>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter</artifactId>
            </dependency>
            <dependency>
                <groupId>com.oceanbase</groupId>
                <artifactId>oceanbase-client</artifactId>
                <version>2.4.3</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-jdbc</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-test</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-batch</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-data-jpa</artifactId>
            </dependency>
            <dependency>
                <groupId>org.apache.tomcat</groupId>
                <artifactId>tomcat-jdbc</artifactId>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.10</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>javax.activation</groupId>
                <artifactId>javax.activation-api</artifactId>
                <version>1.2.0</version>
            </dependency>
            <dependency>
                <groupId>jakarta.persistence</groupId>
                <artifactId>jakarta.persistence-api</artifactId>
                <version>2.2.3</version>
            </dependency>
        </dependencies>
    
        <build>
            <plugins>
                <plugin>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-maven-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    
    </project>
    
    
    # Database configuration
    
    spring.datasource.driver-class-name=com.oceanbase.jdbc.Driver
    spring.datasource.url=jdbc:oceanbase://host:port/schema_name?characterEncoding=utf-8
    spring.datasource.username=user_name
    spring.datasource.password=
    
    # JPA
    spring.jpa.show-sql=true
    spring.jpa.hibernate.ddl-auto=update
    
    # Spring Batch
    spring.batch.job.enabled=false
    
    #
    logging.level.org.springframework=INFO
    logging.level.com.example=DEBUG
    
    package com.oceanbase.example.batch;
    
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    
    @SpringBootApplication
    public class BatchApplication {
        public static void main(String[] args) {
            SpringApplication.run(BatchApplication.class, args);
        }
    
        public void runBatchJob() {
        }
    }
    
    
    package com.oceanbase.example.batch.config;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import com.oceanbase.example.batch.processor.AddPeopleDescProcessor;
    import com.oceanbase.example.batch.writer.AddDescPeopleWriter;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.launch.support.RunIdIncrementer;
    import org.springframework.batch.item.ItemProcessor;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.batch.item.database.JdbcCursorItemReader;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.ComponentScan;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;
    
    import javax.sql.DataSource;
    
    @Configuration
    @EnableBatchProcessing
    @SpringBootApplication
    @ComponentScan("com.oceanbase.example.batch.writer")
    @EnableAutoConfiguration
    public class BatchConfig {
        @Autowired
        private JobBuilderFactory jobBuilderFactory;
    
        @Autowired
        private StepBuilderFactory stepBuilderFactory;
    
        @Autowired
        private DataSource dataSource;// Use the default dataSource provided by Spring Boot auto-configuration
    
        @Bean
        public ItemReader<People> peopleReader() {
            JdbcCursorItemReader<People> reader = new JdbcCursorItemReader<>();
        reader.setDataSource(dataSource);
            reader.setRowMapper(new BeanPropertyRowMapper<>(People.class));
            reader.setSql("SELECT * FROM people");
            return reader;
        }
    
        @Bean
        public ItemProcessor<People, PeopleDESC> addPeopleDescProcessor() {
            return new AddPeopleDescProcessor();
        }
    
        @Bean
        public ItemWriter<PeopleDESC> addDescPeopleWriter() {
            return new AddDescPeopleWriter();
        }
    
        @Bean
        public Step step1(ItemReader<People> reader, ItemProcessor<People, PeopleDESC> processor,
                          ItemWriter<PeopleDESC> writer) {
            return stepBuilderFactory.get("step1")
                    .<People, PeopleDESC>chunk(10)
                    .reader(reader)
                    .processor(processor)
                    .writer(writer)
                    .build();
        }
    
        @Bean
        public Job importJob(Step step1) {
            return jobBuilderFactory.get("importJob")
                    .incrementer(new RunIdIncrementer())
                    .flow(step1)
                    .end()
                    .build();
        }
    }
    
    package com.oceanbase.example.batch.model;
    
    public class People {
        private String name;
        private int age;
    
            // getters and setters
    
        public String getName() {
            return name;
        }
    
        public void setName(String name) {
            this.name = name;
        }
    
        public int getAge() {
            return age;
        }
    
        public void setAge(int age) {
            this.age = age;
        }
        @Override
        public String toString() {
            return "People [name=" + name + ", age=" + age + "]";
        }
    }
    
    package com.oceanbase.example.batch.model;
    
    public class PeopleDESC {
        private String name;
        private int age;
        private String desc;
        private int id;
    
        public String getName() {
            return name;
        }
    
        public void setName(String name) {
            this.name = name;
        }
    
        public int getAge() {
            return age;
        }
    
        public void setAge(int age) {
            this.age = age;
        }
    
        public String getDesc() {
            return desc;
        }
    
        public void setDesc(String desc) {
            this.desc = desc;
        }
    
        public int getId() {
            return id;
        }
    
        public void setId(int id) {
            this.id = id;
        }
    
        @Override
        public String toString() {
            return "PeopleDESC [name=" + name + ", age=" + age + ", desc=" + desc + "]";
        }
    }
    
    package com.oceanbase.example.batch.processor;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.springframework.batch.item.ItemProcessor;
    
    
    public class AddPeopleDescProcessor implements ItemProcessor<People, PeopleDESC> {
        @Override
        public PeopleDESC process(People item) throws Exception {
            PeopleDESC desc = new PeopleDESC();
            desc.setName(item.getName());
            desc.setAge(item.getAge());
            desc.setDesc("This is " + item.getName() + " with age " + item.getAge());
            return desc;
        }
    }
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    
    import java.util.List;
    
    public class AddDescPeopleWriter implements ItemWriter<PeopleDESC> {
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Override
        public void write(List<? extends PeopleDESC> items) throws Exception {
            // Drop the table if it already exists; ignore the error on the
            // first run, when the table does not exist yet.
            try {
                jdbcTemplate.execute("DROP TABLE people_desc");
            } catch (Exception e) {
                // The people_desc table did not exist.
            }
            // Table creation statement
            String createTableSql = "CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255))";
            jdbcTemplate.execute(createTableSql);
            for (PeopleDESC item : items) {
                String sql = "INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?)";
                jdbcTemplate.update(sql, item.getId(), item.getName(), item.getAge(), item.getDesc());
            }
        }
    }
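    The loop above issues one `jdbcTemplate.update` round trip per record. For larger chunks, `JdbcTemplate.batchUpdate` can send all inserts of a chunk in a single JDBC batch. A sketch of a hypothetical variant (`BatchInsertDescPeopleWriter` is not part of the project; it assumes the `people_desc` table has already been created, so the drop/create step is omitted):

    ```java
    package com.oceanbase.example.batch.writer;

    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;

    import java.util.ArrayList;
    import java.util.List;

    public class BatchInsertDescPeopleWriter implements ItemWriter<PeopleDESC> {
        @Autowired
        private JdbcTemplate jdbcTemplate;

        @Override
        public void write(List<? extends PeopleDESC> items) throws Exception {
            // Collect one parameter array per record, then send them as a single JDBC batch.
            List<Object[]> batchArgs = new ArrayList<>();
            for (PeopleDESC item : items) {
                batchArgs.add(new Object[] { item.getId(), item.getName(), item.getAge(), item.getDesc() });
            }
            jdbcTemplate.batchUpdate(
                    "INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?)",
                    batchArgs);
        }
    }
    ```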
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.People;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Component;
    
    import java.util.List;
    
    @Component
    public class AddPeopleWriter implements ItemWriter<People> {
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Override
        public void write(List<? extends People> items) throws Exception {
            // Drop the people table if it already exists; ignore the error when it does not exist yet.
            try {
                jdbcTemplate.execute("DROP TABLE people");
            } catch (Exception e) {
                // No existing table to drop.
            }
            // The CREATE TABLE statement.
            String createTableSql = "CREATE TABLE people (name VARCHAR2(255), age INT)";
            jdbcTemplate.execute(createTableSql);
            for (People item : items) {
                String sql = "INSERT INTO people (name, age) VALUES (?, ?)";
                jdbcTemplate.update(sql, item.getName(), item.getAge());
            }
        }
    }
    
    
    package com.oceanbase.example.batch.config;
    
    import com.oceanbase.example.batch.writer.AddDescPeopleWriter;
    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.batch.core.BatchStatus;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobParameters;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.junit4.SpringRunner;

    import java.util.UUID;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class BatchConfigTest {
    
        @Autowired
        private JobLauncher jobLauncher;

        @Autowired
        private Job job;

        @Test
        public void testJob() throws Exception {
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("jobParam", UUID.randomUUID().toString())
                    .toJobParameters();

            JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
            JobExecution jobExecution = jobLauncherTestUtils.launchJob(jobParameters);
            Assert.assertEquals(BatchStatus.COMPLETED.toString(), jobExecution.getStatus().toString());
        }

        private class JobLauncherTestUtils {

            public JobExecution launchJob(JobParameters jobParameters) throws Exception {
                return jobLauncher.run(job, jobParameters);
            }
        }
    }
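    The hand-rolled inner helper works, but the `spring-batch-test` module ships an official `org.springframework.batch.test.JobLauncherTestUtils` that plays the same role. A sketch of an equivalent test, assuming the `spring-batch-test` test dependency has been added to `pom.xml` (the class name `BatchConfigUtilsTest` is illustrative):

    ```java
    package com.oceanbase.example.batch.config;

    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.batch.core.BatchStatus;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.batch.test.JobLauncherTestUtils;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.junit4.SpringRunner;

    import java.util.UUID;

    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class BatchConfigUtilsTest {

        @Autowired
        private JobLauncher jobLauncher;

        @Autowired
        private Job job;

        @Test
        public void testJob() throws Exception {
            // Wire the utility from spring-batch-test with the autowired launcher and job.
            JobLauncherTestUtils utils = new JobLauncherTestUtils();
            utils.setJobLauncher(jobLauncher);
            utils.setJob(job);

            JobExecution jobExecution = utils.launchJob(new JobParametersBuilder()
                    .addString("jobParam", UUID.randomUUID().toString())
                    .toJobParameters());
            Assert.assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
        }
    }
    ```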
    
    
    package com.oceanbase.example.batch.processor;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.junit4.SpringRunner;

    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddPeopleDescProcessorTest {
        @Autowired
        private AddPeopleDescProcessor processor;

        @Test
        public void testProcess() throws Exception {
            People people = new People();
            people.setName("John");
            people.setAge(25);

            PeopleDESC desc = processor.process(people);

            Assert.assertEquals("John", desc.getName());
            Assert.assertEquals(25, desc.getAge());
            Assert.assertEquals("This is John with age 25", desc.getDesc());
        }
    }
    
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.test.context.junit4.SpringRunner;
    
    import java.util.ArrayList;
    import java.util.List;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddDescPeopleWriterTest {
        @Autowired
        private AddDescPeopleWriter writer;
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Test
        public void testWrite() throws Exception {
    
            // Insert data into the people_desc table.
            List<PeopleDESC> peopleDescList = new ArrayList<>();
            PeopleDESC desc1 = new PeopleDESC();
            desc1.setId(1);
            desc1.setName("John");
            desc1.setAge(25);
            desc1.setDesc("This is John with age 25");
            peopleDescList.add(desc1);
            PeopleDESC desc2 = new PeopleDESC();
            desc2.setId(2);
            desc2.setName("Alice");
            desc2.setAge(30);
            desc2.setDesc("This is Alice with age 30");
            peopleDescList.add(desc2);
            writer.write(peopleDescList);
    
            String selectSql = "SELECT COUNT(*) FROM people_desc";
            int count = jdbcTemplate.queryForObject(selectSql, Integer.class);
            Assert.assertEquals(2, count);
    
            // Output the data in the people_desc table.
            List<PeopleDESC> resultDesc = jdbcTemplate.query("SELECT * FROM people_desc", (rs, rowNum) -> {
                PeopleDESC desc = new PeopleDESC();
                desc.setId(rs.getInt("id"));
                desc.setName(rs.getString("name"));
                desc.setAge(rs.getInt("age"));
                desc.setDesc(rs.getString("description"));
                return desc;
            });
    
            System.out.println("people_desc table data:");
            for (PeopleDESC desc : resultDesc) {
                System.out.println(desc);
            }
    
            // Output the information after the job is completed.
            System.out.println("Batch Job execution completed.");
        }
    }
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.People;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.test.context.junit4.SpringRunner;

    import java.util.ArrayList;
    import java.util.List;

    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddPeopleWriterTest {
    
        @Autowired
        private AddPeopleWriter addPeopleWriter;
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Test
        public void testWrite() throws Exception {
            // Insert data into the people table.
            List<People> peopleList = new ArrayList<>();
            People person1 = new People();
            person1.setName("zhangsan");
            person1.setAge(27);
            peopleList.add(person1);
            People person2 = new People();
            person2.setName("lisi");
            person2.setAge(35);
            peopleList.add(person2);
            addPeopleWriter.write(peopleList);
    
            // Query and output the result.
            List<People> result = jdbcTemplate.query("SELECT * FROM people", (rs, rowNum) -> {
                People person = new People();
                person.setName(rs.getString("name"));
                person.setAge(rs.getInt("age"));
                return person;
            });
    
            System.out.println("people table data:");
            for (People person : result) {
                System.out.println(person);
            }
    
            // Output the information after the job is completed.
            System.out.println("Batch Job execution completed.");
        }
    }
    
    

    References

    For more information about OceanBase Connector/J, see OceanBase JDBC driver.
