
OceanBase

A unified distributed database ready for your transactional, analytical, and AI workloads.

DEPLOY YOUR WAY

OceanBase Cloud

The best way to deploy and scale OceanBase

OceanBase Enterprise

Run and manage OceanBase on your own infrastructure

TRY OPEN SOURCE

OceanBase Community Edition

The free, open-source distributed database

OceanBase seekdb

The open-source, AI-native search database

Customer Stories

Real-world success stories from enterprises across diverse industries.

View All
BY USE CASES

Mission-Critical Transactions

Global & Multicloud Applications

Elastic Scaling for Peak Traffic

Real-time Analytics

Active Geo-redundancy

Database Consolidation

Resources

Comprehensive knowledge hub for OceanBase.

Blog

Live Demos

Training & Certification

Documentation

Official technical guides, tutorials, API references, and manuals for all OceanBase products.

View All
PRODUCTS

OceanBase Cloud

OceanBase Database

Tools

Connectors and Middleware

QUICK START

OceanBase Cloud

OceanBase Database

BEST PRACTICES

Practical guides for using OceanBase more effectively and efficiently.

Company

Learn more about OceanBase – our company, partnerships, and trust and security initiatives.

About OceanBase

Partner

Trust Center

Contact Us

All Products
    • Databases
    • OceanBase Database
    • OceanBase Cloud
    • OceanBase Tugraph
    • Interactive Tutorials
    • OceanBase Best Practices
    • Tools
    • OceanBase Cloud Platform
    • OceanBase Migration Service
    • OceanBase Developer Center
    • OceanBase Migration Assessment
    • OceanBase Admin Tool
    • OceanBase Loader and Dumper
    • OceanBase Deployer
    • Kubernetes operator for OceanBase
    • OceanBase Diagnostic Tool
    • OceanBase Binlog Service
    • Connectors and Middleware
    • OceanBase Database Proxy
    • Embedded SQL in C for OceanBase
    • OceanBase Call Interface
    • OceanBase Connector/C
    • OceanBase Connector/J
    • OceanBase Connector/ODBC
    • OceanBase Connector/NET

OceanBase Cloud
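The "Connect to an instance" chapters below all start from a connection string obtained in the console ("Get connection string"), which is then fed to a client such as `mysql` or OBClient, or to a driver such as PyMySQL or MySQL Connector/J. As a minimal sketch of that first step (the host, user, tenant, and database names here are made-up placeholders, not real endpoints), a MySQL-mode client invocation can be assembled like this:

```python
# Hedged sketch: assemble a `mysql` client command for a MySQL-mode
# OceanBase Cloud tenant. All names below are hypothetical placeholders;
# the real values come from the console's "Get connection string" page.
from dataclasses import dataclass
import shlex


@dataclass
class ObConnectionInfo:
    host: str
    port: int
    user: str       # MySQL-mode users are commonly written as 'user@tenant'
    database: str

    def mysql_cli(self) -> str:
        """Build a `mysql` client invocation (password is prompted via -p,
        never inlined on the command line)."""
        return " ".join([
            "mysql",
            "-h", shlex.quote(self.host),
            "-P", str(self.port),
            "-u", shlex.quote(self.user),
            "-D", shlex.quote(self.database),
            "-p",
        ])


info = ObConnectionInfo("example.oceanbase.cloud", 3306,
                        "app_user@my_tenant", "test")
print(info.mysql_cli())
```

The same host, port, and user values plug directly into the driver guides listed below (for example PyMySQL's `connect()` or a Connector/J JDBC URL); only the client-side syntax changes.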

  • Product Updates & Announcements
    • What's new
      • Release notes for 2026
      • Release notes for 2025
      • Release notes for 2024
      • Release history
    • Product announcements
      • Data development module deprecation notice
      • Optimization of Backup and Restore commercialization strategy
      • Cross-AZ data transfer billing (OceanBase Cloud on AWS)
      • Database Proxy pricing update
      • AWS instance pricing adjustment
  • Product Introduction
    • Overview
    • Management mode and scenarios
    • Core features
      • High availability with cross-cloud active-active architecture
      • High availability with cross-cloud primary-standby databases
      • Multi-level caching in shared storage
      • Multi-layer online scaling and on-demand adjustment
    • Deployment modes
    • Storage architecture
    • Product specifications
    • Product billing
      • Overview
      • Instance billing
        • Tencent Cloud instance billing
        • Alibaba Cloud instance billing
        • Huawei Cloud instance billing
        • AWS instance billing
        • GCP instance billing
      • Backup and restore billing
      • SQL audit billing
      • Migrations billing
      • Database proxy billing
      • Binlog service billing
      • Overview of OceanBase Cloud support plans
      • Read-only replica billing
    • Supported database versions
  • Get Started
    • Get started with a transactional instance
    • Get started with an analytical instance
    • Get started with a Key-Value instance
  • Work with Transactional Instances
    • Overview
    • Create an instance
      • Overview
      • Create via OceanBase Cloud official website
      • Create via AWS Marketplace
      • Create via GCP Marketplace
      • Create via Huawei Cloud Marketplace
      • Create via Alibaba Cloud Marketplace
      • Create via Azure Marketplace
    • Connect to an instance
      • MySQL compatible mode
        • Overview
        • Get connection string
          • Overview
          • Connect using AWS PrivateLink
          • Connect using Azure Private Link
          • Connect using Google Cloud Private Service Connect
          • Connect using Huawei Cloud VPC Endpoint
          • Connect using Alibaba Cloud VPC
          • Connect using a public IP address
          • Connect using a Huawei Cloud peering connection
        • Connect with clients
          • Connect to OceanBase Cloud by using Client ODC
          • Connect to OceanBase Cloud by using a MySQL client
          • Connect to OceanBase Cloud by using OBClient
        • Connect with drivers
          • Java
            • Connect to OceanBase Cloud using SpringBoot
            • SpringBatch sample application for connecting to OceanBase Cloud
            • spring-jdbc
            • SpringDataJPA sample application for connecting to OceanBase Cloud
            • Hibernate application development with OceanBase Cloud
            • Sample program for connecting to OceanBase Cloud
            • connector-j
            • Use TestContainers to connect to and use OceanBase Cloud
          • Python
            • Connect to OceanBase Cloud using mysqlclient
            • Connect to OceanBase Cloud using PyMySQL
            • Use the MySQL-connector-python driver to connect to and use OceanBase Cloud
            • Use SQLAlchemy to connect to an OceanBase Cloud database
            • Connect to an OceanBase Cloud database using Django
            • Connect to an OceanBase Cloud database by using peewee
          • C
            • Use MySQL Connector/C to connect to OceanBase Cloud
          • Go
            • Connect to OceanBase Cloud using the Go-SQL-Driver/MySQL driver
            • Connect to OceanBase Cloud using GORM
          • PHP
            • Use the EXT driver to connect to OceanBase Cloud
            • Connect to OceanBase Cloud by using the MySQLi driver
            • Use the PDO driver to connect to OceanBase Cloud
          • Rust
            • Rust application example for connecting to OceanBase Cloud
            • SeaORM example for connecting to OceanBase Cloud
          • Ruby
            • ActiveRecord sample application for OceanBase Cloud
            • Connect to OceanBase Cloud by using mysql2
            • Connect to OceanBase Cloud by using Sequel
        • Use database connection pool
          • Database connection pool configuration
          • Connect to OceanBase Cloud by using a Tomcat connection pool
          • Connect to OceanBase Cloud by using a C3P0 connection pool
          • Connect to OceanBase Cloud by using a Proxool connection pool
          • Connect to OceanBase Cloud by using a HikariCP connection pool
          • Connect to OceanBase Cloud by using a DBCP connection pool
          • Connect to OceanBase Cloud by using Commons Pool
          • Connect to OceanBase Cloud by using a Druid connection pool
      • Oracle compatible mode
        • Overview
        • Get connection string
          • Overview
          • Connect using AWS PrivateLink
          • Connect using Azure Private Link
          • Connect using Google Cloud Private Service Connect
          • Connect using Huawei Cloud VPC Endpoint
          • Connect using a public IP address
        • Connect with clients
          • Connect to OceanBase Cloud by using OBClient
          • Connect to OceanBase Cloud by using Client ODC
        • Connect with drivers
          • Java
            • Connect to OceanBase Cloud using OceanBase Connector/J
            • Connect to OceanBase Cloud by using Spring Boot
            • SpringBatch application example for connecting to OceanBase Cloud
            • Connect to OceanBase Cloud using Spring JDBC
            • Connect to OceanBase Cloud by using Spring Data JPA
            • Connect to OceanBase Cloud by using Hibernate
            • Use MyBatis to connect to OceanBase Cloud
            • Use JFinal to connect to OceanBase Cloud
          • Python
            • Python Driver for Oracle Mode
          • C
            • Connect to OceanBase Cloud using OceanBase Connector/C
            • Connect to OceanBase Cloud using OceanBase Connector/ODBC
            • Use SqlSugar to connect to OceanBase Cloud
        • Use database connection pool
          • Database connection pool configuration
          • Sample program that uses a Tomcat connection pool to connect to OceanBase Cloud
          • C3P0 connection pool connects to OceanBase Cloud
          • Connect to OceanBase Cloud using Proxool connection pool
          • Sample program that uses HikariCP to connect to OceanBase Cloud
          • Use DBCP connection pool to connect to OceanBase Cloud
          • Connect to OceanBase Cloud by using Commons Pool
          • Connect to OceanBase Cloud by using a Druid connection pool
    • Developer guide
      • MySQL compatible mode
        • Plan database objects
          • Create a database
          • Create a table group
          • Create a table
          • Create an index
          • Create an external table
        • Write data
          • Insert data
          • Update data
          • Delete data
          • Replace data
          • Generate test data in batches
        • Read data
          • Single-table queries
          • Join tables
            • INNER JOIN queries
            • FULL JOIN queries
            • LEFT JOIN queries
            • RIGHT JOIN queries
            • Subqueries
            • Lateral derived tables
          • Use operators and functions in queries
            • Use arithmetic operators in queries
            • Use numerical functions in queries
            • Use string concatenation operators in queries
            • Use string functions in queries
            • Use datetime functions in queries
            • Use type conversion functions in queries
            • Use aggregate functions in queries
            • Use NULL-related functions in queries
            • Use the CASE conditional operator in queries
            • Use the SELECT ... FOR UPDATE statement to lock query results
            • Use the SELECT ... LOCK IN SHARE MODE statement to lock query results
          • Use a DBLink in queries
          • Set operations
        • Manage transactions
          • Overview
          • Start a transaction
          • Savepoints
            • Mark a savepoint
            • Roll back a transaction to a savepoint
            • Release a savepoint
          • Commit a transaction
          • Roll back a transaction
      • Oracle compatible mode
        • Plan database objects
          • Create a table group
          • Create a table
          • Create an index
          • Create an external table
        • Write data
          • Insert data
          • Update data
          • Delete data
          • Replace data
          • Generate test data in batches
        • Read data
          • Single-table queries
          • Join tables
            • INNER JOIN queries
            • FULL JOIN queries
            • LEFT JOIN queries
            • RIGHT JOIN queries
            • Subqueries
            • Lateral derived tables
          • Use operators and functions in queries
            • Use arithmetic operators in queries
            • Use numerical functions in queries
            • Use string concatenation operators in queries
            • Use string functions in queries
            • Use datetime functions in queries
            • Use type conversion functions in queries
            • Use aggregate functions in queries
            • Use NULL-related functions in queries
            • Use CASE functions in queries
            • Use the SELECT ... FOR UPDATE statement to lock query results
          • Use a DBLink in queries
          • Set operations
        • Manage transactions
          • Overview
          • Start a transaction
          • Savepoints
            • Mark a savepoint
            • Roll back a transaction to a savepoint
          • Commit a transaction
          • Roll back a transaction
    • Manage instances
      • Manage instances
        • View the instance list
        • Instance overview
        • Stop and restart instances
        • Unit migration
      • Manage tenants
        • Tenant overview
        • Create a tenant
        • Modify tenant specifications
        • Modify tenant names
        • Add an endpoint
        • Resource isolation
          • Overview
          • Manage resource groups
            • Create a resource group
            • View a resource group
            • Edit a resource group
            • Delete a resource group
          • Manage isolation rules
            • Create an isolation rule
            • View isolation rules
            • Edit an isolation rule
            • Delete an isolation rule
        • Modify primary zone
        • Modify the maximum number of connections for a tenant proxy
        • Monitor tenant performance
          • Overview
          • View performance and SQL monitoring details
          • View transaction monitoring details
          • View storage and cache monitoring details
          • View Binlog service monitoring
          • Customize a monitoring dashboard for a tenant
        • Diagnostics
          • Real-time diagnostics
            • SQL diagnostics
              • Top SQL
              • Slow SQL
              • Suspicious SQL
              • High-risk SQL
            • SQL audit
        • Manage tenant parameters
          • Manage tenant parameters
          • Parameters for tenants
          • Parameter template overview
        • Delete a tenant
        • Manage databases and accounts
          • Create accounts
          • Manage accounts
          • Create a database (MySQL compatible mode)
          • Manage databases (MySQL compatible mode)
      • Monitor instance performance
        • Overview
        • Monitor the performance of databases in an instance
        • Monitor multidimensional metrics of an instance
        • Monitor the performance of hosts in an instance
        • Monitor database proxy
        • Monitor database proxy hosts
        • Monitor cross-cloud network performance
        • Customize a monitoring dashboard for an instance
      • Manage major compactions
        • Initiate a major compaction
        • View compaction records
        • Update time for compactions
      • Manage instance parameters
        • Manage parameters
        • Parameters for cluster instances
      • Change instance configurations
        • Enable storage auto-scaling
        • View history of configuration changes
        • Change configuration
        • Change configuration temporarily
        • Switch the deployment mode
      • Manage standby instances
        • Overview
        • Create a standby instance
        • Create a cross-cloud standby instance
        • Create a standby instance for an Alibaba Cloud primary instance
        • View details of primary and standby instances
        • Configure global endpoint
        • Enable automatic forwarding for write requests of standby databases
        • Primary-standby instance switchover
        • Initiate failover
        • Detach a standby instance
        • Release a standby instance
      • Release an instance
      • Database proxy
        • Overview
        • Manage database proxy
        • Direct load
      • Manage alerts
        • Overview
        • Manage alert rules
          • Create an alert rule
          • View an alert rule
          • Edit an alert rule
          • Delete an alert rule
        • View alert history
        • Manage alert templates
          • Create an alert template
          • View an alert template
          • Edit an alert template
          • Copy an alert template
          • Delete an alert template
        • Manage muting rules
          • Create an alert muting rule
          • View an alert muting rule
          • Edit an alert muting rule
          • Delete an alert muting rule
        • Manage alert notification templates
          • Create an alert notification template
          • View an alert notification template
          • Edit an alert notification template
          • Copy an alert notification template
          • Delete an alert notification template
        • Manage alert contacts
          • Add an alert contact
          • Add an alert contact group
          • View an alert contact
          • Edit an alert contact
          • Delete an alert contact
          • Obtain a webhook URL
        • Monitoring metrics for alerts
      • Backup and restore
        • Overview
        • Backup strategy
        • Initiate a backup immediately
        • Data backup
        • Initiate a restore
        • Data restore
        • Restore data from the instance recycle bin
      • Diagnostics
        • View performance monitoring data
        • Capacity diagnostics
        • One-click diagnostics
          • Initiate one-click diagnostics
          • View one-click diagnostic report
            • Exceptions
            • Real-time diagnostics
            • Optimization suggestions
            • Capacity management
            • Security management
        • Real-time diagnostics
          • SQL diagnostics
            • Top SQL
            • Slow SQL
            • Suspicious SQL
            • High-risk SQL
            • SQL details
            • SQL monitoring metrics list
          • Session management
            • Session management
          • Request analysis
            • Request analysis
        • Root cause diagnostics
          • Exception handling
          • Enable system autonomy
        • SQL audit
        • Materialized view analysis
        • Optimization center
          • Optimization suggestions
          • Manage active outlines
          • SQL review
          • View the optimization history
      • Manage tags
      • Manage read-only replicas
        • Overview
        • Instance read-only replicas
          • Add a read-only replica to an instance
          • View read-only replicas of an instance
          • Manage read-only replicas of an instance
          • Delete a read-only replica of an instance
        • Tenant read-only replicas
          • Add a read-only replica to a tenant
          • View read-only replicas of a tenant
          • Manage read-only replicas of a tenant
          • Delete a read-only replica of a tenant
      • Manage JVM-dependent services
    • Data source management
      • Create a data source
      • Manage data sources
      • User privileges
        • User privileges for compatibility assessment
        • User privileges for data migration
        • User privileges for performance assessment
        • User privileges for data archiving
        • User privileges for data cleanup
      • Connect via private network
        • AWS
        • Huawei Cloud
        • Alibaba Cloud
        • Google Cloud
        • Azure
        • Private IP address segments
      • Connect via public network
        • AWS
        • Huawei Cloud
        • Alibaba Cloud
        • Google Cloud
        • Azure
    • Data lifecycle management
      • Archive data
      • Clean up data
    • Manage recycle bin
      • Instance recycle bin
      • Manage databases and tables in recycle bin
        • Overview
        • Instance-level recycle bin
        • Tenant-level recycle bin
  • Work with Analytical Instances
    • Overview
    • Core features
    • Create an instance
    • Connect to an instance
      • Overview
      • Get connection string
        • Overview
        • Connect using AWS PrivateLink
        • Connect using a public IP address
      • Connect with clients
        • Connect to OceanBase Cloud by using Client ODC
        • Connect to OceanBase Cloud by using a MySQL client
        • Connect to OceanBase Cloud by using OBClient
      • Connect with drivers
        • Java
          • Connect to OceanBase Cloud by using Spring Boot
          • Connect to OceanBase Cloud by using Spring Batch
          • Connect to OceanBase Cloud by using Spring Data JDBC
          • Connect to OceanBase Cloud by using Spring Data JPA
          • Connect to OceanBase Cloud by using Hibernate
          • Connect to OceanBase Cloud by using MyBatis
          • Connect to OceanBase Cloud using MySQL Connector/J
        • Python
          • Connect to OceanBase Cloud by using mysqlclient
          • Connect to OceanBase Cloud by using PyMySQL
          • Connect to OceanBase Cloud using MySQL Connector/Python
        • C
          • Connect to OceanBase Cloud using MySQL Connector/C
        • Go
          • Connect to OceanBase Cloud using Go-SQL-Driver/MySQL
        • PHP
          • Connect to OceanBase Cloud using PHP
      • Use database connection pool
        • Database connection pool configuration
        • Connect to OceanBase Cloud by using a Tomcat connection pool
        • Connect to OceanBase Cloud by using a C3P0 connection pool
        • Connect to OceanBase Cloud by using a Proxool connection pool
        • Connect to OceanBase Cloud by using a HikariCP connection pool
        • Connect to OceanBase Cloud by using a DBCP connection pool
        • Connect to OceanBase Cloud by using Commons Pool
        • Connect to OceanBase Cloud by using a Druid connection pool
    • Data table design
      • Table overview
      • Best practices
        • Unit 1: Best practices for optimizing storage structures and query performance
        • Unit 2: Best practices for creating special indexes
    • Export data
    • OceanBase data processing
    • Query acceleration
      • Statistics
      • Materialized views for query acceleration
      • Select a query parallelism level
    • Manage instances
      • Instance overview
      • Change configuration
      • Modify primary zone
      • Manage parameters
      • Backup and restore
        • Backup overview
        • Backup strategies
        • Immediate backup
        • Data backup
        • Initiate restore
        • Data restore
      • Monitor instance performance
        • Overview
        • Monitor the performance of databases in an instance
        • Monitor the performance of hosts in an instance
      • Manage major compactions
        • Initiate a major compaction
        • View compaction records
        • Update time for compactions
      • Database proxy
        • Overview
        • Manage database proxy
        • Direct load
      • Manage alerts
        • Overview
        • Manage alert rules
          • Create an alert rule
          • View an alert rule
          • Edit an alert rule
          • Delete an alert rule
        • View alert history
        • Manage alert templates
          • Create an alert template
          • View an alert template
          • Edit an alert template
          • Copy an alert template
          • Delete an alert template
        • Manage muting rules
          • Create an alert muting rule
          • View an alert muting rule
          • Edit an alert muting rule
          • Delete an alert muting rule
        • Manage alert notification templates
          • Create an alert notification template
          • View an alert notification template
          • Edit an alert notification template
          • Copy an alert notification template
          • Delete an alert notification template
        • Manage alert contacts
          • Add an alert contact
          • Add an alert contact group
          • View an alert contact
          • Edit an alert contact
          • Delete an alert contact
          • Obtain a webhook URL
        • Monitoring metrics for alerts
      • Diagnostics
        • View performance monitoring data
        • Capacity diagnostics
        • Real-time diagnostics
          • SQL diagnostics
            • Top SQL
            • Slow SQL
            • Suspicious SQL
            • High-risk SQL
            • SQL details
            • SQL monitoring metrics list
          • Session management
            • Session management
          • Optimization management
            • Manage active outlines
            • View the optimization history
          • Request analysis
            • Request analysis
      • Stop and restart instances
      • Release instances
      • Manage databases and accounts
        • Create and manage accounts
        • Create a database
        • Manage databases
      • Manage tags
    • Data lifecycle management
      • Archive data
      • Clean up data
    • Performance diagnosis and tuning
      • Use the DBMS_XPLAN package for performance diagnostics
      • Use the GV$SQL_PLAN_MONITOR view for performance analysis
      • Views related to AP performance analysis
    • Performance testing
    • Product integration
    • Manage recycle bin
      • View instance recycle bin
      • Manage databases and tables in recycle bin
        • Overview
        • Instance recycle bin
  • Work with Key-Value Instances
    • Try out Key-Value instances
      • Create an instance
      • Create a tenant
      • Create an account for a database user
      • OBKV HBase data operation examples
    • Use Table model
      • Create an instance
      • Manage instances
        • Manage instances
          • View the instance list
          • Instance overview
          • Stop and restart instances
          • Release an instance
        • Manage tenants
          • Create a tenant
          • Modify tenant specifications
          • Modify tenant names
          • Delete a tenant
          • Tenant overview
          • Resource isolation
            • Overview
            • Manage resource groups
              • Create a resource group
              • View a resource group
              • Edit a resource group
              • Delete a resource group
            • Manage isolation rules
              • Create an isolation rule
              • View isolation rules
              • Edit an isolation rule
              • Delete an isolation rule
          • Monitor tenant performance
            • Overview
            • View performance and SQL monitoring details
            • View transaction monitoring details
            • View storage and cache monitoring details
            • OBKV-Table
            • Customize a monitoring dashboard for a tenant
          • Diagnostics
            • Top SQL
          • Manage tenant parameters
            • Manage tenant parameters
            • Parameters for tenants
          • Manage databases and accounts
            • Create and manage accounts
            • Create a database
            • Manage databases
          • Switch primary zone
        • Monitor instance performance
          • Overview
          • Monitor the performance of databases in an instance
          • Monitor multi-dimensional metrics of an instance
          • Monitor the performance of hosts in a cluster
          • Customize monitoring dashboards for an instance
        • Manage major compactions
          • Initiate major compactions
          • View compaction records
          • Update time for compactions
        • Manage instance parameters
          • Parameter management overview
          • Parameters for cluster instances
        • Change instance configurations
          • View history of configuration changes
          • Change configuration
          • Switch the deployment mode
        • Database proxy
          • Overview
          • Manage database proxy
        • Manage alerts
          • Overview
          • Manage alert rules
            • Create an alert rule
            • View an alert rule
            • Edit an alert rule
            • Delete an alert rule
          • View alert history
          • Manage alert templates
            • Create an alert template
            • View an alert template
            • Edit an alert template
            • Copy an alert template
            • Delete an alert template
          • Manage muting rules
            • Create an alert muting rule
            • View an alert muting rule
            • Edit an alert muting rule
            • Delete an alert muting rule
          • Manage alert contacts
            • Add an alert contact
            • Add an alert contact group
            • View an alert contact
            • Edit an alert contact
            • Delete an alert contact
            • Obtain a webhook URL
          • Monitoring metrics for alerts
        • Backup and restore
          • Backup overview
          • Backup strategies
          • Immediate backup
          • Data backup
          • Initiate restore
          • Data restore
        • Diagnostics
          • View performance monitoring data
          • Top SQL
          • Capacity diagnostics
          • Request analysis
        • Manage tags
        • Manage recycle bin
          • View instance recycle bin
          • Manage databases and tables in recycle bin
            • Overview
            • Instance-level recycle bin
            • Tenant-level recycle bin
    • Use HBase model
      • OBKV-HBase Overview
      • Create an instance
      • Develop in HBase model
        • Connect to an instance by using the OBKV-HBase client
      • Manage instances
        • Manage instances
          • View the instance list
          • Instance overview
          • Stop and restart instances
          • Release an instance
        • Manage tenants
          • Create a tenant
          • Modify tenant specifications
          • Modify tenant names
          • Delete a tenant
          • Tenant overview
          • Resource isolation
            • Overview
            • Manage resource groups
              • Create a resource group
              • View a resource group
              • Edit a resource group
              • Delete a resource group
            • Manage isolation rules
              • Create an isolation rule
              • View isolation rules
              • Edit an isolation rule
              • Delete an isolation rule
          • Monitor tenant performance
            • Overview
            • View performance and SQL monitoring details
            • View transaction monitoring details
            • View storage and cache monitoring details
            • OBKV-HBase
            • Customize a monitoring dashboard for a tenant
          • Diagnostics
            • Top SQL
          • Manage tenant parameters
            • Manage tenant parameters
            • Parameters for tenants
          • Manage databases and accounts
            • Create and manage accounts
            • Create a database
            • Manage databases
          • Switch primary zone
        • Monitor instance performance
          • Overview
          • Monitor the performance of databases in an instance
          • Monitor multi-dimensional metrics of an instance
          • Monitor the performance of hosts in a cluster
          • Customize monitoring dashboards for an instance
        • Manage major compactions
          • Initiate major compactions
          • View compaction records
          • Update time for compactions
        • Manage instance parameters
          • Parameter management overview
          • Parameters for cluster instances
        • Change instance configurations
          • View history of configuration changes
          • Change configuration
          • Switch the deployment mode
        • Database proxy
          • Overview
          • Manage database proxy
        • Manage alerts
          • Overview
          • Manage alert rules
            • Create an alert rule
            • View an alert rule
            • Edit an alert rule
            • Delete an alert rule
          • View alert history
          • Manage alert templates
            • Create an alert template
            • View an alert template
            • Edit an alert template
            • Copy an alert template
            • Delete an alert template
          • Manage muting rules
            • Create an alert muting rule
            • View an alert muting rule
            • Edit an alert muting rule
            • Delete an alert muting rule
          • Manage alert contacts
            • Add an alert contact
            • Add an alert contact group
            • View an alert contact
            • Edit an alert contact
            • Delete an alert contact
            • Obtain a webhook URL
          • Monitoring metrics for alerts
        • Backup and restore
          • Backup overview
          • Backup strategies
          • Immediate backup
          • Data backup
          • Initiate restore
          • Data restore
        • Diagnostics
          • View performance monitoring data
          • Top SQL
          • Capacity diagnostics
          • Request analysis
        • Manage tags
        • Manage recycle Bin
          • View instance recycle bin
          • Manage databases and tables in recycle bin
            • Overview
            • Instance-level recycle bin
            • Tenant-level recycle bin
      • Performance test
    • Connect Key-Value instances
      • Overview
      • Connect using a public IP address
  • Migrations
    • Data migration and import solutions
    • Data assessment and migration quick start
    • Assess compatibility
      • Overview
      • Perform online assessment
      • Perform offline assessment
      • Manage compatibility assessment tasks
        • View a compatibility assessment task
        • View and download a compatibility assessment report
        • Stop a compatibility assessment task
        • Delete a compatibility assessment task
      • Obtain files for upload
      • Configure PrivateLink
      • Add an IP address to an allowlist
    • Migrate data
      • Overview
      • Migrations specification
      • Purchase a data migration instance
      • Migrate data from a MySQL database to a MySQL-compatible tenant of OceanBase Database
      • Migrate data from a MySQL-compatible tenant of OceanBase Database to a MySQL database
      • Migrate data between OceanBase database tenants of the same compatibility mode
      • Migrate data between OceanBase database tenants of different compatibility modes
      • Migrate data from an Oracle database to an Oracle-compatible tenant of OceanBase Database
      • Migrate data from an Oracle-compatible tenant of OceanBase Database to an Oracle database
      • Configure a two-way synchronization task
      • Migrate data from an OceanBase database to a Kafka instance
      • Migrate data from a TiDB database to a MySQL-compatible tenant of OceanBase Database
      • Migrate incremental data from a MySQL-compatible tenant of OceanBase Database to a TiDB Database
      • Migrate data from a PostgreSQL database to an OceanBase database
      • Migrate incremental data from an OceanBase Database to a PostgreSQL database
      • Manage data migration tasks
        • View details of a data migration task
        • Rename a data migration task
        • View and modify migration objects
        • View and modify migration parameters
        • Configure alert monitoring
        • Manage data migration tasks by using tags
        • Start, stop, and resume a data migration task
        • Clone a data migration task
        • Terminate and release a data migration task
      • Features
        • Custom DML/DDL configurations
        • DDL synchronization scope
        • Use SQL conditions to filter data
        • Rename a migration object
        • Set an incremental synchronization timestamp
        • Instructions on schema migration
        • Configure and modify matching rules
        • Wildcard rules
        • Import migration objects
        • Download conflict data
        • Change a topic
        • Column filtering
        • Data formats
      • Authorize an Alibaba Cloud account
      • SQL statements for querying table objects
      • Online DDL tools
      • Create a trigger
      • Modify the log level of a self-managed PostgreSQL instance
      • Supported DDL statements for synchronization and their limitations
        • DDL synchronization from Aurora MySQL DB clusters to MySQL-compatible tenants of OceanBase Database
        • DDL synchronization from MySQL-compatible tenants of OceanBase Database to Aurora MySQL DB clusters
        • DDL synchronization between MySQL-compatible tenants of OceanBase Database
        • DDL synchronization from Oracle databases to Oracle-compatible tenants of OceanBase Database
        • DDL synchronization from Oracle-compatible tenants of OceanBase Database to Oracle databases
        • DDL synchronization between Oracle-compatible tenants of OceanBase Database
        • DDL synchronization from OceanBase databases to Kafka instances
    • Data subscription
      • Create a data subscription task
      • Manage data subscription tasks
        • View details of a data subscription task
        • Configure subscription information
        • Modify the name of a data subscription task
        • View and modify subscription objects
        • View data subscription parameters
        • Set up data subscription alerts
        • Start, stop, and resume data subscription tasks
        • Clone a data subscription task
        • Release a data subscription task
      • Manage private connections for data subscriptions
      • Configure consumer subscription
      • Message formats
    • Data validation
      • Overview
      • Create a data validation task
      • Manage data validation tasks
        • View details of a data validation task
        • View and modify validation objects
        • View and modify validation parameters
        • Manage data validation tasks with tags
        • Start, pause, and resume data validation tasks
        • Clone a data validation task
        • Release a data validation task
      • Features
        • Import validation objects
        • Rename the validation object
        • Filter objects by using SQL conditions
        • Configure the matching rules for the validation object
    • Assess performance
      • Overview
      • Obtain traffic files from a database instance
      • Create a full performance assessment task
      • Create an SQL file parsing task
      • Create an SQL file replay task
      • Manage performance assessment tasks
        • View the details of a performance assessment task
        • View a performance assessment report
        • Retry and stop a performance assessment task
        • Delete a performance assessment task
      • Obtain a database instance
      • Create an access key
    • Import data
      • Import data
      • Direct load
      • Supported file formats and encoding formats for Data Import
      • Sample data introduction
    • Binlog service
      • Overview
      • Purchase the Binlog service
      • Manage Binlog Service
        • View details of the Binlog service
        • Change configuration
        • Modify the auto-scaling strategy for storage space
        • Modify the elasticity strategy for compute units
        • Disable the Binlog service
  • Security
    • OceanBase Cloud account settings
      • Modify login password
      • Multi-factor authentication
      • Manage AccessKeys
      • Time zone settings
      • Manage cloud marketplace accounts
      • Account audit
    • Organizations and projects
      • Overview
      • Manage organization information
      • Project management
        • Manage projects
        • Cross-project bidirectional authorization
        • Subscribe to project messages
      • Manage members
      • Permissions for roles
      • Cost management
        • Overview
        • Cost details
        • Manage cost units
      • Operation audit
    • Database accounts and privileges
      • Account privileges
      • Authorize cloud vendor accounts
      • AWS KMS key management
      • Support access control
    • Security and encryption
      • Set allowlist groups
      • SSL encryption
      • Transparent Data Encryption (TDE)
    • Monitoring dashboard
    • Events
  • SQL Console
    • Overview
    • Access SQL Console
    • SQL editing and execution
    • PL compilation
    • Result set editing
    • Execution analysis
    • Database object management
      • Create a table
      • Create a view
      • Create a function
      • Create a stored procedure
      • Create a program package
      • Create a trigger
      • Create a type
      • Create a sequence
      • Create a synonym
    • Session variable management
    • Functional keys in SQL Console
  • Integrations
    • Overview
    • Schema evolution
      • Liquibase
      • Flyway
    • Data ingestion
      • Canal
      • dbt
      • Debezium
      • Flink
      • Glue
      • Informatica Cloud
      • Kafka
      • Maxwell
      • SeaTunnel
      • DataWorks
      • NiFi
    • SQL development
      • DataGrip
      • DBeaver
      • Navicat
      • TablePlus
    • Orchestration
      • DolphinScheduler
      • Linkis
      • Airflow
    • Visualization
      • Grafana
      • Power BI
      • Quick BI
      • Superset
      • Tableau
    • Observability
      • Datadog
      • Prometheus
    • Database management
      • Bytebase
    • AI
      • LlamaIndex
      • Dify
      • LangChain
      • Tongyi Qianwen
      • OpenAI
      • n8n
      • Trae
      • SpringAI
      • Cline
      • Cursor
      • Continue
      • Toolbox
      • CamelAI
      • Firecrawl
      • Hugging Face
      • Ollama
      • Google Gemini
      • Cloudflare Workers AI
      • Jina AI
      • Augment Code
      • Claude Code
      • Kiro
    • Development tools
      • Cloudflare Workers
      • Vercel
  • Best practices
    • Best practices for achieving high availability through cross-cloud active-active deployment
    • High availability through cross-cloud primary-standby databases (1:1)
    • High availability through cross-cloud primary-standby databases (1:n)
    • High host CPU usage
    • Best practices for read/write splitting in OceanBase Cloud
  • References
    • System architecture
    • System management
    • Database object management
    • Database design and specification constraints
    • SQL reference
    • System views
    • Parameters and system variables
    • Error codes
    • Performance tuning
    • Open API References
      • Overview
      • Service endpoints
      • Using API
      • Open APIs
        • Cluster management
          • DescribeInstances
          • DescribeInstance
          • CreateInstance
          • DeleteInstance
          • ModifyInstanceName
          • describe-node-options
          • StopCluster
          • StartCluster
          • ModifyInstanceSpec
          • DescribeInstanceTopology
          • DescribeReadonlyInstances
          • CreateReadonlyInstance
          • ModifyReadonlyInstanceSpec
          • ModifyReadonlyInstanceDiskSize
          • ModifyReadonlyInstanceNodeNum
          • DeleteReadonlyInstance
          • DescribeInstanceAvailableRoZones
          • DescribeInstanceParameters
          • UpdateInstanceParameters
          • DescribeInstanceParametersHistory
          • ModifyInstanceTagList
          • ModifyInstanceNodeNum
        • Tenant management
          • DescribeTenants
          • DescribeTenant
          • CreateTenants
          • DeleteTenants
          • ModifyTenantName
          • ModifyTenant
          • ModifyTenantUserDescription
          • ModifyTenantUserStatus
          • GetTenantCreateConstraints
          • ModifyTenantPrimaryZone
          • GetTenantCreateCpuConstraints
          • GetTenantCreateMemConstraints
          • GetTenantModifyCpuConstraints
          • GetTenantModifyMemConstraints
          • CreateTenantSecurityIpGroup
          • DescribeTenantSecurityIpGroups
          • ModifyTenantSecurityIpGroup
          • DeleteTenantSecurityIpGroup
          • DescribeTenantPrivateLink
          • DeletePrivatelinkConnection
          • CreatePrivatelinkService
          • ConnectPrivatelinkService
          • AddPrivatelinkServiceUser
          • BatchKillProcessList
          • DescribeProcessStatsComposition
          • DescribeTenantAvailableRoZones
          • DescribeTenantAddressInfo
          • ModifyTenantReadonlyReplica
          • DescribeTenantParameters
          • UpdateTenantParameters
          • DescribeTenantParametersHistory
          • ModifyTenantTagList
        • Tenant user management
          • CreateTenantUser
          • DescribeTenantUsers
          • DeleteTenantUsers
          • ModifyTenantUserPassword
          • ModifyTenantUserRoles
        • Database management
          • CreateDatabase
          • DescribeDatabases
          • DeleteDatabases
          • ModifyDatabaseUserRoles
        • Backup and restore
          • DescribeDataBackupSet
          • DescribeRestorableTenants
          • ModifyBackupStrategy
          • CreateTenantRestoreTask
          • CreateDataBackupTask
          • DescribeOneDataBackupSet
        • Database proxy management
          • CreateTenantAddress
          • CreateTenantSingleTunnelSLBAddress
          • DeleteTenantAddress
          • DescribeTenantAddress
          • ModifyOdpClusterSpec
          • ModifyTenantAddressPort
          • ModifyTenantAddressDomainPrefix
          • ConfirmPrivatelinkConnection
          • DescribeTenantAddressInfo
        • Monitoring management
          • DescribeTenantMetrics
          • DescribeMetricsData
          • DescribeNodeMetrics
        • Diagnostic management
          • DescribeOasTopSQLList
          • DescribeOasAnomalySQLList
          • DescribeOasSlowSQLList
          • DescribeOasSQLText
          • DescribeSqlAudits
          • DescribeOutlineBinding
          • DescribeSampleSqlRawTexts
          • DescribeSQLTuningAdvices
          • DescribeOasSlowSQLSamples
          • DescribeOasSQLTrends
          • DescribeOasSQLPlanGroup
        • Security management
          • CreateSecurityIpGroup
          • DescribeInstanceSSL
          • ModifyInstanceSSL
          • DescribeTenantEncryption
          • ModifyTenantEncryption
          • ModifySecurityIps
          • DeleteSecurityIpGroup
          • DescribeTenantSecurityConfigs
          • DescribeInstanceSecurityConfigs
        • Tag management
          • DescribeTags
          • CreateTags
          • UpdateTag
          • DeleteTag
        • Historical event management
          • DescribeOperationEvents
      • Differences between ApsaraDB for OceanBase APIs and OceanBase Cloud APIs
    • Download OBClient
      • Download OBClient
      • Download OceanBase Connector/J
      • Download client ODC
      • Download OceanBase Connector/ODBC
      • Download OBClient Libs
    • Metrics References
      • Cluster database
      • Cluster hosts
      • Binlog service
      • Cross-cloud network channel connection
      • Performance and SQL
      • Transactions
      • Storage and caching
      • Proxy database
      • Proxy host
    • ODC User Guide
      • What is ODC?
        • What is ODC?
        • Limitations
      • Quick Start
        • Client ODC
          • Overview
          • Install Client ODC
          • Use Client ODC
        • Web ODC
          • Overview
          • Use Web ODC
      • Data Source Management
        • Create a data source
        • Data sources and project collaboration
        • Database O&M
          • Session management
          • Global variable management
          • Recycle bin management
      • SQL Development
        • Edit and execute SQL statements
        • Perform PL compilation and debugging
        • Edit and export the result set of an SQL statement
        • Execution analysis
        • Generate test data
        • System settings
        • Database objects
          • Table objects
            • Overview
            • Create a table
          • View objects
            • Overview
            • Create a view
            • Manage views
          • Materialized view objects
            • Overview
            • Create a materialized view
            • Manage materialized views
          • Function objects
            • Overview
            • Create a function
            • Manage functions
          • Stored procedure objects
            • Overview
            • Create a stored procedure
            • Manage stored procedures
          • Sequence objects
            • Overview
            • Create a sequence
            • Manage sequences
          • Package objects
            • Overview
            • Create a program package
            • Manage program packages
          • Trigger objects
            • Overview
            • Create a trigger
            • Manage triggers
          • Type objects
            • Overview
            • Create a type
            • Manage types
          • Synonym objects
            • Overview
            • Create a synonym
            • Manage synonyms
      • Import and Export
        • Import schemas and data
        • Export schemas and data
      • Database Change Management
        • User Permission Management
          • Users and roles
          • Automatic authorization
          • User permission management
        • Project collaboration management
        • Risk levels, risk identification rules, and approval processes
        • SQL check specifications
        • SQL window specification
        • Database change management
        • Batch database change management
        • Online schema changes
        • Synchronize shadow tables
        • Schema comparison
      • Data Lifecycle Management
        • Partitioning Plan Management
          • Manage partitioning plans
          • Set partitioning strategies
          • Examples
        • SQL plan task
      • Data Desensitization and Auditing
        • Desensitize data
        • Operation records
      • Notification Management
        • Overview
        • View notification records
        • Manage Notification Channel
          • Create a notification channel
          • View, edit, and delete a notification channel
          • Configure a custom channel
        • Manage notification rules
      • Best Practices
        • Tips for SQL development
        • Explore ODC team workspaces
        • Understanding real-time SQL diagnostics for OceanBase AP
        • OceanBase historical database solutions
        • ODC SQL check for automatic identification of high-risk operations
        • Manage and modify sharded databases and tables via ODC
        • Data masking and control practices
        • Enterprise-level control and collaboration: Safeguard every database change
    • Data Development
      • Overview
      • Workspace management
      • Worksheet management
      • Compute node pool management
      • Workflow management
      • Dashboard management
      • Manage Git repositories
      • SQL development
        • SQL editing and execution
        • Result set editing
        • Execution analysis
        • Database object management
          • Create a table
          • Create a view
          • Create a function
          • Create a stored procedure
        • Session variable management
        • Git integration
      • Sample datasets
      • Data development terms
  • Manage Billing
    • Access billing
    • View monthly bills
    • View payment details
    • View orders
    • Use vouchers for payment
    • View invoices
  • Legal Agreements
    • OceanBase Cloud Services Agreement
    • Service Level Agreement
    • OceanBase Data Processing Addendum
    • Service Level Agreement for OceanBase Cloud Migration Service


    SpringBatch application example for connecting to OceanBase Cloud

    Last Updated: 2026-04-07 08:08:33

    This topic describes how to build an application that uses the Spring Batch framework with OceanBase Cloud to perform basic operations such as creating tables, inserting data, and querying data.

    You can download the java-oceanbase-springbatch sample project: SpringBatch application example for connecting to OceanBase Cloud (Oracle-compatible mode).

    Prerequisites

    • You have registered an OceanBase Cloud account and created an instance with an Oracle-compatible tenant. For more information, see Create an instance and Create a tenant.
    • You have installed JDK 1.8 and Maven.
    • You have installed IntelliJ IDEA.

    Note

    The code examples in this topic are run in IntelliJ IDEA 2021.3.2 (Community Edition). You can also choose your preferred tool to run the code examples.

    Procedure

    Note

    The following steps are performed in a Windows environment. If you use a different operating system or development tool, the steps may vary slightly.

    1. Obtain the connection string of the OceanBase Cloud database.
    2. Import the java-oceanbase-springbatch project into IDEA.
    3. Modify the database connection information in the java-oceanbase-springbatch project.
    4. Run the java-oceanbase-springbatch project.

    Step 1: Obtain the connection string of the OceanBase Cloud database

    1. Log in to the OceanBase Cloud console. In the instance list, expand the information of the target instance, and in the target tenant, choose Connect > Get Connection String.

      For more information, see Obtain a connection string.

    2. Fill in the URL with the information of the created OceanBase Cloud database.

      The URL for connecting to the Oracle-compatible mode of the OceanBase Cloud database is as follows:

      jdbc:oceanbase://$host:$port/$schema_name?user=$user_name&password=$password
      

      Parameter description:

      • $host: the connection address of the OceanBase Cloud database, for example, t********.********.oceanbase.cloud.
      • $port: the connection port of the OceanBase Cloud database. The default value is 1521.
      • $schema_name: the name of the schema to be accessed.
      • $user_name: the account for accessing the database.
      • $password: the password of the account.

      For more information about the URL parameters, see Database URL.
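
      The parameter substitution above can be sketched in plain Java. This is a minimal illustration with placeholder values rather than real credentials, and buildUrl is a helper name introduced here, not part of the driver:

```java
public class UrlExample {
    // Assemble the Oracle-compatible-mode JDBC URL from its parts,
    // following the pattern jdbc:oceanbase://$host:$port/$schema_name?user=...&password=...
    static String buildUrl(String host, int port, String schema,
                           String user, String password) {
        return String.format(
            "jdbc:oceanbase://%s:%d/%s?user=%s&password=%s",
            host, port, schema, user, password);
    }

    public static void main(String[] args) {
        // Placeholder values in the shape described above.
        System.out.println(buildUrl("t5xxxx.xxxx.oceanbase.cloud", 1521,
                                    "sys", "oracle001", "******"));
        // jdbc:oceanbase://t5xxxx.xxxx.oceanbase.cloud:1521/sys?user=oracle001&password=******
    }
}
```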

    Step 2: Import the java-oceanbase-springbatch project into IDEA

    1. Start IntelliJ IDEA and choose File > Open....

      file

    2. In the Open File or Project window that appears, select the project file and click OK.

    3. IntelliJ IDEA automatically identifies various files in the project and displays the project structure, file list, module list, and dependency relationships in the Project tool window. The Project tool window is usually located on the left side of the IntelliJ IDEA interface and is open by default. If the Project tool window is closed, you can click View > Tool Windows > Project in the menu bar or press Alt + 1 to reopen it.

      Note

      When you import a project into IntelliJ IDEA, it automatically detects the pom.xml file in the project, downloads the required dependency libraries based on the described dependencies, and adds them to the project.

    4. View the project.

      springbatch

    Step 3: Modify the database connection information in the java-oceanbase-springbatch project

    Modify the database connection information in the application.properties file based on the information obtained in Step 1: Obtain the connection string of the OceanBase Cloud database.

    Here is an example:

    • The name of the database driver is com.oceanbase.jdbc.Driver.
    • The connection address of the OceanBase Cloud database is t5******.********.oceanbase.cloud.
    • The access port is 1521.
    • The name of the schema to be accessed is sys.
    • The tenant account is oracle001.
    • The password is ******.

    Here is the sample code:

    spring.datasource.driver-class-name=com.oceanbase.jdbc.Driver
    spring.datasource.url=jdbc:oceanbase://t5******.********.oceanbase.cloud:1521/sys?characterEncoding=utf-8
    spring.datasource.username=oracle001
    spring.datasource.password=******
    
    spring.jpa.show-sql=true
    spring.jpa.hibernate.ddl-auto=update
    
    spring.batch.job.enabled=false
    
    logging.level.org.springframework=INFO
    logging.level.com.example=DEBUG
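
    Spring Boot loads this file automatically at startup. For illustration only, the same key=value format can be read with java.util.Properties; this sketch is not part of the sample project:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PropsExample {
    // Parse key=value configuration text into a Properties object,
    // the same plain format used by application.properties.
    static Properties load(String text) throws IOException {
        Properties p = new Properties();
        p.load(new StringReader(text));
        return p;
    }

    public static void main(String[] args) throws IOException {
        String text = String.join("\n",
            "spring.datasource.driver-class-name=com.oceanbase.jdbc.Driver",
            "spring.datasource.username=oracle001",
            "spring.batch.job.enabled=false");
        Properties p = load(text);
        // Read back the driver class configured for the data source.
        System.out.println(p.getProperty("spring.datasource.driver-class-name"));
        // com.oceanbase.jdbc.Driver
    }
}
```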
    

    Step 4: Run the java-oceanbase-springbatch project

    • Run the AddDescPeopleWriterTest.java file.

      1. Find the AddDescPeopleWriterTest.java file in the src > test > java directory.
      2. In the tool menu bar, choose Run > Run... > AddDescPeopleWriterTest.testWrite, or click the green triangle in the upper right corner to run.
      3. View the log information and output results in the IDEA console.
      Data in the people_desc table:
      PeopleDESC [name=John, age=25, desc=This is John with age 25]
      PeopleDESC [name=Alice, age=30, desc=This is Alice with age 30]
      Batch job execution completed.
      
    • Run the AddPeopleWriterTest.java file.

      1. Find the AddPeopleWriterTest.java file in the src > test > java directory.
      2. In the tool menu bar, choose Run > Run... > AddPeopleWriterTest.testWrite, or click the green triangle in the upper right corner to run.
      3. View the log information and output results in the IDEA console.
      Data in the people table:
      People [name=zhangsan, age=27]
      People [name=lisi, age=35]
      Batch job execution completed.
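
      From the console output above, the People model can be inferred to look roughly like the following. This is a hypothetical reconstruction: the field names and toString format are assumptions based on the printed lines, not the project's actual source:

```java
public class People {
    // Fields inferred from the output "People [name=..., age=...]" (assumption).
    private String name;
    private int age;

    public People(String name, int age) {
        this.name = name;
        this.age = age;
    }

    public String getName() { return name; }
    public int getAge() { return age; }

    @Override
    public String toString() {
        // Matches the format shown in the console output above.
        return "People [name=" + name + ", age=" + age + "]";
    }

    public static void main(String[] args) {
        System.out.println(new People("zhangsan", 27));
        // People [name=zhangsan, age=27]
    }
}
```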
      

    Project code

    Click java-oceanbase-springbatch to download the project code, which is delivered as a compressed package named java-oceanbase-springbatch.

    After decompressing it, you will find a folder named java-oceanbase-springbatch with the following directory structure:

    │  pom.xml
    │
    ├─.idea
    │
    ├─src
    │  ├─main
    │  │  ├─java
    │  │  │  └─com
    │  │  │      └─oceanbase
    │  │  │          └─example
    │  │  │              └─batch
    │  │  │                  ├─BatchApplication.java
    │  │  │                  │
    │  │  │                  ├─config
    │  │  │                  │   └─BatchConfig.java
    │  │  │                  │
    │  │  │                  ├─model
    │  │  │                  │   ├─People.java
    │  │  │                  │   └─PeopleDESC.java
    │  │  │                  │
    │  │  │                  ├─processor
    │  │  │                  │   └─AddPeopleDescProcessor.java
    │  │  │                  │
    │  │  │                  └─writer
    │  │  │                      ├─AddDescPeopleWriter.java
    │  │  │                      └─AddPeopleWriter.java
    │  │  │
    │  │  └─resources
    │  │      └─application.properties
    │  │
    │  └─test
    │      └─java
    │          └─com
    │              └─oceanbase
    │                  └─example
    │                      └─batch
    │                          ├─config
    │                          │   └─BatchConfigTest.java
    │                          │
    │                          ├─processor
    │                          │   └─AddPeopleDescProcessorTest.java
    │                          │
    │                          └─writer
    │                              ├─AddDescPeopleWriterTest.java
    │                              └─AddPeopleWriterTest.java
    │
    └─target
    

    File description:

    • pom.xml: the configuration file of the Maven project, which contains the dependencies, plugins, and build information of the project.
    • .idea: the directory used by the IDE (integrated development environment) to store project-related configuration information.
    • src: the directory where the source code of the project is stored.
    • main: the directory where the main source code and resource files are stored.
    • java: the directory where the Java source code is stored.
    • com: the root directory of the Java package hierarchy.
    • oceanbase: the organization-level package directory.
    • example: the project-level package directory.
    • batch: the main package of the project.
    • BatchApplication.java: the entry class of the application, which contains the main method of the application.
    • config: the folder where the configuration classes of the application are stored.
    • BatchConfig.java: the configuration class of the application, which is used to configure some properties and behaviors of the application.
    • model: the folder where the model classes of the application are stored.
    • People.java: the data model class for personnel information.
    • PeopleDESC.java: the data model class for personnel description information.
    • processor: the folder where the processor classes of the application are stored.
    • AddPeopleDescProcessor.java: the processor class that adds description information to personnel records.
    • writer: the folder where the writer classes of the application are stored.
    • AddDescPeopleWriter.java: the writer class for writing personnel description information.
    • AddPeopleWriter.java: the writer class for writing personnel information.
    • resources: the folder where the configuration files and other static resource files of the application are stored.
    • application.properties: the configuration file of the application, which defines properties such as the database connection and log levels.
    • test: the directory where the test code and resource files are stored.
    • BatchConfigTest.java: the test class of the application configuration class.
    • AddPeopleDescProcessorTest.java: the test class of the processor that adds personnel description information.
    • AddDescPeopleWriterTest.java: the test class of the writer for writing personnel description information.
    • AddPeopleWriterTest.java: the test class of the writer for writing personnel information.
    • target: the directory where the compiled class files, JAR packages, and other files are stored.

    Introduction to the pom.xml file

    Note

    If you only want to verify the sample, you can use the default code without any modifications. You can also modify the pom.xml file based on your specific requirements as explained below.

    The content of the pom.xml configuration file is as follows:

    1. File declaration statement.

      This statement declares the file as an XML file using XML version 1.0 and character encoding UTF-8.

      Sample code:

      <?xml version="1.0" encoding="UTF-8"?>
      
    2. Configure the namespaces and model version of POM.

      1. Use xmlns to set the POM namespace to http://maven.apache.org/POM/4.0.0.
      2. Use xmlns:xsi to set the XML namespace to http://www.w3.org/2001/XMLSchema-instance.
      3. Use xsi:schemaLocation to set the POM namespace to http://maven.apache.org/POM/4.0.0 and the location of the POM XSD file to https://maven.apache.org/xsd/maven-4.0.0.xsd.
      4. Use the <modelVersion> element to set the POM model version to 4.0.0.

      Sample code:

       <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
           <modelVersion>4.0.0</modelVersion>
      </project>
      
    3. Configure the parent information.

      1. Use <groupId> to set the parent identifier to org.springframework.boot.
      2. Use <artifactId> to set the parent dependency to spring-boot-starter-parent.
      3. Use <version> to set the parent version to 2.7.11.
      4. Use <relativePath/> to leave the parent path empty, so that Maven resolves the parent from the repository instead of a local path.

      Sample code:

       <parent>
           <groupId>org.springframework.boot</groupId>
           <artifactId>spring-boot-starter-parent</artifactId>
           <version>2.7.11</version>
           <relativePath/>
       </parent>
      
    4. Configure the basic information.

      1. Use <groupId> to set the project identifier to com.oceanbase.
      2. Use <artifactId> to set the project artifact to java-oceanbase-springboot.
      3. Use <version> to set the project version to 0.0.1-SNAPSHOT.
      4. Use <name> to set the project name to java-oceanbase-springbatch.
      5. Use <description> to describe the project as Demo project for Spring Batch.

      Sample code:

       <groupId>com.oceanbase</groupId>
       <artifactId>java-oceanbase-springboot</artifactId>
       <version>0.0.1-SNAPSHOT</version>
       <name>java-oceanbase-springbatch</name>
       <description>Demo project for Spring Batch</description>
      
    5. Configure the Java version.

      Set the Java version used by the project to 1.8.

      Sample code:

        <properties>
            <java.version>1.8</java.version>
        </properties>
      
    6. Configure the core dependencies.

      1. Set the organization of the dependency to org.springframework.boot and the name to spring-boot-starter. This is the core Spring Boot starter, which provides auto-configuration support, logging, and YAML configuration.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        
      2. Set the organization of the dependency to org.springframework.boot, the name to spring-boot-starter-jdbc, and use this dependency to access the JDBC-related features provided by Spring Boot, including connection pools and data source configurations.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
        
      3. Set the organization of the dependency to org.springframework.boot, the name to spring-boot-starter-test, and the scope to test. Use this dependency to access the testing framework and tools provided by Spring Boot, including JUnit, Mockito, and Hamcrest.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        
      4. Set the organization of the dependency to com.oceanbase, the name to oceanbase-client, and the version to 2.4.3. Use this dependency to access the client features provided by OceanBase, including connections, queries, and transactions.

        Sample code:

        <dependency>
            <groupId>com.oceanbase</groupId>
            <artifactId>oceanbase-client</artifactId>
            <version>2.4.3</version>
        </dependency>
        
      5. Set the organization of the dependency to org.springframework.boot, the name to spring-boot-starter-batch, and use this dependency to access the batch processing features provided by Spring Boot.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-batch</artifactId>
        </dependency>
        
      6. Set the organization of the dependency to org.springframework.boot and the name to spring-boot-starter-data-jpa. Use this dependency to pull in the dependencies and configurations required for data access with JPA.

        Sample code:

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-jpa</artifactId>
        </dependency>
        
      7. Set the organization of the dependency to org.apache.tomcat, the name to tomcat-jdbc, and use this dependency to access the JDBC connection pool features provided by Tomcat, including connection pool configuration, connection acquisition and release, and connection management.

        Sample code:

        <dependency>
            <groupId>org.apache.tomcat</groupId>
            <artifactId>tomcat-jdbc</artifactId>
        </dependency>
        
      8. Set the organization of the dependency to junit, the name to junit, the version to 4.10, and the scope to test. Use this dependency to add the JUnit unit testing framework.

        Sample code:

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
            <scope>test</scope>
        </dependency>
        
      9. Set the organization of the dependency to javax.activation, the name to javax.activation-api, and the version to 1.2.0. Use this dependency to introduce the Java Activation Framework (JAF) library.

        Sample code:

        <dependency>
            <groupId>javax.activation</groupId>
            <artifactId>javax.activation-api</artifactId>
            <version>1.2.0</version>
        </dependency>
        
      10. Set the organization of the dependency to jakarta.persistence, the name to jakarta.persistence-api, and the version to 2.2.3. Use this dependency to add the Jakarta Persistence API.

        Sample code:

        <dependency>
            <groupId>jakarta.persistence</groupId>
            <artifactId>jakarta.persistence-api</artifactId>
            <version>2.2.3</version>
        </dependency>
        
    7. Configure the Maven plugins.

      Set the organization of the plugin to org.springframework.boot and the name to spring-boot-maven-plugin. Use this plugin to package the Spring Boot application into an executable JAR or WAR file that can be run directly.

      Sample code:

       <build>
           <plugins>
               <plugin>
                   <groupId>org.springframework.boot</groupId>
                   <artifactId>spring-boot-maven-plugin</artifactId>
               </plugin>
           </plugins>
       </build>
      

    Introduction to the application.properties file

    The application.properties file is used to configure database connections and other related settings. This includes database drivers, connection URLs, usernames, and passwords. It also contains configurations for JPA (Java Persistence API) and Spring Batch, as well as log level settings.

    1. Database connection configuration.

      • Use spring.datasource.driver-class-name to specify the database driver as com.oceanbase.jdbc.Driver, which is used to establish a connection with OceanBase Cloud.
      • Use spring.datasource.url to specify the URL for connecting to the database.
      • Use spring.datasource.username to specify the username for connecting to the database.
      • Use spring.datasource.password to specify the password for connecting to the database.

      Sample code:

      spring.datasource.driver-class-name=com.oceanbase.jdbc.Driver
      spring.datasource.url=jdbc:oceanbase://host:port/schema_name?characterEncoding=utf-8
      spring.datasource.username=user_name
      spring.datasource.password=******
      
    2. JPA configuration.

      • Use spring.jpa.show-sql to specify whether to display SQL statements in the logs. Setting it to true means SQL statements will be displayed.
      • Use spring.jpa.hibernate.ddl-auto to specify the Hibernate DDL operation behavior. Setting it to update means the database structure will be automatically updated when the application starts.

      Sample code:

      spring.jpa.show-sql=true
      spring.jpa.hibernate.ddl-auto=update
      
    3. Spring Batch configuration:

      Use spring.batch.job.enabled to specify whether to enable Spring Batch jobs. Setting it to false means Spring Batch jobs are disabled.

      Sample code:

      spring.batch.job.enabled=false
      
    4. Log configuration:

      • Use logging.level.org.springframework to specify the log level for the Spring framework as INFO.
      • Use logging.level.com.example to specify the log level for custom application code as DEBUG.

      Sample code:

      logging.level.org.springframework=INFO
      logging.level.com.example=DEBUG
      
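    The settings above are plain key=value pairs, the standard Java properties format. As a self-contained illustration (using java.util.Properties directly, not Spring Boot's actual configuration loader), the following sketch parses the same kind of content:

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class PropsDemo {
    // Illustration only: application.properties is a plain key=value file, and
    // java.util.Properties parses the same format Spring Boot reads at startup.
    static Properties parse(String config) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(config));
        } catch (IOException e) {
            throw new UncheckedIOException(e); // cannot happen for an in-memory reader
        }
        return props;
    }

    public static void main(String[] args) {
        Properties props = parse(
                "spring.jpa.show-sql=true\n"
                + "spring.batch.job.enabled=false\n"
                + "logging.level.org.springframework=INFO\n");
        System.out.println(props.getProperty("spring.batch.job.enabled")); // prints: false
    }
}
```

    Each property name maps to a plain string value; Spring Boot then binds these strings to the typed settings described above.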

    Introduction to the BatchApplication.java file

    The BatchApplication.java file is the entry point of the Spring Boot application.

    The code in the BatchApplication.java file mainly includes the following parts:

    1. Importing other classes and interfaces.

      Declare that this file contains the following interfaces and classes:

      • SpringApplication class: used to start the Spring Boot application.
      • SpringBootApplication annotation: used to mark this class as the entry point of the Spring Boot application.

      Sample code:

          import org.springframework.boot.SpringApplication;
          import org.springframework.boot.autoconfigure.SpringBootApplication;
      
    2. Define the BatchApplication class.

      Use the @SpringBootApplication annotation to mark the BatchApplication class as the entry point of the Spring Boot application. Define a static main method in the BatchApplication class as the entry point of the application. In this method, use the SpringApplication.run method to start the Spring Boot application. Define a method named runBatchJob to run the batch job.

      Sample code:

      
      
          @SpringBootApplication
          public class BatchApplication {
              public static void main(String[] args) {
                  SpringApplication.run(BatchApplication.class, args);
              }
      
              public void runBatchJob() {
              }
          }
      

    Introduction to the BatchConfig.java file

    The BatchConfig.java file is used to configure components such as steps, readers, processors, and writers for batch processing jobs.

    The code in the BatchConfig.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the interfaces and classes included in the current file:

      • People class: stores the personnel information read from the database.
      • PeopleDESC class: stores the description information after the personnel information is converted or processed.
      • AddPeopleDescProcessor class: converts the read People object to the PeopleDESC object. This class implements the ItemProcessor interface.
      • AddDescPeopleWriter class: writes the PeopleDESC object to the target location. This class implements the ItemWriter interface.
      • Job interface: represents a batch processing job.
      • Step interface: represents a step in a job.
      • EnableBatchProcessing annotation: a Spring Batch configuration annotation used to enable and configure Spring Batch processing.
      • JobBuilderFactory class: used to create and configure jobs.
      • StepBuilderFactory class: used to create and configure steps.
      • RunIdIncrementer class: a Spring Batch run ID incrementer used to increment the run ID each time a job is run.
      • ItemProcessor interface: used to process or convert the read items.
      • ItemReader interface: used to read items from the data source.
      • ItemWriter interface: used to write the processed or converted items to the specified target location.
      • JdbcCursorItemReader class: used to read data from the database and return the cursor result set.
      • Autowired annotation: used for dependency injection.
      • Bean annotation: used to create and configure beans.
      • ComponentScan annotation: used to specify the packages or classes to be scanned for components.
      • Configuration annotation: used to mark a class as a configuration class.
      • EnableAutoConfiguration annotation: used to enable Spring Boot auto-configuration.
      • SpringBootApplication annotation: used to mark the class as the entry point of a Spring Boot application.
      • BeanPropertyRowMapper class: used to map database rows to object properties by column name.
      • DataSource interface: used to represent the database connection.

      Sample code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import com.oceanbase.example.batch.processor.AddPeopleDescProcessor;
      import com.oceanbase.example.batch.writer.AddDescPeopleWriter;
      import org.springframework.batch.core.Job;
      import org.springframework.batch.core.Step;
      import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
      import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
      import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
      import org.springframework.batch.core.launch.support.RunIdIncrementer;
      import org.springframework.batch.item.ItemProcessor;
      import org.springframework.batch.item.ItemReader;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.batch.item.database.JdbcCursorItemReader;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
      import org.springframework.boot.autoconfigure.SpringBootApplication;
      import org.springframework.context.annotation.Bean;
      import org.springframework.context.annotation.ComponentScan;
      import org.springframework.context.annotation.Configuration;
      import org.springframework.jdbc.core.BeanPropertyRowMapper;
      
      import javax.sql.DataSource;
      
    2. Define the BatchConfig class.

      This is a simple Spring Batch job. It defines the methods for reading, processing, and writing data and encapsulates these steps into a job. By using Spring Batch annotations and auto-configuration, you can create the corresponding component instances through the @Bean methods in the configuration class and use these components in step1 to complete the data reading, processing, and writing.

      • Use @Configuration to indicate that this class is a configuration class.
      • Use @EnableBatchProcessing to enable Spring Batch processing. This annotation automatically creates necessary beans such as JobRepository and JobLauncher.
      • Use @SpringBootApplication as the main class annotation for Spring Boot applications, serving as the starting point of a Spring Boot application.
      • Use @ComponentScan to specify the packages to be scanned for components, telling Spring to scan and register all components in this package and its subpackages.
      • Use @EnableAutoConfiguration to automatically configure the infrastructure of Spring Boot applications.

      Sample code:

       @Configuration
       @EnableBatchProcessing
       @SpringBootApplication
       @ComponentScan("com.oceanbase.example.batch.writer")
       @EnableAutoConfiguration
       public class BatchConfig {
       }
      
      1. Define the @Autowired annotation.

        Use the @Autowired annotation to inject JobBuilderFactory, StepBuilderFactory, and DataSource into the member variables of the BatchConfig class. JobBuilderFactory is a factory class used to create and configure jobs (Job), StepBuilderFactory is a factory class used to create and configure steps (Step), and DataSource is an interface used to obtain the database connection.

        Sample code:

        @Autowired
        private JobBuilderFactory jobBuilderFactory;
        
        @Autowired
        private StepBuilderFactory stepBuilderFactory;
        
        @Autowired
        private DataSource dataSource;
        
      2. Define the @Bean annotation.

        Use the @Bean annotation to define several methods for creating readers, processors, writers, steps, and jobs for batch processing.

        • Use the peopleReader method to create an ItemReader component instance. This component uses JdbcCursorItemReader to read People object data from the database. Set the data source to dataSource, set the RowMapper to map database rows to People objects, and set the SQL query statement to SELECT * FROM people.

        • Use the addPeopleDescProcessor method to create an ItemProcessor component instance. This component uses AddPeopleDescProcessor to process People objects and returns the converted PeopleDESC objects.

        • Use the addDescPeopleWriter method to create an ItemWriter component instance. This component uses AddDescPeopleWriter to write PeopleDESC objects to the target location.

        • Use the step1 method to create a Step component instance. The step name is step1. Use stepBuilderFactory.get to obtain the step builder, set the reader to the ItemReader component, set the processor to the ItemProcessor component, set the writer to the ItemWriter component, set the chunk size to 10, and finally call build to build and return the configured Step.

        • Use the importJob method to create a Job component instance. The job name is importJob. Use jobBuilderFactory.get to obtain the job builder, set the incrementer to RunIdIncrementer, set the initial step of the job flow to step1, and finally call build to build and return the configured Job.

          Sample code:

          @Bean
          public ItemReader<People> peopleReader() {
              JdbcCursorItemReader<People> reader = new JdbcCursorItemReader<>();
              reader.setDataSource(dataSource);
              reader.setRowMapper(new BeanPropertyRowMapper<>(People.class));
              reader.setSql("SELECT * FROM people");
              return reader;
          }
          
          @Bean
          public ItemProcessor<People, PeopleDESC> addPeopleDescProcessor() {
              return new AddPeopleDescProcessor();
          }
          
          @Bean
          public ItemWriter<PeopleDESC> addDescPeopleWriter() {
              return new AddDescPeopleWriter();
          }
          
          @Bean
          public Step step1(ItemReader<People> reader, ItemProcessor<People, PeopleDESC> processor,
                          ItemWriter<PeopleDESC> writer) {
              return stepBuilderFactory.get("step1")
                      .<People, PeopleDESC>chunk(10)
                      .reader(reader)
                      .processor(processor)
                      .writer(writer)
                      .build();
          }
          
          @Bean
          public Job importJob(Step step1) {
              return jobBuilderFactory.get("importJob")
                      .incrementer(new RunIdIncrementer())
                      .flow(step1)
                      .end()
                      .build();
          }
          
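    The chunk size of 10 configured in step1 means items pass through the reader and processor one at a time but reach the writer in batches. The following plain-Java sketch (illustrative only, not Spring Batch code; the chunk size is reduced to 2 so the grouping is visible) mimics that chunk-oriented loop:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkSketch {
    // Not Spring Batch itself: a plain-Java sketch of the chunk-oriented loop
    // that step1 configures with chunk(10), scaled down here to chunks of 2.
    static List<List<String>> run(List<Integer> source, int chunkSize) {
        List<List<String>> writes = new ArrayList<>();
        List<String> chunk = new ArrayList<>();
        for (Integer item : source) {
            chunk.add("desc-" + item);           // ItemProcessor: one item at a time
            if (chunk.size() == chunkSize) {     // ItemWriter: one call per full chunk
                writes.add(new ArrayList<>(chunk));
                chunk.clear();
            }
        }
        if (!chunk.isEmpty()) writes.add(chunk); // final partial chunk
        return writes;
    }

    public static void main(String[] args) {
        System.out.println(run(Arrays.asList(1, 2, 3, 4, 5), 2));
        // prints: [[desc-1, desc-2], [desc-3, desc-4], [desc-5]]
    }
}
```

    With five items and a chunk size of 2, the writer is invoked three times: twice with a full chunk and once with the final partial chunk.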

    Introduction to the People.java file

    The People.java file creates a People class data model to represent a person's information. This class includes two private member variables, name and age, along with corresponding getter and setter methods. Finally, the toString method is overridden to print the object's information. Here, name represents the person's name, and age represents the person's age. You can use the getter and setter methods to obtain and set the values of these attributes.

    This class provides a way to store and pass data for batch processing programs. In batch processing, the People object is used to store data, and the setter method is used to set data, while the getter method is used to obtain data.

    Code:

        public class People {
            private String name;
            private int age;
    
                // getters and setters
    
            public String getName() {
                return name;
            }
    
            public void setName(String name) {
                this.name = name;
            }
    
            public int getAge() {
                return age;
            }
    
            public void setAge(int age) {
                this.age = age;
            }
            @Override
            public String toString() {
                return "People [name=" + name + ", age=" + age + "]";
            }
        }
    

    Introduction to the PeopleDESC.java file

    The PeopleDESC.java file creates a PeopleDESC class data model to represent a person's information. The PeopleDESC class has four attributes: name, age, desc, and id, which represent the person's name, age, description, and identifier, respectively. This class includes corresponding getter and setter methods to access and set the attribute values. The toString method is overridden to return the string representation of the class, including the name, age, and description.

    Similar to the People class, the PeopleDESC class is used to store and pass data in batch processing programs.

    Code:

        public class PeopleDESC {
            private String name;
            private int age;
            private String desc;
            private int id;
    
            public String getName() {
                return name;
            }
    
            public void setName(String name) {
                this.name = name;
            }
    
            public int getAge() {
                return age;
            }
    
            public void setAge(int age) {
                this.age = age;
            }
    
            public String getDesc() {
                return desc;
            }
    
            public void setDesc(String desc) {
                this.desc = desc;
            }
    
            public int getId() {
                return id;
            }
    
            public void setId(int id) {
                this.id = id;
            }
    
            @Override
            public String toString() {
                return "PeopleDESC [name=" + name + ", age=" + age + ", desc=" + desc + "]";
            }
        }
    

    Introduction to the AddPeopleDescProcessor.java file

    The AddPeopleDescProcessor.java file defines a class named AddPeopleDescProcessor that implements the ItemProcessor interface to convert a People object to a PeopleDESC object.

    The code in the AddPeopleDescProcessor.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the interfaces and classes included in this file:

      • People class: used to store the information of a person read from a database.
      • PeopleDESC class: used to store the description information of a person after conversion or processing.
      • ItemProcessor interface: used to process or convert the read items.

      Code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.springframework.batch.item.ItemProcessor;
      
    2. Define the AddPeopleDescProcessor class.

      The AddPeopleDescProcessor class implements the ItemProcessor interface and is used to convert a People object to a PeopleDESC object, implementing the logic for processing input data in batch processing.

      In the process method of this class, first, a PeopleDESC object desc is created. Then, the item parameter is used to obtain the name and age attributes of the People object and set these attributes to the desc object. At the same time, the desc attribute of the desc object is assigned a value, which is a description generated based on the attributes of the People object. Finally, the processed PeopleDESC object is returned.

      Code:

      public class AddPeopleDescProcessor implements ItemProcessor<People, PeopleDESC> {
          @Override
          public PeopleDESC process(People item) throws Exception {
              PeopleDESC desc = new PeopleDESC();
              desc.setName(item.getName());
              desc.setAge(item.getAge());
              desc.setDesc("This is " + item.getName() + " with age " + item.getAge());
              return desc;
          }
      }
      
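    As a quick sanity check of the transformation, here is a self-contained sketch (with the People class trimmed to the two fields the processor uses) that reproduces the desc string built in the process method:

```java
public class ProcessorSketch {
    // Trimmed stand-in for the project's People model: only the fields
    // that AddPeopleDescProcessor actually reads.
    static class People {
        String name;
        int age;
    }

    // Same string construction as AddPeopleDescProcessor.process.
    static String buildDesc(People item) {
        return "This is " + item.name + " with age " + item.age;
    }

    public static void main(String[] args) {
        People p = new People();
        p.name = "John";
        p.age = 25;
        System.out.println(buildDesc(p)); // prints: This is John with age 25
    }
}
```

    The output matches the desc values shown in the test console output earlier in this tutorial.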

    Introduction to the AddDescPeopleWriter.java file

    The AddDescPeopleWriter.java file defines the AddDescPeopleWriter class, which implements the ItemWriter interface and is used to write PeopleDESC objects to a database.

    The AddDescPeopleWriter.java file contains the following main parts:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in the current file:

      • PeopleDESC class: used to store the description information of the personnel after conversion or processing.
      • ItemWriter interface: used to write the processed or converted items to the specified target location.
      • Autowired annotation: used for dependency injection.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • List interface: used to operate on the result set.

      Code:

      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.jdbc.core.JdbcTemplate;
      
      import java.util.List;
      
    2. Define the AddDescPeopleWriter class.

      1. Use the @Autowired annotation to automatically inject the JdbcTemplate instance, which is used to execute database operations when writing data.

        Code:

            @Autowired
            private JdbcTemplate jdbcTemplate;
        
      2. In the write method, traverse the input List<? extends PeopleDESC> and extract each PeopleDESC object. First, execute the SQL statement DROP TABLE people_desc to delete the existing people_desc table (this statement fails if the table does not exist). Then, execute the SQL statement CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255)) to create a table named people_desc with four columns: id, name, age, and description. Finally, use the SQL statement INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?) to insert the attribute values of each PeopleDESC object into the people_desc table.

        Code:

            @Override
            public void write(List<? extends PeopleDESC> items) throws Exception {
                // Delete the existing table
                jdbcTemplate.execute("DROP TABLE people_desc");
                // Create the table
                String createTableSql = "CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255))";
                jdbcTemplate.execute(createTableSql);
                for (PeopleDESC item : items) {
                    String sql = "INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?)";
                    jdbcTemplate.update(sql, item.getId(), item.getName(), item.getAge(), item.getDesc());
                }
            }
        

    Introduction to the AddPeopleWriter.java file

    The AddPeopleWriter.java file defines the AddPeopleWriter class, which implements the ItemWriter interface and is used to write People objects to a database.

    The code in the AddPeopleWriter.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in this file:

      • People class: used to store personnel information read from the database.
      • ItemWriter interface: used to write processed or converted items to the specified target location.
      • @Autowired annotation: used for dependency injection.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • @Component annotation: used to mark this class as a Spring component.
      • List interface: used to operate on query result sets.

      Code:

      import com.oceanbase.example.batch.model.People;
      import org.springframework.batch.item.ItemWriter;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.stereotype.Component;
      
      import java.util.List;
      
    2. Define the AddPeopleWriter class.

      1. Use the @Autowired annotation to automatically inject a JdbcTemplate instance, which is used to execute database operations when writing data.

        Code:

            @Autowired
            private JdbcTemplate jdbcTemplate;
        
      2. In the write method, traverse the input List<? extends People> and extract each People object. First, execute the SQL statement DROP TABLE people to delete the existing people table. Then, execute the SQL statement CREATE TABLE people (name VARCHAR2(255), age INT) to create a table named people with two columns: name and age. Finally, use the SQL statement INSERT INTO people (name, age) VALUES (?, ?) to insert the attribute values of each People object into the people table.

        Code:

        @Override
        public void write(List<? extends People> items) throws Exception {
            // Drop the table if it exists
            jdbcTemplate.execute("DROP TABLE people");
            // Create the table
            String createTableSql = "CREATE TABLE people (name VARCHAR2(255), age INT)";
            jdbcTemplate.execute(createTableSql);
            for (People item : items) {
                String sql = "INSERT INTO people (name, age) VALUES (?, ?)";
                jdbcTemplate.update(sql, item.getName(), item.getAge());
            }
        }
        

    Introduction to the BatchConfigTest.java file

    The BatchConfigTest.java file is a JUnit test class that verifies the Spring Batch job configuration.

    The code in the BatchConfigTest.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the following interfaces and classes in this file:

      • Assert class: used to assert test results.
      • @Test annotation: used to mark a method as a test method.
      • @RunWith annotation: used to specify the test runner.
      • Job interface: represents a batch processing job.
      • JobExecution class: used to represent the execution of a batch processing job.
      • JobParameters class: used to represent the parameters of a batch processing job.
      • JobParametersBuilder class: used to build the parameters of a batch processing job.
      • JobLauncher interface: used to start a batch processing job.
      • @Autowired annotation: used for dependency injection.
      • @SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • SpringRunner class: used to specify the test runner as SpringRunner.
      • BatchStatus enum: used to represent the execution status of a batch processing job.

      Code:

      import org.junit.Assert;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.batch.core.Job;
      import org.springframework.batch.core.JobExecution;
      import org.springframework.batch.core.JobParameters;
      import org.springframework.batch.core.JobParametersBuilder;
      import org.springframework.batch.core.launch.JobLauncher;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import org.springframework.batch.core.BatchStatus;
      
    2. Define the BatchConfigTest class.

      By using the @SpringBootTest annotation and the SpringRunner runner, you can perform integration tests for Spring Boot. In the testJob method, use the JobLauncherTestUtils helper class to start a batch processing job and use assertions to verify the job's execution status.

      1. Use the @Autowired annotation to automatically inject a JobLauncherTestUtils instance.

        Code:

        @Autowired
        private JobLauncherTestUtils jobLauncherTestUtils;
        
      2. Use the @Test annotation to mark the testJob method as a test method. In this method, first create a JobParameters object, then use the jobLauncherTestUtils.launchJob method to start the batch processing job, and use the Assert.assertEquals method to assert that the job's execution status is COMPLETED.

        Code:

        @Test
        public void testJob() throws Exception {
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("jobParam", "paramValue")
                    .toJobParameters();
        
            JobExecution jobExecution = jobLauncherTestUtils.launchJob(jobParameters);
        
            Assert.assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
        }
        
      3. Use the @Autowired annotation to automatically inject a JobLauncher instance.

        Code:

        @Autowired
        private JobLauncher jobLauncher;
        
      4. Use the @Autowired annotation to automatically inject a Job instance.

        Code:

        @Autowired
        private Job job;
        
      5. Define an inner class named JobLauncherTestUtils to help start the batch processing job. In this class, define a launchJob method that calls the jobLauncher.run method to run the job and returns the job execution result.

        Code:

        private class JobLauncherTestUtils {
            public JobExecution launchJob(JobParameters jobParameters) throws Exception {
                return jobLauncher.run(job, jobParameters);
            }
        }
        

    AddPeopleDescProcessorTest.java file

    The AddPeopleDescProcessorTest.java file is a class that uses JUnit to test the processing logic of AddPeopleDescProcessor.

    The code in the AddPeopleDescProcessorTest.java file mainly includes the following parts:

    1. Import other classes and interfaces.

      Declare the interfaces and classes included in the current file:

      • People class: stores the personnel information read from the database.
      • PeopleDESC class: stores the description information after the personnel information is converted or processed.
      • Test annotation: marks a method as a test method.
      • RunWith annotation: specifies the test runner.
      • Autowired annotation: performs dependency injection.
      • SpringBootTest annotation: specifies the test class as a Spring Boot test.
      • SpringRunner class: specifies the test runner as SpringRunner.

      Sample code:

      import com.oceanbase.example.batch.model.People;
      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.test.context.junit4.SpringRunner;
      
    2. Define the AddPeopleDescProcessorTest class.

      Use the SpringBootTest annotation and SpringRunner runner for Spring Boot integration testing.

      1. Use the @Autowired annotation to automatically inject the AddPeopleDescProcessor instance.

        Sample code:

        @Autowired
        private AddPeopleDescProcessor processor;
        
      2. Use the @Test annotation to mark the testProcess method as a test method. In this method, first create a People object, then use the processor.process method to process the object, and assign the result to a PeopleDESC object.

        Sample code:

        @Test
        public void testProcess() throws Exception {
            People people = new People();
            PeopleDESC desc = processor.process(people);
        }
        
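      As written, testProcess exercises the processor but asserts nothing about the result. A minimal sketch of the check the test could add, reproducing the processor's description format ("This is <name> with age <age>") in plain Java so the expected value is explicit; the Spring wiring and the real processor call are omitted here:

```java
public class ProcessorLogicSketch {
    // Reproduces AddPeopleDescProcessor's mapping from (name, age) to the
    // description string stored in PeopleDESC.desc.
    static String describe(String name, int age) {
        return "This is " + name + " with age " + age;
    }

    public static void main(String[] args) {
        // In the real test this value would come from processor.process(people).getDesc().
        String desc = describe("John", 25);
        if (!"This is John with age 25".equals(desc)) {
            throw new AssertionError("unexpected description: " + desc);
        }
        System.out.println(desc); // This is John with age 25
    }
}
```

      In the actual test class, the equivalent check would be an Assert.assertEquals on the PeopleDESC returned by processor.process.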

    Introduction to AddDescPeopleWriterTest.java

    The AddDescPeopleWriterTest.java file is a class that uses JUnit to test the write logic of AddDescPeopleWriter.

    The code in the AddDescPeopleWriterTest.java file mainly includes the following parts:

    1. Reference other classes and interfaces.

      The current file contains the following interfaces and classes:

      • PeopleDESC class: stores the description information generated from the personnel information.
      • Assert class: used to assert test results.
      • Test annotation: used to mark a method as a test method.
      • RunWith annotation: used to specify the test runner.
      • Autowired annotation: used for dependency injection.
      • SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • SpringRunner class: used to specify the test runner as SpringRunner.
      • ArrayList class: used to create an empty list.
      • List interface: used to operate on query result sets.

      Sample code:

      import com.oceanbase.example.batch.model.PeopleDESC;
      import org.junit.Assert;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import java.util.ArrayList;
      import java.util.List;
      
    2. Define the AddDescPeopleWriterTest class.

      You can write an integration test for Spring Boot by using the SpringBootTest annotation and the SpringRunner runner.

      1. Use @Autowired to inject instances. Use the @Autowired annotation to automatically inject the AddDescPeopleWriter and JdbcTemplate instances.

        Sample code:

        @Autowired
        private AddDescPeopleWriter writer;
        @Autowired
        private JdbcTemplate jdbcTemplate;
        
      2. Use @Test to test data insertion and output. Annotate the testWrite method with the @Test annotation to mark it as a test method. In this method, first create an empty peopleDescList and add two PeopleDESC objects to it. Then, call the writer.write method to write the list to the database. Next, use jdbcTemplate to query the people_desc table and verify the data with assertions. Finally, print the query results to the console and output a message indicating that the job execution has completed.

        1. Insert data into the people_desc table. First, an empty list peopleDescList of PeopleDESC objects is created. Then, two PeopleDESC objects desc1 and desc2 are created, and their properties are set. desc1 and desc2 are added to peopleDescList. The write method of writer is called, writing the objects in peopleDescList to the people_desc table in the database. The JdbcTemplate is used to execute the query statement SELECT COUNT(*) FROM people_desc, obtaining the number of records in the people_desc table, and assigning the result to the variable count. Finally, the Assert.assertEquals method is used for assertions to check whether the value of count is equal to 2.

          Code:

             List<PeopleDESC> peopleDescList = new ArrayList<>();
             PeopleDESC desc1 = new PeopleDESC();
             desc1.setId(1);
             desc1.setName("John");
             desc1.setAge(25);
             desc1.setDesc("This is John with age 25");
             peopleDescList.add(desc1);
             PeopleDESC desc2 = new PeopleDESC();
             desc2.setId(2);
             desc2.setName("Alice");
             desc2.setAge(30);
             desc2.setDesc("This is Alice with age 30");
             peopleDescList.add(desc2);
             writer.write(peopleDescList);
          
             String selectSql = "SELECT COUNT(*) FROM people_desc";
             int count = jdbcTemplate.queryForObject(selectSql, Integer.class);
             Assert.assertEquals(2, count);
          
        2. Query the data from the people_desc table. Use JdbcTemplate to execute the SELECT * FROM people_desc statement and process the result set with a lambda expression. In the lambda expression, read the field values from the result set with methods such as rs.getInt and rs.getString, set them on a new PeopleDESC object, and collect the objects into the resultDesc list. Then, print the prompt line people_desc data:, use a for loop to traverse each PeopleDESC object in the resultDesc list, and print its content with System.out.println. At the end, print a message indicating that the execution has completed.

          The code is as follows:

          List<PeopleDESC> resultDesc = jdbcTemplate.query("SELECT * FROM people_desc", (rs, rowNum) -> {
             PeopleDESC desc = new PeopleDESC();
             desc.setId(rs.getInt("id"));
             desc.setName(rs.getString("name"));
             desc.setAge(rs.getInt("age"));
             desc.setDesc(rs.getString("description"));
             return desc;
          });
          
          System.out.println("people_desc data:");
          for (PeopleDESC desc : resultDesc) {
             System.out.println(desc);
          }
          
          // Outputs the information after the job execution is completed.
          System.out.println("Batch Job execution completed.");
          
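      The lambda passed to jdbcTemplate.query follows Spring's RowMapper contract: it is invoked once per row, with the result set positioned on that row and the row number, and the mapped objects are collected into a list. A self-contained sketch of that per-row callback shape, where SimpleRowMapper is a hypothetical stand-in for Spring's interface and a plain Map stands in for java.sql.ResultSet:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class RowMapperSketch {
    // Hypothetical stand-in for org.springframework.jdbc.core.RowMapper<T>,
    // with Map<String, Object> standing in for java.sql.ResultSet.
    interface SimpleRowMapper<T> {
        T mapRow(Map<String, Object> row, int rowNum);
    }

    // Mirrors jdbcTemplate.query: invoke the mapper once per row and
    // collect the mapped objects into a list.
    static <T> List<T> query(List<Map<String, Object>> rows, SimpleRowMapper<T> mapper) {
        List<T> result = new ArrayList<>();
        for (int i = 0; i < rows.size(); i++) {
            result.add(mapper.mapRow(rows.get(i), i));
        }
        return result;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> rows = new ArrayList<>();
        Map<String, Object> r1 = new HashMap<>();
        r1.put("name", "John");
        r1.put("age", 25);
        rows.add(r1);
        Map<String, Object> r2 = new HashMap<>();
        r2.put("name", "Alice");
        r2.put("age", 30);
        rows.add(r2);
        // Same shape as: jdbcTemplate.query("SELECT * FROM people_desc", (rs, rowNum) -> ...)
        List<String> names = query(rows, (row, rowNum) -> (String) row.get("name"));
        System.out.println(names); // [John, Alice]
    }
}
```

      The point of the callback style is that the caller never touches the cursor: iteration and resource handling stay inside query, and the lambda only maps one row to one object.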

    AddPeopleWriterTest.java

    The AddPeopleWriterTest.java file is a class that uses JUnit to test the writing logic of AddPeopleWriter.

    The AddPeopleWriterTest.java file contains the following parts:

    1. References other classes and interfaces.

      The current file includes the following interfaces and classes:

      • People class: used to store personnel information read from the database.
      • Test annotation: used to mark a method as a test method.
      • RunWith annotation: used to specify the test runner.
      • Autowired annotation: used for dependency injection.
      • SpringBootApplication annotation: used to mark this class as the entry point of a Spring Boot application.
      • SpringBootTest annotation: used to specify the test class as a Spring Boot test.
      • ComponentScan annotation: used to specify the packages or classes to scan for components.
      • JdbcTemplate class: provides methods for executing SQL statements.
      • SpringRunner class: used to specify the test runner as SpringRunner.
      • ArrayList class: used to create an empty list.
      • List interface: used to operate on query result sets.

      The following example shows the code:

      import com.oceanbase.example.batch.model.People;
      import org.junit.Test;
      import org.junit.runner.RunWith;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.boot.autoconfigure.SpringBootApplication;
      import org.springframework.boot.test.context.SpringBootTest;
      import org.springframework.context.annotation.ComponentScan;
      import org.springframework.jdbc.core.JdbcTemplate;
      import org.springframework.test.context.junit4.SpringRunner;
      
      import java.util.ArrayList;
      import java.util.List;
      
    2. Define the AddPeopleWriterTest class.

      Use the SpringBootTest annotation and the SpringRunner runner for Spring Boot integration testing, and use the @ComponentScan annotation to specify the packages to scan.

      1. Inject instances by using @Autowired. Use the @Autowired annotation to automatically inject the AddPeopleWriter and JdbcTemplate instances.

        Sample code:

        @Autowired
        private AddPeopleWriter addPeopleWriter;
        @Autowired
        private JdbcTemplate jdbcTemplate;
        
      2. Use the @Test annotation to test data insertion and output.

        1. Insert data into the people table. First, create an empty peopleList of People objects. Then, create two People objects person1 and person2, set their name and age properties, and add them to peopleList. After that, call the write method of the addPeopleWriter object and pass peopleList to the method to write these People objects to the database.

          The following is the code:

             List<People> peopleList = new ArrayList<>();
             People person1 = new People();
             person1.setName("zhangsan");
             person1.setAge(27);
             peopleList.add(person1);
             People person2 = new People();
             person2.setName("lisi");
             person2.setAge(35);
             peopleList.add(person2);
             addPeopleWriter.write(peopleList);
          
        2. Output the data in the people table. First, execute the query statement SELECT * FROM people by using JdbcTemplate and process the query results with a lambda expression. In the lambda expression, use the rs.getString and rs.getInt methods to extract the field values from the result set and set them on a new People object, which is added to the result list. Then, print the prompt line Data in the people table:, loop through the result list with a for loop, and print the contents of each People object with System.out.println. Finally, print a message indicating that the batch job execution has completed.

          Sample code:

             List<People> result = jdbcTemplate.query("SELECT * FROM people", (rs, rowNum) -> {
                 People person = new People();
                 person.setName(rs.getString("name"));
                 person.setAge(rs.getInt("age"));
                 return person;
             });
          
             System.out.println("Data in the people table:");
             for (People person : result) {
                 System.out.println(person);
             }
          
             // Output the information after the job is executed.
             System.out.println("Batch Job execution completed.");
          

    Full code

    pom.xml
    application.properties
    BatchApplication.java
    BatchConfig.java
    People.java
    PeopleDESC.java
    AddPeopleDescProcessor.java
    AddDescPeopleWriter.java
    AddPeopleWriter.java
    BatchConfigTest.java
    AddPeopleDescProcessorTest.java
    AddDescPeopleWriterTest.java
    AddPeopleWriterTest.java
    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>
        <parent>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-parent</artifactId>
            <version>2.7.11</version>
            <relativePath/> <!-- lookup parent from repository -->
        </parent>
        <groupId>com.oceanbase</groupId>
        <artifactId>java-oceanbase-springboot</artifactId>
        <version>0.0.1-SNAPSHOT</version>
        <name>java-oceanbase-springbatch</name>
        <description>Demo project for Spring Batch</description>
        <properties>
            <java.version>1.8</java.version>
        </properties>
        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter</artifactId>
            </dependency>
            <dependency>
                <groupId>com.oceanbase</groupId>
                <artifactId>oceanbase-client</artifactId>
                <version>2.4.3</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-jdbc</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-test</artifactId>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-batch</artifactId>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-data-jpa</artifactId>
            </dependency>
            <dependency>
                <groupId>org.apache.tomcat</groupId>
                <artifactId>tomcat-jdbc</artifactId>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.10</version>
                <scope>test</scope>
            </dependency>
            <dependency>
                <groupId>javax.activation</groupId>
                <artifactId>javax.activation-api</artifactId>
                <version>1.2.0</version>
            </dependency>
            <dependency>
                <groupId>jakarta.persistence</groupId>
                <artifactId>jakarta.persistence-api</artifactId>
                <version>2.2.3</version>
            </dependency>
        </dependencies>
    
        <build>
            <plugins>
                <plugin>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-maven-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    
    </project>
    
    
    # Database configuration
    
    spring.datasource.driver-class-name=com.oceanbase.jdbc.Driver
    spring.datasource.url=jdbc:oceanbase://host:port/schema_name?characterEncoding=utf-8
    spring.datasource.username=user_name
    spring.datasource.password=
    
    # JPA
    spring.jpa.show-sql=true
    spring.jpa.hibernate.ddl-auto=update
    
    # Spring Batch
    spring.batch.job.enabled=false
    
    # Logging
    logging.level.org.springframework=INFO
    logging.level.com.example=DEBUG
    
    package com.oceanbase.example.batch;
    
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    
    @SpringBootApplication
    public class BatchApplication {
        public static void main(String[] args) {
            SpringApplication.run(BatchApplication.class, args);
        }
    
        public void runBatchJob() {
        }
    }
    
    
    package com.oceanbase.example.batch.config;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import com.oceanbase.example.batch.processor.AddPeopleDescProcessor;
    import com.oceanbase.example.batch.writer.AddDescPeopleWriter;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.launch.support.RunIdIncrementer;
    import org.springframework.batch.item.ItemProcessor;
    import org.springframework.batch.item.ItemReader;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.batch.item.database.JdbcCursorItemReader;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.ComponentScan;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;
    
    import javax.sql.DataSource;
    
    @Configuration
    @EnableBatchProcessing
    @SpringBootApplication
    @ComponentScan("com.oceanbase.example.batch.writer")
    @EnableAutoConfiguration
    public class BatchConfig {
        @Autowired
        private JobBuilderFactory jobBuilderFactory;
    
        @Autowired
        private StepBuilderFactory stepBuilderFactory;
    
        @Autowired
        private DataSource dataSource;// Use the default dataSource provided by Spring Boot auto-configuration
    
        @Bean
        public ItemReader<People> peopleReader() {
            JdbcCursorItemReader<People> reader = new JdbcCursorItemReader<>();
            reader.setDataSource((javax.sql.DataSource) dataSource);
            reader.setRowMapper(new BeanPropertyRowMapper<>(People.class));
            reader.setSql("SELECT * FROM people");
            return reader;
        }
    
        @Bean
        public ItemProcessor<People, PeopleDESC> addPeopleDescProcessor() {
            return new AddPeopleDescProcessor();
        }
    
        @Bean
        public ItemWriter<PeopleDESC> addDescPeopleWriter() {
            return new AddDescPeopleWriter();
        }
    
        @Bean
        public Step step1(ItemReader<People> reader, ItemProcessor<People, PeopleDESC> processor,
                          ItemWriter<PeopleDESC> writer) {
            return stepBuilderFactory.get("step1")
                    .<People, PeopleDESC>chunk(10)
                    .reader(reader)
                    .processor(processor)
                    .writer(writer)
                    .build();
        }
    
        @Bean
        public Job importJob(Step step1) {
            return jobBuilderFactory.get("importJob")
                    .incrementer(new RunIdIncrementer())
                    .flow(step1)
                    .end()
                    .build();
        }
    }
    
    package com.oceanbase.example.batch.model;
    
    public class People {
        private String name;
        private int age;
    
        // Getters and setters
    
        public String getName() {
            return name;
        }
    
        public void setName(String name) {
            this.name = name;
        }
    
        public int getAge() {
            return age;
        }
    
        public void setAge(int age) {
            this.age = age;
        }
        @Override
        public String toString() {
            return "People [name=" + name + ", age=" + age + "]";
        }
    }
    
    package com.oceanbase.example.batch.model;
    
    public class PeopleDESC {
        private String name;
        private int age;
        private String desc;
        private int id;
    
        public String getName() {
            return name;
        }
    
        public void setName(String name) {
            this.name = name;
        }
    
        public int getAge() {
            return age;
        }
    
        public void setAge(int age) {
            this.age = age;
        }
    
        public String getDesc() {
            return desc;
        }
    
        public void setDesc(String desc) {
            this.desc = desc;
        }
    
        public int getId() {
            return id;
        }
    
        public void setId(int id) {
            this.id = id;
        }
    
        @Override
        public String toString() {
            return "PeopleDESC [name=" + name + ", age=" + age + ", desc=" + desc + "]";
        }
    }
    
    package com.oceanbase.example.batch.processor;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.springframework.batch.item.ItemProcessor;
    
    
    public class AddPeopleDescProcessor implements ItemProcessor<People, PeopleDESC> {
        @Override
        public PeopleDESC process(People item) throws Exception {
            PeopleDESC desc = new PeopleDESC();
            desc.setName(item.getName());
            desc.setAge(item.getAge());
            desc.setDesc("This is " + item.getName() + " with age " + item.getAge());
            return desc;
        }
    }
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    
    import java.util.List;
    
    public class AddDescPeopleWriter implements ItemWriter<PeopleDESC> {
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Override
        public void write(List<? extends PeopleDESC> items) throws Exception {
            // Delete the table if it exists.
            jdbcTemplate.execute("DROP TABLE people_desc");
            // create table statement
            String createTableSql = "CREATE TABLE people_desc (id INT PRIMARY KEY, name VARCHAR2(255), age INT, description VARCHAR2(255))";
            jdbcTemplate.execute(createTableSql);
            for (PeopleDESC item : items) {
                String sql = "INSERT INTO people_desc (id, name, age, description) VALUES (?, ?, ?, ?)";
                jdbcTemplate.update(sql, item.getId(), item.getName(), item.getAge(), item.getDesc());
            }
        }
    }
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.People;
    import org.springframework.batch.item.ItemWriter;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Component;
    
    import java.util.List;
    
    @Component
    public class AddPeopleWriter implements ItemWriter<People> {
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Override
        public void write(List<? extends People> items) throws Exception {
            // Delete an existing table.
            jdbcTemplate.execute("DROP TABLE people");
            // CREATE TABLE statement
            String createTableSql = "CREATE TABLE people (name VARCHAR2(255), age INT)";
            jdbcTemplate.execute(createTableSql);
            for (People item : items) {
                String sql = "INSERT INTO people (name, age) VALUES (?, ?)";
                jdbcTemplate.update(sql, item.getName(), item.getAge());
            }
        }
    }
    
    
    package com.oceanbase.example.batch.config;
    
    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobParameters;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.junit4.SpringRunner;
    
    import org.springframework.batch.core.BatchStatus;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class BatchConfigTest {
        @Autowired
        private JobLauncher jobLauncher;

        @Autowired
        private Job job;

        // Minimal stand-in for the JobLauncherTestUtils class provided by spring-batch-test
        private class JobLauncherTestUtils {
            public JobExecution launchJob(JobParameters jobParameters) throws Exception {
                return jobLauncher.run(job, jobParameters);
            }
        }

        private final JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();

        @Test
        public void testJob() throws Exception {
            JobParameters jobParameters = new JobParametersBuilder()
                    .addString("jobParam", "paramValue")
                    .toJobParameters();

            JobExecution jobExecution = jobLauncherTestUtils.launchJob(jobParameters);

            Assert.assertEquals(BatchStatus.COMPLETED, jobExecution.getStatus());
        }
    }
    
    package com.oceanbase.example.batch.processor;
    
    import com.oceanbase.example.batch.model.People;
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.junit.Assert;
    import org.junit.jupiter.api.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.junit4.SpringRunner;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddPeopleDescProcessorTest {
        @Autowired
        private AddPeopleDescProcessor processor;
    
        @Test
        public void testProcess() throws Exception {
            People people = new People();
            people.setName("John");
            people.setAge(25);

            PeopleDESC desc = processor.process(people);

            Assert.assertEquals("John", desc.getName());
            Assert.assertEquals(25, desc.getAge());
            Assert.assertEquals("This is John with age 25", desc.getDesc());
        }
    }
    
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.PeopleDESC;
    import org.junit.Assert;
    import org.junit.jupiter.api.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.test.context.junit4.SpringRunner;
    
    import java.util.ArrayList;
    import java.util.List;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddDescPeopleWriterTest {
        @Autowired
        private AddDescPeopleWriter writer;
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Test
        public void testWrite() throws Exception {
    
            // Insert data into the people_desc table
            List<PeopleDESC> peopleDescList = new ArrayList<>();
            PeopleDESC desc1 = new PeopleDESC();
            desc1.setId(1);
            desc1.setName("John");
            desc1.setAge(25);
            desc1.setDesc("This is John with age 25");
            peopleDescList.add(desc1);
            PeopleDESC desc2 = new PeopleDESC();
            desc2.setId(2);
            desc2.setName("Alice");
            desc2.setAge(30);
            desc2.setDesc("This is Alice with age 30");
            peopleDescList.add(desc2);
            writer.write(peopleDescList);
    
            String selectSql = "SELECT COUNT(*) FROM people_desc";
            int count = jdbcTemplate.queryForObject(selectSql, Integer.class);
            Assert.assertEquals(2, count);
    
            // Output data from the people_desc table
            List<PeopleDESC> resultDesc = jdbcTemplate.query("SELECT * FROM people_desc", (rs, rowNum) -> {
                PeopleDESC desc = new PeopleDESC();
                desc.setId(rs.getInt("id"));
                desc.setName(rs.getString("name"));
                desc.setAge(rs.getInt("age"));
                desc.setDesc(rs.getString("description"));
                return desc;
            });
    
            System.out.println("people_desc table data:");
            for (PeopleDESC desc : resultDesc) {
                System.out.println(desc);
            }
    
            // Output information after the job is completed
            System.out.println("Batch Job execution completed.");
        }
    }
    
    package com.oceanbase.example.batch.writer;
    
    import com.oceanbase.example.batch.model.People;
    import org.junit.jupiter.api.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.test.context.junit4.SpringRunner;
    
    import java.util.ArrayList;
    import java.util.List;
    
    @RunWith(SpringRunner.class)
    @SpringBootTest
    public class AddPeopleWriterTest {
    
        @Autowired
        private AddPeopleWriter addPeopleWriter;
        @Autowired
        private JdbcTemplate jdbcTemplate;
    
        @Test
        public void testWrite() throws Exception {
            // Insert data into the people table
            List<People> peopleList = new ArrayList<>();
            People person1 = new People();
            person1.setName("zhangsan");
            person1.setAge(27);
            peopleList.add(person1);
            People person2 = new People();
            person2.setName("lisi");
            person2.setAge(35);
            peopleList.add(person2);
            addPeopleWriter.write(peopleList);
    
            // Query and output the result
            List<People> result = jdbcTemplate.query("SELECT * FROM people", (rs, rowNum) -> {
                People person = new People();
                person.setName(rs.getString("name"));
                person.setAge(rs.getInt("age"));
                return person;
            });
    
            System.out.println("people table data:");
            for (People person : result) {
                System.out.println(person);
            }
    
            // Output information after the job is completed
            System.out.println("Batch Job execution completed.");
        }
    }
    
    

    References

    For more information about OceanBase Connector/J, see OceanBase JDBC driver.
