OceanBase Loader and Dumper

V4.3.3

Error handling

Last Updated: 2025-03-05 02:55:22

If the imported data contains bad records or discarded records, OBLOADER V4.2.4 and later let you control whether such dirty data affects the exit status of the process. You can then fix the errors manually based on the generated error files.

This topic describes how OBLOADER handles errors that occur during data import, and the related command-line options.

Error type

Valid errors

Data errors

  • Data type mismatch: for example, VARCHAR data is inserted into an INT column.
  • Invalid NULL value: a NULL value is inserted into a column with a NOT NULL constraint.
  • Data type overflow: a string or integer value exceeds the limit of the column type.
  • Data preprocessing failure: a call to a preprocessing function defined in the control file failed.
  • Column count mismatch: the number of columns in the file does not match the number of columns in the table.
  • Duplicate primary key or unique key: the record duplicates an existing primary key or unique key value in the table.

Note

  • When a data error occurs during data import, the import task cannot be retried.
  • Details of data type mismatch, invalid NULL value, data type overflow, data preprocessing failure, and column count mismatch errors are recorded in the {ob-loader-dumper}/logs/ob-loader-dumper.bad file. Details of duplicate primary key or unique key errors are recorded in the {ob-loader-dumper}/logs/ob-loader-dumper.discard file.

Environment errors

  • Network errors:
    • The connection times out.
    • The connection is dropped.
    • An I/O operation times out.
  • Database system errors:
    • The transaction is terminated.
    • The transaction is rolled back.
    • The tenant memory reaches the threshold.
    • The server times out.

Note

  • When an environment error occurs during data import, a single import task can be retried at most five times. The number of retries cannot be modified. After an import task reaches the retry threshold, the import task is marked as failed and the process is terminated.
  • If an environment error occurs when a table without a primary key or unique key is imported, the import task will not be retried and the current subtask is marked as failed to avoid reinserting the same batch of data.
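The retry policy described above can be sketched as a simple loop. This is illustrative only; `run_import_subtask` is a hypothetical stand-in for one OBLOADER import subtask, stubbed here to always fail so that all retries are exercised:

```shell
# Sketch of the documented retry policy: an environment error is retried
# up to 5 times, after which the subtask is marked as failed.
run_import_subtask() { return 1; }   # hypothetical stub: always fails

max_retries=5
attempt=1
status=failed
while [ "$attempt" -le "$max_retries" ]; do
  if run_import_subtask; then
    status=succeeded
    break
  fi
  attempt=$((attempt + 1))
done
echo "subtask $status after $((attempt - 1)) attempt(s)"
# prints: subtask failed after 5 attempt(s)
```

A real data error or a missing primary key would skip this loop entirely, since those cases are never retried.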

Errors that cannot be handled

  • Source file inaccessible: for example, the file is damaged or the disk access privilege is not granted.
  • Syntax errors in the source file: for example, a CSV file has no delimiter or an SQL file has no line break.
  • Unable to connect to the database: for example, the network between the client and the server is disconnected, or a connection parameter is incorrect.
  • OS errors: for example, a JVM out-of-memory (OOM) error occurs or the process is unexpectedly terminated.

Note

When an error that cannot be handled occurs during data import, the import task cannot be retried and is marked as failed, and the process is terminated.

Best practices

The following example imports data that contains known errors and shows how to locate and fix them:

  1. Prepare data.

    Sample code:

    DROP TABLE IF EXISTS `example`;
    CREATE TABLE `example` (c1 TINYINT PRIMARY KEY, c2 VARCHAR(12) NOT NULL UNIQUE);
    
    INSERT INTO `example` (c1, c2) VALUES
     (0, NULL),              -- Violates the NOT NULL constraint on c2.
     (1, 'one'),
     (2, 'two'),
     (40, 'forty'),
     (50, 'fifty-four'),
     (77, 'seventy-seven'),  -- The string exceeds the 12-character limit of c2.
     (600, 'six hundred'),   -- The value is beyond the range of the TINYINT type.
     (40, 'forty'),          -- Duplicate primary key: conflicts with (40, 'forty').
     (42, 'fifty-four');     -- Duplicate unique key: conflicts with 'fifty-four'.
    
  2. Use OBLOADER to import the data, and use the --strict option to control whether bad and discarded records cause a failure exit status.

    Sample code:

    $./obloader -h xx.x.x.x -P 2883 -u test -p ****** --sys-user **u*** --sys-password ****** -c cluster_a -t mysql -D USERA --csv --table 'example'  -f /output --strict
    

    The default value of the --strict option is true, which means the tool exits with a failure status (System exit 1) when the imported data contains bad records or discarded records. To prevent dirty data from affecting the exit status (System exit 0), set the --strict option to false. You can combine --strict with the --max-discards or --max-errors option so that the tool skips errors and continues as long as the number of discarded records or errors stays within the specified limit.
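    For example, to let the import finish with a success exit status as long as the dirty data stays within given limits, the invocation from step 2 could be relaxed as follows. The connection parameters are placeholders as above, and the limit values (10 each) are hypothetical:

    ```shell
    # Tolerate up to 10 bad records and 10 discarded records without
    # failing the process (illustrative values; adjust to your data).
    ./obloader -h xx.x.x.x -P 2883 -u test -p '******' -c cluster_a -t mysql -D USERA \
        --csv --table 'example' -f /output \
        --strict 'false' --max-errors 10 --max-discards 10
    ```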

  3. Run OBLOADER. The tool skips the four problematic records and imports the remaining data.

    Sample result:

    ...
    2023-07-11 06:25:22 [WARN] Bad records were found in file: "/home/admin/obloaderobdumper/output/data/USERA/TABLE/example.1.0.csv". Check "ob-loader-dumper.bad" for details
    2023-07-11 06:25:22 [WARN] Skipped 4 problematic records for table `USERA`.`example`. Check "logs" folder for details
    2023-07-11 06:25:24 [INFO] Drain and halt the worker group finished
    2023-07-11 06:25:24 [INFO] Close connections: 26 of the BIZ DataSource.
    2023-07-11 06:25:24 [INFO] Shutdown task context finished
    2023-07-11 06:25:24 [ERROR] Error: Skipped 4 problematic records for table `USERA`.`example`. Check "logs" folder for details
    2023-07-11 06:25:24 [INFO] ----------   Finished Tasks: 1       Running Tasks: 0        Progress: 100.00%       ----------
    2023-07-11 06:25:24 [INFO]
    
    All Load Tasks Finished:
    
    ----------------------------------------------------------
    |  No.#  | Type   | Name       | Count       | Status    |
    ----------------------------------------------------------
    |  1     | TABLE  | example    | 9 -> 5      | FAILURE   |
    ----------------------------------------------------------
    
    Total Count: 5          End Time: 2023-07-11 06:25:24
    
    2023-07-11 06:25:24 [INFO] Load record finished. Total Elapsed: 3.281 s
    2023-07-11 06:25:24 [ERROR] System exit 1
    ...
    

    The data that was successfully imported:

    ...
    SELECT * FROM `example`;
    +----+------------+
    | c1 | c2         |
    +----+------------+
    |  0 | NULL       |
    |  1 | one        |
    |  2 | two        |
    | 40 | forty      |
    | 50 | fifty-four |
    +----+------------+
    5 rows in set (0.05 sec)
    ...
    
  4. View error details.

    Note

    • View details of data type mismatch and data type overflow errors in the {ob-loader-dumper}/logs/ob-loader-dumper.bad file.
    • View details of duplicate primary key or unique key errors in the {ob-loader-dumper}/logs/ob-loader-dumper.discard file.
    ...
    cd /output/logs
    cat ob-loader-dumper.bad
    INSERT INTO example (`c1`,`c2`) VALUES ('77','seventy-seven');
    Cause: Data too long for column 'c2' at row 1   
    
    INSERT INTO example (`c1`,`c2`) VALUES ('600','six hundred');
    Cause: Out of range value for column
    
    cat ob-loader-dumper.discard
    INSERT INTO example (`c1`,`c2`) VALUES ('40','forty');
    INSERT INTO example (`c1`,`c2`) VALUES ('42','fifty-four');
    ...
    
  5. Manually fix such errors based on the error messages.
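    For instance, the skipped records above could be corrected and re-inserted along these lines. The fixes are illustrative only; choose changes that match your actual data and schema:

    ```sql
    -- Widen c2 so that 'seventy-seven' (13 characters) fits, then re-insert.
    ALTER TABLE `example` MODIFY c2 VARCHAR(16) NOT NULL;
    INSERT INTO `example` (c1, c2) VALUES (77, 'seventy-seven');

    -- 600 exceeds the TINYINT range; widen c1 (or correct the value instead).
    ALTER TABLE `example` MODIFY c1 SMALLINT;
    INSERT INTO `example` (c1, c2) VALUES (600, 'six hundred');

    -- The discarded records conflict with existing keys; drop them,
    -- or re-key them if they are genuinely new rows, for example:
    INSERT INTO `example` (c1, c2) VALUES (42, 'forty-two');
    ```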
