Inserting a Large JSON Dataset (1000+ Rows)

Convert large JSON arrays with thousands of records into SQL by using batch sizing, which splits the data into multiple INSERT statements that stay within database limits.

Handling Large Datasets

When your JSON array contains hundreds or thousands of records, a single INSERT statement may exceed database limits. The bulk INSERT tool solves this with configurable batch sizing that splits your data into multiple INSERT statements.

The Problem

Most databases have practical limits on statement size:

Database      Typical limit
MySQL         max_allowed_packet (default 4 MB before 8.0, 64 MB in 8.0+)
PostgreSQL    ~1 GB per statement; parameterized queries cap at 65,535 bind parameters
SQLite        SQLITE_MAX_VARIABLE_NUMBER bound parameters (default 999; 32,766 since 3.32)
SQL Server    1,000 rows per VALUES clause in a single INSERT

The Solution: Batch Sizing

Set the batch size to 100 or 500 rows, and the tool generates multiple INSERT statements:

-- Batch 1 of 10
INSERT INTO events ("id", "action", "timestamp")
VALUES
  (1, 'login', '2024-01-01 09:00:00'),
  (2, 'click', '2024-01-01 09:01:00'),
  -- ... 98 more rows
  (100, 'logout', '2024-01-01 10:40:00');

-- Batch 2 of 10
INSERT INTO events ("id", "action", "timestamp")
VALUES
  (101, 'login', '2024-01-01 11:00:00'),
  -- ... and so on
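
The batching logic behind this output can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation; the function and helper names are invented for the example:

```python
def batched_inserts(table, rows, batch_size=100):
    """Split a list of row dicts into batches and yield one
    INSERT statement per batch."""
    columns = list(rows[0].keys())
    col_list = ", ".join(f'"{c}"' for c in columns)
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        values = ",\n  ".join(
            "(" + ", ".join(sql_literal(row[c]) for c in columns) + ")"
            for row in batch
        )
        yield f"INSERT INTO {table} ({col_list})\nVALUES\n  {values};"

def sql_literal(value):
    """Render a Python value as a SQL literal, doubling single quotes
    in strings. (Booleans, dates, and binary data would need extra
    handling in a real generator.)"""
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    return "'" + str(value).replace("'", "''") + "'"
```

Feeding 250 rows through this with `batch_size=100` yields three statements: two full batches of 100 and a final batch of 50.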

Choosing the Right Batch Size

  • 10-50 rows: Very safe for any database, useful for debugging
  • 100-500 rows: Good default for production imports
  • 1,000 rows: the hard maximum per INSERT for SQL Server; works well for PostgreSQL
  • Larger: Only recommended for PostgreSQL with well-tuned settings

Performance Tip

Wrap your batches in a transaction (BEGIN; ... COMMIT;) to avoid per-statement transaction overhead. The tool has a "Wrap in transaction" toggle for exactly this purpose.
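
Applied to the generated batches, transaction wrapping is just concatenation. A minimal sketch, with an illustrative function name:

```python
def wrap_in_transaction(statements):
    """Surround a sequence of SQL statements with BEGIN/COMMIT so all
    batches commit atomically as a single transaction."""
    return "BEGIN;\n\n" + "\n\n".join(statements) + "\n\nCOMMIT;"
```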

Use Case

You are migrating 50,000 event log records from a JSON export to MySQL. Setting the batch size to 500 generates 100 INSERT statements, each well within MySQL's packet limit, that you can execute sequentially in a migration script.
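
A migration script that replays the generated statements sequentially might look like the sketch below. It uses Python's stdlib sqlite3 module so the example is self-contained; with a MySQL driver the loop is the same. The function name and the events schema are assumptions for illustration:

```python
import sqlite3

def run_migration(conn, statements):
    """Execute the generated INSERT batches one by one, then commit
    once so the whole import succeeds or fails together."""
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    conn.commit()

# Example: replay two small batches into an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, action TEXT)")
run_migration(conn, [
    "INSERT INTO events (id, action) VALUES (1, 'login'), (2, 'click');",
    "INSERT INTO events (id, action) VALUES (3, 'logout');",
])
```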
