Handling 20M Record Inserts in MySQL Without Async UI Changes

Neel Shah
September 9, 2025

Introduction

In modern enterprise systems like User Access Management (UAM), we often deal with millions of records in a single operation.
Recently, we faced a real challenge:

  • Store 20 million records into MySQL in a single API call.

Sounds straightforward? It wasn’t. Let’s break down why, and how we solved it.

Problem Statement

  • Requirement: Store 20M records in MySQL in one API call.
  • Constraints:
    • No async approach allowed → UI cannot be changed.
    • The current implementation attempts direct inserts but fails with performance bottlenecks and runtime issues.
  • Challenges faced:
    • Long transactions → DB locks.
    • Memory blowups.
    • Runtime conflicts such as ClassCastException and NoClassDefFoundError when frameworks tried to handle the massive payload.
    • Async queues (Kafka, RabbitMQ) → rejected because UI changes would be huge.

Why Direct Insert Fails

  • A single transaction with 20M rows is too heavy.
  • MySQL struggles with:
    • Transaction logs growing huge.
    • Index updates blocking other queries.
    • High replication lag in source-replica (master-slave) setups.
  • ORM frameworks (Hibernate, Sequelize, etc.) add overhead per row.

Solution Approaches

We explored multiple paths before landing on a working strategy:

1. Chunking Inside the Backend (No UI Change)

Even if the API receives all 20M records at once, the backend can split them into chunks of 50k–100k rows and insert them in a loop.

```javascript
// Example: Node.js bulk insert with chunking (assumes a mysql2/promise connection).
// `records` is an array of [id, name, role] value arrays, which is the shape
// the driver's bulk "VALUES ?" placeholder expects.
async function bulkInsert(records, connection) {
  const chunkSize = 50000;
  for (let i = 0; i < records.length; i += chunkSize) {
    const chunk = records.slice(i, i + chunkSize);
    await connection.query("INSERT INTO users (id, name, role) VALUES ?", [chunk]);
    console.log(`Inserted ${i + chunk.length} records so far...`);
  }
}
```

✅ Appears as one API call to UI.
✅ Keeps DB load balanced.
❌ Still slower than streaming solutions.
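
If partial failure is a concern, each chunk can also run in its own transaction (the "transactions per chunk" tuning mentioned later under Database Optimizations). A minimal sketch, again assuming a mysql2/promise connection:

```javascript
// Sketch: per-chunk transactions, assuming a mysql2/promise connection.
// A failing chunk rolls back alone instead of aborting one giant transaction.
async function bulkInsertTransactional(records, connection) {
  const chunkSize = 50000;
  for (let i = 0; i < records.length; i += chunkSize) {
    const chunk = records.slice(i, i + chunkSize);
    await connection.beginTransaction();
    try {
      await connection.query("INSERT INTO users (id, name, role) VALUES ?", [chunk]);
      await connection.commit();
    } catch (err) {
      await connection.rollback();
      throw err; // or log the failed offset and continue with the next chunk
    }
  }
}
```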

2. Redis as a Staging Layer

Instead of writing everything directly into MySQL, we first push records into Redis.

  • Step 1: Store all 20M in Redis (fast, in-memory).
  • Step 2: Backend service drains Redis into MySQL in smaller bulk inserts.

This prevents the API request from holding all 20M records in memory at once while still appearing synchronous to the UI.

```javascript
// Example: pushing each incoming record onto a Redis list
await redisClient.rpush("bulkRecords", JSON.stringify(record));
```
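
For Step 2, a small drain loop in the same backend service pulls batches back out of Redis and bulk-inserts them. A minimal sketch, assuming Redis 6.2+ (LPOP with a count), an ioredis-style client, and the same users table as above; the key name and batch size are illustrative:

```javascript
// Sketch: drain the Redis list into MySQL in bulk batches.
// Assumes Redis >= 6.2 (LPOP with a count) and an ioredis-style client.
async function drainToMySQL(redisClient, connection) {
  const batchSize = 5000; // illustrative; tune against your benchmarks
  for (;;) {
    const items = await redisClient.lpop("bulkRecords", batchSize);
    if (!items || items.length === 0) break; // queue drained
    // Records were stored as JSON strings; convert back to [id, name, role] rows.
    const rows = items.map((item) => {
      const r = JSON.parse(item);
      return [r.id, r.name, r.role];
    });
    await connection.query("INSERT INTO users (id, name, role) VALUES ?", [rows]);
  }
}
```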

3. MySQL LOAD DATA for Maximum Speed

The fastest way is to convert the incoming dataset into a CSV or TSV file, then stream it into MySQL with LOAD DATA INFILE.

```sql
LOAD DATA LOCAL INFILE '/tmp/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, name, role);
```

Why this works:

  • Avoids ORM overhead.
  • Uses MySQL’s internal optimizations.
  • Can insert millions of rows in minutes.
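
From Node.js, this means writing the payload to a temporary CSV and then issuing the statement through the driver. A minimal sketch, assuming mysql2 (whose infileStreamFactory query option supplies the file stream for LOCAL INFILE) and that local_infile is enabled on the server:

```javascript
// Sketch: LOAD DATA LOCAL INFILE from Node.js, assuming the mysql2 driver
// and a server with local_infile enabled. csvPath points to the file that
// the API handler wrote from the incoming payload.
const fs = require("fs");

async function loadCsv(connection, csvPath) {
  await connection.query({
    sql: `LOAD DATA LOCAL INFILE '${csvPath}'
          INTO TABLE users
          FIELDS TERMINATED BY ','
          LINES TERMINATED BY '\\n'
          (id, name, role)`,
    // mysql2 reads the file client-side and streams it to the server
    infileStreamFactory: () => fs.createReadStream(csvPath),
  });
}
```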

Database Optimizations

While applying these methods, we also tuned MySQL:

  • Disabled secondary indexes during the insert and re-enabled them afterwards (see the sketch after this list).
  • Increased innodb_buffer_pool_size.
  • Used transactions per chunk instead of per row.
  • Applied table partitioning for long-term scalability.
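
For reference, the index and buffer-pool part of that tuning looked roughly like the sketch below. One caveat: ALTER TABLE ... DISABLE KEYS only affects MyISAM tables, so on InnoDB "disabling" secondary indexes means dropping them before the load and re-adding them afterwards (the index name here is illustrative):

```sql
-- Sketch: drop the secondary index before the bulk load (InnoDB);
-- the index name is illustrative.
ALTER TABLE users DROP INDEX idx_users_role;

-- Give InnoDB a large buffer pool (resizable at runtime since MySQL 5.7.5);
-- a common rule of thumb is ~70% of RAM on a dedicated DB host.
SET GLOBAL innodb_buffer_pool_size = 8 * 1024 * 1024 * 1024;

-- ... run the chunked inserts / LOAD DATA here ...

-- Rebuild the secondary index once, after all rows are in.
ALTER TABLE users ADD INDEX idx_users_role (role);
```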

Key Takeaways

  • Don’t try to push 20M rows in one transaction → it’s a recipe for failure.
  • Use chunking, Redis staging, or LOAD DATA depending on your system constraints.
  • Keep UI unchanged while backend silently optimizes the workflow.
  • Always benchmark insert speeds and adjust chunkSize + DB configs.

Conclusion

Massive inserts in MySQL are possible — but not with brute force. When async queues aren’t an option, backend chunking + Redis staging + MySQL LOAD DATA provide a solid solution.

At the end of the day, it’s about balancing developer constraints, UI limitations, and database performance.

Contact Us

Thank you for reading our guide on "Handling 20M Record Inserts in MySQL Without Async UI Changes." We hope you found it insightful and valuable. If you have any questions, need further assistance, or are looking for expert support in developing and managing your backend and database projects, our team is here to help!

Reach out to us for your project needs:

🌐 Website: https://www.prometheanz.com

📧 Email: [email protected]


Copyright © 2025 PrometheanTech. All Rights Reserved.