When working with large datasets in .NET, it's tempting to insert records with `foreach` loops or raw SQL. But Entity Framework Core (EF Core) offers a better way: efficient, clean, and scalable.
This guide shows how to handle bulk inserts in EF Core the right way — especially when inserting hundreds or thousands of records.
The Common Pitfalls (What NOT to Do)
❌ 1. Insert One-by-One in a Loop
```csharp
foreach (var product in products)
{
    dbContext.Products.Add(product);
    dbContext.SaveChanges(); // ❌ Very slow: one DB call per insert
}
```
- Causes N separate database calls
- Slows down drastically as data grows
- Risk of transaction bloat and timeouts
❌ 2. Manually Build Raw SQL
```sql
INSERT INTO Products (Name, Price) VALUES ('A', 10), ('B', 20); -- ❌ Hard to maintain
```
- Risk of SQL injection if not parameterized
- No benefit from EF tracking
- Difficult to integrate with domain logic
The Recommended Approach
✅ Use AddRange() + SaveChanges()
```csharp
var products = new List<Product>();
for (int i = 0; i < 1000; i++)
{
    products.Add(new Product { Name = $"Product {i}", Price = 10.0m });
}

dbContext.Products.AddRange(products);
await dbContext.SaveChangesAsync(); // ✅ Sent as batched commands, not 1,000 calls
```
- Fast and clean
- Far fewer DB round-trips: EF Core batches the INSERT statements (tunable, as shown below)
- Works well for up to ~10,000 rows
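EF Core batches these inserts automatically. On the SQL Server provider you can also tune how many statements are grouped into each command batch. A minimal sketch, where `connectionString` is a placeholder:

```csharp
// Optional tuning in your DbContext: cap the number of statements
// EF Core groups into a single command batch (SQL Server provider).
// connectionString is a placeholder for your own configuration.
protected override void OnConfiguring(DbContextOptionsBuilder options)
    => options.UseSqlServer(connectionString, sql => sql.MaxBatchSize(1000));
```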
✅ Handling Very Large Datasets (Chunking)
When inserting more than 10,000 records, chunk your data:
```csharp
const int batchSize = 500;
var chunks = products.Chunk(batchSize); // .NET 6+

foreach (var chunk in chunks)
{
    dbContext.Products.AddRange(chunk);
    await dbContext.SaveChangesAsync();
    dbContext.ChangeTracker.Clear(); // 🚀 Prevent memory bloat from tracked entities
}
```
- Keeps memory usage low
- Avoids EF tracking overload
- Improves throughput for very large inserts
`.Chunk()` is available in .NET 6+. For earlier versions, roll your own batching logic, as sketched below.
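On .NET 5 or earlier, a small extension method can stand in for `.Chunk()`. A minimal sketch; the `Batch` name is illustrative, not a framework API:

```csharp
using System.Collections.Generic;

public static class EnumerableExtensions
{
    // Yields the source in lists of at most `size` items (a .Chunk() stand-in).
    public static IEnumerable<List<T>> Batch<T>(this IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;
                batch = new List<T>(size); // start a fresh batch
            }
        }

        if (batch.Count > 0)
            yield return batch; // emit the final partial batch
    }
}
```

Usage mirrors the chunked loop above: `foreach (var chunk in products.Batch(500)) { ... }`.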
✅ Tips
| Technique | When to Use | Notes |
| --- | --- | --- |
| `AddRange` + `SaveChanges` | Up to ~10k inserts | Safe and simple |
| Chunked batches | 10k+ records | Keeps change-tracker and memory usage in check |
| EFCore.BulkExtensions | High-performance bulk ops (100k+) | External lib, extremely fast (sketch below) |
| `COPY` (PostgreSQL) / TVPs | DB-level jobs or imports | Not EF-based, advanced use case |
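For the 100k+ range, EFCore.BulkExtensions skips the change tracker entirely and uses the database's native bulk mechanism (SqlBulkCopy on SQL Server). A minimal sketch, assuming the EFCore.BulkExtensions NuGet package is installed:

```csharp
using EFCore.BulkExtensions;

// One bulk operation; entities are streamed to the database,
// not tracked individually by the DbContext.
await dbContext.BulkInsertAsync(products);
```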
✅ Prevent Abuse in APIs
To avoid overload from users uploading massive data (a concrete endpoint sketch follows this list):
- ✅ Limit payloads to 100–1000 items per request
- ✅ Use ModelState or middleware to reject oversized payloads
- ✅ Apply rate-limiting to prevent abuse
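To make the first two guards concrete, here is an illustrative ASP.NET Core endpoint; binding the entity type directly is a simplification, and the route and 1,000-item cap are example choices, not fixed rules:

```csharp
[HttpPost("products/bulk")]
public async Task<IActionResult> BulkCreate([FromBody] List<Product> items)
{
    const int maxItems = 1000; // example cap; tune for your workload

    if (items is null || items.Count == 0)
        return BadRequest("Payload must contain at least one item.");

    if (items.Count > maxItems)
        return BadRequest($"Payload exceeds the {maxItems}-item limit.");

    dbContext.Products.AddRange(items);
    await dbContext.SaveChangesAsync();
    return Ok();
}
```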
📎 Summary
EF Core can handle large inserts efficiently — if you use it right.
- Use `AddRange()` for natural batching
- Avoid per-record inserts and raw SQL
- Split into chunks for massive loads
- Clear the change tracker after each batch
If you found this helpful, consider supporting my work at ☕ Buy Me a Coffee.