As stated in the comments, Entity Framework (I'm not sure about Entity Framework Core) by default calls `DetectChanges` on every `Add`. This function, among other things, scans all entities already tracked by the context to detect changes in and between them. That means the time complexity of a single call is O(n), where n is the number of entities already tracked by the context. When you do a lot of adds in a loop, the total time complexity becomes O(n^2), where n is the total number of items added. So even with small numbers like 3,000 rows, performance drops very significantly.
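To make the shape of the problem concrete, here is a minimal sketch of the slow pattern (Entity Framework 6 API); `Order` and `OrdersContext` are hypothetical stand-ins for whatever your entity and context actually are:

```csharp
using System.Data.Entity;

// Hypothetical entity and context (stand-ins for the question's own types).
public class Order
{
    public int Id { get; set; }
    public int Number { get; set; }
}

public class OrdersContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public static class SlowInsertDemo
{
    public static void Run()
    {
        using (var context = new OrdersContext())
        {
            for (int i = 0; i < 3000; i++)
            {
                // Each Add triggers DetectChanges, which scans every entity
                // tracked so far -- the loop as a whole is O(n^2).
                context.Orders.Add(new Order { Number = i });
            }
            context.SaveChanges();
        }
    }
}
```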
To fix this (arguably questionable) design issue, there are a couple of options:
- Set `AutoDetectChangesEnabled` on the context to `false`, then manually call `DetectChanges` before `SaveChanges`; or
- use `AddRange` instead of adding entities one by one; it calls `DetectChanges` just once.
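Here is a minimal sketch of both options, reusing the hypothetical `Order`/`OrdersContext` types from above (EF6 API):

```csharp
using System.Linq;

public static class FastInsertDemo
{
    // Option 1: disable automatic change detection, then call it once manually.
    public static void InsertWithManualDetectChanges()
    {
        using (var context = new OrdersContext())
        {
            context.Configuration.AutoDetectChangesEnabled = false;

            for (int i = 0; i < 3000; i++)
            {
                context.Orders.Add(new Order { Number = i }); // no scan per Add
            }

            // One O(n) scan instead of 3000. Note that with the flag off,
            // SaveChanges will not call DetectChanges for you.
            context.ChangeTracker.DetectChanges();
            context.SaveChanges();
        }
    }

    // Option 2: AddRange calls DetectChanges only once for the whole batch.
    public static void InsertWithAddRange()
    {
        using (var context = new OrdersContext())
        {
            var orders = Enumerable.Range(0, 3000)
                                   .Select(i => new Order { Number = i })
                                   .ToList();

            context.Orders.AddRange(orders);
            context.SaveChanges();
        }
    }
}
```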
A few other notes:
Try to avoid reusing a context between operations. You said there were already 3,000 entities tracked by the context before you called the first `Add`. It's better to create a new context every time you need one, do the work, and then dispose of it. The performance impact is negligible (connections are managed by the connection pool and are not necessarily opened or closed every time you create/dispose a context), and you will have far fewer problems like this one (reusing a context can bite you not only in the scenario you have now, but in several others).
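For illustration, a short-lived context per unit of work might look like this (again with the hypothetical types from above):

```csharp
public static class ShortLivedContextDemo
{
    // Create the context, do the work, dispose it. Disposal does not
    // necessarily close the physical connection; it returns to the pool.
    public static void AddOrder(int number)
    {
        using (var context = new OrdersContext())
        {
            context.Orders.Add(new Order { Number = number });
            context.SaveChanges();
        }
    }
}
```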
Use `AsNoTracking` queries if you do not intend to modify the entities returned by a specific query (or if you intend to modify some of them later by attaching them to a context). The context will then not track them, which reduces the chance of the performance problem described here, among others.
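A sketch of a no-tracking query, with the same hypothetical types and a made-up filter condition:

```csharp
using System.Collections.Generic;
using System.Data.Entity; // EF6 namespace (DbContext, AsNoTracking)
using System.Linq;

public static class NoTrackingDemo
{
    public static List<Order> GetLargeOrders()
    {
        using (var context = new OrdersContext())
        {
            // The returned entities are not added to the change tracker,
            // so they never participate in later DetectChanges scans.
            return context.Orders
                          .AsNoTracking()
                          .Where(o => o.Number > 1000) // hypothetical filter
                          .ToList();
        }
    }
}
```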
As for LINQ to SQL: it has a similar concept of "detect changes", but it is invoked automatically only before committing changes to the database, not on every add, so you do not see the same problem there.