
Performance issues

Dec 6, 2012 at 10:11 AM

Hi, Tamas.

I'm using Effort to unit test a database.  What I'm seeing is that as the database grows in size each unit test takes longer.

Over the last few weeks we've seen the average time per test rise from about a quarter of a second to several seconds.

Most of the tests are pretty simple (get this record from the database or update this record).

There have been quite sharp jumps in the test durations, corresponding to the times when I added tables and/or fields to the entities.  There doesn't appear to be a noticeable effect as new CSV files are added to the tests or as additional lines are added, which implies to me that the bulk of the test time is taken up by the creation of the schema rather than the loading of the CSV, though I've made no attempt to profile.

The database currently has around 30 tables, all but 5 or so with defined classes.

The CSV files contain 10K of data in total.

With 200 unit tests (and rising), for each to take seconds is becoming a matter for concern.

Any suggestions (or investigations) are most welcome.

Dec 6, 2012 at 10:48 AM

Hi Iain,

Currently the main focus is on correctness, but I am aware that Effort and NMemory should perform better. For now, I suggest enabling parallel unit test execution. If you use transient databases, there is nothing more to do. If you use persistent databases, you should ensure that each unit test uses a unique identifier (e.g. a GUID) for its database. It is also highly recommended to clean up the persistent database after each test executes (DbContext.Database.Delete, ObjectContext.DeleteDatabase). I would use dedicated methods annotated with the TestInitialize and TestCleanup attributes (or the equivalents in other unit test libraries) to achieve this.
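As a minimal MSTest sketch of the pattern above (the connection factory calls are from Effort's public API; `MyContext` stands in for your own DbContext type):

```csharp
using System;
using System.Data.Common;
using Effort;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositoryTests
{
    private DbConnection connection;
    private MyContext context; // placeholder for your own DbContext type

    [TestInitialize]
    public void Initialize()
    {
        // A unique id per test keeps parallel tests from sharing state
        string databaseId = Guid.NewGuid().ToString();
        this.connection = DbConnectionFactory.CreatePersistent(databaseId);
        this.context = new MyContext(this.connection);
    }

    [TestCleanup]
    public void Cleanup()
    {
        // Drop the persistent in-memory database so it does not leak
        // into later test runs
        this.context.Database.Delete();
        this.context.Dispose();
        this.connection.Dispose();
    }
}
```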

I am not sure where the bottlenecks are, but I think you should also try wrapping your CsvDataLoader in a CachingDataLoader. This may speed up database initialization a little.
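For reference, the wrapping would look roughly like this (the CSV path is a placeholder):

```csharp
using System.Data.Common;
using Effort;
using Effort.DataLoaders;

// CachingDataLoader materializes the wrapped loader's content once and
// replays it for each subsequently created database, so the CSV files
// are only parsed a single time
IDataLoader csvLoader = new CsvDataLoader(@"C:\path\to\csv\files"); // placeholder path
IDataLoader cachedLoader = new CachingDataLoader(csvLoader);

DbConnection connection = DbConnectionFactory.CreateTransient(cachedLoader);
```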

However, if you have some free time to spend detecting bottlenecks with a performance profiler, I would gladly receive any analysis results.

Dec 6, 2012 at 12:25 PM

I'm not sure what you mean by persistent databases.  Is it possible for Effort to save state between context creations?  Or do you mean running against SQL Server or equivalent?  The former would be nice.

I will look at the CachingDataLoader.

Parallel unit tests are a good idea, but the current team development environment is a (virtual) dual core machine with limited memory and 5 users.  Parallelisation is not realistic (and would probably make matters worse)!  Hopefully, that's being addressed but it's a bit annoying at the moment.


Dec 6, 2012 at 12:31 PM

The former. All the factory components have CreateTransient and CreatePersistent methods; I wrote a blog post about this.
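As a sketch, the two variants differ only in whether state survives the connection (names as in Effort's DbConnectionFactory):

```csharp
using System.Data.Common;
using Effort;

// Transient: the database lives only as long as this connection
DbConnection transient = DbConnectionFactory.CreateTransient();

// Persistent: every connection created with the same instance id sees
// the same in-memory database, across context creations
DbConnection first = DbConnectionFactory.CreatePersistent("my-db");
DbConnection second = DbConnectionFactory.CreatePersistent("my-db");
```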

Dec 12, 2012 at 1:46 PM

I've done a little bit more work on this.

Specifically, I've written a test program which includes my CSV test data and simply runs up the database and gets a count from one of the tables.

The time to run the program (about 2 seconds) is more or less entirely spent in DbCreateDatabase.  There are no visible resources allocated to loading the CSV file that I can see.
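The repro was roughly of this shape (a sketch; `MyContext`, the `Customers` set, and the CSV path are placeholders for the real test code):

```csharp
using System;
using System.Data.Common;
using System.Diagnostics;
using System.Linq;
using Effort;
using Effort.DataLoaders;

class Program
{
    static void Main()
    {
        var loader = new CsvDataLoader(@"C:\path\to\csv\files"); // placeholder path
        DbConnection connection = DbConnectionFactory.CreateTransient(loader);

        var stopwatch = Stopwatch.StartNew();
        using (var context = new MyContext(connection)) // placeholder context type
        {
            // The first query triggers schema creation (DbCreateDatabase)
            // and the CSV data load
            int count = context.Customers.Count();
            Console.WriteLine("{0} rows in {1} ms", count, stopwatch.ElapsedMilliseconds);
        }
    }
}
```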

What I'd like to propose is that you split the DB generation and data load.

So we would create the empty in-memory DB at class initialize and then, for each unit test, clear the tables and load the CSV files.  I expect this would dramatically improve the per-test performance.
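One way to approximate this proposal today is a persistent database created once per test class and re-seeded per test — a sketch only, where `MyContext` is a placeholder and `TestData.DeleteAndReseed` is a hypothetical helper (not part of Effort) that deletes all rows and re-inserts the CSV data through the context:

```csharp
using System;
using System.Data.Common;
using Effort;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ProposedPatternTests
{
    private static DbConnection connection;

    [ClassInitialize]
    public static void ClassInitialize(TestContext testContext)
    {
        // Pay the schema-creation cost once per class: a persistent
        // database, created empty (no data loader supplied)
        connection = DbConnectionFactory.CreatePersistent(Guid.NewGuid().ToString());
        using (var context = new MyContext(connection)) // placeholder context type
        {
            context.Database.CreateIfNotExists();
        }
    }

    [TestInitialize]
    public void TestInitialize()
    {
        // Per test: clear the tables and re-seed from the CSV data.
        // DeleteAndReseed is a hypothetical helper for illustration.
        TestData.DeleteAndReseed(connection);
    }
}
```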