Making tests faster by using only a partial database in Entity Framework Effort

effort entity-framework entity-framework-6 unit-testing

Question

Use case: We have a fairly large database (about 200 tables) used in a large legacy system. It's built with a database-first approach, with a single edmx file defining the entire database. We use xUnit and Effort for automated testing. The problem is that these tests are very slow: it takes something like 7-8 minutes to run our current test suite, even though test coverage isn't anywhere near where we want it to be.

I've noticed that if I create a smaller subset of the edmx file, by removing some tables that aren't needed, tests run faster.

I'm looking for a solution where for a particular test, or suite of tests, we can somehow make Effort only create the subset of tables that are needed (I think in many cases, we'll only need one table).

Currently we're setting up our connection like this:

connection = EntityConnectionFactory.CreateTransient("metadata=res://entities.csdl|res://entities.ssdl|res://entities.msl");

Is there some way (for instance, by running an XML transformation at runtime) to make Effort create only the data structures it needs for a subset of tables that we define?
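To make the question concrete, here is a minimal, hypothetical sketch of what such a runtime transformation might look like using LINQ to XML. The `MetadataSubset`/`FilterXml` names are invented for this example, and the filtering is deliberately naive: it only drops `EntitySet`/`EntityType` nodes by name, while a real implementation would also have to prune `Association`/`AssociationSet` elements and the corresponding msl mapping fragments.

```csharp
// Hypothetical sketch (MetadataSubset/FilterXml are invented names): strip
// EntitySet/EntityType nodes whose Name is not on a keep-list from one of
// the metadata documents (csdl or ssdl) at runtime.
using System;
using System.Linq;
using System.Xml.Linq;

public static class MetadataSubset
{
    public static string FilterXml(string metadataXml, string[] keepTables)
    {
        var doc = XDocument.Parse(metadataXml);

        // Remove every EntitySet/EntityType element not on the keep-list.
        // Extensions.Remove materializes the sequence first, so removing
        // while querying is safe.
        doc.Descendants()
           .Where(e => (e.Name.LocalName == "EntitySet"
                        || e.Name.LocalName == "EntityType")
                       && !keepTables.Contains((string)e.Attribute("Name")))
           .Remove();

        return doc.ToString();
    }
}
```

The filtered documents would then be saved to temp files and referenced by path (e.g. `metadata=C:\temp\sub.csdl|C:\temp\sub.ssdl|C:\temp\sub.msl`) in place of the `res://` resources when calling `EntityConnectionFactory.CreateTransient`. Keeping the reduced model consistent (foreign keys, associations, mapping fragments) is the hard part and is not handled here.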


Expert Answer

Disclaimer: I'm the owner of the project Entity Framework Effort

Our library has a feature that allows creating a restore point and rolling back to it.

So by using this trick, you call CreateRestorePoint() only once, after all tables have been created, and then start every test with RollbackToRestorePoint(). (There are several other ways to make it work, but I guess you get the point.)

It will without a doubt make your tests run A LOT faster, since the tables will not have to be created every time.

Here is an example:

var conn = Effort.DbConnectionFactory.CreateTransient();

using (var context = new EntityContext(conn))
{
    context.EntitySimples.Add(new EntitySimple { ColumnInt = 1 });
    context.EntitySimples.Add(new EntitySimple { ColumnInt = 2 });
    context.EntitySimples.Add(new EntitySimple { ColumnInt = 3 });
    context.SaveChanges();
}

// Create a RestorePoint that will save all current entities in the "Database"
conn.CreateRestorePoint();


// Make any change
using (var context = new EntityContext(conn))
{
    context.EntitySimples.RemoveRange(context.EntitySimples);
    context.SaveChanges();
}

// Rollback to the restore point to make more tests
conn.RollbackToRestorePoint();
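In an xUnit suite, this trick fits naturally into a shared class fixture: the schema is built once per test class, and each test constructor rolls back to the restore point. This is a sketch, not part of the original answer; `EntityContext` and `EntitySimple` are the types from the snippet above, and typing the connection as `Effort.Provider.EffortConnection` is an assumption about what `CreateTransient` returns.

```csharp
using System;
using Xunit;

// Sketch only: wiring the restore-point trick into xUnit.
public class EffortFixture : IDisposable
{
    public Effort.Provider.EffortConnection Connection { get; }

    public EffortFixture()
    {
        Connection = (Effort.Provider.EffortConnection)
            Effort.DbConnectionFactory.CreateTransient();

        // Force schema creation once for the whole test class...
        using (var context = new EntityContext(Connection))
        {
            context.Database.CreateIfNotExists();
        }

        // ...then remember this pristine state.
        Connection.CreateRestorePoint();
    }

    public void Dispose() => Connection.Dispose();
}

public class EntitySimpleTests : IClassFixture<EffortFixture>
{
    private readonly EffortFixture fixture;

    public EntitySimpleTests(EffortFixture fixture)
    {
        this.fixture = fixture;

        // Every test starts from the restore point instead of
        // rebuilding the tables.
        fixture.Connection.RollbackToRestorePoint();
    }

    [Fact]
    public void Insert_is_isolated_between_tests()
    {
        using (var context = new EntityContext(fixture.Connection))
        {
            context.EntitySimples.Add(new EntitySimple { ColumnInt = 1 });
            context.SaveChanges();
        }
    }
}
```

Note that a class fixture is shared by all tests in the class but xUnit runs those tests sequentially, so the rollback in the constructor cannot race with another test on the same connection.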

Popular Answer

Separate your unit tests from your integration tests. For integration tests you can use a real database and run them in higher environments (to save time), but in local environments you can make use of Faker/Bogus and NBuilder to generate massive amounts of data for unit tests.

https://dzone.com/articles/using-faker-and-nbuilder-to-generate-massive-data
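As a rough sketch of the Bogus approach (the `Customer` type and the rules below are invented for illustration, not taken from the linked article):

```csharp
// Sketch: generating bulk test data with Bogus. The Customer type and
// the rules are hypothetical examples.
using System.Collections.Generic;
using Bogus;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

public static class TestData
{
    public static List<Customer> Customers(int count) =>
        new Faker<Customer>()
            .RuleFor(c => c.Id, f => f.IndexFaker)          // sequential ids
            .RuleFor(c => c.Name, f => f.Name.FullName())   // random names
            .RuleFor(c => c.Email, (f, c) => f.Internet.Email(c.Name))
            .Generate(count);
}
```

Seeding a few thousand of these into an in-memory context is typically much cheaper than restoring a database snapshot per test.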

Another option is to create a resource file corresponding to your unit test cases: https://www.danylkoweb.com/Blog/the-fastest-way-to-mock-a-database-for-unit-testing-B6

I would also suggest you take a look at InMemory provider vs SQLite performance: http://www.mukeshkumar.net/articles/efcore/unit-testing-with-inmemory-provider-and-sqlite-in-memory-database-in-ef-core

Although the above example is for EF Core, SQLite can also be used with EF6: https://www.codeproject.com/Tips/1056400/Setting-up-SQLite-and-Entity-Framework-Code-First

So my recommendation is to go with SQLite for integration-testing scenarios. For unit tests you can go either with SQLite or with Faker/Bogus and NBuilder.

Hope it helps!




Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow