Entity Framework 6 vs. Dapper vs. stored procedures vs. raw SQL queries in SQL Server

dapper database entity-framework linq sql-server


I've been using EF with LINQ against SQL Server in my project for a long time.

I was looking for a way to make my database queries run faster. I had read a lot about Dapper and stored procedures being quicker than EF, so I added Dapper to the project along with a test method, but my tests produced odd results: EF, Dapper, and stored procedures all performed essentially the same, with no advantage to any of them.

I started by testing a query with many JOINs; EF, Dapper, and a stored procedure produced almost identical timings. So I decided to run a test against a single, simple table with no relations.

I have a ZipCodes table containing 43,200 records.

(screenshot: ZipCodes table)

I tested EF, Dapper, a stored procedure, and a query run directly in SQL Server, each with 1,000, 10,000, and 43,200 records.


Dapper:

string query = @"SELECT TOP (43200) [Zip], [City], [State], [Latitude],
    [Longitude], [TimeZone], [DST]
FROM [dbo].[ZipCodes]";

using (IDbConnection connection = new SqlConnection(_connectionString))
{
    return connection.Query<ZipCodes>(query).ToList();
}


EF:

var zip = db.ZipCodes.AsNoTracking().Take(43200).ToList();

Stored procedure:

    SELECT TOP (43200) 
        [Zip], [City], [State], [Latitude], [Longitude], [TimeZone], [DST] 
    FROM [dbo].[ZipCodes]

Query run directly in SQL Server:


SELECT TOP (43200) 
    [Zip], [City], [State], [Latitude], [Longitude], [TimeZone], [DST]
FROM [dbo].[ZipCodes]


I time each call with Stopwatch in the program:

string first = "", second = "", third = "";
var swatch = new System.Diagnostics.Stopwatch();

swatch.Start();
// ... run the Dapper query ...
first = swatch.Elapsed.ToString(@"m\:ss\.fff");

swatch = new Stopwatch();
swatch.Start();
// ... run the EF query ...
second = swatch.Elapsed.ToString(@"m\:ss\.fff");

and so forth

Results: (in milliseconds)

                       1000     10000      43200
EF                      107      1085       4527
Dapper                  139      1084       4036
Stored procedure        129      1089       4519
SQL query                 8        17         60

There is very little difference between EF, Dapper, and the stored procedure. Why is that?

And why are the queries issued from code 15–70 times slower than the same query run directly in SQL Server?

Is this normal or not?

3/17/2019 10:32:20 AM

Popular Answer

Switching to Dapper or ADO + stored procedures won't suddenly make slow EF-based code run faster. To solve a performance problem, you have to investigate and eliminate its causes.

At the top level, these performance problems come down to two main issues:

  • loading too much data
  • loading data too often

The main things I watch out for are (as a start; there are many more items, but these are the big wins):

  1. Lazy loading: the code loads a set of entities, then accesses each of the related entities individually after the initial load, triggering one extra query per access.

    • Hunting technique: Attach a SQL profiler to a database that only your debugging app instance is using. (Or a local database.)
    • Symptom: A flood of "SELECT TOP (1)..." queries following the main query that loads a single entity or collection.
    • Fix: The easiest fix is eager loading (.Include()) to load these collections up front. The better solution is using .Select() to load only the attributes the calling function needs.
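A minimal sketch of the difference, assuming a hypothetical EF6 model where an `Order` entity has a `Customer` navigation property (these names are illustrative, not from the question):

```csharp
// Lazy loading: one query for the orders, then one extra
// "SELECT TOP (1) ..." PER order when the loop touches order.Customer
// (the classic N+1 problem).
var orders = db.Orders.ToList();
foreach (var order in orders)
    Console.WriteLine(order.Customer.Name);

// Eager loading: a single JOIN brings the customers along.
var eager = db.Orders.Include(o => o.Customer).ToList();

// Better still: project only the columns the caller needs.
var projected = db.Orders
    .Select(o => new { o.Id, CustomerName = o.Customer.Name })
    .ToList();
```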
  2. Abuse of .ToList(): stray .ToList() calls can cause significant performance issues as systems mature, because developers hit an EF limitation that was "fixed" by calling .ToList(). Typically the problem arises when a developer tries to invoke a function inside a .Where() or .Select() expression. Because EF cannot translate these functions to a SQL equivalent, adding the .ToList() switches the query to Linq2Objects, and presto, it works, at the cost of loading the whole table into memory first.

    • Hunting technique: Search for instances of .ToList() and flag any that appear before a .Select(), .Where(), etc.
    • Symptom: Removing the extra .ToList() results in an EF error.
    • Fix: Check whether a DbFunctions equivalent exists for the problematic function. DateTime operations are a regular source of trouble, and DbFunctions covers most of them. Otherwise, locate the problematic function, create a view model for the expected data, and add a property to the view model that executes the problematic function.
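As an illustration of the trap and the fix, assuming a hypothetical `Orders` table with a `Created` column:

```csharp
// This throws NotSupportedException: EF6 cannot translate the
// TimeSpan arithmetic to SQL.
// var old = db.Orders
//     .Where(o => (DateTime.Now - o.Created).TotalDays > 30)
//     .ToList();

// Inserting .ToList() before the .Where() makes the error go away,
// but only by pulling the entire table into memory first.

// The real fix: DbFunctions.DiffDays translates to SQL's DATEDIFF,
// so the filter runs on the server.
var old = db.Orders
    .Where(o => DbFunctions.DiffDays(o.Created, DateTime.Now) > 30)
    .ToList();
```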
  3. Client-side pagination of entities: another development mistake made without enough test data. It is possible to write queries that essentially return all rows without considering the total record count. The data is shown with client-side grid pagination, and while it "works," it is very sluggish. It ran fine when there were only 200 rows in the database, but 50,000 records now make it stutter. (And it won't get better.)

    • Hunting technique: Any API or controller method that returns collections is worth examining. Do these queries use .ToList() or .Skip() + .Take()? Do these methods return view models or entities?
    • Symptom: Paginated lists take a long time on the initial load, but switching pages afterwards is fast.
    • Fix: Use server-side pagination for calls from pagination controls. This means passing sorting, page size, and page number to the server. The EF queries become far more efficient by loading only the rows the control needs to show. The code should also return view models for search results that contain only the visible columns plus the keys needed to load the full entity when necessary (for instance, when a user clicks a record to open it). Entities can be heavy when only a few fields need to be shown.
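A sketch of such a server-side page method against the question's ZipCodes table (the `ZipCodeRow` view model and method name are hypothetical):

```csharp
public List<ZipCodeRow> GetZipCodePage(int pageNumber, int pageSize)
{
    return db.ZipCodes
        .AsNoTracking()
        .OrderBy(z => z.Zip)               // Skip/Take requires a stable order
        .Skip((pageNumber - 1) * pageSize) // translated to SQL: only one page
        .Take(pageSize)                    // of rows ever leaves the database
        .Select(z => new ZipCodeRow        // view model: visible columns only
        {
            Zip = z.Zip,
            City = z.City,
            State = z.State
        })
        .ToList();
}
```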
  4. Database indexing: Has the database been monitored and maintained? Do indexes exist and are they maintained? Is the SQL Server database being backed up and its transaction log shrunk? These problems often arise in first implementations where a system goes live without any thought given to the database behind it, and as the system grows, the database that supports it gets no attention.

    • Hunting technique: Is there a dedicated DBA who oversees the database and certifies that it is operating at peak efficiency?
    • Symptoms: The database has received no DBA attention or examination. For instance, the application uses GUID PKs set to NEWID() or Guid.NewGuid() without any index maintenance, database indexes are not configured, the transaction log (.LDF) is 15 times larger than the database file (.MDF), etc.
    • Fix: Hire a DBA. If you are using GUID keys, switch to NEWSEQUENTIALID() and set up some periodic index maintenance jobs.
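For example, a T-SQL sketch of both fixes (the table, constraint, and index names are made up):

```sql
-- Replace a NEWID() default with NEWSEQUENTIALID() so new rows append
-- to the end of the clustered index instead of causing page splits:
ALTER TABLE dbo.Orders DROP CONSTRAINT DF_Orders_Id;
ALTER TABLE dbo.Orders
    ADD CONSTRAINT DF_Orders_Id DEFAULT NEWSEQUENTIALID() FOR Id;

-- Rebuild a fragmented index as part of a periodic maintenance job:
ALTER INDEX PK_Orders ON dbo.Orders REBUILD;
```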
3/17/2019 10:09:46 PM

Licensed under: CC-BY-SA with attribution
Not affiliated with Stack Overflow