A Notable Entity Framework Performance Eater: Casting varchar to nvarchar Under the Covers

Brian Sullivan was having a weird performance problem with one particular EF query and asked about it here on Stack Overflow. Some very knowledgeable people asked all the right questions. It made no sense.

He finally discovered the source of the problem himself.

His Code First model defined a string property, which Entity Framework presumes maps to an nvarchar column. But his database column was a varchar. Since the two types are coercible, the type got coerced under the covers, and that coercion was very expensive. Brian fixed the problem by applying a mapping that just said “hey EF, that’s a varchar by the way” and all was well again.
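The fix can be expressed either as a data annotation or with the fluent API. Here is a minimal sketch, assuming a hypothetical `Customer` class with a `Name` property (both names are invented for illustration):

```csharp
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;

public class Customer
{
    public int Id { get; set; }

    // Data annotation: tell EF the column is varchar, not nvarchar.
    [Column(TypeName = "varchar")]
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Or the fluent API equivalent of the annotation above:
        modelBuilder.Entity<Customer>()
            .Property(c => c.Name)
            .HasColumnType("varchar");
    }
}
```

With either mapping in place, EF sends the query parameter as varchar, so SQL Server can seek the index on the column instead of converting every row to nvarchar for the comparison.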

This is one of those awful problems that is hard to solve and hard to do a web search for. But hopefully the SO question or this blog post will help the next dev coming down the pike.

Brian has written his own, more detailed blog post about this: Entity Framework Code-First Performance Issue with String Queries


19 thoughts on “A Notable Entity Framework Performance Eater: Casting varchar to nvarchar Under the Covers”

  1. Quite interesting! I have many varchars in my SQL Server database. I never knew there was this performance penalty when using varchars. Should I stick with nvarchars by default in my new tables? Thanks

  2. @James – There’s no perf penalty for using varchars per se. The performance problem occurred because EF assumed that my column was nvarchar, since .NET strings are Unicode and I hadn’t otherwise annotated it. So, when the DB processed the query, it had to convert every value in my table from ASCII to Unicode in order to do the comparison the generated query was asking it to.

    Long story short, as long as you properly annotate your models (or use Database First), varchars will perform just fine.

  3. We never use nvarchar. Is there any way to globally tell EF to use varchar instead of nvarchar, or does one have to decorate every string?
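In EF6 and later (this thread predates it), lightweight custom conventions can apply the mapping model-wide so that no per-property decoration is needed. A sketch, assuming EF6’s `DbModelBuilder.Properties<T>()` convention API:

```csharp
using System.Data.Entity;

public class ShopContext : DbContext
{
    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map every string property in the model to varchar.
        // Individual properties can still override this with an
        // explicit [Column(TypeName = "nvarchar")] or fluent mapping.
        modelBuilder.Properties<string>()
            .Configure(p => p.HasColumnType("varchar"));
    }
}
```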

  4. At the moment I’m using Code First that was "reverse engineered" from an existing database using the EF 4.x DbContext Generator for C#. Most of the types in the database are varchar, not nvarchar. The generated classes had no annotations to indicate they are varchar. Was type conversion taking place in this case? I didn’t notice any performance improvement when I added the annotations.

  5. Hi James,

    If you used the reverse engineer for code first, all of the configurations will be created using the fluent API, not data annotations. You should see a folder called mappings and in there a mapping file for each class that was created. Can you see the varchar mappings in there?
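For anyone following along, a varchar mapping in one of those generated files would look something like this sketch (the property name and length are placeholders):

```csharp
// Inside the generated mapping class for the entity:
this.Property(t => t.LastName)
    .HasColumnType("varchar")   // the piece that prevents the nvarchar coercion
    .HasMaxLength(50);          // length is illustrative; match your column
```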

  6. Thanks for the reply, Julie. Yes, all of the configurations were created using the fluent API, not data annotations. There were NO varchar mappings at all!! Does this mean that EF is doing type conversion without these mappings?

    A separate question – can I use the fluent API and data annotations concurrently, or do I have to use one over the other?

  7. I can’t answer the first authoritatively without setting up an example and seeing how far I can get into the internals. On the second: yes, but make sure you have a good reason to do so (it’s your app, so you are the best judge of this), because you will pay a price when it’s time to maintain or extend your app. 🙂

  8. You’re right that it’s not a good idea to mix both types of configuration.

    Just to give you a short snippet, this is what I got with reverse engineering. FirstName, LastName, and Locality were all varchar, not nvarchar, yet there are still no varchar mappings. Originally I was using Database First on this database, then I converted the model to Code First.

    this.Property(t => t.FirstName)
    this.Property(t => t.LastName)
    this.Property(t => t.Locality)

  9. I think this is worth raising on entityframework.codeplex.com, since they are working on that tool. Even though EF can coerce it, because of the perf problem it would be worth it for the responsible component (actually, I think it’s going to be the System.Data.SqlClient API that’s responsible) to be explicit.

  10. I’ve seen this behavior in ADO.Net as well.

    SqlCommand.Parameters.AddWithValue("@param", "value") assumes nvarchar for string values. This causes implicit type conversion in the queries.
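The explicit alternative in plain ADO.NET is to declare the parameter’s SQL type yourself instead of letting AddWithValue infer nvarchar. A sketch, assuming an open SqlConnection named `conn` and a hypothetical `Customers` table with a varchar(50) `LastName` column:

```csharp
using System.Data;
using System.Data.SqlClient;

using (var cmd = new SqlCommand(
    "SELECT Id FROM Customers WHERE LastName = @lastName", conn))
{
    // Declaring the parameter as varchar lets SQL Server seek the index
    // on LastName instead of converting the whole column to nvarchar.
    cmd.Parameters.Add("@lastName", SqlDbType.VarChar, 50).Value = "Sullivan";
    var id = cmd.ExecuteScalar();
}
```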

  11. Roger, if the command is a stored proc, I would think that would not be an issue, since the type only gets converted once when the parameter is passed, right?

  12. Thanks for this interesting article. I can see this being quite a pitfall in my organisation that we should now hopefully be able to avoid.

    One thing I haven’t been able to understand from the posts, however, is why, given that EF thinks the property on the entity is an nvarchar, it would then generate a varchar on the database. Is this issue only present when a Code First model is reverse engineered from an existing database?

  13. @Kate, I don’t think EF was generating the varchar. I *think* that SQL Server was receiving the nvarchar and doing the coercion. Hopefully someone who is better versed in the database end of things can confirm or clarify.

    1. That’s correct. Because of “Data Type Precedence”, SQL Server has to convert the entire varchar column to nvarchar to be able to compare it to an nvarchar parameter used in the criteria. Such a thing forces either a Table Scan (Clustered Index Scan) or an Index Scan of a non-clustered index (faster because it’s narrower), and it cannot do a very high performance Index Seek. It’s known as a “non-SARGable” predicate (“SARG” is short for “Search ARGument”).

      There are some pretty good articles on the subject under the following Google Search…

  14. My understanding is that this performance hit is NOT with nvarchar per se either, but rather specifically with nvarchar(MAX).
