Defining a Defining Query in EF Core 2.1

I have to cut out some text from a too-long article I’ve written for a magazine (links when it’s published), so here is a simple example of using the new ToQuery method for creating a defining query in EF Core 2.1 (currently in Preview 2).

ToQuery is associated with the new Query Type feature, which allows you to use types that are not mapped to a table in the database. Because they are not true entities, they don’t require a key and are not change tracked.

I’m starting with a simple model that includes these two entities, which are mapped to tables in my DbContext.

public class Team
{
    public int TeamId { get; set; }
    public string Name { get; set; }
    public string TwitterAlias { get; set; }
    public List<TeamMember> Members { get; set; }
}

public class TeamMember
{
    public int TeamMemberId { get; set; }
    public string Name { get; set; }
    public string Role { get; set; }
    public int TeamId { get; set; }
    public TimeSpan TypicalCommuteTime { get; private set; }
    public void CalculateCommuteTime(DateTime start, DateTime end)
    {
        TypicalCommuteTime = end.Subtract(start);
    }
}

You’ll need to pre-define the type being used for the defining query; mine will have the Name and TypicalCommuteTime for each team member.

public class TeamCommute
{
    public TeamCommute(string name, TimeSpan commuteTime)
    {
        Name = name;
        TypicalCommuteTime = commuteTime;
    }
    public string Name { get; set; }
    public TimeSpan TypicalCommuteTime { get; set; }
}

You can define queries directly in OnModelCreating using either raw SQL with FromSql or a LINQ query inside a new method called ToQuery. Here’s an example of using a query type with a defining query and LINQ:

modelBuilder.Query<TeamCommute>().ToQuery(
    () => TeamMembers.Select(m => new TeamCommute(m.Name, m.TypicalCommuteTime)));

With this query defined for the TeamCommute type, you can now use it in queries in your code, for example:

var commutes = context.Query<TeamCommute>().ToList();
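
Because the defining query is just LINQ, you can compose additional operators on top of the query type as well. Here’s a quick sketch (the one-hour filter is purely for illustration, and in the 2.1 preview part of it may end up evaluated on the client):

var longCommutes = context.Query<TeamCommute>()
                          .Where(c => c.TypicalCommuteTime > TimeSpan.FromHours(1))
                          .ToList();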

Keep in mind that you can’t define both a ToQuery and a ToView mapping on the same type.

New Pluralsight Course! EF Core 2: Getting Started

I’ve recently published my 19th course on Pluralsight.com: Entity Framework Core 2: Getting Started.

It’s 2 hours and 40 minutes long and focuses on the basics.

This is using EF Core 2.0.1 in Visual Studio 2017.

Future plans: I’ve begun working on an intermediate-level course to follow up and have others in the pipeline, such as a course to cover the features of EF Core 2.1 when it gets released (I will wait until it has RTMed, for stability) and other advanced topics. I am also planning to do a cross-platform version using VS Code on macOS because that’s my fave these days.

If you are not a Pluralsight subscriber, send me a note and I can give you a 30-day trial so you can watch the course. Be warned: the trial is akin to a gateway drug to becoming a subscriber.

Here is the table of contents for the course:

Introducing a New, Lighter Weight Version of EF   32m 40s
Introduction and Overview
What Is Entity Framework Core?
Where You Can Build and Run Apps with EF Core
How EF Core Works
The Path From EF6 to EF Core to EF Core 2
EF Core 2 New Features
Looking Ahead to EF Core 2.1 and Beyond
Review and Resources

Creating a Data Model and Database with EF Core    42m 36s
Introduction and Overview
Setting up the Solution
Adding EF Core with the NuGet Package Manager
Creating the Data Model with EF Core
Specifying the Data Provider and Connection String
Understanding EF Core Migrations
Adding Your First Migration
Inspecting Your First Migration
Using Migrations to Script or Directly Create the Database
Recreating the Model in .NET Core
Adding Many-to-many and One-to-one Relationships
Reverse Engineering an Existing Database
Review and Resources

Interacting with Your EF Core Data Model 34m 11s
Introduction and Overview
Getting EF Core to Output SQL Logs
Inserting Simple Objects
Batching Commands When Saving
Querying Simple Objects
Filtering Data in Queries
Updating Simple Objects
Disconnected Updates
Deleting Objects with EF Core
Review and Resources

Querying and Saving Related Data   20m 49s
Introduction and Overview
Inserting Related Data
Eager Loading Related Data
Projecting Related Data in Queries
Using Related Data to Filter Objects
Modifying Related Data
Review and Resources

Using EF Core in Your Applications    30m 52s
Introduction and Overview
EF Core on the Desktop or Device
The Desktop Application: Windows Presentation Foundation (WPF)
Creating the WPF Application
Walking Through the WPF Data Access
EF Core in ASP.NET Core MVC
Adding Related Data into the MVC App
Coding the MVC App’s Relationships
Review and Resources

The Secret to Running EF Core 2.0 Migrations from a .NET Core or .NET Standard Class Library

I have had two people who watched my Pluralsight EF Core Getting Started course (which will soon be joined by an EF Core 2: Getting Started course) ask the same question, which mystified me at first.

They were running migration commands that caused the project to compile, but the commands did not do anything. For example, add-migration didn’t add a migration file, and get-dbcontext did not return any information. The most curious part was that there was no error message! I was able to duplicate the problem.

With EF6 it was possible to use migrations from a class library with no exe project in sight. EF Core migrations can run from a .NET Framework or .NET Core project, but not from a .NET Standard library: the tooling needs a runtime. A common workaround, even if you haven’t gotten to the UI part of your app yet, is to add a .NET Core console app project to the solution, add the EF Core Design NuGet package to it and set it as the startup project. But it’s still possible to do this without adding a dummy project.

We already knew about the multi-targeting fix, which solved an error when you try to run migrations from a .NET Standard library. But even with that fix in place, we were getting the mysterious nothingness.

The answer to the question was buried in a GitHub issue and in comments for the Migrations document in the EF Core docs. This same solution solved a problem I was having when trying to use migrations in a UWP app (again, not .NET Core or .NET Framework) that used a separate class library to host its DbContext.

I’m writing this blog post to surface the solution until it is resolved.

The solution that we used with EF Core 1.0 in order to run migrations from a .NET Standard library was to multi-target for .NET Standard (so you can use the library in a few places) and .NET Core (so you can run migrations).

That means replacing

<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
</PropertyGroup>

with

<PropertyGroup> 
  <TargetFrameworks>netcoreapp2.0;netstandard2.0</TargetFrameworks>
</PropertyGroup>

Notice that the element name is now plural and there is a semicolon between the two target frameworks.

But there’s one more secret which is not in the documentation.

For .NET Standard 2.0 (and EF Core 2.0), you also need to add the following to csproj.

<PropertyGroup>
  <GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
</PropertyGroup>
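
So in the end, the relevant part of the class library’s csproj looks like this (both settings can live in the same PropertyGroup):

<PropertyGroup>
  <TargetFrameworks>netcoreapp2.0;netstandard2.0</TargetFrameworks>
  <GenerateRuntimeConfigurationFiles>true</GenerateRuntimeConfigurationFiles>
</PropertyGroup>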

Now, with the DbContext project set as the startup project and the Package Manager Console (or command line) pointing to that same project, your migration commands will work.

Thanks to Christopher Moffat who found the solution in the GitHub issues and shared it in the comments on the EF Core Package Manager Console Tools document.



Domain-Driven Design Europe 2018 in Amsterdam

I’m excited to be attending and speaking at DDD Europe 2018 in Amsterdam on Feb 1-2 2018. It’s an honor to be on the speaker roster with so many DDD gurus and other people with amazing DDD experience stories to share.

The lowest early-bird ticket price is available through Nov 30 at €599 (+21% VAT = €724, roughly $860 US). The price goes up to €699 (+VAT) from Dec 1 to Dec 31 and then to €749 (+VAT) until the conference.

Prior to the conference, there are also 10 amazing workshops ranging from a half day to 2 days, running January 30-31.

I’ll be doing a 2-hour workshop during the conference proper on using EF Core 2 to map DDD patterns in your domain. It will be a hands-on workshop, and my intention is to build some koans for attendees to work with, although the flavor of hands-on may shift as I continue to percolate ideas.

Watch my Domain-Driven Design Fundamentals course on Pluralsight

Copying an Azure Function App using the Portal, GitHub and Cloud Shell

I’ve got an Azure Function App with some functions that I built for a demo for a conference session. I first used this at AngularMix and had named the function app AngularMix2017. Then when I did it for DevIntersection this past week, I recreated it as DevIntersection2017 because you can’t rename a function app. I’m doing it yet again next week at another conference (NetConf.co) and decided it was time to create it as a reusable name: DataApiDemo.

While you can’t flat out copy a function app if you are working strictly in the portal (which is how I’ve been working during my first learning stages of building function apps), you can easily enough duplicate it by downloading and uploading it again.

Keep in mind that if you are developing functions in VS 2017 or VS Code (combined with the Azure CLI), you can just use the tooling to publish the function apps along with their settings. But I don’t like to take the easy path, and I picked up a lot of new tricks and tools by going this route. Even if I leverage the tooling on my next go-round, this gives me a better understanding overall, more control and sharper troubleshooting skills.

My first step was to create the new function app (dataapidemo).

The Overview blade of an Azure function app has a “download app content” option. That will pull down all of the relevant files onto your machine. So next, I go to the overview of the DevIntersection2017 function app and click on the download app content link.

When downloading the app content, you have an option to also grab the app settings. I have a number of secrets stored in my settings…things like keys to my databases, account IDs and more. If you check that box, the settings will get downloaded to a local settings file. If you were using this feature to continue your development locally, e.g. in VS2017, then you’d want them. But for my scenario they are not useful and are even potentially a security issue (you’ll see shortly), so I’m going to leave that unchecked.

I can see the files in the designated folder on my computer.

I’ve already created a Github repository to store these files in: https://github.com/julielerman/DataApiAzureFunctions.

At the command line in the new folder, I’ll git init, add all of the files into the local repo and commit them. Then I can run the git commands to connect that folder to my GitHub repo and then push my local repository into the online repository.
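
For anyone who wants the exact commands, that boils down to something like this (the remote URL is my repo, so swap in your own; the commit message is just an example):

git init
git add .
git commit -m "initial commit of function app files"
git remote add origin https://github.com/julielerman/DataApiAzureFunctions.git
git push -u origin master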

When that’s complete, the files are all in my GitHub repo.

Because I’m using a public repository, that’s why I didn’t want the local.settings.json file filled with my secrets to be downloaded. Of course, I could have deleted it locally or told Git to ignore the file. One more point about this: if I were developing locally, I could keep that file and use VS2017 or the Azure CLI tools to publish my app along with the settings to the portal. But that is not my goal here.

Now I can take advantage of a long-time feature of the Azure Portal: connecting it to a repository to auto-deploy from the repository to my app (in this case, to my function app).

Back in the portal, I select the new function app again, then from its Platform Features page, open the Deployment Options. Choose Setup new deployment and follow the steps to connect the function app to your repository. When you’ve completed the setup, you can use the Sync button to force the deployment to happen. All of the files are now in my new dataapidemo function, including the function.json files that contain the function setting (e.g. integrations etc) and any other files I may have added to my functions.

Because I am using Twilio, one of my functions has a reference to Twilio in a project.json file and I need to restore that package to this function app. I did that by opening up the project.json and making a small mod to it…adding or removing a blank line, then saving it. That triggers the package restore to happen.
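
For reference, a v1 function’s project.json that pulls in the Twilio package looks roughly like this (the version number here is just a placeholder, not necessarily the one I used):

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Twilio": "5.9.2"
      }
    }
  }
}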

What’s not there, however, are the function app settings. These are settings that are defined on the app and used by one or more of the functions. There are some default settings created by Azure, but I also have some others…those secrets, such as the account and telephone number for my Twilio integration, as well as a key for the function.

Some of these settings are my own custom settings that I want in the new function. I could add them in via the portal app settings interface, but I don’t like that path when I have a slew of settings.

That means moving on to yet another amazing tool in the Azure Portal — Cloud Shell. Here you can run Bash or PowerShell commands from the Azure CLI directly on services in your subscription. You can also do this locally on your machine, with the extra caveat that you must provide credentials and other metadata there.

In the portal, open the Cloud Shell. This gives you an interactive terminal window.

If you have multiple subscriptions, you want to be sure that the shell is set to the one where your function app is stored.

You can do this by listing the subscriptions with the az account list command.

If the one you want is not default, then set it with:

az account set -s [ID]

The shell supports auto-complete, so if the ID starts with a123, you can type

az account set -s a123

then hit Tab and the ID will be completed. Hit Enter and it becomes the default subscription to work in.

I’ll use the az functionapp commands to get the settings into my function app.

First I’ll list the existing settings. You do this with the command

az functionapp config appsettings list --name [functionname] --resource-group [resource name]

The parameter shortcuts for name and resource-group are -n and -g. Mine are both the same name: dataapidemo. Here’s what the command looks like (except for the wrapping that my blog is forcing):

az functionapp config appsettings list -n dataapidemo -g dataapidemo

The appsettings are listed out as JSON by default although you can affect the output format with the output parameter.

I’ll need to see the settings I want to copy from the other function app so I run:

az functionapp config appsettings list -n devintersection2017 -g devintersection2017 -o tsv

which gives me the list of settings from which to choose.

In addition to listing the settings, you can add new ones with the set verb and delete existing ones with the delete verb. Either goes in place of the list verb.

You use set by combining the set verb with the --settings parameter. After the --settings parameter, you can list one or more settings with this format:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "property1name=value", "property2name=value"

Note that this is wrapping here on the blog post but not in the shell window.

So I added the handful of settings I wanted to add which looked more like:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "TwilioSID=myvalue", "TwilioAccount=myvalue"

There were others I had to add. For example, my function needs to know how to get to my Cosmos DB document database, so it has a setting that uses the database name as the property name and the database account’s endpoint as the value.
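
That setting follows the same pattern as the rest; something like this, with placeholder values rather than my real ones:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "MyDocDb=https://myaccount.documents.azure.com:443/"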

Once I had done that, I was able to get rid of the function apps that were named for each conference and use a common one no matter where I am sharing it.

And boyohboy did I learn a lot of new things!

Another Use Case for DbContext.Add in EFCore (and a DDD win)

If you are like me and design your classes following Domain-Driven Design principles, you may find yourself with code like this for controlling how objects get added to collections in the root entity.

public class Samurai
{
    public Samurai(string name) : this()
    {
        Name = name;
    }

    private Samurai()
    {
        _quotes = new List<Quote>();
    }

    public int Id { get; private set; }
    public string Name { get; private set; }
    private readonly List<Quote> _quotes = new List<Quote>();
    public IEnumerable<Quote> Quotes => _quotes.ToList();
    public void AddQuote(string quoteText)
    {
        var newQuote = new Quote(quoteText, Id);
        _quotes.Add(newQuote);
    }
}

I have a fully encapsulated collection of Quotes. The only way to add a new quote is through the AddQuote method. You can’t just call Samurai.Quotes.Add(myquote).

Additionally, because I want to control how developers interact with my API, there is no DbSet for Quotes. You have to do all of your queries and updates via context.Samurais.

A big downside to this is that if I have a new quote and I know the ID of the samurai, I have to first query for the samurai and then use its AddQuote method. That really bugs me. I just want to create a new quote, push in the samurai’s ID value and save it. And that requires either raw SQL or a DbSet<Quote>. I don’t like either option. Raw SQL is a hack in this case, and DbSet<Quote> will open my API up to potential misuse.
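
In other words, without anything else in place, adding a quote for a samurai whose ID I already know means something like this (a sketch of the pattern I’m trying to avoid; the method name is made up for the example):

static void AddQuoteToKnownSamurai()
{
    using (var context = new SamuraiContext())
    {
        // query for the samurai first, just so I can call its AddQuote method
        var samurai = context.Samurais.Find(1);
        samurai.AddQuote("Voila");
        context.SaveChanges();
    }
}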

I was thinking about this problem while lying in bed this morning (admit it, that’s the first thing you do when you wake up, too, right?) and had an idea.

In EF Core, we can now add objects directly to the context without going through the DbSet. The context can figure out what DbSet the entity belongs to and apply the right info to the change tracker. I thought this was handy for being able to call

myContext.AddRange(personobjectA, accountobjectB, productObjectC);

Although I haven’t run into a good use case for leveraging that yet.

What occurred to me is that if DbContext.Add is using reflection, maybe EF Core can find a private DbSet.

So I added a private DbSet to my DbContext class:

private DbSet<Quote> Quotes { get; set; }

Then I tried out this code (notice I’m using context.Add, not context.Quotes.Add):

static void AddQuoteToSamurai()
{
    using (var context = new SamuraiContext())
    {
        var quote = new Quote("Voila", 1);
        context.Add(quote);
        context.SaveChanges();
    }
}
And it worked! But this isn’t complete yet. I’m breaking my rule of ensuring that only my aggregate root can manage quotes. So this is “dangerous” code from my DDD perspective. However, I was happy to know that EF Core would support this capability.
Currently, Samurai.AddQuote does not have any additional logic to be performed on the quote. What if I were to add in a “RemoveBadWords” rule before a quote can get added?

public void AddQuote(string quoteText)
{
    quoteText = Utilities.RemoveBadWords(quoteText);
    var newQuote = new Quote(quoteText, Id);
    _quotes.Add(newQuote);
}

Now I have an important reason to use Samurai to do the deed. I can add a second, static AddQuote method that also takes an int. Because it’s static, it’s a pass-through method.

public static Quote AddQuote(string quoteText, int samuraiId)
{
    quoteText = Utilities.RemoveBadWords(quoteText);
    var newQuote = new Quote(quoteText, samuraiId);
    return newQuote;
}

This works and now I don’t have to have an instance of Samurai to use it:

static void AddQuoteToSamurai()
{
    using (var context = new SamuraiContext())
    {
        context.Add(Samurai.AddQuote("static voila", 1));
        context.SaveChanges();
    }
}

One thing I was worried about was whether, if I had an instance of Samurai, I could use this to add a quote to a different samurai. That would break the aggregate root…its job is to manage its own quotes only. It shouldn’t know about other Samurais.

But .NET protects me from that. I can’t call the static method from an instance of Samurai.

I still think that there’s a little bit of code smell from a DDD perspective about having this static, pass-through method in an aggregate root, so I will have to investigate that (or wait for any unhappy DDDers in my comments). But for now I am happy that I can avoid having to query for an instance of Samurai just to do this one task.

Next Up: Devintersection, Las Vegas Oct 30-Nov 2

I have been away from home more than at home this fall! I have two trips behind me:

Trip 1: London for ProgNet, Salt Lake City for Pluralsight Live and Denver for Explore DDD.

Trip 2: Orlando for AngularMix, then a side trip to Miami to visit friends and relatives.

I’m home again for a bit then off again to Las Vegas for Devintersection. If you are still thinking about going (you should, really) you can still get a small discount using the code “LERMAN” when you register.

I will be giving 3 talks, participating in a panel and of course attending talks.

One for SURE that I’ll attend is on EF Core by two members of the team (and my friends!) Diego Vega and Andrew Peters on Tuesday.

I’m also doing an EF Core 2 talk which will be complementary to their session (not redundant). That talk is on Wednesday morning.

 

Later on Wednesday I’m doing a session (should be FUN) for developers on taking advantage of SQL Server in containers for quick dev environments. There I will show setting up and using a Docker container with SQL Server for Linux (on my Mac) and then a Windows container for SQL Server Developer. For a dev or testing environment, these are such fast and easy ways to spin up a SQL Server.

In my last session, which is on Thursday, I’m going to mostly code (yay! What is more fun than that?) to build up a data API and provide some design guidance, at the same time letting you get MORE eyeballs full of EF Core 2.0. And for a bonus, I finally got my hands on Azure Functions, so I get to show off what I built there as well.

After that, I’ll be on the closing panel. One NEVER knows what to expect there. Should be fun.

And also, how cool is this graphic that the conference created, just for me to share with you!