All posts by Julie Lerman

Links to My Recent DDD+EFCore Content

Time to aggregate the various articles and videos I’ve created, filled with lessons on how EF Core’s support for mapping domain models directly to your relational databases has been improving:


A Small Lesson on env Files in docker-compose

I’ve been working a lot with docker lately and learning learning learning. I have written a 3-part series for my MSDN Mag Data Points column that will be out in the April, May and June 2019 issues. I have another YUGE blog post that I will publish to accompany the May article. And I’m working on others.

I explored Docker environment variables and different ways to feed them into a Docker image or container.

My docker-compose file referenced an environment variable named DB_PW without specifying its value.

dataapidocker:
  image: ${DOCKER_REGISTRY-}dataapidocker
  build:
    context: .
    dockerfile: DataAPIDocker/Dockerfile
  environment:
    - DB_PW

Docker then reads the value from an environment variable of the same name on the host machine.
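For example, on my dev machine that meant defining the variable in the shell right before running docker-compose, something along these lines (a bash sketch with a made-up value):

export DB_PW=some-secret-value
docker-compose up -d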

But this was a pain. I wasn’t permanently storing DB_PW on my development machine and had to remember to set it frequently. Elton Stoneman (from Docker) said I should *really* consider using the Docker env file feature. This lets you set variables in an environment file and have the docker-compose file read from it. And I can keep that special file out of my source control. (Always my worry!)

I started by following docs that showed using a file named anything.env. I created a file called hush-hush.env where I specified the variable. This is the full content of the file:

DB_PW=thebigsecret

Then in docker-compose, in the service, the env_file tag lets you point to it. You can even remove the environment tag in the yml file.

dataapidocker:
  image: ${DOCKER_REGISTRY-}dataapidocker
  build:
    context: .
    dockerfile: DataAPIDocker/Dockerfile
  env_file:
    - hush-hush.env

This worked perfectly. My app was able to discover the environment variable in code.
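In case you’re wondering what that looks like, reading an environment variable from .NET code is a one-liner. This is just a sketch, not the exact code from my series:

var dbPassword = Environment.GetEnvironmentVariable("DB_PW");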

But then I evolved my solution to use another container for my database. The primary container depends on the new mssql container. And the mssql container requires that I pass in the database password as an environment variable. Since DB_PW already exists, I can do that easily enough with substitution (via curly braces). Here’s the new docker-compose file:

version: '3.4'
services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    env_file:
      - hush-hush.env
    depends_on:
      - db
  db:
    image: mcr.microsoft.com/mssql/server
    volumes:
      - mssql-server-julie-data:/var/opt/mssql/data
    environment:
      SA_PASSWORD: "${DB_PW}"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"
volumes:
  mssql-server-julie-data: {}

And there’s an order of operations issue here. When I build the docker-compose file, it complains that DB_PW is not available and my app fails. The db service is not getting the contents of my hush-hush.env file. I tried a number of things, such as adding env_file to the db service. In the end, here’s what I learned.

Using substitution requires that the DB_PW variable be declared in docker-compose. I added that declaration back in to the primary service, but it still wasn’t getting the value from hush-hush.env.

But you can have an env file whose name is nothing but the extension: a file called simply .env. Docker-compose reads that file early enough to supply the value for the declared DB_PW. Then all of the pieces fell into place. The mssql container was spun up with the value of DB_PW as its environment variable. And my app code was able to read the environment variable that Docker passed into the running container for its own tasks.

The final docker-compose.yml file looks like this:

version: '3.4'
services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    environment:
      - DB_PW
    depends_on:
      - db
  db:
    image: mcr.microsoft.com/mssql/server
    volumes:
      - mssql-server-julie-data:/var/opt/mssql/data
    environment:
      SA_PASSWORD: "${DB_PW}"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"
volumes:
  mssql-server-julie-data: {}

And it relies on a file named “.env” with my key/value pair for the variable defined (same content as hush-hush.env above).

DB_PW=thebigsecret
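If you want to verify the substitution without waiting for the containers to spin up, docker-compose can print the fully resolved file for you. This is standard docker-compose behavior, not anything special in my setup:

docker-compose config

When the .env file is being picked up properly, the output shows SA_PASSWORD with the actual value rather than a blank.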

Resources:
https://docs.docker.com/compose/environment-variables/

Announcing: Deep Dive into EF Core 2-Day Workshop

Join me in London June 17-18 for a 2-day deep dive into Entity Framework Core.

This is a new addition to the Skills Matter course catalog. Because it is a new course, we are looking to get feedback on the proposed list of topics to be covered. If you are interested in attending, your input will be helpful.

Is the list of topics too long for 2 days? Does it touch on what you would want to learn in an advanced class? You can provide feedback on the course description page.

Day 1: Leverage Advanced Features

  • High level review of EF Core differences from EF6
  • Implementing logging to capture EF Core’s database and in-memory activity, and learning about the different types of logging data that can be captured
  • Learn various approaches to seeding such as via database scripts, code or using the migration-based seeding introduced in EF Core 2.1. You’ll also learn when each approach may be appropriate
  • Using migrations during development, within source control and during deployments
  • Integration testing your EF Core code

Day 2: EF Core in Your Software Architecture

  • The Great Repository Debate: Pros and Cons of the Repository Pattern/Generic Repositories for exposing EF Core
  • Designing Data Layers/APIs
  • Understanding complicated mapping conventions and supplementing those with custom mappings using the Fluent API
  • Designing for performance
  • *Bonus topic* If all modules have been covered and time allows, you will also look at EF Core in Azure Functions and EF Core with Azure Cosmos DB

A Few Coding Patterns with the MongoDB C# API

In the February 2019 issue of MSDN Magazine (Exploring the Multi-Model Capability of Azure Cosmos DB Using Its API for MongoDB), my Data Points column explored working with the MongoDB model of Azure Cosmos DB using the mongocsharpdriver. I started by working against a local instance of MongoDB and then the Azure instance. But the column was a little bit long, so I cut out a few extraneous sections. I’m placing them here and linking to this blog post from the article.

In the article I used an IMongoCollection object to query and store data into the database. You must specify a type for the collection object to serialize and deserialize. In the article I typed the collection to my classes, e.g., Collection<Ship>. It’s also possible to type the collection generically to a BsonDocument. Here’s some information about that and a little bit of code.

Typing Collections to BsonDocument

Another path for mapping is to use a BsonDocument typed collection object that isn’t dependent on a particular type. This would allow you to have more generic methods. But it also means manually serializing and deserializing your objects, which is easy using ToBsonDocument for serializing:

var coll = db.GetCollection<BsonDocument> ("Ships");
coll.InsertOne (ship.ToBsonDocument());
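Deserializing goes the other way, through the BsonSerializer class in the MongoDB.Bson.Serialization namespace. Here’s a rough sketch of reading one of those documents back into a Ship (the filter value is just an example):

var doc = coll.Find(new BsonDocument("Name", "Agatha King")).FirstOrDefault();
if (doc != null)
{
  var ship = BsonSerializer.Deserialize<Ship>(doc);
}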

Given that the documents have discriminators, you can then specify a type in your query to retrieve specific types, although by default hierarchies don’t get accounted for. The article refers to documentation on polymorphism for the C# API; check it to learn how to properly implement polymorphism in more detail. The following code will only pull back documents where _t matches the configured discriminator for Ship into ships and for DecommissionedShip into dShips:

var coll = db.GetCollection<BsonDocument> ("Ships");
var ships = coll.AsQueryable().OfType<Ship>().ToList();
var dShips = coll.AsQueryable()
                  .OfType<DecommissionedShip>().ToList();

Encapsulating the MongoClient, Database and Collection

Specifying a typed collection instance repeatedly, as I did in the article demos, can become a drag. You could set them up in advance, for example in a class that acts as a context for interacting with the database, as shown here:

public class ExpanseContext
{
  public IMongoDatabase ExpanseDb { get; private set; }
  public IMongoCollection<Ship> Ships { get; private set; }
  public IMongoCollection<Character> Characters { get; private set; }

  public ExpanseContext()
  {
    ExpanseDb = new MongoClient().GetDatabase("ExpanseDatabase");
    Ships = ExpanseDb.GetCollection<Ship>("ships");
    Characters = ExpanseDb.GetCollection<Character>("characters");
  }
}

This refactored code to insert a document is much more readable:

private static void InsertViaContext ()
{
  var context = new ExpanseContext ();
  var ship = new Ship { Name = "Agatha King" };
  context.Ships.InsertOne (ship);
}

Logging in EF Core 2.2 Has a Simpler Syntax–More like ASP.NET Core

Logging EF Core’s memory operations and SQL operations has evolved a few times since EF Core arrived. It takes advantage of the same underlying features that ASP.NET Core uses. If you are using ASP.NET Core, logging is baked in and it is really simple to turn it on for EF Core and add filtering. See Shawn Wildermuth’s blog post about EF Core logging in ASP.NET Core.

But if you aren’t using ASP.NET Core, it’s a little more complicated. Not terribly, but still there’s some extra work to do. It involves setting up an ILoggerFactory in your DbContext and defining any filters at the same time.

I wrote an article about this (with the focus being on taking advantage of the various available filters for EF Core logging) in MSDN Magazine earlier this …oh wait, it’s Jan 1, so I can say “last year”: Data Points – Logging SQL and Change-Tracking Events in EF Core. I also used it heavily in my EF Core 2 Getting Started, EF Core 2: Mappings and EF Core 2.1: What’s New courses on Pluralsight. (Note that I’ve updated the sample code for the Getting Started course to EF Core 2.2 and put it on GitHub at github.com/julielerman/PluralsightEFCore2GettingStarted)

My article and courses were using Console apps to demonstrate EF Core behavior and therefore the ConsoleLoggerProvider to tie the logger to the console. Note that the Data Points article contains a lot of good details about the various types of filtering. So you can use the new syntax (below) to specify that there should be a filter, but be sure to read the article to learn about the flavors of filtering and what type of details you’ll be able to see based on the choices you make.

But the logging API has continued to evolve and is providing some of the same shortcuts that ASP.NET had created. And the ConsoleLoggerProvider has been deprecated. The API is not part of EF Core. It’s part of .NET Core. Both EF Core and ASP.NET Core use it.

If you are using EF Core 2.2, the syntax has changed (simplified) and it’s going to get even more streamlined in 3.0.

In fact, if you use the earlier syntax with 2.2, you’ll get a warning about the ConsoleLoggerProvider:

Obsolete(“This method is obsolete and will be removed in a future version. The recommended alternative is using LoggerFactory to configure filtering and ConsoleLoggerOptions to configure logging options.”)

For a point of comparison, here is an example of using the old syntax to turn on logging, only show logs related to database commands and only show messages that are tagged as “Information”.

EF Core 2.0 & 2.1 Logic

public static readonly LoggerFactory MyConsoleLoggerFactory
            = new LoggerFactory(new[] {
              new ConsoleLoggerProvider((category, level)
                => category == DbLoggerCategory.Database.Command.Name
               && level == LogLevel.Information, true) });

Once your logger factory field is defined in the context class, you tell the DbContext to use it when configuring.

protected override void OnConfiguring
  (DbContextOptionsBuilder optionsBuilder)
{
  var connectionString = 
    ConfigurationManager.ConnectionStrings["WPFDatabase"].ToString();
  optionsBuilder
    .UseLoggerFactory(MyConsoleLoggerFactory)
    .EnableSensitiveDataLogging(true)
    .UseSqlServer(connectionString);
}

So it’s the creation of the logger factory whose syntax is a little convoluted. The newer API follows how ASP.NET Core lets you filter, with an AddFilter method that takes the filters as parameters. No lambdas needed. Also, configuring the filter is now a separate bit of logic from telling the logger that it should be tied to the console.

EF Core 2.2 Logic

With EF Core 2.2, you can set up the logger factory in the constructor or another method, as long as it’s available when you are configuring the options builder. I’m creating it in a method, then passing the result of that method to UseLoggerFactory. I’m still filtering to show only database commands and only log details flagged as Information.

private ILoggerFactory GetLoggerFactory()
{
  IServiceCollection serviceCollection = new ServiceCollection();
  serviceCollection.AddLogging(builder =>
         builder.AddConsole()
                .AddFilter(DbLoggerCategory.Database.Command.Name, 
                           LogLevel.Information)); 
  return serviceCollection.BuildServiceProvider()
          .GetService<ILoggerFactory>();
}

and then I’m calling GetLoggerFactory() inside the UseLoggerFactory method on the options builder:

optionsBuilder.UseLoggerFactory(GetLoggerFactory())
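Putting it all together, the EF Core 2.2 version of OnConfiguring ends up looking roughly like this (same connection string lookup as the earlier example):

protected override void OnConfiguring
  (DbContextOptionsBuilder optionsBuilder)
{
  var connectionString =
    ConfigurationManager.ConnectionStrings["WPFDatabase"].ToString();
  optionsBuilder
    .UseLoggerFactory(GetLoggerFactory())
    .EnableSensitiveDataLogging(true)
    .UseSqlServer(connectionString);
}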

Packages and References

In order to use the AddConsole() method, you still have to use the Microsoft.Extensions.Logging.Console package that the earlier ConsoleLoggerProvider was in. However, you do not need a using statement for the namespace (as you did for the ConsoleLoggerProvider).
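If your project is SDK-style, adding that package is a single command (the 2.2.0 version here is just an example; use whatever lines up with your EF Core version):

dotnet add package Microsoft.Extensions.Logging.Console --version 2.2.0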

Getting the SQL Server 2019 for Linux CTP2.0 Docker Image

If you are used to pulling the mssql-server images from the microsoft repository, e.g.,

docker pull microsoft/mssql-server

that won’t work for the 2019 CTP.

I was able to repull (aka update) using the former repository, but that wasn’t working for the CTP whose tag is vNext-CTP2.0-ubuntu.

I finally noticed the new docker pull command on the Docker Hub page for the image.

It says: docker pull mcr.microsoft.com/mssql/server

So the command for pulling the CTP using its tag is as follows:

docker pull mcr.microsoft.com/mssql/server:vNext-CTP2.0-ubuntu
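And once the image is pulled, you can spin up a container from it the same way as with any other mssql-server image. Something like this, where the password is only a placeholder (SQL Server will reject one that doesn’t meet its complexity rules):

docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrong!Passw0rd" -p 1433:1433 -d mcr.microsoft.com/mssql/server:vNext-CTP2.0-ubuntu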


My First Newsletter: New Course, Pluralsight Discount, Workshops & More

I recently decided it was time to start a newsletter to be sure people who are interested don’t miss out on things like new Pluralsight courses or articles that I’ve published, conferences I’m speaking at and even workshops that I’m teaching. I figure with 26K twitter followers, there might be a few people interested.

Read the June newsletter

Subscribe to my newsletter

I just sent out the first newsletter yesterday. Here are some highlights: