Quick Tips for Migrating EF Core at Runtime on Azure

A question on my EF Core 2 Getting Started course on Pluralsight asks:

Hi Julie
Thank you for your course. I have a question. Can you please advise what is the best practice to deploy a code-first migration to the production database (Azure)? I mean, I have created an ASP.NET Core MVC and EF Core application using the code-first migration approach and deployed it to Azure. If I change any schema in our code-first approach, how can I update the schema in the production database (without losing the data in the production database)?

Here is my response, with a few ideas I thought were worth sharing outside of the course discussion. (Note that this is just high level.)

For simple solutions, one path I have used is to call Database.Migrate() at startup. The VS publish workflow has an option you can check to apply migrations when the app is published. You can see this in the Microsoft docs for deploying an ASP.NET Core app to Azure.

Or you can do it programmatically in the Program.cs file, which will perform any needed migrations. There’s an example of this in my April 2019 MSDN Magazine article. If you need something more robust, you could instead generate scripts (perhaps idempotent scripts) with EF Core migrations, include them with your updates and use a tool that can apply those scripts. If you use Redgate tools, their SQL Change Automation tool is one option. Another type of tool is a migrator tool like FlywayDB (flywaydb.org) or Liquibase. I’ve used Flyway with containers. Here’s a very recent conference talk I did where I used it: bit.ly/30AhgAr
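
For reference, here is a minimal sketch of what that Program.cs approach can look like in an ASP.NET Core 2.x app (MyDbContext and Startup are stand-ins for your own types, not names from the course or article):

using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Hosting;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

public class Program
{
    public static void Main(string[] args)
    {
        var host = CreateWebHostBuilder(args).Build();

        // Apply any pending migrations before the app starts handling requests.
        using (var scope = host.Services.CreateScope())
        {
            var context = scope.ServiceProvider.GetRequiredService<MyDbContext>();
            context.Database.Migrate();
        }

        host.Run();
    }

    public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
        WebHost.CreateDefaultBuilder(args).UseStartup<Startup>();
}

If you go the script route instead, dotnet ef migrations script --idempotent produces a script that can be applied safely to a database regardless of which migrations it already contains.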

Publishing a Single Image Docker Container with Secrets from VS2017 and Running it on Azure

(Written in advance, but published on May 2 when the relevant article is finally available.)

I’ve just finished writing a three part series on building a containerized ASP.NET Core API that uses EF Core for its data persistence. All of this was done in VS 2017 and I took advantage of the VS2017 Tools for Docker.

The article series will be in the April, May and June issues of MSDN Magazine.

Part 1: EF Core in a Docker Containerized App, Apr 2019

Part 2: EF Core in a Docker Containerized App, May 2019

But I didn’t have room to include the important task of deploying the app I’d written, even though I worked through it. The deployment itself was pretty easy, but there were some new steps to learn in order to deal with storing a password for making a connection to my Azure SQL database. I’ll relay those steps in this blog post.

My API uses EF Core and targets an Azure SQL database. So whether I’m debugging locally with IIS or Kestrel, debugging locally inside of a Docker container or running the app from a server or the cloud, I can always access that database.

That means I have a connection string to deal with but I want to keep the password a secret.

Here’s the structure of the solution: my ASP.NET Core API project is DataAPIDocker, and because I used the Docker tools to add container orchestration, there’s another folder in the solution for docker-compose.


I go into detail in part 2 of the article (the one in the May 2019 issue) but the bottom line is that I use a docker environment variable in my docker-compose.yml file.

version: '3.4'

services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    environment:
      - DB_PW

In the environment mapping, I have a sequence item where I’m defining the DB_PW key but I’m not including a value. Because there’s no value there, Docker will look in the host’s environment variables. Because I’m only debugging, I create a temporary environment variable on my system with the value of the password, and when I debug or run the app from VS2017, the password variable will be found. That environment variable gets passed into the running container, and my app has code to read it and include that password in the connection string.
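
To show what I mean by that last bit, here’s a hedged sketch of reading DB_PW in ConfigureServices and merging it into the connection string (the connection string name and context type are placeholders, not necessarily what the article uses):

using System;
using System.Data.SqlClient;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public Startup(IConfiguration configuration) => Configuration = configuration;
    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        // Read the password Docker passed into the container as DB_PW
        // and merge it into the connection string from configuration.
        var builder = new SqlConnectionStringBuilder(
            Configuration.GetConnectionString("MagsConnection"))
        {
            Password = Environment.GetEnvironmentVariable("DB_PW")
        };

        services.AddDbContext<MagContext>(
            options => options.UseSqlServer(builder.ConnectionString));
    }
}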

So it’s all self-contained, nice and neat.

Publishing the Image to the Azure Container Registry

Once you’ve got the app working, it’s time to publish it. But we’re using Docker, so you’re not publishing the app, but rather the Docker image that can run the app for you. The Docker tools will help with this also.

Right click the project and choose Publish.


Then you will want to create a publish profile. Part of that profile is choosing where to publish the image. Here you’ve got options. I have a Visual Studio Subscription and can publish to an Azure Container Registry if I want, or to Docker Hub, or to some other registry.


My goal for this blog post is to get the image into the Azure Container Registry, so that’s my choice. You can have multiple container registries in your Azure account, and you can store any number of images in a single registry. Well, there may be technical or financial constraints, but the point is that you can have multiple images in a registry. I’m not here to advise on how to manage Azure finances, just how to do the task.

Here’s the overview page of a registry I let the publishing tool create for me. The Repositories link is where your images are accessible.


You may have different versions of a particular image so each “set” is a different repository. I have three repositories in mine where I’ve been experimenting.


The dataapi repository has only one image, which the publishing tool automatically tagged “latest” for me. I can have other versions under different tags.

Back in Visual Studio, after walking through the publishing tool’s questions for creating a new repository, the final step is to go ahead and publish, which will build the image and push it up to the target repository. Keep in mind that you’ll want VS2017 to be set to a RELEASE build, not a DEBUG build.

If it’s your first time pushing this image to the repository, the tooling will also push the ASP.NET Core SDK and runtime images that are listed in the app’s Dockerfile.
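
For context, here is roughly the kind of multi-stage Dockerfile the VS2017 tooling generates, which is where those SDK and runtime base images are named (the tags and project names here are illustrative, not necessarily exactly what’s in my project):

FROM microsoft/dotnet:2.2-aspnetcore-runtime AS base
WORKDIR /app
EXPOSE 80

FROM microsoft/dotnet:2.2-sdk AS build
WORKDIR /src
COPY DataAPIDocker/DataAPIDocker.csproj DataAPIDocker/
RUN dotnet restore DataAPIDocker/DataAPIDocker.csproj
COPY . .
WORKDIR /src/DataAPIDocker
RUN dotnet build DataAPIDocker.csproj -c Release -o /app

FROM build AS publish
RUN dotnet publish DataAPIDocker.csproj -c Release -o /app

FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "DataAPIDocker.dll"]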

I was surprised to see this, wondering why Azure didn’t just grab them from Docker Hub and why I was uploading those big files directly. Naturally, I tweeted my confusion.

There’s more to the story but it is beyond the scope of my goals here.

A cool feature of this registry is that you can right-click and run an image, which is fine if you aren’t trying to orchestrate a number of images, and that matches my case. This image does run independently.

Right-click on the image and choose Run instance. Azure will create a container and run it as an Azure Container Instance, although first you need to define specs for the instance.


It’s kind of magical because you don’t have to create and manage a virtual machine to run the container on if it’s a simple application.

What About Environment Variables for the Container Instance?

The instance will run, but the Magazines controller that needs to read from the Azure SQL database will fail because we haven’t provided the password, which the container expects to arrive through the host’s environment variable. So for my image, right-click and run wasn’t quite enough.

This is where I had to do a lot of reading, research and experimentation until I got the solution working. (Keep in mind that if I were running this on a virtual machine of my own devising, I could just pass the variable in when manually calling docker run.)

There are two ways to provide an environment variable to a container instance.

The first, through the portal, means that rather than right-clicking the image, you start by creating a new container instance in Azure and pointing it to the image. This path lets you assign up to three environment variables in the configuration. The second is from the command line, for example with PowerShell, which is the path I’ll walk through next.

Using an On-The-Fly Variable to Pass into the Container

I’m going to do a first pass creating a variable on the fly to pass to EnvironmentVariable. Then I’ll show you how to use the Azure Key Vault.

The EnvironmentVariable parameter expects a hashtable.

Create a new variable (I’ll call it envVars in homage to the resource where I learned this) and assign a single key value pair:

$envVars = @{'DB_PW'='eiluj'}

The other tricky part is providing the credentials to access the image in the registry. We didn’t have to do that when using the portal to create the container because we had already provided them. But now I need to provide them.

You’ll need the user name and password from the repository.

Then you can use PowerShell to create a secure string from the password and then use that secure string along with the username to create a PowerShell credential object.
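
Here’s a hedged sketch of those steps, along with the command that creates the container instance. It assumes the Az.ContainerInstance cmdlets; the resource group, instance name, registry and password values are all placeholders:

# Build a credential object from the registry's user name and password (placeholders).
$secpasswd = ConvertTo-SecureString 'registry-password-here' -AsPlainText -Force
$acrCred = New-Object System.Management.Automation.PSCredential ('dataapidocker', $secpasswd)

# Create the container instance from the registry image, passing along the
# $envVars hashtable created above so DB_PW reaches the container.
New-AzContainerGroup -ResourceGroupName 'my-resource-group' `
  -Name 'dataapi-instance' `
  -Image 'myregistry.azurecr.io/dataapidocker:latest' `
  -RegistryCredential $acrCred `
  -EnvironmentVariable $envVars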

TIP: If you have multiple subscriptions, be sure you’re pointing to the correct one where the target resource group is.

TIP: A cool thing you can do in Cloud Shell is type dir to list your subscriptions and then use cd to get into the correct one! Check out the PowerShell Cloud Shell quickstart for details.

TIP: If, like me, you mess around with the database to experience cause and effect, remember that in my sample code the database gets migrated on app startup. In the case of having it in a container, that means when the container instance is run. So if you run the container and then delete the database, you won’t see the database again until the container is spun up again. Stopping and restarting has the same effect. Of course, this is just for testing things out, not production! Once again, something that had me stuck for over an hour until I had my aha moment.

Creating a KeyVault and Adding My Secret Password

Links to My Recent DDD+EFCore Content

Time to aggregate the various articles and videos I’ve created, filled with lessons on how EF Core’s support for directly mapping domain models to your relational databases has been improving:


A Small Lesson on env Files in docker-compose

I’ve been working a lot with Docker lately and learning, learning, learning. I have written a three-part series for my MSDN Magazine Data Points column that will be out in the April, May and June 2019 issues. I have another YUGE blog post that I will publish to accompany the May article. And I’m working on others.

I explored Docker environment variables and different ways to feed them into a Docker image or container.

My docker-compose file referenced an environment variable named DB_PW without specifying its value.

dataapidocker:
  image: ${DOCKER_REGISTRY-}dataapidocker
  build:
    context: .
    dockerfile: DataAPIDocker/Dockerfile
  environment:
    - DB_PW

Docker will read environment variables from the host to discover the value.

But this was a pain. I wasn’t permanently storing DB_PW on my development machine and had to remember to set it frequently. Elton Stoneman (from Docker) said I should *really* consider using Docker’s env file feature. This lets you set variables in an environment file and have the docker-compose file read from that. And I can keep that special file out of my source control. (Always my worry!)

I started by following docs that showed using a file named anything.env. I created a file called hush-hush.env where I specified the variable. This is the full content of the file:

DB_PW=thebigsecret

Then in docker-compose, in the service, the env_file tag lets you point to it. You can even remove the environment tag in the yml file.

dataapidocker:
  image: ${DOCKER_REGISTRY-}dataapidocker
  build:
    context: .
    dockerfile: DataAPIDocker/Dockerfile
  env_file:
    - hush-hush.env

This worked perfectly. My app was able to discover the environment variable in code.

But then I evolved my solution to use another container for my database. The primary container depends on the new mssql container, and the mssql container requires that I pass in the database password as an environment variable. Since DB_PW already exists, I can do that easily enough with substitution (via curly braces). Here’s the new docker-compose file:

version: '3.4'

services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    env_file:
      - hush-hush.env
    depends_on:
      - db
  db:
    image: mcr.microsoft.com/mssql/server
    volumes:
      - mssql-server-julie-data:/var/opt/mssql/data
    environment:
      SA_PASSWORD: "${DB_PW}"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"

volumes:
  mssql-server-julie-data: {}

And there’s an order-of-operations issue here. When I build the docker-compose file, it complains that DB_PW is not available, and my app fails. The db service is not getting the contents of my hush-hush.env file. I tried a number of things, such as adding env_file to the db service. In the end, here’s what I learned.

The substitution syntax requires that the DB_PW variable be declared in docker-compose itself. I added that back in to the primary service, but it still wasn’t getting the value from hush-hush.env.

But you can have a .env file that has no name; the extension *is* the full name of the file. docker-compose reads that file early enough to provide the value to the declared DB_PW. Then all of the pieces fell into place. The mssql container was spun up with the value from DB_PW as its environment variable, and my app code was able to read the environment variable that Docker passed into the running container for its own tasks.

The final docker-compose.yml file looks like this:

version: '3.4'

services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    environment:
      - DB_PW
    depends_on:
      - db
  db:
    image: mcr.microsoft.com/mssql/server
    volumes:
      - mssql-server-julie-data:/var/opt/mssql/data
    environment:
      SA_PASSWORD: "${DB_PW}"
      ACCEPT_EULA: "Y"
    ports:
      - "1433:1433"

volumes:
  mssql-server-julie-data: {}

And it relies on a file named “.env” with my variable’s key/value pair defined (same as hush-hush.env above).

DB_PW=thebigsecret

Resources:
https://docs.docker.com/compose/environment-variables/

Announcing: Deep Dive into EF Core 2-Day Workshop

Join me in London June 17-18 for a 2-day deep dive into Entity Framework Core.

This is a new addition to the Skills Matter course catalog. Because it is a new course, we are looking for feedback on the proposed list of topics to be covered. If you are interested in attending, your input will be helpful.

Is the list of topics too long for 2 days? Does it touch on what you would want to learn in an advanced class? You can provide feedback on the course description page.

Day 1: Leverage Advanced Features

  • High level review of EF Core differences from EF6
  • Implementing logging to capture EF Core’s database and in-memory activity. Learn about different types of logging data to be captured
  • Learn various approaches to seeding such as via database scripts, code or using the migration-based seeding introduced in EF Core 2.1. You’ll also learn when each approach may be appropriate
  • Using migrations during development, within source control and during deployments
  • Integration testing your EF Core code

Day 2: EF Core in Your Software Architecture

  • The Great Repository Debate: Pros and Cons of the Repository Pattern/Generic Repositories for exposing EF Core
  • Designing Data Layers/APIs
  • Understanding complicated mapping conventions and supplementing those with custom mappings using the Fluent API
  • Designing for performance
  • *Bonus topic:* If all modules have been covered and time allows, you will also look at EF Core in Azure Functions and EF Core with Azure Cosmos DB

A Few Coding Patterns with the MongoDB C# API

In the February 2019 issue of MSDN Magazine (Exploring the Multi-Model Capability of Azure Cosmos DB Using Its API for MongoDB), my Data Points column explored working with the MongoDB model of Azure Cosmos DB using the mongocsharpdriver. I started by working against a local instance of MongoDB and then the Azure instance. But the column was a little bit long, so I cut out a few extraneous sections. I’m placing them here and linking to this blog post from the article.

In the article I used an IMongoCollection object to query and store data in the database. You must specify a type for the collection object to serialize and deserialize. In the article I typed the collection to my classes, e.g., IMongoCollection<Ship>. It’s also possible to type the collection generically to a BsonDocument. Here’s some information about that and a little bit of code.

Typing Collections to BsonDocument

Another path for mapping is to use a BsonDocument typed collection object that isn’t dependent on a particular type. This would allow you to have more generic methods. But it also means manually serializing and deserializing your objects, which is easy using ToBsonDocument for serializing:

var coll = db.GetCollection<BsonDocument>("Ships");
coll.InsertOne(ship.ToBsonDocument());
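
Going the other direction is easy, too. Here’s a hedged sketch of pulling a raw document back and rehydrating it into a Ship (it assumes the MongoDB.Bson.Serialization namespace is imported and that Ship has a Name property, as in the article’s model):

var coll = db.GetCollection<BsonDocument>("Ships");
// Find the raw document and deserialize it back into a Ship instance.
var doc = coll.Find(new BsonDocument("Name", "Agatha King")).FirstOrDefault();
var ship = BsonSerializer.Deserialize<Ship>(doc);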

Given that the documents have discriminators, you can then specify a type in your query to retrieve specific types, although, by default, hierarchies don’t get accounted for. The article refers to the C# API documentation on polymorphism; check it out to learn how to properly implement polymorphism in more detail. The following code will only pull back documents where _t matches the configured discriminator for Ship into ships, and for DecommissionedShip into dShips:

var coll = db.GetCollection<BsonDocument>("Ships");
var ships = coll.AsQueryable().OfType<Ship>().ToList();
var dShips = coll.AsQueryable()
                 .OfType<DecommissionedShip>().ToList();

Encapsulating the MongoClient, Database and Collection

Specifying a typed collection instance repeatedly, as I did in the article demos, can become a drag. You could set them up in advance, for example in a class that acts as a context for interacting with the database, as shown here:

public class ExpanseContext
{
  public IMongoDatabase ExpanseDb { get; private set; }
  public IMongoCollection<Ship> Ships { get; private set; }
  public IMongoCollection<Character> Characters { get; private set; }

  public ExpanseContext()
  {
    ExpanseDb = new MongoClient().GetDatabase("ExpanseDatabase");
    Ships = ExpanseDb.GetCollection<Ship>("ships");
    Characters = ExpanseDb.GetCollection<Character>("characters");
  }
}

This refactored code to insert a document is much more readable:

private static void InsertViaContext ()
{
  var context = new ExpanseContext ();
  var ship = new Ship { Name = "Agatha King" };
  context.Ships.InsertOne (ship);
}

Building C# Project-based Azure Functions in Visual Studio Code

I’ve been using the Azure Functions extension for Visual Studio Code for some time now, and it continues to evolve in great ways. The latest shift threw me for a loop, so I thought I would document some of it for those who may not have started yet. I should also state that for the past year or so I have focused on writing my functions with JavaScript, just because I like to mix things up a bit and it also makes it easier to share Azure Functions with devs who are not .NET focused.

I’ve also written about creating Azure Functions with VS Code in MSDN Magazine, but again, this has changed since I wrote about it. I’ve been using the version 2 APIs for a while, so I’m not talking (well, writing) about the change from v1 to v2 here, but about the change in the experience of using the Azure Functions extension.

Also notable is that the Azure Functions team actually recommends using Visual Studio for building C#-based apps and VS Code for JavaScript. But I’m so often on my MacBook and love using VS Code, so I am going down this path with C# anyway.

Having revisited the docs enough times to finally notice some key information, I realize why the experience has changed with the extension working with C#. Previously, I’d worked with C# script functions (.csx), which is the same as what you use when you work directly in the portal. But the extension templates now drive you to C# project functions, and there’s a big difference. C# script functions are more like the JavaScript functions: they depend on a manually created function.json file to define the bindings, and the tooling can install the appropriate extension packages based on those bindings. The csx files are compiled at run time. With a C# class library, you develop as you would any other C# class library, installing the relevant packages and then using attributes to identify methods as Azure Functions as well as trigger, input and output bindings. When you compile the library, the Azure Functions tooling generates a function.json file for you that gets deployed.
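
As a point of contrast, here’s a hedged sketch of what an HTTP-triggered C# script function looks like: a run.csx with no attributes, relying on a hand-written function.json for its bindings (the parameter names here are illustrative):

// run.csx
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

public static IActionResult Run(HttpRequest req, ILogger log)
{
    // The "req" binding is defined in a hand-written function.json, not in an attribute.
    log.LogInformation("C# script HTTP trigger processed a request.");
    string name = req.Query["name"];
    return new OkObjectResult($"Hello, {name}");
}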

Because I was used to creating functions with JavaScript or the C# script path, the new default for the Azure Functions extension, which uses C# class libraries instead, really threw me for a loop. So I decided to document this workflow as I had to learn it. I think I still prefer the lighter weight C# script (.csx) or JavaScript flow, but that might align with my preference in many scenarios for VS Code over Visual Studio.

Preparing Visual Studio Code

So first things first: you’ll need to install the Azure Functions and Azure Account extensions into VS Code. The Azure Functions extension relies on the Azure Functions Core Tools; the extension installation instructions will help you get all that you need, and the extension checks for updates and prompts you to update those tools as needed. In fact, I got this prompt last night.

With the extensions installed, it is time to create a function app project. You should already have a folder created to house your project and you might as well have it open in VS Code. Mine is named AzureFunctionProj.

Then you can click on the “Create New Project” icon on the function bar (the icons show up when you hover over the bar) to create a new Function App project inside your folder.


This part of the workflow has not changed.

  1. It will ask you to point to a folder and the open folder should be there as a default to select.
  2. It will then have you select a language you’ll use in your app. From the options (C#, JavaScript, Python (still a preview) and Java (also a preview)), I’ll choose C#.

As a result, a new .NET Core project will be created from the template, and you’ll see the new files in the folder explorer.


All of these files inside the AzureFunctionProj folder were created by the template. Most important is the csproj file, shown here with the most relevant settings:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
    <AzureFunctionsVersion>v2</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
   <PackageReference Include="Microsoft.NET.Sdk.Functions"
                     Version="1.0.24" />
  </ItemGroup>
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
     <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
     <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>

Creating a function in the function app is where things are quite a bit different.

You begin as always with the “add a function” icon.

The first step is familiar, selecting which folder has the function app that the function should be in:


I’ll choose AzureFunctionProj.

Then there are a number of trigger templates to choose from, which is nice, but more interesting are the three options at the bottom:

First is the project runtime, and I definitely want v2 (“~2”). Next is the language; you can use different languages for different functions in the function app, and this is showing the default I chose already: C#. And finally, the list of trigger templates is filtered to “Verified”.

You can change these options by clicking on them.

The filter options are Verified, Core and All. Core and All currently reveal the same list, which includes a few extra preview triggers: DurableFunctionsOrchestration, SendGrid, EventHubTrigger and IotHubTrigger.


Now, in the past I was creating JavaScript functions inside my .NET Core function app, but now I am going to continue with C#, because this is where things really surprised me. I’ll choose HttpTrigger and am then prompted to provide a name. I’ll just leave the default: HttpTriggerCSharp. I’m then asked to provide a namespace for the class that will be created; the default is “Company.Project” and I’ll change it to FunctionTests.HttpTest1. The final bit of info to be collected is the security for the function. Of the options, I will select Anonymous because it’s a demo and I don’t want to have to deal with credentials.

That’s it. The function is created.

Some More Class Library Project Differences

My past experience gave me the expectation that a new folder would be created inside the function app folder with the name of the function, and inside it, a class file for the function and a function.json file to contain the binding configurations. The class file was there (though not in its own folder), but there was no function.json file. Also interesting and new to me were the FunctionName attribute on the Run method and the HttpTrigger attribute on the HttpRequest in the Run method’s signature. And I’m not used to having all of those using statements when using csx script.
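
Here’s roughly what that generated class looks like. This is a reconstruction of the HttpTrigger template of the time, so the exact body may differ slightly from what the extension produces:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionTests.HttpTest1
{
  public static class HttpTriggerCSharp
  {
    [FunctionName("HttpTriggerCSharp")]
    public static async Task<IActionResult> Run(
      [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
      HttpRequest req,
      ILogger log)
    {
      log.LogInformation("C# HTTP trigger function processed a request.");

      // The template checks the query string first, then the request body.
      string name = req.Query["name"];
      string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
      dynamic data = JsonConvert.DeserializeObject(requestBody);
      name = name ?? data?.name;

      return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
  }
}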

When building the project, the tooling reads those attributes and builds a function.json that goes in the bin folder for deployment.

But it’s more than just the familiar bindings. Notice the generatedBy, configurationSource, scriptFile and entryPoint tags.
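
To give a sense of what gets generated, here’s an approximation of that function.json (the assembly name and SDK version will vary with your project):

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "httpTrigger",
      "methods": [ "get", "post" ],
      "authLevel": "anonymous",
      "name": "req"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/AzureFunctionProj.dll",
  "entryPoint": "FunctionTests.HttpTest1.HttpTriggerCSharp.Run"
}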

So the first binding, the httpTrigger binding, looks familiar to me. The function will respond to httpTrigger.

You can test out this default either by running or debugging. To run, you can use VS Code’s CTRL-F5 keyboard combo or, if you prefer the CLI, the Azure Functions Core Tools command:

func host start

That will run the function and provide a URL to try out. The template’s “stake in the ground” method is written to accept either a query parameter or JSON in the body. I’ll just use a query parameter:

http://localhost:7071/api/HttpTriggerCSharp?name=Julie

And the browser outputs the response: “Hello, Julie”.

Adding an Output Binding to Azure Cosmos DB

What if I wanted to add an output binding? I’m used to doing that by editing function.json, but since I’m on this path of attribute-defined bindings, I will add the binding that way. Let’s add an output binding for Azure Cosmos DB. That way, this function will respond to an HTTP request and insert some data into a Cosmos DB database. I already have an Azure Cosmos DB account, so I will define this to target a new collection in a new database in the existing account.

In order to work with an Azure Cosmos DB binding, I need to add the relevant package to my project. Because I’m building my function as a C# class library, I can do this as I would for any other NuGet package: by adding it directly to the csproj file or by adding the package with the .NET Core CLI. I’ll use the CLI:

dotnet add package Microsoft.Azure.WebJobs.Extensions.CosmosDB --version 3.0.3

Note that if I were building the function with JavaScript or  C# script, I would need to register the package using the tools CLI (func extensions install -p packagename).

Now the package is in csproj:

<ItemGroup>
  <PackageReference
    Include="Microsoft.Azure.WebJobs.Extensions.CosmosDB"
    Version="3.0.3" />
  <PackageReference
    Include="Microsoft.NET.Sdk.Functions"
    Version="1.0.24" />
</ItemGroup>
After restoring, I can add the output binding. Currently the default function is asynchronous and you can’t add out parameters to an async method. Return values are one option. See https://docs.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#using-the-function-return-value for details. ICollector or IAsyncCollector is another (https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-class-library#writing-multiple-output-values).
But I’m going to just make the Run method synchronous and create an output parameter instead. Like the trigger binding parameter, I’ll need to add an attribute to the output parameter to specify the binding. I’m also supplying parameters for the database and collection names, the name of the setting that has the connection string to the database account, and one last setting to ensure the database and collection get created if needed.
[FunctionName("HttpTriggerCSharp")]
public static ActionResult Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
    HttpRequest req,
    [CosmosDB(databaseName: "CSharpDatabase",
              collectionName: "CSharpCollection",
              ConnectionStringSetting = "MyCosmosDBConnection",
              CreateIfNotExists = true)] out dynamic document,
    ILogger log)
I added the MyCosmosDBConnection setting, which is defined in the local.settings.json file:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "MyCosmosDBConnection": "this is where my connection string goes"
  }
}
Note that if you were creating a new function with an Azure Cosmos DB trigger, the tooling would prompt you for all of the relevant database information, include that in an attribute in the function code and add them to the function.json.
Now there is just one last puzzle piece. The function expects me to provide a value for the document variable, which the binding will then insert into the database. The full class listing is below; the key addition is the single line that populates the document with a Name and an Added property. The function and the binding will do all of the rest.
using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

namespace FunctionTests.HttpTest1
{
  public static class HttpTriggerCSharp
  {
    [FunctionName("HttpTriggerCSharp")]
    public static ActionResult Run(
      [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
      HttpRequest req, 
      [CosmosDB(databaseName: "CSharpDatabase",
                collectionName: "CSharpCollection",
                ConnectionStringSetting = "MyCosmosDBConnection",
                CreateIfNotExists=true)] out dynamic document,   
      ILogger log)
    {
      log.LogInformation("C# HTTP trigger function processed a request.");
      string name = req.Query["name"];
      document=new { Name = name, Added = DateTime.Now };
      return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string");
     }
  }
}
Based on my previous experience of manually configuring function.json, I expected that after building, the function.json in the bin folder would include the Cosmos DB output binding information, but it didn’t. Again, this is due to the differences between JavaScript/C# script functions and C# class library functions.
Within VS Code I can run or debug the function. Debug is F5 or the debug icon. To run without debugging, you can use CTRL-F5 or the Core Tools CLI command:
func host start
When using the CLI command, I found in my testing that the version I’m using seems to require me to run dotnet clean first for it to succeed. CTRL-F5 does that step for you. I made a note of this in a pre-existing GitHub issue. You can read that here.
There is a lot of useful info in the official docs. Two that I leaned on are: