
Building C# Project-based Azure Functions in Visual Studio Code

I’ve been using the Azure Functions extension for Visual Studio Code for some time now and it continues to evolve in great ways. The latest shift threw me for a loop, so I thought I would document some of it for those who may not have started yet. I should also state that for the past year or so I have focused on writing my functions with JavaScript, just because I like to mix things up a bit and it also makes it easier to share Azure Functions with devs who are not .NET focused.

I’ve also written about creating Azure Functions with VS Code in MSDN Magazine, but again, this has changed since I wrote about it. I’ve been using the version 2 APIs for a while, so I’m not talking (well, writing) about the change from v1 to v2 here, but about the change in the experience of using the Azure Functions extension.

Also notable is that the Azure Functions team actually recommends using Visual Studio for building C#-based apps and VS Code for JavaScript. But I’m so often on my MacBook and love using VS Code, so I am going down this path with C# anyway.

Having revisited the docs enough times to finally notice some key information, I realize why the experience of working with C# in the extension has changed. Previously, I’d worked with C# script functions (.csx), which is the same thing you use when you work directly in the portal. But the extension templates now drive you to C# project functions, and there’s a big difference. C# script functions are more like the JavaScript functions: they depend on a manually created function.json file to define the bindings, the tooling can install the appropriate extension packages based on those bindings, and the .csx files are compiled at run time. With a C# class library, you develop as you would any other C# class library, installing the relevant packages and then using attributes to identify methods as Azure Functions as well as to define trigger, input and output bindings. When you compile the library, the Azure Functions tooling generates a function.json file for you that gets deployed.
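
To make the contrast concrete, here’s a rough sketch of an HTTP-triggered C# script function: a Run method in a run.csx file whose trigger and bindings live in a hand-written function.json sitting next to it. This is just an illustrative sketch, not code from this post’s project.

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

// run.csx: no attributes here; the trigger and any other bindings
// are defined in the function.json file beside this script.
public static IActionResult Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# script (.csx) HTTP trigger processed a request.");
    string name = req.Query["name"];
    return name != null
        ? (IActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string");
}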

Because I was used to creating functions with JavaScript or the C# script path, the new default for the Azure Functions extension, which uses C# class libraries instead, really threw me for a loop. So I decided to document this workflow as I had to learn it. I think I still prefer the lighter-weight C# script (.csx) or JavaScript flow, but that might align with my preference in many scenarios for VS Code over Visual Studio.

Preparing Visual Studio Code

So first things first: you’ll need to install the Azure Functions and Azure Account extensions into VS Code. The Azure Functions extension relies on the Azure Functions Core Tools, and the extension installation instructions will help you get everything you need. The extension also checks for updates and prompts you to update those tools as needed. In fact, I got this prompt last night.

2019-01-16_09-28-10.png

With the extensions installed, it is time to create a function app project. You should already have a folder created to house your project and you might as well have it open in VS Code. Mine is named AzureFunctionProj.

Then you can click on the “Create New Project” icon on the function bar (the icons show up when you hover over the bar) to create a new Function App project inside your folder.

2019-01-17_18-11-39.png

This part of the workflow has not changed.

  1. It will ask you to point to a folder, and the open folder should be there as a default to select.
  2. It will then have you select the language you’ll use in your app. From the options (C#, JavaScript, Python (still in preview) and Java (also in preview)), I’ll choose C#.

As a result, a new .NET Core project is created from the template and you’ll see the following in the folder explorer:

2019-01-17_18-19-58.png

All of these files inside the AzureFunctionProj folder were created by the template. Most important is the csproj file, where some of the most relevant settings are the target framework, the Functions runtime version and the Microsoft.NET.Sdk.Functions package reference.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
    <AzureFunctionsVersion>v2</AzureFunctionsVersion>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.NET.Sdk.Functions"
                      Version="1.0.24" />
  </ItemGroup>
  <ItemGroup>
    <None Update="host.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="local.settings.json">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
      <CopyToPublishDirectory>Never</CopyToPublishDirectory>
    </None>
  </ItemGroup>
</Project>

Creating a function in the function app is where things are quite a bit different.

You begin as always with the “add a function” icon.

2019-01-17_18-11-39 copy.png

The first step is familiar: selecting which folder contains the function app that the new function should go into:

2019-01-17_22-06-37.png

I’ll choose AzureFunctionProj.

Then there are a number of trigger templates to choose from, which is nice, but more interesting are the three options at the bottom:

2019-01-17_16-42-25.png

First is the project runtime; I definitely want v2 (“~2”). Next is the language: you can use different languages for different functions in the function app, and this is showing the default I already chose, C#. And finally, the list of trigger templates is filtered to “Verified”.

You can change these options by clicking on them.

The filter options are Verified, Core and All. Core and All currently reveal the same list, which includes a few extra preview triggers: DurableFunctionsOrchestration, SendGrid, EventHubTrigger and IotHubTrigger.

2019-01-17_16-43-20.png

In the past I was creating JavaScript functions inside my .NET Core function app, but now I am going to continue with C# because this is where things really surprised me. I’ll choose HttpTrigger and am then prompted to provide a name. I’ll just leave the default: HttpTriggerCSharp. I’m then asked to provide a namespace for the class that will be created. The default is “Company.Project”; I’ll change it to FunctionTests.HttpTest1. The final bit of info to be collected is the security level for the function. Of the options, I will select Anonymous because it’s a demo and I don’t want to have to deal with credentials.

That’s it. The function is created.

Some More Class Library Project Differences

My past experience gave me the expectation that a new folder would be created inside the function app folder with the name of the function, and that inside it there would be a class file for the function and a function.json file containing the binding configurations. The class file was there (though not in its own folder), but there was no function.json file. Also interesting and new to me were the FunctionName attribute on the Run method and the HttpTrigger attribute on the HttpRequest parameter in the Run method’s signature. And I’m not used to having all of those using statements when working with csx script.

2019-01-19_15-16-45.png
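
To give an idea of what that screenshot shows, here is roughly what the generated class looks like. I’m reconstructing the template from memory, so minor details may differ, but the attributes are the interesting part:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

namespace FunctionTests.HttpTest1
{
  public static class HttpTriggerCSharp
  {
    // FunctionName marks this method as a function named HttpTriggerCSharp.
    [FunctionName("HttpTriggerCSharp")]
    public static async Task<IActionResult> Run(
      // The HttpTrigger attribute defines the trigger binding right in the
      // signature, so no hand-written function.json is needed.
      [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
      HttpRequest req,
      ILogger log)
    {
      log.LogInformation("C# HTTP trigger function processed a request.");

      string name = req.Query["name"];

      string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
      dynamic data = JsonConvert.DeserializeObject(requestBody);
      name = name ?? data?.name;

      return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
  }
}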

When building the project, the build tooling reads those attributes and builds a function.json that goes in the bin folder for deployment.

2019-01-17_22-38-03.png
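
Here’s approximately what that generated file contains for this function. The exact path and version will vary with your project name and SDK version, and your output may include a field or two I’ve left out:

{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "httpTrigger",
      "route": null,
      "methods": [
        "get",
        "post"
      ],
      "authLevel": "anonymous",
      "name": "req"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/AzureFunctionProj.dll",
  "entryPoint": "FunctionTests.HttpTest1.HttpTriggerCSharp.Run"
}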

But it’s more than just the familiar bindings. Notice the generatedBy, configurationSource, scriptFile and entryPoint properties.

So the first binding, the httpTrigger binding, looks familiar to me: the function will respond to an HTTP trigger.

You can test out this default either by running or debugging. To run, you can use VS Code’s CTRL-F5 keyboard combo or, if you prefer the CLI, the Azure Functions Core Tools command:

func host start

That will run the function and provide a URL to try out. The template’s “stake-in-the-ground” method is written to accept either a query parameter or JSON in the body. I’ll just use a query parameter:

http://localhost:7071/api/HttpTriggerCSharp?name=Julie

And the browser outputs

2019-01-19_15-57-04.png
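
(Per the template’s return statement, the response body is simply: Hello, Julie.)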

Adding an Output Binding to Azure Cosmos DB

What if I wanted to add an output binding? I’m used to doing that by editing function.json, but since I’m on this path of attribute-defined bindings, I will add the binding that way. Let’s add an output binding for Azure Cosmos DB, so that this function will respond to an HTTP request and insert some data into a Cosmos DB database. I already have an Azure Cosmos DB account, so I will define the binding to target a new collection in a new database in that existing account.

In order to work with an Azure Cosmos DB binding, I need to add the relevant package to my project. Because I’m building my function as a C# class library, I can do this as I would for any other NuGet package: by adding it directly to the csproj or by adding the package with the .NET Core CLI. I’ll use the CLI:

dotnet add package Microsoft.Azure.WebJobs.Extensions.CosmosDB --version 3.0.3

Note that if I were building the function with JavaScript or C# script, I would need to register the binding extension using the Core Tools CLI (func extensions install -p packagename).

Now the package is in csproj:

<ItemGroup>
  <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.CosmosDB"
                    Version="3.0.3" />
  <PackageReference Include="Microsoft.NET.Sdk.Functions"
                    Version="1.0.24" />
</ItemGroup>

After restoring, I can add the output binding. Currently the default function is asynchronous, and you can’t add out parameters to an async method. Return values are one option; see https://docs.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#using-the-function-return-value for details. ICollector or IAsyncCollector is another (https://docs.microsoft.com/en-us/azure/azure-functions/functions-dotnet-class-library#writing-multiple-output-values); I’ve sketched that alternative just after the code below.

But I’m going to just make the Run method synchronous and create an output parameter instead. Like the trigger binding parameter, the output parameter needs an attribute to specify the binding. I’m also supplying parameters for the database and collection names, the name of the setting that holds the connection string to the database account, and one last setting to ensure the database and collection get created if needed:

[FunctionName("HttpTriggerCSharp")]
public static ActionResult Run(
  [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
  HttpRequest req,
  [CosmosDB(databaseName: "CSharpDatabase",
            collectionName: "CSharpCollection",
            ConnectionStringSetting = "MyCosmosDBConnection",
            CreateIfNotExists = true)] out dynamic document,
  ILogger log)
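
For comparison, if I wanted to keep the method asynchronous, the IAsyncCollector route mentioned above would look roughly like this. This is just a sketch based on those docs (it also needs using System.Threading.Tasks), not the code I’m using in this post:

[FunctionName("HttpTriggerCSharpAsync")]
public static async Task<IActionResult> Run(
  [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
  HttpRequest req,
  [CosmosDB(databaseName: "CSharpDatabase",
            collectionName: "CSharpCollection",
            ConnectionStringSetting = "MyCosmosDBConnection",
            CreateIfNotExists = true)] IAsyncCollector<dynamic> documents,
  ILogger log)
{
  string name = req.Query["name"];
  // AddAsync hands the document to the output binding, which inserts it.
  await documents.AddAsync(new { Name = name, Added = DateTime.Now });
  return new OkObjectResult($"Hello, {name}");
}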

The MyCosmosDBConnection setting is defined in the local.settings.json file:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "MyCosmosDBConnection": "this is where my connection string goes"
  }
}

Note that if you were creating a new function with an Azure Cosmos DB trigger, the tooling would prompt you for all of the relevant database information and include it in the attribute in the function code for you (for script-based functions it goes into function.json instead).

Now there is just one last puzzle piece. The function expects me to provide a value for the document variable, which the binding will then insert into the database. The full class listing is below; the single new line, the one that populates the document with a Name and an Added property, is called out with a comment. The function and the binding will do all of the rest.

using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

namespace FunctionTests.HttpTest1
{
  public static class HttpTriggerCSharp
  {
    [FunctionName("HttpTriggerCSharp")]
    public static ActionResult Run(
      [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
      HttpRequest req, 
      [CosmosDB(databaseName: "CSharpDatabase",
                collectionName: "CSharpCollection",
                ConnectionStringSetting = "MyCosmosDBConnection",
                CreateIfNotExists=true)] out dynamic document,   
      ILogger log)
    {
      log.LogInformation("C# HTTP trigger function processed a request.");
      string name = req.Query["name"];
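      // This single new line populates the document that the Cosmos DB output binding will insert.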
      document=new { Name = name, Added = DateTime.Now };
      return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string");
     }
  }
}

Based on my previous experience of manually configuring function.json, I expected the function.json in the bin folder to include the CosmosDB output binding information after building, but it didn’t. Again, this is due to the differences between JavaScript/C# script functions and C# class library functions: the generated file only describes the trigger, while the other bindings are resolved from the attributes at run time.

Within VS Code I can run or debug the function. Debug is F5 or the debug icon. To run without debugging, you can use CTRL-F5 or the Core Tools CLI command:

func host start

When using the CLI command, I found in my testing that the version I’m using seems to require me to run dotnet clean first for it to succeed. CTRL-F5 does that step for you. I made a note of this in a pre-existing GitHub issue.
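
So from the CLI, in the project folder, my run sequence looks like this:

dotnet clean
func host start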

There is a lot of useful info in the official docs, including the two articles I linked above: the triggers and bindings reference and the C# class library developer guide.

Copying an Azure Function App using the Portal, GitHub and Cloud Shell

I’ve got an Azure Function App with some functions that I built for a demo for a conference session. I first used this at AngularMix and had named the function app AngularMix2017. Then when I did it for DevIntersection this past week, I recreated it as DevIntersection2017 because you can’t rename a function app. I’m doing it yet again next week at another conference (NetConf.co) and decided it was time to create it as a reusable name: DataApiDemo.

While you can’t flat out copy a function app if you are working strictly in the portal (which is how I’ve been working during my first learning stages of building function apps), you can easily enough duplicate it by downloading and uploading it again.

Keep in mind that if you are developing functions in VS 2017 or VS Code (combined with the Azure CLI) you can just use the tooling to publish the function apps along with the settings. But I don’t like to take the easy path. I learned a lot of new tricks and tools by going this route. Even if I leverage the tooling on my next go round, I’ve learned a lot which gives me a better understanding overall, more control and troubleshooting skills.

My first step was to create the new function app (dataapidemo).

The Overview blade of an Azure function app has a “download app content” option. That will pull down all of the relevant files onto your machine. So next, I go to the overview of the DevIntersection2017 function app and click on the download app content link.

When downloading the app content, you have an option to also grab the app settings. I have a number of secrets stored in my settings: things like keys to my databases, account IDs and more. If you check that option, the settings will get downloaded to a local settings file. If you were using this feature to continue your development locally, e.g. in VS2017, then you’d want them. But for my scenario, they are not useful and even a potential security issue (you’ll see shortly), so I’m going to leave that unchecked.

I can see the files in the designated folder on my computer.

I’ve already created a Github repository to store these files in: https://github.com/julielerman/DataApiAzureFunctions.

At the command line in the new folder, I’ll git init, add all of the files into the local repo and commit them. Then I can run the git commands to connect that folder to my GitHub repo and then push my local repository into the online repository.
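
In case it’s helpful, that sequence is roughly the following (the commit message is just a placeholder, and master was my default branch at the time):

git init
git add .
git commit -m "Initial import of downloaded function app content"
git remote add origin https://github.com/julielerman/DataApiAzureFunctions.git
git push -u origin master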

When that’s complete, the files are all in my GitHub repo.

Since I’m using a public repository, this is why I didn’t want the local.settings.json file filled with my secrets to be downloaded. Of course, I could have deleted it locally or told Git to ignore that file. One other point about this: if I were developing this locally, I could keep that file on my machine and use VS2017 or the Azure CLI tools to publish my app along with the settings to the portal. But that is not my goal here.

Now I can take advantage of a long-time feature of the Azure Portal: connecting it to a repository to auto-deploy from the repository to my app (in this case, to my function app).

Back in the portal, I select the new function app again, then from its Platform Features page, open Deployment Options. Choose Setup new deployment and follow the steps to connect the function app to your repository. When you’ve completed the setup, you can use the Sync button to force the deployment to happen. All of the files are now in my new dataapidemo function app, including the function.json files that contain each function’s settings (e.g. integrations) and any other files I may have added to my functions.

Because I am using Twilio, one of my functions has a reference to Twilio in a project.json file, and I need to restore that package to this function app. I did that by opening up the project.json and making a small mod to it (adding or removing a blank line), then saving it. That triggers the package restore to happen.
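
For reference, a portal-based function’s project.json is just a small NuGet dependency list. Mine looks something like the following; the exact version number here is only illustrative:

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Twilio": "5.9.2"
      }
    }
  }
}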

What’s not there, however, are the function app settings. These are settings that are defined on the app and used by one or more of the functions. There are some default settings created by Azure, but I also have some others: those secrets, such as the account and telephone number for my Twilio integration, as well as a key for the function.

Some of these settings are my own custom settings that I want in the new function app. I could add them via the portal’s app settings interface, but I don’t like that path when I have a slew of settings.

That means moving on to yet another amazing tool in the Azure Portal: Cloud Shell. Here you can run Bash or PowerShell commands from the Azure CLI directly against services in your subscription. You can also do this locally on your machine, with the extra caveat that there you must provide credentials and other metadata.

In the portal, open the Cloud Shell. This gives you an interactive terminal window.

If you have multiple subscriptions, you want to be sure that the shell is set to the one where your function app is stored.

You can do this by listing the subscriptions with the az account list command.

If the one you want is not default, then set it with:

az account set -s [ID]

This shell can do auto-complete so if the id starts with a123, you can type

az account set -s a123

then hit Tab and the ID will be completed for you. Hit Enter and it becomes the default subscription to work in.

I’ll use the az functionapp commands to get the settings into my function app.

First I’ll list the existing settings. You do this with the command

az functionapp config appsettings list --name [functionname] --resource-group [resource name]

The parameter shortcuts for name and resource-group are -n and -g. Mine are both the same name: dataapidemo. Here’s what the command looks like (except for the wrapping that my blog is forcing):

az functionapp config appsettings list -n dataapidemo -g dataapidemo

The app settings are listed as JSON by default, although you can change the output format with the --output (-o) parameter.

I’ll need to see the settings I want to copy from the other function app so I run:

az functionapp config appsettings list -n devintersection2017 -g devintersection2017 -o tsv

which gives me the list of settings from which to choose.

In addition to listing the settings, you can add new ones with the set verb and remove them with the delete verb. Either goes in place of the list verb.
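
For example, if I’m remembering the CLI syntax correctly, deleting a setting (with a placeholder name here) looks something like this:

az functionapp config appsettings delete -n dataapidemo -g dataapidemo --setting-names "SomeOldSetting"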

You use set by combining the set command with the --settings parameter. After --settings, you can list one or more settings in this format:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "property1name=value", "property2name=value"

Note that this is wrapping here on the blog post but not in the shell window.

So I added the handful of settings I wanted to add which looked more like:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "TwilioSID=myvalue", "TwilioAccount=myvalue"

There were others I had to add as well. For example, my function needs to know how to get to my Azure Cosmos DB document database, so it has a setting that uses the database name as the property name and the database account’s endpoint as the value.
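
With made-up names, adding that setting would look something like:

az functionapp config appsettings set -n dataapidemo -g dataapidemo --settings "DataApiDemoDb=https://mycosmosaccount.documents.azure.com:443/"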

Once I had done that, I was able to get rid of the function apps that were named for each conference and use a common one no matter where I am sharing it.

And boyohboy did I learn a lot of new things!