Resize Windows’ Screen Resolution with a Touch of a Stream Deck Button

I have a lovely pair of widescreen monitors with 1080p resolution. However, when recording software training courses for Pluralsight, we are asked to use a resolution of 1280×720 so that text and code are legible across a variety of devices and screen sizes.

Therefore, when recording a course, which may take me many, many weeks, I tend to leave one of my monitors at 1280×720. But I’m constantly doing other things on that monitor, such as email or writing, and that resolution is uncomfortable for those tasks.

There is no easy way to change the resolution other than going into the system settings. But now I have a super easy way to switch that monitor’s resolution back and forth.

Like many of us who are now creating content at home (although I am not streaming on Twitch like many of my friends), I recently added an Elgato Stream Deck controller to my toolkit, along with some key lights, too! I use the Stream Deck to control the lights when recording video that I need to appear in.

Step 1: Find a command-line tool for changing the screen resolution

There are some GUI apps and a few CLI tools. After asking around on Twitter, I learned about Display Changer 2, used by two very trusted nerds (and friends):

However, there are TWO versions of Display Changer, and DC2 is the programmable one. Not knowing there was a simpler version, I got stuck on a path that was way more complicated than I needed. I learned how to create XML configuration files for various display settings, then created a PowerShell script to execute DC2 against those config files. But Stream Deck can’t run PowerShell files, so then I had to create a batch file to run the PS1 file. It was madness, but I was determined and got it all working. And then I blogged the very complicated path. Oy vey!

Then back to my brainy pal, who said… well, y’know:

Now, this is the pal from whom I learned A BETTER WAY to load utensils into a dishwasher, a trick I am now obsessed with (the dishwasher trick, that is. I do like Glenn, but no, I’m not obsessed with him)! So I always trust him, but I looked at that and thought, “DC2 doesn’t have these switches, and what is DC64?”

I went back to 12noon’s website and realized that “Display Changer” is different from “Display Changer II” and is a simpler tool to work with. Even though I felt like such a dope for getting stuck on the complicated path, I was happy for the MUCH EASIER way! So if you haven’t ever configured a Stream Deck button, let’s finish this up with Glenn’s easy-street way.

Step 2: Identify the Monitor

I’d recommend practicing the command at the command line before just shoving it into Stream Deck. Also, since I was aiming for my secondary monitor, I needed to use the 
dccmd -listmonitors
command to find out how to address that monitor. Turns out it’s 
"\\.\DISPLAY2"

So the command to change that monitor’s resolution to 1280×720 is:

"C:\Program Files (x86)\12noon Display Changer\dc.exe" -monitor="\\.\DISPLAY2" -width=1280 -height=720

Step 3: Configure Stream Deck buttons to run the commands

In the Stream Deck app, drag the System/Open option onto the button you want to configure for 720p.

In the settings, leave Title blank.

In the App/File setting, paste in your command:

"C:\Program Files (x86)\12noon Display Changer\dc.exe" -monitor="\\.\DISPLAY2" -width=1280 -height=720

In the icon selector, you can choose Create New Icon to design and then download an icon for your button.

Set up another button to change the monitor back to your default resolution. Mine is 1080p.

"C:\Program Files (x86)\12noon Display Changer\dc.exe" -monitor="\\.\DISPLAY2" -width=1920-height=1080

Originally, these were the icons I created for my buttons. They’re bright and good enough for me.

[Images: streamdeck_key_720p-3.png and streamdeck_key_1080p-2.png]

But Glenn was unimpressed and created some new ones and sent them to me. I’m sure he’ll be happy for me to share them.

[Images: key_julie_720.png and key_julie_1080.png]

Here’s the Stream Deck with its new buttons. The setup works like a charm!

MSDN Mag Data Points Column Archives in Microsoft Docs

With MSDN Magazine shutting down, all of the content has been archived on the Microsoft docs site.

You can get to a listing of magazines by issue at https://docs.microsoft.com/en-us/archive/msdn-magazine/msdn-magazine-issues

And if you are looking specifically for my Data Points column archives, here is a link to the list of those articles:

https://docs.microsoft.com/en-us/archive/msdn-magazine/authors/Julie_Lerman

Pluralsight is totally free for the month of April

While many of you who read my blog are already Pluralsight subscribers with work or personal subscriptions, there are so many who do not have access. So Pluralsight is opening up the entire library of over 7,000 courses for people to watch while stuck at home. And you do not need to use a credit card to sign up.*

So whether you want to watch one of my courses such as

Or any of the other 7,000+ courses from some of the most knowledgeable devs who happen to be great at teaching…

Have at it!

There is also a free plan for business accounts.

Business Free April Details: “To support your team’s skill development during these new challenges, for a limited time we’ve extended our free team trial from 14 days to 30 days.”

*The fine print: Free April is open to anyone who is not a current, active subscriber. New free accounts and reactivated accounts opened through April 30, 2020 will have access to Pluralsight’s library of video courses through April. Payment information will not be required for new free accounts opened through April 30, 2020. New free accounts opened after May 1, 2020, will only have access to a portion of Pluralsight’s library and will require payment information.

November Conferences: BuildStuff in Lithuania and GOTO in Copenhagen

I have one last conference trip coming up in 2019, and it’s a two-fer.

First, I’ll be at BuildStuff in Vilnius, Lithuania, Nov 13-17. I’m excited to be giving a keynote, “Living with Your Legacy”. If you are planning to attend but haven’t registered yet, you can use my last name, “LERMAN”, as a discount code. Register here. The Twitter hashtag is #BUILDSTUFFLT.

From Lithuania, I’ll then be traveling to Copenhagen for GOTO Copenhagen (Nov 18-20). This conference also has a discount code, “speakerfriend”. The Twitter hashtag for this conference is #gotocph.


Resources from BASTA! “Living with Your Legacy” Keynote

I made reference to various books, podcasts, great minds and more in the keynote at BASTA! Conference. Here are links for the curious:

Michael Feathers’ book Working Effectively with Legacy Code, Prentice Hall, 2004

Eric Evans’ book Domain-Driven Design, Addison-Wesley, 2003

Michael Feathers’ blog post Is Technical Debt Just a Metaphor?

Legacy Code Rocks podcast

Corgi Bytes (Andrea Goulet & M. Scott Ford)

Nick Tune’s DDD Hidden Lessons talk from Craft Conference

Accelerate, The Science of DevOps: 2018 book, 2019 update (free PDF from Google). Nicole Forsgren, PhD, Jez Humble, and Gene Kim

Grady Booch on COBOL at 60


Sad News: MSDN Mag & (Data Points) Coming to an End

From MSDN Mag

Perhaps I live in a world of rose-colored glasses, but this was a surprise to me when Michael Desmond, the Editor-in-Chief, called to let me know in advance of the public announcement.

I’ve loved the magazine since long before I was blessed with the opportunity to write for it. But as a columnist for the past 10 years, I’m somewhat heartbroken. The Data Points column (I hope this link will continue to work), which I inherited from John Papa in early 2010, has been a font of inspiration for me. I’ve used it as an excuse to dig into data-related technologies that I was curious about.

As long as there was some data angle, the topic was fair game. This allowed me to take my first successful adventures into Git (2014: Git, It’s Just Data), serverless computing (2018: Creating Azure Functions to Interact with Cosmos DB) and even front-end development (2015: Revisiting JavaScript Data Binding — Now with Aurelia). And on the true data front, I had a great reason to start learning about document databases (2011: What the Heck Are Document Databases?), which led to a lot of articles involving Azure Cosmos DB. Thanks to SQL Server for Linux, I finally had the courage to dive into Docker, and I’ve gotten to write a slew of articles on different aspects of that.

Many of you have also followed my journey as I gained more and more expertise in Domain-Driven Design and continually checked in to see how well EF, and then EF Core, worked as a mapper between DDD-focused classes and a relational database. The first article I wrote connecting the two was in 2013: Shrink EF Models with DDD Bounded Contexts.

And of course there have been a few articles on or using Entity Framework! Over 50 of them!

In all, if you count the final two articles I have in the works, I’ve written (if I’m counting correctly) 82 columns. And there were a few articles in there as well that weren’t for the column, such as ones on Visual Studio Live Share and Azure Data Studio for special issues.

But what I will never forget is the very first article I wrote for the magazine. It was a gift of an opportunity thanks to the amazing Sara Spalding, who at the time was in charge of the entire MSDN operation (at an impressively young age). That was the April 2005 issue, and I wrote an article on ADO.NET 2.0. I still have the magazine!

People have suggested that with the demise of the magazine, I should just continue the column on my blog. But admittedly, in addition to the incredible opportunity to share my learnings with an interested audience, having deadlines, a copy editor, tech reviewers and, equally important, a paycheck really drove me to produce this column so diligently for almost 10 years. But I’ll never stop researching and sharing! That’s for sure.

With the evolution of Microsoft Docs, an entire team creating that content, and the still-growing developer advocacy teams at Microsoft, you will certainly continue to get great content as MSDN Magazine yields to these fantastic resources.

Quick Tips for Migrating EF Core at Runtime on Azure

This question was asked in the discussion for my EF Core 2 Getting Started course on Pluralsight:

Hi Julie,
Thank you for your course. I have a question. Can you please advise what the best practice is for deploying a code-first migration to the production database (Azure)? I have created an ASP.NET Core MVC and EF Core application using the code-first migration approach and deployed it to Azure. If I change any schema in our code-first approach, how can I update the schema in the production database (without losing the data in the production database)?

Here is my response, with a few ideas I thought were worth sharing outside of the course discussion. (Note that this is just high level.)

For simple solutions, one path I have used is to call Database.Migrate at startup. The VS publish workflow has an option you can check to apply migrations when the app is published. You can see this in the Microsoft docs for deploying an ASP.NET Core app to Azure:

Or you can do it programmatically in the Program.cs file, which will perform any needed migrations. There’s an example of this in my April 2019 MSDN Magazine article. If you need something more robust, you could instead generate scripts (perhaps idempotent scripts) with EF Core migrations, include them with your updates, and use a tool that can apply those scripts. If you use Redgate tools, perhaps their SQL Change Automation tool. Another option is a migrator tool like Flyway (flywaydb.org) or Liquibase. I’ve used Flyway with containers. Here’s a very recent conference talk I did where I used it: bit.ly/30AhgAr
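If you go the script route, here’s a minimal sketch of that flow with the EF Core CLI and sqlcmd (the server, database and user names are placeholders):

# generate one idempotent script that is safe to run against any schema version
dotnet ef migrations script --idempotent --output migrate.sql
# apply it to the Azure SQL database as part of your release process
sqlcmd -S yourserver.database.windows.net -d YourDatabase -U youruser -P $env:DB_PW -i migrate.sql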

Publishing a Single Image Docker Container with Secrets from VS2017 and Running it on Azure

(Written in advance, but published on May 2 when the relevant article is finally available.)

I’ve just finished writing a three part series on building a containerized ASP.NET Core API that uses EF Core for its data persistence. All of this was done in VS 2017 and I took advantage of the VS2017 Tools for Docker.

The article series will be in the April, May and June issues of MSDN Magazine.

Part 1: EF Core in a Docker Containerized App, Apr 2019

Part 2: EF Core in a Docker Containerized App, May 2019

But I didn’t have room to include the important task of deploying the app I’d written, even though I had worked hard to figure that out, too. Well, the deployment itself was pretty easy, but there were some new steps to learn in order to deal with storing a password for making a connection to my Azure SQL database. I’ll relay those steps in this blog post.

My API uses EF Core and targets an Azure SQL database. So whether I’m debugging locally with IIS or Kestrel, debugging locally inside of a Docker container, or running the app from a server or the cloud, I can always access that database.

That means I have a connection string to deal with but I want to keep the password a secret.

The structure of the solution is shown here. My ASP.NET Core API project is DataAPIDocker. And because I used the Docker tools to add container orchestration, I have another folder in the solution for docker-compose.

[screenshot: solution structure in Solution Explorer]

I go into detail in part 2 of the series (the article in the May 2019 issue), but the bottom line is that I use a Docker environment variable in my docker-compose.yml file.

version: '3.4'

services:
  dataapidocker:
    image: ${DOCKER_REGISTRY-}dataapidocker
    build:
      context: .
      dockerfile: DataAPIDocker/Dockerfile
    environment:
      - DB_PW

In the environment mapping, I have a sequence item where I define the DB_PW key without including a value. Because there’s no value there, Docker will look to the host’s environment variables. Since I’m only debugging, I create a temporary environment variable on my system with the value of the password, and when I debug or run the app from VS2017, the password variable will be found. That environment variable gets passed into the running container, and my app has code to read it and include that password in the connection string.
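For reference, here’s one way to set that temporary variable in PowerShell (the value is the same placeholder password used later in this post):

# current session only
$env:DB_PW = 'eiluj'
# or persist it for your user account so it survives restarts
[Environment]::SetEnvironmentVariable('DB_PW', 'eiluj', 'User')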

So it’s all self-contained, nice and neat.

Publishing the Image to an Azure Container Registry

Once you’ve got the app working, it’s time to publish it. But we’re using Docker, so you’re not publishing the app but the Docker image that can run the app for you. The Docker tools will help with this, too.

Right-click the project and choose Publish.

[screenshot: the Publish menu option]

Then you will want to create a publish profile. Part of that profile is choosing where to publish the image. Here you’ve got options. I have a Visual Studio subscription, so I can publish to an Azure Container Registry if I want, or to Docker Hub, or to some other registry.

[screenshot: publish profile target options]

My goal for this blog post is to get the image into the Azure Container Registry, so that’s my choice. You can have multiple container registries in your Azure account, and you can store any number of images in a single registry. Well, there may be technical or financial constraints, but the point is that you can have multiple images in a registry. I’m not here to advise on how to manage Azure finances, just how to do the task.

Here’s the overview page of a registry I let the publishing tool create for me. I’ve circled the Repositories link, which is where your images are accessible.

[screenshot: registry overview page]

You may have different versions of a particular image, so each “set” is a different repository. I have three repositories in mine, where I’ve been experimenting.

[screenshot: list of repositories]

The dataapi repository has only one image, which the publishing tool automatically tagged “latest” for me. I can have other versions under different tags.
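If you ever want to push a tagged version yourself rather than through the publishing tool, the plain Docker CLI equivalent looks roughly like this (the registry address and tag are placeholders):

# authenticate, tag the local image with the registry address, and push it
docker login myregistry.azurecr.io
docker tag dataapidocker myregistry.azurecr.io/dataapi:v2
docker push myregistry.azurecr.io/dataapi:v2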

Back in Visual Studio, after walking through the publishing tool’s questions for creating a new repository, the final step is to go ahead and publish, which will build the image and push it up to the target repository. Keep in mind that you’ll want VS2017 set to a RELEASE build, not a DEBUG build.

If it’s your first time pushing this image to the repository, the tooling will also push the ASP.NET Core SDK and runtime images that are listed in the app’s Dockerfile.

I was surprised to see this, wondering why Azure didn’t just grab them from Docker Hub and why I was uploading those big files directly. Naturally, I tweeted my confusion:

There’s more to the story but it is beyond the scope of my goals here.

A cool feature of this registry is that you can right-click an image and run it. That’s fine if you aren’t trying to orchestrate a number of images, which matches my case: this image does run independently.

Right-click on the image and choose Run instance. Azure will create a container and run it as an Azure Container Instance, although first you need to define specs for the instance.

[screenshot: container instance configuration]

It’s kind of magical, because you don’t have to create and manage a virtual machine to run the container on if it’s a simple application.

What About Environment Variables for the Container Instance?

The instance will run, but the Magazines controller that needs to read from the Azure SQL database will fail, because we haven’t provided the password that the container expects to receive through the host’s environment variable. So for my image, right-click and Run wasn’t quite enough.

This is where I had to do a lot of reading, research and experimentation until I got the solution working. (Keep in mind that if you were running this on a virtual machine of your own devising, you could just pass the variable in when you manually call docker run.)

There are two ways to provide an environment variable to a container instance.

One, through the portal, means that rather than right-clicking the image, you start by creating a new container instance in Azure and pointing it at the image. This path lets you assign up to three environment variables in the configuration:
[screenshot: portal environment variable settings]

Using an On-The-Fly Variable to Pass into the Container

I’m going to do a first pass, creating a variable on the fly to pass to EnvironmentVariable. Then I’ll show you how to use Azure Key Vault.

EnvironmentVariable expects a hashtable.

Create a new variable (I’ll call it envVars in homage to the resource where I learned this) and assign a single key-value pair:

$envVars = @{'DB_PW'='eiluj'}

The other tricky part is providing the credentials needed to access the image in the registry. We didn’t have to do that when using the portal to create the container, because we had already provided them there. But now I need to provide them.

You’ll need the user name and password from the repository:

Then you can use PowerShell to create a secure string from the password, and then use that secure string along with the user name to create a PowerShell credential object.
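Putting those pieces together, here’s a minimal sketch of the whole call. It assumes the Az.ContainerInstance module’s New-AzContainerGroup cmdlet, and the resource group, registry user name and password are placeholders:

# build the credential for pulling the image from the registry
$secpw = ConvertTo-SecureString 'registry-password' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ('registryUserName', $secpw)

# create and run the container instance, passing in the $envVars hashtable from above
New-AzContainerGroup -ResourceGroupName myResourceGroup -Name dataapi `
  -Image myregistry.azurecr.io/dataapi:latest `
  -RegistryCredential $cred -EnvironmentVariable $envVars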

TIP: If you have multiple subscriptions, be sure you’re pointing to the correct one where the target resource group is.

TIP: A cool thing you can do in Cloud Shell is type dir to list your subscriptions and then use cd to get into the correct one! Check out the PowerShell Cloud Shell quick start for details.

TIP: If, like me, you mess around with the database to see cause and effect, remember that in my sample code the database gets migrated on app startup. With the app in a container, that means when the container instance is run. So if you run the container and then delete the database, you won’t see the database again until the container is spun up again. Stopping and restarting has the same effect. Of course, this is just for testing things out, not production! Once again, something that had me stuck for over an hour until I had my aha moment.

Creating a KeyVault and Adding My Secret Password