Scott Hanselman's Blog

Description

Scott Hanselman on Programming, User Experience, The Zen of Computers and Life in General

Link: www.hanselman.com/blog/

Episodes

Cool WSL (Windows Subsystem for Linux) tips and tricks you (or I) didn't know were possible

Nov 13, 2019

Description:

It's no secret I dig WSL (Windows Subsystem for Linux) and now that WSL2 is available in Windows Insiders Slow it's a great time to really explore the options that are available. What I find so interesting about WSL, and how it relates to the Windows system around it, is how cleanly you can move data between the two worlds. This isn't an experience you can easily have with full virtual machines, and it speaks to the tight integration of Linux and Windows.

Look at all this cool stuff you can do when you mix your peanut butter and chocolate!

Run Windows Explorer from Linux and access your distro's files

When you're at the WSL/bash command line and you want to access your files visually, you can run "explorer.exe ." where . is the current directory, and you'll get a Windows Explorer window with your Linux files served to you by a local Plan 9 network file server.
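For example, from a WSL prompt (any directory works; ~/myproject is just a placeholder):

# Open the current Linux directory in Windows Explorer
cd ~/myproject
explorer.exe .

That pops open an Explorer window showing the Linux files in your current folder.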

Accessing WSL files from Explorer

Use Real Linux commands (not Cygwin) from Windows

I've blogged this before, but there are now aliases for PowerShell functions that allow you to use real Linux commands from within Windows.

You can call any Linux command directly from DOS/Windows/whatever by just putting it after WSL.exe, like this!

C:\temp> wsl ls -la | findstr "foo"
-rwxrwxrwx 1 root root 14 Sep 27 14:26 foo.bat

C:\temp> dir | wsl grep foo
09/27/2016 02:26 PM 14 foo.bat

C:\temp> wsl ls -la > out.txt

C:\temp> wsl ls -la /proc/cpuinfo
-r--r--r-- 1 root root 0 Sep 28 11:28 /proc/cpuinfo

C:\temp> wsl ls -la "/mnt/c/Program Files"
...contents of C:\Program Files...

Use Real Windows commands (not Wine) from Linux

Windows executables are callable/runnable from WSL/Linux because the Windows path is in the $PATH when you're inside WSL. All you have to do is call it with .exe at the end, explicitly. That's how "explorer.exe ." works above. You can also notepad.exe, or whatever.exe!
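For example, from within WSL:

# Filter a Windows command's output through a Linux tool
ipconfig.exe | grep IPv4

# Pipe a Linux file into the Windows clipboard utility
cat ~/.bashrc | clip.exe

Standard input and output flow cleanly across the boundary in both directions.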

Run Visual Studio Code and access (and build!) your Linux apps natively on Windows

You can run "code ." when you're in a folder within WSL and you'll get prompted to install the VS Remote extensions. That effectively splits Visual Studio Code in half and runs the headless VS Code Server inside Linux with the VS Code client in the Windows world.

You'll also need to install Visual Studio Code and the Remote - WSL extension. Optionally, check out the beta Windows Terminal for the best possible terminal experience on Windows.
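The flow looks like this from a WSL prompt (~/myapp being whatever project folder you like):

cd ~/myapp
code .

The first run installs the headless VS Code Server into your distro; after that, your editing happens on the Windows side while terminals, builds, and debugging run on the Linux side.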

Here's a great series from the Windows Command Line blog:

You can find the full series here:

Part 1: Take your Linux development experience in Windows to the next level with WSL and Visual Studio Code Remote
Part 2: An In Depth Tutorial on Linux Development on Windows with WSL and Visual Studio Code
Part 3: Tips and Tricks for Linux development with WSL and Visual Studio Code

Here are the benefits of WSL 2

Virtual machines are resource intensive and create a very disconnected experience. The original WSL was very connected, but had fairly poor performance compared to a VM. WSL 2 brings a hybrid approach with a lightweight VM, a completely connected experience, and high performance.

Again, now available on Windows 10 Insiders Slow.

Run multiple Linuxes in seconds, side by side

Here I'm running "wsl --list --all" and I have three Linuxes already on my system.

C:\Users\scott>wsl --list --all
Windows Subsystem for Linux Distributions:
Ubuntu-18.04 (Default)
Ubuntu-16.04
Pengwin

I can easily run them, and also assign a profile to each so they appear in my Windows Terminal dropdown.
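For example, to launch a specific distro rather than the default:

C:\Users\scott>wsl -d Pengwin

The -d (or --distribution) flag picks which installed Linux to run, and wsl --setdefault <DistroName> changes the default.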

Run an X Windows Server under Windows using Pengwin

Pengwin is a custom WSL-specific Linux distro that's worth the money. You can get it at the Windows Store. Combine Pengwin with an X Server like X410 and you've got a very cool integrated system.

Easily move WSL Distros between Windows systems

Ana Betts points out this great technique where you can easily move your perfect WSL2 distro from one machine to n machines.

wsl --export MyDistro ./distro.tar

# put it somewhere, dropbox, onedrive, elsewhere

mkdir ~/AppData/Local/MyDistro
wsl --import MyDistro ~/AppData/Local/MyDistro ./distro.tar --version 2

That's it. Get your ideal Linux setup sync'ed on all your systems.

Use the Windows Git Credential Provider within WSL

All of these things culminate in this lovely blog post by Ana Betts where she integrates the Windows Git Credential Provider in WSL by making /usr/bin/git-credential-manager into a shell script that calls the Windows git creds manager. Genius. This would only be possible given this clean and tight integration.
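A minimal sketch of the idea, assuming a standard Git for Windows install path (treat the path as illustrative and see Ana's post for the exact, current setup):

#!/bin/sh
# /usr/bin/git-credential-manager (sketch)
# Forward Git credential requests from WSL to the Windows credential manager
exec "/mnt/c/Program Files/Git/mingw64/libexec/git-core/git-credential-manager.exe" "$@"

Git inside WSL invokes this like any other credential helper, while the actual secure storage happens on the Windows side.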

Now, go out there, install WSL, Windows Terminal, and make yourself a shiny Linux Environment on Windows.

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!


© 2019 Scott Hanselman. All rights reserved.

New YouTube Series: Computer things they didn't teach you in school

Nov 8, 2019

Description:

OK, fine maybe they DID teach you this in class. But, you'd be surprised how many people think they know something but don't know the background or the etymology of a term. I find these things fascinating. In a world of bootcamp graduates, community college attendees (myself included!), and self-taught learners, I think it's fun to explore topics like the ones I plan to cover in my new YouTube Series "Computer things they didn't teach you."

BOOK RECOMMENDATION: I think of this series as being in the same vein as the wonderful "Imposter's Handbook" series from Rob Conery (I was also involved, somewhat). In Rob's excellent words: "Learn core CS concepts that are part of every CS degree by reading a book meant for humans. You already know how to code and build things, but when it comes to conversations about Big-O notation, database normalization and binary tree traversal you grow silent. That used to happen to me and I decided to change it because I hated being left out. I studied for 3 years and wrote everything down and the result is this book."

Of course it'll take exactly 2 comments before someone comments with "I don't know what crappy school you're going to but we learned this stuff when they handed us our schedule." Fine, maybe this series isn't for you.

In fact I'm doing this series and putting it out there for me. If it helps someone, all the better!

In this first video I cover the concept of Carriage Returns and Line Feeds. But do you know WHY it's called a Carriage Return? What's a carriage? Where did it go? Where is it returning from? Who is feeding it lines?

What would you suggest I do for the next video in the series? I'm thinking Unicode, UTF-8, BOMs, and character encoding.

Sponsor: Octopus Deploy wanted me to let you know that Octopus Server is now free for small teams, without time limits. Give your team a single place to release, deploy and operate your software.



Announcing .NET Jupyter Notebooks

Nov 6, 2019

Description:

Graphs in Jupyter Notebooks

Jupyter Notebooks has been a significant player in the interactive development space for many years, and Notebooks have played a vital role in the continued popularity of languages like Python, R, Julia, and Scala. Interactive experiences like this give users a lightweight tool (I like to say "interactive paper") for learning, iterative development, and data science and data manipulation.

The F# community has enjoyed F# in Jupyter Notebooks for years with the pioneering functional work of Rick Minerich, Colin Gravill and many other contributors!

As Try .NET has grown to support more interactive C# and F# experiences across the web with runnable code snippets, and an interactive documentation generator for .NET Core with the dotnet try global tool, we're happy to take that same codebase to the next level, by announcing C# and F# in Jupyter notebooks.

.NET in Jupyter Notebooks

Even better you can start playing with it today, locally or in the cloud!

.NET in Anaconda locally

You'll need:

.NET Core 3.0 SDK and 2.1, as currently the dotnet try global tool targets 2.1.
Jupyter: JupyterLab can be installed using Anaconda, or via conda or pip. For more details on how to do this please check out the official Jupyter installation guide.

Install the .NET Kernel:

Open Anaconda Prompt (installed with Anaconda)
Install the dotnet try global tool:

dotnet tool install --global dotnet-try

Please note: If you have the dotnet try global tool already installed, you will need to uninstall the older version and get the latest before grabbing the Jupyter kernel-enabled version of the dotnet try global tool.

Check to see if Jupyter is installed

jupyter kernelspec list

Install the .NET kernel!

dotnet try jupyter install


Test installation

jupyter kernelspec list

You should see the .net-csharp and .net-fsharp listed.


To start a new notebook, you can either type jupyter lab in the Anaconda Prompt or launch a notebook using the Anaconda Navigator.

Once Jupyter Lab has launched in your preferred browser, you have the option to create a C# or an F# notebook.

.NET C# and F# in Jupyter Notebooks

Now you can write .NET code and prose side by side, and just hit Shift-Enter to run each cell.

Example C# code in Jupyter Notebooks

For more information on our APIs via C# and F#, please check out our documentation on the Binder site or in the dotnet/try repo in the NotebookExamples folder.

C# and F# samples and docs

Features

To explore some of the features that .NET notebooks ship with, I put together a dashboard for the Nightscout GitHub repo.

HTML output: By default .NET notebooks ship with several helper methods for writing HTML, from basic helpers that let users write out a string as HTML or output JavaScript, to more complex HTML with PocketView. Below I'm using the display() helper method.

Nightscout

Importing packages: You can load NuGet packages using the following syntax. If you've used Roslyn-powered scripting, this #r syntax for a reference will be familiar.

#r "nuget:<package name>,<package version>"

For Example

#r "nuget:Octokit, 0.32.0" #r "nuget:NodaTime, 2.4.6" using Octokit; using NodaTime; using NodaTime.Extensions; using XPlot.Plotly;

Do note that when you run a cell like this with a #r reference, you'll want to wait while that NuGet package is installed, as seen below with the ... detailed output.

installing nuget packages in Jupyter Notebooks

Object formatters : By default, the .NET notebook experience enables users to display useful information about an object in table format.

The code snippet below will display all opened issues in the nightscout/cgm-remote-monitor repo.

display(openSoFar.Select(i => new {i.CreatedAt, i.Title, State = i.State.StringValue, i.Number}).OrderByDescending(d => d.CreatedAt));

With the object formatter feature, the information will be displayed in an easy-to-read table format.
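For context, openSoFar in that snippet would come from an Octokit query in an earlier cell. A sketch of what that might look like (the variable names here are mine, not from the original dashboard):

// Query the open issues in nightscout/cgm-remote-monitor with Octokit
var gitHubClient = new GitHubClient(new ProductHeaderValue("notebook"));
var request = new RepositoryIssueRequest { State = ItemStateFilter.Open };
var openSoFar = await gitHubClient.Issue.GetAllForRepository("nightscout", "cgm-remote-monitor", request);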

Querying the Nightscout repository

Plotting

Visualization is a powerful storytelling tool and a key feature of the Jupyter notebook experience. As soon as you import the wonderful XPlot.Plotly F# visualization package into your notebooks (using XPlot.Plotly;) you can begin creating rich data visualizations in .NET.

The graphs are interactive too! Hover over the different data points to see the values.

Issue report over the last year

Learn, Create and Share

To learn, create and share .NET notebooks please check out the following resources:

Learn: To learn online, check out the dotnet/try binder image for a zero-install experience.
Create: To get started on your machine, check out the dotnet/try repo and select the highlighted option.
Share: If you want to share notebooks you have made using the .NET Jupyter kernel, the easiest way is to generate a Binder image that anyone can run on the web. For more information on how to do this please check out the .NET Jupyter documentation.

Check out the online .NET Jupyter Notebook I created to explore the NightScout GitHub project using C# and the Octokit APIs.

The source is here https://github.com/shanselman/NightscoutDashboard  but you can run the notebook live just by going to mybinder https://mybinder.org/v2/gh/shanselman/NightScoutDashboard/master?urlpath=lab 

We hope you enjoy this new .NET Interactive experience and that you're pleasantly surprised by this evolution of the .NET Try interactive kernel.

Sponsor: Octopus Deploy wanted me to let you know that Octopus Server is now free for small teams, without time limits. Give your team a single place to release, deploy and operate your software.



Trying out Visual Studio Online - Using the cloud to manage and compile your code is amazing

Nov 1, 2019

Description:

Visual Studio Online was announced in preview, so I wanted to try it out. I already dig the Visual Studio "Remote" technology (it's almost impossibly amazing) so moving the local container or WSL to the Cloud makes a lot of sense. Check out their blog post here.

There's three quick starts.

VS Online in the browser
VS Online in VS Code
VS Online in Visual Studio (in private preview so I don't have that, but you can sign up!)

Sweet. I'll start with the Browser version, which is the easiest one, I'm guessing. I head to the login page. I'm using the new Edge browser.

I see this page that says I have no "environments" set up.

VS Online for the Browser

I'll make a plan. I changed mine to fall asleep (suspend) in 5 minutes, but the default is 30. Pricing is here.

Creating a 4 Core, 8 Gig of RAM environment

Now it's making my environment.

Sweet. VS Online

I clicked on it. Then I opened a new Terminal, ran "dotnet new web," and I'm basically in a thin VS Code, except in the browser. I've got IntelliCode, and I can install the C# extension.

I'm in VS Code but it's not, it's VS Online in a browser

Since I'm running a .NET app I had to run these commands in a new terminal to generate and trust certs for SSL.

dotnet dev-certs https
dotnet dev-certs https --trust

Then I hit the Debug menu to build and compile my app IN THE CLOUD and I get "connecting to the forwarded port" as its "localhost" is in the cloud.

Connecting to the forwarded port

Now I've hit a breakpoint! That's bonkers.

Hello World IN THE CLOUD

Now to try it in VS Code rather than in the browser. I installed the Visual Studio Online extension and clicked on the little Remote Environment thing on the left side after running VS Code.

This is amazing. Look on the left side there. You can see my Raspberry Pi as an SSH target. You can see my new VS Online Plan, you can see my Docker Containers because I'm running Docker for Windows, and you can see my WSL Targets as I've got multiple local Linuxes.

Since I'm running currently in VS Online (see the HanselmanTestPlan1 in the lower corner in green) I can just hit F5 and it compiles and runs.

It's a client-server app. VS Code is doing some of the work, but the heavy lifting is in the cloud. Same as if I split the work between Windows and WSL locally, in this case VS Code is talking to that 8 gig Linux Environment I made earlier.

VS Code attached to VS Online

When I hit localhost:500x it's forwarded up to the cloud:

Port forwarding

Amazing. Now I can do dev on a little cheapo laptop but have a major server do the work in the cloud. I can then head over to https://online.visualstudio.com/environments and delete it, or I can let it suspend.

Suspending a VS Online Environment

I'm going to continue to explore this and see if I can open my blog and podcast sites in this world. Then I can open and develop on them from anywhere. Or soon from my iPad Pro!

Go give Visual Studio Online a try. It's Preview but it's lovely.

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



Adafruit's Circuit Playground Express simulated with Visual Studio Code's Device Simulator Express

Oct 29, 2019

Description:

I'm an unabashed Adafruit fan and I often talk about them because I'm always making cool stuff with their hardware and excellent tutorials. You should check out the YouTube video we made when I visited Adafruit Industries in New York with my nephew. They're just a lovely company.

While you're at it, go sign up for the Adabox Subscription and get amazing hardware projects mailed to you in a mystery box regularly!

One of the devices I keep coming back to is the extremely versatile Circuit Playground Express. It's under $25 and does a LOT.

It's got 10 NeoPixels, a motion sensor, a temp sensor, a light sensor, a sound sensor, buttons, a slide switch, and a speaker. It can even receive and transmit IR for any remote control. It's great for younger kids because you can use alligator clips on the input/output pins, which means no soldering for easy projects.

You can also mount the Circuit Playground Express onto a Crickit, which is the "Creative Robotics & Interactive Construction Kit. It's an add-on that lets you #MakeRobotFriend using CircuitPython, MakeCode, or Arduino." The Crickit makes it easy to control motors and adds additional power options to drive them! Great for creating small bots or battlebots as my kids do.

MakeCode

The most significant - and technically impressive, in my opinion - aspect of the Circuit Playground Express is that it doesn't dictate the tech you use! There's 3 great ways to start.

Start your journey with Microsoft MakeCode block-based or Javascript programming. Then, you can use the same board to try CircuitPython, with the Python interpreter running right on the Express. As you progress, you can advance to using Arduino IDE, which has full support of all the hardware down to the low level, so you can make powerful projects.
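For a taste of that middle CircuitPython step, here's a minimal sketch using the adafruit_circuitplayground library that ships with CircuitPython on the Express:

# code.py - light all ten NeoPixels red while button A is held
from adafruit_circuitplayground.express import cpx

while True:
    if cpx.button_a:
        cpx.pixels.fill((255, 0, 0))
    else:
        cpx.pixels.fill((0, 0, 0))

Save it as code.py on the CIRCUITPY drive and it runs automatically on every reset.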

Start by exploring MakeCode for Circuit Playground Express by just visiting https://makecode.adafruit.com/ and running in the browser!

Device Simulator Express for Adafruit Circuit Playground Express

Next, check out the Device Simulator Express extension for Visual Studio Code! This was made over the summer by Christella Cidolit, Fatou Mounezo, Jonathan Wang, Lea Akkari, Luke Slevinsky, Michelle Yao, and Rachel Phinnemore, the interns at the Microsoft Garage Vancouver!

Christella Cidolit, Fatou Mounezo, Jonathan Wang, Lea Akkari, Luke Slevinsky, Michelle Yao, and Rachel Phinnemore

This great extension lets YOU, Dear Reader, code for a Circuit Playground Express without the physical hardware! And when you've got one in your hands, it makes development even easier. That means:

Device simulation for those without hardware
Code deployment to devices
Auto-completion and error flagging
Debugging with the simulator

You'll need these things:

Visual Studio Code
Node
Python 3.7.4: Make sure you've added python and pip to your PATH in your environment variables.
Python VS Code extension: This will be installed automatically from the marketplace when you install...
Device Simulator Express

Fire up Visual Studio Code with the Device Simulator Express extension installed and then select "Device Simulator Express: New File" in the command palette (CTRL+SHIFT+P to open the palette).

Device Simulator Express

There's a lot of potential here! You've got the simulated device on the right and the Python code on the left. There's step-by-step debugging in this virtual device. There are a few cool things I can think of to make this extension easier to set up and get started with that would make it a killer experience for an intermediate developer who is graduating from MakeCode to a code editor like VS Code.

It's early days and the interns are back in school but I'm hoping to see this project move forward and get improved. I'll blog more details as I have them!

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



Be a Technology Tourist

Oct 24, 2019

Description:

Passport Pages by daimoneklund used under CC

I was talking to Tara and we were marveling that in 1997 only 15% of Americans had Passports. However, even now less than half do. Consider where the US is physically located. It's isolated in a hemisphere with just Canada and Mexico as neighbors. In parts of Europe a 30 minute drive will find three or four languages, while I can't get to Chipotle in 30 minutes where I live.

A friend who got a passport and went overseas at age 40 came back and told me "it was mind-blowing. There's billions of people who will never live here...and don't want to...and that's OK. It was so useful for me to see other people's worlds and learn that."

I could tease my friend for their awakening. I could say a lot of things. But for a moment consider the context of someone geographically isolated learning - being reminded - that someone can and will live their whole life and never need or want to see your world.

Travel of any kind opens eyes.

Now apply this to technology. I'm a Microsoft technologist today but I've done Java and Mainframes at Nike, Pascal and Linux at Intel, and C and C++ in embedded systems as a consultant. It's fortunate that my technology upbringing has been wide-reaching and steeped in diverse and hybrid systems, but that doesn't negate someone else's bubble. But if I'm going to speak on tech then I need to have a wide perspective. I need to visit other (tech) cultures and see how they live.

You may work for Microsoft, Google, or Lil' Debbie Snack Cakes but just like you should consider getting a passport, you should absolutely visit other (tech) cultures. Travel will make you more well-rounded. Embrace the ever-changing wonders of the world and of technology. Go to their meet-ups, visit their virtual conferences, follow people outside your space, try to build their open source software, learn a foreign (programming) language. They may not want or need to visit yours, but you'll be a better and more well-rounded person when you return home if you choose to be a technology tourist.

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!

Create exceptional interactive documentation with Try .NET - The Polly NuGet library did!

Oct 23, 2019

Description:

I've blogged at length about the great open source project called "Polly"

NuGet Package of the Week: Polly wanna fluently express transient exception handling policies in .NET?
Adding Cross-Cutting Memory Caching to an HttpClientFactory in ASP.NET Core with Polly
The Programmer's Hindsight - Caching with HttpClientFactory and Polly Part 2
Adding Resilience and Transient Fault handling to your .NET Core HttpClient with Polly

and I've blogged about "Try .NET" which is a wonderful .NET Core global tool that lets you make interactive in-browser documentation and create workshops that can be run both online and locally (totally offline!)

Introducing the Try .NET Global Tool - interactive in-browser documentation and workshop creator

If you've got .NET Core installed, you can try it in minutes! Just do this:

dotnet tool install --global dotnet-try
dotnet try demo

Even better, you can just clone a Try .NET enabled repository with markdown files that have a few magic herbs and spices, then run "dotnet try" in that cloned folder.

What does this have to do with Polly, the lovely .NET resilience and transient fault handling library that YOU should be using every day? Well, my friends, check out this lovely bit of work by Bryan J Hogan! He's created some interactive workshop-style demos using Try .NET!

How easy is it to check out? Let's give it a try. I've run dotnet tool install --global dotnet-try already. You may need to run update if you've installed it a while back.

git clone https://github.com/bryanjhogan/trydotnet-polly.git
dotnet try

That's it. What does it do? It'll launch your browser to a local website powered by Try .NET that looks like this!

Interactive local documentation with Try.NET

Sweet! Ah, but Dear Reader, scroll down! Let me try out one of the examples. You'll see a Monaco-based local text editor (the same editor that powers VS Code) and you're able to run - and modify - local code samples IN THE BROWSER!

Checking out Polly exception retries

Here's the code as text to make it more accessible.

RetryPolicy retryPolicy = Policy.Handle<Exception>()
    .Retry(3, (exception, retryCount) =>
    {
        Console.WriteLine($"{exception.GetType()} thrown, retrying {retryCount}.");
    });

int result = retryPolicy.Execute(() => errorProneCode.QueryTheDatabase());

Console.WriteLine($"Received a response of {result}.");

And the output appears below the sample, again, in a console within the browser:

System.Exception thrown, retrying 1.
System.InsufficientMemoryException thrown, retrying 2.
Received a response of 0.

You can see that Polly gives you a RetryPolicy that can envelop your code and handle things like transient errors, occasional flaky server responses, or whatever else you want it to do. It can be configured as a policy outside your code, or coded inline fluently like this.

NOTE the URL! See that it's a .MD or Markdown file? Try .NET has a special handler that reads in a regular markdown file and executes it. The result is an HTML representation of your Markdown *and* your sample, now executable!

What's the page/image above look like as Markdown? Like this:

# Polly Retries Part 2

### Retrying When an Exception Occurs
The Polly NuGet package has been added and we are going to use the Retry Policy when querying the database.
The policy states that if an exception occurs, it will retry up to three times.

Note how you execute the unreliable code inside the policy. `retryPolicy.Execute(() => errorProneCode.QueryTheDatabase());`


``` cs --region retryIfException --source-file .\src\Program.cs --project .\src\PollyDemo.csproj
```

#### Next: [Retrying Based on a Result &raquo;](./retryIfIncorrectStatus.md) Previous: [Before You Add Polly &laquo;](../lettingItFail.md)

Note the special ``` region. The code isn't inline, but rather it lives in a named region in Program.cs in a project in this same repository, neatly under the /src folder. The region is presented in the sample, but as samples are usually more complex and require additional libraries and such, the region name and project context is passed into your app as Try.NET executes it.

Go check out some Try .NET enabled sample repositories. Just make sure you have the Try .NET global tool installed, then go clone and "dotnet try" any of these!

https://github.com/bryanjhogan/trydotnet-polly https://github.com/dotnet/try-samples (there's like 5 sample areas in here, do "dotnet try" in the different directories!)

If you're doing classwork, teaching workshops, making assignments for homework, or even working in a low-bandwidth or remote environment this is great as you can put the repositories on a USB key and once they've run once they'll run offline!

Now, be inspired by (and star on GitHub) Bryan's great work and go make your own interactive .NET documentation!

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



How to make a pretty prompt in Windows Terminal with Powerline, Nerd Fonts, Cascadia Code, WSL, and oh-my-posh

Oct 17, 2019

Description:

I've blogged about Patching the new Cascadia Code to include Powerline Glyphs and other Nerd Fonts for the Windows Terminal but folks have asked very specifically, how do I make my prompt look like that?

Step One - Get the Terminal

Get Windows Terminal free from the Store. You can also get it from GitHub's releases but I recommend the store because it'll stay up to date automatically.

Note that if you were an early adopter of the Windows Terminal and you've updated beyond release 0.5, I'd recommend you delete or zero-out your profiles.json and let the Terminal detect and automatically recreate your profiles.json.

Lovely powerline in Windows Terminal

Step Two for PowerShell - Posh-Git and Oh-My-Posh

Per these directions, install Posh-Git and Oh-My-Posh. This also assumes you've installed Git for Windows.

Install-Module posh-git -Scope CurrentUser
Install-Module oh-my-posh -Scope CurrentUser

Run these commands from PowerShell or PowerShell Core. I recommend PowerShell 6.2.3 or above. You can use PowerShell on Linux too, so be aware. When you run Install-Module for the first time you'll get a warning that you're downloading and installing stuff from the internet, so follow the prompts appropriately.

Also get PSReadline if you're on PowerShell Core:

Install-Module -Name PSReadLine -AllowPrerelease -Scope CurrentUser -Force -SkipPublisherCheck

Then run "notepad $PROFILE" and add these lines to the end:

Import-Module posh-git
Import-Module oh-my-posh
Set-Theme Paradox

Now that word Paradox there is optional. It's actually the name of a theme and you can (and should!) pick the theme that makes you happy and use that theme's name here. I like Agnoster, Paradox, or Fish, myself. Read more over here. https://github.com/JanDeDobbeleer/oh-my-posh

Step Two for Ubuntu/WSL

There's a number of choices for Powerline or Powerline-like prompts for Ubuntu. I like Powerline-Go for its easy defaults.

I just installed Go, then installed powerline-go with go get.

sudo apt install golang-go
go get -u github.com/justjanne/powerline-go

Add this to your ~/.bashrc. You may already have a GOPATH so be aware.

GOPATH=$HOME/go
function _update_ps1() {
    PS1="$($GOPATH/bin/powerline-go -error $?)"
}
if [ "$TERM" != "linux" ] && [ -f "$GOPATH/bin/powerline-go" ]; then
    PROMPT_COMMAND="_update_ps1; $PROMPT_COMMAND"
fi

GOTCHA: If you are using WSL2, it'll be lightning fast with git prompts if your source code is in your Ubuntu/Linux mount, somewhere under ~/. However, if your source is under /mnt/c or /mnt anywhere, the git calls being made to populate the prompt are super slow. Be warned. Do your Linux source code/git work in the Linux filesystem for speed until WSL2 gets faster file system access to /mnt.

At this point your Ubuntu/WSL prompt will look awesome as well!

Powerline in WSL

Fonts look weird? Uh oh!

Step Three - Get a better font

If you do all this and you see squares and goofy symbols, it's likely that the font you're using doesn't have the advanced Powerline glyphs. Those glyphs are the ones that make this prompt look so cool!

Weird fonts

At the time of this writing there is active talk of getting Powerline and other Nerd Fonts into Cascadia Code, the new font that ships with Windows Terminal. In the short term, you can get a forked version of Cascadia Code called Delugia Code and download that.

Then from within Windows Terminal, hit "Ctrl+," to edit your profile.json and change the "fontFace" of your profile or profiles to this:

"fontFace": "DelugiaCode NF",

And that's it!

Remember also you can get lots of Nerd Fonts at https://www.nerdfonts.com/, just make sure you get one (or generate one!) that includes the PowerLine Glyphs.

Have fun!

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!



Assert your assumptions - .NET Core and subtle locale issues with WSL's Ubuntu

Oct 15, 2019

Description:

I thought this was an interesting and subtle bug behavior that was not only hard to track down but hard to pin down. I wasn't sure 'whose fault it was.'

Here's the story. Feel free to follow along and see what you get.

I was running on Ubuntu 18.04 under WSL.

I made a console app using .NET Core 3.0. You can install .NET Core here http://dot.net/get-core3

I did this:

dotnet new console
dotnet add package Humanizer --version 2.6.2

Then made Program.cs look like this. Humanizer is a great .NET Standard library that you'll learn about and think "why didn't .NET always have this!?"

using System;
using Humanizer;

namespace dotnetlocaletest
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine(3501.ToWords());
        }
    }
}

You can see that I want the app to print out the number 3501 as words. Presumably in English, as that's my primary language, but you'll note I haven't indicated that here. Let's run it.


Note that this app works great and as expected on Windows.

scott@IRONHEART:~/dotnetlocaletest$ dotnet run
3501

Huh. It didn't even try. That's weird.

My Windows machine is en-us (English in the USA) but what's my Ubuntu machine?

scott@IRONHEART:~/dotnetlocaletest$ locale
LANG=C.UTF-8
LANGUAGE=

Looks like it's nothing. It's "C.UTF-8" and it's nothing. C in this context means the POSIX default locale. It's the most basic. C.UTF-8 is definitely NOT the same as en_US.utf8. It's a locale of sorts, but it's not a place.

What if I tell .NET explicitly where I am?

static void Main(string[] args)
{
    Thread.CurrentThread.CurrentUICulture = new CultureInfo("en-US");
    Console.WriteLine(3501.ToWords());
}
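(For this version to compile you'll also need using System.Globalization; and using System.Threading; at the top of the file.)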

And running it.

scott@IRONHEART:~/dotnetlocaletest$ dotnet run
three thousand five hundred and one

OK, so things work well if the app declares "hey I'm en-US!" and Humanizer works well.

What's wrong? Seems like Ubuntu's "C.UTF-8" isn't "invariant" enough to cause Humanizer to fall back to an English default?

Seems like other people have seen unusual or subtle issues with Ubuntu installs that are using C.UTF-8 versus a more specific locale like en-US.UTF8.

I could fix this in a few ways. I could set the locale specifically in Ubuntu:

locale-gen en_US.UTF-8
update-locale LANG=en_US.UTF-8

Fortunately Humanizer 2.7.2 and above has fixed this issue and falls back correctly. Whose "bug" was it? Tough one but in this case, Humanizer had some flawed fallback logic. I updated to 2.7.2 and now C.UTF-8 falls back to a neutral English.

That said, I think it could be argued that WSL/Canonical/Ubuntu should detect my local language and/or set the locale to it on installation.

The lesson here is that your applications - especially ones that are expected to work in multiple locales in multiple languages - take "input" from a lot of different places. Phrased differently, not all input comes from the user.

System locale and language, time, timezone, dates, are all input as ambient context to your application. Make sure you assert your assumptions about what "default" is. In this case, my little app worked great on en-US but not on "C.UTF-8." I was able to explore the behavior and learn that there was both a local workaround (I could detect and set a default locale if needed) and there was a library fix available as well.

Assert your assumptions!

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!



Visual Studio for Nintendo Switch? - FUZE4 Nintendo Switch is an amazing coding app

Oct 10, 2019

Description:

I love my Nintendo Switch. It's a brilliant console that fits into my lifestyle. I use it on planes, the kids play it on long car rides, and it's great both portable and docked.

NOTE: Check out my blog post on The perfect Nintendo Switch travel set up and recommended accessories

But I never would have predicted "Visual Studio Code for Nintendo Switch" - now that's in a massive pair of air quotes because FUZE4 Nintendo Switch has no relationship to Microsoft or Visual Studio, but it's a really competent coding application that works with USB keyboards! It's an amazing feeling to literally plug in a keyboard and start writing games for Switch...ON A SWITCH! Seriously, don't sleep on this app if you or your kids want to make Switch games.

Coding on a Nintendo Switch with FUZE4 Switch

Per the Fuze website:

This is not a complex environment like C++, JAVA or Python. It is positioned as a stepping stone from the likes of Scratch, to more complex real-world ones. In fact everything taught using FUZE is totally applicable in the real-world, it is just that it is presented in a far more accessible, engaging and fun way.

If you're in the UK, there are holiday workshops and school events all over. If you're elsewhere, FUZE also has started the FuzeArena site as a forum to support you in your coding journey on Switch. There is also a new YouTube channel with Tutorials on FUZE Basic starting with Hello World!

FUZE4 includes a very nice and complete code editor with Syntax Highlighting and Code bookmarks. You can plug in any USB keyboard - I used a Logitech USB keyboard with the USB wireless Dongle! - and you or the children in your life can code away. You just RUN the program with the "start" or + button on the Nintendo Switch.

It can't be overstated how many assets, bitmaps, sample apps, and 3D models FUZE4 comes with. You may explore initially and mistakenly think it's a shallow app. IT IS NOT. There is a LOT here. You don't need to make all the assets yourself, and if you're interested in game makers like PICO-8 then the idea of making a Switch game with minimal effort will be super attractive to you.

Writing code with FUZE4
3D Demos with FUZE4
Lots of Programs come with FUZE4 Nintendo Switch
Software Keyboard inside FUZE4
Get started with code and FUZE4

FUZE and FUZE Basic also exists on the Raspberry Pi and there are boot images available to check out. It also supports the Raspberry Pi Sense Hat add-on board.

They are also working on FUZE4 Windows as well, so stay tuned for that! If you register for their forums you can also check out their PDF workbooks and language tutorials. However, if you're like me, you'll have more fun reading the code for the included samples and games and figuring things out from there.

FUZE4 on the Nintendo Switch is hugely impressive and frankly, I'm surprised more people aren't talking about it. Don't sleep on FUZE4, my kids have been enjoying it. I do recommend you use an external USB keyboard to have the best coding experience. You can buy FUZE4 as a digital download on the Nintendo Shop.

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger...With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



Video Interview: Between Two Nerds - Building Careers with Empathy

Oct 8, 2019

Description:

I was fortunate to be a guest on Steve Carroll's talk show "Careers Behind the Code," but Amanda Silver calls it "Between Two Nerds," so I'm going with that superior title. We also snuck a fern into the shot so that's cool.

In the interview I talk about Empathy and why I think it's an essential skill for developers, designers, and program managers to develop - deeply.

Steve also asked me about how I got my start in tech, and I tell the story about the day I showed up at home and the family van was gone.

I wasn't sure how this interview would turn out, and I don't usually like talking about myself (I'm much more comfortable talking to YOU on my podcast) but folks seem to think the show turned out well.

I also speak a little about "Living your Life by Default" and the importance of mindfulness, which is a partner to Empathy. Please do check out the interview and let me know what you think!

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger...With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



Now is the time to make a fresh new Windows Terminal profiles.json

Oct 3, 2019

Description:

I've been talking about it for months, but in case you haven't heard, there's a new Windows Terminal in town. You can download it and start using it now from the Windows Store. It's free and open source.

At the time of this writing, Windows Terminal is around version 0.5. It's not officially released as a 1.0 so things are changing all the time.

Here's your todo - Have you installed the Windows Terminal before? Have you customized your profiles.json file? If so, I want you to DELETE your profiles.json!

Your profiles.json is somewhere like C:\Users\USERNAME\AppData\Local\Packages\Microsoft.WindowsTerminal_8wekyb3d8bbwe\LocalState but you can get to it from the drop down in the Windows Terminal like this:

Windows Terminal dropdown

When you hit Settings, Windows Terminal will launch whatever app is registered to handle JSON files. In my case, I'm using Visual Studio Code.

I have done a lot of customization on my profiles.json, so before I delete or "zero out" my profiles.json I will save a copy somewhere. You should too!

You can just "ctrl-a" and delete all of your profiles.json when it's open and Windows Terminal 0.5 or greater will recreate it from scratch by detecting the shells you have. Remember, a Console or Terminal isn't a Shell!

Note the new profiles.json also includes another tip! You can hold ALT and click the Settings button to see the default settings! This new profiles.json is simpler to read and understand because there's an inherited default.


// To view the default settings, hold "alt" while clicking on the "Settings" button.
// For documentation on these settings, see: https://aka.ms/terminal-documentation

{
"$schema": "https://aka.ms/terminal-profiles-schema",

"defaultProfile": "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",

"profiles":
[
{
// Make changes here to the powershell.exe profile
"guid": "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",
"name": "Windows PowerShell",
"commandline": "powershell.exe",
"hidden": false
},
{
// Make changes here to the cmd.exe profile
"guid": "{0caa0dad-35be-5f56-a8ff-afceeeaa6101}",
"name": "cmd",
"commandline": "cmd.exe",
"hidden": false
},
{
"guid": "{574e775e-4f2a-5b96-ac1e-a2962a402336}",
"hidden": false,
"name": "PowerShell Core",
"source": "Windows.Terminal.PowershellCore"
},
...

You'll notice there's a new $schema that gives you dropdown Intellisense so you can autocomplete properties and their values now! Check out the Windows Terminal Documentation here https://aka.ms/terminal-documentation and the complete list of things you can do in your profiles.json is here.

I've made these changes to my Profile.json.

Split panes

I've added "requestedTheme" and changed it to dark, to get a black titleBar with tabs.

requestedTheme = dark

I also wanted to test the new (not even close to done) splitscreen features, that give you a simplistic tmux style of window panes, without any other software.

// Add any keybinding overrides to this array.
// To unbind a default keybinding, set the command to "unbound"
"keybindings": [
{ "command": "closeWindow", "keys": ["alt+f4"] },
{ "command": "splitHorizontal", "keys": ["ctrl+-"]},
{ "command": "splitVertical", "keys": ["ctrl+\\"]}
]

Then I added an Ubuntu specific color scheme, named UbuntuLegit.

// Add custom color schemes to this array
"schemes": [
{
"background" : "#2C001E",
"black" : "#4E9A06",
"blue" : "#3465A4",
"brightBlack" : "#555753",
"brightBlue" : "#729FCF",
"brightCyan" : "#34E2E2",
"brightGreen" : "#8AE234",
"brightPurple" : "#AD7FA8",
"brightRed" : "#EF2929",
"brightWhite" : "#EEEEEE",
"brightYellow" : "#FCE94F",
"cyan" : "#06989A",
"foreground" : "#EEEEEE",
"green" : "#300A24",
"name" : "UbuntuLegit",
"purple" : "#75507B",
"red" : "#CC0000",
"white" : "#D3D7CF",
"yellow" : "#C4A000"
}
],

And finally, I added a custom command prompt that runs Mono's x86 developer prompt.

{
"guid": "{b463ae62-4e3d-5e58-b989-0a998ec441b8}",
"hidden": false,
"name": "Mono",
"fontFace": "DelugiaCode NF",
"fontSize": 16,
"commandline": "C://Windows//SysWOW64//cmd.exe /k \"C://Program Files (x86)//Mono//bin//setmonopath.bat\"",
"icon": "c://Users//scott//Dropbox//mono.png"
}

Note I'm using forward slashes and double-escaping them, as well as backslash-escaping quotes.

Save your profiles.json away somewhere, make sure your Terminal is updated, then delete it or empty it and you'll likely get some new "free" shells that the Terminal will detect, then you can copy in just the few customizations you want.

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



A wonderfully unholy alliance - Real Linux commands for PowerShell with WSL function wrappers

Oct 1, 2019

Description:

Dropdown filled with shells in the Windows Terminal

I posted recently about What's the difference between a console, a terminal, and a shell? The world of Windows is interesting - and a little weird and unfamiliar to non-Windows people. You might use Ubuntu or Mac and you've picked your shell like zsh or bash or pwsh, but then you come to Windows and we're hopping between shells (and now operating systems with WSL!) on a tab by tab basis.

If you're using a Windows shell like PowerShell because you like its .NET Core based engine and powerful scripting language, you might still miss common *nix shell commands like ls, grep, sed and more.

No matter what shell you're using in Windows (powershell, yori, cmd, whatever) you can always call into your default Ubuntu instance with "wsl command" so "wsl ls" or "wsl grep" but it'd be nice to make those more naturally and comfortably integrated.
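For example, both of these work today from any Windows shell:

wsl ls -la
dir | wsl grep foo

It works, but the prefix gets old fast, and Windows paths aren't translated to WSL paths for you.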

Now there's a new series of "function wrappers" that make Linux commands available directly in PowerShell so you can easily transition between multiple environments.

This might seem weird but it allows us to create amazing piped commands that move in and out of Windows and Linux, PowerShell and bash. It's actually pretty amazing and very natural if you, like me, are non-denominational in your choice of operating system and preferred shell.

These function wrappers are very neatly designed and even expose TAB completion across operating systems! That means I can type Linux commands in PowerShell and TAB completion comes along!

It's super easy to set up. From Mike Battista's GitHub:

Install PowerShell Core
Install the Windows Subsystem for Linux (WSL)
Install the WslInterop module with Install-Module WslInterop
Import commands with Import-WslCommand, either from your profile for persistent access or on demand when you need a command (e.g. Import-WslCommand "awk", "emacs", "grep", "head", "less", "ls", "man", "sed", "seq", "ssh", "tail", "vim")

You'll do your Install-Module just once, then run notepad $profile and add just that single last line. Make sure you change it to expose the WSL/Linux commands that you want. Once you're done, you can just open PowerShell Core and mix and match your commands!
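For example, the line you add to your $PROFILE might end up looking like this (trim the command list to taste):

Import-WslCommand "awk", "grep", "head", "less", "ls", "man", "sed", "tail", "vim"

After that, every new PowerShell session imports those commands automatically.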

From the blog, "With these function wrappers in place, we can now call our favorite Linux commands in a more natural way without having to prefix them with wsl or worry about how Windows paths are translated to WSL paths:"

man bash
less -i $profile.CurrentUserAllHosts
ls -Al C:\Windows\ | less
grep -Ein error *.log
tail -f *.log

It's a really genius thing and kudos to Mike for sharing it with us! Go try it now. https://github.com/mikebattista/PowerShell-WSL-Interop

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



How to download over 80 free 101-level C#, .NET, and ASP.NET for beginners videos for offline viewing

Sep 26, 2019

Description:

Earlier this week I announced over 80 new free videos in our .NET Core 3.0 launch video series - Announcing free C#, .NET, and ASP.NET for beginners video courses and tutorials

Three questions came up consistently:

My work or country blocks YouTube! What about me?
How can I download these and watch them offline?
I have very low bandwidth. Can I get smaller versions I can download over 3G/4G?

Here's some answers for you!

My work or country blocks YouTube! What about me?

First, we have updated http://dot.net/videos to include links to BOTH YouTube *and* to the same videos hosted on Microsoft's Channel 9, which shouldn't be blocked by your country or company!

How can I download these and watch them offline?

Good question! Here's how to download a whole series with PowerShell! Let's say I want to download "C# 101."

First, head over to https://dot.net/videos.

C# 101 videos

Second, click "Watch on Channel 9" there at the bottom of the series you want, in my case, C# 101.

Note that there's a link there in the corner that says "RSS" - that's Really Simple Syndication!

RSS Videos for Channel 9 Videos

Right click on the one you want, for example MP4 Low for people on low bandwidth connections (Question #3 gets answered too, two for one!), and say "Copy Link Address." Now that link is in your clipboard!

Next, I made a little PowerShell script and put it here in a Gist. If you want, you can right click on this link here and Save Link As and name it something like DownloadVideos.ps1. Maybe save it in C:\temp or c:\users\YOURNAME\Desktop\DotNetVideos. Whatever makes you happy. Make sure you saved it with a *.ps1 extension.

Finally, open up the PS1 file in a text editor and check lines 2 and 3. Put in a path that's correct for YOUR computer, again, like C:\temp, or your downloads folder.

#CHECK THE PATH ON LINE 2 and the FEED on LINE 3
cd "C:\users\scott\Downloads"
$a = ([xml](new-object net.webclient).downloadstring("https://channel9.msdn.com/Series/CSharp-101/feed/mp4"))
$a.rss.channel.item | foreach{
    $url = New-Object System.Uri($_.enclosure.url)
    $file = $url.Segments[-1]
    $file
    if (!(test-path $file)) {
        (New-Object System.Net.WebClient).DownloadFile($url, $file)
    }
}

Make sure the RSS link that you copied earlier above is correct on line 3. We need a Local Folder and we need our Remote RSS Link.

NOTE: If you're an expert you might think this PowerShell script isn't fancy enough or doesn't do x or y, but it'll do pretty nicely for this project. You're welcome to fork it or improve it here.

And finally, open up PowerShell on your machine from the Start Menu and run your downloadvideos.ps1 script like in this screenshot.

What about getting Low Bandwidth videos I can download on a slow connection!

The low bandwidth videos are super small, some smaller than a JPEG! The largest is just 20 megs, so the full C# course is under 200 megs total.

You can download ALL the videos for EACH playlist by visiting http://dot.net/videos, getting the RSS URL for the video playlist you want, and running the PS1 script again with the changed URL on line 3.
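For example, if you wanted a different series, line 3 would point at that playlist's feed. The real URL is whatever you copied from the RSS link on Channel 9, so treat this one as illustrative:

$a = ([xml](new-object net.webclient).downloadstring("https://channel9.msdn.com/Series/NET-Core-101/feed/mp4"))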

Hope this helps!

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!

Announcing free C#, .NET, and ASP.NET for beginners video courses and tutorials

Sep 24, 2019

Description:

If you've been thinking about learning C#, now is the time to jump in! I've been working on this project for months and I'm happy to announce http://dot.net/videos 

There's nearly a hundred short videos (with more to come!) that will teach you topics like C# 101, .NET, making desktop apps, making ASP.NET web apps, learning containers and Docker, or even starting with Machine Learning. There's a ton of great, slow-paced beginner videos. Most are less than 10 minutes long and all are organized into Playlists on YouTube!

If you are getting started, I'd recommend starting with these three series in this order - C#, .NET, then ASP.NET. After that, pick the topics that make you the happiest.

Lots of .NET learning videos and tutorials up on YouTube, free!

If you don't have access to YouTube where you are, all these videos are also on Channel 9 *and* can be downloaded locally via RSS feed! https://channel9.msdn.com/Browse/Series

Lots of .NET learning videos and tutorials up on YouTube, free!

If you like these, let me know what other topics you'd like us to cover! We are just getting started and already have intermediate and advanced C# classes in the works!

Sponsor: Like C#? We do too! That’s why we've developed a fast, smart, cross-platform .NET IDE which gives you even more coding power. Clever code analysis, rich code completion, instant search and navigation, an advanced debugger... With JetBrains Rider, everything you need is at your fingertips. Code C# at the speed of thought on Linux, Mac, or Windows. Try JetBrains Rider today!



What's the difference between a console, a terminal, and a shell?

Sep 20, 2019

Description:

I see a lot of questions that are close but the questions themselves show an underlying misunderstanding of some important terms.

Why would I use Windows Terminal over PowerShell?
I don't need WSL for bash, I use Cygwin.
Can I use conemu with PowerShell Core or do I need to use Windows Terminal?

Let's start with a glossary and clarify some words first.

Terminal

The word Terminal comes from terminate, indicating that it's the terminating end or "terminal" end of a communications process. You'll often hear "dumb terminal" when referring to a text-based environment where the computer you are sitting next to is just taking input and showing text while the real work happens at the other end in a mainframe or large computer.

TTY or "teletypewriter" was the first kind of terminal. Rather than a screen you'd have a literal typewriter in front of you. When you type on it, you're seeing the text on a piece of paper AND inputting that text into a computer. When that computer replies, you'll see the typewriter automatically type on the same paper.

Photo by Texas.713 used under CC

When we refer to a Terminal in the software sense, we're referring to a literal software version of a TTY or Terminal. The Windows Terminal is that. It's really good at displaying textual output. It can take input and pass it on. But the Terminal isn't smart. It doesn't actually process your input, it doesn't look at your files or think.

Console

Folks in the mid 20th century would have a piece of furniture in their living room called a console or console cabinet. A Console in the context of computers is a console or cabinet with a screen and keyboard combined inside it. But, it's effectively a Terminal. Technically the Console is the device and the Terminal is now the software program inside the Console.

image

In the software world a Terminal and a Console are, for all intents, synonymous.

Shell

A shell is the program that the terminal sends user input to. The shell generates output and passes it back to the terminal for display. Here are some examples of shells:

bash, fish, zsh, ksh, sh, tcsh
PowerShell, pwsh
Cygwin
cmd, yori, 4dos, command.com

Here's an important point that should make more sense now that you have these terms - your choice of shell doesn't and shouldn't dictate your choice of terminal application.

Aside: WSL and WSL2 (the Windows Subsystem for Linux) are a complete local Linux (or many Linuxes) that run on Windows 10. They are full and real. WSL2 ships a real Linux kernel and runs it on Windows. Cygwin is NOT a Linux. Cygwin is a large collection of GNU and Open Source tools which provide functionality similar to Linux on Windows - but it is not Linux. It's a simulacrum. It's GNU utils compiled against Win32. It's great, but it's important for you to know what the difference is. Cygwin may let you run your shell scripts but it will NOT run Apache, Docker, or other real ELF-binaries and Linux apps.
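A quick way to prove that to yourself is to ask WSL for its kernel. Here's a hypothetical transcript - your kernel version will differ:

C:\temp> wsl uname -sr
Linux 4.19.43-microsoft-standard

Cygwin's uname would report something like CYGWIN_NT rather than Linux, because there's no Linux kernel underneath it.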

Your Choice of Windows Consoles?

There are a number of shells that ship with Windows. Here are a few I'm running now. Note the "chrome" (the border and title) around them? Those shells are all hosted by a legacy Windows console you've never heard of called conhost.exe. You can go to the command prompt, type powershell, cmd, or ubuntu and any number of shells will run. Conhost does the work of input and output.
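For example, here's one console hosting two different shells, one after the other - a hypothetical session, but the commands are real:

C:\Users\scott> wsl
scott@IRONHEART:/mnt/c/Users/scott$ exit
C:\Users\scott> pwsh
PS C:\Users\scott>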

Now, forget that conhost exists, because it sucks - it's super old.

Shells that come with Windows

Pseudo Console, Pseudo Terminal, PTY, Pseudo TTY (ConPTY)

Pseudo Terminals are terminal emulators or software interfaces that emulate terminals. They pretend to be terminals like the ones above. *Nix systems have long had a pseudo-terminal (PTY) infrastructure and now Windows has a pseudoconsole (ConPTY) as well.

Windows' new ConPTY interface is the future of consoles and terminals on Windows. If you choose a 3rd party (non-built-in) console application for Windows, make sure it supports ConPTY and it'll be a better experience than some of the older consoles that use screen scraping or other hacks.

image

Back to your choice of Windows Consoles

Remembering that there are a lot of shells you can use in Windows, there are also a lot of 3rd party consoles you can use if you don't like conhost.exe (and you shouldn't).

Hyper
ConEmu
cmder
Console2
ConsoleZ
Terminus
FluentTerminal
ZOC
MobaXterm
Babun (dead)
4NT/jpSoftware (not free)
Putty
MinTTY
Windows Terminal (free in Microsoft Store)
XTermjs - a Typescript component that lets you integrate terminals into your apps
VSCode includes a Terminal
Visual Studio 2019 Preview includes a Terminal

All of these Terminals support ALL the shells above and any shells I've missed. Because a shell isn't a terminal. Pick the one that makes you happy. I use PowerShell Core and Ubuntu in WSL2 in the Windows Terminal.

Windows Terminal

Hope this helps clear things up.

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2019 Scott Hanselman. All rights reserved.
     

Patching the new Cascadia Code to include Powerline Glyphs and other Nerd Fonts for the Windows Terminal

Sep 17, 2019

Description:

Microsoft released a nice new ligature-friendly open source font this week called Cascadia Code. It'll eventually be shipped with the open source Windows Terminal (you can get it from the Store for free) but for now you can just download and install the TTF.

I've blogged about Fira Code and Monospaced Programming Fonts with Ligatures before. Just like keyboards, mice, monitors, text editors, and all the other things that we as developers put in our toolkits, Fonts are a very personal thing. Lots of folks have tweeted me, "why is this better than <font I use>." I dunno. Try it. Coke vs. Pepsi. If it makes you happy, use it.

I use Cascadia Code for my Terminals and I use Fira Code for my code editor. ¯\_(ツ)_/¯

That said, one important thing that you may want to know about is that you have FULL control of your fonts! Lots of folks want certain glyphs, or a fancy bash prompt, or they use posh-git, or PowerLine, or all of the above.

Right now Cascadia Code doesn't include every glyph in the world, but don't let that hold you back. Fix it.

For example, if I go install "Oh my Posh" and spice up my PowerShell Core prompt, it might look like this with Cascadia Code today.

Cascadia Code with no Nerd Fonts

But if I patch Cascadia Code on my own machine to include Nerd Fonts and other glyphs, I'll get this lovely prompt in Windows Terminal:

Cascadia Code with Nerd Fonts and PowerLine

So you have the power to do a lot of things. Don't be satisfied. Nest, and make your prompt your own! There are lots of Nerd Fonts but I want to patch Cascadia Code today (I'm sure they'll do it themselves one day, but I'm impatient) and make it look the way I want. You can too!

Starting with FontForge in Ubuntu under WSL

Using WSL2 and Ubuntu, I installed the Nerd Fonts Patcher and ran it on my downloaded copy of Cascadia Code like this:

scott@IRONHEART:/mnt/d/github/nerd-fonts$ fontforge -script font-patcher /mnt/c/Users/scott/Downloads/Cascadia.ttf
Copyright (c) 2000-2014 by George Williams. See AUTHORS for Contributors.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
with many parts BSD <http://fontforge.org/license.html>. Please read LICENSE.
Based on sources from 11:21 UTC 24-Sep-2017-ML-D.
Based on source from git with hash:
The following table(s) in the font have been ignored by FontForge
Ignoring 'DSIG' digital signature table
Warning: Mac string is a subset of the Windows string in the 'name' table
for the License string in the English (US) language.
Adding 53 Glyphs from Seti-UI + Custom Set
╢████████████████████████████████████████╟ 100%
Adding 198 Glyphs from Devicons Set
╢████████████████████████████████████████╟ 100%

Done with Patch Sets, generating font...

Generated: Cascadia Code Nerd Font

Cool! I could even go nuts and add -c and add thousands of glyphs. It just depends on what I need. I could just go --powerline and --fontawesome and call it a day. It's up to you! Salt your Fonts to taste!
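For reference, here's roughly what those invocations would look like - the flags are real font-patcher options, and the path is the same one from the transcript above:

scott@IRONHEART:/mnt/d/github/nerd-fonts$ fontforge -script font-patcher --powerline --fontawesome /mnt/c/Users/scott/Downloads/Cascadia.ttf
scott@IRONHEART:/mnt/d/github/nerd-fonts$ fontforge -script font-patcher -c /mnt/c/Users/scott/Downloads/Cascadia.ttf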

Now I can install my local modified TTF like any other, then go into my profile.json in Windows Terminal and set the font face to my new personal custom "CascadiaCode Nerd Font!" Boom. All set.
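The relevant bit of profile.json looks something like this fragment (property names as of the 2019 Windows Terminal preview; the font name must match whatever your patched font installed itself as):

"profiles": [
    {
        "name": "PowerShell Core",
        "fontFace": "CascadiaCode Nerd Font",
        "fontSize": 10
    }
]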

UPDATE:  Alistair has created a forked version with the added glyphs. You may (or may not) be able to download his forked and renamed version from this Github comment. Slick!

Please also check out my YouTube video on blinging out your PowerShell prompt in the Windows Terminal!

Check out my YouTubes

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2019 Scott Hanselman. All rights reserved.
     

Emulating a PlayStation 1 (PSX) entirely with C# and .NET

Sep 12, 2019

Description:

I was reading an older post in an emulator forum where someone was asking for a Playstation 1 (PSX) emulator written in C#, and the replies went on and on about how C# and .NET are not suited for emulation, C# is far too slow, negativity, blah blah.

Of course, that's silly. Good C# can run at near-native speed given all the work happening in the runtime/JITter, etc.

I then stumbled on this very early version of a PSX Emulator in C#. Now, if you were to theoretically have a Playstation SCPH1001.BIN BIOS and then physically owned a Playstation (as I do) and then created a BIN file from your physical copy of Crash Bandicoot, you could happily run it as you can see in the screenshot below.

Crash Bandicoot on a C#-based PSX Emulator

This project is very early days, as the author points out, but I was able to Git Clone and directly open the code in Visual Studio 2019 Community (which is free) and run it immediately. Note that as of the time of this blog post, the BIOS location *and* BIN files are hardcoded in the CD.cs and BUS.cs files. I named the BIN file "somegame.bin."

PSX Emulator in C# inside Visual Studio

A funny note: since the frame rate is unbounded as the code currently sits, while I get about 30fps in Debug mode, in Release mode the ProjectPSX Emulator runs at over 120fps on my system, emulating a PlayStation 1 at over 220% of the usual CPU speed!

Just to make sure there's no confusion, and to support the author I want to repeat this question and answer here:

Can i use this emulator to play?

"Yes you can, but you shouldn't. There are a lot of other more capable emulators out there. This is a work in progress personal project with the aim to learn about emulators and hardware implementation. It can and will break during emulation as there are a lot of unimplemented hardware features."

This is a great codebase to learn from and read - maybe even support with your own Issues and PRs if the author is willing, but as they point out, it's neither complete nor ready for consumption.

Again, from the author who has other interesting emulators you can read:

I started doing a Java Chip8 and a C# Intel 8080 CPU (used on the classic arcade Space Invaders). Some later i did Nintendo Gameboy. I wanted to keep forward to do some 3D so i ended with the PSX as it had a good library of games...

Very cool stuff! Reading emulator code is a great way to not only learn about a specific language but also to learn 'the full stack.' We often hear Full Stack in the context of a complete distributed web application, but for many the stack goes down to the metal. This emulator literally boots up from the real BIOS of a Playstation and emulates the MIPS R3000A, a BUS to connect components, the CPU, the CD-ROM, and display.

An emulator has to lie at every step so that when an instruction is reached it can make everyone involved truly believe they are really running on a Playstation. If it does its job, no one suspects! That's why it's so interesting.

You can also press TAB to see the VRAM visualized as well as textures and color lookup tables which is super interesting!

Visualizing VRAM

One day, some day, there will be no physical hardware in existence for some of these old/classic consoles. Even today, lots of people play games for NES and SNES on a Nintendo Switch and may never see or touch the original hardware. It's important to support emulation development and sites like archive.org with Donations to make sure that history is preserved!

NOTE: It's also worth pointing out that it took me about 15 minutes to port this from .NET Framework 4.7.2 to .NET Core 3.0. More on this, perhaps, in another post. I'll also do a benchmark and see if it's faster.
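The bulk of a port like that is usually the project file. Here's a minimal sketch of what a retargeted csproj might look like - this is my assumption of the shape, not the project's actual file, and I'm assuming the UI is WinForms (adjust to match the real project):

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>WinExe</OutputType>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <UseWindowsForms>true</UseWindowsForms>
  </PropertyGroup>
</Project>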

I encourage you to go give a Github Star to ProjectPSX and enjoy reading this interesting bit of code. You can also read about the PSX Hardware written by Martin Korth for a trove of knowledge.

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today
© 2019 Scott Hanselman. All rights reserved.
     

How to fix dfu-util, STM, WinUSB, Zadig, Bootloaders and other Firmware Flashing issues on Windows

Sep 11, 2019

Description:

Flashing devices with dfu-util

I'm pretty happy with Windows 10 as my primary development box. It can do most anything I want, runs a half-dozen Linuxes, has a shiny new open source Terminal, and has great support for Docker now.

However.

For years - YEARS I SAY - Windows has been a huge hassle when you want to flash the firmware of various devices over USB.

The term "dfu" means Device Firmware Update and dfu-util is the Device Firmware Update Utility, natch.

Very often I'll find myself with a device like a Particle Photon, Wilderness Labs Meadow, or some STM32 device that uses the ST Bootloader.

The Mac and Linux instructions usually say something like "plug it in and party on" but folks like myself with Windows have to set up a WinUSB Driver (libusb-win32 or libusbK) as dfu-util uses those libraries to speak to USB devices.

If you plug in a device, the vast majority of Windows users want the device to 'just work.' My non-technical parent doesn't want Generic USB drivers so they can flash the firmware on their mouse. I, however, as an aristocrat, sometimes want to do low-level stuff and flash an OS on a Microcontroller.

Today, the easiest way to swap the "inbox" driver with WinUSB is using a utility called Zadig. Per their docs:

Zadig is a Windows application that installs generic USB drivers,
such as WinUSB, libusb-win32/libusb0.sys or libusbK, to help you access USB devices.

It can be especially useful for cases where:

you want to access a device using a libusb-based application
you want to upgrade a generic USB driver
you want to access a device using WinUSB

If you follow the instructions when flashing a device and don't have the right USB driver installed you'll likely get an error like this:

Cannot open DFU device 0483:df11

That's not a lot to go on. The issue is that the default "inbox" driver that Windows uses for devices like this isn't set up for Generic USB access with libraries like "libusb."

Install a generic USB driver for your device - WinUSB using Zadig

Run Zadig and click Options | List All Devices.

Here you can see me finding the ST device within Zadig and replacing the driver with WinUSB. In my case the device was listed under STM32 Bootloader. Be aware that you can mess up your system if you select something like your WebCam instead of the hardware device you mean to select.

In Zadig, select the STM32 Bootloader

In this state, you can see in the Device Manager that there's an "STM Device in DFU Mode."

STM Device in DFU Mode

Now I run Zadig and replace the driver with WinUSB. Here's the result. Note the SUCCESS and the changed Driver on the left.

Replace it with WinUSB

Here the STM32 Bootloader device now exists in Universal Serial Bus Devices in Device Manager.

STM32 Bootloader

Now I can run dfu-util --list again. Note the before and after in the screenshot below. I run dfu-util --list and it finds nothing. Then I replace the driver with the generic WinUSB driver, run dfu-util again, and it finds the devices.

Flashing devices with dfu-util

At this point I can follow along and flash my devices per whatever instructions my manufacturer/project/boardmaker intends.
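For a typical STM32 part in DFU mode, the flash step usually looks something like this - a sketch only, as the alt setting, address, and file name all come from your board's instructions:

dfu-util -a 0 -s 0x08000000:leave -D firmware.bin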

NOTE: When using dfu-util on Windows, I recommend you either be smart about your PATH and add dfu-util, or better yet, make sure the dfu-util.exe and libusb.dlls are local to your firmware so there's no confusion about what libraries are being used.
Keep dfu-util and libusb together

I'd love to see this extra step in Windows removed, but for now, I hope this write up makes it clearer and helps the lone Googler who finds this post.

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today


© 2019 Scott Hanselman. All rights reserved.
     

Visual Studio now includes an integrated Terminal

Sep 5, 2019

Description:

It's early days (preview) but there's now a Terminal integrated into Visual Studio! Taking a nod from the 2017 plugin, the Terminal is now built in as an experimental feature using features from the NEW open source Windows Terminal.

Rather than build everything from scratch, the Visual Studio terminal shares most of its core with the Windows Terminal!

Assuming you have Visual Studio 2019 16.3 Preview 3 or above, you'll want to enable it by visiting the Preview Features page. Go to Tools > Options > Preview Features, enable the Experimental VS Terminal option and restart Visual Studio.

Tools > Options > Preview Features, enable the Experimental VS Terminal option and restart Visual Studio.

Make sure you restart after changing this option.

It's super early days and there's lots of things coming.

You can set up Profiles but you can't use them yet as the default is the only one used. In the future the Integrated Terminal will add a dropdown and + button like the Windows Terminal.

Multiple Profiles for the VS Integrated Terminal

Also note that if you want to integrate WSL (bash) you'll want to select c:\windows\sysnative\wsl.exe and pass in your preferred Distribution. Here you can see me running Ubuntu inside of VS2019. Sweet.

A terminal inside VS!

Grab the Preview of 16.3p3 now or wait a bit and you'll see more and more updates to the new VS Integrated Terminal in the coming months!

Sponsor: Uno Platform is the Open Source platform for building single codebase, native mobile, desktop and web apps using only C# and XAML. Built on top of Xamarin and WebAssembly! Check out the Uno Platform tutorial!


© 2019 Scott Hanselman. All rights reserved.
     

Introducing open source Windows 10 PowerToys

Sep 4, 2019

Description:

Microsoft Windows PowerToys

Yesterday the Windows Team announced the first preview and code release of PowerToys for Windows 10. This first preview includes two utilities:

The Windows key shortcut guide - just hold down the WIN key for help
A pro window manager called FancyZones - check out this article for all the details!

Many years ago there was PowerToys for Windows 95 and frankly, it's overdue that we have them for Windows 10 – and bonus points for being open source!

These tools are also open source and hosted on GitHub! Maybe you have an open source project that's a "PowerToy?" Let me know in the comments. A great example of a PowerToy is something that takes a Windows feature and turns it up to 11!

EarTrumpet is a favorite example of mine of a community "PowerToy." It takes the volume control and the Windows audio subsystem and tailors it for the pro/advanced user. You should definitely try it out!

As for these new Windows 10 Power Toys, here’s what the Windows key shortcut guide looks like:

PowerToys - Shortcut Guide

And here's Fancy Zones. It's very sophisticated. Be sure to watch the YouTube to see how to use it.

Fancy Zones

To kick the tires on the first two utilities, download the installer here.

The main PowerToys service runs when Windows starts and a user logs in. When the service is running, a PowerToys icon appears in the system tray. Selecting the icon launches the PowerToys settings UI. The settings UI lets you enable and disable individual utilities and provides settings for each utility. There is also a link to the help doc for each utility. You can right click the tray icon to quit the Power Toys service.

We'd love to see YOU make a PowerToy and maybe it'll get bundled with the PowerToys installer!

How to create new PowerToys

See the instructions on how to install the PowerToys Module project template.
Specifications for the PowerToys settings API.

We ask that before you start work on a feature that you would like to contribute, please read our Contributor's Guide. We will be happy to work with you to figure out the best approach, provide guidance and mentorship throughout feature development, and help avoid any wasted or duplicate effort.

Additional utilities in the pipeline are:

Maximize to new Virtual Desktop (MTND) widget - shows a pop-up button when a user hovers over the maximize/restore button on any window. Clicking it creates a new desktop, sends the app to that desktop, and maximizes the app on the new desktop.
Process terminate tool
Batch file renamer
Animated gif screen recorder

If you find bugs or have suggestions, please open an issue in the Power Toys GitHub repo.

Sponsor: Uno Platform is the Open Source platform for building single codebase, native mobile, desktop and web apps using only C# and XAML. Built on top of Xamarin and WebAssembly! Check out the Uno Platform tutorial!


© 2019 Scott Hanselman. All rights reserved.
     

Deploying a MSDeploy-packaged Web application to a Linux Azure App Service with Azure DevOps

Aug 29, 2019

Description:

For bizarre and unknown historical reasons, when using MSDeploy to make a ZIP package to upload a website to a web server you get a massively deep silly path like yada/yada/C_C/Temp/package/WebApplication1/obj/Release/Package/PackageTmp. I use .NET Core so I usually do a "dotnet publish" and get a sane path for my build artifacts in my CI/CD (Continuous Integration/Continuous Deployment) pipeline.

I'm using the original pipeline editor on free Azure DevOps (I'm still learning the DevOps YAML for this, and the visual pipeline editor is IMHO more friendly for getting started).

However, I'm using a "Visual Studio Build" task which is using MSDeploy and these MSBuild arguments.

/p:DeployOnBuild=true
/p:WebPublishMethod=Package
/p:PackageAsSingleFile=true
/p:SkipInvalidConfigurations=true
/p:PackageLocation="$(build.artifactstagingdirectory)\\"

Azure DevOps

Later on in the process I'm taking this package/artifact - now named "drop.zip" - and publishing it to Azure App Service.

I'm using the "Azure App Service Deploy" task in the DevOps release pipeline and it works great when publishing to a Windows Azure App Service Plan. Presumably because it's using, again, MSDeploy and it knows about these folders.

However, I wanted to also deploy to a Linux Azure App Service. Recently there was a massive (near 35%) price drop for Premium App Services. I'm running an S1 and I can move to a P1V2 and get double the memory, move to SSDs, and get double the perf for basically the same money. I may even be able to take TWO of my S1s and pack all my websites (19 at this point) into just one Premium. It'll be faster and way cheaper.

Trick is, I'll need to move my Windows web apps to Linux web apps. That's cool; since I'm using .NET Core - in my case 2.1 and 2.2 - I'll just republish. I decided to take my existing Azure DevOps release pipeline and just add a second task to publish to Linux for testing. If it works I'll just disable the Windows one. No need to rebuild the whole pipeline from scratch.

Unfortunately the Linux Azure App Service has its deployment managed as a straight ZIP deployment; it was ending up with a TON of nested folders from MSDeploy!

NOTE: If I'm giving bad advice or I am missing something obvious, please let me know in the comments! Perhaps there's a "this zip file has a totally bonkers directory structure, fix it for Linux" checkbox that I missed?

I could redo the whole build pipeline and build differently, but I'd be changing two variables and it already works today on Windows.

I could make another build pipeline for Linux and build differently, but that sounds tedious and again, a second variable. I have a build artifact now, it's just in a weird structure.

How did I know the build artifact had a weird folder structure? I remember that I could just download any build artifact and look at it! Seems obvious when you say it but it's a good reminder that all these magical black box processes that move data from folder to folder are not black boxes - you can always check the result of a step. The output of one step becomes the input to the next.

downloading an artifact from Azure DevOps

I should probably have a Windows Build and Linux Build (two separate build agents) but the site isn't complex enough and it doesn't do anything that isn't clearly cross-platform friendly.

Anthony Chu suggested that I just remove the folders by restructuring the zip file (unzipping/zipping it). Could be a simple way to get both Windows and Linux publishing from a single artifact. I can fix it all up with a fresh build and release pipeline another time when I have the energy to learn this YAML format. (Speaking of the Azure DevOps YAML which doesn't have a friendly editor or validator, not speaking of YAML as a generic concept)

Unzipping and zipping up MSDeploy mess

I unzip the weird folder structure, then zip it back up from a new root. It then cleanly deploys to the Linux Azure App Service from the same artifact I built for the Windows App Service.

Ironically, here's a YAML view of the tasks, although I built them with the visual editor.

steps:
- task: ExtractFiles@1
  displayName: 'Extract files - MSDeploy Crap'
  inputs:
    # The archive pattern was omitted in the original post; this assumes the artifact is drop.zip
    archiveFilePatterns: '**/drop.zip'
    destinationFolder: linuxdrop

- task: ArchiveFiles@2
  displayName: 'Archive linuxdrop/Content/D_C/a/1/s/hanselminutes.core/obj/Release/netcoreapp2.2/PubTmp/Out'
  inputs:
    rootFolderOrFile: 'linuxdrop/Content/D_C/a/1/s/hanselminutes.core/obj/Release/netcoreapp2.2/PubTmp/Out'
    includeRootFolder: false

- task: AzureRmWebAppDeployment@4
  displayName: 'Azure App Service Deploy: hanselminutes-core-linux'
  inputs:
    azureSubscription: 'Azure MSDN'
    appType: webAppLinux
    WebAppName: 'hanselminutes-linux'
    packageForLinux: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    RuntimeStack: 'DOTNETCORE|2.2'

Just to be clear, this isn't standard and it's a pretty rare edge case and it may not work for everyone but isn't it nice to google for a super rare edge case and instead of feeling all alone you find an answer?

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!
© 2019 Scott Hanselman. All rights reserved.
     

Totally unsupported hacks - Add Windows Terminal to the Win+X Shortcut menu

Aug 27, 2019

Description:

You shouldn't do this and if you choose to do this you may hurt yourself or one of your beloved pets.

You have been warned.

The Windows+X hotkey has been around for many years and is a simple right-click style context list of Developer/Administrator stuff that your techies might need in the course of human events.

There's one obscure setting in Settings | Taskbar where you can set the main option for the Command Prompt to be replaced with PowerShell, although that was flipped to "on" by default many years ago.

Replace Command Prompt with PowerShell

I want Windows Terminal in that Win+X menu.

Fast forward to a world with lots of alternative console hosts, Linux running on Windows natively, not to mention cross-platform open source PowerShell Core, AND the new open source Windows Terminal (that you can just go download right now in the Windows Store), and we find ourselves in a middle place. We want the Windows Terminal to be the default console everywhere, but that's gonna be a while.

Until then, we can integrate the Windows Terminal into our lives in a few obvious ways.

Pin Windows Terminal to your taskbar
Train yourself to Win+R and run "wt" rather than "cmd.exe", as wt.exe is a shim that launches the store-based Windows Terminal
Add Windows Terminal to the Win+X menu

It is that last one that concerns me today.

The Win+X implementation is a totally bonkers thing that I just don't understand, with its origins lost to the mists of forgotten time.

You can go check out C:\Users\USERNAME\AppData\Local\Microsoft\Windows\WinX and find it full of LNK files. Just drop yours in there, right? Well, I say nay nay!

They didn't want just anyone dropping stuff in there so to add a new application to Windows+X you need to:

Make or find a LNK file for your application. BUT! Your lnk file can't (today?) be a LNK to a Windows Store app - more on that later. They appear to be ignored today. Store a special hash in your LNK file per Rafael's excellent writeup here so that they are considered "Approved Links." Rafael's utility has the source at GitHub and the binary here. Make a new Group 4 folder in the \WinX folder above OR update Group 3 and copy your link in there considering the numbering scheme. Note the ordering in the registry at HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\ShellCompatibility\InboxApp

OR

Download a closed source utility by Sergey Tkachenko that promises to do these things for you, from WinReview.ru. More on that later.

Here's my WinX\Group3 folder. Note the shortcut at the top there.

image

I wanted to find a link to the Windows Terminal but it's harder than it looks. I can't find a real LNK file anywhere on my system. BUT I was able to find a synthetic one and make a copy by going "Win+R" and running "shell:AppsFolder" which brings you to a magic not-a-folder folder.

Not a folder folder

That is a folder of lies. I tried making a copy of this LNK, moving it to my desktop, hashing it with Rafael's util, but it's ignored, presumably because it's a Windows Store LNK. Instead, I'll head out to cmd.exe and type "where wt.exe" to find the wt.exe shim and make a link to that!

C:\Users\scott>where wt.exe
C:\Users\scott\AppData\Local\Microsoft\WindowsApps\wt.exe

These files are also lies, but lies of another type. Zero byte lies.

Zero Byte Lies

Right-click wt.exe and Create Shortcut. Then drag that shortcut out of there and into somewhere else like your Desktop. You can then use hashlnk and move it to the WinX folder.
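For the record, hashlnk's usage is just the path to the shortcut; something like this, with a path from my machine:

C:\temp> hashlnk "C:\Users\scott\Desktop\wt.lnk"

Then copy the now-approved LNK into the WinX group folder and restart Explorer (or sign out and back in) for the menu to pick it up.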

OR, you can use this scary and totally unsupported utility hosted at a questionable website that you have no business visiting. It's called Win+X Menu Editor and it was a chore to download. So much so that I'm going to hide a copy in my DropBox for the day in the near future when this utility and website disappear.

Be careful when you go download this utility, the site is full of scary links that say Download Now but they are all lies. You want the subtle text link that points to a ZIP file, just above the Donate button that says "Download Win+X Menu Editor."

In this utility you can add an item that points to your new WT.LNK file and it will use Rafael's code and copy the LNK file to the right place and re-number stuff if needed. Again, be careful as you never know. You might mess up your whole life with stuff like this. It worked for me.

Win+X Menu Editor

And there you go.

Windows Terminal in the WIN+X menu

Lovely. Now IMHO in some ideal future this should just happen out of the box, but until then it's nice to know I can do it myself.

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!


© 2019 Scott Hanselman. All rights reserved.
     

dotnet new worker - Windows Services or Linux systemd services in .NET Core

Aug 23, 2019

Description:

dotnet new worker

You've long been able to write Windows Services in .NET and .NET Core, and you could certainly write a vanilla Console App and cobble something together for a long-running headless service as well. However, the idea of a Worker Process, especially a long-running one, is a core part of any operating system - Windows, Linux, or Mac.

Now that open source .NET Core is cross-platform, it's more than reasonable to want to write OS services in .NET Core. You might write a Windows Service with .NET Core or a systemd process for Linux with it as well.

Go grab a copy of .NET Core 3.0 - as of the time of this writing it's very close to release, and Preview 8 is supported in Production.

If you're making a Windows Service, you can use the Microsoft.Extensions.Hosting.WindowsService package and tell your new Worker that its lifetime is based on ServiceBase.

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .UseWindowsService()
        .ConfigureServices(services =>
        {
            services.AddHostedService<Worker>();
        });

If you're making a Linux worker and using systemd you'd add the Microsoft.Extensions.Hosting.Systemd package and tell your new Worker that its lifetime is managed by systemd!

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .UseSystemd()
        .ConfigureServices((hostContext, services) =>
        {
            services.AddHostedService<Worker>();
        });

The Worker template in .NET Core makes all this super easy and familiar if you're used to using .NET already. For example, logging is built in, and regular .NET log levels like LogLevel.Debug or LogLevel.Critical are automatically mapped to systemd levels like Debug and Crit, so I could run something like sudo journalctl -p 3 -u testapp and see my app's logs just like any other Linux process - because it is one!
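For completeness, here's a minimal sketch of the systemd unit that would run it - the testapp name matches the journalctl example above, and the paths are hypothetical:

[Unit]
Description=testapp .NET Core Worker

[Service]
Type=notify
ExecStart=/usr/bin/dotnet /srv/testapp/testapp.dll
Restart=always

[Install]
WantedBy=multi-user.target

Type=notify works here because UseSystemd() notifies systemd once the host has actually started.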

You'll notice that a Worker doesn't look like a Console App. It has a Main but your work is done in a Worker class. A hosted service or services is added with AddHostedService and then a lot of work is abstracted away from you. The Worker template and BackgroundService base class brings a lot of the useful conveniences you're used to from ASP.NET over to your Worker Service. You get dependency injection, logging, process lifetime management as seen above, etc, for free!

public class Worker : BackgroundService
{
    private readonly ILogger<Worker> _logger;

    public Worker(ILogger<Worker> logger)
    {
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            _logger.LogInformation("Worker running at: {time}", DateTimeOffset.Now);
            await Task.Delay(1000, stoppingToken);
        }
    }
}

This is a very useful template and it's available from the command line as "dotnet new worker" or from File New Project in Visual Studio 2019 Preview channel.
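Trying it out takes about a minute. A sketch of the commands - the project name is mine, and the package shown is the Windows Service flavor (swap in Microsoft.Extensions.Hosting.Systemd for Linux):

dotnet new worker -o MyWorker
cd MyWorker
dotnet add package Microsoft.Extensions.Hosting.WindowsService
dotnet run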

Also check out Brady Gaster's excellent blog post on running .NET Core workers in containers in Azure Container Instances (ACI). This is super useful if you have some .NET Core and you want to Do A Thing in the cloud but you also want per-second billing for your container.

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2019 Scott Hanselman. All rights reserved.
     

Review: UniFi from Ubiquiti Networking is the ultimate prosumer home networking solution

Aug 20, 2019

Description:

UniFi map

I LOVE my Amplifi Wi-Fi Mesh Network. I've had it for two years and it's been an absolute star performer. We haven't had a single issue. Rock solid. That's really saying something. From unboxing to installation to running it (working from home for a tech company, so you know I'm pushing this system) it's been totally stable. I recommend Amplifi unreservedly to any consumer or low-key prosumer who has been frustrated with their existing centrally located router failing to give them reliable wi-fi everywhere in their home.

That said...I recently upgraded my home internet service provider. For the last 10 years I've had fiber optic to the house with 35 Mbps up/down and it's been great. Then I called them a few years back and got 100/100. The whole house was presciently wired by me for Gigabit back in 2007 (!) with a nice wiring closet and everything. Lately 100/100 hasn't been really cutting it when I'm updating a dozen laptops for a work event, copying a VM to the cloud while my spouse is watching 4k Netflix and two boys are updating App Store apps. You get the idea. Modern bandwidth requirements and life have changed since 2007. We've got over 40 devices on the network now and many are doing real work.

I called and changed providers to a cable provider that offered true gigabit. However, I was rarely getting over 300-400 Mbps on my Amplifi. There is a "hardware NAT" option that really helps, but short of running the Amplifi in Bridged Mode and losing a lot of its epic features, it was clear that I was outgrowing this prosumer device.

Given that I'm a professional working at home, doing stuff that's more than the average Joe or Jane does, what's a professional option?

UniFi from Ubiquiti

Amplifi is the consumer/prosumer line from Ubiquiti Networks and UniFi (UBNT) is the professional line. You'll literally find these installed at businesses or even sports stadiums. This is serious gear.

Let me be honest. I knew UniFi existed. Knew (I thought) all about it and I resisted. My friends and fellow nerds insisted it was easy but I kept seeing massive complex network diagrams and convinced myself it wasn't worth the hassle.

My friends, I was wrong. It's not hard. If you are doing business at home, have a gigabit network pipe, a wired home network, and/or have a dozen or more network devices, you're a serious internet person and you might want to consider serious internet networking gear.

Everything is GREAT

Now, UniFi is more expensive than Amplifi as it's pro gear. While an Amplifi Mesh WiFi system is just about $300-350 USD, UniFi Pro gear will cost more, you'll need some stuff to start out, and it won't always feel intuitive as you plan your system. It is worth it and I'm thrilled with the result. The flexibility and customizability it's offered have been epic. There are literally no internet issues in our house or property anymore. I've even been able to add wired and wireless non-cloud-based security cameras throughout the property. Additionally, remember how the house is already wired in nearly every room with Cat6 (or Cat5e) cabling? UniFi has reintroduced me to the glorious world of PoE+ (Power over Ethernet) and removed a half dozen AC wall plugs from my system.

Plan your Network

You can test out the web-based software yourself LIVE at https://demo.ui.com and see what managing a large network would be like. Check out their map of the FedEx Forum Stadium and how they get full coverage. You can see a simulated map of my house (not really my house) in the screenshot above. When you set up a controller you can place physical devices (ones you have) and test out virtual devices (ones you are thinking of buying) and see what they would look like on a real map of your home (supplied by you). You can even draw 3D walls and describe their material (brick, glass, steel) and their dB signal loss.

UniFi.beginner.950

When you are moving to UniFi you'll need:

USG - UniFi Security Gateway - This has 3 gigabit points and has a WAN port for your external network (plug your router into this) and a LAN port for your internal network (plug your internal switch into this). This is the part that doles out DHCP. UniFi Cloud Key or Cloud Key Gen2 Plus It's not intuitive what the USG does vs the Cloud Key but you need both. I got the Gen2 because it includes a 1TB hard drive that allows me to store my security video locally. It also is itself a PoE client so I don't need to plug it into the wall. I just wired it with a single Ethernet cable to the PoE switch below and left it in the wiring closet. There's a smaller cheaper Cloud Key if you don't need a hard drive. You don't technically need a Cloud Key I believe, as all the UniFi Controller Software is free and you can run it in on any machine you have laying around. Folks have run them on any Linux or Windows machine they have, or even on a Synology or other NAS. I like the idea of having it "just work" so I got the Cloud Key. UniFi Switch (of some kind and number of ports) 8 port 150 watt UniFi Switch 24 port UniFi Switch - 24 ports may be overkill for most but it's only 8 lbs and will handle even the largest home network. And it's under $200 USD right now on Amazon 24 port UniFi Switch with PoE - I got this one because it has 250W of PoE power. If you aren't interested in power over ethernet you can save money with the non-PoE version or a 16 port version but I REALLY REALLY recommend you use PoE because the APs work better with it.
PoE switch showing usage on many ports

Now once you've got the administrative infrastructure above, you just need to add whatever UniFi APs - access points - and/or optional cameras that you want!

NOTE/TIP - A brilliant product from Ubiquiti that I think is flying under the radar is the Unifi G3 Flex PoE camera. It's just $75 and it's tiny but it's absolutely brilliant. Full 1080p video and night vision. I'll talk about the magic of PoE later on but you can just plug this in anywhere in the house - no AC adapter - and you've got a crystal clear security camera or cameras anywhere in the house. They are all powered from the PoE switch!

I had a basic networking closet. I put the USG Gateway in the closet with a patch cable to the cable modem (the DOCSIS 3.1 cable modem that I bought because I got tired of renting it from the service provider), then added the Switch with PoE and plugged the Cloud Key into it. Admin done.

Here's the lovely part.

Since I have cable throughout the house, I can just plug in the UniFi Access Points in various rooms and they get power immediately. I can try different configs and test the signal strength. I found the perfect config after about 4 days of moving things around and testing on the interactive map. The first try was fine but I strove for perfect.

There are lots of UniFi Access Points to choose from. The dual radio Pro version can get pretty expensive if you need a lot of them, so I got the Lite PoE AP. You can also get a 5 pack of the nanoHD UniFi Access Points.

These Access Points are often mounted in the ceiling in pro installations, and in a few spots I really wanted something more subtle AND I could use a few extra Ethernet ports. Since I already had an Ethernet port in the wall, I could just wall mount the UniFi Wall Mounted AP. It's both a wireless AP that radiates outward into the room AND it turns your one port into two, or you can get one that becomes a switch with more ports and extends your PoE abilities. So I can add this to a room, plug a few devices in AND a PoE powered Camera with no wall-warts or AC adapters!

NOTE: I did need to add a new ethernet RJ45 connector to plug into the female connector of the UniFi in-wall AP. Just be sure to plan and take inventory. You may already have full cables with connectors pulled to your rooms. Be aware.

There are a TON of great Wireless AP options from UniFi so make sure you explore them all and understand what you want.

In-Wall AP

Here's the resulting setup and choices I made, as viewed in the UniFi Controller Software:

List of Ubnt devices

I have the Gateway, the Switch with PoE, and five APs. Three are the disc APs and two are in-wall APs. They absolutely cover and manage my entire two story house and yards front and back. It's made it super easy for me to work from home and be able to work effectively from any room. My kids and family haven't had any issues with any tablets or phones.

As of the time of this writing I have 27 wireless devices on the system and 11 wired (at least those are the ones that are doing stuff at this hour).

My devices as viewed in the UniFi controller

Note how it will tell you how each device's WiFi experience is. I use this Experience information to help me manage the network and see if the APs are appropriately placed. There is a TON of great statistics and charts and graphics. It's info-rich to say the LEAST.

NOTE: To answer a common question - In an installation like this you've got a single SSID even though there's lots of APs and your devices will quietly and automatically roam between them!
Log showing roaming between APs

The iPhone app is very full-featured as well, and when you've got deep packet inspection turned on you can see a ton of statistical information at the price of a smidge of throughput performance.

iPhone StatsiPhone Bandwidth

I have had NO problem hitting 800-950 Mbps over wired and I feel like there's no real limit to the perf of this system. I've done game streaming over Steam and Xbox game streaming for hours without a hiccup. Netflix doesn't buffer anymore, even on the back porch.

a lot of bandwidth with no drops

You can auto-optimize, or you can turn off a plethora of features and manage everything manually. I was able to tweak a few APs to run their 2.4GHz Wi-Fi radios on less crowded channels in order to get out of the way of the loud neighbors on channel 11.

I have a ton of control over the network now, unlimited expandability, and it has been a fantastically stable network. All the APs are wire-backed and the wireless bandwidth is rock solid. I've been extremely impressed with the clean roaming from room to room while streaming from Netflix. It's a tweaker's (ahem) dream network.

* I use Amazon referral links and donate the little money to my kids' school. You support charter schools when you use these links.

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2019 Scott Hanselman. All rights reserved.
     

SharpScript from ServiceStack lets you run .NET apps directly from a GitHub Gist!

Aug 15, 2019

Description:

I've blogged about ServiceStack before. It's an extraordinary open source project - an ecosystem of its own even - that is designed to be an alternative to the WCF, ASP.NET MVC, and ASP.NET Web API frameworks. I enjoy it so much I even helped write its tagline "Thoughtfully architected, obscenely fast, thoroughly enjoyable web services for all"

ServiceStack is an easy drop-in that simplifies creating Web Services in any ASP.NET Web App, but also in Self Hosting Console Apps, Windows Services and even Windows and OSX Desktop Apps - supporting both .NET Framework and .NET Core. The easiest way to get started is to create a new project from a ServiceStack VS.NET Template.

ServiceStack has released a new and amazing project that is absolutely audacious in its scope and elegant in its integration with the open source .NET Core ecosystem - #Script (pronounced "sharp script.")

Scripts IN your app!

There are a number of .NET projects that simulate REPL's or allow basic scripting, like "dotnet script" as an example or ScriptCS but I'm deeply impressed with #Script. To start with, #Script is somewhat better suited for scripting than Razor and it doesn't require precompilation. #Script is appropriate for live documents or Email Templates for example.

Here's a basic example of embedding a ScriptContext in your app:

var context = new ScriptContext().Init();
var output = context.EvaluateScript("Time is now: {{ now | dateFormat('HH:mm:ss') }}");

Where ServiceStack's #Script really shines is its use of .NET Core Global Tools. They've nabbed two global tool names - web and app (sassy!) and allow one to create SharpApps. From their site:

Sharp Apps leverages #Script to develop entire content-rich, data-driven websites without needing to write any C#, compile projects or manually refresh pages - resulting in the easiest and fastest way to develop Web Apps in .NET!

The web tool is cross platform and the app global tool is great for Windows as it supports .NET Core Windows Desktop Apps.

Your app IS a script!

You can write interactive SharpScripts or SharpApps that use Chromium as a host.

You can literally run a "desktop" app self contained from a GitHub Gist!

Sharp Apps can also be published to Gists where they can be run on-the-fly without installation; they're always up-to-date, have tiny footprints, are fast to download and launch, and can also run locally, off-line, and cross-platform across Windows, macOS and Linux OS's.

There's also a "gallery" that maps short names to existing examples. So run "app open" to get a list, then "app open name" to run one. You can just "app open blog" and you're running a quick local blog.

SharpApps

Easy to develop and run

The global tools make SharpApp a complete dev and runtime experience because you can just run "app" in the source folder and as you make code changes the hot-reloader updates the site as you Ctrl-S (save) a file!

If you've got the .NET Core SDK installed (it's super quick) then just grab the global tool here (app on Windows and web anywhere else):

dotnet tool install --global app
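And on macOS or Linux, presumably the cross-platform flavor per their naming above:

dotnet tool install --global web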

And if you have an existing .NET Core web app you can launch it and run it in a Chromium Embedded Framework (CEF) browser with "app foo.dll". Check out this example on how to make and run a .NET Core app on the Windows Desktop with #Script.

ServiceStack CEF App

Then you can make a shortcut and add it to the desktop with

app shortcut Acme.dll

Slick!

Code in #Script goes in markdown-style ```code blocks (where Razor uses @{ }), and it uses mustache-style templates. Go try out some of their Starter Projects!

#Script and SharpApps is an extraordinary addition to the .NET Core ecosystem and I'm just touching the surface. Do check out their site at https://sharpscript.net.

What do you think?

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today!


© 2019 Scott Hanselman. All rights reserved.
     

I miss Microsoft Encarta

Aug 13, 2019

Description:

Microsoft Encarta came out in 1993 and was one of the first CD-ROMs I had. It stopped shipping in 2009 on DVD. I recently found a disc and was impressed that it installed just perfectly on my latest Windows 10 machine and runs nicely.

Encarta existed in an interesting place between the rise of the internet and computer's ability to deal with (at the time) massive amounts of data. CD-ROMs could bring us 700 MEGABYTES which was unbelievable when compared to the 1.44MB (or even 120KB) floppy disks we were used to. The idea that Encarta was so large that it was 5 CD-ROMs (!) was staggering, even though that's just a few gigs today. Even a $5 USB stick could hold Encarta - twice!

My kids can't possibly intellectualize the scale that data exists in today. We could barely believe that a whole bookshelf of Encyclopedias was now in our pockets. I spent hours and hours just wandering around random articles in Encarta. The scope of knowledge was overwhelming, but accessible. But it was contained - it was bounded. Today, my kids just assume that the sum of all human knowledge is available with a single search or a "hey Alexa" so the world's mysteries are less mysterious and they become bored by the Paradox of Choice.

image

In a world of 4k streaming video, global wireless, and high-speed everything, there's really no analog to the feeling we got watching the Moon Landing as a video in Encarta - short of watching it live on TV in 1969! For most of us, this was the first time we'd ever seen full-motion video on-demand on a computer in any sort of fidelity - and these are mostly 320x240 or smaller videos!

First Steps on the Moon

A generation of us grew up hearing MLK's "I have a dream" speech inside Microsoft Encarta!

MLK I have a Dream

Remember the Encarta "So, you wanna play some Basketball" Video?

LeBron James from 2003

Amazed by Google Earth? You never saw the globe in Encarta.

Globe in Encarta

You'll be perhaps surprised to hear that the Encarta Timeline works even today across THREE 4k monitors at nearly 10,000 pixels across! This was a product that was written over 10 years ago and could never have conceived of that many pixels. It works great!

The Encarta Timeline across 3 4k monitors

Most folks at Microsoft don't realize that Encarta exists and is used TODAY all over the developing world on disconnected or occasionally connected computers. (Perhaps Microsoft could make the final version of Encarta available for a free final download so that we might avoid downloading illegal or malware infested versions?)

What are your fond memories of Encarta? If you're not of the Encarta generation, what's your impression of it? Had you heard or thought of it?

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today!


© 2019 Scott Hanselman. All rights reserved.
     

The PICO-8 Virtual Fantasy Console is an idealized constrained modern day game maker

Aug 8, 2019

Description:

Animated GIF of PICO-8

I love everything about PICO-8. It's a fantasy gaming console that wants you - and the kids in your life and everyone you know - to make games!

How cool is that?

You know the game Celeste? It's available on every platform, has won every award, and is generally considered a modern-day classic. Well, the first version was made on PICO-8 in 4 days as a hackathon project and you can play it here online. Here's the link from when they launched 4 years ago on the forums. They pushed the limits, as they call out: "We used pretty much all our resources for this. 8186/8192 code, the entire spritemap, the entire map, and 63/64 sounds." How far could one go? Wolf3D even?

"A fantasy console is like a regular console, but without the inconvenience of actual hardware. PICO-8 has everything else that makes a console a console: machine specifications and display format, development tools, design culture, distribution platform, community and playership. It is similar to a retro game emulator, but for a machine that never existed. PICO-8's specifications and ecosystem are instead designed from scratch to produce something that has it's own identity and feels real. Instead of physical cartridges, programs made for PICO-8 are distributed on .png images that look like cartridges, complete with labels and a fixed 32k data capacity."

What a great start and great proof that you can make an amazing game in a small space. If you loved GameBoys and have fond memories of GBA and other small games, you'll love PICO-8.

How to play PICO-8 cartridges

Demon Castle

If you just want to explore, you can go to https://www.lexaloffle.com and just play in your browser! PICO-8 is a "fantasy console" that doesn't exist physically (unless you build one, more on that later). If you want to develop cartridges and play locally, you can buy the whole system (any platform) for $14.99, which I have.

If you have Windows and Chrome or New Edge you can just plug in your Xbox Controller with a micro-USB cable and visit https://www.lexaloffle.com/pico-8.php and start playing now! It's amazing - yes I know how it works but it's still amazing - to me to be able to play a game in a web browser using a game controller. I guess I'm easily impressed.

It wasn't very clear to me how to load and play any cartridge LOCALLY. For example, I can play Demon Castle here on the Forums but how do I play it locally and later, offline?

The easy way is to run PICO-8 and hit ESC to get their command line. Then I type LOAD #cartid where #cartid is literally the id of the cartridge on the forums. In the case of Demon Castle it's #demon_castle-0 so I can just LOAD #demon_castle-0 followed by RUN.
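So a whole session at the PICO-8 prompt is just:

LOAD #DEMON_CASTLE-0
RUN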

Alternatively - and this is just lovely - if I see the PNG pic of the cartridge on a web page, I can just save that PNG locally and save it in C:\Users\scott\AppData\Roaming\pico-8\carts then run it with LOAD demon_castle-0 (or I can include the full filename with extensions). THAT PNG ABOVE IS THE ACTUAL GAME AS WELL. What a clever thing - a true virtual cartridge.

One of the many genius parts of the PICO-8 is that the "Cartridges" are actually PNG pictures of cartridges. Drink that in for a second. They save a screenshot of the game while the cart is running, then they hide the actual code in a steganographic process - they are hiding the code in two of the bits of the color channels! Two bits in each of the four channels is one byte per pixel, and since the cart pics are 160x205 that's 32,800 pixels - enough room for 32k.

A p8 file is source code and a p8.png is the compiled cart!

How to make PICO-8 games

The PICO-8 software includes everything you need - consciously constrained - to make AND play games. You hit ESC to move between the game and the game designer. It includes a sprite and music editor as well.

From their site, the specifications are TIGHT on purpose because constraints are fun. When I wrote for the PalmPilot back in the 90s I had just 4k of heap and it was the most fun I've had in years.

Display - 128x128, 16 colours
Cartridge Size - 32k
Sound - 4 channel chip blerps
Code - Lua
Sprites - 256 8x8 sprites
Map - 128x32 cels

"The harsh limitations of PICO-8 are carefully chosen to be fun to work with, to encourage small but expressive designs, and to give cartridges made with PICO-8 their own particular look and feel."

The code you will use is Lua. Here's some demo code of a Hello World that animates 11 sprites and includes two lines of text:

t = 0
music(0) -- play music from pattern 0

function _draw()
 cls()
 for i=1,11 do -- for each letter
  for j=0,7 do -- for each rainbow trail part
   t1 = t + i*4 - j*2 -- adjusted time
   y = 45-j + cos(t1/50)*5 -- vertical position
   pal(7, 14-j) -- remap colour from white
   spr(16+i, 8+i*8, y) -- draw letter sprite
  end
 end

 print("this is pico-8", 37, 70, 14)
 print("nice to meet you", 34, 80, 12)
 spr(1, 64-4, 90) -- draw heart sprite
 t += 1
end

That's just a simple example; there's a huge forum with thousands of games and lots of folks happy to help you in this new world of game creation with PICO-8. Here's a wonderful PICO-8 Cheat Sheet to print out with a list of functions and concepts. Maybe set it as your wallpaper while developing? There's a detailed User Manual and a 72 page PICO-8 Zine PDF which is really impressive!

And finally, be sure to bookmark this GitHub hosted amazing curated list of PICO-8 resources! https://github.com/pico-8/awesome-PICO-8


Writing PICO-8 Code in another Editor

There is a three-year-old PICO-8 extension for Visual Studio Code that is a decent start, although it was created assuming a Mac, so if you are a Windows user, you will need to change the Keyboard Shortcuts to something like "Ctrl-Shift-Alt-R" to run cartridges. There's no debugger that I can see. In an ideal world we'd use launch.json with a registered PICO-8 type, and that would make launching after changing code a lot clearer.

There is a more recent "pico8vscodeditor" extension by Steve Robbins that includes snippets for loops and some snippets for the Pico-8 API. I recommend this newer fleshed out extension - kudos Steve! Be sure to include the full path to your PICO-8 executable, and note that the hotkey to run is a chord, starting with "Ctrl-8" then "R."

Telling VS-Code about PICO-8

Editing code directly in the PICO-8 application is totally possible and you can truly develop an entire cart in there, but if you do, you're a better person than I. Here's a directory listing in VSCode on the left and PICO-8 on the right.

Directories in PICO-8

And some code.

Editing Pico-8 code

You can export to HTML5 as well as binaries for Windows, Mac, and Linux. It's a full game maker! There are also other game systems out there like PicoLove that take PICO-8 in different directions, and those are worth knowing about as well.

What about a physical PICO-8 Console

A number of folks have talked about the ultimate portable handheld PICO-8 device. I have done a lot of spelunking and as of this writing it doesn't exist.

You could get a Raspberry Pi Zero and put this Waveshare LCD hat on top. The screen is perfect, but the joystick and buttons...just aren't. There's also no sound by default. But $14 is a good start.
The Tiny GamePi15, also from Waveshare, could be good with decent buttons, but it has a 240x240 screen.
The full sized Game Hat looks promising and has a large 480x320 screen, so you could play PICO-8 at a scaled 256x256.
The RetroStone is also close, but you're truly on your own, compiling drivers yourself (twitter thread) from what I can gather.
The ClockworkPI GameShell is SOOOO close, but the screen is 320x240, which makes 128x128 an awkward scaled mess with aliasing, and the screen the Clockwork folks chose doesn't have a true grid of pixels - their pixels are staggered. Hopefully they'll offer an alternative module one day; then this would truly be the perfect device. There are clear instructions on how to get going.
The PocketCHIP has a great screen but a nightmare input keyboard.

For now, any PC, laptop, or Raspberry Pi with a proper setup will do just fine for you to explore PICO-8 and the world of fantasy consoles!

Sponsor: OzCode is a magical debugging extension for C#/.NET devs working in Visual Studio. Get to the root cause of your bugs faster with heads-up display, advanced search inside objects, LINQ query debugging, side-by-side object comparisons & more. Try for free!
© 2019 Scott Hanselman. All rights reserved.
     

Good, Better, Best - creating the ultimate remote worker webcam setup on a budget

Aug 6, 2019

Description:

I've been a remote worker and an occasional YouTuber for well over a decade. I'm always looking for a better setup because the goal is clear: how can I interact with you and my co-workers in a way that has high-enough fidelity that I don't need to drive to Seattle every week?

I believe that if my camera is clear and my audio is clear, then I can really have a remote relationship with my team that is effective and true.

Everyone has a webcam these days and can just get on a video call and have a chat - but is it of sufficient quality that you feel like you're really having a good conversation with folks and truly connecting?

Here's a shot of my setup during a meeting I'm in here at Microsoft:

My setup - webcam and camera

Here are my thoughts on Good, Better, and Best setups for remotes and YouTubers without spending thousands.

Good - High quality video for Webcams and Remotes

The Logitech C270 Webcam can be gotten for as little as $20 or less! It's wholly adequate with enough light. It only does 720p and it's USB2, so I can't enthusiastically recommend it, but it's OK - again, if you throw light at it. In the dark it's just a webcam.

Logitech C270

The Logitech USB Headset H570 is decent, as is the lovely Jabra UC Voice corded headset. I prefer the Jabra because it only covers one ear and doesn't give me the "two covered ears" claustrophobic feeling. To be clear - audio quality matters. Any crappy headset (or quality one, as above) will ALWAYS be better than your webcam's default or your laptop's default. Always. Mics need to be closer to your mouth to sound good.

Small webcam ring light. Light light light. Webcams, especially cheap ones, NEED LIGHT. It feels weird and I get it, but the quality is SO MUCH BETTER with some decent fill light. Get a ring light that's powered by USB and use it on calls. Yes, it looks ridiculous, but it WORKS.

Ringlight

Better

Logitech Brio

How can we improve on the GOOD setup? Clearer video and better sound/sound feel.

Some folks feel the Logitech Brio is overhyped and I think that's fair. It's a "4k" camera that's not as impressive as it should be. That said, it's a solid camera and arguably the best Logitech has to offer.

If I could suggest a middle-of-the-road solid "BETTER" setup for a remote worker, I'd recommend these:

Logitech Brio - solid 1080p 30fps
Logitech USB Headset
LED Light ring

The lights are the magic.

Now, moving beyond USB headsets, I love adding speakerphones - not for the mic, literally for the speaker. I love the Plantronics Portable USB Speakerphone. It requires no drivers; it just shows up as a mic and speaker automatically. I have it front and center in front of my monitor and I use it every day. It makes me feel like my Home Office is a real Office somehow.

Plantronics Calisto 610-M Portable USB Speakerphone

If conversations are private I'll use the headset above for the audio, but when I want the sound to "come from the monitor" I'll SPLIT the audio. This is a pro tip. You can set up the Mic input as the headset mic and the Speaker output as the Speakerphone (or your main speakers). I like using the Speakerphone for voice and keeping the computer's output on the main speakers. Having this separation of voice and computer sounds is a small trick I play on myself, but it helps to create a sense of location where the remote video person comes out of separate speakers.

Selecting Output Speakers

Best

Let's spend a little bit of money, but not so much that we break the bank.

I'm going to make my own webcam. Rather than a plastic off-the-shelf single-purpose webcam, let's take an actual mirrorless camera - the kind you'd take to a photography class - and make it a HIGH QUALITY webcam.

We need a great camera and it needs to support HDMI out. The camera also needs to be able to stay on all day long, not overheat, and it needs to run on AC power (not on battery).

Here's a list of cameras that have clean HDMI out and can stay on all day. You might have one of these cameras in your closet! I like the Sony A6000 and here's its characteristics.

Sony A6000 - I found this on Craigslist for $300.
Max resolution: 1080p and a buttery smooth 60fps
Clean HDMI: Yes
Unlimited runtime: Yes
Connection type: Micro HDMI
Power: Dummy battery
Verified by: Elgato
Notes: Requires dummy battery for power (sold separately). Retains full autofocus with clean HDMI output.

Sony A6000

I need a "dummy battery" for this camera. Turns out this is a whole class of thing you can buy. Who knew? This camera has micro-HDMI, so I need a micro-HDMI to HDMI cable. Now this is just a loose camera, so how will I mount it on my monitor? I like mounting it INSIDE the Ring Light. If you don't want the light you can just get this clamp mount. Or you can do what I did - get the CLAMP, then the LIGHT, and then put the CAMERA in that, like a sandwich.

Flexible Jaws Clamp Clip Mount Holder

This camera, and cameras like it, outputs HDMI, and I need that HDMI fed into my computer in a way that makes the camera look like a regular webcam. The magical device that does this for us is the Elgato CamLink 4k. It's literally a little stick with HDMI input on one end and USB 3 on the other. It took 5 minutes to install. This device also has the added benefit of being a generic "capture card" if you want to record or broadcast your gaming consoles OR other computers!

Elgato CamLink 4k

Here's a YouTube video I made that shows you these cameras, before and after - Good, Better, and BEST!

Logitech Brio vs Sony A6000 with Elgato Camlink 4k - No Contest

What do you think? Thanks to John Miller and Jeff Fritz for their help and guidance!

* I use Amazon referral links and donate the little money to my kids' school. You support charter schools when you use these links.

Sponsor: OzCode is a magical debugging extension for C#/.NET devs working in Visual Studio. Get to the root cause of your bugs faster with heads-up display, advanced search inside objects, LINQ query debugging, side-by-side object comparisons & more. Try for free!


© 2019 Scott Hanselman. All rights reserved.
     

Dotnet Depends is a great text mode development utility made with Gui.cs

Aug 1, 2019

Description:

I love me some text mode. ASCII, ANSI, VT100. Keep your 3D accelerated ray traced graphics and give me a lovely emoji-based progress bar.

Miguel has a nice thing called Gui.cs and I bumped into it in an unexpected and lovely place. There are hundreds of great .NET Global Tools that you can install to make your development lifecycle smoother, and I was installing Martin Björkström's lovely "dotnet depends" tool (go give him a GitHub star now!) like this:

dotnet tool install -g dotnet-depends

Then I headed over to my Windows Terminal (get it free in the Store) and ran "dotnet depends" on my main website's code and was greeted by this (don't sweat the line spacing, that's a Terminal bug that'll be fixed soon):

dotnet depends in the Windows Terminal

How nice is this! It's a fully featured dependency explorer, but it's all in text mode and doesn't require me to use the mouse and take my hands off the keyboard. If I'm already deep into the terminal/text mode, this is a great example of a solid, useful tool.

But how hard was it to make? Surprisingly little work, as his code is very simple. This is a testament to how he used the API and how Miguel designed it. He's separated the UI and the Business Logic, of course. He does the analysis work and stores it in a graph variable.

Here they're setting up some panes for the (text mode) Windows:

Application.Init();

var top = new CustomWindow();

var left = new FrameView("Dependencies")
{
    Width = Dim.Percent(50),
    Height = Dim.Fill(1)
};
var right = new View()
{
    X = Pos.Right(left),
    Width = Dim.Fill(),
    Height = Dim.Fill(1)
};

It's split in half at this point, with the left side staying at 50%.

var orderedDependencyList = graph.Nodes.OrderBy(x => x.Id).ToImmutableList();
var dependenciesView = new ListView(orderedDependencyList)
{
    CanFocus = true,
    AllowsMarking = false
};
left.Add(dependenciesView);
var runtimeDependsView = new ListView(Array.Empty<Node>())
{
    CanFocus = true,
    AllowsMarking = false
};
runtimeDepends.Add(runtimeDependsView);
var packageDependsView = new ListView(Array.Empty<Node>())
{
    CanFocus = true,
    AllowsMarking = false
};
packageDepends.Add(packageDependsView);
var reverseDependsView = new ListView(Array.Empty<Node>())
{
    CanFocus = true,
    AllowsMarking = false
};
reverseDepends.Add(reverseDependsView);

// runtimeDepends, packageDepends, and reverseDepends are FrameViews
// created earlier in his code, much like "left" above
right.Add(runtimeDepends, packageDepends, reverseDepends);
top.Add(left, right, helpText);
Application.Top.Add(top);

The right side gets three ListViews added to it and the left side gets the dependencies view. Top it off with some clean data binding to the views and an initial call to UpdateLists. Anytime the dependenciesView gets a SelectedChanged event we'll call UpdateLists again.

top.Dependencies = orderedDependencyList;
top.VisibleDependencies = orderedDependencyList;
top.DependenciesView = dependenciesView;

dependenciesView.SelectedItem = 0;
UpdateLists();

dependenciesView.SelectedChanged += UpdateLists;

Application.Run();

What's in UpdateLists? Filtering code for that graph variable from before.

void UpdateLists()
{
    var selectedNode = top.VisibleDependencies[dependenciesView.SelectedItem];

    runtimeDependsView.SetSource(graph.Edges.Where(x => x.Start.Equals(selectedNode) && x.End is AssemblyReferenceNode)
        .Select(x => x.End).ToImmutableList());
    packageDependsView.SetSource(graph.Edges.Where(x => x.Start.Equals(selectedNode) && x.End is PackageReferenceNode)
        .Select(x => $"{x.End}{(string.IsNullOrEmpty(x.Label) ? string.Empty : " (Wanted: " + x.Label + ")")}").ToImmutableList());
    reverseDependsView.SetSource(graph.Edges.Where(x => x.End.Equals(selectedNode))
        .Select(x => $"{x.Start}{(string.IsNullOrEmpty(x.Label) ? string.Empty : " (Wanted: " + x.Label + ")")}").ToImmutableList());
}

That's basically it and it's fast as heck. Probably to be expected from the folks that brought you Midnight Commander.
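
If you want to kick the tires on Gui.cs yourself, here's a minimal sketch - assuming the Terminal.Gui NuGet package, which is where Gui.cs lives - showing how little ceremony a text mode app needs:

using Terminal.Gui;

class Program
{
    static void Main()
    {
        Application.Init();

        var win = new Window("Hello Text Mode")
        {
            X = 0,
            Y = 0,
            Width = Dim.Fill(),
            Height = Dim.Fill()
        };

        // A centered button that ends the app when activated
        var quit = new Button("Quit") { X = Pos.Center(), Y = Pos.Center() };
        quit.Clicked += () => Application.RequestStop();
        win.Add(quit);

        Application.Top.Add(win);
        Application.Run(); // blocks until RequestStop() is called
    }
}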

Are you working on any utilities or cool projects where you might want to consider - gasp - text mode over a website?

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!


© 2019 Scott Hanselman. All rights reserved.
     

Docker Desktop for WSL 2 integrates Windows 10 and Linux even closer

Jul 30, 2019

Description:

Being able to seamlessly run Linux on Windows is making a bunch of common development tasks easier. When you're running WSL2 (Windows Subsystem for Linux 2) in a version of Windows 10 greater than build 18945, a BUNCH of useful and interesting scenarios light up and stuff just works.

Docker for Windows (download the Docker Desktop for WSL 2 Tech preview here) is great, but it has historically worked on Windows by creating a Hyper-V virtual machine called Moby that is visible within the Hyper-V client. It's a utility VM, but it's one you're aware of.

Docker for Windows using WSL2

However, if WSL2 runs a real Linux kernel in Windows 10 and it's managed by a virtual machine platform underneath (and not visible to) the Hyper-V client tools, then why not just let WSL2 handle containers for us?

That's exactly what the Docker Desktop WSL 2 Tech Preview aims to do. And just like WSL 2, it's fast.

...the time required to start a Docker daemon after a cold start is significantly faster. It takes less than 2 seconds to start the Docker daemon when compared to tens of seconds in the current version of Docker Desktop.

Once you've got a Linux (Ubuntu or the like) set up in WSL 2, you can right-click on Docker Desktop and click "WSL 2 Tech Preview." This is a goofy and not-super-intuitive UI for now, but it's a moment in time.

Click WSL 2 Tech Preview

Then you just hit Start.

NOTE: If you've already installed Docker within WSL 2 at the command line, stop it and let Docker Desktop manage its lifecycle.

Here's the beginnings of their UI.

Docker for WSL2

When I drop out to PowerShell/CMD on Windows I can run "docker context ls."

C:\Users\Scott\Desktop> docker context ls
NAME DESCRIPTION DOCKER ENDPOINT
default Current DOCKER_HOST based configuration npipe:////./pipe/docker_engine
wsl * Docker daemon hosted in WSL 2 npipe:////./pipe/docker_wsl

You can see there are two contexts, and I've run "docker context use wsl" so that's now my default.

Here is docker images from Ubuntu, and again from Windows (in PowerShell Core). They are the same!

Docker images in Ubuntu Docker images from Powershell

Sweet. Here I am using PowerShell Core (which is open source and cross-platform, natch) to manage my builds which are themselves cross-platform and I can run both a docker build or a metal build on both Windows or Linux, all seamlessly on the same box.

building docker images

Also note, Simon from Docker points out: "We are using a non default dataroot in this mode to avoid corrupting a datastore you use without docker desktop in case something goes wrong. Stopping the docker desktop wsl daemon and restarting the one you installed manually should bring everything back." I noticed this because my "Windows Docker" and my original WSL2 Docker had lists of images that I naively expected to be available here, but this is a new context and a new dataroot, so you may need to fetch images again in this new world if you have historically been an active Docker user.

So far I'm super impressed. Linux on the Windows Desktop feels right. It's Peanut Butter and Chocolate.

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!


© 2019 Scott Hanselman. All rights reserved.
     

Ruby on Rails on Windows is not just possible, it's fabulous using WSL2 and VS Code

Jul 25, 2019

Description:

I've been trying on and off to enjoy Ruby on Rails development on Windows for many years. I was doing Ruby on Windows as long as 13 years ago. There's been many valiant efforts to make Rails on Windows a good experience. However, given that Windows 10 can run Linux with WSL (Windows Subsystem for Linux) and now Windows runs Linux at near-native speeds with an actual shipping Linux Kernel using WSL2, Ruby on Rails folks using Windows should do their work in WSL2.

Running Ruby on Rails on Windows

Get a recent Windows 10

WSL2 will be released later this year but for now you can easily get it by signing up for Windows Insiders Fast and making sure your version of Windows is 18945 or greater. Just run "winver" to see your build number. Run Windows Update and get the latest.

Enable WSL2

You'll want the newest Windows Subsystem for Linux. From a PowerShell admin prompt run this:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

and head over to the Windows Store and search for "Linux" or get Ubuntu 18.04 LTS directly. Download it, run it, make your sudo user.

Make sure your distro is running at max speed with WSL2. From that earlier PowerShell prompt, run wsl --list -v to see your distros and their WSL versions.

C:\Users\Scott\Desktop> wsl --list -v
NAME STATE VERSION
* Ubuntu-18.04 Running 2
Ubuntu Stopped 1
WLinux Stopped 1

You can upgrade any WSL1 distro like this, and once it's done, it's done.

wsl --set-version "Ubuntu-18.04" 2

And certainly feel free to get cool fonts and styles and make yourself a nice shiny Linux experience...maybe with the Windows Terminal.

Get the Windows Terminal

Bonus points, get the new open source Windows Terminal for a better experience at the command line. Install it AFTER you've set up Ubuntu or a Linux and it'll auto-populate its menu for you. Otherwise, edit your profiles.json and make a profile with a commandLine like this:

"commandline" : "wsl.exe -d Ubuntu-18.04"

See how I'm calling wsl -d (for distro) with the short name of the distro?
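
For reference, a full profile entry might look something like this (the guid here is just a placeholder - yours only needs to be unique):

{
    "guid": "{3745d908-6a8a-4a6e-a0c2-9e42ac80fa0e}",
    "name": "Ubuntu-18.04",
    "commandline": "wsl.exe -d Ubuntu-18.04"
}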

Ubuntu in the Terminal Menu

Since I have a real Ubuntu environment on Windows I can just follow these instructions to set up Rails!

Set up Ruby on Rails

Ubuntu instructions work because it is Ubuntu! https://gorails.com/setup/ubuntu/18.04

Additionally, I can install as many Linuxes as I want, even a Dev vs. Prod environment if I like. WSL2 is much lighter weight than a full Virtual Machine.

Once Rails is set up, I'll try making a new hello world:

rails new myapp

and here's the result!

Ruby on Rails in the new Windows Terminal

I can also run "explorer.exe ." and launch Windows Explorer and see and manage my Linux files. That's allowed now in WSL2 because it's running a Plan9 server for file access.

Ubuntu files inside Explorer on Windows 10

Install VS Code and the VS Code Remote Extension Pack

I'm going to install the VS Code Remote Extension pack so I can develop from Windows on remote machines OR in WSL or a Container directly. I can click the lower left corner of VS Code or check the Command Palette for this list of menu items. Here I can "Reopen Folder in WSL" and pick the distro I want to use.

Remote options in VS Code

Now that I've opened the folder for development in WSL, look closely at the lower left corner. You can see I'm in WSL development mode AND Visual Studio Code is recommending I install a Ruby VS Code extension...inside WSL! I don't even have Ruby and Rails on Windows. I'm going to have the Ruby language servers and VS Code headless parts live in WSL - in Linux - where they'll be the most useful.

Ruby inside WSL

This synergy, this balance between Windows (which I enjoy) and Linux (whose command line I enjoy) has turned out to be super productive. I'm able to do all the work I want - Go, Rust, Python, .NET, Ruby - and move smoothly between environments. There's not a clear separation like there is with the "run it in a VM" solution. I can access my Windows files from /mnt/c from within Linux, and I can always get to my Linux files at \\wsl$ from within Windows.

Note that I'm running rails server -b=0.0.0.0 to bind on all available IPs, and this makes Rails available to "localhost" so I can hit the Rails site from Windows! It's my machine, so it's my localhost (the networking complexities are handled by WSL2).

$ rails server -b=0.0.0.0
=> Booting Puma
=> Rails 6.0.0.rc2 application starting in development
=> Run `rails server --help` for more startup options
Puma starting in single mode...
* Version 3.12.1 (ruby 2.6.2-p47), codename: Llamas in Pajamas
* Min threads: 5, max threads: 5
* Environment: development
* Listening on tcp://0.0.0.0:3000
Use Ctrl-C to stop

Here it is in new Edge (chromium). So this is Ruby on Rails running in WSL, as browsed to from Windows, using the new Edge with Chromium at its heart. Cats and dogs, living together, mass hysteria.

Ruby on Rails on Windows from WSL

Even better, I can install the ruby-debug-ide gem inside WSL and now I'm doing interactive debugging from VS Code, but again, note that the "work" is happening inside WSL.

Debugging Rails on Windows

Enjoy!

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2019 Scott Hanselman. All rights reserved.
     

System.Text.Json and new built-in JSON support in .NET Core

Jul 23, 2019

Description:

In a world where JSON (JavaScript Object Notation) is everywhere it's long been somewhat frustrating that .NET didn't have built-in JSON support. JSON.NET is great and has served us well but it's remained a 3rd party dependency for basic stuff like an ASP.NET web site or a simple console app.

Back in 2018 plans were announced to move JSON into .NET Core 3.0 as an intrinsic supported feature, and while they're at it, get double the performance or more with Span<T> support and no memory allocations. ASP.NET in .NET Core 3.0 removes the JSON.NET dependency but still allows you to add it back in a single line if you'd like.
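
For reference, that opt-back-in is a single call in ConfigureServices - it lives in the Microsoft.AspNetCore.Mvc.NewtonsoftJson package:

services.AddControllers().AddNewtonsoftJson();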

NOTE: This is all automatic and built in with .NET Core 3.0, but if you're targeting .NET Standard or .NET Framework, install the System.Text.Json NuGet package (make sure to include previews and install version 4.6.0-preview6.19303.8 or higher). In order to get the integration with ASP.NET Core, you must target .NET Core 3.0.

It's very clean as well. Here's a simple example.

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

namespace verysmall
{
    class WeatherForecast
    {
        public DateTimeOffset Date { get; set; }
        public int TemperatureC { get; set; }
        public string Summary { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            var w = new WeatherForecast() { Date = DateTime.Now, TemperatureC = 30, Summary = "Hot" };
            Console.WriteLine(JsonSerializer.Serialize<WeatherForecast>(w));
        }
    }
}

The default options result in minified JSON as well.

{"Date":"2019-07-27T00:58:17.9478427-07:00","TemperatureC":30,"Summary":"Hot"}

Of course, when you're returning JSON from a Controller in ASP.NET it's all automatic and with .NET Core 3.0 it'll automatically use the new System.Text.Json unless you override it.

Here's an example where we pull out some fake Weather data (5 randomly created reports) and return the array.

[HttpGet]
public IEnumerable<WeatherForecast> Get()
{
    var rng = new Random();
    return Enumerable.Range(1, 5).Select(index => new WeatherForecast
    {
        Date = DateTime.Now.AddDays(index),
        TemperatureC = rng.Next(-20, 55),
        Summary = Summaries[rng.Next(Summaries.Length)]
    })
    .ToArray();
}

The application/json content type is used and JSON is returned by default. If the return type were just string, we'd get text/plain. Check out this YouTube video to learn more about how System.Text.Json works and how it was designed. I'm looking forward to working with it more!

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2019 Scott Hanselman. All rights reserved.
     

Installing PowerShell with one line as a .NET Core global tool

Jul 18, 2019

Description:

I'm a huge fan of .NET Core global tools. I've done a podcast on Global Tools. Just like Node and other platforms have global tools that can be easily and quickly installed and then used in build scripts, CI/CD (Continuous Integration/Continuous Deployment) systems, or just as general command line utilities, .NET Global Tools are easily made (by you!) and distributed via NuGet.

Some cool examples (and there are hundreds) are the "Try .NET" workshop runner and creator that you can use to make interactive documentation, or coverlet for code coverage. There's a great and growing list of .NET Core Global Tools on GitHub.

If you've got the .NET SDK installed you can try out a global tool just like this.

dotnet tool install -g dotnetsay

Then run this example with "dotnetsay" - it's fun.

Stepping back a moment, you may be familiar with PowerShell. It's a scripting language and a command line shell like Bash or DOS or the Windows Command Prompt. You may think of PowerShell as a tool for maintaining and managing Windows Servers.

However in recent years, PowerShell has gone cross platform and runs most anywhere. It's lightweight and has .NET Core at its, ahem, core. You can use PowerShell for scripting systems on any platform and if you're a .NET developer the team has made installing and immediately using PowerShell in scripts a one liner - which is genius. It's PowerShell as a .NET Global Tool.

Here's example output from my system running Ubuntu. I just ran "dotnet tool install --global PowerShell."

$ dotnet --version
2.1.502
$ dotnet tool install --global PowerShell
You can invoke the tool using the following command: pwsh
Tool 'powershell' (version '6.2.2') was successfully installed.
$ pwsh
PowerShell 6.1.1
https://aka.ms/pscore6-docs
Type 'help' to get help.
PS /mnt/c/Users/Scott/Desktop>
exit

Here I've checked that I have .NET 2.x or above, then I install PowerShell. I can run scripts or I can drop into the interactive shell. Note the PS prompt and my current directory above.
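
Because pwsh is now just a command on your PATH, you can also call it non-interactively from any shell, which is handy in build scripts. A couple of trivial examples (the script name here is made up):

pwsh -c "Get-Date -Format o"
pwsh ./build.ps1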

In fact, PowerShell is so useful as a scripting language when combined with .NET Core that PowerShell has been included as a global tool within the .NET Core 3.0 Preview Docker images since Preview 4. This means you can use PowerShell lines/scripts inside Docker images.

FROM mcr.microsoft.com/dotnet/core/sdk:3.0
RUN pwsh -c Get-Date
RUN pwsh -c "Get-Module -ListAvailable | Select-Object -Property Name, Path"

Being able to easily install PowerShell as a global tool means you can count on it in your scripts, CI/CD systems, or Docker containers. It's also nice to be able to use existing PowerShell scripts cross platform.

I'm impressed with this idea - installing PowerShell itself as a .NET Global Tool. Very clever and useful.

Sponsor: Ossum unifies agile planning, version control, and continuous integration into a smart platform that saves 3x the time and effort so your team can focus on building their next great product. Sign up free.


© 2019 Scott Hanselman. All rights reserved.
     

DragonFruit and System.CommandLine is a new way to think about .NET Console apps

Jul 16, 2019

Description:

There's some interesting stuff quietly happening in the "Console App" world within open source .NET Core right now. Within the https://github.com/dotnet/command-line-api repository are three packages:

System.CommandLine.Experimental
System.CommandLine.DragonFruit
System.CommandLine.Rendering

These are interesting experiments and directions that are exploring how to make Console apps easier to write, more compelling, and more useful.

The one I am the most infatuated with is DragonFruit.

Historically Console apps in classic C look like this:

#include <stdio.h>

int main(int argc, char *argv[])
{
    printf("Hello, World!\n");
    return 0;
}

That first argument argc is the count of the number of arguments you've passed in, and argv is an array of pointers to 'strings,' essentially. The actual parsing of the command line arguments and the semantic meaning of the args you've decided on are totally on you.

C# has done it this way, since always.

static void Main(string[] args)
{
    Console.WriteLine("Hello World!");
}

It's a pretty straight conceptual port from C to C#, right? It's an array of strings. Argc is gone because you can just use args.Length.

If you want to make an app that does a bunch of different stuff, you've got a lot of string parsing before you get to DO the actual stuff your app is supposed to do. In my experience, a simple console app with real, proper command line arg validation can end up with half the code parsing crap and half doing stuff.

myapp.com someCommand --param:value --verbose

The larger question - one that DragonFruit tries to answer - is why doesn't .NET do the boring stuff for you in an easy and idiomatic way?

From their docs, what if you could declare a strongly-typed Main method? This was the question that led to the creation of the experimental app model called "DragonFruit", which allows you to create an entry point with multiple parameters of various types and using default values, like this:

static void Main(int intOption = 42, bool boolOption = false, FileInfo fileOption = null)
{
    Console.WriteLine($"The value of intOption is: {intOption}");
    Console.WriteLine($"The value of boolOption is: {boolOption}");
    Console.WriteLine($"The value of fileOption is: {fileOption?.FullName ?? "null"}");
}

In this concept, the Main method - the entry point - is an interface that can be used to infer options and apply defaults.

using System;

namespace DragonFruit
{
    class Program
    {
        /// <summary>
        /// DragonFruit simple example program
        /// </summary>
        /// <param name="verbose">Show verbose output</param>
        /// <param name="flavor">Which flavor to use</param>
        /// <param name="count">How many smoothies?</param>
        static int Main(
            bool verbose,
            string flavor = "chocolate",
            int count = 1)
        {
            if (verbose)
            {
                Console.WriteLine("Running in verbose mode");
            }
            Console.WriteLine($"Creating {count} banana {(count == 1 ? "smoothie" : "smoothies")} with {flavor}");
            return 0;
        }
    }
}

I can run it like this:

> dotnet run --flavor Vanilla --count 3
Creating 3 banana smoothies with Vanilla

The way DragonFruit does this is super clever. During the build process, DragonFruit changes your public strongly-typed Main to a private one (so it's not seen from the outside and .NET won't consider it an entry point). It's then replaced with a Main like this, though you'll never see it, as it's in the compiled/generated artifact.

public static async Task<int> Main(string[] args)
{
    return await CommandLine.ExecuteAssemblyAsync(typeof(AutoGeneratedProgram).Assembly, args, "");
}

So DragonFruit has swapped your Main for its smarter Main and the magic happens! You'll even get free auto-generated help!

DragonFruit:
DragonFruit simple example program

Usage:
DragonFruit [options]

Options:
--verbose Show verbose output
--flavor <flavor> Which flavor to use
--count <count> How many smoothies?
--version Display version information

If you want less magic and more power, you can use the same APIs DragonFruit uses to make very sophisticated behaviors. Check out the Wiki and Repository for more and perhaps get involved in this open source project!
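
For a taste of that lower-level API, here's a sketch against the System.CommandLine.Experimental preview package. The API surface was still moving at the time of writing, so treat this as illustrative rather than gospel:

using System.CommandLine;
using System.CommandLine.Invocation;
using System.Threading.Tasks;

class Program
{
    static async Task<int> Main(string[] args)
    {
        var rootCommand = new RootCommand("Smoothie maker")
        {
            new Option("--flavor", "Which flavor to use")
            {
                Argument = new Argument<string>(defaultValue: () => "chocolate")
            },
            new Option("--count", "How many smoothies?")
            {
                Argument = new Argument<int>(defaultValue: () => 1)
            }
        };

        // The handler binds options to parameters by name, much like DragonFruit does
        rootCommand.Handler = CommandHandler.Create<string, int>((flavor, count) =>
        {
            System.Console.WriteLine($"Creating {count} smoothies with {flavor}");
        });

        return await rootCommand.InvokeAsync(args);
    }
}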

I really like this idea and I'd love to see it taken further! Have you used DragonFruit on a project? Or are you using another command line argument parser?

Sponsor: Ossum unifies agile planning, version control, and continuous integration into a smart platform that saves 3x the time and effort so your team can focus on building their next great product. Sign up free.


© 2019 Scott Hanselman. All rights reserved.
     

Real World Cloud Migrations: Azure Front Door for global HTTP and path based load-balancing

Jul 11, 2019

Description:

As I've mentioned lately, I'm quietly moving my Website from a physical machine to a number of Cloud Services hosted in Azure. This is an attempt to not just modernize the system - no reason to change things just to change them - but to take advantage of a number of benefits that a straight web host sometimes doesn't have. I want to have multiple microsites (the main page, the podcast, the blog, etc) with regular backups, CI/CD pipeline (check in code, go straight to staging), production swaps, a global CDN for content, etc.

I'm breaking a single machine into a series of small sites BUT I want to still maintain ALL my existing URLs (for good or bad) and the most important one is hanselman.com/blog/ that I now want to point to hanselmanblog.azurewebsites.net.

That means that Azure Front Door will be receiving all the traffic - it's the Front Door! - and then forwarding it on to the Azure Web App. That means:

hanselman.com/blog/foo -> hanselmanblog.azurewebsites.net/foo
hanselman.com/blog/bar -> hanselmanblog.azurewebsites.net/bar
hanselman.com/blog/foo/bar/baz -> hanselmanblog.azurewebsites.net/foo/bar/baz

There's a few things to consider when dealing with reverse proxies like this and I've written about that in detail in this article on Dealing with Application Base URLs and Razor link generation while hosting ASP.NET web apps behind Reverse Proxies.

You can and should read in detail about Azure Front Door here.

It's worth considering a few things. Front Door MAY be overkill for what I'm doing because I have a small, modest site. Right now I've got several backends, but they aren't yet globally distributed. If I had a system with lots of regions and lots of App Services all over the world AND a lot of static content, Front Door would be a perfect fit. Right now I have just a few App Services (Backends in this context) and I'm using Front Door primarily to manage the hanselman.com top level domain and manage traffic with URL routing.

On the plus side, that might mean Azure Front Door was exactly what I needed. It was super easy to set up Front Door, as there's a visual Front Door Designer. It took less than 20 minutes to get it all routed, and SSL certs took just a few hours more. You can see below that I associated staging.hanselman.com with two Backend Pools. This UI in the Azure Portal is (IMHO) far easier than the Azure Application Gateway's. Additionally, Front Door is Global while App Gateway is Regional. If you were a massive global site, you might put Azure Front Door in, ahem, front, and Azure App Gateway behind it, regionally.

Again, a little overkill as my pools are pools of one, but it gives me room to grow. I could easily balance traffic globally in the future.

CONFUSION: In the past with my little startup I've used Azure Traffic Manager to route traffic to several App Services hosted all over the globe. When I heard of Front Door I was confused, but it seems that Traffic Manager is mostly global DNS load balancing for any network traffic, while Front Door is Layer 7 load balancing for HTTP traffic that uses a variety of rules to route traffic. Azure Front Door can also act as a CDN and cache all your content as well. There's lots of detail on Front Door's routing architecture details and traffic routing methods. Azure Front Door is definitely the most sophisticated and comprehensive system for fronting all my traffic. I'm still learning what the right size app for it is, and I'm not sure a blog is the ideal example app.

Here's how I set up /blog to hit one Backend Pool. I have it accepting both HTTP and HTTPS. Originally I had a few extra Front Door rules - one for HTTP, one for HTTPS - and I set the HTTP one to redirect to HTTPS. However, Front Door charges 3 cents an hour for each of the first 5 routing rules (then about a penny an hour for each after 5), and I don't (personally) think I should pay for what I consider "best practice" rules. That means forcing HTTPS (an internet standard these days) as well as URL canonicalization with a trailing slash after paths - /blog should 301 to /blog/, etc. These are simple prescriptive things that everyone should be doing. If I were putting a legacy app behind a Front Door, then this power and flexibility in path control would be a boon that I'd be happy to pay for. But in these cases I may be able to have that redirection work done lower down in the app itself and save money every month. I'll update this post if the pricing changes.

/blog hits the Blog App Service

After I set up Azure Front Door I noticed my staging blog was getting hit every few seconds, all day, forever. I realized there are some health checks, but since there are 80+ Azure Front Door locations and they are all checking the health of my app, it was adding up to a lot of traffic. For a large app, you need these health checks to make sure traffic fails over and you really know if your app is healthy. For my blog, less so.

There are a few ways to tell Front Door to chill. First, I don't need Azure Front Door doing GET requests on /. I can instead ask it to check something lighter weight. With ASP.NET Core 2.2 it's as easy as adding HealthChecks. It's much easier, less traffic, and you can make the health check as comprehensive as you want.

app.UseHealthChecks("/healthcheck");
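
Note that for that middleware to work, the health check services also need to be registered; a one-line sketch of the ConfigureServices side:

services.AddHealthChecks();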

Next I turned the Interval WAY up so it wouldn't bug me every few seconds.

Interval set to 255 seconds

These two small changes made a huge difference in my traffic as I didn't have so much extra "pinging."

After setting up Azure Front Door, I also turned on Custom Domain HTTPS and pointed staging to it. It was very easy to set up and was included in the cost.

Custom Domain HTTPS

I haven't decided if I want to set up Front Door's caching or not, but it might mean an easier, more central way than using a CDN manually and changing the URLs for my sites' static content and images. In fact, the POP (Point of Presence) locations for Front Door are the same as those for Azure CDN.

NOTE: I will have to at some point manage the Apex/Naked domain issue where hanselman.com and www.hanselman.com both resolve to my website. It seems this can be handled by either CNAME flattening or DNS chasing, and I need to check with my DNS provider to see if this is supported. I suspect I can do it with an ALIAS record. Barring that, Azure also offers an Azure DNS hosting service.

There is another option I haven't explored yet called Azure Application Gateway that I may test out and see if it's cheaper for what I need. I primarily need SSL cert management and URL routing.

I'm continuing to explore as I build out this migration plan. Let me know your thoughts in the comments.

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today


© 2019 Scott Hanselman. All rights reserved.
     

Dealing with Application Base URLs and Razor link generation while hosting ASP.NET web apps behind Reverse Proxies

Jul 9, 2019

Description:

Updating my site to run on Azure

I'm quietly moving my Website from a physical machine to a number of Cloud Services hosted in Azure. This is an attempt to not just modernize the system - no reason to change things just to change them - but to take advantage of a number of benefits that a straight web host sometimes doesn't have. I want to have multiple microsites (the main page, the podcast, the blog, etc) with regular backups, CI/CD pipeline (check in code, go straight to staging), production swaps, a global CDN for content, etc.

I'm also moving from an ASP.NET 4 (was ASP.NET 2 until recently) site to ASP.NET Core 2.x LTS and changing my URL structure. I am aiming to save money but I'm not doing this as a "spend basically nothing" project. Yes, I could convert my site to a static HTML generated blog using any number of great static site generators, or even a Headless CMS. Yes I could host it in Azure Storage fronted by a CMS, or even as a series of Azure Functions. But I have 17 years of content in DasBlog, I like DasBlog, and it's being actively updated to .NET Core and it's a fun app. I also have custom Razor sites in the form of my podcast site and they work great with a great workflow. I want to find a balance of cost effectiveness, features, ease of use, and reliability.  What I have now is a sinking feeling like my site is gonna die tomorrow and I'm not ready to deal with it. So, there you go.

Currently my sites live on a real machine with real folders, and it's fronted by IIS on a Windows Server. There's an app (an IIS Application, to be clear) living at \ so that means hanselman.com/ hits / which is likely c:\inetpub\wwwroot, full stop.

For historical reasons, when you hit hanselman.com/blog/ you're hitting the /blog IIS Application which could be at d:\whatever but may be at c:\inetpub\wwwroot\blog or even at c:\blog. Who knows. The Application and ASP.NET within it knows that the site is at hanselman.com/blog.

That's important, since I may write a URL like ~/about when writing code. If I'm in the hanselman.com/blog app, then ~/about means hanselman.com/blog/about. If I write /about, that means hanselman.com/about. So the ~ is a shorthand for "starting at this App's base URL." This is great and useful and makes link generation super easy, but it only works if your app knows what its server-side base URL is.

To be clear, we are talking about the reality of the generated URL that's sent to and from the browser, not about any physical reality on the disk or server or app.

I've moved my world to three Azure App Services called hanselminutes, hanselman, and hanselmanblog. They have names like http://hanselman.azurewebsites.net for example.

ASIDE: You'll note that hitting hanselman.azurewebsites.net will hit an app that looks stopped. I don't want that site to serve traffic from there; I want it to be served from http://hanselman.com, right? Specifically only from Azure Front Door, which I'll talk about in another post soon. So I'll use the Access Restrictions and Software Based Networking in Azure to deny all traffic to that site, except traffic from Azure - in this case, from the Azure Front Door Reverse Proxy I'll be using.

That looks like this in this Access Restrictions part of the Azure Portal.

Only allowing traffic from Azure

Since the hanselman.com app will point to hanselman.azurewebsites.net (or one of its staging slots) there's no issue with URL generation. If I say / I mean /, the root of the site. If I generate a URL like "~/about" I'll get hanselman.com/about, right?

But with http://hanselmanblog.azurewebsites.net it's different.

I want hanselman.com/blog/ to point to hanselmanblog.azurewebsites.net.

That means that Azure Front Door will be receiving traffic, then forwarding it on to the Azure Web App. That means:

hanselman.com/blog/foo -> hanselmanblog.azurewebsites.net/foo
hanselman.com/blog/bar -> hanselmanblog.azurewebsites.net/bar
hanselman.com/blog/foo/bar/baz -> hanselmanblog.azurewebsites.net/foo/bar/baz

There's a few things to consider when dealing with reverse proxies like this.

Is part of the /path being removed or is a path being added?

In the case of DasBlog, we have a configuration setting so that the app knows where it LOOKS like it is, from the Browser URL's perspective.

My blog is at /blog so I add that in some middleware in my Startup.cs. Certainly YOU don't need to have this in config - do whatever works for you as long as context.Request.PathBase is set as the app should see it. I set this very early in my pipeline.

That if statement is there because most folks don't install their blog at /blog, so it doesn't add the middleware.

//if you've configured it at /blog or /whatever, set that pathbase so ~ will generate correctly
Uri rootUri = new Uri(dasBlogSettings.SiteConfiguration.Root);
string path = rootUri.AbsolutePath;

//Deal with path base and proxies that change the request path
if (path != "/")
{
app.Use((context, next) =>
{
context.Request.PathBase = new PathString(path);
return next.Invoke();
});
}

Sometimes you want the OPPOSITE of this. That would mean that I wanted, perhaps, hanselman.com to point to hanselman.azurewebsites.net/blog/. In that case I'd do this in my Startup.cs's Configure method, since UsePathBase is middleware:

app.UsePathBase("/blog");

Be aware that if you're hosting ASP.NET Core apps behind Nginx or Apache or really anything, you'll also want ASP.NET Core to respect X-Forwarded-For and the other X-Forwarded standard headers. You'll also likely want the app to refuse to speak to anyone who isn't on a certain list of proxies or configured URLs.

I configure these in Startup.cs's ConfigureServices from a semicolon delimited list in my config, but you can do this in a number of ways.

services.Configure<ForwardedHeadersOptions>(options =>
{
    options.ForwardedHeaders = ForwardedHeaders.All;
    options.AllowedHosts = Configuration.GetValue<string>("AllowedHosts")?.Split(';').ToList<string>();
});

Since Azure Front Door adds these headers as it forwards traffic, from my app's point of view it "just works" once I've added the above, plus this in Configure():

app.UseForwardedHeaders();

There seems to be some confusion on hosting behind a reverse proxy in a few GitHub Issues. I'd like to see my scenario ( /foo -> / ) be a single line of code, as we see that the other scenario ( / -> /foo ) is a single line.

Have you had any issues with URL generation when hosting your Apps behind a reverse proxy?

Sponsor: Develop Xamarin applications without difficulty with the latest JetBrains Rider: Xcode integration, JetBrains Xamarin SDK, and manage the required SDKs for Android development, all right from the IDE. Get it today


© 2019 Scott Hanselman. All rights reserved.
     

Real World Cloud Migrations: CDNs are an easy improvement to legacy apps

Jul 4, 2019

Description:

I'm doing a quiet backend migration/update to my family of sites. If I do it right, there will be minimal disruption. Even though I'm a one person show (plus Mandy my podcast editor) the Cloud lets me manage digital assets like I'm a whole company with an IT department. Sure, I could FTP files directly into production, but why do that when I've got free/cheap stuff like Azure DevOps.

As I'm one person, I do want to keep costs down whenever possible, and I've said so in my "Penny Pinching in the Cloud" series. However, I do pay for value, so I'll keep costs down where I can but pay for things I feel are useful, or that provide me with a quality product and/or save me time and hassle. For example, Azure Pipelines gives one free Microsoft-hosted CI/CD pipeline and 1 hosted job with 1800 minutes a month. This should be more than enough for my setup.

Additionally, sometimes I'll shift costs in order to both save money and improve performance as when I moved my Podcast's image hosting over to an Azure CDN in 2013. Fast forward 6 years and since I'm doing this larger migration, I wanted to see how easy it would be to migrate even more of my static assets to a CDN.

Make a Storage Account and ensure that its Access Level is Public if you intend folks to be able to read from it over HTTP GETs.

Public Access Level

Within the Azure Portal you can make a CDN Profile that uses Microsoft, Akamai, or Verizon for the actual Content Distribution Network. I picked Microsoft's because I have experience with Akamai and Verizon (they were solid) and I wanted to see how the new Microsoft one does. It claims to put content within 50ms of 60 countries.

You'll have a CDN endpoint host name that points to an Origin. That Origin is the Origin of your content. The Origin can be an existing place you have stuff so the CDN will basically check there first, cache things, then serve the content. Or it can be an existing WebApp or in my case, Storage.

Azure Storage Accounts for my blog

I didn't want to make things too complex, so I have a single Storage Account with a few containers. Then I mapped custom endpoints for both my blog AND my podcast, but they share the same storage. I also took advantage of the free SSL Certs so images.hanselman.com and images.hanselminutes.com are both SSL. Gotta get those "A" grades from https://securityheaders.com right?

Custom domains for my CDN

Then, just grab the Azure Storage Explorer. It's free and cross platform. In fact, you can get Storage Explorer as an app that runs locally, or you can check it out in the Azure Portal/in-browser as well, here. I've uploaded all my main assets (images used in my CSS, blog, headers, etc).

I've settled on a basic scheme where anything that was "/images/foo.png" is now "https://images.hanselman.com/blog/foo.png" or /main/ or /podcast/ depending on who is using it. This made search and replaces reliable and easy.

It's truly a less-than-an-hour operation to enable a CDN on an existing site. While I've chosen to use Azure Storage as my backing store, you can just use your existing site's /images folder and change your markup to pull from the CDN's URL. I recommend you make a simple cdn.yourdomain.com or images.yourdomain.com. This can also easily be enabled with a CNAME in less than an hour. Having your images at another URL via a subdomain CNAME also allows for even more download parallelism from the browser.

This is all in my Staging so you won't see it quite yet until I flip the switch on the whole migration.

Exploring future possibilities for dynamic image content

I haven't yet moved all my blog content images (a few gigs, but many thousands of images) to a CDN, as I don't want to change my existing publishing workflow with Open Live Writer. I'm considering a flow that keeps images uploading to the Web App but then uses the Custom Origin Path options in the Azure CDN, so images get picked up from the Web App and served by the CDN. In order to manage 17 years of images though, I'd need to catch URLs like this:

https://www.hanselman.com/blog/content/binary/Windows-Live-Writer/0409a9d5fae6_F552/image_9.png

and redirect them to

https://images.hanselman.com/blog/content/binary/Windows-Live-Writer/0409a9d5fae6_F552/image_9.png

As I see it, there's a few options to get images from a GET of /blog/content/binary to be served by a CDN:

Dynamically Rewrite the URLs on the way OUT (as the HTML is generated) CPU expensive, but ya it's cached in my WebApp Rewriting Middleware to redirect (301?) requests to the new images location Easiest option, but costs everyone a 301 on all image GETs Programmatically change the stored markup (basically forloop over my "database," search and replace, AND ensure future images use this new URL) A hassle, but a one time hassle Not sure about future images, I might have to change my publishing flow AND run the process on new posts occasionally.

What are your thoughts on if I should go all the way and manage EVERY blog image vs the low hanging fruit of shared static assets? Worth it, or overkill?

The learning process continues!

Sponsor: Seq delivers the diagnostics, dashboarding, and alerting capabilities needed by modern development teams - all on your infrastructure. Download now.
© 2019 Scott Hanselman. All rights reserved.
     

Real World Cloud Migrations: Moving a 17 year old series of sites from bare metal to Azure

Jul 2, 2019

Description:

Technical Debt has a way of sneaking up on you. While my podcast site and the other 16ish sites I run all live in Azure and have a nice CI/CD pipeline with Azure DevOps, my main "Hanselman.com" series of sites and mini-sites has lagged behind. I'm still happy with its responsive design, but the underlying tech has started to get more difficult to manage and build and I've decided it's time to make some updates.

Moving sites to Azure DevOps

I want to be able to make these updates and have a clean switch-over so that you, the reader, don't notice a difference. There are a number of things to think about when doing any migration like this, realizing it'll take some weeks (or months if you're a bigger company than just me).

Continuous Deployment/Continuous Integration
I host my code on GitHub and Azure DevOps now lets you log in with GitHub and does a fine job of building AND deploying your code (while running tests AND allowing for manual quality gates), so I want to make sure my sites have a nice clean "check in and go live" process. I'll also be using Azure App Services and Deployment Slots, so I'll have a dev/test/staging site and production, like a real professional. No more editing text files in production. Well, at least, I won't tell you when I'm editing text files in production.

Technology Update
Hanselman.com proper (not the blog) and the mini pages/sites underneath it run on ASP.NET 4.0 and WebForms. I was able to easily move the main site over to ASP.NET Razor Pages. Razor is just so elegant, as it's basically just HTML, then you type @ and you're in C# (Razor). More on that below, but the upgrade took a day, as the home page and minisites are largely readonly. The Blog, hosted at /blog, will be more challenging given I don't want to break two decades of URLs, along with the fact that it's running DasBlog on a recently upgraded .NET 4.0. DasBlog was originally made in .NET 1, then upgraded to .NET 2, so this is 17 years of technical debt. That said, .NET Standard along with open source cross-platform .NET Core has allowed us - with the leadership of Mark Downie - to create DasBlog Core. DasBlog Core shares the core reliable (if crusty) engine of DasBlog along with an all new system of URL writing using ASP.NET Core middleware, as well as a complete re-do of the (well ahead of its time) DasBlog Theming Engine, now based on Razor Pages. It's brilliant. This is in active development.

Azure Front Door
Because I'm moving from a single machine running IIS to Azure, I'll want to split things apart to remove single points of failure. I'll use Azure Front Door to manage my URL structure and act as a front end cache, as well as distribute traffic to multiple Azure App Services (Web Apps).

URL management
Are you changing your URLs and URL structure? Remember that URLs are UI and they matter. I've long wanted to remove the "aspx" extension from my URLs, as well as move the TitleCaseBlogPostThing to a more "modern" title-case-blog-post-thing style. I need to do this in a way that updates my Google sitemap, breaks zero URLs, 301 redirects to the new style, and uses rel=canonical in a smart way.

Shared Assets/CDNs/Front Door
Since I run a family of sites, there's an opportunity to use a CDN as well, with some clean CNAME DNS such that images.hanselman.com and images.hanselminutes.com can share assets. Since the Azure CDN is easy to set up and offers free SSL certs and pay-as-you-go pricing, I'll set both of those CNAMEs up to point to the same Azure Storage where I'll keep images, show pics, CSS, and JS.

I'll be blogging the whole process. What do you want to hear/learn about?

Sponsor: Seq delivers the diagnostics, dashboarding, and alerting capabilities needed by modern development teams - all on your infrastructure. Download now.


© 2019 Scott Hanselman. All rights reserved.
     

Git is case-sensitive and your filesystem may not be - Weird folder merging on Windows

Jun 27, 2019

Description:

I was working on DasBlog Core (a .NET Core cross-platform update of the ASP.NET WebForms-based blogging software that runs this blog) with Mark Downie, the new project manager, and Shayne Boyer. This is part of a larger cloud re-architecture of hanselman.com and the systems that run this whole site.

Shayne was working on getting a DasBlog Core CI/CD (Continuous Integration/Continuous Deployment) pipeline running in Azure DevOps' build system. We wanted individual build pipelines to confirm that DasBlog Core was, in fact, cross-platform, so we needed to build, test, and run it on Windows, Linux, and Mac.

The build was working great on Windows and Mac...but failing on Linux. Why?

Well, like all things, it's complex.

Windows has a case-insensitive file system. By default, Mac uses a case-insensitive file system.

Since Git 1.5ish there's been a setting

git config --global core.ignorecase true

but you should always be aware of what a setting does before you just set it.

If you're not careful, you or someone on your team can create a case-sensitive file path in your git index while you're using a case-insensitive operating system like Windows or Mac. If you do this, you can end up with two separate entries from git's perspective. However, Windows will silently merge them and see just one.

Here's our themes folder structure as seen on GitHub.com.

Case insensitive folder names

But when we clone it on Mac or Windows, we see just one folder.

DasBlog as a single folder in VS Code

Turns out that six months ago one of us introduced another folder with the name dasblog while the original was DasBlog. When we checked them out on Mac or Windows the files ended up merged into one folder, but on Linux they were/are two, so the build fails.
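
As an aside, you can spot these collisions before they break a Linux build. Here's a hedged C# sketch (mine, not from the post) that shells out to "git ls-files" and groups the index paths case-insensitively; it assumes git is on your PATH and that you run it from the repo root:

using System;
using System.Diagnostics;
using System.Linq;

class FindCaseCollisions
{
    static void Main()
    {
        var psi = new ProcessStartInfo("git", "ls-files")
        {
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using (var git = Process.Start(psi))
        {
            // Two index entries that differ only by case will merge on
            // Windows/Mac but become two real paths on Linux.
            var collisions = git.StandardOutput.ReadToEnd()
                .Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries)
                .GroupBy(p => p.Trim().ToLowerInvariant())
                .Where(g => g.Count() > 1);
            foreach (var group in collisions)
                Console.WriteLine(string.Join(" <-> ", group));
        }
    }
}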

You can fix this in a few ways. You can rename the file in a case-sensitive way and commit the change:

git mv -f name.txt NAME.TXT

Please take care and back up anything you don't understand.

If you're renaming a directory, you'll do a two-stage rename with a temp name, because the case-insensitive file system considers foo and FOO the same directory.

git mv foo foo2
git mv foo2 FOO
git commit -m "changed case of dir"

Be safe out there!

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!


© 2019 Scott Hanselman. All rights reserved.
     

Adding Reaction Gifs for your Build System and the Windows Terminal

Jun 25, 2019

Description:

So, first, I'm having entirely too much fun with the new open source Windows Terminal. If you've got the latest version of Windows (go run Windows Update and do whatever it takes) then you can download the Windows Terminal from the Microsoft Store! This is a preview release (think v0.2) but it'll automatically update, often, from the Windows Store if you have Windows 10 version 18362.0 or higher.

One of the most fun things is that you can have background images. Even animated gifs! You can add those images in your Settings/profile.json like this.

"backgroundImage": "c:/users/scott/desktop/doug.gif",
"backgroundImageOpacity": 0.7,
"backgroundImageStretchMode": "uniformToFill

The profile.json is just JSON and you can update it. I could even update it programmatically if I wanted to parse it and mess about.
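
For example, here's a hedged C# sketch (mine, not from the post) that flips the background gif using Newtonsoft.Json; the settings path is an assumption based on the Store package name, so adjust it for your install:

using System;
using System.IO;
using Newtonsoft.Json.Linq;

class SetTerminalBackground
{
    static void Main()
    {
        // Assumed location of the preview Terminal's settings; verify on your box.
        var path = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            @"Packages\Microsoft.WindowsTerminal_8wekyb3d8bbwe\LocalState\profiles.json");
        var settings = JObject.Parse(File.ReadAllText(path));
        var profile = settings["profiles"][0]; // first profile, for demonstration
        profile["backgroundImage"] = @"c:/users/scott/desktop/doug.gif";
        profile["backgroundImageOpacity"] = 0.7;
        File.WriteAllText(path, settings.ToString());
    }
}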

BUT. Enterprising developer Chris Duck created a lovely PowerShell Module called MSTerminalSettings that lets you very easily make Profile changes with script.

For example, Mac developers who use iTerm often go to https://iterm2colorschemes.com/ and get new color schemes for their consoles. Now Windows folks can as well!

From his docs, this example downloads the Pandora color scheme from https://iterm2colorschemes.com/ and sets it as the color scheme for the PowerShell Core terminal profile.

Invoke-RestMethod -Uri 'https://raw.githubusercontent.com/mbadolato/iTerm2-Color-Schemes/master/schemes/Pandora.itermcolors' -OutFile .\Pandora.itermcolors
Import-Iterm2ColorScheme -Path .\Pandora.itermcolors -Name Pandora
Get-MSTerminalProfile -Name "PowerShell Core" | Set-MSTerminalProfile -ColorScheme Pandora

That's easy! Then I was talking to Tyler Leonhardt and suggested that we programmatically change the background using a folder full of Animated Gifs. I happen to have such a folder (with 2000 categorized gif classics) so we started coding and streamed the whole debacle on Tyler's Twitch!

The result is Windows Terminal Attract Mode and it's a hot mess and it is up on GitHub and all set up for PowerShell Core.

Remember that "Attract mode" is the mode an idle arcade cabinet goes into in order to attract passersby to play, so clearly the Terminal needs this also.

./AttractMode.ps1 -name "profile name" -path "c:\temp\trouble" -secs 5

It's a proof of concept for now, and it's missing background/runspace support, being wrapped up in a proper module, etc., but the idea is solid, building on a solid base, with improvements to idiomatic PowerShell Core already incoming. Right now it'll run forever. Wrap it in Start-Job if you like and can stand it.

I've made aliases so the new Windows Terminal shows REACTION GIFS for my build system and tests! pic.twitter.com/jpPSsrUoSO

— Scott Hanselman (@shanselman) June 28, 2019

The next idea was to have reaction gifs for different developer situations. Break the build? Reaction Gif. Passing tests? Reaction Gif.

Here's a silly proof (not refactored) that aliases "dotnet build" to "db" with reactions.

# messing around with build reaction gifs

Function DotNetAlias {
    dotnet build
    if ($?) {
        Start-Job -ScriptBlock {
            d:\github\TerminalAttractMode\SetMoodGif.ps1 "PowerShell Core" "D:\Dropbox\Reference\Animated Gifs\chrispratt.gif"
            Start-Sleep 1.5
            d:\github\TerminalAttractMode\SetMoodGif.ps1 "PowerShell Core" "D:\Dropbox\Reference\Animated Gifs\4003cn5.gif"
        } | Out-Null
    }
    else {
        Start-Job -ScriptBlock {
            d:\github\TerminalAttractMode\SetMoodGif.ps1 "PowerShell Core" "D:\Dropbox\Reference\Animated Gifs\idk-girl.gif"
            Start-Sleep 1.5
            d:\github\TerminalAttractMode\SetMoodGif.ps1 "PowerShell Core" "D:\Dropbox\Reference\Animated Gifs\4003cn5.gif"
        } | Out-Null
    }
}

Set-Alias -Name db -Value DotNetAlias

I added the Start-Job stuff so that the build finishes and the Terminal returns control to you while the gifs are still updating. Runspace support would be smart as well.

Some other ideas? Giphy support. Random mood gifs. Pick me ups. You get the idea.

Later, Brandon Olin jumped in with this gem. Why not get a reaction gif if anything goes wrong in your last command? ERRORLEVEL 1? Explode.

Why are we doing this? Because it sparks joy, y'all.

Sponsor: Looking for a tool for performance profiling, unit test coverage, and continuous testing that works cross-platform on Windows, macOS, and Linux? Check out the latest JetBrains Rider!


© 2019 Scott Hanselman. All rights reserved.
     

You can now download the new Open Source Windows Terminal

Jun 20, 2019

Description:

Last month Microsoft announced a new open source Windows Terminal! It's up at https://github.com/microsoft/Terminal and it's great, but for the last several weeks you've had to build it yourself as a Developer. It's been very v0.1 if you know what I mean.

Today you can download the Windows Terminal from the Microsoft Store! This is a preview release (think v0.2) but it'll automatically update, often, from the Windows Store if you have Windows 10 version 18362.0 or higher. Run "winver" to make sure.

Windows Terminal

If you don't see any tabs, hit Ctrl-T and note the + and the pull down menu at the top there. Under the menu go to Settings to open profiles.json. Here's mine on one machine.

Here's some Hot Windows Terminal Tips

You can do background images, even animated, with opacity (with useAcrylic off):

"backgroundImage": "c:/users/scott/desktop/doug.gif",
"backgroundImageOpacity": 0.7,
"backgroundImageStretchMode": "uniformToFill

You can edit the key bindings to your taste in the "key bindings" section. For now, be specific, so the * might be expressed as Ctrl+Shift+8, for example.

Try other things like cursor shape and color, history size, as well as different fonts for each tab.

"cursorShape": "vintage"

If you're using WSL or WSL2, use the distro name like this in your new profile's "commandline":

"commandline": "wsl.exe -d Ubuntu-18.04"

If you like Font Ligatures or use Powerline, consider Fira Code as a potential new font.

I'd recommend you PIN Terminal to your taskbar and Start Menu, but you can run Windows Terminal with the command "wt" from Win+R or from another console. That's just "wt" and Enter!

Try not just "Ctrl+Mouse Scroll" but also "Ctrl+Shift+Mouse Scroll" and get your whole life!

Remember that the definition of a shell is somewhat fluid, so check out Azure Cloud Shell, in your terminal!

Windows Terminal menus

Also, let's start sharing nice color profiles! Share your new ones as a Gist in this format. Note the name.

{
"background" : "#2C001E",
"black" : "#4E9A06",
"blue" : "#3465A4",
"brightBlack" : "#555753",
"brightBlue" : "#729FCF",
"brightCyan" : "#34E2E2",
"brightGreen" : "#8AE234",
"brightPurple" : "#AD7FA8",
"brightRed" : "#EF2929",
"brightWhite" : "#EEEEEE",
"brightYellow" : "#FCE94F",
"cyan" : "#06989A",
"foreground" : "#EEEEEE",
"green" : "#300A24",
"name" : "UbuntuLegit",
"purple" : "#75507B",
"red" : "#CC0000",
"white" : "#D3D7CF",
"yellow" : "#C4A000"
}

Note also that this should be the beginning of a wonderful Windows Console ecosystem. This isn't the one terminal to end them all, it's the one to start them all. I've loved alternative consoles for YEARS, whether it be ConEmu or Console2 many years ago, and I've long declared that Text Mode is a missed opportunity.

Remember also that Terminal !== Shell and that you can bring your shell of choice into your Terminal of choice! If you want the deep architectural dive, be sure to watch the BUILD 2019 technical talk with some of the developers or read about ConPTY and how to integrate with it!

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2018 Scott Hanselman. All rights reserved.
     

Dynamically generating robots.txt for ASP.NET Core sites based on environment

Jun 18, 2019

Description:

I'm moving part of the older WebForms portions of my site that still run on bare metal over to ASP.NET Core and Azure App Services, and while I'm doing that I realized that I want to make sure my staging sites don't get indexed by Google/Bing.

I already have a robots.txt, but I want one that's specific to production and others that are specific to development or staging. I thought about a number of ways to solve this. I could have a static robots.txt and another robots-staging.txt and conditionally copy one over the other during my Azure DevOps CI/CD pipeline.

Then I realized the simplest possible thing would be to just make robots.txt be dynamic. I thought about writing custom middleware but that sounded like a hassle and more code than needed. I wanted to see just how simple this could be.

You could do this as a single inline middleware, and just lambda and func and LINQ the heck out of it all on one line (sketched below). You could write your own middleware and do lots of options, then activate it based on env.IsStaging(), etc. You could make a single Razor Page with environment taghelpers.
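
For the curious, here's a minimal sketch of that first inline-middleware option (mine, not the post's code - the post goes with the Razor Page below). It assumes you've captured IHostingEnvironment as env in Startup.Configure and that this runs before MVC:

app.Use(async (context, next) =>
{
    if (context.Request.Path == "/robots.txt")
    {
        context.Response.ContentType = "text/plain";
        // Block everything outside Production; otherwise just the private bits.
        var disallow = env.IsProduction() ? "Disallow: /blog/private" : "Disallow: /";
        await context.Response.WriteAsync($"User-agent: *\n{disallow}\n");
        return;
    }
    await next();
});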

The last one seemed easiest and would also mean I could change the cshtml without a full recompile, so I made a RobotsTxt.cshtml single razor page. No page model, no code behind. Then I used the built-in environment tag helper to conditionally generate parts of the file. Note also that I forced the mime type to text/plain and I don't use a Layout page, as this needs to stand alone.

@page
@{
    Layout = null;
    this.Response.ContentType = "text/plain";
}
# /robots.txt file for http://www.hanselman.com/
User-agent: *
<environment include="Development,Staging">Disallow: /</environment>
<environment include="Production">Disallow: /blog/private
Disallow: /blog/secret
Disallow: /blog/somethingelse</environment>

I then make sure that my Staging and/or Production systems have ASPNETCORE_ENVIRONMENT variables set appropriately.

ASPNETCORE_ENVIRONMENT=Staging

I also want to point out what may look like odd spacing and how some text is butted up against the TagHelpers. Remember that a TagHelper's tag sometimes "disappears" (is elided) when it's done its thing, but the whitespace around it remains. So I want User-agent: * to have a line, and then Disallow to show up immediately on the next line. While it might be prettier source code to have that start on another line, it's not a correct file then. I want the result to be tight and above all, correct. This is for staging:

User-agent: *
Disallow: /

This now gives me a robots.txt at /robotstxt but not at /robots.txt. See the issue? Robots.txt is a file (or a fake one) so I need to map a route from the request for /robots.txt to the Razor page called RobotsTxt.cshtml.

Here I add a RazorPagesOptions in my Startup.cs with a custom PageRoute that maps /robots.txt to /robotstxt. (I've always found this API annoying as the parameters should, IMHO, be reversed like ("from","to"), so watch out for that, lest you waste ten minutes like I just did.)

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc()
        .AddRazorPagesOptions(options =>
        {
            options.Conventions.AddPageRoute("/robotstxt", "/Robots.Txt");
        });
}

And that's it! Simple and clean.

You could also add caching if you wanted, either as a larger middleware, or even in the cshtml Page, like

context.Response.Headers.Add("Cache-Control", $"max-age=SOMELARGENUMBEROFSECONDS");

but I'll leave that small optimization as an exercise to the reader.
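
If you do want that caching exercise, one hedged option (mine, not the post's) is the typed headers API, right in the RobotsTxt.cshtml code block:

Response.GetTypedHeaders().CacheControl =
    new Microsoft.Net.Http.Headers.CacheControlHeaderValue
    {
        Public = true,
        MaxAge = TimeSpan.FromDays(1) // pick your own SOMELARGENUMBEROFSECONDS
    };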

UPDATE: After I was done I found this robots.txt middleware and NuGet up on GitHub. I'm still happy with my code and I don't mind not having an external dependency, but it's nice to file this one away for future more sophisticated needs and projects.

How do you handle your robots.txt needs? Do you even have one?

Sponsor: Get the latest JetBrains Rider with WinForms designer, Edit & Continue, and an IL (Intermediate Language) viewer. Preliminary C# 8.0 support, rename refactoring for F#-defined symbols across your entire solution, and Custom Themes are all included.


© 2018 Scott Hanselman. All rights reserved.
     

Making a tiny .NET Core 3.0 entirely self-contained single executable

Jun 13, 2019

Description:

I've always been fascinated by making apps as small as possible, especially in the .NET space. No need to ship any files - or methods - that you don't need, right? I've blogged about optimizations you can make in your Dockerfiles to make your .NET containerized apps small, as well as using the ILLink.Tasks linker from Mono to "tree trim" your apps to be as small as they can be.

Work is ongoing, but with .NET Core 3.0 preview 6, ILLink.Tasks is no longer supported and instead the Tree Trimming feature is built into .NET Core directly.

Here is a .NET Core 3.0 Hello World app.

225 files, 69 megs

Now I'll open the csproj and add PublishTrimmed = true.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <PublishTrimmed>true</PublishTrimmed>
  </PropertyGroup>
</Project>

And I will compile and publish it for Win-x64, my chosen target.

dotnet publish -r win-x64 -c release

Now it's just 64 files and 28 megs!

64 files, 28 megs

If your app uses reflection you can let the Tree Trimmer know by telling the project system about your Assembly, or even specific Types or Methods you don't want trimmed away.

<ItemGroup>
  <TrimmerRootAssembly Include="System.IO.FileSystem" />
</ItemGroup>
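
To see why the trimmer needs that hint, here's a hedged illustration (mine, not the post's; the type name is made up): loading a type by string defeats the linker's static analysis, so nothing tells it the type is actually needed.

using System;

class ReflectionExample
{
    static void Main()
    {
        // Nothing references this type statically, so the trimmer can't
        // prove it's needed and may remove it - unless its assembly is
        // rooted with TrimmerRootAssembly as shown above.
        var type = Type.GetType("MyApp.PluginLoadedByName, MyApp.Plugins");
        var instance = Activator.CreateInstance(type);
        Console.WriteLine(instance);
    }
}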

The intent in the future is to have .NET be able to create a single small executable that includes everything you need. In my case I'd get "supersmallapp.exe" with no dependencies. That's done using PublishSingleFile along with the RuntimeIdentifier in the csproj like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <PublishTrimmed>true</PublishTrimmed>
    <PublishReadyToRun>true</PublishReadyToRun>
    <PublishSingleFile>true</PublishSingleFile>
    <RuntimeIdentifier>win-x64</RuntimeIdentifier>
  </PropertyGroup>
</Project>

At this point you've got everything expressed in the project file and a simple "dotnet publish -c Release" makes you a single exe!

There's also a cool global utility called Warp that makes things even smaller. This utility, combined with the .NET Core 3.0 SDK's now-built-in Tree Trimmer creates a 13 meg single executable that includes everything it needs to run.

C:\Users\scott\Desktop\SuperSmallApp>dotnet warp
Running Publish...
Running Pack...
Saved binary to "SuperSmallApp.exe"

And the result is just a 13 meg single EXE ready to go on Windows.

A tiny 13 meg .NET Core 3 application

If you want, you can combine the "PublishTrimmed" property with "PublishReadyToRun" as well and get a small AND fast app.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp3.0</TargetFramework>
    <PublishTrimmed>true</PublishTrimmed>
    <PublishReadyToRun>true</PublishReadyToRun>
  </PropertyGroup>
</Project>

These are not just IL (Intermediate Language) assemblies that are JITted (Just in time compiled) on the target machine. These are more "pre-chewed" AOT (Ahead of Time) compiled assemblies with as much native code as possible to speed up your app's startup time. From the blog post:

In terms of compatibility, ReadyToRun images are similar to IL assemblies, with some key differences. IL assemblies contain just IL code. They can run on any runtime that supports the given target framework for that assembly. For example a netstandard2.0 assembly can run on .NET Framework 4.6+ and .NET Core 2.0+, on any supported operating system (Windows, macOS, Linux) and architecture (Intel, ARM, 32-bit, 64-bit). R2R assemblies contain IL and native code. They are compiled for a specific minimum .NET Core runtime version and runtime environment (RID). For example, a netstandard2.0 assembly might be R2R compiled for .NET Core 3.0 and Linux x64. It will only be usable in that or a compatible configuration (like .NET Core 3.1 or .NET Core 5.0, on Linux x64), because it contains native code that is only usable in that runtime environment.

I'll keep exploring .NET Core 3.0, and you can install the SDK here in minutes. It won't mess up any of your existing stuff.

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2018 Scott Hanselman. All rights reserved.
     

Visual Studio Code Remote Development over SSH to a Raspberry Pi is butter

Jun 11, 2019

Description:

There's been a lot of folks, myself included, who have tried to install VS Code on the Raspberry Pi. In fact, there's a lovely process for this now. However, we have to ask ourselves: is a Raspberry Pi really powerful enough to be running a full development environment and the app being debugged? Perhaps, but maybe this is a job for remote debugging. That means installing Visual Studio Code locally on my Windows or Mac machine, then having Visual Studio Code install its headless server component (for ARM7) on the Pi.

In January I blogged about Remote Debugging with VS Code on a Raspberry Pi using .NET Core on ARM. It was, and is, a little hacked together with SSH and wishes. Let's set up a proper VS Code Remote environment so I can be productive on a Pi while still enjoying my main laptop's abilities.

First, can you ssh into your Raspberry Pi without a password prompt? If not, be sure to set that up with OpenSSH, which is now installed on Windows 10 by default. You know you've got it down when you can "ssh pi@mypi" and it just drops you into a remote prompt.

Next, get Visual Studio Code Insiders plus the Remote Development Extension.

Then uninstall the "Remote - SSH" Extensions - disabling them isn't enough - because you want to replace them with the important Remote - SSH Nightly Builds.

From within VS Code Insiders, hit Ctrl/CMD+P and type "Remote-SSH" for some of the choices.

Remote-SSH options in VS Code

I can connect to Host and VS Code will SSH into the Pi and install the VS Code server components in ~/.vscode-server-insiders and then connect to them. It will take a minute as it's downloading a 25 meg GZip and unzipping it into this temp folder. You'll know you're connected when you see this green badge as seen below that says "SSH: hostname."

Green badge in VS Code - SSH: crowpi

Then when you go "File | Open Folder" from the main menu, you'll get the remote system's files! You are working and editing locally on remote files.

My Raspberry Pi's desktop, remotely

Note here that some of the extensions are NOT installed locally! The Python language services (using Jedi) are running remotely on the Raspberry Pi, so when I get intellisense, I'm getting it remoted from the actual machine I'm developing on, not a guess from my local box.

Some extensions are local and others are remote

When I open a Terminal with Ctrl+~, see that I'm automatically getting a remote terminal - and I'm even running htop in it!

Check this out, I'm doing a remote interactive debugging session against CrowPi samples running on the Raspberry Pi (in Python 2) remotely from VS Code on my Windows 10 machine! I did need to make one change to the remote settings as it was defaulting to Python3 and I wanted to use Python2 for these samples.

Remote Debugging a Raspberry Pi

This has been a very smooth process and I remain super impressed with the VS Remote Development experience. I'll be looking at containers, and remote WSL debugging soon as well. Next step is to try C#, remotely, which will mean making sure the C# OmniSharp Extension works on ARM and remotely.

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2018 Scott Hanselman. All rights reserved.
     

This changes everything for the DIY Diabetes Community - TidePool partners with Medtronic and Dexcom

Jun 6, 2019

Description:

I don't speak in hyperbole very often, and I want to make sure that you all understand what a big deal this is for the diabetes DIY community. Everything that we've worked for for the last 20 years, it all changes now. #WeAreNotWaiting

"You probably didn’t see this coming, [Tidepool] announced an agreement to partner with our friends at Medtronic Diabetes to support a future Bluetooth-enabled MiniMed pump with Tidepool Loop. Read more here: https://www.tidepool.org/blog/tidepool-loop-medtronic-collaboration"

Translation? This means that diabetics will be able to choose their own supported equipment and build their own supported FDA Approved Closed Loop Artificial Pancreases.

Open Source Artificial Pancreases will become the new standard of care for Diabetes in 2019

Every diabetic engineer ever, the day after they were diagnosed, tries to solve their (or their loved one's) diabetes with open software and open hardware. Every one. I did it in the early 90s. Someone diagnosed today will do this tomorrow. Every time.

I tried to send my blood sugar to the cloud from a PalmPilot. Every person diagnosed with diabetes ever, does this. Has done this. We try to make our own systems. Then @NightscoutProj happened and #WeAreNotWaiting happened and we shared code and now we sit on the shoulders of people who GAVE THEIR IDEAS TO US FOR FREE.


Here's the first insulin pump. Imagine a disease this miserable that you'd choose this. Type 1 Diabetes IS NOT FUN. Now we have Bluetooth and Wifi and the Cloud but I still have an insulin pump I bought off of Craigslist.


Imagine a watch that gives you an electrical shock so you can check your blood sugar. We are all just giant bags of meat and water under pressure and poking the meatbag 10 times a day with needles and #diabetes testing strips SUUUUCKS.


The work of early #diabetes pioneers is being now leveraged by @Tidepool_org to encourage large diabetes hardware and sensor manufacturers to - wait for it - INTEROPERATE on standards we can talk to.


Just hours after I got off stage speaking on this very topic at @RefactrTech, it turns out that @howardlook and the wonderful friends at @Tidepool_org like @kdisimone and @ps2 and pioneer @bewestisdoing and others announced there are now partnerships with MULTIPLE insulin pump manufacturers AND multiple sensors!

We the DIY #diabetes community declared #WeAreNotWaiting and, dammit, we'd do this ourselves. And now TidePool is expressing the intent to put an Artificial Pancreas in the damn App Store - along with Angry Birds - WITH SUPPORT FOR WARRANTIED NEW BLE PUMPS. I could cry.

You see this #diabetes insulin pump? It’s mine. See those cracks? THOSE ARE CRACKS IN MY INSULIN PUMP. This pump does not have a warranty, but it’s the only one that I have if I want an open source artificial pancreas. Now I’m going to have real choices, multiple manufacturers.

It absolutely cannot be overstated how many people keep this community alive, from early python libraries that talked to insulin pumps, to man in the middle attacks to gain access to our own data, to custom hardware boards created to bridge the new and the old.

To the known in the unknown, the song in the unsung, we in the Diabetes Community appreciate you all. We are standing on the shoulders of giants - I want to continue to encourage open software and open hardware whenever possible. Get involved. 

Also, if you're diabetic, consider buying a Nightscout Xbox Avatar accessory so you can see yourself represented while you game!

Oh, and one other thing, journalists who cover the Diabetes DIY community, please let us read your articles before you write them. They all have mistakes and over-generalizations and inaccuracies and it's awkward to read them. That is all.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Clever little C# and ASP.NET Core features that make me happy

Jun 4, 2019

Description:

Visual Studio

I recently needed to refactor my podcast site which is written in ASP.NET Core 2.2 and running in Azure. The Simplecast-backed API changed in a few major ways from their v1 to a new redesigned v2, so there was a big backend change and that was a chance to tighten up the whole site.

As I was refactoring I made a few small notes of things that I liked about the site. A few were C# features that I'd forgotten about! C# is on version 8 but there were little happinesses in 6.0 and 7.0 that I hadn't incorporated into my own idiomatic view of the language.

This post is collecting a few things for myself, and you, if you like.

I've got a mapping between two collections of objects. There's a list of all Sponsors, ever. Then there's a mapping of shows where a show might have n sponsors.

Out Var

I have to "TryGetValue" because I can't be sure if there's a value for a show's ID. I wish there was a more compact way to do this (a language shortcut for TryGetValue, but that's another post).

Shows2Sponsor map = null;
shows2Sponsors.TryGetValue(showId, out map);
if (map != null)
{
    var retVal = sponsors.Where(o => map.Sponsors.Contains(o.Id)).ToList();
    return retVal;
}
return null;
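
As an aside, if shows2Sponsors is a Dictionary and you're on .NET Core 2.0+, the built-in GetValueOrDefault extension is one compact option (a sketch, and not where this post lands - keep reading):

// GetValueOrDefault returns null when the key is missing.
var map = shows2Sponsors.GetValueOrDefault(showId);
return map == null ? null : sponsors.Where(o => map.Sponsors.Contains(o.Id)).ToList();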

I forgot that in C# 7.0 they added "out var" parameters, so I don't need to declare the map or its type. Tighten it up a little and I've got this. The LINQ query there returns a List of sponsor details from the main list, using the IDs returned from the TryGetValue.

if (shows2Sponsors.TryGetValue(showId, out var map))
    return sponsors.Where(o => map.Sponsors.Contains(o.Id)).ToList();
return null;

Type aliases

I found myself building JSON types in C# that were using the "Newtonsoft.Json.JsonPropertyAttribute" but the name is too long. So I can do this:

using J = Newtonsoft.Json.JsonPropertyAttribute;

Which means I can do this:

[J("description")]
public string Description { get; set; }

[J("long_description")] public string LongDescription { get; set; } LazyCache

I blogged about LazyCache before, and its challenges, but I'm loving it. Here I have a GetShows() method that returns a List of Shows. It checks a cache first, and if it's empty, then it will call the Func that returns a List of Shows, and that Func is the thing that does the work of populating the cache. The cache lasts for about 8 hours. Works great.

public async Task<List<Show>> GetShows()
{
    Func<Task<List<Show>>> showObjectFactory = () => PopulateShowsCache();
    return await _cache.GetOrAddAsync("shows", showObjectFactory, DateTimeOffset.Now.AddHours(8));
}

private async Task<List<Show>> PopulateShowsCache()
{
    List<Show> shows = await _simpleCastClient.GetShows();
    _logger.LogInformation($"Loaded {shows.Count} shows");
    return shows.Where(c => c.Published == true && c.PublishedAt < DateTime.UtcNow).ToList();
}
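
For comparison, here's a roughly equivalent sketch against the raw IMemoryCache that LazyCache wraps (mine, not the post's; _memoryCache is an assumed injected Microsoft.Extensions.Caching.Memory.IMemoryCache). Part of LazyCache's appeal is that it locks so the factory should only run once per key, which plain IMemoryCache doesn't guarantee under concurrent callers.

public Task<List<Show>> GetShowsWithMemoryCache()
{
    // GetOrCreateAsync checks the cache and, on a miss, runs the factory.
    return _memoryCache.GetOrCreateAsync("shows", entry =>
    {
        entry.AbsoluteExpiration = DateTimeOffset.Now.AddHours(8);
        return PopulateShowsCache();
    });
}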

What are some little things you're enjoying?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

What's better than ILDasm? ILSpy and dnSpy are tools to Decompile .NET Code

May 30, 2019

Description:

.NET code (C#, VB, F#, etc) compiles (for the most part) into Intermediate Language (IL) and then makes its way to native code usually by Just-in-time (JIT) compilation on the target machine. When you get a DLL/Assembly, it's pre-chewed but not fully juiced, to mix my metaphors.

Often you'll come across a DLL that you want to learn more about. Sometimes you'll want to just see the structure of classes, methods, etc, and other times you want to see the IL - or a close representation of the original C#/VB/F#, etc. You're not looking at the source, you're seeing a backwards projection of the IL as whatever language you want. You're basically taking this pre-chewed food out of your mouth and getting a decent idea of what it was originally.

I've used ILDasm for years, but it's old and lame and people tease you for using it because they are cruel. ;)

Seriously, though, I use ILDasm - the IL Disassembler - simply because it's already installed. Those tweets got me thinking though that I need to update my options, so I'm trying out ILSpy and dnSpy.

ILSpy

ILSpy has been around for a while and has multiple front-ends, including ones for Linux/Mac/Windows based on Avalonia in the form of AvaloniaSpy. You can also integrate ILSpy into Visual Studio 2017 or 2019 with this extension. There is also a console decompiler and, interestingly, cross-platform PowerShell cmdlets.

ILSpy is a solid .NET decompiler

I've always liked the "Open List" feature of ILSpy where you can open a preconfigured list of assemblies you want to browse, like ASP.NET MVC, .NET 4, etc. A fun open source contribution for you might be to update the included lists with newer defaults. There's so many folks doing great work in open source out there, why not jump in and help them out?

dnSpy

dnSpy has a lovely UI AND a great Console app using the same engine. It's amazingly polished and VERY complete. I was surprised that it also has a full hex editor as well as property pages for common EXE file headers. From their GitHub, dnSpy features

Debug .NET Framework, .NET Core and Unity game assemblies, no source code required
Edit assemblies in C# or Visual Basic or IL, and edit all metadata
Light and dark themes
Extensible, write your own extension
High DPI support (per-monitor DPI aware)

dnSpy takes it to the next level with an integrated Debugger, meaning you can attach to a running process and debug it without source code - but it feels like source code because it's decompiling for you. Note where it says C#, I can choose C#, VB, or IL as a "view" on my decompiled code.

dnSpy is amazing for looking inside .NET apps

Here is dnSpy actually debugging ILSpy and stopped at a decompiled breakpoint.

image

There's a lot of great low-level stuff in this space. Another cool tool is Reflexil, a .NET Assembly Editor, as well as de4dot by the same mysterious author as dnSpy. JetBrains has the excellent dotPeek and Telerik has JustDecompile. Commercial tools include Reflector.

What's your favorite?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Bringing the SpaceOrb game controller forward with an Arduino Bridge via The Orbotron 9001

May 28, 2019

Description:

Thingotron brings your SpaceOrb back to life

Almost ten years ago I posted about the SpaceTec SpaceOrb 360 Controller and that was 15 years after it came out. We are now 25 years into the legend of the SpaceOrb and I will continue to tell the tale. The SpaceOrb is one of a series of innovative "Spacemice" that offer more than just two degrees of input freedom. In fact, they offer SIX.

"The puck or ball of a spacemouse can be moved along X, Y and Z axis as well as being twisted rotationally on each of those axis. (Roll, Pitch and Yaw)"

Vic Putz continues to carry a torch for the SpaceOrb, as do I, except he's actually doing something about it. A decade ago I bought an Arduino and an "OrbShield" from Vic that sat on top and provided a realtime bridge between the RS-232 Serial Port and the modern USB "HID" (Human Interface Devices) that are used today. The goal is to move beyond unsigned device drivers and create a system-agnostic solution that would present an old device in a new driver-free way.

Vic has been working on a new version called the Orbotron 9000/9001 for the last few years and it's currently sold out at his little store. It acts as an interface for the SpaceOrb 360 and comes configured for that device, but should also work with the SpaceBall 5000, SpaceBall 4000FLX, and Magellan SpaceMouse. Code and plans are on GitHub, natch.

SpaceOrb

When you plug the SpaceOrb into the Orbotron 9001 then into your PC it shows up as a Game Controller!

The SpaceOrb as it presents inside Windows as a controller

There are several innovative "six degrees of freedom" games out there, like "Overload," the sequel to Descent, on Steam, as well as Retrovirus and NeonXSZ, plus open source reimplementations of Descent like DXX Rebirth (give them some love!) and Forsaken.

Modern Xinput games are trickier, but you may have success with https://www.x360ce.com by mapping the orb buttons and axes to a gamepad.

Descent - DXX Re-birth

I'm still exploring this space, but I love that The Internet - with the help of the enterprising and patient - refused to let the good parts of history die, by making innovative and clean bridges between the past and the future.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Piper Command Center BETA - Build a game controller from scratch with Arduino

May 23, 2019

Description:

Piper Command Center

Back in 2018 I posted my annual Christmas List of STEM Toys and the Piper Computer Kit 2 was on the list. My kids love this little wooden "laptop" comprised of a Raspberry Pi and an LCD screen. You spend time going through curated episodes of custom content and build and wire the computer LIVE while it's on!

The Piper folks saw my post and asked me to take a look at the BETA of their Piper Command Center, so my sons and I jumped at the chance. They are actively looking for feedback. It's a chance to build our own game controller!

The Piper Command Center BETA already has a ton of online content and things to try. Their "firmware" is an Arduino sketch and it's all up on GitHub. You'll want to get the Arduino IDE from the Windows Store.

Today the Command Center can look like a Keyboard or a Mouse.

In Mouse Mode (default), the joystick controls cursor movement and the left and right buttons mimic left and right mouse clicks. In Keyboard Mode, the joystick mimics the arrow keys on a keyboard, and the buttons mimic Space Bar (Up), Z (Left), X (Down), and C (Right) keys on a keyboard.

Once it's built you can use the controller to play games in your browser, or soon, with new content on the Piper itself, which runs Minecraft usually. However, you DO NOT need the Piper to get the Piper Command Center. They are separate but complementary devices.

Assemble a real working game controller, understand the basics of an Arduino, and discover physical computing by configuring a joystick, buttons, and more. Ideal for ages 13+.

My son is looking at how he can modify the "firmware" on the Command Center to allow him to play emulators in the browser.

The parts of the Piper Command Center

Parts and Wires for the Piper Command Center

The Piper Command Center comes unassembled, of course, and you get to put it together with a cool blueprint instruction sheet. We had some fun with the wiring and were off by one a few times, but they've got a troubleshooting video that helped us through it.

Blueprints for the Piper Command Center

It's a nice little bit of kit and I love that it's made of wood. I'd like to see one with a second joystick that could literally emulate an XInput control pad, although that might be more complex than just emulating a mouse or keyboard.

Go check it out. We're happy with it and we're looking forward to whatever direction it goes. The original Piper has updated itself many times in the few years we've had it, and we upgraded it to a 16gig SD Card to support the latest content and OS update.

Piper Command Center is in BETA and will be updated and actively developed as they explore this space and what they can do with the device. As of the time of this writing there were five sketches for this controller.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Visual Studio Code Remote Development may change everything

May 21, 2019

Description:

DevContainer using Rust

OK, that's a little clickbaity but it surely impressed the heck out of me. You can read more about VS Code Remote Development (at the time of this writing, available in the VS Code Insiders builds) but here's a little on my first experience with it.

The Remote Development extensions require Visual Studio Code Insiders.

Visual Studio Code Remote Development allows you to use a container, remote machine, or the Windows Subsystem for Linux (WSL) as a full-featured development environment. It effectively splits VS Code in half and runs the client part on your machine and the "VS Code Server" basically anywhere else. The Remote Development extension pack includes three extensions. See the following articles to get started with each of them:

Remote - SSH - Connect to any location by opening folders on a remote machine/VM using SSH.
Remote - Containers - Work with a sandboxed toolchain or container-based application inside (or mounted into) a container.
Remote - WSL - Get a Linux-powered development experience in the Windows Subsystem for Linux.

Lemme give a concrete example. Let's say I want to do some work in any of these languages, except I don't have ANY of these languages/SDKs/tools on my machine.

Aside: You might, at this point, have already decided that I'm overreacting and this post is nonsense. Here's the thing though when it comes to remote development. Hang in there.

On the Windows side, lots of folks create Windows VMs in someone's cloud and then RDP (Remote Desktop) into that machine and push pixels around, letting the VM do all the work while they remote the screen. On the Linux side, lots of folks create Linux VMs or containers and then SSH into them with their favorite terminal, run vim and tmux or whatever, and then push text around, letting the VM do all the work while they remote the screen. In both these scenarios you're not really client/server, you're terminal/server or thin client/server. VS Code is a thick client with clean, clear interfaces to language services that have location transparency.

I type some code, maybe an object instance, then intellisense is invoked with a press of "." - who does that work? Where does that list come from? If you're running code locally AND in the container, then you need to make sure both sides are in sync, same SDKs, etc. It's challenging.

OK, I don't have the Rust language or toolkit on my machine.

I'll clone this repository:

git clone https://github.com/Microsoft/vscode-remote-try-rust

Then I'll run Code, the Insiders version:

C:\github> git clone https://github.com/Microsoft/vscode-remote-try-rust
Cloning into 'vscode-remote-try-rust'...
Unpacking objects: 100% (38/38), done.
C:\github> cd .\vscode-remote-try-rust\
C:\github\vscode-remote-try-rust [main =]> code-insiders .

Then VS Code says, hey, this is a Dev Container, want me to open it?

There's a devcontainer.json file that has a list of extensions that the project needs. And it will install those VS Extensions inside a Development Docker Container and then access them remotely. This isn't a list of extensions that your LOCAL system needs - you don't want to sully your system with 100 extensions. You want to have just those extensions that you need for the project you're working on. Compartmentalization. You could do development and never install anything on your local machine, but you're finding a sweet spot that doesn't involve pushing text or pixels around.

Reopen in Container

Now look at this screenshot and absorb. It's setting up a dockerfile, sure, with the development tools you want to use and then it runs docker exec and brings in the VS Code Server!

Setting up Rust

Check out the Extensions section of VS Code, and check out the lower left corner. That green status bar shows that we're in a client/server situation. The extensions specific to Rust are installed in the Dev Container and we are using them from VS Code.

Extensions

When I'm typing and working on my code in this way (by the way it took just minutes to get started) I've got a full experience with Intellisense, Debugging, etc.

Intellisense from a container running Rust and VS Code Remote Containers

Here I am doing a live debug session of a Rust app with zero setup other than VS Code Insiders, the Remote Extensions, and Docker (which I already had).

Debugging in VS Code a Rust app within a DevContainer

As I mentioned, you can run within WSL, Containers, or over SSH. It's early days but it's extraordinarily clean. I'm really looking forward to seeing how far and effortless this style of development can go. There's so much less yak shaving! It effectively removes the whole setup part of your coding experience and you get right to it.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Using the Steam Link app to stream PC Games directly to your iPhone or mobile device

May 16, 2019

Description:

Steam Link on iOS

I think that we, as an industry, are still figuring game streaming out. It's challenging to find that sweet spot between quality and frames per second, all while respecting the speed of light and the laws of physics.

That said, if you have a rock solid 5Ghz wireless network, or better yet, a solid wired network, you can do some pretty cool stuff today.

How to stream PC games from Windows 10 to your Xbox One for free

You can use the Xbox app on Windows 10 to stream from your Xbox One to your PC. I use this to play on my Xbox while I walk on my treadmill in my garage. Works great even on my comparatively underpowered Surface Pro 3.

You can also do the opposite if you have a powerful PC. You can run the Xbox Wireless Display app and remote your PC to your Xbox.

Here I am running Batman on my PC with an NVidia 1080, from my Xbox

I also have a Steam Link - it's odd to me that they discontinued this great little device - that I use to stream from my PC to my big TV. However, if you have a Raspberry Pi 3 or 3B+ running Stretch, you can try a beta of Steam Link and effectively make your own little Steam Link dedicated device. Bonus points if you 3D Print a replica case to make it look like a Steam Link.

sudo apt update
sudo apt install steamlink
steamlink

Today, however, Steam Link was released (after a rejection) to the Apple iOS App store so I had to try this out from my iPhone XS Max. I also have a Steam Controller, which, while weird (i.e. it's not an Xbox Controller) is the most configurable controller ever and it can emulate a mouse pretty well when needed. They released a new Firmware for the Steam Controller that enabled BLE support which allows it to be used as an MFi controller on an iOS device. You do need to memorize or write down the incantations to switch between original RF mode and BLE mode, though.

Aside: MFi is almost criminally neglected and Apple has utterly dropped the ball and missed an opportunity to REALLY make iOS devices more than casual gaming devices. Only in the last few years have decent MFi Controllers been released and game support is still embarrassingly spotty. I've used my now-discontinued SteelSeries Stratus a handful of times.

You install the app, pair your controller with your iOS device/phone/tablet, then test your network. I'm using an Amplifi Mesh Network so I can control how my devices connect to the network, I can manage band selection, as well as Quality of Service (QoS) so I didn't have any trouble getting 55 Mb/s from my wired computer to my wireless iPhone.

Streaming bandwidth test successful up to 55 Mb/s


The quality is up and down as it appears they are focused on maintaining a high framerate. Here's a captured local video of me playing Batman from my high end rig streaming to Steam Link on my iPhone.

Here’s a better quality video with the iPhone at full power and connect to 5ghz using Steam Link pic.twitter.com/N2UZ0P2G4n

— Scott Hanselman (@shanselman) May 18, 2019

What has been YOUR experience with Game Streaming?

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!
© 2018 Scott Hanselman. All rights reserved.
     

Introducing the Try .NET Global Tool - interactive in-browser documentation and workshop creator

May 15, 2019

Description:

Learn .NET or easily author your own workshop

If you find yourself learning C# and .NET and come upon the "Run your first C# Program" documentation you may have noticed a "Try the code in your browser" button that lets you work through your first app entirely online, with no local installation! You're running C# and .NET in the browser! It's a great way to learn that is familiar to folks who learn JavaScript.

The language team at Microsoft wants to bring that easy on-ramp to everyone who wants to learn .NET.

The .NET Foundation has published a lot of free .NET presentations and workshops that you can use today to teach open source .NET to your friends, colleagues, or students. However these do encourage you to install a number of prerequisites and we believe that there might be an easier on-ramp to learning .NET.

Today we're announcing that on-ramp - the Try .NET global tool!

Here's the experience. Once you have the .NET SDK (pick the one that says you want to "Build Apps"), just get the "try" tool. Try it!

Open a terminal/command prompt and type dotnet tool install --global dotnet-try

Now you can either navigate to an empty folder and type

dotnet try demo

or, even better, do this!

ACTION: Clone the samples repo with
git clone https://github.com/dotnet/try -b samples
then run
"dotnet try"
and that's it!

NOTE: Make sure you get the samples branch until we have more samples!

C:\Users\scott\Desktop> git clone https://github.com/dotnet/try -b samples
Cloning into 'try'...
C:\Users\scott\Desktop> cd .\try\Samples\
C:\Users\scott\Desktop\try\Samples [samples ≡]> dotnet try
Hosting environment: Production
Content root path: C:\Users\scott\Desktop\try\Samples
Now listening on: http://localhost:5000
Now listening on: https://localhost:5001

Your browser will pop up and you're inside a local interactive workshop! Notice the URL? You're browsing your *.md files and the code inside is runnable. It's all local to you! You can put this on a USB key and learn offline or in disconnected scenarios which is great for folks in developing countries. Take workshops home and remix! Run an entire workshop in the browser and the setup instructions for the room are basically "get this repository" and type "dotnet try"!

Try .NET interactive local documentation

This is not just a gentle on-ramp that teaches .NET without yet installing Visual Studio, but it also is a toolkit for you to light up your own Markdown.

Just add a code fence - you may already be doing this! Note the named --region there? It's not just running the visible code in the Markdown - that wouldn't be enough! It's compiling your app and capturing the result of the named region in your source! You could even make an entire .NET interactive online book.

### Methods
A **method** is a block of code that implements some action. `ToUpper()` is a method you can invoke on a string, like the *name* variable. It will return the same string, converted to uppercase.
``` cs --region methods --source-file .\myapp\Program.cs --project .\myapp\myapp.csproj
var name = "Friends";
Console.WriteLine($"Hello {name.ToUpper()}!");
```

And my app's code might look like:

using System;

namespace HelloWorld
{
    class Program
    {
        static void Main(string[] args)
        {
            #region methods
            var name = "Friends";
            Console.WriteLine($"Hello {name.ToUpper()}!");
            #endregion
        }
    }
}

Make sense?

NOTE: Closing code fences ``` must be on a newline.

Hey you! YOU have some markdown or just a readme.md in your project! Can you light it up and make a workshop for folks to TRY your project?

Code Fences within Markdown

Here I've typed "dotnet try verify" to validate my markdown and ensure my samples compile. Dotnet Try is both a runner and a creator's toolkit.

Compiling your workshop

Today "dotnet try" uses .NET Core 2.1 but if you have .NET Core 3 installed you can explore the more complex C# samples here with even more interesting and sophisticated presentations. You'll note in the markdown the --session argument for the code fence allows for interesting scenarios where more than one editor runs in the context of one operation!

image

I'd love to see YOU create workshops with Try .NET. It's early days and this is an Alpha release but we think it's got a lot of promise. Try installing it and running it now and later head over to https://github.com/dotnet/try to file issues if you find something or have an idea.

Go install "dotnet try" locally now, and remember this is actively being developed so you can update it easily and often like this!

dotnet tool update -g dotnet-try

There's lots of ideas planned, as well as the ability to publish your local workshop as an online one with Blazor and WASM. Here's a live example.

Watch for a much more in-depth post from Maria from my team on Thursday on the .NET blog!

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2018 Scott Hanselman. All rights reserved.
     

Systems Thinking as important as ever for new coders

May 9, 2019

Description:

Two programmers having a chat

I was at the Microsoft BUILD conference last week and spent some time with a young university student who came prepared. I was walking between talks and he had a sheet of paper organized with questions. We sat down and went through the sheet.

One of his main questions that followed a larger theme was, since his class in South Africa was learning .NET Framework on Windows, should he be worried? Shouldn't they be learning the latest .NET Core and the latest C#? Would they be able to get jobs later if they aren't on the cutting edge? He was a little concerned.

I thought for a minute. This isn't a question one should just start talking about and see where their mouth takes them. I needed to absorb and breathe before answering. I'm still learning myself and I often need a refresher to confirm my understanding of systems.

It doesn't matter if you're a 21 year old university student learning C# from a book dated 2012, or a 45 year old senior engineer doing WinForms at a small company in the midwest. You want to make sure you are valuable, that your skills are appreciated, and that you'll be able to provide value at any company.

I told this young person to try not to focus on the syntax of C# and the details of the .NET Framework, and rather to think about the problems that it solves and the system around it.

This advice was .NET specific, but it can also apply to someone learning Rails 3 talking to someone who knows Rails 5, or someone who learned original Node and is now reentering the industry with modern JavaScript and Node 12.

Do you understand how your system talks to the file system? To the network? Do you understand latency and how it can affect your system? Do you have a general understanding of "the stack" - from when your backend gets data from the database, makes angle brackets or curly braces, and sends them over the network to a client/browser, to what that next system does with the info?

Squeezing an analogy, I'm not asking you to be able to build a car from scratch, or even rebuild an engine. But I am asking you for a passing familiarity with internal combustion engines, how to change a tire, or generally how to change your oil. Or at least know that these things exist so you can google them.

If you type Google.com into a browser, generally what happens? If your toaster breaks, do you buy a new toaster or do you check the power at the outlet, then the fuse, then call the neighbor to see if the power is out for your neighborhood? Think about systems and how they interoperate. Systems Thinking is more important than coding.

If your programming language or system is a magical black box to you, then I ask that you demystify it. Dig inside to understand it. Crack it open. Look in folders and directories you haven't before. Break things. Fix them.

Know what artifacts your system makes and what's needed for it to run. Know what kinds of things it's good at and what it's bad at - in a non-zealous and non-egotistical way.

You don't need to know it all. In fact, you may dig in, look around inside the hood of a car and decide to take ride-sharing or public transport for the rest of your life, but you will at least know what's under the hood!

For the young person I spoke to, yes .NET Core may be a little different from .NET Framework, and they might both be different from Ruby or JavaScript, but strings are strings, loops are loops, memory is memory, disk I/O is what it is, and we all share the same networks. Processes and threads, ports, TCP/IP, and DNS - understanding the basic building blocks are important.

Drive a Honda or a Jeep, you'll still need to replace your tires and think about the road you're driving on, on the way to the grocery store.

What advice would you give to a young person who is not sure if what they are learning in school will serve them well in the next 10 years? Let us know in the comments.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

An experiment - The Azure Cloud Shell at the command line with AZ SHELL

May 7, 2019

Description:

I've blogged before about the Azure Cloud Shell. It's super cool and you can get your own easily in any browser by hitting https://shell.azure.com. You can have either bash or powershell, and you get a shared "cloud drive" that is persisted between sessions.

If you have Visual Studio Code you can get an Azure Cloud Shell integrated within VSCode by just installing Visual Studio Code and adding the Azure Account Extension.

I recently got a build of the new open source Windows Terminal on my machine and I set up some profiles with tabs for DOS, PowerShell, VS2019, Ubuntu but something was missing. Why can't I get my Azure Cloud Shell?

Sure, I can fire up a VM and ssh into it. But Azure Cloud Shell spins up a free container with a persistent cloud drive AND has a bunch of developer tools like python, node, dotnet, and go already installed. I'd love to use it! But it's not a VM and the container isn't exposed with SSH. Instead, we'll want to spin the Azure Cloud Shell up the same way the https://shell.azure.com site does, with web calls and web sockets. So...why not do it?

image

I thought I was pretty clever when I had this idea so I started a C# implementation myself. Then I talked to Anders Liu from work about how to do it right, and over the weekend he beat me to it with his own VERY nice and clean implementation in Go that he put on his github at https://github.com/yangl900/azshell. We shared this on an internal alias and found out that Noel Bundick had the same great idea and put it in his Az CLI extensions pack (which has a ton of other cool stuff you should see). Anders' is standalone and Noel's is an Az CLI extension.

Either way, we all think this idea has merit and maybe it should be an official thing! What do you think? Regardless, maybe it doesn't need to be, since you can try it today with these open source options.

Just put "azshell.exe" in your PATH and make sure you have the latest Azure CLI installed and you're logged in.

By the way, you can also get a Cloud Shell inside the Portal. In fact there's a button for it at the top that looks like >_. Personally I think that the addition of "az shell" (or in this case, azshell.exe) at the command line completes the circle in a really cool way.

image

Let me know what you think in the comments!

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

A new Console for Windows - It's the open source Windows Terminal

May 3, 2019

Description:

"My fellow Windows users, our long national nightmare is over." The Windows Terminal is here, it's open source, it's real, and it's spectacular. It's very early days to be clear, but the new Windows Terminal is open source and it's up at https://github.com/microsoft/Terminal for you to check out.

The repository includes:

Windows Terminal
The Windows console host (conhost.exe) - a local copy that is separate from the built-in Windows one
Components shared between the two projects
ColorTool
Sample projects that show how to consume the Windows Console API

And even better, it'll be, as they say:

Windows Terminal will be delivered via the Microsoft Store in Windows 10 and will be updated regularly, ensuring you are always up to date and able to enjoy the newest features and latest improvements with minimum effort.

How do you get it? TODAY you clone the repo and build your own copy. There will be early builds in the Store this summer and 1.0 should be out before the end of the year.

As of today, the Windows Terminal and Windows Console have been made open source and you can clone, build, run, and test the code from the repository on GitHub: https://github.com/Microsoft/Terminal

This summer in 2019, Windows Terminal previews will be released to the Microsoft Store for early adopters to use and provide feedback.

This winter in 2019, our goal is to launch Windows Terminal 1.0 and we’ll work with the community to ensure it’s ready before we release!

So yes, it'll take some effort if you want to play with it today. But good things are worth a little effort. Here are some of the things I've done to mine. I hope you make your Windows Terminal your own as well!

Windows Terminal

When you click the menu, check out Settings, which will open your profiles.json in your JSON editor. I use VS Code to edit. You'll need to run Format Document to make the JSON look nice as today it may show up all on one line.

You can create color profiles in the "schemes" node. For example, here's my "UbuntuLegit" color theme in my profiles.json.

{
    "name": "UbuntuLegit",
    "foreground": "#EEEEEE",
    "background": "#2C001E",
    "colors": [
        "#4E9A06", "#CC0000", "#300A24", "#C4A000",
        "#3465A4", "#75507B", "#06989A", "#D3D7CF",
        "#555753", "#EF2929", "#8AE234", "#FCE94F",
        "#729FCF", "#AD7FA8", "#34E2E2", "#EEEEEE"
    ]
}

Here's an example profile with all the settings I know about set. This is for "CMD.exe"

"profiles": [
{
"startingDirectory": "C:/Users/Scott/Desktop",
"guid": "{7d04ce37-c00f-43ac-ba47-992cb1393215}",
"name": "DOS but not DOS",
"colorscheme": "Solarized Dark",
"historySize": 9001,
"snapOnInput": true,
"cursorColor": "#00FF00",
"cursorHeight": 25,
"cursorShape": "vintage",
"commandline": "cmd.exe",
"fontFace": "Cascadia Code",
"fontSize": 20,
"acrylicOpacity": 0.85,
"useAcrylic": true,
"closeOnExit": false,
"padding": "0, 0, 0, 0",
"icon": "ms-appdata:///roaming/cmd-32.png"
},

I like the "vintage" cursor and I make it bright green. I can also add icons in this location:

%LOCALAPPDATA%\packages\Microsoft.WindowsTerminal_8wekyb3d8bbwe\RoamingState

So I put some 32x32 PNGs in that folder and then I can reference them as seen above with ms-appdata://

Cool Icons

I'll go into more detail about what's happening in each of these profiles/tabs in the next post! I've got a few creative ideas for taking MY Windows Terminal to the next level. Meanwhile, here are the global settings I'm using:

"defaultProfile": "{7d04ce37-c00f-43ac-ba47-992cb1393215}",
"initialRows": 30,
"initialCols": 120,
"alwaysShowTabs": true,
"showTerminalTitleInTitlebar": true,
"experimental_showTabsInTitlebar": true,
"requestedTheme": "dark",

Here I've set the theme to dark using "requestedTheme" even though I run Windows in a light theme. I'm setting the tabs to be shown all the time and moved the tabs into the TitleBar.

Here's my Ubuntu tab with the UbuntuLegit color theme above:

Nice Ubuntu Colors

Notice I'm also using Powerline in my prompt. I'm using Fira Code which has the glyphs I need but you can certainly use patched Powerline fonts or make your own fonts with tools like those from Nerd Fonts and its font patcher. This font patcher is often used to take your favorite monospace font and add Powerline glyphs to it.

NOTE: If you see any weird spacing issues with glyphs you might try using --use-single-width-glyphs to work around it. By release, I assume all these little issues will be worked out. I had no issues with Fira Code in my case, but your mileage may vary.

This new Windows Terminal is great. As mentioned, it's super early days but it's amazingly fast, runs on your GPU (the current conhost runs on your CPU) and it's VERY configurable.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.
© 2018 Scott Hanselman. All rights reserved.
     

Did I leave the garage door open? A no-code project with Azure IoT Central and the MXChip DevKit

May 1, 2019

Description:

Azure IoT DevKit

For whatever reason, when a programmer tries something out for the first time, they write a "Hello World!" application. In the IoT (Internet of Things) world of devices, it's always fun to make an LED blink as a good getting started sample project.

When I'm trying out an IoT platform or tiny microcontroller I have my own "Hello World" project - I try to build a simple system that tells me "Did I leave the garage door open?"

I wanted to see how hard it would be to use an Azure IoT MXChip DevKit to build this little system. The DevKit is small and thin but includes WiFi, an OLED display, a headphone jack, a microphone, and sensors for temperature, humidity, motion, and pressure. The kit isn't super expensive given all it does and you can buy it most anywhere. The DevKit is also super easy to update and it's actively developed. In fact, I just updated mine to Firmware 1.6.2 yesterday and there is an Azure IoT Device Workbench Extension for VS Code. There is also a fantastic IoT DevKit Project Catalog you should check out.

I wanted to use this little Arduino friendly device and have it talk to Azure. My goal was to see how quickly and simply I could make a solution that would:

Detect if my garage door is open
If it's open for more than 4 minutes, text me

Later, perhaps I'll figure out how to reply to the text or take an action to close the door remotely.

However, there is an Azure IoT Hub and there's Azure IoT Central, and this was initially confusing to me. It seems that Azure IoT Hub is an individual Azure service but it's not an end-to-end IoT solution - it's a tool in the toolbox. Azure IoT Central, on the other hand, is a browser-based system with templates that is a SaaS (Software as a Service) and hides most of the underlying systems. With IoT Central, no coding is needed!

Azure IoT Central: What is Azure IoT Central?
IoT solution accelerators: What are Azure IoT solution accelerators?
IoT Hub: What is Azure IoT Hub?

Slick. I was fully prepared to write Arduino code to get this garage door sensor working but if I can do it with no code, rock on. I may finish this before lunch is over. I have an Azure account so I went to https://azureiotcentral.com and created a new Application. I chose Pay as You Go but it's free for the first 5 devices so, swag.

Create a New Azure IoT Central App

You should totally check this out even if you don't have an IoT DevKit because you likely DO have a Raspberry Pi and it totally has device templates for Pis or even Windows 10 IoT Core Devices.

Azure IoT Central

Updating the firmware for the IoT DevKit couldn't be easier. You plug it into a free USB port, it shows up as a disk drive, and you drag in the new (or alternate) firmware. If you're doing something in production you'll likely want to do OTA (Over-the-air) firmware updates with Azure IoT Hub automatic device management, so it's good to know that's also an option. The default DevKit firmware is fun to explore but I am connecting this device to Azure (and my Wifi) so I used the firmware and instructions from here which is firmware specific to Azure IoT Central.

The device reboots as a temporary hotspot (very clever) and then you can connect to its wifi, and then it'll connect to yours over WPA2. Once you're connected to Wifi, you can add a new Real (or Simulated - you can actually do everything I'm doing here without a real device!) device using a Device ID that you'll pair with your MXChip IoT DevKit. After it's connected you'll see tons of telemetry pour into Azure. You can, of course, choose what you want to send and send just the least amount your project needs, but it's still a very cool first experience to see temp, humidity, and on and on from this little device.

MxChip in Azure

Here's a wonderful HIGH QUALITY diagram of my planned garage door system. You only wish your specifications were this sophisticated. ;)

Basically the idea is that when the door is closed I'll have the IoT DevKit taped to the door with a battery, then when it opens it'll rotate 90 degrees and the Z axis of the Accelerometer will change! If it stays there for more than 5 minutes then it should text me!

image

In Azure IoT Central I made a Device Template with a Telemetry Rule that listens to the changes in the accelerometer Z and, when the average is less than 900 (I figured this number out by moving it around and testing), fires an Action.
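That rule needed no code in IoT Central, but if you're curious what it amounts to, here's a minimal C# sketch of the same windowed-average threshold - my own illustration with made-up names and window size, not anything IoT Central generates:

using System;
using System.Collections.Generic;
using System.Linq;

class GarageDoorRule
{
    // Keep a sliding window of recent accelerometer Z readings
    static readonly Queue<double> Window = new Queue<double>();

    // Door counts as "open" when the windowed average Z drops below 900
    public static bool IsDoorOpen(double accelerometerZ, int windowSize)
    {
        Window.Enqueue(accelerometerZ);
        if (Window.Count > windowSize)
            Window.Dequeue();
        return Window.Average() < 900;
    }

    static void Main()
    {
        // Simulate the door opening: Z falls from ~1000 (closed) to ~100 (open)
        foreach (var z in new double[] { 1000, 1000, 120, 110, 100, 95 })
            Console.WriteLine($"z={z}, open={IsDoorOpen(z, windowSize: 3)}");
    }
}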

The "Action" is using an Azure Monitor action group that can either SMS or even call me voice!

In this chart when the accelerometer is above the line the garage door is closed and when it drops below the line it's open!

The gyroscope Z changing with time

Here's the Azure Monitoring alert that texts me when I leave the garage door open too long.

Azure Activity Monitor

And here's my alert SMS!

mxchip

I was very impressed I didn't have to write any code to pull this off. I'm going to try this same "Hello World" later with custom code using an AdaFruit Huzzah Feather and an ADXL345 Accelerometer. I'll write Arduino C code and still have it talk to Azure for Alerts.

It's amazing how clean and simple the building blocks are for projects like this today.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Software Defined Radio is a great way to bridge the physical and the digital and teach STEM

Apr 25, 2019

Description:

Software Defined Radio Adapter

One of the magical technologies that makes an Open Source Artificial Pancreas possible is "Software-defined Radio" or SDR. I have found that SDR is one of those technologies that you've either heard of and agree it's amazing or you've literally never heard of it. Well, buckle up, friends!

There's an amazing write up by Pete Schwamb, one of the core members of the community who works on Loop full time now, on how Software Defined Radios have allowed the community to "sniff" the communication protocols of insulin pumps in the RF spectrum and reverse engineer the communications for the Medtronic and now Omnipod Eros Insulin Pumps. It's a fascinating read that really illustrates how you just need the right people and a good cause and you can do anything.

In his post, Pete explains how he configured the SDR attached to his computer to listen into the 433MHz range and capture the RF (radio frequencies) coming to and from an insulin pump. He shows how the shifts between a slightly higher and slightly lower frequency are used to express 1s and 0s, just like a high voltage is a 1 and a low or no voltage is a 0.

Radio Frequency to 1s and 0s
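To make that concrete, here's my own tiny C# illustration (not Pete's code, and the numbers are made up): in 2-FSK, any sample above the center frequency is a 1 and anything below is a 0.

// My own illustrative sketch of 2-FSK decoding - not Pete's actual code.
// Given instantaneous frequency estimates from an SDR, anything above the
// center frequency is a 1 and anything below is a 0.
using System;
using System.Linq;

class FskSketch
{
    static void Main()
    {
        double centerHz = 433_920_000; // hypothetical center frequency
        double[] frequencySamples =    // hypothetical measurements, one per symbol
        {
            433_950_000, 433_890_000, 433_950_000, 433_950_000, 433_890_000
        };

        int[] bits = frequencySamples.Select(f => f > centerHz ? 1 : 0).ToArray();
        Console.WriteLine(string.Join("", bits)); // prints 10110
    }
}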

Then he gets a whole "packet," plucks it out of the thin air, and manipulates it from Python. Insert Major Motion Picture Programmer Montage and an open source pancreas pops out the other side.

1s and 0s from RF into a string in Python

Lemme tell you, Dear Reader, Hello World is nice, but pulling binary data out of electromagnetic radiation with wavelengths longer than infrared light is THE HOTNESS.

From a STEM perspective, SDR is more fun than Console Apps when educating kids about the world and it's a great way to make the abstract REAL while teaching programming and science.

You can get an SDR kit for as little as US$20 as a USB device. They are so simple and small it's hard to believe they work at all.

Just plug it in and download Airspy (formerly SDRSharp - there are many choices in the SDR space) and run the install-rtlsdr.bat to set up a few drivers.

You'll want to run zadig.exe and change the default driver for listening to radio (FM, TV) over to something more low-level. Run it, select "List All Interfaces," and select "Bulk Interface 0"

Updating SDR with Zadig

After you hit Replace Driver with WinUSB, you can close this and run SDRSharp.exe.

I've set my SDRSharp to WFM (FM Radio) and turned the Gain up and OMG it's the radio.

Listening to the Radio with SDR

In this pic I'm listening to 91.5 FM in Portland, Oregon which is National Public Radio. The news is the center red line moving down, while the far right is 92.3, a rock station, and 90.7 on the far left is more jazz. You can almost see it!

AdaFruit has a great SDR tutorial and I'll use it to find the local station for National Weather Radio. This is the weather alert that is available anywhere here in America. Mine was Narrowband FM (NFM) at 162.550 MHz! It was harder to hear but it was there when I turned up the gain.

The weather report

But wait, it's more than radio, it's the whole spectrum!

Here I am sending a "Get Pump Model" command to my insulin pump in the 900MHz range! The meaty part is in the red.

Talking to an Insulin Pump

Here's the heartbeat and requests that are sent to my Insulin Pump from my Loop app through a RileyLink (BT to RF Bridge). I'm seeing the Looping communications of my Open Source Artificial Pancreas here, live.

Watching RF Pump Communications

Next post or two I'll try to get the raw bits off of the RF signal of something interesting. If you haven't messed with SDR you should really give it a try! As I said before, you can get an SDR kit for as little as US$20 as a USB device.

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!
© 2018 Scott Hanselman. All rights reserved.
     

Open Source Artificial Pancreases will become the new standard of care for Diabetes in 2019

Apr 23, 2019

Description:

Loop is an open source pancreas for iPhone

I've been a Type 1 diabetic for over 25 years. Diabetes sucks. They actually give you an award for staying alive for years on insulin. Diabetics don't usually die of old age, they die of heart disease or stroke, kidney failure, and while they're at it they may go blind, get nerve damage, amputation, and a bunch of other stuff. It used to be a death sentence but when insulin was introduced as a treatment in 1921, there was a chance for something new.

The idea is that if you keep your blood sugars close to normal - if you can simulate your non-working pancreas - you'll dodge all those complications and live long enough to get hit by an ice cream truck! At least, that's how I hope I go. :)

Early on it was boiling big-gauge steel needles and pork insulin to dose, and peeing on a stick to get a sense of sugar levels. Then it was a dozen finger pricks a day and a half dozen manual shots with a syringe. Then it was inserted continuous glucose meters and insulin pumps that - while not automatic - mean less invasive treatment and greater control.

Today, we are closing the loop. What's the loop? It's this:

Consider my glucose levels, what I'm about to eat, and what I'm about to do (and dozens of other environmental factors)
Dose myself with insulin
GOTO 1. Every few hours, or every few minutes, depending on the situation.

I do that. Manually. Every diabetic does, and the mental pressure - the intense background psychic weight of it all - is overwhelming. We want to lower the cognitive load of diabetes. This is a disease where you may not live as long if you're not good at math. Literally. That's unfair.

The community is "looping" by allowing an algorithm to make some of those decisions for me.

I've personally been looping with an open source artificial pancreas for over two years. It's night and day from where I started with finger sticks and a half dozen needle sticks a day. It's not perfect, it's not automatic, but Open Source Pancreas are "Tesla autopilot for diabetes." It doesn't always park the car right or stop at every stop light, but it works very hard to keep me in-between the lines and going straight ahead and now that I have it, I can't imagine living without it.

I sleep through the night while my Loop makes tiny adjustments every five minutes to keep my sugars as flat as possible. I don't know about you but my pancreas sits on my nightstand.

It's happening and it can't be stopped

Seven years ago I wrote about The Sad State of Diabetes Technology in 2012. Three years ago The Promising State of Diabetes Technology in 2016 and last year The Extremely Promising State of Diabetes Technology in 2018. There's a great comment from the first blog post in 2012 where Howard Loop shared his frustration with the state of things. Unlike most commenters on the Internet, amazingly Howard took action and started the Tidepool Organization! Everything in his comment from 7 years ago is happening.
Great article, Scott. You've accurately captured the frustration I've felt since my 12 year old daughter was diagnosed with T1D nine months ago. She also wears a pump and CGM and bravely performs the ritual you demonstrate in your video every three days. The technology is so retro it's embarrassing.

It's 2019 and things are really looking up. The open source DIY diabetes community is thriving. There are SEVERAL open pancreas systems to choose from and there's constant innovation happening with OpenAPS and Loop/LoopKit.

OpenAPS runs on devices like Raspberry Pi Zeros and is a self-contained pancreas with the communications and brain/algorithm all on the main device. Loop runs on an iPhone and uses a "RileyLink" device that bridges the RF (Radio Frequency) insulin pump communications with modern Bluetooth.

The bad part is I'm running a 15-year-old, out-of-warranty, cracked insulin pump I bought on Craigslist. Most new pumps are locked down, and my old pump is the last version that supported remote control. However, the Loop open source project announced support for a second pump this week, the OmniPod Eros. This is the first time an "in warranty" pump has been supported, and it also proves the larger point made by the diabetes community. We Are Not Waiting. We want open choice and open data that put us in control.

Read about the history of Loop by original developer Nate Racklyeft. As he points out, a thing like Loop or OpenAPS is the result of a thousand little steps and innovation by countless community members who are so generous with their time.

The first system to run it was a Raspberry Pi; the code was a series of plugins, written with the help of Chris Hannemann, to the openaps toolkit developed by Ben West in collaboration with Dana Lewis and Scott Leibrand. I’m still in awe of the elegant premise in Ben’s design: a system of repeatable, recordable, and extendable transform commands, all backed by Git. The central plugin of the toolkit is decocare: Ben’s 5-year magnum opus, a reverse-engineered protocol of the Minimed Carelink USB radio to command insulin pumps.

There's an amazing write up by Pete Schwamb, one of the core members of the community who works on Loop full time now, on how Software Defined Radios have allowed the community to "sniff" the communication protocols of insulin pumps in the RF spectrum and reverse engineer the communications for the Medtronic and now Omnipod Eros Insulin Pumps. It's a fascinating read that really illustrates how you just need the right people and a good cause and you can do anything.

You can watch my video presentation "Solving Diabetes with an Open Source Artificial Pancreas" where I offer an overview of the problem, a number of solutions offered over the years, and two open source pancreas options in the form of LoopKit and OpenAPS.

The community members and organizations like Tidepool and the Nightscout Foundation are working with the FDA to take projects and concepts like an open source pancreas system from a threat based on years of frustration to a bright future based on mutual collaboration!

In March, 2018, the FDA announced a de novo iCGM (integrated CGM) designation. A de novo designation is the FDA process for creating new device classifications, in this case moving qualifying CGMs from Class-III, the highest FDA risk classification, to Class-II with Special Controls. The first CGM to get this designation is the Dexcom G6.

Diabetic Xbox Avatar

What does this mean? It means the FDA is willing to classify continuous glucose meters in a formal way that paves a path towards interoperable devices. Today we hack devices to build these Loops with out-of-warranty pumps. We are doing this utterly on our own. It can take months to collect the equipment needed, get ancient pumps on the gray market, compile the software yourself - which is a huge hurdle for the non-technical.

Imagine a future where someone could buy a supported and in-warranty "iPump," download an officially supported app or package, and start looping! We could have a world of open and interoperable devices and swappable algorithms.

In October of 2018 the non-profit Tidepool organization announced its intent to deliver the Loop app as a supported and FDA-regulated mobile app in the Apple App Store! This is happening, people, but we are just getting started.

To learn more, start reading.

Loop - https://loopkit.github.io/loopdocs/
OpenAPS - https://openaps.org/
Tidepool - https://www.tidepool.org/

Also, if you're diabetic, consider buying a Nightscout Xbox Avatar accessory so you can see yourself represented while you game!

Sponsor: Suffering from a lack of clarity around software bugs? Give your customers the experience they deserve and expect with error monitoring from Raygun.com. Installs in minutes, try it today!


© 2018 Scott Hanselman. All rights reserved.
     

Exploring DNS with the .NET Core based Technitium DNS Server

Apr 18, 2019

Description:

Earlier this week I talked about how Your Computer is not a Black Box and I spent some time in TCPView and at the command line exploring open ports on my computer. I was doing this in order to debug an issue with a local DNS server I was playing with, so I thought I'd take a moment and look at that server itself.

The Technitium DNS Server is a personal local DNS server (FOSS on GitHub) written in C# and it runs on Windows, macOS, Linux, Raspberry Pi, etc. I downloaded the Portable app.

For Windows folks who aren't used to .tar.gz files, remember to "eXtract Zie Files!" with "tar -xzvf DnsServerPortable.tar.gz -C ./TechnitiumDNS/" and it's also worth reminding you all that tar.exe, curl.exe, wget.exe and more are all included in Windows 10 and have been since 2017. If that's too hard, use 7zip.

Technitium DNS is pretty cool, you just unzip/tar it and run start.sh or start.bat and it "just works." Of course, I did have a process already on port 53 - DNS - so I did a little debugging, but that was my fault.

Here's the local web UI that you can use to administer the server locally. You can forward to whatever upstream DNS server you'd like, with the added bonus that the forwarder can be DNS over HTTPS so you can use things like Cloudflare, Google, or Quad9. Using DNS over HTTPS means your DNS lookups are encrypted in transit (and can still be validated with DNSSEC), making them far more secure and private than regular DNS over UDP/TCP.
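If you want to see DNS over HTTPS for yourself, here's a minimal C# sketch - my own example, not Technitium code - that asks Cloudflare's public DoH JSON endpoint for an A record:

// My own minimal DoH example - not Technitium code.
// Cloudflare's resolver answers DNS queries over HTTPS as JSON.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class DohSketch
{
    static async Task Main()
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("accept", "application/dns-json");

        string json = await client.GetStringAsync(
            "https://cloudflare-dns.com/dns-query?name=hanselman.com&type=A");

        Console.WriteLine(json); // raw JSON answer with the A records
    }
}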

Technitium also includes support for DNS Sinkholes (similar to how I use my Pi-Hole) and Block List URLs. It'll automatically download block lists daily and block ads.

Technitium is a lovely .NET Core based DNS Server

It's also educational to try running your own DNS server and it's fun to read the code! The code for Technitium's DNS Server is up at https://github.com/TechnitiumSoftware/DnsServer and is super interesting from a networking perspective, but also from a C# perspective. It's a very interesting example of some .NET Core code at a very low level and I'm thrilled that it works on every operating system.

There are even bash scripts for setting Technitium up on your Raspberry Pi or Ubuntu to make it easy. If you are using Windows and don't care about .NET Core you can use the .NET that's included with Windows, and Technitium has a Tray app and Installer as well.

Some of the code isn't "idiomatic" C#/.NET Core but it's interesting to read about. The main DnsWebService.cs is pretty intense as it doesn't use any ASP.NET Core routing or primitives. It's a complete webserver written using only System.Net and its own support libraries, along with some of the lower-level Newtonsoft.Json libraries.

The main DnsServer is also quite low level and very performant. It lives in DnsServer.cs. It opens up n sockets (depending on how many ports you bind to) and starts accepting connections here. DNS Datagrams start getting parsed here, right off the stream. The supporting libraries and networking helper code lives over at https://github.com/TechnitiumSoftware/TechnitiumLibrary which is a wealth of interesting and useful code covering BitTorrent, Mail, and Firewall management. There's a ton of OO representations of networking concepts, and all the DNS records are parsed manually.
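If you've never looked at a raw DNS datagram, even the fixed 12-byte header is instructive. Here's a tiny C# sketch - my own illustration with made-up bytes, not Technitium's parser - that reads the big-endian ID and question count:

// My own illustrative sketch of parsing a DNS header - not Technitium's code.
// A DNS message starts with a 12-byte header of big-endian 16-bit fields:
// ID, flags, QDCOUNT (questions), ANCOUNT (answers), NSCOUNT, ARCOUNT.
using System;

class DnsHeaderSketch
{
    static ushort ReadUInt16BE(byte[] buffer, int offset) =>
        (ushort)((buffer[offset] << 8) | buffer[offset + 1]);

    static void Main()
    {
        // The first 12 bytes of a hypothetical captured datagram
        byte[] datagram = { 0xAB, 0xCD, 0x01, 0x00, 0x00, 0x01,
                            0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };

        Console.WriteLine($"ID: 0x{ReadUInt16BE(datagram, 0):X4}");   // 0xABCD
        Console.WriteLine($"Questions: {ReadUInt16BE(datagram, 4)}"); // 1
    }
}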

Technitium has a DNS Server, client, Mac Address Changer, and open source instant messenger. The developer is extremely prolific. They even host a version of "Get HTTPS for free" that works with Windows and makes getting Let's Encrypt certificates super easy.

Anyway, I've been enjoying exploring DNS again, reminding myself that it still works great (I learned about DNS by sniffing packets in networking class) and that it's been updated and improved with caching, DNSSEC, DNS over HTTPS and more in the years since.

Here I've set my IPv4 DNS to 127.0.0.1 and my IPv6 DNS to ::1, then I run NSLookup and try some domain lookups.

Looking up domains at the command line with nslookup

Again, to be clear, the local DNS server took these lookups and then forwarded them upstream to another server. However, you have the choice for your upstream lookups to be done over whatever protocols you want, you can use Google, OpenDNS, Quad9 (with DNSSEC or without), and on and on.

Are you running your own DNS Server?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Your computer is not a black box - Understanding Processes and Ports on Windows by exploring

Apr 16, 2019

Description:

TCPView

I did a blog post many years ago reminding folks that The Internet is not a Black Box. Virtually nothing is hidden from you. The same is true for your computer, whether it runs Linux, Mac, or Windows.

Here's something that happened today at lunch. I was testing a local DNS Server (more on this on Thursday) and I started it up...and it didn't work.

In order to test a DNS server on Windows, you can go to the command line and run "nslookup" then use the command "server 1.1.1.1" where 1.1.1.1 is the DNS server you'd like to try out. Go ahead and try it now. Run cmd.exe or powershell.exe and then run "nslookup" and then type any domain name. You should get an IP address.

Given that I was trying to run a DNS Server on localhost:53 (Port 53 is where DNS usually hangs out, just like Port 80 is where Web Servers (HTTP) hang out and Port 443 is where Secured Web Servers (HTTPS) usually are), I should be able to do this. I'm trying to send DNS requests to localhost:53.

C:\Users\scott> nslookup
Default Server: pihole
Address: 192.168.151.6

> server 127.0.0.1
Default Server: localhost
Address: 127.0.0.1

> hanselman.com
Server: localhost
Address: 127.0.0.1

*** localhost can't find hanselman.com: No response from server
> hanselman.com

Weird, that didn't work. Let me try a DNS Server I know works like Google's 8.8.8.8 public DNS

> server 8.8.8.8
Default Server: google-public-dns-a.google.com
Address: 8.8.8.8

> hanselman.com
Server: google-public-dns-a.google.com
Address: 8.8.8.8

Non-authoritative answer:
Name: hanselman.com
Address: 206.72.120.92

Ok, it seems my local DNS isn't listening on port 53. Checking the logs of the Technitium local DNS server shows this:

[2019-04-15 23:26:31 UTC] [0.0.0.0:53] [UDP] System.Net.Sockets.SocketException (10048): Only one usage of each socket address (protocol/network address/port) is normally permitted
at System.Net.Sockets.Socket.UpdateStatusAfterSocketErrorAndThrowException(SocketError error, String callerName)
at System.Net.Sockets.Socket.DoBind(EndPoint endPointSnapshot, SocketAddress socketAddress)
at System.Net.Sockets.Socket.Bind(EndPoint localEP)
at DnsServerCore.DnsServer.Start() in Z:\Technitium\Projects\DnsServer\DnsServerCore\DnsServer.cs:line 1234
[2019-04-15 23:26:31 UTC] [0.0.0.0:53] [TCP] DNS Server was bound successfully.
[2019-04-15 23:26:31 UTC] [[::]:53] [UDP] DNS Server was bound successfully.
[2019-04-15 23:26:31 UTC] [[::]:53] [TCP] DNS Server was bound successfully.

The DNS Server's process is trying to bind to TCP:53 and UDP:53 using IPv4 (expressed as 0.0.0.0:53, meaning all local IPv4 addresses) and then TCP:53 and UDP:53 using IPv6 (expressed as [::]:53), but it seems like the UDP binding to port 53 on IPv4 failed. Weird.
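That 10048 error is just what a losing Bind race looks like on Windows. Here's a minimal C# repro - my own sketch, unrelated to Technitium's code, using an arbitrary port - where the second UDP bind throws exactly this exception:

// My own minimal repro - not Technitium code. The second Bind to the same
// address/port throws SocketException 10048 ("Only one usage of each socket
// address (protocol/network address/port) is normally permitted") on Windows.
using System;
using System.Net;
using System.Net.Sockets;

class BindRepro
{
    static void Main()
    {
        // 53053 is an arbitrary, hopefully-free port for the demo
        var first = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
        first.Bind(new IPEndPoint(IPAddress.Any, 53053));

        var second = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
        try
        {
            second.Bind(new IPEndPoint(IPAddress.Any, 53053));
        }
        catch (SocketException ex)
        {
            Console.WriteLine($"{ex.ErrorCode}: {ex.Message}"); // 10048 on Windows
        }
    }
}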

Someone else is listening in on Port 53 localhost via IPv4.

That's weird. How can we find out what ports are open locally?

I can run "netstat" and ask Windows for a list of all TCP/IP connections and the processes that are listening on which ports. I'll also PIPE the results to "clip" which will put it in the clipboard automatically. Then I can look at it in a text editor (or I could pipe it through find or findstr).

You can run netstat --help to get the right arguments. I've asked it to tell me the process IDs and all the details it can.
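For reference, the full command was something like this - those are all real netstat switches: -a for all connections, -b for the owning binary's name (needs an elevated prompt), -n for numeric addresses, and -o for the owning process ID.

C:\WINDOWS\system32> netstat -abno | clip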

Active Connections

  Proto  Local Address   State       PID
  TCP    0.0.0.0:53      LISTENING   27456
  [dotnet.exe]

  UDP    0.0.0.0:53      LISTENING   11128
  [svchost.exe]

  TCP    [::]:53         *:*         27456
  [dotnet.exe]

  UDP    [::]:53         *:*         27456
  [dotnet.exe]

Hm, a service is already listening on port 53. I'm running Windows 10, not a Server, so it's odd there's already a DNS listener on port 53.

I wonder what service is it?

I can check the Services Tab of the Task Manager and sort by PID. Or I can run "tasklist" and ask it directly.

C:\WINDOWS\system32>tasklist /svc /fi "pid eq 11128"

Image Name PID Services
========================= ======== ============================================
svchost.exe 11128 SharedAccess

That's Internet Connection Sharing, and it's used by Docker and other apps for NAT translation and routing. I can shut it down with sc (service control) or with "net stop."

C:\WINDOWS\system32>net stop sharedaccess
The Internet Connection Sharing (ICS) service is stopping.
The Internet Connection Sharing (ICS) service was stopped successfully.

Now I can start my DNS Server again (it's written in .NET Core) and I can see with tcpview.exe that it's listening on all appropriate ports.

TCPView showing everything on Port 53

In conclusion, it's a good reminder to refresh yourself on the basics of IPv4, IPv6, how processes talk to/allocate ports, what Process IDs (PIDs) are, and their relationships. Much of this is taught in university computer science courses but if you're self-taught or not doing low-level work every day it's easy to forget.

Virtually nothing on your computer is hidden from you!

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Blocking ads before they enter your house at the DNS level with pi-hole and a cheap Raspberry Pi

Apr 11, 2019

Description:

image

Lots of folks ask me about Raspberry Pis. How many I have, what I use them for. At last count there's at least 22 Raspberry Pis in use in our house.

One runs our DakBoard family dashboard that we built in a weekend but use every day.
We have 3 that are set up for retrogaming - one in a 3D printed Gameboy (a pi-grrl, in fact), one in an X-Arcade Tankstick, and one in a tiny laser-cut arcade case for the desktop.
I have a Raspberry Pi that runs one of my 3D Printers running OctoPrint. This one also has a camera and does time-lapse videos of my 3D prints.
We have another 3 that run little robots my sons and I have built.
6 are running in a local Kubernetes Cluster. These 6 Pis are my personal cloud, so maybe there's 16 Pis in the house and one Pi Cloud/Cluster.
One is an internet radio in the 13 year old's room running PiMusicBox.
One is a touchscreen tablet the 11 year old uses for Scratch. Imagine a Linux iPad.
One runs Kodi as an entertainment center in the kids' play room.
One lives in a CrowPi that we use for experiments and .NET Core remote debugging.
Another three are Raspberry Pi Zero Ws for various experiments, with one Pi Zero W acting as a backup Open Source Artificial Pancreas.
And most recently, one is a Pi-hole. A black hole that eats tracking cookies, advertising, and other bad stuff. See also "shut your pie hole." AKA that place you put pie.

A Pi-hole is a Raspberry Pi appliance that takes the form of a DNS blocker at the network level. You image a Pi, set up your network to use that Pi as a DNS server, and maybe whitelist a few sites when things don't work.

I was initially skeptical, but I'm giving it a try. It doesn't process all network traffic, it's a DNS hop on the way out that intercepts DNS requests for known problematic sites and serves back nothing.

Installation is trivial if you just run unread and untrusted code from the 'net ;)

curl -sSL https://install.pi-hole.net | bash

Otherwise, follow their instructions and download the installer, study it, and run it.

I put my pi-hole installation on the metal, but there's also a very nice Docker Pi-hole setup if you prefer that. You can even go further, if, like me, you have a Synology NAS which can also run Docker, which can in turn run a Pi-hole.

Within the admin interface you can tail the logs for the entire network, which is also amazing to see. You think you know what's talking to the internet from your house - you don't. Everything is logged and listed. After installing the Pi-hole roughly 18% of the DNS queries heading out of my house were blocked. At one point over 23% were blocked. Oy.

NOTE: If you're using an Amplifi HD or any "clever" router, you'll want to change the setting "Bypass DNS cache" otherwise the Amplifi will still remain the DNS lookup of choice on your network. This setting will also confuse the Pi-hole and you'll end up with just one "client" of the Pi-hole - the router itself.

For me it's less about advertising - especially on small blogs or news sites I want to support - it's about just obnoxious tracking cookies and JavaScript. I'm going to keep using Pi-hole for a few months and see how it goes. Do be aware that some things WILL break. Could be a kid's iPhone free-to-play game that won't work unless it can download an ad, could be your company's VPN. You'll need to log into http://pi.hole/admin (make sure you save your password when you first install, and you can only change it at the SSH command line with "pihole -a -p") and sometimes disable it for a few minutes to test, then whitelist certain domains. I suspect after a few weeks I'll have it nicely dialed in.
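All of that is scriptable from the Pi itself too. A few real pihole commands I find handy (the domain here is just a placeholder):

pihole disable 5m     # pause blocking for five minutes while you test
pihole -w example.com # whitelist a domain
pihole -a -p          # change the admin password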

Sponsor: Seq delivers the diagnostics, dashboarding, and alerting capabilities needed by modern development teams - all on your infrastructure. Download at https://datalust.co/seq.
© 2018 Scott Hanselman. All rights reserved.
     

Accessibility Insights for the Web and Windows makes accessibility even easier

Apr 9, 2019

Description:

Accessibility Insights

I recently stumbled upon https://accessibilityinsights.io. There's both a Chrome/Edge extension and a Windows app, both designed to make it easier to find and fix accessibility issues in your websites and apps.

The GitHub for the Accessibility Insights extension for the web is at https://github.com/Microsoft/accessibility-insights-web and they have three trains you can get on:

Canary (released continuously)
Insider (on feature completion)
Production (after validation in Insider)

It builds on top of the Deque Axe core engine with a really fresh UI. The "FastPass" found these issues with my podcast site in seconds - which kind of makes me feel bad, but at least I know what's wrong!

However, the most impressive visualization in my opinion was the Tab Stop test! See below how it draws clear numbered line segments as you Tab from element to element. This is a brilliant way to understand exactly how someone without a mouse would move through your site.

I can easily see what elements are interactive and what's totally inaccessible with a keyboard! I can also see if the tab order is inconsistent with the logical order that's communicated visually.

Visualized Tab Stops as numbered points on a line segment that moves through the DOM

After the FastPass and Tab Visualizations, there's an extensive guided assessment that walks you through 22 deeper accessibility areas, each with several sub issues you might run into. As you move through each area, most have Visual Helpers to help you find elements that may have issues.

Checking for accessible elements on a web site

After you're done you can export your results as a self-contained HTML file you can check in and then compare with future test results.

There is also an Accessibility Insights for Windows if I wanted to check, for example, the accessibility of the now open-source Windows Calculator https://github.com/Microsoft/calculator.

It also supports Tab Stop visualization and is a lot like Spy++ - if you remember that classic developer app. There were no Accessibility issues with Calculator - which makes sense since it ships with Windows and a lot of people worked to make it Accessible.

Instead I tried to test Notepad2. Here you can see it found two elements that can have keyboard focus but have no names. Even cooler, you can click "New Bug" and it will create a new accessibility bug for you in Azure DevOps.

Test Results for Windows apps being checked for accessibility

The Windows app is also open source and up at https://github.com/Microsoft/accessibility-insights-windows for you to explore and file issues! There's also excellent developer docs to get you up to speed on the organization of the codebase and how each class and project works.

You can download both of these free open source Accessibility Tools at https://accessibilityinsights.io and start testing your websites and apps. I have some work to do!

Sponsor: Seq delivers the diagnostics, dashboarding, and alerting capabilities needed by modern development teams - all on your infrastructure. Download at https://datalust.co/seq.


© 2018 Scott Hanselman. All rights reserved.
     

Coders: Context Switching is hard for both computers and relationships

Apr 4, 2019

Description:

Coders: The Making of a New Tribe and the Remaking of the World

Clive Thompson is a longtime contributing writer for the New York Times Magazine and a columnist for Wired and now has a new book out called "Coders."

"Along the way, Coders thoughtfully ponders the morality and politics of code, including its implications for civic life and the economy. Programmers shape our everyday behavior: When they make something easy to do, we do more of it. When they make it hard or impossible, we do less of it."

I'm quoted in the book and I talk about how I've struggled with context-switching.

Here is TechTarget's decent definition of Context Switching:

A context switch is a procedure that a computer's CPU (central processing unit) follows to change from one task (or process) to another while ensuring that the tasks do not conflict. Effective context switching is critical if a computer is to provide user-friendly multitasking.

However, human context switching is the procedure we all have to go through to switch from "I am at work" mode to "I am at home" mode. This can be really challenging for everyone, no matter their job or background, but I propose for certain personalities and certain focused jobs like programming it can be even worse.

Quoting Clive from an ArsTechnica article where he mentions my troubles, emphasis mine:

One of the things that really leapt out is the almost aesthetic delight in efficiency and optimization that you find among software developers. They really like taking something that's being done ponderously, or that's repetitive, and optimizing it. Almost all engineering has focused on making things run more efficiently. Saving labor, consolidating steps, making something easier to do, amplifying human abilities. But it also can be almost impossible to turn off. Scott Hanselman talks about coding all day long and coming down to dinner. The rest of the family is cooking dinner and he immediately starts critiquing the inefficient ways they're doing it: "I've moved into code review of dinner."

Ordinarily a good rule of thumb on the internet is "don't read the comments." But we do. Here's a few from that ArsTechnica thread that are somewhat heartening. It sucks to "suffer" but there's a kind of camaraderie in shared suffering.

With reference to "Scott Hanselman talks about coding all day long and coming down to dinner. The rest of the family is cooking dinner and he immediately starts critiquing the inefficient ways they're doing it: "I've moved into code review of dinner.""

Wow, that rings incredibly true.

That's good to hear. I'm not alone!

I am not this person. I have never been this person.
Then again, I'm more of a hack than hacker, so maybe that's why. I'm one of those people who enjoys programming, but I've never been obsessed with elegance or efficiency. Does it work? Awesome, let's move on.

That's amazing that you have this ability. For some it's not just hard to turn off, it's impossible and it can ruin relationships.

When you find yourself making "TODO" and "FIXME" comments out loud, it's time to take a break. Don't ask me how I know this.

It me.

Yep, here too 2x--both my wife and I are always arguing over the most efficient way to drive somewhere. It's actually caused some serious arguments! And neither one of us are programmers or in that field. (Although I think each of us could have been.)
From the day I was conscious I've been into bin packing and shortest path algorithms--putting all the groceries up in the freezer even though we bought too much--bin packing. Going to that grocery store and back in peak traffic--shortest path. I use these so often and find such sheer joy in them that it's ridiculous, but hey, whatever keeps me happy.

This is definitely a thing that isn't programmer-specific. Learning to let go and to accept that your partner in life would be OK without you is important stuff. My spouse is super competent and I'm sure could reboot the router without me and even drive from Point A to Point B without my nagging. ;)

However we forget these things and we tend to try and "be helpful" and hyper-optimize things that just don't need optimizing. Let it go. Let people just butter their damn bread the way they like. Let them drive a mile out of the way, you'll still get there. We tend to be ruder to our partners than we would be to a stranger.

That’s part of the reason why I’m now making all dinners for my family ;-)

LOL, this is also a common solution. Oh, you got opinions? Here's the spatula!

What do YOU think? How do you context switch and turn work off and try to be present for your family?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

The Transitive Property of Friendship - and the importance of the Warm Intro

Apr 2, 2019

Description:

Too many LinkedIn invitations

Per Wikipedia, "In mathematics, a binary relation ... is transitive if ... element a is related to an element b and b is related to an element c then a is also related to c."

Per Me, if I am cool with you, and you are cool with your friend, then I'm cool with your friend. I've decided this is The Transitive Property of Friendship.

As I try to mentor more and more people and help folks Level Up in tech, I'm realizing how important it is to #BeTheLuck for someone else. This is something that YOU can do - volunteer at local schools, forward that resume for your neighbor, give a Warm Intro to a friend of a friend.

A lot of one's success can be traced back to hard work and being prepared for opportunities to present themselves, but also to Warm Intros. Often you'll hear about someone who worked hard in school, studied, did well, but then got a job because "their parent knew a person who worked at x." That's something that is hard to replicate. For under-represented folks trying to break into tech, for example, it's the difference between your resume sitting in a giant queue somewhere vs. sitting on the desk of the hiring manager. Some people inherit a personal network and a resume can jump to the top of a stack with a single phone call, while others send CV after CV with nary a callback.

This is why The Warm Intro is so important. LinkedIn has tried to replicate this by allowing you to "build your professional network" but honestly, you can't tell if I'm cool with someone on LinkedIn just because they're connected to me. Even Facebook "friends" have changed the definition of friend. It certainly has for me. Now I'm mentally creating friend categories like work colleague, lowercase f friend, Uppercase F Friend, etc.

Here's where it gets hard. You can't help everyone. You also have to protect yourself and your own emotional well-being. This is where cultivating a true network of genuine friends and work colleagues comes in. If your First Ring of Friends are reliable, kind, and professional, then it's safer to assume that anyone they bring into your world has a similar mindset. Thus, The Transitive Property of Friendship - also known as "Any friend of Scott's is a friend of mine." The real personal network isn't determined by Facebook or LinkedIn, it's determined by your gut, your experiences, and your good judgment. If you get burned, you'll be less likely to recommend someone in the future.

I've been using this general rule to determine where and when to spend my time while still trying to Lend my Privilege to as many people as possible. It's important also to not be a "transactional networker." Be thoughtful if you're emailing someone cold (me or otherwise). Don't act like an episode of Billions on Showtime. We aren't keeping score, tracking favors, or asking for kickbacks. This isn't about Amazon Referral Money or Finder's Fees. When a new friend comes into your life via another and you feel you can help, give of your network and time freely. Crack the door open for them, and then let them kick it open and hopefully be successful.

All of this starts by you - we - building up warm, genuine professional relationships with a broad group of people. Then using that network not just for yourself, but to lift the voices and careers of those that come after you.

What are YOUR tips and thoughts on building a warm and genuine personal and professional network of folks?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.


© 2018 Scott Hanselman. All rights reserved.
     

Displaying your realtime Blood Glucose from NightScout on an AdaFruit PyPortal

Mar 29, 2019

Description:

AdaFruit makes an adorable tiny little Circuit Python IoT device called the PyPortal that's just about perfect for the kids - and me. It's a little DakBoard, if you will - a tiny totally programmable display with Wi-Fi and lots of possibilities and sensors. Even better, you can just plug it in over USB and edit the code.py file directly on the drive that will appear. When you save code.py the device soft reboots and runs your code.

I've been using Visual Studio Code to program Circuit Python and it's become my most favorite IoT experience so far because it's just so easy. The "Developer's Inner Loop" of code, deploy, debug is so fast.

As you may know, I use a Dexcom CGM (Continuous Glucose Meter) to manage my Type 1 Diabetes. I feed the data every 5 minutes into an instance of the Nightscout Open Source software hosted in Azure. That gives me a REST API to my own body.

I use that REST API to make "glanceable displays" where I - or my family - can see my blood sugar quickly and easily.

I put my blood sugar in places like:

my git prompt
the color of my keyboard keys
Siri and Alexa
DakBoard family wall mounted dashboards

And today, on a tiny PyPortal device. The code is simple, noting that I don't speak Python, so Pull Requests are always appreciated.

import time
import board
from adafruit_pyportal import PyPortal

# Set up where we'll be fetching data from
DATA_SOURCE = "https://NIGHTSCOUTWEBSITE/api/v1/entries.json?count=1"
BG_VALUE = [0, 'sgv']
BG_DIRECTION = [0, 'direction']

RED = 0xFF0000
ORANGE = 0xFFA500
YELLOW = 0xFFFF00
GREEN = 0x00FF00

def get_bg_color(val):
    if val > 200:
        return RED
    elif val > 150:
        return YELLOW
    elif val < 60:
        return RED
    elif val < 80:
        return ORANGE
    return GREEN

def text_transform_bg(val):
    return str(val) + ' mg/dl'

def text_transform_direction(val):
    if val == "Flat":
        return "→"
    if val == "SingleUp":
        return "↑"
    if val == "DoubleUp":
        return "↑↑"
    if val == "DoubleDown":
        return "↓↓"
    if val == "SingleDown":
        return "↓"
    if val == "FortyFiveDown":
        return "→↓"
    if val == "FortyFiveUp":
        return "→↑"
    return val

# the current working directory (where this file is)
cwd = ("/"+__file__).rsplit('/', 1)[0]
pyportal = PyPortal(url=DATA_SOURCE,
                    json_path=(BG_VALUE, BG_DIRECTION),
                    status_neopixel=board.NEOPIXEL,
                    default_bg=0xFFFFFF,
                    text_font=cwd+"/fonts/Arial-Bold-24-Complete.bdf",
                    text_position=((90, 120),   # VALUE location
                                   (140, 160)), # DIRECTION location
                    text_color=(0x000000,  # sugar text color
                                0x000000), # direction text color
                    text_wrap=(35, # characters to wrap for sugar
                               0), # no wrap for direction
                    text_maxlen=(180, 30), # max text size for sugar & direction
                    text_transform=(text_transform_bg, text_transform_direction),
                    )

# speed up projects with lots of text by preloading the font!
pyportal.preload_font(b'mg/dl012345789')
pyportal.preload_font((0x2191, 0x2192, 0x2193))
# pyportal.preload_font()

while True:
    try:
        value = pyportal.fetch()
        pyportal.set_background(get_bg_color(value[0]))
        print("Response is", value)
    except RuntimeError as e:
        print("Some error occurred, retrying! -", e)

    time.sleep(180)

I've put the code up at https://github.com/shanselman/NightscoutPyPortal. I want to get (or make a custom?) larger BDF (Bitmap Font) that is about twice the size AND includes 45 degree arrows ↗ and ↘, as the font I have is just 24 point and only includes arrows at 90 degrees. Still, great fun and it took just an hour!

NOTE: I used the Chortkeh BDF Font viewer to look at the Bitmap Fonts on Windows. I still need to find a larger 48+ PT Arial.

What information would YOU display on a PyPortal?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.
© 2018 Scott Hanselman. All rights reserved.
     

F7 is the greatest PowerShell hotkey that no one uses any more. We must fix this.

Mar 26, 2019

Description:

Thousands of years ago your ancestors, and I, were using DOS (or CMD), pressing F7 to get this amazing little ASCII box to pop up so we could pick commands we'd typed before.

Holy crap it's a little ASCII box

When I find myself in cmd.exe I use F7 a lot. Yes, I also speak *nix and Yes, Ctrl-R is amazing and lovely and you're awesome for knowing it and Yes, it works in PowerShell.

Ctrl-R for history works in PowerShell

Here's the tragedy. Ctrl-R for a reverse command search works in PowerShell because of a module called PSReadLine. PSReadLine is basically a part of PowerShell now and does countless little command line editing improvements. It also - not sure why and I'm still learning - unknowingly blocks the glorious F7 hotkey.

If you remove PSReadLine (you can do this safely, it'll just apply to the current session)

Remove-Module -Name PSReadLine

Why, then you get F7 history with a magical ASCII box back in PowerShell. And as we all know, 4k 3D VR be damned, impress me with ASCII if you want a developer's heart.

There is a StackOverflow Answer with a little PowerShell snippet that will pop up - wait for it - a graphical list with your command history by calling

Set-PSReadlineKeyHandler -Key F7

And basically rebinding the PSReadlineKeyHandler for F7. PSReadLine is brilliant, but what I really want to do is tell it to "chill" on F7. I don't want to bind or unbind F7 (it's not bound by default), I just want it passed through.
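If you do want the rebound-F7 experience today, the StackOverflow approach looks roughly like this - a sketch of that snippet, not my preferred fix. Out-GridView shows the popup, and the PSConsoleReadLine calls replace your current input with whatever you pick:

# A sketch of the StackOverflow approach - rebind F7 to a graphical history picker
Set-PSReadLineKeyHandler -Key F7 -BriefDescription HistoryPopup -ScriptBlock {
    $command = Get-History | Out-GridView -Title "History" -PassThru
    if ($command) {
        [Microsoft.PowerShell.PSConsoleReadLine]::RevertLine()
        [Microsoft.PowerShell.PSConsoleReadLine]::Insert($command.CommandLine)
    }
}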

Until that day, I, and you, can just press Ctrl-R for our reverse history search, or get this sad shadow of an ASCII box by pressing "h." Yes, h is already aliased on your machine to Get-History.

PS C:\Users\scott> h

  Id CommandLine
   -- -----------
    1 dir
    2 Remove-Module -Name PSReadLine

Then you can even type "r 1" to "invoke-history" on item 1.

But I will still mourn my lovely ASCII (High ASCII? ANSI? VT100?) history box.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



Getting Started with .NET Core and Docker and the Microsoft Container Registry

Mar 22, 2019

Description:

It's super easy to get started with .NET Core and/or ASP.NET Core with Docker. If you have Docker installed you don't need to install anything to try out .NET Core, of course.

To run a little .NET Core console app:

docker run --rm mcr.microsoft.com/dotnet/core/samples:dotnetapp

And the result:

latest: Pulling from dotnet/core/samples
Hello from .NET Core!
...SNIP...

**Environment**
Platform: .NET Core
OS: Linux 4.9.125-linuxkit #1 SMP Fri Sep 7 08:20:28 UTC 2018

To run a quick little ASP.NET Core website just:

docker run -it --rm -p 8000:80 --name aspnetcore_sample mcr.microsoft.com/dotnet/core/samples:aspnetapp

And here it is running on localhost:8000

Simple ASP.NET Core app under Docker

You can also host ASP.NET Core images with Docker over HTTPS with this image, or run ASP.NET Core apps in Windows Containers.

Note that Microsoft teams are now publishing container images to the MCR (Microsoft Container Registry) so they can use the Azure CDN and pull faster when they are closer to you globally. The images start at MCR and then can be syndicated to other container registries.

The new repos follow:

.NET Core Runtime dependencies (just the stuff .NET Core needs, but not .NET Core itself - useful if you want to distribute your own copy and still want a small container image size)
.NET Core Runtime (just what's needed to run a .NET Core app)
.NET Core SDK (includes the compilers, everything)
ASP.NET Core Runtime (everything you need to RUN your ASP.NET Core web app)

When you "docker pull" you can use tag strings for .NET Core, and it works across any supported .NET Core version:

SDK: docker pull mcr.microsoft.com/dotnet/core/sdk:2.1
ASP.NET Core Runtime: docker pull mcr.microsoft.com/dotnet/core/aspnet:2.1
.NET Core Runtime: docker pull mcr.microsoft.com/dotnet/core/runtime:2.1
.NET Core Runtime Dependencies: docker pull mcr.microsoft.com/dotnet/core/runtime-deps:2.1

For example, I can run the .NET Core 3.0 SDK and mess around with it like this:

docker run -it mcr.microsoft.com/dotnet/core/sdk:3.0

I've been using Docker to run my unit tests on my podcast site within a container locally. Then I volume mount and dump the test results out to a local folder and inspect them with Visual Studio:

docker build --pull --target testrunner -t podcast:test .
docker run --rm -v c:\github\hanselminutes-core\TestResults:/app/hanselminutes.core.tests/TestResults podcast:test
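
That "testrunner" is a named stage in a multi-stage Dockerfile. Mine is in the repo, but a minimal sketch of the shape looks something like this (the project and DLL names here are stand-ins, not my exact ones):

# build stage: restore and compile everything
FROM mcr.microsoft.com/dotnet/core/sdk:2.2 AS build
WORKDIR /app
COPY . .
RUN dotnet restore

# test stage: "docker build --target testrunner" stops here
FROM build AS testrunner
WORKDIR /app/tests
ENTRYPOINT ["dotnet", "test", "--logger:trx"]

# publish stage and the final, smaller runtime image
FROM build AS publish
RUN dotnet publish -c Release -o /out

FROM mcr.microsoft.com/dotnet/core/aspnet:2.2 AS runtime
WORKDIR /app
COPY --from=publish /out .
ENTRYPOINT ["dotnet", "myapp.dll"]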

I can then either host the Docker container in Azure App Service for Containers, or as little one-off per-second billed instances with Azure Container Instances (ACI).

Have you been using .NET Core in Docker? How has it been going for you?

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



What is Blazor and what is Razor Components?

Mar 19, 2019

Description:

I've blogged a little about Blazor, showing examples like Compiling C# to WASM with Mono and Blazor then Debugging .NET Source with Remote Debugging in Chrome DevTools as well as very early on asking questions like .NET and WebAssembly - Is this the future of the front-end?

Let's back up and level-set.

What is Blazor?

Blazor is a single-page app framework for building interactive client-side Web apps with .NET. Blazor uses open web standards without plugins or code transpilation. Blazor works in all modern web browsers, including mobile browsers.

You write C# instead of JavaScript, and you can use most of the .NET ecosystem of open source libraries. For the most part, if it's .NET Standard, it'll run in the browser. (Of course if you called a Windows API or a Linux-specific API and it didn't exist in the client-side browser world, it's not gonna work, but you get the idea.)

The .NET code runs inside the context of WebAssembly. You're running "a .NET" inside your browser on the client-side with no plugins, no Silverlight, Java, Flash, just open web standards.

WebAssembly is a compact bytecode format optimized for fast download and maximum execution speed.

Here's a great diagram from the Blazor docs.

Blazor runs inside your browser, no plugins needed

Here's where it could get a little confusing. Blazor is the client-side hosting model for Razor Components. I can write Razor Components. I can host them on the server or host them on the client with Blazor.

You may have written Razor in the past in .cshtml files, or more recently in .razor files. You can create and share components using Razor - which is a mix of standard C# and standard HTML, and you can host these Razor Components on either the client or the server.

In this diagram from the docs you can see that the Razor Components are running on the server and SignalR (over WebSockets, etc.) is remoting them and updating the DOM on the client. This doesn't require WebAssembly on the client; the .NET code runs in the .NET Core CLR (Common Language Runtime) and has full compatibility - you can do anything you'd like, as you are no longer limited by the browser's sandbox.

Here's Razor Components running on the server

Per the docs:

Razor Components decouples component rendering logic from how UI updates are applied. ASP.NET Core Razor Components in .NET Core 3.0 adds support for hosting Razor Components on the server in an ASP.NET Core app. UI updates are handled over a SignalR connection.

Here's the canonical "click a button update some HTML" example.

@page "/counter"

<h1>Counter</h1>

<p>Current count: @currentCount</p>

<button class="btn btn-primary" onclick="@IncrementCount">Click me</button>

@functions {
    int currentCount = 0;

    void IncrementCount()
    {
        currentCount++;
    }
}

You can see this running entirely in the browser, with the C# .NET code running on the client side. .NET DLLs (assemblies) are downloaded and executed by the CLR that's been compiled into WASM and running entirely in the context of the browser.

Note also that I'm stopped at a BREAKPOINT in C# code, except the code is running in the browser and mapped back into JS/WASM world.

Debugging Razor Components on the Client Side

But if I host my app on the server as hosted Razor Components, the C# code runs entirely on the server side and the client-side DOM is updated over a SignalR link. Here I've clicked the button on the client side and hit the breakpoint on the server side in Visual Studio. No, there's no POST and no POST-back. This isn't WebForms - it's Razor Components. It's a SPA app written in C#, not JavaScript, and I can change the location of the running logic while the UI remains standard HTML and CSS.

Debugging Razor Components on the Server Side

Looking at how Razor Components and now Phoenix LiveView are offering a new way to manage JavaScript-free stateful server-rendered apps has me realizing it’s the best parts of WebForms where the postback is now a persistent websockets tunnel to the backend and only diffs are sent

— Scott Hanselman (@shanselman) March 16, 2019

It's a pretty exciting time on the open web. There's a lot of great work happening in this space and I'm very interested to see how frameworks like Razor Components/Blazor and Phoenix LiveView change (or don't) how we write apps for the web.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



Xbox Avatar accessories for People with Diabetes! Sponsored by Nightscout and Konsole Kingz

Mar 14, 2019

Description:

My Xbox user name is Glucose for a reason.

This is a passion project of mine. You've likely seen me blog about diabetes for many many years. You may have enjoyed my diabetes hacks like lighting up my keyboard keys to show me my blood sugar, or some of the early work Ben West and I did to bridge Dexcom's cloud with the NightScout open source diabetes management system.

Recently Xbox announced new avatars! They look amazing and the launch was great. They now have avatars in wheelchairs, ones with artificial limbs, and a wide variety of hair and skin tones. This is fantastic as it allows kids (and adults!) to be seen and be represented in their medium of choice, video games.

I was stoked and immediately searched the store for "diabetes." No results. No pumps, sensors, emotes, needles, nothing. So I decided to fix it.

NOW AVAILABLE: Go and buy the Nightscout Diabetes CGM avatar on the Xbox Store now!

I called two friends - my friends at the Nightscout Foundation, dedicated to open source and open data for people with diabetes, as well as my friends at Konsole Kingz, digital avatar creators extraordinaire with over 200 items in the Xbox store from kicks to jerseys and tattoos.

And we did it! We've added our first diabetes avatar top with some clever coding from Konsole Kingz. It's categorized as a top, but it gives your avatar not only a Nightscout T-shirt in your choice of colors, but also a CGM (Continuous Glucose Meter) on your arm!

Miss USA has a CGM

For most diabetics, CGMs are the hardware implants we put in weekly to tell us our blood sugar with minimal finger sticks. They are the most outwardly obvious physical manifestation of our diabetes and we're constantly asked about them. In 2017, Miss USA contestant Krista Ferguson made news by showing her CGM rather than hiding it. This kind of visible representation matters to kids with diabetes - it tells them (and us) that we're OK.

You can find the Nightscout CGM accessory in a number of ways. You can get it online at the Xbox Avatar shop, and once you've bought it, it'll be in the Purchased tab of the Xbox Avatar Editor, under Closet | Tops.

You can even edit your Xbox Avatar on Windows 10 without an Xbox! Go pick up the Xbox Avatar Editor and install it (on both your PC and Xbox if you like) and you can experiment with shirt and logo color as well.

Consider this a beta release. We are working on improving resolution and quality, but what we really want to know is this - do you want more diabetes Xbox Avatar accessories? Insulin pumps on your belt? An emote to check your blood sugar with a finger stick?

Diabetes CGM on an Xbox avatar

If this idea is a good one and is as special to you and your family (and the gamers in your life with diabetes), please SHARE it. Share it on social media, tell your friends, spread the news. Profits from this avatar item will go to the Nightscout Foundation!

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.

How to stream PC games from Windows 10 to your Xbox One for free

Mar 12, 2019

Description:

Xbox is ready for you to connect to wirelessly

I've been really enjoying my Xbox lately (when the family is asleep) as well as some fun Retrogaming on original consoles. Back in 2015 I showed how you can stream from your Xbox to any PC using the Xbox app from the Windows Store. You can pair your Xbox controller with any PC you've got around (either with the $20 Xbox Wireless Adapter or just with a micro-USB cable you likely have already). In fact, I often walk on the treadmill while streaming games from the Xbox to my little Surface Pro 3.

Then, a year later I did the inverse. I played PC games on my big screen using a SteamLink! Although they've been discontinued, they are out there and they work great. This little box lets you play PC games remotely on your large screens. I have a big PC in my office and I wanted to use the big TV in the living room. The game still runs on the PC but the video/audio and controls are all remoted to the Xbox. Plus, SteamLink only works with the Steam app running and is optimized for Steam games. It's a single task box and one more thing to plug into HDMI but it works well.

Fast-forward to today and I learned that Windows 10 can project its screen to an Xbox One AND you can use your Xbox One controller to control it (it's paired on the Xbox side) and play games or run apps. No extra equipment needed.

I installed the Xbox Wireless Display App on my Xbox One. Then on my PC, here's what I see upon pressing Win+P and clicking "Connect to Wireless Display."

Connected to Xbox One

Once I've duplicated my screen, you can see here I'm writing this blog post wirelessly projected to the Xbox. It just worked, and took about 5 minutes to set up.

If you're tech savvy, you may say, isn't this "just Miracast" and "hasn't this always been possible?" Yes and no. What's been updated is the Xbox Wireless Display App that you'll want to install and run on your Xbox. You may have been able to project your PC screen to various sticks and Miracast adapters, but this free app makes your Xbox a receiver for Miracast broadcasts (over wifi or LAN) and most importantly - now you can use your Xbox controller already paired to the Xbox to control the remote PC. You can use that control to play games or switch to mouse control mode with Start+Select and mouse around with your Xbox thumbsticks!


If I hit the menu button I can see how the controllers map to PC controls. No remote keyboard and mouse connected from the Xbox...yet. (and to be clear, no word if that will ever be supported but it'd be cool!)

Controller Mapping for PC to Xbox

To make sure you can do this, run DxDiag and save all information into "DxDiag.txt." Here's part of mine. There's nothing special about my machine. It's worth pointing out I have no Wifi adapter on this machine and it has an NVidia 1080 video card. Miracast is happening over the Wired LAN (local area network) in my house. This is Miracast over Infrastructure and it's in Windows 10 since version 1703 (March 2017).

------------------
System Information
------------------
Machine name: IRONHEART
Operating System: Windows 10 Pro 64-bit
Processor: Intel(R) Core(TM) i9-7900X CPU @ 3.30GHz (20 CPUs), ~3.3GHz
Memory: 32768MB RAM
DirectX Version: DirectX 12
User DPI Setting: 144 DPI (150 percent)
System DPI Setting: 144 DPI (150 percent)
Miracast: Available, with HDCP

When you've connected your PC to your Xbox and are streaming FROM your PC TO your Xbox, you'll see this bar at the top on the PC side. There are three optimization settings for Gaming, Working, and Watching Videos. I assume these balance crispness/quality against framerate and latency.

Gaming, Working, Watching Videos

Now let's take it to the next level. I can run Steam Big Picture and here I am running Batman: Arkham Origins on my PC, but played on and controlled from my Xbox in the other room!

Ok this is amazing. You run “wireless display” on your Xbox. Then on your PC, just WinKey+P on your PC and connect to wireless display. This is Batman on my Xbox...RUNNING ON MY PC pic.twitter.com/pyxmHLe3fz

— Scott Hanselman (@shanselman) March 14, 2019

I like that I don't need the SteamLink. I find that this runs more reliably and more easily than my original set up. I like that I can switch the Xbox controller from controller mode to mouse mode. And most of all I like that this doesn't require any custom setup, extra work, or drivers. It just worked out of the box for me.

Your mileage may vary, and I'm trying to figure out why some people's video card drivers don't allow this and they end up with no "Connect to a Wireless Display" option in their Win+P menu. If you figure it out, please sound off in the comments.

Give it a try! I hope you enjoy it. I'm having a blast.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



How to parse string dates with a two digit year and split on the right century in C#

Mar 7, 2019

Description:

So you've been asked to parse some dates, except the years are two digit years. For example, dates like "12 Jun 30" are ambiguous...or are they?

If "12 Jun 30" is intended to express a birthday, given it's 2019 as of the writing of this post, we can assume it means 1930. But if the input is "12 Jun 18", is that last year, or is that a 101 year old person's birthday?

Enter the Calendar.TwoDigitYearMax property.

For example, if this property is set to 2029, the 100-year range is from 1930 to 2029. Therefore, a 2-digit value of 30 is interpreted as 1930, while a 2-digit value of 29 is interpreted as 2029.

The initial value for this property comes out of the DEPTHS of the region and languages portion of the Control Panel. Note that way down there, in "additional date, time, & regional settings," under "more settings" on the "date" tab, there's a setting that (currently) splits on 1950 and 2049.

Two Digit Year regional settings

If you're writing a server-side app that parses two digit dates you'll want to be conscious and explicit about what behavior you WANT so that you're not surprised.

Setting TwoDigitYearMax sets a 100 year RANGE that your two digit years will be interpreted to be within. You can also just change it on the current thread's current culture's calendar. It's up to you.

For example, this little app:

using System;
using System.Globalization;

string dateString = "12 Jun 30"; //from user input
DateTime result;
CultureInfo culture = new CultureInfo("en-US");
DateTime.TryParse(dateString, culture, DateTimeStyles.None, out result);
Console.WriteLine(result.ToLongDateString());

culture.Calendar.TwoDigitYearMax = 2099;

DateTime.TryParse(dateString, culture, DateTimeStyles.None, out result);
Console.WriteLine(result.ToLongDateString());

gives this output:

Thursday, June 12, 1930
Wednesday, June 12, 2030

Note that I've changed TwoDigitYearMax and moved the window up to the 2000-2099 range, so "30" is now assumed to be 2030, within that 100 year range.
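
And if you'd rather flip this on the current thread's culture (as mentioned above) rather than a new CultureInfo, here's a minimal sketch - you have to Clone first, since the current culture and its calendar are typically read-only:

using System.Globalization;

// clone the current culture so its Calendar becomes writable
CultureInfo clone = (CultureInfo)CultureInfo.CurrentCulture.Clone();
clone.Calendar.TwoDigitYearMax = 2099;
CultureInfo.CurrentCulture = clone; // parses on this thread now use the 2000-2099 window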

Hope this helps!

Sponsor: Stop wasting time trying to track down the cause of bugs. Sentry.io provides full stack error tracking that lets you monitor and fix problems in real time. If you can program it, we can make it far easier to fix any errors you encounter with it.



Converting an Excel Worksheet into a JSON document with C# and .NET Core and ExcelDataReader

Mar 6, 2019

Description:

Excel isn't a database, except when it is

I've been working on a little idea where I'd have an app (maybe a mobile app with Xamarin or maybe a SPA, I haven't decided yet) for easily accessing and searching across the 500+ videos from https://azure.microsoft.com/en-us/resources/videos/azure-friday/

HOWEVER. I don't have access to the database that hosts the metadata and while I'm trying to get at least read-only access to it (long story) the best I can do is a giant Excel spreadsheet dump that I was given that has all the video details.

This, of course, is sub-optimal, but regardless of how you feel about it, it's a database. Or, a data source at the very least! Additionally, since it was always going to end up as JSON in a cached in-memory database anyway, it doesn't matter much to me.

In real-world business scenarios, sometimes the authoritative source is an Excel sheet, sometimes it's a SQL database, and sometimes it's a flat file. Who knows?

What's most important (after clean data) is that the process one builds around that authoritative source is reliable and repeatable. For example, if I want to build a little app or one page website, yes, ideally I'd have a direct connection to the SQL back end. Other alternative sources could be a JSON file sitting on a simple storage endpoint accessible with a single HTTP GET. If the Excel sheet is on OneDrive/SharePoint/DropBox/whatever, I could have a small serverless function run when the files changes (or on a daily schedule) that would convert the Excel sheet into a JSON file and drop that file onto storage. Hopefully you get the idea. The goal here is clean, reliable pragmatism. I'll deal with the larger business process issue and/or system architecture and/or permissions issue later. For now the "interface" for my app is JSON.

So I need some JSON and I have this Excel sheet.

Turns out there's a lovely open source project and NuGet package called ExcelDataReader. There's been ways to get data out of Excel for decades. Literally decades. One of my first jobs was automating Microsoft Excel with Visual Basic 3.0 with COM Automation. I even blogged about getting data out of Excel into ASP.NET 16 years ago!

Today I'll use ExcelDataReader. It's really nice and it took less than an hour to get exactly what I wanted. I haven't gone and made it super clean and generic, refactored out a bunch of helper functions, so I'm interested in your thoughts. After I get this tight and reliable I'll drop it into an Azure Function and then focus on getting the JSON directly from the source.

A few gotchas that surprised me. I got a "System.NotSupportedException: No data is available for encoding 1252." Windows-1252 or CP-1252 (code page) is an old school text encoding (it's effectively ISO 8859-1). Turns out newer .NETs like .NET Core need the System.Text.Encoding.CodePages package as well as a call to System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance); to set it up for success. Also, that extra call to reader.Read at the start to skip over the Title row had me pause a moment.

using System;
using System.IO;
using ExcelDataReader;
using System.Text;
using Newtonsoft.Json;

namespace AzureFridayToJson
{
    class Program
    {
        static void Main(string[] args)
        {
            Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

            var inFilePath = args[0];
            var outFilePath = args[1];

            using (var inFile = File.Open(inFilePath, FileMode.Open, FileAccess.Read))
            using (var outFile = File.CreateText(outFilePath))
            {
                using (var reader = ExcelReaderFactory.CreateReader(inFile, new ExcelReaderConfiguration()
                    { FallbackEncoding = Encoding.GetEncoding(1252) }))
                using (var writer = new JsonTextWriter(outFile))
                {
                    writer.Formatting = Formatting.Indented; //I likes it tidy
                    writer.WriteStartArray();
                    reader.Read(); //SKIP FIRST ROW, it's TITLES.
                    do
                    {
                        while (reader.Read())
                        {
                            //peek ahead? Bail before we start anything so we don't get an empty object
                            var status = reader.GetString(0);
                            if (string.IsNullOrEmpty(status)) break;

                            writer.WriteStartObject();
                            writer.WritePropertyName("Status");
                            writer.WriteValue(status);

                            writer.WritePropertyName("Title");
                            writer.WriteValue(reader.GetString(1));

                            writer.WritePropertyName("Host");
                            writer.WriteValue(reader.GetString(6));

                            writer.WritePropertyName("Guest");
                            writer.WriteValue(reader.GetString(7));

                            writer.WritePropertyName("Episode");
                            writer.WriteValue(Convert.ToInt32(reader.GetDouble(2)));

                            writer.WritePropertyName("Live");
                            writer.WriteValue(reader.GetDateTime(5));

                            writer.WritePropertyName("Url");
                            writer.WriteValue(reader.GetString(11));

                            writer.WritePropertyName("EmbedUrl");
                            writer.WriteValue($"{reader.GetString(11)}player");
                            /*
                            <iframe src="https://channel9.msdn.com/Shows/Azure-Friday/Erich-Gamma-introduces-us-to-Visual-Studio-Online-integrated-with-the-Windows-Azure-Portal-Part-1/player" width="960" height="540" allowFullScreen frameBorder="0"></iframe>
                            */

                            writer.WriteEndObject();
                        }
                    } while (reader.NextResult());
                    writer.WriteEndArray();
                }
            }
        }
    }
}
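
Running it is just a matter of passing the input and output paths as arguments - these file names are made up for illustration:

dotnet run -- AzureFridayDump.xlsx AzureFriday.json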

The first pass is on GitHub at https://github.com/shanselman/AzureFridayToJson and the resulting JSON looks like this:

[
  {
    "Status": "Live",
    "Title": "Introduction to Azure Integration Service Environment for Logic Apps",
    "Host": "Scott Hanselman",
    "Guest": "Kevin Lam",
    "Episode": 528,
    "Live": "2019-02-26T00:00:00",
    "Url": "https://azure.microsoft.com/en-us/resources/videos/azure-friday-introduction-to-azure-integration-service-environment-for-logic-apps",
    "EmbedUrl": "https://azure.microsoft.com/en-us/resources/videos/azure-friday-introduction-to-azure-integration-service-environment-for-logic-appsplayer"
  },
  {
    "Status": "Live",
    "Title": "An overview of Azure Integration Services",
    "Host": "Lara Rubbelke",
    "Guest": "Matthew Farmer",
    "Episode": 527,
    "Live": "2019-02-22T00:00:00",
    "Url": "https://azure.microsoft.com/en-us/resources/videos/azure-friday-an-overview-of-azure-integration-services",
    "EmbedUrl": "https://azure.microsoft.com/en-us/resources/videos/azure-friday-an-overview-of-azure-integration-servicesplayer"
  },
  ...SNIP...

Thoughts? There's a dozen ways to have done this. How would you do this? Dump it into a DataSet and serialize objects to JSON, make an array and do the same, automate Excel itself (please don't do this), and on and on.

Certainly this would be easier if I could get a CSV file or something from the business person, but the issue is that I'm regularly getting new drops of this same sheet with new records added. Getting the suit to Save As | CSV reliably and regularly isn't sustainable.

Sponsor: Stop wasting time trying to track down the cause of bugs. Sentry.io provides full stack error tracking that lets you monitor and fix problems in real time. If you can program it, we can make it far easier to fix any errors you encounter with it.



EditorConfig code formatting from the command line with .NET Core's dotnet format global tool

Mar 1, 2019

Description:

"EditorConfig helps maintain consistent coding styles for multiple developers working on the same project across various editors and IDEs." Rather than you having to keep your code in whatever format the team has agreed on, you can check in an .editorconfig file and your editor of choice will keep things in line.

If you're a .NET developer like myself, there's a ton of great .NET EditorConfig options you can set to ensure the team uses consistent Language Conventions, Naming Conventions, and Formatting Rules.

Language Conventions are rules pertaining to the C# or Visual Basic language, for example, var/explicit type, or expression-bodied members. Formatting Rules are rules regarding the layout and structure of your code that make it easier to read, for example, Allman braces, or spaces in control blocks. Naming Conventions are rules respecting the way objects are named, for example, async methods must end in "Async".
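
To make that concrete, here's a tiny hand-rolled .editorconfig sketch with one rule of each flavor - the keys are the documented .NET convention names, but the severity choices are just my example:

root = true

[*.cs]
# formatting rule: Allman braces
csharp_new_line_before_open_brace = all

# language convention: prefer var for built-in types, as a suggestion
csharp_style_var_for_built_in_types = true:suggestion

# naming convention: async methods must end in "Async"
dotnet_naming_rule.async_methods_end_in_async.severity = suggestion
dotnet_naming_rule.async_methods_end_in_async.symbols = async_methods
dotnet_naming_rule.async_methods_end_in_async.style = end_in_async
dotnet_naming_symbols.async_methods.applicable_kinds = method
dotnet_naming_symbols.async_methods.required_modifiers = async
dotnet_naming_style.end_in_async.required_suffix = Async
dotnet_naming_style.end_in_async.capitalization = pascal_case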

If you're using Visual Studio 2010, 2012, 2013, or 2015, fear not. There's at least a basic free EditorConfig extension for you that enforces the basic rules. There is also an extension for Visual Studio Code to support EditorConfig files that takes just seconds to install.

ASIDE: If you are looking for a decent default for C#, take a look at the .editorconfig that the C# Roslyn compiler team uses. I don't know about you, but my brain exploded when I saw that they used spaces vs tabs.

But! What if you want this code formatting goodness at the dotnet command line? You can use "dotnet format" as a global tool! It's one line to install, then it's available everywhere for all your .NET Core apps.

D:\github\hanselminutes-core> dotnet tool install -g dotnet-format
You can invoke the tool using the following command: dotnet-format
Tool 'dotnet-format' (version '3.0.2') was successfully installed.
D:\github\hanselminutes-core> dotnet format
Formatting code files in workspace 'D:\github\hanselminutes-core\hanselminutes-core.sln'.
Found project reference without a matching metadata reference: D:\github\hanselminutes-core\hanselminutes.core\hanselminutes-core.csproj
Formatting code files in project 'hanselminutes-core'.
Formatting code files in project 'hanselminutes.core.tests'.
Format complete.

You can see in the screenshot below where dotnet format used its scandalous defaults to move my end of line { to its own line! Well, if that's what the team wants! ;)

My code is automatically formatted by the dotnet format tool

Of course, dotnet format is all open source and up at https://github.com/dotnet/format. You can install the stable build OR a development build from MyGet.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.

Hey Siri, what's my blood sugar? Learning to Code with Apple's iPhone Shortcuts

Feb 27, 2019

Description:

A library of dozens of shortcuts on iOS

Bear with me here. Apple Shortcuts (free on the App Store) is extraordinary and you shouldn't sleep on it. In fact, you should use it and explore it as it's amazing. I would go even further and say it could be a great place to learn to code!

Apple Shortcuts on iPhone is a lot like Microsoft Flow, except for your phone. Shortcuts let you string together Actions (ahem, functions) into multi-step tasks (ahem, functions that call functions). There's a rich and growing gallery of shortcuts that you can copy into your local (to your phone) library. You can then name them and invoke your Shortcuts with Siri.

Here's a few links to Shortcuts that (assuming you are reading this from your iPhone) you can add to your library with a click!

Do Not Disturb Timer
Expand URL
Intelligent Power (from Reddit)
Check Spelling
Convert Video to GIF
Download YouTube Video

Once you have a shortcut you can invoke it as an item/icon on your springboard/home screen, you can have Siri run it with your voice, or invoke it via a "share sheet" that is available in all apps.

It would be reasonable to think this was a simple macro system with a few basic building blocks, but I don't think Apple's team gets enough credit. This is a complete development environment on your phone.

For example, here's an incredibly intricate and powerful Shortcut to run if one is pulled over by the police.

It pauses any music that may be playing, turns down your brightness and volume, turns on Do Not Disturb, and sends a message to the contact of your choosing letting them know you’re being pulled over and what your current location is. It then opens your front camera and starts a video recording so you have a video record of being pulled over.

Once you stop the recording it sends a copy of the video to a contact you specify, puts volume and brightness back to where they were, turns off Do Not Disturb, and gives you the option to send to iCloud Drive or Dropbox!

You could then record a Siri shortcut and just say "Hey Siri, I'm being pulled over" and all this happens automatically, hands free.

Take a look at the Laundry Timer app here. It's a very classic "take input and do a thing" program. You can build and extend workflows like this and the data from one flows through to the next one.

A multiple step shortcut with many actions that flow data into the next, organized in a pipeline

Note the Shortcut above. The "Adjust Date" action pops up a Date and is used as a Diff(erence) against the "Current Date" action, then used again as an input to "Add New Reminder." These contextual variables flow through and are easily accessible in this genius UI. It really is near-perfect. Try it.

At this point you may be thinking, um, OK, that's cute, but where's the learn to code revolution here? It's not that open-ended of a system, what can I really do?

Like many connected cars, my car has a kind of REST API that its app uses to do things like heat up the climate system. Here I can literally POST (like Curl, but on your iPhone!) to an endpoint and pass in a FORM and parse the resulting JSON. Wow! Drink that in. You can write complex functions with iOS Shortcuts. Really.

calling a REST API with an iOS shortcut

Hang on. My body has a REST API. I use the open source Nightscout project to create a REST API on top of my Diabetes Continuous Glucose Meter then surface it in places like my lighted keyboard or even my Git Prompt.

How hard would it be to - right now as I make this blog post - write a method to have Siri retrieve my blood sugar and announce it to me when I say "Siri what's my blood sugar?" Let's see!

I make a URL object with my REST API that returns my sugar as JSON, and it gets passed into Get Contents of URL. That makes a Dictionary from the Input, gets the value of "sgv" (serum glucose value), and the result of that is used to make a string with the Text action.
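
If it helps to see those steps outside the Shortcuts UI, here's roughly the same pipeline as a shell one-liner - the URL is a placeholder for your own Nightscout instance, and jq stands in for the Dictionary step:

curl -s "https://YOUR-NIGHTSCOUT-SITE/api/v1/entries.json?count=1" | jq -r '.[0].sgv'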

Preparing to make a shortcut

Now I have Siri SAY it. I can "debug" by running the Shortcut with the play button.

Building a shortcut

Then I can Add it to Siri and record my phrase. Here's me saying "what's my blood sugar" and she's telling me. Yes, I know. I had a cookie. I deserved it.

Running your shortcut

This is just the start. It could also tell me my trend lines, text someone if it's high, make a chart - I figure it can do anything! I'm going to continue to explore Shortcuts, but this little Nightscout one can be downloaded to YOUR phone here. You'll only need to put in YOUR own URL for your Nightscout instance.

Sponsor: Manage GitHub Pull Requests right from the IDE with the latest JetBrains Rider. An integrated performance profiler on Windows comes to the rescue as well.



Learning about .NET Core futures by poking around at David Fowler's GitHub

Feb 22, 2019

Description:

A picture of David silently judging my code, but with love

David Fowler is the ASP.NET Core Architect (and an amazing, highly technical public speaker) and I've learned a lot from watching him code. However, what's the best way for YOU to learn from folks like David if you can't sit on their shoulder? Why, look at their GitHub!

Since .NET Core (and most of Microsoft) is not only open source but also developed in the open now on GitHub, we can actually watch folks in their day to day work as they commit code to projects like the C# compiler, .NET Core, and ASP.NET Core.

Even more interestingly, we can look at David's github here https://github.com/davidfowl and then under Repositories see what he's up to, filter by language and type, and explore! Sometimes I just explore the Pull Requests on projects like ASP.NET Core.

You can have Private repositories on GitHub, as I do, and as I'm sure David does. But GitHub is a social network for code and it's more fun and a better learning experience when we can see each others code and read it. Read with a critical eye, but without judgment as you may not have all the context that the author does. If you went to my GitHub, https://github.com/shanselman you might be disappointed but you also may be missing the big picture. Just consider that as you Follow people and explore their code.

David is an advanced .NET developer while, for example, I am comparatively intermediate. So I realize that not all of David's code is FOR me. It's a scratchpad, not educational how-to workshops. However, I can pick up cool idioms, interesting directions the tech may be going, and more importantly - prototypes and spikes. Spikes are folks testing out technical ideas. They may not be complete. In fact, they may never be complete. But some may be harbingers of things to come.

Here's a few things I learned today.

gRPC for .NET Core

For example, at https://github.com/davidfowl/grpc-dotnet I can see David has forked (copied) gRPC for dotnet, and his goal is working with the gRPC folks to make a fully supported version of gRPC for production workloads with .NET Core! Here are the stated goals:

We plan to implement a fully-managed version of gRPC for .NET that will be built on top of the ASP.NET Core HTTP/2 server.

Good integration with the rest of the ASP.NET Core ecosystem
High performance (we plan to utilize some of the cutting edge performance features from ASP.NET Core and in the .NET platform itself)

That sounds cool! I can go learn that gRPC is a modern (Google-sponsored) Remote Procedure Call framework that can run anywhere. It's used by Netflix and Square and supports basically any language and any environment. Nice for this microservice world we are entering, and hopefully it has learned from the sins of DCOM and CORBA and RMI, because I was there and it sucked.

Nothing to see here but moving to a new JSON serializer

This Web.Framework sounds fun, and I'll be sure to take the description to heart.

says "Lame name, just a prototype, nothing to see here (move along)"

You can see David and James Newton-King kicking ideas around as you explore the commit log. However, the most interesting commit IMHO is when David moves this little spike from using JSON.NET (the ubiquitous 3rd party JSON serializer) to the new emerging official System.Text.Json. Here is the commit with unified differences.

It's a small change but it also makes me feel good about the API underneath this new JSON API that's coming. My takeaway is that it's not as scary as I'd assumed. Looks like a Good Thing(tm).

A diff of code shows that just one line is changed to move JSON serializers

 

Cool!

Multi-Protocol ASP.NET Core

This looks interesting.

"The following sample shows how you can host a TCP server and HTTP server in the same ASP.NET Core application. Under the covers, it's the same server (Kestrel) running different protocols on different ports. The ConnectionHandler is a new primitive introduced in ASP.NET Core 2.1 to support non-HTTP protocols."

I didn't know you could do that! Looks like this sample hasn't changed much since it was conceived of in 2018, but then in the last month it's been updated twice and it appears to be part of a larger, slow-moving architectural issue called Bedrock that's moving forward.

I learned that Kestrel (the ASP.NET Core web server) has a "ListenLocalhost" option on its options object!

WebHost.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // This shows how a custom framework could plug in an experience without using Kestrel APIs directly
        services.AddFramework(new IPEndPoint(IPAddress.Loopback, 8009));
    })
    .UseKestrel(options =>
    {
        // TCP 8007
        options.ListenLocalhost(8007, builder =>
        {
            builder.UseConnectionHandler<MyEchoConnectionHandler>();
        });

        // HTTP 5000
        options.ListenLocalhost(5000);

        // HTTPS 5001
        options.ListenLocalhost(5001, builder =>
        {
            builder.UseHttps();
        });
    })
    .UseStartup<Startup>();

I can see here that TCP port 8007 is custom and uses a custom ConnectionHandler, which I also didn't know existed! I can then look at the implementation of that handler, and it's cool how clean the API is. You can get the result cleanly off the Transport buffer. You're doing low-level TCP but it doesn't feel low level.

using System.Threading.Tasks;
using Microsoft.AspNetCore.Connections;
using Microsoft.Extensions.Logging;

namespace KestrelTcpDemo
{
    public class MyEchoConnectionHandler : ConnectionHandler
    {
        private readonly ILogger<MyEchoConnectionHandler> _logger;

        public MyEchoConnectionHandler(ILogger<MyEchoConnectionHandler> logger)
        {
            _logger = logger;
        }

        public override async Task OnConnectedAsync(ConnectionContext connection)
        {
            _logger.LogInformation(connection.ConnectionId + " connected");

            while (true)
            {
                var result = await connection.Transport.Input.ReadAsync();
                var buffer = result.Buffer;

                foreach (var segment in buffer)
                {
                    await connection.Transport.Output.WriteAsync(segment);
                }

                if (result.IsCompleted)
                {
                    break;
                }

                connection.Transport.Input.AdvanceTo(buffer.End);
            }

            _logger.LogInformation(connection.ConnectionId + " disconnected");
        }
    }
}

Pretty slick. This just echoes what is sent to that port, but not only has it educated me about a thing I didn't know about, it's something I can mentally file away until I need it!

All of these things I learned in just 30 minutes of exploring someone's public repository.

What kinds of code do you like to read and what have you learned from just poking around?

Sponsor: Get the latest JetBrains Rider for remote debugging via SSH, SQL injections, a new Search Everywhere popup, and improved Unity support.



Right click publish quickly to Azure App Services with VS Code extensions and zipdeploy

Feb 20, 2019

Description:

I wanted to see what was the fastest way to get an ASP.NET Core web site up (for free) on Azure. First, I could use Visual Studio Community (which is free), and just right click Publish, sign into Azure, and make a free Web App and I'm cool. But I also wanted to see what it was like on Visual Studio Code (which would work on Linux, etc)

I downloaded these things. This is 10-15 min tops for download AND install. Likely less on a fast connection.

VS Code - https://code.visualstudio.com
.NET Core - https://dotnet.microsoft.com
Azure App Service extension for VS Code
C# extension for VS Code (NOTE: this will get automatically recommended to you anyway when you open a .cs file for the first time.)

This also assumes you have a free Azure account https://azure.microsoft.com/free/

I made a new ASP.NET web site with "dotnet new razor" at the command line. The Azure App Service extension makes a new Azure icon appear on the left of VS Code. I can see my subscription(s) and any sites I've made before. I can right-click the top of the tree or just click the + plus sign.


TRICK: The default mode of the Azure App Service extension is "basic" mode. This is fine for messing around, but it will assume a bunch of things. You don't have control over the location (it'll pick a nearby one) or really anything. Again, it's fine. However, if you DO want explicit prompts for name, location, OS, runtime, etc you can turn on "appService.advanced" in File | Preferences | Settings (or Ctrl+,). Don't feel you need to, but know it's possible.

appService.advanced

Now, in my opinion, deploying apps (.NET Core, Node, or otherwise) directly from source can be a little confusing, and it doesn't really scale for anything other than proofs of concept. There's usually a "build" step, and ideally you'll have a CI/CD (Continuous Integration/Continuous Deployment) pipeline for anything of any real size. It's easier than you think - you can likely get a basic DevOps pipeline up in an hour or so. I commit to GitHub and it just deploys to Azure.

That said, a quickie deploy has value, so I wanted to do it. You can do a "git deploy" to Azure, where Azure is a git remote and you just "git push azure master", but...you're pushing source, and Azure "builds" it in Azure App Service using a thing called Kudu. That means it'll run npm install, dotnet restore, etc., so it'll take some time. You could instead deploy a container to Azure: push it to a Container Registry, then spin up the container.

However, Azure also has a little known but rather clever "zipdeploy" feature. Once you've configured your Azure App Service for zipdeployment as a source, you can just POST a new ZIP deployment with Curl!

curl -X POST -u <deployment_user> --data-binary @"<zip_file_path>" https://<app_name>.scm.azurewebsites.net/api/zipdeploy

You might find that weird, or you might find it elegant. If it's the latter, use it. If it's the former, don't. You can even do it with minimal or no downtime by deploying to a staging slot and using Auto Swap.

I'm going to use the Azure App Service extension in VS Code and it's going to hide all of this and it'll just publish in one click.

Here's the important part if you want it to just work and work easily. You'll want to deploy from a folder that represents your published app. That means your app in a state that it's ready to go.

With .NET Core, the easiest way is to dotnet publish. Then I'm right-clicking on that publish folder as seen in this screenshot. Given that the extension is zipping up the target folder and deploying it, I want to publish the publish folder, not the root of my source folder.
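
For a .NET Core 2.2 app like this one, that's a single command, and the default output lands in bin\Debug\netcoreapp2.2\publish (which is exactly the deploySubpath you'll see in the settings file below):

dotnet publish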


That will actually make a file .vscode/settings.json that will tell VS Code's Azure App Service extension what folder to deploy from in the future, thereby simplifying things.

{
    "appService.defaultWebAppToDeploy": "/subscriptions/GUID/resourceGroups/appsvc_rg_Windows_CentralUS/providers/Microsoft.Web/sites/fancyweb1",
    "appService.deploySubpath": "bin\\Debug\\netcoreapp2.2\\publish"
}

Below you can see the dialog that pops up: "Always deploy the workspace ___ to ___". If you click Yes, it will create the setting above, specific to your application.

Always deploy the workspace web1 to fancyweb1

Now when I deploy, I can right click from anywhere and it will zipdeploy right to my site. Note the log below.


With this extension I can even right click and "Start Streaming Logs" and get the logs of my Azure App Service as it runs, right in the output pane of VS Code.

Start Streaming Logs

This will make things pretty easy for my simplest sites and proofs of concept. Give it a try!

Sponsor: Get the latest JetBrains Rider for remote debugging via SSH, SQL injections, a new Search Everywhere popup, and improved Unity support.



Exploring nopCommerce - open source e-commerce shopping cart platform in .NET Core

Feb 15, 2019

Description:

nopCommerce demo site

I've been exploring nopCommerce. It's an open source e-commerce shopping cart. I spoke at their conference in New York a few years ago and they were considering moving to open source and cross-platform .NET Core from the Windows-only .NET Framework, so I figured it was time for me to check in on their progress.

I headed over to https://github.com/nopSolutions/nopCommerce and cloned the repo. I have .NET Core 2.2 installed that I grabbed here. You can check out their official site and their live demo store.

It was a simple git clone and a "dotnet build" and it built and ran almost immediately. It's always nice to have a site "just work" after a clone (it's kind of a low bar, but no matter what the language, it's always a joy when it works).
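
The whole getting-started dance was only a few commands - the Nop.Web project path here is my guess at the repo layout, so adjust as needed:

git clone https://github.com/nopSolutions/nopCommerce.git
cd nopCommerce
dotnet build
dotnet run --project src/Presentation/Nop.Web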

I have SQL Express installed but I could just as easily use SQL Server for Linux running under Docker. I used the standard SQL Server Express connection string: "Server=localhost\SQLEXPRESS;Database=master;Trusted_Connection=True;" and was off and running.

nopCommerce is easy to setup

It's got a very complete /admin page with lots of Commerce-specific reports, the ability to edit the catalog, have sales, manage customers, deal with product reviews, set promotions, and more. It's like WordPress for Stores. Everything you'd need to put up a store in a few hours.

Very nice admin site in nopCommerce

nopCommerce has a very rich plugin marketplace. Basically anything you'd need is there but you could always write your own in .NET Core. For example, if I want to add Paypal as a payment option, there's 30 plugins to choose from!

NOTE: If you have any theming issues (css not showing up) with just using "dotnet build," you can try "msbuild" or opening the SLN in Visual Studio Community 2017 or newer. You may be seeing folders for plugins and themes not being copied over with dotnet build. Or you can "dotnet publish" and run from the publish folder.

Now, to be clear, I just literally cloned the HEAD of the actively developed version and had no problems, but you might want to use the most stable version from 2018 depending on your needs. Either way, nopCommerce is a clean code base that's working great on .NET Core. The community is VERY active, and there's a company behind the open source version that can do the work for you, customize, service, and support.

Sponsor: The next generation of Jira has arrived, with new roadmaps, more flexible boards, overhauled configuration, and dozens of new integrations. Whatever new awaits you, begin it here. In a new Jira.

How to convert an IMG file to a standard ISO easily with Linux on Windows 10

Feb 12, 2019

Description:

Modded GoldStar 3DO for USB

The optical disc drive is giving out on my GoldStar 3DO machine. It's nearly 30 years old. I want to make sure that the kids and I can still play our 3DO discs. I ordered this fantastic USB mod for the 3DO from a fellow out of Belarus. It came and it's great. It includes a game/file selector app that you boot off of if you put it in the root of a FAT32-formatted USB drive.

However, when I cloned my collection of CD-ROMS I ended up with a bunch of IMG files, and this mod wants ISO files. I thought my cloner was going to give me ISOs. I did the obvious thing and googled for "how to convert an img file to an iso."

This plunged me into the hellscape that is CNET and Major Geeks download wrappers. Every useful application or utility out there is hidden on a page filled with Download Now buttons that aren't the button you want OR if you get the app you want, it's actually a Chrome Search hijacker. I just want to convert a damn IMG to an ISO. If you want to do this on Windows you're going to be installing a bunch of virus-laden trial ISO cracking crap.

Fortunately, Windows 10 can run Linux very nicely, thank you very much. Go install Ubuntu from the Windows Store and get set up ASAP.

I just installed ccd2iso inside Ubuntu on my Windows 10 machine.

scott@IRONHEART:~$ sudo apt install ccd2iso
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following NEW packages will be installed:
ccd2iso
0 upgraded, 1 newly installed, 0 to remove and 27 not upgraded.
Need to get 7406 B of archives.
After this operation, 26.6 kB of additional disk space will be used.
Get:1 http://archive.ubuntu.com/ubuntu bionic/universe amd64 ccd2iso amd64 0.3-7 [7406 B]
Fetched 7406 B in 0s (21.0 kB/s)
Selecting previously unselected package ccd2iso.
(Reading database ... 61432 files and directories currently installed.)
Preparing to unpack .../ccd2iso_0.3-7_amd64.deb ...
Unpacking ccd2iso (0.3-7) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
Setting up ccd2iso (0.3-7) ...
scott@IRONHEART:~$ cd /mnt/c/Users/scott/Desktop/3do/

Then I cd (change directory) into my file system where my IMG backups are. Note that my C:\ drive on Windows is at /mnt/c so you can see me in a folder on my Desktop here. Then just run ccd2iso.

scott@IRONHEART:/mnt/c/Users/scott/Desktop/3do$ ccd2iso AloneInTheDark.img AloneInTheDark.iso
179500 sector written
Done.

Boom. Super fast and does the job and now I'm up and running! Regardless of why you got to this blog post and needed to convert an IMG to an ISO, I hope this helps and saves you some time!

Sponsor: The next generation of Jira has arrived, with new roadmaps, more flexible boards, overhauled configuration, and dozens of new integrations. Whatever new awaits you, begin it here. In a new Jira. 



Lighting up my DasKeyboard with Blood Sugar changes using my body's REST API

Feb 7, 2019

Description:

I've long blogged about the intersection of diabetes and technology. From the sad state of diabetes tech in 2012 to its recent promising resurgence, it's clear that we are not waiting.

If you're a Type 1 Diabetic using a CGM (a continuous glucose meter), you'll want to set up Nightscout so you can have a REST API for your sugar. The CGM checks my blood sugar every 5 minutes; it hops via BLE over to my phone and then to the cloud. You'll want your sugars stored in cloud storage that YOU control. CGM vendors have their own clouds, but we can easily bridge over to a MongoDB database.

I run Nightscout in Azure and my body has a REST API. I can do an HTTP GET like this:

/api/v1/entries.json?count=3

and get this

[
  {
    "_id": "5c6066d477b2a69a0a7810e5",
    "sgv": 143,
    "date": 1549821626000,
    "dateString": "2019-02-10T18:00:26.000Z",
    "trend": 4,
    "direction": "Flat",
    "device": "share2",
    "type": "sgv"
  },
  {
    "_id": "5c6065a877b2a69a0a7801ce",
    "sgv": 134,
    "date": 1549821326000,
    "dateString": "2019-02-10T17:55:26.000Z",
    "trend": 4,
    "direction": "Flat",
    "device": "share2",
    "type": "sgv"
  },
  {
    "_id": "5c60647b77b2a69a0a77f381",
    "sgv": 130,
    "date": 1549821026000,
    "dateString": "2019-02-10T17:50:26.000Z",
    "trend": 4,
    "direction": "Flat",
    "device": "share2",
    "type": "sgv"
  }
]

I can change the URL from a .json to a .txt and get this

2019-02-10T18:00:26.000Z 1549821626000 143 Flat
2019-02-10T17:55:26.000Z 1549821326000 134 Flat
2019-02-10T17:50:26.000Z 1549821026000 130 Flat

The "Flat" value at the end is part of an enum that can give me a generalized trend value. Diabetics need to manage our sugars at least hour by hour, and sometimes minute by minute. As such it's super important that we have "glanceable displays." That means anything at all that gives me a sense (a sixth sense, if you will) of how I'm doing.

That might be:

Alexa, what's my blood sugar?
Adding sugar numbers and trends to your Git/PATH prompt in your shell
An Arduino with an LCD
A wall-mounted dakBoard Family Calendar in a shared space that also shows my blood sugar

I got a Das Keyboard 5Q recently - I first blogged about Das Keyboard in 2006! - and noted that it's got its own local REST API. I'm working on using their Das Keyboard Q software's Applet API to light up just the top row of keys in response to my blood sugar changing. It'll use their Node packages and JavaScript and run in the context of their software.

However, since the keyboard has a localhost REST API and so does my blood sugar, I busted out this silly little shell script. Add a cron job and my keyboard can turn from orange (low), to green, yellow, red (high) as my sugar changes. That provides a nice ambient notifier of how my sugars are doing. Someone on Twitter said "who looks at their keyboard?" I mean, OK, that's just silly. If my entire keyboard turns red I will notice it. Again, ambient. I could certainly add an alert and make a klaxon go off if you'd like.

#!/bin/sh
# This script colorizes all LEDs of a 5Q keyboard
# by sending JSON signals to the Q desktop public API,
# based on Blood Sugar values from Nightscout
set -e # quit on first error.
PORT=27301

# Colorize the 5Q keyboard
PID="DK5QPID" # product ID

# Zones are LED groups. There are less than 166 zones on a 5Q.
# This should cover the whole device.
MAX_ZONE_ID=166

# Get blood sugar from Nightscout as TEXT
red=#f00
green=#0f0
yellow=#ff0
# deep orange is LOW sugar
COLOR=#f50
bgvalue=$(curl -s https://MYSITE/api/v1/entries.txt?count=1 | grep -Eo '000\s([0-9]{1,3})+\s' | cut -f 2)
if [ $bgvalue -gt 80 ]
then
    COLOR=$green
    if [ $bgvalue -gt 140 ]
    then
        COLOR=$yellow
        if [ $bgvalue -gt 200 ]
        then
            COLOR=$red
        fi
    fi
fi

echo "Sugar is $bgvalue and color is $COLOR!"

for i in `seq $MAX_ZONE_ID`
do
    #echo "Sending signal to zoneId: $i"
    # important NOTE: if fields "name" and "message" are empty then the signal is
    # only displayed on the device's LEDs, not in the signal center
    curl -s -S --output /dev/null -X POST --header 'Content-Type: application/json' --header 'Accept: application/json' -d '{
        "name": "Nightscout",
        "id": "'$i'",
        "message": "Blood sugar is '$bgvalue'",
        "pid": "'$PID'",
        "zoneId": "'"$i"'",
        "color": "'$COLOR'",
        "effect": "SET_COLOR"
    }' "http://localhost:$PORT/api/1.0/signals"
done

echo "Done."

This local keyboard API is meant to send a signal to a single zone or key, so it's hacky of me (and them, really) to make 100+ REST calls to color the whole keyboard. But, it's a localhost call and it's not that spendy. This will go away when I move to their new API. Here's a video of it working.

You can also hit the volume button on the keyboard on any "signaled" (lit up) key and get a popup with the actual blood sugar value (that's "message" in the second curl command above). Again, this is a hack, but I'm going to make it a formal applet you can just install from the store. If you want to help (I'm slow), head to the code here: https://github.com/shanselman/DasKeyboard-Q-NightScout

Got my keyboard keys changing color *when my blood sugar goes up!* @daskeyboard @NightscoutProj #WeAreNotWaiting #diabetes pic.twitter.com/DSBDcrO7RE

— Scott Hanselman (@shanselman) February 8, 2019

What are some other good ideas for ambient sugar alerts? An LED strip around the monitor (bias lighting)? A Philips Hue smart light?

Consider also that you could use the glanceable display idea for pulse, anxiety, blood pressure - anything in your body you could hook up to in real- or near-realtime.

Sponsor: Get the latest JetBrains Rider with Code Vision, Rename Project refactoring, and the Assembly Explorer. Improved support for C#, VB.NET, F#, TypeScript, and Angular is all included.



Teaching Kids to Code with Minecraft Mods made easy using MakeCode and Code Connection

Feb 5, 2019

Description:

Back in the day, making a Minecraft mod was...challenging. It was a series of JAR files and Java hacks and deep folder structures. It was possible, but it wasn't fun and it surely wasn't easy. I wanted to revisit things now that Minecraft is easily installed from the Windows Store.

Today, it couldn't be easier to make a Minecraft Mod, so I know what my kids and I are doing tonight!

I headed over to https://minecraft.makecode.com/setup/minecraft-windows10 and followed the instructions. I already have Minecraft installed, so I just had to install the Minecraft Code Connection app. The architecture here is very clean and clever. Basically you turn on cheats in Minecraft and use a local websockets connection between the Code Connection app and Minecraft - you're automating Minecraft from an external application!

Here I'm turning on cheats in a new Minecraft world:

Minecraft Allow Cheats

Then from the Code Connection app, I get a URL for the automation server, then go back to Minecraft, hit "t" and paste in the URL. Now the two apps are talking to each other.

Connecting Minecraft to MakeCode

I can automate with MakeCode, Scratch, or other editors. I'll do MakeCode.

Make Code is amazing

Then an editor opens. This is the same base open source Make Code editor I used when I was coding for an Adafruit Circuit Playground Express earlier this year.

Now, I'll set up a chat command in MakeCode that makes it rain chickens when I type the chat command "chicken." It runs a loop and spawns 100 chickens 10 blocks above my character's head.

Chicken rain

I was really surprised how easy this was. It was maybe 10 mins end to end, which is WAY easier than the Java add-ins I learned about just a few years ago.

Minecraft Chicken Rain

There are a ton of tutorials here, including Chicken Rain. https://minecraft.makecode.com/tutorials

The one I'm most excited to show my kids is the Agent. Your connection to the remote Code Connection app includes an avatar or "agent." Just like Logo (remember that, robot turtles?) you can control your agent and make him build stuff. No more tedious house building for us! Let's for-loop our way to glory and teach dude how to make us a castle!

Sponsor: Get the latest JetBrains Rider with Code Vision, Rename Project refactoring, and the Assembly Explorer. Improved support for C#, VB.NET, F#, TypeScript, and Angular is all included.


© 2018 Scott Hanselman. All rights reserved.
     

Brainstorming - Creating a small single self-contained executable out of a .NET Core application

Feb 1, 2019

Description:

I've been using ILMerge and various hacks to merge/squish executables together for well over 12 years. The .NET community has long toyed with the idea of a single self-contained EXE that would "just work." No need to copy a folder, no need to install anything. Just a single EXE.

While work and thought continues on a CoreCLR single-file EXE solution, there's a nice Rust tool called Warp that creates self-contained single executables. Warp is cross-platform, works on any tech, and is very clever.

The Warp Packer app has a slightly complex command line, like this:

.\warp-packer --arch windows-x64 --input_dir bin/Release/netcoreapp2.1/win10-x64/publish --exec myapp.exe --output myapp.exe
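For context, that input_dir is the output of a self-contained publish; presumably something like this was run first (the RID matches the path above):

dotnet publish -c Release -r win10-x64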

Fortunately Hubert Rybak has created a very nice "dotnet-pack" global tool that wraps this all up into a single command, dotnet-pack.

NOTE: There is already a "dotnet pack" command so this dotnet-pack global tool is unfortunately named. Just be aware they are NOT the same thing.

All you have to do is this:

C:\supertestweb> dotnet tool install -g dotnet-pack
C:\supertestweb> dotnet-pack
O Running Publish...
O Running Pack...

In this example, I just took a Razor web app with "dotnet new razor" and then packed it up with this tool using Warp packer. Now I've got a 40 meg self-contained app. I don't need to install anything, it just works.

C:\supertestweb> dir
Directory: C:\supertestweb

Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 2/6/2019 9:14 AM bin
d----- 2/6/2019 9:14 AM obj
d----- 2/6/2019 9:13 AM Pages
d----- 2/6/2019 9:13 AM Properties
d----- 2/6/2019 9:13 AM wwwroot
-a---- 2/6/2019 9:13 AM 146 appsettings.Development.json
-a---- 2/6/2019 9:13 AM 157 appsettings.json
-a---- 2/6/2019 9:13 AM 767 Program.cs
-a---- 2/6/2019 9:13 AM 2115 Startup.cs
-a---- 2/6/2019 9:13 AM 294 supertestweb.csproj
-a---- 2/6/2019 9:15 AM 40982879 supertestweb.exe

Now here's where it gets interesting. Let's say I have a console app. Hello World, packed with Warp, ends up being about 35 megs. But if I use "dotnet-pack -l aggressive" the tool will add the Mono ILLinker (tree shaker/trimmer) and shake off all the methods that aren't needed. The resulting single executable? Just 9 megs compressed (20 uncompressed).

C:\squishedapp> dotnet-pack -l aggressive
O Running AddLinkerPackage...
O Running Publish...
O Running Pack...
O Running RemoveLinkerPackage...
C:\squishedapp> dir
Mode LastWriteTime Length Name
---- ------------- ------ ----
d----- 2/6/2019 9:32 AM bin
d----- 2/6/2019 9:32 AM obj
-a---- 2/6/2019 9:31 AM 47 global.json
-a---- 2/6/2019 9:31 AM 193 Program.cs
-a---- 2/6/2019 9:32 AM 178 squishedapp.csproj
-a---- 2/6/2019 9:32 AM 9116643 squishedapp.exe

Here is where you come in!

NOTE: The .NET team has planned to have a "single EXE" supported packing solution built into .NET 3.0. There's a lot of ways to do this. Do you zip it all up with a header/unzipper? Well, that would hit the disk a lot and be messy. Do you "unzip" into memory? Do you merge into a single assembly? Or do you try to AoT (Ahead of Time) compile and do as much work as possible before you merge things? Is a small size more important than speed?

What do you think? How should a built-in feature like this work and what would YOU focus on?

Sponsor: Check out Seq 5 for real-time diagnostics from ASP.NET Core and Serilog, now with faster queries, support for Docker on Linux, and beautiful new dark and light themes.


© 2018 Scott Hanselman. All rights reserved.
     

Visiting The National Museum of Computing inside Bletchley Park - Can we crack Enigma with Raspberry Pis?

Jan 29, 2019

Description:

image"The National Museum of Computing is a museum in the United Kingdom dedicated to collecting and restoring historic computer systems. The museum is based in rented premises at Bletchley Park in Milton Keynes, Buckinghamshire and opened in 2007" and I was able to visit it today with my buddies Damian and David. It was absolutely brilliant.

I'd encourage you to have a listen to my 2015 podcast with Dr. Sue Black who used social media to raise awareness of the state of Bletchley Park and help return the site to solvency.

The National Museum of Computing is a must-see if you are ever in the UK. It was a short 30ish minute train ride up from London. We spent the whole afternoon there.

There is a rebuild of the Colossus, the world's first electronic computer. It had a single purpose: to help decipher the Lorenz-encrypted (Tunny) messages between Hitler and his generals during World War II. The Colossus Gallery housing the rebuild of Colossus tells that remarkable story.

A working Bombe machine

The backside of the Bombe

National Computing Museum

Cipher Machine

We saw the Turing-Welchman Bombe machine, an electro-mechanical device used to break Enigma-enciphered messages about enemy military operations during the Second World War. They offer guided tours (recommended as the volunteers have encyclopedic knowledge) and we were able to encrypt a message with the German Enigma (there's a 90 second video I made, here) and decrypt it with the Bombe, which is effectively 12 Enigmas working in parallel, backwards.

Inside the top lid of a working Enigma

A working Enigma

It's worth noting - this from their website - that the first Bombe, named Victory, started code-breaking at Bletchley Park on 14 March 1940, and by the end of the war 1,676 female WRNS and 263 male RAF personnel were involved in the deployment of 211 Bombe machines. The museum has a working reconstructed Bombe.

 

Now, decrypting the Enigma with the Bombe (and a crib, and some analysis, before the brute force starts) pic.twitter.com/OJsoMYXYhj

— Scott Hanselman (@shanselman) January 31, 2019

I wanted to understand the computing power these systems had then, and now. Check out the website where you can learn about the OctaPi - a Raspberry Pi array of eight Pis working together to brute-force Enigma. You can make your own here!

I hope you enjoy these pics and videos and I hope you one day get to enjoy the history and technology in and around Bletchley Park.

Sponsor: Check out Seq 5 for real-time diagnostics from ASP.NET Core and Serilog, now with faster queries, support for Docker on Linux, and beautiful new dark and light themes.


© 2018 Scott Hanselman. All rights reserved.
     

NuGet's fancy older sibling FuGet gives you a whole new view of the .NET packaging ecosystem

Jan 25, 2019

Description:

FuGet diffs

I remember when we announced NuGet (almost 10 years ago). Today you can get your NuGet packages (that contain .NET libraries) from Nuget.exe, from within Visual Studio, from the .NET CLI (command line interface), and from Paket. Choice is good!

Most folks are familiar with NuGet.org but have you used FuGet?

FuGet is "pro nuget package browsing!" Creating by the amazing Frank A. Krueger - of whom I am an immense fan - FuGet offers a different view on the NuGet package library. NuGet is a repository of nearly 150,000 open source libraries and the NuGet Gallery does a decent job of letting one browse around. However, https://github.com/praeclarum/FuGetGallery is an alternative web UI with a lot more depth.

FuGet is "advanced mode" for NuGet. It's a package browser combined with an API browser that helps you explore the XML documentation and metadata of a package's assemblies to help you explore and learn. And it's a JOY.

For example, if I look at https://www.fuget.org/packages/Newtonsoft.Json I can also see who depends on the package! https://www.fuget.org/packages/Newtonsoft.Json/dependents Who has taken a public dependency on your package? I can see supported frameworks, namespaces, as well as internal types. For example, I can explore JToken within Newtonsoft.Json and its embedded docs!

You can even do API diffs across versions! Check out https://www.fuget.org/packages/Serilog/2.8.0-dev-01042/lib/netstandard2.0/diff/2.6.0/ for example. This is an API diff between 2.8.0-dev-01042 and 2.6.0 for Serilog. This could be useful for users or package maintainers when deciding how big a version bump is required depending on how much of the API has changed. It also gives you a view (as the downstream consumer) of what's coming at you in pre-release versions!

From Frank's blog:

Have you ever wondered if the library you're using has been customized for a certain platform? Have you wondered if it will work on your platform at all?

This doubt is removed by displaying - in full technicolor - all the frameworks that the library supports.  Supported Frameworks

They’re color coded so you can see at a glance:
Green libraries are .NET Standard and will work everywhere
Dark blue libraries are platform specific
Light blue libraries are for full .NET and Mono only
Yellow libraries are old PCLs that we’re all trying to forget

FuGet.org is a fantastic addition to the .NET ecosystem and I'd encourage you to bookmark it, use it, support it, and get involved!

If you're interested in stuff like this (and the code that runs stuff like this) also check out Stephen Cleary's useful http://dotnetapis.com/ and its associated code on GitHub https://github.com/StephenClearyApps/DotNetApis.

Sponsor: Your code is bad, but that’s ok thanks to Sentry’s full stack error monitoring that enables you to track and fix application errors in real time. Stop garbage code from becoming garbage fires.


© 2018 Scott Hanselman. All rights reserved.
     

How to use Windows 10's built-in OpenSSH to automatically SSH into a remote Linux machine

Jan 23, 2019

Description:

In working on getting Remote debugging with VS Code on Windows to a Raspberry Pi using .NET Core on ARM in my last post, I was looking for optimizations and realized that I was using plink/putty for my SSH tunnel. Putty is one of those tools that we (as developers) often take for granted, but ideally I could do stuff like this without installing yet another tool. Being able to use out of the box tools has a lot of value.

A friend pointed out this part where I'm using plink.exe to ssh into the remote Linux machine to launch the VS Debugger:

"pipeTransport": {
"pipeCwd": "${workspaceFolder}",
"pipeProgram": "${env:ChocolateyInstall}\\bin\\PLINK.EXE",
"pipeArgs": [
"-pw",
"raspberry",
"root@crowpi.lan"
],
"debuggerPath": "/home/pi/vsdbg/vsdbg"
}

I could use the Linux/bash that's been built into Windows 10 for years now. As you may know, Windows 10 can run many Linuxes out of the box. If I have a Linux distro configured, I can call Linux commands locally from CMD or PowerShell. For example, here you see I have three Linuxes and one is the default. I can call "wsl" and whatever command line follows is passed in.

C:\Users\scott> wslconfig /l
Windows Subsystem for Linux Distributions:
Ubuntu-18.04 (Default)
WLinux
Debian
C:\Users\scott> wsl ls ~/
forablog forablog.2 forablog.2.save forablog.pub myopenaps notreal notreal.pub test.txt

So theoretically I could "wsl ssh" and use that Linux's ssh, but again, that requires setup and it's a little silly. Windows 10 now ships OpenSSH in the box!

Open an admin PowerShell to see if you have it installed. Here I have the client software installed but not the server.

C:\> Get-WindowsCapability -Online | ? Name -like 'OpenSSH*'

Name : OpenSSH.Client~~~~0.0.1.0
State : Installed

Name : OpenSSH.Server~~~~0.0.1.0
State : NotPresent

You can then add the client (or server) with this one-time command:

Add-WindowsCapability -Online -Name OpenSSH.Client~~~~0.0.1.0

You'll get all the standard OpenSSH stuff that one would want.
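If you also want this Windows box to accept incoming ssh connections, the same pattern presumably works for the server capability listed above:

Add-WindowsCapability -Online -Name OpenSSH.Server~~~~0.0.1.0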

OpenSSH tools on Windows

Let's say now that I want to be able to ssh (shoosh!) into a remote Linux machine using SSH keys rather than a password. It's much more convenient and secure. I'll be ssh'ing with my Windows SSH into a remote Linux machine. You can see where ssh is installed:

C:\Users\scott>where ssh
C:\Windows\System32\OpenSSH\ssh.exe

Level set - What are we doing and what are we trying to accomplish?

I want to be able to type "ssh pi@crowpi" from my Windows machine and automatically be logged in.

I will:

1. Make a key on my Windows machine. The FROM. I want to ssh FROM here TO the Linux machine.
2. Tell the Linux machine (by transferring it over) about the public piece of my key and add it to a specific user's authorized_keys.
3. PROFIT

Here's what I did. Note you can do this in several ways. You can generate the key on the Linux side and scp it over, you can use a custom key and give it a filename, you can use a password if you like. Just get the essence right.

Below, note that when the command line is C:\ I'm on Windows and when it's $ I'm on the remote Linux machine/Raspberry Pi.

1. Generate the key on Windows with ssh-keygen.
2. I ssh'ed over to Linux; note I'm prompted for a password, as expected. I "ls" to see that I have a .ssh/ folder. Cool. You can see authorized_keys is in there; you may or may not have this file or folder. Make the ~/.ssh folder if you don't. Exit out.
3. I'm in Windows now. Look closely here. I'm "scott" on Windows so my public key is in c:\users\scott\.ssh\id_rsa.pub. Yours could be in a file you named earlier, so be conscious. I "type" that text file out (cat on Linux is type on Windows) and pipe it into ssh, where I log into that remote machine as the user pi, then cat (on the Linux side now) and append >> that text to the .ssh/authorized_keys file. The ~ folder is implied but could be added if you like.
4. Now when I ssh pi@crowpi I should NOT be prompted for a password.

Here's the whole thing.

C:\Users\scott\Desktop> ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (C:\Users\scott/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in C:\Users\scott/.ssh/id_rsa.
Your public key has been saved in C:\Users\scott/.ssh/id_rsa.pub.
The key fingerprint is:
SHA256:x2vJHHXwosSSzLHQWziyx4II+scott@IRONHEART
The key's randomart image is:
+---[RSA 2048]----+
| . .... . |
|..+. .=+=. o |
| .. |
+----[SHA256]-----+
C:\Users\scott\Desktop> ssh pi@crowpi
pi@crowpi's password:
Linux crowpi 2018 armv7l

pi@crowpi:~ $ ls .ssh/
authorized_keys id_rsa id_rsa.pub known_hosts
pi@crowpi:~ $ exit
logout
Connection to crowpi closed.
C:\Users\scott\Desktop> type C:\Users\scott\.ssh\id_rsa.pub | ssh pi@crowpi 'cat >> .ssh/authorized_keys'
pi@crowpi's password:
C:\Users\scott\Desktop> ssh pi@crowpi
pi@crowpi: ~ $

Fab. At this point I could go BACK to my Visual Studio Code launch.json on Windows and simplify it to NOT use Plink/Putty and just use ssh and the ssh key management that's included with Windows.

"pipeTransport": {
"pipeCwd": "${workspaceFolder}",
"pipeProgram": "ssh",
"pipeArgs": [
"pi@crowpi.lan"
],
"debuggerPath": "/home/pi/vsdbg/vsdbg"
}

Cool!
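One optional refinement (my addition, not from the original post): an entry in C:\Users\scott\.ssh\config lets you shorten the command to just "ssh crowpi":

# Hypothetical ~/.ssh/config entry
Host crowpi
  HostName crowpi.lan
  User pi
  IdentityFile ~/.ssh/id_rsa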

NOTE: In my previous blog post some folks noted I am logging in as "root." That's an artifact of the way that .NET Core is accessing the GPIO pins. That won't be like that forever.

Thoughts? I hope this helps someone.

Sponsor: Your code is bad, but that’s ok thanks to Sentry’s full stack error monitoring that enables you to track and fix application errors in real time. Stop garbage code from becoming garbage fires.


© 2018 Scott Hanselman. All rights reserved.
     

Remote debugging with VS Code on Windows to a Raspberry Pi using .NET Core on ARM

Jan 18, 2019

Description:

I've been playing with my new "CrowPi" from Elecrow. It's a great Raspberry Pi STEM kit that is entirely self-contained in a small case. It includes a touch screen and a TON of sensors, LCDs, a matrix display, buzzers, a breadboard, etc.

NOTE: I talked to the #CrowPi people and they gave me an Amazon COUPON that's ~$70 off! The coupon is 8EMCVI56 and will work until Jan 31, add it during checkout. The Advanced Kit is at https://amzn.to/2SVtXl2 #ref and includes everything, touchscreen, keyboard, mouse, power, SNES controllers, motors, etc. I will be doing a full review soon. Short review is, it's amazing.

I was checking out daily builds of the new open source .NET Core System.Device.Gpio that lets me use C# to talk to the General Purpose Input/Output pins (GPIO) on the Raspberry Pi. However, my "developer's inner loop" was somewhat manual. The developer's inner loop is that "write code, run code, change code" loop that we all do. If you find yourself typing repetitive commands that deploy or test your code but don't write new code, you'll want to optimize that inner loop and get it down to one keystroke (or zero in the case of automatic tests).

Raspberry Pi Debugging with VS Code

In my example, I was writing my code in Visual Studio Code on my Windows machine, building the code locally, then running a "publish.bat" that would scp (secure copy) the resulting binaries over to the Raspberry Pi. Then in another command prompt that was ssh'ed into the Pi, I would chmod the resulting binary and run it. This was tedious and annoying; however, as programmers, sometimes we stop noticing it and just put up with the repetitive motion.

A good (kind of a joke, but not really) programmer rule of thumb is - if you do something twice, automate it.

I wanted not only to make the deployment automatic, but also, ideally, to be able to interactively debug my C#/.NET Core code remotely. That means I'm writing C# in Visual Studio Code on my Windows machine, I hit "F5" to start a debug session, my app is compiled, published, and run, a remote debugger running on the Raspberry Pi is attached, AND I'm dropped into a debugging session with a breakpoint set. All with one keystroke. This is common practice with local apps, but for remote apps - and ones that span two CPU architectures - it can take a smidge of setup.

Starting with instructions here: https://github.com/OmniSharp/omnisharp-vscode/wiki/Attaching-to-remote-processes and here: https://github.com/OmniSharp/omnisharp-vscode/wiki/Remote-Debugging-On-Linux-Arm and a little help from Jose Perez Rodriguez at work, here's what I came up with.

Setting up Remote Debugging from Visual Code on Windows to a Raspberry Pi running C# and .NET Core

First, I'm assuming you've got .NET Core on both your Windows machine and your Raspberry Pi. You've also installed Visual Studio Code on your Windows machine and you've installed the C# extension.

On the Raspberry Pi

I'm ssh'ing into my Pi from Windows 10. Windows 10 includes ssh out of the box now, but you can also ssh from WSL (Windows Subsystem for Linux).

Install the VS remote debugger on your Pi by running this command:

curl -sSL https://aka.ms/getvsdbgsh | /bin/sh /dev/stdin -v latest -l ~/vsdbg

To debug, you will need to run the program as root, so we'll need to be able to remote launch the program as root as well. For this, we first set a password for the root user on your Pi:

sudo passwd root

Then we enable ssh connections as root by running:

sudo nano /etc/ssh/sshd_config

and adding a line that reads:

PermitRootLogin yes

Then reboot the Pi: sudo reboot

VSDbg looks like this getting installed:

pi@crowpi:~/Desktop/rpitest$ curl -sSL https://aka.ms/getvsdbgsh | /bin/sh /dev/stdin -v latest -l ~/vsdbg
Info: Creating install directory
Using arguments
Version : 'latest'
Location : '/home/pi/vsdbg'
SkipDownloads : 'false'
LaunchVsDbgAfter : 'false'
RemoveExistingOnUpgrade : 'false'
Info: Using vsdbg version '16.0.11220.2'
Info: Previous installation at '/home/pi/vsdbg' not found
Info: Using Runtime ID 'linux-arm'
Downloading https://vsdebugger.azureedge.net/vsdbg-16-0-11220-2/vsdbg-linux-arm.zip
Info: Successfully installed vsdbg at '/home/pi/vsdbg'

At this point I've got vsdbg installed. You can go read about the MI Debug Engine here. "The Visual Studio MI Debug Engine ("MIEngine") provides an open-source Visual Studio Debugger extension that works with MI-enabled debuggers such as gdb, lldb, and clrdbg."

On the Windows Machine

Note that there are a half dozen ways to do this. I already had a publish.bat that looked like this (after installing putty with "choco install putty" on my Windows machine), so I kept using it. I'm a big fan of pushd and popd and I'll tell you this, they aren't used or known enough.

dotnet publish -r linux-arm /p:ShowLinkerSizeComparison=true
pushd .\bin\Debug\netcoreapp2.1\linux-arm\publish
pscp -pw raspberry -v -r .\* pi@crowpi.lan:/home/pi/Desktop/rpitest
popd
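If you'd rather not depend on putty at all, Windows 10's built-in OpenSSH client includes scp, so a sketch of the same batch file without pscp might look like this (mirroring the scp usage from my .NET Core on Raspberry Pi post):

dotnet publish -r linux-arm /p:ShowLinkerSizeComparison=true
pushd .\bin\Debug\netcoreapp2.1\linux-arm\publish
scp -r . pi@crowpi.lan:/home/pi/Desktop/rpitest
popd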

On Windows, I want to add two things to my .vscode folder. I'll need a launch.json that has my "Launch target" and I'll need some tasks in my tasks.json to support that. I added the "publish" task myself. My publish task calls out to publish.bat. It could also do the stuff above if I wanted. Note that I made publish "dependsOn" build, and I removed/cleared problemMatcher. If you wanted, you could write a regEx that would detect if the publish failed.

{
"version": "2.0.0",
"tasks": [
{
"label": "build",
"command": "dotnet",
"type": "process",
"args": [
"build",
"${workspaceFolder}/rpitest.csproj"
],
"problemMatcher": "$msCompile"
},
{
"label": "publish",
"type": "shell",
"dependsOn": "build",
"presentation": {
"reveal": "always",
"panel": "new"
},
"options": {
"cwd": "${workspaceFolder}"
},
"windows": {
"command": "${cwd}\\publish.bat"
},
"problemMatcher": []
}
]
}

Then in my launch.json, I have this to launch the remote console. This can be a little confusing because it's mixing paths that are local to Windows with paths that are local to the Raspberry Pi. For example, pipeProgram is using the Chocolatey installation of Putty's Plink. But program and args and cwd are all remote (or local to) the Raspberry Pi.

"configurations": [
{
"name": ".NET Core Launch (remote console)",
"type": "coreclr",
"request": "launch",
"preLaunchTask": "build",
"program": "/home/pi/dotnet/dotnet",
"args": ["/home/pi/Desktop/rpitest/rpitest.dll"],
"cwd": "/home/pi/Desktop/rpitest",
"stopAtEntry": false,
"console": "internalConsole",
"pipeTransport": {
"pipeCwd": "${workspaceFolder}",
"pipeProgram": "${env:ChocolateyInstall}\\bin\\PLINK.EXE",
"pipeArgs": [
"-pw",
"raspberry",
"root@crowpi.lan"
],
"debuggerPath": "/home/pi/vsdbg/vsdbg"
}
}
]

Note that the debuggerPath lines up with the location where we installed vsdbg above.

Remote debugging with VS Code on Windows to a Raspberry Pi using .NET Core

It's worth pointing out that while I'm doing this for C#, it's not C# specific. You could set up remote debugging with VS Code using these building blocks with any environment.

The result here is that my developer's inner loop is now just pressing F5! What improvements would YOU make?

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.
© 2018 Scott Hanselman. All rights reserved.
     

Installing the .NET Core 2.x SDK on a Raspberry Pi and Blinking an LED with System.Device.Gpio

Jan 16, 2019

Description:

The CrowPi from Elecrow is an amazing STEM kit

I've written about running .NET Core on Raspberry Pis before, although support was initially limited. Now that Linux ARM32 is a supported distro, what else can we do?

We can certainly quickly and easily install Docker on a Raspberry Pi and be running C# and .NET Core programs in minutes. We can run .NET Core on a stack of Raspberry Pis as a Kubernetes cluster, making our own tiny cloud, and install a serverless platform on it like OpenFaaS!

If you have a Raspberry Pi 3 with Raspbian on it like I do, check out https://dotnet.microsoft.com/download/dotnet-core/2.2 and note that last part of the URL. You can ask for /2.1, /2.0, etc, just in case you're reading this post in the future, like tomorrow. ;) Everything is always at https://dotnet.microsoft.com/download/archives so you can tell what's Current and what's not.

For example, if I end up here https://dotnet.microsoft.com/download/thank-you/dotnet-sdk-2.2.102-linux-arm32-binaries I can grab the exact blob URL from the "try again" link and then wget it on my Raspberry Pi. You'll want to get a few prerequisites first. Note these blob links change when new stuff comes out, so you'll want to double check to get latest.

sudo apt-get install curl libunwind8 gettext
wget https://download.visualstudio.microsoft.com/download/pr/9650e3a6-0399-4330-a363-1add761127f9/14d80726c16d0e3d36db2ee5c11928e4/dotnet-sdk-2.2.102-linux-arm.tar.gz
wget https://download.visualstudio.microsoft.com/download/pr/9d049226-1f28-4d3d-a4ff-314e56b223c5/f67ab05a3d70b2bff46ff25e2b3acd2a/aspnetcore-runtime-2.2.1-linux-arm.tar.gz

I got the Linux ARM 32-bit SDK as well as the ASP.NET Runtime so I have those packages available for any web apps I choose to make.

Then we'll extract. You can set it up as a user off of $HOME or in /opt/dotnet and then link to /usr/local/bin.

mkdir -p $HOME/dotnet && tar zxf dotnet-sdk-2.2.102-linux-arm.tar.gz -C $HOME/dotnet
export DOTNET_ROOT=$HOME/dotnet
export PATH=$PATH:$HOME/dotnet

Don't forget to untar the ASP.NET Runtime as well.

tar zxf aspnetcore-runtime-2.2.1-linux-arm.tar.gz -C $HOME/dotnet

Cool. You will want to add DOTNET_ROOT and the PATH to your profile if you want them to survive restarts.
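A minimal sketch of doing that, assuming the default bash shell on Raspbian (my addition):

echo 'export DOTNET_ROOT=$HOME/dotnet' >> ~/.profile
echo 'export PATH=$PATH:$HOME/dotnet' >> ~/.profile

Then run "dotnet --info" to see if it works.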

pi@crowpi:~ $ dotnet --info
.NET Core SDK (reflecting any global.json):
Version: 2.2.102

Runtime Environment:
OS Name: raspbian
OS Version: 9
OS Platform: Linux
RID: linux-arm
Base Path: /home/pi/dotnet/sdk/2.2.102/

Host (useful for support):
Version: 2.2.1

.NET Core SDKs installed:
2.2.102 [/home/pi/dotnet/sdk]

.NET Core runtimes installed:
Microsoft.AspNetCore.All 2.2.1 [/home/pi/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.2.1 [/home/pi/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.2.1 [/home/pi/dotnet/shared/Microsoft.NETCore.App]

Looks good.

At this point I have BOTH the .NET Core runtime (for running stuff) as well as all the ASP.NET runtime for web apps or little microservices AND the .NET SDK which means I can actually compile code (slowly) on the Pi itself. It's up to me/you. If you aren't ever going to develop (compile code) on the Raspberry Pi, you can just install the runtime, but I think it's nice to be prepared.

I am installing all this on a wonderful Raspberry Pi kit called a "CrowPi." They had a successful Kickstarter and are now selling a Raspberry Pi educational kit with an attached custom board with dozens of components. Rather than having to connect motion sensors, sound sensors, touch sensors, switches, buttons, and carry around a bunch of wires, you can experiment and play with stuff in a very organized case that also has a 7-inch HDMI touch screen. They also have 21 great Python video courses on their YouTube channel on how to get started with hardware. It's a joy of a device. More on that later.

NOTE: I talked to the #CrowPi people and they gave me an Amazon COUPON that's ~$70 off! The coupon is 8EMCVI56 and will work until Jan 31, add it during checkout. The Advanced Kit is at https://amzn.to/2SVtXl2 #ref and includes everything, touchscreen, keyboard, mouse, power, SNES controllers, motors, etc. I will be doing a full review soon. Short review is, it's amazing.

Now that .NET Core is installed, I can start exploring the fun happening over at https://github.com/dotnet/iot. It's filled with lots of new functionality inside of System.Device.Gpio. Remember that GPIO means "General Purpose Input/Output" which, on a Raspberry Pi, is connected to a ribbon cable on the CrowPi with lots of cool sensors ready to go!

I can build my Raspberry Pi apps on my Windows/Mac/Linux machine, where they'll compile much faster, then "scp" (secure copy) them over to the Pi. It's nice to point out that Windows 10 includes scp.exe by default now!

In this example, by adding -r linux-arm I'm copying a complete self-contained app over to the Pi, so I don't actually need to install .NET Core like I did above. If instead I didn't use -r (to declare a specific runtime), then I would need to make sure I've got the right versions on my dev box vs my RPi, so consider what's best for you.

Here I am in my Windows machine that also has the same version of the .NET Core SDK installed. I'm in .\rpitest with a console app I made with "dotnet new console." Now I want to build and copy it over to the Pi.

dotnet publish -r linux-arm
cd bin\Debug\netcoreapp2.1\linux-arm\publish
scp -r . pi@crowpi:/home/pi/Desktop/rpitest

From the Pi, I'll need to "sudo chmod +x" the rpitest application to make sure it is executable.
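On the Pi, that one-time step plus a first run look something like this (path from the scp above; sudo because GPIO access currently requires root, as noted in my remote debugging post):

cd /home/pi/Desktop/rpitest
sudo chmod +x ./rpitest
sudo ./rpitest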

There's a brilliant video from Cam Soper that shows you in great detail how to run .NET Core 2.x on a Raspberry Pi and I recommend you check it out as well.

IoT devices expose much more than serial ports. They typically expose multiple kinds of pins that can be programmatically used to read sensors, drive LED/LCD/eInk displays and communicate with our devices. .NET Core now has APIs for GPIO, PWM, SPI, and I²C pin types.

These APIs are available via the System.Device.GPIO NuGet package. It will be supported for .NET Core 2.1 and later releases. There's some basic samples here https://github.com/dotnet/iot/blob/master/samples/README.md to start with.

From Microsoft:

Most of our effort has been spent on supporting these APIs in Raspberry Pi 3. We plan to support other devices, like the Hummingboard. Please tell us which boards are important to you. We are in the process of testing Mono on the Raspberry Pi Zero.

For now System.Device.Gpio is a prerelease, so you'll want to add a nuget.config to your project with the path to the dailies:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
<packageSources>
<clear />
<add key="myget.org" value="https://dotnet.myget.org/F/dotnet-core/api/v3/index.json" />
<add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
</packageSources>
</configuration>

Add a reference to System.Device.Gpio, which (at the time of this writing) is version 0.1.0-prerelease.19065.1. Now let's do something!

Here I'm just blinking this LED!

Console.WriteLine("Hello World!");
GpioController controller = new GpioController(PinNumberingScheme.Board);
var pin = 37;
var lightTime = 300;

controller.OpenPin(pin, PinMode.Output);
try {
while (true) {
controller.Write(pin, PinValue.High);
Thread.Sleep(lightTime);
controller.Write(pin, PinValue.Low);
Thread.Sleep(lightTime);
}
}
finally {
controller.ClosePin(pin);
}

Yay! Step zero works! Every cool IoT project starts with a blinking LED!

Blinking LEDs ZOMG

Do be aware that System.Device.Gpio is moving VERY fast and some of this code and the samples may not work if namespaces or class names change. It'll settle down soon.

Great stuff though! Go get involved over at https://github.com/dotnet/iot as they are actively working on drivers/abstractions for Windows, Linux, etc and you could even submit a PR for a device like an LCD or simple sensor! I've only been playing for an hour but I will report back as I try new experiments with my kids.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

How to update the firmware on your Zune, without Microsoft, dammit.

Jan 11, 2019

Description:

A glorious little Zune

As I said on social media today, it's 2019 and I'm updating the firmware on a Zune, fight me. ;) There's even an article on Vice about the Zune diehards. The Zune is a deeply under-respected piece of history and its UI marked the start of Microsoft's fluent design.

Seriously, though, I got this Zune and it's going to be used by my 11 year old because I don't want him to have a phone yet. He's got a little cheap no-name brand MP3 player and he's filled it up and basically outgrown it. I could get him an iPod Touch or something but he digs retro things (GBC, GBA, etc) so my buddy gave me a Zune in the box. Hasn't been touched...but it has a super old non-metro UI firmware.

Can a Zune be updated in 2019? Surely it can. Isn't Zune dead? I hooked up a 3DO to my 4k flatscreen last week, so it's dead when I say it's dead.

IMPORTANT UPDATE: After I spent time figuring this out, I found out on Twitter that there's a small but active Zune community on Reddit! Props to them for doing this in several ways as well. The simplest way to update today is to point resources.zune.net to zuneupdate.com's IP address in your hosts file. The way I did it uses the files directly from Microsoft and gives you full control, but it's overly complex for regular folks for as long as the zuneupdate.com server exists as a mirror. Use the method that works easier for you and that you trust and understand!

First, GET ZUNE: the Zune Software version 4.8 is up at the Microsoft Download Center and it installs just fine on Windows 10. I've also made a copy in my Dropbox if this ever disappears. You should too!

Second, GET FIRMWARE: the Zune Firmware is still on the Microsoft sites as well. This is an x86 MSI so don't bother trying to install it; we're going to open it up like an archive instead. Save this file forever.

There's a half dozen ways to crack open an MSI. Since not everyone who reads this blog is a programmer, the easiest way is to download lessmsi and use it to open and extract the firmware MSI. It's an MSI-specific extractor, but it's nicer than 7zip because it extracts the files with the correct names. If you use Chocolatey, it's just "choco install lessmsi" then run "lessmsi-gui." LessMSI will put the files in a deep folder structure. You'll want to move them and have all your files right at the top of c:\users\YOURNAME\downloads\zunestuff. We will make some other small changes a little later on here.

LessMSI

If you really want to, you could install 7zip and extract the contents of the Zune Firmware MSI into a new folder, but I don't recommend it as you'll need to rename the files and give them the correct extensions. NERDS: you can also use msiexec from the command line, but I'm trying to keep this super simple.

Third, FAKE THE ZUNE UPDATE SERVER: Since the Zune servers are gone, you need to pretend to be the old Zune server. The Zune Software will "phone home" to Microsoft at resources.zune.net (which is gone) to look for firmware. Since the Zune software was made in a simpler time (a decade ago) it doesn't use SSL or do any checking of the cert to confirm the identity of the Zune server. This would be sad in 2019, but it's super useful to us when bringing this old hardware back to life. Again, there's a half dozen ways to do this. Feel free to do whatever makes you happy, as an HTTP GET is an HTTP GET, isn't it?

NERDS: If you use Fiddler or any HTTP sniffer you can launch the Zune software and see it phone home for resources.zune.net/firmware/v4_5/zuneprod.xml and get a 404. If it had found this, it'd look at your Zune model and then figure out which cab (cabinet) archive to get the firmware from. We can easily spoof this HTTP GET.

NERDS^2: Why didn't I use the Fiddler AutoResponder to record and replay the HTTP GETs? I tried. However, there's a number of different files that the Zune software could request, I only have the one Zune, and I couldn't figure out how to model it in Fiddler. If I could, we could just install Fiddler and avoid editing the hosts file AND running a tiny web server.

From an admin command prompt, run notepad \windows\system32\drivers\etc\hosts and add this line:

127.0.0.1 resources.zune.net

This says "if you ever want stuff from resources.zune.net, I'll handle it myself." Who is "myself?" It's our computer! It'll be a little web server you (or I) will run on our own, on localhost AKA 127.0.0.1.

Now download .NET Core - it's a small and fast-to-install programming environment. Don't worry, we aren't coding, we are just using the tools it includes. It won't mess up your machine or install anything at startup. Grab any 2.x .NET SDK from https://dot.net and install it from an MSI. Then go to a command prompt and run these commands. First we'll run dotnet once to warm it up, then we'll install a tiny static webserver called dotnet-serve and run it from our zunestuff folder. See below:

dotnet
dotnet tool install --global dotnet-serve
cd c:\users\YOURNAME\downloads\zunestuff
dotnet serve -p 80

If you get any errors that dotnet serve can't be found, just close the command prompt and open it again to update your PATH. If you get errors that port 80 is in use, be sure to stop IIS or Skype Desktop or anything else that might be listening on port 80.

Now, remember where I said you'd extract all those cabs and files out of the Firmware MSI? When we load the Zune software and watch network traffic, we see it's asking for resources.zune.net/firmware/v4_5/zuneprod.xml, and we need to answer (since Zune is gone, it's on us now). You'll want to make folders like this: C:\users\YOURNAME\downloads\zunestuff\firmware\v4_5. Copy/rename FirmwareUpdate.xml to zuneprod.xml and have it live in that directory. It'll look like this:

A file hierarchy under zunestuff

The zuneprod.xml file has relative URLs inside like this, one for each model of the Zune, that map from USB hardware id to cab file. Open zuneprod.xml in a text editor and add http://resources.zune.net/ before each of the firmware file cabinets. For example, if you're using notepad, your find and replace will look like this:

Replace URL=" with URL="http://resources.zune.net/

<FirmwareUpdate DeviceClass="1"
FamilyID="3"
HardwareID="USB\Vid_045e&amp;Pid_0710&amp;Rev_0300"
Manufacturer="Microsoft"
Name="Zune"
Version="03.30.00039.00-01620"
URL="DracoBaseline.cab">

UPDATE: As mentioned above, I did all this work (about an hour of traffic sniffing) and spoofed the server locally, then found out that someone had made http://zuneupdate.com as an online static spoof! It also doesn't use HTTPS, and if you'd like, you can skip the local spoof and instead point resources.zune.net in your \windows\system32\drivers\etc\hosts file at its IP address - which at the time of this writing was 66.115.133.19. Your hosts file would look like this, in that case. If this useful resource ever goes away, use the localhost hack above.

66.115.133.19 resources.zune.net

Now run the Zune software and connect your Zune. Notice here that I know it's working because I launch the Zune app, go to Settings | Device then Update, and I can see dotnet serve in my other window serving zuneprod.xml in response.
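Either way, a quick sanity check (my addition) is to ping the name and confirm it resolves to the address you put in the hosts file:

ping resources.zune.net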

Required Update

It's worth pointing out that the original Zune server was somewhat smart and would only return firmware if you NEEDED a firmware update. Since we are faking it, we always return the same response, so you may be prompted to install new firmware if you manually ask for an update. But you only need to get on the latest (3.30 for my brown Zune 30) and then you're good...forever.

image

Enjoy!

Your iPod SucksZune is the way

Guardians 2 Zune

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Relationship Hacks: Playing video games and having hobbies while avoiding resentment

Jan 7, 2019

Description:

Super Nintendo Controller from Pexels

I'm going to try to finish my Relationship Hacks book in 2019. I've been sitting on it too long. I'm going to use blog posts to spur myself into action.

A number of people asked me what projects, what code, what open source I did over the long holiday. ZERO. I did squat. I played video games, in fact. A bunch of them. I felt a little guilty then I got over it.

The Fun of Finishing - Exploring old games with Xbox Backwards Compatibility

I'm not a big gamer but I like a good story. I do single player with a plot. I consider a well-written video game to be up there with a good book or a great movie. I like a narrative and a beginning and end. Since it was the holidays, it did require some thought to play games.

When you're in a mixed relationship (a geek/techie and a non-techie) you need to be respectful of your partner's expectations. The idea of burning 4-6 hours playing games likely doesn't match up with your partner's idea of a good time. That's where communication comes in. We've found this simple system useful. It's non-gendered and should work for all types of relationships.

My spouse and I sat down at the beginning of our holiday vacation and asked each other "What do you hope to get out of this time?" Setting expectations up front avoids quiet resentment building later. She had a list of to-dos and projects, I wanted to veg.

Sitting around all day (staycation) is valid, as is using the time to take care of business (TCB). We set expectations up front to avoid conflict. We effectively scheduled my veg time so it was planned and accepted and it was *ok and guilt-free*

We've all seen the trope of the gamer hyper-focused on their video game while the resentful partner looks on. My spouse and I want to avoid that - so we do. If she knows I want to immerse myself in a game, a simple heads up goes a LONG way. We sit together, she reads, I play.

It's important to not sneak these times up on your partner. "I was planning on playing all night" can butt up against "I was hoping we'd spend time together." Boom, conflict and quiet resentment can start. Instead, a modicum of planning. A simple headsup and balance helps.

I ended up playing about 2-3 days a week, from around 8-9pm to 2am (so a REAL significant amount of time) while we hung out on the other 4-5 days. My time was after the kids were down. My wife was happy to see me get to play (and finish!) games I'd had for years.

Also, there's the recognition from my spouse that while she doesn't personally value my gaming time, she values that *I* value it. Avoid belittling or diminishing your partner's hobby. If you do, you'll find yourself pushing (or being pushed) away.

One day perhaps I'll get her hooked on a great game and one day I'll enjoy a Hallmark movie. Or not. ;) But for now, we enjoy knowing and respecting that we each enjoy (and sometimes share) our hobbies. End of thread.

If you enjoy my wife's thinking, check her out on my podcast in The Return of Mo. My wife and I also did a full podcast about our Cancer Year - Mo and Scott share their thoughts and struggles in this cancer diary they started the day after Mo was diagnosed.

Hope you find this helpful.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Using Visual Studio Code to program Circuit Python with an AdaFruit NeoTrellis M4

Dec 26, 2018

Description:

My son and I were working on an Adafruit NeoTrellis M4 Mainboard over the holidays. This amazing little device packs NeoPixels + an audio board + a USB port along with a 120 MHz Cortex M4 core and a mic amplifier, and you can program it with CircuitPython. CircuitPython is open source and on GitHub at https://github.com/adafruit/circuitpython. "CircuitPython is an education friendly open source derivative of MicroPython." It works with a bunch of boards including this NeoTrellis and it's just lovely for teaching and learning.

This item is just the mainboard! You'll almost certainly want two Silicone Elastomer 4x4 Pads and an enclosure to go along.

Circuit Python

As with a lot of these small boards, when you plug a NeoTrellis into your machine via USB you'll get a new disk drive that pops up. All you have to do to "deploy" your code is copy it to that drive. Even better, why not just edit the code in place?

There's a great Python editor called Mu that works well with Circuit Python. However, my son and I are more familiar with Visual Studio Code so we wanted to see how it worked with Circuit Python.

We installed the Python extension for VS Code as well as the Arduino extension for VS Code and the Arduino IDE directly from the Windows Store.

Fire up VS Code and File | Open Folder and open the Disk Drive of the NeoTrellis and open (or create) a code.py file. Then from the Command Palette (Ctrl-Shift-P) in VS Code select Arduino > Initialize. If you get an error you may need to set up the path to your Arduino IDE. If you installed it from the Windows Store like we did you may find it in a weird path. We set the arduino.path like this:

"arduino.path": "C:\\Program Files\\WindowsApps\\ArduinoLLC.ArduinoIDE_1.8.19.0_x86__mdqgnx93n4wtt"

The NeoTrellis M4 also shows up as a COM port so you can look at its Serial Output for debugging purposes as if it were an Arduino (because it is underneath). You then Arduino > Select a COM Port from the Command Palette and it will create a file called .vscode/arduino.json in your folder that will look like this:

{
"port": "COM3"
}

Trellis M4 is awesome

Now, within Visual Studio Code, select Arduino > Open Serial Monitor and all of your print("") output will appear in that bottom pane.

Of course, we could putty into the COM port, but since I'm using this as a learning tool with my 11 year old, I find that a single window showing both the console and the code helps them focus, rather than managing multiple windows.

At this point we have a nice Developer Inner Loop going. That inner loop for us (the developers) is that we can write some code, hit save (Ctrl-S) and get immediate feedback. The application restarts when it detects the code.py file has changed and any debug (print) statements appear in the console immediately.

Visual Studio Code doing some Circuit Python

We are really enjoying this Adafruit NeoTrellis M4 Express kit. Next we're going to make a beat sequencer since the Christmas Soundboard was such a hit with mom!

Sponsor: Me! Hey friends, I've got a podcast I'm very proud of where I interview amazing people every week. Check it out at https://www.hanselminutes.com and please not only subscribe in your favorite podcasting app, but also tell your friends! Tweet about it and review it on iTunes/Google Play. Thanks!


© 2018 Scott Hanselman. All rights reserved.
     

The Fun of Finishing - Exploring old games with Xbox Backwards Compatibility

Dec 21, 2018

Description:

Star Wars: KOTOR

I'm on vacation for the holidays and I'm finally getting some time to play video games. I've got an Xbox One X that is my primary machine, and I also have a Nintendo Switch that is a constant source of joy. I recently also picked up a very used original PS4 just to play Spider-Man, but expanded to a few other games as well.

One of the reasons I end up using my Xbox more than any of my other consoles is its support for Backwards Compatibility. Backwards Compat is so extraordinary that I did an entire episode of my podcast on the topic with one of the creators.

The general idea is that an Xbox should be able to play Xbox games. Let's take that even further - today's Xbox should be able to play today's Xbox games AND yesterday's...all the way back to the beginning. One step further still, shall we? Today's Xbox should be able to play all Xbox games from every console generation and they'll look better than you imagined them!

The Xbox One X can take 720p games and upscale them to 4k, use higher quality textures, and some games like Final Fantasy XIII have even been fully remastered but you still use the original disc! I would challenge you to play the original Red Dead Redemption on an Xbox One X and not think it was a current generation game. I recently popped in a copy of Splinter Cell: Conviction and it automatically loaded a 5-year-old save game from the cloud and I was on my way. I played Star Wars: KOTOR - an original Xbox game - and it looks amazing.

Red Dead Redemption

A little vacation combined with a lot of backwards compatibility has me actually FINISHING games again. I've picked up a ton of games this week and finally had that joy of finishing them. Each game I started up that had a save game found me picking up 60% to 80% into the game. Maybe I got stuck, perhaps I didn't have enough time. Who knows? But I finished. Most of these finishings were just 3 to 5 hours of pushing from my current (old, original) save games.

Crysis 2 - An Xbox 360 game that now works on an Xbox One X. I was halfway through and finished it up in a few days.
Crysis 3 - Of course I had to go to the local retro game trader and pick up a copy for $5 and bang through it. Crysis is a great trilogy.
Dishonored - I found a copy in my garage while cleaning. Turns out I had a save game in the Xbox cloud since 2013. I started right from where I left off. It's so funny to see a December 2018 save game next to a 2013 save game.
Alan Wake - Kind of a Twin Peaks type story, or a Stephen King with a flashlight and a gun. Gorgeous game, and very innovative for the time.
Mirror's Edge - Deceptively simple graphics that look perfect on 4k. This isn't just upsampling, to be clear. It's magic.
Metro 2033 - Deep story and a lot of world building. Oddly I finished Metro: Last Light a few months back but never did the original.
Sunset Overdrive - It's so much better than Jet Set Radio Future. This game has a ton of personality and they recorded ALL the lines twice with a male and female voice. I spoke to the voiceover artist for the female character on Twitter and I really think her performance is extraordinary. I had so much fun with this game that now the 11 year old is starting it up. An under-respected classic.
Gears of War Ultimate - This is actually the complete Gears series. I was over halfway through all of these but never finished. Gears are those games where you play for a while and end up pausing and googling "how many chapters in gears of war." They are long games. I ended up finishing on the easiest difficulty. I want a story and I want some fun but I'm not interested in punishment.
Shadow Complex - Also surprisingly long; I apparently (per my save game) gave up with just an hour to go. I guess I didn't realize how close I was to the end?

I'm having a blast (while the spouse and kids sleep, in some cases) finishing up these games. I realize I'm not actually accomplishing anything but the psychic weight of the unfinished is being lifted in some cases. I don't play a lot of multiplayer games as I enjoy a story. I read a ton of books and watch a lot of movies, so I look for a tale when I'm playing video games. They are interactive books and movies for me with a complete story arc. I love it when the credits roll. A great single player game with a built-up universe is as satisfying (or more so) as finishing a good book.

What are you playing this holiday season? What have you rediscovered due to Backwards Compatibility?

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Enjoy some DOS Games this Christmas with DOSBox

Dec 19, 2018

Description:

I blogged about DOSBox five years ago! Apparently I get nostalgic around this time of year when I've got some downtime. Here's what I had to say:

I was over at my parents' house for the Christmas holiday and my mom pulled out a bunch of old discs and software from 20+ years ago. One game was "Star Trek: Judgment Rites" from 1995. I had the CD-ROM Collector's Edition with all the audio from the original actors, not just the floppy version with subtitles. It's a MASSIVE 23 megabytes of content!

DOSBox has been providing joy in its reliable service for over 16 years and you should go check it out RIGHT NOW, if only to remind yourself of how good we have it now. DOSBox is an x86 and DOS emulator - not a virtual machine. It emulates classic hardware like Sound Blaster cards and older graphics standards like VGA/VESA.

If a game runs too fast, you can slow it down by pressing Ctrl-F11. You can speed up games by pressing Ctrl-F12. DOSBox’s CPU speed is displayed in its title bar. Type "intro special" for a full hotkey list.

Note that DOSBox will start up TINY if you have a 4k monitor. There are a few things you can do about it. First, ALT-ENTER will toggle DOSBox into full screen mode, although when you return to Windows your windows may find themselves resized.

For Windowed mode, I used these settings. You can't scale the window when output=surface, so experiment with settings like these:

windowresolution=1280x1024
output=ddraw

These are only the most basic initial changes you'll want to make. There's an enthusiastic community of DOSBox users that are dedicated to making it as perfect as possible. I enjoy this reddit thread debating "pixel perfect" settings. There's also a number of forks and custom builds of DOSBox out there that impose specific settings so be sure to explore and pick the one that makes you happy. It's also important to understand that aspect ratios and the size and squareness of a pixel will all change how your game looks.

I tend to agree with them that I don't want a blurry scaler. I want the dots/pixels as they are, simply made larger (2x, 3x, 4x, etc) with crisp edges at a reasonable aspect ratio. An interesting change you can make to your .conf file is the "forced" keyword after your scaler choice.

Here is scaler=normal3x (no forced)

Blurry DOSBox

and here's scaler=normal3x forced

The instructions say that forced means "the scaler will be used even if the result might not be desired." In this case, it forces the use of the scaler in text mode. Your mileage may vary, but the point is there's options and it's great fun. You may want scanlines or you may want crisp pixels.

I've found it all depends on what your memory of DOS is and what you're trying to do is to change the settings to best visualize that memory. My (broken) memory is of CRISP pixels.

Crisp DOSBox

Amazing difference!
The first thing you should do is add lines like these to the bottom of your dosbox.conf. You'll want your virtual C: drive mounted every time DOSBox starts up!

[autoexec]
# Lines in this section will be run at startup.
MOUNT C: C:\Users\scott\Dropbox\DosBox
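Pulling the earlier display settings together, the relevant sections of my dosbox.conf end up looking something like this (a sketch; these are standard DOSBox options, values to taste):

[sdl]
windowresolution=1280x1024
output=ddraw

[render]
aspect=true
scaler=normal3x forced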

If you want to play classic games but don't want the hassle (or questionable legality) of other ways, I'd encourage you to spend some serious time at https://www.gog.com. They've packaged up a ton of classic games so they "just work."

Bard's Tale 3 Space Quest 3

Enjoy! And THANK YOU to the folks that work on DOSBox for their hard work. It shows and we appreciate it.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Useful ASP.NET Core 2.2 Features

Dec 14, 2018

Description:

Earlier this week I talked about how I upgraded my podcast site to ASP.NET Core 2.2 and added Health Check features fairly easily. There's a ton of new features and so far it's been great running on my site with no issues. Upgrading from 2.1 is straightforward.

Better integration with popular Open API (Swagger) libraries including design-time checks with code analyzers
Introduction of Endpoint Routing with up to 20% improved routing performance in MVC
Improved URL generation with the LinkGenerator class & support for route Parameter Transformers (and a post from Scott Hanselman)
New Health Checks API for application health monitoring
Up to 400% improved throughput on IIS due to in-process hosting support
Up to 15% improved MVC model validation performance
Problem Details (RFC 7807) support in MVC for detailed API error results
Preview of HTTP/2 server support in ASP.NET Core
Template updates for Bootstrap 4 and Angular 6
Java client for ASP.NET Core SignalR
Up to 60% improved HTTP Client performance on Linux and 20% on Windows

I wanted to look at just a few of these that I found particularly interesting.

You can get a very significant performance boost by moving ASP.NET Core in process with IIS.

Using in-process hosting, an ASP.NET Core app runs in the same process as its IIS worker process. This removes the performance penalty of proxying requests over the loopback adapter when using the out-of-process hosting model.

After the IIS HTTP Server processes the request, the request is pushed into the ASP.NET Core middleware pipeline. The middleware pipeline handles the request and passes it on as an HttpContext instance to the app's logic. The app's response is passed back to IIS, which pushes it back out to the client that initiated the request.
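In ASP.NET Core 2.2 you opt into in-process hosting in the project file. Here's a minimal sketch of the relevant property, assuming a standard Microsoft.NET.Sdk.Web project:

<PropertyGroup>
  <TargetFramework>netcoreapp2.2</TargetFramework>
  <!-- run inside the IIS worker process instead of proxying to a separate Kestrel process -->
  <AspNetCoreHostingModel>InProcess</AspNetCoreHostingModel>
</PropertyGroup>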

HTTP Client performance improvements are quite significant as well.

Some significant performance improvements have been made to SocketsHttpHandler by improving the connection pool locking contention. For applications making many outgoing HTTP requests, such as some Microservices architectures, throughput should be significantly improved. Our internal benchmarks show that under load HttpClient throughput has improved by 60% on Linux and 20% on Windows. At the same time the 90th percentile latency was cut down by a factor of two on Linux. See GitHub #32568 for the actual code change that made this improvement.

HTTP/2 is enabled by default. HTTP/2 may be sneaking up on you as for the most part "it just works." In ASP.NET Core's Kestrel web server HTTP/2 is enabled by default over HTTPS. You can see here at both the command line and in Chrome I'm using HTTP/2 locally.

HTTP/2 locally

Here's Chrome. Note the "h2."

HTTP/2 in Chrome

Note that you'll only be able to get HTTP/2 when ALPN (Application-Layer Protocol Negotiation) is available. That means ALPN is supported on:

.NET Core on Windows 8.1/Windows Server 2012 R2 or higher
.NET Core on Linux with OpenSSL 1.0.2 or higher (e.g., Ubuntu 16.04)
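If you want to be explicit about it (or need to change the defaults), Kestrel lets you set the protocols per endpoint. A minimal sketch, assuming the default development certificate for HTTPS:

// Program.cs sketch - uses Microsoft.AspNetCore, Microsoft.AspNetCore.Hosting,
// and Microsoft.AspNetCore.Server.Kestrel.Core
public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
    WebHost.CreateDefaultBuilder(args)
        .UseKestrel(options =>
        {
            options.ListenAnyIP(5001, listenOptions =>
            {
                listenOptions.UseHttps(); // ALPN negotiates h2 over TLS
                listenOptions.Protocols = HttpProtocols.Http1AndHttp2;
            });
        })
        .UseStartup<Startup>();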

All in all, it's a solid release. Go check out the announcement post on ASP.NET Core 2.2 for even more detail!

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

How to set up ASP.NET Core 2.2 Health Checks with BeatPulse's AspNetCore.Diagnostics.HealthChecks

Dec 12, 2018

Description:

Availability Tests

ASP.NET Core 2.2 is out and released, and upgrading my podcast site was very easy. Once I had it updated I wanted to take advantage of some of the new features.

For example, I have used a number of "health check" services like elmah.io, pingdom.com, or Azure's Availability Tests. I have tests that ping my website from all over the world and alert me if the site is down or unavailable.

I've wanted to make my Health Endpoint Monitoring more formal. You likely have a service that does an occasional GET request to a page and looks at the HTML, or maybe just looks for an HTTP 200 response. For the longest time most site availability tests were just basic pings. Recently folks have been formalizing their health checks.

You can make these tests more robust by actually having the health check endpoint check deeper and then return something meaningful. That could be as simple as "Healthy" or "Unhealthy" or it could be a whole JSON payload that tells you what's working and what's not. It's up to you!

image

Is your database up? Maybe it's up but in read-only mode? Are your dependent services up? If one is down, can you recover? For example, I use some 3rd party back-end services that might be down. If one is down I could use cached data, but my site is less than "Healthy," and I'd like to know. Is my disk full? Is my CPU hot? You get the idea.
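Each of those deeper checks can be its own class using the built-in IHealthCheck interface. Here's a minimal sketch - the DiskSpaceHealthCheck name and the 1 GB threshold are made up for illustration:

using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Diagnostics.HealthChecks;

public class DiskSpaceHealthCheck : IHealthCheck
{
    public Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context, CancellationToken cancellationToken = default)
    {
        // Report Degraded (not Unhealthy) when the drive is nearly full -
        // the site still works, but I'd want to know.
        var drive = new DriveInfo(Path.GetPathRoot(Environment.CurrentDirectory));
        var result = drive.AvailableFreeSpace > 1_000_000_000L
            ? HealthCheckResult.Healthy("Plenty of disk space")
            : HealthCheckResult.Degraded("Disk is nearly full");
        return Task.FromResult(result);
    }
}

You'd register it alongside the other checks with services.AddHealthChecks().AddCheck<DiskSpaceHealthCheck>("disk");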

You also need to distinguish between a "liveness" test and a "readiness" test. Liveness failures mean the site is down, dead, and needs fixing. Readiness tests mean it's there but perhaps isn't ready to serve traffic. Waking up, or busy, for example.

If you just want your app to report its liveness, just use the most basic ASP.NET Core 2.2 health check in your Startup.cs. It'll take you minutes to set up.

// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
services.AddHealthChecks(); // Registers health check services
}

public void Configure(IApplicationBuilder app)
{
app.UseHealthChecks("/healthcheck");
}

Now you can add a content check in your Azure or Pingdom monitoring, or tell Docker or Kubernetes if you're alive or not. Docker has a HEALTHCHECK directive, for example:

# Dockerfile
...
HEALTHCHECK CMD curl --fail http://localhost:5000/healthcheck || exit 1

If you're using Kubernetes you could hook up the Healthcheck to a K8s "readinessProbe" to help it make decisions about your app at scale.
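Here's a sketch of what that might look like in a pod spec - the path matches the basic example above, and the port and timings are assumptions for illustration:

# readinessProbe sketch - assumes the app listens on port 5000
readinessProbe:
  httpGet:
    path: /healthcheck
    port: 5000
  initialDelaySeconds: 5
  periodSeconds: 10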

Now, since determining "health" is up to you, you can go as deep as you'd like! The BeatPulse open source project has integrated with the ASP.NET Core Health Check API and set up a repository at https://github.com/Xabaril/AspNetCore.Diagnostics.HealthChecks that you should absolutely check out!

Using these add-on methods you can check the health of everything - SQL Server, PostgreSQL, Redis, ElasticSearch, any URI, and on and on. Just add the package you need and then add the extension you want.

You don't usually want your health checks to be heavy but as I said, you could take the results of the "HealthReport" list and dump it out as JSON. If this is too much code going on (anonymous types, all on one line, etc) then just break it up. Hat tip to Dejan.

app.UseHealthChecks("/hc",
new HealthCheckOptions {
ResponseWriter = async (context, report) =>
{
var result = JsonConvert.SerializeObject(
new {
status = report.Status.ToString(),
errors = report.Entries.Select(e => new { key = e.Key, value = Enum.GetName(typeof(HealthStatus), e.Value.Status) })
});
context.Response.ContentType = MediaTypeNames.Application.Json;
await context.Response.WriteAsync(result);
}
});

At this point my endpoint doesn't just say "Healthy," it looks like this nice JSON response.

{
status: "Healthy",
errors: [ ]
}

I could add a Url check for my back end API. If it's down (or in this case, unauthorized) I'll get a nice explanation. I can decide if this means my site is unhealthy or degraded. I'm also pushing the results into Application Insights, which I can then query on and make charts against.

services.AddHealthChecks()
.AddApplicationInsightsPublisher()
.AddUrlGroup(new Uri("https://api.simplecast.com/v1/podcasts.json"),"Simplecast API",HealthStatus.Degraded)
.AddUrlGroup(new Uri("https://rss.simplecast.com/podcasts/4669/rss"), "Simplecast RSS", HealthStatus.Degraded);

Here is the response, cool, eh?

{
status: "Degraded",
errors: [
{
key: "Simplecast API",
value: "Degraded"
},
{
key: "Simplecast RSS",
value: "Healthy"
}
]
}

This JSON is custom, but perhaps I could use a built-in writer for a reasonable default and then hook up a free default UI?

app.UseHealthChecks("/hc", new HealthCheckOptions()
{
Predicate = _ => true,
ResponseWriter = UIResponseWriter.WriteHealthCheckUIResponse
});

app.UseHealthChecksUI(setup => { setup.ApiPath = "/hc"; setup.UiPath = "/healthcheckui"; });

Then I can hit /healthcheckui and it'll call the API endpoint and I get a nice little bootstrappy client-side front end for my health check. A mini dashboard if you will. I'll be using Application Insights and the API endpoint but it's nice to know this is also an option!

If I had a database I could check one or more of those for health as well. The possibilities are endless and up to you.

public void ConfigureServices(IServiceCollection services)
{
services.AddHealthChecks()
.AddSqlServer(
connectionString: Configuration["Data:ConnectionStrings:Sql"],
healthQuery: "SELECT 1;",
name: "sql",
failureStatus: HealthStatus.Degraded,
tags: new string[] { "db", "sql", "sqlserver" });
}

It's super flexible. You can even set up ASP.NET Core Health Checks to have a webhook that sends a Slack or Teams message that lets the team know the health of the site.

Check it out. It'll take less than an hour or so to set up the basics of ASP.NET Core 2.2 Health Checks.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

How to remove words from the Windows Autocorrect Spell Check Dictionary

Dec 7, 2018

Description:

Well crap. I was typing really fast and got a squiggly, so I right-clicked on it and rather than selecting the correct word from the autocorrect dictionary, I clicked Add To Dictionary.

I added the MISSPELLED WORD to the Dictionary! Now Windows is suggesting that I spell this word (and others) wrong in all apps.

At this point I also realized that I had no idea how to REMOVE a word from the Windows Spell Check Dictionary. However, I do know that Windows isn't a black box so there must be a dictionary somewhere. It's gotta be a file or a registry key or something, right?

It's even easier than I thought it would be. The Windows 10 custom dictionaries are at %AppData%\Microsoft\Spelling\


I just opened the default.dic file in Notepad and removed the misspelled word.
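If you like one-liners, something like this from PowerShell gets you there. On my machine the custom dictionaries live under a per-language subfolder; the en-US folder name is an assumption, so check what's in your own Spelling folder:

# opens the US English custom dictionary for editing (adjust the language folder to taste)
notepad "$env:APPDATA\Microsoft\Spelling\en-US\default.dic"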

Opening default.dic in Notepad

Whew. I can't tell you how many wrong words have found their way in there over the years. Hope this helps you in some small way.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.
© 2018 Scott Hanselman. All rights reserved.
     

Announcing WPF, WinForms, and WinUI are going Open Source

Dec 4, 2018

Description:

Buckle up friends! Microsoft is open sourcing WPF, Windows Forms (winforms), and WinUI, so the three major Windows UX technologies are going open source! All this is happening on the same day as .NET Core 3.0 Preview 1 is announced. Madness! ;)

.NET Core 3 is a major update which adds support for building Windows desktop applications using Windows Presentation Foundation (WPF), Windows Forms, and Entity Framework 6 (EF6). Note that .NET Core 3 continues to be open source and runs on Windows, Linux, Mac, in containers, and in the cloud. In the case of WPF/WinForms/etc you'll be able to create apps for Windows that include (if you like) their own copy of .NET Core for a clean side-by-side install and even faster apps at run time. The Windows UI XAML Library (WinUI) is also being open sourced AND you can use these controls in any Windows UI framework.

That means your (or my!) WPF/WinForms/WinUI apps can all use the same controls if you like, using XAML Islands. I could take the now 10 year old BabySmash WPF app and add support for pens, improved touch, or whatever makes me happy!

WPF and Windows Forms projects are run under the .NET Foundation which also announced changes today and the community will guide foundation operations. The .NET Foundation is also changing its governance model by increasing the number of board members to 7, with just 1 appointed by Microsoft. The other board members will be voted on by the community! Anyone who has contributed to a .NET Foundation project can run, similar to how the Gnome Foundation works! Learn more about the .NET Foundation here.

On the runtime and versioning side, here's a really important point from the .NET blog that's worth emphasizing IMHO:

Know that if you have existing .NET Framework apps that there is not pressure to port them to .NET Core. We will be adding features to .NET Framework 4.8 to support new desktop scenarios. While we do recommend that new desktop apps should consider targeting .NET Core, the .NET Framework will keep the high compatibility bar and will provide support for your apps for a very long time to come.

I think of it this way. If you’ve got an existing app that you’re happy with, there is no reason to port this to .NET Core. Microsoft will support the .NET Framework for a very long time, given that it’s a part of Windows. But post .NET Framework 4.8, new features will usually only become available in .NET Core because Microsoft is drastically reducing the risk and thus rate of change for .NET Framework. So if you’re building a new app or you’re actively evolving an existing app you should really start looking at .NET Core. Porting to .NET Core certainly isn’t free, but it offers many benefits, such as better performance, XCOPY deployment for the framework itself, and a feature set that is growing fast, thanks to open source. Choose the strategy that makes sense for your project and/or business.

I don't want to hear any of this "this is dead, only use that" nonsense. We just open sourced WinForms and have already taken Pull Requests. WinForms has been updated for 4k+ displays! WPF is open source, y'all! Think about the .NET Standard and how you can run standard libraries on .NET Framework, .NET Core, and Mono - or any ".NET" that's out there. Mono is enabling running .NET Standard libraries via WebAssembly. To be clear - your browser is now .NET Standard capable! There are open source projects like https://platform.uno/ and Avalonia and Ooui taking .NET in new and interesting places. Blazor makes Web UIs in .NET with (preview/experimental) client support with Web Assembly and server support included in .NET 3.0 with Razor Components. Only good things are coming, my friends!

.NET ALL THE THINGS

.NET Core runs on Raspberry Pi and ARM processors! .NET Core supports serial ports, IoT devices, and there's even a System.Device.GPIO (General Purpose I/O) package! Go explore https://github.com/dotnet/iot to really get your head around how much cool stuff is happening in the .NET space.

I want to encourage you to go check out Matt Warren's extremely well-researched post "Open Source .NET - 4 years later" to get a real visceral sense of how far we've come as a community. You'll be amazed!

Now, go play!

Download .NET Core 3 Preview 1 on Windows, Mac and Linux. You can see details of the release in the .NET Core 3 Preview 1 release notes. Visual Studio 2019 will support building .NET Core 3 applications, and the VS2019 preview can be installed side by side with existing versions of VS.

Enjoy.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

On Developer Advocacy

Nov 30, 2018

Description:

Teamwork

Naming things is hard. I've talked before about the term "evangelism" and my dislike for it. Evangelism, Advocacy, Developer Relations, PR, Marketing, and on and on. More and more I'm just trying to educate and maybe entertain a little. So I like Edutainment, myself, hat tip to KRS-One.

I'm getting on a plane tomorrow to go to the Microsoft Azure + AI Conference @DevIntersection and the Free Microsoft Connect 2018 Event (you can watch online all day!) and as I was packing I was struck with a few thoughts I wanted to share here.

What a privilege it is to speak about products that so many people have worked on and (hopefully) so many people will enjoy. Especially ones as large as Azure or Visual Studio - thousands of people work so hard! Engineers, Program Managers, Testers, Community Members...people from all over working on each release so a select few of us get on stage to share it with you! And who am I to have this privilege?

Don't think for a second that when you're giving a technical talk that it's about you. You're sitting on a stack of software you had a small part in writing and standing on the shoulders of giants of generations of engineers and creators who came before you. When I do talks where I'm representing a huge group I reflect on this with gratitude.

If you work on any of the products I'm showing, know this; I may be one of the talking heads or a visible grand marshal but we work for you and we never forget it. My job at events like this is to make the product - your work - shine. I take that job very seriously, and if it looks like it's effortless, that's because of the massive amount of work we put into the presentation. Hours of practice, story arcs, literally blocking movement as if it were a play or stage show, camera work, and transitions. Deeply understanding what we're presenting and why it's awesome and why you're proud of it.

I'm writing this note for all the other advocates and visible community members.

What a joy and privilege it is to stand up and represent our co-workers and fellow engineers and to tell the stories of the things they build!

Let that privilege both put motivation in you and propel you forward to present their work - your teams' work.

I appreciate you all, both inside and out, and I will do my best to represent your team and the larger community to the best of my ability.

Sponsor: Looking for a new challenge? Hired is the leading job marketplace that connects engineers to their next challenge. Let Hired connect you to your next challenge. Sign up now.


© 2018 Scott Hanselman. All rights reserved.
     

The 2018 Christmas List of Best STEM Toys for Kids

Nov 28, 2018

Description:

Hey friends! This is my FIFTH year doing a list of Great STEM Christmas Toys for Kids! Can you believe it? In case you missed them, here's the previous years' lists! Be aware I use Amazon referral links so I get a little kickback (and you support this blog!) when you use these links. I'll be using the pocket money to...wait for it...buy STEM toys for kids! So thanks in advance!

The 2017 Christmas List of Best STEM Toys for kids
The 2016 Christmas List of Best STEM Toys for your little nerds and nerdettes
The 2015 Christmas List of Best STEM Toys
2014: Getting Started with Robots for kids and children in STEM this holiday season

OK, let's do it!

littleBits

I've always liked littleBits, but when they first came out I thought they were expensive and didn't include enough stuff. Fast forward and littleBits have dropped in price and built a whole ecosystem of littleBits that work together. This year the most fun is the littleBits Marvel Avengers Inventor Kit. At the time of this writing, this kit is 33% off at Amazon. You can build your own Iron Man (or Ironheart!) gauntlet and load it up with littleBits that can do whatever you'd like. One particularly cool thing included is an LED Matrix that you can address directly by writing code with the iOS or Android app.

littleBits Marvel Avengers Inventor Kit

Kano - Computer Kit and Wand

Both my kids love the Kano Computer Kit, now updated for 2018. It's a complete Raspberry Pi 3 kit that includes the keyboard, mouse, case, LED lights, and everything you'd need to build a Pi. This year they've branched out to the Kano Harry Potter Coding Kit that you can use to build a wand and learn to code. The "wand" is a custom PCB with codeable LEDs, buttons, and batteries that the kids put inside a wand. The wand is Bluetooth and includes lots of tech like an accelerometer, gyroscope, magnetometer, and a vibrating rumble pack. All of this tech is controllable with laptops or smart devices and codeable with JavaScript.

Harry Potter Kano Coding Kit and Wand

UbTech JIMU Robot - Unicornbot Kit

UbTech has a whole series of Technics-style Robot kits. There's the usual tanks and cars, but there's also some more creative and "out there" ones like this 400-piece Unicorn Robot. It includes color sensors, servo motors, a DC motor, and a light-up horn. It's also codeable/controllable via an iOS or Android app. Very cool!

I'd really like their Lynx Alexa controllable walking robot but it's way out of my price range. Still fun to check out though!

Unicornbot

Erector by Meccano Kits

We've found these Erector by Meccano Kits to be inexpensive and well-built. The 25-in-1 kit is great and includes a container and over 600 pieces. I like these metal kits because they feel like the ones I had in my childhood. Kids learn how to use motors and pulleys and explore functional motion.

Erector Set

Osmo Genius Kit for iPad

The Osmo Genius is quite clever and based on one deceptively simple idea - what if the iPad camera faced downward and could see the table in front of the child? It comes with a base and a reflector that directs the front-facing camera downwards. Then the educational games are written to see what's happening on the table and provide near-instant feedback. You can start with the base kit and later optionally add kits and games.

Osmo Genius Kit for iPad

Elenco 130-in-1 Electronic Playground and Learning Center

I like classic toys, and while toys with bluetooth and fancy features are cool, I want to balance it out with the classics that let you explore the physical world. These tend to be more affordable as well.

I really like this classic electronic trainer with 130 experiments like an AM broadcast station, Electronic Organ, LED strobe light, Timer, Logic Circuits and much, much more. The 50-in-One version is just $16! Frankly all the Elenco products are fantastic.

image

Piper Computer Kit (2018 Edition)

I had this on the list last year but my kids still love it. We have the 2016 kit and it's been updated for 2018.

The Piper is a little spendy at first glance, but it's EXTREMELY complete and very thoughtfully created. Sure, you can just get a Raspberry Pi and hack on it - but the Piper is not just a Pi. It's a complete kit where your little one builds their own wooden "laptop" box (more of a luggable), and then starting with just a single button, builds up the computer. The Minecraft content isn't just vanilla Microsoft. It's custom episodic content! Custom voice overs, episodes, and challenges.

What's genius about Piper, though, is how the software world interacts with the hardware. For example, at one point you're looking for treasure on a Minecraft beach. The Piper suggests you need a treasure detector, so you learn about wiring and LEDs and wire up a treasure detector LED while it's running. Then you run your Minecraft person around while the LED blinks faster to detect treasure. It's absolute genius. Definitely a favorite in our house for the 8-12 year old set.

Piper Raspberry Pi Kit

I hope you have a great holiday season!

FYI: These Amazon links are referral links. When you use them I get a tiny percentage. It adds up to taco money for me and the kids! I appreciate you - and you appreciate me - when you use these links to buy stuff.

Sponsor: Let top companies apply to you. Create a free profile on Hired and unlock the ability to let companies apply to you, not the other way around. Create a free profile.


© 2018 Scott Hanselman. All rights reserved.
     

Upgrading the DakBoard Family Calendar with Raspberry Pi Zero W and Read Only filesystem

Nov 23, 2018

Description:

Raspberry Pi Zeros are SMALL

Earlier this week I built a Family Calendar using a used flat screen monitor and a Raspberry Pi 3 I had lying around and documented it in my post How to build a Wall Mounted Family Calendar and Dashboard with a Raspberry Pi and cheap monitor.

Eric Brown added two great comments (the comments on my blog are always better than the content!). He said:

You can save power & money by using an Pi Zero W instead. This is likely overkill, but I took the time to get the Pi Zero to mount the SD card read-only and do all the writes to a RAM disk.

Eric said "RPis are surprisingly sensitive to power glitches, and will often corrupt the SD card" and that "after mounting the SD read-only, my DakBoard has been running stably for months; before doing that, it corrupted the SD card within 6 weeks."

While I haven't had any issues with my Raspberry Pis, this seemed like a fun "version 2" of the calendar to make with the kids. Worst case scenario? Now I have LCD family calendars!

You'll recall I commented about how important the Spouse Acceptance Factor is whenever introducing new technology into the house.

It has to just work. If my Spouse doesn't like the idea or finds it's not reliable, the SAF (Spouse Acceptance Factor) will be low and they'll want to get rid of it. All it takes is one "why isn't this working" and I'm dead in the water.

I checked Amazon and found a number of Raspberry Pi Zero W (W is for Wireless, important!) Kits for around US$20. You can see in the picture above how SMALL a Raspberry Pi Zero W is (with LEGO Miss Marvel for scale).

Get the HDMI cables as flush and sanitary as possible

If you have the cables, power supplies, and don't need the headers and extra stuff, I've seen them as low as $10. It's very important to note that a Raspberry Pi Zero W does support HDMI but it has a MINI-HDMI female connector. You'll need a mini-HDMI to HDMI adapter or a mini-HDMI to HDMI short cable.

Here's another aside. Did you know there are a LOT of different HDMI connector orientations? Sure, you could just loop a big old 6 foot HDMI cable back there, but where's the fun in that? There are micro HDMI D1, D2, and D3 connectors that describe 90 degree and 270 degree rotations of the male end. If you want to be really flush, consider a cable (for example a C2 to A2) that is usually used in drones. This would allow you to mount the Pi Zero W flush against the back of the monitor - or even better, inside the monitor or a wooden picture frame!

Dakboard

Get the Raspberry Pi Zero W on your wireless and avoid the trouble of keyboards and mice!

Pi Zero Ws are so small that they don't have a regular USB connector. There is one for power and one that is "USB OTG." If you want to connect a mouse and keyboard directly to the Zero you'll need this USB OTG Micro to Type A Cable and/or a powered USB hub.

OR!

Save money and prep your Raspberry Pi Micro SD Card with SSH turned on by default and your Wireless Network enabled by default! Then you can set it up remotely as a DakBoard/MagicMirror Family Calendar.

Download the Image for Raspbian Stretch. You'll want the desktop version (not Lite) because this IS a visual project, not a headless one! I recommend Etcher for burning images to SD Cards. It's free.
Raspberry Pi Zero W and a 1A+ micro USB power supply
Cheap micro SD Card. They should include an adapter to plug it into your main computer to prepare.
Create an empty file called "ssh" on the prepared Micro SD Card before you put the card in the Raspberry Pi.
Make a file called wpa_supplicant.conf with Linux line feeds (LF, not the default Windows CR/LF) with content like this (and your own country code):

country=us
update_config=1
ctrl_interface=/var/run/wpa_supplicant

network={
scan_ssid=1
ssid="YourNetworkSSID"
psk="NETWORKPASSWORD!"
}

This will cause the Pi to get on the network on boot up which should allow you to SSH over to it directly, thereby avoiding any trouble with keyboards and mice and the Pi Zero W.

If you DO end up wanting to connect the keyboard and mouse, you'll want a keyboard/mouse setup that is all-in-one with just one USB adapter, or you'll need a powered USB hub. This should be temporary as you get the Pi prepared.

Make the Raspberry Pi Zero W readonly - after it's been configured with DakBoard

Once I had the Pi Zero W all prepared I went around the net looking for tutorials to make it readonly. You're basically causing Linux to mount the SD Card readonly and then do all writes to a RAM Disk that will ultimately be tossed whenever you (rarely) reboot. Get it perfect before you go readonly, as it's a small hassle to switch back. Or you can pull the card out and mount it on your other computer, then return it. Still, not awesome.

Eric from the comments pointed me to a Raspberry Pi Jessie tutorial, but I tried it and it didn't work for me, likely because I'm on Raspbian Stretch, a newer version. There are a LOT of choices and ways to do this, but the best tutorial I found was on the page for Domoticz, an open source Home Automation system which looks, as an aside, awesome and something I need to check out in the future!

For now, I followed these instructions on Setting up overlayFS on Raspberry PI (the "overlay" being the file system you'll write to, but it's a fake; the writes go to one folder, and the two folders (one read-write and one read-only) are overlaid over each other). This allowed me to make a Raspberry Pi Raspbian Stretch system readonly on my Pi Zero W.
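To give a flavor of where you end up, here's a sketch of /etc/fstab entries that would produce mounts like the ones below. This is my own reconstruction, not the tutorial's exact steps, so follow the real instructions:

# root stays read-only; volatile folders live on tmpfs and get overlaid
/dev/mmcblk0p7  /         ext4     ro,noatime  0  1
ramdisk         /var_rw   tmpfs    defaults    0  0
ramdisk         /home_rw  tmpfs    defaults    0  0
overlay         /var      overlay  lowerdir=/var_org,upperdir=/var_rw/upper,workdir=/var_rw/work  0  0
overlay         /home     overlay  lowerdir=/home_org,upperdir=/home_rw/upper,workdir=/home_rw/work  0  0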

I followed the instructions exactly, only skipping the parts like "Modify domoticz service" that didn't apply. When I run "mount" I can see the main file system is read-only and the others are overlaid and read-write.

pi@dakboard2:~ $ mount
/dev/mmcblk0p7 on / type ext4 (ro,noatime,data=ordered)
snip!
ramdisk on /var_rw type tmpfs (rw,relatime)
ramdisk on /home_rw type tmpfs (rw,relatime)
overlay on /home type overlay (rw,relatime,lowerdir=/home_org,upperdir=/home_rw/upper,workdir=/home_rw/work)
overlay on /var type overlay (rw,relatime,lowerdir=/var_org,upperdir=/var_rw/upper,workdir=/var_rw/work)

So far so good! This will make a smaller and lower power Family Calendar that will hopefully be more reliable as well! Thanks Eric from the comments!

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.
© 2018 Scott Hanselman. All rights reserved.
     

How to build a Wall Mounted Family Calendar and Dashboard with a Raspberry Pi and cheap monitor

Nov 21, 2018

Description:

Glanceable Dashboard

I love dashboards. I love Raspberry Pis (tiny $35 computers the size of a set of playing cards). And I'm cheap, er, frugal. I found a 24" old LCD at Goodwill (a local thrift shop) and bought it, but it's been sitting unused in my garage.

Then I stumbled on DakBoard. The idea is simple - A wifi connected wall display for your photos, calendar, news, weather and to-do.

The implementation is simple genius. It's a browser that starts up full screen (kiosk mode) and just sits there and updates occasionally. DakBoard provides the private webpage and tools to make that happen. You can certainly build this yourself with any number of open source tools. I chose DakBoard because it was simple, beautiful, and I was able to get the whole thing done in less than an hour. I'm sure I'll spend many hours tweaking it, though. There's also the very popular MagicMirror platform, so lots of choice and power in this space!

What are some considerations?

You may want to turn it off on a schedule to save power and the screen:

cronjob - turn it off on a schedule
sensor - turn it on when something (your alarm, Nest thermostat, motion detector attached to GPIO, etc.) detects your presence

It has to act like an appliance. If you are messing with it to keep it alive, it's not an appliance, it's another computer to manage.

It has to just work. If my Spouse doesn't like the idea or finds it's not reliable, the SAF (Spouse Acceptance Factor) will be low and they'll want to get rid of it. All it takes is one "why isn't this working" and I'm dead in the water.

Finally - what do you want to show?

Someone asked me - "What would I want to put on my dashboard other than a calendar? I don't see why this is useful."

What would you put on a Glanceable Display?

Family Calendar(s), movie times, temperature, news, my blood sugar, disk free on my NAS, TV schedule, family photos, commute traffic, album releases, homework due soon, family events, trips, flight status, music playing now, literally anything you want as a glance-able display. 

Glanceable Dashboard

Philosophy

You'll want to ask yourself, is this just an iPad on the wall? I'd propose not. In fact, I'd say this is a Wall Mounted Glanceable Display - a personal dashboard - not an interactive thing. I want the family and kids to just stop by, note important information and move on.

It's also worth pointing out that a horizontal monitor on the wall looks like, well, a monitor on the wall. But somehow when it's Portrait it's dramatic. It's not something we are (yet) used to seeing. I may try this out in a few ways, or even make a few of these displays!

How to Build a Raspberry Pi-based Family Calendar

It's pretty easy! I used the DakBoard Blog but I had most of the stuff already.

Get a $35 Raspberry Pi 3. The 3 is fast and includes Wifi so you don't need an extra adapter. I like a 2.5A power supply but some folks say you can run the Raspberry Pi off the monitor's USB power - IF that power can put out at least 1A. 500mA will likely cause instability. It depends on if you want to try to get the whole thing down to one power cable.
Cheap SD Card - 8 gigs is fine, but get whatever works for you. This doesn't need to be awesome.
A 1 foot HDMI cable. You're gonna mount the Raspberry Pi to the back of the monitor and hide it, so you want the cable to be as small as possible. You might need a 90 degree or 270 degree adapter to keep HDMI cables from sticking out, or a short cable with this built in.
And finally - a 24"-ish (smaller is fine) LCD (IPS is nice) monitor with smallish bezels and HDMI inputs that go out to the side (NOT directly out the back) as you want this flush on the wall. Think about how you'll mount it. You can take the back off the monitor and use hanging wire OR use a flush VESA mount.

Install Raspbian on the Raspberry Pi. I use Noobs to bootstrap my install as it's super fast and easy. Go through the standard setup. Make sure you've set up:

Wifi login
Timezone
Boot to Desktop automatically
Install chromium via "sudo apt-get install -y rpi-chromium-mods"

Then you make sure that Chromium starts up full screen, the mouse is hidden, and we're looking at the dashboard! It's super important you don't have to touch it. It's an appliance, right?

sudo nano ~/.config/lxsession/LXDE-pi/autostart

@xset s off
@xset -dpms
@xset s noblank
@chromium-browser --noerrdialogs --incognito --kiosk http://dakboard.com/app/?p=YOUR_PRIVATE_URL

Then you can set up a cronjob if you want to turn the Pi's screen on and off on a schedule. Using rpi-hdmi.sh you can make a crontab -e that looks like this:

# Turn HDMI Off (22:00/10:00pm)
0 22 * * * /home/pi/rpi-hdmi.sh off

# Turn HDMI On (7:00/7:00am)
0 7 * * * /home/pi/rpi-hdmi.sh on

My family uses Google Calendar (GSuite) to manage hanselman.com, but I use Outlook at work. I also have a lot of business/work crap in my calendar that the family doesn't need to see. So I have two problems here, filtering, and appointment movement between Work and Home.

My wife and kids use Google Calendar and it's their authoritative source. My work calendar is MY authoritative source, so I want to sync Outlook->Google but ONLY including Personal/Podcasts/Travel categories. I categorize in Outlook at work, and then those appointments that are appropriate for the family calendar get moved over. Then the Family Calendar dashboard includes color coordinated items for Mom, Dad, Kid1, Kid2. The kids include homework that's due as appointments.

I use the Outlook Google Calendar Sync open source project to do this calendar movement for me. It does require Outlook and is a client solution so if you have a better idea let me know.

GOTCHA: I have been using Google Calendar for YEARS. I have also been using sync tools like this for years. As such, I was noticing that sometimes DakBoard would timeout asking for my Google Calendar's ICS file. It would take minutes. So I requested it myself and it was 26 megs. It's clear that Google calendar doesn't care deeply about iCal and that's disappointing. This could easily be solved if they'd support some kind of OData like URL-based query for fromdate=, todate=. In this case, the DakBoard was getting 26 megs over and over to just show a few weeks of appointments. I literally had appointments from 2005 in the calendar. I decided that since I'd declared Outlook my authoritative source for my calendar that I'd take an archive (one time snapshot) of my iCal and then delete all my calendar items from Google Calendar and re-sync, one way, from the authoritative source, going back 1 year. I'm likely a rare case but it's worth noting in case you bump into this.

All in all, this can easily be done in a short few hours if you have a Pi and a monitor. The time will be spent making it "sanitary." Making the cables perfect, hanging it on the wall, hiding the cables, then tweaking the screen to be perfect.

Editing screens on DakBoard

DakBoard has a free option that works great, or a Premium subscription that gives you even more control. Again, it depends on your web/art ability, and your patience. This is a fun new world that I'm excited to get involved with and my family is already stoked about this new display as we enter the holiday season.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Compiling C# to WASM with Mono and Blazor then Debugging .NET Source with Remote Debugging in Chrome DevTools

Nov 16, 2018

Description:

Blazor quietly marches on. In case you haven't heard (I've blogged about Blazor before), it's based on a deceptively simple idea - what if we could run .NET Standard code in the browser? No, not Silverlight; Blazor requires no plugins and doesn't introduce new UI concepts. What if we took the AOT (Ahead of Time) compilation work pioneered by Mono and Xamarin that can compile C# to Web Assembly (WASM) and added a nice UI that embraced HTML and the DOM?

Sound bonkers to you? Are you a hater? Think this solution is dumb or not for you? To the left.

For those of you who want to be wacky and amazing, consider that you can do this at the command line:

$ cat hello.cs
class Hello {
static int Main(string[] args) {
System.Console.WriteLine("hello world!");
return 0;
}
}
$ mcs -nostdlib -noconfig -r:../../dist/lib/mscorlib.dll hello.cs -out:hello.exe
$ mono-wasm -i hello.exe -o output
$ ls output
hello.exe index.html index.js index.wasm mscorlib.dll

Then you could do this in the browser...look closely on the right side there.

You can see the Mono runtime compiled to WASM coming down. Note that Blazor IS NOT compiling your app into WASM. It's sending Mono (compiled as WASM) down to the client, then sending your .NET Standard application DLLs unchanged down to run within the context of a client-side runtime. All using Open Web tools. All Open Source.

Blazor uses Mono to run .NET in the browser

So Blazor allows you to make SPAs (Single Page Apps) much like the Angular/Vue/React etc. apps out there today, except you're only writing C# and Razor (HTML).

Consider this basic example.

@page "/counter"

<h1>Counter</h1>
<p>Current count: @currentCount</p>
<button class="btn btn-primary" onclick="@IncrementCount">Click me</button>

@functions {
int currentCount = 0;
void IncrementCount() {
currentCount++;
}
}

You hit the button, it calls some C# that increments a variable. That variable is referenced higher up and automatically updated. This is a trivial example. Check out the source for FlightFinder for a real Blazor application.

This is stupid, Scott. How do I debug this mess? I see you're using Chrome but seriously, you're compiling C# and running in the browser with Web Assembly (how prescient) but it's an undebuggable black box of a mess, right?

I say nay nay!

C:\Users\scott\Desktop\sweetsassymollassy> $Env:ASPNETCORE_ENVIRONMENT = "Development"
C:\Users\scott\Desktop\sweetsassymollassy> dotnet run --configuration Debug
Hosting environment: Development
Content root path: C:\Users\scott\Desktop\sweetsassymollassy
Now listening on: http://localhost:5000
Now listening on: https://localhost:5001
Application started. Press Ctrl+C to shut down.

Then Win+R and run this command (after shutting down all the Chrome instances)

%programfiles(x86)%\Google\Chrome\Application\chrome.exe --remote-debugging-port=9222 http://localhost:5000

Now with your Blazor app running, hit Shift+ALT+D (or Shift+SILLYMACKEY+D) and behold.

Feel free to click and zoom in. We're at a breakpoint in some C# within a Razor page...in Chrome DevTools.

HOLY CRAP IT IS DEBUGGING C# IN CHROME

What? How?

Blazor provides a debugging proxy that implements the Chrome DevTools Protocol and augments the protocol with .NET-specific information. When the debugging keyboard shortcut is pressed, Blazor points the Chrome DevTools at the proxy. The proxy connects to the browser window you're seeking to debug (hence the need to enable remote debugging).

It's just getting started. It's limited, but it's awesome. Amazing work being done by lots of teams all coming together into a lovely new choice for the open source web.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Web Development and Advanced Techniques with Linux on Windows (WSL)

Nov 14, 2018

Description:

I've posted several times on the Windows Subsystem for Linux that allows you to run Linux on Windows 10 without a VM. Check out my YouTube on Editing code and files on Windows Subsystem for Linux on Windows 10. There's just one rule. You can mess with Windows files from Linux but you can't mess with Linux files from Windows. Otherwise, go crazy and enjoy. Here's some of my previous posts you should check out:

The year of Linux on the (Windows) Desktop - WSL Tips and Tricks
Setting up a Shiny Development Environment within Linux on Windows 10
Installing Fish Shell on Ubuntu on Windows 10
Writing and debugging Linux C++ applications from Visual Studio using the "Windows Subsystem for Linux"
Ubuntu now in the Windows Store: Updates to Linux on Windows 10 and Important Tips

WSL is pretty fantastic. Although its disk access is slower than native Linux, I find myself using it every day. If you want to set up Linux on your Windows 10 machine, just turn it on, then head over to the Windows Store and search for "Linux."

You can turn on Linux on Windows 10 by typing "Windows Features" and checking "Windows Subsystem for Linux." Then get a Linux from the Windows Store.

If you prefer to use PowerShell and do it in one line, just do this from an Admin PowerShell prompt:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

Then go get any one (or more!) of these from the Store:

Ubuntu
OpenSUSE
SLES
Kali Linux
Debian GNU/Linux

When you're in a Windows shell like PowerShell or CMD you might want to run Linux and/or jump comfortably between shells. You can do that in a few ways. The best and recommended way is running "wsl.exe" as that will start up your default distro. You can also just type the name of the distro. So I can type "ubuntu" and get in there directly.

You can type "bash" but that's not recommended if you've changed shells. If you've set up zsh or fish and type bash, it's gonna still try to run bash.

Here I've typed wslconfig and you can see I've got both Ubuntu and Debian installed, with Ubuntu as the default when I type "wsl."

C:\Users\scott>wslconfig /list
Windows Subsystem for Linux Distributions:
Ubuntu-18.04 (Default)
Debian

Now that I know how to run wsl from anywhere, I can even pipe stuff in and out of Linux from outside. For example, here I am in cmd.exe but I'm calling commands in Linux, that come out, then back in, etc. You can mix and match however you'd like!

C:\dev>type hello.sh
echo Hello
C:\dev>wsl cat /mnt/c/dev/hello.sh | wsl fromdos | wsl /bin/sh
Hello

This means even when I'm in CMD or PowerShell I can use Linux commands that are convenient or familiar to me. For example, here I'm piping a Windows Update log file into the Linux sha1sum command. Note the use of - to accept standard input - even though that input is from Windows!

C:\Users\scott\Desktop>type WindowsUpdate.log | wsl sha1sum -
3b48adce8f6c9cb816e8845d824dacc0440ca1b8 -

Sweet. There are a number of nice advanced techniques if you want to make your WSL installations smarter AND automatically configured. You can make a file at /etc/wsl.conf to affect your DNS, file metadata, and drive mounting.
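Here's a sketch of an /etc/wsl.conf showing the knobs available; the specific values are examples, so adjust to taste:

# /etc/wsl.conf
[automount]
enabled = true
root = /mnt/
# metadata lets you set Linux permissions on Windows files
options = "metadata,umask=22,fmask=11"

[network]
generateHosts = true
generateResolvConf = true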

When you are in a WSL shell, your Windows drive (your main drive) is at /mnt/c. So here is my Windows desktop as viewed from WSL:

screenfetch in WSL

I do most of my dev work in /mnt/d/github, for example. That way I can use VS Code from Windows but run Node/Ruby/Go/Whatever from WSL.

I keep my files on my Windows drive, edit them in VS Code, but run things in WSL. Again, never use Windows utilities to reach into and/or edit files on the WSL/Linux subsystem. Also, always be conscious of your CR/LF situation, and be real conscious if you're going to run git in both Windows and WSL.

Here's VS Code at the top, WSL/Ubuntu running Node at the bottom, and the local node app running in Edge on Windows on the lower right. We are sharing file systems and network port space:

Cross platform Web Dev

You can even share environment variables between WSL and Windows with a special environment variable called WSLENV. This is pretty advanced but super powerful. Read this carefully. You make an environment variable that is a list of names of other variables that you want translated between environments.

That means you can do something like this. I'm in WSL and I have an environment variable that points to a location on the filesystem. I need it to be correct in both worlds.

scott@IRONHEART:/mnt/d$ export MYLINUXPATH=/mnt/d/github/expresstest
scott@IRONHEART:/mnt/d$ export WSLENV=MYLINUXPATH/p
scott@IRONHEART:/mnt/d$ cmd.exe
D:\>echo %MYLINUXPATH%
D:\github\expresstest

Read that carefully. It's awesome and it's very configurable. The /p suffix tells WSL to translate the variable's value as a path between the two worlds; there are also /l for delimited lists of paths, and /u and /w to include a variable only when going from Windows to WSL or from WSL to Windows, respectively.

There's lots of users of WSL and many have assembled great lists of resources like Awesome-WSL by Hayden.

It's also worth pointing out that WSL is now just one console you can choose from. There's PowerShell, CMD.exe, and a half dozen Linuxes. You can even make your own custom Linux Distro for your company if you like. And there's a whole world of 3rd party Consoles that sit on top of/replace conhost.exe so you can have consoles with tabs, cool fonts, ones based on web tech, whatever! You can even choose WSL/bash as your default shell in Visual Studio Code if you'd like with Ctrl+~.

Hope this gets you started with Linux on Windows. What did I miss? Sound off in the comments.

Sponsor: Preview the latest JetBrains Rider with its Assembly Explorer, Git Submodules, SQL language injections, integrated performance profiler and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Terminus and FluentTerminal are the start of a world of 3rd party OSS console replacements for Windows

Nov 9, 2018

Description:

Folks have been trying to fix (er, supercharge) the console/command line on Windows since Day One. There's a ton of open source projects over the years that try to take over or improve on "conhost.exe" (the thing that handles consoles like Bash/PowerShell/cmd on Windows). Most of these 3rd party consoles have weird or subtle issues. For example, I like Hyper as a terminal but it doesn't support Ctrl-C at the command line. I use that hotkey often enough that this small bug means I just won't use that console at all.

Per the CommandLine blog:

One of those weaknesses is that Windows tries to be "helpful" but gets in the way of alternative and 3rd party Console developers, service developers, etc. When building a Console or service, developers need to be able to access/supply the communication pipes through which their Terminal/service communicates with command-line applications. In the *NIX world, this isn't a problem because *NIX provides a "Pseudo Terminal" (PTY) infrastructure which makes it easy to build the communication plumbing for a Console or service, but Windows does not...until now!

Looks like the Windows Console team is working on making 3rd party consoles better by creating this new PTY mechanism:

We've heard from many, many developers, who've frequently requested a PTY-like mechanism in Windows - especially those who created and/or work on ConEmu/Cmder, Console2/ConsoleZ, Hyper, VSCode, Visual Studio, WSL, Docker, and OpenSSH.

Very cool! Until it's ready I'm going to continue to try out new consoles. A lot of people will tell you to use the cmder package that includes ConEmu. There's a whole world of 3rd party consoles to explore. Even more fun are the choices of color schemes and fonts to explore.

For a while I was really excited about Hyper. Hyper is - wait for it - an electron app that uses HTML/CSS for the rendering of the console. This is a pretty heavyweight solution to the rendering that means you're looking at 200+ megs of memory for a console rather than 5 megs or so for something native. However, it is a clever way to just punt and let a browser renderer handle all the complex font management. For web-folks it's also totally extensible and skinnable.

As much as I like Hyper and its look, the inability to support hitting "Ctrl-C" at the command line is just too annoying. It appears it's a very well-understood issue that will ultimately be solved by the ConPTY work as the underlying issue is a deficiency in the node-pty library. It's also a long-running issue in the VS Code console support. You can watch the good work that's starting in this node-pty PR that will fix a lot of issues for node-based consoles.

Until this all fixes itself, I'm personally excited (and using) these two terminals for Windows that you may not have heard of.

Terminus

Terminus is open source over at https://github.com/Eugeny/terminus and works on any OS. It's immediately gorgeous, and while it's in alpha, it's very polished. Be sure to explore the settings and adjust things like Blur/Fluent, Themes, opacity, and fonts. I'm using FiraCode Retina with Ligatures for my console and it's lovely. You'll have to turn ligature support on explicitly under Settings | Appearance.

Terminus is a lovely console replacement

Terminus also has some nice plugins. I've added Altair, Clickable-Links, and Shell-Selector to my loadout. The shell selector makes it easy on Windows 10 to have PowerShell, Cmd, and Ubuntu/Bash open all at the same time in multiple tabs.

I did do a little editing of the default config file to set up Ctrl-T for new tab and Ctrl-W for close-tab for my personal taste.
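For reference, here's roughly what that tweak looks like in Terminus's YAML config file. I'm going from memory on the exact key names, so treat this as a sketch and check your own config:

# Terminus config sketch - hotkey names are assumptions
hotkeys:
  new-tab:
    - 'Ctrl-T'
  close-tab:
    - 'Ctrl-W'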

FluentTerminal

FluentTerminal is a Terminal Emulator based on UWP. Its memory usage on my machine is about 1/3 of Terminus and under 100 megs. As a Windows 10 UWP app it looks and feels very native. It supports ALT-ENTER Fullscreen, and tabs for as many consoles as you'd like. You can right-click and color specific tabs which was a nice surprise and turned out to be useful for on-the-fly categorization.

image

FluentTerminal has a nice themes setup and includes a half-dozen to start, plus supports imports.

It's not yet in the Windows Store (perhaps because it's in active development) but you can easily download a release and install it with a PowerShell install.ps1 script.

I have found the default Keybindings very intuitive with the usual Ctrl-T and Ctrl-W tab managers already set up, as well as Shift-Ctrl-T for opening a new tab for a specific shell profile (cmd, powershell, wsl, etc).

Both of these are great new entries in the 3rd party terminal space and I'd encourage you to try them both out and perhaps get involved on their respective GitHubs! It's a great time to be doing console work on Windows 10!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Updating my ASP.NET Website from .NET 2.2 Core Preview 2 to .NET 2.2 Core Preview 3

Nov 7, 2018

Description:

I've recently returned from a month in South Africa and I was looking to unwind while the jetlagged kids sleep. I noticed that .NET Core 2.2 Preview 3 came out while I wasn't paying attention. My podcast site runs on .NET Core 2.2 Preview 2 so I thought it'd be interesting to update the site. That means I'd need to install the new SDK, update the project references, ensure it builds in Azure DevOps's CI/CD Pipeline, AND deploys and runs in Azure.

Let's see how it goes. I'm a little out of it but I'm writing this blog post AS I DO THE WORK so you'll see my train of thought with no editing.

Ok, what version of .NET Core does this machine have?

C:\Users\scott> dotnet --version
2.2.100-preview2-009404
C:\Users\scott> dotnet tool update --global dotnet-outdated
Tool 'dotnet-outdated' was successfully updated from version '2.0.0' to version '2.1.0'.

Looks like I'm on Preview 2 as I guessed. I'll take a moment and upgrade one Global Tool I love - dotnet-outdated - in case it's been updated since I've been out. Looks like it has a minor update. Dotnet Outdated is a great utility for checking references and you should absolutely be using it or another tool like NuKeeper or Dependabot.

I'll head over to https://www.microsoft.com/net/download/dotnet-core/2.2 and get .NET Core 2.2 Preview 3. I'm building on Windows but I may want to update my Linux (WSL) install and Docker images later.

All right, installed. Check it with dotnet --version to confirm it's correct:

C:\Users\scott> dotnet --version
2.2.100-preview3-009430

Let's try to build my podcast website. Note that it consists of two projects, the main website on ASP.NET Core, and Unit Tests with XUnit and Selenium.

D:\github\hanselminutes-core [main ≡]> dotnet build
Microsoft (R) Build Engine version 15.9.8-preview+g0a5001fc4d for .NET Core
Copyright (C) Microsoft Corporation. All rights reserved.

Restoring packages for D:\github\hanselminutes-core\hanselminutes.core.tests\hanselminutes.core.tests.csproj...
Restore completed in 80.05 ms for D:\github\hanselminutes-core\hanselminutes.core.tests\hanselminutes.core.tests.csproj.
Restore completed in 25.4 ms for D:\github\hanselminutes-core\hanselminutes.core\hanselminutes-core.csproj.
D:\github\hanselminutes-core\hanselminutes.core.tests\hanselminutes.core.tests.csproj : error NU1605: Detected package downgrade: Microsoft.AspNetCore.App from 2.2.0-preview3-35497 to 2.2.0-preview2-35157. Reference the package directly from the project to select a different version. [D:\github\hanselminutes-core\hanselminutes-core.sln]

The dotnet build fails, which makes sense, because it's saying hey, you're asking for 2.2 Preview 2 but I've got Preview 3 all ready for you!

Detected package downgrade: Microsoft.AspNetCore.App from 2.2.0-preview3-35497 to 2.2.0-preview2-35157

Let's see what "dotnet outdated" says about this!

dotnet outdated says there's a few packages I need to update

Cool! I love these dependency tools and the community around them. You can see that it's noticed the Preview 2 -> Preview 3 opportunity, as well as a few other smaller minor or patch version bumps.

I can run dotnet outdated -u to automatically update the references, but I'll want to treat the "reference" of "Microsoft.AspNetCore.App" a little differently and use implicit versioning. You don't want to include a specific version - as I did - for this package.

Per the docs for .NET Core 2.1 and up:

Remove the "Version" attribute on the package reference to Microsoft.AspNetCore.App. Projects which use <Project Sdk="Microsoft.NET.Sdk.Web"> do not need to set the version. The version will be implied by the target framework and selected to best match the way ASP.NET Core 2.1 works. (See below for more information.)

Doing this also fixes the build because it picks up the latest 2.2 SDK automatically! Now I'll run my Unit Tests (with code coverage) and see how it works. Cool, all tests pass (including Selenium).

88% Code Coverage

It builds locally, will it build in Azure DevOps when I check it in to GitHub?

Azure DevOps

I added a .NET Core SDK installer step when I set up my Azure DevOps Pipeline. This is where I'm explicitly installing a Preview version of the .NET Core SDK.

While I'm in here I noticed the Azure DevOps pipeline was using NuGet 4.4.1. I ran "nuget update -self" on my local machine and got 4.7.1, so I updated that version as well to make the CI/CD pipeline reflect my own machine.

Now I'll git add, git commit (using verified/signed GitHub commits with my PGP Key and Yubikey):

D:\github\hanselminutes-core [main ≡ +0 ~2 -0 !]> git add .
D:\github\hanselminutes-core [main ≡ +0 ~2 -0 ~]> git commit -m "bump to 2.2 Preview 3"
[main 7a84bc7] bump to 2.2 Preview 3
2 files changed, 16 insertions(+), 13 deletions(-)

Add in a Git Push...and I can see the build start in Azure DevOps:

CI/CD pipeline build starting

Cool. While that's building, I'll make sure my existing Azure App Service (website) installation is ready to receive the deployment (assuming the build succeeds). Since I'm using an ASP.NET Core Preview build I'll want to make sure I have the Preview Site Extension installed, per the docs.

If I visit the Site Extensions menu item in the Azure Portal I can see I've got .NET Core 2.2 Preview 2, but there's an update available, as expected.

Update Available

I'll click this extension and then click Update. This extension's job is to make sure the App Service gets Preview versions of the .NET Core SDK. Only released (GA - general availability) SDKs are installed by default.

OK, .NET Core 2.2 is all updated in Azure, so I'll confirm that it's deployed as well in Azure DevOps. Yes, I'm deploying into Production without a net. Seriously, though, if there is an issue I'll just roll back. If I were deeply serious about avoiding downtime I'd be doing all this in Staging.

image

Successful local test, successful CI/CD build and test, successful deployment, and the site is back up now running on ASP.NET Core 2.2 Preview 3. It took about 45 min to do the work while simultaneously taking these screenshots and writing this blog post during the slow parts.

Good night everyone!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

.NET Core and .NET Standard for IoT - The potential of the Meadow Kickstarter

Nov 1, 2018

Description:

I saw this Kickstarter today - Meadow: Full-stack .NET Standard IoT platform. It says that "It combines the best of all worlds; it has the power of RaspberryPi, the computing factor of an Arduino, and the manageability of a mobile app. And best part? It runs full .NET Standard on real IoT hardware."

NOTE: I don't have any relationship with the company/people behind this Kickstarter, but it seems interesting so I'm sharing it with you. As with all Kickstarters, it's not a pre-order, it's an investment that may not pan out, so always be prepared to lose your investment. I lost mine with the .NET "Agent" SmartWatch even though all signs pointed to success.

Meadow IoT Kickstarter

I've done IoT work on Raspberry Pis which is much easier lately with the emerging community support for ARM32, Raspberry Pis, and cool stuff happening on Windows 10 IoT. I've written on how easy it is to get running on Raspberry Pi. I was even able to get my own podcast website running on Raspberry Pi and in Docker.

This Meadow Kickstarter says it's running on the Mono Runtime and will support the .NET Standard 2.0 API. That means that you likely already know how to program to it. Most libraries on NuGet are .NET Standard compliant so a ton of open source software should "Just Work" on any solution that supports .NET Standard.

One thing that seems interesting about Meadow is this sentence: "The power of Raspberry Pi in the computing factor of an Arduino, and the manageability of a mobile app."

Raspberry Pis are full-on computers. Arduinos are small little (mostly) single-tasked devices. Microcomputer vs Microcontroller. It's overkill to have Ubuntu on a computer just to turn on a device. You usually want IoT devices to have as small a surface area as possible.

Meadow says "Meadow has been designed to run on a variety of microcontrollers, and our first board is based on STMicroelectronics' flagship STM32F7 MCU. The Meadow F7 Micro board is an embeddable module that's based on Adafruit Feather form factor." Remember, we are talking megs not gigs here. "We've paired the STM32F7 with 32MB of flash storage and 16MB of RAM, so you can run pretty much anything you can think of building." This is just a 216MHz ARM board.

Be sure to scroll all the way down to the bottom of the page as they outline risks as well as what's left to be done.

What do you think? While you are at it, check out (total coincidence) our sponsor this week is Intel IoT! They have some great developer kits.

Sponsor: Reduce time to market and simplify IOT development using developer kits built on Intel Atom®, Intel® Core™ and Intel® Xeon® processors and tools such as Intel® System Studio and Arduino Create*
© 2018 Scott Hanselman. All rights reserved.
     

Side by Side user scoped .NET Core installations on Linux with dotnet-install.sh

Oct 30, 2018

Description:

I can run .NET Core on Windows, Mac, or a dozen Linuxes. On my Ubuntu installation I can check what version I have installed and where it is like this:

$ dotnet --version
2.1.403
$ which dotnet
/usr/bin/dotnet

If we interrogate that dotnet file we see it's a link to elsewhere:

$ ls -alogF /usr/bin/dotnet
lrwxrwxrwx 1 22 Sep 19 03:10 /usr/bin/dotnet -> ../share/dotnet/dotnet*

If we head over there we see similar stuff as we do on Windows.

Side by side DotNet installs

Basically c:\program files\dotnet is the same as /usr/share/dotnet.

$ cd ../share/dotnet
$ ll
total 136
drwxr-xr-x 1 root root 4096 Oct 5 19:47 ./
drwxr-xr-x 1 root root 4096 Aug 1 17:44 ../
drwxr-xr-x 1 root root 4096 Feb 13 2018 additionalDeps/
-rwxr-xr-x 1 root root 105704 Sep 19 03:10 dotnet*
drwxr-xr-x 1 root root 4096 Feb 13 2018 host/
-rw-r--r-- 1 root root 1083 Sep 19 03:10 LICENSE.txt
drwxr-xr-x 1 root root 4096 Oct 5 19:48 sdk/
drwxr-xr-x 1 root root 4096 Aug 1 18:07 shared/
drwxr-xr-x 1 root root 4096 Feb 13 2018 store/
-rw-r--r-- 1 root root 27700 Sep 19 03:10 ThirdPartyNotices.txt
$ ls sdk
2.1.4  2.1.403  NuGetFallbackFolder
$ ls shared
Microsoft.AspNetCore.All  Microsoft.AspNetCore.App  Microsoft.NETCore.App
$ ls shared/Microsoft.NETCore.App/
2.0.5  2.1.5

Looking in directories works to figure out what SDKs and Runtime versions are installed, but the best way is to use the dotnet cli itself like this. 

$ dotnet --list-sdks
2.1.4 [/usr/share/dotnet/sdk]
2.1.403 [/usr/share/dotnet/sdk]
$ dotnet --list-runtimes
Microsoft.AspNetCore.All 2.1.5 [/usr/share/dotnet/shared/Microsoft.AspNetCore.All]
Microsoft.AspNetCore.App 2.1.5 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
Microsoft.NETCore.App 2.0.5 [/usr/share/dotnet/shared/Microsoft.NETCore.App]
Microsoft.NETCore.App 2.1.5 [/usr/share/dotnet/shared/Microsoft.NETCore.App]

There are great instructions on how to set up .NET Core on your Linux machines via Package Manager here.

Note that these installs of the .NET Core SDK are installed in /usr/share. I can use the dotnet-install.sh to do non-admin installs in my own user directory.

In order to gain more control and do things more manually, you can use this shell script: https://dot.net/v1/dotnet-install.sh. Its documentation is here in the docs. For Windows there is also a PowerShell version: https://dot.net/v1/dotnet-install.ps1

The main usefulness of these scripts is in automation scenarios and non-admin installations. There are two scripts: One is a PowerShell script that works on Windows. The other script is a bash script that works on Linux/macOS. Both scripts have the same behavior. The bash script also reads PowerShell switches, so you can use PowerShell switches with the script on Linux/macOS systems.

For example, I can see all the current .NET Core 2.1 versions at https://www.microsoft.com/net/download/dotnet-core/2.1 and 2.2 at https://www.microsoft.com/net/download/dotnet-core/2.2 - the URL format is regular. I can see from that page that at the time of this blog post, v2.1.5 is both Current (most recent stable) and also LTS (Long Term Support).

I'll grab the install script and chmod +x it. Running it with no options will get me the latest LTS release.

$ wget https://dot.net/v1/dotnet-install.sh
--2018-10-31 15:41:08-- https://dot.net/v1/dotnet-install.sh
Resolving dot.net (dot.net)... 104.214.64.238
Connecting to dot.net (dot.net)|104.214.64.238|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 30602 (30K) [application/x-sh]
Saving to: ‘dotnet-install.sh’

I like the "-DryRun" option because it will tell you what WILL happen without doing it.

$ ./dotnet-install.sh -DryRun
dotnet-install: Payload URL: https://dotnetcli.azureedge.net/dotnet/Sdk/2.1.403/dotnet-sdk-2.1.403-linux-x64.tar.gz
dotnet-install: Legacy payload URL: https://dotnetcli.azureedge.net/dotnet/Sdk/2.1.403/dotnet-dev-ubuntu.16.04-x64.2.1.403.tar.gz
dotnet-install: Repeatable invocation: ./dotnet-install.sh --version 2.1.403 --channel LTS --install-dir <auto>

If I use the dotnet-install script I can have multiple copies of the .NET Core SDK installed in my user folder at ~/.dotnet. It all depends on your PATH. Note what happens as I put ~/.dotnet first in my PATH and run dotnet --list-sdks. Make sure you know what your PATH is and that you're getting the .NET Core you expect for your user.

$ which dotnet
/usr/bin/dotnet
$ export PATH=/home/scott/.dotnet:$PATH
$ which dotnet
/home/scott/.dotnet/dotnet
$ dotnet --list-sdks
2.1.402 [/home/scott/.dotnet/sdk]

Now I will add a few more .NET Core SDKs side-by-side with the dotnet-install.sh script. Remember again, these aren't .NETs installed with apt-get, which would be system-level and run with sudo. These are user-profile-installed versions.

There's really no reason to do side by side at THIS level of granularity, but it makes the point.
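The installs themselves look something like this - versions purely for illustration, and --version and --channel are the same switches you saw in the -DryRun output above (the script defaults to ~/.dotnet for user installs):

$ ./dotnet-install.sh --channel LTS
$ ./dotnet-install.sh --version 2.1.400
$ ./dotnet-install.sh --version 2.1.401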

$ dotnet --list-sdks
2.1.302 [/home/scott/.dotnet/sdk]
2.1.400 [/home/scott/.dotnet/sdk]
2.1.401 [/home/scott/.dotnet/sdk]
2.1.402 [/home/scott/.dotnet/sdk]
2.1.403 [/home/scott/.dotnet/sdk]

When you're doing your development, you can use "dotnet new globaljson" and have each path/project request a specific SDK version.

$ dotnet new globaljson
The template "global.json file" was created successfully.
$ cat global.json
{
  "sdk": {
    "version": "2.1.403"
  }
}

Hope this helps!

Sponsor: Reduce time to market and simplify IOT development using developer kits built on Intel Atom®, Intel® Core™ and Intel® Xeon® processors and tools such as Intel® System Studio and Arduino Create*


© 2018 Scott Hanselman. All rights reserved.
     

ASP.NET Core 2.2 Parameter Transformers for clean URL generation and slugs in Razor Pages or MVC

Oct 25, 2018

Description:

I noticed that last week .NET Core 2.2 Preview 3 was released:

.NET Core 2.2 Preview 3
ASP.NET Core 2.2 Preview 3
Entity Framework 2.2 Preview 3

You can download and get started with .NET Core 2.2 on Windows, macOS, and Linux:

.NET Core 2.2 Preview 3 SDK (includes the runtime)
.NET Core 2.2 Preview 3 Runtime

Docker images are available at microsoft/dotnet for .NET Core and ASP.NET Core.

.NET Core 2.2 Preview 3 can be used with Visual Studio 15.9 Preview 3 (or later), Visual Studio for Mac and Visual Studio Code.

The feature I am most stoked about in ASP.NET Core 2.2 is a subtle one, but I remember implementing it manually many times over the last 10 years. I'm happy to see it nicely integrated into ASP.NET Core's MVC and Razor Pages patterns.

ASP.NET Core 2.2 introduces the concept of Parameter Transformers to routing. Remember there isn't a direct relationship between what's in the URL/address bar and what's on disk. The routing subsystem handles URLs coming in from the client and routes them to Controllers, but it also generates URLs (strings) when given a Controller and Action.

So if I'm using Razor Pages and I have a file Pages/FancyPants.cshtml I can get to it by default at /FancyPants. I can also "ask" for the URL when I'm creating anchors/links in my Razor Page:

<a class="nav-link text-dark" asp-area="" asp-page="/fancypants">Fancy Pants</a>

Here I'm asking for the page. That asp-page attribute points to a logical page, not a physical file.

 

We can make an IOutboundParameterTransformer that changes URLs to (for example) a WordPress-style slug in the two-words format.

public class SlugifyParameterTransformer : IOutboundParameterTransformer
{
    public string TransformOutbound(object value)
    {
        if (value == null) { return null; }

        // Slugify value
        return Regex.Replace(value.ToString(), "([a-z])([A-Z])", "$1-$2").ToLower();
    }
}

Then you let the ASP.NET Pipeline know about this transformer, either in Razor Pages...

services.AddMvc()
    .SetCompatibilityVersion(CompatibilityVersion.Version_2_2)
    .AddRazorPagesOptions(options =>
    {
        options.Conventions.Add(
            new PageRouteTransformerConvention(
                new SlugifyParameterTransformer()));
    });

or in ASP.NET MVC:

services.AddMvc(options =>
{
    options.Conventions.Add(new RouteTokenTransformerConvention(
        new SlugifyParameterTransformer()));
});

Now when I run my application, I get my routing both coming in (from the client web browser) and going out (generated via Razor Pages). Here I'm hovering over the "Fancy Pants" link at the top of the page. Notice that it's generated /fancy-pants as the URL.

image

So that same code from above that generates anchor tags <a href= gives me the expected new style of URL, and I only need to change it in one location.

There is also a new service called LinkGenerator that's a singleton you can call outside the context of an HTTP call (without an HttpContext) in order to generate a URL string.

return _linkGenerator.GetPathByAction(
    httpContext,
    controller: "Home",
    action: "Index",
    values: new { id = 42 });

This can be useful if you are generating URLs outside of Razor or in some Middleware. There are a lot more subtle little improvements in ASP.NET Core 2.2, but this was the one that I will find the most useful in the near term.

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Dependabot for .NET Core dependency tracking in GitHub

Oct 22, 2018

Description:

Bump Microsoft.ApplicationInsights.AspNetCore from 2.5.0-beta1 to 2.5.0-beta2

I've been exploring automated dependency tracking lately. I usually use my podcast's ASP.NET Core website that I host on GitHub as a guinea pig. I tried NuKeeper and the dotnet outdated global tool - both of which are fantastic and worth exploring.

This week I'm trying Dependabot. I have no relationship with this company. Public repos and personal account repos are free, their pricing is very clear, and organization accounts start at just $15 with a free trial.

I'm really impressed with how clever Dependabot is. It's almost like a person in its behavior. Yes, I realize that's kind of the point, but it's no less surprising to see. A well-written bot is a joy to behold.

For example, here is a PR (Pull Request) where Dependabot says "Bumps Microsoft.ApplicationInsights.AspNetCore from 2.5.0-beta1 to 2.5.0-beta2."

Basic stuff, right? But that's not all.

It not only does the basics where it noticed that a version bump occurred in a NuGet package, but it also copied the release notes from that NuGet package's release on GitHub! It included links to what was fixed between versions, links to the change logs, AND a complete linked commit list. I mean, that's just lovely.

A few days later, Dependabot went and closed the PR because the dependency had updated again (I was slow), then it commented telling me this PR was superseded by another.

Superseded by #20

Dependabot, like any good bot, also includes commands you can send to it via "chats" in GitHub PR comments. You can tell it to use specific labels or control milestones. You can also control behavior in the Dependabot Dashboard and have it automerge things like minor versions, or just lock things down to security-only updates.
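For example - going from memory here, so check Dependabot's own docs for the current command list - you just leave a comment on the PR and the bot reacts:

@dependabot rebase
@dependabot merge
@dependabot ignore this minor version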

All in all, it's a very smart bot that supports basically all the languages. .NET support is in Beta, but I haven't had any issues with it. You should definitely check it out. And let me tell you, once you've got everything automated you'll wonder how you ever managed before.

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Customer Notes: Diagnosing issues under load of Web API app migrated to ASP.NET Core on Linux

Oct 16, 2018

Description:

When the engineers on the ASP.NET/.NET Core team talk to real customers about actual production problems they have, interesting stuff comes up. I've tried to capture a real customer interaction here without giving away their name or details.

The team recently had the opportunity to help a large customer of .NET investigate performance issues they’ve been having with a newly-ported ASP.NET Core 2.1 app when under load. The customer's developers are experienced with ASP.NET on Windows but in this case they needed help getting started with performance investigations with ASP.NET Core in Linux containers.

As with many performance investigations, there were a variety of issues contributing to the slowdowns, but the largest contributors were time spent garbage collecting (due to unnecessary large object allocations) and blocking calls that could be made asynchronous.

After resolving the technical and architectural issues detailed below, the customer's Web API went from only being able to handle several hundred concurrent users during load testing to easily handling 3,000, and they are now running the new ASP.NET Core version of their backend web API in production.

Problem Statement

The customer recently migrated their .NET Framework 4.x ASP.NET-based backend Web API to ASP.NET Core 2.1. The migration was broad in scope and included a variety of tech changes.

Their previous Web API (we'll call it version 1) ran as an ASP.NET application (targeting .NET Framework 4.7.1) under IIS on Windows Server and used SQL Server databases (via Entity Framework) to persist data. The new (2.0) version of the application runs as an ASP.NET Core 2.1 app in Linux Docker containers with PostgreSQL backend databases (via Entity Framework Core). They used Nginx to load balance between multiple containers on a server and HAProxy load balancers between their two main servers. The Docker containers are managed manually or via Ansible integration for CI/CD (using Bamboo).

Although the new Web API worked well functionally, load tests began failing with only a few hundred concurrent users. Based on current user load and projected growth, they wanted the web API to support at least 2,000 concurrent users. Load testing was done using Visual Studio Team Services load tests running a combination of web tests mimicking users logging in, doing the stuff of their business, activating tasks in their application, as well as pings that the Mobile App's client makes regularly to check for backend connectivity. This customer also uses New Relic for application telemetry and, until recently, New Relic agents did not work with .NET Core 2.1. Because of this, there was unfortunately no app diagnostic information to help pinpoint sources of slowdowns.

Lessons Learned

Cross-Platform Investigations

One of the most interesting takeaways for me was not the specific performance issues encountered but, instead, the challenges this customer had working in a Linux environment. The team's developers are experienced with ASP.NET on Windows and comfortable debugging in Visual Studio. Despite this, the move to Linux containers has been challenging for them.

Because the engineers were unfamiliar with Linux, they hired a consultant to help deploy their Docker containers on Linux servers. This model worked to get the site deployed and running, but became a problem when the main backend began exhibiting performance issues. The performance problems only manifested themselves under a fairly heavy load, such that they could not be reproduced on a dev machine. Up until this investigation, the developers had never debugged on Linux or inside of a Docker container except when launching in a local container from Visual Studio with F5. They had no idea how to even begin diagnosing issues that only reproduced in their staging or production environments. Similarly, their dev-ops consultant was knowledgeable about Linux infrastructure but not familiar with application debugging or profiling tools like Visual Studio.

The ASP.NET team has some documentation on using PerfCollect and PerfView to gather cross-platform diagnostics, but the customer's devs did not manage to find these docs until they were pointed out. Once an ASP.NET Core team engineer spent a morning showing them how to use PerfCollect, LLDB, and other cross-platform debugging and performance profiling tools, they were able to make some serious headway debugging on their own. We want to make sure everyone can debug .NET Core on Linux with LLDB/SOS or remotely with Visual Studio as easily as possible.

The ASP.NET Core team now believes they need more documentation on how to diagnose issues in non-Windows environments (including Docker) and the documentation that already exists needs to be more discoverable. Important topics to make discoverable include PerfCollect, PerfView, debugging on Linux using LLDB and SOS, and possibly remote debugging with Visual Studio over SSH.

Issues in Web API Code

Once we gathered diagnostics, most of the perf issues ended up being common problems in the customer's code. The largest contributor to the app's slowdown was frequent Generation 2 (Gen 2) GCs (Garbage Collections), which were happening because a commonly-used code path was downloading a lot of images (product images), converting those bytes into base64 strings, responding to the client with those strings, and then discarding the byte[] and string. The images were fairly large (>100 KB), so every time one was downloaded, a large byte[] and string had to be allocated. Because many of the images were shared between multiple clients, we solved the issue by caching the base64 strings for a short period of time (using IMemoryCache). There's a code sketch of this at the end of this section.

HttpClient Pooling with HttpClientFactory

When calling out to Web APIs there was a pattern of creating new HttpClient instances rather than using IHttpClientFactory to pool the clients. Despite implementing IDisposable, it is not a best practice to dispose HttpClient instances as soon as they're out of scope, as they will leave their socket connection in a TIME_WAIT state for some time after being disposed. Instead, HttpClient instances should be re-used.

Additional investigation showed that much of the application's time was spent querying PostgreSQL for data (as is common). There were several underlying issues here:

Database queries were being made in a blocking way instead of being asynchronous. We helped address the most common call-sites and pointed the customer at the AsyncUsageAnalyzer to identify other async cleanup that could help.

Database connection pooling was not enabled. It is enabled by default for SQL Server, but not for PostgreSQL. We re-enabled database connection pooling. It was necessary to have different pooling settings for the common database (used by all requests) and the individual shard databases which are used less frequently. While the common database needs a large pool, the shard connection pools need to be small to avoid having too many open, idle connections.

The Web API had a fairly 'chatty' interface with the database and made a lot of small queries. We re-worked this interface to make fewer calls (by querying more data at once or by caching for short periods of time).

There was also some impact from having other background worker containers on the web API's servers consuming large amounts of CPU. This led to a 'noisy neighbor' problem where the web API containers didn't have enough CPU time for their work. We showed the customer how to address this with Docker resource constraints.
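Here's a rough sketch of the caching and HttpClient fixes combined - hypothetical names, not the customer's actual code - using IMemoryCache to hold the base64 strings briefly and letting IHttpClientFactory (via a typed client) manage the HttpClient:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class ProductImageService
{
    private readonly IMemoryCache _cache;
    private readonly HttpClient _client; // pooled: registered via services.AddHttpClient<ProductImageService>()

    public ProductImageService(IMemoryCache cache, HttpClient client)
    {
        _cache = cache;
        _client = client;
    }

    public Task<string> GetImageBase64Async(string imageUrl)
    {
        // Cache the (large) base64 string briefly so shared images aren't
        // re-downloaded and re-allocated per request - that allocation churn
        // was what drove the frequent Gen 2 collections.
        return _cache.GetOrCreateAsync(imageUrl, async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            byte[] bytes = await _client.GetByteArrayAsync(imageUrl); // async, not blocking
            return Convert.ToBase64String(bytes);
        });
    }
}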

Wrap Up

As shown in the graph below, at the end of our performance tuning, their backend was easily able to handle 3,000 concurrent users and they are now using their ASP.NET Core solution in production. The performance issues they saw overlapped a lot with those we’ve seen from other customers (especially the need for caching and for async calls), but proved to be extra challenging for the developers to diagnose due to the lack of familiarity with Linux and Docker environments.

Performance and Errors Charts look good, up and to the right
Throughput and Tests Charts look good, up and to the right

Some key areas of focus uncovered by this investigation were:

Being mindful of memory allocations to minimize GC pause times

Keeping long-running calls non-blocking/asynchronous

Minimizing calls to external resources (such as other web services or the database) with caching and grouping of requests

Hope you find this useful! Big thanks to Mike Rousos from the ASP.NET Core team for his work and analysis!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

New prescriptive guidance for Open Source .NET Library Authors

Oct 16, 2018

Description:

Open-source library guidance

There's a great new bunch of guidance just published representing Best Practices for creating .NET Libraries. Best of all, it was shepherded by JSON.NET's James Newton-King. Who better to help explain the best way to build and publish a .NET library than the author of the world's most popular open source .NET library?

Perhaps you've got an open source (OSS) .NET Library on your GitHub, GitLab, or Bitbucket. Go check out the open-source library guidance.

These are the identified aspects of high-quality open-source .NET libraries:

Inclusive - Good .NET libraries strive to support many platforms and applications.
Stable - Good .NET libraries coexist in the .NET ecosystem, running in applications built with many libraries.
Designed to evolve - .NET libraries should improve and evolve over time, while supporting existing users.
Debuggable - .NET libraries should use the latest tools to create a great debugging experience for users.
Trusted - .NET libraries have developers' trust by publishing to NuGet using security best practices.

The guidance is deep but also preliminary. As with all Microsoft Documentation these days it's open source in Markdown and on GitHub. If you've got suggestions or thoughts, share them! Be sure to sound off in the Feedback Section at the bottom of the guidance. James and the Team will be actively incorporating your thoughts.

Cross-platform targeting

Since the whole point of .NET Core and the .NET Standard is reuse, this section covers how and why to make reusable code but also how to access platform-specific APIs when needed with multi-targeting.
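For example, a minimal multi-targeting csproj (the targets here are just illustrative) pairs a plural TargetFrameworks property with conditional ItemGroups for platform-specific references:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- Build once per listed target -->
    <TargetFrameworks>netstandard2.0;net461</TargetFrameworks>
  </PropertyGroup>
  <!-- .NET Framework-only dependencies live behind a condition -->
  <ItemGroup Condition="'$(TargetFramework)' == 'net461'">
    <Reference Include="System.Net.Http" />
  </ItemGroup>
</Project>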

Strong naming

Strong naming seemed like a good idea but you should know WHY and WHEN to strong name. It all depends on your use case! Are you publishing internally or publicly? What are your dependencies and who depends on you?

NuGet

When publishing on the NuGet public repository (or your own private/internal one) what do you need to know about SemVer 2.0.0? What about pre-release packages? Should you embed PDBs for easier debugging? Consider things like Dependencies, SourceLink, how and where to Publish and how Versioning applies to you and when (or if) you cause Breaking changes.

Also be sure to check out Immo's video on "Building Great Libraries with .NET Standard" on YouTube!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

C# and .NET Core scripting with the "dotnet-script" global tool

Oct 12, 2018

Description:

dotnet script

You likely know that open source .NET Core is cross platform and it's super easy to do "Hello World" and start writing some code.

You just install .NET Core, then run "dotnet new console", which will generate a project file and a basic app, then "dotnet run" will compile and run your app. The 'new' command creates all the supporting code, obj, and bin folders, etc. When you do "dotnet run" it actually is a combination of "dotnet build" and "dotnet exec whatever.dll."
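Spelled out, the whole flow is just:

C:\> dotnet new console -o HelloApp
C:\> cd HelloApp
C:\HelloApp> dotnet run
Hello World!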

What could be easier?

What about .NET Core as scripting?

Check out dotnet script:

C:\Users\scott\Desktop\scriptie> dotnet tool install -g dotnet-script
You can invoke the tool using the following command: dotnet-script
C:\Users\scott\Desktop\scriptie>copy con helloworld.csx
Console.WriteLine("Hello world!");
^Z
1 file(s) copied.
C:\Users\scott\Desktop\scriptie>dotnet script helloworld.csx
Hello world!

NOTE: I was a little tricky there in step two. I did a "copy con filename" to copy from the console to the destination file, then used Ctrl-Z to finish the copy. Feel free to just use notepad or vim. That's not dotnet-script-specific, that's Hanselman-specific.

Pretty cool, eh? If you're doing this on Linux or macOS you'll need to include a "shebang" as the first line of the script. This is a standard thing for scripting files like bash, python, etc.

#!/usr/bin/env dotnet-script
Console.WriteLine("Hello world");

This lets the operating system know what scripting engine handles this file.
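That means on Linux/macOS you can mark the script executable and run it directly, assuming dotnet-script is on your PATH:

$ chmod +x hello.csx
$ ./hello.csx
Hello world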

If you want to refer to a NuGet package within a script (*.csx) file, you'll use the Roslyn #r syntax:

#r "nuget: AutoMapper, 6.1.0"
Console.WriteLine("whatever);

Even better! Once you have "dotnet-script" installed as a global tool as above:

dotnet tool install -g dotnet-script

You can use it as a REPL! Finally, the C# REPL (Read Evaluate Print Loop) I've been asking for, for only a decade! ;)

C:\Users\scott\Desktop\scriptie>dotnet script
> 2+2
4
> var x = "scott hanselman";
> x.ToUpper()
"SCOTT HANSELMAN"

This is super useful as a learning tool if you're teaching C# in a lab/workshop situation. Of course you could also learn using http://try.dot.net in the browser as well.

In the past you may have used ScriptCS for C# scripting. There are a number of cool C#/F# scripting options. This is certainly not a new thing:

Mono scripting
ScriptCS (Filip was core on this)
Roslyn Scripting APIs (Roslyn is underneath most scripting environments)
CS-Script

In this case, I was very impressed with the ease of dotnet-script as a global tool and its simplicity. Go check out https://github.com/filipw/dotnet-script and try it out today!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Using Enhanced Mode Ubuntu 18.04 for Hyper-V on Windows 10

Oct 10, 2018

Description:

I run Windows as my daily driver and use WSL (Windows Subsystem for Linux) all day long, but WSL is just the command line and has some perf issues with heavy file system work. I use Docker for Windows, which works amazingly and has good perf, but sometimes I want to test on a full Ubuntu Desktop.

ASIDE: No joke. My Linux/Ubuntu bona fides go back a while. Here's me installing Ubuntu 10.04 on Windows 7 over 8 years ago. Umuntu ngumuntu ngabantu!

To be frank, historically Ubuntu has sucked on Windows' Hyper-V. If you wanted to get a higher (read: usable) resolution it would take a miracle. If you wanted shared clipboards or shared disk drives, well, again, a miracle or a ton of manual setup. It's possible but it's not fun.

Why can't it be easy? Well, it is. I installed the Windows 10 "October 2018 Update" - yes, the naming scheme is confusing. It's Windows 10 "1809" - that's 2018 and the 9th month. Just type "Winver" from the Start menu to check. You may have "1803" from March. Go update.

Windows 10 includes Hyper-V Quick Create which has this suspiciously short list under "Select an operating system." Anytime a list has 1 or 2 items and some whitespace that means it will someday have n+1 list items.

Recently Ubuntu 18.04.1 LTS showed up in this list. You can quickly and easily create an Ubuntu VM from here and it's all handled, downloading, network switch, VM create, etc.

Create Virtual Machine

I dig it. So click create, start it up...get to the set up screen. Now, here, make sure you click "Require my password to login." What we want to do won't work with "Log in Automatically" and you don't want that anyway.

Setting up an Ubuntu VM

After you've created your VM and got it mostly setup, close the Hyper-V client window. Just X it out. The VM is still running of course.

Go over to Hyper-V Manager and right click on it and "Connect."

Connect to VM

You'll see a resolution dialog... pick one! Go crazy! Do be aware that there are issues on 4K displays, but you can adjust within Ubuntu itself.

Set Resolution

Now, BEFORE you click Connect, click "Show Options" and then "Local Resources." Under here, uncheck Smart Cards and Check "Drives."

Uncheck Smart Cards and Check Drives

Click OK and Connect...and you get this weird dialog! You're actually RDP'ing into Ubuntu! Rather than using the historical weird Hyper-V Client stuff to talk to Ubuntu and struggle with video cards and resolutions, here you are literally just Remote Desktoping into Ubuntu using integrated open source xrdp!

Login with your name and password (remember before when I said don't automatically login? This is why.)

Login to xrdp

What about Dynamic Resizing?

Here's an even better possible future. What we REALLY want (don't we, Dear Reader) is Dynamic Resolution and Resizing without Reconnection! Today you can just close and reconnect to change resolutions but I'd love to just resize the Ubuntu window like I do Windows 7/8/10 VM client windows.

The feature "Dynamic resolution update" was introduced in RDP 8.1. It enables to resize screen resolution on-the-fly.

Since we are using xrdp, and that's open source over at https://github.com/neutrinolabs/xrdp/, AND there's even an issue about this, AND a lovely person has the code in their own branch and agreed to possibly upstream it, maybe we can start using it and this great feature will just light up for folks who use Hyper-V Quick Create. Certainly we're talking weeks and months here (unless you want to help), but the lion's share of the work is done. I'm looking forward to resizing Ubuntu VMs dynamically.

What's in Enhanced Mode Today?

Back to today! You can read about how Linux VMs (Ubuntu or Arch) are set up in this GitHub repo: https://github.com/Microsoft/linux-vm-tools. You can set them up yourself with scripts, but the nice thing about Hyper-V Quick Create is that the work is done for us to make these "enhanced session" RDP-friendly VMs. No need to fear - you can just read the scripts yourself.

I can connect quickly and Enhanced Mode VMs give me:

a shared clipboard
the resolution of my choice on connect
fast painting/video/scrolling
automatic shared drives
smooth and automatic mouse capture

Fantastic.

Ubuntu on Windows 10

What about installing Visual Studio Code? Of course. And also .NET Core, natch.

image

This took like 10 min and 8 of it was waiting for Hyper-V Create to download Ubuntu. Try it out!

Sponsor: Check out the latest JetBrains Rider with built-in spell checking, enhanced debugger, Docker support, full C# 7.3 support, publishing to IIS and more advanced Unity support.


© 2018 Scott Hanselman. All rights reserved.
     

Troubleshooting Windows 10 Nearby Sharing and Bluetooth Antennas

Oct 5, 2018

Description:

wifi

When building my Ultimate Developer PC I picked this motherboard, and it's lovely.

ASUS ROG STRIX LGA2066 X299 ATX Motherboard - Good solid board with built in BT and Wifi, an M.2 heatsink included, 3x PCIe 3.0 x16 SafeSlots (supports triple @ x16/x16/x8), 1x PCIe 3.0 x4, 2x PCIe 3.0 x1 and a Max of 128 gigs of RAM. It also has 8x USB 3.1s and a USB C which is nice.

I put it all together and I'm thrilled with the machine. However, recently I was trying to use the new Windows 10 "Nearby Sharing" feature.

It's this cool feature that lets you share stuff to "Nearby Devices" - that means your laptop, other desktops, whatever. Similar to AirDrop, it solves that problem of moving stuff between devices without using an intermediate server.

You can turn it on in Settings on Windows 10 and decide if you want to receive data from everyone or just contacts.

Nearby Sharing

So I started using it on my new Desktop, IRONHEART, but I kept getting this "Looking for nearby devices" dialog... and it would just do nothing.

Looking for Nearby Devices

It turns out that the ASUS Motherboard also comes with a Wi-Fi antenna. I don't use Wi-Fi (I'm wired) so I didn't bother attaching it. It seems that this antenna is also a Bluetooth antenna, and if you plug it in you'll ACTUALLY GET A LOVELY BLUETOOTH SIGNAL. Who knew? ;)

Now I can easily right click on files in Explorer or Web Pages in Edge and transfer them between systems.

Sharing a file with Nearby Sharing

A few tips on Nearby Sharing

Make sure you know your visibility settings. From the Start Menu type "nearby sharing" and confirm them.
Make sure the receiving device doesn't have "Focus Assist" on (via the Action Center in the lower right of the screen) or you might miss the notification.
And if you're using a desktop like me, ahem, plug in your BT antenna.

Hope this helps someone because Nearby Sharing is a great feature that I'm now using all the time.

Sponsor: Telerik DevCraft is the comprehensive suite of .NET and JavaScript components and productivity tools developers use to build high-performant, modern web, mobile, desktop apps and chatbots. Try it!
© 2018 Scott Hanselman. All rights reserved.
     

Headless CMS and Decoupled CMS in .NET Core

Oct 3, 2018

Description:

Headless by Wendy used under CC https://flic.kr/p/HkESxW

I'm sure I'll miss some, so if I do, please sound off in the comments and I'll update this post over the next week or so!

Lately I've been noticing a lot of "Headless" CMSs (Content Management System). A ton, in fact. I wanted to explore this concept and see if it's a fad or if it's really something useful.

With the rise of clean RESTful APIs has come the rise of Headless CMS systems. We've all evaluated CMS systems (ones that included both front- and back-ends) and found the front-end wanting. Perhaps it lacks flexibility, OR it's way too flexible and overwhelming. In fact, when I wrote my podcast website I considered a CMS but decided it felt too heavy for just a small site.

A Headless CMS is a back-end only content management system (CMS) built from the ground up as a content repository that makes content accessible via a RESTful API for display on any device.

I could start with a database but what if I started with a CMS that was just a backend - a headless CMS. I'll handle the front end, and it'll handle the persistence.
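As a minimal sketch (hypothetical endpoint and content shape - every headless CMS has its own API), the front-end's job reduces to fetching and deserializing JSON:

using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Episode
{
    public string Title { get; set; }
    public string Body { get; set; }
}

public class CmsClient
{
    private readonly HttpClient _http = new HttpClient();

    // The CMS handles storage and editing; the front-end just asks for JSON.
    public async Task<Episode> GetEpisodeAsync(string slug)
    {
        var json = await _http.GetStringAsync($"https://cms.example.com/api/episodes/{slug}");
        return JsonConvert.DeserializeObject<Episode>(json);
    }
}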

Here's what I found when exploring .NET Core-based Headless CMSs. One thing worth noting is that, given Docker containers and the ease with which we can deploy hybrid systems, some of these solutions have .NET Core front-ends and "who cares, it returns JSON" for the back-end!

Lynicon

Lynicon is literally implemented as a NuGet library! It stores its data as structured JSON. It's built on top of ASP.NET Core and uses MVC concepts and architecture.

It does include a front-end for administration but it's not required. It will return HTML or JSON depending on what HTTP headers are sent in. This means you can easily use it as the back-end for your Angular or existing SPA apps.

Lynicon is largely open source at https://github.com/jamesej/lyniconanc. If you want to take it to the next level there's a small fee that gives you updated searching, publishing, and caching modules.

ButterCMS

ButterCMS is an API-based CMS that seamlessly integrates with ASP.NET applications. It has an SDK that drops into ASP.NET Core and also returns data as JSON. Pulling the data out and showing it in a view is easy.

public class CaseStudyController : Controller
{
    private ButterCMSClient Client;
    private static string _apiToken = "";

    public CaseStudyController()
    {
        Client = new ButterCMSClient(_apiToken);
    }

    [Route("customers/{slug}")]
    public async Task<ActionResult> ShowCaseStudy(string slug)
    {
        var json = await Client.ListPageAsync("customer_case_study", slug);
        dynamic page = ((dynamic)JsonConvert.DeserializeObject(json)).data.fields;
        ViewBag.SeoTitle = page.seo_title;
        ViewBag.FacebookTitle = page.facebook_open_graph_title;
        ViewBag.Headline = page.headline;
        ViewBag.CustomerLogo = page.customer_logo;
        ViewBag.Testimonial = page.testimonial;
        return View("Location");
    }
}

Then of course output into Razor (or putting all of this into a RazorPage) is simple:

<html>
<head>
    <title>@ViewBag.SeoTitle</title>
    <meta property="og:title" content="@ViewBag.FacebookTitle" />
</head>
<body>
    <h1>@ViewBag.Headline</h1>
    <img width="100%" src="@ViewBag.CustomerLogo">
    <p>@ViewBag.Testimonial</p>
</body>
</html>

Butter is a little different (and somewhat unusual) in that their backend API is a SaaS (Software as a Service) and they host it. They then have SDKs for lots of platforms including .NET Core. The backend is not open source while the front-end is https://github.com/ButterCMS/buttercms-csharp.

Piranha CMS

Piranha CMS is built on ASP.NET Core and is open source on GitHub. It's also totally package-based using NuGet and can be easily started up with a dotnet new template like this:

dotnet new -i Piranha.BasicWeb.CSharp
dotnet new piranha
dotnet restore
dotnet run

It even includes a new Blog template that includes Bootstrap 4.0 and is all set for customization. It does include an optional lightweight front-end, but you can use it as a guideline to create your own client code. One nice touch is that Piranha also includes image resizing and cropping.

Umbraco Headless

The main ASP.NET website currently uses Umbraco as its CMS. Umbraco is a well-known open source CMS that will soon include a Headless option for more flexibility. The open source code for Umbraco is up here https://github.com/umbraco.

Orchard Core

Orchard is a CMS with a very strong community and fantastic documentation. Orchard Core is a redevelopment of Orchard using open source ASP.NET Core. While it's not "headless" it is using a Decoupled Architecture. Nothing would prevent you from removing the UI and presenting the content with your own front-end. It's also cross-platform and container friendly.

Squidex

"Squidex is an open source headless CMS and content management hub. In contrast to a traditional CMS Squidex provides a rich API with OData filter and Swagger definitions." Squidex is build with ASP.NET Core and the CQRS pattern and works with both Windows and Linux on today's browsers.

Squidex is open source with excellent docs at https://docs.squidex.io. They are also working on a hosted version you can play with here: https://cloud.squidex.io. Samples on how to consume it are here: https://github.com/Squidex/squidex-samples.

The consumption is super clean:

[Route("/{slug},{id}/")] public async Task<IActionResult> Post(string slug, string id) { var post = await apiClient.GetBlogPostAsync(id); var vm = new PostVM { Post = post }; return View(vm); }

And then the View:

@model PostVM
@{
    ViewData["Title"] = Model.Post.Data.Title;
}
<div>
    <h2>@Model.Post.Data.Title</h2>
    @Html.Raw(Model.Post.Data.Text)
</div>

What .NET Core Headless CMSs did I miss? Let me know.

This definitely isn't a fad. It makes a lot of sense to me architecturally. Given the proliferation of "backend as a service" systems and DocumentDBs like Cosmos and Mongo, it follows that a headless CMS could easily fit into my systems. One less DB schema to think about, and no need to roll my own auth/auth.

*Photo "headless" by Wendy used under CC https://flic.kr/p/HkESxW

Sponsor: Telerik DevCraft is the comprehensive suite of .NET and JavaScript components and productivity tools developers use to build high-performant, modern web, mobile, desktop apps and chatbots. Try it!


© 2018 Scott Hanselman. All rights reserved.
     

Exploring .NET Core's SourceLink - Stepping into the Source Code of NuGet packages you don't own

Sep 28, 2018

Description:

According to https://github.com/dotnet/sourcelink, SourceLink "enables a great source debugging experience for your users, by adding source control metadata to your built assets."

Sounds fantastic. I download a NuGet package to use something like Json.NET or whatever all the time, and I'd love to be able to "Step Into" the source even if I don't have it laying around. Per the GitHub repo, it's both language- and source-control-agnostic. I read that to mean "not just C# and not just GitHub."

Visual Studio 15.3+ supports reading SourceLink information from symbols while debugging. It downloads and displays the appropriate commit-specific source for users, such as from raw.githubusercontent, enabling breakpoints and all other sources debugging experience on arbitrary NuGet dependencies. Visual Studio 15.7+ supports downloading source files from private GitHub and Azure DevOps (former VSTS) repositories that require authentication.

Looks like Cameron Taggart did the original implementation and then the .NET team worked with Cameron and the .NET Foundation to make the current version. Also cool.

Download Source and Continue Debugging

Let me see if this really works and how easy (or not) it is.

I'm going to make a little library using the 5 year old Pseudointernationalizer from here. Fortunately the main function is pretty pure and drops into a .NET Standard library neatly.

I'll put this on GitHub, so I will include "PublishRepositoryUrl" and "EmbedUntrackedSources" as well as including the PDBs. So far my CSPROJ looks like this:

<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<PublishRepositoryUrl>true</PublishRepositoryUrl>
<EmbedUntrackedSources>true</EmbedUntrackedSources>
<AllowedOutputExtensionsInPackageBuildOutputFolder>$(AllowedOutputExtensionsInPackageBuildOutputFolder);.pdb</AllowedOutputExtensionsInPackageBuildOutputFolder>
</PropertyGroup>
</Project>

Pretty straightforward so far. As I am using GitHub I added this reference, but if I was using GitLab or BitBucket, etc, I would use that specific provider per the docs.

<ItemGroup>
<PackageReference Include="Microsoft.SourceLink.GitHub" Version="1.0.0-beta-63127-02" PrivateAssets="All"/>
</ItemGroup>

Now I'll pack up my project as a NuGet package.

D:\github\SourceLinkTest\PsuedoizerCore [master ≡]> dotnet pack -c release
Microsoft (R) Build Engine version 15.8.166+gd4e8d81a88 for .NET Core
Copyright (C) Microsoft Corporation. All rights reserved.

Restoring packages for D:\github\SourceLinkTest\PsuedoizerCore\PsuedoizerCore.csproj...
Generating MSBuild file D:\github\SourceLinkTest\PsuedoizerCore\obj\PsuedoizerCore.csproj.nuget.g.props.
Restore completed in 96.7 ms for D:\github\SourceLinkTest\PsuedoizerCore\PsuedoizerCore.csproj.
PsuedoizerCore -> D:\github\SourceLinkTest\PsuedoizerCore\bin\release\netstandard2.0\PsuedoizerCore.dll
Successfully created package 'D:\github\SourceLinkTest\PsuedoizerCore\bin\release\PsuedoizerCore.1.0.0.nupkg'.

Let's look inside the .nupkg as they are just ZIP files. Ah, check out the generated *.nuspec file that's inside!

<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2012/06/nuspec.xsd">
<metadata>
<id>PsuedoizerCore</id>
<version>1.0.0</version>
<authors>PsuedoizerCore</authors>
<owners>PsuedoizerCore</owners>
<requireLicenseAcceptance>false</requireLicenseAcceptance>
<description>Package Description</description>
<repository type="git" url="https://github.com/shanselman/PsuedoizerCore.git" commit="35024ca864cf306251a102fbca154b483b58a771" />
<dependencies>
<group targetFramework=".NETStandard2.0" />
</dependencies>
</metadata>
</package>

See under repository: it points back to the location AND the commit hash for this binary! That means I can give it to you or a coworker and they'd be able to get to the source. But what's the consumption experience like? I'll go over and start a new Console app that CONSUMES my NuGet library package. To make totally sure that I don't accidentally pick up the source from my machine I'm going to delete the entire folder. This source code no longer exists on this machine.

I'm using a "local" NuGet Feed. In fact, it's just a folder. Check it out:

D:\github\SourceLinkTest\AConsumerConsole> dotnet add package PsuedoizerCore -s "c:\users\scott\desktop\LocalNuGetFeed"
Writing C:\Users\scott\AppData\Local\Temp\tmpBECA.tmp
info : Adding PackageReference for package 'PsuedoizerCore' into project 'D:\github\SourceLinkTest\AConsumerConsole\AConsumerConsole.csproj'.
log : Restoring packages for D:\github\SourceLinkTest\AConsumerConsole\AConsumerConsole.csproj...
info : GET https://api.nuget.org/v3-flatcontainer/psuedoizercore/index.json
info : NotFound https://api.nuget.org/v3-flatcontainer/psuedoizercore/index.json 465ms
log : Installing PsuedoizerCore 1.0.0.
info : Package 'PsuedoizerCore' is compatible with all the specified frameworks in project 'D:\github\SourceLinkTest\AConsumerConsole\AConsumerConsole.csproj'.
info : PackageReference for package 'PsuedoizerCore' version '1.0.0' added to file 'D:\github\SourceLinkTest\AConsumerConsole\AConsumerConsole.csproj'.

See how I used -s to point to an alternate source? I could also configure my NuGet feeds, be they local directories or internal servers with "dotnet new nugetconfig" and including my NuGet Servers in the order I want them searched.
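For reference, a nuget.config like the one "dotnet new nugetconfig" generates would look roughly like this once edited (my local folder feed first, then nuget.org):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- Sources are searched in this order -->
    <add key="LocalFeed" value="c:\users\scott\desktop\LocalNuGetFeed" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>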

Here is my little app:

using System;
using Utils;

namespace AConsumerConsole
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine(Pseudoizer.ConvertToFakeInternationalized("Hello World!"));
}
}
}

And the output is [Ħęľľő Ŵőřľđ! !!! !!!].

But can I step into it? I don't have the source remember...I'm using SourceLink.

In Visual Studio 2017 I confirm that SourceLink is enabled. This is the Portable PDB version of SourceLink, not the "SourceLink 1.0" that was "Enable Source Server Support." That only worked on Windows.

Enable Source Link Support

You'll also want to turn off "Just My Code" since, well, this isn't your code.

Disable Just My Code

Now I'll start a Debug Session in my consumer app and hit F11 to Step Into the Library whose source I do not have!

Source Link Will Download from The Internet

Fantastic. It's going to get the source for me! Without git cloning the repository it will seamlessly let me continue my debugging session.

The temporary file ended up in C:\Users\scott\AppData\Local\SourceServer\4bbf4c0dc8560e42e656aa2150024c8e60b7f9b91b3823b7244d47931640a9b9 if you're interested. I'm able to just keep debugging as if I had the source...because I do! It came from the linked source.

Debugging into a NuGet that I don't have the source for

Very cool. I'm going to keep digging into SourceLink and learning about it. It seems that if YOU have a library or published NuGet, either inside your company OR out in the open source world, you absolutely should be using SourceLink.

You can even install the sourcelink global tool and test your .pdb files for greater insight.

D:\github\SourceLinkTest\PsuedoizerCore>dotnet tool install --global sourcelink
D:\github\SourceLinkTest\PsuedoizerCore\bin\release\netstandard2.0>sourcelink print-urls PsuedoizerCore.pdb
43c83e7173f316e96db2d8345a3f963527269651 sha1 csharp D:\github\SourceLinkTest\PsuedoizerCore\Psuedoizer.cs
https://raw.githubusercontent.com/shanselman/PsuedoizerCore/02c09baa8bfdee3b6cdf4be89bd98c8157b0bc08/Psuedoizer.cs
bfafbaee93e85cd2e5e864bff949f60044313638 sha1 csharp C:\Users\scott\AppData\Local\Temp\.NETStandard,Version=v2.0.AssemblyAttributes.cs
embedded

Think about how much easier consumers of your library will have it when debugging their apps! Your package is no longer a black box. Go set this up on your projects today.

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.


© 2018 Scott Hanselman. All rights reserved.
     

A command-line REPL for RESTful HTTP Services

Sep 25, 2018

Description:

HTTP REPL

My, that's a lot of acronyms. REPL means "Read Evaluate Print Loop." You know how you can run "python" and then just type 2+2 and get an answer? That's a type of REPL.

The ASP.NET Core team is building a REPL that lets you explore and interact with your RESTful services. Ideally your services will have Swagger/OpenAPI available that describes the service. Right now this Http-REPL is just being developed and they're aiming to release it as a .NET Core Global Tool in .NET Core 2.2.

You can install global tools like this:

dotnet tool install -g nyancat

Then you can run "nyancat." Get a list of installed tools like this:

C:\Users\scott> dotnet tool list -g
Package Id                Version                 Commands
--------------------------------------------------------------------
altcover.global           3.5.560                 altcover
dotnet-depends            0.1.0                   dotnet-depends
dotnet-httprepl           2.2.0-preview3-35304    dotnet-httprepl
dotnet-outdated           2.0.0                   dotnet-outdated
dotnet-search             1.0.0                   dotnet-search
dotnet-serve              1.0.0                   dotnet-serve
git-status-cli            1.0.0                   git-status
github-issues-cli         1.0.0                   ghi
nukeeper                  0.7.2                   NuKeeper
nyancat                   1.0.0                   nyancat
project2015to2017.cli     1.8.1                   csproj-to-2017

For the HTTP-REPL, since it's not yet released you have to point the Tool Feed to a daily build location, so do this:

dotnet tool install -g --version 2.2.0-* --add-source https://dotnet.myget.org/F/dotnet-core/api/v3/index.json dotnet-httprepl

Then run it with "dotnet httprepl." I'd like another name? What do you think? RESTy? POSTr? API Test? API View?

Here's an example run where I start up a Web API.

C:\SwaggerApp> dotnet httprepl
(Disconnected)~ set base http://localhost:65369
Using swagger metadata from http://localhost:65369/swagger/v1/swagger.json
http://localhost:65369/~ dir
.        []
People   [get|post]
Values   [get|post]
http://localhost:65369/~ cd People
/People    [get|post]
http://localhost:65369/People~ dir
.      [get|post]
..     []
{id}   [get]
http://localhost:65369/People~ get
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Wed, 26 Sep 2018 20:25:37 GMT
Server: Kestrel
Transfer-Encoding: chunked
[
  {
    "id": 1,
    "name": "Scott Hunter"
  },
  {
    "id": 0,
    "name": "Scott Hanselman"
  },
  {
    "id": 2,
    "name": "Scott Guthrie"
  }
]

Take a moment and read that. It can be a little confusing. It's not HTTPie, it's not cURL, but it's also not Postman. It's something that you run and stay in if you're a command-line person who enjoys that space. It's as if you "cd (change directory)" and "mount" a disk into your Web API.

You can use all the HTTP Verbs, and when POSTing you can set a default text editor and it will launch the editor with the JSON written for you! Give it a try!

A few gotchas/known issues:

You'll want to set a default Content-Type header for your session. I think this should be the default.
set header Content-Type application/json
If the HTTP REPL doesn't automatically detect your Swagger/OpenAPI endpoint, you'll need to set it manually:
set base https://yourapi/api/v1/
set swagger https://yourapi/swagger.json
I haven't figured out how to get it to use VS Code as its default editor. Likely because "code.exe" isn't a thing. (It uses a batch .cmd file, which the HTTP REPL would need to special-case.) For now, use an editor that's an EXE and point the HTTP REPL at it like this:
pref set editor.command.default 'c:\notepad2.exe'

I'm really enjoying this idea. I'm curious how you find it and how you'd see it being used. Sound off in the comments.

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.


© 2018 Scott Hanselman. All rights reserved.

Scripts to remove old .NET Core SDKs

Sep 20, 2018

Description:

That's a lot of .NET Core installations

.NET Core is lovely. Its usage is skyrocketing, it's open source, and .NET Core 2.1 has some amazing performance improvements. Just upgrading from 2.0 to 2.1 gave Bing a 34% performance boost.

However, those of us who install multiple .NET Core SDKs side by side have noticed that they add up, especially if you're installing daily builds. As of 2.x, .NET Core doesn't yet have an "uninstall all" or "uninstall all previews" option. There's work planned for .NET Core 3.0 that will mitigate this cumulative effect when you have lots of installs.

If you're taking dailies and it's time to tidy up, the short answer per Damian Edwards is "Delete them all, then nuke the dotnet folder in program files, then install the latest version."

Here's a PowerShell Script you can run on Windows as admin that will aggressively uninstall .NET Core SDKs.

Note the match at the top. Depending on your goals, you might want to change it to "Microsoft .NET Core SDK 2.1" or just "Microsoft .NET Core SDK 2."

Once it's all removed, then add the latest from https://www.microsoft.com/net/download/archives

A list of .NET Core SDKs

Here's the script, which is an improvement on Andrew's comment here. You can improve it, as it's on GitHub here: https://github.com/shanselman/RemoveDotNetCoreSDKInstallers. This script currently requires you to hit YES as each MSI elevates. It doesn't work right when you try /passive as a switch. I'm interested to see if you can get a "torch all Core SDK installers and install LTS and Current" script working.

$app = Get-WmiObject -Class Win32_Product | Where-Object {
    $_.Name -match "Microsoft .NET Core SDK"
}
Write-Host $app.Name
Write-Host $app.IdentifyingNumber
pushd $env:SYSTEMROOT\System32
$app.IdentifyingNumber | ForEach-Object { Start-Process msiexec -Wait -ArgumentList "/x $_" }
popd

This PowerShell is Windows-only, of course.

If you're on RHEL or Ubuntu/Debian, there are scripts to try out here: https://github.com/dotnet/cli/tree/master/scripts/obtain/uninstall

Let me know if this script works for you.

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.


© 2018 Scott Hanselman. All rights reserved.

Azure DevOps Continuous Build/Deploy/Test with ASP.NET Core 2.2 Preview in One Hour

Sep 18, 2018

Description:

Hanselminutes Website

I've been doing Continuous Integration and Deployment for well over 13 years. We used a lot of custom scripts and a lovely tool called CruiseControl.NET to check out, build, test, and deploy our code.

However, it's easy to get lulled into complacency. To get lazy. I don't set up Automated Continuous Integration and Deployment for all my little projects. But I should.

I was manually deploying a change to my podcast website this evening via a git deploy to Azure App Service. Pushing to Azure this way via Git uses "Kudu" to actually build the site. However, earlier this week I was also trying to update my site to .NET Core 2.2 which is in preview. Plus I have Unit Tests that aren't getting run during deploy.

So look at it this way. My simple little podcast website with a few tests and the desire to use a preview .NET Core SDK means I've outgrown a basic "git push to prod" for deploy.

I remembered that Azure DevOps (formerly VSTS) is out and offers free unlimited minutes for open source projects. I have no excuse for my sloppy builds and manual deploys. It also has unlimited free private repos, although I'm happy at GitHub and have no reason to move.

It usually takes me 5-10 minutes for a manual build/test/deploy, so I gave myself an hour to see if I could get this same process automated in Azure DevOps. I've never used this before and I wanted to see if I could do it quickly, and if it was intuitive.

Let's review my goals.

My source is in GitHub.
Build my ASP.NET Core 2.2 web site. I want to build with .NET Core 2.2, which is currently in preview.
Run my xUnit unit tests. I have some Selenium unit tests that can't run in the cloud (at least, I haven't figured that out yet), so I need them skipped.
Deploy the resulting site to production in my Azure App Service.

Cool. So I make a project and point Azure DevOps at my GitHub.

Azure DevOps: Source code in GitHub

They have a number of starter templates, so I was pleasantly surprised that I didn't need to build my Build Configuration manually. I'll pick ASP.NET app. I could pick Azure Web App for ASP.NET, but I wanted a little more control.

Select a template

Now I've got a basic build pipeline. You can see it will use NuGet to restore the packages, build the app, test the assemblies (if there are tests...more on that later), and then publish (zip) the build artifacts.

Build Pipeline

I then clicked Save & Queue...and it failed. Why? It says that I'm targeting .NET Core 2.2 and it doesn't support anything over 2.1. Shoot.

Agent says it doesn't support .NET Core 2.2

Fortunately there's a pipeline element that I can add called ".NET Core Tool Installer" that will get specific versions of the .NET Core SDK.

NOTE: I've emailed the team that ".NET Tool Installer" is the wrong name. A .NET Tool is a totally different thing. This task should be called the ".NET Core SDK Installer." Because it wasn't, it took me a minute to find it and figure out what it does.

I'm using SDK version 2.2.100-preview2-009404, so I put that string into the properties.

Install the .NET Core SDK custom version

At this point it builds, but I get a test error.

There are two problems with the tests. When I look at the logs I can see that the "testadapter.dll" that comes with xUnit is mistakenly being pulled into the test runner! Why? Because the "Test Files" spec includes a VERY greedy glob in the form of **\*test*.dll. Perhaps testadapter shouldn't include the word "test," but then it wouldn't be well-named.

**\$(BuildConfiguration)\**\*test*.dll
!**\obj\**

My test DLLs are all named with "tests" in the filename, so I'll change the glob to "**\$(BuildConfiguration)\**\*tests*.dll" to cast a narrower net.

Screenshot (45)

I have four Selenium Tests for my ASP.NET Core site but I don't want them to run when the tests are run in a Docker Container or, in this case, in the Cloud. (Until I figure out how)

I use SkippableFacts from XUnit and do this:

public static class AreWe
{
    public static bool InDockerOrBuildServer
    {
        get
        {
            string retVal = Environment.GetEnvironmentVariable("DOTNET_RUNNING_IN_CONTAINER");
            string retVal2 = Environment.GetEnvironmentVariable("AGENT_NAME");
            return (
                (String.Compare(retVal, Boolean.TrueString, ignoreCase: true) == 0)
                ||
                (String.IsNullOrWhiteSpace(retVal2) == false));
        }
    }
}

Don't tease me. I like it. Now I can skip tests that I don't want running.

if (AreWe.InDockerOrBuildServer) return;
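If you're using the Xunit.SkippableFact package, the same check works with its Skip.If helper. Here's a sketch (the test name and body are made up, not from my actual site):

using Xunit;

public class SeleniumSmokeTests
{
    [SkippableFact]
    public void HomePageLoads()
    {
        // Skip when running in a container or on a build agent,
        // where there's no browser for Selenium to drive.
        Skip.If(AreWe.InDockerOrBuildServer, "Selenium tests only run locally.");

        // ...Selenium test body goes here...
    }
}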

Now my tests run and I get a nice series of charts to show that fact.

22 tests, 4 skipped

I have it building and tests running.

I could add the Deployment Step to the Build but Azure DevOps Pipelines includes a better way. I make a Release Pipeline that is separate. It takes Artifacts as input and runs n number of Stages.

Creating a new Release Pipeline

I take the Artifact from the Build (the zipped up binaries) and pass them through the pipeline into the Azure App Service Deploy step.

Screenshot (49)

Here's the deployment in progress.

Manually Triggered Release

Cool! Now that it works and deploys, I can turn on Continuous Integration Build Triggers (via an automatic GitHub webhook) as well as Continuous Deployment triggers.

Continuous Deployment

Azure DevOps even includes badges that I can add to my readme.md so I always know by looking at GitHub if my site builds AND if it has successfully deployed.

4 releases, the final one succeeded

Now I can see each release as it happens and if it's successful or not.

Build Succeeded, Never Deployed

To top it all off, now that I have all this data and these pipelines, I even put together a nice little dashboard in about a minute to show Deployment Status and Test Trends.

My build and deployment dashboard

When I combine the DevOps Dashboard with my main Azure Dashboard I'm amazed at how much information I can get in so little effort. Consider that my podcast (my little business) is a one-person shop.

Azure Dashboard

And now I have a CI/CD pipeline with integrated testing gates that deploys worldwide. Many years ago this would have required a team and a lot of custom code.

Today it took an hour. Awesome.

I check into GitHub, which kicks off a build, runs the tests, emails me the results, and deploys the website if everything is cool. Of course, if I had another team member I could put in deployment gates or reviews, etc.

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.


© 2018 Scott Hanselman. All rights reserved.

A complete containerized .NET Core Application microservice that is as small as possible

Sep 14, 2018

Description:

OK, maybe not technically a microservice, but that's a hot buzzword these days, right? A few weeks ago I blogged about Improvements on ASP.NET Core deployments on Zeit's now.sh and making small container images. By the end I was able to cut my container size in half.

The trimming I was using is experimental and very aggressive. If your app loads things at runtime - like ASP.NET Razor Pages sometimes does - you may end up getting weird errors at runtime when a Type is missing. Some types may have been trimmed away!

For example:

fail: Microsoft.AspNetCore.Server.Kestrel[13]
Connection id "0HLGQ1DIEF1KV", Request id "0HLGQ1DIEF1KV:00000001": An unhandled exception was thrown by the application.
System.TypeLoadException: Could not load type 'Microsoft.AspNetCore.Diagnostics.IExceptionHandlerPathFeature' from assembly 'Microsoft.Extensions.Primitives, Version=2.1.1.0, Culture=neutral, PublicKeyToken=adb9793829ddae60'.
at Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware.Invoke(HttpContext context)
at System.Runtime.CompilerServices.AsyncMethodBuilderCore.Start[TStateMachine](TStateMachine& stateMachine)
at Microsoft.AspNetCore.Diagnostics.ExceptionHandlerMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.HostFiltering.HostFilteringMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.Hosting.Internal.HostingApplication.ProcessRequestAsync(Context context)
at Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpProtocol.ProcessRequests[TContext](IHttpApplication`1 application)

Yikes!

I'm doing a self-contained deployment and then trimming the result! Richard Lander has a great Dockerfile example. Note how he's doing the package addition with the dotnet CLI's "dotnet add package" and the subsequent trim within the Dockerfile (as opposed to you adding it to your local development copy's csproj).

I'm adding the Tree Trimming Linker in the Dockerfile, so the trimming happens when the container image is built. I'm using the dotnet command to "dotnet add package ILLink.Tasks". This means I don't need to reference the linker package at development time - it's all done at container build time.

FROM microsoft/dotnet:2.1-sdk-alpine AS build
WORKDIR /app

# copy csproj and restore as distinct layers
COPY *.sln .
COPY nuget.config .
COPY superzeit/*.csproj ./superzeit/
RUN dotnet restore

# copy everything else and build app
COPY . .
WORKDIR /app/superzeit
RUN dotnet build

FROM build AS publish
WORKDIR /app/superzeit
# add IL Linker package
RUN dotnet add package ILLink.Tasks -v 0.1.5-preview-1841731 -s https://dotnet.myget.org/F/dotnet-core/api/v3/index.json
RUN dotnet publish -c Release -o out -r linux-musl-x64 /p:ShowLinkerSizeComparison=true

FROM microsoft/dotnet:2.1-runtime-deps-alpine AS runtime
ENV DOTNET_USE_POLLING_FILE_WATCHER=true
WORKDIR /app
COPY --from=publish /app/superzeit/out ./
ENTRYPOINT ["dotnet", "superzeit.dll"]

I did end up hitting this bug in the Linker (it's not released yet) but there's an easy workaround. I just need to set the property CrossGenDuringPublish to false in the project file.

If you look at the Advanced Instructions for the Linker you can see that you can "root" types or assemblies. Root means "don't mess with these or stuff that hangs off them." So I just need to exercise my app at runtime and make sure that all the types that my app needs are available, but no unnecessary ones.

I added the Assemblies I wanted to keep (not remove) while trimming/linking to my project file:

<Project Sdk="Microsoft.NET.Sdk.Web">

  <PropertyGroup>
    <TargetFramework>netcoreapp2.1</TargetFramework>
    <CrossGenDuringPublish>false</CrossGenDuringPublish>
  </PropertyGroup>

  <ItemGroup>
    <LinkerRootAssemblies Include="Microsoft.AspNetCore.Mvc.Razor.Extensions;Microsoft.Extensions.FileProviders.Composite;Microsoft.Extensions.Primitives;Microsoft.AspNetCore.Diagnostics.Abstractions" />
  </ItemGroup>

  <ItemGroup>
    <!-- this can be here, or can be done all at runtime in the Dockerfile -->
    <!-- <PackageReference Include="ILLink.Tasks" Version="0.1.5-preview-1841731" /> -->
    <PackageReference Include="Microsoft.AspNetCore.App" />
  </ItemGroup>

</Project>

My strategy for figuring out which assemblies to "root" and exclude from trimming was literally to just iterate. Build, trim, test, add an assembly by reading the error message, and repeat.

This sample ASP.NET Core app will deploy cleanly on Zeit with the smallest image footprint possible: https://github.com/shanselman/superzeit

Next I'll try an actual Microservice (as opposed to a complete website, which is what this is) and see how small I can get that. Such fun!

UPDATE: This technique works with "dotnet new webapi" as well. The image is about 73 megs according to "docker images," and it's 34 megs when sent and squished through Zeit's "now" CLI.

Small services!

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.
© 2018 Scott Hanselman. All rights reserved.

How do you use System.Drawing in .NET Core?

Sep 12, 2018

Description:

I've been doing .NET image processing since the beginning. In fact, I wrote about it over 13 years ago on this blog when I talked about Compositing two images into one from the ASP.NET Server Side, and in it I used System.Drawing to do the work. For over a decade, folks using System.Drawing were just using it as a thin wrapper over GDI (Graphics Device Interface), a set of very old Win32 (Windows) unmanaged drawing APIs. We used them because they worked fine.

.NET Conf: Join us this week! September 12-14, 2018 for .NET Conf! It's a FREE, 3 day virtual developer event co-organized by the .NET Community and Microsoft. Watch all the sessions here. Join a virtual attendee party after the last session ends on Day 1 where you can win prizes! Check out the schedule here and attend a local event in your area organized by .NET community influencers all over the world.

DotNetBot

For a while there was a package called CoreCompat.System.Drawing that was a .NET Core port of a Mono version of System.Drawing.

However, since then Microsoft has released System.Drawing.Common to provide access to GDI+ graphics functionality cross-platform.

There is a lot of existing code - mine included - that makes assumptions that .NET would only ever run on Windows. Using System.Drawing was one of those things. The "Windows Compatibility Pack" is a package meant for developers that need to port existing .NET Framework code to .NET Core. Some of the APIs remain Windows only but others will allow you to take existing code and make it cross-platform with a minimum of trouble.
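To try it, adding the package to a project is one command away (the version you get will be whatever is current when you run it):

dotnet add package System.Drawing.Common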

Here's a super simple app that resizes a PNG to 128x128. However, it's a .NET Core app and it runs on both Windows and Linux (Ubuntu!).

using System;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;
using System.IO;

namespace imageresize
{
    class Program
    {
        static void Main(string[] args)
        {
            int width = 128;
            int height = 128;
            var file = args[0];
            Console.WriteLine($"Loading {file}");
            using (FileStream pngStream = new FileStream(args[0], FileMode.Open, FileAccess.Read))
            using (var image = new Bitmap(pngStream))
            {
                var resized = new Bitmap(width, height);
                using (var graphics = Graphics.FromImage(resized))
                {
                    graphics.CompositingQuality = CompositingQuality.HighSpeed;
                    graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
                    graphics.CompositingMode = CompositingMode.SourceCopy;
                    graphics.DrawImage(image, 0, 0, width, height);
                    resized.Save($"resized-{file}", ImageFormat.Png);
                    Console.WriteLine($"Saving resized-{file} thumbnail");
                }
            }
        }
    }
}

Here it is running on Ubuntu:

Resizing Images on Ubuntu

NOTE that on Ubuntu (and other Linuxes) you may need to install some native dependencies as System.Drawing sits on top of native libraries

sudo apt install libc6-dev
sudo apt install libgdiplus

There are lots of great options for image processing on .NET Core now! It's important to understand that this System.Drawing layer is great for existing System.Drawing code, but you probably shouldn't write NEW image management code with it. Instead, consider one of the great open source options.

ImageSharp - A cross-platform library for the processing of image files, written in C#. Compared to System.Drawing, ImageSharp has been able to develop something much more flexible, easier to code against, and much, much less prone to memory leaks. Gone are system-wide process-locks; ImageSharp images are thread-safe and fully supported in web environments.

Here's how you'd resize something with ImageSharp:

using (Image<Rgba32> image = Image.Load("foo.jpg"))
{
    image.Mutate(x => x
        .Resize(image.Width / 2, image.Height / 2)
        .Grayscale());
    image.Save("bar.jpg"); // Automatic encoder selected based on extension.
}

Magick.NET - A .NET library on top of ImageMagick
SkiaSharp - A .NET wrapper on top of Google's cross-platform Skia library
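For comparison, here's roughly the same halving-resize with SkiaSharp. This is just a sketch of mine (file names made up; the exact Resize overload varies a bit between SkiaSharp versions):

using System.IO;
using SkiaSharp;

class SkiaResizeSketch
{
    static void Main()
    {
        using (var input = SKBitmap.Decode("foo.jpg"))
        using (var resized = input.Resize(
            new SKImageInfo(input.Width / 2, input.Height / 2),
            SKFilterQuality.High))
        using (var image = SKImage.FromBitmap(resized))
        using (var output = File.OpenWrite("bar.png"))
        {
            // Encode the resized bitmap as a PNG and write it to disk.
            image.Encode(SKEncodedImageFormat.Png, 100).SaveTo(output);
        }
    }
}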

It's awesome that there are so many choices with .NET Core now!

Sponsor: Rider 2018.2 is here! Publishing to IIS, Docker support in the debugger, built-in spell checking, MacBook Touch Bar support, full C# 7.3 support, advanced Unity support, and more.


© 2018 Scott Hanselman. All rights reserved.

The Extremely Promising State of Diabetes Technology in 2018

Sep 7, 2018

Description:

This blog post is an update to these two Diabetes Technology blog posts:

The Promising State of Diabetes Technology in 2016
The Sad State of Diabetes Technology in 2012

You might also enjoy this video of the talk I gave at WebStock 2018 on Solving Diabetes with an Open Source Artificial Pancreas*.

First, let me tell you that insulin is too expensive in the US.

Between 2002 and 2013, the price of insulin jumped, with the typical cost for patients increasing from about $40 a vial to $130.

Open Source Artificial Pancreas on iPhone

For some of the newer insulins like the ones I use, I pay as much as $296 a bottle. I have a Health Savings Plan so this is often out of pocket until I hit the limit for the year.

People in America are rationing insulin. This is a demonstrable fact. I've personally mailed extra insulin to folks in need. I've met young people who lost their insurance at age 26 and have had to skip shots to save vials of insulin.

This is a problem, but on the technology side there's some extremely promising work happening, and we have really hit our stride in the last ten years.

I wrote the first glucose management system for the PalmPilot in 1998, called GlucoPilot, and it provided in-depth analysis on the go for the first time. The first thing that struck me was that the PalmPilot and the blood sugar meter were the same size. Why did I need two devices with batteries, screens, buttons and a CPU? Why so many devices?

I've been told every year that a Diabetes Breakthrough is coming "in five years." It's been 25 years.

In 2001 I went on a trip across the country with my wife, an insulin pump, and 8 PDAs (personal digital assistants, the "iPhones" of the time) and tried to manage my diabetes using all the latest wireless technology...this was the latest stuff 17 years ago. I had just moved from injections to an insulin pump. Even now in 2018, insulin pumps are mostly proprietary and super expensive. In fact, many folks who use insulin pumps in the States use out-of-warranty pumps purchased on Craigslist.

Fast forward to 2018 and I've been using an Open Source Artificial Pancreas for two years.

OpenAPS - Open Artificial Pancreas System. A platform for building a closed loop with open tools.
AndroidAPS - A branch of OpenAPS running on Android.
Loop/LoopKit - Open Source Artificial Pancreas running on the iPhone with a hardware bridge (RileyLink) to the pump. I run this pancreas, personally, and have for nearly 2 years.

Watch Dana Lewis (the originator of OpenAPS) talk about OpenAPS at OSCON!

The results speak for themselves. While I do have bad sugars sometimes, and I do struggle, if you look at my blood work my A1c (the long-term measurement of "how I'm doing") shows non-diabetic levels. To be clear - I'm fully and completely Type 1 diabetic; I produce zero insulin of my own. I take between 40 and 50 units of insulin every day, and have for the last 25 years...but I will likely die of old age.

Open Source Artificial Pancreas === Diabetes results pic.twitter.com/ZSsApTLRXq

— Scott Hanselman (@shanselman) September 10, 2018

This is significant. Why? Because historically diabetics die of diabetes. While we wait (or more accurately, #WeAreNotWaiting) for a biological/medical solution to Type 1 diabetes, the DIY (Do It Yourself) community is just doing it ourselves.

Building on open hardware, open software, and reverse-engineered protocols for proprietary hardware, the online diabetes community literally has their choice of open source pancreases in 2018! Who would have imagined it. You can choose your algorithms, your phone, your pump, your continuous glucose meter.

Today, in 2018, you can literally change the code and recompile a personal branch of your own pancreas.

Watch my 2010 YouTube video "I am Diabetic" as I walk you through the medical hardware (pumps, needles, tubes, wires) in managing diabetes day to day. Then watch my 2018 talk on Solving Diabetes with an Open Source Artificial Pancreas*.

I believe that every diabetic should be offered a pump, a continuous glucose meter, and trained on some kind of artificial pancreas. A cloud based reporting system has also been a joy. My wife and family can see my sugar in real time when I'm away. My wife has even called me overseas to wake me up when I was in a bad sugar situation.

Artificial Pancreas generations

As the closed-hardware and closed-software medical companies work towards their own artificial pancreases, the open source community feels those companies would better serve us by opening up their protocols, using standard Bluetooth ISO profiles, and following security best practices.

Looking at the table above, the open source community is squarely in #4 and moving quickly into #5. But why did we have to do this ourselves? We got tired of waiting.

All in all, through open software and hardware, I can tell you that my life is SO MUCH BETTER than it was when I was first diagnosed. I figure we'll have this all figured out in about five years, right? ;)

THANK YOU!

MORE DIABETES READING

Bridging Dexcom Share CGM Receivers and Nightscout
Hacking Diabetes
Visualizing your real-time blood sugar values in Git
Diabetes Technology: Dexcom G5 CGM Review
Introducing Web Tiles for Microsoft Band - My diabetes data on a Band!
Diabetics: It's fun to say Bionic Pancreas but how about a reality check
It's WAY too early to call this Insulin Pump an Artificial Pancreas

* Yes, there are some analogies, stretched metaphors, and oversimplifications in this talk. This talk is an introduction to the space for the normally-sugared. If you are a diabetes expert you might watch and say...eh...ya, I mean, it kind of works like that. Please take the talk in the thoughtful spirit it was intended.

Sponsor: Get home early, eat supper on time and coach your kids in soccer. Moving workloads to Azure just got easy with Azure NetApp Files. Sign up to Preview Azure NetApp Files!


© 2018 Scott Hanselman. All rights reserved.

Always Be Closing...Pull Requests

Sep 5, 2018

Description:

Always be closing

I was looking at a Well Known Open Source Project on GitHub today. It had like 978 Pull Requests. A "PR" means "hey, here's some code I did for your project, you can PULL it from here and merge it into your code!"

But these were Open Pull Requests. Pending. Limbo Pull Requests. Dating back to 2015.

Why do Pull Requests stay open?

Why do projects keep Pull Requests open? What's a reasonable amount of time? Here's a few thoughts.

PR as Call to Action - PRs are a shout. They are HERE IS SOME CODE and they create work for the maintainer. They are needy things and require review and merging, but even worse, sometimes manual merging. Plus, for folks new to Git and Open Source, asking them to "rebase on top of latest" may be enough for them to just give up.

Fear of Closing - If you close a PR without merging it, it's a rejection. It's a statement that this work isn't going to be used, and there's always a chance that the person who did the work will feel pretty bad about it.

Abandoned - Sometimes the originator of the PR disappears. The PR is effectively abandoned. These should be closed after a time.

Opened so long they can't be merged - The problem with PRs that are open for a long time is that they become impossible to merge. The cost of understanding whether they are still relevant, plus resolving the merge conflicts, might be higher than the value of the PR itself.

Incorrectly created - A PR originator may intend to change a single word (a misspelling) but their PR changes CRs to LFs or tabs to spaces - it's a hassle.

Formatting - It's generally considered poor form to send a PR out of the blue where one just ran a linter or formatter. If the project wanted that done, they'd ask for it.

Totally not aligned with Roadmap - If a PR shows up without context or communication, it may not be aligned with the direction of the project.

Surprise PR - Unfortunately some PRs show up out of the blue with major changes, file moves, or no context. If a PR wasn't asked for, or wasn't borne of an Issue, you'll likely have trouble pushing it through.

Thanks to Jon and Immo for their thoughts on this (likely incomplete) list. Jess Frazelle has a great post on "The Art of Closing" that I just found, and it includes a glorious gif from Glengarry Glen Ross where Always Be Closing comes from (warning, clip has dated and offensive language).

Jess suggests a few ways to Always Be Closing.

Two things that can help make your open source project successful AND stay tidy!

Including a CONTRIBUTING.md - GitHub has some good guidance at https://help.github.com/articles/setting-up-your-project-for-healthy-contributions/ and most of the dotnet repos have some decent contribution guidelines. I LOVE Gina Häußge's Contributing.md on her open source project "OctoPrint."
Using Pull Request templates that give clear guidance on how to submit a successful pull request: https://help.github.com/articles/creating-a-pull-request-template-for-your-repository
Using bots to test and build PRs, sign CLAs (Contributor License Agreements), and move the ball forward.

What do you think? Why do PRs stay open?

Sponsor: Get home early, eat supper on time and coach your kids in soccer. Moving workloads to Azure just got easy with Azure NetApp Files. Sign up to Preview Azure NetApp Files!


© 2018 Scott Hanselman. All rights reserved.

Interesting bugs - MSB3246: Resolved file has a bad image, no metadata, or is otherwise inaccessible. Image is too small.

Aug 30, 2018

Description:

I got a very strange warning recently when building a .NET Core app with "dotnet build."

MSB3246: Resolved file has a bad image, no metadata, or is otherwise inaccessible.
Image is too small.

Interesting Bug used under CC https://flic.kr/p/4SpmL6

Eek! It's clear, in that something is "too small" - but what? A file, I guess? Maybe it's the wrong size?

The error code is MSB3246, which is nice and googleable/searchable, but it was confusing because I couldn't figure out which file, specifically. It just felt vague.

BUT!

I had recently been overclocking my machine (overly aggressively, gulp, about 40%) and had a very nasty hard reboot. As a result I had a few dozen files get orphaned - specifically the files were zero'ed out! Zero is small, right?

Turns out you can pass parameters over to MSBuild from "dotnet build" and see what MSBuild is doing internally. For example, you could

/fileLoggerParameters:verbosity=diagnostic

but that's long. So how about:

dotnet build /flp:v=diag

Cool. What deep logging do I see now?

Primary reference "deliberately.zero.bytes.dll". (TaskId:41)
13:36:52.397 1:7>C:\Program Files\dotnet\sdk\2.1.400\Microsoft.Common.CurrentVersion.targets(2110,5): warning MSB3246: Resolved file has a bad image, no metadata, or is otherwise inaccessible. Image is too small. [S:\work\zero-byte-ref\zero-byte-ref.csproj]
Resolved file path is "S:\work\zero-byte-ref\deliberately.zero.bytes.dll". (TaskId:41)
Reference found at search path location "{RawFileName}". (TaskId:41)

Now with "verbose" turned on I can see that one of the references is zero'ed out/corrupted/bad. I reinstalled .NET Core in my case and doubled checked all the DLLs/Assemblies that I was bringing in - I also ran chkdsk /f - and I was back in business!

I hope this helps someone who might stumble on error MSB3246 and wonder what's up.

Even better, thanks to Rainer Sigwald who filed a bug against MSBuild to update the error message to be more clear. In the future I'll be able to debug this without changing verbosity!

Sponsor: Preview the latest JetBrains Rider with its built-in spell checking, initial Blazor support, partial C# 7.3 support, enhanced debugger, C# Interactive, and a redesigned Solution Explorer.
© 2018 Scott Hanselman. All rights reserved.

Improvements on ASP.NET Core deployments on Zeit's now.sh and making small container images

Aug 29, 2018

Description:

Back in March of 2017 I blogged about Zeit and their cool deployment system "now." Zeit will take any folder and deploy it to the web easily. Better yet, if you have a Dockerfile in that folder, Zeit will just use it for the deployment.

image

Zeit's free Open Source account has a limit of 100 megs for the resulting image, and with the right Dockerfile a starter ASP.NET Core app comes in at less than 77 megs. You just need to be smart about a few things. Additionally, it's running in a somewhat constrained environment, so ASP.NET's assumptions around FileWatchers can occasionally cause you to see errors like:

Unhandled Exception: System.IO.IOException:
The configured user limit (8192) on the number of inotify instances has been reached.
   at System.IO.FileSystemWatcher.StartRaisingEvents()
   at System.IO.FileSystemWatcher.StartRaisingEventsIfNotDisposed(

While the DOTNET_USE_POLLING_FILE_WATCHER environment variable (more on it below) is set by default in the "FROM microsoft/dotnet:2.1-sdk" image, it's not set at runtime. That's dependent on your environment.

Here's my Dockerfile for a simple project called SuperZeit. Note that the project is structured with a SLN file, which I recommend.

Let me call out a few things.

First, we're doing a multi-stage build here. The SDK is large. You don't want to deploy the compiler to your runtime image!

Second, the first COPY commands just copy the sln and the csproj. You don't need the source code to do a dotnet restore! (Did you know that?) Not copying source means that your docker builds will be MUCH faster, as Docker will cache the steps and only regenerate things that change. Docker will only run dotnet restore again if the solution or project files change - not the source.

Third, we are using the aspnetcore-runtime image here, not the dotnetcore one. That means this image includes the binaries for .NET Core and ASP.NET Core. We don't need or want to include them again. If you were doing a publish with the -r switch, you'd be doing a self-contained build/publish and you'd end up copying TWO .NET Core runtimes into a container! That'll cost you another 50-60 megs and it's just wasteful. If you do want that, go explore the very good examples on the .NET Docker Repo on GitHub at https://github.com/dotnet/dotnet-docker/tree/master/samples - there's an Optimizing Container Size section, a .NET Core Alpine Docker Sample (builds, tests, and runs an application using Alpine), and a .NET Core self-contained Sample (builds and runs an application as a self-contained application).

Finally, since some container systems like Zeit have modest settings for inotify instances (to avoid abuse, plus most folks don't use them as often as .NET Core does), you'll want to set ENV DOTNET_USE_POLLING_FILE_WATCHER=true, which I do in the runtime image.

So starting from this Dockerfile:

FROM microsoft/dotnet:2.1-sdk-alpine AS build
WORKDIR /app

# copy csproj and restore as distinct layers
COPY *.sln .
COPY superzeit/*.csproj ./superzeit/
RUN dotnet restore

# copy everything else and build app
COPY . .
WORKDIR /app/superzeit
RUN dotnet build

FROM build AS publish
WORKDIR /app/superzeit
RUN dotnet publish -c Release -o out

FROM microsoft/dotnet:2.1-aspnetcore-runtime-alpine AS runtime
ENV DOTNET_USE_POLLING_FILE_WATCHER=true
WORKDIR /app
COPY --from=publish /app/superzeit/out ./
ENTRYPOINT ["dotnet", "superzeit.dll"]

Remember the layers of the Docker images, as if they were a call stack:

Your app's files
ASP.NET Core Runtime
.NET Core Runtime
.NET Core native dependencies (OS specific)
OS image (Alpine, Ubuntu, etc.)

For my little app I end up with a 76.8 meg image. If I want, I can add the experimental .NET IL Trimmer. It won't make a difference with this app as it's already pretty simple, but it could with a larger one.

BUT! What if we changed the layering to this?

Your app's files along with a self-contained copy of ASP.NET Core and .NET Core
.NET Core native dependencies (OS specific)
OS image (Alpine, Ubuntu, etc.)

Then we could do a self-contained deployment and then trim the result! Richard Lander has a great Dockerfile example.

See how he's doing the package addition with the dotnet CLI with "dotnet add package" and subsequent trim within the Dockerfile (as opposed to you adding it to your local development copy's csproj).

FROM microsoft/dotnet:2.1-sdk-alpine AS build
WORKDIR /app

# copy csproj and restore as distinct layers
COPY *.sln .
COPY nuget.config .
COPY superzeit/*.csproj ./superzeit/
RUN dotnet restore

# copy everything else and build app
COPY . .
WORKDIR /app/superzeit
RUN dotnet build

FROM build AS publish
WORKDIR /app/superzeit
# add IL Linker package
RUN dotnet add package ILLink.Tasks -v 0.1.5-preview-1841731 -s https://dotnet.myget.org/F/dotnet-core/api/v3/index.json
RUN dotnet publish -c Release -o out -r linux-musl-x64 /p:ShowLinkerSizeComparison=true

FROM microsoft/dotnet:2.1-runtime-deps-alpine AS runtime
ENV DOTNET_USE_POLLING_FILE_WATCHER=true
WORKDIR /app
COPY --from=publish /app/superzeit/out ./
ENTRYPOINT ["dotnet", "superzeit.dll"]

Now at this point, I'd want to see how small the IL Linker made my ultimate project. The goal is to be less than 75 megs. However, I think I've hit this bug so I will have to head to bed and check on it in the morning.

The project is at https://github.com/shanselman/superzeit and you can just clone and "docker build" and see the bug.

However, if you check the comments in the Dockerfile and just use "FROM microsoft/dotnet:2.1-aspnetcore-runtime-alpine AS runtime" it works fine. I just think I can get it even smaller than 75 megs.

Talk to you soon, Dear Reader! (I'll update this post when I find out about that bug...or perhaps my bug!)

Sponsor: Preview the latest JetBrains Rider with its built-in spell checking, initial Blazor support, partial C# 7.3 support, enhanced debugger, C# Interactive, and a redesigned Solution Explorer.


© 2018 Scott Hanselman. All rights reserved.

Decoding an SSH Key from PEM to BASE64 to HEX to ASN.1 to prime decimal numbers

Aug 24, 2018

Description:

I'm reading a new chapter of The Imposter's Handbook: Season 2 that Rob and I are working on. He's digging into the internals of what's exactly in your SSH Key.

Decoding a certificate

I generated a key with no password:

ssh-keygen -t rsa -C scott@myemail.com

Inside the generated file is this text, which we've all seen before but few have cracked open.

-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEAtd8As85sOUjjkjV12ujMIZmhyegXkcmGaTWk319vQB3+cpIh
Wu0mBke8R28jRym9kLQj2RjaO1AdSxsLy4hR2HynY7l6BSbIUrAam/aC/eVzJmg7
qjVijPKRTj7bdG5dYNZYSEiL98t/+XVxoJcXXOEY83c5WcCnyoFv58MG4TGeHi/0
coXKpdGlAqtQUqbp2sG7WCrXIGJJdBvUDIQDQQ0Isn6MK4nKBA10ucJmV+ok7DEP
kyGk03KgAx+Vien9ELvo7P0AN75Nm1W9FiP6gfoNvUXDApKF7du1FTn4r3peLzzj
50y5GcifWYfoRYi7OPhxI4cFYOWleFm1pIS4PwIDAQABAoIBAQCBleuCMkqaZnz/
6GeZGtaX+kd0/ZINpnHG9RoMrosuPDDYoZZymxbE0sgsfdu9ENipCjGgtjyIloTI
xvSYiQEIJ4l9XOK8WO3TPPc4uWSMU7jAXPRmSrN1ikBOaCslwp12KkOs/UP9w1nj
/PKBYiabXyfQEdsjQEpN1/xMPoHgYa5bWHm5tw7aFn6bnUSm1ZPzMquvZEkdXoZx
c5h5P20BvcVz+OJkCLH3SRR6AF7TZYmBEsBB0XvVysOkrIvdudccVqUDrpjzUBc3
L8ktW3FzE+teP7vxi6x/nFuFh6kiCDyoLBhRlBJI/c/PzgTYwWhD/RRxkLuevzH7
TU8JFQ9BAoGBAOIrQKwiAHNw4wnmiinGTu8IW2k32LgI900oYu3ty8jLGL6q1IhE
qjVMjlbJhae58mAMx1Qr8IuHTPSmwedNjPCaVyvjs5QbrZyCVVjx2BAT+wd8pl10
NBXSFQTMbg6rVggKI3tHSE1NSdO8kLjITUiAAjxnnJwIEgPK+ljgmGETAoGBAM3c
ANd/1unn7oOtlfGAUHb642kNgXxH7U+gW3hytWMcYXTeqnZ56a3kNxTMjdVyThlO
qGXmBR845q5j3VlFJc4EubpkXEGDTTPBSmv21YyU0zf5xlSp6fYe+Ru5+hqlRO4n
rsluyMvztDXOiYO/VgVEUEnLGydBb1LwLB+MVR2lAoGAdH7s7/0PmGbUOzxJfF0O
OWdnllnSwnCz2UVtN7rd1c5vL37UvGAKACwvwRpKQuuvobPTVFLRszz88aOXiynR
5/jH3+6IiEh9c3lattbTgOyZx/B3zPlW/spYU0FtixbL2JZIUm6UGmUuGucs8FEU
Jbzx6eVAsMojZVq++tqtAosCgYB0KWHcOIoYQUTozuneda5yBQ6P+AwKCjhSB0W2
SNwryhcAMKl140NGWZHvTaH3QOHrC+SgY1Sekqgw3a9IsWkswKPhFsKsQSAuRTLu
i0Fja5NocaxFl/+qXz3oNGB56qpjzManabkqxSD6f8o/KpeqryqzCUYQN69O2LG9
N53L9QKBgQCZd0K6RFhhdJW+Eh7/aIk8m8Cho4Im5vFOFrn99e4HKYF5BJnoQp4p
1QTLMs2C3hQXdJ49LTLp0xr77zPxNWUpoN4XBwqDWL0t0MYkRZFoCAG7Jy2Pgegv
uOuIr6NHfdgGBgOTeucG+mPtADsLYurEQuUlfkl5hR7LgwF+3q8bHQ==
-----END RSA PRIVATE KEY-----

The private key is an ASN.1 (Abstract Syntax Notation One) encoded data structure. It's a funky format but it's basically a packed format with the ability for nested trees that can hold booleans, integers, etc.

However, ASN.1 is just the binary packed "payload." It's not the "container." For example, there are envelopes and there are letters inside them. The envelope is the PEM (Privacy Enhanced Mail) format. Such things start with ----- BEGIN SOMETHING ----- and end with ----- END SOMETHING ------. If you're familiar with BASE64, your spidey sense may tell you that this is a BASE64 encoded file. Not everything that's BASE64 turns into a friendly ASCII string. This turns into a bunch of bytes you can view in HEX.

We can first decode the PEM file into HEX. Yes, I know there are lots of ways to do this stuff at the command line, but I like showing and teaching using some of the many encoding/decoding websites and utilities out there. I also love using https://cryptii.com/ for these things, as you can build a visual pipeline.
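If you'd rather do that decode in code than in a browser, stripping the envelope and Base64-decoding is only a few lines of C# (a sketch; "notreal" is the key file name used in the openssl example below):

using System;
using System.IO;
using System.Linq;

class PemToHex
{
    static void Main()
    {
        // Drop the -----BEGIN/END----- envelope lines and join the Base64 body...
        var body = string.Concat(
            File.ReadAllLines("notreal").Where(l => !l.StartsWith("-----")));

        // ...then decode to the raw ASN.1 bytes and print them as hex.
        byte[] der = Convert.FromBase64String(body);
        Console.WriteLine(BitConverter.ToString(der).Replace("-", ""));
    }
}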

308204A40201000282010100B5DF00B3CE6C3948E3923575DAE8
CC2199A1C9E81791C9866935A4DF5F6F401DFE7292215AED2606
47BC476F234729BD90B423D918DA3B501D4B1B0BCB8851D87CA7
63B97A0526C852B01A9BF682FDE57326683BAA35628CF2914E3E
DB746E5D60D65848488BF7CB7FF97571A097175CE118F3773959
C0A7CA816FE7C306E1319E1E2FF47285CAA5D1A502AB5052A6E9
DAC1BB582AD7206249741BD40C8403410D08B27E8C2B89CA040D
74B9C26657EA24EC310F9321A4D372A0031F9589E9FD10BBE8EC
FD0037BE4D9B55BD1623FA81FA0DBD45C3029285EDDBB51539F8
AF7A5E2F3CE3E74CB919C89F5987E84588BB38F87123870560E5
snip

This ASN.1 JavaScript decoder can take the HEX and parse it for you. Or you can parse that ASN.1 packed format at the *nix command line and see that there are nine big integers inside (I trimmed them for this blog).

openssl asn1parse -in notreal
0:d=0 hl=4 l=1188 cons: SEQUENCE
4:d=1 hl=2 l= 1 prim: INTEGER :00
7:d=1 hl=4 l= 257 prim: INTEGER :B5DF00B3CE6C3948E3923575DAE8CC2199A1C9E81791C9866935A4DF5F6F401DFE7292215
268:d=1 hl=2 l= 3 prim: INTEGER :010001
273:d=1 hl=4 l= 257 prim: INTEGER :8195EB82324A9A667CFFE867991AD697FA4774FD920DA671C6F51A0CAE8B2E3C30D8A1967
534:d=1 hl=3 l= 129 prim: INTEGER :E22B40AC22007370E309E68A29C64EEF085B6937D8B808F74D2862EDEDCBC8CB18BEAAD48
666:d=1 hl=3 l= 129 prim: INTEGER :CDDC00D77FD6E9E7EE83AD95F1805076FAE3690D817C47ED4FA05B7872B5631C6174DEAA7
798:d=1 hl=3 l= 128 prim: INTEGER :747EECEFFD0F9866D43B3C497C5D0E3967679659D2C270B3D9456D37BADDD5CE6F2F7ED4B
929:d=1 hl=3 l= 128 prim: INTEGER :742961DC388A184144E8CEE9DE75AE72050E8FF80C0A0A38520745B648DC2BCA170030A97
1060:d=1 hl=3 l= 129 prim: INTEGER :997742BA4458617495BE121EFF68893C9BC0A1A38226E6F14E16B9FDF5EE072981790499E

Per the spec the format is this:

An RSA private key shall have ASN.1 type RSAPrivateKey:

RSAPrivateKey ::= SEQUENCE {
version Version,
modulus INTEGER, -- n
publicExponent INTEGER, -- e
privateExponent INTEGER, -- d
prime1 INTEGER, -- p
prime2 INTEGER, -- q
exponent1 INTEGER, -- d mod (p-1)
exponent2 INTEGER, -- d mod (q-1)
coefficient INTEGER -- (inverse of q) mod p }

I found the description of how RSA works in this blog post very helpful, as it uses small numbers as examples. The variable names here like p, q, and n are agreed upon and standard.

The fields of type RSAPrivateKey have the following meanings:
o version is the version number, for compatibility
with future revisions of this document. It shall
be 0 for this version of the document.
o modulus is the modulus n.
o publicExponent is the public exponent e.
o privateExponent is the private exponent d.
o prime1 is the prime factor p of n.
o prime2 is the prime factor q of n.
o exponent1 is d mod (p-1).
o exponent2 is d mod (q-1).
o coefficient is the Chinese Remainder Theorem
coefficient q-1 mod p.
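Before we dig into the real values, here's a tiny C# sketch of the whole RSA round trip using the classic textbook-sized numbers (p=61, q=53). A real key's p and q are enormous, but the math is identical:

using System;
using System.Numerics;

class TinyRsa
{
    static void Main()
    {
        BigInteger p = 61, q = 53;   // prime1 and prime2
        BigInteger n = p * q;        // modulus n = 3233
        BigInteger e = 17;           // public exponent
        BigInteger d = 2753;         // private exponent: (e * d) mod ((p - 1) * (q - 1)) == 1

        BigInteger message = 65;
        BigInteger cipher = BigInteger.ModPow(message, e, n); // encrypt -> 2790
        BigInteger plain = BigInteger.ModPow(cipher, d, n);   // decrypt -> 65 again

        Console.WriteLine($"n={n}, cipher={cipher}, plain={plain}");
    }
}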

Let's look at one of those numbers - the prime factor p of n. It's super long in hexadecimal.

747EECEFFD0F9866D43B3C497C5D0E3967679659D2C270B3D945
6D37BADDD5CE6F2F7ED4BC600A002C2FC11A4A42EBAFA1B3D354
52D1B33CFCF1A3978B29D1E7F8C7DFEE8888487D73795AB6D6D3
80EC99C7F077CCF956FECA5853416D8B16CBD89648526E941A65
2E1AE72CF0511425BCF1E9E540B0CA23655ABEFADAAD028B

That hexadecimal number converted to decimal is this long ass number. It's 308 digits long!

22959099950256034890559187556292927784453557983859951626187028542267181746291385208056952622270636003785108992159340113537813968453561739504619062411001131648757071588488220532539782545200321908111599592636973146194058056564924259042296638315976224316360033845852318938823607436658351875086984433074463158236223344828240703648004620467488645622229309082546037826549150096614213390716798147672946512459617730148266423496997160777227482475009932013242738610000405747911162773880928277363924192388244705316312909258695267385559719781821111399096063487484121831441128099512811105145553708218511125708027791532622990325823
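That hex-to-decimal conversion is easy to reproduce with System.Numerics.BigInteger (a sketch; the short hex string is a stand-in for the full value above):

using System;
using System.Globalization;
using System.Numerics;

class HexToDecimal
{
    static void Main()
    {
        // Prepend "0" so a value with a high first nibble doesn't parse
        // as a negative two's-complement number.
        string hex = "0" + "747EECEFFD0F9866"; // stand-in for the full hex above
        BigInteger value = BigInteger.Parse(hex, NumberStyles.HexNumber);
        Console.WriteLine(value);
    }
}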

It's hard work to prove this number is prime, but there's a great Integer Factorization Calculator that actually uses WebAssembly and your own local CPU to check such things. Expect to wait a long time, sometimes until the heat death of the universe. ;)

Rob and I are finding it really cool to dig just below the surface of common things we look at all the time. I have often opened a key file in a text editor but never drawn a straight and complete line through decoding, unpacking, and decoding again, all the way to a mathematical formula. I feel I'm filling in some major gaps in my knowledge!

Sponsor: Preview the latest JetBrains Rider with its built-in spell checking, initial Blazor support, partial C# 7.3 support, enhanced debugger, C# Interactive, and a redesigned Solution Explorer.


© 2018 Scott Hanselman. All rights reserved.

How do you even know this crap?

Aug 22, 2018

Description:

Imposter's Handbook

This post won't be well organized, so lower your expectations first. When Rob Conery first wrote "