<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Timothy McGrath</title>
    <description>The latest articles on DEV Community by Timothy McGrath (@timothymcgrath).</description>
    <link>https://dev.to/timothymcgrath</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F160826%2F0e407fa1-864e-4352-aae6-f626591acd52.jpeg</url>
      <title>DEV Community: Timothy McGrath</title>
      <link>https://dev.to/timothymcgrath</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/timothymcgrath"/>
    <language>en</language>
    <item>
      <title>TIL: Generate Required &amp; Optional Parameters with NSwag</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Wed, 10 Mar 2021 02:23:11 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/til-generate-required-optional-parameters-with-nswag-3g61</link>
      <guid>https://dev.to/timothymcgrath/til-generate-required-optional-parameters-with-nswag-3g61</guid>
      <description>&lt;p&gt;I use NSwag to generate a Swagger page for each of our .NET Core services, and I also use NSwag to generate a .NET Standard proxy client for other services to use when talking to that service. &lt;/p&gt;

&lt;p&gt;However, the signature of the generated client does not always accurately describe which parameters are required and which can be null. Here is how to fix that.&lt;/p&gt;

&lt;p&gt;This is the example endpoint. userId and age are required and not nullable. faveNumber is optional and nullable. &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    [HttpGet()]
    public ActionResult GetValues(
        [FromHeader] string userId,
        [FromQuery] int age,
        [FromQuery] int? faveNumber)
    {
        return Ok();
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I want the generated client code to create the following method signature:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GetValues(string userId, int age, int? faveNumber = null)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Fix the swagger.json
&lt;/h3&gt;

&lt;p&gt;First, I need to properly describe the parameters on the endpoint, so that the generated swagger.json describes them correctly:&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    "parameters": [
      {
        "name": "userId",
        "in": "header",
        "required": true,
        "schema": {
          "type": "string"
        },
        "x-position": 1
      },
      {
        "name": "age",
        "in": "query",
        "required": true,
        "schema": {
          "type": "integer",
          "format": "int32"
        },
        "x-position": 2
      },
      {
        "name": "faveNumber",
        "in": "query",
        "schema": {
          "type": "integer",
          "format": "int32",
          "nullable": true
        },
        "x-position": 3
      }
    ],
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The required and nullable fields are the key to a properly generated client. Use the nullable annotation (?) to denote parameters that are nullable, and use [BindRequired] to denote parameters that must be included:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    [HttpGet()]
    public ActionResult GetValues(
        [BindRequired][FromHeader] string userId,
        [BindRequired][FromQuery] int age,
        [FromQuery] int? faveNumber)
    {
        return Ok();
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;In Startup.cs, set the default reference-type null handling to NotNull, which prevents the required reference-type parameters from accepting null as a value.&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    services.AddOpenApiDocument(settings =&amp;gt;
    {
        settings.DefaultReferenceTypeNullHandling = NJsonSchema.Generation.ReferenceTypeNullHandling.NotNull;
    });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Generate the Client
&lt;/h3&gt;

&lt;p&gt;Now that we have a proper swagger.json, use that to build the NSwag client.&lt;/p&gt;

&lt;p&gt;Make sure that the "Generate optional parameters" setting is selected. I end up with a generated client with the following method: &lt;/p&gt;
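&lt;p&gt;If you generate from an nswag.json file rather than NSwagStudio, the same setting corresponds (as far as I know) to the client generator's generateOptionalParameters flag. A minimal sketch of the relevant fragment, with the surrounding settings omitted:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"codeGenerators": {
  "openApiToCSharpClient": {
    "generateOptionalParameters": true
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;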


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;    public async System.Threading.Tasks.Task&amp;lt;FileResponse&amp;gt; Values_GetValuesAsync(
        string userId,
        int age,
        int? faveNumber = null)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  Summary
&lt;/h3&gt;

&lt;p&gt;When generating an NSwag client, don't just play with the client generation settings; make sure that the generated swagger.json properly represents the endpoints you've built in code. &lt;/p&gt;

&lt;p&gt;Building the correct swagger.json is key to building a correct client.&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>csharp</category>
      <category>api</category>
    </item>
    <item>
      <title>TIL: How to Fix WebView2 in WPF TabControl</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Tue, 26 Jan 2021 02:15:22 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/til-how-to-fix-webview2-in-wpf-tabcontrol-5g8o</link>
      <guid>https://dev.to/timothymcgrath/til-how-to-fix-webview2-in-wpf-tabcontrol-5g8o</guid>
      <description>&lt;p&gt;There is a new web browser control available in WPF called WebView2. It uses the new Chromium-based Edge browser instead of the IE7 or whatever the original browser control in WPF was using. This allows for modern web pages to be displayed in your WPF app.&lt;/p&gt;

&lt;p&gt;Inside my app, I host a WebView2 inside a TabControl, so that for one tab it actually shows an HTML page that looks like part of the WPF app.&lt;/p&gt;

&lt;p&gt;However, putting the WebView2 control into the TabControl causes it to not render until the tab is clicked. So, for some users it leaves them with a white screen for a few seconds before the content pops in. &lt;/p&gt;

&lt;p&gt;I wanted to eliminate this lag because users are expecting the content to just be there when they click on the tab. &lt;/p&gt;

&lt;p&gt;The UI is simple: a TabControl holding two tabs, where the second tab contains the WebView2:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;Grid&amp;gt;
    &amp;lt;TabControl
        x:Name="tabs"&amp;gt;

        &amp;lt;TabItem Header="Welcome"&amp;gt;
            &amp;lt;Grid&amp;gt;
                &amp;lt;TextBlock&amp;gt;Hello!&amp;lt;/TextBlock&amp;gt;
            &amp;lt;/Grid&amp;gt;
        &amp;lt;/TabItem&amp;gt;

        &amp;lt;TabItem Header="Content"&amp;gt;
            &amp;lt;Grid&amp;gt;
                &amp;lt;wpf:WebView2
                    x:Name="browser"
                    /&amp;gt;
            &amp;lt;/Grid&amp;gt;
        &amp;lt;/TabItem&amp;gt;
    &amp;lt;/TabControl&amp;gt;
&amp;lt;/Grid&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The code-behind sets the source of the WebView2:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        browser.Source = new System.Uri("https://www.cnn.com/");

        this.Loaded += (_, __) =&amp;gt;
        {
            Debug.WriteLine("Main Window Loaded");
        };

        browser.CoreWebView2InitializationCompleted += (_, __) =&amp;gt;
        {
            Debug.WriteLine("Web View Initialized");
        };
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Running this code shows that the CoreWebView2InitializationCompleted event never fires until the user clicks on the second tab, which triggers the browser to start the navigation. &lt;/p&gt;

&lt;p&gt;However, if I tell WPF to activate that tab, update the layout so the browser kicks off, and then go back to the original tab, the initialization completed event fires immediately and the web page renders right away:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();

        browser.Source = new System.Uri("https://www.cnn.com/");

        this.Loaded += (_, __) =&amp;gt;
        {
            Debug.WriteLine("Main Window Loaded");

            tabs.SelectedIndex = 1;
            tabs.UpdateLayout();
            tabs.SelectedIndex = 0;
        };

        browser.CoreWebView2InitializationCompleted += (_, __) =&amp;gt;
        {
            Debug.WriteLine("Web View Initialized");
        };
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I don't love this fix, but it is the only way I could get the browser to prerender the content ahead of time. I was worried it would show some shakiness as the tabs switched, but that switch is not visible on-screen.&lt;/p&gt;

&lt;p&gt;If you're hosting a WebView2 inside a WPF TabControl, this can help to prerender the content before the user clicks on the tab.&lt;/p&gt;

&lt;p&gt;Check out the full code here: &lt;a href="https://github.com/timothymcgrath/tabwebviewfix"&gt;https://github.com/timothymcgrath/tabwebviewfix&lt;/a&gt;&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>wpf</category>
      <category>csharp</category>
    </item>
    <item>
      <title>Logging Scaffold for .NET Core / Serilog</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Tue, 01 Sep 2020 23:46:01 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/logging-scaffold-for-net-core-serilog-3o91</link>
      <guid>https://dev.to/timothymcgrath/logging-scaffold-for-net-core-serilog-3o91</guid>
      <description>&lt;h3&gt;
  
  
  Motivation
&lt;/h3&gt;

&lt;p&gt;Logging is a cross-cutting concern that we want all of our services to perform consistently. We don't want each service to recreate the wheel for logging. We want the same format of data and we want it to go to the same places. &lt;/p&gt;

&lt;h3&gt;
  
  
  Goal
&lt;/h3&gt;

&lt;p&gt;The goal is a simple library that sets up the standard for logging by writing to a local, rolling file and writing to an external source like DataDog. I don't want developers to have to think about setting up the correct logging, but just fall into success.&lt;/p&gt;

&lt;p&gt;The defaults should configure appropriate log levels, but they should be overridable in appsettings.json. New sinks can be created but they will be appended to the existing default sinks.&lt;/p&gt;

&lt;p&gt;It should also encapsulate all the necessary nuget packages for logging (as there are many needed to configure Serilog properly).&lt;/p&gt;

&lt;p&gt;Using this log library should make adding consistent, standard logging simple.&lt;/p&gt;

&lt;h3&gt;
  
  
  Logging Library
&lt;/h3&gt;

&lt;p&gt;Create a new .NET Standard library to contain the shared logging setup. The .csproj contains all the Serilog packages needed, which reduces the number of packages that need to be manually added to each app host.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;Project Sdk="Microsoft.NET.Sdk"&amp;gt;
    &amp;lt;PropertyGroup&amp;gt;
        &amp;lt;TargetFramework&amp;gt;netstandard2.0&amp;lt;/TargetFramework&amp;gt;
        &amp;lt;nullable&amp;gt;enable&amp;lt;/nullable&amp;gt;
        &amp;lt;TreatWarningsAsErrors&amp;gt;true&amp;lt;/TreatWarningsAsErrors&amp;gt;
        &amp;lt;LangVersion&amp;gt;latest&amp;lt;/LangVersion&amp;gt;
    &amp;lt;/PropertyGroup&amp;gt;

    &amp;lt;ItemGroup&amp;gt;
        &amp;lt;PackageReference Include="Serilog.AspNetCore" Version="3.4.0" /&amp;gt;
        &amp;lt;PackageReference Include="Serilog.Settings.Configuration" Version="3.1.0" /&amp;gt;
        &amp;lt;PackageReference Include="Serilog.Sinks.Async" Version="1.4.0" /&amp;gt;
        &amp;lt;PackageReference Include="Serilog.Sinks.Debug" Version="1.0.1" /&amp;gt;
        &amp;lt;PackageReference Include="Serilog.Sinks.File" Version="4.1.0" /&amp;gt;
    &amp;lt;/ItemGroup&amp;gt;
&amp;lt;/Project&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Add a LogCore.cs file to the project to encapsulate all the logging setup:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class LogCore
{
    // Name of the environment variable used to pick the appsettings overrides.
    // (The value here is an assumption; use whatever variable your hosts set.)
    private const string Environment = "ASPNETCORE_ENVIRONMENT";
    public static void Configure(string appName)
    {
        var environment = GetEnvironment();

        var logConfig = ConfigureDefaults(environment);
        logConfig = ConfigureFile(logConfig, appName);
        // Add more logging sinks here...

        // Set the logger instance to the configured logger.
        Log.Logger = logConfig.CreateLogger();
    }

    private static string GetEnvironment()
    {
        // The environment variable is needed for some logging configuration.
        var environment = System.Environment.GetEnvironmentVariable(Environment);
        if (environment == null)
        {
            throw new NullReferenceException($"{Environment} environment variable is not set.");
        }

        return environment;
    }

    private static LoggerConfiguration ConfigureDefaults(string environment)
    {
        // Use the appsettings.json configuration to override minimum levels and add any additional sinks.
        var config = new ConfigurationBuilder()
            .AddJsonFile($"appsettings.json")
            .AddJsonFile($"appsettings.{environment}.json", optional: true)
            .Build();

        // Minimum levels will be overridden by the configuration file if they are an exact match.
        return new LoggerConfiguration()
            .MinimumLevel.Information()
            .MinimumLevel.Override("Microsoft", LogEventLevel.Information)
            .MinimumLevel.Override("Microsoft.AspNetCore", LogEventLevel.Warning)
            .ReadFrom.Configuration(config);
    }

    private static LoggerConfiguration ConfigureFile(LoggerConfiguration logConfig, string appName)
    {
        var fileDirectory = $"c:\\logs\\{appName}";
        var hostName = System.Environment.MachineName.ToLower();

        // Add a default async rolling file sink.
        return logConfig
            .WriteTo.Async(a =&amp;gt; a.File(
                formatter: new JsonFormatter(renderMessage: true),
                path: $"{fileDirectory}\\log-{hostName}-.json",  // The date (and a sequence number on size rollover) is appended before the extension
                rollingInterval: RollingInterval.Day,
                fileSizeLimitBytes: 50000000, // 50 MB file limit
                rollOnFileSizeLimit: true,
                retainedFileCountLimit: 10,
                buffered: false));
    }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;LogCore configures all the default logging. This example sets up a rolling file writer with a standard name, location, and rollover limit. It also guarantees that the file logging is happening asynchronously. &lt;/p&gt;

&lt;p&gt;LogCore also sets the default log levels for Microsoft libraries. Add a default log level for any internal libraries as well. The nice part is that each log level can be overridden from configuration as usual. The library sets the default log levels first, and then the configuration settings add to or override those defaults. So, to modify the log levels, the following setting can be added:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Serilog": {
    "MinimumLevel": {
        "Override": {
            "Microsoft": "Error"
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;New sinks can also be added through configuration and they will be appended to the default list of sinks.&lt;/p&gt;
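&lt;p&gt;For example, an extra sink can be declared in appsettings.json and it will be added alongside the defaults. A minimal sketch, assuming the Serilog.Sinks.Debug package referenced above:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Serilog": {
    "WriteTo": [
        { "Name": "Debug" }
    ]
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;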

&lt;h3&gt;
  
  
  Integration
&lt;/h3&gt;

&lt;p&gt;At the start of Program.cs, add the LogCore.Configure call:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public static class Program
{
    public const string ApplicationName = "[APP NAME HERE]";

    public static void Main(string[] args)
    {
        LogCore.Configure(ApplicationName);
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Also add the .UseSerilog() call to the Host builder:&lt;/p&gt;


&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public static IHostBuilder CreateHostBuilder(string[] args)
{
    return Host.CreateDefaultBuilder(args)
        .UseSerilog()
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;The app now has standard, consistent logging without having to manually configure each sink and log level. This is great for cases where there are multiple logging outputs that each need their own configuration. Developers can focus on the value of the app instead of the cross-cutting concerns.&lt;/p&gt;

&lt;p&gt;Let me know if you have any thoughts/suggestions to improve this!&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>serilog</category>
    </item>
    <item>
      <title>Private Office or Shared Space?</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Sat, 01 Feb 2020 17:03:03 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/private-office-or-shared-space-23fh</link>
      <guid>https://dev.to/timothymcgrath/private-office-or-shared-space-23fh</guid>
      <description>&lt;p&gt;I've worked in a private office for years as a software developer, but recently we tried working in a shared team room. The goal was to measure if it would increase communication, collaboration, and productivity. &lt;/p&gt;

&lt;p&gt;I found the shared space to be extremely distracting and decreased overall productivity. &lt;/p&gt;

&lt;p&gt;While at times it was easier to collaborate, it was very difficult to find time to focus on a hard problem and experience flow.&lt;/p&gt;

&lt;p&gt;One of the developers on the team has a habit of snapping at people, and that raised the anxiety of the entire room. On the plus side, it made clear that this developer's attitude needs to be addressed; it was always affecting the team, but the shared room made that much more visible.&lt;/p&gt;

&lt;p&gt;What are your thoughts? &lt;br&gt;
I personally prefer a private office and having the ability to close my door when I need to really focus on something. I always prefer to pair program using a tool like Live Share from my own PC, so that we can both navigate the codebase at the same time. &lt;/p&gt;

</description>
      <category>discuss</category>
      <category>office</category>
    </item>
    <item>
      <title>That's Not A Moon, It's My Terminal!</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Fri, 03 Jan 2020 02:45:33 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/that-s-not-a-moon-it-s-my-terminal-cea</link>
      <guid>https://dev.to/timothymcgrath/that-s-not-a-moon-it-s-my-terminal-cea</guid>
      <description>&lt;p&gt;First, let's take a look at my beautiful new terminal. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminal.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminal.png" title="Death Star Terminal" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sweet, right?&lt;/p&gt;

&lt;p&gt;Now, if you're still here and want to learn a little about the new Windows Terminal and how to setup your own background image, keep going...&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Windows Terminal?
&lt;/h2&gt;

&lt;p&gt;The Windows Terminal is the brand-new, awesome terminal built by Microsoft. It is open-source, and it can run all your shells (PowerShell, bash, etc.) in multiple tabs. I love it!&lt;/p&gt;

&lt;p&gt;If you don't already have the Windows Terminal, install it now. It's free from the Microsoft Store app (that Store you probably never open...)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FMicrosoftStore.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FMicrosoftStore.png" title="Microsoft Store Terminal Page" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Terminal is still in Beta (at the time of this post) but it is very stable and I've been using it as a replacement for my usual Terminal.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configure Windows Terminal
&lt;/h2&gt;

&lt;p&gt;Launch the Terminal, click the arrow at the end of the tabs, and choose Settings. This opens a JSON file with all the Terminal settings in it. The format is pretty simple to work with.&lt;/p&gt;

&lt;p&gt;The settings should look something like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// To view the default settings, hold "alt" while clicking on the "Settings" button.
// For documentation on these settings, see: https://aka.ms/terminal-documentation

{
    "$schema": "https://aka.ms/terminal-profiles-schema",

    "defaultProfile": "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",

    "profiles":
    [
        {
            // Make changes here to the powershell.exe profile
            "guid": "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",
            "name": "Windows PowerShell",
            "commandline": "powershell.exe",
            "hidden": false
        },
        {
            // Make changes here to the cmd.exe profile
            "guid": "{0caa0dad-35be-5f56-a8ff-afceeeaa6101}",
            "name": "cmd",
            "commandline": "cmd.exe",
            "hidden": false
        },
        {
            "guid": "{b453ae62-4e3d-5e58-b989-0a998ec441b8}",
            "hidden": false,
            "name": "Azure Cloud Shell",
            "source": "Windows.Terminal.Azure"
        }
    ],

    // Add custom color schemes to this array
    "schemes": [],

    // Add any keybinding overrides to this array.
    // To unbind a default keybinding, set the command to "unbound"
    "keybindings": []
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The profiles are what we are interested in today. Each profile represents a specific shell to connect to. I have three that were discovered on installation: PowerShell, CMD, and Azure Cloud Shell. The default shell is set with the defaultProfile value above the profiles. &lt;/p&gt;

&lt;p&gt;NOTE: The GUIDs are random and have no inherent meaning. They just need to be unique, and one of them needs to be used by defaultProfile.&lt;/p&gt;

&lt;p&gt;To change the Terminal's background you can set the following values for each Profile. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;background&lt;/em&gt; changes the color of the Terminal's background. Use a hex RGB value in the #111111 format.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;backgroundImage&lt;/em&gt; changes the image that is displayed in the background of the Terminal. &lt;/p&gt;

&lt;p&gt;&lt;em&gt;backgroundImageOpacity&lt;/em&gt; changes the transparency on the image, allowing some of the background color to come through. This is useful to make the image more subtle.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;backgroundImageStretchMode&lt;/em&gt; controls how the image is displayed. Use "uniformToFill" to have the image fill the Terminal.&lt;/p&gt;

&lt;p&gt;For my Death Star Terminal, I found a sweet picture of a Death Star (technically the Death Star 2...). I stored that image in my OneDrive, so that I could easily reference it from Terminals on all my PCs. Use the %OneDriveConsumer% environment variable to reference your OneDrive folder path. &lt;/p&gt;

&lt;p&gt;I used a dark gray background color (#303030) with a backgroundImageOpacity of 50% (0.5) to fade the image out some.&lt;/p&gt;

&lt;p&gt;Here are my final settings:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"background": "#303030",
"backgroundImage": "%OneDriveConsumer%/Terminal/deathstar.jpg",
"backgroundImageOpacity": 0.5,
"backgroundImageStretchMode": "uniformToFill",
"guid": "{61c54bbd-c2c6-5271-96e7-009a87ff44bf}",
"name": "Windows PowerShell",
"commandline": "powershell.exe",
"hidden": false
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The Terminal will auto-update immediately without reopening, which is awesome. It now looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminalPreFont.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminalPreFont.png" title="Death Star Terminal PreFont Update" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What About The Sweet Fonts?
&lt;/h2&gt;

&lt;p&gt;Right. I agree, my fonts and Terminal UI look super sweet. I can't take credit for that, though; I used posh-git, oh-my-posh, and a font named Delugia Nerd Font. To learn how to do this, check out &lt;a href="https://www.hanselman.com/blog/HowToMakeAPrettyPromptInWindowsTerminalWithPowerlineNerdFontsCascadiaCodeWSLAndOhmyposh.aspx" rel="noopener noreferrer"&gt;Scott Hanselman's excellent post&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After applying Scott Hanselman's instructions I end up with this beauty:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminal.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fscribblesofcode.com%2Fwp-content%2Fuploads%2F2020%2F01%2FDeathStarTerminal.png" title="Death Star Terminal" alt="alt text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapup
&lt;/h2&gt;

&lt;p&gt;So, it's that easy to create your own beautiful Windows Terminal. Try it out, if you use the Terminal all day, you might as well make it feel cozy.&lt;/p&gt;

&lt;p&gt;Let me know what you think of my Terminal and show me what your Terminal looks like!&lt;/p&gt;

</description>
      <category>tools</category>
      <category>windows</category>
    </item>
    <item>
      <title>Escape from Nuget Hell</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Sun, 15 Dec 2019 00:30:21 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/escape-from-nuget-hell-517m</link>
      <guid>https://dev.to/timothymcgrath/escape-from-nuget-hell-517m</guid>
      <description>&lt;p&gt;If you're a .NET developer, as Nuget has become more prolific, you have come to dread the following error: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;System.IO.FileLoadException: 'Could not load file or assembly 'Newtonsoft.Json, Version=12.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)'
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;It sometimes happens while debugging, it sometimes happens after deploying an app. It's not always Newtonsoft, but Newtonsoft is used everywhere, so it's an easy example.&lt;/p&gt;

&lt;p&gt;Using a small example project, I'm going to walk through an example that creates the problem and then a couple of solutions to solve the problem.&lt;/p&gt;

&lt;h2&gt;
  
  
  Example Project
&lt;/h2&gt;

&lt;p&gt;The example is a solution containing two .NET Framework projects (4.7.2). ReferenceTest.Console is a Console app that references ReferenceTest.Logic, a Class Library. They are both going to reference Newtonsoft.Json through Nuget, but they will use different versions of Newtonsoft.Json to replicate different issues.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Assembly Bindings Fix
&lt;/h2&gt;

&lt;p&gt;In this example, the two projects reference: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ReferenceTest.Console - Newtonsoft 12.0.3

ReferenceTest.Logic - Newtonsoft 9.0.1
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This is a pretty common issue where both projects are referencing different versions of a nuget package, but the parent project (Console) also references the higher version of the nuget package. This will error at runtime when code in the child project is called for the first time.&lt;/p&gt;

&lt;p&gt;This can be fixed by telling the child project (Logic) to reference the higher-version nuget package as well. (This is technically perilous, as v12 of Newtonsoft is not guaranteed to behave the same as v9, but that's an issue for another day.)&lt;/p&gt;

&lt;p&gt;How do we tell the child project (Logic) to use v12? AssemblyBindings! The AssemblyBinding will tell your app to use v12 of Newtonsoft instead of v9. It can be defined in the configuration file like this: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;runtime&amp;gt;
    &amp;lt;assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"&amp;gt;
    &amp;lt;dependentAssembly&amp;gt;
        &amp;lt;assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30AD4FE6B2A6AEED" culture="neutral"/&amp;gt;
        &amp;lt;bindingRedirect oldVersion="0.0.0.0-12.0.0.0" newVersion="12.0.0.0"/&amp;gt;
    &amp;lt;/dependentAssembly&amp;gt;
    &amp;lt;/assemblyBinding&amp;gt;
&amp;lt;/runtime&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The trick to making this work is to activate Auto-generate Binding Redirects in your Project's Properties page (right below the .NET Framework version). Activating this will automatically add the correct binding redirects to your project's app.config (in the debug/release folder, not in your source code).&lt;/p&gt;
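&lt;p&gt;Under the hood, that checkbox toggles the AutoGenerateBindingRedirects MSBuild property. If you'd rather set it by hand, add something like this to the .csproj (GenerateBindingRedirectsOutputType is, as far as I know, only needed for class library output types):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;PropertyGroup&amp;gt;
    &amp;lt;AutoGenerateBindingRedirects&amp;gt;true&amp;lt;/AutoGenerateBindingRedirects&amp;gt;
    &amp;lt;GenerateBindingRedirectsOutputType&amp;gt;true&amp;lt;/GenerateBindingRedirectsOutputType&amp;gt;
&amp;lt;/PropertyGroup&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;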

&lt;h2&gt;
  
  
  What AssemblyBindings Won't Fix
&lt;/h2&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ReferenceTest.Console - 9.0.1

ReferenceTest.Logic - 12.0.3
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;This issue cannot be resolved with a binding redirect. This is a case where the child project references a higher version than the parent, so it will error at runtime when the child project is loaded. You can't fix this with an assembly binding unless you point v12 back to v9, which seems especially dangerous to do.&lt;/p&gt;

&lt;p&gt;The only way I know to fix this is to update the parent project (Console) to use 12.0.3 as well. This works, but it means that any time you add or update a NuGet package in a lower-level project, you run the risk of creating this error in a parent project, and you won't know about it till runtime.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Can I Catch This Sooner?
&lt;/h2&gt;

&lt;p&gt;This is where the new Project file format will save us.&lt;/p&gt;

&lt;p&gt;The new project format has numerous advantages. It automatically includes all the files in the folder instead of only the ones listed in the project file, it eliminates a number of unnecessary project file attributes, and it improves NuGet references.&lt;/p&gt;

&lt;p&gt;Converting the projects by hand is cumbersome, so I use &lt;a href="https://github.com/hvanbakel/CsprojToVs2017?WT.mc_id=-blog-scottha"&gt;this migration tool&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Running this tool on both projects makes nice, tidy .csproj files using PackageReference instead of a separate packages.config file.&lt;/p&gt;
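&lt;p&gt;For reference, a migrated project file ends up looking roughly like this (the target framework here is an illustrative assumption, not taken from the sample solution):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;Project Sdk="Microsoft.NET.Sdk"&amp;gt;

  &amp;lt;PropertyGroup&amp;gt;
    &amp;lt;TargetFramework&amp;gt;net472&amp;lt;/TargetFramework&amp;gt;
  &amp;lt;/PropertyGroup&amp;gt;

  &amp;lt;ItemGroup&amp;gt;
    &amp;lt;PackageReference Include="Newtonsoft.Json" Version="12.0.3" /&amp;gt;
  &amp;lt;/ItemGroup&amp;gt;

&amp;lt;/Project&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;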

&lt;p&gt;But the best part is a new compile error shows up for our previous example:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Error   NU1605  Detected package downgrade: Newtonsoft.Json from 12.0.3 to 9.0.1. Reference the package directly from the project to select a different version. 
ReferenceTest.Console -&amp;gt; ReferenceTest.Logic -&amp;gt; Newtonsoft.Json (&amp;gt;= 12.0.3) 
ReferenceTest.Console -&amp;gt; Newtonsoft.Json (&amp;gt;= 9.0.1) ReferenceTest.Console
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The compiler has figured out that there is an incompatible NuGet reference and that we need to upgrade the host project to resolve it. So now we get our error at compile-time instead of at run-time. Beautiful!&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapup
&lt;/h2&gt;

&lt;p&gt;So, we learned that there are two different ways we can get into trouble with NuGet package references. Some can be handled with assembly bindings, but for the best experience you need to migrate to the new .csproj format so the problem surfaces at compile-time.&lt;/p&gt;

</description>
      <category>dotnet</category>
      <category>nuget</category>
    </item>
    <item>
      <title>Start the Day By Planning the Day</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Wed, 06 Nov 2019 02:20:07 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/start-the-day-by-planning-the-day-4n7a</link>
      <guid>https://dev.to/timothymcgrath/start-the-day-by-planning-the-day-4n7a</guid>
      <description>&lt;p&gt;It's a new day! What should I work on? Where should I dedicate my time?  &lt;/p&gt;

&lt;p&gt;Taking 10 or 15 minutes every morning to plan the day is the most important step. This is your chance to look at everything on your plate and determine what needs to be done today. You cannot do everything, so prioritization is critical.&lt;/p&gt;

&lt;p&gt;I have multiple inputs to my morning planning:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Emails. I know, I know, everyone says don't look at your emails first thing! But, I pretty much ignore my email for the rest of the day, so the first thing I do is get to Inbox Zero every morning. The email either gets deleted or it ends up in my todo list to be prioritized.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Help Desks. We have an internal help desk system that contains requests that could be from another department, or from a customer through our support department. I check this every morning and add any new items to my todo list.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Team Kanban Board. This is where the actual projects my team is working on exist, and is preferably where I spend most of my time. This includes new projects and bugs.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Todoist. I'm a huge Todoist fan (I'll talk more about this in a future post). It's just a Todo app, though... use your Todo app of choice. The important thing is that it contains everything that I need to get done that is not in the Team Kanban board. Any emails, help desks, or work that just showed up at my door goes here. It's the one source I look at for all my open work (except for my kanban board, which I do not duplicate here).&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So, now all of the emails and help desks have been consolidated into my todo list, along with anything else left over from a previous day or scheduled for today. I'm ready to prioritize.  &lt;/p&gt;

&lt;p&gt;Each todo item can be organized as:  &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Today - These items are important and urgent and need to be accomplished today.
&lt;/li&gt;
&lt;li&gt;Future - These items are important but not urgent, so I reschedule them in Todoist to show up on a future day (either tomorrow or some other future date).
&lt;/li&gt;
&lt;li&gt;Delegate - These items are important but don't need to be completed by me. They can be assigned to a teammate or other employee. I assign them out, and if important, I set up a todo to check in on it in a few days.
&lt;/li&gt;
&lt;li&gt;Delete - These items are no longer important and can be completely removed.
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I now have a list of everything that needs to be done today that is outside of what I consider my team's work. It's a great day when this list is empty!&lt;/p&gt;

&lt;p&gt;I then look at the Kanban board and determine today's priorities. I always start at the right side of the Kanban board and move to the left. If an item needs to be deployed to Production, that is most important. Then QA bug fixes, then QA deployments, then new development, then design.  &lt;/p&gt;

&lt;p&gt;I now take all these items (todo list and kanban board priorities) and reprioritize them on my office white board. I enjoy being able to glance at my whiteboard and see where I'm at on the list I created first thing this morning. The final whiteboard list drives my day.  &lt;/p&gt;

&lt;p&gt;Any new interruptions that come in do not get worked on unless they are critical. Interruptions get sent to my todo list to be prioritized the next morning.  &lt;/p&gt;

&lt;p&gt;At the end of the day, I re-synchronize my todo list by removing everything I've accomplished that day. Anything that was not completed gets rescheduled for tomorrow, so that it can be reprioritized the next morning.&lt;/p&gt;

&lt;p&gt;Working and prioritizing this way has helped me keep stress under control by always knowing that I am working on what I planned instead of constantly reacting to every interruption during the day. It has also greatly improved my reliability; it is very difficult for a task to slip through the cracks. If it's important, it will get done.&lt;/p&gt;

&lt;p&gt;In future posts, I'll give more details on how I use Todoist.  &lt;/p&gt;

&lt;p&gt;Let me know how you plan your day!  &lt;/p&gt;

</description>
      <category>productivity</category>
    </item>
    <item>
      <title>Enums &amp; APIs</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Sun, 01 Sep 2019 17:49:30 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/enums-apis-15n4</link>
      <guid>https://dev.to/timothymcgrath/enums-apis-15n4</guid>
      <description>&lt;p&gt;Enums are a double-edged sword. They are extremely useful to create a set of possible values, but they can be a versioning problem if you ever add a value to that enum.&lt;/p&gt;

&lt;p&gt;In a perfect world, an enum represents a closed set of values, so versioning is never a problem because you never add a value to an enum. However, we live in the real, non-perfect world and what seemed like a closed set of values often turns out to be open.&lt;/p&gt;

&lt;p&gt;So, let's dive in.&lt;/p&gt;

&lt;h3&gt;
  
  
  Beer API
&lt;/h3&gt;

&lt;p&gt;My example API is a Beer API! &lt;/p&gt;

&lt;p&gt;I have a GET that returns a Beer, and a POST that accepts a Beer.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[HttpGet]
public ActionResult&amp;lt;Models.Beer&amp;gt; GetBeer()
{
    return new ActionResult&amp;lt;Models.Beer&amp;gt;(new Models.Beer()
    {
        Name = "Hop Drop",
        PourType = Beer.Common.PourType.Draft
    });
}

[HttpPost]
public ActionResult PostBeer(Models.Beer beer)
{
    return Ok();
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The Beer class:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class Beer
{
    public string Name { get; set; }

    public PourType PourType { get; set; }

}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;And the PourType enum:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public enum PourType
{
    Draft = 1,
    Bottle = 2
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The API also converts all enums to strings instead of integers, which I recommend as a best practice. &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;        services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_2)
                .AddJsonOptions(options =&amp;gt;
                {
                    options.SerializerSettings.Converters.Add(new Newtonsoft.Json.Converters.StringEnumConverter());
                });
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
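&lt;p&gt;With the converter registered, the GET response serializes the enum by name rather than by number, so the payload looks something like this (camelCase property names are the ASP.NET Core default):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "name": "Hop Drop",
  "pourType": "Draft"
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;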

&lt;p&gt;So, the big question comes down to this definition of PourType in the Beer class.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public PourType PourType { get; set; }
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Should it be this instead? &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public string PourType { get; set; }
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;We're going to investigate this question by considering what happens if we add a new value to PourType: Can = 3.&lt;/p&gt;

&lt;p&gt;Let's look at the pros/cons.&lt;/p&gt;

&lt;h3&gt;
  
  
  Define As Enum
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Pros
&lt;/h4&gt;

&lt;p&gt;When you define PourType as an Enum on Beer, you create discoverability and validation by default. When you add Swagger (as you should do), it defines the possible values of PourType as part of your API. Even better, when you generate client code off of the Swagger, it defines the Enum on the client-side, so they can easily send you the correct value.&lt;/p&gt;

&lt;h4&gt;
  
  
  Cons
&lt;/h4&gt;

&lt;p&gt;Backwards compatibility is now an issue. When we add Can to the PourType, we have created a new value that the client does not know about. So, if the client requests a Beer, and we return a Beer with the PourType of Can, it will error on deserialization. &lt;/p&gt;

&lt;h3&gt;
  
  
  Define As String
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Pros
&lt;/h4&gt;

&lt;p&gt;This allows new values to be backwards compatible with clients as far as deserialization goes. This will work great in cases where the client doesn't actually care about the value or the client never uses it as an enum. &lt;/p&gt;

&lt;p&gt;However, from the API's perspective, you have no idea if that is true or not. It could easily cause a runtime error anyway. If the client attempts to convert it to an enum it will error. If the client is using the value in an IF or SWITCH statement, it will lead to unexpected behavior and possibly error. &lt;/p&gt;
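&lt;p&gt;To make that concrete, here is a sketch of client-side logic on the string value (ServeFromTap and OpenBottle are hypothetical client methods). A brand-new "Can" value silently falls into the default case:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;switch (beer.PourType)
{
    case "Draft":
        ServeFromTap();
        break;
    case "Bottle":
        OpenBottle();
        break;
    default:
        // "Can" lands here, which the client never planned for.
        break;
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;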

&lt;h4&gt;
  
  
  Cons
&lt;/h4&gt;

&lt;p&gt;The biggest issue is that discoverability is gone. The client has to pass a string, but it has no idea what the possible set of values is. &lt;/p&gt;

&lt;p&gt;This could be handled with documentation, but documentation is notoriously out of date and defining it on the API is a much easier process for a client. &lt;/p&gt;

&lt;h3&gt;
  
  
  So What Do We Do?
&lt;/h3&gt;

&lt;p&gt;Here's what I've settled on.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enum!
&lt;/h2&gt;

&lt;p&gt;The API should describe itself as completely as possible, including the possible values for each enum. Without them, the client has no idea what it can send or expect to receive. &lt;/p&gt;

&lt;p&gt;So, a new enum value should be considered a version change to the API.&lt;/p&gt;

&lt;p&gt;There are a couple ways to handle this version change.&lt;/p&gt;

&lt;h4&gt;
  
  
  Filter
&lt;/h4&gt;

&lt;p&gt;The V1 controller could now filter the Beer list to remove any Beers that have a PourType of Can. This may be okay if a Beer only makes sense to clients that can understand its PourType. &lt;/p&gt;

&lt;h4&gt;
  
  
  Unknown Value
&lt;/h4&gt;

&lt;p&gt;The Filter method will work in some cases, but in other cases you may still want to return the results because that enum value is not a critical part of the resource. &lt;/p&gt;

&lt;p&gt;In this case, make sure your enum has an Unknown value. It will need to be there at V1 for this to work. When the V1 controller gets a Beer with a Can PourType, it can change it to Unknown. &lt;/p&gt;

&lt;p&gt;Here's the enum for PourType:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public enum PourType
{
    /// &amp;lt;summary&amp;gt;
    /// Represents an undefined PourType, could be a new PourType that is not yet supported.
    /// &amp;lt;/summary&amp;gt;
    Unknown = 0,
    Draft = 1,
    Bottle = 2
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Because Unknown was listed in the V1 API contract, all clients should have anticipated Unknown as a possibility and handled it. The client can determine how to handle this situation... it could have no impact, it could have a UI to show the specific feature is unavailable, or it could choose to error. The important thing is that the client should already expect this as a possibility.&lt;/p&gt;
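&lt;p&gt;The V1 mapping itself can be as simple as this sketch (a hypothetical helper, assuming Bottle is the highest value the V1 contract knows about):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;private static Models.Beer ToV1(Models.Beer beer)
{
    // Any value newer than the V1 contract (e.g. Can) becomes Unknown.
    if (beer.PourType &amp;gt; PourType.Bottle)
    {
        beer.PourType = PourType.Unknown;
    }

    return beer;
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;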

&lt;h3&gt;
  
  
  Resource Solution
&lt;/h3&gt;

&lt;p&gt;One thing that should be considered in this situation is that the enum is actually a resource. &lt;/p&gt;

&lt;p&gt;PourType is a set of values that could expand as more ways to drink Beer are invented (Hooray!). It may make more sense to expose the list of PourType values from the API. This prevents any version changes when the PourType adds a new value. &lt;/p&gt;

&lt;p&gt;This works well when the client only cares about the list of values (e.g. displaying the values in a combobox). But if the client needs to write logic based on the value it can still have issues with new values, as they will land in the default case. &lt;/p&gt;

&lt;p&gt;Exposing the enum as a resource also allows additional behavior to be added to the value, which can help with client logic. For example, we could add a property to PourType for RequiresBottleOpener, so the client could make logic decisions without relying on the "Bottle" value, but just on the RequiresBottleOpener property.&lt;/p&gt;

&lt;p&gt;The PourType resource definition:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;public class PourType
{
    public string Name { get; set; }

    public bool RequiresBottleOpener {  get; set; }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The PourType controller:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[HttpGet]
public ActionResult&amp;lt;IEnumerable&amp;lt;PourType&amp;gt;&amp;gt; GetPourTypes()
{
    // In real life, store these values in a database.
    return new ActionResult&amp;lt;IEnumerable&amp;lt;PourType&amp;gt;&amp;gt;(
        new List&amp;lt;PourType&amp;gt;{
                new PourType {Name = "Draft"},
                new PourType {Name = "Bottle", RequiresBottleOpener = true},
                new PourType {Name = "Can"}
        });
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;However, this path does increase complexity at the API and client, so I do not recommend this for every enum. Use the resource approach when you have a clear case of an enum that will have additional values over time. &lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;I have spent a lot of time thinking about this and I believe this is the best path forward for my specific needs. &lt;/p&gt;

&lt;p&gt;If you have tackled this issue in a different way, please discuss in the comments. I don't believe there is a perfect solution to this, so it'd be interesting to see others' solutions.&lt;/p&gt;

</description>
      <category>api</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>The Great Azure DevOps Migration - Part 6: Import</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Mon, 12 Aug 2019 20:39:19 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-6-import-2obc</link>
      <guid>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-6-import-2obc</guid>
      <description>&lt;p&gt;This is it! We've made it to the import step! This is when we finally move our data into Azure DevOps Service.&lt;/p&gt;

&lt;p&gt;If you missed the earlier posts, start &lt;a href="https://scribblesofcode.com/2019/07/13/the-great-azure-devops-migration/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I highly recommend &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;Microsoft’s Azure DevOps Service Migration Guide&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Detach Collection
&lt;/h3&gt;

&lt;p&gt;First, you need to detach the collection from TFS. Don't detach the database in SQL Server, but detach the collection in the Azure DevOps Server. &lt;/p&gt;

&lt;p&gt;To detach the collection, open the Azure DevOps Management Tool, go to Collections, and choose Detach on the collection that is going to be imported.&lt;/p&gt;

&lt;h3&gt;
  
  
  Generate the Database Backup
&lt;/h3&gt;

&lt;p&gt;If you have managed to keep your import under 30 GB, this step is fairly easy. If not, you are in for a harder import because you now need to move your database to a SQL Server Database in Azure. I won't cover the SQL Server migration as I did not do this step, but &lt;a href="https://docs.microsoft.com/en-us/azure/devops/migrate/migration-import?utm_source=ms&amp;amp;utm_medium=guide&amp;amp;utm_campaign=vstsdataimportguide&amp;amp;view=azure-devops#importing-large-collections"&gt;here is the guide&lt;/a&gt; on how to do this.&lt;/p&gt;

&lt;p&gt;So, if you are going the under 30 GB route, you need to create a DACPAC that is going to be imported to Azure DevOps Service. You should be able to run the DACPAC tool from your Developer Command Prompt for Visual Studio or from the following location: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\150
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;Here is the packaging command: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SqlPackage.exe /sourceconnectionstring:”Data Source=localhost;
Initial Catalog=[COLLECTION_NAME];Integrated Security=True” 
/targetFile:C:\dacpac\Tfs_DefaultCollection.dacpac 
/action:extract 
/p:ExtractAllTableData=true 
/p:IgnoreUserLoginMappings=true 
/p:IgnorePermissions=true 
/p:Storage=Memory
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;After the packaging is completed, you will have a new DACPAC at C:\dacpac\ with all your import data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Upload the Package
&lt;/h3&gt;

&lt;p&gt;We're not going to upload the package directly into Azure DevOps Service. First, we need to upload it to Azure itself. And then we'll point Azure DevOps Service at the DACPAC in Azure.&lt;/p&gt;

&lt;p&gt;The easiest way to do this is to install the &lt;a href="https://azure.microsoft.com/en-us/features/storage-explorer/"&gt;Azure Storage Explorer&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Open the Azure Storage Explorer app.&lt;br&gt;
Choose Add Azure Account.&lt;br&gt;
Login to your Azure Account.&lt;br&gt;
Go to Azure Storage Container.&lt;br&gt;
Create a new Blob Container named DACPAC.&lt;br&gt;
Upload the DACPAC file created by SqlPackage.exe.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create the SAS Key
&lt;/h3&gt;

&lt;p&gt;You need to create a secret key that will allow Azure DevOps Service to access the DACPAC. &lt;/p&gt;

&lt;p&gt;In Azure Storage Explorer, right-click the DACPAC folder and choose Get Shared Access Signature...&lt;/p&gt;

&lt;p&gt;Set the expiration to one week from today.&lt;br&gt;
Give it read/list rights, nothing else.&lt;br&gt;
Copy the URL for the SAS Key.&lt;/p&gt;

&lt;p&gt;This SAS URL should be placed in the import.json file that was in the Logs folder from earlier. Set it in the Source.Location field.&lt;/p&gt;

&lt;h3&gt;
  
  
  Import
&lt;/h3&gt;

&lt;p&gt;That's it! We are ready to start the import!&lt;/p&gt;

&lt;p&gt;Run the following command from the Data Migration Tool folder: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Run Migrate import /importfile:[IMPORT-JSON-LOCATION]
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The import will begin and the command will provide a link to view the status of your import.&lt;/p&gt;

&lt;p&gt;It does take a few minutes before you can even see the import page, so don't panic.&lt;/p&gt;

&lt;p&gt;Once the import began, it took about two hours to complete for me... so this is a good time to take a break.&lt;/p&gt;

&lt;h3&gt;
  
  
  Validation
&lt;/h3&gt;

&lt;p&gt;You did it! Your migration to Azure DevOps is completed. You should now verify that everything is working correctly. &lt;/p&gt;

&lt;h4&gt;
  
  
  Users
&lt;/h4&gt;

&lt;p&gt;First, verify your list of users. You can find your users in the Organization Settings. I had to eliminate a lot of users that did not need access to the service. You should then set the correct Access Level for your actual users. We have a number of VS Enterprise subscriptions that I used for most of my developers, and our contractors received Basic access. Most importantly, make sure all users are listed that should be.&lt;/p&gt;

&lt;p&gt;This is a great chance to see how much Azure DevOps Service is actually going to cost you, so make sure you set this up just like your Production environment will be.&lt;/p&gt;

&lt;h4&gt;
  
  
  Source Control
&lt;/h4&gt;

&lt;p&gt;Because you moved your Git source control, you don't actually need to re-clone it; you can just redirect your existing local repo to the new location. &lt;/p&gt;

&lt;p&gt;You can change your local repo origin with the following command (you can find the REMOTE_GIT_REPO in the Clone button in Azure DevOps Service - Repos - Files).&lt;/p&gt;


&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;git remote set-url origin [REMOTE_GIT_REPO]&lt;br&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;h4&gt;
&lt;br&gt;
  &lt;br&gt;
  &lt;br&gt;
  Billing&lt;br&gt;
&lt;/h4&gt;

&lt;p&gt;Make sure your Billing Account is configured for the service. When you do your Production migration, this is important. You won't be billed till the first of the next month, so make sure you have Billing and Users setup by the end of the month.&lt;/p&gt;

&lt;h4&gt;
  
  
  Build / Release Agents
&lt;/h4&gt;

&lt;p&gt;Any local Build / Release agents will need to be reconfigured. I only had about 10 agents running locally, so I chose to just remove them and reinstall them after the final Production run. The PowerShell command makes this very easy.&lt;/p&gt;

&lt;p&gt;I did not test this with the Dry Run; I simply reconfigured it after the Production migration and everything worked smoothly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Final Import
&lt;/h3&gt;

&lt;p&gt;And that is it!&lt;br&gt;
We had very few other issues, the dry run went well and the Production migration a few weeks later went very smoothly.&lt;/p&gt;

&lt;p&gt;For the final migration, I simply repeated the steps of this Guide and changed the import.json to use Production instead of Dry-Run.&lt;/p&gt;

&lt;p&gt;I turned off our local TFS server and am keeping it around but off in case we need the legacy code.&lt;/p&gt;

&lt;p&gt;The main thing that came up after final migration was setting Permissions for Users correctly, but I simply adjusted these settings as we went. &lt;/p&gt;

&lt;p&gt;Some users had issues with non-Visual Studio tools being unable to connect to the remote repo, but setting their GIT Credentials in Azure DevOps Service - Repos - Files - Clone fixed the issue.&lt;/p&gt;

&lt;p&gt;I hope you have learned from my efforts, and if you have any questions, let me know!&lt;/p&gt;

</description>
      <category>tfs</category>
      <category>azure</category>
      <category>devops</category>
      <category>azuredevops</category>
    </item>
    <item>
      <title>The Great Azure DevOps Migration - Part 5: Prepare</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Wed, 07 Aug 2019 02:45:27 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-5-prepare-1g7f</link>
      <guid>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-5-prepare-1g7f</guid>
      <description>&lt;p&gt;We've validated that our data is ready for import. Now, we need to prepare the data to be imported!&lt;br&gt;
This is a short step, so let's enjoy the ease of this one. &lt;/p&gt;

&lt;p&gt;If you missed the earlier posts, start &lt;a href="https://scribblesofcode.com/2019/07/13/the-great-azure-devops-migration/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I highly recommend &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;Microsoft’s Azure DevOps Service Migration Guide&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Prepare Command
&lt;/h3&gt;

&lt;p&gt;In the same way that we used the Migrator validate command earlier, we need to run a Migrator prepare command. This re-runs the validation but also creates a .json file that will be used for the actual import process. &lt;/p&gt;

&lt;p&gt;So, open Powershell to the directory that contains the Migrator.exe file (in the DataMigrationTool download). Execute the cmd below: &lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Migrator prepare /collection:[COLLECTION_ADDRESS] tenantdomainname:[AZURE_TENANT_NAME] /region:CUS
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;I recommend using the localhost address of your collection to verify that you are pointed at the right server. The tenant domain name is the Azure Active Directory that it will connect to for your newly imported data. The region must come from a narrow list of supported Azure regions, so make sure you choose one of them. View the full list &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Execute the command and you will see results similar to the validation run earlier. &lt;/p&gt;

&lt;p&gt;If all goes well, you will find the new import.json file in the Logs folder of the DataMigrationTool. Inside Logs, open the newest folder, and open the import.json file in a text editor. &lt;/p&gt;

&lt;p&gt;There are a bunch of fields in this file, but we only care about the ones at the very top. Update the following fields: &lt;br&gt;
Target.Name - The name of your organization that will be created at Azure DevOps. &lt;br&gt;
Properties.ImportType - DryRun for this initial test. &lt;br&gt;
The two source fields will be updated in the next post.&lt;/p&gt;

&lt;h3&gt;
  
  
  Azure Storage
&lt;/h3&gt;

&lt;p&gt;Next, you need to set up an Azure Storage Container. This is the location you will move the file containing all of your TFS data to before importing it into Azure DevOps Service.&lt;/p&gt;

&lt;p&gt;In Azure, you just need to create a new Standard Storage Container. This container has to be created in the same data center region as you set in the import.json file. So make sure you pay attention to that!&lt;/p&gt;

&lt;p&gt;I simply created a Standard Storage Container in Central US, easy.&lt;/p&gt;

&lt;h3&gt;
  
  
  What's Next?
&lt;/h3&gt;

&lt;p&gt;We're so close! Our data is now prepared for import!&lt;/p&gt;

&lt;p&gt;In the next step, we'll push the data to the Storage container and begin the import process!&lt;/p&gt;

</description>
      <category>tfs</category>
      <category>azure</category>
      <category>devops</category>
      <category>azuredevops</category>
    </item>
    <item>
      <title>The Great Azure DevOps Migration - Part 4: Validation</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Sat, 27 Jul 2019 02:53:50 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-4-validation-kh8</link>
      <guid>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-4-validation-kh8</guid>
      <description>&lt;p&gt;We have the staging server setup. We've cleaned out the data that we don't want to import. We're almost ready!&lt;/p&gt;

&lt;p&gt;We need to run the Azure DevOps Service validation on our local server to verify that there are no issues before importing. This validation will alert us to any issues that need to be resolved before the actual import.&lt;/p&gt;

&lt;p&gt;If you missed the earlier posts, start &lt;a href="https://scribblesofcode.com/2019/07/13/the-great-azure-devops-migration/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I highly recommend &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;Microsoft’s Azure DevOps Service Migration Guide&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Get the Tool
&lt;/h3&gt;

&lt;p&gt;Start by downloading the &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;Data Migration Tool&lt;/a&gt;. This tool contains the Guide, which you should definitely read, and the actual Migration tool.&lt;/p&gt;

&lt;p&gt;Copy the .zip for the Data Migration Tool to the staging server and unzip it to the C:.&lt;/p&gt;

&lt;h3&gt;
  
  
  Run Validation
&lt;/h3&gt;

&lt;p&gt;Open a command prompt and change directory to the unzipped DataMigrationTool folder. This folder contains the Migrator.exe file.&lt;/p&gt;

&lt;p&gt;To execute the validation, run Migrator validate /collection:[COLLECTION_NAME] from the command prompt. Make sure you are executing this on your staging server; use a localhost:8080 address to make sure you are pointed at the right server.&lt;/p&gt;
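&lt;p&gt;Spelled out with a typical on-premises collection URL (the exact /tfs/ path here is an assumption; use whatever address your staging server actually serves):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Migrator validate /collection:http://localhost:8080/tfs/[COLLECTION_NAME]
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;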

&lt;p&gt;The validation only takes a few minutes to run and it creates a number of log files with the results.&lt;/p&gt;

&lt;h3&gt;
  
  
  Analyze the Results
&lt;/h3&gt;

&lt;p&gt;You can view the results of the validation in the command prompt or in the log file stored in the Logs folder of the DataMigrationTool. Open Logs, select the Collection you validated, then click the latest folder (one is made for each migration validation), then open DataMigrationTool.log.&lt;/p&gt;

&lt;p&gt;I had a few issues that needed to be resolved, which I'll explain below. You'll probably get different ones which you can lookup in the &lt;a href="https://docs.microsoft.com/en-us/azure/devops/migrate/migration-troubleshooting?view=azure-devops"&gt;Migration Troubleshooting Guide&lt;/a&gt;. None of the issues I ran into were especially hard, just had to read up on the fix.&lt;/p&gt;

&lt;h4&gt;
  
  
  VS403443 Error
&lt;/h4&gt;

&lt;p&gt;This validation error means that you need to rename a work item field. This seems to happen with old databases that have been updated over time. The schema needs to be tweaked to get in sync with the service. &lt;/p&gt;

&lt;p&gt;I had about 8 of these to fix, which I was able to do with the witadmin tool inside the Developer Command Prompt.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;witadmin changefield /collection:[COLLECTION_NAME] /n:System.IterationId /name:"Iteration Id"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;

&lt;p&gt;The important thing (which I screwed up at first) is that the /n parameter is the field's existing reference name, and /name is the new friendly name you are changing it to.&lt;/p&gt;
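
&lt;p&gt;If you want to double-check a field before renaming it, witadmin can also print its current definition. A sketch (the collection URL here is a placeholder):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;witadmin listfields /collection:http://localhost:8080/tfs/DefaultCollection /n:System.IterationId
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;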

&lt;h4&gt;
  
  
  ISVError:100014 Error
&lt;/h4&gt;

&lt;p&gt;This error means that one of the built-in groups is missing required permissions. It needs to be re-added using the TFSSecurity.exe tool.&lt;/p&gt;

&lt;p&gt;Use the instructions at &lt;a href="https://docs.microsoft.com/en-us/azure/devops/migrate/migration-troubleshooting?view=azure-devops#isverror100014"&gt;Migration Troubleshooting&lt;/a&gt; to resolve this issue.&lt;/p&gt;

&lt;h3&gt;
  
  
  Users
&lt;/h3&gt;

&lt;p&gt;You may get some validation issues about users. In my test run, I chose to fix users after the import by making sure the correct users had access and removing the users that didn't belong.&lt;/p&gt;

&lt;p&gt;You won't be charged until the first of the month following the import, so you will have time to address any user import issues.&lt;/p&gt;

&lt;h3&gt;
  
  
  Space
&lt;/h3&gt;

&lt;p&gt;If you get a warning that your import is too large and must be done by importing into an Azure SQL Database first, your import is about to get a lot harder. I initially had this warning, and it is the reason I cleaned out more of the data in my import (in the previous step). If you can get under the size limit, it will make your life easier. If you can't, you'll need a few extra steps during the import; I won't be doing them here, but I'll provide a link to the guide on how to do it.&lt;/p&gt;

&lt;h3&gt;
  
  
  What's Next?
&lt;/h3&gt;

&lt;p&gt;We have successfully validated that the Import is ready to go!&lt;/p&gt;

&lt;p&gt;Next, we will prepare the actual Migration package.&lt;/p&gt;

</description>
      <category>tfs</category>
      <category>azure</category>
      <category>devops</category>
      <category>azuredevops</category>
    </item>
    <item>
      <title>The Great Azure DevOps Migration - Part 3: Clean</title>
      <dc:creator>Timothy McGrath</dc:creator>
      <pubDate>Wed, 24 Jul 2019 01:22:34 +0000</pubDate>
      <link>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-3-clean-4dhl</link>
      <guid>https://dev.to/timothymcgrath/the-great-azure-devops-migration-part-3-clean-4dhl</guid>
      <description>&lt;p&gt;Before migrating the TFS data into Azure DevOps, it's a good idea to eliminate any data that you don't need to move into the new service. Ten years of TFS has accumulated a huge amount of code, and I really only need to bring my latest repos forward. &lt;/p&gt;

&lt;p&gt;This part will show which data to eliminate and the quickest way to do it.&lt;/p&gt;

&lt;p&gt;If you missed the earlier posts, start &lt;a href="https://scribblesofcode.com/2019/07/13/the-great-azure-devops-migration/"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I highly recommend &lt;a href="https://azure.microsoft.com/en-us/services/devops/migrate/"&gt;Microsoft’s Azure DevOps Service Migration Guide&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Team Projects
&lt;/h2&gt;

&lt;p&gt;In my case, I had about 50 Team Projects. Each Team Project contains its own process template, code repository, and work items. &lt;/p&gt;

&lt;p&gt;I had about 50 of these because we imported from SourceSafe over 10 years ago and the import process set it up this way. We actually only use one of these projects on an ongoing basis. Over the last 10 years, we migrated the code for active projects into this main project so we could share work items and project templates. &lt;/p&gt;

&lt;p&gt;Because of this, I have about 50 extra projects that are rarely (if ever) worked on. None of them have their own work items. I don't want to bring any of these projects into the Azure DevOps service.&lt;/p&gt;

&lt;p&gt;The future plan is that if we need to access the code for one of these projects, we'll import it as a new GIT repository into our Azure DevOps Service project.&lt;/p&gt;

&lt;h3&gt;
  
  
  Delete Team Projects
&lt;/h3&gt;

&lt;p&gt;So, for Step 1, delete the unnecessary Team Projects. This is most easily done through the Web UI for TFS. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Make sure you are on the Staging TFS Web UI!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the Web UI, you need to access the Collection's Settings. In the breadcrumb trail, at the top of the UI, click the root (mine is DefaultCollection). Then click Admin Settings at the bottom left corner. This will show you the full list of projects in your collection. Click the ellipsis next to each project (except for the ones you want to keep) and click Delete. &lt;/p&gt;

&lt;p&gt;If any of these projects have a large code-base, this will take a long time. One of my big ones took over an hour, so be prepared to wait. &lt;/p&gt;

&lt;h2&gt;
  
  
  Team Foundation Version Control
&lt;/h2&gt;

&lt;p&gt;Before GIT, we had TFVC. TFVC was the only source control that TFS supported in the beginning, so if you've been using TFS for long, you probably have lingering TFVC repositories. &lt;/p&gt;

&lt;p&gt;We now exclusively use GIT, so I don't want to migrate any of the TFVC repositories to Azure DevOps Service. If you are using TFVC, you can migrate these repositories... but I recommend you move to GIT anyway because it's awesome.&lt;/p&gt;

&lt;p&gt;My current project that I'm migrating contains both GIT and TFVC, so I want to purge the TFVC before migration. You can't actually destroy the TFVC repository itself; it is there forever... but you can clear out everything inside of it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Delete Workspaces
&lt;/h3&gt;

&lt;p&gt;First, delete all the workspaces. TFS won't let you delete the code until the attached workspaces are gone. &lt;/p&gt;

&lt;p&gt;The best way I found to do this was using a blast from the past: &lt;a href="http://www.attrice.info/downloads/index.htm#tfssidekicks2015"&gt;TFS Sidekicks&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;TFS Sidekicks is a handy tool for TFVC but has fallen away as GIT has taken over. However, it still works in Azure DevOps Server 2019.&lt;/p&gt;

&lt;p&gt;Install it onto your staging server and run it. Go to the Workspaces tab, and highlight and delete all the workspaces. Easy!&lt;/p&gt;
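
&lt;p&gt;If you would rather skip the extra install, the same cleanup can be sketched with the tf command line from the Developer Command Prompt. The workspace name, owner, and collection URL below are hypothetical:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rem List every workspace in the collection
tf workspaces /owner:* /computer:* /collection:http://localhost:8080/tfs/DefaultCollection

rem Delete one workspace (format is workspacename;owner)
tf workspace /delete "OLD-BUILD-WS;CONTOSO\jdoe" /collection:http://localhost:8080/tfs/DefaultCollection
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;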

&lt;h3&gt;
  
  
  Delete Code
&lt;/h3&gt;

&lt;p&gt;Now for the code. The best way to delete the code from the database for good is the command-line tool tf destroy, which eliminates the code completely from the database. &lt;/p&gt;

&lt;p&gt;It is also very important that you include the /startcleanup parameter as that will tell the database to remove it immediately. Otherwise, it can take up to five days to be removed.&lt;/p&gt;

&lt;p&gt;There is one caveat: the tf destroy command will fail if it takes too long to run. So, if you have an enormous amount of code, you will need to destroy it in smaller chunks. &lt;/p&gt;

&lt;p&gt;I had a ton of branches persisted in TFVC, so I had to do it one branch at a time. It took a while, so maybe put on a TV show while you do this...&lt;/p&gt;

&lt;p&gt;The tf destroy command needs to be run from the Developer Command Prompt for Visual Studio. Type that into Start to find it.&lt;/p&gt;

&lt;p&gt;Then run tf destroy $/[REPOSITORY]/[FOLDER] /startcleanup&lt;/p&gt;

&lt;p&gt;If your repository is small enough, skip [FOLDER] and attempt to destroy it all in one run.&lt;/p&gt;
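
&lt;p&gt;Because destroy is irreversible, a cautious sketch is to preview each branch first and only then destroy it (the repository and branch names below are placeholders):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rem Show what would be destroyed, without deleting anything
tf destroy $/MainProject/Branches/OldBranch /preview

rem Actually destroy it and trigger immediate database cleanup
tf destroy $/MainProject/Branches/OldBranch /startcleanup
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;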

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Your old data is cleaned out! This may seem unnecessary, but there are two reasons to do this.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;We really want to be under 30 GB before migration to have the simplest migration possible. More on this later...&lt;/li&gt;
&lt;li&gt;This is a great opportunity to cut loose clutter that you no longer need. If you think you will need this code in the future, keep it. But in my case, I'm 99% sure I will never need it again. And if I do need it again, I want to migrate it to GIT anyway.&lt;/li&gt;
&lt;/ol&gt;
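
&lt;p&gt;If you have SQL access to the staging instance, one rough way to check where you stand against the size limit is to ask SQL Server for the collection database size. The database name below is a placeholder:&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Run in SQL Server Management Studio on the staging server
USE Tfs_DefaultCollection;  -- placeholder collection database name
EXEC sp_spaceused;          -- reports database size and unallocated space
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;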

&lt;h2&gt;
  
  
  What's Next?
&lt;/h2&gt;

&lt;p&gt;In the next post, we begin the Validation. &lt;/p&gt;

</description>
      <category>tfs</category>
      <category>azure</category>
      <category>devops</category>
      <category>azuredevops</category>
    </item>
  </channel>
</rss>
