As of the Windows 10 Anniversary Update, Windows includes an awesome new tool that allows users to have their very own Bash terminal on Windows without needing to run a Linux image in a hypervisor. This new tool is called the Windows Subsystem for Linux, and it shipped with its own copy of a Canonical-sanctioned variant of Ubuntu 14.04. Any experienced Linux user comparing it to a normal Ubuntu installation would probably identify a couple of things that stuck out as different. First, the kernel details list Microsoft; second, a lot of the Linux utilities would either not return the right information or just outright fail. Ansible, for example, is one such tool that I use and that didn’t work under this new subsystem because semaphores were not entirely supported yet.

What Microsoft did was essentially create their own kernel in what they call a “pico-process”, which is essentially a proxy that translates Linux kernel calls into the Windows kernel’s implementations (or uses bridges for things that don’t quite match up). This environment, even with the terminal window on the screen, doesn’t use up any system resources – until a command is executed. It’s pretty cool and, as of this blog post, entirely negated the need for me to run Fedora in VirtualBox locally to get a fully functional Bash terminal.

As of the Windows 10 Creators Update, the proxy-kernel has been updated and pretty much supports everything I could imagine wanting to run now, without any issues. Along with the kernel updates came an update to the Ubuntu image, which is now Ubuntu 16.04. The one thing I don’t like about the whole arrangement is the lock-in to Ubuntu, when I’m pretty much a fanboy of the RedHat ecosystem – RHEL, CentOS, Amazon Linux, and Fedora. But now there’s a way to not only use this subsystem with a different Linux distribution, but to use it with multiple distributions.


Getting Updated Ubuntu Image – Suggested way

This will literally nuke everything you have in the existing Ubuntu environment. Be sure to back up everything you care about NOW, before running the following commands.

  1. Close any existing Bash terminal windows you may have open
  2. Execute the following two commands
C:\> lxrun /uninstall /full /y
C:\> lxrun /install

After the last command, ensure that you enter the username and password of choice that you’ll use going forward with that environment.


Getting Updated Ubuntu Image – Easy way

This will keep everything you have in place and do an in-place upgrade, but Microsoft doesn’t suggest it. It would be a good idea to back things up anyway, even though you should theoretically be safe here.

$ sudo do-release-upgrade

I guess if you’re fine with Ubuntu, then you’re done. Have fun; see you in another year when I decide to blog again.

If, like me, you’re not fine with Ubuntu, then we have some more work to do…


Another Distribution – Downloading

The README for the GitHub project we’re about to clone shows the following images and tags are available as of August 18th:

  • debian – 8.5, 8, jessie, latest | jessie-backports | oldstable | oldstable-backports | sid | stable | stable-backports | stretch | testing | unstable | 7.11, 7, wheezy | wheezy-backports | rc-buggy | experimental
  • ubuntu – 12.04.5, 12.04, precise-20160707, precise | 14.04.5, 14.04, trusty-20160802, trusty | 16.04, xenial-20160809, xenial, latest | 16.10, yakkety-20160806.1, yakkety, devel
  • fedora – latest, 24 | 23 | 22 | 21 | rawhide | 20, heisenbug
  • centos – latest, centos7, 7 | centos6, 6 | centos5, 5 | centos7.2.1511, 7.2.1511 | centos7.1.1503, 7.1.1503 | centos7.0.1406, 7.0.1406 | centos6.8, 6.8 | centos6.7, 6.7 | centos6.6, 6.6 | centos5.11, 5.11
  • opensuse – 42.1, leap, latest | 13.2, harlequin | tumbleweed
  • mageia – latest, 5
  • oraclelinux – latest, 7, 7.2 | 7.1 | 7.0 | 6, 6.8 | 6.7 | 6.6 | 5, 5.11
  • alpine – 3.1 | 3.2 | 3.3 | 3.4, latest | edge
  • crux – latest, 3.1
  • clearlinux – latest, base

This will only download a Docker repository image of the distribution you’re interested in. It does not install it; that’s another step. You can run this multiple times for the distributions you’re interested in, if you choose.

  1. Install Python 3 in your Windows environment (not inside the Linux subsystem) and include it in your PATH.
  2. Open a Command Prompt or PowerShell prompt
  3. Execute the following command
git clone https://github.com/RoliSoft/WSL-Distribution-Switcher.git
  4. Move into the git directory
  5. Rename the hook_postinstall_all.sample.sh file to hook_postinstall_all.sh
  6. Execute the following command using the appropriate tags defined above (a concrete example follows the list)
python get-prebuilt.py [docker_image_identifier]:[image_version]
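
For example, grabbing the latest Fedora image using the tags listed above would look like this:

python get-prebuilt.py fedora:latest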


Another Distribution – Installing

Now we’re going to switch over to the distribution we want and just downloaded.

  1. Close any existing Bash terminal windows you may have open
  2. Execute the following command (a concrete example follows the list)
python install.py [docker_image_identifier]:[image_version]
  3. Open the Bash terminal by launching the “Bash on Ubuntu on Windows” link in the start menu again
  4. Execute the following command to verify the switch (assuming a dnf-based distribution like Fedora):
dnf --version
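
Continuing the Fedora example from the download step, the install command would be:

python install.py fedora:latest

If the switch worked, dnf --version will report the dnf version instead of failing as it would on the stock Ubuntu image.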


Another Distribution – Switching

You can repeat the download and install steps multiple times, once for each distribution you’re interested in. The last install you ran becomes the active distribution for the subsystem, but every install performed on the machine stays put, and the switcher makes a note of each one. Which makes this next step much awesome and so wow.

$ python switch.py
usage: ./switch.py image[:tag]

The following distributions are currently installed:

  - amazonlinux:latest
  - fedora:latest
  - ubuntu:trusty

To switch back to the default distribution, specify ubuntu:trusty as the argument.
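
Switching is then just a matter of passing the image and tag of any installed distribution:

$ python switch.py fedora:latest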

The downside is that you aren’t able to have multiple distributions active at the same time, but switching like this is still pretty awesome.

Well, now that I’ve shared that information, I’m going to go clear up 80 GB of disk space by removing VirtualBox and its assorted images.

For more information, see the blog post from Microsoft here and the README of the project you just blindly cloned to your machine and arbitrarily ran commands from because a blog post referenced it.

I’ve been working on a side project at home the last few days to snipe hard-to-find restaurant reservations and came across a weird issue I’ve never experienced before while using the .NET Framework.

The API I’m calling to find these opportunities returns a DateTime value in the format “2016-11-21T23:00:00-05:00”, which reads as “November 21st, 2016 11:00pm (EST)”. While we read the “-05:00” as the timezone offset that equates to the Eastern Time Zone, it appears the .NET Framework’s DateTime.Parse() method takes it as a hint to adjust the value to the machine’s local timezone instead, producing “November 22nd, 2016 4:00am” on a UTC machine. I can’t imagine any restaurant that is hard to get into being open at 4am local time.

While on my laptop (using EST) it was working just fine, deploying this to say… an Azure instance (using UTC) introduces just enough frustration to make you want to kick puppies and pop a small child’s balloon in passing.

I’ve seen it for years in the IntelliSense popup in Visual Studio without ever looking at it, and now I know why it’s there. The lifesaving DateTimeOffset type works just like the DateTime type, but when fed the value I needed parsed, it treats the timezone value as an offset (thus the type’s name), not as an adjustment hint.
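
A quick illustration of the difference (a sketch; the DateTime.Parse() result shown assumes a machine whose local timezone is UTC, like the Azure instance above):

using System;

class Program
{
    static void Main()
    {
        var raw = "2016-11-21T23:00:00-05:00";

        // DateTime.Parse() treats the offset as a conversion hint and shifts
        // the value into the machine's local timezone. On a UTC server this
        // prints 11/22/2016 4:00:00 AM - the 4am phantom reservation.
        Console.WriteLine(DateTime.Parse(raw));

        // DateTimeOffset.Parse() keeps the value and its offset together, so
        // this prints 11/21/2016 11:00:00 PM -05:00 on any machine.
        Console.WriteLine(DateTimeOffset.Parse(raw));
    }
}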


Hopefully you see this before you do anything horrible to a young canine or child. For the record, doing either is mean. Asshole.

If you’ve ever had problems trying to target particular .NET frameworks, this cheat sheet put together by @bradwilson, @onovotny, and myself a while ago might help. I’m not going to go into depth about what PCLs are or what targeting means with the new frameworks – I think Oren Novotny did a great job over on his blog with his articles…

Below is a static image of it at the time of this blog posting, or you can see the live version on OneDrive, which will reflect any updates shared here. I suggest opening it in full-fledged Excel, as vertically rotated columns don’t render in the online version. If you see any updates that are needed, you can contact me on Twitter @William_Holroyd.

Continue reading

Getting started on a microservice adoption isn’t trivial given the challenges of migrating existing data, merging with other data sets, keeping yourself in a ‘known good state’, and the new requirement of honoring contracts long after you’ve moved on to a new way of doing something. At least it’s just a technical problem that can easily be changed and monitored over the course of its implementation. I covered some options to start that half of the work in my prior microservice article, so it isn’t a goal here.

The real challenge comes in the other half of the work to lay the foundation for the microservices effort – the path to a DevOps-supporting culture – which is our goal for this sub-section of the “working towards microservices” series. It’s the harder of the two halves given that it’s a business problem that can’t be easily changed or monitored – it requires changing the organization’s culture. Unlike computers, which change their logic processing with a deployment, humans are pretty stubborn when confronted with new information or conflicting ideas and thoughts.

You can’t directly change the culture, just like you can’t directly change a person’s personality. You can change the culture indirectly by modifying behavior through the implementation of an intrinsic motivation and reward system, or by working towards a social-norm culture.

Continue reading

There has been a lot of noise in the UNIX/Linux world recently after a presentation at the last Bay Area FreeBSD Users Group meetup, where a couple of guys patched the most recent FreeBSD release with a bunch of services like Mach IPC, launchd, notifyd, asld, and libdispatch that are heavily used by Apple and the Darwin kernel. This “science project”, as they have called it, is NextBSD – taking its name from the NeXT components that eventually became part of OS X and the Darwin kernel.

The video of the meeting is available here.

The goal, as I’ve been able to understand it at this point, isn’t an entire fork of FreeBSD, but a parallel development that utilizes FreeBSD as its core – just with changes to how it’s compiled, essentially how Fedora Core is, well… a core to RHEL/CentOS. Not only are there inclusions of components found in the Darwin kernel, but they also hope to incorporate pieces of HardenedBSD as well for security reasons.

Continue reading

In the work that I’ve done recently evangelizing the principles behind our DevOps movement, it was interesting to hear the question of how microservices has anything to do with DevOps at all. It seems my blogging and marketing within the company is working: neither concept was widely known at the time I started there, and now people are grasping the concepts well enough to know the difference.

They are very much correct in stating that microservices is an architectural concern and not something that DevOps dictates or even talks about. Even today, I am still marketing both DevOps and microservices in almost the same breath in most conversations. I have changed my messaging a bit to remove the confusion and put more emphasis on what each aims to resolve.

Continue reading

Ever needed to enumerate certificates installed on a remote machine using just C# and .NET, without having to use an agent? It’s a problem I’ve had a couple of times now and was able to figure out without the help of MSDN or StackOverflow. I discovered the X509Store class has the power to solve this problem for us, as it utilizes the C++ CertOpenStore functionality underneath, but it’s not documented anywhere. As a result, you can use some of the same functionality as the underlying library at the C# level, simply by doing this…

[gist https://gist.github.com/wholroyd/b7026197c485c6085c60]
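
In case the embedded gist doesn’t render, the core of the trick looks roughly like this (a sketch: “server01” is a placeholder machine name, and reaching a remote store still requires appropriate rights on that machine):

using System;
using System.Security.Cryptography.X509Certificates;

class RemoteCertificates
{
    static void Main()
    {
        // Undocumented: X509Store hands the store name straight down to
        // CertOpenStore, which accepts a UNC-style "\\machine\storeName"
        // path for remote machines.
        var store = new X509Store(@"\\server01\My", StoreLocation.LocalMachine);
        store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);

        foreach (var certificate in store.Certificates)
        {
            Console.WriteLine("{0} (expires {1})", certificate.Subject, certificate.NotAfter);
        }

        store.Close();
    }
}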

It’s evidently the second time I’ve had to do this type of lookup. Luckily I remembered to post the answer back to my own StackOverflow question years ago.

One of the projects I’ve spent the past month working on is a system called Foundation that will eventually become the company’s one-stop shop for automation and workflow management. It’ll keep track of everything from services to environments, and the resources they are using underneath in our private cloud and public cloud provider.

In the process of learning Code First Entity Framework for the first time (all previous projects were Database First), I came across an interesting event that you could use to prepare entities before they are committed to the database. For example, you have some entity properties that need to be checked at the last minute and possibly changed. You could do the following without having to layer in an abstract DbContextBase between your own context implementation and DbContext that overrides SaveChanges() to perform the same work before calling the same method off the base class…
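
Here’s a minimal sketch of what that wiring could look like, assuming the ObjectContext.SavingChanges event (reachable through IObjectContextAdapter); FoundationContext, IAuditable, and ModifiedOn are illustrative names, not the actual Foundation code:

using System;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Linq;

public interface IAuditable
{
    DateTime ModifiedOn { get; set; }
}

public class FoundationContext : DbContext
{
    public FoundationContext()
    {
        // SavingChanges fires once per SaveChanges() call, before anything
        // is written to the database - the last chance to fix up entities.
        ((IObjectContextAdapter)this).ObjectContext.SavingChanges += OnSavingChanges;
    }

    private void OnSavingChanges(object sender, EventArgs e)
    {
        // Stamp anything added or modified in this unit of work
        var entries = ChangeTracker.Entries()
            .Where(x => x.State == EntityState.Added || x.State == EntityState.Modified);

        foreach (var entry in entries)
        {
            var auditable = entry.Entity as IAuditable;
            if (auditable != null)
            {
                auditable.ModifiedOn = DateTime.UtcNow;
            }
        }
    }
}

Continue reading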

I want to start off by saying that having a monolithic application isn’t always a bad thing, and this article may not necessarily be for you. Yet. It just comes down to correctly timing the move to microservices, when it makes sense, and then diving into that work at the moment it’s needed, and not a moment later. Utilizing a microservices architecture too soon will hold you back and slow the development process down, whereas waiting too long to perform the migration makes the refactoring effort very painful.

  • If you have a single product that was designed well, is easily maintainable, and carries minimal technical debt, you may not have a lot of reasons to invest in a microservices architecture. Or, if certain areas are becoming concerns for performance and scalability, you may slowly split those areas out.
  • If you’re like the rest of us, dealing with multiple products from acquisitions, mergers, or reorganizations that were originally built in a time long ago before best practices existed for online services, there is little hope that they are maintainable or carrying minimal technical debt.

Continue reading

A lot of people have asked me over the past year or so why I left Seattle. Well, here are over 445 accolades (nine years’ worth) that the Raleigh area was awarded, which partially helped weigh the decision to move here over any other city in the United States. The reasons you would want to move here won’t necessarily match ours, so I didn’t filter them.

To be completely transparent with you, I am not currently (nor have I ever been) employed by any company on this list, and I have no financial incentives as a result of posting this. I just live here.

Continue reading