My developer's toolbelt 2016

I caught a tweet a few days ago asking about your developer's toolbelt, specifically on Windows. I gave a very short answer and mentioned that I would blog about it.

So, this is a more elaborate answer to the developer's toolbelt question. My default Windows developer installation contains the following:


Unobtrusive MSBuild: Using Git information in your assemblies

For my current project I wanted to add some information from Git as well as some additional build environment info into my assemblies at compile time.

My usual approach to this is adding some additional steps to my build process. I learned a lot about MSBuild from Sayed I. Hashimi (@sayedihashimi), who wrote the book Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build (by the way, a must-read if you want to dig into how MSBuild works). MSBuild is very powerful and easy to extend, so I think it's the best way to solve this.

Since I have developed some MSBuild tasks and targets for internal use at my workplace, I have started to build my MSBuild extensions in a way I like to call unobtrusive MSBuild. The idea is to design the extensions so that they can be used by just adding a project include and, optionally, setting a few configuration properties right before the include. This keeps them portable, reusable, and flexible enough to be used in slightly different environments.
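To make the pattern concrete, here is a minimal sketch of what a consuming project file would contain; the property and file names (BuildBranchProperty, GitInfo.targets) are made up for this example:

<!-- Inside the .csproj: optional configuration, read by the imported targets file. -->
<PropertyGroup>
  <BuildBranchProperty>BuildBranch</BuildBranchProperty>
</PropertyGroup>
<!-- The single include that activates the extension. -->
<Import Project="$(SolutionDir)build\GitInfo.targets"
        Condition="Exists('$(SolutionDir)build\GitInfo.targets')" />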

Ask a Ninja: Automated WordPress blog backup using Git

I thought I had posted this already, but the article list of my blog tells otherwise. Early this year I posted how I moved this blog from the old server to the current one. After that, I thought I could also automate the backup this way.

So, what are the required steps?

  • Create a dump of the database.
  • Add the dump and all local modifications to the local repository.
  • Commit the changes to the local repo.
  • Push to a remote repository.

In my case I like to play it safe and push to two remote repositories.
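If you have not set up the second remote yet, that is a one-time step; the remote name 'backup' and the URL here are placeholders:

git remote add backup ssh://git@backupserver/blog.git
git push -u backup master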

So, this is the script that will back up my blog and push it to my repos:

:: Switch to the blog's directory.
D:
cd D:\Webs\dotnetninja.de
:: Make mysqldump from MariaDB available on the PATH.
SET PATH=%PATH%;D:\MariaDB\bin
:: Recreate the database dump; --skip-dump-date keeps the dump stable,
:: so an unchanged database does not produce a spurious diff.
del backup.sql
mysqldump --skip-dump-date -u backup blog_dotnetninja.de > backup.sql
:: Commit everything and push to both remotes.
git add .
git commit -m "Automatic backup"
git push origin
git push backup master
exit

To automate the backup, I just created a simple scheduled task to execute this script once a day (a command-line sketch follows below).
Restoring the blog from the backup is as easy as described in my blog post about the move.
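For reference, such a task can also be registered from the command line; the task name, script path, and start time in this sketch are placeholders:

schtasks /Create /TN "Blog backup" /TR "D:\Webs\dotnetninja.de\backup.cmd" /SC DAILY /ST 04:00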

Custom deployment scripts – with MSTest – for Windows Azure Website Git deployment

I just started another project. It is hosted on Windows Azure and I'm using Git deployment for this website.

This went very smoothly, and I am extremely impressed by how easy it was to get started. Then I ran into a little problem.

Sidenote: My project relies on NuGet packages, and I personally hold the strong opinion that compiled artifacts do not belong in my source control system. This is why I did not check NuGet.exe into Git, but only the NuGet.config and NuGet.targets files, configured to download NuGet.exe when it is missing. Of course this makes my build dependent on a NuGet package server, but since I could host my own gallery on a custom domain and configure that domain in my NuGet.config, I could take control of this dependency at any time.
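If I remember the MSBuild-integrated package restore of that era correctly, the switch for this sits in NuGet.targets and just needs to be flipped to true:

<!-- In NuGet.targets: opt in to downloading NuGet.exe when it is missing. -->
<DownloadNuGetExe Condition=" '$(DownloadNuGetExe)' == '' ">true</DownloadNuGetExe>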

I wanted my project to incorporate the Git commit hash it was built from, the Git branch it was built from, and other little details. For that, the MSBuild Community Tasks project offers some nice helpers, so I added that project's NuGet package to my solution.
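To sketch the general idea of pulling Git information into a build (this sketch uses a plain Exec call instead of the community tasks; the target and property names are made up):

<Target Name="ReadGitInfo" BeforeTargets="CoreCompile">
  <!-- Capture the current commit hash into an MSBuild property. -->
  <Exec Command="git rev-parse HEAD" ConsoleToMSBuild="true" StandardOutputImportance="low">
    <Output TaskParameter="ConsoleOutput" PropertyName="GitCommitHash" />
  </Exec>
  <Message Text="Building commit $(GitCommitHash)" Importance="high" />
</Target>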

The Problems

Now there is a chicken-and-egg problem: when MSBuild encounters a UsingTask declaration, it loads the assembly that contains the task, and if that assembly is not there, using the task fails. The NuGet download of the packages - including the task library - happens as part of the build, that is, after the project files are loaded. So the freshly downloaded assembly is not found when the projects are imported and... the build fails.
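The construct in question looks something like this in the project file (the package path and version are made up and will differ in your solution):

<!-- Imports the targets file shipped inside the NuGet package. This import
     cannot be resolved while the package has not been restored yet. -->
<Import Project="..\packages\MSBuildTasks.1.4.0\tools\MSBuild.Community.Tasks.Targets" />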

To avoid this problem, I cheated a little on MSBuild: I added another project to my solution that also lists the MSBuild Community Tasks package in its packages.config. Then I manually set my web application project to be built after this 'BuildSupport' project. Now building the BuildSupport project downloads the community task library, which is then available when the import in the web application's project file is processed. It's just a small cheat, though.

Then the next problem: the BuildSupport project is not actually 'required' to build the website project, so the Git deployment build process simply does not build it. The task library is not downloaded before the actual build of the application runs, and so the build fails. I could not get Azure to build the 'BuildSupport' project before the actual web application.

The Solution

After a little research I found that this can be achieved with a custom deployment script.
I was a bit afraid I would have to figure out how the deployment actually works just to add a step in front of the compile, but there is some infrastructure in place to help us out with that.

For a .NET developer this will feel strange, but you'll need node.js first. The Windows Azure Command Line Tools are a node.js package, and we'll need them to get started with the actual deployment script. So, after installing node.js, we're going to install the package:

npm install azure-cli -g

This installs the Azure CLI globally for use on our console. Now we navigate to our solution directory and let the Azure CLI generate the deployment script that would run automatically to deploy our application to Azure if we didn't do anything custom:

azure site deploymentscript --aspWAP ApplicationFolder\Application.csproj -s Solution.sln

This generates two files. First, there is a .deployment file. It is structured like an old-fashioned .ini configuration file and tells Azure that there is a custom deployment script and what its name is. Its content is simply:

[config]
command = deploy.cmd

It also reveals the second generated file, the actual deployment script called deploy.cmd. This is the interesting part for us. I'm not posting the full script, but will rather go through its sections.

First, there is a check that node.js is available. It can be assumed to be available on Azure, but to test the deployment script locally you'll need node.js as well. We just installed it, so we're all set, but the next person checking out the solution could be missing it.

Then the script defines some environment variables for folders, like where the build artifacts will be placed and where the actual files to deploy will end up. The latter defaults to /artifacts/wwwroot, and both can be overridden by setting the corresponding environment variables before the deployment.
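From memory, that section of the generated script looks roughly like this; details may vary between versions of the generator:

:: Set up default paths; each can be overridden via the environment.
SET ARTIFACTS=%~dp0%..\artifacts
IF NOT DEFINED DEPLOYMENT_SOURCE SET DEPLOYMENT_SOURCE=%~dp0%.
IF NOT DEFINED DEPLOYMENT_TARGET SET DEPLOYMENT_TARGET=%ARTIFACTS%\wwwroot
IF NOT DEFINED NEXT_MANIFEST_PATH SET NEXT_MANIFEST_PATH=%ARTIFACTS%\manifest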

In a third step, the script checks whether Kudu is installed. Kudu is the actual deployment engine running on Azure, and it is also capable of running on your own machine. After that, additional paths are configured.
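Again from memory, the check works roughly like this: if no KuduSync command is configured yet, the node.js package gets installed and the variable is pointed at it:

IF NOT DEFINED KUDU_SYNC_CMD (
  :: Install KuduSync via npm and remember where it landed.
  echo Installing Kudu Sync
  call npm install kudusync -g --silent
  IF !ERRORLEVEL! NEQ 0 goto error
  SET KUDU_SYNC_CMD=%appdata%\npm\kuduSync.cmd
)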

In the fourth step the actual compiling and deployment work is done, and the fifth is just some error handling.

So, let's have a look at the actual important stuff in the file:

:: 1. Build to the temporary path
%MSBUILD_PATH% "%DEPLOYMENT_SOURCE%\MyApplication.Web\MyApplication.Web.csproj" /nologo /verbosity:m /t:Build /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir="%DEPLOYMENT_TEMP%";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%\.\\" %SCM_BUILD_ARGS%
IF !ERRORLEVEL! NEQ 0 goto error

:: 2. KuduSync
call %KUDU_SYNC_CMD% -v 50 -f "%DEPLOYMENT_TEMP%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
IF !ERRORLEVEL! NEQ 0 goto error

Now, that's actually very easy: MSBuild is called for the web application project, and then KuduSync is launched to do the actual deployment.

What we want now is to build the full solution upfront, so that all required NuGet packages are downloaded before the actual project is built. And while we're getting our hands dirty in a custom deployment script anyway, why not run the project's unit tests as part of the deployment? Then, if a test fails, the deployment fails too. I think that's a good idea.

So what I did was add these two steps just in front of the two default ones:

:: 1. Build solution
echo Build solution
%MSBUILD_PATH% "%DEPLOYMENT_SOURCE%\MySolution.sln" /nologo /verbosity:m /t:Build /p:_PackageTempDir="%DEPLOYMENT_TEMP%";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%\.\\" %SCM_BUILD_ARGS%
IF !ERRORLEVEL! NEQ 0 goto error

:: 2. Running tests
echo Running tests
vstest.console.exe "%DEPLOYMENT_SOURCE%\MyApplication.Web.Tests\bin\Release\MyApplication.Web.Tests.dll"
IF !ERRORLEVEL! NEQ 0 goto error

That's it. I just copied the build line and pointed it at my solution, and I added a call to the MSTest tooling (vstest.console.exe) to run my tests.

So, with very little tweaking, I could remove all dependencies on binaries I would otherwise have had to check in, and the Azure Git deployment runs my unit tests on every deployment. That's what I call easy and convenient.

This blog just moved

Hello from... Gallifrey.

As you should see in the footer, this blog just moved and is now hosted on my new server, Gallifrey. Previously it was located on smarthost01, admittedly a very boring name for a server.

Perhaps it's interesting for you how I moved this installation. It was done in a few easy steps (a condensed Git sketch follows the list). I...
1.) scripted a backup of the database (including the user) directly in my wordpress directory using HeidiSQL,
2.) initialized a new Git repository in my wordpress directory, added all files to the repo and committed, then deleted the SQL file again,
3.) created a new repo on my Stash evaluation instance on the new server,
4.) pushed the local copy to the new repo,
5.) cloned the fresh repo on Gallifrey,
6.) executed the SQL file using HeidiSQL on Gallifrey's MariaDB and then deleted the SQL file,
7.) configured a web site in IIS pointing at the directory, and applied the same file system permissions,
8.) tested the installation and switched the DNS entries from smarthost01 to Gallifrey.
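Condensed to the Git commands, the move looks like this (server names and paths are placeholders, and the HeidiSQL steps are left out):

:: On the old server, inside the wordpress directory:
git init
git add -A
git commit -m "Blog snapshot including database dump"
git remote add origin http://stash.example.local/scm/blog.git
git push -u origin master
:: On Gallifrey:
git clone http://stash.example.local/scm/blog.git D:\Webs\dotnetninja.de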

That was done in about 10 minutes in total. By the time of this posting, the DNS change should have propagated widely enough that you are seeing this served from Gallifrey. Moving a blog instance using Git is very easy and comfortable, so if you feel the urge to move your blog, this could be the way to go.