Static site gens the 2nd: Hexo and Sandra.Snow

In my recent blog post I wrote about my experiences so far with static site generators in general. I said I was going to look into Hexo before moving on to my plan B, and that is what I did.

Hexo is very capable. If you really just want a blog, then this is the way to go (imho). The main problem with Hexo is that it is a one-man show from China, and that this guy is currently in the middle of releasing Hexo 3.0. That is not a bad thing in itself, but for one, several plugins have not yet been updated, which makes it very hard to get things running. Beyond that, some plugins, like the sitemap plugin that should generate a sitemap.xml, do not have access to all entries for the tags and the categories. I could probably write my own, but while the API is somewhat documented, I haven't gotten around to configuring my WebStorm IDE so that it actually provides code completion for the Hexo API, which makes everything very tedious.
Continue reading "Static site gens the 2nd: Hexo and Sandra.Snow"

Ask a Ninja: Current state of static site generators

Over the course of the last weekend I tried to build a website for a side project of mine (gaming related). To broaden my horizon, and to be able to host the website cheaply and fast, I wanted to use a static site generator for it.

First try: Jekyll. TL;DR: Does not work on Windows.

Since Jekyll is directly supported by GitHub Pages, where I wanted to host, and a lot of other people on my Twitter timeline use Jekyll, I thought this was the way to go.
Continue reading "Ask a Ninja: Current state of static site generators"

Custom deployment scripts – with MSTest – for Windows Azure Website Git deployment

I just started another project. It is hosted on Windows Azure and I'm using Git deployment for this website.

This worked just fine, and I am extremely impressed by how easy it was to get started with it. Then I ran into a little problem.

Sidenote: My project relies on NuGet packages, and I, personally, hold the strong opinion that compiled stuff does not belong in my source code versioning system. This is why I did not check NuGet.exe into Git, but just the NuGet.config and NuGet.Targets files, configured to download NuGet.exe when it is missing. Of course this makes my build dependent on a NuGet package server, but since I could host my own gallery on a custom domain, and configure that domain in my NuGet.config, I could take control of this dependency at any time.
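
Just for reference, the switch that makes this work lives in the NuGet.Targets file of the old MSBuild-integrated package restore. The snippet below is a minimal sketch based on the NuGet 2.x targets file; the exact property name may differ slightly in the version you have:

<PropertyGroup>
  <!-- When set to true, NuGet.targets downloads NuGet.exe into the .nuget folder if it is missing -->
  <DownloadNuGetExe Condition=" '$(DownloadNuGetExe)' == '' ">true</DownloadNuGetExe>
</PropertyGroup>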

I wanted my project to incorporate information about the Git commit hash it is built from, the Git branch it was built from, and other little details. For that, the MSBuild Community Tasks project offers some nice helpers. So I added that project's NuGet package to my solution.
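
To give an idea of what those helpers look like, here is a rough sketch of a target that reads the commit hash and branch with the Git tasks from MSBuild Community Tasks. The task and parameter names are written down from memory of that library's documentation, so verify them against the package version you actually install:

<Target Name="ReadGitInfo" BeforeTargets="BeforeBuild">
  <!-- Read the current commit hash and branch into MSBuild properties -->
  <GitVersion LocalPath="$(MSBuildProjectDirectory)">
    <Output TaskParameter="CommitHash" PropertyName="GitCommitHash" />
  </GitVersion>
  <GitBranch LocalPath="$(MSBuildProjectDirectory)">
    <Output TaskParameter="Branch" PropertyName="GitBranchName" />
  </GitBranch>
  <Message Text="Building $(GitBranchName)@$(GitCommitHash)" Importance="high" />
</Target>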

The Problems

Now there is a chicken-and-egg problem: When MSBuild encounters a UsingTask declaration, it loads the assembly that contains the task; if that assembly is not there, using the task will fail. The NuGet download of the packages - including the task library - happens as part of the build, that is, after the project files have been loaded. So the freshly downloaded files are not found when the projects are imported, and... the build fails.
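
To illustrate the pattern (the package version and paths here are made up for the example): the project imports a targets file that ships inside the NuGet package, and that targets file declares the tasks against the assembly sitting next to it. Before the package has been restored, both files simply are not there:

<!-- In the web application's .csproj -->
<Import Project="$(SolutionDir)packages\MSBuildTasks.1.4.0\tools\MSBuild.Community.Tasks.Targets" />

<!-- Inside that targets file, roughly: -->
<UsingTask TaskName="GitVersion"
           AssemblyFile="$(MSBuildThisFileDirectory)MSBuild.Community.Tasks.dll" />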

To avoid this problem, I cheated a little bit on MSBuild: I added another project to my solution that also lists the MSBuild Community Tasks package in its packages.config. Then I manually set my web application project to be built after this 'BuildSupport' project. Now the build of the BuildSupport project downloads the community tasks library, which is then available when the import in the web application's project file is evaluated. It's just a small cheat, though.
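
The packages.config of that BuildSupport project is nothing special, by the way. A sketch (the version number is only an example) looks like this:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- MSBuildTasks is the NuGet package id of the MSBuild Community Tasks project -->
  <package id="MSBuildTasks" version="1.4.0.45" targetFramework="net45" />
</packages>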

Then the next problem: The BuildSupport project is not actually 'required' to build the website project, and so the Git deployment build process simply does not build it. The task library is not downloaded prior to executing the actual build of the application, and so it fails. I could not get Azure to build the 'BuildSupport' project before the actual web application.

The Solution

After a little bit of research I found that this can be achieved by using a custom deployment script.
I was a bit afraid that I would have to figure out how the actual deployment works in order to add a step just in front of the actual compile, but there is some infrastructure in place to help us out with that.

For a .NET developer this will feel strange, but you'll need node.js in the first place. The Windows Azure Command Line Tools are a node.js package, and we'll need them to get started with the actual deployment script. So, after installing node.js, we're going to install the package:

npm install azure-cli -g

This will globally install the Azure CLI for use on our console. Now we navigate to our solution directory and let the Azure CLI generate the deployment script that will automatically run to deploy our application to Azure if we don't do anything custom:

azure site deploymentscript --aspWAP ApplicationFolder\Application.csproj -s Solution.sln

This will generate two files for you. First there is a .deployment file. This file is structured like an old-fashioned .ini configuration file, and it tells Azure that there is a custom deployment script and what its name is. Its content simply is:

[config]
command = deploy.cmd

It also reveals the second generated file, the actual deployment script called deploy.cmd. This is the interesting part for us. I'm not posting the full script, but will rather go through its sections.

First there is a check that node.js is available. It is assumed to be available on Azure, but to test the deployment script locally you'll also need node.js. We just installed it, so we're all set, but the next person checking out the solution could be missing node.

Then the script defines some environment variables for folders, like where the build artifacts will be placed and where the actual files to deploy will end up. This defaults to /artifacts/wwwroot and can be overridden by setting the corresponding environment variables before the deployment.

In a third step, the script checks whether Kudu is installed. Kudu is the actual deployment engine running on Azure, and it is also capable of running on your machine. After that, additional paths are configured.

In the fourth step the actual compiling and deployment work is done, and the fifth is just some error handling.

So, let's have a look at the actual important stuff in the file:

:: 1. Build to the temporary path
%MSBUILD_PATH% "%DEPLOYMENT_SOURCE%\MyApplication.Web\MyApplication.Web.csproj" /nologo /verbosity:m /t:Build /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir="%DEPLOYMENT_TEMP%";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%.\" %SCM_BUILD_ARGS%
IF !ERRORLEVEL! NEQ 0 goto error

:: 2. KuduSync
call %KUDU_SYNC_CMD% -v 50 -f "%DEPLOYMENT_TEMP%" -t "%DEPLOYMENT_TARGET%" -n "%NEXT_MANIFEST_PATH%" -p "%PREVIOUS_MANIFEST_PATH%" -i ".git;.hg;.deployment;deploy.cmd"
IF !ERRORLEVEL! NEQ 0 goto error

Now, that's actually very easy: MSBuild is called for the web application project, and then Kudu is launched to do the actual deployment.

What we want to achieve now is to build the full solution upfront, so that all required NuGet packages are downloaded before the actual project is built. And while we are getting our hands dirty in a custom deployment script anyway, why not also run the unit tests of the project as part of the deployment? That way, if a test fails, the deployment fails too. I think that's a good idea.

So what I did was add these two steps just in front of the two default steps:

:: 1. Build solution
echo Build solution
%MSBUILD_PATH% "%DEPLOYMENT_SOURCE%\MySolution.sln" /nologo /verbosity:m /t:Build /p:_PackageTempDir="%DEPLOYMENT_TEMP%";AutoParameterizationWebConfigConnectionStrings=false;Configuration=Release /p:SolutionDir="%DEPLOYMENT_SOURCE%.\" %SCM_BUILD_ARGS%
IF !ERRORLEVEL! NEQ 0 goto error

:: 2. Running tests
echo Running tests
vstest.console.exe "%DEPLOYMENT_SOURCE%\MyApplication.Web.Tests\bin\Release\MyApplication.Web.Tests.dll"
IF !ERRORLEVEL! NEQ 0 goto error

That's it. I just copied the build line and pointed it at my solution, and I added a call to the MSTest tooling to run my tests.

So, with very little tweaking I could remove all dependencies on binaries that I would otherwise have to check in, and the Azure Git deployment now runs my unit tests on every deployment. That's what I call easy and convenient.

Setting up my infrastructure – Part 1: Basic tools

For my new pet project I want to use good and efficient tooling. Since I want to create a tool for me and other .NET developers and I feel at home on this platform, I'm going to use C# for the project.

I have my personal MSDN Professional subscription, and so I use Visual Studio 2012 Professional for development. I added my personal ReSharper licence for productivity and chose Windows 8 Professional as my development OS (in a VM on my MacBook Air). Being fully in the Microsoft .NET ecosystem, I'm also going to use MSBuild and MSTest.

Update: Speaking of VMs on my Mac, I use VMware Fusion for that. I also have VMware Workstation running on my home server for my build server virtualization, but that will be part of another post.

For source code versioning I chose Git. Mainly because I feel that even if Mercurial currently has better tooling support on Windows, Git is more mainstream and the tooling is getting better. As Git clients I currently use the GitHub client and, of course, the official Git command line client. I host my sources on Bitbucket from Atlassian. They give you private repositories for free, and since I invited some people I can also collaborate with three others if I want, without having to pay for a private shared repository.

Besides that, I of course have the usual .NET developer tools like The Regulator for working with regular expressions, LinqPad for small test thingies and DotPeek as my decompiler.

Now, besides that, I need additional tooling to keep track of my tasks, so I need a bug / issue / task tracker. And I don't want to build releases manually or do manual testing, so I will need some sort of automated build & test tooling, which leads me to a continuous integration / build server. Choosing which tool is best here will take some time, and so I started to evaluate different solutions. More on that in a separate post.

So, the toolset for my pet project right now is:

Continue with the next part, or see the other parts in this series.