My conference season 2017

It is, again, the time of the year when the schedule for the conference season comes together nicely. So, without further ado, these are the conferences I will be speaking at this year:

The conference season 2017

Foren-Tage 2017, Hamburg, Sept. 23rd

The Foren-Tage (Forum days) are an event driven by the three big German Delphi forums. I am looking forward to visiting Hamburg again for that community meeting.

At the Foren-Tage I will speak about the 101 of Software Quality.

BASTA! 20, Mainz, September 25th to 29th

The BASTA! is the biggest German .NET conference.

At this year's BASTA! I will give a session about possibilities to combine your existing .NET desktop applications with modern web tech.

EKON 21, Cologne, October 23rd to 25th 2017

The EKON is a conference focused on software developers in general.

This year I am going to talk about what Node.js can do for you as a developer, and I will give a session about building lightweight web APIs using .NET Core.

Using VSTS Wildcards for .NET Core projects

On our Thinktecture Research Retreat I had the chance to take a deeper look at Visual Studio Team Services (VSTS), especially in regards to its Build / CI system. Right in my first steps with it I ran into an issue with the VSTS wildcards for .NET Core projects.

The Issue with VSTS Wildcards

I wanted to build and publish NuGet packages for a .NET Core library project after all unit tests passed. So for the VSTS build I chose the .NET Core template, which comes with tasks that call the dotnet CLI to restore, build, test and publish the project.

Since I wanted to create and push NuGet packages instead of publishing the project with the dotnet publish command, I changed the publish task to call dotnet pack instead. The problem was that the default project search pattern for the .NET Core CLI commands is **/*.csproj. With that pattern, the build created (and pushed) a NuGet package for every project, including my test project, which is something you usually don't want, because who wants to install a test project with NuGet? 😉

Since that pattern looked like a normal globbing pattern, I tried to simply exclude my test project from it, but sadly the next build did not pick up any of my projects for packing. Since I had worked quite a bit with gulp tasks before and usually know my way around globs, I found that a bit irritating. I tried several different syntaxes, but none of them worked out.

The journey

First I tried to find some official documentation from Microsoft on the VSTS task wildcards. Sadly, I wasn't able to find any, which was quite discouraging. So I asked Google for some more ideas. In a blog post from Léon Bouqiet he explains the TFS 2015 build task wildcard format. Since VSTS and TFS are very similar (in fact, TFS is a version of VSTS you can install on-premises), I hoped this would help.

Update 2017-07-22: I now know why I wasn't able to find the official documentation during my searches. There is quite a disconnect between the terms used in the VSTS UI ("Projects", "Paths", "Wildcards") and the terms used in the documentation ("File matching patterns reference"). I just didn't come up with the correct search terms, as they were nowhere in the UI.

Strangely enough, none of his search patterns worked either. The syntax described in his post reminded me a bit of the file patterns used in JetBrains' TeamCity server. Since I know my way around TeamCity build configurations quite well, I tried multiple variations of each scheme, with and without variables for the full path. I tried them separated by semicolons or by new lines, in different orders etc., but none of them yielded the expected result. Every version I tried led to one of two possible results: either no projects were packaged, or all of them were.

At this stage I was quite frustrated. Luckily my boss knows someone who knows someone, and so we could direct the question to some people who know quite a bit more about VSTS than what was available in online resources.

Already the first answer that came back was a big hint in the right direction.

The rabbit hole

The response pointed me to the fact that the VSTS tasks are open source on GitHub. Which, by the way, is awesomesauce. He linked me to the repository of the .NET Core CLI tasks. From there I searched my way through the dependencies of that project to where the actual file matching is implemented. Working through the code, I found out how the pattern should look.

Using VSTS Wildcards for .NET Core projects

First and foremost, there are currently at least two different versions of wildcard file matching in VSTS. The one explained in the aforementioned blog post is the 'legacy' version and is no longer widely supported. Besides that, every task is a custom extension, so it is possible that other tasks use different globbing libraries and bring their own pattern matching variations with them.

To give the answer away, this is the pattern that includes the projects, but excludes all test projects:
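Assuming test projects follow a naming convention ending in Tests.csproj (that suffix is an assumption here; adjust it to the names in your own solution), a working pattern looks like this:

```
**/*.csproj
!**/*Tests.csproj
```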


Despite what other sites may say, for the .NET Core CLI tasks the patterns need to be separated by new lines, not by semicolons. Also, the exclusion operator is the exclamation mark, not the -: operator.
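To illustrate the semantics of such a newline-separated include/exclude list, here is a small sketch that mimics the behavior with Python's fnmatch. This is not the actual VSTS matching code, and the project paths are made up:

```python
from fnmatch import fnmatchcase

def match_projects(paths, pattern_lines):
    # Newline-separated patterns; a line starting with '!' removes
    # previously included matches instead of adding new ones.
    selected = []
    for line in pattern_lines.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("!"):
            selected = [p for p in selected if not fnmatchcase(p, line[1:])]
        else:
            selected += [p for p in paths if fnmatchcase(p, line) and p not in selected]
    return selected

paths = ["src/MyLib/MyLib.csproj", "test/MyLib.Tests/MyLib.Tests.csproj"]
print(match_projects(paths, "**/*.csproj\n!**/*Tests.csproj"))
# -> ['src/MyLib/MyLib.csproj']
```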


Astonishingly, this is a very simple, straightforward pattern. However, given previous experience with other globbing syntaxes, and with pretty much all online references pointing to the 'legacy' patterns similar to TeamCity's, it really isn't the intuitive way.

Getting used to VSTS wildcards in build tasks, especially for .NET Core projects, takes a bit of time when you are used to other globbing syntaxes. It would have helped a lot if the VSTS UI had provided a link to the most recent documentation. However, having the source code of the VSTS wildcard pattern evaluation in the build tasks at hand is an invaluable resource.

By the way, after I had figured all that out, another response to our question turned up with a link to the official documentation. That verified that my findings were correct. It seems my Google-fu wasn't so strong in the first place.

Update 2017-07-22:

The team at Microsoft got in touch with me after this post, which is pretty awesome! They indeed want to add a link to the official documentation page right from the info box. This will help a lot, because it removes the need to search for help on the patterns in the first place.

How to: Deep link into an Angular SPA on IIS

When you create a single page application (SPA) with Angular (or other SPA frameworks, for that matter), you might want to leverage the power of the web and allow deep links directly to a certain route within your application.

For most, the easy way is to address routes in your application using the so-called hash location strategy. This means the URL within your app is separated by a hash and will look like this: https://your.domain/app#route/to/component. However, the hash in the URL actually has a different meaning and should position the browser at a specific anchor (or fragment, as it is called in Angular) on the currently shown page.

Fragment in routing links

In the optimal case we would rather use a semantically more correct URL like this: https://your.domain/app/route/to/component#fragment.

This, however, brings up a different issue.

Why do we need the # for a deep link in the first place?

Now, why do we need to resort to the hash in the URL at all?

The reason is that the part in front of the hash is treated as the path to the actual page, and the browser will request exactly that. Our example URL would request the file app/route/to/component from the web server, and this will usually be answered with an HTTP 404 (Not Found) status code: the web server does not have a file at this location to deliver. In fact, this route only exists within our client-side application, which is not loaded yet.
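To see the difference between the two URL styles: with the hash strategy the route lives in the fragment, which the browser never sends to the server at all; with the path style the route is part of the path. A quick sketch with Python's urllib, using the example URLs from above:

```python
from urllib.parse import urlsplit

hash_url = urlsplit("https://your.domain/app#route/to/component")
path_url = urlsplit("https://your.domain/app/route/to/component")

print(hash_url.path)      # "/app": the server only ever sees the app entry point
print(hash_url.fragment)  # "route/to/component": handled purely client-side
print(path_url.path)      # "/app/route/to/component": requested from the server -> 404
```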

For the application to be available, the browser first needs to load the entry point of our SPA (the index.html), then load the required JavaScript and other required resources, start up the application and finally evaluate the URL to show the correct component to the user.

The solution for an easy deep link

The solution to this is actually pretty simple: We tell the web server to deliver the actual application entry point (our index.html) to the web browser.

The first idea is to send a redirect to the index.html file (using HTTP 301 or 302 responses). This would, however, a) require a second round trip and b) actually change the URL in the browser. If we did this, the SPA would not know of the route in the URL anymore, and we would miss our goal.

Instead, we tell our web server to directly deliver the contents of the index.html file instead of a 404.
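The difference between the two approaches, sketched as raw HTTP exchanges (status lines simplified):

```
Redirect (two round trips, the URL in the browser changes, the route is lost):
  GET /app/route/to/component  ->  302 Found, Location: /index.html
  GET /index.html              ->  200 OK

Rewrite (one round trip, the URL is preserved for the SPA router):
  GET /app/route/to/component  ->  200 OK (contents of /index.html)
```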

Configure IIS to make it so

In our case we host the static files of our SPA on IIS (Internet Information Services). We need to configure IIS to deliver the index.html whenever it receives a request it cannot serve otherwise.

For this to happen, we install the URL Rewrite 2.0 extension into IIS. Follow the link and download the installer. It will use the Web Platform Installer to download and install the module into IIS.

Create the basic rule

Now, we need to configure the URL rewrite module to do what we want. In IIS Manager, go to the web site where the SPA is served from and double click on URL rewrite.

We want to ‘Add Rule(s)…’ and create a new rule for the rewrite. I called mine ‘redirect all requests’, but you can use whatever name you like.

In the ‘Match URL’ box, we want to redirect to the index.html whenever we hit any URL. So the ‘Requested URL’ should be set to ‘Matches the Pattern’ and we want to use regular expressions. The pattern we want to use is ^(.*)$, which just means: Match anything from the beginning to the end. We also check the ‘Ignore case’ checkbox.

Match URL rule for a deep link in a SPA

For the moment we skip the ‘Conditions’ and ‘Server Variables’ box and go right to the end to the ‘Action’ box.

We set the ‘Action type’ to ‘Rewrite’ and the ‘Rewrite URL’ to /index.html. We also check the box ‘Append query string’ to preserve the rest of the URL and then check the box ‘Stop processing of subsequent rules’ to prevent the IIS from doing too much work.

Action for a deep link in a SPA

Tweak the rule for real-world usage

Now, this configuration will simply rewrite all requests to our web application and deliver the contents of our ‘index.html’ file every time. This is not quite what we want, though, as the index.html file references other JavaScript files, images, CSS files etc. from our web server, and we should at least deliver those files 😉

For that, we go back to the ‘Conditions’ box.

Conditions box

First, we ‘Add’ a new condition.

The ‘Condition input’ should be set to {REQUEST_FILENAME}, which is an IIS server variable that points to the actually requested file. We set the ‘Check if input string’ drop-down to ‘Is Not a File’ and save this condition.

Existing file exclusion

This will prevent IIS from rewriting a URL that points to an existing file, which is exactly what we want: Deliver the existing file instead of the index.html.

Exclude certain paths

In my particular case I configured a sub-application in my web site to serve the backend API from the folder ‘/api’. So I do not want IIS to rewrite requests to the ‘/api’ folder either.

For that, I added another condition which checks that the ‘Condition input’ {REQUEST_URI} ‘Does Not Match the Pattern’ /api(.*)$ (with ‘Ignore case’ checked). This prevents all requests that start with ‘/api’ from being rewritten.

API path exclusion

Since I now have multiple rules, I set the ‘Logical grouping’ in the ‘Conditions’ box to ‘Match all’.
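As a quick sanity check of the exclusion pattern, the condition logic can be mimicked like this (the request URIs are made up; IIS uses ECMAScript regular expressions, but for this pattern the behavior matches Python's re; the ‘Is Not a File’ condition is left out of the sketch, as it depends on the file system):

```python
import re

# 'Does Not Match the Pattern' semantics: the rule only fires
# when the pattern does NOT match the request URI.
api_pattern = re.compile(r"/api(.*)$", re.IGNORECASE)

def is_rewritten_to_index(request_uri):
    return api_pattern.search(request_uri) is None

print(is_rewritten_to_index("/api/customers"))       # False: goes to the backend
print(is_rewritten_to_index("/route/to/component"))  # True: rewritten to /index.html
```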

Make the rules deployable

We do not want to configure this every time we add a new web site. Luckily, IIS is a nice player and writes all rules into the ‘web.config’ file. My use case therefore ended up resulting in this little file of elegant IIS rewrite magic:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="redirect all requests" stopProcessing="true">
          <match url="^(.*)$" ignoreCase="true" />
          <conditions logicalGrouping="MatchAll">
            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
            <add input="{REQUEST_URI}" pattern="/api(.*)$" ignoreCase="true" negate="true" />
          </conditions>
          <action type="Rewrite" url="/index.html" appendQueryString="true" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Finally, we can simply make this web.config file a part of our Angular web application. My Angular application uses the angular-cli tooling, so I just added the ‘web.config’ file to the app.assets array in my .angular-cli.json file.
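For reference, the relevant part of the .angular-cli.json file then looks roughly like this (the other asset entries are just the CLI defaults and may differ in your project):

```json
{
  "apps": [
    {
      "assets": [
        "assets",
        "favicon.ico",
        "web.config"
      ]
    }
  ]
}
```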


If you want to deep link into your Angular SPA, you can avoid the hash location strategy. By configuring your web server to deliver the application entry point file instead of a 404 Not Found, it is possible to bootstrap your application at any route within it.

This post shows how to achieve that using the URL Rewrite module in IIS, but the same concept applies to other web servers capable of rewriting URLs.

My developers toolbelt 2016

I caught a tweet a few days ago asking for your developer's toolbelt, specifically on Windows. I gave a very short answer and mentioned I would blog about it:

So, this is the more elaborate answer to the developer's toolbelt question. My default Windows developer installation contains the following:

  • Windows 10 Professional (fully updated)
    • .NET Frameworks active
    • IIS Installed
    • Dev-Mode enabled
    • Linux Subsystem installed
    • Windows Defender as Antivirus solution & default Firewall
      (no external security software)
  • Dropbox
  • 1Password
  • IE, Edge, Chrome, Firefox (for testing, yes, I do quite a bit web dev 🙂 )
  • Git for Windows for commandline usage
  • SourceTree as my graphical git client*
  • Beyond Compare as my diff tool
  • Cmder as my console of choice
    (my previous blog post is about using the Linux bash on Windows in Cmder)
  • Node Version Manager nvm for Windows, and as such a lot of node versions
  • Primary IDE: Visual Studio 2015 (with ReSharper Ultimate)
  • Secondary IDE: WebStorm
  • .NET Sourcepad: LinqPad
  • Primary Database: SQL Server 2016 Express
    other DBMS as required by projects.
  • dbForge Schema compare and dbForge Data compare
  • Office 365
    • OneNote for collaboration
    • PowerPoint for presentations
    • Outlook for e-mail comms
    • Word for occasional paperwork
  • Slack and TeamViewer for other comms / collab
  • VSCode as my main text editor
  • Atom as secondary text editor (e.g. for large Markdown files, where VSCode crashes)
  • Android Studio
  • Genymotion Android emulator (quite a bit faster than the default one)

In the list above, except for ReSharper, I am not listing additional add-ons / extensions to the listed tools.

* – I also tried GitKraken, Tower for Windows and the GitHub client, but they are – in my opinion – not as usable as SourceTree. Especially Tower wastes too much screen real estate.

Running Windows 10 Ubuntu Bash in Cmder

“Can you run Bash in Cmder?” – In the comments of my last post (install and run (Oh-my-) zsh on Bash on Ubuntu on Windows), I was asked whether it would be possible to run Bash (or Zsh) in Cmder as well. First I thought it was not possible, but then I got curious. After digging in a bit more, it turned out that it IS, in fact, possible. And it's not difficult either.

So, since I figured out how it works, I also want to show you how you can run the Windows 10 Ubuntu Bash (and/or Zsh) in Cmder.

What is Cmder?

Cmder is a console emulator for Windows. It has been my preferred way to use the Windows console (cmd.exe) for the last few years, as it allows me to use *NIX commands like ls, less, grep and the like. For me, Cmder is a much nicer overall experience on the command line in Windows, and it makes me much more productive.

Screenshot of Cmder console

Cmder allows me to open multiple tabs and multiple shells at once. I can open a normal cmd.exe shell, a second one that also executes the VsDevCmd.bat file to provide access to msbuild, csc etc., a third one with PowerShell and, if set up correctly, also one with Bash and/or Zsh.


Actually, you just need Bash on Ubuntu on Windows enabled and, of course, Cmder. If you don’t have that, you can simply follow these instructions:

Set up Bash in Cmder

First, in Cmder, press Win + Alt + T to open the settings dialog for Tasks. Alternatively, you can open the hamburger menu at the bottom right of the window and navigate to Settings -> Startup -> Tasks.

Step 1: You create a new task by clicking on the ‘+‘ button at the bottom and entering the details.

Setup Bash in Cmder

Step 2: The first input field of the dialog is the task name. I named it ‘bash::ubuntu‘, but the naming is completely up to you. You can use double colons for grouping, so this would be the ‘Ubuntu‘ task in the ‘Bash‘ group. Cmder already comes with a ‘Bash‘ group containing entries for Bash on mintty (using Cygwin) and another one based on git-for-windows. To distinguish the ‘real’ Ubuntu thing from the other Bashes, I simply chose to opt into this naming scheme as well.

Step 3: In the “Task parameters” input you can configure an icon (I just picked the Ubuntu Bash icon): /icon "%USERPROFILE%\AppData\Local\lxss\bash.ico"

Step 4: In the “Commands” input field, you enter the command that this task should start. This is the actual call to Bash: %windir%\system32\bash.exe ~ -cur_console:p

This will start bash.exe in the current user's home directory (~), and also sets the cursor console mode of ConEmu, which works behind the scenes in Cmder, to allow for correct cursor movement with the arrow keys.

You can find further details on how to set up Tasks in Cmder (actually, in ConEmu) in the ConEmu documentation about tasks.

This task will now start Bash on Ubuntu on Windows within Cmder, with all the settings you made in your .bashrc file.

Set up Zsh in Cmder

If you did not set up your Bash to automatically launch Zsh from your .bashrc file like I showed in the other blog post, you can add another task for this.

I called this new task ‘zsh::ubuntu’, but again the naming is up to you. I used the same task parameters as for the Bash task and just added -c zsh to the command. This causes bash.exe to start Zsh automatically.

The full line is: %windir%\system32\bash.exe ~ -c zsh -cur_console:p

How to install and run (Oh-My-) zsh on Windows

I run zsh on Windows. But why? Some time ago, when I was still using a Mac, one of my colleagues suggested using zsh instead of bash.

Since then I switched to a Surface Book, which I happily preferred over macOS, and mainly use Cmder as my shell. Now the Windows 10 Anniversary Update is out, and it comes with “Bash on Ubuntu on Windows“.

Now, with bash at my fingertips again, my colleague's suggestion came back to my mind, and I tried zsh again.

Installation of zsh on Windows

Installing zsh on Bash on Ubuntu on Windows is as easy as installing it on plain Ubuntu. You start bash and issue the following command:

sudo apt-get install zsh

This will use apt, the default package manager of the Ubuntu that runs on the Linux subsystem on Windows, to install the zsh package.

You can try it out directly by simply calling zsh from your shell to open a new zsh session from Bash.

Making zsh on Windows your default shell

I wanted zsh to start directly when I open Bash on Ubuntu on Windows, because I am too lazy to always launch it manually. To do so, I added the following little snippet at the very beginning of the ~/.bashrc file:

# if running in terminal...
if test -t 1; then
  # ...start zsh
  exec zsh
fi

See it here in context:

Changes in .bashrc file

When Bash starts up, it checks whether it has a terminal attached to stdout (the test -t 1 part) and then executes zsh. You can try it out directly by quitting Bash, restarting Bash on Ubuntu on Windows and watching it launch zsh directly.


A plain zsh is quite boring, and there are a ton of useful things for zsh to leverage, so customization is key. A well-known repository of zsh customizations with nice defaults is Oh-My-Zsh, which brings a cornucopia of themes, plugins and features with it. Installation is, again, fairly easy. From your freshly installed and started zsh, you just issue the command that is shown on the oh-my-zsh website:

sh -c "$(curl -fsSL"

After that, you can configure your plugins (I use git and ubuntu) and themes (I use this custom one).

Update 2016/11/16: Be aware that this theme also requires ‘git’ to be installed to display branch information, so you should do a sudo apt-get install git if you did not already.

Zsh is a bit theme-happy, so you will find more than 100 themes in the default installation. To help a bit, there are some screenshots in the zsh wiki. Please be aware that unicode characters in zsh in Bash on Ubuntu on Windows aren’t really supported yet, so some themes may not be for you.

So, after a bit of customization, you can start to enjoy the features of zsh.

An example of globbing, in zsh on Bash on Ubuntu on Windows

Further reading

If you are more interested in Mac and macOS, here is a great post from one of my co-workers: Thorsten Hans: Setting up iterm2 with oh-my-zsh and powerline on OS X.

Also a great post on the features of zsh (including globbing!) is this one: David Fendrich: No, Really. Use Zsh.

And if you already seek a far deeper experience, you can dive into this: Jerome Dalbert: Migrate from oh-my-zsh to prezto.

Update 2016/11/16:

Fixed in article: OS X is now macOS.

In response to Michal's question (thanks!), I blogged about how you can run Bash (or Zsh) in Cmder.

Microsoft Edge – First impressions

After the upgrade to Windows 10 I started using Microsoft Edge as my main browser. Now, just a few days later, I can tell you about my first impressions.

For me, Edge seems like a lean browser, but it still feels very much beta.

Some things I already complain about: when you drag a tab out of the main window to put it on the desktop as a separate window, and perhaps later want to drag that separate window back into the main window, the second window automatically opens up a new empty tab and stays there.

When using my web mail account, I cannot use drag & drop to add new attachments from an Explorer window. Also, the normal file upload dialog does not work there (it does here on my blog, though).

Also, on certain pages like Twitter and especially JabbR, the tabs tend to hang and block, and I need to reload the page often.

Another huge issue for me is that there is no adblock extension for Edge so far. And I am so used to using an adblocker, as it really makes my browsing experience so much better, that I was really disturbed by the awful lot of ads I now see again.

All in all, there are some problems with Edge that need to be fixed, and some extensions need to be provided soon. Still, Edge seems promising, provided the issues are resolved quickly. At the current stage, I really see Edge as a beta and not as a production-ready browser.

I just released my first open source library

That may sound strange, but I’m indeed a little bit proud that yesterday evening I released my very first open source library: the SmartDev.ConfigurationMapper.

This is a small helper library designed to be used with ASP.NET 5 (currently still known as vNext). You pass in an instance of the new Microsoft.Framework.ConfigurationModel.IConfiguration type, and you can map the configuration keys to properties of your custom classes, to get strongly typed configuration in your application.

It works both on aspnet50 and on aspnetcore50 (the new CoreCLR), and is one building block of a side project I started recently, because I struggled a bit with the configuration system.

Grab it on NuGet, or get involved on GitHub!

Static site gens the 2nd: Hexo and Sandra.Snow

In my recent blog post I wrote about my experiences so far with static site generators in general. I said I was looking into Hexo before going on with my plan B, and this is what I did.

Hexo is very capable. If you really just want a blog, then this is the way to go (imho). The main problem with Hexo is that it is a one-man show from China, and that this guy is currently in the middle of releasing Hexo 3.0. That is not a bad thing in itself, but for one, several plugins have not yet been updated, which makes it very hard to get things running. Also, some plugins, like the sitemap plugin that should generate a sitemap.xml, do not have access to all entries for the tags and the categories. I could probably write my own, but while the API is somewhat documented, I did not get around to configuring my WebStorm IDE so that it actually provides me with code completion on the Hexo API, which makes everything very tedious.

That said, among all the static site generators, Hexo is by far the most powerful one and definitely worth a look, as it is plain JavaScript, runs on Node and is very unproblematic to install both on Windows and on OS X.

Before I went on with my plan B I also took a quick look at Sandra.Snow, a .NET based static site generator. I looked at it and its source and also talked to its creator on JabbR, but I did not really dig deeper into it. Again, the problem was that it is entirely intended to be a blog platform. Doing more sophisticated website stuff with it is not really supported and seems very hard to do.

So, from my trip into the world of static site generators I am back with a finding. If you want a simple blog, concentrate on your articles, are fine with some predefined template you don't want to change much, and don't need more sophisticated features like menus for all of your categories and tags on the start page (which requires knowing all posts' metadata while generating the main page), then almost all static site generators are capable of delivering what you need. As soon as you want to do fancier stuff, some fall apart sooner, some later.

If I wanted to move my personal blog (this one) away from WordPress to a static page, it probably would be Hexo.
But for now, I’m firing up my Visual Studio on my PC and Xamarin Studio on my Mac and build my website with ASP.NET MVC. That at least also allows me to implement a search functionality into the page.

Update: Fixed some of my mistakes. Thanks to Andreas H. for proofreading. 🙂

Ask a Ninja: Current state of static site generators

Over the course of the last weekend I tried to build a website for a side project of mine (gaming related). To broaden my horizon, and to be able to host the website cheaply and fast, I wanted to use a static site generator for it.

First try: Jekyll. TL;DR: Does not work on Windows.

Since Jekyll is directly supported by GitHub Pages, where I wanted to host, and a lot of other people on my Twitter timeline use Jekyll, I thought this was the way to go.

Jekyll does not claim to work under Windows, and it is not officially supported there, so this is not a great surprise. There were, however, some tutorials / manuals around that described how to get it running on Windows, which I wanted to follow. I installed Ruby, and right away the first attempt to install the Jekyll gem failed hard. So I could not even follow those instructions, as the installation already failed with some strange SSL errors.

That’s when I looked for an alternative. I found a page that lists a couple of projects related to this topic, which led me to the next step.

Second try: Pretzel. TL;DR: Not stable. Crashes all the time.

As a developer who feels comfortable on the .NET platform, I looked for something compatible with Jekyll that would, maybe, run on Windows and on Mono on my Mac. Pretzel was the candidate. I installed Pretzel, and it worked with the template that ships with it.

Sadly, as soon as I started to throw in my own HTML, CSS and JS, the Pretzel preview server crashed at least on every second request. So it was: change files, restart Pretzel, preview. Change files, restart Pretzel, preview, and so on. I did this for about half an hour before I became too annoyed to follow this path any further.

At the end of the day, I want to build a website, not debug a tool that generates some HTML for me. So it was time to look for an alternative again. Since there weren't any other Jekyll-compatible alternatives on .NET, I thought about Node as a platform. I knew that Node.js runs on Windows and my Mac, I had some previous experience with Node, and JavaScript is not that bad.

I did not go to StaticGen this time, because a handful of people on my timeline had also started to use a Node based static site generator: Wintersmith. So, without double-checking the facts and blindly trusting my tech colleagues, I downloaded it.

Third try: Wintersmith. TL;DR: Plugins don’t work and/or are too complicated. Not enough docs.

To be honest, I was totally turned off by the default template engine Jade. I have a full HTML template, and Jade is nowhere near HTML. Some people may find that cool, but it does not represent the way I want to work with my website.

Gladly, the template system can be changed, and I switched to Handlebars, as I liked its easy and lightweight mixing with HTML. Then the difficulties began.

The HTML template I was going to use requires Bootstrap (which requires jQuery) and FontAwesome; I used two additional webfonts to match a certain website style, and altogether I ended up with a few JavaScript and CSS files. I wanted them combined and minified, but the most promising Wintersmith plugin, kelvin, has not been ported to Wintersmith 2.

Another plugin problem was wintersmith-tags. First of all, the plugin exposes methods to be called from the template, something Handlebars is not capable of. Luckily, the switch to Nunjucks as the templating engine was done quickly without a lot of effort, but then I noticed that wintersmith-tags would not, despite the documentation stating otherwise, list all available tags on a call of the corresponding method. I just got back an empty array.

The fact that the plugins are mostly written in CoffeeScript, which I am not used to reading, does not make it any better. And wintersmith-tags being the only plugin with even a rough documentation makes it all the more difficult to work with. Up to the point where I gave up, I had 11 plugins installed, none working as I intended them to.


Fact is: I’m totally frustrated about the extremely poor experience I had over the last weekend.

I will give it a last shot with Hexo. It seems more active than Wintersmith, it is pure JavaScript (no CoffeeScript, so easier for me to read and understand), and it seems to have a larger installation base, so the chances are higher to get questions answered by the community. But if that does not work out either, I have already come up with my plan B:

Let go of all the static site generators available out there. I'm probably going another route: I am a developer with deep knowledge in .NET. And .NET, especially ASP.NET MVC, Razor, Web.Optimization etc., has all the stuff required to build a great website, right at my fingertips. But I still want simple HTML delivered.

It is likely I'm going to grab ASP.NET vNext and build my own static site generator on top of it. Using the next .NET version, it will also run cross-platform, and I can use it on my Mac too.

This way I can make sure it works the way I want it to, and since the people using Wintersmith (who, by the way, are not really happy with it either) are also mainly former .NET devs, I probably have a good chance to get my first users besides myself, too. But I really hope Hexo is going to work out. Doing this much work just to get a simple but modern website online is quite ridiculous. A decade ago it was a lot easier, but expectations have risen since then. A lot.

Update 2015-02-10: I will postpone my exploration of Hexo a little bit. It seems the project is still alive and just a few days ago released its version 3 RC. I can't figure out if Hexo supports Nunjucks, but I'm getting in touch with its creator and will find out.