I just released my first open source library

That may sound strange, but I am indeed a little proud that yesterday evening I released my very first open source library: SmartDev.ConfigurationMapper.

This is a small helper library designed to be used with ASP.NET 5 (currently still known as vNext). You pass in an instance of the new Microsoft.Framework.ConfigurationModel.IConfiguration type, and it maps the config keys to properties of your custom classes, giving you strongly typed configuration in your application.
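The library itself is C#, but the core idea is easy to sketch in a language-agnostic way. The following Python snippet is a hypothetical illustration only (the class and function names are my own invention, not the library's API): flat configuration keys are copied onto same-named properties of a typed settings object, with type conversion driven by each property's default value.

```python
# Hypothetical Python sketch of key-to-property mapping.
# The real library is C# and works against IConfiguration;
# all names here are invented for illustration.

class AppSettings:
    site_name: str = ""
    max_items: int = 0

def map_configuration(config, target):
    """Copy matching config keys onto same-named attributes of target,
    converting each value to the type of the attribute's default."""
    for name, default in vars(type(target)).items():
        if name.startswith("_") or name not in config:
            continue
        setattr(target, name, type(default)(config[name]))
    return target

# Config values often arrive as strings; the mapper converts them.
settings = map_configuration({"site_name": "MyBlog", "max_items": "25"},
                             AppSettings())
```

The point of the pattern is that the rest of the application only ever sees `settings.max_items` as a real integer, never a raw string from the config source.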

It works on both aspnet50 and aspnetcore50 (the new CoreCLR), and is one building block of a side project I started recently after struggling a bit with the new configuration system.

Grab it on NuGet: http://www.nuget.org/packages/SmartDev.ConfigurationMapper, or get involved on GitHub: https://github.com/gingters/SmartDev.ConfigurationMapper!

Static site gens the 2nd: Hexo and Sandra.Snow

In my recent blog post I wrote about my experiences so far with static site generators in general. I said I would look into Hexo before moving on to my plan B, and that is what I did.

Hexo is very capable. If you really just want to run a blog, then this is the way to go (imho). The main problem with Hexo is that it is a one-man show from China, and its author is currently in the middle of releasing Hexo 3.0. That is not a bad thing in itself, but for one, several plugins have not yet been updated, which makes it very hard to get things running. For another, some plugins, like the sitemap plugin that should generate a sitemap.xml, do not have access to all the entries for tags and categories. That said, I could probably write my own, but while the API is somewhat documented, I never got around to configuring my WebStorm IDE so that it actually provides code completion for the Hexo API, which makes everything very tedious.
Continue reading “Static site gens the 2nd: Hexo and Sandra.Snow”

Ask a Ninja: Current state of static site generators

Over the course of the last weekend I tried to build a website for a side project of mine (gaming related). To broaden my horizon, and to be able to host the website cheaply and fast, I wanted to use a static site generator for it.

First try: Jekyll. TL;DR: Does not work on Windows.

Since Jekyll is directly supported by GitHub Pages, where I wanted to host, and a lot of other people on my Twitter timeline use Jekyll, I thought this was the way to go.
Continue reading “Ask a Ninja: Current state of static site generators”

Speaking at Delphi Tage 2014 and EKON 18

Hi, my schedule for this year's conference season is quite short.

I will be speaking at Delphi-Tage in Bonn this weekend (6th of September). My two sessions there are:

  • Continuous Integration: Build- und Testautomatisierung für Ihre Projekte
  • JavaScript für Delphi-Entwickler: TypeScript

Additionally, I will be speaking at EKON 18, 3rd to 5th of November in Cologne, where I have three talks:

  • Verteiltes Leid ist halbes Leid: Git in der Praxis
  • Ey, was geht ab? – Metriken und Anwendungsanalyse
  • Software Quality 101

And maybe, just maybe, I will have a short guest appearance at EGX London, but if that comes true, I will post about it separately.

Re-emerging from the void. Some random notes

Hello everybody. I just wanted to tell you I'm back 🙂

Some might have wondered why I did not blog anymore. The answer is simple: I moved. From a rented house into a house I bought, where I had to do a lot of DIY work (including the electrical installation) myself before we could move in. With all of that going on, I simply did not find the time to blog, and to be honest, if I had had a slot of spare time, I probably wouldn't have used it for blogging.
Continue reading “Re-emerging from the void. Some random notes”

Why the IIS overwrites your vary-Header with ‘Accept-Encoding’

I spent some time today searching for a bug. Turned out it was a nasty one.

If you have a single line of code that exists for a single purpose, you usually expect it to work.

My case: setting an HTTP header on the response to a normal HTTP request. One would think this is a routine task. I wanted to set a header to control caching; more specifically, the 'Vary' header, as the result of the request depends on a specific header sent with the request.
Continue reading “Why the IIS overwrites your vary-Header with ‘Accept-Encoding’”

Note to myself: Regular Expressions performance

This post is mostly a reminder for myself, so that I don't lose these important links again. That said, it is probably interesting for you too, if you care about performance and the .NET behaviour around regular expressions (Regex).

The common things

In the .NET world, some pieces of advice are very common. First: use a StringBuilder whenever you concatenate strings. Second: if a Regex is slow, use RegexOptions.Compiled to fix it. Well… there are indeed reasons for this sort of advice. String concatenation IS slow, for various, commonly known reasons. But a StringBuilder still has some overhead of its own, and there are situations where using it imposes an unwanted cost.
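The same trade-off exists outside .NET, which makes it easy to sketch. As a hypothetical Python analogue (not the .NET APIs the post is about): repeated `+=` concatenation corresponds to naive string concatenation, `"".join` to the StringBuilder approach, and a pre-compiled pattern object to reusing a cached Regex instance instead of re-parsing the pattern on every call.

```python
import re

def concat_naive(parts):
    s = ""
    for p in parts:
        s += p  # each step may copy everything built so far (quadratic worst case)
    return s

def concat_builder(parts):
    return "".join(parts)  # one sizing pass, one copy: the StringBuilder idea

# Compile once and reuse, instead of re-parsing the pattern on every call --
# roughly the intent behind holding on to a cached Regex instance in .NET.
DIGITS = re.compile(r"\d+")

def extract_numbers(text):
    return DIGITS.findall(text)
```

For a handful of short strings, `concat_naive` is perfectly fine and the "builder" variant buys nothing, which is exactly the "it depends" point below.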

The very same goes for RegexOptions.Compiled, and Jeff Atwood, aka Coding Horror, wrote a very good article about that a few years ago: To compile or not to compile (Jeff Atwood).

In one of the comments another article from MSDN (BCL Blog) is referenced, where the different caching behaviour of Regex in .NET 1.1 vs. .NET 2.0 is explained: Regex Class Caching Changes between .NET Framework 1.1 and .NET Framework 2.0 (Josh Free).

The not-so-common things

There is only a single thing that is true for each and every kind of performance optimization, and it is two simple words: “It depends.”

With regular expressions, the first thing any performance issue depends on is whether you really need a regular expression for the task at all. Of course, if you really know regular expressions, what they can and cannot do, and what they are the correct tool for, you are very unlikely to run into these kinds of problems. But when you have just learned about the power of regexes (all you have is a hammer), everything starts to look like a string desperately waiting to get matched (everything is a nail). What I want to say is: not everything that could be solved with a regex should be solved by one. Again, I have a link for me and you to keep in your regex link collection: Regular Expressions: Now You Have Two Problems (Jeff Atwood).
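To make that concrete, here is a small hypothetical example in Python: a simple prefix check is solved equally well by a plain string method, which is easier to read, cannot backtrack, and avoids the regex engine entirely.

```python
import re

def is_secure_url_regex(url):
    # Works, but drags in the whole regex machinery for a prefix test.
    return re.match(r"^https://", url) is not None

def is_secure_url_plain(url):
    # The plain string method says exactly what it does; no regex needed.
    return url.startswith("https://")
```

Both functions agree on every input; the second one is simply the right tool for the job.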

Now, finally, on to the performance optimization links.

There is a good blog article series on the MSDN BCL Blog (like the one above) that goes very deep into how the Regex class performs in different scenarios. You can find them here:

And, besides those, once again a nice article on “catastrophic backtracking” from Jeff: Regex Performance (Jeff Atwood).
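Catastrophic backtracking is easy to reproduce. As a hypothetical Python illustration: the nested quantifier in `(a+)+$` forces the engine, on a non-matching input, to try every way of splitting the run of a's between the inner and outer `+`, which is exponential in the input length; a pattern without the nesting matches exactly the same strings in linear time.

```python
import re

evil = re.compile(r"^(a+)+$")  # nested quantifiers: exponential backtracking
safe = re.compile(r"^a+$")     # matches the same language, linear time

# Kept deliberately small: around 30 'a's the evil pattern already
# needs seconds to *fail*, because failure explores every split.
ok_input = "a" * 15
bad_input = "a" * 15 + "b"
```

Both patterns accept `ok_input` and reject `bad_input`; the difference is only how long the rejection takes as the run of a's grows.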

One more thing

There are three articles that are not really available anymore: three very good articles from Mike that you can only retrieve from the Wayback Machine. I am thinking hard about providing a mirror for these articles on my blog as well. But until then, here are the links:

My résumé of BASTA conference 2013

Right now I’m on the train on my journey back from BASTA 2013. Time for a small résumé.

To make it short: the BASTA conference was great. Especially, of course, meeting friends and colleagues you usually only see at conferences. It's quite interesting that some of the people you know from other conferences like EKON or Delphi Live! are now also here at BASTA (especially from tool vendors that have their origin in the Delphi space).

Besides my own session about JavaScript unit testing (here are the slides and code samples, and also a small summary (in German) published just after my talk), I also attended some other sessions. Especially the whole “Modern Business Application” track, mainly driven by our colleagues at thinktecture, was very interesting.

But perhaps even more interesting were some of the booths. One of them in particular caught my attention: Testhub (in German). I really like the idea of outsourcing your testing efforts to testing specialists, and the price tag on this kind of crowd-testing also seems very appealing. I had the opportunity to talk directly to one of the founders of Testhub and chatted with him about testing requirements, and the whole concept seems very well thought out.

I'm a bit sad that I have to leave this early, but I have other duties that cannot wait. I wish all the other BASTA attendees and speakers another two exciting days in Mainz. I'm looking forward to seeing some of them at EKON in November.

Update: Added a link to the summary article on “Windows Developer”.

I’m Speaking at EKON 17, too

Nick started the “I'm speaking at” campaign for EKON 17, so I thought I'd team up and join him, not only in this campaign but also by supporting his unit testing session with some automation tips & tricks in my own CI session.

I'm giving two sessions at EKON 17, 4th to 6th of November 2013 in Cologne:

Both talks will be held in German, but since I keep my slides in English you should be able to follow, and I'm also happy to explain things in English when asked to do so.

The first session is about Continuous Integration (CI), where I'm going to explain why you should do it, what tools are available, and how powerful a good CI setup can be.

The second talk is an introduction to Git. I explain the differences between SVN and Git, and show you what distributed version control can do for you.

So, I’d be happy to see you at EKON in Cologne.

Oh, and if you’re still unsure about whether you should go, then just contact me. We can arrange a little discount 😉


And some additional info about EKON (translated from the German announcement):

EKON 17 – The Conference Highlight for Delphi Developers

From the 4th to the 6th of November 2013, Entwickler Magazin presents the 17th Entwickler Konferenz in Cologne. EKON is the big annual conference highlight for the Delphi community and this year offers a total of 30 sessions and 4 workshops with many national and international professionals from the Delphi community, among others Marco Cantú from Embarcadero, Bernd Ua, Ray Konopka, Nick Hodges, Cary Jensen, and many more. Five tracks are on offer, from Tips & Techniques, IDE & Tools, and Cross-Platform/Mobile/Web to OOP/Fundamentals and Databases. News from Embarcadero, such as iOS development, is also on the agenda. All information at www.entwickler-konferenz.de.

Why is everyone using Disqus?

Recently I discovered that more and more blogs I visit have started to use Disqus. And I don't understand why.

As Tim Bray said: “Own your space on the Web, and pay for it. Extra effort, but otherwise you're a sharecropper.”

I read this as being not just about owning your own ‘real estate’ on the web, but also about owning your content. There is a saying going around the net (I couldn't find the original source, but it's quoted everywhere): “If you're not paying for it, you're the product.”

What’s this Disqus thing in the first place?

Maybe the reason I can't understand why everyone starts to use Disqus is that I didn't get the Disqus concept right myself.
For me, Disqus is outsourcing the discussion part (read: comments area) of your blog (or website) to a third party. Completely. Including the user-generated content in those comments.

Disqus presents itself as an easy way to build up communities. It is targeted at advertising, so everything you post in comments via Disqus may well be used to target ads.

If you read through their terms and conditions, you will notice that the personally identifiable information you and third parties (like Facebook, when connecting to Disqus) pass to them may also be sold, together with any information you ‘publicly share’ via their service (read: your comments).

What’s so bad about it?

Well, you may decide for yourself whether not actually owning your own content is okay for you. You may decide to share your comments on a site that uses Disqus, or you may decide NOT to share your thoughts there.

But making this decision for your blog means making it for all your visitors: everyone who wants to comment on your posts is forced to share their comments with Disqus, or not share their thoughts with you at all.

The latter is the real problem. I won't comment on sites using Disqus, so you won't receive feedback from me. Okay, some people would say that's a good thing ;-), but others would be pretty happy about what I have to say.

The technical debt

On several occasions I have noticed that the Disqus service isn't that reliable. I commute a lot. Right now I'm sitting on a train with a tethered internet connection, and mostly, Disqus doesn't load for me at all. I can't tell why, and especially not why it mostly happens on a tethered connection. And honestly, I don't care.

When using Disqus for your site, you are not only outsourcing your comments and your users' content, but also the continuity of your service. What if the Disqus API changes? You need to react, or lose your comments. What if they decide to shut down the service? You lose your comments. Maybe you are able to export all ‘your’ stuff beforehand, but then you are on your own figuring out how to import it into your own system again.

In my opinion, the price you pay for using this service is too high. You may lose participants in your discussions, you lose control over your content, and you lose control over the availability of parts of your service.

Oh, wait. I forgot to mention the advantages you gain from giving up your control. Erm… okay. Maybe someone can tell me? I can't find any.

Update: Fixed a typo. Thanks to Bob for the heads-up.