How to: Deep link into an Angular SPA on IIS

When you create a single page application (SPA) with Angular (or any other SPA framework, for that matter), you might want to leverage the power of the web and allow deep links directly to a certain route within your application.

For most, the easy way is to address routes in your application using the so-called hash location strategy. This means the URL within your app is separated by a hash and will look like this: https://your.domain/app#route/to/component. However, the hash in a URL actually has a different meaning: it should position the browser at a specific anchor (or fragment, as it is called in Angular) on the currently shown page.
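With the Angular router, this strategy is enabled via a single option. A minimal sketch (the routes array is just a placeholder):

// app.module.ts (sketch)
import { NgModule } from '@angular/core';
import { RouterModule, Routes } from '@angular/router';

const routes: Routes = [/* your routes */];

@NgModule({
  imports: [
    // useHash: true activates the hash location strategy
    RouterModule.forRoot(routes, { useHash: true })
  ]
})
export class AppRoutingModule {}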

Fragment in routing links
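In an Angular template, such a fragment is attached to a routing link roughly like this (a sketch; the route and fragment names are made up):

<a [routerLink]="['/route/to/component']" fragment="fragment">Jump to the fragment</a>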

In the optimal case we would like to use a semantically more correct URL like this: https://your.domain/app/route/to/component#fragment.

This, however, brings up a different issue.

Why do we need the # for a deep link in the first place?

Now, why do we need to resort to the hash in the URL at all?

The reason is that the part in front of the hash is treated as the path to the actual page, and the browser will request exactly that. Our example URL would request the file app/route/to/component from the web server, and this will usually be answered with an HTTP 404 (Not Found) status code: the web server does not have a file at this location to deliver. In fact, this route only exists within our client-side application, which is not even loaded yet.

For the application to be available, the browser first needs to load the entry point of our SPA (the index.html), then load the JavaScript and other required resources, start up the application and finally evaluate the URL to show the correct component to the user.

The solution for an easy deep link

The solution to this is actually pretty simple: We tell the web server to deliver the actual application entry point (our index.html) to the web browser.

The first idea is sending a redirect to the index.html file (using HTTP 301 or 302 responses). This would, however, a) require a second round trip and b) actually change the URL in the browser. If we did this, the SPA would not know about the route in the URL anymore, and we would miss our goal.

Instead, we tell our web server to deliver the contents of the index.html file directly, rather than responding with a 404.

Configure IIS to make it so

In our case we host the static files of our SPA on IIS (Internet Information Services). We need to configure IIS to deliver the index.html whenever it receives a request it cannot serve otherwise.

For this to happen, we install the URL Rewrite 2.0 extension into IIS. Follow the link and download the installer. It will use the Web Platform Installer to download and install the module into IIS.
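If you prefer the command line over clicking through the installer, the Web Platform Installer command line tool should be able to do the same (a sketch; the product ID is an assumption on my part, so please verify it):

WebpiCmd.exe /Install /Products:UrlRewrite2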

Create the basic rule

Now, we need to configure the URL Rewrite module to do what we want. In IIS Manager, go to the web site the SPA is served from and double-click on ‘URL Rewrite’.

We want to ‘Add Rule(s)…’ and create a new rule for the rewrite. I called mine ‘redirect all requests’, but you can use whatever name you like.

In the ‘Match URL’ box, we want to rewrite to the index.html whenever we hit any URL. So the ‘Requested URL’ should be set to ‘Matches the Pattern’ and we want to use regular expressions. The pattern we want to use is ^(.*)$, which simply means: match anything from the beginning to the end. We also check the ‘Ignore case’ checkbox.

Match URL rule for a deep link in a SPA

For the moment we skip the ‘Conditions’ and ‘Server Variables’ boxes and go right to the ‘Action’ box at the end.

We set the ‘Action type’ to ‘Rewrite’ and the ‘Rewrite URL’ to /index.html. We also check the ‘Append query string’ box to preserve the query string of the URL, and then check ‘Stop processing of subsequent rules’ to prevent IIS from doing unnecessary work.

Action box for a deep link in a SPA

Tweak the rule for real-world usage

Now, this configuration will simply rewrite all requests to our web application and deliver the contents of our ‘index.html’ file every time. This is not quite what we want, though, as the index.html file references other JavaScript files, images, CSS files etc. on our web server, and we should at least deliver those files 😉

For that, we go back to the ‘Conditions’ box.

Conditions box

First, we ‘Add’ a new condition.

The ‘Condition input’ should be set to {REQUEST_FILENAME}, which is an IIS server variable that points to the actually requested file. We set the ‘Check if input string’ drop-down to ‘Is Not a File’ and save this condition.

Existing file exclusion

This will prevent IIS from rewriting a URL that points to an existing file, which is exactly what we want: Deliver the existing file instead of the index.html.

Exclude certain paths

In my specific case I configured a sub-application in my web site to serve the backend API from the folder ‘/api’. So I do not want IIS to rewrite requests to the ‘/api’ folder either.

For that, I added another condition: the ‘Condition input’ {REQUEST_URI} ‘Does Not Match the Pattern’ /api(.*)$ (with ‘Ignore case’ checked). This will prevent all requests that start with ‘/api’ from being rewritten.

API path exclusion

Since I now have multiple conditions, I set the ‘Logical grouping’ in the ‘Conditions’ box to ‘Match all’.

Make the rules deployable

We do not want to configure this by hand every time we add a new web site. Luckily, IIS is a nice player and writes all rules into the ‘web.config’ file. My use case therefore ended up as this little file of elegant IIS rewrite magic:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <system.webServer>
        <rewrite>
            <rules>
                <rule
                   name="redirect all requests"
                   stopProcessing="true"
                >
                    <match url="^(.*)$" />
                    <conditions logicalGrouping="MatchAll">
                        <add
                           input="{REQUEST_URI}"
                           pattern="/api(.*)$"
                           negate="true"
                        />
                        <add
                           input="{REQUEST_FILENAME}"
                           matchType="IsFile"
                           negate="true"
                        />
                    </conditions>
                    <action type="Rewrite" url="/index.html" />
                </rule>
            </rules>
        </rewrite>
    </system.webServer>
</configuration>

Finally, we can simply make this web.config file a part of our Angular web application. My Angular application uses the angular-cli tooling, so I just added the ‘web.config’ file to the app.assets array in my .angular-cli.json file.
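In the .angular-cli.json, this boils down to one additional entry in the assets array. A trimmed sketch showing only the relevant part (‘assets’ and ‘favicon.ico’ are the angular-cli defaults):

{
  "apps": [
    {
      "assets": [
        "assets",
        "favicon.ico",
        "web.config"
      ]
    }
  ]
}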

Conclusion

If you want to deep link into your Angular SPA, you can avoid the hash location strategy. By configuring your web server to deliver the application entry point file instead of a 404 Not Found, it is possible to bootstrap your application at any route within it.

This post shows how to achieve that using the URL Rewrite module in IIS, but the same concept applies to other web servers capable of rewriting URLs.

My developer's toolbelt 2016

I caught a tweet a few days ago asking about your developer's toolbelt, specifically on Windows. I gave a very short answer and mentioned I would blog about it.

So, this is a more elaborate answer to the developer toolbelt question. My default Windows developer installation contains the following:

  • Windows 10 Professional (fully updated)
    • .NET Frameworks active
    • IIS Installed
    • Dev-Mode enabled
    • Linux Subsystem installed
    • Windows Defender as Antivirus solution & default Firewall
      (no external security software)
  • Dropbox
  • 1Password
  • IE, Edge, Chrome, Firefox (for testing; yes, I do quite a bit of web dev 🙂 )
  • Git for Windows for commandline usage
  • SourceTree as my graphical git client*
  • Beyond Compare as my diff tool
  • Cmder as my console of choice
    (my previous blog post is about using the Linux bash on Windows in Cmder)
  • Node Version Manager nvm for Windows, and as such a lot of node versions
  • Primary IDE: Visual Studio 2015 (with ReSharper Ultimate)
  • Secondary IDE: WebStorm
  • .NET Sourcepad: LinqPad
  • Primary Database: SQL Server 2016 Express
    other DBMS as required by projects.
  • dbForge Schema compare and dbForge Data compare
  • Office 365
    • OneNote for collaboration
    • PowerPoint for presentations
    • Outlook for e-mail comms
    • Word for occasional paperwork
  • Slack and TeamViewer for other comms / collab
  • VSCode as my main text editor
  • Atom as secondary text editor (e.g. for large markdown files, where VSCode crashes)
  • Android Studio
  • Genymotion Android emulator (quite a bit faster than the stock one)

Except for ReSharper, the list above does not include additional add-ons / extensions to the listed tools.

* – I also tried GitKraken, Tower for Windows and the GitHub client, but they are – in my opinion – not as usable as SourceTree. Especially Tower wastes too much screen real estate.

Running Windows 10 Ubuntu Bash in Cmder

“Can you run Bash in Cmder?” – In the comments of my last post (install and run (Oh-my-) zsh on Bash on Ubuntu on Windows), I was asked whether it would be possible to run Bash (or Zsh) in Cmder as well. First I thought it was not possible, but then I got curious. After digging in a bit more, it turned out that it IS, in fact, possible. And it’s not difficult either.

So, since I figured out how it works, I also want to show you how you can run the Windows 10 Ubuntu Bash (and/or Zsh) in Cmder.

What is Cmder?

Cmder is a console emulator for Windows. It has been my preferred way to use the Windows console (cmd.exe) for the last few years, as it allows me to use *NIX commands like ls, less, grep and the like. For me, Cmder is a much nicer overall experience on the command line in Windows, and it makes me much more productive.

Screenshot of the Cmder console

Cmder allows me to open multiple tabs and multiple shells at once. I can open a normal cmd.exe shell, a second one that also executes the VsDevCmd.bat file to provide access to msbuild, csc etc., a third one with PowerShell and, if set up correctly, also one with Bash and/or Zsh.
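The task command for such a Visual Studio 2015 developer prompt could look roughly like this (a sketch; %VS140COMNTOOLS% is the environment variable that points to the VS 2015 tools folder):

cmd.exe /k "%VS140COMNTOOLS%\VsDevCmd.bat"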

Prerequisites

Actually, you just need Bash on Ubuntu on Windows enabled and, of course, Cmder. If you don’t have that yet, you can simply follow the installation instructions for both first.

Set up Bash in Cmder

First, in Cmder, press Win + Alt + T to open the settings dialog for tasks. Alternatively, you can open the hamburger menu at the bottom right of the window and navigate to Settings -> Startup -> Tasks.

Step 1: Create a new task by clicking the ‘+’ button at the bottom and enter the details.

Steps for setting up Bash in Cmder

Step 2: The first input field of the dialog is the task name. I named mine ‘bash::ubuntu’, but the naming is completely up to you. You can use double colons for grouping, so this would be the ‘Ubuntu’ task in the ‘Bash’ group. Cmder already comes with a ‘Bash’ group containing entries for Bash on mintty (using Cygwin) and another one based on git-for-windows. To distinguish the ‘real’ Ubuntu thing from the other Bashes, I simply chose to opt into this naming scheme as well.

Step 3: In the “Task parameters” input you can configure an icon (I just picked the Ubuntu Bash icon): /icon "%USERPROFILE%\AppData\Local\lxss\bash.ico"

Step 4: In the “Commands” input field, you enter the command that this task should start. This is the actual call to Bash: %windir%\system32\bash.exe ~ -cur_console:p

This will start bash.exe in the current user directory (~), and also sets the cursor console mode of ConEmu, which works behind the scenes in Cmder, to allow for correct cursor movement with the arrow keys.
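To recap, the complete task definition from the steps above is:

Task name:       bash::ubuntu
Task parameters: /icon "%USERPROFILE%\AppData\Local\lxss\bash.ico"
Command:         %windir%\system32\bash.exe ~ -cur_console:p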

You can find further details on how to set up Tasks in Cmder (actually, in ConEmu) in the ConEmu documentation about tasks.

This task will now start the Bash on Ubuntu on Windows within Cmder, with all the settings you made in your .bashrc file.

Set up Zsh in Cmder

If you did not set up your Bash to automatically launch Zsh from your .bashrc file like I showed in the other blog post, you can add another task for this.

I called this new task ‘zsh::ubuntu’, but again the naming is up to you. I used the same task parameters as for the Bash task and just added -c zsh to the command entry. This causes bash.exe to start Zsh automatically.

The full line is: %windir%\system32\bash.exe ~ -c zsh -cur_console:p

How to install and run (Oh-My-) zsh on Windows

I run zsh on Windows. But why? Some time ago, when I was still using a Mac, one of my colleagues suggested using zsh instead of bash.

Since then I have switched to a Surface Book, which I happily preferred over mac OS, and mainly use Cmder as my shell. Now the Windows 10 Anniversary Update is out, and it comes with “Bash on Ubuntu on Windows”.

Now, with bash at my fingertips again, my colleague’s suggestion came back to my mind, and I tried zsh once more.

Installation of zsh on Windows

Installation of zsh on Bash on Ubuntu on Windows is as easy as installing it on plain Ubuntu. You start bash and issue the following command:

sudo apt-get install zsh

This will use apt, the default package manager of the Ubuntu that runs on the Linux Subsystem on Windows, to install the zsh package.

You can try it out directly by simply calling zsh from your shell to open a new zsh session from within Bash.

Making zsh on Windows your default shell

I wanted zsh to start directly when I open Bash on Ubuntu on Windows, because I am too lazy to launch it manually every time. To do so, I added the following little snippet at the very beginning of the ~/.bashrc file:

# if running in a terminal...
if test -t 1; then
  # ...start zsh
  exec zsh
fi

See it here in context:

Changes in the .bashrc file, in context

When Bash starts up, it checks whether a terminal is attached to stdout (the test -t 1 part) and then executes zsh. Try it out directly by quitting and restarting Bash on Ubuntu on Windows and watch it launch zsh right away.

Customization

A plain zsh is quite boring, and there is a ton of useful things for zsh to leverage, so customization is key. A well-known repository of zsh customizations with nice defaults is Oh-My-Zsh, which brings a cornucopia of themes, plugins and features with it. Installation is, again, fairly easy. From your freshly installed and started zsh, you just issue the command shown on the Oh-My-Zsh website:

sh -c "$(curl -fsSL https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh)"

After that, you can configure your plugins (I use git and ubuntu) and your theme (I use this custom one).
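Plugins and theme live in your ~/.zshrc; the relevant lines look roughly like this (the theme name below is just the Oh-My-Zsh default, standing in for my custom one):

# ~/.zshrc
ZSH_THEME="robbyrussell"
plugins=(git ubuntu)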

Update 2016/11/16: Be aware that this theme also requires ‘git’ to be installed to display branch information, so you should do a sudo apt-get install git if you did not already.

Zsh is a bit theme-happy, so you will find more than 100 of them in the default installation. To help a bit, there are some screenshots in the zsh wiki. Please be aware that unicode characters in zsh in Bash on Ubuntu on Windows aren’t really supported yet, so some themes may not be for you.

So, after a bit of customization, you can start to enjoy the features of zsh.

An example of globbing, in zsh on Bash on Ubuntu on Windows

Further reading

If you are more interested in Mac and mac OS, here is a great post from one of my co-workers: Thorsten Hans: Setting up iterm2 with oh-my-zsh and powerline on OS X.

Also a great post on the features of zsh (including globbing!) is this one: David Fendrich: No, Really. Use Zsh.

And if you are looking for an even deeper experience, you can dive into this: Jerome Dalbert: Migrate from oh-my-zsh to prezto.

Update 2016/11/16:

Fixed in article: OS X is now mac OS.

In response to Michal’s question (thanks!), I blogged about how you can run Bash (or Zsh) in Cmder.

I just released my first open source library

That may sound strange, but I’m indeed a little bit proud that yesterday evening I released my very first open source library: the SmartDev.ConfigurationMapper.

This is a small helper library designed to be used with ASP.NET 5 (currently still known as vNext). You pass in an instance of the new Microsoft.Framework.ConfigurationModel.IConfiguration type, and it maps the config keys to properties of your custom classes, so you get strongly typed configuration in your application.

It works both on Asp.Net50 and on Asp.NetCore5.0 (the new CoreCLR), and is one building block of a side project I started recently, because I struggled a bit with the new configuration system.

Grab it on NuGet: http://www.nuget.org/packages/SmartDev.ConfigurationMapper, or get involved on GitHub: https://github.com/gingters/SmartDev.ConfigurationMapper!

Static site gens the 2nd: Hexo and Sandra.Snow

In my recent blog post I wrote about my experiences so far with static site generators in general. I said I was looking into Hexo before going on with my plan B, and this is what I did.

Hexo is very capable. If you really just want a blog, then this is the way to go (imho). The main problem with Hexo is that it is a one-man show from China, and its author is currently in the middle of releasing Hexo 3.0. That is not a bad thing per se, but for one thing, several plugins have not been updated yet, which makes it very hard to get things running. Also, some plugins, like the sitemap plugin that should generate a sitemap.xml, do not have access to all entries for the tags and categories. That said, I could probably write my own, but while the API is somewhat documented, I did not get around to configuring my WebStorm IDE so that it actually provides code completion for the Hexo API, which makes everything very tedious.

That said, among all static site generators Hexo is by far the most powerful one, and definitely worth a look, as it is plain JavaScript, runs on Node and is very unproblematic to install both on Windows and on OS X.

Before I went on with my plan B, I also took a quick look at Sandra.Snow, a .NET-based static site generator. I looked at it and its source and also talked to its creator on JabbR, but I did not really dig deeper into it. Again, the problem was that it is entirely intended to be a blog platform. Doing more sophisticated website stuff with it is not really supported and seems very hard to do.

So, from my trip into the world of static site generators I am back with a finding. If you want a simple blog, concentrate on your articles, are fine with a predefined template you don’t want to change much, and don’t want to do more sophisticated stuff like creating menus for all of your categories and tags on the start page (which requires knowing all posts’ metadata while generating the main page), then almost all static site generators are capable of delivering what you need. As soon as you want to do fancier stuff, some fall apart sooner, some later.

If I wanted to move my personal blog (this one) away from WordPress to a static page, it would probably be with Hexo.
But for now, I’m firing up Visual Studio on my PC and Xamarin Studio on my Mac to build my website with ASP.NET MVC. That at least also allows me to implement search functionality on the page.

Update: Fixed some of my mistakes. Thanks to Andreas H. for proofreading. 🙂

Ask a Ninja: Current state of static site generators

Over the course of the last weekend I tried to build a website for a side project of mine (gaming related). To broaden my horizon, and to be able to host the website cheaply and fast, I wanted to use a static site generator for it.

First try: Jekyll. TL;DR: Does not work on Windows.

Since Jekyll is directly supported by GitHub Pages, where I wanted to host, and a lot of other people on my Twitter timeline use Jekyll, I thought this was the way to go.

Jekyll does not claim to work under Windows, and it is not officially supported there, so this is no great surprise. There were, however, some tutorials / manuals around that described how to get it running on Windows, which I wanted to follow. I installed Ruby, and right away the first attempt to install the Jekyll gem failed hard. So I could not even follow those instructions, as the installation itself already failed with some strange SSL errors.

That’s when I looked for an alternative. I found Staticgen.com, which lists a couple of projects in this space. This led me to the next step.

Second try: Pretzel. TL;DR: Not stable. Crashes all the time.

As a developer who feels comfortable on the .NET platform, I looked for something Jekyll-compatible that might run on Windows and on Mono on my Mac. Pretzel was the candidate. I installed Pretzel, and it worked with the template that ships with it.

Sadly, as soon as I started to throw in my own HTML, CSS and JS, the Pretzel preview server crashed on at least every second request. So it was: change files, restart Pretzel, preview. Change files, restart Pretzel, preview, and so on. I did this for about half an hour before I became too annoyed to follow this path any further.

At the end of the day, I want to build a website, not debug a tool that generates some HTML for me. So it was time to look for an alternative again. Since there weren’t any other Jekyll-compatible options on .NET, I thought about Node as a platform. I knew that Node.js runs on Windows and my Mac, I had some previous experience with Node, and JavaScript is not that bad.

I did not go to StaticGen this time, because a handful of people on my timeline had also started to use a Node-based static site generator: Wintersmith. So without double-checking the facts and blindly trusting my tech colleagues, I downloaded it.

Third try: Wintersmith. TL;DR: Plugins don’t work and/or are too complicated. Not enough docs.

To be honest, I was totally turned off by the default template engine, Jade. I have a full HTML template, and Jade is nowhere near HTML. Some people may find that cool, but it does not represent the way I want to work on my website.

Gladly, the template system can be changed, and I switched to Handlebars, as I liked its easy and lightweight mixing with HTML. Then the difficulties began.

The HTML template I was going to use requires Bootstrap (which requires jQuery) and FontAwesome; I also used two additional webfonts to match a certain website style, and altogether I ended up with a few JavaScript and CSS files. I wanted them combined and minified, but the most promising Wintersmith plugin, kelvin, had not been ported to Wintersmith 2.

Another plugin problem was wintersmith-tags. First of all, the plugin exposes methods to be called from the template, something Handlebars is not capable of. Luckily the switch to Nunjucks as the templating engine was done more or less quickly and without a lot of effort, but then I noticed that wintersmith-tags would not, despite the documentation stating otherwise, list all available tags when the corresponding method was called. I just got back an empty array.

That the plugins are mostly written in CoffeeScript, which I am not used to reading, does not make it any better. The fact that wintersmith-tags was the only plugin with even rough documentation makes it all the more difficult to work with them. Up to the point where I gave up, I had 11 plugins installed, none working as I intended them to.

Conclusion:

Fact is: I’m totally frustrated by the extremely poor experience I had over the last weekend.

I will give it a last shot with Hexo. It seems more active than Wintersmith, is pure JavaScript (no CoffeeScript, so easier for me to read and understand), and it seems to have a larger installation base, so the chances are higher of getting questions answered by the community. But if that does not work out either, I have already come up with my plan B:

Let go of all the static site generators available out there. I will probably go another route: I am a developer with deep knowledge of .NET. And .NET, especially ASP.NET MVC, Razor, Web.Optimization etc., has all the stuff required to build a great website, right at my fingertips. But I still want simple HTML delivered.

It is likely I’m going to grab ASP.NET vNext and build my own static site generator on top of it. Using the next .NET version, it will also run cross-platform, and I can use it on my Mac too.

This way I can make sure it works the way I want it to, and since the guys using Wintersmith (who, by the way, are not really happy with it either) are also mainly former .NET devs, I probably have a good chance of finding the first users besides myself, too. But I really hope Hexo works out. Doing that much work just to get a simple but modern website online is quite ridiculous. A decade ago it was a lot easier, but expectations have risen since then. A lot.

Update 2015-02-10: I will postpone my exploration of Hexo a little bit. The project is still alive and released its version 3 RC just a few days ago. I can’t figure out whether Hexo supports Nunjucks, but I’m getting in touch with its creator and will find out.

Speaking at Delphi Tage 2014 and EKON 18

Hi, my schedule for this year's conference season is quite short.

I will be speaking at Delphi-Tage in Bonn this weekend (6th of September). My two sessions there are:

  • Continuous Integration: Build- und Testautomatisierung für Ihre Projekte
  • JavaScript für Delphi-Entwickler: TypeScript

Additionally, I will be speaking at EKON 18, November 3rd to 5th in Cologne. There I have three talks:

  • Verteiltes Leid ist halbes Leid: Git in der Praxis
  • Ey, was geht ab? – Metriken und Anwendungsanalyse
  • Software Quality 101

And maybe, just maybe, I will have a short guest appearance at EGX London; if that comes true, I will post about it separately.

Why IIS overwrites your Vary header with ‘Accept-Encoding’

I spent some time today searching for a bug. Turned out it was a nasty one.

If you have a single line of code, being there for a single purpose, you usually expect this code to work.

My case: setting an HTTP header on the response to a normal HTTP request. One would probably think this is a routine task. I wanted to set a header to control caching. More specifically, the ‘Vary’ header, as the result of the request depends on a specific header sent with the request.

Besides the line setting the Vary header, I had two other lines setting the cacheability to public and setting the absolute expiration for that specific response.

As expected, the expiration was set and the cacheability was set to public. But surprisingly, the Vary header was always ‘Accept-Encoding’, ignoring the value I had specifically set. I tried setting the header via the cache class on the response, directly on the response, in my controller, in the EndRequest event on the application… everything failed. I set another header in all the same places; that one always worked. But as soon as I wanted to change or even remove the ‘Vary’ header, I had no chance. It was always sent with the value ‘Accept-Encoding’.

So I searched my whole code base for any tampering with the header. No result either. And then the nasty search began.

To make this post a bit shorter: I discovered that this is a bug in the web server that hits you as soon as you enable dynamic content compression. In that case, IIS overwrites your Vary header with ‘Accept-Encoding’.

Of course it makes sense to *add* that value to the header, because when a gzipped response is cached and returned to a client that does not accept gzip as an encoding, the client would receive a compressed response it cannot decode. What does *not* make sense, though, is to overwrite pre-existing Vary headers, as they are usually set for a reason too 😉

That bug in IIS was reported in August 2012: http://connect.microsoft.com/VisualStudio/feedback/details/758474/iis-gzip-compression-filter-removes-pre-existing-vary-header

It was not until November 2013 that it was fixed: http://support.microsoft.com/kb/2877816/en-us (at least, the binaries for this hotfix were built in November); it seems the KB article and hotfix were officially released just a few days ago.
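If you cannot patch the affected systems right away, a possible workaround is to disable dynamic content compression altogether, at the cost of larger responses. A minimal web.config sketch (relevant part only):

<configuration>
    <system.webServer>
        <urlCompression doDynamicCompression="false" />
    </system.webServer>
</configuration>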

The good: I discovered what the issue was and was able to point my co-workers to the best solution (patching the affected systems).
The bad: I spent almost half a day researching and reproducing the issue until I found the solution.
The ugly: I would never have expected such a massive bug in a product like IIS, which I honestly consider rock-stable as a platform you can easily depend on.

Note to myself: Regular Expressions performance

This post is mostly a reminder for myself, so I don’t lose these important links again. That said, it’s probably interesting for you too, if you care about performance and the .NET behaviour around regular expressions (Regex).

The common things

In the .NET world, some advice is very common. First: you should use a StringBuilder whenever you concatenate strings. Second: if a Regex is slow, use RegexOptions.Compiled to fix it. Well… there are, in fact, reasons for this sort of advice. String concatenation IS slow, for various, commonly known reasons. But a StringBuilder still has some overhead, and there are situations where using it imposes an unwanted cost.

The very same goes for RegexOptions.Compiled, and Jeff Atwood, aka Coding Horror, wrote a very good article about that a few years ago: To compile or not to compile (Jeff Atwood).

In one of the comments another article from MSDN (BCL Blog) is referenced, where the different caching behaviour of Regex in .NET 1.1 vs. .NET 2.0 is explained: Regex Class Caching Changes between .NET Framework 1.1 and .NET Framework 2.0 (Josh Free).

The not-so-common things

There is only a single thing that is true for each and every kind of performance optimization, and it is these two simple words: “It depends.”

With regular expressions, the first thing any performance issue depends on is whether you really need a regular expression for the task at all. Of course, if you really know regular expressions, what they can and can’t do, and what they are the correct tool for, you are very unlikely to run into these kinds of problems. But when you have just learned about the power of regexes (all you have is a hammer), everything starts to look like a string desperately waiting to be matched (everything is a nail). What I want to say is: not everything that could be solved with a regex should be solved with one. Again, I have a link for me and you to keep in your regex link collection: Regular Expressions: Now You Have Two Problems (Jeff Atwood).
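A trivial example of this (in TypeScript here, but the point is language-agnostic): checking for a fixed prefix does not need a regular expression at all.

const url = 'https://example.com/api/users';

// A regular expression does the job...
const viaRegex = /^https:\/\/example\.com\/api\//.test(url);

// ...but a plain string comparison states the intent more clearly
// and avoids pattern compilation and matching altogether.
const viaString = url.startsWith('https://example.com/api/');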

Now, finally, on to the performance optimization links.

There is a good blog article series on the MSDN BCL Blog (like the one above) that goes very deep into how the Regex class performs in different scenarios. You can find them here:

And, besides those, once again a nice article on “catastrophic backtracking” from Jeff: Regex Performance (Jeff Atwood).

One more thing

There are three articles that are not really available anymore: three very good articles from Mike that you can only retrieve from the Wayback Machine. I’m really thinking hard about providing a mirror for these articles on my blog too. But until then, here are the links: