Using VSTS Wildcards for .NET Core projects

On our Thinktecture Research Retreat I had the chance to take a deeper look at Visual Studio Team Services (VSTS), especially at its Build / CI system. Right at the first steps with it I ran into an issue with the VSTS wildcards for .NET Core projects.

The Issue with VSTS Wildcards

I wanted to build and publish NuGet packages for a .NET Core library project after all unit tests passed. So for the VSTS build I chose the .NET Core template, which comes with tasks that call the dotnet CLI to restore, build, test and publish the project.

Since I wanted to create and push NuGet packages instead of publishing the project with the dotnet publish command, I changed the publish task to call dotnet pack instead. The problem with that was that the default project search pattern for the .NET Core CLI commands is **/*.csproj. With that pattern, the build created (and pushed) a NuGet package for every project, including my test project, which is something you usually don’t want to do, because who wants to install a test project with NuGet? 😉

Since that pattern looked like a normal globbing pattern, I tried to simply exclude my test project from it, but sadly the next build did not pick up any of my projects for packing. Having worked quite a bit with gulp tasks previously, I usually know my way around globs, so I found that a bit irritating. I tried several different syntaxes, but none of them worked out.

The journey

First I tried to find some official documentation from Microsoft on the VSTS task wildcards. Sadly I wasn’t able to find any, which was quite discouraging. So I asked Google for some more ideas. In a blog post, Léon Bouqiet explains the TFS 2015 build task wildcard format. Since VSTS and TFS are very similar (in fact, TFS is a version of VSTS you can install on premises), I hoped this would help.

Update 2017-07-22: I now know why I wasn’t able to find the official documentation during my searches. There is quite a disconnect between the terms used on the VSTS UI (“Projects”, “Paths”, “Wildcards”) and the terms used in the documentation (“File matching patterns reference”). I just didn’t come up with the correct search terms, as they were nowhere on the UI.

Strangely enough, none of his search patterns worked either. The syntax described in his post reminded me a bit of the file patterns used in the JetBrains TeamCity server. Since I know my way around TeamCity build configs quite well, I tried multiple variations of each scheme, with and without variables for the full path. I tried them separated by semicolons or by new lines, in different orders etc., but none of them yielded the expected result. Every version I tried led to one of two possible results: either none or all of my projects were packaged.

At this stage I was quite frustrated. Luckily my boss knows someone who knows someone, and so we could direct the question to some people who know quite a bit more about VSTS than what was available in online resources.

The very first answer that came back was a big hint in the right direction.

The rabbit hole

This response pointed me to the fact that the VSTS tasks are open source on GitHub. Which, by the way, is awesomesauce. He linked me to the repository of the .NET Core CLI tasks. From there I searched my way through the project’s dependencies to where the actual file matching is implemented. Working through the code, I found out what the pattern should look like.

Using VSTS Wildcards for .NET Core projects

First and foremost, there are currently at least two different versions of wildcard file matching in VSTS. The one explained in the aforementioned blog post is the ‘legacy’ version and is no longer widely supported. Besides that, every task is a custom extension, so it is possible that other tasks use different globbing libraries and bring their own pattern matching variations with them.

To give the answer away, this is the pattern that includes the projects, but excludes all test projects:
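A sketch of such a pattern, assuming your test projects contain “Test” in their name (adjust the name part to your own naming scheme):

```
**/*.csproj
!**/*Test*.csproj
```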


Despite what other sites may say, for the .NET Core CLI tasks the patterns need to be separated by new lines, not by semicolons. Also, the exclusion operator is the exclamation mark, not the -: operator.


Astonishingly, this is a very simple, straightforward pattern. However, given previous experience with other globbing syntaxes, and with pretty much all online references pointing to the ‘legacy’ pattern that is similar to TeamCity’s, it really isn’t the intuitive way.

Getting used to VSTS wildcards in build tasks, especially for .NET Core projects, takes a bit of time when you are used to other globbing syntaxes. It would have helped a lot if the VSTS UI had provided a link to the most recent documentation. However, having the source code of the VSTS wildcard pattern evaluation in the build tasks at hand is an invaluable resource.

By the way, after I figured all that out, another response to our question turned up with a link to the official documentation, which verified that my findings were correct. It seems my Google-fu wasn’t so strong in the first place.

Update 2017-07-22:

The team at Microsoft got in touch with me after this post, which is pretty awesome! They indeed want to add a link to the official documentation page right from the info box. This will help a lot, because it removes the need to search for help on the patterns in the first place.

My developer’s toolbelt 2016

I caught a tweet a few days ago asking about your developer’s toolbelt, specifically on Windows. I gave a very short answer and mentioned I would blog about this:

So, this is a more elaborate answer to the developer’s toolbelt question. My default Windows developer installation contains the following:

  • Windows 10 Professional (fully updated)
    • .NET Frameworks active
    • IIS Installed
    • Dev-Mode enabled
    • Linux Subsystem installed
    • Windows Defender as Antivirus solution & default Firewall
      (no external security software)
  • Dropbox
  • 1Password
  • IE, Edge, Chrome, Firefox (for testing; yes, I do quite a bit of web dev 🙂 )
  • Git for Windows for commandline usage
  • SourceTree as my graphical git client*
  • Beyond Compare as my diff tool
  • Cmder as my console of choice
    (my previous blog post is about using the Linux bash on Windows in Cmder)
  • Node Version Manager nvm for Windows, and as such a lot of node versions
  • Primary IDE: Visual Studio 2015 (with ReSharper Ultimate)
  • Secondary IDE: WebStorm
  • .NET Sourcepad: LinqPad
  • Primary Database: SQL Server 2016 Express
    other DBMS as required by projects.
  • dbForge Schema compare and dbForge Data compare
  • Office 365
    • OneNote for collaboration
    • PowerPoint for presentations
    • Outlook for e-mail comms
    • Word for occasional paperwork
  • Slack and TeamViewer for other comms / collab
  • VSCode as my main text editor
  • Atom as secondary text editor (e.g. for large Markdown files, where VSCode crashes)
  • Android Studio
  • Genymotion Android emulator (quite a bit faster than the normal one)

In the list above, except for ReSharper, I am not listing additional add-ons / extensions to the listed tools.

* – I also tried GitKraken, Tower for Windows and the GitHub client, but they are, in my opinion, not as usable as SourceTree. Especially Tower wastes too much screen real estate.

Running Windows 10 Ubuntu Bash in Cmder

“Can you run Bash in Cmder?” – In the comments of my last post (install and run (Oh-my-) zsh on Bash on Ubuntu on Windows), I was asked whether it would be possible to also run Bash (or Zsh) in Cmder. First I thought it was not possible, but then I got curious. After digging in a bit more, it turned out that it IS, in fact, possible. And it’s not difficult either.

So, since I figured out how it works, I also want to show you how you can run the Windows 10 Ubuntu Bash (and/or Zsh) in Cmder.

What is Cmder?

Cmder is a console emulator for Windows. It has been my preferred way to use the Windows console (cmd.exe) for the last few years, as it allows me to use *NIX commands like ls, less, grep and the like. For me, Cmder is a much nicer overall experience on the command line in Windows, and it makes me much more productive.

Screenshot of Cmder console

Cmder allows me to open multiple tabs and multiple shells at once. I can open a normal cmd.exe shell, a second one that also executes the VsDevCmd.bat file to provide access to msbuild, csc etc., a third one with PowerShell and, if set up correctly, also one with Bash and/or Zsh.


Actually, you just need Bash on Ubuntu on Windows enabled and, of course, Cmder. If you don’t have that, you can simply follow these instructions:

Set up Bash in Cmder

First, in Cmder, press Win + Alt + T to open the settings dialog for Tasks. Alternatively, you can open the hamburger menu at the bottom right of the window and navigate to Settings -> Startup -> Tasks.

Step 1: Create a new task by clicking the ‘+‘ button at the bottom and enter the details.

Setup Bash in Cmder
Steps for setting up Bash in Cmder

Step 2: The first input field of the dialog is the task name. I named it ‘bash::ubuntu‘, but the naming is completely up to you. You can use double colons for grouping, so this would be the ‘Ubuntu‘ task in the ‘Bash‘ group. Cmder already comes with a ‘Bash‘ group containing entries for Bash on mintty (using Cygwin) and another one based on git-for-windows. To distinguish the ‘real’ Ubuntu thing from the other Bashes, I simply chose to opt into this naming scheme as well.

Step 3: In the “Task parameters” input you can configure an icon (I just picked the Ubuntu Bash icon): /icon "%USERPROFILE%\AppData\Local\lxss\bash.ico"

Step 4: In the “Commands” input field, you enter the command that this task should start. This is the actual call to Bash: %windir%\system32\bash.exe ~ -cur_console:p

This will start bash.exe in the current user’s home directory (~) and also set the cursor console mode of ConEmu, which works behind the scenes in Cmder, to allow for correct cursor movement with the arrow keys.
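Putting the three steps together, the whole task definition boils down to these fields:

```
Task name:       bash::ubuntu
Task parameters: /icon "%USERPROFILE%\AppData\Local\lxss\bash.ico"
Commands:        %windir%\system32\bash.exe ~ -cur_console:p
```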

You can find further details on how to set up Tasks in Cmder (actually, in ConEmu) in the ConEmu documentation about tasks.

This task will now start the Bash on Ubuntu on Windows within Cmder, with all the settings you made in your .bashrc file.

Set up Zsh in Cmder

If you did not set up your Bash to automatically launch Zsh from your .bashrc file like I showed in the other blog post, you can add another task for this.

I called this new task ‘zsh::ubuntu’, but again the naming is up to you. I used the same task parameters as for the Bash task and just added -c zsh to the command entry. This will cause bash.exe to start Zsh automatically.

The full line is: %windir%\system32\bash.exe ~ -c zsh -cur_console:p

How to install and run (Oh-My-) zsh on Windows

I run zsh on Windows. But why? Some time ago, when I was still using a Mac, one of my colleagues suggested using zsh instead of bash.

Since then I have switched to a Surface Book, which I happily prefer over macOS, and mainly use Cmder as my shell. Now the Windows 10 Anniversary Update is out, and it comes with “Bash on Ubuntu on Windows“.

Now, with bash at my fingertips again, my colleague’s suggestion came back to my mind, and I tried zsh again.

Installation of zsh on Windows

Installation of zsh on Bash on Ubuntu on Windows is as easy as installing it on plain Ubuntu. You start bash and issue the following command:

sudo apt-get install zsh

This will use the default package manager apt of the Ubuntu that runs on the Linux Subsystem on Windows to install the zsh package.

You can directly try it out by simply calling zsh from your shell to open a new zsh from Bash.

Making zsh on Windows your default shell

I wanted zsh to start directly when I open Bash on Ubuntu on Windows, because I am too lazy to always launch it manually. To do so, I added the following little snippet at the very beginning of the ~/.bashrc file:

# if running in a terminal...
if test -t 1; then
  # ...start zsh
  exec zsh
fi

See it here in context:

Image of changes in .bashrc in context
Changes in .bashrc file

When Bash starts up, it checks whether a terminal is attached to stdout (the test -t 1 part) and, if so, executes zsh. You can try it out directly by quitting Bash, restarting Bash on Ubuntu on Windows and watching it launch zsh right away.
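To get a feeling for what test -t does, here is a small sketch you can run in any shell; it branches on whether file descriptor 1 (stdout) is attached to a terminal:

```shell
# test -t FD exits with status 0 when file descriptor FD
# is attached to a terminal, and non-zero otherwise.
if test -t 1; then
  echo "stdout is a terminal"
else
  echo "stdout is redirected or piped"
fi
```

Running the snippet interactively takes the first branch; running it with output redirected (e.g. into a file) takes the second, which is exactly why the .bashrc guard only starts zsh for interactive sessions.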


A plain zsh is quite boring, and there is a ton of useful stuff for zsh to leverage, so customization is key. A well-known repository of zsh customizations with nice defaults is Oh-My-Zsh, which brings a cornucopia of themes, plugins and features with it. Installation is, again, fairly easy. From your freshly installed and started zsh, you just issue the command that is shown on the Oh-My-Zsh website:

sh -c "$(curl -fsSL <URL of the install script, as shown on the Oh-My-Zsh website>)"

After that, you can configure your plugins (I use git and ubuntu) and themes (I use this custom one).
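For reference, both settings live in your ~/.zshrc. A minimal sketch, with an example theme name (substitute your own):

```shell
# excerpt from ~/.zshrc (the theme name is only an example)
ZSH_THEME="agnoster"
plugins=(git ubuntu)
```

After editing, restart zsh (or run source ~/.zshrc) to pick up the changes.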

Update 2016/11/16: Be aware that this theme also requires ‘git’ to be installed to display branch information, so you should do a sudo apt-get install git if you did not already.

Zsh is a bit theme-happy, so you will find more than 100 themes in the default installation. To help a bit, there are some screenshots in the zsh wiki. Please be aware that unicode characters in zsh in Bash on Ubuntu on Windows aren’t really supported yet, so some themes may not be for you.

So, after a bit of customization, you can start to enjoy the features of zsh.

An example of globbing, in zsh on Bash on Ubuntu on Windows
An example of globbing, in zsh on Bash on Ubuntu on Windows

Further reading

If you are more interested in Mac and macOS, here is a great post from one of my co-workers: Thorsten Hans: Setting up iterm2 with oh-my-zsh and powerline on OS X.

Also a great post on the features of zsh (including globbing!) is this one: David Fendrich: No, Really. Use Zsh.

And if you seek a far deeper experience, then you can dive into this: Jerome Dalbert: Migrate from oh-my-zsh to prezto.

Update 2016/11/16:

Fixed in article: OS X is now macOS.

In response to Michal’s question (thanks!), I blogged about how you can run Bash (or Zsh) in Cmder.

Microsoft Edge – First impressions

After the upgrade to Windows 10, I started using Microsoft Edge as my main browser. Now, just a few days later, I can tell you about my first impressions.

For me, Edge seems like a lean browser, but it still feels very much beta.

Some things I already complain about: when you drag a tab out of the main window to put it on the desktop as a separate window, and later want to drag that separate window back into the main window, the second window automatically opens a new empty tab and stays open.

When using my web mail account, I cannot use drag & drop there to add new attachments from an explorer window. Also, the normal file upload dialog does not work there (it does here on my blog, though).

Also, on certain pages like Twitter and especially JabbR, the tabs tend to hang and block and I need to reload the page often.

Another huge issue for me is that there is no adblock extension for Edge so far. I am so used to using an adblocker, as it really makes my browsing experience so much better, that I was really disturbed by the awful lot of ads I now see again.

All in all, there are some problems with Edge that need to be fixed, and some extensions need to be provided soon. Still, Edge seems promising, provided the issues are resolved quickly. At the current stage, I really see Edge as a beta and not as a production-ready browser.

Static site gens the 2nd: Hexo and Sandra.Snow

In my recent blog post I wrote about my experiences so far with static site gens in general. I said I was looking into Hexo before I go on with my plan B and this is what I did.

Hexo is very capable. If you really just want a blog, then this is the way to go (imho). The main problem with Hexo is that it is a one-man-show from China, and that its author is currently in the middle of releasing Hexo 3.0. That is not a bad thing in itself, but several plugins have not yet been updated, which makes it very hard to get things running. Also, some plugins, like the sitemap plugin that should generate a sitemap.xml, do not have access to all entries for the tags and categories. I could probably write my own, but while the API is somewhat documented, I never got around to configuring my WebStorm IDE so that it actually provides code completion on the Hexo API, which makes everything very tedious.

That said, among all static site generators Hexo is by far the most powerful one, and definitely worth a look, as it is plain JavaScript, runs on Node and is very unproblematic to install both on Windows and on OS X.

Before I went on with my plan B I also took a quick look at Sandra.Snow. Sandra.Snow is a .NET based static site gen. I looked at it and its source and also talked to its creator on JabbR, but I did not really dig deeper into it. Again, the problem was that it is entirely intended to be a blog platform. Doing more sophisticated website stuff with it is not really supported and seems very hard to do.

So, from my trip to the world of static site gens I am back with a finding: if you want a simple blog, concentrate on your articles, are fine with a predefined template you don’t want to change much, and don’t need more sophisticated stuff like menus for all of your categories and tags on the start page (which requires knowing all posts’ metadata while generating the main page), then almost all static site gens can deliver what you need. As soon as you want to do fancier stuff, some fall apart sooner, some later.

If I wanted to move my personal blog (this one) away from WordPress to a static page, it probably would be Hexo.
But for now, I’m firing up Visual Studio on my PC and Xamarin Studio on my Mac and building my website with ASP.NET MVC. That at least also allows me to implement search functionality on the page.

Update: Fixed some of my mistakes. Thanks to Andreas H. for proofreading. 🙂

Ask a Ninja: Current state of static site generators

Over the course of the last weekend I tried to build a website for a side project of mine (gaming related). To broaden my horizon, and to be able to host the website cheaply and fast, I wanted to use a static site generator for it.

First try: Jekyll. TL;DR: Does not work on Windows.

Since Jekyll is directly supported by GitHub Pages, where I wanted to host, and a lot of other people on my Twitter timeline use Jekyll, I thought this was the way to go.

Jekyll does not state that it works under Windows. It is also not officially supported there, so this is not a great surprise. There were, however, some tutorials / manuals around that described how to get it running on Windows, which I wanted to follow. I installed Ruby, and right away the first attempt to install the Jekyll gem failed hard. So I could not even follow those instructions, as the installation already failed with some strange SSL errors.

That’s when I looked for an alternative. I found a page that lists a couple of projects related to this topic. This led me to the next step.

Second try: Pretzel. TL;DR: Not stable. Crashes all the time.

As a developer who feels comfortable on the .NET platform, I looked for something compatible with Jekyll that, maybe, would run on Windows and on Mono on my Mac. Pretzel was the candidate. I installed Pretzel, and it worked with the template that ships with it.

Sadly, as soon as I started to throw in my own HTML, CSS and JS, the Pretzel preview server crashed at least on every second request. So it was: change files, restart Pretzel, preview. Change files, restart Pretzel, preview, and so on. I did this for about half an hour before I became too annoyed to follow this path any further.

At the end of the day, I want to build a website, not debug a tool that generates some HTML for me. So it was time to look for an alternative again. Since there weren’t any other Jekyll-compatible alternatives on .NET, I thought about Node as a platform. I knew that Node.js runs on Windows and on my Mac, I had some previous experience with Node, and JavaScript is not that bad.

I did not go to StaticGen this time, because a handful of people on my timeline had also started to use a Node based static site generator: Wintersmith. So without double-checking the facts, and blindly trusting my tech colleagues, I downloaded it.

Third try: Wintersmith. TL;DR: Plugins don’t work and/or are too complicated. Not enough docs.

To be honest, I was totally turned off by the default template engine Jade. I have a full HTML template, and Jade is nowhere near HTML. Some people may find that cool, but it does not represent the way I want to work with my website.

Gladly, the template system can be changed, and I switched to Handlebars, as I liked its easy and lightweight mixing with HTML. Then the difficulties began.

The HTML template I was going to use requires Bootstrap (which requires jQuery) and FontAwesome; I used two additional webfonts to match a certain website style, and altogether I ended up with a few JavaScript and CSS files. I wanted them combined and minified, but the most promising Wintersmith plugin, kelvin, was not ported to Wintersmith 2.

Another plugin problem was wintersmith-tags. First of all, the plugin exposes methods to be called in the template, something Handlebars is not capable of. Luckily, the switch to Nunjucks as the templating engine was done quickly and more or less without effort, but then I noticed that wintersmith-tags would not, despite the documentation stating otherwise, list all available tags when calling the corresponding method. I just got back an empty array.

That the plugins are mostly written in CoffeeScript, which I am not used to reading, does not make it any better. The fact that wintersmith-tags was the only plugin with even rough documentation makes it all the more difficult to work with. Up to the point where I gave up, I had 11 plugins installed, none working as I intended.


Fact is: I’m totally frustrated about the extremely poor experience I had over the last weekend.

I will give it a last shot with Hexo. It seems more active than Wintersmith, is pure JavaScript (no CoffeeScript, so easier for me to read and understand), and it seems to have a larger installation base, so the chances are higher to get questions answered by the community. But if that does not work out either, I have already come up with my plan B:

Let go of all these static site gens available out there. I’m probably going another route: I am a developer with deep knowledge in .NET. And .NET, especially ASP.NET MVC, Razor, Web.Optimization etc., has all the stuff required to build a great website, right at my fingertips. But I still want simple HTML delivered.

It is likely I’m going to grab ASP.NET vNext and build my own static site gen on top of it. Using the next .NET version, it will also run cross-platform and I can use it on my Mac too.

This way I can make sure it works as I want it to, and since the guys using Wintersmith (who, by the way, are not really happy with it either) are also mainly former .NET devs, I probably have a good chance to get the first users besides me, too. But I really hope Hexo is going to work out. Doing that much work just to get a simple but modern website up is quite ridiculous. A decade ago it was a lot easier, but expectations have risen since then. A lot.

Update 2015-02-10: I will postpone my exploration of Hexo a little bit. The project is still alive and just a few days ago released its version 3 RC. I can’t figure out whether Hexo supports Nunjucks, but I’m getting in touch with its creator and will find out.

Note to myself: Regular Expressions performance

This post is mostly a reminder for myself, to not lose these important links again. That said, it’s probably interesting for you too, if you care about performance and the .NET behaviour around regular expressions (Regex).

The common things

In the .NET world, some things are very common. First: you are advised to use a StringBuilder whenever you concatenate some strings. Second: if a Regex is slow, use RegexOptions.Compiled to fix it. Well… there are, in fact, reasons for this sort of advice. String concatenation IS slow, for various, commonly known reasons. But a StringBuilder still has some overhead, and there are situations where using it imposes an unwanted cost.

The very same goes for RegexOptions.Compiled, and Jeff Atwood, aka Coding Horror, wrote a very good article about that a few years ago: To compile or not to compile (Jeff Atwood).

In one of the comments another article from MSDN (BCL Blog) is referenced, where the different caching behaviour of Regex in .NET 1.1 vs. .NET 2.0 is explained: Regex Class Caching Changes between .NET Framework 1.1 and .NET Framework 2.0 (Josh Free).

The not-so-common things

There is only a single thing that is true for each and every kind of performance optimization, and it’s these simple two words: “It depends.”

With regular expressions, the first thing any performance issue depends on is whether you really need a regular expression for the task. Of course, if you really know regular expressions, what they can and can’t do, and what they are the correct tool for, you are very unlikely to run into those kinds of problems. But when you just learned about the power of Regexes (all you have is a hammer), everything starts to look like a string desperately waiting to get matched (everything is a nail). What I want to say is: not everything that could be solved with a Regex also should be solved by one. Again, I have a link for me and you to keep in your Regex link collection: Regular Expressions: Now You Have Two Problems (Jeff Atwood).

Now, finally to performance optimization links.

There is a good blog article series on the MSDN BCL Blog (like the one above) that goes very deep into how the Regex class performs in different scenarios. You find them here:

And, besides those, once again a nice article on “catastrophic backtracking” from Jeff: Regex Performance (Jeff Atwood).

One more thing

There are three articles that are not really available anymore. Three very good articles from Mike, which you can only retrieve from the Wayback Machine. I’m really thinking hard about providing a mirror for these articles on my blog too. But until then, here are the links:

Ask a Ninja: Is the “Googlevelopment” approach bad?

I stumbled upon a recent and very interesting blog post from Rick Strahl: “The Search Engine Developer“. Rick in turn was motivated by a post from Scott Hanselman who asked “Am I really a developer or just a good googler?“.

That inspired me to write this post, too. Mostly because this topic has to do a lot with self-improvement, learning and attitude.

What is it that Ninja calls Googlevelopment?

We all know it: if we encounter something in our job we don’t know, it is very tempting to throw some related words into the search engine of your choice and sift through the results. If there’s a link to StackOverflow or to a blog from certain people you know (from conferences, book authors, via Twitter, from others who pointed you to them earlier), these are your first stops. You don’t even look further, because your problem is most probably solved. You copy the code, paste it into your solution, make some adjustments, test it, and don’t think any further. There’s no problem anymore.

To the point

Scott only has so many keystrokes left, and because of that didn’t give a broad explanation of WHY he holds the opinion he wrote down. Well, what you read in his post isn’t even an opinion, but a call to action: try to stop googlevelopment and do it the old-fashioned way. Write it yourself, go to user groups, learn, do code katas etc. One can easily guess that Scott thinks googlevelopment is a bad Hobbit habit, and you shouldn’t do it.

Rick, instead, was a bit more chatty. He mentioned that it “feels like [he’s] become a slacker at best and a plagiarizer at worst” sometimes. He summed up his experience, from back in the days when there simply was no publicly available Internet (no chance to copy your code), through the 90ies (some online manuals, discussion forums), through the millennium when blogs started to spread, and up to now, when collaborative sites like StackExchange are flourishing.

Using libraries, for Rick, is “standing on the shoulders of giants“, and copying and adapting code from the intertubes gives him a rough feeling for the interior of a library, enough to use it the right way, but not too deep, because (his example was a QR code library) that’s not his actual problem domain.

He said, and he is totally right on that matter, that there is no need to re-invent the wheels others invented previously. And then, there’s this bummer: “It’s fun to wax nostalgic about ‘back in the day’ stories, but I don’t miss those days one bit. I don’t think anybody does…“

Not missing the old days?

Rick, honestly? You don’t miss these days? I think this ‘back in the day’ time was the time that made you the developer you are today. Those were the days that made you learn your stuff.

Today’s younger developers, who didn’t go through this (more or less hard) school of learning by themselves, trying things, failing, learning from their failures, inspecting stuff, who JUST started as ‘search engine developers’ or googlevelopers, can’t really leverage the information they find on the net. Fiddling around with your platform, with your compiler, with sample sources (if any), with documentation first and foremost teaches you how to learn.

Rick then goes on to describe that, because there are so many things out there, it can happen that you have a great idea and want to act on it, then find finished implementations (even if not really ‘good’ ones) and just stick with them. Even if those implementations would deserve a new competing library from exactly you, because you could do better. But you left it alone.

Making this decision, re-implement or stick with a not-so-good solution, is of course mainly driven by time / effort / money, but also by an educated analysis of the risks and chances, and of the technical debt you take on when using a not-so-good solution. You also need to be educated enough to estimate whether a re-invention would benefit your solution (and maybe others’ too, talking about open source).

You can’t, however, get that evaluation right if you haven’t learned the implications of doing it yourself vs. using existing stuff, and you can’t learn those implications without having done a lot by yourself previously.

Ask a Ninja: Is Googlevelopment bad?

So, now it’s time for my personal opinion on that topic.

I already mentioned that I don’t share Rick’s point of view. I think it’s sad that he does not miss the old days. I started developing software very early. I got my first PC when I was 9, and two or three years later just ‘using’ it got boring. At 14 I wrote the first program that I sold, to a doctor, to manage lists of his patients. The German health insurance card had just become available, and there were keyboards with an integrated card reader that would just ‘type’ in all the data from the card when it was inserted.

My program just stored the data in a flat file (I didn’t know that the format I chose was already known as CSV), and I had to invent and implement my own sorting algorithm. If I remember correctly, I did something like an insertion sort. I figured out ways to send the data to a printer when requested. And I spent a lot of time formatting the outputs of my program to look nice and present them in a beautiful way to its users (mostly young women who worked there and whom I tried to impress back then, hell I was 14 🙂 ). So, I figured all that out. It took long. I learned a lot. And it was fun.

I’d love to learn new stuff all day: fiddling with things, trying to get them done by myself. I really miss that a lot. Sadly, in today’s businesses this isn’t possible anymore. There’s just a tiny window of time available for that kind of stuff.


Finally, Rick comes to this conclusion: “Search engines are another powerful tool in our arsenal and we can and should let them help us do our job and make that job easier. But at the same time we shouldn’t let them lull us into a false sense of security – into a sense of thinking that all we need is information at our fingertips.”

Having all that information at our fingertips empowers us to build great stuff. It is information we can learn from. And we have the power to decide NOT to use it. Rick linked to an older article from Scott: We need to Sharpen the Saw – the saw being ourselves – on a regular basis.

We should try to develop – not googlevelop – more stuff by ourselves. This strengthens our skills and makes us more sensitive to when we have to use stuff others made. We need to find the right balance between “standing on the shoulders of giants” and trying to be the giant. This fine line, where you’re in balance, is a moving target, though:

  • Young Padawan, more fiddling on your own, not using the force you must.
  • Knight, use the force and learn from it.
  • Master Jedi, more fiddling on your own again you should.

This is my idea. Well, not really mine – I just adopted it. With some friends I share a maxim. In German: “Schau in Dich. Schau um Dich. Schau über Dich.” It stands for three steps of learning:

  1. “Schau in Dich.” – Look inside you. This is about self-awareness. You should learn about yourself, about how you learn.
  2. “Schau um Dich.” – Look around you. This is about learning not from yourself, but from others. It is also about learning what your influence on others is, but that would go too far at this stage.
  3. “Schau über Dich.” – Look beyond you. Okay, that is a very loose translation. The aim of this part is to make you think about things on a larger scale, and to push the limits.

This is also what the learning of an apprentice, journeyman and master craftsman was like back in the old days. The apprentice learns how to learn. The journeyman travels around, which enables him to learn more from others of his craft. The master then knows the basics of his craft, but he also keeps trying to improve his skills in order to be able to compete. Masters could also leverage their skills to try out really new stuff on their own – and succeed with it. Masters usually were also eligible to join the guild, where there was a lot of exchange between them – also about new stuff they had discovered.

There is a chance that this system, practiced for centuries back then, had some truth to it. And we software developers, engineers or craftsmen could (and should) try to map it to our daily learning again.

Bottom line

Well. This is just a line. At the bottom. For no reason. 🙂

Why FireMonkey is wrong, the second

I just stumbled upon a really, really great post on Steven Sinofsky’s blog.

His article is about the challenges of cross-platform development in general, and he brings up some rather good points on why some approaches will eventually fail.

I’m pretty sure he doesn’t even know about FireMonkey, but this is what he has to say about cross-platform libraries in general:

One of the most common approaches developers attempt (and often provided by third parties as well) is to develop or use a library that abstracts away platform differences or claims to map a unique “meta API” to multiple platforms. These cross-platform libraries are conceptually attractive but practically unworkable over time. Again, early on this can work. Over time the platform divergence is real.

And then he continues:

Worse, as an app developer you end up relying on essentially a “shadow” OS provided by a team that has a fraction of the resources for updates, tooling, documentation, ..
It is important to keep in mind that the platforms are evolving rapidly and the customer desire for well-integrated apps (not just apps that run).

The very last point is what I already stated in my own post on why FireMonkey is wrong. I didn’t even write about the even more important earlier ones. And these are only quotes from a single paragraph in which he thinks about cross-platform libraries.

I strongly suggest you take a few minutes and read what Sinofsky wrote about cross-platform development. And then, if you currently feel that FireMonkey could be the right tool for you, try to understand his points and re-think your position on cross-platform tooling. I’m sure you will see that FireMonkey can’t be the right tool for you – or anybody else.