Setting up my infrastructure – Part 5: Additional tools, server and hosting

In this post I’m going to mention all the other necessary stuff for a project like mine.

Preamble: This is a spare-time, private project, and as such I don’t want to spend too much money on it. I also don’t (yet?) know how long it will take, so I want to avoid ongoing costs, especially subscriptions for services.

So, where to start? I think source control is the most important thing for a software project, so let’s go.

Source control

I chose Git. In the first infrastructure post I already mentioned some of my versioning tooling (which in fact has already changed since then). I have a lot of experience with SVN, not yet so much with Git, but as I already mentioned, from an adoption and acceptance point of view Git seems to be the new mainstream source control tool. It is powerful, it is cross-platform, and GUI client support is growing. My other alternative would be Mercurial (Hg), but despite its better Windows GUI clients, adoption is not that good, and I want to be able to ask questions on StackOverflow and get help quickly.

So, I already said I was using Bitbucket from Atlassian for hosting free private repositories. By now, this is only partially correct. I decided to self-host my repositories and use Bitbucket as an additional off-site backup. Why is that? I don’t want to be fully dependent on a single point of failure (Bitbucket). They host in the cloud, and we all saw that the big cloud players like Amazon with EC2 and Microsoft with Azure can encounter large-scale problems. Even if Atlassian takes all precautionary measures to keep their service available – which is probably not a top priority, given that a lot of people are only using the free stuff – something really stupid like an expired certificate on the cloud side could render the service unavailable for hours or even days.

My idea is the following: I mainly work on my self-hosted repository. Whenever my build server has a new successful build, it will automatically push that to the Bitbucket repo. This way I have a repo copy on my dev machine, Bitbucket with the latest fully working state (since I commit and push often, that should not be too far away from my local copy) and of course my self-hosted repo. That should be enough safety in case something happens to my notebook, my server or Atlassian.
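
A build step along these lines could do that push – just a sketch, with the remote name and repository URL as placeholders:

    # one-time setup on the build server: register Bitbucket as a second remote
    git remote add bitbucket git@bitbucket.org:myuser/mypetproject.git

    # after every successful build: mirror all branches and tags to the backup repo
    git push --mirror bitbucket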

Speaking of Atlassian, they have this great Git client SourceTree for Mac. They recently announced via Twitter that they are opening up a beta test for SourceTree for Windows. Guess what? I signed up 😉

You see, I use Bitbucket from them, I use SourceTree on the Mac from them, and I’m eager to get experience with their SourceTree for Windows. Atlassian is very present in my Git-centric versioning environment, which is why I also started to use their product Stash. Stash is essentially Bitbucket on my own server: I can create repositories, manage permissions (okay, currently I’m the only user) and have it manage my branches automatically. And it is very cheap at 10 USD per year for up to 10 users. So if my project succeeds and I grow my development team beyond 10, I will surely have the money to upgrade.

Source quality

Now that you know the tooling I use to store my sources and manage them on my server and my development machine, I want to introduce another tool I bought and installed, even if its usefulness is (currently) questionable. I bought FishEye and Crucible from Atlassian too. At 10 USD each it was not a real investment, and I feel that FishEye lets me keep control over my code more easily. It allows fast searching through all the project code (in 5 repositories for 10 users) and lets me browse the history of my code in a convenient way. Crucible as a code review tool is probably not of much use for a one-man show, but perhaps later on somebody wants to join my efforts with this project and potentially participate in revenues, if it becomes successful. Crucible is the only tool where the 10 USD covers only 5 instead of 10 users.

Hosting

For a long time I had a hosted Linux root server (dune) at Strato for 49 € / month. It used to host my email server (I completely switched to Gmail for my domain a few years ago), my first blogs, some home pages and discussion forums for the guilds from when I was still playing. Besides that I had a very small Windows server at 1&1 (smarthost), which I got for 14 € / month as a special offer during my studies. But it was not powerful enough to replace all services on dune.

As I already posted, this blog (and almost everything else hosted on dune and smarthost) has now moved to Gallifrey. Gallifrey is a big Windows Server 2012 ‘Level 4’ V-Server at Strato, with 4 virtual CPU cores, 4 GB of RAM and a 250 GB HDD. Enough power to host those little websites, my blog and my complete build environment. I ordered Gallifrey when there was a 6-months-free offer, and it costs 29 € / month. So I canceled dune and smarthost, which will in fact save me about 34 € / month while at the same time offering more power.

Backup

As already mentioned, my sources will be automatically backed up to Bitbucket. By now, I have also put the sources of this blog and all other homepages into Git repositories, which are automatically backed up the same way. All databases are dumped on a regular basis and copied to both my home server and a cloud storage. The same goes for the working directories with config files and changing contents: they are copied to a backup location, zipped and transferred together with the database dumps. All that is triggered by a scheduled task on the v-server.
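
For illustration, the scheduled task could run a PowerShell script roughly like the following. This is a minimal sketch: all paths, credentials and the backup target are placeholders, and it assumes mysqldump (shipped with MariaDB) and 7-Zip are installed.

    # nightly-backup.ps1 -- triggered by a scheduled task on the v-server
    $stamp = Get-Date -Format "yyyy-MM-dd"
    $work  = "D:\Backup\$stamp"
    New-Item -ItemType Directory -Path $work -Force | Out-Null

    # dump all MariaDB databases into the staging folder
    & "C:\Program Files\MariaDB\bin\mysqldump.exe" -u backup -pSECRET --all-databases > "$work\databases.sql"

    # copy the working directories with config files and changing contents
    Copy-Item "C:\inetpub\sites" -Destination $work -Recurse

    # zip dump and files together, then ship the archive off-site
    & "C:\Program Files\7-Zip\7z.exe" a "$work.zip" $work
    Copy-Item "$work.zip" "\\homeserver\backups\"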

Summary

So the toolset for my pet project is right now:

Update: Fixed some typos. Thanks Manuel 🙂

Continue with the next part, or see the other parts in this series:

[contentblock id=infrastructurelinks]

Setting up my infrastructure – Part 4: Build server requirements

Okay, after the short delay I want to continue with my pre-thoughts for the tooling evaluation for my pet project. I already mentioned my requirements for a task and bug tracking tool to coordinate my work and keep me on track.

Now the second important thing is to stay in control of the code I’m going to produce, driven by those tasks. For that I want a good suite of unit and integration tests. The next logical step is having the tests run at every checkin, so a CI / build server is required. This also automatically opens up the possibility to deploy certain parts of my project to test, staging and eventually production environments. So this is something I totally want to go for.

My personal requirements for a CI / build server:

  • Work fine with .NET environments
    I want to write something for .NET developers. As such, my pet project is a .NET project itself. I like to use the things that come in the same box as .NET (MSBuild, MSTest), and whatever tool I choose should support them out of the box (see the command sketch after this list).
  • Work fine with Git
    Okay, that probably is a no-brainer. Git is the new SVN (just talking about adoption here, please don’t kill me for that comparison), and I assume all tools out there support Git to some degree.
  • Can build branches automagically
    That probably comes with ‘fine’ Git support. Whenever I create a new branch and push it to the repo the CI server builds from, this branch should automatically be built too. This way I know for sure everything is working before merging stuff.
  • Easy to setup
    I actually want to work on my project and not spend all my spare time getting my infrastructure up and running.
  • Integrate with my bug tracker
    As already mentioned in the previous post, a two-way integration of CI server and task tracking would be extremely cool, but is not an absolute must-have.
  • Allow extensions with reports easily
    I’m thinking about code coverage analysis, or running FxCop and/or StyleCop on the build server and having their reports displayed directly with the build report. Not from the very beginning, but such things should be possible.
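
To make the .NET requirement concrete, this is roughly what the CI server would have to run on every checkin (a hypothetical sketch; solution and assembly names are made up):

    # hypothetical build-and-test step executed by the CI server
    msbuild MyPetProject.sln /t:Rebuild /p:Configuration=Release
    mstest /testcontainer:MyPetProject.Tests\bin\Release\MyPetProject.Tests.dll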

So, that’s pretty much it for the CI / build server.

The next post will shed some light on the darker areas of the infrastructure, like where to host my Git repositories and what additional tooling may be nice when working with the code. This will also raise some questions about hosting in general as well as setup and tooling on the server, which may affect the tools that I’m going to evaluate.

Continue with the next part, or see the other parts in this series:

[contentblock id=infrastructurelinks]

Some delay with my pet project

Just a short update about my pet project and the evaluation posts:

I’m a little bit late with my project. The reason is that I had some high-load periods at my job and had to finish an article for a magazine.

Now with the high load managed and the article written, I’m fully committed to my project again and will continue with my tool evaluation. Be prepared to read more soon.

Setting up my infrastructure – Part 3: Bug tracker requirements

In my last blog post I explained why I want to automate as much as possible, and why I want a build server for this. The next thing that’s really important to me is to log every issue and idea I have and to keep myself organized throughout the project.

For that I’m going to evaluate some tools, and before you start downloading, installing and testing all sorts of tools, you should know what you’re looking for.

So let’s start:
My personal requirements for a bug / task tracking tool:

  • Easy input of tasks / bugs / features / ideas
    I don’t want to spend a lot of time on ‘managing’ my task manager, and I certainly don’t want to overmanage myself.
  • Work log / spend time tracking
    Even though it’s a pet project, I’d like to see how much time a certain feature has cost. Additionally, if I can see my estimates vs. reality, and I log the reasons why I needed less/more time, I can improve my estimates.
  • Change logs / Release Notes report
    I’d love to be able to generate my release notes out of the tasks that have been fixed for a certain release, so I have to maintain this information only in a single place. Like in an additional field of the task, where I just enter the information that should be visible in the release notes.
  • Documentation
    Not necessarily required in the bug tracker, but if I can note some technical details (e.g. in custom fields), then I know where to look for the information to build up the real project documentation.
  • Integration with VCS / Build server
    In a perfect world I would be able to see the related task(s) from a commit, the related commits from a task, the builds within the build server affected by those commits and the other way round: I would be able to open the corresponding tasks directly from the build in the build server.
  • An easy API

If the tool also supported a bit of analysis / reporting, that would be totally great. I already mentioned the release notes generation above, but time spent on features vs. bugs would also be an interesting figure for my project. The last bullet point would hopefully make up for all the things that I would like but that are not supported out of the box. Nevertheless, I actually don’t want to lock myself into a specific tool, so writing custom stuff for a single solution would only be the last option, since that time is definitely lost when switching tools.

In the next post I want to share my thoughts on what a CI / build server should be capable of, before we start with the real evaluation.

Continue with the next part, or see the other parts in this series:

[contentblock id=infrastructurelinks]

Who am I? And how to prove it?

Since I am currently setting up my infrastructure for my pet project, I wanted to sort out some little details.

My bug tracker, the build server software, the repository management… all this will be web-based and hosted on my new virtual server Gallifrey (which now also serves this blog). I don’t want my credentials to go over the internet in plain text, so I want to access these applications only over HTTPS.

Now, of course I could simply use a self-signed certificate and trust it on all my machines, but what if I’m going to partner up with somebody else? An official SSL certificate would be better.

A colleague pointed me to StartSSL. They offer a two-year multi-domain wildcard certificate for just 59 USD. So I signed up there and – my bad – entered my mobile number as my contact number. That was a problem, because as I would soon find out, they do a three-way check to validate my identity.

1.) They want to see two recent photo IDs. So I emailed them photographs of my ID card and my driver’s licence.
2.) To validate my address and my name, they want to see a phone bill showing my name, address and phone number.
3.) They call the number that, according to the documents, belongs to the person identified by the photo IDs, and ask for additional information given on the ID (like place and date of birth).

Now the problem was that my mobile phone contract is not mine, but paid for by my company. So the phone bill didn’t show my name and they could not do this validation. After a few mails I figured out that they could swap my mobile for my landline number, so I emailed them a PDF scan of my landline bill, which shows me as the owner. Now that PDF scan was a bit too big for their email system, and a full day passed until I figured that out. By now I know that email bounce notifications are sent every 24 hours.

Well, they called me on the landline, asked those questions, and a few hours later I had my certificate.
So you should now be able to connect to this blog via HTTPS, and the other domains and web applications are secured too.

If you want a certificate from StartSSL, better be prepared and give them a phone number where your name is on the bill…

This blog just moved

Hello from.. Gallifrey.

As you should see in the footer, this blog just moved and is now hosted on my new server Gallifrey. Previously it was located on smarthost01 – a very boring name for a server, I admit.

Perhaps it’s interesting for you how I moved this installation. It was done in a few easy steps (a rough command sketch follows the list). I…
1.) scripted a backup of the database (including the user) directly into my wordpress directory using HeidiSQL,
2.) initialized a new Git repository in my wordpress directory, added all files to the repo and committed, then deleted the SQL file again,
3.) created a new repo on my Stash evaluation instance on the new server,
4.) pushed the local copy to the new repo,
5.) cloned the fresh repo on Gallifrey,
6.) executed the SQL file on Gallifrey’s MariaDB using HeidiSQL and then deleted the SQL file,
7.) configured a web site in IIS on the directory and applied the same file system permissions,
8.) tested the installation and switched the DNS entries from smarthost01 to Gallifrey.
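
In commands, the whole move looked roughly like this. A sketch only: the repository URL, paths and file names are placeholders, and I’m using the mysql command line here where I actually used HeidiSQL:

    # on smarthost01, inside the wordpress directory
    git init
    git add -A
    git commit -m "blog snapshot incl. database dump"
    git remote add stash http://stash.example.com:7990/scm/blog/blog.git
    git push -u stash master

    # on Gallifrey
    git clone http://stash.example.com:7990/scm/blog/blog.git C:\inetpub\blog
    Get-Content C:\inetpub\blog\backup.sql | mysql -u root -pSECRET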

That took about 10 minutes in total. By the time of this posting, the DNS change should have propagated widely enough that you should see this served from Gallifrey. Moving a blog instance using Git is very easy and comfortable, so if you feel the urge to move your blog, this could be the way to go.

Setting up my infrastructure – Part 2: Automate everything

This post is the second part in the series about my pet project, and this is about automating stuff.

As I already mentioned, I want my personal project to be automated as much as it makes sense. That starts with automatic builds whenever I check in some new code, automated testing starting with unit testing, later integration testing and then, last but not least, automated UI testing and of course code coverage analysis while testing.

To be able to do the automated UI testing, I need the UI to run somewhere to test, so that will include automated deployment to one or more test environments.

I also want to automate the process of creating beta and of course release builds. If I’m going to release binary packages to the public, I also need to package them up in some sort of installer or NuGet packages. So when I start with that stuff, I’m going to automate that too.
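
When I get to that point, the release build could end with a packaging step like this (hypothetical; the nuspec file, project name and version are made up):

    # hypothetical packaging step at the end of a release build
    nuget pack MyPetProject.nuspec -Version 1.0.0

NuGet would then produce MyPetProject.1.0.0.nupkg, which a later build step could publish.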

Some of you might ask why I’m so bought into automating all that stuff and putting so much work into the backing infrastructure when this is ‘just a pet project’.

Well, there is mainly one reason: All those tasks are in fact pretty tedious, like running the whole test suite before committing, building, packaging up and deploying the project etc.

When I work on this project, I want to concentrate on the actual work: getting things to work like I want them to, seeing what I just did, and not spending a lot of time on boring and also error-prone tasks.

So I feel that every minute I spend on my infrastructure will pay off later in the project. That is also why I’m going to choose my environment carefully and evaluate some products. More on that in my next blog post, where I want to introduce the build servers and bug trackers in my first evaluation round. I also want to explain the important things that those tools need to be capable of – of course from my personal point of view and for this very pet project.

Continue with the next part, or see the other parts in this series:

[contentblock id=infrastructurelinks]

Setting up my infrastructure – Part 1: Basic tools

For my new pet project I want to use good and efficient tooling. Since I want to create a tool for me and other .NET developers and I feel at home on this platform, I’m going to use C# for the project.

I have my personal MSDN Professional subscription, so I use Visual Studio 2012 Professional for development. I add my personal ReSharper licence for productivity, and I chose Windows 8 Professional as my development OS (in a VM on my MacBook Air). Being totally in the Microsoft .NET ecosystem, I’m also going to use MSBuild and MSTest.

Update: Talking about the VM on my Mac, I use VMware Fusion for that. I also have VMware Workstation running on my home server for my build server virtualization, but that will be part of another post.

For source code versioning I chose Git. Mainly because I feel that even if Mercurial currently has better tooling support on Windows, Git is more mainstream and its tooling is getting better. As Git clients I currently use the GitHub client and of course the official Git command-line client. I host my sources on Bitbucket from Atlassian. They give you private repositories for free, and since I invited some guys, I can also collaborate with 3 others if I want, without having to pay for a private shared repository.

Besides that, I of course have the usual .NET developer tools like The Regulator for working with regular expressions, LinqPad for small test thingies and DotPeek as my decompiler.

Now, besides that I need additional tooling to keep track of my tasks, so I need a bug / issue / task tracker. And I don’t want to build releases manually or do manual testing, so I will need some sort of automated build & test tooling, which leads me to a continuous integration / build server. Choosing which tool is best here will take some time, so I started to evaluate different solutions. More on that in a separate post.

So the toolset for my pet project is right now:

Continue with the next part, or see the other parts in this series:

[contentblock id=infrastructurelinks]

My new pet project

As already announced on twitter, this year is the year of my pet project.

I’m going to develop something and, hopefully, will be able to release it this year. I can’t tell you much about it at this stage, but the main idea is to create some developer tooling where I couldn’t find anything useful on the market up to now and of course to try out new things and stuff.

I also want to improve my personal process of development with some experiments during this project. The first will be to set up all required infrastructure I consider important for such a project.

During my efforts I want to inform you about tools and techniques I use during this experiment, so stay tuned for more.

This game has to be made: Elite: Dangerous

<Update> The Kickstarter reached its funding goal, and also the stretch goal of 1.4 million GBP for a Mac version. With a few hours to go, it’s still possible to reach the 1.5 million stretch goal for 10 additional playable ships. </Update>

There is currently a Kickstarter by David Braben to create a sequel to the legendary Elite computer game. I want this game, so I want to point some additional people to this Kickstarter. Now here’s my personal story why.

When I was a kid of about 8 or 9 years (in the late eighties), we regularly visited my uncle and aunt in the Black Forest. In their house lived a young guy who had an Amiga and played Elite, and I was allowed to play it. That was my first contact. Whenever we visited, I played my own commander, and eventually I made it into the second galaxy on his Amiga.

We moved there a few years later, and that guy moved away. In the early nineties I found out that someone in the next village had a copy of the original PC Elite, which I bought from him. It is still in good condition (even though I no longer have a 5 1/4″ floppy drive to read the disc). See this picture I just took:

My copy of Elite for PC

I loved this game, played it through various galaxies and even blew away some Thargoids.

Until now, there have been some games that try to resemble the original Elite.
I’m not talking about the two more or less official sequels Frontier (‘Elite II’) and Frontier: First Encounters. They were in the spirit of Elite (David Braben, one of the original Elite programmers, was the head here), but I personally disliked the way too realistic physics. More than a few times I ended up drifting into endless space with all my fuel used up in what one would call a dogfight. I’d call it: turn, burn fuel to change direction towards the target, fire a few times while passing through it, turn, burn fuel, etc. – until one of the ships eventually runs out of fuel or gets hit.

Frontier and Frontier: First Encounters were a nice idea, but totally failed by being too realistic in physics.

What I liked to play too:

  • Wing Commander: Privateer
  • Freelancer
  • All parts of the X series
  • EVE Online

In my opinion the X series games from Egosoft (a German game studio) are very good at recreating the old Elite feeling. Especially the fact that you can build your own space stations, produce goods, own multiple ships and have them trade on autopilot is a very nice idea, and I’d love to see the next part, X: Rebirth, on a totally new game engine.

EVE Online is a very good massively multiplayer game, but the dogfight capabilities are very limited and I really miss the pilot’s perspective from the cockpit.

Nevertheless, the original Elite with its legendary ships like the Cobra, the galaxies you can level through (without jump gates like in X) and the whole feeling of total freedom is unique, and I’d love to see a sequel.

So I backed Elite: Dangerous in the hope that David Braben won’t mess up the physics again and I really do hope that they get the missing 200 Pounds in the next few days.

I’m a proud backer of the Elite: Dangerous Kickstarter, and I say: This game has to be made.