Web developer from Sydney, Australia. Currently using ASP.NET MVC where possible.

Friday, November 19, 2010

Installing Ruby on Rails 3.0.3 on Ubuntu on Virtual Box on Windows 7

WARNING: This is how I got things working, it may not be the best way and it may not work for you!

For a while now I have been wanting to try out Ruby and Rails (and Python + Django, and PHP). Why? Well, I want to be able to call myself a web developer, not just an ASP.NET MVC developer. It’s about keeping up with how the rest of the world is developing web applications. So I am going to make a real effort to learn other frameworks, starting with Rails.

Pretty much every Rails guide out there says at some point ‘Don’t use Windows’ as your OS.
Having done a tiny bit of Unix programming in the past, I already understood that using Windows won’t give you the full Ruby on Rails experience. The command line on Linux/Unix systems is just that much better than what Windows provides, and Ruby on Rails makes full use of it. There are other benefits as well: Ruby runs faster, community support is better (more people use Linux environments), there are more open source tools, and you gain experience using Linux (which comes in handy when you host on Linux). NB: Macs run a Unix-based OS (not Linux), so the command-line experience there is similar.

Getting the environment set up was something of a challenge (to say the least). Even with basic Linux skills already, I found it tricky; the setup could be a complete show stopper for people who have only ever used Windows for development. This post is so I can quickly and easily get it up and running again in the future, and hopefully help one or two other people along the way.

1) Getting an Ubuntu ISO

You can download the Ubuntu Desktop Edition ISO from the Ubuntu download page. It’s around the 700MB mark. If you’re running a 32-bit OS then make sure you download the 32-bit version. If you’re unsure, download the 32-bit version.

Just save it to disk somewhere you can find it; once we have VirtualBox up and running we will mount the ISO image directly as a CD-ROM drive on the virtual machine.

2) VirtualBox

Although there are many choices out there for running virtual machines, I chose VirtualBox based on previous good experiences with the product. It’s also available free of charge for personal use.
The install is just a click-through wizard on Windows.
Once VirtualBox is running, click the New icon to create a new virtual machine. Make sure you select ‘Linux’ as the operating system and Ubuntu as the version (select Ubuntu 64-bit if you downloaded the 64-bit ISO).


For starters, allocate around the 1.5GB mark for memory (you can change this at any point, assuming you have at least 4GB on the host).

For the virtual hard disk I just left the settings as default (“Create new hard disk”), creating a new dynamically expanding storage drive. But feel free to tinker with this as you see fit.

Once your new machine has been created you can select it and click the Settings button. Then select Storage and set the Ubuntu ISO as the source for the CD-ROM drive as shown below.


3) Ubuntu

Now that we have our virtual machine set up and an installation disk mounted, we can fire up the new virtual machine by double clicking it inside VirtualBox. After some time you should see the Ubuntu install screen.


Just make sure you select “Install Ubuntu 10.XX LTS”, because we are running a virtual machine and we want it to install onto our virtual hard drive.

From here on it’s a really easy (glad to say) wizard, similar to the Windows installation wizard. Just keep clicking Next unless something catches your eye (like an incorrect region). Make sure you remember the password you create for your account!

After the installation completes we can prepare Ubuntu for our rails installation.

Assuming you have remembered your password, you should be able to log in. Don’t worry about aesthetics, small window size or anything else yet; just get Rails working first.

If you’re new to Linux and the terminal you may want to Google up some nice introductory articles to get you up to speed. Here is a terminal introduction from the Ubuntu documentation.
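If you would rather dive straight in, here is a quick warm-up with the handful of commands this guide relies on (the directory name is just an example):

```shell
pwd                # print the current (working) directory
mkdir -p sandbox   # make a directory (-p: no error if it already exists)
cd sandbox         # change into it
ls -a              # list files, including hidden "dotfiles" like .bashrc
```

Nothing here is specific to Ubuntu; these work in any Linux/Unix terminal.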

First open a terminal window (under Applications –> Accessories –> Terminal) and run the following command to refresh the list of available software packages.
sudo apt-get update
  • sudo: run the following command as root (or Administrator level)
  • apt-get: the Ubuntu package manager; it downloads and installs software for you.
  • update: tells apt-get to download the latest package lists, so subsequent installs get current versions.
Now let’s install some more packages. I have just followed the advice from this page: http://thekindofme.wordpress.com/2010/10/24/rails-3-on-ubuntu-10-10-with-rvm-passenger-and-nginx/. To be specific: the command suggested by the commenter at the bottom. This post was a great help in getting my installation working.

Inside the terminal run the following command:
sudo apt-get install build-essential bison openssl libreadline5 libreadline5-dev curl git-core zlib1g zlib1g-dev libssl-dev libsqlite3-0 libsqlite3-dev sqlite3 libxml2-dev libmysqlclient-dev
Once this is complete we are ready to move onto ruby and rails using RVM.

4) Ruby on Rails 3.0.3

For our Ruby installation we are going to use RVM (Ruby Version Manager). RVM is awesome and can take a lot of the pain out of getting Rails up and running. RVM is also a powerful tool for managing gems: it gives you a single command for managing both the version of Ruby you are using and the gems you are using.

Installing RVM is not straightforward (don’t be fooled by the Quick Install on the home page). Start by running the command below (as suggested).
bash < <( curl http://rvm.beginrescueend.com/releases/rvm-install-head )
Then follow the link to the Installing RVM page and follow the Post Install instructions carefully.

Use the command below to fire up gedit so you can edit your .bashrc file and add the line as instructed:
gedit .bashrc
Now add this line to the BOTTOM of the file
[[ -s "$HOME/.rvm/scripts/rvm" ]] && . "$HOME/.rvm/scripts/rvm" # This loads RVM into a shell session.
Now follow the Troubleshooting your install section and read the notes about the return statement. It basically says to replace:
[ -z "$PS1" ] && return
with:
if [[ -n "$PS1" ]]; then
and indent EVERYTHING below it, down to just before the line you added earlier. I simply held Ctrl-Shift and pressed End to select the remaining content of the file, then pressed Tab to indent. Then I inserted the closing fi, so the file ends with:

fi

[[ -s "$HOME/.rvm/scripts/rvm" ]] && . "$HOME/.rvm/scripts/rvm"  # This loads RVM into a shell session.
Save and close gedit. Close your terminal window, open a new terminal window and then type:
rvm
This should result in the usage information for RVM being shown. Now we can proceed to install Ruby version 1.9.2.
rvm install 1.9.2
Once this has installed successfully we need to set 1.9.2 as our default ruby installation. (NB: to be safe you might want to restart your terminal here)
rvm use 1.9.2 --default
We can check this worked by asking Ruby for its version:
ruby -v
Assuming this worked and reported something about ruby 1.9.2, we can finally install Rails.
rvm 1.9.2@global
gem install rails
The first command sets the gemset (think of it as a namespace for gems) to the global gemset. We want to install Rails into the global gemset; this means all other gemsets (that we may create) will have access to the rails gem (plus dependencies).

Once Rails has installed, let’s create a new Rails app and, to be super unoriginal, call it MyBlog.
mkdir sites 
cd sites 
rails new MyBlog 
cd MyBlog 
bundle install
So what we did here is:
  • Created a new directory called sites
  • Changed to the sites directory
  • Created a new Rails app called MyBlog
  • Changed to the MyBlog directory (created by rails)
  • Ran Bundler to install any missing gems for our new project (the sqlite gem was missing)
Now we can fire up the rails server
rails server
If you read the output you will see the port that the web server is running on (3000 by default). So it’s just a matter of opening your browser at that address.

OK, that's it for now. If you read this far I really hope you managed to get rails up and running on a virtual Ubuntu machine! Proof below….


5) VirtualBox Additions

If Rails is up and running, chances are you want to improve your working environment. The first step is to install the VirtualBox Guest Additions; here is a good article detailing the process.

Otherwise, Ubuntu is fully customisable but don't waste time trying to make it look like a mac :)

Monday, October 25, 2010

Unit Testing – MsBuild Series

This post is going to walk through the steps of adding unit testing to our build script. It assumes you already have an MsBuild script set up that you can run from the command line; otherwise you might want to look at my previous post on setting up a build file for Visual Studio.

1) Change the output path for Test projects

The first thing to do is to change the output path for the test project (or projects). We do this so that the testing binaries are separated from the release binaries. Below is a screen shot of the directory being changed inside Visual Studio.

Change output Directory For Test Projects
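For reference, the per-configuration output path set in the screen shot ends up in the test project’s .csproj looking something like this (a sketch; the relative path depends on where your test project sits, and the Automated_Build configuration comes from earlier in this series):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Automated_Build|AnyCPU' ">
  <!-- Test binaries go under Build\Test, separate from the release binaries -->
  <OutputPath>..\..\Build\Test\</OutputPath>
</PropertyGroup>
```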

2) NUnit Target

For running our tests I am going to use the “MSBuild.Community.Tasks” NUnit task. We have been using other tasks from “MSBuild.Community.Tasks”, so we already have the targets and dll needed for these tasks (see this post for details on getting the Community Tasks set up).

So, now we just define our NUnit target as follows:
<Target Name="NUnit">
    <ItemGroup>
        <TestAssembly Include="$(BuildOutputDir)\Test\*.test.dll" />
    </ItemGroup>
    <Message Text="NUnit is running on: @(TestAssembly)" />
    <NUnit ToolPath=".\Tools\NUnit" Assemblies="@(TestAssembly)" />
</Target>

<Target Name="Test" DependsOnTargets="NUnit">
    <Message Text="Testing code" />
</Target>

Things to note:
  • I have created two targets, “Test” & “NUnit”, which is not necessary if you’re only using NUnit. However, I did this so that it’s easy to add targets for other testing frameworks such as MbUnit, xUnit, MSpec etc. (if you’re using multiple test frameworks make sure you check out Gallio)
  • The TestAssembly item will grab any assembly that ends with .test.dll.
  • The NUnit console runner and lib directory have been copied to "[Project Dir]\Tools\NUnit"

The necessary NUnit binaries have been copied into “[Project Root]\Tools\NUnit”. Normally I just copy “nunit-console.exe”, “nunit-console.exe.config” and the entire lib directory.

3) Incorporate the Test target into the main build file

We now have the Test target so we all we need to do is have our default build target depend on our Test target, as follows:
<Target Name="Build" DependsOnTargets="Clean; VersionSolutionInfo; Compile; Test; Zip-Source; Zip-Binaries; NuPack">
    <Message Text="Clean, VersionSolutionInfo, Compile, Test, Zip-Source, Zip-Binaries, NuPack"/>
</Target>
Now when we run our build, the code is tested before creating our zips or NuPack (or should I say NuGet) package.

Testing Code

NB: Please check the reference files for full copies of the build targets used in this series so far


This is the fifth post in the MsBuild series.


Thursday, October 21, 2010

Creating a NuPack package using a Build File - MsBuild Series

This post is going to walk through the steps of creating a NuPack package as part of an automated build file.
Previously we created an automated build file to clean, version, compile, zip binaries and zip source code.
If you’re not familiar with creating build files then you may want to catch up by reading these:

1) NuPack Package Preparation

Before we start we need to get a few things in place.
  1. We need the NuPack.exe utility
  2. We need a directory to keep our NuPack stuff.
  3. We need a NuPack manifest file
(NB: The best way to learn how to build nuspec packages (at the moment) is to look through the examples by downloading the codeplex repository here: http://nupackpackages.codeplex.com/SourceControl/list/changesets)

Firstly we need to add the NuPack.exe utility to our tools directory. NuPack.exe will be used to create our NuPack package. You can download it from: http://nupack.codeplex.com/releases/view/52016
After downloading, the following path should be valid:
  • [project root]\tools\nupack\nupack.exe
Next, we will create a directory for holding the input files for the NuPack package. Create the following directory:
  • [project root]\NuPack\
Inside this directory create a [projectName].nuspec file, for example:
  • [project root]\NuPack\StickyBeak.nuspec
This file needs to conform to the nuspec file format found here: http://nupack.codeplex.com/documentation?title=Nuspec%20Format
Below is a sample for the StickyBeak project:
<?xml version="1.0" encoding="utf-8"?>
<package>
    <metadata>
        <id>StickyBeak</id>
        <version>1.0.0.0</version>
        <author>Mark Kemper</author>
        <description>StickyBeak is a logging utility for asp.net websites which can log every request to your site. It provides similar features to IIS log files but provides additional logging information (which just isn’t possible with IIS logs) and easy viewing of logs via an admin page</description>
    </metadata>
</package>
Notice that for now we have hard-coded the version number; later, as part of the build, we will replace it with the real version number.

Next I needed to copy some static files under the NuPack directory. For StickyBeak I created the following static content transforms:
  • [Project Root]\NuPack\Content\NLog.config.transform”
  • [Project Root]\NuPack\Content\Web.Config.transform”
Any files you place in the Content directory will be copied to the target directory when someone installs your package.

Once you have your NuPack directory set up just how you want it, it’s time to build the package using a build target.

2) NuPack Target

For building the NuPack package I followed these basic steps in the build:
  1. Copy everything from our “[Project Root]\NuPack” directory into “[Project Root]\Build\NuPack”. This sets up the static content required for the NuPack package.
  2. Copy the freshly built binaries from our output directory into “[Project Root]\Build\NuPack\Lib” – these files will become references when someone installs our package.
  3. Update the version number inside the package specification file using a FileUpdate (regex) task.
  4. Use an Exec task to call the NuPack.exe executable and perform the packaging. NB: I set the working directory so the package will be created in the correct location.
The NuPack target is quite involved at the moment, but hopefully a dedicated task will be created by Microsoft to make this a little easier in the future.

The full target is shown below:
<Target Name="NuPack">
    <ItemGroup>
        <NuPackFiles Include="$(NuPackDestSource)\**" />
        <NuPackLibFiles Include="$(BuildOutputDir)\Bin\**" />
        <NuPackPackageFile Include="StickyBeak*nupkg" />
    </ItemGroup>

    <Message Text="Setting up the $(NuPackDestDir) directory with all the necessary files to create our package"/>
    <Copy SourceFiles="@(NuPackFiles)"  DestinationFiles="@(NuPackFiles->'$(NuPackDestDir)\%(RecursiveDir)%(Filename)%(Extension)')" />
    <Copy SourceFiles="@(NuPackLibFiles)"  DestinationFiles="@(NuPackLibFiles->'$(NuPackDestDir)\Lib\%(RecursiveDir)%(Filename)%(Extension)')" />

    <FileUpdate Files="$(NuPackDestDir)\StickyBeak.nuspec"
        Regex="version&gt;([^&lt;]*)&lt;/version"
        ReplacementText="version>$(Major).$(Minor).$(Build).$(Revision)</version" />

    <Message Text="Executing the NuPack.exe packager"/>
    <Exec WorkingDirectory="$(BuildOutputDir)" Command="..\Tools\NuPack\NuPack.exe ..\$(NuPackDestDir)\StickyBeak.nuspec"/>
</Target>

3) Update Our Default Build

Now we can update the default Build target to create our NuPack package after each successful build.
<Target Name="Build" DependsOnTargets="Clean; VersionSolutionInfo; Compile; Zip-Source; Zip-Binaries; NuPack">
    <Message Text="Clean, VersionSolutionInfo, Compile, Zip-Source, Zip-Binaries, NuPack"/>
</Target>
That's it. We can now prepare our NuPack package at the click of a button. The results of our build directory now look like this:


This is the fourth post in the MsBuild series.


Wednesday, October 20, 2010

Zipping Build Outputs using a Build File - MsBuild Series

We are continuing the MsBuild series; this post will focus on zipping the outputs of the build. To recap: so far we have set up a batch file that can build and version our project. The output of this build is placed inside a folder called "build" at the top of the project's directory structure.

We will zip up the outputs of the build so they can be easily distributed. To do this, the Zip build task that is part of the “MSBuild.Community.Tasks” library will be used. If you are not familiar with this library and how to set it up, then please read through my two previous posts to get up to speed.
We are going to create two zip files during our build:
  1. A zip of all binaries needed to run the application
  2. A zip of all source code needed to compile the application.
We are zipping up the source so we can provide an easy download of the source code for our application via CodePlex, GitHub etc. Similarly, we can provide just a zip of the binaries for a specific version of our application.

1) Change the output directory to “build\bin”

Firstly we are going to change our build output directory inside Visual Studio so that the binaries are placed into a new folder called bin under the build directory (“[project]\Build\Bin”). Having a sub-directory removes the clutter from the root of our build directory so we can place our zip files there.

Below is the screen shot of the new location being set inside visual studio for our automated_build configuration.

Changing the output directory

2) Creating the Zip-Binaries target

We have already included the “MSBuild.Community.Tasks” targets in our build file, so the only thing we need to do before using the Zip task is to ensure we have copied the ICSharpCode.SharpZipLib.dll binary into our “Tools\MsBuildCommunityTasks” directory. This binary is included in the MSBuild.Community.Tasks download.

Now declare the Zip-Binaries target as follows
<Target Name="Zip-Binaries">
    <ItemGroup>
        <BinDirectoryFiles Include="$(BuildOutputDir)\bin\**" />
    </ItemGroup>
    <Zip Files="@(BinDirectoryFiles)" WorkingDirectory="$(BuildOutputDir)\bin\"
        ZipFileName=".\$(BuildOutputDir)\Jobping.StickyBeak_Binaries_$(Major).$(Minor).$(Build).$(Revision).zip" />
</Target>
Notes on the Zip-Binaries target
  • BinDirectoryFiles – this item references all files below the "build\bin” directory. These are the files we will zip.
  • The WorkingDirectory attribute of the Zip task specifies the root directory for the zip file. We specify the “build\bin” directory as the root of our zip.
  • We use the version information to name the resultant zip file.

3) Creating the Zip-Source target

Apart from the binaries we also want to distribute the source code. So, we are going to declare a Zip-Source target for our build file as follows:
<Target Name="Zip-Source">
    <ItemGroup>
        <SourceFilesExclude Include="**\.hg\**" />
        <SourceFilesExclude Include="build\**" />
        <SourceFilesExclude Include="**\.*" />
        <SourceFilesExclude Include="**\bin\**" />
        <SourceFilesExclude Include="**\obj\**" />
        <SourceFilesExclude Include="**\Test\**" />
        <SourceFilesExclude Include="**\TestResults\**" />
        <SourceFilesExclude Include="**\*.user" />
        <SourceFilesExclude Include="**\*.suo" />
        <SourceFilesExclude Include="**\*.cache" />
        <SourceFilesExclude Include="**\*.vsmdi" />
        <SourceFilesExclude Include="**\*.testsettings" />
        <SourceFiles Include="**\*.*" Exclude="@(SourceFilesExclude)" />
    </ItemGroup>
    <Zip Files="@(SourceFiles)"
        ZipFileName=".\$(BuildOutputDir)\Jobping.StickyBeak_Source_$(Major).$(Minor).$(Build).$(Revision).zip" />
</Target>
Notes on the Zip-Source target
  • We generally want to zip up everything, so we include all files using the wildcard “**\*.*”, which means all files in all directories.
  • We specify a list of exclusions: source control files and other files that are generated or not necessary to build our application.
  • Getting the exclude list correct requires some testing; you should test your source zip by extracting it and making sure you can compile the solution.

4) Incorporating the Zip tasks into our default build

Now that we have created the two zip targets, all we need to do is have our default Build target depend on them, as follows:
<Target Name="Build" DependsOnTargets="Clean; VersionSolutionInfo; Compile; Zip-Source; Zip-Binaries">
    <Message Text="Clean, VersionSolutionInfo, Compile, Zip-Source, Zip-Binaries"/>
</Target>
That’s it. Below is a screen shot of kicking off a build and the resultant output in the build directory.

Build output

Screen shot of our ready to ship zips:

Build output directory

This is the third post in the MsBuild series.


Tuesday, October 19, 2010

Mercurial Revision No to Version your AssemblyInfo - MsBuild Series

By now you should already have a build file set up and running; if not, please see my previous post.

This post is going to focus on getting our version number automatically updated during the build process. We are going to use the AssemblyInfo task from the “MSBuild.Community.Tasks” library and the HgVersion task from the “MSBuild.Mercurial” library.

The HgVersion task will get the revision number from our Mercurial source control repository so we can include it in our assemblyInfo file.

The AssemblyInfo task will create our AssemblyInfo file before the compile task is executed.

If you are using TFS or Subversion you will need to substitute the Mercurial task with the equivalent task for your version control system. MSBuild.Community.Tasks contains TfsVersion and SvnVersion tasks for this purpose.
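For example, a Subversion-based equivalent of the Hg-Revision target shown later in this post might look like this (a sketch; the property name and paths are assumptions carried over from the Mercurial version):

```xml
<!-- SvnVersion ships with MSBuild.Community.Tasks -->
<Target Name="Svn-Revision">
    <SvnVersion LocalPath="$(MSBuildProjectDirectory)">
        <Output TaskParameter="Revision" PropertyName="Revision" />
    </SvnVersion>
    <Message Text="Last revision from SVN: $(Revision)"/>
</Target>
```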

1) Getting the revision number from Mercurial.

Firstly, download MSBuild.Mercurial from http://msbuildhg.codeplex.com/releases/view/47779 and copy the two files below to the “[Project Root]\Tools\MSBuild.Mercurial” directory:
  • MSBuild.Mercurial.dll
  • MSBuild.Mercurial.tasks
With these files in place we can now use the HgVersion Task to retrieve the version number from Hg and store it into a property.
<PropertyGroup>
    <MSBuildMercurialPath>$(MSBuildProjectDirectory)</MSBuildMercurialPath>
</PropertyGroup>
<Import Project="Tools\MSBuild.Mercurial\MSBuild.Mercurial.Tasks" />

<Target Name="Hg-Revision">
    <HgVersion LocalPath="$(MSBuildProjectDirectory)" Timeout="5000">
        <Output TaskParameter="Revision" PropertyName="Revision" />
    </HgVersion>
    <Message Text="Last revision from HG: $(Revision)"/>
</Target>
Firstly we set “MsBuildMercurialPath” to the current directory. Then we import the Mercurial Tasks from our Tools directory.

Next we define a new target called “Hg-Revision” which calls the HgVersion task. The HgVersion task will set the revision number from the repository in the "LocalPath" to a property called “Revision”.

Let's test it out by running: build /t:hg-revision

Output from hg-revision task

2) Creating our Assembly Info File

Now we are going to create a task to automatically generate an assembly info file.

We need the MSBuild.Community.Tasks library from here: http://msbuildtasks.tigris.org/

Again I will copy the necessary files under the tools folder.
  • Tools\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets
  • Tools\MSBuildCommunityTasks\MSBuild.Community.Tasks.dll
(NB: Copying these assemblies under tools folder makes the solution much easier to setup on other developer machines and build servers.)
<!-- Create an Assembly Info File -->

<Import Project=".\Tools\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

<Target Name="SolutionInfo">
    <Message Text="Creating Version File: $(Major).$(Minor).$(Build).$(Revision)"/>
    <AssemblyInfo CodeLanguage="CS"
        OutputFile="SolutionInfo.cs"
        AssemblyDescription="StickyBeak logs requests to your website"
        AssemblyCopyright="Copyright © whohive"
        AssemblyVersion="$(Major).$(Minor).$(Build).$(Revision)"
        AssemblyFileVersion="$(Major).$(Minor).$(Build).$(Revision)" />
</Target>
Once this is in place we can generate our AssemblyInfo.cs file. I have called the output file SolutionInfo.cs as I like to use the same file for each project in the solution (I’ll show you how to do this soon).

Let's test this by running: build /t:solutionInfo

Output from SolutionInfo

As you can see, we are using a hard-coded revision number for the moment.

Now let's hook our SolutionInfo.cs file into our Visual Studio projects and have it included in the build.
  1. Firstly, go through your project and delete any existing AssemblyInfo files you may have under the “Properties” folder.
  2. For each project in your solution, right click and select “Add Existing Item” from the context menu.
  3. Navigate to the generated SolutionInfo.cs file and add it as a link. (See below on how to add as a link.)
Add SolutionInfo as a link
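If you prefer editing the project file directly, adding a file as a link is just a Compile item with a Link element (a sketch; the relative path depends on where your project sits):

```xml
<ItemGroup>
  <!-- The file lives at the solution root; the Link element makes it
       appear under Properties in this project without copying it -->
  <Compile Include="..\..\SolutionInfo.cs">
    <Link>Properties\SolutionInfo.cs</Link>
  </Compile>
</ItemGroup>
```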

3) Combining the Hg-Revision and SolutionInfo Tasks

So far our SolutionInfo target only uses a hard-coded revision number of "0". What we really want is to have this revision number linked to the source control revision number. This is really easy to achieve, as follows.
<Target Name="VersionSolutionInfo" DependsOnTargets="Hg-Revision;SolutionInfo">
    <Message Text="Get Revision, Generate SolutionInfo"/>
</Target>
So we have just combined the two into a new target called VersionSolutionInfo. It depends on the Hg-Revision and SolutionInfo targets, so it will first call Hg-Revision to get the latest revision number into our Revision property; then, when SolutionInfo is called, it will use this revision number.

Let's execute it as follows: build /t:VersionSolutionInfo

Output from VersionSolutionInfo

As you can see, the revision number 10 is retrieved from Mercurial and then the SolutionInfo is generated. If you inspect the SolutionInfo.cs file you will find that the version number now includes this revision.

The final step is to change our default Build target to depend on VersionSolutionInfo so we can have everything done in one step.
<Target Name="Build" DependsOnTargets="Clean;VersionSolutionInfo;Compile">
    <Message Text="Clean, VersionSolutionInfo, Compile"/>
</Target>
NB: you can have any number of files containing the assembly metadata such as company, version etc. For example we could split it as follows:
  • SolutionInfo.cs – contains static information and is checked into source control
  • SolutionInfo.generated.cs – just contains version information (not checked in)
You can organise it any way you like.
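A sketch of what that split might look like, reusing the same AssemblyInfo task with a second output file (the file name and the choice of attributes are illustrative):

```xml
<!-- Only the version attributes go into the generated (not checked in) file -->
<AssemblyInfo CodeLanguage="CS"
    OutputFile="SolutionInfo.generated.cs"
    AssemblyVersion="$(Major).$(Minor).$(Build).$(Revision)"
    AssemblyFileVersion="$(Major).$(Minor).$(Build).$(Revision)" />
```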

Update: You need to TAG your repository with the full version number so you can relate the release's version no to a changeset across all repositories. I'll post a build target to do the tag automatically at some point. Thanks to @jdhard for bringing this up.
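Until then, a minimal version of such a target could simply shell out to hg with an Exec task (a sketch; the target name and tag format are assumptions, and it reuses the Hg-Revision target from this post for the version properties):

```xml
<Target Name="Hg-Tag" DependsOnTargets="Hg-Revision">
    <!-- Tag the working repository with the full version number -->
    <Exec Command="hg tag v$(Major).$(Minor).$(Build).$(Revision)" />
</Target>
```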

Below I have provided two sample MsBuild files. The first is our main build file, which handles clean & compile from the first post in this series. The second file contains the targets used in this post. I have split out the items so it's easy to isolate the tasks for this post.

This is the second post in the MsBuild series.


Sunday, October 17, 2010

Create a Build File for a Visual Studio Solution - MsBuild Series

Why create a build file for a Visual Studio solution?
A build file automates the process of building, testing, analyzing, packaging, & deploying your project. Build files can give you a single-click solution for performing mundane tasks in a consistent way.
It saves you time by automating all the tedious steps necessary to prepare your project for testing or deployment. It reduces risk because you can confidently repeat the process; there's no chance you will forget to rename a file, or forget to switch to release mode, etc.

OK but How?
Creating a build file for the first time can be a little tricky, so I have prepared a quick tutorial on creating a simple MsBuild file for compiling your solution. Just follow the steps below and let me know if you have any issues.

1) Folder Structure

Once we start running builds of our project we need a place to store the results. We may also need extra tools for our build process and of course we need a place for our new build file.

Below is the folder structure I will be using for this introduction. It allows us to keep all of the build files / folders out of our source tree.

Project Root:/
/Build – result of the build will be placed in here
/Source – all source code (& libraries) for the project
/Tools – collection of tools used for the build process
/build.bat – simple 'double click to build' MS-DOS batch file
/build.proj – our build file for MsBuild; this is where we define the steps for our build process

2) Creating the Batch File

Create a new blank file called build.bat in the root directory of your project. This will be a simple MS-DOS batch file to kick off our build script. Simply copy the contents below into the batch file.
REM dont remove this line
"%windir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" /nologo build.proj  %*
Make sure you leave the first line intact, otherwise you may run into problems because of the encoding used to save the file. If you have issues with encoding then you may need to open the file using a specific encoding (850); see: http://msdn.microsoft.com/en-us/library/dxfdkfke(VS.80).aspx.

The batch file simply calls msbuild.exe and passes in our build.proj file. The %* argument forwards any command line arguments supplied to the batch file on to the msbuild.exe command. This allows us to specify a target at the command prompt like this: build /t:clean

3) Creating the Build File

Let's start with the simplest sample possible.
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

    <PropertyGroup>
        <BuildOutputDir>build</BuildOutputDir>
    </PropertyGroup>

    <Target Name="Clean">
        <RemoveDir Directories="$(BuildOutputDir)" />
    </Target>

    <Target Name="Build" DependsOnTargets="Clean">
        <Message Text="Clean"/>
    </Target>
</Project>
The build file contains two targets: Clean & Build. Build is our default target so if you run build.bat without any target specified and the build target will be executed by default.

The default target is specified on the 2nd line with DefaultTargets="Build"

The Build target depends on the Clean target, so before the build target is executed the Clean target is executed. The Clean target simply deletes the output directory, giving us a clean working space to preform the build.

The output directory is defined in the PropertyGroup near the top of the file. This simply assigns the variable BuildOutputDir the string “build”, which is the name of our output or build directory.
4) Adding the Compile Target
To actually compile the project we need to add another target, which we are going to call Compile.

The updated proj file now looks like this:

<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
        <BuildOutputDir>build</BuildOutputDir>
        <SolutionToCompile>Source\YourSolution.sln</SolutionToCompile>
    </PropertyGroup>
    <Target Name="Clean">
        <RemoveDir Directories="$(BuildOutputDir)" />
    </Target>
    <Target Name="Compile">
        <MakeDir Directories="$(BuildOutputDir)" />
        <MSBuild Projects="$(SolutionToCompile)"
            Properties="Configuration=Automated_Build;" />
    </Target>
    <Target Name="Build" DependsOnTargets="Clean;Compile">
        <Message Text="Clean, Compile"/>
    </Target>
</Project>
Things to notice:
  • Extra variable for the solution to compile
  • Extra target called Compile
  • Compile target uses msbuild to compile the solution
  • Solution Configuration is set to Automated_Build
  • The Build target now depends on Compile too.
If you run this you will receive an error stating that the Automated_Build configuration does not exist. We need to create this configuration inside Visual Studio.

5) Creating the Configuration “Automated_Build” in VS

Open up the Visual Studio solution, then under the Build menu select Configuration Manager. Create a new configuration called “Automated_Build” as shown below, copying the settings from the Release configuration.

Creating the Configuration inside Visual Studio

I am going to deselect SampleWeb because I do not want that project compiled as part of the automated build.

Configuration Setup - Removed SampleWeb

Now we will change the output directory for the “Jobping.StickyBeak” project under the “Automated_Build” configuration, as shown below:

With our project configuration complete we can now run our build.

Build Output

As you can see we now have our freshly compiled binaries in the build directory.

By simply adding more targets to your build file you can automate any tedious step that is needed to build and package your project.
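For example, a hypothetical extra target (the names and paths here are made up) could copy deployment files into the build directory after compiling:

```xml
<!-- Hypothetical extra step: copy config files into the build
     directory once compilation has finished. -->
<Target Name="CopyConfigs" DependsOnTargets="Compile">
  <ItemGroup>
    <ConfigFiles Include="configs\**\*.config" />
  </ItemGroup>
  <Copy SourceFiles="@(ConfigFiles)" DestinationFolder="$(BuildOutputDir)" />
</Target>
```

Adding "CopyConfigs" to the Build target's DependsOnTargets list would then run it as part of every build.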

This is just the beginning! next we will look at 


Saturday, October 2, 2010

Jobping Url Shortener – Version 0.6

We have just checked in the latest version of the Jobping Url Shortener, version 0.6. This version resolves an important issue raised by @nato24, who asked: “Have you guys implemented a solution for obscenities in your short url encoder?”. Thanks nato24, good question!

Our short urls are currently too short to form any four letter words (yet!), but it was only a matter of time before the four letter combinations came into use.

So, in this release we have decided to remove all vowels (‘a’, ‘e’, ‘i’, ‘o’, ‘u’, ‘A’, ‘E’, ‘I’, ‘O’, ‘U’) from the letters used by the Url shortener. This means our urls will grow a little faster, but we were willing to sacrifice that for wordless urls.

However, we also get a major benefit: we can now make our own short urls using words (that contain vowels). So we are free to make up short urls like http://jobp.in/example and know that the shortener will never create a clashing url, because the generated urls never contain a vowel (or 3).

We also added a feature that allows us to offset the generated urls. Setting this offset in the configuration lets the url shortener skip over any duplicates that would otherwise have been created when we removed the vowels from the alphabet.
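The idea can be sketched like this (illustrative Python, not the actual C# from the project; the exact alphabet ordering and offset handling are assumptions based on the description above):

```python
# Vowel-free alphabet: digits plus consonants in both cases, so no
# generated short code can ever spell a word containing a vowel.
ALPHABET = "0123456789bcdfghjklmnpqrstvwxyzBCDFGHJKLMNPQRSTVWXYZ"
OFFSET = 0  # configured offset so ids issued before the change are skipped

def encode(n, offset=OFFSET):
    """Base-52 encode a numeric id into a vowel-free short code."""
    n += offset
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n > 0:
        n, r = divmod(n, len(ALPHABET))
        chars.append(ALPHABET[r])
    return "".join(reversed(chars))

def decode(code, offset=OFFSET):
    """Inverse of encode, for looking the id back up."""
    n = 0
    for ch in code:
        n = n * len(ALPHABET) + ALPHABET.index(ch)
    return n - offset
```

Because hand-made vanity urls like http://jobp.in/example contain vowels, they can never collide with a generated code.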

The new version is on codeplex here: http://jpurlshortener.codeplex.com/

The production site for our shortener is here: http://jobp.in/

Also be sure to visit Jobping, which is now global and advertises positions based on Microsoft technologies.


StickyBeak Version 0.3 Released

Finally found some time to add a couple of features to StickyBeak.

StickyBeak is a logging utility for asp.net websites which can log every request to your site. It provides features similar to IIS log files, but records additional information (which just isn’t possible with IIS logs) and gives easy viewing of logs via an admin page. You can also use the StickyBeak log file parser in your own code.

StickyBeak records request details such as url, ip address, unique session Id, datestamp, cookies, querystring, form values, session variables etc. StickyBeak allows you to track the requests that lead up to errors/exceptions on your site, so it provides valuable context for figuring out exactly what caused an error. StickyBeak is complementary to elmah.

We use a modified version of StickyBeak on Jobping.

Version 0.3 is now on codeplex and includes the following additional features:

1) Logs Asp.Net Session keys and values.
Session keys and values are now recorded alongside the existing request data (cookies, querystring, posted form values, headers etc).

2) Allows StickyBeak to be temporarily turned on or off via the admin page.
StickyBeak can be enabled via the configuration file, but now you can temporarily override that setting on the admin page. So you can have StickyBeak turned off in the configuration file (meaning no logging takes place) and then temporarily turn logging on via the admin page. If the application is restarted, StickyBeak reverts to the config setting.

Upcoming features for StickyBeak

We would love to hear any suggestions you may have for the next version so please send them through. Currently we are planning to include features such as:

Database integration - which will allow log files to be consolidated from multiple sources into a relational db for analysis.

Viewstate (logging/decoding) – allow Viewstate to be logged and decoded (we are using mvc :)

You can read more about StickyBeak here: http://markkemper1.blogspot.com/2010/06/introduction-to-stickybeak.html

StickyBeak is hosted on Codeplex: http://stickybeak.codeplex.com/


StickyBeak in action below. New features highlighted with red squares.

Tuesday, June 22, 2010

Introduction to StickyBeak

While working on Jobping we wanted a raw record of each request made to our site, so that if something went wrong we would have all the data necessary to recreate the event and/or the data itself. Needless to say, it has proved very useful for investigating what has happened on the site.

However, the code used for this logging was embedded in the main project and not easily portable. I wanted to make an assembly that captured this functionality so I could easily drop it into the next project. So we created StickyBeak.

StickyBeak is a logging tool for asp.net websites written in c# and currently requires the NLog logging library to run. StickyBeak’s purpose is to log each request to your web server and provide an easy interface for viewing those requests.

Looking at these logged requests is extremely useful when you are trying to find the cause of an exception, and even more useful when you are trying to recover data lost because of an exception.

StickyBeak works as an HttpModule and logs the raw request data into a log file using NLog. The information recorded for each request includes the date, http method, url, User.Identity.Name, IP address, unique session Id, unique browser Id, header values, querystring values, posted form values and cookie values.

Below is a screenshot of the admin viewing tool, which lets you see the logged activity on your site.

How it works

StickyBeak runs as an HttpModule; each time a request is processed by .net, the module creates a new RequestLog object and populates it with data from the current request. The RequestLog object is then passed to the LogRepository, which saves it.

Currently there is only one LogRepository, which uses NLog. The NLogRepository writes the RequestLog object out to the log files in a custom format. It can also read RequestLog objects back for viewing in the admin interface.

Configuration Needed For StickyBeak To Work

  1. You need a reference to the StickyBeak and NLog assemblies contained in the binary zip distribution on CodePlex
  2. Configure the StickyBeak HttpModule

  3. Configure the handler (to display the admin interface). The configuration below also secures the handler; you may wish to remove this for testing.

  4. Configure NLog to record the logging information that is recorded by StickyBeak

  5. You can also optionally configure exclusions for StickyBeak. For example you could exclude all requests to a certain URL, exclude a querystring/form/cookie/header value by key etc. See the sample configuration for more details.
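As a rough illustration of steps 2 and 3 only, the module and handler registration in web.config follows the standard asp.net shape below. The type and assembly names here are placeholders I've invented for the sketch; check the sample configuration shipped with StickyBeak for the real ones:

```xml
<system.web>
  <httpModules>
    <!-- Placeholder type name; see StickyBeak's sample config -->
    <add name="StickyBeak" type="StickyBeak.RequestLogModule, StickyBeak" />
  </httpModules>
  <httpHandlers>
    <!-- Admin viewing page (placeholder type name); a <location>
         section with authorization rules can secure this path -->
    <add verb="GET" path="StickyBeak.axd"
         type="StickyBeak.AdminHandler, StickyBeak" />
  </httpHandlers>
</system.web>
```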
You can download the source and binaries from StickyBeak on CodePlex.

Tuesday, May 18, 2010

Quickly Trim all your model's Properties - Reflection vs Fasterflect

I've had some great feedback from Buu Nguyen author of the Fasterflect library. He supplied the feedback below as well as some sample code, which I have incorporated into the code base. 

Buu Nguyen says:
Fasterflect only speeds up invocation operations, not query (aka lookup) operations. In fact, query operations in Fasterflect are convenient wrappers around .NET reflection, so using them will cause the code to run slower.

Suggested usage: avoid query operations if you don’t really need them; move the GetProperties call out of each iteration, because it is the same for all methods, and leaving it in makes it hard to see the performance difference when using Fasterflect.

The delegates should be reused instead of being regenerated every time – regenerating them will make the code run even slower than the normal Fasterflect API.
Suggested usage: use a dictionary to cache the generated delegates.

The main change to the code was to cache the call to type.GetProperties(). Instead of calling the GetProperties() method directly, a call is made to a caching component which ensures that GetProperties() is only called once per type.

Also, as Buu has suggested, the calls for generating the delegate setters and getters are now cached into a static dictionary as well.

Test Methods (2-4 use the cached call to type.GetProperties()):
    1. Long Hand - No reflection.
    2. Initial code - normal reflection
    3. FasterFlect m1 - Uses the FasterFlect library
    4. FasterFlect m2 – Uses the FasterFlect library’s delegates approach


So here are the results (over 1 million object trims):
Long Hand Trim
Initial code
Fasterflect m1
Fasterflect m2

Ok, now we are seeing Fasterflect perform nearly twice as fast as reflection and about 3 times as fast when using the delegate method of Fasterflect.

It also appears that the delegate method of Fasterflect is only about twice as slow as the long hand trim.
Download the source code.

Long Hand code

Property Cache Helper

Initial code

FasterFlect Method #1

FasterFlect Method #2

Set Default Outgoing Repository with hg (Mercurial)

Firstly, if you are new to Mercurial check out http://hginit.com/; it's a brilliant little Mercurial tutorial.

When creating a hg repository locally, I often need to set the default outgoing repository at some later point so I can push out my changes.

Creating projects locally using hg (hg init) without cloning from another location is ideal for playing with new projects locally while keeping your source code version controlled.

There are so many times where I spend about 10 minutes going down a path and then find out that I don't want to continue. With Mercurial or git as a local version control system, you can simply revert your changes back to the last commit.

When the time comes to publish your changes to the world, setting up a default outgoing repository in hg saves a lot of typing.

Here is how to do it
  1. Go to the .hg folder in the root of your repository
  2. Create a new file called "hgrc"
  3. Enter your default outgoing repository as shown below.

That's it. Now you can just enter "hg outgoing" to see the list of change-sets that need to be pushed to your default repository and "hg push" to actually push out your changes.

Codeplex now has Mercurial as an option for version control.

Here is a copy-paste sample hgrc for the Jobping Url Shortener project on codeplex (the default path lives under the [paths] section):

[paths]
default = https://hg01.codeplex.com/jpurlshortener


Monday, May 17, 2010

Quickly Trim all your model's string properties – Speed results

Update: Posted another follow up with feedback on the usage of Fasterflect

With my Quickly Trim all your model's string properties post causing quite a stir (2 comments), I have decided to post a follow up and actually test the speed.

For the actual benchmarking I used a slightly modified version of this nice little class http://www.yoda.arachsys.com/csharp/Benchmark.cs, which I found on this page http://www.yoda.arachsys.com/csharp/benchmark.html

I tested 5 different methods that use reflection to trim all the string properties on an object, plus the 'long hand' method that does not use reflection at all.

Test Methods:

    1. Long Hand - No reflection.
    2. Initial code - with the GetIndexParameters call removed (not needed)
    3. Initial code with linq – Initial code but using linq to select the properties to process
    4. TypeDescriptor– used the TypeDescriptor class to get the property list
    5. FasterFlect m1 - Uses the FasterFlect library
    6. FasterFlect m2 – Uses the FasterFlect library’s delegates approach

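For readers who want the flavour of what is being measured, here is a rough Python analogue (the benchmarks themselves are C#): the 'long hand' version sets each property explicitly, while the reflective version discovers string attributes at runtime and trims whatever it finds.

```python
# Illustrative only: Python stand-ins for the C# long-hand and
# reflection-based trims being benchmarked.
class Model:
    def __init__(self):
        self.name = "  Mark "
        self.city = " Sydney  "

def trim_long_hand(m):
    # Long hand: every property named explicitly, no runtime inspection.
    m.name = m.name.strip()
    m.city = m.city.strip()
    return m

def trim_reflective(m):
    # Reflection-style: discover the string attributes on every call,
    # analogous to looping over type.GetProperties() each time.
    for name, value in vars(m).items():
        if isinstance(value, str):
            setattr(m, name, value.strip())
    return m
```

The per-call discovery in the second version is the overhead the timings below make visible.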

So here are the results (over 1 million object trims):
Long Hand Trim 00:00:01.4220000
Initial code 00:00:07.7130000
Initial code with linq 00:00:10.7270000
TypeDescriptor 00:00:14.2340000
FasterFlect m1 00:00:06.1330000
FasterFlect m2 00:00:06.1830000

Surprised? I was. Firstly, the reflection code ranged from roughly 5 to 10 times slower than the direct code; this wasn't so much a surprise as a reminder that you really need to be careful when using reflection in your code base.

I knew my initial code would perform better than trying to filter the properties before doing the work. Filtering the properties first, using linq or any other means, basically results in another loop over the properties of the object. So the more properties your object has, the worse this method will perform.

But I was quite surprised by the TypeDescriptor. I had hoped it would perform better than my initial code, as according to the documentation it caches the metadata about objects. I suspect I haven’t used it to its full potential…

The other surprise was that the delegate method in fasterflect is actually slower than the normal method. Again, someone may well point out how to implement this better. The other note is that fasterflect only brought the time down to about 80% of the plain reflection time; I was hoping for more.

Anyway, interesting stuff. If someone has another fast method that achieves the same results, let me know and I’ll take it for a test drive.
Download the source code.

Long Hand code

Initial code

Initial code with linq


FasterFlect Method #1

FasterFlect Method #2
