19 February 2008 in Mindscape & Tools | Comments (0)

For the last six months we have been carefully monitoring the statistics for our website using a mixture of Google Analytics and AWStats. Those two provide a reasonable amount of what I would consider “core” statistics, and if you invest more time in Google Analytics you can get some seriously cool statistics about conversion rates for advertising campaigns (such as how many people click through, how many sign up for emails, how many download a trial and so on). However, I’ve always wanted to try something like CrazyEgg to get some heat map statistics. Recently Sam from YouTXT pointed me in the direction of ClickHeat.

What’s a heat map?

A heat map is a graphical representation of where people actually click on a page. It looks very similar to the weather maps you sometimes see showing rainfall (red means many clicks, blue means few, transparent means none). This is useful for identifying where people click on your pages and highlights what is important on certain pages.

Here is a screenshot of a forum page from the Mindscape site:

ClickHeat heat map for the Mindscape forums

What sort of things can you learn? Well, from my experience, I’ve learnt the following:

  • Anywhere that you write “free” tends to attract clicks. I don’t currently have that hyperlinked to a download page, so I should change that to increase conversions
  • Continuing that trend, I have found several areas where people click expecting a link but where we do not currently have one – time to update those locations
  • Our services page does not have many links, yet people seem to click a lot on one or two of the technologies that we specialise in and can provide great services for. For example, Windows Server 2008 is attracting a lot of coverage at the moment, with Jeremy doing the Microsoft road trip promoting Windows Server 2008, SQL Server 2008 and Visual Studio 2008

That’s just a taste of some of the things we’re learning – there is a lot of data in there. Some of the simpler stats (e.g. most people visit our blog from the services page) could be calculated from existing web stats, but it would be more challenging to extract that information. Higher fidelity representation of information is key to improving information consumption, and this tool certainly highlights that.

How can you use it?

You can either use CrazyEgg or install and set up your own free copy of ClickHeat, an open source alternative. If you would prefer some professional help in setting up this sort of system then you’re more than welcome to get in touch with me and Mindscape can help you out.

– JD


30 January 2008 in .Net & Code & Microsoft & Tools | Comments (0)

A big thanks to everyone who came along to the dot net user group meeting this evening for my presentation on the new ASP.NET MVC Framework. I’ve attached the files below so you can have your own play with the sample and check out the presentation file as well.

You will need to install the ASP.NET MVC Framework, but everything else (like the MVC Toolkit) is included in the download. If you would like more help with LightSpeed then I’d suggest you download the Express edition, which includes a huge number of samples and one of the best developer guides you’ll find.

I should also take this moment to mention that this is a first CTP: you shouldn’t be using it to create production quality solutions yet, and the chance of the API changing is very, very high. Just have an explore, taste test the framework and get your head around some of the concepts at this stage :-)

Download the slides and sample application here.

I hope those of you that attended enjoyed the presentation and, as always, I appreciate any feedback.


– JD


20 December 2007 in Code & Microsoft & Tools | Comments (0)

Andrew Peters (co-founder of Mindscape) has released the NHaml View Engine for ASP.NET MVC. He had been working with Haml on a project, fell in love with how easy it was to use and the quality of its output, and decided it was high time something like it was available to .NET developers.

Andrew has written a rather lengthy blog post about the engine, as well as how to write in Haml. Check out the NHaml View Engine here.

If you are using ASP.NET MVC and are a lover of fine XHTML output then you should check this out.

– JD


4 December 2007 in .Net & Code & Tools | Comments (5)

After yesterday’s post I thought I should write up a basic sample to test the effectiveness of the Parallel Extensions. Admittedly it is a contrived example and you are unlikely to see this sort of performance increase in the real world, since your applications are unlikely to be this primitive.

My sample iterates through a number sequence from 0 upwards and works out whether each value is a prime number. There are two implementations: one using a standard loop and the other using Parallel.For(). Of course, to ride out any spikes, I run each test 25 times and average the outcome. This test is not run in a clean environment, but it does give a roughly indicative result of using the Parallel Extensions.

Using a dual core system, checking the numbers up to 100,000 and running 25 iterations of each run, I had the following outcome:

Using a normal for() loop: 3104 milliseconds average per run
Using a Parallel.For() loop: 1607 milliseconds average per run

This speed-up (roughly 1.93×, close to the ideal 2× for a dual core machine) is very welcome and, as you can imagine, these sorts of results are only going to become more impressive as we move to 8, 16 and 32 core systems.
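My sample is a little larger, but the core of the comparison can be sketched like this. Treat it as an illustration rather than the exact code from my download – it uses lambda syntax and the Parallel.For overload as it settled down in later releases, and the delegate-based CTP syntax differs slightly:

```csharp
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

public class PrimeBenchmark
{
    // Trial-division primality test: deliberately naive so there is
    // plenty of CPU work to spread across cores.
    public static bool IsPrime(int n)
    {
        if (n < 2) return false;
        for (int i = 2; i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }

    public static void Main()
    {
        const int max = 100000;

        // Sequential version.
        var sw = Stopwatch.StartNew();
        int sequentialCount = 0;
        for (int i = 0; i < max; i++)
            if (IsPrime(i)) sequentialCount++;
        sw.Stop();
        Console.WriteLine("for() loop: {0} primes in {1} ms",
            sequentialCount, sw.ElapsedMilliseconds);

        // Parallel version. Interlocked.Increment avoids a data race
        // on the shared counter.
        sw = Stopwatch.StartNew();
        int parallelCount = 0;
        Parallel.For(0, max, i =>
        {
            if (IsPrime(i)) Interlocked.Increment(ref parallelCount);
        });
        sw.Stop();
        Console.WriteLine("Parallel.For(): {0} primes in {1} ms",
            parallelCount, sw.ElapsedMilliseconds);
    }
}
```

Both loops should count the same 9,592 primes below 100,000 – only the elapsed time should differ, and it will vary from machine to machine.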

A few things to consider in a real world application (consider this my “don’t blame me if you think this will solve all your problems” line! :) ):

  • Often slowness is caused by some slow resource – a web connection, a database call, etc. The Parallel Extensions default to spinning up as many threads as there are cores, so if you have a slow dependent resource you may wish to investigate bumping up the thread count or writing your own threading code.
  • The architecture of a solution is more likely to impact the overall performance of the application. Improving the speed of a few loops and LINQ queries will not improve performance by any order of magnitude.
  • Amdahl’s Law applies – effectively, this law states that the maximum parallel improvement possible for an application is limited by the amount of sequential code remaining. For example, if I can only make 10% of the code run in parallel then even with infinitely many parallel processors I’m still running slow sequential code 90% of the time – this feeds back to the previous point.
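For the formula-minded, Amdahl's Law gives the overall speed-up S for a program whose parallelisable fraction is P, running on N processors:

```latex
S(N) = \frac{1}{(1 - P) + \frac{P}{N}}
```

With P = 0.1, even as N grows without bound the speed-up tends to 1/0.9 ≈ 1.11 – barely an 11% improvement, no matter how many cores you throw at it.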

Download my sample application here (with source)

Note: You will need the .NET 3.5 Framework installed, plus the Parallel Extensions Library

– JD



3 December 2007 in .Net & Code & Microsoft & Tools | Comments (0)

Just an FYI for anyone that is keeping up with parallel computing: Microsoft has released the December CTP of the Parallel Extensions Library.

From my work with it I’ve found it to be generally quite usable; however, the documentation and general install quality are a little weak at the moment. When I tried to respond via the email link in the documentation with some suggested documentation changes, the email bounced – not an overly good look, but it is early days :)

The extensions provide parallelisation helpers for general tasks (e.g. making a for loop parallel) as well as PLINQ (one guess what the “P” stands for!). PLINQ, from my reading, applies only to LINQ to Objects, but it is a useful start. So far my interest has been in the task support.
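For a flavour of what a PLINQ query looks like, here is a sketch based on the AsParallel() extension-method shape that PLINQ exposes – treat this as illustrative rather than the exact CTP syntax:

```csharp
using System;
using System.Linq;

public class PlinqSketch
{
    public static int[] EvenSquares(int max)
    {
        // AsParallel() turns an ordinary LINQ-to-Objects query into a
        // PLINQ query that is partitioned across the available cores.
        return Enumerable.Range(0, max)
                         .AsParallel()
                         .Where(n => n % 2 == 0)
                         .Select(n => n * n)
                         .OrderBy(n => n)   // restore a deterministic order
                         .ToArray();
    }

    public static void Main()
    {
        // Prints: 0, 4, 16, 36, 64
        Console.WriteLine(string.Join(", ", EvenSquares(10)));
    }
}
```

The query reads exactly like the sequential version with one extra call, which is the big attraction: existing LINQ-to-Objects code can be parallelised with minimal churn.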

As a simple example of how to make a loop run using all your cores, here is our original code:

foreach (MyClass c in data)
{
    // ... do some work with c ...
}

Here is our code using the Parallel Extensions:

Parallel.ForEach(data, delegate(MyClass c)
{
    // ... the same work with c, now spread across your cores ...
});

As you can see, the extensions make it easy enough to start getting some elementary parallelism working.

The tide is changing

I haven’t heard developers discussing parallelism all that much yet, and that concerns me slightly given the impending dependence on parallelisation that high performance software solutions are going to have in the coming years. I wonder if perhaps this is because Microsoft hasn’t released much specific about it yet (present post topic excluded) and therefore many in the .NET space have simply ignored the parallelisation issues. Certainly some developers are looking at languages such as Erlang, which is designed with parallel development in mind and enables the creation of massively parallel software.

A key thing to remember is that parallelisation is not a solved problem. Simply dropping in a Microsoft assembly is not going to mean that your solutions run a lot better all the time – hence my advice that developers everywhere should be sharpening their saw and rediscovering exactly what the long-bearded lecturer in their computer science concurrency class was babbling about.

I would urge every developer to, at the very least, do some reading up on threading, concurrency and even perhaps try the Parallel Extensions.

– JD


8 August 2007 in Tools | Comments (1)

First off, apologies if you were drawn to this post thinking I was going to be talking about the benefits of hot vaporised water ;)

Some of the gamer audience that tracks my blog will know Steam as a game delivery platform created by Valve Software. Users can sign up, buy games and download them directly through the Steam application. The upside is that when I purchase games I no longer need to worry about losing the installation media or about keeping them up to date with patches, as Steam takes care of all of this for me.

I had my first brush with this service a few years back at Intergen with a game called Counter-Strike (a game I played for 5 minutes, didn’t like and never played again). The service was slow and painful to get up and running, and I never looked at it again until a few days ago when Newt mentioned the id Super Pack. I was super impressed – just about every game and add-on that id Software has ever released in one convenient purchase (from Keen, Wolfenstein, Doom and Quake to Hexen and Heretic). What makes this work even more beautifully is that it’s already configured with DOSBox, so, as a Vista user, I can still happily run Commander Keen as if I was back on my 486 without having to muck around with EMS memory or configuring the IRQ of my sound card. Nice!

All of that cost about 65 US dollars – so it is a very efficient way of getting a heap of games (perhaps if you have children and don’t mind exposing them to some first person shooting action this would make a fantastic gift and keep them busy for months). It also helps that our dollar is as high as it is right now.

There are far more than just id software titles on Steam, if you’re interested check out:

Now all I need to do is find a service where I can purchase time to actually play some games again.

– JD


6 August 2007 in Mindscape & Tools | Comments (1)

Late last Friday we pushed LightSpeed 1.0 out the door for everyone to poke and prod! I’m really pleased with what has been produced and have had some great feedback to date about the performance and the elegance of the API.

As much fun as we had celebrating the occasion of shipping our first product, we’re not going to rest: now it’s time to step up to the plate and educate developers and technology vendors about the benefits of using LightSpeed in their applications. I honestly believe that this product will enable your developers to build data driven solutions faster than ever before.

While the market is pretty full, the feedback is that our approach works much better than most competitors’. We’ve already had several purchases of LightSpeed, and those commercial customers have been really pleased with it. It’s always a good sign that you’re ready to ship when someone takes the risk of running beta software, has it pay off for them, and starts praising the framework before you even reach 1.0 status. That probably sounds like bragging, but I honestly am just really happy with what has been created.

I’d love for anybody who keeps track of my blog to download the Express edition and have a play with the samples. You will see the difference right away and hopefully appreciate that it’s worth using in your application development (and if you don’t decide that, let me know why so we can work on it ;) ).

Phew! It feels great to ship! :)

– JD

P.S. We also upgraded our site on Friday with new forums, content and a slightly enhanced look and feel. If you were part of the EAP program, your credentials and posts have been migrated :)


21 June 2007 in Code & Tools | Comments (2)

Earlier this year I was tooling around with PowerShell, generally having a fun time with it, and, based on a comment from Andrew, decided to see how difficult it would be to write a PowerShell drive provider. A drive provider is a mechanism that allows you to navigate any store with the same commands as navigating the file system. PowerShell ships with several providers already, including a registry provider and a certificate store provider. This means you are able to “cd” into your registry and run commands like “dir” or “ls” on a node to get a list of all its children, for example.

The attached download includes working, commented code and a detailed tutorial PDF on how to write a drive provider that will enable you to mount a zip file as a drive and then navigate through it. This is powerful because it removes the need to extract an archive before browsing it. I’ve implemented it in a simplistic manner and there is plenty of room for additional features, but I wanted to keep it simple for educational purposes. Note that you will need PowerShell installed even to compile, as the project relies on several assemblies that ship with PowerShell.

So if you find yourself lying awake at night wishing you could mount zip files as drives within PowerShell (don’t we all?) or just want to learn more about writing drive providers or managed plug-ins for PowerShell then check this out.

Download the PowerShell Zip Drive Provider and Tutorial

Hope this helps people have more of a poke at PowerShell,

– JD
