Saturday, 6 October 2012

How to update/install Node on Ubuntu

I needed to upgrade Node.js on my Ubuntu dev machine but could not find any good instructions on the internet. I tried several suggestions and finally got it working using an amalgamation of a few blogs.
My system setup before the process:
ubuntu64:~$ which node
/usr/local/bin/node
ubuntu64:~$ node -v
v0.4.9

First, make sure your system is up to date:
ubuntu64:~$ sudo apt-get update
ubuntu64:~$ sudo apt-get install git-core curl build-essential
            openssl libssl-dev

Then clone the Node.js repository from GitHub:
ubuntu64:~$ git clone https://github.com/joyent/node.git
ubuntu64:~$ cd node

I wanted the latest tagged version:
ubuntu64:~/node$ git tag
....
....big list of all the tags
....
ubuntu64:~/node$ git checkout v0.9.2

Then I removed the old version of Node:
ubuntu64:~$ which node
/usr/local/bin/node
ubuntu64:~$ cd /usr/local/bin
ubuntu64:/usr/local/bin$ sudo rm node

Now to install the desired version, in my case v0.9.2:
ubuntu64:/usr/local/bin$ cd ~/node
ubuntu64:~/node$ ./configure
....
ubuntu64:~/node$ make
....
ubuntu64:~/node$ sudo make install
....

Then I had to run the following to update the profile:
ubuntu64:~/node$ . ~/.profile
Finally, confirm that Node is in fact upgraded, and that npm has magically been installed too :-) bonus.
ubuntu64:~/node$ which node
/usr/local/bin/node
ubuntu64:~/node$ node -v
v0.9.2
ubuntu64:~/node$ which npm
/usr/local/bin/npm
ubuntu64:~/node$ npm -v
1.1.61

Wednesday, 22 August 2012

Continuous Integration performance testing. An easily customisable solution.


Using JMeter to profile the performance of your web application and visualise performance trends, all within the CI pipeline.


The full solution as outlined here can be found on my GitHub repository at https://github.com/DamianStanger/CIPerformance

Introduction

Most companies care about the performance of their web sites and web apps, but testing that performance is often left until the last minute, in the hope that the devs will have been writing performant code for the last x months of development. I don't know why this is so often the way. If performance really is a major Non Functional Requirement (NFR) then you have to test it as you go. You can't leave it until just before deployment and then, when you find performance is not good enough, try to hack in quick fixes. Performance cannot be hacked in after the fact; doing it well can take a substantial change to the design.

On our team we have been performance profiling each important page of our app since month 1 of the development process; we are now live and working towards the 4th major release. The team and I have found our continuous performance testing invaluable. Here is the performance graph as it stood a few weeks ago:

The process outlined below is not a method for stress testing your app. It's not designed to calculate the load that can be applied; instead it's used to see the trend in the performance of the app. Has a recent check-in caused the home page to perform like a dog? Any N+1 DB selects or recursive functions causing trouble? It's a method of getting quick feedback within the CI pipeline, minutes after a change is checked in.

The process

1. When we check in, our CI box (TeamCity) runs the build (javascript tests, unit tests, integration tests, functional tests, acceptance tests); if all this is successful then the performance tests are kicked off.
2. Tear down the DB and restore a fresh copy (so we always have the same data for every run; this DB has a decent amount of data in it, simulating your live data in terms of volume and content).
3. Kick the web apps to prepare them for the performance tests; this ensures IIS has started up and the in-memory caches are primed.
4. Run the JMeter scripts.
a. There are numerous scripts which simulate the load generated by different categories of user. For example, a logged-out user will have a different performance profile to a fully subscribed user.
b. We run all the scripts in serial, as we want to see the performance profiles of each type of user on each site we run.
5. The results from each run are processed by a powershell script which extracts the data from the JMeter log files (.jtl) and writes the results into a SQL Server database (DB). There is one record per page per test run.
6. We have a custom MVC app that pulls this data from the DB (using Dapper) and displays it to the team on a common monitor (using JSON and RGraph) that is always updating. We see instantly after we have checked in whether we have affected performance, for good or bad. We could break the build if we wanted, but decided that was a step too far, as it can sometimes take a day or two to fix a poorly performing aspect of the site.

A stripped-down version is available on my GitHub account. Run the powershell script a few times and then run the MVC app, and you should see something like the following:

The juicy bits (interesting bits of code and descriptions)

Powershell script (runTest.ps1)

• Calling out to JMeter from powershell on line 112
& $jmeter -n -t $test_plan -l $test_results -j $test_log

• Parse JMeter results on line 133
[System.Xml.XmlDocument] $results = new-object System.Xml.XmlDocument
$results.load($file)
$samples = $results.selectnodes("/testResults/httpSample | /testResults/sample/httpSample")


Then iterate all the samples and record all the page times and errors

• Write results to DB on line 171
$conn = New-Object System.Data.SqlClient.SqlConnection($connection_string)
$conn.Open()
foreach($pagestat in $page_statistics.GetEnumerator())
{
    $cmd = $conn.CreateCommand()
    $name = $pagestat.Name
    $stats = $pagestat.Value
    $cmd.CommandText = "INSERT Results VALUES ('$start_date_time', '$($name)',
    $($stats.AverageTime()), $($stats.Max()), $($stats.Min()), $($stats.NumberOfHits()),
    $($stats.NumberOfErrors()), $test_plan)"
    $cmd.ExecuteNonQuery()
}
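The sample-parsing step above can be sketched outside PowerShell too. Here is a minimal, hypothetical JavaScript analogue (not the project's code); it assumes the standard JTL 2.x httpSample attributes: t (elapsed ms), lb (label) and s (success).

```javascript
// Hypothetical sketch: aggregate JMeter httpSample elements into
// per-page statistics, analogous to the PowerShell parsing above.
// Assumes every sample carries the t, lb and s attributes.
function aggregateSamples(jtlXml) {
  const stats = {}; // label -> { hits, errors, total, min, max, average }
  const sampleRe = /<httpSample\b([^>]*?)\/?>/g;
  let m;
  while ((m = sampleRe.exec(jtlXml)) !== null) {
    const attrs = m[1];
    const label = /\blb="([^"]*)"/.exec(attrs)[1];        // page label
    const elapsed = Number(/\bt="(\d+)"/.exec(attrs)[1]); // elapsed ms
    const ok = /\bs="true"/.test(attrs);                  // success flag
    const s = stats[label] ||
      (stats[label] = { hits: 0, errors: 0, total: 0, min: Infinity, max: 0 });
    s.hits += 1;
    if (!ok) s.errors += 1;
    s.total += elapsed;
    s.min = Math.min(s.min, elapsed);
    s.max = Math.max(s.max, elapsed);
    s.average = s.total / s.hits;
  }
  return stats;
}
```

A real implementation would use a proper XML parser rather than regexes, but the shape of the aggregation is the same.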

JMeter scripts

You can look up JMeter yourself to find suitable examples. The project posted here just has a very simple demo script which hits Google and Bing over and over; you can replace this with any JMeter script you like. The DB and the web app are page- and site-agnostic, so it should be easy to swap in your own scripts, and it will pick up your data and just work.
I recommend testing all the critical pages in your app, but I find the graphs get too busy with more than 10 different lines (pages) on them. If you want to test more stuff just add more scripts and graphs rather than have loads of lines on one graph.
The generic solution given here has two scripts but you can actually have as many as you like. Two would be a good choice if you had a public-facing site and an editor admin site, which have different performance profiles and pages. But in the end it's up to you to be creative in the use of your scripts and test what really needs testing.

The results DB

The DB is really simple. It consists of just one table which stores a record per page per test run. This DB needs creating before you run the script for the first time. The file Database.sql will create it for you in SQL server.

The MVC app

Data layer, Dapper

Using dapper (a micro ORM installed through nuget) to get the daily results is done in the resultsRepository class:

var sqlConnection = new SqlConnection("Data Source=(local); Initial Catalog=PerformanceResults; Integrated Security=SSPI");
sqlConnection.Open();
var enumerable = sqlConnection.Query(@"
SELECT Url, AVG(AverageTime) As AverageTime, CAST(RunDate as date) as RunDate FROM Results
    WHERE TestPlan = @TestPlan
    GROUP BY CAST(RunDate as date), Url
    ORDER BY CAST(RunDate as date), Url", new { TestPlan = testPlan });
sqlConnection.Close();
return enumerable;

The view, JSON and RGraph

In this sample code there are four different graphs on the page: two for Google (test plan 1), and two for Bing (test plan 2). Heartbeat data shows a data point for every performance run, so you can see instantly if there has been a bad run; it covers all the runs over the last two weeks. The Daily Averages show a data point per day for all the performance data in the DB.
There are four canvases that contain the graphs; these are all drawn using RGraph from JSON data populated from the data pulled off the DB. It's the javascript function configureGraph that does this work with RGraph; for details of how to use RGraph see the appendix.
The JSON data is created from the model using LINQ in the view as such:
dailyData: [@String.Join(",", Model.Daily.Select(x => "[" + String.Join(",", x.Results.Select(y => y.AverageTimeSeconds).ToList()) + "]"))],

This will create something like the following depending on the data in your DB:
dailyData: [[4.6,5.1],[1.9,2.2],[4.0,3.9],[9.0,9.0]],
Where the inner numbers are the data points of the individual lines. So the data above represents four lines with two data points each.
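If it helps, here is a hypothetical sketch (the row field names are my own, not the project's) of how rows shaped like the SQL GROUP BY output could be folded into that nested-array form:

```javascript
// Hypothetical sketch: fold rows like { url, runDate, averageTime } into
// [[line1day1, line1day2, ...], [line2day1, ...]] - one inner array per
// URL (graph line), one element per day.
function toDailyData(rows) {
  const days = [...new Set(rows.map(r => r.runDate))].sort();
  const urls = [...new Set(rows.map(r => r.url))].sort();
  return urls.map(url =>
    days.map(day => {
      const row = rows.find(r => r.url === url && r.runDate === day);
      return row ? row.averageTime : null; // null where a day had no run
    })
  );
}
```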
Customising things for your own purposes

So you would like to customise this whole process for your own purposes? Here are the very simple steps:
  1. Edit the function CreateAllTestDefiniotions in RunTest.ps1 to add in any JMeter scripts that you want to run as new TestPlanDefinitions.
  2. Change or add to the JMeter scripts (.jmx) to exercise the sites and pages that you want to test.
  3. Add the plan definitions to the method CreateAllPlanDefinitions of the class PlanDefinition in the performance stats solution. This is all you need to edit for the web interface to display all your test plans. The graphs will automatically pick up the page names that have been put into the configured JMeter scripts.
  4. Optionally change the yMax of each graph so that you can more easily see the performance lines to a scale that suits your performance results.

Conclusion

We as a team have found this set-up very useful. It has highlighted many issues to us, including N+1 select issues, Combres configuration problems, and any number of issues with business logic, usually with enumerations or recursive functions.
When set up so that the page refreshes every minute it does a really good job, serving as a constant reminder to the team to do a good job with regard to the performance NFR.

A note on live performance/ stress testing

Live performance testing is a very different beast altogether; the objective is to see how the system as a whole reacts under stress and to determine the maximum number of page requests that can be served simultaneously. This is different to the CI performance tests outlined above, which run on a dev box and are only useful as a relative measure to see how page responsiveness changes as new functionality is added.

Appendix

JMeter - https://jmeter.apache.org/
Dapper – Installed from Nuget
RGraph - http://www.rgraph.net/
GitHub - https://github.com/DamianStanger/CIPerformance
VS2012 - https://www.microsoft.com/visualstudio/11/en-us

Tuesday, 15 May 2012

A PowerShell script to count your lines of source code

We have been thinking about code quality and metrics of late, and since I'm also learning more PowerShell I decided to write a little script to help. It finds all the code files in the project directories and counts the lines and files:

Here it is:

$files = Get-ChildItem . -Recurse | `
    where-Object {$_.Name -match "^.+\.cs$"}
$processedfiles = @();
$totalLines = 0;
foreach ($x in $files)
{
    $name= $x.Name;
    $lines= (Get-Content ($x.Fullname) | `
        Measure-Object -Line ).Lines;
    $object = New-Object Object;
    $object | Add-Member -MemberType noteproperty `
        -name Name -value $name;
    $object | Add-Member -MemberType noteproperty `
        -name Lines -value $lines;
    $processedfiles += $object;
    $totalLines += $lines;
}
$processedfiles | Where-Object {$_.Lines -gt 100} | `
    sort-object -property Lines -Descending
Write-Host ... ... ... ... ...
Write-Host Total Lines $totalLines In Files $processedfiles.count


The Get-ChildItem call with -Recurse gets all the .cs files from the current working folder and below.
The Measure-Object -Line cmdlet gets the number of lines in the file currently being processed.
New-Object creates an object, and the two Add-Member calls dynamically add properties to it for the file name and the line count.
$processedfiles += $object adds the new object to the end of the array of processed files.
The final Where-Object selects all the files from the array whose line count is greater than 100 (an arbitrary amount; I only care about files longer than roughly two screens of text), then prints them out in descending order of line count.

My results:
Our current project has a total of 154,068 lines of code in .cs files:
2,559 .cs files, of which 312 have a line count greater than 100 lines.
16 files are over 400 lines long, but none of those are in the main product (all the worst classes are test classes and helpers, which are not production code).

I also wondered about the state of my views:
320 .cshtml files with a total of 10,958 lines; the vast majority are less than 100 lines, and only 6 are over 150.

Tuesday, 17 January 2012

Linq performance problems with deferred execution causing multiple selects against the DB


We have some really good performance tests that run on every check-in, providing the team with an excellent view of how the performance of the software changes as the code base evolves. We recently saw a drop in performance and tracked it down to a problem in our data layer.

The problem we encountered was within LINQ to SQL, but it can bite in other flavours of LINQ too if you're not careful.

Personally I consider LINQ to SQL to be dangerous for a number of reasons and would prefer not to be using it, but we are where we are, and as a team we just need to be wary of LINQ to SQL and its quirks.

The quirk in question: deferred execution of a LINQ to SQL enumeration causing multiple selects against the DB.

As this code demonstrates:

public IList<IndustrySector> GetIndustrySectorsByArticleId(int articleId)
{
  var industrySectorsIds = GetIndustrySectorIds(articleId);
  return ByIds(industrySectorsIds);
}

private IEnumerable<int> GetIndustrySectorIds(int articleId)
{
  var articleIndustrySectorsDaos = databaseManager.DataContext.ArticleIndustrySectorDaos.Where(x => x.ArticleID == articleId);
  return articleIndustrySectorsDaos.Select(x => x.IndustrySectorID);
}

public IList<IndustrySector> ByIds(IEnumerable<int> industrySectorIds)
{
  return All().Where(i => industrySectorIds.Contains(i.Key)).Select(x => x.Value).ToList();
}


public IEnumerable<IndustrySector> All()
{
  //work out all the industry sectors valid for this user in the system, this doesn't make a DB call
}

So in the end this causes a number of identical queries to be fired against the DB; industrySectorsIds.count calls to the DB, to be precise.
This is the select we were seeing:

exec sp_executesql N'SELECT [t0].[IndustrySectorID]
FROM [dbo].[tlnk_Article_IndustrySector] AS [t0]
WHERE [t0].[ArticleID] = @p0',N'@p0 int',@p0=107348

Forcing the ByIds() method to retrieve all the ids from the DB before iterating All() means that they are loaded into memory once only:

public IList<IndustrySector> ByIds(IEnumerable<int> industrySectorIds)
{
  var sectorIds = industrySectorIds.ToList();
  return All().Where(i => sectorIds.Contains(i.Key)).Select(x => x.Value).ToList();
}

Now you only get one call to the DB. Thanks LINQ to SQL, you're great.
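The pitfall is not unique to LINQ to SQL; any lazily evaluated sequence that re-runs its source on every enumeration behaves the same way. Here is a small JavaScript sketch (an analogue with hypothetical names, not the original C#) where a generator stands in for the deferred DB query:

```javascript
// JavaScript analogue of the LINQ pitfall. The generator stands in for
// the deferred LINQ to SQL query: every fresh enumeration "hits the DB".
let queryCount = 0; // counts simulated round trips to the DB

function* industrySectorIdsQuery() {
  queryCount += 1;    // one "SELECT" per enumeration
  yield* [3, 7, 9];
}

// Broken shape: the membership test re-enumerates the deferred query
// for every candidate checked, firing one "query" per candidate.
function byIdsDeferred(candidates) {
  return candidates.filter(c => [...industrySectorIdsQuery()].includes(c));
}

// Fixed shape (the ToList() fix above): materialise the ids once,
// then do the membership tests in memory.
function byIdsMaterialised(candidates) {
  const ids = [...industrySectorIdsQuery()];
  return candidates.filter(c => ids.includes(c));
}
```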

Monday, 7 November 2011

Installing node.js on windows 7 machine

I've recently been looking for instructions on how to install node.js (the javascript-powered web server) on Windows, and found some quite confusing answers involving Cygwin, building things from source, etc.
And yes, it is possible to install node on Windows 7. In fact... it's really, really easy.

1. Download node.exe from http://nodejs.org/dist/v0.6.0/node.exe (link available from http://nodejs.org)
2. Create a sample node app:

var http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(1337, "127.0.0.1");
console.log('Server running at http://127.0.0.1:1337/');


3. Save it in a file 'example.js' in the same folder as node.exe (anywhere on your system).
4. Open a command prompt in the folder with node.exe and your .js file, then run:

> node example.js

And that's it. Now open your browser and go to http://127.0.0.1:1337/

Hello World

How easy was that?

Thursday, 20 October 2011

Kanban inspired card wall. Our example

Overview
Some call them 'card walls', some 'story walls', others just 'the wall'. There are many names, but the purpose is the same: to convey project information to the team and other interested parties and stakeholders. I work on a software development team, but Kanban boards can be applied to any process, from manufacturing to household to-do lists; honestly, the uses are as wide-ranging as your imagination. This article, though, concentrates on software development.

Some teams practice 'scrum' and put tasks on the wall, some XP and run on weekly/2 weekly/4 weekly iterations, but we are running in a kanban mode at the moment, applying Work In Progress (WIP) limits as best we can.

I've worked on many different teams, with many different types of wall/board, from scrum to XP, from big walls to tiny boards. I think the Kanban-inspired wall we are currently running with is the best I've had the good fortune to use. So I thought I'd share; I'll try to run you through it as best I can.

The wall
We are fortunate enough to have a very big space for our card wall (3-4 metres wide). I have been on teams that have had to manage with a small white board. Having said that, we have still filled every available space, and it would be nice if we had even more.

The size allows us to show lots of project-relevant information and also include plenty of room for upstream activities like information architecture (IA), user experience (UX) and analysis. So we have lots of visibility into the state of all aspects of the project.

The Columns
The columns are detailed below in their own sections, but to summarise they are:
  • Design
    • In visual design
    • In customer testing
    • In front end dev
  • Analysis
    • In analysis
    • Selected for dev
  • Development
    • In dev
    • Dev complete
  • QA
    • In QA
    • QA complete
  • Done
  • Technical tasks
  • Risks
The WIP Limits
Work In Progress limits help us to focus on the tasks currently in play. Too many tasks on the board at any one time can mean the team is spread too thinly. Traffic jams can occur and important bug fixes won't get through quickly.

Example without WIP limits:
Two big stories have just been finished, so QA gets to work on them while the developers start on new stories. At this point all four dev pairs are busy, as are the two QAs. A day later the testing is still not complete and another pair finishes their story, which gets queued up waiting for QA. If there were no WIP limits the pair would start a new story and get on with that work. But what happens if there is then a bug in one of the stories, a big bug? Who should pick it up? A dev pair needs to stop what they are doing mid-flow, shelve all their current checked-out work, context switch to the bug, investigate and fix it. I guess you know where this is going: with effectively three stories in QA and four stories in dev, everyone is going to be flip-flopping between stories and bug fixes. Context switching costs time, and the whole process reduces our ability to respond to change. The main consequence is that throughput and velocity suffer, not to mention the headaches of leaving half-finished code lying about whilst you fix something else.

By limiting the number of stories in each column this does not happen (well, much much less so). Developers are always available for fixes and everyone experiences less context switching and more uninterrupted flow.

The Story Cards
We have four different coloured cards on the wall: yellow, blue, white and pink. The different colours mean different things depending on where they are on the wall, but it's quite straightforward.
  • Yellow: Design tasks. Our UX guys work on the yellow cards in the first 3 columns, working on the IA, wireframes, or styling of upcoming stories.
  • Yellow: Development tasks, tech tasks and tech debt. All things that are not stories but need playing, e.g. set up the performance test environment, refactor X to remove duplication, or even spike out Y to prove integration points.
  • Blue: Features under analysis. These cards only appear in the analysis and done columns and represent whole features, big things. I think there are around 15 in release one.
  • Pink: Bugs, simples [sic].
  • White: Most of the cards flowing across the wall are white. These denote stories that require dev effort. We try to ensure that all the white cards can be finished in 1-4 days.

White Story Card
  • Story number for reference in Mingle or TFS.
  • Story title, brief, must fit on one card in thick writing.
  • Days in play (the dots get added every morning before our daily stand-up meeting). This allows us to see how long cards take to flow through the wall.
  • Story point estimate (size/complexity)
  • Relevant notes can be captured on the reverse if needed.

Yellow Tech Card
  • Usually these do not have reference numbers.
  • Usually no estimate either.
  • Just a brief description of the work required.
  • Again relevant notes can be captured on the reverse if needed.

Flow across the wall
Features start their life in the 'In analysis' column. All the features designated for release 1 sit in this column in priority order. While the BAs work on splitting out stories, the UX and IA guys are also looking into how best to fulfil the users' requirements from a design perspective, and they put their own cards up on the left of the board.
Once a story is ready to be developed, a white card is placed in the 'Selected for dev' column, where it waits until a dev pair is ready to pick it up. At that point it moves into 'In dev' until the pair has showcased the story to the QA's and BA's satisfaction, at which point it moves into 'Dev complete' (awaiting QA).
The QAs pick up cards and work on them in the 'In QA' column. Cards NEVER move backwards; if there are bugs to fix, a dev pair will move to fix them in the QA column, and whatever they were doing remains in the 'In dev' column. Once the QA is happy with the card it moves into 'QA complete', where the BAs showcase the story to the product owner before moving it to 'Done'.

The Columns In Detail
Design
Walls I've often worked with in the past have not had a design area on them. This project has a heavy design and UX aspect and so it was important to have this area on the wall so we can all see what features are at which point in the process.
The columns we have here are: 'In visual design', for development of the wireframes and IA; 'In customer testing'; and 'In front end dev', for development of the UX including styling and image creation.

Analysis (WIP 6)

The analysis columns consist of 'In analysis', which lists all the features of the current release (release one is about nine months); we keep stickies on the features (blue cards) to show how complete each feature is. The 'Selected for dev' column holds the next stories to be picked up by the dev team (white cards). These stories have been through the analysis process and are ready for devs to pick up. The column also serves as a heads-up for the QAs, so they can start getting ready for stories, working out acceptance criteria and so forth with the BAs and devs.

Dev (WIP 4)

We currently have four dev pairs and so have set our WIP limit to four for the combination of the two dev columns. At the beginning of the project we set the WIP limit to three, as we always had a pair on tech tasks, environment set-up and the like, but now that we are moving into the second half of the first release most of that type of task has been finished. If there are four white cards in the two dev columns, the next pair to finish should help out with the testing effort or work on some tech tasks. This helps keep the actual work in progress on the board to a manageable amount.

QA (WIP 2)


The 'In QA' column has a WIP limit of two as we have two testers. If a story gets blocked due to a bug then devs need to come and help clear the blockage. The card can NEVER move back into dev, and the QA can't pick up a new card from dev complete until the issues are resolved. This focuses attention and ensures that stories progress along the wall in a timely manner, with the emphasis always on getting completed work 'done done' so that we can claim the points and move on.

Done (Done Done)

This column contains all the stories finished this week. Done Done. These stories are ready to be showcased to the product owner and will be demoed to the entire client user team (all stakeholders) on the Friday of each week. Every Monday all the cards are removed from this column, so we can see what has been finished each week and so that the product owner and other stakeholders can see the progress since last Friday's showcase.

Metrics


This area contains all the different metrics, the different graphs and guides.
  • Cumulative Flow
  • Weekly Velocity 
  • Points Burn Up
  • Risk Count
  • Story points Guide
  • Done Done Guide

Risks/Issues/Tech Tasks/Tech Debt Wall
There are four columns in this section of the wall. The leftmost three are for tech debt and tech tasks; the rightmost is for capturing risks and making them more visible.

Column 1 (Things we must do to make the project successful).
Things like integrate with the corporate authentication system, spike asset management or setup performance testing environment.

Column 2 (Things that would enable us to deliver faster)
Includes things like refactor acceptance tests to reduce build time, and optimise windows configuration on all dev machines.

Column 3 (Things that are not essential to the success of the project but will give us a better solution)
Mainly includes lists of refactorings and areas we want to achieve better test coverage.

Column 4 (Risk wall)
Any risk to the project, from integration points, new technology and people, to computers and hardware. Listed with the highest risk at the top. The arrow represents whether the risk is increasing or decreasing as time passes.

Summary
So that is our wall. It saves us time, keeps us focussed on what's important, and neatly tracks our progress.

How does your team work? I would be interested to know what other types of walls are out there. I've seen a few in my time but there is always another way of doing it - what does your wall do for you?


Glossary 
BA - Business Analyst
IA - Information Architecture
Kanban - http://en.wikipedia.org/wiki/Kanban
Mingle - http://www.thoughtworks-studios.com/mingle-agile-project-management
QA - Quality Analyst
Scrum - http://en.wikipedia.org/wiki/Scrum_(development)
Tech Debt - http://martinfowler.com/bliki/TechnicalDebt.html
TFS - Team Foundation Server
UX - User eXperience
WIP - Work In Progress
XP - eXtreme Programming - http://en.wikipedia.org/wiki/Extreme_Programming

Wednesday, 31 August 2011

How much does your slow machine cost your company?


The problem

I'm currently working at your standard large company. The computers are managed centrally and are replaced every 3 or 4 years or so. The machine I'm working on is a Dell running Windows 7 32bit with 3.21 GB of ram, an Intel core 2 duo CPU at 3GHz and on-board graphics running 2 monitors. Not a terrible machine you may say, but far far from good enough as a developer machine.

We are developing an MVC 3 enterprise web app running on SQL Server 2008, LINQ to SQL, VS2010, Resharper 6, TFS integration. Javascript tests, unit tests, integration tests, feature tests and acceptance tests make up a sizeable testing suite. As you can imagine, this set-up can put quite a load on the machines especially as our app has been growing fast. Our team's velocity is still good, so the code base is growing quickly.

We are finding that the machines hang periodically. All the dev machines are quite slow, and although all are exactly the same spec, for some reason a few are terribly slow, and we avoid pairing on those machines. Then there are the 30-60 second waits for Visual Studio builds, the 5 minutes for the check-in builds, the time to compile and run tests as you are doing TDD, the time our tests run for, the 10+ seconds for Visual Studio and ReSharper to refactor things or find things. It all adds up, but how much does it cost? Not only does it cost in terms of wasted time but also in context switching. For example, you're doing TDD: you write a test, do some coding and hit run, but have to wait 30+ seconds for the test to run. This is long enough to break your flow; you have a quick think about something else, and then you realise the test has run and you need to switch your attention back. You might have a quick chat about something else with your pair.

We know it's hurting our velocity but without numbers it's difficult to convince management of the true costs.

So what did we do?

We took a stopwatch, kept it with us all day, and recorded all the time we spent waiting for the computer to do something - opening apps, running builds and tests, searches and refactorings in Visual Studio - any time at all where the developer had to wait for the machine to work. Be it 5 seconds or 5 minutes, the stopwatch was running. It took quite a lot of discipline. The results were startling.

Results

I did this for a week, every day, and so did a colleague. Our results were very similar: on average we were sitting unproductive for a collective 30 to 60 minutes a day, with a couple of days at 15 minutes (mainly days with meetings) and a couple of days at a whopping 75 minutes (perversely, these were days when we were going quite quickly and getting things done, but running builds/tests/check-ins all day takes time).

So let's say 40 minutes wasted per pair per day. That sounds like a lot, but it's what the stopwatch said. You try it for a day. What are your numbers?

The machine

We are running Windows 7 on a machine that originally was running server 2003, so the requirements on the machine are different. We have turned off all the Aero UI elements from the machines (we have on board graphics). We've turned off the virus checking (controversially, as this is a corporate environment), and followed numerous tutorials on the internet about improving the speed of windows, but all to no avail.

If we had a faster, better, newer machine, how much would we gain? Well, that's subjective, but for argument's sake say we had a great machine and could cut all the waiting/processing time in half (*1).

The Costs

What is the cost to the business of one dev per day? It depends on your company, your devs, their seniority, your support staff, building costs, electricity, etc. There are many factors, but let's say £200 (this is a conservative estimate *2).

Our team consists of 10 devs, 3 UI/UX/designers, 2 QAs/testers, 2 BAs (analysts), 1 PM (manager) = 18 people. If we assume they each cost a nominal £200 per day, that is £3600 a day for the team as a whole.

Devs are the constraint on the throughput of the system (the bottleneck). That's 10 people losing 40 mins a day, so 400 minutes a day of lost development effort. We said a fast machine might cut this in half, so we have 200 mins of needlessly lost development effort per day. How much does that cost?

We work an 8-hour day: 10 devs x 8 hours = 4,800 minutes of dev time available per day.

So the cost per minute for the throughput is £3,600 / 4,800 = £0.75 per minute. (Wait, you say: you took the cost of the team and divided it by the time of the developers? I shall explain *3)

Finally, £0.75 x 200 mins = £150 lost by the business every day.
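The back-of-envelope arithmetic above can be sketched as a quick script (figures taken straight from this post; a sketch of the reasoning, not a rigorous cost model):

```python
# Figures from the post: an 18-person team at a nominal £200/day each,
# 10 devs working 8-hour days, 400 dev-minutes lost per day to slow machines.
team_cost_per_day = 18 * 200              # £3,600 for the whole team
dev_minutes_per_day = 10 * 8 * 60         # 4,800 minutes of dev time available

# Theory of constraints: the bottleneck (the devs) carries the whole team cost.
cost_per_dev_minute = team_cost_per_day / dev_minutes_per_day      # £0.75/min

minutes_lost_per_day = 10 * 40            # 400 minutes lost across 10 devs
minutes_saved = minutes_lost_per_day / 2  # fast machines halve the waiting: 200

daily_loss = cost_per_dev_minute * minutes_saved   # £150 lost every day
payback_days = (5 * 1000) / daily_loss             # 5 machines at £1,000 each

print(cost_per_dev_minute, daily_loss, round(payback_days))
```

Running it reproduces the numbers in the text: £0.75 per dev-minute, £150 lost per day, and a roughly 33-day payback on the new machines.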

Conclusion

To sum up, it is costing the business £150 per day to have 10 developers on slow machines. Given we are pairing on all dev work, we only need 5 new fast machines. You can get a great machine for £1,000 (*4), which will be future-proofed for the next 2 years or so. That's £5,000 in total.

£5,000 / £150 = 33 days. That's how many days it takes for the new machines to pay for themselves in developer time no longer wasted.

So is that £1,000 for a new dev machine worth it? I conclude it is, if the project is longer than 33 days. Is yours?



Appendix
  • Theory of Constraints on Wikipedia.
  • The Goal (novel) is an excellent book that describes the theory of constraints at a manufacturing plant in story format.
  • 1 - We will only know how much better a great machine would be when we get one and use it on our project, but I suspect given the state of our current machines it will be half the time.
  • 2 - This is a low estimate, especially as I'm a consultant and get charged out at a higher rate than £200/day, but £200 is likely to be conservative for any company employing developers.
  • 3 - The theory of constraints says that the cost of the bottleneck is equal to the cost of the system as a whole. The goal of the system is to produce finished, production-ready functionality. Which part of the system is the bottleneck, stopping you from delivering more functionality? On our team it's us, the developers. We only release what is coded up every week. Nothing the testers or analysts can do can make us deliver more, so the output of the system is governed by the velocity of the devs. The cost of the team (in our case £3,600) is the cost of the 20 story points we deliver that week, and it's the devs who control that velocity. That's not to say the other roles are not important, they are, critically important, but they are not the bottleneck.
  • 4 - Yes, you should get a good machine for that; you could actually get a great computer for £500 - £600. We already have the monitors and peripherals, and given we are a big company we would qualify for Dell's corporate rates.

Wednesday, 20 July 2011

model dynamic causing unhandled exception

We are using ASP.NET MVC 3 with Razor views on one of the projects I'm currently working on. MVC3 uses C# 4.0, so we get access to dynamic types. This is really convenient when creating partial views: we can build a model in one partial and pass the new dynamic model into another partial, with no need for strongly typed view models everywhere.

But we have found that dynamic models can hide other errors, with very time-consuming consequences.
This is the partial "_Topics" that was causing us problems:

@model dynamic

<ul>
    @foreach (var topic in Model.Topics)
    {
        <li id="@topic.Id" class="@( Model.SelectedItems.Contains(topic.Id) ? "selected" : string.Empty )">
            <label>@topic.Name</label>
            @if (topic.HasChildren)
            {
                @Html.Partial("_Topics", new { Topics = topic.Children, SelectedItems = Model.SelectedItems });
            }
        </li>
    }
</ul>

As you can see, _Topics is called recursively.

We made some changes to other areas of our project and got this error (sorry for the very long stack trace, but I wanted to be complete):
Application information: 
    Application domain: /LM/W3SVC/3/ROOT-3-129566639908092111 
    Trust level: Full 
    Application Virtual Path: / 
    Application Path: C:\projects\Foo\Bar\Src\Editor\ 
    Machine name: AAA20711 
 
Process information: 
    Process ID: 5928 
    Process name: w3wp.exe 
    Account name: IIS APPPOOL\Editor 
 
Exception information: 
    Exception type: RuntimeBinderException 
    Exception message: 'object' does not contain a definition for 'Topics'
   at CallSite.Target(Closure , CallSite , Object )
   at System.Dynamic.UpdateDelegates.UpdateAndExecute1[T0,TRet](CallSite site, T0 arg0)
   at ASP._Page_Views_Shared__Topics_cshtml.Execute() in C:\projects\Foo\Bar\Src\Editor\Views\Shared\_Topics.cshtml:line 4
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy()
   at System.Web.Mvc.WebViewPage.ExecutePageHierarchy()
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy(WebPageContext pageContext, TextWriter writer, WebPageRenderingBase startPage)
   at System.Web.Mvc.RazorView.RenderView(ViewContext viewContext, TextWriter writer, Object instance)
   at System.Web.Mvc.BuildManagerCompiledView.Render(ViewContext viewContext, TextWriter writer)
   at System.Web.Mvc.HtmlHelper.RenderPartialInternal(String partialViewName, ViewDataDictionary viewData, Object model, TextWriter writer, ViewEngineCollection viewEngineCollection)
   at System.Web.Mvc.Html.PartialExtensions.Partial(HtmlHelper htmlHelper, String partialViewName, Object model, ViewDataDictionary viewData)
   at System.Web.Mvc.Html.PartialExtensions.Partial(HtmlHelper htmlHelper, String partialViewName, Object model)
   at ASP._Page_Views_Shared__ArticleTabs_cshtml.Execute() in C:\projects\Foo\Bar\Src\Editor\Views\Shared\_ArticleTabs.cshtml:line 42
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy()
   at System.Web.Mvc.WebViewPage.ExecutePageHierarchy()
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy(WebPageContext pageContext, TextWriter writer, WebPageRenderingBase startPage)
   at System.Web.Mvc.RazorView.RenderView(ViewContext viewContext, TextWriter writer, Object instance)
   at System.Web.Mvc.BuildManagerCompiledView.Render(ViewContext viewContext, TextWriter writer)
   at System.Web.Mvc.HtmlHelper.RenderPartialInternal(String partialViewName, ViewDataDictionary viewData, Object model, TextWriter writer, ViewEngineCollection viewEngineCollection)
   at System.Web.Mvc.Html.PartialExtensions.Partial(HtmlHelper htmlHelper, String partialViewName, Object model, ViewDataDictionary viewData)
   at System.Web.Mvc.Html.PartialExtensions.Partial(HtmlHelper htmlHelper, String partialViewName)
   at ASP._Page_Views_ArticleCreate_Create_cshtml.Execute() in C:\projects\Foo\Bar\Src\Editor\Views\ArticleCreate\Create.cshtml:line 23
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy()
   at System.Web.Mvc.WebViewPage.ExecutePageHierarchy()
   at System.Web.WebPages.StartPage.RunPage()
   at System.Web.WebPages.StartPage.ExecutePageHierarchy()
   at System.Web.WebPages.WebPageBase.ExecutePageHierarchy(WebPageContext pageContext, TextWriter writer, WebPageRenderingBase startPage)
   at System.Web.Mvc.RazorView.RenderView(ViewContext viewContext, TextWriter writer, Object instance)
   at System.Web.Mvc.BuildManagerCompiledView.Render(ViewContext viewContext, TextWriter writer)
   at System.Web.Mvc.ViewResultBase.ExecuteResult(ControllerContext context)
   at System.Web.Mvc.ControllerActionInvoker.InvokeActionResult(ControllerContext controllerContext, ActionResult actionResult)
   at System.Web.Mvc.ControllerActionInvoker.<>c__DisplayClass1c.<InvokeActionResultWithFilters>b__19()
   at System.Web.Mvc.ControllerActionInvoker.InvokeActionResultFilter(IResultFilter filter, ResultExecutingContext preContext, Func`1 continuation)
   at System.Web.Mvc.ControllerActionInvoker.<>c__DisplayClass1c.<>c__DisplayClass1e.<InvokeActionResultWithFilters>b__1b()
   at System.Web.Mvc.ControllerActionInvoker.InvokeActionResultFilter(IResultFilter filter, ResultExecutingContext preContext, Func`1 continuation)
   at System.Web.Mvc.ControllerActionInvoker.<>c__DisplayClass1c.<>c__DisplayClass1e.<InvokeActionResultWithFilters>b__1b()
   at System.Web.Mvc.ControllerActionInvoker.InvokeActionResultWithFilters(ControllerContext controllerContext, IList`1 filters, ActionResult actionResult)
   at System.Web.Mvc.ControllerActionInvoker.InvokeAction(ControllerContext controllerContext, String actionName)
   at System.Web.Mvc.Controller.ExecuteCore()
   at System.Web.Mvc.ControllerBase.Execute(RequestContext requestContext)
   at System.Web.Mvc.ControllerBase.System.Web.Mvc.IController.Execute(RequestContext requestContext)
   at System.Web.Mvc.MvcHandler.<>c__DisplayClass6.<>c__DisplayClassb.<BeginProcessRequest>b__5()
   at System.Web.Mvc.Async.AsyncResultWrapper.<>c__DisplayClass1.<MakeVoidDelegate>b__0()
   at System.Web.Mvc.Async.AsyncResultWrapper.<>c__DisplayClass8`1.<BeginSynchronous>b__7(IAsyncResult _)
   at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncResult`1.End()
   at System.Web.Mvc.MvcHandler.<>c__DisplayClasse.<EndProcessRequest>b__d()
   at System.Web.Mvc.SecurityUtil.<GetCallInAppTrustThunk>b__0(Action f)
   at System.Web.Mvc.SecurityUtil.ProcessInApplicationTrust(Action action)
   at System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult)
   at System.Web.Mvc.MvcHandler.System.Web.IHttpAsyncHandler.EndProcessRequest(IAsyncResult result)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

Request information: 
    Request URL: http://editor/article/create 
    Request path: /article/create 
    User host address: 127.0.0.1 
    User: Foo 
    Is authenticated: True 
    Authentication Type: Forms 
    Thread account name: IIS APPPOOL\Editor 
 
Thread information: 
    Thread ID: 9 
    Thread account name: IIS APPPOOL\Editor 
    Is impersonating: False 
    Stack trace: (identical to the exception stack trace above)

It looks like there was a problem with the dynamic model: line 4 is the foreach loop over Model.Topics. But debugging this code showed there was no problem with the model; it was fully populated with lots of topics, and the debugger could see and access them. So what gives?

Well, after investigating, it turned out that another partial used by the Razor view had a compilation error in it. We only found this after removing the _Topics partial. It's very strange that a compilation error in another partial would cause the error we got.

To stop this ever happening again we made the project pre-compile the views along with the rest of the web project, by setting MvcBuildViews to true in the .csproj:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
  ....
    <MvcBuildViews>true</MvcBuildViews>
  </PropertyGroup>
  ....
</Project>
Now we get the real error at compile time, instead of an error in the dynamic model code at run time. Much nicer.

Sunday, 3 July 2011

Linq to Sql many to many delete

My current project uses Linq to Sql. We had a small problem deleting a record from a many-to-many relationship that used a link table containing only two required foreign key columns:

product with productId
order with orderId
productOrder with 2 foreign keys productId and orderId

System.Data.Linq.DuplicateKeyException : Cannot add an entity with a key that is already in use.
System.InvalidOperationException : An attempt was made to remove a relationship between a product and a productOrder. However, one of the relationship's foreign keys (productOrder.productId) cannot be set to null.

The trick was to add DeleteOnNull="true" to the associations on the link table; this lets Linq to Sql delete rows when you modify the collections (many-to-many link tables):

<Table Name="dbo.product" Member="product">
  <Type Name="product">
    <Column Name="productId" Type="System.Int32" DbType="Int NOT NULL IDENTITY" IsPrimaryKey="true" IsDbGenerated="true" CanBeNull="false" />
    <Column Name="Title" Type="System.String" DbType="VarChar(255)" CanBeNull="true" />
    <Association Name="product_productOrder" Member="productOrders" ThisKey="productId" OtherKey="productId" Type="productOrder" DeleteOnNull="true"/>
  </Type>
</Table>

<Table Name="dbo.productOrders" Member="productOrders">
  <Type Name="productOrder">
    <Column Name="productId" Type="System.Int32" DbType="Int NOT NULL" IsPrimaryKey="true" CanBeNull="false" />
    <Column Name="orderId" Type="System.Int32" DbType="Int NOT NULL" IsPrimaryKey="true" CanBeNull="false" />
    <Association Name="product_productOrder" Member="product" ThisKey="productId" OtherKey="productId" Type="product" IsForeignKey="true" DeleteOnNull="true"/>
    <Association Name="order_productOrder" Member="order" ThisKey="orderId" OtherKey="orderId" Type="order" IsForeignKey="true" DeleteOnNull="true"/>
  </Type>
</Table>

// Rewrite the link-table collection, then submit via your DataContext
product.productOrders.Clear();
product.productOrders.AddRange(myNewProductOrders);
dataContext.SubmitChanges();

I found help on this matter here: http://blogs.msdn.com/b/bethmassi/archive/2007/10/02/linq-to-sql-and-one-to-many-relationships.aspx

Wednesday, 22 June 2011

Unit testing your unity IOC wiring

I'm using Unity as the IOC container at the moment, which is great: injecting dependencies into the controllers makes the dev cycle so quick that, without it, TDDing would be significantly more difficult. We can unit test everything, including the controllers, but you all know that.

There is a small downside to using Unity: you only find out about problems with the wiring when you run your app, or your acceptance tests, and then you need to analyse the error and figure out what went wrong.

So I thought: wouldn't it be nice to have a unit test for your Unity configuration? :-)
Well, here it is:

using NUnit.Framework;
using System.Reflection;
using System.Web.Mvc;

namespace Tests.Unit.Consumer
{
    [TestFixture]
    public class ProductionWiringDefinitionTest
    {
        [Test]
        public void ProductionWiringIsComplete()
        {
            IWiringDefinition wiringDefinition = new ConsumerProductionWiringDefinition();
            WiringDefinitionAssertions.AssertWiringDefinitionAllDependenciesResolved(wiringDefinition);
        }
    }
  
    public static class WiringDefinitionAssertions
    {
        public static void AssertWiringDefinitionAllDependenciesResolved(IWiringDefinition wiringDefinition)
        {
            var unityDependencyResolver = UnityDependencyResolver.Instance;
            unityDependencyResolver.Configure(wiringDefinition);
            DependencyResolver.SetResolver(unityDependencyResolver);            
            
            var containerFieldInfo = unityDependencyResolver.GetType().GetField("container", BindingFlags.NonPublic | BindingFlags.Instance);
            var container = (Microsoft.Practices.Unity.UnityContainer) containerFieldInfo.GetValue(unityDependencyResolver);

            foreach (var registration in container.Registrations)
            {
                unityDependencyResolver.GetService(registration.RegisteredType);                   
                unityDependencyResolver.GetService(registration.MappedToType);                   
            }
        }
    }  
}

The lines that do the magic are the two GetService calls inside the foreach loop: basically we try to instantiate everything that Unity knows about. If any dependencies are missing, you will get a nice unit test failure and a very good error message telling you exactly what is wrong.

You may have to change this example a little, as it uses our abstraction IWiringDefinition; we have multiple sites all doing the same thing but with different wiring.

using Microsoft.Practices.Unity;
namespace Framework.Ioc
{
    public interface IWiringDefinition
    {
        void Configure(IUnityContainer container);
    }
}

And an example of our wiring definition

using Diagnostics;
using Microsoft.Practices.Unity;

namespace Consumer.Application
{
    public class ConsumerProductionWiringDefinition : IWiringDefinition
    {
        public void Configure(IUnityContainer container)
        {
            RegisterViewModelMappers(container);
            RegisterDatabaseContext(container);
            RegisterControllers(container);
        }

        private static void RegisterDatabaseContext(IUnityContainer container)
        {
            container.RegisterType<IDatabaseManager, DatabaseManager>();
            container.RegisterType<DatabaseManager>(new InjectionConstructor(new DbConnectionString().ForConsumer));
        }

        private static void RegisterControllers(IUnityContainer container)
        {
            container.RegisterType<DiagnosticsController>(new InjectionConstructor(new ServerDiagnosticsController()));
        }

        private static void RegisterViewModelMappers(IUnityContainer container)
        {
            container.RegisterType<ITopicWithArticleSummariesMapper, TopicWithArticleSummariesMapper>();
            container.RegisterType<IArticleSummaryMapper, ArticleSummaryMapper>();
            container.RegisterType<IToolsPageViewModelMapper, ToolsPageViewModelMapper>();
            container.RegisterType<IToolViewModelMapper, ToolViewModelMapper>();
        }
    }
}

Like I said, the example above is not entirely complete; you will need a little more than this to get your wiring working, but it's a good introduction to actually testing your wiring.

Good luck.