Thursday, 27 August 2009

Successful testing = Successful reputation

Following a number of fascinating comments made by nFocus’s clients on how vitally important software-testing was to maintaining their professional reputation, nFocus embarked upon a search of articles and white papers to see what other evidence it could find on the subject.

A number of articles emerged, none more relevant than one written by Jack Danahy, CTO of Ounce Labs, entitled “Your company’s reputation: Critical but fragile”, dated 7 April 2009. His article concentrates on a security breach at Heartland Payment Systems back in January 2009 and explores the implications for the company’s reputation following negative coverage in the press.

We found that this article supported our own research and so we have written a brief commentary containing the fundamental take-away points.

First, how do you define reputation? And how tangible is it? Jack provides an interesting criterion for judging this whilst investigating the aftermath of the breach three months on. The criterion was a simple Google search, which highlighted a great deal of negative media about the company. This will most likely have been read by clients, prospects and employees alike, causing immeasurable damage to the business.

In Jack's own words, the Google search for Heartland Payment Systems is pretty illuminating and he says, “As one would expect, the first natural topic is the corporate website. Beyond this, it goes downhill pretty fast. Of the remaining nine items in the natural search list, with the exception of a pointer to a secondary company site and the company’s Hoovers listing, everything relates to the breach. That’s a pretty high percentage.”

He continues, “...querying for a vendor and having the second item have “breach” in the URL would likely be a warning flag to someone trying to learn about Heartland....[suggesting] that reputation is a critical, yet fragile thing. Building it and defending it are not small tasks, and a fall from favor can be swift and absolute.”

With that in mind, what is the cost of a damaged reputation? Jack’s view is that there is no simple or short-term solution. He says, “Rebuilding a tarnished reputation after a breach will require effort... and is always much more difficult than creating it in the first place, because breaches result in headlines that are free, interesting, popular media, while fixes and cleanup result in little beyond whitepapers, which are costly and unpopular media.”

This dramatic security breach highlights the critical - but often underestimated - role that quality software testing plays in the day-to-day running of many businesses. Mistakes can be very costly indeed and can even put the future of some businesses in jeopardy.

If you would like to learn more about how high-calibre software testing could help to preserve the reputation of your company (and your own reputation too!) then please call us anytime.

To read Jack’s original article, click here; to learn more about Jack Danahy’s insights into security, visit Jack's bio.

Friday, 21 August 2009

Agile estimating - A practical quick start

Agile estimating techniques described by QSTC are based on the Wideband Delphi forecasting method, a refinement of the Delphi method developed by the RAND Corporation during the Cold War to forecast the impact of technology on warfare. Existing scientific laws didn't really work too well, but there was a great deal of experience and expert opinion around. The challenge was to aggregate all of this expertise into a single forecast. A bit like trying to estimate how long the testing of software will take, I guess (lol).

The technique uses a facilitator to gather and consolidate information and reach a broad consensus on the estimate. After all, a guess by the experts is better than a guess by the project manager (sorry, PMs out there, but you do tend to be a little optimistic at times!).

OK, how do we do it? You can follow a more detailed process via the links at the bottom of the page, but let's fast-track it.

First we need to break down the system under test into manageable chunks; a good starting point is to discuss with the developers what the first "build" of the system will contain.

For each chunk, decide and list what will be tested. Consider the GUI, any hidden client-server functions, database access (stored procedures, SQL etc.), infrastructure, performance checks, and stress checks - both user and technical (e.g. incoming interface overload).

Define the major activities for testing, e.g.

Remember that tests have to be designed and written. Data has to be identified and created (why do we always overlook the test data strategy?). Tests have to be executed and logged, and some will be repeated (software does sometimes go wrong). Problems have to be analysed, reported, fixed and retested.
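As a rough illustration, these activities can be rolled up into a per-chunk estimate. A minimal sketch follows - the activity names and person-day figures are made up for the example, not taken from any real project:

```python
# Illustrative roll-up of testing-activity effort for one chunk.
# All names and person-day figures here are invented for the sketch.
activities = {
    "design and write tests":        3.0,  # person-days
    "identify and create test data": 2.0,  # the step we always overlook
    "execute and log tests":         4.0,
    "re-run failed tests":           1.5,  # software does sometimes go wrong
    "analyse, report and retest":    1.0,
}

total = sum(activities.values())
print(f"Chunk estimate: {total} person-days")  # Chunk estimate: 11.5 person-days
```

Summing per-activity figures like this also makes it obvious when a whole activity (usually test data) has been forgotten.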

This forms the background for the experts at the estimating workshop.

Up to five experts are selected for the estimating workshop for their expertise and knowledge (no bag carriers or observers - this is a working session), covering:

• Experience of developing and testing software using the chosen technology
• Experience of testing systems
• Experienced business users who know how the system will be used
• Experienced service delivery (the guys who have to run and maintain the system when it is live)
• Project Management

The facilitator describes a chunk of the system to be tested and the attendees vote on the effort required (person-hours, person-days etc.). Use the Fibonacci sequence to scale the effort, with hold-up cards or Post-it notes:

1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144

If any estimates are significantly different (someone votes 1 day when everyone else thinks it is 8 days), those experts are asked what led them to that conclusion. After discussion, another vote is taken; repeat until consensus is reached.

Record any assumptions that the team have made when estimating.
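The voting round can be sketched in a few lines of Python. The card deck is the Fibonacci scale above; the helper names and the outlier rule (flag anyone more than two card positions from the median) are my own assumptions for the sketch, not part of Wideband Delphi proper:

```python
# Sketch of one Wideband Delphi voting round using the Fibonacci card deck.
# The outlier threshold (two card positions from the median) is an assumption.
CARDS = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

def snap_to_card(raw):
    """Round a raw effort estimate to the nearest card on the deck."""
    return min(CARDS, key=lambda c: abs(c - raw))

def outliers(votes, spread=2):
    """Return votes more than `spread` card positions from the median card."""
    ranked = sorted(votes, key=CARDS.index)
    median_card = ranked[len(ranked) // 2]
    mid = CARDS.index(median_card)
    return [v for v in votes if abs(CARDS.index(v) - mid) > spread]

votes = [3, 5, 5, 8, 21]   # person-days from five experts
print(outliers(votes))      # the flagged voters explain, then everyone re-votes
```

The facilitator would simply loop: collect votes, ask the flagged experts to explain, re-vote, and stop once `outliers` returns an empty list and the team converges on one card.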

Thank you for reading, and I hope you found it useful. If you would like further information please feel free to comment below, or visit the following pages:

It is also worth mentioning that I will be presenting a number of software testing sessions for Intellect (London), during October and November 2009, and if you would like further information please click here.

Friday, 7 August 2009

Are you following nFocus_ltd on Twitter?

Just a short blog entry today to mention that we are using Twitter to point out useful articles, websites and so on throughout each day. For more info on this please visit

It is also worth mentioning that I am currently working on a number of helpful blog articles, so please feel free to check back soon.

In the meantime, happy testing.