Andy Pols Geek, Entrepreneur

Some random thoughts...

Fit Lesson 2

Following on from Fit Lesson 1.

Organise the Fit Tests around your iteration's stories

I like to take small baby steps when I write software. It makes life easy. I don't have to merge very much code, I get regular closure on what I'm doing, I don't have to carry too much around in my head, and I can discard mistakes without losing too much work.

Automated acceptance tests, such as Fit, can throw a spanner in the works.

I like to explore the problem by writing the Fit test with a domain expert before I start writing any code. I treat the acceptance test(s) as a definition of success. Before I start work on a new feature, I need to understand how I will know when the Customer is happy with what I have done. I flesh out the answer in the Fit test (I often have several Fit tests exploring different aspects of the story).

Great, but Fit tests are coarse-grained in nature. I typically make at least 20 baby steps (source code commits) before a story is complete. I don't want to commit the Fit test before I start writing code, as the Fit test will fail and break the build. Not good. I don't want to wait until the Fit test is complete, as that would lose all the advantages of developing in baby steps.

Some teams use different directories to distinguish between the Fit tests that have been completed and should pass, and those that are currently being worked on. The tests get moved around when they have been completed. I guess that solves the baby steps problem, but I've never liked moving files around to indicate completion and it's still hard to see how the Fit tests relate to the originating Customer Stories.

William Jones has a much nicer solution with his Agilifier project. There is not much documentation at the moment (well, it is open source!), so I will give you a quick overview.

  • You group the Fit tests around the stories they are associated with. All the tests associated with a story are contained in a directory. You place the original story in the story.txt file. You simply add your Fit test files to the directory.
  • You use test suites to distinguish between previously completed tests (if they don't pass, we have broken something, and the build should fail!), the tests associated with the current iteration (these provide a progress bar for the iteration), and the tests an individual (or pair) is currently working on. The suites are simple text files listing the tests to be included in the suite.
  • Agilifier runs the test suites and glues the story and Fit tests together in a nice html report.
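As a sketch of how this might hang together (the directory and file names here are my own invention for illustration, not Agilifier's documented layout):

```
stories/
  transfer-funds/
    story.txt             # the original Customer story
    happy-path.fit.html   # Fit tests exploring aspects of the story
    overdraft.fit.html
suites/
  completed.suite         # must pass - a failure here breaks the build
  current-iteration.suite # the progress bar for this iteration
  in-progress.suite       # what an individual or pair is working on now
```

A test graduates from suite to suite by editing a line in a text file, rather than by moving the test file itself, so the tests stay next to the story they came from.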

Download it and give it a try. You should be able to figure it out from its own acceptance tests!

Visibly out of sync

I spotted this on information aesthetics. It's a disk drive that changes shape to show how in sync a data backup is with its source.

What a great idea.

Think Small

Here is a nice podcast from Jason Fried of 37Signals.

He had an interesting view on co-location. I always thought distance hurts, and having everyone working together in the same location was critical.

Jason argues that if you can't explain a design or a business strategy using IM or Skype, then it's too complicated and you should choose something simpler.

It's a really interesting point of view. I have been mulling it over for a few days now. I wonder what impact people's learning style has on this? I'm very visual; I really like being able to jump up and draw pictures with colleagues around a whiteboard. I wonder if Jason is more of a words person?

Fit Lesson 1

Write the test so it reads like a requirement specification and not a test.

I was trying to explain a FIT test to a new colleague. It soon became clear that it had been written as a test. It didn't convey the intent of what was required. Nor did it communicate why it did what it did. There was a lot of tacit knowledge in my explanation of what was going on. It obviously failed to communicate the requirement. It did, however, test that a particular feature of the system worked!

So, what is the purpose of a FIT test?

Brian Marick makes the distinction between Business Facing tests and Technology Facing tests.

A business-facing test is one you could describe to a business expert in terms that would (or should) interest her. If you were talking on the phone and wanted to describe what questions the test answers, you would use words drawn from the business domain: "If you withdraw more money than you have in your account, does the system automatically extend you a loan for the difference?"

A technology-facing test is one you describe with words drawn from the domain of the programmers: "Different browsers implement Javascript differently, so we test whether our product works with the most important ones." Or, "PersistentUser#delete should not complain if the user record doesn't exist."

The FIT tests should be business facing; they are a communication tool with the added benefit of being executable.

FIT tests should explain the requirements so that people know what the system does. The collection of FIT tests forms an executable requirement specification. They should not really be called tests at all!

Our FIT test was clearly written from a developer's point of view. It was technology facing. Fit is probably the wrong tool for technology-facing tests. I would much rather write these in a more developer-friendly tool such as Java or Ruby.
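To make the distinction concrete, the business rule in Marick's overdraft example can be stated (and checked) almost word for word. Here's a minimal sketch in Python; the function name, signature, and loan policy are my own illustration of the rule, not code from any real system:

```python
def withdraw(balance, amount):
    """Withdraw `amount` from an account holding `balance`.

    If the withdrawal exceeds the balance, the system automatically
    extends a loan for the difference (the rule in Marick's example).
    Returns (new_balance, loan_extended).
    """
    if amount <= balance:
        return balance - amount, 0
    # Not enough money: the balance goes to zero and a loan covers the rest.
    return 0, amount - balance

# A business-facing check reads like the phone conversation:
# "If you withdraw more money than you have in your account,
#  does the system extend you a loan for the difference?"
assert withdraw(100, 150) == (0, 50)   # 50 short, so a loan of 50
assert withdraw(100, 80) == (20, 0)    # covered, no loan
```

Notice the checks talk about balances and loans, not fixtures or database rows; that is the vocabulary a business-facing Fit table should use too.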

Getting to know your customer

I've uploaded the slides from Steve Freeman's and my talk at XPDay.

http://www.pols.co.uk/papers/CustomerGamesV2.pdf

I really enjoyed the session. Here are some of the product boxes people created:

I particularly love the "official" Kent Beck endorsement!

Without any prompting everyone asked for free beer!