Occasionally I am asked by IT managers to prove, with numbers, that agile engineering techniques like Test Driven Development (TDD) work.
Unfortunately that’s not possible. Academics have been trying unsuccessfully for years, and while I respect their efforts, I would be skeptical of any results proving or disproving that agile methods like TDD work.
The reason I would be skeptical is that it’s not possible to apply the scientific method to practices like TDD.
I remember being taught in grade 7 science that when doing a scientific experiment you:
- create a hypothesis
- pick a manipulated variable
- conduct the experiment
- measure the result against your hypothesis
For example:
- Hypothesis: TDD is better than non-TDD
- Manipulated variable: to TDD or not to TDD
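For readers unfamiliar with the practice being debated, TDD itself is simple to illustrate: write a failing test first, then just enough code to make it pass. A minimal sketch in Python using the standard `unittest` module (the `ShoppingCart` example is hypothetical, not from the original discussion):

```python
import unittest


# Step 1 ("red"): write the tests first, describing the behavior
# you want before the implementation exists.
class TestShoppingCart(unittest.TestCase):
    def test_total_of_empty_cart_is_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add(3)
        cart.add(4)
        self.assertEqual(cart.total(), 7)


# Step 2 ("green"): write just enough code to make the tests pass,
# then refactor with the tests as a safety net.
class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)


if __name__ == "__main__":
    unittest.main()
```

The question the managers are asking is whether this red–green–refactor loop measurably beats writing the tests afterward, and that is exactly what the thought experiment above shows we can’t isolate.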
This type of experiment can’t be done on software projects because the variables can’t be kept constant on both sides.
How many projects have you been on where you have:
- the same team
- with the same requirements
- where you build something the same way
- and you don’t leverage any learnings or mistakes from your previous experience
Dealing with questions like proving TDD works can be frustrating. On one hand you believe it works, yet on the other you can’t prove it (at least not the way some managers would like you to).
So stop trying.
Accept that you will not be able to answer the question the way they would like it answered.
Instead, try looking at the question a different way.
Why are they asking the question in the first place?
Is the customer unhappy?
Did the last push to production not go smoothly?
Are projects costing more and taking longer to deliver?
Is there a concern teams are spending too much time testing?
How are we defining success?
You can’t discount the manager’s question. It’s a good one, and they may very well be onto something in asking it.
Just understand that you can’t always answer every question the way the questioner would like. Especially when it comes to quantifying knowledge work like software development.