TDD (Test-Driven Development) gets thrown around a lot in technical and management circles, together with “CI”, “scrum”, and other agile-ish buzzwords. Since I’ve got an internet connection and the ability to throw my opinion into the wild, I think I’ll do just that.
TDD is a good practice; however, it gives the best results when applied to certain types of projects, or in certain phases of a project.
TDD works best when applied to projects that:
- have a well-defined code base, requirements and design
- involve extending already available features
- use a development process that allows plenty of time for design and architecture before the actual implementation
For these kinds of projects, the blueprint is already laid down, and tests can be written before the actual implementation starts. The developer knows what the system should be doing and only has to extend the existing interfaces while keeping them consistent. There is not much entropy in this type of development process.
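To make that concrete, here is a minimal sketch of test-first development against a stable interface. All the names (`ShoppingCart`, `apply_discount`) are invented for illustration; the point is that the existing interface is firm enough that the test for the new feature can be written before its implementation:

```python
import unittest

# Hypothetical example: the add_item/total interface is the existing,
# stable code base; apply_discount is the new feature being added.
# In TDD the test below is written first and fails ("red"); only then
# is apply_discount filled in until the test passes ("green").
class ShoppingCart:
    def __init__(self):
        self._items = []
        self._discount = 0.0

    def add_item(self, name, price):
        self._items.append((name, price))

    def total(self):
        subtotal = sum(price for _, price in self._items)
        return subtotal * (1 - self._discount)

    def apply_discount(self, percent):
        # Deliberately unimplemented: the failing test drives this code.
        raise NotImplementedError

class TestCartDiscount(unittest.TestCase):
    def test_ten_percent_discount(self):
        cart = ShoppingCart()
        cart.add_item("book", 20.0)
        cart.add_item("pen", 5.0)
        cart.apply_discount(10)
        self.assertAlmostEqual(cart.total(), 22.5)

if __name__ == "__main__":
    unittest.main()
```

Filling in `apply_discount` (one line: `self._discount = percent / 100`) turns the test green without touching the test itself, which is exactly the situation where TDD shines.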
This situation is common in large companies in industries like automotive and aviation, and when working on legacy projects (the ones that don’t need rewriting).
TDD works against you when applied to projects that:
- don’t have clear requirements
- represent a new product built from scratch, where time-to-market is critical
- are going to switch technologies in order to get the desired functionality and features
These are the kinds of projects typical of the early start-up phase: the point at which you don’t know exactly what you’re building, and you refactor three times a day for the first weeks or months.
In these cases, writing tests against unstable interfaces and designs consumes a lot of effort, and most of those tests will end up deleted.
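For illustration, a sketch of that failure mode (again, all names invented): the test below is reasonable code, but it is welded to a week-one design. If the product pivots to, say, an event-sourced account model, `UserStore` and every test written against it get deleted together:

```python
import unittest

# Hypothetical early design: users live in a dict-backed store.
# A start-up-phase refactor that replaces this storage model deletes
# the class and this test wholesale, along with the effort spent on it.
class UserStore:
    def __init__(self):
        self._users = {}

    def save(self, user_id, profile):
        self._users[user_id] = profile

    def load(self, user_id):
        return self._users[user_id]

class TestUserStore(unittest.TestCase):
    def test_round_trip(self):
        store = UserStore()
        store.save(42, {"name": "Ada"})
        self.assertEqual(store.load(42), {"name": "Ada"})

if __name__ == "__main__":
    unittest.main()
```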
As a conclusion: I strongly suggest using test-driven development only when you have a clear picture of what you have to do; otherwise you will end up spending more than 50% of your development time on tests that will never see the end product.
But, of course, your mileage may vary.