July 23, 2014 - Marco Behler
[This is actually a post from our mailing list from a month ago. If you are not yet a member, devour it and sign up at the bottom! :) ]
Today we'll focus on testing and changing requirements. When your feature requirements (seem to) change a lot and you are under pressure to deliver on time, writing tests in the first place, or maintaining them, can feel like wasted effort - after all, the next requirement change might turn them into garbage.
We have had this discussion many times in the past, especially at smaller startups where the general climate is "just get the fucking shit done asap", which more often than not includes "no time for testing".
Where this discussion usually goes wrong is that it focuses on the end of the development chain. Let's explain with a cooking analogy: How do you properly cook two pieces of meat of completely different sizes in one pan, so that both can be served at the very same time and temperature?
Well, you could resort to super-nerdy techniques to give that 2kg piece of meat a special sear while the 100g piece is already getting cold. But what most chefs do is very straightforward: they make sure the two pieces of meat do not differ (much) in size before putting them in the frying pan.
Programmers often take changing requirements as a given and then argue about how long it takes to write or maintain a test. But the problem is not the tests; the problem is that someone wants a helicopter one day and an airplane the next (erm, we're just pivoting... yeah right).
The only solution to this issue is to get away from what we like to do - coding - and get into an active discussion with our product owners/business analysts/managers. Depending on your company culture, this could well turn into a long, slow uphill battle. But this problem simply cannot be fixed by "not testing" or by any sort of code.
But hey, even if I get thumbs up from everyone, writing those tests is still a lot of work and I need to finish up my stuff!
"No time for testing" usually means "we simply don't know how to test in an efficient manner". Of course it is super problematic to test that deeply buried JSP of yours that mixes presentation, business logic and all kinds of other stuff. Hell, you're using uber-framework-x and there seems to be no simple way to boot up just specific parts of it.
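The usual way out is not to test the JSP at all, but to pull the business logic out of it into a plain class that runs without booting any framework. A minimal sketch - the names (PriceCalculator, applyDiscount) and the discount rule are hypothetical, stand-ins for whatever is buried in your own pages:

```java
// A minimal sketch: business logic pulled out of a JSP into a plain
// class. PriceCalculator and the 10%-off-over-100 rule are made up -
// substitute whatever logic is tangled into your own pages.
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PriceCalculator {

    // The rule that used to live inline in the JSP:
    // orders of 100 or more get 10% off.
    public BigDecimal applyDiscount(BigDecimal orderTotal) {
        if (orderTotal.compareTo(new BigDecimal("100")) >= 0) {
            return orderTotal.multiply(new BigDecimal("0.90"))
                             .setScale(2, RoundingMode.HALF_UP);
        }
        return orderTotal.setScale(2, RoundingMode.HALF_UP);
    }

    // A plain main() doubles as a quick check: no framework,
    // no web-server, runs in milliseconds.
    public static void main(String[] args) {
        PriceCalculator calc = new PriceCalculator();
        if (!calc.applyDiscount(new BigDecimal("200")).equals(new BigDecimal("180.00"))) {
            throw new AssertionError("discount not applied");
        }
        if (!calc.applyDiscount(new BigDecimal("50")).equals(new BigDecimal("50.00"))) {
            throw new AssertionError("discount applied too early");
        }
        System.out.println("all checks passed");
    }
}
```

Once the logic sits in a class like this, testing it no longer has anything to do with booting uber-framework-x - the JSP shrinks to pure presentation.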
But how do you then usually test your software? Do you boot up your web-server, grab a coffee, wait a minute, and then manually click through all your workflows until you reach the one workflow you are actually trying to test? And what happens if something changes? How often do you do this every day?
We can guarantee you that whatever you are doing manually at the moment is easily replaceable with some testing code. And it surely won't take longer - you just have to know how to do it.
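To make that concrete, here is a sketch of what the "boot, click, wait" loop looks like as code, using only the JDK's built-in HttpServer and HttpClient (Java 11+). The /checkout endpoint and the tiny in-process server are stand-ins for your real application - the point is the shape of the check, not the endpoint itself:

```java
// A sketch: one automated HTTP-level check instead of a minute of
// manual clicking. The in-process server and /checkout endpoint are
// hypothetical stand-ins for the application you actually test.
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class CheckoutSmokeTest {

    public static void main(String[] args) throws Exception {
        // Stand-in for the application under test, bound to a free port.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/checkout", exchange -> {
            byte[] body = "order accepted".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        int port = server.getAddress().getPort();

        // The "test": one request against the workflow you care about.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://localhost:" + port + "/checkout")).build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        if (response.statusCode() != 200 || !response.body().contains("accepted")) {
            throw new AssertionError("checkout workflow broken: " + response.statusCode());
        }
        System.out.println("checkout workflow OK");
        server.stop(0);
    }
}
```

Run this after every change and the "click through everything again" ritual disappears - the computer does the clicking in a fraction of a second.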
There's unfortunately very little literature out there on testing real-life code. We're trying to change that. Stay tuned for our new ebook on practical testing examples. As a member of this mailing list you'll get notified first when it launches, and also receive the odd free preview chapter!
As always though, if there's anything you'd like to hear about in more depth, please feel free to drop us an email - we try to read them all and do our best to respond.
Until next time!