The easy way to "get it" is to think of those 3:00 AM phone calls:
"D--- I should have written a test case."
I can regression test one application in about 8 minutes, and know
immediately if it's broken. Many of my colleagues depend on hours of
manual testing, and still miss things that they broke in fixing
something else.
The rule of thumb for retrofitting test cases into a project is: if
it's a problem, write the test case first, prove it fails, correct the
code, and prove it passes.
One thing that has been mentioned only indirectly above, but that I
find invaluable when doing Test Assisted Development, is that writing
tests forced me to strongly decouple my code.
Suddenly, not only could I easily test things, but I could change
application architecture, change implementations of methods, and all
the other beautiful things that go with decoupling.
Testing drove the design in a much better direction.
That is one of the most powerful ways that proactive testing improves quality.
There are only so many ways to say this: if each function in your
program has two different kinds of callers, the functions are forced
to decouple from each other, and we no longer write deep and tangled
associations. The test cases provide that second kind of caller.
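A minimal sketch of the "second caller" effect, with hypothetical names (`build_report` and its collaborators are invented for illustration): because the collaborators are passed in rather than reached for, a test case can call the function with stand-ins, and the function cannot grow a tangled dependency on any one caller's environment:

```python
def build_report(fetch_rows, send):
    """Build a summary report and dispatch it.

    fetch_rows and send are injected callables. The production caller
    passes real database and mailer functions; the test case, the
    second kind of caller, passes simple stand-ins.
    """
    rows = fetch_rows()
    body = "\n".join(f"{name}: {count}" for name, count in rows)
    send(body)
    return body

# The test case acting as the second caller: no database, no mailer.
sent = []
report = build_report(
    lambda: [("errors", 3), ("warnings", 7)],  # canned rows
    sent.append,                               # capture instead of email
)
assert report == "errors: 3\nwarnings: 7"
assert sent == [report]
```

Having to satisfy both callers is exactly the pressure that drove the decoupling described above.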