I am a junior programmer at a young, small company made up of more or less all "junior programmers"... We are trying to put together a more comprehensive testing plan for our projects (primarily small, web-based applications). I was wondering if anyone could offer me some resources (electronic or otherwise) or personal advice on testing methodologies, practices, and procedures to help us put a comprehensive testing plan together. If you want to put this in the context of XP, great (I tried to argue for modular testing but pretty much got shot down), but any other advice would also be greatly appreciated. Thanks in advance for any and all help.
Paul J Summers, Jr.<br />Advanced Media Productions
In my opinion the best testing strategy can simply be summed up as "test as early as possible and as often as possible". Any test plan that leaves testing until late in the development/deployment cycle is asking for trouble; in the worst case, a test failure found that late could require a complete rewrite to fix.

As for more detailed advice: I suggest a design and testing strategy that emphasises a component (or "Lego") model, where as much of the testing as possible is done on small, understandable components, early in the process, so that they can be relied on to do their job in the larger application. Build the application in layers like this, and any test failure can easily be localized to the layer currently being developed.

Web user interfaces are notoriously hard to test - competing browsers don't even follow the same specification, so how can a test framework sensibly emulate them? The trick is to test as much of the application as possible without ever involving a web server or browser. This fits well with the "Lego" design style described above.
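To make the "Lego" idea concrete, here is a minimal sketch of an inner-layer component from a hypothetical web app (the class and method names are invented for illustration). It knows nothing about HTTP or browsers, so it can be tested directly; with JUnit, the checks in main() would become assert calls in a TestCase.

```java
// Hypothetical validation component from a web app's inner layer.
// It has no dependency on servlets or browsers, so it can be tested
// in complete isolation from the web tier.
public class OrderValidator {

    // Returns true when a quantity string from a web form is a
    // positive integer; rejects nulls, blanks, and non-numeric input.
    public boolean isValidQuantity(String raw) {
        if (raw == null || raw.trim().length() == 0) {
            return false;
        }
        try {
            return Integer.parseInt(raw.trim()) > 0;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    // A stand-alone test driver; no web server or browser involved.
    public static void main(String[] args) {
        OrderValidator v = new OrderValidator();
        check(v.isValidQuantity("3"), "plain positive number accepted");
        check(v.isValidQuantity(" 42 "), "whitespace is trimmed");
        check(!v.isValidQuantity("0"), "zero rejected");
        check(!v.isValidQuantity("abc"), "non-numeric rejected");
        check(!v.isValidQuantity(null), "null rejected");
        System.out.println("all checks passed");
    }

    private static void check(boolean ok, String what) {
        if (!ok) {
            throw new RuntimeException("FAILED: " + what);
        }
    }
}
```

Because the servlet layer would only ever call isValidQuantity(), failures found here are localized to this component, not to the browser or the server.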
Some sites you might want to visit to get an idea of how to go about "testing as early and as often as possible": http://www.junit.org and http://www.thoughtworks.com (follow the links to Communities & Learning - Library).

Getting into early testing is often easier said than done. Many of the organizations that I have worked with, past and present, still relegate testing to a later phase of development. Many organizations, large ones in particular, will find it hard to change the way they do testing, because new strategies such as "test-first programming" and "continuous integration" run contrary to how things are "normally" done. A smaller team may not have to deal with many of the cultural issues that come up when trying to introduce test-first programming, but you will still have to adjust a bit to the strategy, since this is not what most of us learned in school.

You might also want to look into studying and incorporating the practice of refactoring in your development. See http://www.refactoring.com

Junilu
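As a rough sketch of the test-first rhythm (the class, method, and business rule below are all invented for illustration): imagine the checks in main() were written first, as a statement of the required behavior, and DiscountCalculator was then filled in with just enough code to make them pass. Amounts are kept in integer cents to avoid floating-point comparison problems.

```java
// Test-first sketch: the expectations in main() came first; the
// discountedCents() implementation was written afterwards, with
// just enough logic to satisfy them.
public class DiscountCalculator {

    // Business rule (invented for this example): orders of 100.00
    // (10000 cents) or more get 10% off; smaller orders pay full price.
    public long discountedCents(long cents) {
        if (cents >= 10000) {
            return cents - cents / 10;
        }
        return cents;
    }

    public static void main(String[] args) {
        DiscountCalculator calc = new DiscountCalculator();
        // These checks define the behavior before the code existed.
        if (calc.discountedCents(5000) != 5000) {
            throw new RuntimeException("small order should be unchanged");
        }
        if (calc.discountedCents(10000) != 9000) {
            throw new RuntimeException("100.00 should get 10% off");
        }
        if (calc.discountedCents(20000) != 18000) {
            throw new RuntimeException("200.00 should get 10% off");
        }
        System.out.println("ok");
    }
}
```

In a real project these checks would live in a JUnit TestCase and be run on every build, which is what makes continuous integration practical.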
I wholeheartedly concur with what Frank and Junilu said about testing early. I would try to consider testing an integral part of your methodology, not something additional to be added at the end. For example, if your work is use case driven, your use cases make excellent test plans (although use cases by themselves shouldn't be your only test plan).

Figure out what you are going to test, and how. I could easily spend a 10:1 ratio of time writing tests versus code, but that is probably excessive for most projects (exceptions would be NASA, medical devices, etc.). Figure out which pieces you want to test, and to what degree, and find appropriate tools. (What tools are available may affect how much time it takes to test.) Also, there are automated web testing tools from all the major vendors, such as Mercury, Silk, and Rational. They may or may not make sense for your project. Because everyone is young, and because these tools are expensive, I would suggest that if you do get them, you set up a shared machine and make sure everybody uses it, to gain testing experience.

--Mark