
Linking Tests and Requirements

 
Paul Croarkin
Ranch Hand
Posts: 106
Cedric / Hani,

Any suggestions on how to track tests with requirements? We have a project with over 10K tests. If a requirement changes I'd like to be able to easily find any tests that may have to change. I'd also like to be able to look at the test code and know which requirement is being tested.

I've tried making up annotations that list the requirement number, and also cutting and pasting part of the requirement's text into comments on the test. This gets to be time-consuming and depends on all the developers doing the same.

Do you know if anyone has tried to solve this?
 
Hani Suleiman
author
Greenhorn
Posts: 22
It depends on how far you want to go to automate it.

Here's an example of an implementation I've come across (which I think goes a bit too far, but works very well if your requirements process is very formalised)...

Every test has to have a tracking number (in this case, a JIRA issue) as one of its groups. The svn repo has a pre-commit script that enforces this: tests checked in without an issue number were rejected. The IDE used (IDEA) also had a JIRA plugin that hyperlinked the issue numbers, so it was easy to get to the original document with a single click.

This is a bit extreme, but it's actually pretty easy to require this sort of thing without enforcing it via pre-commit scripts. Assuming you have any sort of basic code review process, people would just be nagged to add the tag every time they commit a test without it, until it becomes part of the team culture.
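The tagging convention described above can be sketched in plain Java. Since TestNG isn't assumed to be on the classpath here, the @Test annotation below is a local stand-in mirroring the "groups" attribute of TestNG's org.testng.annotations.Test, and PROJ-1234 is a hypothetical issue key; the allTestsTagged check plays the role of the pre-commit script:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

public class IssueTaggedTests {
    // Local stand-in for org.testng.annotations.Test's "groups" attribute,
    // so this sketch compiles without TestNG on the classpath.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Test {
        String[] groups() default {};
    }

    // The JIRA issue key doubles as a TestNG group, so any test can be
    // traced back to the requirement it covers.
    @Test(groups = {"PROJ-1234"})
    public void withdrawalRejectedWhenBalanceTooLow() {
        // ... test body ...
    }

    // A check like the pre-commit script described above: reject any test
    // method whose groups contain no issue-style key (e.g. "PROJ-1234").
    static boolean allTestsTagged(Class<?> testClass) {
        for (Method m : testClass.getDeclaredMethods()) {
            Test t = m.getAnnotation(Test.class);
            if (t == null) continue; // not a test method
            boolean tagged = false;
            for (String g : t.groups()) {
                if (g.matches("[A-Z]+-\\d+")) tagged = true;
            }
            if (!tagged) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(allTestsTagged(IssueTaggedTests.class)); // prints "true"
    }
}
```

With real TestNG, the same group names also let you run all tests for one requirement in a single suite.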
 
Ilja Preuss
author
Sheriff
Posts: 14112
In my experience, there typically isn't a mapping between unit tests and requirements direct enough that it would make much sense to identify a test with a requirement. What I'd care more about is what code (including tests) was changed to support a requirement. Commit comments in the version control system work well for this, even more so, I suppose, if the comments are searchable.

For high level (acceptance) tests, I prefer them to *be* the authoritative expression of the requirements - both executable and understandable by the customer at the same time. For example by using a tool like Fit/FitNesse.
 
Daniel Trebbien
Ranch Hand
Posts: 62
10,000 tests? Wow, that's a lot.

Where do you put them? To put it another way, where is a good place to put tests under the assumption that one is using TestNG?

TestNG makes it easy for any method to be a test, but if there are a lot of tests, then there are probably a few that test a particular class. Supposing that there are three, does anyone recommend placing them (as static methods) in the class being tested?

Or do people create a class, say TESTS, in every package, containing all tests for that package (as I do)?

Or, instead of placing test code in the standard src/ directory, do people create a different directory, maybe tests_src/, containing all test code?

What works well?

Daniel Trebbien
[ December 20, 2007: Message edited by: Daniel Trebbien ]
 
Ilja Preuss
author
Sheriff
Posts: 14112
Well, I'm using JUnit, but I guess the principle holds when using TestNG for unit testing:

We have one or more test classes per production class. So if we have, say, a class Stack, we could have a single test class named StackTest, or several test classes Stack_EmptyTest, Stack_FilledTest (one test class per test fixture).

Currently our test classes reside in the same source folder as the production class, in a sub package called "test". So the test for de.disy.collections.Stack would be de.disy.collections.test.StackTest.

We are thinking about moving to a convention where the tests reside in the *same* package as the production class, but in a separate source folder.
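The one-test-class-per-fixture convention above might look like the following sketch. Stack here is a hypothetical class under test (backed by java.util.ArrayDeque), and the assertions are plain Java rather than JUnit or TestNG so the example stands alone:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.NoSuchElementException;

// Hypothetical production class; in the layout described above it would
// live in de.disy.collections, with tests in de.disy.collections.test.
class Stack {
    private final Deque<Integer> items = new ArrayDeque<>();
    void push(int i) { items.push(i); }
    int pop() { return items.pop(); }
    boolean isEmpty() { return items.isEmpty(); }
}

// One test class per fixture: this one covers the empty-stack state.
class Stack_EmptyTest {
    static void popOnEmptyThrows() {
        Stack stack = new Stack(); // fixture: a freshly created, empty stack
        try {
            stack.pop();
            throw new AssertionError("expected pop() on empty stack to throw");
        } catch (NoSuchElementException expected) {
            // pass
        }
    }
}

// ...and this one covers a stack that already holds elements.
class Stack_FilledTest {
    static void popReturnsLastPushed() {
        Stack stack = new Stack(); // fixture: a stack with two elements
        stack.push(1);
        stack.push(2);
        if (stack.pop() != 2) throw new AssertionError("expected LIFO order");
    }
}

public class FixturePerClassDemo {
    public static void main(String[] args) {
        Stack_EmptyTest.popOnEmptyThrows();
        Stack_FilledTest.popReturnsLastPushed();
        System.out.println("all tests passed");
    }
}
```

Splitting by fixture keeps each test class's setup uniform, at the cost of more classes per production class.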
 
Hani Suleiman
author
Greenhorn
Posts: 22
I'd strongly recommend having test classes in the same package, but different source tree.

The huge benefit of doing so is that you can use package-protected access, thereby making your code testable without breaking encapsulation.
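A minimal sketch of that idea, with hypothetical Account and AccountTest classes (the src/main and src/test paths in the comments are an assumed Maven-style layout, not something from the thread):

```java
// In src/main/java/com/example/Account.java (hypothetical layout):
// balance is package-private, so it is only visible within the package,
// not to the rest of the application.
class Account {
    int balance; // package-private: no public getter needed just for tests

    void deposit(int amount) { balance += amount; }
}

// In src/test/java/com/example/AccountTest.java: same package as Account,
// but in a separate source tree, so the test compiles against the class
// and can read the package-private field directly.
public class AccountTest {
    public static void main(String[] args) {
        Account account = new Account();
        account.deposit(50);
        // Direct access to the package-private field from the test:
        if (account.balance != 50) throw new AssertionError("deposit not recorded");
        System.out.println("balance = " + account.balance); // prints "balance = 50"
    }
}
```

Because only the source trees differ, the shipped jar can simply exclude the test tree while the compiler still treats both as one package.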
 
Daniel Trebbien
Ranch Hand
Posts: 62
Originally posted by Hani Suleiman:
I'd strongly recommend having test classes in the same package, but different source tree.

The huge benefit of doing so is that you can use package-protected access, thereby making your code testable without breaking encapsulation.


Right. I'm constantly accessing package-private variables and classes just so that I can see what's going on if and when one of the tests fails.

Thanks for the tip.
 