Managing Testing Documentation?

 
Jonas Larsson
Greenhorn
Posts: 13
I work at a small firm that is about to start
several development projects in Java, and I've been
put in charge of testing and the testing process.
It turns out that exactly ZERO effort has been put
into documenting the testing during almost two years
of development on a codebase of hundreds of thousands
of lines of code. Much to the dismay of some of my
colleagues, I've put my foot down and started pushing
for a test-driven development process that, among other
things, features JUnit testing and an approach for
organizing and producing test documentation.
Now to the real issue. I can't find a decent, workable way
to manage and produce a complex hierarchy of test plans
and test cases that can easily be referenced and found from
bug reports (we currently use Bugzilla). One would think
that this problem would be universal, but I haven't even
been able to find a news posting discussing it! There is
plenty on bug/issue tracking but absolutely NOTHING on
linking it to actual test documentation. Which,
when you think about it, is pretty strange...
My current proposal to my boss and my reluctant coworkers
is a system of Word documents that may or may not be
under version control. I've devised a scheme for coding all
document names in a way that makes them easy to find
in a hierarchy purely on the basis of their names.
But what I *really* want is a web-based system that can
be used to work collaboratively on test plans and test results,
can manage fairly complex document hierarchies, and
can also output the documents as, for example, PDF (for the boss types...).
I've been looking at using LaTeX as the document format (too steep a learning curve? No GUI editing? But it can be put into CVS...),
Apache Forrest (seems cool, but the project does not seem very active...),
and a whole bunch of project management applications which do not
seem to address testing at all...
I'm running out of ideas here and will have to start producing Word templates
at a crazy rate this week if nothing happens.
Anyone with any input on this?
Great first post, eh?
 
Lasse Koskela
author
Posts: 11962
Welcome to the ranch, Jonas!
I don't think it's such a bad idea to simply go with Word documents on a shared drive (or something similar) unless you have a requirement (not a nice-to-have) that they can't fulfill.
By the way, what do you mean by "test documentation"?
 
Ilja Preuss
author
Posts: 14112
For unit testing, I don't think it is very desirable to link the tests to bug reports or the like. They are just too fine-grained.
More interesting might be system-level / acceptance tests. For those you might want to take a look at http://fitnesse.org , a web-based acceptance testing and documentation tool.
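To give a rough idea of what a Fit/FitNesse test can look like on the Java side (this sketch is not from the thread - the fixture, column names, and values are all invented), a column fixture is just a class with public fields for the inputs and methods for the expected outputs:

import fit.ColumnFixture;

// A FitNesse wiki page would hold a table along these lines (again, invented):
//
//   |OrderStatusFixture     |
//   |action   |newStatus()  |
//   |submit   |pending      |
//   |cancel   |cancelled    |
//
// Fit sets the public field from each row, then checks the method's return value.
public class OrderStatusFixture extends ColumnFixture {

    public String action;            // input column

    public String newStatus() {      // expected-value column
        // A real fixture would drive the application under test here;
        // this stub just sketches the mapping being checked.
        if ("submit".equals(action)) {
            return "pending";
        }
        if ("cancel".equals(action)) {
            return "cancelled";
        }
        return "unknown";
    }
}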
 
Jonas Larsson
Greenhorn
Posts: 13

Originally posted by Lasse Koskela:
Welcome to the ranch, Jonas!
I don't think it's such a bad idea to simply go with Word documents on a shared drive (or something similar) unless you have a requirement (not a nice-to-have) that they can't fulfill.
By the way, what do you mean by "test documentation"?


Thanks!
I want to be able to separate all the test setup and test case documentation
from the actual bug report in Bugzilla. That means that the context of the
bug (system setup and configuration, plus the actual test case) lives in a test plan
document (with test cases) and the result lives in a results document. Both of these
documents are stored on a shared drive for the bug fixer to refer to when
trying to close the bug. This means that minimal information goes into
Bugzilla, and it keeps everyone on their toes when using my templates for
describing bugs in a uniform way. That hopefully means fewer crappy
bug descriptions in Bugzilla due to laziness...
The Word route feels like a stopgap solution. I WOULD like to do something
more automated and searchable...
[ August 26, 2003: Message edited by: Jonas Larsson ]
 
Jonas Larsson
Greenhorn
Posts: 13

Originally posted by Ilja Preuss:
For unit testing, I don't think it is very desirable to link the tests to bug reports or the like. They are just too fine-grained.
More interesting might be system-level / acceptance tests. For those you might want to take a look at http://fitnesse.org , a web-based acceptance testing and documentation tool.


This is not related to JUnit testing. I'm hoping that properly Javadoc-ing the test classes should be enough documentation for those.
This is for functional testing of, for example, GUI functionality in the form
of "Click submit in order ticket -> status should change to pending". And if it does not, a bug is reported and the results document is updated with
the bug number and the result.
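For the JUnit side of it, here is a minimal sketch of the "Javadoc as the only test documentation" idea (not from the thread - the class under test, the test plan name, and the bug number are all made up):

import junit.framework.TestCase;

/**
 * Verifies order ticket submission.
 * Test plan: TP-OrderTicket-01 (shared drive); related bug: Bugzilla #1234.
 * (Both references are hypothetical examples of the kind of pointers the
 * Javadoc could carry.)
 */
public class OrderTicketSubmitTest extends TestCase {

    /** Tiny stand-in for the real order ticket class, just so the sketch compiles. */
    static class OrderTicket {
        private String status = "new";
        void submit() { status = "pending"; }
        String getStatus() { return status; }
    }

    /** Submitting an order ticket should change its status to "pending". */
    public void testSubmitChangesStatusToPending() {
        OrderTicket ticket = new OrderTicket();
        ticket.submit();
        assertEquals("pending", ticket.getStatus());
    }
}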
I've actually looked at FitNesse, and it does not seem to address testing
*inside* the development process. Acceptance testing comes more towards the
end of the process, right? When customers/users are involved.
Besides... we actually have no requirements document to relate to when
doing acceptance testing. There used to be one, but it is so hopelessly
out of date that it is useless. We have to rely on ad-hoc testing and
try to guess or discuss every test case's expected outcome.
The strange thing is that when all this is done, we should be able to
extract a requirements document that describes the current state
of the product. Isn't that backwards to the extreme, eh?
 
Jonas Larsson
Greenhorn
Posts: 13
It seems I've been searching for the wrong thing! I'm currently looking
inot Open Source CMS's (Content Managment Systems) that may be the solution
to many of my problems.
I'm down to the Zope-based Plone (plone.org) and what seems to be a great
system, typo3 (www.typo3.com).
The thing about typo3 is that it features a RTE (Rich Text Editor)
that may fit my bill for a simple and usable way to access and edit test-
documents. It supports advanced imagehandling and tables and a custom
template script language makes templates flexible and nice.
I'm going to make a skunkworks project out of this and a colleague and
I are going to do a test install on our not-very-official CS-Linuxbox
in our serverroom
I'll get back to you on this.
 
Mark Herschberg
Author
Posts: 6055
How about using a wiki?
I got a yellow flag raised when I read your initial posting:

Originally posted by Jonas Larsson:
It turns out that exactly ZERO effort has been put
into documenting the testing during almost two years
of development on a codebase of hundreds of thousands
of lines of code. Much to the dismay of some of my
colleagues, I've put my foot down and started pushing
for a test-driven development process that, among other
things, features JUnit testing and an approach for
organizing and producing test documentation.


I admire your determination to do it The Right Way(TM), but remember that you can only lead a horse to water; you can't make him drink.
You can't go from 0 to 100 overnight. Start out simple. You need to create a basic framework for testing (if one doesn't already exist) and for test documentation. Then teach them some basics and let them go for a month, documenting going forward (not retro-documenting). After a month, review the process and fix the problems that have come up. Give it another month. If after two months things are going smoothly, then start a small project of retro-documenting into the existing, working doc system. This could mean taking a few developers to work on it full time, or having all developers devote a dedicated few hours per week to the task.
The key is that this is not simply another activity to check off on the project plan. This is a cultural change. In fact, two changes (documenting tests and test-first development). Go slow, be realistic, and constantly re-evaluate.
--Mark
 
Ilja Preuss
author
Posts: 14112

Originally posted by Jonas Larsson:
This is not related to JUnit testing. I'm hoping that properly Javadoc-ing the test classes should be enough documentation for those.


Well, in my experience, JUnit tests can be so simple that most often they don't need any documentation at all - they *are* documentation...


This is for functional testing of, for example, GUI functionality in the form
of "Click submit in order ticket -> status should change to pending".


OK.

And if it does not, a bug is reported and the results document is updated with the bug number and the result.


What exactly do you mean by "a bug is reported"?

I've actually looked at FitNesse, and it does not seem to address testing *inside* the development process. Acceptance testing comes more towards the end of the process, right? When customers/users are involved.


What makes you think so?
FitNesse has its roots in an XP environment, where the Customer is involved during the whole process (one of his responsibilities is to define the acceptance tests, which get run at the end of each one-to-two-week iteration).
If you can't do that, I don't see any problem with someone else (product manager, QA, or even the developers) writing the tests using FitNesse, though.

Besides... we actually have no requirements document to relate to when
doing acceptance testing. There used to be one, but it is so hopelessly
out of date that it is useless. We have to rely on ad-hoc testing and
try to guess or discuss every test case's expected outcome.


As long as you can discuss the expected outcome with someone who actually *knows* what he is talking about, that isn't the worst situation.


The strange thing is that when all this is done, we should be able to
extract a requirements document that describes the current state
of the product. Isn't that backwards to the extreme, eh?


Does it work for you? How much guessing do you do, and how often are you wrong (and how do you know about it)?
 
Mark Herschberg
Author
Posts: 6055
I missed this earlier...

Originally posted by Jonas Larsson:

Besides... we actually have no requirements document to relate to when
doing acceptance testing. There used to be one, but it is so hopelessly
out of date that it is useless. We have to rely on ad-hoc testing and
try to guess or discuss every test case's expected outcome.
The strange thing is that when all this is done, we should be able to
extract a requirements document that describes the current state
of the product. Isn't that backwards to the extreme, eh?


AAAAAGGGGGHHHHHH!!!
OK, that's Very Bad(TM). Now, I'm not an idealist. Requirements change, and the requirements doc very often does get out of date. However, this is not good.
Ilja responded

Originally posted by Ilja Preuss:

As long as you can discuss the expected outcome with someone who actually *knows* what he is talking about, that isn't the worst situation.


If this is a small team, it can work. The problem is that that "someone" may not be unique. One of the biggest problems with requirements is that they are underspecified--everyone thinks they agree on the behavior, but it's not explicitly stated. What happens is that developers each "fill in the blanks" their own way, and the results may turn out to be very inconsistent. Given no requirements, your project has a very high risk of running into this problem.
Now, as you point out, you're going to define the requirements based on your test cases. Normally when we create requirements, we gather everyone together and spend a lot of time discussing and prioritizing behavior. I suspect what will happen here is that you will add tests "as you think of them" and will stumble across conflicting behavior; e.g., in order to make developer A's test work, developer B's test has to fail. Now who has precedence? Instead of resolving this by having all parties involved consider the options, it is more likely that the handful of people involved will make an arbitrary decision (arbitrary meaning based on their limited information about the project). This is not good.
--Mark
 
Ilja Preuss
author
Posts: 14112

Originally posted by Mark Herschberg:
If this is a small team, it can work.


I think another critical factor is feedback. You need to know as early as possible if you are on track. You can do this, for example, by frequently demonstrating a (partially) working system to the stakeholders/customer.
 
Jonas Larsson
Greenhorn
Posts: 13

Originally posted by Mark Herschberg:
How about using a wiki?


Can you import images (screen dumps) and make
tables in an easy way in a wiki solution?
That is the basic functionality I want. And
pasting raw HTML into a web form does not count.

And I also want to be able to output the test plans
in alternative formats for distribution and printing,
like PDF or some kind of Office format...
The solution I am looking at (Typo3, an open-source PHP CMS)
has all of this and more...


I admire your determination to do it The Right Way(TM), but remember that you can only lead a horse to water; you can't make him drink.
You can't go from 0 to 100 overnight. Start out simple. You need to create a basic framework for testing (if one doesn't already exist) and for test documentation. Then teach them some basics and let them go for a month, documenting going forward (not retro-documenting). After a month, review the process and fix the problems that have come up. Give it another month. If after two months things are going smoothly, then start a small project of retro-documenting into the existing, working doc system. This could mean taking a few developers to work on it full time, or having all developers devote a dedicated few hours per week to the task.
The key is that this is not simply another activity to check off on the project plan. This is a cultural change. In fact, two changes (documenting tests and test-first development). Go slow, be realistic, and constantly re-evaluate.
--Mark


This is actually roughly the route I am taking. I had been talking about
testing (JUnit and documentation of other testing) long before actually
doing anything. After weeks of working on our boss I've finally got him
with me on it. That fact alone means I'm already halfway there.
Still working on our architect guy, though. No one does anything here without
his nod, and he is definitely not into anything that adds overhead to
our development process. Which he seems to think is OK as it is.
So you could say that we are disagreeing. And since I am the last one
hired, I have problems convincing people. Everyone thinks it's great
when I speak to them individually. They want order and structure. But
during my presentations about it, the only one actually speaking up is the
architect, and he is not hot on the idea... Mainly, I think, because HE
does not want to be forced to do the added work. In fact, that's the feeling
I get from several of my coworkers. Everyone thinks it's absolutely fab, as
long as they don't have to do the work. So up to this point I have been
doing most of the research and setup work to get it running smoothly.
At this point we are doing a pilot test project using all my Word templates,
and we are going to evaluate it. It shouldn't be too difficult to show
my boss the benefits of using the system.
I am definitely aware of this being a cultural change, and I'm biting my
tongue a lot when problems constantly arise that could be mostly eliminated
with a structured approach to testing. I think they are tired of me saying
"Well, why don't you just check the test documentation. Oh, right, we don't HAVE any test documentation..."
[ August 28, 2003: Message edited by: Jonas Larsson ]
 
Jonas Larsson
Greenhorn
Posts: 13

Originally posted by Ilja Preuss:

Well, in my experience, JUnit tests can be so simple that most often they don't need any documentation at all - they *are* documentation...


That's the feeling I get without actually having worked with it yet...
If the granularity is higher, they might need some Javadoc, though.



What exactly do you mean by "a bug is reported"?


An entry is made in Bugzilla with a reference to the test plan and results
document.



What makes you think so?
FitNesse has its roots in an XP environment, where the Customer is involved during the whole process (one of his responsibilities is to define the acceptance tests, which get run at the end of each one-to-two-week iteration).
If you can't do that, I don't see any problem with someone else (product manager, QA, or even the developers) writing the tests using FitNesse, though.


Absolutely nothing. I've just browsed through it and made an
assumption that may be wrong. Maybe I'll take a second look at it.



As long as you can discuss the expected outcome with someone who actually *knows* what he is talking about, that isn't the worst situation.


Well, that knowledge is spread throughout the company, and over time memory
erodes. So it is not an easy or effective process.



Does it work for you? How much guessing do you do, and how often are you wrong (and how do you know about it)?


I have to guess all the time, since I'm the last one hired. But in time
I'll get to a point where I actually have a good grasp of the state of the product.
That's one of the pros of being involved in testing...
 
Jonas Larsson
Greenhorn
Posts: 13

Originally posted by Mark Herschberg:

OK, that's Very Bad(TM). Now, I'm not an idealist. Requirements change, and the requirements doc very often does get out of date. However, this is not good.


It definitely is not. But what can I do with a product that has 300K-400K
lines of code? I'm just one person, and we are NOT going to get the resources
necessary to backtrace everything into a requirements document. That I know.



If this is a small team, it can work. The problem is that that "someone" may not be unique. One of the biggest problems with requirements is that they are underspecified--everyone thinks they agree on the behavior, but it's not explicitly stated. What happens is that developers each "fill in the blanks" their own way, and the results may turn out to be very inconsistent. Given no requirements, your project has a very high risk of running into this problem.


This is exactly how it works now. And sometimes it feels as if there are
not enough hours in a 24-hour day to have the meetings required to agree
on every little detail. The level of specification has to hit a level
that works for everyone but is not too detailed. And then there is the
issue of conflicting requirements, which seems to happen a lot...



Now, as you point out, you're going to define the requirements based on your test cases. Normally when we create requirements, we gather everyone together and spend a lot of time discussing and prioritizing behavior. I suspect what will happen here is that you will add tests "as you think of them" and will stumble across conflicting behavior; e.g., in order to make developer A's test work, developer B's test has to fail. Now who has precedence? Instead of resolving this by having all parties involved consider the options, it is more likely that the handful of people involved will make an arbitrary decision (arbitrary meaning based on their limited information about the project). This is not good.


The only thing that feels comforting to me right now is that we are probably
going to start at least one completely new project and use some of our
ideas from the previous project. So we can improve on the idea and start
with a completely new process that is (hopefully) test-driven, or at least
test-aware.
Thanks for all your comments!
I'll get back to you as this progresses...
 
Ilja Preuss
author
Posts: 14112

Originally posted by Jonas Larsson:

Originally posted by Mark Herschberg:
How about using a wiki?

Can you import images (screen dumps) and make
tables in an easy way in a wiki solution?


In most wikis, yes. In FitNesse (which is a wiki, too), you can. See http://fitnesse.org/FitNesse.MarkupPicture and http://fitnesse.org/FitNesse.MarkupTable (tables are a very important feature for FitNesse, as Fit tests are always written in tables).


Everyone thinks it's absolutely fab, as long as they don't have to do the work. So up to this point I have been doing most of the research and setup work to get it running smoothly.


Yes, that's a common situation. You somehow need to convince them that by doing the extra work, their working conditions will improve. The only way I am aware of to show this effectively is to demonstrate it.
Doing a pilot is one good way to do this, if you can. Another way is to just do as much as you can yourself - and let the improved results speak for themselves.
 
Mark Herschberg
Author
Posts: 6055

Originally posted by Jonas Larsson:

It definitely is not. But what can I do with a product that has 300K-400K
lines of code? I'm just one person, and we are NOT going to get the resources
necessary to backtrace everything into a requirements document. That I know.


Expectation setting. Explain to everyone that you are one person, it's a large code base, and this is a cultural change. What you don't want is management coming to you in four months and saying, "It's not working, so we're pulling the plug on this idea." Not only will things not be ideal; during the first few months they may feel worse, as you're only just starting to discover problems you had been blissfully ignoring.

Originally posted by Jonas Larsson:

This is exactly how it works now. And sometimes it feels as if there are
not enough hours in a 24-hour day to have the meetings required to agree
on every little detail. The level of specification has to hit a level
that works for everyone but is not too detailed. And then there is the
issue of conflicting requirements, which seems to happen a lot...


Make sure everyone understands this is a problem. This will:
1) Make them more aware of it, and more likely to try to solve problems in a way that includes everyone.
2) Make people more sensitive, so when problems happen, people don't start pointing fingers, but instead recognize that the process itself is imperfect.
--Mark
 
Christian Hargraves
Ranch Hand
Posts: 42
Sorry I got in on this thread a little late. We have actually been through the same problems you are going through right now. However, I approached things a bit differently.
I was hired on as a developer at my current employer, and I found several problems with the process for both QA and CM. Actually, there really was no process. After trying to explain my concerns and nobody caring, I decided to do the work myself and show the results on a project after its release.
Once the product was released, I was able to demonstrate much higher product stability and a more effective use of the developers' time. All of a sudden management cared, and it didn't matter what anyone else said, because management could see the ROI.
We still have our problems. I am the only developer (since gone over to QA) writing unit tests, and the other developers dispute the commonly accepted definition of unit testing. We are still using a waterfall process, which, in my opinion, is the worst process for software development.
We have been researching test doc management for a year now. I believe the inherent problems with good test case docs today are time, process, and desire. To overcome the time problem, you must make documenting simple and quick. To overcome the process problem, you must either recruit others who will help, or do it yourself until there is a visible difference in the stability of the product and therefore in the overall ROI. I believe desire comes from knowing how this change affects an individual's working day for the better. If individuals can't see a benefit in their daily tasks, then it's unfair to expect them to have any desire for it. And unless someone has the desire to do it, it probably won't get done very well. If self-interest can't be fostered, then you must either do it yourself or build a really good relationship with the other team and make the "favor" very easy to do.
Our solution was to take the use cases or stories and break them up into features of the application. Then we categorize the features (function points) into three groups: action, validation, and navigation. Each function point is then documented and automated. Documenting a function point is done simply by describing the steps involved for that particular function point. Automating is done the same way, without regard for the other function points. The documenting process is owned by QA, but we refer to the customer for any questions the requirements don't cover.
The test case documentation is then generated by grouping the function points together. The test plans are mostly generated from the test cases and the function points. The master test plan is in turn mostly generated from the individual test plans.
Let's say one of the features of the application changes. Instead of updating all of your automated scripts and test case docs, all that is needed is to go into the one function point that changed and update its docs and code, if it's automated.
Our test cases are actually generated on the fly and are separate from our bug tracking tool (we also use Bugzilla). When a bug matching an existing test case is found, we simply add a link to the test case in the bug and add the regression category to the test case. If a test case for the bug doesn't exist yet (shame on us), we create the test case, simply by grouping predefined function points together, and link the bug to it.
While there are many great approaches to the problems mentioned above, this one requires very little buy-in from other teams that don't see the direct benefits of test case documentation, and it allows for a very high level of reuse of both the automation code and the documentation.
I am still developing the auto-doc generator, but we are currently storing our function point docs as Javadoc, using custom tags, for both automated and manual function points. Once the doc generator is complete, it will go into a test case (which simply contains function points), grab attributes from the Javadoc of each of the function points, and generate the test case docs. The same idea applies to the test plans and the master test plan.
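To illustrate the shape of that (this is a guess, not Christian's actual code - the class, method, and tag names are invented), a function point might carry its documentation as custom Javadoc tags that a custom doclet, or the standard doclet with -tag options, can later collect into generated test case docs:

public class OrderTicketFunctionPoints {

    /**
     * Function point: submit an order ticket from the order entry screen.
     * Steps: open an order ticket, click "Submit", verify the status reads "pending".
     *
     * The tag names below are hypothetical; a doc generator would read them
     * from the Javadoc and group function points into test case documents.
     *
     * @fp.feature   order-ticket
     * @fp.category  action
     * @fp.automated yes
     */
    public void submitOrderTicket() {
        // automation code for this function point would go here
    }
}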
 
Jonas Larsson
Greenhorn
Posts: 13
OK, I'm officially waking this thread up again!
I see what you are saying here, Christian. Simplicity is key to getting your coworkers in on it, and that is something I have been struggling with. I got time last week to install and do an initial evaluation of some kind of automated documentation system. I've been looking at several open-source CMSes, and the most promising initially was Typo3 (www.typo3.com). After a day of struggling with the configuration of our LAMP box here (so it's been a while since I had admin rights on a Linux box, OK?), I got it up and running, only to discover that it was too contrived to add and edit tables (which was one of my requirements). It was also a whole science of its own to build templates for the documents.
So... I installed Plone (www.plone.org) locally on my W2K box (which, by the way, was a breeze to install). Plone is built on top of CMF, which runs on a Python-based web application server called Zope. And contrary to my expectations, it was fast and really good looking. But the admin interface (to the Zope server) is really hard to understand without actually reading a lot of documentation, AND I was again disappointed in the table-editing area, since there were two choices:
1. Edit raw HTML in an external editor and cut and paste it into a web form. Which is probably out of the question.
2. Use what is called "structured text" (STX). STX is a simple way to write structured text without having to use a lot of metadata to structure it. It is like a really lightweight HTML-like language, which seems great for everything *except* tables. Tables in STX look like this:
|-----------------------------------|
|Spanning Header |
=====================================
| Cell 1 | Cell 2 |
| Spanning cell |
-------------------------------------
And everything has to align for it to work.
Great. Anyone know of any good WYSIWYG STX table editors out there?
And I have to say that the available documentation for Plone was minimal.
Crap. So I'm back at square one again, thinking about a system of XML files, possibly in DocBook format, edited with an XML editor and then parsed and transformed (by Ant and friends) into static HTML or PDF. This approach has the added benefit of being possible to put into CVS.
But I don't really know. I have to make a decision pretty soon, since the Word-based documentation is growing by the day and it will get harder and harder to move it to something else.
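For the DocBook/Ant route, the transformation step itself is small. A minimal sketch in plain Java/JAXP (the file names and stylesheet are placeholders - in practice you would point at the DocBook XSL stylesheets and probably drive it from Ant's <style>/<xslt> task instead):

import java.io.File;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TestPlanToHtml {
    public static void main(String[] args) throws Exception {
        // Compile the XSLT stylesheet (placeholder name), then transform the
        // XML test plan into static HTML that can be published or printed.
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
                factory.newTransformer(new StreamSource(new File("docbook-html.xsl")));
        transformer.transform(new StreamSource(new File("testplan.xml")),
                              new StreamResult(new File("testplan.html")));
    }
}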
Has anyone had any experience using any of the document-related Jakarta projects like Cocoon, Forrest, or Anakia?
 
Christian Hargraves
Ranch Hand
Posts: 42
I have used Cocoon 1.x and it is a VERY powerful tool. However, it does require initial programming and there is a learning curve.
For starters, are you sure you aren't putting in requirements that "would be nice" instead of "must have"? That could really change the tools you are looking at.

Have you looked into TWiki? We use TWiki as a CM tool. While I believe we are using it for all the wrong reasons, our CM is very pleased with it. I guess it has some sort of versioning built into it as well.
There is another tool I played with that seemed really nice, called tcdb. "tcdb" stands for Test Case Database. It is written in PHP and is also very easy to install.
 
Ilja Preuss
author
Posts: 14112
Any reason why you haven't tried FitNesse yet? (Installation shouldn't be a problem - it's unzip-and-go, if you have a JDK installed.)
 
Amy Phillips
Ranch Hand
Posts: 280
I found this topic very interesting; I am actually studying it for my university dissertation, with the intent to build a test management and bug tracking tool. I am unable to see why people insist on storing test cases separately from test results and defect reports. Surely they are all linked? Anyway, I have just started researching this area and have found the following test management tools:
Test Case Manager
TestLog
There are also the more expensive Mercury Interactive tools. If you are still stuck in a couple of months, ask me again and I will have reams of information!
Amy
 
Doug Wang
Ranch Hand
Posts: 445
Rational TestManager could be another option for managing your testing documents.
Besides that, it can help you manage your testing project and define a clear testing workflow.
 
Ilja Preuss
author
Posts: 14112

Originally posted by Amy Phillips:
I found this topic very interesting; I am actually studying it for my university dissertation, with the intent to build a test management and bug tracking tool. I am unable to see why people insist on storing test cases separately from test results and defect reports. Surely they are all linked?


Possibly it's because they have different lifetimes? Just a thought...
 