Agile Methodology – Consolidating Bug and Iteration Tracking

agile, issue-tracking

This topic stemmed from my other question about a management-imposed, waterfall-like schedule. From the responses in that thread, I gathered that the general advice is:

  • Each story should be completed with no bugs; a story is not closed until all bugs have been addressed. Nothing new there, and I think we can all agree on this.
  • If, at a later date, QA (or worse yet, a customer) finds a bug, the report goes into a bug-tracking database and also becomes a story, which is prioritized just like all other work.

Does this sum up the general handling of bugs in an agile environment?

If so, the part I'm curious about is how teams handle tracking in two different systems (unless most teams don't actually have separate systems).

I've read a lot of advice (including Joel's blog) on software development in general, and specifically on the importance of a good bug-tracking tool. At the same time, books on agile methodology never seem to cover this topic, because in "pure" agile you finish the iteration with no bugs. It feels like there's a hole there somewhere.

So how do real teams operate? To track iterations you'd use one set of tools (a whiteboard, Rally…); to track bugs you'd use something from another set of products (if you're lucky enough, you might even get stuck with HP Quality Center). Should there be two separate systems? If they are separate, do teams spend time building import/sync functionality between them? What have you done at your company? Is bug-tracking software used at all, or do you go straight to creating a story?

Best Answer

"...finds a bug, the report goes into a bug tracking database and also becomes a story which should be prioritized just like all other work.

The question is: should bug tracking and feature tracking be separate, and can you use a single system to do both, as well as schedule iterations, milestones, and so on?

In terms of a "pure" agile approach, you allow your team to use any combination of tools and processes that works well for them. Sure, you may find a single product that does everything, but perhaps it doesn't do some things as well as you'd like. If you run multiple systems, you need to determine how integrated they really need to be, find the means to do that integration, and decide how much information needs to be duplicated between them. It all boils down to a cost/benefit trade-off, so any system you adopt needs to take into account its impact on the team's overall efficiency.
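
To make the cost of that duplication concrete, here is a minimal sketch of the simplest kind of integration: a one-way sync that copies newly reported bugs from the bug tracker into the iteration tracker as candidate stories. This is only an illustration; both endpoints, the field names, and the tokens are hypothetical placeholders, not the API of any particular product.

    import requests

    # Hypothetical endpoints and tokens -- stand-ins, not any real product's API.
    BUG_TRACKER_URL = "https://bugs.example.com/api/issues"
    STORY_TRACKER_URL = "https://stories.example.com/api/stories"

    def sync_new_bugs(since):
        """Copy bugs reported after `since` into the story tracker as candidate stories."""
        bugs = requests.get(
            BUG_TRACKER_URL,
            params={"created_after": since, "status": "open"},
            headers={"Authorization": "Bearer <bug-tracker-token>"},
            timeout=10,
        ).json()

        for bug in bugs:
            story = {
                "title": "[BUG] " + bug["summary"],
                "description": bug["description"],
                # Keep a back-reference so the two records can be reconciled later.
                "external_id": bug["id"],
            }
            requests.post(
                STORY_TRACKER_URL,
                json=story,
                headers={"Authorization": "Bearer <story-tracker-token>"},
                timeout=10,
            ).raise_for_status()

    if __name__ == "__main__":
        sync_new_bugs("2011-01-01")

Even a script this small shows where the real cost lies: deciding which fields are worth copying, and keeping a back-reference so the bug report and the story can be reconciled later.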

Where I work, we use Redmine to track bugs and features in a single system across multiple projects, with links between projects where dependencies exist. We create labels that relate to milestones, which for us are effectively long iterations ranging from a few weeks to a few months. For individual tasks and features we don't track iterations too closely, so we have no need for burndown charts, whiteboards, sticky notes, feature cards and the rest; we've found that, for our specific needs, some of that is overkill. Each feature itself effectively represents a small iteration of 2 to 10 days, and for those who might care, we log our estimated time versus actual time for later analysis. This may sound a little ad hoc, but it works for us, and ultimately our real measure is working code delivered within a series of time frames.
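
For what it's worth, that estimate-versus-actual logging doesn't need anything elaborate; the sketch below pulls it straight out of Redmine's REST API. It assumes the REST API is enabled and that time is logged against issues; the URL, API key, and project identifier are placeholders, and the exact filters available may vary with your Redmine version.

    import requests

    REDMINE_URL = "https://redmine.example.com"         # placeholder
    HEADERS = {"X-Redmine-API-Key": "<your-api-key>"}    # placeholder

    def estimate_vs_actual(project_id):
        """Print estimated vs. logged hours for closed issues in one project."""
        issues = requests.get(
            REDMINE_URL + "/issues.json",
            params={"project_id": project_id, "status_id": "closed", "limit": 100},
            headers=HEADERS,
            timeout=10,
        ).json()["issues"]

        for issue in issues:
            # Sum the time entries logged against this issue.
            entries = requests.get(
                REDMINE_URL + "/time_entries.json",
                params={"issue_id": issue["id"], "limit": 100},
                headers=HEADERS,
                timeout=10,
            ).json()["time_entries"]
            spent = sum(entry["hours"] for entry in entries)
            estimated = issue.get("estimated_hours") or 0
            print("#%s %s: estimated %.1fh, spent %.1fh"
                  % (issue["id"], issue["subject"], estimated, spent))

    if __name__ == "__main__":
        estimate_vs_actual("my-project")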

I suppose that if we decided to adopt another, more formally "regimented" methodology, we might consider a tool to aid in tracking progress. Given what we've already invested in our present method, though, we'd probably feed, at a minimum, the short feature descriptions and time data into that other system, unless someone has developed a Redmine module that does what we want; or, if it became important enough to us, we might write the Redmine module ourselves to avoid any nasty integration headaches.
