I've been wondering if one can still classify a Waterfall type development approach, where the length of the waterfall cycle is 1-2 weeks, as Agile.
Agile Development – Is Performing a Waterfall Every 2 Weeks Considered Agile?
Tags: agile, development-process, sdlc, waterfall
Related Solutions
Spiral is a cycling waterfall. That's the definition of Spiral; it's what Boehm and others proposed when they invented it.
Even in each Spiral or Agile cycle, there are still step-by-step tasks/objectives that need to be finished...
Mostly true.
Is the main difference the introduction of a feedback loop into the process of development, earlier on and throughout the process?
Yes.
is there a fundamental difference in the actual step-by-step completion of a task?
The feedback loop is the fundamental difference.
Waterfall demands Big Requirements Up Front (BRUF). It demands Big Design Up Front (BDUF) before any real coding can begin.
Spiral and Agile methods relax this demand.
like it's really just compressing and iterating
Don't make it sound so minor. It's not a little tweak. It's a fundamental change in the volume of requirements (and design) and how those requirements (or design) are used.
In waterfall, you can't really start without all the requirements. In many cases, this is an intellectual impossibility. It's hard to visualize all the ramifications of a new way of doing business and new software to empower that new way of doing business.
In Spiral or Agile, you don't have all the information. You have enough to get started.
Many folks want "Spiral" to be "Waterfall". They want a defined schedule based on a complete understanding. In order to stop that foolishness, many folks try not to use the "Spiral" word because it doesn't encourage getting started and delivering software right now with incomplete requirements.
Well, the direct answer to your question would be Mu, I'm afraid: there just aren't enough details to make an informed guess about whether or not you should quit trying.
The only thing I am pretty positive about is that the level of agility should be driven by customer/market needs (about which you gave no info).
- For example, as a user of an IDE, I am perfectly happy to upgrade to a new version once or maybe twice a year, and I am never in a hurry to do so. I.e., if their release cycle is 3 months (12 weeks), then I am perfectly happy with that.
On the other hand, I can easily imagine, say, a financial trading company going bankrupt if it takes more than a month for its software to adapt to market changes - a 12-week test cycle in that case would be a road to hell. So: what are your product's needs in this regard?
Another thing to consider is: what level of quality is required to serve your customer/market needs?
- Case in point: at a company where I once worked, we found we needed a new feature in a product licensed from a software vendor. Without this feature we suffered rather badly, so yes, we really wanted them to be agile and to deliver the update within a month.
And yes, they appeared to be agile, and yes, they released that update in a month (if their QA cycle was 12 weeks, they likely just skipped it). And our feature worked perfectly well - you'd guess we should have been perfectly happy? No! We discovered a showstopper regression bug in functionality that had worked just fine before, so we had to stick with (and suffer through) the older version.
Another month passed and they released another new version: our feature was there, but so was the same regression bug; again, we didn't upgrade. And another month, and another.
In the end we were able to upgrade only half a year later. So much for their agility.
Now, let's take a closer look at those 12 weeks you mention.
What options did you consider for shortening the QA cycle? As you can see from the example above, simply skipping it might not give you what you expect, so you'd better be, well, agile and consider different ways to address it.
For example, did you consider ways to improve the testability of your product?
Or did you consider the brute-force solution of just hiring more QA? However simple it looks, in some cases this is indeed the way to go. I've seen inexperienced management trying to fix product quality problems by blindly hiring more and more senior developers when just a pair of average professional testers would have sufficed. Pretty pathetic.
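To make "improve testability" concrete: one common technique is to put hard-to-control dependencies (such as the system clock) behind a seam so that behavior can be verified by fast automated tests instead of slow manual QA passes. A minimal Python sketch; the discount rule and all names here are invented for illustration:

```python
from datetime import datetime, timezone

# Hypothetical example: a pricing rule that originally called
# datetime.now() internally, which made it impossible to test
# the weekend branch on a weekday.
def weekend_discount(price, now=None):
    """Apply a 10% discount on weekends.

    Accepting `now` as a parameter (a test seam) lets automated
    tests pin the date instead of waiting for a real weekend.
    """
    now = now or datetime.now(timezone.utc)
    if now.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return round(price * 0.9, 2)
    return price

# Tests can now exercise both branches deterministically:
saturday = datetime(2024, 1, 6, tzinfo=timezone.utc)
monday = datetime(2024, 1, 8, tzinfo=timezone.utc)
print(weekend_discount(100.0, now=saturday))  # 90.0
print(weekend_discount(100.0, now=monday))    # 100.0
```

The more of the product that can be checked this way, the less of the 12-week manual cycle is actually load-bearing.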
Last but not least: I think one should be agile about the very application of agile principles. I mean, if the project requirements aren't agile (i.e., they are stable or change slowly), then why bother? I once observed top management forcing Scrum on projects that were doing perfectly well without it. What a waste it was. Not only were there no improvements in their delivery but, worse, the developers and testers all became unhappy.
update based on clarifications provided in comments
For me, one of the most important parts of Agile is having a shippable release at the end of each sprint. That implies several things. First, a level of testing must be done to ensure no showstopping bugs if you think you could release the build to a customer...
Shippable release, I see. Hm. Hmmm. Consider adding a shot or two of Lean into your Agile cocktail. I mean, if this is not a customer/market need, then it amounts to nothing but a waste of (testing) resources.
I, for one, see nothing criminal in treating the sprint-end release as just a checkpoint that satisfies the team.
- Dev: "yeah, that one looks good enough to pass to testers"; QA: "yeah, that one looks good enough in case further shippable-testing is needed" - stuff like that. The team (dev + QA) is satisfied, and that's it.
...The most important point that you made was at the end of your response in terms of not applying agile if the requirements are not agile. I think this is spot on. When we started doing agile, we had it dialed in, and the circumstances made sense. But since then, things have changed dramatically, and we are clinging to the process where it may not make sense any longer.
You got it exactly right. Also, from what you describe, it looks like you have reached a state (in team/management maturity and customer relationship) that allows you to use a regular iterative development model instead of Scrum. If so, you might also be interested to know that, in my experience, in cases like that regular iterative felt more productive than Scrum. Much more productive: there was simply so much less overhead, and it was so much easier to focus on development (and for QA, respectively, to focus on testing).
- I usually think of it in terms of a Ferrari (regular iterative) vs. a Land Rover (Scrum).
When driving on a highway (and your project seems to have reached that highway), the Ferrari beats the hell out of the Land Rover.
It's off-road where one needs a jeep, not a sports car - I mean, if your requirements are irregular and/or your teamwork and management experience are not that good, you'll have to choose Scrum, simply because trying to go regular will get you stuck, just as a Ferrari would get stuck off-road.
Our full product is really made up of many smaller parts that can all be upgraded independently. I think our customers are very willing to upgrade those smaller components much more frequently. It seems to me that we should perhaps focus on releasing and QA'ing those smaller components at the end of sprints instead...
The above sounds like a good plan. I worked on such a project once. We shipped monthly releases with updates localized within small, low-risk components, and the QA sign-off for these was as easy as it gets.
- One thing to keep in mind for this strategy is to have testable verification that a change is localized where expected. Even if this goes as far as bit-by-bit file comparison for the components that didn't change, go for it, or you won't get it shipped. The thing is, it's QA who is responsible for release quality, not us developers.
It is the testers' headache to make sure that unexpected changes didn't slip through - because, frankly, as a developer I've got enough other stuff to worry about that is more important to me. And because of that, they (the testers) really, really need solid proof that things are under control in the release they are testing to ship.
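The "bit-by-bit comparison" above is cheap to automate. A minimal sketch, assuming releases are laid out as directories with one subdirectory per component (the layout and names are invented for illustration): hash every file in the components that are declared frozen and diff against the previous release, giving QA machine-checkable proof that the change really is localized.

```python
import hashlib
from pathlib import Path

def component_digests(root):
    """Map each file under `root` (by relative path) to its SHA-256 digest."""
    digests = {}
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(root))
            digests[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

def unexpected_changes(old_release, new_release, frozen_components):
    """Return files that changed inside components declared frozen.

    A file counts as changed if it was added, removed, or its
    contents differ between the two release trees.
    """
    changed = []
    for comp in frozen_components:
        old = component_digests(Path(old_release) / comp)
        new = component_digests(Path(new_release) / comp)
        for rel in sorted(set(old) | set(new)):
            if old.get(rel) != new.get(rel):
                changed.append(f"{comp}/{rel}")
    return changed  # an empty list is the "solid proof" QA wants
```

If the returned list is empty, the frozen components are byte-identical between releases, and QA can sign off on them without retesting.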
Best Answer
Traditional (and incorrect) waterfall is a single iteration through the phases of the lifecycle. First, you perform requirements engineering. Using those requirements, you architect and design the system and verify/validate those designs. Then you implement the system. Once the system is implemented, you test it to ensure it works as intended. Finally, you ship it to the customer for deployment and use. The project then enters a maintenance cycle, where you fix bugs and release updates until the product reaches end-of-life. The process is one strictly linear pass through those phases.
In reality, this is a bad model for developing software. The paper from which a lot of people learned about waterfall actually proposed something very different: it involves a high level of customer involvement at each phase, along with transitions back to previous phases to correct and revise artifacts. You can read more about it in Royce's paper, Managing the Development of Large Software Systems.
Finally, we have the agile approaches, which are iterative and incremental development models. There are many variants of iterative and incremental models, but the idea in all of them is that you perform all of the lifecycle phases - requirements, architecture, design, implementation, testing, release - many times over the life of the product, until the customer wants to end the product. There's no single detailed diagram of what iterative and incremental development looks like, as there are many variations, but the unifying result is a feedback loop.