It looks like you took some fancy items from agile development, bolted them onto a waterfall process, and now you call it agile.
The product is developed for a customer who will re-sell it while
paying us royalty.
This is OK.
The team does not get to talk directly to the end user. Only to
the reseller.
This is OK. The product owner talks to the reseller and collects requirements.
A product requirements document was created before starting
development.
This is not OK. I have never seen a project where a definitive set of requirements could be defined upfront. Replace your product requirements document with a short product vision plus an initial set of requirements that are subject to change.
The requirements are rigid and do not change.
This is not OK, and you will discover in time that it is not true either.
A delivery schedule was agreed on with milestones such as "alpha",
"beta" etc. and features/times attached to those milestones.
This is not OK. The real schedule will emerge from the team's progress. You can define general milestones, but assigning an exact set of features to each milestone is not agile; the scope can change during development.
All developers on the Scrum team report to the product owner, a
software manager.
This is not OK. I would not say that developers report to the product owner. The Scrum process keeps the work visible, but developers do not report anything outside the regular meetings. It is the product owner's responsibility to stay in contact with the team and, as an active participant, see the progress for himself.
Testers on the team report to a QA manager.
This is not OK. Testers should be part of the development team, because a user story is not done until it is tested (there should be an automated test validating its acceptance criteria). A separate QA group can exist, but it is an additional level of more complex testing, usually done on the customer side (although it does not have to be), to validate that the software does what the customer expects; its feedback is collected as new backlog items or as bugs against completed backlog items.
Moving QA entirely outside the development team defeats the whole purpose of the definition of done. Some QA must be part of the team, and that part does not report to any QA manager: it makes the commitment together with the development team.
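As a hedged illustration of "an automated test to validate acceptance criteria", here is a minimal sketch. The story, the `apply_discount` function, and the numbers are all invented for the example; the point is only that each acceptance criterion becomes an executable check the team runs before calling the story done.

```python
# Hypothetical story: "A customer gets a 10% discount on orders over 100."
# apply_discount is a made-up function standing in for real product code.

def apply_discount(total):
    """Apply a 10% discount to order totals strictly over 100."""
    return total * 0.9 if total > 100 else total

def test_discount_applied_above_threshold():
    # Acceptance criterion 1: orders over 100 are discounted by 10%.
    assert apply_discount(200) == 180

def test_no_discount_at_or_below_threshold():
    # Acceptance criterion 2: orders at or below 100 are unchanged.
    assert apply_discount(100) == 100
    assert apply_discount(50) == 50

if __name__ == "__main__":
    test_discount_applied_above_threshold()
    test_no_discount_at_or_below_threshold()
    print("acceptance criteria satisfied")
```

In practice such checks would live in the team's test suite (e.g. run by a test runner on every commit), so "done" is verifiable rather than a matter of opinion.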
The product owner has directed the team towards certain high risk
technical tasks. The output of those tasks is not usable by the end
user but rather some technology/code that will eventually be used in
the product.
This happens in every project, but it should be part of some product backlog item targeting the end user. It can be included directly in a backlog item implemented in the current iteration, or it can be a spike (proof of concept) to clarify the complexity of a backlog item planned for the future.
The product owner has created a backlog based on the requirements.
This is a must.
The product owner is unable to answer some questions regarding the
product. He refers to others or to the documented requirements.
This is not OK. It is the product owner's job to know the answers. He has the responsibility and he must make the decisions. If he does not know an answer, he must find it as soon as possible.
The team goes through the motions of Scrum. Daily Scrum, Sprint
Planning, Retrospective etc. There is a ScrumMaster.
This is OK, but it does not by itself mean the team is doing Scrum.
Every sprint the product owner and management decide what backlog
items the team works on.
This is definitely not OK. The product owner and management can set priorities, but the commitment (selecting the highest-priority items that fit the sprint) is the team's responsibility.
There is a burndown chart. Scrum board with stories and tasks. The
estimates on those come from the team.
This is OK.
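For context, a burndown chart is just remaining estimated work plotted against sprint days, usually alongside an "ideal" straight line. A minimal sketch of computing those two series (all numbers invented):

```python
# Burndown sketch with made-up numbers: remaining estimated hours at
# the end of each sprint day, plus the ideal even-progress line.

sprint_days = 5
total_estimate = 40  # hours committed at sprint planning

# Hours of work completed on each day (hypothetical progress).
completed_per_day = [6, 10, 4, 12, 8]

remaining = [total_estimate]
for done in completed_per_day:
    remaining.append(max(remaining[-1] - done, 0))

# Ideal line: even progress from total_estimate down to zero.
ideal = [total_estimate - total_estimate * d / sprint_days
         for d in range(sprint_days + 1)]

print(remaining)  # [40, 34, 24, 20, 8, 0]
print(ideal)      # [40.0, 32.0, 24.0, 16.0, 8.0, 0.0]
```

The gap between the two lines is what makes the chart useful: it shows the team and the product owner at a glance whether the sprint commitment is on track.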
The team sits in an open floor "bull pen" shared with other teams,
all visible and audible. There is cross-team noise and there is foot
traffic around the team area.
It is the Scrum Master's responsibility to put an end to this if the team feels it reduces their productivity.
The team may be required to attend various meetings not directly
related to the goals of the sprint.
This is OK; the time spent in these meetings will simply result in a smaller commitment (the team will do less real work). It is up to the Scrum Master and management to reduce such meetings in order to increase the team's velocity.
There are pressures to select certain technical solutions. Some
tools and processes are mandated.
This is partially OK. There can be non-functional requirements for tools and architecture, and there can be defined processes, but the final implementation is still up to the team.
At the beginning of the sprint there is nothing to test yet
Really? You have no requirements to validate? No discussions to have with your customer? No wire-frames to evaluate? No test plans to think about?
at the end of the sprint there is typically nothing or very little
left to develop/fix
I have never been in that place on a project. No more work to do? There is always something. Are all your tests fully automated? How is your CI looking? Could the database access layer be refactored to be simpler? I have never worked on anything with an empty bug list and backlog. What did your developers do during the testing phase in waterfall?
I know some people get very religious about what is and what is not 'Scrum'. I couldn't care less about that. But I think you have two issues here:
A 'traditional' QA department that tests code once it is 'finished' by developers, rather than working with customers and developers to make sure you are building the right thing as well as building it right. Take a look at the agile testing quadrants by Lisa Crispin. The best testers are involved in every stage of the software development lifecycle, and the best developers write their own tests.
Trying to stick too closely to a Scrum timetable of one- or two-week sprints without having a prioritised and sized backlog that is split into tasks small enough to complete within a single sprint. If you had this, there would always be more work to get on with. Maybe the last feature you work on doesn't make it into this sprint's release, but it can always go into the next one.
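As a toy illustration of "prioritised and sized": given a backlog where every item has a priority and a size, sprint selection is just taking items in priority order until the team's capacity is used up. All names and numbers below are invented.

```python
# Toy sprint-selection sketch: pick the highest-priority backlog items
# that fit within the team's capacity (all data invented).

backlog = [
    # (priority, size in points, title); lower priority number = more important
    (1, 5, "login form"),
    (2, 8, "password reset"),
    (3, 3, "remember-me checkbox"),
    (4, 13, "OAuth integration"),
]

capacity = 16  # points the team commits to this sprint

selected, used = [], 0
for priority, size, title in sorted(backlog):
    if used + size <= capacity:
        selected.append(title)
        used += size

print(selected)  # ['login form', 'password reset', 'remember-me checkbox']
```

The item that doesn't fit ("OAuth integration" here) isn't a failure; it simply stays at the top of the backlog for the next sprint, which is exactly the point made above.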
Aside: if you have a small, cohesive team, then the roles matter less. Instead of having someone labelled 'tester' who isn't allowed to write production code, or someone labelled 'developer' who thinks they are above testing, everyone should be doing whatever is necessary for the team to succeed, including the dreaded project-management tasks when they arise. This is called a cross-functional team.
One extra point, brought up by @Cort Ammon in the comments: the agile manifesto talks about customer collaboration over contract negotiation. You say that:
the client may be disappointed to see the team waste time on something that doesn't bring immediate value
It can be difficult to explain, and I understand customers can be very difficult at times, but this would be a big red flag for me. They are trusting you with their source code / client relationship / business / whatever you are developing for them. If they can't trust you to act professionally in their best interest, then either you have a problem or they do.
I have written a post about software developers not being considered professionals. A professional doctor, lawyer, or civil engineer faced with a client who changed the requirements partway through would not just reduce the quality and moan about it. They would tell their client that it would be a problem. If the client pushed, a professional would not blindly carry on to a dangerously inferior standard, because they would be liable. We don't take professional entrance exams, so we are not liable. That doesn't mean we shouldn't try to be better.
In summary, I wouldn't worry too much about trying to make people more efficient at the beginning and end of a sprint, but rather see it as a symptom of a wider issue within the team. Have you heard of eXtreme Programming (XP)? I'd say the XP principles that apply here are communication and respect:
- Respect: trust your team to do what they think is best. I would argue that if there is a lot of cat-video watching, then either you have poor developers or you are treating them poorly.
- Communication: if your developers are talking to each other, to the testers, to management, and to the customer, then everyone should have a good feel for what is coming up next, and if they don't, they can just ask.
First and foremost, your process should be adapted to what you feel works best for you. Having said that, I think the general guidelines above might help.