So you have adopted agile at some level of your organization. That's great. But how do you think you are doing? Many organizations implement an agile process without a way to gauge its effectiveness. While there are specific metrics at the team, program, and portfolio levels to gauge effectiveness, there are also some general health indicators that organizations should examine on a regular basis.
In this article, we will review three of those health indicators, along with sets of questions you can ask to gauge how successful you are in each area. As with agile development itself, improving your organization's overall agile health is an incremental process.
The first and most significant indicator of agile success within your organization is the amount of time from idea to implementation (lead time). In the Scaled Agile Framework (SAFe®), we refer to this as the sustainably shortest lead time. In organizations that are doing this well, a prioritized feature spends minimal time sitting idle between hand-offs. In organizations where this is a challenge, a feature may sit in limbo for an extended period before it ends up in the hands of your users.
Organizations that struggle with shortest lead time usually share the same issue: bottlenecks. For example, there might be a bottleneck at the enterprise architecture level, the decision-making/prioritization level, or even the user experience research level. In the worst scenarios, an organization has all of these bottlenecks at once. Many organizations create these bottlenecks by centralizing too much decision-making in a single individual or group.
Another area where significant delays occur is an organization's release process. Extensive manual regression testing, integration, and environment configuration can lead to very long wait times to get a completed feature into the hands of your users. I have talked to many organizations whose release process is over a month long. Many of them have automated very little of this process, which requires extensive staff hours every time a release is planned.
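If you want to move from a gut feeling about lead time to an actual number, a simple starting point is to compute the elapsed days between when an idea was captured and when it reached production. The sketch below is a minimal, hypothetical example: the feature names, dates, and record structure are illustrative assumptions, not data from any real organization.

```python
from datetime import datetime

def lead_time_days(idea_created, released):
    """Days from idea capture to production release, including all hand-off time."""
    return (released - idea_created).days

# Hypothetical feature records: (name, idea captured, released to production)
features = [
    ("search-filters", datetime(2023, 1, 5), datetime(2023, 3, 20)),
    ("sso-login",      datetime(2023, 2, 1), datetime(2023, 2, 28)),
]

times = [lead_time_days(start, end) for _, start, end in features]
average = sum(times) / len(times)
print(f"Average lead time: {average:.1f} days")
```

Tracking this average over time (rather than as a one-off number) is what reveals whether process changes are actually shortening the idea-to-user journey.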
Here are some questions you may want to ask about your organization with regard to lead time:
- What elements of planning and definition cause delays? Is detailed decision-making centralized under a few individuals or groups (thus creating a bottleneck)?
- Is your funding process creating extensive wait times for initiatives? Does it take a long time to align resources after funding is committed?
- Does your organization have a bloated and unmanageable backlog of features? Do you often get trapped within analysis paralysis when attempting to implement backlog items?
- What is your deployment and release process? How much time does it take from the team completing the development work to it being in the hands of your users? Could some of this be improved by some level of automation?
In addition to sustainably shortest lead time, predictability is the second health indicator. Ask yourself: given a set of functionality estimated at one month of work, how confident are you that your team will complete it in the allotted time? For many organizations that confidence is probably below 20%, and it drops dramatically beyond a month. Why is this so hard to predict?
One of the common myths about agile development is that it involves no future planning or predictability. In reality, agile development, if implemented properly within an organization, should lead to more reliable roadmap planning than traditional waterfall development. Techniques such as relative estimation, iterative development, and a shared definition of done have repeatedly been shown to produce more predictable execution.
Another key area where many organizations struggle is centralized decision-making. I've seen many organizations attempt to shoehorn agile processes into how they already work. For example, I have seen situations where someone other than the development team performs the relative estimation, or where executives set release dates for functionality that cannot change. In situations where an organization is agile only at the team level (and not at the overall organizational level), some of the potential gains in predictability will never be realized.
Here are questions you may want to ask yourself about predictability within your organization:
- Are your development team’s estimates significantly different from the actual time required? Are they receiving the needed information to estimate correctly? Is someone other than the development team(s) completing the estimates?
- Is QA continually causing delays in a release? Are they integrated with the development teams, or do they operate in a vacuum? Are individual user stories considered complete before testing is complete? Does the development environment closely align with the staging and production environment?
- Does your planning, budgeting, and roadmap process follow agile principles, or is it still reflective of waterfall development? Is decision making too centralized with a few individuals or groups? Are timeframes / budgets set for initiatives without taking into account team estimates for the work?
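Predictability, like lead time, can be tracked rather than guessed at. One simple approach, sketched below, is to count how often teams deliver what they commit to across sprints. Both the 80% threshold for a "predictable" sprint and the story-point numbers are hypothetical assumptions chosen for illustration; your organization would need to pick its own tolerance.

```python
def predictability(committed, completed, threshold=0.8):
    """Fraction of sprints where the team delivered at least `threshold`
    of its committed story points."""
    hits = sum(1 for c, d in zip(committed, completed) if d >= threshold * c)
    return hits / len(committed)

# Hypothetical story points committed vs. completed over six sprints
committed = [30, 28, 32, 30, 31, 29]
completed = [24, 29, 20, 28, 30, 25]

print(f"Predictability: {predictability(committed, completed):.0%}")
```

A low or erratic number here usually points back to the questions above: estimates made by the wrong people, missing information, or commitments set without team input.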
The final indicator of an organization's agile health is quality. As you define initiatives and task your teams with completing the work, is there a steady stream of quality-related issues, either during the release process or once the work reaches production? An organization that is doing well will have minimal issues escape the sprint process; in that kind of environment, issues at the deployment level should be limited to integration problems. In organizations that are not doing well, quality issues continually slip through the cracks, and there is a perpetual blame game between developers, testers, and business owners as to who is at fault.
In many organizations this is perpetuated because teams are still organized in a way that is consistent with waterfall development. Agile prescribes cross-functional teams. This means that everyone needed to define, build, and test a user story is on the same team, participates in estimation, and makes a shared commitment to complete a story within a sprint. As a group, they also agree to the same definition of done. One of the reasons this model is so successful is that it makes quality a goal shared by the whole team. It is no longer a battle between developers, testers, and business owners.
Here are questions you may want to ask yourself about the quality of software within your organization:
- Do you receive a steady stream of bug reports from the users of your application? Are there issues that end up in production that were not discovered through your QA process?
- Are there clearly defined expectations for design and development best practices within your organization? Do your developers have needed access to the architects to answer questions about standards and implementation?
- Are the development teams cross-functional (including everyone needed to define, build, and test a user story)? Do the development teams have a single definition of done they all operate by?
- Is there a sense of team ownership in the completed vision? Do your developers and testers have a culture of cooperation toward a shared vision? Do you have automated testing in place to ensure certain issues get discovered on each check-in?
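To make the last question concrete, here is a minimal sketch of the kind of automated regression check a CI server could run on every check-in. The `apply_discount` business rule is entirely hypothetical; the point is that quality expectations become executable and shared rather than argued over after a release.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical business rule under test: discount a price by a percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTests(unittest.TestCase):
    """Regression checks that would run automatically on each check-in."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_no_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    # exit=False lets the script continue after the run when executed directly
    unittest.main(argv=["check-in"], exit=False)
```

Once checks like these run automatically, "done" stops being a matter of opinion: a story that breaks an existing test is visibly not done, for everyone on the team.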
Organizations should take time each year to evaluate these areas and adapt their processes to remove the impediments. If your organization is having difficulty with one or more of these factors, it may be time to see what it would look like to put a more holistic and proven framework in place to govern how your organization moves from idea to implementation. The current leading framework of this type is the Scaled Agile Framework® (SAFe®).
One advantage of working within SAFe® is that your analysis shifts from subjective impressions to specific metrics that gauge success at the team, program, and portfolio levels. Because relentless improvement is a core concept on which SAFe is built, there are predefined areas for reviewing the successes and challenges of a given unit of work, as well as predefined points at which to review those metrics and plan how to adapt the process for more efficient execution.
Whatever framework or approach you choose, continual inspection and adaptation is required to ensure it is working as intended.