Agile Accelerate

Leave Nothing on the Table



“Leave Nothing on the Table: Where Your Productivity Goes—And How to Get It Back”

Leveraging Strategy, Metrics, and Agile Practices to Maximize Efficiency

How efficient is your organization?

If you were to estimate how much of your team’s work actually translates into real, tangible outcomes, what would you say—60%? 70%? Most leaders assume that the majority of their team’s efforts drive meaningful results.  

The reality, however, is far more alarming. Studies show that, on average, teams accomplish only 15% of the work they plan—the rest disappears into inefficiencies, misalignment, and hidden bottlenecks.

What if I told you that you could reclaim 10-20% or more of that lost capacity—not by working harder, but by identifying and addressing the key areas where work quietly drains away? By tackling the areas where process inefficiencies hide, organizations can dramatically increase their delivered outcomes, putting more services and products in front of customers and ultimately returning more money to the company.

This white paper explores the seven critical leverage points that allow teams to recover lost work, reduce waste, and maximize productivity. Through a series of assessments and targeted workshops, Agile Accelerate is helping companies across the globe achieve more with existing resources using a targeted approach to find waste and attack it, leaving nothing on the table.

How Did This Happen?

In today’s complex and often siloed business structures, organizations routinely overestimate the efficiency of their operations. While many leaders believe that a significant portion of their team’s efforts directly contributes to organizational goals, the reality is starkly different. Research indicates that a substantial amount of work is lost to various inefficiencies (McKinsey).

A study by McKinsey & Company highlights that many organizations, despite adopting agile practices, are “ultimately doing so to deliver work and not value.” This misalignment between effort and value delivery underscores a critical issue in modern workplaces (McKinsey).

The consequences of such inefficiencies are profound. Organizations face increased costs, delayed project timelines, and diminished employee morale. For instance, a report from the Project Management Institute found that for every $1 billion invested in projects, $122 million was wasted due to poor project performance (PMI).

In many cases, engineering efforts are so splintered that even the effort to tackle technical debt is hindered by context switching into code maintenance issues, operational tasks, and other non-coding work.

Addressing these challenges presents a significant opportunity. By identifying and mitigating these inefficiencies, organizations can enhance productivity, reduce costs, and improve overall performance. Implementing targeted strategies to tackle these issues can lead to substantial gains in efficiency and value delivery.

In the following sections, we will delve deeper into these inefficiencies, providing data-driven insights and practical solutions to help organizations reclaim lost productivity and achieve their strategic objectives. 

Help!

If only 15% of work is truly delivering value, what happens to the rest? The answer lies in seven key areas—common but often overlooked pitfalls that drain capacity, create inefficiencies, and erode the impact of even the most well-intentioned teams. Understanding these areas is the first step toward reclaiming lost work.

  • Dark Work – Tasks that consume time and resources but aren’t tracked or measured, making them invisible to leadership and impossible to optimize.
  • Low-Value Stories – Work that gets done but contributes little to strategic goals, often due to unclear requirements or poor stakeholder alignment.
  • Overhead – Administrative burdens, excessive meetings, reporting, and other non-value-adding activities that slow execution.
  • Misalignment – When teams and leadership operate with different priorities, causing duplication, conflicting efforts, and wasted resources.
  • DOWNTIME – Delays, handoffs, waiting for approvals, and other process inefficiencies that stretch timelines and reduce output.
  • Poor Prioritization – Teams spread thin across too many initiatives, leading to context switching, burnout, and a lack of meaningful progress.
  • Technical Debt – Short-term fixes and neglected system improvements that create long-term inefficiencies, making future work slower and costlier.

Each of these areas represents a recoverable loss—a place where small, targeted improvements can unlock 10-20% or more of lost capacity. By systematically addressing these issues, organizations can shift from merely working harder to working smarter, ensuring more of their effort translates into meaningful, high-value results.

What’s In It For Me?

At Agile Accelerate, we first work with organizations to assess what is currently being left on the table. The picture is unique to each business, but many of the root causes are shared across industries.

Our company then crafts custom, targeted workshops that directly address the work-loss challenges described above; each is framed below in terms of its relevance, benefits, and implications. In the past, we have offered services focused on giving the organization 100% control over the work committed to and delivered. Example workshops we have hosted include:

1. Facilitated Strategy Mapping

  • Relevance to Work Loss: Many organizations struggle with inefficiencies because their teams lack clear alignment on objectives, critical success factors, and key activities. This leads to wasted effort on low-impact work.
  • Benefits: This workshop helps teams refine their vision and create structured success factors, ensuring that efforts are focused on delivering actual value rather than just completing tasks.
  • Implications: Companies that engage in strategic mapping can reduce misalignment, minimize duplicated efforts, and significantly improve the impact of their initiatives.

2. Metrics Audit and Strategy

  • Relevance to Work Loss: Organizations often measure activity rather than value, leading to an overemphasis on busy work instead of meaningful outcomes.
  • Benefits: By assessing gaps in current metrics and aligning them with value delivery, this workshop helps teams track progress in a way that drives impact rather than just motion.
  • Implications: Businesses that refine their measurement strategies can better identify inefficiencies, eliminate unnecessary work, and redirect focus toward high-value outcomes.

3. Value Stream Mapping Workshop

  • Relevance to Work Loss: Inefficiencies in workflow, handoffs, and bottlenecks often lead to excessive delays and wasted effort across teams.
  • Benefits: This workshop enables organizations to visualize their end-to-end delivery process, pinpoint waste, and create actionable plans for improvement.
  • Implications: By eliminating bottlenecks and streamlining workflows, organizations can enhance productivity, shorten cycle times, and improve overall efficiency.

4. Facilitated Root Cause Analysis (RCA)

  • Relevance to Work Loss: Many organizations implement surface-level fixes without addressing the systemic issues causing inefficiencies, leading to recurring problems.
  • Benefits: RCA helps teams methodically analyze and identify the true sources of inefficiencies, ensuring solutions are impactful and long-lasting.
  • Implications: Companies that perform root cause analysis can prevent repetitive failures, improve problem-solving capabilities, and reduce wasted time spent on rework.

These workshops provide concrete, actionable ways to tackle inefficiencies and reclaim lost productivity, ensuring that organizations are focusing on meaningful, value-driven work. 

Act Now! 

It’s time to leave nothing on the table and unlock your organization’s full potential.

The hidden inefficiencies in your organization—misaligned strategies, untracked metrics, fragmented collaboration, and unoptimized workflows—are costing you valuable time and resources. Agile Accelerate’s interactive workshops are designed to uncover and address these obstacles, empowering your teams to align on strategy, work smarter, and drive real, measurable impact.

Our sessions provide actionable insights and practical tools that help organizations streamline operations, enhance collaboration, and maximize value delivery. Whether you’re struggling with ineffective planning, lack of alignment, or missed opportunities for growth, our expert facilitators will guide you through targeted solutions that create lasting transformation.

The cost of inefficiency is too high to ignore. Let’s work together to reclaim lost capacity, strengthen your team’s agility, and ensure nothing is left on the table. 

Contact Agile Accelerate today to explore how our workshops can drive immediate and lasting improvements in your organization.

Schedule a consultation now and take the first step toward transformational success!



Extending Cross-Functionality to Programs

There is an excellent rationale for cross-functional teams.  For large programs, that rationale can be easily scaled to the program level.  But, for some reason, this isn’t always recognized.

TEAM CROSS-FUNCTIONALITY

Let’s say you have a team with the following profile of highly siloed individuals:

[Figure: capacity profile of a highly siloed team]

This is great if you have a profile of stories that fits it perfectly, as follows:

[Figure: sprint story profile that exactly matches the team profile]

But what if your set of sprint stories looks more like this?:

[Figure: sprint story profile that does not match the team profile]

In this case, you have a deficiency of analysts, back-end developers, and QA people to implement the stories that your aggregate team capacity might otherwise support.  And, your UX folks and front-end developers will be twiddling their thumbs for part of the sprint.

So, what to do?

Since you are only as good as your lowest-capacity component (which appears to be QA in this particular example), you will have to scale back the number of stories to fit the profile, as shown:

[Figure: stories scaled back to fit the QA bottleneck]

Now, everyone is underutilized, except for QA.  Good luck finding something useful for everyone else to do in the next two weeks.

The net result is that your team is perhaps 30% less productive than it could be (eyeballing the graphic).

However, if you take advantage of standard cross-functional teamwork, your team’s profile may look something like this:

[Figure: capacity profile of a cross-functional team]

Note that by “cross-functional” we do not mean that everyone should be able to do anything.  There are very valid reasons (education, experience, proclivity, enthusiasm) why certain people are ideally suited for certain kinds of work.  Think of the cross-functional nature of someone’s role on a team as a bell curve (alternatively, some talk about T-shaped employees – the T is just the bell curve upside down, as the Y-axis orientation is arbitrary).  The more the curve is spread out, the more they are able to take on other people’s roles.  On a good cross-functional team, the bell curves overlap “somewhat,” meaning that everyone can take on a little bit of someone else’s role, although perhaps not as efficiently.  Still, this allows a team to take on a wide variety of “profiles” of sprint work, as will always be necessary.

So, for example, in the case above,

[Figure: the mismatched sprint story profile from above]

people will adjust to the desired “sprint needs” profile as follows:

[Figure: the team flexing to match the sprint-needs profile]
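
For the quantitatively inclined, here is a minimal sketch of the bottleneck arithmetic above. The role capacities and story demands are hypothetical stand-ins for the charts, and the cross-functional team is treated as fully flexible for simplicity (in practice, the bell-curve overlap is only partial):

```python
# Hypothetical capacity (points per sprint) per role, and the demand profile of
# the candidate sprint stories. These stand in for the charts above.
capacity = {"analyst": 6, "ux": 8, "front_end": 10, "back_end": 10, "qa": 6}
demand   = {"analyst": 8, "ux": 5, "front_end": 6,  "back_end": 12, "qa": 9}

# Siloed team: throughput is capped by the most constrained role (here, QA).
scale = min(capacity[role] / demand[role] for role in demand)
siloed_output = scale * sum(demand.values())

# Cross-functional team: people flex across roles, so only aggregate capacity matters.
flexible_output = min(sum(capacity.values()), sum(demand.values()))

print(f"Siloed team delivers:           {siloed_output:.1f} points")
print(f"Cross-functional team delivers: {flexible_output:.1f} points")
print(f"Capacity lost to silos:         {1 - siloed_output / flexible_output:.0%}")
```

With these made-up numbers, the siloed team loses about a third of its capacity, the same order as the ~30% eyeballed from the graphic.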

PROGRAM LEVEL CROSS-FUNCTIONALITY

Don’t forget that this model can be applied to more than just teams.

For example, there can be a tendency for teams to develop “specific expertise”, due perhaps to knowledge held by certain BSAs or specific architectural or design skills in the development team.  The program may then tend to assign stories based on this expertise, under the theory that this is the most efficient way to get work done.  Unfortunately, this has the effect of only further driving each team into a functional silo.  It can become a vicious spiral, and soon you may hear things like “well, we don’t have generic teams and, at this point, the schedule is paramount, so we need to keep assigning program work according to the team best suited to do it.”  As a result, program backlogs will consist of stories pre-targeted to specific teams, even arbitrarily far out in time.  Imagine what happens when the stakeholders decide to re-prioritize epics or add new features, or a new dependency arises that doesn’t line up with the ideal team at the right time.  The result will be a work profile that doesn’t match the “team profile,” as follows:

[Figure: program work profile that does not match the specialized team profiles]

Enter a cadre of fix-it people – project managers, oversight groups, resource managers, program managers – all trying to re-balance the backlog, shuffling stories around, adding people to teams, and squeezing some teams to do more work while other teams sit partially idle and get assigned low-value filler work.  It is the same wasteful resource management nightmare that is so easily solved by cross-functional teams, except this time at the program level.

So, eliminate the waste by following these simple program-level guidelines:

  1. Create a fully prioritized program backlog without consideration for the teams that will be executing the stories.
  2. Once per sprint, have a program planning session or meta-scrum (Uber-PO, Uber-SM, team representatives) where the candidate stories for the upcoming sprint are identified for each team (a rough allocation sketch follows below).  Include a little more than each team’s velocity would otherwise indicate in case they are able to take on more than their average.
  3. Make it a goal to avoid specializing teams.

All team “profiles” will be identical and program needs can easily be accommodated.

[Figure: identical team profiles easily accommodating program needs]
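
As a rough illustration of guideline 2, here is a hypothetical sketch of that allocation step: walk the prioritized backlog and hand each story to whichever team has the most buffered headroom left, with no notion of team specialty. The team names, velocities, and 15% buffer are all invented for the example:

```python
BUFFER = 1.15  # plan "a little more than each team's velocity"

def plan_sprint(backlog, velocities):
    """backlog: (story, points) pairs in strict priority order."""
    plan = {team: [] for team in velocities}
    load = {team: 0 for team in velocities}
    for story, points in backlog:
        # Identical, non-specialized teams: any team can take any story, so
        # simply pick the team with the most buffered headroom remaining.
        team = max(velocities, key=lambda t: velocities[t] * BUFFER - load[t])
        if load[team] + points > velocities[team] * BUFFER:
            continue  # fits nowhere this sprint; it stays on the program backlog
        plan[team].append(story)
        load[team] += points
    return plan

backlog = [("story-1", 5), ("story-2", 3), ("story-3", 8), ("story-4", 2), ("story-5", 5)]
print(plan_sprint(backlog, {"team-a": 10, "team-b": 8}))
# {'team-a': ['story-1', 'story-4'], 'team-b': ['story-2', 'story-5']}
```

Note that there is no team-to-story matching logic at all; that is the point of keeping the team profiles identical.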

There may be a little bit of short term inefficiency resulting from having the “slightly less than ideal” team work on particular stories, but the more you do this, the more that inefficiency evaporates.  And the advantages are significant:

  • Holistic view of the program backlog allows you to focus on what is important – delivering value
  • No need to engage the expensive swat team of fix-it managers to shuffle around people and project artifacts
  • All team members gain experience and learning, often resulting in greater job satisfaction, and higher performing teams
  • No more single point of failure; no more critical path team
  • Far less chaos and confusion, resulting in more focused individuals
  • Extremely easy to manage – program progress is measured by the simple rate at which all teams work through the stories.  Any gaps between targeted scope and expected scope are easy to identify.



An Experiment in Learning, Agile & Lean Startup Style

I always have a backlog of non-fiction books to read. Given the amount of free time that I have every day, I am guessing that it may be years before I get through them. In fact, the rate at which books get added to my backlog probably exceeds my learning velocity, creating an ever-increasing gap. It feels like a microcosm of Eddie Obeng’s “world after midnight.”

So what to do?

I am trying to increase my velocity by applying speed reading techniques. But so far, that is probably only closing a small percentage of the gap.

Iterative Learning

Then, upon a bit of soul searching, I had an epiphany. Why do I feel the need to read and understand every single word on every single page? This runs counter to what we coach our teams to do—eliminate waste, document only what makes sense, work just-in-time, and think iteratively instead of only incrementally. The answer seemed to be that I don’t feel that I have really read the book if I haven’t read every word. So what? Am I trying to conquer the thing? It seems like a very egocentric point of view.

What if I was able to let go of the ego, and try to read a book iteratively instead of incrementally? Is it even possible? Would it be effective? There are all sorts of ways to tell stories or build products—top-down, bottom-up, inside-out—each of which have their strong points. Sometimes it is most effective, for instance, to grab the user’s attention by initially giving them a nugget that might logically be placed in the middle of a narrative, and then providing necessary foundation, or by filling in the gaps as necessary. Could one apply the same process to learning from a book? I could imagine scanning through a book randomly, stopping at points that looked interesting and digesting a bit—much like I used to do with encyclopedias as a kid. Or, maybe, first reviewing the TOC for areas of interest, jumping to those sections, absorbing a bit, and then searching for any context that was missing.  This would be a completely different way to learn from a book. I couldn’t call it reading, and don’t have a good term for it, other than a new kind of learning.

This led me to thinking a little more deeply about what I am trying to get out of reading; the learning aspect of it. What if I could scan a book in a tenth of the time that it took to read it, but retain half of the content? Would that be an improvement? There seems to be some sort of formula that I am trying to maximize, like dL/dt = C·V·R: the rate of learning equals the “learn-worthy” content of the book (C) multiplied by the speed at which I scan it (V) multiplied by the percentage that I retain (R). Is the percent retained equal to the percent value obtained? Do I get half the potential value of a book if I retain half as much? I could simply define R to be the percent value and my equation still holds. Something in the back of my mind says that it is really sad to look at learning this way. Something else says I am on to something.
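
To make that formula concrete, here is a toy rendering of dL/dt = C·V·R in code. The numbers are invented; they simply restate the trade-off in question, scanning ten times faster while retaining half the content:

```python
def learning_rate(content, speed, retention):
    """dL/dt = C * V * R: learn-worthy content x scan speed x fraction retained."""
    return content * speed * retention

C = 1.0  # normalized learn-worthy content of a given book

deep_read = learning_rate(C, speed=1.0, retention=1.0)   # every word, full retention
scanning  = learning_rate(C, speed=10.0, retention=0.5)  # 10x the speed, half retained

print(f"Scanning learns {scanning / deep_read:.0f}x faster")  # -> 5x
```

On these assumptions, scanning wins by a factor of five; the experiment described below (roughly 50% of the value in 5% of the time) works out to about tenfold.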

Of course, there are all kinds of nuances.  For example, some books build upon a foundation which must be well understood to get any value at all out of the latter sections of the book.  For others, it may be easier to skip around. Some, you may be able to get value out of scanning the TOC, or the subheadings, digesting the graphics, or just reading the intros and summaries of each chapter; for others, not so much.  Hence, in a sense, different books have different learning profiles.

The Experiment

I was intrigued enough to attempt this on a book near the top of my backlog: Stephen Wolfram’s A New Kind of Science, a 1280-page tome that took him ten years to write. So I did it. I didn’t “read” it. I iterated through it and digested some of it. And I can honestly say that, for this particular book, I optimized my learning rate equation significantly. I can’t be sure of the total potential value that the book would have to me were I to read it in its entirety, but from what I digested, I feel like I got about 50% in about 5% of the time—a tenfold increase in my learning rate. And Stephen got his royalty. Yes, I do appreciate the irony of using a new kind of learning on A New Kind of Science. And letting go of the idea of conquering a book was kind of liberating.

So, what if we look at a particular learning objective in the same way that we manage a large project or program? I am imagining a vision or an objective like “I want to become learned in Digital Philosophy” (one of my particular interests). That vision results in the creation of a backlog of books, papers, blogs, etc. The larger of these (books) are epics and can be broken down into stories, like “Scan contents to get a sense of the material,” “Determine the core messages of the book by finding and reading the key points,” “Understand this author’s view on this particular topic,” and so on. By thinking about learning material this way, it opens up all kinds of new possibilities. For example, maybe there is another way to slice the backlog, such as by topic. If the most important thing to further my overall objective is to understand everything about cellular automata, I would assign higher priority to the stories related to that topic, even if they come from separate sources. So, my learning process takes a different path; one that slices through different material non-linearly.

Lean Startup Learning & Continuous Improvement

In fact, this all feels a bit to me like a lean startup approach to learning in that you can experiment with different chunks of material that may point you in different directions, depending on the outcome of the reading experiment. Having a finer backlog of reading components and being willing to let go of the need to conquer reading material might make possible a much faster path to an ultimate learning objective.

And so I am passing along this idea as an option for those who have a voracious desire to learn in this after-midnight world, but have a before-midnight backlog of reading material.



Intuition & Innovation in the Age of Uncertainty

“My [trading] decisions are really made using a combination of theory and instinct. If you like, you may call it intuition.” – George Soros

“The intellect has little to do on the road to discovery. There comes a leap in consciousness, call it intuition or what you will, and the solution comes to you, and you don’t know how or why.” – Albert Einstein

“The only real valuable thing is intuition.” – Albert Einstein

“Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition.” – Steve Jobs

Have you ever considered why it is that you decide some of the things that you do? Like how to divide your time across the multiple projects or activities that you have at work, how and when to discipline your kids, where to go and what to do on vacation, which car to buy?

The ridiculously slow way to figure these things out is to do an exhaustive analysis on all of the options, potential outcomes and probabilities. This can be extremely difficult when the parameters of the analysis are constantly changing, as is often the case. Such analysis is making use of your conscious mind.

The other option is to use your subconscious mind and make a quick intuitive decision.


We who have been educated in the West, and especially those of us who received our training in engineering or the sciences, are conditioned to believe that “analysis” represents rigorous logical scientific thinking and “intuition” represents new-age claptrap. Analysis good, intuition silly.

This view is quite inaccurate.

Intuition Leads to Quick, Accurate Decisions

According to Gary Klein, ex-Marine, psychologist, and author of the book “The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work,” 90% of the critical decisions that we make are made by intuition in any case. Intuition can actually be a far more accurate and certainly faster way to make an important decision. Here’s why…

The mind is often considered to be composed of two parts – conscious and subconscious. Admittedly, this division may be somewhat arbitrary, but it is also realistic.

The conscious mind is that part of the mind that deals with your current awareness (sensations, perceptions, feelings, fantasies, memories, etc.) Research shows that the information processing rate of the conscious mind is actually very low. Dr. Timothy Wilson from the University of Virginia estimates, in his book “Strangers to Ourselves: Discovering the Adaptive Unconscious,” the conscious mind’s processing capacity to be only 40 bits per second. Tor Nørretranders, author of “The User Illusion”, estimates the rate to be even lower at only 16 bits per second. In terms of the number of items that can be retained at one time by the conscious mind, estimates vary from 4 to 7, with the lower number being reported in a 2008 study by the National Academy of Sciences.

Contrast that with the subconscious mind, which is responsible for all sorts of things: autonomous functions, subliminal perceptions (all of that data streaming into your five sensory interfaces that you barely notice), implicit thought, implicit learning, automatic skills, association, implicit memory, and automatic processing. Much of this can be combined into what we consider “intuition.” Estimates for the information processing capacity and storage capacity of the subconscious mind vary widely, but they are all orders of magnitude larger than their conscious counterparts. Dr. Bruce Lipton, in “The Biology of Belief,” notes that the processing rate is at least 20 Mbits/sec and maybe as high as 400 Gbits/sec. Estimates for storage capacity are as high as 2.5 petabytes.

Isn’t it interesting that the rigorous analysis that we are so proud of is effectively done on a processing system that is excruciatingly slow and has little memory capacity? Whereas, intuition is effectively done on a processing system that is blazingly fast and contains an unimaginable amount of data.
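
Putting the quoted estimates side by side makes the gap vivid. The figures below are simply the ones cited above (Wilson’s 40 bits/s; Lipton’s 20 Mbit/s to 400 Gbit/s):

```python
conscious_bps   = 40       # Wilson's estimate: ~40 bits/s
subconscious_lo = 20e6     # Lipton's lower bound: ~20 Mbit/s
subconscious_hi = 400e9    # Lipton's upper bound: ~400 Gbit/s

print(f"Subconscious throughput is {subconscious_lo / conscious_bps:,.0f}x "
      f"to {subconscious_hi / conscious_bps:,.0f}x the conscious rate")
# -> 500,000x to 10,000,000,000x
```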

In fact, that’s what intuition is – the same analysis that you might consider doing consciously, but doing it instead with access to far more data, such as your entire wealth of experience, and the entire set of knowledge to which you have ever been exposed.

Innovation Is Fueled by Intuition

The importance of intuition only grows exponentially with every year that passes.  Here’s why…

Eddie Obeng is a professor at the School of Entrepreneurship and Innovation, Henley Business School, in the UK. He gave a TED talk which nicely captured the essence of our times in terms of information overload. Figure 1 from that talk demonstrates what we all know and feel is happening to us:

[Figure 1: incoming information rate vs. human capacity to learn, over time – from Eddie Obeng’s TED talk]

The horizontal axis is time, with “now” being all the way to the right. The vertical axis depicts information rate.

The green curve represents the rate at which we humans can absorb information, aka “learn.” It doesn’t change much over time because our biology stays pretty much the same. The red curve represents the rate at which information is coming at us.

Clearly, there was a time in the past when we had the luxury of taking the time necessary to absorb all of the information needed to understand the task or project at hand. If you are over 40, you probably remember working in such an environment.

At some point, however, the incoming data rate exceeded our capacity to absorb it; television news broadcasts with two or three rolling tickers, tabloids, zillions of web sites to scan, Facebook posts, tweets, texts, blogs, social networks, information repositories, big data, etc. In our work place, projects typically have many dependencies on information from other teams, stakeholders, technologies, end users, and leadership, all of which are constantly changing.

It is easy to see that as time goes on, the ratio of unprocessed incoming information to human learning capacity grows exponentially. What this means is that there is increasingly more uncertainty in our world, because we just don’t have the ability to absorb the information needed to be “certain,” like we used to. Some call it “The Age of Uncertainty.” Some refer to the need to be “comfortable with ambiguity.”

This is a true paradigm shift. It demands entirely new ways of doing business, of structuring companies, of planning, of living. In my job, I help companies come to terms with these changes by implementing agile and lean processes, structures, and frameworks in order for them to be more adaptable to the constantly changing environment. Such processes are well suited for the organizational context in any case given that organizations are complex systems (as opposed to “complicated” ones, in Cynefin, or systems theory, parlance). But they are also the only kinds of processes that will be effective in this new environment because they embrace the idea of sensing and responding to change instead of requiring rigorous analysis to establish a predictable plan.

We no longer have time to do the rigorous analysis necessary to make the multitude of decisions with which we are confronted on a daily basis. Instead, we increasingly need to rely on our intuition. But, while we often concentrate our energies on improving specific technical or leadership skills, we rarely consider the idea that perhaps we can make better use of that powerful subconscious mind apparatus by improving the effectiveness of our intuition. It seems to me that this is a significantly missed opportunity, one that deserves more and more of our attention with every passing year.

Intuition Can Be Developed

Sounds as if intuition is a skill that could be very useful to hone, if possible. So how do we develop that capability?  Here are some ideas:

  • Have positive intent and an open mind – The first step to any new idea is to accept it. Think of it as “greasing the learning skids.”
  • Put yourself in situations where you gain more experience about the desired subject(s) – Intuition works best when you have a lot of experiences from which to draw. If you continue to do the same thing over and over, you are not building new experiences.  Therefore, the more you depart from the norm and from your comfort zone, and develop experiences in your area of interest, the more substantial your “intuitive database.”
  • Meditate / develop point-focus – Meditation develops all sorts of interesting personal capabilities, not least of which is an improved capacity to intuit.
  • Go with the first thing that comes to mind – Effectively, you are practicing intuition by doing this. In time, the practice will lead to more effective use of the capability.
  • Notice impressions, connections, coincidences (a journal or buddy may help) – This reinforces the intuitive pathways of the mind. Neuroplasticity is a well-studied phenomenon whereby your thoughts develop incremental neural connections. Reinforcing the positive ones makes them more available for use.
  • 2-column exercises – Another mindfulness technique, these exercises help to raise your awareness of your mental processes, including your subconscious.
  • Visualize success – Think of this as applying the idea of neuroplasticity to build a set of success-oriented neural pathways in your mind.
  • Follow your path – Following a path that feels right to you does two things: First, it puts you into increasingly rewarding situations, generating positive feedback, which helps with all of the above practices. Second, it is simply practicing intuition, but specifically on what your subconscious mind knows are your best decisions.

I am doing many of these practices and finding them to be very valuable.



Subtractive Transformation (or “How Improving a Company is Like Improving a Golf Swing”)

After living overseas for two years and not playing golf the entire time, I returned to the States, joined a golf league, and quickly realized how out of practice I was.  I had always had good luck taking lessons or “tune-ups” from a particular golf pro in Boston, but now I was living in Florida and needed to find someone new.  So, I went to one golf pro, who, upon analyzing my swing, suggested a half dozen things I should be doing.


I got worse.

I went to another pro, who watched my poor excuse for a swing, and promptly suggested a different half dozen things to do.

I got even worse.

Before giving up entirely, I tried yet one more guy.  After watching me fumble through a couple of drives, he said “I don’t know who you have been taking lessons from, but they’ve got your head full of rules and you can’t relax out there – that’s why your swing stinks.  Forget about everything they taught you and just get out there and hit the ball.”  (For those who have seen the movie “Tin Cup”, this advice might sound familiar.)

I got worse.  But then I started to get better.

The lesson I learned from this was the power of simplicity. I was trying too hard to do too much.  Removing the impediments to a decent golf swing was far easier than trying to pile a bunch of better rules on top of the foundation of bad habits.

Having worked with companies who are trying to transform in some way, I find that the same guidelines apply.  Your organization may want to become more nimble, more agile, more innovative, more something.  But it is probably full of organizational structures, processes, attitudes, cultures, and leadership styles that represent impediments to the desired transformation.

The mistake many companies make is to try to pile on the new ideas without removing the impediments, a strategy we might call “additive transformation.”  Generally, the new ideas just won’t fit and you’ll have a mess of conflicts.  For example, you may try the approach of implementing a policy of having your designers, architects, and developers work on something “free of constraints” for 10% of their time (a commonly attempted tactic to bring innovative ideas and fresh thinking to a stale environment).  Unfortunately, those people will quickly run into problems, such as project managers or business stakeholders who are still driving to dates without any consideration for allowing time for investments in innovation or process improvement.  As if they “never got the memo.”  Sentiments like “yes, we are all for innovation, but we also have a business to run and important deadlines to meet” will be interpreted completely differently by folks with different levels of agile maturity and will therefore result in mismatched expectations.

Instead of adding incongruent changes, focus on removing those impediments to agility and innovation, a strategy we might call “subtractive transformation.”  By using thinking tools like “Double Loop Learning” and coupling them with analytical tools like “Current Reality Trees” (from Eliyahu Goldratt’s Theory of Constraints) you can uncover the organizational structures, process, cultural elements, belief traps, and leadership styles that are responsible for those impediments to your new organizational vision.  Once you have discovered the roots of those impediments, it may be an easy step to simply remove them or to brainstorm an “environment design” change that will undo them.

The other advantage to this approach is that you can start from the most logical place – where you are today.   You don’t need to do a “reset” and start with a clean slate in order to achieve a transformation.  All you need to do is simplify and improve, via the removal of bad habits.

Before you know it, you’ll be hitting the ball in the fairway again.



Continuous Planning as an Antidote to the Sunk Cost Fallacy

The Sunk Cost Fallacy, aka “throwing good money after bad”, is the irrational but common human behavior of continuing to do something not because of the return on investment going forward but because it is too painful to feel like you’ve wasted the money already sunk into the endeavor. Such behavior is very common in large companies, with fat product/project portfolios that are difficult to trim due to political reasons. The opportunity costs associated with sunk cost projects can be huge. Imagine what you can do with the money and people that would be freed up from cancelling even one of those projects. By applying some Lean principles to your portfolio planning process, you can avoid sunk cost fiascos and work toward being a more efficient and innovative organization.

Some high profile sunk cost traps include:

The Concorde – The British and French governments continued to fund the “commercial disaster” joint project due mostly to political reasons. [1]

Farmville – “Farmville players are mired in a pit of sunk costs. They can never get back the time or the money they’ve spent, but they keep playing to avoid feeling the pain of loss and the ugly sensation waste creates.” [2]

The Iraq War – Many have interpreted the rationale for continuing the Iraq War past the point of meeting initial objectives as a perfect example of the sunk cost fallacy. [3]

One suggestion for avoiding the sunk cost fallacy, according to John Hammond, is to periodically get opinions from a different set of people [4].  This helps to prevent the tendency to protect past decisions in order to avoid being associated with a failed project.

But the Lean concept of continuous planning may be the best antidote to sunk cost failures.  This is because it provides a framework for continuously inspecting the value of continuing a project.  You can do this with annual planning, but you only have the opportunity to inspect at a predetermined time; so even if you maintain a healthy process of periodically reevaluating the ROI on each project, on average you will continue a project for six months past the point where the ROI turns negative.

Continuous planning also offers a subtler advantage in avoiding sunk cost scenarios.  With annual (or other coarse-grained) planning, decision making tends to be based on dividing up a budget.  Projects in progress almost always get consideration for continuance because people usually only do the ROI estimation for new projects.  However, with continuous planning, it is easier to make the value judgment “is this worth continuing?” a natural part of the process.
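
The cadence arithmetic is easy to sketch: if a project’s ROI turns negative at a uniformly random point between reviews, the expected overrun is half the review interval. The monthly burn rate below is purely hypothetical:

```python
monthly_burn = 250_000  # hypothetical monthly spend on a project whose ROI has gone negative

for review_interval_months in (12, 3, 1):
    avg_overrun_months = review_interval_months / 2  # expected time past the ROI flip
    waste = avg_overrun_months * monthly_burn
    print(f"Review every {review_interval_months:>2} months: ~${waste:,.0f} sunk per doomed project")
# 12 -> ~$1,500,000;  3 -> ~$375,000;  1 -> ~$125,000
```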

  1. Weatherhead, P.J. (1979). “Do Savannah Sparrows Commit the Concorde Fallacy?” Behav. Ecol. Sociobiol. (Springer Berlin) 5(4): 373–381.
  2. http://youarenotsosmart.com/2011/03/25/the-sunk-cost-fallacy/
  3. http://www.freakonomics.com/2007/08/20/more-sunk-cost-thinking-on-iraq-war/
  4. Hammond, John S., Ralph L. Keeney, and Howard Raiffa. “The Hidden Traps in Decision Making.” Harvard Business Review.



An Agile Vacation: Why Agile Coaches Shouldn’t Take Stickies Abroad

Some of the people on my teams like to play a game called “Tease the Agile Coach by Pretending Everything He Does is Agile.”

Last fall I went on a 2-week vacation with my wife in France.  When I returned, of course, the gag was “Did you run your vacation agile style?”  “Did you start every morning with a scrum?”   I love playing along, but as I thought about it, to my horror, there was a lot of “agility” to my vacation, and maybe their teasing wasn’t so far from the truth.

As an example…

We spent three days in Paris and the rest of our time roaming around the country in a car.  It was essentially a sightseeing vacation.  I maintained a big list of things that we intended to see.

A Backlog.

Much of the time, we woke up in one town and the only real objective for the day was making it by nightfall to a town where our next reservation was.

Sprint goal.

While we might have several things we wanted to see and do every day…

Sprint backlog.

…there were always things that got in the way of our plans, like unexpected traffic or bad weather.  Sometimes we simply left a little late, having spent extra time savoring our croissants and café au lait.  As a result, we might have to drop the least important excursions.

Continuous prioritization.

We wouldn’t always have a clear idea of exactly what we wanted to do once we arrived at a new destination, but we usually had no problem coming up with a plan.

Grooming the Backlog.

At one point, we found ourselves in the Alps with inclement weather.  My plan to take the cable car up to the top of Mont Blanc fell apart because they were having snow and whiteout conditions at the top of the mountain and high winds closed the cable car.  So we changed plans and headed south to the Riviera one day early.

Sense and respond.

At the end of the trip, during the TGV ride back to Paris, we reflected on our trip.  I asked my wife if she were to plan the trip all over again, what would she do differently?

Retrospective.

I don’t know whether to be embarrassed or proud.

At least the inside of our car wasn’t covered with stickies.



The Math Behind Agile and Automation

Every once in a while, we encounter individuals on our teams who have a healthy dose of skepticism about these new Agile practices they are learning. For those who tend to be scientifically-minded, or need more evidence than just a good story, I have found that it is important to give them real data to look at.

As an example, it is interesting to combine the usual Cost per Defect curve for a software project with a histogram that maps the probability or frequency of finding defects to the corresponding project phase. The result is mathematical support for both Agile as well as the value of Automation and good Agile QA practices.

Figure A below shows the typical Cost curve for a waterfall project (source: The Economics of Testing, Rice Consulting) along with an overlay showing the probability of finding defects in various phases of the project.

[Figure A: Cost of Fix per Defect – Waterfall]

As can be seen, most defects are found during the testing phase, as one might expect. Using some industry-standard numbers, the average cost to fix per defect is about $490.

Note, however, what happens if you are able to find defects earlier in the process.

[Figure B: Cost of Fix per Defect – Modified Waterfall]

Figure B shows the same cost curve, but the histogram representing the probability of defects found is pushed earlier in the process. The kinds of practices that might produce such a shift include collaborative QA and developer testing, pairing, and automation (which helps prevent defects from being found in the expensive tail of the curve). This doesn’t mean spending more on QA, just utilizing tighter feedback loops that help prevent defects from reaching the later phases of the project. So, even with a waterfall process, or a non-agile iterative process, one can easily see how collaborative testing and automation can reduce the cost of defects considerably, in this case down to $220 per defect.

The Agile cost curve is actually a little different, as shown in Figure C.

[Figure C: Cost of Fix per Defect – Agile, with automation]

There is still the hockey stick effect when the software goes into production, but the rest of the cost curve is flat, since the cost of fixing is pretty much the same from iteration to iteration. The defect frequency histogram is drastically different: it is flattened and spread out across the entire life cycle of the release.

In this model, Agile practices alone, such as sprint-based functional testing and having QA and developers working off of the same requirements, are responsible for about a factor of two improvement in overall cost of defect fixing per release. Automation and good collaborative practices are responsible for another factor of two, which gets the overall cost per defect down to about $130.
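
For readers who want to check the arithmetic, here is a sketch of the expected-cost calculation behind Figures A–C. The per-phase fix costs and defect distributions are hypothetical, tuned only to land near the roughly $490, $220, and $130 figures cited above:

```python
def expected_cost(cost_per_phase, defect_share):
    """Average fix cost: sum over phases of (fix cost there) x (share of defects found there)."""
    return sum(cost_per_phase[p] * defect_share[p] for p in cost_per_phase)

# Hypothetical per-defect fix costs along a waterfall cost curve.
waterfall_cost = {"requirements": 80, "design": 120, "code": 180, "test": 500, "production": 1000}

# Figure A: most defects surface in test.
waterfall_share = {"requirements": 0.05, "design": 0.05, "code": 0.15, "test": 0.60, "production": 0.15}
# Figure B: collaborative QA and automation shift discovery earlier.
shifted_share = {"requirements": 0.25, "design": 0.25, "code": 0.30, "test": 0.15, "production": 0.05}

print(f"Waterfall:          ~${expected_cost(waterfall_cost, waterfall_share):.0f} per defect")  # ~$487
print(f"Modified waterfall: ~${expected_cost(waterfall_cost, shifted_share):.0f} per defect")    # ~$229

# Figure C: agile's flat in-sprint fix cost, plus the production hockey stick.
agile_cost  = {"in_sprint": 90, "production": 1000}
agile_share = {"in_sprint": 0.96, "production": 0.04}
print(f"Agile + automation: ~${expected_cost(agile_cost, agile_share):.0f} per defect")          # ~$126
```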