Agile Accelerate

Leave Nothing on the Table



“Leave Nothing On the Table: Where Your Productivity Goes—And How to Get It Back”

Leveraging Strategy, Metrics, and Agile Practices to Maximize Efficiency

How efficient is your organization?

If you were to estimate how much of your team’s work actually translates into real, tangible outcomes, what would you say—60%? 70%? Most leaders assume that the majority of their team’s efforts drive meaningful results.  

The reality, however, is far more alarming. Studies show that, on average, teams accomplish only 15% of the work they plan—the rest disappears into inefficiencies, misalignment, and hidden bottlenecks.

What if I told you that you could reclaim 10-20% or more of that lost capacity—not by working harder, but by identifying and addressing the key areas where work quietly drains away? By tackling the areas where process inefficiencies hide, organizations can dramatically increase their delivered outcomes: more services and products for customers to engage with, and ultimately more money back to the company.

This white paper explores the seven critical leverage points that allow teams to recover lost work, reduce waste, and maximize productivity. Through a series of assessments and targeted workshops, Agile Accelerate is helping companies across the globe achieve more with existing resources using a targeted approach to find waste and attack it, leaving nothing on the table.

How Did This Happen?

In today’s complex and often siloed business structures, organizations tend to overestimate the efficiency of their operations. While many leaders believe that a significant portion of their team’s efforts directly contributes to organizational goals, the reality is starkly different. Research indicates that a substantial amount of work is lost to various inefficiencies.

A study by McKinsey & Company highlights that many organizations, despite adopting agile practices, are “ultimately doing so to deliver work and not value.” This misalignment between effort and value delivery underscores a critical issue in modern workplaces.

The consequences of such inefficiencies are profound. Organizations face increased costs, delayed project timelines, and diminished employee morale. For instance, a report from the Project Management Institute found that for every $1 billion invested in projects, $122 million was wasted due to poor project performance.

In many cases, engineering efforts are so splintered that even the work to pay down technical debt is hindered by constant context switching to code maintenance issues, operational tasks, and other non-coding work.

Addressing these challenges presents a significant opportunity. By identifying and mitigating these inefficiencies, organizations can enhance productivity, reduce costs, and improve overall performance. Implementing targeted strategies to tackle these issues can lead to substantial gains in efficiency and value delivery.

In the following sections, we will delve deeper into these inefficiencies, providing data-driven insights and practical solutions to help organizations reclaim lost productivity and achieve their strategic objectives. 

Help!

If only 15% of work is truly delivering value, what happens to the rest? The answer lies in seven key areas—common but often overlooked pitfalls that drain capacity, create inefficiencies, and erode the impact of even the most well-intentioned teams. Understanding these areas is the first step toward reclaiming lost work.

  • Dark Work – Tasks that consume time and resources but aren’t tracked or measured, making them invisible to leadership and impossible to optimize.
  • Low-Value Stories – Work that gets done but contributes little to strategic goals, often due to unclear requirements or poor stakeholder alignment.
  • Overhead – Administrative burdens, excessive meetings, reporting, and other non-value-adding activities that slow execution.
  • Misalignment – When teams and leadership operate with different priorities, causing duplication, conflicting efforts, and wasted resources.
  • DOWNTIME – Delays, handoffs, waiting for approvals, and other process inefficiencies that stretch timelines and reduce output.
  • Poor Prioritization – Teams spread thin across too many initiatives, leading to context switching, burnout, and a lack of meaningful progress.
  • Technical Debt – Short-term fixes and neglected system improvements that create long-term inefficiencies, making future work slower and costlier.

Each of these areas represents a recoverable loss—a place where small, targeted improvements can unlock 10-20% or more of lost capacity. By systematically addressing these issues, organizations can shift from merely working harder to working smarter, ensuring more of their effort translates into meaningful, high-value results.

What’s In It For Me?

At Agile Accelerate, we first work with organizations to assess what is currently being left on the table. The picture looks different for every business, but many of the root causes are shared across industries.

Our company then crafts custom and targeted workshops that directly address the work loss challenges described above, along with their benefits and implications. In the past, we have offered services focused on giving the organization 100% control over the work committed to and delivered. Example workshops we have hosted include:

1. Facilitated Strategy Mapping

  • Relevance to Work Loss: Many organizations struggle with inefficiencies because their teams lack clear alignment on objectives, critical success factors, and key activities. This leads to wasted effort on low-impact work.
  • Benefits: This workshop helps teams refine their vision and create structured success factors, ensuring that efforts are focused on delivering actual value rather than just completing tasks.
  • Implications: Companies that engage in strategic mapping can reduce misalignment, minimize duplicated efforts, and significantly improve the impact of their initiatives.

2. Metrics Audit and Strategy

  • Relevance to Work Loss: Organizations often measure activity rather than value, leading to an overemphasis on busy work instead of meaningful outcomes.
  • Benefits: By assessing gaps in current metrics and aligning them with value delivery, this workshop helps teams track progress in a way that drives impact rather than just motion.
  • Implications: Businesses that refine their measurement strategies can better identify inefficiencies, eliminate unnecessary work, and redirect focus toward high-value outcomes.

3. Value Stream Mapping Workshop

  • Relevance to Work Loss: Inefficiencies in workflow, handoffs, and bottlenecks often lead to excessive delays and wasted effort across teams.
  • Benefits: This workshop enables organizations to visualize their end-to-end delivery process, pinpoint waste, and create actionable plans for improvement.
  • Implications: By eliminating bottlenecks and streamlining workflows, organizations can enhance productivity, shorten cycle times, and improve overall efficiency.

4. Facilitated Root Cause Analysis (RCA)

  • Relevance to Work Loss: Many organizations implement surface-level fixes without addressing the systemic issues causing inefficiencies, leading to recurring problems.
  • Benefits: RCA helps teams methodically analyze and identify the true sources of inefficiencies, ensuring solutions are impactful and long-lasting.
  • Implications: Companies that perform root cause analysis can prevent repetitive failures, improve problem-solving capabilities, and reduce wasted time spent on rework.

These workshops provide concrete, actionable ways to tackle inefficiencies and reclaim lost productivity, ensuring that organizations are focusing on meaningful, value-driven work. 

Act Now! 

It’s time to leave nothing on the table and unlock your organization’s Full Potential.

The hidden inefficiencies in your organization—misaligned strategies, untracked metrics, fragmented collaboration, and unoptimized workflows—are costing you valuable time and resources. Agile Accelerate interactive workshops are designed to uncover and address these obstacles, empowering your teams to align on strategy, work smarter, and drive real, measurable impact.

Our sessions provide actionable insights and practical tools that help organizations streamline operations, enhance collaboration, and maximize value delivery. Whether you’re struggling with ineffective planning, lack of alignment, or missed opportunities for growth, our expert facilitators will guide you through targeted solutions that create lasting transformation.

The cost of inefficiency is too high to ignore. Let’s work together to reclaim lost capacity, strengthen your team’s agility, and ensure nothing is left on the table. 

Contact Agile Accelerate today to explore how our workshops can drive immediate and lasting improvements in your organization.

Schedule a consultation now and take the first step toward transformational success!



Deliver Features to Customers 5 Times Faster

Does it seem to take a long time for your company to get a feature out the door? Have you done a “Five Whys” on the problem? Does one of the root causes seem to be the number of dependencies between teams, or “steps” that it takes to get something out the door? Chances are that it has something to do with the way your teams are structured.

The debate between feature teams and component teams is as old as Scrum itself (which is getting to be Keith Richards territory). It usually boils down to:

  • Component teams are more efficient because a small number of SMEs who know the code inside out are the most effective to work on that part of the code. Anyone else interfering in the code will make a mess and create technical debt.

And the discussion about possible restructuring ends there.

It ends because there is also often a hidden reason why leaders don’t want to move to feature teams. If pressed, the following concerns may emerge (if not pressed, they will stay under the radar):

  • “As a leader, I am defined by the component I own, and the team(s) that work on it. I am also a SME in that component, which gives me job security. Moving to a feature team model would be threatening – Would I have a team? Would my people be scattered across multiple teams? How would I have time to attend all of those teams’ meetings? Would I still own a component?” And so on.

We will address these concerns later. But first, let’s look at a real case study…

I was a transformation coach at a large software company that was struggling with exactly this issue. Minor customer requests or enhancements seemed to take forever to deliver. So I worked with a colleague to do a deep dive into the problem. We selected a simple feature enhancement that took twelve weeks to deliver once the work began and inspected every Jira ticket that related to it. The following graphic shows the dependencies between all of the teams that were involved in the solution. It is blurred intentionally of course.

Each swim lane represented a different team – front end, back end, UX, globalization, various components, etc. What was fascinating about this investigation was that the aggregate amount of time expended by all of the teams was six times what a single team would have needed. Not because any of the teams were inefficient – on the contrary, they worked their tickets efficiently. But if a team’s part only took a day or two, the work then sat in a queue waiting for the next team to pick it up. And that was where the end-to-end inefficiency was.

I used the Flow Efficiency metric to get a sense of the scale of the problem. Flow Efficiency is defined as Value-added Time divided by Total Elapsed Time, expressed as a percentage. In this case, the elapsed time was 6 sprints, or 12 weeks (60 working days), and the aggregate value-added effort was about 60 staff-days. This metric is usually calculated on a per-team basis, which is straightforward for a single team; but when multiple teams contribute to the value-added time, Flow Efficiency is overstated. One can divide by the number of teams, which in this case would have given us a Flow Efficiency of just 8% or so, but I think that is misleading. So we used a modified metric: effectively, “How long COULD it have taken with a single cross-functional team?” versus “How long did it actually take?” With that metric, the Flow Efficiency was 16%.
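Here is a minimal sketch of that arithmetic. The team count and the single-team duration are illustrative assumptions chosen to reproduce the rough percentages above; they are not precise figures from the case study.

```python
# Illustrative flow-efficiency arithmetic for the case study above. The team count
# and the single-team estimate are assumptions chosen to match the rough percentages.

elapsed_days = 60            # 6 sprints = 12 weeks of working days
value_added_days = 60        # aggregate hands-on effort across all contributing teams
num_teams = 12               # assumed number of teams that touched the feature

naive = value_added_days / elapsed_days            # 100% -- clearly overstated
per_team = naive / num_teams                       # ~8% -- divide by the team count

# Modified metric: how long a single cross-functional team could have taken,
# versus how long it actually took (assumed ~10 working days -> ~16%)
single_team_days = 9.6
modified = single_team_days / elapsed_days         # 16%

print(f"naive: {naive:.0%}, per-team: {per_team:.0%}, modified: {modified:.0%}")
```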

This data was presented to senior leadership and was considered to be very eye opening. The obvious solution would be to create cross-functional feature teams. Of course, there were concerns:

  • New teams accessing code bases they would be unfamiliar with (of course, this points to deeper issues, like lack of automation and existing technical debt)
  • Loss of ownership of components and people (as mentioned above)
  • Some features might not lend themselves to the approach
  • A general concern about disrupting the organization

To make a long story short, we took an approach that addressed all of the concerns, which consisted of these elements:

  • Start small with a pilot program – we created 6 new teams, which represented only 10% of the org
  • Have a change management process to onboard middle managers and team managers
  • Designate SMEs on the component teams to reserve some bandwidth to help people who are new to the code base with software design decisions
  • Implement a fast-feedback continuous improvement program to quickly address concerns and resolve issues
  • Establish measurable metrics to represent the goals we wanted to achieve, along with checks and balances

The metrics chosen were:

  • Flow Efficiency, defined as described above (this is like a lagging KR in OKR parlance to measure the key objective). Because we expected some initial challenges, we also measured how fast the teams ramped up to a steady state level of flow efficiency.
  • Subject Matter Expertise, to measure how quickly new developers got up to speed on unfamiliar components (leading KR)
  • Team satisfaction (balancing KR)

Of course, there were bumps along the way, but by all indicators, the program was very successful and Flow Efficiency was improved by a factor of 5. Across six different initiatives, average flow efficiency for the duration of the pilot was 80%. Even better, the teams ramped up to an average flow efficiency of 89% and this was done fairly quickly – an average of 1.3 sprints, or 2-3 weeks. 

Average team satisfaction increased by 30% over a period of six months, mostly because developers got out of their rut and learned new things. Subject matter expertise improved by 38% on an annualized basis.

Details of the methodology, practices, measurements, and learnings were presented in a white paper called “Winning the Concept to Cash Game with Feature Teams” at the 2021 XP Conference by Martina Ziegenfuss and myself.

Not unlike a stock investment disclaimer, actual results may vary and the variations may be substantial. But if you would like to deliver features to your customers five times faster, there is definitely value in considering an approach such as this.



Extending Cross-Functionality to Programs

There is an excellent rationale for cross-functional teams.  For large programs, that rationale can be easily scaled to the program-level.  But, for some reason, this isn’t always recognized.

TEAM CROSS-FUNCTIONALITY

Let’s say you have a team with the following profile of highly siloed individuals:

[Figure: skill profile of a highly siloed team]

This is great if you have a profile of stories that fits it perfectly, as follows:

[Figure: a set of sprint stories whose skill profile matches the team exactly]

But what if your set of sprint stories looks more like this?:

[Figure: a set of sprint stories whose skill profile does not match the team]

In this case, you have a deficiency of analysts, back-end developers, and QA people to implement the stories that your aggregate team capacity might otherwise support.  And, your UX folks and front-end developers will be twiddling their thumbs for part of the sprint.

So, what to do?

Since you are only as good as your lowest capacity component (which appears to be QA, from this particular example), you will have to scale back the number of stories to fit the profile, as shown:

[Figure: stories scaled back to fit the lowest-capacity skill]

Now, everyone is underutilized, except for QA.  Good luck finding something useful for everyone else to do in the next two weeks.

The net result is that your team is perhaps 30% less productive than it could be (eyeballing the graphic).
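Since the original charts are not reproduced here, a small numeric sketch may help illustrate the same point; the capacities and demands below are made up for illustration, not taken from the figures.

```python
# Hypothetical sketch of why a siloed team is capped by its scarcest skill.
# Capacities and demands (in person-days per sprint) are invented for illustration.

capacity = {"analysis": 10, "ux": 8, "front_end": 12, "back_end": 14, "qa": 6}
demand   = {"analysis": 12, "ux": 5, "front_end": 8,  "back_end": 16, "qa": 10}

# With rigid silos, the share of planned stories the team can finish is capped
# by the most constrained skill (QA here: 6/10 = 60%).
completable = min(capacity[s] / demand[s] for s in demand)

useful_work = completable * sum(demand.values())     # person-days that actually ship
utilization = useful_work / sum(capacity.values())   # ~61% -> roughly a third of capacity idle

print(f"stories completable: {completable:.0%}, team utilization: {utilization:.0%}")
```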

However, if you take advantage of standard cross-functional teamwork, your team’s profile may look something like this:

[Figure: skill profile of a cross-functional team with overlapping capabilities]

Note that by “cross-functional” we do not mean that everyone should be able to do anything.  There are very valid reasons (education, experience, proclivity, enthusiasm) why certain people are ideally suited for certain kinds of work.  Think of the cross-functional nature of someone’s role on a team as a bell curve (alternatively, some talk about T-shaped employees – the T is just the bell curve upside down, as the Y-axis orientation is arbitrary).  The more the curve is spread out, the more they are able to take on other people’s roles.  On a good cross-functional team, the bell curves overlap “somewhat,” meaning that everyone can take on a little bit of someone  else’s role, although perhaps not as efficiently.  Still, this allows a team to take on a wide variety of “profiles” of sprint work, as will always be necessary.

So, for example, in the case above,

[Figure: the cross-functional team profile from above]

people will adjust to the desired “sprint needs” profile as follows:

[Figure: team members flexing to cover the sprint-needs profile]

PROGRAM LEVEL CROSS-FUNCTIONALITY

Don’t forget that this model can be applied to more than just teams.

For example, there can be a tendency for teams to develop “specific expertise”, due perhaps to knowledge held by certain BSAs or specific architectural or design skills in the development team.  The program may then tend to assign stories based on this expertise, under the theory that this is the most efficient way to get work done. Unfortunately, this only drives each team further into a functional silo.  It can become a vicious spiral, and soon you may hear things like “well, we don’t have generic teams and, at this point, the schedule is paramount, so we need to keep assigning program work according to the team best suited to do it.”  As a result, program backlogs will consist of stories pre-targeted to specific teams, even arbitrarily far out in time.  Imagine what happens when the stakeholders decide to re-prioritize epics or add new features, or a new dependency arises that doesn’t line up with the ideal team at the right time.  The result will be a work profile that doesn’t match the “team profile,” as follows:

[Figure: program work profile that no longer matches the specialized team profiles]

Enter a cadre of fix-it people – project managers, oversight groups, resource managers, program managers – all trying to re-balance the backlog, shuffling stories around, adding people to teams, squeezing some teams to do more work, while other teams sit comparatively idle and end up assigned low-value filler work.  It is the same wasteful resource management nightmare that is so easily solved by cross-functional teams, except this time at the program level.

So, eliminate the waste by following these simple program-level guidelines:

  1. Create a fully prioritized program backlog without consideration for the teams that will be executing the stories.
  2. Once per sprint, have a program planning session or meta-scrum (Uber-PO, Uber-SM, team representatives) where the candidate stories for the upcoming sprint are identified for each team.  Include a little more than each team’s velocity would otherwise indicate in case they are able to take on more than their average.
  3. Make it a goal to avoid specializing teams.

All team “profiles” will be identical and program needs can easily be accommodated.

[Figure: identical, cross-functional team profiles absorbing any program work profile]

There may be a little bit of short term inefficiency resulting from having the “slightly less than ideal” team work on particular stories, but the more you do this, the more that inefficiency evaporates.  And the advantages are significant:

  • A holistic view of the program backlog allows you to focus on what is important – delivering value
  • No need to engage the expensive swat team of fix-it managers to shuffle around people and project artifacts
  • All team members gain experience and learning, often resulting in greater job satisfaction, and higher performing teams
  • No more single point of failure; no more critical path team
  • Far less chaos and confusion, resulting in more focused individuals
  • Extremely easy to manage – program progress is measured by the simple rate at which all teams work through the stories.  Any gaps between targeted scope and expected scope are easy to identify.



An Experiment in Learning, Agile & Lean Startup Style

I always have a backlog of non-fiction books to read. Given the amount of free time that I have every day, I am guessing that it may be years before I get through them. In fact, the rate at which books get added to my backlog probably exceeds my learning velocity, creating an ever-increasing gap. It feels like a microcosm of Eddie Obeng’s “world after midnight.”

So what to do?

I am trying to increase my velocity by applying speed reading techniques. But so far, that is probably only closing a small percentage of the gap.

Iterative Learning

Then, upon a bit of soul searching, I had an epiphany. Why do I feel the need to read and understand every single word on every single page? This runs counter to what we coach our teams to do—eliminate waste, document only what makes sense, work just-in-time, and think iteratively rather than only incrementally. The answer seemed to be that I don’t feel that I have really read the book if I haven’t read every word. So what? Am I trying to conquer the thing? It seems like a very egocentric point of view.

What if I was able to let go of the ego, and try to read a book iteratively instead of incrementally? Is it even possible? Would it be effective? There are all sorts of ways to tell stories or build products—top-down, bottom-up, inside-out—each of which have their strong points. Sometimes it is most effective, for instance, to grab the user’s attention by initially giving them a nugget that might logically be placed in the middle of a narrative, and then providing necessary foundation, or by filling in the gaps as necessary. Could one apply the same process to learning from a book? I could imagine scanning through a book randomly, stopping at points that looked interesting and digesting a bit—much like I used to do with encyclopedias as a kid. Or, maybe, first reviewing the TOC for areas of interest, jumping to those sections, absorbing a bit, and then searching for any context that was missing.  This would be a completely different way to learn from a book. I couldn’t call it reading, and don’t have a good term for it, other than a new kind of learning.

This led me to thinking a little more deeply about what I am trying to get out of reading; the learning aspect of it. What if I could scan a book in a tenth of the time that it took to read it, but retain half of the content? Would that be an improvement? There seems to be some sort of formula that I am trying to maximize, like dl/dt = C·V·R: the rate of learning equals the “learn-worthy” content of the book, multiplied by the speed at which I scan it, multiplied by the percentage that I retain. Is the percent retained equal to the percent value obtained? Do I get half the potential value of a book if I retain half as much? I could simply define R to be the percent value and my equation still holds. Something in the back of my mind says that it is really sad to look at learning this way. Something else says I am on to something.
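As a minimal sketch of that formula, using the hypothetical numbers above (scanning in a tenth of the time and retaining half the content), with the book’s “learn-worthy” content normalized to 1:

```python
# A minimal sketch of the learning-rate comparison above; purely illustrative.

def relative_learning_rate(time_fraction, retention, content=1.0):
    """Learning per unit time, relative to reading the whole book start to finish."""
    return (content * retention) / time_fraction

baseline = relative_learning_rate(1.0, 1.0)     # read every word: 1x
scanned = relative_learning_rate(0.10, 0.5)     # a tenth of the time, half the retention: 5x

print(f"Scanning yields about {scanned / baseline:.0f}x the learning rate of reading cover to cover")
```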

Of course, there are all kinds of nuances.  For example, some books build upon a foundation which must be well understood to get any value at all out of the latter sections of the book.  For others, it may be easier to skip around. Some, you may be able to get value out of scanning the TOC, or the subheadings, digesting the graphics, or just reading the intros and summaries of each chapter; for others, not so much.  Hence, in a sense, different books have different learning profiles.

The Experiment

I was intrigued enough to attempt this on a book near the top of my backlog: Stephen Wolfram’s A New Kind of Science, a 1280-page tome that took him ten years to write. So I did it. I didn’t “read” it. I iterated through it and digested some of it. And I can honestly say that, for this particular book, I optimized my learning rate equation significantly. I can’t be sure of the total potential value that the book would have to me were I to read it in its entirety, but from what I digested, I feel like I got about 50% in about 5% of the time—a tenfold increase in my learning rate. And Stephen got his royalty. Yes, I do appreciate the irony of using a new kind of learning on A New Kind of Science. And letting go of the idea of conquering a book was kind of liberating.

So, what if we look at a particular learning objective in the same way that we manage a large project or program? I am imagining a vision or an objective like “I want to become learned in Digital Philosophy” (one of my particular interests.) That vision results in the creation of a backlog of books, papers, blogs, etc. The larger of these (books) are epics and can be broken down into stories, like “Scan contents to get a sense of the material,” “Determine the core messages of the book by finding and reading the key points,” “Understand this author’s view on this particular topic,” and so on. By thinking about learning material this way, it opens up all kinds of new possibilities. For example, maybe there is another way to slice the backlog, such as by topic. If the most important thing to further my overall objective is to understand everything about cellular automata, I would assign higher priority to the stories related to that topic, even if they come from separate sources. So, my learning process takes a different path; one that slices through different material non-linearly.

Lean Startup Learning & Continuous Improvement

In fact, this all feels a bit to me like a lean startup approach to learning in that you can experiment with different chunks of material that may point you in different directions, depending on the outcome of the reading experiment. Having a finer backlog of reading components and being willing to let go of the need to conquer reading material might make possible a much faster path to an ultimate learning objective.

And so I am passing along this idea as an option for those who have a voracious desire to learn in this after-midnight world, but have a before-midnight backlog of reading material.



Intuition & Innovation in the Age of Uncertainty

“My [trading] decisions are really made using a combination of theory and instinct. If you like, you may call it intuition.” – George Soros

“The intellect has little to do on the road to discovery. There comes a leap in consciousness, call it intuition or what you will, and the solution comes to you, and you don’t know how or why.” – Albert Einstein

“The only real valuable thing is intuition.” – Albert Einstein

“Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition.” – Steve Jobs

Have you ever considered why it is that you decide some of the things that you do? Like how to divide your time across the multiple projects or activities that you have at work, how and when to discipline your kids, where to go and what to do on vacation, which car to buy?

The ridiculously slow way to figure these things out is to do an exhaustive analysis on all of the options, potential outcomes and probabilities. This can be extremely difficult when the parameters of the analysis are constantly changing, as is often the case. Such analysis is making use of your conscious mind.

The other option is to use your subconscious mind and make a quick intuitive decision.


We who have been educated in the West, and especially those of us who received our training in engineering or the sciences, are conditioned to believe that “analysis” represents rigorous logical scientific thinking and “intuition” represents new-age claptrap. Analysis good, intuition silly.

This view is quite inaccurate.

Intuition Leads to Quick, Accurate Decisions

According to Gary Klein, ex-Marine, psychologist, and author of the book “The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work,” 90% of the critical decisions that we make are made by intuition in any case. Intuition can actually be a far more accurate and certainly faster way to make an important decision. Here’s why…

The mind is often considered to be composed of two parts – conscious and subconscious. Admittedly, this division may be somewhat arbitrary, but it is also realistic.

The conscious mind is that part of the mind that deals with your current awareness (sensations, perceptions, feelings, fantasies, memories, etc.) Research shows that the information processing rate of the conscious mind is actually very low. Dr. Timothy Wilson from the University of Virginia estimates, in his book “Strangers to Ourselves: Discovering the Adaptive Unconscious,” the conscious mind’s processing capacity to be only 40 bits per second. Tor Nørretranders, author of “The User Illusion”, estimates the rate to be even lower at only 16 bits per second. In terms of the number of items that can be retained at one time by the conscious mind, estimates vary from 4 – 7, with the lower number being reported in a 2008 study by the National Academy of Sciences.

Contrast that with the subconscious mind, which is responsible for all sorts of things: autonomous functions, subliminal perceptions (all of that data streaming into your five sensory interfaces that you barely notice), implicit thought, implicit learning, automatic skills, association, implicit memory, and automatic processing. Much of this can be combined into what we consider “intuition.” Estimates for the information processing capacity and storage capacity of the subconscious mind vary widely, but they are all orders of magnitude larger than their conscious counterparts. Dr. Bruce Lipton, in “The Biology of Belief,” notes that the processing rate is at least 20 Mbits/sec and maybe as high as 400 Gbits/sec. Estimates for storage capacity are as high as 2.5 petabytes.

Isn’t it interesting that the rigorous analysis that we are so proud of is effectively done on a processing system that is excruciatingly slow and has little memory capacity? Whereas, intuition is effectively done on a processing system that is blazingly fast and contains an unimaginable amount of data.

In fact, that’s what intuition is – the same analysis that you might consider doing consciously, but doing it instead with access to far more data, such as your entire wealth of experience, and the entire set of knowledge to which you have ever been exposed.

Innovation Is Fueled by Intuition

The importance of intuition only grows exponentially with every year that passes.  Here’s why…

Eddie Obeng is a Professor at the School of Entrepreneurship and Innovation, Henley Business School, in the UK. He gave a TED talk which nicely captured the essence of our times in terms of information overload. Figure 1 from that talk demonstrates what we all know and feel is happening to us:

[Figure 1: rate of incoming information versus human learning capacity over time, from Eddie Obeng’s TED talk]

The horizontal axis is time, with “now” being all the way to the right. The vertical axis depicts information rate.

The green curve represents the rate at which we humans can absorb information, aka “learn.” It doesn’t change much over time because our biology stays pretty much the same. The red curve represents the rate at which information is coming at us.

Clearly, there was a time in the past, where we had the luxury of being able to take the necessary time to absorb all of the information necessary to understand the task, or project at hand. If you are over 40, you probably remember working in such an environment.

At some point, however, the incoming data rate exceeded our capacity to absorb it; television news broadcasts with two or three rolling tickers, tabloids, zillions of web sites to scan, Facebook posts, tweets, texts, blogs, social networks, information repositories, big data, etc. In our work place, projects typically have many dependencies on information from other teams, stakeholders, technologies, end users, and leadership, all of which are constantly changing.

It is easy to see that as time goes on, the ratio of unprocessed incoming information to human learning capacity grows exponentially. What this means is that there is increasingly more uncertainty in our world, because we just don’t have the ability to absorb the information needed to be “certain,” like we used to. Some call it “The Age of Uncertainty.” Some refer to the need to be “comfortable with ambiguity.”

This is a true paradigm shift. It demands entirely new ways of doing business, of structuring companies, of planning, of living. In my job, I help companies come to terms with these changes by implementing agile and lean processes, structures, and frameworks so that they can be more adaptable to the constantly changing environment. Such processes are well suited to the organizational context in any case, given that organizations are complex systems (as opposed to merely “complicated” ones, in Cynefin or systems-theory parlance). But they are also the only kinds of processes that will be effective in this new environment, because they embrace the idea of sensing and responding to change instead of requiring rigorous analysis to establish a predictable plan.

We no longer have time to do the rigorous analysis necessary to make the multitude of decisions with which we are confronted on a daily basis. Instead, we increasingly need to rely on our intuition. But, while we often concentrate our energies on improving specific technical or leadership skills, we rarely consider the idea that perhaps we can make better use of that powerful subconscious mind apparatus by improving the effectiveness of our intuition. It seems to me that this is a significantly missed opportunity, one that deserves more and more of our attention with every passing year.

Intuition Can Be Developed

Sounds as if intuition is a skill that could be very useful to hone, if possible. So how do we develop that capability?  Here are some ideas:

  • Have positive intent and an open mind – The first step to any new idea is to accept it. Think of it as “greasing the learning skids.”
  • Put yourself in situations where you gain more experience about the desired subject(s) – Intuition works best when you have a lot of experiences from which to draw. If you continue to do the same thing over and over, you are not building new experiences.  Therefore, the more you depart from the norm and from your comfort zone, and develop experiences in your area of interest, the more substantial your “intuitive database.”
  • Meditate / develop point-focus – Meditation develops all sorts of interesting personal capabilities, not least of which is an improved capacity to intuit.
  • Go with the first thing that comes to mind – Effectively, you are practicing intuition by doing this. In time, the practice will lead to more effective use of the capability.
  • Notice impressions, connections, coincidences (a journal or buddy may help) – This reinforces the intuitive pathways of the mind. Neuroplasticity is a well-studied phenomenon whereby your thoughts develop incremental neural connections. Reinforcing the positive ones makes them more available for use.
  • 2-column exercises – Another mindfulness technique, these exercises help to raise your awareness of your mental processes, including your subconscious.
  • Visualize success – Think of this as applying the idea of neuroplasticity to build a set of success-oriented neural pathways in your mind.
  • Follow your path – Following a path that feels right to you does two things: First, it puts you into increasingly rewarding situations, generating positive feedback, which helps with all of the above practices. Second, it is simply practicing intuition, but specifically on what your subconscious mind knows are your best decisions.

I am doing many of these practices and finding them to be very valuable.



Lean Time Management: Are You a Slave to Someone’s Calendar?

When you get to work in the morning, what is the first thing that you do? I mean, after getting the cup of coffee and cinnamon-swirl danish.

Do you check your calendar to see what meetings you have to attend throughout the day? Did you schedule all of those meetings? If not, your time is being largely managed by other people.

Do you feel that your attendance at those meetings constitutes the greatest value that you can be providing for your company today? If not, are you satisfied with that fact, day after day?

[Photo: too many meetings]

The Problem With Too Many Meetings

I am not suggesting that someone else’s meetings are not important, or that your attendance at those meetings isn’t valuable. But all too often, there are significant dysfunctions that result from a meeting-oriented culture:

  • Jumping from meeting to meeting is context shifting, and each context change introduces waste; some studies suggest 20% or more.2
  • Meetings are generally aligned to hour or half-hour boundaries and tend to last exactly as long as they are scheduled. Does it really make sense that for a 1-hour meeting we always have exactly 1 hour of material to discuss? If there was really only 45 minutes of useful discussion that expanded to fill the time allotted, that’s another 25% waste. If there was really 75 minutes of useful discussion to be had and it was cut short by 15 minutes, the waste is worse: the topic must wait another week (or however long until the recurring or newly scheduled follow-up meeting) to make progress, and, unless your attendees are robots, they will have forgotten most of the discussion and conclusions by then, so you will all waste time getting back on the same page. (A rough sketch of this arithmetic follows this list.)
  • Many meetings are perfunctory. They recur weekly or biweekly, regardless of the optimal frequency.  Perhaps they have been on the calendar for a long time and their usefulness is declining, in which case it would make sense to reduce the meeting’s duration or frequency, but this is rarely done until more time is wasted.
  • Oftentimes, the meeting has the wrong people. One attendee too many is wasteful. One too few is wasteful to the rest of the attendees. You can easily tell when you are in a useless, perfunctory meeting, or one with too many attendees, because many of the people will be processing other work on their laptops, only half listening. More waste.
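To make the argument concrete, here is a rough back-of-the-envelope sketch. The meeting load is an assumption for illustration; the 20% and 25% figures come from the list above.

```python
# Back-of-the-envelope estimate of weekly meeting waste, using the figures above.
# The meeting load (10 one-hour meetings per week) is an assumed, illustrative number.

meetings_per_week = 10
meeting_hours = 1.0
useful_fraction = 45 / 60       # ~45 minutes of real content per scheduled hour -> 25% padding
context_switch_loss = 0.20      # ~20% of an hour lost re-focusing around each meeting (note 2)

padding_waste = meetings_per_week * meeting_hours * (1 - useful_fraction)    # 2.5 hours/week
switching_waste = meetings_per_week * meeting_hours * context_switch_loss    # 2.0 hours/week

print(f"Roughly {padding_waste + switching_waste:.1f} hours lost per week "
      f"out of {meetings_per_week * meeting_hours:.0f} scheduled meeting hours")
```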

Considering all of these potential dysfunctions, I am simply suggesting the possibility of adopting a different mindset when you go to work: a mindset driven by the lean principle of eliminating waste.

Lean Time Management

What if you planned each day around the objective of providing the greatest impact that you can possibly provide that day?

Might you find yourself with more time to make a difference?

How would your job satisfaction change?

If everyone adopted that mentality, do you think your company would make greater progress toward its objectives?

What about collaboration – doesn’t this approach run counter to that core agile principle?

Assuming that all collaborators have a similar view of the value of the issue at hand, they will all be meeting the criteria for this principle. And, clearly, if you collaborated as needed and not as scheduled, you would be more efficient.

If you think your department, organization, or company is ready for a meeting reset, consider doing a meeting audit. Or, even better, wipe the calendar clean and start over with a new mindset. I’ll bet you’ll see a positive change.

Notes:

  1. Photo courtesy Creative Commons License, @boetter’s photostream
  2. Original source: Weinberg, Gerald. Quality Software Management: Systems Thinking, Dorset House, 1991.



Subtractive Transformation (or “How Improving a Company is Like Improving a Golf Swing”)

After living overseas for two years and not playing golf the entire time, I returned to the states, joined a golf league, and quickly realized how out of practice I was.  I had always had good luck taking lessons or “tune ups” from a particular golf pro in Boston, but now I was living in Florida, and needed to find someone new.  So, I went to one golf pro, who upon analyzing my swing, suggested a half dozen things I should be doing.


I got worse.

I went to another pro, who watched my poor excuse for a swing, and promptly suggested a different half dozen things to do.

I got even worse.

Before giving up entirely, I tried yet one more guy.  After watching me fumble through a couple of drives, he said “I don’t know who you have been taking lessons from, but they’ve got your head full of rules and you can’t relax out there – that’s why your swing stinks.  Forget about everything they taught you and just get out there and hit the ball.”  (For those who have seen the movie “Tin Cup”, this advice might sound familiar.)

I got worse.  But then I started to get better.

The lesson I learned from this was the power of simplicity. I was trying too hard to do too much.  Removing the impediments to a decent golf swing was far easier than trying to pile a bunch of better rules on top of a foundation of bad habits.

Having worked with companies who are trying to transform in some way, I find that the same guidelines apply.  Your organization may want to become more nimble, more agile, more innovative, more something.  But it is probably full of organizational structures, processes, attitudes, cultures, and leadership styles that represent impediments to the desired transformation.

The mistake many companies make is to try to pile on the new ideas without removing the impediments, a strategy we might call “additive transformation.”  Generally, the new ideas just won’t fit and you’ll have a mess of conflicts.  For example, you may try the approach of implementing a policy of having your designers, architects, and developers work on something “free of constraints” for 10% of their time (a commonly attempted tactic to bring innovative ideas and fresh thinking to a stale environment).  Unfortunately, those people will quickly run into problems, such as project managers or business stakeholders who are still driving to dates without any consideration for allowing time for investments in innovation or process improvement.  As if they “never got the memo.”  Sentiments like “yes, we are all for innovation, but we also have a business to run and important deadlines to meet” will be interpreted completely differently by folks with different levels of agile maturity and will therefore result in mismatched expectations.

Instead of adding incongruent changes, focus on removing those impediments to agility and innovation, a strategy we might call “subtractive transformation.”  By using thinking tools like “Double Loop Learning” and coupling them with analytical tools like “Current Reality Trees” (from Eliyahu Goldratt’s Theory of Constraints) you can uncover the organizational structures, process, cultural elements, belief traps, and leadership styles that are responsible for those impediments to your new organizational vision.  Once you have discovered the roots of those impediments, it may be an easy step to simply remove them or to brainstorm an “environment design” change that will undo them.

The other advantage to this approach is that you can start from the most logical place – where you are today.   You don’t need to do a “reset” and start with a clean slate in order to achieve a transformation.  All you need to do is simplify and improve, via the removal of bad habits.

Before you know it, you’ll be hitting the ball in the fairway again.



Continuous Planning as an Antidote to the Sunk Cost Fallacy

The Sunk Cost Fallacy, aka “throwing good money after bad”, is the irrational but common human behavior of continuing to do something not because of the return on investment going forward but because it is too painful to feel like you’ve wasted the money already sunk into the endeavor. Such behavior is very common in large companies, with fat product/project portfolios that are difficult to trim due to political reasons. The opportunity costs associated with sunk cost projects can be huge. Imagine what you can do with the money and people that would be freed up from cancelling even one of those projects. By applying some Lean principles to your portfolio planning process, you can avoid sunk cost fiascos and work toward being a more efficient and innovative organization.

Some high profile sunk cost traps include:

The Concorde – The British and French governments continued to fund the “commercial disaster” joint project due mostly to political reasons.1

Farmville – “Farmville players are mired in a pit of sunk costs. They can never get back the time or the money they’ve spent, but they keep playing to avoid feeling the pain of loss and the ugly sensation waste creates.” 2

The Iraq War – Many have interpreted the rationale for continuing the Iraq War past the point of meeting initial objectives as a perfect example of sunk cost fallacy. 3

One of the suggestions to avoid the sunk cost fallacy is to get opinions from a different set of people periodically, according to John Hammond.4  This helps to prevent the tendency to protect past decisions in order to avoid being associated with a failed project.

But the Lean concept of continuous planning may be the best antidote to sunk cost failures.  This is because it provides a framework for continuously inspecting the value of continuing a project.  You can do this with annual planning, but you only have the opportunity to inspect at a predetermined time, so even if you maintain a healthy process of periodically reevaluating the ROI on each project, on the average you will continue a project for six months past the point where the ROI turns negative.

Continuous planning also offers a subtler advantage in avoiding sunk cost scenarios.  With annual (or coarsely iterative) planning, decision making tends to be based on dividing up a budget.  Projects in progress almost always get consideration for continuance because people usually only do the ROI estimation for new projects.  However, with continuous planning, it is easier to make the value judgment “is this worth continuing?” a natural part of the process.

  1. Weatherhead, P.J. (1979). “Do Savannah Sparrows Commit the Concorde Fallacy?” Behav. Ecol. Sociobiol. (Springer Berlin) 5 (4): 373–381.
  2. http://youarenotsosmart.com/2011/03/25/the-sunk-cost-fallacy/
  3. http://www.freakonomics.com/2007/08/20/more-sunk-cost-thinking-on-iraq-war/
  4. Hammond, John S., Ralph L. Keeney, and Howard Raiffa. “The Hidden Traps in Decision Making,” HBR.



Treat Ideas Like Stories and Your Community Like Your Team

Many of us are working toward the same objectives – improving our organization’s creativity and effectiveness, transforming our workplaces into environments that make us eager to get to work in the morning, maybe even developing ideas that “put a dent in the universe.”

If only we shared ideas the way we develop software – in little chunks, partially formed, and with complete transparency.

Problem is, there are two forces that work against this:

  1. The tendency to polish
  2. The tendency to protect

First, we polish because we feel that our output is a reflection on ourselves.  This is largely true – people do form opinions based on whether or not I dangle a participle.  But isn’t that the same pattern as being afraid to demonstrate an unpolished prototype (or worse, a paper one)?  Yet in agile, we slowly condition our stakeholders to accept ambiguity and imperfection, in the greater interest of earlier feedback and learning.  And Lean Startup shows us that a faster way to our goal is likely to be one that includes coarse validated learnings to help us make optimal decisions.

So why not apply the same principles to idea-space and our community (by community, I mean those who read blogs such as this)?

Second, we protect because of fear.  As countries, we fear that our secrets will get into the hands of enemies and risk our security.  As companies, we fear that our IP will get into the hands of competitors and risk our edge.  Good thing that the folks that developed Linux, WordPress, and Drupal didn’t feel that way.  In the agile community, we should have no such fears.  We should feel free to apply to our community the same transparency we apply to our teams.

Small ideas germinate and beget large ideas, possibly universe-denting ones.  Let’s get them out there.

Sorry, this was kind of a partially formed idea, but I have early dinner plans and my blog couldn’t wait for me.



A Cure for Parkinson’s

Parkinson’s Law – “Work expands so as to fill the time available for its completion”

– Cyril Northcote Parkinson

Your agile team can be following 99% of the principles and practices behind Scrum, XP, and/or Kanban and yet still be working at half efficiency, or worse, due to Parkinson’s Law.

There is very little in the prescribed principles and practices that directly address the efficiency of executing tasks, save the concept of eliminating waste, but even that usually centers on eliminating workflow waste, not task or work item waste.

The good news is that while your agile team may have tremendous untapped potential, realizing that potential may be as simple as setting up the right environment for them.

It is a natural human behavior to allow work to fill the time allotted.  We have all done it. The reasons are many and various: Overanalysis, not balancing task elements across the work item, struggling with a blocker without asking for help, procrastination, lack of a desire to work faster, lack of motivation, distractions, not managing interrupts well, multitasking, and so on.

Note that every one of those behaviors can be identified and remedied.  But where to start?

Empowering Agile Teams

The first place to begin to solve the Parkinson’s Law problem is with empowerment.  An unempowered agile team is a team that doesn’t care.  And that is a prescription for a bad case of Parkinson’s.

Here is one recipe for curing Parkinson’s within an agile team.  The order is logical: Start where the answer to the leading question is “yes” and work downward:

  1. Have non-team members (by “non-team members” here, I mean anyone other than the team; it could be any level of management, the client, the business, the PMO, etc.) set an unrealistic goal for the project?  This situation will make people feel that they have no control and that, if the project fails, someone else is to blame.  A solution may be to improve transparency around the release burnup.  Is there a release burnup?  If not, start there.
  • Create a clear burnup from empirical velocity and an agreed-upon backlog and distribute it to all stakeholders.
  • Make sure that the release backlog is reasonably well sized.
  • Make sure that the velocity is based on empirical data, not wishful thinking.
  • Ensure that the burnup accurately reflects the backlog and velocity.
  • Make sure that everyone sees it – client, PMO, business stakeholders, various management levels, other potentially impacted teams.  Obfuscation occurs when different groups use different artifacts to present information, so ensure that they are all using the same artifact.
  2. Are non-team members selecting the stories to be run each sprint?  This can serve to prevent the team from feeling ownership.

Tip: Put the Team in Charge of Their Commitment

  • Stop the practice of external story selection, and have the team select stories only up to the point where they feel they can commit to completing them, and not one more.
  • Have a discussion around the meaning of “commitment.”  Make sure that this is one of those things that requires unanimous team agreement, like a jury.  Some decisions can be made by a democratic majority, and others are fine to be made by an oligarchy.  But commitment to a Sprint plan should be unanimous across the entire team.  Without that unequivocal commitment, unspoken doubts remain unchallenged and can spoil morale right when you need it most.
  • Then, during the planning session, ask for that one-by-one commitment to the sprint.
  3. Is a “chicken” (e.g. a project manager who is not part of the team) selecting work for people, assigning tasks, assigning story owners, assigning roles?  How can they possibly know better than each team member what tasks are needed for each story and how long each will take?
  • Eliminate that practice.  Only when the team selects their own work do they feel ownership of that work.  And it is only when a team feels ownership that they will begin to have the motivation to work at their full potential.
  4. Are story sizes small enough that each person can work on at least two stories per sprint?  Large stories that take most of the sprint to complete can create a Parkinson’s effect.  People get used to the idea that they work on only one story per sprint, and so, consciously or subconsciously, allow each story to expand to fill the sprint.  Sometimes you can have a breakthrough in velocity just by breaking that pattern.
  • Size stories such that at least two or three can be worked on by each person in one sprint.  This gets people used to the feeling of finishing and then pulling something off the backlog mid-sprint.
  • It might be necessary for the ScrumMaster to keep close watch on stories to ensure that they don’t languish uncompleted for days.  Once the pattern is broken, this is usually no longer a gating concern.

OK, great.  The team is fairly empowered at this point and work should be flowing smoothly.  Still, they may not feel full ownership and commitment.  One of the problems is that the team may still be looking to their PO and ScrumMaster as their “leaders,” and consciously or subconsciously feeling that the ultimate accountability for the story or the work item or the task lies more with these “leaders” than with themselves.

So spread out the “leadership”.

Tip: Spread the Spotlight

  5. Do the ceremonies revolve around the same people?  Is it always the SM or PO that facilitates the planning sessions, retrospectives, and sprint reviews?  Do people talk to the SM during the standup?  Start by broadening the leadership.
  • One method is to have story owners.  Ultimately, it is probably ideal for the entire team to feel fully accountable for all stories, but it can be difficult to get there in one step.  So, an interim step may be to have designated story owners, whose job it is to drive the story to completion.  They should ensure that there aren’t gaps in time between sequential tasks of the story and help resolve impediments pertaining to that story.  This can have the negative consequence of individual team members still not feeling accountable to that story being complete, but at least the feeling of ownership has been somewhat spread out.  Individuals who have been story owners can then spread that feeling of commitment to others on the next stories.
  • Ensure that BSAs, QA, and developers all get the chance to be story owners.
  • Once everyone begins to demonstrate accountability to all stories, the story ownership may be dropped.

And yet still it may seem like some stories and tasks are dragging on longer than they should.

Tip: Use the Buddy System

  6. Do people tend to hand off work to each other via documents?  Do they communicate via email instead of in person?  Lack of collaboration can lead to low-transparency silos, which are a breeding ground for Parkinson’s.  It is very easy to hide when no one else interacts with you.
  • Encourage collaboration sessions around story acceptance test writing or reviews, perhaps even kickoff sessions for each story.
  • Make a habit of including everyone involved with a story in ad hoc discussions.
  • Encourage pairing; not necessarily just for development, but for any functional area, especially where Parkinson’s might be lurking.  Pairing generally speeds up tasks, and many in the XP world observe that pairing can even reduce overall net effort on a task.  But even if it doesn’t, the payoff in terms of reducing the Parkinson’s effect will probably offset any net effort increase due to the pairing.

A healthy pattern will be established which should stick.  It is hard to watch your velocity drop and hard to let Parkinson’s set back in to a team, once good collaborative patterns have been established.

Tip: Focus on Focus

  7. Do people tend to complain about interrupts, or that they have too many responsibilities outside of story work?  Perhaps it is time to retrospect on those problems.
  • Solutions may exist in process changes.  For example, rather than having everyone on the team be interrupted by a production support broadcast, designate level 3 support people who periodically rotate through the team.
  • Practice good interrupt management. Does everyone on the team understand the evils of multitasking, both from the standpoint of flow as well as context shifting?  Perhaps some education is needed.  Spend a sprint recording what you are doing once per hour (set an alarm) and aggregate the results.  See anything surprising?
  • As a Scrum Master, check to see if people are burning down tasks on multiple stories at the same time, rather than working a task to completion before moving on to the next one.
  • Question the interrupt requests.
  • Give feedback to habitual interrupters.
  • Recognize that you are not indispensable and don’t need to always be the hero.
  • Turn off distractions if you can, like email popups.
  • Resist the urge to check email.
  • Manage time in blocks.

There are many more time management ideas that you can find via a few specific web searches.

By now, we should be seeing a healthy buzz of collective ownership and collaborative practices, and Parkinson’s Law should be all but eliminated.

Parkinson’s is all about lack of motivation.  Therefore, it can be cured by taking steps to establish a team environment where motivation comes naturally.  Any steps that serve to improve empowerment, teamwork, collaboration, and the feeling of collective ownership will have the greatest effect.