Agile Accelerate

Leave Nothing on the Table



Extending Cross-Functionality to Programs

There is an excellent rationale for cross-functional teams.  For large programs, that rationale scales easily to the program level.  But, for some reason, this isn’t always recognized.

TEAM CROSS-FUNCTIONALITY

Let’s say you have a team with the following profile of highly siloed individuals:

[Figure: xfunc1 – profile of a team of highly siloed individuals]

This is great if you have a profile of stories that fits it perfectly, as follows:

[Figure: xfunc2 – a story profile that fits the team perfectly]

But what if your set of sprint stories looks more like this?:

[Figure: xfunc3 – a set of sprint stories that does not fit the team profile]

In this case, you have a deficiency of analysts, back-end developers, and QA people to implement the stories that your aggregate team capacity might otherwise support.  And, your UX folks and front-end developers will be twiddling their thumbs for part of the sprint.

So, what to do?

Since you are only as good as your lowest-capacity role (which appears to be QA in this particular example), you will have to scale back the number of stories to fit the profile, as shown:

[Figure: xfunc4 – stories scaled back to fit the lowest-capacity role]

Now, everyone is underutilized, except for QA.  Good luck finding something useful for everyone else to do in the next two weeks.

The net result is that your team is perhaps 30% less productive than it could be (eyeballing the graphic).
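To make the arithmetic concrete, here is a minimal sketch using hypothetical numbers (not the ones in the graphics): each story needs a fixed amount of work from every role, so a siloed team’s throughput is capped by whichever role runs out first, while a fully flexible team can apply its total capacity wherever the sprint needs it.

```python
# Hypothetical numbers for illustration only (not taken from the xfunc graphics).
work_per_story = {"analysis": 2, "ux": 1, "frontend": 2, "backend": 3, "qa": 3}
siloed_capacity = {"analysis": 10, "ux": 10, "frontend": 12, "backend": 12, "qa": 6}

# Siloed team: throughput is limited by the scarcest role (QA in these numbers).
siloed_stories = min(siloed_capacity[role] // need for role, need in work_per_story.items())

# Fully flexible team: the same total capacity can be shifted to whatever role needs it.
flexible_stories = sum(siloed_capacity.values()) // sum(work_per_story.values())

print(f"Siloed team completes   {siloed_stories} stories per sprint")   # 2
print(f"Flexible team completes {flexible_stories} stories per sprint") # 4
```

The gap in the post’s graphic is smaller (roughly 30%), but the mechanism is the same: the more the capacity can flex across roles, the closer the team gets to its aggregate capacity.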

However, if you take advantage of standard cross-functional teamwork, your team’s profile may look something like this:

[Figure: xfunc5 – a cross-functional team profile]

Note that by “cross-functional” we do not mean that everyone should be able to do anything.  There are very valid reasons (education, experience, proclivity, enthusiasm) why certain people are ideally suited for certain kinds of work.  Think of the cross-functional nature of someone’s role on a team as a bell curve (alternatively, some talk about T-shaped employees – the T is just the bell curve upside down, as the Y-axis orientation is arbitrary).  The more the curve is spread out, the more they are able to take on other people’s roles.  On a good cross-functional team, the bell curves overlap “somewhat,” meaning that everyone can take on a little bit of someone else’s role, although perhaps not as efficiently.  Still, this allows a team to take on a wide variety of “profiles” of sprint work, as will always be necessary.

So, for example, in the case above,

[Figure: xfunc6 – the cross-functional team profile from above]

people will adjust to the desired “sprint needs” profile as follows:

[Figure: xfunc7 – people adjusting to the desired sprint needs profile]

PROGRAM LEVEL CROSS-FUNCTIONALITY

Don’t forget that this model can be applied to more than just teams.

For example, there can be a tendency for teams to develop “specific expertise”, due perhaps to knowledge held by certain BSAs or specific architectural or design skills in the development team.  The program may then tend to assign stories based on this expertise, under the theory that this is the most efficient way to get work done. Unfortunately, this has the effect of only further driving each team into a functional silo.  It can become a vicious spiral, and soon you may hear things like “well, we don’t have generic teams and, at this point, the schedule is paramount, so we need to keep assigning program work according to the team best suited to do it.”  As a result, program backlogs will consist of stories pre-targeted to specific teams, even arbitrarily far out in time.  Imagine what happens when the stakeholders decide to re-prioritize epics or add new features, or a new dependency arises that doesn’t line up with the ideal team at the right time.  The result will be a work profile that doesn’t match the “team profile,” as follows:

[Figure: xfunc8 – a program work profile that doesn’t match the team profile]

Enter a cadre of fix-it people – project managers, oversight groups, resource managers, program managers – all trying to re-balance the backlog, shuffling stories around, adding people to teams, and squeezing some teams to do more work, while other teams sit partially idle and end up assigned filler work that isn’t really needed.  It is the same wasteful resource-management nightmare that is so easily solved by cross-functional teams, except this time at the program level.

So, eliminate the waste, and follow these simple program-level guidelines:

  1. Create a fully prioritized program backlog without consideration for the teams that will be executing the stories.
  2. Once per sprint, have a program planning session or meta-scrum (Uber-PO, Uber-SM, team representatives) where the candidate stories for the upcoming sprint are identified for each team.  Include a little more than each team’s velocity would otherwise indicate in case they are able to take on more than their average.
  3. Make it a goal to avoid specializing teams.

All team “profiles” will be identical and program needs can easily be accommodated.

[Figure: xfunc9 – identical team profiles easily accommodating program needs]

There may be a little bit of short term inefficiency resulting from having the “slightly less than ideal” team work on particular stories, but the more you do this, the more that inefficiency evaporates.  And the advantages are significant:

  • Holistic view of the program backlog allows you to focus on what is important – delivering value
  • No need to engage the expensive swat team of fix-it managers to shuffle around people and project artifacts
  • All team members gain experience and learning, often resulting in greater job satisfaction, and higher performing teams
  • No more single point of failure; no more critical path team
  • Far less chaos and confusion, resulting in more focused individuals
  • Extremely easy to manage – program progress is measured by the simple rate at which all teams work through the stories.  Any gaps between targeted and expected scope are easy to identify.



An Experiment in Learning, Agile & Lean Startup Style

I always have a backlog of non-fiction books to read. Given the amount of free time that I have every day, I am guessing that it may be years before I get through them. In fact, the rate at which books get added to my backlog probably exceeds my learning velocity, creating an ever-increasing gap. It feels like a microcosm of Eddie Obeng’s “world after midnight.”

So what to do?

[Image: books800]

I am trying to increase my velocity by applying speed reading techniques. But so far, that is probably only closing a small percentage of the gap.

Iterative Learning

Then, upon a bit of soul searching, I had an epiphany. Why do I feel the need to read and understand every single word on every single page? This runs counter to what we coach our teams to do—eliminate waste, only document what makes sense, just-in-time practices, and applying iterative thinking instead of only incremental. The answer seemed to be that I don’t feel that I have really read the book if I haven’t read every word. So what? Am I trying to conquer the thing? It seems like a very egocentric point of view.

What if I were able to let go of the ego, and try to read a book iteratively instead of incrementally? Is it even possible? Would it be effective? There are all sorts of ways to tell stories or build products—top-down, bottom-up, inside-out—each of which has its strong points. Sometimes it is most effective, for instance, to grab the user’s attention by initially giving them a nugget that might logically be placed in the middle of a narrative, and then providing necessary foundation, or by filling in the gaps as necessary. Could one apply the same process to learning from a book? I could imagine scanning through a book randomly, stopping at points that looked interesting and digesting a bit—much like I used to do with encyclopedias as a kid. Or, maybe, first reviewing the TOC for areas of interest, jumping to those sections, absorbing a bit, and then searching for any context that was missing.  This would be a completely different way to learn from a book. I couldn’t call it reading, and don’t have a good term for it, other than a new kind of learning.

This led me to think a little more deeply about what I am trying to get out of reading: the learning aspect of it. What if I could scan a book in a tenth of the time that it took to read it, but retain half of the content? Would that be an improvement? There seems to be some sort of formula that I am trying to maximize, like dL/dt = C·V·R: the rate of learning equals the “learn-worthy” content of the book multiplied by the speed at which I scan it multiplied by the percentage that I retain. Is the percent retained equal to the percent value obtained? Do I get half the potential value of a book if I retain half as much? I could simply define R to be the percent value and my equation still holds. Something in the back of my mind says it is really sad to look at learning this way. Something else says I am on to something.
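As a quick sanity check of that formula, here is a minimal sketch treating the learning rate as content times scan speed times retention, using the hypothetical numbers from the paragraph above (a tenth of the time, half the retention):

```python
def learning_rate(content, scan_speed, retention):
    """dL/dt = C * V * R: learn-worthy content x scanning speed x fraction retained."""
    return content * scan_speed * retention

book = 1.0  # normalized "learn-worthy" content of a given book

full_read = learning_rate(book, scan_speed=1.0, retention=1.0)
iterative = learning_rate(book, scan_speed=10.0, retention=0.5)  # 10x faster, half retained

print(iterative / full_read)  # 5.0 -> five times the learning per unit time
```

Whether R really equals the percentage of the book’s value obtained is, as noted above, the open question.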

Of course, there are all kinds of nuances.  For example, some books build upon a foundation which must be well understood to get any value at all out of the latter sections of the book.  For others, it may be easier to skip around. For some, you may be able to get value out of scanning the TOC or the subheadings, digesting the graphics, or just reading the intros and summaries of each chapter; for others, not so much.  Hence, in a sense, different books have different learning profiles.

The Experiment

I was intrigued enough to attempt this on a book near the top of my backlog: Stephen Wolfram’s A New Kind of Science, a 1280-page tome that took him ten years to write. So I did it. I didn’t “read” it. I iterated through it and digested some of it. And I can honestly say that, for this particular book, I optimized my learning rate equation significantly. I can’t be sure of the total potential value that the book would have to me were I to read it in its entirety, but from what I digested, I feel like I got about 50% in about 5% of the time—a tenfold increase in my learning rate. And Stephen got his royalty. Yes, I do appreciate the irony of using a new kind of learning on A New Kind of Science. And letting go of the idea of conquering a book was kind of liberating.

So, what if we look at a particular learning objective in the same way that we manage a large project or program? I am imagining a vision or an objective like “I want to become learned in Digital Philosophy” (one of my particular interests). That vision results in the creation of a backlog of books, papers, blogs, etc. The larger of these (books) are epics and can be broken down into stories, like “Scan contents to get a sense of the material,” “Determine the core messages of the book by finding and reading the key points,” “Understand this author’s view on this particular topic,” and so on. Thinking about learning material this way opens up all kinds of new possibilities. For example, maybe there is another way to slice the backlog, such as by topic. If the most important thing to further my overall objective is to understand everything about cellular automata, I would assign higher priority to the stories related to that topic, even if they come from separate sources. So, my learning process takes a different path, one that slices through different material non-linearly.
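Here is a minimal sketch of that idea (the story descriptions, topics, and priorities are hypothetical): books become epics, broken into small learning stories that can be sliced and re-prioritized by topic across sources.

```python
from dataclasses import dataclass

@dataclass
class Story:
    epic: str          # the source book or paper
    description: str
    topic: str
    priority: int      # lower number = more important right now

backlog = [
    Story("A New Kind of Science", "Scan contents to get a sense of the material", "overview", 3),
    Story("A New Kind of Science", "Understand the core claims about cellular automata", "cellular automata", 1),
    Story("Some other source", "Find this author's view on cellular automata", "cellular automata", 2),
]

# Slice the backlog by the topic that best serves the overall learning objective.
next_up = sorted((s for s in backlog if s.topic == "cellular automata"),
                 key=lambda s: s.priority)
for story in next_up:
    print(story.epic, "->", story.description)
```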

Lean Startup Learning & Continuous Improvement

In fact, this all feels a bit to me like a lean startup approach to learning in that you can experiment with different chunks of material that may point you in different directions, depending on the outcome of the reading experiment. Having a finer backlog of reading components and being willing to let go of the need to conquer reading material might make possible a much faster path to an ultimate learning objective.

And so I am passing along this idea as an option for those who have a voracious desire to learn in this after-midnight world, but have a before-midnight backlog of reading material.



Intuition & Innovation in the Age of Uncertainty

“My [trading] decisions are really made using a combination of theory and instinct. If you like, you may call it intuition.” – George Soros

“The intellect has little to do on the road to discovery. There comes a leap in consciousness, call it intuition or what you will, and the solution comes to you, and you don’t know how or why.” – Albert Einstein

“The only real valuable thing is intuition.” – Albert Einstein

“Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition.” – Steve Jobs

Have you ever considered why it is that you decide some of the things that you do? Like how to divide your time across the multiple projects or activities that you have at work, how and when to discipline your kids, where to go and what to do on vacation, which car to buy?

The ridiculously slow way to figure these things out is to do an exhaustive analysis on all of the options, potential outcomes and probabilities. This can be extremely difficult when the parameters of the analysis are constantly changing, as is often the case. Such analysis is making use of your conscious mind.

The other option is to use your subconscious mind and make a quick intuitive decision.

[Image: flip-coin_5]

We who have been educated in the West, and especially those of us who received our training in engineering or the sciences, are conditioned to believe that “analysis” represents rigorous logical scientific thinking and “intuition” represents new-age claptrap. Analysis good, intuition silly.

This view is quite inaccurate.

Intuition Leads to Quick, Accurate Decisions

According to Gary Klein, ex-Marine, psychologist, and author of the book “The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work,” 90% of the critical decisions that we make are made by intuition in any case. Intuition can actually be a far more accurate and certainly faster way to make an important decision. Here’s why…

The mind is often considered to be composed of two parts – conscious and subconscious. Admittedly, this division may be somewhat arbitrary, but it is also realistic.

The conscious mind is that part of the mind that deals with your current awareness (sensations, perceptions, feelings, fantasies, memories, etc.). Research shows that the information processing rate of the conscious mind is actually very low. Dr. Timothy Wilson of the University of Virginia, in his book “Strangers to Ourselves: Discovering the Adaptive Unconscious,” estimates the conscious mind’s processing capacity to be only 40 bits per second. Tor Nørretranders, author of “The User Illusion,” estimates the rate to be even lower, at only 16 bits per second. In terms of the number of items that can be retained at one time by the conscious mind, estimates vary from 4 to 7, with the lower number being reported in a 2008 study by the National Academy of Sciences.

Contrast that with the subconscious mind, which is responsible for all sorts of things: autonomic functions, subliminal perceptions (all of that data streaming into your five sensory interfaces that you barely notice), implicit thought, implicit learning, automatic skills, association, implicit memory, and automatic processing. Much of this can be combined into what we consider “intuition.” Estimates for the information processing capacity and storage capacity of the subconscious mind vary widely, but they are all orders of magnitude larger than their conscious counterparts. Dr. Bruce Lipton, in “The Biology of Belief,” notes that the processing rate is at least 20 Mbits/sec and maybe as high as 400 Gbits/sec. Estimates for storage capacity are as high as 2.5 petabytes.

Isn’t it interesting that the rigorous analysis that we are so proud of is effectively done on a processing system that is excruciatingly slow and has little memory capacity? Whereas, intuition is effectively done on a processing system that is blazingly fast and contains an unimaginable amount of data.
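Taking the cited estimates at face value, a quick back-of-the-envelope comparison shows just how lopsided the two processing systems are:

```python
conscious_bps = 40            # Wilson's estimate for conscious processing, bits/sec
subconscious_low = 20e6       # Lipton's lower bound, 20 Mbit/sec
subconscious_high = 400e9     # Lipton's upper bound, 400 Gbit/sec

print(f"{subconscious_low / conscious_bps:,.0f}x")   # 500,000x
print(f"{subconscious_high / conscious_bps:,.0f}x")  # 10,000,000,000x
```

Even the most conservative of these estimates puts the subconscious half a million times ahead.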

In fact, that’s what intuition is – the same analysis that you might consider doing consciously, but doing it instead with access to far more data, such as your entire wealth of experience, and the entire set of knowledge to which you have ever been exposed.

Innovation Is Fueled by Intuition

The importance of intuition only grows exponentially with every year that passes.  Here’s why…

Eddie Obeng is a professor at the School of Entrepreneurship and Innovation, Henley Business School, in the UK. He gave a TED talk that nicely captured the essence of our times in terms of information overload. Figure 1 from that talk demonstrates what we all know and feel is happening to us:

[Figure 1: Eddie_Obeng_Uncertainty – information rate vs. time]

The horizontal axis is time, with “now” being all the way to the right. The vertical axis depicts information rate.

The green curve represents the rate at which we humans can absorb information, aka “learn.” It doesn’t change much over time because our biology stays pretty much the same. The red curve represents the rate at which information is coming at us.

Clearly, there was a time in the past when we had the luxury of being able to take the necessary time to absorb all of the information needed to understand the task or project at hand. If you are over 40, you probably remember working in such an environment.

At some point, however, the incoming data rate exceeded our capacity to absorb it: television news broadcasts with two or three rolling tickers, tabloids, zillions of web sites to scan, Facebook posts, tweets, texts, blogs, social networks, information repositories, big data, etc. In our workplace, projects typically have many dependencies on information from other teams, stakeholders, technologies, end users, and leadership, all of which are constantly changing.

It is easy to see that as time goes on, the ratio of unprocessed incoming information to human learning capacity grows exponentially. What this means is that there is increasingly more uncertainty in our world, because we just don’t have the ability to absorb the information needed to be “certain,” like we used to. Some call it “The Age of Uncertainty.” Some refer to the need to be “comfortable with ambiguity.”

This is a true paradigm shift. It demands entirely new ways of doing business, of structuring companies, of planning, of living. In my job, I help companies come to terms with these changes by implementing agile and lean processes, structures, and frameworks in order for them to be more adaptable to the constantly changing environment. Such processes are well suited to the organizational context in any case, given that organizations are complex systems (as opposed to “complicated” ones, in Cynefin or systems-theory parlance). But they are also the only kinds of processes that will be effective in this new environment because they embrace the idea of sensing and responding to change instead of requiring rigorous analysis to establish a predictable plan.

We no longer have time to do the rigorous analysis necessary to make the multitude of decisions with which we are confronted on a daily basis. Instead, we increasingly need to rely on our intuition. But, while we often concentrate our energies on improving specific technical or leadership skills, we rarely consider the idea that perhaps we can make better use of that powerful subconscious mind apparatus by improving the effectiveness of our intuition. It seems to me that this is a significantly missed opportunity, one that deserves more and more of our attention with every passing year.

Intuition Can Be Developed

Sounds as if intuition is a skill that could be very useful to hone, if possible. So how do we develop that capability?  Here are some ideas:

  • Have positive intent and an open mind – The first step to any new idea is to accept it. Think of it as “greasing the learning skids.”
  • Put yourself in situations where you gain more experience about the desired subject(s) – Intuition works best when you have a lot of experiences from which to draw. If you continue to do the same thing over and over, you are not building new experiences.  Therefore, the more you depart from the norm and from your comfort zone, and develop experiences in your area of interest, the more substantial your “intuitive database.”
  • Meditate / develop point-focus – Meditation develops all sorts of interesting personal capabilities, not least of which is an improved capacity to intuit.
  • Go with the first thing that comes to mind – Effectively, you are practicing intuition by doing this. In time, the practice will lead to more effective use of the capability.
  • Notice impressions, connections, coincidences (a journal or buddy may help) – This reinforces the intuitive pathways of the mind. Neuroplasticity is a well-studied phenomenon whereby your thoughts develop incremental neural connections. Reinforcing the positive ones makes them more available for use.
  • 2-column exercises – Another mindfulness technique, these exercises help to raise your awareness of your mental processes, including your subconscious.
  • Visualize success – Think of this as applying the idea of neuroplasticity to build a set of success-oriented neural pathways in your mind.
  • Follow your path – Following a path that feels right to you does two things: First, it puts you into increasingly rewarding situations, generating positive feedback, which helps with all of the above practices. Second, it is simply practicing intuition, but specifically on what your subconscious mind knows are your best decisions.

I am doing many of these practices and finding them to be very valuable.



Treat Ideas Like Stories and Your Community Like Your Team

Many of us are working toward the same objectives – improving our organization’s creativity and effectiveness, transforming our workplaces into environments that make us eager to get to work in the morning, maybe even developing ideas that “put a dent in the universe.”

If only we shared ideas the way we develop software – in little chunks, partially formed, and with complete transparency.

Problem is, there are two forces that work against this:

  1. The tendency to polish
  2. The tendency to protect

First, we polish because we feel that our output is a reflection on ourselves.  This is largely true – people do form opinions based on whether or not I dangle a participle.  But isn’t that the same pattern as being afraid to demonstrate an unpolished prototype (or worse, a paper one)?  Yet in agile, we slowly condition our stakeholders to accept ambiguity and imperfection, in the greater interest of earlier feedback and learning.  And Lean Startup shows us that a faster way to our goal is likely to be one that includes coarse validated learnings to help us make optimal decisions.

So why not apply the same principles to idea-space and our community (by community, I mean those who read blogs such as this)?

Second, we protect because of fear.  As countries, we fear that our secrets will get into the hands of enemies and risk our security.  As companies, we fear that our IP will get into the hands of competitors and risk our edge.  Good thing that the folks that developed Linux, WordPress, and Drupal didn’t feel that way.  In the agile community, we should have no such fears.  We should feel free to apply to our community the same transparency we apply to our teams.

Small ideas germinate and beget large ideas, possibly universe-denting ones.  Let’s get them out there.

Sorry this was kind of a partially formed idea, but with early dinner plans looming, my blog couldn’t wait for me.



Broadening the Grounds for Self Improvement

We all have our least favorite phrases or questions that come up in day-to-day coaching.  One of mine goes along these lines: “No one has complained about this, so why change?”

 ————–

How many of us complained about not having a PC on our desk before HP created one and Apple popularized it?

How many of us complained about not having a mouse and a point-and-click interface before Xerox PARC invented it and Apple popularized it?

————–

Making changes that address complaints or known problems certainly makes sense.  But doesn’t it also make sense to make changes that simply improve the ability to deliver, even if there isn’t an obvious problem to solve?

For just one example, I have worked with teams who used a big visible task board to track progress on their stories, but the board was so confusing that it didn’t add much value.  Simplifying it and making it more readable generally helped to improve the clarity of each story status and the progress of the sprint, and I would notice people starting to have discussions around the board.  Conversations improved, collaboration increased, and the team performance often went up as a result.  Yet, no one had ever complained about the board.

When a team retrospects, they often focus on the things that didn’t go well over the past sprint.  For instance, they might devote a few collective hours of time to solving a very specific problem that was raised; e.g. “I didn’t have the login credentials necessary to access this particular database that I needed to consult to find out some information for my story.”  But, instead they could have spent time figuring out how to collaborate better on identifying business requirements – an activity not driven by any complaints or problems, but one which could generate significant benefits in terms of velocity or delivering value.

————–

How many of us complained about not having a web interface on top of the Internet before Mosaic?

How many of us complained about not having a smartphone before IBM created one and Apple popularized it?

 ————–

Marcus Buckingham, leader of the strengths movement, notes that a person’s “greatest room for growth” is not in their weaknesses but in their strengths.  Might not this also apply to agile teams?

What if, instead of always focusing our retrospectives on fixing the things that are broken, we sometimes take a critical look at things we already do well, but could get so much incremental value out of doing even better?

Investing in cross-training, for example, has the potential to be one of those practices that can generate huge improvements in team productivity, even for a team with already broad skill sets.  A team could become so efficient at being cross-functional that it would never find tasks blocked due to the lack of an available person with the right skills or knowledge.  Tasks and stories would flow even better, and overall team productivity can only go up.  The same might be said for building foundational principles like commitment and empowerment.

Continuous improvement isn’t only about fixing problems.  It is about inspecting everything that you do, good and bad, making decisions about where change can make the most impact, and validating your decisions.