Strategy Deployment and Impact Mapping

I’ve had a couple of conversations in recent weeks in which Impact Mapping came up in relation to Strategy Deployment so here’s a post on my thoughts about how the two fit together.

An Impact Map is a form of mind-map developed by Gojko Adzic, visualising the why, who, how and what of an initiative. More specifically, it shows the goals, actors involved in meeting the goals, desired impact on the actors (in order to meet the goals), and deliverables to make the impacts. The example below is from Gojko’s website.

As you can see, an Impact Map is a very simple, reductionist visualisation, from Goals down to Deliverables, and while the mind-map format doesn’t entirely constrain this, it tends to be what most examples I have seen look like. It does, however, work in such a way as to start with the core problem (meeting the goal) and allow people to explore and experiment with how to solve that problem via deliverables. This is very much in line with how I define Strategy Deployment.

Let’s see how that Impact Map might translate onto an X-Matrix.

The Goal is clearly an Aspiration, so any relevant measures would neatly fit into the X-Matrix’s bottom section. At the other end, the Deliverables are also clearly Tactics, and would neatly fit into the X-Matrix’s top section. I would also argue that the Impacts provide Evidence that we are meeting the Aspirations, and could fit into the X-Matrix’s right-hand section. What is not so clear is Strategy. I think the Actors could provide a hint, however, and I would suggest that an Impact Map is actually a good diagnosis instrument (as per Rumelt) with which to identify Strategy.

Taking the 4 levels of an Impact Map, and transposing them onto an X-Matrix, creates a view which can be slightly less reductionist (although not as simple), and opens up the possibility of seeing how all the different elements might be related to each other collectively. In the X-Matrix below I have added the nodes from the Impact Map above into their respective places, with direct correlations for the Impact Map relationships. This can be seen in the very ordered pattern of dots. New Tactics (Deliverables) and Evidence (Impacts), and possibly more Aspirations (Goals), would of course also need to be added for the other Strategies (Actors).
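As an illustrative sketch (not part of the original posts, and with hypothetical node names), the transposition of the four Impact Map levels onto the four X-Matrix sections can be expressed as a simple lookup:

```python
# Sketch only: mapping Impact Map levels to X-Matrix sections as described
# in the text. Node names passed in are hypothetical examples.
IMPACT_MAP_TO_X_MATRIX = {
    "Goals": "Aspirations",      # bottom section
    "Actors": "Strategies",      # identified via diagnosis
    "Impacts": "Evidence",       # right-hand section
    "Deliverables": "Tactics",   # top section
}

def transpose(impact_map):
    """Group Impact Map nodes into their corresponding X-Matrix sections."""
    return {IMPACT_MAP_TO_X_MATRIX[level]: list(nodes)
            for level, nodes in impact_map.items()}
```

The correlations between sections (the pattern of dots) would then be a separate, many-to-many relation rather than this one-to-one lookup.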

Even though this is a very basic mapping, I hope it’s not too difficult to see the potential to start exploring what other correlations might exist for the identified Tactics, and what the underlying Strategies really are. I leave that as an exercise for you to try – please leave a comment with what ideas you have!

This post is one of a series comparing Strategy Deployment and other approaches.


Strategy Deployment and Directed Opportunism

A fourth post exploring the relationship between Strategy Deployment and other approaches (see Strategy Deployment and Fitness for Purpose, Strategy Deployment and AgendaShift, and Strategy Deployment and Spotify Rhythm).

Directed Opportunism is the approach described by Stephen Bungay in his book The Art of Action, in which he builds on the ideas of Field Marshal Helmuth von Moltke, Chief of Staff of the Prussian Army for 30 years from 1857, and applies them to leading businesses. This also follows on from the earlier post on alignment and autonomy in Strategy Deployment.

Bungay starts by describing three gaps between desired Outcomes, the Plans made to achieve them, and the Actions taken which create actual Outcomes. These gaps (the Knowledge Gap, Alignment Gap and Effects Gap) are shown in the diagram below, and together cause organisational friction – resistance of the organisation to meeting its goals.


Given this model, Bungay explains how the usual approach to reducing this friction, and closing the gaps, is to attempt to reduce uncertainty by pursuing more detail and control, as shown below.

More Detail

This generally makes the situation worse, however, because the problem is not linear, reductionistic or deterministic. In Cynefin terms, this is a Complicated approach in a Complex domain. Instead, Bungay recommends reducing detail and control and allowing freedom to evolve with feedback. This is what he calls Directed Opportunism.

Less Detail

This definition of Directed Opportunism seems to me to meet my definition of Strategy Deployment as a form of organisational improvement in which solutions emerge from the people closest to the problem. There is clear communication of intent (the problem), with each level (the people closest) defining how they will achieve the intent (the solution) and having freedom to adjust in line with the intent (the emergence).

From an X-Matrix perspective, being clear on results, strategies and outcomes limits direction to defining and communicating intent, and leaving tactics to emerge (through Catchball) allows different levels to define how they will achieve the intent and gives them freedom to adjust actions in line with the intent.


Strategy Deployment and Spotify Rhythm

This is the third in what has turned into a mini-series exploring the relationship between Strategy Deployment and other approaches (see Strategy Deployment and Fitness for Purpose and Strategy Deployment and AgendaShift).

Last month, Henrik Kniberg posted slides from a talk he gave at Agile Sverige on something called Spotify Rhythm, which he describes as “Spotify’s current approach to getting aligned as a company”. While looking through the material, it struck me that what he was describing was a form of Strategy Deployment. This interpretation is based purely on those slides – I haven’t had a chance yet to explore this more deeply with Henrik or anyone else from Spotify. I hope I will some day, but given that caveat, here’s how I currently understand the approach in terms of the X-Matrix Model.

Spotify Rhythm: Taxonomy

The presentation sets out the following “taxonomy” used in “strategic planning”:

Company Beliefs – While this isn’t something I talk about specifically, the concept of beliefs (as opposed to values) does tie in nicely with the idea that Strategy Deployment involves many levels of nested hypotheses and experimentation (as I described in Dynamics of Strategy Deployment). Company Beliefs could be considered to be the highest-level, and therefore probably strongest, hypotheses.

North Star & 2-Year Goals – A North Star (sometimes called True North) is a common Lean concept (and one I probably don’t talk about enough with regard to Strategy Deployment). It is an overarching statement about a vision of the future, used to set direction. Decisions can be made based on whether they will move the organisation towards (or away from) the North Star. Strategy Deployment is ultimately all in pursuit of enabling the organisational alignment and autonomy which will move it towards the North Star. Given that, the 2-Year Goals can be considered as the Results that moving towards the North Star should deliver.

Company Bets – The Company Bets are the “Big Bets” – “large projects” and “cross-organisation initiatives”. While these sound like high-level tactics, I wonder whether they can also be considered to be the Strategies. As mentioned already, Strategy Deployment involves many levels of nested hypotheses and experimentation, and therefore Strategy is a Bet in itself (as are Results, and also Beliefs).

Functional & Market Bets – If the Company Bets are about Strategy, then the Functional and Market Bets are the Tactics implemented by functional or market related teams.

DIBB – DIBB is a framework Spotify use to define bets and “make the chain of reasoning explicit” by showing the relationships between Data, Insights, Beliefs and Bets. Part of that chain of reasoning involves identifying success metrics for the Bets, or in other words, the Outcomes which will indicate if the Bet is returning a positive payoff.
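As a sketch of the chain of reasoning described above, DIBB could be captured as a simple record type. The field names follow the framework’s own terms; the example content is hypothetical and not taken from Spotify’s material.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the DIBB chain of reasoning as a data structure,
# including the success metrics (Outcomes) that show if a Bet is paying off.
@dataclass
class DIBB:
    data: str                  # observations about the world
    insight: str               # interpretation of the data
    belief: str                # hypothesis formed from the insight
    bet: str                   # investment made on the belief
    success_metrics: list = field(default_factory=list)  # the Outcomes

# Hypothetical example of making the chain of reasoning explicit:
example = DIBB(
    data="Usage is growing fastest on mobile devices",
    insight="Mobile is becoming the primary context for our users",
    belief="Improving the mobile experience will grow engagement",
    bet="A cross-organisation initiative to rebuild the mobile app",
    success_metrics=["mobile daily active users", "retention"],
)
```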

While this isn’t an exact and direct mapping, it feels close enough to me. One way of checking alignment would be the ability for anyone to answer some simple questions about the organisation’s journey. I can imagine how Spotify Rhythm provides clarity on how to answer these questions.

  • Do you know where you are heading? North Star
  • Do you know what the destination looks like? 2 Year Goals (Results)
  • Do you know how you will get there? Company Bets (Strategies)
  • Do you know how you will track progress? DIBBs (Outcomes)
  • Do you know how you will make progress? Functional & Market Bets (Tactics)

One final element of Spotify Rhythm which relates to Strategy Deployment is implied in its name – the cadence with which the process runs. Company Bets are reviewed every quarter by the Strategy Team (another reason why they could be considered to be Strategies) and the Functional and Market Bets – also called TPD (Tech-Product-Design) Bets – are reviewed every 6 weeks.

I’d be interested in feedback on alternative interpretations of Spotify Rhythm. Or if you know more about it than I do, please correct anything I’ve got wrong!


Strategy Deployment and Agendashift

Agendashift is the approach used by Mike Burrows, based on his book Kanban from the Inside, in which he describes the values behind the Kanban Method. You can learn more by reading Mike’s post Agendashift in a nutshell. As part of his development of Agendashift, Mike has put together a values-based delivery assessment, which he uses when working with teams. Again, I recommend reading Mike’s posts on using Agendashift as a coaching tool and debriefing an Agendashift survey if you are not familiar with Agendashift.

After listening to Mike talk about Agendashift at this year’s London Lean Kanban Day, I began wondering how his approach could be used as part of a Strategy Deployment workshop. I was curious what would happen if I used the Agendashift assessment to trigger the conversations about the elements of the X-Matrix model – specifically, how it could be used to identify change strategies, and the associated desired outcomes, in order to frame tactics as hypotheses and experiments. Mike and I had a few conversations, and it wasn’t long before I had the opportunity to give it a go. This is a description of how I went about it.

Assessment & Analysis

The initial assessment followed Mike’s post, with participants working through individual surveys before spending time analysing the aggregated results and discussing strengths, weaknesses, convergence, divergence and importance.


Having spent some time in rich conversations about current processes and practices, triggered by exploring the various perspectives suggested by the survey prompts and scores, the teams had some good insights about what they considered to be their biggest problems worth solving and which required most focus. Agreeing on the key problems that need solving can be thought of as agreeing on the key strategies for change.

This is where I broke away from Mike’s outline, in order to first consider strategies. I asked the participants to silently and individually come up with 2 to 3 change strategies each, resulting in around 20-30 items, which we then collectively grouped into similar themes to end up with 5-10 potential strategies. Dot voting (with further discussion) then reduced this down to the 3 key change strategies which everyone agreed with.

To give some examples (which I have simplified and generalised), we had strategies focused around collaboration, communication, quality, product and value.


Having identified these key strategies, the teams could then consider what desired outcomes they hoped would be achieved by implementing them. By answering the questions “what would we like to see or hear?” and “what would we measure?”, the teams came up with possible indicators, both qualitative and quantitative, of whether the strategies, and ultimately the tactics, were working.

Taking the 3 key strategies, I asked small groups of 3-5 people to consider the outcomes they hoped to achieve with those strategies, and then consolidated the output. One reassuring observation from this part of the workshop was that some common outcomes emerged across all the strategies. This means that there were many-to-many correlations between them, suggesting a messy coherence, rather than a simplistic and reductionist relationship.

Some examples of outcomes (again simplified and generalised) were related to culture, responsiveness, quality, understanding and feedback.


Once we have strategies and outcomes, the next step is to create some hypotheses for what tactics might implement the strategies to achieve the outcomes. To do this I tweaked Mike’s hypothesis template, and used this one:

We believe that <change>

implements <strategies>

and will result in <outcomes>

With this template, the hypotheses are naturally correlated with both strategies and outcomes (where the outcomes already consist of both subjective observations and objective measures).
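As a small sketch (the function name and arguments are my own, not from the workshop), the template can be rendered programmatically, which makes it easy to generate a consistent set of hypothesis statements from a list of candidate changes:

```python
# Illustrative sketch only: rendering the tweaked hypothesis template.
def hypothesis(change, strategies, outcomes):
    """Fill in the template: change -> strategies -> outcomes."""
    return (f"We believe that {change}\n"
            f"implements {strategies}\n"
            f"and will result in {outcomes}")

print(hypothesis("a technical practice",
                 "a quality related strategy",
                 "fewer defects"))
```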

I asked each participant to come up with a single hypothesis, creating a range of options from which to begin defining experiments.

For example (vastly simplified and generalised!):

We believe that a technical practice

implements a quality related strategy

and will result in fewer defects


This was as far as we got in the time available, but I hope it’s clear that once we have hypotheses like this we can start creating specific experiments with which to move into action, with the possibility that each hypothesis could be tested with multiple experiments.


While we didn’t formally go on to populate an X-Matrix, we did have most of the main elements in place – strategies, outcomes and tactics (if we consider tactics to be the actions required to test hypotheses) – along with the correlations between them. Although we didn’t discuss end results in this instance, I don’t believe it would take much to make those explicit, and come up with the correlations to the strategies and outcomes.

On a recent call with Mike he described Agendashift in terms of making the agenda for change explicit. I think that also nicely describes Strategy Deployment, and why I think there is a lot of overlap. Strategy Deployment makes the results, strategies, outcomes and tactics explicit, along with the correlations and coherence between them, and it seems that Agendashift is one way of going about this.


Strategy Deployment and Fitness for Purpose

David Anderson defines fitness for purpose in terms of the “criteria under which our customers select our service”. Through this lens we can explore how Strategy Deployment can be used to improve fitness for purpose by having alignment and autonomy around what the criteria are and how to improve the service.

In the following presentation from 2014, David describes Neeta, a project manager and mother who represents two market segments for a pizza delivery organisation.

As a project manager, Neeta wants to feed her team. She isn’t fussy about the toppings as long as the pizza is high quality, tasty and edible. Urgency and predictability are less important. As a mother, Neeta wants to feed her children. She is fussy about the toppings (or her children are), but quality is less important (because the children are less fussy about that). Urgency and predictability are more important. Thus fitness for purpose means different things to Neeta, depending on the market segment she is representing and the jobs to be done.

We can use this pizza delivery scenario to describe the X-Matrix model and show how the ideas behind fitness for purpose can be used with it.


Results describe what we want to achieve by having fitness for purpose, or alternatively, they are the reasons we want (and need) to improve fitness for purpose.

Given that this is a pizza delivery business, it’s probably reasonable to assume that the number of pizzas sold would be the simplest business result to describe. We could possibly refine that to the number of orders, or the number of customers. We might even want a particular number of return customers, or repeat business, to be successful. At the same time, operational costs would probably be important.


Strategies describe the areas we want to focus on in order to improve fitness for purpose. They are the problems we need to solve which are stopping us from having fitness for purpose.

To identify strategies we might choose to target one of the market segments that Neeta represents, such as family or business. This could lead to strategies to focus on things like delivery capability, or menu range, or kitchen proficiency.


Outcomes describe what we would like to happen when we have achieved fitness for purpose. They are things that we want to see, hear, or which we can measure, which indicate that the strategies are working and which provide evidence that we are likely to deliver the results.

If our primary outcome is fitness for purpose, then we can use fitness for purpose scores, along with other related leading indicators such as delivery time, reliability, complaints and recommendations.


Tactics describe the actions we take in order to improve fitness for purpose. They are the experiments we run in order to evolve towards successfully implementing the strategies, achieving the outcomes and ultimately delivering the results. Alternatively they may help us learn that our strategies need adjusting.

Given strategies for improving fitness for purpose based around market segments, we might try new forms of delivery, different menus or ingredient suppliers, or alternative cooking techniques.


I hope this shows, using David’s pizza delivery example, how fitness for purpose provides a frame through which to view Strategy Deployment. The X-Matrix model can be used to tell a coherent story about how all these elements – results, strategies, outcomes and tactics – correlate with each other. Clarity of purpose, and what it means to be fit for purpose, enables alignment around the chosen strategies and desired outcomes, such that autonomy can be used to experiment with tactics.
