Image created by Midjourney

Distillations in this newsletter: How to factor AI into your strategy; Core Values from Brave New Work; Complexity from brevity – a strategy development error; Strategy Distilled – free pdf compilation

STRATEGY DISTILLED:

A monthly concoction of insight, learning and things you might have missed for anyone who works on strategy, works with strategy or just loves strategy.

Would you rather listen to this newsletter as a podcast?

_____________________

This month …

  • How to factor AI into your strategy.
  • Strategy snippets you might have missed: Core Values from Brave New Work; Complexity from brevity – a strategy development error.
  • Strategy Distilled – free pdf compilation: The first two years of Strategy Distilled compiled into an 88-page pdf – free to subscribers.

If you enjoy reading this newsletter, don’t forget to forward it to friends or colleagues who might also find it of interest.

Was this forwarded to you? Sign up

Discover the content of past issues of Strategy Distilled.

_____________________

How to factor deep uncertainties … like AI … into your strategy

The long-awaited Artificial Intelligence revolution is upon us. According to the London Business School a fortnight ago, “business sectors will be challenged” and “business models will be upended”.

How, then, do we, as strategists, factor uncertainties such as AI into our strategies? As of October 2023, there certainly are a lot of uncertainties surrounding AI.

Incorporating something like ‘profitability’ into your strategy is relatively straightforward. It is easily defined (revenue minus costs, both carefully defined), its value is clear (profit enables re-investment and/or a return on prior investment) and strategic decision-making around profitability is a well-worn path (an initial decision on whether your organisation is for-profit or non-profit, a decision on whether to prioritise shareholder or stakeholder interests, and then decisions on how best to distribute profits). Similarly, incorporating ‘customer satisfaction’, ‘operational efficiency’, ‘employee engagement’ or ‘diversification into new territories’ into strategy is less of a voyage into the unknown than artificial intelligence.

Here are some of the more strategically important uncertainties about AI:

  1. Whilst the role and value of AI are clear in many niche applications (e.g. image analysis, molecule design), its more general value within organisations is less clear.
  2. Many aspects of AI functionality are struggling to reach a point of stable and reliable functional maturity. For example, the ‘hallucinations’ suffered by large language models make them high-risk content creators.
  3. AI platforms, services and features are evolving rapidly, making it difficult to decide when best to commit to AI. The rapidly changing AI landscape also increases the risk of committing to ‘soon-to-be-outdated’ functionality.

Our strategic response to such uncertainties needs to be well thought through and a great place to start is the wise advice of Nobl that uncertainty and risk are different things and should be managed very differently:

“A risk is a known danger, in that we know the potential outcomes and we can apply a probability to those outcomes … Uncertainty is categorically different. By definition, we haven’t encountered these conditions before. The odds may be worse, but they could also be far better. Moreover, there may be untold prizes to be discovered in the journey itself. We simply don’t know until we venture out.”
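
One way to make this distinction concrete (my own gloss, not taken from the Nobl article) is in terms of what can actually be calculated. Under risk, the possible outcomes and their probabilities are known, so an expected value can be computed and compared across options:

\[ \mathbb{E}[X] = \sum_i p_i x_i \quad \text{(risk: the probabilities } p_i \text{ are known)} \]

Under uncertainty, the probabilities – and often the set of possible outcomes itself – are unknown, so no such calculation is available and the decision cannot be reduced to a weighted sum.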

Conflating risk and uncertainty, according to Nobl, makes organisations:

  1. Delay action (“winnowing potential gains” whilst “conceding gains to competitors”);
  2. Invest in sub-optimal (“devil-you-know”) solutions;
  3. Demand ever more information in the hope that the uncertainties will be reduced to manageable proportions – which might actually lead to unfounded over-confidence.

Instead, a constructive approach to strategic uncertainty follows a three-step process (inspired by the Nobl article but adapted substantially for present purposes):

  1. Develop a sufficient understanding of AI to enable decision-making. Try to get the entire decision-making team to a similar level of understanding. Agree, as far as possible, some clear stopping rules – when do we stop researching and start deciding? Focus entirely on the opportunities to begin with. Focusing on excitement rather than anxieties is more likely to sustain involvement and active engagement.
  2. Share fears and concerns; agree coping tactics. Having agreed the opportunities and the value they offer, we now acknowledge the possible down-sides. What harms and costs sit alongside the benefits and value of adopting AI technologies? It is unlikely you will be able to fully reconcile these pros and cons yet. You may, however, be able to discern how they might be measured and evaluated. You may even be able to sketch out threshold situations where you might commit to embedding AI into your routine ways of working … or decide that a particular use of AI is not currently feasible for you.
  3. Commit to experimentation. Often the only way to resolve deeply entrenched uncertainties is to try things out to see if you can make them work in your own organisation. To quote Nobl, “think of your activities as a means not just to produce a desired outcome, but as a means to a deeper understanding of the conditions around you … think of each of your tactics as a means of asking a question that can deliver an illuminating answer. In the beginning, bet small.”

So, how would we translate these broad principles into a practical approach to strategy development? We start with opportunity definition. How could AI benefit our organisation? There is a growing body of evidence showing jobs or types of activity where AI can outperform humans – see Figure 1 below from Time magazine.

Figure 1. State-of-the-art AI performance on benchmarks, relative to human performance.

The key question is: where people in your organisation are doing things that AI clearly does better, is there a good business case for introducing AI to augment human performance?

The harder challenge is working out how AI could enhance more generic aspects of human performance. McKinsey’s June 2023 research suggests that “75 percent of the value that generative AI use cases could deliver falls across four areas: Customer operations, marketing and sales, software engineering, and R&D.” Tyna Eloundou et al.’s 2023 research shows that, unlike previous technological revolutions, it is higher-wage professionals whose jobs are more exposed to the newer generative AI technologies.

Building the ‘Opportunity Case’ for AI can be a great way of discussing and sharing a high-level overview of where AI platforms, services and features could offer value for your organisation.

Once your opportunity case is produced, you can then start to explore where the costs and harms of your specific AI opportunities might arise. Initially, this can be a high-level overview. What types of costs and harms? How likely are they? How severe would they be if they did occur? And how easy would they be to detect and mitigate before too much damage is done? For a structured approach to this, see Failure Modes and Effects Analysis.
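
For readers unfamiliar with Failure Modes and Effects Analysis, here is a rough sketch of how it would structure this exercise (standard FMEA practice, not a method specific to AI): each potential harm is scored, typically on a 1–10 scale, for Severity, Occurrence (how likely it is) and Detection (how hard it would be to spot before damage is done), and the three scores are multiplied into a Risk Priority Number that ranks where mitigation effort should go first:

RPN = Severity × Occurrence × Detection

For example, a harm scored 8 for severity, 3 for occurrence and 6 for detection gives an RPN of 8 × 3 × 6 = 144, and would be prioritised ahead of a harm scoring 8 × 2 × 4 = 64.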

Deciding where and how to experiment with AI is usually the biggest set of strategic decisions to make. It is founded upon strategy as hypothesis rather than strategy as imperative. Good AI experiments ask good questions. Given the limitations of large language models and their hallucinations, how can we still use GPT (Generative Pre-trained Transformer) models in content creation without compromising content quality? Can AI technologies be used in combination to cancel out their respective limitations? A great example of this was given by Gary Marcus and Ernest Davis, who reported that ChatGPT4’s ability to solve maths and science problems was significantly enhanced by using ChatGPT4 with the Wolfram Alpha plugin (Wolfram Alpha enables computation of maths- and science-type questions based on its structured and curated archive of data, algorithms and methods).

Often the key decisions about AI experiments focus not just on what type of questions to ask but which tools to use and, sometimes, which vendors to partner with in such a fast-moving marketplace with so many new entrants.

The final thought on experimenting with AI is that the scope and intensity of your AI experiments ought to be proportional to AI’s likely impact on your future success as an organisation. One executive who took part in Prof. Lynda Gratton’s survey on the impact of generative AI on the workplace admitted, “We have created a head of generative AI with a role simply to moderate and make sense of the hundreds of experiments we have running on any day.”

_____________________

Strategy snippets you might have missed

Core Values from Brave New Work
Eight months on from the publication of my Core Values book, I’m still collecting the wisdom of others on how they make sense of, and work with, organisational values. Brave New Work’s newsletter from last week was all about values, and it includes this advice on translating values into actions:

Translating Values Into Action Gets Us…
FROM empty words thrown together for PR efforts TO principles that help steer strategic work and hold colleagues accountable to one another.
FROM the same-old motivational poster values every company uses TO identifying and emphasizing what’s most important and unique to us.
FROM diminished confidence, togetherness, and engagement TO boosted trust, motivation, and clarity.

Complexity from brevity – a strategy development error
A rant from me … How many strategies do we read with sentences like this:

“We will focus relentlessly on customer engagement, keen pricing and excellent customer service, in order to drive up customer acquisition, maintain our market-leading sales conversion and maximise customer retention.”

That’s three causes and three effects. Together that’s nine possible interactions (focus on engagement for acquisition, for conversion and for retention; focus on pricing for acquisition, for conversion and for retention; etc.) and most likely nine strategic KPIs. Really? From a single 29-word sentence? Or is this merely a feel-good declaration that avoids saying what the strategy really ought to be? What is it we really need? Is it improved acquisition, conversion or retention? Surely one is more important than the others. And what is our preferred route to achieving it? Do we need to get better at engaging our customers, at serving our customers, or do we need to shake up our prices? Surely, it’s not all three!

_____________________

Strategy Distilled – FREE pdf compilation

In case you missed it, I have published a compilation of all my articles and ‘strategy snippets’ from the first two years of my Strategy Distilled newsletter. This 88-page pdf contains 24 articles and 53 ‘snippets’, all with links to source material, covering all aspects of strategy, from ‘how to think big in strategy’ and strategy innovation to strategy metrics and the role of values.

STRATEGY ARTICLES – examples:

  • How strategy actually works
  • Separating strategy from strategic planning
  • The case for strategy scoping
  • Justifying stakeholder consultation in strategy development
  • Managing innovation within strategy
  • Can Ikigai reveal the four deficiencies of strategy?
  • Leading and lagging indicators as strategy metrics

SNIPPETS ON STRATEGY YOU MIGHT HAVE MISSED – examples:

  • Google’s ‘Simplicity Sprint’
  • How risky is innovation in your organisation?
  • McKinsey’s Seven Step Problem-Solving Process
  • Connecting strategy and culture
  • What the Beatles can teach us about strategy
  • What kind of strategist are you?

The pdf is FREE to subscribers – download your copy now. If you think ‘Strategy Distilled’ would be of value to friends and colleagues, get them to sign up here and they will be sent the compilation straight away.

_____________________

Goal Atlas gives you structured processes and tools to ensure strategy is adopted and impactful across your organisation. Get in touch if you think we might be able to help.

_____________________

If you enjoyed reading this newsletter, don’t forget to forward it to friends or colleagues who might also find it of interest.

Was this forwarded to you? Sign up

Discover the content of past issues of Strategy Distilled.
