“Strategy really must focus on those purely human capabilities of synthesis, and judgment, and sense-making.”
– Ross Dawson

About Ross Dawson
Ross Dawson is a futurist, keynote speaker, strategy advisor, author, and host of the Amplifying Cognition podcast. He is Chairman of the Advanced Human Technologies group of companies and founder of the Humans + AI startup Informivity. He has delivered keynote speeches and strategy workshops in 33 countries and is the bestselling author of five books, most recently Thriving on Overload.
What you will learn
- How AI is reshaping strategic decision-making
- The accelerating need for flexible leadership
- Why trust is the new competitive advantage
- The balance between human insight and machine analysis
- Storytelling as the heart of effective strategy
- Building learning-driven, adaptive organizations
- The evolving role of leaders in an AI-first world
Transcript
Ross Dawson: This is a little bit of a different episode. Instead of an interview, I will be sharing a few thoughts in the context of now doubling down on the Humans Plus AI theme. Our community is kicking off its next level. As you may have noticed, the podcast has been rebranded Humans Plus AI, and it is now fully focused on this theme of how AI can augment humans: individuals, organizations, and society.
So what I want to share today is some of the thoughts which came out of Human Tech Week. I was fortunate to be at Human Tech Week in San Francisco a few weeks ago. I did the opening keynote on Infinite Potential: Humans Plus AI, and I’ll share some more thoughts on that another time.
But I also ran a lunch event, a panel with myself, John Hagel, and Charlene Li, talking about AI and the future of strategy. It was an amazing conversation, and I can't do it justice now, but what I want to do is share some of the high-level themes that came out of it, bringing, obviously, my own particular slant to them.
So we started off by thinking about how change generally, including AI, is impacting strategy and the strategy process. Fairly obviously, we have accelerating change. That means decision cycles are getting shorter, and strategy needs to move faster.
It also means that creation of all kinds can be democratized within, across, and beyond organizations, allowing people to innovate and act without being centralized. And this abundance of knowledge, coupled with a scarcity of insight, means that strategy really must focus on those purely human capabilities of synthesis, and judgment, and sense-making.
There's also the theme that institutional trust is eroding. This means that, more and more, strategy shifts to relationship-based models, ecosystem-based models.
And there is an overarching theme, which John Hagel in particular brought out: there is greater fear amongst leaders. There is greater emotional pressure, and these shrink the timeline of our thinking, forcing us toward shorter-term thinking. We act from fear of a whole variety of pressures from shareholders, stakeholders, politicians, and more.
We need to allow ourselves to move beyond the fear, as John's latest book The Journey Beyond Fear lays out (highly recommended), which then frees our strategic imagination and our ways of thinking.
So one of the core themes of the conversation was around: what are the relative roles of AI and humans in the strategy process? Humans are strategic thinkers by their very nature, and now we have AI which can support us and complement us in various ways.
Of course, AI is strong at working with data. It can do a lot of analysis. It is very capable at pattern recognition. It can move faster. It can simulate scenarios and futures and identify signals, so it can scale what can be done in strategy analysis and go deeper into it.
But this elevates the human role to the higher levels: the creativity, the imagination, the judgment, the ethical framing, the purpose, the vision, the values.
One of the key things which came out of it was storytelling: strategy is a story. It is not a whole array of KPIs and routes to reach them; that is only a small part of it. It is telling a story that engages people, that makes them passionate about what they want to do and how they are going to do it. That is their heroes' and heroines' journey.
So this insight, this sense-making, is still human.
There’s a wonderful quote from the session, saying, “AI without data is extremely stupid,” but even with the data, it can’t deliver the insight or the wisdom on its own. That is something where the human function resides.
And so we are still responsible for oversight and for the ethical nature of decisions, especially as we get more and more autonomous agents and very opaque systems. Accountability is fundamental to all leadership and to the nature of strategy.
So a leader's role is to bring together the ways in which we bring in AI: deciding when to trust it, deciding when to override it, and framing its contribution for leaders. That is an intrinsic part of strategy: the role of AI in what the organization does, how it functions, and how it establishes and communicates direction.
There was also a lot of discussion around the tensions. And again, John shared this wonderful frame he has been using for a while: "zoom out and zoom in." Essentially, he says that real leaders, the most successful organizations, have a compelling 10- or 20-year vision and plans for the next six to twelve months, and not much in between.
And so you can zoom out to see the massive scale: Why do we exist? What are we trying to create? But you can also zoom in to ask: What is it we are doing right now to create momentum and move towards that?
And so this dual framing is emotionally resonant. It shifts people from fear to hope by being able to see this aspiration and also seeing progress today.
And so there are these polarities that we manage in strategy. We are balancing focus with flexibility. We need coherence, a clear sense of where we are going and what we are doing, and we need to be able to focus our resources. This balance between flexibility, where we can adapt to situations, and continuity in moving forward is fundamental.
One of the fundamental themes which again came out of the conversation, and which comes back to some of my core themes from a very long time ago, is this idea of knowledge and trust.
So AI is widely accessible. Everyone has it in various guises. So where does competitive advantage reside? Fundamentally, it comes from trust: trust in the AI, trust in how the AI is used, trust in the intentions, and ultimately trust in the people who have shaped the systems and used them well.
So this means that as you create long-term, trust-based relationships, you gain more and more advantage. This comes back to my first book on knowledge-based client relationships, which I have extended and applied in quite a variety of domains, including my recent work on AI-driven business model innovation.
We are essentially saying that in an AI-driven world, trust in the systems means you can access more data and more insight from people and organizations, which you can apply in building a virtuous circle of differentiation: you add value, you gain trust, you gain insight from that, and that flows through into more value.
So ultimately, this is about passion. What John calls the passion of the explorer, where we are committed to learning and questioning and creating value.
So I suppose that, in a way, the key theme that ran through the entire conversation was learning, where learning is not about running workshops or taking bodies of knowledge and getting everybody to know them. It is about continuous exploration of the new. Every successful organization needs to enable the people inside it to be passionate about what they are learning, to explore, to learn from their exploration, and to share that, building the sustainable, scalable learning that a fast-moving world demands.
From that learning we can build a consistent strategy, one that enables us to both maintain direction and stay flexible and adaptable in an accelerating world.
So that just touches on some of the themes we discussed in the session, and I will continue to share and write more of what I call mini reports to frame some of these ideas.
But the reality is that the nature of strategy is changing. This means the nature of leadership is changing, and we need to understand and dig into that change: where AI plays a role, how it shifts human roles, how leadership changes.
Because these are fundamental to our success, not just as individual organizations, but also as industries and society at large. Because our strategies, of course, must support not just individual entities or organizations, but the entire ecosystems and communities and societies in which they are embedded.
So we'll come back. We've got some amazing guests coming up, so make sure to tune in for the next episodes.
Please continue to engage. Get onto Humans Plus AI, sign up for our newsletter, and we’ll see you on the journey.