If your recent leadership meetings feel strangely familiar – polished yet generic, comprehensive yet somehow shallow – you’re probably not alone.
A shift in how organizations approach strategic thinking is changing the way teams leverage their expertise to solve problems (Anthropic). Discussions that once drew on deep institutional knowledge and hard-won experience now feel interchangeable. Solutions sound sophisticated, but they lack a nuanced understanding of your context, community, and organizational history.
Whether it’s the pressure to deliver, the appeal of commoditized ‘thinking,’ or simply the changing nature of problem-solving in the age of AI, the risk is real: strategic thinking is becoming less distinctive – less rooted in unique wisdom and lived experience. The implications only grow as emerging leaders develop in this new environment.
When the Emperor Has No Clothes
From hiring practices to education and professional development, AI-first thinking is producing measurable consequences across multiple domains.
Polluting the Candidate Pool
Globally, more candidates are using tools like ChatGPT to exaggerate their expertise in live remote interviews, only for a completely different skillset to show up on day one. A Resume Builder survey found that 45% of respondents said they exaggerated their skills with AI tools throughout the hiring process (resume -> interview).
To make matters worse, bad actors now use generative AI to flood the candidate pool with fake resumes, using real, stolen identities (Computerworld). This isn’t just about individual dishonesty. By 2028, a quarter of job candidates could be fake (Gartner).
While likely more prevalent in technology industries, it won’t be long before community organizations, already struggling with staffing challenges, face the additional burden of distinguishing genuine expertise from brittle skills masked by AI polish.
Eroding Critical Thinking
A recent MIT study measured participants’ brain activity while they wrote essays. Users who leveraged LLMs showed significantly reduced neural activity patterns, reflecting weaker cognitive engagement than individuals who relied on their own knowledge or traditional search methods (MIT). Most alarming, “your brain on ChatGPT” not only engages less meaningfully during content creation but is also significantly impaired when recalling content written just minutes earlier.
Harvard professor Alex Green warns that unchecked AI in the classroom could erode critical thinking and, ultimately, job prospects. When AI becomes a crutch for tasks like writing and problem-solving, students (or really anyone) are deprived of the mental rigor needed to build and maintain foundational critical-thinking and problem-solving abilities.
Fish Forewarnings
A striking parallel comes from marine biology. Researchers at the Norway Institute of Marine Research discovered that the world’s largest stock of herring – the Norwegian spring-spawning herring – underwent an abrupt 800 km shift in spawning grounds due to a collapse in collective memory caused by overfishing of older, more experienced fish. In the absence of that collective wisdom, younger herring invented new migratory paths that led them to colder, more inhospitable waters. In just one generation, centuries of accumulated wisdom vanished (Nature).
Organizationally, the parallel is clear: the capacity to perform tasks remains, but the accumulated wisdom of why certain approaches work while others fail disappears with the experts who developed it.
Vanishing Institutional Knowledge
Employee turnover costs more than just the price tag of replacement. It brings real losses in context, corporate fluency, and the skills needed to navigate an ever-growing matrixed environment. You start from scratch, and the impact is felt systemically.
You may think your organization doesn’t have an active turnover problem. However, when experienced staff increasingly rely on AI to replace strategic thinking, and newer employees develop subpar individual skills in an AI-first environment, institutional knowledge fails to transfer and actively degrades – effectively creating experience turnover. This should be of concern for every organization that is intent on remaining viable for decades to come.
Battle Scars Matter
In a recent interview, Simon Sinek reflected on his own development as a writer and thinker and gave a visceral example of why the struggle matters so much. “When we think about AI, we only think about the destination. We only think about the output, we never think about the input. […] The excruciating pain of organizing ideas, putting them in linear fashion […], that excruciating journey is what made me grow.” (Sinek) The journey of wrestling with ideas and pushing through difficulties created the important skills (pattern recognition, critical thinking). The book was just an artifact.
This runs counter to our performance-obsessed culture, but it points to a fundamental truth: competence isn’t built through output, but through the experience of successes and failures alike.
Equilibrium is Death
In his final letter to Amazon shareholders, Jeff Bezos paraphrased Richard Dawkins to provide a metaphor for staying successful. “Staving off death is a thing you have to work at. Left to itself – and that is what it is when it dies – the body tends to revert to a state of equilibrium with its environment. […] The world wants you to be typical – in a thousand ways, it pulls you. Don’t let it happen.” (Bezos)
For organizations, this equilibrium manifests as a gradual loss of competitive advantage. When everyone has access to the same AI tools and output becomes generic, we risk settling into a comfortable equilibrium – one that feels productive but lacks the distinctiveness that comes from genuine engagement.
Take FAANG’s Word For It
Even at the forefront of technology adoption, the most sophisticated teams reflect on the distinction between strategic thought and tactical execution. A FAANG engineer recently described their team’s approach to AI as a force multiplier, not a replacement for solid design and architecture (reddit). Extensive design and technical reviews remain important precursors to anything landing in AI-assisted development. AI handles the “hands on the keyboard,” but humans remain firmly in control of strategy, design, and critical thought.
Protecting Your Sticks & Bricks
At this point, you might be thinking I’m advocating against AI. I am not! In fact, I placed my bets on the idea that AI will transform how organizations leverage advanced intelligence to make better decisions. The cure isn’t about rejecting AI. It’s about ensuring we don’t let go of what former CIA operative Andrew Bustamante calls “sticks & bricks.” (Bustamante) When advanced technology and sophisticated intelligence operations failed to deliver results, the CIA found success by going back to building strong fundamentals: no drones, no satellites, just good old-fashioned espionage and strategic thinking – their sticks & bricks.
For organizations today, this means deliberately protecting and strengthening the foundational elements that make each organization unique in their strategic thinking – the messy, imperfect, and deeply human processes that create breakthroughs and lasting competitive advantage.
Avoiding the Sinking Skyscraper
San Francisco’s Millennium Tower, a glossy 58-story luxury skyscraper, has been sinking and tilting since opening in 2009. Despite costly attempted fixes, the building continues to sink in unexpected ways. What at launch looked like a gleaming example of human engineering now stands as a cautionary tale about the consequences of not building on solid ground – sophisticated heights masking foundational instability.
Organizations experiencing AI-dependent strategic thinking face similar concerns. Are they 10 steps ahead, or are they climbing an escalator that is going down? And is the loss of institutional wisdom preventing them from building on stable ground?
Fundamentals That Matter
Lived Experiences. In environments that embrace fast failures and give failures the same credence as successes, collective wisdom flourishes. A shared vision, objectives, and tactical approaches are the result of the irreplaceable process of wrestling with real problems and learning from the collective brain trust.
The Shared Vision. People’s core motivations typically connect deeply with a true organizational vision. Passionate, genuine, impactful talent is most often driven not by money but by the idea that their work is meaningful and contributes to something bigger than themselves.
Sitting With Complexity. Organizations must maintain the capacity to truly unpack problems (and opportunities) to their core. This requires the patience to sit with complexity and resist the pull toward quick, premature solutions that solve symptoms, not causes.
Seeing Past the Horizon. Strategic thinking demands the willingness to look over the horizon and the patience to navigate toward true north stars, not just the next quarterly milestone. This ability to hold uncertainty while making directional progress still seems to be uniquely human.
When Your Canary Stops Singing
If your leadership meetings feel eerie – polished but lacking depth – or your teams agree too easily, too often, that’s probably a sign you should step back and assess. Conflict often seeds innovation, and conflict arises from the diverse experiences each person brings to the table. If the puzzle pieces start looking more like smooth rectangles, your problem-solving has lost the nuance high-performing organizations require.
Declining employee engagement – particularly engagement with each other – can signal dangerous AI over-dependence. When people collaborate less with colleagues, when cross-functional work drops and hallway conversations decrease, your strategic capacity is eroding.
So What?
Vigilance Over Prohibition. Adoption of AI tools, whether company-endorsed or shadow, will only continue to increase. Banning them would be a mistake. Instead, developing organizational awareness of AI capabilities gives leaders and staff the ability to recognize where AI is effective and where humans retain primacy. Keep employees AI-savvy enough to know where the line falls between assistance and dependence.
Engagement as Early Warning. Monitor more than productivity metrics. The quality of collaboration, the intensity of debate and discovery, and people’s ability to solve problems together are important measures (albeit more qualitative ones). Are conversations getting richer, or more superficial?
Not all motion is forward motion. In an age where AI can make all of us more effective and sophisticated in the types of problems we can solve, our competitive advantage will come from our distinctly human capacity for strategic wisdom that no algorithm has been able to replicate. Your sticks and bricks aren’t primitive tools – they keep your organizational tower standing while others build on quicksand.