01 Creating the conditions for honest postmortems
The most common failure of AI transformation postmortems is that they produce a narrative that protects the reputations of the leaders involved rather than a genuine analysis of what went wrong.
Creating the conditions for honest retrospectives requires:
- psychological safety: an explicit commitment from leadership that the postmortem is a learning exercise, not a blame exercise;
- senior leader participation in the retrospective itself, rather than just receiving its output;
- protection for people who contributed to failures in good faith; and
- separation of the learning exercise from any performance management process.
The postmortem leader should ideally be someone with no accountability for the programme's outcomes: an internal function with independence (internal audit, an external consultant) or a senior leader from a different part of the business. Programme leaders conducting their own postmortems have structural incentives to present their decisions favourably, which reduces the quality of the analysis.
02 The postmortem framework
A structured AI transformation postmortem covers four analytical areas:
What was the original intent, and what actually happened? Document the original programme objectives, the committed timeline, the expected business outcomes, and the actual outcomes. This creates the factual foundation that subsequent analysis builds on. Disagreements at this stage about what was actually committed are common and important to surface; ambiguous original commitments are themselves a finding.
What were the critical decision points, and what information was available at each? Identify the moments where different decisions would likely have produced different outcomes. For each decision point: what was decided, what information was available, what alternatives were considered, and why was the chosen path selected? This analysis should be descriptive, not evaluative at this stage.
What systemic factors contributed to the outcome? Look beyond individual decisions to the organisational conditions that shaped them: governance design, investment levels, change management support, sponsor commitment, data quality, vendor relationship, regulatory environment. Systemic factors explain why the same individual decisions produce different outcomes in different organisations.
What specific changes to strategy, governance, and process would produce different outcomes? For each significant finding, define the specific change that would have improved the outcome. Vague lessons ('better communication', 'more realistic planning') are not actionable; specific lessons ('the governance framework should have required business sponsor sign-off before each deployment phase, not just at programme launch') are.
03 Common AI transformation failure patterns
Across UK AI transformation postmortems, several failure patterns appear consistently:
The technology-first failure: the programme started with a technology decision and never built a strong enough business case for the use cases. When value did not materialise, there was no clear understanding of why because the value hypothesis was never made explicit.
The governance gap: governance was kept lightweight at programme launch to enable speed; when the first significant AI incident occurred, the resulting remediation consumed the management attention that should have been driving adoption.
The sponsor withdrawal: the executive sponsor remained committed through the exciting early phase and withdrew attention as the programme entered the harder middle phase of sustained adoption. Without sustained sponsor attention, the programme lost the organisational priority that justified the change management investment.
The measurement vacuum: the programme ran for 12 to 18 months without producing credible business outcome evidence, and could not defend its investment case at the first major business review.
The change management shortfall: technology deployment happened on schedule, but change management was delivered only partially. The result was technology that was available but not used.
04 From postmortem to next attempt
The purpose of a postmortem is to enable a better second attempt. The translation from postmortem findings to programme design changes is the most valuable output of the exercise.
For each major finding, define: the specific programme design change that addresses it; the governance mechanism that will prevent the same pattern recurring; the measurement approach that will provide early warning if the pattern is re-emerging; and the accountability owner for ensuring the change is implemented.
The postmortem output should be reviewed by the board before the next AI programme is approved. This is not about punishing the previous programme; it is about ensuring that the board is approving a new programme that has demonstrably incorporated the lessons of the previous one. Boards that approve a second AI programme without understanding what caused the first one to fall short are providing neither good governance nor good support to the programme team.
Key Takeaways
- 1. Honest AI postmortems require psychological safety, senior leader participation rather than just output reception, independence of the postmortem leader, and separation from performance management.
- 2. The postmortem framework covers four areas: what was intended vs what happened, critical decision points and available information, systemic factors, and specific changes that would produce different outcomes.
- 3. Common AI transformation failure patterns: technology-first without an explicit business hypothesis, a governance gap opened by an incident, sponsor withdrawal in the middle phase, a measurement vacuum, and a change management shortfall.
- 4. Specific, actionable findings ('the governance framework should have required business sponsor sign-off before each phase') are the output that matters; vague lessons produce no change to the next programme.
- 5. The board should review postmortem findings before approving the next AI programme; boards that approve a second programme without understanding what caused the first to fall short provide neither good governance nor good support.