01 What to assess
An AI readiness assessment should cover six dimensions:
Strategy and leadership: Is there a clear AI strategy with specific objectives, measurement criteria, and executive ownership? Does the board have an adequate understanding of AI risk and governance requirements? Is there executive sponsorship with the authority to make the commitments that transformation requires?
Data: What is the quality, accessibility, and governance of the data that AI will use? Where are the significant data quality gaps that will limit AI value? What data governance investments are needed before or alongside AI deployment?
Technology and infrastructure: What is the current state of the Microsoft 365 or equivalent environment? What integration complexity exists between AI tools and core business systems? What security and compliance considerations affect AI deployment?
Workforce capability: What is the current level of AI literacy across the organisation? Where are the highest-potential and highest-priority populations for AI capability development? What are the main barriers to AI adoption that change management must address?
Governance and risk: Does an AI governance framework exist? If so, is it fit for purpose? What regulatory and compliance requirements affect AI deployment in the organisation's sector?
Change management readiness: What is the organisation's change capacity? What is the current level of trust between leadership and workforce? What change fatigue exists from previous programmes?
02 Assessment methods
A comprehensive AI readiness assessment uses multiple methods:
Leadership interviews (five to six key leaders): structured 45-minute conversations with the CEO, CIO, CHRO, CFO, and one or two major business function leaders. These interviews gather the leadership-level perspective on AI strategy, governance, and change readiness.
Workforce survey (quantitative): a 15-question survey measuring current AI awareness and use, attitude towards AI adoption, specific concerns about AI, confidence in leadership's AI communication, and perceived organisational readiness for AI. The survey should reach a statistically representative sample, minimum 200 respondents in an organisation of 1,000 or more.
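As a rough check on the "statistically representative" claim, the margin of error implied by a given sample size can be estimated with the standard worst-case formula plus a finite population correction. This sketch assumes simple random sampling and a 95% confidence level; it is an illustration, not part of the assessment methodology itself.

```python
import math

def margin_of_error(n: int, population: int, z: float = 1.96) -> float:
    """Approximate margin of error for a survey proportion, assuming simple
    random sampling at worst-case variance (p = 0.5) and applying a finite
    population correction. z = 1.96 corresponds to 95% confidence."""
    se = math.sqrt(0.25 / n)                              # worst-case standard error
    fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
    return z * se * fpc

# 200 respondents in an organisation of 1,000 gives roughly a +/-6% margin
# of error at 95% confidence, which is workable for directional findings.
print(round(margin_of_error(200, 1000) * 100, 1))
```

In practice, representativeness also depends on who responds (function, seniority, location), so quota or stratified sampling matters as much as the headline count.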
Technology audit: a review of the current Microsoft 365 environment, data quality and governance, security posture, and integration landscape relevant to planned AI deployments. This is typically led by IT with external support where specialist skills are not available internally.
Workshops with frontline managers (two or three sessions): facilitated discussions with the managers who will be asked to champion AI adoption in their teams. These sessions reveal the practical on-the-ground readiness picture that surveys and leadership interviews may miss.
03 Interpreting the assessment
The readiness assessment produces a profile across the six dimensions, typically rated on a four-level maturity scale: foundations not established, foundations in place, developing, and mature.
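A readiness profile of this shape is simple to represent: one rating per dimension on the four-level scale. The sketch below is a minimal illustration; the dimension keys and the example ratings are assumptions, not prescribed by the assessment framework.

```python
from enum import IntEnum

class Maturity(IntEnum):
    """Four-level maturity scale used by the assessment."""
    FOUNDATIONS_NOT_ESTABLISHED = 0
    FOUNDATIONS_IN_PLACE = 1
    DEVELOPING = 2
    MATURE = 3

# Example profile across the six dimensions (ratings are illustrative).
profile = {
    "strategy_and_leadership": Maturity.MATURE,
    "data": Maturity.FOUNDATIONS_NOT_ESTABLISHED,
    "technology_and_infrastructure": Maturity.DEVELOPING,
    "workforce_capability": Maturity.FOUNDATIONS_IN_PLACE,
    "governance_and_risk": Maturity.FOUNDATIONS_IN_PLACE,
    "change_management_readiness": Maturity.DEVELOPING,
}

# The weakest dimensions drive near-term investment priorities.
weakest = sorted(profile, key=profile.get)[:2]
print(weakest)  # ['data', 'workforce_capability']
```

Ranking the dimensions this way makes the sequencing conversation concrete: the lowest-rated dimensions are candidates for pre-deployment investment rather than reasons to halt.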
The interpretation must identify not just the overall maturity level but the sequence implications. A readiness assessment may reveal that an organisation has strong technology infrastructure and a clear AI strategy but weak data governance and low leadership AI literacy. The implication is not to delay AI deployment but to prioritise the data governance and leadership development investments in the near term before broad deployment.
Common readiness patterns and their implications:
High strategy clarity, low workforce readiness: deploy to champions in the near term while building workforce capability for broader rollout. Do not delay deployment waiting for full workforce readiness; use targeted deployment to build the internal case studies that accelerate broader adoption.
Low governance maturity, high technical readiness: establish governance framework before broad deployment. Deploying at scale without governance in regulated sectors creates regulatory and reputational risk that is more expensive to manage retrospectively.
High change fatigue, high technology readiness: design AI deployment to be additive (making existing workflows easier) rather than transformative (requiring significant workflow changes). Save the transformative applications for a later phase when change capacity is higher.
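The three patterns above can be encoded as simple decision rules over a profile. This is a hedged sketch: the dimension names, the "change_fatigue" key, and the high/low threshold (developing or above counts as high) are assumptions chosen for illustration.

```python
def sequencing_advice(profile: dict) -> list[str]:
    """Map common readiness patterns to sequencing implications.
    Ratings use the four-level scale (0-3); >= 2 counts as 'high'."""
    high = lambda dim: profile.get(dim, 0) >= 2
    advice = []
    if high("strategy_and_leadership") and not high("workforce_capability"):
        advice.append("deploy to champions now; build workforce capability in parallel")
    if high("technology_and_infrastructure") and not high("governance_and_risk"):
        advice.append("establish the governance framework before broad deployment")
    if high("technology_and_infrastructure") and high("change_fatigue"):
        advice.append("design additive first deployments; defer transformative change")
    return advice

# Example: strong strategy and technology, weak workforce and governance,
# high change fatigue -- all three patterns fire.
print(sequencing_advice({
    "strategy_and_leadership": 3,
    "workforce_capability": 1,
    "technology_and_infrastructure": 3,
    "governance_and_risk": 1,
    "change_fatigue": 3,
}))
```

Real interpretation requires judgment about interactions between dimensions, but explicit rules like these make the assessment's sequencing logic auditable rather than implicit.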
04 Using the assessment output
The readiness assessment output should directly inform three decisions:
Programme sequencing: which AI deployments proceed in which order, based on readiness profile rather than technology availability or business unit enthusiasm.
Investment priorities: where the pre-deployment investment should be focused before the main programme begins. Data governance, leadership AI literacy, and governance framework development are the most common pre-deployment priorities identified in assessments.
Change management design: the specific change management investments required given the readiness profile. High change fatigue requires different change management design from low change fatigue; low leadership AI literacy requires different approaches from high literacy.
The assessment should be refreshed annually and at the start of each new significant AI deployment phase. Readiness profiles change as programmes progress; a 12-month-old readiness assessment may be significantly out of date in a fast-moving AI environment.
Key Takeaways
1. An AI readiness assessment covers six dimensions: strategy and leadership, data quality and governance, technology and infrastructure, workforce capability, governance and risk, and change management readiness.
2. Effective assessment methods combine leadership interviews, a quantitative workforce survey (minimum 200 respondents), a technology audit, and workshops with frontline managers.
3. The assessment must identify sequence implications, not just maturity levels; high readiness in some dimensions and low in others informs what needs investment before broad deployment, not whether to proceed.
4. Three common readiness patterns: high strategy/low workforce (deploy to champions, build capability in parallel); low governance/high technical (establish governance before scale); high change fatigue/high technical (design additive rather than transformative first deployments).
5. Refresh the assessment annually and at the start of each new deployment phase; readiness profiles change as programmes progress, and a stale assessment leads to misinformed programme decisions.
References & Further Reading
- [1] Microsoft: AI Adoption Framework