
Assess your organisation's readiness for AI projects

operations · beginner · proven

The problem

You've found an AI use case that makes sense technically, but you're not sure whether your organisation can actually implement it. AI projects often fail not because the technology doesn't work, but because the organisation wasn't ready: data was messier than expected, staff didn't adopt the tool, no one was assigned to maintain it, or leadership support evaporated after the initial excitement.

The solution

Run a structured readiness assessment covering six dimensions: data readiness, technical capacity, staff readiness, leadership support, sustainability, and ethical/privacy readiness. Score each honestly, identify gaps, and either address them before starting or choose a simpler starting point. This prevents wasted investment and builds a realistic project plan.

What you get

A readiness scorecard showing where you're strong and where you have gaps. For each gap, you'll have specific actions to address it or a recommendation to start with a smaller project. This can be shared with funders or trustees to demonstrate due diligence and realistic planning.

Before you start

  • A specific AI use case you want to assess readiness for
  • Access to people who know about your data, IT, and operations
  • Willingness to be honest about gaps

When to use this

  • Before committing to an AI project
  • When writing a funding bid that includes AI
  • After a failed AI pilot to understand what went wrong
  • When leadership asks "are we ready for AI?"

When not to use this

  • You're just experimenting with free tools (no assessment needed)
  • The project is already underway and can't be changed
  • You're assessing AI in general rather than a specific use case

Steps

  1. Assess data readiness (1-5)

    Score your data situation. 5 = Clean, accessible data in a system you control, with clear ownership and documentation. 1 = Data scattered across spreadsheets, inconsistent formats, significant quality issues, unclear who owns it. Key questions: Where is the data? How clean is it? Can you export it? Who maintains it?

  2. Assess technical capacity (1-5)

    Score your ability to implement and maintain. 5 = In-house technical staff comfortable with APIs and data, or budget for ongoing external support. 1 = No technical capacity, reliant on volunteers, no budget for external help. Key questions: Who would build this? Who maintains it when they leave? What happens when it breaks?

  3. Assess staff readiness (1-5)

    Score whether staff will actually use it. 5 = Staff are asking for this solution, involved in design, have time to learn it. 1 = Staff are overwhelmed, resistant to change, weren't consulted. Key questions: Do staff see the problem? Were they involved in choosing this solution? Do they have time to adopt it?

  4. Assess leadership support (1-5)

    Score backing from decision-makers. 5 = CEO/trustees actively championing, budget allocated, willing to accept initial failures. 1 = Grudging approval, no budget, expectations of instant ROI. Key questions: Is this a leadership priority? Will they stick with it through teething problems? Is there real budget?

  5. Assess sustainability (1-5)

    Score long-term viability. 5 = Clear ongoing budget, identified maintainer, fits into existing workflows. 1 = Project funding only, no maintenance plan, bolted onto existing processes awkwardly. Key questions: What happens after the project ends? Who pays for ongoing costs? Who fixes it in 2 years?

  6. Assess ethical and privacy readiness (1-5)

    Score GDPR and ethics preparedness. 5 = Data protection impact assessment done (if needed), consent covers AI use, bias risks considered, beneficiary impact evaluated. 1 = No GDPR consideration, unclear consent basis, potential bias unexamined. Key questions: Was data collected with consent for this use? Have you considered bias risks? Would beneficiaries be comfortable with this use of their data?

  7. Identify critical gaps

    Any dimension scoring 2 or below is a critical gap that will likely cause failure. List these and decide: Can you address them before starting? Should you start with a smaller project? Is this use case wrong for your organisation right now?

  8. Create an action plan

    For each gap, specify what needs to happen to address it: training, data cleaning, stakeholder engagement, budget approval, hiring. Be realistic about timelines. These become prerequisites for your AI project.

  9. Choose your starting point

    Based on your scores, decide: proceed with the full project, start with a limited pilot, address prerequisites first, or choose a different use case that better matches your current readiness. Document this decision.
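The scoring steps above amount to a simple procedure: score each of the six dimensions 1-5, total them, and flag any dimension at 2 or below as a critical gap. A minimal sketch in Python (the dimension names and the gap threshold come from this recipe; the `assess` helper and the example scores, including the ethical/privacy score, are illustrative assumptions):

```python
# Readiness scorecard sketch: six dimensions, each scored 1-5.
# A score of 2 or below is treated as a critical gap, per step 7.

DIMENSIONS = [
    "Data readiness",
    "Technical capacity",
    "Staff readiness",
    "Leadership support",
    "Sustainability",
    "Ethical/privacy readiness",
]

CRITICAL_THRESHOLD = 2  # scores at or below this likely cause failure

def assess(scores: dict[str, int]) -> dict:
    """Validate scores, total them, and list critical gaps."""
    for name in DIMENSIONS:
        score = scores.get(name)
        if score is None:
            raise ValueError(f"Missing score for: {name}")
        if not 1 <= score <= 5:
            raise ValueError(f"{name} must be scored 1-5, got {score}")
    gaps = [name for name in DIMENSIONS if scores[name] <= CRITICAL_THRESHOLD]
    return {
        "total": sum(scores[name] for name in DIMENSIONS),
        "out_of": 5 * len(DIMENSIONS),
        "critical_gaps": gaps,
    }

# Illustrative scores loosely based on the donor-prediction example below
# (the ethical/privacy score is assumed for the sketch).
example = {
    "Data readiness": 3,
    "Technical capacity": 2,
    "Staff readiness": 4,
    "Leadership support": 4,
    "Sustainability": 2,
    "Ethical/privacy readiness": 3,
}
print(assess(example))
# {'total': 18, 'out_of': 30, 'critical_gaps': ['Technical capacity', 'Sustainability']}
```

In practice a spreadsheet does the same job; the point is that the gap threshold, not the total, drives the decision in step 9.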

Example code

Example readiness scorecard

A completed assessment for a donor prediction project.

# AI Readiness Assessment: Predict Donor Lapse

## Scores

| Dimension | Score | Notes |
|-----------|-------|-------|
| Data readiness | 3 | Data in CRM but 40% missing email. Need cleanup first. |
| Technical capacity | 2 | No in-house skills. Would need external help. |
| Staff readiness | 4 | Fundraising team keen, involved in scoping. |
| Leadership support | 4 | CEO supportive, board approved pilot budget. |
| Sustainability | 2 | Project funding only, no ongoing maintenance plan. |
| Ethical/privacy readiness | 3 | Consent covers fundraising use; DPIA not yet done. |

**Total: 18/30**

## Critical Gaps
1. Technical capacity (2): No one to build or maintain
2. Sustainability (2): No plan beyond pilot

## Recommended Actions
1. Budget for external implementation AND ongoing support
2. Identify internal "owner" even if external builds it
3. Include 12 months' maintenance in project budget
4. Start with simpler version: segment analysis rather than prediction

## Decision
**Proceed with modified scope**: Start with donor segmentation (proven, simpler)
rather than predictive model. Reassess prediction after building internal comfort
with data-driven fundraising.

Tools

  • Assessment template (spreadsheet) — platform · free · open source
  • Claude or ChatGPT — service · freemium

At a glance

Time to implement: hours
Setup cost: free
Ongoing cost: free
Cost trend: stable
Organisation size: micro, small, medium, large
Target audience: CEO/trustees, operations manager, IT/technical

The assessment itself is free. Addressing gaps may have costs.
