Using AI to Build a Charity — Governance Before Growth


The Assumption

Most people use AI to move faster.

  • faster content
  • faster code
  • faster delivery

That makes sense.

In my case, however, I used AI for something very different:

To slow things down — and get them right.

This is how I’m using AI to build a charity by putting governance before growth.

 

The Context

WhereWeLearn is not just a platform.

It is being established as a charity-led global education initiative designed to:

  • help people discover free learning
  • organise knowledge across the internet
  • support lifelong, curiosity-driven education

It is built on a simple idea:

The problem is not a lack of knowledge.
It is connection, navigation, and access.

 

The Constraint

Before anything else could happen — before growth, before visibility, before scaling — one thing had to be in place:

Governance.

Not as an afterthought.
Not as documentation for compliance.

But as a foundation.

Because once something is positioned as:

  • neutral
  • non-commercial
  • publicly beneficial

every decision that follows depends on:

  • trust
  • clarity
  • accountability

 

The Reality

Governance is rarely treated this way.

In most systems, it appears:

  • late
  • fragmented
  • reactive

Documents are created to:

  • satisfy regulators
  • support funding
  • reduce risk

But not to define the system itself.

 

The Intervention

Instead of treating governance as overhead, I approached it as a system.

And used AI — specifically ChatGPT — to:

  • design governance structures
  • generate documentation
  • test consistency across policies
  • refine language for clarity and neutrality

 

What AI Was Actually Used For

Not for decision-making.

Not for authority.

But as a tool to:

 

1. Structure Thinking

Governance requires:

  • consistent definitions
  • aligned principles
  • clear boundaries

AI helped:

  • identify gaps
  • challenge ambiguity
  • enforce consistency

 

2. Generate and Iterate Documentation

Instead of writing documents once and leaving them static, policies were:

  • generated
  • reviewed
  • refined
  • aligned across contexts

This included:

  • purpose statements
  • operational principles
  • communication boundaries
  • governance constraints

 

3. Test for Internal Consistency

One of the biggest risks in governance is contradiction.

AI allowed:

  • cross-referencing between documents
  • validation of language
  • detection of inconsistencies
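To make this concrete, the cross-referencing step can be sketched in code. This is a minimal, hypothetical illustration, not the actual process used: the file names and the "Term: definition" line convention are assumptions. It extracts defined terms from a set of policy texts and flags any term that two documents define differently.

```python
import re
from collections import defaultdict

def extract_definitions(text):
    """Pull 'Term: definition' lines out of a policy document.

    Assumes each defined term appears on its own line as 'Term: definition'.
    """
    defs = {}
    for line in text.splitlines():
        match = re.match(r"^([A-Z][\w ]+):\s+(.+)$", line.strip())
        if match:
            term, definition = match.groups()
            defs[term.strip()] = definition.strip()
    return defs

def find_inconsistencies(policies):
    """Return terms defined differently across documents.

    `policies` maps a document name to its text.
    """
    seen = defaultdict(dict)  # term -> {document: definition}
    for doc, text in policies.items():
        for term, definition in extract_definitions(text).items():
            seen[term][doc] = definition
    # Keep only terms whose definitions disagree between documents.
    return {
        term: docs
        for term, docs in seen.items()
        if len(set(docs.values())) > 1
    }

# Hypothetical sample documents (invented for illustration)
policies = {
    "purpose.txt": "Platform: a free index of learning resources",
    "comms.txt": "Platform: a marketplace for courses",
}
for term, docs in find_inconsistencies(policies).items():
    print(f"'{term}' is defined inconsistently in: {sorted(docs)}")
```

In practice the AI did this kind of comparison over prose, not structured lines, but the principle is the same: every term should have exactly one definition across the whole document set.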

 

The Output

The result was not just “documents”.

It was a coherent governance system, including:

  • a canonical narrative defining purpose and boundaries
  • communication rules separating roles and voices
  • clear definitions of what the platform is — and is not
  • structured guidance for humans and AI systems

 

All aligned around:

neutrality, access, and trust

 

What This Enabled

This approach changed how the platform could evolve.

 

1. Safe Decision-Making

Every new idea can now be tested against:

  • defined principles
  • established boundaries

2. Clear Role Separation

The system distinguishes between:

  • the charity (WhereWeLearn)
  • the infrastructure (LEAST Software)
  • the individual (Philip Lacey)

Each with:

  • different responsibilities
  • different communication styles
  • different levels of visibility

 

3. Controlled Growth

Instead of scaling first and correcting later, the system is designed to grow safely:

  • without introducing bias
  • without compromising trust

 

What Didn’t Happen

AI did not:

  • define the mission
  • replace judgement
  • make governance decisions

All decisions remained human.

AI was used to:

  • accelerate clarity
  • improve consistency
  • reduce friction in development

 

The Key Insight

AI is often positioned as a tool for growth.

In this case, its most valuable role was:

enabling constraint.

 

Why This Matters

In many systems, governance is seen as:

  • restrictive
  • bureaucratic
  • slowing progress

But in reality:

governance determines what kind of system you are allowed to build.

Without it:

  • growth is uncontrolled
  • decisions drift
  • trust erodes

With it:

  • systems remain aligned
  • behaviour is predictable
  • outcomes are defensible

 

What Comes Next

With governance in place, the platform can now:

  • complete formal incorporation
  • establish its initial structure
  • begin controlled activation

Importantly:

  • growth will not compromise neutrality
  • promotion will remain external
  • trust will remain central

 

Final Thought

AI didn’t build the charity.

It made it possible to define it properly before it grows.

And that changes everything.
