2026.16: The NFP Sector Is Using AI. Nobody Is Running It.
Happy Friday from windy and cold YYC!
This issue is a bit different…I feel like I’ve been saying that a lot lately?!
I've been spending time lately on NFP strategy through my partnership with the Osborne Group. Specifically, the technology and AI dimensions. And I keep finding the same thing: sector leaders who are smart, mission-driven, and genuinely worried about doing right by their communities, navigating a technology environment that's moving faster than their governance can keep pace.
This week I want to share three signals from that work. The numbers are Canadian. The risk is real. And the window to get ahead of it is shorter than most boards realize.
Let's break it down.
Signal:
Signal 1: The NFP Sector Is Using AI. Nobody Is Running It.
A January 2026 report from Imagine Canada and the Canadian Centre for Nonprofit Digital Resilience surveyed over 900 Canadian nonprofits. 80% said their organization is using AI in some form. Only 10% have a formal AI policy. 64% of those using AI have no policy at all, and aren't developing one.
That gap is where the risk lives. Not in the technology itself. In the absence of any decision about how it gets used, by whom, for what, and with whose data.
62% of respondents said they're aware of reputational risks. 60% flagged legal and ethical concerns. Most of them are doing nothing about it.
This is the pattern I see in the field: awareness without action. The sector knows the roof has a leak. Nobody's called the contractor yet.
Signal 2: The Sector Is Going It Alone.
Of those same organizations, only 9% have engaged an external consultant for AI support. Larger organizations (those with revenues over $10M) are more likely to seek outside help, but even in that group, only 27% have done so.
Meanwhile, the Ontario Nonprofit Network reports that 60% of small charities say they lack digital skills, lack a strategy, and struggle to fully adopt the tools they already have. Only 13% have dedicated tech staff and a roadmap.
The combination of those two data points is the problem. The organizations with the least internal capacity are also the least likely to bring in outside expertise. They're not ignoring the problem. They're stretched too thin to solve it.
Signal 3: The Barrier Isn't Money. It's Knowledge.
This one surprised me. In most nonprofit research, limited resources are the top challenge. Not here. The Imagine Canada report found that the biggest barriers to AI adoption and growth are uncertainty and limited hands-on experience, not finances.
Funding matters when scaling, but it's not what's keeping people stuck at the starting line. What's keeping them stuck is not knowing where to start, not trusting what they know, and not having anyone in the room who's done it before.
That's a different problem than budget. And it calls for a different solution.
Scale:
Scale 1: Run a Data Audit Before You Add Anything New
Before an NFP adds another tool, the question to answer is: what data do we already hold, who has access to it, and what would happen if it walked out the door?
Start with a one-page data map. List every platform that touches donor, client, or staff information. Note who owns each one, who has admin access, and when access was last reviewed. That exercise alone will surface problems worth fixing.
Canada's privacy law is being replaced. The expected successor to PIPEDA carries penalties up to the greater of $25M or 5% of global revenue. That's not theoretical risk. That's board-level exposure for an organization that's never thought about it.
One process. One hour. More clarity than most organizations have had in years.
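For organizations that keep this inventory in a spreadsheet, even a few lines of scripting can flag entries overdue for an access review. The sketch below is a minimal illustration, not a tool recommendation; the platform names, review interval, and `stale_entries` helper are all hypothetical.

```python
from datetime import date

# Hypothetical one-page data map: platform, owner, admin count, last access review.
data_map = [
    {"platform": "Donor CRM", "owner": "Development", "admins": 3, "last_review": date(2024, 1, 15)},
    {"platform": "Email list", "owner": "Comms", "admins": 5, "last_review": date(2022, 6, 1)},
    {"platform": "Case files", "owner": "Programs", "admins": 2, "last_review": date(2025, 9, 30)},
]

REVIEW_INTERVAL_DAYS = 365  # assumption: access reviewed at least yearly

def stale_entries(entries, today):
    """Return platforms whose last access review is older than the interval."""
    return [e["platform"] for e in entries
            if (today - e["last_review"]).days > REVIEW_INTERVAL_DAYS]

print(stale_entries(data_map, date(2026, 4, 17)))
# → ['Donor CRM', 'Email list']
```

The point isn't the script. It's that once the inventory exists in any structured form, "when was access last reviewed?" becomes a question the organization can answer in seconds instead of never.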
Scale 2: Get an Outside Perspective Before Someone Else Brings One In
The sector is navigating AI governance mostly by instinct. That works until it doesn't, and when it doesn't, the consequences tend to arrive in the form of a breach, a complaint, or a funder asking questions the organization can't answer.
External perspective doesn't have to mean a six-month engagement. A structured conversation with someone who's done this work across organizations, who knows what good looks like and what failure looks like, is often enough to reset the direction.
The 9% of nonprofits that have brought in outside support are applying AI to more areas, with more confidence, and with less exposure. That's not a coincidence.
Scale 3: Reframe the Question Your Board Is Asking
If your board is asking "should we have an AI strategy," it's already a step behind. 80% of organizations already have staff using AI tools. The strategy ship has sailed. The governance ship hasn't left the dock.
The better question is: which decisions do we need to make right now about how AI is used in this organization, and who owns each one?
That conversation takes 30 minutes with the right framing. It produces a short list of decisions, assigns owners, and turns an abstract risk into a workable agenda. That's the starting point, and it's within reach for any organization willing to have the meeting.
Deep Dive:
This week's full piece goes further on all three signals: the regulatory exposure building under Canada's pending privacy legislation, what the 9% of organizations with outside support are actually doing differently, and a framework for having the governance conversation with your board without it turning into a two-hour debate about tools.
Read: 80% of Canadian Nonprofits Are Using AI. 64% Have No Policy.
Thanks for reading!
I'm offering a free 30-minute call for NFP leaders, EDs, ops directors, and board members who want to work through what any of this means for their organization.
No pitch. No deck. Just a structured conversation about where you are, what's at risk, and what a reasonable next step looks like. Interested? Book your 30-minute call.
See you next Friday.
Best,
JT
PS - If someone forwarded this to you and you want it in your inbox directly, subscribe HERE.
And if one of these signals hit closer to home than the others, drop a comment and tell me which one. I read everything.