AI Safety Audit

Full inventory of AI tools in use — official and unofficial. Risk assessment for each. Clear action plan for what to stop, what to keep, and what needs attention.

The problem

Teachers are using AI tools that haven't been vetted. Some enter student names directly into ChatGPT. Vendors won't sign privacy agreements. IT is drowning trying to keep up.

"They're not even thinking. They're just not thinking, 'Oh, I shouldn't put the name in there.'"
— Database Administrator, St. Louis private school

"Before AI it was cybersecurity and human attacks; now it's AI compromising my network."
— Director of IT, St. Louis private school

What you get

1. Full AI tool inventory

  • What's officially approved
  • What teachers are using anyway ("shadow AI")
  • What students are exposed to through apps, browsers, and Google Workspace

2. Risk assessment for each tool

  • Did they sign a Student Data Privacy Agreement?
  • What data are they collecting?
  • Is the tool age-appropriate?
  • Red / Yellow / Green rating for each

3. Action list

  • Which tools to stop using immediately
  • Which vendors to contact for agreements
  • Which tools are safe to keep
  • Template email for teacher communication

4. Leadership briefing

A 45-minute walkthrough of the findings: the top 5 risks, ranked, with recommendations you can act on immediately.

Real example

A second-grade teacher wanted to use a popular AI reading app with her class. When IT checked, the vendor hadn't signed any privacy agreements and wouldn't commit to following student data laws. The tool was blocked, and the school was protected.

Who this is for

Schools worried about liability, data exposure, or compliance — who need someone to do the digging they don't have time for.

Timeline

2-3 weeks from kickoff to final briefing.

Ready to see what's really happening?

Schedule a call to discuss the AI Safety Audit for your school.