Module 4: Faculty Policy – Privacy, Data, and Ethical Considerations
CurrikiStudio
Module 4 of 15 · Duration: 7–9 minutes

Privacy, Data, & Ethical AI Use

Designing an Effective AI Policy for Grades 6–12. Understanding compliance obligations, data “Never-Lists,” and the risks of shadow AI.

Learning Outcomes

Identify data categories never to enter into unapproved AI tools.

Align AI use with FERPA, COPPA, and state privacy laws.

Differentiate between vetted tools and “Shadow AI.”

Draft practical faculty-facing data safety policy language.

“A teacher may believe they are simply saving time… but pasting a student email or behavior note into a free AI tool can become a privacy and compliance issue very quickly.”

The “Never Enter” List

If information is treated as confidential in your SIS or cumulative file, it should never be pasted into a consumer AI tool.

Student Identifiers

Names, ID numbers, birthdays, and contact information.

Academic Records

Grades, transcripts, attendance, and specific assignments tied to names.

Sensitive Supports

IEPs, 504 plans, accommodation details, and counseling notes.

Behavior & Discipline

Referrals, behavior logs, and incident narratives.

Staff Records

Evaluations, internal HR matters, and confidential school records.

School Security

Incident reports, legal documents, and internal security protocols.

Legal & Regulatory Compliance

FERPA & COPPA

AI tools that collect personal info from students under 13 (COPPA) or handle educational records (FERPA) must be strictly vetted for data handling and model-training practices.

Institutional Responsibility

Even if a tool is “free” or helpful, the school remains legally responsible for how that student data is processed and stored by third parties.

Beware “Shadow AI”

Shadow AI refers to tools staff use informally without district approval, contracts, or privacy review. These consumer tools often retain prompts to train their models, creating an invisible data leak.

Policy Recommendation: Clearly define which tools are “Approved” and state that unapproved tools may never handle confidential data.

Five Pillars of Ethical Faculty AI Use

Data Privacy
Transparency
Bias Awareness
Human Oversight
Integrity

“Legal compliance is the floor. Ethical use is the standard.”

Scenario Analysis

Privacy Violation

The IEP Copy-Paste

A teacher pastes student IEP text into a free chatbot. The sensitive data is now stored on external servers and potentially used for model training.

Ethical Risk (Bias)

Behavior Summaries

A tool summarizes behavior referrals and describes a student as “defiant,” ignoring context and recommending escalated consequences.

Shadow AI Risk

The Browser Extension

A teacher uses an unvetted extension that syncs every student email and draft to an unknown third-party server.

Safer Practice

The Clean Prompt Test

A teacher generates science questions using an approved tool with no identifiable student information included in the prompt.
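The Clean Prompt Test above can be made concrete with a pre-submission check. The sketch below is a hypothetical illustration only: the `PII_PATTERNS` dictionary and `clean_prompt_check` helper are invented for this example, and the regular expressions catch only a few obvious identifier formats. A district-grade filter would need far more (names, in particular, cannot be reliably detected with simple patterns), so this is a teaching aid, not a compliance tool.

```python
import re

# Hypothetical example patterns -- NOT an exhaustive or district-approved
# PII filter. Real identifiers (especially names) need human review.
PII_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "student ID":    re.compile(r"\b(?:ID|id)[#:\s]*\d{4,}\b"),
    "date of birth": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def clean_prompt_check(prompt: str) -> list[str]:
    """Return the PII categories detected in a draft prompt."""
    return [label for label, pattern in PII_PATTERNS.items()
            if pattern.search(prompt)]

# A prompt that fails the test: it carries an ID number and a birth date.
risky = "Summarize behavior for Jordan, ID# 482913, born 3/14/2011."
# A prompt that passes: task-focused, no student information.
safe = "Write five 8th-grade science questions about photosynthesis."

print(clean_prompt_check(risky))  # → ['student ID', 'date of birth']
print(clean_prompt_check(safe))   # → []
```

The design point mirrors the policy: the prompt that passes describes the task (grade level, subject, question count) rather than the student, so nothing confidential ever leaves the building.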

Capstone Milestone 04

Draft Your “Never-Enter” List

Identify at least 4 categories of student, staff, or school data that should be strictly prohibited in free or unapproved AI tools. Briefly explain the risk for each.