Module 5: Infrastructure Matters – Tools & Access
CurrikiStudio
Module 5 of 15 | Duration: 7–9 minutes

Infrastructure:
Tools & Access

Designing an Effective AI Policy for Grades 6–12. Learn why responsible AI policy requires a foundation of vetted tools, device equity, and managed filters.

Learning Outcomes

Distinguish between unmanaged and school-supported AI access.

Identify operational and equity risks of unguided AI implementation.

Evaluate system readiness across connectivity and platforms.

Draft infrastructure policy language for your specific school context.

“If your school says AI use is permitted, but you do not provide approved tools, device access, or filtering… you are outsourcing policy to chance.”

The Implementation Foundation

A written policy without implementation infrastructure is like a lab safety policy without goggles or supervision. Infrastructure ensures consistency and safety.

Managed Devices

Consistent bandwidth and device availability for all students.

Filtering & Controls

Age-appropriate access controls that align with safety requirements.

Approved Toolkits

Official platforms with vetted data privacy agreements (DPAs).

Accessibility

Translation and text-to-speech supports for diverse learners.

Access Models Comparison

Dimension     Unmanaged Access                    School-Supported Access
Tool Choice   Individuals choose public tools     District/school-approved tools
Privacy       Inconsistent or absent review       Vetted and documented review
Equity        Uneven by home resources            Consistent across all users
Training      Fragmented, self-taught             Standardized staff training

The Equity Warning

Equity is not just about whether students can technically reach a tool; it is about whether they can use it meaningfully.

The Digital Divide

Schools relying on “bring-your-own-tool” deepen the divide for students with limited internet or older home devices.

Uneven Opportunity

When some teachers use robust AI and others don’t, student opportunities become uneven across the same school building.

System Readiness Checklist

Do we have one or more approved tools for expected use cases?
Are approved tools accessible on school networks?
Do account structures align to student safety requirements?
Can students access AI equitably across classrooms?
Do our tools offer accessibility (translation/speech)?
Is there a process for vendor review and incident reporting?

Infrastructure Scenarios

Scenario A: The Gap

“Allowed, But Unsupported”

A district permits AI but provides no approved tools. Teachers use fragmented public platforms with varying privacy levels.

Risk: Inconsistency
Scenario B: Inequity

“The Homework Advantage”

A teacher encourages AI use at home. Students with paid tools and high connectivity outperform peers with limited access.

Risk: Digital Divide
Scenario C: Filters

“Blocked School, Open Home”

Public AI is blocked on campus but open at home. Staff assume the “problem” is solved while students use the tools unguided elsewhere.

Risk: False Security
Scenario D: Best Practice

“The Managed Pilot”

A district launches vetted staff and student accounts with training and device checks. Expectations are clear and manageable.

Outcome: Policy Alignment
Capstone Milestone 05

Define Your Infrastructure Policy

Can your school responsibly allow AI use without providing an approved or managed tool? In 3–5 sentences, explain your position, considering privacy, consistency, and equity.