FERPA, COPPA, and Beyond… Bridging the EdTech-Education Compliance Gap
I’ve been speaking a lot over the last month, with a focus on governance and helping organizations prepare for a future with AI.
One recent event - the AI for Good conference - underscored how much software and security measures have added to the budgets of schools, districts, colleges, and universities alike.
But are they aligned properly for maximum effect?
Creating a common language for privacy protection in educational technology - as defined by Texas
The educational technology (EdTech) sector has seen substantial changes in Terms of Service (TOS) documentation over the past two years, driven primarily by increased regulatory scrutiny, high-profile data breaches and the advent of AI.
State-level legislation has also expanded dramatically, with 620% more cybersecurity and data privacy laws enacted since 2020 than in prior years. The pandemic changed so much!
The U.S. Department of Education released updated guidance in 2024 through the National Educational Technology Plan, emphasizing the need for stronger data protection measures and more transparent vendor relationships.
Federal agencies have also increased enforcement actions, with the FTC prioritizing student data privacy cases and making it clear that companies cannot trade children's privacy rights for educational access. So we need a common understanding and language.
The "Texas Effect" Comes to Student Privacy
There's an old adage in American education: "As Texas goes, so goes the nation." For decades, Texas's massive student population and centralized textbook adoption process meant that publishers tailored their content to Texas standards, which then became the de facto national curriculum. Textbooks that Texas approved for its 5.4 million students often ended up in classrooms from Maine to Hawaii.
Now, this "Texas effect" is reshaping student data privacy.
With the enactment of the Texas Data Privacy and Security Act (HB 4, signed June 18, 2023, effective July 1, 2024) and the Texas Student Privacy Act, Texas has created the most comprehensive and specific student data protection framework in the nation. Just as textbook publishers once built to Texas curriculum standards, EdTech companies must now build to Texas privacy standards—or risk losing access to one of the largest educational markets in America.
Why does this matter for every EdTech provider? Because the systems you build for Texas compliance will likely become your standard nationwide. It's simply not cost-effective to maintain different privacy architectures for different states. Texas has effectively set the new baseline for student data protection.
The 96% Wake-Up Call in Better Context
This Texas leadership comes at a critical time. As I mentioned in my last post, research found that 96% of EdTech applications shared student data with third parties, likely without realizing they were violating federal privacy laws (FBI/K12 SIX study). But while federal laws like FERPA and COPPA provide frameworks, they often lack specificity about what exactly constitutes protected student data in our digital age.
Texas has filled that gap with explicit definitions and categorical protections that clarify what FERPA implied but didn't always specify, including:
Discipline records and behavioral data
Biometric identifiers (including voice patterns and keystroke dynamics)
Religious beliefs and political affiliations
Precise geolocation data
Health information beyond basic records
Social Security numbers and government identifiers
This isn't just about Texas compliance… it's about understanding where student privacy protection is headed nationally. When EdTech providers write to these Texas standards, they're not just meeting one state's requirements; they're future-proofing their platforms for the (inevitable?) wave of similar laws in other states (if federal AI laws don’t override them for the next 10 years).
Building Bridges, Not Walls
The comprehensive Texas framework offers something valuable: a chance to finally bridge the gap between how educators and technologists think about student privacy. Texas law translates the vague federal requirements into concrete technical specifications. It turns "protect student privacy" from an abstract goal into an actionable checklist. By understanding and implementing Texas's clear standards, EdTech companies can move from accidental non-compliance to intentional privacy leadership.
Why the Compliance Gap Exists
Educational technology sits at the intersection of two very different worlds:
Education: Highly regulated, privacy-focused, public trust essential. Protect student data.
Technology: Move fast, iterate quickly, data drives everything. Leverage all data.
This reminds me of when I worked for a Voice-over-IP (VoIP) startup in the early 2000s, which also brought together internet and telephony people. They, too, came from very different worlds:
Internet: Move fast, if something wasn’t working? Restart and reboot.
Telephony: Highly regulated, with 99.999% uptime for reliability. Do NOT break the current system.
And, as it was 20+ years ago, neither side is wrong—they're just speaking different languages. Schools assume vendors understand educational privacy law because they work in education. EdTech companies assume their standard tech industry practices apply to schools. Both assumptions create risk.
The Texas Clarification: Texas's new laws illustrate this disconnect perfectly. While public schools are exempt from the TDPSA as state entities, their EdTech vendors must comply fully—creating a situation where vendors have stricter obligations than the schools they serve.
FERPA: A Translator's Guide for EdTech
For educators, FERPA is second nature. For technology professionals, it can seem like a maze of exceptions and interpretations. So, let's find a common language.
What FERPA Actually Protects
Education Perspective: "Student records" means everything—grades, behavior, login times, search queries, writing samples, even metadata.
EdTech Translation: If data can be linked to a specific student, it's protected. This includes:
Analytics data tied to user IDs
Behavioral patterns that could identify a student
Aggregate data that could be reverse-engineered
Any PII, even if collected indirectly
What Texas Clarified (2023-2024): The Texas Student Privacy Act now explicitly lists protected categories that FERPA implied but didn't always specify, including:
Discipline records
Biometric data
Social Security numbers
Religious and political information
Health and medical records beyond basic directory information
This Texas clarification helps EdTech providers understand that "education records" truly means ANY data linkable to a particular student.
How Schools Legally Share Data with EdTech, AKA the "Legitimate Educational Interest" Exception
For EdTech companies, understanding FERPA isn't just about knowing what data is protected—it's about understanding how you can legally access that data in the first place. The primary legal mechanism that allows schools to share student records with EdTech vendors is FERPA's "legitimate educational interest" exception.
This exception permits schools to share education records with "school officials," which can include contractors and vendors, without obtaining parental consent. But here's where EdTech companies often misunderstand the scope: having a "legitimate educational interest" doesn't mean you can use student data however you want. The uses must be strictly limited to the educational purpose for which the data was shared.
What Schools Mean: Direct instructional use. Teachers viewing data to help specific students learn better.
Common EdTech Misunderstanding: This doesn't extend to:
Product improvement (even educational products)
AI/ML model training
Benchmark studies
Marketing analytics
Investor metrics
We suggest a better approach for EdTech firms. Separate educational use from business needs. Be transparent about both.
Texas Makes It Crystal Clear: The Texas Student Privacy Act explicitly prohibits using student data for commercial profiling or creating non-educational profiles—removing any ambiguity about what constitutes legitimate educational use.
"Directory Information": Why Basic Student Data Is Not Basic
FERPA creates a category called "directory information" that schools can share without parental consent—things like student names, addresses, telephone numbers, email addresses, photographs, dates of attendance, and grade levels. Many EdTech companies assume this means such data is "public" or freely usable. This is a dangerous misconception that can lead to compliance violations.
Even this "basic" directory information comes with strict requirements and limitations:
Each school defines its own directory information (what's "directory" at one school may be protected at another)
Parents can opt out individually, meaning you can't assume any data is universally shareable
You need school-specific policies, not general assumptions about what's considered directory information
Default to maximum privacy, not minimum—if you're unsure whether something is directory information for a specific school, treat it as protected.
If your platform auto-populates student profiles with what you consider "basic" information, or if you use student names and photos in marketing materials thinking they're "just directory information," you could be violating FERPA.
Best practice? Verify with each school partner what they've designated as directory information and which parents have opted out.
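That verification step can be made concrete in code. Here's a minimal sketch, assuming a per-school policy table and an opt-out list; the school names, field names, and helper function are hypothetical, not a real API, and the key design choice is that anything unknown defaults to protected.

```python
# Hypothetical sketch: per-school directory-information designations plus
# parent opt-outs. All names and fields below are illustrative.

DIRECTORY_POLICIES = {
    "maple_elementary": {"name", "grade_level"},                      # this school designates only these
    "oak_high": {"name", "email", "photo", "dates_of_attendance"},
}

OPT_OUTS = {
    ("maple_elementary", "student_123"),  # a parent opted this student out
}

def is_shareable(school: str, student_id: str, field: str) -> bool:
    """Treat a field as directory information only if this school designated
    it AND the parent has not opted out. Unknown schools share nothing:
    default to maximum privacy, not minimum."""
    if (school, student_id) in OPT_OUTS:
        return False
    designated = DIRECTORY_POLICIES.get(school, set())
    return field in designated
```

The defaults do the compliance work here: a school with no recorded policy yields an empty set, so every field is treated as protected until the school tells you otherwise.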
And… like my mom always said, when in doubt, don’t.
COPPA, the Under-13 Challenge for EdTech
The Children's Online Privacy Protection Act (COPPA) is a federal law that requires parental consent before collecting personal information from children under 13. For EdTech companies, this creates an immediate problem: how do you provide educational services to elementary students if you need individual consent from every student's parent before they can even log in?
This is where COPPA's "school consent exception" becomes critical—it's the legal mechanism that makes K-8 EdTech possible at scale.
Understanding COPPA's School Consent Exception
Why This Exception Exists: The FTC recognized that requiring individual parental consent for every educational tool would be impractical and would hinder schools' ability to use beneficial technology. So they created an exception: schools can provide consent on behalf of parents, but only under strict conditions.
What the Exception Allows: When schools use EdTech for educational purposes, they can consent on behalf of parents for the collection of student personal information—but this consent is limited. According to FTC guidance, the school's consent only covers uses that are for the educational purpose authorized by the school.
Critical Limitations That Can Trip Up EdTech Companies:
The school's consent does NOT extend to:
Data use for product development or improvement
Behavioral advertising (even if you call it "educational")
Creating child profiles for non-educational purposes
Sharing data with third parties beyond what's needed for the educational service
Using student data after the school relationship ends
The Compliance Trap: Many EdTech companies mistakenly believe that once a school "consents" under COPPA, they can use student data like any other user data. This is false and can lead to FTC enforcement actions.
How Texas Strengthens Child Protection: Under the TDPSA (effective July 1, 2024), Texas goes further by categorizing ALL data from children under 13 as "sensitive data" requiring explicit consent—essentially eliminating any gray areas about what requires special protection.
Practical COPPA Compliance Strategies
Instead of relying on minimal age gates or assuming school consent covers everything, consider:
Role-based access: Teachers create and manage student accounts
School-controlled authentication: Use single sign-on through school systems
Clear data use limitations: Document and technically enforce educational-use-only restrictions
Transparent third-party relationships: Disclose all data sharing and ensure partners understand COPPA limits
Data deletion workflows: Build systems to purge student data when no longer needed for educational purposes
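The last item on that list, data deletion workflows, is the one most often left unbuilt. Here's a minimal sketch of the idea, assuming a simple record shape and an illustrative 30-day grace period after contract end; both are assumptions for the example, not legal guidance.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical deletion workflow: purge student records once the school
# relationship has ended and a grace period has lapsed. The record fields
# and the 30-day window are illustrative assumptions.

RETENTION_AFTER_CONTRACT_END = timedelta(days=30)

def records_to_purge(records, now=None):
    """Return IDs of records whose school contract has ended and whose
    retention grace period has lapsed. Records for active contracts
    (contract_ended is None) are never selected."""
    now = now or datetime.now(timezone.utc)
    return [
        r["id"] for r in records
        if r["contract_ended"] is not None
        and now - r["contract_ended"] > RETENTION_AFTER_CONTRACT_END
    ]
```

Running a job like this on a schedule, and logging what it deleted, is also exactly the kind of documentation a regulator will ask to see.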
Growing Patchwork of State Laws
Beyond federal law, state regulations add complexity for EdTech organizations. Here's a quick overview of considerations in a few other states:
California (SOPIPA)
Prohibits targeted advertising to students
Prevents creation of non-educational profiles
Requires data deletion upon request
Protects data collected on school devices/networks
Illinois (BIPA)
Biometric data requires explicit consent
This includes: voice recordings, facial scanning, behavioral biometrics
Penalties are substantial: $1,000-$5,000 per violation
Consider whether your "engagement tracking" creates biometric profiles
(For comprehensive state law tracking: https://studentprivacycompass.org/state-laws-and-legislation/)
Building Compliance Into Your Products
For EdTech companies wanting to get this right, here's a practical framework with Texas-specific considerations:
1. Data Minimization by Design
Collect only what directly serves educational purposes
Question every data point: "How does this help students learn?"
Build features that work with less data, not more
Make privacy a competitive advantage
Texas Tip: Document your data minimization decisions. Texas requires data collection to be "adequate, relevant, and reasonably necessary"—be prepared to justify each data point.
2. Transparent Data Mapping
Create clear documentation showing:
What you collect
Why you need it
Where it goes
Who can access it
When it's deleted
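One lightweight way to keep this documentation honest is to make the data map machine-readable and check collection against it. A minimal sketch, with hypothetical field names and values; the point is that every collected element answers all five questions above.

```python
# Hypothetical machine-readable data map. Every entry answers: why we need it,
# where it goes, who can access it, and when it's deleted. Values are illustrative.

DATA_MAP = {
    "quiz_scores": {
        "why": "adaptive lesson sequencing",
        "where": "primary DB (us-east), encrypted at rest",
        "who": ["teacher", "student", "parent"],
        "deleted": "30 days after course completion",
    },
    "login_timestamps": {
        "why": "attendance reporting requested by the school",
        "where": "primary DB (us-east), encrypted at rest",
        "who": ["teacher", "school_admin"],
        "deleted": "end of school year",
    },
}

def undocumented_fields(collected_fields):
    """Flag anything being collected that has no data-map entry:
    a simple guard against silent scope creep."""
    return [f for f in collected_fields if f not in DATA_MAP]
```

A check like this wired into CI means a new analytics field can't ship until someone has written down why it's needed and when it's deleted.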
Texas Requirement: Your privacy notice must be "reasonably accessible and clear" and include specific categories of data processed, purposes, third-party sharing, and consumer rights information.
3. School-Friendly Controls
Build features educators actually need:
Granular permission settings
Easy data export/deletion tools
Clear audit logs
Parent access capabilities
Consent management workflows
Texas Must-Have: Universal opt-out recognition is required as of January 1, 2025. Build this capability once and it protects your technology everywhere.
4. Privacy-First Architecture
Technical decisions that demonstrate commitment:
Data encryption at rest and in transit
Minimal third-party dependencies
Geographic data controls
Automatic data expiration
Privacy-preserving analytics
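To make the last item concrete, here is one common privacy-preserving analytics tactic sketched in a few lines: suppress any aggregate computed over too small a cohort, so reported statistics can't be reverse-engineered into individual students. The threshold of 10 is an illustrative assumption, not a legal standard.

```python
# Hypothetical small-cohort suppression: never report an aggregate over fewer
# than MIN_COHORT_SIZE students. The threshold is an assumption for the example.

MIN_COHORT_SIZE = 10

def safe_average(scores):
    """Return the average only when the cohort is large enough to resist
    re-identification; otherwise suppress the statistic entirely."""
    if len(scores) < MIN_COHORT_SIZE:
        return None  # suppressed: too few students to report safely
    return sum(scores) / len(scores)
```

This directly addresses the "aggregate data that could be reverse-engineered" risk noted earlier: a class average over two students is really just two grades in disguise.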
5. Texas-Specific Implementation Tips
Consent Management:
Build separate consent flows for sensitive data (required) vs. non-sensitive data
Remember: ALL data from users under 13 is "sensitive" in Texas
Create clear, affirmative consent mechanisms—no pre-checked boxes or buried consents
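The consent-flow rules above can be sketched as a small builder function. This is a hypothetical shape, with illustrative category names; the two properties it encodes are the ones Texas requires: nothing pre-checked, and sensitive data routed through its own explicit consent step.

```python
# Hypothetical consent-form builder. Category names are illustrative; the
# invariants are: no pre-checked boxes, and sensitive categories (including
# ALL data from users under 13, per the TDPSA) get a separate explicit flow.

SENSITIVE_CATEGORIES = {"biometric", "health", "precise_geolocation", "under_13_any"}

def build_consent_form(categories):
    """One consent item per data category, all defaulting to unchecked,
    with sensitive categories flagged for a separate consent flow."""
    return [
        {
            "category": c,
            "pre_checked": False,               # affirmative consent only
            "separate_flow": c in SENSITIVE_CATEGORIES,
        }
        for c in categories
    ]
```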
Opt-Out Infrastructure:
Implement systems to detect Global Privacy Control (GPC) signals
Build capability to process opt-outs from authorized agents
Create separate opt-out paths for: sale of data, targeted advertising, and profiling
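Detecting the GPC signal is simpler than it sounds: GPC-enabled browsers send the HTTP request header Sec-GPC: 1. A minimal framework-agnostic sketch follows; the plain-dict request shape and preference keys are illustrative assumptions, and the design rule encoded is that the signal can only tighten privacy, never loosen an opt-out the user already made.

```python
# Minimal sketch of honoring the Global Privacy Control signal. GPC-enabled
# browsers send the "Sec-GPC: 1" request header. The headers dict and the
# preference keys are illustrative, not tied to any specific web framework.

def gpc_opt_out(headers: dict) -> bool:
    """True if the request carries a GPC opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def apply_privacy_prefs(headers: dict, user_prefs: dict) -> dict:
    """Fold the GPC signal into stored preferences: a GPC signal forces
    the sale-of-data and targeted-advertising opt-outs on, but never
    re-enables anything the user already opted out of."""
    if gpc_opt_out(headers):
        user_prefs = {
            **user_prefs,
            "sale_of_data": False,
            "targeted_advertising": False,
        }
    return user_prefs
```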
Data Processing Agreements:
Texas requires specific DPA provisions between controllers and processors
Include data deletion requirements at contract termination
Mandate same obligations flow down to sub-processors
Security Practices:
Implement "reasonable administrative, technical, and physical data security practices"
Scale security measures to data volume and sensitivity
Document your security program—the Texas AG will want to see it

Because when it comes to student data, what you don't know absolutely can hurt you — and the students you're sworn to protect.
If you need support in protecting your management of students’ data, we are here to help. We can answer your questions about compliance in educational settings, having supported more than 800 schools in our time together.
We need to handle AI on our terms… safely, responsibly and with an awareness of the risks and rewards of using technology in every organization. Including schools.
Resources from AIGG on your AI Journey
Is your organization ready to navigate the complexities of AI with confidence?
At AIGG, we understand that adopting AI isn’t just about the technology—it’s about doing so responsibly, ethically, and with a focus on protecting privacy. We’ve been through business transformations before, and we’re here to guide you every step of the way.
Whether you’re a government agency, school district, or business, our team of experts—including attorneys, anthropologists, data scientists, and business leaders—can help you craft Strategic AI Use Statements that align with your goals and values. We’ll also equip you with the knowledge and tools to build your TOS review playbooks, guidelines, and guardrails as you embrace AI.
Don’t leave your AI journey to chance.
Connect with us today for your free AI Tools Adoption Checklist, Legal and Operational Issues List, and HR Handbook policy. Or, schedule a bespoke workshop to ensure your organization makes AI work safely and advantageously for you.
Your next step is simple—reach out and start your journey towards safe, strategic AI adoption with AIGG.