Why? Because streaming services like Spotify pay royalties to artists for each play. By filling playlists with cheap, commissioned tracks, Spotify could reduce royalty payments and boost profit margins, all while giving listeners, like me, the illusion of a carefully curated experience. Listeners thought they were getting the best music for their mood, but in reality, the system prioritized cost efficiency over artistic quality.
Education technology can fall into the same trap.
AI-powered tutors, personalized learning platforms, and automated grading tools promise our students a personalized learning experience, much as Spotify promises a personalized listening one. But if we’ve learned anything from the recent adoption of edtech, and from Liz Pelly’s investigation of Spotify’s practices, it’s this: just because a tool is innovative doesn’t mean it’s designed solely with students and teachers in mind. It might be built for cost savings, engagement metrics, or data collection rather than actual learning.
That’s why schools can’t afford to take a passive approach to edtech adoption. The tools shaping student learning deserve the same scrutiny that policymakers are now applying to accessibility, equity, and transparency. Just as Spotify’s algorithm quietly reshaped the music industry, AI-driven edtech is already influencing classrooms—often without teachers, students, or parents fully understanding how.
The question isn’t whether technology belongs in education. It’s whether we’re choosing tools that truly serve students, or ones that increase educational disparities.
Why thoughtful adoption of edtech matters
If you’ve ever sat through a district tech demo, you know the routine. Flashy AI-powered tools, promises of “personalized learning,” and the guarantee that this will make teachers’ jobs easier. But do these tools actually solve real classroom challenges, or are they just adding another layer of complexity?
Too often, schools invest in edtech without a clear plan for how it fits into teaching and learning. Too often, technology is built from the perspective of engineers, not students. Schools need a framework for evaluating tools, one that ensures they align with pedagogy rather than chasing trends.
Ensuring accessibility for all learners
New state laws in places like Colorado and Maryland require digital tools to be accessible for students with disabilities, ensuring that AI and edtech don’t create additional barriers. These laws exist because, for too long, accessibility has been treated as an afterthought rather than a requirement.
Screen reader compatibility, alt-text, adaptive interfaces, and customizable settings should not be optional features—they should be built into every tool from the start. Schools need to start asking vendors hard questions: Does this platform support screen readers? Are captions standard or an extra step? Can students adjust their learning interface to meet their needs? If the answer isn’t a clear yes, that tool isn’t ready for our classrooms.
Transparency and data privacy
Just as Spotify’s algorithms determine what listeners hear, many edtech platforms dictate what students see—often without educators fully understanding how that content is chosen. Personalized learning platforms adapt based on student performance, but who selects the content? Is the AI reinforcing biases? How much data is being collected, and who has access to it?
Schools must demand transparency from edtech providers. Educators should know exactly how the AI makes its decisions, what data it collects, and whether the system is guiding students toward deeper learning or simply optimizing for engagement. Without that clarity, teachers lose control over what their students are actually learning.
The path forward: How schools can make smarter edtech choices
- Create an EdTech Evaluation Framework: Before adopting a tool, schools should have clear criteria for accessibility, equity, and pedagogical alignment.
- Engage teachers in decision-making: Educators should be part of the selection and implementation process, ensuring that tools support their instructional goals rather than disrupting them.
- Hold vendors accountable: Schools should not accept vague answers about accessibility, data privacy, and AI decision-making. Schools should demand clear, concrete responses from vendors before approving their tools for classroom use.
Edtech is here to stay. Let’s get it right
Technology in education isn’t going anywhere, but how we use it is still up to us. Schools can either let AI-driven edtech quietly shape learning without oversight, or they can demand tools that are transparent, accessible, and designed to support teachers and students. We shouldn’t let cost-cutting algorithms or engagement metrics dictate the future of education; educators’ expertise should guide it, focusing on the needs of real learners. The choice isn’t coming in the future. It’s already here.
Jason McKenna is V.P. of Global Educational Strategy for VEX Robotics and author of “What STEM Can Do for Your Classroom: Improving Student Problem Solving, Collaboration, and Engagement, Grades K-6.” His work specializes in curriculum development, global educational strategy, and engaging with educators and policymakers worldwide. For more of his insights, subscribe to his newsletter.