BOULDER, CO (September 25, 2025)—Digital educational platforms have become ubiquitous in American classrooms. Teachers now rely on Google Workspace for Education, Kahoot!, Khan Academy, MagicSchool, Zearn, and countless others to deliver curriculum, structure lessons and student collaboration, assess and track learning, and communicate with families. Few facets of schooling remain untouched by these tools.
A new policy brief, Fit for Purpose? How Today’s Commercial Digital Platforms Subvert Key Goals of Public Education, by Faith Boninger of the University of Colorado Boulder and T. Philip Nichols of Baylor University, reviews research examining key pedagogical, logistical, and ethical implications of ed tech platforms for schools. The brief’s analysis of these implications leads to recommendations for teachers, school leaders, and policymakers on how to govern the use of these platforms.
For years, the technology industry marketed its products to schools as efficient solutions for instruction and management. Before the COVID-19 shutdowns, such efforts had only moderate success. But the sudden shift to remote learning accelerated widespread adoption, even among reluctant educators. That emergency adoption obscured the fact that digital platforms may not always serve public education’s core purposes.
Unlike the software of the past, today’s ed tech platforms are complex ecosystems fueled by data extraction. They extract user data not only to personalize and improve their own functionality but also to share it across interoperable systems.
Accordingly, while educators may see these platforms as neutral tools, they are in fact shaped by competing interests and hidden imperatives. Teachers, students, and administrators constitute only one market. The other is the market for data on performance, usage patterns, and engagement: data that flows to advertisers, data brokers, and investors, often without users’ knowledge or consent.
An ecological perspective on digital educational platforms reveals three interlocking layers: (a) visible classroom uses; (b) underlying technical architectures of code, algorithms, and hardware; and (c) deeper political-economic structures of ownership and profit. Recognizing these layers enables schools to consider, for example, how a platform provider’s business decisions determine how the platform is programmed to extract and use student data, how those programming decisions shape day-to-day experiences of teaching and learning, and how school data might later be used for business purposes far from the school. This moves schools beyond simple questions of how to select a tool or prevent students from misusing it, and toward critical consideration of how platforms operate, whose values they embed, and what resources they demand.
To ensure a platform is truly appropriate for their communities, schools must first define their own needs, values, and goals. They can then evaluate whether digital tools support or undermine those aims. This clarity can shield educators from aggressive marketing and empower them to adopt only those systems aligned with their mission. Policymakers must also provide stronger safeguards. With federal policy promoting artificial intelligence as a cure-all for modern education, schools have good reason to pause and weigh the broader consequences before embracing new technologies.
Find Fit for Purpose? How Today’s Commercial Digital Platforms Subvert Key Goals of Public Education, by Faith Boninger and T. Philip Nichols, at:
http://nepc.colorado.edu/publication/digital-platforms