Blueprints for the Future: How Research, Policy, and Innovation Can Grow Readers in the Age of AI
Blog Post
Natalya Brill via Roman Samborskyi/Shutterstock and ilikestudio/Shutterstock
Nov. 19, 2025
Ten years ago, Tap, Click, Read: Growing Readers in a World of Screens explored how the technologies rapidly finding their way into children’s homes and schools could enhance, rather than undermine, their literacy and language development. Since then, artificial intelligence (AI) has brought new potential to augment education, along with questions and concerns about its use. Forty states have passed legislation or regulations requiring evidence-based reading instruction, while the nation’s reading scores have fallen, and federal investments in education research have been drastically reduced.
Against this backdrop, New America, the Joan Ganz Cooney Center at Sesame Workshop, and the Campaign for Grade-Level Reading (CGLR) held a public event in October to revisit the overarching question raised in Tap, Click, Read: How do we raise thoughtful, capable readers in an ever-evolving digital world?
Source: Photo by Annan Productions
The first panel at the event featured the authors of Tap, Click, Read, Lisa Guernsey and Michael Levine, and CGLR managing director Ralph Smith. Smith recalled what CGLR was learning from parents in 2010. “In addition to managing their time, managing their resources, they were overwhelmed by the proliferation of tools and products online. They couldn't figure out where to start,” he said. That challenge became a key inspiration for Tap, Click, Read, which scanned what was then a “digital wild west” of tools and apps, seeking evidence of quality and assessing how they stacked up against the science of reading.
Guernsey and Levine lamented that, although the research has long been clear about best practices for teaching children to read, the ongoing lack of investment in early learning, drastic cuts to research and development by the current administration, and an unwillingness to regulate technologies like AI are pushing us in the wrong direction. When asked about the expansion of tech bans in schools, both authors advocated for a more balanced approach. As Levine explained, “Tech bans tend not to work, especially if they're not accompanied by robust work with parents and children about how to use the tech that is so ubiquitous in their lives.” Guernsey emphasized the need to offer positive learning activities rather than simply telling students not to use their phones. She advocated for more engaging, hands-on opportunities across community ecosystems, in libraries, recreation centers, businesses, and schools, to restore balance in children’s lives.
What advice did they have for addressing present and future challenges? Levine urged states to join forces with philanthropy and the ed tech community to create new models of innovation, while Guernsey underscored the U.S. Department of Education’s importance in supporting state and local public education, as well as research on media and technology, such as the Ready to Learn program, which has been defunded. Smith noted the ongoing need for guidance that strengthens parents’ discernment around technology, and for instructional tools and methodologies that don’t isolate literacy from mathematics, given the emerging research demonstrating how entwined the two subjects are.
Source: Photo by Annan Productions
Angelica DaSilva, fellow for literacy and technology at the Joan Ganz Cooney Center, opened the second panel with an overview of the Cooney Center’s Sandbox initiative, which helps developers integrate literacy research and Universal Design for Learning (UDL) principles into their products. She described the fragmented grasp of literacy research that is pervasive among developers, which can result in a hyperfocus on one element of reading rather than on building the breadth of skills necessary for comprehension. DaSilva made the case for deeper efficacy research on ed tech products across the board, and encouraged developers incorporating AI into their products to conduct studies, so the field can build an evidence base around the impact of AI on children’s learning and development.
Medha Tare, senior director of research at the Joan Ganz Cooney Center, shed light on the Sandbox’s model of co-design, which was new to many of the developers. In this model, children are brought in for a three-day session early in the product development process to act as design partners, aiding in brainstorming and ideation. The Sandbox also hosts whole-child workshops to strengthen developers’ understanding of learner variability and UDL principles that address diverse learning needs.
Source: Photo by Annan Productions
Tare moderated the second panel, which featured three developers currently participating in the Sandbox’s literacy consultancy. Sago Mini, a studio that creates play-based apps for young children, leveraged the Sandbox team’s knowledge to design a new app aimed at building early elementary students’ literacy skills at home. Youngna Park, product lead at Sago Mini Studios, explained how, through their work with the Sandbox, they developed an evidence-based curricular progression, with scaffolding for struggling readers.
The developers at LitLab.ai, an online platform that offers students decodable texts, worked with the Sandbox team on a feature that lets kindergarten, first, and second grade students record themselves reading, then generates AI-assisted feedback and progress-monitoring reports. LitLab’s sessions with the Sandbox resulted in new scaffolds for multilingual learners integrated throughout the platform, such as Spanish translations of the words being decoded and the ability to navigate the entire platform in Spanish.
Lirvana Labs, an online platform for educators and students, partnered with the Sandbox team to create a personalized, voice-based writing coach for upper elementary students. Dan Lee, head of product at Lirvana Labs, talked about co-designing with fourth through sixth grade students during their sessions in New York. “What we did not expect,” he said, “is how insightful the kids were.” The students were cognizant, he said, that the writing coach was an AI agent, and they expressed skepticism about the chatbot’s recommendations. Through the design-thinking sessions, the developers worked with students to refine the voice-based agent’s personality and tone, which led to increased trust and rapport.
Both of the developers with AI-embedded products emphasized the importance of maintaining students’ safety and protecting them from potential harms. Lee gave the example of Lirvana Labs’ first product line, aimed at pre-K through fifth grade students, which uses curated learning media reviewed by literacy experts, child play therapists, and veteran educators. Children can hear the learning media translated into over 70 languages, but the conversations are one-way, and users can respond only by drag-and-drop, so no personally identifiable information is collected.
Drew McCann, head of learning at LitLab, described the tightly controlled nature of their platform, which enables students to create decodable texts aligned to the skills their teacher has assigned but restricts them to rigorously tested character settings and names. Students are also barred from chatting with any bots, for safety reasons. “Our core belief,” McCann said, “is that AI should amplify the human teaching and the already existing high-quality curricula, not replace it, all with very rigorous safety standards for students.” She noted AI’s potential to lighten teachers’ workload as it takes on more progress monitoring and assessment.
Source: Photo by Annan Productions
The final panel, moderated by Guernsey, shifted the focus to the policy and funding conditions that enable learning and innovation, set against the current context. Sara Schapiro, executive director of the Alliance for Learning Innovation (ALI), underscored the need for investment in innovative research, particularly around AI, by the federal government and the states. “We need the data to tell us what is working, for whom, and under which conditions,” she said.
Despite the administration’s cuts to the Institute of Education Sciences, the research arm of the U.S. Department of Education, Schapiro shared some glimmers of hope. Among those she mentioned was the National AI Institute for Exceptional Education, an initiative funded by the National Science Foundation and the Institute of Education Sciences, and led by the University at Buffalo. The institute is creating an AI-driven universal screener to identify students with dyslexia and dysgraphia, along with an AI-driven tool that will help educators collect data and plan interventions for students with Individualized Education Programs (IEPs).
Guernsey asked An-Me Chung, director of teaching, learning, and tech at New America, about the administration’s push for AI literacy in executive orders and grantmaking priorities. Chung invoked the metaphor of firefighters and architects, which she attributed to Geoffrey Canada, president of the Harlem Children's Zone. According to Chung, Canada posited that while educators and community leaders often rush to put out the fires in children’s lives, what was missing was the work of the architects, who design systems, structures, and policies that prevent fires in the first place.
“What is emerging through the research and interviews," Chung said, “is that digital literacy, just like reading, writing, and math, is foundational. It is the fourth R. And AI literacy is starting to emerge as the connecting tool that actually can strengthen these foundational literacies.”
Michelle Kang, chief executive officer of the National Association for the Education of Young Children (NAEYC), reflected on NAEYC’s earliest guidance on digital media usage with young children, and its resonance today. “Early learning is grounded in adults helping to facilitate the learning of young children and technology is a supplement—it's a tool to help support that,” not a replacement, Kang said.
Kris Perry, executive director of Children and Screens, promoted a more cautious approach to the adoption of technologies in education, with strong protections for students. “It would just be lovely if what's being created was done in partnership, with guardrails established through public policy, that we can all agree on, that ultimately result in safe, meaningful products for families and educators, that enhance children's learning experiences. Unfortunately, right now, what it feels like is industry is leading us,” she said. Perry spoke to the need for investments outside of technology so children and adolescents have safe, outdoor, offline spaces, and a better balance can be struck.
In closing, Guernsey and Levine envisioned what might come next. Guernsey spoke to the need for ongoing, collaborative conversations, and strengthening the overall ecosystem of learning surrounding each child. Levine called on the architects, saying “Thinking back 10 years, and looking ahead to 2035, we've got to build.” He continued, “We need those folks who are policymakers at the state and local level, and we need philanthropy, and we need product designers especially, each of whom have the ability to scale things, to begin to put together new blueprints, and new visions.”