Blaming the Parents, Not the Platforms: How a World Bank Screen-Time Report Lets the Attention Economy Off the Hook

Civics of Technology Announcements

Curriculum Crowdsourcing: One of the most popular activities in our Curriculum is our Critical Tech Quote Activity because the lesson offers a quick way to introduce a range of critical perspectives to students. We have decided to create a Critical AI Quote Activity and we want your help. Please share your favorite quotes about AI, with the author and source, via our Contact page. We will collect them, write the lesson, and share it when it is ready!

Next Tech Talk: Please join us for our next Tech Talk where we meet to discuss whatever critical tech issues are on people’s minds. It’s a great way to connect, learn from colleagues, and get energized about the work. Our next Tech Talk will be held on Tuesday, December 2nd at 8PM Eastern Time. Register here or visit our Events page.

December Book Club: Michelle Levesly is leading a book club discussion of Resisting AI by Dan McQuillan. Join us at 10:30 AM EST on Wednesday, December 3rd, 2025. Head to the events page to register!

Text reads "Blaming the Parents, Not the Platforms: How a World Bank Screen-Time Report Lets the Attention Economy Off the Hook." There is a picture of the report cover. Next to that is an image of a child with brown skin peering into a mobile device.

Editor note: This is a cross post with Punya Mishra’s blog. It is posted here with permission.

By Punya Mishra 

The World Bank recently released a report titled Balancing the Digital Scales: Screen Time Management in Early Childhood Education (Molina, 2025). The report lays out how excessive screen exposure in young children is associated with measurable developmental harms: language delays, attention difficulties, disrupted sleep, and compromised social-emotional skills. Synthesizing research from over 80 studies across 18 countries, it documents that half to three-quarters of preschoolers now exceed WHO guidelines.

To its credit, the report offers evidence-based intervention strategies like dialogic reading, play-kit lending (boxes of 5–7 open-ended play items that parents can borrow from schools or clinics), and tech-free family mealtimes. In fact, the bulk of the report focuses on what parents, teachers, and caregivers should do differently. We are given habit-mapping worksheets, family media agreement templates, and "if-then" planning exercises for parents managing toddler tantrums in restaurant queues.

That said, what is telling is what is NOT in the report. 

What we don't get is any analysis of what makes this technology so addictive in the first place. What we don't get is any description of the root causes of the problem. Technology companies, the people who create this technology, get maybe a couple of passing mentions, buried in platitudes about 'responsibility' and 'age-appropriate design.' Meanwhile, parents get forty-four pages of worksheets.

This imbalance speaks volumes, particularly when you consider what these worksheets are asking parents to resist. David Segal's (2022) story for the New York Times (A kid's show juggernaut that leaves nothing to chance) takes us inside Moonbug Entertainment, the company behind CoComelon, the second-largest YouTube channel in the world. Once a month, toddlers are brought to its London headquarters for "audience research." Children watch episodes while researchers track their attention using something they've whimsically named the "Distractatron": a second screen playing mundane footage, positioned to catch any wandering gaze. Every time a child looks away from the show, a note is made. The goal is to identify and eliminate the exact moments when engagement falters.

Data teams conduct continuous A/B testing on every variable. Should a character wear black jeans or blue? Should the music be louder or softer? Should the bus be yellow or red? (Yellow, it turns out—"kids love yellow buses around the world," even in countries where yellow buses transport prisoners.) Toddlers are particularly drawn to objects covered in dirt and to minor injuries requiring Band-Aids. As Moonbug's chief content officer explained to Segal: "The trifecta for a kid would be a dirty yellow bus that has a boo-boo."

Jia Tolentino's (2024) investigation for the New Yorker (How CoComelon captures our children's attention) reveals the next phase of optimization. CoComelon episodes are stitched into compilations of increasing length: thirty minutes, sixty, ninety, eventually stretching to five hours. Former employees describe content assembled via spreadsheets tracking YouTube's most popular search terms, with writers tasked to incorporate as many high-traffic keywords as possible. The logic is brutally clear: longer compilations generate more ad revenue, and keyword optimization captures more algorithmic distribution. Perhaps most revealing: former employees report an internal directive that episodes should not end with characters going to sleep, even if the episode is set to a lullaby. The reason? If children see characters sleeping, they might press pause and put the screen away.

Is it surprising then that the tech has become a universal babysitter? As the report itself says:

Throughout much of the world, the "digital babysitter" – a phone, tablet, or other screen handed to a child so the device can stand in for an attentive adult – has become a routine fixture of early childhood.

Parents deploy it to calm toddlers in supermarket lines, restaurant queues, waiting rooms, or simply to carve out minutes for work and household chores. In dual‑earner homes, a screen bridges the gap between long workdays and limited formal childcare; single parents juggling multiple jobs often see it as the only practical option when no other adult is available. In low‑resource communities – from rural Ethiopian villages to peri‑urban districts in Latin America – digital content can become a child’s most accessible form of stimulation when safe play spaces or early‑learning programmes are scarce (Abdeta et al., 2024; Rocha et al., 2021). Urban middle‑class families in cities such as Kuala Lumpur, Bogotá, and Istanbul likewise report using tablets as rewards for good behaviour or calming tools before bedtime, reinforcing the device’s role as a convenient caretaker (Raj et al., 2022; González et al., 2022; Güler & Ocak, 2024). [Citations in original.]

Let us be clear: none of this is accidental. None of this should come as a surprise. What we are confronting is industrial-scale design, refined through thousands of hours of testing, optimized by teams of data analysts, deployed at a scale reaching billions of viewing minutes. What makes these shows so seductive to overwhelmed caregivers is not a happy accident. It is the engineered product, doing exactly what it is designed to do.

Given this, is it fair to think that better time management and advance preparation can solve the problem? The fact of the matter is that a parent juggling precarious employment, inadequate childcare, and the impossible math of the second shift cannot out-engineer behavioral scientists armed with eye-tracking software and Distractatrons.

Moreover, families with the fewest resources (less access to paid childcare, more precarious work schedules, and fewer alternatives) face the greatest negative impact. Jenny Radesky, the developmental pediatrician who helped author the American Academy of Pediatrics' screen-time guidelines, is quoted in the New Yorker article as saying: "Just the fact that the compilations are marked thirty minutes, sixty minutes: they know these products are filling a gap created by parents being overworked, not having family leave—who are so stressed that they need to occupy their children for a certain amount of time.” These are, she argues, “systemic issues that keep us from parenting the way we want to.”

Meanwhile, creators of these platforms have zero incentive to change. Every design choice (the Distractatron testing, the yellow buses, the keyword-optimized episodes, the prohibition on characters sleeping) flows from a single imperative: maximize time on screen.

I'm not a policy expert, so I approach solutions with appropriate caution. But it seems clear that if we're serious about protecting child development, we need enforceable standards rather than voluntary guidelines. That might look like banning autoplay for children's content, requiring natural stopping points in programming, mandating transparency about recommendation algorithms for kids, and imposing meaningful penalties for design patterns that prioritize engagement over child welfare.

But before all that can happen, we need to recognize the real issue, the real problem. The World Bank has institutional weight. It could push for regulatory frameworks as conditions for development funding in digital learning initiatives. It could make algorithmic accountability a priority rather than an afterthought in the report. Instead, it offers parents a family media plan template and technology companies vague appeals to “age-appropriate design principles.”

And in doing so, they are falling into a somewhat simple trap. The trick is to individualize the problem, make it one about personal choice, and through that obscure where power actually lies. This is a tactic that has been used before by tobacco companies and by big oil. When BP popularized the "personal carbon footprint," it did more than invent a metric—it invented a distraction (Kaufman, 2020). The message was elegantly simple: if you are to blame for how much you drive, the industry that drills and refines can keep drilling and refining. These moves preserve the business model while appearing to address the harm. By focusing on families and what they can do, the report subtly places responsibility on those with the least power in this situation. The assumption is that overwhelmed caregivers with worksheets could out-engineer billion-dollar attention extraction systems.

The report's recommendations aren't wrong—they're just missing the forest for the trees. This isn't about whether parents bear responsibility. Parenting decisions matter. The report's recommendations for dialogic reading and tech-free mealtimes are useful. But when policy responses focus overwhelmingly on teaching parents better habits without addressing the companies engineering products that exploit structural constraints, they are, either intentionally or otherwise, turning a blind eye to the real problem.

References

Kaufman, M. (2020, July 13). The carbon footprint sham: A 'successful, deceptive' PR campaign. Mashable. https://mashable.com/feature/carbon-footprint-pr-campaign-sham

Molina, E. (2025). Balancing the digital scales: Screen time management in early childhood education. In Digital Innovations in Education. World Bank. https://openknowledge.worldbank.org/entities/publication/76ea3b99-4875-49f8-8330-56cfe3e17339

Segal, D. (2022, May 5). A kid's show juggernaut that leaves nothing to chance. The New York Times. https://www.nytimes.com/2022/05/05/business/cocomelon-moonbug.html

Tolentino, J. (2024, June 10). How CoComelon captures our children's attention. The New Yorker. https://www.newyorker.com/magazine/2024/06/17/cocomelon-children-television-youtube-netflix
