Strip the Algorithm, Keep the Joy

March 7, 2026

Part 3 of 3: Rethinking the Screen Time Panic

Let's recap.

In Part 1, I showed that teen mental health was improving through the 2000s while screen time climbed rapidly. The crisis didn't start until 2010-2012, when a specific cluster of platform features transformed the internet from a tool into an engagement trap.

In Part 2, I dug into the research. The largest studies find no meaningful harm from moderate screen time (games, TV, messaging). The specific concern is algorithmic social media: likes, public metrics, infinite scroll, and engagement optimization. The two camps in the screen time debate, whose findings are often presented as contradictory, actually agree on this distinction.

So what do we do about it?

The 2005 model

I keep coming back to the 2000s, because they represent a proof of concept for a healthy digital life.

The 2005 internet user played games with friends. Watched a movie or a show. Texted or messaged people they knew. Browsed with intent, looking for something specific. Built things: fan sites, forum posts, early YouTube videos. And when they were done, they closed the laptop. There was a natural stopping point for everything.

None of these activities had likes. None had follower counts. None had an algorithm deciding what to show next based on what would keep you engaged the longest. None had infinite scroll.

The 2005 model wasn't "no screens." It was unmanipulated screens. Technology that did what you asked it to do and then got out of the way.

That model doesn't require a time machine. It requires better software.

The manipulation layer

Nearly every screen-based experience that people find corrosive has the same basic structure: good content wrapped in a manipulative delivery mechanism.

YouTube has incredible content. Tutorials, music, documentaries, comedy, lectures. It also has an autoplay algorithm that leads you from the thing you wanted to increasingly extreme content you didn't ask for, a recommendation sidebar that never stops offering one more video, and no natural endpoint. You came for a 10-minute video. You're still there 90 minutes later.

Social media platforms connect you to people you care about. They also quantify every interaction into a public score, feed you content selected by an algorithm optimizing for engagement over satisfaction, and present it in an infinite scroll designed to eliminate the moment where you decide you're done.

Streaming services have great shows. They also auto-play the next episode without asking, serve recommendation screens designed to keep you browsing, and track what you watch to optimize for retention, not enjoyment.

In every case, the content isn't the problem. The delivery mechanism is the problem. The algorithm. The autoplay. The infinite scroll. The recommendation engine. The engagement optimization.

And here's what matters: you can separate them.

What we're building

That's the thesis behind Last Gen Labs. And the first product we built from it is KABL.

KABL turns YouTube into actual television. Not "YouTube but with playlists." Actual television. Curated channels, running on a schedule, tied to the wall clock.

You open KABL and you see channels. A cooking channel. A documentary channel. A music channel. The channels are curated and the lineups are programmed (and if you have opinions about what channels should exist, I'd love to hear from you at lastgenlabs@gmail.com). Each one plays according to a real schedule, just like broadcast TV. You might tune into the travel channel and land 12 minutes into an Anthony Bourdain episode. You watch it, or you flip to something else. You don't pick from a menu of 500 million videos. You don't get a recommendation sidebar. You don't scroll. You watch what's on.

This sounds like a limitation. It is the entire point.

Every ad-supported platform is designed to maximize the number of decisions you make per session, because each decision is an opportunity to serve an ad and to learn what keeps you watching. The recommendation grid, the autoplay queue, the sidebar of "up next" videos, the homepage feed. All of it exists to keep you choosing, and choosing, and choosing, forever. The infinite choice is the trap. It's why you open a video app for one video and surface an hour later wondering where the time went. Every "next video" decision resets the clock on your engagement.

KABL removes the decision entirely. Something is on. You watch it or you don't. There is no "what should I watch next?" because you don't control the schedule. There is no rabbit hole because there is no hole to fall into. The channel plays what it plays, like a TV channel in 1998, and when you're done watching, you turn it off.
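The wall-clock mechanic is simple enough to sketch. Here's a minimal, hypothetical illustration (not KABL's actual implementation): assume a channel is a fixed lineup of (video_id, duration) pairs that loops forever from a fixed epoch. Playback position is then a pure function of the clock, so everyone tuning in at the same moment sees the same thing, and there is no "next video" decision to make.

```python
from datetime import datetime, timezone

# Hypothetical lineup: (video_id, duration_in_seconds). The names and
# durations here are illustrative, not real schedule data.
LINEUP = [
    ("bourdain_ep1", 2520),   # 42 min
    ("bourdain_ep2", 2640),   # 44 min
    ("travel_short", 600),    # 10 min
]
EPOCH = datetime(2026, 1, 1, tzinfo=timezone.utc)  # when the schedule "started"

def now_playing(lineup, epoch, now):
    """Return (video_id, seek_offset_seconds) for a given wall-clock time.

    The schedule loops: total elapsed time is reduced modulo the lineup
    length, then we walk the lineup to find the currently airing video.
    """
    total = sum(duration for _, duration in lineup)
    elapsed = int((now - epoch).total_seconds()) % total
    for video_id, duration in lineup:
        if elapsed < duration:
            return video_id, elapsed
        elapsed -= duration
    raise AssertionError("unreachable: elapsed < total by construction")

# Tuning in 12 minutes after the epoch lands you mid-episode:
video, offset = now_playing(LINEUP, EPOCH,
                            datetime(2026, 1, 1, 0, 12, tzinfo=timezone.utc))
print(video, offset)  # bourdain_ep1 720
```

Note what's absent: no per-viewer state, no watch history, no ranking. The function doesn't know who is asking, which is exactly the indifference described below.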

The content is still YouTube's content. The creators are still YouTube's creators. The quality is the same. What's gone is the engagement machinery that sits between you and the content. No algorithm. No infinite scroll. No recommendation engine. No decisions optimized to keep you on the platform.

Think about it this way. You want something on in the background while you cook dinner. You open a video app, get hit with a wall of algorithmic recommendations, pick one, and then autoplay takes you on a journey you didn't plan. An hour later you're watching something you never intended to see and your dinner is burning.

With KABL, you flip to a channel. It's playing something. You cook dinner. When dinner's ready, you turn it off. There was never a moment where the app fought to keep you.

That's not a small difference. That's the difference between television and a slot machine.

Why this matters beyond convenience

The research from Part 2 showed that non-algorithmic screen time (TV, games, shows) is associated with flat or mildly positive wellbeing at moderate levels. The negative signal comes specifically from algorithmic platforms with infinite scroll and engagement optimization.

KABL doesn't just happen to lack those features. It is designed from the ground up to be their opposite. The wall clock schedule means the content runs whether you're there or not, just like broadcast TV. That changes the dynamic entirely. An algorithmic feed is designed to respond to you, to learn you, to hold you. KABL's schedule is indifferent to you. It doesn't know if you're watching. It doesn't care. It just plays.

That indifference is the product. A screen experience that doesn't want anything from you.

One thing I want to be clear about: KABL works within YouTube's ecosystem, not around it. It's built in compliance with YouTube's Terms of Service. Creator ads play normally. Revenue flows through to creators exactly as it would on youtube.com. If you have YouTube Premium, that works too. YouTube has built an extraordinary content library and an extraordinary platform for creators to reach audiences. We're not trying to replace that or undercut it. We think there are people out there who have stopped watching YouTube (or never started) because the interface doesn't work for them. KABL gives those people a different way in. Same content. Same creators. Same monetization. Different experience.

Why this can't come from a big company

If the algorithm is the problem, the obvious question is: why doesn't any major platform just offer a non-algorithmic mode?

Because the business model won't allow it. Ad-supported platforms are free because they sell your attention to advertisers. The algorithm exists to maximize that attention. Removing it would be like a casino removing the slot machines.

This is a structural problem, not a moral one. Venture-funded companies need growth. Public companies have fiduciary responsibility to shareholders. Both demand more users spending more time, which demands more engagement optimization, which demands more algorithms and more infinite scroll. The incentives push relentlessly in one direction, and no amount of "digital wellbeing" settings changes the underlying economics. Those features are designed to make you feel better about the platform, not to actually reduce your usage (but that's a subject for another post).

The only way to build software that genuinely respects your time is to have a business model that doesn't depend on capturing it.

KABL costs $2 a month or $20 a year. Full prorated refund anytime you want to cancel. That's it. No ads. No data harvesting. No engagement optimization. No investors pushing for growth metrics. No stakeholders whose returns depend on you spending more time on the platform.

You are the customer. Not the product. Not the engagement metric. Not the advertising target. The customer.

We will build up the service as much as that revenue allows. We will not take investment. We will not add ads. We will not add an algorithmic feed. The business model is: you pay a small amount, we run the servers, and the software respects your time. If that means we stay small, we stay small. That's fine. The entire point is to not become the thing we're building against.

A better question

The screen time debate has been asking the wrong question for years. "How many hours?" matters less than "who's in control?"

An hour watching a show is not the same as an hour being fed content by an algorithm. An evening playing a game with friends is not the same as an evening scrolling a feed that was designed to never let you stop. The minutes are the same. The experience is completely different.

The 2000s proved that technology and wellbeing can coexist. What broke wasn't the screen. It was the business model behind it. The algorithm. The scroll. The engagement optimization that turned software from a tool into a trap.

You can't fix that from inside the business model that created it. You have to step outside it entirely.

That's what we're doing. The internet used to be something you visited. It can be again.


This is Part 3 of a three-part series. Start with Part 1: Your Kids Were Fine in 2005.

