The hour-long pipeline
Real-time is a tempting default. Here's why txtfeed runs its content pipeline once an hour — and why that's the right choice for reading.
Twitter refreshes in real time. Slack refreshes in real time. The modern web has conditioned everyone to expect that feeds update the moment something happens. So when txtfeed's content pipeline runs on a one-hour cadence, the first question every engineer asks is: why not real-time?
The answer is that real-time isn't what reading wants. Reading is episodic. A user opens txtfeed, scrolls for five to twenty minutes, closes it, and comes back hours later. They don't need the feed to update while they're reading — in fact, updates during a session are disorienting (items move, new items insert above, the user loses their place). Real-time optimizes for presence. Reading doesn't need presence.
What reading needs is freshness at the moment of arrival. When a user opens txtfeed at 8 PM, the feed should reflect the state of the web as of ~8 PM. An hour of lag is invisible. Two hours is noticeable. Six hours is a problem. The engineering question becomes: how do we minimize lag-at-arrival, not how do we minimize lag-at-publish?
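The arithmetic behind "an hour of lag is invisible" is worth making explicit. If users arrive at times uncorrelated with pipeline runs (a simplifying assumption, not a measured txtfeed number), the average content age they see is half the refresh period, plus however long the pipeline itself takes:

```python
def expected_staleness_minutes(period_min: float, run_min: float = 0.0) -> float:
    """Average content age at the moment a user arrives, assuming arrivals
    are uniformly distributed within the refresh period: half the period,
    plus the pipeline's own runtime."""
    return period_min / 2 + run_min

print(expected_staleness_minutes(60))      # 30.0 — hourly cadence, average case
print(expected_staleness_minutes(60, 10))  # 40.0 — with a 10-minute pipeline run
```

Worst case is one full period (sixty minutes); the average user sees half that, comfortably inside the "invisible" range the paragraph above describes.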
The hour-long pipeline solves this cleanly. Every hour we pull from Reddit, HN, Wikipedia, 500+ RSS feeds, and Google Trends. We dedupe, score, tag, and push to the edge cache. By the time a user opens the site, the feed is at most sixty minutes stale — almost always less. And the cost of running this is linear in content volume, not linear in user count, which means it scales cheaply.
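The batch shape described above can be sketched in a few lines. This is illustrative, not txtfeed's actual code: the item fields, the URL-based dedupe, and the title-length placeholder score are all assumptions standing in for the real logic.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Item:
    url: str
    title: str
    source: str       # e.g. "hn", "reddit", "rss"
    score: float = 0.0
    tags: tuple = ()

def dedupe(items):
    """Keep the first item seen per URL; drop cross-source duplicates."""
    seen, out = set(), []
    for it in items:
        if it.url not in seen:
            seen.add(it.url)
            out.append(it)
    return out

def score(it: Item) -> Item:
    # Placeholder ranking signal (title length). A real scorer would weigh
    # source, recency, engagement, and so on.
    return Item(it.url, it.title, it.source, score=len(it.title), tags=it.tags)

def run_pipeline(pulled):
    """One hourly batch: dedupe, score, rank. The real pipeline would then
    tag items and push the result to the edge cache."""
    items = [score(it) for it in dedupe(pulled)]
    items.sort(key=lambda it: it.score, reverse=True)
    return items

batch = [
    Item("https://example.com/a", "A long interesting headline", "rss"),
    Item("https://example.com/a", "Duplicate of the same story", "hn"),
    Item("https://example.com/b", "Short one", "reddit"),
]
feed = run_pipeline(batch)  # 2 items; the duplicate URL is dropped
```

The property that matters is in the shape, not the details: `run_pipeline` executes once per hour over the content batch, so cost tracks content volume, and serving users is just reading the cached output.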
Real-time would require completely different infrastructure: incremental updates, streaming consumers, diff-based client rendering, per-user push channels. All of that cost for a benefit users don't actually want. The hour-long pipeline is a deliberate underinvestment in presence so we can overinvest in quality.
The lesson generalizes. The default infrastructure choices in 2026 — real-time, always-on, sub-second latency — are optimizing for product categories that aren't yours. If your users read episodically, browse periodically, or think before they act, the latency numbers that make Twitter feel alive are just waste heat for your product.