Despite faster broadband every year, web pages don't load any faster

wodenokoto

Not just load time, but "time to read content" has exploded. Once the webapp has downloaded and can start making REST-like requests for the actual content, it also needs to start loading the pop-ups and the "continue reading" button that can hide the content after it has loaded.

So once all that is done, the user needs to click away the cookie consent banner, the newsletter sign-up and the continue reading button. And only now can we stop the clock on "time to read content".

This is a big reason why I read comments first. I click and get straight to content.

EDIT: I just realized that the "time to ..."-moniker works really badly in my phrasing here. Maybe "time to start reading" would have been better.

roydivision

'Reader View' plugins for Chrome [1], and the native one in Safari, help me a lot with this: cut out all the fat and you're left with only the meat.

Doesn't help with load times though; it would be an interesting exercise for such a plugin to fetch only the content, à la Lynx back in the day.

[1] https://add0n.com/chrome-reader-view.html
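Along those lines, a rough sketch of the "fetch only the content" idea (not how the linked extension works; this assumes Node 18+ with the jsdom and @mozilla/readability packages):

    import { JSDOM } from "jsdom";
    import { Readability } from "@mozilla/readability";

    // Fetch the raw HTML once, never run its scripts, and keep only the article text.
    async function fetchReadable(url: string): Promise<string | null> {
      const html = await (await fetch(url)).text();   // single request, no JS executed
      const dom = new JSDOM(html, { url });           // parse only
      const article = new Readability(dom.window.document).parse();
      return article ? `${article.title}\n\n${article.textContent}` : null;
    }

    fetchReadable("https://example.com/some-article").then(console.log);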

kevin_thibedeau

NoScript gives you a 90s style web experience. It's vastly superior to what passes for normal these days and blows away AMP pages on page load performance.

UI_at_80x24

I was a huge fan of NoScript when it first came out and depended on it extensively, until Firefox changed something on their backend that broke NoScript. Then NoScript released an updated version of the extension, and honestly I had a terrible time trying to figure out how to use it.

IIRC there was some kind of grid format that was entirely opaque to me, yet the developer must have thought it was an improvement over the previous "allow X.domain.com temporarily" vs. "allow X.domain.com permanently" options that I was used to.

I should give it another try, maybe it's not as obscure as it once was.

EDIT: I've just reinstalled it, and it is much more intuitive than it was before. Thank you for reminding me about it.

GoblinSlayer

And for the most egregious cases of broken design, which are sadly regular, you need to disable styling too.

r00fus

The "open all links in Reader view" option in iOS is amazing. I whitelist sites where it breaks or if I need translate, but load times are faster too.

Kiro

What's the point of "continue reading"? Why not just show the full thing immediately?

dspillett

Watch the network tab in your browser's DevTools when you click it, and you'll see.

Even if the whole text is loaded with the initial page, you'll see a request to somewhere to record that you clicked. Your engagement has been measured. This can be helpful for the site directly (which articles do people actually care about past the first paragraph?), and the fact that people are engaged enough to click for more is something they can “sell” to advertisers. A better-designed site will make the “read more” button an actual link, so if you have JS disabled (or it fails to load), instead of the content reveal simply failing, it falls back to a full-page round trip and you are counted that way.

This could be done by picking up scroll events or running visibility tests on lower parts of the article instead of asking the user to click, but those methods are less accurate for a number of reasons (people with big screens, people with JS disabled or where JS failed to load, …).
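To make that concrete, here is a minimal sketch of the progressive-enhancement pattern described above (all names are made up, not taken from any particular site):

    // The "read more" control is a real link whose href points at the full article,
    // so with JS disabled the click becomes a normal page request the server can count.
    const readMore = document.querySelector<HTMLAnchorElement>("a.read-more");

    readMore?.addEventListener("click", (event) => {
      event.preventDefault();                                  // only happens if JS actually loaded
      document.querySelector(".article-rest")?.classList.remove("hidden");
      readMore.remove();

      // sendBeacon records the engagement without blocking rendering or navigation.
      navigator.sendBeacon("/metrics/read-more", JSON.stringify({
        article: location.pathname,
        ts: Date.now(),
      }));
    });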

depingus

Once upon a time, web browsers were equipped with a STOP button. As soon as the thing you were interested in had loaded, you could press it and...everything else on the web page would just stop loading. Unfortunately, nowadays the thing you are interested in is hiding behind mountains of cruft and is usually the last thing to load.

exo762

So much effort to be able to claim that you do targeted advertising, which is yet to be proven effective.

praptak

It's obviously to show more ads.

Kiro

Can't be that obvious considering every reply claims a different reason.

rr888

My guess is that it figures out who actually wants to read the article, as opposed to accidental clicks, scrapers, etc.

Kiro

First reason here that I actually think is valid. Thanks!

lrem

A measuring point.

emiliobumachar

Don't web developers get full access to the state of the user's scrolling?

pacifika

In theory, on a list of stories this would give you an excerpt and help you scan the list for interesting pieces.

In practice, if editors don't write excerpts, this is just the first paragraph, and if there are no other stories, well, it's a measurement of engagement at that point.

sethammons

Originally, they saved bandwidth; high-res images would take time to load. I believe the practice continues because they want you to see the other potential articles/ads below.

londons_explore

Makes those user engagement metrics look good?

JohnFen

Those buttons drive me as crazy as cookie banners seem to drive others.

illuminati1911

”So once all that is done, the user needs to click away the cookie consent banner, the newsletter sign-up and the continue reading button. And only now can we stop the clock on "time to read content".”

You forgot the paywall that you will see at this point.

Maybe there is also a customer service bot saying: ”Hey! Ask me about our special offer on 12 month subscription!”

sidewndr46

Don't forget about the banner advertisement which is lazily loaded and shifts the content you're reading so far down you can't see it anymore.

kamarg

And the banner asking you to install their PWA because you wanted to read one article on their site. Also don't forget to enable push notifications so they can tell you when they post a new update.

2Gkashmiri

Haha... I am building a browser extension that does login and bypasses the captcha by sending the captcha image to a server and getting a response back. That round trip takes ~6 seconds. The "normal" way of handling it was designed as: username<tab>password<tab><captcha send><show captcha-getting banner><spinner><captcha get><captcha paste><tab><enter>

This "felt" like too much work, so I thought of redesigning the workflow, and we ended up with: <captcha send>username<wait><tab>password<wait><tab><show captcha-getting banner><wait><captcha paste><tab><enter>

We made the entire process take time to accommodate the ~6 seconds of captcha, which now does not "feel" like it takes that long, because people today are not happy with loading and wait spinners, or with stuff taking time in general; they want everything to be instant. This is just gaming the system so that we can work around technical limitations.
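A sketch of that reordering (every name here is hypothetical, not the extension's actual code): start the slow captcha round trip first, let the typing happen while it is in flight, and only await the result at the end.

    interface FormPage {
      captchaImage(): Blob;
      type(selector: string, text: string): Promise<void>;
      click(selector: string): Promise<void>;
    }

    // The ~6 second server round trip.
    declare function solveCaptchaRemotely(img: Blob): Promise<string>;

    async function login(page: FormPage, username: string, password: string) {
      // Kick off the captcha solve immediately, without awaiting it...
      const captchaText = solveCaptchaRemotely(page.captchaImage());

      // ...then do the visible typing while the request is in flight.
      await page.type("#username", username);
      await page.type("#password", password);

      // By the time the captcha field is reached, the answer is usually ready,
      // so the ~6 seconds are hidden behind work the user expects to see anyway.
      await page.type("#captcha", await captchaText);
      await page.click("#submit");
    }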

modeless

I believe we are almost at peak GUI. The endgame here is that all these crappy GUIs that are getting worse every year will be relegated to a role of being APIs for AI agents.

Instead of clicking around and filling out forms and waiting for loading spinners all the time, we'll just tell a large language model what we want to do in English, and it will go off and screen-scrape a bunch of apps and websites, do all the clicking for us, and summarize the results in a much simpler UI designed to actually be fast and useful, vs. designed to optimize the business metrics of some company as interpreted by a gaggle of product managers.

This isn't unprecedented. Plaid screen scrapes terrible bank websites and turns them into APIs, though without AI. Google Duplex uses AI to turn restaurant phone numbers into an API for making reservations. DeepMind's Sparrow[1], just announced today, answers factual questions posed in plain English by performing Google searches and summarizing the results. But it's going to be a revolution when it becomes much more general and able to take actions rather than just summarize information. It isn't far off! https://adept.ai is pretty much exactly what I'm talking about, and I expect there are a lot more people working on similar things that are still in stealth mode.

[1] https://www.deepmind.com/blog/building-safer-dialogue-agents

makeitdouble

You mention it in your response, but this whole paradigm looks like an extension of what we have with Google Search.

And Google Search is so utterly weak. Not just because of the neutering of search options or the weird priority conflict inside Google, but because even with literally all the data in the world about a specific user, it doesn't seem like it can work out what a request actually means.

It can tell me the time in Chicago, but not what computer would actually be best for my work. That search will only return spam, irrelevant popular results and paid reviews.

Same if I asked for a _good_ pizza recipe, it would probably not understand what that actually means for me.

The whole model of "throwing a request in the box and expecting a result" seems broken to me. I mean, even between humans it doesn't work that way, so why would it work with an advanced AI?

PS: even with more back and forth, I'm imagining something like what we have now with customer support over chat; while that's more efficient than by phone, it's definitely not the interface I want by default.

throwaway0asd

I have a web app that is essentially a full OS GUI in the browser. The load time in Chrome with state restoration is almost exactly 0.3 seconds, which includes asynchronous gathering of file system artifacts and rendering display thereof.

At the same time, many primarily text sites, such as news and social media, struggle to load in 15 seconds.

My connection measures about 920 Mbps down, so 15 seconds is ridiculously slow. This is certainly not a technology problem, as evidenced by my own app doing so much more in far less time.

modeless

You are right. It is not a technology problem. It is a problem of incentives. The people writing the software don't have the right incentives to make it pleasant to use.

mod

Some of them do, and then we end up with social media that doesn't take 15 seconds to load.

This site, for instance.

jl6

Using a LLM as an agent will be comical, like a supervillain trying to explain a menial chore to a dimwitted henchman: “No, you fool, that’s not what I meant when I said bring me the head of R&D!”

Then, maybe LLMs will get more intelligent and it will be less comical, and more like having an actual slave. An agent intelligent enough to do all these things is probably intelligent enough to make me queasy at the thought of being its master.

bretbernhoft

Your point about more people working on similar projects in stealth mode is likely an underappreciated notion. And thank you for introducing me to Adept; looking forward to keeping my eyes on that company's progress.

nathias

How naive. We have long since moved past peak GUI; now the game is infesting GUIs and rearranging them around the need to serve ads.

systemvoltage

> we'll just tell a large language model what we want to do in English

Language is an imprecise tool; it has inherent ambiguity. It is also laborious and one-dimensional (a stream of bits over the temporal dimension). A tool such as the one you describe would be extremely frustrating to use.

modeless

If you imagine asking Siri or Google Assistant to do things for you, of course that would be incredibly frustrating; they suck. That style of HCI is a dead end for sure. However, asking a human personal assistant to do things for you wouldn't be so frustrating. Especially if they know you well. That is the kind of experience I'm talking about. Yes, I do believe it is possible in the not too distant future. Maybe 10 years or so. It won't be as good as a human assistant at first, but it will not be like the speech interfaces you're used to.

civilized

They won't go down without a fight. They want real human eyeballs.

But I wonder if there will come a point where captchas and "Not a Robot" checkboxes no longer work.

modeless

If AI can't solve the captchas, we can just hire people to do it. Services for that already exist and they are fairly cheap.

burlesona

We can wish that crappy websites didn’t get more bloated every year, but as long as they still load fast enough most people don’t know the difference or care. Everything has an opportunity cost, so once things are good enough the resources that could be spent optimizing will be spent elsewhere instead, and bloat that can be added to solve other business goals will be added.

It’s understandable to get frustrated by this, but at some point you realize it’s pointless.

This is true in many, many facets of life. Household possessions tend to expand to fill the available square footage. Cities sprawl haphazardly until commute times become unbearable. Irrigation expands until the rivers are depleted. Life expands to the limit, always.

LAC-Tech

That would only be true if every one of your customers had fast, loss free, reliable internet 100% of the time they want to access a website.

Even in a modern city this is rarely the case.

So I'm afraid the real answer is that webdev is just not mature yet.

deburo

I blame the devs before I blame the platform. Relational databases are mature enough and yet devs still find ways to create queries that time out with only a few thousand rows in their tables.

jacobjr23

Businesses don't need to provide a perfect experience to every one of their customers. If they needed to, they would. As previously said, businesses are good at satisfying customer needs as dictated by their customers' wallets.

dazc

Ironically, the worst website I have to regularly navigate is my ISP's.

danburbridge

See also Induced demand/traffic: building more roads/adding lanes to roads increases congestion rather than decreasing it: https://bettertransport.org.uk/sites/default/files/trunk-roa... https://en.wikipedia.org/wiki/Induced_demand

teeray

I thought about exactly this when I saw this headline. Extra bandwidth just means we can have video header images rather than a still. Here, have a few more tracking scripts.

fmajid

Throughput of networks is increasing but latency doesn’t improve as much (although 5G wireless is a big improvement over 4G LTE). Also TCP slow-start limits how fast an HTTP connection can go, and most are short-lived so the connection is still ramping up when it is closed. That’s one of the supposed benefits of QUIC a.k.a. HTTP/3.
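A back-of-the-envelope illustration of the slow-start point (assuming a typical initial congestion window of 10 segments of ~1460 bytes, doubling every round trip; receive windows and loss are ignored):

    function bytesDeliverable(roundTrips: number, rttMs = 50): void {
      const mss = 1460;   // bytes per segment
      let cwnd = 10;      // initial congestion window, in segments
      let total = 0;
      for (let i = 1; i <= roundTrips; i++) {
        total += cwnd * mss;
        console.log(`${i * rttMs} ms: ~${Math.round(total / 1024)} KB delivered`);
        cwnd *= 2;        // exponential growth during slow start
      }
    }

    bytesDeliverable(5);
    // ~14 KB after 1 RTT, ~43 KB after 2, ~100 KB after 3 — a short-lived
    // connection closes long before it gets anywhere near the line rate.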

And then there is bloat, the scourge of JavaScript frameworks and what passes for front-end development nowadays.

nibbleshifter

> and then there is bloat, the scourge of JavaScript frameworks and what passes for front-end development nowadays.

It keeps getting worse.

When I do web app security assessments, I end up with a logfile of all requests/responses made during browsing a site.

The sizes of these logfiles have ballooned over the past few years, even controlling for site complexity.

Many megabytes of JS shit, images, etc. being loaded, and often without being cached properly (so they get reloaded every time).

A lot of it is first-party framework bloat (active choices by the web devs), but a lot is third-party bloat - all the adtech and other garbage that gets loaded every time (also without caching) for analytics and tracking.

fmajid

Yes, and it's not as if tools like Google's Lighthouse/PageSpeedInsights or Mozilla's Firefox Profiler haven't been available for years.

Economists and lawmakers have determined that the economic benefits of personalization accrue to ad middlemen like Google, not to the publishers who have to encumber their sites with all the surveillance beacons, but the reality of the market is that publishers have no leverage. That said, most of those beacons are set with the async/defer attribute and should not have a measurable impact on page load speed.

acdha

> but a lot is third-party bloat - all the adtech and other garbage that gets loaded every time (also without caching) for analytics and tracking.

This is also amusing from a change-management perspective at large organizations: want to tweak an Apache setting? Spend a month getting CAB approval and wait for a deployment window.

Want to inject unreviewed JavaScript onto every page in the domain? Log in to Adobe…

RF_Savage

On mobile, latency has constantly improved: GPRS/EDGE was often 100+ ms ping --> 3G was 400-50 ms depending on the day --> 4G/LTE 50-30 ms, and now I'm often getting sub-20 ms pings on 5G connections.

fmajid

Yes, moving from voice-centric to IP-centric network technology has helped, and 5G paid laudable attention to latency. The target for 5G is actually 1 ms, but of course backhaul will dominate that.

gnyman

I had heard the 1 ms number too and did some research a while ago. As far as I understood, the 1 ms is for specialised low-latency connections and devices, not for everyday browsing.

Here are two sources which I found useful at that time https://broadbandlibrary.com/5g-low-latency-requirements/ https://www.linkedin.com/pulse/we-need-talk-low-latency-dean...

gsich

1ms only to the tower.

mostlystatic

Desktop bandwidth is improving over time, but as I understand it, HTTP Archive is still using a 5 Mbps cable connection.

From their FAQ/changelog [1]:

> 19 Mar 2013: The default connection speed was increased from DSL (1.5 mbps) to Cable (5.0 mbps). This only affects IE (not iPhone).

There was another popular article on HN a while ago [2], claiming mobile websites had gotten slower since 2011. But actually HTTP Archive just started using a slower mobile connection in 2013. I wrote more about that issue with the HTTP Archive data at the time [3].

[1] https://httparchive.org/faq [2] https://www.nngroup.com/articles/the-need-for-speed/ [3] https://www.debugbear.com/blog/is-the-web-getting-slower

mostlystatic

Regarding "4 seconds wasted" per visit: HTTP Archive also publishes real-user performance data from Google, and only 10% of desktop websites take 4 seconds or more to load. (And I think that's not the average experience but the 75th percentile.) https://httparchive.org/reports/chrome-ux-report#cruxSlowLcp

The Google data uses Largest Contentful Paint instead of Speed Index, but the two metrics ultimately try to measure the same thing. Both have pros and cons. Speed Index goes up if there are ongoing animations (e.g. sliders). LCP only looks at the single largest content element.

When looking at the real-user LCP data over time, keep in mind that changes are often due to changes in the LCP definition (e.g. opacity: 0 elements used to count but no longer do). https://chromium.googlesource.com/chromium/src/+/master/docs...

yawnxyz

Despite faster computers every year, my computers don't seem to run any faster, either

CursedUrn

Software wastes any gains we get from hardware. See Jonathan Blow's talk on the subject: https://www.youtube.com/watch?v=FeAMiBKi_EM

kps

“What Andy giveth, Bill taketh away.”

dylan604

Is the quote attributed to Bill "why should I refactor code when CPUs get faster, drives get larger" an internet legend or a terribly paraphrased line said by Bill?

Gigachad

Computer speed is a lot like driving times in traffic. The amount of time things take is pretty fixed, as it's based on what people find acceptable, which doesn't change. With faster computers just come more features at the same speed.

yawnxyz

Atlanta keeps adding new lanes to highways, but the traffic jams never get any better! (There's probably better throughput though)

timmb

But software is faster to write and so in theory cheaper to buy.

JohnFen

It's just disheartening to see software racing to the bottom, quality-wise.

lostmsu

Count your pixels.

alt227

My 1920x1080 LCD monitor has FEWER pixels than my CRT monitor in the 90s :)

gpderetta

You mean that a modern computer has to push more pixels than an old one? I don't think that's enough to explain the apparent lack of progress.

ohCh6zos

This website loads an exceptional amount of data for what it is, at least for me. In my browser it makes calls to a site called streamlitapp.com every 100 ms. It appears to be using about 100 MB of bandwidth every 10 seconds, from what I can see.

robertritz

100MB is a LOT. Do you perhaps mean 100KB?

Streamlit is the framework I used to build the app at the bottom of the article. It does unfortunately load a decent amount of JS. However, it should be non-blocking, which means it won't interfere with how quickly you can see or use the page.

It pings back to Streamlit to keep your session state alive as it's running a whole Python interpreter on the backend for each session.

The speed index for this page hovers between 1 and 2 seconds when I test it.

classified

Using a wasteful app to complain about bad page design. Oh irony.

robertlagrant

It seems to ping with about 800B for a response of about the same size every 2 seconds. So every minute that's about 25KB up and 25KB down.

I wouldn't do it like that, but to call that wasteful when many websites need 5MB of ad code before they let you see anything is a bit over the top!
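For what it's worth, the arithmetic above checks out:

    const bytesPerPing = 800;                            // each way, every 2 seconds
    const pingsPerMinute = 60 / 2;
    console.log((bytesPerPing * pingsPerMinute) / 1000); // 24 KB/min each way, roughly the ~25 KB quoted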

rob74

> With home broadband reaching 70 Mbps globally

The source for that is some stats from speedtest.net, which I assume are calculated from the users who ran their speed test? So it's probably heavily skewed towards power users who have a fast connection and want to check whether they are really getting what they are paying for. Most "casual" users with shitty DSL connections are happy if "the internet" works at all and are pretty unlikely to ever use this service...

Springtime

Prior topics on HN about those speed tests (e.g. [1]) also show that ISPs prioritize them, which gives a misleading representation of typical speeds.

[1] https://news.ycombinator.com/item?id=31062799

dspillett

This is why Netflix started their own (fast.com): it makes it more difficult for ISPs to throttle NF video content (or just prioritise speedtest traffic) and then blame NF for poor performance (or poor quality, when it used more highly compressed streams to deal with low throughput) with “if you run any speed test you'll see your connection through us is fine, it must be NF being busy”.

manuelmoreale

The irony of this post is that the single heaviest resource loaded by this page is a 750 KB image used only in the meta tags as a sharing image, which most people consuming and downloading the page will never see.

What's baffling to me is how people love to spend seemingly infinite time playing with tech stacks and whatnot, but then pay very little attention to basic details like what to load and how many resources they really need.

Dma54rhs

These images don't get loaded by the browser; it's probably being used somewhere else. 750 KB is way too big for that kind of image anyway.

manuelmoreale

Ah, you're right. It's even dumber: it's used as a header image, but it's then hidden with display: none !important;

But I see the image in the network tab, so bandwidth is getting wasted for no reason.

crb

I maintain that "booting a Mac, loading Chrome, loading Google Docs, and getting ready to type", takes as long as "booting DOS, starting Windows 3.1, loading Word for Windows 2.0", takes as long as "loading GEOS from floppy disk on a C64, loading geoWrite", and so on, and so forth.

Joker_vD

If anything, it probably got slightly slower over the years.

Reminds me of that time when we switched from writing by hand (and sometimes typing on a typewriter) all kinds of forms and reports to composing them on a computer and then printing it: initial time savings were pretty huge and so, naturally, the powers that be said "well, guess We can make you fill much more paperwork than you currently are filling" and did so. In the end, the amount of paperwork increased slightly out of proportion and we're now spending slightly more time on it than we used to. A sort of a law of conservation of effort, if you will.

dspillett

Try to beat:

1. Power on BBC

2. Type *EDIT[ENTER]

Though obviously not as fully featured as a modern word processor, or even some editors of the time.

tekkk

No, c'mon. I still remember how it took ages for my old Windows machine to boot up and even get to the desktop. Maybe DOS was faster without a GUI, but it was goddamn slow at one point. And all that weird noise your PC would make while reading from the disk, not to mention the dance with the modem. With my current Mac I'd say it takes about 15 seconds to open Docs.

makeitdouble

The interesting twist to me is that waking up a phone and opening the Google Docs app takes the same length of time, or a tad less.

We haven't progressed in speed, but versatility is through the roof compared to 10-20 years ago.

Joeri

Speaking from experience: when a new site is about to ship, performance testing is done and the developers are asked to improve performance until it is just good enough. So it doesn't matter how big or small the site is; unless it is already faster than desired by shipping time, it will only get optimized to run as fast as every other site, but no faster.