
Comments (98)

  • PunchyHamster
    Our developers managed to hit around 750 MB per website open once. They had put in a ticket with ops that the server was slow and could we look at it. So we looked. Every single video on a page with a long video list pre-loaded a part of itself. The only reason the site didn't run like shit for them is because the office had direct fiber to our datacenter a few blocks away. We really shouldn't allow web developers more than 128 kbit of connection speed; anything more and they just make nonsense out of it.
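The pre-loading problem described above can be audited with a quick script. This is a minimal sketch using Python's standard-library HTML parser; it assumes the browser default (a `<video>` without `preload="none"` may fetch metadata or data up front), and the sample HTML is illustrative, not taken from the site in question.

```python
# Sketch: flag <video> tags that may pre-load data on page load.
# Browsers default to preload="metadata" (or "auto") when the
# attribute is missing, so only preload="none" is safe here.
from html.parser import HTMLParser

class PreloadAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.offenders = 0  # count of videos that may pre-load data

    def handle_starttag(self, tag, attrs):
        if tag == "video" and dict(attrs).get("preload") != "none":
            self.offenders += 1

# Illustrative sample: a "long video list" page like the one described.
sample = """
<video src="a.mp4"></video>
<video src="b.mp4" preload="auto"></video>
<video src="c.mp4" preload="none"></video>
"""

audit = PreloadAudit()
audit.feed(sample)
print(audit.offenders)  # videos likely to fetch data on page load
```

On the sample above, the first two videos are flagged; only the third opts out of pre-loading.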
  • vsgherzi
    Modern web dev is ridiculous. Most websites are an ad-ridden tracking hellscape. Seeing sites like HN, where every line of JS is taken seriously, is a godsend. Make the web less bloated.
  • frereubu
    When working at the BBC in the late 90s, the ops team would start growling at you if a site's home page was over 70kb...
  • hilbert42
    These days the NYT is in a race to the bottom. I no longer even bother to bypass the ads, let alone read the news stories, because of its page bloat and other annoyances. It's just not worth the effort.

    Surely news outlets like the NYT must realize that savvy web surfers like yours truly, when encountering "difficult" news sites (those behind paywalls and/or with megabytes of JavaScript bloat), will just go elsewhere or load pages without JavaScript. We'll simply cut the headline from the offending website, paste it into a search engine, and find another site with the same or similar info but easier access.

    I no longer think about it; by now my actions are automatic. Rarely do I find an important story that's limited to only one website; generally dozens have the story, and because of syndication the alternative site one selects often even has identical text and images.

    My default browsing is with JavaScript defaulted to "off", and it's rare that I have to enable it (which I can do with just one click). I never see ads on my Android phone or PC, and that includes YouTube. Disabling JavaScript on webpages nukes just about all ads; they just vanish, and any that slip through are then trapped by other means. In short, ads are optional. (YouTube doesn't work sans JS, so just use NewPipe or PipePipe to bypass ads.)

    Disabling JavaScript also makes pages blindingly fast, as all that unnecessary crap isn't loaded. Also, sans JS it's much harder for websites to violate one's privacy and sell one's data.

    Do I feel guilty about skimming off info in this manner? No, not the slightest bit. If these sites played fair it'd be a different matter, but they don't. As they act like sleazebags they deserve to be treated as such.
  • cjs_ac
    My family's first broadband internet connection, circa 2005, came with a monthly data quota of 400 MB.

    The fundamental problem of journalism is that the economics no longer work out. Historically, the price of a copy of a newspaper barely covered the cost of printing; the rest was covered by advertising. And there was an awful lot of advertising: everything was advertised in newspapers. Facebook Marketplace and Craigslist were a section of the newspaper, as was whichever website you check for used cars or real-estate listings. Journalism had to be subsidised by advertising, because most people aren't interested enough in the news to pay the full cost of quality reporting; nowadays, the only newspapers that are thriving are those that aggressively target readers with an immediate financial interest in knowing what's going on: the Financial Times, Bloomberg, and so on.

    The fact is that for most people, the news was interesting because it was new every day. Now that there is a more compelling flood of entertainment on television and the internet, news reporting is becoming a niche product.

    The lengths that news websites go to, to extract data from their readers to sell to data brokers, are just a last-ditch attempt to remain profitable.
  • userbinator
    I also use and like the comparison in units of Windows 95 installs (~40MB), which is also rather ironic in that Win95 was widely considered bloated when it was released.While this article focuses on ads, it's worth noting that sites have had ads for a long time, but it's their obnoxiousness and resource usage that's increased wildly over time. I wouldn't mind small sponsored links and (non-animated!) banners, but the moment I enable JS to read an article and it results in a flurry of shit flying all over the page and trying to get my attention, I leave promptly.
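The commenter's unit is easy to turn into a helper. This is a toy sketch; the ~40 MB figure per install is the approximation from the comment above, and the function name is made up for illustration.

```python
# Toy sketch: express a page weight in "Windows 95 installs",
# using the ~40 MB per-install figure mentioned in the comment.
WIN95_INSTALL_MB = 40  # approximate size of a Windows 95 install

def in_win95_installs(page_mb: float) -> float:
    return page_mb / WIN95_INSTALL_MB

# The article's 49 MB page comes out to a bit over one full install.
print(round(in_win95_installs(49), 1))  # -> 1.2
```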
  • dizzy9
    I remember in 2008, when Wizards of the Coast re-launched the official Dungeons & Dragons website to coincide with the announcement of the fourth edition rules. The site was something in the region of 4 MB, plus a 20 MB embedded video file. A huge number of people were refreshing the site to see what the announcement was, and it was completely slammed. Nobody could watch the trailer until they uploaded it to YouTube later.

    4 MB was an absurd size for a website in 2008. It's still an absurd size for a website.
  • galphanet
    This is just the tip of the iceberg. Don't get me started on airline websites (looking at you, Air Canada), where the product owner, designers, and developers are not able to get a simple workflow straight without loading megabytes of useless JavaScript and interrupting the user journey multiple times. Give me back a command-line terminal like Amadeus; that would be perfect.

    How can we go back to a Web where websites are designed to be used by the user and not for the shareholders?
  • h4ch1
    This rubbish also exists disproportionately on recipe and cooking websites. You have 20 ads scattered around, an autoplaying video of some random recipe or ad, 2-3 popups to subscribe or buy some affiliated product, and then the author's life story and a story ABOUT the recipe before I am able to see the detailed recipe in the proper format. It's second nature for me to open all these websites in reader mode at this point.
  • decimalenough
    This is why people continue to lament Google Reader (and RSS in general): it was a way to read content on your own terms, without getting hijacked by ads.
  • the_snooze
    It's really hard to consider any kind of web dev as "engineering." Outcomes like this show that they don't have any particular care for constraints. It's throw-spaghetti-at-the-wall YOLO programming.
  • ray023
    I think it's a GOOD thing, actually, because all these publications are dying anyway. And even if you filter out all the ad and surveillance trash, you are left with trash propaganda and brain-rot content. Why even make the effort of extracting the actual text from some "journalist" at these propaganda outlets? It's not even worth it. If people tune out only because of how horrible the sites are, good.
  • mrb
    Let's play a fun prediction game: I ask HN readers, what will the page size of NYTimes.com be in 10 years? Or 20 years? Want to bet 100 MB? 1 GB? Is it unthinkable? 20 years ago, a 49 MB home page was unthinkable.
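For scale, the bet can be sanity-checked with a simple compound-growth sketch. The ~100 KB starting size 20 years ago is an assumption for illustration, not a figure from the thread; only the 49 MB endpoint comes from the article.

```python
# Sketch: if home pages grew from ~100 KB to 49 MB over 20 years,
# what does the same compound growth rate predict going forward?
start_bytes = 100e3  # assumed typical home-page size 20 years ago
now_bytes = 49e6     # the 49 MB page from the article
years = 20

# Annual growth factor implied by that trajectory.
rate = (now_bytes / start_bytes) ** (1 / years)  # ~1.36x per year

in_10 = now_bytes * rate ** 10  # roughly 1.1 GB in 10 years
in_20 = now_bytes * rate ** 20  # exactly 490x again: ~24 GB in 20 years

print(f"{rate:.2f}x/yr, 10y: {in_10/1e9:.1f} GB, 20y: {in_20/1e9:.0f} GB")
```

Under that (admittedly crude) extrapolation, a 1 GB home page in 10 years is not a wild bet at all.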
  • zahlman
    This site more or less practices what it preaches. `newsbanner.webp` is 87.1KB (downloaded and saved; the Network tab in Firefox may report a few times that, and I don't know why); the total image size is less than a meg, and then there's just 65.6KB of HTML and 15.5KB of CSS.

    And it works without JavaScript... but there does appear to be some tracking stuff: a deferred call out to Cloudflare, a hit counter (I think?), and some inline stuff at the bottom that defers some local CDN thing the old-fashioned way. NoScript catches all of this, and I didn't feel like allowing it in order to weigh it.
  • mvrckhckr
    Only major media can get away with this kind of bloat. For the normal website, Google would never include you in the SERPs even if your page is a fraction of that size.
  • lousken
    Rule #1 is to always give your JS devs only Core 2 Quad CPUs + 16GB of RAM. They won't be able to complain about low memory, but their experience will be terrible every time they try to shove something horrible into the codebase.
  • Crowberry
    I hate this trend of active distraction. Most blogs have a popup asking you to subscribe as soon as you start scrolling. It's as if everyone designed their website around the KPI of irritating their visitors and getting them to leave ASAP.
  • napolux
    and the NYT web team was praised as one of the best in the world some (many?) years ago.
  • throwatdem12311
    A 49 MB web page? Try a 45 MB GraphQL response.
  • shevy-java
    uBlock Origin helps mitigate this at least a little bit here.
  • Bratmon
    Maybe I'm just getting old, but I've gotten tired of these "Journalists shouldn't try to make their living by finding profitable ads, they should just put in ads that look pretty but pay almost nothing and supplement their income by working at McDonalds" takes.