Comments (101)
- susam: A little shell function I have in my ~/.zshrc:

      pages() {
        for _ in {1..5}; do
          curl -sSw '%header{location}\n' https://indieblog.page/random | sed 's/.utm.*//'
        done
      }

  Here is an example output:

      $ pages
      https://alanpearce.eu/post/scriptura/
      https://jmablog.com/post/numberones/
      https://www.closingtags.com/blog/home-networking
      https://www.unsungnovelty.org/gallery/layers/
      https://thoughts.uncountable.uk/now/

  On macOS, we can also automatically open the random pages in the default web browser with:

      $ open $(pages)

  Another nice place to discover independently maintained personal websites is: https://kagi.com/smallweb
- varun_ch: A fun trend on the "small web" is the use of 88x31 badges that link to friends' websites or appear in webrings. I have a few on my website, and you can browse a ton of small web sites that way: https://varun.ch (at the bottom of the page). There are also a couple of directories/network graphs: https://matdoes.dev/buttons and https://eightyeightthirty.one/
- 8organicbits: One objection I have to the Kagi Small Web approach is the avoidance of infrequently updated sites. Some of my favorite blogs post very rarely, but when they post it's a great read. When I discover a great new blog that hasn't been updated in years, I'm excited to add it to my feed reader, because it's a really good signal that when they publish again it will be worth reading.
- graemep:

  > The US spends ~$14,570 per person on healthcare. Japan spends ~$5,790 and has the highest life expectancy in the OECD.

  The difference in life expectancy will be influenced by multiple factors and may have more to do with diet and lifestyle than with healthcare. Japan also spends less per capita than the UK, France or Germany. The US spends a lot more than any of those, so the US system is bad value for money.
- GuB-42: I don't expect many people to agree, but I think that the "small web" should reject encryption, which is the opposite of the direction Gemini is taking.

  I don't deny the importance of encryption; it is really what shaped the modern web, allowing for secure payment, private transfer of personal information, etc. See where I am getting at? Removing encryption means that you can't reasonably do financial transactions, accounts and access restriction, exchange of private information, and so on. You only share what you want to share publicly, with no restrictions. It seriously limits commercial potential, which is the point.

  It also helps technically. If you want to make a tiny web server, like on a microcontroller, encryption is the hardest part. In addition, TLS comes with expiring certificates, requiring regular maintenance; you can't just set up your server and leave it alone for years, still working. Dropping it can also bring back simple caching proxies, great for poor connectivity.

  Two problems remain with the lack of encryption. The first is authenticity: anyone can man-in-the-middle the connection and change the web page, and TLS prevents that. But what I think is an even better solution is to handle it at the content level: sign the content, like a GPG signature, not the server. This way you can guarantee the authenticity of the content, no matter where you are getting it from.

  The other thing is the usual argument about oppressive governments. Well, if you want to protect yourself, TLS won't save you: you will be given away by your IP address. They may not see exactly what you are looking at, but the simple fact that you are connecting to a server containing sensitive data may be evidence enough. Protecting your identity is what networks like Tor are for, and you can hide a plain-text server behind the Tor network, which would act as the privacy layer.
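  The content-signing idea in the comment above can be sketched with detached signatures; a minimal illustration, not the commenter's actual tooling, assuming OpenSSL 1.1.1+ and purely hypothetical file names (site.key, index.gmi):

  ```shell
  # Sign the page itself, not the transport, so any mirror or caching
  # proxy can serve it and readers can still verify authenticity.

  # One-time: generate a signing keypair and publish the public half.
  openssl genpkey -algorithm ED25519 -out site.key
  openssl pkey -in site.key -pubout -out site.pub

  # Publish each page together with a detached signature.
  printf 'Hello from my plain-text site.\n' > index.gmi
  openssl pkeyutl -sign -inkey site.key -rawin -in index.gmi -out index.gmi.sig

  # Any reader, fetching from any mirror, verifies against site.pub.
  openssl pkeyutl -verify -pubin -inkey site.pub -rawin -in index.gmi -sigfile index.gmi.sig
  ```

  Verification succeeds regardless of which server or cache delivered index.gmi, which is exactly the property the comment wants: authenticity of the content, independent of where it came from.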
- afisxisto: Cool to see Gemini mentioned here. A few years back I created Station, Gemini's first "social network" of sorts, still running today: https://martinrue.com/station
- ZebrasEat: Has there been any effort toward taking any of these small web approaches into a Headscale-type space? My preference would be a private area where whitelisting prevents crawling or scraping. I am baffled why someone hasn't created a Headscale server and started distributing nodes to personally known, good-intentioned humans. Anyone ever heard of anything like this?
- freediver: Kagi Small Web has about 32K sites, and I'd like to think that we have captured most of the (English-speaking) personal blogs out there (we are adding about 10 per day, and a significant effort went into discovering/finding them). It is kind of sad that the entire size of this small web is only ~30K sites these days.
- 627467: I read a lot against monetization in the comments. I think that's because we are used to monetization being so exploitative, filled with dark patterns and bad incentives, on the Big Web. But it doesn't need to be this way: the small web can also be about sustainable monetization. In fact there's a whole page on that at https://indieweb.org/business-models. There's nothing wrong with "publishers" aspiring to get paid.
- danhite: Isn't this a simple compute opportunity?

  > March 15 there were 1,251 updates [from feed of small websites ...] too active, to publish all the updates on a single page, even for just one day. Well, I could publish them, but nobody has time to read them all.

  If the reader accumulates a small set of whitelist keywords, perhaps selected via optionally generating a tag-cloud UI, then that estimated 1,251 likely drops to about a single page (most days). If you wish to serve that as noscript, it would suffice to partition in/visible content, e.g. by <section class="keywords ...">, and let the user apply CSS (or script by extension or bookmarklets) to reveal just their locally known interests.
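  The whitelist idea above can be sketched with nothing more than grep; a minimal, hypothetical illustration (the file names, update titles and keywords are all made up):

  ```shell
  # Filter a day's feed-update titles down to the reader's locally
  # chosen whitelist keywords.
  cat > updates.txt <<'EOF'
  Baking sourdough again
  Notes on the Gemini protocol
  Photos from my hike
  Self-hosting a small web server
  EOF

  cat > keywords.txt <<'EOF'
  gemini
  self-hosting
  EOF

  # -i: case-insensitive; -f: treat each line of keywords.txt as a pattern
  grep -i -f keywords.txt updates.txt
  ```

  Of 1,251 daily updates, only the lines matching the reader's keywords survive; the same partition could be done server-side into visible/hidden sections for the noscript variant the comment describes.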
- upboundspiral: I think the article briefly touches on an important point: people still write blogs, but they are buried by Google, which now optimizes its algorithm for monetization and not usefulness.

  Anyone interested in seeing what the web looks like when a search engine selects for real people and not SEO-optimized slop should check out https://marginalia-search.com. It's a search engine with the goal of finding exactly that: blogs, writings, all by real people. I am always fascinated by what it unearths, and it really is a breath of fresh air. It's currently funded by NLnet (temporarily), and the project's scope is really promising. It's one of those projects that I really hope succeeds long term.

  The old web is not dead, just buried, and it can be unearthed. In my opinion an independent, non-monetized search engine is a public good as valuable as the Internet Archive. So far as I know, Marginalia is the only project that, instead of just taking Google's index and massaging it a bit (like all the other search engines), is truly seeking to be independent and practical in its scope and goals.
- shermantanktop: This is a specific definition of "small web" which is even narrower than the one I normally think of. But reading about Gemini, it does make me wonder if the original sin is client-side dynamism.

  We could say: that's JavaScript. But some JavaScript operates only on the DOM; it's really XHR/fetch and friends that are the problem. We could say: CSS is OK. But CSS can fetch remote resources, and if JS isn't there, I wonder how long it would take for ad vendors to come up with CSS-only solutions... or maybe they have already?
- lasgawe: Mm, yeah. I like the idea of the small web not as a size category but as a mindset: people publishing for the sake of sharing rather than optimizing for attention or monetization.
- lich_king: It's easy to hand-curate a list of 5,000 "small web" URLs. The problem is scaling. For example, Kagi has a hand-curated "small web" filter, but I never use it because far more interesting and relevant "small web" websites are outside the filter than in it. The same is true for most other lists curated by individuals. They're neat, but also sort of useless because they are too small: 95% of the things you're looking for are not there. The question is how you take it to a million. There probably are at least that many good personal and non-commercial websites out there, but if you open it up, you invite spam and slop.
- dwg: Love the irony: man builds a Gemini-style feed aggregator for the small web, finding it, well, not so small.
- trinsic2: Can anyone point me to the best place to get Castor going? I can't install it on my 22.04 install; unmet dependencies...
- jmclnx: I moved my site to Gemini on sdf.org; I find it far easier to use and maintain. I also mirror it on Gopher. Maintaining both is still easier than dealing with *panels or hosting my own. There is a lot of good content out there, for example: gemini://gemi.dev/

  FWIW, Dillo now has plugins for both Gemini and Gopher, and the plugins work fine on the various BSDs.
- Gunax: It's sad how the small web became invisible. I used to use all sorts of small websites in 2005, but by 2015 I used only about 10 large ones. Like many changes, I cannot pinpoint exactly when this happened; it just occurred to me someday that I do not run into many unusual websites any longer.

  It's unfortunate that so much of our behavior is dictated by Google. I don't think it's malicious or even intentional, but at some point they stopped directing traffic to small websites. And like a highway closure rippling through small-town economies, it was barely noticed by travellers but devastating to the recipients. What were once quaint sites became abandoned.

  The second force seems to be video. Because video is difficult and expensive to host, we moved away from websites. Travel blogs were replaced with travel vlogs. Tutorials became videos.
- romaniv: Small Web, IndieWeb and Gemini are terminally missing the point. The web in the 90s was an ecosystem that attracted people because of experimentation with the medium, diversity of content and certain free-spirited social defaults. It also attracted attention because it was a new, exciting and rapidly expanding phenomenon. To create something equivalent right now you would need to capture those properties, rather than try to revive old visual styles and technology. For a while I hoped that VR would become the new World Wide Web, but it was successfully torpedoed by the Metaverse initiative.
- tonymet: I'm not sold on Gemini: less utility and weaker, immature tools. Investing in small HTTP-based websites is the right direction. One could formalize it as a browser extension or a small-web HTTP proxy that limits JS, DOM size, cookie access, etc., using existing web browsers and user agents.
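  The proxy idea above can be sketched very roughly in shell; this is only an illustration of the filtering step (a real proxy would need a proper HTML parser, and this sed pattern only handles single-line script blocks; the file names are hypothetical):

  ```shell
  # Strip <script> tags from a page before it reaches the browser,
  # the kind of rewrite a "small-web HTTP proxy" could apply.
  cat > sample.html <<'EOF'
  <html><head><script src="tracker.js"></script></head>
  <body><p>Actual content.</p><script>track()</script></body></html>
  EOF

  # Remove script elements; everything else passes through untouched.
  sed -e 's|<script[^>]*>[^<]*</script>||g' sample.html > filtered.html
  cat filtered.html
  ```

  The same pipeline position could enforce the other limits the comment mentions (response size caps, dropping Set-Cookie headers), while leaving the page usable in any existing browser.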
- tonymet: Hats off to https://1mb.club/ and https://512kb.club/ for cataloging and featuring small web experiences.
- heliumtera: How many would be left after removing self-promotion, AI-generated content and "how I use AI" posts? (Claude Code, like everybody else.)
- myylogic: Great work.