
Comments (354)

  • barrkel
    I don't buy the central thesis of the article. We won't be in a supply crunch forever. However, I do believe that we're at an inflection point where DC hardware is diverging rapidly from consumer compute. Most consumers are using laptops, and laptops are not keeping pace with where the frontier is in a single compute node. Laptops are increasingly just clients for someone else's compute that you rent, or buy a time slice of with your eyeballs, much like smartphones pretty much always have been. I personally dropped $20k on a high-end desktop - 768 GB of RAM, 96 cores, 96 GB Blackwell GPU - last October, before RAM prices spiked, based on the logic that hardware had moved on but local compute was basically stagnant, and if I wanted to own my computing hardware, I'd better buy something now that would last a while. This way, my laptop is just a disposable client for my real workstation, a Tailscale connection away, and I'm free to do whatever I like with it. I could sell the RAM alone now for the price I paid for it.
  • bluejay2387
    The general take here seems to be "everything eventually passes". That isn't always true. I wonder how many people have a primary computing device that they don't even have full control over now (Apple phones, tablets...). Years ago the concept of spending over $1k on a computer that I didn't even have the right to install my own software on was considered ridiculous by many people (myself included). Now many people primarily consume content on a device controlled almost entirely by the company they bought it from. If the economics lead to a situation where it's more profitable to sell you compute time than to sell you computers, then businesses will choose to not sell you computers. I have no idea if that is what ends up happening.
  • rswail
    A long article that undercuts itself: the last paragraph or two counter the panic of the beginning. Two Chinese firms are ramping up production of consumer RAM/SSDs because they see a market opening as the existing producers move to selling to enterprise/hyperscalers. There have been memory chip panics before; the US funded RAM production back in the '80s/'90s in competition with Japan at the time. The current AI/"hyperscale" boom is almost exactly like the dotcom boom. It's already starting to shake down: Anthropic is occupying the developer space, OpenAI has just exited the video/media production space, and more focused, vertical-market AI is emerging. The current vortex of money between OpenAI <-> Microsoft <-> Oracle <-> NVIDIA <-> Google <-> etc. is going to break.
  • BLKNSLVR
    This may not be entirely appropriate to the reasons behind the article, but it feels tangentially related: I'd like to say a brief thank you for what the golden period of globalisation was able to bring us. I hope that that level of international trade and economic cooperation across geographical, ideological, political, and religious boundaries can be achieved again at some point in the future, but it seems the pendulum is swinging the other way for the time being. I hope that, wherever the current direction ends up, there are lessons to be learnt about what we had, and somehow fumbled, such that there is motivation enough to get back there.
  • saadn92
    The article's dystopia section is dramatic but the practical point is real. I've been self-hosting more and more over the past year specifically because I got uncomfortable with how much of my stack depended on someone else's servers. Running a VPS with Tailscale for private access, SQLite instead of managed databases, flat files synced with git instead of cloud storage. None of this requires expensive hardware; it just requires caring enough to set it up.
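For anyone curious what the SQLite part of such a setup looks like, a minimal sketch (the `notes` table and filename are illustrative, not the commenter's actual schema):

```python
import sqlite3

# One local file stands in for a managed database service; no server process.
conn = sqlite3.connect("notes.db")
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("hold on to your hardware",))
conn.commit()

# Reads come straight back from the same file; backing up means copying one file.
print(conn.execute("SELECT body FROM notes").fetchone()[0])
conn.close()
```

Backup then fits the same flat-file workflow: copy the file, or pipe `sqlite3 notes.db .dump` into the git repo alongside everything else.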
  • meindnoch
    I know this may sound ridiculous, but m-maybe... maybe it's time for us to make software... less bloated? Maybe... just maybe, a TODO list app shouldn't run 4 processes, and consume hundreds of megabytes of RAM?
  • upofadown
    This article inspired me to look and see what this computer is. Apparently it is an "AMD Athlon(tm) II X2 250 Processor" from 2009. So 17 years old. It has 8 GB of DDR3 memory and runs at 3 GHz. It currently has OpenBSD on it, but at least one source thinks it could run Windows 10. The fact that I didn't know any of this is what is significant here. At some point I stopped caring about this sort of thing. It really doesn't matter any more. Don't get me wrong, I am as nerdy as they come. My first computer was a wire-wrapped 8080-based system. That was followed by an also wire-wrapped 8086-based system of my own design that I used for day-to-day computing tasks (it ran Forth). If someone like me can get to the point of not caring, there is no real reason for anyone else to care.
  • dust42
    Just to mention one thing: helium - which is a necessity for chip production - is a byproduct of LNG production. And 20% of that is just gone (Qatar), and the question is how long it will take to get that back. So not only a chip shortage because of AI buying chips in huge volumes, but also because production will be hampered. Tongue in cheek: we urgently need fusion power plants. For the AI and the helium.
  • lenova
    Oh man, I've come across this person's blog before and I love it, not just because of the personalization/personality they've put into the site's design, but because of all of the random CLI/TUI-based tools they've developed. Examples: https://xn--gckvb8fzb.com/projects/ - Their GitHub repos: https://github.com/mrusme - They even built a BBS-style reader client that supports Hacker News: https://github.com/mrusme/neonmodem - I miss the days of the web being weird like this :-)
  • Velocifyer
    Why doesn't Hacker News render Punycode in domains?
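For context: the `xn--` prefix marks a Punycode/IDNA-encoded label, the ASCII form in which Unicode domain names travel over DNS. A minimal round-trip using Python's built-in IDNA codec:

```python
# Unicode hostname -> ASCII-compatible (Punycode) form, and back.
ascii_form = "münchen".encode("idna")
print(ascii_form)                 # b'xn--mnchen-3ya'
print(ascii_form.decode("idna"))  # münchen
```

Leaving the raw `xn--` form unrendered is a common anti-spoofing default, since visually similar Unicode characters (homoglyphs) can make a fake domain look legitimate - presumably that is HN's reasoning too.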
  • drillsteps5
    I very much would like to know how much of this presumably ordered (and backordered) hardware (RAM/SSD/.../wafers) is going to end up being released back to the market when the dust settles. I haven't seen any estimations but in order to put all this hardware to work the hyperscalers need to be building data centers at ludicrous speed. That should be appearing in construction data, jobs data, and many other places. Are we actually seeing any of that? Or is it all just based on the back-of-the-napkin math by Mr Altman and Co and they put all the money they got towards the future projects?
  • CraigJPerry
    The article's entire thesis looks like it can be completely derailed by one event: AI infrastructure firms ceasing to be able to secure more capital. Is that likely? History says it's inevitable, but the timeframe is an open question.
  • philip1209
  • 2716057
    As long as there are consumers paying for hardware ownership, there will be businesses willing to sell it to them. The worst scenario I could imagine is that one has to pay a premium for fully-owned hardware simply because consumers' desire for it becomes an oddity and it is thus sold in low quantities. The current AI-induced shortages aside, the times have never been better in my opinion. There is overwhelming choice; ordinary consumers can access anything from Raspberry Pis all the way up to enterprise servers and AI accelerators. The situation was very different in the 1990s when I built my first PC.
  • anonzzzies
    I do not see this from an infinite shortage point; I see this from a locked down hardware point. Old hardware is hackable, new hardware mostly not. That is for me where the real pain is and why I just buy old computers and phones that are rootable.
  • MisterTea
    To the people saying "The shortage won't last forever" - yes, you might be right. However, such a supply crunch creates a perfect vacuum for rapid change to fill in the hardware landscape of computing and shift the balance of power. Think about it like this: imagine the AI/cloud/crypto companies who are buying up all these compute and storage resources realize they now control the compute hardware market, becoming compute lords. What happens when joe/jane six-pack or company xyz needs a new PC, or two thousand of them, but can't afford them due to the supply crunch? Once the compute lords realize they control the compute supply, they will move to rent you their compute, trapping users in a walled garden. And the users won't care, because they aren't computer enthusiasts like many of us here. They only need a tool that works. They *do not* care about the details. The hardware lords could further this by building proprietary hardware in collusion with the vendors they have exclusivity with: weaker terminal devices with just enough local RAM and storage to connect to a remote compute cluster. Hardware shortage solved! All they need to do is collude with the hardware makers on circular contracts to keep buying hardware in "anticipation of the AI-driven cloud compute boom." The hardware demand cycle is kept up, and consumers are purposely kept out of the market to push them into walled gardens. This is unsustainable of course and will eventually fall over, but it could tie up computing resources for well over a decade as compute lords dry up the consumer hardware market, pushing people to use their hoarded compute resources instead of owning their own. We are in a period where computing serfdom is a likely outcome, one that could do a lot of damage to freedom of use, hardware availability, and the future ability to use the internet freely.
  • the__alchemist
    It's wild to think how, a few years ago, I didn't buy a 4090 direct from NVIDIA because "$1600 (USD) is too much to pay for a graphics card; if I need a better one, I'll upgrade in a few years." (Went with the 4080, which is substantially slower and was $1200.) Joke's on me! It will be scarcity mindset from here on out; I will always buy the top-tier thing.
  • abmmgb
    I actually think the central thesis is thought-provoking. We have shifted far away from locally installed shit to remote data centre access; this was initially driven by cloud-based initiatives and is now spiralling upwards with AI. For any researchers, hackers, builders wanting to play with locally installed AI, hardware could become a bottleneck, especially as many machines, such as the beloved Macs, are not upgradable.
  • xbmcuser
    In the last month, 20-30% of oil supply, 30% of gas supply and 30-40% of fertilizer production has been destroyed and could take anywhere from 8 months to 5 years to come back online. Governments are acting as if everything is okay so that there is no panic, but we have crossed the point of no return; even if the war ends today, food & energy shortages are on the horizon. If you can get an EV, solar, heat pumps, battery storage etc., get it now, today, as fossil-fuel-based energy prices are going to go through the roof. I see similarities to when covid hit: people kept looking at things happening in other countries and not preparing for the shit to hit their own cities and countries.
  • jleyank
    Hold onto your hardware. Hold on to your existing software and the current version. Don’t upgrade without a specific need. None of the “progress” is actually helpful to hackers and I’m not sure it’s even helpful to typical users. There’s enough information being given to and slurped by others, don’t make it more effective.
  • bob1029
    I am still rocking that 5700XT 50th anniversary edition. I see no reason it won't make it to 2030 at this point. There was a moment where I thought it was dying, but it was a combination of dust and a shader bug in BF6 that caused the concern. I've also got a 1080 Ti in case of disaster. Newer graphics hardware is pointless to me. I find the expensive new techniques (temporal AA, Nanite & friends) incredibly offensive from an interactivity standpoint. I run Battlefield 6 at 75% render scale with everything set to low. I really don't care how ass the game looks as long as it runs well. I much more enjoy being able to effectively dispatch my enemies than observe aesthetic clutter.
  • mememememememo
    In such a future, is the iPhone and Android ecosystem dead? Because a single $1k phone is a hell of a computer. So if you can still buy a phone, you can still get a computer. Local AI aside, these are very capable.
  • vladde
    When you click away to another tab, the title and favicon of the page change to something weird, but really legit-looking. A couple of my favorites: "rust programming socks - Google", "Amazon.com: waifu pillow", "Rick Astley - Never Gonna Give You Up", "censorship on hacker news - Google"
  • kelvinjps10
    I was about to upgrade because I'm using a ThinkPad T480, but I decided to optimize my computer instead. I run i3, and a couple of native apps and Chromium web apps run fast enough. And some kernel and other tweaks, plus gamemode on Arch, make gaming better. I must admit that my workflow isn't that heavy.
  • n2j3
    I wish I were well versed in dialectic/Hegelian thought, as I am sure there's a way of seeing this as a step towards the abolition of private property altogether. The question is who owns the means of production (computation), I suppose.
  • tmtvl
    I grabbed an upgrade at the end of last year because my ~10 year old workhorse is starting to show signs of aging. Despite 16 gigs of RAM having lasted me thus far I decided to bite the bullet and get 32; so I expect this new machine to last me another 10 years (although I now have a full SSD, whereas my old workhorse had an SSD for the OS and a hybrid drive for /home, so we'll see whether or not it will actually last).
  • arexxbifs
    It's not that I disagree with the basic premise and concern of the text, but I'm not convinced by the "RAM shortage will lead to thin clients" argument, because the thin client is going to be a browser. Everything today is a web app. If it doesn't exist and you want to vibe code it? It's probably going to become a web app, vibed using a web app. The problem is, web apps are stupendous memory hogs. We're even seeing Chromebooks with 8 gigs of RAM now. LLMs are all trained for and implemented in apps assuming the user can have $infinity browsers running, whether it's on their PC or on their phone. It's going to be very hard to change that in a way that's beneficial to what passes for business models at AI companies. Ah, the paradoxes of modern software.
  • pmdr
    I've seen comments on here before that went something along the lines of "adults don't care about RAM prices." HN is no stranger to siding with the oppressors.
  • darkwater
    > For the better part of two decades, consumers lived in a golden age of tech. Memory got cheaper, storage increased in capacity and hardware got faster and absurdly affordable.I got my first PC circa 1992 (a 2nd-hand IBM PS/2, 80286 processor with 2 MB RAM and 30 MB HDD) and the "golden age" was already there. We are well over 40 years into almost uninterrupted "pay less for more performance" in the home/personal computing space, and that's because that space started around 50 years ago. There was some fluctuation (remember the earthquake affecting HDD prices a few years ago?) but demand was there and manufacturing tech became more efficient. The actually important change is that, for most consumer uses, the perf improvements stopped making sense already, what, over 10 years ago?
  • mmackh
    We are in a renaissance of computing right at this moment. If we expand our definition of computers beyond screens and traditional input devices, microcontrollers are capable of so much more, with so much less (energy consumption | RAM | storage). The tipping point for MCUs was WiFi - which not only allows you to speak multiple protocols (UDP/Zigbee/HTTP/etc) and have audio IO, but also P2P communication and novel new form factors. There's been incredible progress with the miniaturisation of sensors and how we're able to understand and perceive our environment. So yes, whilst traditional hardware is getting more expensive and locked down, there's a strong counter-movement towards computing for everyone - and by that I also mean that there's going to be less abstraction in the entire stack. Good times ahead!
  • tangotaylor
    I'm going to fight pessimism with cynicism here: the Department of Defense is not going to let everything move to the cloud, because they need compute at the edge for AI-enabled weapons and R&D. For example, Anduril's products, Eric Schmidt's secretive Bumblebee project, or startups like Scout AI. Communications and GPS are just too easy to jam, and their answer is giving weapons more last-mile autonomy to operate in radio silence. War aside, I also bet there's going to be huge demand for edge compute for other kinds of robotics: self-driving cars, delivery robots, factory robots, or general-purpose humanoids (Tesla Optimus, Boston Dynamics Atlas, 1X NEO, etc). Moving that kind of compute to the cloud is too laggy and unreliable. I know researchers who've tried it; the results were mixed. Also, the engineers working on these platforms aren't going to reinvent the wheel every time they need to connect hardware together, and they're going to use interoperable standards, like PCIe for storage or GPUs, DIMM slots for memory, ATX for power, etc. So I don't see general-purpose computing dying.
  • solomonb
    I'm not saying we are in one, but isn't a RAM shortage like this exactly what one would expect in the early stages of a takeoff scenario?
  • duskdozer
    uBlock Origin has prevented the following page from loading: https://xn--gckvb8fzb.com/hold-on-to-your-hardware/ This happened because of the following filter: ||xn--$document The filter has been found in: IDN Homograph Attack Protection - Complete Blockage
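That filter is blunt: `||xn--` matches any hostname in which a domain label begins with `xn--`, i.e. every Punycode domain, legitimate or not, and `$document` applies the block to the page load itself. A rough approximation of the matching logic (my paraphrase, not uBlock Origin's actual implementation):

```python
from urllib.parse import urlparse

# "||xn--" anchors at a domain-label boundary, so it fires when any label
# of the hostname starts with "xn--" (the IDN/Punycode prefix).
def matches_xn_filter(url: str) -> bool:
    host = urlparse(url).hostname or ""
    return any(label.startswith("xn--") for label in host.split("."))

print(matches_xn_filter("https://xn--gckvb8fzb.com/hold-on-to-your-hardware/"))  # True
print(matches_xn_filter("https://news.ycombinator.com/item"))                    # False
```

Hence the list name "Complete Blockage": it trades all internationalized domains for protection against homograph phishing.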
  • commandlinefan
    When I started programming in the early 80's, personal computing had just recently become a thing. Before that, if you wanted to learn to program, you first needed access to a very rare piece of hardware that only a select few were granted access to. But when personal computing became a reality, programming exploded - anybody could learn it with a modest investment.I suspect we're trending back to the pre-personal computing era where access to 'raw' computing power will be hard to come by. It will become harder and harder to learn to program just because it'll be harder and harder to get your hands on the necessary equipment.
  • adamwong246
    I have often imagined writing a book, roughly "Fahrenheit 451 but with computers instead of books". Imagine a world where you do not buy an iPhone - one is assigned to you at birth; a world where "installing software" on "a computer you own" is not just antiquated or taboo, but unthinkable.
  • redbell
    On a matter totally unrelated to the subject: I found the linked website's name very strange! Visiting the website, I can see in the address bar that the name is in Chinese or Japanese!! This is the first time I've witnessed this kind of thing.
  • politelemon
    > If you need a new device, buy it;I would specifically add: whatever you have, or whatever you choose to buy, it would greatly benefit you to ensure a degree of Linux compatibility, so that its lifespan can be extended further than the greed enthusiasts at MS, Apple, and Google would like. They will be facing the same declines in purchasing habits and are further incentivised to assert their ownership over what you might mistakenly consider your devices.
  • G_o_D
    Memory got cheaper, storage increased in capacity. In my country, for an offline store purchase of a USB HDD, only the 4 TB Seagate variant is available; that's 15,000 in our currency, almost 1.5 months' salary in the private sector. Any higher capacity has to be imported, and with forex applied, prices go up to four zeros. When I read people on YouTube or blogs saying they rotate 15 TB and higher on their NAS RAIDs, that seems like a dream for us, never to be fulfilled.
  • pcblues
    I think what many people don't realise is that there will be a glut of cheap computer parts including CPUs, GPU cards, and memory when the AI and AI-adjacent businesses go bust and a bunch of data centres get pulled down.
  • shusaku
    > These days, the biggest customers are not gamers, creators, PC builders or even crypto miners anymore. Today, it’s hyperscalers. … > These buyers don’t care if RAM costs 20% more and neither do they wait for Black Friday deals. Instead, they sign contracts measured in exabytes and billions of dollars.Does all this not apply to businesses buying computers for their employees?
  • pjmlp
    I have been holding on to my hardware for decades; some of my private hardware traces back to 2009. Phones and tablets only get replaced when they die. Why should I throw away stuff that still works as intended?
  • Kiboneu
    The other side of this is that we can still make software more efficient, and make better use of old hardware than we had ever thought possible. I'm doing more with a decade-old GPU, which was manufactured before "Attention Is All You Need", than I could 5 years ago, thanks to quantization techniques. I'm holding on to my 32-bit machines. Most Linux distributions dropped support for them (for good reason). But at the end of the day these machines are a fabric of up to ~4 billion bytes that can be used in a myriad of ways, and we only covered a fraction of the state space before we moved on.
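The quantization point is concrete: squeezing weights from 32-bit floats to 8-bit integers is much of why old GPUs became useful again. A toy sketch of symmetric int8 quantization (pure Python, not any particular library's scheme):

```python
# Symmetric int8 quantization: map floats in [-max, max] onto [-127, 127].
def quantize(xs):
    scale = max(abs(x) for x in xs) / 127 or 1.0  # falls back to 1.0 for all-zero input
    return [round(x / scale) for x in xs], scale

def dequantize(qs, scale):
    return [q * scale for q in qs]

weights = [0.02, -1.3, 0.7, 0.0]
qs, scale = quantize(weights)
approx = dequantize(qs, scale)

# 4x less memory per value, at the cost of a rounding error bounded by the scale.
print(max(abs(a - b) for a, b in zip(weights, approx)) < scale)  # True
```

Real schemes quantize per-channel and fold the scales into the matmul, but the memory arithmetic is the same: a model that needed 24 GB in fp32 fits in ~6 GB at int8.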
  • vjerancrnjak
    Haha, all of a sudden I see an Amazon tab titled "waifu pillow", and think I have a split personality that runs searches between consciousness shifts, and then I come back to a funny message.
  • rolandhvar
    So what happens when the datacenters need to upgrade (new hardware, or stupid enterprisey reasons like "must be new when replacing broken stuff")? Surely there remains a secondary market for the enthusiasts?
  • altcognito
    Part of this is that memory companies recognize that nobody is going to enforce antitrust law for the foreseeable future, so collusion to raise prices is the norm now.
  • cirelli94
    Okay but what about the icon and tab name changing and the pop-up about disabling javascript?!
  • usrbinbash
    As the old saying goes: "This too will pass." Consumer hardware will always be a market worth serving for companies who don't see their stock price as their product. If the existing companies are unwilling to make a sale, I am sure new players will arise to pick up the slack. https://www.youtube.com/watch?v=SrX0jPAdSxU
  • lmz
    Micron is killing its Crucial consumer brand, not its supply to consumer brands who use its chips. I don't think Hynix ever had a consumer brand for RAM?
  • camgunz
    I feel like this is just the bubble talking. I'm pretty naive here, but at some point suppliers will adjust so they can take money from data center builders and consumers, just like pre-bubble.
  • taikahessu
    I'm holding on to my memory sticks with both hands, you know. Cha cha cha ...
  • sigbottle
    Semantic decentralization (not just AWS owning thousands of data centers and having their own distributed interoperability problems), standards, and regulations. These are super interesting problems. However, it seems like selection pressures, or just pure greed, attract people to the "easiest" solution: pure domination. You don't need to care about any of these (well, you still do eventually, but at least in the minds of said people) if you just have utter control over every part of the stack.
  • poolnoodle
    I've never seen a non-latin alphabet URL before, huh.
  • rla3rd
    What a bunch of BS. The price of a Commodore 64 from the 1980s is over $4K in today's dollars. $4K buys a pretty decent workstation these days.
  • kogasa240p
    This site finally got me to disable javascript through ublock. 10/10!
  • jagermo
    I knew the time for my cable box would come!
  • dade_
    https://www.photonics.com/Articles/Nortel-Completes-Acquisit...Oh bubbles... they're so bubbly. Remember when there was unlimited demand for fibre optics because - The Internet? So Nortel and other manufacturers lent money to their clients building the Internet, because the growth was unlimited, forever? Except they actually didn't have any money, just stock valuations? "This is a critical step in our effort to unleash the full potential of our high-performance optical component solutions business," said Clarence Chandran, COO of Nortel Networks. "This acquisition really strengthens Nortel Networks' leadership position in high-performance optical components and modules which are essential to delivering the all-optical Internet."
  • matheusmoreira
    Depressing...
  • grahammccain
    I feel like we will get out of the hardware constraints eventually.
  • defraudbah
    I refuse. I'll buy when I need to, and can hold off for a few months if prices become insane. This means I'll spend less on hardware than what I could: if I wanted to buy a max mpro or the latest Framework, I just won't, because prices are too mad, and I'll go for a cheaper version. Whatever happens, it's crazy, and I hope the AI madness is worth it.
  • aurareturn
    Capitalism at work. There is more value to be generated by moving resources to data centers for the moment. This isn't me being insensitive or anything; it's the same people who are buying iPhones and PCs who are demanding more compute for AI. There could be a swing in the future where people will demand local AI instead, and resources could shift back to affordable local AI devices. Lastly, this thesis implies that we will be supply constrained forever, such that prices for personal devices will always be elevated as a percentage of one's income. I don't believe that.
  • CrzyLngPwd
    The overlay and page title changing worked really well, ffs, I was like "Why is my machine displaying a page with zuckerberg nudes" haha.
  • flyinglizard
    It's a thought-provoking article, and I felt the pain when I shopped around for a new GPU lately to replace a 4090 I thought was faulty (eventually cleaning the PCIe connector solved the crashes). I bought it at the end of 2022, and three and a half years later it seems like we've gone backwards, not forwards, on GPUs available for end users. They cost more and do less. But also consider that PCs have been an anomaly for a very long time. I don't think there's an equivalent market where you, as a consumer, can buy off-the-shelf cutting-edge technical pieces in your local mall and piece them together into a working device. It's a fun model, for sure, but I'm not sure it's an efficient model. It was just profitable enough to keep the lights on, thanks primarily to a bunch of Taiwanese companies in that space, but it wasn't growing anywhere and the state of software is a mess. Apple ate the PC's collective lunch before DCs did. So have gaming consoles. So I weep for consumer choice, but as things become more advanced, maybe PCs and their entire value chain don't make a lot of sense any more. Obviously at the end there will still be consumer devices, because someone needs to consume all of this AI (at least until people are thrown entirely out of the loop, but then all those redundant meat sacks will need entertainment to keep them content). We have the consumer-device hyperscaler Apple doing rather OK even with these supply crunches, although I'm not sure for how long.
  • xyst
    Not a single comment mentioning how programmers these days don’t give a shit about optimization.
  • selectively
    This is just brainrot garbage. The idiotic stuff you see YouTubers saying. Why is this at the top of HN? Bots, I assume?
  • imtringued
    I just realized that this blog is pretending to be malware. I opened the tab and was constantly switching between the blog and writing this HN comment (I deleted the rest of the comment after realizing it), and was wondering where the tab went and kept opening it over and over again. Then I realized that it completely rewrote the tab title with NSFW content (one of the titles contained the word "nudes", with a faked Amazon favicon), and when you reopen the tab, it shows you a black overlay with a message intended to induce shock if you ever bother to read it (I didn't read past the first sentence, so I don't know what it was actually about). Can dang/a moderator please ban the domain from HN? Even if it's not exactly malware, it's pretending to be malware to grab your attention, and it's obviously intending to fill your browser history with inappropriate content, which didn't work on my browser because I opened the blog in a private browsing session. The operator clearly doesn't run his blog in good faith.
  • shevy-java
    AI companies driving RAM prices up is, in my opinion, theft from the common man (and common woman). Sure, you can say that in capitalism those who pay more benefit the most, but no system, not even the USA's, is purely capitalistic. You still have transfer payments, public infrastructure and what not. So private companies driving up prices, such as for RAM, is IMO also theft from common people. And that should not happen. It can only happen when you have lobbyists disguised as politicians who benefit personally from helping establish such a system. The same can be said about any other upward price scaling that is done via racketeering.
  • pissinwind
    Everything about tech and economy slowing is 1000% man made.The Trump/anti-America phase has gone on way longer than I thought but it won’t last forever.Even if we have to wait for this old world cabal to die and fade away, time is still on our side.Boomers are stupid for using time as a weapon.I’m chillin. Waiting for people to die while growing my businesses.Travel to a functional place off the beaten path to see nobody can really stop forward progress. Even in these places where time has stopped.
  • cynicalsecurity
    Fear mongering hysteria.
  • dist-epoch
    I'm not sure why people are upset. This is how Capitalism is supposed to work - resource allocation towards the most productive (in terms of Capital) usage. Those who are best able to use a resource are willing to pay the most for it, thus pricing out unproductive usages of it. This is pure Capitalism. If one is in general against Capitalism, yes, one can complain. But saying "I want free markets" and "I want capitalism", and then complaining when the free markets increase the price of your RAM, is utterly deranged. Some will say "but Altman is hoarding the RAM, he's not using it productively". It's irrelevant; he is willing to pay more than you to hoard that RAM. In his view he's extracting more value from it than you would, so he's willing to pay more. The markets will work: if this is an unproductive use of Capital, OpenAI will go bankrupt. And the RAM sellers make more money, which is good in Capitalism. It would be irresponsible for them to sell to price-sensitive customers (retail) when they have buyers (AI companies) willing to pay much more. And if this is a bad decision, because that AI market will vanish and they will have burned the retail market, Capitalism and Free Markets will work again and bankrupt them. Survival of the fittest. That is Capitalism. And right now AI companies are the fittest by a large margin. AI and Capitalism are the exact same thing, as famously put. We are in the first stages of turning Earth into Computronium; you either become Compute or you fade away.
  • Yanko_11
    [dead]
  • not_a9
    [dead]
  • yubainu
    [dead]
  • inquirerGeneral
    [dead]
  • sheefers
    [dead]
  • keybored
    Owning hardware is great. But I get the impression that some people view owning petty hardware as some liberty panacea. You might have a DVD collection, ten external drives, three laptops, and a workstation. You may still, for all intents and purposes, be wholly dependent on cloud computing, say, because it is the only practical way to run whatever AI-driven software exists three years from now. Edit: That's an example; it goes beyond AI. And liberty goes beyond that.
  • Bender
    It is a good article, but I am holding onto my hardware for other reasons. I predict it will not be long until all hardware has a set of Nanny chips that are named and marketed so that even people here on HN will argue on behalf of having them. It will be some "Secure enclave AI accelerated Super Mega Native Processing Underminer" and will start off securing and accelerating something or a set of somethings, but will eventually tie into age verification, censorship and a Central Nanny Agency that all countries will obey. - "Stare into this hole to verify your age." - "Stick your finger in the box." - "Ignore the pain to get your AI token bucks and unlock access to the shiny new attestation-accelerated internet." - "Sync ALL of your usernames and passwords into this secure enclave." Every packet and data stream will be analyzed locally by the AI to determine intentions and predict future behavior. The AI-summarized behavior will be condensed into an optimized encoded table to be submitted hourly to the Central Nanny Overseer. I might be slightly exaggerating and a bit hyperbolic, but it will be something in this spirit, and people will sleepwalk right into it. My only question is which country will control the behavior of these chips.