
Comments (111)

  • WJW
This essay, like so many others, mistakes the task of "building" software for the task of "writing" software. Anyone in the world can already get cheap, mass-produced software to do almost anything they want their computer to do. Compilers spit out new builds of any program on demand within seconds, and you can usually get both source code and pre-compiled copies over the internet. The "industrial process" (as TFA puts it) of production and distribution is already handled perfectly well by CI/CD systems and CDNs.

    What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production, and LLMs generally don't help with that.

    It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier; after learning the language you are still no Tolstoy.
  • physicsguy
I've just done my first almost fully vibe-coded hobby project from start to near completion, a village history website with a taxonomy, and it's taken so much poking and prodding and cajoling to get the software to do exactly what I want it to do. Having built plenty of production stuff, I know what I want it to look like and the data model was really clear, yet even trying every trick in the book to constrain them, I just found the LLMs went off and did totally random things, particularly as the project got further from the start.

    Maybe there'll be an enormous leap again, but I just don't quite see the jump to how this gets you to 'industrial' software. It made it a lot faster, don't get me wrong, but you still needed the captain driving the ship.
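    A minimal sketch of the kind of clear data model meant here, assuming a tree-shaped taxonomy with history entries attached to its nodes (all names are illustrative, not from the project):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class TaxonomyNode:
        name: str                                      # e.g. "Buildings"
        children: list["TaxonomyNode"] = field(default_factory=list)

    @dataclass
    class HistoryEntry:
        title: str
        body: str
        category: TaxonomyNode                         # each entry sits in one node

    # Build a tiny tree and attach an entry to one of its nodes.
    root = TaxonomyNode("Village")
    buildings = TaxonomyNode("Buildings")
    root.children.append(buildings)
    mill = HistoryEntry("The Old Mill", "Built in the 1840s...", buildings)
    ```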
  • huevosabio
I've been thinking about this for a while, and largely agree that industrialization of software development is what we are seeing. But the emphasis on low quality is misplaced. Take this for example:

    > Industrial systems reliably create economic pressure toward excess, low quality goods.

    Industrial systems allow for low quality goods, but they also deliver quality way beyond what can be achieved in artisanal production. A mass-produced mid-tier car is going to be much better than your artisanal car. Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.
  • bolangi
This thought-provoking essay does not consider one crucial aspect of software: the cost to a user of developing facility with a given software product. Historically, monopolistic software producers have been able to force these costs to be borne because the user has no alternative to upgrading to the latest version of, for example, Windows, or gmail, or the latest version of the github GUI. A significant portion of the open source / free software movement is software providing stable interfaces (including for the user), so that resources otherwise spent on compulsory retraining to use the latest version of something proprietary can instead be invested in configuring existing resources to better suit the user's problem domain. For example, programs like mutt or vim, or my latest discovery, talon.
  • motbus3
I'm not through it yet, but I don't know. As a developer for almost 30 years now, if I think about where most of my code went, I would say, quantitatively: to the bin.

    I processed a lot of data, dumps, and logs over the years. I collected statistical information, mapped flows, created models of the things I needed to understand. And this was long before any "big data" thing.

    Nothing changed with AI. I keep doing the same things, but maybe the output has colours.
  • philipallstar
These damn articles. Software moved into an industrial revolution when you could write in a high-level language instead of assembly. This has already happened.
  • yosefk
You could say the same things about assemblers, compilers, garbage collection, higher-level languages, etc. In practice the effect has always been an increase in the height of the mountain of software that can be made before development grinds to a halt due to complexity. LLMs are no different.
  • zkmon
A question the article does not address, and which contrasts software with industrialized products of the past, is: who are the consumers of software produced at industrial scale? Machine stitching accelerated garment production only because there was demand and consumption tied to population. But software is not tied to population the way food and clothes are. It doesn't depreciate, and it is not exclusively consumed by persons.

    Another common misconception is that it is now easier to compete with big products, since the cost of building them will go down. Maybe you think you can build your own Office suite and compete with MS Office, or build a SAP with better features and quality. But what went into these products is not just code; it is decades of feedback, tuning, and fixing. The industrialization of software cannot provide that.
  • choeger
Thing is: industrialization is about repeating manufacturing steps. You don't need to repeat anything for software; software can be copied arbitrarily at no practical cost.

    The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? They can only be consumed at the pace of the player. Agents? They can be reused once they fulfil a task sufficiently.

    We're probably going to see a huge amount of customization, where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy re-creating the same algorithms over and over again?
  • ciconia
    Hmm, I'm not sure I see the value in "disposable software". In any commercial service people are looking for software solutions that are durable, dependable, extensible, maintainable. This is the exact opposite of disposable software.The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.
  • xorcist
Too many articles are written comparing LLMs to high-level languages. Sure, if you squint enough, both have to do with computers. But that comparison misses everything that is important about LLMs.

    High-level languages are about higher abstractions over deterministic processes. LLMs are not necessarily higher abstractions; they are about non-deterministic processes, a fundamentally different thing altogether.
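    A toy sketch of that distinction (the function names are invented; no real compiler or LLM API is involved):

    ```python
    import random

    # Deterministic: a compiler-like transformation maps the same input
    # to the same output, every single run.
    def lower(expr: str) -> str:
        return expr.replace("PLUS", "+")

    # Non-deterministic: an LLM-like sampler may return a different
    # completion for the same prompt on each call.
    def sample(prompt: str) -> str:
        return prompt + random.choice([" world", " there", " friend"])

    assert lower("1 PLUS 2") == lower("1 PLUS 2")  # always holds
    print(sample("hello"), "|", sample("hello"))   # may differ between calls
    ```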
  • sriku
I find it hard to think of code as being the output of programming. I keep re-reading Naur's "Programming as Theory Building" paper and it still feels relevant and closer to how the activity feels to me, AI or no AI.

    https://pages.cs.wisc.edu/~remzi/Naur.pdf
  • djantje
It is just going to mean even more unimportant software. There is a difference between writing mainstream software and writing someone's idea/hope for the future. Software that is valued highly enough will be owned and maintained. Like most things in our world, I think ownership/stewardship is, like money and world hunger, a social issue/question.
  • MORPHOICES
What I've been pondering is what makes the user interface of some software "industrial" versus merely "complicated." The difference I return to again and again isn't tech depth, it's constraints.

    Rough framework I'm using lately: consumer software aims at maximizing joy; enterprise software is all about coordination; industrial software operates in the "mess" of the real world. Industrial stuff appears to be more concerned with failure modes, long-term maintenance, and predictable behavior over cleverness. And as soon as software is involved with physical processes, the tolerance for ambiguity narrows quickly.

    Curious how others see it: What's your mental line between enterprise and industrial? What constraints have shaped your designs? Any instances where "nice abstractions" failed the test of reality?
  • zx8080
> This website uses anonymous cookies to enhance the user experience.

    This sounds weird, or wrong. Do anonymous stats need cookies at all?
  • japhyr
> Previous industrial revolutions externalised their costs onto environments that seemed infinite until they weren't. Software ecosystems are no different: dependency chains, maintenance burdens, security surfaces that compound as output scales. Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it. In an era of mass automation, we may find that the hardest problem is not production, but stewardship. Who maintains the software that no one owns?

    This whole article was interesting, but I really like the conclusion. I think the comparison to the externalized costs of industrialization, which we are finally facing without any easy out, is a good one to make. We've been on the same path for a long time in the software world, as evidenced by the persistent relevance of that one XKCD comic.

    There's always going to be work to do in our field. How appealing that work is, and how we're treated as we do that work, is a wide-open question.
  • Abh1Works
But I think the important part of this is the reach that the Industrial Revolution had: the end users who were able to "benefit" from it, each with individual needs for all of these mass-produced goods. The important thing is that goods =/= software. I, as an end user of software, rarely need specialized software. I don't need an entire app generated on the spot to split the bill and remember the difference if I have a calculator.

    So yes, we are industrializing software, but this reach that people talk about will (I believe) be severely limited.
  • Deukhoofd
The industrial revolution was constrained by access to the means of production, leaving only those with capital able to actually produce, which led to new economic situations.

    What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc., constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art if it requires zero skill to generate it?
  • pyrale
The article kind of misses that cost has two axes: development cost and maintenance cost.

    Low cost / low value software tagged as disposable usually means development cost was low but maintenance cost is high; that's why you get rid of it. On the other hand, the difference between good and bad traditional software is that, while development cost is always going to be high, you want maintenance cost to be low. This is what industrialization is about.
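    A toy way to make the two axes concrete (the numbers are invented purely for illustration):

    ```python
    # Lifetime cost = one-off development cost + maintenance accrued per year.
    def lifetime_cost(dev: float, maint_per_year: float, years: int) -> float:
        return dev + maint_per_year * years

    disposable = lifetime_cost(dev=1, maint_per_year=10, years=5)   # 51
    durable    = lifetime_cost(dev=20, maint_per_year=1, years=5)   # 25
    print(disposable, durable)  # the cheap-to-build option costs more to keep
    ```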
  • memoriuaysj
Steve Yegge called it "factory farmed code".
  • empiko
    Not convinced. There is an obvious value in having more food or more products for almost anybody on Earth. I am not sure this is the case for software. Most people's needs are completely fulfilled with the amount and quality of software they already have.
  • dustinboss
    "Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it." Such a great line.
  • vincnetas
I would say comparing the making of software to a working factory is an analogy mistake. Completed software is the analogue of a running factory; making software is like building the factory: the specialised tooling, layouts, supply chain, etc. When you have all of this, your factory runs at industrial scale and produces things, just as your software produces value once it's completed and used by the end user.
  • torginus
So many fallacies here: imprecise, reaching arguments, attempts at creating moral panic, and the insistence that most people create poor quality garbage code, in stark contrast to the poster; the difference between his bespoke excellence and the dreck produced by the soulless masses is gracefully omitted.

    First, the core argument that 'industrialization' produces low quality slop is not true. Industrialization is about precisely controlled and repeatable processes. A table cut by a CNC router is likely dimensionally more accurate than one cut by hand; in fact, many industrial processes and machines have trickled back into the toolboxes of master craftsmen, where they increased productivity and quality.

    Second, from my experience of working at large enterprises and on smaller teams, the 80-20 rule definitely holds: there's always a core team of a handful of people who lay down the foundations and design and architect most of the code, with the rest usually fixing bugs or making bullet-point features.

    I'm not saying the people who fall into the 80% don't contribute, or are somehow lesser devs, but they're mostly not well positioned in the org to make major contributions. Another invariable aspect is that as features are added and complexity grows, along with legacy code, the effort needed to make a change, or to understand and fix a bug, grows superlinearly, meaning the 'last 10%' often takes as much or more effort than everything that came before.

    This is hardly an original observation, and in today's environment of ever-ongoing iteration, what counts as the last 10% is hard to define, but most modern software development is highly incremental, and often focused on building unneeded features or sidegrade redesigns.
  • npodbielski
If that is true, we will live in a funny world where you lose all your money because you were running some outdated, riddled-with-holes software written by an LLM on some old router or cheap camera. Or some software will stop working after an update because some fix was written by an LLM and nobody checked or tested it. Or there will be 3 outages of big internet services in 2 months.

    Oh wait. It is already a thing.
  • spiderfarmer
> industrialisation of agriculture led to ultraprocessed junk food

    The mass production of unprocessed food is not what led to the production of hyper-processed food. That would be a strange market dynamic. Shareholder pressure, aggressive marketing, and engineering for super-palatable foods are what led to hyper-processed foods.
  • ofalkaed
Personally I think AI is going to turn software into a cottage industry; it will make custom software something the individual can afford. AI is a very long way off from letting the average person create the software they want, unless they are willing to put a great deal of time into it, but it is almost good enough that a programmer can take the average person's idea and execute it at an affordable price. We're probably only a year or two from when a capable programmer will be able to offer any small business a completely customized POS setup for the cost of a canned industrial offering today: I will design your website and build you a POS system tailored to your needs and completely integrated with the website, and for a little more I can throw in the accounting and tax software. A bright dishwasher realizing they can make things work better for their employer might be the next billionaire revolutionizing commerce and the small business.

    I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out in all its details, even with the help of AI; most programmers could probably have sorted it out in a night, and with AI could write the software in a few nights. I would have my software for an affordable price, and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies, so they have an incentive to work with people like me and help me realize my ideas.

    Programming is going to become a service instead of an industry; the craft of programming will be for sale instead of software.
  • bgwalter
Another AI entrepreneur who writes a long article about inevitability and lists some downsides in order to remain credible, but all in all just uses neurolinguistic programming on the reader, so that the reader, too, will think the "AI" revolution is inevitable.
  • eitally
I spent 15 years writing literal industrial software (manufacturing, test, and quality systems for a global high-tech manufacturing company, parts of which operated in regulated industries).

    One of the things that happened around 2010, when we decided to effect a massive corporate change away from both legacy and proprietary platforms (on the one hand, away from AIX & Progress, and on the other, away from .Net/SQL Server), was a set of necessary decisions about the fundamental architecture of systems, and which third-party libraries, if any, we would use to accelerate software development going forward.

    On the back-end side (mission-critical OLTP & data-input screens moving from Progress 4GL to Java+PostgreSQL) it was fairly straightforward: pick lean options and as few external tools as possible, in order to ensure the dev team all completely understood the codebase, even if it sometimes made developing new features more time consuming.

    On the front end, though, where the system config was done, as well as all the reporting and business analytics, it was less straightforward. There were multiple camps in the team, with some devs wanting to lean on 3rd-party stuff as much as possible, others wanting to go all-in on TDD and use 3rd-party frameworks and libraries only for UI items (stuff like Telerik, jQuery, etc), and a few having strong opinions about one thing but not others.

    What I found was that in an organization with primarily junior engineers, many of them offshore, the best approach was not to focus on ideally "crafted" code. I literally ran a test with a senior architect once where he and I documented the business requirements completely, he translated the reqs into functional tests, and we handed the tests over to the offshore team to write code to pass. They mostly didn't even know what the code was for or what the overall system did, but they were competent enough to write code to pass the tests. This ensured the senior architect received something that helped him string everything together, but it also meant we ended up with a really convoluted codebase that was challenging to interpret holistically if you hadn't been on the team from the beginning.

    I had another architect, a lead in one of the offshore teams, who felt very strongly that code should be as simple as possible: descriptive naming, single-function classes, etc. I let him run with his paradigm on a different project, to see what would happen. In his case, he didn't focus on TDD but just on clearly written requirements docs. But his developers had a mix of talents and experience, and the checked-in code was all over the place. Because of how atomically abstract everything was, almost nobody understood how the pieces of the system interrelated.

    Both of these experiments led to a set of conclusions and an approach as we moved forward: clearly written business requirements, followed by technical specifications, are critical, and so is a set of coding standards the whole group understands and has confidence to follow. We set up an XP system to coach less experienced junior devs, ran regular show & tell sessions where individuals could talk about their work, and moved from a waterfall planning process to an iterative model.

    All of this sounds like common sense now that it's been standard in the tech industry for an entire generation, but it was not obvious or accepted in IT "Enterprise Apps" departments in low-margin industries until far more recently.

    I left that role in 2015 to join a hyperscaler, and only recently (this year) have moved back to a product company. What I've noticed now is that the collaborative nature of software engineering has never been better... but we're back to a point where many engineers don't fully understand what they're doing, either because there's a heavy reliance on code they didn't write (common 3P libraries) or because of the compartmentalization of product orgs, where small teams don't always know what other teams are doing, or why. The more recent adoption of LLM-accelerated development means even fewer individuals can explain the resultant codebases. While software development may be faster than ever, I fear that as an industry we're moving back toward the era of the early naughts, when the graybeard artisans had mostly retired and their replacements were fumbling around trying to figure out how to do things faster & cheaper and decidedly un-artisanally.
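    A minimal sketch of that tests-first handoff, with an invented requirement (split_lot and its rule are illustrative, not from the actual system):

    ```python
    # Architect's side: a functional test derived straight from the
    # business requirements doc.
    def test_split_lot_preserves_total_and_respects_max():
        batches = split_lot(lot_size=10, max_batch=4)
        assert sum(batches) == 10            # nothing lost or duplicated
        assert all(b <= 4 for b in batches)  # no batch exceeds the limit

    # Implementer's side: write whatever passes the test, without
    # necessarily knowing the wider system it belongs to.
    def split_lot(lot_size: int, max_batch: int) -> list[int]:
        batches = [max_batch] * (lot_size // max_batch)
        if lot_size % max_batch:
            batches.append(lot_size % max_batch)
        return batches

    test_split_lot_preserves_total_and_respects_max()
    ```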
  • constantcrying
I think the idea is interesting, but immensely flawed. The following is just disingenuous:

    > industrialisation of printing processes led to paperback genre fiction

    > industrialisation of agriculture led to ultraprocessed junk food

    > industrialisation of digital image sensors led to user-generated video

    Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture ended hunger in all parts of the world able to practice it, and even allows for the export of food into countries which can't (without it, most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.

    The framing here is that industrialization degrades quality and turns products into disposable waste. While there is some truth to that, it is pretty undeniable that massive benefits came with it. Mass-produced products are often of superior quality and longevity, and often are the only way certain products can be made available to large parts of the population.

    > This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.

    This just is not true and goes against all available evidence, as well as basic economics.

    > For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.

    This is just pure fiction. The author is comparing the highest quality goods at one point in time, which people took immense care of, with the lowest quality stuff people buy today, which is not even close to the mean clothing people buy. The truth is that fabrics have become far better, far more durable, and more versatile. The products have become better; what has changed is people's attitude towards their clothing.

    Lastly, the author ignores the basic economics that separate software from physical goods. Physical goods need to be produced, and production is almost always the most expensive part. This is not the case for software: distributing software millions of times is not expensive, and is only a minuscule part of the total costs. For fabrics, industrialization meant that development costs increased immensely but per-unit production costs fell sharply. What we are seeing with software is a slashing of development costs.
  • pharrington
    This is written by the same guy who proudly blogged about not knowing how computers work. [https://chrisloy.dev/post/2013/04/27/no-i-can't-fix-your-com...]