
Comments (589)

  • stiiv
> If this tech is as amazing as you say it is, I'll be able to pick it up and become productive on a timescale of my choosing not yours.

    Broadly speaking, I think this is a wise assessment. There are opportunities for productivity gains right now, but I don't think it's a knockout for anyone using the tech, and I think that onboarding might be challenging for some people in the tech's current state.

    It is safe to assume that the tech will continue to improve in both ways: productivity gains will increase, onboarding will get easier. I think it will also become easier to choose a particular suite of products to use, too. Waiting is not a bad idea.
  • heytakeiteasy
    Feels like a false equivalency. It's just my experience, but I've completely ignored crypto and the metaverse, and I don't get the sense I'm missing out on much. In contrast, LLMs in their current state have (for me) dramatically reduced the distance between an idea and a working implementation, which has been legitimately transformative in my software dev life. Transformative for the better? Time will tell I suppose, but I'm really enjoying it so far.
  • wolframhempel
There's value in being early - in the right thing.

    - If you'd invested in Bitcoin in 2016, you'd have made a 200x return
    - If you'd specialized in neural networks before the transformer paper, you'd be one of the most sought-after specialists right now
    - If you'd started making mobile games when the iPhone was released, you could have built the first Candy Crush

    Of course, you could just as well have

    - become an ActionScript specialist as it was clearly the future of interactive web design
    - specialized in Blackberry app development as one of the first mobile computing platforms
    - made major investments in NFTs (any time, really...)

    Bottom line - if you want to have a chance at outsized returns, but are also willing to accept the risks of dead ends, be early. If you want a smooth, mid-level return, wait it out...
  • bogzz
It's a horrifying feeling facing the possibility that the career I spent so much time and money to get into is fading away. Sure, LLMs are not there yet, and they might not ever quite get there. But will companies start hiring again? If productivity has gone up, and it seems like it has, then no.

    So, a decade of hanging by a thread, getting by and doubling down on CS, hoping that the job market sees an uptick? Or trying to switch careers?

    I went to get a flat tire fixed yesterday and the whole time I was envious of the cheerful guy working on my car. A flat tire is a flat tire, no matter whether a recession is going on or whether LLMs are causing chaos in white collar work. If I had no debt and a little bit saved up I might just content myself with a humble moat like that.
  • keiferski
I actually think the opposite approach might be the most optimal one, at least from a monetary perspective. That is, be on the cutting-edge of something, but be willing to bail out at the moment its future starts seeming questionable. Or even more specifically, maximize your foothold in it while minimizing your downside.

    Bitcoin is a good example: if you bought it 15 years ago and held it, you're probably quite wealthy by now. Even if you sold it 5 years ago, you would have made a ton of money. But if you quit your job and started a cryptocurrency company circa 2020, because you thought crypto would eat the entire economic system, you probably wasted a lot of time and opportunities. Too much invested, too much risked.

    AI is another one. If you were using AI to create content in the months/years before it really blew up, you had a competitive advantage, and it might have really grown your business/website/etc. But if you're now starting an AI company that helps people generate content about something, you're a bit late. The cat is out of the bag, and people know what AI-speak is. The early-adopter advantage isn't there anymore.
  • pjmlp
Agree with the message. Coding since 1986, I have learned not to suffer from FOMO and to wait for the dust to settle.

    Ironically, one might even get projects to fix the mess left behind, as the magpies focus their attention on something else.

    In the case of AI, the fallacy is thinking that even those riding the wave will all be allowed to stay around, now that the team can deliver more with fewer people. Maybe rushing to the AI frontline won't bring the returns one is hoping for.

    EDIT: To make the point even clearer: with SaaS and iPaaS products, serverless, and managed clouds, many projects now require a rather small team, versus having to develop everything from scratch on-prem. AI-based development reduces the team size even further.
  • another-dave
On the other hand, when cloud computing started to come in, I knew a bunch of sysadmins. Some were in the "it'll never take off" camp, and no doubt they've come around by now, kicking and screaming.

    But the curious early adopters were the ones best positioned to be leading the charge on "cloud migration" when the business finally pulled the trigger.

    Similarly with mobile dev. As a Java dev at the time Android came along, I didn't keep abreast of it - I could always get into it later. Suddenly the job ads were "Android Dev. Must have 3 years experience".

    Sometimes, even just from self-interest, it's easier to get in on the ground floor, when the surface area of things to learn is smaller, than to wait too long before checking something out.
  • eqmvii
A lot of people feel this way.

    But IMO the most fruitful thing for an engineering org to do RIGHT NOW is learn the tools well enough to see where they can be best applied.

    Claude Code and its ilk can turn "maybe one day" internal projects into live features after a single hour of work. You really, honestly, and truly are missing out if you're not looking for valuable things like that!
  • linsomniac
> I didn't use Git when it first came out.

    This really hinges on what you mean by "didn't use git".

    If you were using bzr or svn, that's one thing.

    If you were saving multiple copies of files ("foo.old.didntwork" and the like), then I'd submit that you're making the point for the AI supporters. I consulted with a couple of developers at the local university as recently as a couple of years ago who were still doing the copy-files method and were struggling, when git was right there ready to help.
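    To make the contrast concrete, here is a minimal sketch of the workflow the comment describes git replacing (assuming git is installed; the file name and commands here are illustrative, not from the comment):

```shell
set -e
cd "$(mktemp -d)"
printf 'print("hello")\n' > app.py

# The copy-files method: `cp app.py app.py.old.didntwork`, repeated forever.
# With git, history is tracked instead of copied:
git init -q
git add app.py
git -c user.name=demo -c user.email=demo@example.com commit -q -m "working version"

# Experiment freely, then compare against the last known-good state:
printf 'print("experiment")\n' > app.py
git diff --stat

# Didn't work out? Restore the committed version instead of hunting for a .old copy:
git checkout -- app.py
```

    The point is the same one the comment makes: the tool was already there, and the cost of picking it up is far lower than the cost of struggling without it.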
  • RobinL
For me, it's beyond doubt these tools are an essential skill in any SWE's toolkit. By which I mean: knowing their capabilities, how they're valuable, and when to use them (and when not to).

    As with any other skill, if you can't do something, it can be frustrating to peers. I don't want colleagues wasting time doing things that are automatable.

    I'm not suggesting anyone should be cranking out 10k LOC in a week with these tools, but if you haven't yet done things like send one into an agentic loop to produce a minimal reprex of a bug, or pin down a performance regression by testing code on different branches, then you could potentially be hampering the productivity of the team. These are examples of things where I now have a higher expectation of precision, because it's so much easier to do more thorough analysis automatically.

    There are always caveats, but I think the point stands that people generally like working with other people who are working as productively as possible.
  • ctoth
The Programmer's Prayer

    Nothing is happening. And if it is, it's just hype.
    And if it isn't, it only works on toy problems. And if it doesn't, I'll learn it when it stabilizes.
    And if I can't, the gains all go to owners anyway. And if they don't, it's just managers chasing metrics.
    And if it isn't, well I'm a real programmer. And if I'm not, then neither are you.
  • bonoboTP
That's a reasonable strategy. I don't think spreading FOMO is good. But pragmatically, I enjoy working with the latest crop of AI models on all sorts of computer tasks: coding, but also plenty of sysadmin work and knowledge organization.

    I didn't pick them up until last November and I don't think I missed out on much. Earlier models needed tricks and scaffolding that are no longer needed; all those prompting techniques are pretty obsolete. In these 3-4 months I got up to speed very well. I don't think 2 years of additional experience with dumber AI would have given me much.

    For now, I see value in figuring out how to work with the current AI. But next year even this experience may be useless. It's like, by the time you figure out the workarounds, the new model doesn't need those workarounds. Just as in image generation maybe a year ago you needed five LoRAs and ControlNet and negative prompts etc. to not have weird hands, today you just no longer get weird hands with the best models.

    Long term, the only skill we will need is to communicate our wants and requirements succinctly and to provide enough informational context. But over time we have to ask why this role will remain robust. Where do these requirements come from? Do they simply form in our heads, or are they deduced from other information, such that the AI can also deduce them from there?
  • babarock
I don't understand the rush to be "the first". Facebook isn't the first social network, Google isn't the first search engine, the iPhone is not the first smartphone, Windows is not the first OS; the list goes on.

    Clearly there's an advantage to being an early adopter, but the advantage is often overblown, and the cost to get it is often underestimated.
  • JumpCrisscross
> There are 16,000 new lives being born every hour. They're all starting with a fairly blank slate. Are you genuinely saying that they'll all be left behind because they didn't learn your technology in utero?

    This is a great framing.
  • MarkusWandel
    In general, a good strategy is just staying a little bit behind. Let the new fads play themselves out. Some have staying power. Bitcoin never did turn into a usable currency, just another speculator's toy. Luckily I am - so far - in a position where I can watch the AI thing from the sidelines to see how it plays out.
  • hsaliak
My experience so far tells me that the default path with AI tooling is that it lets us create without learning. So the author is right that they can pay for a seat in this revolution whenever they want.

    A practitioner with more experience may be a few percentage points more productive, but the median - grab subscription, get tool, prompt - will be mostly good enough.
  • swframe2
I've noticed 2 very obvious trends. The high performers who previously wrote the most PRs are now using AI the most aggressively, and the number of PRs they produce has almost doubled. The other trend is that large teams are being split into smaller teams that work on new products. I hope the trend is more projects and less grunt coding. (I know the AI coders are not yet L6, but I suspect they will achieve L3-L5 this year.)
  • asim
    I'm healthily skeptical of new technology. Meaning I'm not the early adopter. But I've also found over the years I don't get left behind. I become curious at the time things are stabilising. Maybe on the cusp where there's still a lot of pushback but there's also clear value. Crypto in 2014-2017. AI in 2023-2024. You don't have to feel FOMO but if you're a technologist, if you have a healthy desire to evolve, change and learn then you'll naturally pick things up. I went from total crypto skepticism in 2014 to investing most of what I had. I went from total AI skepticism to doing RAG for the Quran and agentic tech for the small web. I think there's value in staying true to who you are but also naturally discovering and learning on your own timeline.
  • sd9
> If this tech is as amazing as you say it is, I'll be able to pick it up and become productive on a timescale of my choosing not yours.

    In contrast to the current top comment [1], I don't think this is a wise assessment. I'm already seeing companies in my network stall hiring, and in fact start firing. I think if you're not trying to take advantage of this technology today then there may not be a place for you tomorrow.

    I find it hard to empathise with people who can't get value out of AI. It feels like they must be in a completely different bubble to me. I trust their experience, but in my own experience, it has made things possible in a matter of hours that I would never have even bothered to try.

    Besides the individual contributor angle, where AI can make you code at Nx the rate of before (where N is say... between 0.5 and 10), I think the ownership class are really starting to see it differently from ICs. I initially thought: "wow, this tool makes me twice as productive, that's great". But that extra value doesn't accrue to individuals, it accrues to business owners. And the business owners I'm observing are thinking: "wow, this tool is a new paradigm making many people twice as productive. How far can we push this?"

    The business owners I know who have been successful historically are seeing a 2x improvement and are completely unsatisfied. It's shattered their perspective on what is possible, and they're rebuilding their understanding of business from first principles with the new information. I think this is what the people who emerge as winners tomorrow are doing today. The game has changed.

    Speaking as an IC who is both more productive than last year, but simultaneously more worried.

    [1] https://news.ycombinator.com/item?id=47454614
  • dgxyz
    I make my money cleaning up all the stupid fads. The tail end of the curve is profitable.
  • redm
I agree with the sentiment of this article.

    Sadly, I'm still disagreeing while crypto kiddies are driving past me in Lambos. If it's the future of money, yes, we'll get there eventually, but like every technology shift, there's a lot of money to be made in the transition, not after.

    ** I sold all my crypto a few years ago and I'm a happier person :D
  • ge96
If this is about vibe-coding:

    I remember when React was the hotness and I was still using jQuery. I didn't learn it immediately; maybe a couple of years later is when I finally started to use React. I believe this delayed my chances of getting a job, especially around that time when hiring was good, e.g. 2016 or so.

    With vibe-coding it just sucks the joy out of it. I can't feel happy if I can just say "make this" and it comes out. I enjoy the process... which, yeah, you can say it's "dumb/a waste of time" to bother with typing out code with your hands. For me it isn't just about "here's the running code"; I like architecting it, deciding how it goes together, which yeah, you can do with prompts.

    Idk, I'm fortunate that right now using tools like Cursor/Windsurf/Copilot is not mandatory. I think in the long run, though, I will get out of working in software professionally for a company.

    I do use AI though, every time I search something and read Google's AI summary. You'd argue it would be faster to just use a built-in thing that types for you vs. copy-paste.

    Which again... what is there to be proud of if you can just ask this magic box to produce something and claim it as your own? "I made this."

    Even design can be done with AI (mechanical/3D design), then you put it into a 3D printer. Where is the passion/personality?

    Anyway, yeah, my own thoughts. I'm a luddite or whatever.
  • muskstinks
Crypto was interesting to think through, and it was clear very early on how many flaws it has. It basically just moved the goalposts a level deeper, and it was quite an eye-opener how few people even understood the major flaw of crypto: you can only do crypto safely with things that are on the blockchain, and it has not solved any real issue off the blockchain (which means you can literally just send crypto to each other, and that's it).

    But AI is a beast.

    It's A LOT to learn. RAG, LLMs, architecture, tooling, ecosystem, frameworks, approaches, terms, etc., and this will not go away.

    It's clear today, and it was clear with GPT-3, that this is the next thing. And in comparison to other 'next things', it's arriving in the perfect environment: the internet allows for fast communication, and manufacturing has never been as fast, flexible, and globally scaled as it is today. Which means whatever the internet killed and changed will happen, and is happening, a lot faster with AI.

    And tbh, if someone gets fired in the AI future, it will always be the person who knows less about AI and less about how to leverage it than the other person.

    For me personally, I just enjoy the whole new frontier of approaches, technologies, and progress. But I would recommend EVERYONE to regularly spend time with this technology. Play around regularly. Otherwise you will not gain any gut knowledge of models vs. models, and it will be A LOT to learn when it crosses the line for whatever you do.
  • quantified
> I wrote my MSc on The Metaverse. Learning to build VR stuff was fun, but a complete waste of time. There was precisely zero utility in having gotten in early.

    Wonderful life lesson on hype cycles. I am curious if hype literacy will join media literacy in academia.
  • etwigg
    I don't think the "craftsman" self-identification is going to work for software engineers anymore. The tool capabilities are too dynamic, you have to be some sort of opportunistic pirate/entrepreneur. Sure you can jump in and get up to speed on some aspect of the toolchain later on, but the identity shift is the hard and slow part that I think it's wise to get started on ASAP.
  • picafrost
I am increasingly feeling okay with the idea of being left out. The worst parts of working professionally on a software development team have been amplified by LLMs: ridiculously large PRs, strong opinions doubled down on because they're LLM-"confirmed", bigger expectations coming from above, exceptionally unwarranted confidence in the change or approach the LLM has come up with.

    I am dying inside when I make a comment and receive a response that has clearly been prompted from my comment, and possibly filtered into the voice of the responder, if not copied and pasted directly. Particularly when it's wrong. And it often is wrong, because the human using them doesn't know how to ask the right questions.

    Fortunately, most of the fundamental technological infrastructure is well in place at this point (networking, operating systems, ...). Low-skilled engineers vibe coding features for some fundamentally pointless SaaS is OK with me.
  • mazone
Many companies want to compare themselves to Apple and at the same time say they are disruptors and innovators, but Apple is probably the best company at being okay with being left behind. Many think of them as experts in products, but to me they've always been best at copying what others are doing and refining it - maybe not necessarily better technically, but always seeing the market fit better than others. Like in poker: the later you have to make your decision, the more information you have.
  • nsfmc
> For every HTML 2.0 you might have tried, you were just as likely to have got stuck in the dead-end of Flash.

    i'll just say, and i understand this is not the point of the article at all, but for all its faults: if you got in on flash as early as html 2.0, and you were staring at an upcoming dead-end of flash in, say, 2009, you had also been exposed by that time to plenty of javascript, e4x, and what were essentially entirely clientside SPAs - giving you a sort of bizarro preview of react a couple of years ahead of time. honestly, not a bad offramp, even if flash itself didn't make it.
  • mathgladiator
I heard from a senior leader at Amazon that "Today, I am choosing how I fail". This has echoed in my head for many years.

    At any moment, you are failing at thousands of things that you may not even know about, and that is the gist of what I took away from it. The thing is that you have to be OK when you intentionally choose to not invest in something, as regret is ultimately a poison.

    The other thing is this: you are not obligated to bring people with you, and you have a choice of free association.
  • hintymad
> If this tech is as amazing as you say it is, I'll be able to pick it up and become productive on a timescale of my choosing not yours.

    I saw a meme on X the other day which roughly says that one never has to learn at all if one learns slowly enough in the age of AI. I guess the undertone is that AI evolves faster than one can learn the tricks of using it.
  • wolvesechoes
    The biggest issue is - you will be left behind, in the end. This is the race you cannot win. You can try as much as you like, spend free time trying to catch up, and you, most likely, will lose. If you play this game, you've already lost.I am actually surprised by people willingly trying to be more productive, like... machines. And then crying when machines are proven to be better at being machines than meatbags.
  • mmargenot
    Especially when people pushing it are trying to capture your attention, it’s good to be deliberate about the tech that you introduce.
  • nslsm
> There are 16,000 new lives being born every hour. They're all starting with a fairly blank slate.

    No, they are not.
  • fdghrtbrt
Guy who's ok with being left behind (crypto, AI) did an MSc on the metaverse. Sounds like he tried to go with the hype once and got burned.
  • tracker1
I remember working on a few early tools for VRML towards the end of the '90s... It was cool, but far from great. The mention of VR in the article reminded me of it.

    That said, my only regret with Bitcoin was deleting my early wallets when I realized the coins were only worth $.25... If I'd had any inkling what they'd be worth someday, I'd probably have just bought $1000 worth back then and zipped it up until closer to today. I'm truly curious how many bitcoins were similarly deleted from existence.
  • pgwalsh
Solid piece, very wise. AI is fun, but where I find it useful with code is first writing more thorough comments and then writing the base of your tests.

    Writing actual code that's efficient is iffy at times, and you'd better know the language well or you'll get yourself in trouble. I've watched AI make my code more complex and harder to read. I've seen it put an import in a loop. It's removed the walrus operator because it doesn't seem to understand it. It's used older libraries or built-ins that are no longer supported. It's still fun and does save me some time with certain things, but I don't want to vibe code much, because it takes the joy out of what you're doing.
  • mindsandmach
I'm reminded of the parable of the Chinese farmer (a quick Google search if you aren't familiar) when I see this sentiment. Is going all in on crypto good or bad? Maybe so, maybe not. We'll see. Is going all in on AI-assisted development good or bad? Maybe so, maybe not. We'll see.

    All I know is, I've always enjoyed building things. And I enjoy building things with AI-assisted tools too, so I'll continue doing it.
  • mmmore
One hidden premise of this is "AI tools are not useful now, even if they might be in the future." For example:

    > Few are useful to me as they are now.

    Except current AI tools are extremely useful, and I think you're missing something if you don't see that. This is one of the main differences between LLMs and cryptocurrency; cryptocurrencies were the "next big thing", always promising more utility down the road. Whereas LLMs are already extremely useful: I'm using them to prototype software faster, Terence Tao is using them to formalize proofs faster, my mom's using them to do administrative work faster.
  • nkozyra
    Not advocating for crypto here, but the ROI evaluations here are a bit incongruous.The risk of getting in early on crypto is you lose a little money. The risk of not is missing out on money. You can't simply replay that later, the way that you could invest the time to catch up on how git works.
  • jwsteigerwalt
    At some point you commit the time to learn what you need to. I like to think of the analogy to SEO. The veterans in the industry are not who they are because they were at the front of the line. It’s because they have the 15 years of experience under their belt.
  • A_Duck
    Crypto isn't bad because it failed to make early adopters rich — it did make them rich. It's bad because it has horrible externalities in scams, war crimes / sanctions evasion, organised crime — which most of those early adopters were well aware of.
  • jedberg
    It's more about job seeking than anything. If you jump on a fad early, and it turns out to be the winner, when you're looking for work you can say you have X years of experience with it, which will be a few more than most of the other candidates.It also shows a passion for learning and improvement, something hiring managers are often looking for signals of.But of course it's a trade off. This rewards people who don't have family or other obligations, who have time to learn all the new fads so they can be early on the winners.
  • vinayaksodar
While you can certainly take a wait-and-watch approach on many things, you also have to take a strategic bet on some; otherwise you will never be at the forefront of any field.
  • bdcravens
    This is fine so long as you don't confuse stubbornness for caution. As technologies lose favor, and others suggested you expand your toolset, don't post about your frustration while you're standing in the unemployment line.
  • cjbgkagh
> There are 16,000 new lives being born every hour. They're all starting with a fairly blank slate. Are you genuinely saying that they'll all be left behind because they didn't learn your technology in utero?

    > No. That's obviously nonsense.

    That does not obviously follow. I do worry about the ever-increasing proportion of humanity who are no longer 'economically viable', and this includes people who are not yet born.
  • grim_io
    Devs who never mentored or never had to delegate/explain the work to be done to someone else, might be in for a rough first few weeks/months.It is a skill, but not a special AI specific skill.
  • _345
I would not have started my article with "I could get into Bitcoin anytime, why the rush". That is not the killer first example you think it is. It's been ~17 years of proof so far that you would've made a ton of money by simply mindlessly buying $200 of bitcoin every month, after lower-risk contributions are made, and just holding onto it.

    I mean, if you did that you'd have contributed ~$38K USD by 2026 and have ~1.5B USD now if you started in 2010. BTC being so cheap back then dominates the whole process, so to demonstrate my point further: if you had heard about it all those years, were nervous about trying it, and decided to wait until 2016, you'd still only need to put in $24K overall to come out with ~$450K by 2026.

    That's without biting your fingernails over the price changes, the hype cycles, the price-drop scares. You just set and forget a $200 recurring buy a month, put your energy elsewhere, and pocket half a million for basically no effort.

    And if anything is possible in hindsight, then why in hindsight would you write an article acting like bitcoin was a bad decision to be an early adopter of?
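    The contribution side of the arithmetic above is easy to check (a sketch; `dca_contribution` is a made-up helper, and the final portfolio values are deliberately omitted because they depend on BTC's actual price history, which isn't reproduced here):

```python
def dca_contribution(monthly_amount, start_year, end_year):
    """Total contributed by a fixed monthly buy, from January of
    start_year up to (but not including) January of end_year."""
    months = (end_year - start_year) * 12
    return monthly_amount * months

# The comment's figures at $200/month:
print(dca_contribution(200, 2010, 2026))  # 38400, i.e. ~$38K
print(dca_contribution(200, 2016, 2026))  # 24000, i.e. ~$24K
```

    So both contribution totals quoted in the comment are consistent with a simple $200/month schedule; the ~$1.5B and ~$450K outcomes are the commenter's claims about price history, not something this arithmetic can verify.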
  • waynecochran
This is one of those posts I would like to look back on in a year or two. I am usually a late adopter with everything. This time I think it's different. I am seeing what AI can do with my own eyes. I am creating new things at light speed and figuring out how this all works. I don't think you want to be late to the party on this one.
  • lowken
    There’s a lot of truth to this post. I’m very pro AI, and I believe everyone should get comfortable with it because it’s not just the future, it’s already the present. If you want to stay competitive in today’s workforce, AI is going to be part of your toolkit.But on the other hand... I also only learned git when I needed it at a new job... So we can pump the breaks a bit.
  • halapro
    The last comment is ridiculous. Newborns have a literal lifetime to catch up—rather, they will learn what they need when they need it.
  • cheevly
I've been using AI/LLMs for 3 years non-stop and feel like I've barely scratched the surface of learning how to wield them to their full potential. I can't imagine the mindset of thinking these tools don't take extreme dedication and skill to master.
  • somenameforme
The irony is that if LLMs live up to their potential, then the value of software development as a skill is going to plummet, at least as far as something to do for others. I say it's ironic because obviously the people most interested in using LLMs for software development are software developers, and most are not working independently. It'd be like if we were all proactively training our own replacements.

    I was highly skeptical of this happening not that long ago, but I have to say that it seems increasingly likely. LLMs are still quite mediocre at esoteric stuff, but most software development work isn't esoteric. There's the viable argument that software development largely isn't about writing code, but the ability to write code is what justifies software developer salaries, because there's a large barrier to entry there that most just can't overcome. The 80/20 law seems to apply to everything, certainly here: 80% of your salary is justified by 20% of what you spend your time doing.

    It's quite impossible to imagine what this will do to the overall market, because while this sounds highly negative for software developers, we're also talking about a future where going independent will be easier than ever before, because one of the main barriers to fully independent development is gaps in your skillset. Those gaps may not be especially difficult, but they're just outside your domain. And LLMs do a terrific job of passably filling them in.

    It'd be interesting if the entire domain of internet and software tech plummets in overall value due to excessive and trivialized competition. That'd probably be a highly disruptive but ultimately positive direction for society.
  • romaniv
> weaponisation of FOMO

    This is an excellent characterization of the kind of marketing tactic I see all over social media right now, and one that I find absolutely disgusting.

    The keyword here is fear. Despite the faux-positive veneer, the messaging around certain technologies (especially GenAI) is clearly designed to induce anxiety and fear, rather than inspire genuine optimism or pique curiosity. This is significant, because fear is one of the most powerful tools to shut down rational thinking.

    The subliminal (though not very subtle) message is something very primitive: "If you don't join our group, you will soon starve to death." This is radically different from how most transformative technologies were promoted in the past.
  • DonsDiscountGas
    I started working in AI/ML about ten years ago. Reasonably early. Today, professionally and financially I'm doing about as well as a typical programmer. I find the field interesting so I have no regrets but I tend to agree with OP.
  • markbnj
    Adoption of a new technology has always sorted itself into buckets by early adopters, mainstream adopters and late adopters. I think this post is just demonstrating the mindset of the latter.
  • al_borland
I prefer to move slower. I've accepted that I'm not going to create some unicorn startup (that was never an aspiration). As an employee at a company, my goal is to focus my time learning the things that are relevant to my job and that will be useful for 10 years, not 10 weeks.

    Chasing every new tech will lead to burnout and disillusionment at some point.

    AI probably isn't going away in the same way NFTs largely did, and I use it to some degree. However, I don't see a lot of value in being on the bleeding edge of AI, as the shape it takes for those skills that will be used for the next 10 years is still forming. Trying to keep up now means constantly adapting how I work, where more time is spent keeping up with the changes in AI than actually doing something useful with it.

    After the bubble pops, I think we'll start to see a much clearer picture of what the landscape of AI will look like long-term: who are the winners, who are the losers, and which tools rise to the top after the hype is gone. I'll go deeper at that time.

    Right now, the only thing I'm allowed to use at work is Copilot, so I just use that and don't bother messing around with much more in my free time.
  • Aldipower
    WordStar for DOS was great! A lot better than my handwriting. But still, I get the point. :-)
  • lionkor
    I had this with Rust. I always saw the huge hype, especially some years ago, and it was hugely off-putting. Ridiculous projects like rewriting famously full-branch-coverage-tested projects like SQLite in Rust, or rewriting the GNU coreutils, and the constant spamming of "blazing fast" and "written in Rust (crab emoji)" were very, very hostile to a C++ developer.

    When I eventually got around to using Rust, I was hooked, and now I don't use C++ anymore if I can choose Rust instead. The hype was not completely unjustified, but it was also misplaced, and to this day I disagree with most of those hype projects. It was no issue to silently pick up Rust, write some code that solves problems, and enjoy it as a very, very good language. I don't feel a need to personally contact C or C++ project maintainers and curse at them for not using Rust.

    I do the same with AI. I'm not going around screaming at people who dare to write code by hand, saying "Claude will replace you" or "I could vibe code this for 10 bucks". I silently write my code, I use AI where I find it brings value, and that's it.

    Recognize these tools for what they are: just tools. They have use-cases, tradeoffs, and a massive community of incompetent idiots who like them ONLY because they don't know better, not because they understand the actual value. And then there are the normal, everyday engineers, who use tools because, and ONLY because, they solve a problem.

    My advice: don't be an idiot. It's not the solution for all problems. It can be good without being the solution to every problem. It can be useful without replacing skill. It can add value without replacing you. You don't have to pick a side.
  • argee
    I agree with the conclusion but not with the premise. The conclusion is, "I don't have to be an early adopter," but the premise seems to be "there is zero utility in getting in on anything early."
  • mvrckhckr
    It’s a personal choice, and both early and late(er) can be valid rational choices if it’s you who is making the choice and not just following a crowd (or even a single person).
  • taraharris
    As long as it's not coupled with calls to tax and regulate those who do get in early and reap benefits from doing so, this is good and healthy. (I'm not the earliest adopter of crypto and AI by any means. I only rode up crypto a couple of times for 2X and 3X kinda gains on my investment, and I only started using Claude last year.)
  • aavci
    FOMO is making me feel like I should mess around with openclaw but I can’t see any use cases that I can’t accomplish with other tools. What should I do based on this article?
  • abrztam
    Sure, you can pick up any tool whenever you want. But from your employer's perspective, AI is the best force multiplier since slavery; everything in between still required humans with leverage. The question is whether your boss will need you at all by then.
  • aaurelions
    Why launch Voyager-1 if, in X years, no matter how far it flies, we’ll catch up to it and overtake it using a new version?
  • fantasizr
    the main value being created is by selling courses and convincing people they're late and need to catch up
  • Tade0
    To me the main question is the long-term pricing.

    It is said that major providers more than break even on what they're charging. But at the same time, that's not the point of capitalism, is it? The point is to charge close to the value you're providing. My lunch money is approximately $10 and I often blow through as much in Claude tokens generously provided by the company which hired me. But I'm getting much more than $10 of value from those tokens.

    The cost of entry to this market is extremely high. Should Anthropic win and become a near-monopoly, it is bound to keep increasing prices to the point where the value it's providing matches the cost. That's the endgame of every AI company out there. It's worth using these tools now, while there's still competition and moats haven't been established.
  • aavci
    How else can you be in the right place and right time to discover a problem to solve that can’t be seen from afar?
  • Footprint0521
    Say what you want, but the layoffs keep happening: people who don't use these tools are being replaced by those who do. I miss manual coding just as much as the next man, but this seems like a hot take.
  • neya
    > I'm OK being left behind, thanks!
    > It is 100% OK to wait and see if something is actually useful.
    > I took part in a vaccine trial
    > Getting Jabbed With EXPERIMENTAL SCIENCE!

    This is such a weird article. The author presents so many anecdotal experiences that contradict his own conclusion.
  • kxrm
    As someone further down the road in my career, I would argue that waiting is your prerogative, but you do so at your own peril.

    I made these kinds of mistakes early in my career: I stuck it out with PHP for far too long, ignoring all the changes in frontend design trends, React, etc. I was using jQuery far too late in my career and it really hurt me during interviews. What I was doing was seen as dated and it made ageism far worse for me. Showing a portfolio website that used tables instead of divs. I had to rapidly skill up, and it takes longer than you think when you stick too long with what works for you.

    If AI truly is a nothing-burger, then guess what? Nothing lost, and perhaps you learned some adjacent tech that will help you later. My advice is to NEVER stop learning in this field. Learning is your true superpower. Without that skill, you are a cog that will be easily replaced. AI has revealed to me which of my colleagues are curious, continuous learners. Those virtues have proven, over the course of my 25+ year career in technology, to be what keeps you relevant and marketable.
  • senordevnyc
    I’m running a solo saas. In the last six months, I’ve added about $300k in ARR. There’s zero chance I could have done this without AI. My velocity just keeps going up, month after month.
  • throwaway330935
    > what's the point of "getting in early"?

    You're trying to make the point using Bitcoin, but in the early days I had just over 14,000 of them, so I can quite clearly see a point in getting in early.
  • tutanosh
    I understand and empathize with his sentiment, but I think he is missing the point. Using AI effectively as an engineer requires a paradigm shift in how you work. You cannot approach your work as you did in the past, bolt on AI, and expect it to be a big improvement. In fact, if you do that you will likely be disappointed, and worse off. Shifting your paradigm is one of the hardest things you can do, even more so if you have been in the field for a while, but it is also the most rewarding, and opens up many new possibilities. It's not about being left behind so much as it is about limiting yourself unnecessarily by staying in your comfort zone.
  • groundzeros2015
    I'm surprised nobody has mentioned that Claude is the realization of all this blockchain work: an internet computer you rent time from, where the computation is measured in tokens :)
  • 0xblinq
    Comparing these tools to the crypto or NFT hype is out of touch with reality. This is more on the scale of the invention of the printing press, the telegraph, or the internet itself.

    "I'm OK being left behind, I will join this Internet thing when it really becomes useful"... OK, you do you. Hope you don't get there too late.
  • sktrdie
    So? Economy is entertainment. When crypto was hype, billions were made and burned from building whatever entertaining thing around that. Now it's AI's turn. Billions will be made and burned. Economy is just a fun game. Let's have fun. The idea that everything needs to be "useful" is highly subjective. What is truly useful? Is it food? Shelter? Medicine?
  • thr0w
    > What is there to be left behind from?

    Employment?
  • nubg
    His ignorance is my first-mover opportunity!
  • duskdozer
    Anyone obsessively insisting that others adopt $tech, with threats that you'll be obsolete, left behind, whatever, is just selling you something. If anything, they should be trying to keep it a secret, so that they stay among the elite few who get outsized benefits from $tech while everyone else plays in the mud.
  • bartread
    I mean, whatever, man. This line, as one example:

    > For every HTML 2.0 you might have tried, you were just as likely to have got stuck in the dead-end of Flash.

    Like a lot of tech, Flash had its moment in the sun and then faded away, but that "moment" lasted a decade, and plenty of people got their start because of it or built successful businesses around it. Did they have to pivot as Flash waned? Sure, but change is part of life.

    I'm sorry, but I find the take expressed in this piece absolutely miserable and uninspiring. But hey, congratulations on the 20:20 hindsight, I suppose.
  • lo_zamoyski
    It is a far better investment to attend to the eternities rather than the times.
  • surgical_fire
    Okay, this text was pretty good. Refreshing to read something that doesn't seem written by AI, too (which would be ironic given the contents).

    The only scenario where I think it pays off to be on top of the hype is if you are chasing the money sloshing around the latest fad. You know, the hustle culture thing. If that's not your thing, waiting until things are established (if they ever get there) is harmless.

    And yeah, AI as it is now is at best moderately useful. I use it on a daily basis, but could do without it with little harm.
  • A_Duck
    I'm upvoting because it's useful to see and debate this viewpoint, which is shared by many engineers I know.

    I do think it's a bad take, though. Not all new trends are the same: the metaverse was an obvious flop and crypto hasn't found practical applications. AI isn't like those, because it has already practically changed the way I get my job done. It takes time to learn skills, and getting started earlier means more time to use them in your working life.
  • rileymichael
    i’ve said this before, but the “left behind” narrative is FUD nonsense. as an llm avoider i’ve never felt further _ahead_ than now. all of my peers who never bothered to learn their tools (which gave tangible benefits) have opted into deskilling themselves further.it’s readily apparent who has bought into the llm hype and who hasn’t
  • bravetraveler
    It's a problem of motivation, all right? Now if I work my ass off and Initech ships a few extra units, I don't see another dime, so where's the motivation? And here's something else, Bob: I have eight different bosses right now.
  • hparadiz
    When did ignorance become a virtue? Or is it the contrarian mantra?
  • dreamcompiler
    It amazes me that companies are developing proprietary IP with somebody else's cloud-based AI, one that ingests and learns from everything they type and everything it generates. These companies are paying for the privilege of having their IP stolen.
  • josefritzishere
    This reasoning is solid and applies equally to AI: "If this tech is as amazing as you say it is, I'll be able to pick it up and become productive on a timescale of my choosing not yours." I do not need it crammed into every service and forced on me, thank you very much.
  • make_it_sure
    Some people are not OK. Some people lose their jobs and suffer because they are too complacent and adapting is too uncomfortable. This is the lazy guy's path, not the wise one.
  • nromiun
    Getting in early on any technology only makes sense if you are building your business on top of it, or making money from it in some way. Other than that, it makes sense for the rest of us to wait. Of course, those who believe that AI will turn into AGI and destroy society as we know it won't be convinced.
  • dist-epoch
    The general idea is true, except for this particular technology. When AI becomes easy to pick up and guide, guess what: there will be no need for a programmer to pick it up. AI will be using itself, a Claude manager driving Claude programmers. So leverage AI while you can still provide value doing so. It's literally a "use it or lose it" situation.
  • hota_mazi
    Whenever I hear "It's never too late to do X", I can't help but think "Well in this case, there is no harm in waiting a bit longer, is there?".
  • mocmoc
    Wouldn't play that game with LLMs
  • nailer
    I'm glad I missed: GraphQL, Kubernetes, Microservices, the Metaverse.I'm glad I jumped early on: Linux, Python, virtualization, cloud, nodejs, Solana.I wish I'd gotten into Rust and LLMs earlier.
  • raincole
    > If this tech is as amazing as you say it is, I'll be able to pick it up and become productive on a timescale of my choosing not yours.

    I mean... yeah? It's obviously true. However, people use LLM coding today not because they're "afraid of being left behind" or "investing in a new tech" or whatever abstract reasoning. It's because they're already reaping the benefit right away. It takes just a few hours to get through about 80% of the learning curve.
  • m132
    The thing is, Bitcoin, at least before cryptocurrencies were picked up by "tech bros", was originally a way to disconnect from the corrupt, centralized banking system.LLMs, at the moment, are all about giving up your own brain and becoming fully dependent on a subscription-based online service.
  • simianwords
    It's interesting to see this author's historical takes about AI. IMO it reads a little desperate, very much like the hype bros but from the opposite side. Take a look at the articles if you don't believe me: https://shkspr.mobi/blog/tag/ai/

    - I'm OK being left behind, thanks!
    - Unstructured Data and the Joy of having Something Else think for you
    - This time is different
    - How close are we to a vision for 2010?
    - AI is a NAND Maximiser
    - Reputation Scores for GitHub Accounts
    - Agentic AI is brilliant because I loath my family
    - Stop crawling my HTML you dickheads - use the API!
    - Removing "/Subtype /Watermark" images from a PDF using Linux
    - LLMs are still surprisingly bad at some simple tasks
    - Books will soon be obsolete in school
    - Winners don't use ChatGPT
    - Grinding down open source maintainers with AI
    - Why do people have such dramatically different experiences using AI?
    - Large Language Models and Pareidolia
    - How to Dismantle Knowledge of an Atomic Bomb
    - GitHub's Copilot lies about its own documentation. So why would I trust it with my code?
    - LLMs are good for coding because your documentation is shit
  • gabordemooij
    I find the hivemind terribly oppressive at times. AI tools are great, but in the end it seems to me that the results matter most. Yet we seem to go from hype to hype, again and again. It's all so tiresome. Why can't we just respect individual choices and focus less on the tools and more on the results?
  • LaGrange
    > Have fun being poor

    Not going to lie, I'd rather be poor. Not destitute; I've been poor but not destitute, and I'd rather not get desperate. But poor? As in (because "poor" is very imprecise and can imply anything from utter poverty to "not owning three homes") having a low-paying job but still enough to pay rent? I'd rather be that than do AI-assisted software development.

    Genuinely, the only thing stopping me now is that there's actually way more skill and qualification in most low-paying jobs than the typical software developer imagines, and acquiring those takes time and money itself. But by now I know multiple people who made the jump even before the latest madness, and they're all happier. Some still code, but don't even publish. Some are like "I haven't used a proper computer in _months_, this is great." All work hard jobs at odd hours. None regret it.
  • dakolli
    I love the cryptocurrency analogy. The LLM hype monkeys are the same people who were screaming that NFTs/digital art would replace all the traditional art in galleries within 10 years. They are literally the same people, and they are all addicted to money and hype. Ignore them.

    Did handmade Swiss watch movements lose all demand when Asia started mass-manufacturing watches? No. There is always going to be demand for quality over slop. It's the same reason handmade clothes are worth 100x more than clothes at a department store.

    This is all by design, too: these billionaires selling thinking machines are trying to make us all dependent on their fountain of tokens. Don't fall for it. Just like maps apps made everyone reliant on Google/Apple for the ability to navigate around their own city, these billionaires want to do the same thing with your ability to think, build, plan, and even learn/read. Don't fall for this scam. Unlike other hype cycles like NFTs and crypto, this one will damage more than just your bank account; it will fry your brain if you become over-reliant on it.

    Take a second and consider why these LLM tool companies design their products like slot machines. They put multipliers in their UIs (run this x3, x4, x5) so that you inevitably treat the thing like a slot machine. And it is like a slot machine: you have no way to control the results, which are quite random; LLMs just have a better payout percentage, at the cost of making your brain dopamine-dependent and structurally dependent on their output. They convince people there is some occulted art in the formation of a prompt, like a gambler who thinks that pressing the buttons in a certain order will get better results, or any of the other gambling superstitions.

    If you're writing software, please take a moment to breathe, and ask yourself whether it's really that useful to have piles of code where you have little idea how things work, even if they do. Billionaires will sell you on the idea that this doesn't matter, because the LLM, which you conveniently have to pay them to use, will always be able to fix that bug. Don't fall for the ruling class's trick: they want you reliant on this thing so they can tell you that your input isn't as valuable, and therefore your salary and skills are not as valuable. We have to stop this now.
  • dakolli
    The kid who showed his work in detail in math class is doing better in life 9/10 times than the kids who only knew how to use a calculator. Now consider how well the people who think you just need to know how to yell at the calculator are going to do.

    When maps apps came around, people totally lost the brain muscle for navigating. Using LLMs is no different: people over-reliant on these tools are simply ngmi. They are going to be totally reliant on their favorite billionaire being willing to sell them competency via their thinking machines.

    I would caution everyone to consider whether the billionaires screaming that you're going to be left behind, laid off, and made redundant if you don't (pay them to) use their brain-nerfing machine really have your best interest at heart. You're not going to be left behind.

    https://arxiv.org/abs/2506.08872
  • homeonthemtn
    I think this is a carryover from the early-2000s boom-and-bust mindset: that if you jump in early enough on nearly any technology, you'll become a billionaire. So hop on board our burning platform!

    In general, we as a society have not adjusted to technology. We've gone through too much change to have any stable baselines. So we're going to float in insanity for a while until things finally settle down. Probably two wars, a famine, and several periods of resource scarcity away still, but we'll get there one day...
  • bigbawls
    Had another 2-hour throwaway session today with an LLM. These only really happen in mature codebases tied up in complex business requirements. The last few times I've tried LLMs with this codebase it has not been fruitful. Weird, because it's impressive in other areas, especially tech with no real users lmao
  • jruz
    Have fun writing code yourself /s
  • j3th9n
    TL;DR: dude makes wrong choices one after the other and copes by being OK with being left behind. "I wrote my MSc on The Metaverse", ...
  • butILoveLife
    This would hit harder if Bitcoin hadn't won and AI coding hadn't completely changed our jobs. Why not simply evaluate things instead of ignoring them until it's too late? Sure, we don't have infinite time, but the fact that OP mentions these two things means the pattern has shown up often enough.
  • danielbln
    A great blog article for 2023. In 2026, I think the wait is over.
  • marnett
    > I didn't use Git when it first came out. Once it was stable and jobs began demanding it, I picked it up.What jobs aren’t requiring usage of these tools by now?
  • adriancooney
    There are real productivity gains by using these tools right now. Instead of doing 1x your normal work, you can do 5x while still maintaining quality. This is like an accountant sticking to pen and paper because calculators are big and clunky.