Comments (130)
- malmeloo
One big issue with FPGAs is how annoying it is to learn how to use them. I did a course on embedded systems a few years ago and nobody could truly get to enjoy it because we spent most of our time downloading and installing huge toolchains, waiting for synthesis and PnR to complete and debugging weird IDE issues. We need to open up the space to allow people to develop better solutions than what these companies are forcing down our throats.

There already exist fantastic open source tools such as Yosys, Nextpnr, iverilog, OpenFPGALoader, ... that together implement most features that a typical hardware dev would want to use. But chip support is unfortunately limited, so fewer people are using these tools.

We decided to build a VSCode extension that wraps these open source tools (https://edacation.github.io for the interested) to combat this problem. Students are already using it during the course and are generally very positive about the experience. It's by no means a full IDE, but if you're just getting started with HDL it's great to get familiar with it. Instead of a mess of a toolchain that nobody truly knows how to use, you now get a few buttons to visualize and (soon) program onto an FPGA.

There's also Lushay Code for the slightly more advanced users. But we need more of these initiatives to really get the ball rolling and make an impact, so I'd highly recommend people to check out and contribute to projects like this.
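Just to show the scale of it, here's a rough sketch of the open source flow for an iCE40 HX8K once everything is installed (the module name, constraint file and board name are placeholders, so adjust for your part):

    # synthesize, place-and-route, pack and flash
    yosys -p "synth_ice40 -top top -json top.json" top.v
    nextpnr-ice40 --hx8k --package ct256 --json top.json --pcf top.pcf --asc top.asc
    icepack top.asc top.bin
    openFPGALoader -b <your-board> top.bin

The whole thing fits in a five-line Makefile, which is exactly the kind of starting point we want students to have.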
- BrannonKing
The whole place-and-route thing is completely wrong for using FPGAs as accelerators. We don't need an optimal layout, we need a tiled layout (like the GPU does). All that we need for this to happen is for the companies making the FPGAs to open up the board layout file spec. They don't need to even make/ship any software at all. Just ship the dang file that says where the resources & timings are and some instructions on how to toggle the LUT config.

My feeling is that hardware companies do better when they ship the software needed to utilize their hardware for free. (You need a little margin in the hardware price to cover the software development.) However, the FPGA companies haven't figured this out. They try to make way too much software and charge exorbitant fees for it, somehow thinking that their hardware is useless without that. In fact, their hardware is useless because I can't put anything on it without a 1-to-20 hour compile time. That makes it impossible to use it as an accelerator. I can compile OpenCL for my GPU in a few milliseconds; that's what we need for the FPGA. Even thirty seconds would be easily tolerable -- there's many a game that still requires 15 seconds to load a level and compile its shaders.

FPGAs could be much more useful than they are at present. They've artificially limited themselves to ASIC prototyping alone.

So Intel bought an FPGA company -- nobody knows why. AMD got scared and did the same thing with no clue what to do with it. They've both let them rot. Intel did start incorporating it into its compiler targets, but it was only half-baked. Now they've wisely divested themselves of the company, but it should have never happened. They should have just focused on selling the hardware at a small margin whilst opening up the data to use it.
- exmadscientist
FPGAs need their "Arduino moment". There have been so, so, so many projects where I've wanted just a little bit of moderately-complicated glue logic. Something pretty easy to dash off in VHDL or whatever. But the damn things require so much support infrastructure: they're complicated to put down on boards, they're complicated to load bitstreams into, they're complicated to build those bitstreams for, and they're complicated to manage the software projects for.

As soon as they reach the point where it's as easy to put down an FPGA as it is an old STM32 or whatever, they'll get a lot more interesting.
- mgilroy
The issue with the software team using an FPGA is that software developers generally aren't very good at doing things in parallel. They generally do a poor job in implementing hardware. I previously taught undergraduates VHDL, and the software students generally struggled with dealing with things running in parallel.

VHDL and Verilog are used because they are excellent languages to describe hardware. The tools don't really hold anyone back. Lack of training or understanding might.

Consistently, the issue with FPGA development for many years was that by the time you could get your hands on the latest devices, general purpose CPUs were good enough. The reality is that if you are going to build a custom piece of hardware then you are going to have to write the drivers and code yourself. It's achievable, however, it requires more skill than pure software programming.

Again, thanks to low-power and low-cost ARM processors, a class of problems previously handled by FPGAs has been picked up by cheap but fast processors.

The reality is that for major markets custom hardware tends to win, as you can make it smaller, faster and cheaper. The probability is someone will have built and tested it on an FPGA first.
- Cadwhisker
This article is a rant about how bad tools are without going into specifics. "VHDL and Verilog are relics" -- well, so is "C", but they all get the job done if you've been shown how to use them properly.

"engineers are stuck using outdated languages inside proprietary IDEs that feel like time capsules from another century." The article misses that Vivado was developed in the 2010s and released around 2013. It's a huge step up from ISE if you know how to drive it properly, and THIS is the main point that the original author misses. You need to have a different mindset when writing hardware and it's not easy to find training that shows how to do it right.

If you venture into the world of digital logic design without a guide or mentor, then you're going to encounter all the pitfalls and get frustrated.

My daily Vivado experience involves typing "make", then waiting for the result and analysing from there (if necessary). It takes experience to set up a hardware project like this, but once you get there it's compatible with standard version control, CI tools, regression tests and the other nice things you expect from a modern development environment.
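For anyone curious what's behind that "make": the target typically boils down to little more than a batch-mode call plus a TCL script, roughly like this (names here are illustrative, not a drop-in recipe):

    # everything lives in build.tcl; a Makefile target just wraps this one call
    vivado -mode batch -nolog -nojournal -source build.tcl

The bitstream, logs and reports then become ordinary build artifacts that version control and CI can deal with.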
- mastax
My prediction is one of the Chinese FPGA makers will embrace open source, hire a handful of talented open source contributors, and within a handful of years end up with tooling that is way easier to use for hobbyists, students, and small businesses. They'll use this as an inroad and slowly move upmarket. Basically the Espressif strategy.

Xilinx, Altera, and Lattice are culturally incapable of doing this. For Lattice especially it seems like a no-brainer, but they don't understand the appeal of open source still.
- Traster
I was quite surprised by the direction this article took. I wasn't expecting reheated whinging about the toolchain.

FPGAs do need a new future. They need a future where someone tapes out an FPGA! Xilinx produced UltraScale+ over a decade ago and haven't done anything interesting since. Their Versal devices went off on a tangent into SoCs, NoCs, AI engines - you know what they didn't do? Build a decent FPGA.

Altera did something ambitious back in 2014 when they proposed the hyper-register design, totally failed to execute on it and have been treading water because of the Intel cluster**. They're now an independent company but literally don't have anyone who knows how to tape out a chip.

I'm less familiar with the Lattice stuff, but since their most advanced product is still 16nm FinFET I suspect they aren't doing anything newer than Xilinx or Altera.

We need a company that builds an FPGA. It doesn't matter what tooling you have, because the fundamental performance gap between a custom FPGA solution and a CPU or GPU based solution is entirely eaten up by the fact that the newest FPGA you can buy is a decade old and inexplicably still tens of thousands of dollars.

If FPGA technology had progressed at the same rate Nvidia or Apple had pushed forward CPU/GPU performance, there'd be some amazing opportunities to accelerate AI (on the path to then creating ASICs). But they haven't, so all the scaling laws have worked against them and the places they have a performance benefit have shrunk massively.
- ImPleadThe5th
I have an old FPGA sitting around at home and I'm relatively comfortable with VHDL.

I've never really thought of any interesting projects to do with it. Anyone know of anything?
- jhide
This article could've been written 20 years ago with only minor revisions, and it would've been true then. But it's not now. It is trivial, literally a day of work, to set up a build system and CI/CD environment using Verilator if you are already proficient with your build system of choice. Learning TCL to script a bitfile generation target using your FPGA vendor's tools is a few extra days of work. And regarding IDE support, the authors complain about the experience of writing code in the vendor GUI. They should look at one of the numerous fully featured SystemVerilog LSPs available in e.g. VS Code.

The real argument for open source toolchains is much narrower in scope, and implying it's required to fix a nonexistent tool problem is absurd.
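Concretely, the Verilator part of that day is a couple of commands; a minimal sketch, assuming a top module in top.sv and a hand-written C++ testbench tb_top.cpp (both names made up):

    # build a C++ model of the design plus the testbench, then run it
    verilator --cc --exe --build -j 0 -Wall top.sv tb_top.cpp
    ./obj_dir/Vtop

Point your CI runner at those two lines and you have per-commit simulation without touching a vendor tool.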
- juliangmp
> I once hoped things would improve when Xilinx launched the Zynq line, combining a processor with FPGA fabric. Instead, the accompanying tools were so unusable that they made things worse, pushing developers even further away.

I once tried to use Xilinx' Vitis (2025) to make a small-ish piece of software running on such a Zynq chip. After wrestling with it* for like 5 weeks, me and my colleagues decided to ditch the entire Xilinx suite entirely and just pick a compiler and make a bare-metal binary with it. The FPGA part is done by a separate team of course, so us traditional software devs can stick with decent tools. We actually opted for a Rust toolchain and I'm extremely glad we did this, despite the additional time it took.

I don't know how my FPGA colleagues work with the proprietary toolchains and not go insane.

*The IDE is effectively a wrapper with a custom python API around cmake and gcc. It's not very well written cmake and I also don't know how they configure the linker that it does the weird things it does.
- mikewarot
An FPGA is like a spreadsheet for bits that can recalculate hundreds of millions of times per second.

It's a declarative programming system, and there's a massive impedance mismatch when you try to write source code for it in text. I suspect that something closer to flow charts would be much easier to grok. Verilog is about as good a match as you are likely to get, if you stick with the source code approach to designing with them.
- omneity
If performant FPGAs were more accessible we'd be able to download models directly into custom silicon, locally, and unlock innovation in inference hardware optimizations. The highest grade FPGAs also have HBM memory and are competitive (on paper) with GPUs. To my understanding this would be a rough hobbyist version of what Cerebras and Groq are doing with their LPUs.

Unlikely this will ever happen but one can always dream.
- rzerowan
On the software front, as mentioned, VHDL and Verilog are showing their age in their design as well as their tooling ecosystem. Attempts such as Chisel [1] (written in Scala) also haven't gotten much traction - seeing the language choice, it would have been better in something more accessible like Kotlin/OCaml.

Secondly, the integration with consumer devices and OSes is almost non-existent - it should really be simpler to interact with, a la GPU/network chips, and there should be more mainboards with low-cost integrated FPGAs even if they only have a couple of hundred logic cells.

[1] https://github.com/chipsalliance/chisel/blob/main/README.md
- dsab
I had the misfortune of working with the Xilinx Vivado environment, and it's fucking garbage. The software is straight out of the 90s, everything is glued together with shell scripts and the TCL scripting language, the IDE throws thousands of warnings and errors while building a sample project, and the documentation is missing or spread over 150 PDFs. If the manufacturer of your evaluation board prepared an example for the previous version of Vivado, you must have two installations, which is probably about 2 * 100GB. If you want to keep anything under version control, you have to use some external tools. It's all absurd.
- mgaunard
There are a number of alternatives to VHDL and Verilog, many of which lower to Verilog, for example MyHDL.
- rcxdude
FPGA toolchains certainly could do with being pulled out of the gutter but I don't think that alone will lead to much of a renaissance for them. Almost definitionally they're for niches: applications which need something weird in hardware but aren't big enough to demand dedicated silicon, because the flexibility of FPGAs comes at a big cost in die area (read: price), power, and speed.
- Peteragain
Programming languages were originally designed by mathematicians based on a Turing machine. A modern language for FPGAs is a challenge for theoretical computer science, but we should keep computer-literate researchers away from it. This is a call-out to hard-core maths heads to think about how we should think about parallelism and what FPGA hardware can do.
- jonesjohnson
I'm surprised no one has mentioned https://f4pga.org/ yet.
- jauntywundrkind
Making FPGAs actually available (without encumbering stacks) would be so great. Companies IMO do better when they stop operating from within their moat, and this would be such an amazing use case to lend support for that hypothesis.

Gowin and Efinix, like Lattice, have some very interesting new FPGAs that they've innovated hard on, but which are still only so-so available.

Particularly with AI about, having open source stacks really should be a major door-opening function. There could be such an OpenROAD-type moment for FPGAs!
- marcosscriven
> An FPGA, by contrast, defines data pathways specifying how signals change on each clock tick based on internal states and external inputs. In essence, we describe global per-clock-cycle behavior rather than an individual act of data manipulation per step.

I think that's the clearest explanation of FPGAs I've ever seen.
- Ericson2314
This is a very correct article. FPGAs should indeed be really easy to use!
- xgstation
I imagine an FPGA could just be part of a general CPU that provides user-space APIs to program it to accelerate certain workflows; in other words, this sounds exactly like JIT to me. People could program the FPGA as they need to, e.g. an AV1 encoder/decoder, accelerating some NN layers, or even a JS runtime. Am I thinking something too wild for hardware capability, or is it just that the ecosystem isn't there yet to allow such flexible use cases?
- lefra
I program FPGAs professionally (Xilinx Zynq, VHDL). I agree that the tool's GUI is atrocious; the actual way to use it is to write everything in TCL scripts and invoke the tools through a Makefile. I only open the GUI to look at timing issues or ask it for code templates.

I disagree with "HDL is software" though. It's not, it's even in the name: "hardware description language". Yes, it's a text file with what looks a lot like regular code in it. However, what's being described is how to connect boxes of logic together, and how to compute the output of the boxes from their inputs. There's no implicit program counter that's advancing from one line to the next.

It is (theoretically) possible to write these kinds of things with a lot of abstraction, but every time you try that by using more advanced language features, you hit some bugs in the tool's implementation of the language. If you're lucky it'll tell you where you're doing unsupported stuff. Often it'll crash. Sometimes it'll synthesize hardware that doesn't conform with the language spec.

Finally, FPGAs are simple only when you're looking at a bird's eye view (just like CPUs are simple when you're looking at a diagram with a few boxes saying "ALU", "Cache", "Registers"). The actual datasheets are thousands of pages long.

FPGAs are still useful though; their use case is "I need custom hardware and I don't have the volume to build an ASIC". For example, my application is a custom signal processing pipeline that's handling about 3.5 Gbit/s of streaming raw data. On a $40 chip.

I think my main point is that yes, the tooling is a pain to use, with heaps of bugs and bad language support. However, a HDL is conceptually different from a software language and I'm not sure you can hide away the complexities of designing hardware behind "modern" language features.

For those suggesting diagram-based languages: go program something in LabView, you'll quickly understand why that's a bad idea (works for trivial designs, anything complex is an opaque mess of boxes and lines, unsearchable, and impossible to integrate with version control).
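To give an idea of what those TCL scripts boil down to, a minimal non-project-mode sketch looks roughly like this (the part number and file names are only examples):

    # build.tcl -- read sources, run the flow, write the bitstream
    read_vhdl [glob src/*.vhd]
    read_xdc constraints/top.xdc
    synth_design -top top -part xc7z020clg400-1
    opt_design
    place_design
    route_design
    report_timing_summary -file timing_summary.rpt
    write_bitstream -force top.bit

Because the whole project is a handful of text files like this, version control and CI stop being a fight with the tool.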
- brador
No mention of video game emulation on FPGA?

No mention of that Brazilian company that was set to manufacture them to undercut the market?
- phendrenad2
It seems to me that there are exactly 3 buyers for FPGAs: government contractors (who spend millions all at once), retro gamers (small market), and electronics hobbyists (another small market). It's no wonder every company has orientated itself towards the first one. I look to China to accidentally make chips that are an order of magnitude better "just because they can".
- anondawg55
No. FPGAs already have a future. You just don't know about it yet.
- octoberfranklin
The problem is that FPGA companies don't see themselves as chip companies. They see themselves as CAD software companies. The chip is just a copy-protection dongle.
- checker659
Cost is also such a big issue.
- nospice
To folks who wax lyrical about FPGAs: why do they need a future?

I agree with another commenter: I think there are parallels to "the bitter lesson" here. There's little reason for specialized solutions when increasingly capable general-purpose platforms are getting faster, cheaper, and more energy efficient with every passing month. Another software engineering analogy is that you almost never need to write in assembly because higher-level languages are pretty amazing. Don't get me wrong, when you need assembly, you need assembly. But I'm not wishing for an assembly programming renaissance, because what would be the point of that?

FPGAs were a niche solution when they first came out, and they're arguably even more niche now. Most people don't need to learn about them and we don't need to make them ubiquitous and cheap.
- CamperBob2
> Here's the first big misconception: HDL is hardware. It isn't. HDL is software and should be managed like software.

Yes, that's certainly a big misconception. Maybe not the one the author meant to call out, but... yes, a big misconception indeed.
- ursAxZA
If Jim Keller says it, I'll believe it.

My Ryzen agrees — the fans just spun up like it's hitting 10,000 rpm.