Comments (112)

  • woodruffw
    I think this post does a really good job of covering how multi-pronged performance is: it certainly doesn't hurt uv to be written in Rust, but it benefits immensely from a decade of thoughtful standardization efforts in Python that lifted the ecosystem away from needing `setup.py` on the hot path for most packages.
  • epage
    > uv is fast because of what it doesn’t do, not because of what language it’s written in. The standards work of PEP 518, 517, 621, and 658 made fast package management possible. Dropping eggs, pip.conf, and permissive parsing made it achievable. Rust makes it a bit faster still.
    Isn't assigning out what all made things fast presumptive without benchmarks? Yes, I imagine a lot is gained by the work of those PEPs. I'm more questioning how much weight is put on dropping compatibility compared to the other items. There is also no coverage of decisions influenced by language choice, which likely influences "Optimizations that don’t need Rust".
    This also doesn't cover subtle things. I'm unsure whether rkyv is being used to reduce the number of times that TOML is parsed, but TOML parse times do show up in Cargo's benchmarks, and Cargo/uv's TOML parser is much faster than Python's (note: Cargo team member, `toml` maintainer). I wish the TOML comparison page were still up and showed actual numbers to point to.
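    For a rough sense of how TOML parse time can be measured on the Python side, here is a tiny timing sketch. It is not the Cargo/uv benchmark referred to above; the document is made up, and the numbers depend entirely on input size and machine:

        # Tiny, illustrative timing of Python's stdlib TOML parser; the
        # document below is invented and not representative of real metadata.
        import timeit
        import tomllib  # stdlib in Python 3.11+

        DOC = """
        [project]
        name = "example"
        version = "0.1.0"
        dependencies = ["requests>=2.31", "rich>=13.0"]
        """

        N = 10_000
        seconds = timeit.timeit(lambda: tomllib.loads(DOC), number=N)
        print(f"{seconds / N * 1e6:.1f} microseconds per parse")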
  • ethin
    > Zero-copy deserialization. uv uses rkyv to deserialize cached data without copying it. The data format is the in-memory format. This is a Rust-specific technique.
    This (zero-copy deserialization) is not a Rust-specific technique, so I'm not entirely sure why the author describes it as one. Any good low-level language (C/C++ included) can do this, in my experience.
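    To the commenter's point, here is a minimal Python sketch of the general idea: fields are read straight out of a memory-mapped buffer instead of parsing a copy into an intermediate structure. It illustrates the technique only; the record layout is invented and is not rkyv's format or uv's cache layout:

        # Minimal zero-copy-style access: records are read directly from a
        # memory-mapped file via unpack_from, with no intermediate parse of
        # the whole blob. Layout is invented for the demo; not rkyv or uv.
        import mmap
        import struct

        RECORD = struct.Struct("<I16s")  # (id: u32, name: 16 raw bytes)

        # Write two fixed-layout records so the demo is self-contained.
        with open("cache.bin", "wb") as f:
            f.write(RECORD.pack(1, b"numpy".ljust(16, b"\0")))
            f.write(RECORD.pack(2, b"requests".ljust(16, b"\0")))

        with open("cache.bin", "rb") as f:
            with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as buf:
                for offset in range(0, buf.size(), RECORD.size):
                    pkg_id, raw_name = RECORD.unpack_from(buf, offset)
                    print(pkg_id, raw_name.rstrip(b"\0").decode())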
  • pecheny
    The content is nice and insightful! But God, I wish people would stop using LLMs to 'improve' their prose... Ironically, some day we might employ LLMs to re-humanize texts that have already been massacred.
  • w10-1
    I like the implication that we can have an alternative to uv speed-wise, but I think reliability and understandability are more important in this context (so this comment is a bit off-topic).
    What I want from a package manager is that it just works. That's what I mostly like about uv. Many of the changes that made speed possible were to reduce the complexity, and thus the likelihood of things not working.
    What I don't like about uv (or pip or many other package managers) is that the programmer isn't given a clear mental model of what's happening and thus how to fix the inevitable problems. Better (PubGrub) error messages are good, but it's rare that they can provide specific fixes. So even if you get 99% speed, you end up with 1% perplexity and diagnostic black boxes.
    To me, the time that matters most is the time to fix problems that arise.
  • bastawhiz
    > When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower. This reduces resolver backtracking dramatically since upper bounds are almost always wrong. Packages declare python<4.0 because they haven’t tested on Python 4, not because they’ll actually break. The constraint is defensive, not predictive.
    This is kind of fascinating. I've never considered runtime upper bound requirements. I can think of compelling reasons for lower bounds (dropping version support) or exact runtime version requirements (each version works for exact, specific CPython versions). But now that I think about it, it seems like upper bounds solve a hypothetical problem that you'd never run into in practice.
    If the PSF announced v4 and declared a set of specific changes, I think this would be reasonable. In the 2/3 era it was definitely reasonable (even necessary). Today, though, it doesn't actually save you any trouble.
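    For illustration, here is a rough Python sketch of the kind of clamping the quoted passage describes, using the `packaging` library. This is a hypothetical helper and a simplification, not uv's actual (Rust) implementation:

        # Hypothetical helper: drop upper bounds from a requires-python
        # specifier and keep only the lower-bound parts. A simplification of
        # the behavior described in the quoted passage, not uv's code.
        from packaging.specifiers import SpecifierSet

        def drop_upper_bounds(requires_python: str) -> SpecifierSet:
            kept = [
                spec for spec in SpecifierSet(requires_python)
                if spec.operator in (">", ">=")
            ]
            return SpecifierSet(",".join(str(s) for s in kept))

        print(SpecifierSet(">=3.8,<4.0").contains("4.0"))        # False
        print(drop_upper_bounds(">=3.8,<4.0"))                   # >=3.8
        print(drop_upper_bounds(">=3.8,<4.0").contains("4.0"))   # True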
  • robertclaus
    At Plotly we did a decent amount of benchmarking to see how much the different defaults `uv` uses contribute to its performance. This was necessary so we could advise our enterprise customers on the transition. We found you lost almost all of the speed gains if you configured uv to behave as much like pip as you could. A trivial example is the precompile flag, which can easily account for 50% of pip's install time for a typical data science venv.
    https://plotly.com/blog/uv-python-package-manager-quirks/
  • andy99
    I remain baffled by these posts getting excited about uv’s speed. I’d like to see a real poll, but I personally can’t imagine people listing speed as one of their top ten concerns about Python package managers. What are the common use cases where the delay due to package installation is at all material?
    Edit to add: I use Python daily.
  • didibus
    There's an interesting psychology at play here as well: if you are a programmer who chooses a "fast language", that's indicative of your priorities already. It's often not so much the language as that the programmer has decided to optimize for performance from the get-go.
  • blintz
    > PEP 658 went live on PyPI in May 2023. uv launched in February 2024. The timing isn’t coincidental. uv could be fast because the ecosystem finally had the infrastructure to support it. A tool like uv couldn’t have shipped in 2020. The standards weren’t there yet.
    How/why did the package maintainers start using all these improvements? Some of them sound like a bunch of work, and getting a package ecosystem to move is hard. Was there motivation to speed up installs across the ecosystem? If setup.py was working okay for folks, what incentivized them to start using pyproject.toml?
  • eviks
    > Every code path you don’t have is a code path you don’t wait for.
    No, every code path you don't execute is that. Like:
    > No .egg support.
    How does that explain anything if the egg format is obsolete and not used? Similarly with the spec-strictness fallback logic: it's only slow if the packages you're installing are malformed; otherwise the logic will not run and will not slow you down.
    And in general, instead of a list of irrelevant and potentially relevant things, it would be great to understand the actual time savings per item (at least for those that deliver the most speedup)! But otherwise, a great and seemingly comprehensive list!
  • ofek
    > pip could implement parallel downloads, global caching, and metadata-only resolution tomorrow. It doesn’t, largely because backwards compatibility with fifteen years of edge cases takes precedence.
    pip is simply difficult to maintain. Backward-compatibility concerns surely contribute to that, but there are other factors too, like an older project having to satisfy the needs of modern times.
    For example, my employer (Datadog) allowed me and two other engineers to improve various aspects of Python packaging for nearly an entire quarter. One of the items was to satisfy a few long-standing pip feature requests. I discovered that the cross-platform resolution feature I considered most important is basically incompatible [1] with the current code base. Maintainers would have to decide which path they prefer.
    [1]: https://github.com/pypa/pip/issues/13111
  • ggm
    Some of these speed-ups look viable to backport into pip, including parallel downloads, delayed .pyc compilation, ignoring eggs, and the version checks. Not that I'd bother, since uv does venvs so well. But "it's not all Rust runtime speed" implies pip could be faster too.
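    For the parallel-download piece specifically, the basic shape in Python is just a thread pool over fetches. A rough sketch follows; the URLs are placeholders, and pip's real download path also handles sessions, retries, hash checking, and progress reporting:

        # Rough sketch of overlapping wheel downloads with a thread pool.
        # The URLs are placeholders; real installer machinery is far more involved.
        from concurrent.futures import ThreadPoolExecutor
        from urllib.request import urlopen

        WHEEL_URLS = [
            "https://example.invalid/wheels/pkg_a-1.0-py3-none-any.whl",
            "https://example.invalid/wheels/pkg_b-2.1-py3-none-any.whl",
        ]

        def download(url: str) -> bytes:
            # Network I/O releases the GIL, so these calls overlap across threads.
            with urlopen(url) as resp:
                return resp.read()

        with ThreadPoolExecutor(max_workers=8) as pool:
            wheels = list(pool.map(download, WHEEL_URLS))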
  • yjftsjthsd-h
    > No bytecode compilation by default. pip compiles .py files to .pyc during installation. uv skips this step, shaving time off every install. You can opt in if you want it.
    Are we losing out on performance of the actual installed thing, then? (I'm not 100% clear on .pyc files, TBH; I'm guessing they speed up start time?)
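    For context: .pyc files are cached bytecode, so they mainly speed up the first import. If they aren't written at install time, CPython compiles each module on first import and caches the result in __pycache__ (when that directory is writable), so subsequent imports are unaffected. A small sketch of compiling ahead of time, roughly the step pip runs by default and uv makes opt-in; the paths are examples only:

        # Sketch of ahead-of-time bytecode compilation. The paths are examples.
        import compileall
        import py_compile

        # Compile every .py under a site-packages directory into __pycache__/.
        compileall.compile_dir(".venv/lib/python3.12/site-packages", quiet=1)

        # Or compile a single module:
        py_compile.compile("example_module.py")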
  • zahlman
    I've talked about this many times on HN this year but got beaten to the punch on blogging, it seems. Curses.
    ... Okay, after a brief look, there's still lots of room for me to comment. In particular:
    > pip’s slowness isn’t a failure of implementation. For years, Python packaging required executing code to find out what a package needed.
    This is largely refuted by the fact that pip is still slow, even when installing from wheels (and getting PEP 600 metadata for them). Pip is actually still slow even when doing nothing. (And when you create a venv and allow pip to be bootstrapped in it, that bootstrap process takes in the high 90s percent of the total time used.)
  • IshKebab
    Mmm, I don't buy it. Not many projects use setup.py now anyway, and pip is still super slow.
    > Plenty of tools are written in Rust without being notably fast.
    This also hasn't been my experience. Most tools written in Rust are notably fast.
  • VerifiedReports
    So... will uv make Python a viable cross-platform utility solution?
    I was going to learn Python for just that (file-conversion utilities and the like), but everybody was so down on the messy ecosystem that I never bothered.
  • nurettin
    > When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower.
    I will bring popcorn on the Python 4 release date.
  • pkaodev
    AI slop
  • looneysquash
    I don't have any real disagreement with any of the details the author gives. But still, I'm skeptical.
    If it is doable, the best way to prove it is to actually do it. If no one implements it, was it ever really doable? Even if there is no technical reason, perhaps there is a social one?
  • ec109685
    The article info is great, but why do people put up with LLM tics and slop in their writing? These sentences add no value and treat the reader as stupid.
    > This is concurrency, not language magic.
    > This is filesystem ops, not language-dependent.
    Duh, you literally told me that in the previous sentence, and 50 million other times.
  • agumonkey
    Very nice article, always good to get a review of what a "simple"-looking tool does behind the scenes.
    About Rust, though: some say a nicer language helps with finding the right architecture (I heard that about a C++ veteran dropping it for OCaml; any attempted idea would take weeks in C++ but only a few days in OCaml, so they could explore more).
    Also, the parallelism might be a benefit of the language's orientation.
    Enough semi-fanboyism.
  • pwdisswordfishy
    > Some of uv’s speed comes from Rust. But not as much as you’d think. Several key optimizations could be implemented in pip today: […] Python-free resolution
    Umm…
  • skywhopper
    This is great to read because it validates my impression that Python packaging has always been a tremendously overengineered mess. Glad to see someone finally realized you just need a simple, standard metadata file per package.
  • hallvard
    Great post, but the blatant ChatGPT-esque feel hits hard… Don’t get me wrong, I love Astral, and the content too, but…
  • efilife
    This shit is ChatGPT-written and I'm really tired of it. If I wanted to read ChatGPT, I would have asked it myself. Half of the article is nonsensical repeated buzzwords thrown in for absolutely no reason.
  • aswegs8
    uv seems to be a darling of HN. I always thought pipenv was good, but yeah, it seems like I was being ignorant.