Add a uv build backend #3957


Open
Tracked by #190
chrisrodrigue opened this issue Jun 1, 2024 · 143 comments
Labels: build-backend, enhancement (new feature or improvement to existing functionality)


chrisrodrigue commented Jun 1, 2024

uv is a fantastic tool that is ahead of its time. In the same vein as ruff, it is bundling many capabilities that Python developers need into a single tool. It currently provides the capabilities of pip, pip-tools, and virtualenv in one convenient binary.

Python can't build out of the box

As of Python 3.12, which no longer ships setuptools and wheel in freshly created virtual environments, a user is unable to perform pip install -e . or pip install . in a local pyproject.toml project without pulling in external third-party package dependencies.

This means that in an offline environment without access to PyPI, a developer is dead in the water and cannot even install their own project from source. This is a glaring flaw with Python, which is supposed to be "batteries included."

uv can fix this

I propose that the uv binary expand its capabilities to also function as a build backend.

If uv could natively build projects from source, it would be a game changer!

@chrisrodrigue changed the title uv should provide a build backend Jun 1, 2024

potiuk commented Jun 1, 2024

I think that if uv could natively build projects from source, it would be a game changer.

I personally believe that this is pretty much against the whole idea of the modern approach of splitting backend vs. frontend responsibilities. The idea (philosophically) behind the backend/frontend split is that the maintainers of a project (via pyproject.toml) choose the backend that should be used to build their tool, while the user installing the project is free to choose whatever frontend they prefer. This is IMHO a game changer in Python packaging, and we should rather see a stronger push in that direction than a weakening of it. And I think it's not going to go back, because more and more projects will move to pyproject.toml and backend specification of the build environment. In the case of Airflow, for example, as of December there is no way to install Airflow "natively" without actually installing hatchling and the build environment. And we do not give the option to the frontend: you MUST use hatchling in a specified version, plus a few other dependencies in specific versions, to build Airflow. Full stop. uv won't be able to make its own choices (it will be able to choose how to create such an environment, but not what should be used to build Airflow).

But also maybe my understanding of it is wrong and maybe you are proposing something different than I understand.

BTW, I do not think access to PyPI is needed to build a project with a build backend. The frontend may still choose any mechanism (including private repos if needed) to install the build environment; no PyPI is needed for it. The only requirement is that pyproject.toml specifies the environment.


chrisrodrigue commented Jun 1, 2024

@potiuk

I agree and think the backend and frontend specifications should be separate. I am merely suggesting that uv could provide a build capability similar to its virtualenv capability (uv venv), so that building is possible for users who don't have a backend, perhaps via a project API such as uv install. I like the pdm approach of providing its own build backend (pdm-backend) and installing projects as editable by default inside the automatically managed .venv of the project when a user runs pdm install.

You do need access to PyPI, or some repo hosting the build backend, to build a project. Python does not include setuptools or any other build backend in the latest distributions (it used to include at least setuptools). pip downloads the build backend specified in pyproject.toml (or setuptools if none is specified) into an isolated environment and uses it to build Python projects from source. If --no-build-isolation is specified, it expects the build backend to already be available in the current system/virtual environment in order to build.

IMO, pip install [-e] . is broken out of the box because it relies on third-party dependencies which may not be accessible. In the stdlib we get argument parsing, unit testing, logging, and other amenities, but we can't perform the most fundamental action in the software development process: installing/building our own code.

Without the capability to build from source out of the box, Python is hamstrung and can only run the most rudimentary scripts, leaving users with multi-module projects to resort to ugly PYTHONPATH/sys.path hacks to get their packages/subpackages/modules found by the interpreter.
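For illustration, the kind of workaround being described usually looks like this (a sketch; the package and directory names are hypothetical):

```python
import sys
from pathlib import Path

def add_src_to_path(project_root: str) -> str:
    """Prepend <project_root>/src to sys.path so that `import mypkg`
    resolves without the project ever being installed -- the fragile
    hack that a bundled build backend would make unnecessary."""
    src = str(Path(project_root) / "src")
    if src not in sys.path:
        sys.path.insert(0, src)
    return src
```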


potiuk commented Jun 1, 2024

IMO, pip install [-e] . is broken out of the box because it relies on third party dependencies which may not be accessible. In the stdlib we get argument parsing, unit testing, logging, and other amenities, but we can't perform the most fundamental action in the software development process. Without the capability to build from source out of the box, Python is hamstrung and can only run the most rudimentary scripts, leaving users to resort to ugly PYTHONPATH/sys.path hacks to get their packages/subpackages/modules found by the interpreter.

But this is where the whole of Python packaging is heading. The packaging PEPs precisely specify that this is the direction, and if project maintainers choose so, then unless you manually maintain build recipes for all the packages out there (like conda does), you won't be able to build a project "natively".

Just to give you the example of Airflow: without hatchling and build hooks (implemented in hatch_build.py) you are not even able to know what Airflow's dependencies are, because "dependencies" and "optional-dependencies" are declared as dynamic fields, and (as mandated by the relevant PEP) there is then no specification of those dependencies in pyproject.toml. And we no longer have a setup.py either.

The only way to find out what dependencies Airflow needs for an editable build, or to build a wheel package, is to get the right version of hatchling and have the frontend execute the build hook; the build hook returns such dependencies dynamically. You can see it yourself here: https://github.com/apache/airflow/blob/main/pyproject.toml. There is no way to build Airflow from sources in current main without actually installing these packages:

    "GitPython==3.1.43",
    "gitdb==4.0.11",
    "hatchling==1.24.2",
    "packaging==24.0",
    "pathspec==0.12.1",
    "pluggy==1.5.0",
    "smmap==5.0.1",
    "tomli==2.0.1; python_version < '3.11'",
    "trove-classifiers==2024.5.22",

and letting hatchling invoke hatch_build.py. I am not sure what you mean by "native" installation, but you won't be able to install Airflow any other way.

And I think - personally (though I was not part of it) - that the decisions made by the packaging team were pretty sound and smart: they deliberately left the choice of the right backend packages (and set of 3rd-party tools) to the maintainers of a project, and all frontends have no choice but to follow it. I understand you might have a different opinion, but here the process of the Python Software Foundation and the packaging team is not a matter of opinion: they have the authoritative power to decide it by voting and PEP approval, and the only way to change it is to get another PEP approved.

Here is the list of those PEPs (and I am actually quite happy that Airflow, after 10 years, finally migrated off setuptools and setup.py by following those standards, as the tooling (including the modern backends you mentioned) has supported them for a sufficiently long time):

  • PEP-440 Version Identification and Dependency Specification
  • PEP-517 A build-system independent format for source trees
  • PEP-518 Specifying Minimum Build System Requirements for Python Projects
  • PEP-561 Distributing and Packaging Type Information
  • PEP-621 Storing project metadata in pyproject.toml
  • PEP-660 Editable installs for pyproject.toml based builds (wheel based)
  • PEP-685 Comparison of extra names for optional distribution dependencies


potiuk commented Jun 1, 2024

And BTW, if uv provides a build backend, you will still be able to choose it when you maintain your project, but it will also be a 3rd-party dependency :)

Someone installing your project with any frontend will have to download and install it according to your specification in pyproject.toml, similar to the hatch/hatchling pair, which are separate packages. Even if you use hatch to install Airflow, it still has to download hatchling in the version specified by the maintainer of the project you are installing and use it to build the package.


chrisrodrigue commented Jun 1, 2024

It's interesting that setuptools remains the "blessed" build backend that pip defaults to when no backend is specified in pyproject.toml. Rather than bake setuptools into the stdlib, it silently forces the download and installation of it as an unauthorized third-party build dependency. This also seems to violate the principle of "explicit is better than implicit."

One of the nice things about Astral tools like uv and ruff is that they do not bloat your package/library when you use them, and bundle the capabilities of many tools into single static binaries.

If you use uv specifically for its virtual environment capability, it's a single dependency. Conversely, if you were to use virtualenv, you've just pulled in three more transitive dependencies (distlib, filelock, platformdirs). Similarly, instead of using pylint, flake8, pyupgrade, black, isort, and all of their dependencies, you only need ruff. This has far-reaching implications for companies maintaining Software Bills of Materials (SBOMs) that need to manually vet each piece of FOSS in their software toolchains.

I think uv could provide a build backend that could be specified in pyproject.toml and used as the default when no backend is specified. It doesn't necessarily have to be separate from the main uv binary, since the uv binary could support the build capability as builtin uv/uv pip commands (uv install, uv pip install -e .).

[build-system]
requires = ["uv"]
build-backend = "uv.api"


potiuk commented Jun 2, 2024

I think uv could provide a build backend that could be specified in pyproject.toml and used as the default when no backend is specified. It doesn't necessarily have to be separate from the main uv binary, since the uv binary could support the build capability as builtin uv/uv pip commands (uv install, uv pip install -e .).

Well, there are always trade-offs, and what you see from your position might be important to you and not to others, and the other way around. For example: if (like we did with Airflow, being a popular package) you had gone through some of the pains where new releases of setuptools suddenly started breaking packages being built - deliberately or accidentally breaking compatibility - and you were suddenly flooded by hundreds of your users having problems installing your package without you doing anything, you'd understand that locking the build to a specific version of the build tooling is a good idea when you have an even moderately big package.

That's why I love that we as maintainers can choose and "lock" the backend to the tools and versions of our choice, rather than relying on the version of the tool our users choose behaving consistently over the years. That was a very smart choice by the packaging team, based on actual learnings from millions of users over many years, and while I often criticized their choices in the past, I came to understand that it was my vision that was short-sighted and limited. I learned a bit of empathy.

I'd strongly recommend a bit more reading and understanding of what they were (and still are) cooking there. Python actually deliberately removed setuptools in order to drive adoption of what's being developed as packaging standards (and that's a really smart plan that was laid out years ago and is being meticulously and consistently put in motion, step by step). And I admire the packaging team for that, to be honest.

What you really seem to want is not bringing back the old setuptools behaviour, but something else that the packaging team has already accepted and a number of tools are implementing: https://peps.python.org/pep-0723/, which allows you to define a small subset of pyproject.toml metadata (specifically dependencies) in single-file scripts. And this is really where, yes, any frontend implementing PEP-723 should indeed prepare a venv, install the dependencies, and run the script that specifies them.
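A PEP 723 script carries its metadata in a structured comment block; any compliant frontend (uv run, pipx run, etc.) reads it and prepares a throwaway environment before running the script. A minimal sketch (the shebang line is the uv-specific convention; dependencies are left empty so the sketch stays self-contained):

```python
#!/usr/bin/env -S uv run
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
import sys

def main() -> int:
    """Entry point; with dependencies listed in the header above, a
    PEP 723 frontend would install them into an ephemeral venv first."""
    print(f"running under Python {sys.version_info.major}.{sys.version_info.minor}")
    return 0
```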

Anything more complex, with real packaging needs (putting more files together), should really define its backend in order to maintain "build consistency"; otherwise you are at the mercy of tool developers who might change their behaviour at any time, and suddenly not only you but anyone else who wants to build your package will have problems with it.

But yes, if uv provides a backend to build packages, which you (as project maintainer) specify in your build dependencies, that is perfectly fine: just one more backend to choose from among the roughly 10 available today. And if, as a maintainer, you prefer to specify it in your project, you should be free to do so. But as a maintainer, I would never put faith in future versions of a specific frontend (including uv) continuing to build my package the same way.

BTW, a piece of advice: for that very reason, as a maintainer you should do:

[build-system]
requires = ["uv==x.y.z"]
build-backend = "uv.build"

Otherwise you never know which version of uv people have, and whether they have a version that is capable of building your package at all (i.e. an old version that has no backend).


potiuk commented Jun 2, 2024

Also, if you have a proposal for how to improve packaging, there are Discourse threads for that and a PEP could be written. So if you are strongly convinced that you can come up with a complete and better solution, please start a discussion there about a new PEP, propose it, lead it to approval, and likely help implement it in a number of tools. This is how standards in packaging are developed :)

@notatallshaw (Collaborator)

It's interesting that setuptools remains the "blessed" build backend that pip defaults to use when no backend is specified in the pyproject.toml. Rather than bake setuptools into the stdlib, it silently forces the download and installation of it as an unauthorized third-party build dependency. This also seems to violate the principle of "explicit is better than implicit."

FYI, I believe this is because the pip maintainers are very conservative when it comes to breaking changes, not because it is the intended future of Python packaging.

For example, the old resolver, which can easily install a broken environment, is still available to use even though the new resolver has been available for over 5 years and turned on by default for over 4 years.

@daviewales

A uv-aware build backend would enable private git packages to depend on other private git packages, and still be installed with pip, pipx, poetry, etc, without needing the package end user to change their tools. See #7069 (comment) for a detailed example. (Poetry packages work this way, as they use the poetry-core build backend.)


chrisrodrigue commented Oct 2, 2024

Just some more thoughts, notes, and ramblings on this.

Full stack uv

The build backend capability could be rolled into the uv binary itself as a feature rather than fractured off as a separate dependency.

This would give developers the option to utilize uv as either a frontend, a backend, or both, while still maintaining the single static dependency.

Developers electing to use uv as both frontend and backend could have access to some optimizations that might not be possible individually.

We could call this use case “full stack uv” since that’s usually what we call frontend + backend, right? 🤪

Full stack uv without a specified version

requires = ["uv"]
build-backend = "uv"

Backend uv wouldn’t need to be downloaded since uv can just copy itself into the isolated build environment.

Or, uv can special-case itself as a backend and do something even more optimized. This jibes with PEP 517:

We do not require that any particular “virtual environment” mechanism be used; a build frontend might use virtualenv, or venv, or no special mechanism at all.

Full stack uv with pinned or maximum version

requires = ["uv==0.5"]
build-backend = "uv"

PEP 517 build isolation can guarantee that frontend uv and backend uv of different versions do not conflict.

A build frontend SHOULD, by default, create an isolated environment for each build, containing only the standard library and any explicitly requested build-dependencies

However, a decision could be made to maintain backward compatibility for the backend, such that newer versions of uv could satisfy whichever version is declared.

On PEP 517 compliance

The backend feature could be PEP 517 compliant so that other frontends (like poetry, pdm, pip, hatch, etc.) can use uv as a backend, but this could be distinct from full stack uv.

A Python library will be provided which frontends can use to easily call hooks this way.

uv would need to expose some hooks in the build environment via Python API. The mandatory and optional hooks at the time of this writing are:

# Mandatory hooks
def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    ...

def build_sdist(sdist_directory, config_settings=None):
    ...

# Optional hooks
def get_requires_for_build_wheel(config_settings=None):
    ...

def prepare_metadata_for_build_wheel(metadata_directory, config_settings=None):
    ...

def get_requires_for_build_sdist(config_settings=None):
    ...

Perhaps a highly optimized, importable module or package named uv could be autogenerated by the uv binary at build time to satisfy these requirements? PEP 517 says it could even be cached:

The backend may store intermediate artifacts in cache locations or temporary directories.


hauntsaninja commented Oct 3, 2024

If it was up to me, and assuming the build backend isn't too complicated, I'd consider doing both:

  1. Have a uv-build package that provides a build backend
  2. The uv frontend detects this backend and special cases the hell out of it to avoid PEP 517/660 overhead.

uv-build can currently basically just depend on uv, but a separate package gives Astral the possibility of adding a pure-Python implementation of the build backend or shipping a slimmer build, since build dependencies having their own build dependencies is finicky business.


ofek commented Oct 3, 2024

Hatch does (2) with Hatchling.

@notatallshaw (Collaborator)

Slight correction: "flat" layout usually means not having a src directory but still having a package directory: https://packaging.python.org/en/latest/discussions/src-layout-vs-flat-layout/

This was in my head when I wrote my above comment; there may be some nuances to not having a package directory at all that I'm unfamiliar with. E.g. you probably need to specify exactly which files are considered part of the package so that, for example, you don't end up accidentally including the build contents in the package (and recursively doing so each time).


eli-yip commented Mar 21, 2025

Thank you for your clear guidance, but after I added __init__.py to the project root directory as instructed, I still encountered the same error as above.

If you just have a single python module, it's possible you don't need a build-backend or package at all and you can use this docs.astral.sh/uv/guides/scripts ?

Thanks for your advice. I might be facing an XY problem. My goal is to install the typer app from main.py using uv tool install . for global CLI access. It appears I need to build the package first, and when using uv_build as my backend, I ran into the issue. Any better suggestions?


nathanscain commented Mar 21, 2025

Honestly, it will just work if you use the standard src/package/__init__.py layout. Flat layout (package/__init__.py) can also work with configuration. Having just a top-level Python file isn't easily packageable as far as I'm aware, and any solution would certainly be non-standard for no discernible reason.

Summary:

  1. Move main.py to src/audio_connect/__init__.py
  2. Update your script to point to audio_connect:app
  3. Remove other packaging configuration
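Under those steps the pyproject.toml ends up roughly like this (a sketch; the script name matches the example above, and the uv_build version pin is omitted here, though pinning the backend is advisable for the reasons discussed earlier in this thread):

```toml
[project]
name = "audio-connect"
version = "0.1.0"

[project.scripts]
audio-connect = "audio_connect:app"

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"
```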


eli-yip commented Mar 21, 2025

Honestly, it will just work if you use the standard src/package/__init__.py layout. Flat layout (package/__init__.py) can also work with configuration. Having just a top level Python file isn't easily package-able as far as I'm aware, and any solution would certainly be non-standard for no discernible reason.

Summary:

  1. Move main.py to src/audio_connect/__init__.py
  2. Update your script to point to audio_connect:app
  3. Remove other packaging configuration

Thanks. I misunderstood “flat layout.” Turns out uv describes what I wanted in the docs: https://docs.astral.sh/uv/reference/cli/#uv-init--app

@nathanscain

Happy to help. Yes uv init --app --build-backend uv should do that if I remember correctly. (Requires UV_PREVIEW=1)


ofek commented Mar 22, 2025

I would also strongly advise users in general against implicit namespace packages, they have a lot of quirky behavior that can be difficult to reason about.

Implicit namespace packages are the recommended approach based on the official docs: https://packaging.python.org/en/latest/guides/packaging-namespace-packages/#creating-a-namespace-package

What issues have you had with their use? I personally have had fewer issues with them compared to the older approaches, and I just switched to them at work last week since we dropped Python 2 somewhat recently.


nathanscain commented Mar 22, 2025

Implicit namespace packages is the recommended approach based on the official docs: https://packaging.python.org/en/latest/guides/packaging-namespace-packages/#creating-a-namespace-package

That's not what that page says. It is recommending native namespace packages over legacy namespace packages. At no point does the linked page promote namespace packages over standard, non-namespace packages.

On the contrary, it specifically states that this type of packaging is only for when you want multiple, separate packages to install within the same import namespace when installed in the same environment - a concept that is both rare and "not appropriate in all cases" even when you have that objective (unlikely).

It is an easy mistake to make (I once thought __init__.py was effectively optional due to this), but no - namespace packages are not the norm, default, or recommended path for the vast, vast majority of Python packages.

Additionally, look and see that every other example on that site uses the standard non-namespace package method, and they even add this note:

Technically, you can also create Python packages without an __init__.py file, but those are called namespace packages and considered an advanced topic (not covered in this tutorial). If you are only getting started with Python packaging, it is recommended to stick with regular packages and __init__.py (even if the file is empty).
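The distinction is easy to see with the import system itself. A stdlib-only sketch that builds both kinds of package in a temporary directory (all names hypothetical):

```python
import sys
import tempfile
from pathlib import Path

def make_packages(root: Path) -> None:
    """Create a regular package (with __init__.py) and an implicit
    namespace package (without) side by side."""
    (root / "regular_pkg").mkdir(parents=True)
    (root / "regular_pkg" / "__init__.py").write_text("x = 1\n")
    (root / "ns_pkg").mkdir()              # no __init__.py at all
    (root / "ns_pkg" / "mod.py").write_text("y = 2\n")

root = Path(tempfile.mkdtemp())
make_packages(root)
sys.path.insert(0, str(root))

import regular_pkg  # regular package: backed by its __init__.py file
import ns_pkg       # PEP 420 namespace package: no backing file
```

A regular package has a concrete `__file__`; a namespace package does not, which is one source of the tool confusion described in this thread.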


potiuk commented Mar 22, 2025

@notatallshaw

I would also strongly advise users in general against implicit namespace packages, they have a lot of quirky behavior that can be difficult to reason about.

@nathanscain

That's not what that page says. It is recommending native namespace packages over legacy namespace packages. At no point does the linked page promote namespace packages over standard, non-namespace packages.

On the contrary, it specifically states that this type of packaging is only for when you want multiple, separate packages to install within the same import namespace when installed in the same environment - a concept that is both rare and "not appropriate in all cases" even when you have that objective (unlikely).

It is an easy mistake to make (I once thought __init__.py was effectively optional due to this), but no - namespace packages are not the norm, default, or recommended path for the vast, vast majority of Python packages.

@ofek

What issues have you had with their use? I personally have had fewer issues with them compared to the older approaches and just switched to them at work last week (DataDog/integrations-core#19826) since we dropped Python 2 somewhat recently.

My comment: after a huge refactor of Airflow, where we use both kinds of namespace packages (implicit and legacy, unfortunately mixed due to historical reasons for now), I must agree with @notatallshaw and @nathanscain. Implicit namespace packages introduce ambiguity, especially when various tools (starting with mypy, but also pytest, various IDEs, and a number of others) are trying to determine the root package to import things from. You have to tell them explicitly which "base" folders to start importing from, otherwise weird things happen. When you press "Cmd + Enter" in IntelliJ on a missing import, it will sometimes propose semi-random choices, or it might import things from folders that were never meant to be Python packages. This is especially true when you have workspaces with multiple Python packages in the same monorepo, where current tooling is not yet well adapted to understanding where such "root" folders are.

This is especially problematic where you have implicit namespace packages in tests, and it leads to things like this: https://github.com/apache/airflow/blob/main/pyproject.toml#L554 - where we had to explicitly list the "src" and "test" root folders for all 100 packages we have in Airflow's monorepo. And in order to get proper imports in the IDE, you have to explicitly mark the "tests" folders as "Test Sources Root" (unlike "src", the "tests" folders are not recognized by IDEs as root folders, even if you use a uv workspace).

That's one of the reasons we had to use legacy namespace packages all over the project: to keep away from the ambiguities that implicit namespaces introduce.

I hope the tooling will get a bit better at handling implicit namespaces in the future, but I do agree with the statement that __init__.py is here to stay, and you should use implicit namespace packages only when you really need them (like we do in Airflow, due to historical choices where various distributions share the same "airflow" package namespace). I wish we had made better decisions to separate those namespaces in the past, but some of the other maintainers feel very strongly that all our distributions should share the same common "airflow" import, so we have to deal with the complexity.

@notatallshaw (Collaborator)

I appreciate we are going a little off topic here, but for anyone interested in a (long) tutorial on the import system in Python, including a good diversion into implicit namespaces, I strongly recommend David Beazley's Modules and Packages: Live and Let Die!

I watched it around when it first came out, and watched it again about a year ago, and this time I immediately enabled INP001 on all my work projects.


merwok commented Mar 22, 2025

Move main.py to src/audio_connect/__init__.py

Can a uv developer confirm or deny whether uv supports modules (not packages)?
(regardless of usage of top-level vs src directory)

@charliermarsh (Member)

I don't fully understand the question. "Supports" in what sense? (You might be better off asking for help in the Discord if you have a specific question unrelated to this issue.)


merwok commented Mar 22, 2025

Someone gave advice to turn a module into a package. Is that required by uv, or does it support packaging simple modules?


nathanscain commented Mar 22, 2025

Technically, it would still be a module after that file move - it would just be a module within a package (which was the objective).

I was mostly just trying to give a "this is what is standard and will just work" piece of advice. There isn't anything I'm aware of in the spec that would prevent a single python file being at the root of a wheel (for example), but it would certainly be nonstandard without much of a reason to be.

I can't say uv doesn't support that because I've never tried (though I imagine it is build-backend dependent), but note that the uv --package flag, which adds a build backend, will also set up a standard src package layout.


merwok commented Mar 22, 2025

Let’s stop this side discussion, but I strongly disagree with your viewpoint.
Python modules that are not inside a package (import package, not a distribution) are perfectly fine and valid.

@nathanscain

Let’s stop this side discussion, but I strongly disagree with your viewpoint.
Python modules that are not inside a package (import package, not a distribution) are perfectly fine and valid.

I agree - last message on the subject.

That wasn't my viewpoint at all. Please reread. Of course Python modules are valid outside of a package, but - as the user was wishing to install it as a tool with uv - it was necessary for the module to be distributable due to the nature of how uv tool install works (creating a venv and installing the referenced package within it). There are other ways to make a module globally executable (combining shebang, uvx, and inline metadata comes to mind), but that wasn't the stated objective.

And finally, I specifically never even said such a construction (top level module placed directly within a distribution artifact) would be invalid - just that it would be non-standard without cause/need and would require a build-backend that supported it (which I am not aware of and couldn't immediately find when I went searching before my response).


ofek commented Mar 22, 2025

I specifically never even said such a construction (top level module placed directly within a distribution artifact) would be invalid - just that it would be non-standard without cause/need and would require a build-backend that supported it (which I am not aware of and couldn't immediately find when I went searching before my response).

Hatchling can do that easily, and shipping files outside of the package directory is quite common in the case of extension modules. For example, take a look at what Mypyc produces: https://pypi.org/project/black/#files

@nathanscain

... so it was:

  • valid in the spec ✅
  • build-backend dependent to use ✅
  • non-standard (requires additional configuration buried in the explicit selection of wheel contents and hooks) ✅
  • irrelevant to the stated objective where the user is specifically using the uv_build backend for pure python code (not an extension module) ✅

I really don't get why what I said is so controversial? uv's own defaults go with the standard layout (as does hatchling's). We had a case of a user wanting to install a module as a uv tool, so I recommended packaging said module with the default and universally supported method. I'm three hours into the video that was posted earlier in this thread and it is obvious that Python will let you do anything when it comes to packaging - that doesn't mean those options should be recommended for these basic cases.

@eli-schwartz
Copy link

@nathanscain,

Python packages, such as {root}/src/modname/, are just as nonstandard and build-backend-specific as {root}/modname.py. Which is to say, not very. Every build backend supports the latter, except maybe uv. At least flit and setuptools will autodetect modname.py using the default rules. I am genuinely baffled that anyone could believe otherwise.

Claiming that src/ is "the standard" layout seems like a pretty hot take to me as two of the most significant build backends of all time either refuse to get into fights about which is better or go out of their way to warn you that it has "advantages and disadvantages" and that you should consider both the advantages and the disadvantages and decide whether it fits you well.

This conversation is... surreal to me. Half the time I cannot tell whether people in this thread are debating {root}/modname.py or {root}/main.py installed, via backend file-renaming rules, as {site-packages}/modname.py. The latter case seems to me like something that would have to be very advanced usage indeed; in particular, it should require user-defined build rules, e.g. setuptools overriding the build_py cmdclass, or meson copying via fs.copyfile() and installing the result.

(But the actual pyproject.toml which was posted and which used "main.py" also declared the script name as audio-collect = "main:app" so it seems plain the intent was to install an import main module. This is a terrible idea because the name can trivially conflict with software you didn't intend, and it's possible to have a discussion about the dangers of this without berating people for doing extremely common and well-supported things. For much the same reason that nobody should install an import tests.)

And finally, I specifically never even said such a construction (top level module placed directly within a distribution artifact) would be invalid - just that it would be non-standard without cause/need and would require a build-backend that supported it (which I am not aware of and couldn't immediately find when I went searching before my response).

You cannot possibly have checked very hard!

https://setuptools.pypa.io/en/latest/userguide/package_discovery.html#single-module-distribution

There is also a handy variation of the flat-layout for utilities/libraries that can be implemented with a single Python file:
single-module distribution

A standalone module is placed directly under the project root, instead of inside a package folder:

Very clearly called out in the docs for the single most significant build backend of all time, bar none.
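Concretely, the zero-config case from the setuptools docs amounts to something like this (a sketch; `modname` is a placeholder name):

```toml
# Project layout (single-module distribution):
#   myproject/
#   ├── pyproject.toml
#   └── modname.py          <- standalone module, no package folder
#
# pyproject.toml -- setuptools autodetects modname.py with no extra config
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "modname"
version = "0.1.0"
```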

it is obvious that Python will let you do anything when it comes to packaging - that doesn't mean those options should be recommended for these basic cases.

Your opinion appears to be directly in contradiction with the authors of an actual build backend which is extremely widely used, who claim that this option is an excellent example of a simple use case important enough to be documented for encouragement and quick access using a zero-config pyproject.toml and autodetection.

I really do not understand why after berating people and telling them that the idea "isn't supported by anything" and being corrected, you feel the need to switch gears and start berating people for doing something "unrecommended" and "not even an extension module" and "requires explicit configuration".

Look, we get it, you hate modules and believe support for them should be removed from the cpython interpreter (we should start off by removing all the ones in https://github.com/python/cpython/tree/3.13/Lib such as os and subprocess but I digress). But maybe wait until build backend developers say they aren't supporting it before jumping down people's throats about use cases you apparently haven't even researched.

Thanks. :)

@eli-schwartz
Copy link

@nathanscain,

That's not what that page says. It is recommending native namespace packages over legacy namespace packages. At no point does the linked page promote namespace packages over standard, non-namespace packages.

On the contrary, it specifically states that this type of packaging is only for when you want multiple, separate packages to install within the same import namespace when installed in the same environment - a concept that is both rare and "not appropriate in all cases" even when you have that objective (unlikely).

It is an easy mistake to make (I once thought __init__.py was effectively optional due to this), but no - namespace packages are not the norm, default, or recommended path for the vast, vast majority of Python packages.

I mostly agree that namespace packages are an advanced use case which should never be unknowingly done.

It is also worth remembering that namespace packages are slower than normal packages, as they have to keep searching the entire import path just to check whether someone has taken advantage of the defining purpose of namespace packages. You're basically shooting yourself in the foot if you use them without having a very specific reason to need them.

However please do note that the sole use case for namespace packages is when they are not installed in the same site-packages directory, i.e. "environment" but exist in stacked environments. If you install all packages to the same virtualenv it technically works fine, since they get tree-merged and are indistinguishable from a single wheel with multiple sub-packages. Which __init__.py file gets installed last and overwrites the others is unspecified by pip, and non-pip tools require the downstream packager (e.g. linux distros) to select one, usually the "parent" package, and delete all the other __init__.pys to prevent fatal errors when conflicting files are noted.

(Arguably if you only care about virtualenvs, you do not need namespace packages, ever. Just install them on top of each other and let them clobber each others' files. pip doesn't care, any package is allowed to hijack another package's files whenever it wishes.)
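The stacked-environments behavior described above can be demonstrated with a small self-contained sketch (plain CPython, PEP 420 native namespace packages; the directory and module names here are made up):

```python
# Minimal sketch of PEP 420 "native" namespace packages: two separate
# path entries contribute submodules to the same `pkg` namespace.
# Crucially, neither location contains a pkg/__init__.py.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for dist, sub in [("dist_a", "shared"), ("dist_b", "testing")]:
    subdir = os.path.join(root, dist, "pkg", sub)
    os.makedirs(subdir)
    # each distribution ships only its own subpackage
    with open(os.path.join(subdir, "__init__.py"), "w") as f:
        f.write(f"NAME = {sub!r}\n")
    sys.path.insert(0, os.path.join(root, dist))

from pkg import shared, testing  # the two halves merge at import time

print(shared.NAME, testing.NAME)  # prints: shared testing
```

If both subpackages were instead installed into the same site-packages directory, the result would be indistinguishable from a single regular package, which is the tree-merging behavior described above.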

@nathanscain
Copy link

nathanscain commented Mar 23, 2025

Don't want to dive into this super hard again.

  1. The case at hand was someone having a uv init --app layout, to which I simply suggested moving to the uv init --package layout. I don't recall coming across someone doing this in production in my time reviewing our dependencies. I didn't say that I looked hard - I did a quick search, not much came up, so I chose to help the person back to (close to) what they would have been given if they had specified --package from the start (so that it would "just work").

  2. The cited example that I supposedly should have seen is a single heading and two sentences before diving into custom discovery. You caught me - I didn't know it existed, because it isn't very common and, in my experience, is almost never what someone means by "flat layout" - only a variation of it. I wasn't claiming that src was better than flat; if flat were the default, I would have suggested a move to that with a top-level directory. And obviously the goal was the thing we agree on - renaming the main.py. I wasn't berating people for doing standard things; I'm not even sure what that is aimed at, as I was only trying to help the original person go from broken to working in as few steps as possible - for which he seemed grateful - and made no comment on anyone else's approaches within their own projects. The example that was shown to me was not zero-config at all (I eagerly went to look to learn something). I don't know why you put quotes around "isn't supported by anything" - I didn't say that, just that I couldn't immediately find mention of it and hadn't seen it - but yes, it's valid and in-spec. I don't hate modules, obviously. I haven't been jumping down anyone's throat. All I've been doing is responding when others claim I said things I didn't say, while my daughter was napping, eating, etc.

  3. As far as personal growth, thank you for showing me that variation - I hadn't seen it before. Much of this could probably have been avoided if I had said "common" instead of "standard" in a few places, so I'll own that. Precision of language is always important where the connotation can be lost. It is clear that that layout is supported by the standard implementations (rather than just the spec), as you show. My initial response to merwok shows that I completely misunderstood his query, as I wasn't aware of this variation of the flat layout. My response was simply meant to say that I put it in one of the two common layouts - looking back, it appears they were asking about uv_build's support for this case, which was unknown to me. I'll leave it to that team to answer, as I still can't speak to it.

Goodnight everyone. Apologies to anyone who felt I attacked them - was never my intention and I hope you will forgive.

@eli-yip
Copy link

eli-yip commented Mar 23, 2025

Thank you all for your replies and assistance! I sincerely apologize if I’ve veered off-topic at any point.

I’m a Python beginner with no prior experience in Python packaging (whether with setuptools or hatchling) before writing the script (or package) mentioned above. My understanding of Python modules is also quite limited.

Based on the response here (#3957 (comment)), I tried renaming main.py to audio_collect.py (without adding an __init__.py file). I found that I could successfully build it with setuptools and install/use my CLI program with uv tool install .. However, uv build doesn’t seem to support this approach.

Could I ask if uv build plans to offer similar support for single-module distributions (as described in https://setuptools.pypa.io/en/latest/userguide/package_discovery.html#single-module-distribution)?

Additionally, for a simple case like mine—using a flat layout and wanting to install and use a single module from the project root directory as a package—are there any best practices? I’d prefer to use uv build rather than other build backends.

@charliermarsh
Copy link
Member

Yes, I expect us to support single-module distributions, but I'd appreciate it if you could file a separate issue for that functionality.

@notatallshaw
Copy link
Collaborator

notatallshaw commented Mar 25, 2025

Sorry if this has already been addressed but reading this comment:

and uv init --build-backend uv use constraints by default, e.g., requires = ["uv_build>=0.6.9,<0.7"]. I think this could be normalized.

It made me think about the scenario of installing a uv_build-backed sdist with another Python tool. If I publish a pure Python package today with build requirements "uv_build>=0.6.9,<0.7", will someone in a few years be able to install the sdist using pip on Python 3.16?

I'm assuming not, because there won't be a built wheel for "uv_build>=0.6.9,<0.7" that is compatible with Python 3.16? And building an old uv_build from source for a new version of Python is also likely to fail.

So, if using uv_build means that users generally shouldn't rely on sdists when installing via other front-end installers and should rely on wheels instead, that should be documented as a known limitation of uv_build. The recent setuptools breaking change showed that a lot of users seemingly still depend on sdists, for whatever reasons.

@konstin
Copy link
Member

konstin commented Mar 25, 2025

We're intentionally keeping uv_build small and compatible, hopefully making it forward compatible, especially on a short horizon such as Python 3.16. In that setup, the build backend should not be the bottleneck to using a library with a more modern Python version.

@adamcik
Copy link

adamcik commented Apr 6, 2025

How is this supposed to work with building namespace packages? Looking at the options mentioned in #8779 I tried configuring module-root = "src" and module-name = "pkg" which I would expect to be the "right" way of doing this.

Options

tool.uv.build-backend.module-root

The directory that contains the module directory, usually src, or an empty path when
using the flat layout over the src layout.

tool.uv.build-backend.module-name

The name of the module directory inside module-root.

The default module name is the package name with dots and dashes replaced by underscores.

Note that using this option runs the risk of creating two packages with different names but the same module names. Installing such packages together leads to unspecified behavior, often with corrupted files or directory trees.

But this just results in the following error:

uv sync --frozen
  × Failed to build `pkg-shared-testing @ file:///home/.../pkg`
  ├─▶ The build backend returned an error
  ╰─▶ Call to `uv_build.build_editable` failed (exit status: 1)

      [stderr]
      Error: Expected an `__init__.py` at: `src/pkg/__init__.py`

      hint: This usually indicates a problem with the package or the build environment.

But as I understand namespace packages I should not have src/pkg/__init__.py, but only e.g. src/pkg/shared/testing/__init__.py which is the root of this package.

For now I can just keep using a non-uv build system, or try, as suggested in #3957 (comment), to just have pkg/__init__.py in each package in our monorepo, as they all end up in the same venv anyway.

But it would be good to either have this explicitly listed as not (yet) supported, or to have proper guidance on how to configure things with namespace packages, instead of having to run into the error like I did. Thanks!
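For reference, the configuration attempt described above amounts to roughly the following (a sketch reconstructed from the options quoted; the project name comes from the error output, and the version pins are illustrative):

```toml
[project]
name = "pkg-shared-testing"
version = "0.1.0"

[build-system]
requires = ["uv_build"]
build-backend = "uv_build"

[tool.uv.build-backend]
module-root = "src"
module-name = "pkg"  # uv_build then requires src/pkg/__init__.py,
                     # which a namespace package deliberately omits
```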

@merwok
Copy link

merwok commented Apr 7, 2025

You’re trying to package pkg.shared.testing, so try this as module name, not pkg

(Not a uv user and haven’t checked the docs – in general, don’t expect tools to support namespace packages unless they say so)

@adamcik
Copy link

adamcik commented Apr 7, 2025

I already tried that, just forgot to mention it. That didn't pass the validation for the option, so it didn't work. But thanks for the suggestion.

And just to be clear, I'm not expecting help debugging my concrete issue here; that should probably go in a discussion, or wherever the maintainers want to redirect such things. I was just hoping to get clarification on whether namespace packages are within scope now, or perhaps later.
