
Requirements files need to be rethought #2647

Open
rocco8773 opened this issue Apr 25, 2024 · 2 comments
Labels: CI (Related to continuous integration) · nox (Related to the nox automation software) · packaging (Related to packaging or distribution)

Comments


rocco8773 commented Apr 25, 2024

There are numerous issues with the current structure of the requirements files:

  1. There is no documentation on when or how to regenerate the files.
  2. There is no documentation on the intended purpose of each requirements file.
  3. There are no breadcrumbs telling the developer to regenerate the requirements files after pyproject.toml is modified.
  4. Requirements files are purpose-built for CIs (not an issue) but at the expense of developer use (an issue).
  5. When uv generates a requirements file, it adds OS-specific packages to that file by default. So, if you generate the file on a Windows machine without the --no-deps flag, then all associated CIs and RTD runners will fail. Again, this is not documented anywhere.
  6. IDEs often use requirements.txt for inspection, so a frozen set of dependencies is a nuisance for the developer. Since the main requirements.txt is a frozen set of dependencies for a given Python version, it is not a good reference file for IDEs when the developer is using a different Python version. It would be better to have a flexible dependency file for developers.
  7. I don't think a frozen set of requirements is ideal for the RTD builds, since that would require us to do a dedicated PR on the plasmapy side every time we update plasmapy_sphinx.

Possible Solutions:

  1. Restrict the requirements/ directory to CI-focused requirements only.
  2. Add breadcrumbs to pyproject.toml telling the developer to update the CI-specific requirements.
  3. Have the GitHub Actions self-generate the requirements files so we offload the burden from the contributor.
  4. Add instructions to the documentation for building requirements for CIs and for personal developer use.
  5. Reserve requirements.txt for the developer and add it to .gitignore so the developer can create a requirements.txt for their own development environment. (This would be done in combination with the above item.)
  6. Instead of the above, have a manually created requirements.txt that mirrors pyproject.toml, like we used to have. This is not ideal from a maintainer perspective, but it offloads a lot of burden from the contributor side.
  7. Update .readthedocs.yml so it installs dependencies like plasmapy[docs] instead of from a requirements file. I'm doing this right now in PR Remove plasmapy_sphinx #1643. (Currently this does not appear to affect RTD build time.)
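For option 7, a minimal .readthedocs.yml sketch using the Read the Docs v2 config schema (the OS image and Python version below are illustrative, not taken from the actual PlasmaPy config):

```yaml
version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.12"

python:
  install:
    # Install the package itself plus its docs extra, resolving
    # dependencies from pyproject.toml rather than a pinned file.
    - method: pip
      path: .
      extra_requirements:
        - docs
```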
@namurphy namurphy added CI Related to continuous integration packaging Related to packaging or distribution labels Apr 29, 2024

namurphy commented May 1, 2024

> There is no documentation on when or how to regenerate files

> There are no breadcrumbs telling the developer to regenerate the requirements files after the pyproject.toml is modified.

Very important points! Addressing this in #2650. 🔧

> When uv generates a requirements file, it adds OS-specific packages to that file by default. So, if you generate the file on a Windows machine without the --no-deps flag, then all associated CIs and RTD runners will fail. Again, this is not documented anywhere.

📝 uv can now perform multi-platform and multi-python resolution. We could perhaps bake this into the tox environment which regenerates requirements files. If we switch over to nox from tox (e.g., #2654), we could set it up so that the correct requirements file is chosen smartly from the ones that were generated. 🤔
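For reference, a command-line sketch of uv's platform-independent resolution (the `--universal` and `--python-version` flags exist in recent uv releases; the output path is illustrative):

```shell
# Resolve dependencies once, in a way that is valid across operating
# systems and across Python versions >= the stated lower bound, so the
# generated file works on Windows, Linux CI runners, and RTD alike.
uv pip compile pyproject.toml --universal --python-version 3.10 \
    -o requirements/requirements.txt
```

Note that `--universal` writes environment markers into the output rather than fully pinning per platform, which is a different trade-off from generating one pinned file per OS/Python combination.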

> IDEs often use requirements.txt for inspection, so a frozen set of dependencies is a nuisance for the developer. Since the main requirements.txt is a frozen set of dependencies for a given Python version, it is not a good reference file for IDEs when the developer is using a different Python version. It would be better to have a flexible dependency file for developers.

🏕️ Since IDE support for pyproject.toml has improved considerably in the last few years, I've been wondering if we could drop the top-level requirements.txt completely. We no longer use requirements.txt in CI (except for generating a hash used in .tox cache invalidation). I believe the only other remaining place is that we recommend using pip install -r requirements.txt in the contributor guide, though that could perhaps be replaced with suggesting editable installations. I'm thinking about doing this via a follow-up to #2650.
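The editable-installation recommendation could look something like this (the extras names are illustrative; the real extras are defined in pyproject.toml):

```shell
# Install PlasmaPy in editable mode, resolving unpinned dependencies
# directly from pyproject.toml instead of -r requirements.txt.
pip install -e ".[docs,tests]"
```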

> I don't think a frozen set of requirements is ideal for the RTD builds, since that would require us to do a dedicated PR on the plasmapy side every time we update plasmapy_sphinx.

⚠️ My concerns about using unfrozen (melted?) requirements for RTD builds are that (1) we will have spontaneously appearing build failures, (2) the RTD builds will be done in a different environment than the regular CI doc build, and (3) it will be more difficult to reproduce doc builds from eons past. If we use a pinned environment for the doc build and check out the repo from a year ago, we'd have a much higher likelihood of being able to reproduce the doc build with the tox environment. Since we have a tox environment and a GitHub workflow to regenerate the requirements files, it's become less effort to update a single requirement.

Possible solutions

> Restrict the requirements/ directory to CI focused requirements only.

> Have the GitHub actions self generate the requirements files so we offload the burden from the contributor.

This is currently what we do. The one thing I'd change would be to have the PR to update pinned requirements be merged automatically if all required tests pass, but I couldn't figure that out.

> Reserve the requirements.txt for the developer and add it to .gitignore so the developer can create a requirements.txt for their own development environment.

👍🏻 Sounds good to me! We'd probably want a comment in .gitignore to explain why requirements.txt is being ignored by default. 🪧
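That .gitignore entry might read something like this (wording illustrative):

```gitignore
# requirements.txt is intentionally untracked so that each developer can
# generate a frozen file for their own platform and Python version.
# CI-facing pinned requirements live under requirements/ instead.
requirements.txt
```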


namurphy commented May 10, 2024

> There is no documentation on when or how to regenerate files

uv pip compile has a --custom-compile-command flag which lets us include the command we use to regenerate requirements files in the headers of the resultant requirements files. We now have it set to nox -s requirements in the requirements session in noxfile.py, so each requirements file now says how to regenerate it.
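A sketch of how that can be wired up (the flag is real; the file paths are illustrative):

```shell
# Embed the project's own regeneration command in the header comment of
# the generated file, instead of the literal uv invocation, so anyone
# opening the file sees how to refresh it.
uv pip compile pyproject.toml \
    --custom-compile-command "nox -s requirements" \
    -o requirements/requirements.txt
```

The header comment of the resulting requirements file then points readers at `nox -s requirements` rather than the raw `uv pip compile` command line.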
