Migration of some metadata into PEP 621-compliant pyproject.toml #409
base: master
Conversation
Is there any compelling advantage to switching from cfg to toml? Particularly, the toml support is relatively new, so that would make it more complex to use in Linux distros that don't yet have that version of setuptools packaged. It seems like quite a bit of trouble for no real gain IMO.
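For readers unfamiliar with the two formats, here is a rough sketch of what the switch amounts to. The values below are illustrative placeholders, not PGPy's actual metadata:

```toml
# setup.cfg keeps this metadata in INI-style [metadata]/[options] sections that
# only setuptools understands; PEP 621 moves it into a [project] table in
# pyproject.toml that any packaging tool can parse. All values are examples.
[project]
name = "PGPy"
description = "Pretty Good Privacy for Python"   # illustrative description
requires-python = ">=3.6"                        # assumed floor, not verified
dependencies = [
    "cryptography>=3.3.2",                       # example bound, not authoritative
]
```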
It's their problem. If they package a new version of PGPy, they can probably package a new version of
You are right. Updating mature libraries to "new shiny shit" is always a bit of trouble for no real gain. So feel free to postpone merging of this PR until you perceive the troubles associated with it have decreased enough.
TBH I was thinking more of the users of those distros, than the developers of those distros.
Although this also seems clueless. You do realize that "manually" running
... The thing is that PyPA have made several good arguments in favor of PEP 517/518 (and several bad ones!) but none of this really has to do with "unique recipes" and the avoidance thereof. So I'm not sure where you got this from.
For the users it should not be a problem at all, as long as the Python version shipped in their distro is supported: they just have to use pip. It's the package builders who have most of the trouble, integrating pip properly into their tools for package build automation.
That's why I have to add
The last time I tried to integrate installation of wheels into my metabuild system (BTW, my interest in this lib was driven basically by the need to replace
force-pushed from c46ee5f to 77c399d
We actually host several package builds for PGPy in this repo. If you take a look at these branches:
you’ll find that pip should not be needed to build PGPy into a distribution package at all anyway. Library dependencies at the OS level should be packaged separately and individually so there’s no need to resolve dependencies using pip.
Outside of that case, most of the other issues you raise here can be mitigated by using a virtualenv instead of mixing pip and OS-managed packages. Using virtualenvs avoids a host of potential issues, and at least as of right now it is the recommended way to install packages with pip, including this one.
I looked into them and found that it is more or less what I meant by "non-automated". For example, the list of
I'd argue that it is the current approach, but IMHO it is not the correct one in the long term, so IMHO the verb
But I recognize that it is the current approach, so some tooling should be created for it. But the source of truth should not be the recipes for building packages for package managers, but the native package of the Python ecosystem - a wheel.
venvs are prescription drugs, and using them where they are not really needed (I know only one use case where usage of venvs is justified: testing software in controlled environments without uninstalling/downgrading packages in the main system) is just hiding dirt and garbage under the carpet, and it can become addictive to "solve" problems this way. The same goes for so-called "self-sufficient containers" used for distributing apps: snap, flatpak and sometimes Docker. I'm not the only person thinking like this. For example https://habr.com/ru/post/433052/ (it's in Russian, but you can try machine translation tools; though the points written there are obvious, it is nice that
force-pushed from 77c399d to 9ef16ae
As a matter of curiosity, what experience, if any, do you have in distro tooling? I ask because it's generally accepted among all distros that you need to know the dependencies in order to set up the build, before you download the source code. You can't have one build dependency that gets run as a program to figure out the other build dependencies. That doesn't mean there aren't tools to automatically transcribe the build dependencies from a pyproject.toml into a debian control file or Gentoo ebuild or Arch PKGBUILD or void template or rpm spec or what have you -- because those tools definitely do exist, and you may even be unknowingly looking at the output of one. Think of it as a lockfile in a different format.
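Concretely, what those transcription tools read is the static `[build-system]` table of `pyproject.toml`, which can be parsed without executing any Python code. A sketch, with the caveat that the exact bounds are assumptions rather than what this PR pins:

```toml
# Build dependencies a distro tool can transcribe into a debian control file,
# an ebuild, a PKGBUILD, an rpm spec, etc.  The lower bound is an assumption:
# setuptools gained PEP 621 support in the 61.x series.
[build-system]
requires = ["setuptools>=61", "wheel"]
build-backend = "setuptools.build_meta"
```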
I stay away from it as much as possible. I prefer using universal tooling like CPack.
It just feels incorrect. It is not the one who makes a package who decides which deps a package must have; it is the one who writes the software who decides which deps his software uses and which workarounds he can apply to support legacy versions of deps, and whether that is at runtime or at compile time. If there are no deps in the distro, there are 3 options: find a repo with them and fetch them from it (for example
If the build tools used in a distro are a highly inconvenient, unusable piece of s...oftware, this means we have to use our own build tools (that doesn't exclude using distro-specific build tools, we can wrap them). This is the idea behind how
Conventional usage trumps feelings. Conventions are not conventional for no reason.
It seems like you have some fundamental misunderstandings here.
To you.
Package maintainers don't "decide" anything here: they either understand how to make it work, i.e. supply the correct dependencies specified by the software maintainers, or they don't, and they build a broken package.
Why should it be my responsibility to ensure that my software runs on some random distro I neither use nor intentionally support? The reason distribution package maintainers are responsible for maintaining distro-specific packages is because the vast majority of distro-specific incompatibilities are caused by idiosyncrasies created by those very same maintainers. It's their responsibility to manage that, not mine. Do these patch stacks get misused? Does it cause problems occasionally? Yes. But most packages don't need that kind of work, and honestly it's a shitty reason to think you know better than the debian repo maintainers, especially when you freely admit that not only have you never done that type of work, but you actively avoid it. If you never do something, your opinions on its intricacies are therefore necessarily uninformed and not valuable. But that's all beside the point: distro-specific patches aren't needed for this package and chances are they never will be, so railing against that of all things is highly suspect as a strawman argument.
…d` or any other `PEP 517` frontend.
Moved all the "requirements*.txt" into `pyproject.toml` under extras.
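A sketch of what that mapping might look like; the extra names and the packages listed here are placeholders rather than the exact contents of this PR:

```toml
# Hypothetical translation of requirements-*.txt files into PEP 621 extras.
# Extra names and package lists are illustrative only.
[project.optional-dependencies]
docs = ["sphinx"]
test = ["pytest", "pytest-cov"]
```

A consumer could then pull them in with the usual extras syntax, e.g. `pip install "pgpy[test]"`, assuming an extra with that name exists.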
There are 3 options resulting in working packages of the latest versions:
There is another option, resulting in broken software: the upstream says it is not his responsibility, distro maintainers say that they don't want to do their job of updating "shiny new shit", that they don't owe anyone anything, that the idiosyncrasies are needed and they won't get rid of them. No one is responsible for the software working (and a version that is not the latest is considered "unworking" and "broken", because a non-latest version doesn't have the latest features, and a missing feature is one of the cases of a broken feature).
This info is encoded in metadata files shipped by upstream. If they don't understand how to make use of these files properly, they can build a broken package.
Because if your software doesn't support a target platform, this means your software can be unusable there. Of course one can ignore any platform one wants. Even a big one. Even Micro$oft Windows. Even Googlag Chrome (I do it in my WebExtensions; if one uses Chrome, he is worthless). But not supporting a platform means that the software likely won't be used there. Usually one wants to support as many platforms as possible. And as many distros as possible. It is usually not the distro that wants to get your software as a package; it is you who wants the distro to have a package for your software. Distros usually don't care about most software and most "shiny new shit". If they care, it is usually because they need it, for example to deliver updates for the distro itself. If they need a feature from a new version of a lib, then there will be a package. If they don't, they can choose to save man-machine-hours and do nothing. The only escape from this nasty situation is hardcorely automating build processes in order to keep people out of the process as much as possible. Ideally the machinery should be capable of providing packages for working latest versions of software for decades without any human modification of recipes on the distro side.
It is good.
The arguments were.
It is the distro's responsibility to keep packages within the distro updated to the latest versions. It shouldn't be a burden for them to update them even on each commit if they automate the process properly. If they don't, they should blame themselves.
force-pushed from 9ef16ae to bf12ed4
I'm slightly confused why this has now become a discussion about how "hardcore" the project developers are?
Why do you keep assuming that people don't know how to make use of these files? These files are format-shifted from a Python API function call into the native format of the meta-build system. The process works, is automatable, and doesn't have any of the problems you seem to imply.
Automation without human oversight is indescribably foolish. We're discussing setuptools, so surely setuptools can serve as the best example here -- well, guess what, setuptools changes a lot these days, and also keeps breaking, then getting fixed. Have you been watching recent discussions around the future of distutils? Know anything about numpy.distutils? The scientific ecosystem relies rather heavily on that, and setuptools 60+ is outright unsupported and won't be getting support. It's in deep maintenance mode, while alternative PEP 517 build backends are explored. So yeah, various groups -- maybe distros, but not only distros -- have maximum versions of setuptools pinned, because automatically using the latest version is actually proven to be known broken. This is also why stable distros like Debian exist, and don't upgrade anything, not even pgpy -- but users might upgrade pgpy while still using the system setuptools. I do not understand why you're so dismissive of this entire line of thought that you build mysterious strawman arguments blaming everything imaginable for a multitude of shortcomings, many of which weren't mentioned before. Be that as it may, the purpose of a strawman argument is to set up a fake argument that you can then tear down, but the fake argument you made up turns out to be right. I'm not sure this strawman argument is fulfilling its goal.
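For concreteness, the kind of defensive pin described above also lives in `pyproject.toml`. A sketch, assuming a project that still relies on `numpy.distutils`; the exact bound should be checked against that project's own metadata:

```toml
# Pin the build backend below the release that broke things: setuptools 60
# switched to its vendored distutils by default, which numpy.distutils does
# not support. A project in that situation would also list numpy here.
[build-system]
requires = ["setuptools<60", "wheel"]
build-backend = "setuptools.build_meta"
```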
Maybe. What can happen? A lot of things. For example incompatible API changes, a backdoor, a logic bomb and so on. But the reality is maintainers cannot really properly audit every version; even if the packages are small, there are a lot of them. So the most the maintainers usually have the capacity to do is to read the changelog, build the new version and check if it works for them (often running the test suite takes too much time).
I know, and it is good. PEP 621 was the most awaited change in
I know that cexts have always been an untidy mess full of custom-written
They depend on
Instead of pinning the versions they should fix their software to work with the latest versions. Or, if it is
Stable (in that sense) software is useless. If a user needs the software with new features, sometimes he has to build only that software version from source. But the new version often requires new versions of libs, new versions of those libs can require new versions of other libs, and to build those libs the user can have to install new versions of build tools ... So either the user finds a side repo with new packages, or makes his own repo for the missing ones, or he has to change the distro to a more practical one. Maybe not as "stable", but at least actually solving the user's problem instead of making excuses about why it cannot solve the user's problem.
Well no, I already said, they are migrating to non-setuptools build systems, which is to say, Meson. I don't understand why you think that the numpy ecosystem is to blame however. They didn't break a stable API. They are migrating to something with a robust long-term future. It's not their fault that they are stuck between a rock and a hard place, and it's not their fault that the only maintenance fix is to pin their dependencies while building out new infrastructure from scratch in the development branch. That doesn't mean it didn't happen regardless of "fault". And here you are talking about how no matter what, it's always the fault of anyone that doesn't use the latest version of everything, because if they don't use the latest version then their project is "garbage". And this is supposed to be the life advice for why projects should make changes such as this PR that don't add any new functionality, only drop support for old versions? Is that it, then? If pgpy supports versions of setuptools older than the very latest release, that inherently makes pgpy "garbage" because it isn't "hardcore" enough?
It is the right thing to do (though I prefer CMake for its rich box of batteries).
It's good.
It is their fault they have not addressed it already. We all have faults. But for them I guess the quickest fix possible is just forking
It is one's fault if one is incompatible with the latest version. Currently PGPy is incompatible with the latest packaging practices within the Python ecosystem; this PR fixes that.
The new functionality is not for this project itself; the functionality a project provides is always for third parties. I consider "can be parsed by tools supporting only PEP 621 rather than the zoo of formats of various build systems" to be a feature. And I prefer the rest of the formats for specifying metadata for Python packages used in sdists to go extinct. If only dead and unmaintained packages use formats that are not PEP 621, then I can drop all the formats that are not PEP 621 in my tools and state that in that case they should be converted to PEP 621.
It is not about versions of