Maybe my issue here is that the whole “buildPythonPackage” corner of nix is fundamentally at odds with where the python ecosystem has been heading lately, and that the rest of the nix ecosystem fares better.
Python has been moving toward a “define your exact dependencies” model lately (see pipenv, poetry, pip-tools, etc.), while nixpkgs expects that a python package’s dependencies can be replaced with whatever versions are in the nixpkgs repo, which is essentially the opposite view. Nix also assumes that tests will catch any problem caused by a version mismatch, which means defensive dependency pinning won’t work. (Pinning a dependency below an anticipated API-breaking release is an attempt to prevent code from breaking before it actually does. And if a dependency changes behavior in a way that changes results but still runs, it will break the code in ways that tests don’t necessarily catch.)
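To make the clash concrete, the usual nixpkgs-side answer to an upstream pin is to relax or strip it and build against whatever version the repo carries. The sketch below is illustrative, not a recipe: `somepkg` and the `requests==2.28.1` pin are made up, and `pythonRelaxDeps` relies on nixpkgs’ `pythonRelaxDepsHook`, which recent nixpkgs wires into `buildPythonPackage` automatically (older versions need the hook added to `nativeBuildInputs`).

```nix
# A sketch of the usual nixpkgs-side workaround when an upstream project pins
# a dependency.  "somepkg" and the requests==2.28.1 pin are hypothetical.
{ pkgs ? import <nixpkgs> { } }:

pkgs.python3Packages.somepkg.overridePythonAttrs (old: {
  # Newer nixpkgs: ask pythonRelaxDepsHook to drop the version bound on
  # requests so whatever version the repo carries is accepted.
  pythonRelaxDeps = (old.pythonRelaxDeps or [ ]) ++ [ "requests" ];

  # Older nixpkgs: patch the pin out of the packaging metadata by hand.
  postPatch = (old.postPatch or "") + ''
    substituteInPlace setup.py --replace "requests==2.28.1" "requests"
  '';
})
```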
That assumption is wrong more often than it is right. Thanks in part to the pain of the arm64 transition on Mac, the differences between Mac and linux, and the fact that data-science python packages frequently depend on C libraries (which just multiplies the dependency space), I have yet to have a python package build correctly from nix on the first try.
But this is partly beside the point. If I have to learn a new way to install packages for every language, then I kinda have to be knowledgeable in both nix and the language just to install a throwaway piece of software in whatever ecosystem I want to try out. I can probably make the buildPythonPackage thing work for python, because I know python… but if I have to do that for a nodejs project? or a ruby project? When I just want to use the output of the package? That is a ton of work to use a nix system… is that really the only way?
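For what it’s worth, this is roughly the shape of what “making buildPythonPackage work” means: a hand-maintained derivation per package, with the name, version, hash, and dependency list all filled in by you. Everything concrete below (throwawaytool, the version, the dependencies) is invented for illustration; a real package usually needs more wiring, not less.

```nix
# A minimal, hypothetical buildPythonPackage derivation.  Every concrete value
# here (name, version, dependencies) is illustrative only.
{ pkgs ? import <nixpkgs> { } }:

pkgs.python3Packages.buildPythonPackage rec {
  pname = "throwawaytool";
  version = "0.1.0";

  src = pkgs.python3Packages.fetchPypi {
    inherit pname version;
    # Start with a fake hash; nix reports the real one on the first build.
    hash = pkgs.lib.fakeHash;
  };

  # Newer, pyproject-based projects may also need `pyproject = true;` plus a
  # build backend such as setuptools added to the build inputs.

  # Runtime python dependencies are listed by hand, and whatever versions
  # nixpkgs carries are the versions you get.
  propagatedBuildInputs = with pkgs.python3Packages; [
    requests
    numpy
  ];

  # Test suites often need extra wiring (or disabling) before the build passes.
  doCheck = false;
}
```

Saved under some file name like `throwawaytool.nix`, that builds with `nix-build throwawaytool.nix`; the equivalent exercise exists for every other language ecosystem nixpkgs supports.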
This isn’t an argument against the standard way of doing things; it is an argument to follow the xdg standard and use the xdg environment variables, rather than creating a new, unconfigurable directory in $HOME.