@kirby Every time I use Python, I start to regret it before my software even reaches a couple hundred lines of code. I don't know how some people manage to write entire applications 100% in Python.
this might be the devs' fault, but I hit an issue almost every time I install a requirements.txt with pip: sometimes a specified package no longer exists, and often devs don't pin version numbers, so everything installs at the latest release and things break. Those are just off the top of my head; there are more reasons installing requirements in Python is consistently painful. I've found Python projects have roughly a three-month shelf life before pip makes them a hassle to install. Pinning exact versions helps, see the sketch below.
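for reference, a pinned requirements.txt looks something like this (package names and versions here are just illustrative, and you can generate one from a working env with `pip freeze > requirements.txt`):

    # requirements.txt with exact pins instead of bare package names
    requests==2.31.0
    numpy==1.26.4
    # transitive deps can be frozen too, so a fresh install matches the original env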
in terms of virtual environments, there's plenty of stuff that still references dependencies on your bare machine, e.g. CUDA, which kind of defeats the purpose (see the example below).
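to illustrate, taking torch as the example package (just an assumption here): even inside a venv the Python package is isolated, but the GPU stack underneath it isn't:

    # inside an activated venv: torch itself lives in the venv,
    # but it still needs the host's NVIDIA driver / CUDA runtime to use the GPU
    import torch                      # assumes torch was pip-installed in this venv
    print(torch.version.cuda)         # CUDA version the wheel was built against
    print(torch.cuda.is_available())  # False if the system driver is missing or mismatched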
@kirby @burner @mikuphile You can do a lot with the standard library (since it's "batteries-included"), but CPython is just painfully slow by design. And virtual envs and packages are such a mess partly because of the missing version pins burner mentioned, but also because for a long time there was no standard way to make a Python package: there were setup.py files, then setup.cfg files, then pyproject.toml files to declare package metadata, plus various tools to actually do the packaging (setuptools, poetry, hatch, meson, etc.). And because a lot of popular packages rely on C extensions for speed, the packaging process can get extra complicated. Pip also doesn't do the best dependency resolution when installing packages, so other tools like conda popped up for managing dependencies, but conda is bloated in its own way and another rabbit hole...
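for what it's worth, the pyproject.toml route at least puts the metadata in one declared place; a minimal sketch with setuptools as the build backend (project name and dependency bounds below are just placeholders):

    # minimal pyproject.toml: build backend plus PEP 621 project metadata
    [build-system]
    requires = ["setuptools>=61"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "example-pkg"            # placeholder name
    version = "0.1.0"
    dependencies = [
        "requests>=2.31,<3",        # bounded specifiers instead of bare names
    ]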
compare this to something like R, where they made sure there was a coherent standard for packaging pretty early on (not that I like R much, but it's a comparable language for certain use cases).