@prettygood I really don't want to use Python, but it's arguably the second-best option for integrating local LLMs into code after Ollama, and not everyone publishes their models through Ollama.
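
For example, here's a minimal sketch of what I mean by going through Python directly instead of Ollama, assuming llama-cpp-python is installed and you have a local GGUF model file (the path below is just a placeholder):

```python
# Minimal sketch: run a local GGUF model via llama-cpp-python, no Ollama involved.
# "./models/model.gguf" is a hypothetical path; point it at whatever model you have.
from llama_cpp import Llama

llm = Llama(model_path="./models/model.gguf", n_ctx=2048)

out = llm(
    "Q: What is the capital of France? A:",  # plain completion-style prompt
    max_tokens=32,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

Any model someone ships as a GGUF file works with this, whether or not they ever bothered to add it to Ollama.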