For those of you who use LLMs to help you code, here's a warning: these tools have been shown to hallucinate nonexistent package names, and attackers can register those names on public registries so your build pulls in their poisoned code. https://www.theregister.com/2024/03/28/ai_bots_hallucinate_software_packages/ #ai #gpt #chatgpt #security
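
One cheap defense is to verify a suggested dependency actually exists on the registry before installing it. A minimal sketch using PyPI's JSON API (`https://pypi.org/pypi/<name>/json`, which returns 404 for unknown packages); the `fetch` parameter and the stub below are hypothetical, added so the example runs without a network connection:

```python
import urllib.request
import urllib.error

PYPI_URL = "https://pypi.org/pypi/{name}/json"

def package_exists(name: str, fetch=None) -> bool:
    """Return True if `name` is a real package on PyPI.

    `fetch` is injectable for testing; by default it issues a real
    HTTP request to PyPI's JSON API.
    """
    if fetch is None:
        def fetch(url):
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status
    try:
        return fetch(PYPI_URL.format(name=name)) == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # no such package on the registry
        raise

# Stub standing in for PyPI: only "requests" is treated as real.
def stub(url):
    if "/requests/" in url:
        return 200
    raise urllib.error.HTTPError(url, 404, "Not Found", {}, None)

print(package_exists("requests", fetch=stub))                  # True
print(package_exists("definitely-hallucinated-pkg", fetch=stub))  # False
```

Existence alone is not proof of safety, of course: an attacker may have already registered the hallucinated name, so a package that is new, has few downloads, or no history deserves extra scrutiny.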