Conversation
Sick Sun (sun@shitposter.world)'s status on Sunday, 17-Nov-2024 08:39:52 JST
Watched a whole video on AI risk related to using LLMs to write code. It never once addressed the possibility of using a local LLM.
My biggest objection is that you should know what you're doing before you use an LLM to save time coding.
This is the viewpoint I came around to:
0. Write tests first.
1. If you are a beginner, use an LLM to write code you can't figure out by yourself, and learn from it. It will make mistakes you won't catch.
2. If you are of intermediate skill/experience, get your code peer reviewed.
3. If you are an expert, then use the LLM all you want to write code but still vet every line it produces.
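Step 0 above can be sketched concretely. This is a minimal, hypothetical example (the `slugify` helper and its spec are mine, not from the thread): the tests pin down expected behavior first, so any code an LLM produces for the function has to pass them before you trust it.

```python
# Hypothetical function an LLM might be asked to write. The tests below
# were written FIRST, before any implementation existed.
def slugify(text: str) -> str:
    # Whether hand-written or LLM-generated, this body is only accepted
    # once it satisfies test_slugify().
    return "-".join(text.lower().split())

def test_slugify():
    # The spec, expressed as assertions: lowercase, words joined by "-",
    # surrounding/extra whitespace collapsed.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  Local  LLMs  ") == "local-llms"

test_slugify()
print("all tests passed")
```

The point isn't this particular function; it's that a failing test suite gives you a mechanical check on generated code even before you're skilled enough to vet every line by eye.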
Also never addressed, but really underrated: using AI to set up your programming environment. It's work you have to do anyway, AI can save you a ton of time, and there's little risk of getting pwned.
Oh also: malicious AI injecting backdoors into your code is a valid high-level concern, but in practice you're just not likely to be targeted. As an individual, it's just not worth worrying about.
Sick Sun (sun@shitposter.world)'s status on Sunday, 17-Nov-2024 08:42:54 JST
Finally, people ranting about refusing to use AI professionally will sound really stupid in a couple of years. I wonder if they'll take down their articles/YouTube videos/Reddit posts or take their lumps. It's not a big deal if someone changes their opinion, but people are so aggressive about it that I hope they feel some humility eventually.
⚡Eineygður Flakkari⚡ (toiletpaper@shitposter.world)'s status on Sunday, 17-Nov-2024 09:16:34 JST ⚡Eineygður Flakkari⚡ @sun
This is a decent option for vscode users who have ollama installed locally.
https://twinnydotdev.github.io/twinny-docs/general/quick-start/
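A rough sketch of the local setup that workflow assumes, for anyone who hasn't used ollama before. The model name here (`codellama`) is just one example; substitute any code model you prefer, and see the twinny quick-start linked above for the extension-side configuration.

```shell
# Pull a code model into the local ollama store (model choice is up to you).
ollama pull codellama

# Start the local API server if it isn't already running as a service
# (ollama listens on port 11434 by default).
ollama serve &

# Sanity-check that the server answers and lists your model
# before pointing the twinny extension at it.
curl -s http://localhost:11434/api/tags
```

Everything stays on your machine, which is the whole point of the local-LLM option the video skipped.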