@brigrammer @TrevorGoodchild Sure. But there were people training open-source models without the ethics shit; the difference is that none of them had access to the scale of compute needed to train a 70B or 130B.
So perhaps this is a "the world realizes OpenAI aren't magic" correction, but it was looking like people would be able to train an open-source 130B soon anyway, especially with the unified memory systems, albeit at something like 5x the time.