@Codeki @PopulistRight @snappler @RustyCrab @Inginsub you can run the model slowly for the cost of a consumer card, about $5k or thereabouts.
meanwhile the chips deepseek actually used to train it, aftermarket h100s, run around $30k each, not counting the other gear.
nvidia prices its AI-specific top-line chips at $60k each, and everyone is just supposed to buy hundreds of them at once. openai allegedly had something like 150k individual units for some reason.
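a quick back-of-envelope on those figures (the $60k list price and 150k-unit count are the post's own claims, not verified numbers):

```python
# back-of-envelope using the figures claimed above (not audited numbers)
chip_price = 60_000          # claimed nvidia list price per top-line AI chip, USD
units = 150_000              # alleged openai unit count
total = chip_price * units
print(f"${total:,}")         # prints $9,000,000,000 -> ~$9 billion in chips alone
```

that's before the "other gear" (networking, cooling, power) the post mentions, which would push the real buildout cost well past the chip spend.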
now factor in having to "retire" entire setups every 3 years because nvidia made a better card.
suddenly nvidia's revenue stream is bloated 1000x for zero reason. now that the news is out, the first-tier clients (server builders like super micro) can't sell their subscription services, and nvidia is headed for the trash.
just look at their own cooked-up projections. i doubt they even shipped any of these chips. investor ponzi