$3,000 and it can run 200B-parameter models? Not bad compared to $30,000 for an H100. If all you care about is running the latest, biggest LLMs locally, it's the cheapest option by far. Otherwise you're looking at a Mac Studio.
If it runs on plain old Linux (with the proprietary Nvidia drivers, of course), that's even better than Digits, which uses DGX OS.
If only there were a truly neutral, uncensored LLM released into the wild. The possibilities would be endless.