I think we know the answer to that question. If you've been around for a while, you know that this nasty part of the American character resurfaces every 20-30 years or so. We never learn, but this time it's worse, and democracy may disappear completely. Many people don't realize that Nazism grew in the USA during the 1930s; we only snapped out of it during WWII. KNOW your history, and go beyond what was taught in school. I also recommend reading Jim Marrs's 'The Rise of the Fourth Reich'.