I wonder when that was.
Lynching black people in the street?
Overthrowing democratic governments to install far-right dictatorships?
Invading other nations based on flimsy excuses?
Working people to death and letting them starve in the street?
I wonder what good example America ever set for the world.