Democrats don't really believe it's a center-right country anyway; they just know that's where the money and the all-important white voters are, and they would rather work for them.
And what we're seeing is a Democratic Party steadily moving rightward, trying to capture the space "traditional" Republicans left behind, even as the wealthy and white voters veer farther right every day.
The rest of us don't really have a say.
But in the meantime, yeah, Democrats might be nicer to us.