At this point the Democrats are unquestionably far left, and they have far-left militants in their ranks. But I digress. Much of the dominant culture in the US is left-wing, and with that comes an intense hostility toward white people, particularly white men, even working-class white people. The right is telling disaffected and downtrodden white people that they matter, that they aren't evil, and that their feelings count. Do you think this is driving the rise of the right? And by "the right," I don't mean establishment conservatives or neocons.