Ten years ago, Americans were one nation; despite their ideological, partisan, racial, gender, and other differences, there was a general consensus in society about core values and about what kind of country America is. Then the liberals, along with their media, started culture wars and began calling anyone who disagreed with them a racist, sexist, fascist, or fanatic. They turned everything into a contest between social groups: men against women, whites against blacks. They suggest that white people, men, Christians, and heterosexuals are to blame for everything bad in the world. Don't you think so?