Is the U.S. becoming more sexist?

Anonymous
The more I meet and talk to people, and the more I read the news and watch TV, the more I realize that the US is becoming a country caught up in a gender war rather than an equal, peaceful nation.

Here’s a simple equation: more feminism = more men putting their guard up and becoming even more sexist, because they don’t understand feminism’s intentions, nor do they want women to hold an equal position. Maybe it’s fear? Maybe just an ego thing? Maybe their upbringing or religious reasons?
Yes, sadly.
No, it isn’t.