Is the U.S. becoming more sexist?
Here’s a simple equation: more feminism = more men putting their guard up and becoming even more sexist, because they don’t understand feminism’s intentions, nor do they want women to hold an equal position. Maybe it’s fear? Maybe just an ego thing? Maybe their upbringing or religious reasons?