So, my questions are...
-Does sexism still exist in America?
-If yes, which do you think is the bigger problem: sexism toward men or sexism toward women?
-Do you think the sexes are considered equal by society?
-Is feminism still needed, or are men's rights activists needed more?
What is your opinion?
I know I mentioned feminism as an example, but when I say "sexism" I mean it BOTH ways: toward men or toward women, whichever you think occurs more often or is currently a problem.
I'm leaving this question somewhat open-ended, so feel free to add any thoughts you like regarding this issue :)