Do you believe that women are really still being oppressed by men?

Anonymous
*Note: I am strictly referring to the western world, America in particular. I am also not trying to start any arguments or debates, I'm just looking for honest opinions.*

Recently, I have been seeing a lot of social media posts by women depicting men as "evil rapists set on oppressing women." My interest was especially piqued after seeing this article on Facebook: thoughtcatalog.com/.../ But perhaps these are just extreme views held by some women.

As a woman myself, I can say that I have never felt oppressed by anyone. I have never felt harassed, intimidated, or "less than" any man. I view the sexes as equal, and I've never heard any man say that he thought of women as below him. But maybe these are just my personal experiences and they don't accurately reflect the experiences of other women; I don't know.

So, my questions are:

Women: Do you ever feel oppressed by men in your real life? Do you think there is still a need for feminism in modern-day America?

Men: Do you think that your sex still oppresses women in some ways? What do you think of feminism, and do you think it is genuinely needed or just some women overreacting?
Updates
I just want to say I'm sorry to anyone who has been offended or upset by this question. That wasn't my intention at all. I was simply looking for honest opinions to see what individual women face in our society and how men feel about the issue.