Do you think that most feminists take things too far and overreact? Do you believe that women are still intentionally oppressed in the United States? Unintentionally? Do you believe that men and women are still victims of patriarchy? Are you ambivalent on the subject? Besides reproductive organs, what are some key differences between men and women? Can those differences be celebrated while still promoting equality? Those questions can serve as guidelines if you're unsure where to start; I'm genuinely curious to know how people feel.

As for me, I'm largely ambivalent on the subject; I consider myself more of a humanist. I'm not entirely in agreement with some contemporary voices and their stances on what constitutes power and oppression, and I'm not a fan of entitlement, nor of people who believe that respect is a right. That being said, my issues are with a minority of fanatical, fringe groups, not with feminists or feminist theory as a whole.