Do you guys not think that's necessary anymore and that we're already there, or do you think that's just not what the movement is about anymore?
I feel like there are still so many issues in our society that affect women disproportionately: wage inequality, lack of employment opportunities in many sectors, objectification, harassment in the workplace, childcare responsibilities that still fall on women even when they work full-time, unequal political and media representation, and the fact that it's now coming out that women have been giving up opportunities in the entertainment industry for decades after being harassed and assaulted by men. Do you feel like these aren't actually issues, or that feminism isn't covering them sufficiently?
Could you also say whether you've actually had experience in the professional workforce, and where you guys are from? I feel like that might play into it too.