I keep hearing about feminism. All I hear is that it was supposed to support women, but feminists seem to have a negative reputation. What exactly is going on? Are women abusing the legal system (child support)? What are men doing wrong? I need to be more informed about this. Please help, guys and girls, and tell me your opinions on it!
Most Helpful Guy
Feminism is no longer needed. It's based on the idea that western women are oppressed, which is not the case. So essentially they have begun to push for the oppression of men, which they are succeeding in.