Has Feminism done anything good for women?

Just wondering what feminism has done for both women and men. Has it done more good than bad? There are many types of feminists: some take it way too far, most level-headed feminists are cool, and some are blatant hypocrites with double standards, like when it comes to blaming women for being victims of rape. It's hard to define feminism these days because there are so many groups doing so many different things for different causes. Like political groups, you align yourself with a group that you believe shares your values and your perception of what feminism means to you as a woman. Some feminists would even argue that young girls don't need fathers to be there as role models, to guide, protect, and love them.