You hear it all the time: feminism means
"The advocacy of women's rights on the grounds of political, social, and economic equality to men." (Oxford Dictionary)
But I believe that definition misses the point; it leaves out many assumptions that matter. So I propose a new one:
Feminism: ˈfeməˌnizəm ( Noun)
A political movement that holds that men have been the ruling class for the majority of history, and that as a ruling class they have granted themselves certain rights and privileges denied to women and to men who do not conform to a gender standard set forth by this male ruling class (the Patriarchy). Its members work toward correcting this gender inequality through lobbying, information campaigns, and activism.
I realize that to most feminists these definitions are synonymous, but without the extended background the dialogue loses itself in the assumptions.
What do you think?