I have kind of an idea, but not really.
Most Helpful Guy
It's supposed to be about the sexes finding some sort of equality between themselves: socially, politically, and economically.
Political, social and economic equality for all genders
In its purest form, feminism is a movement with the aim of achieving equality between the sexes. However, radical feminists have given the movement a bad name, and many people don't take it seriously anymore.
"the advocacy of women's rights on the grounds of political, social, and economic equality to men"
Feminism is the belief that all genders should be equal. Specifically men and women. It is not the common misconception that women believe they should be above men.
simply put it's equality between both genders
It started out as a movement to create equality for women all around the world, but it has now turned into more of a man-hating movement, pushing men down into the dirt whenever possible. (This doesn't have to be true in every country, but it's how it is in at least my country.)
The true definition in practice: Elevating girls and women to superior social, political, and judicial status over males, by grabbing special rights and liberties for themselves, at the ultimate cost of usurping the masculinity, roles and basic rights of men and boys. It largely pushes a radical abortion agenda as well in order to help accomplish this.
It's on dictionary.com if you even bothered to look.
Feminism is supposed to be equality of rights, but NOT double standards; in the end, the reason it's called FEMinism is that it favors women.
The spawn of the devil and witchcraft combined. Men who practice feminism find their penises eventually shrivel up and fall off. If you ever meet one of these feminists on the street, be sure to guard your genitals, as they are surely after them for a potion.