It seems that over the past few years especially, as feminism has drawn more and more of the limelight, its true definition has been demonized and distorted.
At its core, feminism is simply the fight to make women equal to men in every way: professionally, socially, economically, and politically. While everybody who identifies as a feminist shares that core belief, some take it to an extreme. As with any ideal or belief, there is always a radical fringe; hence the term "feminazi."
These radical feminists give the ideal as a whole a poor, toxic image in the eyes of the rest of the world. While feminism is about equality, the radical view is that women are superior to men. That view is, of course, as false as its opposite. While there are certainly those who believe it wholeheartedly, the majority of people who identify as feminists want nothing more than true equality between the sexes.
However, that equality is difficult to achieve given the constant, unfair backlash feminism receives. For it to be realized, there must be a broader understanding of what the ideal truly is.
I encourage everybody to take some time to read and research what feminism is actually about, with as few preconceptions as possible, and to form your own thoughts on the issue. There is nothing wrong with feminism at its core; in fact, it is a good thing. Equality is what is being fought for, and for that fight to succeed, it needs to be understood and discussed.
Naturally, this will get flak from some, but please keep the conversation civil. I'm curious where people stand on this.