Has feminism actually gotten you anywhere?

Anonymous

As an American, I'm often shocked by how many people believe in the rhetoric of liberal politics and the federal government.

Take slavery for instance. Just the other day, a gentleman on here left a comment about how the Civil War was justifiable because it freed the slaves, and how the Confederate flag represents white supremacy.

Wow! So you think Northern Aggression was justifiable? Does no one look at history? Do you know what an incompetent job they actually did of freeing the slaves? How the feds opened the door to race wars, hatred, and violence and then blamed it on the South, when they were the ones who made things far, far worse?

So, is it the same for feminism? Has it actually gotten you anywhere, or do you just believe whatever the big business of liberal politics tells you?

Updates
In the South, it wasn't uncommon for a farmer to work right alongside his slave. Some slaves were treated better than "free" people. Better check yourself before you wreck yourself.