Women, be honest, please: genuine equality or payback, which do you desire more?

After millennia of being called weaker, dumber, and too emotional, of being denied certain jobs, of all the violence and rape you experience, and other shit, don't you on some level desire revenge? Now that times are changing and women are getting a fairer shake, we've learned that women tolerate pain better, handle disease and sickness better, live longer, do better in school, and are less likely to go on killing sprees (which suggests better emotional control). So as time goes on and women keep advancing past the stereotypes and glass ceilings, don't you think that, on some level, giving men what they gave you would be satisfying? I'm starting to notice that it's becoming more accepted to generalize about, demonize, mock, and be outright sexist toward men, and to be honest, we've had it coming for a long, long time.
