I'm not saying that the direction we're headed in isn't correct. But sometimes I think about how women have rights literally out of the kindness of men's hearts.
Like technically guys could just collectively, at any moment, be like "yeah, we're tired of these bitches," and there would be literally nothing women could do to stop them.
Do women ever think of that, or do you genuinely believe you're better than men? Serious question.