My question to this, of course, is: why?
In a nation that preaches equality, with feminists charging forward with their "women are equal to men" arguments, why is a woman hitting a man more socially accepted than a man hitting a woman?
Whether you feel it's wrong or not, one is definitely more accepted than the other.
I would think that, no matter what, equal should mean equal: a woman should never place her hands on a man, or vice versa, and if she does, she should face the same scrutiny he would.
Unfortunately, it's just not that way. Why is that?