Why is it okay for women to hit men?

Whether at home or in public, you see it every day. Women hit men almost as often as men hit women, yet there is a stigma in our society under which these violent attacks are viewed as "not that bad" or "not the same" as when a man hits a woman.

My question, of course, is: why?

In a nation that preaches equality, with feminists charging forward with their "women are equal to men" arguments, why is one of these more socially accepted than the other?

Whether or not you feel it's wrong, one is definitely more accepted than the other.

I would think that, no matter what, equal should mean equal: a woman should never put her hands on a man, or vice versa, and anyone who does should face the same scrutiny.

Unfortunately, it's just not that way. Why is that?