Why are men always seen as the aggressors, and women as victims?

I've seen this happen countless times. In almost any scenario, people see the woman as the victim, or as the one who "deserves better" than the guy she's dealing with, even when she's the one causing the problems.

For example,

A woman texts a guy incessantly, desperately trying to get his attention and refusing to stop. That's harassment, by the way. But many people read it as a sign that the guy isn't paying her enough attention, so she should find a better match who will give her the love and affection she "deserves." Even the man in question, after some time, concludes that he was at fault and resolves to do "better" in future relationships. He blames himself too. I've seen people tell a woman to find a better man because he couldn't give her the love she deserved, despite the fact that she was harassing and abusing him.

If a man does the same thing to a woman, however, it's suddenly serious. Now he's a creep, and the label of harassment magically applies because a man is doing it. She still deserves better, and he's an abusive person. People rush to her defense even more, and she is, of course, the victim again.

This is only one scenario, and I know that in many cases women really are the victims of abuse. There are also situations where women are treated unfairly in the opposite direction, such as being slut-shamed or not being taken seriously. But that's another topic. This is just one pattern I've noticed.

And it's not only women who hold this view. Society as a whole sees women as inherent victims, rarely capable of being the guilty party in an interaction. This perception has had some benefits, arguably making women less prone to crime and violence, since they're raised to see themselves as incapable of harming others. But it's still a double standard.