Why do women hate men?

It’s taken a bit of a toll on my mental health. I first heard it on social media and didn’t care much, but now everywhere I go I hear women talking about how men are trash, sexist, beasts, wild animals, and more, even saying they’d rather be alone with an animal that would kill them in a second than alone with a man. I’m cautious about everything now: if I don’t have to, I just don’t talk to women, because I don’t want to be labeled a creep, and I avoid even looking in a woman’s general direction. I’m not saying it’s all women, but why do I have to hear it so often? I feel like guys are either going to turn into what’s being said about them or just stop interacting with women altogether.