Why are so many women making everything about their gender nowadays?

I get that women have it somewhat rough in society, but in terms of influential power between the genders, they've certainly got the upper hand these days, maybe a little too much power if you ask me. But now it's like, when it comes to men and women, if something normal is remotely perceived as a task for a woman, it's "oh, why should I do this just because I'm a woman?"

For example, I was watching a show earlier where this dude saved a girl from getting shot. He took a bullet to the hip and is bleeding out everywhere, and he asks the girl, the only other person around, mind you, if she could stitch him up, because he was struggling. And her first response is, "Oh, because I'm a woman I should know how to stitch?" It's like, bruh, what does you being a woman have to do with my bullet wound? I see a lot of stuff like this on TV shows, on social media, and unfortunately in real life. It's so annoying; not everything is about you.