Has feminism destroyed the movie industry for good?

The problem is there even if we sometimes choose to ignore it.

In an effort to give women some kind of "equality" that they weren't getting in movies, studios have proceeded to butcher long-standing, fan-favorite franchises.

A simple example is Marvel. They've done a lot lately, so I won't name all the BS that has happened in the name of feminism, but the latest is the new Thor: it seems he will be replaced by Jane Foster, perhaps the least interesting character to ever appear in the MCU, played by the very talented Natalie Portman. Like, you wanna add a female hero? Fine. But why do you have to remove a fan-favorite hero just for being male?
In fact, if you look at the Avengers roster as it stands now, women outnumber men, and the numbers are still tipping in that direction.

Other franchises are suffering too: 007 is now some woman instead of James Bond after some 60 years of there being one 007, John Wick is getting a spinoff called Ballerina, and comedy as a genre is dead because if you remove all the "objectifying" jokes there's not enough left to make a movie. And on and on we go.

I understand that, culturally, there is no going back, and I am all for women having major roles, but not at the expense of the art we grew up watching and loving.