I think so, but I think it's more like a demographic-specific issue.
Latinos and Latinas respect each other.
Asian men and Asian women respect each other for the most part.
Black men kinda look down on black women; nevertheless, black women remain very committed to black people's causes.
White women think white men are privileged, racist, and bad. And white men think white women have no dignity and are entitled, unstable, and untrustworthy.
Are we in the middle of a gender war?