Do most women realize that deep down men don’t give a shit about “women empowerment” or are just using their egos to care?

For the record, I’m not talking about women finding peace after being assaulted,

I'm talking about the "strong woman" thing and other stuff.

I know that
I did not know that
I think a lot of men genuinely care and are not just using their egos.
I don’t know