Should men take back society?

Anonymous
Over the last 20 or so years, men have taken a back seat and let the gynocentric side of society take control.

As we can now see, this was a massive mistake.

Society is now based on people’s “feelings” and on letting everyone be a winner just for showing up.
Bullshit like “inclusivity” gets injected into everything.

The school system is teaching our kids to be social justice warriors when they can’t even clean their rooms.

Our society cares more about making everyone feel “included” and “equal” than about being efficient.

We have created a bubble-wrapped, thin-skinned generation of snowflake adult babies with no ability to handle the struggles of the real world.

Do you think it is time for men to step back up and say enough is enough? To silence the whining babies and put this ship back on course?