Since around the 1970s, feminism has been a major force in society. However, it has often been exclusive to women and has promoted a great deal of misandry. Feminism should have been inclusive of everyone, and there should not have been any anti-male rhetoric, such as the modern term 'toxic masculinity'.
Over the last few decades, boys have been performing worse than girls in schools here in the UK and across the Western world. Schools have tended to treat masculine behaviour as a problem and feminine behaviour as the standard. We should embrace both masculine and feminine behaviours and empower both genders.
I feel that feminism should have been inclusive of both genders, not just women. If it had involved men and celebrated the positive aspects of masculinity as well as femininity, perhaps more men would have engaged with the movement, and we might have a more equal world today. Indeed, 'feminism' might better be called 'egalitarianism'.