Am I the only one who thinks western society is nihilistic, pacifist and pessimist?

I feel like western society is toxic and abusive. I live in Europe. Ecology goes to shit; we keep cutting down more trees. Third worlders rot away in sweatshops so we can buy cheap shoes and clothes. We don't worry about the war in Ukraine, since it can't reach us in western Europe. The only activism we can do is woke bullshit. If you promote gym culture, they call you a right-wing extremist. Our families break apart. People are apathetic. It's so depressing.