
Isn’t the west supposed to be the good guys?


This is practically inevitable and happens in pretty much any war where it can. It doesn't matter what army, there will always be opportunists in there who do really fucked up shit to civilians.
This is why you don't start wars if you can help it.
According to Western propaganda, yes. In reality - no.
why wouldn't a liberal, gynocentric west be the good guys? I mean, they stand up for social justice, believe in equality, and justify selfishness under the disguise of "this is a free country"
the west isn't god-centric, it's more atheist than the rest of the world
the rest of the world is more religious than the west
not even close. America has grifters trying to take advantage of religion. even liberals are using religion against religious people and then trying to play the moral high ground afterwards
the west isn't the bad guys because of their religion. it's the degeneracy they're trying to spread to the rest of the world, and the way they stick their tentacles in other countries' business, that has the rest of the world hating them: trying to force feminism on the Middle East, trying to force LGBTQ on Africa, starting proxy wars for bullshit reasons, and destabilizing those countries so refugees have a reason to be pissed, illegally enter western countries, and commit crimes, which causes the citizens to demonize the illegals and justify more proxy wars, creating a cycle of endless hate
no you just lack the IQ to think outside the box and let the concept of religion live rent free in your head
if you want to focus on just the story itself, most military personnel aren't religious at all. most who join aren't even particularly patriotic; they do it for the benefits
but sure, blame it on religion that is already dying out in the west
Everyone does everything for personal benefit. That's why capitalism works.
Even in whatever shithole country you’re in (kidding I’m sure it’s fine), people aren’t doing things because of religion. It’s because of personal benefit.
But make no mistake, too many people in America THINK they're religious… and thus are fighting FOR god. That's terrifying.
America isn't religious at all. too many grifters from both sides of politics using religion to try and gain a moral high ground without actually practicing it. even when they're not religious, they still try to pretend they have moral high ground when they don't
churches these days are empty and being desecrated with pride flags and OnlyFans models filming sex scenes inside them. mosques are the only places that are still full, but I don't think you'd have a problem with them being full, would you?
Wait till you find out what Africans do to Africans.
Generally speaking yes, but not everyone, of course. While a lot of people in the West are good, some are like that and support people like Biden.
nope. we're just suckers and losers
No. Not ever.