I've seen some questions where Americans act like they are the best or something, but here's what I think about America:
1) you're not safe
2) people are lazy and ignorant and selfish
3) there's so much racism there
4) you killed a huge percentage of your native peoples, and most white Americans have European roots anyway (read some history)
5) you have a kind of shallow culture (sorority parties, pop music, etc.)
So can someone explain to me why Americans feel like they're on top of the world when they're not?
Most Helpful Girl
We are taught to think we are safe, etc., and honestly, for the most part it is pretty safe outside of the major cities. America isn't just New York, Miami, and Los Angeles.
Yes, there's racism, but that's not specific to America. Yes, many people are lazy and selfish, but that's not an American thing either. As for ignorance - again, we're taught a lot more about our own country than about others, and we tend to gloss over what was done to the Native Americans and others. I'm not making excuses, by the way, just explaining.
That said, that's only one part of the American public. "The public" in almost any country includes plenty of less-than-educated people; it's not just America. And there are plenty of educated, hard-working people in this country who are very aware of these problems. So it's a mix.
I think America's economic and cultural influence allows its people to be complacent about the rest of the world - I don't think it's right but it's how it is.
Basically, people who act that way seem to do so out of both ignorance and a lack of perspective.