I'm not implying that the U.S. is above criticism; it definitely isn't. And many of the criticisms I've seen are justifiable. But it seems as if it's become a bandwagon thing of sorts to criticize the U.S. a disproportionate amount compared to other countries. Why is this? Every country on the planet has unique problems, but I see the U.S. specifically being called out quite often. I also see stereotypes being spread about people from here, as if there were no diversity of thought and opinion, which strikes me as nonsensical, considering we are a huge country full of all types of people of every race, religion, and walk of life.