Americans, tell me about the truths and myths about life in your country?

As someone who has traveled to a lot of countries (though not the US), I've come to realize that the media gives us false impressions of countries, whether directly or indirectly, good or bad.

I’m curious to know about America from someone born and raised there.

Thank you!
