Why is it that when natural disasters happen in America, no other country seems to care?

Yet when anything bad happens in other countries, the U.S. busts ass to make it right. I know other countries don't really respect America or its citizens that much, but when does decency come into play for us?

And when other countries do send things to help us, why does the media keep that information off the air?

Most Helpful Guy

  • I'm American, but I don't have much national pride at all. One reason could be the mentality and reputation our nation holds. We are seen as greedy and, like the other guy said, arrogant. There are countries out there that do help us, but for the ones that don't, I don't blame them. If you really look at our government, our society and culture, and our twisted history, there are plenty of reasons not to give money to this nation. But when innocent lives are at stake, you have to rethink the argument I just presented. People first, pride second.