Is it fair to say America has always portrayed itself before the world as the "Good guys" of the movie, and other nations as the "Bad guys"?

Hispanic-Cool-Guy
Vote A: Fair to say.
Vote B: Not true.
Vote C: Other.
Update (2 months later):
I'm not actually talking about a "movie," but rather about the international and domestic sense in which America's government paints itself as the good guys while painting other nations and their governments as horrible and in need of America stepping in to "fix" their countries.