I've been thinking about moving abroad at some point, and I wanted to know which countries (I'm referring to their people) have favorable views of America and Americans in general.
That rules out Western Europe and Russia. It's fairly obvious that many developing countries hold VERY favorable views. I think Ukraine and the Philippines are among the countries that also like America.
Most Helpful Girl
I don't think America even likes America.