Does America have good doctors?

I live in a country where doctors can't diagnose even the simplest conditions. I and other people I know have received wrong medical diagnoses several times (once I almost died from the side effects of a medication that was contraindicated for my condition). Doctors here also have an air of arrogance, are rude, and won't let you ask any questions.

My impression is that doctors in the US are more knowledgeable and professional. Is this true?
