When did the Americans lose respect for France?

In the past, I would say until the beginning of the 20th century, the United States respected France and the greatness of its history and culture. There were even politicians and leaders who were Francophiles, or who at least knew French culture well and respected its power.

But since then, all of that seems to have disappeared, and for a very long time now Americans appear to regard French history and culture as merely average.

Personally, I have a lot of respect and admiration for what the Americans have accomplished throughout their history and for all they have brought to the world; I consider the US to have been a very positive force for the world.

I feel the same way about the English.