Are Americans brought up to think that current society is better than the past?

It seems like a lot of Americans are ignorant of cultural history and associate everything B.C. with savagery and male supremacy.

It seems like people in other countries, especially European ones, are more informed about the strong and respectable cultures of the past. They realize that the Dark Ages are a relatively recent period, not representative of all of history.
