Is the US one of the most racist Western countries?

From TV it seems that there is less racism in places like the UK, France, and other Western countries than there is in the US. Is this accurate? Is the US really more racist?
