Why does the West spend its time denigrating itself?

Many historians, politicians, and journalists, as well as documentaries, spend their time blaming Western countries for all the world's problems.

For them, Western civilization can be summed up only as slavery, colonization, and imperialism. The West, they claim, has never brought anything good; to hear many of these people, it is practically the "devil".

Of course, it is important to take a critical look, and yes, the West has unfortunately committed unspeakable acts. But so have other civilizations. Amid all this criticism, however, we should not forget the good things the West has brought to the world.

So I wonder where this self-denigration comes from.