The West includes several countries, but I'll limit it to France, the United States, and England. Of course, anyone can answer lol.

I've noticed that for the past 30 or 40 years in my country, the elites, politicians, the media, and various associations have developed an anti-nation sentiment: they hate what France represents, its history, its culture, its traditions. The associations that have emerged label the great figures of my country's history, and the country itself, as racist, colonialist, homophobic, sexist, pro-slavery, etc. I know these movements were born in the USA, and today we can see it very clearly with the Black Lives Matter movement, which topples statues of famous figures because they are judged to be racist and so on.

So, do you think Westerners should be ashamed of their history and apologize for the harm they have done in their past?