Since the West has descended into chaos these days, why should we impose Western values on other peoples?

Both liberals and conservatives have this mindset that we should impose our values on everyone else.

But look at the society we have.

Have our "values" fixed a single social issue? Or at least fixed one without creating new ones?

Why the f* should we want other peoples to be decadent like us?
