Why is it so wrong to be "woke"?

"Woke" originally just meant being aware of racial and social injustice. Now radical right-wingers throw it around as an insult for anyone they disagree with. So why is it so wrong to call out injustice where you see it?

I personally believe people are so opposed to it because they're scared of losing their power over others. These are the same arguments made when women fought for equal rights, and the same ones made when minorities fought for theirs.

Be civil, or I will remove comments.