As you know, many sensitive topics are being brought up in society today. Important issues are being addressed, but I'm curious how you feel about the way they are being handled.
The anonymous option is available so you can feel comfortable sharing your honest opinion, whatever it may be.
Do you believe that "woke" culture is the SOLUTION to these important issues, or do you believe it is ADDING problems instead of making a positive change? Or do you just feel incredibly confused about what to believe? Or do you find it offensive?
Let's keep this a safe place for people to share their honest opinions. If you don't have anything nice to say to someone who posts, please wait until you feel calm before replying, or don't reply at all.