This is a broad brush, but I’ve noticed it a lot. Liberals blame white men specifically, and white people in general, as the root of a lot of cultural problems. The right does this too, hypocritically, but to avoid looking like hypocrites they blame white women instead. What’s up with all of that 🙄 why do white women get the brunt of all the bullshit?
The left blames white people for everything while the right blames white women. Is there no winning for white women?
