But his words did hurt, because they're rooted in some truth. Much of the world does not value black women, or blackness in general.
Yet black women have made great strides: 60% are college educated, and many are opening businesses and supporting their families. Even now, though, much of the world sees us as welfare queens or hoodrats with no life goals, when that's far from the truth.
My question is: is this the media's fault? Have people held this belief since slavery and passed it down through generations, or is the media instilling in people what they think black women should mean to the world?