Do you think white women in the west act the most entitled?

Anonymous
Like they deserve to have things handed to them, can treat people however they want, and don't expect to be held accountable for their decisions or actions. I pick white women because, of all the races I've encountered, I've noticed they're the ones who always seem to want to avoid being judged for their actions while still expecting to be treated like a princess.
