I've been told so by quite a few people who live in the USA, and I wonder: is it really true?
Don't get me wrong, I believe it's an amazing thing, because it shows just how much better the society we live in has become. It has also really changed my perception of white American girls, whom I'd always assumed were racist and full of themselves.
But anyway, is it true? :D
Most Helpful Girl
Interracial relationships are still a minority wherever you go.