I come from a multicultural background and grew up with different cultures. My mother is African (actually African American) and the rest of my family is of European descent. As such, I have faced discrimination on various grounds. However, I have only faced true racism from Black Americans here.

I remember when I first came here out of an oppressive country. I went to a private primary school and loved it. Sadly, I had to relocate to a different primary school due to distance, which started my journey through public schooling. This new primary school was not the best for me in the sense of community. My only friend was the chair I sat in at the principal's office because of how I was treated. On numerous occasions I was called out of class to sit in the nurse's office to be repeatedly tested for AIDS. I was continuously asked why my mother married a white guy instead of "a strong black man."

Eventually I went on to middle school, where I noticed a strange setup: you sat with kids from the part of town you lived in. This was not implemented by the school; the kids did it on their own. I was spit on, bullied, cursed at, cyberbullied, laughed at, mocked, and almost died. I was also called racial slurs and told I should be raped. I was told firsthand that "blacks aren't and cannot be racist, racism was invented by the whites. Blacks can be prejudiced but not racist." I saw an older guy who said he would never purchase anything white.

As an African American myself, I have otherwise had it pretty great in America. So I find it intriguing when people talk about how "being black in America" is. Why not fix the inner communities and the racism within the black American community? Why continue to play the victim? Why blame only Caucasians while everyone else gets a pat on the back? I believe that we all bleed the same color and that skin color is an illusion (if that concept makes sense). I always wondered if this was more of a cultural thing. Thanks!