I hear some American girls say bad things about black men, like that they won't date them and think they're thugs, and the disrespect rubs me the wrong way. I know many good black men I've met in real life. It makes me strongly dislike American girls. I don't talk to girls or pay attention to them anymore; they're not worth the risk. Am I wrong about how I feel toward them? They don't give a dang about black men.
Most Helpful Girl
Honestly, they say that because of how most black men portray themselves. And I'm saying this as a black woman. Many do portray themselves as thugs and have an immature mentality. Part of that is because of where many black PEOPLE, not just men, come from. I teach in an urban district, and almost all the male students (it's a mostly black district) act like that. They don't use proper English, they use slang, their pants sag, they're on food stamps but wear clothes that cost hundreds of dollars, and I'd say about half of them are in gangs.
I've met a lot of black men who are not like that and have dated them, so I would never say I wouldn't date a black man. But I mean, when you look and act like 'the thug life', what do you expect? Women who have standards don't go for guys like that.