Should schools teach racism?

With affirmative action, and with white kids being held guilty, simply for being white, for the racism their race committed against black people, is it acceptable to tell white kids in school that it's okay to blame them for the problems of the past? It seems like it's acceptable to blame white people for all the problems of the world. Even when a black person holds up a gas station for money, the explanation becomes that white people took the black kid's job, so he had to turn to crime to make money.