Think we should have forced Southerners to tour plantations after the Civil War?

I'm not sure most of them understood what slavery actually was. They were just fighting for white supremacy. Lazy fcks.

The Germans “knew” what was happening during the Holocaust. They benefited. They felt good about “being Aryan,” much as Confederates did about their own cause. But I don’t think it really set in until the camps were liberated and the films were shown.

It’s like how you would riot if you saw in person how the poorest kids live today…in cockroach feces.
