What went wrong with the South?

What went wrong with the Deep South? In many ways, what went wrong with the Deep South is what went wrong with America. The effects of our nation's enduring racism are most apparent in the South, and the continuing impact of slavery's legacy there is hard to overstate.