What Was Reconstruction?
Reconstruction -- the period after the Civil War -- was meant to give newly freed Black people the same rights as white people. And indeed there were monumental changes once slavery ended: thriving new Black communities, and the first Black members of Congress.