HBCU History

Historically Black Colleges and Universities (HBCUs) are higher education institutions established primarily to educate Black Americans.

A majority of HBCUs were established in the southern United States following the Civil War. Because Black Americans were repeatedly denied admission to predominantly white institutions, HBCUs emerged as the predominant institutions where Black Americans could learn, excel, and strengthen the Black American community.

HBCUs by Location