Medicine has evolved over centuries, transforming from prehistoric nature-based treatments into the modern practice we know today. For decades, Black doctors were a rarity in the United States. After the Civil War ended in 1865, several African American medical schools were established, but it was not until the Civil Rights Movement of the 1960s that medical schools and facilities could no longer discriminate based on race, giving African Americans a more equal opportunity to enter the medical field and contribute to its advancement. Beyond medical professionals, chemists and engineers have also played a vital role in the development of modern medicine.