(HealthDay)—The U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists, a new study finds.