Summary of "Majority of U.S. Doctors No Longer White, Male": A new study finds the U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists.
- By Amy Norton, HealthDay Reporter. TUESDAY, July 20, 2021 (HealthDay News): The U.S. medical field is less dominated by white men than it used to be, but there are still few Black and Hispanic doctors, dentists and pharmacists, a new study finds.
- By 2019, white men accounted for about 44% of those positions nationally, down from 54% in 2000. That decline was due to an increase in women entering those fields, particularly white and Asian women.
- Meanwhile, similar patterns were found in dentistry and pharmacy — two other lucrative health care fields.
- Minority representation increased more broadly in jobs such as nursing, physical therapy and home health care.
- "Physicians who are underrepresented minorities, such as Black and Hispanic physicians, are more likely to practice in areas federally designated as medically underserved or experiencing health-professions shortages than white physicians are," said study author Ly.
- “It wouldn’t surprise me if there are health benefits to seeing a pharmacist or dentist who looks like you and may better understand you or your experience — just like there is in medicine,” Ly said.
- Between 2000 and 2019, a growing number of women became doctors; by 2015 to 2019, women made up about one-third of physicians. White and Asian women both saw gains of three percentage points.