— The structural gap
What clinical AI systems systematically miss
Clinical AI is trained on data from health systems that serve specific populations. In the US, Medicare/Medicaid databases skew toward elderly, urban, insured populations. In Europe, cohorts are predominantly white European. Latin America, Sub-Saharan Africa, and indigenous communities worldwide are near-absent.
The consequence is not bias in the colloquial sense; it is structural exclusion in the formal sense. These populations have welfare functions $W_e$ that the system has never observed, so their welfare changes $\Delta W_e$ generate zero signal in the objective function.
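The zero-signal claim can be made concrete with a toy sketch. Everything below is an illustrative assumption, not from the source: a population-weighted objective $J(\theta) = \sum_e (n_e / n) \, \ell_e(\theta)$ with a squared-error loss standing in for a welfare proxy. A subpopulation with $n_e = 0$ observations has exactly zero weight, so the gradient the optimizer sees is identical whether that population exists or not:

```python
import numpy as np

def group_loss(theta, y):
    # Squared-error stand-in for a per-population welfare proxy
    # (hypothetical choice; any loss gives the same structural point).
    return np.mean((y - theta) ** 2) if len(y) else 0.0

def objective(theta, groups):
    # Population-weighted objective: groups maps name -> observed outcomes.
    # Weight of each group is its share of the training data.
    n_total = sum(len(y) for y in groups.values())
    return sum(len(y) / n_total * group_loss(theta, y)
               for y in groups.values())

groups = {
    "observed_cohort": np.array([1.0, 1.2, 0.8, 1.1]),
    "excluded_cohort": np.array([]),  # zero observations -> zero weight
}

# Central-difference gradient of J at theta = 0.5. The excluded cohort
# contributes nothing: its true outcomes could be anything and the
# gradient would not change, because its data weight is exactly zero.
eps, theta = 1e-6, 0.5
grad = (objective(theta + eps, groups)
        - objective(theta - eps, groups)) / (2 * eps)
```

The point of the sketch is that exclusion is invisible at optimization time: `objective` and `grad` are bit-for-bit the same as they would be if `excluded_cohort` were deleted from the dictionary, which is the formal sense in which $\Delta W_e$ generates no signal.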