From Pacific Standard: Technology is increasingly collecting and sharing data on individuals’ mental and physical health, which is then converted into usable knowledge based on individual health profiles, such as recommendations about diet and exercise and predictions about potential health conditions. Many of these recommendations and predictions may be biased, overlooking cultural differences and reflecting historical prejudice and discrimination.
“While statistical analyses like these have become ubiquitous in daily life, the data and technology powering them are often opaque and can produce unexpected results, amplifying historical biases in obscure ways that are difficult to identify or monitor. For instance, data sets analyzed for predictive purposes might be used to calculate the likelihood of depression or chronic illness in your future, which might then determine your eligibility for life insurance. If the underlying data is biased or otherwise flawed, the resulting predictions are likely to be skewed against already vulnerable or underrepresented groups. The reality that a population may be quite diverse can be lost, generating recommendations that skew ‘Western, educated, industrialized, rich, and democratic’—or WEIRD.”