Does My Algorithm Have a Mental Health Problem?

From Big Think: As our algorithms are increasingly made in our own image, they are at heightened risk of developing “mental health problems.”

“Take the case of driverless cars. A driverless car that sees its first stop sign in the real world will have already seen millions of stop signs during training, when it built up its mental representation of what a stop sign is. Under various light conditions, in good weather and bad, with and without bullet holes, the stop signs it was exposed to contained a bewildering variety of information. Under most normal conditions, the driverless car will recognize a stop sign for what it is. But not all conditions are normal. Some recent demonstrations have shown that a few black stickers on a stop sign can fool the algorithm into thinking that the stop sign is a 60 mph sign. Subjected to something frighteningly similar to the high-contrast shade of a tree, the algorithm hallucinates.

How many different ways can the algorithm hallucinate? To find out, we would have to provide the algorithm with all possible combinations of input stimuli. This means that there are potentially infinite ways in which it can go wrong. Crackerjack programmers already know this, and take advantage of it by creating what are called adversarial examples. The AI research group LabSix at the Massachusetts Institute of Technology has shown that, by presenting images to Google’s image-classifying algorithm and using the data it sends back, they can identify the algorithm’s weak spots. They can then do things like fool Google’s image-recognition software into believing that an X-rated image is just a couple of puppies playing in the grass.
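The black-box probing described here can be sketched in miniature. The toy code below is my own illustration, not LabSix’s actual method (which used query-efficient gradient estimation against Google Cloud Vision): the attacker never sees inside the model, only its score, and greedily nudges the input until the label flips.

```python
# Toy black-box adversarial search (illustrative sketch only).
# The "classifier" and its threshold are made up for this example.

def classifier_score(image):
    """Hypothetical black-box model: returns a 'stop sign' score.
    Here it is secretly just the mean brightness of the image."""
    return sum(image) / len(image)

def classify(image, threshold=0.5):
    return "stop sign" if classifier_score(image) >= threshold else "speed limit"

def black_box_attack(image, step=0.05, max_queries=1000):
    """Perturb one pixel at a time, keeping any change that lowers the
    score, until the predicted label flips. Only queries are used; the
    model's internals are never inspected."""
    image = list(image)
    queries = 0
    while classify(image) == "stop sign" and queries < max_queries:
        for i in range(len(image)):
            trial = list(image)
            trial[i] = max(0.0, trial[i] - step)
            queries += 1
            if classifier_score(trial) < classifier_score(image):
                image = trial
            if classify(image) != "stop sign":
                break
    return image

original = [0.9, 0.8, 0.85, 0.95]        # reads as a stop sign
adversarial = black_box_attack(original)

print(classify(original))      # stop sign
print(classify(adversarial))   # speed limit
```

The point of the sketch is the one the article makes: by sending inputs and watching what comes back, an attacker maps the model’s weak spots without ever opening the box.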

Algorithms also make mistakes because they pick up on features of the environment that are correlated with outcomes, even when there is no causal relationship between them. In the algorithmic world, this is called overfitting. When this happens in a brain, we call it superstition.”
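The overfitting-as-superstition idea can be shown with a toy example of my own (the features and data here are invented for illustration): in the training set an irrelevant background feature happens to line up perfectly with the label, so a rule learned from it looks flawless until it meets fresh data.

```python
# Toy spurious correlation: "background is green" coincides with "car"
# in training, but only "has wheels" is causally related to the label.

# Each example: (has_wheels, background_is_green) -> is it a car?
train = [
    ((1, 1), True),   # cars photographed on grass
    ((1, 1), True),
    ((0, 0), False),  # non-cars photographed indoors
    ((0, 0), False),
]

def rule_is_perfect(data, feature_index):
    """'Learn' by checking whether a single feature predicts the label
    perfectly on the given data."""
    return all((x[feature_index] == 1) == y for x, y in data)

# On the training set, both features look like perfect predictors...
print(rule_is_perfect(train, 0))  # True  (wheels)
print(rule_is_perfect(train, 1))  # True  (background colour -- spurious!)

# ...but on fresh data, only the causal feature holds up.
test = [((1, 0), True), ((0, 1), False)]  # car indoors; grass with no car

def accuracy(data, feature_index):
    return sum((x[feature_index] == 1) == y for x, y in data) / len(data)

print(accuracy(test, 0))  # 1.0  -- wheels generalise
print(accuracy(test, 1))  # 0.0  -- the green background was superstition
```

The green-background rule is the algorithmic equivalent of a lucky charm: a correlation memorised from limited experience, mistaken for a cause.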

Article →

Support MIA

MIA relies on the support of its readers to exist. Please consider a donation to help us provide news, essays, podcasts and continuing education courses that explore alternatives to the current paradigm of psychiatric care. Your tax-deductible donation will help build a community devoted to creating such change.

