Mad In America’s Chapters

Preface

The World Health Organization has repeatedly found that people diagnosed with schizophrenia in the U.S. and other developed countries fare much worse than schizophrenia patients in poor countries. In the poor countries, a high percentage of patients recover and lead active social lives. In the U.S. and other developed countries, most patients so diagnosed become chronically ill. An understanding of this failure of modern medicine can be found by tracing the history of medical treatments for madness to the present day.

Part One: The Original Bedlam (1750-1900)

1. Bedlam in Medicine               

A look at early medical therapies for madness in Europe and the colonial U.S., and why those therapies were at times viewed as curative. The early therapies included bleeding patients, putting them in “tranquilizer chairs,” spinning them, dunking them in water, and even holding them underwater until they lost consciousness. Such therapies arose, in part, because of a belief that “reason” was the highest human faculty, and thus the mad, having lost their reason, were “brutes” and needed to be treated as such.

2. The Healing Hand of Kindness   

In the early 1800s, there arose a form of care in England and France known as moral treatment, which emphasized treating the insane with kindness and empathy, and avoiding medical remedies that “worked” by weakening the patient. Moral treatment emphasized that mental patients should be seen as part of the human family. This form of care produced good outcomes for more than 30 years.

Part Two: The Darkest Era (1900-1950)

3. Unfit to Breed                     

The eugenics movement took hold in the U.S. in the early years of the 20th century. This “science” preached that insanity was a genetic disorder, and that a gene for insanity was spreading throughout the U.S. population at alarming rates. As a result, the humanitarian attitudes common to moral treatment gave way to a belief, said to be grounded in science, that the mentally ill were a threat to the general well-being of the country. To counter this threat, eugenicists argued that the mentally ill should be segregated in asylums and forcibly sterilized. By the end of the 1920s, American society had embraced involuntary sterilization of the mentally ill as a progressive health measure, with the New York Times and numerous other newspapers editorializing in support of it. The asylums were also run on bare-bones budgets, a fiscal policy that was consistent with eugenic notions that devalued the mentally ill.

4. Too Much Intelligence     

After the fall of moral treatment in the late 1800s, American psychiatry once again devoted itself to finding physical remedies for psychotic disorders. Therapies of every kind were tried. These ranged from water therapies like the continuous bath, in which patients were kept in bathtubs for days on end, to gastrointestinal surgery. Doctors also tried fever, sleep, and refrigeration therapies (this last one involving cooling patients to the point that they lost consciousness). Finally, in the 1930s, there arose a trio of therapies–insulin coma therapy, Metrazol convulsive therapy, and electroshock–that all worked, as was freely acknowledged at the time, by damaging the brain.

5. Brain Damage as Miracle Therapy          

The fourth “brain-damaging” therapeutic that was embraced in asylum medicine in the 1930s and early 1940s was prefrontal lobotomy. This operation was pronounced safe and effective in numerous trials, and in 1949 its inventor, Portuguese neurologist Egas Moniz, was awarded the Nobel Prize in Medicine. Many physicians who tried it concluded that the operation could not possibly harm the mentally ill, and during the 1940s newspapers and magazines regularly wrote about this “miracle” therapy for curing mental disorders. Today, this operation is viewed as a mutilating surgery, and its rise and fall provides a cautionary tale about the capacity of a society to delude itself about the merits of its medical treatments for the mentally ill.

Part Three: Back to Bedlam (1950-1990s)

6. Modern-Day Alchemy            

In the early 1950s, chlorpromazine–marketed as Thorazine–was introduced for the treatment of psychotic disorders. Initially, physicians praised it for producing a “chemical lobotomy,” and noted that it also produced symptoms similar in kind to those of encephalitis lethargica. It was seen as a drug useful for quieting asylum patients, and not as a “cure” for psychosis. However, over the next decade, the drug underwent an image makeover (which was driven by the pharmaceutical companies), and by the early 1960s chlorpromazine and other newly introduced neuroleptics were hailed as “safe, antischizophrenic” medications.

7. The Patients’ Reality   

Neuroleptics “worked” by blocking dopamine receptors (and not by normalizing dopamine levels). The drugs occupied 70% to 90% of all D2 receptors, and this hindrance of dopamine function retarded movement, made people lethargic, and reduced visible symptoms of psychosis. Patients often vigorously resisted these drugs, stating that they induced great physical suffering and turned them into “zombies.” In addition, contrary to what the public has been led to believe about the drugs’ “efficacy,” the research literature clearly shows that the drugs made patients chronically ill and impaired recovery.

8. The Story We Told Ourselves            

Although there was ample evidence that chlorpromazine, haloperidol and other neuroleptics were making people worse off,  the story that we told ourselves about the merits of the drugs was quite different. The public was led to believe that those diagnosed with schizophrenia had overactive dopamine systems, and that neuroleptics normalized dopamine activity in the brain. By doing so,  the drugs were said to effectively knock down psychosis, and prevent relapse. However, every element of that medical paradigm is easily proven false.

9. Shame of a Nation          

Patient groups in the 1960s and 1970s often protested vigorously against the use of the medications, and fought in court for the right to forgo such treatment. What made their protests particularly powerful was that they came at the same time that the Soviets were using neuroleptics to punish dissidents. Although mental patients in the U.S. did win the right to refuse drug treatment, they often had to take the drugs in order to obtain social support services. They won the battle but lost the war. Also in the 1970s, the head of schizophrenia studies at the National Institute of Mental Health, Loren Mosher, conducted an experiment that compared two-year outcomes in patients treated with and without neuroleptics, and he, like others, found superior outcomes for patients who weren’t put on the drugs. He was subsequently forced out of the NIMH for pursuing this research agenda. The result of all this showed up in World Health Organization studies conducted in the 1970s and 1980s: Whereas the majority of patients in poor countries, where neuroleptics were much less used, had favorable long-term outcomes, most patients in “developed” countries became chronically ill, and suffered from a very “poor quality of life.”

10. Away from Nuremberg         

In the late 1940s, American psychiatric researchers intent on investigating the “biology” of psychosis began conducting studies in which they gave mentally ill patients a variety of chemical agents–LSD, amphetamines, methylphenidate–expected to worsen their symptoms. This type of federally funded experimentation went on for nearly 50 years, with more than 1,000 mentally ill patients ushered into such experiments, including some people who’d come to emergency rooms seeking help. In addition, the paper trail for these symptom-exacerbation experiments shows that American researchers regularly misled patients about their intentions.

Part Four: Mad Medicine Today (1990s-Present)

11. Not So Atypical       

In the early 1990s, new “atypical” drugs for schizophrenia were brought to market amid much fanfare, hailed as much safer and more effective than the old neuroleptics. However, those claims arose from drug trials paid for by the pharmaceutical companies, and the FDA, in its review of the data, concluded that the trials had been biased by design, and that there was no good evidence that the new drugs were better than the old.

12. Epilogue

In Finland today, researchers are reporting great results with care that emphasizes social support and the selective use of neuroleptics. Some patients appear to do better without the drugs, while others do better on low doses. Any reform of care in this country will require a willingness to explore alternatives like the Finnish program.

