Psychotic episodes are devastating for the individuals who have them, their friends, and their families. Wouldn’t it be wonderful if individuals could receive treatment before the first psychotic episode strikes, so that it could be avoided altogether? After all, “an ounce of prevention is worth a pound of cure.”[1] Unfortunately, in psychiatry we are a long way from achieving primary prevention: there is no vaccine for psychosis, nor have clear genetic markers for severe and persistent forms of mental illness been identified.
Throughout the 20th century, psychiatrists therefore focused their attention on the early detection of signs and symptoms of mental ill health, assuming that early treatment would stop conditions from becoming worse. However, the ideal of secondary prevention can only be realized if these early signs and symptoms, or a “pre-psychotic syndrome,” can be identified successfully. During the 20th century, psychiatrists defined many such “pre-mental illness syndromes”; unfortunately, it has not always been demonstrated that they indeed constitute the early phases of severe and persistent forms of mental illness.
In June 2011, a number of Australian newspapers reported that a high-profile medical trial targeting psychosis in young adults would not go ahead. It was to be conducted by Prof Patrick McGorry, who had been Australian of the Year (an honorary and mostly symbolic title bestowed by the Australian government on an unusually deserving citizen advocating worthy causes). In the proposed trial, youths as young as 15 would receive Seroquel (quetiapine) when they were first diagnosed, not with psychosis but with attenuated psychosis syndrome (previously called psychosis risk syndrome). Treating young adults with this syndrome would nip the danger in the bud: their potential psychosis would be treated before it even arose. The trial was to have been sponsored by the drug’s manufacturer, AstraZeneca, which, like many pharmaceutical companies, was probably eager to test its medication on a younger age group to expand the market for its products. What could be wrong with such a commendable initiative?
Attenuated psychosis syndrome is proposed for inclusion in DSM-5 and has attracted an unusual amount of discussion (and dissent). In particular, its relation to psychosis is unclear. Emeritus professor Allen Frances, MD, who chaired the DSM-IV Task Force, is a fierce critic of the concept. According to him, there is hardly any evidence that attenuated psychosis syndrome, if left untreated, will ultimately develop into full-blown psychosis (current estimates are that this will happen in merely 10% to 20% of cases). The number of “false positives” is therefore staggering. Dr Frances warns that treating a group of individuals of whom 80% to 90% would never become psychotic appears to be a waste of resources and a rather risky proposition.
McGorry’s proposed trial was widely criticized by psychiatrists worldwide because it raises a number of significant ethical problems. First, there is the high number of false positives who would receive medication for a condition they would never develop even if left untreated. The trial would not target incipient psychosis but would probably address more or less unrelated conditions. This leads to the second ethical problem: a great number of young people would be put on a medication they do not need. This would not matter so much if only aspirin or vitamins were being tested. Unfortunately, Seroquel has many highly undesirable side effects, including extreme weight gain and diabetes. It should be prescribed only when absolutely necessary.
Last April, AstraZeneca paid $525 million to settle a lawsuit brought by the United States government after allegations that it had paid kickbacks to physicians while promoting the drug for unapproved uses by children, the elderly, veterans, and prisoners.[2] It has also settled, for $647 million, product liability cases over misleading patients about the risks of diabetes and weight gain associated with the drug. Total legal expenses associated with Seroquel now stand at $1.9 billion, which constitutes less than 5 months of Seroquel sales. Not a great medication to prescribe to individuals who do not need it.
Prof McGorry’s proposed research has attracted (unfavorable) media attention (in Australia); I highlight it here not because it is exceptional or unusual in any way, but because it illustrates ways of thinking that have been part and parcel of 20th-century psychiatry. The most important of these is the ideal of secondary prevention in psychiatry: it is imperative to treat psychiatric conditions when they first appear, when they are not as serious as they could become if left untreated. This prevents them from becoming worse and less responsive to treatment. This strategy is of course commendable when there is a proven link between these less serious conditions and more serious ones. In most cases, such a link has simply been assumed to exist; it has hardly ever been demonstrated.
The emphasis on prevention is not unique to psychiatry but characterizes developments in several (if not all) medical specialties. In days long since gone, one would see a dentist only when one’s toothache became unbearable; today, dentists fill cavities and polish our teeth so that we will never end up in that situation. They also whiten and straighten our teeth, although this prevents neither toothaches nor tooth decay. The demands we make of physicians (and dentists) far exceed those made by the average patient a hundred years ago. Physicians now do a lot more than treat serious illness, and we expect them to.
Most historians have discerned two themes in the history of 20th-century psychiatry. First, there has been a broadening of the definition of what constitutes mental ill health. A wide range of conditions in between mental health and severe and persistent forms of mental illness have been identified and investigated. The formerly almost absolute distinction between mental health and mental illness has been replaced by a wide spectrum of conditions, which has blurred the distinction between the normal and the abnormal.
Second, conditions on this spectrum have increasingly become the target of psychiatric intervention; psychiatrists now treat a variety of conditions less serious than severe and persistent forms of mental illness but considered definitely in need of treatment. During the 20th century, prevention was the most important argument holding both themes together: treating less serious psychiatric conditions prevents them from becoming worse, because it has been assumed that these conditions will inevitably become more serious over time. It was well into the 20th century before any effective medical treatments for severe and persistent forms of mental illness were developed. Mental hospitals were severely overcrowded, and little could be done for their inmates. Therapeutic nihilism reigned. Any type of intervention that promised to prevent mental illness from developing or becoming worse was therefore worth considering.
The blurring of the distinction between normal and abnormal is generally associated with Sigmund Freud: according to psychoanalysis, nobody is entirely normal, although some individuals are better at keeping their unconscious desires in check than others, thereby maintaining an appearance of mental health and normality. Despite differences in appearance, we are all to a certain extent mad. Views like these open up unexpected vistas for psychiatric attention: behind the everyday veils of normality, happiness, and adjustment hide psychopathology, lust, and perversion.
Nevertheless, the blurring of the distinction between the normal and the abnormal is not unique to psychoanalysis. The historian of psychiatry Elizabeth Lunbeck has analyzed how, in the 1910s, American psychiatrists proposed psychopathy as a category to designate forms of psychopathology that had previously gone unrecognized because those affected had been able to pass as normal.[3] No longer would mental illness, in the form of insanity, be confined to insane asylums, where it could be contained successfully. On the contrary, the widespread presence of psychopaths everywhere, hiding under the veil of normality, threatened the social fabric of American society. These views made psychiatric intervention even more compelling: not only would it remove sick individuals from public life, it could also protect the social order.
In my own work on the history of the mental hygiene movement, similar themes appear. In the 1920s, mental hygienists launched a major project on the treatment of juvenile delinquency to prevent children from developing lifelong criminal careers. The concept of adjustment as an essential marker of mental health, central to the philosophy of mental hygiene, brought a great range of human behavior under the purview of psychiatry. Instead of treating maladjustment in adults (for example, adults with mental illness), mental hygienists argued that treating maladjustment in children (for example, children with enuresis or temper tantrums) would prevent serious forms of mental illness from arising later in life.
Once all forms of undesirable conduct had been labeled “maladjustment,” it seemed self-evident that relatively innocent forms of behavior would become serious later on. Rather than punish delinquency, the therapeutic treatment of children with a “pre-delinquent syndrome” could be expected to bear fruit. Unfortunately, the central assumptions of this approach were never put to the test, and they would most certainly not have held up had they been investigated properly. Led by their convictions, psychiatrists and mental hygienists were not bothered by this. They focused on lesser complaints while neglecting the plight of the mentally ill in increasingly overcrowded mental hospitals (leaving them to somatic psychiatrists who experimented with insulin therapy, metrazol shock therapy, ECT, and lobotomy).
The mistaken impression could arise that the two themes in the history of psychiatry identified thus far (blurring the distinction between normal and abnormal, and targeting less serious states for psychiatric intervention) were characteristic only of psychoanalysis or of other psychiatric approaches focusing on mental and behavioral factors. It would then be easy to dismiss them on the assumption that, with psychiatry becoming increasingly biological and scientific, such trends have since been reversed.
Nothing could be farther from the truth, however. During the last 20 years or so, these trends have developed in an unprecedented way in psychopharmacological psychiatry. In the 1950s, only individuals with severe and persistent forms of mental illness received medications such as Thorazine. Today, fidgety and distracted kids as well as shy adults are portrayed as individuals who could benefit from psychopharmacology. An increasingly wide range of psychiatric medications is prescribed to young children, with the idea that early intervention will prevent problems from getting worse. It is this mind-set, now more than a century old, that made McGorry’s research project appear innovative and cutting edge.
Dr McGorry has since introduced a slight modification to his study, which will now go ahead: instead of Seroquel, he will test the efficacy of fish oil.
References
1. Franklin B. Protection of towns from fire (Letter). The Pennsylvania Gazette. February 4, 1735.
2. Wilson D. AstraZeneca settles most Seroquel suits. New York Times. July 28, 2011. http://prescriptions.blogs.nytimes.com/2011/07/28/astrazeneca-settles-most-seroquel-suits/. Accessed October 27, 2011.
3. Lunbeck E. The Psychiatric Persuasion: Knowledge, Gender, and Power in Modern America. Princeton, NJ: Princeton University Press; 1994.