Psychiatric Times, Vol 30, No 1

Planck's Law of Generations

Planck's Law of Generations: scientific change doesn't happen by changing minds, but by changing generations.

Editor’s note: Please see the response to Dr Ghaemi’s essay, “The Older Psychiatrist in an Era of ‘Unprecedented Change,’” by James L. Knoll IV, MD.

John Tyndall (1881) on opposition to anesthesia during surgery: “It is interesting and indeed pathetic to observe how long a discovery of priceless value to humanity may be hidden away, or rather lie openly revealed, before the final and apparently obvious step is taken towards its practical application.”1

We should respect our elders, if for no other reason than that we will all (we hope) one day be old. I come from an Islamic culture that is much more generous in its veneration of elders than Western culture. I share that perspective. I am now neither young nor old, and so, in the middle of my passage, I have become increasingly preoccupied with what it means to age, and, unfortunately, I’ve come to the unwelcome conclusion that age frequently brings with it many drawbacks, not so much for those aging, but for everyone else. I say this not to criticize others, but to name a true problem in human affairs. So, knowing that political correctness would require otherwise, I’d like to address the question of how the young and the old compare in their approaches to knowledge.

An initial insight comes from the great German physicist Max Planck, who said: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”2(p150) I’ll simplify this as Planck’s Law of Generations: scientific change doesn’t happen by changing minds, but by changing generations. This is an immensely wise, and depressing, thought, a thought most of us are afraid to discuss explicitly. Let’s do so.

In America, our democratic heaven, we citizens find it easy to moralize against injustice. We justly denounce racism, and sexism, and stigma against mental illness, and discrimination by sexual orientation, and, we can now add, “ageism.” We especially dislike ageism because, unlike the other categories, where we might moralize out of abstract sympathy, we know we will all age, and one day, being aged, we should dislike being disdained.

So we call it ageism if any of us should criticize another on the basis of his or her age, whether old or young. It used to be that the young were discriminated against; it still is the case in some parts of the world, such as my homeland in the Middle East, or in the Far East. Ancestor worship used to be formal, and still exists informally. In that part of the world, men (usually) in their 60s and 70s are more trusted than those youngsters in their 30s and 40s, not to mention the postadolescents in their 20s.

The situation is reversed in Western cultures, or so we think. We like our presidents to be in their 40s or 50s; rarely are they elected in their 60s, and almost never older. Our tabloids and magazines celebrate our virile young celebrities, usually in their teens and 20s; the magazines mention 30-something celebrities less frequently, and those unfortunate enough to be 40 or older even less. Ours is a young nation, and a young culture; so, we are told, be on guard against ageism.

This is so, but the problem of ageism should not prevent us from understanding the problem of generations, for which there is both scientific and historical evidence.1

Here’s the science.3 In the early 1980s, randomized clinical trials (RCTs) began to show the falsity of the common belief that vitamin E supplementation was beneficial for cardiovascular disease. Yet for 15 years, most authors of scientific articles continued to claim benefit for vitamin E, and during that period, despite accumulation of evidence from multiple RCTs to the contrary, about half of scientific articles still claimed efficacy for that disproven treatment. Similarly with β-carotene for cancer prevention and estrogen for Alzheimer dementia prevention, RCTs showed inefficacy 20 years before the authors of scientific articles began to admit that fact.

How long is 20 years? A generation of human beings.

Turn to history. There has never been, claims one historian of medicine,1 an example of a now widely accepted historic medical advance (like the germ theory of disease, or anesthetic surgery, or disproof of the efficacy of bleeding for pneumonia, or washing hands before delivering a baby, or widespread antibiotic use) without a delay of at least 40 years, and usually longer, between the presence of sufficient scientific evidence and widespread acceptance by the medical community.

How long is 40 years? Two generations of human beings.

The rule of thumb, proven by science and history, is that current generations always reject new truths. Can we ever get beyond this depressing fact?

It’s not ageism; it’s honesty, the most brutal honesty, to admit that we, not others, are the problem, and even more difficult, to admit that our teachers and our leaders have been part of the problem more often than they have been part of the solution.

We can find another source of insight in a second late-19th-century scientist, the physician William Osler, who got into trouble for a talk in which he jokingly recommended chloroform for those over 60. He gave that speech as he neared that age himself, when retiring from Johns Hopkins, where he had founded and chaired the department of medicine. If he truly advocated euthanasia, he was advocating suicide. Osler’s comment was symbolic, not vulgar. Yet even symbolically, we don’t want to hear it. But it is worthwhile to listen:

“It may be maintained that all the great advances have come from men under 40, so the history of the world shows that a very large proportion of the evils may be traced to the sexagenarians: nearly all the great mistakes politically and socially, all of the worst poems, most of the bad pictures, a majority of the bad novels, not a few of the bad sermons and speeches.”4(pp382-383) Osler argued that almost all new and original ideas or projects are started by people in their 20s and 30s. Even if great work is produced after 40, it is almost always conceived, or started, earlier. Most persons think their great thoughts early in life and spend the rest of their lives proving, or expanding, or teaching those thoughts. It is rare for any human being to make a truly novel, important, original contribution to humankind after age 40, or 50, or 60, one never considered at all before those ages.

Or, to put it another way, as we get older, we stop changing our minds; our ideas become frozen; our minds become like museums, where the furniture doesn’t change, but is merely dusted off and spruced up. When we are younger, we have no past to defend; we are just beginning to furnish the houses of our mind; we take in new ideas, test them, experiment, accept, reject. Eventually, we choose the chairs and tables we like, and we settle down; our minds, made up of those belongings, settle down too. We are loath to make radical changes afterward.

But what about the benefits of experience? As we age, we gain more clinical experience, which is thought to reflect wisdom. Often, the reverse is the case. As we get older, we learn so much, we see so much, that eventually our minds are full, and then we become blind to all that we have not already seen or learned. Osler called it “mind-blindness,” the most tragic problem in human knowledge: “It is not . . . that some people do not know what to do with truth when it is offered to them, but the tragic fate is to reach, after years of patient search, a condition of mind-blindness in which the truth is not recognized, though it stares you in the face. . . . It is one of the great tragedies of life that every truth has to struggle to acceptance against honest but mind-blind students.”4(p398)

There are exceptions; there are always exceptions. Some of us like to move; we don’t settle down. We continue to have new and different ideas as we age; we even change our minds into our 80s, sometimes radically. Secretary of Defense Robert McNamara, the architect of the Vietnam War, admitted he had been wrong when he was in his 80s. Governor George Wallace, the paragon of segregation, admitted he had been wrong when he was in his 60s. In addition to my father, Kamal Ghaemi, I’ve had excellent teachers, mentors, and friends who have retained their mental flexibility into their 60s and beyond, like Ross Baldessarini, Frederick Goodwin, Paul Roazen, Leston Havens, Kenneth Kendler, Jules Angst, Athanasios Koukopoulos, and Ronald Pies. Havens always taught us, “Take your theories lightly,” and, to the end of his life, he was still rearranging the furniture of his mind. My friend, Dr Marc Agronin,5 has written about how some people manage to age well, including in the intellectual sense I’m describing: he relates examples from his experiences with Erik Erikson and Senator George McGovern, among others. An observer once told me that he saw Erikson, in his 70s, walk up to a Harvard dean at the faculty club and say, “Dean, can I be given a sabbatical for the next year? I have been thinking that most of my ideas may have been wrong.” These exceptions give us hope that some of us may evade Planck’s law; but they are, unfortunately, a minority. In contrast to these exceptions, over 2 decades of active involvement in our profession, I’ve known many more psychiatric leaders who follow Planck’s law than who break it.

This is the problem of generations: new ideas tend to grow not because contemporaries are convinced, but because unborn generations are.

So often I hear experienced psychiatric colleagues, usually past Osler’s cutoffs, talk simply about the need to go back to ideas from decades ago: the biopsychosocial model is the most commonly cited, as if turning back the clock would solve the problems of today.6 They mistake the memories of youth for visions of the future. They don’t seem to realize that they haven’t changed their minds on any central ideas of importance since Richard Nixon was president. They don’t seem to be bothered by the idea that progress, at least in science, usually doesn’t happen by sticking to the same ideas all the time.

I have already passed Osler’s first cutoff, and I can only hope to join McNamara and Wallace and Havens in changing my mind about something important in the future. In the meantime, I would like to propose a test of who among the older can presume to advise the younger: before recommending the beliefs of your youth for today’s generation, think of something that you fervently believed while young and now realize is false. If you can’t come up with the latter, withhold the former. This is not to say that Darwin should have changed his mind about evolution in his old age; but his enemies should have.

I can cite some personal examples of changing my mind, now that I’m in my mid-40s. Until the last few years, I believed that antidepressants were much more effective in major depressive disorder (MDD) than I can bring myself to believe now, after the results of studies like STAR*D. After I critiqued those results, interpreting them as showing relatively low antidepressant efficacy,7 a colleague commented to me: “Nassir, you can’t expect people who have spent their entire careers trying to show that antidepressants work very well to suddenly accept that they don’t work so well.” Another example: I assumed for a long time that the biopsychosocial model was a benign, pleasant philosophy. Over time, after studying it, I determined that it has been used for decades to hide a certain postmodernist relativism about truth, an unwillingness to give science a higher priority than other opinions. This connects to a third recent example. I naively believed, until recently, that our DSM leaders based psychiatric diagnoses on science wherever possible, and I defended the DSM system, by and large, against those who were skeptical of it. In just the past few years, I’ve learned, based on their own confession,8 that science mattered least for past DSM leaders: they “pragmatically” made diagnoses up, based on their personal views about what was best for the profession.9 So I’ve changed my view: I’ve had to accept the criticism that DSM revisions are mostly unscientific. Furthermore, I’ve concluded that we have failed to progress in psychiatric knowledge for the past half century, not because of the complexity of mind and brain, but because of the “pragmatic” gerrymandering of our psychiatric diagnoses, which fail, naturally, to correlate with nature.

Keep going, our elders tell us: we are still on the right path, even though the path has gone nowhere. Ezra Pound’s World War I poem doesn’t exactly apply: “. . . Walked eye-deep in Hell, believing in old men’s lies, then unbelieving came home, home to a lie. . . .”10 The errors of our past leaders, whether in psychopharmacology or DSM or the biopsychosocial model, weren’t lies, because they weren’t intentional; that is worse. Napoleon once killed a Duke for plotting against him; the outcry that followed was more harmful to Napoleon than the plots of the Duke. Said Talleyrand years later: it was worse than a crime; it was a mistake. So, too, with our past psychiatric leaders: they didn’t lie, or knowingly harm. It was worse: they made mistakes with the best of intentions; hence, they still don’t realize it.

Positions of power, even in our young nation, are held mostly by persons in their 50s and 60s; these are exactly the decades that resist novelty, as Osler rightly noted. Almost the entire biological establishment opposed Darwin. The great British psychiatrist Aubrey Lewis, so unjustly unappreciated by his American cousins, once noted that “in positions where freshness is all, the old are not left to clog and petrify affairs; for we have it on wise authority that men of age object too much, consult too long, adventure too little, repent too soon, and seldom drive business home to the full period, but content themselves with a mediocrity of success.”11 I might slightly correct Lewis to note that I see less and less repentance with age, and one might say that Lewis himself, having written that comment in his 40s, failed to prove it wrong in his 60s and 70s. In older age, he abetted an unwise attack on lithium, partly out of his lifelong attachment to the social aspects of psychiatry as against drugs, and thereby he unjustly harmed acceptance of a drug that some consider our most effective medication ever, a drug that, to this day, is too often avoided by patients who don’t know better, and by clinicians who should.12

The philosopher William James13 consciously took an attitude of openness toward every new idea: he first accepted it wholeheartedly for as long as he could; only afterward did he begin to analyze or critique it. Most of us barely begin to hear a new idea before we automatically criticize it in our minds.

For those who are young now, realize this: you will never be as open to new ideas as you are now. Pay close attention to your attitude, and try to keep it forever, although all the forces of nature and of society will oppose you.

If only we could be as wisely naive as James, as children are, an ability we increasingly lose as we get older, and less wise. Let us respect our elders, yes, but let’s also respect ourselves and, in honor of our youth, respect truth above all. Amicus Plato, sed magis amica veritas. Loosely translated: I love Plato, but I love truth even more.

References

1. Wootton D. Bad Medicine: Doctors Doing Harm Since Hippocrates. New York: Oxford University Press; 2007.

2. Kuhn TS. The Structure of Scientific Revolutions. 2nd ed. Chicago: University of Chicago Press; 1970:150.

3. Tatsioni A, Bonitsis NG, Ioannidis JP. Persistence of contradicted claims in the literature. JAMA. 2007;298:2517-2526.

4. Osler W. Aequanimitas. Philadelphia: The Blakiston Company; 1948.

5. Agronin ME. How We Age: A Doctor’s Journey Into the Heart of Growing Old. New York: Da Capo Lifelong Books; 2011.

6. Ghaemi SN. The Rise and Fall of the Biopsychosocial Model: Reconciling Art and Science in Psychiatry. Baltimore: Johns Hopkins University Press; 2009.

7. Ghaemi SN. Why antidepressants are not antidepressants: STEP-BD, STAR*D, and the return of neurotic depression. Bipolar Disord. 2008;10:957-968.

8. Frances AJ. DSM5 should not expand bipolar II disorder. http://www.psychologytoday.com/blog/dsm5-in-distress/201004/dsm5-should-not-expand-bipolar-ii-disorder. Accessed November 15, 2012.

9. Ghaemi N. Mood swings. DSM 5 and bipolar disorder: science versus politics. http://www.psychologytoday.com/blog/mood-swings/201004/dsm-5-and-bipolar-disorder-science-versus-politics; and http://www.psychiatrictimes.com/mood-disorders/content/article/10168/1642824. Accessed November 15, 2012.

10. Pound E. Hugh Selwyn Mauberley. 1920. Reprint, Whitefish, MT: Kessinger Publishing; 2010.

11. Lewis A. The problem of ageing. Lancet. 1944;ii:569.

12. Shorter E. The history of lithium therapy. Bipolar Disord. 2009;11(suppl 2):4-9.

13. Simon S, ed. William James Remembered. Lincoln, NE: University of Nebraska Press; 1996.
