Psychiatric Times, Vol 39, Issue 5

Thought, CircRNAs, and a Neurodigital Hypothesis

"Physiologically speaking, what is a thought?” The truth is, we have been unable to answer this question—until now.

Circuit brain image: Vector Tradition/Adobe Stock

CLINICAL REFLECTIONS

If you ask a cardiologist, “Physiologically speaking, what is a heartbeat?” they can bore you to tears with details on anatomy, electrophysiology, hemodynamics, innervation, and pulmonary and peripheral circulation. But if you ask a psychiatrist the entirely pertinent question, “Physiologically speaking, what is a thought?” the truth is, we have been without a clue—until now.

Of all psychiatry’s mysteries, this is the most crucial void in our scientific knowledge: how the brain executes the higher functions that are the actual focus of psychiatric practice—the generation of thought and behavior. This knowledge gap has been responsible for psychiatry’s own split personality—historically swinging back and forth between a biological focus on the brain and a psychological focus on the mind. Just as the mystery of gravity has kept physics searching for a unified “theory of everything,” this missing piece of science has eluded psychiatry since its inception.

The Brain’s Digital Revolution

Psychiatry’s last revolution occurred in the late 1970s, when significant advances in brain physiology triggered the emergence of the current biological era of psychiatric treatment and shuttled psychoanalysis out the door. But this scientific surge has yet to advance our physiological understanding of thought. For more than 70 years, psychiatry’s prevailing model of memory and learning has remained the one proposed in 1949 by Donald Hebb: synaptic plasticity. Hebb hypothesized that fortified neuronal connections assemble to form an engram—any permanent change in the brain that would account for the existence of a memory. A single memory might engage a network of several such engrams dispersed across different areas of the brain, referred to as an engram complex.

This sounds a lot like an analog computer—a machine that does not use digital processing and that became largely obsolete by the 1960s. In a 2010 issue of Scientific American, cognitive neuroscientist Paul Reber estimated that the human brain has a memory capacity of about 2.5 petabytes—which is 2500 terabytes, or 2.5 million gigabytes.1 It is hard to believe that Hebb’s model, conceived at a time when digital processing was in its infancy, could account for so much computing power. But recent evidence suggests that our brain may be much more up-to-date than that, having already completed its own digital revolution.
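
For readers who want to check those unit conversions, here is a minimal Python sketch (assuming decimal prefixes, as the article’s figures imply):

```python
# Reber's 2.5-petabyte estimate, converted with decimal (SI) prefixes
petabytes = 2.5
terabytes = petabytes * 1_000      # 1 PB = 1,000 TB
gigabytes = terabytes * 1_000      # 1 TB = 1,000 GB
print(f"{petabytes} PB = {terabytes:,.0f} TB = {gigabytes:,.0f} GB")
# Output: 2.5 PB = 2,500 TB = 2,500,000 GB
```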

What triggered me to consider this possibility was a 2014 video posted on YouTube by the Albert Einstein College of Medicine showing the incorporation of a memory: fluorescently tagged beta-actin messenger RNA traveling through the dendrites of a hippocampal neuron in a mouse upon exposure to light.2 This intrigued me because both RNA and DNA are commonly regarded as forms of digital memory storage, utilizing a quaternary code of nucleotides rather than the binary code that is our industry standard. This led me to consider that such digital memory might be used to execute digital processing as well—within an elaborate software construction of nucleic acids that we call a mind.
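
To make the digital analogy concrete, here is a minimal Python sketch showing how a 4-letter nucleotide alphabet carries exactly 2 bits per base, so that any binary data can, in principle, be written as a nucleotide sequence. The base-to-bits mapping below is an arbitrary illustration for this sketch, not a biological code:

```python
# Hypothetical mapping of nucleotides to 2-bit values (illustrative only)
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
BITS_TO_BASE = {bits: base for base, bits in BASE_TO_BITS.items()}

def bytes_to_dna(data: bytes) -> str:
    """Encode arbitrary binary data as a nucleotide sequence (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    """Decode a nucleotide sequence back into the original bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

sequence = bytes_to_dna(b"thought")
print(sequence)                   # CTCACGGACGTTCTCCCGCTCGGACTCA (28 bases)
assert dna_to_bytes(sequence) == b"thought"
```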

Since then, I have found an impressive amount of information supporting this neurodigital hypothesis for the nature of thought. One source is an eye-opening Wikipedia entry, with 54 citations, on the subject of “DNA computing.”3 There you will discover “an emerging branch of computing which uses DNA, biochemistry, and molecular biology hardware, instead of traditional electronic computing,” and learn that this capacity was first demonstrated in 1994. DNA processing is slow relative to our computers, but this is compensated for by its potential to run massively parallel computations.

Then there is the hot topic of “DNA digital data storage,” which generates more than 6000 results on Google, including many companies already in business.4 Its primary attraction is information density: it is estimated that 1 gram of DNA could store up to 455 billion gigabytes of information, and that the entirety of humanity’s data could thus be stored in 1 room. Other advantages include the rapid transportation of huge amounts of data and vastly increased stability of data over long time periods. It is an expensive process, but its financial promise is attracting considerable speculation.
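
That density figure is easy to reproduce from first principles. Here is a back-of-the-envelope Python sketch, assuming 2 bits per nucleotide and the standard textbook average of roughly 330 g/mol per single-stranded nucleotide:

```python
# Rough reproduction of the "455 billion GB per gram" estimate
AVOGADRO = 6.022e23        # molecules per mole
NT_MASS_G_PER_MOL = 330    # approximate mass of one ssDNA nucleotide
BITS_PER_NT = 2            # 4 bases (A/C/G/T) = 2 bits of information each

nucleotides_per_gram = AVOGADRO / NT_MASS_G_PER_MOL   # ~1.8e21
bits_per_gram = nucleotides_per_gram * BITS_PER_NT
gigabytes_per_gram = bits_per_gram / 8 / 1e9
print(f"~{gigabytes_per_gram:.3e} GB per gram")       # ~4.562e+11, ie ~456 billion GB
```

Landing within 1% of the published figure suggests that estimate was derived in much the same way.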

These novel applications demonstrate the capacity of nucleic acids to store and process digital data—but they do not prove that living organisms utilize them in this manner. However, in May 2018, eNeuro published the results of a UCLA study headed by David Glanzman that succeeded in using RNA to transfer learned behavior from 1 animal to another.5

Aplysia californica, a species of sea slug, was used for this study because these slugs have some of the largest neurons found in nature. When tapped by a researcher, these slugs typically exhibited a defensive withdrawal reflex lasting about 1 second. The experimental design called for such taps to be followed by the administration of mild electric shocks to the slugs’ tails to prolong their response, each slug receiving a series of 5 shocks administered 20 minutes apart on 1 day and repeated the next day. This training resulted in sensitization of the slugs, manifested by the increased intensity and length of their defensive withdrawal reflex when given ordinary taps. These prolonged contractions lasted an average of 50 seconds, rather than the typical 1 second.

After 2 days of shocks, the researchers extracted RNA from the nervous systems of these sensitized slugs. Seven untrained slugs received an injection of this sensitized RNA, while a control group of 7 received RNA from other untrained slugs. All 7 untrained slugs receiving the sensitized RNA exhibited the prolonged response seen in the trained slugs, lasting an average of 40 seconds in this group; the controls exhibited no such prolongation of response. Subsequent in vitro studies revealed that the sensitized RNA produced increased excitability in sensory neurons, but not motor neurons. Glanzman noted that these findings negate the longstanding model of synaptic plasticity, stating, “If memories were stored at synapses, there is no way our experiment would have worked.”6 He proposes that memories are instead stored in the nuclei of neurons, elevating RNA’s prospective role as that fabled engram—the fixed change in the brain accounting for the existence of memory.

CircRNAs

Equally exciting was a review article published in The Neuroscientist in October 2020 titled “Circular RNAs in the Brain: A Possible Role in Memory?”7 The article cites a large body of recent evidence supporting a critical role for regulatory RNAs in coordinating “experience-dependent neural plasticity” (aka learning) and focuses on the contribution of a structurally distinct class of RNAs known as circular RNAs, or circRNAs.

CircRNAs were first visualized by electron microscopy in 1979 and were long thought to be artifacts of splicing or rare oddities derived from only a few genes. However, thousands of circRNAs have since been identified, and they are particularly enriched in brain tissue. These closed-loop, single-stranded RNA molecules are highly stable and long-lived compared with other forms of RNA because of their structural resistance to exonuclease-mediated degradation. Their structure is similar to that of viroids—virus-like RNA pathogens able to infect flowering plants without the protection of a virus’ protein coat—and of plasmids, the autonomous circular DNAs of bacteria; our own mitochondria likewise carry circular genomes. Given their structural stability, circRNAs could serve as “memory molecules,” transferring packets of information between brain cells. Current evidence suggests that this class of RNAs contributes greatly to higher-order functions such as cognition and memory.

The authors note that approximately 98% of the output of the human genome does not code for proteins and is thus classified as “noncoding.” Such DNA is responsible for the production of noncoding RNAs (ncRNAs), which for many years were dismissively referred to as “junk RNA.” More recently, however, they have been recognized as performing regulatory roles critical to the direction of brain function and behavior. The authors also note that the noncoding fraction of an organism’s genome increases in proportion to its evolutionary complexity, while the number of protein-coding genes remains about the same. This implies that higher-order cognition may be a product of this abundance of regulatory architecture in our genome, which in turn yields an abundance of circRNAs.

The article goes on to suggest that dysregulation of circRNAs could contribute to the pathophysiologies of Alzheimer disease, depression, schizophrenia, and autism spectrum disorder. The authors conclude that “with the discovery of circRNAs, our understanding of how molecular networks function and communicate with each other, both intracellularly and intercellularly, may soon be revised.…Given their relatively recent discovery, there is still a lot to uncover about the regulation and function of circRNAs and their involvement in cognition.”

This “noncoding DNA” and “junk RNA” could constitute the operating system of the human brain, establishing the data framework that an infant uses for its limited operations after birth and its subsequent growth. This would likely include the firmware for maintenance of homeostasis, our sensory peripherals (vision, hearing, smell, taste, touch), and our emergent motor activities (suckling, smiling, grasping, etc). It could explain the universal vocabulary of expressions shared by humanity seen even in infants, like smiling, cooing, grimacing, crying, and laughing. It could even vindicate Carl Jung’s proposition that all humanity has a shared collective unconscious—a segment of the mind that is genetically inherited and not shaped by human experience.

Some might think this is too amazing a proposition to consider. But is it any more amazing than all the things a smartphone does by manipulating 1s and 0s? Or the fact that primordial plants, after millions of years of evolutionary trial and error, managed to formulate chlorophyll, endowing them with the power to convert carbon dioxide, water, and sunlight into sugar? We live in a world overflowing with scientific wonders, within a universe likewise overflowing with them.

Concluding Thoughts

To convince me that digital processing cannot be happening in our brains, you now have to convince me that life has been sitting on a digital storage system that has that capacity but has never bothered to utilize it—which, as we all know, would be very out of character for life. Now that we know our nucleic acids can perform these functions, the notion that brains might utilize a process with which we are already very familiar in our day-to-day life seems rather mundane, or even predictable.

So how can we modify the software of our brain-mind? External modification of data in a computer requires the input of new data by using a peripheral, such as a keyboard, disc drive, or modem. Our brain-mind can likewise use our organs of sensation to obtain information from the environment. Have you ever read a book, or had a conversation, that had a profound effect on how you perceive your life? Have you ever seen a movie that changed how you felt about its subject matter? Do certain songs trigger certain thoughts, feelings, or memories? All these examples reveal the psychic power of external information. That power is most clearly demonstrated by posttraumatic stress disorder, an onslaught of negative information flooding the sensations of someone experiencing a major traumatic event—enough information, sadly, to change one’s mind and life forever.

I know that proving this neurodigital hypothesis is far beyond my capabilities. I am advancing this model not to presume hard scientific knowledge, but to encourage scientific skepticism toward the simplistic assumptions that drive psychiatry’s current treatment model. The computer is our best available model for psychiatric physiology, if for no other reason than that it clarifies the relationship of the mind to the brain and calls attention to the limits of our medication-based treatment model.

It is my contention that brains deserve the respect that we give to computers. You would not accept a cheaper hardware intervention to resolve a software problem in your MacBook Pro. So why do we give antidepressants to patients struggling with marital problems? In response to economic forces, we psychiatrists typically default to medication—when our patients might benefit more from some restorative information.

This exciting research points toward a wholesale reassessment of how the brain works. Having already witnessed one revolution in psychiatry, it seems likely that another is coming—after which many of our current assumptions and interventions will be regarded as less than state-of-the-art. As to what comes next… I think that topic is ripe for discussion.

Dr Minot is assistant medical director of psychiatry with Maine General Health and clinical assistant professor of psychiatry with Tufts University School of Medicine.

References

1. Reber P. What is the memory capacity of the human brain? Scientific American. May 1, 2010. Accessed March 24, 2022. https://www.scientificamerican.com/article/what-is-the-memory-capacity/

2. The molecular basis of memory: tracking mRNA in brain cells in real time. Albert Einstein College of Medicine. 2014. Accessed March 24, 2022. https://www.youtube.com/watch?v=6MCf-6It0Zg&ab_channel=AlbertEinsteinCollegeofMedicine

3. DNA computing. Wikipedia. Accessed March 24, 2022. https://en.wikipedia.org/wiki/DNA_computing

4. DNA digital data storage. Wikipedia. Accessed March 24, 2022. https://en.wikipedia.org/wiki/DNA_digital_data_storage

5. Bédécarrats A, Chen S, Pearce K, et al. RNA from trained Aplysia can induce an epigenetic engram for long-term sensitization in untrained Aplysia. eNeuro. 2018;5(3).

6. Wolpert S. UCLA biologists ‘transfer’ a memory. UCLA Newsroom. May 14, 2018. Accessed March 24, 2022. https://newsroom.ucla.edu/releases/ucla-biologists-transfer-a-memory

7. Zajaczkowski EL, Bredy TW. Circular RNAs in the brain: a possible role in memory? Neuroscientist. 2021;27(5):473-486.
