
In the early 20th century, there were a few psychiatrists ― perhaps intrepid, perhaps deluded ― who, despite no formal surgical training, were impelled to breach their patients’ skulls.

Inspired by the surgical innovations of the day, they were curious about the physical nature of the psyche and whether invasive surgical procedures might be a cure for psychiatric disease.

Today it sounds ridiculous. In modern medical school classes, there tends not to be much overlap between those bound for careers in psychiatry and surgery. But a century ago, effective treatments for most mental illnesses were scant. The nascent field of psychiatry was only beginning to categorize and define mental maladies.

Surgery, meanwhile, was at the height of its sophomoric era ― there was a growing ability to resect internal organs without much knowledge of their actual function. And despite psychiatry being a somewhat young field, it was already accepted that mental illness was caused by dysfunction in the brain. Surgical cures for psychiatric symptoms thus seemed a promising pursuit, and it is one that is gradually making its way back into modern mental health care.

The Politics of It All

So-called “psychosurgery” owes plenty to the Progressive Era.

This was a period of intense social and political reform that began around 1890 and manifested in medicine with the emergence of organized healthcare and with science informing medicine at an unprecedented level.

Gone was the concept of the four humors ― blood, phlegm, black bile, and yellow bile. For centuries, it had been assumed that disease was caused by an imbalance between these humors. With scientific giants such as Santiago Ramón y Cajal (1852–1934) advancing neuroscience and microscopy, with Louis Pasteur (1822–1895) advancing germ theory, and Joseph Lister (1827–1912) applying germ theory to surgery, late 19th century medicine recognized the brain as the seat of the mind and personality.

Swiss neuropsychiatrist Gottlieb Burckhardt was among the first to dabble with modern psychosurgery. With no formal surgical training, in 1888 he performed craniotomies and induced cerebral white matter lesions in six psychiatric patients described as aggressive and chronically excited and as having paranoid delusions.

The procedures yielded questionable results, and the medical community was skeptical, if not shocked. The pursuit of psychosurgical operations subsided among doctors for decades.

Yet around the same time, American psychiatrist Henry Cotton took up the scalpel to operate outside the brain in the interest of treating mental illness.

Cotton’s work rested on an assumption that was at once progressive and, as he applied it, badly mistaken: that mental illness might be due to a physical, biological mechanism.

This idea is becoming more widely accepted today with every new gene variant linked to mental illness and with certain infections and immune reactions associated with disorders of the brain. Yet Cotton’s approach was far from scientifically sound.

In 1913, a report was published showing that spirochete bacteria could be isolated from the brains of people who had died with neuropsychiatric abnormalities. Now we know that these patients likely died of neurosyphilis, but Cotton took the findings to imply that depression and delusional disorders might be due to a bodily infection that makes its way to the brain.

So he went for what he deemed low-hanging surgical fruit: his patients’ teeth.

Cotton came to believe that dental infection underlay most psychiatric disorders, going so far as to have some of his own teeth removed when he worried that his mental health was failing. Worse yet, he then had all the teeth of his wife and children extracted as a prophylactic measure against mental illness.

If extraction of a few, several, or all the teeth wasn’t enough to alleviate behavioral symptoms in a given patient, he would work his way through the body, resecting tonsils, gallbladders, spleens, stomachs, colons, and anything else known to harbor bacteria. During his tenure at the Lunatic Asylum ― a woefully offensive and dated name ― in Trenton from 1907 to 1930, Cotton and his staff extracted more than 11,000 teeth, thousands of tonsils, and numerous other organs. Mortality exceeded 30% for major operations, such as colectomies.

The Next Wave of Psychosurgery

Around the time of Cotton’s retirement in 1930 and his death in 1933, Yale researchers Carlyle Jacobson and John Fulton lesioned and resected regions of chimpanzee brains to help elucidate brain anatomy and function, research that today would be considered cruel and would be illegal. They presented their work at a well-attended conference. Most attendees were horrified, but a handful were quite impressed. The latter group included Portuguese neurologist António Egas Moniz and American neurologist-psychiatrist Walter Freeman, a future cofounder of the American Board of Psychiatry and Neurology. Both would go on to operate on the brains of numerous human patients.

Moniz jumped to the conclusion that the Yale chimpanzee study meant surgery on the frontal lobe could cure behavioral symptoms like those the animals had seemed to display before surgery. He developed what he called frontal leucotomy: an instrument ― the “leucotome” ― was inserted through a burr hole in the skull, and a wire was then extended from it and manipulated so as to cut in a circular pattern in the brain. Freeman, equally emboldened by the chimpanzee study presented at the conference, was eager to try the procedure as well, though he was not trained in surgery.

Moniz would receive the 1949 Nobel Prize in Physiology or Medicine for this research.

Meanwhile, Freeman teamed up with neurosurgeon James Winston Watts, modified Moniz’s leucotomy, and gave it another name: prefrontal lobotomy. The procedure was reportedly effective in treating patients with depression, anxiety, psychosis, and other behavioral conditions, as well as some seizure disorders.

Less well publicized was the fact that patients often relapsed, sometimes within a few weeks of the procedure, so it was not uncommon for patients to undergo it multiple times. Nevertheless, lobotomies became increasingly common in the 1930s and 1940s. Rose Marie Kennedy, sister of the future US president, underwent the procedure and wound up institutionalized in Wisconsin. Her story was not revealed until well after her brother’s ascent to the Oval Office.

Frustrated that the need for a neurosurgeon, a surgical suite, an anesthesiologist, and various nurses and technicians kept lobotomy off limits to the asylums across the United States, Freeman sought a simpler procedure. He found one in the work of Italian psychiatrist Amarro Fiamberti, who had employed a transorbital approach beginning in 1937.

This meant bypassing the eyeball and entering the brain through the thin bone of the eye socket. And Freeman managed to make the procedure even worse than it sounds.

Inspired by a kitchen icepick, with which he practiced on grapefruits, he developed a swiveling instrument that fit over the eye and could lobotomize a patient in about a minute. It required only local anesthesia or the induction of a seizure by electroconvulsive shock.

From the mid-1940s onward, Freeman performed the “icepick lobotomy” on patient after patient and continued doing so despite numerous objections from Watts. Freeman and Watts thus parted ways, and soon after, Moniz also ended his association with Freeman.

Meanwhile, Freeman went on tour to teach his procedure. By the early 1950s, many thousands of patients had been lobotomized with the Freeman technique, including roughly 3500 operated on by Freeman himself. As for sterile technique, Freeman didn’t bother with it: a photograph of Freeman and Watts lobotomizing a patient with the earlier burr hole technique shows Freeman wearing no gown and no gloves, his nose peeking out above his mask.

To demonstrate the speed and ease of his procedure, Freeman would sometimes lobotomize both sides of the brain at once, one with each hand. He was a showman, after all, a trait that brought tragic consequences in 1951. During one lobotomy, he paused for a photo; as he turned away from his patient, his instrument penetrated too deeply, causing a fatal hemorrhage. Undeterred, he continued performing the procedure on other patients.

That same year, chlorpromazine, the first antipsychotic drug, was synthesized; its approval in the United States in 1954 initiated a process that would ultimately mean the end of widespread lobotomies, though not until a presidential commission exposed the problems in the early 1970s.

By that time, an estimated 40,000 to 50,000 people had undergone lobotomy, with numerous deaths and debilitations attributed directly to the procedure. Depictions of the technique made their way into popular culture, most famously in the novel and film One Flew Over the Cuckoo’s Nest.

The horrors of such a sloppy, inappropriately applied surgery notwithstanding, MRI studies of lobotomy patients who survived into the 1990s and beyond have helped map the lesions of various frontocingulate, frontothalamic, and frontocapsular pathways. The findings support the idea that the lesioning, however crude, did do something to alleviate anxiety and other symptoms: it severed connections between frontal lobe regions involved in planning and deeper centers, such as the thalamus and parts of the limbic system involved in emotion.

This dovetails with an idea of Moniz’s: that the symptoms of what today is called obsessive-compulsive disorder (OCD), which apparently characterized a portion of his surgical patients, resulted from excessive signaling between the frontal lobe and those deeper areas. Following rapid advances in the late 20th century, both in stereotactic neurosurgery and in elucidating the neuroanatomic basis of some psychiatric disease, the stage was set for the much more precise procedures employed today (primarily in treatment-refractory cases).

The history of psychosurgery may seem shocking, but perhaps it’s not so outlandish after all. Today, neurosurgical interventions show promise in treating disorders of the brain. Implanting electrodes, or simply lesioning certain areas of the brain, is used as a last resort in patients with mood and anxiety disorders, including OCD. These procedures include anterior cingulotomy, limbic leucotomy, anterior capsulotomy, and subcaudate tractotomy, techniques that could be considered more localized lobotomies.

Utilizing stereotactic techniques and advanced imaging, such procedures achieve millimeter-level precision and, of course, are carried out by neurosurgeons, not psychiatrists. Furthermore, patient selection requires detailed discussions among patients, their families, and an extensive, interdisciplinary medical, surgical, and mental health team.

But modern psychosurgery does have one thing in common with psychosurgery of 100 years ago: in both cases, the surgery produces some interruption in communication between different brain regions, often involving the limbic system and frontal cortex, which are involved, respectively, in emotions and complex planning and cognition.

This suggests that while early psychiatrists were — excuse the pun — out of their minds in venturing to penetrate the skull, they nevertheless played a role in shaping modern psychiatric care.

David M. Warmflash, MD, is a freelance health and science writer living in Portland, Oregon. His recent book, Moon: An Illustrated History: From Ancient Myths to the Colonies of Tomorrow, tells the story of the moon’s role in a plethora of historical events, from the origin of life, to early calendar systems, to the emergence of science and technology, to the dawn of the Space Age.

