Monday, May 31, 2021

The Current Moral Crisis In The United States

 


It is fashionable these days to talk about moral crises that really aren’t moral crises. The level of rhetoric is at the point where disagreements can be spun as moral crises, while people are dying in the streets. The best examples I can think of are the long-standing epidemics of gun violence and racism. New examples are cropping up every day. There are current increases in violence against Asian Americans and Jews against the backdrop of long-standing trends. Discrimination and violence against black Americans are finally being acknowledged as widespread and are the basis of an activist civil movement and hopefully systematic reforms.

All of the statistics to back up my statements in the first paragraph are easily available and I am not going to post all those references here. Since I started writing this blog one of my concerns has been gun violence and how to stop it given the level of interference with common sense gun law reforms by one of the major parties and major lobbying concerns. I saw the attempt to counter that political interference as being futile and focused more on public health interventions and possible psychiatric intervention. The latest good review of that approach is by Knoll and Pies (3). For many years I have advocated that homicidal ideation should be seen as a public health intervention point and that it should be part of a strong public health message. To this day nothing has happened. Public health organizations do have research-based suggestions such as locking up firearms and common-sense gun laws like banning large capacity magazines, banning assault rifles, and universal background checks, but the general lack of progress in that area is not reassuring. There has been some movement in allowing more research on gun violence, an area that was previously blocked by gun lobbyists.

What is the connection between gun violence, racism, and violence toward our fellow Americans?  I think they are all based on the same interpersonal dynamic. That dynamic is seeing another person as being significantly different from you, attributing negative characteristics to them, and using both of those premises for treating them differently than you, up to and including perpetrating violence toward them.  In psychiatric jargon, we use the term projection to capture this process, or in the extreme, projective identification. These are not psychiatric diagnoses, but defense mechanisms that are distributed across the population even though they may be more likely in people with specific psychiatric diagnoses.

In my readings over the years I have been looking for a likely origin or at least first sign of this kind of thought pattern. In other words, have people been thinking like this since the beginning of recorded time, or is this a new phenomenon?  In the course of that reading, I came across a book written by the anthropologist Lawrence Keeley called War Before Civilization. In this book, Keeley explores the idea of the noble savage from prehistoric times.  In other words, were prehistoric people inherently peaceful as some had suggested, or were there early signs of violence and aggression?  A review of the evidence suggests that the majority of prehistoric human societies engaged in frequent warfare and total warfare – in other words, attacks not limited to combatants and decimating the opposition’s infrastructure and ability to make war.  Keeley reviews the motivations and consequences of primitive warfare in great detail including tactics (surprise attacks, slaughter of noncombatants, and general massacres) and specific practices like mutilating dead bodies. There is clear evidence the latter functioned in part to dehumanize and humiliate the enemy and send a message to the survivors. These dynamics were not limited to prehistoric man and have continued through modern times and modern warfare.

A recent report referencing Keeley’s book appeared in Scientific Reports (2) this week.  It was a reanalysis of a Nile Valley burial site of 61 people from about 13,400 years ago. It is thought to be some of the earliest evidence of interpersonal violence in Homo sapiens.  In that analysis over 100 lesions were identified in the skeletal remains from what appeared to be projectile weapons.  Examining the mortality curve of the individuals in the cemetery showed that it was consistent with multiple burials rather than a single event.   The stone artifacts examined were consistent with spear or arrow heads. Some were designed to kill by lacerating and causing blood loss. Some were discovered embedded in bones, but others were found within the area where the body was located, which was viewed as consistent with projectiles that had penetrated the body.  The authors conclude that the majority of people in the cemetery died of blunt or sharp force trauma and that there were multiple episodes of interpersonal violence.  Some of the combatants had been wounded multiple times prior to death.  They also concluded that these episodes were most likely the result of “skirmishes, raids or ambushes” likely related to territorial disputes that may have been affected by the weather (p. 9).

What can be inferred from this long history of human violence and aggression? First, groups of humans have been perpetrating violence against one another since prehistoric times. Second, during these episodes total warfare was very common and the human cost of war is always high. The estimated percentages of deaths in ancient societies were generally higher than in modern society for a number of reasons. That was not a deterrent to ancient humans.  Third, the psychological states during these episodes of violence show a potentially broad range of thinking leading to aggression.  Very limited incidents such as the theft of livestock or a rumor of a sexual affair between members of different tribes or villages may have been all that was required to start a series of retaliations leading to all out war.  Once a violent conflict ensued, there were thought patterns and rituals in place to justify the killing, prevent bad outcomes for the killers, humiliate the dead, and embarrass their families.

The current moral crisis in America seems to have a direct link with prehistoric behaviors. It is enacted by aggressive behavior that is described as racism, antisemitism, and gun violence, but the dynamic is the same one described in ancient man.  In other words, once a person can be seen and characterized as an enemy (for whatever reason), it is very easy to vilify them, attribute the worst possible motivations to them, and use that as a basis for rationalizing aggressive behavior. In the past weeks, I saw two elderly Asian American women attacked at a bus stop by a man wielding a knife. The attack was so violent that the large blade of the weapon broke off inside the body of one of these women. In a more recent event, a heavily armed long-time employee shot 9 of his coworkers and then killed himself when he was surrounded by police.  In both cases, the “motivation” for the violent behavior is unknown.  There is a suggestion of mental illness, but the majority of people with diagnosed mental illnesses, and even the same diagnoses, are not violent or aggressive. The sheer volume of mass shootings in the United States suggests it is more of a cultural phenomenon here than anywhere else, but that is confounded by the easy availability of firearms.  The main difference between modern and ancient times is that we have a societal structure that is designed to contain violence and aggression and prevent larger outbreaks.  It is clearly ineffective at this point in preventing violence.

I am suggesting a common thought process here that does not require any psychiatric diagnosis and one that can be intervened upon and self-monitored.  In order to perpetrate discrimination, hate crimes, and even homicidal violence toward others, 3 conditions have to exist.  First, the potential victims of violence need to be seen as sufficiently different from the perpetrator so that he can attribute unrealistic negative attributes to them and rationalize his aggressive action.  Second, the attacker can see himself as sufficiently different from the potential victims that he feels threatened by them and can rationalize attacking them for that reason alone.  A common example is that the attacker feels victimized by his coworkers and feels the need to strike out at them.  And finally, the attacker must have a plan to either seriously injure or kill the victim(s). All of these thought patterns can be considered derivative of thoughts present in ancient man leading to the wide-ranging aggression and warfare described in the references.

I think there is much to be said for intervention based on the observations in this post.  For as long as I have written this blog, I have advocated for intervention based on homicidal or aggressive behavior. When I worked as an acute care psychiatrist, treating violence and aggression was easily half of my job.  If we can suggest that persons with suicidal ideation or self-injurious behavior contact a crisis intervention service or hotline – why don’t we have a similar suggestion for people with homicidal thinking?  And further, what about general education about the primitive origins of these thought patterns?  Just the other day I posted the following:

“Ridiculing people who died of C-19 and were antivaxxers and anti-maskers is bad form - plain and simple.

Bring civility back and restart civilization.

It starts with recognizing the value of a single human life.”

There was much agreement with the post, but also several people who suggested that I was naïve for not being able to recognize enemies or that I was a “better person” for being able to overlook the behaviors of a group of people who were potentially dangerous to others.  My post was not about moral superiority or not recognizing enemies – it is all about the fact that disagreement should not lead to enmity and beyond that we are all members of the same tribe.  We all came from Africa. And seeing differences between us that do not exist is probably ancient thinking that obscures the fact that we are all a lot more similar than we are different.  As I explained to some of the critics of my post, they seemed to be focused on the exceptions rather than the rule.  They also seemed to be making arbitrary exceptions based on seeing more differences than similarities. 

We are currently at a crossroads in this country.  People are making money and generating political capital by emphasizing differences and exploiting the primitive thinking that I have outlined in this post.  Much of the aggression plays out at a symbolic level in social media, but the Insurrection at the Capitol building and the increasing levels of physical violence illustrate that it is far from always symbolic. Americans have traditionally left ethics and morality up to religious institutions, where it may be presented at an abstract level.

It is time to get back to the basic premise of why every person is unique and needs to be treated with respect by virtue of being a member of the human race. It seems like an obvious but untested approach to reducing interpersonal violence at all levels in a society that is not currently equipped to prevent it.

 

George Dawson, MD, DFAPA

 

References:

1:  Lawrence H. Keeley. War Before Civilization. Oxford University Press, 1997.

2:  Crevecoeur I, Dias-Meirinho MH, Zazzo A, Antoine D, Bon F. New insights on interpersonal violence in the Late Pleistocene based on the Nile valley cemetery of Jebel Sahaba. Sci Rep. 2021 May 27;11(1):9991. doi: 10.1038/s41598-021-89386-y. PMID: 34045477 (Open Access).

3:  Knoll JD, Pies RW.  Moving Beyond "Motives" in Mass Shootings.  Psychiatric Times. 2019 Jan 13;36(1).


Permissions:  Graphic above is from reference 2 per the following Creative Commons license. 

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Saturday, May 29, 2021

News From Africa

Some readers of this blog may recall posts in the past about human origins from Africa. I first became aware of the migratory pathways of early humans and the archaeological and genetic evidence from participating in a National Geographic test. That test looked at haplotypes and how modern humans migrated from East Africa to essentially around the world. The test also estimated the percentage of Neanderthal heritage. Since that original sample I have been tested at 2 other laboratories. One of them confirmed the National Geographic analysis and the third test is pending. My original intent was to highlight the fact that all Homo sapiens are from East Africa, and that the distinctions we like to consider “race” mean a lot less than most people think. Genetically, human beings from different races are much more similar than different.  If we all share common ancestors, similarities would seem to be intuitive, but the events that have occurred since my original post suggest otherwise.  Racial and tribal biases have been with us since prehistoric times and continue in modern societies.

Despite those biases, evidence continues to flow from Africa. In the May 6 issue of Nature there was a research paper on the discovery of a 78,000-year-old gravesite in Panga Ya Saidi, Kenya that contained the body of a 3-year-old child. The body and gravesite had evidence of burial practices including intentional placement in an excavated grave, specific positioning of the body, and wrapping of the body in a perishable material.  The authors exhaustively review the evidence for those conclusions. They and the author doing the commentary on this article (2) also review the meaning of these rituals.

Consider first the commentary by Louise Humphrey (2) from the Centre for Human Evolution Research. Dr. Humphrey sets the stage in the Middle Stone Age (320,000 to 30,000 years ago). The research question concerns the first evidence of modern human behavior during that era. Various human innovations were discovered during this timeframe in Africa. The earliest human fossils were also discovered in Africa during this timeframe. One of the key indicators of complex behavior is how the dead were prepared and handled. Anthropologists consider this to be a marker of increased symbolic capacity and thought. Dr. Humphrey reviews existing fossil records of mortuary treatments done by humans and provides a map in her paper (2) where the archaeological sites are plotted, ranging from 50 to 780 thousand years ago. Given the time range, the hominid species identified included Homo sapiens, Homo neanderthalensis, and other species of the genus Homo.  The critical work of the scientists involved is to “reconstruct a series of human actions associated with the deposition of the body”. What does the planning suggest? What meaning can be inferred from the site? The time and effort involved in the ritual suggest that the treatment was more meaningful. In the literature reviewed, that included a dedicated site, wrapping of the body, and positioning of the body. Her conclusion was that the burial described in the research paper was “symbolically significant”.

The research article (1) is a detailed description of the excavation and analysis of the gravesite of a 2.5 to 3.0-year-old child. The researchers named the child Mtoto – a Kiswahili word for small child.  The estimated age of the gravesite was about 78,000 years. There is a detailed drawing of the preserved skeleton that includes a large portion of the cranium, spine, ribs, right clavicle, left scapula, left humerus, and proximal femurs.  There were 5 teeth present and they were felt to be consistent with H. sapiens, although some other features were present suggesting that there may have been regionally distinct populations. The evidence for placement of the body in a specific location was reviewed. The skeletal remnants were minimally displaced. The body was in a flexed position, and its position as found was consistent with wrapping. This was interpreted as “more elaborated involvement of the community in the funerary rite…”. The evidence for intentional burial included an excavated trench and sediment patterns consistent with burial.

The authors review the scant evidence for mortuary practices during this era. They conclude that H. sapiens was probably preserving corpses of the young members of their groups between 69 and 78 thousand years ago.  That is contrasted with burial practices in Eurasia by Neanderthals and modern humans dating back 120,000 years.  Infant and child burials in those sites were described as “ubiquitous”. The authors see the lack of mortuary practices during the Middle Stone Age in Africa in general as being inconsistent with “modern-like conceptions of the afterlife and/or treatment of the dead”. They do point out that the absence of the behavior is not the same as the lack of capacity.

This paper was important from a number of perspectives. Overall, it is apparent that archaeological/paleobiological evidence of burial practices during the Stone Age is limited. East Africa is commonly viewed as the cradle of civilization. In 2 of the DNA analyses I have had done on myself, all my ancestors traced back to East Africa. The data about Neanderthal mortuary practices is interesting because in the past decade, archaeological evidence supports the idea that they were conceptually more sophisticated than they had previously been given credit for.  One of the questions I came away with from this paper is why so few burial sites or other evidence of mortuary practices exist in Africa.

The inferences about human cognition based on mortuary practices are interesting to consider even in modern times. Over the course of my lifetime, funerals have changed significantly. Embalming and displaying the body was fairly typical in the families I have been affiliated with until the turn of the century. That mortuary practice was primarily grief focused. There was a religious service that was often a divine explanation of what had occurred and what was to be expected. Over the years I grew to become very interested in what the clergy from different faiths would say during the funeral. Other people would frequently speak with varying degrees of effectiveness. A common meal or reception would frequently follow the religious service, and there was often a separate burial with a graveside religious service the next day.

In about the year 2000, things seemed to change significantly. I remember attending a funeral and being shocked that the body had been replaced by a small box of ashes. Cremation suddenly became the rule rather than the exception. The funeral service was focused on being a celebration of the deceased’s life rather than strictly grief focused. There were often photographs and video displays relevant to the deceased person’s life. The eulogies were also more lighthearted. Jokes or humorous vignettes about the deceased were more common. I don’t know what led to these changes and have not been able to find a good analysis of why they occurred. The archeological elements of ritual, respect for the dead, the existential balance of the meaningfulness of a life in contrast to death, and the promise of a spiritual afterlife are all there. With cremation there is an added element of remembering the deceased as they were in real life, and that theme is more consistent with a celebration of their life.  All of these elements are fairly implicit and embedded in ritual. An obituary is written and proofed several times. In the Internet era, it is posted on several sites and is eventually routed to sites where ancestry analysis occurs.  I have seen direct evidence that Internet obituaries exist in cyberspace much longer than they could be viewed in a newspaper. There is no doubt that multiple people have carefully planned the event.

The most important aspect of the death of an individual is their impact on the conscious states of others.  That is often simplified as a “memory” but it is more complex than that. For decades the grief process was considered to be a closed process. The person grieving goes through a number of cognitive-emotional stages and at some point they reach a stage where their baseline emotions return and they are left with memories of the individual. In the common vernacular that was described as closure.  In reality, the process is typically more open ended and the relationship with the deceased lives on. Any one of us who has lived long enough can recall at will or by association what those relationships and that person were like, why they are missed, and how they are still affecting us.  The increasing lifespan of modern humans leaves us all with a lot more time for those thoughts.

An additional consideration is the pattern of mortality and how it differs from the Stone Age to modern times. The average age at death in the Stone Age is estimated to be in the 25-35 year range, but that is skewed by considerable (45%) infant mortality. Did that have an impact on mortuary practices and the grief process?  Some experts suggest that more care was taken in attending to deceased infants and children, implying that our ancestors had a selective thought process about those deaths. Given the time and scant evidence we may never know what our ancestors were thinking with a high degree of certainty. We do know that in the Middle Stone Age our ancestors engaged in rituals that reflected their thoughts on death in general and on that specific person.
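To make that skew concrete, here is a minimal back-of-the-envelope sketch. The 45% infant mortality figure comes from the paragraph above; the mean ages at death assumed for each group are purely illustrative and are not taken from the references.

```python
# Illustrative only: the 45% figure is quoted above, the mean ages are assumptions.
infant_fraction = 0.45        # fraction of deaths occurring in infancy/childhood (quoted above)
infant_mean_age = 2           # assumed mean age at death for that group (years)
survivor_mean_age = 55        # assumed mean age at death for those surviving childhood (years)

mean_age_at_death = (infant_fraction * infant_mean_age
                     + (1 - infant_fraction) * survivor_mean_age)
print(f"Population mean age at death: {mean_age_at_death:.1f} years")  # about 31 years
```

Even with surviving adults routinely living into their fifties under these assumptions, the population mean lands squarely in the quoted 25-35 range, which is why the average is a poor guide to how long adults who survived childhood actually lived.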

 

George Dawson, MD, DFAPA

 

 

References:

1:  Martinón-Torres M, d'Errico F, Santos E, Álvaro Gallo A, Amano N, Archer W, Armitage SJ, Arsuaga JL, Bermúdez de Castro JM, Blinkhorn J, Crowther A, Douka K, Dubernet S, Faulkner P, Fernández-Colón P, Kourampas N, González García J, Larreina D, Le Bourdonnec FX, MacLeod G, Martín-Francés L, Massilani D, Mercader J, Miller JM, Ndiema E, Notario B, Pitarch Martí A, Prendergast ME, Queffelec A, Rigaud S, Roberts P, Shoaee MJ, Shipton C, Simpson I, Boivin N, Petraglia MD. Earliest known human burial in Africa. Nature. 2021 May;593(7857):95-100. doi: 10.1038/s41586-021-03457-8. Epub 2021 May 5. PMID: 33953416.

2:  Humphrey L.  Burial of a child during the Middle Stone Age in Africa.  Nature. 2021 May; 593(7857): 39-40.

3:  Ponce de León MS, Bienvenu T, Marom A, Engel S, Tafforeau P, Alatorre Warren JL, Lordkipanidze D, Kurniawan I, Murti DB, Suriyanto RA, Koesbardiati T, Zollikofer CPE. The primitive brain of early Homo. Science. 2021 Apr 9;372(6538):165-171. doi: 10.1126/science.aaz0032. PMID: 33833119.

4:  Beaudet A. The enigmatic origins of the human brain. Science. 2021 Apr 9;372(6538):124-125. doi: 10.1126/science.abi4661. PMID: 33833107.

5:  Olden K, White SL. Health-related disparities: influence of environmental factors. Med Clin North Am. 2005 Jul;89(4):721-38. doi: 10.1016/j.mcna.2005.02.001. PMID: 15925646.

6:  Brotherton P, Haak W, Templeton J, Brandt G, Soubrier J, Jane Adler C, Richards SM, Der Sarkissian C, Ganslmeier R, Friederich S, Dresely V, van Oven M, Kenyon R, Van der Hoek MB, Korlach J, Luong K, Ho SYW, Quintana-Murci L, Behar DM, Meller H, Alt KW, Cooper A; Genographic Consortium. Neolithic mitochondrial haplogroup H genomes and the genetic origins of Europeans. Nat Commun. 2013;4:1764. doi: 10.1038/ncomms2656. PMID: 23612305; PMCID: PMC3978205.

7:  Fu Q, Rudan P, Pääbo S, Krause J. Complete mitochondrial genomes reveal neolithic expansion into Europe. PLoS One. 2012;7(3):e32473. doi: 10.1371/journal.pone.0032473. Epub 2012 Mar 13. PMID: 22427842; PMCID: PMC3302788.

 

Supplementary:

Additional references above are for a more expansive essay on paleobiology, genetics, and the importance of recognizing a common ancestry.

Thursday, April 29, 2021

Hypertension - Clinical and Historical Significance for Psychiatry

 

 

I have written about hypertension in the past on this blog. During the treatment and ongoing care of the many patients I have seen over the years it is always present. The prevalence of hypertension increases with age and other comorbidities. In the case of the patients I have seen, alcohol and other substance use, obesity, smoking, stress, and prescribed medications are all risk factors. As a psychiatrist following blood pressures, I have to be more compulsive than the average physician. I have rarely been in an outpatient clinic where blood pressures were routinely checked. On the inpatient units where I have worked, blood pressure monitoring could also be a problem. I am reminded of teaching in-services on blood pressure monitoring. In inpatient settings it is also fairly common to see patients admitted who have discontinued antihypertensive therapy and developed dangerously high blood pressures. In many of those cases they continued to refuse the medication. I was put in the uneasy position of having to follow extremely high blood pressures until a probate court judge could convince the patient it was in their best interest to take those medications.

I have also seen the long-term consequences of uncontrolled hypertension in the form of acute hemorrhagic strokes, subarachnoid hemorrhages, aortic aneurysms, hypertensive cardiomyopathy, and the variations of hypertension related dementia. Many of these findings were in the context of an acute emergency. Several were more of an unexpected finding, such as the likely long-term consequences of eclampsia seen on a brain imaging study done 30 years later.

In the day-to-day care of patients, knowing whether or not they may have hypertension is a critical aspect of care. That is true whether you are considering a medication that can elevate or decrease blood pressure, advising the patient on lifestyle changes to improve their health, or discussing their current exercise program. Most people are unaware of the acute effects of exercise on blood pressure and why strenuous exercise may be contraindicated until they have adequate control of blood pressure.

For all of these reasons, I am always interested when new guidelines come out for blood pressure screening. Over the years that I have been practicing, the suggested cutoffs demarcating hypertension have ranged anywhere from 120/80 to [Age + 100]/90. The [Age + 100]/90 cutoff was a guideline we used when I was an intern in the 1980s. That meant that if you were treating a 70-year-old, their acceptable blood pressure was a maximum of 170/90. Over the years extensive research has examined blood pressure dependent outcomes and determined that systolic blood pressures that high are problematic. The question is always – where is the cutoff? Specifically, at what point are we maximizing the gains and reducing the risks from overtreatment and excessive diagnostics?

That question is one that the US Preventive Services Task Force (USPSTF) seeks to answer. They published their comprehensive look at the issue recently (1).  A hypertension prevalence of 45% of all adults in the US is noted, as well as the morbidity and mortality associated with untreated hypertension.  The quoted range of cutoffs is from 130/80 to 140/90. The technical considerations of blood pressure determinations are discussed. The suggested sensitivity and specificity are 0.80 and 0.55 for office blood pressure measurement (OBPM) and 0.84 and 0.60 for home blood pressure measurement (HBPM).  Review of 13 studies showed that the harms of blood pressure screening are minimal.
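Those sensitivity and specificity figures are easier to interpret when converted into predictive values. The sketch below is a minimal Bayes calculation, assuming the quoted 45% prevalence applies to the screened population; it is my illustration and not part of the USPSTF document.

```python
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Positive and negative predictive values from sensitivity, specificity,
    and prevalence (standard Bayes calculation)."""
    tp = sensitivity * prevalence              # true positives per screened person
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    fn = (1 - sensitivity) * prevalence        # false negatives
    return tp / (tp + fp), tn / (tn + fn)

# Figures quoted above: 45% prevalence; OBPM 0.80/0.55; HBPM 0.84/0.60
for name, sens, spec in [("OBPM", 0.80, 0.55), ("HBPM", 0.84, 0.60)]:
    ppv, npv = predictive_values(sens, spec, 0.45)
    print(f"{name}: PPV = {ppv:.2f}, NPV = {npv:.2f}")
```

Run as written, it shows that a single positive office reading is far from a confirmed diagnosis, which is consistent with the general recommendation to confirm office readings outside the clinic before starting treatment.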

The standard online medical text in the US is UpToDate and it defines normal blood pressure as <120 mmHg systolic and <80 mmHg diastolic, with Stage 1 hypertension being 130-139 mmHg systolic or 80-89 mmHg diastolic.  Stage 2 hypertension is defined as a systolic of at least 140 mmHg or a diastolic of at least 90 mmHg. UpToDate also defines a category of treated hypertension for any patient taking antihypertensive medication irrespective of their blood pressure reading.
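As a quick reference, here is a minimal sketch that encodes those cutoffs as a staging function. The 120-129 systolic band (the "elevated" category) sits between normal and Stage 1 but is not enumerated in the text above, so its label here is an assumption on my part.

```python
def bp_stage(systolic: int, diastolic: int) -> str:
    """Stage a single blood pressure reading (mmHg) using the cutoffs quoted above."""
    if systolic >= 140 or diastolic >= 90:
        return "Stage 2 hypertension"
    if 130 <= systolic <= 139 or 80 <= diastolic <= 89:
        return "Stage 1 hypertension"
    if systolic < 120 and diastolic < 80:
        return "normal"
    return "elevated"  # assumed label for the 120-129/<80 band not listed in the text

print(bp_stage(138, 84))  # Stage 1 hypertension
print(bp_stage(152, 88))  # Stage 2 hypertension
```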

The USPSTF paper had an interesting section called How Does Evidence Fit with Biological Understanding? This did not involve a discussion of pathophysiology, but rather a description of subtypes and what the implications might be.  Sustained hypertension was defined as elevated blood pressure determined both in the office and outside of office settings. White coat hypertension was defined as elevated blood pressure in the office but not in ambulatory settings. Masked hypertension was defined as elevated blood pressure outside of the office but not in office settings. For the purposes of the document, sustained hypertension is the entity that the recommendations are based on, and the overall risk of cardiovascular disease is sustained hypertension > masked hypertension > white coat hypertension.  The diagnosis of white coat hypertension is made by comparing OBPM with HBPM or ABPM (ambulatory blood pressure measurement).  No specific biological mechanisms are discussed. The document points out that even though masked hypertension and white coat hypertension are associated with adverse cardiovascular outcomes, there is no current evidence that treatment improves outcomes, and they consider that to be a knowledge gap.
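Those subtype definitions reduce to a simple two-by-two classification on office versus out-of-office readings. The sketch below just restates the definitions as code; the function name and boolean inputs are mine, not the USPSTF's.

```python
def bp_phenotype(office_elevated: bool, out_of_office_elevated: bool) -> str:
    """Classify the screening phenotypes described in the USPSTF document."""
    if office_elevated and out_of_office_elevated:
        return "sustained hypertension"    # highest cardiovascular risk of the three
    if office_elevated:
        return "white coat hypertension"   # lowest risk tier
    if out_of_office_elevated:
        return "masked hypertension"       # intermediate risk tier
    return "normotensive"

print(bp_phenotype(office_elevated=True, out_of_office_elevated=False))  # white coat hypertension
```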

UpToDate takes a more detailed look at primary and secondary hypertension but does not elaborate much more on the pathogenesis and biology of hypertension. For example, it outlines the autonomic nervous system, the renin-aldosterone system, and total plasma volume as being the main systems involved in hypertension. Secondary causes and screening for these causes are suggested, but there are no confirmatory tests for essential hypertension.  Interestingly, atypical antipsychotics and antidepressants are on a list of medications thought to contribute to hypertension, but in personal correspondence with a hypertension specialist – he considered even the most likely medications in that category (bupropion and venlafaxine) to be rare causes.  Empirical treatment and how to treat more resistant forms of hypertension are reviewed. The medications typically address a purported mechanism of hypertension, but there is no suggestion to determine the underlying physiology and match it with a medication effect.

Monitoring is another role that psychiatrists can fill. I see the same patient anywhere from 6 to 24 visits per year, and ideally those would all be heart rate and blood pressure data points. With many of those patients I also discuss home monitoring, since approved devices are now very affordable and many of them are being treated, often intermittently, for hypertension. It is also critical that some patients are able to do HBPM if they are treated with medications that can clearly affect blood pressure such as beta-blockers, prazosin, and clonidine. For the subgroup of people who have sustained tachycardia and need close monitoring, I also recommend HBPM.

Every psychiatrist should be aware of both the USPSTF screening guideline and either the UpToDate chapter or a similar comprehensive book chapter or review.  Making sure that the patients in question get adequate screening, evaluation, and treatment is as critical as the treatment for their psychiatric disorder.  Comorbidities that are the direct result of end organ damage from hypertension also need to be addressed. I have been able to advise patients on dietary changes, exercise programs, and accepting treatment for obstructive sleep apnea when it had been ignored by other sources.

Apart from the medical and clinical considerations of hypertension – are there any other lessons for psychiatry?  It turns out there are, and they were first noted in 1960 and have since been forgotten.  Until that year the predominant view was that diseases are caused by discrete pathological lesions. That view was advanced by Virchow and Koch and was the dominant view of the day. A corollary is that there are always qualitative differences between health and disease.  If a person has the required lesion, they have the disease and if not, they are healthy. That theory was disrupted by a paper by Oldham, et al (3) on the nature of essential hypertension. At that time, the dominant theory of hypertension was that it was an autosomal dominant disease that “separated sharply” from the normotensive population. The authors looked at collected data on families and showed that the percentage of families in previous generations with hypertension was too low for Mendelian inheritance.  The authors then looked at data on the blood pressure ranges of first-degree relatives of their index hypertensives. That graphical data had previously been interpreted as a bimodal distribution of blood pressures consistent with a clear demarcation between elevated blood pressure and normotension.  However, re-examination of the data and a further trial showed that the frequency distribution of blood pressures was not consistent with a bimodal distribution, or as the authors state:

 “seems to illustrate once again that it is not hypertension that is inherited but the degree of hypertension.”

The authors use this data to reject a dominant gene and qualitative differences between disease and non-disease states.  They go on to describe the biological plausibility of the alternative polygenic hypothesis:

“The alternative hypothesis-that arterial pressure is inherited polygenically over the whole range, and that the inheritance is of the same kind and degree in the so-called normal range as in that characteristic of essential hypertension-is in general conformity with biological theory and with the facts of observation. Just as stature, the classical human example of polygenic inheritance, is the sum of a number of separate bones and tissues, so is the arterial pressure the resultant of a number of discrete components of the cardiovascular system. One need only mention the radii of different parts of the vascular system, the lengths of the vessels constituting the resistance, their elasticity, the chemical composition of the body fluids, the action of the heart, and the…”  p. 1092.

As I read that passage, I was reminded of current work looking at the tens to hundreds of network genes activated across the genotypes of millions of unique individuals as a basis for the polygenic effects that occur in polygenic disorders, including psychiatric disorders.

Once the polygenic quantitative model was accepted over the single dominant gene qualitative model, it led to broader applications, including the obvious one to psychiatric disorders.  Psychiatric disorders have been demonstrated to have familial patterns and some have a very high degree of heritability, but they also do not follow single dominant gene inheritance.  To recap, Oldham, et al basically blew the single gene, qualitative difference between disease state and normality, single pathological mechanism model out of the water for complex disorders – and they did it in 1960! No philosophy or rhetoric – just good old science. At one point the authors point out that “no student of genetics” had explained the dips in the hypertension frequency graphs.
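The statistical intuition behind that argument is easy to demonstrate. The toy simulation below is mine, not Oldham's analysis; all of the numbers (carrier frequency, effect sizes, number of loci) are arbitrary assumptions chosen only to show why a single dominant gene with a large effect predicts a bimodal pressure distribution, while many small additive effects predict a smooth unimodal one.

```python
import random

random.seed(1)
N = 10_000

def monogenic_sample():
    # Hypothetical single dominant gene model: an assumed 20% of people carry
    # a gene with one large fixed pressor effect.
    bp = random.gauss(125, 8)            # illustrative background systolic BP, mmHg
    if random.random() < 0.20:           # assumed carrier frequency
        bp += 40                         # assumed large single-gene effect
    return bp

def polygenic_sample():
    # Polygenic model: many loci, each with a small additive effect, as
    # Oldham et al argue for arterial pressure (numbers are arbitrary).
    return 125 + sum(random.gauss(0, 2.5) for _ in range(20))

def text_histogram(samples, lo=90, hi=210, width=10):
    counts = {}
    for s in samples:
        s = min(max(s, lo), hi - 1)
        b = lo + width * int((s - lo) // width)
        counts[b] = counts.get(b, 0) + 1
    for b in sorted(counts):
        print(f"{b:3d}-{b + width - 1:3d} mmHg {'#' * (counts[b] // 100)}")

print("Single dominant gene model (two separate humps):")
text_histogram([monogenic_sample() for _ in range(N)])
print("\nPolygenic model (one smooth, graded distribution):")
text_histogram([polygenic_sample() for _ in range(N)])
```

The smooth, continuously graded distribution observed in the actual population data was the key evidence favoring the polygenic model over a single dominant gene.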

The psychiatric significance of these authors’ work became apparent when Kendell (4) highlighted it 15 years later to illustrate why the single pathological mechanism as “proof” of psychiatric disease is a failure.  Hypertension is a complex polygenic disorder that all psychiatrists must concern themselves with if they are actively treating patients.  It is also a useful comparison model for the psychiatric disorders that we treat.


 George Dawson, MD, DFAPA

 

References:

1:  US Preventive Services Task Force, Krist AH, Davidson KW, Mangione CM, Cabana M, Caughey AB, Davis EM, Donahue KE, Doubeni CA, Kubik M, Li L, Ogedegbe G, Pbert L, Silverstein M, Stevermer J, Tseng CW, Wong JB. Screening for Hypertension in Adults: US Preventive Services Task Force Reaffirmation Recommendation Statement. JAMA. 2021 Apr 27;325(16):1650-1656. doi: 10.1001/jama.2021.4987. PMID: 33904861.

2:  Basile JM, Bloch MJ. (2021) Overview of Hypertension in Adults. In GL Bakris, WG White, GP Forman, L Kunins, UpToDate (Accessed 4/28/2021) from:  https://www.uptodate.com/contents/overview-of-hypertension-in-adults

3:  Oldham PD, Pickering G, Fraser Roberts JA, Sowry GS. The nature of essential hypertension. Lancet. 1960 May 21;1(7134):1085-93. doi: 10.1016/s0140-6736(60)90982-x. PMID: 14428616.

4:  Kendell RE. The concept of disease and its implications for psychiatry. Br J Psychiatry. 1975 Oct;127:305-15. doi: 10.1192/bjp.127.4.305. PMID: 1182384.

5:  Breu AC, Axon RN. Acute Treatment of Hypertensive Urgency. J Hosp Med. 2018 Dec 1;13(12):860-862. doi: 10.12788/jhm.3086. Epub 2018 Oct 31. PMID: 30379139.

6:  Rossi GP, Rossitto G, Maifredini C, Barchitta A, Bettella A, Latella R, Ruzza L, Sabini B, Seccia TM. Management of hypertensive emergencies: a practical approach. Blood Press. 2021 May 8:1-12. doi: 10.1080/08037051.2021.1917983. Epub ahead of print. PMID: 33966560.

 

Graphics Credit:

The image at the top of this post is from Shutterstock per their standard licensing agreement. I picked it based on the fact that it reminded me of a patient I saw in the emergency department when I was an intern.  He had a large left basal ganglia cerebral hemorrhage that was most likely due to sustained hypertension.


Apologies:

Editing this post was tough. For some reason my word processor switched to Polish language settings and stopped automatically checking my grammar and spelling. That was compounded by the fact that I was dictating in Dragon and sound-alike words that were spelled correctly were substituted.  I ended up proofing everything on my phone and just finished tonight (4/29).

Monday, April 26, 2021

The First 25 Pages….

 

I was minding my own business on Twitter last week and noticed a slide posted with the image of the DSM-5.  It did not take too long to realize that it was not posted by anyone who had read the DSM – at least not the first 25 pages.  These pages are technically the introduction to the diagnostic section of the manual.  They are important words because they summarize the process, orient the reader to the manual, and describe several important qualifiers.  That is how I was able to tell that the slide on Twitter had nothing to do with the DSM.  The statements made about it were essentially false.

The first problem is the characterization of diagnoses as “operational criteria” and the conclusion that the manual is therefore a “fallible tool”. These are common mistakes by anyone who has not been trained in medicine and the understanding of disease states.  For simplicity, consider the definition from my physical diagnosis text from medical school:

"For several thousand years physicians have recorded observations and studies about their patients.  In the accumulating facts they have recognized patterns of disordered bodily functions and structures as well as forms of mental aberrationWhen such categories were sufficiently distinctive, they were termed diseases and given specific names. “ 

 

DeGowin and DeGowin, Bedside Diagnostic Examination. 1976, p 1

 

The introduction notes that the precursor to the American Psychiatric Association (APA) published the precursor to the DSM back in 1844.  Even before that, the description of psychiatric disorders stretches back for thousands of years. The above definition notes the importance of patterns that are consistent over time.  A detailed description of these patterns and those evolved descriptions is how all of medicine has advanced.  The other important aspect of these descriptions is that they are sufficiently distinctive.  In the most basic analysis, the DSM is the standard way that physicians have indexed diseases and medical problems from the beginning.  The idea that it is merely “operational criteria”, as in arbitrary routine measurement, is far from accurate. The introduction is very clear that a diagnosis is not a checklist of symptoms and that a formulation is required.



The fact that the DSM inconveniently contains a Neurocognitive Disorders chapter and qualifiers about ruling out all other medical illnesses as causes of the presenting disorder is typically not mentioned by the discrete pathological lesion crowd.  If it is, the standard rhetoric that is applied goes something like this: "Well it is a disease, it's just no longer a psychiatric disease. When real diseases are discovered they are no longer in the purview of psychiatry."  This despite the fact that psychiatrists have been diagnosing and studying these diseases for over a hundred years.

One of the frequent mischaracterizations of medicine and psychiatry is that it operates from a biomedical model. This is confusing to a lot of people because physicians are certainly trained and interested in the molecular biology of both normal human function and all of the associated pathophysiological functions. Psychiatrists are interested in brain function in particular but also other systems that directly affect psychiatric care. Every psychiatrist has performed physical and detailed neurological examinations at some point in their career.  Every psychiatrist has seen and read ECGs and brain imaging studies. That does not mean that psychiatrists don’t know the limitations of standard medicine when it comes to analyzing problems generated by both the brain and its associated conscious state.  In fact, psychiatrists have some of the best analyses and criticisms of these approaches. The standard biomedical model criticism is used to suggest an absurd degree of reductionism.  That is a model that no psychiatrist adheres to, and the evidence is the statement in the DSM about multiple underlying causes of mental disorders.  Interestingly, many of these same critics often advocate for specific psychosocial causes of mental disorders on a global scale – a form of psychosocial reductionism.

 

There are often philosophical digressions on the nature of mental illness and whether mental illness is a disease or not.  I have written fairly extensively about that in other areas of this blog.  For the purpose of addressing the slides I will say that the lesion basis for both mental illnesses and physical illnesses was addressed from within the field in response to the pathological theories by Virchow and Koch. Interestingly, the answer to that theory was a study of hypertension:

“It was in fact the example of hypertension which finally discredited the nineteenth-century assumption that there was always a qualitative distinction between sickness and health. The demonstration by Pickering and his colleagues twenty years ago (5) that such a major cause of death and disability as this was a graded characteristic, dependent, like height and intelligence, on polygenic inheritance and shading insensibly into normality, was greeted with shock and disbelief by most of their contemporaries, and the prolonged resistance to their findings showed how deeply rooted the assumptions of Koch and Virchow had become.” (2)

Sixty years later, some academics apparently still have a hard time realizing that mental illnesses are polygenic illnesses of varying severity and a source of significant death and disability, and yet there is no clear qualitative difference between health and disease demarcated by a lesion.  We are well past the time that they should be ignored.




Conflict of interest is also a favorite tactic of those who seek to discredit psychiatry.  The suggestion in the original slide was that both committee approaches and pharmaceutical influence were sources of corruption.  The first 25 pages describe why this is not true.  The financial conflict of interest limitations placed on committee members were significant. In the intervening years since the DSM-5 was released there has been no evidence of pharmaceutical influence.  Why would there be?  Pharmaceutical companies can come up with any indication they need for a medication. They don’t need a manual to develop a symptom list and initiate a clinical trial for that purpose.  Anyone who has actually read the manual notices that the highlights under each category stress a pluralistic approach to mental illness and no actual treatment approaches are described.  The vast majority of new pharmaceuticals are prescribed by non-psychiatrists like primary care physicians and physician extenders. In my experience, many of these prescriptions are for transient conditions that a psychiatrist would not prescribe a medication for.

 

The current reality is this.  The DSM considers mental disorders to be disorders. It does not address the issue of what is a disease and what is not. The manual is very clear about its process and the fact that it is a work in progress. That is nothing unique to psychiatry. Diagnoses are always in a state of flux across all of medicine, and that even includes diagnoses that are defined by particular lesions.  As the science of medicine advances, expect more diagnoses and large diagnostic categories like asthma, diabetes mellitus Type II, and depression to be broken up into smaller and smaller categories that will probably correlate with physiological findings.  The authors of DSM-5 are very clear that the manual is designed to be a cooperative document with both the NIMH Research Domain Criteria (RDoC) for research purposes and the International Classification of Diseases 11th revision (ICD-11) for administrative and epidemiological purposes.  The good news is that if you are not a psychiatrist or mental health clinician, the details contained in the manual are probably not useful for you to know.  On my blog, I pointed out that even primary care physicians don’t read it, so why would anyone else?




Psychiatrists have obvious theoretical and historical interest in the manual, but on a day-to-day basis it is safe to say that nobody is closely reading it except for researchers. It is very apparent that the so-called critics of psychiatry rarely do, or they would not be adhering to premises that are clearly wrong at the outset. Equally disappointing is the endless stream of philosophical arguments that make similar errors. I read a paper by Jefferson (6) less than a month ago where she posits three different ways that mental disorders can be considered brain diseases. And of course the first one is Szasz’s – specifically:

 

If Szasz is right, the very idea that mental illness is an illness depends on the idea that there is independent brain pathology causing mental distress.”

 

She goes on to say that Szasz “drew a skeptical conclusion” from his own definition of brain disease and concluded that most mental disorders were not brain diseases. I seem to be the only one who recognizes that Szasz has been wrong about a lot of things for a long time, most notably the restricted pathologically based view of any or all diseases.

 

That doesn’t seem to prevent it from being dragged out time and time again. The realm of philosophers and antipsychiatrists is apparently the only place Szasz is never wrong. And people can say whatever they want about the DSM-5 – even if they clearly have not read the first 25 pages.

 

 

 

 

George Dawson, MD, DFAPA

 

 

 

References:

 

1:  Leonard A. The theories of Thomas Sydenham (1624-1689). J R Coll Physicians Lond. 1990 Apr;24(2):141-3. PMID: 2191117; PMCID: PMC5387565.

 

2:  Kendell RE. The concept of disease and its implications for psychiatry. Br J Psychiatry. 1975 Oct;127:305-15. doi: 10.1192/bjp.127.4.305. PMID: 1182384.

 

3:  Smith R. In search of "non-disease". BMJ. 2002 Apr 13;324(7342):883-5. doi: 10.1136/bmj.324.7342.883. PMID: 11950739; PMCID: PMC1122831.

4:  Meador CK. The art and science of nondisease. N Engl J Med. 1965 Jan 14;272:92-5. doi: 10.1056/NEJM196501142720208. PMID: 14223129.

5:  Oldham PD, Pickering G, Fraser Roberts JA, Sowry GS. The nature of essential hypertension. Lancet. 1960 May 21;1(7134):1085-93. doi: 10.1016/s0140-6736(60)90982-x. PMID: 14428616.

6:  Jefferson, A. (2021). On Mental Illness and Broken Brains. Think, 20(58), 103-112. doi:10.1017/S1477175621000099


Graphics Credit:

Slides are all made by me with appropriate referencing.  Click on any slide to enlarge.

 

 

 

Friday, April 16, 2021

Adding Rather than Subtracting Bias - An Underlying Basis for Polypharmacy?




There was an interesting piece in Nature this week (1,2) about cognitive biases in complex problem solving.  The research psychologists asked subjects to solve problems of varying complexity and structure from the perspective of whether additional structures or steps were necessary or whether an optimal solution could be obtained by subtracting structures or steps. I will briefly describe each of the problems in the table below (pending permission to use one of their graphics).

Task – Description

Abstract grid task – Transform a grid pattern to make it symmetrical

Suggested changes to a large public university – Changes to improve the sense of community, enable student learning, and prepare students for a lifetime of service

Lego block structure – Improve the 8 or 10 block structure

Lego block structures – 3 possible structures built from 12 blocks of a pool of 24 blocks on a 6” x 8” base

Lego block structure – Revision of original structures made from a possible 20 blocks to make a 10 block structure

Lego block structure – Modify a Lego structure so that it can hold a brick over the head of an action figure in the structure

Read and summarize an article – Make a 6-8 sentence summary and then edit it to a shorter version

Read and summarize an article – Edit someone else’s summary to “omit needless words”

Day trip to Washington DC – Inspect a trip itinerary and suggest changes to improve it

Make a grilled cheese sandwich – Make a grilled cheese from 27 ingredients

Modify a soup recipe – From 5, 10, and 15 ingredient soup recipes, modify from a list of ingredients to improve the soup

 

 


Inspection shows that the cognitive tasks cover many domains ranging from 2D and 3D visuospatial tasks, language tasks, and more theoretical tasks that involve speculative rather than confirmed outcomes. The authors suggest an all-encompassing definition: “the cognitive science of problem solving describes iterative processes to imagining and evaluating actions and outcomes to determine if they would produce an improved state.” (p. 258).  They define subtractive transformations as having fewer components than the original and additive transformations as having more components than the original.  The authors noted a bias in the anecdotal literature toward making conscious subtractive transformations, and that suggested to them that the strategy may be less common or undervalued.

Across all experiments, the tendency toward subtractive strategies with the general instruction was lower but probabilistic.  For example, across all experiments, subtractions ranged from 21-41%.  A second set of conditions with subtle subtraction cues increased the rate of subtractive transformations to 43-61% across the same experiments.  At one point the researchers added a cognitive load task that was basically a distractor to use more attentional resources. In these conditions cognitive shortcuts are less accessible. Under those conditions subjects failed to identify a subtractive solution more frequently.  The authors also studied subjects from Germany and Japan, suggesting that there is cultural generalizability of the additive over subtractive strategies.

The authors consider that the differences could be accounted for by subjects generating a number of additive and subtractive ideas and selecting the additive one, or by simply defaulting to the additive.  They elected to look at the default to the additive mode. They describe heuristic memory searches allowing for the timely access of relevant information.  They suggest a number of reasons why additive strategies may be favored including: the processing may be easier, semantic biases such as more being better, cognitive biases may favor the status quo or less change, and it may be more probable that additive rather than subtractive changes offer a better outcome.

This is an interesting paper from a number of perspectives.  First, it presents a cognitive psychology approach with no purported biological mechanisms. There are no functional imaging studies or brain systems described.  The theories and design of experiments depend on a psychological model of cognitive function. Second, the model is probabilistic.  Although the title suggests systematic overlooking of subtractive strategies, it turns out that many subjects don’t overlook them, and this bias can be modified by experimental conditions such as subtraction cues. Third, the effect of increased cognitive load can be demonstrated to increase the likelihood of additive rather than subtractive biases. Fourth, the biases extend across a number of domains including physical, social, and intellectual. Fifth, the authors suggest that there may be a number of cognitive, cultural, and socioecological reasons for favoring the additive bias over the subtractive one.  Sixth, although the additive transformation was more likely to occur, that does not mean it offers the best solution to the problem.  It may simply be the most commonly used solution.

Real world experience illustrates how the additive transformations can be reinforced.  Advertising is a common one. The goal of advertising is basically to sell someone something that they don’t need or change their preferences for something that they do need to a different product.  If it works, it is an additive strategy on top of additive behavior.  If the product being sold affects other learning centers in the brain like reward-based learning, that can lead to further additive effects. The photo at the top of this post illustrates another example.  This kitchen drawer for spoons and spatulas is a solution to the cooking problem of how many are needed to accomplish what the cook needs to accomplish. The drawer is packed to the point where it barely closes, and at that point the cook is forced to reassess and decide about cleaning the drawer out and starting over.  Homeowners are often forced to make similar downsizing or subtractive decisions after 20-30 years of additive ones, when faced with either space constraints or a smaller family.

What about medical and psychiatric treatment?  I don’t think there is any doubt that additive transformations are operating. Most treatments that involve medication have a step approach with the addition of medications for symptoms that do not respond or partially respond to the initial treatment. This occurs despite an explicit subtractive bias, or at least a bias to maintain the status quo, that existed 20 years ago.  At that time, hospitals and clinics were reviewed based on criteria to limit the amount of polypharmacy, defined as more than one drug from the same class. Today, polypharmacy is common.  Reference 3 below gives an example where polypharmacy was defined as 5 or more medications taken concomitantly and hyper-polypharmacy was defined as 10 or more medications taken concomitantly, in a sample of 404 geriatric patients with cardiovascular disease admitted to a hospital during a 3-month period.  They found the prevalence of polypharmacy was 95%.  The prevalence of hyper-polypharmacy was 60%.  Most patients (77.5%) also had a potential drug-drug interaction.  Their suggestion to be vigilant is a strategy discussed as being potentially successful in containing the additive strategies (2).
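Applying those definitions to a medication list is simple to automate, and this kind of count is one of the easier additive-bias flags to build into a record review. The sketch below just encodes the cutoffs from reference 3; the function name and the example medication list are hypothetical.

```python
def polypharmacy_flags(medications):
    """Apply the counts used in reference 3: polypharmacy = 5 or more concomitant
    medications, hyper-polypharmacy = 10 or more."""
    n = len({m.strip().lower() for m in medications})  # crude de-duplication by name
    return {"count": n, "polypharmacy": n >= 5, "hyper_polypharmacy": n >= 10}

# Hypothetical example list - the names are illustrative only.
print(polypharmacy_flags(
    ["lisinopril", "metoprolol", "atorvastatin", "sertraline", "quetiapine", "aspirin"]
))
# {'count': 6, 'polypharmacy': True, 'hyper_polypharmacy': False}
```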

From psychiatry, I am including a common problem that I encountered as a tertiary consultant.  That problem is what to do about a person with a depression that has not responded to high dose venlafaxine. There are geographic areas in the US where very high dose venlafaxine is used with and without pharmacogenomic testing.  From the options listed in the diagram it is apparent that there are 4 additive (black arrows) strategies and 2 subtractive (red arrows). There is a robust literature on the additive strategies and not so much on the subtractive. As a result, it is common these days to encounter patients who have tried numerous combinations right up to and including “California Rocket Fuel” (4) – the combination of an SNRI like venlafaxine with mirtazapine. The ways to analyze this situation, especially if there has not been any improvement, are significant and depend a lot on patient preferences and side effects in addition to the lack of response. I have found that very high dose venlafaxine can be sedating to a significant number of people and that they feel better when it is tapered.  I have also seen many people far along the augmentation strategies when tapering or discontinuing the venlafaxine was never considered. In some of these cases, the patient reports that venlafaxine is historically the only antidepressant that has worked for them in the past.

That brings up the issue of additive versus subtractive biases on the part of the patient. We have all been bombarded by pharmaceutical commercials suggesting the best way to mood stabilization is adding another medication – typically aripiprazole or brexpiprazole. In fact, those commercials speak directly to additive biases. It is often very difficult to convince a person to discontinue or reduce a medication that they have taken for years – even when careful review suggests it has been ineffective or creates significant side effects.

Could a discussion of additive versus subtractive transformations be useful in those situations? There is currently no empirical guidance, but these might be additional experiments to consider for both prescribing physicians and the patients they are seeing. Certainly the expectations that the patient has for any given treatment need to be discussed, and whether those expectations are reasonable given their personal experience and the objective evidence. On the side of prescribing physicians, it is fairly easy to flag medication combinations that are problematic, either from the perspective of too many medications being used at once, side effects not being analyzed closely enough, or medications being changed too frequently. Would discussing additive and subtractive strategies be useful in that setting?  Would a discussion of basic rules to address additive biases, such as discontinuing a medication when it is replaced, be useful?

Remaining vigilant that there are subtractive strategies out there is a useful lesson from this paper. Physicians are aware of the concept of parsimony and how it can be applied to medical care. Given the fact that the additive strategies are probabilistic and modifiable with conscious strategies, that should still prove to be useful in containing polypharmacy.

 

George Dawson, MD, DFAPA


Supplementary:

Another common additive strategy that I have encountered in the past 10 years is performance enhancement.  The patient presents not so much for treatment of a psychiatric problem but because they believe that adding a medication or two or three will improve their overall ability to function. Common examples would include:

1.  Presenting for treatment of ADHD (with a stimulant medication) not because of an attentional problem but because the stimulant creates increased energy and the feeling of enhanced productivity.

2.  Presenting for treatment of insomnia in the context of drinking excessive amounts of caffeine in the daytime and the caffeine is viewed as necessary to enhance energy at work or in the gym.  In some cases, stimulants are taken in the daytime and the idea is that the medication for insomnia would counter the effect of stimulants or caffeine taken late into the day.

3.  Taking anabolic androgenic steroids (AAS) and expecting to treat the side effects of mood disturbances, insomnia, anger, and irritability in order to keep taking the AAS.  Many AAS users also take other medications for this purpose as well as various vitamins, supplements, and stimulants to enhance workouts.

4.  Taking excessive numbers of supplements with no proven value and seeking to use medications for nondescript symptoms associated with the supplement use. In many cases, patients with psychiatric disorders are sold on elaborate mixtures of minerals and supplements with the promise that they address their symptoms.  In many cases it is difficult to determine if the associated vitamins and supplements interact with the indicated medical treatment or not.

All of these are additive strategies with no proven value that I have seen in outpatient settings.  It is obviously important to know if the patient being treated is using these strategies.  There are often competing considerations – for example, does the patient have a substance use disorder, and are substance use disorders another predisposing condition for additive biases? (I suspect they strongly are.)

 

References:

1:  Meyvis T, Yoon H. Adding is favoured over subtracting in problem solving. Nature. 2021 Apr;592(7853):189-190. doi: 10.1038/d41586-021-00592-0. PMID: 33828311.

2:  Adams GS, Converse BA, Hales AH, Klotz LE. People systematically overlook subtractive changes. Nature. 2021 Apr;592(7853):258-261. doi: 10.1038/s41586-021-03380-y. Epub 2021 Apr 7. PMID: 33828317.

3:  Sheikh-Taha M, Asmar M. Polypharmacy and severe potential drug-drug interactions among older adults with cardiovascular disease in the United States. BMC Geriatr. 2021 Apr 7;21(1):233. doi: 10.1186/s12877-021-02183-0. PMID: 33827442; PMCID: PMC8028718.

4:  Stahl, SM . Essential psychopharmacology: neuroscientific basis and practical applications. Cambridge University Press, Cambridge 2000. p. 363.

 

Graphics Credit:

So far they are all mine.  Yes that is one of my kitchen drawers but I am fairly good at avoiding polypharmacy.  Click on any graphic to enlarge.