Is It Really Possible to Effect Informed Consent?

Thinker on a Rock, Barry Flanagan, National Gallery of Art Sculpture Garden, Washington, D.C.

We are always trying to make sense of the world by telling ourselves stories, constructing our beliefs and connecting events through cause and effect, inserting a sense of purpose into the latter from the former (as illustrated and explained in Will Storr’s The Science of Storytelling). Lisa Feldman Barrett (Seven and a Half Lessons About the Brain, and her much less satisfying How Emotions Are Made) and others insist that we “think” first in our emotions and that our conscious thoughts follow, often trying to explain the decisions our emotions have already made. She also says that the brain is constantly predicting future events so as to keep the body ready to respond. I wonder whether this work of the brain is emotional or rational, though it is clearly not conscious.

As we try to look backward to decide which events we understand sufficiently to consent or dissent, we must be wary of confirmation bias, of our need to understand ourselves in a certain way (beware cognitive dissonance!), and of the fact that the others we meet or read about or hear about live with the same constraints, needs, and prejudices.

Be all that as it may, the world does seem crazy at the moment. As I’ve discussed in earlier posts, people are dying deaths of despair; corporations have devised ways of hyper-focusing our emotions, thoughts, and group associations to drive us to oppose others ever more passionately, way past the level of rationality (for a detailed explanation, see An Ugly Truth: Inside Facebook’s Battle for Domination, by Sheera Frenkel and Cecilia Kang); and we are having great difficulty deciding not just how, but whether, to protect ourselves and the world against the ravages of COVID and its successors. The COP26 conference on the climate emergency may or may not be our final decision, as a world, about whether we will reduce the threats extreme weather poses to our food supply; enhance our ability to live in extreme temperatures; live through, and rebuild after, ever more frequent and violent weather; and build a world where essential workers get the respect and income they deserve. Doubtless the reader can expand the list.

In my first post of July 25, 2018, I wrote:

Apparently in a time when we in democracies need most to be able to think for ourselves, take in information, and analyze complex matters, we are to believe that most of us if not all are inherently unable to do so. But we who do have individual thoughts may be able to influence those others, not because we are authoritative nor perhaps persuasive in any way, but simply because we are heard by people near us. The task is to influence others more than they influence us on a certain matter.

Judging by Storr’s account, this is what really happens: our emotions acquire our moral stance by age seven (he doesn’t explain how). In the meantime, as part of that, the brain (as if it were an independent thing, not really oneself) has been developing theories about personal identity, values, and our place in the world and in the scheme of things, and has developed concepts of what is normal in life and what should be expected. Having done that, the brain defends those conclusions against new academic information and news, different opinions, experiences which don’t jibe with what was expected, and the people associated with those challenges and experiences. Because we grow up in social relationships, some or much of how we understand ourselves stems from the culture around us. So in the West, highly individualized experiences are emphasized and prized: competition, well-constructed moral tales, and stories with definite beginnings, middles, and conclusions are all featured. Compare these with Eastern cultures, in which ancestors, elders, community welfare and support, and obligations to all of those shape and frame our self-understanding (see the previously cited The WEIRDest People in the World: How the West Became Psychologically Peculiar and Particularly Prosperous, by Joseph Patrick Henrich).

The history of our people, so to speak, also affects us, as the eminent historian Margaret MacMillan demonstrates in War: How Conflict Shaped Us. She places commanding emphasis on how very difficult it is, in some cultures, to disrespect war. I can certainly attest to that: I am a veteran of the Cold War and the wars in Southeast Asia, having lost friends in the latter, yet I do not commemorate Remembrance Day in Canada. I can feel people’s eyes on me as I go around without wearing a poppy, and I do not attend cenotaph events. I have always felt that we all should have learned the horrors of war from the First World War, and that those lessons should have led us to avoid every subsequent war. Instead, we repeat the old ceremonies as if remembrance were a redeeming virtue in itself and a safeguard against war. Meanwhile we arm, and then ignore, those who continue to fight around the world. I don’t call this hypocrisy, because people I know are deeply sincere in their respect for the importance of this remembrance. I just think it is futile, and that it actually encourages us to continue fighting wars (more about this in an upcoming post).

We are also shaped by our countries’ education programs, as the wonderful recent overview at https://www.americanpurpose.com/articles/history-the-dictators-plaything/ makes clear. Apparently it is more difficult than I realized to develop independent thought.

There is growing disrespect for intellectual honesty and impartiality, both labelled as elitist privilege that serves only to reinforce White values, according to the NYT’s John McWhorter (https://www.nytimes.com/2021/10/26/opinion/wokeness-america.html?searchResultPosition=4). Arguing the other side is Jonathan Rauch in The Constitution of Knowledge: A Defense of Truth. He believes that just as science can claim evidence-based research, so too there is evidence-based truth. I’m not sure he makes the case, but his effort is passionate and elegant.

In facilitating group discussions about contentious topics, I have found that when people talk about the life experiences that brought them to their particular views, those who hear them find it easy to understand the full significance the issue holds for the speaker. No one can argue with a person’s life story, and there is an intuitive understanding, alongside the intellectual one, of what that person means when speaking their convictions. It is often easier to disagree pleasantly with someone when you understand their story.

With that in mind, I say:  I guess I was spoiled.

In my formative and teen years in the U.S. during the fifties, sixties, and seventies, television news was expanding to thirty-minute nightly broadcasts; the Sunday interview shows were seen as necessary to really understand the world around us; Edward R. Murrow was an icon; and young Americans were showing themselves much more reluctant than older American generations to trust government decisions about going to war (Quebeckers, by contrast, had historically been very resistant). Blacks were pushed to desperation in Los Angeles, Detroit, and elsewhere, sometimes to the point of massive riots. I could see that things were not right for everyone, and being able to see that, and to argue about it, were considered highly respectable forms of comportment. Of course, doing something helpful was even better. Being in the know was becoming ever more important.

And intellectual acumen was expected and respected. In the large “screening-out” course in the School of International Relations at the University of Southern California, for example, the Dean of the school himself taught, a sign of deep interest from the very top. We were assigned many journal articles to read each week. The next week he would call on one student to summarize one of the articles and express an opinion of it; then another student was called upon to critique both the article and the first student’s opinion. The demand for intellectual quality was profound, and not many of us made it through that class. My final exam in one of the upper-level courses was oral, undertaken at the office of a professor at the RAND Corporation, the prestigious research and development organization that advised the Air Force, another sign of intellectual expectations. (I did not do well in that one.)

Because my Bachelor of Foreign Service program had only two slots for courses outside the major, I was required to take an undergraduate thesis course. I developed my thesis, “Moral Responsibility in the Ideas of Freud and Jung,” during conversations while walking around campus alone with the dean of Letters, Arts, and Sciences himself, who was also a professor of religion. You can see how interested the university was in developing our intellects.

I received my officer training through Air Force Reserve Officer Training and the Air Force Academy; the Air Force required a bachelor’s degree for a commission. I can’t say that everyone I encountered demonstrated particular intellectual brilliance, nor probably did I. But as an intelligence briefing officer, I faced questions and expectations from generals who were very sharp, probing, and demanding. And because many young men avoided being drafted into the army by enlisting in the Air Force, many of my staff held master’s degrees; if I wanted their respect and their best efforts in helping me prepare my analyses and briefings, I had to be at least their intellectual equal. As all officers were told in a giant assembly with the Commander in Chief of Strategic Air Command, the old adage that “ours is not to reason why, ours is but to do or die” may apply to some in the chain of command, but not to any of us there.

Seminary was a graduate school: three years of academics plus a year of supervised practicum. It had a very demanding faculty, noted for preparing graduates for Ph.D. programs at the most eminent schools and themselves much-touted experts in their own fields (especially Biblical archaeology and ethics). Required were intense class participation, written tests, oral exams (sometimes before panels of scholars), and three languages (plus a fourth: the esoteric vocabulary of theology). We studied theology, church history, and pastoral care and counseling, and, of course, did deep research into the religious, historical, anthropological, and archaeological background of the Bible. Preparing for ministry was more than just being a personable guy, speaking well, and being sensitive in highly emotional situations.

As chaplain and patients’ rights advisor, I met regularly with clinicians at the Detroit Psychiatric Institute. Taking courses alongside second-year interns, I found that the medicine, basic compassion, and social work skills required kept us on our toes. Mental hospitals were moving from a preference for institutionalization to treatment in the community, with “milieu therapy” the order of the day, while the professions moved from talk therapy and conjoint family therapy to more medicine-based treatment. Everyone was on shifting ground, but we all had to believe that science, compassion, and our intellects would eventually give us a clear understanding of what was causing mental illness and would cure our patients. I don’t think we accomplished that, but we were definitely not working “by guess and by golly.” The chief of staff, himself a child psychiatrist, and many of the doctors, psychologists, and social workers were Black Americans, or foreigners of different ethnic and cultural backgrounds. So I never had the impression that intellectual rigor and impartiality were White privilege.

But younger people tell me I’m spoiled. They seem not to encounter such rigorous expectations.

So these are the experiences which bring me to this point of view.

What to make of all this? Is seeking informed consent or dissent regarding the past, present, or future quixotic? How will I know? Perhaps I can’t see past my confirmation bias, or I haven’t really tuned in to my emotions and am trying to reach an intellectual conclusion without their consent. But how would I know?

Those who propound theories about decision-making run the same risks as I do. With all due respect to the scientific process, I note that evidence is gathered by people who have already agreed on which processes and expected results would indicate a specific conclusion. Everybody starts from some background, discipline, and experience, as do I. Anyone examining evidence does so through some lens of values. Their stories shape their intellects as much as mine shapes my own. “Everyone is entitled to their own opinions but not to their own facts,” it is said, and I agree. But the conclusions drawn from the facts of studies of the brain, the mind, and personhood are often debatable. A rabbi once taught in a class that it is absolutely true, as Scripture says, that God spoke; what was actually said is not so clear.

I am persuaded that all our decisions are influenced by many things, but I think that awareness of those influences makes it possible to resist them. It is possible to know when I am thinking independently and when I am going with the flow. I think that many people do not try to think independently. Their susceptibility to going with the crowd and believing everything they read on Facebook and other social media is a fault which, en masse, is causing the current crises in democracies. People must find a way to think for themselves.

I hope that, at the very least, people who read this blog will be encouraged to think independently and to exercise every possible conscious process of examining, questioning, and thinking for themselves, so that their consent and dissent are informed. And I hope they will talk about these matters with those who share this impulse, and perhaps more importantly, with those who don’t.