Were you brainwashed?


If you prefer to listen to this episode, click here.

Back in May, writer and fellow second-generation ex-Moonie Lisa Kohn and I were interviewed by Fen Alankus for the “Follow the Woo” podcast. We compared and contrasted our experiences growing up in the Unification Church in different decades, and in different “castes.” While I was born in, Lisa’s mother joined when Lisa was just 10. In Church theology, much is made of the level of purity of the children born into the group. We were called “Blessed Children,” while a kid whose parents brought them in at a young age was called a “Jacob’s Child,” although that term wasn’t commonly used when Lisa was growing up in the Church.

(Fun fact: Fen was almost a Jacob’s Child too, but her mother left the Church in time! As soon as the episode is live I’ll link to it so you can hear more.)

One of the things that Lisa and I are both passionate about is helping others like us by sharing our stories of survival and resilience, often with an educational underpinning. So during the course of the podcast interview, Lisa shared that while she was on tour to promote her memoir To the Moon and Back: A Childhood Under the Influence, she had the opportunity to be interviewed for a popular daytime talk show. (The following is paraphrased from my memory of the conversation.)

“Before I went on stage the producer was doing a pre-interview and asked me, point blank, ‘were you brainwashed?’” she said.

There was a long pause where both Fen and I stared into our webcams, mouths hanging open.

“Are you fucking kidding me?” I finally managed.

“Nope.” Lisa shook her head and put her hand over her heart. “Swear to God.”

I must have audibly growled before muttering, “What a horrible question. That is such a fundamental misunderstanding of what thought reform is.” I might have also slammed my fist on my desk. “And what a horrible, shaming thing to say to a survivor!”

Frankly, at that point I was tempted to completely derail the conversation and delve into a treatise on what thought reform is, and why asking a survivor if they were brainwashed is a stigmatizing question - at best. But I refrained. After all, that’s what this blog is for!

The briefest history of the word “brainwashing”

So, first, let’s talk about the word “brainwashing.” It was coined by American journalist and intelligence agent Edward Hunter in the 1950s. According to Dr. Janja Lalich and Madeleine Tobias, authors of the book Take Back Your Life: Recovering from Cults and Abusive Relationships, “brainwashing” is a poor translation of the Chinese characters for "thought struggle" or "thought remodeling," which refer to Chairman Mao Zedong’s political campaigns of the 1920s to 1960s. They say that when discussing cult dynamics, psychologists and sociologists prefer the term “thought reform,” because “brainwashing” is misunderstood and often associated with Communism or torture. But, they add, the sophisticated and subtle systems of influence found in cults today hardly require a prison cell or a torture chamber.

Still, the word “brainwashing” prevails in both popular imagination and writing on the topic of cults. Some, like Rebecca Moore, sister of women who helped plan the 1978 Jonestown massacre, have gone so far as to say that brainwashing is a pseudoscience because, she says, the word “ignores research based explanations for human behaviour and dehumanises people by denying their free will.”

Others, like Dick Anthony, a research and forensic psychologist who wrote Pseudoscience and Minority Religions: An Evaluation of the Brainwashing Theories of Jean-Marie Abgrall for Social Justice Research, cite the failed CIA MKUltra experiments, which aimed to develop mind-controlling drugs for use against the Soviet bloc in response to alleged Soviet, Chinese, and North Korean use of mind control techniques on U.S. prisoners of war during the Korean War. Anthony states that the “CIA theory has been evaluated scientifically in research in several contexts […] and it has been shown to be ineffective in coercively changing worldviews.” After retiring, Sidney Gottlieb, the head of the CIA's MKUltra programs, dismissed the entire effort as “useless.”

So, is brainwashing A Thing?

Some, like anthropologist Geri-Ann Galanti, Ph.D., and sociology professor emeritus Eileen Barker, have written about their experiences with the Unification Church and echoed the sentiment that “brainwashing” techniques either did not exist or were ineffective. Galanti, who spent a week at Camp K, wrote in the paper Brainwashing and the Moonies, for a 1984 issue of Cultic Studies Journal, that she had expected to encounter “brainwashing” techniques, but that expectation was not met. However, she wrote that she was “struck by the subtle, yet powerful, socialization techniques through which the UC members were able to influence her.”

Barker, on the other hand, spent nearly seven years studying Unification Church members. Not only did she attend numerous workshops and visit communal facilities, but she interviewed members, ex-members, "non-joiners," and control groups of uninvolved individuals from similar backgrounds, as well as parents, spouses, and friends of members. Her thesis rejects the "brainwashing" theory as an explanation for conversion to the Unification Church because, she said, it explained neither the many people who attended Unification Church recruitment meetings and did not become members, nor the voluntary disaffiliation of members.

Disclosure: I have not finished reading all of The Making of a Moonie, Barker's book, which goes into these interviews and sums up her rejection of the “brainwashing” theory. But as somebody who has read a significant number of first-generation memoirs, I feel that this thesis does not acknowledge that many of the people who did join were in the vulnerable state that I talk about in a lot of my other posts. And to call it the “disaffiliation” of members glosses over something: it's not like people just woke up one day and said, “You know what, I'm done with this.” That moment, if and when it comes, occurs after years of abuse. I think the words “voluntary disaffiliation” cover up and ignore the incredible fight that getting out was for a lot of us.

Many have criticized Barker as a cult apologist. She is the founder and chairperson of the Information Network Focus on Religious Movements (INFORM). Dr. Steve Hassan’s website also cautions that Barker is on the referral list of the now Scientology-run Cult Awareness Network. Those are things I want to point out as we look at the “two sides” of this brainwashing question.

On the other side of the argument, in 2004, neuroscientist and physiologist Kathleen Taylor published a book called Brainwashing: The Science of Thought Control, in which she explores the neurological basis for reasoning and cognition in the brain, and proposes that the self is changeable, while describing the physiology of neurological pathways. In a 2005 article for The Guardian she writes,

“Say the word ‘brainwashing,’ and chances are we think of the Manchurian Candidate, of some mysterious process that installs malevolent new beliefs in people's minds - as if ideologies were just like versions of Windows. This horror-movie view of brainwashing is alarming, but also comforting. It scares us with one of the most frightening visions we can contemplate: the utter loss of free will as our minds are "turned" or "poisoned", our core identity subjugated to some evil-minded fanatic.”

And she goes on to say that the reason this is comforting is that it removes us from the process. It makes the “monster,” as she calls it, the responsible party. She was referring to Osama bin Laden and our thinking, “Oh, he’s the evil one,” because when we think of brainwashing, we think of a person who is able to simply put new ideas into our brains, and that removes our responsibility.

But, she cautions, brains are not computers, and there is no button that someone can press to turn a man from good to evil. Instead, she says, we need to think of brainwashing realistically: not as black magic but in secular, scientific terms, as a set of techniques that can act on a human brain to produce a change in beliefs - sometimes drastically and over short periods of time.

She says that these techniques include:

  • isolating the individual and controlling their access to information,

  • challenging their belief structure and creating doubt,

  • and repeating messages in a pressurized environment, which help change someone’s old beliefs and introduce their brain to new ones.


Isolating individuals is a technique used in thought reform.

But for the transplant to take, she says, emotion is required. A person needs to have positive feelings when they talk about the new beliefs, and negative ones whenever they think of old ideas. This uses a human brain's own power of association to manipulate it.

The Eight Criteria for Thought Reform

As mentioned, most cult-aware psychologists and sociologists prefer the term “thought reform” over “brainwashing” because the former circumvents the associations with the latter noted by Kathleen Taylor. Personally, I also believe that “thought reform” as the preferred term helps create clarity around the issues brought up by Galanti and Barker. One may not be able to flip a switch and turn someone into a “mindless robot” (or “brainwashed zombie,” as the popular media of the ’70s and ’80s referred to Unification Church members), but I believe that people can be coerced by what Michael D. Langone, PhD, refers to as “Deception, Dependency, and Dread” to facilitate the conversion experience. (Dr. Alexandra Stein builds on this in her book Terror, Love and Brainwashing: Attachment in Cults and Totalitarian Systems to describe how people are converted and why they stay.)

Like “brainwashing,” we get the term “thought reform” from the Cold War era, specifically from campaigns of the Communist Party of China to reform the thinking of Chinese citizens into accepting Marxism-Leninism and Mao Zedong Thought. American psychiatrist Dr. Robert Jay Lifton, popularized the term in his book Thought Reform and the Psychology of Totalism: A Study of "Brainwashing" in China.

According to the publisher, Lifton's research for the book began in 1953 with a series of interviews with American servicemen who had been held captive during the Korean War. In addition to interviews with 25 Americans, Lifton also interviewed 15 Chinese who had fled their homeland after having been subjected to indoctrination in Chinese universities. From these interviews, which in some cases occurred regularly for over a year, Lifton identified the tactics used by Chinese communists to cause drastic shifts in one's opinions and personality and "brainwash" American soldiers into making demonstrably false assertions.

The word “brainwashing,” Lifton says, was originally used to describe Chinese indoctrination techniques from this period, although it would also later be applied to Russian and Eastern European techniques as well. He goes on to say that the term has been applied to everything from American advertising to corporate training programs and even private prep schools. These applications, he says, are not always without basis, and suggests there is a continuum on which the method exists but that this liberal usage also contributes to confusion. As mentioned before, many of us imagine brainwashing as this all-powerful, irresistible, or magical method of achieving total control over an individual, but according to Lifton, it is none of those things.

In 1957, a symposium panel of experts convened at the New York Academy of Medicine to discuss the topic of mind control. During this panel, Lifton stated that thought reform consists of four stages: the emotional assault, followed by leniency, then confession, and finally re-education. The latter, in Cold War era China, meant the remaking of a person in the Communist image. In the summary of the symposium, Lifton is paraphrased discussing an example of the process, which aligns with the techniques that Taylor wrote about:

The prisoner is brought in and confronted with his crimes. Then he’s asked to detail his entire life and experience in China, which, for missionaries, might be a period of 30 or 40 years. After the interrogation, the prisoner endures “the struggle” in his cell: “he sits in the middle of the cell, and the other prisoners, his cellmates, form a circle around him and begin to shoot invectives at him denouncing him as an arch criminal, a stubborn imperialist who refuses to recognize his crimes.” These are prisoners working towards their own release.

After months of this treatment, there is leniency. Lifton quotes a priest, who describes the kindness of a judge—a judge who encourages him to confess and get it over with. The theme of confession is then repeated over and over.

When he eventually confesses, he is made to denounce his colleagues… and then undergoes a reeducation process, at the end of which there is “the development of a new identity, the recoding of reality, the conversion, if you will, in which the prisoner begins to look at the world through Communist eyes and, if the process is successful, is reborn.”


Cultic thought reform is aimed at both social control and individual change.

In his book, Lifton says that these elements of thought reform are closely related and overlapping because they bring into play a series of pressures and appeals, on an intellectual, emotional and physical level, that are aimed at both social control and individual change. Again, he and Taylor are in alignment on this, and these elements also resonate with Dr. Janja Lalich’s Bounded Choice framework, which I discuss in my post “The Illusion of Choice.”

At the end of Thought Reform and the Psychology of Totalism, Lifton writes, “Where totalism exists, a religion, political movement, or even a scientific organization becomes little more than [a] cult.” He goes on to identify eight psychological themes that he uses as criteria for evaluating if a situation meets the standard of a “totalist environment,” or a cult.

The criteria are as follows, with examples taken from psychotherapist Rosanne Henry’s website, https://www.cultrecover.com/ and Take Back Your Life:

  1. Milieu control:

    • This is control, or attempted control, over all a person sees, hears, reads, writes. This can be isolation from other people, psychological pressure, geographical distance or unavailable transportation, and sometimes physical pressure.

    • This can also be related to the control of a sequence of events, such as seminars, lectures, group encounters, which become increasingly intense and increasingly isolated, making it extremely difficult, both physically and psychologically, for one to leave.

  2. Mystical manipulation:

    • Experiences are planned and orchestrated by the group or its leader, yet presented as spontaneous or divinely ordained, “proving” the higher purpose of the group and its mission.

  3. Demand for purity:

    • This is a black and white worldview, where the world becomes sharply divided into the pure and the impure, the absolutely good (the group/ideology) and the absolutely evil (everything outside the group). In the Unification Church we used the terms “Fallen World” and “Outside World” interchangeably to refer to people not in the group.

    • The leader is the ultimate arbiter of what and who is good or bad.

    • One must continually change or conform to the group "norm."

    • Tendencies towards guilt and shame are used as emotional levers for the group's controlling and manipulative influences.

    • This ties in with the process of confession -- one must confess when one is not conforming.

  4. Confession:

    • Confession is carried beyond its ordinary religious, legal and therapeutic expressions to the point of becoming a cult in itself.

    • The follower is now “owned” by the group.

    • This makes it virtually impossible for followers to attain a reasonable balance between worth and humility, and they experience a loss of boundaries between what is secret and what the group knows.

  5. Aura of sacred science:

    • The group’s doctrine or ideology is held up as the ultimate moral vision for the ordering of human existence. The group demands reverence for the ideology, its originator and those who presently bear it.

    • Questioning or criticizing those basic assumptions is prohibited.

    • This inhibits individual thought, creative expression and personal development.

    • It offers considerable security to young people because it greatly simplifies the world and answers a contemporary need to combine a sacred set of dogmatic principles with a claim to a science embodying the truth about human behavior and human psychology.

  6. Loading the language (special vocabulary):

    • The group develops a jargon of thought-terminating clichés: complex problems are compressed into brief, definitive-sounding phrases that shut down questioning and constrict members’ thinking.

  7. Doctrine over person:

    • This is the denial of the self, and any perception other than the one dictated by the group.

    • If one questions the beliefs of the group or the leaders of the group, one is made to feel that there is something inherently wrong with them to even question -- it is always "turned around" on them and the questioner/criticizer is questioned rather than the questions answered directly.

      • In the Unification Church this manifested in telling members that they were being “invaded by evil Spirit World,” or “forming a common base with Satan” when they questioned. I talk about this more in my post Toxic positivity and the thought-terminating cliché as well.

  8. Dispensing of existence:

    • Similar to the demand for purity, this is when the group has an absolute or totalist vision of truth; those who are not in the group are evil, not enlightened, or not saved, and therefore do not have the right to exist.

      • When I was growing up my mom used to quote Genesis 6:1-4, “the sons of God saw the daughters of men that they were fair; and they took them wives of all which they chose.” Her interpretation of this verse was that when the sons of God married the daughters of men, it meant that people with souls (the sons of God) intermarried with people who had no souls (daughters of men). Therefore, to her, essentially all of humanity was wandering around in a no-soul or diluted-soul state, and it was only by receiving the marriage Blessing from Rev. Moon that people could be engrafted back onto the lineage of God and have souls. I don’t know if this came from Unification Church ideology, but it illustrates the point in a pretty stark manner.

    • If the non-people cannot be recruited, then they can be punished or even killed.

    • Similarly, this breeds a fear into followers who learn that their life can literally depend on this willingness to obey the group or the group leader.

Thought reform in action

Similarly to Lifton, Dr. Margaret Singer discussed conditions that lend themselves to psychological manipulation. Like Lifton, she investigated and testified about techniques used by North Koreans against American soldiers in wartime. (In later years she also repeatedly testified against the Unification Church.)

Those conditions, adapted from Take Back Your Life, are:

  • Keeping a person unaware of what is going on and how they are being changed one step at a time.

  • Controlling the person’s social and/or physical environment, especially the person’s time.

  • Systematically creating a sense of powerlessness, covert fear, guilt, and dependency.

  • Manipulating a system of rewards, punishments, and experiences in order to inhibit behavior that reflects the person’s former identity, attitudes, and/or beliefs.

  • Manipulating a system of rewards, punishments, and experiences in order to promote the group’s ideology, belief systems, and group-approved behaviors.

  • Putting forth a closed system of logic and an authoritarian structure that permits no feedback and cannot be modified except by leadership approval or executive order.

In the short term, these techniques might not produce lasting changes. But over the long term, they can create what Dr. Lalich and Madeleine Tobias refer to as the formation of the cult identity. In Take Back Your Life, they reference “Cults, Quacks, and Nonprofessional Therapies,” a piece by American psychiatrist Louis J. West and Dr. Singer from the Comprehensive Textbook of Psychiatry/III, in which West and Singer describe practices and behaviors that contribute to the control and potential exploitation of an individual:

  1. Isolation of recruits and manipulation of their environment.

  2. Control over channels of communication and information.

  3. Debilitation through inadequate diet and fatigue.

  4. Degradation or diminution of the self.

  5. Induction of insecurity, fear, and confusion, with joy and certainty through surrender to the group as the goal.

  6. Peer pressure, often applied through ritualized “struggle sessions” generating guilt and requiring open confessions.

  7. Insistence by seemingly all-powerful hosts that the recruits' survival—physical or spiritual—depends on identifying with the group.

  8. Assignment of monotonous tasks or activities, such as chanting or copying written materials or mind-numbing repetitious lessons in so-called classes.

  9. Acts of symbolic betrayal or renunciation of self, family, and previously held values, designed to increase the psychological distance between recruits and their previous way of life.

(They also have very strong words for cult apologists, saying, “Many apologists—individuals and organizations—contribute to the cults' veneer of respectability behind which ugly and harmful things are happening. […] If the apologists are church officials, physicians, or behavioral scientists, the grateful cults have been known to reward them with grants, awards, published praise, and even ‘research’ opportunities.”

This makes me think of the honorary degrees that were given to the Moons, the awards that were handed out, and the crowning of Reverend Moon in one of the Senate office buildings at the U.S. Capitol. John Gorenfeld’s book, Bad Moon Rising, opens with a scene about this crowning. All of these people, who, for whatever reason, contribute to this veneer of respectability are, in many ways, denying the abuse that people suffer and propping up the facade that these groups are respectable and are not engaging in human rights violations.)

Langone says that “after converts commit themselves to a cult, the cult’s way of thinking, feeling, and acting become second nature, while important aspects of their pre-cult personalities are suppressed or, in a sense, decay through disuse.”


The pre-cult identity can “decay” through disuse.

So I'm curious to know whether any other second-gen survivors of the Unification Church, or even those from other groups, have ever experienced moments where they see what they think might be their parents’ pre-cult identities coming through. I used to see it all the time with my mother. She was what she called a “fundamentalist Moonie.”

But at the same time, there were occasions when her raunchy sense of humor or her adventurous nature would come through, and I'd feel like I was seeing the person she used to be. She had this very strict, upright “Moonie personality” (again, the “fundamentalist Moonie” she called herself), but I don't think that's who she really was. And so, when you think of Langone’s statement, it is alarming to wonder, with both our parents and others who were coerced into joining these groups, how often that original personality is fighting to get through.

To point back to Lifton's criteria for thought reform: in the Unification Church, we were taught that the old personality was your “fallen nature” trying to fight through, and that the cult personality was your “original mind.” And so, as the original personality tried to assert itself, one would thought-reform oneself into the cult’s way of thinking to suppress that old self.

Thought reform in the Unification Church conversion experience

Now that we understand what thought reform is, I want to illustrate it via the process used by the Unification Church. After I escaped the Church, I spent years reading first-generation memoirs, trying to understand what had happened to my parents to make them “join” the Church. Few helped me understand what had been so compelling about it. My own parents’ testimonies about “meeting the Church” were always vague and misty-eyed, and they never offered details of their indoctrination.

It was only in the book Moonwebs: Journey into the Mind of a Cult, in which author and journalist Josh Freed describes the process by which the Unification Church utilized thought reform at the Boonville camp, that I understood what had been done to my parents and so many others. Taking the criteria of thought reform described above, we can see how someone in a vulnerable state might be susceptible to a cult’s indoctrination.

It’s an intense read, and I highly recommend it for anyone who might be on a similar journey of trying to piece together their past. Caveat: just because I understand that my parents were victims does not mean I absolve them of the abuse they perpetrated. I know this is a big trigger point for my fellow ex-second generation, so I wanted to state that very clearly. Speaking of triggers, the following details the psychological horrors that totalist environments can inflict upon a person. So feel free to skip ahead if reading about it would upset you.

Some of the passages that I found particularly illuminating from Moonwebs are in Chapter 10 where Freed writes,

Boonville’s assault rips apart their complacency and finds the “seeker” inside them. For some, the vulnerable spot is a lack of fulfillment in their work or personal lives; for others, the guilt of being modern “consumers” who have compromised their past ideals. Their own unused potential is used as a weapon to push them into extreme introspection…and further. […]

Then, poised at the abyss of nervous collapse, the recruit is offered only one avenue of escape, which he takes in sheer desperation: he fastens onto the group to escape his pain. It is not said, but it is implicit: as long as he remains in Boonville and continues his inner exploration, he will be accepted, loved—even respected, for the courage to face his problems and try to change.

The recruit did not need such support when he arrived at Boonville, but now, with his sense of identity collapsing, he needs it desperately; he clings to the group like a raft in a storm, carried along on a wave with no idea of where it will break, knowing only that for the time being it will keep him sane and alive. […]

Yet for all their power, these changes do not signify the “creation” of a Moonie. The resulting loss of self may be sufficient to create a new charismatic, or a Billy Graham convert, but it is not enough to send the recruit out flower selling and proselytizing for Moon; in fact, he usually does not yet even know that Moon exists.

If the recruit were to return to the world and his old “connections” at this point, says Dr. Clark, he would likely fall back into his old life, with little remaining but a sense of wonder at a remarkable experience (and a possible willingness to make regular contributions to the group that “saved” him). But just as evangelist John Wesley discovered 100 years earlier that he required regular “study groups” if his members were not to “slide back into misery”, Moon too requires continuing—though far greater supervision of the “convert”.

The new recruit is weak and frightened; he has lost his sense of identity and is desperately in need of something to restore or replace it, but he is not a “Moonie”. He is only the raw material with which the making of a Moonie will begin.

And in Chapter 14 where he writes,

Boonville creates an intense emotional environment that pushes the “patient” into the recesses of his own mind. In fact, Boonville is far more intense than any recognized therapy, since the recruit is totally isolated from the real world for days on end—food and sleep are cut back; the routine is bizarre and unsettling; and throughout, the recruit does not even know he is undergoing a “therapy” experience.

As a result, many recruits break down quickly, coming to a volatile and vulnerable state that is sometimes the object of intense therapies. However, at this point, it is as if the “therapist” has gone mad: rather than pulling back to help the patient find answers in himself, the Moonies close in, deliberately using the recruit’s growing vulnerability to drive him to the very brink of a nervous breakdown. Then, as he teeters in terror on the fringes of sanity, he is offered only one refuge to avoid going insane: abandoning his own identity entirely and fastening onto the “benevolent therapist” to lead him to safety […] once this has been achieved, the group then systematically turns this awesome power back upon the weakened recruit to dismember and obliterate all remnants of his former identity.

In the weeks ahead, the recruit is convinced that he has lived a hopelessly selfish existence: the only way out of his horrifying plight is to annihilate his old personality and rebuild it from scratch at the “therapist’s” directions. This is never stated explicitly, but it is understood: from this point onwards, the recruit may trust only the group’s standards—rigorous and strange as they may seem—and not his own ego-centered perspective. Whatever experience, knowledge and emotions that previously guided his behavior are worth nothing in the days ahead. They are “old concepts” that he must be strong enough to subdue.

This is the crucial juncture in becoming a Moonie; it is at this point that whatever makes up the person’s identity is totally invalidated, blotted out of any role in determining his future path. The recruit has been so terrified that he has lost all confidence in his own thoughts and feelings to guide his actions. To be accepted, he is ready to disbelieve everything he thinks and feels and obey anything the group tells him: a state of such utter submission that, like Benji, recruits can be steered not just off the path—but in the very opposite direction to that in which they had intended to go.

[…] This total annihilation of self-worth and utter willingness to obey is the very essence of the Unification Church’s control over its members.

If you would like to read the entire book and cannot find a physical copy for sale, the website The Tragedy of the Six Marys does have the full text available.

Similarly, if you’re interested in how Lifton’s Eight Criteria for Thought Reform applies to NXIVM’s Executive Success Programs, check out this piece by Paul Martin, Ph.D., the former Director of Wellspring Retreat and Resource Center near Albany, an internationally known residential treatment facility for persons abused by cults and dysfunctional relationships.

Why this matters for cult survivors

According to Lifton, “with greater knowledge about [cults], people are less susceptible to deception.”

I would also argue that the more people understand the true psychological abuse that thought reform entails, the less they will say stupid and harmful things to cult survivors or ask them hurtful, demeaning questions.

Here’s an example in action: the other day I was working on a different blog post, digging through my cult books (I jokingly call them my “cult library”) looking for information to support my thesis. In my perusal of the shelf, I came across the book Brainwash: The Secret History of Mind Control by Dominic Streatfeild. I’d purchased the book at the end of 2019, but with the events of 2020 I couldn’t bring myself to sit down and read it. However, as I was flipping through it, I remembered why I’d been interested in the book in the first place: it claimed to have an entire chapter dedicated to examining “brainwashing” in the Unification Church. So, I decided to finally sit down and read.

While I was at it, I decided to read the whole chapter for the camera (inspired by Faith Yen’s YouTube Live readings of the novel “Heavenly Deception” by Maggie Brooks, which also examines the indoctrination process in the Unification Church), and posted it in six episodes on IGTV:

I had been hopeful about the book, because it examined the way the Unification Church used fraud and deception to recruit and indoctrinate, and how the deprogramming practiced to extract young people from the Church was problematic as well. But by the end of the chapter (episodes 5 & 6) I felt like I was stopping every few sentences to discuss how the author’s language seemed to reflect a fundamental misunderstanding of indoctrination, and of the cycle of abuse that high-demand groups use to keep members inside.

Essentially the author spends nearly 30 pages talking about how the techniques practiced by the Church are similar to the abuses committed against prisoners of war, but then pivots to judge the members for committing to the Matching & Blessing ceremonies in the Church. He even goes so far as to ask one of his interview subjects if she was brainwashed, at which point I seriously wanted to throw the book.

(Another thing I noted as I read the chapter was that Streatfeild began by using the word “cult” when referring to the Unification Church, but by the last third of the chapter was either referring to it as a new religious movement or putting “cult” in quotation marks. It made me wonder if Streatfeild might fall more into the “cult apologist” camp.)

By the end of the chapter, I was quite disappointed in the author, because I felt he does a huge disservice to survivors. It’s not just that a survivor might read the book, though that could be harmful in and of itself. If somebody comes to this book for an education about what led people to join or to stay, the writing shames those survivors and reinforces the overall notion that cult survivors are weak, stupid, etc.

In his paper, The Relational System of the Traumatizing Narcissist, psychoanalyst Daniel Shaw says:

One of the reasons many of the people who leave cultic groups choose not to identify their own experience as abusive is because to do so would mean acknowledging an extraordinary degree of grief over the loss of a deeply cherished idealized attachment, connected to their most cherished hopes about themselves and about life.

Along with the unleashing of an extraordinary degree of shame about their own self deception and gullibility and shame and rage about the amount of abuse they were willing to endure for the sake of maintaining their tie to the leader. Eventually the realization that their devotion and labor within the group led to no real personal growth and to no significant contribution to society will also become a source of deep shame and regret.

Now, I don’t know if that differs between first-generation, second-generation, and multi-generational members. Maybe there is a good deal of nuance, depending on how much of one’s adult life is spent in a cult, especially for people who were born in. But I want to add that I think a lot of us don’t identify our experiences as abusive because, when we share them, the shame and regret are reflected back at us by an uneducated populace.

I think it’s important to remove stigma for cult survivors, because the less shame we feel around our experiences, the more we can heal from them. Questions like “were you brainwashed?” are shame-inducing because of the associations with the word, and they indicate a voyeuristic pleasure in the survivor’s experience, where the questioner is looking to hear about bizarre rituals that confirm the “otherness” of the survivor. The question also expresses doubt about the real harm inflicted on survivors. To me, it’s nearly as bad as the question, “Why didn’t you just leave?”

But when we understand what thought reform techniques are, and that they are a form of coercive control and psychological abuse, we should realize that the real question ought to be, “Are you comfortable sharing how you were harmed?” This question invites the survivor to set a boundary, to choose whether or not they are comfortable sharing their story, and indicates that they are believed. As I’ve written about in Memory Loss, Dissociation and Euphoric Recall in Cults, survivors are so conditioned not to believe their own experiences that someone simply saying, “I believe you” can make all the difference in our healing processes.

So not only do I want to help educate and empower fellow survivors, but I want to help inform others as well. If you are a survivor and somebody asks, “were you brainwashed?” you can shut that question down. You don’t have to answer. Again, the question in and of itself indicates a lack of empathy and a lack of understanding.

If somebody came to me and said, “Were you thought reformed?”

I would say, “Yeah, I was psychologically abused. And the fact that you can acknowledge it by using that term is really helpful.”

So again, if somebody asks, “were you brainwashed?” not only do you not have to answer, but you can send them this post to help explain what happened to you. If you so choose, you can start to share your experiences in the context of those eight criteria. Maybe those eight criteria for thought reform will help you understand your own experience better. But at the end of the day, what I want for all of us is to alleviate that sense of shame around our experiences. Because again, the less we acknowledge our experiences as abuse, the harder it is for us to process them.


If you have been in ANY high control group or religion, share your story with the hashtag #igotout. Share on your own platform, OR if you need to be anonymous and/or would like support, there are resources at the I Got Out website.

When you see a survivor share their story, let them know they have been heard. This is such a meaningful part of the movement. We all need to know we're not alone.

If you know someone who has been harmed by a high demand group, share #igotout posts you think would help them.

Together we can bring awareness to how many of us have been harmed by high control organizations and end the shame or stigma we might feel about our experiences.

Tell your story.

Impact lives.

Change the world.

Listen to the episode here:

Jen Kiaba