How to Sell a Pseudoscience

by Anthony R. Pratkanis
from Skeptical Inquirer, Volume 19, Number 4 (July/August 1995): Pages 19-25. Notes and references removed; refer to original for these.

Every time I read the reports of new pseudosciences in the Skeptical Inquirer or watch the latest “In Search Of”-style television show, I have one cognitive response: “Holy cow, how can anyone believe that?” Some recent examples include: “Holy cow, why do people spend $3.95 a minute to talk on the telephone with a ‘psychic’ who has never foretold the future?” “Holy cow, why do people believe that an all-uncooked vegan diet is natural and therefore nutritious?” “Holy cow, why would two state troopers chase the planet Venus across state lines thinking it was an alien spacecraft?” “Holy cow, why do people spend millions of dollars each year on subliminal tapes that just don’t work?”

There are, of course, many different answers to these “holy cow” questions. Conjurers can duplicate pseudoscientific feats and thus show us how sleights of hand and misdirections can mislead. Sociologists can point to social conditions that increase the prevalence of pseudoscientific beliefs. Natural scientists can describe the physical properties of objects to show that what may appear to be supernatural is natural. Cognitive psychologists have identified common mental biases that often lead us to misinterpret social reality and to conclude in favor of supernatural phenomena. These perspectives are useful in addressing the “holy cow” question; all give us a piece of the puzzle in unraveling this mystery.

I will describe how a social psychologist answers the holy cow question. Social psychology is the study of social influence: how human beings and their institutions influence and affect each other. For the past seven decades, social psychologists have been developing theories of social influence and testing the effectiveness of various persuasion tactics in their research. It is my thesis that many persuasion tactics discovered by social psychologists are used every day, perhaps not totally consciously, by the promoters of pseudoscience.

To see how these tactics can be used to sell flimflam, let’s pretend for a moment that we wish to have our very own pseudoscience. Here are nine common propaganda tactics that should result in success.

  1. Create a Phantom

The first thing we need to do is to create a phantom: an unavailable goal that looks real and possible; it looks as if it might be obtained with just the right effort, just the right belief, or just the right amount of money, but in reality it can’t be obtained. Most pseudosciences are based on belief in a distant or phantom goal. Some examples of pseudoscience phantoms: meeting a space alien, contacting a dead relative at a seance, receiving the wisdom of the universe from a channeled dolphin, and improving one’s bowling game or overcoming the trauma of rape with a subliminal tape.

Phantoms can serve as effective propaganda devices. If I don’t have a desired phantom, I feel deprived and somehow less of a person. A pseudoscientist can take advantage of these feelings of inferiority by appearing to offer a means to obtain that goal. In a rush to enhance self-esteem, we suspend better judgment and readily accept the offering of the pseudoscience.

The trick, of course, is to get the new seeker to believe that the phantom is possible. Often the mere mention of the delights of a phantom will be enough to dazzle the new pseudoscience recruit. After all, who wouldn’t want a better sex life, better health, and peace of mind, all from a $14.95 subliminal tape? The fear of loss of a phantom also can motivate us to accept it as real. The thought that I will never speak again to a cherished but dead loved one or that next month I may die of cancer can be so painful as to cause me to suspend my better judgment and hold out hope against hope that the medium can contact the dead or that Laetrile works. But at times the sell is harder, and that calls for our next set of persuasion tactics.

  2. Set a Rationalization Trap

The rationalization trap is based on the premise: Get the person committed to the cause as soon as possible. Once a commitment is made, the nature of thought changes. The committed heart is not so much interested in a careful evaluation of the merits of a course of action but in proving that he or she is right.

To see how commitment to a pseudoscience can be established, let’s look at a bizarre case: mass suicides at the direction of cult leader Jim Jones. This is the ultimate “holy cow” question: “Why kill yourself and your children on another’s command?” From outside the cult it appears strange, but from the inside it seems natural. Jones began by having his followers make easy commitments (a gift to the church, attending Wednesday night service) and then increased the level of commitment: more tithes, more time in service, loyalty oaths, public admission of sins and punishment, selling of homes, forced sex, moving to Guyana, and then the suicide. Each step was really a small one. Outsiders saw the strange end product; insiders experienced an ever increasing spiral of escalating commitment.

This is a dramatic example, but not all belief in pseudoscience is so extreme. For example, there are those who occasionally consult a psychic or listen to a subliminal tape. In such cases, commitment can be secured by what social psychologists call the foot-in-the-door technique. It works this way: You start with a small request, such as accepting a free chiropractic spine exam, taking a sample of vitamins, or completing a free personality inventory. Then a larger request follows: a $1,000 chiropractic realignment, a vitamin regime, or an expensive seminar series. The first small request sets the commitment: Why did you get that bone exam, take those vitamins, or complete that test if you weren’t interested and didn’t think there might be something to it? An all too common response: “Well gosh, I guess I am interested.” The rationalization trap is sprung.

Now that we have secured the target’s commitment to a phantom goal, we need some social support for the newfound pseudoscientific beliefs. The next tactics are designed to bolster those beliefs.

  3. Manufacture Source Credibility and Sincerity

Our third tactic is to manufacture source credibility and sincerity. In other words, create a guru, leader, mystic, lord, or other generally likable and powerful authority, one whom people would be just plain nuts not to believe. For example, practitioners of alternative medicine often have “degrees” as chiropractors or in homeopathy. Subliminal tape sellers claim specialized knowledge and training in such arts as hypnosis. Advocates of UFO sightings often become directors of “research centers.” “Psychic detectives” come with long resumes of police service. Prophets claim past successes. For example, most of us “know” that Jeane Dixon predicted the assassination of President Kennedy but probably don’t know that she also predicted a Nixon win in 1960. As modern public relations has shown us, credibility is easier to manufacture than we might normally think.

Source credibility is an effective propaganda device for at least two reasons. First, we often process persuasive messages in a half-mindless state, either because we are not motivated to think, don’t have the time to consider, or lack the ability to understand the issues. In such cases, the presence of a credible source can lead one to quickly infer that the message has merit and should be accepted.

Second, source credibility can stop questioning (Kramer and Alstad 1993). After all, what gives you the right to question a guru, a prophet, the image of the Mother Mary, or a sincere seeker of life’s hidden potentials? I’ll clarify this point with an example. Suppose I told you that the following statement is a prediction of the development of the atomic bomb and the fighter aircraft:

They will think they have seen the Sun at night
When they will see the pig half-man:
Noise, song, battle fighting in the sky perceived,
And one will hear brute beasts talking.

You probably would respond: “Huh? I don’t see how you get the atomic bomb from that. This could just as well be a prediction of an in-flight showing of the Doctor Dolittle movie or the advent of night baseball at Wrigley Field.” However, attribute the statement to Nostradamus and the dynamics change. Nostradamus was a man who supposedly cured plague victims, predicted who would be pope, foretold the future of kings and queens, and even found a poor dog lost by the king’s page. Such a great seer and prophet can’t be wrong. The implied message: The problem is with you; instead of questioning, why don’t you suspend your faulty, linear mind until you gain the needed insight?

  4. Establish a Granfalloon

Where would a leader be without something to lead? Our next tactic supplies the answer: Establish what Kurt Vonnegut terms a “granfalloon,” a proud and meaningless association of human beings. One of social psychology’s most remarkable findings is the ease with which granfalloons can be created. For example, the social psychologist Henri Tajfel merely brought subjects into his lab, flipped a coin, and randomly assigned them to be labeled either Xs or Ws. At the end of the study, total strangers were acting as if those in their granfalloon were their close kin and those in the other group were their worst enemies.

Granfalloons are powerful propaganda devices because they are easy to create and, once established, the granfalloon defines social reality and maintains social identities. Information is dependent on the granfalloon. Since most granfalloons quickly develop out-groups, criticisms can be attributed to those “evil ones” outside the group and thus stifled. To maintain a desired social identity, such as that of a seeker or a New Age rebel, one must obey the dictates of the granfalloon and its leaders.

The classic seance can be viewed as an ad-hoc granfalloon. Note what happens as you sit in the dark and hear a thud. You are dependent on the group led by a medium for the interpretation of this sound. “What is it? A knee against the table or my long lost Uncle Ned? The group believes it is Uncle Ned. Rocking the boat would be impolite. Besides, I came here to be a seeker.”

Essential to the success of the granfalloon tactic is the creation of a shared social identity. In creating this identity, here are some things you might want to include:

(a) rituals and symbols (e.g., a dowser’s rod, secret symbols, and special ways of preparing food): these not only create an identity, but provide items for sale at a profit.

(b) jargon and beliefs that only the in-group understands and accepts (e.g., thetans are impeded by engrams, you are on a cusp with Jupiter rising): jargon is an effective means of social control since it can be used to frame the interpretation of events.

(c) shared goals (e.g., to end all war, to sell the faith and related products, or to realize one’s human potential): such goals not only define the group, but motivate action as believers attempt to reach them.

(d) shared feelings (e.g., the excitement of a prophecy that might appear to be true or the collective rationalization of strange beliefs to others): shared feelings aid in the “we” feeling.

(e) specialized information (e.g., the U.S. government is in a conspiracy to cover up UFOs): this helps the target feel special because he or she is “in the know.”

(f) enemies (e.g., alternative medicine opposing the AMA and the FDA, subliminal-tape companies spurning academic psychologists, and spiritualists condemning Randi and other investigators): enemies are very important because you as a pseudoscientist will need scapegoats to blame for your problems and failures.

  5. Use Self-Generated Persuasion

Another tactic for promoting pseudoscience and one of the most powerful tactics identified by social psychologists is self-generated persuasion — the subtle design of the situation so that the targets persuade themselves. During World War II, Kurt Lewin was able to get Americans to eat more sweetbreads (veal and beef organ meats) by having them form groups to discuss how they could persuade others to eat sweetbreads.

Retailers selling so-called nutritional products have discovered this technique by turning customers into salespersons. To create a multilevel sales organization, the “nutrition” retailer recruits customers (who recruit still more customers) to serve as sales agents for the product. Customers are recruited as a test of their belief in the product or with the hope of making lots of money (often to buy more products). By trying to sell the product, the customer-turned-salesperson becomes more convinced of its worth. One multilevel leader tells his new sales agents to “answer all objections with testimonials. That’s the secret to motivating people,” and it is also the secret to convincing yourself.

  6. Construct Vivid Appeals

Joseph Stalin once remarked: “The death of a single Russian soldier is a tragedy. A million deaths is a statistic.” In other words, a vividly presented case study or example can make a lasting impression. For example, the pseudosciences are replete with graphic stories of ships and planes caught in the Bermuda Triangle, space aliens examining the sexual parts of humans, weird goings-on in Borley Rectory or Amityville, New York, and psychic surgeons removing cancerous tumors.

A vivid presentation is likely to be very memorable and hard to refute. No matter how many logical arguments can be mustered to counter the pseudoscience claim, there remains that one graphic incident that comes quickly to mind to prompt the response: “Yeah, but what about that haunted house in New York? Hard to explain that.” By the way, one of the best ways to counter a vivid appeal is with an equally vivid counter appeal. For example, to counter stories about psychic surgeons in the Philippines, Randi tells an equally vivid story of a psychic surgeon palming chicken guts and then pretending to remove them from a sick and now less wealthy patient.

  7. Use Pre-Persuasion

Pre-persuasion is defining the situation or setting the stage so you win, and sometimes without raising so much as a valid argument. How does one do this? At least three steps are important.

First, establish the nature of the issue. For example, to avoid the wrath of the FDA, advocates of alternative medicine define the issue as health freedom (you should have the right to the health alternative of your choice) as opposed to consumer protection or quality care. If the issue is defined as freedom, the alternative medicine advocate will win because “Who is opposed to freedom?” Another example of this technique is to create a problem or disease, such as reactive hypoglycemia or yeast allergy, that then just happens to be “curable” with whatever quackery you have to sell.

Another way to define an issue is through differentiation. Subliminal-tape companies use product differentiation to respond to negative subliminal-tape studies. The claim: “Our tapes have a special technique that makes them superior to other tapes that have been used in studies that failed to show the therapeutic value of subliminal tapes.” Thus, null results are used to make a given subliminal tape look superior. The psychic network has taken a similar approach — “Tired of those phony psychics? Ours are certified,” says the advertisement.

Second, set expectations. Expectations can lead us to interpret ambiguous information in a way that supports an original hypothesis. For example, a belief in the Bermuda Triangle may lead us to interpret a plane crash off the coast of New York City as evidence for the Triangle’s sinister effects. We recently conducted a study that showed how an expectation can lead people to think that subliminal tapes work when in fact they do not. In our study, expectations were established by mislabeling half the tapes. The results showed that about half the subjects thought they improved (though they did not) based on how the tape was labeled (and not the actual content). The label led them to interpret their behavior in support of expectations, or what we termed an “illusory placebo” effect.

A third way to pre-persuade is to specify the decision criteria. For example, psychic supporters have developed guidelines on what should be viewed as acceptable evidence for paranormal abilities — such as using personal experiences as data, placing the burden of proof on the critic and not the claimant, and above all else keeping James Randi and other psi-inhibitors out of the testing room. Accept these criteria and one must conclude that psi is a reality. The collaboration of Hyman and Honorton is one positive attempt to establish a fair playing field.

  8. Frequently Use Heuristics and Commonplaces

My next recommendation to the would-be pseudoscientist is to use heuristics and commonplaces. Heuristics are simple if-then rules or norms that are widely accepted; for example, if it costs more it must be more valuable. Commonplaces are widely accepted beliefs that can serve as the basis of an appeal; for example, government health-reform should be rejected because politicians are corrupt (assuming political corruption is a widely accepted belief). Heuristics and commonplaces gain their power because they are widely accepted and thus induce little thought about whether the rule or argument is appropriate.

To sell a pseudoscience, liberally sprinkle your appeal with heuristics and commonplaces. Here are some common examples.

(a) The scarcity heuristic, or if it is rare it is valuable. The Psychic Friends Network costs a pricey $3.95 a minute and therefore must be valuable. On the other hand, an average University of California professor goes for about 27 cents per minute and is thus of little value!

(b) The consensus or bandwagon heuristic, or if everyone agrees it must be true. Subliminal tapes, psychic phone ads, and quack medicine feature testimonials of people who have found what they are looking for.

(c) The message length heuristic, or if the message is long it is strong. Subliminal-tape brochures often list hundreds of subliminal studies in support of their claims. Yet most of these studies do not deal with subliminal influence and thus are irrelevant. An uninformed observer would be impressed by the weight of the evidence.

(d) The representative heuristic, or if an object resembles another (on some salient dimension) then they act similarly. For example, in folk medicines the cure often resembles the apparent cause of the disease. Homeopathy is based on the notion that small amounts of substances that can cause a disease’s symptoms will cure the disease. The Chinese Doctrine of Signatures claims that similarity of shape and form determine therapeutic value; thus rhinoceros horns, deer antlers, and ginseng root look phallic and supposedly improve vitality.

(e) The natural commonplace, or what is natural is good and what is made by humans is bad. Alternative medicines are promoted with the word “natural.” Psychic abilities are portrayed as natural, but lost, abilities. Organic food is natural. Of course mistletoe berries are natural too, and I don’t recommend a steady diet of these morsels.

(f) The goddess-within commonplace, or humans have a spiritual side that is neglected by modern materialistic science. This commonplace stems from the medieval notion of the soul, which was modernized by Mesmer as animal magnetism and then converted by psychoanalysis into the powerful, hidden unconscious. Pseudoscience plays to this commonplace by offering ways to tap the unconscious, such as subliminal tapes, to prove this hidden power exists through extrasensory perception (ESP) and psi, or to talk with the remnants of this hidden spirituality through channeling and the seance.

(g) The science commonplaces. Pseudosciences use the word “science” in a contradictory manner. On the one hand, the word “science” is sprinkled liberally throughout most pseudosciences: subliminal tapes make use of the “latest scientific technology”; psychics are “scientifically tested”; health fads are “on the cutting edge of science.” On the other hand, science is often portrayed as limited. For example, one article in Self magazine reported our subliminal-tapes studies showing no evidence that the tapes worked and then stated: “Tape makers dispute the objectivity of the studies. They also point out that science can’t always explain the results of mainstream medicine either.” In each case a commonplace about science is used: (1) “Science is powerful” and (2) “Science is limited and can’t replace the personal.” The selective use of these commonplaces allows a pseudoscience to claim the power of science but have a convenient out should science fail to promote the pseudoscience.

  9. Attack Opponents Through Innuendo and Character Assassination

Finally, you would like your pseudoscience to be safe from harm and external attack. Given that the best defense is a good offense, I offer the advice of Cicero: “If you don’t have a good argument, attack the plaintiff.”

Let me give a personal example of this tactic in action. After our research showing that subliminal tapes have no therapeutic value was reported, my coauthors, Tony Greenwald, Eric Spangenberg, Jay Eskenazi, and I were the target of many innuendoes. One subliminal newsletter edited by Eldon Taylor, Michael Urban, and others claimed that our research was a marketing study designed not to test the tapes but to “demonstrate the influence of marketing practices on consumer perceptions.” The article points out that the entire body of data presented by Greenwald represents a marketing dissertation by Spangenberg and questions why Greenwald is even an author. The newsletter makes other attacks as well, claiming that our research design lacked a control group, that we really found significant effects of the tapes, that we violated American Psychological Association ethics with a hint that an investigation would follow, that we prematurely reported our findings in a manner similar to those who prematurely announced cold fusion, and that we were conducting a “Willie Horton”-style smear campaign against those who seek to help Americans achieve their personal goals.

Many skeptics can point to similar types of attacks. In the fourteenth century, Bishop Pierre d’Arcis, one of the first to contest the authenticity of the Shroud of Turin, was accused by shroud promoters of being motivated by jealousy and a desire to possess the shroud. Today, James Randi is described by supporters of Uri Geller as “a powerful psychic trying to convince the world that such powers don’t exist so he can take the lead role in the psychic world.”

Why is innuendo such a powerful propaganda device? Social psychologists point to three classes of answers. First, innuendoes change the agenda of discussion. Note the “new” discussion on subliminal tapes isn’t about whether these tapes are worth your money or not. Instead, we are discussing whether I am ethical or not, whether I am a competent researcher, and whether I even did the research.

Second, innuendoes raise a glimmer of doubt about the character of the person under attack. That doubt can be especially powerful when there is little other information on which to base a judgment. For example, the average reader of the subliminal newsletter I quoted probably knows little about me, little about the research, and little about the peer review process that evaluated it, and doesn’t know that I make my living from teaching college and not from the sale of subliminal tapes. This average reader is left with the impression of an unethical and incompetent scientist who is out of control. Who in their right mind would accept what that person has to say?

Finally, innuendoes can have a chilling effect. The recipient begins to wonder about his or her reputation and whether the fight is worth it. The frivolous lawsuit is an effective way to magnify this chilling effect.

Can Science Be Sold with Propaganda?

I would be remiss if I didn’t address one more issue: Can we sell science with the persuasion tactics of pseudoscience? Let’s be honest; science sometimes uses these tactics. For example, I carry in my wallet a membership card to the Monterey Bay Aquarium with a picture of the cutest little otter you’ll ever see. I am in the otter granfalloon. On some occasions skeptics have played a little loose with their arguments and their name-calling. As just one example, see George Price’s 1955 Science article attacking Rhine’s and Soal’s work on ESP — an attack that went well beyond the then available data.

I can somewhat understand the use of such tactics. If a cute otter can inspire a young child to seek to understand nature, then so be it. But we should remember that such tactics can be ineffective in promoting science if they are not followed up by involvement in the process of science — the process of questioning and discovering. And we should be mindful that the use of propaganda techniques has its costs. If we base our claims on cheap propaganda tactics, then it is an easy task for the pseudoscientist to develop even more effective propaganda tactics and carry the day.

More fundamentally, propaganda works best when we are half mindless, simplistic thinkers trying to rationalize our behavior and beliefs to ourselves and others. Science works best when we are thoughtful and critical and scrutinize claims carefully. Our job should be to promote such thought and scrutiny. We should be careful to select our persuasion strategies to be consistent with that goal.
