Tuesday, September 6, 2016

The Confidence Game: A Guide to the 2016 Presidential Election?


Found this book at the library and thought I'd read it to understand the 2016 Presidential election cycle.  It's been useful in that regard.

The author arranges the chapters in the way she defines the structure of a confidence game.  Each chapter explores the progressive steps in the con:  Put-up, play, rope, tale, convincer, breakdown, send, touch, blow-off, and fix.  You can play along until Election Day 2016 by identifying each step of at least one candidate's confidence game as he/she strings the public along.


The Confidence Game:  Why We Fall for It… Every Time by Maria Konnikova
NY:  Viking, 2016
ISBN 978-0-525-42741-4

(5)  “Religion,” Voltaire is said to have remarked, “began when the first scoundrel met the first fool.”

(18)  “There’s a sucker born every minute, and one to trim ‘em and one to knock ‘em.”

(34-35)  According to psychologist Robert Feldman, who has spent more than four decades studying the phenomenon, we lie, on average, three times during a routine ten-minute conversation with a stranger or casual acquaintance.

(36)  Would you be a grifter - even a mild one - if given the chance?  Try this short test.  Take your index finger, raise it to your forehead, and draw the letter Q.

Done?  Which way is your Q facing - tail to the right, or tail to the left?  The test, described in detail by Richard Wiseman, a psychologist and famed skeptic, is a way to gauge your “self-monitoring” tendency.  If you drew the letter with the tail to the left, so that others could read it, you are a high self-monitor.  That means you are more concerned with appearance and perception - how others see you.  To achieve the desired effect, you are likely more willing to manipulate reality - even just a bit - to make a better impression.

(59)  In one series of studies, [Nicholas] Epley and his colleagues found that people were far slower to discern a different perspective from their own, and that under time pressure they were unlikely to do so at all.  He called it “egocentric anchoring”:  we are our own point of departure.  We assume that others know what we know, believe what we believe, and like what we like….

We never learn to be expert people-readers because that expertise can backfire spectacularly.  Why form accurate judgments when the inaccurate ones make our lives far more pleasant and easy?

(82)  Fred Demara:  “Americans would rather be liked than right.  (This fact allowed you to operate after reasonable suspicion was aroused.)  Americans are amazingly forgiving to the errant sinner (almost everywhere you went they would take you back).  Americans are among the most trusting people in the world.  Accept you at your word and at your face value until proven otherwise.  (They don’t stand and watch you or question you but wait for you to volunteer your own information.  This, of course, is a great asset to the impostor.)  The test of freedom of this country.  Where else but in America could a guy like me operate?  On ability of famed impostor to circulate:  If they aren’t looking for you, they don’t see you.”

(111)  The con artist can employ something called “wishful identification.”  We don’t feel sorry for the character;  we want to be him.  He has attained precisely what we want.  And don’t we deserve that, too?  Now it’s our turn.  The more similar the characters in the story are to us, whether because of appearance or social position, the more likely we are to relate to them.  The more we like the confidence man, the more we relate to him.

(118)  It’s no coincidence that cons tend to thrive in the wake of disaster:  natural disaster, illness, economic disaster, national disaster, personal disaster.  The play is almost built into disaster zones from the start.  Emotions are already high.  There’s already a compelling story line.  Imagine the implications for the play:  create a sense of fear, and then the feeling of relief (not to worry!  there’s a solution!) and your mark is all but guaranteed to fall.

(121)  In one study, arousal alone was enough to get someone to agree with a request for help;  it little mattered what the content of the request might happen to be.  

What visceral states do is create an intense attentional focus.  We tune out everything else and tune in to the in-the-moment emotional cues.  It’s similar to the feeling of overwhelming hunger or thirst - or the need to go to the bathroom - when you suddenly find yourself unable to think about anything else.  In those moments, you’re less likely to deliberate, more likely to just say yes to something without fully internalizing it, and generally more prone to lapses that are outside the focus of your immediate attention.

…Cons, long and short both, thrive on in-the-moment arousal:  we have no time to repent.  The best play makes use of that tendency.  Con artists heat us up.  That is their living.  As one put it, “It is imperative that you work as quickly as possible.  Never give a hot mooch time to cool off.  You want to close him while he is still slobbering with greed.”

(133)  The first, alpha, was far more frequent:  increasing the appeal of something.  The second, omega, decreased resistance surrounding something.  In the one, you do what you can to make your proposition, whatever it may be, more attractive….

The put-up identified the mark and mapped out his idiosyncrasies, hopes, and fears.  The play caught the mark’s attention and baited the hook.  The rope makes sure he bites and the hook sinks deep - else, with a bit of wiggling, the almost-sure-deal prey swims hastily away.

(133-134)  Robert Cialdini…. argues that six principles govern most persuasive relationships:  reciprocity (I rub your back, you rub mine), consistency (I believe the same thing today as I did yesterday), social validation (doing this will make me belong), friendship or liking (exactly what it sounds like), scarcity (quick! there isn’t much to go around), and authority (you seem like you know what you’re talking about).  These are all alpha principles, used to increase persuasive appeal…

(136)  In 1966, Stanford University psychologists Jonathan Freedman and Scott Fraser observed an interesting phenomenon in their experiment:  someone who has already agreed to a small request - like opening the door for you - would become more, not less, likely to agree to a larger request later on.

(137)  As Cialdini points out, one of the elements that make us more vulnerable to persuasion is our desire to maintain a good image of ourselves.  If something is framed so as to make us feel like worthy people, we are much more likely to comply with it.  We want to behave in a way that’s consistent with the image we’ve created.

Consistency here plays a crucial role in the other direction too - not just in our evaluation of ourselves but in our evaluation of the person we’re helping:  if I’ve helped you before, you must be worth it.  Therefore, I’ll help you again.

(139)  But niceness isn’t the only way to go.  Another effective technique that Cialdini first identified in 1975 is the door-in-the-face, a near opposite of the foot-in-the-door.  When someone we don’t really know asks us for a large favor - or even someone we do know catches us on an off day - and we (understandably) refuse, we _do_ indeed feel rude, just as [Daryl]  Bem would have predicted.  But we don’t like feeling rude.  And so we also feel something else we don’t like:  guilty.  So what happens when the person we turned down asks us for something else, something smaller, something that seems far more reasonable in comparison?  We say yes.  Guilt assuaged - and con artist’s mission accomplished.

(142-143)  In 1986, Santa Clara University psychologist Jerry Burger proposed a persuasion - or roping, if you will - tactic that relied not on a comparison between two separate favors but on a comparison within the favor itself:  the that’s-not-all technique.  An effective approach, Burger found, is to start with a false baseline (that is, not at all what you’re planning to eventually propose) and then, in quick succession, make changes and additions to that starting point that make it seem increasingly attractive.  You make an initial bid - how would you like to get in on this land deal in Florida? - and before your mark can respond, you turn it into something else.  “That’s not all.  You also get a guaranteed return on your initial investment.”  People who were approached with a that’s-not-all story, Burger found, were more likely to buy into it than those who heard the great offer right away.  (The that’s-not-all-ing, incidentally, can continue for a while.  You need not stop at one.)

That’s-not-all is actually a member of a broader set of persuasive tactics, known as disrupt-then-reframe techniques.  First you disrupt someone’s understanding of an attempt to influence her, and then you reframe the attempt in a way that makes her more vulnerable to it.  Here’s how it works.  Harvard psychologist Daniel Gilbert proposes that we understand the world in two stages.  First we take it at face value, in order to decipher the sense of what someone is telling us.  And then we evaluate it, in order to judge the soundness of what we’ve just deciphered.  Disrupt-then-reframe attacks the evaluative part of the process:  we don’t have a chance to give a proper assessment because each time we try to do so, the situation changes.

(144)  A request for a tiny amount of money legitimizes you in the eyes of others.  If you were a swindler, you’d ask for a lot, wouldn’t you?

(145)  A closely related approach is Cialdini’s lowball technique.  This time, you tell your intended victim that what you want is actually quite small - and once he commits to doing it, raise the stakes.

(149)  In their influential 1959 work “The Bases of Social Power,” John French and Bertram Raven posited that there were five major bases from which power derives:  reward power, or the belief that someone is able to reward you;  coercive power, or the belief that someone is able to punish you somehow;  legitimate power, or an actual basis of authority;  referent power, or power derived from your affiliation with someone (or desire to be affiliated with them);  and expert power, from someone’s expertise on a topic.

(154)  When someone in power tells us to do something, we tend to do it.  The rope is often at its most effective when we trust the power of its source, the con man….

One of the first things a con artist does is establish trust - often by being the exact type of person he thinks you aspire to be, or at least, want to be associated with.

(155)  And there was something else crucial about Madoff:  he was part of the Jewish community.  A community he leveraged to its full extent.  As Michael Shermer put it, “It was an affinity scheme, it was insidery.  We have to take care of each other;  he’s one of us.”  Madoff is far from alone.  Con artists often use communities to quickly gauge the character and beliefs of targets…  The authority we grant someone comes often as more of an afterthought than anything else, by virtue of their belonging to the exact right group, one that we’re particularly eager to either join or be liked by.

(159)  Our desire to be accepted as a member of groups that appeal to us is, according to Cialdini, one of the strongest motivators in our being persuaded by something:  it is an important reason that the rope often works effectively.  We are more likely to go along with something if it has the stamp of approval of a group we trust or promises us entry into a group we’d like to belong to.

(160)  Take this example:  by the order in which someone presents us with options, she can reliably make those options look better or worse - even if we wouldn’t naturally think so.

(161)  … position effects - where something is located physically….  default effects - or what your choice is by default….  anchor effects - the initial cues you see that influence your subsequent decision...

(163)  Information priming works so well because it exploits an effect we’ve already seen several times:  the ease that comes from familiarity.  Mention something in passing, and then when you elaborate on it later - especially if it’s a few days later - it seems that much more convincing.  It’s a phenomenon known as the illusion of truth:  we are more likely to think something is true if it feels familiar.

(165)  Something else happens, too, when our minds feel bombarded from all sides.  In situations where we’re overtaxed, psychologist Katherine Milkman has found, we are more likely to make decisions that fit with what we _want_ to do rather than what we _should_ do.  The two are often in conflict, and even without outside help, it can be difficult to choose the path of the “should.”

(183)  Simply put, when it comes to ourselves - our traits, our lives, our decisions - our personal attachment overshadows our objective knowledge.  We systematically misevaluate evidence based on our own characteristics, and if we’re given evidence that something about us poses a threat, instead of thinking about how to change our own behavior, we call the evidence itself into question.  

(188)  Memory is a tricky thing, and once we’ve been taken once, it becomes all the more likely that we will fall for a con again.  There is no better mark, many a con artist will tell you, than one who has already been duped.

(195)  “The secret of rulership,” wrote George Orwell, “is to combine a belief in one’s own infallibility with the power to learn from past mistakes.”

(205)  Had it been up to Raines, he might have kept believing until the end.  It would have been a simpler, happier reality.  And that basic desire for a happier, simpler reality is at the center of the convincer’s success.

(235)  That’s the question at the heart of the breakdown, the moment when the con artist sees just how far he can take us.  In the put-up, he picked us out of the crowd with care.  In the play, he established a bond through some emotional wrangling and expert storytelling.  In the rope, he laid out his persuasive pitch for our already-willing ears.  In the tale, he’s told us how we will personally benefit, relying on our belief in our exceptionalism.  In the convincer, he’s let us win, persuading us that we’d been right in going along with him.  And now comes the breakdown.  We start to lose.  How far can the grifter push us before we balk?  How much of a beating can we take?  Things don’t completely fall apart yet - that would lose us entirely, and the game would end prematurely - but cracks begin to show.  We lose some money.  Something doesn’t go according to plan.  One fact seems to be off.  A figure is incorrectly labeled.  A wine bottle is “faulty.”  The crucial question:  do we notice, or do we double down?  High off the optimism of the convincer, certain that good fortune is ours, we often take the second route.  When we should be cutting our losses, we instead recommit - and that is entirely what the breakdown is meant to accomplish.

(237)  Changing your perception or your memory is easier than changing behavior.

(263)  The send is that part of the con where the victim is recommitted, that is, asked to invest increasingly greater time and resources into the con artist’s scheme - and in the touch, the con finally comes to its fruition and the mark is completely, irrevocably fleeced….  Once the send is in motion, with the mark recommitted to raising the stakes, the touch - the con’s end - is inevitable.  Once we are in, well and good, we are all in.

(366)  … once we’ve invested heavily in something, we no longer see it clearly, no matter the costs.

(367)  [Richard] Thaler termed the phenomenon the sunk-cost fallacy….

In theory, we should only care about new, incremental costs.  What we’ve already put into something shouldn’t matter:  it’s lost anyway, whatever “it” happens to be - time, money, energy, whatever else.  We should stick with it only if it still seems worthwhile in light of new evidence.

(271)  In psychology, that idea is called the endowment effect, first articulated by Thaler in 1980.  By virtue of being ours, our actions, thoughts, possessions, and beliefs acquire a glow they didn’t have before we committed to them.  Sunk costs make us loath to spot problems and reluctant to swerve from a committed path.  And the endowment effect imbues the status quo - what we’ve done - with an overly optimistic rosy glow.  It makes us want to hold on to it all the more.

(273)  The status quo bias only makes things worse.  We like things as they are.

(286)  In the blow-off, the confidence artist has one main goal:  now that the touch has been taken, get the mark out of the way as quickly as possible.  The last thing you want is for someone to complain and thus draw attention to the whole enterprise.  The blow-off is often the final step of the con, the grifter’s smooth disappearance after the game has played out.  Sometimes, though, the mark may not be so complacent.  If that happens, there’s always one more step that can be taken:  the fix, when a grifter puts off the involvement of law enforcement to prevent marks from making their complaints official.

(307)  “When people want to believe what they want to believe, they are very hard to dissuade.”  - David Sullivan

(310)  “We’re really adamant we have free will,” [Jennifer] Stalvey said.  “But so often, that’s simply not true.  Everyone has a weakness.  We want to connect to someone or something greater.”

(311)  Joshua Jelly-Shapiro:  “They [cults] are all founded on meaning, community:  what everyone wants.”

That’s why [David] Sullivan found cults to be a particularly enraging confidence game, more infuriating than most:  it was a co-optation of a very legitimate quest for meaning.  Everyone wants to believe, everyone wants meaning, everyone wants stories that make sense of incoherence.

(312) … the key to resisting persuasion and manipulation was to have a strong, even unshakeable, sense of self.  Know who you are no matter what, and hold on to that no matter what.  It isn’t easy - it was years before Sullivan was able to find a suitable female infiltrator;  Stalvey, he said, was an exception.  “It’s very rare to find someone to put into a cult.  You have to have a very strong sense of your own identity,” he said.  “And it’s not easy to do this.  The psychological techniques that are now employed to coerce you are phenomenal.”

When we spoke, Stalvey elaborated on the approach her mentor had taught her.  One of the most important things, she said, was to maintain objectivity:  logic to counteract feeling.  You know your emotions will be manipulated - they always are, in any con, big or small.  That’s the whole point of the put-up and the play.  And once you become emotional, your reasoning can easily become short-circuited.  “Always pay attention to the details,” she told me.  That is one way to ensure that you are staying rooted in the physical, the objective, rather than the psychological, the subjective…. “Through it all, you have to make sure you are observing as much as feeling.”

(313)  Know what people you’re likely to trust, what triggers are likely to catch you, whether positive or negative, and try to be aware enough of your own behavior that you won’t get swept up in it.  In short, hone your skills of observation and detail-noting, as Stalvey puts it, when it comes not just to others but to yourself.

Another key element in Stalvey and Sullivan’s arsenal:  set limits.  “I’d decide before I went in what my limits were, the lines I wouldn’t cross, physically or emotionally,” Stalvey said.  She made sure that trusted others knew those limits and were ready to step in if she was getting close to the edge.

(320-321)  Nobody joins a cult, Sullivan repeated often and emphatically.  People join something that will give them meaning.  “They join a group that’s going to promote peace and freedom throughout the world or that’s going to save animals, or they’re going to help orphans or something.  But nobody joins a cult.”  Nobody embraces false beliefs:  we embrace something we think is as true as it gets.  Nobody sets out to be conned:  we set out to become, in some way, better than we were before.

(327)  The Big Con by David Maurer

(328)  Hustlers and Con Men by Jay Robert Nash
