The media today is full of people who have lived a lie.
There’s Elizabeth Holmes, the biotech entrepreneur, who in 2015 was declared the youngest and richest self-made female billionaire. She now faces up to 20 years in prison for fraud. Then there’s Anna Sorokin – aka Anna Delvey – who pretended to be a German heiress and subsequently fleeced New York’s high society of hundreds of thousands of dollars. And Shimon Hayut, aka Simon Leviev – the so-called Tinder Swindler.
What marks all of these people out is not just the lies they told others, but the lies they must have told themselves. They each believed their actions were somehow justifiable and – against all odds – that they would never be found out. Time and again, they seemed to deny reality – and dragged others into their scams.
You might hope that this kind of behaviour is relatively rare, restricted to a few extreme situations. But self-deception is incredibly common, and it may have evolved to bring us some personal benefits. We lie to ourselves to protect our self-image, which allows us to act immorally while maintaining a clear conscience. According to the latest research, self-deception may even have evolved to help us persuade others: if we start believing our own lies, it’s much easier to get other people to believe them, too.
This research might explain questionable behaviour in many areas of life – far beyond the headline-grabbing scams in recent years. By understanding the different factors contributing to self-deception, we can try to spot when it might be swaying our own decisions, and prevent these delusions from leading us astray.
Safeguarding the ego
Any psychologist will tell you that studying self-deception scientifically is a headache. You can’t simply ask someone if they are fooling themselves, since it happens below conscious awareness. As a result, the experiments are often highly intricate.
Let’s begin with the research of Zoë Chance, an associate professor of marketing at Yale University. In an ingenious experiment from 2011, she showed that many people unconsciously employ self-deception to boost their egos.
One group of participants were asked to take an IQ test, with a list of the answers printed at the bottom of the page. As you might expect, these people performed considerably better than a control group who did not have the answer key. They did not seem to recognise how much they had relied on the ‘cheat sheet’, however – since they predicted that they would do equally well on a second test featuring another hundred questions, without the answer key. Somehow, they had fooled themselves into thinking that they had known the solutions to the problems without needing the helping hand.
To be sure of this conclusion, Chance repeated the whole experiment with a new set of participants. This time, however, the participants were given a financial reward for accurately predicting their results in the second test; overconfidence would come with a penalty. If the participants were conscious of their behaviour, you might expect this incentive to reduce their overconfidence.
In reality, it did little to puncture the participants’ inflated self-belief; they still fooled themselves into thinking they were smarter than they were, even when they knew that they would lose money. This suggests that the beliefs were genuine and deeply held – and surprisingly robust.
It’s not hard to see how this might apply in real life. A scientist may believe their results are genuine despite having relied on fraudulent data; a student may believe they earned their place at a prestigious university, despite having cheated on a test.
Despite knowing they’d had help, experiment participants convinced themselves they were smarter than they were (Credit: Getty)
Moral sincerity
The use of self-deception to enhance self-image has now been observed in many other contexts.
For instance, Uri Gneezy, a professor of economics at the University of California, San Diego, has recently shown it can help us to justify potential conflicts of interest in our work.
In a 2020 study, Gneezy asked participants to take on the roles of investment advisors or clients. The advisors were given two different investment opportunities to consider – each of which came with different risks and different payoffs. They were also told that they would receive a commission if the client opted for one particular investment of the two.
In one set of trials, the advisors were told about this potential reward at the very start of the experiment, before they started considering the different options. While they were ostensibly picking the best choice for the client, they were much more likely to go with the choice that was favourable to themselves.
In the rest of the trials, however, the advisors were only told of this potential reward after they had been given some time to weigh up the pros and cons of each option. This time, few let the reward influence their decision; they stayed true to their goal of giving the best advice to the client.
To Gneezy, the fact that knowledge of the personal benefit influenced the participants’ decisions only in the first scenario suggests that their self-deception was unconscious: it changed the way they weighed up the benefits and risks, without them being aware of the bias, so that they could still feel they were acting in the client’s interest. In the second scenario, switching recommendations would have required a complete change of mind, which would have been harder to justify to themselves. “They just couldn’t convince themselves that they would be acting ethically,” he says.
[Self-deception] means that we can continue to see ourselves as good people – Uri Gneezy
In this way, self-deception is a way of protecting our sense of morality, says Gneezy. “It means that we can continue to see ourselves as good people,” he says – even when our actions would suggest otherwise.
This form of self-deception might be most obviously relevant to financial advisors, but Gneezy thinks it could also be important for private healthcare. Despite having good intentions, a doctor could unconsciously deceive themselves into thinking the more expensive treatment was best for the patient – without even recognising their self-deception, he says.
Persuading ourselves, persuading others
Perhaps the most surprising consequence of self-deception concerns our conversations with others.
According to one theory, self-deception allows us to be more confident in what we are saying, which makes us more persuasive. If you are trying to sell a dodgy product, for instance, you will make a better case if you genuinely believe it is a high-quality bargain – even if there is evidence to suggest otherwise.
This hypothesis was first proposed decades ago, and a recent paper by Peter Schwardmann, an assistant professor of behavioural economics at Carnegie Mellon University, US, provides some strong evidence for this idea.
Like Chance’s study, Schwardmann’s first experiments began with an IQ test. The participants weren’t given the results, but after the test was finished, they had to privately rate how well they thought they’d done. They then took a test of persuasion: they had to stand before a jury of mock employers and convince the panel of their intellectual prowess – with a potential 15 euro ($16, £12.80) reward if the judges believed that they were among the smartest in the group.
Some people were told about the persuasion task before they rated their confidence in their performance, while others were told afterwards. In line with the hypothesis, Schwardmann found that this changed their self-ratings: knowing in advance that they would have to convince others resulted in greater overconfidence, compared with those who had not yet been told. The need to persuade others had primed them to think that they were smarter than they really were.
He describes this as a kind of “reflex”. Importantly, Schwardmann’s experiments showed that the self-deception paid off; unfounded overconfidence did indeed increase people’s ability to persuade the mock employers.
The need to argue a point makes us think we’re smarter than we are, research shows (Credit: Getty)
Picking sides
Schwardmann has now observed a similar process in debating tournaments. At these events, the participants are given a topic and then randomly assigned a point of view to argue – before being given 15 minutes to prepare their arguments. During the debate, they are then judged on how well they present their case.
Schwardmann tested the participants’ personal beliefs about the topics before they had been assigned their position, after they had started formulating their arguments, and after the debate itself. In line with the idea that self-deception evolved to help us persuade others, he found that people’s personal opinions substantially changed after they had been told which side of the debate they would need to argue. “Their private beliefs moved towards the side that they’d been given just 15 minutes beforehand – to align with their persuasion goals,” says Schwardmann.
After the debate, the participants were also given the chance to allocate small sums of money to charity – selected from a long list of potential organisations. Schwardmann found they were much more willing to choose organisations that aligned with the position of their argument – even though that position had been assigned at random.
Many of our opinions may have been formed in this way. In politics, a campaigner who is asked to canvass on a particular issue may come to persuade themselves that their side of the argument is the only way of seeing it – not because they have carefully appraised the facts, but simply because they were asked to make the case. Indeed, Schwardmann suspects this process may lie behind much of the political polarisation we see today.
Delusions of grandeur
In all these ways, our brains can fool us into believing things that are not true. Self-deception allows us to inflate our opinion of our own abilities, so that we believe we are smarter than we really are. It means that we overlook the repercussions of our actions for other people, so that we believe we are generally acting in a moral way. And by deceiving ourselves about the veracity of our beliefs, we show greater conviction in our opinions – which can, in turn, help us to persuade others.
We can’t ever know what was truly going through the minds of Holmes, Sorokin, Hayut and other fraudsters – but it is easy to speculate about how some of these mechanisms may have been at play. At the very least, these con artists seem to have had abnormally high opinions of their own abilities and of their right to get what they wanted – and they happily shrugged off the ethical implications of what they were doing.
Holmes, in particular, seems to have believed in her product, and attempted to justify her use of misleading data. Despite all evidence to the contrary, she still declared at her trial that “the big medical device companies like Siemens could easily reproduce what we had done”. Hayut, meanwhile, still claims he is “the biggest gentleman”, who had done nothing wrong.
Schwardmann agrees it may be possible for some fraudsters to inhabit incredibly elaborate lies. He points out that some even show a kind of righteous anger when they are being questioned, which might be hard to fake. “Maybe that’s a sign that they really buy into their own lie,” he says.
Tellingly, a desire for social status seems to increase people’s tendency for self-deception. When people feel threatened by others, for example, they are more likely to inflate their perceptions of their own abilities. It may be that the bigger the stakes, the greater the lies we are able to tell ourselves.
Most of the time, our self-deception may be benign – allowing us to feel just a bit more confident in ourselves than is justified. But it’s always worth being aware of these tendencies – especially if we’re making potentially life-changing decisions. You don’t want to deceive yourself about the risks of cutting corners in your current job, or the likelihood of success from an adventurous career move, for example.
One good way of puncturing all kinds of bias is to “consider the opposite” of your conclusions. The technique is as straightforward as it sounds: you try to find all the reasons that your belief may be wrong, as if you were interrogating yourself. Multiple studies have shown that this leads us to think more analytically about a situation. In laboratory tests, this systematic reasoning proves to be much more effective than simply telling people to “think rationally”.
This is only possible if you can accept your flaws, of course. The first step is acknowledging the problem. Perhaps you think that you don’t need this advice; self-deception only afflicts others, while you are perfectly honest with yourself. If so, that may be your greatest delusion of all.