Why do politicians lie so much?—a psychologist’s guide to incentives, cognition, and institutions

Overview

If it sometimes feels as though political life is saturated with half‑truths and strategic vagueness, you are not imagining it. But the full story is more nuanced than “politicians lie more than everyone else.” Human beings commonly shade the truth in everyday life (DePaulo et al., 1996), most people are poor lie‑detectors (Bond and DePaulo, 2006), and modern information environments reward novelty and speed over correction (Vosoughi, Roy and Aral, 2018). On top of that, electoral incentives, strategic communication, and voter psychology interact in ways that can make deception or ambiguity seem instrumentally rational even to otherwise conscientious leaders (Mayhew, 1974; Crawford and Sobel, 1982; Downs, 1957; Kunda, 1990).

This article synthesises evidence from political science, psychology, behavioural economics, and communication research to explain why deception occurs in politics, how it differs from adjacent phenomena like “spin” and “strategic ambiguity,” and what practically reduces it. Throughout, we use in‑text Harvard‑style citations and finish with a full reference list for further reading.

First principles: what counts as a “lie” in politics?

A lie is a knowingly false statement intended to mislead. Political communication also includes other grey‑zone practices:

  • Paltering: using literally truthful statements to create a misleading impression (Rogers et al., 2017).
  • Lying by omission: leaving out critical qualifiers or context.
  • Strategic ambiguity: remaining purposefully vague to keep coalitions broad or avoid alienating segments of the electorate (Crawford and Sobel, 1982; Bach, 2025).

These strategies are not equivalent in ethics or effect, but they share the same outcome for citizens: reduced clarity when evaluating policy and performance.

Do politicians lie more than other people?

The empirical answer is mixed. Diary and survey research shows that deception is common in everyday life for non‑politicians (DePaulo et al., 1996). Meta‑analysis suggests that ordinary people detect lies at barely‑above‑chance levels (Bond and DePaulo, 2006). Meanwhile, large comparative studies of election pledge fulfilment report that governing parties actually deliver a substantial share of their promises, especially in single‑party executives (Thomson et al., 2017). In other words, politicians operate in a system that both incentivises obfuscation and constrains it through institutions, media scrutiny, and later accountability.

Incentives that reward deception or ambiguity

1) The electoral connection

Politicians are “single‑minded seekers of re‑election” who invest in advertising, credit‑claiming, and position‑taking to maximise electoral returns (Mayhew, 1974). When the true distributional effects of policies are complex or unpopular, leaders may resort to framing, selective disclosure, or ambiguity to hold together fragile coalitions.

2) Cheap talk in multi‑audience settings

Cheap‑talk models show that when communicators and audiences have misaligned preferences, truthful, fully‑revealing communication is hard to sustain; equilibrium often involves partial revelation, coded language, or vagueness (Crawford and Sobel, 1982). Campaigns face multiple audiences—donors, base voters, swing voters—so incentives for strategic ambiguity multiply.

3) Rational ignorance and limited attention

From the citizen side, rational ignorance predicts that most people won’t invest heavily in political information because a single vote rarely changes outcomes (Downs, 1957). That creates space for over‑simplification, slogans, and sometimes deception to flourish in the gaps.

The psychology of believing (and sharing) falsehoods

Motivated reasoning

People tend to process information in ways that protect identity and prior commitments (Kunda, 1990). Misperceptions can be stubborn when facts threaten group loyalties (Nyhan and Reifler, 2010).

Corrections help—but not always as we expect

Large‑sample replications suggest that dramatic “backfire effects” are uncommon; corrections usually move beliefs toward the facts (Wood and Porter, 2019; Nyhan, 2021). Still, corrections can be fragile, especially where identity is at stake.

Repetition and the illusory truth effect

Repeated headlines feel truer, even when we know they’re false (Fazio et al., 2019; Fazio, 2020). That is a design challenge for the modern attention economy.

Platform dynamics

On social media, false news spreads faster and further than true news, particularly in politics, largely because humans preferentially share novel, surprising content (Vosoughi, Roy and Aral, 2018).

“Spin,” paltering, and ambiguity: how the truth gets bent without outright lying

  • Spin reframes facts to favour an interpretation without changing the underlying data.
  • Paltering leverages true statements to mislead; senders often judge it as more ethical than lying even though receivers view it as equally dishonest when discovered (Rogers et al., 2017).
  • Strategic ambiguity preserves room to manoeuvre and sustain disparate coalitions but reduces accountability (Bach, 2025; Crawford and Sobel, 1982).

Why voters rarely punish deception decisively

  1. Information asymmetries: Citizens cannot easily verify complex claims in real time and are poor lie‑detectors (Bond and DePaulo, 2006).
  2. Issue bundling: People vote on packages of positions and identities; one detected falsehood may be traded off against preferred stances elsewhere (Achen and Bartels, 2016).
  3. Cue‑taking: Many citizens use party and elite cues as efficient “traffic signals,” which can work well but also amplify elite misinformation (Lupia and McCubbins, 1998).

What actually reduces political deception?

1) Strong transparency and oversight institutions

Randomised field evidence shows that audits can reduce corrupt behaviour (a cousin of political deception) by meaningful margins (Olken, 2007). At a system level, freedom of information regimes and proactive disclosure are associated with higher transparency and lower perceived corruption (Vadlamannati, Madsen and de Soysa, 2017).

2) Independent, high‑quality fact‑checking

Comparative studies report high agreement rates across reputable fact‑checkers on the same claims (Lee, 2023). While corrections do not reach everyone, neutral, well‑sourced fact‑checks reliably improve accuracy on average (Wood and Porter, 2019; Nyhan, 2021).

3) Better design of information environments

Light‑touch “accuracy prompts” and friction that slows impulsive sharing can reduce the spread of misleading content (Pennycook and Rand, 2019). Platform architectures that down‑rank repeat offenders and reward source credibility are promising public‑health measures for the information ecosystem (Vosoughi, Roy and Aral, 2018).

4) Smarter citizen habits

The Debunking Handbook 2020 recommends: lead with the fact, warn before the myth, explain the fallacy, and avoid accidentally reinforcing the myth through repetition (Lewandowsky et al., 2020). Individually, adopting slow‑thinking checks—What is the source? Is there independent corroboration?—reduces our own contribution to the problem (Pennycook and Rand, 2019).

Practical checklist for readers during campaigns and policy debates

  1. Define the claim: Is it verifiable, value‑based, or a prediction?
  2. Distinguish lie vs paltering: Ask what crucial context is missing.
  3. Look for primary sources: Budget papers, audit reports, and independent inquiries beat press events.
  4. Consult multiple fact‑checkers: Agreement across outlets is a good signal (Lee, 2023).
  5. Beware “too neat” narratives: If a claim is unusually tidy or outrage‑provoking, slow down (Fazio, 2020; Vosoughi, Roy and Aral, 2018).

Bottom line

Politics does not have a monopoly on dishonesty; the human propensity to spin, omit, and rationalise is widespread. What is distinctive about politics is the structure of incentives and the information environment: misaligned preferences, multi‑audience signalling, identity‑laden issues, and platforms that reward novelty all tilt the field toward distortion. Yet deception is not destiny. Robust transparency mechanisms, independent fact‑checking, smarter platform design, and evidence‑based media habits can meaningfully improve truthfulness in public life (Olken, 2007; Lewandowsky et al., 2020; Pennycook and Rand, 2019; Thomson et al., 2017).


References

Achen, C.H. and Bartels, L.M. (2016) Democracy for Realists: Why Elections Do Not Produce Responsive Government. Princeton: Princeton University Press.

Bach, P. (2025) ‘Let me be perfectly unclear: strategic ambiguity in political rhetoric’, Communication Theory, 35(2), pp. 96–118.

Bond, C.F. and DePaulo, B.M. (2006) ‘Accuracy of deception judgements’, Personality and Social Psychology Review, 10(3), pp. 214–234.

Crawford, V.P. and Sobel, J. (1982) ‘Strategic information transmission’, Econometrica, 50(6), pp. 1431–1451.

DePaulo, B.M., Kashy, D.A., Kirkendol, S.E., Wyer, M.M. and Epstein, J.A. (1996) ‘Lying in everyday life’, Journal of Personality and Social Psychology, 70(5), pp. 979–995.

Downs, A. (1957) An Economic Theory of Democracy. New York: Harper and Row.

Fazio, L.K., Rand, D.G. and Pennycook, G. (2019) ‘Repetition increases perceived truth across levels of plausibility’, Psychonomic Bulletin & Review, 26(5), pp. 1705–1713.

Fazio, L.K. (2020) ‘Repetition increases perceived truth even for known falsehoods’, Collabra: Psychology, 6(1), 38.

Lee, S. (2023) ‘“Fact‑checking” fact checkers: a data‑driven approach’, Harvard Kennedy School Misinformation Review, 4(1), pp. 1–16.

Lewandowsky, S., Ecker, U.K.H., Cook, J., Albarracín, D., Amazeen, M.A., Kendeou, P., Lombardi, D. et al. (2020) The Debunking Handbook 2020. Fairfax, VA: George Mason University Center for Climate Change Communication.

Lupia, A. and McCubbins, M.D. (1998) The Democratic Dilemma: Can Citizens Learn What They Need to Know? Cambridge: Cambridge University Press.

Mayhew, D.R. (1974) Congress: The Electoral Connection. New Haven: Yale University Press.

Nyhan, B. (2021) ‘Why the backfire effect does not explain the durability of political misperceptions’, Proceedings of the National Academy of Sciences, 118(15), e1912440117.

Nyhan, B. and Reifler, J. (2010) ‘When corrections fail: the persistence of political misperceptions’, Political Behavior, 32(2), pp. 303–330.

Olken, B.A. (2007) ‘Monitoring corruption: evidence from a field experiment in Indonesia’, Journal of Political Economy, 115(2), pp. 200–249.

Pennycook, G. and Rand, D.G. (2019) ‘Fighting misinformation on social media using accuracy prompts’, Proceedings of the National Academy of Sciences, 116(16), pp. 7662–7668.

Rogers, T., Zeckhauser, R., Gino, F., Norton, M.I. and Schweitzer, M.E. (2017) ‘Artful paltering: the risks and rewards of using truthful statements to mislead others’, Journal of Personality and Social Psychology, 112(3), pp. 456–473.

Thomson, R., Royed, T., Naurin, E., Artés, J., Costello, R., Ennser‑Jedenastik, L., Ferguson, M., Kostadinova, P., Moury, C., Pétry, F. and Praprotnik, K. (2017) ‘The fulfilment of parties’ election pledges: a comparative study on the impact of power sharing’, American Journal of Political Science, 61(3), pp. 527–542.

Vosoughi, S., Roy, D. and Aral, S. (2018) ‘The spread of true and false news online’, Science, 359(6380), pp. 1146–1151.

Wood, T. and Porter, E. (2019) ‘The elusive backfire effect: mass attitudes’ steadfast factual adherence’, Political Behavior, 41(1), pp. 135–163.

How to cite this article

Therapy Near Me (2025) ‘Why do politicians lie so much?—a psychologist’s guide to incentives, cognition, and institutions’. Available at: https://TherapyNearMe.com.au
