How to spot a scam: a psychologist’s advice
General information only. If you or someone else is in life‑threatening danger, call 000. For non‑emergency cybercrime, report at ReportCyber (ACSC) and Scamwatch. Support for identity and scam harm: IDCARE.
Why we fall for scams: the human factors
Scams succeed not because victims are foolish, but because offenders exploit predictable features of human decision‑making. Under time pressure, uncertainty, and emotion, we rely on fast, intuitive shortcuts (“System 1”) rather than slow, analytic reasoning (“System 2”) (Tversky and Kahneman, 1974; Kahneman, 2011). Offenders pair this with persuasive levers—authority, scarcity, social proof, reciprocity, liking, and commitment/consistency—to nudge compliant responses (Cialdini, 2001). Threat‑based messages that heighten fear and urgency further narrow attention; unless accompanied by clear, doable actions, fear appeals tip people into maladaptive responses (Witte, 1992).
Key cognitive traps include loss aversion (over‑weighting potential losses), anchoring (the first number sets the frame), and the sunk‑cost effect (continuing because of what we have already invested) (Tversky and Kahneman, 1974; Arkes and Blumer, 1985). Scammers sequence these tactics—for instance, a small, harmless ask (foot‑in‑the‑door) that later escalates into a large transfer (Freedman and Fraser, 1966).
The current landscape in Australia
The National Anti‑Scam Centre reports that scam losses remain substantial despite falling report volumes, with sustained offender activity across investment, phishing, and remote‑access categories (ACCC/NASC, 2025). The Australian Signals Directorate’s ACSC notes high rates of cyber incidents impacting individuals and small organisations and urges prompt reporting to strengthen national threat intelligence (ASD/ACSC, 2025). Australia has introduced a Scams Prevention Framework and strengthened SMS sender ID controls to disrupt impersonation scams, but vigilance at the individual level remains critical (ACCC/NASC, 2025).
The scammer’s playbook: signals to recognise early
Below are cross‑channel red flags distilled from psychological research and cyber‑security evidence.
1) Artificial urgency and fear
- Messages claiming your account will be closed or you’ll be arrested unless you act now.
- Requests to move conversations off the platform or to bypass normal processes.
These leverage fear appeals and time pressure to derail analytic reasoning (Witte, 1992; Workman, 2008).
2) Authority and legitimacy theatre
- Use of logos, uniforms, or “case numbers” with poor grammar or atypical phrasing.
- Caller ID or email display names that look right, but domains that are slightly off (e.g., @paypaI.com, where a capital “I” stands in for the lowercase “l”).
Perceived authority and surface cues increase compliance, especially under cognitive load (Cialdini, 2001; Tversky and Kahneman, 1974).
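The look‑alike domain trick can be made concrete in code. Below is a minimal, illustrative sketch: it maps a few visually confusable characters (a hand‑picked toy table, not the full Unicode confusables data that real detectors use) to a canonical form before comparing domains. The domain names are hypothetical examples.

```python
# Illustrative sketch: flag look-alike domains by reducing each domain to a
# "skeleton" in which visually confusable characters are canonicalised.
# CONFUSABLES is a tiny toy subset; production tools use the full Unicode
# confusables tables (UTS #39).

CONFUSABLES = str.maketrans({
    "I": "l",   # capital i resembles lowercase L in many sans-serif fonts
    "1": "l",
    "0": "o",
    "5": "s",
})

def skeleton(domain: str) -> str:
    """Canonical form of a domain for visual-similarity comparison."""
    return domain.translate(CONFUSABLES).lower()

def looks_like(suspect: str, trusted: str) -> bool:
    """True if suspect is not the trusted domain but visually mimics it."""
    return suspect != trusted and skeleton(suspect) == skeleton(trusted)

print(looks_like("paypaI.com", "paypal.com"))  # capital "I" for "l" -> True
print(looks_like("paypal.com", "paypal.com"))  # exact match -> False
```

The point of the sketch is the human lesson, not the tooling: a domain can pass a quick glance while failing an exact, character‑level comparison.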
3) Scarcity, prizes and “exclusive” offers
- Limited‑time investment windows, guaranteed returns, or “you’ve won” notifications that require a fee to release funds.
Scarcity and certainty promises are classic persuasion levers in investment frauds (Cialdini, 2001; ACCC/NASC, 2025).
4) Pre‑texting and social grooming
- A believable story that fits your context (e.g., “IT support” who knows your department; a “partner” who shares your niche interests).
- A small initial request (confirm a code; send ID for “verification”) followed by larger asks (Freedman and Fraser, 1966; Workman, 2008).
5) Romance‑style grooming
- Rapid intimacy, crisis narratives requiring money, and migration to private channels.
- Refusal to video‑chat or always “just missed” calls; high emotional oscillation (love‑bombing → silent treatment).
Victims describe profound psychological harm beyond financial loss (Whitty, 2016; Cross, 2016; Drew, 2024).
6) Deepfakes and synthetic identities
- Faces and voices that seem familiar or unusually trustworthy but are hard to verify. AI‑synthesised faces can be indistinguishable from real ones and rated more trustworthy (Nightingale and Farid, 2022).
- Audio calls with urgent payment instructions from a “boss” or “relative.”
7) Phishing across email, SMS and social apps
- Links that go to look‑alike sites; attachments with “invoice”, “statement”, or “delivery note”.
- URL shorteners masking destination; login pages without security indicators.
Training helps, but susceptibility varies with habits and context (Parsons et al., 2017; Wright and Marett, 2010; Naqvi et al., 2023).
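One reason look‑alike links succeed is that a familiar brand name can appear anywhere in a URL while the part that actually decides the destination is the rightmost registrable domain. The sketch below, using only the standard library and a hypothetical lure URL, shows how to extract the real host before trusting a link; a robust check would also respect label boundaries (note the leading dot) and public‑suffix rules.

```python
# Illustrative: the registrable domain at the RIGHT end of the hostname
# decides where a link really goes, not a familiar name embedded earlier.
from urllib.parse import urlparse

def real_host(url: str) -> str:
    """Return the hostname a browser would actually connect to."""
    return urlparse(url).hostname or ""

# Hypothetical lure: "paypal.com" appears, but only as subdomain labels.
lure = "https://secure.paypal.com.account-verify.example/login"
host = real_host(lure)
print(host)                             # secure.paypal.com.account-verify.example
# Checking with a leading dot avoids matching e.g. "notpaypal.com":
print(host.endswith(".paypal.com"))     # False: the real site is account-verify.example
```

This is the programmatic version of the checklist advice to hover on links and read the destination from the right.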
A five‑minute Scam Spot checklist
Use this short, repeatable process before you click, pay, or share sensitive information:
- Stop the clock. If it cannot wait five minutes, treat the pressure itself as a red flag; legitimate organisations can wait (Witte, 1992).
- Verify the channel. Use the official number or website you find yourself—never the contact in the message.
- Check the domain and destination. Hover on links; type addresses manually; avoid QR codes from unknown sources (Naqvi et al., 2023).
- Cross‑check with a second person. Scammers isolate; a quick second brain reduces errors (Wright and Marett, 2010).
- Refuse remote access and crypto instructions. Legitimate organisations will not demand remote control or crypto payments for “urgent tax/bills” (ACCC/NASC, 2025).
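The checklist above can be sketched as a simple "all steps or stop" gate. This is purely illustrative: the answers are human judgements, and the field names are invented for this sketch; the code only enforces that any unmet step, or any red flag, means stopping and verifying through an official channel.

```python
# Illustrative sketch: the five-minute Scam Spot checklist as a gate in which
# a single unmet step (or a single red flag) blocks the transaction.
from dataclasses import dataclass

@dataclass
class ScamSpotCheck:
    waited_five_minutes: bool             # stopped the clock
    verified_official_channel: bool       # used a number/website you found yourself
    checked_destination: bool             # hovered links / typed addresses manually
    second_person_consulted: bool         # a quick second brain
    wants_remote_access_or_crypto: bool   # red flag if True

    def safe_to_proceed(self) -> bool:
        """True only when every protective step is done and no red flag is set."""
        return (self.waited_five_minutes
                and self.verified_official_channel
                and self.checked_destination
                and self.second_person_consulted
                and not self.wants_remote_access_or_crypto)

# One unmet step is enough to stop:
rushed = ScamSpotCheck(False, True, True, True, False)
print(rushed.safe_to_proceed())  # False
```

Encoding the habit this way mirrors the psychology: a written, all‑or‑nothing rule removes the in‑the‑moment negotiation that offenders exploit.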
Psychological tools to resist manipulation
- Pre‑commit to a delay rule. Decide now that any unsolicited payment request waits 24 hours. This interrupts time‑pressure compliance and the sunk‑cost pull (Arkes and Blumer, 1985).
- Inoculate yourself. Brief exposure to common tactics with refutations builds mental “antibodies” against future manipulation (McGuire, 1961; van der Linden et al., 2017; Roozenbeek et al., 2020).
- Label emotions. Putting feelings into words reduces threat reactivity and supports better choices—e.g., “I feel anxious and rushed” (Lieberman et al., 2007).
- Use a written checklist. Simple lists reduce slips and impulsive responses, especially under fatigue or stress (Parsons et al., 2017; Naqvi et al., 2023).
If you think you have been scammed: what to do next in Australia
- Contact your bank immediately (ask for the scams team; request a recall or hold on the payment).
- Report at ReportCyber (ASD/ACSC) to contribute to national disruption efforts and receive tailored guidance (ASD/ACSC, 2025).
- Report to Scamwatch to assist broader disruption and warnings (ACCC/NASC, 2025).
- Get support from IDCARE, Australia’s national identity and cyber support service; they provide practical and behavioural help (IDCARE, 2025).
- Change passwords on all accounts that share the same credentials; enable multi‑factor authentication.
- Document everything (screenshots, dates, amounts, account identifiers).
- Consider a credit ban with major credit reporting bodies (Equifax, illion, Experian) and monitor for new‑account fraud.
- For threats or coercion, call 000. For non‑emergency identity concerns, OAIC has guidance on privacy breaches.
Helping someone you care about
Victims often experience shame, grief and isolation, which can inhibit reporting and help‑seeking (Whitty, 2016; Cross, 2016). Use non‑judgemental language, focus on safety and practical steps, and encourage specialist support (IDCARE; victim support services). Avoid blaming—offenders are skilled manipulators who exploit universal human tendencies.
For clinicians: brief intervention pathway (one to four sessions)
- Assess safety and exposure: current financial risk, ongoing contact, family coercion dynamics.
- Psychoeducation: explain cognitive mechanisms (heuristics, fear appeals, persuasion levers) and normalise reactions (Tversky and Kahneman, 1974; Witte, 1992; Cialdini, 2001).
- Skills: practise a 24‑hour delay rule, emotion labelling, and a two‑person verification habit.
- Repair shame: employ self‑compassion and reframe from “I was foolish” to “I was targeted by a professional social engineer” (Cross, 2016; Whitty, 2016).
- Relapse prevention: co‑create a personalised Scam Spot checklist; schedule a follow‑up call at the time the client expects renewed contact from offenders.
- Referral: coordinate with IDCARE for identity remediation; encourage formal reports for disruption and support.
Limitations and nuance
Most evidence on scam susceptibility is observational or experimental in simulated settings. Real‑world behaviour varies with context, culture and offender adaptation. Nevertheless, converging findings across psychology and cyber‑security point to the same protective pattern: slow down, verify independently, and use social support.
References
ACCC/NASC (2025) Targeting scams: report of the National Anti‑Scam Centre on scams data and activity 2024. Canberra: Australian Competition and Consumer Commission. Available at: Scamwatch.gov.au (Accessed 9 December 2025).
ASD/ACSC (2025) Annual Cyber Threat Report 2024–25. Canberra: Australian Signals Directorate, Australian Cyber Security Centre. Available at: cyber.gov.au (Accessed 9 December 2025).
Arkes, H.R. and Blumer, C. (1985) ‘The psychology of sunk cost’, Organizational Behavior and Human Decision Processes, 35(1), pp. 124–140.
Cialdini, R.B. (2001) Influence: Science and practice. 4th edn. Boston: Allyn & Bacon.
Cross, C. (2016) Improving responses to online fraud victims: An examination of reporting and support. Canberra: Australian Institute of Criminology.
Drew, J.M., Petnelis, R. and Birtchnell, T. (2024) ‘The victimology of online fraud: A focus on romance fraud’, Journal of Policy & Practice in Cybersecurity, 2(1), pp. 1–18.
Freedman, J.L. and Fraser, S.C. (1966) ‘Compliance without pressure: the foot‑in‑the‑door technique’, Journal of Personality and Social Psychology, 4(2), pp. 195–202.
IDCARE (2025) ‘Individual support services’. Available at: idcare.org (Accessed 9 December 2025).
Kahneman, D. (2011) Thinking, fast and slow. London: Penguin.
Lieberman, M.D., Eisenberger, N.I., Crockett, M.J., Tom, S.M., Pfeifer, J.H. and Way, B.M. (2007) ‘Putting feelings into words: affect labelling disrupts amygdala activity in response to affective stimuli’, Psychological Science, 18(5), pp. 421–428.
McGuire, W.J. (1961) ‘The effectiveness of supportive and refutational defenses in immunizing and restoring beliefs against persuasion’, Sociometry, 24(2), pp. 184–197.
Naqvi, B., Syed, W.S., Agrawal, A. and Rauf, S. (2023) ‘Mitigation strategies against phishing attacks: a systematic literature review’, Computers & Security, 128, 103123.
Nightingale, S.J. and Farid, H. (2022) ‘AI‑synthesised faces are indistinguishable from real faces and more trustworthy’, Proceedings of the National Academy of Sciences, 119(8), e2120481119.
Parsons, K., McCormac, A., Pattinson, M., Butavicius, M. and Jerram, C. (2017) ‘The Human Aspects of Information Security Questionnaire (HAIS‑Q): Two further validation studies’, Computers & Security, 66, pp. 40–51.
Roozenbeek, J., van der Linden, S. and Nygren, T. (2020) ‘Prebunking interventions based on inoculation theory can reduce susceptibility to misinformation’, Harvard Kennedy School Misinformation Review, 1(1), pp. 1–12.
Tversky, A. and Kahneman, D. (1974) ‘Judgment under uncertainty: heuristics and biases’, Science, 185(4157), pp. 1124–1131.
van der Linden, S., Leiserowitz, A., Rosenthal, S. and Maibach, E. (2017) ‘Inoculating the public against misinformation about climate change’, Global Challenges, 1(2), 1600008.
Whitty, M.T. (2016) ‘The online dating romance scam: The psychological impact on victims – both financial and non‑financial’, Criminology & Criminal Justice, 16(2), pp. 176–194.
Witte, K. (1992) ‘Putting the fear back into fear appeals: the Extended Parallel Process Model (EPPM)’, Communication Monographs, 59(4), pp. 329–349.
Workman, M. (2008) ‘Wisecrackers: a theory‑grounded investigation of phishing and pretext social engineering threats to information security’, Journal of the American Society for Information Science and Technology, 59(4), pp. 662–674.
Wright, R.T. and Marett, K. (2010) ‘The influence of experiential and dispositional factors in phishing: an empirical investigation of the deceived’, Journal of Management Information Systems, 27(1), pp. 273–303.
How to cite this article
Therapy Near Me (2025) ‘How to spot a scam: a psychologist’s advice’. Available at: TherapyNearMe.com.au (Accessed 9 December 2025).
If you have been impacted by a scam, support is available: IDCARE 1800 595 160; Lifeline 13 11 14 (24/7). For urgent threats, call 000.