Shane Littrell | Cognitive Scientist

Research

Misinformation / bullshit / conspiracy theories

Much of my current research focuses on identifying the cognitive and motivational factors that may underlie some individuals' greater propensity to fall for and/or spread misleading information. For example, in a recently published paper, my co-authors and I created the Bullshitting Frequency Scale (journal | open access), which measures the self-reported frequency with which individuals engage in bullshitting other people (when given the opportunity) across a range of social contexts (Littrell, Risko, & Fugelsang, 2021). In this paper, we defined and provided evidence for two types of bullshitting: persuasive bullshitting (i.e., bullshitting intended to impress or persuade others) and evasive bullshitting (i.e., bullshitting intended to avoid social harm for oneself or others). Across three studies, we demonstrated that both types of bullshitting are psychometrically distinguishable from lying and are differentially associated with several factors thought to play important roles in analytic thinking (e.g., cognitive ability, cognitive reflection, and open-minded cognition).

I have also begun to examine factors that may offer insight into why misleading information is more seductive to some individuals than to others, and in which contexts. For instance, my most recent publication in the British Journal of Social Psychology (journal | open access) explores whether individuals who are more likely to create and transmit misleading information are also more prone to being duped by various types of misinformation.

In essence, I attempted to find out whether one can “bullshit a bullshitter.” I found that individuals who report engaging in certain types of bullshitting more often are also more receptive to fake news, scientific bullshit, and pseudo-profound bullshit, even when variables associated with metacognition, cognitive reflection, and cognitive ability are taken into account. Further experimental evidence suggests that these bullshitters are unable to determine whether a statement merely “sounds profound” or truly “is profound.” These findings provide helpful insight into some of the ways that misinformation can spread, and I was recently invited onto The Guardian’s Science Weekly podcast to discuss them.

Finally, I am currently working on several projects examining the cognitive, psychological, and ideological factors associated with other types of epistemically suspect and unwarranted beliefs, such as belief in conspiracy theories. For example, a paper my colleagues and I recently submitted for review examines factors associated with the spread of extremist, anti-LGBTQ partisan rhetoric and conspiracy theories by public figures, as well as their impact on voter beliefs and attitudes. Other papers we are preparing for review focus on topics such as the political and psychological characteristics associated with intentionally spreading misinformation online, the associations between left/right political orientation and attitudes toward violence, how partisan conspiracy beliefs prevent vaccination, and the factors associated with trading cryptocurrency, to name a few.


Analytic thinking, overconfidence, and personality

Another interest of mine is examining the ways in which dual-process accounts of analytic thinking and reasoning may relate to dispositional attributes, such as excessive overconfidence (e.g., trait narcissism) and impulsivity. For example, in a paper published in Thinking and Reasoning (journal | open access), my co-authors and I reported a series of studies examining the extent to which dispositional overconfidence (i.e., trait narcissism) and impulsivity were related to a broad range of factors critical for analytic thinking. Overall, we found that higher levels of grandiose (but not vulnerable) narcissism are consistently negatively associated with cognitive reflection, need for cognition, metacognitive insight, and intelligence, and positively associated with intellectual overconfidence and a reliance on intuition rather than critical thinking processes.

We followed up this work with an investigation published in Personality and Individual Differences examining how different measures of impulsiveness are related to engagement in reflective thinking processes.


Future projects

I’m currently following up this work with a series of studies examining which metacognitive and analytic processes are most important for successfully detecting misleading information, as well as how errors in these processes can influence reasoning and decision-making. In addition to this theoretical and empirical work, I am also interested in pursuing multidisciplinary collaborative projects that apply misinformation and bullshit research to political, organizational, and educational contexts.

One such project seeks to better understand the effects of perceived source credibility on bullshit receptivity, particularly the extent to which people can be debiased against misleading information when it comes from sources they perceive to be credible experts in relevant fields. My co-authors and I have recently completed this project and submitted a manuscript for review. For anyone interested, I have also posted an open-access preprint on PsyArXiv, which can be found here: https://psyarxiv.com/4bstf/

Another recent project I was involved with investigated perceptions of pseudoscientific bullshit within the field of sports science and coaching, including an examination of the extent to which researchers and practitioners in that field are susceptible to falling for it. An open-access preprint of the manuscript associated with this project can be found here: https://psyarxiv.com/7t2my/