The Internet has evolved into a ubiquitous and indispensable digital environment in which people communicate, seek information, and make decisions. Despite offering various benefits, online environments are also replete with smart, highly adaptive choice architectures designed primarily to maximize commercial interests, capture and sustain users’ attention, monetize user data, and predict and influence future behavior. This online landscape holds multiple negative consequences for society, such as a decline in human autonomy, rising incivility in online conversation, the facilitation of political extremism, and the spread of disinformation. Benevolent choice architects working with regulators may curb the worst excesses of manipulative choice architectures, yet the strategic advantages, resources, and data remain with commercial players. One way to address some of this imbalance is with interventions that empower Internet users to gain some control over their digital environments, in part by boosting their information literacy and their cognitive resistance to manipulation. Our goal is to present a conceptual map of interventions that are based on insights from psychological science. We begin by systematically outlining how online and offline environments differ despite being increasingly inextricable. We then identify four major types of challenges that users encounter in online environments: persuasive and manipulative choice architectures, AI-assisted information architectures, false and misleading information, and distracting environments. Next, we turn to how psychological science can inform interventions to counteract these challenges of the digital world.
After distinguishing among three types of behavioral and cognitive interventions—nudges, technocognition, and boosts—we focus on boosts, of which we identify two main groups: (a) those aimed at enhancing people’s agency in their digital environments (e.g., self-nudging, deliberate ignorance) and (b) those aimed at boosting competencies of reasoning and resilience to manipulation (e.g., simple decision aids, inoculation). These cognitive tools are designed to foster the civility of online discourse and protect reason and human autonomy against manipulative choice architectures, attention-grabbing techniques, and the spread of false information.
There is broad agreement that psychology is facing a replication crisis. Even some seemingly well-established findings have failed to replicate. Numerous causes of the crisis have been identified, such as underpowered studies, publication bias, imprecise theories, and inadequate statistical procedures. The replication crisis is real, but it is less clear how it should be resolved. Here we examine potential solutions by modeling a scientific community under various replication regimes. In one regime, all findings are replicated before publication to guard against subsequent replication failures. In an alternative regime, individual studies are published and are replicated after publication, but only if they attract the community's interest. We find that the publication of potentially non-replicable studies minimizes cost and maximizes efficiency of knowledge gain for the scientific community under a variety of assumptions. Our findings suggest that low replicability, provided it is properly managed, can support robust and efficient science.
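The two replication regimes described above can be sketched as a toy Monte Carlo simulation. This is a minimal illustration, not the paper's actual model: the parameter values (base rate of true hypotheses, power, false-positive rate, the fraction of published findings that attract a replication) are hypothetical placeholders.

```python
import random

def simulate(regime, n_studies=10_000, base_rate=0.1, power=0.8,
             alpha=0.05, interest=0.2, seed=0):
    """Toy model of a community running `n_studies` original studies.

    regime == "pre":  every positive finding is replicated before publication.
    regime == "post": positives are published immediately; only the
                      `interest` fraction attract a post-publication replication.
    Returns (total_cost, confirmed_true_findings); each study costs 1 unit.
    """
    rng = random.Random(seed)
    cost = confirmed = 0
    for _ in range(n_studies):
        is_true = rng.random() < base_rate                    # hypothesis truly holds?
        cost += 1                                             # original study
        positive = rng.random() < (power if is_true else alpha)
        if not positive:
            continue
        if regime == "pre" or rng.random() < interest:
            cost += 1                                         # replication study
            replicated = rng.random() < (power if is_true else alpha)
            if is_true and replicated:
                confirmed += 1
    return cost, confirmed

cost_pre, _ = simulate("pre")
cost_post, _ = simulate("post")
print("pre-publication regime cost:", cost_pre, "| post-publication regime cost:", cost_post)
```

Under these placeholder parameters, the pre-publication regime pays for a replication of every positive result, whereas the post-publication regime replicates only the findings the community cares about; the paper's fuller analysis weighs that saving against the knowledge the community gains.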
Belief polarization is said to occur when two people respond to the same evidence by updating their beliefs in opposite directions. This response is considered to be “irrational” because it involves contrary updating, a form of belief updating that appears to violate normatively optimal responding, as dictated, for example, by Bayes' theorem. In light of much evidence that people are capable of normatively optimal behavior, belief polarization presents a puzzling exception. We show that Bayesian networks, or Bayes nets, can simulate rational belief updating. When fit to experimental data, Bayes nets can help identify the factors that contribute to polarization. We present a study of belief updating concerning the reality of climate change in response to information about the scientific consensus on anthropogenic global warming (AGW). The study used representative samples of Australian and U.S. participants. Among Australians, consensus information partially neutralized the influence of worldview, with free‐market supporters showing a greater increase in acceptance of human‐caused global warming relative to free‐market opponents. In contrast, while consensus information overall had a positive effect on perceived consensus among U.S. participants, there was a reduction in perceived consensus and acceptance of human‐caused global warming for strong supporters of unregulated free markets. Fitting a Bayes net model to the data indicated that under a Bayesian framework, free‐market support is a significant driver of beliefs about climate change and trust in climate scientists. Further, active distrust of climate scientists among a small number of U.S. conservatives drives contrary updating in response to consensus information among this particular group.
Social media has arguably shifted political agenda-setting power away from mainstream media and onto politicians. Current U.S. President Trump's reliance on Twitter is unprecedented, but the underlying implications for agenda setting are poorly understood. Using the president as a case study, we present evidence suggesting that President Trump's use of Twitter diverts crucial media (The New York Times and ABC News) from topics that are potentially harmful to him. We find that increased media coverage of the Mueller investigation is immediately followed by Trump tweeting increasingly about unrelated issues. This increased activity, in turn, is followed by a reduction in coverage of the Mueller investigation, a finding that is consistent with the hypothesis that President Trump's tweets may also successfully divert the media from topics that he considers threatening. The pattern is absent in placebo analyses involving Brexit coverage and several other topics that do not present a political risk to the president. Our results are robust to the inclusion of numerous control variables and to examination of several alternative explanations, although the generality of the successful diversion must be established by further investigation.
Misinformation can undermine a well-functioning democracy. For example, public misconceptions about climate change can lead to lowered acceptance of the reality of climate change and lowered support for mitigation policies. This study experimentally explored the impact of misinformation about climate change and tested several pre-emptive interventions designed to reduce the influence of misinformation. We found that false-balance media coverage (giving contrarian views equal voice with climate scientists) lowered perceived consensus overall, although the effect was greater among free-market supporters. Likewise, misinformation that confuses people about the level of scientific agreement regarding anthropogenic global warming (AGW) had a polarizing effect, with free-market supporters reducing their acceptance of AGW and those with low free-market support increasing their acceptance of AGW. However, we found that inoculating messages that (1) explain the flawed argumentation technique used in the misinformation or that (2) highlight the scientific consensus on climate change were effective in neutralizing those adverse effects of misinformation. We recommend that climate communication messages should take into account ways in which scientific content can be distorted, and include pre-emptive inoculation messages.
We investigated whether people's response to the official recommendations during the COVID-19 pandemic is associated with conspiracy beliefs related to COVID-19, distrust in the sources providing information on COVID-19, and endorsement of complementary and alternative medicine (CAM).
The sample consisted of 1325 Finnish adults who filled out an online survey marketed on Facebook. Structural regression analysis was used to investigate whether: 1) conspiracy beliefs, a distrust in information sources, and endorsement of CAM predict people's response to the non-pharmaceutical interventions (NPIs) implemented by the government during the COVID-19 pandemic, and 2) conspiracy beliefs, a distrust in information sources, and endorsement of CAM are related to people's willingness to take a COVID-19 vaccine.
Individuals with more conspiracy beliefs and lower trust in information sources were less likely to respond positively to the NPIs. Individuals with less trust in information sources and greater endorsement of CAM were less willing to take a COVID-19 vaccine. Distrust in information sources was the strongest and most consistent predictor in all models. Our analyses also revealed that some of the people who responded negatively to the NPIs were also less likely to take the vaccine, an association partly explained by lower trust in information sources.
Distrusting the establishment to provide accurate information, believing in conspiracy theories, and endorsing treatments and substances outside conventional medicine are all associated with a more negative response to the official guidelines during COVID-19. How people respond to the guidelines, however, is more strongly and consistently related to the degree of trust they place in the information sources than to their tendency to hold conspiracy beliefs or endorse CAM. These findings highlight the need for governments and health authorities to create communication strategies that build public trust.
Kozyreva, Anastasia; Wineburg, Sam; Lewandowsky, Stephan. Critical Ignoring as a Core Competence for Digital Citizens. Current Directions in Psychological Science: A Journal of the American Psychological Society, 02/2023, Volume 32, Issue 1. Journal article, peer reviewed, open access.
Low-quality and misleading information online can hijack people’s attention, often by evoking curiosity, outrage, or anger. Resisting certain types of information and actors online requires people to adopt new mental habits that help them avoid being tempted by attention-grabbing and potentially harmful content. We argue that digital information literacy must include the competence of critical ignoring—choosing what to ignore and where to invest one’s limited attentional capacities. We review three types of cognitive strategies for implementing critical ignoring: self-nudging, in which one ignores temptations by removing them from one’s digital environments; lateral reading, in which one vets information by leaving the source and verifying its credibility elsewhere online; and the do-not-feed-the-trolls heuristic, which advises one to not reward malicious actors with attention. We argue that these strategies for implementing critical ignoring should be part of school curricula on digital information literacy. Teaching the competence of critical ignoring requires a paradigm shift in educators’ thinking, from a sole focus on the power and promise of paying close attention to an additional emphasis on the power of ignoring. Encouraging students and other online users to embrace critical ignoring can empower them to shield themselves from the excesses, traps, and information disorders of today’s attention economy.
People frequently continue to use inaccurate information in their reasoning even after a credible retraction has been presented. This phenomenon is often referred to as the continued influence effect of misinformation. The repetition of the original misconception within a retraction could contribute to this phenomenon, as it could inadvertently make the "myth" more familiar, and familiar information is more likely to be accepted as true. From a dual-process perspective, familiarity-based acceptance of myths is most likely to occur in the absence of strategic memory processes. Thus, we examined factors known to affect whether strategic memory processes can be utilized: age, detail, and time. Participants rated their belief in various statements of unclear veracity, and facts were subsequently affirmed and myths were retracted. Participants then rerated their belief either immediately or after a delay. We compared groups of young and older participants, and we manipulated the amount of detail presented in the affirmative or corrective explanations, as well as the retention interval between encoding and a retrieval attempt. We found that (a) older adults over the age of 65 were worse at sustaining their postcorrection belief that myths were inaccurate, (b) a greater level of explanatory detail promoted more sustained belief change, and (c) fact affirmations promoted more sustained belief change in comparison with myth retractions over the course of 1 week (but not over 3 weeks). This supports the notion that familiarity is indeed a driver of continued influence effects.
Among American Conservatives, but not Liberals, trust in science has been declining since the 1970s. Climate science has become particularly polarized, with Conservatives being more likely than Liberals to reject the notion that greenhouse gas emissions are warming the globe. Conversely, opposition to genetically modified (GM) foods and vaccinations is often ascribed to the political Left, although reliable data are lacking. There are also growing indications that rejection of science is suffused by conspiracist ideation, that is, the general tendency to endorse conspiracy theories, including the specific belief that inconvenient scientific findings constitute a "hoax."
We conducted a propensity weighted internet-panel survey of the U.S. population and show that conservatism and free-market worldview strongly predict rejection of climate science, in contrast to their weaker and opposing effects on acceptance of vaccinations. The two worldview variables do not predict opposition to GM. Conspiracist ideation, by contrast, predicts rejection of all three scientific propositions, albeit to greatly varying extents. Greater endorsement of a diverse set of conspiracy theories predicts opposition to GM foods, vaccinations, and climate science.
Free-market worldviews are an important predictor of the rejection of scientific findings that have potential regulatory implications, such as climate science, but not necessarily of other scientific issues. Conspiracist ideation, by contrast, is associated with the rejection of all scientific propositions tested. We highlight the manifold cognitive reasons why conspiracist ideation would stand in opposition to the scientific method. The involvement of conspiracist ideation in the rejection of science has implications for science communicators.