
Blending Truth and Lies: Using an Ethnographic Sensibility to Study Online Misinformation

Published online by Cambridge University Press:  24 November 2025


Abstract

Ethnographic methods of all varieties contend with the idea of the “truth” of accounts and the meanings attached to them, as well as the importance of context in mitigating truth or falseness in how these accounts are presented. Discerning truth from lies and the purpose of both in the context of making meaning in a time and place is at the heart of the ethnographic enterprise. Because powerful images or messages evoke emotional reactions on social media or contributory websites like message boards, the relative accuracy of the representations they make is often less important than their reach and the ways they make and remake “reality” for their audiences. A picture or an image, even one attributed to a context or a meaning wholly independent of the context from which it emerged, becomes part of how people online see or experience an event. The context in which information is presented and the speakers or presenters of this information also condition its uptake and resonance. This paper argues that ethnography is uniquely suited to understand the effects and reach of decontextualized information and the ways it makes meaning, both on- and offline.

Information

Type
Reflection
Creative Commons
Licence: CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of American Political Science Association

In the midst of the February 2021 winter storm that gripped much of the continental United States, a picture circulated widely on social media. While rolling blackouts and power-grid failures plagued Texas, tens of thousands of people—starting with a major player in Texas’s oil and gas industry and later including members of Congress and other high-profile figures—shared and retweeted an image of a helicopter de-icing a wind turbine (Kahn Reference Kahn2021; Link Reference Link2021). The commentary accompanying these posts blamed Texas’s power outages on the failure of green energy infrastructure like wind turbines. While later analysis has shown that the grid failures and outages were largely due to problems with natural-gas-based generation capacity (Storrow Reference Storrow2021), state leaders—including the Texas governor (Shepherd Reference Shepherd2021) and leaders from other states—and major figures in cable news commentary (Chute Reference Chute2021) blamed the outages on frozen turbines. The central problem with this argument, however, is that the image that generated so much outrage was from Sweden in 2014, not Texas in 2021 (Schultz Reference Schultz2015).

In the digital realm, information can be dissolved from the context in which it was produced. A years-old picture from Sweden can become, for many people, an image of Texas. Images and video of a wildfire from years prior become a story of Australia in 2020 (Shammas Reference Shammas2020). A quote from a literal neo-Nazi can acquire a veneer of philosophical legitimacy when posted with a misattribution by business titans (Flynn Reference Flynn2023) and congressional representatives (Pengelly Reference Pengelly2022).

As with the chemical process of dissolution, in which solids like salt dissolve in solvents like water, the change that occurs is contextual, not intrinsic. The salt that was crystalline is now a saline solution. The salt is still there, but in another form. The image from Sweden was not altered, but dissolved from its context and applied to another. The process of dissolving information from context complicates the question of adjudicating truth claims: a photo of a frozen turbine is real but misattributed. The image is not fake, but the conversation around it mischaracterizes its origins, importance, and relevance. When shared during resonant moments and by assumed experts or well-networked individuals, these decontextualized artifacts, often blending truth and fabrication,Footnote 1 take on a life of their own. Ethnography, with its attentiveness to how information transmission and uptake are inherently intersubjective and how deceptions and falsehoods are meaning-making practices, is methodologically well suited to inform the study of how decontextualized information online produces durable, real-world effects. This reflection argues that deep immersion in contexts of online information, in addition to the agnosticism about truth claims already adopted by some in the field, would enrich the study of decontextualized information, allow for a deeper understanding of its offline effects, and reveal such exchanges as inherently productive of reality, with measurable effects.

In online communication, where information spreads easily and quickly, artifacts from one context can be applied to others with relative ease. The advent of user-friendly generative artificial intelligence (AI) technologies that can fabricate photorealistic images, difficult to distinguish from actual photos, compounds these difficulties. Some images are entirely fabricated and then take on a life of their own, either as jokes that are relatively harmless because they are not based in reality, like the case of an AI-generated image of then-Pope Francis in a puffer coat (Ellery Reference Ellery2023), or as jokes that become fodder for conspiracy theorists, as in the case of a manipulated movie poster image that gained traction amid the late-2021 COVID-19 surge (Reuters Fact Check 2021).

Broadly, this misattributed or fabricated information is sorted into misinformation and disinformation: the former is empirically untrue while the latter is intentionally deceptive, though any given piece of information can potentially be both (Persily and Tucker Reference Persily, Tucker, Persily and Tucker2020).Footnote 2 Both mis- and disinformation have been blamed for myriad social and democratic ills, including fueling xenophobia and violence (Chenzi Reference Chenzi2021), intentionally degrading public knowledge (Van Der Zee et al. Reference Van Der Zee, Poppe, Havrileck and Baillon2022), and preventing coalition formation (Anspach and Carlson Reference Anspach and Carlson2020). Disinformation has also been identified as a threat to both the demand for and supply of democracy (Schiffrin Reference Schiffrin2017), and can be weaponized by powerful actors to fundamentally shift the information environment to achieve political goals through manipulation (Prier Reference Prier2017).

Yet misinformation can also be difficult to define. A standard of falseness is insufficient to demarcate it conceptually, as “no ‘unambiguously false’ claim would be believed by anyone, as its falseness would be unambiguously apparent to everyone” (Uscinski Reference Uscinski2023, 8). Individual pieces of information can also demonstrate aspects of truth and falsehood: the picture of the frozen turbine was a real photo, but its description as a scene from Texas in 2021 was false.

Fact checking as a post hoc correction has mixed effects in generating more accurate discourses. There is some evidence of the durable correction effects of fact checking and misinformation warning labels (Martel and Rand Reference Martel and Rand2023; Porter and Wood Reference Porter and Wood2021), and there are few empirically verifiable backfire effects of correction (Nyhan Reference Nyhan2021; Porter and Wood Reference Porter and Wood2024). However, other evidence points to the limited durability of fact checking, the diminution of its effects in nonbinary measures, and the effects of preexisting attitudes on its efficacy (Walter et al. Reference Walter, Jonathan Cohen and Morag2020). The influence of confirmation bias, in addition to a fractured media landscape, makes information resilient in the face of correction for many news consumers (Nickerson Reference Nickerson1998). Individual cognitive factors, like the coherence of the information presented, its compatibility with existing beliefs, and source credibility, also contribute to the “stickiness” of information and its continued uptake (Lewandowsky et al. Reference Lewandowsky, Ullrich, Seifert, Schwarz and Cook2012). All these factors are compounded by the difficulty of in-the-moment fact checking that may be inaccurate, applied with bias, or incompletely sensitive to the complexities of the mixed truth and falsehood of individual claims (Uscinski, Littrell, and Klofstad Reference Uscinski, Littrell and Klofstad2024). Additionally, some scholars (e.g., Uscinski Reference Uscinski2023) have called into question the epistemological foundations of much misinformation research, arguing that it demonstrates “causal asymmetry” because it seeks to provide a special explanation for the uptake of misinformation that is often guided by researchers’ own judgments about what is (allegedly) “obviously” incorrect (Uscinski, Littrell, and Klofstad Reference Uscinski, Littrell and Klofstad2024).

The effects of incorrect information in cases like the “Pizzagate” incident in 2016 (Siddiqui and Svrluga Reference Siddiqui and Svrluga2016) or the “birther” movement (Barbaro Reference Barbaro2016) have compounded over time, escalating with each passing event. The ripples of the original misleading information grow, and in saturated information environments become self-reinforcing (Menczer and Hills Reference Menczer and Hills2020). Such effects can be seen in extremist supporters of grand conspiracy theories (Goldberg Reference Goldberg2008), or in adherents of less totalizing ideological commitments like the antivaccination movement (Reich Reference Reich2018) or climate change denial (Lahsen Reference Lahsen2013). The consequences of misapplied information can be seen in a variety of contexts, from vaccine hesitancy among older, white evangelicals in the US in the context of the COVID-19 pandemic (Tollefson Reference Tollefson2021) to the lack of support for green energy interventions in the US (Lovins Reference Lovins2013). In all these examples, the (mis)information is taken up as credible, and is not inherently distinguishable in its velocity or spread from information propounded by communities that leverage empirical evidence to support their precepts.

Given that much decontextualized information is durable and has observable consequences offline, it becomes important to deal with it outside a simple truth-and-falsehood dichotomy and consider the context that produced its initial resonance. Rather than saying that decontextualized information is simply wrong and can therefore be corrected or dismissed—like asserting that the photo of the wind turbine is from Sweden and not from Texas, and those circulating it are merely incorrect—there seems to be a social-scientific value in engaging with mistaken information as a source of data and in recognizing that the context that produced it is of central importance to understanding its resonance. This approach mirrors much of the public opinion literature on the uptake of conspiracy theories, in which researchers have remained “decidedly agnostic about the truth claims, accuracy, or epistemological integrity” of the conspiracies themselves and have looked instead to the broader effects of adherence to them (Oliver and Wood Reference Oliver and Wood2014, 953).

The added value of an ethnographic approach comes, however, in leveraging both the context of the resonant but decontextualized information and the positionality of those who spread it. This approach is not merely agnostic about truth claims but actively engaged in exploring how spreaders of information take part in practices of meaning making and community building. The “lies” are data. As such, this approach not only allows for an agnosticism about the truth claims being made, but also enables the building of a more complete picture of how truth and fabrication are blended and remixed to create offline effects. The photo from Sweden is an artifact that may have been produced in one context, but it became important and real (if not empirically true) in another. Additionally, the “reality” of the image is produced by the context in which it is reproduced (the Texan ice storm), and its uptake by individuals is conditioned by the forum in which it is presented and the speakers who have presented it (sectoral and political elites).

I argue in this paper that an ethnographic “sensibility” like that proposed by Schatz (Reference Schatz and Schatz2009) is needed to understand the circulation, effects, uptake, and durability of decontextualized information—especially online—and how the new contexts in which information is presented create new information landscapes and observable offline effects. This reflection argues that social scientists must engage with the ways in which this information makes and remakes reality, mixing what is externally verifiable with what is felt or fabricated, based not only on the information itself but also on the context and positionality of the speakers who disseminate it. This remaking of reality has consequences for political engagement and public life. The argument extends the agnosticism of some public opinion and conspiracy theory scholarship (see, e.g., Douglas et al. Reference Douglas, Uscinski, Sutton, Cichocka, Nefes, Ang and Deravi2019; Munn, Magee, and Arora Reference Munn, Magee and Arora2024; Uscinski, Littrell, and Klofstad Reference Uscinski, Littrell and Klofstad2024), and encourages researchers to engage with the ways that decontextualized information becomes real and produces effects. Because political ethnography is a methodological and ontological field within political science that engages consistently with questions of truth and falsehood, context, and positionality in meaning making, it is, as both a method and a sensibility, uniquely well situated to deal with the instability of the information environment and to discern the import of context and positionality in evaluating information.

This paper first examines how decontextualized information is disseminated online and suggests an ethnographic approach to understanding it. Such an approach allows researchers to assess these information flows while suspending truth claims and taking seriously the real-world effects of misleading information; in doing so, it seeks to understand the meaning-making activities of individuals who engage with such information online. From there, the paper uses case studies of misapplied information to demonstrate how this information has created real-world effects, even in the face of rapid and comprehensive “corrections.” These case studies illustrate that the resonance and uptake of decontextualized information must be studied in terms of context and positionality if the complexities of truth and fabrication are to be understood.

Case Selection

To demonstrate the world-making possibilities as well as the spread and durability of misattributed information, this reflection includes three case studies. The first, referenced above, examines the viral image of a frozen wind turbine that spread quickly in the context of a power-grid failure in Texas in 2021. The second case study looks at the repeated misattribution of a quote from a neo-Nazi activist to the Enlightenment philosopher Voltaire. The third looks at activism by white-rights groups in South Africa and the resultant 2025 policy by the Trump administration to give refugee status to “Afrikaners.” These cases do not constitute a representative sample of all such incidents, nor are they ethnographic accounts in themselves; rather, they provide a variety of time frames and contexts in which to examine the world-making possibilities and consequences of misinformation and explore how these claims are situated by context and the positionality of their speakers. The benefit of examining these cases in parallel is that they demonstrate the process of information uptake and the production of real-world effects over time frames ranging from weeks to decades. Furthermore, they demonstrate the importance of the recontextualized information’s context and origins in explaining its uptake and blend of truth and falsehood.

The Problem: Complexities of Information Online

The online information ecosystem poses unique challenges for understanding the origins and reach of information for at least two distinct reasons: first, information can spread in viral cascades; and second, it can be completely disconnected from the setting in which it is produced, either by transplanting it from its original historical or geographic context or by using generative AI to fabricate text, images, or video with ease.

Because of the ease of sharing and the size of potential audiences in decentralized social media, individual pieces of information—arguments, images, frames, and so forth—have the potential to spread quickly and with relatively few transaction costs. These cascades of information occur when an image, fact, or frame is shared widely in a short time, both in support of and in opposition to a particular claim. They can be observed either while they are happening, by documenting instances in a digital space like a message board or a social media network, or after the fact, through searches of key terms or reverse image searches. Information that evokes strong reactions is more likely to travel in cascades because it inspires people to share and interact with it on social media (Stieglitz and Dang-Xuan Reference Stieglitz and Dang-Xuan2013). These cascades can be discrete events or a collection of incidents that form part of a broader pattern. Often, such patterns of information sharing are repeated within “homogenous clusters” of users on a social media platform, and partisan polarization accelerates the process (Del Vicario et al. Reference Del Vicario, Bessi, Zollo, Petroni, Scala, Caldarelli, Eugene Stanley and Quattrociocchi2016).
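As a minimal illustration of the after-the-fact tracing described above, the sketch below (in Python, with invented timestamps and an arbitrary threshold chosen purely for illustration) bins dated shares of a single image or claim by hour and flags the windows in which sharing surges, one rough way of delimiting a cascade before reading its content qualitatively.

```python
from collections import Counter
from datetime import datetime

# Hypothetical share records for one image or claim, of the kind a researcher
# might assemble from key-term searches or a reverse image search.
shares = [
    "2021-02-16T19:05", "2021-02-16T19:40", "2021-02-16T21:15",
    "2021-02-17T02:10", "2021-02-17T02:30", "2021-02-17T02:45",
    "2021-02-17T03:05", "2021-02-18T13:00",
]

def cascade_windows(timestamps, threshold=3):
    """Group shares by hour and return the hours whose counts meet an
    (illustrative) threshold, marking when sharing surged."""
    counts = Counter(
        datetime.fromisoformat(ts).strftime("%Y-%m-%d %H:00") for ts in timestamps
    )
    return {hour: n for hour, n in sorted(counts.items()) if n >= threshold}

print(cascade_windows(shares))
# {'2021-02-17 02:00': 3} -- a crude marker of when sharing spiked
```

Such a count only locates the surge; the ethnographic work of interpreting what was shared, by whom, and in what context remains with the researcher.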

This is all the more true in the context of generative AI—computer programs trained on large caches of data that, in response to simple, text-based prompts, can generate new content with similar characteristics to that data, including humanlike writing and speech and realistic video. As such, well-documented public figures, documents, styles, or places are easily reproduced with new parameters, as with AI-generated images of then candidate Donald Trump with Black “supporters” (Marianna Reference Marianna2024), purported explosions at the Pentagon (Bond Reference Bond2023), or misleading images of election outcomes (Duffy Reference Duffy2024).

The intervention of generative AI is “democratizing the creation of propaganda” (Waldrop Reference Waldrop2023, 4). When paired with the cascading nature of information online, the “danger of new AI-based tools is scale and velocity: the ability to produce large volumes of credible-sounding misinformation quickly, then to leverage networks to distribute it expeditiously online” (Kreps, McCain, and Brundage Reference Kreps, McCain and Brundage2022, 114). As they are trained from existing information, these AI tools often blend largely factual accounts with fabricated data, making the evaluation of this information context specific, complex, and difficult to parse (Munn, Magee, and Arora Reference Munn, Magee and Arora2024).

The velocity and density of information spread online, when paired with the seemingly free-floating nature of information dissolved from its context, constitutes a new terrain of what Arendt (Reference Arendt1971) called “political lies,” insofar as such information (whether intentionally deceptive or not) creates new realities. In seeking to understand the political lies of Donald Trump, McGranahan (Reference McGranahan2017, 246) argues that such speech “is more than denying history; it is a rearranging of society,” in the sense that it creates new political communities based on its internal logics.

Ethnography of Truth and Lies

In ethnography, political science has a set of methodological tools that have been used to navigate complex information environments and explore how they are shaped by context and positionality. Pathbreaking work by ethnographers like Lee Ann Fujii (Reference Fujii2010) has delved into the idea of how to discern truth from lies in politics. Jessica Allina-Pisano’s (Reference Allina-Pisano and Schatz2009) work has explored the ways that idiom can be used by interlocutors to misrepresent historical relationships but shed light on present political realities, and how the meaning of idioms can change depending on the context in which they are used. James Scott’s (Reference Scott1990) Domination and the Arts of Resistance similarly explores how people deliberately conceal behavior and act differently depending on the networks of power in which they are situated, and how the context in which words are said shapes their truth values. In El Alto, Rebel City: Self and Citizenship in Andean Bolivia, Sian Lazar (Reference Lazar2008) addresses the ways in which local and indigenous knowledge mediates relationships between citizens and the state and is conditioned by relationships to power. Whether using a positivist or interpretivist mode, ethnography has consistently grappled with the idea of truth, falsehood, authenticity, and consistency in presentation. Deception and falsehoods, both intentional and unintentional, create worlds and are situated within contexts of meaning.

Discerning truth from lies, to borrow a phrase from Fujii (Reference Fujii2010), is not necessarily about the pursuit of truth as such. Nor does studying misinformation require an acceptance of incorrect claims, or even an effort to refute them. Rather, what is being investigated and revealed in the virality of an image, piece of text, or video (and its resistance to “factual correction”) is the cognitive and political milieux in which such an image, piece of text, or video resonates, and how its resonance is a world-making process.

The existing literature on digital ethnography has well-established standards for employing ethnographic methods to understand the online space (see, e.g., Hine Reference Hine, Hjorth, Horst, Galloway and Bell2017; Pink et al. Reference Pink, Horst, Postill, Hjorth, Lewis and Tacchi2016). These methods outline the parameters of immersion-based study in terms of site-specific details about platforms and fora, norms around noninterference, dealing with ephemerality, the identification of relevant content, and the construction of genealogies of information. The goal throughout digital ethnography is a deep immersion in the digital space to understand how and why information is produced, resonates, and morphs in new contexts. As an immersion-based method with tools that are well suited to understanding meaning-making processes, digital ethnography can help scholars of digital information to understand not only the (mis)information itself, but also how it becomes relevant.

To this end, I propose leveraging the toolkit of ethnography, both on- and offline, to help understand how the context in which decontextualized information is presented online and the carriers of that information condition its validity and uptake. Paired with an agnosticism about truth claims borrowed from public opinion research and interpretive ethnography, this allows researchers of misinformation to account for both the ways that information is adopted and how it makes new realities. In practice, this can take place through the mapping or genealogical exercises employed by ethnographers (see, e.g., Smith Reference Smith2019), field-based observations (Mannik and McGarry Reference Mannik and McGarry2017), or ethnographic interviews (Spradley [1979] Reference Spradley2016), among other methods. Broadly, this reflection is a provocation to adopt an ethnographic “sensibility” (see, e.g., Schatz Reference Schatz and Schatz2009; Yanow and Schwartz-Shea Reference Yanow and Schwartz-Shea2006) in conducting research on decontextualized information, “including an attention to language, context, and meaning” (Wedeen Reference Wedeen2010, 258; emphasis added).
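To make the genealogical exercise concrete, a minimal sketch follows (again in Python, with invented entries standing in for the appearances a researcher would actually collect); it simply orders dated sightings of one artifact and prints how its claimed origin shifts across venues, leaving the interpretive work of explaining those shifts to the ethnographer.

```python
from dataclasses import dataclass

@dataclass
class Appearance:
    date: str         # ISO date the artifact was observed
    venue: str        # platform or forum where it appeared
    attribution: str  # origin claimed for it in that context

# Invented entries sketching the genealogy of a single image across contexts.
appearances = [
    Appearance("2021-02-16", "Twitter", "Texas, 2021"),
    Appearance("2016-01-12", "Facebook", "frozen turbines, place unspecified"),
    Appearance("2015-02-03", "news site", "Sweden, 2014"),
]

# Chronological ordering exposes when and where the attribution changed.
for a in sorted(appearances, key=lambda x: x.date):
    print(f"{a.date} | {a.venue:9} | attributed to: {a.attribution}")
```

Read alongside field notes and interviews, such a timeline is a scaffold for, not a substitute for, interpreting why a new attribution resonated when and where it did.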

It remains important to understand the resonance of the wind turbine photo because it is reflective and productive: it was initially shared by an energy expert, resonated in the context of a crisis and because of underlying skepticism about wind power, was reinforced by elites, and resulted in a measurable downturn in public support for green energy. All these causes and effects are missed if we think about misinformation as merely incorrect, or without the context and positionality that produced the viral cascade centered on the photo itself.

Misinformation Case Study 1: Tilting at Windmills?

In February 2021, the state of Texas faced a record-breaking series of winter storms, with associated snow and ice. As a result of the freezing temperatures, demands on the state’s electric power grid outstripped supply, leading to more than 4.5 million residences losing power. The power outages caused an estimated $195 billion in damage (King et al. Reference King, Rhodes, Zarnikau and Lin2021), and resulted in the deaths of at least 210 people (Flores et al. Reference Flores, McBrien, Do, Kiang, Schlegelmilch and Casey2023). On February 16, a Texas oil and gas consultant named Luke Legate shared a tweet that showed an image of a helicopter de-icing a wind turbine, accompanied by the text “A helicopter running on fossil fuel spraying a chemical made from fossil fuels onto a wind turbine made with fossils [sic] fuels during an ice storm is awesome” (see figure 1). The tweet racked up 30,741 retweets and 89,720 likes before Legate made his account private (Legate Reference Legate2021). The assertion and the photo, it seems, were given credibility by Legate’s position in the oil and gas industry. From there, the story of frozen wind turbines in Texas (figuratively) snowballed.

Figure 1 Frozen Wind Turbines: Legate Tweet

Tucker Carlson, at the time a host on Fox News, took to his show to condemn the wind turbines, saying “[t]he windmills failed like the silly fashion accessories they are, and people in Texas died” (Reality Check Team Reference Team2021). Texas governor Greg Abbott, in another Fox News segment, said, “[W]e have a massive amount of wind farms out in west Texas that are frozen up … all of that wind energy was lost” (Usero and Rizzo Reference Usero and Rizzo2021). Former governor of Texas Rick Perry was also interviewed on Fox News and said of the turbines, “This shows how the Green New Deal would be a deadly deal for the United States of America. … Our wind and our solar got shut down, and they were collectively more than 10 percent of our power grid, and that thrust Texas into a situation where it was lacking power on a statewide basis. … It just shows that fossil fuel is necessary” (quoted in Shepherd Reference Shepherd2021). Texas agriculture commissioner Sid Miller posted on Facebook, saying, “We should never build another wind turbine in Texas. … The experiment failed big time” (quoted in Browne Reference Browne2021). In a subsequent post Miller went on to say, “Insult added to injury: Those ugly wind turbines out there are among the main reasons we are experiencing electricity blackouts” (quoted in Douglas and Ramsey Reference Douglas and Ramsey2021). Texas congressional representative Dan Crenshaw (Reference Crenshaw2021) published a 13-tweet thread from his official (noncampaign) account on Twitter, of which the first three substantive tweets focused on frozen turbines, including one that said, “This is what happens when you force the grid to rely in part on wind as a power source. When weather conditions get bad as they did this week, intermittent renewable energy like wind isn’t there when you need it.”

Other representatives and senators from all over the country, like Steve Daines (Reference Daines2021) from Montana, Lauren Boebert (Reference Boebert2021) from Colorado, Andy Barr (Reference Barr2021) from Kentucky, and Marjorie Taylor Greene (Reference Greene2021) from Georgia, took to social media to discuss the failures of wind-based power generation as the source of the blackouts from their campaign, personal, and official accounts. Some went on to emphasize their opposition to the Biden administration’s proposed “Green New Deal,” and to argue in favor of fossil fuels to generate electricity. Each of these posts garnered hundreds or thousands of likes and retweets. Then-former president Donald Trump also weighed in, saying in a speech at the Conservative Political Action Conference in February 2021, “The windmill calamity that we’re witnessing in Texas … it’s so sad when you look at it. … [President Biden] wants windmills. … The windmills that don’t work when you need them” (quoted in Yen and Woodward Reference Yen and Woodward2021).

The central difficulty in all the above discourse is, of course, that wind-power failures accounted for a very small minority of the power generation failure in the case of the Texas blackouts. Wind energy accounted for 10% of the total power supply in Texas in 2021, and while some turbines did ice over, “shutdowns of thermal power plants, primarily those relying on natural gas, dwarfed the dent caused by frozen wind turbines, by a factor of five or six,” and the state’s two nuclear power plants also stopped supplying electricity because of frozen equipment (Usero and Rizzo Reference Usero and Rizzo2021). Yet the story of the frozen wind turbines is what became the talking point in the context of the power failure for more than a week.

While not every one of the officials quoted above may have seen or interacted with the photo referenced in the introduction, it was (chronologically) the start of the information cascade. The picture itself, as an image from Sweden in 2014, has been the subject of several unassociated cascades on social media dating as far back as 2016. The picture gained a life of its own in the context of the Texas power outages, not because the image was fabricated, but because it was misapplied. A reverse image search on Google in February and March 2021 showed more than 1,400 instances of that exact image being used in recently updated websites. Some of these websites were, of course, debunking the application of this image to Texas, but more were participating in the dissemination of the image. So popular was the image that a suggested auxiliary search from Google at the time was “frozen windmills in Texas” (see figure 2).

Figure 2 Google Suggested Search Term

What is it about the image of a frozen turbine that resonated in the midst of one of the most acute power failures in recent memory (Searcey Reference Searcey2021)? Why were people so quick to ascribe blame to the wind turbines, which provide a minority of the state’s total power supply (Chute Reference Chute2021)? How does this instance of misinformation link with other critiques of wind turbines from other time periods from figures like Donald Trump (CNN 2019), or with larger critiques of environmental initiatives or green energy? In answering these questions, it is less important that the image is from Sweden than that it traveled as far and as quickly as it did, and that it began with an oil and gas industry leader in the midst of a serious public crisis. From these origins, so many people, including powerful voices in the media and in politics, thought of this image as being of, and from, Texas. To simply say that people were wrong in attributing the picture or that this application of the picture is incorrect is to largely miss its importance and influence, as well as to fail to see how the context that produced it blended credible and falsifiable information about the photo. Certainly, it was a picture taken from its context and applied to another. It was never actually a picture of Texas, but in the minds and experiences of the people sharing the image, it was (or is) Texas. It was a political lie that created a new reality, and was given credence by elite signaling and a crisis context. It reflected a piece of the audience’s already understood reality in ways detached from the origins of the image, and resonated with the ways they saw (or see) the world.

The picture’s popularity is not (just or only) a matter of untruth; it is data, and it is only understandable as part of the context that reproduced a decontextualized piece of information. The storm, the expert origins of the photo, and the subsequent elite echoing of these points all created the conditions for its uptake. It is also demonstrable that the sharing of this picture is temporally correlated with measurable outcomes. A June 2021 poll shows that Republican respondents had become less supportive of wind turbines (and indeed of all renewable sources of energy) over the preceding years, with a major dip between 2020 and 2021 (Kennedy and Spencer Reference Kennedy and Spencer2021).

The provocation here, then, is not merely to maintain an agnosticism about the truth claims of the photo or its propagators, but rather to actively engage with the meaning-making processes as a subject of study. Rather than debunking such claims as merely false, it is imperative that social-scientific researchers treat them as data to understand how they produce offline effects.

Misinformation Case Study 2: Philosophers Make Everything Sound More Respectable

In May 2023, Elon Musk, who had bought Twitter the previous year amid significant controversy, tweeted an image of a hand crushing a group of seemingly helpless people. Accompanying the image was a quote, attributed to the French philosopher Voltaire, that read, “To learn who rules over you, simply find out who you are not allowed to criticize” (see figure 3).

Figure 3 Screenshot of Elon Musk Tweet

This provocative quote—part of the billionaire’s path of radicalization toward the extreme right (Darcy Reference Darcy2024)—was seemingly intended as a response to the criticisms that had been leveled against him for promoting such viewpoints on his newly acquired platform (Flynn Reference Flynn2023). The quote, given the veneer of legitimacy by reference to a canonical philosopher, implicitly alleges that the central antidemocratic impulse is in criticizing radical-right speech, rather than in propagating deeply flawed versions of history or policy positions taken from antidemocratic regimes and thinkers.

Yet as with the picture of the wind turbine discussed above, there is a central problem with the circulation of these words in this context: the quote is not from Voltaire at all, but rather a loosely paraphrased thesis of a manifesto by Kevin Alfred Strom, a neo-Nazi activist (Flynn Reference Flynn2023). In a 2017 blog post on the white supremacist website National Vanguard, Strom (Reference Strom2017) claimed credit for the idea behind these words, attributing it to one of his broadcasts from 1993.

Musk is hardly the first person to credit Voltaire with these words. Kentucky congressman Thomas Massie tweeted the same quote and attribution, though with a different image, in early 2022 (Pengelly Reference Pengelly2022). Actor John Cusack tweeted the same image and quote, this time including a blue Star of David, in 2019 (Associated Press Reference Press2019), as did the then head of the Philadelphia branch of the National Association for the Advancement of Colored People, Rodney Muhammad, in 2020 (CBS Philadelphia 2020). Australian senator Cory Bernardi tweeted the same quote and attribution in 2015 (Hunt Reference Hunt2015).

In each of these cases, the sharing of this quote is seemingly provoked by an individual with a large audience claiming they are facing undue criticism for an unpopular opinion. Musk’s sharing of this image was in the context of his acquisition of Twitter, when he came under increasing scrutiny for his politics and public persona (Mezrich Reference Mezrich2023). Bernardi’s sharing of the quote occurred in the context of public criticism over alleged anti-Muslim bias (Hunt Reference Hunt2015). Both Cusack and Muhammad had been accused of antisemitism before sharing the quote with a Star of David attached to it (Associated Press Reference Press2019; CBS Philadelphia 2020). Massie’s sharing of the quote was in the context of significant public protests around COVID-based restrictions, and contained text sardonically urging his audience not to question such measures (Pengelly Reference Pengelly2022). In each case, the speaker or disseminator of the quote was a target of public criticism and was using the quote to “fight back.”

Even aside from the ideological ramifications of this shift in attribution from a French philosopher to an American neo-Nazi, it is important to note that all these misattributions have occurred after substantial efforts at debunking the connection between this quote and Voltaire. Efforts to attribute the quote to its original source have been ongoing since at least 2012, when etymologist Barry Popik (Reference Popik2012) posted about it on his website in the wake of the quote being posted widely in forums. Indeed, the quote’s attribution to Voltaire has been so thoroughly debunked over the last 13 years that the majority of results on the first two pages of a Google search discuss the origins and correct attribution of these words.

The persistence of this misinformation online, even in the face of more than a decade of fact checking, points to something broader, which should be of interest to scholars studying rhetoric in the social and political world. Whether disseminated by politicians, artists, or other powerful people, the quote’s continued use and the attribution of it to a “reputable” source are indicative of its resonance. While not all those who have used it are necessarily fellow travelers with the neo-Nazi movement, the idea that criticism in public is akin to domination rather than public sphere contestation is a troubling shift, and one that has partisan implications. Public opinion surveys by the Pew Research Center show partisan differences in perceptions of “cancel culture,” which Republican and conservative-leaning respondents are significantly more likely to define in terms of undeserved punishment rather than accountability (Vogels et al. Reference Vogels, Anderson, Porteus, Baronavski, Atske, McClain, Auxier, Perrin and Ramshankar2021), and experimental work indicates that framing an issue in terms of “cancel culture” actually increases partisan divides in support for the groups involved (Fahey, Roberts, and Utych Reference Fahey, Roberts and Utych2023).

While the attribution of this quote to an Enlightenment philosopher is incorrect, its continual use sheds light on the ways in which dissent and public discourse are understood by the people who use it, and its spread in the context of their sustained public criticism is vital to understanding its resilience. This quote, and its reputation-washing attribution, demonstrate the same dynamics underlying the data from public opinion surveys and experimental social science. Yet if social scientists merely say that these actors were misinformed or spreading incorrect information, we miss the opportunity to leverage this important data about context and speakers for wider understanding. By remaking the argument of a socially unacceptable activist into the words of a philosopher, the propagators of this quote have remade the terrain of democratic contestation, with measurable offline effects.

Misinformation Case Study 3: Who Gets to Be a Victim?

In early February 2025, Donald Trump signed an executive order entitled “Addressing Egregious Actions of the Republic of South Africa,” which directed the freezing of all aid to South Africa and the creation of a refugee program to “promote the resettlement of Afrikaner refugees escaping government-sponsored race-based discrimination, including racially discriminatory property confiscation” (The White House 2025a). In May, the first charter plane of white South Africans arrived at Dulles International Airport in Virginia, welcomed by administration officials who championed their work ethic, potential for assimilation, and tenacity, and highlighted the discrimination they faced in South Africa (US Department of State 2025). The welcoming of these refugees was all the more notable because during the same time period the Trump administration had paused all other refugee programs (The White House 2025b).

The central tension is that the available statistical evidence demonstrates that the situation in South Africa does not match the picture painted by the executive orders. White South Africans continue to be significantly wealthier than their fellow citizens on average (Díaz Pabón et al. Reference Díaz Pabón, Leibbrandt, Ranchhod and Savage2021) and are less likely to experience violent crime (Kriegler and Shaw Reference Kriegler and Shaw2016). The allegations of a “white genocide” that were later made by administration officials (Chothia Reference Chothia2025) have been rejected by South African courts (Masih Reference Masih2025), mass media (C. Wilson Reference Wilson2020), and in scholarly evaluations of crime data (Pogue Reference Pogue2019) for at least the last decade (Chothia Reference Chothia2018). New legislation around the expropriation of land, signed into law in early 2025, does change the legal regime around state seizures of land, but it is limited by oversight from the South African legal system and there is no racial motivation for expropriation in the language of the law (Gerber Reference Gerber2018). Additionally, there have been no confiscations of land on the basis of race (or any other logic) since the passage of the new legislation (Boggenpoel Reference Boggenpoel2025).

Given this, how have white activists been able to successfully seed the notion of unique victimhood in South Africa? In large part, this common perception is the result of a public campaign by South African activists that focuses on infrequent but recurring cases of shocking violence and was later taken up by major figures in the right-wing and alt-right media (Holmes Reference Holmes2019). This strategy, in which activists are “constantly reminding the audience of the victims of crime in the past,” is “collapsing the timeline of threat so that the spectacular violence of only a few cases is happening always and everywhere, as part of a single, seamless trauma” (Holmes Reference Holmes2022, 377). These activists, whose accounts have been circulating in right-wing and alt-right circles for at least 20 years, have garnered significant political alliances since the rise of Donald Trump to the national political stage, even in the face of countervailing evidence (Swain Reference Swain2002).

Nevertheless, the perceived threats of land seizure and crime, even if not borne out by statistical facts, have long been a motivating cause for activists focused on white rights in southern Africa (Steyn and Foster Reference Steyn and Foster2008). They have shaped white South Africans’ evaluations of governmental legitimacy (Lemanski Reference Lemanski2004), life satisfaction (Møller Reference Møller2005), and their sense of investment in democracy (Comaroff and Comaroff Reference Comaroff and Comaroff2006). They have been the subject of mass advocacy campaigns both in South Africa (Jordaan Reference Jordaan2017) and internationally (Ward Reference Ward2018).

Such campaigns have been very successful in creating a sympathetic international public for these perceived victims, especially in the context of tensions around immigration in places like Australia (J. Wilson Reference Wilson2018) and the US (Klee Reference Klee2024). Proponents of the “great replacement theory,” among them policy makers within the Trump administration, have held up the cases of South Africa and Zimbabwe as cautionary tales about the consequences of ceding power to multicultural majorities (Swain Reference Swain2002). The culmination of these campaigns and perceived threats has been the granting of refugee status to white South Africans by the US. Once again, the context of the rise and mainstreaming of major alt-right figures is critical in understanding why and how this refugee program came about, and why it came about when it did. While decades of activism underpin the sudden rise to prominence of a victimhood narrative for white South Africans, it is only the context of the late 2010s and mid-2020s that converts what was a fringe talking point into a point of public policy.

By concentrating on a small number of violent cases and dissolving them from their historical, criminal justice, or geographic context, activist groups convert “the emotional draw of their victimhood narrative … even if exceptionally rare, into political opportunity through repetition” (Holmes Reference Holmes2022, 381). That activism generated sympathies, and from those sympathies a program emerged through which dozens of people came to the US with refugee status (Iqbal, Macaulay, and Drenon Reference Iqbal, Macaulay and Drenon2025). Whether these claims are credible or not, and whether they are grounded in statistical data or in perceived vulnerabilities, the fact remains that powerful people in the Trump administration chose to institute a program that enabled dozens of people to radically alter their lives. To simply argue that there is no evidence to support these claims misses the very real effects that these claims have, how they gained common currency, and when and where they acquired their impact.

Conclusion

Social scientists miss critical potential sources of evidence by viewing mis- and disinformation online as merely wrong, as pointed out by Uscinski, Littrell, and Klofstad (Reference Uscinski, Littrell and Klofstad2024). This reflection provokes further conversation by arguing that the context in which decontextualized information is presented and the positionality of those who present it also inform truth claims and condition the uptake of this information. An ethnographic sensibility applied to the uptake of decontextualized information online thus allows social scientists to account for the ways that such information can act as Arendtian “political lies,” whether through intentional deception, reliance on different data, or mixing broadly true and untrue information in ways informed and conditioned by the context of its repetition. The creative potential of this information is in some ways unrelated to the truth at its basis, and in adopting an ethnographic approach to understanding it, it is possible to account for the uptake of the information, the times and audiences with which it resonates, and the reality it produces.

Suspending the need to adjudicate truth claims in addressing misinformation online—rather than insisting, for example, that the quote was from a neo-Nazi, that the wind turbine photo was from Sweden, or that white South Africans are not singled out for violence or discrimination—allows social scientists to focus instead on exploring how decontextualized information is leveraged by those who deploy it in particular contexts, and how it creates durable effects. Social scientists are missing many of the effects of mis- and disinformation online if they persist in thinking about them purely in terms of truth claims, and if they fail to account for the importance of context and positionality in producing mixes of truth and fabrication.

But discerning these kinds of baselines takes time, and the process of tracing the dissemination of a fact, image, or frame and its debunking can sometimes be difficult. Just as with ethnography in in-person contexts, getting a sense of how information moves through a space and an audience requires immersion in that context. While ethnography has many virtues, it is not a method to be applied quickly. This is all the more true in a digital environment, where the density of information, the speed at which it travels, and the reach of the information far surpass the local contexts in which so much ethnography is conducted.

Even with these shortcomings, the image of the wind turbine is useful not merely as an artifact from a different time and place, but also as a means to understand Texas in 2021 and the national conversations around green energy. A quote from a neo-Nazi, while misattributed, tells the reader about the ways in which its disseminators receive critique: as censorship, not dialogue. The perceived threat to white South Africans is spun out into a program that facilitates their claiming of special legal status. In all these cases, fact checking and post hoc correction may be important for a positivist approach to establishing a credible record of events. If we adopt an ethnographic sensibility, however, these instances of misapplied information can help us to understand the milieu in which they resonate, and the reasons why certain contexts amplify their resonance. Additionally, applying an ethnographic sensibility to decontextualized information allows for a much more fine-grained analysis of both truth claims and fabrications, which elides the technical and epistemological difficulties in distinguishing “misinformation” from other types of informational cascades.

An agnosticism about truth may not be useful in understanding where a photo comes from, the source of a quote or meme, or the empirical value of the threat that it is perceived to pose. But it does help to explain why and how such pieces of information take on a life of their own in the context of online discussions and become true in ways that supersede their empirical origins. Added to this, sensitivity to the context that produces the information and the contexts in which it resonates allows for fine-grained analysis of the blending of truth and fabrication. Being cognizant of the background environment also helps us to analyze the resonance and importance of decontextualized information when it achieves virality in new contexts.

Footnotes

1 This phrase and the title of this piece reference and signal an intellectual debt to the work of Lee Ann Fujii, especially her influential article “Shades of Truth and Lies: Interpreting Testimonies of War and Violence” (Fujii Reference Fujii2010).

2 For the purposes of this paper, I will use the term “misinformation” primarily because proving the intention of the disseminators of misapplied or decontextualized information is outside of the scope or ambitions of this project. It is an important distinction in many instances, but for the purposes of this argument the distinction is not critical. The information is taken up as credible, regardless of whether it is intentionally deceptive or not.

References

Allina-Pisano, Jessica. 2009. “How to Tell an Axe Murderer: An Essay on Ethnography, Truth, and Lies.” In Political Ethnography: What Immersion Contributes to the Study of Power, ed. Schatz, Edward, 53–73. Chicago: University of Chicago Press.
Anspach, Nicolas M., and Carlson, Taylor N. 2020. “What to Believe? Social Media Commentary and Belief in Misinformation.” Political Behavior 42 (3): 697–718. DOI: 10.1007/s11109-018-9515-z.
Arendt, Hannah. 1971. “Lying in Politics: Reflections on the Pentagon Papers.” New York Review of Books, November 18. https://www.nybooks.com/articles/1971/11/18/lying-in-politics-reflections-on-the-pentagon-pape.
Associated Press. 2019. “John Cusack Apologizes for Anti-Semitic Tweet.” WWMT, June 19. https://wwmt.com/news/entertainment/john-cusack-apologizes-for-anti-semitic-tweet.
Barbaro, Michael. 2016. “Donald Trump Clung to ‘Birther’ Lie for Years, and Still Isn’t Apologetic.” New York Times, September 16. https://www.nytimes.com/2016/09/17/us/politics/donald-trump-obama-birther.html.
Barr, Andy (@RepAndyBarr). 2021. “@FriendsofCoalKY is correct: this storm exposed the dangers of relying too heavily on renewable energy. Wind turbines are freezing in Texas, unable to provide power and leaving thousands of Texans in the cold.” Twitter, February 16, 7:48 p.m. https://twitter.com/RepAndyBarr/status/1361764379037949959.
Boebert, Lauren (@laurenboebert). 2021. “Rolling blackouts from ND to TX have turned into lengthy power outages in freezing conditions. Biden needs to lift his oil & gas ban as we need reliable energy sources. The Green New Deal was just proven unsustainable as renewables are clearly unreliable.” Twitter, February 16, 2:45 a.m. https://twitter.com/laurenboebert/status/1361506987339571201.
Boggenpoel, Zsa-Zsa Temmers. 2025. “Land Seizure and South Africa’s New Expropriation Law: Scholar Weighs Up the Act.” The Conversation, January 30. http://theconversation.com/land-seizure-and-south-africas-new-expropriation-law-scholar-weighs-up-the-act-244697. DOI: 10.64628/AAJ.awxancnky.
Bond, Shannon. 2023. “Fake Viral Images of an Explosion at the Pentagon Were Probably Created by AI.” NPR, May 22. https://www.npr.org/2023/05/22/1177590231/fake-viral-images-of-an-explosion-at-the-pentagon-were-probably-created-by-ai.
Browne, Ed. 2021. “Why Did Wind Turbines Freeze in Texas When They Work in the Arctic?” Newsweek, February 18. https://www.newsweek.com/texas-wind-turbines-frozen-power-why-arctic-1570173.
CBS Philadelphia . 2020. “Jewish Federation of Greater Philadelphia Calls for Philly NAACP President to Resign after Anti-Semitic Facebook Post.” CBS News, July 25. https://www.cbsnews.com/philadelphia/news/philadelphia-naacp-president-rodney-muhammad-anti-semitic-facebook-post.Google Scholar
Chenzi, Vincent. 2021. “Fake News, Social Media and Xenophobia in South Africa.” African Identities 19 (4): 502–21. DOI: 10.1080/14725843.2020.1804321.10.1080/14725843.2020.1804321CrossRefGoogle Scholar
Chothia, Farouk. 2018. “South Africa: The Groups Playing on the Fears of a ‘White Genocide.’” BBC News, September 1. https://www.bbc.com/news/world-africa-45336840.Google Scholar
Chothia, Farouk. 2025. “Is There a Genocide of White South Africans as Donald Trump Claims?” BBC News, May 16. https://www.bbc.com/news/articles/c9wg5pg1xp5o.Google Scholar
Chute, Nate. 2021. “What Percentage of Texas Energy Is Renewable? Breaking Down the State’s Power Sources from Gas to Wind.” Austin American-Statesman, February 19. https://www.statesman.com/story/news/2021/02/17/texas-energy-wind-power-outage-natural-gas-renewable-green-new-deal/6780546002.Google Scholar
CNN . 2019. “Watch Trump Rant about Wind Turbines during Speech.” Video, 1:40. CNN, December 23. https://www.cnn.com/videos/politics/2019/12/23/trump-wind-sot-turning-point-usa-conference-vpx.cnn.Google Scholar
Comaroff, Jean, and Comaroff, John L.. 2006. “Figuring Crime: Quantifacts and the Production of the Un/Real.” Public Culture 18 (1): 209–46. DOI: 10.1215/08992363-18-1-209.10.1215/08992363-18-1-209CrossRefGoogle Scholar
Crenshaw, Dan (@RepDanCrenshaw). 2021. “With electricity blackouts across Texas, many are wondering: what happened? How does the energy capital of the United States run out of power? Here’s what happened. THREAD.” Twitter, February 16, 9:41 p.m. https://twitter.com/RepDanCrenshaw/status/1361792727097212929.Google Scholar
Daines, Steve (@SteveDaines). 2021. “Texas is frozen solid as folks are left w/ no power to stay safe & warm. This is a perfect example of the need for reliable energy sources like natural gas & coal. These blackouts would be devastating to MT. No heat & no power simply are not options in the dead of winter.” Twitter, February 16, 4:19 p.m. https://twitter.com/SteveDaines/status/1361711788493074432.Google Scholar
Darcy, Oliver. 2024. “Radicalized by the Right: Elon Musk Puts His Conspiratorial Thinking on Display for the World to See.” Yahoo Finance, March 19. https://finance.yahoo.com/news/radicalized-elon-musk-puts-conspiratorial-055858883.html.Google Scholar
Del Vicario, Michela, Bessi, Alessandro, Zollo, Fabiana, Petroni, Fabio, Scala, Antonio, Caldarelli, Guido, Eugene Stanley, H., and Quattrociocchi, Walter. 2016. “The Spreading of Misinformation Online.” Proceedings of the National Academy of Sciences 113 (3): 554–59. DOI: 10.1073/pnas.1517441113.10.1073/pnas.1517441113CrossRefGoogle ScholarPubMed
Díaz Pabón, Fabio Andrés, Leibbrandt, Murray, Ranchhod, Vimal, and Savage, Michael. 2021. “Piketty Comes to South Africa.” British Journal of Sociology 72 (1): 106–24. DOI: 10.1111/1468-4446.12808.
Douglas, Erin, and Ramsey, Ross. 2021. “No, Frozen Wind Turbines Aren’t the Main Culprit for Texas’ Power Outages.” Texas Tribune, February 16. https://www.texastribune.org/2021/02/16/texas-wind-turbines-frozen.
Douglas, Karen M., Uscinski, Joseph E., Sutton, Robbie M., Cichocka, Aleksandra, Nefes, Turkay, Ang, Chee Siang, and Deravi, Farzin. 2019. “Understanding Conspiracy Theories.” Political Psychology 40 (S1): 3–35. DOI: 10.1111/pops.12568.
Duffy, Clare. 2024. “Top AI Photo Generators Produce Misleading Election-Related Images, Study Finds.” CNN, March 6. https://www.cnn.com/2024/03/06/tech/ai-images-election-misinformation/index.html.
Ellery, Simon. 2023. “Fake Photos of Pope Francis in a Puffer Jacket Go Viral, Highlighting the Power and Peril of AI.” CBS News, March 28. https://www.cbsnews.com/news/pope-francis-puffer-jacket-fake-photos-deepfake-power-peril-of-ai.
Fahey, James J., Roberts, Damon C., and Utych, Stephen M. 2023. “Principled or Partisan? The Effect of Cancel Culture Framings on Support for Free Speech.” American Politics Research 51 (1): 69–75. DOI: 10.1177/1532673X221087601.
Flores, Nina M., McBrien, Heather, Do, Vivian, Kiang, Mathew V., Schlegelmilch, Jeffrey, and Casey, Joan A. 2023. “The 2021 Texas Power Crisis: Distribution, Duration, and Disparities.” Journal of Exposure Science & Environmental Epidemiology 33 (1): 21–31. DOI: 10.1038/s41370-022-00462-5.
Flynn, Sheila. 2023. “Elon Musk Tweets Quote by Neo-Nazi Wrongly Attributed to Voltaire.” The Independent, May 29. https://www.the-independent.com/news/world/americas/us-politics/elon-musk-twitter-voltaire-neo-nazi-b2347605.html.
Fujii, Lee Ann. 2010. “Shades of Truth and Lies: Interpreting Testimonies of War and Violence.” Journal of Peace Research 47 (2): 231–41. DOI: 10.1177/0022343309353097.
Gerber, Jan. 2018. “National Assembly Adopts Report on Land Expropriation without Compensation.” News24 (Cape Town), December 4. https://www.news24.com/news24/southafrica/news/national-assembly-adopts-report-on-land-expropriation-without-compensation-20181204.
Goldberg, Robert Alan. 2008. Enemies Within: The Culture of Conspiracy in Modern America. New Haven, CT: Yale University Press. DOI: 10.12987/yale/9780300090000.001.0001.
Greene, Marjorie Taylor (@mtgreenee). 2021. “If passed, the Green New Deal will literally kill people. Millions of people have suffered in TX with #rollingblackouts & some died bc of reliance on ‘green’ energy. Increasing the use & banning the export of clean & plentiful reliabale [sic] natural gas for electricity is saving TX.” Twitter, February 18, 1:10 p.m. https://twitter.com/mtgreenee/status/1362389013521432582.
Hine, Christine. 2017. “From Virtual Ethnography to the Embedded, Embodied, Everyday Internet.” In The Routledge Companion to Digital Ethnography, eds. Hjorth, Larissa, Horst, Heather, Galloway, Anne, and Bell, Genevieve, 47–54. Abingdon: Routledge.
Holmes, Carolyn E. 2019. “Tucker Carlson, Those South African White Rights Activists Aren’t Telling You the Whole Truth.” Washington Post, May 15. https://www.washingtonpost.com/politics/2019/05/15/tucker-carlson-those-south-african-white-rights-activists-arent-telling-you-whole-truth.
Holmes, Carolyn E. 2022. “Victimhood Gone Viral: Portrayals of Extra-Lethal Violence and the Solidarity of Victims in the Case of South African Farm Violence Activists.” Politics, Groups, and Identities 10 (3): 367–87. DOI: 10.1080/21565503.2020.1838303.
Hunt, Elle. 2015. “Cory Bernardi Mistakenly ‘Quotes’ Voltaire on Twitter with Neo-Nazi’s Line.” The Guardian, November 27. https://www.theguardian.com/australia-news/2015/nov/27/cory-bernardi-mistakenly-quotes-voltaire-on-twitter-with-supposed-neo-nazis-line.
Iqbal, Nomia, Macaulay, Cecilia, and Drenon, Brandon. 2025. “Dozens of White South Africans Arrive in US under Trump Refugee Plan.” BBC News, May 12. https://www.bbc.com/news/articles/crljn5046epo.
Jordaan, Nomahlubi. 2017. “#BlackMonday the Biggest Protest Ever against Farm Murders: Afriforum.” Times Live (Johannesburg), October 30. https://www.timeslive.co.za/news/south-africa/2017-10-30-blackmonday-the-biggest-protest-ever-against-farm-murders-afriforum.
Kahn, Brian. 2021. “Viral Image Claiming to Show a Helicopter De-Icing Texas Wind Turbines Is from Winter 2014 in Sweden.” Gizmodo, February 16. https://gizmodo.com/viral-image-claiming-to-show-a-helicopter-de-icing-texa-1846279287.
Kennedy, Brian, and Spencer, Alison. 2021. “Most Americans Support Expanding Solar and Wind Energy, but Republican Support Has Dropped.” Short Reads, June 8. Washington, DC: Pew Research Center. https://www.pewresearch.org/short-reads/2021/06/08/most-americans-support-expanding-solar-and-wind-energy-but-republican-support-has-dropped.
King, Carey W., Rhodes, Josh D., Zarnikau, Jay, and Lin, Ning. 2021. “The Timeline and Events of the February 2021 Texas Electric Grid Blackouts.” Committee report, July. Austin: University of Texas at Austin Energy Institute. https://energy.utexas.edu/sites/default/files/UTAustin%20%282021%29%20EventsFebruary2021TexasBlackout%2020210714.pdf.
Klee, Miles. 2024. “Elon Musk All but Endorses the Great Replacement Conspiracy Theory.” Rolling Stone, January 5. https://www.rollingstone.com/culture/culture-news/elon-musk-great-replacement-conspiracy-theory-1234941337.
Kreps, Sarah, McCain, R. Miles, and Brundage, Miles. 2022. “All the News That’s Fit to Fabricate: AI-Generated Text as a Tool of Media Misinformation.” Journal of Experimental Political Science 9 (1): 104–17. DOI: 10.1017/XPS.2020.37.
Kriegler, Anine, and Shaw, Mark. 2016. A Citizen’s Guide to Crime Trends in South Africa. Johannesburg: Jonathan Ball.
Lahsen, Myanna. 2013. “Anatomy of Dissent: A Cultural Analysis of Climate Skepticism.” American Behavioral Scientist 57 (6): 732–53. DOI: 10.1177/0002764212469799.
Lazar, Sian. 2008. El Alto, Rebel City: Self and Citizenship in Andean Bolivia. Durham, NC: Duke University Press. DOI: 10.1215/9780822388760.
Legate, Luke (@lukelegate). 2021. “A helicopter running on fossil fuel spraying a chemical made from fossil fuels onto a wind turbine made with fossils fuels during an ice storm is awesome.” Twitter, February 14, 7:05 p.m. https://web.archive.org/web/20210217010434/https://twitter.com/lukelegate/status/1361149723072208896.
Lemanski, Charlotte. 2004. “A New Apartheid? The Spatial Implications of Fear of Crime in Cape Town, South Africa.” Environment and Urbanization 16 (2): 101–12. DOI: 10.1177/095624780401600201.
Lewandowsky, Stephan, Ecker, Ullrich K. H., Seifert, Colleen M., Schwarz, Norbert, and Cook, John. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106–31. DOI: 10.1177/1529100612451018.
Link, Devon. 2021. “Fact Check: Meme of Wind Turbine Being De-Iced Includes 2015 Image from Sweden.” USA Today, February 18. https://www.usatoday.com/story/news/factcheck/2021/02/18/fact-check-image-meme-wind-turbine-de-icing-sweden/6787470002.
Lovins, Amory. 2013. “Debunking the Renewable Energy Disinformation Campaign.” National Geographic, August 8. https://www.nationalgeographic.com/environment/article/lovins-debunking-the-renewable-energy-disinformation-campaign.
Mannik, Lynda, and McGarry, Karen. 2017. Practicing Ethnography: A Student Guide to Method and Methodology. Toronto, ON: University of Toronto Press.
Spring, Marianna. 2024. “Trump Supporters Target Black Voters with Faked AI Images.” BBC News, March 4. https://www.bbc.com/news/world-us-canada-68440150.
Martel, Cameron, and Rand, David G. 2023. “Misinformation Warning Labels Are Widely Effective: A Review of Warning Effects and Their Moderating Features.” Current Opinion in Psychology 54: 101710. DOI: 10.1016/j.copsyc.2023.101710.
Masih, Niha. 2025. “‘White Genocide’ in South Africa Is Not Real, Court Rules in Estate Case.” Washington Post, February 27. https://www.washingtonpost.com/world/2025/02/27/south-africa-white-genocide-claim.
McGranahan, Carole. 2017. “An Anthropology of Lying: Trump and the Political Sociality of Moral Outrage.” American Ethnologist 44 (2): 243–48. DOI: 10.1111/amet.12475.
Menczer, Filippo, and Hills, Thomas. 2020. “Information Overload Helps Fake News Spread, and Social Media Knows It.” Scientific American, December 1. DOI: 10.1038/scientificamerican1220-54.
Mezrich, Ben. 2023. Breaking Twitter: Elon Musk and the Most Controversial Corporate Takeover in History. New York: Grand Central.
Møller, Valerie. 2005. “Resilient or Resigned? Criminal Victimisation and Quality of Life in South Africa.” Social Indicators Research 72 (3): 263–317. DOI: 10.1007/s11205-004-5584-y.
Munn, Luke, Magee, Liam, and Arora, Vanicka. 2024. “Truth Machines: Synthesizing Veracity in AI Language Models.” AI & Society 39 (6): 2759–73. DOI: 10.1007/s00146-023-01756-4.
Nickerson, Raymond S. 1998. “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises.” Review of General Psychology 2 (2): 175–220. DOI: 10.1037/1089-2680.2.2.175.
Nyhan, Brendan. 2021. “Why the Backfire Effect Does Not Explain the Durability of Political Misperceptions.” Proceedings of the National Academy of Sciences 118 (15): e1912440117. DOI: 10.1073/pnas.1912440117.
Oliver, J. Eric, and Wood, Thomas J. 2014. “Conspiracy Theories and the Paranoid Style(s) of Mass Opinion.” American Journal of Political Science 58 (4): 952–66. DOI: 10.1111/ajps.12084.
Pengelly, Martin. 2022. “Republican under Fire for Using ‘Voltaire’ Quote Actually Coined by Neo-Nazi.” The Guardian, January 31. https://www.theguardian.com/us-news/2022/jan/31/republican-thomas-massie-voltaire-quote-neo-nazi.
Persily, Nathaniel, and Tucker, Joshua A. 2020. “Introduction.” In Social Media and Democracy: The State of the Field, Prospects for Reform, eds. Persily, Nathaniel and Tucker, Joshua A., 1–9. Cambridge: Cambridge University Press. DOI: 10.1017/9781108890960.002.
Pink, Sarah, Horst, Heather, Postill, John, Hjorth, Larissa, Lewis, Tania, and Tacchi, Jo. 2016. Digital Ethnography: Principles and Practice. Thousand Oaks, CA: SAGE Publications.
Pogue, James. 2019. “The Myth of White Genocide.” Harper’s Magazine, February 15. https://pulitzercenter.org/reporting/myth-white-genocide.
Popik, Barry. 2012. “‘To Learn Who Rules over You, Simply Find Out Who You Are Not Allowed to Criticize’ (Voltaire?).” Big Apple [blog], August 16. https://barrypopik.com/new_york_city/entry/to_learn_who_rules_over_you_simply_find_out_who_you_are_not_allowed_to_crit.
Porter, Ethan, and Wood, Thomas J. 2021. “The Global Effectiveness of Fact-Checking: Evidence from Simultaneous Experiments in Argentina, Nigeria, South Africa, and the United Kingdom.” Proceedings of the National Academy of Sciences 118 (37): e2104235118. DOI: 10.1073/pnas.2104235118.
Porter, Ethan, and Wood, Thomas J. 2024. “Factual Corrections: Concerns and Current Evidence.” Current Opinion in Psychology 55 (February): 101715. DOI: 10.1016/j.copsyc.2023.101715.
Prier, Jarred. 2017. “Commanding the Trend: Social Media as Information Warfare.” Strategic Studies Quarterly 11 (4): 50–85.
Reality Check Team. 2021. “Texas Weather: Are Frozen Wind Turbines to Blame for Power Cuts?” BBC News, February 22. https://www.bbc.com/news/world-56085733.
Reich, Jennifer A. 2018. Calling the Shots: Why Parents Reject Vaccines. New York: New York University Press. DOI: 10.18574/nyu/9781479884506.001.0001.
Reuters Fact Check. 2021. “Fact Check—Vintage Movie Poster for ‘The Omicron Variant’ Is Fake.” Reuters, December 2. https://www.reuters.com/article/factcheck-coronavirus-omicron-idUSL1N2SN1PJ.
Schatz, Edward. 2009. “Ethnographic Immersion and the Study of Politics.” In Political Ethnography: What Immersion Contributes to the Study of Power, ed. Schatz, Edward, 1–22. Chicago: University of Chicago Press. DOI: 10.7208/chicago/9780226736785.001.0001.
Schiffrin, Anya. 2017. “Disinformation and Democracy: The Internet Transformed Protest but Did Not Improve Democracy.” Journal of International Affairs 71 (1): 117–26.
Schultz, Charlotta von. 2015. “Helikopter-strålen är nya vapnet mot isen” [The helicopter jet is the new weapon against the ice]. Ny Teknik, January 21. https://www.nyteknik.se/energi/helikopter-stralen-ar-nya-vapnet-mot-isen/915835.
Scott, James C. 1990. Domination and the Arts of Resistance: Hidden Transcripts. New Haven, CT: Yale University Press. DOI: 10.12987/9780300153569.
Searcey, Dionne. 2021. “No, Wind Farms Aren’t the Main Cause of the Texas Blackouts.” New York Times, February 17. https://www.nytimes.com/2021/02/17/climate/texas-blackouts-disinformation.html.
Shammas, Brittany. 2020. “Hoaxers Are Using the Australia Wildfires to Spread Misinformation—Sometimes for Profit.” Washington Post, January 9. https://www.washingtonpost.com/technology/2020/01/09/hoaxers-are-using-australia-wildfires-spread-misinformation-sometimes-profit.
Shepherd, Katie. 2021. “Rick Perry Says Texans Would Accept Even Longer Power Outages ‘to Keep the Federal Government out of Their Business.’” Washington Post, February 18. https://www.washingtonpost.com/nation/2021/02/17/texas-abbott-wind-turbines-outages.
Siddiqui, Faiz, and Svrluga, Susan. 2016. “N.C. Man Told Police He Went to D.C. Pizzeria with Gun to Investigate Conspiracy Theory.” Washington Post, December 5. https://www.washingtonpost.com/news/local/wp/2016/12/04/d-c-police-respond-to-report-of-a-man-with-a-gun-at-comet-ping-pong-restaurant.
Smith, Nicholas Rush. 2019. Contradictions of Democracy: Vigilantism and Rights in Post-Apartheid South Africa. Oxford: Oxford University Press. DOI: 10.1093/oso/9780190847180.001.0001.
Spradley, James P. (1979) 2016. The Ethnographic Interview. Long Grove, IL: Waveland.
Steyn, Melissa E., and Foster, Don. 2008. “Repertoires for Talking White: Resistant Whiteness in Post-Apartheid South Africa.” Ethnic and Racial Studies 31 (1): 25–51. DOI: 10.1080/01419870701538851.
Stieglitz, Stefan, and Dang-Xuan, Linh. 2013. “Emotions and Information Diffusion in Social Media—Sentiment of Microblogs and Sharing Behavior.” Journal of Management Information Systems 29 (4): 217–48. DOI: 10.2753/MIS0742-1222290408.
Storrow, Benjamin. 2021. “Why the Deep Freeze Caused Texas to Lose Power.” Scientific American, February 18. Originally published in E&E News. https://www.scientificamerican.com/article/why-the-deep-freeze-caused-texas-to-lose-power.
Strom, Kevin Alfred. 2017. “Voltaire Didn’t Say It.” National Vanguard, January 19. https://nationalvanguard.org/2017/01/voltaire-didnt-say-it.
Swain, Carol M. 2002. The New White Nationalism in America: Its Challenge to Integration. Cambridge: Cambridge University Press.
Tollefson, Jeff. 2021. “The Race to Curb the Spread of COVID Vaccine Disinformation.” Nature, April 16. DOI: 10.1038/d41586-021-00997-x.
Uscinski, Joseph. 2023. “What Are We Doing When We Research Misinformation?” Political Epistemology 1 (2): 213.
Uscinski, Joseph, Littrell, Shane, and Klofstad, Casey. 2024. “The Importance of Epistemology for the Study of Misinformation.” Current Opinion in Psychology 57 (June): 101789. DOI: 10.1016/j.copsyc.2024.101789.
US Department of State. 2025. “Welcoming Afrikaner Refugees Fleeing Discrimination.” Video, 8:09. YouTube, May 13. https://www.youtube.com/watch?v=i_f79hIW1po.
Usero, Adriana, and Rizzo, Salvador. 2021. “‘Frozen Windmills’ Aren’t to Blame for Texas’s Power Failure.” Washington Post, February 18. https://www.washingtonpost.com/politics/2021/02/18/frozen-windmills-arent-blame-texass-power-failure-neither-is-green-new-deal.
Van Der Zee, Sophie, Poppe, Ronald, Havrileck, Alice, and Baillon, Aurélien. 2022. “A Personal Model of Trumpery: Linguistic Deception Detection in a Real-World High-Stakes Setting.” Psychological Science 33 (1): 3–17. DOI: 10.1177/09567976211015941.
Vogels, Emily A., Anderson, Monica, Porteus, Margaret, Baronavski, Chris, Atske, Sara, McClain, Colleen, Auxier, Brooke, Perrin, Andrew, and Ramshankar, Meera. 2021. “Americans and ‘Cancel Culture’: Where Some See Calls for Accountability, Others See Censorship, Punishment.” Feature, May 19. Washington, DC: Pew Research Center. https://www.pewresearch.org/internet/2021/05/19/americans-and-cancel-culture-where-some-see-calls-for-accountability-others-see-censorship-punishment.
Waldrop, M. Mitchell. 2023. “How to Mitigate Misinformation.” Proceedings of the National Academy of Sciences 120 (36): 14. DOI: 10.1073/pnas.2314143120.
Walter, Nathan, Cohen, Jonathan, Holbert, R. Lance, and Morag, Yasmin. 2020. “Fact-Checking: A Meta-Analysis of What Works and for Whom.” Political Communication 37 (3): 350–75. DOI: 10.1080/10584609.2019.1668894.
Ward, Justin. 2018. “The Dangerous Myth of ‘White Genocide’ in South Africa.” Hatewatch, August 23. Montgomery, AL: Southern Poverty Law Center. https://www.splcenter.org/resources/hatewatch/dangerous-myth-white-genocide-south-africa.
Wedeen, Lisa. 2010. “Reflections on Ethnographic Work in Political Science.” Annual Review of Political Science 13: 255–72. DOI: 10.1146/annurev.polisci.11.052706.123951.
The White House. 2025a. “Addressing Egregious Actions of the Republic of South Africa.” Presidential Actions, February 7. Washington, DC: The White House. https://www.whitehouse.gov/presidential-actions/2025/02/addressing-egregious-actions-of-the-republic-of-south-africa.
The White House. 2025b. “Realigning the United States Refugee Admissions Program.” Presidential Actions, January 20. Washington, DC: The White House. https://www.whitehouse.gov/presidential-actions/2025/01/realigning-the-united-states-refugee-admissions-program.
Wilson, Chris. 2020. “Nostalgia, Entitlement and Victimhood: The Synergy of White Genocide and Misogyny.” Terrorism and Political Violence 34 (8): 1810–25. DOI: 10.1080/09546553.2020.1839428.
Wilson, Jason. 2018. “Peter Dutton’s Offer to White South African Farmers Started on the Far Right.” The Guardian, March 16. https://www.theguardian.com/commentisfree/2018/mar/16/peter-duttons-offer-to-white-south-african-farmers-started-on-the-far-right.
Yanow, Dvora, and Schwartz-Shea, Peregrine. 2006. Interpretation and Method: Empirical Research Methods and the Interpretive Turn. Armonk, NY: M. E. Sharpe.
Yen, Hope, and Woodward, Calvin. 2021. “AP FACT CHECK: Trump Clings to His Core Election Falsehoods.” AP News, March 1. https://apnews.com/article/ap-fact-check-joe-biden-donald-trump-wind-power-immigration-95084f81ec1638763ed4edabed3a4fcc.
Figure 1 Frozen Wind Turbines: Legate Tweet
Figure 2 Google Suggested Search Term
Figure 3 Screenshot of Elon Musk Tweet