Google-backed RAND report recommends infiltrating & subverting online conspiracy groups from within
‘Conspiracists’ distrust authority, but will they accept authoritative messaging if it’s filtered through someone else?
Google’s Jigsaw unit sponsors a RAND report that recommends infiltrating and subverting online conspiracy groups from within while planting authoritative messaging wherever possible.
With a focus on online chatter relating to alien visitations, COVID-19 origins, white genocide, and anti-vaccination, the Google-sponsored RAND report published last week shows how machine learning can help detect and understand the language used by “conspiracy theorists.”
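For a rough sense of what that kind of machine learning detection involves, below is a minimal, purely illustrative sketch of a text classifier in Python. The snippets, labels, and model choice are invented for demonstration and are not the report’s actual pipeline, models, or data:

```python
# Illustrative sketch only: a toy classifier of the general kind used to
# flag conspiracy-style language. All snippets and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: 1 = conspiracy-style language, 0 = other
texts = [
    "they are hiding the truth from all of us, wake up",
    "the media is in on it and the cover-up runs deep",
    "new peer-reviewed study examines vaccine safety data",
    "local clinic extends weekend hours for flu shots",
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a linear classifier: a standard text baseline
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score an unseen post; the output is the probability of the "1" class
print(model.predict_proba(["none of this is a coincidence"])[0][1])
```

A real system of the sort the report describes would involve far larger corpora and more sophisticated models, but the basic shape is the same: turn text into features, then score it against learned patterns.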
While the 108-page report is highly technical in describing machine learning approaches for identifying and making sense of conspiracy language online, we’re not going to focus on any of that here.
Instead, we will zoom in on the report’s “Policy Recommendations for Mitigating the Spread of and Harm from Conspiracy Theories” section and consider how those recommendations might be received in the real world.
“Conspiracists have their own experts on whom they lean to support and strengthen their views […] One alternative approach could be to direct outreach toward moderate members of those groups who could, in turn, exert influence on the broader community” — RAND report
The report’s policy recommendations all have one thing in common: they seek to plant authoritative messaging wherever possible while making it seem more organic, or at the very least making the messaging more relatable to the intended audience.
The four policy recommendations are:
- Transparent and Empathetic Engagement with Conspiracists
- Correcting Conspiracy-Related False News
- Engagement with Moderate Members of Conspiracy Groups
- Addressing of Fears and Existential Threats
The original narrative from authoritative sources always stays the same, but the message is usually filtered through intermediaries that act like marketing, advertising, and PR firms.
What follows doesn’t have anything to do with the validity of any conspiracy theory, but rather focuses on the Google-sponsored RAND report’s messaging strategy through the following lens:
Are ‘conspiracy theorists’ more likely to believe an authoritative message when it comes from someone else?
Or
Are they more likely to focus on the validity of the message itself without placing all their trust in the messenger?
The Google-sponsored RAND report recommends that the government bet on the former.
But could such a move actually encourage the latter?
It’s a message-versus-messenger debate.
Let’s dig in.
“A common thread among all the conspiracy groups was distrust of conventional authority figures” — RAND Report
To begin, Jigsaw’s latest collaboration with the RAND Corporation reveals that across the board “conspiracy theorists” show a high distrust of “conventional authority figures” while preferring “their own experts on whom they lean to support and strengthen their views.”
The idea of distrust in conventional authority will be a major theme throughout this story as the RAND report promotes subversion from within, planting conventional authority messaging among certain members of the community and hoping it will spread.
The report suggests that conspiracy theorists won’t listen to conventional authority, but they will listen to leaders in their groups, so the plan is to target potential influencers in online conspiracy groups who are somewhat on the fence and could toe the conventional authority line.
For example, the report recommends infiltrating and subverting online conspiracy chatter by singling out the more “moderate members” of the group who could become social media influencers in their own right.
“Evidence suggests that more than one-quarter of adults in North America believe in one or more conspiracies” — RAND report
According to the report, “Conspiracists have their own experts on whom they lean to support and strengthen their views, and their reliance on these experts might limit the impact of formal outreach by public health professionals.
“Our review of the literature shows that one alternative approach could be to direct outreach toward moderate members of those groups who could, in turn, exert influence on the broader community.”
So the logic goes:
- Problem: Conspiracists have their own experts
- Solution: Direct outreach toward moderate members
- Purpose: Exert influence on the broader community
In other words, they want to turn those who aren’t completely on board with the entirety of the conspiracy into social media influencers for their authoritative marketing campaigns.
But what would be the incentive to flip?
“Commercial marketing programs use a similar approach when they engage social media influencers (or brand ambassadors)” — RAND report
The report goes on to say, “Commercial marketing programs use a similar approach when they engage social media influencers (or brand ambassadors), who can then credibly communicate advantages of a commercial brand to their own audiences on social media.”
Incentivizing social media influencers to become ambassadors for a specific brand means the influencers benefit by getting paid, and the companies benefit by reaching a wider audience.
It’s a deal driven by financial incentives in order to gain more influence.
But again, what’s the incentive for “moderate members” of so-called conspiracy groups to flip?
What would a moderate member gain by not only denouncing their former beliefs but also becoming a continuous bullhorn, shouting at people as one who has seen the error of their ways?
Would it be for moral reasons, or for some other type of gain?
“It might be possible to convey key messages to those who are only ‘vaccine hesitant,’ and these individuals might, in turn, relay such messages to those on antivaccination social media channels” — RAND report
Since all four chatter groups studied distrust conventional authority figures, RAND suggests using the more easily persuaded members of each group (moderates who aren’t fully convinced) to carry out the messaging of conventional authority figures on their behalf.
With regard to “anti-vax” groups, the report suggests, “it might be possible to convey key messages to those who are only ‘vaccine hesitant,’ and these individuals might, in turn, relay such messages to those on antivaccination social media channels.”
This tactic of being sneaky about where the messaging is coming from may be one of the reasons why people don’t trust conventional authority in the first place — a lack of transparency.
The Google-backed RAND report attempts to balance its infiltration and subversion technique by recommending another approach: transparency via “transparent and empathetic engagement with conspiracists.”
“Instead of confrontation,” the report reads, “it might be more effective to engage transparently with conspiracists and express sensitivity. Public health communicators recommend engagements that communicate in an open and evidence-informed way—creating safe spaces to encourage dialogue, fostering community partnerships, and countering misinformation with care.”
In any case, all efforts at “mitigating the spread of and harm from conspiracy theories” are aimed at directing users to accept the very sources they trust the least — conventional authority.
“An additional technique beyond flagging specific conspiracy content is facilitated dialogue, in which a third party facilitates communication (either in person or apart) between conflict parties” — RAND report
Another example of transparent and empathetic engagement suggested in the report has to do with outsourcing the authoritative messaging to third parties.
“An additional technique beyond flagging specific conspiracy content is facilitated dialogue, in which a third party facilitates communication (either in person or apart) between conflict parties,” the report suggests.
This third-party approach “could improve communication between authoritative communities (such as doctors or government leaders) and conspiracy communities.”
Again, the logic goes:
- Problem: Conspiracy communities neither trust nor interact with authoritative communities
- Solution: Third party facilitates communication
- Purpose: To improve communication between authoritative communities and conspiracy communities