When an article on Facebook detailed a QAnon conspiracy theory about a “16-Year Plan to Destroy America,” commenters demanded death for those supposedly involved, including former president Barack Obama, former secretary of state Hillary Clinton and other Democrats.

Some Facebook commenters were specific in their calls for justice: “Firing squad---by SHOTGUN!”
Others just craved speed: “TREASON = FIRING SQAUD [sic] OR HANGING! DO IT NOW PLEASE THAT’S THE LAW! ! ! ! ! ! ! ! ! ! ! ! ! !”
These posts — from January 2018, just months after QAnon flamed to life from the embers of Pizzagate, with its false claims of a child sex ring run by Democrats out of a Washington pizzeria — were among the many early warnings that the new conspiracy theory was fueling hatred and calls for violence on Facebook, Twitter and other social media.
But it would be years before Facebook and Twitter would make major moves to curb QAnon’s presence on their platforms, despite serious cases of online harassment and offline violence that followed, and moves by other social media companies to limit the spread of QAnon’s lurid and false allegations of pedophilia and other crimes.
One social media company, Reddit, closed forums devoted to the conspiracy theory in 2018 because of online harassment and calls for violence, and YouTube removed tens of thousands of QAnon videos and hundreds of related channels in June 2019 as part of a broader crackdown on content that violated its hate speech and other policies, YouTube spokesman Alex Joseph said.
Still, it would be another year before Facebook and Twitter would initiate broad crackdowns against QAnon, waiting until this past summer to close or limit the reach of more than 20,000 QAnon-connected accounts and pages after two years of QAnon-fueled threats of violence and numerous real-world crimes. By then, FBI officials, in an intelligence briefing, had warned that QAnon was becoming a potential domestic terrorism threat, and the U.S. Military Academy’s Combating Terrorism Center had warned that “QAnon represents a public security threat with the potential in the future to become a more impactful domestic terror threat.”

QAnon adherents made good use of the delay, harnessing the power of those mainstream social media platforms to grow the movement into what many researchers consider the world’s largest and most virulent online conspiracy theory.

Feverishly analyzing cryptic “drops” of information from the anonymous leader “Q,” followers spread misinformation about a host of seemingly unconnected issues, from the Sandy Hook, Conn., mass shooting to the supposed dangers of vaccines to the recent wildfires in the Pacific Northwest. Throughout, they traded in anti-Semitic tropes and other hateful content.
“These accusations were so deranged,” said researcher Travis View, who co-hosts a podcast called “QAnon Anonymous” and has watched in growing horror as the conspiracy theory spread. “I always knew it would get to the point where people would ask: How did it get to this point? How did it get so bad?”
One key answer, researchers who have studied QAnon say, was Silicon Valley’s fierce reluctance to act as “an arbiter of truth” even as disinformation with potentially dangerous consequences ran rampant across its platforms. Mainstream social media companies permitted the growth of the conspiracy theory in part because they considered it authentic domestic political speech at a time when President Trump and other Republicans were bashing the firms for alleged bias against conservatives, people familiar with internal debates at the companies say.

Twitter’s head of site integrity, Yoel Roth, acknowledged his company had been slow. “Whenever we introduce a change to our policies, we can look back and wish that we’d introduced it earlier. And I think in the case of QAnon in particular, there were signals that I wish we and the entire industry and the world had responded to sooner,” he said in an interview.
Facebook spokesman Andy Stone said his company had done all it could. “Removing hundreds of QAnon pages and groups, restricting the reach of many more, and soon prohibiting anyone from running ads that praise or support QAnon are not the actions of a company afraid of upsetting QAnon supporters,” he said. “It’s the important work we’ve done in consultation with outside experts.”
Facebook and Twitter did take actions against individual QAnon accounts and pages in the years before the recent crackdowns, including in April, when Facebook took down five pages and six QAnon-affiliated groups that had amassed more than 100,000 members and followers.

But by the time of more systemic action this summer, more than 7,000 accounts affiliated with QAnon were spreading what Twitter called harmful disinformation on its service. Facebook removed nearly 800 groups and banned 300 hashtags when it acted in August, and placed restrictions on an additional 10,000 accounts across Facebook and Instagram. The company declined to say how many members the groups had, but researchers have said that millions of Facebook users were probably affected.
Researchers say these moves curbed QAnon’s reach somewhat, but several asked: What took so long?
“I don’t think QAnon gets as big as it is without the platforms as an essential piece of the infrastructure holding these communities together,” said Joan Donovan, director of the Technology and Social Change Project at Harvard Kennedy School’s Shorenstein Center. “Early intervention does matter.”

Baseless and bizarre claims

At QAnon’s core are baseless allegations that Democratic officials and Hollywood celebrities engaged in unconscionable crimes, including raping and eating children, while seeking to subvert the Constitution. Trump, the conspiracy theory holds, is quietly battling these evils.
The “Q” of QAnon is supposedly a high-level government official privy to these secrets because of a top-secret security clearance. The shadowy figure posts only on the site 8kun, a successor to the now-closed 8chan, but for years the information spread almost instantly across mainstream social media platforms, powered by those analyzing Q’s pronouncements.
More than 70 Republican candidates have promoted or voiced support for at least some elements of the conspiracy theory this year, according to tracking by liberal research group Media Matters, and one open adherent, Marjorie Taylor Greene, is virtually guaranteed to win a seat in Congress in November’s election. Trump has praised Greene, defended QAnon supporters and retweeted content from QAnon accounts.
QAnon T-shirts, slogans and posters have regularly appeared at Trump events since 2018 and, as his reelection effort intensified this year, in campaign ads as well. White House social media director Dan Scavino has posted QAnon-themed imagery. Vice President Pence had planned earlier this month to attend a Montana fundraiser hosted by a couple who had shared QAnon posts and memes on social media, until the Associated Press reported on the event.

The list of violence inspired by QAnon is long, and the serious incidents date back to 2018, when an armed man touting the conspiracy theory was arrested after a standoff at the Hoover Dam. Another man fixated on QAnon fatally shot a New York crime family figure in 2019. And this past April, police arrested a woman armed with more than a dozen knives after she announced on Facebook that Clinton and former vice president Joe Biden “need to be taken out.”
Researchers at Graphika, a network analysis firm that works with Facebook and other social media companies, found that QAnon and Trump’s online support overlapped to such an extent in 2018 that the two online communities were almost inextricable for the purposes of mapping relationships among accounts. Camille François, the company’s chief innovation officer, called the resulting network maps of interactions “a hairball” of overlapping accounts.
Now Graphika’s network maps show QAnon has spread beyond Trump supporters, a finding that coincides with the sprawling conspiracy theory absorbing new themes, including baseless claims about vaccines and the dangers of 5G technology.

“QAnon has morphed into something, like a Frankenstein, that defies existing categories of harmful content for the platforms,” François said.
But even as the companies regarded QAnon posts as a largely protected class of free speech, there often were apparent violations of company policies against calls for violence and harassment of individuals. Though the targets often were public figures — including Obama, model Chrissy Teigen and Serbian artist Marina Abramovic — the intensity and hatefulness of the posts were as obvious in 2018 and 2019 as they were when Facebook and Twitter took action this summer, researchers said.
In January 2018, the same month Facebook hosted the article about the “16-Year Plan to Destroy America” and its litany of responses calling for summary executions, Twitter seethed with similar content, said Clemson University researcher Darren Linvill, who found frequent references to “shooting,” “hanging” and “firing squad” in posts with QAnon hashtags. Some of those posts have since been removed in the company’s recent enforcement actions.

One image on Twitter, posted Jan. 5, 2018, depicted an apparently satanic ritual in which a hooded figure prepared to plunge a dagger into an infant as Obama and former presidents George H.W. Bush and Bill Clinton looked on, smiling. “We’ve gotta hang these assh*les! #qanon,” wrote the poster, whose account description says “opposed to progressive liberal indoctrination” and includes the hashtags #MAGA #TrumpTrain.
Another tweet from that month, responding to a post with the #QAnon hashtag and depicting Obama behind bars, read: “He deserves the firing squad. The gallows, the chair, whatever. Make him an example to all traitors who may think of pulling crap like this ever again.”

A highly organized approach

The original post from Q appeared in October 2017 on 4chan, a fringe online forum rife with hate and political extremism. It predicted the imminent arrest of Hillary Clinton and warned of “massive riots.” This and subsequent posts also raised a series of conspiratorial questions about Trump, billionaire George Soros and the Obamas. Even though the predictions and allegations proved false, followers began to spread Q’s messages across tech platforms, creating Reddit boards, YouTube channels, Facebook pages, Twitter accounts and businesses selling merchandise on Amazon.
From the beginning, adherents of QAnon used highly organized strategies to grow their audience and capitalize on the infrastructure of social media sites to alter the political conversation, said Kate Starbird, associate professor of human-centered design and engineering at the University of Washington, who has researched the movement.
One common technique to amass supporters was called the “follow-back,” in which a Twitter account would put out a call for followers and promise to return the favor. These requests helped some accounts gain tens of thousands of followers, she said.
Another strategy — which, like requesting follow-backs, was permitted by Twitter — was the “hashtag rally,” in which a group of online accounts would all tweet the same hashtag at the same time. One took place on May 11, 2018, when QAnon followers flooded the Twitter account of Sen. John McCain with pro-Trump imagery and memes suggesting McCain’s imminent death. McCain, who died that August, had revealed in 2017 that he had been diagnosed with brain cancer.
The tweets contained dozens of QAnon-related hashtags, and 97 percent of them mentioned Trump’s Twitter account in an apparent attempt to grab his attention, Starbird said. One hashtag, #OPMAYFLOWER2018, appeared in all of them. An anonymous researcher on Twitter traced the hashtag back to a private Facebook group called OPERATIONMAYFLOWERWWG1WGA, which incorporated an acronym for one of the movement’s slogans, “Where We Go One We Go All.”
Administrators of the Facebook page had posted step-by-step instructions for followers to participate in the operation against McCain. The instructions divided followers into teams, and each team was instructed to tweet specific hashtags at McCain at the same time.
“A lot of what we’ve seen wasn’t yet against the social media companies’ policies,” Starbird said. “It’s a lot easier for them to see in retrospect the size of the problem that was manifesting.”
Facebook, Twitter and YouTube long had policies against specific threats and incitements to violence, but the platforms struggled with how to enforce these rules in cases when the targets were public figures.
This was especially true in the aftermath of the 2016 presidential campaign, when Trump rallies routinely erupted in chants of “lock her up” in reference to his opponent, Hillary Clinton, often led by campaign officials or Trump family members. That pushed the boundaries of acceptable political discourse, making it harder for Silicon Valley to draw clear distinctions when seeking to enforce policies, said Ethan Zuckerman, director of the Center for Civic Media at the Massachusetts Institute of Technology.
“That line between incitement of violence versus legitimate political speech gets really, really fine under Trump,” Zuckerman said.
But Reddit, having struggled with its role incubating the Pizzagate conspiracy theory in 2016, found numerous violations of its policies against online harassment, incitement to violence and “doxing” — the publication of a target’s home address or other identifying information — and closed QAnon forums in March and September 2018.
“We try to police behaviors, not beliefs,” said Chris Slowe, Reddit’s chief technology officer, echoing the common position within Silicon Valley that enforcement actions should be in response to prohibited actions — inciting violence, harassing others — as opposed to political views.
After Reddit acted, he said, QAnon’s followers largely abandoned the platform. Many moved on to Facebook, Twitter and YouTube, where QAnon flourished among more mainstream audiences.

Early discussions, but no action

As at Reddit, officials at Facebook began discussing signs in 2018 that QAnon was growing dangerous. At least one employee voiced concern then about the conspiracy theory’s potential to develop into a domestic terrorism threat, said a former member of Facebook’s Integrity Team and others familiar with those conversations, who spoke on the condition of anonymity to avoid retaliation. But the company was focused on foreign threats to that year’s congressional elections and was reluctant to take enforcement action against domestic speech. Officials also viewed the vicious conversations in private groups as more deserving of protection than other types of content.
“I remember thinking, Jesus, this stuff is insane and these people are crazy,” said another former Facebook official. “But back then, it was seen as a community of people who purposefully sought out this information, and the burden of proof for taking action against a private group was very high.”
The political persuasion of most of QAnon’s supporters also cooled interest within Facebook in cracking down on the conspiracy theory at a time when the company was working to refute Republican allegations of bias, said people familiar with internal conversations. Executives feared that punishing Trump supporters would compromise the perception of neutrality that the company hoped to achieve and would result in the censorship of genuine political speech.

In 2018, Twitter also had heated internal conversations about the rise of domestic conspiracy theories and the way adherents of those theories had begun using its platform to gain the attention of influential voices on the right, including the president, his eldest son and other supporters.
Like Facebook, Twitter concluded that the tactics did not break its rules against direct incitements to violence, spam, or the use of fake accounts or bots, even though several accounts engaged in “borderline behaviors,” according to a person familiar with the company’s discussions at the time, who spoke on the condition of anonymity because they were not authorized to discuss the talks with a reporter.
So Twitter gave wide latitude to what it classified as political conversation — sometimes even when that debate led to the harassment of individuals, including Teigen, who had been repeatedly and falsely accused on Twitter of affiliating with pedophiles. (Teigen, through a spokesperson, declined to comment.)
Twitter eventually strengthened its rules against misinformation, launching an initiative dedicated to “healthy conversations” and developing policies against real-world harm and dehumanizing speech, setting the stage for stronger enforcement.
One notable change that heightened calls for action this year: QAnon conspiracy theorists began touting false cures for covid-19, crossing a red line the social media companies had drawn during the early phases of the global pandemic.
On this subject, Twitter, Facebook, YouTube and other companies have chosen to combat falsehoods more directly than ever before, arguing that untrue medical information was different from political speech. That applied even to posts by Trump and his top supporters, who for the first time faced sanctions from social media sites, including warning labels and the removal of some especially blatant misinformation.

In May, a conspiratorial documentary called “Plandemic,” in which a discredited research scientist falsely claimed that wearing masks helps cause the coronavirus, went viral, becoming one of the top trending videos on YouTube. Social media researcher Erin Gallagher traced the aggressive promotion of the documentary back to a handful of Facebook groups with tens of thousands of members each.
Among the main Facebook groups spreading links to the documentary were OFFICIAL Q/QANON and the Great Awakening, two QAnon groups that have since been taken down. QAnon and anti-vaccine groups on Facebook also played a large role in promoting protests against shutdowns, said Renée DiResta, technical research manager at the Stanford Internet Observatory.
Facebook officials said an April 2020 incident, in which an exotic dancer and QAnon supporter was arrested after showing up at a Navy hospital ship with a car full of knives, played heavily into their decision to increase sanctions. Twitter’s Roth cited the FBI’s 2019 warning as well.
Even more recently, researchers have documented QAnon accounts pushing false claims that members of antifa, a loosely organized, far-left political faction, had started wildfires in the Pacific Northwest. This prompted Stone, the Facebook spokesman, to tweet on Sept. 12 that the company was removing the posts because police were having to “divert resources from fighting the fires and protecting the public.”
The move was ridiculed in replies to Stone’s tweet as too little, too late.
Media Matters, the liberal group, reported in mid-September that posts in private Facebook groups echoed the same violent themes that QAnon supporters had pushed from the beginning.
“RISE UP AMERICA!!!!! FIGHT FOR OUR COUNTRY!” said one that included the images of people it claimed were arsonists affiliated with antifa. “HERE ARE JUST 8 (WITH MORE TO COME) OF THE EVIL DEMONIC ‘ANTIFA’ TERRORISTS WHO STARTED THE FIRES THAT DESTROYED FORESTS AND COMMUNITIES AND KILLED FAMILIES AND CHILDREN.”
The FBI took to Twitter to debunk the allegation. “Reports that extremists are setting wildfires in Oregon are untrue,” the Portland office tweeted.
EDITOR’S NOTE: This story quotes Travis View, co-host of the QAnon Anonymous podcast, without reporting that this name is a pseudonym, a detail View did not disclose to The Post when the story was written. That violates Post policy, which prohibits the use of pseudonyms except in rare cases and requires disclosure that a pseudonym is being used. View’s real name is Logan Strain.

Timberg, C. and Dwoskin, E., 2020. As QAnon grew, Facebook and Twitter missed years of warning signs about the conspiracy theory’s violent nature. Washington Post [online]. Available at: <https://www.washingtonpost.com/technology/2020/10/01/facebook-qanon-conspiracies-trump/> [Accessed 12 June 2021].