Parler Community Jury

Parler’s goal has always been to provide an unbiased platform where users can engage in civil discourse without fear of ideological censorship. An objective review of alleged violations is a vital part of this. Parler’s Community Jury provides due process to every user by applying our Community Guidelines and Terms of Service in a fair, viewpoint-neutral manner. 

What makes Parler’s moderation process different?

When it comes to legal speech, Parler has distinguished itself from Big Tech platforms by allowing users to moderate their own feeds and decide what they want to see.

It starts with Parler’s Terms of Service (ToS) and Community Guidelines, which specify what is and isn’t a violation as objectively as possible. Our guidelines are, in substance, guided by the First Amendment of the U.S. Constitution. Although Parler is not a state actor, our policies are informed by the U.S. Constitution and precedents set by the U.S. Supreme Court; we believe these legal standards are consistent with what ethical content moderation looks like.

As for procedure, Parler has modeled its practices on the traditions of American and English common law, enlisting the help of users from within our community. Parler Jurors deliberate and help determine what content violates our Community Guidelines.

Parler’s Community Jury Vision

“No users shall be stripped of their parleys or comments, nor shall they be suspended, banned, or deprived of their standing in any other way, except by the conscientious judgment of their equals.” 

-Adapted from the Magna Carta 

The Jury Process

We solicit, select, and recruit jurors from within the Parler community. Jurors attend training at least once a week, where we discuss cognitive biases and how they can impact decisions, and go over some of the more difficult “gray area” cases. (See juror selection criteria, below.) Each juror strives to provide unbiased verdicts, based on his or her individual objective judgment. Each juror is trained to err on the side of “not guilty” when there is significant doubt about whether reported content amounts to a violation of Parler’s ToS or Community Guidelines.

“I consider trial by jury as the only anchor ever yet imagined by man, by which government can be held to the principles of its constitution.”   

-Thomas Jefferson, 1788

User-reported parleys, comments, and profiles are sent to the Jury Portal. Multiple jurors review the content in an isolated instance to prevent groupthink bias. Once the required number of jurors have analyzed the reported content and provided their recommendations, a Parler-employee judge makes the final determination. A judge may overrule a jury’s majority verdict. In such a case, the report is shared with the wider jury pool for further discussion and deliberation. Judges and jurors often learn from these case reviews.
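As a rough sketch, the review flow described above can be expressed in code. The juror quorum, field names, and data shapes below are illustrative assumptions, not Parler's actual implementation; what the sketch preserves from the text is that multiple independent jurors vote, a majority forms a recommendation, a judge makes the final call, and an overruled majority is escalated for wider deliberation.

```python
from dataclasses import dataclass, field

REQUIRED_JURORS = 3  # assumed quorum; the actual number is not stated in the text


@dataclass
class Report:
    """A user-reported parley, comment, or profile sent to the Jury Portal."""
    content_id: str
    juror_votes: list = field(default_factory=list)  # "guilty" / "not guilty"


def jury_recommendation(report):
    """Majority recommendation once enough independent jurors have voted."""
    if len(report.juror_votes) < REQUIRED_JURORS:
        return None  # still awaiting votes
    guilty = report.juror_votes.count("guilty")
    return "guilty" if guilty > len(report.juror_votes) / 2 else "not guilty"


def final_determination(report, judge_verdict):
    """A Parler-employee judge decides; overruling the majority escalates
    the report to the wider jury pool for further discussion."""
    recommendation = jury_recommendation(report)
    escalate = recommendation is not None and judge_verdict != recommendation
    return judge_verdict, escalate


report = Report("parley-123", juror_votes=["guilty", "not guilty", "guilty"])
verdict, escalate = final_determination(report, "not guilty")
# The judge's "not guilty" overrules the 2-1 majority, so escalate is True
# and the case is shared with the wider jury pool.
```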

Each juror is encouraged to exercise his or her individual judgment, free from bias and guided by Parler’s ToS and Community Guidelines. Jurors are tasked with applying the appropriate standards to the factual context provided in each report. They deliberate on only individual violation reports or appeals; they do not have authority over other Parler users.

Violation Points and Banning

In the event the jury reaches a “guilty” verdict that a judge affirms, the accused will be issued an appropriate number of violation points, scaled according to the severity of the offense—from spam on the low end (2 points), all the way up to support for terrorist organizations (20 points).

Violation points expire after 90 days. But any user who accumulates 20 points within a 90-day period will be banned. 
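The point arithmetic above can be sketched as follows. The point values (2 for spam, 20 for support for terrorist organizations), the 90-day expiry, and the 20-point ban threshold come from the text; the function names and the timestamped-list representation are illustrative assumptions.

```python
from datetime import datetime, timedelta

EXPIRY = timedelta(days=90)  # violation points expire after 90 days
BAN_THRESHOLD = 20           # 20 points within the window => ban


def active_points(violations, now):
    """Sum the violation points that have not yet expired.

    `violations` is a list of (timestamp, points) pairs."""
    return sum(pts for ts, pts in violations if now - ts < EXPIRY)


def is_banned(violations, now):
    return active_points(violations, now) >= BAN_THRESHOLD


now = datetime(2021, 3, 1)
history = [
    (now - timedelta(days=100), 20),  # expired: outside the 90-day window
    (now - timedelta(days=10), 2),    # active: spam (low end of the scale)
]
assert not is_banned(history, now)    # only 2 active points
history.append((now - timedelta(days=1), 20))
assert is_banned(history, now)        # 22 active points inside 90 days
```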

Check out our elaboration on the Community Guidelines for more information. 

User-Driven Moderation

Parler gives each user the ability to control his or her own experience using our mute, block, and comment moderation features. Using these tools can minimize the need to report content violations and helps Parler in its quest to enforce the ToS by the least restrictive means possible. 

AI-Augmented Moderation

Due to the particular risk posed by content that threatens or incites violence, as well as the sensitive nature of NSFW content, Parler uses a privacy-preserving algorithmic review process to address this content. We have partnered with Hive, a premier cloud-based AI solutions company, to help us detect this content with state-of-the-art, viewpoint-neutral algorithms. Should any threatening or inciting content be detected, it will be removed. The user will have the opportunity to appeal the determination, removal, and violation points. NSFW content will be placed behind a splash screen, and mistakes here may also be appealed.

Finally, we have incorporated an algorithm offered by Hive which detects personal attacks based on immutable or otherwise irrelevant characteristics such as race, sex, sexual orientation, or religion. Many people call this “hate speech” or “objectionable content,” and in fact Apple’s and Google’s app store guidelines require that this content be removed from the apps entirely. But on Parler for web or sideloaded Android, we call it “trolling content,” and we place it behind a clickable splash screen. This content doesn’t contribute to a productive conversation and, while we don’t believe it is right to remove this legal speech entirely, we do want to provide our users with a way to minimize it in their feeds, should they choose to do so. 

In the spirit of the First Amendment, no violation points are assigned for trolling content and, again, any determination with respect to it may be appealed. We are currently implementing a fully integrated appeal process for application of the trolling filter, but in the meantime, please tag @parlerjury and we will manually review it for you. If you would like to know more about this process, read our Guidelines Enforcement Process.  
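The routing rules described in the last few paragraphs can be summarized in a short sketch. The labels and function name are hypothetical, not Parler's actual code; the actions, point treatment, and appealability for each category are taken from the text (threatening content is removed with points, NSFW goes behind a splash screen, trolling content goes behind a splash screen with no points, and all three determinations may be appealed).

```python
def route(label):
    """Map an AI-detected content label to the moderation action
    described above. Labels are illustrative placeholders."""
    if label == "incitement":  # threatens or incites violence
        return {"action": "remove", "points": True, "appealable": True}
    if label == "nsfw":        # sensitive but legal content
        return {"action": "splash_screen", "points": False, "appealable": True}
    if label == "trolling":    # attacks on immutable/irrelevant characteristics
        return {"action": "splash_screen", "points": False, "appealable": True}
    return {"action": "none", "points": False, "appealable": False}


# No violation points are assigned for trolling content, but the
# splash-screen determination can still be appealed.
assert route("trolling") == {"action": "splash_screen",
                             "points": False, "appealable": True}
```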

Appeals Process

The appeals process is one of the most essential functions of the Community Jury, as it allows for the maximum possible freedom of expression on Parler. How? By correcting the inevitable errors which occur in AI-augmented moderation—and even purely human moderation! Yes, human moderators are also fallible, and so even if purely human moderation could be provided at scale, a fully functioning appeals process would be necessary. If you receive notice of a content violation and think it was applied in error, you can appeal by going to Settings > Current Violations > Violation list > Select details and click “Appeal.” 

Interested in joining the Community Jury?

To be considered as a juror, you must: 

  1. have joined Parler at least 90 days ago. 
  2. be at least 18 years of age. 
  3. be a user in good standing who: 
    • believes in free speech. 
    • hasn’t recently violated the Community Guidelines. 
    • is level-headed and able to interact effectively with others. 
    • demonstrates familiarity and aptitude for the platform. 

If you meet the above criteria and you’re interested in helping to safeguard free speech for everyone, regardless of viewpoint, please email us. Be sure to tell us why free speech is important to you. 

Applicants who are selected will be onboarded into the Community Jury and provided scenario-based training, video tutorials, and in-depth explanations regarding the dangers of biases, the importance of assuming innocence until guilt is determined beyond a reasonable doubt, and more.