Tuesday, July 12, 2016

Presidential Debate Grammar Power Rankings

Ready or not, the U.S. presidential campaign season is upon us. Whoever your pick for POTUS, one thing’s certain—political topics inspire passionate discussions. With a light heart and heavy-hitting algorithms, we visited each candidate’s official Facebook page and looked at the comments there to see how well their supporters handle themselves when they communicate their ideas in writing.

Our first study put followers of Republican candidates—the participants in the first national presidential debate on August 6—in the spotlight. The Democratic candidates climb into the debate ring on October 13, so this time we put their supporters to the test. Then, in the spirit of friendly competition, we combined the studies into one infographic to allow for comparisons between the parties.

Whether your discussion style is passionate or placid, as the 2016 presidential election approaches there’s no better time for intelligent discourse.

 

To share this infographic with your blog readers, paste the following HTML snippet into your web editor:

Please attribute this infographic to https://www.grammarly.com/grammar-check.

Methodology

We began by taking a large sample of Facebook comments, each containing at least fifteen words, posted to each candidate’s official page between April 2015 and August 2015. Next, we created a set of guidelines to help limit (as much as possible) the subjectivity of categorizing the comments as positive or negative. Since the point of the study was to analyze the writing of each candidate’s supporters, we considered only obviously positive or neutral comments. Obviously negative or critical comments, as well as ambiguous or borderline negative comments, were disqualified.

We then randomly selected at least 180 of these positive and neutral comments (~6,000 words) to analyze for each candidate. Using Grammarly, we identified the errors in the comments, which were then verified and tallied by a team of live proofreaders. For the purposes of this study, we counted only black-and-white mistakes such as misspellings, wrong and missing punctuation, misused or missing words, and subject-verb disagreement. We ignored stylistic variations such as the use of common slang words, serial comma usage, and the use of numerals instead of spelled-out numbers.
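The filtering and sampling step described above could be sketched in Python along these lines (a hypothetical illustration only; the function name, the comment list, and the seed parameter are assumptions, not part of the original study):

```python
import random

def sample_comments(comments, min_words=15, n=180, seed=42):
    """Keep comments with at least `min_words` words, then draw
    a random sample of up to `n` of them for analysis."""
    eligible = [c for c in comments if len(c.split()) >= min_words]
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return rng.sample(eligible, min(n, len(eligible)))
```

In practice the sample size per candidate was at least 180 comments, which the study reports worked out to roughly 6,000 words each.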

Finally, we calculated the average number of mistakes per one hundred words by dividing the total number of mistakes by the total word count of the comments for each candidate, then multiplying by one hundred.
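That final calculation is a simple rate. As a sketch (the function name and the sample figures below are hypothetical, not taken from the study's results):

```python
def errors_per_hundred_words(total_errors, total_words):
    """Average number of mistakes per 100 words of comment text."""
    return total_errors / total_words * 100

# e.g. 51 verified mistakes across ~6,000 sampled words (made-up figures)
rate = errors_per_hundred_words(51, 6000)
print(round(rate, 2))  # 0.85
```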
