Thursday, November 3, 2016

Suffolk Poll a bellwether for strong methodology


What is the Suffolk University Poll?
Beginning in 2002, the Suffolk University Political Research Center has conducted political polls at the state and national levels. Although this political research center is relatively young, bellwether survey analyses indicate that the data from these polls are about 89% accurate in predicting political outcomes.
As described by the Suffolk University website, “Used both locally and nationally, the model has an 85% accuracy rating in predicting straight-up winners”. You can access the Suffolk University website and David Paleologos’ full bio here: http://www.suffolk.edu/college/12224.php.
What is the Bellwether model?
The Suffolk University website explains that “Bellwethers are polls conducted separately from statewide surveys”. A bellwether can be a town, county, or region that accurately reflects how a state will vote. Bellwethers do not remain constant; they change every election cycle and differ with election type. Unlike statewide polls, bellwether polls are not designed to predict margins of victory, just outcomes. Examples of Paleologos’ most accurate bellwether predictions and a history of the voting predictions supporting the center’s 89% record of accuracy can be found here: http://www.suffolk.edu/academics/29013.php.
Charts detailing the data gathered and the methodologies used can be found here: http://www.suffolk.edu/academics/10741.php.
   
The survey of 1,000 likely voters is based on live telephone interviews of adults 18 years of age or older, residing in all 50 states and the District of Columbia, who intend to vote in the general election. Live interviews produce accurate data, but they are costly, which is likely why the survey was limited to 1,000 voters. The way in which these voters are selected to be called, however, balances out the effects of a limited sample size.
Demographic information for both polls was determined using 2010 Census data. This data was then used to create samples of both standard landlines and cellphones. The location of these numbers was determined using a probability-proportionate-to-size method, meaning that the phone numbers “assigned to each state were proportional to the number of adult residents in each state”. States were then grouped into four general regions.
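The allocation step described above can be sketched in a few lines of Python. This is a minimal illustration of probability-proportionate-to-size allocation, not Suffolk’s actual procedure, and the state populations below are made-up placeholders rather than Census figures.

```python
# Illustrative adult-population figures (NOT real Census data).
adult_population = {
    "California": 30_000_000,
    "Texas": 21_000_000,
    "New York": 15_000_000,
    "Vermont": 500_000,
}

def allocate_sample(populations, total_interviews):
    """Assign interviews to each state in proportion to its adult population."""
    total_pop = sum(populations.values())
    return {state: round(total_interviews * pop / total_pop)
            for state, pop in populations.items()}

sample = allocate_sample(adult_population, 1000)
# Larger states receive proportionally more of the 1,000 interviews.
```

Rounding means the allocated counts may sum to slightly more or less than the target; real survey houses resolve that remainder with a largest-remainder or similar rule.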
By doing live phone interviews, Suffolk University was able to ask for the youngest adult in the house and administer surveys in both Spanish and English. If polling had not been conducted by live telephone interview, these options might not have been available and the data would be less accurate. Instead, live phone interviewing in combination with the probability-proportionate-to-size method and Census data creates an accurate description of voters’ opinions.
Bloggers Nate Silver and Steve Singiser rate the accuracy of many polls on their blogs FiveThirtyEight and Daily Kos. Pollster ratings for FiveThirtyEight are calculated by “analyzing the historical accuracy and the methodology of each firm’s polls”. Daily Kos ranks polls by calculating point values based on wins, Average Error, and Partisan Error. Both blogs rank the polling methodology of Suffolk University in the top tier. In fact, Daily Kos specifically mentions Suffolk University as accurate when Steve Singiser says:
Credit should be given, however, to three outfits in particular. Ipsos, Suffolk University, and PPP were in the unique position of being in the upper half of the pollster ratings in both cycles. Ipsos is the leader in that regard, having averaged 236 points between the two cycles. But Suffolk should be given credit for their consistency (only a twelve point difference between cycles), and PPP deserves loads of credit for their top-tier performance....

Wednesday, November 2, 2016

Fox News poll affiliations balanced

Political Polls
What numbers can we really trust?
Fox News is a prominent source of election news. Their online blog outlines the overall sampling, interviewing, weighting and accuracy methodology, all of which are important to consider when determining which polls to trust.
Can We Trust the Fox News Poll?
Upfront, Fox News states that their polls, in general, are conducted via telephone using a nationally representative sample of roughly 1,000 registered voters. Of these interviews, 530 are conducted on landlines and 470 on cell phones.
Another method of control Fox News practices is probability-proportionate-to-size sampling, meaning that the phone numbers for each state are proportional to the number of voters in each state. Their sample also mirrors national gender proportions, so that 53% of those surveyed are female and 47% are male.
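The phone and gender splits quoted above can be expressed as simple quota targets. The shares come from the figures in this post; the small helper function is just an illustration of the arithmetic, not Fox News’s actual procedure.

```python
def quota_targets(total, shares):
    """Turn proportional shares into interview counts for a fixed sample size."""
    return {name: round(total * share) for name, share in shares.items()}

# Figures taken from the post: 1,000 interviews, 53/47 splits.
phone_quota = quota_targets(1000, {"landline": 0.53, "cell": 0.47})
gender_quota = quota_targets(1000, {"female": 0.53, "male": 0.47})
# phone_quota -> {"landline": 530, "cell": 470}
```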
Poll sponsorship can sometimes skew the results. A strength of the Fox News Poll is that it is conducted under the joint direction of Anderson Robbins Research, a liberal research company, and Shaw & Company Research, a conservative research business.
At 36 questions, a recent poll is relatively lengthy. The poll was conducted by live interviewers via telephone from October 15–17, 2016, and it reported a margin of error of plus or minus three percentage points.
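That margin of error follows directly from the sample size. For a simple random sample of 1,000, the standard 95%-confidence formula gives roughly plus or minus 3 points, which matches the figure Fox News reports; the sketch below shows the calculation.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error, in percentage points, for a simple random sample.

    Uses the worst case p = 0.5, where the sampling variance is largest.
    """
    return z * math.sqrt(p * (1 - p) / n) * 100

moe = margin_of_error(1000)  # roughly 3.1 percentage points
```

Real polls often report a slightly larger "design effect"-adjusted margin because weighting and clustering add variance, but the simple formula explains where the familiar ±3 comes from.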

            Fox News Poll keeps an active Twitter account, where they post all of their poll results as well as real time campaign news.
by Sarah Richardson

Monmouth disclaimer makes limitations clear

Poll methodology is important to ensure accuracy. You often hear in the news of a candidate “leading in the polls”; this is most likely in reference to a public opinion poll conducted by a reputable organization that is dedicated to conducting such polls.
Since September, the Monmouth Polling Institute has kept a close watch on the 2016 Presidential Race, conducting polls in specific states and nationally.
The Monmouth University Poll is sponsored and conducted by the Monmouth Polling Institute. Depending on the particular poll, participants are sometimes registered voters, but at other times likely voters are the sample. A likely voter is a voter who is most likely to vote on election day, as opposed to a registered voter who is registered in their county to vote but may or may not exercise that right on election day. Monmouth conducts a national random sample, with interviewers calling voters’ landlines and cellphones.
Polls with this method can be representative of a certain population because all members of the population have a possibility of being chosen. Finding a truly random sample, especially on a national level, can be laborious and expensive. The Monmouth poll is upfront about some limitations, saying “In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.” This transparency is laudable in an election season where daily opinion surveys can become confusing.

by Suraj Minisandram

Bloomberg balances pool size, methods

As the presidential election season comes to a close, it seems that voters are inundated with confusing and deceitful polls. Polls shouldn’t be taken at face value. Factors like sample size, population representation and pollster bias can all affect poll accuracy.
How the Poll is Conducted
Consider The Bloomberg Poll, a longstanding source of political coverage, which uses an automated dialing method to conduct live telephone interviews. This method randomizes phone numbers on landlines and cellphones for live callers to dial. These phone numbers are supplied by the market research company Survey Sampling International. The callers ask interviewees an extensive list of questions about the current election. Questions include:
·      If the general election were held today, and the candidates were [Hillary Clinton for the Democrats] and [Donald Trump for the Republicans], for whom would you vote?
·      In general, do you think things in the nation are headed in the right direction, or have they gotten off on the wrong track?
·      Which of the following do you see as the most important issue facing the country right now? 
Using live callers can make respondents more comfortable and is an alternative to automated polls where the participant keys in responses to recorded questions. Automated voice response systems have been found to record faulty responses and return low response rates. People are much less likely to respond to a computer than a human voice.
Because Bloomberg uses live callers, their poll is more time consuming and costly. Bloomberg polls conducted in the current election season show an average sample size of 1,000 interviews. One thing that limits the sample size is a federal regulation requiring that cellphone numbers be dialed manually to limit spam calling.
Cell phone interviews are important for ensuring that those called represent Americans well. Young adults and Americans in poverty may only have a cell phone. Another way to ensure proper representation is to weight responses. Weighting factors can include age, race and level of education. Polling firms will select people according to these characteristics in order to have the sample match the known demographics of the population they are questioning.
Bias in Polls
Another important factor is poll sponsorship. Organizations with political or religious agendas may use flawed methodology to provide results consistent with their ideology. The Bloomberg Poll is overseen by Selzer & Company, a public opinion research company led by famed pollster J. Ann Selzer. Selzer & Company provides polling research to various news media outlets such as the Iowa Newspaper Association, The Newspaper Association of America and the American Press Institute. J. Ann Selzer’s client list can help us evaluate any political leaning tendencies.

Pollster watchdogs can also provide us with insight into polling bias. The political and pop culture news site FiveThirtyEight relies on statistical analysis for storytelling. Founder Nate Silver has rated dozens of polls in an effort to expose polling inaccuracy. Selzer & Company polls were given an A+ rating, with 84% of the Selzer polls analyzed called correctly. These ratings can help us compare polls to achieve a better understanding of bias.

by Rachyl Jackson

Poll timing means snapshot effects

Looking into the methodology of a poll can help the audience understand the results, the presentation of those results and whether there is any obvious bias. Factors such as when, how, who and where the poll was taken can often affect the outcome.
For example, a poll taken on September 27 by SurveyUSA asked nine hundred participants in California, “If the election for President of the United States were today, and you were filling out your ballot right now, who would you vote for?” 33 percent of respondents said they would vote for Donald Trump, and 59 percent said they would vote for Hillary Clinton (SurveyUSA). When the same poll was taken again on October 13, 30 percent of respondents said they would vote for Donald Trump, and 56 percent said they would vote for Hillary Clinton (SurveyUSA).
The raw data tells us that Clinton still has a large lead but that both Trump and Clinton dropped three percentage points in just over two weeks. However, the first poll was taken over the phone right after the first presidential debate, which many considered a win for Clinton. The timing of the poll might have enhanced favorability for Clinton. The second poll was conducted over the phone right after the second debate. This was also just days after a 10-year-old tape resurfaced of Trump making lewd remarks about women. The timing of this poll may explain why Trump lost support. Also, remember that this poll was conducted in California, a liberal state, so these results are not representative of the entire country’s choice on Election Day.
This poll succeeded in surveying nine hundred California residents, a sample large enough that it could be representative of how California will vote this year. The poll also succeeded in recording responses over the phone, because respondents feel they can be more honest due to the anonymity of voice calls.

by PJ Collins