
Online Polls vs. Quizzes

Published on September 18th, 2013


Summary

Many news websites feature online polls. These polls typically ask site visitors about their opinions, such as whether they favor or oppose a new policy or who they think is likely to win an upcoming election. Online quizzes, where people are asked factual questions and then are told whether their responses are correct or incorrect, are less common. Both online polls and quizzes can be entertaining for site visitors, and can increase site visits and time spent on a page.

Despite the advantages of online polls and quizzes, there are downsides. Specifically, some site visitors may believe that online poll results are accurate reflections of public opinion when, in fact, they often are not.1 Further, many online polls and quizzes use a multiple-choice format which provides respondents with four or five response options from which to choose. Although this format has some advantages, it may not maximize visitor engagement.

In this project, we investigated whether we could improve upon commonly used online polls. Our mission was to evaluate a new form of online quiz, called a slider poll, which is designed to both engage and inform site visitors. The poll presents factual data and lets site visitors quiz themselves on the answer. This new way of presenting information both helps people learn and benefits a news site’s bottom line by increasing time on page.

Key Findings

Below you will find a list of recommended practices and conclusions based on our research. Compared to presenting factual information without an interactive feature, online slider and multiple-choice quizzes:

  • Increase time on site
  • Are seen as more enjoyable
  • Help people recall more information

Slider polls also have unique benefits when compared to multiple-choice polls. Specifically, slider polls:

  • Increase time on site
  • Help poll participants learn in diverse ways – both in terms of free recall and in terms of recognizing the information from the poll

Solutions for Newsrooms

Poll Format

  • To increase engagement, include more than one quiz question on a page.
  • Sites should use multiple-choice polls and, even better, slider polls.

Poll Content

  • The substantive content of polls can be more informative than it is at present. Instead of providing site visitors with data about other users’ responses, online polls and quizzes can present reliable public opinion estimates and factual data from reputable sources.
  • Any sort of statistical or numerical data can be used to create online quizzes. News organizations should be on the lookout for this type of data and consider presenting it in an interactive format.
  • Public opinion poll data from reputable polling organizations can be used to generate questions on news sites. For example, the Pew Research Center’s April 2012 survey found that “66 percent of American adults have a high-speed broadband connection at home.” For an article about broadband, a news organization could include a quiz question asking visitors to guess: “What percentage of American adults do you think have a high-speed Internet connection at home?”

Introduction

Online polls and quizzes have become an increasingly common feature on websites. The appeal of these interactive features is obvious: polls and quizzes are familiar to site visitors, making participation easy. Further, the interactivity inherent in polls and quizzes can engage site visitors and may yield commercially desirable outcomes, such as increasing time on site. Yet we are only beginning to understand the best practices for deploying these sorts of interactive features.

Most research to date on online polls has focused on their prevalence and content.2 Although evidence exists that online polls can enhance learning, little research has examined how to best employ online polls to affect critical democratic outcomes like policy knowledge, confidence in one’s knowledge, and recognition of diverse arguments related to political issues.3 We examine the extent to which online polls can both entertain and inform citizens about important political information.

This research report first offers a background on online polls, focusing on the commercial benefits and democratic drawbacks of how site polls are used. We describe reasons for serious concern about how news sites present online polls and how citizens interpret these polls.4

We next offer two new directions for online polls: (a) new formats and (b) a focus on verifiable public affairs content. Incorporating an open-ended slider poll design containing numerical policy information or reputable public opinion data may enhance critical democratic outcomes. We used experimental methods to test the effects of online polls, both in the lab and in the field.

The results of our studies illustrate that online polls can be used as a vehicle for educating and entertaining citizens who visit news websites. When polls are used with democratic ends in mind, sites can achieve important commercial goals while enhancing citizens’ political knowledge.

Background on Online Polls

Online polls and quizzes have become gradually more common in modern newsrooms. In 1998, nearly a quarter of newspaper websites and eight percent of television news websites used online polls.5 Site polls increased steadily throughout the early 2000s, with roughly half of newspaper websites using online polls by 2004.6 To update these reports, we analyzed 107 randomly selected local television news websites across the United States. In 2012, forty-seven percent had some type of poll on their site.7

From a business perspective, polls and quizzes are attractive features for news sites. News websites gain much and spend little by including interactive features such as polls.8 Traditional multiple-choice polls are easy to incorporate into web stories and several websites offer the programming to create free polls (albeit with some advertising). Further, conventional wisdom is that these features increase website “stickiness” by increasing the amount of time visitors spend on the site.

Despite their business promise, online polls do not always advance a newsroom’s journalistic mission. First, online news polls sometimes cover entertainment topics, as opposed to hard news.9 In our survey of local television news websites, we found polls focused on entertainment and sports-related news, including predictions for hometown football teams and one’s favorite season of the year.10

When topics turned to political news, we saw polls that could leave citizens more cynical about the political process. For example, by focusing on who will win an election or including response options such as “Who cares? It won’t matter who wins,” polls could fuel citizen alienation from politics and distrust of politicians. Although this language may mirror the manner in which some citizens talk about politics, news polls that focus on citizen disaffection are missing out on an opportunity to play a positive role in visitors’ civic development.

Second, most online news polls are not based on reliable data. Some scholars have gone so far as to describe these polls as “virtually worthless.”11 The reasons behind this bold pronouncement are multiple.

  • Poll participants are not representative of any larger public. They are not even representative of site visitors, since any number of those visiting a site may decide not to participate in the poll.
  • Some online polls allow site visitors to vote multiple times.12 This can allow for “ballot stuffing” whereby someone with a strong opinion can change the distribution of the poll results to favor his/her view.
  • Online news sites typically do not include a disclaimer warning respondents that the results are not scientific.13 Our analysis of polls on television news websites revealed that over 90 percent of sites with polls did not offer even a simple warning to site visitors that online poll results are not scientific representations of public viewpoints.

This is not to say that all site visitors are dupes – many people do recognize the benefits of representative samples and legitimate polling efforts. But this is not true for everyone. One survey found that nearly six in ten respondents believed that “opinion polls found in newspapers and on TV provide more accurate reflections about the views of the public than online polls found on the Internet.”14 Yet online polls included on news websites confound distinctions between polls found in traditional news media and those found online. One experiment compared reactions to (a) a news article reporting an online poll with (b) an identical news article reporting the same poll results, but attributing the results to a random-sample telephone poll. Study participants found the online poll just as credible as the telephone poll.15 This study raises questions about whether news audiences actually distinguish between unrepresentative online polling and random-sample telephone polling.

When respondents receive training about poll quality and basic statistical principles, however, they are better poised to make judgments about the reliability of polls. One study, for example, demonstrated that students trained in social scientific methods were better equipped to understand the benefits of large and representative samples compared to students not receiving this training.16

In sum, although some citizens appreciate the difference between representative and non-representative opinion polls, citizens may need cues to aid their understanding.17

Providing quality polling information is crucial given that people often base their attitudes and opinions on their understanding of what “other people” think. In electoral contexts, polling information can influence citizens’ beliefs and behaviors.18 In a series of three different experiments, for example, political scientists Stephen Ansolabehere and Shanto Iyengar found that learning the results of a public opinion poll can sway people’s vote choice. Research on a theory known as the “spiral of silence” demonstrates that citizens’ decisions about whether to voice their opinions are related to their perception of whether their view is held by the majority or minority.19

Those holding a view perceived to be in the majority are more likely to voice their opinions in comparison to those holding a view perceived to be in the minority. Polling data made available by news organizations influence citizen perceptions about how widely held different views are. As a worst-case scenario, unscientific polls could misinform the public by providing them with an inaccurate picture of public opinion.

The fact that citizens use polling information to help them form their opinions suggests that polls can be important, and even beneficial, when used correctly. Research demonstrates that just the act of participating in a poll can help people form more reliable and stable judgments about their own views on an issue.20 Furthermore, learning about polling results can lead people to think about other views and why people may hold various opinions. In her research, political scientist and communication scholar Diana Mutz found that upon hearing poll results, people generate thoughts “in response to learning what others think.”21 Encountering poll results that report on the opinions of others invites a mental listing of possible reasons for why one may hold certain beliefs. For example, upon learning that 38 percent of Americans believe that the Affordable Care Act will make things better for them, Mutz’s research suggests that people would generate reasons about why people may feel this way.

Little scholarship has focused specifically on the effects of online polls and quizzes. One important exception was research conducted by communication scholar Mark Tremayne.22 Tremayne created an experiment with four different website conditions that incorporated varying amounts of hyperlinks and online polls that included fact-based quiz questions.

Interactivity was assumed to increase from (a) a site with no hyperlinks and no interactive poll feature, to (b) a site with no hyperlinks, but an interactive poll feature, to (c) a site with hyperlinks, but no interactive poll feature, to (d) a site with hyperlinks and an interactive poll feature. After interacting with the website, study participants then completed a multiple-choice quiz and two essay questions. Tremayne did not find any evidence that the interactive features improved respondents’ performance on a 20-question multiple-choice quiz about the information. He did, however, find that study participants composed higher-quality essays as the amount of interactivity increased. The study, though, tested online polls and hyperlinks conjointly. More information is needed to understand the specific contribution of online polls to democratic outcomes such as knowledge.

New Direction For Online Polls

In light of previous research, we hypothesized that online polls could have both democratic and business benefits, but that they may not be maximizing their potential. This research reports on two new directions for online news polls: innovative formats and new ways of incorporating reliable data.

Innovative Formats

Presently, most news polls utilize a format where site visitors are allowed to choose one answer from several listed answers, similar to a multiple-choice test.23 Scholars have found that this type of multiple-choice format can be an effective means of testing how much people know and of gathering public opinion data.24 Further, four to five answer options are ideal to avoid taxing or confusing those taking part in the poll.25 Indeed, many news polls we encountered in our analysis of local news websites had four to five response options.

Yet other formats exist. Open-ended test questions (e.g. essay questions), in general, can be quite effective in revealing an individual’s true knowledge as well as assisting with a more thorough search of one’s memory to answer questions.26 Further, open-ended questions can, in some instances, increase response time compared to closed-ended questions.27 A longer response time would meet valuable commercial aims associated with audience engagement. In our research, slider polls resemble open-ended questions in that they ask respondents to use their own knowledge to answer questions instead of choosing from a predetermined set of answers. Specifically, we analyze whether multiple-choice or slider polls are better at engaging and educating site visitors.

In addition to testing these two types of poll questions, we also test non-interactive presentations of polling information as a baseline for comparison. Compared to static information on a public policy issue where no interaction occurs, how do site visitors evaluate multiple-choice and slider polls?

Poll Content and Data

The content currently used in online news polls does not take full advantage of the opportunity to educate site visitors. This is the case in two ways. First, although entertainment and sports-based poll questions engage and entertain visitors, they fail to inform citizens about important policy topics that may impact their lives. Second, poll results provided to respondents based on those choosing to participate in an online poll are not representative of any larger public.

This does not have to be the case, however. Substantive, hard news content can be incorporated by adding quizzes, instead of opinion polls. Quantitative information, in particular, is easily incorporated into these interactive formats. For example, news stories on the ongoing impact of the budget sequester may emphasize the amount that was cut from the defense budget through automatic spending reductions, the top income tax rate supported by President Obama, or how much of the federal budget is devoted to entitlement programs. Poll questions could ask about the percentages and dollar figures associated with these policy topics, which may help site visitors retain the information.

In addition, poll questions can ask respondents to predict the results of actual public opinion data. For example, questions can ask respondents “What percentage of Americans believe that the economy will improve next year?” The result could be based on reliable data gathered by a reputable public opinion organization, such as the Pew Research Center.

Polls and Democratic Outcomes

We test whether practical, current-events-based poll information and innovative formats can assist with (a) learning about contemporary issues, (b) the confidence citizens have in their knowledge, and (c) the rationales that citizens are able to provide for their own views and the views of others. All of these outcomes are critical building blocks for an informed and engaged citizenry. We also analyze reported enjoyment and time spent with different poll versions. These outcomes help us make a business case for using interactive quiz features.

Increasing Policy Knowledge

The first way in which using public affairs information to form online poll features may benefit site visitors is by increasing their policy knowledge. When citizens are exposed to facts and figures in an interactive format, they may be more likely to remember the details for later use. Engaging online tools have been shown to increase cognitive stimulation, information recall, and the connections among different facts – all important parts in the process of building knowledge.28

Not only might a focus on public affairs content encourage learning about politics, the poll format may contribute to this process as well. Open-ended questions enhance memory searches.29 Based on the principle that people are more likely to remember something they write down, we examine whether the slider poll format contributes to policy knowledge gains more than the multiple-choice format and the static presentation of information used to form the poll.

Better Assessments of How Confident To Be About One’s Knowledge

Second, if citizens do increase their public affairs knowledge by engaging with online polls, might this new knowledge come with more accurate assessments of one’s confidence in one’s knowledge? Past research indicates that many people are confident about their political knowledge – even when the information they believe to be true is, in fact, incorrect.30 In one study, people who believed that 25 percent of American families were on welfare, an overestimation, were substantially more confident in their incorrect belief than people who estimated correctly that only 7 percent of families used welfare.31

Is it possible to harness this human tendency to be confident in one’s knowledge while also encouraging people to remember correct information? One potentially encouraging finding suggests that yes, it is possible. Political scientist James Kuklinski and his co-authors found that a two-step process can help individuals to form political opinions based on correct information, rather than on previously held incorrect information.32 First, individuals need to state their belief about a particular political fact. Second, they must be informed afterward about whether the information they hold is correct or incorrect.

We suggest that the online poll we propose here is an ideal tool for promoting more accurate assessments of how confident one should be in the political information one holds. Why? Because our polls follow the two steps outlined above. Website users provide their own best guess at the answer to a factual question and then are immediately given the correct information. Polls typically contained on news websites reveal how many people responded a certain way. The poll content we test changes this by immediately notifying a site visitor whether their answer is correct. The poll notification provides either a correct political fact (for instance, the number of burglaries in a community in the past year) or public opinion information from a representative sample (for example, the percentage of people who believe the Affordable Care Act will make life better for them). By providing immediate feedback on a site user’s perceptions, we anticipate that not only will people learn, but they also will be better equipped to assess how confident they should be in their response.

Understanding Diverse Points of View

Third, taking part in an online poll may affect participants’ understanding of diverse arguments about a political issue. In some cases, knowing how others think about an issue invites a mental listing of the possible reasons for why someone may hold certain beliefs.33 For example, a person supportive of the Affordable Care Act may believe that a majority of citizens see the law as making life better for them. If asked a question about the percentage of Americans who believe that the Act is helpful, a person may answer 50 percent. However, according to the Pew Research Center, 38 percent of Americans actually believe this. When the person is corrected, the new information may influence the person’s understanding of multiple perspectives on health care. The person may think about why people believe the law will not make life better for them. We test whether participating in a poll affects the important democratic outcome of understanding multiple points of view on an issue.

Time with Information; Enjoyment of the Experience

Fourth, we analyze poll engagement by looking at how much time respondents spend with the polls and poll information and at how they rate the experience. We ask respondents to rate how trustworthy, credible, clear, informative, interesting, and enjoyable they find the polls. This allows us to understand whether participants react to and interact with slider and multiple-choice polls differently than they interact with static, non-interactive presentations of the poll information.

The Experiment

To test how people responded to different online poll types, this study compared participants’ reactions to three different presentations of numerical data and one control group that did not see any numerical data. Participants were randomly assigned to one of the following four experimental conditions:

  • Slider Polls. In this condition, participants used three slider polls to answer questions on current public policy topics. Participants interacted with the poll by moving a slider or entering a number into an open-ended box before hitting a “submit” button. If the answer entered was within three percentage points of the correct answer, the participant received a “Nice work!” notice and a description of the actual quantitative results. If the answer was outside of this range, the participant was presented with the correct answer (a sketch of this feedback rule follows the list).
  • Multiple-Choice Polls. This condition had participants respond to three closed-ended polls. Each poll question had four answer options. Participants who answered correctly received a “Nice work!” notice, while incorrect answers were corrected with the accurate information. This condition used a format similar to the polls currently featured on most local news websites.
  • Poll Information. This condition provided study participants with the results of each of the poll questions. There were no options in this condition for participants to interact.
  • Control. In this condition, participants did not view any polls and did not receive any new information. This group was used as a point of comparison in determining which poll format was most effective in helping people learn, increasing time on site, and engaging users.
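The report does not include the polls’ source code, so the following is only a minimal Python sketch of the feedback rule described in the Slider Polls and Multiple-Choice Polls conditions above. All names are hypothetical; the 38 percent figure is borrowed from the study’s health care item.

```python
# Minimal sketch of the poll feedback rule (hypothetical names; the study's
# actual implementation is not published).

TOLERANCE = 3  # slider answers within three percentage points count as correct


def slider_feedback(guess: float, correct: float) -> str:
    """Return the notice shown after a slider-poll submission."""
    if abs(guess - correct) <= TOLERANCE:
        return f"Nice work! The answer is approximately {correct:.0f}%."
    return f"The correct answer is approximately {correct:.0f}%."


def multiple_choice_feedback(chosen: str, answer: str) -> str:
    """Return the notice shown after a multiple-choice submission."""
    return "Nice work!" if chosen == answer else f"The correct answer is {answer}."


# The health care item: about 38% of Americans said the law would make things better.
print(slider_feedback(36, 38))                 # within three points -> praised
print(slider_feedback(55, 38))                 # outside the range -> corrected
print(multiple_choice_feedback("42%", "38%"))  # wrong option -> corrected
```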

Poll Content

The study participants in three of the four conditions (Slider Polls, Multiple-Choice Polls, Poll Information) answered questions and/or received information related to three current policy topics: health care, taxes, and the federal budget.

For health care, participants interacting with the Slider Polls and Multiple-Choice Polls were asked “What percentage of Americans say that the 2010 health care law recently upheld by the Supreme Court will make things better for them?” After answering, each participant then learned that “Approximately 38% of Americans say that the 2010 health care law recently upheld by the Supreme Court will make things better for them, according to a 2012 Gallup poll. In addition, 42% say that it will make things worse and 13% say that it will make no difference.”

For taxes, participants answered “What percentage of Americans approves of raising the income tax rate on incomes over $250,000 a year as a way to reduce the size of the national debt?” They then learned that “Approximately 66% of Americans approve of raising the income tax rate on incomes over $250,000 a year as a way to reduce the size of the national debt, according to a 2011 poll from the non-partisan Pew Research Center.”

For the federal budget, the poll asked “What percentage of the estimated 2012 federal budget is spent on Social Security?” Participants learned that “Approximately 20% of the estimated 2012 federal budget is devoted to paying for Social Security, according to the federal Office of Management and Budget.”

People in the Poll Information condition saw the poll answers for each policy topic, but were not asked any of the questions.

Study Participants

Four hundred and fifty-six people participated in this study. Participants were recruited through Survey Sampling International (SSI), an online survey panel vendor, to mirror the latest sample of Internet users in the United States as identified by the Pew Research Center’s Internet & American Life Project August 2012 tracking survey (see Appendix for more details). Data were gathered in early October 2012.

Study participants answered questions about how enjoyable, interesting, informative, clear, credible, and trustworthy the polls were. Participants also answered recall questions about the policy issues covered in each poll, how confident they were in their knowledge, and what arguments they thought people would give for and against different issues. Study software unobtrusively tracked the amount of time participants spent with each poll.

Experimental Results

After participants completed the experiment, we analyzed the results to ascertain whether participating in the slider polls and multiple-choice polls made any difference in how people thought about politics and how much time people spent, compared to just reading the poll information or not receiving any new information. The six major results are reported below.

People devoted more time to interactive polls, especially slider polls.

We added up the amount of time each participant spent with the three slider polls, the three multiple-choice polls, or the three descriptions of the poll information. People spent more time with the Slider Polls than with Multiple-Choice Polls. They spent the least amount of time with the Poll Information.34

Slider polls helped people learn and apply their knowledge.

Near the end of the study, participants were asked a series of quiz questions about the information on health care, taxes, and budgets. A random half of participants were asked these questions in a closed-ended format resembling the multiple-choice polls. The other random half of participants were asked these questions in an open-ended, fill-in-the-blank format resembling the slider polls.

Those in the Slider Polls and Multiple-Choice Polls conditions knew more than those in the Poll Information and Control conditions.35 There were no differences between the Slider Polls and Multiple-Choice Polls conditions in how many quiz questions respondents answered correctly. Regardless of whether they were asked closed-ended or open-ended quiz questions, respondents in the Slider Polls condition answered the same number of knowledge questions correctly. Yet participants in the Multiple-Choice Polls, Poll Information, and Control conditions answered more questions correctly when asked closed-ended questions in comparison to open-ended questions.36 In other words, those in the Slider Polls condition were equally able to recall and recognize the information from the polls, while those in the other conditions were better at recognizing the information than at recalling it.

Polls, facts help people identify how confident they should be in their knowledge.

After answering each quiz question, study participants were asked how confident they were in their answer from 1 (not at all confident) to 4 (very confident). We analyzed whether confidence varied across conditions and knowledge levels.

Encountering the poll information, whether in an interactive format (Multiple-Choice, Slider) or in a non-interactive format (Poll Information), helped users to connect their knowledge with their confidence. Respondents were more confident in their response when they did, in fact, know the right answer and were less confident in their response when they did not have the correct answer. This is in contrast to those in the Control condition, where people expressed similar levels of low confidence irrespective of whether or not they knew the right answer.37

Poll format did not affect respondents’ perceptions of why people hold different views.

Following each poll, respondents were asked to list up to three reasons why someone would agree with the poll statements and then up to three reasons why some would disagree with the poll statements (e.g. Regardless of your own opinion, why do you think that someone would [approve/disapprove] of raising the income tax rate on incomes over $250,000 a year as a way to reduce the size of the national debt?). We then asked respondents to rate how legitimate they found each of the reasons that they had generated for and against each statement. For example, one respondent noted that one reason people may approve of raising the income tax rate on those making over $250,000 per year is “taxes are now more equitable.” Another respondent noted that one reason people may disapprove of raising taxes on the wealthy is “it would inhibit investing by the high earners.”

To analyze these data, we first looked at what respondents felt about each issue. Some respondents, for example, were opposed to raising taxes on high-income earners and others favored the action. We categorized the responses as pro-attitudinal reasons when respondents were giving reasons that favored their view on the topic and as counter-attitudinal reasons when respondents were providing rationales that were opposed to their point of view.

We then analyzed: (1) the number of pro- and counter-attitudinal reasons provided, (2) the average legitimacy of pro- and counter-attitudinal reasons provided, and (3) the maximum legitimacy rating for the pro- and counter-attitudinal reasons generated by respondents. There were no differences among the conditions.

Interactive polls are more enjoyable than reading the results, and are just as clear and informative.

Participants were asked to indicate whether they found each poll enjoyable, interesting, informative, clear, credible, and trustworthy. There were no differences across the conditions for any of these measures except for enjoyment. Participants were asked to rate how boring or enjoyable they found the poll from 1 (boring) to 5 (enjoyable). Participants found the Multiple-Choice Polls and Slider Polls more enjoyable than the Poll Information.38

Results hold for different education levels.

One important question about these findings is whether they apply only to those with higher levels of education. It could be that only those with more education benefit from online interactivity like multiple-choice and slider polls. Research on the knowledge gap, for instance, suggests that those of higher socioeconomic status are more likely to benefit from information and learn more in comparison to those of lower socioeconomic status.39 Further, one study suggested that slider polls attract only those with higher levels of education.40

We analyzed whether our significant findings about (a) the amount of time spent with the polls, (b) enjoyment, (c) knowledge, and (d) confidence in one’s knowledge were different among those with higher levels of education compared to those with lower levels of education. In no instance were there significant differences among those with a high school education or less, those with some college, and those who had graduated college.

Field Testing

Based on the encouraging findings from the laboratory tests, we wanted to test the polls in a more natural environment. We created code to experimentally test polls on news websites. As different news sites cater to different audiences, it is important to enable news outlets to test how the polls work in their unique online environment. Our code allows news organizations to know how many people took part in a poll on their site and whether the poll affected time spent on their site.

For a news outlet, the program is simple to include on a webpage – it involves inserting a few lines of code to place an iframe on a page. Behind the scenes, however, the program is more sophisticated. Upon reaching a webpage, site visitors randomly see either (a) a multiple-choice poll or (b) a slider poll.

The randomization is tied to each browser session. In other words, webpage visitors see the same poll each time they navigate to the site as long as they have not closed their Internet browser between page visits. If visitors close their browser and then return to the same webpage, then the randomization begins anew and the visitor could see either a multiple-choice or slider poll upon revisiting the page. As the visitor is surfing the webpage with the poll, the program is tracking their IP address, how much time they spend on the page, and their engagement with the poll.
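Our iframe program itself is not reproduced in this report. As an illustration only, here is a minimal Python sketch of session-tied assignment under the assumptions above; the dictionary stands in for per-session state that disappears when the browser session ends, and all names are hypothetical.

```python
# Minimal sketch of session-tied random assignment (hypothetical names; the
# project's actual iframe program is not published).
import random

# Stands in for per-session server state; entries vanish when a session ends.
_assignments: dict[str, str] = {}


def poll_for_session(session_id: str) -> str:
    """Randomize a browser session to one poll format, exactly once.

    Repeat page views within the same session see the same poll; a new
    session (e.g., after the browser is closed) is randomized afresh.
    """
    if session_id not in _assignments:
        _assignments[session_id] = random.choice(["multiple_choice", "slider"])
    return _assignments[session_id]


print(poll_for_session("session-abc"), poll_for_session("session-abc"))  # same format twice
print(poll_for_session("session-xyz"))  # a fresh session re-randomizes
```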

Field Test #1

On December 11, 2012, a local television news station included an experimental poll on their news website. The program randomly assigned respondents to either the multiple-choice or slider poll. A link to the story containing the poll, “Burglars are shopping for Christmas – in parking lots,” was featured on the station’s main webpage. The poll was embedded partway through the article, so visitors using a typical browser had to scroll to see it.

In total, we logged 926 unique IP addresses visiting the page. A total of 369 unique IP addresses were randomly assigned to the multiple-choice poll and 348 to the slider poll. Site visitors with cookies disabled were shown a multiple-choice poll and are not included in the analysis (n=209).

We first analyzed whether there were any differences in participation in the poll depending on whether respondents randomly saw the slider or multiple-choice version. There were no differences between conditions; 7.6 percent of visitors to the multiple-choice page took the poll and 7.2 percent of visitors to the slider-poll page took the poll, a non-significant difference.41 We next calculated the amount of time on the page for poll takers. These data are complicated because some people might leave a browser window open for several days (outliers), while others spend only a few seconds on a page. For this reason, we present trends in our findings, as opposed to charts of the results. More detail can be found in the footnotes.

Although site visitors did spend more time on the site when it contained a slider poll in comparison to when it contained a multiple-choice poll, the differences were small and not statistically significant.42 The direction of the findings was, however, consistent with the results from the laboratory test.
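Note 42 lists the strategies we used for handling these outliers. As a rough illustration only (the report does not specify its analysis software), the sketch below applies three of those strategies to made-up time-on-page data using standard SciPy helpers.

```python
# Illustrative outlier handling for time-on-page data (the numbers below are
# made up; note 42 describes the strategies actually used in the analysis).
import numpy as np
from scipy import stats
from scipy.stats.mstats import winsorize

# Seconds on page for ten hypothetical visitors, one of whom left a window open.
seconds = np.array([12, 35, 48, 51, 60, 75, 90, 110, 150, 86400])

trimmed = stats.trim_mean(seconds, 0.10)                 # drop top/bottom 10%
capped = winsorize(seconds, limits=(0.10, 0.10)).mean()  # cap extremes instead
log_mean = np.log(seconds).mean()                        # natural-log transform

print(f"trimmed: {trimmed:.1f}s, winsorized: {capped:.1f}s, log-scale mean: {log_mean:.2f}")
```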

Field Test #2

On January 15, 2013, a local news station partnered with us to test the polls in a novel way: by including two polls on a single news page. As with the first field test, each poll randomly appeared as either a slider poll or a multiple-choice poll. Because the two polls were randomized independently, both could be slider polls, both could be multiple-choice polls, or the page could contain one of each.

The article was titled “Starting at community college may not lower degree cost,” and the polls asked about (a) what percentage of students at public, four-year institutions take out student loans and (b) the typical amount of student loan debt for those attending a four-year public college or university. To view either of the polls, a site visitor with a typical browser needed to scroll down the page. The story was promoted on air and on the site’s main page.

The web page attracted 454 visitors with unique IP addresses. As before, site visitors with cookies disabled were shown a multiple-choice poll (n=92) and are not included in the results. Overall, 19 percent were shown two multiple-choice polls, 20 percent were shown a multiple-choice poll first and a slider poll second, 30 percent saw a slider poll first and a multiple-choice poll second, and the remaining 31 percent saw two slider polls.

Across both polls, 7.2 percent of visitors completed the first poll and not the second, 3.2 percent completed the second and not the first, and 2.6 percent completed both. Overall, people were more likely to take part in the slider poll: eight percent of respondents shown at least one multiple-choice poll took it, compared to 13 percent of respondents shown at least one slider poll. The highest rates of participation, however, came from pages that contained both a slider poll and a multiple-choice poll, compared to pages that contained two polls of the same type.

We next turned to analyzing how much time respondents spent on the site depending on the combination of polls present. Across multiple strategies of analyzing the data, the webpage with a slider poll first and a multiple-choice poll second yielded the greatest amount of time on page, and the two multiple-choice polls condition yielded the least. The other two conditions (two slider polls; multiple-choice first, slider second) had average time-on-page scores in the middle. These differences across conditions were statistically significant.43

Field Test #3

Upon finding that combinations of polls on a single page increased time on site, we next explored whether a quiz consisting of seven questions about gun control would affect the amount of time respondents spent on a web page. A single poll was included on the web page, randomized to begin as either a slider or multiple-choice poll. After receiving the correct information, respondents were able to complete up to six additional questions in a quiz format. Each quiz question was randomized to display as either a multiple-choice or a slider poll.

On March 6, 2013, a local news station included the quiz on their website. As with the other polls, site visitors needed to scroll partway down the article in order to see the quiz. The seven poll questions probed (1) how many people have an active concealed handgun license, (2) how many people were denied a concealed handgun license, (3) what percentage of people convicted of a crime held a concealed handgun license, (4) how many firearms are manufactured in the United States, (5) the percentage of robberies that involve a firearm, (6) the percentage of Americans with a gun in their home, and (7) how many intentional deaths occur as a result of firearms. The designs of the slider and multiple-choice poll versions were identical to those in the previous studies.

There were 311 unique IP addresses recorded. Twelve percent of site visitors had cookies disabled and could not be randomly assigned. Of the remaining visitors, 148 randomly saw the multiple-choice quiz question first and 126 saw the slider quiz question first. Overall, 38.7 percent of site visitors took part in the online quiz, and those who participated completed an average of 4.41 of the seven quiz questions. There were no differences between those who first saw the slider poll (38.1%) and those who first saw the multiple-choice poll (37.8%) in their decision to start taking part in the quiz.44 The more poll questions that respondents completed, the more time they spent on the site, confirming the efficacy of including multiple quiz questions sequentially on a page.45
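Note 45 reports this association using Pearson’s and Spearman’s correlations. As a rough illustration with fabricated numbers (our actual field data are not reproduced here), both statistics can be computed as follows.

```python
# Illustrative Pearson and Spearman correlations between quiz questions
# completed and time on page (made-up data; see note 45 for actual results).
import numpy as np
from scipy import stats

questions_completed = np.array([0, 1, 1, 2, 3, 4, 5, 6, 7, 7])
seconds_on_page = np.array([15, 40, 55, 70, 95, 120, 140, 180, 210, 260])

r, r_p = stats.pearsonr(questions_completed, seconds_on_page)
rho, rho_p = stats.spearmanr(questions_completed, seconds_on_page)
print(f"Pearson r = {r:.2f} (p = {r_p:.3f}); Spearman rho = {rho:.2f} (p = {rho_p:.3f})")
```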

Conclusion

Online polls present many benefits: they’re enjoyable, they help people to learn, they help people to connect their knowledge with their levels of confidence, and they yield more time on a webpage. Although slider polls are similar in many ways to multiple-choice polls, they present some unique benefits. From a business angle, slider polls increase time spent on the site relative to multiple-choice polls. From a democratic angle, slider polls help people to recall poll information equally well whether they’re asked about it using open-ended, fill-in-the-blank quiz questions or using closed-ended, multiple-choice quiz questions. This is particularly important, since in daily life people often have to recall information without a multiple-choice question to prompt their memory.

Our recommendation is that online news sites use slider polls, such as those displayed here, to meet both business and journalistic goals. We are happy to assist news organizations with testing whether these polls work on their site.

For information on participant demographics, see the full report.

  1. Kent, M. L., Harrison, T. R., & Taylor, M. (2006). A critique of Internet polls as symbolic representation and pseudo-events. Communication Studies, 57, 299-315; Kim, S. T., Weaver, D., & Willnat, L. (2002). Media reporting and perceived credibility of online polls. Journalism & Mass Communication Quarterly, 77, 846-864.
  2. Bucy, E. P. (2004). Second generation net news: Interactivity and information accessibility in the online environment. The International Journal on Media Management, 6, 102-113; Chan-Olmsted, M., & Suk Park, J. (2000). From on-air to online world: Examining the content and structures of broadcast TV stations’ web sites. Journalism & Mass Communication Quarterly, 77, 321-339; Dibean, W., & Garrison, B. (2001). How six online newspapers use web technologies. Newspaper Research Journal, 22, 79-93; Rosenberry, J. (2005). Few papers use online techniques to improve public communication. Newspaper Research Journal, 26, 61-73; Schultz, T. (1999). Interactive options in online journalism: A content analysis of 100 U.S. newspapers. Journal of Computer-Mediated Communication, 5. doi: 10.1111/j.1083-6101.1999.tb00331.x; Stroud, N. J., Muddiman, A., & Scacco, J. (forthcoming). Engaging audiences via online news sites. In H. Gil de Zúñiga (Ed.) New agendas in communication: New technologies and civic engagement. New York, NY: Routledge.
  3. Tremayne, M. (2008). Manipulating interactivity with thematically hyperlinked news texts: A media learning experiment. New Media & Society, 10, 703-727.
  4. Kim, Weaver, & Willnat, 2002; Rosenberry, 2005; Schultz, 1999; Stroud, Muddiman, & Scacco, forthcoming; Wu, W., & Weaver, D. (1997). On-line democracy or on-line demagoguery? Public opinion “polls” on the Internet. International Journal of Press/Politics, 2, 71-86.
  5. Chan-Olmsted & Suk Park, 2000; Schultz, 1999.
  6. Bucy, 2004; Dibean & Garrison, 2001; Rosenberry, 2005.
  7. Stroud, Muddiman, & Scacco, forthcoming.
  8. Hermida, A., & Thurman, N. (2008). A clash of cultures: The integration of user-generated content within professional journalistic frameworks at British newspaper websites. Journalism Practice, 2, 343-356.
  9. Kim, Weaver, & Willnat, 2002; Schultz, 1999.
  10. Stroud, Muddiman, & Scacco, forthcoming.
  11. Kent, Harrison, & Taylor, 2006; Wu & Weaver, 1997.
  12. Wu & Weaver, 1997.
  13. Rosenberry, 2005; Schultz, 1999; Wu & Weaver, 1997.
  14. Kim, Weaver, & Willnat, 2002.
  15. Kim, Weaver, & Willnat, 2002.
  16. Lehman, D. R, & Nisbett, R. E. (1990). A longitudinal study of the effects of undergraduate training on reasoning. Developmental Psychology, 26, 952-960.
  17. Kim, Weaver, & Willnat, 2002.
  18. Ansolabehere, S. & Iyengar, S. (1994). Of horseshoes and horse races: Experimental studies of the impact of poll results on electoral behavior. Political Communication, 11, 413-430; Mutz, D. C. (1995). Effects of horse-race coverage on campaign coffers: Strategic contributing in presidential primaries. Journal of Politics, 57, 1015-1042.
  19. Noelle-Neumann, E. (1993). The spiral of silence: Public opinion—our social skin. Chicago, IL: University of Chicago Press.
  20. Fournier, P., Turgeon, M., Blais, A., Everitt, J., Gidengil, E., & Nevitte, N. (2011). Deliberation from within: Changing one’s mind during an interview. Political Psychology, 32, 885-919.
  21. Mutz, D. C. (1997). Mechanisms of momentum: Does thinking make it so? Journal of Politics, 59, 104-125.
  22. Tremayne, 2008.
  23. Stroud, Muddiman, & Scacco, forthcoming.
  24. Alwin, D. F. (1997). Feeling thermometers versus 7-point scales: Which are better? Sociological Methods & Research, 25, 318-340; Andrews, F. M. (1984). Construct validity and error components of survey measures: A structural modeling approach. Public Opinion Quarterly, 48, 409-442; Borgers, N., Hox, J., & Sikkel, D. (2004). Response effects in surveys on children and adolescents: The effect of number of response options, negative wording, and neutral mid-point. Quality & Quantity, 38, 17-33; Lissitz, R. W., & Green, S. B. (1975). Effect of the number of scale points on reliability: A Monte Carlo approach. Journal of Applied Psychology, 60, 10-13; Lozano, L. M., Garcia-Cueto, E., & Muniz, J. (2008). Effect of the number of response categories on the reliability and validity of rating scales. Methodology, 4, 73-79; Tversky, A. (1964). On the optimal number of alternatives at a choice point. Journal of Mathematical Psychology, 1, 386-391.
  25. Andrews, 1984; Bendig, A. W. (1954). Reliability and the number of rating scale categories. The Journal of Applied Psychology, 38, 38-40; Borgers, Hox, & Sikkel, 2004; Lissitz & Green, 1975; Lozano, Garcia-Cueto, & Muniz, 2008.
  26. Geer, J. G. (1991). Do open-ended questions measure “salient” issues? Public Opinion Quarterly, 55, 360-370; Luskin, R. C., & Bullock, J. G. (2011). “Don’t know” means “don’t know”: DK responses and the public’s level of political knowledge. Journal of Politics, 73, 547-557.
  27. Cook, C., Heath, F., Thompson, R. L., & Thompson, B. (2001). Score reliability in web- or Internet-based surveys: Unnumbered graphic rating scales versus Likert-type scales. Educational and Psychological Measurement, 61, 697-706; Smyth, J. D., Dillman, D. A., Christian, L. M., & McBride, M. (2009). Open-ended questions in web surveys: Can increasing the size of answer boxes and providing extra verbal instructions improve response quality? Public Opinion Quarterly, 73, 325-337.
  28. Eveland, W. P., Marton, K., & Seo, M. (2004). Moving beyond “just the facts”: The influence of online news on the content and structure of public affairs knowledge. Communication Research, 31, 82-108; Tremayne, M., & Dunwoody, S. (2001). Interactivity, information processing, and learning on the World Wide Web. Science Communication, 23, 111-134.
  29. Geer, 1991; Luskin & Bullock, 2011.
  30. Kuklinski, J. H., Quirk, P. J., Jerit, J., Schwieder, D., & Rich, R. F. (2000). Misinformation and the currency of democratic citizenship. Journal of Politics, 62, 790-816.
  31. Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000.
  32. Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000.
  33. Mutz, 1997.
  34. Time spent differed significantly across conditions, F(2, 318)=12.81, p<.01. A planned comparison reveals that respondents spent more time with the Slider Polls than with the Multiple-Choice Polls (p<.05). Post-hoc comparisons show that participants spent less time with the Poll Information than they did with either the Slider Polls (p<.01) or Multiple-Choice Polls (p<.01) conditions.
  35. All results obtained using ANOVA with covariates for education, age, gender, race/ethnicity, political knowledge and interest, ideology/partisanship, and income unless otherwise noted. Results are similar without covariates unless otherwise noted. Charts display adjusted means. All post-hoc comparisons completed with a Sidak adjustment for multiple comparisons and are two-sided. Main effect of Condition F(3, 388)=46.62, p<.01; Main effect of open vs. closed-ended quiz questions F(1, 388)=35.86, p<.01; Interaction F(3, 388)=3.08, p<.05. Post-hoc comparisons for the main effect of Condition show that Slider Polls and Multiple-Choice Polls differ from Poll Information (p<.01) and Control (p<.01). Poll Information also differ from Control (p<.01).
  36. Post-hoc comparisons for the Interaction show that the difference between the number of open-ended poll questions answered correctly and the number of closed-ended poll questions answered correctly differed for the Multiple-Choice (p<.05), Poll Information (p<.01), and Control (p<.01) conditions, but not for the Slider Polls condition (p<.45).
  37. Poll knowledge collapsed into two categories: High knowledge (answering 2 or 3 questions correctly) and Low knowledge (answering 0 or 1 question correctly). Results remain the same if knowledge varies from 0 to 4. Main effect of Condition F(3,387)=14.73, p<.01; Main effect of Poll Knowledge F(1,387)=21.17, p<.01; Interaction F(3, 387)=5.74, p<.01. Differences in confidence between low and high political knowledge are significant, using post-hoc comparisons, for the Slider Polls (p<.01), Multiple-Choice Polls (p<.01), and Poll Information (p<.01) conditions, but not for the Control condition (p<.25). Those in the Slider Polls condition who answered only 0 or 1 quiz question correctly (low knowledge) had average confidence of 2.74 while those answering 2 or 3 quiz questions correctly (high knowledge) had average confidence of 3.37 – a difference of .62. In comparison, the difference in confidence between those with high and low knowledge was .57 for the Multiple-Choice Polls condition, .40 for those in the Poll Information condition, and -.22 for those in the Control condition.
  38. Enjoyment differed significantly across conditions, F(2, 310)=10.03, p<.01. Post-hoc comparisons revealed that participants rated Poll Information as less enjoyable than Slider Polls (p<.01) and Multiple-Choice Polls (p<.01). The difference between Slider Polls and Multiple-Choice Polls is not significant (p<.60).
  39. Tichenor, P. J., Donohue, G. A., & Olien, C. N. (1970). Mass media flow and differential growth in knowledge. Public Opinion Quarterly, 34, 159-170.
  40. Funke, F., Reips, U.-D., & Thomas, R. K. (2011). Sliders for the smart: Type of rating scale on the web interacts with educational level. Social Science Computer Review, 29, 221-231.
  41. The difference in participating in the poll between the slider and multiple-choice versions was not significant, χ2(2)=.06, p=.97.
  42. We examined five different strategies for handling outliers: (1) trimming the data by 10%, (2) trimming the data by 20%, (3) winsorizing the data by 10%, (4) winsorizing the data by 20%, and (5) transforming the data using the natural log. For all but one of these five strategies (trimmed data 10%), time-on-page for the slider poll exceeds time-on-page for the multiple-choice poll. In no instance, however, was the difference significant (using either independent sample t-tests or Mann-Whitney non-parametric tests).
  43. Using the same five strategies for handling outliers as Field Test 1, ANOVA tests were significant in three cases, marginal in one (natural log), and non-significant in the fifth (trimmed 20%). Non-parametric Kruskal-Wallis tests were significant in all cases, except trimmed 20% which was marginally significant.
  44. The difference in participating in the poll was not dependent on whether the first poll was a slider or multiple-choice poll (χ2(1)=.002, p=.97).
  45. Both Pearson’s correlations (Range = .30 to .66) and Spearman’s rho (Range = .56 to .63) for time on page and number of poll questions completed were significant regardless of the strategy for handling outliers.

Researchers

Natalie (Talia) Jomini Stroud


Joshua Scacco

Ashley Muddiman