
Cornell-led election survey seeks to improve science of polls

In Florida recently, a registered Republican answered a pollster's questions over his cell phone, while a Spanish-speaking Democrat responded to a survey invitation received by mail. In Michigan, a white male voter participated after a link was texted to his randomly selected number.

Those are just a few of the thousands of potential voters being reached in diverse ways by a Cornell-led survey that aims to provide the most comprehensive understanding of this year's midterm elections on Nov. 8, and to advance the science of survey research in the process.

Boasting a sample size 20 times larger than most nationally representative surveys, the federally funded survey will collect extensive information on voters' attitudes toward candidates and key issues, including the economy, abortion, race relations, political polarization and authoritarianism.

Importantly, the sample of roughly 20,000 also means that, for the first time, it will be possible to evaluate how different survey methods can be combined to offer the most representative data, not just across the U.S. but in key states such as California, Florida and Wisconsin, said Peter Enns, professor of government and public policy in the College of Arts and Sciences and the Cornell Jeb E. Brooks School of Public Policy.

"There's been a massive proliferation of polls that are increasingly using different methods to collect data, but no way to systematically analyze what the best approach is," said Enns, the Robert S. Harrison Director of the Cornell Center for Social Sciences. "Our ultimate goal is to provide a roadmap for improving the science of surveys."

Enns is principal investigator of the midterm survey, which is funded by the National Science Foundation. Co-principal investigators are Jonathon Schuldt, associate professor in the Department of Communication in the College of Agriculture and Life Sciences and executive director of the Roper Center for Public Opinion Research; and Colleen Barry, inaugural dean of the Cornell Brooks School.

The team plans to unveil its findings at a Jan. 20 event at Cornell Tech in New York City.

The challenge they hope to address is highlighted in an analysis Enns co-authored of more than 350 polls conducted over the two months preceding the 2020 presidential election. Randomized, or probability-based, surveys, long considered the "gold standard" by survey firms and researchers, were the least accurate, on average. Nonprobability-based surveys, including so-called "convenience samples" of pre-selected respondents, fared slightly better. Mixed-method surveys performed best overall but showed wide variation.

"This goes against the science," Schuldt said. "Random, probability-based samples aren't behaving as if random anymore."

Fast-changing technologies and declining response rates have made it increasingly challenging for surveys to collect representative samples, according to the researchers.

After an open call for proposals, the Cornell team in September selected three teams to conduct the monthlong midterm survey, collecting data from Oct. 26 to Nov. 22: Gradient; researchers at the University of Iowa; and SSRS and its partners. Each team will collect probability- and nonprobability-based samples and utilize at least two recruitment methodologies, reaching voters in both English and Spanish via mail, text, online panels or calls to land lines and mobile phones.

For example, Gradient plans to send more than 1 million survey invitations by text message, and more than 18,000 by mail. The Iowa researchers plan to contact more than 31,000 cell phones and 8,500 land lines using random digit dialing. SSRS will issue approximately 45,000 invitations using probability and nonprobability methods, in addition to mailing invitations to 3,000 randomly selected addresses in Wisconsin.

Each team is asking the same survey questions at the same time about U.S. House, U.S. Senate and gubernatorial races, and will collect large samples through each methodology. The researchers say that will enable direct comparisons of the different approaches and an assessment of the most cost-effective combinations currently available to survey scientists. That hasn鈥檛 been possible to date, they said, because polls vary so much in their budgets, questions, methods, sample sizes and transparency.

David Wilson, dean of the Goldman School of Public Policy at the University of California, Berkeley, and a senior adviser to the Cornell team, said the 2022 Collaborative Midterm Survey could transform the study of political attitudes and behavior during election season.

"The principal investigators have assembled some of the most innovative minds in survey methodology and public opinion, and partnered them with diverse practitioners in academia and the profession," Wilson said. "The result is a new framework for investigating our democracy, advancing the science of surveys and politics."

Learning about the electorate and the state of American democracy is a top priority for the midterm survey project. But Barry said its insights into best practices, ideally to be updated every year or two, would benefit influential and costly government surveys on topics such as employment, consumer spending, health and the environment.

"Understanding these survey methods," Barry said, "holds implications well beyond political surveys."



Photo caption: Peter K. Enns, the Robert S. Harrison Director of the Cornell Center for Social Sciences, executive director of the Roper Center for Public Opinion Research and professor of government.