2020 College Free Speech Rankings Q&A: How were the students surveyed?

October 14, 2020

Last week, I answered a number of questions from the online webinar in which FIRE Executive Director Robert Shibley and I summarized why we created the 2020 College Free Speech Rankings and discussed some of its important findings. This is the second post in a series answering questions we were unable to address during the webinar; Part I is available here. A number of these questions concerned our sampling methodology. Our survey partner, College Pulse, offers a detailed description of its sampling methodology, and I encourage people to read through the information provided there. Below, I address some of the more specific questions about sampling that we received during the webinar.

How were students recruited for participation? How did you ensure that first generation college students were adequately sampled? What about lower-income college students?

I asked College Pulse these questions, and Anne Schwichtenberg, my co-author on the 2020 College Free Speech Rankings report, provided the following answers:

We recruit students mainly through word-of-mouth marketing, but we also do targeted recruiting to fill panels from specific universities where we haven’t met a minimum; for that, we’ll use targeted email marketing or social media outreach.

Students are incentivized to take surveys, mainly through Amazon gift cards for completion.

We can’t guarantee the inclusion of first-generation students, but we do weight the data to be nationally representative of four-year college students, based on the current population enrolled in four-year universities, along the following dimensions:

  • Race
  • Gender
  • Age
  • Voter registration status

Voter registration status is a way to “back into” socioeconomic status (SES); it is admittedly an indirect measure, but, as you know, it is frequently correlated with SES. We have asked students in surveys about their first-generation status but do not collect this as a demographic variable.
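The single-variable case of the weighting Anne describes can be sketched in a few lines. This is an illustration only: the cell counts and population targets below are invented, not College Pulse’s actual figures or code.

```python
# Hypothetical sketch of one-variable demographic weighting.
# A cell that is underrepresented in the sample relative to the
# population gets a weight above 1; an overrepresented cell gets
# a weight below 1.

sample = {"registered": 400, "not_registered": 600}        # respondents per cell (invented)
population = {"registered": 0.55, "not_registered": 0.45}  # target population shares (invented)

n = sum(sample.values())
weights = {cell: population[cell] / (count / n) for cell, count in sample.items()}

# weights["registered"] is 0.55 / 0.40 = 1.375
# weights["not_registered"] is 0.45 / 0.60 = 0.75
# The weighted share of each cell now matches the population target:
# 400 * 1.375 / 1000 = 0.55
```

Applying each respondent’s cell weight to their answers then yields estimates that reflect the population’s composition rather than the raw sample’s.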

In addition to Anne’s responses, I will note two points. First, 97% of college students own a smartphone. Second, all of College Pulse’s surveys can also be completed on its website by anyone who uses the internet, a category that includes roughly 90% of American adults. Thus, we are confident that College Pulse’s outreach and marketing can reach a large cross-section of all college students enrolled at four-year institutions in the United States.

How do online samples differ from traditional telephone survey methods?

The internet has revolutionized the polling industry. For decades, most public opinion polling organizations relied on samples obtained via telephone surveys using random digit dialing (RDD). Over the past 20 years, the response rate for traditional RDD surveys has consistently declined and is now less than 10%, so online surveys are becoming increasingly prevalent. It is important to note here that by “online surveys” we do not mean the equivalent of a Twitter poll or a web survey that anyone can take; rather, we mean professionally administered surveys of large panels, such as those conducted by YouGov, the Pew Research Center, and (in this case) College Pulse.

It is true that people respond to online surveys differently than they do when surveyed over the phone by a live interviewer. However, this is not necessarily a bad thing. For instance, socially desirable responding is less prevalent in online surveys. In other words, online surveys about controversial topics may be a better gauge of public opinion than traditional telephone surveys.

This does not mean that online surveys are without drawbacks. The main one is that it is impossible to use them to systematically collect a traditional probability sample of the general population. Because of this, College Pulse applies a post-stratification adjustment based on a number of attributes, such as race, gender, class year, voter registration status, and financial aid status, obtained from multiple data sources, including the 2017 Current Population Survey, the 2016 National Postsecondary Student Aid Study, and the 2017-18 Integrated Postsecondary Education Data System. This post-stratification weighting, the standard practice in surveys of this type, helps ensure that the demographic distribution of the sample matches the national population of college students.
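Adjusting for several attributes at once is commonly implemented with raking (iterative proportional fitting), which repeatedly rescales weights so each margin matches its target. The sketch below is a minimal, hypothetical illustration of that general idea; the four respondents and the target shares are invented, and this is not College Pulse’s actual data or algorithm.

```python
# Hypothetical raking sketch: adjust weights until the weighted margins
# for every variable match the population targets.

respondents = [                              # invented toy sample
    {"gender": "woman", "year": "freshman"},
    {"gender": "woman", "year": "senior"},
    {"gender": "man",   "year": "freshman"},
    {"gender": "man",   "year": "freshman"},
]
targets = {                                  # invented population margins
    "gender": {"woman": 0.5, "man": 0.5},
    "year":   {"freshman": 0.6, "senior": 0.4},
}

weights = [1.0] * len(respondents)
for _ in range(100):                         # iterate until the margins settle
    for var, shares in targets.items():
        total = sum(weights)
        # Ratio of target share to current weighted share, per level.
        ratios = {}
        for level, share in shares.items():
            current = sum(w for w, r in zip(weights, respondents)
                          if r[var] == level) / total
            ratios[level] = share / current
        weights = [w * ratios[r[var]] for w, r in zip(weights, respondents)]

total = sum(weights)
woman_share = sum(w for w, r in zip(weights, respondents)
                  if r["gender"] == "woman") / total
freshman_share = sum(w for w, r in zip(weights, respondents)
                     if r["year"] == "freshman") / total
# woman_share converges to 0.5 and freshman_share to 0.6.
```

Production systems use purpose-built routines with convergence checks and weight trimming, but the core loop is the same: rescale one margin at a time until all of them agree with the population.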

How many student responses did you get from each school?

We provided this information in the appendix of our full report, but I am including that table below:

College Number of Undergraduates Sampled
Arizona State University
Brigham Young University 531
Brown University 475
Clemson University 488
Columbia University 255
Cornell University 276
Dartmouth College 302
DePauw University 249
Duke University 288
Georgetown University 356
Harvard University 322
Indiana University 512
Kansas State University 279
Louisiana State University 322
Northwestern University 338
Ohio State University 351
Oklahoma State University 357
Pennsylvania State University 506
Princeton University 363
Rutgers University 316
Stanford University 260
Syracuse University 232
Texas A&M University 622
University of Alabama 298
University of Arizona 398
University of Arkansas 327
University of California, Berkeley 338
University of California, Davis 444
University of California, Los Angeles 315
University of Chicago 298
University of Colorado 469
University of Georgia 392
University of Illinois at Chicago 652
University of Illinois at Urbana-Champaign 464
University of Iowa 316
University of Michigan 720
University of Minnesota 529
University of Missouri 278
University of Nebraska 303
University of North Carolina 387
University of Oklahoma 278
University of Oregon 364
University of Pennsylvania 540
University of South Carolina 350
University of Tennessee 441
University of Texas at Austin 312
University of Texas at Dallas 326
University of Utah 315
University of Virginia 429
University of Washington 360
University of Wisconsin 347
Virginia Tech 403
Wake Forest University 318
Washington State University 390
Yale University 269

Why did you survey these specific 55 colleges? Why wasn’t [insert college name here] surveyed?

We had a few reasons for selecting these specific 55 colleges. First, prior work suggests that support for free speech and expression may be lower among students at private colleges than among students at state universities. Thus, we included all eight members of the Ivy League, a number of other private colleges (e.g., Duke University, Stanford University), and a number of flagship state universities so that we could explore this possibility. We also wanted to include the flagship state universities because far more students attend, and consider attending, such institutions than the more expensive private universities.

One important factor in whether we surveyed a college was whether College Pulse considered its panel of students at that school large enough for us to obtain an adequate sample. For instance, given how publicly supportive Purdue University’s administration has been of free speech and expression on campus, we would have liked to include it. However, not enough Purdue students were enrolled in College Pulse’s panel for us to be confident in the results. This was the main reason some colleges one might expect were not included in this initial round. If we are able to administer the survey again and produce a new set of rankings, one of our primary goals is to significantly expand the number of colleges surveyed. After all, the number one reason more schools weren’t included was (of course) the cost of doing so, so please feel free to support FIRE if you’d like to see more work like this!

Final Thoughts

I hope that this piece answers the main questions regarding our sampling strategy and method. In my next piece, I will address more specific questions about the survey items themselves, with a particular focus on the items that make up the tolerance factor of our rankings.