Rushing to the end: Participants’ perceptions of demotivating aspects of online surveys

João Martins, Leonor Lavradio


More and more social science studies now collect data over the internet, reaching participants online. Some participants start out engaged and motivated to participate, but progressively slide into “rushing behaviors”. We asked experts in survey responding when, in online studies, they would feel the desire to rush (defined as speeding through with no concern for the quality of responses). This qualitative approach uncovered Repetition, Survey length, and No interest in topic as the three main features that would motivate these participants to rush through surveys. Subsequent inquiry of the same participants indicated that repetition concerns the type of questions asked (more than the stimuli or the task), as well as performing the same task more than 5-6 times, or for more than 6 minutes. Survey length concerns a preference for shorter surveys, as well as the subjective experience of length exceeding previously set expectations (i.e., a survey longer than announced), which contributes to rushing by effectively lowering the hourly pay rate as the survey lengthens. Interest in the topic was reported to be consistently low, yet it was not the main reason to quit a survey. However, a drop in the expected level of interest midway through a survey was reported as a factor that promotes rushing behaviors. We discuss how pre-tests of surveys can benefit from these participants’ expertise.


Keywords: Online survey, rushing, data validity.






Nº ERC: 107494 | ISSN (in print): 0870-8231 | ISSN (online): 1646-6020 | Copyright © ISPA - CRL, 2012 | Rua Jardim do Tabaco, 34, 1149-041 Lisboa | NIF: 501313672 | The portal and its metadata are licensed under a Creative Commons CC BY-NC license