Rushing to the end: Participants’ perceptions of demotivating aspects of online surveys

João Martins, Leonor Lavradio

Abstract


More and more social science studies now collect their data over the internet, reaching participants online. Some participants start out engaged and motivated, but progressively slide into “rushing behaviors”. We asked experts in survey responding when, in online studies, they would feel the urge to rush (rushing being defined as speeding with no concern for the quality of responses). This qualitative approach uncovered Repetition, Survey length, and No interest in topic as the three main features that would motivate these participants to rush through surveys. A follow-up inquiry with the same participants indicated that repetition concerns the type of question asked (more than the stimuli or the task), performing the same task more than 5-6 times, or performing it for more than 6 minutes. Survey length concerns a preference for shorter surveys, as well as the subjective experience of a survey exceeding previously set expectations (i.e., running longer than announced), which contributes to rushing by effectively lowering the hourly pay rate as the survey grows longer. Interest in the topic was reported to be consistently low, although it was not the main reason to quit a survey; a change in the expected level of interest in the middle of the survey, however, was reported as a factor that promotes rushing behaviors. We discuss how pre-tests of surveys can benefit from these participants’ expertise.
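
The pay-rate mechanism invites a small worked example. The sketch below, with all figures hypothetical rather than taken from the study, shows how a fixed participation fee implies a falling effective hourly rate as a survey overruns its announced length:

    # Hypothetical illustration (figures assumed, not from the study):
    # a fixed fee divided by actual completion time gives the effective
    # hourly pay rate a participant experiences.
    def effective_hourly_rate(fee, minutes_taken):
        return fee / (minutes_taken / 60.0)

    fee = 2.50  # assumed fixed payment for the survey
    for minutes in (10, 15, 25):
        print(f"{minutes} min -> {effective_hourly_rate(fee, minutes):.2f}/hour")
    # 10 min -> 15.00/hour (the rate implied by the announced duration)
    # 15 min -> 10.00/hour
    # 25 min -> 6.00/hour

On this reading, every minute beyond the announced duration dilutes the effective pay rate, which is consistent with the participants’ report that overrunning surveys invite rushing.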

Keywords


Online survey, Rushing, Data validity.


DOI: https://doi.org/10.14417/ap.1674


