Understanding mobile survey respondent behaviour

Electronic patient-reported outcomes (ePROs) and diaries are increasingly being used to collect data directly from patients. The many advantages of doing so include:

  • only valid, in-range data can be entered
  • missing data within a questionnaire is avoided
  • feedback and reminders help patients complete the questionnaire
  • data are time-stamped
  • question branching is supported
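As an illustration only, the first, fourth and fifth of these advantages can be sketched in code. The item names, score range and branching rule below are invented for the example and do not come from any particular ePRO system:

```python
from datetime import datetime, timezone

def validate_pain_score(value):
    """Accept only whole numbers 0-10: the device enforces valid, in-range data."""
    if not isinstance(value, int) or not 0 <= value <= 10:
        raise ValueError("Pain score must be an integer from 0 to 10")
    return value

def record_response(value):
    """Store a validated answer with a timestamp and pick the next question."""
    score = validate_pain_score(value)
    return {
        "item": "pain_score",
        "value": score,
        # Time-stamped data: the entry records exactly when it was captured.
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        # Question branching: only ask about pain medication if pain is reported.
        "next_item": "pain_medication" if score > 0 else "sleep_quality",
    }
```

Out-of-range or non-numeric entries are rejected at the point of capture rather than discovered later during data cleaning, which is precisely what makes ePRO data more complete than paper diaries.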

This data can be captured through several modalities: smartphones, tablets, laptops and desktops. However, with the increasing use of smartphones and tablets, and in particular the move to Bring Your Own Device (BYOD), it is prudent to consider issues such as mobile respondent behaviour, survey completion times and breakoffs (failing to complete the survey) when designing a mobile survey.

Although research on mobile responses is sparse, evidence can be drawn from market research in Australia, Canada, Germany, the Netherlands, Russia, the UK and the USA, which has been extensively reviewed by Wells (2015).

In this post, drawing extensively on the review by Wells (2015), we summarise some of the findings that we consider particularly relevant when collecting patient-reported outcome data through mobile responses.

  • Respondents insist on using a device of their own choice. In general, respondents will not switch devices to complete a survey even if prompted by the researcher (McClain et al. 2012; Millar & Dillman 2012; Peterson 2012).
  • Most mobile respondents use smartphones to complete a survey. Mobile respondents are more likely to complete surveys on smartphones than on tablets (Guidry 2012; Kinesis 2012; McClain et al. 2012).
  • Mobile respondents are more likely than PC users to complete the survey outside the home (de Bruijne & Wijnant 2013; Mavletova 2013; Mavletova & Couper 2013).
  • Nevertheless, most mobile respondents still complete the survey at home (de Bruijne & Wijnant 2013; Mavletova & Couper 2013).
  • Response rates are lower among mobile respondents. This is a consistent finding (Buskirk & Andrus 2012; de Bruijne & Wijnant 2013; Mavletova 2013).
  • SMS survey invitations lead to higher uptake among mobile respondents than email invitations. An interesting corollary is that email invitations increase the uptake of PC respondents (Crawford et al. 2013; de Bruijne & Wijnant 2014; Mavletova & Couper 2014).
  • SMS survey invitations lead to quicker responses than email invitations (de Bruijne & Wijnant 2014; Mavletova & Couper 2014).
  • Breakoffs (failure to complete the survey) are higher among smartphone users. This is a consistent finding reported by Wells (Callegaro 2010; Stapleton 2011; Buskirk & Andrus 2012; Guidry 2012; McClain et al. 2012).
  • Smartphone users take longer to complete a survey. This is also a very consistent finding reported by Wells (Peterson 2012; de Bruijne & Wijnant 2013; Mavletova 2013; Mavletova & Couper 2013; McGeeney & Marlar 2013; Wells et al. 2013).
  • Respondent demographics differ by the type of device used. Both smartphone respondents and tablet respondents are shown to differ from PC respondents. To quote Wells (2015): “For example, in the US, smartphone respondents are more likely to be younger, non-white, female, and to display lower levels of education and income (Stapleton 2011; Wells et al. 2013; Buskirk et al. 2014; Cook 2014).”

Whilst the findings discussed above are specific to market research, we believe they provide practical guidance, built on lessons learned, that may help clinical trialists and researchers to better understand mobile respondent behaviour. This is of particular relevance in mixed-mode (mobile/PC/tablet) surveys, where failure to accommodate the mobile user will result in non-coverage bias. According to Wells (2015): “… mobile respondents should not be blocked, screened out or disqualified from surveys, or redirected to a PC. They should be accommodated and surveys optimised for mobile devices”.


References

Buskirk, T.D. & Andrus, C. (2012) Online surveys aren’t just for computers anymore! Exploring potential mode effects between smartphone and computer-based online surveys. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

Callegaro, M. (2010) Do you know which device your respondent has used to take your online survey? Survey Practice, 3, 6. Available online at: http://surveypractice.org/index.php/SurveyPractice/article/view/250/html

Crawford, S., McClain, C., O’Brien, S. & Nelson, T. (2013) Examining the feasibility of SMS as a contact mode for a college student survey. Paper presented at the annual meeting of the American Association for Public Opinion Research, Boston, Massachusetts, 16–19 May.

de Bruijne, M. & Wijnant, A. (2013) Comparing survey results obtained via mobile devices and computers: an experiment with a mobile web survey on a heterogeneous group of mobile devices versus a computer-assisted web survey. Social Science Computer Review, 31, 4, pp. 482–504.

de Bruijne, M. & Wijnant, A. (2014) Improving response rates and questionnaire design for mobile web surveys. Public Opinion Quarterly, 78, 4, pp. 951–962.

Guidry, K.R. (2012) Response quality and demographic characteristics of respondents using a mobile device on a web-based survey. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

McClain, C.A., Crawford, S.D. & Dugan, J.P. (2012) Use of mobile devices to access computer-optimized web instruments: implications for respondent behavior and data quality. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

McGeeney, K. & Marlar, J. (2013) Mobile browser web surveys: testing response rates, data quality, and best practices. Paper presented at the annual meeting of the American Association for Public Opinion Research, Boston, Massachusetts, 16–19 May.

Mavletova, A. (2013) Data quality in PC and mobile web surveys. Social Science Computer Review, 31, 6, pp. 725–743.

Mavletova, A. & Couper, M.P. (2013) Sensitive topics in PC web and mobile web surveys. Survey Research Methods, 7, 3, pp. 191–205.

Mavletova, A. & Couper, M.P. (2014) Mobile web survey design: scrolling versus paging, SMS versus e-mail invitations. Journal of Survey Statistics and Methodology, 2, 4, pp. 498–518.

Millar, M.M. & Dillman, D.A. (2012) Encouraging survey response via smartphones: effects on respondents’ use of mobile devices and survey response rates. Survey Practice, 5, 3. Available online at: www.surveypractice.org/index.php/SurveyPractice/article/view/19/html

Peterson, G. (2012) Unintended mobile respondents. Paper presented at the annual Council of American Survey Research Organizations Technology Conference, New York, 30–31 May.

Stapleton, C. (2011) The smart(phone) way to collect survey data. Paper presented at the annual meeting of the American Association for Public Opinion Research, Phoenix, Arizona, 12–15 May.

Wells, T., Bailey, J.T. & Link, M.W. (2013) Filling the void: gaining a better understanding of tablet-based surveys. Survey Practice, 6, 1. Available online at: www.surveypractice.org/index.php/SurveyPractice/article/view/25/html

Wells, T. (2015) What market researchers should know about mobile surveys. International Journal of Market Research, 57, 4, pp. 521–532.


About us

HealthSurveySolutions is based in Oxfordshire in the United Kingdom and was formed:

  • To offer clients real expertise in the development, design and validation of paper and web-based health survey questionnaires that provide actionable results.
  • To provide support to organisations that increasingly use self-service online survey applications (such as SurveyMonkey) and are either struggling or making errors when writing their surveys – the most pivotal step of the entire research process – or would like some expert advice before starting their survey.