ePROs in clinical trials – Why we should be mobile first

Electronic patient-reported outcomes (ePROs), particularly those collected via Bring Your Own Device (BYOD), are increasingly used to collect data directly from patients. The many advantages of doing so include:

  • clinical trials can be bigger, faster and more efficient
  • valid, in-range data options can be entered
  • avoiding missing data within a questionnaire
  • feedback and reminders to help patients complete the questionnaire
  • time stamped data
  • complex question branching

However, with people’s attention spans getting shorter, there is a growing need to make surveys lean. With more and more surveys being taken on mobile phones, participants will drop out if the survey is too long or badly designed.

While there are established guidelines for measuring equivalence between electronic and paper-based PRO measures (Coons et al. 2009), what do we really know about respondent behaviour when completing an ePRO on a mobile device, and about best practice in designing a mobile-friendly questionnaire?

Although research on mobile survey responses is sparse, there is evidence from market research in Australia, Canada, Germany, the Netherlands, Russia, the UK and the USA that can guide our thinking on the design and application of ePROs.

Mobile respondent behaviour

First, let’s look at some of the evidence around mobile respondent behaviour. Here I’ve drawn mostly from the work of Wells (2015).

  1. Respondents insist on using a device of their own choice, and in general will not switch devices to complete a survey even if prompted by the researcher (McClain et al. 2012; Millar & Dillman 2012; Peterson 2012).
  2. Mobile responders are more likely than PC users to complete the survey outside the home (de Bruijne & Wijnant 2013; Mavletova 2013; Mavletova & Couper 2013).
  3. Nevertheless, most mobile responders still complete the survey at home (de Bruijne & Wijnant 2013; Mavletova & Couper 2013).
  4. Smartphone users take longer to complete a survey compared to other devices. (Peterson 2012; de Bruijne & Wijnant 2013; Mavletova 2013; Mavletova & Couper 2013; McGeeney & Marlar 2013; Wells et al. 2013).
  5. Breakoffs (failure to complete the survey) are higher among smartphone users than computer users (Callegaro 2010; Stapleton 2011; Buskirk & Andrus 2012; Guidry 2012; McClain et al. 2012).
  6. Response rates are lower among mobile responders than computer users. This is a consistent finding (Buskirk & Andrus 2012; de Bruijne & Wijnant 2013; Mavletova 2013).
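
For readers working with survey-platform exports, the response-rate and breakoff-rate figures behind points 5 and 6 are straightforward to compute. The sketch below uses invented records and field names purely for illustration; it is not tied to any particular platform:

```python
# Illustrative only: computing response and breakoff rates by device
# from hypothetical invitation records. Statuses and device labels
# are assumptions, not a platform's real schema.
from collections import Counter

# Each record: (device, status) where status is one of
# "completed", "breakoff" (started but abandoned), "not_started"
records = [
    ("smartphone", "completed"), ("smartphone", "breakoff"),
    ("smartphone", "breakoff"), ("smartphone", "not_started"),
    ("pc", "completed"), ("pc", "completed"),
    ("pc", "breakoff"), ("pc", "not_started"),
]

def rates(device):
    """Return (response_rate, breakoff_rate) for one device type."""
    counts = Counter(status for d, status in records if d == device)
    invited = sum(counts.values())
    started = counts["completed"] + counts["breakoff"]
    response_rate = counts["completed"] / invited   # completes / invited
    breakoff_rate = counts["breakoff"] / started    # abandons / starts
    return response_rate, breakoff_rate

print("smartphone:", rates("smartphone"))
print("pc:", rates("pc"))
```

With this toy data the smartphone group shows the pattern the literature reports: a lower response rate and a higher breakoff rate than the PC group.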

One of the most important issues facing survey designers is poor response behaviour, either because the participant fails to answer the survey at all or fails to answer one or more of the questions. This is particularly notable among mobile users compared to PC users (see 5 and 6 above). Poor response behaviour can have a significant impact on data quality when the questionnaire has not been optimised for mobile.

Why do participants drop out of mobile surveys?

When surveys are conducted online or via a mobile device, participant breakoff can be due to a number of factors. These include technical challenges such as slow page loads caused by a graphics-rich questionnaire. Participants are also more likely to break off if the survey is tedious or repetitive in how the questions are worded or presented. Questionnaire design, too, plays a critical role in the survey experience. For example, it is widely recognised that matrix or table-based questions, long answer lists and long or wide scales do not render well on a mobile device, leading to participant fatigue, frustration and higher dropout.

Wenz (2017) has shown that data quality mainly differs between small smartphones (screen size below 4.0 inches) and larger mobile devices. Respondents who use small-screen smartphones are more likely to break off the survey, and they provide shorter answers to open questions, compared to respondents with larger devices. There is, however, some evidence that people using smartphones can provide high-quality responses as long as they are presented with question formats that are easy to use on small touchscreens (Antoun, Couper & Conrad 2017).

Obviously, mobile-unfriendly questionnaire design will exclude people who will only participate via mobile, which clearly compromises the research. The advice: ‘If it doesn’t work on mobile, don’t do it’ (Poynter 2017).

It’s worth noting, however, that of the 7 billion mobile phones in use around the world, fewer than 3 billion are smartphones. Focusing only on smartphones will therefore exclude the majority of mobile phone users, although this is a marginal problem in some countries (Poynter 2017).

What is a mobile first survey?

Mobile first is the current mantra: design the questionnaire for a mobile device at the outset, test it on mobile devices, and scale it up for larger screen sizes. This approach can overcome design problems such as those mentioned above. For example, while it is generally accepted that grids and matrices are inappropriate for use on mobile devices, a mobile-first approach that keeps the number of scale points minimal, uses short labels, or presents the grid one row at a time can perform in a similar way to a grid on a PC (Couper 2016). However, grids remain one of the most disliked question formats among survey participants.
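
To illustrate the row-at-a-time idea, here is a minimal, hypothetical sketch of how a grid question might be decomposed into an item-by-item presentation, one row per screen. The data structure and field names are invented for illustration, not taken from any ePRO platform:

```python
# Hypothetical grid question: one prompt, several rows, one shared scale.
grid = {
    "prompt": "Over the last week, how much were you bothered by:",
    "rows": ["Pain", "Fatigue", "Nausea"],
    "scale": ["Not at all", "A little", "Quite a bit", "Very much"],
}

def item_by_item(grid):
    """Yield one single-question screen per grid row (mobile-first)."""
    for row in grid["rows"]:
        yield {
            "question": f'{grid["prompt"]} {row}',
            "options": grid["scale"],  # short scale with short labels
        }

screens = list(item_by_item(grid))
print(len(screens))  # 3 screens, one per grid row
```

The participant then answers three short single-choice questions instead of scrolling a table, which is the presentation Couper (2016) found can perform similarly to a PC grid.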

In addition to those mentioned above, other aspects of best practice for a mobile optimised questionnaire include:

  • Have one question per screen or have the number of questions that fit comfortably on the screen to avoid the participant having to scroll
  • Limit the number of words per question
  • Avoid progress bars. While progress bars work well for PC-based online surveys, there is simply no room for them on a mobile screen
  • Avoid drop-down menus, which are difficult for mobile respondents to use
  • Avoid open-ended questions
  • Avoid images and videos
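
Rules like those above can also be enforced mechanically when the questionnaire is built. The following is a minimal, hypothetical sketch of such a check; the rule set and the 25-word limit are assumptions chosen for illustration, not a published standard:

```python
# Illustrative mobile-first questionnaire linter. The disallowed question
# types and the word limit below are assumptions, not an official rule set.
DISALLOWED = {"dropdown", "open_text", "image", "video"}
MAX_WORDS = 25

def check_question(q):
    """Return a list of mobile-first rule violations for one question."""
    problems = []
    if q["type"] in DISALLOWED:
        problems.append(f"avoid {q['type']} on mobile")
    if len(q["text"].split()) > MAX_WORDS:
        problems.append("question text too long")
    return problems

q = {"type": "dropdown", "text": "Which of the following applies to you?"}
print(check_question(q))  # ['avoid dropdown on mobile']
```

Running such a check over every question before fielding the survey catches mobile-unfriendly formats early, rather than discovering them through breakoffs.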

Mode effects when using a mix of mobile and PC

Clearly, giving participants the choice of completing the questionnaire on a PC or on a mobile-optimised version will improve coverage, which is a desirable outcome. However, if a mobile-unfriendly questionnaire is used there will be mode effects: answers can differ, and some people will be missed entirely. De Leeuw (2017) proposes that mode effects comprise two elements: (1) changes due to the change in hardware, which are undesirable mode effects, and (2) those caused by improving the coverage of the study, which are desirable mode effects.

In summary

  1. Participant behaviour can vary between mobile and PC users
  2. Mobile first is the current mantra: design the questionnaire for a mobile device at the outset and scale up for larger screen sizes
  3. Breakoffs are higher among smartphone users than computer users
  4. Questionnaires need to be mobile-friendly to minimise participant breakoff and poor-quality data
  5. A mobile-optimised questionnaire will produce similar results to a PC
  6. A mobile-unfriendly questionnaire is more likely to produce different results from a PC
  7. Focusing only on smartphones will exclude the majority of mobile phone users, although this is a marginal problem in some countries

References

Antoun, C., Couper, M.P. & Conrad, F.G. (2017) Effects of mobile versus PC web on survey response quality: a crossover experiment in a probability web panel. Public Opinion Quarterly, 81, Special Issue, pp. 280–306.

Buskirk, T.D. & Andrus, C. (2012) Online surveys aren’t just for computers anymore! Exploring potential mode effects between smartphone and computer-based online surveys. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

Callegaro, M. (2010) Do you know which device your respondent has used to take your online survey? Survey Practice, 3, 6. Available online at: http://surveypractice.org/index.php/SurveyPractice/article/view/250/html

Coons, S.J., et al. (2009) Recommendations on evidence needed to support measurement equivalence between electronic and paper-based patient-reported outcome (PRO) measures: ISPOR ePRO Good Research Practices Task Force report. Value in Health, 12, 4.

Couper, M.P. (2016) Grids Versus Item‐by‐Item Designs on Smartphones http://url.ie/11w1p

Crawford, S., McClain, C., O’Brien, S. & Nelson, T. (2013) Examining the feasibility of SMS as a contact mode for a college student survey. Paper presented at the annual meeting of the American Association for Public Opinion Research, Boston, Massachusetts, 16–19 May.

de Bruijne, M. & Wijnant, A. (2013) Comparing survey results obtained via mobile devices and computers: an experiment with a mobile web survey on a heterogeneous group of mobile devices versus a computer-assisted web survey. Social Science Computer Review, 31, 4, pp. 482–504.

de Bruijne, M. & Wijnant, A. (2014) Improving response rates and questionnaire design for mobile web surveys. Public Opinion Quarterly, 78, 4, pp. 951–962.

de Leeuw, E.D. (2017) Never a dull moment: mixed-mode surveys in past, present & future. Keynote at the 2017 ESRA conference, Lisbon.

Guidry, K.R. (2012) Response quality and demographic characteristics of respondents using a mobile device on a web-based survey. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

McClain, C.A., Crawford, S.D. & Dugan, J.P. (2012) Use of mobile devices to access computer-optimized web instruments: implications for respondent behavior and data quality. Paper presented at the annual meeting of the American Association for Public Opinion Research, Orlando, Florida, 17–20 May.

McGeeney, K. & Marlar, J. (2013) Mobile browser web surveys: testing response rates, data quality, and best practices. Paper presented at the annual meeting of the American Association for Public Opinion Research, Boston, Massachusetts, 16–19 May.

Mavletova, A. (2013) Data quality in PC and mobile web surveys. Social Science Computer Review, 31, 6, pp. 725–743.

Mavletova, A. & Couper, M.P. (2013) Sensitive topics in PC web and mobile web surveys. Survey Research Methods, 7, 3, pp. 191–205.

Mavletova, A. & Couper, M.P. (2014) Mobile web survey design: scrolling versus paging, SMS versus e-mail invitations. Journal of Survey Statistics and Methodology, 2, 4, pp. 498–518.

Millar, M.M. & Dillman, D.A. (2012) Encouraging survey response via smartphones: effects on respondents’ use of mobile devices and survey response rates. Survey Practice, 5, 3. Available online at: www.surveypractice.org/index.php/SurveyPractice/article/view/19/html

Peterson, G. (2012) Unintended mobile respondents. Paper presented at the annual Council of American Survey Research Organizations Technology Conference, New York, 30–31 May.

Poynter, R. (2017) Major update on mobile market research. http://url.ie/11w1n

Stapleton, C. (2011) The smart(phone) way to collect survey data. Paper presented at the annual meeting of the American Association for Public Opinion Research, Phoenix, Arizona, 12–15 May.

Wells, T. (2015) What market researchers should know about mobile surveys. International Journal of Market Research, 57, 4, pp. 521–532.

Wenz, A. (2017) Completing web surveys on mobile devices: does screen size affect data quality? Institute for Social and Economic Research, University of Essex, Working Paper No. 2017-05, April 2017.

 
