Electronic Questionnaires Design and Implementation
Clara Minto1, Giulia Beltrame Vriz2, Matteo Martinato3, Dario Gregori1, *
1 Unit of Biostatistics, Epidemiology and Public Health, Department of Cardiac, Thoracic and Vascular Sciences, University of Padova, Padova, Italy
2 Department of Obstetrics, Santa Chiara Hospital, Trento, Italy
3 University Hospital, Gastroenterology, Padova, Padova, Italy
Nursing and health care research increasingly use e-questionnaires and e-forms for data collection and survey administration. The main reasons are the containment of costs, time and data-entry errors, together with increased flexibility, functionality and usability. In spite of this growing usage, no specific and comprehensive guidelines for designing and administering e-questionnaires have been produced so far.
The aim of this review is to collect information on current best practices, taking them from various fields of application. An evaluation of the efficacy of each indication is provided.
A literature review of the guidelines currently available on WebSM (Web Survey Methodology) on electronic questionnaires has been performed. Four search strings were used: “Electronic Questionnaire Design”, “Electronic Questionnaire”, “Online Questionnaire” and “Online Survey”. Inclusion criteria were English language, relevance to the aim of the research, and publication date between January 1998 and July 2014.
The review process led to the identification of 48 studies. Most of the guidelines reported concern Web and e-mail questionnaires, while a lack of indications emerges especially for app and simple e-questionnaires.
A lack of guidelines on e-questionnaires was found, especially in health care research, increasing the risk of using ineffective and expensive instruments; more research in this field is needed.
open-access license: This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International Public License (CC-BY 4.0), a copy of which is available at: https://creativecommons.org/licenses/by/4.0/legalcode. This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
* Address correspondence to this author at the Unit of Biostatistics, Epidemiology and Public Health, Department of Cardiac, Thoracic and Vascular Sciences, Via Loredan 18, 35121 Padova, Italy, Tel: +390498275384, Fax: +39 02 700445089; E-mail: email@example.com
Technological development and Internet diffusion have changed methods of data collection: the World Wide Web and electronic tools represent new challenges for researchers, as they are the newest means to connect people and collect information. These advancements also affect research in clinical settings, where patients and healthcare providers are today more prone to Internet use and more familiar with electronic devices such as personal computers, tablets, mobile phones and smartphones. In nursing research, information is often collected directly from the patient, in order to minimize data entry errors [1, 2]. This method of data collection is called Patient Reported Outcomes (PRO; MeSH: “Assessment of the quality and effectiveness of health care as measured and directly reported by the patient”), and is useful to investigate health care aspects such as patients’ illness perception, level of pain and efficacy of nursing interventions. One of the most important instruments of data collection is the questionnaire, which allows both self-administration and operator-assisted interviews. Its validity is based on the consistency of the gathered information with patients’ actual conditions, health perception and thoughts.
Over the last 30 years, traditional survey methods such as paper questionnaires and telephone interviews have been partially replaced by new collection instruments designed for electronic tools [3].
Although the literature is rich in guidelines on how to create and use paper questionnaires, not all of these indications are applicable to their electronic counterparts [1]: for example, while evidence on wording and item definition can be the same for all questionnaires, indications about layout, privacy or question structure may change depending on the type of device. Compared to traditional paper-based methods, electronic questionnaires offer several advantages, such as cost reduction [4], speed in data collection and analysis [5], personalized design for the target sample, absence of influence from the researcher’s presence, comfort for respondents, who can complete the questionnaire when and where they prefer, and flexibility, functionality and usability, allowing the inclusion of pop-up instructions, error messages and links, and making it possible to encode difficult skip patterns in a way that renders them virtually invisible to respondents. However, e-forms also present important disadvantages that can potentially compromise survey success. Firstly, a sample cannot be representative if the sampling process is not controlled [5], and users may be worried about their privacy or have concerns about data circulation through a network. Furthermore, e-questionnaires can incur typical errors due to their technological characteristics: e-mail forms can be read as spam messages, or the instrument’s format can be incompatible with users’ devices.
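The skip-pattern mechanism mentioned above can be illustrated with a minimal sketch. The question identifiers and branching rule below are hypothetical examples, not taken from the reviewed guidelines:

```python
# Minimal sketch of "invisible" skip-pattern logic in an e-questionnaire.
# Question IDs and the branching rule are hypothetical examples.

QUESTIONS = ["smoker", "cigarettes_per_day", "pain_level"]

# Each rule maps a question to a condition on previous answers:
# the question is shown only if the condition holds.
SKIP_RULES = {
    "cigarettes_per_day": lambda answers: answers.get("smoker") == "yes",
}

def next_question(answers):
    """Return the next question to display, silently skipping
    any question whose display condition is not met."""
    for q in QUESTIONS:
        if q in answers:
            continue  # already answered
        rule = SKIP_RULES.get(q)
        if rule is None or rule(answers):
            return q
    return None  # questionnaire complete
```

In this sketch, a non-smoker moves directly from the smoking item to the pain item without ever seeing the skipped question, which is what makes electronic skip patterns invisible to respondents.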
The literature reports four categories of errors: coverage, non-response, sampling and measurement errors [3, 5, 6]. Coverage error (i) occurs when not all members of a population have the same chance to be included in the sample; non-response error (ii) happens when potential users fail to respond to the survey invitation. Sampling errors (iii) are a consequence of data collection methods: subjects of interest can be excluded due to technological limitations such as slow connections, small bandwidth, browser configuration or different devices. Finally, measurement errors (iv) occur when ambiguous or incorrect question wording causes imprecise and contradictory answers.
The aim of the study is (i) to help researchers in the nursing area to limit potential survey errors, through a critical review of the main guidelines published over the last decade on electronic questionnaires, and (ii) to highlight which aspects of e-forms still need to be analyzed in the literature. The study represents a practical guide to help health researchers and nurses create an effective e-questionnaire, adapted to the target population and able to collect relevant information.
The present study is a review of the main guidelines currently available in the literature on electronic questionnaires, including an overview of the efficacy of these recommendations.
The literature review was conducted using WebSM (Web Survey Methodology), a database specifically dedicated to the methodological issues of Web surveys. The same search strings were used to collect review and trial articles. The key words entered in the advanced search of the website were: Electronic Questionnaire Design (search string 1), Electronic Questionnaire (search string 2), Online Questionnaire (search string 3) and Online Survey (search string 4). Inclusion criteria were English language, relevance to the aim of the research, and publication date between January 1998 and July 2014.
Fig. (1) shows a flow chart of the process that led to the identification of 18 articles relevant for the review, out of an initial 1,073.
Flow chart of the paper selection process for the review.
An additional 23 articles, found in the references of the 18 selected papers, were included, for a total of 41 articles. Table (1) provides a summary of the categories used for classifying e-form aspects.
Results are organized in four different tables: findings from 16 papers were used to create Table (2) (Guidelines Table) and Table (3) (Guidelines’ Distribution Table), while findings from 25 papers contributed to Table (4) (Efficacy Table).
Table (2) includes guidelines on electronic questionnaire. In this table, indications have been distributed among four types of e-questionnaire:
simple electronic questionnaire via personal computer, with a specific software program designed for the survey [4, 7];
e-mail questionnaire, a type of e-survey via a network system in which the questionnaire is sent as an e-mail message [6];
online questionnaire, directly inserted on the host website and designed as a web page with a URL [6];
“app” questionnaire, downloaded as a smartphone application.
The columns of the table list the four types of e-questionnaires, while the rows show the main categories of guidelines: survey development, questionnaire design, questionnaire layout and information accessibility. The first category, survey development, lists indications on what researchers should do before questionnaire distribution. The questionnaire design category gathers evidence on how to profile the target audience, ensure data quality and create the welcome page, including specific criteria to formulate answers and questions. The row on questionnaire layout provides indications about the tool’s visual configuration, with attention to the use of colour and the choice of images. Finally, the category on information accessibility includes guidelines on costs, wording, and indications to ensure an easy and safe use of the questionnaire.
From the literature review, it is possible to conclude that the procedures identified for item definition are the same for the different kinds of electronic questionnaire; they are based primarily on literature review, interviews and focus-group sessions with potential respondents or experts (Table 2).
Table 1 Summary of categories used for classifying e-forms aspects.
Guidelines on questionnaire design and implementation.
The sample selection procedures are also similar but, specifically for e-mail and web questionnaires, guidelines address the achievement of a desired level of randomness and representativeness of non-probability samples. Moreover, in these cases, it is important to use a database of valid e-mail addresses and to obtain permission from potential respondents to send surveys (Table 2). As for testing and pretesting procedures, the guidelines across the different electronic forms are similar, but it should be taken into account that, for Web and e-mail questionnaires, virtual pretesting and testing is useful to ensure quick and effective control of the questionnaire (Table 2).
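The requirement of a database of valid e-mail addresses can be supported by a simple pre-send check. The sketch below is a common minimal heuristic, not a procedure taken from the reviewed guidelines; deliverability and respondent consent must still be verified separately:

```python
import re

# Loose syntactic check only: one "@", no whitespace, a dotted domain.
# Actual deliverability and respondents' permission must be verified
# separately, as the guidelines recommend.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean_address_list(addresses):
    """Keep syntactically plausible addresses, de-duplicated and
    normalized to lower case, preserving the original order."""
    seen = set()
    valid = []
    for addr in addresses:
        addr = addr.strip().lower()
        if EMAIL_PATTERN.match(addr) and addr not in seen:
            seen.add(addr)
            valid.append(addr)
    return valid
```

Running such a filter before distribution reduces bounce-related non-response error, although it cannot replace an explicit opt-in from respondents.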
The validation procedures in terms of reliability and validity are identical across the different electronic questionnaire forms (Table 2) and are comparable to the procedures provided for paper questionnaires [8]. Table (3) shows which aspects of electronic questionnaires have been more thoroughly investigated, and which ones need further analysis. For example, in the category about survey development (Table 3), the majority of articles concern e-mail and web questionnaires, while four papers address simple e-forms [1, 9, 10, 14] and only one article discusses app-forms [9].
As regards efficacy, Table (4) reports estimates of the efficacy of some of the most relevant topics: login procedures, use of reminder letters, progress indicators, questionnaire length and cover messages. As shown in the table, the use of personalized invitations [4], the adoption of mixed strategies [9], the creation of plain questionnaires, the reduction of text length [10], the use of reminder letters or telephone contacts [11], the inclusion of a semiautomatic login procedure [12-14] and the adoption of strong privacy policies are all elements that have the potential to increase response rates. On the other hand, including an estimate of the completion time in the cover letter or in the survey introduction does not seem to significantly influence users’ attitudes.
Table 3 Reviewed papers distributed according to information and pertinence.
The literature review demonstrated a lack of indications about app and off-line e-questionnaires, while it reported a wide collection of guidelines on e-mail and web forms. While many indications for online questionnaires can also be applied to simple e-forms, the same does not hold for app versions, since every type of device needs specific guidance in terms of media characteristics and electronic limitations. Colour and graphic use, as well as answer formulation and the choice of scrolling, can positively or negatively influence the response rate: researchers should, for example, evaluate whether scrolling is comfortable for smartphone users, or whether a complex configuration increases download time. These and other guidelines on questionnaire layout should be tailored to the type of electronic device. For example, for e-mail and web questionnaires it is important to consider sampling procedures that take into account the representativeness of non-probability samples [5].
Such a lack of evidence also affects the entire process of questionnaire creation. For example, only one article deals with questionnaire design, explaining question types such as rank-order questions, categorical or nominal questions, magnitude estimate questions, ordinal questions and Likert scales [3, 9].
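The question types listed above differ mainly in the kind of answer they accept, which is what an electronic form must enforce at entry time. A minimal, illustrative sketch of such a representation follows; the field names and example items are hypothetical, not drawn from the cited articles:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """One questionnaire item; 'kind' distinguishes some of the
    question types discussed in the text (illustrative only)."""
    text: str
    kind: str                      # e.g. "likert", "nominal", "rank"
    options: list = field(default_factory=list)

    def validate(self, answer):
        if self.kind == "likert":
            # Likert scale: an integer position on the ordered scale.
            return isinstance(answer, int) and 1 <= answer <= len(self.options)
        if self.kind == "nominal":
            # Categorical/nominal question: exactly one listed option.
            return answer in self.options
        if self.kind == "rank":
            # Rank-order question: a permutation of all the options.
            return sorted(answer) == sorted(self.options)
        return False

# Hypothetical pain item on a five-point Likert scale.
pain = Item("How severe is your pain?", "likert",
            ["none", "mild", "moderate", "severe", "very severe"])
```

Validating at entry time in this way is one of the features that lets electronic forms reduce the data-entry errors mentioned in the introduction.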
The category of information accessibility is the most discussed, consistently with the current interest in electronic tools. A large number of studies investigated topics such as security (login procedures, privacy protection), strategies to encourage and help users (cover letter, instructions, reminders, user competence), technical procedures to create a simple tool (skip and filter questions, screen and type of device, technical procedures, browser and file size), ways to ensure data quality (repeated answers, confirmation messages, thank-you page), and indications to create a visually attractive questionnaire in order to reduce dropout rates and increase users’ responses (progress indicator, one-page and multiple-page design, wording, phraseology, multiple choice, costs).
Table 4 Evidence-based classification on the impact of the various indications as proposed in the paper.
Furthermore, while the choice of the topic is relevant in increasing data quality [45], the use of a grid design does not seem to improve it. Respondents’ appreciation of the survey is positively conditioned by the inclusion of progress indicators, as these influence the perception of questionnaire length, and by the adoption of graphical details that make the questionnaire more enjoyable. However, the effectiveness of most of the guidelines still remains uncertain: the completion rate is always higher in PC questionnaires compared with app forms, also for sensitive topics; the header image is not affected by the “banner blindness” phenomenon and can partially influence users’ answers [28]; finally, the use of a progress indicator and the configuration of the first question have little or no effect on dropout and completeness rates [42].
It is important to highlight that the majority of the retrieved papers concern marketing and epidemiological topics, while just a few studies have been conducted in clinical settings. The implementation of new electronic tools and the adaptation of existing guidelines to the needs of health care and nursing researchers can contribute to reinforcing efficacy in this particular setting. This is the case of users with particular needs, for whom the literature does not report any specific guideline, except for a few articles [21] giving general information on font size, colours and button size for subjects with visual or motor difficulties. Due to the absence of particular indications for patients with disabilities, the risk of excluding part of the population is still present, with potential, eligible respondents not being able to participate because of their inability to use the instrument.
The lack of guidelines on e-questionnaires also has consequences in health research, especially for nurses (ideal users of this survey tool) [46]. Questionnaires, in nursing research, are useful tools also in simple research studies [47]. For these reasons, the shortage of methodological guidelines concerning the construction, validation and administration of questionnaires, particularly in electronic form, may increase the risk of using ineffective and expensive instruments to conduct surveys.
In order to facilitate subjects’ participation and collect high-quality data, nurses should consider new technologies as resources to communicate with patients that use electronic tools in daily routine.
The literature review revealed the need for further development of guidelines on electronic questionnaires, particularly in nursing and health care research. The current research can serve as a starting point for future improvements: future studies on e-questionnaires will update this collection of guidelines, including more accurate evidence in line with technological development and population characteristics. Finally, trials on the efficacy of guidelines have to be conducted in order to verify the validity of the indications coming from the current literature. The evolution of electronic techniques for data collection, and their integration into research practice, should always be consistent with the general criteria of validity and reliability, in order to ensure the integrity of the scientific process and provide relevant information.
LIST OF ABBREVIATIONS
WebSM = Web Survey Methodology
PRO = Patient Reported Outcomes
CONSENT FOR PUBLICATION
CONFLICT OF INTEREST
The authors declare no conflict of interest, financial or otherwise.
Bailey J, Bensky E, Link M. Can Your Smartphone Do This? A New Methodology for Advancing Digital Ethnography. In: American Association for Public Opinion Research annual conference; Phoenix, AZ. 2011.
Batinic B, Reips UD, Bosnjak M, Eds. Online social sciences 2002.
Dube SR, Hu SS, Fredner-Maguire N, Dayton J. A Focus Group Pilot Study of Use of Smartphone to Collect Information about Health Behaviors. In: 67th Annual Conference of the American Association for Public Opinion Research (AAPOR); Orlando, Florida. 2012.
Dillman DA, Smyth JD, Christian LM. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method 2008.
Väätäjä H, Roto V. Mobile questionnaires for user experience evaluation. CHI’10 2010; 3361-6.
Abraham SY, Steiger DM, Sullivan C. Electronic and mail self-administered questionnaires: A comparative assessment of use among elite Populations. Proceedings of the Section on Survey Research Methods 1998; 833-41.
Emde M, Fuchs M. Using adaptive questionnaire design in open-ended questions: A field experiment. 67th Annual Conference 2012.San Diego, USA. 2012.
Dillman DA, Tortora RD, Bowker D. Principles for constructing web surveys 1998.
Heerwegh D, Vanhove T, Loosveldt G, Matthijs K. Effects of personalization on web survey response rates and data quality. In: RC33 Sixth International Conference on Social Science Methodology; Amsterdam. 2004.
Holland JL, Christian LM. The Influence of topic interest and interactive probing on responses to open-ended questions in web surveys. Soc Sci Comput Rev 2009; 27: 196-212. [http://dx.doi.org/10.1177/0894439308327481]
Dillman DA, Tortora RD, Conradt J, Bowker D. Influence of Plain Vs. Fancy Design on Response Rates for Web Surveys. In: Association AS, Ed. Joint Statistical Meetings 1998.
Scott A, Jeon S-H, Joyce CM, et al. A randomised trial and economic evaluation of the effect of response mode on response rate, response bias, and item non-response in a survey of doctors. BMC Med Res Methodol 2011; 11: 126. [http://dx.doi.org/10.1186/1471-2288-11-126] [PMID: 21888678]
Healey B, Macpherson T, Kuijten B. An Empirical Evaluation of Three Web Survey Design Principles. Market Bullet 2005; 16: 1.
Burdein I. Shorter Isn’t Always Better In: 2013 CASRO Online Research Conference 2014.
Hoerger M. Participant dropout as a function of survey length in internet-mediated university studies: Implications for study design and voluntary participation in psychological research. Cyberpsychol Behav Soc Netw 2010; 13(6): 697-700. [http://dx.doi.org/10.1089/cyber.2009.0445] [PMID: 21142995]
Zhang C. Satisficing in Web Surveys: Implications for Data Quality and Strategies for Reduction 2013.
Stanley N, Jenkins S. Watch what I do: Using graphical input controls in Web surveys. In: Challenges of a Changing World: Proceedings of the Fifth International Conference of the Association for Survey Computing; 2007; pp. 81-92.