
(1)

Jean Philippe Decieux, Université du Luxembourg

Alexandra Mergener, Federal Institute for Vocational Education and Training (BIBB)

Kristina Neufang, University of Trier

Philipp Sischka, Université du Luxembourg

Higher response rates at the expense of validity?

The consequences of the implementation of forced answering options within online surveys!

Contact: mergener@bibb.de

General Online Research Conference GOR 15

18-20 March 2015, Cologne University of Applied Sciences, Germany

This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/)

Suggested citation: Decieux, J.P., Mergener, A., Neufang, K., Sischka, P. 2015. “Higher response rates at the expense of validity? The consequences of the implementation of forced answering options within online surveys!.” General Online Research (GOR) Conference, Cologne.

(2)

Higher response rates at the expense of validity?  

The consequences of the implementation of forced answering options within online surveys!

 

Jean Philippe Decieux, Alexandra Mergener, Kristina Neufang and Philipp Sischka

 

Due to the low cost and the ability to reach thousands of people in a short amount of time, online surveys have become well established as a source of data for research. As a result, many non-professionals gather their data through online questionnaires, which are often of low quality because they have been operationalised poorly.

A popular example of this is the 'forced response' option, whose impact is analysed in this research project. The 'forced response' option is commonly described as the possibility to force the respondent to give an answer to every question that is asked. In most online survey software, it is enabled simply by ticking a checkbox.
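To make the mechanism concrete, here is a minimal sketch of how a survey backend might refuse to accept a page until every required question is answered. This is purely illustrative: the function and question names are hypothetical, and it is not the software used in this study.

```python
# Minimal sketch of a hypothetical forced-response check in a survey backend.
# All names are illustrative; this is not the software used in the study.

def validate_submission(answers: dict, questions: list) -> list:
    """Return the ids of required questions that were left blank."""
    missing = []
    for question in questions:
        if not question.get("required", False):
            continue  # question may be skipped
        answer = answers.get(question["id"])
        if answer is None or (isinstance(answer, str) and answer.strip() == ""):
            missing.append(question["id"])
    return missing

questions = [
    {"id": "q1_victim", "text": "Have you ever been the victim of a crime?", "required": True},
    {"id": "q2_age", "text": "How old are you?", "required": False},
]

# With the forced-response option, the page is not accepted until 'missing' is empty.
print(validate_submission({"q1_victim": ""}, questions))  # ['q1_victim']
```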

Relevance: 

There has been a tremendous increase in the use of this option; however, those who run such surveys are often not aware of its possible consequences. In software manuals, the option is praised as a strategy that significantly reduces item non-response.

In contrast, research studies raise many doubts about this strategy. These doubts rest on the assumption that respondents typically have plausible reasons for not answering a question, such as not understanding the question, the absence of an appropriate answer category, or personal reasons (e.g. privacy).

Research Question: 

Our thesis is that forcing the respondents to select an answer might lead to two scenarios:

‐ Increasing unit non‐response (increased dropout rates) 

‐ Decreasing validity of the answers (lying or random answers). 

Methods and Data: 

To analyse the consequences of implementing the 'forced response' option, we use split-ballot field experiments. Our analysis focuses in particular on dropout rates and response behaviour. Our first split-ballot experiment was carried out in July 2014 (n = 1056), and a second experiment is planned for February 2015, so that we will be able to present results based on strong evidence.
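As a sketch of the assignment logic behind such a split-ballot experiment (illustrative only: respondent ids and version labels are assumptions; the balanced split simply matches the equal group sizes reported in the results):

```python
import random

# Illustrative sketch of a balanced split-ballot assignment to the three
# questionnaire versions described in this project; respondent ids are hypothetical.
VERSIONS = (
    "I: no forced response",
    "II: forced response",
    "III: forced response + neutral category",
)

def assign_versions(respondent_ids, seed=2014):
    """Shuffle respondents and deal them round-robin into the three versions."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    return {rid: VERSIONS[i % len(VERSIONS)] for i, rid in enumerate(ids)}

assignment = assign_versions(range(1056))  # 1056 respondents -> 352 per version
```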

First results: 

If the respondents are forced to answer each question, they will

- abandon the survey earlier, and

- choose the response category "No" more often (for sensitive questions).

(3)

Contact details:

Jean Philippe Pierre Décieux
Doctoral candidate, Research Unit INSIDE, Université du Luxembourg
Tel.: 0170-5887352
jeanphilippe.decieux@uni.lu

Alexandra Mergener
Research Associate, Federal Institute for Vocational Education and Training (BIBB)
Tel.: 0160-5530345
mergener@bibb.de

Kristina Neufang
Project assistant (Chair of Methodology and Empirical Social Research), University of Trier
Tel.: 017697901307
s4krneuf@uni-trier.de

Philipp Sischka
Research Associate, Research Unit INSIDE, Université du Luxembourg
Tel.: 0176-20500543
philipp.sischka@uni.lu

 

(4)

Split-Ballot Design

Version I: without forced-response option
Version II: with forced-response option
Version III: with forced-response option and a neutral category

Higher response rates at the expense of validity?

Consequences of the implementation of forced-response options within online surveys

We currently observe a tremendous increase in the use of this option, often without awareness of its possible consequences.

In contrast, researchers have raised a large number of doubts about and objections to this strategy (Kaczmirek 2005; Peytchev/Crawford 2005; Stiegler/Reips/Voracek 2007; Dillman/Smyth/Christian 2009; Albaum et al. 2010; Schnell/Hill/Esser 2011; Jacob/Heinz/Décieux 2013).

The 'forced response' option is commonly described as the possibility to force the respondent to give an answer to every question that is asked. In most online survey software, it can easily be realised by enabling a checkbox.


Theoretical background

Our major thesis is that forcing the respondents to answer might cause the following scenarios:

I. Increasing unit non-response (Increasing dropout rates)

II. Decreasing validity of the answers (lying or random answers)

III. Offering a neutral or "do not want to answer this question" option weakens these effects

Research Design & Analytical Strategy

Albaum, G., Roster, C.A., Wiley, J., Rossiter, J., Smith, S.M. (2010): Designing Web Surveys in Marketing Research: Does Use of Forced Answering Affect Completion Rates? In: The Journal of Marketing Theory and Practice Vol. 18, No. 3, S. 285-294. Decieux, J.P. (2012):Modeeffekte bei Onlineumfragen. Ein multivariater Methodenvergleich unter Zuhilfenahme eines Propensity Score. Décieux, J. P., Hoffmann, M. (2015): Antwortdifferenzen im Junk & Crime Survey: Ein Methodenvergleich mit goffmanscher Interpretation. I n: Vielfalt und Zusammenhalt. Deutsche Gesellschaft für Soziologie Kongressband 2012 . Dillman, D.A., Smyth, J.D., Christian, L.M. (2009):Internet,mail, and mixed-mode surveys. The tailored design method, Hoboken. Göritz, A.S. (2004)The impact of material incentives on response quantity, response quality, sample composition, survey outcome, and cost in online access panels. In: International Journal of Market Research Vol. 46 Quarter 3. Jacob, R., Heinz, A., Décieux, J.P. (2013): Umfrage – Einführung in die Methode der Umfrageforschung. München. Kaczmirek, L. (2005): Web Surveys. A Brief Guide on Usability and Implementation Issues. Meier, G. et al. (2005):

Steigerung der Ausschöpfungsquote durch geschickte Einleitungstexte, in Zuma-Nachrichten 57, Jg. 29, S. 37-55.Peytchev, A., Crawford, S. (2005):A Typology of Real-Time Validations in Web-Based Surveys. In Social Science Computer Review, Vol. 23, No. 2, S. 235-249.Roster, C.A., Albaum, G. Smith, S.M. (2014): Topic Sensitivity and Internet Survey Design: A Cross-Cultural/National Study. In: Journal of Marketing Theory and Practice, Vol. 22, No. 1, S. 91-102.Schnell, R., Hill, P., Esser, E. (2011):Methoden der empirischen Sozialforschung. München.Stiegler, S., Reips, U.-D., Voracek, M. (2007):Forced-response in online surveys: Bias from reactance and an increase in sex-specific dropouts. In: Journal of the American Society for Information Science and Technology, Vol. 58, No. 11, S. 1653-1660.

The major objective of the project is to analyse the consequences of the implementation of the forced response option by using split ballot field experiments. Our analysis especially focuses on drop out rates, response times and on the validity of given answers. Our first split ballot experiment was in July 2014 (n=1056)

Results

Field time: 12.06.2014 to 12.07.2014; population: students and employees of the Hochschule Trier and the Universität Trier; topic: perception of delinquency, with a special focus on the Uli Hoeneß tax evasion case, including some very personal and sensitive questions about experiences and behaviour in the context of delinquency; incentives: total value €350 (financed by the Online Research Fund, DGOF); sample: N = 1056, 37.3% female / 62.7% male participants, mean age 27 years.

Jean Philippe Décieux, Alexandra Mergener, Kristina Neufang & Philipp Sischka

Regarding hypothesis I:

          Version 1      Version 2      Version 3
Dropouts  56 (15.9%)     55 (15.6%)     62 (17.6%)
Finished  296 (84.1%)    297 (84.4%)    290 (82.4%)
Total     352            352            352

These hypotheses are based on the assumption that respondents normally have plausible reasons for not answering a question, e.g.

• not understanding the question,

• the absence of an appropriate answer category,

• or personal reasons such as privacy.

→ Forcing the respondents to answer might therefore result in reactance.

Interpretation of hypothesis I:

There is no correlation between the total dropout rate and the forced-response option, but the moment of dropout differs over the course of the survey.
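As an illustration of how such a comparison can be checked, here is a short sketch of a chi-square test of independence on the counts from the dropout table above. This is only an assumed analysis, not necessarily the test used by the authors.

```python
from scipy.stats import chi2_contingency

# Dropouts vs. completions per split-ballot version (counts from the dropout table above).
#            Version 1  Version 2  Version 3
table = [
    [56, 55, 62],     # dropouts
    [296, 297, 290],  # finished
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# A large p-value is consistent with the conclusion above that the total dropout
# rate does not differ systematically between the three versions.
```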

[Figure: "Dropout rates during the online interview": number of dropouts over the course of the questionnaire for Versions 1-3; annotated positions include the second big item battery ("attitudes towards the Uli Hoeneß tax evasion") and the questions "Are you yourself a crime victim?" and "Which kind of victim experiences have you had?"]

Interpretation of hypothesis II and III:

If there is no possibility to leave the question blank or to choose a neutral option, respondents tend to answer sensitive questions with "No" more often. One reason might be that they do not want to disclose the sensitive information or to incriminate themselves.

Own criminal offences   Version I   Version II   Version III
Yes                     26.4%       19.5%        20.7%
No                      72.3%       80.5%        73.1%
Not specified           -           -            6.2%
Not answered            1.4%        -            -
Total (finished)        296         297          290
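To illustrate how this difference can be examined, the sketch below rebuilds approximate counts from the percentages in the table (rounded to whole respondents, so only indicative) and tests whether the "Yes" share differs between version I and version II. The test choice is an assumption, not the authors' documented procedure.

```python
from scipy.stats import chi2_contingency

# Approximate Yes/No counts reconstructed from the percentages above
# (rounded to whole respondents, so the result is only indicative).
yes_no = [
    [78, 214],  # Version I  (answer could be skipped): ~26.4% yes, ~72.3% no of 296
    [58, 239],  # Version II (forced response):         ~19.5% yes, ~80.5% no of 297
]

chi2, p, dof, _ = chi2_contingency(yes_no)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
# A lower 'Yes' share under forced response would support hypothesis II:
# respondents avoid the sensitive admission when they cannot skip the question.
```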

• The consequences are analysed on the basis of dropout rates and response latencies.

• The effects of the neutral or "do not want to answer this question" option should also be analysed.

  o This additional option frees the respondents from the obligation to answer every question that is asked.

  o Here we are especially interested in how often this option is chosen and how the response latencies differ between the experimental and the control group (see the sketch below).
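A possible way to compare such response latencies is sketched below. The latency values are hypothetical placeholders, and the rank-based test is an assumption; the authors' actual analysis strategy may differ.

```python
from scipy.stats import mannwhitneyu

# Hypothetical response latencies (in seconds) for one sensitive question;
# placeholder values only, not data from the experiment.
latencies_forced = [4.1, 5.8, 3.9, 7.2, 6.5, 4.4]   # version with forced response
latencies_control = [5.0, 8.3, 6.1, 9.4, 7.7, 6.8]  # version without forced response

stat, p = mannwhitneyu(latencies_forced, latencies_control, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
# Response latencies are typically skewed, so a rank-based comparison is a
# reasonable default for this kind of check.
```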

Contact: jeanphilippe.decieux@uni.lu, mergener@bibb.de, s4krneuf@uni-trier.de, philipp.sischka@uni.lu


Conclusion:

The effect on dropout rates is not obvious; this is in line with the recent results of Roster, Albaum and Smith (2014) and might also be caused by the experimental design of the study, especially by the high incentives.

However, our first results show that forcing respondents to answer questions leads them to abandon the survey earlier and more often, and, for sensitive issues, to choose a neutral response category more often if one is offered in the design.

Implementing a forced-response option therefore seems to have a negative influence on dropout rates and on the validity of answers, and should only be used in very specific cases.

Critical Reflection:

In general, there is a high tendency to finish the survey. Possible reasons are:

− a homogeneous sample with specific characteristics, e.g. a high affinity for surveys (e.g. Jacob/Heinz/Décieux 2013; Schnell/Hill/Esser 2011),

− high monetary incentives (e.g. Göritz 2004),

− a mode effect of the online survey, e.g. a higher disposition to answer sensitive questions (e.g. Décieux 2012; Décieux/Hoffmann 2015),

− an interesting main topic (Uli Hoeneß) and a detailed, neutral introduction with many "ice-breaker" questions (e.g. Jacob/Heinz/Décieux 2013; Meier et al. 2005).

→ Lower probability of dropouts in all versions.
