Individual
Turner, R. C., Reimers, J., Lo, W., Jozkowski, K., & Crawford, B. (2020, April 17-21). Comparison of data quality indicator effectiveness with unidirectional vs. bidirectional scales in survey research. American Educational Research Association (AERA) Annual Meeting. San Francisco, California.
Survey data collected via an online panel sampler were evaluated for response quality using six data quality indicators. Characteristics of insufficient effort responding (IER) that were assessed included random responding, inattentiveness, non-differentiation, and speeding. Five reliability and consistency measures were compared for two types of scales (one with a unidirectional item set and one with a bidirectional item set) across three sample groups: the complete sample, subgroups identified as exhibiting IER on each data quality indicator, and subgroups not so identified. Results indicate that, for the data quality indicators selected in this study, a scale of bidirectional items used as an outcome variable identifies data-quality impacts more clearly than a scale of unidirectional items. A person-level reliability index incorporating item subsets from both the unidirectional and bidirectional item groups was among the most effective data quality indicators, based on subgroup-level scale consistency and reliability values.
Reimers, J., Turner, R. C., & Jozkowski, K. N. (2020, April 17-21). Demographic comparisons for various types of low cognitive effort survey responses. American Educational Research Association (AERA) Annual Meeting. San Francisco, California.
The use of web-based surveys for collecting self-report data is convenient for both clients and researchers; however, their use can reduce the likelihood of population representation and increase the risk of poor data quality. Evaluating data quality in participant responses has resulted in different proportions of participant subgroups being flagged; however, these differences are not consistent across studies and survey topics. Setting aside participants with low-quality data can alter sample characteristics, as some subgroups may be reduced at higher rates than others. In this study, the incidence of low data quality is compared across demographic subgroups for three data collections measuring the complex social topic of abortion attitudes. Results indicate that younger participants are more frequently identified as having low-quality data across a variety of indicators. Black or African American participants were also flagged for low-quality data at a slightly higher rate. To address underrepresentation in the sample and associated analyses, we recommend oversampling subgroups that have been shown to exhibit more aberrant response behaviors for the topic of interest. The extent to which our specific topic, abortion, may influence these results is also discussed.
Keywords: Insufficient effort responding, Low-quality data, Data screening, Survey research, Quota sampling, Person-fit, Online surveys, Careless responding, Random responding, Response time