Abstract
Surveys that require users to evaluate or make judgments about information systems and their effect on specific work activities can produce misleading results if respondents do not interpret or answer questions in the ways intended by the researcher. This paper provides a framework for understanding both the cognitive activities and the errors and biases in judgment that can result when users are asked to categorize a system, explain its effects, or predict their own future actions and preferences with respect to use of a system. Specific suggestions are offered for wording survey questions and response categories so as to elicit more precise and reliable responses. In addition, possible sources of systematic bias are discussed, using examples drawn from published IS research. Recommendations are made for further research aimed at better understanding how and to what extent judgment biases could affect the results of IS surveys.
Publication Info
- Year: 1994
- Type: Article
- Volume: 5
- Issue: 1
- Pages: 48-73
- Citations: 229
- Access: Closed
Identifiers
- DOI: 10.1287/isre.5.1.48