What we talk about when we talk about evidence
By Jean Rhodes
Researchers, practitioners, and policymakers tend to define key terms—“empirical research” and “evidence”—quite differently. Whereas most researchers use the two terms interchangeably to mean “findings derived from empirical methods,” studies suggest that practitioners working in the field tend to define evidence more broadly, as stemming not only from scientific methods but also from consumer satisfaction surveys; feedback from parents, youth, and communities; commentary and personal stories; and other sources. Practitioners’ embrace of less-rigorous evaluations may stem in part from understandable frustration with researchers’ seemingly slow and ponderous pace. Researchers’ tendency to poke and prod at topics that can seem arcane and irrelevant can frustrate practitioners who have pressing concerns. Efforts to be completely transparent about study limitations can seed doubt and misinterpretation about the findings. Add to these communication gaps the growing antipathy toward empirical evidence and, in the era of accusations about “fake news,” a suspicion that experts may have a hidden agenda that leads them to play fast and loose with their statistics. Fewer than 20 percent of respondents in one large survey indicated that they trusted scientists in general, and statistics, in particular, appear to draw skepticism.
But even so, I believe that empirical research has a more valuable role than ever to play in guiding us through this age of misinformation and polarization. At its core, empiricism is the philosophical stance that knowledge comes primarily from sensory experience and evidence, not beliefs. In the social and behavioral sciences, this translates to a commitment to basing our understanding on observable, measurable phenomena rather than speculation, values, background reading, lived experiences, or intuition. Empirical research involves systematic observation, data collection, and analysis to test hypotheses and draw conclusions about underlying processes. This is important because it saves us from well-intended but ineffective interventions. Indeed, the field of mentoring is littered with curricula and trainings that, however well-intentioned, have never been tested. An empirical approach allows researchers to identify unintended consequences, isolate the active ingredients of successful interventions, refine and improve approaches over time, and allocate limited resources more effectively. Consider the case of Critical Incident Stress Debriefing (CISD), a widely adopted intervention aimed at preventing post-traumatic stress disorder in first responders. Despite its popularity, rigorous empirical studies revealed that CISD could actually increase the risk of PTSD symptoms in some individuals (McNally et al., 2003). Empirical research also plays a crucial role in advancing psychological theory. Data-driven approaches push us to refine and sometimes radically revise our understanding of human behavior. For instance, attachment theory has been continuously refined and expanded through decades of empirical research across various contexts and populations (Cassidy & Shaver, 2016).
The American Psychological Association (APA) has established rigorous standards for empirical research through its Journal Article Reporting Standards (JARS). These guidelines emphasize the importance of clear research questions, well-defined populations, precise measurements, and transparent reporting of methods and results. The JARS provide detailed instructions for reporting quantitative, qualitative, and mixed methods research, ensuring consistency and transparency across psychological studies. These standards require researchers to provide detailed information about their study design, participants, measures, procedures, and data analysis techniques. Importantly, empirical research isn’t limited to laboratory experiments. It encompasses a wide range of methodologies, including randomized controlled trials, longitudinal studies, cross-sectional surveys, qualitative interviews, participant observation, and action research. What unites these diverse approaches is their adherence to certain fundamental principles: systematic observation and data collection, clearly defined and operationalized variables, replicability and transparency of methods, statistical analysis or rigorous qualitative interpretation, and peer review and scrutiny.
Empirical research is not without its challenges. Recent years have seen a “replication crisis,” in which many high-profile studies failed to produce the same results when repeated. This has led to increased scrutiny of research practices and a push for more robust methodologies, including preregistration of studies, open data sharing, and improved statistical practices. Another ongoing debate centers on the generalizability of findings, particularly given the overreliance on WEIRD (Western, Educated, Industrialized, Rich, and Democratic) populations in much psychological research. This has spurred important efforts to diversify participant pools and consider cultural context more carefully in study design and interpretation.
Grounding our theories and interventions in solid empirical foundations doesn’t mean abandoning our values, creativity, or intuition; these remain as essential as ever to generating hypotheses and designing studies. But it does require a commitment to testing our ideas rigorously and being willing to revise them in light of new evidence.
References
American Psychological Association. (2020). Publication manual of the American Psychological Association (7th ed.). https://doi.org/10.1037/0000165-000
Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board task force report. American Psychologist, 73(1), 3-25. https://doi.org/10.1037/amp0000191
Cassidy, J., & Shaver, P. R. (Eds.). (2016). Handbook of attachment: Theory, research, and clinical applications (3rd ed.). Guilford Press.
McNally, R. J., Bryant, R. A., & Ehlers, A. (2003). Does early psychological intervention promote recovery from posttraumatic stress? Psychological Science in the Public Interest, 4(2), 45-79. https://doi.org/10.1111/1529-1006.01421