Article

Perspectives on Evidence-Based Research in Education: What Works? Issues in Synthesizing Educational Program Evaluations

Journal

EDUCATIONAL RESEARCHER
Volume 37, Issue 1, Pages 5-14

Publisher

SAGE PUBLICATIONS INC
DOI: 10.3102/0013189X08314117

Keywords

evidence-based reform; meta-analysis; research review; What Works Clearinghouse


Abstract

Syntheses of research on educational programs have taken on increasing policy importance. Procedures for performing such syntheses must therefore produce reliable, unbiased, and meaningful information on the strength of evidence behind each program. Because evaluations of any given program are few in number, syntheses of program evaluations must focus on minimizing bias in reviews of each study. This article discusses key issues in the conduct of program evaluation syntheses: requirements for research design, sample size, adjustments for pretest differences, duration, and use of unbiased outcome measures. It also discusses the need to balance factors such as research designs, effect sizes, and numbers of studies in rating the overall strength of evidence supporting each program.
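
The abstract mentions effect sizes and adjustments for pretest differences. The sketch below is a minimal illustration of one common convention for a pretest-adjusted effect size (subtracting the pretest mean difference from the posttest mean difference and dividing by the pooled posttest standard deviation); the function name and this specific adjustment are assumptions made for illustration, not necessarily the procedure the article prescribes.

```python
import statistics

def adjusted_effect_size(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Illustrative pretest-adjusted effect size (hypothetical helper).

    One common convention: adjust the posttest mean difference by the
    pretest mean difference, then divide by the pooled posttest SD.
    """
    post_diff = statistics.mean(treat_post) - statistics.mean(ctrl_post)
    pre_diff = statistics.mean(treat_pre) - statistics.mean(ctrl_pre)

    # Pool the (sample) posttest standard deviations of the two groups.
    sd_t = statistics.stdev(treat_post)
    sd_c = statistics.stdev(ctrl_post)
    n_t, n_c = len(treat_post), len(ctrl_post)
    pooled_sd = (((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                 / (n_t + n_c - 2)) ** 0.5

    return (post_diff - pre_diff) / pooled_sd

# Example with small made-up score samples:
# d = adjusted_effect_size([10, 12, 11], [15, 17, 16], [10, 11, 12], [13, 14, 12])
```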

Authors

Robert E. Slavin

