
The trials of evidence-based education: the promises, opportunities and problems of trials in education


In the last six years, UK education research funding has been transformed. The Education Endowment Foundation (EEF), established by the Sutton Trust in 2011 with £125 million of funding, has funded more than 145 randomised controlled trials (RCTs), each costing an average of around £500k (EEF, 2015). By way of comparison, since 2012 the Economic and Social Research Council has spent just £4.1 million on open-call education research grants (ESRC, 2018). This makes the EEF by far the largest UK funder of education research, and its focus on RCTs represents a major change for education research and for mathematics education in particular: of the 263 articles published in the eight leading mathematics education journals in 2012, only eight reported studies with random allocation into groups (Alcock, Gilmore, & Inglis, 2013).

Such a major change calls for reflection. Is this new focus on RCTs a positive development? Gorard, See, and Siddiqui's (2017) book offers just such reflection. Its authors strongly support the increased use of RCTs, arguing that "much of the published research on education is of such poor quality that it might do more harm than good" (p. 4), that mostly it "is of no consequence or use for any real life purpose" and therefore that it "can safely be ignored" (p. 5). They argue that most studies do not involve randomisation into groups (so causality cannot be inferred), that too many educational interventions are evaluated by reference to anecdotes or student satisfaction surveys, and that many educational research designs lack appropriate comparison or control groups. Thus, they claim, more well-conducted RCTs are needed if we are to draw genuinely causal conclusions about "what works".

Given these comments, one might expect Gorard et al. to be delighted by the recent emergence of the EEF. But the authors' position is more nuanced. While they believe that the increase in RCTs has led to "considerable progress" (p. 18), they spend much of the book critiquing the methods commonly adopted by the EEF's RCT researchers and advocating their own alternatives. These alternative methods will, Gorard et al. suggest, allow education researchers to establish what works. I agree with Gorard et al.'s overall view about the need for better evidence, but see serious problems with both their critique of traditional RCT methods and their apparent philosophical views on the purpose of research. These problems are explored in the remainder of this review.

Keywords: research; mathematics; education research; evidence; education; EEF

Journal Title: Research in Mathematics Education
Year Published: 2018
