The teaching of thinking skills is an explicit part of the National Curriculum in England and Wales. It contributes directly to Department for Education and Skills (DfES) initiatives such as Teaching and Learning in the Foundation Subjects (DfES, 2004a) and Leading in Learning at Key Stage 3 (DfES, 2005), which emphasise the value of thinking skills approaches for promoting effective questioning and extending pupils’ oral responses in classrooms, as well as their potential contribution to assessment for learning. Thinking skills are also an important part of the developing Primary National Strategy aims (DfES, 2004b). However, thinking skills do not form a discrete and explicit programme of study, and they appear unevenly across the different subjects of the National Curriculum, which makes it challenging for schools to ensure progression in their teaching.
Our working definition for the purposes of this review is that thinking skills interventions are approaches or programmes which identify for learners translatable mental processes and/or which require learners to plan, describe and evaluate their thinking and learning. These can therefore be characterised as approaches or programmes which:
- require learners to articulate and evaluate specific learning approaches; and/or
- identify specific cognitive, and related affective or conative processes that are amenable to instruction.
This definition is consistent with the definition used to identify and analyse thinking skills frameworks and taxonomies in other work undertaken by the Centre for Learning and Teaching (for example, Moseley et al., 2004, 2005a, 2005b).
A thinking skills approach therefore not only specifies what is to be taught but also how it is taught: the content of lessons and the teaching approach form an integral part of thinking skills approaches to teaching and learning. Examples of programmes and approaches commonly used in schools are instrumental enrichment (Feuerstein et al., 1980), philosophy for children (Lipman et al., 1980), cognitive acceleration through science education (CASE) (Adey et al., 1995), and Somerset thinking skills (Blagg et al., 1988). Considerable interest has also been shown by teachers and policymakers in how these more formal programmes can be integrated effectively or ‘infused’ into teaching approaches and adopted more widely by teachers (Leat and Higgins, 2002; McGuinness, 1999; McGuinness et al., 1995).
A meta-analysis was needed for the following reasons:
- to provide potential users with an estimate of the relative impact of thinking skills interventions, thus extending the scope of earlier reviews which have attempted to evaluate evidence from a range of thinking skills approaches (for example, Sternberg and Bhana, 1986); or which have focused on a particular programme (such as Romney and Samuels’ (2001) meta-analysis of evidence of the impact on learners of Feuerstein’s instrumental enrichment);
- to quantify the impact of thinking skills interventions in order to test the conclusions of the mainly positive but descriptive reviews in the UK (for example, Higgins et al., 2004; McGuinness, 1999; Wilson, 2000);
- to compare the impact of thinking skills interventions with other researched educational interventions (for example, Hattie, 1992; Marzano, 1998; Sipe and Curlette, 1997).
In our first review, we identified and described 191 studies available up to 2002. We used narrative synthesis methods to address the question of the impact of thinking skills interventions on pupils. Twenty-three studies were included and reviewed in depth, following EPPI Centre reviewing guidelines.
This review concluded that the selection and implementation of thinking skills approaches needed to be based on more precise information on their effectiveness and efficiency. Meta-analysis is a method for pooling the quantitative estimates of effects of interventions from multiple studies to give a more reliable and precise estimate of their benefits (or potential harm). Comparing these estimates across different types of interventions can also pinpoint which aspects of interventions offer the most potential in the classroom. Meta-analysis is proving a useful approach to the key question for practitioners interested in thinking skills: ‘what works’ in education (for example, Hattie et al., 1996; Marzano et al., 2001; Sipe and Curlette, 1997).
Aims and review questions
The overall aim of the Thinking Skills Review Group is to investigate the impact of thinking skills interventions on teaching and learning in classrooms over a series of focused reviews.
Our main review question is as follows:
What is the impact of the implementation of thinking skills interventions on teaching and learning?
For this review, the central question was narrowed to focus on the quantitative impact of thinking skills interventions on pupils, so as to provide a quantitative synthesis of the evidence in this area:
What is the quantitative evidence for impact on pupils’ attainment and attitudes in schools?
Relevant studies in the area of thinking skills were obtained by systematically searching a number of online databases of educational research literature, by identifying references in reviews and other relevant books and reports, and from contacts with expertise in this area. Twenty-six of the studies identified for this review were obtained from the database which resulted from the first thinking skills review (Higgins et al., 2004); a further three resulted from updating the original search and applying the more stringent criteria required for a quantitative synthesis.
Studies were selected for the meta-analysis if they had sufficient quantitative data to calculate an effect size (relative to a control or comparison group of pupils) and if the number of research subjects was greater than 10. Effect sizes were calculated from the reported data and combined statistically using quantitative synthesis.
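The calculation and statistical combination of effect sizes can be sketched as follows. This is a generic illustration of the standardised mean difference (Cohen’s d) and inverse-variance (fixed-effect) pooling, not the review’s actual code, and the study data below are invented for illustration only.

```python
import math

def cohens_d(m_treat, m_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardised mean difference (Cohen's d) between intervention and control."""
    pooled_sd = math.sqrt(((n_treat - 1) * sd_treat ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                          / (n_treat + n_ctrl - 2))
    return (m_treat - m_ctrl) / pooled_sd

def variance_of_d(d, n_treat, n_ctrl):
    """Approximate sampling variance of d (standard large-sample formula)."""
    return (n_treat + n_ctrl) / (n_treat * n_ctrl) + d ** 2 / (2 * (n_treat + n_ctrl))

def fixed_effect_pool(effects):
    """Inverse-variance weighted (fixed-effect) pooled estimate from (d, variance) pairs."""
    weights = [1 / v for _, v in effects]
    return sum(w * d for (d, _), w in zip(effects, weights)) / sum(weights)

# Invented example data: (mean_t, mean_c, sd_t, sd_c, n_t, n_c) for each study
raw = [(58, 50, 12, 12, 40, 38), (31, 27, 8, 9, 25, 25), (74, 66, 15, 14, 60, 55)]
effects = []
for m_t, m_c, sd_t, sd_c, n_t, n_c in raw:
    d = cohens_d(m_t, m_c, sd_t, sd_c, n_t, n_c)
    effects.append((d, variance_of_d(d, n_t, n_c)))

print(round(fixed_effect_pool(effects), 2))  # prints 0.57
```

The inverse-variance weighting means that larger, more precise studies contribute more to the pooled estimate, which is why studies with fewer than 10 subjects (or with insufficient reported data to compute d at all) were excluded.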
Twenty-nine studies were identified which contained quantitative data on pupils’ attainment and attitudes suitable for meta-analysis. The studies come from a range of countries around the world with half set in the US or UK. The studies broadly cover the ages of compulsory schooling (5–16) and include studies set in both primary and secondary schools. A number of named thinking skills interventions are included, such as Feuerstein’s instrumental enrichment (FIE) and cognitive acceleration through science education (CASE) as well as studies which report a more general thinking skills approach (such as the development of metacognitive strategies).
The quantitative synthesis indicates that thinking skills programmes and approaches are effective in improving pupils’ performance on cognitive measures (such as Raven’s progressive matrices), with an overall effect size of 0.62. (An effect of this size would move a class ranked at 50th place in a league table of 100 similar classes up to 26th place, a percentile gain of 24 points.) These approaches also have a considerable impact on curricular outcomes, with the same effect size of 0.62. The overall effect size (including cognitive, curricular and affective measures) was 0.74.
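The conversion from an effect size to a percentile gain assumes normally distributed outcomes: the average pupil in the intervention group reaches the Φ(d) percentile of the control distribution. A minimal sketch of this arithmetic, using only the standard library; the exact figure for d = 0.62 is about 23 points, which rounding to whole league-table places brings in line with the figures quoted above.

```python
import math

def percentile_gain(d):
    """Percentile points gained by the average intervention pupil over the
    control median, assuming normally distributed outcomes: Phi(d)*100 - 50."""
    phi = 0.5 * (1 + math.erf(d / math.sqrt(2)))  # standard normal CDF
    return phi * 100 - 50

print(round(percentile_gain(0.62)))  # prints 23
```

The same function shows why effect sizes are non-linear in percentile terms: a gain from d = 0 to d = 0.62 moves a pupil much further up the distribution than the same increment starting from d = 1.5.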
Overall, the quantitative synthesis indicates that, when thinking skills programmes and approaches are used in schools, they are effective in improving pupils’ performance on a range of tested outcomes (relative to those who did not receive thinking skills interventions). The magnitude of the gains found appears to be important when compared with the reported effect sizes of other educational interventions.
This review found an overall mean effect size of 0.62 for the main (cognitive) effect of each of the included studies. This is larger than the mean of 0.4 across Hattie’s extensive database of meta-analyses (Hattie, 1999), but very similar to the overall figure of 0.65 reported by Marzano (1998, p 76) for interventions across the knowledge, cognitive, metacognitive and self-system domains. In particular, our study identified metacognitive interventions as having relatively greater impact, a finding similar to Marzano’s.
Looking at a smaller part of our review, Feuerstein’s instrumental enrichment is one of the most extensively researched thinking skills programmes. Our results broadly concur with those of Romney and Samuels (2001), whose meta-analysis found moderate overall effects and an effect size of 0.43 on reasoning ability (p 28). Our findings were of the same order, with an overall effect size of 0.58 (one main effect from each of seven included studies) and an effect size of 0.52 on tests of reasoning (one main effect from each of four studies).
There is some indication that the impact of thinking skills programmes and approaches may vary according to subject. In our analysis, there was relatively greater impact on tests of mathematics (0.89) and science (0.78) than on tests of reading (0.40).
Thinking skills programmes have been used extensively around the world for a number of years. There is a growing body of evidence that, when researched in school settings, they are effective at improving pupils’ performance on cognitive and curriculum tests. Their effect is relatively greater than that of most other researched educational interventions. This review strengthens that evidence base.
For practitioners, thinking skills programmes and approaches are likely to improve pupils’ learning, and their use in schools should therefore be supported. Some caution is required, as the impact of such approaches varies according to subject, age and gender. This suggests that their use needs to be matched to the particular teaching context and monitored critically to ensure that the potential benefits are realised.
For policy-makers, thinking skills programmes and approaches are an effective way to improve teaching and learning, and their use in schools should be encouraged. However, as it is not clear to what extent the benefits are due to specific aspects of the content of the programmes and their implementation or the changes in teaching and learning which ensue, it is not possible to provide precise recommendations.
Further research is needed to clarify the particular causes of the benefits and where thinking skills programmes and approaches have most impact (such as on different age groups or in different areas of the curriculum). In particular, the impact of thinking skills programmes and approaches on teaching and learning processes needs to be related to improvements in outcomes to identify the means by which the impact occurs.
Researchers and journal editors should note that studies which appeared relevant to the review often had to be excluded because basic information, such as the number of pupils involved, was not included in the published papers. Moreover, details of sampling strategies and full sets of results were frequently omitted. Abstracts sometimes referred to data which were not then reported in detail. Journals which used structured abstracts were more likely to contain accurate and complete information to support the systematic reviewing process.
Strengths and limitations
Strengths:
- The extent of the literature included in the review process, building on the mapping of this literature and narrative synthesis completed for the first review
- The use of meta-analysis to provide a quantitative synthesis of the research literature and an overall estimate of the impact of such approaches, together with an interpretation which contextualises thinking skills research within the broader field of educational research
- The close involvement of practitioner user groups in setting and refining the questions, and in interpreting and disseminating the findings
Limitations:
- Studies often reported little about the programmes themselves or aspects of their implementation and use in classrooms (such as changes in teaching and learning processes). It is therefore difficult to draw conclusions about any common features of programmes and approaches which may account for the positive impact reported.
- The review used a broad definition of thinking skills for its focus. As a result, there was considerable statistical heterogeneity in the results of the studies, which indicates that caution is required in combining the effects and interpreting the findings.
- We were only able to identify and synthesise 29 studies within the timescale and resources for the review. A larger number of studies would enable further analysis (such as by age or subject) to make more specific recommendations for practitioners and policy-makers.
- Meta-analysis, or quantitative synthesis, is subject to a number of limitations and criticisms; this review is therefore open to such critique (see, for example, Chambers, 2004; Kulik and Kulik, 1989a).
Adey PS, Shayer M, Yates C (1995) Thinking Science: The Curriculum Materials of the CASE Project. London: Thomas Nelson and Sons.
Blagg N, Ballinger M, Gardner R (1988) Somerset Thinking Skills Course Handbook. Oxford: Basil Blackwell.
Feuerstein R, Rand Y, Hoffman MB, Miller R (1980) Instrumental Enrichment: an Intervention Programme for Cognitive Modifiability. Baltimore: University Park Press.
Hattie JA (1992) Towards a model of schooling: a synthesis of meta-analyses. Australian Journal of Education 36: 5–13.
Hattie J, Biggs J, Purdie N (1996) Effects of learning skills interventions on student learning: a meta-analysis. Review of Educational Research 66: 99–136.
Hattie J (1999) Influences on student learning. Inaugural lecture. Professor of Education, University of Auckland. August 2. Available at: http://www.arts.auckland.ac.nz/edu/staff/jhattie/Inaugural.html.
Kulik JA, Kulik C-LC (1989a) The concept of meta-analysis. International Journal of Educational Research 13: 227–340.
Leat D, Higgins S (2002) The role of powerful pedagogical strategies in curriculum development. The Curriculum Journal 13: 71–85.
Lipman M, Sharp A, Oscanyan F (1980) Philosophy in the Classroom. Philadelphia: Temple University Press.
Marzano RJ (1998) A Theory-Based Meta-Analysis of Research on Instruction. Aurora, Colorado: Mid-continent Regional Educational Laboratory.
Marzano RJ, Pickering DJ, Pollock JE (2001) Classroom Instruction That Works. Alexandria Va: Association for Supervision and Curriculum Development (ASCD).
McGuinness C, Wylie J, Greer B, Sheehy NAF (1995) Developing children's thinking: a tale of three projects. Irish Journal of Psychology 16: 378–388.
McGuinness C (1999) From Thinking Skills to Thinking Classrooms: A Review and Evaluation of Approaches for Developing Pupils' Thinking. Nottingham: DfEE Publications.
Moseley D, Baumfield V, Higgins S, Lin M, Miller J, Newton D, Robson S, Elliott J, Gregson M (2004) Thinking Skill Frameworks for Post-16 Learners: An Evaluation. A research report for the Learning and Skills Research Centre. Trowbridge: Cromwell Press. Available from: http://www.lsda.org.uk/files/pdf/1541.pdf.
Moseley D, Baumfield V, Elliott J, Higgins S, Miller J, Newton DP (2005a) Frameworks for Thinking: A Handbook for Teaching and Learning. Cambridge: Cambridge University Press.
Moseley D, Elliott J, Gregson M, Higgins S (2005b) Thinking skills frameworks for use in education and training. British Educational Research Journal 31: 81–101.
Romney DM, Samuels MT (2001) A meta-analytic evaluation of Feuerstein’s Instrumental Enrichment program. Educational and Child Psychology 18: 19–34.
Sipe T, Curlette WL (1997) A meta-synthesis of factors related to educational achievement: a methodological approach to summarizing and synthesizing meta-analyses. International Journal of Educational Research 25: 583–698.
Sternberg RJ, Bhana K (1986) Synthesis of research on the effectiveness of intellectual skills programs: snake-oil remedies or miracle cures? Educational Leadership 44: 60–67.
Wilson V (2000) Education Forum on Teaching Thinking Skills Report. Edinburgh: Scottish Executive. Available from: http://www.scotland.gov.uk/library3/education/ftts-00.asp.
This report should be cited as: Higgins S, Hall E, Baumfield V, Moseley D (2005) A meta-analysis of the impact of the implementation of thinking skills approaches on pupils. In: Research Evidence in Education Library. London: EPPI Centre, Social Science Research Unit, Institute of Education, University of London.