The first OECD PISA assessment of digital skills was published last week and, for learning technology enthusiasts, the report was difficult to ignore. We can’t do justice to the details of the 204-page document here, but we can look at how the report’s headline conclusions have made a splash so far.
The report concluded that across 31 OECD countries “where computers are used in the classroom, their impact on student performance is mixed at best” and that there were “no appreciable improvements in student achievement in reading, mathematics, or science in the countries that had invested heavily in ICT for education”. Further, “technology is of little help in bridging the skills divide between advantaged and disadvantaged students”. In fact, it continued, it may actually make things worse because “drilling” software (which according to the report is more commonly used for disadvantaged students) apparently had a negative effect on PISA performance.
Aside from the broad conclusions themselves, many commentators have objected to the limitations of the study’s methodology. The OECD itself admits that its key headlines are based on correlations between technology use and PISA performance rather than causation, and the report makes no attempt at causal analysis. Many have also objected to the unhelpfully simplified ‘clickbait’ headlines of the mainstream media. From our perspective, as a counterpoint to the broad cross-border conclusions drawn from largely undifferentiated student “technology use”, we would like to have seen how students performed when engaging properly with high-quality, cognitively sophisticated educational software.
In describing the response to the report so far, Hack Education has helpfully laid out the common respondent battle lines: on one side, those critical of the available technologies; on the other, those who blame the way schools and teachers apply them. Andreas Schleicher, Director of the Directorate for Education and Skills, certainly seemed to be leaning toward the latter when he said that “[schools] need to find more effective ways of integrating technology into teaching and learning”. In response, Hack Education itself forcefully argued that the burden of education technology’s effectiveness should not be shifted entirely onto educators, particularly “when so much edtech remains crap – exploitative and punitive crap that is well-funded”.
Overall, the report was far from earth-shattering, but it was an unsurprising reminder of just how far technology has to go before it starts delivering on its potential in education. Another no-brainer was the implication that technology alone won’t improve learning outcomes.
There are plenty of technological tools available to teachers and students (and PD training on new tools will always be beneficial and encouraging, as in any line of work), but the tools that will make the biggest impact on student performance are those focused closely on the users’ learning goals, and which are intuitive and rewarding to use. Alongside the low grade that the OECD dished out to the edtech sector, complete with a “must improve” comment in red pen, we would have liked some forward-looking thoughts on how education technology enthusiasts might work more closely with schools and teachers to develop these kinds of tools for the future.