Lower test scores for students who use computers often in school, 31-country study finds
For those of us who worry that Google might be making us stupid, and that, perhaps, technology and education don’t mix well, here’s a new study to confirm that anxiety.
The Organisation for Economic Co-operation and Development (OECD) looked at computer use among 15-year-olds across 31 nations and regions, and found that students who used computers more at school had lower scores in both reading and math, as measured by the Program for International Student Assessment (PISA). The study, published September 15, 2015, was actually conducted back in 2012, when the average student across the world was using the Internet at school once a week, doing software drills once a month, and emailing once a month. But the highest-performing students were using computers in the classroom less than that.
“Those that use the Internet every day do the worst,” said Andreas Schleicher, OECD Director for Education and Skills, and author of “Students, Computers and Learning: Making the Connection,” the OECD’s first report to look at the digital skills of students around the world. The study controlled for income and race; between two similar students, the one who used computers more generally scored worse.*
Home computer use, by contrast, wasn’t as harmful to academic achievement. Students in many high-performing nations reported spending between one and two hours a day on a computer outside of school. Across the 31 nations and regions, the average 15-year-old spent more than two hours a day on the computer.
Back in the classroom, however, school systems with more computers tended to be improving less, the study found. Those with fewer computers were seeing larger educational gains, as measured by PISA test score changes between 2009 and 2012.
“That’s pretty sobering for us,” said Schleicher in a press briefing. “We all hope that integrating more and more technology is going to help us enhance learning environments, make learning more interactive, introduce more experiential learning, and give students access to more advanced knowledge. But it doesn’t seem to be working like this.”
Schleicher openly worried that if students end up “cutting and pasting information from Google” into worksheets with “prefabricated” questions, “then they’re not going to learn a lot.”
“There are countless examples of where the appropriate use of technology has had and is having a positive impact on achievement,” said Bruce Friend, the chief operating officer of iNACOL, a U.S.-based advocacy group for increasing the use of technology in education. “We shouldn’t use this report to think that technology doesn’t have a place.”
Friend urges schools in the U.S. and elsewhere to train teachers more in how to use technology, especially in how to analyze real-time performance data from students so that instruction can be modified and tailored to each student.
“Lots of technological investments are not translating into immediate achievement increases. If raising student achievement were as easy as giving every student a device, we would have this solved. It’s not easy,” Friend added.
In a press briefing on the report, Schleicher noted that many of the top 10 scoring countries and regions on the PISA test, such as Singapore and Shanghai, China, are cautious about giving computers to students during the school day. But they have sharply increased computer use among teachers. Teachers in Shanghai, Schleicher explained, are expected to upload lesson plans to a database and they are partly evaluated by how much they contribute. In other Asian countries, it is common for teachers to collaborate electronically in writing lessons. And technology is used for video observations of classrooms and feedback. “Maybe that’s something we can learn from,” said Schleicher.
In addition to comparing computer use at schools with academic achievement, the report also released results from a 2012 computerized PISA test that assessed digital skills. U.S. students, it turns out, are much better at “digital reading” than they are at traditional print reading. The U.S. ranked among the group of top performing nations in this category. In math, the U.S. was near the worldwide average on the digital test, whereas it usually ranks below average on the print test.
The digital reading test assesses slightly different skills than the print test. For example, students are presented with a simulated website and asked to answer questions from it. Astonishingly, U.S. students are rather good at remaining on task, clicking strategically and getting back on track after an errant click. By contrast, students in many other nations were more prone to click around aimlessly.
Interestingly, there wasn’t a positive correlation between computer usage at school and performance on the digital tests. Some of the highest scoring nations on the digital tests don’t use computers very much at school.
In the end, 15-year-old students need good comprehension and analysis skills to do well in either the print or the digital worlds. This study leaves me thinking that technology holds a lot of promise, but that it’s hard to implement properly. Yes, maybe there are superstar teachers in Silicon Valley who never get rattled by computer viruses, inspire their students with thrilling lab simulations and connect their classroom with Nobel Prize-winning researchers. But is it realistic to expect the majority of teachers to do that? Is the typical teacher’s attempt to use technology in the classroom so riddled with problems that it’s taking away valuable instructional time that could otherwise be spent teaching how to write a well-structured essay?
Perhaps it’s best to invest the computer money in hiring, paying and training good teachers.
* In reading, students who used the computer a little bit did score better than those who never used a computer. But then as computer use increased beyond that little bit, reading performance declined. In math, the highest performing students didn’t use computers at all.
Jill Barshay, a contributing editor, writes a weekly column, Education By The Numbers, about education data and research.