New Research Answers Whether Technology is Good or Bad for Learning

For years educators and scholars have debated whether technology aids learning or inhibits it.

In the most recent issue of Education Next, for example, Susan Payne Carter, Kyle Greenberg, and Michael S. Walker write about their research finding that allowing any computer usage in the classroom “reduces students’ average final-exam performance by roughly one-fifth of a standard deviation.” Other studies have shown similarly dismal numbers for student learning when technology is introduced in the classroom.

Yet there are also bright shining stars of technology use—both in proof points and in studies, such as this Ithaka study or this U.S. Department of Education 2010 meta-analysis.

So what gives? Since 2008 I’ve, perhaps conveniently, argued that scholars and advocates on both sides of this debate are correct. As we wrote in Disrupting Class in 2008, computers had been around for two decades. Even 10 years ago, we had already spent over $60 billion on them in K–12 schools in the United States to little effect. The reason, quite simply, was that when we crammed computers into existing learning models, they produced begrudging or negative results. To take a higher education example, when I was a student at the Harvard Business School, far fewer of us paid attention to the case discussion on the couple of days at the end of the term when laptops were allowed, as we chose instead to chat online and coordinate evening plans. In that context, I would ban laptops, too.

When the learning model is fundamentally redesigned to intentionally incorporate the benefits of technology, however, as in a blended-learning model, you can get very different results. To use another personal example, I fervently hope that the public school district where my daughters will go to school will comprehensively redesign its learning environments to personalize learning for each student through the use of technology. As we disruptive innovation acolytes like to say, it’s almost always about the model, not the technology.

This finding isn’t unique to the technology of computers in classrooms. It was true with chalkboards as well.

As Harvard’s David Dockterman recounts, the blackboard was reportedly invented in the early 19th century. The technology was adopted throughout higher education in a lecture model to convey information to all the students at once. The first recorded use in North America was in 1801 at the United States Military Academy at West Point—ironically, the location of the study that Carter, Greenberg, and Walker conducted—and it spread quickly.

After the blackboard’s success in colleges, schoolhouses began installing the technology as well, but teaching and learning changed minimally. The blackboards were largely unused because teachers had difficulty figuring out how to use them. Why? At the time, the prevalent model of education in public schools was the one-room schoolhouse, in which all students, regardless of age or level, met in a single room and were taught by a single teacher. Rather than teaching all the students the same subjects, in the same way, at the same pace—as in today’s schools—the teacher rotated around the room and worked individually with small groups of students. As a result, the blackboard didn’t make much sense in the context of the one-room schoolhouse because the teacher rarely, if ever, stood in front of the class to lecture.

It wasn’t until the early 1900s, when the public education system changed its instructional model to today’s factory model, that the blackboard became a staple of American education. Lesson? The model matters.

Fast forward to today, and we see the same dynamic. A new—and very helpful—analysis of the research helps tease this out and can perhaps at last break the infuriating logjam between those who argue technology is a distraction at best and those who argue it is an extremely positive force.
