Why technology is not always the solution for better education
Until quite recently, I was a keen advocate of transforming education through technology. Over the years I was inspired by the ideas and works of pioneers such as Jane McGonigal, Katie Salen, Salman Khan (founder of Khan Academy), Steven Johnson (author of Everything Bad is Good for You), Douglas Thomas and John Seely Brown (authors of A New Culture of Learning), to name a few.
As a technophile and a student of engineering, I believed technology could allow us to interact with one another and with our environments in ways that would not only enrich our experience, but also enhance the condition of our species. Every new tool, I thought, would enable us to better understand and effectively tackle some of the world’s most challenging problems. With every advancement, I saw potential; with every breakthrough, a promise for a better tomorrow.
But as Nicholas Carr writes in his thought-provoking book, The Shallows: What the Internet Is Doing to Our Brains, “an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self.”
For instance, video games may strengthen our visual-spatial intelligence by immersing us in virtual spaces where we must learn to rotate objects in our minds and navigate various architectures and surroundings. But Carr cautions that this newfound ability “go[es] hand in hand with a weakening of our capacities for the kind of ‘deep processing’ that underpins ‘mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection.’”
In reality, the subtleties and complexities of the real world cannot possibly be encompassed by a computer, no matter how advanced or sophisticated technology becomes. In his manifesto, You Are Not a Gadget, virtual reality pioneer and digital media guru Jaron Lanier remarks that technology often “captures a certain limited measurement of reality within a standardized system that removes any of the original source’s unique qualities.” This is because the algorithms and tools we develop reflect our subjective understanding of the world, and our minds can neither comprehend nor represent anything in its entirety.
Artist and internet anthropologist Jonathan Harris came to a similar conclusion after years of working on a series of highly original projects, each of which sought to explore innovative ways of understanding and celebrating personhood. With every project, Harris saw the limitations of his own tools and the difficulty of capturing depth and meaning using only digital information, an experience that drastically reshaped his philosophy of technology.
A powerful insight comes from a study by Professor James Evans at the University of Chicago. Evans examined 34 million articles, comparing academic papers written before and after the Internet was introduced to scholarly research. He showed not only that papers written in the digital age drew on a narrower, less varied range of citations, but also that old-fashioned library search widened the scholar’s horizons precisely because the process involved going through more or less unrelated articles before reaching the desired study. As Carr observes, “a search engine often draws our attention to a particular snippet of text, a few words or sentences that have strong relevance to whatever we’re searching for at the moment, while providing little incentive for taking in the work as a whole. We don’t see the forest when we search the Web. We don’t even see the trees. We see twigs and leaves.”
Something is getting lost in the equation
Anyone familiar with the history and philosophy of science knows that paradigm shifts occur not by reinforcing consensus and normal science, but by allowing dissent and divergent thinking. Scientific revolutions happen because brave minds search for possible explanations outside of the box, in unexplored territories. But how can algorithms that lock us in a particular mindset—usually the developer’s worldview—even enable us to question our basic assumptions about nature?
As Evans demonstrated, the tedious and seemingly irrelevant tasks we try to eliminate with every new technology turn out to be the most essential to our learning experience. They elevate us precisely because they fatigue us. Reducing human error by relying on computer efficiency makes our work less thoughtful and less original, and we do not end up learning as much as when we do the hard work.
Photographer Fulvio Bonavia offers an insightful view on the relationship between technology and art: “The big challenge for photography today is that digital makes it much easier to become a photographer, but it is even harder to become a very good photographer. When I worked by hand as an illustrator and graphic designer, I would spend an entire day to make by hand something that I can now do in two minutes with the computer. But all that time I used to spend was not wasted, as I think it made me grow better, teaching me the concentration, the patience, the precision, and the attention to not make a mistake.”
Recent studies in neuroplasticity have shown how every tool we use changes the physical structure of our brains in different ways. Carr elucidates this point using a familiar example: “A page of text viewed through a computer screen may seem similar to a page of printed text. But scrolling or clicking through a Web document involves physical actions and sensory stimuli very different from those involved in holding and turning the pages of a book or a magazine. Research has shown that the cognitive act of reading draws not just on our sense of sight but also on our sense of touch. It’s tactile as well as visual.”
Dozens of studies by psychiatrists, psychologists, neurobiologists, educators, and designers point to the same conclusion: when we go online or facilitate our education through digital technology, we enter environments that promote cursory reading, hurried and distracted thinking, and superficial learning. We think we benefit because we have come to define intelligence by the medium’s own standards. Carr puts it best: “As we come to rely on computers to mediate our understanding of the world, it is our intelligence that flattens into artificial intelligence.”
Seeing so many entrepreneurs and education leaders introduce technology into classrooms, refugee camps, and other places in the hope of democratizing learning worries me more than it gives me hope. It is wonderful that so many people with good intentions are taking an interest in education. Good intentions alone, however, are not enough; in the urgent words of Elias Aboujaoude, “those effects deserve to be understood, studied, and discussed.”
It is especially important for technology enthusiasts to learn about the studies mentioned here and to question new-age rhetoric, so that we may deepen our understanding of what is involved and what is at stake. I will conclude with the great words of Jaron Lanier:
“When it comes to people, we technologists must use a completely different methodology. We don’t understand the brain well enough to comprehend phenomena like education or friendship on a scientific basis. So when we deploy a computer model of something like learning or friendship in a way that has an effect on real lives, we are relying on faith. When we ask people to live their lives through our models, we are potentially reducing life itself. How can we ever know what we might be losing?”
Do you think technology can limit individual learning and discovery? Share your thoughts in the comments section below.