I leaned over the shoulder of a student in the library. She was quietly working with headphones in, completely focused. What caught my attention was that she would continually lift her phone up over the textbook, and then jot something down on the paper to her left. It was a motion she repeated at least seven times before I headed over to see what was going on.
As I got closer I could see that it was a math textbook, and her paper was filled with equations, problems, and steps. I thought to myself: that sure doesn’t look like my math homework, which was always a mess of numbers, lines, and eraser marks from messing up!
What happened next caught me by surprise. Not because I couldn’t believe it, but because it changed the way I viewed math forever.
She would pick up her iPhone (or maybe it was an Android) and open an app. Then, flicking over to a clear screen, she would hover the phone over a specific problem in her textbook.
What happened next was nothing short of magic. If, perchance, someone had been transported here from even 20 years ago, they might not have believed it was possible.
The phone immediately (I mean it was quick!) overlaid the problem, multiple steps, and a solution all in a row on her screen. She jotted down the answers on her piece of paper and went on to the next problem.
I, on the other hand, first had to close my gaping mouth, and then I tapped her on the shoulder.
She was startled, and took out one headphone.
“What is that?” I questioned.
“Oh, it’s PhotoMath. It’s an app.”
“Are you allowed to use that? Is it something your teacher uses in class?”
“Um, I don’t think Ms. Carter knows about it…but no one ever said we couldn’t use it. Am I in trouble?”
I told her she wasn’t in trouble at all, and continued to ask a few more questions about how the app worked. But there wasn’t much to learn. It worked just as I saw it work. I quickly googled the app on my phone and found this video (which is eerily similar to what I saw in the library that day):
Photomath 2.0 from MicroBLINK on Vimeo.
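To make the "magic" a little more concrete: here is a toy sketch, in Python, of the kind of step-by-step output such an app might generate for a simple linear equation. This is not PhotoMath's actual algorithm (and skips the camera/recognition part entirely); the `solve_linear` function is just a hypothetical illustration of machine-generated solution steps.

```python
# Toy illustration (NOT PhotoMath's actual method) of generating
# step-by-step solutions for a linear equation a*x + b = c.
from fractions import Fraction

def solve_linear(a, b, c):
    """Return the worked steps and the solution for a*x + b = c."""
    steps = [f"{a}x + {b} = {c}"]
    rhs = Fraction(c) - Fraction(b)
    steps.append(f"{a}x = {rhs}   (subtract {b} from both sides)")
    x = rhs / Fraction(a)
    steps.append(f"x = {x}   (divide both sides by {a})")
    return steps, x

steps, x = solve_linear(3, 4, 19)
print("\n".join(steps))  # each line mirrors one overlaid step on the screen
```

The student in the library was, in effect, copying down output like this: every intermediate step shown, no reasoning required on her part.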
We tend to hear stories all the time of computers doing “human things” and impacting productivity, but this time it was different. This time I saw a direct connection between a technology and how it could eliminate the need to learn something (more on that later).
In 1997 it was Deep Blue taking down Garry Kasparov in a chess match, the moment computers finally ascended to the level of a human and beyond. By 2005 no human champion could seriously compete with chess-playing computers.
Now computers can learn to play and master video games like Atari’s Pong with no guidance from a human, and we’ve seen artificial intelligence grow to new levels.
But still, none of those developments affected learning at the student level. They were stuff we would only hear about in the news. They did not change the basic ways we learn something new. Because for learning to change, it has to affect one of the four stages of how we actually learn something.
The 4 Stages to Actually Learning Something
A few months ago while doing research for this post I was tracking down the best description of how we learn. The blessing and the curse is that over the past 10-20 years there have been numerous books, articles, and videos centered around the idea of cognitive science—or the science of how we learn.
Still, many of those books and videos focused on one specific stage of the learning process, and did not give a full overview of what it looks like to not know something, then to know it, and then to be able to use that knowledge and understanding in the world.
This (to me) is one of the most important pieces of information we can have as educators. It should be the foundation of our practice. If we want to teach to the best of our abilities, we should have a clear understanding of how our students learn, and what helps them learn best.
But we tend to hit the science of learning in only one (maybe two) undergraduate courses, and lose much of that information by the time we actually start working with students. It is rarely brought up in professional development. I have almost never seen it mentioned at the numerous education conferences around the world. And when I’ve asked teachers and leaders about it, most don’t have a clear understanding of how we learn.
To be honest, neither did I.
It wasn’t that I didn’t care. I did (and still do). It’s that I was filling my mind with all other types of information relating to teaching and learning, without starting with the first principles of why we learn and how we learn.
So, when I stumbled on the work of Peter Nilsson, I was blown away to see the science of learning made so clear. Nilsson is a teacher and school leader at Deerfield Academy in Massachusetts, and he has read extensively in the field of cognitive science. His 14-part series on “The Cognitive Science of Education” is a must-read for any educator (or parent) serious about understanding the way our brains process all the information we get each day. [footnote]This is serious. You need to go read the 14-part series. It should only take an hour or two at the most, but it may just be the most valuable time you spend as a teacher and learner.[/footnote]
Here is Peter Nilsson describing the four stages to learning on his blog, Sense and Sensation:
So how do people learn? What are the mechanics of memory? Can we distill thousands of articles and books to something that is manageable, digestible, and applicable to our classrooms?
Yes. In brief, the cognitive process of learning has four basic stages:
1. Attention: the filter through which we experience the world
2. Encoding: how we process what our attention admits into the mind
3. Storage: what happens once information enters the brain
4. Retrieval: the recall of that information or behavior
Almost everything we do or know, we learn through these stages, for our learning is memory, and the bulk of our memory is influenced by these four processes: what we pay attention to, how we encode it, what happens to it in storage, and when and how we retrieve it.
Let’s start with Attention. Going back to the previous post on why we learn, it all begins with attention. Most of the time we pay attention for two reasons: Interest or Necessity.
Our brain is flooded with information from a multi-sensory world that throws sounds, sights, feelings, and everything else at us in rapid succession. With all of this information coming at us, we tend to pay attention to things we are curious about and interested in, or to information that has a direct bearing on our physical, emotional, or psychological well-being.
Then comes the Encoding. Our senses are hit with so much information that when we finally process it, we categorize it either as a new experience or as one connected to prior knowledge.
After we’ve successfully paid attention and made some connections (or created new information), we come to the Storage stage. Here we store this new or connected information in our short-term, working, or long-term memory. Where and how it is stored depends on how powerful the experience was, and how often we bring that experience back into our daily lives.
Retrieval is the final stage. This is when we pull information out of memory to help us learn something new, adapt to a situation, or connect the dots on an experience. Retrieval also allows us to “re-encode,” which starts the learning process all over again. It’s like a mini-version of the unlearning/relearning cycle we discussed in the last article.
You can think about how this cycle of learning works in all different types of contexts and experiences. From real-world applications like driving a car, to classroom situations like understanding photosynthesis, the more we retrieve information and connect it to new experiences, the stronger our understanding of that topic becomes.
Which is why most of you reading this post have a better sense of how to drive a car than of how photosynthesis works. Even though photosynthesis happens every day all around you, it does not impact you; in other words, it does not grab your attention. Driving a car, on the other hand, is connected to your daily life as an adult for work, pleasure, and all kinds of other reasons.
Our students, just like all of us, tend to prioritize the learning of things that will impact them. It is in our nature to pay attention (and kick off the learning process) to information that is connected to our interests and needs.
So what happens when technology evolves and takes away some of the need to learn in a few areas? What happens when we retrieve information from other places besides our memory? And, what happens when technology eliminates the need to learn certain information because we’ll never need to retrieve it at all?
How Technology Is Impacting the Way our Brain Works
Consider the fact that technological advances over the years have always impacted how we learn, and changed how we engage with the learning process.
Before written language, we could only gather information and connect learning experiences orally. When scrolls and books entered our daily lives, learning could be retrieved through other processes, and our encoding of information reached a new level. Similarly, the internet, personal computing devices, and smartphones have begun to revolutionize what we give our attention to, how we encode information, where our new information is stored, and how we go about retrieving and re-encoding what we’ve learned.
In an article published by Wired Magazine titled, “How the Web Became Our ‘External Brain’ and What It Means for Our Kids”, author Michael Harris dives into the ways technology is impacting the biology of our brain:
The brains our children are born with are not substantively different from the brains our ancestors had 40,000 years ago. For all the wild variety of our cultures, personalities, and thought patterns, we’re all still operating with roughly the same three-pound lump of gray matter. But almost from day one, the allotment of neurons in those brains (and therefore the way they function) is different today from the way it was even one generation ago. Every second of your lived experience represents new connections among the roughly 86 billion neurons packed inside your brain. Children, then, can become literally incapable of thinking and feeling the way their grandparents did. A slower, less harried way of thinking may be on the verge of extinction.
In your brain, your billions of neurons are tied to each other by trillions of synapses, a portion of which are firing right now, forging (by still mysterious means) your memory of this sentence, your critique of this very notion, and your emotions as you reflect on this information. Our brains are so plastic that they will re-engineer themselves to function optimally in whatever environment we give them. Repetition of stimuli produces a strengthening of responding neural circuits. Neglect of other stimuli will cause corresponding neural circuits to weaken. (Grannies who maintain their crossword puzzle regime knew that already.)
UCLA’s Gary Small is a pioneer of neuroplasticity research, and in 2008 he produced the first solid evidence showing that our brains are reorganized by our use of the internet. He placed a set of “internet naïve” people in MRI machines and made recordings of their brain activity while they took a stab at going online. Small then had each of them practice browsing the internet for an hour a day for a week. On returning to the MRI machine, those subjects now toted brains that lit up significantly in the frontal lobe, where there had been minimal neural activity beforehand. Neural pathways quickly develop when we give our brains new tasks, and Small had shown that this held true—over the course of just a few hours, in fact— following internet use.
Young people now count on the internet as ‘their external brain’ and have become skillful decision makers—even while they also ‘thirst for instant gratification and often make quick, shallow choices.’
Flashing back to the story of the girl with the PhotoMath app, we can see technology eliminating some of the ways she learns. She paid attention while doing her math homework. That attention was mostly born of necessity (grades), but it was still a form of attention. Then the app took over, providing instant gratification in the form of encoding and retrieving the theorems, formulas, and steps needed to correctly answer the problem.
In the girl’s brain, it was the same as copying answers out of the back of the textbook. Arguably it was better, because she did get to see all of the steps in the correct order. The experience still holds her attention but changes the encoding, storage, and retrieval process.
The question is whether or not we think this is OK.
Is it OK for students to not know the steps to solving a math problem, but be able to use technology to quickly solve the problem and use the results for a specific purpose?
Is it OK for students to not know their state capitals because they can ask Siri?
Is it OK for students to not know a language because their device can translate words in real time while they are listening or reading?
And if we don’t think it is OK, then what the heck are we going to do about it? It doesn’t seem like this is going to magically disappear or go away.
Technology is (if you haven’t noticed) impacting all four stages of learning:
Technology is impacting Attention through interest and necessity. We live in a world where notifications, vibrations, and messaging drive our thoughts and our need for instant gratification. We don’t have to wait anymore for what we want. So attention spans drop, and deep work (and a state of flow) sometimes becomes impossible.
Technology is impacting Encoding in how we connect our current experiences to past experiences. Digital bookmarking and real-time collaboration tools make this process fast and scalable. Now the “internet of things” is connecting our experiences to others’ experiences and learning at a rapid pace, with some futurists estimating that human knowledge now doubles every 12 months.
Technology is impacting Storage because our memories now have an external companion. That companion is not a book or encyclopedia or library we have to dig through to find information. Most of this knowledge can be summoned with the click of a button or by asking a computer to reveal it to you.
Technology is impacting Retrieval because of the very fact that we can retrieve memories, experiences, and knowledge from millions of people, categorized and delivered in the ideal format for retrieval (written, oral, video, etc.).
And if you read posts #1 and #2 of this series then you know that this is going to increase with exponential speed and growth over the next years and decades. Technology has always impacted learning, but now it is almost reinventing the notion of what it means to learn (and how we do it).
Technology Doesn’t Always Add, It Also Eliminates
Technology has proven time and time again that it eliminates skills that past generations valued. Look at the impact on agriculture (we grow vegetables in the basement of our school). Or the impact on transportation (I don’t need to know how to ride a horse, and in a generation my kids may not need to know how to drive a car). Or the impact on health care (I’ve watched my brother have regular video house calls with his doctors).
Whether we want to embrace it or not, the fact is that technology has transformed our world and the reality of learning (and living) in this world.
This post was meant to examine how we learn and whether or not that four-stage process is also changing. But, I think what we’ve uncovered is the idea that while our brains are changing due to technology, the big shift is the elimination of the types of learning we would do, and how the four stages are being interwoven with our “external brain.”
The science shows that our brains are slowly evolving while the world around us is quickly evolving. This eliminates the need for many learning tasks we previously had to do (e.g., taking notes) and has us relying much more heavily on a third-party teacher that is usually not a teacher at all, but a machine, device, or computer of some sort.
So What Is Actually Changing?
Environment is changing. Our learning environments are shifting away from “places” we go to learn (like a school classroom or library) toward on-demand environments where you have access not only to information at your fingertips, but also to experts, mentors, and teachers with a click of a button.
Pace is changing. Information, technology, and artificial intelligence are growing at an exponential rate, as we create new technologies and solve problems that never existed before.
Intelligence is changing. It’s no longer about what you know, but how fast you can find the knowledge, how well you can use it, and what you can create with it.
Work is changing. Jobs are being eliminated by automation. New jobs are being created by technology. Most of us are being asked to be innovators, creators, and problem solvers. Gone are the days of following marching orders and climbing the ladder.
Learning is always a social process. It has been and continues to be driven by real human interaction and social connections (even though many of those connections are now happening in an online space).
Learning is always going to be driven by meaning. We learn best when it is meaningful and relevant to our lives. How can we continue to make it meaningful as our lives change?
Learning follows the four-stage process. It starts with attention and is rebooted by retrieval. We must focus on the first principles of learning to harness the beneficial ways technology can help in the teaching/learning process.
What happens if we don’t change, but technology keeps changing the world around us?
Consider the following scenario:
We give our students the tests.
Machines grade the tests.
Test data is used to measure the teacher.
The teacher teaches to the test…
But what happens when the machine teaches the test, and then grades the test, and the data says the machine is a better teacher than the actual teacher?
We have to focus on teaching above and beyond the test, so our students are empowered to learn deeply, create daily, and inspire each other to dare greatly.
Let’s be intentional about innovation and focus on doing the work that a machine could never do, and technology could never replace.