My path into researching the impact of technology on adolescent attention spans began whilst I was at The King's School, Sydney. I can remember the distinct moment when a student in my boarding house came to me asking for the definition of the word abdicate. Being one to encourage independent learning, I told Tom, 15, to go and look it up in the dictionary, to which he exclaimed, “But sir, how can I do that when the wifi is down?” It was this moment that drove home to me that I was dealing with a new age of students, who approach learning and the use of ICT in a very different way to any generation before them.
I spent 11 years acting in loco parentis at boarding schools in both the UK and Australia. And although they might not have been my own children, if you can imagine what it’s like to manage 60 adolescent boys at bedtime, and the fight to get all of them to put their laptops and phones away, you get the picture. When not working in the boarding house, I’ve been in the classroom. There is no greater environment in which to witness first-hand how teenagers’ attention spans are shifting: the already demanding task of holding a class’s interest now comes with an ever-shrinking window in which to capture their focus. Beyond this, the past decade has been dedicated to trying to understand how teaching has, or hasn’t, shifted to adapt to an audience whose needs and media diet are unlike those of any generation who has sat in the classroom rows before them.
Firstly, I must preface all of this by saying that I am a pro-technology educator. If I weren’t, I can’t imagine my talk would be that welcome at a conference like this!
Advancements in technology throughout the ages have always been met with resistance, yet such resistance has rarely slowed overall progress. Take Conrad Gessner, for example, who warned that information overload was confusing and harmful to the mind, a sentiment that sounds like a recurring theme from commentators in today’s digital age. The only thing is, his warning was made in the mid-16th century, in response to Gutenberg’s printing press and the proliferation of books. Nor could we ignore Socrates’ forewarning that the craft of writing things down would lead to forgetfulness.
The obvious counterargument to my concerns, then, is that fear of technology is nothing new; moral panic around the next advancement long predates the Internet and the smartphone. So what do we say to those who claim that today’s issues with technology are no different from any previous leap? The difference is the ubiquity of our devices, their addictive design, and the demands they now place on our attention and cognitive load.
Therefore, we should start by looking at cognitive load, as this is where I have focused my research on learning and attention spans, specifically on trying to identify the point at which technology becomes more of a hindrance than a help.
What I want you to do, aloud if you wish, is count to ten as fast as possible.
Now, say the alphabet from A to J as fast as possible.
Now, what I want you to do is intersperse the two aloud: A-1, B-2, and so on.
If our cognitive load could handle two modalities at once, then one would assume that this third task should take no longer than Task 1 and Task 2 combined. Alas, this is not the case. Each sequence is a rehearsed behaviour on its own, but once we divide our cognitive load between two streams of information, our processing speed slows. If you’ve ever had to turn down the radio whilst looking for a parking space, even though you don’t need your ears to park, you’ll know first-hand how limited our brains’ processing power becomes when attention is divided.
What is the point of this basic exercise? To highlight that we can’t divide our attention and expect to achieve the same outcomes, at least not as efficiently as if we had attended to one task at a time.
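If you would like to try a typed version of the same exercise at a keyboard, a rough sketch in Python might look like the one below. Typing stands in for speaking aloud, the function and variable names are my own invention, and your exact timings will vary, but the gap between the interleaved task and the sum of the two single tasks should still be plain to see.

```python
import time

def timed_typing(prompt: str, target: str) -> float:
    """Time how long it takes the user to type `target` correctly."""
    input(f"\n{prompt}\nPress Enter to start, then type the sequence and press Enter when done.")
    start = time.perf_counter()
    # Keep the clock running until the sequence is typed correctly.
    while input("> ").replace(" ", "").lower() != target:
        print("Not quite -- the timer is still running, try again.")
    return time.perf_counter() - start

# The three sequences from the exercise: 1-10, A-J, and the two interleaved.
numbers = "".join(str(n) for n in range(1, 11))                      # "12345678910"
letters = "abcdefghij"
mixed = "".join(c + str(n) for n, c in enumerate(letters, start=1))  # "a1b2c3...j10"

t1 = timed_typing("Task 1: type the numbers 1 to 10.", numbers)
t2 = timed_typing("Task 2: type the letters A to J.", letters)
t3 = timed_typing("Task 3: interleave them (A1 B2 C3 ... J10).", mixed)

print(f"\nTask 1: {t1:.1f}s   Task 2: {t2:.1f}s   Sum: {t1 + t2:.1f}s")
print(f"Interleaved Task 3: {t3:.1f}s")
```

The precise numbers matter far less than the pattern: the interleaved run tends to take longer than the two single runs combined, because every switch between streams carries a cost.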
So why is this important?
In the next decade and beyond, the information age will see focus become the cognitive currency essential to excelling in learning and the workplace. Our brains are becoming accustomed, or “rewired” as some theorists would put it, to being constantly and instantly satiated. News can be disseminated in 280 (recently doubled from 140) characters or fewer, apps deliver communication that self-destructs, and an idle moment leads us to thumb through images on a conveyor belt of carefully curated self-absorption. In his book Net Smart (1), Howard Rheingold suggests that attention is a vital skill in this digital age, and growing research continues to confirm this. In a longitudinal study of over 1,000 children, Moffitt & Caspi (2) found that the ability to exercise self-control and concentrate was one of the strongest predictors of success in later life.
Although the way we teach continues to evolve, with our practice striving to keep pace with a rapidly changing world, schools still largely follow a format that has existed since the industrial age. Students file into class to sit in organised rows, and someone up the front (who hopefully knows more than they do) presents information that they must convert into knowledge. At a later date they will be called upon to prove, in a quantifiable way, that they can recall said knowledge. It’s how we teach, it’s how we learn, and, for the most part, it’s how we measure success in education. Although many will deride league tables, assessment rubrics and the reduction of an individual’s learning to statistics, such data provides valuable insight into how academic achievement can be attained and measured.
However, what has significantly evolved is the audience sitting in front of people like us, the educators. Students are now predisposed to seek immediate gratification, and their attention demands a level of engagement that is easily lost if the content falls outside their realm of interest and motivation.
But hasn’t this always been the case? Hasn’t a student always tuned out once they become bored in a lesson? Absolutely. The difference now is that the lure to let the mind wander grows ever stronger when there is a multitude of other ways to direct one’s attention, beyond calculus, the Crimean War or the fundamentals of cubism. With device in hand, often sanctioned by the very institution they are sat in, the option to digitally escape the classroom for bottomless social media feeds, a group chat, sports scores or perhaps a compilation video of people falling over is all too appealing.
Yet why do we malign adolescents with the label of tech-addicts, when the appeal of technology is not confined to one generation? At any moment during this presentation, you may have diverted your own attention to whatever was vibrating in your pocket, or perhaps you have your laptop open under the guise of taking notes, telling yourself that there’s an email that just can’t wait to be read.
Ignoring distraction is a challenge for us as adults, shackled as we are to an onslaught of connectivity and inbound communication. If we cannot control the urge to divide our attention when distraction lies at every turn, how can we expect the developing mind of a teenager to achieve what we cannot?
It is a fallacy to assume that digital natives, those born into an age in which technology was already embedded in daily life, are inherently better multitaskers or more adept at handling multiple streams of information concurrently. I often feel like my own worst case study as I work towards finishing my doctorate. I know now that the only way I can work effectively on my thesis is to have my phone in another room and all notifications on my computer turned off. Without this, the pull of anything but my work can be too much. And research tells us that deviating from a primary task, even for what we think is a brief moment, produces a ‘resumption lag’ that is deleterious to productivity: returning to where we left off takes considerable time and effort once we have deviated.
Therefore, the solution in education is not to ban technology outright; to do so would be akin to standing at the bottom of Niagara Falls with an umbrella and hoping to stay dry. And to stick with the analogy of flowing water, I will refer to Clay Shirky, who likens our control over technological progress to kayaking: we might have a slight ability to steer, but the current behind us will keep us moving forward whether we like it or not. The solution we, as educators, must work towards is teaching appropriate use. Students need to develop metacognitive strategies and the skill of self-regulation so that they can embrace technology and, at the same time, harness it. I use the word ‘harness’ purposefully, because without control over technology, it can easily take control over you.
I have focused my career on the impact that technology has on adolescent learning. When I commenced this journey, I held the mindset that ‘screens are ruining our classrooms’, if not the minds of adolescents. Yet, through research and consultation, my perspective has evolved: I now acknowledge that effective technology integration is all about balance.
We certainly should not assume that great technology will ever replace a great teacher. However, we should also not assume that the absence of technology makes a great student.
Citation for conference proceedings related to my presentation:
Sebire, K. B. Proceedings of EdMedia: World Conference on Educational Media and Technology (pp. 476-481). Amsterdam, Netherlands: Association for the Advancement of Computing in Education (AACE).