The coronavirus lockdown is forcing us to view ‘screen time’ differently. That’s a good thing
“How would we have coped before the internet?” is a question you have probably heard someone pose. Beyond being a whimsical hypothetical, it is relevant at a time when the digital age is derided as the end of social skills as we know them. COVID-19 has seen society pivot, almost overnight, from real-world interactions to the online space.
We have gone from mingling with colleagues, classmates and friends to being told to move our social interactions safely behind a webcam and sanitised keyboard. Internet providers and servers around the globe are being pushed to the limit as kitchen tables become boardrooms and laps become school desks.
This is cause to reframe our views on screen time – an activity that now consumes, more than ever, a significant proportion of our day.
COVID-19’s impact on screen time
With more than 90% of Australians having a smartphone, our often pilloried devices are now more essential to daily life than ever. As people fulfil their civic duty by staying home, platforms and internet providers are facing an unprecedented surge in online activity. Australia’s National Broadband Network (NBN) has seen a daytime usage increase of 70-80%, compared to figures in February. Demand for streaming sites across the globe has intensified, with Amazon and Netflix having to reduce video quality in some countries to handle the strain.
In March, Zoom knocked Facebook and Netflix down the Apple and Google mobile app store rankings in the US, as people sought video chat options.
Social media and video/online gaming are also flourishing. If we’re to take anything away from the significant increase in screen time caused by this pandemic, it is that human connection in the digital age comes in many forms.
Think of screen time as calories
We must acknowledge the umbrella term “screen time” can denote both positive and negative interactions with technology.
Think of screen time as consuming calories. All humans require calories to function. This unit of energy tells us something about the contents of a food item, such as a chocolate bar or a carrot.
Although both foods contain calories, we know the carrot is the healthier source. While professionals might offer advice about which provides the most beneficial nutrition, the individual should still have agency over what they consume.
Similarly, people should be able to choose to partake in online activities not normally deemed “productive” – but which may help them through their day. Like calories, screen time is about moderation, making responsible choices and exercising self-control.
Lockdown and locked screens
Just as there are good and bad calories, there are good and bad examples of screen time. It is therefore not helpful to use the overarching term “screen time” when discussing how technology use should be moderated. An hour spent researching an assignment is not equivalent to an hour spent watching cat videos, as the former contributes to learning.
Likewise, an hour on social media chatting with friends is productive if it allows you to socialise at a time when important social interactions can’t otherwise take place (such as during lockdown). In this way, the current pandemic is not only helping shift our views on screen time – it has subtly rewritten them, too.
Screen time does not necessarily need to be objectively “beneficial”, nor does it need to have arbitrary time limits associated with it to prevent it from being detrimental.
Appropriate use is contextual. This should determine how parents, teachers and policymakers moderate technology use, as opposed to mandating a set number of hours per day without specifying how those hours should be spent.
We must steer clear of blanket statements when it comes to critiquing screen time. Our digital diets vary significantly, just as our real diets do. Consequently, screen time should be approached with a level of flexibility.
Fear fuels stigma
Some of the derision and concern associated with time spent on digital devices can be attributed to a fear of the new.
Swiss scientist Conrad Gessner was among the first to raise alarm over information overload, claiming an overabundance of data was “confusing and harmful” to the mind. If you’re not familiar with Gessner’s warning, it may be because he issued it back in 1565, in response to the printing press.
Gessner’s warnings referred to the seemingly unmanageable flood of information unleashed by Johannes Gutenberg’s contraption. Fear of the new has permeated the debate on emerging technologies for generations.
And Gessner is not alone. From the New York Times warning in the late 1800s that the telephone would invade our privacy, to concerns in the 1970s that the rapid pacing of children’s shows such as Sesame Street would lead to distractibility – it is inherent human behaviour to be cautious about what we don’t fully understand.
Yet many of these proclamations seem almost absurd in retrospect. Which of today’s statements will later generations look back on as fuelled by paranoia and fear, simply because a new technology had disrupted the status quo?