Older man on computer confused

(Photo by Miljan Zivkovic on Shutterstock)

In the 21st century, digital technology has changed many aspects of our lives. Generative artificial intelligence (AI) is the latest newcomer, with chatbots and other AI tools changing how we learn and creating considerable philosophical and legal challenges regarding what it means to “outsource thinking.”

But the emergence of technology that changes the way we live is not a new issue. The change from analogue to digital technology began around the 1960s, and this “digital revolution” is what brought us the internet. An entire generation of people who lived and worked through this evolution is now entering its early 80s.

So what can we learn from them about the impact of technology on the aging brain? A comprehensive new study from researchers at the University of Texas and Baylor University in the United States provides important answers.

Published in Nature Human Behaviour, it found no supporting evidence for the “digital dementia” hypothesis. In fact, it found the use of computers, smartphones and the internet among people over 50 might actually be associated with lower rates of cognitive decline.

What is ‘digital dementia’?

Much has been written about the potential negative impact of technology on the human brain.

According to the “digital dementia” hypothesis introduced by German neuroscientist and psychiatrist Manfred Spitzer in 2012, increased use of digital devices has resulted in an over-reliance on technology. In turn, this has weakened our overall cognitive ability.

Three areas of concern regarding the use of technology have previously been noted:

  1. An increase in passive screen time. This refers to technology use which does not require significant thought or participation, such as watching TV or scrolling social media.
  2. Offloading cognitive abilities to technology, such as no longer memorising phone numbers because they are kept in our contact list.
  3. Increased susceptibility to distraction.
Older woman looking at a tablet or smartphone while fixing her reading glasses
Surprisingly, research suggests that use of mobile devices among older adults could actually lower the risk of cognitive decline. (Photo by Dragana Gordic on Shutterstock)

Why is this new study important?

We know technology can impact how our brain develops. But the effect of technology on how our brain ages is less understood.

This new study by neuropsychologists Jared Benge and Michael Scullin is important because it examines the impact of technology on older people who have experienced significant changes in the way they use technology across their life.

The new study performed what is known as a meta-analysis where the results of many previous studies are combined. The authors searched for studies examining technology use in people aged over 50 and examined the association with cognitive decline or dementia. They found 57 studies which included data from more than 411,000 adults. The included studies measured cognitive decline based on lower performance on cognitive tests or a diagnosis of dementia.

Older couple riding bicycles together
The study found that technology use had an effect on brain function similar to that of physical activity. (© M. Business – stock.adobe.com)

A reduced risk of cognitive decline

Overall, the study found greater use of technology was associated with a reduced risk of cognitive decline. Statistical tests were used to determine the “odds” of having cognitive decline based on exposure to technology. An odds ratio under 1 indicates a reduced risk from exposure, and the combined odds ratio in this study was 0.42. This means higher use of technology was associated with a 58% reduction in the risk of cognitive decline.
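For readers curious about the arithmetic behind that figure, here is a minimal sketch in Python. It simply converts an odds ratio below 1 into the percentage reduction it implies; treating an odds ratio as a direct stand-in for risk is a simplification for illustration, not the study's own analysis.

```python
# Illustrative arithmetic only: convert a pooled odds ratio into the
# percentage reduction quoted in the article. Equating an odds ratio
# with risk is a simplification, not the study's own method.

def percent_reduction(odds_ratio: float) -> float:
    """Percentage reduction implied by an odds ratio below 1."""
    return (1 - odds_ratio) * 100

print(round(percent_reduction(0.42)))  # 58 -- the figure reported for technology use
```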

This benefit was found even when other factors known to contribute to cognitive decline, such as socioeconomic status and other health factors, were accounted for.

Interestingly, the magnitude of the effect of technology use on brain function found in this study was similar to, or stronger than, that of other known protective factors, such as physical activity (approximately a 35% risk reduction) or maintaining healthy blood pressure (approximately a 13% risk reduction).

However, it is important to understand that there are far more studies, conducted over many years, examining the benefits of managing blood pressure and increasing physical activity, and the mechanisms through which they help protect our brains are far better understood.

It is also a lot easier to measure blood pressure than it is to measure technology use. A strength of this study is that it accounted for these difficulties by focusing on certain aspects of technology use while excluding others, such as brain-training games.

These findings are encouraging. But we still can’t say technology use causes better cognitive function. More research is needed to see whether these findings are replicated in different groups of people (especially those from low- and middle-income countries), who were underrepresented in this study, and to understand why this relationship might occur.

A question of ‘how’ we use technology

In reality, it’s simply not feasible to live in the world today without using some form of technology. Everything from paying bills to booking our next holiday is now done almost entirely online. Maybe we should instead be thinking about how we use technology.

Cognitively stimulating activities such as reading, learning a new language and playing music – particularly in early adulthood – can help protect our brains as we age.

Greater engagement with technology across our lifespan may be a form of stimulating our memory and thinking, as we adapt to new software updates or learn how to use a new smartphone. It has been suggested this “technological reserve” may be good for our brains.

Technology may also help us to stay socially connected, and help us stay independent for longer.

A rapidly changing digital world

While findings from this study show it’s unlikely all digital technology is bad for us, the way we interact with and rely on it is rapidly changing.

The impact of AI on the aging brain will only become evident in future decades. However, our ability to adapt to historical technological innovations, and the potential for this to support cognitive function, suggests the future may not be all bad.

For example, advances in brain-computer interfaces offer new hope for those experiencing the impact of neurological disease or disability.

However, the potential downsides of technology, including poor mental health, are real, particularly for younger people. Future research will help determine how we can capture the benefits of technology while limiting the potential for harm.

Nikki-Anne Wilson, Postdoctoral Research Fellow, Neuroscience Research Australia (NeuRA), UNSW Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.

