Spending lots of time watching TV in midlife may be bad for your brain health in your senior years, according to findings from three new studies.
The studies found that people who reported watching moderate to large amounts of TV in their 40s, 50s and early 60s experienced greater cognitive declines, and had lower volumes of gray matter in their brains, in their 70s and 80s, compared with people who reported watching very little TV in midlife. Gray matter is involved in many brain functions, including muscle control, vision, hearing and decision-making, the researchers said. Higher volumes of gray matter have been linked with better cognitive skills.
The studies, which will be presented this week at the American Heart Association’s Epidemiology, Prevention – Lifestyle & Cardiometabolic Health Conference 2021, used TV viewing as a proxy for sedentary behavior, or time spent sitting. A sedentary lifestyle has already been linked with several health problems, including an increased risk of heart disease, cancer, type 2 diabetes and early death. What’s more, regular exercise isn’t necessarily enough to make up for time spent sitting — a finding that was seen in both the current studies and previous research.
“In our findings, television viewing remained associated with cognitive function and gray matter volume after accounting for physical activity, suggesting that this sedentary behavior may impart a unique risk with respect to brain and cognitive health,” Ryan Dougherty, lead author of one of the studies and a postdoctoral fellow in the Department of Epidemiology at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland, said in a statement. Given that the biological processes that underlie dementia, such as brain atrophy, tend to start in midlife, “that’s a period [where] modifiable behaviors, such as excessive television viewing, can be targeted and reduced to promote healthy brain aging,” Dougherty said.
And some studies suggest that, as far as sedentary behaviors go, TV watching may pose particular risks, as it is a passive behavior that doesn’t involve a lot of cognitive stimulation, the researchers said.
“In the context of cognitive and brain health, not all sedentary behaviors are equal; non-stimulating sedentary activities such as television viewing are linked to greater risk of developing cognitive impairment, whereas cognitively stimulating sedentary activities [such as reading, computer and board games] are associated with maintained cognition and reduced likelihood of dementia,” Dougherty said.
Too much TV?
Two of the new studies used data from the Atherosclerosis Risk In Communities Neurocognitive Study (ARIC-NCS), which began in the mid-1980s, when participants were 45 to 64 years old. At that time, they were asked how much they watched TV during their leisure time, with responses recorded as “never or seldom” (low TV watching), “sometimes” (medium/moderate TV watching) or “often/very often” (high TV watching). Researchers followed up with the participants in the 1990s, when they again answered questions about their TV watching habits and completed cognitive tests. During another evaluation, between 2011 and 2013, they received brain MRI scans to look for structural markers of brain health, including the volume of gray matter.
One study, led by Priya Palta, an assistant professor of medical sciences and epidemiology at Columbia University, analyzed information from 10,700 adults in the ARIC-NCS study. The researchers focused on the results of participants’ cognitive tests, which included tests of memory, language and brain processing speed.
They found that people who reported moderate to high TV viewing in midlife experienced a 7% greater decline in cognitive function (based on their test results) over a 15-year period, compared with those who reported low TV viewing.
Another study, led by Kelley Pettee Gabriel, a professor of epidemiology in the School of Public Health at the University of Alabama at Birmingham, analyzed information from about 1,600 ARIC-NCS participants and focused on the results of their MRI scans.
They found that, compared with people who reported low TV viewing, those who reported moderate to high TV viewing had lower volumes of gray matter more than a decade later, indicating greater brain deterioration.
“Our findings suggest that the amount of television viewing, a type of sedentary behavior, may be related to cognitive decline and imaging markers of brain health,” Palta said. “Therefore, reducing sedentary behaviors, such as television viewing, may be an important lifestyle modification target to support optimal brain health.”
A third study, led by Dougherty, used data from the Coronary Artery Risk Development in Young Adults Study, which also began in the mid-1980s but involved people who were in their 30s at the study start, and followed these participants for 20 years. The researchers analyzed information from 600 participants, who were asked how many hours per day they spent watching TV, and also underwent brain MRI scans.
They found that more TV viewing was tied to lower gray-matter volume 20 years later. The researchers calculated that each one-hour increase in a person’s daily average TV viewing time was tied to a 0.5% reduction in gray-matter volume. That’s similar to the amount of gray-matter atrophy that’s typically seen over the course of a year in mid to late adulthood, Dougherty said.
Although the studies found an association between TV watching and cognitive decline and reduced brain volumes later in life, they cannot prove that heavy TV watching actually caused these outcomes. And while the studies accounted for some factors that may affect brain health — including age, education level and the presence of certain genes tied to Alzheimer’s risk — they did not ask about total sedentary time, or tease out TV viewing from other types of sedentary behavior. The studies also relied on participants’ reports of their own TV viewing time, which may not be accurate.
In addition, the studies cannot determine why TV viewing was linked with these outcomes. It’s unclear whether sedentary behavior is indeed responsible for the link or whether some other factors tied to TV watching, such as increased food consumption, may play a role.
The researchers said more studies are needed to confirm the findings, including studies that use objective measures of sedentary behavior (like activity trackers), and those that examine differences in passive and active sedentary behavior, in relation to cognitive decline and brain health markers.
Originally published on Live Science.