
Can shoes boost your brain power? What neuroscience says about the new claims

4 February 2026 at 03:00

Athletic footwear has entered a new era of ambition. No longer content to promise just comfort or performance, Nike claims its shoes can activate the brain, heighten sensory awareness and even improve concentration by stimulating the bottom of your feet.

“By studying perception, attention and sensory feedback, we’re tapping into the brain-body connection in new ways,” said Nike’s chief science officer, Matthew Nurse, in the company’s press release for the shoes. “It’s not just about running faster — it’s about feeling more present, focused and resilient.”

Other brands like Naboso sell “neuro-insoles,” socks and other sensory-based footwear to stimulate the nervous system.

It’s a compelling idea: The feet are rich in sensory receptors, so could stimulating them really sharpen the mind?

As a neurosurgeon who studies the brain, I’ve found that neuroscience suggests the reality is more complicated – and far less dramatic – than the marketing implies.

Close links between feet and brain

The soles of the feet contain thousands of mechanoreceptors that detect pressure, vibration, texture and movement.

Signals from these receptors travel through peripheral nerves to the spinal cord and up to an area of the brain called the somatosensory cortex, which maintains a map of the body. The feet occupy a meaningful portion of this map, reflecting their importance in balance, posture and movement.

Footwear also affects proprioception – the brain’s sense of where the body is in space – which relies on input from muscles, joints and tendons. Because posture and movement are tightly linked to attention and arousal, changes in sensory feedback from the feet can influence how stable, alert or grounded a person feels.

This is why neurologists and physical therapists pay close attention to footwear in patients with balance disorders, neuropathy or gait problems. Changing sensory input can alter how people move.

But influencing movement is not the same thing as enhancing cognition.

Minimalist shoes and sensory awareness

Minimalist shoes, with thinner soles and greater flexibility, allow more information about touch and body position to reach the brain compared with heavily cushioned footwear. In laboratory studies, reduced cushioning can increase a wearer’s awareness of where their foot is placed and when it’s touching the ground, sometimes improving their balance or the steadiness of their gait.

However, more sensation is not automatically better. The brain constantly filters sensory input, prioritizing what is useful and suppressing what is distracting. For people unaccustomed to minimalist shoes, the sudden increase in sensory feedback may increase cognitive load – drawing attention toward the feet rather than freeing mental resources for focus or performance.

Sensory stimulation can heighten awareness, but there is a threshold beyond which it becomes noise.

Can shoes improve concentration?

The claim that sensory footwear can improve concentration is where neuroscience becomes especially skeptical.

Sensory input from the feet activates somatosensory regions of the brain. But brain activation alone does not equal cognitive enhancement. Focus, attention and executive function depend on distributed networks involving various other areas of the brain, such as the prefrontal cortex, the parietal lobe and the thalamus. They also rely on neurotransmitters that modulate the nervous system, such as dopamine and norepinephrine.

There is little evidence that passive underfoot stimulation – textured soles, novel foam geometries or subtle mechanical features – meaningfully improves concentration in healthy adults. Some studies suggest that mild sensory input may increase alertness in specific populations – such as older adults training to improve their balance or people in rehabilitation for sensory loss – but these effects are modest and highly dependent on context.

Put simply, feeling more sensory input does not mean the brain’s attention systems are working better.

Belief, expectation and embodied experience

While shoes may not directly affect your cognition, that does not mean the mental effects people report are imaginary.

Belief and expectation still play a powerful role in medicine. Placebo effects and their influence on perception, motivation and performance are well documented in neuroscience. If someone believes a shoe improves focus or performance, that belief alone can change perception and behavior – sometimes enough to produce measurable effects.

There is also growing interest in embodied cognition, the idea that bodily states influence mental processes. Posture, movement and physical stability can shape mood, confidence and perceived mental clarity. Footwear that alters how someone stands or moves may indirectly influence how focused they feel, even if it does not directly enhance cognition.

In the end, believing a product gives you an advantage may be the most powerful effect it has.

Where science and marketing diverge

The problem is not whether footwear influences the nervous system – it does – but the imprecision of the claims. When companies claim their shoes are “mind-altering,” they often blur the distinction between sensory modulation and cognitive enhancement.

Neuroscience supports the idea that shoes can change sensory input, posture and movement. It does not support claims that footwear can reliably improve concentration or attention for the general population. If shoes truly produced strong cognitive changes, those effects would be robust, measurable and reproducible. So far, they are not.

Shoes can change how you feel in your body, how you move through space and how aware you are of your physical environment. Those changes may influence confidence, comfort and perception – all of which matter to experience.

But the most meaningful “mind-altering” effects a person can experience through physical fitness still come from sustained movement, training, sleep and attention – not from sensation alone. Footwear may shape how the journey feels, but it is unlikely to rewire the destination.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Wealthier men show higher metabolism in brain regions controlling reward and stress

3 February 2026 at 23:00

An analysis of positron emission tomography data in Korea found that higher family income was associated with increased neural activity (estimated through increased glucose metabolism) in the caudate, putamen, anterior cingulate, hippocampus, and amygdala of middle-aged men. These areas of the brain are involved in reward processing and stress regulation. The paper was published in the European Journal of Neuroscience.

Socioeconomic status refers to a person’s position in society based on income, education, and social standing. It is a powerful predictor of many life outcomes. Individuals with higher socioeconomic status tend to have better physical and mental health and to live longer. Lower socioeconomic status is associated with higher rates of cardiovascular disease, diabetes, depression, anxiety, and psychotic disorders.

Cognitive abilities, intelligence, and academic achievement also tend to be higher in individuals with higher socioeconomic status. These effects are thought to arise partly through neurobiological pathways shaped by long-term social and environmental exposure. Research in animals shows that social hierarchy can alter neurotransmitter systems, influencing motivation, stress sensitivity, and vulnerability to addiction.

In humans, differences in socioeconomic status have been linked to differences in language development, learning opportunities, and responses to reward as early as childhood. Later in life, higher socioeconomic status contributes to cognitive reserve, affecting how well individuals maintain cognitive function despite aging or brain pathology.

Study author Kyoungjune Pak and his colleagues wanted to explore the associations between neural activity in middle adulthood, education, and family income. They note that a lot of research has focused on children, young people, and the elderly, but that the number of studies on middle-aged adults is relatively low. On the other hand, this period of life is particularly important, as accumulated experiences and exposures associated with socioeconomic status can have lasting effects on brain health.

They analyzed positron emission tomography data of 233 healthy males who underwent a health check-up program at Samsung Changwon Hospital Health Promotion Center (in Changwon, South Korea) in 2013. Their average age was 43 years. Participants’ mean family income was 61,319 USD per year. On average, they completed 13–14 years of education. The authors also compared this group with 232 men whose socioeconomic data were missing, to check that the analyzed sample was representative.

In their analysis, study authors used positron emission tomography recordings of participants’ brains alongside data on family income and education level. They also used data on stress (collected using the Korean National Health and Nutrition Examination Survey), anxiety (the Beck Anxiety Inventory), and depression (the Centre for Epidemiologic Studies Depression Scale).

Results showed that individuals with higher family income tended to have a higher education level. Higher family income was also associated with increased glucose metabolism in the caudate, putamen, anterior cingulate, hippocampus, and amygdala regions of the brain.

This means that neural activity in these regions was higher in individuals with higher family income. These regions of the brain are involved in reward processing and stress regulation. Interestingly, education level was not associated with brain activity patterns.

“Family income and education level show differential associations with brain glucose metabolism in middle-aged males. Family income is associated with elevated brain glucose metabolism in regions involved in reward processing and stress regulation, suggesting a potential link between current socioeconomic resources and neural activity. However, these findings are cross-sectional and must be interpreted as associative rather than causal. Education level does not show a significant association with brain glucose metabolism,” the study authors concluded.

The study contributes to the scientific understanding of neural correlates of socioeconomic status. However, it is important to note that study participants were Korean middle-aged men, so it remains unknown how much these findings can be generalized to other demographic groups and other cultures.

The paper, “Family Income Is Associated With Regional Brain Glucose Metabolism in Middle-Aged Males,” was authored by Kyoungjune Pak, Seunghyeon Shin, Hyun-Yeol Nam, Keunyoung Kim, Jihyun Kim, Myung Jun Lee, and Ju Won Seok.


High-precision neurofeedback accelerates the mental health benefits of meditation

3 February 2026 at 15:00

A new study published in the journal Mindfulness has found that high-precision brain training can help novice meditators learn the practice more effectively. The findings indicate that neurofeedback can assist individuals in reducing self-critical or wandering thoughts. This training appears to lead to sustained improvements in mindful awareness and emotional well-being during subsequent daily life.

Meditation is often promoted for its ability to reduce stress and improve mental health. The practice frequently involves focusing attention on a specific anchor, such as the sensation of breathing.

The goal is to notice when the mind wanders and gently return focus to the breath. While the concept is simple, the execution is often difficult for beginners. Novices frequently struggle to recognize when their minds have drifted into daydreams or self-referential thinking. Because meditation is an internal mental process, it lacks the external feedback that accompanies learning physical skills.

“A key problem that motivated this project is ‘not being able to know whether what we are doing internally while meditating is what we were actually meant to be doing,’” said study author Saampras Ganesan, a postdoctoral research associate at the Laureate Institute for Brain Research and honorary research fellow at the University of Melbourne.

“You can look at a mirror to get live and detailed feedback while learning an intricate dance or exercise move. But this is not the case with something so abstract like meditation. This may be holding back the mental health benefits and wider impact that meditation could have in modern life.”

The researchers aimed to address this challenge by providing an external “mirror” for the mind. They sought to determine if real-time information about brain activity could act as a scaffold for learning.

The study focused on helping participants identify and reduce activity in the posterior cingulate cortex. This brain region is a key hub of the default mode network. This network typically becomes active when a person is not focused on the outside world, such as during daydreaming, worrying, or thinking about oneself.

To test this, the investigators recruited 40 healthy adults who had little to no prior experience with meditation. They screened these individuals to ensure they had no history of psychiatric or neurological conditions. The participants were randomly assigned to one of two groups. One group was the experimental condition, and the other served as a control.

The study employed a 7-Tesla fMRI scanner. This machine creates a magnetic field much stronger than the standard MRI scanners found in hospitals. The high magnetic field allows for extremely precise imaging of brain function. Participants lay inside the scanner and were instructed to practice focused attention meditation. They kept their eyes open and watched a visual display.

The display functioned like a thermometer. For the experimental group, the level on the thermometer changed based on the real-time activity of their own posterior cingulate cortex.

When they successfully focused on their breath and quieted this brain region, the thermometer reading went down. If their mind wandered and the region became active, the reading went up. This provided immediate confirmation of their internal mental state.
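In code, that mapping from brain signal to display resembles a simple baseline-normalized gauge. The sketch below is illustrative only – the function name, the z-score normalization and the bar scale are assumptions, not the study's actual real-time fMRI pipeline:

```python
def thermometer_level(pcc_signal, baseline_mean, baseline_sd, n_bars=10):
    """Map a real-time posterior cingulate cortex (PCC) signal to a
    discrete thermometer reading: activity above baseline raises the
    bar count, activity below baseline lowers it."""
    # Express the current signal as a z-score relative to a resting baseline
    z = (pcc_signal - baseline_mean) / baseline_sd
    # Centre the display at half scale and clip to the valid range
    level = round(n_bars / 2 + z)
    return max(0, min(n_bars, level))

# Mind wandering (PCC above baseline) pushes the thermometer up...
print(thermometer_level(pcc_signal=12.0, baseline_mean=10.0, baseline_sd=1.0))  # 7
# ...while focused attention (PCC suppressed) pushes it down.
print(thermometer_level(pcc_signal=8.0, baseline_mean=10.0, baseline_sd=1.0))   # 3
```

The essential point the sketch captures is that the display is a function of one region's activity relative to rest, which is why feedback from someone else's brain (the sham condition) is visually indistinguishable from one's own.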

The control group went through the exact same procedure with one critical difference. The feedback they saw was not from their own brains. Instead, they viewed a recording of brain activity from a participant in the experimental group.

This is known as “sham” feedback. It allowed the researchers to control for the effects of being in the scanner, seeing visual stimuli, and trying to meditate. The participants did not know which group they were in.

The training took place over two consecutive days. Following this laboratory phase, all participants were asked to continue meditating at home for one week. They used a mobile app to guide 5-minute meditation sessions. They also completed surveys to track their mood, stress levels, and mindful awareness.

The results revealed that the blinding procedure was successful. Participants in both groups believed they were receiving genuine feedback. They also reported similar levels of effort and perceived success. This suggests that any differences in outcomes were due to the specific brain training rather than placebo effects or expectations.

“Surprisingly, people could not easily tell whether the brain feedback came from their own brain (experimental group) or someone else’s (control group),” Ganesan told PsyPost. “Both groups rated the feedback as equally accurate – even though the group receiving their own brain feedback showed more meaningful positive changes in the brain circuit linked to meditation.”

“This suggests that people, especially beginners at meditation, may not be completely aware of all the factors driving effects in meditation, and that perceivable benefits may only become clearer with time and more consistent practice following targeted, reliable training.”

Despite these similar perceptions, the brain imaging data showed distinct differences. The experimental group exhibited a change in how their brain regions communicated.

Specifically, they developed a stronger negative connection between the posterior cingulate cortex and the dorsolateral prefrontal cortex. The dorsolateral prefrontal cortex is involved in executive functions, such as controlling attention and managing distractions.

This finding implies that the neurofeedback helped the experimental group recruit their brain’s control systems to down-regulate the mind-wandering network. This neural pattern was not observed in the control group.

The ability to suppress the default mode network is often associated with experienced meditators. The novices in the experimental group appeared to acquire this neural skill rapidly through the targeted feedback.

The benefits of the training extended beyond the laboratory. During the week of home practice, the experimental group maintained higher levels of mindful awareness. In contrast, the control group showed a decline in awareness over the week. This suggests that without the specific guidance provided by the neurofeedback, the control participants struggled to sustain the quality of their meditation practice.

The study also found improvements in emotional well-being. The experimental group reported a significant reduction in emotional distress. This measure combined ratings of depression, anxiety, and stress.

The researchers found a correlation between the brain changes and the mood improvements. Participants who showed the strongest connection between the attention and default mode networks experienced the greatest reduction in distress.

“Teaching people to meditate with live feedback from their own brain can help them meditate more effectively on their own over time, with early benefits for self-awareness and mood,” Ganesan explained. “For these benefits to matter, the brain feedback needs to be well-targeted and specific to the meditation goal – more precise feedback leads to stronger results.”

One unexpected finding involved a breath-counting task. This is an objective test often used to measure mindfulness. Participants press a button for each breath and a different button for every ninth breath.
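The scoring rule for such a task can be sketched in a few lines. This is a hypothetical scorer – the function name, button labels and handling of the nine-breath cycle are assumptions, not the instrument the researchers administered:

```python
def score_breath_counting(presses, cycle=9):
    """Score a breath-counting run: one button ('a') for breaths 1-8 of
    each cycle, another ('b') for every ninth breath.
    Returns the fraction of correct presses."""
    correct = 0
    for i, press in enumerate(presses, start=1):
        expected = "b" if i % cycle == 0 else "a"
        if press == expected:
            correct += 1
    return correct / len(presses)

# A perfect nine-breath cycle scores 1.0
print(score_breath_counting(["a"] * 8 + ["b"]))  # 1.0
```

Because the task demands active counting on top of breath awareness, it taxes working memory in a way that the "letting go" emphasized during neurofeedback training does not.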

The experimental group actually performed worse on this task after the training. The researchers suggest this might be because the task requires cognitive effort and counting. The neurofeedback training emphasized “letting go” of thoughts, which might have conflicted with the requirement to actively count.

As with all research, there are limitations. The sample size was relatively small. While 40 participants is common for complex neuroimaging studies, it is small for drawing broad behavioral conclusions. The equipment used is also rare and expensive. A 7-Tesla scanner is not a tool that can be easily deployed for general therapy or training.

“An important takeaway is that while the idea of using brain feedback to support meditation is promising, most current wearable and commercial devices are not yet reliable enough to deliver clear benefits,” Ganesan said. “Many studies testing such devices find little evidence beyond placebo, often because the brain signals used are not precise enough.”

“At present, there are no widely accessible, well-validated brain-feedback systems detailed enough to reliably guide meditation training and practice. Highly advanced brain-imaging approaches, like the one used in our study, show what may be possible in principle, but they are not practical for everyday use. As technology improves, reliable and scalable tools may emerge. But until then, the benefits of brain-feedback-assisted meditation will remain limited for most people.”

The follow-up period was also short. It remains unclear if the benefits would persist longer than one week without further reinforcement.

“While the study offers promising signs that detailed brain-feedback–supported meditation training can have real-world benefits, larger studies over longer periods are needed to confirm these results,” Ganesan told PsyPost. “A major strength of the current study is the use of a well-matched control group, which helped show that the benefits were greater than placebo or other unrelated effects.”

Future research will likely focus on whether these results can be replicated with larger groups. Scientists may also explore if similar results can be achieved using less expensive technology, such as EEG sensors. If scalable methods can be developed, this approach could offer a new way to support mental health treatments. It provides a proof of concept that technology can accelerate the learning curve for meditation.

“My long-term vision is to develop a scalable but personalized, science-backed brain-feedback tool that can reliably support meditation training and mental health at a population level,” Ganesan explained. “By developing such technology and making it accessible in schools, clinics, and homes, the goal is to promote everyday emotional well-being, strengthen mental resilience, and help reduce the burden of mental illness in the modern world.”

“While there are many types of meditation, the technique studied here – focused-attention or breathing-based meditation, often grouped under mindfulness – is widely regarded by researchers and meditation experts as a foundational practice,” the researcher added. “The skills developed through this form of meditation are considered essential for learning and practicing other techniques effectively. As a result, developing reliable and targeted brain-based tools to support training in this practice is especially valuable.”

The study, “Neurofeedback Training Facilitates Awareness and Enhances Emotional Well-being Associated with Real-World Meditation Practice: A 7-T MRI Study,” was authored by Saampras Ganesan, Nicholas T. Van Dam, Sunjeev K. Kamboj, Aki Tsuchiyagaito, Matthew D. Sacchet, Masaya Misaki, Bradford A. Moffat, Valentina Lorenzetti, and Andrew Zalesky.


Brain scans reveal neural connectivity deficits in Long COVID and ME/CFS

2 February 2026 at 19:00

New research suggests that the brains of people with Long COVID and Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS) struggle to communicate effectively during mentally tiring tasks. While healthy brains appear to tighten their neural connections when fatigued, these patients show disrupted or weakened signals between key brain areas. This study was published in the Journal of Translational Medicine.

ME/CFS and Long COVID are chronic conditions that severely impact the quality of life for millions of people. Patients often experience extreme exhaustion and “brain fog,” which refers to persistent difficulties with memory and concentration.

A defining feature of these illnesses is post-exertional malaise. This describes a crash in energy and a worsening of symptoms that follows even minor physical or mental effort. Doctors currently lack a definitive biological test to diagnose these conditions. This makes it difficult to distinguish them from one another or from other disorders with similar symptoms.

The research team sought to identify objective biological markers of these illnesses. Maira Inderyas, a PhD candidate at the National Centre for Neuroimmunology and Emerging Diseases at Griffith University in Australia, led the investigation. She worked alongside senior researchers including Professor Sonya Marshall-Gradisnik. They aimed to understand how the brain behaves when pushed to the limit of its cognitive endurance.

Professor Marshall-Gradisnik noted the shared experiences of these patient groups. “The symptoms include cognitive difficulties, such as memory problems, difficulties with attention and concentration, and slowed thinking,” Professor Marshall-Gradisnik said. The team hypothesized that these subjective feelings of brain fog would correspond to visible changes in brain activity.

To test this, the researchers utilized a 7 Tesla MRI scanner. This device is much more powerful than the standard scanners found in most hospitals. The high magnetic field allows for extremely detailed imaging of deep brain structures. It can detect subtle changes in blood flow that weaker scanners might miss.

The study involved nearly eighty participants. These included thirty-two individuals with ME/CFS and nineteen with Long COVID. A group of twenty-seven healthy volunteers served as a control group for comparison.

While inside the scanner, participants performed a cognitive challenge known as the Stroop task. This is a classic psychological test that requires focus and impulse control. Users must identify the color of a word’s ink while ignoring the actual word written. For example, the word “RED” might appear on the screen written in blue ink. The participant must select “blue” despite their brain automatically reading the word “red.”

“The task, called a Stroop task, was displayed to the participants on a screen during the scan, and required participants to ignore conflicting information and focus on the correct response, which places high demands on the brain’s executive function and inhibitory control,” Ms. Inderyas said.
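A single Stroop trial of the kind described can be sketched as a small data structure; the layout and function name here are illustrative, not the stimulus software used in the study:

```python
import random

COLORS = ["red", "blue", "green", "yellow"]

def make_stroop_trial(rng):
    """Generate one Stroop trial: a color word rendered in an ink color.
    On incongruent trials the word and the ink conflict, and the correct
    response is always the ink color, never the word itself."""
    word = rng.choice(COLORS)
    ink = rng.choice(COLORS)
    return {
        "word": word,
        "ink": ink,
        "congruent": word == ink,
        "correct_response": ink,
    }

rng = random.Random(0)  # seeded so trial sequences are reproducible
trial = make_stroop_trial(rng)
# e.g. the word "RED" printed in blue ink -> the correct answer is "blue"
```

Incongruent trials are the demanding ones: the automatic reading response must be inhibited in favor of the ink color, which is what loads executive function.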

The researchers structured the test to induce mental exhaustion. Participants performed the task in two separate sessions. The first session was designed to build up cognitive fatigue. The second session took place ninety seconds later, after fatigue had fully set in. This “Pre” and “Post” design allowed the scientists to see how the brain adapts to sustained mental effort.

The primary measurement used in this study was functional connectivity. This concept refers to how well different regions of the brain synchronize their activity. When two brain areas activate at the same time, it implies they are communicating or working together.
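In practice, functional connectivity between two regions is often quantified as the Pearson correlation between their activity time series over the scan. The following is a minimal sketch of that computation, not the authors' exact analysis pipeline:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two region time series: a common
    operationalization of functional connectivity in fMRI."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two regions whose activity rises and falls together are "functionally
# connected" (r near +1); an anticorrelated pair yields a negative value.
region_a = [1.0, 2.0, 3.0, 2.0, 1.0]
region_b = [-1.0, -2.0, -3.0, -2.0, -1.0]
print(round(pearson_r(region_a, region_b), 6))  # -1.0
```

Stronger or weaker correlations between sessions are what the "tightening" and "loosening" of networks described below refer to.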

The results revealed clear differences between the healthy volunteers and the patient groups. In healthy participants, the brain responded to the fatigue of the second session by increasing its connectivity. Connections between deep brain regions and the cerebellum became stronger. This suggests that a healthy brain actively recruits more resources to maintain performance when it gets tired. It becomes more efficient and integrated under pressure.

The pattern was markedly different for patients with Long COVID. They displayed reduced connectivity between the nucleus accumbens and the cerebellum. The nucleus accumbens is a central part of the brain’s reward and motivation system. A lack of connection here might explain the sense of apathy or lack of mental drive patients often report.

Long COVID patients also showed an unusual increase in connectivity between the hippocampus and the prefrontal cortex. The researchers interpret this as a potential compensatory mechanism. The brain may be trying to bypass damaged networks to keep functioning. It is attempting to use memory centers to help with executive decision-making.

Patients with ME/CFS showed their own distinct patterns of dysfunction. They exhibited increased connectivity between specific areas of the brainstem, such as the cuneiform nucleus and the medulla. These regions are responsible for controlling automatic body functions. This finding aligns with the autonomic nervous system issues frequently seen in ME/CFS patients.

The researchers also looked at how these brain patterns related to the patients’ medical history. In the ME/CFS group, the length of their illness correlated with specific connectivity changes. As the duration of the illness increased, communication between the hippocampus and cerebellum appeared to weaken. This suggests a progressive change in brain function over time.

Direct comparisons between the groups highlighted the extent of the impairment. When compared to the healthy controls, both patient groups showed signs of neural disorganization. The healthy brain creates a “tight” network to handle stress. The patient brains appeared unable to form these robust connections.

Instead of tightening up, the networks in sick patients became looser or dysregulated. This failure to adapt dynamically likely contributes to the cognitive dysfunction known as brain fog. The brain cannot summon the necessary energy or coordination to process information efficiently.

“The scans show changes in the brain regions which may contribute to cognitive difficulties such as memory problems, difficulty concentrating, and slower thinking,” Ms. Inderyas said. This provides biological validation for symptoms that are often dismissed as psychological.

The study does have some limitations that must be considered. The number of participants in each group was relatively small. This is common in studies using such advanced and expensive imaging technology. However, it means the results should be replicated in larger groups to ensure accuracy.

The researchers also noted that they lacked complete medical histories regarding prior COVID-19 infections for the ME/CFS group. It is possible that some ME/CFS patients had undiagnosed COVID-19 in the past. This could potentially blur the lines between the two conditions.

Future studies will need to follow patients over a longer period. Longitudinal research would help determine if these brain changes evolve or improve over time. It would also help clarify if these connectivity issues are a cause of the illness or a result of it.

Despite these caveats, the use of 7 Tesla fMRI offers a promising new direction for research. It has revealed abnormalities that standard imaging could not detect. These findings could eventually lead to new diagnostic tools. Identifying specific broken circuits may also help researchers target treatments more effectively.

The study, “Distinct functional connectivity patterns in myalgic encephalomyelitis and long COVID patients during cognitive fatigue: a 7 Tesla task-fMRI study,” was authored by Maira Inderyas, Kiran Thapaliya, Sonya Marshall-Gradisnik & Leighton Barnden.

The neural path from genes to intelligence looks different depending on your age

2 February 2026 at 17:00

New research published in Scientific Reports provides evidence that the path from genetic predisposition to general intelligence travels through specific, frequency-dependent networks in the brain. The findings indicate that these neural pathways are not static but appear to shift significantly between early adulthood and older age.

Intelligence is a trait with a strong biological basis. Previous scientific inquiries have established that genetic factors account for approximately 50% of the differences in intelligence between individuals. Genome-wide association studies have identified hundreds of specific variations in the genetic code that correlate with cognitive ability.

These variations are often aggregated into a metric known as a polygenic score, which estimates an individual’s genetic propensity for a certain trait. Despite this knowledge, the specific biological mechanisms that translate a genetic sequence into the ability to reason, plan, and solve problems remain unclear.
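Conceptually, a polygenic score is a weighted sum: each variant's allele count (0, 1 or 2 copies) multiplied by the effect size estimated for it in genome-wide association studies. A toy illustration with made-up effect sizes:

```python
def polygenic_score(allele_counts, effect_sizes):
    """Weighted sum of risk-allele counts (0, 1, or 2 per variant) and
    per-variant effect sizes estimated by genome-wide association studies."""
    assert len(allele_counts) == len(effect_sizes)
    return sum(c * w for c, w in zip(allele_counts, effect_sizes))

# Hypothetical person carrying 1, 2, and 0 copies of three trait-associated
# alleles, with small illustrative effect sizes (not real GWAS estimates)
counts = [1, 2, 0]
weights = [0.02, -0.01, 0.03]
print(polygenic_score(counts, weights))  # 0.0
```

Real polygenic scores sum over hundreds of thousands of variants, each contributing only a tiny amount, which is why the score captures a propensity rather than a deterministic prediction.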

Scientists have hypothesized that the brain’s functional connectivity acts as the intermediary between genes and behavior. Functional connectivity refers to how well different regions of the brain communicate with one another. While past studies using functional magnetic resonance imaging (fMRI) have attempted to map these connections, the results have been inconsistent.

fMRI is excellent at locating where brain activity occurs but is less precise at measuring when it occurs. The authors of the new study opted to use electroencephalography (EEG). This technology records the electrical activity of the brain with high temporal resolution, allowing researchers to observe the speed and rhythm of neural communication.

“We already know that intelligence is highly heritable, which is why we are especially interested in the role of the brain as a ‘neural pathway’ linking genetic variation to cognitive ability,” said study author Rebecca Engler of the Leibniz Research Centre for Working Environment and Human Factors (IfADo).

“The lack of integrative approaches combining genetics, brain network organization, and intelligence motivated us to take a closer look at resting-state EEG markers, with a particular focus on differences between young and older adults.”

“In a recent large-scale study (Metzen et al., 2024) using resting-state fMRI, we found no robust association between functional architecture of specific brain regions and intelligence. This motivated our shift toward resting-state EEG, which captures brain dynamics at much higher temporal resolution. EEG measures brain activity as oscillations across different frequencies, allowing us to study frequency-specific brain networks that may carry distinct information relevant to cognitive ability.”

For their study, the researchers recruited a representative sample of 434 healthy adults from the Dortmund Vital Study. The participants were categorized into two distinct age groups. The young adult group consisted of 199 individuals between the ages of 20 and 40. The older adult group included 235 individuals aged 40 to 70.

To measure intelligence, the research team administered a comprehensive battery of cognitive tests. These assessments covered a wide range of mental capabilities, including verbal memory, processing speed, attention span, working memory, and logical reasoning. The scores from these tests were combined to calculate a single factor of general intelligence, often denoted as g. This factor serves as a reliable summary of an individual’s overall cognitive performance.

Genetic data were obtained through blood samples. The researchers analyzed the DNA of each participant to compute a polygenic score for intelligence. This score was calculated based on summary statistics from previous large-scale genetic studies. It represents the cumulative effect of many small genetic variations that are statistically associated with higher cognitive function.

Brain activity was recorded while participants sat quietly with their eyes closed for two minutes. This “resting-state” EEG data allowed the researchers to analyze the intrinsic functional architecture of the brain.

The team employed a method known as graph theory to quantify the organization of the brain networks. In this framework, the brain is modeled as a collection of nodes (regions) and edges (connections).

The researchers calculated metrics such as “efficiency,” which measures how easily information travels across the network, and “clustering,” which measures how interconnected specific local neighborhoods of the brain are. These metrics were analyzed across different frequency bands, including delta, theta, alpha, and beta waves.
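These two graph metrics can be sketched in a few lines. The toy network below is invented for illustration; real analyses work on weighted connectivity matrices derived from EEG, but the definitions of efficiency (mean inverse shortest-path length) and clustering (how interconnected a node's neighbours are) are the same:

```python
from collections import deque

# Toy undirected brain network: nodes are regions, edges are connections.
# The topology here is invented for illustration.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
nodes = sorted({n for e in edges for n in e})
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def shortest_path_lengths(src):
    """BFS distances from src to every reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Global efficiency: average of 1/distance over all ordered node pairs.
n = len(nodes)
inv_dists = sum(1 / d for u in nodes
                for v, d in shortest_path_lengths(u).items() if v != u)
global_efficiency = inv_dists / (n * (n - 1))

# Local clustering: fraction of a node's neighbour pairs that are connected.
def clustering(u):
    k = len(adj[u])
    if k < 2:
        return 0.0
    links = sum(1 for v in adj[u] for w in adj[u] if v < w and w in adj[v])
    return 2 * links / (k * (k - 1))

print(round(global_efficiency, 3), round(clustering(2), 3))  # 0.833 0.333
```

In the study, metrics like these were computed separately for each frequency band, yielding band-specific network profiles.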

The study employed complex statistical modeling to test for mediation effects. A mediation analysis determines whether a third variable—in this case, brain connectivity—explains the relationship between an independent variable (genetics) and a dependent variable (intelligence). The researchers looked for instances where the polygenic score predicted a specific brain network property, which in turn predicted the intelligence score.
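The logic of a mediation analysis can be simulated directly. In this hedged sketch, the data are synthetic and the path coefficients (a = 0.5 for genes-to-brain, b = 0.7 for brain-to-intelligence) are invented; the point is only that the indirect effect is estimated as the product of the two paths:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic mediation chain: genes (X) -> brain connectivity (M) -> intelligence (Y).
X = rng.normal(size=n)                       # polygenic score
M = 0.5 * X + rng.normal(size=n)             # brain network property
Y = 0.7 * M + 0.2 * X + rng.normal(size=n)   # intelligence score

# Path a: slope of M regressed on X.
a = np.polyfit(X, M, 1)[0]

# Path b: effect of M on Y, controlling for X (multiple regression).
design = np.column_stack([np.ones(n), M, X])
b = np.linalg.lstsq(design, Y, rcond=None)[0][1]

indirect_effect = a * b  # the mediated effect, close to 0.5 * 0.7 = 0.35 here
print(round(indirect_effect, 2))
```

When the indirect effect is reliably nonzero, the mediator (here, a brain network property) statistically accounts for part of the gene-to-intelligence link.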

The results showed that global measures of brain efficiency did not mediate the link between genetics and intelligence. This suggests that simply having a “more efficient” brain overall is not the primary mechanism by which genes influence cognition.

In other words, “there is no single brain region responsible for intelligence,” Engler told PsyPost. “Instead, cognitive ability relies on efficient and dynamic communication across a broad network of regions throughout the brain, and this network organization changes as we age.”

The specific neural pathways identified varied substantially by age. For young adults, the connection between genetics and intelligence was mediated by brain activity in the beta and theta frequency bands. These effects were predominantly located in the frontal and parietal regions of the brain.

The frontal and parietal lobes are areas traditionally associated with executive functions, such as decision-making, working memory, and attention. This aligns with prominent theories that attribute intelligence to the efficient integration of information between these higher-order brain regions.

But for older adults, the mediating effects were found primarily in the low alpha and theta frequency bands. Furthermore, the specific brain regions involved shifted away from the frontal cortex. The analysis identified the superior parietal lobule and the primary visual cortex as key mediators. These areas are largely responsible for sensory processing and integration.

This shift suggests that the neural architecture supporting intelligence evolves as people age. In younger adulthood, cognitive ability appears to rely heavily on the rapid, high-frequency communication of executive control networks in the front of the brain. As the brain ages, it may undergo a process of reorganization.

The reliance on posterior brain regions and slower frequency bands in older adults implies a strategy that prioritizes the integration of sensory information. This finding is consistent with the concept of neural dedifferentiation, where the aging brain recruits broader, less specialized networks to maintain performance.

The researchers also found that certain brain areas, such as the primary visual cortex, played a consistent role across both groups, though the direction of the effect varied. In both young and older adults, higher nodal efficiency in the visual cortex was associated with higher intelligence.

However, a higher genetic predisposition for intelligence was associated with lower efficiency in this region. This complex relationship highlights that the genetic influence on the brain is not always a straightforward enhancement of connectivity.

“When comparing the two age groups, we were surprised that the brain regions consistently mediating the link between genetic variation and intelligence are primarily involved in sensory processing and integration,” Engler explained. “One might expect such stable neural anchors to be associated with higher-order executive functions like reasoning or planning, typically located in frontal networks. Instead, our results suggest that sensory and associative regions play a more central role in maintaining cognitive ability than is typically emphasized in dominant models of intelligence.”

As with all research, there are some limitations to note. The study utilized a cross-sectional design, meaning it compared two different groups of people at a single point in time. It did not follow the same individuals as they aged.

Consequently, it is not possible to definitively prove that the observed differences are caused by the aging process itself rather than generational differences. Longitudinal studies that track participants over decades would be necessary to confirm the shift in neural strategies.

The study focused exclusively on resting-state EEG. While intrinsic brain activity provides a baseline of functional organization, it does not capture the brain’s dynamic response to active problem-solving.

It is possible that different network patterns would emerge if participants were recorded while performing the cognitive tests. Future research could investigate task-based connectivity to see if it offers a stronger explanatory link between genetics and performance.

“A crucial next step would be to replicate our findings in independent samples to ensure their robustness and generalizability,” Engler said. “Furthermore, it would be interesting to investigate age-related changes in functional network organization from a longitudinal rather than from a cross-sectional perspective. A further long-term goal is to investigate the triad of genetic variants, the brain’s functional connectivity, and intelligence by analyzing task-based EEG data rather than resting-state EEG data.”

The study, “Electrophysiological resting-state signatures link polygenic scores to general intelligence,” was authored by Rebecca Engler, Christina Stammen, Stefan Arnau, Javier Schneider Penate, Dorothea Metzen, Jan Digutsch, Patrick D. Gajewski, Stephan Getzmann, Christoph Fraenz, Jörg Reinders, Manuel C. Voelkle, Fabian Streit, Sebastian Ocklenburg, Daniel Schneider, Michael Burke, Jan G. Hengstler, Carsten Watzl, Michael A. Nitsche, Robert Kumsta, Edmund Wascher, and Erhan Genç.

A process thought to destroy brain cells might actually help them store data

2 February 2026 at 05:00

Recent research provides evidence that the nervous system actively promotes the formation of amyloid structures to stabilize long-term memories. While amyloids are often associated with neurodegenerative conditions, this study identifies a specific protein chaperone that drives the creation of beneficial amyloids in response to sensory experiences. These findings, which offer a new perspective on how the brain encodes information, were published in the Proceedings of the National Academy of Sciences.

Scientists have studied the biological basis of memory for decades. A prevailing model posits that long-term memory requires the physical alteration of synapses, the connections between neurons. This process involves changes in the proteins located at these synapses.

One specific protein, known as Orb2 in fruit flies, plays a central role in this process. Orb2 creates a stable memory trace by self-assembling into an amyloid, a tight stack of proteins that is durable and self-perpetuating.

Most research on amyloids focuses on their toxic role in diseases such as Alzheimer’s. In those contexts, proteins misfold and aggregate in ways that damage cells. However, the brain appears to use a similar aggregation mechanism for beneficial purposes. The question remained regarding how the brain ensures that Orb2 forms amyloids only when a memory needs to be stored and not at random times.

A research team led by Kyle Patton investigated the regulatory systems that might control this precise timing. They hypothesized that molecular chaperones, which are proteins that assist others in folding or assembling, might be responsible for this regulation.

To identify the specific molecules involved, the researchers focused on the J-domain protein (JDP) family. This is a diverse group of chaperones known to regulate protein states. The team utilized Drosophila melanogaster, the common fruit fly, as their model organism. They examined 46 different JDPs found in the fly genome. The team narrowed their search to chaperones expressed in the mushroom body, a brain structure in insects that is essential for learning and memory.

The researchers conducted a genetic screen to determine which of these chaperones influenced memory retention. They used a classical conditioning experiment known as an associative appetitive memory paradigm. In this procedure, the researchers starved flies for a short period to motivate them. They then exposed the flies to two different odors. One odor was paired with a sugar reward, while the other was not. After training, the flies were given a choice between the two odors.

Most wild-type flies remember which odor predicts food for a certain period. The researchers genetically modified groups of flies to overexpress specific JDPs in their mushroom body neurons. They found that increasing the levels of one specific chaperone, named CG10375, significantly enhanced the flies’ ability to form long-term memories. The researchers named this protein “Funes,” inspired by a fictional character with the inability to forget.

The study showed that flies with elevated levels of Funes remembered the association between the odor and the sugar for much longer than control flies. This effect was specific to long-term memory. Short-term memory, which operates through different molecular mechanisms, appeared unaffected. This suggests that Funes plays a distinct role in the consolidation phase of memory storage.

To verify that Funes is necessary for memory—and not just a booster when artificially added—the team performed the reverse experiment. They used genetic tools to reduce the natural levels of Funes in the fly brain or to create mutations in the Funes gene.

Flies with reduced Funes activity were capable of learning the task initially. However, they failed to retain the memory 24 hours later. This indicates that Funes is an essential component of the natural machinery required for memory stabilization.

The researchers next investigated how Funes interacts with sensory information. Memory formation usually depends on the intensity of the experience. For example, a strong sugary reward creates a stronger memory than a weak one. The team tested Funes-overexpressing flies with lower concentrations of sugar and weaker odors.

Remarkably, flies with extra Funes formed robust memories even when the sensory cues were suboptimal. They learned effectively with much less sugar than typical flies required. This finding suggests that Funes helps signal the nutritional value or “salience” of the experience. It acts as a sensitizing agent, allowing the brain to encode memories of events that might otherwise be too faint to trigger long-term storage.

Following the behavioral tests, the researchers explored the molecular mechanism at play. They suspected that Funes acted by influencing Orb2, the memory protein known to form amyloids. They performed biochemical experiments to see if the two proteins interacted physically.

The results showed that Funes binds directly to Orb2. Specifically, it binds to Orb2 when it is in an oligomeric state, which is an intermediate stage between a single molecule and a full amyloid fiber.

The team then reconstituted the reaction in a test tube to observe it directly. They purified Funes and Orb2 proteins and mixed them in a controlled environment. When mixed, Funes accelerated the transition of Orb2 from these intermediate clusters into long, stable amyloid filaments. The researchers confirmed the presence of these structures using an amyloid-binding dye called Thioflavin T, which fluoresces when it attaches to amyloid fibers.

To ensure these laboratory-created fibers were the same as those found in living brains, the team utilized cryogenic electron microscopy (cryo-EM). This advanced imaging technique allows scientists to see the atomic structure of proteins.

The images revealed that the Orb2 amyloids created with the help of Funes were structurally identical to endogenous Orb2 amyloids extracted from fly heads. They possessed the same “cross-beta” architecture that characterizes functional amyloids.

The study further demonstrated that the “J-domain” of the Funes protein is essential for this activity. This domain is a specific section of the protein sequence that defines the JDP family.

The researchers generated a mutant version of Funes with a slight alteration in the J-domain. This mutant was able to bind to Orb2 but could not push it to form the final amyloid structure. When this mutant version was expressed in flies, it failed to enhance memory, confirming that the physical formation of the amyloid is the key to the memory-boosting effect.

Beyond structural formation, the researchers verified that these Funes-induced amyloids were functionally active. In the brain, Orb2 amyloids work by binding to specific messenger RNAs (mRNAs) and regulating their translation into new proteins.

The researchers used a reporter assay to measure this activity. They found that the amyloids facilitated by Funes successfully promoted the translation of target mRNAs, mimicking the natural biological process seen in memory consolidation.

One potential limitation of this study is its focus on Drosophila. While the fundamental molecular machinery of memory is highly conserved across species, it remains to be seen if a direct homolog of Funes performs the exact same function in mammals.

The human genome contains many J-domain proteins, and identifying which one corresponds functionally to Funes will be a necessary next step. The study suggests a link to human health, noting that some related chaperones have been genetically associated with schizophrenia, a condition that involves cognitive deficits.

Future research will likely investigate how Funes receives the signal to act. The current study shows that Funes responds to nutritional cues, but the precise signaling pathway that activates Funes remains to be mapped. Additionally, scientists will need to determine whether Funes regulates other proteins besides Orb2. It is possible that this chaperone manages a suite of proteins required for synaptic plasticity.

This work challenges the traditional view that amyloid formation is merely a pathological accident. It provides evidence that the brain has evolved sophisticated machinery to harness these stable structures for information storage. By identifying Funes, the researchers have pinpointed a control switch for this process, offering a potential target for understanding how memories persist over a lifetime.

The study, “A J-domain protein enhances memory by promoting physiological amyloid formation in Drosophila,” was authored by Kyle Patton, Yangyang Yi, Raj Burt, Kevin Kan-Shing Ng, Mayur Mukhi, Peerzada Shariq Shaheen Khaki, Ruben Hervas, and Kausik Si.

Speaking multiple languages appears to keep the brain younger for longer

People are living longer than ever around the world. Longer lives bring new opportunities, but they also introduce challenges, especially the risk of age-related decline.

Alongside physical changes such as reduced strength or slower movement, many older adults struggle with memory, attention and everyday tasks. Researchers have spent years trying to understand why some people stay mentally sharp while others deteriorate more quickly. One idea attracting growing interest is multilingualism, the ability to speak more than one language.

When someone knows two or more languages, all those languages remain active in the brain. Each time a multilingual person wants to speak, the brain must select the right language while keeping others from interfering. This constant mental exercise acts a bit like daily “brain training”.

Choosing one language, suppressing the others and switching between them strengthens brain networks involved in attention and cognitive control. Over a lifetime, researchers believe this steady mental workout may help protect the brain as it ages.

Studies comparing bilinguals and monolinguals have suggested that people who use more than one language might maintain better cognitive skills in later life. However, results across studies have been inconsistent. Some reported clear advantages for bilinguals, while others found little or no difference.

A new, large-scale study now offers stronger evidence and an important insight: speaking one extra language appears helpful, but speaking several seems even better.

This study analysed data from more than 86,000 healthy adults aged 51 to 90 across 27 European countries. Researchers used a machine-learning approach, meaning they trained a computer model to detect patterns across thousands of datapoints. The model estimated how old someone appeared based on daily functioning, memory, education level, movement and health conditions such as heart disease or hearing loss.

Comparing this “predicted age” with a person’s actual age created what the researchers called a “biobehavioural age gap”. This is the difference between how old someone is and how old they seem based on their physical and cognitive profile. A negative gap meant someone appeared younger than their actual age. A positive gap meant they appeared older.
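As a hedged sketch of this calculation (the ages and model predictions below are invented, not drawn from the study), the gap is simply the model's predicted age minus the person's chronological age:

```python
# Hypothetical example of a "biobehavioural age gap" calculation.
# In the study, predicted ages come from a machine-learning model trained
# on functioning, memory, education, movement, and health data.
actual_ages = [62, 70, 55]
predicted_ages = [58.4, 74.1, 55.9]  # invented model output

# Negative gap: appears younger than actual age; positive: appears older.
age_gaps = [p - a for p, a in zip(predicted_ages, actual_ages)]
print([round(g, 1) for g in age_gaps])  # [-3.6, 4.1, 0.9]
```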

The team then looked at how multilingual each country was by examining the percentage of people who spoke no additional languages, one, two, three or more. Countries with high multilingual exposure included places such as Luxembourg, the Netherlands, Finland and Malta, where speaking multiple languages is common. Countries with low multilingualism included the UK, Hungary and Romania.

People living in countries where multilingualism is common had a lower chance of showing signs of accelerated ageing. Monolingual speakers, by contrast, were more likely to appear biologically older than their actual age. Just one additional language made a meaningful difference. Several languages created an even stronger effect, suggesting a dose-dependent relationship in which each extra language provided an additional layer of protection.

These patterns were strongest among people in their late 70s and 80s. Knowing two or more languages did not simply help; it offered a noticeably stronger shield against age-related decline. Older multilingual adults seemed to carry a kind of built-in resilience that their monolingual peers lacked.

Could this simply reflect differences in wealth, education or political stability between countries? The researchers tested this by adjusting for dozens of national factors including air quality, migration rates, gender inequality and political climate. Even after these adjustments, the protective effect of multilingualism remained steady, suggesting that language experience itself contributes something unique.

Although the study did not directly examine brain mechanisms, many scientists argue that the mental effort required to manage more than one language helps explain the findings. Research shows that juggling languages engages the brain’s executive control system, the set of processes responsible for attention, inhibition and switching tasks.

Switching between languages, preventing the wrong word from coming out, remembering different vocabularies and choosing the right expression all place steady demands on these systems. Work in our lab has shown that people who use two languages throughout their lives tend to have larger hippocampal volume.

This means the hippocampus, a key brain region for forming memories, is physically bigger. A larger or more structurally robust hippocampus is generally linked to better memory and greater resistance to age-related shrinkage or neurodegenerative diseases such as Alzheimer’s.

This new research stands out for its scale, its long-term perspective and its broad approach to defining ageing. By combining biological, behavioural and environmental information, it reveals a consistent pattern: multilingualism is closely linked to healthier ageing. While it is not a magic shield, it may be one of the everyday experiences that help the brain stay adaptable, resilient and younger for longer.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

What brain scans reveal about people who move more

1 February 2026 at 21:00

New research indicates that physical movement may help preserve the ability to recall numbers over short periods by maintaining the structural integrity of the brain. These findings highlight potential biological pathways connecting an active lifestyle to cognitive health in later life. The analysis was published in the European Journal of Neuroscience.

As the global population ages, the prevalence of cognitive impairment and dementia has emerged as a primary public health concern. Memory decline compromises daily independence and social engagement. Medical experts have identified physical inactivity as a modifiable risk factor for this deterioration.

Prior investigations have consistently linked exercise to better cognitive performance. Researchers have found that older adults who maintain active lifestyles often exhibit preserved memory and executive function. However, the biological mechanisms driving this protective effect remain only partially understood.

The brain undergoes physical changes as it ages. These changes often include a reduction in volume and the accumulation of damage. Neuroscientists categorize brain tissue into gray matter and white matter.

Gray matter consists largely of neuronal cell bodies and is essential for processing information. White matter comprises the nerve fibers that transmit signals between different brain regions. The integrity of these tissues is essential for optimal cognitive function.

Another marker of brain health is the presence of white matter hyperintensities. These are small lesions that appear as bright spots on magnetic resonance imaging scans. They frequently indicate disease in the small blood vessels of the brain and are associated with cognitive decline.

Previous studies attempting to link activity with brain structure often relied on self-reported data. Surveys asking participants to recall their exercise habits are prone to inaccuracies and bias. People may not remember their activity levels correctly or may overestimate their exertion.

To address these limitations, a team of researchers conducted a large-scale analysis using objective data. The study was led by Xiaomin Wu and Wenzhe Yang from the Department of Epidemiology and Biostatistics at Tianjin Medical University in China. They utilized data from the UK Biobank, a massive biomedical database containing genetic and health information.

The researchers aimed to determine if objectively measured physical activity was associated with specific memory functions. They also sought to understand if structural markers in the brain could explain this relationship statistically. They focused on a sample of middle-aged and older adults.

The final analysis included 19,721 participants. The subjects ranged in age from 45 to 82 years. The study population was predominantly white and had a relatively high level of education.

Physical activity was measured using wrist-worn accelerometers. Participants wore these devices continuously for seven days. This method captured all movement intensity, frequency, and duration without relying on human memory.

The researchers assessed memory function using three distinct computerized tests. The first was a numeric memory test. Participants had to memorize a string of digits and enter them after they disappeared from the screen.

The second assessment was a visual memory test involving pairs of cards. Participants viewed the cards briefly and then had to match pairs from memory. The third was a prospective memory test, which required participants to remember to perform a specific action later in the assessment.

A subset of 14,718 participants also underwent magnetic resonance imaging scans. These scans allowed the researchers to measure total brain volume and the volumes of specific tissues. They specifically examined gray matter, white matter, and the hippocampus.

The hippocampus is a seahorse-shaped structure deep in the brain known to be vital for learning and memory. The researchers also quantified the volume of white matter hyperintensities. They then used statistical models to look for associations between activity, brain structure, and memory.

The study found a clear positive association between physical activity and performance on the numeric memory test. Individuals who moved more tended to recall longer strings of digits. This association held true even after adjusting for factors like age, education, and smoking status.

The results for the other memory tests were less consistent. Physical activity was not strongly linked to prospective memory. The link to visual memory was weak and disappeared in some sensitivity analyses.

When examining brain structure, the researchers observed that higher levels of physical activity correlated with larger brain volumes. Active participants had greater total brain volume. They also possessed higher volumes of both gray and white matter.

The scans also revealed that increased physical activity was associated with a larger hippocampus. This was observed in both the left and right sides of this brain region. Perhaps most notably, higher activity levels were linked to a lower volume of white matter hyperintensities.

The researchers then performed a pathway analysis to understand the mechanism. This statistical method estimates how much of the link between two variables is explained by a third variable. They tested whether the brain structures mediated the relationship between activity and numeric memory.

The analysis showed that brain structural markers explained a substantial portion of the memory benefits. Total brain volume, white matter volume, and gray matter volume all acted as mediators. White matter hyperintensities played a particularly strong role.

Specifically, the reduction in white matter hyperintensities accounted for nearly 30 percent of the total effect of activity on memory. This suggests that physical activity may protect memory partly by maintaining blood vessel health in the brain. Preventing small vessel damage appears to be a key pathway.
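As a back-of-the-envelope sketch of what "accounted for nearly 30 percent" means (the effect sizes below are invented for illustration), the proportion mediated is the indirect effect flowing through the mediator divided by the total effect:

```python
# Hypothetical standardized effects; values chosen so the white matter
# hyperintensity pathway carries roughly 30% of the activity-memory link.
total_effect = 0.10       # physical activity -> numeric memory (invented)
indirect_via_wmh = 0.03   # portion routed through white matter hyperintensities

proportion_mediated = indirect_via_wmh / total_effect
print(f"{proportion_mediated:.0%}")  # 30%
```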

The findings indicate that physical activity helps maintain the overall “hardware” of the brain. By preserving the volume of processing tissue and connection fibers, movement supports the neural networks required for short-term memory. The preservation of white matter integrity seems particularly relevant.

The researchers encountered an unexpected result regarding the hippocampus. Although physical activity was linked to a larger hippocampus, this volume increase did not explain the improvement in numeric memory. The pathway analysis did not find a significant mediating effect for this specific structure.

The authors suggest this may be due to the nature of the specific memory task. Recalling a string of numbers is a short-term working memory task. This type of cognitive effort relies heavily on frontoparietal networks rather than the hippocampus.

The hippocampus is more closely associated with episodic memory, or the recollection of specific events and experiences. The numeric test used in the UK Biobank may simply tap into different neural circuits. Consequently, the structural benefits to the hippocampus might benefit other types of memory not fully captured by this specific test.

The study provides evidence that the benefits of exercise are detectable in the physical structure of the brain. It supports the idea that lifestyle choices can buffer against age-related degeneration. The protective effects were observed in a non-demented population, suggesting benefits for generally healthy adults.

There are several important caveats to consider regarding this research. The study was cross-sectional in design. This means data on activity, brain structure, and memory were collected at roughly the same time.

Because of this design, the researchers cannot definitively prove causality. It is possible that people with healthier brains find it easier to be physically active. Longitudinal studies tracking changes over time are necessary to confirm the direction of the effect.

Another limitation is the composition of the study group. The UK Biobank participants tend to be healthier and wealthier than the general population. This “healthy volunteer” bias might limit how well the findings apply to broader, more diverse groups.

The measurement of physical activity, while objective, was limited to a single week. This snapshot might not perfectly reflect a person’s long-term lifestyle habits. However, it is generally considered more reliable than retrospective questionnaires.

Future research should explore these relationships in more diverse populations. Studies including participants with varying levels of cardiovascular health would be informative. Additionally, using a wider array of memory tests could help map specific brain changes to specific cognitive domains.

Despite these limitations, the study reinforces the importance of moving for brain health. It suggests that physical activity does not just improve mood or heart health. It appears to physically preserve the brain tissue required for cognitive function.

The preservation of white matter and the reduction of vascular damage markers stand out as key findings. These structural elements provide the connectivity and health necessary for the brain to operate efficiently. Simple daily movement may serve as a defense against the structural atrophy that often accompanies aging.

The study, “Association Between Physical Activity and Memory Function: The Role of Brain Structural Markers in a Cross-Sectional Study,” was authored by Xiaomin Wu, Wenzhe Yang, Yu Li, Luhan Zhang, Chenyu Li, Weili Xu, and Fei Ma.

Alcohol shifts the brain into a fragmented and local state

1 February 2026 at 17:00

A standard glass of wine or beer does more than just relax the body; it fundamentally alters the landscape of communication within the brain. New research suggests that acute alcohol consumption shifts neural activity from a flexible, globally integrated network to a more segmented, local structure. These changes in brain architecture appear to track with how intoxicated a person feels. The findings were published in the journal Drug and Alcohol Dependence.

For decades, neuroscientists have worked to map how alcohol affects human behavior. Traditional studies often look at specific brain regions in isolation. Researchers might observe dampened activity in the prefrontal cortex, which helps explain lowered inhibition. Alternatively, they might see changes in the cerebellum, which accounts for the loss of physical coordination.

However, the brain does not operate as a collection of independent islands. It functions as a massive, interconnected web. Information must travel constantly between different areas to process sights, sounds, and thoughts. Understanding how alcohol impacts the traffic patterns of this web requires a different mathematical approach known as graph theory.

Graph theory allows scientists to treat the brain like a vast map of cities and highways. The “cities” are distinct brain regions, referred to as nodes. The “highways” are the functional connections between them, known as edges. By analyzing the flow of traffic across these highways, researchers can determine how efficiently the brain is sharing information.

Leah A. Biessenberger and her colleagues at the University of Minnesota and the University of Florida sought to apply this network-level analysis to social drinkers. Biessenberger, the study’s lead author, worked alongside senior author Jeff Boissoneault and a wider team. They aimed to fill a gap in the scientific literature regarding acute alcohol use.

While previous research has examined how chronic, heavy drinking reshapes the brain over years, less is known about the immediate network effects of a single drinking session. The researchers wanted to observe the brain in a “resting state.” This is the baseline activity that occurs when a person is awake but not performing a specific task.

To investigate this, the team recruited 107 healthy adults between the ages of 21 and 45. The participants were social drinkers without a history of alcohol use disorder. The study utilized a double-blind, placebo-controlled design. This method is the gold standard for removing bias from clinical experiments.

Each participant visited the laboratory for two separate sessions. During one visit, they consumed a beverage containing alcohol mixed with a sugar-free mixer. The dose was calculated to bring their breath alcohol concentration to 0.08 grams per deciliter, which is the legal driving limit in the United States.

During the other visit, they received a placebo drink. This beverage contained only the mixer but was misted with a small amount of alcohol on the surface and rim to mimic the smell and taste of a real cocktail. Neither the participants nor the research staff knew which drink was administered on a given day.

Approximately 30 minutes after drinking, the participants entered an MRI scanner. They were instructed to keep their eyes open and let their minds wander. The scanner recorded the blood oxygen levels in their brains, which serves as a proxy for neural activity.

The researchers then used computational tools to analyze the functional connectivity between 106 different brain regions. They looked for specific patterns in the data, quantified by graph theory metrics including "global efficiency" and "local efficiency."

Global efficiency measures how easily information travels across the entire network. A network with high global efficiency has many long-distance shortcuts, allowing distant regions to communicate quickly. Local efficiency measures how well neighbors talk to neighbors. It reflects the tendency of brain regions to form tight-knit clusters that process information among themselves.
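As a rough illustration of what these metrics capture (not the authors' actual analysis pipeline), the sketch below computes global efficiency and the average clustering coefficient on a toy network of two tight-knit clusters joined by a single long-range shortcut; in the study itself, the nodes were 106 brain regions and the edges were functional connections:

```python
# Illustrative sketch only: toy versions of two graph-theory metrics used
# in network neuroscience. Node labels and the example graph are invented
# for demonstration.
from collections import deque

def shortest_path_lengths(adj, start):
    """Breadth-first search returning hop counts from start to each reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

def global_efficiency(adj):
    """Average inverse shortest-path length over all node pairs.

    Higher values mean distant regions can communicate in fewer hops.
    """
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for i in nodes:
        dist = shortest_path_lengths(adj, i)
        total += sum(1.0 / d for j, d in dist.items() if j != i)
    return total / (n * (n - 1))

def average_clustering(adj):
    """Mean fraction of each node's neighbor pairs that are also connected.

    Reflects the tendency of regions to form tight-knit local clusters.
    """
    coeffs = []
    for node, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

# Two triangles (local clusters) joined by one long-range "shortcut" edge:
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
    3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4},
}
print("global efficiency:", round(global_efficiency(adj), 3))
print("average clustering:", round(average_clustering(adj), 3))
```

Removing the shortcut edge in this toy graph would lower global efficiency (distant nodes lose their fast route) while leaving local clustering high, which is the direction of the shift the study reports under alcohol.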

The analysis revealed distinct shifts in the brain’s topology following alcohol consumption. When participants drank alcohol, their brains moved toward a more “grid-like” state. The network became less random and more clustered.

Specifically, the study found that global efficiency decreased in several areas. This was particularly evident in the occipital lobe, the part of the brain responsible for processing vision. The reduction suggests that alcohol makes it harder for visual information to integrate with the rest of the brain’s operations.

Simultaneously, local efficiency increased. Regions in the frontal and temporal cortices began to communicate more intensely with their immediate neighbors. The brain appeared to fracture into smaller, self-contained communities. This structure requires less energy to maintain but hinders the rapid integration of complex information.

The researchers also examined a metric called “clustering coefficient.” This value reflects the likelihood that a node’s neighbors are also connected to each other. Alcohol increased the clustering coefficient across the network. This further supports the idea that the intoxicated brain relies more on local processing than global integration.

The team also looked at the “insula,” a region deeply involved in sensing the body’s internal state. Under the influence of alcohol, the insula showed increased connections with its local neighbors. It also displayed greater activity in communicating with the broader network compared to the placebo condition.

These architectural changes were not merely abstract mathematical observations. The researchers found a statistical link between the network shifts and the participants’ subjective experiences. Before the scan, participants rated how intoxicated they felt on a scale of 0 to 100.

The results showed that the degree of network reorganization predicted the intensity of the subjective “buzz.” Participants whose brains showed the largest drop in global efficiency and the largest rise in local clustering tended to report feeling the most intoxicated. The structural breakdown of long-range communication tracked with the feeling of impairment.

This correlation offers new insight into why individuals react differently to the same amount of alcohol. Even at the same blood alcohol concentration, people experience varying levels of intoxication. The study suggests that individual differences in how the brain network fragments may underlie these varying subjective responses.

The findings also highlighted disruptions in the visual system. The decrease in efficiency within the occipital regions was marked. This aligns with well-known effects of drunkenness, such as blurred vision or difficulty tracking moving objects. The network analysis provides a neural basis for these sensory deficits.

While the study offers robust evidence, the authors note certain limitations. The MRI scans did not capture the cerebellum consistently for all participants. The cerebellum is vital for balance and motor control. Because it was not included in the analysis, the picture of alcohol’s effect on the whole brain remains incomplete.

Additionally, the study focused on young, healthy adults. The brain changes observed here might differ in older adults or individuals with a history of substance abuse. Aging brains already show some reductions in global efficiency. Alcohol could compound these effects in older populations.

The researchers also point out that the participants were in a resting state. The brain rearranges its network when actively solving problems or processing emotions. Future research will need to determine if these topological shifts persist or worsen when an intoxicated person tries to perform a complex task, like driving.

This investigation provides a nuanced view of acute intoxication. It moves beyond the idea that alcohol simply “dampens” brain activity. Instead, it reveals that alcohol forces the brain into a segregated state. Information gets trapped in local cul-de-sacs rather than traveling the superhighways of the mind.

By connecting these mathematical patterns to the subjective feeling of being drunk, the study helps bridge the gap between biology and behavior. It illustrates that the sensation of intoxication is, in part, the feeling of a brain losing its global coherence.

The study, “Acute alcohol intake disrupts resting state network topology in healthy social drinkers,” was authored by Leah A. Biessenberger, Adriana K. Cushnie, Bethany Stennett-Blackmon, Landrew S. Sevel, Michael E. Robinson, Sara Jo Nixon, and Jeff Boissoneault.

Long-term antidepressant effects of psilocybin linked to functional brain changes

31 January 2026 at 23:00

A new study suggests that the long-term antidepressant effects of psychedelics may be driven by persistent changes in how neurons fire rather than by the permanent growth of new brain cell connections. Researchers found that a single dose of psilocybin altered the electrical properties of brain cells in rats for months, even after physical changes to the neurons had disappeared. These findings were published in the journal Neuropsychopharmacology.

Depression is a debilitating condition that is often treated with daily medications. These standard treatments can take weeks to work and do not help every patient. Psilocybin, a compound found in certain mushrooms, has emerged as a potential alternative therapy. Clinical trials indicate that one or two doses of psilocybin can alleviate symptoms of depression for months or even years. However, scientists do not fully understand the biological mechanisms that allow a single treatment to produce such enduring results.

Researchers have previously focused on the concept of neuroplasticity to explain these effects. This term generally refers to the brain’s ability to reorganize itself. One specific type is structural plasticity, which involves the physical growth of new connection points between neurons, known as dendritic spines. Short-term studies conducted days or weeks after drug administration often show an increase in these spines. The question remained whether these physical structures persist long enough to account for relief lasting several months.

To investigate this, a team of researchers led by Hannah M. Kramer, Meghan Hibicke, and Charles D. Nichols at LSU Health Sciences Center designed an experiment using rats. They chose Wistar Kyoto rats for the study. This specific breed is often used in research because the animals naturally exhibit behaviors analogous to stress and depression in humans.

The investigators sought to compare the effects of psilocybin against another compound called 25CN-NBOH. Psilocybin interacts with various serotonin receptors in the brain. In contrast, 25CN-NBOH is a synthetic drug designed to target only one specific receptor known as the 5-HT2A receptor. This is the receptor believed to be primarily responsible for the psychedelic experience. By using both drugs, the team hoped to isolate the role of this specific receptor in creating long-term behavioral changes.

The study began with the administration of a single dose of either psilocybin, 25CN-NBOH, or a saline placebo to the male rats. The researchers then waited for a substantial period before testing the animals. They assessed the rats’ behavior at five weeks and again at twelve weeks after the injection. This timeline allowed the team to evaluate effects that persist well beyond the immediate aftermath of the drug experience.

The primary method for assessing behavior was the forced swim test. In this standard procedure, rats are placed in a tank of water from which they cannot escape. Researchers measure how much time the animals spend swimming versus floating motionless. In this context, high levels of immobility are interpreted as a passive coping strategy, which is considered a marker for depressive-like behavior. Antidepressant drugs typically cause rats to spend more time swimming and struggling.

The behavioral results indicated a lasting change. Rats treated with either psilocybin or 25CN-NBOH showed reduced immobility compared to the control group. This antidepressant-like effect was evident at the five-week mark. It remained equally strong at the twelve-week mark. The persistence of the effect suggests that the single dose induced a stable, long-term shift in behavior.

After the twelve-week behavioral tests, the researchers examined the brains of the animals. They focused on the medial prefrontal cortex. This brain region is involved in mood regulation and decision-making. The team utilized high-resolution microscopy to count the density of dendritic spines on the neurons. They specifically looked for the physical evidence of new connections that previous short-term studies had identified.

The microscopic analysis revealed that the number of dendritic spines in the treated rats was no different from that of the control group. The structural growth seen in other studies shortly after treatment appeared to be transient. The physical architecture of the neurons had returned to its baseline state after three months. The researchers also analyzed the expression of genes related to synaptic structure. They found no difference in gene activity between the groups.

Since structural changes could not explain the lasting behavioral shift, the team investigated functional plasticity. This refers to changes in how neurons process and transmit electrical signals. They prepared thin slices of the rats’ brain tissue. Using a technique called electrophysiology, they inserted microscopic glass pipettes into individual neurons to record their electrical activity.

The researchers classified the neurons into two types based on their firing patterns: adapting neurons and bursting neurons. Adapting neurons typically slow down their firing rate after an initial spike. Bursting neurons fire in rapid clusters of signals. The recordings showed that the drugs had altered the intrinsic electrical properties of these cells.

In the group treated with psilocybin, adapting neurons sat at a resting voltage that was closer to the threshold for firing. This state is known as depolarization. It means the cells are primed to activate more easily. The bursting neurons in psilocybin-treated rats also showed increased excitability. They required less input to trigger a signal and fired at faster rates than neurons in untreated rats.

The rats treated with 25CN-NBOH also exhibited functional changes, though the specific electrical alterations differed slightly from the psilocybin group. For instance, the bursting neurons in this group were not as easily triggered as those in the psilocybin group. However, the overall pattern confirmed that the drug had induced a lasting shift in neuronal function.

These electrophysiological findings provide a potential explanation for the behavioral results. While the physical branches of the neurons may have pruned back to normal levels, the cells “remembered” the treatment through altered electrical tuning. This functional shift allows the neural circuits to operate differently long after the drug has left the body.

The study implies that the 5-HT2A receptor is sufficient to trigger these long-term changes. The synthetic drug 25CN-NBOH produced lasting behavioral effects similar to psilocybin. This suggests that activating this single receptor type can initiate the cascade of events leading to persistent antidepressant-like effects.

There are limitations to this study that provide context for the results. The researchers used only male rats. Female rats may exhibit different biological responses to psychedelics or stress. Future research would need to include both sexes to ensure the findings are universally applicable.

Additionally, the forced swim test is a proxy for human depression but does not capture the full complexity of the human disorder. While it is a standard tool for screening antidepressant drugs, it measures a specific type of coping behavior. The translation of these specific neural changes to human psychology remains a subject for further investigation.

The researchers also noted that while spine density returned to baseline, this does not mean structural plasticity plays no role. It is possible that a rapid, temporary growth of connections acts as a trigger. This early phase might set the stage for the permanent electrical changes that follow. The exact molecular switch that locks in these functional changes remains to be identified.

Future studies will likely focus on the period between the initial dose and the three-month mark. Scientists need to map the transition from structural growth to functional endurance. Understanding this timeline could help optimize how these therapies are delivered.

The study, “Psychedelics produce enduring behavioral effects and functional plasticity through mechanisms independent of structural plasticity,” was authored by Hannah M. Kramer, Meghan Hibicke, Jason Middleton, Alaina M. Jaster, Jesper L. Kristensen and Charles D. Nichols.

Scientists identify key brain structure linked to bipolar pathology

31 January 2026 at 19:00

Recent analysis of human brain tissue suggests that a small and often overlooked region deep within the brain may play a central role in bipolar disorder. Researchers found that neurons in the paraventricular thalamic nucleus are depleted and genetically altered in people with the condition. These results point toward potential new targets for diagnosis and treatment. The findings were published in the journal Nature Communications.

Bipolar disorder is a mental health condition characterized by extreme shifts in mood and energy levels. It affects approximately one percent of the global population and can severely disrupt daily life. While medications such as lithium and antipsychotics exist, they do not work for every patient. These drugs also frequently carry difficult side effects that cause patients to stop taking them. To develop better therapies, medical experts need a precise map of what goes wrong in the brain.

Past research has largely focused on the outer layer of the brain known as the cortex. This area is responsible for higher-level thinking and processing. However, brain scans using magnetic resonance imaging have hinted that deeper structures also shrink in size during the course of the illness. One such structure is the thalamus. This central hub acts as a relay station for sensory information and emotional regulation.

Within the thalamus lies a specific cluster of cells called the paraventricular thalamic nucleus. This area is rich in chemical messengers and has connections to parts of the brain involved in emotion. Despite these clues, the molecular details of this region remained largely unmapped in humans. A team led by Masaki Nishioka and Tadafumi Kato from Juntendo University Graduate School of Medicine in Tokyo launched an investigation to bridge this gap. They collaborated with researchers including Mie Sakashita-Kubota to analyze postmortem brain tissue.

The researchers aimed to determine if the genetic activity in this deep brain region differed from healthy brains. They examined brain samples from 21 individuals who had been diagnosed with bipolar disorder and 20 individuals without psychiatric conditions. They looked at two specific areas: the frontal cortex and the paraventricular thalamic nucleus. To do this, they used a technique called single-nucleus RNA sequencing.

This technology allows researchers to catalog the genetic instructions being used by individual cells. By analyzing thousands of nuclei, the team could identify different cell types and see which genes were active or inactive. This provided a high-resolution view of the cellular landscape. They compared the data from the thalamus against the data from the cortex to see which region was more affected.

The analysis revealed that the thalamus had undergone substantial changes. Specifically, the paraventricular thalamic nucleus contained far fewer excitatory neurons in the samples from people with bipolar disorder. The researchers estimated a reduction of roughly 50 percent in these cells compared to the control group. This loss was specific to the neurons that send stimulating signals to other parts of the brain.

In contrast, the changes observed in the frontal cortex were much more subtle. While there were some alterations in the cortical cells, they were not as extensive as those seen in the deep brain. This suggests that the thalamus might be a primary site of pathology in the disorder. The team validated these findings by staining proteins in the tissue to visually confirm the lower cell density.

Inside the remaining thalamic neurons, the genetic machinery was also behaving differently. The study identified a reduced activity of genes responsible for maintaining connections between neurons. These genes are essential for the flow of chemical and electrical signals. Among the affected genes were CACNA1C and SHISA9. These specific segments of DNA have been flagged in previous genetic studies as potential risk factors for the illness.

Another gene called KCNQ3, which helps regulate electrical channels in cells, was also less active. These channels act like gates that let electrically charged potassium or calcium ions flow in and out of the cell. This flow is what allows a neuron to fire a signal. When the genes controlling these gates are turned down, the neuron may become unstable or fail to communicate.

The specific combination of affected genes suggests a vulnerability in how these cells handle calcium and electrical activity. High-frequency firing of neurons requires tight regulation of calcium levels. If the proteins that manage this process are missing, the cells might become damaged over time. This could explain why so many of these neurons were missing in the patient samples.

The team also looked at non-neuronal cells called microglia. These are the immune cells of the brain that help maintain healthy synapses. Synapses are the junction points where neurons pass signals to one another. The data showed that the communication between the thalamic neurons and these immune cells was disrupted.

A specific pattern of gene expression that usually coordinates the interaction between excitatory neurons and microglia was weaker in the bipolar disorder samples. This breakdown could contribute to the loss of synapses or the death of neurons. It represents a failure in the support system that keeps brain circuits healthy. The simultaneous decline in both neuron and microglia function suggests a coordinated failure in the region.

The researchers note that the paraventricular thalamic nucleus is distinct from other brain regions. It contains a high density of receptors for dopamine, a neurotransmitter involved in reward and motivation. This makes it a likely target for antipsychotic medications that act on the dopamine system. The specific genetic profile of these neurons aligns with biological processes previously linked to the disorder.

There are limitations to consider regarding these results. The study relied on postmortem tissue, so it represents a snapshot of the brain at the end of life. It is difficult to know for certain if the cell loss caused the disorder or if the disorder caused the cell loss. The sample size was relatively small, with only 41 donors in total.

Additionally, the patients had been taking various medications throughout their lives. These drugs can influence gene expression. The researchers checked for medication effects and found little overlap between drug signatures and their findings. However, they could not rule out medication influence entirely.

Looking ahead, the authors suggest that the paraventricular thalamic nucleus could be a target for new drugs. Therapies that aim to protect these neurons or restore their function might offer relief where current treatments fail. Advanced imaging could also focus on this region to help diagnose the condition earlier.

Associate Professor Nishioka emphasized the importance of looking beyond the usual suspects in brain research. “This study highlights the need to extend research to the subcortical regions of the brain, which may harbor critical yet underexplored components of BD pathophysiology,” Nishioka stated. The team hopes that integrating these molecular findings with neuroimaging will lead to better patient outcomes.

Professor Kato added that the findings could reshape how scientists view the origins of the illness. “We finally identified that PVT is the brain region causative for BD,” Kato said. “This discovery will lead to the paradigm shift of BD research.”

The study, “Disturbances of paraventricular thalamic nucleus neurons in bipolar disorder revealed by single-nucleus analysis,” was authored by Masaki Nishioka, Mie Sakashita-Kubota, Kouichirou Iijima, Yukako Hasegawa, Mizuho Ishiwata, Kaito Takase, Ryuya Ichikawa, Naguib Mechawar, Gustavo Turecki & Tadafumi Kato.

Alcohol triggers unique activity in amygdala neurons

31 January 2026 at 01:00

A study on mice identified a group of neurons in the central amygdala region of the brain that display a unique pattern of increased activity during voluntary alcohol consumption. While these neurons also responded to other fluids, their activity was significantly higher when mice drank alcohol compared to when they drank sucrose or water. This unique response did not diminish over time. The paper was published in Progress in Neuro-Psychopharmacology and Biological Psychiatry.

Alcohol use disorder is a chronic condition characterized by a problematic pattern of alcohol consumption that leads to significant distress or impairment in daily functioning. Despite treatment, relapses are frequent. Estimates suggest that around 30 million people in the U.S. alone are affected by it, which is around 9% of the population.

People with alcohol use disorder tend to have difficulty controlling how much they drink or how often they drink. They tend to continue drinking despite negative consequences. Common symptoms of this disorder include tolerance, withdrawal symptoms, and spending a great deal of time obtaining, using, or recovering from alcohol.

Excessive alcohol drinking, characteristic of alcohol use disorder, increases the risk of liver disease, cardiovascular problems, and certain cancers. It also has substantial psychological and social consequences, including depression, anxiety, family conflict, and work-related difficulties.

Study author Christina L. Lebonville and her colleagues note that studies of rodents have revealed that the central amygdala is a key region of the brain for alcohol drinking behaviors, particularly in alcohol dependence. This region contains three groups of neurons (sub-nuclei) that differ in the type of neuropeptide they express.

Neuropeptides are small protein-like molecules that neurons use to communicate with each other and to regulate various functions of the body. Unlike neurotransmitters, neuropeptides are released more slowly and they act over a longer time span.

One of these groups of neurons produces dynorphin, a neuropeptide involved in stress, pain, and negative emotional states. They are called dynorphin-expressing neurons or CeADyn neurons.

Previous studies implicated their activity in excessive alcohol drinking during both acute and chronic alcohol exposure. They also showed that CeADyn neurons regulate both binge alcohol drinking and stress-enhanced drinking in alcohol-dependent animals. Disrupting their activity reduced alcohol drinking.

This study was conducted on 35 prodynorphin-Cre mice. These genetically engineered mice allow researchers to selectively label, monitor, and manipulate CeADyn neurons. Mice were 8–17 weeks of age at the start of the experiment. They had free access to food throughout the experiment and free access to water outside experimental drinking sessions.

The study authors performed surgery on these mice, injecting a virus into the central amygdala. The virus delivered a gene for a fluorescent calcium sensor expressed in CeADyn neurons, allowing the authors to measure the cells' activity. At the same time, they implanted a small optical fiber above the region so that neural activity could be recorded through light signals (fiber photometry).
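Fiber photometry analyses typically normalize the raw fluorescence trace to a baseline before comparing conditions. The sketch below shows the standard delta-F-over-F formula as a generic illustration; the function name, the toy trace, and the choice of baseline are assumptions for the example, not the authors' specific normalization method:

```python
# Generic illustration of delta-F-over-F normalization, a standard step in
# fiber photometry analysis. Names and data here are hypothetical.
def delta_f_over_f(trace, baseline=None):
    """Normalize a fluorescence trace: dF/F = (F - F0) / F0."""
    if baseline is None:
        baseline = sum(trace) / len(trace)  # simple mean baseline (an assumption)
    return [(f - baseline) / baseline for f in trace]

# Toy trace: steady baseline fluorescence with a transient rise, as might
# accompany a drinking bout.
raw = [100.0, 100.0, 100.0, 120.0, 110.0, 100.0]
normalized = delta_f_over_f(raw, baseline=100.0)
print(normalized)  # peak of 0.2 = a 20% fluorescence increase over baseline
```

Expressing activity as a fractional change over baseline makes traces comparable across animals and sessions, since raw fluorescence varies with sensor expression and fiber placement.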

After recovery from surgery, mice were given access to different solutions for two hours per day, five to six days per week. In the first experiment, mice had access to 20% alcohol for three weeks, water for two weeks, and 0.5% sucrose for three weeks.

In the second experiment, mice first had access to solutions with different quinine concentrations, followed by water, water after 24 hours of water deprivation, a combination of 0.5% sucrose and low quinine concentrations, and 0.5% sucrose with high quinine concentrations. The study authors recorded the brain activity of the mice during these periods.

Results showed strong increases in CeADyn neuron activity after bouts of alcohol drinking compared to sucrose or water drinking. Behaviors specific to alcohol drinking, such as longer bout durations, did not fully explain why these neurons responded differently to alcohol than to other solutions.

“No other conditions or solutions tested reproduced the pronounced change in CeADyn activity associated with alcohol drinking. These findings support the presence of a unique functional signature for alcohol in a cell population known to control excessive alcohol drinking and further advance fiber photometric normalization and analytical methods,” the study authors concluded.

The study contributes to the scientific understanding of the neural underpinnings of alcohol drinking behaviors. However, it should be noted that the study was conducted on mice, not humans. While humans and mice share many physiological characteristics, they are still very different species, and findings in humans may differ.

The paper, “Alcohol drinking is associated with greater calcium activity in mouse central amygdala dynorphin-expressing neurons,” was authored by Christina L. Lebonville, Jennifer A. Rinker, Krysten O’Hara, Christopher S. McMahan, Michaela Hoffman, Howard C. Becker, and Patrick J. Mulholland.

Genetic risk for depression maps to specific structural brain changes

30 January 2026 at 21:00

A new comprehensive analysis has revealed that major depressive disorder alters both the physical architecture and the electrical activity of the brain in the same specific regions. By mapping these overlapping changes, researchers identified a distinct set of genes that likely drives these abnormalities during early brain development. The detailed results of this investigation were published in the Journal of Affective Disorders.

Major depressive disorder is a pervasive mental health condition that affects millions of people globally. It is characterized by persistent low mood and a loss of interest in daily activities. Patients often experience difficulties with cognitive function and emotional regulation.

While the symptoms are psychological, the condition is rooted in biological changes within the brain. Researchers have sought to understand the physical mechanisms behind the disorder for decades. The goal is to move beyond symptom management toward treatments that address the root biological causes.

Most previous research has looked at brain changes in isolation. Some studies use structural magnetic resonance imaging to measure the volume of gray matter. This tissue contains the cell bodies of neurons. A reduction in gray matter volume typically suggests a loss of neurons or a shrinkage of connections between them.

Other studies use functional magnetic resonance imaging. This technique measures blood flow to track brain activity. It looks at how well different brain regions synchronize their firing patterns or the intensity of their activity while the person is resting.

Results from these single-method studies have often been inconsistent. One study might find a problem in the frontal lobe, while another points to the temporal lobe. It has been difficult to know if structural damage causes functional problems or if they occur independently. Additionally, scientists know that genetics play a large role in depression risk. However, it remains unclear how specific genetic variations translate into the physical brain changes seen in patients.

To bridge this gap, a team of researchers led by Ying Zhai, Jinglei Xu, and Zhihui Zhang from Tianjin Medical University General Hospital conducted a large-scale study. They aimed to integrate data on brain structure, brain function, and genetics. Their primary objective was to find regions where structural and functional abnormalities overlap. They also sought to identify which genes might be responsible for these simultaneous changes.

The research team began by conducting a meta-analysis. This is a statistical method that combines data from many previous studies to find patterns that are too subtle for a single study to detect. They gathered data from 89 independent studies.

These included over 3,000 patients with major depressive disorder and a similar number of healthy control subjects for the structural analysis. The functional analysis included over 2,000 patients and controls. The researchers used a technique called voxel-wise analysis. This divides the brain into thousands of tiny three-dimensional cubes to pinpoint exactly where changes occur.

The team looked for three specific markers. First, they examined gray matter volume to assess physical structure. Second, they looked at regional homogeneity. This measures how synchronized a brain region is with its immediate neighbors. Third, they analyzed the amplitude of low-frequency fluctuations. This indicates the intensity of spontaneous brain activity. By combining these metrics, the researchers created a detailed map of the “depressed brain.”
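The two functional metrics have standard textbook definitions that can be sketched in a few lines of Python. This is a simplified illustration on synthetic signals, not the authors' analysis pipeline; the frequency band limits and normalization below are common defaults, not values taken from the paper:

```python
import numpy as np

def alff(ts, tr, low=0.01, high=0.08):
    """Amplitude of low-frequency fluctuations: mean spectral amplitude
    of a (demeaned) voxel time series within a low-frequency band."""
    ts = ts - ts.mean()
    freqs = np.fft.rfftfreq(len(ts), d=tr)   # tr = repetition time in seconds
    amp = np.abs(np.fft.rfft(ts)) / len(ts)
    band = (freqs >= low) & (freqs <= high)
    return amp[band].mean()

def reho(neighborhood):
    """Regional homogeneity: Kendall's coefficient of concordance (W)
    across the time series of a voxel and its neighbors.
    `neighborhood` is an (n_voxels, n_timepoints) array; W equals 1 when
    all series rise and fall in perfect lockstep."""
    k, n = neighborhood.shape
    ranks = neighborhood.argsort(axis=1).argsort(axis=1) + 1  # rank each series over time
    r_sum = ranks.sum(axis=0)                                 # summed ranks per time point
    s = ((r_sum - r_sum.mean()) ** 2).sum()
    return 12.0 * s / (k ** 2 * (n ** 3 - n))
```

A perfectly synchronized neighborhood yields W = 1, and a flat signal has zero low-frequency amplitude; real pipelines add detrending, filtering, and tie-aware ranking.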

The analysis revealed widespread disruptions. The researchers found that patients with depression consistently showed reduced gray matter volume in several key areas. These included the median cingulate cortex, the insula, and the superior temporal gyrus. These regions are essential for processing emotions and sensing the body’s internal state.

The functional data showed a more complicated picture. In some areas, brain activity was lower than normal. In others, it was higher. The researchers then overlaid the structural and functional maps to find the convergence points. This multimodal analysis uncovered two distinct patterns of overlap.

The first pattern involved regions that showed both physical shrinkage and reduced functional activity. This “double hit” was observed primarily in the median cingulate cortex and the insula. The insula helps the brain interpret bodily sensations, such as heartbeat or hunger, and links them to emotions. A failure in this region could explain why depressed patients often feel physically lethargic or disconnected from their bodies. The reduced activity and volume suggest a breakdown in the neural circuits responsible for emotional and sensory integration.

The second pattern was unexpected. Some regions showed reduced gray matter volume but increased functional activity. This occurred in the anterior cingulate cortex and parts of the frontal lobe. These areas are involved in self-reflection and identifying errors. The researchers suggest this hyperactivity might be a form of compensation.

The brain may be working harder to maintain normal function despite physical deterioration. Alternatively, this high activity could represent neural noise or inefficient processing. This might contribute to the persistent rumination and negative self-focus that many patients experience.

After mapping these brain regions, the researchers investigated the genetic underpinnings. They used a large database of genetic information from over 170,000 depression cases. They applied a method called H-MAGMA to prioritize genes associated with the disorder. They identified 1,604 genes linked to depression risk. The team then used the Allen Human Brain Atlas to see where these genes are expressed in the human brain. This atlas maps gene activity across different brain tissues.

The team then looked for a spatial correlation: were the depression-linked genes most active in the same brain regions that showed structural and functional abnormalities? They identified 279 genes whose expression was spatially linked to the overlapping brain abnormalities. These genes were not randomly distributed. They were highly expressed in the specific areas where the researchers had found the “double hit” of shrinkage and altered activity.
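The spatial-correlation step can be illustrated with synthetic numbers. Everything below is made up for demonstration; the actual study used atlas-derived expression maps and meta-analytic effect sizes:

```python
import numpy as np

# Hypothetical illustration: each entry is one brain parcel. One vector
# holds a gene's regional expression (as from a donor atlas), the other a
# map of case-control abnormality effect sizes. The spatial correlation
# asks: is the gene most expressed where the abnormality is largest?
rng = np.random.default_rng(0)
n_parcels = 100
abnormality = rng.normal(size=n_parcels)   # e.g. gray-matter deficit per parcel
expression = 0.6 * abnormality + rng.normal(scale=0.8, size=n_parcels)

r = np.corrcoef(expression, abnormality)[0, 1]   # Pearson spatial correlation
print(f"spatial r = {r:.2f}")
```

In practice such correlations must be tested against spatially autocorrelated null maps (for example, “spin tests”), because neighboring parcels are not independent samples.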

The researchers then performed an enrichment analysis to understand what these 279 genes do. The results pointed toward biological processes that happen very early in life. The genes were heavily involved in the development of the nervous system. They play roles in neuron projection guidance, which is how neurons extend their fibers to connect with targets. They are also involved in synaptic signaling, the process by which neurons communicate.

The study also looked at when these genes are most active. The data showed that these genes are highly expressed during fetal development. They are particularly active in the cortex and hippocampus during the middle to late fetal stages. This suggests that the vulnerability to depression may be established long before birth. Disruptions in these genes during critical developmental windows could lead to the structural weak points identified in the MRI scans.

The researchers also examined which types of cells use these genes. They found that the genes were predominantly expressed in specific types of neurons in the cortex and striatum. This includes neurons that use dopamine, a chemical messenger vital for motivation and pleasure. This connects the genetic findings to the known symptoms of depression, such as anhedonia, or the inability to feel pleasure.

There are limitations to this study that should be noted. The meta-analysis relied on coordinates reported in previous papers rather than raw brain scans. This can slightly reduce the precision of the location data. Additionally, the gene expression data came from the Allen Human Brain Atlas, which is based on healthy adult brains. It does not reflect how gene expression might change in a depressed brain.

The study was also cross-sectional. This means it looked at a snapshot of patients at one point in time. It cannot prove that the brain shrinkage caused the depression or vice versa. The researchers also noted that demographic factors like age and sex influence brain structure. While they controlled for these variables statistically, future research should look at how these patterns differ between men and women or across different age groups.

Future research will need to verify these findings using longitudinal data. Scientists need to track individuals over time to see how gene expression interacts with environmental stressors to reshape the brain. The team suggests that future studies should also incorporate environmental data. Factors such as inflammation or stress exposure could modify how these risk genes affect brain structure.

This study represents a step forward in integrating different types of biological data. It moves beyond viewing depression as just a chemical imbalance or a structural deficit. Instead, it presents a cohesive model where genetic risks during development lead to specific structural and functional vulnerabilities. These physical changes then manifest as the emotional and cognitive symptoms of depression.

The study, “Neuroimaging-genetic integration reveals shared structural and functional brain alterations in major depressive disorder,” was authored by Ying Zhai, Jinglei Xu, Zhihui Zhang, Yue Wu, Qian Wu, Minghuan Lei, Haolin Wang, Qi An, Wenjie Cai, Shen Li, Quan Zhang, and Feng Liu.

New maps of brain activity challenge century-old anatomical boundaries

30 January 2026 at 01:00

New research challenges the century-old practice of mapping the brain based on how tissue looks under a microscope. By analyzing electrical signals from thousands of neurons in mice, scientists discovered that the brain’s command center organizes itself by information flow rather than physical structure. These findings appear in the journal Nature Neuroscience.

The prefrontal cortex acts as the brain’s executive hub. It manages complex processes such as planning, decision-making, and reasoning. Historically, neuroscientists defined the boundaries of this region by studying cytoarchitecture. This method involves staining brain tissue and observing the arrangement of cells. The assumption has been that physical differences in cell layout correspond to distinct functional jobs.

However, the connection between these static maps and the dynamic electrical firing of neurons remains unproven. A research team led by Marie Carlén at the Karolinska Institutet in Sweden sought to test this long-standing assumption. Pierre Le Merre and Katharina Heining served as the lead authors on the paper. They aimed to create a functional map based on what neurons actually do rather than just where they sit.

To achieve this, the team performed an extensive analysis of single-neuron activity. They focused on the mouse brain, which serves as a model for mammalian neural structure. The researchers implanted high-density probes known as Neuropixels into the brains of awake mice. These advanced sensors allowed them to record the electrical output of more than 24,000 individual neurons.

The study included recordings from the prefrontal cortex as well as sensory and motor areas. The investigators first analyzed spontaneous activity. This refers to the electrical firing that occurs when the animal is resting and not performing a specific task. Spontaneous activity offers a window into the intrinsic properties of a neuron and its local network.

The team needed precise ways to describe this activity. Simply counting the number of electrical spikes per second was insufficient. They introduced three specific mathematical metrics to characterize the firing patterns. The first metric was the firing rate, or how often a neuron sends a signal.

The second metric was “burstiness.” This describes the irregularity of the intervals between spikes. A neuron with high burstiness fires in rapid clusters followed by silence. A neuron with low burstiness fires with a steady, metronomic rhythm.

The third metric was “memory.” This measures the sequential structure of the firing. It asks whether the length of one interval between spikes predicts the length of the next one. Taken together, these three variables provided a unique “fingerprint” for every recorded neuron.
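These three descriptors can be computed directly from a list of spike times. The sketch below uses common definitions (burstiness as the Goh-Barabási normalized coefficient of variation, memory as the lag-1 correlation of successive inter-spike intervals); the paper's exact formulas may differ:

```python
import numpy as np

def spike_metrics(spike_times):
    """Three descriptors of a spike train, from inter-spike intervals (ISIs)."""
    spike_times = np.asarray(spike_times, dtype=float)
    isi = np.diff(spike_times)
    rate = len(spike_times) / (spike_times[-1] - spike_times[0])  # spikes per second
    cv = isi.std() / isi.mean()
    burstiness = (cv - 1.0) / (cv + 1.0)  # -1 = metronomic, ~0 = Poisson-like, >0 = bursty
    if isi[:-1].std() == 0 or isi[1:].std() == 0:
        memory = 0.0                      # regular train: no sequential structure
    else:
        memory = np.corrcoef(isi[:-1], isi[1:])[0, 1]  # does one ISI predict the next?
    return rate, burstiness, memory
```

A metronomic train scores a burstiness of -1, a Poisson-like train scores about 0, and clustered firing pushes the score positive.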

The researchers used a machine learning technique called a Self-Organizing Map to sort these fingerprints. This algorithm grouped neurons with similar firing properties together. It allowed the scientists to visualize the landscape of neuronal activity without imposing human biases.
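The sorting step can be sketched with a minimal Self-Organizing Map written from scratch. This is a toy implementation with made-up grid size and learning schedules, not the authors' code:

```python
import numpy as np

def train_som(data, grid=(6, 6), n_iter=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map: each grid node holds a weight vector;
    for each sample, the best-matching node and its grid neighbors are
    pulled toward the sample, so similar inputs land on nearby nodes."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], dtype=float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                # decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5    # shrinking neighborhood radius
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        theta = np.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian neighborhood kernel
        weights += lr * theta[:, None] * (x - weights)
    return weights

def assign(data, weights):
    """Label each sample with its best-matching node."""
    d = ((data[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)
```

Feeding in per-neuron fingerprints (rate, burstiness, memory) would group similarly firing neurons onto neighboring grid nodes without any anatomical labels.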

The analysis revealed a distinct signature for the prefrontal cortex. Neurons in this area predominantly displayed low firing rates and highly regular rhythms. They did not fire in erratic bursts. This created a “low-rate, regular-firing” profile that distinguished the prefrontal cortex from other brain regions.

The team then projected these activity profiles back onto the physical map of the brain. They compared the boundaries of their activity-based clusters with the traditional cytoarchitectural borders. The two maps did not align.

Regions that looked different under a microscope often contained neurons with identical firing patterns. Conversely, regions that looked the same structurally often hosted different types of activity. The distinct functional modules of the prefrontal cortex ignored the classical boundaries drawn by anatomists.

Instead of anatomy, the activity patterns aligned with hierarchy. In neuroscience, hierarchy refers to the order of information processing. Sensory areas that receive raw data from the eyes or ears are at the bottom of the hierarchy. The prefrontal cortex, which integrates this data to make decisions, sits at the top.

The researchers correlated their activity maps with existing maps of brain connectivity. They found that regions higher up in the hierarchy consistently displayed the low-rate, regular-firing signature. This suggests that the way neurons fire is determined by their place in the network, not by the local architecture of the cells.

This finding aligns with theories about how the brain processes information. Sensory areas need to respond quickly to changing environments, requiring fast or bursty firing. High-level areas need to integrate information over time to maintain stable plans. A slow, regular rhythm is ideal for holding information in working memory without being easily distracted by noise.

The study then moved beyond resting activity to examine goal-directed behavior. The mice performed a task where they heard a tone or saw a visual stimulus. They had to turn a wheel to receive a water reward. This allowed the researchers to see how the functional map changed during active decision-making.

The team identified neurons that were “tuned” to specific aspects of the task. Some neurons responded only to the sound. Others fired specifically when the mouse made a choice to turn the wheel.

When they mapped these task-related neurons, they again found no relation to the traditional anatomical borders. The functional activity formed its own unique territories. One specific finding presented a paradox.

The researchers had established that the hallmark of the prefrontal cortex was slow, regular firing. However, the specific neurons that coded for “choice” (the act of making a decision) tended to have high firing rates. These “decider” neurons were chemically and spatially mixed in with the “integrator” neurons but behaved differently.

This implies a separation of duties within the same brain space. The general population of neurons maintains a slow, steady rhythm to provide a stable platform for cognition. Embedded within this stable network are specific, highly excitable neurons that trigger actions.

The overlap of these two populations suggests that connectivity shapes the landscape. The high-hierarchy network supports the regular firing. Within that network, specific inputs drive the high-rate choice neurons.

These results suggest that intrinsic connectivity is the primary organizing principle of the prefrontal cortex. The physical appearance of the tissue is a poor predictor of function. “Our findings challenge the traditional way of defining brain regions and have major implications for understanding brain organisation overall,” says Marie Carlén.

The study does have limitations. It relied on data from mice. While mouse and human brains share many features, the human prefrontal cortex is far more complex. Additionally, the recordings focused primarily on the deep layers of the cortex. These layers are responsible for sending output signals to other parts of the brain.

The activity in the surface layers, which receive input, might show different patterns. The study also looked at a limited set of behaviors. Future research will need to explore whether these maps hold true across different types of cognitive tasks.

Scientists must also validate these metrics in other species. If the pattern holds, it could provide a new roadmap for understanding brain disorders. Many psychiatric conditions involve dysfunction in the prefrontal cortex. Understanding the “normal” activity signature—slow and regular—could help identify what goes wrong in disease.

This data-driven approach offers a scalable framework. It moves neuroscience away from subjective visual descriptions toward objective mathematical categorization. It suggests that to understand the brain, we must look at the invisible traffic of electricity rather than just the visible roads of tissue.

The study, “A prefrontal cortex map based on single-neuron activity,” was authored by Pierre Le Merre, Katharina Heining, Marina Slashcheva, Felix Jung, Eleni Moysiadou, Nicolas Guyon, Ram Yahya, Hyunsoo Park, Fredrik Wernstal, and Marie Carlén.

Menopause is linked to reduced gray matter and increased anxiety

29 January 2026 at 01:00

New research suggests that menopause is accompanied by distinct changes in the brain’s structure and a notable increase in mental health challenges. While hormone replacement therapy appears to aid in maintaining reaction speeds, it does not seem to prevent the loss of brain tissue or alleviate symptoms of depression according to this specific dataset. These observations were published online in the journal Psychological Medicine.

Menopause represents a major biological transition marked by the cessation of menstruation and a steep decline in reproductive hormones. Women frequently report a variety of symptoms during this time, ranging from hot flashes to difficulties with sleep and mood regulation.

Many individuals turn to hormone replacement therapy to manage these physical and psychological obstacles. Despite the common use of these treatments, the medical community still has questions about how these hormonal shifts affect the brain itself. Previous research has yielded mixed results regarding whether hormone treatments protect the brain or potentially pose risks.

To clarify these effects, a team of researchers from the University of Cambridge undertook a large-scale analysis. Katharina Zuhlsdorff, a researcher in the Department of Psychology at the University of Cambridge, served as the lead author on the project.

She worked alongside senior author Barbara J. Sahakian and colleagues from the Departments of Psychiatry and Psychology. Their objective was to provide a clearer picture of how the end of fertility influences mental well-being, thinking skills, and the physical architecture of the brain.

The team utilized data from the UK Biobank, a massive biomedical database containing genetic and health information from half a million participants. For this specific investigation, they selected a sample of nearly 125,000 women.

The researchers divided these participants into three distinct groups to allow for comparison. These groups included women who had not yet gone through menopause, post-menopausal women who had never used hormone therapy, and post-menopausal women who were users of such therapies.

The investigation first assessed psychological well-being across the different groups. The data showed that women who had passed menopause reported higher levels of anxiety and depression compared to those who had not.

Sleep quality also appeared to decline after this biological transition. The researchers observed that women taking hormone replacement therapy actually reported more mental health challenges than those who did not take it. This group also reported higher levels of tiredness.

This result initially seemed counterintuitive, as hormone therapy is often prescribed to help with mood. To understand this, the authors looked backward at the medical history of the participants. They found that women prescribed these treatments were more likely to have had depression or anxiety before they ever started the medication. This suggests that doctors may be prescribing the hormones specifically to women who are already struggling with severe symptoms.

The study also tested how quickly the participants could think and process information. The researchers found that reaction times typically slow down as part of the aging process.

However, menopause seemed to speed up this decline in processing speed. In this specific domain, hormone therapy appeared to offer a benefit. Post-menopausal women taking hormones had reaction times that were faster than those not taking them, effectively matching the speeds of pre-menopausal women.

Dr. Katharina Zühlsdorff noted the nuance in these cognitive findings. She stated, “Menopause seems to accelerate this process, but HRT appears to put the brakes on, slowing the ageing process slightly.”

While reaction times varied, the study did not find similar differences in memory performance. The researchers administered tasks designed to test prospective memory, which is the ability to remember to perform an action later. They also used a digit-span task to measure working memory capacity. Across all three groups, performance on these memory challenges remained relatively comparable.

A smaller subset of about 11,000 women underwent magnetic resonance imaging scans to measure brain volume. The researchers focused on gray matter, the tissue containing the body of nerve cells. They specifically looked at regions involved in memory and emotional regulation. These included the hippocampus, the entorhinal cortex, and the anterior cingulate cortex.

The hippocampus is a seahorse-shaped structure deep in the brain that is essential for learning and memory. The entorhinal cortex functions as a gateway, channeling information between the hippocampus and the rest of the brain. The anterior cingulate cortex plays a primary role in managing emotions, impulse control, and decision-making.

The scans revealed that post-menopausal women had reduced gray matter volume in these key areas compared to pre-menopausal women. This reduction may help explain the higher rates of mood issues in this group. Unexpectedly, the group taking hormone therapy showed the lowest brain volumes of all. The treatment did not appear to prevent the loss of brain tissue associated with the end of reproductive years.

The specific regions identified in the study are often implicated in neurodegenerative conditions. Professor Barbara Sahakian highlighted the potential long-term importance of this observation. She explained, “The brain regions where we saw these differences are ones that tend to be affected by Alzheimer’s disease. Menopause could make these women vulnerable further down the line.”

While the sample size was large, the study design was observational rather than experimental. This means the researchers could identify associations but cannot definitively prove that menopause or hormone therapy caused the changes.

The UK Biobank population also tends to be wealthier and healthier than the general public, which may skew the results. Additionally, the study relied on self-reported data for some measures, which can introduce inaccuracies.

The finding regarding hormone therapy and lower brain volume is difficult to interpret without further research. It remains unclear if the medication contributes to the reduction or if the women taking it had different brain structures to begin with.

The researchers emphasize that more work is needed to disentangle these factors. Future studies could look at genetic factors or other health conditions that might influence how hormones affect the brain.

Despite these limitations, the research highlights the biological reality of menopause. It confirms that the transition involves more than just reproductive changes.

Study co-author Christelle Langley emphasized the need for broader support systems. She remarked, “We all need to be more sensitive to not only the physical, but also the mental health of women during menopause, however, and recognise when they are struggling.”

The study, “Emotional and cognitive effects of menopause and hormone replacement therapy,” was authored by Katharina Zuhlsdorff, Christelle Langley, Richard Bethlehem, Varun Warrier, Rafael Romero Garcia, and Barbara J. Sahakian.
