The neural path from genes to intelligence looks different depending on your age

2 February 2026 at 17:00

New research published in Scientific Reports provides evidence that the path from genetic predisposition to general intelligence travels through specific, frequency-dependent networks in the brain. The findings indicate that these neural pathways are not static but appear to shift significantly between early adulthood and older age.

Intelligence is a trait with a strong biological basis. Previous scientific inquiries have established that genetic factors account for approximately 50% of the differences in intelligence between individuals. Genome-wide association studies have identified hundreds of specific variations in the genetic code that correlate with cognitive ability.

These variations are often aggregated into a metric known as a polygenic score, which estimates an individual’s genetic propensity for a certain trait. Despite this knowledge, the specific biological mechanisms that translate a genetic sequence into the ability to reason, plan, and solve problems remain unclear.
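In practice, a polygenic score is a weighted sum: each variant's allele count is multiplied by its effect size from the GWAS summary statistics, and the products are added up across variants. The following Python sketch illustrates the idea with invented genotypes and effect sizes; it is not the study's actual scoring pipeline:

```python
import numpy as np

# Hypothetical data: allele counts (0, 1, or 2 copies of the effect allele)
# for 5 variants in 3 individuals, plus made-up GWAS effect sizes per variant.
genotypes = np.array([
    [0, 1, 2, 1, 0],
    [2, 2, 1, 0, 1],
    [1, 0, 0, 2, 2],
])
effect_sizes = np.array([0.02, -0.01, 0.03, 0.015, 0.005])

# The polygenic score is the allele counts weighted by the effect sizes.
polygenic_scores = genotypes @ effect_sizes

# Standardize so scores are comparable across the sample.
z_scores = (polygenic_scores - polygenic_scores.mean()) / polygenic_scores.std()
print(z_scores)
```

Real scores sum over hundreds of thousands of variants and typically adjust for linkage disequilibrium, but the arithmetic is the same weighted sum.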

Scientists have hypothesized that the brain’s functional connectivity acts as the intermediary between genes and behavior. Functional connectivity refers to how well different regions of the brain communicate with one another. While past studies using functional magnetic resonance imaging (fMRI) have attempted to map these connections, the results have been inconsistent.

fMRI is excellent at locating where brain activity occurs but is less precise at measuring when it occurs. The authors of the new study opted to use electroencephalography (EEG). This technology records the electrical activity of the brain with high temporal resolution, allowing researchers to observe the speed and rhythm of neural communication.

“We already know that intelligence is highly heritable, which is why we are especially interested in the role of the brain as a ‘neural pathway’ linking genetic variation to cognitive ability,” said study author Rebecca Engler of the Leibniz Research Centre for Working Environment and Human Factors (IfADo).

“The lack of integrative approaches combining genetics, brain network organization, and intelligence motivated us to take a closer look at resting-state EEG markers, with a particular focus on differences between young and older adults.”

“In a recent large-scale study (Metzen et al., 2024) using resting-state fMRI, we found no robust association between functional architecture of specific brain regions and intelligence. This motivated our shift toward resting-state EEG, which captures brain dynamics at much higher temporal resolution. EEG measures brain activity as oscillations across different frequencies, allowing us to study frequency-specific brain networks that may carry distinct information relevant to cognitive ability.”

For their study, the researchers recruited a representative sample of 434 healthy adults from the Dortmund Vital Study. The participants were categorized into two distinct age groups. The young adult group consisted of 199 individuals between the ages of 20 and 40. The older adult group included 235 individuals aged 40 to 70.

To measure intelligence, the research team administered a comprehensive battery of cognitive tests. These assessments covered a wide range of mental capabilities, including verbal memory, processing speed, attention span, working memory, and logical reasoning. The scores from these tests were combined to calculate a single factor of general intelligence, often denoted as g. This factor serves as a reliable summary of an individual’s overall cognitive performance.

Genetic data were obtained through blood samples. The researchers analyzed the DNA of each participant to compute a polygenic score for intelligence. This score was calculated based on summary statistics from previous large-scale genetic studies. It represents the cumulative effect of many small genetic variations that are statistically associated with higher cognitive function.

Brain activity was recorded while participants sat quietly with their eyes closed for two minutes. This “resting-state” EEG data allowed the researchers to analyze the intrinsic functional architecture of the brain.

The team employed a method known as graph theory to quantify the organization of the brain networks. In this framework, the brain is modeled as a collection of nodes (regions) and edges (connections).

The researchers calculated metrics such as “efficiency,” which measures how easily information travels across the network, and “clustering,” which measures how interconnected specific local neighborhoods of the brain are. These metrics were analyzed across different frequency bands, including delta, theta, alpha, and beta waves.
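These two graph measures can be illustrated on a toy network using the networkx library. The edge list below is invented for illustration; the study derived its networks from EEG connectivity, not from a hand-written graph:

```python
import networkx as nx

# Toy "brain network": nodes stand in for regions, edges for functional
# connections (illustrative only).
G = nx.Graph([(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)])

# Global efficiency: the average inverse shortest-path length across all
# pairs of nodes -- how easily information can travel through the network.
efficiency = nx.global_efficiency(G)

# Average clustering: how interconnected each node's neighbours are,
# capturing local "neighbourhood" structure.
clustering = nx.average_clustering(G)

print(f"global efficiency = {efficiency:.3f}, clustering = {clustering:.3f}")
```

In the study, metrics like these were computed separately for each frequency band, yielding one network description per band.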

The study employed complex statistical modeling to test for mediation effects. A mediation analysis determines whether a third variable—in this case, brain connectivity—explains the relationship between an independent variable (genetics) and a dependent variable (intelligence). The researchers looked for instances where the polygenic score predicted a specific brain network property, which in turn predicted the intelligence score.
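The logic of a mediation analysis can be sketched with simulated data: estimate the gene-to-brain path (a) and the brain-to-intelligence path controlling for genes (b), and take their product as the indirect, mediated effect. This is a deliberately simplified illustration, not the study's actual statistical model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated data in which brain connectivity fully mediates the
# gene-intelligence link (all variable names and effect sizes are invented).
genes = rng.normal(size=n)                    # polygenic score
brain = 0.5 * genes + rng.normal(size=n)      # connectivity metric
iq    = 0.4 * brain + rng.normal(size=n)      # general intelligence g

def slopes(y, predictors):
    """OLS coefficients of y on the given predictors (intercept included)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = slopes(brain, [genes])[0]          # path a: genes -> brain
b = slopes(iq, [brain, genes])[0]      # path b: brain -> iq, given genes
indirect = a * b                       # mediated ("indirect") effect
total = slopes(iq, [genes])[0]         # total effect of genes on iq

print(f"indirect effect = {indirect:.3f} of total effect {total:.3f}")
```

Here the indirect effect should recover roughly 0.5 × 0.4 = 0.2, matching the total effect because the simulation contains no direct path from genes to intelligence.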

The results showed that global measures of brain efficiency did not mediate the link between genetics and intelligence. This suggests that simply having a “more efficient” brain overall is not the primary mechanism by which genes influence cognition.

In other words, “there is no single brain region responsible for intelligence,” Engler told PsyPost. “Instead, cognitive ability relies on efficient and dynamic communication across a broad network of regions throughout the brain, and this network organization changes as we age.”

The specific neural pathways identified varied substantially by age. For young adults, the connection between genetics and intelligence was mediated by brain activity in the beta and theta frequency bands. These effects were predominantly located in the frontal and parietal regions of the brain.

The frontal and parietal lobes are areas traditionally associated with executive functions, such as decision-making, working memory, and attention. This aligns with prominent theories that attribute intelligence to the efficient integration of information between these higher-order brain regions.

But for older adults, the mediating effects were found primarily in the low alpha and theta frequency bands. Furthermore, the specific brain regions involved shifted away from the frontal cortex. The analysis identified the superior parietal lobule and the primary visual cortex as key mediators. These areas are largely responsible for sensory processing and integration.

This shift suggests that the neural architecture supporting intelligence evolves as people age. In younger adulthood, cognitive ability appears to rely heavily on the rapid, high-frequency communication of executive control networks in the front of the brain. As the brain ages, it may undergo a process of reorganization.

The reliance on posterior brain regions and slower frequency bands in older adults implies a strategy that prioritizes the integration of sensory information. This finding is consistent with the concept of neural dedifferentiation, where the aging brain recruits broader, less specialized networks to maintain performance.

The researchers also found that certain brain areas, such as the primary visual cortex, played a consistent role across both groups, though the direction of the effect varied. In both young and older adults, higher nodal efficiency in the visual cortex was associated with higher intelligence.

However, a higher genetic predisposition for intelligence was associated with lower efficiency in this region. This complex relationship highlights that the genetic influence on the brain is not always a straightforward enhancement of connectivity.

“When comparing the two age groups, we were surprised that the brain regions consistently mediating the link between genetic variation and intelligence are primarily involved in sensory processing and integration,” Engler explained. “One might expect such stable neural anchors to be associated with higher-order executive functions like reasoning or planning, typically located in frontal networks. Instead, our results suggest that sensory and associative regions play a more central role in maintaining cognitive ability than is typically emphasized in dominant models of intelligence.”

As with all research, there are some limitations to note. The study utilized a cross-sectional design, meaning it compared two different groups of people at a single point in time. It did not follow the same individuals as they aged.

Consequently, it is not possible to definitively prove that the observed differences are caused by the aging process itself rather than generational differences. Longitudinal studies that track participants over decades would be necessary to confirm the shift in neural strategies.

The study focused exclusively on resting-state EEG. While intrinsic brain activity provides a baseline of functional organization, it does not capture the brain’s dynamic response to active problem-solving.

It is possible that different network patterns would emerge if participants were recorded while performing the cognitive tests. Future research could investigate task-based connectivity to see if it offers a stronger explanatory link between genetics and performance.

“A crucial next step would be to replicate our findings in independent samples to ensure their robustness and generalizability,” Engler said. “Furthermore, it would be interesting to investigate age-related changes in functional network organization from a longitudinal rather than from a cross-sectional perspective. A further long-term goal is to investigate the triad of genetic variants, the brain’s functional connectivity, and intelligence by analyzing task-based EEG data rather than resting-state EEG data.”

The study, “Electrophysiological resting-state signatures link polygenic scores to general intelligence,” was authored by Rebecca Engler, Christina Stammen, Stefan Arnau, Javier Schneider Penate, Dorothea Metzen, Jan Digutsch, Patrick D. Gajewski, Stephan Getzmann, Christoph Fraenz, Jörg Reinders, Manuel C. Voelkle, Fabian Streit, Sebastian Ocklenburg, Daniel Schneider, Michael Burke, Jan G. Hengstler, Carsten Watzl, Michael A. Nitsche, Robert Kumsta, Edmund Wascher, and Erhan Genç.

Speaking multiple languages appears to keep the brain younger for longer

People are living longer than ever around the world. Longer lives bring new opportunities, but they also introduce challenges, especially the risk of age-related decline.

Alongside physical changes such as reduced strength or slower movement, many older adults struggle with memory, attention and everyday tasks. Researchers have spent years trying to understand why some people stay mentally sharp while others deteriorate more quickly. One idea attracting growing interest is multilingualism, the ability to speak more than one language.

When someone knows two or more languages, all those languages remain active in the brain. Each time a multilingual person wants to speak, the brain must select the right language while keeping others from interfering. This constant mental exercise acts a bit like daily “brain training”.

Choosing one language, suppressing the others and switching between them strengthens brain networks involved in attention and cognitive control. Over a lifetime, researchers believe this steady mental workout may help protect the brain as it ages.

Studies comparing bilinguals and monolinguals have suggested that people who use more than one language might maintain better cognitive skills in later life. However, results across studies have been inconsistent. Some reported clear advantages for bilinguals, while others found little or no difference.

A new, large-scale study now offers stronger evidence and an important insight: speaking one extra language appears helpful, but speaking several seems even better.

This study analysed data from more than 86,000 healthy adults aged 51 to 90 across 27 European countries. Researchers used a machine-learning approach, meaning they trained a computer model to detect patterns across thousands of datapoints. The model estimated how old someone appeared based on daily functioning, memory, education level, movement and health conditions such as heart disease or hearing loss.

Comparing this “predicted age” with a person’s actual age created what the researchers called a “biobehavioural age gap”. This is the difference between how old someone is and how old they seem based on their physical and cognitive profile. A negative gap meant someone appeared younger than their biological age. A positive gap meant they appeared older.
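The core computation behind the age gap can be sketched with simulated data: fit a model predicting age from behavioural and health features, then subtract chronological age from the prediction. The simple linear model below stands in for the study's more elaborate machine-learning pipeline, and all features are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Simulated participants: chronological age plus two age-correlated features
# (stand-ins for, e.g., a memory score and a physical-function measure).
age = rng.uniform(51, 90, n)
features = np.column_stack([
    age + rng.normal(0, 5, n),
    age + rng.normal(0, 8, n),
])

# Fit a linear model predicting age from the features.
X = np.column_stack([np.ones(n), features])
coef, *_ = np.linalg.lstsq(X, age, rcond=None)
predicted_age = X @ coef

# Biobehavioural age gap: predicted minus chronological age.
# Negative gap -> "younger" profile; positive gap -> "older" profile.
age_gap = predicted_age - age
print(f"mean gap = {age_gap.mean():.2f} years, spread = {age_gap.std():.2f}")
```

By construction the gap averages to about zero across the sample; the informative quantity is each individual's deviation, which the researchers then related to multilingualism.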

The team then looked at how multilingual each country was by examining the percentage of people who spoke no additional languages, one, two, three or more. Countries with high multilingual exposure included places such as Luxembourg, the Netherlands, Finland and Malta, where speaking multiple languages is common. Countries with low multilingualism included the UK, Hungary and Romania.

People living in countries where multilingualism is common had a lower chance of showing signs of accelerated ageing. Monolingual speakers, by contrast, were more likely to appear biologically older than their actual age. Just one additional language made a meaningful difference. Several languages created an even stronger effect, suggesting a dose-dependent relationship in which each extra language provided an additional layer of protection.

These patterns were strongest among people in their late 70s and 80s. Knowing two or more languages did not simply help; it offered a noticeably stronger shield against age-related decline. Older multilingual adults seemed to carry a kind of built-in resilience that their monolingual peers lacked.

Could this simply reflect differences in wealth, education or political stability between countries? The researchers tested this by adjusting for dozens of national factors including air quality, migration rates, gender inequality and political climate. Even after these adjustments, the protective effect of multilingualism remained steady, suggesting that language experience itself contributes something unique.

Although the study did not directly examine brain mechanisms, many scientists argue that the mental effort required to manage more than one language helps explain the findings. Research shows that juggling languages engages the brain’s executive control system, the set of processes responsible for attention, inhibition and switching tasks.

Switching between languages, preventing the wrong word from coming out, remembering different vocabularies and choosing the right expression all place steady demands on these systems. Work in our lab has shown that people who use two languages throughout their lives tend to have larger hippocampal volume.

This means the hippocampus, a key brain region for forming memories, is physically bigger. A larger or more structurally robust hippocampus is generally linked to better memory and greater resistance to age-related shrinkage or neurodegenerative diseases such as Alzheimer’s.

This new research stands out for its scale, its long-term perspective and its broad approach to defining ageing. By combining biological, behavioural and environmental information, it reveals a consistent pattern: multilingualism is closely linked to healthier ageing. While it is not a magic shield, it may be one of the everyday experiences that help the brain stay adaptable, resilient and younger for longer.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Novel essential oil blend may enhance memory and alertness

30 January 2026 at 19:00

A recent study provides evidence that inhaling a specific blend of essential oils may improve cognitive performance in healthy adults. The research indicates that while this aromatic blend increases brain metabolism during mental tasks, these physiological changes do not directly explain the observed boost in memory and attention. These findings were published in the scientific journal Human Psychopharmacology: Clinical and Experimental.

The use of essential oils for psychological well-being is a practice with a long history, yet scientific validation for these effects varies across different substances. Previous investigations have identified that the aromas of single oils, such as rosemary and sage, appear to support memory retention and alertness. However, the practice of aromatherapy frequently relies on the blending of multiple oils to create potential synergistic effects.

Despite the popularity of these blends, the efficacy of combining oils has received limited empirical attention compared to single extracts. The creators of the “Genius” blend formulated it based on the purported cognitive benefits of ingredients like frankincense, cardamom, and patchouli. The researchers aimed to determine if this complex mixture could outperform a single oil known for its positive effects.

“I have been interested in natural interventions to deliver cognitive enhancement for 30 years. For around 20 years, I have been looking at the effects of the aromas of essential oils on aspects of human behaviour, including cognition, mood and stress,” said study author Mark Moss, a professor and member of the Brain Performance and Nutrition Research Centre at Northumbria University.

“Essential oils and aromas have been used in society since before the beginning of written records but the scientific investigation of their effects is lacking. I have an interest in conducting high quality research that can deliver reliable and valid findings in this area.”

The scientific team also sought to move beyond subjective reports and behavioral scores. A primary goal was to explore the biological mechanisms that might underpin these effects. Specifically, they investigated whether the inhalation of these aromas influences brain metabolism by measuring blood oxygenation levels during the performance of demanding mental tasks.

The study involved ninety healthy adult participants who were pseudo-randomly assigned to one of three experimental conditions. To ensure a balanced sample, the groups were matched for gender and age. One group was exposed to the aroma of the Genius essential oil blend, which includes patchouli, neroli, grapefruit, cardamom, frankincense, spikenard, rosemary, and lemongrass.

A second group was exposed to the aroma of sage essential oil to serve as a positive control, given its established reputation for cognitive enhancement. A third group sat in an environment with no added aroma to function as a standard control comparison. The study utilized a double-blind design where neither the researchers administering the tests nor the participants knew which aroma condition was active.

Participants completed a battery of computerized cognitive assessments designed to measure memory, attention, and computational skills. These tasks included word recall, where participants had to remember a list of words, and serial subtraction, which required them to repeatedly subtract specific numbers from a starting figure. Other tasks involved sequence memory challenges known as Corsi blocks.

While performing these mental exercises, participants wore a headband equipped with near-infrared spectroscopy technology. This non-invasive device projected light through the skull to measure changes in oxygenated and deoxygenated hemoglobin in the prefrontal cortex. This provided the researchers with real-time data on brain metabolism and oxygen utilization.

Following the completion of the cognitive battery, the participants rated their current mood states. They specifically evaluated their levels of alertness and mental fatigue on visual analogue scales. This allowed the researchers to correlate subjective feelings of well-being with objective performance metrics.

The data analysis revealed significant improvements in performance for the group exposed to the Genius blend compared to the no-aroma control. These improvements were particularly notable in tasks requiring memory and executive function. For instance, participants in the blend condition performed better on word recall and numeric working memory tasks.

The blend also demonstrated superior effects compared to the sage essential oil condition in several performance metrics. This provides some evidence supporting the theory of synergy, where the combined effect of multiple oils may exceed the impact of a single component. The magnitude of the improvement was considered statistically significant.

Regarding subjective experience, participants in the Genius condition reported feeling significantly more alert by the end of the testing session. Perhaps most notably, they reported feeling significantly less fatigued than those in the control group. This buffering against mental exhaustion suggests that the aroma may help maintain stamina during cognitive exertion.

The physiological data gathered via the spectroscopy headbands showed that both aroma conditions led to increased oxygen extraction in the brain during tasks. The level of deoxygenated hemoglobin was significantly higher in the Genius aroma condition compared to the control. This indicates that the brain was extracting and utilizing more oxygen from the blood while the participants were inhaling the blend.

Despite these clear physiological changes, the researchers found no statistical correlation between the increased brain metabolism and the improved cognitive scores. The participants who showed the greatest increase in oxygen utilization were not necessarily the ones who performed best on the tests. This disconnect suggests that while the aroma increases brain energy usage, this mechanism does not directly account for the better test results.

The lack of correlation implies that other mechanisms may be driving the cognitive improvements. One possibility is a pharmacological effect, where chemical compounds from the oils are absorbed into the bloodstream through the lungs and cross into the brain. Another potential pathway is direct stimulation of the olfactory bulb, which has neural connections to brain areas involved in memory and emotion.

“The overall message is that aromas of essential oils can provide cheap, safe and accessible options for personal benefit,” Moss told PsyPost. “Inhalation of the ambient aroma of the essential oils we employed here (pure sage and a blend of oils) can positively affect cognition and mood, although only to a relatively small degree.

“Interestingly the reasons why these effects occur are not well understood at this time, and this study looked at one particular possibility. The brain uses a lot of energy when we apply it to completing tests of memory and similar. It is possible that breathing aromas could help the brain in delivering more energy to the tasks in hand. Although we found that increased energy production appeared to take place this was not related to levels of performance on the tasks. Other possible explanations are still to be tested in depth.”

The study, like all research, includes some caveats. The method of delivering the aroma involved a diffuser in a testing cubicle, which means the exact dose inhaled by each participant could vary based on their breathing patterns. This lack of standardization makes it difficult to establish precise dose-response relationships.

Additionally, the study focused on acute effects observed during a single session. It remains unknown whether these benefits would persist with long-term use or if users might develop a tolerance to the aromas.

“Next steps include finding good ways to standardise aroma delivery,” Moss explained. “Currently, it is all rather vague as people breathe at different rates and to different depths. It is hard to know exactly how much aroma is being delivered and this would be very useful to enable dose-response relationships to be identified. I am generally interested in continuing to apply scientific method to investigate effects that often exist as received wisdom.”

The researchers add that while essential oils offer a safe and accessible option for personal benefit, they function best as a complementary aid rather than a standalone medical treatment.

“The effects of aromas are generally relatively small, but beneficial. Don’t over interpret the findings of aroma research,” Moss said. “Aromas are not a panacea. They can be beneficial, generally within a framework of general healthy living. They can be beneficial in healthcare as part of an integrated healthcare system.”

The study, “Aroma of Genius Essential Oil Blend Significantly Enhances Cognitive Performance and Brain Metabolism in Healthy Adults,” was authored by Mark Moss, Jake Howarth, and Holly Moss.

New maps of brain activity challenge century-old anatomical boundaries

30 January 2026 at 01:00

New research challenges the century-old practice of mapping the brain based on how tissue looks under a microscope. By analyzing electrical signals from thousands of neurons in mice, scientists discovered that the brain’s command center organizes itself by information flow rather than physical structure. These findings appear in the journal Nature Neuroscience.

The prefrontal cortex acts as the brain’s executive hub. It manages complex processes such as planning, decision-making, and reasoning. Historically, neuroscientists defined the boundaries of this region by studying cytoarchitecture. This method involves staining brain tissue and observing the arrangement of cells. The assumption has been that physical differences in cell layout correspond to distinct functional jobs.

However, the connection between these static maps and the dynamic electrical firing of neurons remains unproven. A research team led by Marie Carlén at the Karolinska Institutet in Sweden sought to test this long-standing assumption. Pierre Le Merre and Katharina Heining served as the lead authors on the paper. They aimed to create a functional map based on what neurons actually do rather than just where they sit.

To achieve this, the team performed an extensive analysis of single-neuron activity. They focused on the mouse brain, which serves as a model for mammalian neural structure. The researchers implanted high-density probes known as Neuropixels into the brains of awake mice. These advanced sensors allowed them to record the electrical output of more than 24,000 individual neurons.

The study included recordings from the prefrontal cortex as well as sensory and motor areas. The investigators first analyzed spontaneous activity. This refers to the electrical firing that occurs when the animal is resting and not performing a specific task. Spontaneous activity offers a window into the intrinsic properties of a neuron and its local network.

The team needed precise ways to describe this activity. Simply counting the number of electrical spikes per second was insufficient. They introduced three specific mathematical metrics to characterize the firing patterns. The first metric was the firing rate, or how often a neuron sends a signal.

The second metric was “burstiness.” This describes the irregularity of the intervals between spikes. A neuron with high burstiness fires in rapid clusters followed by silence. A neuron with low burstiness fires with a steady, metronomic rhythm.

The third metric was “memory.” This measures the sequential structure of the firing. It asks whether the length of one interval between spikes predicts the length of the next one. Taken together, these three variables provided a unique “fingerprint” for every recorded neuron.
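All three measures can be computed directly from a list of spike times. The sketch below uses a standard burstiness coefficient of the inter-spike intervals and a lag-one interval correlation for memory; the simulated spike trains and exact formulas are illustrative, not the authors' code:

```python
import numpy as np

def fingerprint(spike_times):
    """Rate, burstiness, and memory of a spike train.

    Burstiness here is the coefficient (sigma - mu) / (sigma + mu) of the
    inter-spike intervals: -1 for a metronome, ~0 for Poisson firing,
    +1 for extreme bursting. Memory is the lag-1 correlation between
    consecutive intervals. (Illustrative definitions, not the study's code.)
    """
    st = np.sort(spike_times)
    isi = np.diff(st)                              # inter-spike intervals
    rate = len(st) / (st[-1] - st[0])
    mu, sigma = isi.mean(), isi.std()
    burstiness = (sigma - mu) / (sigma + mu)
    memory = np.corrcoef(isi[:-1], isi[1:])[0, 1]
    return rate, burstiness, memory

rng = np.random.default_rng(2)

# A near-metronomic "prefrontal-like" neuron: regular intervals, small jitter.
regular = np.cumsum(0.1 + rng.normal(0, 0.005, 200))

# A bursty neuron: rapid clusters of spikes separated by long silences.
bursty = np.cumsum(np.where(rng.random(200) < 0.8, 0.02, 0.5))

for name, train in [("regular", regular), ("bursty", bursty)]:
    r, b, m = fingerprint(train)
    print(f"{name}: rate={r:.1f} Hz, burstiness={b:+.2f}, memory={m:+.2f}")
```

The regular train yields a strongly negative burstiness and the clustered train a positive one, reproducing the qualitative contrast the study relied on.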

The researchers used a machine learning technique called a Self-Organizing Map to sort these fingerprints. This algorithm grouped neurons with similar firing properties together. It allowed the scientists to visualize the landscape of neuronal activity without imposing human biases.
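A Self-Organizing Map projects high-dimensional points onto a two-dimensional grid while keeping similar points on nearby grid units. The minimal implementation below is a sketch of the general technique, not the study's pipeline, and it clusters invented neuron "fingerprints" of rate, burstiness, and memory:

```python
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr=0.5, sigma=2.0, seed=0):
    """Train a minimal Self-Organizing Map (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    # Grid coordinates of each unit, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose weights are closest to the sample.
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
        # Decay the learning rate and neighbourhood radius over time.
        frac = 1 - t / iters
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        g = np.exp(-d2 / (2 * (sigma * frac + 1e-3) ** 2))
        weights += (lr * frac) * g[:, None] * (x - weights)
    return weights

# Two invented populations of fingerprints (rate, burstiness, memory):
# low-rate regular neurons vs. high-rate bursty neurons.
rng = np.random.default_rng(1)
fingerprints = np.vstack([
    rng.normal([0.2, -0.8, 0.0], 0.05, (50, 3)),
    rng.normal([0.8,  0.5, 0.2], 0.05, (50, 3)),
])
weights = train_som(fingerprints)
```

After training, the two populations land on different regions of the grid, which is the property the researchers exploited to group neurons without predefined labels.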

The analysis revealed a distinct signature for the prefrontal cortex. Neurons in this area predominantly displayed low firing rates and highly regular rhythms. They did not fire in erratic bursts. This created a “low-rate, regular-firing” profile that distinguished the prefrontal cortex from other brain regions.

The team then projected these activity profiles back onto the physical map of the brain. They compared the boundaries of their activity-based clusters with the traditional cytoarchitectural borders. The two maps did not align.

Regions that looked different under a microscope often contained neurons with identical firing patterns. Conversely, regions that looked the same structurally often hosted different types of activity. The distinct functional modules of the prefrontal cortex ignored the classical boundaries drawn by anatomists.

Instead of anatomy, the activity patterns aligned with hierarchy. In neuroscience, hierarchy refers to the order of information processing. Sensory areas that receive raw data from the eyes or ears are at the bottom of the hierarchy. The prefrontal cortex, which integrates this data to make decisions, sits at the top.

The researchers correlated their activity maps with existing maps of brain connectivity. They found that regions higher up in the hierarchy consistently displayed the low-rate, regular-firing signature. This suggests that the way neurons fire is determined by their place in the network, not by the local architecture of the cells.

This finding aligns with theories about how the brain processes information. Sensory areas need to respond quickly to changing environments, requiring fast or bursty firing. High-level areas need to integrate information over time to maintain stable plans. A slow, regular rhythm is ideal for holding information in working memory without being easily distracted by noise.

The study then moved beyond resting activity to examine goal-directed behavior. The mice performed a task where they heard a tone or saw a visual stimulus. They had to turn a wheel to receive a water reward. This allowed the researchers to see how the functional map changed during active decision-making.

The team identified neurons that were “tuned” to specific aspects of the task. Some neurons responded only to the sound. Others fired specifically when the mouse made a choice to turn the wheel.

When they mapped these task-related neurons, they again found no relation to the traditional anatomical borders. The functional activity formed its own unique territories. One specific finding presented a paradox.

The researchers had established that the hallmark of the prefrontal cortex was slow, regular firing. However, the specific neurons that coded for “choice”—the act of making a decision—tended to have high firing rates. These “decider” neurons were chemically and spatially mixed in with the “integrator” neurons but behaved differently.

This implies a separation of duties within the same brain space. The general population of neurons maintains a slow, steady rhythm to provide a stable platform for cognition. Embedded within this stable network are specific, highly excitable neurons that trigger actions.

The overlap of these two populations suggests that connectivity shapes the landscape. The high-hierarchy network supports the regular firing. Within that network, specific inputs drive the high-rate choice neurons.

These results suggest that intrinsic connectivity is the primary organizing principle of the prefrontal cortex. The physical appearance of the tissue is a poor predictor of function. “Our findings challenge the traditional way of defining brain regions and have major implications for understanding brain organisation overall,” says Marie Carlén.

The study does have limitations. It relied on data from mice. While mouse and human brains share many features, the human prefrontal cortex is far more complex. Additionally, the recordings focused primarily on the deep layers of the cortex. These layers are responsible for sending output signals to other parts of the brain.

The activity in the surface layers, which receive input, might show different patterns. The study also looked at a limited set of behaviors. Future research will need to explore whether these maps hold true across different types of cognitive tasks.

Scientists must also validate these metrics in other species. If the pattern holds, it could provide a new roadmap for understanding brain disorders. Many psychiatric conditions involve dysfunction in the prefrontal cortex. Understanding the “normal” activity signature—slow and regular—could help identify what goes wrong in disease.

This data-driven approach offers a scalable framework. It moves neuroscience away from subjective visual descriptions toward objective mathematical categorization. It suggests that to understand the brain, we must look at the invisible traffic of electricity rather than just the visible roads of tissue.

The study, “A prefrontal cortex map based on single-neuron activity,” was authored by Pierre Le Merre, Katharina Heining, Marina Slashcheva, Felix Jung, Eleni Moysiadou, Nicolas Guyon, Ram Yahya, Hyunsoo Park, Fredrik Wernstal & Marie Carlén.

The psychology behind why we pay to avoid uncertainty

28 January 2026 at 19:00

Most people are familiar with the feeling of anxiety while waiting for the result of a medical test or a job interview. A new study suggests that this feeling of dread is far more powerful than the excitement of looking forward to a positive outcome.

The research indicates that the intensity of this dread drives people to avoid risks and demand immediate results. This behavior explains why impatience and risk-avoidance often appear together in the same individuals. The findings were published in the journal Cognitive Science.

Economists have traditionally viewed risk-taking and patience as separate character traits. A person could theoretically be a daring risk-taker while also being very patient. However, researchers have frequently observed that these two traits tend to correlate. People who are unwilling to take risks are often the same people who are unwilling to wait for a reward.

Chris Dawson of the University of Bath and Samuel G. B. Johnson of the University of Waterloo sought to explain this connection. They proposed that the link lies in the emotions people feel while waiting for an outcome. They distinguished between the feelings experienced after an event occurs and the feelings experienced beforehand.

When an event happens, we feel “reactive” emotions. We feel pleasure when we win money or displeasure when we lose it. But before the event occurs, we engage in “anticipatory” emotions. We might savor the thought of a win or dread the possibility of a loss.

The researchers hypothesized that these anticipatory emotions are not symmetrical. They suspected that the dread of a future loss is much stronger than the savoring of a future gain. If this is true, it would create a psychological cost to waiting.

To test this theory, Dawson and Johnson analyzed a massive dataset from the United Kingdom. They used the British Household Panel Survey and the Understanding Society study. These surveys followed approximately 14,000 individuals over a period spanning from 1991 to 2024.

The team needed a way to measure dread and savoring without asking participants directly. They developed a novel method using data on financial expectations and general well-being. The survey asked participants if they expected their financial situation to get better or worse over the next year.

The researchers then looked at how these expectations affected the participants’ current happiness. If a person expected to be worse off and their happiness dropped, that drop represented dread. If they expected to be better off and their happiness rose, that rise represented savoring.
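The inference described above can be illustrated with a toy calculation. The sketch below is not the authors' actual econometric model; it simply simulates a panel in which expecting a loss lowers current happiness six times as much as expecting a gain raises it (magnitudes made up to mirror the asymmetry reported in the study), then recovers the dread and savoring effects by regressing happiness on the two expectation dummies.

```python
import numpy as np

# Toy panel: baseline happiness is 50; expecting to be worse off
# subtracts 6 points (dread), expecting to be better off adds 1 point
# (savoring). These magnitudes are assumptions chosen to mirror the
# roughly 6:1 asymmetry the study reports.
rng = np.random.default_rng(0)
n = 3000
expect_worse = rng.integers(0, 2, n)
expect_better = np.where(expect_worse == 0, rng.integers(0, 2, n), 0)
happiness = 50 - 6 * expect_worse + 1 * expect_better + rng.normal(0, 1, n)

# Regress current happiness on the two expectation dummies.
X = np.column_stack([np.ones(n), expect_worse, expect_better])
coef, *_ = np.linalg.lstsq(X, happiness, rcond=None)
dread, savoring = -coef[1], coef[2]
print(round(dread / savoring, 1))  # ratio near 6
```

The recovered coefficients separate the two anticipatory emotions: the drop in happiness among people expecting to be worse off is the dread estimate, and the rise among those expecting to be better off is the savoring estimate.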

The analysis revealed a dramatic imbalance between these two emotional states. The negative impact of anticipating a loss was more than six times stronger than the positive impact of anticipating a gain. This suggests that the human brain weighs future pain much more heavily than future pleasure.

The researchers also measured “reactive” emotions using the same method. They looked at how participants felt after they actually experienced a financial loss or gain. As expected, losses hurt more than gains felt good.

However, the imbalance in reactive emotions was much smaller than the imbalance in anticipatory ones. Realized losses were about twice as impactful as realized gains, meaning the anticipatory asymmetry was roughly three times larger than the reactive one.

This finding implies that the waiting period itself is a major source of distress. The researchers describe this phenomenon as “dread aversion.” It is distinct from the more famous concept of loss aversion.

The study then connected these emotional patterns to economic preferences. The survey included questions about the participants’ willingness to take risks in general. It also measured their patience through a delayed gratification scale.

The results showed a strong correlation between high levels of dread and risk-avoidance. People who experienced intense dread were much less likely to take risks. This makes sense within the researchers’ framework.

Taking a gamble creates a situation where a negative outcome is possible. This possibility triggers dread. By avoiding the risk entirely, the individual removes the source of the dread.

The results also showed a strong connection between dread and impatience. People who felt high levels of dread were less willing to wait for rewards. This also aligns with the researchers’ model.

Waiting for an uncertain outcome prolongs the experience of dread. A person who hates waiting may simply be trying to shorten the time they spend feeling anxious. They choose immediate rewards to stop the emotional wheel from spinning.

The study found that savoring plays a much smaller role in decision-making. The pleasure of imagining a good outcome is generally weak. This may be because positive anticipation is often mixed with the fear that the good event might not happen.

The authors checked to see if these results were simply due to personality traits. For example, a person with high neuroticism might naturally be both anxious and risk-avoidant. The researchers controlled for the “Big Five” personality traits in their analysis.

Even after accounting for neuroticism and other traits, the effect of dread remained. This suggests that the asymmetry of anticipatory emotions is a distinct psychological mechanism. It is not just a symptom of being a generally anxious person.

This research offers a unified explanation for economic behavior. It suggests that risk preferences and time preferences are not independent. They are both shaped by the desire to manage anticipatory emotions.

The authors use the analogy of a roulette wheel to explain their findings. When a person bets on roulette, they are not just weighing the odds of winning or losing. They are also deciding if they can endure the feeling of watching the wheel spin.

If the dread of losing is overwhelming, the person will not bet at all. If they do bet, they will want the wheel to stop as quickly as possible. The act of betting creates a stream of emotional discomfort that lasts until the result is known.

There are some limitations to this study. It relies on observational data rather than a controlled experiment. The researchers inferred emotions from survey responses rather than measuring them physiologically.

Additionally, the study assumes that changes in well-being are caused by financial expectations. It is possible that other unmeasured factors influenced both happiness and expectations. However, the use of longitudinal data helps to account for stable individual differences.

The findings have implications for various sectors. In healthcare, patients might avoid screening tests because the dread of a bad result outweighs the benefit of knowing. Reducing the waiting time for results could encourage more people to get tested.

In finance, investors might choose low-return savings accounts over stocks to avoid the anxiety of market fluctuations. This “dread premium” could explain why safe assets are often overvalued. Investors pay a price for emotional tranquility.

Future research could investigate how to modify these anticipatory emotions. If people can learn to reduce their dread, they might make better long-term decisions. Techniques from cognitive behavioral therapy could potentially help investors and patients manage their anxiety.

The study provides a new lens through which to view human irrationality. We often make choices that look bad on paper because we are optimizing for our current emotional state. We are willing to pay a high price to avoid the shadow of the future.

The study, “Asymmetric Anticipatory Emotions and Economic Preferences: Dread, Savoring, Risk, and Time,” was authored by Chris Dawson and Samuel G. B. Johnson.
