
Higher diet quality is associated with greater cognitive reserve in midlife

12 December 2025 at 15:00

A new study published in Current Developments in Nutrition provides evidence that individuals who adhere to higher quality diets, particularly those rich in healthy plant-based foods, tend to possess greater cognitive reserve in midlife. This concept refers to the brain’s resilience against aging and disease, and the findings suggest that what people eat throughout their lives may play a distinct role in building this mental buffer.

As humans age, the brain undergoes natural structural changes that can lead to difficulties with memory, thinking, and behavior. Medical professionals have observed that some individuals with physical signs of brain disease, such as the pathology associated with Alzheimer’s, do not exhibit the expected cognitive symptoms. This resilience is attributed to cognitive reserve, a property of the brain that allows it to cope with or compensate for damage.

While factors such as education level and occupational complexity are known to contribute to this buffer, the specific influence of dietary habits has been less clear. The scientific community has sought to determine if nutrition can serve as a modifiable factor to help individuals maintain cognitive function into older age.

“It has been established that cognitive reserve is largely influenced by factors like genetics, education, occupation, and certain lifestyle behaviors like physical activity and social engagement,” explained study author Kelly C. Cara, a postdoctoral fellow at the American Cancer Society.

“Few studies have examined the potential impact of diet on cognitive reserve, but specific dietary patterns (i.e., all the foods and beverages a person consumes), foods, and food components have been associated with other cognitive outcomes including executive function and cognitive decline. With this study, we wanted to determine whether certain dietary patterns were associated with cognitive reserve and to what degree diet quality may influence cognitive reserve.”

For their study, the researchers analyzed data from the 1946 British Birth Cohort. This is a long-running project that has followed thousands of people born in Great Britain during a single week in March 1946. The final analysis for this specific study included 2,514 participants. The researchers utilized dietary data collected at four different points in the participants’ lives: at age 4, age 36, age 43, and age 53. By averaging these records, the team created a cumulative picture of each person’s typical eating habits over five decades.

The researchers assessed these dietary habits using two main frameworks. The first was the Healthy Eating Index-2020. This index measures how closely a person’s diet aligns with the Dietary Guidelines for Americans. It assigns higher scores for the consumption of fruits, vegetables, whole grains, dairy, and proteins, while lowering scores for high intakes of refined grains, sodium, and added sugars.

The second framework involved three variations of a Plant-Based Diet Index. These indexes scored participants based on their intake of plant foods versus animal foods. The overall Plant-Based Diet Index gave positive scores for all plant foods and reverse scores for animal foods.

The researchers also calculated a Healthful Plant-Based Diet Index, which specifically rewarded the intake of nutritious plant foods like whole grains, fruits, vegetables, nuts, legumes, vegetable oils, tea, and coffee. Finally, they calculated an Unhealthful Plant-Based Diet Index. This measure assigned higher scores to less healthy plant-derived options, such as fruit juices, refined grains, potatoes, sugar-sweetened beverages, and sweets.
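For readers curious how such indexes are typically constructed, the sketch below shows one common approach: rank each participant's intake of a food group into quintiles of the cohort, award points for favored groups, and reverse-score the rest. The food groups and scoring rules here are simplified assumptions for illustration, not the study's exact scoring protocol.

```python
# Illustrative quintile-style diet index scoring; food groups and rules are
# simplified assumptions, not the study's exact scoring protocol.
import math
from typing import Dict, List

HEALTHY_PLANT = ["whole_grains", "fruits", "vegetables", "nuts", "legumes"]
UNHEALTHY_PLANT = ["fruit_juice", "refined_grains", "potatoes", "sweets"]
ANIMAL = ["meat", "dairy", "fish", "eggs"]

def quintile_score(value: float, cohort_values: List[float], reverse: bool = False) -> int:
    """Rank an intake into cohort quintiles (1-5); reverse=True flips the points."""
    rank = sum(v <= value for v in cohort_values) / len(cohort_values)
    quintile = max(1, math.ceil(rank * 5))
    return 6 - quintile if reverse else quintile

def healthful_pbdi(intakes: Dict[str, float], cohort: Dict[str, List[float]]) -> int:
    """Healthful Plant-Based Diet Index sketch: reward healthy plant foods,
    reverse-score unhealthy plant foods and all animal foods."""
    total = 0
    for group in HEALTHY_PLANT:
        total += quintile_score(intakes[group], cohort[group])
    for group in UNHEALTHY_PLANT + ANIMAL:
        total += quintile_score(intakes[group], cohort[group], reverse=True)
    return total
```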

To measure cognitive reserve, the researchers administered the National Adult Reading Test to the participants when they were 53 years old. This assessment asks individuals to read aloud a list of 50 words with irregular pronunciations. The test is designed to measure “crystallized” cognitive ability, which relies on knowledge and experience acquired over time.

Unlike “fluid” abilities such as processing speed or working memory, crystallized abilities tend to remain stable even as people age or experience early stages of neurodegeneration. This stability makes the reading test a reliable proxy for estimating a person’s accumulated cognitive reserve.

The analysis revealed that participants with higher scores on the Healthy Eating Index and the Healthful Plant-Based Diet Index tended to have higher reading test scores at age 53. The data suggested a dose-response relationship, meaning that as diet quality improved, cognitive reserve scores generally increased.

Participants in the top twenty percent of adherence to the Healthy Eating Index showed the strongest association with better cognitive reserve. This relationship persisted even after the researchers used statistical models to adjust for potential confounding factors, including childhood socioeconomic status, adult education levels, and physical activity.

“This was one of the first studies looking at the relationship between dietary intake and cognitive reserve, and the findings show that diet is worth exploring further as a potential influencer of cognitive reserve,” Cara told PsyPost.

On the other hand, the researchers found an inverse relationship regarding the Unhealthful Plant-Based Diet Index. Participants who consumed the highest amounts of refined grains, sugary drinks, and sweets generally had lower cognitive reserve scores. This distinction highlights that the source and quality of plant-based foods are significant. The findings indicate that simply reducing animal products is not sufficient for cognitive benefits if the diet consists largely of processed plant foods.

The researchers also examined how much variability in cognitive reserve could be explained by these dietary patterns. The single strongest predictor of cognitive reserve at age 53 was the individual’s childhood cognitive ability, measured at age 8. This early-life factor accounted for over 40 percent of the variance in the adult scores.

However, the Healthy Eating Index scores still uniquely explained about 2.84 percent of the variation. While this number may appear small, the authors noted that when diet was combined with other lifestyle factors like smoking and exercise, the collective contribution to cognitive reserve was roughly 5 percent. This effect size is comparable to the cognitive advantage associated with obtaining a higher education degree.
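As an illustration of what "uniquely explained" means here, the sketch below computes an incremental R-squared by comparing a regression model with and without a diet variable. The data and coefficients are simulated for the example and are not taken from the study.

```python
# Minimal sketch of incremental (unique) variance explained by a predictor.
# Variable names, coefficients, and data are illustrative, not the study's model.
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 from an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 2514  # size of the final analytic sample reported above
childhood_cognition = rng.normal(size=n)
diet_quality = rng.normal(size=n)
# Simulated outcome: childhood cognition dominates, diet adds a small unique share.
reading_score = 0.65 * childhood_cognition + 0.17 * diet_quality + rng.normal(size=n)

base = r_squared(childhood_cognition.reshape(-1, 1), reading_score)
full = r_squared(np.column_stack([childhood_cognition, diet_quality]), reading_score)
print(f"Base model R^2: {base:.3f}")
print(f"Unique variance added by diet: {full - base:.3%}")
```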

“People in our study with healthier dietary patterns generally showed higher levels of cognitive reserve while those with less healthy dietary patterns generally showed lower levels of cognitive reserve,” Cara explained. “We do not yet know if diet caused these differences in cognitive reserve or if the differences were due to some other factor(s). Our study findings did suggest that diet plays at least a small role in individuals’ cognitive reserve levels.”

It is worth noting that the Healthy Eating Index showed a stronger association with cognitive reserve than the plant-based indexes. The authors suggest this may be due to how the indexes treat certain foods. The Healthy Eating Index rewards the consumption of fish and seafood, which are rich in omega-3 fatty acids known to support brain health. In contrast, the plant-based indexes penalize all animal products, including fish.

Additionally, the plant-based indexes categorized all potatoes and fruit juices as unhealthful. The Healthy Eating Index allows for these items to count toward total vegetable and fruit intake in moderation. This nuance in scoring may explain why the general healthy eating score served as a better predictor of cognitive outcomes.

As with all research, there are some caveats to consider. The measurement of cognitive reserve was cross-sectional, meaning it looked at the outcome at a single point in time rather than tracking the development of reserve over decades. It is not possible to definitively state that the diet caused the higher test scores, as other unmeasured factors could play a role. For instance, while the study controlled for childhood cognition, it is difficult to completely rule out the possibility that people with higher cognitive abilities simply choose healthier diets.

“To date, very few studies have examined diet and cognitive reserve, so our work started with an investigation of the relationship between diet and cognitive reserve only at a single point in time,” Cara said. “While we can’t draw any strong conclusions from the findings, we believe our study suggests that diet may be one of the factors that influence cognitive reserve.”

“Future studies that look at diet and the development of cognitive reserve over time will help us better understand if dietary patterns or any specific aspect of diet can improve or worsen cognitive reserve. I hope to apply different statistical approaches to dietary and cognitive data collected across several decades to get at how these two factors relate to each other over a lifetime.”

The study, “Associations Between Healthy and Plant-Based Dietary Patterns and Cognitive Reserve: A Cross-Sectional Analysis of the 1946 British Birth Cohort,” was authored by Kelly C. Cara, Tammy M. Scott, Paul F. Jacques, and Mei Chung.

New review challenges the idea that highly intelligent people are hyper-empathic

12 December 2025 at 03:00

A new scientific review challenges the popular assumption that highly intelligent people possess a naturally heightened capacity for feeling the emotions of others. The analysis suggests that individuals with high intellectual potential often utilize a distinct form of empathy that relies heavily on cognitive processing rather than automatic emotional reactions. Published in the journal Intelligence, the paper proposes that these individuals may intellectualize feelings to maintain composure in intense situations.

The research team set out to clarify the relationship between high intelligence and socio-emotional skills. General society often views people with high intellectual potential as hypersensitive or “hyper-empathic.” This stereotype suggests that a high intelligence quotient, or IQ, comes packaged with an innate ability to deeply feel the pain and joy of those around them.

This belief has historical roots in psychological theories that linked intellectual giftedness with emotional overexcitability. The researchers wanted to determine if this reputation holds up against current neuroscientific and psychological evidence.

The review was conducted by Nathalie Lavenne-Collot, Pascale Planche, and Laurence Vaivre-Douret. They represent institutions including the Université Paris Cité and INSERM in France. The authors sought to move beyond simple generalizations. They aimed to understand how high intelligence interacts with the specific brain mechanisms that govern how humans connect with one another.

To achieve this, the investigators performed a systematic review of existing literature. They searched major scientific databases for studies linking high intellectual potential with various components of empathy. The team did not simply look for a “yes” or “no” regarding whether smart people are empathetic. Instead, they broke empathy down into its constituent parts to see how each functioned in this population. They examined emotional detection, motivation, regulation, and cognitive understanding.

A primary distinction made in the review is the difference between emotional empathy and cognitive empathy. Emotional empathy is the automatic, visceral reaction to another person’s state. It is the phenomenon of flinching when someone else gets hurt or tearing up when seeing a crying face. The review found that individuals with high intellectual potential do not necessarily exhibit higher levels of this automatic emotional contagion. Their immediate physical resonance with the feelings of others appears to be average compared to the general population.

However, the findings regarding cognitive empathy were quite different. Cognitive empathy involves the intellectual ability to understand and identify what another person is thinking or feeling. The researchers found that highly intelligent individuals often excel in this area. They possess advanced capabilities in “Theory of Mind,” which is the psychological term for understanding that others have beliefs and desires different from one’s own. Their strong verbal and reasoning skills allow them to decode social situations with high precision.

The reviewers detailed how these individuals process emotional data. While they may not feel a rush of emotion, they are often superior at emotion recognition. They can identify subtle changes in facial expressions, vocal tones, and body language faster and more accurately than average. This ability likely stems from their general cognitive speed and heightened attention to detail. The brain networks responsible for processing visual and auditory information are highly efficient in this population.

A central finding of the article involves the regulation of emotions. The authors describe a mechanism where cognitive control overrides emotional reactivity. Individuals with high intellectual potential typically possess strong executive functions. This includes inhibitory control, which is the ability to suppress impulsive responses. The review suggests that these individuals often use this strength to dampen their own emotional reactions. When they encounter a charged situation, they may unconsciously inhibit their feelings to analyze the event objectively.

This creates a specific empathic profile characterized by a dominance of cognitive empathy over emotional empathy. The person understands the situation perfectly but remains affectively detached. The authors note that this “intellectualization” of empathy can be an adaptive strategy.

It allows the individual to function effectively in high-stress environments where getting swept up in emotion would be counterproductive. However, this imbalance can also create social friction. It may lead others to perceive them as cold or distant, even when they are fully engaged in understanding the problem.

The study also explored the motivational aspects of empathy. The researchers investigated what drives these individuals to engage in prosocial behavior. They found that for this population, empathy is often linked to a sensitivity to justice. Their motivation to help often stems from abstract moral reasoning rather than a personal emotional connection. They may be deeply disturbed by a violation of fairness or an ethical breach. This sense of justice can be intense. Yet, it is frequently directed toward systemic issues or principles rather than specific individuals.

The authors discussed the developmental trajectory of these traits. They highlighted the concept of developmental asynchrony. This occurs when a child’s cognitive abilities develop much faster than their emotional coping mechanisms. A highly intelligent child might cognitively understand complex adult emotions but lack the regulatory tools to manage them. This gap can lead to the “intellectualization” strategy observed in adults. The child learns to rely on their strong thinking brain to manage the confusing signals from their developing emotional brain.

The review also addressed the overlap between high intelligence and other neurodivergent profiles. The researchers noted that the profile of high cognitive empathy and low emotional empathy can superficially resemble traits seen in autism spectrum disorder. However, they clarify a key difference.

In autism, challenges often arise from a difficulty in reading social cues or understanding another’s perspective. In contrast, highly intelligent individuals often read the cues perfectly but regulate their emotional response so tightly that they appear unresponsive.

This distinction is essential for clinicians and educators. Misinterpreting this regulatory strategy as a deficit could lead to incorrect interventions. The high-potential individual does not need help understanding the social world. They may instead need support in learning how to access and express their emotions without feeling overwhelmed. The dominance of the cognitive system is a strength, but it should not come at the cost of the ability to connect authentically with others.

The authors also touched upon the role of sensory sensitivity. While the stereotype suggests these individuals are hypersensitive to all stimuli, the evidence is mixed. They do not consistently show higher physiological reactivity to stress. Instead, they may show a “negativity bias.” This is a tendency to focus on negative or threatening information. For a high-functioning brain, a negative emotion or a social threat is a problem to be solved. This intense focus can mimic anxiety but is rooted in an analytical drive to resolve discrepancies in the environment.

The review emphasizes that this profile is not static. Empathy is influenced by context and motivation. A highly intelligent person might appear detached in a boring or repetitive social situation. Yet, the same person might show profound engagement when the interaction is intellectually stimulating or aligned with their values. Their empathic response is flexible and modulated by how much they value the interaction.

The authors provide several caveats to their conclusions. They warn against treating individuals with high intellectual potential as a monolith. Great diversity exists within this group. Some may have co-occurring conditions like ADHD or anxiety that alter their empathic profile. Additionally, the definition of high potential varies across studies, with different IQ thresholds used. This inconsistency makes it difficult to draw universal conclusions.

Future research directions were also identified. The authors argue that scientists need to move beyond simple laboratory questionnaires. Self-report surveys are prone to bias, especially with subjects who are good at analyzing what the test is asking.

Future studies should use ecologically valid methods that mimic real-world social interactions. Observing how these individuals navigate complex, dynamic social environments would provide a clearer picture of their empathic functioning. Physiological measures, such as heart rate variability or brain imaging during social tasks, could also help verify the “inhibition” hypothesis.

The study, “Empathy in subjects with high intellectual potential (HIP): Rethinking stereotypes through a multidimensional and developmental review,” was authored by Nathalie Lavenne-Collot, Pascale Planche, and Laurence Vaivre-Douret.

Study reveals visual processing differences in dyslexia extend beyond reading

11 December 2025 at 19:00

New research published in Neuropsychologia provides evidence that adults with dyslexia process visual information differently than typical readers, even when viewing non-text objects. The findings suggest that the neural mechanisms responsible for distinguishing between specific items, such as individual faces or houses, are less active in the dyslexic brain. This implies that dyslexia may involve broader visual processing differences beyond the well-known difficulties with connecting sounds to language.

Dyslexia is a developmental condition characterized by significant challenges in learning to read and spell. These difficulties persist despite adequate intelligence, sensory abilities, and educational opportunities. The most prominent theory regarding the cause of dyslexia focuses on a phonological deficit. This theory posits that the primary struggle lies in processing the sounds of spoken language.

According to this view, the brain struggles to break words down into their component sounds. This makes mapping those sounds to written letters an arduous task. However, reading is also an intensely visual activity. The reader must rapidly identify complex, fine-grained visual patterns to distinguish one letter from another.

Some scientists suggest that the disorder may stem partly from a high-level visual dysfunction. This hypothesis proposes that the brain regions repurposed for reading are part of a larger system used to identify various visual objects. If this underlying visual system functions atypically, it could impede reading development.

Evidence for this visual hypothesis has been mixed in the past. Some studies show that people with dyslexia struggle with visual tasks unrelated to reading, while others find no such impairment. The authors of the current study aimed to resolve some of these inconsistencies. They sought to determine if neural processing differences exist even when behavioral performance appears normal.

“Developmental dyslexia is typically understood as a phonological disorder in that it occurs because of difficulties linking sounds to words. However, past findings have hinted that there can also be challenges with visual processing, especially for complex real-world stimuli like objects and faces. We wanted to test if these visual processing challenges in developmental dyslexia are linked to distinct neural processes in the brain,” said study author Brent Pitchford, a postdoctoral researcher at KU Leuven.

The researchers focused on how the brain identifies non-linguistic objects. They chose faces and houses as stimuli because these objects require the brain to process complex visual information without involving language. This allowed the team to isolate visual processing from phonological or verbal processing.

The study involved 62 adult participants. The sample consisted of 31 individuals with a history of dyslexia and 31 typical readers. The researchers ensured the groups were matched on key demographics, including age, gender, and general intelligence. All participants underwent vision screening to ensure normal visual acuity.

Participants engaged in a matching task while their brain activity was recorded. The researchers used electroencephalography (EEG), a method that detects electrical activity using a cap of electrodes placed on the scalp. This technique allows for the precise measurement of the timing of brain responses.

The researchers were specifically interested in two electrical signals, known as event-related potentials. The first signal is called the N170. It typically peaks around 170 milliseconds after a person sees an image. This component reflects the early stage of structural encoding, where the brain categorizes an object as a face or a building.

The second signal is called the N250. This potential peaks between 230 and 320 milliseconds. The N250 is associated with a later stage of processing. It reflects the brain’s effort to recognize a specific identity or “individuate” an object from others in the same category.
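In broad strokes, components such as the N170 and N250 are quantified by averaging many EEG epochs and taking the mean amplitude within each component's time window. The sketch below illustrates that idea with simulated data; the sampling rate, epoch length, and exact windows are assumptions for the example, not the study's analysis pipeline.

```python
# Illustrative ERP quantification: average epochs, then measure mean amplitude
# in a component's time window. Shapes and parameters are assumed for the example.
import numpy as np

SFREQ = 500            # samples per second (assumed)
EPOCH_START_MS = -100  # epoch begins 100 ms before stimulus onset (assumed)

def mean_amplitude(epochs: np.ndarray, window_ms: tuple) -> float:
    """Mean amplitude of the trial-averaged ERP within a time window.

    epochs: array of shape (n_trials, n_samples) for one channel.
    window_ms: (start, end) relative to stimulus onset, in milliseconds.
    """
    erp = epochs.mean(axis=0)  # average over trials
    to_sample = lambda ms: int((ms - EPOCH_START_MS) / 1000 * SFREQ)
    start, end = (to_sample(ms) for ms in window_ms)
    return float(erp[start:end].mean())

# Example with simulated data: 100 trials, 600 ms epochs.
rng = np.random.default_rng(1)
epochs = rng.normal(size=(100, int(0.6 * SFREQ)))
n170 = mean_amplitude(epochs, (150, 190))  # early structural encoding window
n250 = mean_amplitude(epochs, (230, 320))  # later individuation window
print(n170, n250)
```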

During the experiment, participants viewed pairs of images on a computer screen. A “sample” image appeared first, followed by a brief pause. A second “comparison” image then appeared. Participants had to decide if the second image depicted the same identity as the first.

“The study focused on within-category object discrimination (e.g., telling one house from another house) largely because reading involves visual words,” Pitchford told PsyPost. “It is often hard to study these visual processes because reading also involves other things like sound processing as well.”

The researchers also manipulated the visual quality of the images. Some trials used images containing all visual information. Other trials utilized images filtered to show only high spatial frequencies. High spatial frequencies convey fine details and edges, which are essential for distinguishing letters.

Remaining trials used images filtered to show only low spatial frequencies. These images convey global shapes and blurry forms but lack fine detail. This manipulation allowed the team to test if dyslexia involves specific deficits in processing fine details.
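One standard way to produce such stimuli is Gaussian filtering: blurring an image keeps only its low spatial frequencies, and subtracting the blurred version from the original leaves the fine, high-frequency detail. The snippet below sketches that idea; the cutoff value is an arbitrary choice for illustration, not the one used in the study.

```python
# Sketch of splitting an image into low and high spatial frequencies.
# The sigma (cutoff) is an arbitrary illustrative value.
import numpy as np
from scipy.ndimage import gaussian_filter

def split_spatial_frequencies(image: np.ndarray, sigma: float = 5.0):
    """Return (low_sf, high_sf) versions of a grayscale image."""
    low_sf = gaussian_filter(image.astype(float), sigma=sigma)  # blurry global form
    high_sf = image.astype(float) - low_sf                      # fine edges and detail
    return low_sf, high_sf

# Example with a random "image"; in practice this would be a face or house photo.
img = np.random.default_rng(2).random((256, 256))
low, high = split_spatial_frequencies(img)
```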

The behavioral results showed that both groups performed similarly on the task. Adults with dyslexia were generally as accurate and fast as typical readers when determining if two faces or houses were identical. There was a non-significant trend suggesting dyslexic readers were slightly less accurate with high-detail images.

Despite the comparable behavioral performance, the EEG data revealed distinct neural differences. The early brain response, the N170, was virtually identical for both groups. This suggests that the initial structural encoding of faces and objects is intact in dyslexia. The dyslexic brain appears to categorize objects just as quickly and effectively as the typical brain.

However, the later N250 response showed a significant divergence. The amplitude of the N250 was consistently reduced in the dyslexic group compared to the typical readers. This reduction indicates less neural activation during the process of identifying specific individuals.

“This effect was medium-to-large-sized, and robust when controlling for potential confounds such as ADHD, fatigue, and trial-to-trial priming,” Pitchford said. “Importantly, it appeared for both face and house stimuli, highlighting its generality across categories.”

The findings provide support for the high-level visual dysfunction hypothesis. They indicate that the neural machinery used to tell one object from another functions differently in dyslexia. This difference exists even when the individual successfully performs the task.

“Our results suggest that reading challenges in developmental dyslexia are likely due to a combination of factors, including some aspects of visual processing, and that developmental dyslexia is not solely due to challenges with phonological processing,” Pitchford explained. “We found neural differences related to how people with dyslexia discriminate between similar faces or objects, even though their behavior looked the same. This points to specific visual processes in the brain that may play a meaningful role in reading development and reading difficulties.”

The researchers propose that adults with dyslexia may use compensatory strategies to achieve normal behavioral performance. Their brains might rely on different neural pathways to recognize objects. This compensation allows them to function well in everyday visual tasks. However, this alternative processing route might be less efficient for the rapid, high-volume demands of reading.

“We expected to see lower accuracy on the visual discrimination tasks in dyslexia based on previous work,” Pitchford said. “Instead, accuracy was similar across groups, yet the neural responses differed. This suggests that adults with dyslexia may rely on different neural mechanisms to achieve comparable performance. Because these adults already have years of experience reading and recognizing faces and objects, it raises important questions about how these neural differences develop over time.”

One limitation of the study is the educational background of the participants. A significant portion of the dyslexic group held university degrees. These individuals likely developed robust compensatory mechanisms over the years. This high level of compensation might explain the lack of behavioral deficits.

It is possible that a sample with lower educational attainment would show clearer behavioral struggles with visual recognition. Additionally, the study was conducted on adults. It remains to be seen if these neural differences are present in children who are just learning to read.

Pitchford also noted that “these findings do not imply that phonological difficulties are unimportant in dyslexia. There is already extensive evidence supporting their crucial role. Rather, our study shows that visual factors contribute to dyslexia as well, and that dyslexia is unlikely to have a single cause. We see dyslexia as a multifactorial condition in which both phonological and visual factors play meaningful roles.”

Determining the timeline of these deficits is a necessary step for future research. Scientists need to establish whether these visual processing differences precede reading problems or result from a lifetime of different reading experiences. The researchers also suggest comparing these findings with other conditions. For instance, comparing dyslexic readers to individuals with prosopagnosia, or face blindness, could be illuminating.

“The next steps for this research are to test whether the neural differences we observed reflect general visual mechanisms or processes more specific to particular categories such as faces,” Pitchford explained. “To do this, we’ll apply the same paradigm to individuals with prosopagnosia, who have difficulties recognizing faces. We believe the comparison of results from the two groups will shed light on which visual processes contribute to dyslexia and prosopagnosia, both of which are traditionally thought to be due to challenges in specific domains (reading vs. face recognition).”

The study, “Distinct neural processing underlying visual face and object perception in dyslexia,” was authored by Brent Pitchford, Hélène Devillez, and Heida Maria Sigurdardottir.

Humans have an internal lunar clock, but we are accidentally destroying it

11 December 2025 at 03:00

Most animals, including humans, carry an internal lunar clock, tuned to the 29.5-day rhythm of the Moon. It guides the sleep, reproduction and migration of many species. But in the age of artificial light, that ancient signal is fading – washed out by the glow of cities, screens and satellites.

Just as the circadian rhythm keeps time with the 24-hour rotation of the Earth, many organisms also track the slower rhythm of the Moon. Both systems rely on light cues, and a recent study analysing women’s menstrual cycles shows that as the planet brightens from artificial light, the natural contrasts that once structured biological time are being blurred.

Plenty of research suggests the lunar cycle still influences human sleep. A 2021 study found that in Toba (also known as Qom) Indigenous communities in Argentina, people went to bed 30-80 minutes later and slept 20-90 minutes less in the three-to-five nights before the full Moon.

Similar, though weaker, patterns appeared among more than 400 Seattle students in the same study, even amid the city’s heavy light pollution. This suggests that electric light may dampen but not erase this lunar effect.

The researchers found that sleep patterns varied not only with the full-Moon phase but also with the new- and half-Moon phases. This 15-day rhythm may reflect the influence of the Moon’s changing gravitational pull, which peaks twice per lunar month, during both the full and new Moons, when the Sun, Earth and Moon align. Such gravitational cycles could subtly affect biological rhythms alongside light-related cues.

Laboratory studies have supported these findings. In a 2013 experiment, during the full Moon phase participants took about five minutes longer to fall asleep, slept 20 minutes less, and secreted less melatonin (a hormone that helps regulate the sleep-wake cycle). They also showed a 30% reduction in EEG slow-wave brain activity – an indicator of deep sleep.

Their sleep was monitored over several weeks covering a lunar cycle. The participants also reported poorer sleep quality around the full Moon, despite being unaware that their data was being analysed against lunar phases.

Perhaps the most striking evidence of a lunar rhythm in humans comes from the recent study analysing long-term menstrual records of 176 women across Europe and the US.

Before around 2010 – when LED lighting and smartphone use became widespread – many women’s menstrual cycles tended to begin around the full Moon or new Moon phases. Afterwards, that synchrony largely vanished, persisting only in January, when the Moon-Sun-Earth gravitational effects are strongest.

The researchers propose that humans may still have an internal Moon clock, but that its coupling to lunar phases has been weakened by artificial lighting.

A metronome for other species

The Moon acts as a metronome for other species. For example, coral reefs coordinate mass spawning events with precision, releasing eggs and sperm under specific phases of moonlight.

In a 2016 laboratory study, researchers working with reef-building corals (for example A. millepora) replaced the natural night light cycle with regimes of constant light or constant darkness. They found that the normal cycling of clock-genes (such as the cryptochromes) was flattened or lost, and the release of sperm and eggs fell out of sync. These findings suggest lunar light cues are integral to the genetic and physiological rhythms that underlie synchronised reproduction.

Other species, such as the marine midge Clunio marinus, use an internal “coincidence detector” that integrates circadian and lunar signals to time their reproduction precisely with low tides. Genetic studies have shown this lunar timing is linked to several clock-related genes – suggesting that the influence of lunar cycles extends down to the molecular level.

However, a 2019 study found that the synchrony of wild coral spawning is breaking down. Scientists think this may be due to pollutants and rising sea temperatures as well as light pollution. But we know that light pollution is causing disruption for many wildlife species that use the Moon to navigate or time their movements.

Near-permanent brightness

For most of human history, moonlight was the brightest light of night. Today, it competes with an artificial glow visible from space. According to the World Atlas of Artificial Night Sky Brightness, more than 80% of the global population – and nearly everyone in Europe and the US – live under a light-polluted sky (one that is bright enough to hide the Milky Way).

In some countries such as Singapore or Kuwait, there is literally nowhere without significant light pollution. Constant sky-glow from dense urban lighting keeps the sky so bright that night never becomes truly dark.

This near-permanent brightness is a by-product of these countries’ high population density, extensive outdoor illumination, and the reflection of light off buildings and the atmosphere. Even in remote national parks far from cities, the glow of distant lights can still be detected hundreds of kilometres away.

In cognitive neuroscience, time perception is often described by pacemaker–accumulator models, in which an internal “pacemaker” emits regular pulses that the brain counts to estimate duration. The stability of this system depends on rhythmic environmental cues – daylight, temperature, social routines – that help tune the rate of those pulses.
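A toy simulation makes the pacemaker-accumulator idea concrete: pulses are emitted at some rate, counted, and converted back into seconds using the rate the system expects. If the actual pulse rate drifts relative to that expectation, time estimates drift with it. The parameters below are arbitrary choices for illustration, not values from the time-perception literature.

```python
# Toy pacemaker-accumulator model: count noisy pulses, read the count out as time.
# All rates and noise levels are arbitrary illustrative values.
import numpy as np

def estimated_duration(true_seconds, pulse_rate_hz, calibration_rate_hz,
                       noise_sd=0.05, rng=np.random.default_rng(3)):
    """Count noisy pacemaker pulses and convert the count back to seconds."""
    pulses = rng.poisson(true_seconds * pulse_rate_hz)    # pulses emitted and accumulated
    pulses = pulses * (1 + rng.normal(0, noise_sd))       # attention/arousal noise
    return pulses / calibration_rate_hz                   # read out against the expected rate

# A pacemaker running faster than the calibrated rate makes the same 10-second
# interval feel longer; a matched rate gives a roughly accurate estimate.
print(estimated_duration(10, pulse_rate_hz=5.5, calibration_rate_hz=5.0))
print(estimated_duration(10, pulse_rate_hz=5.0, calibration_rate_hz=5.0))
```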

Losing the slow, monthly cue of moonlight may mean that our internal clocks now run in a flatter temporal landscape, with fewer natural fluctuations to anchor them. Previous psychological research has found disconnection from nature can warp our sense of time.

The lunar clock still ticks within us – faint but measurable. It shapes tides, sleep and the rhythms of countless species. Yet as the night sky brightens, we risk losing not only the stars, but the quiet cadence that once linked life on Earth to the turning of the Moon.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.
