
The scientist who predicted AI psychosis has issued another dire warning

7 February 2026 at 17:00

More than two years ago, Danish psychiatrist Søren Dinesen Østergaard published a provocative editorial suggesting that the rise of conversational artificial intelligence could have severe mental health consequences. He proposed that the persuasive, human-like nature of chatbots might push vulnerable individuals toward psychosis.

At the time, the idea seemed speculative. In the months that followed, however, clinicians and journalists began documenting real-world cases that mirrored his concerns. Patients were developing fixed, false beliefs after marathon sessions with digital companions. Now, the scientist who foresaw the psychiatric risks of AI has issued a new warning. This time, he is not focusing on mental illness, but on a potential degradation of human intelligence itself.

In a new letter to the editor published in Acta Psychiatrica Scandinavica, Østergaard argues that academia and the sciences are facing a crisis of “cognitive debt.” He posits that the outsourcing of writing and reasoning to generative AI is eroding the fundamental skills required for scientific discovery. The commentary builds upon a growing body of evidence suggesting that while AI can mimic human output, relying on it may physically alter the brain’s ability to think.

Østergaard’s latest writing is a response to a letter by Professor Soichiro Matsubara. Matsubara had previously highlighted that AI chatbots might harm the writing abilities of young doctors and damage the mentorship dynamic in medicine. Østergaard agrees with this assessment but takes the argument a step further. He contends that the danger extends beyond mere writing skills and strikes at the core of the scientific process: reasoning.

The psychiatrist acknowledges the utility of AI for surface-level tasks. He notes that using a tool to proofread a manuscript for grammar is largely harmless. However, he points out that technology companies are actively marketing “reasoning models” designed to solve complex problems and plan workflows. While this sounds efficient, Østergaard suggests it creates a paradox. He questions whether the next generation of scientists will possess the cognitive capacity to make breakthroughs if they never practice the struggle of reasoning themselves.

To illustrate this point, he cites the developers of AlphaFold, an AI program that predicts protein structures. The technology earned its creators at Google DeepMind a share of the 2024 Nobel Prize in Chemistry, awarded jointly with a University of Washington scientist recognized for computational protein design.

Østergaard argues that it is not a given that these specific scientists would have achieved such heights if generative AI had been available to do their thinking for them during their formative years. He suggests that scientific reasoning is not an innate talent. It is a skill learned through the rigorous, often tedious practice of reading, thinking, and revising.

The concept of “cognitive debt” is central to this new warning. Østergaard draws attention to a preprint study by Kosmyna and colleagues, titled “Your brain on ChatGPT.” This research attempts to quantify the neurological cost of using AI assistance. The study involved participants writing essays under three conditions: using ChatGPT, using a search engine, or using only their own brains.

The findings of the Kosmyna study provide physical evidence for Østergaard’s concerns. Electroencephalography (EEG) monitoring revealed that participants in the ChatGPT group showed substantially lower brain activation in networks typically engaged during cognitive tasks. The brain was simply doing less work. More alarming was the finding that this “weaker neural connectivity” persisted even when these participants switched to writing essays without AI.

The study also found that those who used the chatbot had significant difficulties recalling the content of the essays they had just produced. The authors of the paper concluded that the results point to a pressing concern: a likely decrease in learning skills. Østergaard describes these findings as deeply concerning. He suggests that if AI use indeed causes such cognitive debt, the educational system may be in a difficult position.

This aligns with other recent papers regarding “cognitive offloading.” A commentary by Umberto León Domínguez published in Neuropsychology explores the idea of AI as a “cognitive prosthesis.” Just as a physical prosthesis replaces a limb, AI replaces mental effort. While this can be efficient, León Domínguez warns that it prevents the stimulation of higher-order executive functions. If students do not engage in the mental gymnastics required to solve problems, those cognitive muscles may atrophy.

Real-world examples are already surfacing. Østergaard references a report from the Danish Broadcasting Corporation about a high school student who used ChatGPT to complete approximately 150 assignments. The student was eventually expelled. While this is an extreme case, Østergaard notes that widespread outsourcing is becoming the norm from primary school through graduate programs. He fears this will reduce the chances of exceptional minds emerging in the future.

The loss of critical thinking skills is not just a future risk but a present reality. A study by Michael Gerlich published in the journal Societies found a strong negative correlation between frequent AI tool usage and critical thinking abilities. The research indicated that younger individuals were particularly susceptible. Those who frequently offloaded cognitive tasks to algorithms performed worse on assessments requiring independent analysis and evaluation.

There is also the issue of false confidence. A study published in Computers in Human Behavior by Daniela Fernandes and colleagues found that while AI helped users score higher on logic tests, it also distorted their self-assessment. Participants consistently overestimated their performance. The technology acted as a buffer, masking their own lack of understanding. This creates a scenario where individuals feel competent because the machine is competent, leading to a disconnect between perceived and actual ability.

This intellectual detachment mirrors the emotional detachment Østergaard identified in his earlier work on AI psychosis. In his previous editorial, he warned that the “sycophantic” nature of chatbots—their tendency to agree with and flatter the user—could reinforce delusions. A user experiencing paranoia might find a willing conspirator in a chatbot, which confirms their false beliefs to keep the conversation going.

The mechanism is similar in the context of cognitive debt. The AI provides an easy, pleasing answer that satisfies the immediate need of the user, whether that need is emotional validation or a completed homework assignment. In both cases, the human user surrenders their agency to the algorithm. They stop testing reality or their own logic against the world, preferring the smooth, frictionless output of the machine.

Østergaard connects this loss of human capability to the ultimate risks of artificial intelligence. He cites Geoffrey Hinton, a Nobel laureate in physics often called the “godfather of AI.” Hinton has expressed concerns that there is a significant probability that AI could threaten humanity’s existence within the next few decades. Østergaard argues that facing such existential threats requires humans who are cognitively adept.

If the population becomes “cognitively indebted,” reliant on machines for basic reasoning, the ability to maintain control over those same machines diminishes. The psychiatrist emphasizes that we need humans in the loop who are capable of independent, rigorous thought. A society that has outsourced its reasoning to the very systems it needs to regulate may find itself ill-equipped to handle the consequences.

The warning is clear. The convenience of generative AI comes with a hidden cost. It is not merely a matter of students cheating on essays or doctors losing their writing flair. The evidence suggests a fundamental change in how the brain processes information. By skipping the struggle of learning and reasoning, humans may be sacrificing the very cognitive traits that allow for scientific advancement and independent judgment.

Østergaard was correct when he flagged the potential for AI to distort reality for psychiatric patients. His new commentary suggests that the distortion of our intellectual potential may be a far more widespread and insidious problem. As AI tools become more integrated into daily life, the choice between cognitive effort and cognitive offloading becomes a defining challenge for the future of human intelligence.

The paper, “Generative Artificial Intelligence (AI) and the Outsourcing of Scientific Reasoning: Perils of the Rising Cognitive Debt in Academia and Beyond,” was published January 21, 2026.

Scientists just mapped the brain architecture that underlies human intelligence

6 February 2026 at 17:00

For decades, researchers have attempted to pinpoint the specific areas of the brain responsible for human intelligence. A new analysis suggests that general intelligence involves the coordination of the entire brain rather than the superior function of any single region. By mapping the connections within the human brain, or connectome, scientists found that distinct patterns of global communication predict cognitive ability.

The research indicates that intelligent thought relies on a system-wide architecture optimized for efficiency and flexibility. These findings were published in the journal Nature Communications.

General intelligence represents the capacity to reason, learn, and solve problems across a variety of different contexts. In the past, theories often attributed this capacity to specific networks, such as the areas in the frontal and parietal lobes involved in attention and working memory. While these regions are involved in cognitive tasks, newer perspectives suggest they are part of a larger story.

The Network Neuroscience Theory proposes that intelligence arises from the global topology of the brain. This framework suggests that the physical wiring of the brain and its patterns of activity work in tandem.

Ramsey R. Wilcox, a researcher at the University of Notre Dame, led the study to test the specific predictions of this network theory. Working with senior author Aron K. Barbey and colleagues from the University of Illinois and Stony Brook University, Wilcox sought to move beyond localized models. The team aimed to understand how the brain’s physical structure constrains and directs its functional activity.

To investigate these questions, the research team utilized data from the Human Connectome Project. This massive dataset provided brain imaging and cognitive testing results from 831 healthy young adults. The researchers also validated their findings using an independent sample of 145 participants from a separate study.

The investigators employed a novel method that combined two distinct types of magnetic resonance imaging (MRI) data. They used diffusion-weighted MRI to map the structural white matter tracts, which act as the physical cables connecting brain regions. Simultaneously, they analyzed resting-state functional MRI, which measures the rhythmic activation patterns of brain cells.

By integrating these modalities, Wilcox and his colleagues created a joint model of the brain. This approach allowed them to estimate the capacity of structural connections to transmit information based on observed activity. The model corrected for limitations in traditional scanning, such as the difficulty in detecting crossing fibers within the brain’s white matter.

The team then applied predictive modeling techniques to see if these global network features could estimate a participant’s general intelligence score. The results provided strong support for the idea that intelligence is a distributed phenomenon. Models that incorporated connections across the whole brain successfully predicted intelligence scores.

In contrast, models that relied on single, isolated networks performed with less accuracy. This suggests that while specific networks have roles, the interaction between them is primary. The most predictive connections were not confined to one area but were spread throughout the cortex.
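The article does not detail the team’s modeling pipeline, but the general idea of predicting a cognitive score from whole-brain connection features can be sketched in a few lines. The Python example below uses cross-validated ridge regression over vectorized connectomes; the variable names, the synthetic data, and the choice of ridge regression are illustrative assumptions, not the authors’ reported method.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, cross_val_predict
from scipy.stats import pearsonr

# Hypothetical inputs: one connectivity matrix per participant and one intelligence score.
rng = np.random.default_rng(0)
n_subjects, n_regions = 100, 50
connectomes = rng.random((n_subjects, n_regions, n_regions))
connectomes = (connectomes + connectomes.transpose(0, 2, 1)) / 2  # make each matrix symmetric
g_scores = rng.normal(size=n_subjects)

# Vectorize the upper triangle of each matrix so every connection becomes a feature.
iu = np.triu_indices(n_regions, k=1)
X = np.array([c[iu] for c in connectomes])

# Out-of-sample prediction: each participant's score is estimated by a model
# trained on the other folds, which is what "predicting intelligence" means here.
model = Ridge(alpha=1.0)
predicted = cross_val_predict(model, X, g_scores, cv=KFold(5, shuffle=True, random_state=0))
r, p = pearsonr(predicted, g_scores)
print(f"prediction accuracy r = {r:.2f} (p = {p:.3f})")
```

Cross-validation of this kind is what allows researchers to say global network features “predict” intelligence in new individuals rather than merely correlating with it within a single sample.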

One of the specific predictions the team tested involved the strength and length of neural connections. The researchers found that individuals with higher intelligence scores tended to rely on “weak ties” for long-range communication. In network science, a weak tie represents a connection that is not structurally dense but acts as a bridge between separate communities of neurons.

These long-range, weak connections require less energy to maintain than dense, strong connections. Their weakness allows them to be easily modulated by neural activity. This quality makes the brain more adaptable, enabling it to reconfigure its communication pathways rapidly in response to new problems.

The study showed that in highly intelligent individuals, these predictive weak connections spanned longer physical distances. Conversely, strong connections in these individuals tended to be shorter. This architecture likely balances the high cost of long-distance communication with the need for system-wide integration.

Another key finding concerned “modal control.” This concept refers to the ability of specific brain regions to drive the brain into difficult-to-reach states of activity. Cognitive tasks often require the brain to shift away from its default patterns to process complex information.

Wilcox and his team found that general intelligence was positively associated with the presence of regions exhibiting high modal control. These control hubs were located in areas of the brain associated with executive function and visual processing. The presence of these regulating nodes allows the brain to orchestrate interactions between different networks effectively.
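Modal control has a standard quantitative definition in the network control literature: a region scores highly if it loads onto the network’s fast-decaying modes, the activity states the system rarely reaches on its own. The sketch below follows that common textbook formulation; whether Wilcox and colleagues computed it exactly this way is an assumption on my part.

```python
import numpy as np

def modal_controllability(A):
    """Per-node modal controllability, following a common formulation
    used in network control studies of the brain.

    A: (n, n) symmetric structural connectivity matrix.
    """
    # Scale so the linear dynamics x(t+1) = A x(t) are stable.
    A = A / (1 + np.linalg.svd(A, compute_uv=False)[0])
    eigvals, eigvecs = np.linalg.eigh(A)
    # A node scores highly if it participates in modes that decay quickly
    # (small |eigenvalue|), i.e. states the network rarely visits on its own.
    phi = ((eigvecs ** 2) * (1 - eigvals ** 2)).sum(axis=1)
    return phi

# Toy usage on a random symmetric matrix
rng = np.random.default_rng(2)
A = rng.random((20, 20))
A = (A + A.T) / 2
np.fill_diagonal(A, 0)
print(modal_controllability(A).round(3))
```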

The researchers also examined the overall topology of the brain using a concept known as “small-worldness.” A small-world network is one that features tight-knit local communities of nodes as well as short paths that connect those communities. This organization is efficient because it allows for specialized local processing while maintaining rapid global communication.

The analysis revealed that participants with higher intelligence scores possessed brain networks with greater small-world characteristics. Their brains exhibited high levels of local clustering, meaning nearby regions were tightly interconnected. Simultaneously, they maintained short average path lengths across the entire system.

This balance ensures that information does not get trapped in local modules. It also ensures that the brain does not become a disorganized random network. The findings suggest that deviations from this optimal balance may underlie lower cognitive performance.
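Small-worldness itself is a standard graph statistic that combines exactly the two quantities described above, local clustering and average path length, benchmarked against comparable random networks. A minimal sketch using networkx’s built-in measures follows; applying it to a simple synthetic graph is purely illustrative and is not the paper’s pipeline.

```python
import networkx as nx

# A Watts-Strogatz graph is a classic small-world example:
# mostly local neighbourhood wiring plus a few long-range shortcuts.
G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=0)

clustering = nx.average_clustering(G)             # tight-knit local communities
path_length = nx.average_shortest_path_length(G)  # short global paths

# sigma compares both quantities against equivalent random graphs;
# values above 1 indicate small-world organisation.
sigma = nx.sigma(G, niter=5, nrand=5, seed=0)

print(f"C = {clustering:.2f}, L = {path_length:.2f}, small-world sigma = {sigma:.2f}")
```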

There are limitations to the current study that warrant consideration. The research relies on correlational data, so it cannot definitively prove that specific network structures cause higher intelligence. It is possible that engaging in intellectual activities alters the brain’s wiring over time.

Additionally, the study focused primarily on young adults. Future research will need to determine if these network patterns hold true across the lifespan, from childhood development through aging. The team also used linear modeling techniques, which may miss more nuanced, non-linear relationships in the data.

These insights into the biological basis of human intelligence have implications for the development of artificial intelligence. Current AI systems often excel at specific tasks but struggle with the broad flexibility characteristic of human thought. Understanding how the human brain achieves general intelligence through global network architecture could inspire new designs for artificial systems.

By mimicking the brain’s balance of local specialization and global integration, engineers might create AI that is more adaptable. The reliance on weak, flexible connections for integrating information could also serve as a model for efficient data processing.

The shift in perspective offered by this study is substantial. It moves the field away from viewing the brain as a collection of isolated tools. Instead, it presents the brain as a unified, dynamic system where the pattern of connections determines cognitive potential.

Wilcox and his colleagues have provided empirical evidence that validates the core tenets of Network Neuroscience Theory. Their work demonstrates that intelligence is not a localized function but a property of the global connectome. As neuroscience continues to map these connections, the definition of what it means to be intelligent will likely continue to evolve.

The study, “The network architecture of general intelligence in the human connectome,” was authored by Ramsey R. Wilcox, Babak Hemmatian, Lav R. Varshney & Aron K. Barbey.

A high-sugar breakfast may trigger a “rest and digest” state that dampens cognitive focus

5 February 2026 at 21:00

Starting the day with a sugary pastry might feel like a treat, but new research suggests it could sabotage your workday before it begins. A study published in the journal Food and Humanity indicates that a high-fat, high-sugar morning meal may dampen cognitive planning abilities and increase sleepiness in young women. The findings imply that nutritional choices at breakfast play a larger role in regulating morning physiological arousal and mental focus than previously realized.

Dietary habits vary widely across populations, yet breakfast is often touted as the foundation for daily energy. Despite this reputation, statistical data indicates that a sizable portion of adult women frequently consume confections or sweet snacks as their first meal of the day. Researchers identify this trend as a potential public health concern, particularly regarding productivity and mental well-being in the workplace.

The autonomic nervous system regulates involuntary body processes, including heart rate and digestion. It functions through two main branches: the sympathetic nervous system and the parasympathetic nervous system. The sympathetic branch prepares the body for action, often described as the “fight or flight” response.

Conversely, the parasympathetic branch promotes a “rest and digest” state, calming the body and conserving energy. Professional work performance typically requires a certain level of alertness and physiological arousal. Fumiaki Hanzawa and colleagues at the University of Hyogo in Japan sought to understand how different breakfast compositions influence this delicate neural balance.

Hanzawa and his team hypothesized that the nutrient density of a meal directly impacts how the nervous system regulates alertness and cognitive processing shortly after eating. To test this, they designed a randomized crossover trial involving 13 healthy female university students. This specific study design ensured that each participant acted as her own control, minimizing the impact of individual biological variations.

On two separate mornings, the women arrived at the laboratory after fasting overnight. They consumed one of two test meals that contained an identical amount of food energy, totaling 497 kilocalories. The researchers allowed for a washout period of at least one week between the two sessions to prevent any lingering effects from the first test.

One meal option was a balanced breakfast modeled after a traditional Japanese meal, known as Washoku. This included boiled rice, salted salmon, an omelet, spinach with sesame sauce, miso soup, and a banana. The nutrient breakdown of this meal favored carbohydrates and protein, with a moderate amount of fat.

The alternative was a high-fat, high-sugar meal designed to mimic a common convenient breakfast of poor nutritional quality. This consisted of sweet doughnut holes and a commercially available strawberry milk drink. This meal derived more than half its total energy from fat and contained very little protein compared to the balanced option.

The researchers monitored several physiological markers for two hours following the meal. They measured body temperature inside the ear to track diet-induced thermogenesis, which is the production of heat in the body caused by metabolizing food. They also recorded heart rate variability to assess the activity of the autonomic nervous system.

At specific intervals, the participants completed computerized cognitive tests. These tasks were designed to measure attention and executive function. Specifically, the researchers looked at “task switching,” which assesses the brain’s ability to shift attention between different rule sets.

The participants also rated their subjective feelings on a sliding scale. They reported their current levels of fatigue, vitality, and sleepiness at multiple time points. This allowed the researchers to compare the women’s internal psychological states with their objective physiological data.

The physiological responses showed distinct patterns depending on the food consumed. The balanced breakfast prompted a measurable rise in body temperature and heart rate shortly after eating. This physiological shift suggests an activation of the sympathetic nervous system, preparing the body for the day’s activities.

In contrast, the doughnut and sweetened milk meal failed to raise body temperature to the same degree. Instead, the data revealed a dominant response from the parasympathetic nervous system immediately after consumption. This suggests the sugary meal induced a state of relaxation and digestion rather than physiological readiness.

Subjective reports from the participants mirrored these physical changes. The women reported feeling higher levels of vitality after consuming the balanced meal containing rice and fish. This feeling of energy persisted during the post-meal monitoring period.

Conversely, when the same women ate the high-fat, high-sugar breakfast, they reported increased sleepiness. This sensation of lethargy aligns with the parasympathetic dominance observed in the heart rate data. The anticipated energy boost from the sugar did not translate into a feeling of vitality.

The cognitive testing revealed that the sugary meal led to a decline in planning function. Specifically, the participants struggled more with task switching after the high-fat, high-sugar breakfast compared to the balanced meal. This function is vital for organizing steps to achieve a goal and adapting to changing work requirements.

Unexpectedly, the high-fat, high-sugar group performed slightly better on a specific visual attention task. The authors suggest this could be due to a temporary dopamine release triggered by the sweet taste. However, this isolated improvement did not extend to the more complex executive functions required for planning.

The researchers propose that the difference in carbohydrate types may explain some of the results. The balanced meal contained rice, which is rich in polysaccharides like amylose and amylopectin. These complex carbohydrates digest differently than the sucrose found in the doughnuts and sweetened milk.

Protein content also likely played a role in the thermal effects observed. The balanced meal contained significantly more protein, which is known to require more energy to metabolize than fat or sugar. This thermogenic effect contributes to the rise in body temperature and the associated feeling of alertness.

The study implies that work performance is not just about caloric intake but the quality of those calories. A breakfast that triggers a “rest and digest” response may be counterproductive for someone attempting to start a workday. The mental fog and sleepiness associated with the high-fat, high-sugar meal could hinder productivity.

While the results provide insight into diet and physiology, the study has limitations that affect broader applications. The sample size was small, involving only 13 participants from a specific age group and gender. This limits the ability to generalize the results to men or older adults with different metabolic profiles.

The study also focused exclusively on young students rather than full-time workers. Actual workplace stress and physical demands might interact with diet in ways this laboratory setting could not replicate. Additionally, the study only examined immediate, short-term effects following a single meal.

It remains unclear how long-term habitual consumption of high-fat, high-sugar breakfasts might alter these responses over months or years. Chronic exposure to such a diet could potentially lead to different adaptations or more severe deficits. The researchers note that habitual poor diet is already linked to cognitive decline in other epidemiological studies.

Hanzawa and the research team suggest that future investigations should expand the demographic pool. Including male participants and older workers would help clarify if these physiological responses are universal. They also recommend examining how these physiological changes translate into actual performance metrics in a real-world office environment.

The study, “High-fat, high-sugar breakfast worsen morning mood, cognitive performance, and cardiac sympathetic nervous system activity in young women,” was authored by Fumiaki Hanzawa, Manaka Hashimoto, Mana Gonda, Miyoko Okuzono, Yumi Takayama, Yukina Yumen, and Narumi Nagai.
