
Inside the ‘House of Horrors’ mass murder case that rocked India – and may never be solved

Nearly two decades after the chilling discovery of mutilated bodies in drains behind a bungalow in a village near Delhi, India’s top court is now poised to overturn the last remaining conviction in the serial murder case. Speaking to Namita Singh, parents of the victims say they are losing hope of ever receiving justice


Keira Knightley: ‘Early motherhood is definitely more exhausting than shooting films!’

She’s played a Jane Austen hero, a swashbuckling aristocrat, and a wartime cryptanalyst. Now, the beloved actor steps into her most unexpected role yet, as a children’s author. She talks to Jessie Thompson about her daughter’s teething problems, getting equal pay, and that famous ‘Vanity Fair’ photoshoot


Scientists question caffeine’s power to shield the brain from junk food

A recent study provides evidence that while a diet high in fat and sugar is associated with memory impairment, habitual caffeine consumption is unlikely to offer protection against these negative effects. These findings, which come from two related experiments, help clarify the complex interplay between diet, stimulants, and cognitive health in humans. The research was published in Physiology & Behavior.

Researchers have become increasingly interested in the connection between nutrition and brain function. A growing body of scientific work, primarily from animal studies, has shown that diets rich in fat and sugar can impair memory, particularly functions related to the hippocampus, a brain region vital for learning and recall.

Human studies have started to align with these findings, linking high-fat, high-sugar consumption with poorer performance on memory tasks and with more self-reported memory failures. Given these associations, scientists are searching for potential protective factors that might lessen the cognitive impact of a poor diet.

Caffeine is one of the most widely consumed psychoactive substances in the world, and its effects on cognition have been studied extensively. While caffeine is known to improve alertness and reaction time, its impact on memory has been less clear. Some research in animal models has suggested that caffeine could have neuroprotective properties, potentially guarding against the memory deficits induced by a high-fat, high-sugar diet. These animal studies hinted that caffeine might work by reducing inflammation or through other brain-protective mechanisms. However, this potential protective effect had not been thoroughly investigated in human populations, a gap this new research aimed to address.

To explore this relationship, the researchers conducted two experiments. In the first experiment, they recruited 1,000 healthy volunteers between the ages of 18 and 45. Participants completed a series of online questionnaires designed to assess their dietary habits, memory, and caffeine intake. Their consumption of fat and sugar was measured using the Dietary Fat and Free Sugar Questionnaire, which asks about the frequency of eating various foods over the past year.

To gauge memory, participants filled out the Everyday Memory Questionnaire, a self-report measure where they rated how often they experience common memory lapses, such as forgetting names or misplacing items. Finally, they reported their daily caffeine consumption from various sources like coffee, tea, and soda.

The results from this first experiment confirmed a link between diet and self-perceived memory. Individuals who reported eating a diet higher in fat and sugar also reported experiencing more frequent everyday memory failures. The researchers then analyzed whether caffeine consumption altered this relationship. The analysis suggested a potential, though not statistically strong, moderating effect.

When the researchers specifically isolated the fat component of the diet, they found that caffeine consumption did appear to weaken the association between high fat intake and self-reported memory problems. At low levels of caffeine intake, a high-fat diet was strongly linked to memory complaints, but this link was not present for those with high caffeine intake. This provided preliminary evidence that caffeine might offer some benefit.
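
For readers unfamiliar with this kind of moderation analysis, the sketch below shows how an interaction between diet and caffeine might be specified in a regression model. It is a minimal illustration in Python using statsmodels, with hypothetical column names (diet_fat, caffeine, memory_failures) rather than the study's actual variables or data.

```python
# Minimal sketch of a moderation (interaction) analysis, assuming a
# hypothetical DataFrame with one row per participant. Column names are
# illustrative, not the study's actual variables.
import pandas as pd
import statsmodels.formula.api as smf

def fit_moderation_model(df: pd.DataFrame):
    """Test whether caffeine moderates the diet-memory association."""
    # Center predictors so main effects are interpretable at average levels.
    df = df.assign(
        diet_fat_c=df["diet_fat"] - df["diet_fat"].mean(),
        caffeine_c=df["caffeine"] - df["caffeine"].mean(),
    )
    # The diet_fat_c:caffeine_c term is the moderation effect of interest:
    # a negative coefficient would mean the diet-memory link weakens
    # as caffeine intake increases.
    model = smf.ols("memory_failures ~ diet_fat_c * caffeine_c", data=df).fit()
    return model

# Usage, once a real dataset has been loaded into `df`:
# results = fit_moderation_model(df)
# print(results.summary())
```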

The second experiment was designed to build upon the initial findings with a more robust assessment of memory. This study involved 699 healthy volunteers, again aged 18 to 45, who completed the same questionnaires on diet, memory failures, and caffeine use. The key addition in this experiment was an objective measure of memory called the Verbal Paired Associates task. In this task, participants were shown pairs of words and were later asked to recall the second word of a pair when shown the first. This test provides a direct measure of episodic memory, which is the ability to recall specific events and experiences.

The findings from the second experiment once again showed a clear association between diet and memory. A higher intake of fat and sugar was linked to more self-reported memory failures, replicating the results of the first experiment. The diet was also associated with poorer performance on the objective Verbal Paired Associates task, providing stronger evidence that a high-fat, high-sugar diet is connected to actual memory impairment, not just the perception of it.

When the researchers examined the role of caffeine in this second experiment, the results were different from the first. This time, caffeine consumption did not moderate the relationship between a high-fat, high-sugar diet and either of the memory measures. In other words, individuals who consumed high amounts of caffeine were just as likely to show diet-related memory deficits as those who consumed little or no caffeine.

This lack of a protective effect was consistent for both self-reported memory failures and performance on the objective word-pair task. The findings from this more comprehensive experiment did not support the initial suggestion that caffeine could shield memory from the effects of a poor diet.

The researchers acknowledge certain limitations in their study. The data on diet and caffeine consumption were based on self-reports, which can be subject to recall errors. The participants were also relatively young and generally healthy, and the effects of diet on memory might be more pronounced in older populations or those with pre-existing health conditions. Since the study was conducted online, it was not possible to control for participants’ caffeine intake right before they completed the memory tasks, which could have influenced performance.

For future research, the scientists suggest using more objective methods to track dietary intake. They also recommend studying different populations, such as older adults or individuals with obesity, where the links between diet, caffeine, and memory may be clearer. Including a wider array of cognitive tests could also help determine if caffeine has protective effects on other brain functions beyond episodic memory, such as attention or executive function. Despite the lack of a protective effect found here, the study adds to our understanding of how lifestyle factors interact to influence cognitive health.

The study, “Does habitual caffeine consumption moderate the association between a high fat and sugar diet and self-reported and episodic memory impairment in humans?,” was authored by Tatum Sevenoaks and Martin Yeomans.

New $2 saliva test may aid in psychiatric diagnosis

A team of researchers in Brazil has engineered an inexpensive, disposable sensor that can detect a key protein linked to mental health conditions using a drop of saliva. Described in the journal ACS Polymers Au, the device could one day offer a rapid, non-invasive tool to help in the diagnosis and monitoring of disorders like depression and schizophrenia. The results are available in under an hour, offering a significant departure from current lab-based methods.

Diagnosing and managing psychiatric disorders currently relies heavily on clinical interviews and patient-reported symptoms, which can be subjective. Scientists have been searching for objective biological markers, and a protein called brain-derived neurotrophic factor, or BDNF, has emerged as a promising candidate. Lower-than-normal levels of BDNF, which supports the health and growth of neurons, have been consistently associated with conditions like major depression, bipolar disorder, and schizophrenia.

Existing methods for measuring BDNF typically involve blood draws and rely on complex, time-consuming laboratory procedures like the enzyme-linked immunosorbent assay. These techniques are often expensive and require specialized equipment and personnel, making them impractical for routine clinical use or for monitoring patient progress outside of a dedicated lab. The researchers sought to develop a fast, affordable, and non-invasive alternative that could be used at the point of care, motivated by the global increase in mental health conditions.

The foundation of the device is a small, flexible strip of polyester, similar to a piece of plastic film. Using a screen-printing technique, the scientists printed three electrodes onto this strip using carbon- and silver-based inks. This fabrication method is common in electronics and allows for inexpensive, mass production of the sensor strips.

To make the sensor specific to BDNF, the team modified the surface of the main working electrode in a multi-step process. First, they coated it with a layer of microscopic carbon spheres, which are synthesized from a simple glucose solution. This creates a large, textured surface area that is ideal for anchoring other molecules and enhances the sensor’s electrical sensitivity.

Next, they added a sequence of chemical layers that act as a sticky foundation for the biological components. Onto this foundation, they attached specialized proteins called antibodies. These anti-BDNF antibodies are engineered to recognize and bind exclusively to the BDNF protein, much like a key fits into a specific lock. A final chemical layer was added to block any remaining empty spots on the surface, which prevents other molecules in saliva from interfering with the measurement.

When a drop of saliva is applied to the sensor, any BDNF protein present is captured by the antibodies on the electrode. This binding event physically alters the electrode’s surface, creating a minute barrier that impedes the flow of electrons. The device then measures this change by sending a small electrical signal through the electrode and recording its resistance to that signal.

A greater amount of captured BDNF creates a larger barrier, resulting in a higher resistance, which can be precisely quantified. The entire process, from sample application to result, can be completed in about 35 minutes. The data is captured by a portable analyzer that can communicate wirelessly with a device like a smartphone, allowing for real-time analysis.
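
The quantification step rests on a calibration curve relating the measured resistance to the logarithm of concentration, a standard approach for impedance-type immunosensors. The sketch below is a generic illustration of that step, not the authors' code, and the functions and units are assumptions; a real curve would be fitted to measured calibration standards.

```python
# Generic sketch of reading a concentration from an impedance-type biosensor
# calibration curve. Inputs are placeholders; real coefficients would come
# from fitting measured calibration standards.
import numpy as np

def fit_calibration(concentrations_g_per_ml, resistances_ohm):
    """Fit a log-linear calibration: resistance ~ slope*log10(conc) + intercept."""
    log_conc = np.log10(concentrations_g_per_ml)
    slope, intercept = np.polyfit(log_conc, resistances_ohm, deg=1)
    return slope, intercept

def estimate_concentration(resistance_ohm, slope, intercept):
    """Invert the calibration to estimate concentration from a measured resistance."""
    log_conc = (resistance_ohm - intercept) / slope
    return 10.0 ** log_conc
```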

The research team demonstrated that their biosensor was remarkably sensitive. It could reliably detect BDNF across a vast concentration range, from incredibly minute amounts (as low as 10⁻²⁰ grams per milliliter) up to levels typically seen in healthy individuals.

This wide detection range is significant because it means the device could potentially identify the very low BDNF levels that may signal a disorder. It could also track the increase in BDNF levels as a patient responds positively to treatment, such as antidepressants, offering an objective measure of therapeutic success.

The sensor also proved to be highly selective. When tested against a variety of other substances commonly found in saliva, including glucose, uric acid, paracetamol, and even the spike protein from the SARS-CoV-2 virus, the device did not produce a false signal. It responded specifically to BDNF, confirming the effectiveness of its design.

Furthermore, tests using human saliva samples that were supplemented with known quantities of the protein showed that the sensor could accurately measure BDNF levels even within this complex biological fluid. The researchers estimated the cost of the materials for a single disposable strip to be around $2.19, positioning it as a potentially accessible diagnostic tool.

The current study was a proof-of-concept and has certain limitations. The experiments were conducted with a limited number of saliva samples from a single volunteer, which were then modified in the lab to contain varying concentrations of the target protein.

The next essential step will be to test the biosensor with a large and diverse group of patients diagnosed with various psychiatric conditions to validate its accuracy and reliability in a real-world clinical setting. Such studies would be needed to establish clear thresholds for what constitutes healthy versus potentially pathological BDNF levels in saliva. The researchers also plan to secure a patent for their technology and refine the device for potential commercial production. Future work could also explore integrating sensors for other biomarkers onto the same strip, allowing for a more comprehensive health assessment from a single saliva sample.

The study, “Low-Cost, Disposable Biosensor for Detection of the Brain-Derived Neurotrophic Factor Biomarker in Noninvasively Collected Saliva toward Diagnosis of Mental Disorders,” was authored by Nathalia O. Gomes, Marcelo L. Calegaro, Luiz Henrique C. Mattoso, Sergio A. S. Machado, Osvaldo N. Oliveira Jr., and Paulo A. Raymundo-Pereira.

The secret to sustainable AI may have been in our brains all along

Researchers have developed a new method for training artificial intelligence that dramatically improves its speed and energy efficiency by mimicking the structured wiring of the human brain. The approach, detailed in the journal Neurocomputing, creates AI models that can match or even exceed the accuracy of conventional networks while using a small fraction of the computational resources.

The study was motivated by a growing challenge in the field of artificial intelligence: sustainability. Modern AI systems, such as the large language models that power generative AI, have become enormous. They are built with billions of connections, and training them can require vast amounts of electricity and cost tens of millions of dollars. As these models continue to expand, their financial and environmental costs are becoming a significant concern.

“Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity, which is equivalent to the annual use of more than a hundred US homes, and cost tens of millions of dollars,” said Roman Bauer, a senior lecturer at the University of Surrey and a supervisor on the project. “That simply isn’t sustainable at the rate AI continues to grow. Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

To find a more efficient design, the research team looked to the human brain. While many artificial neural networks are “dense,” meaning every neuron in one layer is connected to every neuron in the next, the brain operates differently. Its connectivity is highly sparse and structured. For instance, in the visual system, neurons in the retina form localized and orderly connections to process information, creating what are known as topographical maps. This design is exceptionally efficient, avoiding the need for redundant wiring. The brain also refines its connections during development, pruning away unnecessary pathways to optimize its structure.

Inspired by these biological principles, the researchers developed a new framework called Topographical Sparse Mapping, or TSM. Instead of building a dense network, TSM configures the input layer of an artificial neural network with a sparse, structured pattern from the very beginning. Each input feature, such as a pixel in an image, is connected to only one neuron in the following layer in an organized, sequential manner. This method immediately reduces the number of connections, known as parameters, which the model must manage.

The team then developed an enhanced version of the framework, named Enhanced Topographical Sparse Mapping, or ETSM. This version introduces a second brain-inspired process. After the network trains for a short period, it undergoes a dynamic pruning stage. During this phase, the model identifies and removes the least important connections throughout its layers, based on their magnitude. This process is analogous to the synaptic pruning that occurs in the brain as it learns and matures, resulting in an even leaner and more refined network.
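
As a rough illustration of these two ideas, one-to-one topographical wiring at the input followed by magnitude-based pruning, the sketch below uses plain NumPy masks. It is an interpretation of the description above rather than the authors' published implementation, and the layer sizes and pruning fraction are arbitrary.

```python
# Illustrative sketch of (1) a topographically sparse input layer, in which
# each input feature connects to exactly one unit in the next layer, and
# (2) magnitude-based pruning applied to a dense layer deeper in the network.
# Shapes and the pruning fraction are arbitrary; not the authors' code.
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 784  # e.g. MNIST pixels

# (1) Topographical sparse mapping: connectivity is an identity-like mask,
# so input i is wired only to unit i, instead of to every unit.
input_mask = np.eye(n_inputs)
input_weights = rng.normal(scale=0.01, size=(n_inputs, n_inputs)) * input_mask

# (2) The enhanced version adds pruning: after a short period of training,
# drop the smallest-magnitude weights of a dense layer (here the weakest 90%).
hidden_weights = rng.normal(scale=0.01, size=(n_inputs, 128))

def prune_by_magnitude(weights: np.ndarray, fraction: float = 0.9) -> np.ndarray:
    """Return a 0/1 mask that keeps only the largest-magnitude weights."""
    threshold = np.quantile(np.abs(weights), fraction)
    return (np.abs(weights) >= threshold).astype(weights.dtype)

hidden_mask = prune_by_magnitude(hidden_weights)
hidden_weights *= hidden_mask  # pruned connections stay at zero from here on
```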

To evaluate their approach, the scientists built and trained a type of network known as a multilayer perceptron. They tested its ability to perform image classification tasks using several standard benchmark datasets, including MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100. This setup allowed for a direct comparison of the TSM and ETSM models against both conventional dense networks and other leading techniques designed to create sparse, efficient AI.

The results showed a remarkable balance of efficiency and performance. The ETSM model was able to achieve extreme levels of sparsity, in some cases removing up to 99 percent of the connections found in a standard network. Despite this massive reduction in complexity, the sparse models performed just as well as, and sometimes better than, their dense counterparts. For the more difficult CIFAR-100 dataset, the ETSM model achieved a 14 percent improvement in accuracy over the next best sparse method while using far fewer connections.

“The brain achieves remarkable efficiency through its structure, with each neuron forming connections that are spatially well-organised,” said Mohsen Kamelian Rad, a PhD student at the University of Surrey and the study’s lead author. “When we mirror this topographical design, we can train AI systems that learn faster, use less energy and perform just as accurately. It’s a new way of thinking about neural networks, built on the same biological principles that make natural intelligence so effective.”

The efficiency gains were substantial. Because the network starts with a sparse structure and does not require complex phases of adding back connections, it trains much more quickly. The researchers’ analysis of computational costs revealed that their method consumed less than one percent of the energy and used significantly less memory than a conventional dense model. This combination of speed, low energy use, and high accuracy sets it apart from many existing methods that often trade performance for efficiency.

A key part of the investigation was to confirm the importance of the orderly, topographical wiring. The team compared their models to networks that had a similar number of sparse connections but were arranged randomly. The results demonstrated that the brain-inspired topographical structure consistently produced more stable training and higher accuracy, indicating that the specific pattern of connectivity is a vital component of its success.

The researchers acknowledge that their current framework applies the topographical mapping only to the model’s input layer. A potential direction for future work is to extend this structured design to deeper layers within the network, which could lead to even greater gains in efficiency. The team is also exploring how the approach could be applied to other AI architectures, such as the large models used for natural language processing, where the efficiency improvements could have a profound impact.

The study, “Topographical sparse mapping: A neuro-inspired sparse training framework for deep learning models,” was authored by Mohsen Kamelian Rad, Ferrante Neri, Sotiris Moschoyiannis, and Roman Bauer.

Lily James: ‘Every time I watch Alien, I go... my God, that’s my grandma!’

The star of ‘Baby Driver’, ‘The Iron Claw’ and ‘Mamma Mia! Here We Go Again’ now plays a woman fleeing constant surveillance in the thriller ‘Relay’. She talks to Adam White about losing her privacy in the public eye, the time she boasted about her lineage to impress Edgar Wright, and why she has no regrets about playing Pamela Anderson


Vulnerability to stress magnifies how a racing mind disrupts sleep

A new study provides evidence that a person’s innate vulnerability to stress-induced sleep problems can intensify how much a racing mind disrupts their sleep over time. While daily stress affects everyone’s sleep to some degree, this trait appears to make some people more susceptible to fragmented sleep. The findings were published in the Journal of Sleep Research.

Scientists have long understood that stress can be detrimental to sleep. One of the primary ways this occurs is through pre-sleep arousal, a state of heightened mental or physical activity just before bedtime. Researchers have also identified a trait known as sleep reactivity, which describes how susceptible a person’s sleep is to disruption from stress. Some individuals have high sleep reactivity, meaning their sleep is easily disturbed by stressors, while others have low reactivity and can sleep soundly even under pressure.

Despite knowing these factors are related, the precise way they interact on a daily basis was not well understood. Most previous studies relied on infrequent, retrospective reports or focused on major life events rather than common, everyday stressors. The research team behind this new study sought to get a more detailed picture. They aimed to understand how sleep reactivity might alter the connection between daily stress, pre-sleep arousal, and objectively measured sleep patterns in a natural setting.

“Sleep reactivity refers to an individual’s tendency to experience heightened sleep disturbances when faced with stress. Those with high sleep reactivity tend to show increased pre-sleep arousal during stressful periods and are at greater risk of developing both acute and chronic insomnia,” explained study authors Ju Lynn Ong and Stijn Massar, who are both research assistant professors at the National University of Singapore Yong Loo Lin School of Medicine.

“However, most prior research on stress, sleep, and sleep reactivity has relied on single, retrospective assessments, which may fail to capture the immediate and dynamic effects of daily stressors on sleep. Another limitation is that previous studies often examined either the cognitive or physiological components of pre-sleep arousal in isolation. Although these two forms of arousal are related, they may differ in their predictive value and underlying mechanisms, highlighting the importance of evaluating both concurrently.”

“To address these gaps, the current study investigated how day-to-day fluctuations in stress relate to sleep among university students over a two-week period and whether pre-sleep cognitive and physiological arousal mediate this relationship—particularly in individuals with high sleep reactivity.”

The research team began by recruiting a large group of full-time university students. They had the students complete a questionnaire called the Ford Insomnia Response to Stress Test, which is designed to measure an individual’s sleep reactivity. From this initial pool, the researchers selected two distinct groups for a more intensive two-week study: 30 students with the lowest scores, indicating low sleep reactivity, and 30 students with the highest scores, representing high sleep reactivity.

Over the following 14 days, these 60 participants were monitored using several methods. They wore an actigraphy watch on their wrist, which uses motion sensors to provide objective data on sleep patterns. This device measured their total sleep time, the amount of time it took them to fall asleep, and the time they spent awake after initially drifting off. Participants also wore an ŌURA ring, which recorded their pre-sleep heart rate as an objective indicator of physiological arousal.

Alongside these objective measures, participants completed daily surveys on their personal devices. Each evening before going to bed, they rated their perceived level of stress. Upon waking the next morning, they reported on their pre-sleep arousal from the previous night. These reports distinguished between cognitive arousal, such as having racing thoughts or worries, and somatic arousal, which includes physical symptoms like a pounding heart or muscle tension.

The first part of the analysis examined within-individual changes, that is, how a person’s sleep on a high-stress day compared with that person’s own average. The results showed that on days when participants felt more stressed than usual, they also experienced a greater degree of pre-sleep cognitive arousal. This increase in racing thoughts was, in turn, associated with getting less total sleep and taking longer to fall asleep that night. This pattern was observed in both the high and low sleep reactivity groups.
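
Analyses of this daily-diary kind typically separate each day's stress from the person's own average before fitting a multilevel model. The sketch below shows one common way to set that up in Python with statsmodels; the column names (participant, stress, total_sleep_minutes) are placeholders, not the study's actual variables or modeling choices.

```python
# Minimal sketch of a within-person ("daily diary") analysis: person-mean
# centering the daily stress ratings, then fitting a mixed-effects model with
# a random intercept per participant. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def fit_daily_model(df: pd.DataFrame):
    person_mean = df.groupby("participant")["stress"].transform("mean")
    df = df.assign(
        stress_within=df["stress"] - person_mean,   # day-to-day fluctuation
        stress_between=person_mean,                 # person's overall level
    )
    # Random intercept for each participant; the stress_within coefficient
    # captures how sleep changes on days that are more stressful than usual.
    model = smf.mixedlm(
        "total_sleep_minutes ~ stress_within + stress_between",
        data=df,
        groups=df["participant"],
    ).fit()
    return model
```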

This finding suggests that experiencing a more stressful day than usual is likely to disrupt anyone’s sleep to some extent, regardless of their underlying reactivity. It appears to be a common human response for stress to activate the mind at bedtime, making sleep more difficult. The trait of sleep reactivity did not seem to alter this immediate, day-to-day effect.

“We were surprised to find that at the daily level, all participants did in fact exhibit a link between higher perceived stress and poorer sleep the following night, regardless of their level of sleep reactivity,” Ong and Massar told PsyPost. “This pattern may reflect sleep disturbances as a natural—and potentially adaptive—response to stress.”

The researchers then turned to between-individual differences, comparing the overall patterns of people in the high-reactivity group to those in the low-reactivity group across the entire two-week period. In this analysis, a key distinction became clear. Sleep reactivity did in fact play a moderating role, amplifying the negative effects of stress and arousal.

Individuals with high sleep reactivity showed a much stronger connection between their average stress levels, their average pre-sleep cognitive arousal, and their sleep quality. For these highly reactive individuals, having higher average levels of cognitive arousal was specifically linked to spending more time awake after initially falling asleep. In other words, their predisposition to stress-related sleep disturbance made their racing thoughts more disruptive to maintaining sleep throughout the night.

The researchers also tested whether physiological arousal played a similar role in connecting stress to poor sleep. They examined both the participants’ self-reports of physical tension and their objectively measured pre-sleep heart rate. Neither of these measures of physiological arousal appeared to be a significant mediator of the relationship between stress and sleep in either group. The link between stress and sleep disruption in this study seemed to operate primarily through mental, not physical, arousal.

“On a day-to-day level, both groups exhibited heightened pre-sleep cognitive arousal and greater sleep disturbances in response to elevated daily stress,” the researchers explained. “However, when considering the study period as a whole, individuals with high sleep reactivity consistently reported higher average levels of stress and pre-sleep cognitive arousal, which in turn contributed to more severe sleep disruptions compared to low-reactive sleepers. Notably, these stress → pre-sleep arousal → sleep associations emerged only for cognitive arousal, not for somatic arousal—whether assessed through self-reports or objectively measured via pre-sleep heart rate.”

The researchers acknowledged some limitations of their work. The study sample consisted of young university students who were predominantly female and of Chinese descent, so the results may not be generalizable to other demographic groups or age ranges. Additionally, the study excluded individuals with diagnosed sleep disorders, meaning the findings might differ in a clinical population. The timing of the arousal survey, completed in the morning, also means it was a retrospective report that could have been influenced by the night’s sleep. It is also important to consider the practical size of these effects.

While statistically significant, the changes were modest: a day with stress levels 10 points higher than usual was linked to about 2.5 minutes less sleep, and the amplified effect in high-reactivity individuals amounted to about 1.2 additional minutes of wakefulness during the night for every 10-point increase in average stress.

Future research could build on these findings by exploring the same dynamics in more diverse populations. The study also highlights pre-sleep cognitive arousal as a potential target for intervention, especially for those with high sleep reactivity. Investigating whether therapies like cognitive-behavioral therapy for insomnia can reduce this mental activation could offer a path to preventing temporary, stress-induced sleep problems from developing into chronic conditions.

The study, “Sleep Reactivity Amplifies the Impact of Pre-Sleep Cognitive Arousal on Sleep Disturbances,” was authored by Noof Abdullah Saad Shaif, Julian Lim, Anthony N. Reffi, Michael W. L. Chee, Stijn A. A. Massar, and Ju Lynn Ong.

A severed brain reveals an astonishing power to reroute communication

A new study reveals the human brain’s remarkable ability to maintain communication between its two hemispheres even when the primary connection is almost entirely severed. Researchers discovered that a tiny fraction of remaining nerve fibers is sufficient to sustain near-normal levels of integrated brain function, a finding published in the Proceedings of the National Academy of Sciences. This observation challenges long-held ideas about how the brain is wired and suggests an immense potential for reorganization after injury.

The brain’s left and right hemispheres are linked by the corpus callosum, a massive bundle of about 200 million nerve fibers that acts as a superhighway for information. For decades, scientists have operated under the assumption that this structure has a map-like organization, where specific fibers connect corresponding regions in each hemisphere to perform specialized tasks. Based on this model, damage to a part of the corpus callosum should result in specific, predictable communication breakdowns between the brain halves.

To test this idea, researchers turned to a unique group of individuals known as split-brain patients. These patients have undergone a rare surgical procedure called a callosotomy, where the corpus callosum is intentionally cut to treat severe, otherwise untreatable epilepsy. This procedure provides a distinct opportunity to observe how the brain functions when its main inter-hemispheric pathway is disrupted. Because the surgery is no longer common, data from adult patients using modern neuroimaging techniques has been scarce, leaving a gap in understanding how this profound structural change affects the brain’s functional networks.

The international research team studied six adult patients who had undergone the callosotomy procedure. Four of the patients had a complete transection, meaning the entire corpus callosum was severed. Two other patients had partial transections. One had about 62 percent of the structure intact, while another, patient BT, had approximately 90 percent of his corpus callosum removed, leaving only a small segment of fibers, about one centimeter wide, at the very back of the structure.

To assess the functional consequences, the researchers first performed simple bedside behavioral tests. The four patients with complete cuts exhibited classic “disconnection syndromes,” where one hemisphere appeared unable to share information with the other. For example, they could not verbally name an object placed in their left hand without looking at it, because the sensation from the left hand is processed by the right hemisphere, while language is typically managed by the left. The two hemispheres were acting independently.

In contrast, both patients with partial cuts showed no signs of disconnection. Patient BT, despite having only a tiny bridge of fibers remaining, could perform these tasks successfully, indicating robust communication was occurring between his hemispheres.

To look directly at brain activity, the team used resting-state functional magnetic resonance imaging, or fMRI. This technique measures changes in blood flow throughout the brain, allowing scientists to identify which regions are active and working together. When two regions show synchronized activity over time, they are considered to be functionally connected. The researchers compared the brain activity of the six patients to a benchmark dataset from 100 healthy adults.
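
Functional connectivity in resting-state fMRI is, at its core, a correlation between regional activity time series. The snippet below is a generic illustration of how interhemispheric connectivity between matched left and right regions could be computed from such data; the array shapes are assumptions, and this is not the study's analysis pipeline.

```python
# Generic sketch: compute interhemispheric functional connectivity as the
# Pearson correlation between each left-hemisphere region's time series and
# its right-hemisphere counterpart. Array shapes are illustrative.
import numpy as np

def interhemispheric_fc(left_ts: np.ndarray, right_ts: np.ndarray) -> np.ndarray:
    """left_ts, right_ts: arrays of shape (n_regions, n_timepoints),
    where row i in each array is a homotopic (mirror-image) pair."""
    n_regions = left_ts.shape[0]
    fc = np.empty(n_regions)
    for i in range(n_regions):
        fc[i] = np.corrcoef(left_ts[i], right_ts[i])[0, 1]
    return fc

# Higher average values indicate stronger left-right coupling; in the study,
# patients with complete callosotomies showed markedly reduced coupling.
```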

In the four patients with a completely severed corpus callosum, the researchers saw a dramatic reduction in functional connectivity between the two hemispheres. The brain’s large-scale networks, which normally span both sides of the brain, appeared highly “lateralized,” meaning their activity was largely confined to either the left or the right hemisphere. It was as if each side of the brain was operating in its own bubble, with very little coordination between them.

The findings from the two partially separated patients were strikingly different. Their patterns of interhemispheric functional connectivity looked nearly identical to those of the healthy control group. Even in patient BT, the small remnant of posterior fibers was enough to support widespread, brain-wide functional integration. His brain networks for attention, sensory processing, and higher-order thought all showed normal levels of bilateral coordination. This result directly contradicts the classical model, which would have predicted that only the brain regions directly connected by those few remaining fibers, likely related to vision, would show preserved communication.

The researchers also analyzed the brain’s dynamic activity, looking at how moment-to-moment fluctuations are synchronized across the brain. In healthy individuals, the overall rhythm of activity in the left hemisphere is tightly coupled with the rhythm in the right hemisphere. In the patients with complete cuts, these rhythms were desynchronized, as if each hemisphere was marching to the beat of its own drum.

Yet again, the two patients with partial cuts showed a strong, healthy synchronization between their hemispheres, suggesting the small bundle of fibers was sufficient to coordinate the brain’s global dynamics. Patient BT’s brain had apparently reorganized its functional networks over the six years since his surgery to make optimal use of this minimal structural connection.

The study is limited by its small number of participants, a common challenge in research involving rare medical conditions. Because the callosotomy procedure is seldom performed today, finding adult patients for study is difficult. While the differences observed between the groups were pronounced, larger studies would be needed to fully characterize the range of outcomes and the ways in which brains reorganize over different timescales following surgery.

Future research could focus on tracking patients over many years to map the process of neural reorganization in greater detail. Such work may help uncover the principles that govern the brain’s plasticity and its ability to adapt to profound structural changes. The findings open new avenues for rehabilitation research, suggesting that therapies could aim to leverage even minimal remaining pathways to help restore function after brain injury. The results indicate that the relationship between the brain’s physical structure and its functional capacity is far more flexible and complex than previously understood.

The study, “Full interhemispheric integration sustained by a fraction of posterior callosal fibers,” was authored by Tyler Santander, Selin Bekir, Theresa Paul, Jessica M. Simonson, Valerie M. Wiemer, Henri Etel Skinner, Johanna L. Hopf, Anna Rada, Friedrich G. Woermann, Thilo Kalbhenn, Barry Giesbrecht, Christian G. Bien, Olaf Sporns, Michael S. Gazzaniga, Lukas J. Volz, and Michael B. Miller.

Public Montessori preschool yields improved reading and cognition at a lower cost

The debate over the most effective models for early childhood education is a longstanding one. While the benefits of preschool are widely accepted, researchers have observed that the academic advantages gained in many programs tend to diminish by the time children finish kindergarten, a phenomenon often called “fade-out.” Some studies have even pointed to potential negative long-term outcomes from certain public preschool programs, intensifying the search for approaches that provide lasting benefits.

This situation prompted researchers to rigorously examine the Montessori method, a well-established educational model that has been in practice for over a century. Their new large-scale study found that children offered a spot in a public Montessori preschool showed better outcomes in reading, memory, executive function, and social understanding by the end of kindergarten.

The research also revealed that this educational model costs public school districts substantially less over three years compared to traditional programs. The findings were published in the Proceedings of the National Academy of Sciences.

The Montessori method is an educational approach developed over a century ago by Maria Montessori. Its classrooms typically feature a mix of ages, such as three- to six-year-olds, learning together. The environment is structured around child-led discovery, where students choose their own activities from a curated set of specialized, hands-on materials. The teacher acts more as a guide for individual and small-group lessons rather than a lecturer to the entire class.

The Montessori model, which has been implemented in thousands of schools globally, had not previously been evaluated in a rigorous, national randomized controlled trial. This study was designed to provide high-quality evidence on its impact in a public school setting.

“There have been a few small randomized controlled trials of public Montessori outcomes, but they were limited to 1-2 schools, leaving open the question of whether the more positive results were due to something about those schools aside from the Montessori programming,” said study author Angeline Lillard, the Commonwealth Professor of Psychology at the University of Virginia.

“This national study gets around that by using 24 different schools, which each had 3-16 Montessori Primary (3-6) classrooms. In addition, the two prior randomized controlled trials that had trained Montessori teachers (making them more valid) compromised the randomized controlled trial in certain ways, including not using intention-to-treat designs that are preferred by some.”

To conduct the research, the team took advantage of the admissions lotteries at 24 oversubscribed public Montessori schools across the United States. When a school has more applicants than available seats, a random lottery gives each applicant an equal chance of admission. This process creates a natural experiment, allowing for a direct comparison between the children who were offered a spot (the treatment group) and those who were not (the control group). Nearly 600 children and their families consented to participate.

The children were tracked from the start of preschool at age three through the end of their kindergarten year. Researchers administered a range of assessments at the beginning of the study and again each spring to measure academic skills, memory, and social-emotional development. The primary analysis was a conservative type called an “intention-to-treat” analysis, which measures the effect of simply being offered a spot in a Montessori program, regardless of whether the child actually attended or for how long.

The results showed no significant differences between the two groups after the first or second year of preschool. But by the end of kindergarten, a distinct pattern of advantages had emerged for the children who had been offered a Montessori spot. This group demonstrated significantly higher scores on a standardized test of early reading skills. They also performed better on a test of executive function, which involves skills like planning, self-control, and following rules.

The Montessori group also showed stronger short-term memory, as measured by their ability to recall a sequence of numbers. Their social understanding, or “theory of mind,” was also more advanced, suggesting a greater capacity to comprehend others’ thoughts, feelings, and beliefs. The estimated effects for these outcomes were considered medium to large for this type of educational research.

The study found no significant group differences in vocabulary or a math assessment, although the results for math trended in a positive direction for the Montessori group.

In a secondary analysis, the researchers estimated the effects only for the children who complied with their lottery assignment, meaning those who won and attended Montessori compared to those who lost and did not. As expected, the positive effects on reading, executive function, memory, and social understanding were even larger in this analysis.

“For example, a child who scored at the 50th percentile in reading in a traditional school would have been at the 62nd percentile had they won the lottery to attend Montessori; had they won and attended Montessori, they would have scored at the 71st percentile,” Lillard told PsyPost.
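
The step from the intention-to-treat estimate to the larger complier estimate can be illustrated with the standard Wald-style calculation, in which the effect of the lottery offer is scaled by the difference in attendance rates between lottery winners and losers. The sketch below is a generic illustration with hypothetical column names, not the authors' analysis code.

```python
# Generic sketch of an intention-to-treat (ITT) estimate and the corresponding
# complier (Wald) estimate from a lottery-based design. The DataFrame and its
# columns (won_lottery, attended, outcome) are hypothetical.
import pandas as pd

def itt_and_complier_effects(df: pd.DataFrame) -> tuple[float, float]:
    winners = df[df["won_lottery"] == 1]
    losers = df[df["won_lottery"] == 0]

    # ITT: effect of merely being offered a spot, regardless of attendance.
    itt = winners["outcome"].mean() - losers["outcome"].mean()

    # Compliance: how much the lottery offer actually changed attendance.
    uptake = winners["attended"].mean() - losers["attended"].mean()

    # Wald estimator: scaling the ITT by uptake gives the effect for
    # compliers, which is why complier estimates exceed ITT estimates.
    complier = itt / uptake
    return itt, complier
```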

Alongside the child assessments, the researchers performed a detailed cost analysis. They followed a method known as the “ingredients approach,” which accounts for all the resources required to run a program. This included teacher salaries and training, classroom materials, and facility space for both Montessori and traditional public preschool classrooms. One-time costs, such as the specialized Montessori materials and extensive teacher training, were amortized over their expected 25-year lifespan.

The analysis produced a surprising finding. Over the three-year period from ages three to six, public Montessori programs were estimated to cost districts $13,127 less per child than traditional programs. The main source of this cost savings was the higher child-to-teacher ratio in Montessori classrooms for three- and four-year-olds. This is an intentional feature of the Montessori model, designed to encourage peer learning and independence. These savings more than offset the higher upfront costs for teacher training and materials.

“I thought Montessori would cost the same, once one amortized the cost of teacher training and materials,” Lillard said. “Instead, we calculated that (due to intentionally higher ratios at 3 and 4, which predicted higher classroom quality in Montessori) Montessori cost less.”

“Even when including a large, diverse array of schools, public Montessori had better outcomes. These findings were robust to many different approaches to the data. And, the cost analysis showed these outcomes were obtained at significantly lower cost than was spent on traditional PK3 through kindergarten programs in public schools.”

But as with all research, there are limitations. The research included only families who applied to a Montessori school lottery, so the findings might not be generalizable to the broader population. The consent rate to participate in the study was relatively low, at about 21 percent of families who were contacted. Families who won a lottery spot were also more likely to consent than those who lost, which could potentially introduce bias into the results.

“Montessori is not a trademarked term, so anyone can call anything Montessori,” Lillard noted. “We required that most teachers be trained by the two organizations with the most rigorous training — AMI or the Association Montessori Internationale, which Dr. Maria Montessori founded to carry on her work, and AMS or the American Montessori Society, which has less rigorous teacher-trainer preparation and is shorter, but is still commendable. Our results might not extend to all schools that call themselves Montessori. In addition, we had low buy-in as we recruited for this study in summer 2021 when COVID-19 was still deeply concerning. We do not know if the results apply to families that did not consent to participation.”

The findings are also limited to the end of kindergarten. Whether the observed advantages for the Montessori group persist, grow, or fade in later elementary grades is a question for future research. The study authors expressed a strong interest in following these children to assess the long-term impacts of their early educational experiences.

“My collaborators at the American Institutes for Research and the University of Pennsylvania and University of Virginia are deeply appreciative of the schools, teachers, and families who participated, and to our funders, the Institute for Educational Sciences, Arnold Ventures, and the Brady Education Foundation,” Lillard added.

The study, “A national randomized controlled trial of the impact of public Montessori preschool at the end of kindergarten,” was authored by Angeline S. Lillard, David Loeb, Juliette Berg, Maya Escueta, Karen Manship, Alison Hauser, and Emily D. Daggett.

Familial link between ADHD and crime risk is partly genetic, study suggests

A new study has found that individuals with ADHD have a higher risk of being convicted of a crime, and reveals this connection also extends to their family members. The research suggests that shared genetics are a meaningful part of the explanation for this link. Published in Biological Psychology, the findings show that the risk of a criminal conviction increases with the degree of genetic relatedness to a relative with ADHD.

The connection between ADHD and an increased likelihood of criminal activity is well-documented. Past research indicates that individuals with ADHD are two to three times more likely to be arrested or convicted of a crime. Scientists have also established that both ADHD and criminality have substantial genetic influences, with heritability estimates around 70-80% for ADHD and approximately 50% for criminal behavior. This overlap led researchers to hypothesize that shared genetic factors might partly explain the association between the two.

While some previous studies hinted at a familial connection, they were often limited to specific types of crime or a small number of relative types. The current research aimed to provide a more complete picture. The investigators sought to understand how the risk for criminal convictions co-aggregates, or clusters, within families across a wide spectrum of relationships, from identical twins to cousins. They also wanted to examine potential differences in these patterns between men and women.

“ADHD is linked to higher rates of crime, but it’s unclear why. We studied families to see whether shared genetic or environmental factors explain this connection, aiming to better understand how early support could reduce risk,” said study author Sofi Oskarsson, a researcher and senior lecturer in criminology at Örebro University.

To conduct the investigation, researchers utilized Sweden’s comprehensive national population registers. They analyzed data from a cohort of over 1.5 million individuals born in Sweden between 1987 and 2002. ADHD cases were identified through clinical diagnoses or prescriptions for ADHD medication recorded in national health registers. Information on criminal convictions for any crime, violent crime, or non-violent crime was obtained from the National Crime Register, with the analysis beginning from an individual’s 15th birthday, the age of criminal responsibility in Sweden.

The study design allowed researchers to estimate the risk of a criminal conviction for an individual based on whether a relative had ADHD. By comparing these risks across different types of relatives who share varying amounts of genetic material—identical twins (100%), fraternal twins and full siblings (average 50%), half-siblings (average 25%), and cousins (average 12.5%)—the team could infer the potential role of shared genes and environments.
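
In register studies of this kind, the basic quantity is a relative risk computed within each class of relative pair: the conviction rate among individuals whose relative has ADHD divided by the rate among those whose relative does not. The sketch below illustrates that calculation with hypothetical column names; it is a simplification, not the authors' statistical method.

```python
# Simplified sketch of familial co-aggregation: for each class of relative
# pair (identical twin, full sibling, half-sibling, cousin, ...), compare the
# conviction rate when the relative has ADHD versus when they do not.
# Column names are hypothetical; this is not the authors' analysis.
import pandas as pd

def coaggregation_relative_risks(pairs: pd.DataFrame) -> pd.Series:
    """pairs: one row per relative pair, with columns
    'relative_type', 'relative_has_adhd' (0/1), 'index_convicted' (0/1)."""
    rates = (
        pairs.groupby(["relative_type", "relative_has_adhd"])["index_convicted"]
        .mean()
        .unstack("relative_has_adhd")
    )
    # Relative risk per relative type; the genetic interpretation rests on
    # this value declining as genetic relatedness decreases.
    return rates[1] / rates[0]
```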

The results first confirmed that individuals with an ADHD diagnosis had a substantially higher risk of being convicted of a crime compared to those without ADHD. The risk was particularly elevated for violent crimes.

The analysis also revealed a significant gender difference: while men with ADHD had higher absolute numbers of convictions, women with ADHD had a greater relative increase in risk compared to women without the disorder. For violent crime, the risk was over eight times higher for women with ADHD, while it was about five times higher for men with ADHD.

“Perhaps not a surprise given what we know today about ADHD, but the stronger associations found among women were very interesting and important,” Oskarsson told PsyPost. “ADHD is not diagnosed as often in females (or is mischaracterized), so the higher relative risk in women suggests that when ADHD is present, it may reflect a more severe or concentrated set of risk factors.”

The central finding of the study was the clear pattern of familial co-aggregation. Having a relative with ADHD was associated with an increased personal risk for a criminal conviction. This risk followed a gradient based on genetic relatedness.

The highest risk was observed in individuals whose identical twin had ADHD, followed by fraternal twins and full siblings. The risk was progressively lower for half-siblings and cousins. This pattern, where the association weakens as genetic similarity decreases, points toward the influence of shared genetic factors.

“Close relatives of people with ADHD were much more likely to have criminal convictions, especially twins, supporting a genetic contribution,” Oskarsson explained. “But the link is not deterministic, most individuals with ADHD or affected relatives are not convicted, emphasizing shared risk, not inevitability.”

The study also found that the stronger relative risk for women was not limited to individuals with ADHD. A similar pattern appeared in some familial relationships, specifically among full siblings and full cousins, where the association between a relative’s ADHD and a woman’s conviction risk was stronger than for men. This suggests that the biological and environmental mechanisms connecting ADHD and crime may operate differently depending on sex.

“People with ADHD are at a higher risk of criminality, but this risk also extends to their relatives,” Oskarsson said. “This pattern suggests that some of the link between ADHD and crime stems from shared genetic and/or environmental factors. Importantly, this does not mean that ADHD causes crime, but that the two share underlying vulnerabilities. Recognizing and addressing ADHD early, especially in families, could reduce downstream risks and improve outcomes.”

As with any study, the researchers acknowledge some limitations. The study’s reliance on official medical records may primarily capture more severe cases of ADHD, and conviction data does not account for all criminal behavior. Because the data comes from Sweden, a country with universal healthcare, the findings may not be directly generalizable to countries with different social or legal systems. The authors also note that the large number of statistical comparisons means the overall consistency of the patterns is more important than any single result.

Future research could explore these associations in different cultural and national contexts to see if the patterns hold. Further investigation is also needed to identify the specific genetic and environmental pathways that contribute to the shared risk between ADHD and criminal convictions. These findings could help inform risk assessment and prevention efforts, but the authors caution that such knowledge must be applied carefully to avoid stigmatization.

“I want to know more about why ADHD and criminality are connected, which symptoms or circumstances matter most, and whether early support for individuals and families can help break that link,” Oskarsson added. “This study underscores the importance of viewing ADHD within a broader family and societal context. Early support for ADHD doesn’t just help the individual, it can have ripple effects that extend across families and communities.”

The study, “The Familial Co-Aggregation of ADHD and Criminal Convictions: A Register-Based Cohort Study,” was authored by Sofi Oskarsson, Ralf Kuja-Halkola, Anneli Andersson, Catherine Tuvblad, Isabell Brikell, Brian D’Onofrio, Zheng Chang, and Henrik Larsson.

‘I felt like it was torn away from me’: Evie Richards on overcoming a ‘turbulent’ year to seal first World Cup title

Two-time world champion Evie Richards is now an overall World Cup winner for the first time. The 28-year-old from Malvern sat down with Flo Clifford to dissect a topsy-turvy season, her unconventional route into the sport, and where next for one of mountain biking’s biggest stars


Dr Zoe Williams and nutritionist Hannah Alderson on the ‘hormone balancing’ boom

We hear the term “hormone balancing” day in, day out; Instagram is awash with hormone-balancing coaches, and we’re constantly seeing products that claim to reinstate hormonal balance. But given that hormones are, by their very nature, always in flux, what does the term “hormone balance” actually mean? TV doctor Dr Zoe Williams and nutritionist Hannah Alderson, author of ‘Everything I Know About Hormones’, discuss the balancing boom, supplements, healthy eating, the pill, menopause and more with podcast host Emilie Lavinia.

