- Sleep less…remember less: the hidden link between sleep and memory loss | Scientia News
Not getting enough sleep can increase the risk of developing Alzheimer’s
Last updated: 10/07/25, 18:27 Published: 17/04/25, 07:00

People often don’t get enough sleep for a variety of reasons, ranging from intentional choices like work or study demands (because who needs sleep when you’ve got deadlines, right?), to the growing concern with screen time (a.k.a. the “I’ll just watch one more episode” syndrome), and of course, procrastination (where your brain convinces you that 3 a.m. is a great time to suddenly get productive). But it’s not all fun and games—serious issues like insomnia, sleep apnoea, family responsibilities, or even shift work can also interfere with rest. Sleep disorders are increasingly common, with around one in three people in the UK affected, and they are particularly prevalent among the elderly. However, not getting enough sleep can increase the risk of developing Alzheimer’s disease (AD).

How do sleep disorders impact Alzheimer’s disease?

Insomnia is characterised by difficulty falling asleep or staying asleep, which can lead to prolonged fatigue and memory issues. As shown in Figure 1, people with insomnia tend to show markers similar to those seen in Alzheimer’s disease, such as increased levels of Aβ and tau proteins in the brain. This is primarily because a lack of sleep prevents the effective removal of harmful products from the brain, and this accumulation increases a person’s risk of AD. A plethora of experimental studies in humans and animals have shown that lack of sleep can lead to increased circulating levels of TNF-α and upregulation of the gene driving TNF-α secretion. This pro-inflammatory cytokine exacerbates AD pathology because neuroinflammation can lead to dysfunction and cell death, which are key markers of AD. Other pro-inflammatory cytokines, like IL-1, have also been implicated in the link between sleep deprivation and AD. Overexpression of IL-1 in the brain leads to abnormal changes in nerve cell structures, especially those relating to Aβ plaques. This highlights IL-1’s key role in plaque evolution and in the synthesis of Amyloid Precursor Protein, which promotes amyloid production and eventually results in AD pathology.

What type of sleep can impact one’s risk of Alzheimer’s disease?

Studies using more objective measures, like actigraphy (which tracks sleep-wake activity), found that sleep quality (sleep efficiency) is more important than total sleep time. For example, women with less than 70% sleep efficiency were more likely to experience cognitive impairment. Increased wakefulness during the night also moderated the relationship between amyloid deposition (a hallmark of AD) and memory decline.
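For context, sleep efficiency is conventionally defined as the proportion of time in bed actually spent asleep; the figures below are an illustrative calculation rather than data from the studies cited here:

sleep efficiency (%) = (total sleep time ÷ time in bed) × 100

For example, someone who sleeps 5.5 hours out of 8 hours spent in bed has a sleep efficiency of (5.5 ÷ 8) × 100 ≈ 69%, just under the 70% threshold mentioned above.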
Uncertainties…

However, it remains unclear whether poor sleep directly causes AD or whether the disease itself leads to sleep disturbances. Some studies suggest a bidirectional relationship. Ageing itself leads to poorer sleep quality, including reduced sleep efficiency, less slow-wave sleep (SWS), and more frequent awakenings. Sleep disorders like obstructive sleep apnoea, insomnia, and restless legs syndrome also become more common with age.

What are the next steps?

The good news is that many sleep disorders, including insomnia, are manageable, and improving sleep quality could be a simple yet powerful way to reduce Alzheimer’s risk. Additionally, early diagnosis and treatment of conditions like sleep apnoea and insomnia may help slow or even prevent neurodegenerative changes. As researchers continue to explore the intricate relationship between sleep and Alzheimer’s, one thing is clear: getting a good night’s sleep isn’t just about feeling refreshed. It is a crucial investment in long-term brain health.

Written by Blessing Amo-Konadu

Related articles: Overview of Alzheimer's / Hallmarks of Alzheimer's / CRISPR-Cas9 in AD treatment / Memory erasure / Does insomnia run in families?

REFERENCES

Lucey, B. (2020). It’s complicated: The relationship between sleep and Alzheimer’s disease in humans. Neurobiology of Disease, 144, p.105031. doi: https://doi.org/10.1016/j.nbd.2020.105031
NHS (2023). Insomnia. [online] www.nhsinform.scot. Available at: https://www.nhsinform.scot/illnesses-and-conditions/mental-health/insomnia/
Pelc, C. (2023). Not getting enough deep sleep may increase the risk of developing dementia. [online] Medicalnewstoday.com. Available at: https://www.medicalnewstoday.com/articles/not-getting-enough-deep-sleep-may-increase-dementia-risk#Clarifying-the-link-between-sleep-aging-and-dementia-risk [Accessed 22 Dec. 2024].
Sadeghmousavi, S., Eskian, M., Rahmani, F. and Rezaei, N. (2020). The effect of insomnia on development of Alzheimer’s disease. Journal of Neuroinflammation, 17(1). doi: https://doi.org/10.1186/s12974-020-01960-9
- Maveerar Naal: health, trauma, and resilience amid decades of war | Scientia News
A scientific reflection on the humanitarian, physical, and psychological cost of war
Last updated: 05/03/26, 14:49 Published: 27/11/25, 08:00

Every year on the 27th of November — and throughout the month of remembrance — Eelam Tamils worldwide observe Maveerar Naal, honouring those who lost their lives during Sri Lanka’s war (1983–2009). While traditionally centred on fallen fighters, this period also serves as a vital opportunity to reflect on the epidemiology of trauma, the collapse of public health systems, and the long-term physical and psychological consequences carried by Eelam Tamil communities after more than two decades of conflict. This article reframes Maveerar Naal not only as a commemoration, but also as a scientific reflection on the humanitarian, physical, and psychological cost of war — and the resilience of those who survived it.

A health system under siege

From the mid-1980s onward, northern and eastern Sri Lanka experienced a chronic, escalating humanitarian emergency. Repeated mass displacement, food scarcity, blocked medical supply routes, and intermittent bombardment steadily eroded the region’s healthcare infrastructure. Clinics became inaccessible due to shelling or military restrictions, and maternal and child health services deteriorated sharply. Early epidemiological observations from the 1990s documented widespread anxiety, depression, and trauma symptoms among civilians, demonstrating that mental-health consequences were emerging long before the war’s final years.

By the late 2000s, the public health crisis had intensified dramatically. As the conflict entered its final phase — from late 2008 to May 2009 — more than 2.5 million people were trapped in active conflict zones, while approximately 800,000 civilians were internally displaced. Entire districts lost functional hospitals; others were forced to convert schools, churches, and tarpaulin shelters into emergency medical centres. Human resource shortages reflected the near-total systemic collapse: in some northern districts, only 34 of 108 midwife posts and 6 of 27 doctor posts remained filled. Pregnant women delivered in makeshift bunkers, neonatal mortality spiked, and infectious diseases spread rapidly through overcrowded displacement camps. For many, survival came at the cost of long-term disability, untreated injuries, and profound psychological trauma.

Physical health consequences across populations

The physical scars of the war persist across generations. Civilians experienced blast injuries, shrapnel wounds, burns, and amputations, often without access to timely surgical care. Emergency operations were performed in unsterile environments; in some cases, anaesthesia was unavailable, forcing staff to improvise with inadequate substitutes. Conditions in displacement camps — overcrowding, poor sanitation, contaminated water — led to outbreaks of diarrhoea, hepatitis A and E, and vector-borne diseases. For combatants, chronic health burdens are well-documented.
Peer-reviewed studies, including research published in journals such as the International Journal of Social Psychiatry and the Journal of Rehabilitation Medicine, report the following long-term conditions among injured veterans:

- Back pain: 69.4%
- Knee osteoarthritis: 18.8%
- Hypertension: 22.4%
- Diabetes: 34.2%
- Phantom-limb pain among amputees: over 77%
- PTSD among amputees: ~41.7%

These outcomes reflect years of untreated injuries, limited rehabilitation access, chronic stress, and long-term nutritional deficiencies.

Psychological trauma and intergenerational consequences

The psychological impact of the war has been profound. Medical workers described witnessing mass casualties with inadequate supplies — a situation that produced significant moral injury, compassion fatigue, and long-lasting mental-health consequences. Among severely injured fighters, mental-health assessments published in trauma and rehabilitation journals report:

- PTSD: 41.7%
- Adjustment disorder: 16.4%
- Depressive disorder: 15.6%
- Somatoform/dissociative disorders: significant prevalence

Civilians exposed to high-intensity conflict show similarly alarming patterns. Studies from humanitarian organisations and academic institutions report that approximately:

- 64% of civilians exhibited long-term trauma-related effects
- 27% experienced PTSD
- 26% had anxiety disorders
- 25% had depression
- 18% experienced functional disability due to psychological distress

Notably, emerging research has identified intergenerational transmission of trauma, with children of survivors — even those born after 2009 — displaying elevated rates of anxiety, behavioural challenges, and trauma-related symptoms. This represents a critical area for continued scientific study and intervention.

Health workers on the frontline: the hidden scientific story

The war’s final months produced some of the most extreme medical working conditions documented in modern conflict settings. For ethical, political, and safety reasons, this article does not name frontline medical staff; however, their experiences are well-recorded in reports by Physicians for Human Rights (PHR), Human Rights Watch (HRW), and eyewitness testimonies.

One regional physician coordinated makeshift hospitals inside schools and religious buildings. With no supplies, he sterilised instruments over open flames, used sarongs as dressings, and suspended IV fluids from tree branches. He performed dozens of emergency surgeries daily, sometimes operating while artillery fire struck nearby. A field-hospital superintendent described conducting amputations without anaesthesia, supported only by volunteer nurses. When their facility was shelled — an incident documented by multiple international observers — dozens died instantly. Survivors were treated in trenches illuminated by mobile phone torches. Another medical coordinator reported overseeing triage for thousands of displaced civilians, many severely dehydrated or malnourished. He described having to prioritise patients based solely on survivability, an ethically devastating but necessary decision in conditions of extreme scarcity.

PHR and HRW documented at least 30 direct attacks on hospitals between December 2008 and May 2009. These incidents — some among the most thoroughly investigated attacks on medical facilities globally — illustrate the catastrophic collapse of health infrastructure and the extraordinary resilience of those who continued to provide care.

Reflection, healing, and the path ahead

Maveerar Naal is, at its core, a day of remembrance.
Yet for many Eelam Tamils, it is also a day of scientific reflection — a moment to acknowledge the measurable, long-term consequences of conflict on physical health, mental well-being, and community resilience. Healing requires investment in:

- Long-term mental-health services rooted in trauma-informed care
- Rehabilitation programmes for amputees and individuals with chronic injuries
- Public health research into intergenerational trauma
- Accessible healthcare for survivors living in diaspora communities
- Preservation of evidence and health data for historical and scientific record

By understanding the epidemiology of suffering, communities can better design strategies for recovery. By recognising the extraordinary resilience of civilians, fighters, and health workers, they honour all forms of courage. And by grounding remembrance in scientific truth, Maveerar Naal becomes not only a memorial, but a commitment to protecting health, dignity, and humanity for future generations. In remembering the past, we build the foundation for a more compassionate, prepared, and resilient future.

Written by Jeevana Thavarajah

Related articles: Impact of war on health (series) / South Asian Mental Health / Ethnic Health Inequalities

REFERENCES

Amnesty International (2009) Sri Lanka: Twenty Years of Make-Believe. Available at: https://www.amnesty.org/en/documents/asa37/005/2009/en/
BBC News (2009) Sri Lanka shells no-fire zone. Available at: http://news.bbc.co.uk/2/hi/south_asia/8046136.stm
Catani, C. et al. (2008) ‘War trauma, child abuse and PTSD in Sri Lankan children’, Journal of Child Psychology and Psychiatry. Available at: https://pubmed.ncbi.nlm.nih.gov/18673497/
Channel 4 News (2011) Sri Lanka’s Killing Fields. Available at: https://www.channel4.com/news/sri-lankas-killing-fields
Fernando, G. and Ferrari, M. (2013) ‘Short- and long-term psychological effects of war in Sri Lankan populations’, Asian Journal of Psychiatry. Available at: https://pubmed.ncbi.nlm.nih.gov/23885541/
Human Rights Watch (2009) Sri Lanka: Repeated Shelling of Hospitals. Available at: https://www.hrw.org/news/2009/05/08/sri-lanka-repeated-shelling-hospitals
International Committee of the Red Cross (ICRC) (2014) War injury rehabilitation and prosthetics – Sri Lanka. Available at: https://www.icrc.org/en/document/sri-lanka-prosthetics-rehabilitation
International Crisis Group (2010) War Crimes in Sri Lanka. Available at: https://www.crisisgroup.org/asia/south-asia/sri-lanka/war-crimes-sri-lanka
Office of the High Commissioner for Human Rights (OHCHR) (2015) OISL Report: Sri Lanka. Available at: https://www.ohchr.org/en/hr-bodies/hrc/oisl-sri-lanka
Physicians for Human Rights (PHR) (2009) PHR calls for inquiry into detention of doctors and war crimes in Sri Lanka. Available at: https://phr.org/news/phr-calls-for-inquiry-into-detention-of-doctors-and-war-crimes-in-sri-lanka/
- CEDS: a break in cell death | Scientia News
Looking at caspase-8’s inability to trigger cell death
Last updated: 12/09/25, 11:08 Published: 11/09/25, 07:00

This is article no. 11 in a series on rare diseases. Next article coming soon. Previous article: Ehlers-Danlos syndrome.

Cell death, as we know it, is a crucial phenomenon by which our bodies remove unnecessary or damaged cells to maintain internal stability, a process known as homeostasis. Cell death can occur in many ways, but the mechanisms by which cells die follow two main paths. It may occur as naturally programmed, as in apoptosis, or as a result of toxic trauma or physical damage, as in necrosis. While cell death due to trauma can often be more noticeable and dramatic, programmed cell death happens continually, not only because of cell damage but also because it is a normal part of development, and inducing it is a core function of immune system cells. In essence, cell death comes naturally, removing cells that are possibly damaged or infected to maintain the body as a whole.

But what if cell death stops? As many fiction stories will tell you, immortality is never a good thing, and this is true for our cells, too. Although excessive cell death is also destructive, cell death in its natural, controlled manner not only stops the spread of infection but also prevents the survival of cancer cells and auto-reactive immune cells, which can damage the body by forming cancerous tumours and triggering autoimmune diseases, respectively. This demonstrates that a careful balance of life and death must always be in place to maintain homeostatic conditions and allow our unimpeded survival. However, as cell death is a multi-step mechanism, it can go wrong in several ways. Furthermore, diseases causing faults in the cell death process can be challenging to diagnose. Not only can there be numerous reasons for patients to exhibit symptoms associated with the loss of cell death, but some of these reasons may also be rare disorders and, therefore, difficult for healthcare professionals to identify. One rare disease that researchers recently recognised is Caspase-8 Deficiency Syndrome (CEDS). This disease, stemming from a mutation in the gene coding for caspase-8, results in extensive issues related to immunodeficiency, all caused by caspase-8’s inability to trigger cell death.

So what is Caspase-8?

Caspase-8 is a pivotal regulator of the apoptotic pathway. Essentially, apoptosis can happen through two key pathways: the extrinsic pathway, when triggers originate outside the cell; and the intrinsic pathway, when the cell itself activates the cell death pathway. Whilst there are several key players in apoptosis, caspase-8 is a central mediator of the extrinsic apoptotic pathway. Caspase-8 can be activated in numerous ways, but activation often occurs through so-called death receptors, which are typically members of the Tumour Necrosis Factor Receptor (TNFR) family of transmembrane proteins. Upon their activation, a chain reaction occurs, involving the recruitment of caspase-8 into a complex known as the death-inducing signalling complex (DISC). This complex then cleaves further downstream caspases or the BH3-only Bcl-2-interacting protein Bid.
This cascade leads to DNA fragmentation, degradation of the cytoskeleton, formation of apoptotic bodies, expression of ligands for phagocytic cell receptors, and finally, uptake by phagocytes, thus completing the death of the cell and its cleanup (Figure 2). Caspase-8 therefore plays a crucial role in completing the death-inducing pathway. While there are other methods of cell death, the loss of caspase-8 undoubtedly leads to significant consequences.

Caspase-8 deficiency syndrome (CEDS)

Scientists first discovered CEDS in the early 2000s. By this time, there had already been extensive research into a similar disease known as Autoimmune Lymphoproliferative Syndrome (ALPS), which results from defective apoptosis leading to abnormal immune cell survival. However, at the time of ALPS discovery, there was no identified link to a loss of caspase-8. Furthermore, there was a lack of available mouse models to study, as inducing homozygous caspase-8 deficiency caused embryonic lethality in mice, significantly limiting research. Therefore, a loss of caspase-8 was assumed to have the same effect in humans. This train of thought continued until 2002, when Chun et al. conducted major studies into apoptosis-related diseases. During one of their many trials, two siblings—a 12-year-old girl and an 11-year-old boy—were found to exhibit symptoms similar to those of ALPS (lymphadenopathy, splenomegaly, and defective CD95-induced apoptosis of peripheral blood lymphocytes). However, unlike in ALPS, the siblings were also immunodeficient and suffered from recurrent sinopulmonary and herpes simplex virus (HSV) infections, as well as a poor response to immunisation.

Following the discovery of these additional symptoms in the siblings, researchers examined their other family members but were surprised to find that neither the parents nor another sibling suffered in a similar fashion. The only symptom they had was a partial defect in apoptosis mediated by CD95. It was determined that the mother, father, sibling, and several other extended family members were potentially heterozygous carriers of the mutation found in the affected siblings. Subsequently, a DNA analysis was conducted, and a mutation was found in the CASP8 gene. This mutation was a homozygous deletion, which ultimately led to a loss of function of the caspase-8 protein. This loss of function resulted in defective interleukin-2 production and diminished T-cell proliferation, explaining the immunodeficiency associated with CEDS and highlighting the important role caspase-8 plays in regulating cell death and immune responses.

Since CEDS was first identified in the 2002 study, very few cases have been reported in the medical literature. Despite this, research continues, allowing further insights into caspase-8’s pathophysiology, and in many studies new genetic variants have been identified. One such variant is a homozygous missense mutation that causes significant immune dysregulation in an affected individual, producing the immune responses and inflammatory conditions associated with the disease. Alongside research into the causes of this disease, focus has also shifted to how we might best diagnose and treat it and provide patients with the good quality of life they deserve.

Diagnosis

As with all rare diseases, one of the main issues preventing correct diagnosis of CEDS and delaying treatment is that healthcare providers are not familiar with the disease’s symptoms, let alone its genetic basis.
To make matters worse, the presentation of the disease varies depending on the age of onset, which makes it even more difficult to recognise CEDS as the common underlying cause. For instance, early-onset disease often results in symptoms such as severe infections and organomegaly, while adult-onset patients may present with neurological issues, multi-organ failure and chronic inflammatory conditions. Further adding to these diagnostic difficulties is the fact that CEDS overlaps with other conditions, such as the previously mentioned ALPS. As a result, a patient could receive multiple different diagnoses before CEDS is identified as the cause of their suffering.

For effective CEDS diagnosis, expertise in immunology, genetics and infectious diseases is required. However, this specialised knowledge is hard to come by, and as with all diseases, the healthcare provider’s familiarity with the condition contributes greatly to whether a patient is diagnosed – and this familiarity is often lacking for rare diseases. Furthermore, diagnostic methods in general are tricky for this disease, with multiple tests often being required, including an analysis of patient history alongside genetic testing through methods like whole exome sequencing, and immunological tests analysing the types and states of immune cells and abnormal levels of immunoglobulins. Each of these diagnostic methods takes time in an often-strained healthcare system, which can lead to a sense of helplessness in patients, who only suffer more the longer they do not know what is wrong.

Treatments

Unfortunately for patients, a difficult diagnosis is not the only challenge they face, as there is currently no cure for CEDS and no specific treatments. However, there are more general treatments available that could potentially alleviate symptoms and help individuals achieve some level of normality in their lives. The best possible way to approach treatment of CEDS, as with most immunodeficiency-related diseases, would be to treat the immune dysfunction and prevent recurrent infections. This could involve a multifaceted treatment plan tailored to the individual, aiming to avoid complications from immune dysfunction and improve quality of life. Potential treatment plans could include the use of antibiotic and antiviral medications for recurrent infections, and also more complex treatments such as immunoglobulin replacement therapy, for example intravenous immunoglobulin (IVIG). IVIG provides the antibodies that patients cannot adequately make themselves, which both helps avoid overuse of antibiotic and antiviral treatments and prevents infections before treatment is required.

Alongside these treatment methods, because CEDS is a relatively unknown disease, patients will also require a great deal of supportive and hands-on care. As part of this care, patients could potentially be provided with a specialised diet plan with the correct nutrition to help them combat any gastrointestinal (GI) issues associated with CEDS, as this approach has helped patients with other primary immunodeficiencies control their GI symptoms. In addition to current therapies, several innovative approaches to treating genetic diseases are in development which could be applied to CEDS. Recent advances in gene therapy research offer new hope for treating immune deficiencies resulting from genetic defects, which means these therapies could potentially benefit CEDS patients.
One promising method for gene therapy utilises CRISPR-Cas9 to correct genetic mutations such as those in CASP8 that lead to CEDS. Another approach uses viral vectors to deliver functional genes into patients’ cells, which could potentially deliver a functional CASP8 gene. Additionally, another very promising therapy, previously used for ALPS patients, involves genetically modifying stem cells to correct a faulty gene (such as the faulty CASP8 gene) before re-infusing them into the patient to produce healthy immune cells. These treatments could revolutionise the management of rare genetic diseases like CEDS.

The future for CEDS as a rare disease

Rare diseases like CEDS are often chronic and, in many cases, life-threatening. Due to the scarcity of information on these conditions, few if any treatments exist. Furthermore, because of their rarity, patients with rare diseases are not only small in number but also dispersed worldwide, leading to a feeling of isolation, as they rarely meet someone who shares their experiences. However, as scientific research progresses, treatments and therapies become more effective and accessible, and with 72% of rare diseases, including CEDS, having a genetic basis, gene therapies appear incredibly promising. Yet there is still a long way to go to fully realise their potential, and even more that can be done to help and support those who continue to suffer alone with rare diseases.

Written by Faye Boswell

REFERENCES

Telford WG. Multiparametric analysis of apoptosis by flow cytometry. Methods Mol Biol. 2018;1678:167–202. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC8063493/
Smith C. Monitoring apoptosis by flow cytometry. Biocompare. 2017 Jan 17. Available from: https://www.biocompare.com/Editorial-Articles/332620-Monitoring-Apoptosiby-Flow-Cytometry/
Tummers B, Green DR. Caspase-8; regulating life and death. Immunol Rev. 2017 May;277(1):76–89. doi: 10.1111/imr.12541. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC5417704/
Leeies M, Flynn E, Turgeon AF, Paunovic B, Loewen H, Rabbani R, Abou-Setta AM, Ferguson ND, Zarychanski R. High-flow oxygen via nasal cannulae in patients with acute hypoxemic respiratory failure: a systematic review and meta-analysis. Syst Rev. 2017 Oct 18;6(1):202. doi: 10.1186/s13643-017-0607-1. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4439260/
Goyal A, Moitra D, Goldstein DB, Savage H, Lisco A, Rosenzweig SD, et al. Caspase-8 deficiency presenting as a novel immune dysregulation syndrome: case report and literature review. Allergy Asthma Clin Immunol. 2023;19(1):57. doi:10.1186/s13223-023-00778-3. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10084589/
Chun HJ, Zheng L, Ahmad M, Wang J, Speirs CK, Siegel RM, et al. Pleiotropic defects in lymphocyte activation caused by caspase-8 mutations lead to human immunodeficiency. Nature. 2002 Sep 26;419(6905):395–9. doi:10.1038/nature01063. Available from: https://pubmed.ncbi.nlm.nih.gov/12353035/
Khan S, Saha S, Saha S, et al. Early and frequent exposure to antibiotics in early childhood and risk of overweight: a systematic review and dose-response meta-analysis. Obes Rev. 2021;22(3):e13113. doi:10.1111/obr.13113. Available from: https://www.gastrojournal.org/article/S0016-5085(18)35036-4/fulltext
Casanova JL, Abel L. Caspase-8 deficiency syndrome. Front Immunol. 2019;10:104. doi:10.3389/fimmu.2019.00104. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7750663/
Castiello MC, Villa A. Stem cell editing repairs severe immunodeficiency. The Scientist. 2024 Mar 6. Available from: https://www.the-scientist.com/stem-cell-editing-repairs-severe-immunodeficiency-71733
Ha TC, Morgan M, Schambach A. Base editing: a novel cure for severe combined immunodeficiency. Signal Transduct Target Ther. 2023;8(1):354. doi:10.1038/s41392-023-01586-2. Available from: https://www.nature.com/articles/s41392-023-01586-2
- Do other animals get periods? | Scientia News
Knowing which species menstruate lets us pick suitable animal models
Last updated: 16/06/25, 16:25 Published: 26/06/25, 07:00

Periods, formally called menstruation, happen to certain female mammals every menstrual cycle when an egg cell is not fertilised. Levels of the progesterone hormone decrease, causing the lining of the uterus to self-destruct and shed. This lining is called the endometrium and is flushed out of the body with blood during menstruation. Some primates, bats, the spiny mouse, and elephant shrews get periods (Figure 1). Since these groups are distantly related, menstruation likely evolved multiple times independently. Knowing which species menstruate lets us pick animal models which best reflect the human female reproductive system.

Why do we get periods?

Despite being painful and inconvenient, menstruation must have some benefit; otherwise, natural selection would not have favoured it on multiple separate occasions. Hypotheses put forward to explain menstruation include clearing the uterus of pathogens and saving energy compared to maintaining an endometrium all the time. A 2012 paper argues that neither of these hypotheses is true and that menstruation is an unfortunate byproduct of the way pregnancy occurs in certain animals. In non-menstruating animals, an embryo induces morphological and biological changes in the uterus, so those changes do not happen if the animal is not pregnant. The uterus of a menstruating animal undergoes regular changes even without an embryo, and one of those changes is shedding the endometrium. However, there is no consensus on the benefits of menstruation.

Non-human primates

Old World monkeys, apes, and humans menstruate conspicuously. This could be because their endometria have spiral arteries, which dilate and weaken in response to hormones. Eventually, the weakened arteries break and release blood, which carries dead and detached endometrial tissue out of the body. While chimpanzee menstruation is visible to the naked eye, menstrual blood in orangutans and gorillas is detected with a chemical urine strip. Gorillas bleed for 3 days, while orangutans bleed for 1–4 days. Humans have the most obvious, and possibly the most prolonged, menstruation out of the Old World primates. (Aren’t we unlucky?)

On the other hand, in the very few New World monkey species which menstruate, a microscope is needed to detect it. Pedro Mayor and colleagues sampled the endometria of various New World monkeys and viewed those samples under a microscope. They found that monkeys from the Aotus nancymaae and Sapajus macrocephalus species had weakened endometria with dilated blood vessels and blood clots (Figure 2). Combined with other context clues from those endometrium samples, they concluded that those monkeys must be menstruating.

Bats

Microscopy also identified menstruation in some bat species. In a 2011 study, uterus sections from Carollia perspicillata bats showed the endometrium getting thinner over a few days with associated bleeding. Some sections had endometrial debris in the lumen of the uterus – but unlike in Old World primates and humans, this debris was reabsorbed by the body rather than released. Menstruating Molossus ater bats had blood and endometrial cells in their cervix under a microscope, while one individual was visibly bleeding in its vagina.
In contrast, a colony of female Rousettus leschenaulti bats all had visible vaginal bleeding on the same day. On that day, two-thirds of their endometria were shed, and they had low progesterone levels – meaning those bats were menstruating.

Bat menstruation differs from primate menstruation in at least two ways. Firstly, menstruation happens simultaneously with ovary development in Carollia perspicillata but before ovary development in primates. Secondly, some bat species only menstruate after an interrupted mating attempt – what scientists call coitus interruptus and the public would call “pulling out”. Perhaps menstruation gives these bats a second chance at successful mating in that breeding season.

Conclusion

We rarely see other animals on their period because, even if a species does menstruate, it does not bleed as much as humans do. Evidence of menstruation in New World monkeys and bats usually came from microscopy, where the endometrium was seen to detach and blood was seen in the uterine lumen. These monkeys and bats could be used as rudimentary animal models to study what happens in humans during a period.

Written by Simran Patel

Related article: Monkey see, monkey clone

REFERENCES

Catalini L, Fedder J. Characteristics of the endometrium in menstruating species: lessons learned from the animal kingdom. Biology of Reproduction [Internet]. 2020 May 26 [cited 2025 Jan 8];102(6):1160–9. Available from: https://doi.org/10.1093/biolre/ioaa029
Mayor P, Pereira W, Nacher V, Navarro M, Monteiro FOB, El Bizri HR, et al. Menstrual cycle in four New World primates: Poeppig’s woolly monkey (Lagothrix poeppigii), red uakari (Cacajao calvus), large-headed capuchin (Sapajus macrocephalus) and nocturnal monkey (Aotus nancymaae). Theriogenology [Internet]. 2019 Jan 1 [cited 2025 Jan 7];123:11–21. Available from: https://www.sciencedirect.com/science/article/pii/S0093691X18302796
Rasweiler IV JJ, Badwaik NK, Mechineni KV. Ovulation, Fertilization, and Early Embryonic Development in the Menstruating Fruit Bat, Carollia perspicillata. The Anatomical Record [Internet]. 2011 [cited 2025 Jan 8];294(3):506–19. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/ar.21304
Graham C. Reproductive Biology of the Great Apes: Comparative and Biomedical Perspectives. Elsevier; 2012. 456 p.
Rasweiler IV JJ. Spontaneous decidual reactions and menstruation in the black mastiff bat, Molossus ater. American Journal of Anatomy [Internet]. 1991 [cited 2025 Jan 8];191(1):1–22. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/aja.1001910102
Martin RD. The evolution of human reproduction: A primatological perspective. American Journal of Physical Anthropology [Internet]. 2007 [cited 2025 Jan 8];134(S45):59–84. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/ajpa.20734
Emera D, Romero R, Wagner G. The evolution of menstruation: A new model for genetic assimilation. BioEssays [Internet]. 2012 [cited 2025 Jan 8];34(1):26–35. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1002/bies.201100099
Zhang X, Zhu C, Lin H, Yang Q, Ou Q, Li Y, et al. Wild Fulvous Fruit Bats (Rousettus leschenaulti) Exhibit Human-Like Menstrual Cycle. Biology of Reproduction [Internet]. 2007 Aug 1 [cited 2025 Jan 8];77(2):358–64. Available from: https://doi.org/10.1095/biolreprod.106.058958
- Regulation and policy of stem cell research | Scientia News
The 14-day rule and stem cell-based embryo models
Last updated: 20/10/25, 14:40 Published: 23/10/25, 07:00

This is the last article (article no. 3) in a three-part series on stem cells. Previous article: The role of mesenchymal stem cells in regenerative medicine.

Welcome to the final article in this series of three articles about stem cells. Article 1 was an overview of stem cells, and Article 2 focused on mesenchymal stem cells. In Article 3, I will look at the regulation and policy of stem cell research, which is important given the rapidly changing landscape of stem cell research.

Introduction

If used effectively, stem cells can help treat diseases, further our understanding of human development, and more. For example, a recent paper published in September 2025 explains how scientists created embryos from human skin DNA, in an experimental process they named “mitomeiosis”. Here, the scientists attempted to force the egg cell to divide to remove half of its chromosomes so it could be fertilised like a normal egg cell. While mitomeiosis was unsuccessful in creating viable egg cells, new advancements like this raise ethical questions about the use of stem cells, especially those derived from embryos. As a result, policies and regulations must be created and followed to ensure stem cells are used ethically and appropriately. Two major topics in this policy landscape are the 14-day rule for using human embryos and the creation of Stem Cell-Based Embryo Model (SCBEM) frameworks.

The 14-day rule

One of the most widely known restrictions in the field of stem cells is the 14-day rule. It prohibits scientists from culturing human embryos in vitro (in the laboratory) beyond 14 days or the appearance of the primitive streak. The primitive streak is a developmental marker signalling the point at which an embryo is biologically individualised. The appearance of this streak also marks the beginning of gastrulation, which is when embryonic cells start differentiating into the three primary germ layers: endoderm, mesoderm, and ectoderm. A timeline of human embryo development from day 0 to day 14 is shown in Figure 1 to help visualise the different stages.

In the UK, the 14-day rule is a law under the Human Fertilisation and Embryology (HFE) Act 1990 (as amended in 2008). The human embryos used in research are either donated with consent because they are no longer needed or are unsuitable for fertility treatments, or are created explicitly for research from donated sperm and eggs. However, embryo culture techniques have advanced to the point where embryos must now be destroyed at the 14-day deadline purely because of the law. For example, in 2016, researchers developed new in vitro culture systems that allowed human embryos to be maintained in the lab up to the 12th and 13th day of development, which had previously not been possible. Unfortunately, the experiments had to be stopped because they were approaching the 14-day legal limit. Therefore, scientists have questioned whether the 14-day rule is still fit for purpose, and if not, how it could be amended in a way that still ensures ethical and appropriate use of these cells. A specific area of development that scientists do not have a lot of information on is the “black box” period, which includes the moment of gastrulation, happening around day 14–15.
Further knowledge of gastrulation could be used to improve the success rate of in vitro fertilisation (IVF), by helping scientists understand possible causes of early miscarriage and implantation failure and work to mitigate them. Because of this debate, the Nuffield Council on Bioethics has launched a project to better understand the arguments for and against extending the 14-day limit on human embryo research. The Council aims to use this project to provide decision-makers, such as policymakers, with the evidence they need to decide whether to extend the time limit.

Regulating Stem Cell-Based Embryo Models (SCBEMs)

There is also the development of SCBEMs to consider, as seen in Figure 2. SCBEMs are also called embryoids or embryo models. They are complex, organised three-dimensional structures derived from pluripotent stem cells, which are cells that can differentiate into all cells in the human body. SCBEMs replicate certain features and processes of embryonic development, meaning they can provide new insights into stages of early human development that have normally been inaccessible to scientists. However, SCBEMs are not defined as embryos under existing laws, like the HFE Act 1990, meaning there is a policy and regulation gap covering these structures.

To fill this gap, researchers recently created the first-ever UK guidelines for generating and using SCBEMs in research. The new SCBEM Code of Practice was published in July 2024 and sets out clear guidance and standards, increasing the transparency of research that will be conducted using SCBEMs. The Code requires that research have well-justified scientific objectives and adhere to an approved culture period, the minimum duration needed to achieve the scientific objective. In addition, the Code prohibits the transfer of human SCBEMs into a human or animal womb. Furthermore, adherence to the Code requires that a dedicated SCBEM Oversight Committee be created to review and approve proposed work. An SCBEM Register is also needed to record information about successful applications. Both of these increase the transparency and openness of research using SCBEMs.

Future of regulation and policy of stem cell research

Given the rapid pace of development in stem cell research, policies and regulations must be created and followed to ensure ethical and appropriate use of these cells. The review by the Nuffield Council on Bioethics regarding the 14-day rule will be important in determining whether the rule should be extended. An extension could allow scientists to study developmental stages such as gastrulation, currently part of the “black box” period of development occurring after 14 days. The creation of the UK's first-ever SCBEM Code of Practice in July 2024 has introduced guidelines to fill the existing policy gap, requiring research using these models to have well-justified scientific objectives, follow approved culture periods, and be reviewed by an Oversight Committee to ensure transparency and ethical use. However, there remains a need for stronger regulation, as opposed to guidelines, on the use of SCBEMs; this is an important example of where policy needs to continue to develop.

Written by Naoshin Haque

Related articles: Animal testing ethics / How colonialism, geopolitics and health are interwoven
- Unveiling the cancer magnet: vertebral stem cells and spinal tumour metastasis | Scientia News
Unlocking the mystery of spinal disorders and paving the way for targeted therapies
Last updated: 29/05/25, 10:46 Published: 24/04/25, 07:00

Introduction

Researchers at Weill Cornell Medicine have discovered that the vertebral bones in the spine contain a unique type of stem cell that secretes a protein promoting tumour metastasis. This protein, called MFGE8, plays a significant role in attracting tumours to the spine, making it more susceptible to metastasis than other bones in the body.

A new line of research on spinal disorders

This groundbreaking study, published in the journal Nature, sheds light on the mechanisms behind the preference of solid tumours to spread to the spine. The findings open up a new line of research on spinal disorders, potentially leading to a better understanding and treatment of bone diseases involving the spine.

Identifying vertebral stem cells

The researchers began their study by isolating skeletal stem cells, which are responsible for bone and cartilage formation, from various bones in lab mice. Through gene activity analysis, they identified a distinct set of markers for vertebral stem cells. Further experiments in mice and lab-dish cell culture systems confirmed the functional roles of these stem cells in forming spinal bone.

Unravelling the mystery of spinal tropism

Previous theories attributed the spine’s susceptibility to metastasis to patterns of blood flow. However, the study’s findings challenged this long-standing belief. Animal models reproduced the phenomenon of spinal tropism, but the researchers discovered that blood flow was not the sole explanation. Instead, they found evidence pointing towards vertebral stem cells as the possible culprits.

The role of MFGE8

The researchers discovered that spinal tropism is largely a result of the protein MFGE8, which vertebral stem cells secrete in greater quantities than other bone stem cells. Removing vertebral stem cells eliminated the difference in metastasis rates between spine bones and other long bones.

Implications for cancer patients

These findings have significant implications for cancer patients, particularly those at risk of spinal metastasis. The researchers are now exploring methods to block the activity of MFGE8, aiming to reduce the risk of tumour spread to the spine. By understanding the distinctive properties of vertebral stem cells, researchers hope to develop targeted treatments for spinal disorders.

A new frontier in orthopaedics

According to study senior author Matthew Greenblatt, the identification of these unique stem cells opens up a new subdiscipline in orthopaedics called spinal orthopaedics. Many conditions in this clinical category may be attributed to the properties of vertebral stem cells, and further research in spinal orthopaedics is needed to understand how these distinct properties contribute to spinal disorders. The discovery of MFGE8, a protein secreted in higher amounts by vertebral stem cells, has shed light on the mechanism behind the preferential spread of tumours to the spine. By investigating methods to block MFGE8, researchers hope to reduce the risk of spinal metastasis in cancer patients.
Additionally, the study findings highlight the importance of understanding the role of vertebral stem cells in bone diseases that primarily affect the spine. This new line of research may provide insights into the development of novel treatments for spinal disorders.

Conclusion

In conclusion, the study by researchers at Weill Cornell Medicine has shown that the vertebral bones, which make up the spine, contain a particular type of stem cell that secretes a protein known as MFGE8. This protein plays a significant role in promoting tumour metastases, explaining why solid tumours often spread to the spine. The findings have opened up new avenues of research in understanding spinal disorders and may lead to the development of strategies for reducing the risk of spinal metastasis in cancer patients. Overall, this study highlights the importance of vertebral stem cells in contributing to spinal disorders and emphasises the need for further investigation in this field.

Written by Sara Maria Majernikova

Related articles: Cancer metastasis / Brain metastasis / Stem cells

REFERENCE

Sun, J., Hu, L., Bok, S. et al. A vertebral skeletal stem cell lineage driving metastasis. Nature 621, 602–609 (2023). https://doi.org/10.1038/s41586-023-06519-1
- Meet the microbes that feed phosphorus to plants | Scientia News
About phosphate-solubilising micro-organisms and their role in the phosphorus cycle
Last updated: 15/01/26, 19:00 Published: 27/11/25, 08:00

Plants need phosphorus to make biological molecules like DNA, ATP, and the phospholipid bilayers that form cell membranes. Most phosphorus on Earth is found in its most oxidised form, phosphate (PO₄³⁻). Plant roots can only absorb soluble phosphate ions, but 80% of the phosphate in soil is insoluble and therefore unavailable for plant growth. Enter phosphate-solubilising micro-organisms.

What are phosphate-solubilising micro-organisms?

Phosphate solubilisation is the process by which micro-organisms convert insoluble phosphorus sources, like rocks or the biomass of dead organisms, into bioavailable phosphate ions (Figure 1). Examples of phosphate-solubilising bacteria come from the genera Bacillus, Pseudomonas, Rhizobium, Escherichia, Streptomyces, and Micromonospora, as well as some cyanobacteria. Phosphate-solubilising fungi include Aspergillus, Penicillium, Mucor, Rhizopus, Rhizophagus, and Glomus. The latter two fungal genera are arbuscular mycorrhizal (AM) fungi – more on them later. The chemistry underpinning phosphate solubilisation is complex but can broadly be split into inorganic and organic processes (Figure 1). Some of these inorganic and organic processes are described in the rest of this article.

Solubilising inorganic phosphate

Inorganic insoluble phosphate is solubilised by microbial acids. When phosphate-containing rocks like apatite are broken down by weathering, the resulting smaller rock particles enter the soil. Micro-organisms secrete organic acids – usually gluconic acid but occasionally lactic, citric, oxalic, or other acids – to solubilise these rock particles. Acids work on inorganic phosphate in two ways. Firstly, they dissolve weathered rock pieces due to their low pH. Secondly, negatively charged acid anions (lactate, citrate, etc.) displace the phosphate captured by aluminium, iron, magnesium, and calcium minerals in the rock. Organic acids are just some of the chemicals secreted by microbes to solubilise inorganic phosphate.
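As a simplified, illustrative reaction (not drawn from this article’s references), the protons released by an organic acid such as gluconic acid can dissolve a sparingly soluble calcium phosphate:

Ca₃(PO₄)₂ + 4H⁺ → 3Ca²⁺ + 2H₂PO₄⁻

The acid anion left behind (gluconate, citrate, and so on) can then chelate the released calcium, helping to keep the phosphate in its soluble, plant-available form.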
Solubilising organic phosphorus

On the other hand, microbial enzymes solubilise organic phosphorus during the decomposition of organic matter. The two types of phosphate-solubilising enzymes are phosphatases, which solubilise 90% of organic phosphorus, and phytases, which solubilise the remaining 10%. Both types of enzyme break the ester bonds linking a PO₄³⁻ group to the rest of a biological molecule. By expressing genes encoding phytases and phosphatases, soil micro-organisms make phosphorus available for plants.

Arbuscular mycorrhizae (AM)

AM fungi provide plants with phosphorus in a symbiotic relationship. These fungi consist of hyphae, which are long, thin strands of cells that extend a plant’s root network and access phosphorus where roots cannot (Figure 2). AM fungi have a three-pronged approach to improving a plant’s phosphorus uptake: firstly, they absorb phosphate from the soil and give it to the plant in exchange for carbon. Secondly, they solubilise phosphate by secreting acids and phosphatases. Finally, AM fungi recruit phosphate-solubilising bacteria to the root system by feeding them sugars and amino acids.

Conclusion

Phosphate-solubilising bacteria and fungi provide plants with phosphorus, an essential element for making nucleic acids and ATP. Most phosphate is inaccessible to plants, locked up in rocks and biomass. By secreting organic acids and enzymes, soil micro-organisms convert this inaccessible phosphate into a form that plant roots can absorb and incorporate into their own biomass. When that plant dies, the organic phosphorus is solubilised again for another plant to use, so phosphorus never runs out. Therefore, phosphate-solubilising microbes are a small part of the invisible world that keeps our planet green.

Written by Simran Patel

Related article: Human activity and the phosphorus cycle

REFERENCES

Silva LI da, Pereira MC, Carvalho AMX de, et al. Phosphorus-Solubilizing Microorganisms: A Key to Sustainable Agriculture. Agriculture 2023; 13: 462.
Pang F, Li Q, Solanki MK, et al. Soil Phosphorus Transformation and Plant Uptake Driven by Phosphate-solubilizing Microorganisms. Front Microbiol; 15. Epub ahead of print 27 March 2024. DOI: 10.3389/fmicb.2024.1383813.
Schipanski ME, Bennett EM. Chapter 9 - The Phosphorus Cycle. In: Weathers KC, Strayer DL, Likens GE (eds) Fundamentals of Ecosystem Science (Second Edition). Academic Press, pp. 189–213.
Tian J, Ge F, Zhang D, et al. Roles of Phosphate Solubilizing Microorganisms from Managing Soil Phosphorus Deficiency to Mediating Biogeochemical P Cycle. Biology 2021; 10: 158.
- The celestial blueprint of time: Stonehenge, United Kingdom | Scientia News
The utilisation of Stonehenge as an astronomical calculator
Last updated: 08/10/25, 16:22 Published: 09/10/25, 07:00

This is Article 3 in a series about astro-archaeology. Next article coming soon. Previous article: The astronomical symbolism of the Giza Pyramids.

Stonehenge, located in south-west England, is one of the UK’s most notable man-made structures, built during the Neolithic period around 3100 BC. Not only is this famous UNESCO World Heritage Site a breakthrough in engineering, but its sandstone architecture also holds an enigmatic connection between the land and the sky. Its location and stone arrangement mirror a blueprint that can be analysed to predict the timings of astronomical phenomena.

The use of Stonehenge as an astronomical calculator was famously proposed by astronomer Gerald Hawkins in 1965. Using computer software, Hawkins found that the location of Stonehenge aligned with several solar and lunar positions. He theorised that Stonehenge was built to predict astronomical events, such as eclipses, and to mark the summer and winter solstices. From the shape and positions of its 19 ‘horseshoe’ stones, he suggested the monument could be used to predict lunar eclipses.
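To give a sense of how such alignments can be checked, here is an illustrative calculation (not part of Hawkins’ original analysis). Ignoring atmospheric refraction and the height of the local horizon, the azimuth A of sunrise satisfies cos A = sin δ / cos φ, where δ is the Sun’s declination and φ is the latitude of the site. At the summer solstice (δ ≈ +23.4°) and at Stonehenge’s latitude (φ ≈ 51.2° N), cos A ≈ 0.397 / 0.627 ≈ 0.63, giving A ≈ 51° east of north, close to the north-easterly bearing of the monument’s solstice axis.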
A booklet titled Stonehenge: Sun, Moon, Wandering Stars, written by M.W. Postins, further detailed the significance of Stonehenge in archaeoastronomy. Postins suggested two scale models, the ‘Temple model’ and the ‘Enclosure model’, which detailed the significance of each stone and its relation to different events. For example, the booklet notes that the Altar Stone, a large sandstone block located in the centre of Stonehenge, was placed across the solstice axis and represents the ‘Summer solstice sunrise’. Additionally, Postins hypothesised that the five trilithons, the paired upright stones capped by lintels at the heart of the monument, represented the planets that can be viewed with the naked eye. These include the two lowest trilithons, on the eastern and northern sides of the structure, representing Mercury and Venus.

New research, currently underway by the universities of Oxford, Leicester and Bournemouth in collaboration with the Royal Astronomical Society, links the Stonehenge monument to a rare lunar phenomenon called the ‘major lunar standstill’. Researchers note that the major lunar standstill may have influenced the design of the monument right from its earliest construction. Four of the stones at Stonehenge align with two of the Moon’s extreme positions, indicating moonrise and moonset. This would have allowed people to use the moonlight for longer periods of activity, such as night-time hunting, and to follow the cycle of the lunar phases as a way of keeping time for farming and celebratory purposes. This has led to speculation that the positioning and construction of Stonehenge were intentional in this respect.

Stonehenge, which shaped life in the past and continues to be of astronomical interest today, is a remarkable example of how built structures have been used to analyse astronomical events. It truly is a celestial blueprint for the relationship between the Earth and the cosmos.

Written by Shiksha Teeluck

Related article: Astro-geography of Lonar Lake

REFERENCES

English Heritage. (2024). Stonehenge: Major Lunar Standstill. https://www.english-heritage.org.uk/visit/places/stonehenge/things-to-do/major-lunar-standstill/
OSR. (2009). Stonehenge: An Astronomical Calculator. https://osr.org/blog/astronomy/stonehenge-an-astronomical-calculator/?srsltid=AfmBOopNQnJ-XUZSyLY_Aqu3L2nOJgSoAceRzQJIVZbsIsFhW6s3U_NT
Tiverton & Mid Devon Astronomy Society. (n.d.). Astro-Archaeology at Stonehenge. http://www.tivas.org.uk/stonehenge/stone_ast.html
- Why brain injuries affect children and adults differently | Scientia News
The main difference between children and adults lies in what needs to be rebuilt Why brain injuries affect children and adults differently Last updated: 12/11/25, 12:09 Published: 13/11/25, 08:00 The main difference between children and adults lies in what needs to be rebuilt When we think about a brain injury, it is easy to assume that the same thing happens in everyone: a bump to the head, swelling, and hopefully a recovery. In reality, things aren't quite that simple. A child's brain is not a smaller version of an adult's; it is still developing, which makes it both incredibly adaptable and, at the same time, especially vulnerable. Smaller bodies, bigger risks Although the brain's basic reaction to injury is similar in children and adults, injuries in younger people tend to cause more widespread and severe damage. This is largely due to differences in anatomical development. Children's heads are proportionally larger relative to their bodies, and their neck muscles are much weaker than those of adults. This means that when a child falls or is knocked, their head can move suddenly and forcefully, placing extra strain on the brain. On top of that, children's brains have a higher water content and are softer in texture, which makes them more vulnerable to rotational forces and acceleration-deceleration injuries. These types of movement can lead to diffuse axonal injury, where nerve fibres are torn across large areas, and to cerebral swelling, both of which are less common in adults experiencing similar trauma. A clear example of this vulnerability is seen in abusive head trauma. When an infant is shaken, their softer skull and brain structure can lead to a combination of skull fractures, internal bleeding, and swelling. Sadly, these injuries are often linked to very poor outcomes. The double-edged sword of brain plasticity One of the most remarkable things about the young brain is its plasticity: its ability to reorganise itself and form new connections after injury. This flexibility often means that children recover some functions, such as movement or daily activities, more quickly than adults do in the early months after a brain injury. However, this adaptability has limits. During childhood, the brain is constantly developing new skills and abilities, and if an injury occurs during one of these critical periods, it can interrupt processes essential for normal development. This means that difficulties might not appear straight away: a child could seem to recover well at first but then struggle later, when their brain is expected to handle more complex tasks such as problem-solving or emotional regulation. Over time, recovery often plateaus, and children may continue to face long-term challenges with learning, behaviour, and social interaction. Research also shows that injury severity is a major factor in long-term outcomes. Children who suffer severe traumatic brain injuries are more likely to experience lower academic performance and, later in life, face higher rates of unemployment or lower-paid work compared with their peers. Behaviour, learning and life after injury Brain injuries in childhood can also affect behaviour and mental health. Conditions such as ADHD are especially common following injury, affecting between 20% and 50% of children. These difficulties can make returning to school and social life far more challenging.
Children from lower socioeconomic backgrounds often experience extra barriers, including limited access to rehabilitation and educational support. This can increase the risk of social isolation and mental health difficulties. Children are also more likely than adults to develop secondary brain conditions, such as epilepsy, after an injury, which adds further complexity to their recovery. Why recovery is not the same The main difference between children and adults lies in what needs to be rebuilt. Adults are generally trying to re-learn skills they already had, while children are still learning those skills for the first time. That makes recovery a much more delicate and unpredictable process. Moreover, most rehabilitation is concentrated in the first few months after the injury, but children's challenges often become clearer years later, when their brains, and the demands placed on them, have developed further. In summary The developing brain is both fragile and flexible. While its biological features make it more prone to injury, its capacity for plasticity allows for impressive short-term recovery. Yet the same developmental processes that support growth also make it more vulnerable to long-term disruption. Injuries sustained during childhood can alter the course of brain development, leading to lasting effects on thinking, learning, and behaviour. These consequences can shape a person's future long after the initial recovery period has ended. Understanding these differences is crucial, not just for doctors, but also for teachers, parents, and anyone supporting a young person recovering from a brain injury. Written by Alice Greenan Related articles: Synaptic plasticity / Traumatic Brain Injury (TBI) / Childhood intelligence REFERENCES Anderson, V. (2005). Functional Plasticity or Vulnerability After Early Brain Injury? Pediatrics, 116(6), 1374–1382. https://doi.org/10.1542/peds.2004-1728 Anderson, V., Brown, S., Newitt, H., & Hoile, H. (2011). Long-term outcome from childhood traumatic brain injury: Intellectual ability, personality, and quality of life. Neuropsychology, 25(2), 176–184. https://doi.org/10.1037/a0021217 Anderson, V., & Yeates, K. O. (2010). Pediatric Traumatic Brain Injury. Cambridge University Press. https://doi.org/10.1017/cbo9780511676383 Araki, T., Yokota, H., & Morita, A. (2017). Pediatric Traumatic Brain Injury: Characteristic Features, Diagnosis, and Management. Neurologia Medico-Chirurgica, 57(2), 82–93. https://doi.org/10.2176/nmc.ra.2016-0191 Blackwell, L. S., & Grell, R. M. (2023). Pediatric Traumatic Brain Injury: Impact on the Developing Brain. Pediatric Neurology. https://doi.org/10.1016/j.pediatrneurol.2023.06.019 Figaji, A. A. (2017). Anatomical and Physiological Differences between Children and Adults Relevant to Traumatic Brain Injury and the Implications for Clinical Assessment and Care. Frontiers in Neurology, 8(685). https://doi.org/10.3389/fneur.2017.00685 Manfield, J., Oakley, K., Macey, J.-A., & Waugh, M.-C. (2021). Understanding the Five-Year Outcomes of Abusive Head Trauma in Children: A Retrospective Cohort Study. Developmental Neurorehabilitation, 24(6), 1–7. https://doi.org/10.1080/17518423.2020.1869340 Narad, M. E., Kaizar, E. E., Zhang, N., Taylor, H. G., Yeates, K. O., Kurowski, B. G., & Wade, S. L. (2022). The Impact of Preinjury and Secondary Attention-Deficit/Hyperactivity Disorder on Outcomes After Pediatric Traumatic Brain Injury.
Journal of Developmental & Behavioral Pediatrics, 43(6), e361–e369. https://doi.org/10.1097/dbp.0000000000001067 Neumane, S., Câmara-Costa, H., Francillette, L., Araujo, M., Toure, H., Brugel, D., Laurent-Vannier, A., Ewing-Cobbs, L., Meyer, P., Dellatolas, G., Watier, L., & Chevignard, M. (2021). Functional outcome after severe childhood traumatic brain injury: Results of the TGE prospective longitudinal study. Annals of Physical and Rehabilitation Medicine, 64(1), 101375. https://doi.org/10.1016/j.rehab.2020.01.008 Parker, K. N., Donovan, M. H., Smith, K., & Noble-Haeusslein, L. J. (2021). Traumatic Injury to the Developing Brain: Emerging Relationship to Early Life Stress. Frontiers in Neurology, 12. https://doi.org/10.3389/fneur.2021.708800
- The potential of virtual reality (VR) in healthcare | Scientia News
VR in pain management, and mental health treatment The potential of virtual reality (VR) in healthcare Last updated: 27/03/25, 15:44 Published: 06/03/25, 08:00 VR in pain management, and mental health treatment Introduction The term 'extended reality' (XR) covers three concepts: augmented reality, mixed reality and virtual reality (VR). The Oxford English Dictionary defines VR as a 'computer-generated simulation of a lifelike environment that a person can interact with in a seemingly real or physical way'. When you think of VR, you might think of headsets, goggles and gaming. However, you might not know that VR has huge potential in healthcare as a non-pharmacological intervention. Research has shown that active VR, in which patients interact with and become immersed in the virtual environment, works better than passive VR, in which patients simply view content. In this article, I will look at the use of VR in two cases: pain management and mental health treatment. VR for pain management VR-based treatments for pain management work by attention modulation, also known as focus-shifting, providing distraction analgesia (pain relief) by shifting a patient's focus away from the pain and towards the virtual environment. To access the VR set-up, patients use a head-mounted display (HMD) and accompanying hardware. VR stimulates the senses, particularly sight, sound, and touch, reducing the intensity of the pain a patient feels; it is especially useful when a patient experiences sharp and sudden pain, such as pain during labour or after surgery. Additionally, VR changes how the brain processes pain by acting on the pain-control system, which includes regions like the periaqueductal grey (PAG) and the anterior cingulate cortex (ACC). For chronic pain (persistent pain lasting more than three months) in particular, VR can help patients develop techniques to manage their pain better over time, such as improving their physical abilities, like moving their arms or legs more easily, and building their muscular endurance. For example, Merlot et al. (2023) found that women with endometriosis-related pelvic pain who used Endocare (a VR software package designed to reduce pain for those with endometriosis) reported reduced pain intensity, with Endocare's maximum pain reduction reaching 51.58%, compared with 27.37% in the sham control group (a small worked example of this kind of percentage-reduction calculation follows at the end of this article). VR for mental health treatment VR-based treatments have also proven effective in treating mental health conditions, helping patients to manage conditions such as anxiety and depression. This is because they can replicate a negative environment within a controlled and safe VR setting, helping patients confront and manage their triggers. The Institute for Health Metrics and Evaluation has stated that, as of 2019, 301 million people were living with an anxiety disorder, 58 million of whom (about 20% of those with anxiety) were children and adolescents. For depression, the figure was 280 million people, including 23 million children and adolescents (nearly 10% of those with depression). For anxiety, VR-based treatments use exposure therapy, in which patients are confronted with the feared stimuli while the expected negative outcome does not occur. Repeated exposure reduces patients' anxiety over time, because the stimuli repeatedly fail to lead to the feared outcome.
For example, someone with a fear of heights would undergo VR-based exposure treatment in which they are exposed to virtual heights. They would be guided through a learning process and, after multiple exposures, would come to experience heights as safe, reducing their fear of heights overall. For depression, VR-based treatments use behavioural activation so that individuals can reconnect with activities they enjoy. This can help patients develop and learn coping strategies, improving their mood and reducing depressive symptoms. VR-based treatments could be particularly helpful for children and adolescents. The statistics from the Institute for Health Metrics and Evaluation show that a high proportion of those with mental health conditions are young people, and research has shown that they are less likely to seek professional help and receive appropriate care. VR could help this group by offering a more appealing therapy method, especially through gamification, making children and adolescents more motivated and more likely to participate in treatment. It would provide an immersive environment and could be a personalised form of therapy. Implications for the future It is important to note that there are still limitations preventing a wider roll-out of VR within healthcare. For example, VR can cause cybersickness, the virtual equivalent of motion sickness, resulting in nausea, disorientation, and headaches. In addition, for the use of VR with young people, more research needs to be conducted on whether gamified therapies are safe and effective. Nevertheless, these limitations can be mitigated. Technology is advancing rapidly, and newer hardware offers a wider field of view and higher refresh rates for visual content. VR environments are also being designed better, accounting for individual patient preferences. With further research, scientists can examine in more detail the factors that make VR-based therapies effective and implement them in a way that addresses ethical concerns and increases their effectiveness. Written by Naoshin Haque Related articles: Clinical scientist computing / Smart bandages / Emojis in healthcare
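To make the pain-reduction figures quoted for the Endocare study concrete, the sketch below shows one common way a percentage reduction in pain intensity can be computed relative to a baseline score. It is a minimal illustration only: the 0-100 visual analogue scale and the example scores are assumptions chosen so the outputs match the quoted percentages, not data from Merlot et al.

```python
def percent_pain_reduction(baseline: float, post_treatment: float) -> float:
    """Percentage reduction in reported pain intensity relative to the baseline score."""
    return (baseline - post_treatment) / baseline * 100

# Hypothetical 0-100 VAS scores (illustrative only, not study data)
print(f"VR (Endocare): {percent_pain_reduction(60.0, 29.05):.2f}% reduction")  # ~51.58%
print(f"Sham control:  {percent_pain_reduction(60.0, 43.58):.2f}% reduction")  # ~27.37%
```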










