- An exploration of the attentional blink in rapid serial visual presentation studies | Scientia News
An exploration of the attentional blink in rapid serial visual presentation studies
Last updated: 24/06/25, 14:01 | Published: 03/07/25, 07:00
Raymond et al. (1992), Shapiro (1994), and other studies

Attention is a cognitive mechanism that helps us select and process vital information while ignoring irrelevant information, enabling us to consolidate our memories. The attentional blink refers to a severe impairment in detecting or identifying the second (T2) of two masked visual targets when the targets are presented less than 500 milliseconds apart. In this context, T1 refers to the first target, which captures attention and temporarily limits the ability to detect or identify T2 if the two appear too closely in time. Raymond et al. (1992) suggested that the attentional blink is observed under rapid serial visual presentation (RSVP) conditions, in which stimuli such as letters, digits or pictures are presented in rapid sequence, usually at a single location. Typically, the target in the RSVP stream is differentiated from the distractors (e.g. presented in a different colour), and the participant's task is to identify it. The RSVP procedure is a widely employed paradigm for examining the temporal characteristics of perceptual and attentional processes. Shapiro (1994) proposed the interference theory as an explanation for the attentional blink. According to this theory, items from the stream enter a temporary buffer when many distractors are present. Because of the limitations of visual short-term memory, multiple items compete for retrieval from this hypothetical buffer, which can affect recall accuracy. The attentional blink thus arises from competition over which target, T1 or T2, receives attentional processing.
Supporting evidence comes from Isaak et al. (1999), who presented combinations of letter and false-font stimuli on each trial and reported that attentional blink magnitude increases when the competitors come from the same conceptual category (for example, digits). Alternatively, Chun and Potter (1995) introduced a two-stage model to account for the attentional blink. Their research investigated whether the attentional blink occurs in an RSVP task: they hypothesised that participants' ability to detect T2 would be reduced if it appeared approximately 300 milliseconds after T1, and they also examined whether the attentional blink reflects a limited-capacity processing mechanism. In their model, stage 1 is where stimuli are processed and their features and meanings are registered, but not to a level sufficient for report; in stage 2, a stimulus is consolidated for response. The researchers located the attentional blink at stage 2: identification and consolidation of T1 are slowed when a subsequent item follows it, delaying the processing of T2 after T1's onset.

Discussion

Many RSVP studies hypothesise that presenting T2 between 300 and 700 milliseconds after T1, with multiple distractor items, increases the likelihood of an attentional blink and impairs detection of T2. This outcome aligns with the interference theory (Isaak et al., 1999), as participants faced significant difficulty retrieving stimuli from the temporal buffer during the dual task, yet showed a higher success rate in identifying the target during the single task, even with rapid stimulus presentation. Additional support for the interference theory is provided by Raffone et al. (2014), who argued that T2 must be masked by a distractor, and that if T1 appears within 500 milliseconds of T2, T2 often goes undetected, producing the attentional blink.
Raffone et al.'s unified model further suggests that in RSVP tasks, attention allocated to T1 reduces the attention available for T2, leaving T2 susceptible to decay or substitution. This implies that the attentional blink may result from T1 monopolising attentional resources and thereby limiting the capacity to process T2, which would explain the poorer performance observed in the dual task.

Conclusions

Despite their insights, both theories of the attentional blink have notable shortcomings. Contradicting evidence for the interference theory comes from Olivers and Meeter (2008), who argue that once an attentional blink is induced by a first target, it can be alleviated if T2 is preceded by a non-target that shares a target-defining feature, such as the same colour. In contrast, Reeves and Sperling (1986) postulate that an attentional gate opens once T1 is detected and remains open until target identification is complete. This can amplify the processing of subsequent stimuli, enabling the identification of T1 and helping T2 to receive attentional processing and be identified accurately. A main limitation of the two-stage model is its difficulty in explaining the full spectrum of attentional blink effects, particularly the 'T1 sparing' phenomenon and the impact of task demands on T2 processing. For instance, the two-stage model often assumes that T2 processing is impaired solely by the attentional load of T1, but research suggests that the difficulty of the T2 task itself can influence the attentional blink: if T2 requires a more complex or demanding response, the blink may be more pronounced even when T1 processing is relatively simple. Future research should investigate whether the attentional blink exists in other modalities, such as cross-modal perception (visual T1, auditory T2). This would provide deeper insight into how attentional mechanisms operate.
Future research should also explore alternative explanations for the attentional blink. Some studies suggest it may not be solely attributable to resource limitations or processing bottlenecks but could instead reflect a more dynamic process involving attentional re-engagement or the interaction between perceptual and attentional systems.

Written by Pranavi Rastogi

REFERENCES
Chun, M. M., & Potter, M. C. (1995). A two-stage model for multiple target detection in rapid serial visual presentation. Journal of Experimental Psychology: Human Perception and Performance, 21(1), 109-127. doi:10.1037/0096-1523.21.1.109
Isaak, M. I., Shapiro, K. L., & Martin, J. (1999). The attentional blink reflects retrieval competition among multiple rapid serial visual presentation items: Tests of an interference model. Journal of Experimental Psychology: Human Perception and Performance, 25(6), 1774-1792. doi:10.1037/0096-1523.25.6.1774
Olivers, C. N., & Meeter, M. (2008). A boost and bounce theory of temporal attention. Psychological Review, 115(4), 836-863. doi:10.1037/a0013395
Raffone, A., Srinivasan, N., & Van Leeuwen, C. (2014). The interplay of attention and consciousness in visual search, attentional blink and working memory consolidation. Philosophical Transactions of the Royal Society B: Biological Sciences, 369(1641), 20130215. doi:10.1098/rstb.2013.0215
Reeves, A., & Sperling, G. (1986). Attention gating in short-term visual memory. Psychological Review, 93(2), 180-206. doi:10.1037/0033-295x.93.2.180
Raymond, J. E., Shapiro, K. L., & Arnell, K. M. (1992). Temporary suppression of visual processing in an RSVP task: An attentional blink? Journal of Experimental Psychology: Human Perception and Performance, 18(3), 849-860. doi:10.1037/0096-1523.18.3.849
Shapiro, K. L., Raymond, J. E., & Arnell, K. M. (1994). Attention to visual pattern information produces the attentional blink in rapid serial visual presentation. Journal of Experimental Psychology: Human Perception and Performance, 20(2), 357-371. doi:10.1037/0096-1523.20.2.357
- Healthcare challenges during civil war in Sudan | Scientia News
Healthcare challenges during civil war in Sudan
Last updated: 19/06/25, 10:09 | Published: 17/04/25, 07:00
Health inequalities and inequities amid the ongoing civil war

This is article no. 2 in a series about global health injustices. Next article: Yemen: a neglected humanitarian crisis. Previous article: Life under occupation in Palestine.

Introduction

Welcome to the second article of the Global Health Injustices Series. My previous article focused on the Palestinians and the injustices they face, notably the blockade of food, water and medical supplies in Gaza. This one focuses on Sudan, examining the health inequalities and inequities the wider Sudanese population faces, mainly due to the ongoing civil war between the Sudanese Armed Forces (SAF) and the Rapid Support Forces (RSF). The war carries direct and indirect consequences (Figure 1); some of these are discussed in this article, along with ways forward to advocate for and support the Sudanese people, after an overview of Sudan's history and current state.

Sudan: a rich history to modern challenges

Sudan is a country in North Africa bordered by South Sudan, Egypt, the Central African Republic, Libya, Chad, Eritrea and Ethiopia. Sudan has seen shifts in political power over the centuries, notably the joint Egyptian-Ottoman rule beginning over 200 years ago, before the British government took control of Sudan during the first half of the 20th century. Sudan then became independent, and South Sudan gained its own independence in the 21st century. Through these shifts there has been a struggle for representation and power in Sudan, leading to various crises, including the current civil war (Figure 2). Despite this, Sudan maintains its many languages and cultural traditions through its resilient population.
Aside from the SAF and RSF themselves, arms trade and exports from external governments, particularly the United Arab Emirates (UAE), Russia and China, have accelerated the civil war. This point is crucial because it illustrates how severe the consequences of geopolitics are for the health and wellbeing of the Sudanese people.

Health in Sudan: the consequences of civil war and geopolitics

A public health situation analysis (PHSA) by the World Health Organisation (WHO), published in 2024, highlighted four major emergencies in Sudan: food insecurity, displacement, epidemics and conflict, which are intrinsically linked to detrimental health outcomes such as non-communicable diseases (NCDs), trauma and injury, measles and malaria. The PHSA also noted several mortality indicators. For example, the mortality rate among infants is 39 per 1000 live births, and among children it is 54 per 1000, both figures originating from the United Nations Children's Fund (UNICEF). These outcomes among infants and children are attributed to health conditions such as neonatal disorders and lower respiratory infections. Nonetheless, vaccine coverage in Sudan has increased to fight the spread of infectious diseases: COVID-19 vaccination had reached approximately 12.6 million people (28% of the population) by March 2023, alongside improved polio and rotavirus vaccination. Even so, these outcomes highlight the magnitude of the civil war in Sudan, with the arms trade adding fuel to it. Looking at Sudan's healthcare system, there are several pressures to highlight. One commentary article noted that in conflict areas fewer than one third of hospitals are operational, with around 70% out of service. Hospitals stopped operating for various reasons, mainly shortages of electricity, medical equipment and healthcare workers.
With the aforementioned geopolitical context, these gaps in the healthcare system are amplified and lead to the worsening health outcomes outlined in the PHSA, such as the rise in NCDs. Alongside rising NCDs, infectious diseases have been exacerbated by the civil war. One of them is drug-resistant tuberculosis (DR-TB), caused by bacteria. One systematic review found that the overall prevalence of drug-resistant TB was 47%; the drugs with the highest resistance rates were isoniazid (32.3%), streptomycin (31.7%) and rifampicin (29.2%). These values are likely higher today, given that arms exports into Sudan are increasing and more patients are unable to access sufficient care to manage or treat DR-TB. Another significant health problem in Sudan is schistosomiasis, which is caused by parasites. One systematic review covered two species: Schistosoma haematobium (S. haematobium) and Schistosoma mansoni (S. mansoni). S. haematobium prevalence was 24.83%, and S. mansoni prevalence was 19.13%. Although devising preventative strategies against these infections is crucial, it is paramount to consider the broader picture: tackling schistosomiasis and other infections begins with understanding Sudan's geopolitical context. Undernutrition among children is another significant health problem in Sudan. A meta-analysis found that Sudan had the highest prevalence of stunting among North African countries, at 36%; the same was true for wasting, at 14.1%, and underweight, at 24.6%. In a similar sentiment to tackling infectious diseases, understanding the geopolitical context in Sudan is therefore vital to minimising the prevalence of undernutrition among children.
Reflecting on the data and sources above, gaps and perspectives still need to be addressed and highlighted, specifically in places within Sudan where the ongoing civil war severely impacts research. This underlines the importance of obtaining reliable information to support communities in Sudan facing numerous injustices; in turn, filling these information and perspective gaps may apply to other crises similar to Sudan's.

Protecting health in Sudan: crucial ways forward from NGOs

To move forward, several NGOs, particularly Amnesty International, have made recommendations to protect the Sudanese people:

- As part of their obligation to respect and ensure respect for international humanitarian law (IHL), all states are prohibited from transferring, or permitting private actors to transfer, weapons to a party to an armed conflict.
- In light of the substantial risk that arms and ammunition transferred to Sudan… will be used by parties to the conflict to commit grave human rights abuses, companies must immediately cease their involvement in this supply of arms to avoid causing or contributing to these abuses.
- If a company identifies that the products it sold have contributed to such abuses, it should provide for or cooperate in the remediation process for any persons harmed as a result.

Taking these steps on board is essential to upholding human rights and ensuring that the health and wellbeing of the Sudanese people are sustained, particularly during the ongoing civil war. If not, these health inequities and inequalities will only be exacerbated. Moreover, the health outcomes from the infectious and chronic diseases outlined above are likely worse now, given how much weapons trading has occurred.

Conclusion: call to action for the international community

Overall, the civil war in Sudan has had devastating impacts on the health and wellbeing of the whole population, particularly infants and children, among the other injustices.
Unfortunately, this crisis has not received as much mainstream attention as others, such as Palestine, which is also a significant injustice. Sudan must therefore be addressed just as openly, through discussions of justice and advocacy through the voices of the Sudanese people. Moreover, my statement in the previous article on Palestine rings true: it is crucial always to nudge those in positions of power worldwide to fulfil their responsibilities as civil servants and defend human rights for everyone. This is essential to maintain the health and wellbeing of the Sudanese people, particularly to facilitate the recommendations from NGOs such as Amnesty International. In my next article, I will discuss Yemen, whose population is also enduring civil war, one of many injustices that have been occurring for more than a decade; Yemen is considered to be going through one of the worst humanitarian crises of our time. These impacts on the health and wellbeing of the Yemeni people likewise need awareness and discussion.

Written by Sam Jarada

Related articles: A perspective on well-being / Understanding health through different stances / Impacts of global warming on dengue fever

REFERENCES
Crisis in Sudan: What is happening and how to help. The IRC. 2025. Available from: https://www.rescue.org/article/crisis-sudan-what-happening-and-how-help
Khogali A, Homeida A. Impact of the 2023 armed conflict on Sudan's healthcare system. Public Health Challenges. 2023 Oct 28;2(4). Available from: https://onlinelibrary.wiley.com/doi/full/10.1002/puh2.134
Elamin A, Abdullah S, ElAbbadi A, Abdellah A, Hakim A, Wagiallah N, et al. Sudan: from a forgotten war to an abandoned healthcare system. BMJ Global Health. 2024 Oct;9(10):e016406. Available from: https://pmc.ncbi.nlm.nih.gov/articles/PMC11529772/
New weapons fuelling the Sudan conflict. Amnesty International. 2024. Available from: https://www.amnesty.org/en/latest/research/2024/07/new-weapons-fuelling-the-sudan-conflict/#:~:text=Shipment%2Dlevel%20trade%20data%20indicates,into%20lethal%20weapons%20in%20Sudan
PHSA - Sudan Complex Emergency 030424 SUDAN CONFLICT. World Health Organisation (WHO); 2024. Available from: https://cdn.who.int/media/docs/default-source/documents/emergencies/phsa--sudan-complex-emergency-030424.pdf?sfvrsn=81039842_1&download=true
Dafallah A, Osman, Ibrahim ME, Elsheikh RE, Blanchet K. Destruction, disruption and disaster: Sudan's health system amidst armed conflict. Conflict and Health. 2023 Sep 27;17(1). Available from: https://conflictandhealth.biomedcentral.com/articles/10.1186/s13031-023-00542-9
Hajissa K, Marzan M, Idriss MI, Islam MA. Prevalence of Drug-Resistant Tuberculosis in Sudan: A Systematic Review and Meta-Analysis. Antibiotics. 2021;10(8):932. doi: https://doi.org/10.3390/antibiotics10080932
Alsaafin Y, Omer A, Felemban O, Modawi S, Ibrahim M, Mohammed A, Elfaki A, Abushara A, SalahEldin MA. Prevalence and Risk Factors of Schistosomiasis in Sudan: A Systematic Review and Meta-Analysis. Cureus. 2024. doi: https://doi.org/10.7759/cureus.73966
Elmighrabi NF, Catharine, Dhami MV, Elmabsout AA, Agho KE. A systematic review and meta-analysis of the prevalence of childhood undernutrition in North Africa. PLoS ONE. 2023;18(4):e0283685. doi: https://doi.org/10.1371/journal.pone.0283685
- Addressing mental health within the South Asian community | Scientia News
Addressing mental health within the South Asian community
Last updated: 27/11/25, 15:14 | Published: 22/05/25, 07:00
Cultural beliefs, stigma, family values and more inhibit open discussion of mental health

Mental health is a critical aspect of human life, yet it remains a deeply taboo subject within the South Asian community. Despite growing awareness in mainstream discourse, many South Asians, especially those living in diasporic communities such as the UK, the US and Canada, continue to face significant barriers when it comes to recognising, understanding and seeking help for mental health concerns. Why does this silence continue? The answer lies in a combination of cultural beliefs, stigma, family values, societal expectations and a general lack of education, especially among older generations. Unlike Western cultures, which tend to emphasise individualism, South Asian societies often focus on collectivism, where the success and wellbeing of the family take precedence over the individual. This cultural foundation has both strengths and challenges: while it promotes community and support, it also discourages expressions of emotional vulnerability, especially when that vulnerability may be perceived as bringing shame or dishonour to the family. Mental health is often viewed as a personal weakness, a spiritual failing, or something that reflects poorly on one's upbringing or family reputation. A survey conducted by the NHS in the UK revealed that 35% of South Asian youth aged 18-24 reported experiencing some form of mental health issue, compared to 30% of White British youth. While these figures suggest a slightly higher incidence, what is more alarming is the disparity in access to care and treatment. Many South Asians are less likely to seek help for fear of being perceived as 'crazy' or weak.
In some cases, mental health symptoms are dismissed as temporary mood swings, spiritual crises, or simply a lack of willpower. A study published by the Mental Health Foundation (2020) found that only 32% of South Asians surveyed had a functional understanding of mental health, compared to 60% of the general UK population. This suggests that stigma is compounded by a lack of knowledge, which prevents early intervention and exacerbates untreated conditions. Among those who recognise they have a problem, there is often a reluctance to seek professional help, particularly from psychologists or psychiatrists. Instead, some may turn to spiritual leaders or rely solely on familial support; both are culturally significant, but neither may always offer the necessary therapeutic intervention. Among the major mental health concerns within the South Asian community are depression and anxiety, and these conditions often go undiagnosed. Research from the Centre for Mental Health has indicated that South Asian individuals are more likely to report symptoms of depression and anxiety than their White counterparts, but are less likely to receive treatment. According to a 2022 study by Public Health England, South Asian women are 1.5 times more likely to suffer from common mental health disorders, such as anxiety and depression, but only 13% accessed mental health services, compared to 25% of White British women. Many culturally specific factors contribute to higher rates of anxiety and depression in South Asian communities, including intergenerational trauma, immigration stress, identity conflict, and pressures related to marriage, family reputation, and academic or career success. Young South Asians often find themselves navigating between traditional family expectations and Western societal norms, leading to identity struggles that can trigger chronic stress and anxiety. Additionally, gender roles in South Asian cultures often impose strict expectations on behaviour.
Women may be discouraged from voicing emotional distress, as they are expected to be nurturing and self-sacrificing. Men, on the other hand, are often pressured to appear strong and unemotional, which leads to a culture where expressing vulnerability is equated with failure. These rigid expectations prevent both genders from openly discussing their struggles or seeking help. Barriers to accessing mental health services are not only cultural but also structural. Many South Asians, particularly first-generation immigrants, may face language barriers when communicating with healthcare providers. There is also a lack of culturally competent therapists who understand the nuances of South Asian traditions, values, and family structures. Without representation or relatability, individuals may feel misunderstood or alienated by the mental healthcare system. Despite these challenges, there is hope. The rise of South Asian mental health advocates, community-based initiatives, and culturally tailored therapy programmes is slowly helping to dismantle stigma. Social media has also played a vital role in bringing these conversations to the forefront, especially among Gen Z and Millennials. Many people are now speaking out and sharing their stories and experiences, which helps shift the narrative within the South Asian community. We can help break the stigma surrounding mental health in the South Asian community by raising awareness, educating others, and normalising conversations around emotional wellbeing. It starts at the grassroots level: in homes, schools, religious institutions, and workplaces. Encouraging open dialogue and fostering environments where individuals feel safe to share their experiences without judgment is key. More importantly, we must validate the struggles of those suffering from mental health issues, telling them that it is okay to not be okay, and that seeking help is a sign of strength, not weakness.
Furthermore, the government and health services can do more! They should invest in culturally sensitive mental health resources, including multilingual therapy options and outreach programmes tailored specifically for South Asian populations. In conclusion, addressing mental health within the South Asian community requires a collective effort to challenge outdated norms, educate people across all age groups, and improve access to inclusive and empathetic mental healthcare. Depression, anxiety and other mental illnesses are not signs of weakness; they are real, treatable conditions that deserve compassion and support. Only by acknowledging this and working together can we begin to transform the narrative and create a healthier, more open future for the South Asian community, giving future generations a safe and open space to talk and get help for their mental health!

Written by Rajeevan Sinnathurai

-------
Scientia News thanks Rajeevan of Open Talk for this enlightening piece on mental health in the South Asian community. Connect with Open Talk on Instagram and TikTok.
-------

Related articles: Mental health awareness / Imposter syndrome / Anxiety / South Asian epigenetics / Global health injustices - Kashmir, Bangladesh, Sri Lankan Tamils

REFERENCES
NHS Digital. (2021). Mental Health of Children and Young People in England.
Mental Health Foundation. (2020). Mental Health in the South Asian Community.
Centre for Mental Health. (2022). Race and Mental Health Inequalities.
Public Health England. (2022). Mental Health Services Use by Ethnic Groups in the UK.
- Reaching new horizons in Alzheimer's research | Scientia News
Reaching new horizons in Alzheimer's research
Last updated: 10/07/25, 10:34 | Published: 12/10/23, 10:50
The role of CRISPR-Cas9 technology

The complexity of Alzheimer's

Alzheimer's disease (AD) is a formidable foe, marked by its relentless progression and the absence of a definitive cure. As the leading cause of dementia, its prevalence is expected to triple by 2050. Traditional therapies mainly focus on managing symptoms; however, advances in genetics research, specifically CRISPR-Cas9 gene-editing technology, offer newfound hope for understanding and treating this debilitating condition. The disease is characterised by progressive deterioration of cognitive function, with memory loss as its hallmark symptom. It primarily affects individuals aged 65 and over, and age is the most significant risk factor. Although the precise cause remains elusive, scientists believe that a combination of genetic, lifestyle and environmental factors contributes to its development.

CRISPR's role in Alzheimer's research

Since the discovery that CRISPR-Cas9 can be used for gene editing, the technology has attracted interest for its potential to manipulate genes contributing to Alzheimer's. Researchers from the University of Tokyo used a CRISPR-Cas9 screening technique to identify calcium- and integrin-binding protein 1 (CIB1) as a protein involved in the development of AD. Furthermore, Canadian researchers have used CRISPR to edit genes in brain cells to prevent Alzheimer's. The team identified a genetic variant called A673T, found to decrease the likelihood of Alzheimer's by a factor of four and to reduce the Alzheimer's biomarker beta-amyloid (Aβ). Using CRISPR in petri-dish studies, they managed to activate this A673T variant in lab-grown brain cells. However, the reliability and validity of this finding are yet to be confirmed by replication in animal studies.
One final example of a CRISPR application is targeting the amyloid precursor protein (APP) gene. The Swedish mutation in the APP gene is associated with dominantly inherited AD. Scientists were able to specifically target and disrupt the mutant allele of this gene using CRISPR, which decreased production of the pathogenic Aβ peptide. Degenerating neurons are surrounded by Aβ fibrils, and the production of Aβ in the brain initiates a series of events that cause the clinical syndrome of dementia. The results of this study were replicated both ex vivo and in vivo, demonstrating that this could be a potential treatment strategy in the future.

The road ahead

While CRISPR technology's potential in Alzheimer's research is promising, its therapeutic application is still in its infancy. Nevertheless, with cutting-edge tools like CRISPR deepening our understanding of AD, we are on the cusp of breakthroughs that could transform the landscape of Alzheimer's disease treatment.

Written by Maya El Toukhy

Related articles: Alzheimer's disease (an overview) / Hallmarks of Alzheimer's / Sleep and memory loss
- Monkey see, monkey clone | Scientia News
Monkey see, monkey clone
Last updated: 10/07/25, 10:22 | Published: 07/09/24, 19:20
A leap forward in primate research

Chinese scientists have recently unlocked the secrets of cloning Rhesus monkeys, offering new hope for medical breakthroughs.

Introduction

When we think of cloning, perhaps the first thing that comes to mind is Dolly the sheep, the first mammal ever cloned from an adult cell, back in 1996. This groundbreaking achievement inspired a revolution, leading to the successful cloning of other mammals such as cattle and pigs. However, cloning primates, especially Rhesus monkeys, has proven a significant challenge due to low success rates and high embryonic losses during development.

What is cloning?

Cloning is the process of creating an identical genetic copy of an organism. In mammals, this is typically done through a technique called somatic cell nuclear transfer (SCNT). In SCNT, the nucleus (the compartment storing genetic material) from a cell of the animal to be cloned is transferred into an egg cell that has had its own nucleus removed. This hybrid egg cell then develops into an embryo, which is implanted into a surrogate mother to grow into a new individual. Despite these challenges, the potential benefits of cloning primates for medical research make it a worthwhile endeavour.

The importance of cloning primates

You might be wondering why being able to clone primates is so important. Well, primates like the Rhesus monkey are invaluable models for studying human diseases and creating new therapies! We can use them as disease models because they share about 93% genetic identity with humans and have very similar physiological characteristics.
For instance, Rhesus monkeys also experience a decline in their cognitive abilities as they age, and they lose important connections between brain cells in the part of the brain responsible for complex thinking, even when there is no severe brain damage. Moreover, Rhesus monkeys develop the same kinds of brain changes that we see in people with Alzheimer's disease, such as the buildup of sticky proteins called amyloid-beta and tangled fibres of another protein called tau. These similarities make them excellent models for understanding how human diseases progress and for developing new treatments. So, by cloning these animals, researchers might be able to create monkeys with specific genetic changes that mimic human diseases even more closely. This could allow scientists to study these diseases in greater detail and develop more effective therapies. Cloning primates could give us a powerful tool to fight against some of the most challenging disorders that affect the human brain!

A breakthrough in primate cloning

Now, a group of scientists in China have made a breakthrough in primate cloning. They successfully cloned a Rhesus monkey using a novel technique called trophoblast replacement (TR). This innovative approach not only helps us better understand the complex process of cloning but also offers a promising way to improve the efficiency of primate cloning, bringing us one step closer to unlocking the full potential of this technology for medical research and beyond.

The awry DNA methylation of cloned monkey embryos

To understand why cloning monkeys is so challenging, Liao and colleagues (2024) took a closer look at the genetic material of embryos created in two different ways. They compared embryos made through a standard fertility treatment called intracytoplasmic sperm injection (ICSI) with those created via the cloning technique, SCNT. What they found was quite surprising!
To make matters worse, the scientists also noticed that certain genes, known as imprinted genes, were not functioning properly in the SCNT embryos. Imprinted genes are a special group of genes that play a crucial role in embryo development. In a healthy embryo, only one copy of an imprinted gene (either from the mother or the father) is active, while the other copy is silenced. But in the cloned embryos, both copies were often incorrectly switched on or off. Here's the really concerning part: these genetic abnormalities were present not just in the early embryos but also in the placentas of the surrogate monkey mothers carrying the cloned offspring. This suggests that the issues arising from the cloning process start very early in development and continue to affect the pregnancy. Liao and colleagues suspect that the abnormal DNA methylation patterns might be responsible for the imprinted gene malfunction. It's like a game of genetic dominoes: when one piece falls out of place, it can cause a whole cascade of problems down the line. Piecing together this complex genetic puzzle is crucial for understanding why primate cloning is so difficult and how we can improve its success in the future. By shedding light on the mysterious world of DNA methylation and imprinted genes, Liao and colleagues have brought us one step closer to unravelling the secrets behind monkey cloning.

Digging deeper: what does the data reveal?

Liao et al. (2024) discovered that nearly half of the cloned monkey foetuses died before day 60 of the gestation period, indicating developmental defects in the SCNT embryos during implantation. They also found that the DNA methylation level in SCNT blastocysts was 25% lower than in those created through ICSI (30.0% vs. 39.6%). Furthermore, of the 115 human imprinted genes they examined in both the embryos and placentas, four genes (THAP3, DNMT1, SIAH1 and RHOBTB3) showed abnormal expression and loss of DNA methylation in SCNT embryos.
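The "25% lower" figure above is the relative difference between the two reported methylation levels, and it can be checked directly. This is just arithmetic on the two values quoted from Liao et al. (2024):

```python
# Global blastocyst DNA methylation levels reported by Liao et al. (2024)
icsi_methylation = 39.6   # % in ICSI (fertilised) blastocysts
scnt_methylation = 30.0   # % in SCNT (cloned) blastocysts

# Relative reduction in the cloned embryos compared with the fertilised ones
relative_drop = (icsi_methylation - scnt_methylation) / icsi_methylation
print(f"SCNT methylation is {relative_drop:.0%} lower than ICSI")
# just over 24%, i.e. roughly the 25% reported
```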
These findings highlight the complex nature of the reprogramming process in SCNT and the importance of imprinted genes in embryonic development. By understanding these intricacies, scientists can develop targeted strategies to improve the efficiency of primate cloning.

The power of trophoblast replacement

To avoid the anomalies in SCNT placentas, the researchers developed a new method called TR. In this method, they transferred the inner cell mass (the part of the early embryo that develops into the baby) from an SCNT embryo into the hollow cavity of a normal embryo created through fertilisation, after removing that embryo's own inner cell mass. The idea behind this technique is to replace the abnormal placental cells of the SCNT embryo with healthy ones from the normal embryo. And it worked! Using this method, along with some additional treatments, Liao et al. (2024) successfully cloned a healthy male Rhesus monkey that has survived for over two years (FYI, his name is Retro!).

The ethics of cloning

While the scientific advances in primate cloning are exciting, they also raise important ethical questions. Some people worry about the potential misuse of this technology, for instance to clone humans, which is widely considered unethical. Others are concerned about the well-being of cloned animals, as the cloning process can sometimes lead to health problems. As scientists continue to make progress in cloning technology, it is essential to have open discussions about the ethical implications of their work. Rules and guidelines must be put in place to ensure that this technology is developed and used responsibly, with the utmost care for animal welfare and the concerns of society.

Looking to the future

The successful cloning of a Rhesus monkey using TR opens up new avenues for primate research.
This technology can help scientists create genetically identical monkeys to study a wide range of human diseases, from neurodegenerative disorders like Alzheimer's and Parkinson's to infectious diseases like HIV and COVID-19. The trophoblast replacement technique developed by Liao et al. (2024) increases the likelihood of successful cloning by replacing the abnormal placental cells in the SCNT embryo with healthy ones from a normal embryo. However, it is important to note that this technique does not affect the genetic similarity between the clone and the original monkey, as the inner cell mass, which gives rise to the foetus, is still derived from the SCNT embryo. Moreover, this research provides valuable insights into the mechanisms of embryonic development and the role of imprinted genes in this process. By understanding these fundamental biological processes, scientists can not only improve the efficiency of cloning but also develop new strategies for regenerative medicine and tissue engineering. As we look to the future, cloning monkeys could help us make groundbreaking discoveries in medical research and develop new treatments for human diseases. However, we must also carefully consider the ethical implications of cloning primates and ensure that this powerful tool is used responsibly and for the benefit of society.

Written by Irha Khalid

Related articles: Do other animals get periods? / Germline gene therapy (GGT)

REFERENCES

Beckman, D. and Morrison, J.H. (2021). Towards developing a rhesus monkey model of early Alzheimer's disease focusing on women's health. American Journal of Primatology, [online] 83(11). doi: https://doi.org/10.1002/ajp.23289.

Liao, Z., Zhang, J., Sun, S., Li, Y., Xu, Y., Li, C., Cao, J., Nie, Y., Niu, Z., Liu, J., Lu, F., Liu, Z. and Sun, Q. (2024). Reprogramming mechanism dissection and trophoblast replacement application in monkey somatic cell nuclear transfer. Nature Communications, [online] 15(1), p.5. doi: https://doi.org/10.1038/s41467-023-43985-7.

Morrison, J.H. and Baxter, M.G. (2012). The ageing cortical synapse: hallmarks and implications for cognitive decline. Nature Reviews Neuroscience, [online] 13(4), pp.240–250. doi: https://doi.org/10.1038/nrn3200.

Paspalas, C.D., Carlyle, B.C., Leslie, S., Preuss, T.M., Crimins, J.L., Huttner, A.J., Dyck, C.H., Rosene, D.L., Nairn, A.C. and Arnsten, A.F.T. (2017). The aged rhesus macaque manifests Braak stage III/IV Alzheimer's-like pathology. Alzheimer's & Dementia, [online] 14(5), pp.680–691. doi: https://doi.org/10.1016/j.jalz.2017.11.005.

Shi, L., Luo, X., Jiang, J., Chen, Y., Liu, C., Hu, T., Li, M., Lin, Q., Li, Y., Huang, J., Wang, H., Niu, Y., Shi, Y., Styner, M., Wang, J., Lu, Y., Sun, X., Yu, H., Ji, W. and Su, B. (2019). Transgenic rhesus monkeys carrying the human MCPH1 gene copies show human-like neoteny of brain development. National Science Review, [online] 6(3), pp.480–493. doi: https://doi.org/10.1093/nsr/nwz043.
Unlocking the power of statistics
Last updated: 14/07/25, 15:09. Published: 19/09/23, 16:23
From confusion to career opportunities

During my time studying maths there was always one topic that would trip me up: statistics. Being an A-level physics student, I could understand why calculus is useful in real life, using differentiation to calculate the velocities of projectiles. And I could see how geometry is used in buildings and structures. Statistics, however, often left me feeling lost, as I was unable to see its real-world applications. But today, I wish to alter my old perspective. First and foremost, you might be pleasantly surprised to learn that statistics opens doors to some of the most lucrative careers available today. We'll delve into roles such as quantitative analysts, who boast a national average salary of £99,000 per year. But if finance is not your cup of tea, there are many other rewarding career paths to explore, from becoming a data scientist to forecasting the weather as a meteorologist. In this article, I wish to unveil the world of statistics, revealing its importance and shedding light on its real-life applications. My hope is not only to inspire those who are already passionate about statistics but also to ignite motivation in individuals who, like me, found themselves in a similar predicament a few years ago.

The Actuary

Less well known than a banker or engineer, an actuary's sole purpose is to analyse risk across many different scenarios. It may sound simple on first inspection, but being an actuary is a well-established career requiring many years of learning followed by some of the most challenging exams in the job market. An actuary attempts to quantify the risk of an event happening so that financial decisions can be made with an objective view.
A good and close-to-home example of this is being either accepted or rejected for a credit card. As a younger person below the age of 21, the chances of you getting accepted for a credit card are extremely, and quite painfully, low. This is because banks, and more specifically credit score providers, deem you a high-risk person to lend to. You have a very short credit history, so they cannot tell how responsible you are with money, and they are therefore more afraid to lend you their cash. In other words, they don't want you to spend their money on going out and drinking booze. The insurance industry is, however, the biggest employer of actuaries. Both life and non-life actuaries work in teams with insurance providers to establish whether a client, company, or investment is worthwhile. Actuaries apply both statistics and actuarial science (similar to applied statistics) to real-life situations, evaluate whether to offer a premium to a customer, and then establish what that premium should be. You may see in advertisements that life insurance costs as little as £10 a month for a 20-year-old compared to someone who is 65. This is because the younger you are, the less likely you are to claim against your policy. Actuaries put together vast amounts of information about people, lifestyle choices, and other factors to help determine the probability that someone may claim, suggesting a 'fair' premium that an insurance company may offer. Without the help of an actuary, insurance companies would either charge too much, putting customers at a disadvantage, or charge too little, in which case they would be unable to pay out claims and would default on their policies.
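The 'fair premium' reasoning above can be sketched as a simple expected-value calculation. The claim probabilities, claim size and loading below are invented purely for illustration; real actuarial models draw on far richer data:

```python
def fair_annual_premium(p_claim: float, avg_claim: float, loading: float = 0.15) -> float:
    """Expected annual payout plus a proportional loading for expenses and profit.

    p_claim   -- probability the policyholder claims in a given year (assumed)
    avg_claim -- average size of a claim if one occurs (assumed)
    loading   -- markup covering costs and a risk margin (assumed)
    """
    expected_payout = p_claim * avg_claim
    return expected_payout * (1 + loading)

# Illustrative, made-up numbers: a 20-year-old vs a 65-year-old life policy
young = fair_annual_premium(p_claim=0.001, avg_claim=100_000)  # ~ £115 per year
older = fair_annual_premium(p_claim=0.012, avg_claim=100_000)  # ~ £1,380 per year
print(young, older)
```

The younger policyholder pays far less simply because the expected payout is smaller, which is the logic behind the cheap premiums advertised to 20-year-olds.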
Although this seems very specific, the role of an actuary is becoming increasingly important as people live longer lives and insurance companies become more fearful of defaulting. To put it into perspective, actuaries in London earn £80,000 on average, putting you well within the top 10% of earners in the UK.

The Quantitative Analyst

Similar to actuaries, quantitative analysts do exactly what it says on the tin: they use quantitative methods to analyse data. Often, companies like investment banks, hedge funds, and pension funds will hire front-office 'quants'. The aim of the game is to send out trades as quickly as possible, before all the other trading offices do. These big companies have links directly to the trading floor, so every millisecond counts, and it's a quant's job to devise a trading strategy that beats the rest and operates in the least amount of time. Quants are masters of statistics and mathematics, and for this reason, high-frequency trading firms like Hudson River Trading offer salaries to top mathematical minds in excess of $500,000. The role of quantitative researchers is to explore the latest statistical articles being published by top universities and generate strategies that can be implemented in the stock market. This job is not one to be taken lightly, as salary is often based on performance, but someone who is motivated to explore the ins and outs of statistics may find themselves loving the life of a quant.

The Meteorologist

Meteorologists are the people we incorrectly blame for the bad weather that we have. And they are also the people we blame when we forget to take a coat and get soaked on the long walk back home. But what do meteorologists actually do? And is it any more than just an educated guess? Meteorologists, along with climatologists, collect millions of pieces of information every hour of every day across some 195,000 weather stations spread all around the globe.
These stations collect key pieces of information, including atmospheric pressure, temperature, wind speed, rainfall, humidity, and many other components of current weather conditions. With this information, meteorologists begin to paint a picture of the current weather and then use forecasting methods and statistical models to estimate how it is going to change. The probability that it might rain is much more than an educated guess; it is the proportion of times it would rain if this exact situation were repeated (i.e., if there was an 80% chance of rain, it would rain 80 times out of a hundred over a large enough sample). As a forecaster, you will collect this information and input it into very advanced systems to analyse and give an outcome, but as a researcher, you will help derive these statistical forecasting models and improve them so that our apps and news channels are even more precise. Not only that, but you may also find yourself researching the effects of climate change from the data that you analyse, and maybe even how the weather affects the spread of pollution and disease. Meteorologists are paid a modest salary of around £32,000 per year, which may seem small compared to that of a quant, but the quality of life is far more generous than in some careers in finance.

To conclude

Statistics, once a perplexing subject for many, can offer an exciting and rewarding career. From the meticulous work of actuaries, assessing risks and financial decisions, to the world of quantitative analysts, where every millisecond counts, and even to the indispensable role of meteorologists, who help us navigate the weather and climate change, statistics holds the power to transform lives and industries. As we've explored, statistics is not just about numbers and formulas; it's about making sense of the world, predicting outcomes, and making informed decisions.
So, whether you're a seasoned statistician or someone who, like me, once felt lost in its complexities, remember that statistics isn't merely a subject to conquer—it's a key that unlocks doors to some of the most intriguing and well-compensated careers out there.

Written by George Chant
Can carbon monoxide unlock new pathways in inflammation therapy?
Last updated: 20/03/25, 12:03. Published: 01/09/24, 10:31
Recent prospects for carbon monoxide indicate so

Carbon monoxide (CO) is a colourless, odourless and tasteless gas which is a major product of the incomplete combustion of carbon-containing compounds. The toxic identity of CO stems from its strong affinity for the haemoglobin in our blood, around 300 times as strong as the affinity of oxygen. As a result, once the gas is inhaled, CO binds to the haemoglobin instead and reduces the amount of oxygen our blood can transport, which can cause hypoxia (low levels of oxygen in tissue) and dizziness, eventually leading to death. However, an intriguing fact is that CO is also endogenously produced in our body through the degradation of haem in the blood. Moreover, recent prospects for CO indicate that it may even be developed as an anti-inflammatory drug.

How CO is produced in the body (see Figure 1)

Haem is a prosthetic (non-peptide) group in haemoglobin, where oxygen binds to the iron in the molecule. When red blood cells reach the end of their lifespan of around 120 days, they are broken down in a process called haemolysis. This occurs in the bone marrow, where macrophages engulf the cells; these macrophages contain the necessary haem-oxygenase enzyme. Haem-oxygenase converts haem into CO, along with Fe2+ and biliverdin, the latter being converted to bilirubin for excretion. The breakdown of haem is crucial because the molecule is pro-oxidant: free haem in the blood can lead to oxidative stress in cells, potentially resulting in cancers. Haem degradation also contributes to the recycling of iron for the synthesis of new haem molecules or proteins like myoglobin, which is crucial for maintaining iron homeostasis in the body.
The flow map illustrates haemolysis and the products produced, which either protect cells from further stress or result in cell injury. CO can go on to induce anti-inflammatory effects (see Figure 2).

Protein kinases and CO

Understanding protein kinases is crucial before exploring carbon monoxide (CO) reactions. Protein kinases phosphorylate (add a phosphate group to) proteins using ATP; they are needed, for example, to signal the release of a hormone or to regulate cell growth. Each kinase has two regulatory (R) subunits and two catalytic (C) subunits. ATP as a reactant is usually sufficient for protein kinases. However, some kinases also require mitogens, specific activating molecules like cytokines (proteins regulating immune cell growth), that are involved in regulating cell division and growth. Without these activating molecules, the R subunits bind tightly to the C subunits, preventing phosphorylation. Research on obese mice showed that CO binding to a Mitogen-Activated Protein Kinase (MAPK) called p38 inhibits inflammatory responses. This kinase pathway enhances insulin sensitivity, reducing the effects of obesity. The studies used gene therapy, modifying haem-oxygenase levels in mice. Mice with reduced haem-oxygenase levels had more adipocytes (fat-storing cells) and increased insulin resistance, suggesting potential for CO treatment in chronic obstructive pulmonary disease (COPD), which causes persistent lung inflammation and results in 3 million deaths annually.

Carbon-monoxide-releasing molecules

As a result of these advancements, specific CO-releasing molecules (CORMs) have been developed to release carbon monoxide at specific doses. Researchers are particularly interested in the ability of CORMs to regulate oxidative stress and improve outcomes in settings such as organ transplantation and cardiovascular disease. Advances in the design of CORMs have focused on improving their stability and their targeted release to specific tissues or cellular environments.
For instance, CORMs based on transition metals like ruthenium, manganese, and iron have been developed to enhance efficacy and minimise side effects. This is achieved through carbon monoxide forming a stable ligand structure with the metal, allowing it to travel in the bloodstream. Under exposure to light or a chemical trigger, or even by natural breakdown, these structures can slowly release CO molecules. Although the current research did not find any notable side effects in mouse cells, this does not necessarily reflect the mechanisms in human organ systems; there remains a major risk of incompatibility due to water insolubility and toxicity issues. These problems could potentially disrupt the cell cycle, which may promote neurodegenerative diseases.

Conclusion: the future of carbon monoxide

Carbon monoxide has transitioned from being a notorious toxin to a promising therapeutic agent. Advances in CO-releasing molecules have enabled its safe and controlled use, harnessing its anti-inflammatory and protective properties to treat various inflammatory conditions. This shift underpins the potential of CO to revolutionise inflammation therapy. It is important to remember that while carbon monoxide-releasing molecules (CORMs) have potential in controlled therapeutic settings, carbon monoxide gas itself remains highly toxic and should be handled with extreme caution to avoid serious health risks.

Written by Baraytuk Aydin

Related articles: Schizophrenia, inflammation and ageing / Kawasaki disease

REFERENCES

Different Faces of the Heme-Heme Oxygenase System in Inflammation - Scientific Figure on ResearchGate. Available from: https://www.researchgate.net/figure/The-colorimetric-actions-of-the-heme-HO-system-heme-oxygenase-mediated-heme-degradation_fig3_6531826 (accessed 11 Jul, 2024).

Nath, K.A. (2006) Heme oxygenase-1: A provenance for cytoprotective pathways in the kidney and other tissues, Kidney International.
Available at: https://www.sciencedirect.com/science/article/pii/S0085253815519595 (Accessed: 12 July 2024).

Gáll, T. et al. (2020) 'Therapeutic potential of carbon monoxide (CO) and hydrogen sulfide (H2S) in hemolytic and hemorrhagic vascular disorders—interaction between the heme oxygenase and H2S-producing systems', International Journal of Molecular Sciences, 22(1), p. 47. doi:10.3390/ijms22010047.

Venkat, A. (2024) Protein kinase, Wikipedia. Available at: https://en.wikipedia.org/wiki/Protein_kinase (Accessed: 12 July 2024).

Goebel, U. and Wollborn, J. (2020) Carbon monoxide in intensive care medicine - time to start the therapeutic application?! - Intensive Care Medicine Experimental, SpringerOpen. Available at: https://icm-experimental.springeropen.com/articles/10.1186/s40635-020-0292-8 (Accessed: 07 July 2024).

Bansal, S. et al. (2024) 'Carbon monoxide as a potential therapeutic agent: A molecular analysis of its safety profiles', Journal of Medicinal Chemistry, 67(12), pp. 9789–9815. doi:10.1021/acs.jmedchem.4c00823.

DeSimone, C.A., Naqvi, S.L. and Tasker, S.Z. (2022) 'Thiocormates: Tunable and cost-effective carbon monoxide-releasing molecules', Chemistry – A European Journal, 28(41). doi:10.1002/chem.202201326.
Proving causation: causality vs correlation
Last updated: 03/06/25, 13:43. Published: 12/06/25, 07:00
Establishing causation through Randomised Controlled Trials and Instrumental Variables

Does going to the hospital lead to an improvement in health? At first glance, one might assume that visiting a hospital should improve health outcomes. However, if we compare the average health status of those who go to the hospital with those who do not, we might find that hospital visitors tend to have worse health overall. This apparent contradiction arises due to confounding: people typically visit hospitals because of existing health issues. Simply comparing these two groups does not tell us whether hospitals improve health or whether the underlying health conditions of patients drive the observed differences. A similar challenge arises when examining the relationship between police presence and crime rates. Suppose we compare two cities, one with a large police force and another with a smaller one. If the city with more police also has higher crime rates, does this mean that police cause crime? Clearly not. Instead, it is more likely that higher crime rates lead to an increased police presence. This example illustrates why distinguishing causation from correlation is crucial in data analysis. First, let's clarify the distinction. Correlation refers to a relationship between two variables, but it does not imply that one causes the other: just because two events occur together does not mean that one directly influences the other. To establish causation, we need methods that separate the true effect of an intervention from other influencing factors.
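The hospital example can be made concrete with a small simulation. Every number here is an assumption chosen for illustration: underlying sickness drives both hospital visits and poor health, so a naive comparison makes hospital care look harmful even though the simulated treatment effect is positive:

```python
import random

random.seed(0)

visitors, non_visitors = [], []
for _ in range(10_000):
    sickness = random.uniform(0, 10)      # confounder: how ill the person is
    goes_to_hospital = sickness > 6       # sicker people are likelier to go
    health = 10 - sickness                # health without treatment
    if goes_to_hospital:
        health += 2                       # assumed true benefit of hospital care
    (visitors if goes_to_hospital else non_visitors).append(health)

def mean(values):
    return sum(values) / len(values)

# Naive comparison: visitors look LESS healthy despite the +2 benefit
print(f"avg health, visitors: {mean(visitors):.1f}; non-visitors: {mean(non_visitors):.1f}")
```

The naive comparison points the wrong way: the visitor group is less healthy on average even though every visitor was made healthier by treatment. Only a method that breaks the link between sickness and treatment, such as the randomisation discussed next, recovers the true effect.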
Statisticians, medical researchers and economists have ingeniously come up with several techniques that allow us to separate correlation from causation. In medicine, the gold standard is the Randomised Controlled Trial (RCT). Imagine a group of 100 people, each with a set of characteristics such as gender, age, political views, health status and university degree. An RCT randomly assigns each individual to one of two groups. Consequently, each group of 50 individuals should, on average, have similar ages, gender distribution, and baseline health. Researchers then examine both groups simultaneously while changing only one factor. This could involve instructing one group to take a specific medicine or asking individuals to drink an additional cup of coffee each morning. The result is two statistically similar groups differing in only one key aspect. Therefore, if the characteristics of one group change while those of the other do not, we can reasonably conclude that the intervention caused the difference between the groups. This is great for examining the effectiveness of medicine, especially when one group is given a placebo, but how would we research the causation behind the police and crime example? Surely it would be unwise, and perhaps unethical, to randomise how many police officers are present in each city? And because not all cities are the same, the conditions for RCTs would not hold. Instead, we use more complex techniques like Instrumental Variables (IV) to overcome those limitations. A famous study using IV to explain police levels and crime was published by Steven Levitt (1997). Levitt used the timings of mayoral and gubernatorial elections (the election of a governor) as an instrument for changes in police hiring. Around election time, mayors and governors have incentives to look "tough on crime", which can lead to politically motivated increases in police hiring before an election.
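The mechanics of this instrument can be sketched with simulated data. Everything below is a toy model with assumed coefficients (the true police effect is set to -0.5), not Levitt's actual dataset; it only demonstrates how the simple IV (Wald) estimator undoes the bias that a naive regression suffers:

```python
import random

random.seed(1)
n = 20_000

z, x, y = [], [], []  # instrument (election year), police level, crime level
for _ in range(n):
    election = 1.0 if random.random() < 0.5 else 0.0  # exogenous election timing
    u = random.gauss(0, 1)          # unobserved factor raising BOTH crime and policing
    police = 5 + 2 * election + 1.5 * u + random.gauss(0, 1)
    crime = 10 - 0.5 * police + 3 * u + random.gauss(0, 1)  # assumed true effect: -0.5
    z.append(election); x.append(police); y.append(crime)

def mean(values):
    return sum(values) / len(values)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)

ols_slope = cov(x, y) / cov(x, x)  # naive regression: biased by the confounder
iv_slope = cov(z, y) / cov(z, x)   # Wald/IV estimate: recovers roughly -0.5
print(f"naive OLS slope: {ols_slope:+.2f}, IV slope: {iv_slope:+.2f}")
```

In this toy model the naive slope comes out positive, because the unobserved factor u raises both policing and crime, while the IV slope, which uses only the variation in police numbers driven by election timing, lands close to the assumed -0.5.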
Crucially, hiring is not caused by current crime rates but by the electoral calendar. So, by using the timing of elections to predict an increase in police, we can use those predicted values to estimate the effect on crime. What Levitt found was that more police officers reduce violent and property crime, with a 10% increase in police officers reducing violent crime by roughly 5%. Levitt's paper is a clever application of IV that gets around the endogeneity problem and takes correlation one step further into causation through the use of exogenous election timing. However, these methods are not without limitations. IV analysis, for instance, hinges on finding a valid instrument: something that affects the independent variable (e.g., police numbers) but has no direct effect on the outcome (e.g., crime) other than through that variable. Finding such instruments can be extremely challenging, and weak or invalid instruments can lead to biased or misleading results. Despite these challenges, careful causal inference allows researchers to better understand the true drivers behind complicated relationships. In a world where influencers, media outlets, and even professionals often mistake correlation for causation, a critical understanding of these concepts is an essential skill for navigating data, as well as for driving impactful change in society by exploring the true relationships behind different phenomena.

Written by George Chant

Related article: Correlation between HDI and mortality rate

REFERENCE

Steven D. Levitt (1997). "Using Electoral Cycles in Police Hiring to Estimate the Effect of Police on Crime". American Economic Review 87.3, pp. 270–290.
Herpes vs devastating skin disease
Last updated: 09/07/25, 14:16. Published: 06/01/24, 11:14
From foe to ally

This is article no. 3 in a series on rare diseases. Next article: Epithelioid hemangioendothelioma. Previous article: Breast cancer in males.

Have you ever plucked loose skin near your nail, ripping off a tiny strip of good skin too? Albeit very small, that wound can be painful. Now imagine that it is not just a little strip that peels off, but an entire sheet. And it does not detach only when pulled, but at the slightest touch. Even a hug opens wounds, even a caress brings you pain. This is life with recessive dystrophic epidermolysis bullosa (RDEB), the most severe form of dystrophic epidermolysis bullosa (DEB).

Herpes becomes a therapy

DEB is a rare genetic disease of the skin that affects 3 to 10 individuals per million people (prevalence is hard to nail down for rare diseases). A cure is still far off, but there is good news for patients. Last May, the US Food and Drug Administration (FDA) approved Vyjuvek (beremagene geperpavec) to treat skin wounds in DEB. Clinical studies showed that it speeds up healing and reduces pain. Vyjuvek is the first gene therapy for DEB. It is manufactured by Krystal Biotech and, get this, it is a tweaked version of the herpes virus. Yes, you got that right: the virus causing blisters and scabs has become the primary ally against a devastating skin disease. This approval is a milestone for gene therapies, as Vyjuvek is the first gene therapy:
- based on the herpes virus,
- applied to the skin as a gel,
- approved for repeated use.
This article describes how DEB, and especially RDEB, affects the skin and wreaks havoc on the body; the following article will explain how Vyjuvek works.

DEB disrupts skin integrity

We carry around six to nine pounds of skin.
Yet we often forget its importance: it stops germs and UV rays, softens blows, regulates body temperature and makes us sensitive to touch. Diseases that compromise the skin are therefore devastating. These essential functions rely on the organisation of the skin in three layers: epidermis, dermis and hypodermis (Figure 1). Typically, a Velcro strap of the protein collagen VII firmly anchors the epidermis to the dermis. The gene COL7A1 contains the instructions on how to produce collagen VII. In DEB, mutations in COL7A1 result in the production of a faulty collagen VII. As the Velcro strap is weakened, the epidermis becomes loosely attached to the dermis. Mutations in one copy of COL7A1 cause the dominant form of the disease (DDEB); mutations in both copies cause RDEB. With one copy of the gene still functional, the skin still produces some collagen VII; when both copies are mutated, little to no collagen VII is left. Therefore, RDEB is more severe than DDEB. In people with RDEB, the skin can slide off at the slightest touch and even gentle rubs can cause blisters and tears (Figure 2).

Living with RDEB

Life with RDEB is gruelling and life expectancy does not exceed 30 years. Wounds are very painful, slow to heal and easily infected. The risk of developing an aggressive skin cancer is higher. The constant scarring can cause limb deformities. In addition, blisters can appear in the mouth, oesophagus, eyes and other organs. There is no cure for DEB for now; treatments can only improve the quality of life. Careful dressing of wounds promotes healing and prevents infections. Painkillers are used to ease pain. Special diets are required. And, to no one's surprise, physical activities must be avoided.

Treating RDEB

Over the past decade, advances in cell and genetic engineering have sparked the search for a cure. Scientists have explored two main approaches to restore the production of collagen VII in the skin.
The first approach is based on transferring skin cells able to produce collagen VII. Despite promising results, this approach treats only tiny patches of skin, requires treatment in highly specialised centres and may cause cancer. The second approach is the one Vyjuvek followed. Scientists place the genetic information to make collagen VII in a modified virus and apply it to a wound. There, the virus infects skin cells, providing them with a new COL7A1 gene to use. These cells now produce a functional collagen VII and can patch the damage up. We already know which approach came out on top. Vyjuvek speeds up the healing of wounds as big as a smartphone. Professionals can apply it in hospitals, clinics or even at the patient's home. And it uses a technology that does not cause cancer. But how does Vyjuvek work? And why did scientists choose the herpes virus to build Vyjuvek? We will find the answers in the following article. And since perfection does not belong to biology, we will also discuss the limitations of this remarkable gene therapy.

NOTES:

1. DEB is part of a group of four inherited conditions, collectively named epidermolysis bullosa (EB), where the skin loses integrity. EB is also known as "Butterfly syndrome" because the skin becomes as fragile as a butterfly's wing. These conditions are EB simplex, junctional EB, dystrophic EB and Kindler EB.

2. Most gene therapies are based on modified, or recombinant in science jargon, adeno-associated viruses, which I reviewed for Scientia News.

3. Over 700 mutations have been reported. They disrupt collagen VII and its function with various degrees of severity. Consequently, RDEB and DDEB display several clinical phenotypes.

4.
Two studies have adopted this approach: in the first study, Siprashvili and colleagues (2016) grafted ex vivo retrovirally-modified keratinocytes, the main cell type in the epidermis, over the skin of people with RDEB; in the second study, Lwin and colleagues (2019) injected ex vivo lentivirally-modified fibroblasts into the dermis of people with RDEB. Written by Matteo Cortese, PhD Related article: Ehlers-Danlos syndrome
- Unleashing the power of the stars: how nuclear fusion holds the key to tackling climate change | Scientia News
Unleashing the power of the stars: how nuclear fusion holds the key to tackling climate change Last updated: 14/07/25, 15:08 Published: 30/04/23, 10:55 Looking at the option of nuclear fusion to generate renewable energy

Imagine a world where we have access to a virtually limitless and clean source of energy, one that doesn't emit harmful greenhouse gases or produce dangerous radioactive waste. A world where our energy needs are met without contributing to climate change. This may sound like science fiction, but it could become a reality through the power of nuclear fusion. Nuclear fusion, often referred to as the "holy grail" of energy production, is the process of merging light atomic nuclei to form a heavier nucleus, releasing an incredible amount of energy in the process. It's the same process that powers the stars, including our very own sun, and holds the potential to revolutionize the way we produce and use energy here on Earth. Nuclear fusion occurs at high temperature and pressure when two light nuclei (e.g. deuterium and tritium) merge to form helium. The merger releases excess energy and a neutron. This energy can then be harvested in the form of heat to produce electricity.

Progress in the field of building a nuclear fusion reactor has been slow, but despite the challenges some promising technologies and approaches have been developed. Notable approaches to nuclear fusion research include:

1. Magnetic Confinement Fusion (MCF): In MCF, strong magnetic fields are used to confine the plasma, the hot, ionised gas in which nuclear fusion occurs, while it is heated to extreme temperatures. One of the most promising MCF devices is the tokamak, a donut-shaped device that uses strong magnetic fields to confine the plasma.
The International Thermonuclear Experimental Reactor (ITER), currently under construction in France, is a large-scale tokamak project that aims to demonstrate the scientific and technical feasibility of nuclear fusion as a viable energy source.

2. Inertial Confinement Fusion (ICF): In ICF, high-energy lasers or particle beams are used to compress and heat a small pellet of fuel, causing it to undergo nuclear fusion. This approach is being pursued at facilities such as the National Ignition Facility (NIF) in the United States, which has made significant progress in achieving fusion ignition, although it still faces challenges in achieving net energy gain. In December 2022, the US lab reported that, for the first time, more energy was released than was delivered as input.

3. Compact Fusion Reactors: There are also efforts to develop compact fusion reactors, which are smaller and potentially more practical for commercial energy production. These include technologies such as the spherical tokamak and the compact fusion neutron source, which aim to achieve high energy gain in a smaller and more manageable device.

While nuclear fusion holds immense promise as a clean and sustainable energy source, significant challenges must still be overcome before it becomes a practical reality. In nature, nuclear fusion is observed in stars; recreating such conditions on Earth is an immense challenge. Very high temperatures and pressures are required to overcome the electrostatic repulsion between nuclei and fuse them together. Moreover, to actually use the energy, the reaction must be sustained, and currently more energy is put in than comes out. Lastly, materials and technology also pose challenges in the development of nuclear fusion.
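The energy released per deuterium-tritium fusion event can be checked with a back-of-the-envelope calculation: the mass lost in the reaction (the mass defect) is converted to energy via E = Δm·c². A minimal sketch, using standard reference values for the atomic masses in unified mass units (u):

```python
# D + T -> He-4 + n: the products weigh slightly less than the reactants,
# and the missing mass appears as kinetic energy of the helium nucleus
# and the neutron. 1 u is equivalent to about 931.494 MeV of energy.
MASS_U = {"D": 2.014102, "T": 3.016049, "He4": 4.002602, "n": 1.008665}
U_TO_MEV = 931.494  # energy equivalent of one atomic mass unit, in MeV

mass_defect = (MASS_U["D"] + MASS_U["T"]) - (MASS_U["He4"] + MASS_U["n"])
energy_mev = mass_defect * U_TO_MEV
print(f"{energy_mev:.1f} MeV")  # ≈ 17.6 MeV per fusion event
```

Roughly 17.6 MeV per reaction is several million times the energy released per atom in chemical combustion, which is why a small amount of fusion fuel goes such a long way.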
With high temperatures and high-energy particles, the inside of a nuclear fusion reactor is a harsh environment; alongside the development of sustained fusion, materials and technology that can withstand such conditions are also needed.

Despite the many challenges, nuclear fusion has the potential to be a game changer, not only in the fight against climate change but also in providing access to cheap and clean energy globally. Unlike many forms of energy used today, fusion does not emit any greenhouse gases and, compared to nuclear fission, it is stable and does not produce long-lived radioactive waste. Furthermore, deuterium, one of the fuels for fusion, is abundant in the ocean, whereas tritium may need to be synthesised at the start; once fusion is running, the reaction can breed its own tritium, making the fuel cycle self-sustaining. When the challenges are weighed against the benefits of nuclear fusion, along with the new opportunities it would unlock economically and in scientific research, it is clear that the path to a more successful and clean future lies in the development of nuclear fusion. While there are many obstacles to overcome, the progress made in recent years in fusion research and development is promising. With the construction of the ITER project, along with the first recordings of higher energy output from the US NIF programme, nuclear fusion could become a possibility in the not-too-distant future.

In conclusion, nuclear fusion holds the key to addressing the global challenge of climate change. It offers a clean, safe, and sustainable energy source that has the potential to revolutionize our energy systems and reduce our dependence on fossil fuels. With continued research, development, and investment, nuclear fusion could become a reality and help us build a more sustainable and resilient future for our planet. It's time to unlock the power of the stars and harness the incredible potential of nuclear fusion in the fight against climate change.
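The NIF milestone mentioned above is usually expressed as a scientific gain factor Q, the ratio of fusion energy out to driver energy in. A minimal sketch, assuming the widely reported figures from the December 2022 shot (about 2.05 MJ delivered by the lasers, 3.15 MJ of fusion energy released):

```python
# Scientific gain Q = fusion energy out / driver energy in.
# Q > 1 is "scientific breakeven". The figures below are the
# widely reported values from NIF's December 2022 shot.
laser_energy_mj = 2.05   # energy delivered to the target by the lasers (MJ)
fusion_yield_mj = 3.15   # fusion energy released by the fuel pellet (MJ)

q = fusion_yield_mj / laser_energy_mj
print(f"Q ≈ {q:.2f}")  # greater than 1: more fusion energy out than laser energy in
```

Note that this counts only the laser energy reaching the target; the electricity drawn from the grid to power the lasers was far larger, so engineering breakeven, where a plant produces more electricity than it consumes, remains a distant goal.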
Written by Zari Syed Related articles: Nuclear medicine / Geoengineering / The silent protectors / Hydrogen cars










