
Search Index

355 results found

  • The MAPK/ERK signalling pathway in cancer | Scientia News

    Dysregulation of this pathway occurs in many different types of cancers

    The MAPK/ERK signalling pathway in cancer
    Last updated: 24/02/25, 11:29. Published: 20/02/25, 08:00

    Introduction

    The mitogen-activated protein kinase (MAPK) signalling pathway is an important pathway in apoptosis, proliferation, differentiation, angiogenesis and metastasis. It is a protein kinase pathway (it acts through phosphorylation) with three to five tiers of kinases, and is known to be activated via Ras, KC-mediated (Kupffer cells/liver macrophages), Ca2+, or G protein-coupled receptor routes. The MAPK/ERK pathway, also known as the Ras-Raf-MEK-ERK pathway, is conserved in mammals, and dysregulation of this pathway occurs in many different types of cancer.

    MAPK/ERK function

    Ras (a GTPase) activates Raf (a serine/threonine kinase), which activates MEK1/2 (tyrosine and serine/threonine kinases) and in turn ERK1/2 (serine/threonine kinases), which control certain transcription factors. ERK1/2 also phosphorylates various substrates in the cytoplasm (not shown). The result is altered gene expression, which can drive apoptosis, cell cycle regulation, differentiation, proliferation, and more (Fig. 1). It is estimated that ERK1/2 has more than 150 target substrates, direct or indirect. Furthermore, Ras and RAF each have several subtypes with different functions. Ras has four subtypes, the GTPases HRAS, KRAS4A, KRAS4B, and NRAS, with KRAS being the form most commonly found in human cancers. RAF has three subtypes in humans, the kinases ARAF, BRAF, and CRAF. Ras is activated when GRB2 (growth-factor-receptor-bound protein 2) binds to SOS (son of sevenless); this complex moves to the cell membrane upon activation of a transmembrane receptor, such as EGFR (epidermal growth factor receptor).
    SOS relays the signal from the receptor to RAS and aids the conversion of RAS-GDP to RAS-GTP. This switches 'on' RAF, which leads to the phosphorylation of MEK and ERK (Fig. 1). ERK is then able to move into the nucleus and alter the expression of genes such as CREB, MYC, FOS, MSK, ELK, and JUN, which are involved in processes such as metabolism, proliferation, angiogenesis (formation of blood vessels), haematopoiesis (formation of blood cells), wound healing, differentiation, inflammation, and cancer. ERK can also activate other substrates in the cytoplasm, such as BIM, RSK, MNK, and MCL, which are involved in processes such as apoptosis and blood pressure regulation. A regular level of ERK expression is needed to activate genes involved in the cell cycle and to inhibit negative cell cycle control. ERK phosphorylates Cyclin D and Cdk4/6, which are bound together and help the cell move from G1 (gap) to the S phase (DNA synthesis/repair) of the cell cycle.

    MAPK/ERK pathway in cancer

    The MAPK/ERK pathway has been linked with many cancers, including colon, thyroid, melanoma, pancreatic, and lung cancers, and glioblastoma. Mutations in epidermal growth factor receptor (EGFR), Ras, and Raf are well known to cause cancer: an estimated 33% of cancers contain Ras mutations, roughly 8% are caused by Raf mutations, and an estimated 85% show elevated MEK activity. The MAPK/ERK pathway has also been shown to interact with the PI3K/Akt pathway, which controls the cell cycle and increases cell proliferation, an important factor in tumourigenesis (tumour initiation).

    Regulation of the MAPK/ERK pathway

    The pathway is subject to negative feedback: ERK1/2 phosphorylates SOS, which disrupts the RAS-RAF link, and ERK also inhibits MEK via the phosphorylation of BRAF and CRAF.
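    The relay described above is strictly sequential, with ERK's phosphorylation of SOS closing a negative feedback loop. As a rough illustration only (the boolean on/off activation rules are a deliberate simplification, not a quantitative model of kinase kinetics), the cascade can be sketched in Python:

```python
# Toy sketch of the Ras-Raf-MEK-ERK relay with ERK -> SOS negative feedback.
# Boolean activation is an illustrative simplification of graded kinase activity.

def mapk_cascade(growth_signal: bool, erk_feedback: bool = False) -> dict:
    """Propagate a receptor signal down the kinase tiers.

    erk_feedback models the negative feedback described in the text:
    active ERK phosphorylates SOS, uncoupling the receptor from RAS.
    """
    sos_active = growth_signal and not erk_feedback  # ERK feedback disables SOS
    ras_gtp = sos_active        # SOS promotes the RAS-GDP -> RAS-GTP exchange
    raf_active = ras_gtp        # RAS-GTP switches RAF 'on'
    mek_active = raf_active     # RAF phosphorylates MEK1/2
    erk_active = mek_active     # MEK phosphorylates ERK1/2
    return {
        "RAS-GTP": ras_gtp,
        "RAF": raf_active,
        "MEK": mek_active,
        "ERK": erk_active,      # active ERK can then enter the nucleus
    }

# Growth signal with no feedback: the whole cascade fires
print(mapk_cascade(growth_signal=True))
# Once ERK feedback is engaged, SOS is uncoupled and the cascade shuts off
print(mapk_cascade(growth_signal=True, erk_feedback=True))
```

    The one-line-per-tier structure mirrors why constitutively active mutants (e.g. oncogenic KRAS or BRAF) are so potent: activating any upstream tier is enough to switch on everything downstream.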
    There are inhibitors for Ras, Raf, MEK, and ERK, but not all of them work well or are free of issues. ERK is problematic in that its ATP-binding site closely resembles those of cell cycle proteins, making it harder to inhibit selectively. Ras is also difficult to target because of its high GTP-binding affinity, the abundance of cellular GTP, and its lack of suitable binding pockets. The main focus, therefore, is currently on Raf/MEK inhibition. Raf inhibitors include sorafenib, vemurafenib, encorafenib, and dabrafenib (used against specific BRAF mutations). MEK inhibitors include trametinib, cobimetinib, binimetinib, and selumetinib (used against specific mutations in Ras and Ras/Raf). Because negative feedback mechanisms tightly control the MEK/ERK pathway, great care is taken with inhibitor doses: if doses are too low, the negative feedback loops are activated, which can lead to drug resistance and poor therapeutic outcomes.

    Conclusion

    The MAPK/ERK pathway is essential for several cellular processes, such as apoptosis, cell cycle regulation, differentiation, and proliferation, and it therefore has a critical role in tumourigenesis. Raf and MEK in particular are susceptible to inhibition, which has led to several drugs for use in various types of cancer. Further clinical trials are in progress, and these will hopefully lead to additional therapies for other cancers involving this pathway.

    Written by Eleanor R Markham

    Related articles: HIPPO signalling pathway / Thyroid cancer

  • Vertigo | Scientia News

    In some cases, the exact cause of vertigo remains unidentified, highlighting the complexity of diagnosis

    Vertigo
    Last updated: 27/06/25, 14:06. Published: 03/07/25, 07:00

    Vertigo is a symptom characterised by the sensation of spinning or movement, affecting either the individual or their surroundings. Unlike dizziness, which involves a floating sensation, or imbalance, which reflects unsteadiness, vertigo conveys a distinct sense of motion. While it is not a condition in itself, vertigo often indicates an underlying issue and can range from mild to debilitating, significantly impairing balance and daily activities.

    Physiology of vertigo

    Physiologically, vertigo is primarily linked to the inner ear and the vestibular system, which is responsible for maintaining balance and spatial orientation. The vestibular apparatus consists of semicircular canals and otolith organs, which detect angular and linear movements, respectively. Dysfunction in these structures, or in their neural pathways to the brainstem and cerebellum, can disrupt normal sensory input, causing vertigo. Symptoms (Figure 1) may include a spinning sensation, nausea, vomiting, nystagmus (involuntary eye movements), sweating, and difficulty with balance. Triggers vary widely and may include head movements, changes in position, or even psychological stress. The underlying causes can be peripheral, such as inner ear disorders, or central, involving the brain or central nervous system.

    Causes and prevalence

    Vertigo is particularly common among middle-aged and older adults, in whom it presents a considerable risk of falls and associated injuries. This demographic is especially vulnerable due to age-related changes in the vestibular system, such as a decline in vestibular hair cells and neurons, as well as alterations in central pathways.
    Vestibular disorders are among the most frequent causes of vertigo episodes in the elderly, often contributing to a cycle of psychological distress and physical limitation. Anxiety and depressive syndromes further exacerbate this cycle by increasing fear of attacks and falls, ultimately limiting daily activities and lowering perceived quality of life. Benign paroxysmal positional vertigo (BPPV) is the most common cause of vertigo and features in multiple studies within the literature (Figure 2). BPPV is typically triggered by changes in head position, leading to brief episodes of intense vertigo. Despite its prevalence, management can be challenging because of the nonspecific nature of symptoms and the diverse underlying causes. Polypharmacy, the use of multiple medications, has also emerged as a significant factor in vertigo among older adults. Prescriptions involving several drugs, particularly antihypertensives and sedative hypnotics, have been linked to an increased likelihood of vertigo, so careful assessment of medication interactions and side effects during medical consultations is essential. Metabolic disorders, such as diabetes and hypoglycaemia, also contribute to vertigo in some individuals. However, in a portion of cases, the exact cause remains unidentified, highlighting the complexity of diagnosis.

    Conclusion

    As one of the most common and disabling symptoms in the elderly, vertigo requires comprehensive and individualised care. Understanding its underlying physiological mechanisms, and recognising multifactorial influences such as medication use, psychological health, and metabolic disorders, is essential for effective management. By adopting an integrated approach that prioritises accurate diagnosis and targeted interventions, clinicians can improve both symptom control and overall quality of life for individuals affected by vertigo.
    Further research is needed to enhance treatment strategies and address the remaining gaps in knowledge.

    Written by Maria Z Kahloon

  • Why South Asian genes remember famine | Scientia News

    Famine-induced epigenetic changes and public health strategies in affected populations

    Why South Asian genes remember famine
    Last updated: 18/09/25, 08:44. Published: 23/01/25, 08:00

    Our genes are often thought of as a fixed blueprint, but what if our environment could change how they work? This is the intriguing idea behind epigenetics, a field that shows how our environment, combined with the body's adaptive responses for survival, can influence gene expression without altering our DNA. In South Asia, famines such as the infamous Bengal Famine of 1943 caused immense suffering, and these hardships may have triggered epigenetic changes that continue to affect generations. Today, South Asians face an increased risk of developing Type 2 diabetes by age 25, whereas White Europeans generally encounter this risk around age 40. What is driving this difference? This article explores the science behind these epigenetic changes, their impact on the descendants of famine survivors, and how these insights can shape public health, policy, and research.

    The legacy of historical famines

    In 1943, the Bengal Famine claimed around three million lives. Nobel laureate Amartya Sen argues that the severity of the famine was not merely a result of prior natural disasters and crop disease outbreaks; instead, it was primarily driven by wartime inflation, speculative buying, and panic hoarding, which disrupted food distribution across the Bengal region. Consequently, for the average Bengali citizen, death from starvation, disease, and malnutrition became widespread and inevitable. The impact of the famine extended well beyond the immediate loss of life. Dr Mubin Syed, a radiologist specialising in vascular and obesity medicine, emphasises that these famines have left a lasting mark on the health of future generations.
    Dr Syed explains that South Asians, having endured numerous famines, have inherited "starvation-adapted" traits characterised by increased fat storage. As a result, the risk of cardiovascular disease, diabetes, and obesity is heightened in their descendants. This tendency towards fat storage is believed to be closely tied to epigenetic factors, which play a crucial role in how these traits are passed down through generations.

    Epigenetic mechanisms and their impact

    These inherited traits are shaped by complex epigenetic mechanisms, which regulate gene expression in response to environmental stressors like famine without altering the underlying DNA sequence. DNA methylation, the addition of small chemical groups to DNA, plays a crucial role in regulating gene expression. When a gene is 'on', it is actively transcribed into messenger RNA (mRNA), resulting in the synthesis of proteins such as enzymes that regulate energy metabolism or hormones like insulin that manage blood sugar levels. Conversely, when a gene is 'off', it is not transcribed, leading to a deficiency of these essential proteins. During periods of famine, increased DNA methylation can enhance the body's ability to conserve and store energy by altering the activity of metabolism-related genes. Epigenetic inheritance, a phenomenon in which some epigenetic tags escape the usual reprogramming process and persist across generations, plays a crucial role in how famine-induced traits are passed down. Typically, reproductive cells undergo a reprogramming phase in which most epigenetic tags are erased to reset the genetic blueprint. However, certain DNA methylation patterns can evade this erasure and remain attached to specific genes in the germ cells, the cells that develop into sperm and eggs. These persistent modifications can influence gene expression in the next generation, affecting metabolic traits and responses to environmental stressors.
    This means the metabolic adaptations seen in famine survivors, such as increased fat storage and altered hormone levels, can be transmitted to their descendants, predisposing them to similar health risks. Research has highlighted how these inherited traits manifest in distinct hormone profiles across different ethnic groups. A study published in Diabetes Care found that South Asians had higher leptin levels (11.82 ng/mL) and lower adiponectin levels (9.35 µg/mL) compared to Europeans, whose leptin levels were 9.21 ng/mL and adiponectin levels were 12.96 µg/mL. Leptin, encoded by the LEP gene, is a hormone that reduces appetite and encourages fat storage. Adiponectin, encoded by the ADIPOQ gene, improves insulin sensitivity and supports fat metabolism. Epigenetic changes, such as DNA methylation in the LEP and ADIPOQ genes, have led to these imbalances, which were advantageous for South Asian populations during times of famine. Elevated leptin levels helped ensure the body could maintain energy reserves for survival, while lower adiponectin levels slowed fat breakdown, preserving stored fat for future use. This energy-conservation mechanism allowed individuals to endure long periods of food scarcity. Remarkably, these epigenetic changes can be passed down to subsequent generations; as a result, descendants continue to exhibit these metabolic traits even in the absence of famine conditions. This inherited imbalance of higher leptin and lower adiponectin leads to a higher predisposition to metabolic disorders. Increased leptin levels can cause leptin resistance, in which the body no longer responds properly to leptin's signals, driving overeating and fat accumulation. Simultaneously, reduced adiponectin weakens the body's ability to regulate insulin and break down fats efficiently, resulting in higher blood sugar levels and greater fat storage. These combined effects heighten the risk of obesity and Type 2 diabetes in South Asian populations today.
    Integrating cultural awareness in health strategies

    Understanding famine-induced epigenetic changes provides a compelling case for rethinking public health strategies in affected populations. While current medicine cannot reverse famine-induced epigenetic changes in South Asians, culturally tailored interventions and preventive measures are crucial to reducing metabolic risks. These should include personalised dietary plans, preventive screenings, and targeted healthcare programmes. For example, the Indian Diabetes Prevention Programme showed that lifestyle changes reduced diabetes risk by 28.5% among high-risk individuals. Equally, policymakers must consider the broader societal factors that contribute to these health risks, and qualitative studies highlight challenges in shifting cultural attitudes. Expectations that women prepare meals in line with traditional norms often limit healthier dietary options. Differing perceptions of physical activity can also complicate efforts to promote healthier lifestyles; for example, a study in East London found that some communities consider prayer sufficient exercise, which adds complexity to changing attitudes.

    Facing our past to secure a healthier future

    As we uncover the long-term effects of environmental stressors like historical famines, it becomes clear that our past is not just a distant memory but an active force shaping our present and future health. Epigenetic changes inherited from South Asian ancestors who endured famine have heightened the risk of metabolic disorders in their descendants. For instance, UK South Asian men have been found to have nearly double the risk of coronary heart disease (CHD) compared to White Europeans. Consultant cardiologist Dr Sonya Babu-Narayan has stated, "Coronary heart disease is the world's biggest killer and the most common cause of premature death in the UK." With over 5 million South Asians in the UK alone, this stark reality requires immediate action.
    We must not only address the glaring gaps in scientific research but also develop targeted public health policies to tackle these inherited health risks. These traits are not relics of the past; they are living legacies that, without swift intervention, will continue to affect generations to come. To truly address the inherited health risks South Asians face, we must go beyond surface-level awareness and commit to long-term, systemic change. Increasing funding for research that directly focuses on the unique health challenges within this population is non-negotiable. Equally crucial are culturally tailored public health initiatives that resonate with the affected communities, alongside comprehensive education programmes that empower individuals to take control of their health. These steps are not just about improving outcomes; they are about breaking a cycle. The question, therefore, is not simply whether we understand these epigenetic changes, but whether we have the resolve to confront their full implications. Can we muster the political will needed to confront these inherited risks? Can we unite our efforts to stop these risks from affecting the health of entire communities? The cost of inaction is not measured only in statistics; it will be felt in the lives lost and the potential unrealised. The time to act is now.

    Written by Naziba Sheikh

    Related articles: Epigenetics / Food deserts and malnutrition / Mental health in South Asian communities / Global health injustices: Kashmir, Bangladesh

  • Gatekeepers of pain: how your body decides what hurts | Scientia News

    Explaining the Pain Gate Theory

    Gatekeepers of pain: how your body decides what hurts
    Last updated: 18/09/25, 08:40. Published: 18/09/25, 07:00

    Pain is an unpleasant bodily sensation that is usually linked to actual or potential tissue damage. It often acts as the body's warning system, protecting us from further harm. Now picture this: you hit your leg and it hurts, but then you instinctively start rubbing it, and the pain begins to ease. Why does that happen? That's where the Pain Gate Theory (also known as the Gate Theory of Pain, or the Gate Control Theory of Pain) comes in. It is one of the most fascinating ideas in pain science because it explains that pain isn't just about injury; it's also about how our nervous system processes it. Pain can vary greatly between individuals, and even in the same person under different circumstances, because pain is not just a physical experience but is also influenced by emotions, attention, and context. The Pain Gate Theory was first proposed in 1965 by Ronald Melzack and Patrick Wall to explain this phenomenon. It states that a stimulus must travel through the substantia gelatinosa in the dorsal horn of the spinal cord, the transmission cells, and the fibres in the dorsal column in order to have an effect. The substantia gelatinosa acts as a 'gate', mediating which signals are able to pass through the nervous system to the brain. Whether the gate opens or closes is influenced by an array of factors.

    How does it work?

    The figure below depicts the relationships in the Pain Gate Theory. The gate mechanism is influenced by the activity of the larger-diameter fibres (A-beta), which usually inhibit transmission, and the small-diameter fibres (A-delta and C), which increase transmission.
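    The balance between excitatory small fibres and inhibitory large fibres can be sketched as a toy model (the activity values and the 0.5 threshold below are arbitrary illustrative choices, not physiological measurements):

```python
# Minimal sketch of the gate mechanism in the Pain Gate Theory.
# Fibre activities are arbitrary levels in [0, 1]; the threshold is illustrative.

def pain_signal_reaches_brain(small_fibre: float, large_fibre: float,
                              threshold: float = 0.5) -> bool:
    """Small fibres (A-delta/C) push the gate open; large fibres (A-beta)
    push it closed via the substantia gelatinosa."""
    net_drive = small_fibre - large_fibre  # excitation minus inhibition
    return net_drive > threshold           # gate 'open' only above threshold

# A knock to the leg with no rubbing: the gate stays open
print(pain_signal_reaches_brain(small_fibre=0.9, large_fibre=0.0))  # True
# Rubbing the leg recruits A-beta fibres: the gate closes
print(pain_signal_reaches_brain(small_fibre=0.9, large_fibre=0.7))  # False
```

    The point of the sketch is only that pain perception depends on the net balance of inputs at the gate, not on the small-fibre signal alone, which is why adding a non-painful touch input can switch the outcome.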
    Take our analogy from earlier about rubbing your leg: when you do this, the large fibres carrying non-painful stimuli like touch and pressure are activated. This causes the gate to be 'closed', which blocks the pain signals being transmitted by the small fibres. This concept opens doors to viewing pain holistically: pain is influenced by touch, thoughts, and emotions, which explains why you may not notice pain as much when you're very excited about something, or why placebos have been proven to work in some cases. In the clinical sphere, this theory has led to many pain management techniques, for example transcutaneous electrical nerve stimulation (TENS), which selectively stimulates A-beta fibres, consequently inhibiting A-delta and C fibres and preventing pain-related signals from reaching the brain. It has also been utilised in physiotherapy, labour, and chronic pain treatments. One main limitation of this model is its inability to explain certain types of pain, such as phantom limb pain, since it relies on the assumption that pain requires an input from a limb to the spinal cord. This has led to the development of more advanced models, such as the neuromatrix model, which acknowledges that the brain can create pain on its own.

    In conclusion, the Pain Gate Theory was groundbreaking for our understanding of how pain works. Understanding pain as a brain-and-body experience opens the door to innovative treatments that may one day make pain more manageable, or even preventable.

    Written by Blessing Amo-Konadu

    Related articles: Ibuprofen / Anthrax toxin to treat pain

  • Nanogels: the future of smart drug delivery | Scientia News

    Nanogels are tiny, water-swollen polymer networks that encapsulate therapeutic agents

    Nanogels: the future of smart drug delivery
    Last updated: 17/07/25, 10:54. Published: 17/07/25, 07:00

    Nanomedicine is a rapidly advancing field, with nanogels emerging as promising innovations for drug delivery. Nanogels are soft nanoscale hydrogels that are transforming how we deliver drugs and treat diseases. Whilst hydrogels themselves have long been used in biomedical applications such as tissue engineering and wound healing, their relatively large sizes (above 100 micrometres) limit their ability to interact with cells and cross biological barriers. Nanogels are thousands of times smaller, and offer unique advantages as a result.

    What are nanogels?

    Nanogels are tiny, water-swollen polymer networks made up of crosslinked polymer chains forming a 3D matrix. They can encapsulate therapeutic agents inside their porous core-shell structure. This swelling allows nanogels to carry payloads such as drugs, proteins, and nucleic acids; these cargo materials are protected from degradation in the body whilst enabling controlled and targeted delivery. Due to their small size, nanogels can penetrate tissues and even enter cells, overcoming the limitations of larger hydrogels. The surface of nanogels can also be engineered for specificity, allowing precise targeting of drugs to receptors on diseased cells or inflamed tissues.

    Advantages over other nanocarriers

    Compared to liposomes and polymeric micelles, nanogels have a larger inner surface, which means they can carry more payload. This higher loading capacity improves therapeutic efficiency whilst reducing the risk of side effects caused by off-target drug release.
    Nanogels also benefit from the enhanced permeability and retention (EPR) effect: a phenomenon in which nanoparticles naturally accumulate in tumour or inflamed tissues due to leaky blood vessels, improving drug delivery to targeted disease sites.

    Stimuli-responsive 'smart' nanogels

    A key feature of nanogels is their stimuli responsiveness, or ability to act as 'smart' materials. Nanogels can be designed to respond to environmental triggers such as changes in pH, temperature, light, redox conditions, pressure, and more. This responsiveness enables controlled release of drugs exactly when and where they are needed. For example, thermoresponsive nanogels can change their structure at body temperature or when exposed to localised heating, making them ideal for applications like wound healing and cancer therapy. This controlled release prevents premature drug leakage, reduces systemic toxicity, and improves the overall precision of treatment.

    The future of nanogels in medicine

    Nanogels have huge potential as customisable drug delivery systems targeting specific diseases. They are biocompatible, stable, stimuli-responsive, and have high drug-loading capacities; these properties combined make them a powerful tool in applications such as targeted drug delivery and gene therapy. As nanomedicine research progresses, nanogels are set to revolutionise healthcare with smarter, safer, and more targeted therapies.

    Written by Saanchi Agarwal

    Related articles: Nanomedicine / Nanoparticles and diabetes treatment / Nanoparticles and health / Nanocarriers / Silicon hydrogel

    REFERENCES

    L. Blagojevic and N. Kamaly, Nanogels: A chemically versatile drug delivery platform, Nano Today, 2025, 61, 102645.
    F. Carton, M. Rizzi, E. Canciani, G. Sieve, D. Di Francesco, S. Casarella, L. Di Nunno and F. Boccafoschi, Use of Hydrogels in Regenerative Medicine: Focus on Mechanical Properties, Int. J. Mol. Sci., 2024, 25, 11426.
    N. Rabiee, S. Hajebi, M. Bagherzadeh, S. Ahmadi, M. Rabiee, H. Roghani-Mamaqani, M. Tahriri, L. Tayebi and M. R. Hamblin, Stimulus-Responsive Polymeric Nanogels as Smart Drug Delivery Systems, Acta Biomater., 2019, 92, 1–18.
    A. Vashist, G. P. Alvarez, V. A. Camargo, A. D. Raymond, A. Y. Arias, N. Kolishetti, A. Vashist, P. Manickam, S. Aggarwal and M. Nair, Recent advances in nanogels for drug delivery and biomedical applications, Biomater. Sci., 2024, 12, 6006–6018.
    K. S. Soni, S. S. Desale and T. K. Bronich, Nanogels: an overview of properties, biomedical applications and obstacles to clinical translation, J. Control. Release, 2016, 240, 109–126.
    A. Bordat, T. Boissenot, J. Nicolas and N. Tsapis, Thermoresponsive polymer nanocarriers for biomedical applications, Adv. Drug Deliv. Rev., 2019, 138, 167–192.
    T. Alejo, L. Uson, G. Landa, M. Prieto, C. Yus Argón, S. Garcia-Salinas, R. de Miguel, A. Rodríguez-Largo, S. Irusta, V. Sebastian, G. Mendoza and M. Arruebo, Nanogels with High Loading of Anesthetic Nanocrystals for Extended Duration of Sciatic Nerve Block, ACS Appl. Mater. Interfaces, 2021, 13, 17220–17235.
    S. V. Vinogradov, Nanogels in The Race for Drug Delivery, Nanomed., 2010, 5, 165–168.

  • Why brain injuries affect children and adults differently | Scientia News

    The main difference between children and adults lies in what needs to be rebuilt

Why brain injuries affect children and adults differently
Last updated: 12/11/25, 12:09 Published: 13/11/25, 08:00

When we think about a brain injury, it is easy to assume that the same thing happens in everyone: a bump to the head, swelling and, hopefully, a recovery. In reality, things aren’t quite that simple. A child’s brain is not a smaller version of an adult’s; it is still developing, which makes it both incredibly adaptable and, at the same time, especially vulnerable.

Smaller bodies, bigger risks

Although the brain’s basic reaction to injury is similar in children and adults, injuries in younger people tend to cause more widespread and severe damage. This is due to differences in anatomical development. Children’s heads are proportionally larger compared to the rest of their bodies, and their neck muscles are much weaker than those of adults. This means that when a child falls or is knocked, their head can move suddenly and forcefully, placing extra strain on the brain. On top of that, children’s brains have a higher water content and are softer in texture, which makes them more vulnerable to rotational forces and acceleration-deceleration injuries. These types of movement can lead to diffuse axonal injury, where nerve fibres are torn across large areas, and to cerebral swelling, both of which are less common in adults experiencing similar trauma. A clear example of this vulnerability is seen in abusive head trauma: when an infant is shaken, their softer skull and brain structure can lead to a combination of skull fractures, internal bleeding and swelling. Sadly, these injuries are often linked to very poor outcomes.
The double-edged sword of brain plasticity

One of the most remarkable things about the young brain is its plasticity: its ability to reorganise itself and form new connections after injury. This flexibility often means that children recover some functions, such as movement or daily activities, more quickly than adults do in the early months after a brain injury. However, this adaptability has limits. During childhood, the brain is constantly developing new skills and abilities, and if an injury occurs during one of these critical periods, it can interrupt processes essential for normal development. This means that difficulties might not appear straight away. A child could seem to recover well at first but then struggle later, when their brain is expected to handle more complex tasks such as problem-solving or emotional regulation. Over time, recovery often plateaus, and children may continue to face long-term challenges with learning, behaviour and social interaction. Research also shows that injury severity is a major factor in long-term outcomes: children who suffer severe traumatic brain injuries are more likely to experience lower academic performance and, later in life, face higher rates of unemployment or lower-paid work compared with their peers.

Behaviour, learning and life after injury

Brain injuries in childhood can also affect behaviour and mental health. Conditions such as ADHD are especially common following injury, affecting between 20 and 50% of children. These difficulties can make returning to school and social life far more challenging. Children from lower socioeconomic backgrounds often experience extra barriers, including limited access to rehabilitation and educational support, which can increase the risk of social isolation and mental health difficulties. Children are also more likely than adults to develop secondary brain conditions, such as epilepsy, after an injury, which adds further complexity to their recovery.
Why recovery is not the same

The main difference between children and adults lies in what needs to be rebuilt. Adults are generally trying to re-learn skills they already had, while children are still learning those skills for the first time. That makes recovery a much more delicate and unpredictable process. Moreover, most rehabilitation is concentrated in the first few months after the injury, but children’s challenges often become clearer years later, when their brains, and the demands placed on them, have developed further.

In summary

The developing brain is both fragile and flexible. While its biological features make it more prone to injury, its capacity for plasticity allows for impressive short-term recovery. Yet the same developmental processes that support growth also make it more vulnerable to long-term disruption. Injuries sustained during childhood can alter the course of brain development, leading to lasting effects on thinking, learning, and behaviour. These consequences can shape a person’s future long after the initial recovery period has ended. Understanding these differences is crucial, not just for doctors, but also for teachers, parents, and anyone supporting a young person recovering from a brain injury.

Written by Alice Greenan

Related articles: Synaptic plasticity / Traumatic Brain Injury (TBI) / Childhood intelligence

REFERENCES

Anderson, V. (2005). Functional Plasticity or Vulnerability After Early Brain Injury? Pediatrics, 116(6), 1374–1382. https://doi.org/10.1542/peds.2004-1728
Anderson, V., Brown, S., Newitt, H., & Hoile, H. (2011). Long-term outcome from childhood traumatic brain injury: Intellectual ability, personality, and quality of life. Neuropsychology, 25(2), 176–184. https://doi.org/10.1037/a0021217
Anderson, V., & Yeates, K. O. (2010). Pediatric Traumatic Brain Injury. Cambridge University Press. https://doi.org/10.1017/cbo9780511676383
Araki, T., Yokota, H., & Morita, A. (2017). Pediatric Traumatic Brain Injury: Characteristic Features, Diagnosis, and Management. Neurologia Medico-Chirurgica, 57(2), 82–93. https://doi.org/10.2176/nmc.ra.2016-0191
Blackwell, L. S., & Grell, R. M. (2023). Pediatric Traumatic Brain Injury: Impact on the Developing Brain. Pediatric Neurology. https://doi.org/10.1016/j.pediatrneurol.2023.06.019
Figaji, A. A. (2017). Anatomical and Physiological Differences between Children and Adults Relevant to Traumatic Brain Injury and the Implications for Clinical Assessment and Care. Frontiers in Neurology, 8(685). https://doi.org/10.3389/fneur.2017.00685
Manfield, J., Oakley, K., Macey, J.-A., & Waugh, M.-C. (2021). Understanding the Five-Year Outcomes of Abusive Head Trauma in Children: A Retrospective Cohort Study. Developmental Neurorehabilitation, 24(6), 1–7. https://doi.org/10.1080/17518423.2020.1869340
Narad, M. E., Kaizar, E. E., Zhang, N., Taylor, H. G., Yeates, K. O., Kurowski, B. G., & Wade, S. L. (2022). The Impact of Preinjury and Secondary Attention-Deficit/Hyperactivity Disorder on Outcomes After Pediatric Traumatic Brain Injury. Journal of Developmental & Behavioral Pediatrics, 43(6), e361–e369. https://doi.org/10.1097/dbp.0000000000001067
Neumane, S., Câmara-Costa, H., Francillette, L., Araujo, M., Toure, H., Brugel, D., Laurent-Vannier, A., Ewing-Cobbs, L., Meyer, P., Dellatolas, G., Watier, L., & Chevignard, M. (2021). Functional outcome after severe childhood traumatic brain injury: Results of the TGE prospective longitudinal study. Annals of Physical and Rehabilitation Medicine, 64(1), 101375. https://doi.org/10.1016/j.rehab.2020.01.008
Parker, K. N., Donovan, M. H., Smith, K., & Noble-Haeusslein, L. J. (2021). Traumatic Injury to the Developing Brain: Emerging Relationship to Early Life Stress. Frontiers in Neurology, 12. https://doi.org/10.3389/fneur.2021.708800

  • Cancer Articles 2 | Scientia News

    Peruse through the current treatment discoveries for one of the deadliest diseases in the world. Learn about the factors that cause tumour growth, metastatic processes and blastomas.

Cancer Articles

You may also like: Biology, Medicine

Arginine and tumour growth: Another breakthrough in cancer research
Unveiling the cancer magnet: Stem cells in vertebral bones can act like cancer magnets for spinal tumour metastasis
Brain metastasis in cognitive impairment: Researchers used machine learning to investigate this
Novel neuroblastoma driver: Uncovering the role of IGF2BP1 in neuroblastoma and its potential as a therapeutic target

  • Geoengineering as a solution to the climate crisis | Scientia News

    For centuries, we have been burning fossil fuels, polluting our oceans and participating in deforestation without a second thought. We have come to understand the consequences this has had on our planet and have started to move in the right direction; but is it too late?

Geoengineering: what is it and will it actually work?
Last updated: 14/11/24 Published: 02/04/23

In the past 50 years, we have warmed the planet at a rate of approximately 0.1°C per decade. It doesn’t sound like much, but the effect is astronomical: increased drought, adverse weather conditions and rising sea levels, to name a few of the consequences. People are aware of the damage we have caused, and there is thankfully a shifting attitude towards our environment, reflected in the increased use of renewable energies and technologies such as electric cars. The problem arises from the rate of this societal switch: it isn’t fast enough. We haven’t quite worked out how to end our reliance on animal farming, carbon dioxide emissions and polluting transport. What if we could disrupt the natural mechanisms of our planet, just as we did to cause this problem in the first place? Scientists have started to consider some dystopian-sounding scenarios that are classed as ‘geoengineering’ techniques. There are two main branches of geoengineering: solar radiation management and greenhouse gas removal. Solar radiation management is the more alien of the two, involving sending large mirrors into space to reflect sunlight, or enhancing the natural ability of clouds to block radiation, called albedo enhancement.
Greenhouse gas removal is more commonly heard of, and involves reducing the proportion of harmful gases, mainly carbon dioxide, in our atmosphere. This can be as simple as planting more trees to do the job naturally, or capturing carbon dioxide at point sources such as factories, so that the gas never enters the atmosphere. A difficult yet promising idea is removing carbon dioxide directly from the atmosphere using a material that absorbs it, which could not only reduce the amount in the atmosphere but could return us towards a pre-industrial atmospheric composition. The idea is intriguing: to disrupt naturally occurring processes with human intervention, buying time for us to develop better renewable energy resources, biodegradable materials and a better attitude towards saving our planet. Theoretically, it seems reasonable; however, the concern is that with these techniques we may continue to treat the environment with a lack of respect, since we would be creating a false sense of security. Furthermore, the technologies operate at such a large scale that we may not be able to model and test them sufficiently before implementation; they may not be successful or safe. The ideal scenario is not to need geoengineering at all, but we must act fast to avoid its necessity.

Written by Megan Martin

Related article: How nuclear fusion holds the key to tackling climate change

  • Metal organic frameworks and cancer drug delivery | Scientia News

    How metal organic frameworks are used to deliver cancer drugs in the body
Last updated: 14/11/24 Published: 20/04/23

Metal ions and organic ligands are able to connect to form metal-organic frameworks on the nanoscale (nano-MOFs) for cancer drug delivery. Metal-organic frameworks (MOFs) are promising nanocarriers for the encapsulation of cancer drugs for drug delivery in the body. Cancer affects people globally, with chemotherapy remaining the most frequent treatment approach. However, chemotherapy is non-specific, being cytotoxic to patients’ normal, healthy cells and causing severe side effects. Nanoscale metal-organic frameworks (nano-MOFs) are highly effective at encapsulating cancer drugs for controlled drug delivery, acting as capsules that deliver the drugs only to tumorous environments. MOFs are composed of metal ions linked by organic ligands, creating a permanent porous network. MOFs can form one-, two- or three-dimensional structures, building a coordination network with cross-links. When synthesised, MOFs are crystalline compounds and can sometimes appear as cubic structures in scanning electron microscope (SEM) images. In particular, the novel zeolitic imidazolate framework-8 (ZIF-8) MOF has received attention for drug delivery. ZIF-8 is composed of Zn2+ ions and 2-methylimidazole ligands, making a highly crystalline structure. ZIF-8 MOFs are able to deliver cancer drugs like doxorubicin to tumorous environments because the framework has a pH-sensitive degradation property: it degrades only at pH 5.0–5.5, typical of the acidic cancerous environment, and does not degrade at the normal human body pH of 7.4. This increases therapeutic efficacy, with patients experiencing fewer systemic side effects, an aspect that nanomedicine has been researching extensively. Whereas chemotherapy damages healthy cells as well as cancer cells, drug-loaded MOFs target only cancer cells. Additionally, ZIF-8 has a high porosity, owing to its framework structure, which allows it to take up doxorubicin successfully. Zn2+ is used in the medical field because of its low toxicity and good biocompatibility. Overall, MOFs and metal-organic molecules are important for the advancement of nanotechnology and nanomedicine. MOFs are highly beneficial for cancer research, offering a less toxic treatment method for patients. ZIF-8 MOFs are a way forward for biotechnology and pharmaceutical companies researching treatments that are more tolerable for patients. Such research shows the diversity of chemistry, as the uses of metals and organic molecules extend into medicine.

Written by Alice Davey

Related article: Anti-cancer metal compounds
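The pH-gated release behaviour described in the article can be illustrated with a toy model. This is a minimal sketch, not the authors' method: the pH values come from the article (degradation at pH 5.0–5.5, stability at pH 7.4), while the threshold cut-off, the first-order rate constant `k` and the function name are hypothetical choices made purely for illustration.

```python
import math

def zif8_release_fraction(ph, hours, k=0.3):
    """Toy model of pH-gated drug release from a ZIF-8 carrier.

    The framework degrades in the acidic tumour environment
    (pH ~5.0-5.5) but stays intact at the normal body pH of 7.4.
    Degradation (and hence drug release) is modelled as a simple
    first-order process that only proceeds below an assumed pH
    threshold; k (per hour) is a hypothetical rate constant.
    """
    if ph <= 5.5:  # acidic tumour environment: framework degrades
        return 1.0 - math.exp(-k * hours)
    return 0.0     # physiological pH: framework intact, no release

# At tumour pH the carrier gradually releases its payload...
tumour = zif8_release_fraction(5.2, hours=24)
# ...while at blood pH essentially nothing is released.
blood = zif8_release_fraction(7.4, hours=24)
```

The single pH threshold is, of course, a caricature of the real degradation chemistry, but it captures the selectivity the article describes: release in the tumour, none in circulation.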

  • International Baccalaureate (IB) resources | Scientia News

    Common questions and answers, along with helpful resources, regarding the International Baccalaureate programme.

International Baccalaureate (IB)

Are you a student currently studying the IB Diploma Programme (IBDP), or about to commence it? You're in the right place! You may also like: Personal statements, A-level resources, University prep and Extra resources

What is the IB? Jump to resources

The IB is an international academic programme that is an alternative to A-levels. It is a highly academic programme with final exams that prepare students for university and careers. You select one subject from each of the first five groups, which include two languages, social sciences, experimental sciences and mathematics; you must also choose either an arts subject from the sixth group or another subject from the first five groups. Subject availability may differ between schools and countries.

How is the IB graded?

The IB is graded through a points system, with 7 the highest grade and 1 the lowest in each subject, and the highest total you can achieve is 45. The six subjects you study contribute a maximum of 42 points. Theory of Knowledge (TOK) and the Extended Essay (EE) are combined to give up to 3 extra bonus points; these two components are marked from A (highest) to E (lowest) and then converted to points.

What are the benefits of studying the IB?

Even though there are a lot of subjects, this programme is great for helping students gain new skills and become all-rounders. The IB also gives students a better idea of what university work will be like, especially coursework, which is one of the main things you will work on when studying the IB: it is known as the Internal Assessment (IA). Doing CAS is also a great opportunity for students to be independent and find activities and services outside of school, building up their CAS portfolio as well as their CV and personal statement when applying for university.

The marking matrix used in the IB.
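The scoring arithmetic described above (six subjects graded 1–7, for up to 42 points, plus up to 3 core bonus points from TOK and the EE) can be sketched as a small function. The function name and the validation choices are illustrative, not part of any official IB tool:

```python
def ib_total_points(subject_grades, core_bonus):
    """Total an IB Diploma score: six subjects graded 1-7 (max 42)
    plus up to 3 bonus points from Theory of Knowledge and the
    Extended Essay, for an overall maximum of 45."""
    if len(subject_grades) != 6:
        raise ValueError("the diploma requires exactly six subjects")
    if any(not 1 <= g <= 7 for g in subject_grades):
        raise ValueError("each subject is graded from 1 to 7")
    if not 0 <= core_bonus <= 3:
        raise ValueError("TOK + EE contribute 0 to 3 bonus points")
    return sum(subject_grades) + core_bonus

# A perfect score: 6 x 7 = 42 subject points + 3 core points = 45.
print(ib_total_points([7, 7, 7, 7, 7, 7], 3))  # 45
```

Note that in reality the TOK/EE bonus is read off a matrix of the two A–E letter grades; here it is simply taken as a number from 0 to 3.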
How do universities use the IB to select students?

All universities around the world accept the IB as a qualification gained in secondary school. Depending on the degree you are applying to, universities mainly focus on your Higher Level (HL) subjects. Each university has its own requirements for students applying to study a course at their institution. The most common approach is to consider your total point score out of 45 together with the total score of your HL subjects. Another is to ask applicants to achieve a certain grade in a particular subject at HL or at standard level (SL). If you complete the IB programme well, universities may prefer you over applicants with other qualifications, e.g. A-levels.

Benefits of completing the IB programme.

Resources for revision

Websites to help:
- Official IB website and the IB Bookshop
- Maths IA ideas
- Maths Analysis and Approaches SL and HL practice questions
- Maths resources in general / Worksheets and more
- Biology: BioNinja
- Biology, Chemistry, Physics, Maths: Revision Village / Save My Exams
- Biology, Chemistry, Maths: IB Dead
- IB Psychology
- IB Computer Science resources

YouTube channels to help:
- Chemistry: Richard Thornley
- Physics: Chris Doner

Textbooks for both HL and SL:
- Biology: Oxford IB Diploma Programme: Biology Course / Biology for the IB Diploma by Brenda Walpole
- Chemistry: Oxford IB Diploma Programme: Chemistry Course / Chemistry for the IB Diploma Coursebook with Cambridge Elevate Enhanced Edition by Steve Owen
- Physics: Oxford IB Diploma Programme: Physics Course / Physics for the IB Diploma with Cambridge by T. A. Tsokos
- Maths: Oxford IB Diploma Programme: IB Mathematics: analysis and approaches / applications and interpretations

bottom of page