Search Index
355 results found
- The rising threat of antibiotic resistance | Scientia News
Understanding the problem and solutions
Last updated: 14/07/25, 15:00 | Published: 07/01/24, 13:47

An overview and history of antibiotics
Antibiotics are medicines that treat and prevent bacterial infections (such as skin infections, respiratory infections and more). Antibiotic resistance is the process by which infection-causing bacteria become resistant to antibiotics. As the World Health Organisation (WHO) has stated, antibiotic resistance is one of the biggest threats to global health, food security and development.

In 1910, Paul Ehrlich discovered the first antibiotic, Salvarsan, used at the time to treat syphilis. His idea was to create anti-infective medication, and Salvarsan was successful. The golden age of antibiotic discovery began with the accidental discovery of penicillin by Alexander Fleming in 1928. He noticed that mould had contaminated one of his petri dishes of Staphylococcus bacteria, observed that the bacteria around the mould were dying, and realised that the mould, Penicillium notatum, was killing them. In 1940, Howard Florey and Ernst Chain isolated penicillin and began clinical trials, showing that it effectively treated infected animals. By 1943, penicillin was being used to treat patients in the United States. Overall, the discovery and use of antibiotics in the 20th century was a landmark scientific achievement, extending people's lives by around 20 years.

Factors contributing to antibiotic resistance
Increasing levels of antibiotic resistance could mean routine surgeries and cancer treatments (which can weaken the body's ability to respond to infections) might become too risky, and minor illnesses and injuries could become more challenging to treat. Various factors contribute to this, including the overuse and misuse of antibiotics and low investment in new antibiotic research. Antibiotics are overused and misused because of misunderstandings about when and how to use them. As a result, antibiotics may be taken for viral infections, and a full course may not be completed if patients start to feel better. Some patients may also use antibiotics not prescribed to them, such as those of family and friends. Moreover, there has not been enough investment to fund research into novel antibiotics. This has resulted in a shortage of antibiotics available to treat infections that have become resistant. Therefore, more investment and research are needed to prevent antibiotic resistance from becoming a public health crisis.

Combatting antibiotic resistance
One of the most effective ways to combat antibiotic resistance is by raising public awareness. Children and adults can learn when and how to use antibiotics safely. Several resources are available to help individuals and members of the public do this. Some resources are linked below:
1. The WHO has provided a factsheet with essential information on antibiotic resistance.
2. The Antibiotic Guardian website is a platform with information and resources to help combat antibiotic resistance. It is a UK-wide campaign to improve antibiotic prescribing and reduce unnecessary antibiotic use. Visit the website to learn more, and commit to a pledge to play your part in helping to solve this problem.
3. Public Health England has created resources to support Antibiotic Guardian.
4. The E-bug peer-education package is a platform that aims to educate individuals and provide them with tools to educate others.

Written by Naoshin Haque
Related articles: Anti-fungal resistance / Why bacteria are essential to human survival
- The role of chemistry in space exploration | Scientia News
How chemistry plays a part
Last updated: 14/07/25, 15:00 | Published: 05/08/23, 09:41

Background
Space exploration is without a doubt one of the most intriguing areas of science. As humans, we have a natural tendency to investigate everything around us - with space, the main question we want to answer is whether there is life beyond Earth. Astronomers use advanced telescopes to find celestial objects and study their structures, bringing us closer to answering this question. In doing so, astronomers have to communicate with other scientists; after all, the field of science is all about collaboration. One example is theoretical physicists studying observational data and, as the name suggests, coming up with theories using computational methods for other scientists to examine experimentally. In this article, we will acknowledge the importance of chemistry in space exploration, not only in studying celestial bodies but also in life support technology for astronauts and more.

Examples of chemistry applications

1) Portable life support systems
Surviving in space requires advanced and well-designed life support systems, because astronauts are exposed to extreme temperatures and conditions. Portable life support systems (PLSS) are devices connected to an astronaut's spacesuit that supply oxygen and remove carbon dioxide (CO2). The famous Apollo lunar landing missions had clever PLSS: they utilised lithium hydroxide to remove CO2 and liquid cooling garments, which used water to carry away excess heat. However, these systems are large and quite bulky, so hopefully chemistry will help us design even smarter PLSS in the future.

2) Solid rocket propulsion systems
Chemical propellants in rockets eject reaction mass at high velocity and pressure using a fuel and an oxidiser, producing thrust in the engine. Simply put, thrust is a strong force that causes an object to move - in this case, a rocket launching into space. Advancements in propellant chemistry have allowed greater space exploration to take place thanks to more efficient and reliable systems.

3) Absorption spectroscopy
Electromagnetic radiation is energy travelling at the speed of light (approx. 3.0 x 10^8 m/s!) that can interact with matter. This radiation spans different wavelengths and frequencies, with longer wavelengths corresponding to lower frequencies and vice versa. Each molecule has unique absorption wavelength(s) - this means that if specific wavelengths of radiation hit a substance, electrons in the ground state become excited and can jump up to higher energy states. A line appears in the absorption spectrum for every excited electron (see Figure 1). As a result, spectroscopic analysis of newly discovered planets or moons can give us information on the different elements that are present. It should also be noted that the excited electrons will relax back down to the ground state and emit a photon, allowing us to observe emission spectra as well. In the emission spectrum, the lines appear in exactly the same positions as in the absorption spectrum, but as coloured lines on a black background (see Figure 2).

Fun fact: There are six essential elements needed for life - carbon, hydrogen, nitrogen, oxygen, phosphorus and sulfur. In 2023, scientists concluded that Saturn's moon Enceladus has all of these, which indicates that life could be present there!
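To make the wavelength-frequency-energy relationship in the spectroscopy section above concrete, here is a minimal Python sketch (purely illustrative) that converts a wavelength into its frequency and photon energy using nu = c/lambda and E = h*nu. The 656 nm example value is an assumption chosen for illustration, being the well-known hydrogen H-alpha line.

```python
# Minimal sketch: relate wavelength, frequency and photon energy (nu = c / lambda, E = h * nu).
# The 656 nm example value is an assumption for illustration (the hydrogen H-alpha line).

PLANCK_H = 6.626e-34   # Planck constant, J*s
LIGHT_C = 3.0e8        # speed of light, m/s (approx., as quoted in the article)

def photon_properties(wavelength_nm: float) -> tuple[float, float]:
    """Return (frequency in Hz, photon energy in eV) for a wavelength given in nanometres."""
    wavelength_m = wavelength_nm * 1e-9
    frequency = LIGHT_C / wavelength_m      # longer wavelength -> lower frequency
    energy_joules = PLANCK_H * frequency    # energy of the photon absorbed or emitted
    energy_ev = energy_joules / 1.602e-19   # convert joules to electronvolts
    return frequency, energy_ev

if __name__ == "__main__":
    freq, energy = photon_properties(656.0)
    print(f"656 nm -> {freq:.2e} Hz, about {energy:.2f} eV per photon")
```

Each such photon energy corresponds to the gap between two energy levels, which is why every element produces its own characteristic set of spectral lines.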
4) Space medicine
Whilst many people are fascinated by the idea of going to space, it is definitely not an easy task, as the body undergoes more stress and changes than one can imagine. For example, barotrauma is when air-filled tissues become injured due to differences in pressure between the body and the ambient atmosphere. Another example is weakening of the immune system: researchers have found that pre-existing T cells in the body were not able to fight off infection well. However, the field of space medicine is growing and making sure discomforts like those above are prevented where possible. Space medicine researchers have developed 'countermeasures' for astronauts to follow, such as special exercises that maintain bone and muscle mass, as well as tailored diets. Being in space is isolating, which can cause mental health problems, so counselling and therapy are also provided early on to help prevent this.

To conclude
Overall, chemistry plays a vital role in the field of space exploration. It allows us to go beyond mere analysis of celestial objects, as demonstrated in this article. Typically, when we hear the word 'chemistry' we often just think of its applications in medicine or the environment, but its versatility should be celebrated more often.

Written by Harsimran Kaur
Related articles: AI in space / The role of chemistry in medicine / Astronauts in space
- Unfolding prion diseases and their inheritance | Scientia News
When misfolded proteins lead to disease
Last updated: 22/04/25, 14:11 | Published: 06/03/24, 11:32

This is article no. 5 in a series on rare diseases. Next article: Neuromyelitis optica. Previous article: Epithelioid hemangioendothelioma.

Prion proteins are found abundantly in the brain; their function is unclear, but they are involved in a multitude of physiological mechanisms, including myelin homeostasis and the circadian rhythm. Correctly folded prion proteins in the cellular form are termed PrPC, while their infectious isoform is called PrPSc. As shown in Figure 1, the misfolded PrPSc is largely made up of β-pleated sheets instead of α-helices; PrPSc is prone to forming aggregates that cause transmissible spongiform encephalopathies (TSEs).

Prion diseases can be categorised by their aetiology: acquired, sporadic, and hereditary. Acquired prion diseases are caused by the inadvertent introduction of PrPSc prions into an individual. Sporadic prion diseases are the most common type, where PrPC misfolds into PrPSc for an unknown reason and propagates this misfolding within other prion proteins. Hereditary prion diseases are caused by genetic mutation of the human prion protein gene (PRNP), which causes misfolding into the infectious isoform. Consequently, these mutations can be passed to offspring, resulting in the same misfolding and disease. Interestingly, different types of PRNP mutations cause different types of prion diseases.

Creutzfeldt-Jakob disease (CJD) is a type of TSE found in humans which causes mental deterioration and involuntary muscle movement; symptoms tend to worsen as the disease progresses, making it a degenerative disorder. Familial CJD (fCJD) is a rare type of hereditary prion disease and can sometimes result in a faster rate of disease progression compared to sporadic cases. Due to a dominant inheritance pattern, relatives of fCJD patients are often also affected by the disease. The most common mutation observed in familial CJD is the E200K mutation, denoting the substitution of glutamic acid with lysine at position 200 of the prion protein. Other common mutations resulting in fCJD include mutations at positions 178 and 210 of the prion protein. Less frequently, however, a multitude of other mutations are correlated with the development of familial CJD.

Familial CJD can be caused by STOP codon mutations, which result in a truncated protein; some of these, such as Q160X and Q227X, show pathology similar to Alzheimer's disease. fCJD can also be caused by insertional mutations, possibly arising from unbalanced crossover and recombination. The prion protein contains a nona-peptide (made up of nine amino acids) followed by four repeats of an octa-peptide (made up of eight amino acids). In insertion mutations, additional repeats of the octa-peptide are present in the prion protein. Interestingly, different numbers of inserts result in different pathological characteristics: patients with 1, 2 or 4 extra repeats show similarity to sporadic CJD, while those with 5-9 extra repeats show similarity to Gerstmann-Sträussler-Scheinker syndrome (a simple illustration of this repeat-count pattern is sketched at the end of this article).

Hereditary prion diseases are important to study in order to develop an understanding not only of prion misfolding diseases but also of diseases associated with the misfolding of other proteins, such as Alzheimer's and Parkinson's.
Understanding the mechanisms of hereditary prion diseases will aid the development of treatments for such conditions. In particular, it is crucial to investigate the genetic mutations known to play a part in prion misfolding, alongside using genetic information to infer an individual's risk of disease.

Written by Isobel Cunningham
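As a purely illustrative aid (not a diagnostic tool), here is a minimal Python sketch that encodes the repeat-count pattern described above: 1, 2 or 4 extra octapeptide repeats are reported as resembling sporadic CJD, and 5-9 extra repeats as resembling Gerstmann-Sträussler-Scheinker syndrome. The function name and the handling of repeat counts not mentioned in the article are assumptions.

```python
# Illustrative only: maps the number of extra octapeptide repeats inserted in PRNP
# to the phenotype similarity described in the article. Not a clinical tool.

def octarepeat_phenotype(extra_repeats: int) -> str:
    """Return the reported phenotype similarity for a given number of extra octapeptide repeats."""
    if extra_repeats in (1, 2, 4):
        return "similar to sporadic CJD"
    if 5 <= extra_repeats <= 9:
        return "similar to Gerstmann-Straussler-Scheinker syndrome"
    return "not covered by the article"   # e.g. 3 extra repeats is not discussed above

if __name__ == "__main__":
    for n in range(1, 10):
        print(f"{n} extra repeat(s): {octarepeat_phenotype(n)}")
```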
- Beyond the bump: unravelling traumatic brain injuries | Scientia News
The yearly incidence of TBI is between 27 and 69 million people worldwide
Last updated: 13/11/25, 12:27 | Published: 15/10/24, 11:32

A traumatic brain injury (TBI) is one of the most serious and complex injuries sustained by the human body, often with profound and long-term effects on an individual's physical, emotional, behavioural and cognitive abilities.

What is a traumatic brain injury?
A TBI results from an external force which causes structural and physical damage to the brain. The primary injury refers to the immediate damage to brain tissue caused directly by the event, whereas secondary injuries result from the cascade of cellular and molecular processes triggered by the initial injury and develop over hours to weeks following the initial TBI. Typically, the injury can be penetrating, where an object pierces the skull and damages the brain, or non-penetrating, which occurs when the external force is large enough to shake the brain within the skull, causing coup-contrecoup damage.

Diagnosis and severity
The severity of a TBI is classified as either mild (also known as concussion), moderate, or severe, using a variety of indices. Whilst more than 75% of TBIs are mild, even these individuals can suffer long-term consequences from post-concussion syndrome. Here are two commonly used measures to initially classify severity:

The Glasgow Coma Scale (GCS) is an initial neurological examination which assesses severity based on the patient's ability to open their eyes, move, and respond verbally. It is a strong indicator of whether an injury is mild (GCS 13-15), moderate (GCS 9-12) or severe (GCS ≤ 8).

Following the injury and any period of unconsciousness, when a patient has trouble with their memory and is confused, they are said to have post-traumatic amnesia (PTA). This is another measure of injury severity: PTA lasts up to 30 minutes in mild TBI, between 30 minutes and 24 hours in moderate TBI, and over 24 hours in severe TBI. (These thresholds are illustrated in a short sketch at the end of this article.)

Imaging tests, including CT scans and MRIs, are used to detect brain bleeds, swelling or any other damage. These tests are essential upon arrival at the hospital, especially in moderate and severe cases, to understand the full extent of the injury.

Leading causes of TBI
Common causes of TBI include:
- Falls (most common in young children and older adults)
- Vehicle collisions (road traffic accidents, RTAs)
- Inter-personal violence
- Sports injuries
- Explosive blasts
Interestingly, TBI is around 1.5 times more common in men than in women.

General symptoms
The symptoms and outcome of a TBI depend on the severity and location of the injury. They differ from person to person based on a range of factors, including pre-injury sociodemographic vulnerabilities such as age, sex and level of education, as well as premorbid mental illnesses. There are also post-injury factors, such as access to rehabilitation and psychosocial support, which influence recovery. Because of this, nobody will have the same experience of a TBI; however, some effects are more common than others, as described below.

Mild TBI:
- Physical symptoms: headaches, dizziness, nausea, and blurred vision.
- Cognitive symptoms: confusion, trouble concentrating, difficulty with memory or disorientation.
- Emotional symptoms: mood swings, irritability, depression or anxiety.
Moderate-to-severe TBI:
- Behavioural symptoms: aggression, personality change, disinhibition, impulsiveness.
- Cognitive symptoms: difficulties with attention and concentration, decision making, memory, executive dysfunction, information processing, motivation, language, reasoning, self-awareness.
- Physical symptoms: headaches, seizures, speech problems, fatigue, weakness or paralysis.

Many of these symptoms are 'hidden' and can often impact functional outcomes for an individual, such as their capacity for employment and daily living (i.e., washing, cooking, cleaning etc.). The long-term effects of TBI vary: some people return to normal functioning, while others experience lifelong disabilities and require adjustments in their daily lives. For more information and support, there are some great resources on the Headway website, a leading charity which supports individuals after brain injury.

Written by Alice Jayne Greenan
Related articles: Why brain injuries affect adults and children differently / Neuroimaging / Different types of seizures
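As referenced above, here is a minimal Python sketch encoding the GCS and PTA severity bands quoted in this article. It is an illustration of the classification thresholds only, not a clinical tool; real assessment combines multiple indices, imaging and clinical judgement.

```python
# Illustrative sketch of the TBI severity bands quoted in the article.
# Not a clinical tool: real assessment combines several indices plus imaging.

def classify_gcs(gcs: int) -> str:
    """Classify TBI severity from a Glasgow Coma Scale score (3-15)."""
    if not 3 <= gcs <= 15:
        raise ValueError("GCS scores range from 3 to 15")
    if gcs >= 13:
        return "mild"       # GCS 13-15
    if gcs >= 9:
        return "moderate"   # GCS 9-12
    return "severe"         # GCS 8 or below

def classify_pta(pta_hours: float) -> str:
    """Classify TBI severity from the duration of post-traumatic amnesia, in hours."""
    if pta_hours <= 0.5:
        return "mild"       # up to 30 minutes
    if pta_hours <= 24:
        return "moderate"   # 30 minutes to 24 hours
    return "severe"         # over 24 hours

if __name__ == "__main__":
    print(classify_gcs(14), classify_pta(0.25))   # mild, mild
    print(classify_gcs(10), classify_pta(12))     # moderate, moderate
    print(classify_gcs(6), classify_pta(48))      # severe, severe
```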
- The Genetics of Ageing and Longevity | Scientia News
A well-studied longevity gene is SIRT1
Last updated: 11/07/25, 09:57 | Published: 13/05/24, 15:20

Ageing is a natural process inherent to all living organisms, yet its mechanisms remain somewhat enigmatic. While lifestyle factors undoubtedly influence longevity, recent advancements in genetic research have revealed the influence of our genomes on ageing. By understanding these influences, we can unlock further knowledge on longevity, which can aid us in developing interventions to promote healthy ageing. This article delves into the world of ageing and longevity genetics and how we can use this understanding to our benefit.

Longevity genes
A number of longevity genes, such as APOE, FOXO3, and CETP, have been identified. These genes influence various biological processes, including cellular repair, metabolism, and stress response mechanisms. A well-studied longevity gene is SIRT1. Located on chromosome 10, SIRT1 encodes sirtuin 1, a histone deacetylase, transcription factor, and cofactor. Its roles include protecting cells against oxidative stress, regulating glucose and lipid metabolism, and promoting DNA repair and stability via deacetylation. Sirtuins are evolutionarily conserved mediators of longevity in many organisms. One study looked at mice with SIRT1 knocked out; these mice had significantly shorter lifespans compared with wild-type (WT) mice [1]. The protective effects of SIRT1 are thought to be due to deacetylation of p53, a protein that promotes cell death [2]. SIRT1 also stimulates the cytoprotective and stress-resistance gene activator FoxO1A (see Figure 1), which upregulates catalase activity to prevent oxidative stress [3].

Genome-wide association studies (GWAS) have identified several genetic variants associated with ageing and age-related diseases. Such variants influence diverse aspects of ageing, such as cellular senescence, inflammation, and mitochondrial function. For example, certain polymorphisms in APOE are associated with an increased risk of age-related conditions like Alzheimer's and Parkinson's disease [4]. These genes have a cumulative effect on the longevity of an organism.

Epigenetics of ageing
Epigenetic modifications, such as histone modifications and chromatin remodelling, regulate gene expression patterns without altering the DNA sequence. Studies have shown that epigenetic alterations accumulate with age and contribute to age-related changes in gene expression and cellular function. For example, DNA methylation is downregulated in human fibroblasts during ageing. Furthermore, ageing correlates with decreased nucleosome occupancy in human fibroblasts, thereby increasing the expression of genes unoccupied by nucleosomes. One specific marker of ageing in metazoans is H3K4me3, indicating trimethylation of lysine 4 on histone 3; in fact, H3K4me3 demethylation extends lifespan. Similarly, H3K27me3 is also a marker of biological age. By using these markers as an epigenetic clock, we can predict biological age using molecular genetic techniques (a minimal sketch of this idea appears at the end of this article). As a rule of thumb, genome-wide hypomethylation and CpG island hypermethylation correlate with ageing, although this effect is tissue-specific [5].

Telomeres are regions of repetitive DNA at the terminal ends of linear chromosomes. Telomeres become shorter every time a cell divides (see Figure 2), and eventually this can hinder their function of protecting the ends of chromosomes.
As a result, cells have complex mechanisms in place to prevent telomere degradation. One of these is the enzyme telomerase, which maintains telomere length by adding G-rich DNA sequences. Another mechanism is the shelterin complex, which binds to 'TTAGGG' telomeric repeats to prevent degradation. Two major components of the shelterin complex are TRF1 and TRF2, which bind telomeric DNA. They are regulated by the chromatin remodelling enzyme BRM-SWI/SNF, which has been shown to be crucial in promoting genomic stability, preventing cell apoptosis, and maintaining telomeric integrity. BRM-SWI/SNF regulates TRF1/2, and thereby the shelterin complex, by remodelling the TRF1/2 promoter region to convert it to euchromatin and increase transcription. Inactivating mutations in BRM-SWI/SNF have been shown to contribute to cancer and cellular ageing through telomere degradation [6]. Together, the mechanisms cells have in place to protect telomeres provide protection against cancer as well as cellular ageing.

Future of anti-ageing drugs
Anti-ageing drugs are big business in the biotechnology and cosmetics sector. For example, senolytics are compounds that decrease the number of senescent cells in an individual. Senescent cells are those that have permanently exited the cell cycle and now secrete pro-inflammatory molecules (see Figure 3); they are a major cause of cellular and organismal ageing. Senolytic drugs aim to provide anti-ageing benefits by removing senescent cells and therefore decreasing inflammation. Currently, researchers are confident that removing senescent cells would have an anti-ageing effect, although the senolytic drugs currently on the market are understudied, so their side effects are unknown. Speculative drugs could include those that enhance telomerase or SIRT1 activity.

Evidently, ageing is not determined purely by lifestyle and environmental factors but also by genetics. While longevity genes are hereditary, epigenetic modifications may be influenced by external factors. Therefore, understanding the role of genetics in ageing means understanding the complex interplay between various external factors and an individual's genome. Perhaps we will see a new wave of anti-ageing treatments in the coming years, built on the genetics of ageing.

Written by Malintha Hewa Batage
Related articles: An introduction to epigenetics / Schizophrenia, inflammation and ageing / Ageing and immunity

REFERENCES
[1] Cilic, U. et al. (2015) 'A Remarkable Age-Related Increase in SIRT1 Protein Expression against Oxidative Stress in Elderly: SIRT1 Gene Variants and Longevity in Human', PLoS One, 10(3).
[2] Alcendor, R. et al. (2004) 'Silent information regulator 2alpha, a longevity factor and class III histone deacetylase, is an essential endogenous apoptosis inhibitor in cardiac myocytes', Circulation Research, 95(10):971-80.
[3] Alcendor, R. et al. (2007) 'Sirt1 regulates aging and resistance to oxidative stress in the heart', Circulation Research, 100(10):1512-21.
[4] Yin, Y. & Wang, Z. (2018) 'ApoE and Neurodegenerative Diseases in Aging', Advances in Experimental Medicine and Biology, 1086:77-92.
[5] Wang, K. et al. (2022) 'Epigenetic regulation of aging: implications for interventions of aging and diseases', Signal Transduction and Targeted Therapy, 7(1):374.
Images made using BioRender.
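To illustrate the epigenetic-clock idea mentioned above, here is a minimal Python sketch of how such clocks are typically built: a penalised linear model that predicts age from DNA methylation levels at a panel of CpG sites. The data below are randomly generated placeholders rather than real methylation measurements, and the model is far simpler than published clocks; it is a sketch of the concept only.

```python
# Minimal sketch of an epigenetic clock: a penalised linear model that predicts age
# from DNA methylation (beta values between 0 and 1) at a panel of CpG sites.
# All data here are random placeholders, not real measurements.

import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
n_samples, n_cpgs = 200, 50

ages = rng.uniform(20, 90, size=n_samples)              # chronological ages in years
drift = rng.normal(0, 0.003, size=n_cpgs)               # how strongly each CpG tracks age
betas = np.clip(0.5 + np.outer(ages - 55, drift)        # methylation drifts with age...
                + rng.normal(0, 0.05, (n_samples, n_cpgs)),
                0, 1)                                    # ...plus noise, kept in [0, 1]

clock = ElasticNet(alpha=0.01, max_iter=10_000)          # penalised regression, as in published clocks
clock.fit(betas[:150], ages[:150])                       # train on 150 samples
predicted = clock.predict(betas[150:])                   # 'biological age' estimates for the rest

print("Mean absolute error (years):", round(float(np.abs(predicted - ages[150:]).mean()), 1))
```

Real clocks are trained on measured methylation from thousands of samples, but the principle is the same: age-correlated methylation marks feed a regression model whose output is treated as biological age.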
- The physics of the world’s largest gravitational-wave observatory: LIGO | Scientia News
Laser Interferometer Gravitational-Wave Observatory (LIGO)
Last updated: 23/10/25, 10:23 | Published: 11/05/24, 11:16

Since the confirmation of detection, talk of gravitational waves has drastically increased in the public forum. In February 2016, the Laser Interferometer Gravitational-Wave Observatory (LIGO) Collaboration announced that they had sensed gravitational waves, or ripples in spacetime, caused by the collision of two black holes approximately 1.3 billion light years away. Such an amazing feat quickly became global news, with many asking how it could be physically possible to detect an event occurring at such an unimaginable distance. For some, the entire situation feels incomprehensible.

Although named an observatory, LIGO looks quite different from observatories such as the late Arecibo Observatory in Puerto Rico, the Very Large Array (VLA) in New Mexico, or the Lowell Observatory in Arizona. Instead of being related to the traditional telescope concept, LIGO comprises two interferometers, one in Hanford, Washington and the other in Livingston, Louisiana, that use lasers to detect vibrations in the fabric of spacetime. An interferometer is an L-shaped apparatus with mirrors at the end of each arm, positioned so that the incoming light waves - in this case laser beams - are split and recombined into an interference pattern. This pattern is then detected by a device called a photodetector, which converts the pattern into carefully recorded data. When an incredibly violent event occurs, two black holes colliding, for instance, that action results in a massive release of energy that ripples across the fabric of spacetime. The passing ripple minutely changes the relative path lengths of the laser beams, causing a change in the recorded light pattern. This change is also recorded by the photodetector and stored as data, which scientists can collect to analyze as needed.

Because the LIGO detector is so sensitive (a rough sense of that sensitivity is sketched at the end of this article), there are a number of systems in place to maintain its functionality and reliability. The apparatus comprises four main systems:
1) seismic isolation, which focuses on removing non-gravitational-wave detections (also called 'noise')
2) optics that regulate the laser
3) a vacuum system preserving the continuity of the laser by removing dust from the components
4) computing infrastructure that manages the collected scientific data.
The collaboration of these systems helps to minimize the number of false detections.

False detections are also kept at a minimum through effective communication between the Washington and Louisiana LIGO sites. It took months for the official announcement of the 2015 gravitational-wave detection because both locations had to compare data to ensure that a signal detected by one apparatus was also accurately detected by the other. Because of human activity on Earth, there can be a number of vibrations that resemble gravitational-wave ripples but are ultimately shown to be terrestrial events rather than celestial ones. So, while LIGO physics itself is fairly straightforward, the interpretation of the gathered data tends to be tricky.

Written by Amber Elinsky
Related articles: the DESI instrument / the JWST / The physics behind cumulus clouds / Light
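For the rough sense of sensitivity referenced above, here is a small Python sketch. The 4 km arm length and a peak strain of about 1e-21 are commonly quoted approximate figures for LIGO and the 2015 detection; they are assumptions for illustration rather than values taken from this article.

```python
# Rough illustration of LIGO's sensitivity. A gravitational-wave strain h stretches an
# arm of length L by approximately dL = h * L. The figures below are commonly quoted
# approximations (assumed for illustration, not taken from the article above).

ARM_LENGTH_M = 4_000.0        # each LIGO arm is roughly 4 km long
PEAK_STRAIN = 1e-21           # approximate peak strain of the first detected signal
PROTON_DIAMETER_M = 1.7e-15   # approximate diameter of a proton, for comparison

delta_length = PEAK_STRAIN * ARM_LENGTH_M
print(f"Change in arm length: {delta_length:.1e} m")
print(f"That is roughly 1/{PROTON_DIAMETER_M / delta_length:,.0f} of a proton diameter")
```

A displacement that small is why the seismic isolation, vacuum and optics systems described above are needed before the interference pattern can be trusted.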
- Nanoparticles: the future of diabetes treatment? | Scientia News
Nanoparticles have unique properties
Last updated: 17/07/25, 10:52 | Published: 06/05/24, 13:20

Diabetes mellitus is a chronic metabolic disorder affecting millions worldwide, and given its myriad challenges, there is substantial demand for innovative therapeutic strategies for its treatment. The global diabetic population is expected to increase to 439 million by 2030, which will impose a significant burden on healthcare systems. Diabetes occurs when the body cannot produce enough insulin, a hormone crucial for regulating glucose levels in the blood. These defects in insulin secretion and function lead to increased glucose levels, causing long-term damage to organs such as the eyes, kidneys, heart, and nervous system. Nanoparticles have unique properties that make them versatile in their applications, and they promise to help revolutionise the future of diabetes treatment. This article will explore the potential of this emerging technology in medicine and will address the complexities and issues that arise in the management of diabetes.

Nanoparticles have distinct advantages - biocompatibility, bioavailability, targeting efficiency and minimal toxicity - making them ideal for antidiabetic treatment. Drug delivery can be targeted, making it precise and efficient and avoiding off-target effects. Modifying nanoparticle surfaces enhances therapeutic efficacy, enabling targeted delivery to specific tissues and cells while reducing systemic side effects.

Another key benefit currently being researched is real-time glucose sensing and monitoring, which addresses a critical aspect of managing diabetes, as nanoparticle-based glucose sensors can detect glucose levels with high sensitivity and selectivity. This avoids invasive blood sampling and allows continuous monitoring of glucose levels. These sensors can be functionalised and integrated into wearable devices or implanted sensors, making monitoring convenient and reliable and helping to optimise insulin therapy (a toy sketch of such monitoring appears at the end of this article).

Moreover, nanoparticle-based approaches show potential in tissue regeneration, aiding the restoration of insulin production. In particular, nanomedicine is a promising tool in theranostics of chronic kidney disease (CKD), where one radioactive drug can diagnose the disease and a second delivers the therapy. The conventional procedure to assess renal fibrosis is to take a kidney biopsy, followed by a histopathological assessment. This method is risky, invasive, and subjective, and less than 0.01% of kidney tissue is examined, which results in diagnostic errors and limits the accuracy of the current screening method. The standard use of pharmaceuticals has been promising but can cause hypoglycaemia, diuresis, and malnutrition because of the low caloric intake. Nanoparticles offer a new approach to both diagnosis and treatment and are an attractive candidate for managing CKD, as they can carry drugs and enhance image contrast while controlling the rate and location of drug release.

In the treatment of this multifaceted disease, nanoparticle delivery systems seem to be a promising and innovative therapeutic strategy, given the variety of delivery methods available. The range of solutions currently being developed is promising, from enhanced drug delivery to glucose monitoring to direct tissue regeneration.
There is immense potential for the advancement of nanomedicines to improve patient outcomes and treatment efficacy, and to alleviate the burden and side effects of the disorder. With ongoing effort and innovation, nanoparticles could greatly improve strategies for the management and future treatment of diabetes.

Written by Saanchi Agarwal
Related articles: Pre-diabetes / Can diabetes mellitus become an epidemic? / Nanomedicine / Nanoparticles on gut health / Nanogels / Nanocarriers
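As referenced above, here is a toy Python sketch of the continuous-monitoring idea: a stream of glucose readings is checked against a target range and flagged. The simulated readings and the 4.0-10.0 mmol/L range are illustrative assumptions only, not clinical guidance and not a real sensor interface.

```python
# Toy sketch of continuous glucose monitoring with simple alerts.
# The simulated readings and the 4.0-10.0 mmol/L target range are illustrative
# assumptions only - not clinical guidance and not a real sensor interface.

from typing import Iterable

LOW_MMOL_L = 4.0     # assumed lower alert threshold
HIGH_MMOL_L = 10.0   # assumed upper alert threshold

def monitor(readings: Iterable[float]) -> list[str]:
    """Flag each glucose reading (mmol/L) as low, in range, or high."""
    flags = []
    for value in readings:
        if value < LOW_MMOL_L:
            flags.append(f"{value:.1f} mmol/L: LOW - possible hypoglycaemia")
        elif value > HIGH_MMOL_L:
            flags.append(f"{value:.1f} mmol/L: HIGH - possible hyperglycaemia")
        else:
            flags.append(f"{value:.1f} mmol/L: in range")
    return flags

if __name__ == "__main__":
    simulated_stream = [5.6, 6.2, 3.8, 7.5, 11.2]   # placeholder values from a hypothetical sensor
    for line in monitor(simulated_stream):
        print(line)
```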
- The genesis of life | Scientia News
Life's origins
Last updated: 11/07/25, 10:04 | Published: 23/11/23, 11:22

Did the egg or the chicken come first? This question is often pondered regarding life's origin and how biological systems came into play. How did chemistry move to biology to support life? And how have we evolved into such complex organisms? The ingredients, conditions and thermodynamically favoured reactions hold the answer, but understanding the inner workings of life's beginnings poses a challenge for us scientists. Under an empirical approach, how can we address these questions if these events occurred 3.7 billion years ago?

The early atmosphere of the Earth
To approach these questions, it is relevant to understand the atmospheric contents of the primordial Earth. With a lack of oxygen, the predominant make-up included CO2, NH3 and H2, creating a reducing environment to drive chemical reactions. When the Earth cooled and the atmosphere underwent condensation, pools of chemicals formed - this is known as the "primordial soup". It is thought that reactants from this "soup" could collide to synthesise nucleotides by forming nitrogenous bases and bonds, such as glycosidic or hydrogen bonds. Such nucleotide monomers were perhaps polymerised to create long chains for nucleic acid synthesis - that is, RNA - via this abiotic synthesis. Thus, if we have nucleic acids, genetic information could have been stored and passed later down the line, allowing for our eventual evolution.

Conditions for nucleic acid synthesis
The environment supported the formation of monomers for said polymerisation. For example, hydrothermal vents could have provided the reducing power via protons, allowing for the protonation of structures and providing the free energy for bond formation. Biology, of course, relies on protons for the proton gradient in ATP synthesis at the mitochondrial membrane and, in general, for acid-base catalysis in enzymatic reactions. Therefore, it is safe to say protons played a vital role in life's emergence. The eventual formation of structures by protonation and deprotonation underpins the enzymatic theory of life's origins: some self-catalytic ability for replication in a closed system, followed by the evolution of complex biological units. This is the "RNA World" theory, which will be discussed later.

Another theory is wet and dry cycling at the edge of hydrothermal pools. This theory is proposed by David Deamer, who suggests that nucleic acid monomers placed in acidic (pH 3) and hot (70-90 degrees Celsius) pools could undergo condensation reactions for ester bond formation. It highlights the need for low water activity and a "kinetic trap" in which the condensation reaction rate exceeds the hydrolysis rate. The heat of the pool provides the activation energy for the localised generation of polymers without the need for a membrane-like compartment.

But even if this was possible and nucleic acids could be synthesised, how could we "keep them safe"? This issue is addressed by the theory of "protocells" formed from fatty acid vesicles. Jack Szostak suggests that a phase transition (that is, a pH decrease) allowed for the construction of bilayer membranes from fatty acid monomers, analogous to what we see now in modern cells. The fatty acids in these vesicles have the ability to "flip-flop" to allow for the exchange of nutrients or nucleotides in and out of the vesicles.
It is suggested that clay-encapsulated nucleotide monomers were brought into the protocell by this flip-flop action. Vesicles could grow by competing with surrounding smaller vesicles. Larger vesicles are thought to be those harbouring long polyanionic molecules - that is, RNA - which create immense osmotic pressure pushing outward on the protocell, driving the absorption of smaller vesicles. This represents the Darwinian "survival of the fittest" principle, in which cells with more RNA are favoured for survival.

The RNA World Hypothesis
DNA is often seen as the "saint" of all things biology, given its ability to store genetic information and pass it to mRNA, which can then be used to synthesise polypeptides. This, of course, is the central dogma. However, the RNA world hypothesis suggests that RNA arose first due to its ability to form catalytic 3D structures and store genetic information, which could have allowed for the later synthesis of DNA. This makes sense when you consider that the primer for DNA replication is formed out of RNA. If RNA did not come first, how could DNA replication be possible? Many other scenarios suggest RNA evolution preceded that of DNA.

So, if RNA arose as a simple polymer, its ability to form 3D structures could have allowed ribozymes (RNA with enzymatic function) to arise within these protocells. Ribozymes, such as RNA ligase and polymerase, could have allowed for self-replication, and then mutation in the primary structure could have allowed evolution to occur. If we have a catalyst, in a closed system, with nutrient exchange, then why would life's formation not be possible?

But how can we show that RNA can arise in this way? The answer is SELEX - systematic evolution of ligands by exponential enrichment (5). This system was developed by Jack Szostak, who wanted to show that the evolution of complex RNAs - ribozymes - in a test tube was possible. A pool of random, fragmented RNA molecules can be added to a chamber and run through a column with beads. These beads harbour some sequence or attraction to the RNA molecules the column is selecting for. Those that attach can be eluted, and those that do not can be disregarded. The bound RNA can be rerun through SELEX, with the conditions in the column made more stringent so that only the most complementary RNAs bind (a toy simulation of this selection loop is sketched at the end of this article). This allowed for the development of RNA ligase and RNA polymerase ribozymes - thus, self-replication of RNA is possible. SELEX helps us understand how the evolution of RNA on the primordial Earth could have been possible.

This is also supported by meteorites, such as carbonaceous chondrites, whose outer layers burn up in the Earth's atmosphere, encapsulating the organic material in the centre. Chondrites found in Antarctica have been found to contain more than 80 amino acids (some of which are not compatible with life). These chondrites also included nucleobases. So, if such monomers can be synthesised in a hostile environment in outer space or in our atmosphere, then the theory of abiotic synthesis is supported.

Furthermore, it is relevant to address the abiotic synthesis of amino acids, since the evolution of catalytic RNA could have some complementarity for polypeptide synthesis. Miller and Urey (1953) set up a simple experiment containing gases representing the early primordial Earth (methane, hydrogen, ammonia, water vapour). They used electrodes to provide an electrical discharge (meant to simulate lightning or volcanic eruptions) to the gases and then condensed them. The water in the other chamber turned pink/brown.
Following chromatography, they identified amino acids in the mixture. These simple manipulations could have mirrored processes on the early Earth.

Conclusion
The abiotic synthesis of nucleotides and amino acids, and their later polymerisation, would support the theories that describe chemistry moving toward biological life. Protocells containing such polymers could have been selected based on their "fitness", and these could have mutated to allow for the evolution of catalytic RNA. The experiments mentioned represent a small fragment of those carried out to answer the questions of life's origins. The evidence provides firm ground for the emergence of life and its progression to the complexity we know today.

Written by Holly Kitley
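As referenced above, here is a toy Python simulation of the SELEX selection loop, not the laboratory protocol itself: a pool of random RNA sequences is repeatedly "run through the column" (scored against an arbitrary target motif), the best binders are kept and eluted, and the survivors are amplified with occasional mutation. The motif, scoring rule, pool size, selection fraction and mutation rate are all assumptions chosen purely for illustration.

```python
# Toy simulation of the SELEX idea: repeated rounds of selection for binding,
# followed by amplification with mutation. The motif, scoring rule, pool size,
# selection fraction and mutation rate are illustrative assumptions.

import random

BASES = "ACGU"
MOTIF = "GGAUCC"   # stands in for whatever the column's beads select for

def random_rna(length: int = 30) -> str:
    return "".join(random.choice(BASES) for _ in range(length))

def binding_score(seq: str) -> int:
    """Best alignment of the motif anywhere in the sequence (matching bases out of 6)."""
    return max(sum(a == b for a, b in zip(seq[i:], MOTIF))
               for i in range(len(seq) - len(MOTIF) + 1))

def mutate(seq: str, rate: float = 0.02) -> str:
    return "".join(random.choice(BASES) if random.random() < rate else base for base in seq)

def selex_round(pool: list[str], keep_fraction: float = 0.1) -> list[str]:
    """Keep the best binders (elution step), discard the rest, then amplify with mutation."""
    kept = sorted(pool, key=binding_score, reverse=True)[: int(len(pool) * keep_fraction)]
    return [mutate(random.choice(kept)) for _ in range(len(pool))]

if __name__ == "__main__":
    random.seed(1)
    pool = [random_rna() for _ in range(500)]
    for round_number in range(1, 9):
        pool = selex_round(pool)
        perfect = sum(binding_score(s) == len(MOTIF) for s in pool) / len(pool)
        print(f"round {round_number}: {perfect:.0%} of the pool matches the motif perfectly")
```

Over successive rounds the fraction of strong binders grows rapidly, which is the exponential enrichment that gives SELEX its name.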
- Artificial intelligence in space | Scientia News
AI in developing space technologies
Last updated: 09/07/25, 10:56 | Published: 19/11/23, 17:31

Artificial intelligence (AI) has become an important force driving the evolution of technologies that improve human life and help unlock the secrets of the universe beyond our planet. In simple words, AI enables a computer or robot to mimic human intelligence, and it is revolutionizing the way we explore and utilize space, enhancing everything from spacecraft navigation and autonomous decision-making to data analysis and mission planning. This article explores the profound impact of AI on the development of space-related technologies.

Mission planning and design
Space mission planning and the design of payloads and instruments rely on data gathered from previous missions. However, access to all the historic mission data is only granted to individuals with higher-authority access at the space agency, which requires a lot of paperwork and approvals. Recently, NASA came up with a solution, which they named the "Data Acquisition Processing and Handling Network Environment" (DAPHNE) system. DAPHNE is an AI assistant which can access millions of previous mission records, including the most restricted ones, and give scientists insight into their mission without the need for higher-authority access or security clearance. It can also compute and analyze countless input variables to determine the most efficient routes and schedules for missions, which is crucial for long-duration missions or missions with multiple objectives.

Manufacturing
Manufacturing processes for space applications usually involve complex tasks that require high precision and attention to detail. The use of AI in spacecraft manufacturing not only accelerates production but also increases precision and reliability. AI assistants like collaborative robots (cobots) interact with engineers and help them make the right decisions, reduce the overall assembly time, and provide insights about the final product, ensuring that spacecraft are built to the highest standards.

Data processing
Space missions generate vast amounts of data, from images and telemetry to instrument readings. AI algorithms are capable of sifting through this data, identifying patterns, and extracting meaningful insights. An example is the estimation of planetary wind speeds, which requires a combination of satellite imagery and meteorological data. AI tools can rapidly analyze these large datasets and help scientists understand such planetary phenomena and uncover their secrets. This capability is also valuable in missions to study distant galaxies, black holes, and exoplanets.

Navigation & guidance systems
One of the critical applications of AI in space technology is autonomous navigation. Spacecraft traveling vast distances through the cosmos must constantly adjust their trajectories to avoid collisions with celestial bodies and maximize their fuel efficiency. Advanced AI systems can process data in real time and autonomously adjust a spacecraft's course. This not only reduces the need for constant human intervention from the ground station but also allows for more precise and efficient missions.

Astronaut health monitoring
Astronauts in space face a range of health issues, such as bone density loss and cardiovascular problems.
AI systems can continuously monitor physiological data and provide insight into an astronaut's health, including sleep patterns. This allows early detection of health issues and timely intervention, reducing the need for immediate communication with ground mission control and ultimately safeguarding astronauts on long-duration missions.

In summary, AI represents a transformative shift in how we explore and understand our cosmos and its secrets. One day, AI will play an even more significant role, pushing the boundaries of space exploration and bringing us closer to answering some of humanity's most profound questions.

Written by Arun Sreeraj
Related articles: Astronauts in space / AI in drug discovery / Evolution of AI / Chemistry in space exploration
- The role of mesenchymal stem cells (MSCs) in regenerative medicine | Scientia News
The potential of MSCs to treat diseases like rheumatoid arthritis
Last updated: 23/10/25, 10:18 | Published: 28/11/24, 15:16

This is article no. 2 in a three-part series on stem cells. Next article: Regulation and policy of stem cell research. Previous article: An introduction to stem cells.

Welcome to the second article in a series of three articles about stem cells. In this article, I will explore mesenchymal stem cells (MSCs) and their role in regenerative medicine. Additionally, I will consider the potential of MSCs in treating three different diseases: multiple sclerosis (MS), rheumatoid arthritis (RA) and inflammatory bowel disease (IBD). Consider reading Article 1 for more information on mesenchymal stem cells!

Multiple sclerosis (MS)
Multiple sclerosis (MS) is an autoimmune disease affecting the brain and spinal cord. It can cause symptoms such as muscle stiffness and spasms, problems with balance and coordination, vision problems and more. According to the Multiple Sclerosis Society UK (MS Society UK), it is estimated that there are around 150,000 people with MS in the UK, with nearly 7,100 people newly diagnosed every year. Scientists have found that MSCs can be used to treat some of the symptoms of MS, as MSCs protect the nerves in the CNS by secreting substances called neurotrophic growth factors, which increase nerve growth and the survival of nerve cells. These neurotrophic growth factors can also repair damaged nerves, improving nerve function. However, the exact mechanisms of this are still being studied. Furthermore, MSCs can activate the brain's natural healing mechanisms by stimulating the brain's stem cells to become active and repair the damaged tissue. This results in a reduction in the number and severity of symptoms, improving the quality of life for those with MS.

Rheumatoid arthritis (RA)
Rheumatoid arthritis (RA) is a chronic inflammatory autoimmune disease affecting the joints. The charity Versus Arthritis estimates that around 400,000 adults aged 16 and over are affected by RA in the UK. Scientists have found that MSCs can reduce inflammation in the joints because they have immunomodulatory properties, meaning they can regulate the immune system's abnormal responses that cause RA. MSCs suppress immune cell activity, resulting in a decrease in inflammation and joint damage. In addition, MSCs can migrate (travel) to the inflamed joints and release anti-inflammatory molecules, reducing joint swelling and pain and improving the quality of life for those with RA.

Inflammatory bowel disease (IBD)
Inflammatory bowel disease (IBD) is an umbrella term for chronic inflammatory digestive diseases affecting the gastrointestinal tract, including ulcerative colitis and Crohn's disease (CD). A study by the University of Nottingham estimates that 500,000 people in the UK are living with IBD. Scientists have found that MSCs can reduce inflammation and increase tissue repair in the gastrointestinal tract. This is because MSCs can migrate to sites of inflammation in the gut, where they can replace damaged tissue cells. MSCs release signalling molecules that regulate the immune response and reduce inflammation.
They can even directly interact with immune cells in the gut, influencing their behaviour and decreasing the inflammatory response. MSCs can also transfer mitochondria to damaged cells through cell fusion, helping the damaged cells function better and reducing inflammation. This results in reduced inflammation in patients, improving the quality of life for those with IBD.

Looking to the future
MS, RA and IBD are just three of the many diseases MSCs can target, and while many refinements are needed for MSCs to become more viable as treatment options, current findings show promising results. With further development, including more research to understand the exact biology of MSCs, there is massive potential for this approach to revolutionise the treatment of various diseases, including cardiovascular diseases, liver diseases and cancer. As stem cell research continues to advance, policies must also adapt to this changing landscape; watch out for the last article in the series, where I will discuss the regulation and policy of stem cell research!

Written by Naoshin Haque
Related articles: The biggest innovations in the biosciences / Neuromyelitis optica and MS / Crohn's disease










