- Artemis: The Lunar South Pole Base | Scientia News
Artemis: The Lunar South Pole Base
Last updated: 13/12/24, 12:20

Landing on the Moon (again!)

Humans have not visited the Moon since 1972, but that is about to change. Thanks to NASA's Artemis missions, we have already taken the first small step towards our own lunar home for astronauts. NASA has established the second generation of its lunar missions, "Artemis", fittingly named after the ancient Greek goddess of the Moon and Apollo's twin. The ultimate aim of the Artemis missions is to build a stepping stone to Mars: technologies will be developed, tested, and perfected on the Moon before confidence is built to travel on to Mars. NASA has to work with the natural conditions of the Moon, since doing so will allow astronauts to limit their reliance on resources from Earth, extending their stays and therefore their potential for research. The amount achieved would be extremely limited if a lunar mission relied solely on resources from Earth, because rocket payloads are limited. This approach is known as In-Situ Resource Utilisation, and beyond enabling extended lunar stays, its success on the Moon is essential if we hope to one day establish a base on Mars. As a priority, astronauts need access to energy and water. Luckily, the conditions at the lunar south pole may be ideal for both. Unlike Earth, which experiences seasons due to its 23.5° axial tilt, the Moon's tilt is tiny, at only 1.5°. This means some areas at the lunar poles are almost always exposed to sunlight, providing a reliable source of solar energy for a potential Artemis Base Camp. And since the Sun is always near the horizon at the poles, there are also areas in deep craters that never see light at all. These regions of "eternal darkness" can reach temperatures of -235°C, possibly allowing astronauts access to water ice. Aside from access to resources, Artemis has to consider the dangers that come with living in space.
Away from the safety of Earth's protective atmosphere and magnetosphere, astronauts would be exposed to harsh solar wind and cosmic rays. To combat this, NASA hopes to make use of the terrain surrounding the base, highlighting another advantage of the hilly south pole [3]. The exact location of the Artemis Base Camp is currently undecided; we just know it will most likely be near a crater rim at the south pole, on the Earth-facing side to allow communication to and from Earth. Not only is the south pole ideal from a practical standpoint, it is also an area of exciting scientific interest. Scientists will have access to the South Pole–Aitken basin, not only the oldest and largest confirmed impact crater on the Moon, but the second largest confirmed impact crater in the entire Solar System. With a depth of up to 8.2 km and a diameter of around 2,500 km, this huge basin is thought to contain exposed areas of lower crust and mantle, providing an insight into the Moon's history and formation. Additionally, thanks to the areas of "eternal darkness", the water ice found deep within craters at the south pole may hold trapped volatiles up to 3.94 billion years old, which, although not as ancient as previously expected, can still provide an insight into the evolution of the Moon. The scientific potential of the Artemis Base Camp extends far beyond location-specific investigations to our most fundamental understanding of physics, from quantum physics to general relativity. Not to mention the astronauts themselves, as well as the "model organisms" that will be the focus of physiological studies into the effects of extreme space environments.

Artemis timeline overview

Artemis 1 launched on 16th November 2022. It successfully tested two key elements of the Artemis missions, Orion and the Space Launch System (SLS), with an orbit around the Moon. Orion, named after the goddess Artemis' hunting companion, is the spacecraft that will carry the Artemis crew into lunar orbit.
It is carried by the SLS, NASA's super heavy-lift rocket and one of the most powerful rockets in the world. Artemis 2 plans to launch in late 2024 and will be the first crewed Artemis mission, with a lunar flyby carrying four astronauts further than humans have ever travelled beyond Earth. Artemis 3 plans to launch the following year. It will be the historic moment that sees humans set foot on the surface of the Moon for the first time since we left in 1972. The mission will be the first use of another key element of the Artemis missions, the Human Landing System (HLS). Astronauts will use a lunar version of SpaceX's Starship rocket as the HLS for Artemis 3 and 4. (Starship is currently in its test stage, with its second test launch carried out very recently, on 18th November 2023.) Two astronauts will stay on the lunar surface for about a week, beating the current record of 75 hours on the Moon set by Apollo 17. Artemis 4 plans to launch in 2028. The mission will include the first use of Gateway, another key element of the Artemis missions. Gateway will be a multifunctional lunar space station, designed to transfer astronauts between Orion and the HLS, as well as hosting astronauts to live and carry out research in lunar orbit. Gateway will be constructed over Artemis 4-6, with each mission adding a module. NASA plans to have Artemis missions extending for years beyond this, with over 10 proposed and more expected. Eventually we will have a working base on the Moon, with astronauts able to stay for months at a time. Having already started a year ago, Artemis will continue to expand our horizons. We can look forward to uncovering long-held secrets of the Moon and, soon, setting our sights confidently on Mars.

Written by Imo Bell

Related articles: Exploring Mercury / Fuel for the colonisation of Mars / Nuclear fusion

REFERENCES

How could we live on the Moon? - Institute of Physics.
Available at: https://www.iop.org/explore-physics/moon/how-could-we-live-on-the-moon
Understanding Physical Sciences on the Moon - NASA. Available at: https://science.nasa.gov/lunar-science/focus-areas/understanding-physical-sciences-on-the-moon
NASA's Artemis Base Camp on the moon will need light, water, elevation - NASA. Available at: https://www.nasa.gov/humans-in-space/nasas-artemis-base-camp-on-the-moon-will-need-light-water-elevation
Zuber, M.T. et al. (1994) 'The shape and internal structure of the Moon from the Clementine Mission', Science, 266(5192), pp. 1839–1843. doi:10.1126/science.266.5192.1839.
Flahaut, J. et al. (2020) 'Regions of interest (ROI) for future exploration missions to the Lunar South Pole', Planetary and Space Science, 180, p. 104750. doi:10.1016/j.pss.2019.104750.
Moriarty, D.P. et al. (2021) 'The search for lunar mantle rocks exposed on the surface of the Moon', Nature Communications, 12(1). doi:10.1038/s41467-021-24626-3.
Estimates of water ice on the Moon get a 'dramatic' downgrade - Physics World. Available at: https://physicsworld.com/a/estimates-of-water-ice-on-the-moon-get-a-dramatic-downgrade
Biological Systems in the lunar environment - NASA. Available at: https://science.nasa.gov/lunar-science/focus-areas/biological-systems-in-the-lunar-environment
Artemis - NASA. Available at: https://www.nasa.gov/specials/artemis
- Psychology | Scientia News
Psychology Articles

Psychology delves into the human mind and behaviour. Read on for compelling articles, ranging from reward sensitivity to evolutionary and empathy-altruism theories. You may also like: Biology, Medicine

- Motivating the mind: effect of socioeconomic status on reward sensitivity
- The evolutionary theory by Darwin vs empathy-altruism: explaining altruism through different theories
- A perspective on well-being: hedonic vs eudaimonic, based on the principles of Aristotle and Aristippus
- Nature vs. nurture in childhood intelligence: what matters most?
- The psychology of embarrassment: why do we feel this emotion? Models and theories
- A primer on the Mutualism theory of intelligence: a detailed review of different studies
- Potential vaccine for malaria | Scientia News
Could this new vaccine spell the end of malaria?
Last updated: 07/11/24

Malaria is a vicious parasitic disease spread through the bite of the female Anopheles mosquito, with young children being its most frequent victims. In 2021, there were over 600,000 reported deaths, giving an insight into its alarming virulence. The main obstacle to lessening malaria's disease burden is the challenge of creating a potent vaccine. The parasite uses a tactic known as antigenic variation: its extensive genetic diversity of antigens allows it to modulate its surface coat and thereby effectively evade the host immune system. However, unlike other variable malaria surface proteins, RH5, the protein required to invade red blood cells (RBCs), does not vary and is therefore a promising target. Researchers at the University of Oxford have demonstrated that various human antibodies block the interaction between the RH5 malaria protein and host RBCs, providing hope for a new way to combat this deadly disease. The researchers have reported up to 80% vaccine efficacy, surpassing the WHO's goal of developing a malaria vaccine with 75% efficacy. This vaccine therefore has the potential to be the world's first highly effective malaria vaccine, and with adequate support for its release, we could be well on our way to seeing a world without child deaths from malaria.

Written by Bisma Butt
- The dopamine connection | Scientia News
The dopamine connection
Last updated: 10/05/24, 10:34

How your gut influences your mood and behaviour

Introduction to dopamine

Dopamine is a neurotransmitter derived from an amino acid called phenylalanine, which must be obtained through the diet, from foods such as fish, meat, dairy and more. Dopamine is produced and released by dopaminergic neurons in the central nervous system and can be found in different brain regions. The neurotransmitter acts via two mechanisms: wiring transmission and volume transmission. In wiring transmission, dopamine is released into the synaptic cleft and acts on postsynaptic dopamine receptors. In volume transmission, extracellular dopamine arrives at neurons other than postsynaptic ones: through processes such as diffusion, dopamine reaches receptors on neurons that are not in direct contact with the cell that released it. In both mechanisms, dopamine binds to receptors, transmitting signals between neurons and affecting mood and behaviour.

The link between dopamine and gut health

Dopamine is associated with positive emotional states, including pleasure, satisfaction and motivation, and these can be influenced by gut health. Therefore, what you eat, among other factors, could impact your mood and behaviour. This was demonstrated by a study (Hamamah et al., 2022) which looked at the bidirectional gut-brain connection. The study found that the gut microbiota is important in maintaining concentrations of dopamine via the gut-brain connection, also known as the gut microbiota-brain axis or vagal gut-to-brain axis. This is the communication pathway between the gut microbiota and the brain facilitated by the vagus nerve, and it is important in the neuronal reward pathway, which regulates motivational and emotional states.
Activating the vagal gut-to-brain axis leads to dopamine release, suggesting that modulating dopamine levels through the gut could be a potential treatment approach for dopamine-related disorders. Examples of gut microbiota include Prevotella, Bacteroides, Lactobacillus, Bifidobacterium, Clostridium, Enterococcus, and Ruminococcus, and they can affect dopamine by modulating dopaminergic activity. These gut microbes are able to produce neurotransmitters, including dopamine, and their functions and bioavailability in the central nervous system and periphery are influenced by the gut-brain axis. Gut dysbiosis is a disturbance of the healthy intestinal flora, and it can contribute to dopamine-related disorders, including Parkinson's disease, ADHD, depression, anxiety, and autism. Gut microbes that produce butyrate, a short-chain fatty acid, positively impact dopamine and contribute to reducing the symptoms and effects seen in neurodegenerative disorders.

Dopamine as a treatment

It is important to understand the link between dopamine and gut health, as this could point to new therapeutic targets and improve current methods used to prevent and restore deficiencies in dopamine function in different disorders. Most cells in the immune system carry dopamine receptors, through which processes such as antigen presentation, T-cell activation, and inflammation are regulated. Further research could open up the possibility of using dopamine-targeted medication to treat diseases by changing the activity of dopamine receptors. Dopamine is therefore important in various physiological processes, in both the central nervous and immune systems. For example, studies have shown that schizophrenia can be treated with antipsychotic medications that target dopamine neurotransmission, reducing dysregulated dopamine signalling.
Studies have shown promising results regarding dopamine being used as a form of treatment. Nevertheless, further research is needed to understand the interactions between dopamine, motivation and gut health, and to explore how this knowledge can be used to create medications to treat these conditions.

Conclusion

The bidirectional gut-brain connection shows the importance of the gut microbiota in controlling dopamine levels. This connection not only influences mood and behaviour but also has the potential to lead to new and innovative dopamine-targeted treatments for dopamine-related disorders. For example, scientists could target and manipulate dopamine receptors in the immune system to regulate the processes mentioned above: antigen presentation, T-cell activation, and inflammation. While current research has shown some promising results, further investigation is needed to better understand the connection between gut health and dopamine levels. Through continued study, scientists can gain a deeper understanding of how changes in the gut microbiota affect dopamine regulation and influence mood and behaviour.

Written by Naoshin Haque

Related articles: the gut microbiome / Crohn's disease
- Anaemia | Scientia News
Anaemia
Last updated: 03/06/24, 14:57

A disease of the blood

This is article no. 1 in a series about anaemia. Next article: iron-deficiency anaemia

Introduction

Erythrocytes in their typical state are biconcave, nucleus-free cells responsible for carrying oxygen and carbon dioxide. Their production is controlled by erythropoietin, and as they mature in the bone marrow, they lose their nuclei. These red blood cells (RBCs) contain haemoglobin, which transports oxygen and depends on iron: iron is a key component of haem, and insufficient iron leads to anaemic disorders. Low oxygen-carrying capacity may be caused by too few RBCs in circulation or by RBC dysfunction. Haem iron is acquired through the digestion of meat and transported, in its soluble form, through the enterocytes of the duodenum. Erythrocytic iron accounts for approximately 50% of the iron in blood. Metals cannot move freely throughout the body, so they must be carried; the molecule that transports iron is known as transferrin. Plasma transferrin saturation refers to the proportion of transferrin that is bound to iron, and in iron-deficiency anaemia (IDA) it will always be low. Anaemia can be physiological or pathological, and has a plethora of causes: malabsorption due to diet or gastrointestinal (GI) conditions; genetic dispositions such as the sideroblastic anaemias (SA) and thalassaemia; deficiency in erythropoietin due to comorbidities and chronic disease; haemolysis caused by autoimmune disorders, infections and drugs; or blood loss.

Haem

In haem, iron sits at the centre of a protoporphyrin ring. Haemoglobin itself consists of two alpha and two beta globin chains, each carrying a haem group, which together form a single haemoglobin macromolecule. Microcytic anaemias arise from problems in the creation of haemoglobin: sourcing iron through the diet (IDA), synthesising protoporphyrin (SA), or globin chain defects caused by thalassaemia.
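The transferrin saturation mentioned above is simple arithmetic (serum iron divided by total iron-binding capacity), and can be sketched in a few lines of Python. The example values and the 20% cut-off below are illustrative assumptions for this sketch, not clinical reference ranges.

```python
# Illustrative sketch: transferrin saturation (TSAT) as a screening value.
# The cut-off and the example numbers are assumptions, not clinical guidance.

def transferrin_saturation(serum_iron_umol_l: float, tibc_umol_l: float) -> float:
    """TSAT (%) = serum iron / total iron-binding capacity x 100."""
    return 100.0 * serum_iron_umol_l / tibc_umol_l

def suggests_iron_deficiency(tsat_percent: float, cutoff: float = 20.0) -> bool:
    """Flag a TSAT below an (assumed) cut-off, the pattern expected in IDA."""
    return tsat_percent < cutoff

# Hypothetical values: low serum iron with a raised TIBC, as seen in IDA.
tsat = transferrin_saturation(serum_iron_umol_l=7.0, tibc_umol_l=80.0)
print(f"TSAT = {tsat:.2f}%")           # TSAT = 8.75%
print(suggests_iron_deficiency(tsat))  # True
```

The point of the sketch is only that TSAT falls when iron is scarce while the iron-carrying capacity (transferrin) rises, which is why the article notes it is "always low" in IDA.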
Summary

Anaemia is a multifactorial condition with many different mechanisms involved. Microcytic anaemias have an issue at the haemoglobin level, and these can be acquired or inherited: a microcytic anaemia is caused by a failure to efficiently synthesise haemoglobin, whether from iron, protoporphyrin rings or globin chains. The diagnosis of anaemias relies on a patient's background and medical history, as there are many factors involved in an anaemic disorder. A diagnosis should be patient-led, as the age and sex of the patient can significantly point to the origin and pathogenesis, as well as the prognosis and follow-up care.

By Lauren Kelly
- A deep dive into the hallmarks defining Alzheimer’s disease | Scientia News
A deep dive into the hallmarks defining Alzheimer's disease
Last updated: 12/12/24, 12:19

Exploring the distinctive features that define and disrupt the brain

The progressive decline in neurocognition that has a detrimental effect on one's activities of daily living is referred to as dementia. It typically affects people over the age of 65. Multiple theories have been proposed to explain the pathogenesis of Alzheimer's disease (AD), including the buildup of amyloid plaques in the brain and the formation of neurofibrillary tangles (NFTs) in cells. Understanding the pathophysiology of AD is imperative to the development of therapeutic strategies, so this article will outline the major hallmarks and mechanisms of AD.

Hallmark 1: amyloid plaques

One of the most widely accepted hypotheses for AD is the accumulation of amyloid beta protein (Aβ) in the brain. Aβ is a 4.2 kDa peptide of approximately 40–42 amino acids, derived from a precursor molecule called amyloid precursor protein. This process, defined as amyloidosis, is strongly linked to brain aging and neurocognitive decline. How do the amyloid plaques form? See Figure 1.

Reasons for the accumulation of amyloid plaques:

Decreased autophagy: amyloid proteins are abnormally folded proteins. Autophagy in the brain is primarily carried out by neuronal and glial cells, involving key structures known as autophagosomes and lysosomes. When autophagy becomes downregulated, the metabolism of Aβ is impaired, eventually resulting in plaque buildup.

Overproduction of acetylcholinesterase (AChE): acetylcholine (ACh) is the primary neurotransmitter involved in memory, awareness, and learning. Overproduction of AChE by astrocytes into the synaptic cleft can lead to excessive breakdown of ACh, with detrimental effects on cognition.

Reduced brain perfusion: blood flow delivers the nutrients and oxygen necessary for cellular function.
Reduced perfusion can lead to "intracerebral starvation", depriving cells of the energy needed to clear Aβ.

Reduced expression of low-density lipoprotein receptor-related protein 1 (LRP1): LRP1 receptors are abundant in the central nervous system under normal conditions. They speed up the metabolic clearance of Aβ by binding it and transporting it from the central nervous system into the blood, thereby reducing buildup. Reduced LRP1 expression can hinder this process, leading to amyloid accumulation.

Increased expression of the receptor for advanced glycation end products (RAGE): RAGE is expressed on the endothelial cells of the blood-brain barrier (BBB), and its interaction with Aβ facilitates the entry of Aβ into the brain.

Hallmark 2: neurofibrillary tangles

See Figure 2. Neurofibrillary tangles are excessive accumulations of tau protein. Microtubules normally support neurons by guiding nutrients from the soma (cell body) to the axons, and tau proteins stabilise these microtubules. In AD, signalling pathways involving phosphorylation and dephosphorylation cause tau proteins to detach from microtubules and stick to each other, eventually forming tangles. This disrupts the synaptic communication of action potentials, although the exact mechanism remains unclear. Recent studies suggest an interaction between Aβ and tau, where Aβ can cause tau to misfold and aggregate, forming neurofibrillary tangles inside brain cells. Both Aβ and tau can self-propagate, spreading their toxic effects throughout the brain. This creates a vicious cycle, where Aβ promotes tau toxicity, and toxic tau further exacerbates the harmful effects of Aβ, ultimately causing significant damage to synapses and neurons in AD.

Hallmark 3: neuroinflammation

Microglia are the primary phagocytes in the central nervous system.
They can be activated by dead cells and protein plaques, whereupon they initiate the innate immune response. This involves the release of chemokines to attract other white blood cells and activation of the complement system, a group of proteins that initiates inflammatory pathways to fight pathogens. In AD, microglia bind to Aβ via various receptors. Due to the substantial accumulation of Aβ, microglia are chronically activated, leading to sustained immune responses and neuroinflammation.

Conclusion

The contributions of amyloid beta plaques, neurofibrillary tangles and chronic neuroinflammation provide a framework for understanding the pathophysiology of AD. AD remains a highly complex condition with unclear mechanisms, which underscores the need for continued research in this area, crucial for the development of effective treatments.

Written by Blessing Amo-Konadu

Related articles: Alzheimer's disease (an overview) / CRISPR-Cas9 to potentially treat AD

REFERENCES

2024 Alzheimer's Disease Facts and Figures (2024). Alzheimer's & Dementia, 20(5). doi:10.1002/alz.13809.
Janeway, C.A., Travers, P., Walport, M. and Shlomchik, M.J. (2001). The complement system and innate immunity. Available at: https://www.ncbi.nlm.nih.gov/books/NBK27100/.
Bloom, G.S. (2014). Amyloid-β and tau: the trigger and bullet in Alzheimer disease pathogenesis. JAMA Neurology, 71(4), pp. 505–8. doi:10.1001/jamaneurol.2013.5847.
Braithwaite, S.P., Stock, J.B., Lombroso, P.J. and Nairn, A.C. (2012). Protein phosphatases and Alzheimer's disease. Progress in Molecular Biology and Translational Science, 106, pp. 343–379. doi:10.1016/B978-0-12-396456-4.00012-2.
Heneka, M.T., Carson, M.J., El Khoury, J., Landreth, G.E., Brosseron, F., Feinstein, D.L., Jacobs, A.H., Wyss-Coray, T., Vitorica, J., Ransohoff, R.M., Herrup, K., Frautschy, S.A., Finsen, B., Brown, G.C., Verkhratsky, A., Yamanaka, K., Koistinaho, J., Latz, E., Halle, A. and Petzold, G.C. (2015). Neuroinflammation in Alzheimer's disease. The Lancet Neurology, 14(4), pp. 388–405. doi:10.1016/S1474-4422(15)70016-5.
Kempf, S. and Metaxas, A. (2016). Neurofibrillary tangles in Alzheimer's disease: elucidation of the molecular mechanism by immunohistochemistry and tau protein phospho-proteomics. Neural Regeneration Research, 11(10), p. 1579. doi:10.4103/1673-5374.193234.
Kumar, A., Tsao, J.W., Sidhu, J. and Goyal, A. (2022). Alzheimer disease. National Library of Medicine. Available at: https://www.ncbi.nlm.nih.gov/books/NBK499922/.
Ma, C., Hong, F. and Yang, S. (2022). Amyloidosis in Alzheimer's disease: pathogeny, etiology, and related therapeutic directions. Molecules, 27(4), p. 1210. doi:10.3390/molecules27041210.
National Institute on Aging (2024). What happens to the brain in Alzheimer's disease? Available at: https://www.nia.nih.gov/health/alzheimers-causes-and-risk-factors/what-happens-brain-alzheimers-disease.
Stavoe, A.K.H. and Holzbaur, E.L.F. (2019). Autophagy in neurons. Annual Review of Cell and Developmental Biology, 35(1), pp. 477–500. doi:10.1146/annurev-cellbio-100818-125242.
- Crohn's disease | Scientia News
Crohn's disease
Last updated: 09/01/25, 12:13

Unmasking the complexities of the condition

Introduction

Crohn's disease is a chronic inflammatory condition that primarily targets the gastrointestinal tract. While it most commonly afflicts individuals aged 20 to 50, it can also manifest in children and older adults, albeit less frequently. Symptoms vary widely and may include lesions anywhere from the mouth to the anus, along with prevalent issues such as diarrhoea, abdominal pain, weight loss, rectal bleeding, fatigue, and fever.

Diagnosis

Diagnosing Crohn's disease can be challenging due to its similarity to other conditions. However, specific symptoms like bloody diarrhoea, iron deficiency, and unexplained weight loss are significant indicators that warrant further investigation by a gastroenterologist. Several tests can help confirm Crohn's disease:

Endoscopy: endoscopy, including procedures like colonoscopy and upper endoscopy, is a dependable method for diagnosing Crohn's disease and distinguishing it from other conditions with similar symptoms. During these procedures, a thin, flexible tube called an endoscope is inserted (via the rectum or the mouth, depending on the region examined) to visually inspect the gastrointestinal tract and collect small tissue samples for further analysis.

Imaging: computed tomography (CT), magnetic resonance imaging (MRI), and ultrasonography are valuable tools for assessing disease activity and detecting complications associated with Crohn's disease. These imaging techniques can examine areas of the gastrointestinal tract that may not be accessible via endoscopy, providing comprehensive insights into the condition's progression and associated issues.

Laboratory testing: various laboratory tests, including a complete blood count, C-reactive protein levels, pregnancy tests, and stool samples, are conducted to screen for Crohn's disease.
These tests are typically the initial step in diagnosis, helping to avoid the need for more invasive procedures like endoscopies and imaging. Laboratory testing may also involve inflammatory markers such as the erythrocyte sedimentation rate (ESR) and faecal calprotectin to further aid diagnosis and monitoring of the condition.

Treatment and prevention

While there is currently no cure for Crohn's disease, numerous treatments have been developed to manage symptoms effectively and sometimes even induce remission. When determining a treatment plan, factors such as age, specific symptoms, and the severity of inflammation are taken into careful consideration. Corticosteroids and immunomodulators are medications commonly used to manage Crohn's disease. Corticosteroids work by reducing inflammation and suppressing the immune system, and are typically employed to address flare-ups due to their rapid action. However, they are not suitable for long-term use, as they may lead to significant side effects. In contrast, maintenance therapy often involves immunomodulators such as azathioprine, methotrexate, or biologic agents like anti-TNF drugs (such as infliximab or adalimumab). These medications target specific immune pathways to dampen the overactive inflammatory response. Research indicates that immunomodulators are associated with fewer adverse effects than corticosteroids and are effective in maintaining remission. Monoclonal antibody treatment is another approach used to manage symptoms and sustain remission in Crohn's disease. These therapies are categorised as biologic treatments, targeting precise molecules involved in inflammation and the immune response. Despite carrying certain risks, such as infections, the likelihood of developing cancer with these treatments is typically deemed low. Crohn's disease frequently leads to complications that may necessitate surgical intervention.
Gastrointestinal surgery can greatly alleviate symptoms and enhance quality of life, but it is usually considered only when medical therapy proves insufficient to control the disease or when complications arise. Although the exact cause of Crohn's disease remains uncertain, factors such as genetics, immune system dysfunction, and environmental influences are believed to contribute to its development. While there is no definitive evidence pinpointing specific causative factors, numerous studies suggest potential links to an unhealthy diet and lifestyle, dysbiosis (an imbalance of healthy and unhealthy gut bacteria), smoking, and a family history of the disease. It is therefore sensible to minimise exposure to these risk factors to decrease the likelihood of developing Crohn's disease.

Written by Sherine Abdul Latheef

Related articles: the gut microbiome / the dopamine connection / Diverticular disease

REFERENCES

Veauthier, B. and Hornecker, J.R. (2018). Crohn's disease: diagnosis and management. American Family Physician, 98(11), pp. 661-669.
Torres, J., Mehandru, S., Colombel, J.F. and Peyrin-Biroulet, L. (2017). Crohn's disease. Lancet, 389(10080), pp. 1741-1755. doi:10.1016/S0140-6736(16)31711-1.
Mills, S.C., von Roon, A.C., Tekkis, P.P. and Orchard, T.R. (2011). Crohn's disease. BMJ Clinical Evidence, 2011:0416.
Sealife, A. (2024). Crohn's disease. Parkland Natural Health. Available at: https://wellness-studio.co.uk/crohns-disease/ (Accessed: 09 March 2024).
How to stop anxiety stomach pain & cramps (2022). Calm Clinic. Available at: https://www.calmclinic.com/anxiety/symptoms/stomach-pain (Accessed: 09 March 2024).
- Polypharmacy: the complex landscape of multiple medications | Scientia News
Polypharmacy: the complex landscape of multiple medications
Last updated: 21/10/24, 14:40

From the eyes of a chemist

The concurrent use of many medications by a patient, known as polypharmacy, poses a complex challenge to modern healthcare, especially for the elderly and those with chronic diseases. Polypharmacy raises the risk of adverse drug reactions, drug interactions, and medication non-adherence, even though it is often essential for managing complicated health concerns. To optimise patient outcomes and guarantee safe treatment regimens, it is crucial to understand the chemical interactions and effects of combined medications.

The Chemistry Behind Polypharmacy

Polypharmacy stems from the intricate interactions between several chemicals in the human body. Every drug has unique chemical components intended to interact with biological targets in order to provide therapeutic benefits. Nevertheless, when several medications are taken at once, their combination may have unexpected effects. Understanding polypharmacy requires a thorough grasp of pharmacokinetics (the way the body absorbs, distributes, metabolises, and excretes medications) and pharmacodynamics (the effects of pharmaceuticals on the body). For example, some pharmaceuticals induce or inhibit the enzymes that metabolise other drugs, changing drug levels and possibly increasing toxicity or decreasing effectiveness.

Analytical Methods in Polypharmacy Management

Chemistry offers a number of analytical and instrumental techniques for effective polypharmacy management. Drug levels in the blood are tracked using methods such as mass spectrometry (MS) and high-performance liquid chromatography (HPLC) to make sure they stay within therapeutic ranges. These techniques support dose modifications by identifying possible medication interactions.
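The enzyme-inhibition interaction described above can be illustrated with the standard steady-state relation from pharmacokinetics, Css = dosing rate / clearance: when a co-prescribed inhibitor lowers a drug's clearance, its steady-state concentration and half-life rise in proportion. The sketch below uses only this textbook relation; every number is a hypothetical illustration chosen for the arithmetic, not dosing guidance for any real drug.

```python
import math

# Minimal pharmacokinetic sketch of one drug-interaction mechanism:
# an enzyme inhibitor halves a drug's clearance, doubling its exposure.
# All values are hypothetical and for illustration only.

def steady_state_conc(dose_rate_mg_per_h: float, clearance_l_per_h: float) -> float:
    """Average steady-state concentration (mg/L): dosing rate / clearance."""
    return dose_rate_mg_per_h / clearance_l_per_h

def half_life_h(clearance_l_per_h: float, vd_l: float) -> float:
    """Elimination half-life (h) from clearance and volume of distribution."""
    return math.log(2) * vd_l / clearance_l_per_h

dose_rate = 10.0    # mg/h, hypothetical continuous dosing rate
cl_normal = 5.0     # L/h, clearance without the interacting drug
cl_inhibited = 2.5  # L/h, clearance halved by a co-prescribed enzyme inhibitor

print(steady_state_conc(dose_rate, cl_normal))     # 2.0 (mg/L)
print(steady_state_conc(dose_rate, cl_inhibited))  # 4.0 (mg/L): exposure doubles
print(half_life_h(cl_inhibited, vd_l=40.0))        # half-life also doubles
```

This is why therapeutic drug monitoring with MS or HPLC matters: a dose that was safe alone can drift out of the therapeutic range once an interacting drug changes clearance.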
Furthermore, the importance of chemistry in the creation of drug interaction databases and predictive modelling tools cannot be overstated. By helping medical professionals foresee and minimise harmful medication interactions, these resources help to ensure patient safety.

The Role of Healthcare Professionals

To successfully manage the complexity of polypharmacy, healthcare professionals, including physicians, pharmacists, and nurses, need a solid understanding of chemistry. Their expertise is essential for assessing the necessity of each drug, taking possible interactions into account, and devising ways to make drug regimens easier to follow. Pharmacists play an especially important role in managing polypharmacy: they review patients' prescriptions, check for interactions, and suggest changes or substitutes using their knowledge of medicinal chemistry. Pharmacists who participate in collaborative care can greatly lower the hazards associated with polypharmacy.

Innovations in Medication Management

Chemistry-driven advances in medical technology are improving polypharmacy management. Computerised physician order entry (CPOE) systems coupled with clinical decision support systems (CDSS) can provide prescribers with real-time alerts about potential drug interactions. The emergence of personalised medicine, which adjusts drug regimens according to a patient's genetic profile, may also help optimise polypharmacy.

Conclusion

Polypharmacy remains a significant challenge in healthcare, demanding a comprehensive understanding of chemistry and pharmacology to manage effectively. By combining modern analytical methods, predictive technologies, and multidisciplinary teamwork, healthcare practitioners can minimise the hazards associated with multiple medications and provide safer, better patient care.

Written by Laura K
- How does moving houses impact your health and well-being? | Scientia News
Last updated: 04/09/24, 16:14

Evaluating the advantages and disadvantages of gentrification in the context of health

Introduction

According to the World Health Organization (WHO), health is "a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity". Health can also be defined as an individual being in a state of equilibrium within themselves and with their surrounding environment, including their social interactions. Reflecting on historical views of health, ancient Indian and Chinese medicine and Ancient Greek society thought of health as harmony between a person and their environment, underlining the cohesion between soul and body; this is similar to the WHO's definition. Considering these ideas, one key determinant of health is gentrification (see Figure 1). The term was first defined in 1964 by the British sociologist Ruth Glass, who witnessed dilapidated houses in the London Borough of Islington being taken over and renovated by middle-class proprietors. The broader consequences of gentrification include improved living conditions for some residents, changes in ownership requirements, increased land and house prices, and transformations in the social class structure. These changes also push lower-income inhabitants out, or into poorer neighbourhoods, where conditions such as racial segregation lead to inequities and discrepancies in health. For example, a systematic review found that elderly and Black residents were affected more by gentrification than younger and White residents, highlighting the importance of targeted support and interventions for specific populations during urban renewal.
Given this background, this article delves further into the advantages and disadvantages of gentrification in the context of health outcomes.

Advantages of gentrification

Gentrification does have its benefits. Firstly, it is positively linked with collective efficacy, which refers to enhanced social cohesion within neighbourhoods and the maintenance of shared norms; this has health benefits for residents, such as decreased rates of obesity, sexually transmitted diseases, and all-cause mortality. Another advantage is the possibility of economic growth: as more affluent tenants move into a neighbourhood, they can bring companies, assets, and increased demand for local goods and services, creating more jobs for residents. Decreased crime rates in newly developed areas have also been attributed to gentrification, because the inflow of wealthier citizens often brings a stronger sense of community and investment in local security. This revitalised feeling of safety can make these neighbourhoods more appealing to existing and new inhabitants, leading to further economic development. Moreover, reduced crime can improve health outcomes, for example by reducing stress and anxiety among residents. As a result, the community's general well-being can improve, leading to healthier lifestyle choices and more vibrant neighbourhoods. Furthermore, the longer a person lives in a gentrifying neighbourhood, the better their self-reported health, an effect that does not differ by race or ethnicity, as observed in Los Angeles.

Disadvantages of gentrification

However, it is also essential to mention the drawbacks of gentrification, which are more numerous.
In a qualitative study involving elderly participants, for example, one stated: "The cost of living increases, but the money that people get by the end of the month is the same, this concerning those … even retired people, and people receiving the minimum wage, the minimum wage increases x every year, isn't it? But it is not enough". Elderly residents in Barcelona faced comparable residential displacement between 2011 and 2017, as younger adults with higher incomes and those pursuing university education moved into the city. These cases spotlight how gentrification can raise the cost of living without a corresponding rise in earnings, making it difficult for people on lower incomes or vulnerable individuals to remain in these areas. Likewise, a census of residents in gentrified neighbourhoods in Pittsburgh showed that participants more frequently reported negative health changes and reduced resources. Additionally, one study examining qualitative data from 14 cities in Europe and North America commonly observed that gentrification harms the health of historically marginalised communities through threats to housing and financial security, socio-cultural displacement, loss of services and amenities, and increased exposure to crime and compromised public safety. The same pattern is seen in green gentrification, where long-time historically marginalised inhabitants feel excluded from green or natural spaces and are less likely to use them than newer residents. To mitigate these negative impacts, inclusive urban renewal guidelines should be drafted that consider vulnerable populations, so that physical and social improvements translate into health benefits. The first step would be to give residents sufficient information and to establish trust between them and the local authorities, because any inequality in the provision of social options dramatically affects people's health-related behaviours.
Intriguingly, gentrification has been shown to increase exposure to tick-borne pathogens, through populations staying in place, displacement within urban areas, and relocation to the suburbs. This raises tick-borne disease risk, posing a health hazard to affected residents (Figure 2). As for mental health, research has indicated that residing in gentrified areas is linked to greater levels of anxiety and depression in older adults and children. Additionally, one study found that young people experienced spatial disconnection and affective exclusion due to gentrification and felt disoriented by the speed of the transition. All of these problems reveal that gentrification can harm public health and well-being, aggravating disparities and creating feelings of isolation and loneliness in affected communities.

Conclusion

Gentrification is a complicated and controversial process with noteworthy consequences for the health of neighbourhoods. Its advantages include enhanced infrastructure and boosted economic prospects, potentially leading to fairer access to healthcare services and improved health outcomes for residents. However, gentrification often leads to displacement and the loss of affordable housing, which can harm the health of vulnerable populations. It is therefore vital for policymakers and stakeholders to carefully evaluate the likely health effects of gentrification and to enforce mitigation strategies that safeguard the well-being of all citizens (see Table 1).

Written by Sam Jarada

Related article: A perspective on well-being

REFERENCES

WHO. Health and well-being. Who.int; 2015. Available from: https://www.who.int/data/gho/data/major-themes/health-and-well-being
Sartorius N. The meanings of health and its promotion. Croatian Medical Journal. 2006;47(4):662–4. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2080455/
Krahn GL, Robinson A, Murray AJ, Havercamp SM, Havercamp S, Andridge R, et al. It's time to reconsider how we define health: perspective from disability and chronic condition. Disability and Health Journal. 2021 Jun;14(4):101129. Available from: https://www.sciencedirect.com/science/article/pii/S1936657421000753
Svalastog AL, Donev D, Jahren Kristoffersen N, Gajović S. Concepts and definitions of health and health-related values in the knowledge landscapes of the digital society. Croatian Medical Journal. 2017 Dec;58(6):431–5. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5778676/
Foryś I. Gentrification on the example of suburban parts of the Szczecin urban agglomeration. Real Estate Management and Valuation. 2013 Sep 1;21(3):5–14.
Uribe-Toril J, Ruiz-Real J, de Pablo Valenciano J. Gentrification as an emerging source of environmental research. Sustainability. 2018 Dec 19;10(12):4847.
Schnake-Mahl AS, Jahn JL, Subramanian SV, Waters MC, Arcaya M. Gentrification, neighborhood change, and population health: a systematic review. Journal of Urban Health. 2020 Jan 14;97(1):1–25.
- Evolution of AI and the role of NLP | Scientia News
The evolution of AI: understanding the role of NLP technologies

Last updated: 05/11/24

Artificial intelligence (AI) has long been a controversial topic, with some people fearing its potential consequences. This fear has been amplified by popular culture: movies such as "The Terminator" and "2001: A Space Odyssey" depict AI systems becoming self-aware and turning against humans, and "The Matrix" portrayed a dystopian future where AI systems had enslaved humanity. Fast forward to 2023, and AI has become a normal part of everyday life, whether we realise it or not. From virtual assistants like Siri and Alexa to personalised movie and product recommendations, AI-powered technologies have revolutionised the way we interact with technology. AI also plays a critical role in industries such as healthcare, finance, and transportation, with algorithms helping to analyse data, identify patterns, and make predictions that lead to better decision-making. As with any industry, the AI industry is prone to evolution; this is especially true because it engages with user habits to learn and refine its understanding, which has led to the introduction of unforeseen technologies. One of the most studied and developed areas of AI, natural language processing (NLP), has come under particular focus recently with the emergence of technologies such as OpenAI's ChatGPT, Google's Bard AI and Microsoft's Bing AI. ChatGPT in particular was one of the first technologies of this kind to garner significant fame. Within its first year of release, the GPT-3 model had more than 10,000 registered developers and over 300 applications built on its API. In addition, Microsoft acquired an exclusive license to OpenAI's GPT-3 technology in 2020, further solidifying its position as a leading language model in the industry.
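To give a flavour of what "processing language" means in practice, here is a minimal, illustrative sketch of the very first steps shared by many NLP pipelines: normalising text and splitting it into discrete tokens that can be counted or fed to a model. This is a toy example using only the Python standard library; real large language models use learned subword tokenisers rather than this simple regular expression, but the principle of turning text into countable units is the same.

```python
# A minimal sketch of early NLP pipeline steps: normalise text,
# split it into word tokens, and count token frequencies.
import re
from collections import Counter

def tokenise(text):
    """Lowercase the text and extract word tokens (letters and apostrophes)."""
    return re.findall(r"[a-z']+", text.lower())

sentence = "AI systems process language by breaking language into tokens."
tokens = tokenise(sentence)
print(tokens)
print(Counter(tokens)["language"])  # 'language' occurs twice
```

Counting tokens like this underlies classic bag-of-words methods; modern models go much further, learning contextual representations of each token, but they still begin by segmenting raw text.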
ChatGPT is an advanced artificial intelligence system designed to understand and process human language. Built on the GPT-3.5 architecture, it uses natural language processing (NLP) to comprehend input and generate responses that simulate human conversation. ChatGPT is classified as a large language model, meaning it has been trained on vast amounts of data and can generate high-quality text that is both coherent and relevant to the input provided. While concerns have been raised about the potential impact of NLP technologies, there are several reasons not to fear their emergence. Firstly, NLP has already enabled a wide range of useful applications with the potential to improve efficiency, convenience, and accessibility. Furthermore, the development and deployment of NLP technologies are subject to ethical considerations and regulations that aim to ensure their responsible use. NLP technologies are not designed to replace humans, but to complement and enhance human capabilities. While some jobs may be affected by automation, new jobs are likely to emerge that require human skills not easily replicated by machines. Ultimately, the impact of NLP technologies depends on how they are developed and used. There will always be risks, but by taking a proactive approach to their development and deployment, we can ensure they are used to benefit society and advance human progress.

Written by Jaspreet Mann

Related article: AI: the good, the bad, and the future

REFERENCES

Hirschberg J, Manning CD. Advances in natural language processing. Science. 2015 Jul;349(6245):261–66. https://doi.org/10.1126/science.aaa8685
What is natural language processing? IBM. Available at: https://www.ibm.com/topics/natural-language-processing (Accessed 1 May 2023).
Biswas SS. Role of ChatGPT in public health. Annals of Biomedical Engineering. 2023 May;51(5):868–69. https://doi.org/10.1007/s10439-023-03172-7
Davenport TH. The AI Advantage: How to Put the Artificial Intelligence Revolution to Work. MIT Press; 2018.
Bird S, Klein E, Loper E. Natural Language Processing with Python. O'Reilly Media; 2009.