- Monkey see, monkey clone | Scientia News
A leap forward in primate research
Last updated: 10/07/25, 10:22. Published: 07/09/24, 19:20

Chinese scientists have recently unlocked the secrets of cloning Rhesus monkeys, offering new hope for medical breakthroughs.

Introduction

When we think of cloning, perhaps the first thing that comes to mind is Dolly the sheep, the first mammal ever cloned from an adult cell, back in 1996. This groundbreaking achievement inspired a revolution, leading to the successful cloning of other mammals such as cattle and pigs. However, cloning primates, especially Rhesus monkeys, has proven to be a significant challenge due to low success rates and high embryonic losses during development.

What is cloning?

Cloning is the process of creating an identical genetic copy of an organism. In mammals, this is typically done through a technique called somatic cell nuclear transfer (SCNT). In SCNT, the nucleus (the compartment storing genetic material) from a cell of the animal to be cloned is transferred into an egg cell that has had its own nucleus removed. This hybrid egg cell then develops into an embryo, which is implanted into a surrogate mother to grow into a new individual. Despite the success in cloning other mammals, cloning primates has proven to be a significant challenge. However, the potential benefits of cloning primates for medical research make it a worthwhile endeavour.

The importance of cloning primates

You might be wondering why being able to clone primates is so important. Well, primates like the Rhesus monkey are invaluable models for studying human diseases and creating new therapies! The reason we can use them as disease models is that they share about 93% genetic identity with humans and have very similar physiological characteristics.
For instance, Rhesus monkeys also experience a decline in their cognitive abilities as they age, and they lose important connections between brain cells in the part of the brain responsible for complex thinking, even when there is no severe brain damage. Moreover, Rhesus monkeys develop the same kinds of brain changes that we see in people with Alzheimer's disease, such as the buildup of sticky proteins called amyloid-beta and tangled fibres of another protein called tau. These similarities make them excellent models for understanding how human diseases progress and for developing new treatments. So, by cloning these animals, researchers might be able to create monkeys with specific genetic changes that mimic human diseases even more closely. This could allow scientists to study these diseases in greater detail and develop more effective therapies. Cloning primates could give us a powerful tool against some of the most challenging disorders that affect the human brain!

A breakthrough in primate cloning

Now, a group of scientists in China has made a breakthrough in primate cloning. They successfully cloned a Rhesus monkey using a novel technique called trophoblast replacement (TR). This innovative approach not only helps us better understand the complex process of cloning but also offers a promising way to improve the efficiency of primate cloning, bringing us one step closer to unlocking the full potential of this technology for medical research and beyond.

The awry DNA methylation of cloned monkey embryos

To understand why cloning monkeys is so challenging, Liao and colleagues (2024) took a closer look at the genetic material of embryos created in two different ways. They compared embryos made through a standard fertility treatment called intracytoplasmic sperm injection (ICSI) with those created via the cloning technique, SCNT. What they found was quite surprising!
To make matters worse, the scientists also noticed that certain genes, known as imprinted genes, were not functioning properly in the SCNT embryos. Imprinted genes are a special group of genes that play a crucial role in embryo development. In a healthy embryo, only one copy of an imprinted gene (either from the mother or the father) is active, while the other copy is silenced. But in the cloned embryos, both copies were often incorrectly switched on or off. Here's the really concerning part: these genetic abnormalities were present not just in the early embryos but also in the placentas of the surrogate monkey mothers carrying the cloned offspring. This suggests that the issues arising from the cloning process start very early in development and continue to affect the pregnancy. Liao and colleagues suspect that the abnormal DNA methylation patterns might be responsible for the imprinted gene malfunction. It's like a game of genetic dominoes – when one piece falls out of place, it can cause a whole cascade of problems down the line. Piecing together this complex genetic puzzle is crucial for understanding why primate cloning is so difficult and how we can improve its success in the future. By shedding light on the mysterious world of DNA methylation and imprinted genes, Liao and colleagues have brought us one step closer to unravelling the secrets behind monkey cloning.

Digging deeper: what does the data reveal?

Liao et al. (2024) discovered that nearly half of the cloned monkey foetuses died before day 60 of the gestation period, indicating developmental defects in the SCNT embryos during implantation. They also found that the DNA methylation level in SCNT blastocysts was about a quarter lower than in those created through ICSI (30.0% vs. 39.6%). Furthermore, of the 115 human imprinted genes they examined in both the embryos and placentas, four genes – THAP3, DNMT1, SIAH1 and RHOBTB3 – showed abnormal expression and loss of DNA methylation in SCNT embryos.
These findings highlight the complex nature of the reprogramming process in SCNT and the importance of imprinted genes in embryonic development. By understanding these intricacies, scientists can develop targeted strategies to improve the efficiency of primate cloning.

The power of trophoblast replacement

To avoid the anomalies in SCNT placentas, the researchers developed a new method called TR. In this method, they transferred the inner cell mass (the part of the early embryo that develops into the baby) from an SCNT embryo into the hollow cavity of a normal embryo created through fertilisation, after removing its own inner cell mass. The idea behind this technique is to replace the abnormal placental cells in the SCNT embryo with healthy ones from the normal embryo. And it worked! Using this method, along with some additional treatments, Liao et al. (2024) successfully cloned a healthy male Rhesus monkey that has survived for over two years (FYI, his name is Retro!).

The ethics of cloning

While the scientific advances in primate cloning are exciting, they also raise important ethical questions. Some people worry about the potential misuse of this technology, for instance to clone humans, which is widely considered unethical. Others are concerned about the well-being of cloned animals, as the cloning process can sometimes lead to health problems. As scientists continue to make progress in cloning technology, it is essential to have open discussions about the ethical implications of their work. Rules and guidelines must be put in place to ensure that this technology is developed and used responsibly, with the utmost care for animal welfare and the concerns of society.

Looking to the future

The successful cloning of a Rhesus monkey using TR opens up new avenues for primate research.
This technology can help scientists create genetically identical monkeys to study a wide range of human diseases, from neurodegenerative disorders like Alzheimer's and Parkinson's to infectious diseases like HIV and COVID-19. The trophoblast replacement technique developed by Liao et al. (2024) increases the likelihood of successful cloning by replacing the abnormal placental cells in the SCNT embryo with healthy ones from a normal embryo. However, it is important to note that this technique does not affect the genetic similarity between the clone and the original monkey, as the inner cell mass, which gives rise to the foetus, is still derived from the SCNT embryo. Moreover, this research provides valuable insights into the mechanisms of embryonic development and the role of imprinted genes in this process. By understanding these fundamental biological processes, scientists can not only improve the efficiency of cloning but also develop new strategies for regenerative medicine and tissue engineering. As we look to the future, cloning monkeys could help us make groundbreaking discoveries in medical research and develop new treatments for human diseases. However, we must also carefully consider the ethical implications of cloning primates and ensure that this powerful tool is used responsibly and for the benefit of society.

Written by Irha Khalid

Related articles: Do other animals get periods? / Germline gene therapy (GGT)

REFERENCES

Beckman, D. and Morrison, J.H. (2021). Towards developing a rhesus monkey model of early Alzheimer's disease focusing on women's health. American Journal of Primatology, 83(11). doi: https://doi.org/10.1002/ajp.23289

Liao, Z., Zhang, J., Sun, S., Li, Y., Xu, Y., Li, C., Cao, J., Nie, Y., Niu, Z., Liu, J., Lu, F., Liu, Z. and Sun, Q. (2024). Reprogramming mechanism dissection and trophoblast replacement application in monkey somatic cell nuclear transfer. Nature Communications, 15(1), p.5. doi: https://doi.org/10.1038/s41467-023-43985-7

Morrison, J.H. and Baxter, M.G. (2012). The ageing cortical synapse: hallmarks and implications for cognitive decline. Nature Reviews Neuroscience, 13(4), pp.240–250. doi: https://doi.org/10.1038/nrn3200

Paspalas, C.D., Carlyle, B.C., Leslie, S., Preuss, T.M., Crimins, J.L., Huttner, A.J., van Dyck, C.H., Rosene, D.L., Nairn, A.C. and Arnsten, A.F.T. (2017). The aged rhesus macaque manifests Braak stage III/IV Alzheimer's-like pathology. Alzheimer's & Dementia, 14(5), pp.680–691. doi: https://doi.org/10.1016/j.jalz.2017.11.005

Shi, L., Luo, X., Jiang, J., Chen, Y., Liu, C., Hu, T., Li, M., Lin, Q., Li, Y., Huang, J., Wang, H., Niu, Y., Shi, Y., Styner, M., Wang, J., Lu, Y., Sun, X., Yu, H., Ji, W. and Su, B. (2019). Transgenic rhesus monkeys carrying the human MCPH1 gene copies show human-like neoteny of brain development. National Science Review, 6(3), pp.480–493. doi: https://doi.org/10.1093/nsr/nwz043
- Unlocking the power of statistics | Scientia News
From confusion to career opportunities
Last updated: 14/07/25, 15:09. Published: 19/09/23, 16:23

During my time studying maths, there was always one topic that would trip me up: statistics. As an A-level physics student, I could understand why calculus is useful in real life, using differentiation to calculate the velocities of projectiles. And I could see how geometry is used in buildings and structures. Statistics, however, often left me feeling lost and disconnected, unable to see its real-world applications. But today, I wish to alter my old perspective.

First and foremost, you might be pleasantly surprised to learn that statistics opens doors to some of the most lucrative careers available today. We'll delve into roles such as quantitative analysts, who boast a national average salary of £99,000 per year. But if finance is not your cup of tea, there are many other rewarding career paths to explore, from becoming a data scientist to forecasting the weather as a meteorologist. In this article, I wish to unveil the world of statistics, revealing its importance and shedding light on its real-life applications. My hope is not only to inspire those who are already passionate about statistics but also to ignite motivation in individuals who, like me, found themselves in a similar predicament a few years ago.

The Actuary

Less well known than a banker or engineer, an actuary's purpose is to analyse risk in many different scenarios. It may sound simple on first inspection, but being an actuary is a very well-established career requiring many years of learning, followed by some of the most challenging exams in the job market. An actuary attempts to quantify the risk of an event happening so that financial decisions can be made with an objective view.
A good and close-to-home example of this is being either accepted or rejected for a credit card. As a young person below the age of 21, the chances of you getting accepted for a credit card are extremely, and quite painfully, low. This is because banks, and more specifically credit score providers, deem you a high-risk person to lend to. You have a very short credit history, so they do not know how responsible you are with money and are therefore more afraid to lend you their cash. In other words, they don't want you to spend their money on going out and drinking booze. The insurance industry is, however, the biggest employer of actuaries. Both life and non-life actuaries work in teams with insurance providers to establish whether a client, company or investment is worthwhile. Actuaries apply both statistics and actuarial science (similar to applied statistics) to real-life situations, evaluate whether to offer a premium to a customer, and then establish what that premium should be. You may see in advertisements that life insurance costs as little as £10 a month for a 20-year-old compared to someone who is 65. This is because the younger you are, the less likely you are to claim against your policy. Actuaries put together vast amounts of information about people, lifestyle choices and other factors to help determine the probability that someone may claim, suggesting a 'fair' premium that an insurance company can offer. Without the help of an actuary, insurance companies would either charge too much, disadvantaging their customers, or charge too little, in which case they would risk defaulting on their policies and being unable to pay out any claims.
Although this seems very specific, the role of an actuary is becoming increasingly important as people live longer lives and insurance companies become more fearful of defaulting. To put it into perspective, actuaries in London earn £80,000 on average, putting you well within the top 10% of earners in the UK.

The Quantitative Analyst

Similar to actuaries, quantitative analysts do exactly what it says on the tin: they use quantitative methods to analyse data. Often, companies like investment banks, hedge funds and pension funds will hire front-office 'quants'. The aim of the game is to send out trades as quickly as possible, before all the other trading offices do. These big companies have links directly to the trading floor, so every millisecond counts, and it's a quant's job to devise a trading strategy that beats the rest and operates in the least amount of time. Quants are masters of statistics and mathematics, and for this reason, high-frequency trading firms like Hudson River Trading offer salaries to top mathematical minds in excess of $500,000. The role of quantitative researchers is to explore the latest statistical articles being published by top universities and generate strategies that can be implemented in the stock market. This job is not one to be taken lightly, as salary is often based on performance, but someone who is motivated to explore the ins and outs of statistics may find themselves loving the life of a quant.

The Meteorologist

Meteorologists are the people we incorrectly blame for the bad weather we have. They are also the people we blame when we forget to take a coat and get soaked on the long walk back home. But what do meteorologists actually do? And is it any more than just an educated guess? Meteorologists, along with climatologists, collect millions of pieces of information every hour of every day across roughly 195,000 weather stations spread all around the globe.
These stations collect key pieces of information, including atmospheric pressure, temperature, wind speed, rainfall, humidity and many other components of current weather conditions. With this information, meteorologists begin to paint a picture of the current weather and then use forecasting methods and statistical models to estimate how it is going to change. The probability that it might rain is much more than an educated guess: it is a statement that, if the same conditions occurred many times, it would rain in that proportion of cases (i.e., given an 80% chance of rain, it would rain on roughly 80 out of every 100 such days, over a large enough sample). As a forecaster, you will collect this information and input it into very advanced systems to analyse and give an outcome; as a researcher, you will help derive these statistical forecasting models and improve them so that our apps and news channels are even more precise. Not only that, but you may also find yourself researching the effects of climate change from the data you analyse, and perhaps even how the weather affects the spread of pollution and disease. Meteorologists get paid a modest salary of around £32,000 per year, which may seem small compared to that of a quant, but the quality of life is far more generous than in some careers in finance.

To conclude

Statistics, once a perplexing subject for many, can offer an exciting and rewarding career. From the meticulous work of actuaries, assessing risks and informing financial decisions, to the world of quantitative analysts, where every millisecond counts, and even to the indispensable role of meteorologists, who help us navigate the weather and climate change, statistics holds the power to transform lives and industries. As we've explored, statistics is not just about numbers and formulas; it's about making sense of the world, predicting outcomes, and making informed decisions.
So, whether you're a seasoned statistician or someone who, like me, once felt lost in its complexities, remember that statistics isn't merely a subject to conquer—it's a key that unlocks doors to some of the most intriguing and well-compensated careers out there.

Written by George Chant
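As a footnote to the meteorology section above, the frequentist reading of an "80% chance of rain" can be sketched in a few lines of Python. This is a toy simulation for illustration only (the forecast probability and day count are invented numbers, not from any forecasting model): over many days with identical conditions, rain should occur on roughly the forecast fraction of them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frequentist reading of a forecast: an "80% chance of rain" means that
# over many days with identical conditions, it rains on ~80% of them.
forecast_prob = 0.80   # hypothetical forecast
days = 10_000          # hypothetical sample of similar days
rained = rng.random(days) < forecast_prob

print(f"Rained on {rained.mean():.1%} of {days:,} simulated days")
```

The larger the sample of similar days, the closer the observed rain frequency sits to the stated probability, which is exactly the sense in which a forecast can be checked against reality.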
- Can carbon monoxide unlock new pathways in inflammation therapy? | Scientia News
Recent prospects for carbon monoxide indicate so
Last updated: 20/03/25, 12:03. Published: 01/09/24, 10:31

Carbon monoxide (CO) is a colourless, odourless and tasteless gas which is a major product of the incomplete combustion of carbon-containing compounds. The toxic identity of CO stems from its strong affinity for the haemoglobin in our blood, around 300 times as strong as that of oxygen. As a result, once the gas is inhaled, CO binds to the haemoglobin instead and reduces the amount of oxygen our blood can transport, which can cause hypoxia (low levels of oxygen in tissue) and dizziness, eventually leading to death. However, an intriguing fact is that CO is also endogenously produced in our body, through the degradation of haem in the blood. Moreover, recent prospects for CO indicate that it may even be developed into an anti-inflammatory drug.

How CO is produced in the body (see Figure 1)

Haem is a prosthetic (non-peptide) group in haemoglobin, where oxygen binds to the iron in the molecule. When red blood cells reach the end of their lifespan of around 120 days, they are broken down in a process called haemolysis. This occurs in the bone marrow, where macrophages, which contain the necessary haem-oxygenase enzyme, engulf the cells. Haem-oxygenase converts haem into CO, along with Fe2+ and biliverdin, the latter being converted to bilirubin for excretion. The breakdown of haem is crucial because the molecule is pro-oxidant: free haem in the blood can lead to oxidative stress in cells, potentially resulting in cancers. Haem degradation also contributes to the recycling of iron for the synthesis of new haem molecules or proteins like myoglobin, which is crucial for maintaining iron homeostasis in the body.
The flow map illustrates haemolysis and the products produced, which either protect cells from further stress or result in cell injury. CO can go on to induce anti-inflammatory effects (see Figure 2).

Protein kinases and CO

Understanding protein kinases is crucial before exploring carbon monoxide (CO) reactions. Protein kinases phosphorylate (add a phosphate group to) proteins using ATP, and are necessary for signalling events such as the release of a hormone or the regulation of cell growth. A typical kinase of this kind has two regulatory (R) subunits and two catalytic (C) subunits. ATP as a reactant is usually sufficient for protein kinases. However, some kinases require additional mitogens – specific activating molecules like cytokines (proteins regulating immune cell growth) that are involved in regulating cell division and growth. Without these activating molecules, the R subunits bind tightly to the C subunits, preventing phosphorylation. Research on obese mice showed that CO binding to a Mitogen-Activated Protein Kinase (MAPK) called p38 inhibits inflammatory responses. This kinase pathway enhances insulin sensitivity, reducing the effects of obesity. The studies used gene therapy to modify haem-oxygenase levels in mice: mice with reduced haem-oxygenase levels had more adipocytes (fat-storing cells) and increased insulin resistance. These findings suggest potential for CO treatment in chronic obstructive pulmonary disease (COPD), which causes persistent lung inflammation and results in 3 million deaths annually.

Carbon-monoxide-releasing molecules

As a result of these advancements, specific CO-releasing molecules (CORMs) have been developed to release carbon monoxide at specific doses. Researchers are particularly interested in the ability of CORMs to regulate oxidative stress and improve outcomes in organ transplantation and cardiovascular disease. Advances in the design of CORMs have focused on improving their stability and their targeted release to specific tissues or cellular environments.
For instance, CORMs based on transition metals like ruthenium, manganese and iron have been developed to enhance efficacy and minimise side effects. Carbon monoxide forms a stable 'ligand' structure with the metal, allowing it to travel in the bloodstream. Upon exposure to light or a chemical trigger, or even by natural breakdown, these structures slowly release CO molecules. Although the current research did not find any notable side effects in mouse cells, this does not reflect the mechanisms in human organ systems; there is therefore still a major risk of incompatibility due to water insolubility and toxicity issues. These problems could potentially lead to disruption of the cell cycle, which may promote neurodegenerative diseases.

Conclusion: the future of carbon monoxide

Carbon monoxide has transitioned from being a notorious toxin to a valuable therapeutic agent. Advances in CO-releasing molecules have enabled its safe and controlled use, elevating its anti-inflammatory and protective properties to treat various inflammatory conditions effectively. This shift underpins the potential of CO to revolutionise inflammation therapy. It is important to remember that while carbon monoxide-releasing molecules (CORMs) have potential in controlled therapeutic settings, carbon monoxide gas itself remains highly toxic and should be handled with extreme caution to avoid serious health risks.

Written by Baraytuk Aydin

Related articles: Schizophrenia, inflammation and ageing / Kawasaki disease

REFERENCES

Different Faces of the Heme-Heme Oxygenase System in Inflammation – scientific figure on ResearchGate. Available from: https://www.researchgate.net/figure/The-colorimetric-actions-of-the-heme-HO-system-heme-oxygenase-mediated-heme-degradation_fig3_6531826 (accessed 11 July 2024).

Nath, K.A. (2006). Heme oxygenase-1: A provenance for cytoprotective pathways in the kidney and other tissues. Kidney International.
Available at: https://www.sciencedirect.com/science/article/pii/S0085253815519595 (accessed 12 July 2024).

Gáll, T. et al. (2020). 'Therapeutic potential of carbon monoxide (CO) and hydrogen sulfide (H2S) in hemolytic and hemorrhagic vascular disorders—interaction between the heme oxygenase and H2S-producing systems'. International Journal of Molecular Sciences, 22(1), p.47. doi: 10.3390/ijms22010047

Venkat, A. (2024). Protein kinase. Wikipedia. Available at: https://en.wikipedia.org/wiki/Protein_kinase (accessed 12 July 2024).

Goebel, U. and Wollborn, J. (2020). 'Carbon monoxide in intensive care medicine – time to start the therapeutic application?!'. Intensive Care Medicine Experimental. Available at: https://icm-experimental.springeropen.com/articles/10.1186/s40635-020-0292-8 (accessed 7 July 2024).

Bansal, S. et al. (2024). 'Carbon monoxide as a potential therapeutic agent: A molecular analysis of its safety profiles'. Journal of Medicinal Chemistry, 67(12), pp.9789–9815. doi: 10.1021/acs.jmedchem.4c00823

DeSimone, C.A., Naqvi, S.L. and Tasker, S.Z. (2022). 'Thiocormates: Tunable and cost-effective carbon monoxide-releasing molecules'. Chemistry – A European Journal, 28(41). doi: 10.1002/chem.202201326
- Proving causation: causality vs correlation | Scientia News
Establishing causation through Randomised Controlled Trials and Instrumental Variables
Last updated: 03/06/25, 13:43. Published: 12/06/25, 07:00

Does going to the hospital lead to an improvement in health? At first glance, one might assume that visiting a hospital should improve health outcomes. However, if we compare the average health status of those who go to the hospital with those who do not, we might find that hospital visitors tend to have worse health overall. This apparent contradiction arises from confounding: people typically visit hospitals because of existing health issues. Simply comparing these two groups does not tell us whether hospitals improve health or whether the underlying health conditions of patients drive the observed differences.

A similar challenge arises when examining the relationship between police presence and crime rates. Suppose we compare two cities, one with a large police force and another with a smaller one. If the city with more police also has higher crime rates, does this mean that police cause crime? Clearly not. Instead, it is more likely that higher crime rates lead to an increased police presence. This example illustrates why distinguishing causation from correlation is crucial in data analysis: saying that two variables are correlated does not imply that one causes the other.

First, let's clarify the distinction between causation and correlation. Correlation refers to a relationship between two variables, but it does not imply that one causes the other. Just because two events occur together does not mean that one directly influences the other. To establish causation, we need methods that separate the true effect of an intervention from other influencing factors.
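The hospital example above can be made concrete with a short toy simulation (all variable names and numbers here are invented for illustration, not taken from any study): hospital care genuinely improves health, yet because sicker people visit more often, a naive comparison of visitors against non-visitors points the wrong way.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Underlying illness severity (the confounder): higher = sicker.
severity = rng.uniform(0, 10, n)

# Sicker people are far more likely to visit the hospital.
visits_hospital = rng.random(n) < severity / 10

# Health = 10 - severity, plus a genuine +2 benefit from hospital care.
true_effect = 2.0
health = 10 - severity + true_effect * visits_hospital + rng.normal(0, 1, n)

# Naive comparison: average health of visitors vs. non-visitors.
naive_diff = health[visits_hospital].mean() - health[~visits_hospital].mean()
print(f"Naive estimate of hospital effect: {naive_diff:+.2f}")
print(f"True causal effect:                {true_effect:+.2f}")
```

Even though every hospital visit adds a genuine +2 to health in this model, the naive comparison comes out clearly negative, because visitors are drawn disproportionately from the sickest part of the population.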
Statisticians, medical researchers and economists have ingeniously come up with several techniques that allow us to separate correlation from causation. In medicine, the gold standard is the Randomised Controlled Trial (RCT). Imagine a group of 100 people, each with a set of characteristics such as gender, age, political views, health status and university degree. An RCT randomly assigns each individual to one of two groups. Consequently, each group of 50 individuals should, on average, have similar ages, gender distribution and baseline health. Researchers then examine both groups simultaneously while changing only one factor. This could involve instructing one group to take a specific medicine or asking individuals to drink an additional cup of coffee each morning. The result is two statistically similar groups differing in only one key aspect. Therefore, if the characteristics of one group change while those of the other do not, we can reasonably conclude that the intervention caused the difference between the groups.

This is great for examining the effectiveness of medicine, especially when you give one group a placebo, but how would we research the causation behind the police and crime example? Surely it would be unwise, and perhaps unethical, to randomise how many police officers are present in each city? And because not all cities are the same, the conditions for an RCT would not hold. Instead, we use more complex techniques like Instrumental Variables (IV) to overcome those limitations. A famous study using IV to explain police levels and crime was published by Steven Levitt (1997). Levitt used the timing of mayoral and gubernatorial elections (the election of a governor) as an instrument for changes in police hiring. Around election time, mayors and governors have incentives to look "tough on crime", which can lead to politically motivated increases in police hiring before an election.
Crucially, this hiring is driven not by current crime rates but by the electoral calendar. So, by using the timing of elections to predict increases in police numbers, we can use those predicted values to estimate the effect of police on crime. What Levitt found was that more police officers reduce violent and property crime, with a 10% increase in police officers reducing violent crime by roughly 5%. Levitt's paper is a clever application of IV that gets around the endogeneity problem and takes correlation one step further into causation, through the use of exogenous election timing.

However, these methods are not without limitations. IV analysis, for instance, hinges on finding a valid instrument: something that affects the independent variable (e.g., police numbers) but has no direct effect on the outcome (e.g., crime) other than through that variable. Finding such instruments can be extremely challenging, and weak or invalid instruments can lead to biased or misleading results. Despite these challenges, careful causal inference allows researchers to better understand the true drivers behind complicated relationships. In a world where influencers, media outlets and even professionals often mistake correlation for causation, a critical understanding of these concepts is an essential skill for navigating data, and for driving impactful change in society by exploring the true relationships behind different phenomena.

Written by George Chant

Related article: Correlation between HDI and mortality rate

REFERENCE

Levitt, S.D. (1997). 'Using Electoral Cycles in Police Hiring to Estimate the Effect of Police on Crime'. American Economic Review, 87(3), pp.270–290.
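The IV logic in the article above can be illustrated with a toy simulation. Everything here is invented for illustration (the coefficients, sample size and the simple Wald-ratio shortcut are far cruder than Levitt's actual econometrics): unobserved local conditions push crime and police numbers up together, so a naive regression makes police look criminogenic, while the election-timing instrument recovers the true negative effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000  # hypothetical city-years

# Exogenous instrument: election year (affects hiring, not crime directly).
election = rng.random(n) < 0.25

# Unobserved local conditions that raise BOTH crime and police numbers
# (the endogeneity problem: cities respond to crime by hiring police).
conditions = rng.normal(0, 1, n)

police = 10 + 2.0 * election + 1.5 * conditions + rng.normal(0, 1, n)
true_effect = -0.5  # each extra officer lowers crime
crime = 50 + true_effect * police + 2.0 * conditions + rng.normal(0, 1, n)

# Naive OLS slope of crime on police: biased upward by 'conditions'.
ols = np.cov(police, crime)[0, 1] / np.var(police)

# IV (Wald) estimate: effect of election on crime divided by
# effect of election on police.
iv = (crime[election].mean() - crime[~election].mean()) / (
     police[election].mean() - police[~election].mean())

print(f"OLS estimate: {ols:+.2f} (biased)")
print(f"IV estimate:  {iv:+.2f} (true effect is {true_effect:+.2f})")
```

In this setup the naive slope is actually positive, reproducing the "more police, more crime" correlation from the opening example, while the ratio of the instrument's effect on crime to its effect on police lands close to the true value of -0.5.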
- Herpes vs devastating skin disease | Scientia News
From foe to ally
Last updated: 09/07/25, 14:16. Published: 06/01/24, 11:14

This is article no. 3 in a series on rare diseases. Next article: Epithelioid hemangioendothelioma. Previous article: Breast cancer in males.

Have you ever plucked loose skin near your nail, ripping off a tiny strip of good skin too? Albeit very small, that wound can be painful. Now imagine that it is not just a little strip that peels off, but an entire sheet. And it does not detach only when pulled, but at the slightest touch. Even a hug opens wounds; even a caress brings pain. This is life with recessive dystrophic epidermolysis bullosa (RDEB), the most severe form of dystrophic epidermolysis bullosa (DEB).

Herpes becomes a therapy

DEB is a rare genetic disease of the skin that affects 3 to 10 individuals per million people (prevalence is hard to nail down for rare diseases). A cure is still far off, but there is good news for patients. Last May, the US Food and Drug Administration (FDA) approved Vyjuvek (beremagene geperpavec) to treat skin wounds in DEB. Clinical studies showed that it speeds up healing and reduces pain. Vyjuvek is the first gene therapy for DEB. It is manufactured by Krystal Biotech and - get this - it is a tweaked version of the herpes virus. Yes, you got that right: the virus causing blisters and scabs has become the primary ally against a devastating skin disease. This approval is a milestone for gene therapies, as Vyjuvek is the first gene therapy:
- based on the herpes virus,
- applied to the skin as a gel,
- approved for repeated use.

This article describes how DEB, and especially RDEB, affects the skin and wreaks havoc on the body; the following article will explain how Vyjuvek works.

DEB disrupts skin integrity

We carry around six to nine pounds of skin.
Yet we often forget its importance: it stops germs and UV radiation, softens blows, regulates body temperature and makes us sensitive to touch. Diseases that compromise the skin are therefore devastating. These essential functions rely on the organisation of the skin in three layers: epidermis, dermis and hypodermis ( Figure 1 ). Typically, a Velcro strap of the protein collagen VII firmly anchors the epidermis to the dermis. The gene COL7A1 contains the instructions on how to produce collagen VII. In DEB, mutations in COL7A1 result in the production of a faulty collagen VII. As the Velcro strap is weakened, the epidermis becomes loosely attached to the dermis. Mutations in one copy of COL7A1 cause the dominant form of the disease (DDEB), while mutations in both copies cause RDEB. With one copy of the gene still functional, the skin still produces some collagen VII; when both copies are mutated, little to no collagen VII is left. Therefore, RDEB is more severe than DDEB. In people with RDEB, the skin can slide off at the slightest touch and even gentle rubs can cause blisters and tears ( Figure 2 ). Living with RDEB Life with RDEB is gruelling, and life expectancy does not exceed 30 years. Wounds are very painful, slow to heal and easily infected. The risk of developing an aggressive skin cancer is higher. The constant scarring can cause limb deformities. In addition, blisters can appear in the mouth, oesophagus, eyes and other organs. There is no cure for DEB for now; treatments can only improve the quality of life. Careful dressing of wounds promotes healing and prevents infections. Painkillers are used to ease pain. Special diets are required. And, to no one's surprise, physical activities must be avoided. Treating RDEB Over the past decade, advances in cell and genetic engineering have sparked the search for a cure. Scientists have explored two main alternatives to restore the production of collagen VII in the skin.
The first approach is based on transferring skin cells able to produce collagen VII. Despite promising results, this approach treats only tiny patches of skin, requires treatments in highly specialised centres and may cause cancer. The second approach is the one Vyjuvek followed. Scientists place the genetic information to make collagen VII in a modified virus and apply it to a wound. There, the virus infects skin cells, providing them with a new COL7A1 gene to use. These cells then produce a functional collagen VII and can patch the damage up. We already know which approach came out on top. Vyjuvek speeds up the healing of wounds as big as a smartphone. Professionals can apply it in hospitals, clinics or even at the patient’s home. And it uses a technology that does not cause cancer. But how does Vyjuvek work? And why did scientists choose the herpes virus to build Vyjuvek? We will find the answers in the following article. And since perfection does not belong to biology, we will also discuss the limitations of this remarkable gene therapy. NOTES: 1. DEB is part of a group of four inherited conditions, collectively named epidermolysis bullosa (EB), where the skin loses integrity. EB is also known as “Butterfly syndrome” because the skin becomes as fragile as a butterfly’s wing. These conditions are EB simplex, junctional EB, dystrophic EB and Kindler EB. 2. Most gene therapies are based on modified, or recombinant in science jargon, adeno-associated viruses, which I reviewed for Scientia News. 3. Over 700 mutations have been reported. They disrupt collagen VII and its function with various degrees of severity. Consequently, RDEB and DDEB display several clinical phenotypes. 4.
Two studies have adopted this approach: in the first study, Siprashvili and colleagues (2016) grafted ex vivo retrovirally-modified keratinocytes, the main cell type in the epidermis, over the skin of people with RDEB; in the second study, Lwin and colleagues (2019) injected ex vivo lentivirally-modified fibroblasts in the dermis of people with RDEB. Written by Matteo Cortese, PhD Related article: Ehlers-Danlos syndrome
- Unleashing the power of the stars: how nuclear fusion holds the key to tackling climate change | Scientia News
Looking at the option of nuclear fusion to generate renewable energy. Unleashing the power of the stars: how nuclear fusion holds the key to tackling climate change. Last updated: 14/07/25, 15:08. Published: 30/04/23, 10:55. Imagine a world where we have access to a virtually limitless and clean source of energy, one that doesn't emit harmful greenhouse gases or produce dangerous radioactive waste. A world where our energy needs are met without contributing to climate change. This may sound like science fiction, but it could become a reality through the power of nuclear fusion. Nuclear fusion, often referred to as the "holy grail" of energy production, is the process of merging light atomic nuclei to form a heavier nucleus, releasing an incredible amount of energy in the process. It's the same process that powers the stars, including our very own sun, and holds the potential to revolutionize the way we produce and use energy here on Earth. Nuclear fusion occurs at high temperature and pressure when two atoms (e.g. tritium and deuterium) merge together to form helium. This merger releases excess energy and a neutron. This energy can then be harvested in the form of heat to produce electricity. Progress in the field of creating a nuclear fusion reactor has been slow; despite the challenges, some promising technologies and approaches have been developed. Some of the notable approaches to nuclear fusion research include: 1. Magnetic Confinement Fusion (MCF): In MCF, magnetic fields confine the plasma (the hot, ionized gas in which nuclear fusion occurs) while it is heated to fusion temperatures. One of the most promising MCF devices is the tokamak, a donut-shaped device that uses strong magnetic fields to confine the plasma.
The International Thermonuclear Experimental Reactor (ITER), currently under construction in France, is a large-scale tokamak project that aims to demonstrate the scientific and technical feasibility of nuclear fusion as a viable energy source. 2. Inertial Confinement Fusion (ICF): In ICF, high-energy lasers or particle beams are used to compress and heat a small pellet of fuel, causing it to undergo nuclear fusion. This approach is being pursued in facilities such as the National Ignition Facility (NIF) in the United States, which has made significant progress in achieving fusion ignition, although it still faces challenges in achieving net energy gain. In December 2022, the US lab reported that, for the first time, the fusion reaction released more energy than was delivered to the target. 3. Compact Fusion Reactors: There are also efforts to develop compact fusion reactors, which are smaller and potentially more practical for commercial energy production. These include technologies such as the spherical tokamak and the compact fusion neutron source, which aim to achieve high energy gain in a smaller and more manageable device. While nuclear fusion holds immense promise as a clean and sustainable energy source, there are still significant challenges to overcome before it becomes a practical reality. In nature, nuclear fusion occurs in stars; recreating such conditions on Earth is an immense challenge. Extremely high temperatures and pressures are required to overcome the repulsion between atomic nuclei and fuse them together. Moreover, to actually use the energy, the reaction has to be sustained, and currently more energy goes in than comes out. Lastly, materials and technology also pose challenges in the development of nuclear fusion.
With high temperatures and high-energy particles, the inside of a nuclear fusion reactor is a harsh environment, so alongside the development of sustained nuclear fusion, materials and technology that can withstand such conditions are also needed. Despite the many challenges, nuclear fusion has the potential to be a game changer in the fight not only against climate change but also for global access to cheap and clean energy. Unlike many forms of energy used today, fusion does not emit any greenhouse gases and, compared to nuclear fission, is stable and does not produce long-lived radioactive waste. Furthermore, one of the fuels for fusion, deuterium, is abundant in the ocean, whereas tritium may need to be synthesised at the start; once fusion is running, the reactor can breed its own tritium, making the fuel cycle self-sustaining. When the challenges are weighed against the benefits of nuclear fusion, along with the new opportunities it would unlock economically and in scientific research, it is clear that the path to a more prosperous and clean future lies in the development of nuclear fusion. While there are many obstacles to overcome, the progress made in recent years in fusion research and development is promising. With the construction of the ITER project, along with the first recordings of net energy output from the US NIF programme, nuclear fusion could become a possibility in the not-too-distant future. In conclusion, nuclear fusion holds the key to addressing the global challenge of climate change. It offers a clean, safe, and sustainable energy source that has the potential to revolutionize our energy systems and reduce our dependence on fossil fuels. With continued research, development, and investment, nuclear fusion could become a reality and help us build a more sustainable and resilient future for our planet. It's time to unlock the power of the stars and harness the incredible potential of nuclear fusion in the fight against climate change.
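The deuterium-tritium reaction described earlier can be sanity-checked with a quick mass-defect calculation, using standard atomic masses. This is a back-of-the-envelope sketch of the physics, not reactor engineering:

```python
# Energy released by D + T -> He-4 + n, from the mass defect.
# Masses in unified atomic mass units (u).
m_d, m_t = 2.014102, 3.016049        # deuterium, tritium
m_he4, m_n = 4.002602, 1.008665      # helium-4, neutron

U_TO_MEV = 931.494                   # energy equivalent of 1 u (E = mc^2)

mass_defect = (m_d + m_t) - (m_he4 + m_n)
energy_mev = mass_defect * U_TO_MEV

print(f"~{energy_mev:.1f} MeV released per fusion")  # ~17.6 MeV
```

Roughly 17.6 MeV per reaction, millions of times more energy per unit mass of fuel than a chemical reaction releases, which is why such a small amount of seawater-derived fuel goes such a long way.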
Written by Zari Syed Related articles: Nuclear medicine / Geoengineering / The silent protectors / Hydrogen cars
- Revolutionising sustainable agriculture | Scientia News
Through AI. Revolutionising sustainable agriculture. Last updated: 11/07/25, 09:51. Published: 27/06/23, 15:34. Artificial Intelligence (AI) is taking the world by storm. Recent developments now allow scientists to integrate AI into sustainable farming, transforming the way we grow crops, manage resources and pests, and, most importantly, protect the environment. There are many applications for AI in agriculture. Outlined below are some of the areas in which the incorporation of AI systems improves sustainability: Precision farming Artificial intelligence systems help improve the overall quality and accuracy of harvesting, known as precision farming. AI technology helps detect plant diseases, pests, and malnutrition on farms. AI sensors can detect and target weeds, then decide which herbicide to apply in an area. This helps reduce the use of herbicides and lower costs. Many tech companies have developed robots that use computer vision and AI to monitor and precisely spray weeds. These robots can eliminate 80% of the chemicals normally sprayed on crops and reduce herbicide costs by 90%. These intelligent AI sprayers can drastically reduce the amount of chemicals used in the field, improving product quality and lowering costs. Vertical farming Vertical farming is a technique in which plants are grown vertically by being stacked on top of each other (usually indoors), as opposed to the ‘traditional’ way of growing plants and crops on big strips of land. This approach offers several benefits for sustainable agriculture and waste reduction. The use of AI brings even more significant advancements, making vertical farming more sustainable and efficient: Intelligent Climate Control: AI can use algorithms to measure and monitor temperature, humidity, and lighting conditions to optimise climate control in vertical farms.
This reduces energy consumption and improves resource efficiency. Creating an enhanced climate-controlled environment also allows for repeatable and programmable crop production. Predictive Plant Modelling: the difference between a profitable year and a failed harvest can be as small as the specific time the seeds were sown. Using AI, farmers can apply predictive analysis tools to determine the exact date suitable for sowing seeds for maximum yield and to reduce waste from overproduction. Automated Nutrient Monitoring: to optimise plant nutrition, AI systems monitor and adjust nutrient levels in hydroponic setups (plants immersed in nutrient-containing water) and aeroponic setups (plants growing outside the soil, with nutrients provided by spraying the roots). Genetic engineering AI plays a pivotal role in genetic engineering, enhancing the sustainability and precision of crop modification through: Targeted Gene Editing: AI algorithms help in gene editing to produce desirable traits in crops, such as resistance to disease or improved nutritional content. This allows genetic modification without the need for extensive field trials, saving time and resources. Computational Modelling: by combining AI modelling with gene prediction, farmers will be able to predict which combinations of genes have the potential to increase crop yield. Pest management and disease detection Artificial intelligence solutions such as smart pest detection systems are being used to monitor crops for signs of pests and diseases. These systems detect changes in the environment such as temperature, humidity, and soil nutrients, then alert farmers when something is wrong. This allows farmers to act quickly and effectively, taking preventive measures before pests cause significant damage. Another way to achieve this is by using computer vision and image processing techniques. AI can detect signs of pest infestation, nutrient deficiencies and other issues that can affect yields.
This data can help farmers make informed decisions about how to protect their crops. By incorporating AI into these aspects of sustainable agriculture, farmers can achieve high yields, reduce waste and enable more sustainable farming practices, reducing environmental impacts while ensuring efficient food production. Written by Aleksandra Zurowska Related articles: Digital innovation in rural farming / Plant diseases and nanoparticles
- The fundamental engineering flaws of the Titan Submersible | Scientia News
From the hull to the glass viewpoint: shortcuts in design. The fundamental engineering flaws of the Titan Submersible. Last updated: 03/04/25, 10:27. Published: 03/04/25, 07:00. On June 18, 2023, the Titan submersible made headlines when an expedition to visit the wreck of the Titanic ended in tragedy. In the North Atlantic Ocean, 3,346 metres below sea level, the underwater vessel catastrophically imploded along with its five passengers. Two years on, this article takes a deep dive into the key points of engineering failure and reflects on what we can learn from the fatal incident. The Titanic and OceanGate’s mission The Titanic wreck lies around 3800 metres below sea level in the North Atlantic Ocean, approximately 370 miles off the coast of Newfoundland, Canada. Since the wreckage was finally discovered in September 1985, over seven decades after the ship sank from an iceberg collision on the 15th of April 1912, fewer than 250 people have personally viewed the wreckage. Despite many discussions about raising the wreckage back to the surface, the complete Titanic structure has become too fragile after over a century underwater and will likely disintegrate completely over the next few decades. Hence, viewing the Titanic in person is only possible with an underwater vessel, a feat which has been achieved successfully since 1998 by a range of companies seating historians, oceanographers, and paying tourists. The Titan submersible is one such vessel, developed by OceanGate Expeditions. Titan had been attempting dives to the Titanic wreck since 2017 and was first successful in 2021, going on to complete 13 successful dives. According to the passenger liability waiver, however, this was only 13 out of 90 attempted dives (a 14% success rate), as a result of communication signal failures, structural concerns, strong currents, poor visibility, or logistical issues.
On the many failed attempts, the mission was either cancelled or aborted before the Titan reached the depth of the Titanic wreck. Despite concerns raised by engineers, poor success rates in testing and simulation, as well as previous instances of the Titan spiralling out of control, OceanGate continued with their first planned dive of 2023, leading to the catastrophic implosion that claimed five lives. The Titan is the first fatality of a submersible dive to the Titanic. What went wrong: structural design When designing an underwater vessel to reach a certain depth, the body of the vessel, called the hull, needs to withstand an immense amount of pressure. For every 10 metres of depth, the pressure on the submersible’s hull increases by one atmosphere (1 bar or 101 kPa). To reach the wreck of the Titanic 3800 metres underwater would require the hull to withstand a pressure of over 38 MPa (see Figure 1 ). For perspective, this is around 380 times the pressure we feel at the surface and about 200 times the pressure of a standard car tyre. Over one square inch, this equates to nearly 2500 kg. To withstand such high hydrostatic pressure, a submersible hull is normally constructed from high-strength steel and titanium alloys in a simple spherical, elliptical, or cylindrical shell. At this point we discover some of the key points of failure in the Titan. The Titan’s hull was made from Carbon Fibre Reinforced Plastic (CFRP), i.e., multiple layers of carbon fibre mixed with polymers. Carbon fibre is a high-tech and extremely desirable material for its tensile strength, strength-to-weight ratio, high chemical resistance, and temperature tolerance. The material has proven itself since the 1960s in the aerospace, military, and motorsport industries; however, the Titan was the first case of carbon fibre being used for a crewed submersible.
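The pressure figures quoted above follow from the hydrostatic relation p = ρgh; a quick check, taking a typical seawater density of 1025 kg/m³:

```python
# Hydrostatic pressure at the Titanic wreck depth: p = rho * g * h
RHO_SEAWATER = 1025.0   # kg/m^3 (typical value)
G = 9.81                # m/s^2
DEPTH = 3800.0          # m

pressure_pa = RHO_SEAWATER * G * DEPTH
pressure_mpa = pressure_pa / 1e6

# Load on one square inch of hull (1 in^2 = 0.00064516 m^2),
# expressed as an equivalent mass resting on that square inch
force_n = pressure_pa * 0.00064516
equivalent_kg = force_n / G

print(f"{pressure_mpa:.1f} MPa, ~{equivalent_kg:.0f} kg per square inch")
```

This reproduces the article's figures: roughly 38 MPa at the wreck, and the weight-equivalent of about two and a half tonnes pressing on every square inch of hull.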
At first glance, the use of a carbon fibre hull suggests the advantage of significantly reducing the vessel's weight (50-75% lighter than titanium) while maintaining tensile strength, allowing for greater natural buoyancy. Without the need for added buoyancy systems, the hull would have space for more passengers at one time. As carbon fibre is cheaper than titanium and passengers pay $250,000 a seat, carbon fibre may appear to be the better business plan. However, although carbon fibre performs extremely well under tension loads, it has no resistance to compression loads (as with any fibre) unless it is infused with a polymer to hold the fibres together (see Figure 2 ). The polymer in the CFRP holding the fibres in alignment is what allows the material to resist compressive loads without bending, by distributing the forces to all the fibres in the structure. This means the material is anisotropic: it is much stronger in the direction of the fibres than against it (the same way wood is stronger along the grain). Therefore, individual layers of the CFRP must be oriented strategically to ensure the structure can withstand an expected load in all directions. A submersible hull intending to reach the ocean floor must withstand a tremendous compressive load, much higher than carbon fibre is typically optimised for in the aviation and automotive racing industries, and carbon fibre under such high compressive loads is currently an under-researched field. Although it may well be possible for carbon fibre to be used in deep-sea vessels in the future, this would require rigorous testing and intensive research, which was not done by OceanGate. Despite this, the Titan had apparently attempted 90 dives since 2017, and the repeated cycling of the carbon fibre composite at a high percentage of its yield strength would have made the vessel especially vulnerable to any defects reaching a critical level.
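The anisotropy described above can be illustrated with the classical rule-of-mixtures estimates for a unidirectional carbon/epoxy ply. The material properties here are generic textbook values, not the Titan's actual laminate data:

```python
# Rule-of-mixtures stiffness estimates for a unidirectional carbon/epoxy ply.
# Illustrative properties in GPa; not the Titan's actual laminate.
E_FIBRE, E_MATRIX = 230.0, 3.5   # Young's moduli of carbon fibre and epoxy
V_FIBRE = 0.6                    # fibre volume fraction
V_MATRIX = 1.0 - V_FIBRE

# Along the fibres: fibre and matrix share the load in parallel (Voigt bound)
e_long = V_FIBRE * E_FIBRE + V_MATRIX * E_MATRIX

# Across the fibres: load passes through fibre and matrix in series (Reuss bound)
e_trans = 1.0 / (V_FIBRE / E_FIBRE + V_MATRIX / E_MATRIX)

print(f"longitudinal ≈ {e_long:.0f} GPa, transverse ≈ {e_trans:.1f} GPa")
```

With these numbers a single ply is more than ten times stiffer along the fibres than across them, which is exactly why ply orientation (and any defect in the layup) matters so much for a hull loaded in compression from every direction.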
Upon simple inspection, the Titan also raises other immediate structural concerns. Submersible hulls are usually spherical or slightly elliptical, which allows the vessel to receive an equal amount of pressure at every point. The unique tube shape of the Titan’s hull (see cover image) would not distribute pressure equally, and this issue was ‘addressed’ with the use of separate end-caps. The joints that attach the end-caps to the rest of the hull only introduced further structural weaknesses, which made the vessel especially vulnerable to collapsing from micro-cracks. The Titan’s glass viewpoint was another structurally unsound feature (see Figure 3 ). David Lochridge, the former director of OceanGate’s marine operations between 2015 and 2018, who was fired for raising concerns about the submersible’s safety features, claimed the company that made the material only certified its use down to 1300 m (falling over 2000 metres short of the Titanic’s depth). The immense forces on materials without the properties to withstand the compressive pressure made the Titan’s failure inevitable. Cutting corners in the interest of business The foundation of the implosion’s cause was OceanGate’s insistence on cutting corners in Titan’s design to save time and money. The Titan was not certified for deep-sea diving by any regulatory board; instead, passengers were asked to sign a waiver stating the Titan was ‘experimental’. As underwater vessels operate in international waters, there is no single official organisation to ensure ship safety standards, and it is not essential to have a vessel certified. However, many companies choose to have their ships assessed and certified by one of several organisations. According to The Marine Technology Society submarine committee, there are only 10 marine vessels capable of reaching Titanic-level depths, all of which are certified except for the Titan.
According to a blog post on the company website, OceanGate claimed the way that the Titan had been designed fell outside the accepted system - but it “does not mean that OceanGate does not meet standards where they apply”. The post continued that classification agencies “slowed down innovation… bringing an outside entity up to speed on every innovation before it is put into real-world testing is anathema to rapid innovation”. According to former engineers and consultants at OceanGate, the Titan’s pressure hull also did not undergo extensive full-depth pressure testing, as is standard for an underwater vessel. Carbon fibre - the primary material of the Titan’s hull - is extremely unpredictable under high compressive loads, and there is currently no reliable way to measure its fatigue. This makes it an unreliable and dangerous material for deep-sea dives. OceanGate CEO Stockton Rush, who was a passenger on the Titan during its last fatal dive in 2023, described the glue holding the submersible’s structure together as “pretty simple” in a 2018 video, admitting “if we mess it up, there’s not a lot of room for recovery”. Having attempted 90 dives with a 14% success rate since 2017, and given the extremely sudden failure modes of carbon fibre composites, it was inevitable that micro-cracks in the Titan from repeated dives would result in the vessel's instantaneous implosion. On the 15th of July 2022 (dive 80), Titan experienced a "loud acoustic event", likely from the hull’s carbon fibre delaminating, which was heard by the passengers onboard and picked up by Titan's real-time monitoring system (RTM). Data from the RTM later revealed that the hull had permanently shifted following this event. The Titan continued to be used beyond this event without further testing of the carbon fibre (reportedly because the hull was ‘too thick’ to inspect), which prevented micro-cracks and air bubbles in the epoxy resin from being discovered until it was too late.
Another fundamental flaw lies in the Titan’s sole means of control being a Bluetooth gaming controller. While this is not an uncommon practice, especially in the case of allowing tourists to try controlling the vessel once it has reached its location, it is essential that there are robust secondary and even tertiary controls that are of a much higher standard. The over-reliance on wireless and touch-screen control, particularly one operating on Bluetooth which is highly sensitive to interference, was a dangerous and risky design choice. Although it was unlikely to have caused the implosion on its own, cutting corners in the electronics and controls of a vessel that needs to be operated in dangerous locations is irresponsible and unsafe. Submersibles operating at extreme depths require robust fail-safes, including emergency flotation systems and locator beacons. Again, OceanGate cut corners in developing Titan’s emergency recovery systems, using very basic methods and off-the-shelf equipment. In the event of catastrophic failure, the absence of autonomous emergency measures is fatal. With the extent of damage and poor design to the vessel’s carbon fibre hull, it was unlikely that even the most advanced emergency systems could prevent the magnitude of the implosion. Still, the carelessness displayed in almost every aspect of the submersible’s design was ultimately the cause of the fatal Titan tragedy. Conclusion In a 2019 interview, OceanGate’s former CEO Stockton Rush said: There hasn’t been an injury in the commercial sub industry in over 35 years. It’s obscenely safe because they have all these regulations. But it also hasn’t innovated or grown — because they have all these regulations. In the world of engineering, shortcuts can be catastrophic. Whilst risk-taking is undeniably essential to support innovation, Titan’s fatal tragedy was entirely preventable and unnecessary if the proper risk management techniques were employed. 
OceanGate had the potential to revolutionise the use of carbon fibre in deep-sea industries, but consistently cutting corners, not investing in the required real-world testing, and having the arrogance to ignore expert warnings is what ultimately led to Titan’s story fatefully echoing the overconfidence of Titanic’s “she is unsinkable!”. As regulations on submersibles tighten and research into carbon fibre increases, it is important to take the fundamental cause of the tragic implosion as a wake-up call. Assumptions are deadly: trust the science, invest in the proper research, test every bolt, and never underestimate the ocean’s relentless power. Written by Varuna Ganeshamoorthy Related articles: Engineering case study- silicon hydrogel / Superconductors / Building Physics
- Breast Cancer and Asbestos | Scientia News
A collaboration with the Mesothelioma Center (Asbestos.com), USA. Breast Cancer and Asbestos. Last updated: 04/02/25, 15:44. Published: 06/06/23, 10:03. Breast cancer is a prevalent disease characterized by abnormal cell growth in the breast. There are various types of breast cancer, including invasive ductal carcinoma, invasive lobular carcinoma, Paget's disease, medullary mucinous carcinoma, and inflammatory breast cancer. In 2022, approximately 287,850 new cases of invasive breast cancer were diagnosed, making it the most commonly diagnosed cancer in women. Natural risk factors for breast cancer include gender, age, race, early onset of menstruation, family history, and genetics. Environmental factors, such as exposure to radiation, pesticides, polycyclic aromatic hydrocarbons, and metals, may also contribute to the risk of developing breast cancer. Some studies have suggested a possible connection between asbestos exposure and breast cancer. While the link between asbestos and other health conditions like mesothelioma cancer is well established, the exact relationship between asbestos and breast cancer remains unclear. Statistical significance refers to the level of confidence in the results of a study or experiment. In the context of studies investigating the correlation between asbestos exposure and breast cancer, Dr. Debra David points out that many studies fail to establish a conclusive link due to a lack of statistical significance. Certain factors can increase the risk of developing breast cancer, known as "partial risk factors." Some of these factors can be controlled by individuals, such as alcohol consumption. However, many other partial risk factors are not within an individual's control without compromising their overall health.
For example, receiving radiation therapy to the chest or making decisions regarding childbirth can be deeply personal choices that impact breast cancer risk. Examples of partial risk factors include consuming more than two alcoholic drinks per day, having children after the age of 30, not having children, not breastfeeding, using the drug diethylstilbestrol (DES) to prevent miscarriage, recent use of birth control pills, receiving hormone replacement therapy (HRT), undergoing radiation therapy to the chest area, and exposure to toxic substances or carcinogens. According to the American Cancer Society, approximately 5 to 10% of breast cancer cases can be directly attributed to inherited gene mutations. However, many other factors, such as exposure to carcinogens, may be beyond a cancer patient's control. Summary written by the Mesothelioma Center ( Asbestos.com ). For more information, visit their website, and also read their important facts about breast cancer and the mesothelioma survival rate. For further information, particularly on the legal consequences, check out the Lanier Law Firm, which has more specific information.
- African-American women in cancer research | Scientia News
Celebrating trailblazers in skin cancer, chemotherapy and cervical cancer cells. African-American women in cancer research. Last updated: 08/07/25, 16:23. Published: 20/04/24, 11:05. We are going to be spotlighting the incredible contributions of three African-American women who have carved paths for future scientists while significantly advancing our knowledge in the relentless global battle against cancer. Jewel Plummer Cobb (1924-2017) As a distinguished cancer researcher, Jewel is known for her extensive work on melanoma, a serious form of skin cancer. Alongside her frequent collaborator, Jane Cooke Wright, Jewel evidenced the anticancer effects of the drug methotrexate in addressing skin and lung cancer, as well as childhood leukaemia. She is also recognised for her distinctive research examining the varying responses to chemotherapy drugs among cells from different racial and ethnic groups. This research led to the pivotal finding that melanin, a skin pigment, could serve as a protective shield against the damaging effects of sunlight associated with skin cancer. Her 1979 article titled Filters for Women in Science recognised the low percentage of women working in scientific research and engineering, including the barriers that female scientists face in their professional journey. As a result, throughout her career, she often wrote about the experiences of black women in higher education. She also passionately championed the advancement of black people and women working in the fields of science and medicine. In an interview, she stated that she would like to be remembered as “a black woman scientist who cared very much about what happens to young folks, particularly women, going into science”.
Jane Cooke Wright (1919-2013) As the daughter of Harvard Medical School graduate Louis Tompkins Wright, one of the first African-American surgeons in the United States, Jane followed in her father’s footsteps and became a physician. Working together, they explored and compared the activity of possible anticancer compounds both in tissue cultures and in patients. This was revolutionary at the time, considering that chemotherapy guidelines were barely established. At just 33 years old, Jane took over leadership of her father’s cancer research foundation at Harlem Hospital following his death. Later, alongside six male doctors, she helped establish the American Society of Clinical Oncology (ASCO) to address the clinical needs of cancer patients. Throughout her career, she conducted research in chemotherapy, publishing over 100 articles on the topic, aiming to fine-tune and tailor chemotherapeutic treatments for patients to ensure better survival outcomes. Like Jewel, she also played a key role in investigating and demonstrating how different racial and ethnic backgrounds respond to drugs used in chemotherapy. This has now become a field of its own, pharmacoethnicity, which studies anticancer drug responses across people of different ethnicities and is advancing our knowledge of personalised chemotherapy treatment for patients. During an interview, her daughter, Alison Jones, described Jane as: A very ambitious person... she never let anything stand in the way of doing what she wanted to do. Henrietta Lacks (1920-1951) Although not a scientist herself, Henrietta made a significant contribution to cancer research and medicine through her cervical cancer cells. Tragically, she never knew it. Henrietta was diagnosed with cervical cancer in 1951 and sadly passed away the same year. The cervical cancer cells obtained from her biopsy were found to have a unique ability to continuously grow and divide in vitro. Therefore, they could be grown into cell cultures and used in further research.
As a result of this trait, researchers have been able to investigate the cells' behaviour, including mutation, division, and carcinogenesis, and to study the effects of drugs and other treatments on them. The "immortal" cell line, termed HeLa, played a pivotal role in the creation of the polio vaccine in the 1950s and in developing medicines for conditions such as leukaemia, influenza, and Parkinson's disease. HeLa cells were also used to identify a new type of human papillomavirus (HPV) DNA, which later led to the finding that the virus can cause cervical cancer and, ultimately, to the development of the HPV vaccine used today. It is estimated that over 110,000 research publications have used HeLa cells, emphasising their importance in research. Were it not for Henrietta Lacks, the HeLa cell line would never have existed; it has revolutionised our understanding of cancer and enabled major medical advances.

In conclusion, the remarkable journey of these pioneering African American women in cancer research serves not only as an inspiration but also as a testament to their perseverance, courage, and dedication. They have championed diversity within science, pushed boundaries, and shaped the field of cancer research, allowing scientific research to progress in curing cancer and beyond.

Written by Meera Solanki

Related articles: Women leading the charge in biomedical engineering / The foremothers of gynaecology / Sisterhood in STEM

REFERENCES

American Society of Clinical Oncology (2016). Society History. [online] ASCO. Available at: https://old-prod.asco.org/about-asco/overview/society-history.

Blood Cancer UK (2023). The story of Dr Jane C Wright, pioneer of blood cancer research. [online] Blood Cancer UK. Available at: https://bloodcancer.org.uk/news/the-story-of-jane-c-wright-pioneer-of-blood-cancer-research/.

Boshart, M., Gissmann, L., Ikenberg, H., Kleinheinz, A., Scheurlen, W. and zur Hausen, H. (1984).
A new type of papillomavirus DNA, its presence in genital cancer biopsies and in cell lines derived from cervical cancer. The EMBO Journal, 3(5), pp.1151-1157. doi: https://doi.org/10.1002/j.1460-2075.1984.tb01944.x.

Cobb, J.P. (1956). Effect of in Vitro X Irradiation on Pigmented and Pale Slices of Cloudman S91 Mouse Melanoma as Measured by Subsequent Proliferation in Vivo. JNCI: Journal of the National Cancer Institute, [online] 17(5). doi: https://doi.org/10.1093/jnci/17.5.657.

Cobb, J.P. (1979). Filters for Women in Science. Annals of the New York Academy of Sciences, 323(1), pp.236-248. doi: https://doi.org/10.1111/j.1749-6632.1979.tb16857.x.

Ferry, G. (2022). Jane Cooke Wright: innovative oncologist and leader in medicine. The Lancet, [online] 400(10360). doi: https://doi.org/10.1016/S0140-6736(22)01940-7.

Hyeraci, M., Papanikolau, E.S., Grimaldi, M., Ricci, F., Pallotta, S., Monetta, R., Minafò, Y.A., Di Lella, G., Galdo, G., Abeni, D., Fania, L. and Dellambra, E. (2023). Systemic Photoprotection in Melanoma and Non-Melanoma Skin Cancer. Biomolecules, [online] 13(7), p.1067. doi: https://doi.org/10.3390/biom13071067.

King, T., Fukushima, L., Donlon, T., Hieber, D. and Shimabukuro, K. (2000). Correlation between growth control, neoplastic potential and endogenous connexin43 expression in HeLa cell lines: implications for tumor progression. Carcinogenesis, [online] 21(2), pp.311-315. doi: https://doi.org/10.1093/carcin/21.2.311.

National Institutes of Health (2022). Significant Research Advances Enabled by HeLa Cells - Office of Science Policy. [online] Office of Science Policy. Available at: https://osp.od.nih.gov/hela-cells/significant-research-advances-enabled-by-hela-cells/.

Pathak, S., Zajac, K.K., Annaji, M., Govindarajulu, M., Nadar, R.M., Bowen, D., Babu, R.J. and Dhanasekaran, M. (2023). Clinical outcomes of chemotherapy in cancer patients with different ethnicities. Cancer Reports, 6(1).
doi: https://doi.org/10.1002/cnr2.1830.