My Rambling Thoughts

Ocean Acidification: why should we care?

7/26/2021

Beginning of Ocean Acidification: The Industrial Revolution

The Industrial Revolution, a period between roughly 1760 and 1840, marked the transition of economies in Europe and the United States to new manufacturing processes based on machines, factories, and new energy sources and technologies such as coal, the steam engine, electricity, and petroleum. In the 200 years or so since, as a result of fossil fuel burning and land use change, the concentration of carbon dioxide (CO2) in the atmosphere has increased considerably, from 280 parts per million (ppm) to close to 400 ppm, and it is predicted to reach 500 ppm by 2050 and 800 ppm or more by the end of the century. Scientists calculate that the ocean currently absorbs about one quarter of the CO2 that humans emit. This may sound like great news for us air-breathing bipedal animals, but it is actually bad news for the planet, including us.

Consequences of Ocean Acidification

Chemically speaking, absorption of CO2 by seawater produces carbonic acid, which dissociates and releases hydrogen ions. The increased hydrogen ion concentration means lower pH and more acidic waters, and it also reduces the carbonate ion concentration, as carbonate associates with the excess hydrogen ions. The pH of the ocean surface has fallen by 0.1 pH units in the past 200 years (from 8.2 to 8.1). That does not sound like much, but because the pH scale is logarithmic it translates into roughly a 30% increase in acidity, and it is estimated that if current emissions trends continue, pH could fall by an additional 0.3-0.4 units by the end of this century, increasing acidity by a further 100-150%.
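If you want to check those percentages yourself, the arithmetic is simple: acidity (hydrogen ion concentration) scales as 10 raised to the pH drop. A quick Python sketch using the values quoted above:

    # [H+] is proportional to 10**(-pH), so a drop of d pH units
    # multiplies the hydrogen ion concentration by 10**d.
    def acidity_increase_percent(ph_drop):
        """Percent increase in hydrogen ion concentration for a given pH drop."""
        return (10 ** ph_drop - 1) * 100

    print(f"0.1 pH drop: {acidity_increase_percent(0.1):.0f}%")  # ~26%; the often-quoted
                                                                 # ~30% matches a slightly
                                                                 # larger drop
    print(f"0.3 pH drop: {acidity_increase_percent(0.3):.0f}%")  # ~100%
    print(f"0.4 pH drop: {acidity_increase_percent(0.4):.0f}%")  # ~151%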

[Figure from OCEANA]
The reduction in carbonate ions caused by ocean acidification affects marine organisms that use carbonate to make the calcium carbonate of their shells and coral skeletons. Pteropods ("sea butterflies") are tiny snails that live in seawater as zooplankton, especially in colder areas, and are an important food source for many marine animals in high latitudes, including salmon (pteropods make up 40% of the diet of pink salmon), mackerel, herring, seabirds, and whales. The nasty corrosive effect of ocean acidification on pteropod shells has been widely documented both in their natural habitats (the Gulf of Alaska and the Bering Sea, and the US West Coast from central California up to northern Washington State) and in the laboratory.
Oysters, clams, mussels, shrimp, lobsters and corals also depend on their shells or skeletons for survival. The shellfish industry and fisheries will experience great supply reductions if coastal water acidification is not halted soon, possibly leading to estimated losses of almost 500 million US dollars per year by the end of the century.

Unfortunately, ocean acidification is not the whole story. Our oceans are also getting warmer, and their oxygen concentration is going down (deoxygenation), by approximately 2% since 1960. The lower oxygen content has two causes: 1) global (and therefore ocean) warming, about 0.55°C since the 1970s, since cool water can hold more oxygen than warm water; and 2) excess nutrients dumped into coastal waters from our waste, agriculture and fossil fuel burning, leading to an overgrowth of phytoplankton (microalgae) that then decays and uses up a lot of oxygen.

A research study published in the journal Nature in 2019 showed that marine species are more vulnerable to warming than land species: they are more likely to live at dangerously high temperatures, and they are disappearing from their habitats due to warming twice as often as land species. Land animals can find refuge from the heat by moving to forests, shaded areas or underground habitats; marine animals cannot.
Addressing Ocean Acidification

Massachusetts, a state with rapidly acidifying coastal waters, is the second-largest seafood industry employer in the US. Earlier this year, a report on ocean acidification in Massachusetts was published, recommending strategies to reduce nutrient pollution, restore coastal wetlands, and improve coastal monitoring, such as planting more marine algae/kelp (to increase absorption of CO2 and reduce acidification) and spreading waste shells near oyster beds (to raise the carbonate concentration in the water). Coastal water acidification, as opposed to open-ocean acidification, is a more localized process driven by nutrients entering the water from adjacent land, bacteria involved in decomposition, and algal blooms. The US Environmental Protection Agency (EPA) issued guidance in 2018 for measuring acidification changes in coastal waters.

What can we, regular citizens, do? It all goes back to carbon emissions: we have to reduce our contribution, or "carbon footprint", by using less energy at home and when traveling, following the three Rs (reduce, reuse, recycle), and limiting plastic and pesticide use.

Citizen science: get involved

4/12/2021

In many scientific disciplines, some research questions require data collection efforts far beyond those realistically possible with available technologies and funding, such as surveys spanning long periods of time or large geographic areas, or computing capacity beyond any single lab. Early approaches to tackling these questions involved the first organized public participation projects, such as the North American Breeding Bird Survey established in 1966, in which thousands of people in the USA and Canada collect data on birds during their breeding season. Organizing large numbers of people to help collect data or do computational tasks, such as identifying objects in images or sounds, requires substantial organizational and communication effort. Seeking answers to these once "difficult questions" changed significantly with the advent of the internet, which facilitated communication and data gathering as well as the organization and availability of resources. For example, SETI (the Search for Extraterrestrial Intelligence, famous after the movie Contact) asks people to help by donating their computers' idle time to process large amounts of data. This internet-spurred revolution was not limited to science: instant access to online news and social networks has led to amazing contributions from people all over the world, as when news outlets accept (and even request) photos and videos from contributors where developments are being reported in real time.

The growing trend of active, non-professional public participation in knowledge gathering is variously called citizen, community, crowd, or volunteer science, and it features prominently in a wide variety of science and research areas.

Are you a bird watcher or an amateur astronomer? You could contribute to global science knowledge by adding your local observations to global online platforms that are already available and free to anybody for a variety of topics such as these, or just explore them and find out what others are contributing and what analyses of the data show. At SciStarter's "project finder" you can find thousands of citizen science projects. Citizen science projects with popular online platforms have led to important peer-reviewed publications in science journals, which should be open access; a study published in 2020 showed that of all science papers that included citizen science activities, half were in ecology, environmental sciences and biodiversity.
Citizen science participation is not limited to environmental research. You can help in projects as diverse as microbiology or sociology. Sometimes the research is serendipitous, with no app or platform involved, as was the case with the accidental loss of 29,000 rubber ducks from shipping containers on their way from China to the USA in 1992. With the help of people from all over the world, 10 years later scientists had data on how far the little rubber ducks traveled and how long it took them to wash ashore, and they used these data to discern the size and speed of ocean currents around the site of the spill.

What is or isn't citizen science is often not clearly defined; in fact, there is no globally accepted definition. Sometimes projects that gather information submitted online by volunteers are labelled as citizen science, when these volunteers are not really doing the work of scientists but just sending in their own data. These studies are no different from studies that recruit volunteers as study subjects, except that the advertisement and data collection are done through web tools. An example of a recent project wrongly referred to by some as citizen science was one used early in the COVID-19 pandemic known as the COVID Symptom Tracker, a collaboration between US and UK scientists and doctors and Zoe, a healthcare company, consisting of a free symptom-tracking smartphone application launched in March 2020 in both countries. A Nature Medicine paper published in May 2020 presented and discussed results from 2,618,862 participants who used this application to report potential COVID-19 symptoms. The main finding was that loss of smell and taste are COVID-19 symptoms, and the authors therefore recommended adding them to the list of symptoms, which back then did not include them. These symptoms are now well known to be associated with the disease and are frequently listed as such.

Citizen science involves non-professional scientists (lay people) in several possible roles depending on the project, ranging from developing the research question and designing the methods to contributing, checking, monitoring, interpreting and analyzing data. Citizen science allows collection of data from widespread geographic areas over long periods of time, and promotes collaboration and interaction between scientists and citizens while resulting in a more informed, involved and engaged public.

Examples of popular and successful projects that have collected data via citizen science are:

1) eBird: a growing platform with bird sightings contributed by bird watchers around the world, with many partner organizations and regional experts, managed by the Cornell Lab of Ornithology. It is available worldwide as a free mobile app that allows data collection offline, and as a website to explore and summarize global eBird data.
 
2) Foldit: a web-based "protein folding" game launched in May 2008, in which players use the mouse to propose shapes for specific proteins. Player predictions were compared with those of the Rosetta algorithm developed by scientists for the same purpose; player strategies ended up outperforming the Rosetta algorithms, and the PNAS publication of these impressive results included "Foldit players" as co-authors.
 
3) Polymath: several math platforms for different topics (Banach spaces, the Polynomial Hirsch Conjecture, bounded gaps between primes, etc.) developed as collaborations among professional and amateur mathematicians to solve problems through online communication.
 
4) Zooniverse: launched in 2007 with one project (Galaxy Zoo), Zooniverse has grown exponentially to 74 currently active projects, 195 paused, and 56 finished, covering a range of topics in biology, history, climate science, the arts, medicine, ecology, and social sciences. This incredibly diverse and successful platform offers current projects that you can browse and learn about, including Galaxy Zoo, Chimp&See, Penguin Watch, Etch A Cell, and many others.

When a project requires the collection and interpretation of data by a community of volunteers, one of the most pressing criticisms concerns data quality. In these situations, data collection and interpretation are subject to biases that may introduce error into the data and could lead to erroneous conclusions. This is because, as opposed to "controlled" experiments in which scientists standardize all aspects of data collection, citizen scientists inevitably have a variety of skill levels, knowledge, interest, and experience. For example, when asked to count objects in an image, some people may count more or fewer than others due to different levels of involvement, attention to detail, and so on.
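One common mitigation, used by large platforms such as Zooniverse, is to show the same task to many volunteers and aggregate their answers so that individual slips wash out. A minimal sketch with made-up counts:

    from collections import Counter
    from statistics import mean, stdev

    # Hypothetical counts of penguins in one image, reported by ten volunteers;
    # one contributor was clearly having a bad day.
    counts = [12, 11, 13, 12, 12, 14, 11, 12, 30, 12]

    print(f"raw mean: {mean(counts):.1f} +/- {stdev(counts):.1f}")  # dragged up by the outlier

    # A simple robust alternative: take the most common answer (majority vote),
    # which is far less sensitive to a single careless or confused contributor.
    answer, votes = Counter(counts).most_common(1)[0]
    print(f"majority vote: {answer} ({votes}/{len(counts)} volunteers agree)")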

Another fantastic trend in science in the past decade is the robotization of data collection and processing. We are still far from being able to deploy drones or autonomous data collection devices in large numbers to survey vast areas, or to leave them in place to survey for long periods of time. Similarly, computing power, artificial intelligence and machine learning have advanced immensely in recent years, but processing data with the speed and accuracy of the human brain remains a glimmer on the horizon. In the meantime, citizen science will grow and become an important methodological approach for tackling the "difficult questions". Furthermore, with that future robotization goal in mind, some citizen science projects use the help of volunteers to train computers, such as Soundscapes to Landscapes, where volunteers generate the data that feed artificial intelligence models built to identify bird species in sound recordings. Perhaps in the next few decades we will see, in many disciplines, the development of ways for lay people to train computers that then help scientists answer increasingly difficult and complex questions.

[Cartoon drawn by Jon Carter]

COVID-19 vaccines, January 2021

1/24/2021

I wrote about COVID-19 and coronaviruses when the pandemic first started spreading worldwide in March 2020. There have now been roughly 100 million cases and a bit over 2 million deaths from this disease, with the USA among the main contributors.
 
In an unprecedented effort to face the pandemic, and in addition to national, regional and global initiatives implementing safety measures (lockdowns including school and store closures, mask wearing, PPE, testing, etc.), we now have an impressive number of vaccines approved, with more at different stages of development and in clinical trials. More types of vaccines are being developed simultaneously for COVID-19 than for any other infectious disease before.

Although the usual FDA approval process is long (it can take years), emergency authorization is now being given to COVID-19 vaccines with less data than otherwise required, but based on enough evidence that they are safe and effective. The first two COVID-19 vaccines recently authorized in the USA for emergency use are based on a new (mRNA) vaccine technology, making them the first of this type approved for use in humans: the Pfizer-BioNTech and Moderna vaccines received FDA emergency use authorization on December 11 and 18, 2020, respectively. These two vaccines have been manufactured and distributed to different countries, mostly high-income ones, as they require storage at very cold (freezer) temperatures to keep the very labile mRNA from degrading, and they are the most expensive per dose. The USA, Canada, many European countries and Japan have purchased many doses of one or both of these vaccines for their populations, and in Latin America, Mexico, Chile and Costa Rica also started Pfizer vaccination in December 2020. Regions and countries started preordering vaccines (in millions of doses) as soon as candidate vaccines showed promising results in clinical trials.
 
All other vaccines approved to date can be stored at fridge temperatures. These vaccines use different technologies that don't involve mRNA; for example, some use viral vectors engineered to carry the gene for the viral antigen, which for COVID-19 is the Spike protein, and to be unable to replicate in human cells. One such vaccine, developed by AstraZeneca and the University of Oxford, was approved in the UK on December 30, 2020. The first registered vaccine was the Russian Sputnik V, in August 2020, before its phase 3 clinical trials started; it has been used in Argentina since December 2020. The Janssen (Johnson & Johnson) vaccine is expected to receive FDA emergency use authorization in the USA soon, the third after Pfizer and Moderna. It is the only one so far offering a single-dose regimen instead of two.
All COVID-19 vaccines so far are designed to get our immune system to develop a strong response to the virus by delivering the Spike protein that covers the surface of the virus (acting as an "antigen" against which we make antibodies), or parts of it. All approved vaccines, even those authorized only for emergency use, have shown considerable protection in clinical trials, meaning a much lower risk of getting COVID-19 for vaccinated versus non-vaccinated people. As more people get vaccinated, more data on the different vaccines will become available. In the USA, depending on where people live, they are currently getting either the Pfizer or the Moderna vaccine.
The process of approving vaccines, rollout protocols and guidelines for use all vary considerably between regions and countries. Some countries, including Japan, South Korea, Brazil and India, have deals in place to manufacture some of the available vaccine formulations themselves. As more vaccines are approved, distributed and made available, people wonder if some are better than others and whether they could choose one or wait for a different one not yet available. The recommendation is to get whichever is available in your area: if it has been approved for use, it will protect you. If shortages occur, or if authorities decide to use the existing supply to give the first dose to more people without necessarily reserving enough shots for the second dose, options under consideration include giving a second dose of a different available vaccine or allowing a longer gap between doses until more vaccine becomes available.

Because clinical trials are first conducted in adults, COVID-19 vaccines are not yet approved for children (under 16 or 18 years old, depending on the vaccine), but trials in younger children are currently underway, and a vaccine may become available for them either by the end of 2021 or in 2022.

This week, the USA joined COVAX, a global initiative that includes the WHO, UNICEF, Gavi (the Vaccine Alliance) and CEPI (the Coalition for Epidemic Preparedness Innovations) to ensure rapid and equitable access to COVID-19 vaccines for all countries, aiming to cover at least 20% of each participating country's population by the end of 2021 by providing two billion vaccine doses this year. Some vaccine manufacturers have pledged a considerable number of doses to low-income countries.


Thanking llamas (and other camelids) for their nanobodies: a potential treatment for COVID-19 and other diseases

1/1/2021

I have discussed in previous posts the role of antibodies in cancer treatment and how a variety of medical tests for different conditions rely on detection of antibodies we make in response to foreign threatening organisms that enter our bodies.

Immunologists are also looking into antibodies made by other animals for potential use in humans to treat diseases, including COVID-19. Our antibodies are Y-shaped molecules containing different "domains" called heavy and light chains (two of each). Camelid antibodies, from the family of animals that includes dromedary camels, llamas, and alpacas, are also Y-shaped but contain only heavy chains, making them a bit smaller (see figure below). At the ends of the two arms of these Y-shaped antibodies are "variable" domains responsible for recognizing (and binding to) different antigens in response to infections. Whereas the variable antigen-recognizing domain of human antibodies has two components (one from a heavy chain and one from a light chain), camelid antibodies have only one, from their heavy chains. This antigen-recognizing "single domain" has been named a "nanobody"; nanobodies can be produced and isolated in good amounts, and they are used both as antibody tools in biotechnology research and as therapies for specific diseases. The drug caplacizumab was the first nanobody-derived therapy; approved in 2018, it is used in patients with "acquired thrombotic thrombocytopenic purpura" to prevent blood clotting.

[Figure modified from Figure 1 of Resemann et al., 2010, J Biomol Tech 21(3 Suppl):S49]
Camelids produce these nanobody-containing antibodies when immunized with specific antigens, such as the spike protein of the SARS-CoV-2 virus that causes COVID-19. Because of their tiny size, nanobodies don't trigger an immune response in humans. These antibodies can be isolated from blood samples and analyzed in the lab for activity against SARS-CoV-2 in an infection assay using the virus and human cells in a cell-culture dish.
Nanobodies from llamas were tested in the lab for "neutralization" of the COVID-19-causing virus SARS-CoV-2, in other words, for whether they bind the virus's spike protein, which attaches to ACE2 receptors on human cells in order to enter and invade them. The spike protein, critical for infection, is currently also the main target of the COVID-19 vaccines approved and in development.

Nanobodies offer several advantages in diverse applications, mainly derived from their small size compared with human antibodies (10 times smaller), including easier and more affordable large-scale manufacturing. They are also very stable over the long term, and they could be delivered by an inhaler directly to the lungs, as opposed to traditional antibody therapies, which are delivered intravenously, are much more expensive to produce, and require much higher amounts for an effective treatment. The possibility of using nebulization to deliver a nanobody treatment is very appealing for respiratory infections such as COVID-19, as it allows nanobodies to reach the lungs quickly and directly to block viral invasion. Nanobodies can also be made in microbial cells (bacteria and yeast).


The single cell analysis breakthrough

10/7/2020

As technology advances continuously at exponential speed, especially in fields based on cellular and molecular studies, researchers can go deeper and look in more detail inside organisms, cells, organelles and molecules, including DNA, RNA and proteins (see my previous post to learn more about these "levels of complexity"). DNA sequencing technologies are a good example of such breakthroughs. They went from scientists being able to sequence specific genes (or gene segments) of interest at significant cost in time and resources, to sequencing the whole genomes of microorganisms, plants or animals, sometimes through international collaborations involving different labs and organizations, to the present possibility of having your own personal genome sequenced by a private company to learn about your ancestry and genetic predisposition to some diseases.
Molecular analyses have traditionally involved isolating a sample from an organism, or "growing" in the lab, under the appropriate conditions and culture media, enough cells to provide sufficient DNA, RNA or protein material for analysis. These experiments, and the resulting ("bulk") analyses, are based on populations of cells. This works well if the goal is to detect molecules (genes (DNA), RNA transcripts for studying gene expression, proteins or metabolites) that are present in most cells at detectable abundance. For rare events, however, the target molecules may not be present in large enough amounts to be detected under these conditions, in which the signal read out is an average over the molecules present in the bulk population, often a heterogeneous mixture of cells at different stages of growth, with different genes turned "on" and "off" at the time of sampling. There are methods that allow researchers to "sort" cells or "synchronize" them to obtain more homogeneous populations for analysis, but these methods still require a minimum amount of the target molecule to be present in the population.
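To see why bulk averaging can hide rare subpopulations, here is a toy Python illustration with invented expression values for a gene that is strongly "on" in a rare 5% subpopulation and essentially "off" everywhere else:

    import random

    random.seed(42)  # reproducible toy data

    # Invented per-cell expression of one gene: near zero in 95% of cells,
    # strongly "on" in a rare 5% subpopulation.
    cells = ([random.gauss(1, 0.3) for _ in range(950)]
             + [random.gauss(50, 5) for _ in range(50)])

    bulk = sum(cells) / len(cells)
    print(f"bulk average: {bulk:.1f}")  # ~3.5, which looks like uniform low expression

    # Per-cell measurements reveal the bimodal reality the average hides.
    on_cells = [c for c in cells if c > 10]
    print(f"{len(on_cells)} of {len(cells)} cells express the gene, "
          f"at an average level of {sum(on_cells) / len(on_cells):.0f}")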

An important breakthrough addressing these issues has been the ability, enabled by a variety of technologies, to perform "single cell analysis" of molecules of interest, as opposed to analysis of a cell population. Single cell technologies are used in both research and clinical applications, including, for example, the study of cancer by looking at tumor cells.

Within a tumor there are different clones of malignant cells that are genetically distinct, sometimes containing different mutations with roles in malignancy. These clones can also differ in their dividing speed, metastatic capability and sensitivity to cancer treatment. Single cell analysis allows characterization of the different subclones, which can inform therapies as well as follow-up of patients' response to treatment and disease progression. Single cell analysis has also been applied to the study of circulating cells shed by primary and metastatic tumors. These circulating cells are obtained from a liquid biopsy (usually a blood sample) and can be used in early diagnosis. With the added advantage that they can be obtained and analyzed much less invasively than primary and metastatic tumor cells, they allow more frequent monitoring of disease and response to treatment.

Single cell analysis of heterogeneous cell samples or tissues (a code sketch of the analysis steps follows below):

1) Enrichment of target cells, if possible
2) Isolation of cells of interest as single cells
3) Amplification of DNA, or of cDNA reverse-transcribed from RNA (usually PCR-based), to be used in:
4) Sequencing studies
5) Analysis of results
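To make the last steps concrete, here is a minimal sketch of what a typical single-cell RNA-seq clustering analysis can look like using the popular open-source Python library scanpy. The input path is hypothetical, and the thresholds are common starting values rather than a prescription:

    # Requires: pip install scanpy leidenalg
    import scanpy as sc

    # Load a gene-by-cell count matrix; the path is hypothetical and would point
    # at the output of a droplet-based (e.g. 10x) sequencing pipeline.
    adata = sc.read_10x_mtx("data/filtered_feature_bc_matrix/")

    # Basic quality filtering: drop near-empty cells and rarely seen genes.
    sc.pp.filter_cells(adata, min_genes=200)
    sc.pp.filter_genes(adata, min_cells=3)

    # Normalize each cell's counts, log-transform, keep the most variable genes.
    sc.pp.normalize_total(adata, target_sum=1e4)
    sc.pp.log1p(adata)
    sc.pp.highly_variable_genes(adata, n_top_genes=2000)

    # Reduce dimensions, build a cell-cell neighbor graph, cluster, and plot.
    sc.tl.pca(adata)
    sc.pp.neighbors(adata)
    sc.tl.leiden(adata)  # clusters correspond to candidate cell subpopulations
    sc.tl.umap(adata)
    sc.pl.umap(adata, color="leiden")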


Most single cell analyses are based on sequencing DNA, RNA or epigenetic modifications, for which isolated cells are first broken open, or "lysed", to release nucleic acids. Proteins and other metabolites can also be analyzed, and cells can also be used in assays in which they are kept alive and visualized under the microscope (for more details on epigenetics and labeling/visualization of fluorescent cells under the microscope, see my home page).

Microfluidics technologies, in which small amounts of liquid circulate in microchannels, are widely used in single cell isolation procedures. In what is known as "droplet microfluidics" (figure below), two immiscible liquid phases (water and oil, for example) flow through microchannels, leading to the formation of drops of one fluid within the other (carrier) fluid that contain single cells (an aqueous droplet in oil, in our example).
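One well-known property of droplet methods is that cell capture follows Poisson statistics: you cannot force exactly one cell into each droplet, so suspensions are deliberately diluted to keep multi-cell droplets rare. A small sketch (the loading rate of 0.1 cells per droplet is a typical ballpark, not a universal specification):

    from math import exp, factorial

    def p_cells_in_droplet(k, lam):
        """Poisson probability that a droplet captures exactly k cells."""
        return lam ** k * exp(-lam) / factorial(k)

    lam = 0.1  # average cells per droplet; kept deliberately low by dilution
    empty = p_cells_in_droplet(0, lam)
    single = p_cells_in_droplet(1, lam)
    multi = 1 - empty - single

    print(f"empty: {empty:.1%}, single cell: {single:.1%}, multiplets: {multi:.2%}")
    # ~90.5% empty, ~9.0% single cells, ~0.47% multiplets: most droplets are
    # wasted, in exchange for very few two-cells-in-one-droplet artifacts.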
These technologies have made it possible to isolate single cells at capture sites such as microchips, which are good platforms for subsequently sequencing DNA or RNA, and have led to "lab-on-chip" devices that act as integrated microsystems. There is a wide variety of microfluidics-based methods for isolating and processing single cells. Beyond sequencing analyses, they have numerous applications, including in cancer research and treatment; stem cells; immunology, microorganism studies and antimicrobial resistance evaluation; therapeutics; development; prenatal screening; and personalized medicine.

[Cartoon from "Beatrice the Biologist" (check out her awesome cartoons!): http://www.beatricebiologist.com/2012/05/single-cell-is-just-fine-thank-you/]

Coronaviruses causing recent epidemics and the latest COVID-19: how similar are they and what has research taught us so far?

3/13/2020

I have written about new disease outbreaks before, for example when the Zika epidemic occurred exactly where we were living at the time (northeast Brazil). In these cases, the information I include comes from what is available at the time I write: for the new COVID-19 disease (caused by the now-named SARS-CoV-2 virus) I rely on sources published through March 12, 2020, with most data from cases and trends in the China and Italy outbreaks, as well as compiled global data. It is important to be careful when reading things on social media and to pay attention to where the information comes from; always lean towards trusted public health and science sources (the WHO, national CDCs, well-known journals, medical centers, websites and newspapers). Sometimes an email, tweet or WhatsApp voice message or video circulates, and even goes viral, while literally spreading false information (alarming conspiracy theories, miraculous cures and treatments).

On January 30, 2020, the World Health Organization (WHO) declared the COVID-19 outbreak a "public health emergency of international concern", and then a pandemic on March 11. On January 31, almost 100 science journals and institutes signed a commitment to make data and research on this disease freely available for the duration of the outbreak (the same thing happened early in 2016 with Zika). These published articles are open access, and journals are also expediting the peer review of the COVID-19 manuscripts they handle. Some subscription-only newspapers, like the New York Times, are also making their coronavirus-related coverage available to everybody.

Today (March 12th), Maryland, where we live, announced that schools will close for two weeks starting this Monday the 16th. Different offices, companies and organizations (governmental and private) have been letting employees work from home or have closed their offices and are working remotely.

Coronaviruses are enveloped single-stranded RNA viruses comprising four genera, one of which, the beta-coronaviruses, includes SARS and MERS. SARS-CoV-2, the causative agent of COVID-19, is, like other coronaviruses, sensitive to ultraviolet rays and heat, and it can be effectively inactivated by soap and by ethanol (>60% concentration). Transmission is believed to occur via respiratory droplets from coughing and sneezing, and by aerosols, with elevated aerosol concentrations in closed spaces; close contact between individuals seems to be required. The incubation time is between 2 days and 2 weeks, and symptoms include fever, cough and shortness of breath. The "basic reproduction number", or R0, is about 2.2, meaning each infected person transmits the virus on average to 2.2 other people (the R0 of the previous SARS-CoV epidemic was approximately 3).
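To build intuition for what R0 means, here is a toy calculation: with no immunity and no interventions, each infection "generation" multiplies the case count by roughly R0. Real epidemics are messier, so treat this strictly as an illustration:

    # With no immunity or interventions, each infection "generation"
    # multiplies case counts by roughly R0.
    def cases_after(generations, r0, initial=1):
        return round(initial * r0 ** generations)

    for r0 in (2.2, 3.0):  # the SARS-CoV-2 estimate vs. the earlier SARS-CoV
        print(f"R0 = {r0}: {[cases_after(g, r0) for g in range(6)]}")
    # R0 = 2.2: [1, 2, 5, 11, 23, 52]
    # R0 = 3.0: [1, 3, 9, 27, 81, 243]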
[Image from: https://commons.wikimedia.org/wiki/File:3D_medical_animation_corona_virus.jpg]
When estimating the fatality rate of a disease, the denominator (the total number of cases) may be an underestimate, as it depends on how good testing is and on whether all infected individuals show symptoms and/or seek care; the fatality rate therefore tends to be overestimated, which is probably the case with the COVID-19 data so far, especially in the US, where testing was not immediately, and is still not widely, available. Today's global numbers of cases and deaths yield a 3-4% fatality rate; however, preliminary data from local studies in specific countries suggest it ranges from 1-2%. Either way, it is significantly lower than in the two previous coronavirus epidemics, SARS-CoV and MERS-CoV. Compared with those two viruses, SARS-CoV-2 has shown a low mortality rate with high transmissibility.
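A small sketch of why undercounted cases inflate the naive case fatality rate; the figures below are illustrative, chosen only to land in the ballpark of the 3-4% global rate mentioned above:

    # Naive case fatality rate (CFR) = deaths / confirmed cases. If testing
    # misses infections, the denominator shrinks and the CFR is inflated.
    deaths = 4_600        # illustrative figures only
    confirmed = 128_000

    print(f"naive CFR: {deaths / confirmed:.1%}")  # ~3.6%

    # If only half, or only a quarter, of all infections were actually detected:
    for detected in (0.5, 0.25):
        true_infections = confirmed / detected
        print(f"{detected:.0%} detected -> CFR {deaths / true_infections:.1%}")
    # 50% detected -> 1.8%; 25% detected -> 0.9%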

Because viruses mutate rapidly, sequencing the new strains emerging in different locations worldwide is important, and it is the focus of researchers who make sequences publicly available in international databases such as GenBank. A mutation in an envelope protein-encoding gene of SARS-CoV-2 probably occurred in late November 2019 and triggered the jump to humans from an animal species carrying the virus (see my zoonotic diseases post for information about human diseases of animal origin). Very soon after the first outbreak of COVID-19 was identified in Wuhan, China, the Chinese CDC reported in the journal The Lancet the sequences of the virus isolated from samples from 9 patients; these were found to be nearly identical to one another (about 99%) and 88% similar to bat SARS-like coronaviruses. They were not as close in sequence to the first SARS-CoV (~79%) or to MERS-CoV (~50%).

The first SARS epidemic, as well as the MERS epidemics, were eventually contained; we have learned a few lessons since, and there has been research on viral mechanisms (including in animal models) and possible vaccines. For example, SARS-CoV and MERS-CoV transmission between humans occurred mainly at hospitals (known as nosocomial transmission), where clusters formed probably due to virus shedding after the onset of symptoms. Family transmission occurred less frequently (13-21% of MERS cases, 22-39% of SARS cases), while transmission between patients was the most common route for MERS (62-79% of cases), and transmission from infected patients to health care workers was most common for SARS (33-42% of cases).

Based on the viral sequence first released by China, PCR-based molecular tests were developed (see my post on PCR for more info), which detect viral genetic material (RNA) in patient samples. PCR tests were the ones initially developed and used by the US CDC. China and Singapore have developed serological tests, which detect in blood samples the antibodies our bodies produce in response to the viral infection. All these tests were developed quickly in response to the initial outbreak in China as well as subsequent ones in other locations.

Test samples are taken mostly from the upper respiratory tract (nasal and oral swabs), but COVID-19 is known to be a lower respiratory tract infection (chest and lungs), and some patient studies so far show that detection is better in lower respiratory samples. As case reports get published and made available immediately, it is clear that testing is critical: for example, a letter today reported the case of a 69-year-old male in China this past January with pneumonia who was co-infected with SARS-CoV-2 and influenza A but was originally misdiagnosed (negative for COVID-19) based on upper respiratory specimens.

Because influenza viruses are seasonal, with the flu typically spreading rapidly in fall-winter and decreasing in spring and summer (I grew up in a tropical country where there was no flu awareness or yearly flu shot), there has been some speculation that the COVID-19 pandemic may subside in the upcoming summer months. But there is no concrete evidence (as stated by the WHO) that this will be the case. The virus is new to humans, there is no prior immunity, and only time and research will tell us more about transmission, spread, viral mechanisms and possible seasonality. Even if the virus does turn out to be seasonal, with further spreading it could reach (and cause more outbreaks in) countries in the southern hemisphere, which are heading towards their winter while we head into our summer.

Although SARS-CoV-2 can infect people of all ages, current evidence suggests that older people (>60 years of age) and/or those with underlying medical conditions (cardiovascular disease, diabetes, chronic respiratory disease, cancer) are considerably more susceptible to severe disease. Comorbidities, as well as advanced age, were also shown to be important for disease severity in the previous SARS and MERS outbreaks. The WHO recommendations for this at-risk population are shown below:

[WHO recommendation infographics]
The most important recommendation for everybody in general is to wash hands frequently for at least 20 seconds with soap and water, use portable hand sanitizer with at least 60% alcohol, and avoid touching the eyes, nose and mouth. Health care workers caring for sick people should use masks, eye protection, gowns and gloves. Infected people should wear masks when around other people.

Menopause: The HRT dilemma

12/19/2019

As a follow-up to my previous post on perimenopause (and my hormonal chronology), I venture now into the next phase, menopause, a term thought to come from the Greek words for month (mēn) and pause (pausis). A woman is officially considered in menopause once she has gone 12 months without periods; this occurs on average at around 50-52 years of age in women in the US. Menopause can occur earlier, though (and sometimes abruptly, without the perimenopause leading into it), in a woman's 30s or even 20s and teen years, as a result of ovarian insufficiency causing premature menopause, cancer treatment such as radiation and chemotherapy, or surgical removal of both the uterus and the ovaries (not the uterus alone), which causes periods to stop immediately, usually with hot flashes and other symptoms.

During and after menopause, women are at increased risk of certain health conditions, including weight gain, heart disease, and osteoporosis due to bone loss or weakening. Incontinence may occur, with involuntary urine loss and a higher risk of vaginal and urinary infection; vaginal dryness may bring pain, bleeding and reduced libido.
Most menopause symptoms and uncomfortable consequences are due to the decrease in the hormones produced by the ovaries (estrogen and progesterone), mainly the reduction in estrogen; other tissues, such as the adrenal glands and adipose tissue (fat), make these hormones only in minimal quantities. This is why estrogen is an effective treatment for symptom relief during the menopausal transition and postmenopausal years, including for hot flashes (and the fatigue and depression associated with the resulting lack of sleep), vaginal dryness and sexual function. Estrogen can be taken as Hormone Replacement Therapy (HRT) with progesterone/progestin (when the uterus is present, progesterone is given along with estrogen to prevent endometrial cancer) or without it (when the uterus is absent because of hysterectomy). HRT, however, has been and remains controversial. It was very popular starting in the late 1960s, not just for symptoms but also to prevent heart disease and osteoporosis after menopause. Estrogen protects from osteoporosis by preventing bone mass loss, and it was FDA approved for osteoporosis prevention in post-menopausal women. Studies published later scared everybody away from HRT, and its use declined in the new millennium.

A study led by the Women's Health Initiative (WHI), begun in the early 1990s, included 27,347 women in the US ages 50-79 on HRT, with subsequent discontinuation and 13 years of follow-up; results published in 2002 showed increased risk of breast and uterine cancer, as well as heart attack, blood clots and stroke. This study, however, drew its conclusions mainly from older women (average age 64) who received HRT long after menopause, whereas the recommendation is to start earlier, right after menopause. Subsequent studies and analyses found that when HRT is started earlier it does not increase cancer or stroke risk, and it may even protect from heart disease. In 2012-2013, several medical and OB/GYN organizations in the US stated that HRT is an option for treating menopausal symptoms.

The "HRT or no HRT" decision is an individual one, to be made with your doctor considering all the risk factors that may apply to you, including your family history, health and lifestyle, menopause age and severity of menopause symptoms. If HRT is chosen, it usually starts at the lowest possible dose, and the patient decides on the type from the available options (oral pills, patches or creams).

Nowadays there is a variety of HRT formulations to choose from, in different forms and dosages: creams, gels or sprays applied topically to the arm or leg; vaginal suppositories, rings or creams for women who experience uncomfortable vaginal dryness and intercourse; combination estrogen-progestin skin patches, kept out of sunlight and usually worn below the waistline on the lower stomach; or tablets (pills) taken daily. When both progestin and estrogen are used (for women with a uterus), called "combined HRT", the regimen can be cyclical, monthly or every 3 months (estrogen daily, progestin for only 14 days), for women still having periods, which will then come monthly or every 3 months respectively; or continuous (both hormones taken daily) for women who are post-menopausal and no longer have periods.

Helpful guidelines and tools for decision making regarding HRT use are offered by three organizations: the American College of Obstetricians and Gynecologists, the North American Menopause Society (NAMS), and the Endocrine Society. NAMS offers a free mobile app called MenoPro, with two modes (for clinicians and for women), that can help a patient and her doctor assess HRT and non-HRT options together, taking into account medical history and risk, with links to online tools.

The Endocrine Society, through its Hormone Health Network, has made a very comprehensive "Menopause Map - My Personal Path" available in English and Spanish, where you can find information (to read or listen to) about perimenopause, menopause and early menopause, why hormone depletion matters, and options to treat symptoms, including lifestyle changes (diet, sleep, exercise, vitamins) and HRT, as well as numerous additional resources such as calculators of risk or vitamin D intake, and several additional booklets and videos on a variety of menopause-related topics.


Zoonotic diseases (transmitted to humans from other animals) are more common than you might think

12/10/2019

More than half of established infectious diseases, and three quarters of emerging (new) infectious diseases in people, are transmitted to humans from other animals. These infections are called zoonoses and cause zoonotic diseases. Zoonoses are defined by the World Health Organization (WHO) as diseases and infections naturally transmitted between people and vertebrate animals.
[Figure from OIE (World Organization for Animal Health)]
Transmission of these diseases (which are caused by viruses, bacteria, parasites or fungi) can occur in different ways: direct contact with body fluids of infected animals (by touching pets, or via bites or scratches), indirect contact with the habitats of, or surfaces touched by, infected animals (their food containers, aquarium water, plants and soil), vector bites (ticks or mosquitoes), or eating contaminated or undercooked food (milk, meat, eggs, raw fruits and vegetables). An example of direct transmission by animal bite is rabies, often transmitted by dogs. Some zoonoses need an intermediate vector to be transmitted from vertebrate animals to humans, including bubonic plague (from rats or prairie dogs via fleas to humans, and then between people via respiratory secretions), Lyme disease (from deer mice via ticks) and West Nile virus (from birds, via mosquitoes).

A recent emerging zoonotic disease with a huge impact was SARS (severe acute respiratory syndrome) in 2003, caused by a virus thought to come originally from Chinese horseshoe bats. The virus was then probably transmitted to civets, small cat-like mammals eaten in some parts of China, before "jumping" to humans through exposure during butchering or food preparation. Eventually the SARS virus became capable of direct transmission between humans via respiratory secretions. The Middle East respiratory syndrome (MERS) virus, although in the same family as SARS (the coronaviruses), is very different in genetic composition. The MERS virus's hosts are camels, though its evolutionary ancestors may also have been bats.

The Ebola epidemic in West Africa in 2014-2016 was caused by a virus believed to be transmitted by fruit bats, which are widespread in many parts of Africa. Ebola virus is also transmitted from monkeys, pigs, forest antelope and porcupines. The meat of these wild animals, known in parts of Africa as bush meat, is very popular, and its handling by hunters and butchers, as well as eating it undercooked, poses a great risk of Ebola virus transmission. The virus can also kill gorillas and chimpanzees; gorilla populations suffered a staggering decline, with an estimated 5,000 gorillas lost in Gabon and the Republic of the Congo in 2002-2003 during Ebola outbreaks.

Human tuberculosis (see my tuberculosis post) is caused by the bacterium Mycobacterium tuberculosis, but bovine tuberculosis, caused by Mycobacterium bovis and affecting cattle, can be transmitted to humans in what is known as zoonotic tuberculosis, which is sometimes fatal and affects people mostly in Africa and Southeast Asia.

Some diseases that are now exclusively transmitted between humans, such as measles, influenza, SARS and HIV, are thought to have jumped at some point from animals to us, and animal domestication may have facilitated some of these events (see my previous post on domestication). Measles may have jumped from dogs to humans, although the measles-causing virus can no longer infect dogs, and smallpox may have originated from a cowpox virus that jumped from cows. The human immunodeficiency virus (HIV) is believed to have originated from chimpanzees and gorillas in Africa, afterwards adapting to human-to-human transmission. Viruses can change genetically (via mutations) very rapidly, and this favors efficient adaptation to new hosts, to the point where they no longer need the previous host for transmission.

The One Health concept recognizes that humans, animals and their environment/ecosystem are all interconnected. As part of the "One Health" approach, building strong and effective multi-sector collaboration between the animal and human health sectors, locally and globally, by integrating animal and human disease surveillance and response systems will allow early detection of possible outbreaks and prevent deadly epidemics. Besides zoonoses, the One Health approach also focuses on food safety and antimicrobial resistance (see my post on the latter).


Understanding domestication

12/3/2019

Domestic species, whether plants or animals, have been selected (or adapted) by us humans and are not the same as those found in the wild. Many domesticated species of plants and animals are not able to survive without our care.

Plants were first domesticated about 10,000 years ago, when agriculture began with wheat, barley, lentils and peas in Mesopotamia (in today's Middle East: Iran, Iraq, Turkey, Kuwait and Syria), on the very fertile valley soil between the Tigris and Euphrates rivers. Potatoes in South America and rice in Asia were among the first plants domesticated for food production in those regions. It is not clear exactly when and where olives were domesticated; the earliest domestication was probably about 6,000 years ago in the Middle East, spreading west to the Mediterranean and North Africa maybe 4,500 years ago. Some plants were not cultivated for food: cotton plants, for example, were used for their fiber to make cloth, and flowers for decoration.

Nowadays, in urban settings, we are often not familiar with what wild species look like, as we are only exposed to plantations and/or the products of domesticated crops. A figure from a review on crop domestication (Cell 2006; 127(7):1309-21) shows striking differences between wild and domesticated species of corn, rice, wheat, tomato and sunflower.

Animal domestication began in Mesopotamia with the goal of obtaining meat, milk, and animal skins for clothing and tents. Among the first animals domesticated, about 10,000 years ago, were the goat and sheep there, and the chicken in Southeast Asia. It is now believed that in most cases animal domestication was not a single event: for pigs, goats, sheep, horses and chickens, it happened multiple times, with domestication events drawing on local populations of the wild ancestral species. For example, genetic analyses have shown that the ancestors of domestic pigs were found across different regions of Europe, Asia and North Africa, and that domestication occurred at least six times from local populations; one event may have occurred over 13,000 years ago. Larger animals, including horses and oxen, were domesticated later for plowing and transportation. Cows were easily domesticated because they are herbivores that eat vegetation that is usually available, whereas animals that eat grains, such as chickens, require domesticated crops for seeds and grain.
[Image from: https://www.theatlantic.com/science/archive/2016/06/the-origin-of-dogs/484976/]
The very first species to be domesticated, however, at a debated time before agriculture that could have been anywhere between 15,000 and 40,000 years ago, was the dog, first as a hunting helper and later as a pet. In the first phase of domestication the dog derived from the grey wolf (Canis lupus); today, domestic dogs are a distinct subspecies (Canis lupus familiaris) and the most morphologically variable mammalian species on Earth.

Most dog breeds were established in the last 200-300 years, with strong artificial selection resulting in almost 500 breeds with specific morphologies (body size and skull shape, tail shape, fur and pigmentation). Dog breed selection has also targeted specific behaviors such as herding, hunting and guarding, and personality traits including aggression.

[Figure: Morphological variation among dog breeds; from upper left, going clockwise: Brussels Griffon, Afghan Hound, Bull Terrier, Chinese Crested Dog, Skye Terrier, Basenji, Gordon Setter and Bernese Mountain Dog, with a Cocker Spaniel in the center. From: Natl Sci Rev. 2019; 6(4): 810-824.]
Domesticated plants and animals can look very different from their wild ancestors. For example, domesticated tomatoes seem gigantic compared with their wild ancestors. Something similar occurred in chickens, first domesticated in Asia: early wild chickens were small (about two pounds), whereas domestic chickens today weigh as much as 17 pounds and lay many more eggs annually.

Domestication results in genetic changes. We may not be engineering the alterations in the lab (or targeting specific genes, or using elements from other organisms), but when selecting for taste, shape, color or growth features (faster-growing, pest-resistant, sweeter, etc.) we are indeed selecting for specific gene variants and mutations. As in other biology- and medicine-related fields, recent advances in genomics and gene technologies have shed light on aspects of domestication by revealing the genome (DNA) sequences of domesticated species as well as of their wild ancestors (living or extinct).

Domestication in plants has led to acquired features such as modified seed size and shattering in cereal species, and modified size and shape in vegetable crops. These modifications have been associated with specific genes in species including tomato, rice, maize, soybean, barley and wheat.

The first genome of a domesticated animal to be fully sequenced was that of the chicken, in 2004 (the human genome sequence was completed and made available in 2003). Sequence comparisons between domesticated and wild animals allow researchers to identify mutations or gene variants specific to domesticated animals, sometimes called "domestication genes". One such gene in chickens is TSHR (thyroid-stimulating hormone receptor), which in wild animals coordinates reproduction with day length, restricting breeding and egg laying to the spring and summer seasons. A TSHR mutation in domestic chickens, leading to a single amino acid change in the receptor protein encoded by the gene, renders the hormone receptor inactive and enables chickens to breed and lay eggs all year long. In pigs, detailed analysis of the genetic variation of local breeds (mostly European) has linked several gene variants/mutations to specific phenotypic traits: coat color (KIT, MC1R), production and fatness (LEPR, FTO, MC4R, LEP, MSTN), meat quality (PCK1, PRKAG3, ACACA, CAST, MTTP) and disease resistance (MUC4, GBP5). The figure below shows genes that have been directly or indirectly linked to phenotypes that distinguish dogs from wolves.
[Figure from: https://blogs.biomedcentral.com/on-biology/2018/06/28/village-dog-dna-reveals-genetic-changes-caused-by-domestication/]
An interesting theory proposes that we humans (and also bonobos) have "self-domesticated", based on domestication traits and genes that are present in domesticated species but not in their wild counterparts. A study published in PLoS ONE in 2017 showed that these domestication traits are shared by our species but not found in our extinct wild relatives, the Neanderthals and Denisovans. A figure in that study compares the craniofacial features of us (modern humans) and Neanderthals, and of dog and wolf: the skulls of the domesticated forms show smaller brow ridges, nasal projections, teeth and cranial capacity, which has been referred to as "feminization" and attributed to a reduction in androgen levels in parallel with a rise in estrogen levels in domesticated species.

Do you hear ringing (clicking, buzzing) in your ears? A common and interesting symptom called tinnitus

11/18/2019

Tinnitus comes from the Latin word tinnire, meaning to ring, and is commonly defined as ringing in the ears. The sound tinnitus sufferers hear, though, is quite variable: it can be perceived as ringing, clicking, hissing, chirping or buzzing, may be high- or low-pitched, and varies in volume and persistence. It may also be present in one ear only, or in both but with different intensities. Approximately 20% of the US population is estimated to suffer from tinnitus.


Many people have experienced temporary tinnitus after exposure to loud noises (firecrackers or other explosions, concerts); I focus here only on chronic tinnitus. Tinnitus often develops as a consequence of age-related hearing loss or exposure to loud noise, both of which damage the hair cells of the inner ear. These tiny hairs in the cochlea normally move in response to sound waves, initiating an electrical signal that travels via the auditory nerve to the brain, where it is interpreted as sound. It is the lack of this signal that makes the brain compensate and create the "phantom" sound of tinnitus. Although the mechanism behind tinnitus has not been completely elucidated, some scientists believe that the brain is trying to fill in the blanks of the sounds we can no longer hear when we lose sensory hair cells.
In some cases there is a specific cause for tinnitus, which may then be identified and possibly treated. For example, tinnitus can occur when there is an accumulation of ear wax (it gets better upon wax removal), or it can be a side effect of medications (in which case it may go away or improve once the person stops taking them); these include some antibiotics (erythromycin, vancomycin, neomycin), cancer medications (methotrexate, cisplatin), diuretics, quinine-containing antimalarials, some antidepressants, high-dose aspirin, anti-inflammatories, and some herbal supplements, as well as nicotine and caffeine. Ménière's disease is characterized by symptoms including vertigo and tinnitus; head, neck or jawbone problems and high blood pressure may also cause tinnitus.
The most common type of tinnitus, however, is one without a removable trigger; it is constant and increases gradually in volume over time. Although sometimes present in young people, it is more common in older individuals (in association with hearing loss) and in those who have been exposed to loud noise that damages the cochlear hair cells. Working in loud factories, in the music industry (like Bradley Cooper's rocker character in "A Star Is Born") or in construction, or using firearms, can lead to or worsen tinnitus. Some contemporary celebrities, especially musicians and actors exposed to loud noise, suffer from tinnitus. It is common knowledge that Beethoven was deaf, but less known is the fact that he also suffered from severe tinnitus.

The bad news is that there is currently no cure for this so-called "subjective" tinnitus, which only the affected person hears, and to keep it from getting worse we are supposed to avoid exposure to noise. As a tinnitus sufferer, I find this very difficult to achieve, especially if you live in a big city with sirens everywhere, loudspeaker announcements, and noisy household appliances used in small apartments.

Tinnitus is much more noticeable when everything is quiet, as late at night, which makes it hard for some people with tinnitus to fall asleep. Chronic tinnitus can result in insomnia, anxiety and depression. There are, however, strategies that can help us manage tinnitus, often via sound therapy, which may be combined with counseling, support groups, and behavioral therapy for stress and depression. For age-related hearing loss, a hearing aid can amplify surrounding sounds and mask that of the tinnitus. Masking noise can come from playing music or from white noise machines.
I have not used white noise masking regularly for my tinnitus because, although my hearing is fine, I am bothered by multiple simultaneous sources of noise. I now prefer to have a conversation (in the car, or over a meal at home or at a restaurant) without music in the background, as I used to in the past. I often explain this to others around me as a feeling of always having a “plus one” sound my brain is processing in addition to what everybody else is hearing: a high-pitched, loud, constant ringing. I do use earplugs when I go to events I know will be loud, and often carry a pair in my purse.
When I first realized I had tinnitus (after hearing occasional chirping or clicking sounds and turning to see where they were coming from, until they became a more homogeneous ringing, then constant and thereafter progressively louder) I had not heard the term before. It is not pleasant. So I encourage you to avoid loud sounds as much as possible, and to protect your kids too - check the volume on their headsets. The CDC cites a recommended exposure limit of 85 decibels: a maximum of 8 hours at 85 decibels, with the allowable time halved for every 3 decibels above that. To give you an idea of the noise levels we may be exposed to in different environments, see the American Academy of Audiology figure below.

[Figure: American Academy of Audiology chart of typical noise levels in different environments]
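If you like numbers, the rule above can be written as a tiny formula. Here is a minimal sketch in Python (my own illustration; the function name and the printed levels are just examples encoding the 8-hour/85-decibel guideline with a 3-decibel exchange rate):

def max_exposure_hours(level_db, reference_db=85.0, reference_hours=8.0, exchange_rate_db=3.0):
    """Allowed daily exposure: 8 h at 85 dB, halved for every 3 dB above that."""
    return reference_hours / 2 ** ((level_db - reference_db) / exchange_rate_db)

for level in (85, 88, 94, 100):
    print(f"{level} dB -> {max_exposure_hours(level):.2f} h")  # 8.00, 4.00, 1.00, 0.25

So a 100-decibel concert “uses up” a whole day’s noise budget in about 15 minutes.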
Acoustic neuromodulation treatments, which aim to reduce the hyperactivity of the brain neurons responsible for tinnitus, are being developed and tested in tinnitus patients, mostly in Germany, with promising results reported. This is not a one-size-fits-all formula but a personalized approach for each patient, with exposure to different tones (usually 4) around the patient’s specific tinnitus frequency for 4-6 hours a day for a few months.
Figure from: http://soniclab.umn.edu/research/enhanced-hearing-aid-humans-tinnitus-treatment-and-other-auditory-applications



senolytics or the search for a longer, healthier life

9/8/2019


 
The word “senolytic” comes from “senescence” and “lytic”, which in biology and medicine means destruction. Senescence refers to the process by which cells stop dividing and growing. Senescent cells accumulate in the body as a result of aging or of exposure to stressors including chronic infections, radiation and chemotherapy (see my post on telomeres as a marker for cellular senescence or aging).

Whereas normal cells cycle through stages in which they grow, divide and eventually die, senescent cells can remain alive in a “zombie”-like state that involves the release of factors that affect neighboring cells. The accumulation of significant numbers of senescent cells in our bodies as we age may contribute to several conditions such as osteoporosis, osteoarthritis and vascular disease.

Senolytics, an emerging and exciting field, is in its early stages, so most drug trials to date have been conducted in mice (see my animal models post). A pioneering study by a team from the Mayo Clinic, published in 2011, used mice engineered with “traceable” senescent cells and a system by which these could be selectively eliminated. It showed a delay in the appearance of age-related conditions, and the lifespan of these mice was increased by about 20-25%, indicating that the removal of senescent cells leads to a healthier and longer lifespan in this animal model.

A more recent study, published last year (2018), showed that in a senescence animal model a combination of two existing drugs (dasatinib and quercetin) prevented cell damage, eliminated senescent cells from tissues and restored function. The same drug combination, when administered to normal mice, extended their lifespan (in older mice) and health span (when given to younger mice).
Two exciting senolytics studies in mice published this year used an obesity model. Lean mice, when fed a high-fat diet, become obese, accumulate fat-rich senescent cells in adipose tissue and in the brain’s white matter, and show more anxious behaviors. In the first study, obese mice treated with senolytic drugs did not lose weight, but the drugs cleared the fat senescent cells from the brain and produced a noticeable reduction in anxious behavior. The second study, eliminating senescent cells in obese mice, demonstrated improvement of obesity-related metabolic complications including low glucose tolerance, inflammation, and impaired kidney and heart function.
Early human clinical trials (see my post on clinical trials) for senolytic drugs are in their initial stages, targeting specific conditions such as age-related macular degeneration, glaucoma and chronic obstructive pulmonary disease including emphysema. Much of the research so far has used the above-mentioned combination of dasatinib and quercetin. A pilot clinical trial for knee osteoarthritis has already shown some promising results in humans.

Dasatinib is an existing drug used to treat some forms of leukemia; quercetin is a flavonol found in fruits and vegetables such as apples and berries, cruciferous vegetables (broccoli, Brussels sprouts, cauliflower, collard greens, cabbage), capers, grapes, onions, shallots, tea, and tomatoes, as well as in many seeds and nuts.


Electric bacteria or the electromicrobiology field

6/29/2019


 
Bacteria have colonized pretty much all habitats on Earth, including the bottom of the sea, volcanoes, Mount Everest, and the inside of our bodies (see my microbiome post), where they account for about half of all cells in each of us. We keep discovering amazing abilities in bacteria, and these discoveries are the focus of numerous studies with important real and potential medical and environmental applications.
 
In order to thrive in different environments, bacteria use specific mechanisms based on available resources. A group that has received special attention in recent years is one that uses/generates electricity.
 
When we say we breathe oxygen, what we really mean mechanistically is that we use oxygen as the final electron acceptor of the energy-generating process (see my posts on mitochondria and ATP) essential to our lives. This chain of biochemical reactions starts with food metabolism, where the carbohydrates/sugars we eat provide electrons that are passed along different “electron carriers” to be finally delivered to oxygen. In low- or no-oxygen environments, bacteria have the ability to use other elements as electron acceptors, including iron, manganese and other metals (in oxide form); the first metal-reducing bacteria, Shewanella and Geobacter, were reported in the 1980s. More recently (in 2005), Geobacter bacteria were reported to have what are now known as “bacterial nanowires”, based on the ability of their long pili, or extracellular protein nanofilaments, to conduct electricity. This differs from our cells in that the electron transfer is “extracellular”: it happens outside the bacterial cell. The current model is one of “electron hopping” along bacterial nanowires. One Geobacter species has been shown to produce nanowires of up to 20 μm in length, about 20-fold longer than the Geobacter cells themselves. Individual fibers have the charge transport capacity to discharge respiratory electrons at micrometer distances from the cell, at about 1 billion electrons per second.
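To put that last number in perspective, here is a quick back-of-the-envelope calculation in Python (my own arithmetic, not a figure from the cited studies), converting electrons per second into an electric current:

ELEMENTARY_CHARGE_C = 1.602e-19     # charge of one electron, in coulombs
electrons_per_second = 1e9          # ~1 billion electrons per second per fiber
current_amperes = electrons_per_second * ELEMENTARY_CHARGE_C
print(f"{current_amperes:.2e} A")   # ~1.6e-10 A, i.e. about 0.16 nanoamperes

Tiny by household standards, but measurable, and remarkable for a single protein fiber.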
These bacterial nanowires allow bacteria to live on iron and other metals abundant in habitats such as soil and marine and freshwater sediments. Some bacteria that grow in communities known as “biofilms” (see my biofilms post) have been shown to colonize microelectrode surfaces, leading to electroconductive biofilms that form a redox gradient by transporting electrons from electron-rich to electron-poor areas. Depending on the type of bacteria, they donate electrons to or receive electrons from electrodes: anodes accept electrons from the biofilm bacteria that grow on them, while cathodes grow electrotrophic bacteria that consume their electrons.
These biofilms can even grow as a combination of two different Geobacter species, or of other species. The pili that act as nanowires also play a role in the formation of these multicellular biofilm structures by mediating cell aggregation.
Electricity from bacteria and biofilms, harnessed with electrodes, is now the focus of potential applications such as bioremediation (of soil and water contaminants, including toxic heavy metals and radioactive waste, which are chemically reduced to non-toxic forms: uranium in contaminated groundwater, for example) and electricity harvesting or bioenergy with the use of “microbial fuel cells” (MFCs). MFCs use microbial cellular respiration to pass electrons from an anode to a cathode, the two separated by an ion exchange membrane: organic “fuel” such as wastewater is fed into the anode chamber, where bacteria growing as a biofilm on the anode oxidize it, producing protons, electrons, and carbon dioxide as byproducts, with the anode serving as the electron acceptor in the bacteria’s electron transport chain. The bacteria here act as “biocatalysts” in the generation of electricity from organic matter or waste.

vaccine refusal is harming disease eradication

6/17/2019


 
This is a self-explanatory figure post with CDC measles data, following up on my November 1, 2017 blog on vaccines ... unfortunately, we are still going backwards on the disease eradication front in spite of these amazing tools available called immunizations/vaccines.
[Figure: CDC measles data]

Thanking Henrietta: what makes cancer cells malignant is good for research

3/21/2019


 
A cancer cell, right before becoming “malignant”, was a normal cell growing inside us, part of a specific tissue, organ, blood cell population or bone marrow. The main difference evident in a cancer cell is also where its threat comes from: cell division goes out of control, becoming faster and unchecked and leading to tumor growth. Mutations occur in the cell’s DNA, in genes encoding the proteins that control and regulate the cell’s life cycle (including division and its checkpoints), and it all goes out of whack. A lot of research is now focused on understanding the specific DNA alterations (mutations) associated with different types of cancer. Some hereditary mutations are well known to increase our risk of developing certain cancers, depending on additional mutations acquired later and on additional risk factors we may be exposed to (see my post on genetic testing for disease risk).
It is not the presence of just one mutation that leads to cancer; rather, a number of them accumulate in a cell before transformation occurs. A study published in 2017 in Cell showed that between 1 and 10 mutations are needed depending on the type of cancer (“Universal Patterns of Selection in Cancer and Somatic Tissues”, Martincorena et al.); four mutations on average are found in patients with liver cancer, whereas colorectal cancer patients showed an average of 10.
Now a name more familiar to lay audiences thanks to Rebecca Skloot’s book “The Immortal Life of Henrietta Lacks”, published in 2010 and followed by an HBO movie in 2017, “HeLa” cells have been the most widely used cell line in laboratory research since 1951, when they became the first “immortal” human cell line available. The cells were obtained by a Johns Hopkins researcher during the treatment of Henrietta Lacks’s cervical cancer, and the institution subsequently made them freely available for scientific research. Henrietta died of this cancer in 1951.
The amazing advantage of an immortal cell line, such as those derived from cancers, is that the cells can divide indefinitely when cultured in the lab and be used for a wide range of purposes, including:
  1. experiments to figure out things about cancer, or even about non-cancer cell lines
  2. production of cellular reagents for laboratory research, including antibodies and other proteins that can then be used in “cell-free” systems for studies of cellular protein function
  3. production of therapeutic protein pharmaceuticals such as antibodies, hormones and vaccines
  4. testing potential cancer therapies
 
Normal cells cultured in the lab will stop dividing after a number of division cycles (40-60) due to the shortening of the ends of their chromosomes, called “telomeres” (see my previous post on telomeres), while HeLa cells have an overactive “telomerase”, an enzyme that lengthens telomeres and keeps cells growing forever. HeLa cells grow unusually fast, even compared to other tumor-derived cell lines.
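As a toy illustration of this division limit, here is a small Python sketch (my own, with made-up numbers for telomere length and loss per division, chosen only so the normal case stops at about 40 divisions):

def divisions_until_senescence(telomere_bp=10_000, loss_per_division_bp=200,
                               critical_bp=2_000, telomerase_gain_bp=0):
    """Count divisions until telomeres shrink to a critical length."""
    divisions = 0
    while telomere_bp > critical_bp and divisions < 1_000:
        telomere_bp -= loss_per_division_bp    # chromosome ends shorten at each division
        telomere_bp += telomerase_gain_bp      # active telomerase re-lengthens them
        divisions += 1
    return divisions

print(divisions_until_senescence())                        # 40: normal cells stop dividing
print(divisions_until_senescence(telomerase_gain_bp=200))  # hits the 1000 cap: effectively immortal

When telomerase fully offsets the loss per division, as in HeLa cells, the counter never reaches the critical length: the cells are “immortal”.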
The American Type Culture Collection (ATCC), a reference source for all kinds of cell lines and microorganism strains, offers panels of tumor cell lines organized by tissue (bladder, brain, bone, breast, ovarian, colon, head and neck, leukemia, lymphoma, liver, lung, melanoma, pancreas and stomach cancer cell line panels, some including primary as well as metastatic tumors), each with specific mutations characterized. The molecular geneticist Michael Gottesman at NIH stated how critical these cell lines are in cancer drug development: “There is no cancer drug in current use that was not first tested in a cultured cell model” (Discover magazine, October 2014).

Normal cells can be engineered to become immortal by infection with a virus such as human papillomavirus (HPV), which induces mutations at the DNA level in genes encoding proteins involved in maintaining normal cell division and in DNA damage checking and repair.

For more info on interesting novel therapies to treat cancer and testing your genetic risk you can see my previous posts on these topics (genetic testing for disease risk, immunotherapy, and pharmacogenetics).

what are clinical trials and why do they take so long?

1/9/2019


 
Clinical Trials and the FDA

Before a new drug/treatment, clinical device or procedure can be widely used by doctors to treat patients, these potential medical tools undergo (usually very long) clinical trials. These trials use volunteers to evaluate efficacy (how well the treatment works) and, most importantly, safety (side effects). Before anything gets OKed by the FDA (Food and Drug Administration) in the US, it has to be shown to be effective and safe via clinical trial results, which are reported to the FDA and reviewed by experts before an official approval is eventually given.

I remember times when “clinical trials” or “FDA approval” were not common terms, but now you may hear in the health news that a new drug seems promising for a specific condition (diabetes, blood pressure, cholesterol, Alzheimer’s, Parkinson’s, etc) with a warning that this potential treatment will not be available to patients until it has gone through many additional trials (up to 10 years depending on the treatment and the target condition). 

In some cases, when the drug/treatment is being tested at the NIH or at different medical centers, terminally ill patients may be admitted as volunteers in the trial - you may know people or family who have been in such trials (or you may be interested in volunteering for a clinical trial yourself), especially patients for whom approved treatments have failed, in cancer for example. The free database ClinicalTrials.gov includes clinical studies worldwide (currently a total of 293,889 research studies in all 50 states and in 207 countries) and is searchable by condition or disease, drug name, and country at:
https://clinicaltrials.gov/

All trials have to obtain approval from relevant committees and institutional review boards that make sure ethical requisites are being met, such as “informed consent” by the volunteers, which is requested prior to the trial. Volunteers have to be informed of the treatment being tested, its possible side effects and risks, and the aims of the trial, after which they give consent to participate as volunteers and also to what can be done with the information and materials obtained from the trial (which may be stored for years, such as blood samples); everything is of course kept confidential, even when pooled results are presented and shared with relevant audiences. All clinical trials are expected to comply with “good clinical practice” (GCP) guidelines, which ensure ethical and scientifically sound trials and protect human subjects’ rights and confidentiality. Recruitment, selection and final enrollment of candidate volunteers into a clinical trial is a process that may take several months in order to reach the numbers required for statistical analysis to prove the treatment effective and safe.

Pre-clinical Research Phase

Before clinical trials start, there has to be evidence (data) that the drug or test/device to be trialed is promising in preliminary laboratory studies done in cultured cells and/or animal models (usually mice) of the specific disease or condition (see my post on animal models). These “pre-clinical” study results, along with a detailed explanation of the procedures by which the drug to be tested is made in batches and the specific “protocols” that will be followed in the proposed clinical trial (including compliance with ethics rules), are submitted to the FDA in what is known as an “investigational new drug” or IND application; all clinical trial phases thereafter are under FDA oversight. Clinical trial phases 1-4 occur consecutively, and FDA approval to move on to the next phase depends on the results and analysis of the previous phase, which are submitted to the FDA for review in a timely manner.
Clinical Trial Phases

There are 4 phases of clinical trials, which usually happen consecutively, with progress to the next phase depending on approval of the previous phase’s results; sometimes two phases may happen simultaneously depending on trial design. Planning and execution of each phase require time and navigating loads of associated paperwork and administrative and regulatory requirements, with possible delays affecting progress. Trials may be conducted at different sites, sometimes in different countries; in these cases procedures need to be similar so results are comparable, and forms for volunteers are provided in the local language.

The first phase (phase 1) usually recruits a small number of healthy volunteers to look at dosing of the drug and side effects, sometimes including studies on how/when the drug is processed (metabolized) and how long it stays in the body. Phases 2 and 3 involve larger groups of volunteers, including patients with the disease/condition and “control” groups to compare with, who receive either an existing treatment or a “placebo” (a dummy treatment that looks exactly like the real one). There are instances in which trials may be sped up or skip phases depending on previous testing. For example, a “repurposed” drug that has been tested for one disease/condition and is being considered for another usually moves directly to phase 2, as its safety in humans was previously established in the phase 1 trials for the first (already FDA-approved) condition. A “new drug application” (NDA) is submitted to the FDA after phase 3 is completed to demonstrate that the drug is safe and effective for its intended use in the population studied; the NDA is a complete and extensive report including pre-clinical and phase 1-3 data.

Phase 2 and 3 trials may be “randomized” and/or “blind”. Randomization is usually computer-based, assigning volunteers randomly to either the treatment or the control group, often stratified by variables such as age, gender and disease stage to make sure the groups are of similar composition. Each patient receives a code number that matches the code on his/her corresponding medication. When the patient doesn’t know whether they are getting the treatment or the control/placebo, the trial is blind; in a double-blind trial the researchers/doctors are also unaware of which treatment is being administered to each patient.
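To make the randomization idea concrete, here is a minimal sketch in Python (my own illustration, not actual clinical trial software; the field names and the alternate-within-stratum assignment are simplifying assumptions):

import random
from collections import defaultdict

def randomize(volunteers):
    """Assign volunteers to 'treatment' or 'placebo', balancing within strata."""
    strata = defaultdict(list)
    for v in volunteers:
        # Group volunteers who share an age band, gender and disease stage.
        strata[(v["age_band"], v["gender"], v["stage"])].append(v)
    assignments = {}
    for members in strata.values():
        random.shuffle(members)  # random order within each stratum
        for i, v in enumerate(members):
            # Alternating arms within a stratum keeps the two groups similar.
            assignments[v["id"]] = "treatment" if i % 2 == 0 else "placebo"
    return assignments

Real trials use more sophisticated schemes (permuted blocks, for example), but the goal is the same: treatment and control groups that look alike in everything except the treatment.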

The final phase (4) starts immediately after the results of the phase 3 trials are reviewed and considered to show good enough safety and efficacy to grant FDA approval; it consists of post-marketing monitoring over a longer period of time, in a “real-world” scenario and a wider population, which may allow detection of rare side effects missed in earlier trials. This surveillance or “pharmacovigilance” may reveal harmful effects over time, with consequences such as the drug’s withdrawal from the market (or restricted use); this happened to troglitazone (Rezulin/Romozin) in 2000 due to risk of hepatotoxicity, cerivastatin (Baycol/Lipobay) in 2001 due to risk of rhabdomyolysis, and rofecoxib (Vioxx) in 2004 due to risk of myocardial infarction.

When pregnant women in Europe, Australia and other countries used the drug thalidomide around 1960 for morning sickness, the drug was not marketed in the US thanks to an FDA reviewer, Dr. Frances Kelsey, who felt there was no reliable safety evidence. When babies were born with severe birth defects, such as hands and feet protruding from shoulders and hips, and these were linked to thalidomide use, it prompted the 1962 Drug Amendments bill that changed drug regulation; it also resulted in the FDA excluding women from participating in clinical trials in 1977, a policy reversed in 1993, although women are still a minority in clinical trial populations. Dr. Kelsey’s role in averting a thalidomide tragedy in the US was recognized with several awards in the US and abroad.
Children are another group not usually included in clinical trials, due to safety and ethical considerations. For them to participate, a clinical benefit must justify the risk of taking the drug, and they should be patients with the target disease or condition. Only about 50% of FDA-approved drugs are labeled for use in children to date. However, clinical trials are common practice in pediatric cancer treatment, leading to considerable numbers of children with cancer participating in clinical trials.

This FDA web page has essential information and nice short videos explaining simply what clinical trials and phases are:
https://www.fda.gov/forpatients/approvals/drugs/ucm405622.htm

    Author

    Hi! This is an attempt to write simply about things I feel passionate about. My name is Judith Recht and I am a scientist by training, a later-in-life mother, and an expat in Bangkok, Thailand and Recife, Brazil (~4 years in each country), now back in the US. I was born in one country (USA), grew up in another (Venezuela) raised by Argentine parents, and moved around four more times (NYC to Bangkok to Recife to Maryland). This blog is for those of you who might be interested in the diverse topics included so far and others coming up soon.
