Masters of Our World: Should We Use Gene Drives to Control the Ecosystem?

Picture Credit: Michael Morgenstern | Science News

Some have called it a magic wand. Others have referred to it as the beginning of a new scientific revolution. Regardless of how you may see it, it’s a subject that shouldn’t be left to scientists alone.

CRISPR-Cas9 is the latest state-of-the-art gene editing tool that has taken over the scientific community in recent years. While the concept of modifying DNA is certainly not new, CRISPR’s main strength lies in its transformation of the complicated process of gene editing into something quick, efficient, precise and ridiculously cheap. In other words, it has the potential to cut out undesirable segments of DNA, eradicate hereditary diseases and even guide humanity to a future where people can shape their bodies into whatever they want. It’s why many people have stopped calling something like designer babies “unlikely” and started calling it “inevitable.”

One area of CRISPR research that has gained a lot of attention recently is the development of gene drive technology, which may give humans the power to modify or even exterminate entire species in the wild. According to evolutionary biologist and gene drive pioneer Kevin Esvelt, the purpose of a gene drive is to use CRISPR to override the traditional rules of Mendelian inheritance and introduce a genetic change into an organism that will be passed down to nearly all of its descendants.

In a typical situation, a parent organism passes a given gene to only half of its offspring, on average, as per the rules of inheritance discovered by Gregor Mendel, the father of modern genetics. As a result, even when scientists were able to genetically modify organisms in the past, they still encountered immense difficulty in forcing specific genetic changes across an entire population. With a gene drive, however, that 50-50 chance of inheritance can skyrocket to as high as 99 percent. This, of course, has groundbreaking implications.
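
To make that difference concrete, here is a minimal, illustrative simulation, not taken from Esvelt’s work, that compares how an introduced gene spreads when carrier parents pass it to 50 percent of their offspring versus 99 percent. The population size, starting frequency and random-mating model are invented assumptions, and the sketch deliberately ignores real-world complications such as fitness costs and resistance.

```python
# A minimal sketch (assumed, illustrative parameters; not from any real study):
# compare how an introduced gene spreads through a randomly mating population
# when carrier parents transmit it to 50% of offspring (Mendelian inheritance)
# versus ~99% of offspring (a CRISPR gene drive).
import random

def simulate(transmission_rate, generations=10, pop_size=10_000, start_freq=0.01):
    """Return the carrier frequency after each generation under random mating."""
    freq = start_freq
    history = [freq]
    for _ in range(generations):
        carriers = 0
        for _ in range(pop_size):
            # Each offspring has two parents drawn at random from the population.
            parents_carry = (random.random() < freq, random.random() < freq)
            # A carrier parent passes the gene on with probability transmission_rate.
            if any(random.random() < transmission_rate for carries in parents_carry if carries):
                carriers += 1
        freq = carriers / pop_size
        history.append(freq)
    return history

print("Mendelian (50%):", [round(f, 3) for f in simulate(0.5)])
print("Gene drive (99%):", [round(f, 3) for f in simulate(0.99)])
```

Under the 50 percent rule, the gene hovers near its starting frequency; at 99 percent, it roughly doubles in frequency each generation until nearly the whole population carries it, which is exactly why gene drives are so powerful and so hard to contain.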

“The ability to edit populations of sexual species would offer substantial benefits to humanity and the environment. For example, RNA-guided gene drives could potentially prevent the spread of disease, support agriculture by reversing pesticide and herbicide resistance in insects and weeds, and control damaging invasive species… [G]ene drives will be capable of influencing entire ecosystems for good or for ill,” stated Esvelt when he first introduced the possibility of using CRISPR to develop gene drives.

We possess the technology to change the world’s ecosystems, but does that mean we should use it? Many people certainly seem to think so, and the proposed benefits seem irrefutable. For instance, one innovative project currently underway is the use of gene drives to eliminate malaria-carrying mosquitoes. Scientists are working on genetically modifying the Anopheles gambiae mosquito, a species known for spreading the malaria parasite, so that the female mosquitoes become sterile. That way, once these modified mosquitoes are released into the wild, they can breed with other members of their species and the population will effectively die off. Other scientists are looking toward using gene drives to wipe out invasive species and save endangered native animals.

Esvelt himself has become heavily involved in gene drive technology. His current project aims to reduce the rate of Lyme disease on Nantucket Island in Massachusetts by genetically modifying the island’s white-footed mice to become immune to the disease. Then, ticks will be unable to transfer the bacteria that cause the disease, and the entire transmission cycle will collapse.

However, as promising as all this may sound, it’s doubtful that gene drives will provide a lasting, viable solution. In fact, it’s possible that this technology allows scientists to deal with these serious issues in the wrong way. We may have become too infatuated with how sleek and shiny CRISPR appears to consider better, less risky solutions.

For one thing, ecosystems aren’t so simple that we can just inject new variants of a species into the wild and expect everything to go exactly as we planned. There are too many nebulous factors involved for scientists to be able to correctly predict the outcome of every ecological experiment. One of the test subjects may escape into a different environment or a completely unrelated species may become caught in the crossfire. Most of the time, as Esvelt notes, the gene drive may have little to no effect on the ecosystem at all. Ultimately, it’s arrogant to treat the ecosystem like a math problem with a simple, clean answer.

Even Esvelt seems aware of these limitations, stating, “Let me be the first to say that we do not understand how ecosystems work. They are fantastically complex.”

As if to affirm this admission of ignorance, nature itself seems to have knocked gene drives down several pegs. According to a recent report by population geneticist Philipp Messer, the genetically modified mosquitoes that his team designed to pass down an infertility mutation to all their offspring started developing resistance to the gene drive. In other words, gene drives may not be the permanent solution that many people claimed them to be. “In the long run, even with a gene drive, evolution wins in the end,” Esvelt commented in response to the news.

But that’s not even the worst part. Upon creating a detailed mathematical model that describes what happens when genetically modified organisms are released, Esvelt discovered that the chances of altered genes spreading to unintended parts of the ecosystem were much higher than he originally predicted.

“I [feel] like I’ve blown it … [Championing this idea was] an embarrassing mistake,” Esvelt admitted.

To be honest, the entire idea of gene drives seemed faulty to begin with, mainly because the desired population modifications are not introduced naturally. Instead of working hand-in-hand with evolution, gene drives attempt to solve ecological problems by simply creating more unsustainable arms races akin to the one between antibiotics and bacterial diseases. For instance, even if gene drives eradicated a species of mosquitoes that spread malaria, it wouldn’t be long before a different species of mosquitoes emerged that could spread the parasite to human hosts.

Instead of making sudden, irreversible changes to the ecosystem, a much more reasonable approach is the one offered by evolutionary biologist Dr. Sharon Moalem in his book Survival of the Sickest. In it, Dr. Moalem describes how the best way to combat diseases like malaria is to change the conditions of the environment so that the disease evolves in a way that works in our favor. For example, the widespread use of mosquito nets would not only stop mosquitoes from infecting humans but essentially invalidate mosquitoes in general as vectors for the disease. Evolution might then favor an alternative way for malaria to spread, perhaps one in which the parasite doesn’t completely incapacitate the body but only slightly weakens it, so that the disease passes from person to person much like the common cold.

Rather than risk a high-stakes gamble on gene-editing technology, it may be wiser in the long run to contemplate less invasive methods to solve our ecological problems. Humans don’t have a great track record to begin with, after all.

Originally published on November 29, 2017, in The Miscellany News: Gene Drives Wrongfully Hailed as Biological Panacea


The War on Cancer: We Still Think We’re Winning Even Though We’re Not

Picture Credit: Fabian Bimmer | Reuters | Newsweek

I must admit, it can be exciting to read about the latest developments in cancer research in the news. There is so much happening in the field of oncology that it’s tempting to imagine a future without cancer just around the corner.

For instance, scientists at Northwestern University have reported that they have found what they call an ancient kill switch in cancer cells. According to the researchers, they may be able to use this mechanism to force cancer cells to kill themselves in multiple ways simultaneously.

Not only that, a revolutionary new form of cancer treatment known as CAR T-cell therapy has swept up the scientific community in an excited fervor. By using gene therapy to modify the T-cells of a cancer patient’s own immune system and then reinjecting them into the patient, researchers have successfully destroyed tumors in people who had lost all hope.

According to various news reports, this treatment was so promising that the U.S. Food and Drug Administration (FDA) has recently given it the green light for production and distribution, making it the first use of medicinal gene editing to be available for patients in the United States.

“We’re entering a new frontier in medical innovation with the ability to reprogram a patient’s own cells to attack a deadly cancer,” FDA Commissioner Dr. Scott Gottlieb stated after approving the treatment.

As with anything that’s showered with positive attention by the media, however, it’s not as simple as it appears. All the hype surrounding cancer research is actually blinding us to the reality that we are not winning the war against cancer. In fact, despite what headlines may claim, we are nowhere close to actually finding the cure for cancer.

While such a claim may sound needlessly pessimistic, it is vital to view the current trajectory of cancer research within the context of its larger history. For one thing, cancer has been around for a very, very long time. This immortal and terrifying disease has haunted all of human history, with fossil evidence and ancient manuscripts dating its pervasiveness as far back as 1600 B.C. Needless to say, renowned scientists and medical experts across human history have made countless attempts to understand and combat this disease. In recent memory, the most notable collective endeavor is America’s War on Cancer, which was launched by President Nixon in 1971. From that moment on, the United States has devoted increasingly intensified efforts to finding a cure.

Over the past 40 years, the U.S. has poured a total of more than $500 billion into winning this war. Even now, that war continues to escalate. In 2017, the National Cancer Institute (NCI) received $5.389 billion for the fiscal year, which is $174.6 million more than what the organization received in 2016. In addition, we have around 260 different nonprofit organizations in the United States that raise money for cancer research and treatment. Together, those nonprofit organizations have budgets that top $2.2 billion.

This should be good news, though, right? All of that money is going towards a worthy cause, after all. Indeed, that much is undeniable. However, the problem is that all that money is translating to very little substantive progress in terms of developing a permanent solution. So far, we have made great strides in understanding the nature of cancer cells and how they behave in general. Unfortunately, utilizing that knowledge to create a reliable treatment has so far proven to be much more difficult than anyone had realized.

Despite receiving billions of dollars in funding and conducting countless expensive and laborious drug trials, scientists have yet to develop anything that can meaningfully increase a patient’s chances of survival, much less actually cure the disease. In fact, a recent study published earlier this year reported that two-thirds of all cancer drugs that were approved in the past two years showed no evidence of extending survival at all (USA Today, “Dozens of New Cancer Drugs Do Little to Improve Survival,” 02.09.2017).

When President Nixon announced the War on Cancer, he vowed that cancer would be cured by 1976. Today, cancer remains as deadly as ever. According to the World Health Organization, one in six deaths in the world in 2015 was caused by cancer, resulting in a total of 8.8 million deaths. As a result, cancer is still the second leading cause of death globally, just behind heart disease. However, the death toll from heart disease has decreased significantly over the past several decades. In fact, between 1950 and 2005, the death rate of heart disease dropped by 64 percent. In contrast, the death rate for cancer fell by a mere five percent during that same time period. That’s how little progress we have made, even with billions of dollars in funding supporting decades of scientists’ focused research.

Of course, the last thing I want to do is discourage further cancer research. Despite the rather bleak odds, there are still benefits to continuing this line of inquiry and searching for other treatment options. The point I’m trying to articulate is that the news you hear about cancer research tends to be so overly positive that it often fails to accurately depict the reality of the situation. No matter where you look, every new insight is a “major breakthrough,” and every new test product is “a miracle in the making.” By exaggerating successes, the media has effectively deceived the general public into believing that the cure for cancer is just around the corner.

Case in point: CAR-T therapy. Remember how I mentioned earlier that this method of cancer treatment showed promising results? When news sources learned that the FDA had approved its use in the United States, they erupted with excitement. They ran articles about the miracle of CAR-T therapy, with headlines such as “Latest Car-T Therapy for Cancer Signals New Era for Life-Saving Treatments,” “New Gene Therapy for Cancer Offers Hope to Those With No Options Left” and “Cancer’s Newest Miracle Cure.” In typical fashion, all these articles feature heartwarming stories of cancer patients miraculously being saved by this revolutionary new treatment that will surely stop cancer in its tracks.

What these articles fail to mention is that CAR-T therapy can be incredibly dangerous because it needs to bring your body to the brink of death in order to save you. While the genetically engineered T-cells spread and kill the tumor cells, the patient undergoes a series of intense side effects that are so sudden and severe that a team of top specialists must remain on standby to keep the patient alive.

And sometimes, not even that is enough. So far, several patients have died from neurotoxicity complications during the clinical trials, and experts still haven’t pinned down the exact cause of their deaths. Because CAR-T therapy is so risky and complex, experts warn that it will take years before a treatment like this is safe for widespread use. It is certainly not the miracle cure that the media is making it out to be. It’s not even effective against all cancers; CAR-T therapy has mainly been used to treat blood cancers such as leukemia and struggles against solid tumors like sarcomas.

Does this mean that CAR-T therapy is a lost cause? Absolutely not. Medical experts are justified in calling this immunotherapy treatment a legitimate, revolutionary breakthrough in a field that has largely stagnated over the past several decades. This is a major accomplishment, and the cancer survival stories prove that fact. But the issue is that for the past 40 years, the media has consistently trumpeted the end of cancer with every trivial development. By bombarding the public with exaggerated tales of success, the media has essentially deluded the country into believing that we are winning the war against cancer and that all cancer patients have a good chance of not only surviving but also returning to their normal lives. Such rose-colored views are far from the truth, and they have broken families apart.

As Dr. Otis Brawley, the chief medical officer at the American Cancer Society, explained, “We have a lot of patients who spend their families into bankruptcy getting a hyped therapy that [many] know is worthless…[Some choose a medicine that] has a lot of hype around it and unfortunately lose their chance for a cure.”

It’s already heartbreaking for patients and their loved ones to learn that they have cancer. It feels infinitely worse to undergo several “life-saving” treatments performed by doctors at the country’s best hospitals only to learn that none of it actually works. Consider the tragic story of Michael Uvanni and his brother James, a patient with skin cancer. After hearing about all the miracle treatments that were supposedly available and seeing happy commercials of cancer patients hugging their grandchildren, they felt confident that the odds were in James’ favor. That optimism led to crushing disappointment when his health continued to suffer, even after trying immunotherapy and several other experimental treatments. Three years after his diagnosis, James passed away from metastatic melanoma.

“I thought they were going to save him…You get your hopes up, and then you are dropped off the edge of a cliff. That’s the worst thing in the world,” confessed Michael Uvanni.

This sort of duplicitous optimism unfortunately permeates the entire field of oncology. Newspapers hype research results to attract readers, drug companies make outrageous promises to boost sales and hospitals draw in paying customers by appealing to their hopes and overstating their accomplishments. Many scientists have also fallen victim to this mindset, often exaggerating the successes of their own research to attract investors. Back in 2003, Dr. Andrew von Eschenbach, the director of the National Cancer Institute, announced the possibility of “eliminating suffering and death due to cancer by 2015.” Even President Obama contributed to the illusion when he announced the Cancer Moonshot project in 2016 by saying, “Let’s make America the country that cures cancer once and for all.”

Given all these overly positive messages, it’s no wonder that so many cancer patients believe that their lives are guaranteed to be saved, only to feel crushed when they learn the awful truth. Let’s be clear: There is no miracle cure for cancer. According to the American Cancer Society, the percentage of people who are alive five years after being diagnosed with stomach cancer is 29 percent. For lung and bronchus cancer patients, the number is 18 percent. For pancreatic cancer patients, it’s 7 percent. Patients with metastatic melanoma typically die within a year of diagnosis. Despite what you may hear, immunotherapy can cause fatal immune system attacks on the lungs, kidneys, and heart. There are no approved immunotherapies for breast cancer, colon cancer or prostate cancer. Not only that, studies have found that immunotherapy only benefits about 10 percent of all cancer patients.

As grim as all this may be, we must remember that not all hope is lost. That said, the last thing cancer patients need right now is to be blindsided by all the fanfare that seems to accompany every piece of cancer news.

Originally published on October 26, 2017, in The Miscellany News: Cancer Research Advancements Overstated

The New Americana: More Teens Are Suffering From Anxiety, Depression Than Ever Before

Picture Credit: Reuters | International Business Times

For a long time, teenagers have been characterized—generally by those older than them—as overly moody, self-centered and irrational. It’s not uncommon for adults to complain about how millennials are emotionally unstable, or to brush aside their problems as typical “teenage angst.” But in reality, these millennials have been rather upstanding.

Illegal drug use among teens has been declining for several years, and far fewer adolescents are smoking cigarettes and drinking alcohol than almost ever before. Not only that, the National Center for Health Statistics has reported a record low in the teen birth rate in the U.S., while high school graduation rates reached an all-time high of 83.2 percent in 2015.

Yet despite all the good news, researchers have noticed a disturbing trend: American adolescents are developing serious mental health problems at an alarming rate. According to the Department of Health and Human Services, about three million teenagers ages 12 to 17 had at least one major depressive episode in 2015 alone, and more than two million teens reported experiencing depression that impairs their daily activities. What’s even more startling is that this number is predicted to increase. According to a study that tracked depression among young adults across the country, the number of teenagers who reported having symptoms of low self-esteem and problems with sleep and concentration rose by 37 percent between 2015 and 2016.

And it’s not just depression. Researchers have found that cases of anxiety have spiked in recent times. According to the Anxiety and Depression Association of America (ADAA), anxiety disorders have become the most common mental illness in the United States, affecting 18.1 percent of Americans every year. In fact, the National Institute of Mental Health reported that about 6.3 million teens in the U.S. have an anxiety disorder of some kind. Unfortunately, this widespread phenomenon is not just affecting middle- and high-school students. Anxiety has overtaken depression as the most common reason college students seek counseling services. According to the American College Health Association, the proportion of undergraduates reporting “overwhelming anxiety” increased significantly from 50 percent in 2011 to 62 percent in 2016.

It’s not normal “teen angst” anymore; it’s a full-scale epidemic that is bound to get worse over time if ignored. But what can be the cause of such a shocking national trend? Unfortunately, not even the researchers know for sure. Usually, there are several conspicuous reasons for adolescents to feel depressed or anxious. Being raised in abusive households, living in poverty or being surrounded by violence are all understandable causes of emotional instability. Yet, teenagers who live in well-off communities and who seemingly should have nothing to worry about tend to suffer the most. What could possibly be causing these adolescents such grief?

Rather than one definite answer, it is most likely the result of several interwoven factors. For instance, anxiety and depression are shown to have a biological component. Scientists have already located several genes that may influence the risk of developing an anxiety disorder, such as variants of the GLRB gene, which has been linked to responses in the brain that cause us to become startled or overly fearful. However, there are other relevant biological factors besides genetics. Just recently, scientists have discovered that our gut bacteria may influence the functioning of brain regions such as the amygdala and the prefrontal cortex, both of which are heavily linked to anxiety and depression. These studies found that mice with an imbalance in their gut microbiome were more likely to display anxious and depressive behaviors.

However, many experts agree that environment likely plays a larger role in the rise of mental health issues in adolescents than genetics or gut bacteria. More specifically, researchers suspect that this epidemic of intense anxiety and depression in teens may be caused by the overwhelming pressure placed on them not only to succeed but to perform better than everyone else. As a result of this pressure, both high school and college students have reported that their biggest stressor is the fact that no matter what they do, it’s never enough.

“Teenagers used to tell me, ‘I just need to get my parents off my back.’ [But now,] so many students have internalized the anxiety. The kids at this point are driving themselves crazy,” stated Madeline Levine, a practicing psychologist and a founder of a non-profit that works on school reform. This news probably comes as a surprise to no one. In 2013, the American Psychological Association reported that American teenagers have become the most stressed age-group in the United States. Various culprits are likely at fault, including sleep deprivation, the uncertainty surrounding job security and the fear of not living up to people’s expectations.

Researchers have also assigned blame to the prevalence of social media and technology. With everyone connected to the internet, it’s difficult for teens to avoid constantly comparing themselves with their peers and worrying about their digital image. Unsurprisingly, many anxious teenagers agree that social media has had a negative influence on their mental health. According to accounts by teenagers attending Mountain Valley, a residential treatment facility for adolescents suffering from severe anxiety disorder, social media played a large role in lowering self-esteem and provoking feelings of anxiety. Not only that, the students also talked about how their smartphones provided a false sense of control, which they could use to avoid talking to people and escape the stresses of school.

As a result, several experts suspect that there may be a connection between the extreme spike in anxiety and depression in recent years and the widespread adoption of the iPhone. As Jean Twenge, a professor of psychology at San Diego State University, puts it, these dramatic trends in teen mental health issues started “exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.”

In the end, researchers have yet to find a conclusive answer to this troubling phenomenon. However, they agree that there is a disturbing lack of resources available to help young adults who are currently struggling with these problems. Studies show that despite the rise in mental health issues, there hasn’t been a corresponding increase in mental health treatment for teenagers and young adults. Not only that, it’s highly likely that the number of adolescents who are actually struggling with anxiety and depression is greater than the reported figure, since many people choose not to seek help. In fact, the Child Mind Institute reported in 2015 that only 20 percent of young people with a diagnosable anxiety disorder get treatment.

Thus, it is important to understand the true gravity of the situation and reach out to those who need help. During these times of uncertainty and hardship, it’s crucial for us to take the time to understand these individuals and aid them as best as we can rather than brush their problems aside as mere trivialities.

Originally published on October 19, 2017, in The Miscellany News: Millennial Mental Health Issues Spike, Prompt Response

It’s Time to Replace the Lab Mice

Picture Credit: Alamy | The Guardian

Let’s do a little experiment. Read the following headlines from these recently published scientific articles and try to find the one thing that all of them have in common: “The Pancreas Provides a Potential Drug Candidate for Brain Disease,” “Chimera Viruses Can Help the Fight Against Lymphomas,” “What Was Once Considered Cell Waste Could Now Treat Pancreatic Cancer,” “Cellular Tango: Immune and Nerve Cells Work Together to Fight Gut Infections,” “Scientists Reveal Fire Ant Venom Could be Used as a Skin Treatment.” The answer? All of the listed studies are based on the results of experiments conducted on mice. And that is a huge problem.

Using lab mice to understand how the human body works is nothing new. This practice officially started in 1902, when French biologist Lucien Cuénot used mice to research the nature of genes. Inspired by the work of Gregor Mendel, the father of modern genetics, Cuénot wanted to see if Mendel’s laws of inheritance applied to more than just pea plants. Until then, Mendelian genetics had only been demonstrated in plants, so Cuénot’s discovery that animals follow the same laws of inheritance sent shockwaves across the scientific community.

Not long after, more scientists began to use mice to explore the field of genetics, establishing mating programs that created inbred strains of mice and leading efforts to fully map the mouse genome. As the decades went by, lab mice skyrocketed in popularity and ended up contributing to numerous award-winning discoveries. Of the 106 times the Nobel Prize in Physiology or Medicine has been awarded so far, 42 involved research on mice or rats in some major way. These prizes include the discovery of penicillin, the yellow fever vaccine, the polio vaccine and HIV, the virus that causes AIDS.

It is easy to see how the lab mice became such an iconic symbol of biomedical research. Xavier Montagutelli, the head of the animal facilities at Institut Pasteur in Paris, explains, “[Mice] are small and inexpensive, they reproduce quickly… and they age quickly too, making them ideal for studying age-related complaints. We know how to freeze their embryos, sperm, and ova. We now know how to manipulate their genes…They are remarkable tools.”

Unfortunately, the acceptance of mice as the ideal test subject has led to the rigid assumption that they are some kind of prototypical “blank slate” mammal rather than a species with unique features and body mechanisms of its own. As a result, the field of biomedicine has built an entire infrastructure of knowledge around these rodents and has become dependent on their bodily responses to measure clinical success. But they simply don’t work as models of human disease, much less of human drug treatment.

For instance, scientists have used mice to search for tuberculosis treatments for decades. However, mice respond to the disease in a drastically different manner than humans do. For one thing, mice don’t cough and aren’t contagious when they have the disease. In addition, the human body mounts an immune response when the bacteria responsible for the disease are detected; mice lack this response and simply get the disease and die. So it’s no surprise that scientists have found an antibiotic called linezolid that works spectacularly well on human patients but not on mice.

The opposite can happen as well. In the late 1950s, German doctors prescribed thalidomide, under the brand name Contergan, to pregnant women to alleviate morning sickness. Since the drug had been tested successfully in mice, they assumed that the same would hold true in humans. Instead, Contergan led to countless birth defects, and only 40 percent of the affected children survived. This isn’t just a fluke, either. Dr. Jean-Marc Cavaillon, head of the cytokines and inflammation unit at Institut Pasteur, explained how researchers discovered a monoclonal antibody that treats inflammatory conditions in mice but would send human patients to intensive care. “Mice are great for basic research, for understanding overall patterns and grasping mechanisms. But once you start modeling a human disease to find the right treatment, you run up against major differences between us and mice,” he said.

As a result, drug treatments that were successfully tested in mice have a high chance of failure when tested on humans. According to a 2014 study on animal models, on average less than eight percent of experimental cancer treatments have successfully transitioned from animal testing to clinical cancer trials. Similarly, researchers trying to find a cure for ALS have submitted about a dozen experimental treatments for clinical testing over the past decade after finding success in mice. When tested on humans, all but one of them failed, and the exception showed only marginal benefits.

It also doesn’t help that these clinical trials are ridiculously expensive—we’re talking about hundreds of millions of dollars and years’ worth of time. In October 2014, the New England Journal of Medicine published a report about how the clinical trials of three tuberculosis treatments ended in complete failure, despite promising results in lab mice. According to the head researcher, the clinical trials alone cost more than $200 million.

But that raises the question: Can we find a suitable replacement for the lab mouse? Unfortunately, no one can say for sure. It’s not as if replacing mice with a different animal would solve everything, since animal testing as a whole is still rather dubious. So far, there are only two major alternatives, computer models and in vitro cell cultures, and neither offers much of a substitute since neither captures the complex interactions of living systems.

In addition, the push to stop the use of lab mice has been very controversial within the scientific community, especially among those who would rather turn a blind eye to the issue. Simply put, lab mice are incredibly cheap, convenient and easy to handle. Such an initiative would also place a large portion of biomedical research in jeopardy and cast a shadow of doubt over countless pre-existing studies on disease treatment. Scientists today continue to experiment on mice and spend millions of dollars on clinical trials, only to wonder why their products didn’t work. But what other choice do they have?

A survey of the National Library of Medicine’s database showed that experiments conducted on mice and rats make up almost half of the 20 million academic citations across the field of biomedicine. Despite all the problems they have caused, lab mice remain deeply entrenched in the field of medical research. Clearly, this isn’t a problem that can be solved in a single day.

But what’s even worse is that many news publications are making it seem as if these experimental treatments have worked on human patients and are bound to hit the shelves in the near future. Remember those headlines mentioned in the beginning of this article? All those articles were based on mouse studies and yet none of them mentioned the word “mice” in the headline. It’s sloppy journalism like this that helps fuel people’s doubt and confusion toward the sciences. In the end, one must always remain diligent when reading about the latest discoveries and findings. Science is already a difficult field to grasp, and diving into the literature blindly won’t make things any easier in the long run.

Originally published on September 21, 2017, in The Miscellany News: Experiments on Mice Should Not Be Generalized to Humans

Is Drinking Coffee Good or Bad For Your Health?

Picture Credit: USA Today

There is no doubt that America loves its coffee. According to a 2017 study by the National Coffee Association, 62 percent of Americans drink this caffeinated beverage on a daily basis, consuming close to 400 million cups per day. That’s more than 140 billion cups of coffee per year.

On top of that, Americans have no intention of straying from this path. Studies have found that 31 percent of coffee drinkers consider brewing coffee to be the most important activity in the morning and 52 percent of drinkers stated they would rather skip the morning shower than their cup of joe. It’s safe to say neither Starbucks nor your local coffee shop will fall out of fashion anytime soon.

But while coffee’s immense popularity is unquestionable, can we say the same about its health benefits? This has been a contentious issue for a long time, as countless studies over the past several years have branded the beverage either as a cure-all that increases lifespan or as a deadly toxin that shortens it. Case in point: In 1981, Harvard published a study that connected coffee with a high risk of pancreatic cancer, which sent the entire nation into a frenzy. Later, those same Harvard researchers concluded that smoking may have been the real culprit. As with dark chocolate and red wine, it’s incredibly difficult to pin down any definitive answer regarding coffee’s effects on the body, because observational food studies can establish correlations but not cause and effect. Still, we should be able to gather a general idea of its effects and whether the benefits outweigh the risks.

So what does science really say about the health effects of coffee? For the most part, it’s good news—or at the very least, coffee won’t kill you. Numerous studies suggest that drinking coffee regularly offers a wide range of health benefits, such as lowering the risk of stroke and dementia.

In fact, there doesn’t seem to be an end to the good news. A 2012 study indicates that the caffeine in coffee could decrease the risk of type 2 diabetes. The study followed almost 80,000 women and more than 40,000 men and controlled for all major lifestyle and dietary risk factors. After more than 20 years, the researchers found that coffee consumption was associated with an eight percent decrease in risk among women and a four percent decrease among men.

The same could even be said for heart disease. In a 2015 meta-analysis of studies investigating long-term coffee consumption, Harvard researchers found that people who drank about three to five cups of coffee a day had the lowest risk of heart disease among more than 1,270,000 participants. Not only that, but those who consumed five or more cups a day did not suffer any higher risk than those who didn’t drink coffee at all. This information lines up with what a team of cardiologists at the University of California, San Francisco, stated all the way back in 1994: “Contrary to common belief, [there is] little evidence that coffee and/or caffeine in typical dosages increases the risk of [heart attack], sudden death or arrhythmia.”

On the other hand, studies investigating the supposed ill effects of drinking coffee have surprisingly come up short. To begin with, most of the negative connotations that surround coffee are mere myths. For instance, the old wives’ tale about how kids shouldn’t drink coffee because it stunts their growth is just not true. Years of studies have shown that there is no scientifically valid evidence that suggests that coffee affects a person’s height.

Likewise, the idea that drinking coffee leads to lower bone density and a greater risk of osteoporosis is also dubious. Scientists believe that this fear likely stemmed from early studies that linked caffeine with reduced bone mass. However, those early studies were mostly conducted on elderly people whose diets already lacked milk and other sources of calcium. To top it all off, even fears that coffee increases the risk of hypertension turned out to be unfounded, thanks to a 2002 study by Johns Hopkins.

So what exactly is it in coffee that provides all these benefits? Most studies point to coffee’s high antioxidant content, which protects against the free radicals that damage cells and factor into cancer development. In fact, according to the American Chemical Society, coffee is the leading source of antioxidants in American diets simply because of how often we drink it.

Does this mean that coffee is a miracle drink after all? It’s difficult not to come to that conclusion, especially since two new studies published this year concluded that those who drink coffee regularly tend to live longer than those who do not. However, it’s best not to get carried away since, as stated earlier, food studies are notoriously inconsistent. These are all correlations, not causations. The caffeine in coffee is still a drug that has widespread effects that we’re not even close to uncovering. Coffee is still linked to insomnia, heartburn, addiction and digestion problems, as well as weight gain if consumed in excess (even without cream and sweeteners).

Both the U.S. Food and Drug Administration and the International Food Information Council recommend that you don’t exceed 400 milligrams of caffeine a day, which is roughly equivalent to four regular cups of coffee or one Starbucks Venti. As always with food or drink, moderation is key.

Originally published on September 7, 2017, in The Miscellany News: Coffee in moderation beneficial to health

How Religion Physically Changes Your Brain

Picture Credit: JupiterImages | The Huffington Post

It seems that more and more young Americans don’t feel quite as deeply connected to deities as their parents or grandparents did. According to the Pew Research Center, the share of Americans under 30 who “never doubt the existence of God” dropped from 83 percent in 2007 to 67 percent in 2012. In addition, only 18 percent of Millennials reported that they attend religious services at least once a week, compared with 26 percent of Boomers in the late 1970s.

With more people turning away from God and the church, questions about the scientific implications of this generational trend can’t help but arise: How will it affect the minds and brains of young Americans, who will become the future of this country? To find an answer, we can turn toward a relatively obscure discipline in science: neurotheology.

Neurotheology is the study of spirituality in the context of neuroscience, striving to explain the religious experience in neuroscientific terms.

“[We] evaluate what’s happening in people’s brains when they are in a deep spiritual practice like meditation or prayer. This has really given us a remarkable window into what it means for people to be religious or spiritual or to do these kinds of practices,” said Dr. Andrew Newberg, an established neuroscientist and Director of Research at the Myrna Brind Center at the Thomas Jefferson University Hospital.

So, what do studies of the brain tell us about the impact of religion? In 2014, when Dr. Newberg compared the brain scans of Franciscan nuns, Buddhist monks and staunch atheists in prayer, he found something interesting. The scans indicated that praying and meditating caused increased activity in the limbic system, the part of the brain that regulates emotion, and decreased activity in the parietal lobe, the brain region responsible for orienting oneself in space and time.

“It seems that the brain is built in such a way that allows us as human beings to have transcendent experiences extremely easily, furthering our belief in a greater power,” says Newberg. According to him, this discovery explains why spirituality is one of the defining characteristics of our species.

Surprisingly, the connection between the parietal lobe and spirituality runs deep. All the way back in the 1990s, Canadian cognitive neuroscientist Michael Persinger tried to artificially replicate the mental effects of religion with his invention, the “God helmet,” a helmet that directed complex magnetic fields to parts of the brain including the parietal lobe. While crowds of Evangelical Christians protested outside his lab, Persinger invited participants to test the helmet. To his delight, more than 80 percent of the participants reported sensing a presence in the room that they took to be their deity. As a result, they became deeply emotional and, once the experiment concluded, were filled with a sense of loss.

Persinger theorized that the electromagnetic disruption created by the helmet caused one hemisphere of the participant’s brain to perceive the other as an entirely separate presence. Funnily enough, Persinger’s experiment lends support to the claims of Princeton psychologist Julian Jaynes, whose 1976 book proposed that the left and right hemispheres are like two separate beings and that signals from the right brain were interpreted by the left brain as the voice of God. Ultimately, this would mean that supernatural occurrences such as divine visions and out-of-body experiences are merely the result of environmental disturbances.

However, there are still skeptics. Graham Ward, the Regius Professor of Divinity at Oxford University, states that these claims are shaky at best and that the temporal lobes “light up for any kind of excitement, not just religious experience.”

A more recent study found that humans naturally suppress the analytical parts of their brain and lean more heavily on the parts linked to empathy when they believe in God, and that the opposite occurs when they think about the physical world instead. Anthony Jack, a professor of psychology at Case Western Reserve University who led the study, claims that humans use two different networks of neurons, one that enables critical thinking and one that promotes empathy. He explains that this discovery not only broadens our understanding of spirituality across the history of cultures but also suggests that a healthy brain can choose which network to rely on and which to suppress when confronted with a logical problem or an ethical dilemma.

This idea that religion may arise from pathways in the brain rather than from discrete brain regions has been gaining traction recently. A different study, led by researchers at Auburn University, showed that subjects who perceived supernatural agents in their daily lives were more likely to use brain pathways associated with fear when asked to think about their religious beliefs. The researchers also found that devout believers tend to use neural pathways connected to language, while atheists tend to use pathways associated with visual imagery.

Most interestingly, while religion has been shown to heavily influence the brain, the brain can also change how a person views religion. According to Boston University professor of neurology Patrick McNamara, changes in brain chemistry caused by Parkinson’s disease have been shown to erode a patient’s faith and devotion to God. These patients, McNamara discovered, lacked the neurotransmitter dopamine, which made him suspect that religiosity is connected to dopamine activity in the prefrontal lobes. This theory fits surprisingly well with a completely different study, in which researchers used functional MRI scans and found that religious and spiritual experiences activate the same reward systems in the brain that become active when listening to music or taking drugs.

But even if spirituality is just a matter of brain chemistry, several theories point to religion as an evolutionary adaptation. A number of reports have found that churchgoers live about seven years longer than atheists and tend to have greater success recovering from diseases like breast cancer and rheumatoid arthritis. They are also more likely to have lower blood pressure and less likely to suffer from depression. So while cultural trends may shift away from God, it won’t be all that surprising if religion continues to persist for years to come.

Originally published on April 19, 2017, in The Miscellany News: Neuroscience of Religion Reveals Hidden Cultural Trends

Do Your Talents Depend on Your Genes?

Picture Credit: tadtoonew.com

What if you were able to discover what your talents were the moment you were born? Would it have helped you at all in school if you knew that you were naturally gifted in sports or solving math problems or playing an instrument? According to certain health institutions in China, you no longer have to spend time wondering, thanks to the power of gene sequencing.

According to a recent article in The Telegraph, China is seeing an incredible surge of so-called “talent detection” facilities that claim to be able to sequence a person’s DNA and uncover that person’s natural talents for a fee of about $500. Despite the dubious nature of these businesses, this type of direct-to-consumer genetic testing has become so popular among competitive Chinese parents that thousands of children are dragged by their mothers to these institutes to have their genomes sequenced in the hope of gaining an extra advantage in an already cut-throat academic environment. As a result, a “talent detecting” industry is taking shape, with companies promising to predict children’s future potential as well as their general intelligence, emotional understanding and even personality.

Wang Junyi, the president of the highly successful 1Gene health institute in Hangzhou, Zhejiang explains why these facilities are all the rage in China: “Many of my friends are anxious about deciding what their children should learn, as they fear making stupid decisions could result in lost opportunities. They will be wasting money and destroying their children’s confidence if they push them into something they are not good at, and this is where genetic testing can help.”

Of course, no matter how convincing they may sound, none of these claims are backed by actual scientific evidence. Genealogy expert Chang Zisong at the Tianjin International Joint Academy of Biomedicine states that all these predictions are ultimately meaningless and that the main reason why these institutions aren’t illegal is because banning them “would suggest that they have scientific value.”

But this opens up the question: How much impact does our DNA have on our talents? After all, the human genome is supposedly our body’s “blueprint.” While using gene sequencing to predict whether someone will become the next Einstein or Mozart may be a farce today, could genetically detecting talent ever become standard practice in the future?

Let’s first examine athletic ability. One of the more controversial arguments regarding this subject is the athletic prowess of Jamaican sprinters. For some reason, the world’s best sprinters seem to come from this island nation in the Caribbean. Both Usain Bolt and Elaine Thompson, two Olympic champions who hold the title of fastest man and woman in the world respectively, are Jamaican. In addition, Jamaican athletes make up 19 of the 26 fastest times ever recorded in 100-meter races.

These numbers seem a bit too striking to be mere coincidence, seeing how Jamaica has a population of only 2.8 million people. Many people have come up with different theories, from the yam-rich diet in parts of the country to the island’s aluminum-rich soil. However, scientists who examined the DNA of Jamaican sprinters have suggested the existence of a “speed gene,” identifying a variant of the ACE gene as the culprit.

According to their explanation, this particular gene variant increases the chance of developing a larger-than-average heart that can pump highly oxygenated blood to the muscles more quickly than usual. The data show that Jamaicans have a higher frequency of this variant than Europeans or even inhabitants of West Africa.

Funnily enough, 75 percent of Jamaicans, both athletes and non-athletes, also possess a desirable variant of the ACTN3 gene, which helps develop muscle strength. In contrast, only 70 percent of U.S. international-standard athletes carry this variant.

So is your potential athletic ability primarily determined by these two genes? It’s difficult to tell.

For one thing, the genetics of sports is incredibly complicated, and it’s more likely that an entire pathway of genes is involved rather than a single anomaly. In addition, Yannis Pitsiladis, a biologist at the University of Glasgow, studied the genetics of Jamaican sprinters and could not genetically distinguish a subgroup that runs faster than everyone else. Instead, Pitsiladis argues that Jamaica produces so many fast sprinters because the entire country promotes the sport of running, much as the United States obsesses over football.

If the data on athleticism is inconclusive, then let’s look at a different but equally desired talent: the ability to solve math problems easily. Unfortunately, there is even less conclusive data on the genetics of academic success. According to a large twin study by researchers at King’s College London, it may be possible that the genes for math and language skills are inherited from your parents. However, the scientists were unable to determine the exact genes that may be responsible for these skills.

But then what about musical abilities, like becoming a prodigy in playing the violin or piano? As expected, the situation remains murky. While no direct connections between genes and musical ability have been established, some scientists believe that musical accomplishment may actually stem from the desire to practice, which does have genetic ties.

According to research led by psychologist David Hambrick from Michigan State University, a person’s genetics may influence their musical aptitude, musical enjoyment and motivation.

Similarly, a study of over 10,000 identical Swedish twins led by neuroscientist Miriam Mosing of Stockholm’s Karolinska Institute found that a person’s propensity to practice music may be up to 70 percent heritable. However, neither study can really be deemed conclusive, and connections to any specific gene variant have yet to be found.

Based on all this research, it seems that we still have a long way to go before we can rely on gene sequencing technology to predict people’s futures. Even our knowledge of the link between genetics and talent appears shaky at best. Yet despite this, direct-to-consumer gene sequencing has become all the rage recently, and not only among uber-competitive parents in China. In the United States, countless genetic testing companies have found success by offering to read a customer’s DNA and reveal that person’s natural “disposition.” But instead of analyzing DNA to unveil a person’s natural talent, these companies promise to uncover the customer’s ideal diet and exercise regimen, offering “reliable” information about their genetic fitness.

Even crazier is that these “lifestyle genetic tests” are offering to uncover more and more ridiculous information “buried” within our DNA. One company even wants to use gene sequencing to determine what comic superhero a customer would be, based on their genes. As the originator of the idea, Stephane Budel, explains: “It gives you your breakdown, like you’re 30 percent Superman, 20 percent Ironman and 50 percent the Hulk.”

Clearly, the human genome is being treated less like a blueprint and more like a Facebook personality quiz. Nonetheless, it would be advisable for everyone to slow down, take a deep breath and follow what their own brain tells them instead of relying on a genome report.

Originally published on March 1, 2017, in The Miscellany News: Talents May Be Dependent on Individual Genetic Makeup