
Brief Overview

Welcome to SparktheScience, a website that collects and assembles all the science articles I have written. As a science enthusiast, my goal is to encourage and generate excitement for science and technology in other people, and I believe the best way to do so is through clear, clean, and fun writing. This website will cover all topics from quantum computing to biomimicry, a personal favorite of mine. I hope you enjoy everything this site has to offer!

“Science is simply the word we use to describe a method of organizing our curiosity.” ~Tim Minchin


The War on Cancer: We Still Think We’re Winning Even Though We’re Not

Picture Credit: Fabian Bimmer | Reuters | Newsweek

I must admit, it can be exciting to read about the latest developments in cancer research in the news. There is so much happening in the field of oncology that it’s tempting to imagine a future without cancer just around the corner.

For instance, scientists at Northwestern University have reported that they have found what they call an ancient kill switch in cancer cells. According to the researchers, they may be able to use this mechanism to force cancer cells to kill themselves in multiple ways simultaneously.

Not only that, a revolutionary new form of cancer treatment known as CAR T-cell therapy has swept up the scientific community in an excited fervor. By manipulating the T-cells of the cancer patient’s own immune system with gene therapy and then reinjecting them back into the patient, researchers have successfully destroyed tumors in people who had lost all hope.

According to various news reports, this treatment was so promising that the U.S. Food and Drug Administration (FDA) has recently given it the green light for production and distribution, making it the first use of medicinal gene editing to be available for patients in the United States.

“We’re entering a new frontier in medical innovation with the ability to reprogram a patient’s own cells to attack a deadly cancer,” FDA Commissioner Dr. Scott Gottlieb stated after approving the treatment.

As with anything that’s showered with positive attention by the media, however, it’s not as simple as it appears. All the hype surrounding cancer research is actually blinding us to the reality that we are not winning the war against cancer. In fact, despite what headlines may claim, we are nowhere close to actually finding the cure for cancer.

While such a claim may sound needlessly pessimistic, it is vital to view the current trajectory of cancer research within the context of its larger history. Cancer has plagued humanity for an extraordinarily long time, with fossil evidence and ancient manuscripts documenting the disease as far back as 1600 B.C. Needless to say, renowned scientists and medical experts across history have made countless attempts to understand and combat it. In recent memory, the most notable collective endeavor is America’s War on Cancer, launched by President Nixon in 1971. From that moment on, the United States has devoted increasingly intense effort to finding a cure.

Over the past 40 years, the U.S. has poured a total of more than $500 billion into winning this war. Even now, that war continues to escalate. In 2017, the National Cancer Institute (NCI) received $5.389 billion for the fiscal year, which is $174.6 million more than what the organization received in 2016. In addition, we have around 260 different nonprofit organizations in the United States that raise money for cancer research and treatment. Together, those nonprofit organizations have budgets that top $2.2 billion.

This should be good news, though, right? All of that money is going towards a worthy cause, after all. Indeed, that much is undeniable. However, the problem is that all that money is translating to very little substantive progress in terms of developing a permanent solution. So far, we have made great strides in understanding the nature of cancer cells and how they behave in general. Unfortunately, utilizing that knowledge to create a reliable treatment has so far proven to be much more difficult than anyone had realized.

Despite receiving billions of dollars in funding and conducting countless expensive and laborious drug trials, scientists have yet to develop anything that can meaningfully increase a patient’s chances of survival, much less actually cure the disease. In fact, a study published earlier this year reported that two-thirds of the cancer drugs approved in the past two years showed no evidence of extending survival at all (USA Today, “Dozens of New Cancer Drugs Do Little to Improve Survival,” 02.09.2017).

When President Nixon announced the War on Cancer, he vowed that cancer would be cured by 1976. Today, cancer remains as deadly as ever. According to the World Health Organization, one in six deaths worldwide in 2015 was caused by cancer, for a total of 8.8 million deaths. Cancer thus remains the second leading cause of death globally, just behind heart disease. The death toll from heart disease, however, has decreased significantly over the past several decades: between 1950 and 2005, the death rate from heart disease dropped by 64 percent. In contrast, the death rate from cancer fell by a mere five percent over the same period. That’s how little progress we have made, even with billions of dollars in funding supporting decades of focused research.

Of course, the last thing I want to do is discourage further cancer research. Despite the rather bleak odds, there are still benefits to continuing this line of inquiry and searching for other treatment options. The point I’m trying to articulate is that the news you hear about cancer research tends to be so overly positive that it often fails to accurately depict the reality of the situation. No matter where you look, every new insight is a “major breakthrough,” and every new test product is “a miracle in the making.” By exaggerating successes, the media has effectively deceived the general public into believing that the cure for cancer is just around the corner.

Case in point: CAR-T therapy. Remember how I mentioned earlier that this method of cancer treatment showed promising results? When news sources learned that the FDA had approved its use in the United States, they went ballistic with excitement. They published articles about the miracle of CAR-T therapy, with headlines such as “Latest Car-T Therapy for Cancer Signals New Era for Life-Saving Treatments”, “New Gene Therapy for Cancer Offers Hope to Those With No Options Left”, and “Cancer’s Newest Miracle Cure”. In typical fashion, all these articles feature heartwarming stories of cancer patients miraculously being saved by a revolutionary new treatment that will surely stop cancer in its tracks.

What these articles fail to mention is that CAR-T therapy can be incredibly dangerous because it needs to bring your body to the brink of death in order to save you. While the genetically engineered T-cells spread and kill the tumor cells, the patient undergoes a series of intense side effects that are so sudden and severe that a team of top specialists must remain on standby to keep the patient alive.

And sometimes, not even that is enough. So far, several patients have died from neurotoxicity complications during the clinical trials, and experts still haven’t pinned down the exact cause of their deaths. Because CAR-T therapy is so risky and complex, experts warn that it will take years before a treatment like this is safe for patients to use. It is certainly not the miracle cure that the media is making it out to be. It’s not even effective against all cancers; CAR-T therapy has mainly been used to treat blood cancers such as leukemia and still struggles against solid tumors like sarcomas.

Does this mean that CAR-T therapy is a lost cause? Absolutely not. Medical experts are justified in claiming that this immunotherapy treatment is a legitimate, revolutionary breakthrough in a field that has largely stagnated over the past several decades. This is a major accomplishment, and the cancer survival stories prove that fact. But the issue is that for the past 40 years, the media has consistently trumpeted the end of cancer with every trivial development. By bombarding the public with exaggerated tales of success, the media has essentially deluded the country into believing that we are winning the war against cancer and that all cancer patients have a good chance of not only surviving but also returning to their normal lives. Such rose-colored views are far from the truth, and they have broken families apart.

As Dr. Otis Brawley, the chief medical officer at the American Cancer Society, explained, “We have a lot of patients who spend their families into bankruptcy getting a hyped therapy that [many] know is worthless…[Some choose a medicine that] has a lot of hype around it and unfortunately lose their chance for a cure.”

It’s already heartbreaking for patients and their loved ones to learn that they have cancer. It feels infinitely worse to undergo several “life-saving” treatments performed by doctors at the country’s best hospitals only to learn that none of it actually works. Consider the tragic story of Michael Uvanni and his brother James, a patient with skin cancer. After hearing about all the miracle treatments that were supposedly available and seeing happy commercials of cancer patients hugging their grandchildren, they felt confident that the odds were in James’ favor. That optimism led to crushing disappointment when his health continued to suffer, even after trying immunotherapy and several other experimental treatments. Three years after his diagnosis, James passed away from metastatic melanoma.

“I thought they were going to save him…You get your hopes up, and then you are dropped off the edge of a cliff. That’s the worst thing in the world,” confessed Michael Uvanni.

This sort of duplicitous optimism, unfortunately, permeates the entire field of oncology. Newspapers hype research results to attract readers, drug companies make outrageous promises to boost sales, and hospitals draw in paying customers by appealing to their hopes and overstating their accomplishments. Many scientists have also fallen victim to this mindset, often exaggerating the successes of their own research to attract investors. Back in 2003, Dr. Andrew von Eschenbach, then the director of the National Cancer Institute, announced the possibility of “eliminating suffering and death due to cancer by 2015.” Even President Obama contributed to the illusion when he announced the Cancer Moonshot project in 2016 by saying, “Let’s make America the country that cures cancer once and for all.”

Given all these overly positive messages, it’s no wonder that so many cancer patients believe that their lives are guaranteed to be saved, only to feel crushed when they learn the awful truth. Let’s be clear: There is no miracle cure for cancer. According to the American Cancer Society, the percentage of people who are alive five years after being diagnosed with stomach cancer is 29 percent. For lung and bronchus cancer patients, the number is 18 percent. For pancreatic cancer patients, it’s 7 percent. Patients with metastatic melanoma typically die within a year of diagnosis. Despite what you may hear, immunotherapy can cause fatal immune system attacks on the lungs, kidneys, and heart. There are no approved immunotherapies for breast cancer, colon cancer or prostate cancer. Not only that, studies have found that immunotherapy only benefits about 10 percent of all cancer patients.

As grim as all this may be, we must remember that not all hope is lost. That said, the last thing cancer patients need right now is to be blindsided by all the fanfare that seems to accompany every piece of cancer news.

Originally published on October 26, 2017, in The Miscellany News: Cancer research advancements overstated

The New Americana: More Teens Are Suffering From Anxiety, Depression Than Ever Before

Picture Credit: Reuters | International Business Times

For a long time, teenagers have been characterized—generally by those older than them—as overly moody, self-centered and irrational. It’s not uncommon for adults to complain about how millennials are emotionally unstable, or to brush aside their problems as typical “teenage angst.” But in reality, these millennials have been rather upstanding.

Illegal drug use among teens has been declining for several years, and fewer adolescents smoke cigarettes or drink alcohol than at almost any point on record. Not only that, the National Center for Health Statistics has reported a record low in the U.S. teen birth rate, while high school graduation rates reached an all-time high of 83.2 percent in 2015.

Yet despite all the good news, researchers have noticed a disturbing trend: American adolescents are developing serious mental health problems at an alarming rate. According to the Department of Health and Human Services, about three million teenagers ages 12 to 17 had at least one major depressive episode in 2015 alone, and more than two million teens reported experiencing depression that impairs their daily activities. What’s even more startling is that this number is predicted to increase. According to a study that tracked depression among young adults across the country, the number of teenagers who reported symptoms of low self-esteem and problems with sleep and concentration rose by 37 percent between 2015 and 2016.

And it’s not just depression. Researchers have found that cases of anxiety have spiked in recent times. According to the Anxiety and Depression Association of America (ADAA), anxiety disorders have become the most common mental illness in the United States, affecting 18.1 percent of Americans every year. In fact, the National Institute of Mental Health reported that about 6.3 million teens in the U.S. have an anxiety disorder of some kind. Unfortunately, this widespread phenomenon is not just affecting middle- and high-school students. Anxiety has overtaken depression as the most common reason college students seek counseling services. According to the American College Health Association, the number of undergraduates reporting to have “overwhelming anxiety” increased significantly from 50 percent in 2011 to 62 percent in 2016.

It’s not normal “teen angst” anymore; it’s a full-scale epidemic that is bound to get worse if ignored. But what could be causing such a shocking national trend? Unfortunately, not even the researchers know for sure. Usually, there are several conspicuous reasons for adolescents to feel depressed or anxious: being raised in an abusive household, living in poverty or being surrounded by violence are all understandable causes of emotional instability. Yet teenagers who live in well-off communities, and who seemingly should have nothing to worry about, tend to suffer the most. What could possibly be causing these adolescents such grief?

Rather than one definite answer, it is most likely the result of several interwoven factors. For instance, anxiety and depression are shown to have a biological component. Scientists have already located several genes that may influence the risk of developing an anxiety disorder, such as variants of the GLRB gene, which has been linked to responses in the brain that cause us to become startled or overly fearful. However, there are other relevant biological factors besides genetics. Just recently, scientists have discovered that our gut bacteria may influence the functioning of brain regions such as the amygdala and the prefrontal cortex, both of which are heavily linked to anxiety and depression. These studies found that mice with an imbalance in their gut microbiome were more likely to display anxious and depressive behaviors.

However, many experts agree that environment likely plays a larger role in the rise of mental health issues in adolescents than genetics or gut bacteria. More specifically, researchers suspect that this epidemic of intense anxiety and depression in teens may be caused by the overwhelming pressure placed on them not only to succeed but to perform better than everyone else. As a result of this pressure, both high school and college students have reported that their biggest stressor is the fact that no matter what they do, it’s never enough.

“Teenagers used to tell me, ‘I just need to get my parents off my back.’ [But now,] so many students have internalized the anxiety. The kids at this point are driving themselves crazy,” stated Madeline Levine, a practicing psychologist and a founder of a non-profit that works on school reform. This news probably comes as a surprise to no one. In 2013, the American Psychological Association reported that American teenagers have become the most stressed age-group in the United States. Various culprits are likely at fault, including sleep deprivation, the uncertainty surrounding job security and the fear of not living up to people’s expectations.

Researchers have also assigned blame to the prevalence of social media and technology. With everyone connected to the internet, it’s difficult for teens to avoid constantly comparing themselves with their peers and worrying about their digital image. Unsurprisingly, many anxious teenagers agree that social media has had a negative influence on their mental health. According to accounts by teenagers attending Mountain Valley, a residential treatment facility for adolescents suffering from severe anxiety disorder, social media played a large role in lowering self-esteem and provoking feelings of anxiety. Not only that, the students also talked about how their smartphones provided a false sense of control, which they could use to avoid talking to people and escape the stresses of school.

As a result, several experts suspect that there may be a connection between the extreme spike in anxiety and depression in recent years and the widespread adoption of the iPhone. As Jean Twenge, a professor of psychology at San Diego State University, puts it, these dramatic trends in teen mental health issues started “exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.”

In the end, researchers have yet to find a conclusive answer to this troubling phenomenon. However, they agree that there is a disturbing lack of resources available to help young adults who are currently struggling with these problems. Studies show that despite the rise in mental health issues, there hasn’t been a corresponding increase in mental health treatment for teenagers and young adults. Not only that, the number of adolescents actually struggling with anxiety and depression is likely greater than the reported figure, since many people choose not to seek help. In fact, the Child Mind Institute reported in 2015 that only 20 percent of young people with a diagnosable anxiety disorder get treatment.

Thus, it is important to understand the true gravity of the situation and reach out to those who need help. During these times of uncertainty and hardship, it’s crucial for us to take the time to understand these individuals and aid them as best as we can rather than brush their problems aside as mere trivialities.

Originally published on October 19, 2017, in The Miscellany News: Millennial mental health issues spike, prompt response

Our Infatuation with Solar, Wind Could Make Climate Change Worse

Picture Credit: Alamy | The Telegraph

During these troubling times of environmental turmoil, in which dangerous levels of carbon dioxide emissions threaten to destabilize the global climate, it’s no surprise that a lot of people are pushing vehemently for greater investment in renewable energy. In fact, despite the childish clamoring of several anti-science government officials, the idea of renewable energy, especially solar and wind energy, is incredibly popular among the vast majority of Americans.

In 2016, the Pew Research Center reported that 89 percent of Americans favor building more solar panel farms and 83 percent favor constructing more wind turbine farms. In contrast, only about 41 percent of Americans wanted to expand the coal mining sector, and these numbers aren’t meaningless, either. According to the Renewables 2016 Global Status Report (GSR), renewable energy saw its largest annual increase in energy contribution ever in 2015, despite low prices for fossil fuels.

It’s pretty clear that a large majority of people hold solar and wind energy in high regard. I’d even go so far as to say that in this modern, socially conscious age, there isn’t a term more associated with pure good than renewable energy. However, this blind infatuation may end up jeopardizing our entire fight against climate change. But how in the world could renewable energy possibly be a bad thing?

To better illustrate my point, consider the incredible amount of attention and fanfare that the Idaho-based startup company Solar Roadways Inc. got for its idea to replace all the roads in America with structurally engineered solar panels that could generate backup electricity while withstanding vehicle traffic. Founded in 2006, this startup presented a vision of a world in which solar panel roadways not only use LED lights to light up the streets and change the road design but also power entire cities to create a cleaner, greener world.

When people heard about this revolutionary new idea, they fell madly in love with the concept of solar roadways. During the crowdfunding drive at Indiegogo, more than 50,000 backers supported the project and the startup raised more than $2 million, making it the most popular Indiegogo campaign ever. But it wasn’t just green-energy enthusiasts who contributed financially to this enterprise. Even the Department of Transportation stepped in and invested more than $1.6 million into the project.

Unfortunately, all of it turned out to be a bust. When 30 solar roadway panels were finally installed on a public walkway in 2016, 25 of them broke down within a week, and more malfunctions appeared once it rained. Even more disappointing, the highly anticipated solar roadway, even when fully operational, generated an average of just 0.62 kilowatt-hours of electricity per day—not even enough energy to power a hairdryer, much less an entire city.
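
For readers who want to check that hairdryer comparison, a quick back-of-envelope calculation makes the gap obvious. The short Python sketch below is my own illustration; the 1.5-kilowatt hairdryer figure is an assumed typical value, not one taken from the reports.

# Rough arithmetic behind the hairdryer comparison (assumed values noted).
daily_energy_kwh = 0.62      # reported average daily output of the walkway
hairdryer_power_kw = 1.5     # assumed draw of a typical consumer hairdryer

average_power_w = daily_energy_kwh / 24 * 1000
print(f"Average continuous output: {average_power_w:.0f} W")        # ~26 W

runtime_minutes = daily_energy_kwh / hairdryer_power_kw * 60
print(f"Hairdryer runtime per day: {runtime_minutes:.0f} minutes")  # ~25 minutes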

But solar roadways aren’t the only inventions that took advantage of people’s infatuation with renewable energy. In February, a startup company raised more than $350,000 on Indiegogo when it promoted the Fontus water bottle, a self-filling water bottle that uses solar energy to extract water from the air. According to the campaign video, Fontus is designed to draw air into the bottle and capture moisture through condensation as the air cools. Not only that, the device would be powered by a small, mousepad-sized solar panel, making the Fontus perfect for backpackers and bikers going on a trip. Again, problems appeared when scientists pointed out that a solar panel that small is never going to produce the amount of energy needed to make the whole thing work. In fact, it would require a huge, 250-watt, 16-square-foot solar panel working at 100 percent efficiency under ideal circumstances for the Fontus to even come close to fulfilling its promise.
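
The physics behind that criticism is easy to sketch. Condensing water vapor releases its latent heat, roughly 2.26 megajoules per kilogram, and a small thermoelectric cooler has to pump at least that much heat away. The Python estimate below is my own back-of-envelope illustration; the half-liter-per-hour target, the coefficient of performance of about one and the panel dimensions are assumptions, not figures from the campaign.

# Back-of-envelope check on a solar-powered condensation bottle.
LATENT_HEAT_J_PER_KG = 2.26e6    # heat released when water vapor condenses

water_rate_kg_per_h = 0.5        # assumed production target (0.5 L per hour)
# With a coefficient of performance of ~1 (generous for a Peltier cooler),
# the electrical power needed is roughly the heat that must be removed.
required_power_w = water_rate_kg_per_h * LATENT_HEAT_J_PER_KG / 3600
print(f"Power needed: {required_power_w:.0f} W")   # ~314 W

panel_area_m2 = 0.05             # roughly mousepad-sized
panel_efficiency = 0.20
solar_irradiance_w_m2 = 1000     # full midday sun
panel_output_w = panel_area_m2 * panel_efficiency * solar_irradiance_w_m2
print(f"Panel output: {panel_output_w:.0f} W")     # ~10 W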

It’s not just solar energy, either. In 2016, the startup VICI Labs made headlines when it promoted the Waterseer, a device that used the wind to “provide up to 11 gallons of safe drinking water” from the air every day. Raising more than $330,000 on Indiegogo, the inventors behind the Waterseer made it seem as if their invention could end all water shortages thanks to the clean power of wind energy, managing to persuade UC Berkeley and the National Peace Corps Association to help contribute to its development. Once again, the power of green energy was overestimated, and several thermodynamicists have pointed out that the Waterseer wouldn’t work in dry, arid areas—the places that need water the most.

The reason all these bogus crowdfunding campaigns made so much money despite being scientifically dubious is that so many people were willing to believe that renewable energy sources could accomplish anything, even the impossible. They had such a positive outlook on solar panels and wind turbines that they didn’t even stop to consider the limitations of those technologies. Of course, this overly optimistic mindset is a natural product of today’s society, in which increasingly alarming news of humanity’s pollutant-ridden path toward ruin makes it seem as if renewable energy is our only hope for survival. But no matter how beneficial it may be, renewable energy should not be placed on a pedestal. We can’t afford to treat it like some kind of magical energy source that provides unlimited free electricity without any restrictions or drawbacks.

For example, many people tend to think solar panels can provide unlimited energy because they draw their power from sunlight, which should be infinite, right? In reality, however, a typical solar panel converts only about 20 percent of the sunlight that strikes it into electricity. In addition, unless it is specifically designed to track the movement of the sun, a fixed panel can forfeit up to 60 percent of the available energy on top of that lackluster 20 percent conversion rate. Not only that, the hotter a solar panel gets, the less energy it produces. It may sound counterintuitive, but for every degree its temperature rises above 25 degrees Celsius, a typical panel’s maximum power output drops by about 0.5 percent.
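
To make those derating factors concrete, here is a rough Python sketch of a toy model (my own illustration, not a standard industry formula). It scales a panel’s nameplate rating by an assumed fixed-mount loss and by the roughly 0.5-percent-per-degree temperature penalty described above; the 300-watt rating, 60-degree cell temperature and 30 percent mounting loss are hypothetical example values.

# Toy model of solar panel derating (illustrative assumptions only).
def effective_output_w(rated_w, cell_temp_c, fixed_mount_loss=0.0):
    """Estimate panel output after mounting and temperature losses."""
    # Lose ~0.5% of output for every degree Celsius above 25 C.
    temp_derate = max(0.0, 1 - 0.005 * max(0.0, cell_temp_c - 25))
    return rated_w * (1 - fixed_mount_loss) * temp_derate

# Hypothetical example: a nominal 300 W panel on a fixed (non-tracking)
# mount assumed to forfeit 30% of the available energy, with cells at 60 C.
print(effective_output_w(300, cell_temp_c=60, fixed_mount_loss=0.30))  # ~173 W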

This isn’t to say that renewable energy is terrible or that we should give up on it. While not entirely efficient, solar and wind power still produce electricity without consuming any finite resources. Yet we can’t delude ourselves into thinking that solving climate change is as simple as building more solar farms and wind turbines.

In fact, doing so without proper planning might do more harm than good. One major consequence of our infatuation with green energy is the rapid decline of nuclear power, the main source of zero-carbon electricity in the United States. Thanks to the popularity of solar and wind farms, nuclear power plants all across the world are on the verge of shutting down for good, which could severely damage our efforts in fighting climate change.

First of all, despite the negative press that it gets, nuclear energy remains quite possibly the cleanest and most viable form of energy that we currently possess. No matter what sort of Greenpeace propaganda you may have heard, nuclear energy is the safest way of producing reliable energy, a statement backed by the World Health Organization, the Centers for Disease Control and Prevention and the National Academy of Sciences. In fact, a 2010 study by those three organizations found that nuclear power is 40 percent less deadly than the next safest form of energy, wind power. Nuclear energy is also tied for having the lowest carbon footprint, and unlike solar and wind energy, nuclear energy actually stands a chance against the natural gas and coal industries. According to the U.S. Energy Information Administration, solar and wind power made up a combined seven percent of U.S. electricity generation in 2016, while nuclear energy provided 20 percent.

But if the problem is that renewable energy isn’t contributing as much as nuclear energy, then can’t we solve this issue by building more solar and wind farms? No, it’s not that simple. One of the biggest problems with solar and wind energy is that they are entirely dependent on the current weather. When the sun doesn’t shine or the winds stop blowing, energy production plummets. Of course, this wouldn’t be an issue if one could store the excess energy generated on an especially sunny or windy day, but as of right now, a large-scale method of storing the electricity generated by solar and wind farms does not exist. As a result, whenever the weather is unfavorable, state governments must find an alternative energy source. What do they turn to now that many of the expensive nuclear plants are shut down? Answer: natural gas and fossil fuels.

This isn’t just a hypothetical scenario. In South Australia, a state where wind energy supplies more than a quarter of total electricity, the government had to switch a shuttered gas-fired plant back on when electricity prices spiked during a period of light wind. Meanwhile, despite investing heavily in green energy, the German government is reportedly paying billions to keep coal generators in reserve in case the weather suddenly becomes unfavorable. This could be why carbon emissions are still rising in Germany, even though Germans pay the highest electricity rates in Europe.

The loss of nuclear energy is serious. According to a Bloomberg New Energy Finance analysis, reactors that produce up to 56 percent of America’s nuclear power may shut down and eventually end up becoming replaced by the much cheaper gas-fired generators. If that were to happen, the report estimates, an additional 200 million tons of carbon dioxide would be spewed into the atmosphere annually.
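
That 200-million-ton figure passes a rough sanity check. The inputs in the Python sketch below are my own approximations, not values taken from the Bloomberg analysis: U.S. nuclear plants generate on the order of 800 terawatt-hours a year, and gas-fired generation emits roughly 0.4 kilograms of CO2 per kilowatt-hour.

# Rough sanity check on the added-emissions estimate (assumed inputs).
us_nuclear_twh = 800         # approximate annual U.S. nuclear generation
at_risk_share = 0.56         # share of nuclear output at risk, per the report
gas_kg_co2_per_kwh = 0.4     # typical combined-cycle gas plant emissions

replaced_kwh = us_nuclear_twh * at_risk_share * 1e9
added_co2_tons = replaced_kwh * gas_kg_co2_per_kwh / 1000
print(f"{added_co2_tons / 1e6:.0f} million tons of CO2 per year")  # ~180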

But even if nuclear plants weren’t shutting down, we still lack the infrastructure required to actually use the green energy we generate. We may spend heavily on building countless wind and solar farms, but much of that investment is wasted if we have no way to distribute the electricity, especially since most farms are hundreds of miles away from the nearest city. Even worse, some estimates posit that constructing all the high-voltage lines needed to transport the electricity could take several decades.

This is a huge problem for solar and wind farms right now. With no infrastructure in place to distribute the power and no way to store the energy generated, solar and wind farms across the United States, from Texas to California, are often switched off or left idle, leading to massive energy waste.

Again, despite everything mentioned above, renewable energy is not a bad thing. It is far better to take advantage of solar and wind energy as soon as possible than to wait and do nothing. But mindlessly building more and more solar and wind farms simply because solar and wind energy is “objectively good” will only drag us further away from our goal of a cleaner future. It is undeniable that renewable energy can save the Earth, but that doesn’t mean we should worship it blindly.

Originally published on October 4, 2017, in The Miscellany News: Renewable energy, while urgent, necessitates skepticism

It’s Time to Replace the Lab Mice

Picture Credit: Alamy | The Guardian

Let’s do a little experiment. Read the following headlines from these recently published scientific articles and try to find the one thing that all of them have in common: “The Pancreas Provides a Potential Drug Candidate for Brain Disease,” “Chimera Viruses Can Help the Fight Against Lymphomas,” “What Was Once Considered Cell Waste Could Now Treat Pancreatic Cancer,” “Cellular Tango: Immune and Nerve Cells Work Together to Fight Gut Infections,” “Scientists Reveal Fire Ant Venom Could be Used as a Skin Treatment.” The answer? All of the listed studies are based on the results of experiments conducted on mice. And that is a huge problem.

Using lab mice to understand how the human body works is nothing new. The practice officially started in 1902, when French biologist Lucien Cuénot used mice to research the nature of genes. Inspired by the work of Gregor Mendel, the father of modern genetics, Cuénot wanted to see if Mendel’s laws of inheritance applied to more than just pea plants. Until then, Mendelian inheritance had been demonstrated only in plants, so Cuénot’s discovery that animals follow the same laws sent shockwaves across the scientific community.

Not long after, more scientists began to use mice to explore the field of genetics, establishing mating programs that created inbred strains of mice and leading efforts to fully map the mouse genome. As the decades went by, lab mice skyrocketed in popularity and ended up contributing to numerous award-winning discoveries. Of the 106 times the Nobel Prize in Physiology or Medicine has been awarded so far, 42 have involved research on mice or rats in some major way. These include the discovery of penicillin, the development of the yellow fever and polio vaccines and the discovery of HIV.

It is easy to see how the lab mouse became such an iconic symbol of biomedical research. Xavier Montagutelli, the head of the animal facilities at Institut Pasteur in Paris, explains, “[Mice] are small and inexpensive, they reproduce quickly… and they age quickly too, making them ideal for studying age-related complaints. We know how to freeze their embryos, sperm, and ova. We now know how to manipulate their genes…They are remarkable tools.”

Unfortunately, the acceptance of mice as the ideal test subject has led to the rigid assumption that they are some kind of prototypical “blank slate” mammals rather than a species with its own unique features and body mechanisms. As a result, the field of biomedicine has built an entire infrastructure of knowledge around these rodents and has become dependent on their bodily responses to measure clinical success. But they simply don’t work as models of human disease, much less for human drug treatment.

For instance, scientists have used mice to search for tuberculosis treatments for decades. However, mice respond to the disease in a drastically different manner than humans do. For one thing, mice don’t cough and aren’t contagious when they have the disease. In addition, the human body mounts an immune response when the bacterium responsible for tuberculosis is detected; mice don’t have this immune response—they simply get the disease and die. So it’s no surprise that scientists have found an antibiotic called linezolid that works spectacularly well in human patients but not in mice.

The opposite can happen as well. In the late 1950s, German doctors prescribed thalidomide under the brand name Contergan to pregnant women to alleviate morning sickness. Since the drug had been successful in mice, they assumed that the same would happen in humans. Instead, Contergan led to countless birth defects, and only 40 percent of the affected children survived. And this isn’t just a fluke, either. Dr. Jean-Marc Cavaillon, head of the cytokines and inflammation unit at Institut Pasteur, explained how researchers discovered a monoclonal antibody that treats inflammatory conditions in mice but would send human patients to intensive care. “Mice are great for basic research, for understanding overall patterns and grasping mechanisms. But once you start modeling a human disease to find the right treatment, you run up against major differences between us and mice,” he said.

As a result, drug treatments that were successfully tested in mice have a high chance of failure when tested on humans. According to a 2014 study on animal models, on average less than eight percent of experimental cancer treatments successfully transition from animal testing to clinical cancer trials. Similarly, researchers trying to find a cure for ALS have submitted about a dozen experimental treatments for clinical testing over the past decade after finding success in mice. But when tested on humans, all but one of them failed, and the one that didn’t showed only marginal benefits.

It also doesn’t help that these clinical trials are ridiculously expensive—we’re talking about hundreds of millions of dollars and years’ worth of time. In October 2014, the New England Journal of Medicine published a report about how the clinical trials of three tuberculosis treatments ended in complete failure, despite promising results in lab mice. According to the head researcher, the clinical trials alone cost more than $200 million.

But that raises the question: Can we find a suitable replacement for the lab mouse? Unfortunately, no one can say for sure. Simply replacing mice with a different animal won’t solve everything, since animal testing as a whole remains rather dubious. So far, there are only two major alternatives, computer models and in vitro cell cultures, and neither offers much of a substitute since neither captures the complex interactions of living systems.

In addition, the push to stop the use of lab mice has been very controversial within the scientific community, especially among those who would rather turn a blind eye to the issue. Simply put, lab mice are incredibly cheap, convenient and easy to handle. Such an initiative would also place a large portion of biomedical research in jeopardy and cast a shadow of doubt across countless pre-existing studies on disease treatment. Scientists today still continue to experiment on mice and spend millions of dollars on clinical trials, only to wonder why their product didn’t work. But what other choice do they have?

A survey of the National Library of Medicine’s database showed that experiments conducted on mice and rats make up almost half of the 20 million academic citations across the field of biomedicine. Despite all the problems they have caused, lab mice remain deeply entrenched in the field of medical research. Clearly, this isn’t a problem that can be solved in a single day.

But what’s even worse is that many news publications make it seem as if these experimental treatments have already worked on human patients and are bound to hit the shelves in the near future. Remember the headlines mentioned at the beginning of this article? All of those articles were based on mouse studies, and yet none of them mentioned the word “mice” in the headline. It’s sloppy journalism like this that fuels people’s doubt and confusion toward the sciences. In the end, one must remain vigilant when reading about the latest discoveries and findings. Science is already a difficult field to grasp, and diving into the literature blindly won’t make things any easier in the long run.

Originally published on September 21, 2017, in The Miscellany News: Experiments on mice should not be generalized to humans

Should We Fear the Rise of A.I.?

Picture Credit: The Register

Earlier this September, billionaire entrepreneur Elon Musk stirred up a huge Twitter-storm when he posted that global competition in artificial intelligence (AI) superiority could potentially lead to World War III. This tweet came after Russian President Vladimir Putin declared, “Whoever becomes the leader in [artificial intelligence] will rule the world,” to which Musk tweeted, “It begins….”

Of course, Elon Musk is rather infamous for making grandiose predictions and promises that often fail to come true. In 2016, he announced that his company SpaceX would master space travel and colonize Mars as early as 2024, only to pull the plug less than a year later when he realized that traveling to Mars in 25 minutes isn’t exactly feasible. Still, Musk’s tweet about World War III is only the latest of many warnings he has issued about the dangers of artificial intelligence, some going so far as to reference the “Terminator” movies.

“AI is a fundamental existential risk for human civilization, and I don’t think people fully appreciate that,” he stated at the 2017 National Governors Association in Rhode Island.

But is the situation really that dire? Massimiliano Versace, CEO of the robotics and computing company Neurala, argues that these doomsday predictions surrounding AI are largely unsubstantiated. In fact, his biggest complaint is that non-experts like Musk, who have no clue how AI actually works, seem to be dominating the discussion. In contrast to Musk’s warnings, Versace says that it is much too early to start regulating AI and that doing so would hinder innovation.

Several other critics have also voiced their opinions on the robot-apocalypse scenario that Musk seems to predict. Google co-founder Larry Page has made the case that AI is designed to make people’s lives easier so that they have more time to pursue their own interests. Likewise, Facebook’s Mark Zuckerberg compared fears about AI to early fears about airplanes and encouraged people to “choose hope over fear.”

On the other hand, it’s not like Musk is the only dissenting voice in the room. Renowned theoretical physicist Stephen Hawking similarly expressed how artificial intelligence could spell the end of the human race, and Microsoft’s Bill Gates voiced his worries that AI might become a problem after it becomes intelligent enough to dominate the workforce.

However, rather than a “Terminator”-style takeover, the bigger concern for me from a cultural perspective is the direction that AI might take the world in.

It’s undeniable that today’s society places a disproportionate amount of attention on science and technology over any other discipline. Given how dependent on machines we’ve become, it’s no surprise that so many people hold degrees in math-intensive STEM subjects such as computer science, robotics, and electrical engineering and that we place these individuals on lofty pedestals. As a result, pursuing a degree in the humanities is widely seen as a high-risk gamble considering the increasingly bloodthirsty modern arena known as the job market. But the problem here is that the widespread implementation of AI will likely exacerbate this issue even further.

Last March, U.S. Treasury Secretary Steve Mnuchin brushed aside all concerns about AI and stated that “In terms of artificial intelligence taking over the jobs, I think we’re so far away from that that it’s not even on my radar screen.” Unfortunately, Mnuchin couldn’t be more wrong. In reality, AI has already started to seep into the workforce.

Let’s list some examples. In San Francisco, Simbe Robotics’s Tally robot can navigate around human shoppers at the supermarket to make sure that everything is stocked, placed and priced properly. Meanwhile, in Japan, Fukoku Mutual Life Insurance has already replaced 30 of its employees with an AI system that can analyze and interpret data better and much faster than a human can. Artificial intelligence is also replacing financial analysts in the business sector simply because it can predict market patterns faster.

Not only that, careers thought to be safe from the encroaching tech revolution—such as journalism and teaching—are now at risk as well. For instance, companies such as Narrative Science and Automated Insights have created AI bots that write countless business and sports articles for clients like Forbes and the Associated Press. The United States military also relies on a computer-generated virtual therapist to screen soldiers in Afghanistan for PTSD, and physical robots are being used in Japan and Korea to teach English. Even actors could be replaced by technology, as with the digital recreation of Grand Moff Tarkin in “Rogue One: A Star Wars Story.” Given the efficient and cost-effective nature of AI, it won’t be long until these systems are used in practically every industry.

Of course, there are various reassuring arguments out there. A common response is that new jobs will naturally emerge as old ones are automated away. However, exactly what kind of job do you think will be in demand once more and more companies implement AI in their businesses? An insightful article by Clive Thompson puts it best in its headline: “The Next Big Blue-Collar Job Is Coding.” Sure, jobs won’t completely disappear, but I predict that the tech industry will be the only area in dire need of employees.

Another common response is that a greater focus on STEM education will eventually solve everything. Jenny Dearborn, an executive at the software company SAP, argues that young people today have a responsibility to become more educated in technology. “If you want to do health care, major in tech with a healthcare focus. If you want to be a neuroscientist, major in tech with that focus,” she emphasized.

However, that’s easier said than done. The United States already lags behind in STEM education compared to the rest of the world, and considering how our current Secretary of Education is a billionaire who has spent millions of dollars fighting against government regulations and crippling teachers’ unions by taking away their right to strike, I’m not feeling too optimistic. Plus, what if you’re simply not naturally inclined toward skills in STEM? What about people who just don’t enjoy it?

Obviously, the last thing I want to do is bash the STEM disciplines and discourage people from pursuing STEM careers. I truly believe that science and technology can inspire wonder and excitement in everyone. However, I worry that students who discover their passions in the humanities will end up squeezed even harder by a STEM-oriented educational system than they are today. As a college student who once had plans of majoring in the humanities, I’d hate to imagine what job searching will be like in a future where AI has made that notoriously grueling, overly competitive process even harder.

Originally published on September 14, 2017, in The Miscellany News: Global job industries should prepare for growth in AI

Is Drinking Coffee Good or Bad For Your Health?

Picture Credit: USA Today

There is no doubt that America loves its coffee. According to a 2017 study by the National Coffee Association, 62 percent of Americans drink this caffeinated beverage on a daily basis, consuming close to 400 million cups per day. That’s more than 140 billion cups of coffee per year.

On top of that, Americans have no intention of straying from this path. Studies have found that 31 percent of coffee drinkers consider brewing coffee to be the most important activity in the morning and 52 percent of drinkers stated they would rather skip the morning shower than their cup of joe. It’s safe to say neither Starbucks nor your local coffee shop will fall out of fashion anytime soon.

But while coffee’s immense popularity is unquestionable, can we say the same about its health benefits? This has been a contentious issue for a long time, as countless studies over the past several years have branded the beverage either as a cure-all that increases lifespan or as a deadly toxin that shortens it. Case in point: In 1981, Harvard published a study that connected coffee with a high risk of pancreatic cancer, which sent the entire nation into a frenzy. Later, those same Harvard researchers concluded that smoking may have been the real culprit. As with dark chocolate and red wine, it’s incredibly difficult to pin down any definitive answer regarding coffee’s effects on the body, because food studies by their nature cannot prove cause and effect. Still, we should be able to gather a general idea of its effects and whether the benefits outweigh the risks.

So what does science really say about the health effects of coffee? For the most part, it’s good news—or at the very least, coffee won’t kill you. Numerous studies suggest that drinking coffee regularly offers a wide range of health benefits, such as lowering the risk of stroke and dementia.

In fact, there doesn’t seem to be an end to the good news. A 2012 study indicates that the caffeine in coffee could decrease the risk of type 2 diabetes. The study followed almost 80,000 women and more than 40,000 men and controlled for all major lifestyle and dietary risk factors. After more than 20 years, the researchers found that coffee consumption was associated with an eight percent decrease in risk for women and a four percent decrease for men.

The same could even be said for heart disease. In a 2015 meta-analysis of studies investigating long-term coffee consumption, Harvard researchers found that people who drank about three to five cups of coffee a day had the lowest risk of heart disease among more than 1,270,000 participants. Not only that, but those who consumed five or more cups a day did not suffer any higher risk than those who didn’t drink coffee at all. This information lines up with what a team of cardiologists at the University of California, San Francisco, stated all the way back in 1994: “Contrary to common belief, [there is] little evidence that coffee and/or caffeine in typical dosages increases the risk of [heart attack], sudden death or arrhythmia.”

On the other hand, studies investigating the supposed ill effects of drinking coffee have surprisingly come up short. To begin with, most of the negative connotations that surround coffee are mere myths. For instance, the old wives’ tale about how kids shouldn’t drink coffee because it stunts their growth is just not true. Years of studies have shown that there is no scientifically valid evidence that suggests that coffee affects a person’s height.

Likewise, the idea that drinking coffee will lead to lower bone density and greater risk of osteoporosis is also dubious. Scientists believe that this fear likely stemmed from early studies that linked caffeine with reduced bone mass. However, those early studies were mostly conducted on elderly people whose diets already lacked milk and other sources of calcium. To top it all off, even fears that coffee increases the risk of hypertension turned out to be unfounded, thanks to a 2002 study by Johns Hopkins.

Exactly what is it in coffee that provides all these benefits? Most studies point to coffee’s high antioxidant content, which protects the body from free radicals that cause harm and factor into cancer development. In fact, according to the American Chemical Society, coffee is the leading source of antioxidants in American diets due to how often we drink it.

Does this mean that coffee is a miracle drink after all? It’s difficult not to come to that conclusion, especially since two new studies published this year concluded that those who drink coffee regularly tend to live longer than those who do not. However, it’s best not to get carried away since, as stated earlier, food studies are notoriously inconsistent. These are all correlations, not causations. The caffeine in coffee is still a drug that has widespread effects that we’re not even close to uncovering. Coffee is still linked to insomnia, heartburn, addiction and digestion problems, as well as weight gain if consumed in excess (even without cream and sweeteners).

Both the U.S. Food and Drug Administration and the International Food Information Council recommend that you don’t exceed 400 milligrams of caffeine a day, which is roughly equivalent to four regular cups of coffee or one Starbucks Venti. As always with food or drink, moderation is key.
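
If you want to see where the “four cups or one Venti” rule of thumb comes from, the quick arithmetic below uses typical published caffeine estimates; the per-serving figures (about 95 milligrams for an eight-ounce cup of brewed coffee and roughly 410 milligrams for a 20-ounce brewed Venti) are my own assumptions, not numbers from the agencies cited above.

# Quick caffeine-budget arithmetic (per-serving figures are typical estimates).
DAILY_LIMIT_MG = 400              # FDA / IFIC guideline cited above

caffeine_per_cup_mg = 95          # typical 8 oz cup of brewed coffee
caffeine_per_venti_mg = 410       # typical 20 oz brewed Starbucks coffee

print(f"Regular cups per day: {DAILY_LIMIT_MG / caffeine_per_cup_mg:.1f}")  # ~4.2
print(f"Ventis per day: {DAILY_LIMIT_MG / caffeine_per_venti_mg:.1f}")      # ~1.0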

Originally published on September 7, 2017, in The Miscellany News: Coffee in moderation beneficial to health

Declaring Your Major: The Absurd Dichotomy Between STEM and the Humanities

Picture Credit: Yahoo News

No matter who you are or what your background may be, every college student eventually faces the question that haunts them throughout their undergraduate experience: Did I choose the right major, given the increasingly competitive and cutthroat job market of the real world? People sometimes joke about how screwed they are after graduation, or how the next step in their illustrious career path is living in a cardboard box for the rest of their life. But the fathomless, pitch-black uncertainty that surrounds life after college generates so much fear and anxiety that most students structure their four years in college with the sole purpose of minimizing that uncertainty as much as possible.

Thus, it is inevitable to hear both students and adults discuss fervently which major is the most financially secure or which has the greatest likelihood of success. And within these discussions, the most vocal opinion is that majoring in the hard sciences like computer science, economics and math will obviously lead to more financial stability and employment offers than majoring in something “impractical” like philosophy, art history or English. After all, in today’s digital, market-driven society, everyone knows that STEM majors are “more valuable” than humanities majors.

Speaking as someone who absolutely loves the sciences and strives to convince others of their awe-inspiring brilliance, I find this laughably wretched sentiment to be one of the most deceitful claims I have ever heard.

Unsurprisingly, the counterarguments are many and quite intimidating at first. According to a recent salary survey by the National Association of Colleges and Employers, Class of 2016 college graduates who majored in STEM are expected to receive the highest starting salaries, with majors in engineering and computer science expected to earn an average of over $60,000 per year. In contrast, the average salary for new graduates who majored in the humanities is projected to be around $46,000.

The U.S. Bureau of Labor Statistics also compiled a list of the top “most valuable” college majors in 2012 based on median salary rate, job growth projections through 2020 and wealth of job opportunities available and ranked biomedical engineering at the top followed by biochemistry, computer science, software engineering and civil engineering. On top of all that, countless politicians (both Republicans and members of the Obama administration) have pushed to distribute education funding based on post-college performance and student earnings after graduation in order to combat the shortage of STEM workers, placing the humanities departments in serious jeopardy.

However, I want to point out that just because certain STEM jobs have astronomically high salary rates doesn’t mean that majoring in STEM will guarantee you an easier or more financially stable life with a higher chance of employment.

In truth, the idea that there is this crisis-level shortage of scientists and engineers in the United States is largely baseless. Studies from the National Bureau of Economic Research, the RAND Corporation and the Urban Institute have all been unable to find any compelling evidence indicating the presence of some widespread labor market shortage or hiring difficulties in science and engineering occupations that require bachelor’s degrees or higher. Not only that, the overall consensus was that the United States produces too many science and engineering graduates every year and not enough STEM job openings. The only disagreement among the studies was whether there are 100 percent more STEM graduates than job openings or 200 percent more.

That’s right: the unemployment rate is shockingly high among scientists and engineers, especially for recent graduates and PhDs. This includes graduates who majored in engineering (7.0 percent), computer science (7.8 percent) and information systems (11.7 percent). Of course, this doesn’t even factor in other problems such as unstable careers, slow-growing wages, the high risk of jobs moving offshore and the near impossibility of landing a tenure-track academic position. Most depressing of all, a recent U.S. Census Bureau survey of 3.5 million households found that almost 75 percent of people who graduated with a bachelor’s degree in a STEM discipline don’t even work in a STEM job.

Given this bleak situation, does that mean majoring in the humanities is a far better choice for students than majoring in STEM? After all, those who stand against the STEM hype often argue that the humanities bring a sort of “richness” and “complexity” to society that science cannot replicate. No, that reasoning is also flawed. While claims that the humanities teach students critical thinking and communication skills are valid, they aren’t nearly convincing enough to sway the minds of students worried about their substantial college debt.

If neither STEM nor the humanities are objectively better than the other, then what should financially conscious college students major in? Surely, the answer isn’t something sappy and unhelpful like “pursue your dreams.” No, what I’m proposing is that students create an integrated curriculum that combines elements from both the humanities and STEM to introduce a new perspective to a pre-existing discipline.

But what does that mean? Am I suggesting that low-income students double or triple major in contrasting subjects? Quite the opposite: Students should combine subjects and pursue a path that sheds light on a certain STEM path using elements of a specific humanities discipline or that sheds light on a certain humanities path using elements of a specific STEM discipline. In other words, stray from the “pure” science or “pure” humanities.

Celebrated geniuses of the past were successful not because they were masters of a single discipline but because they were creative enough to pull inspiration from a wide range of sources and view conventional ideas in radically different ways. For instance, Leonardo da Vinci used his mastery of painting, writing, engineering and biology to study the anatomy of the human body. By choosing to explore the human body in the context of both art and science, da Vinci produced his famous drawings of the human figure that revolutionized the entire world.

Rather than focus solely on philosophy or solely on neuroscience, try to find a new approach by combining the two, like award-winning cognitive scientist David Chalmers, who came up with the concept of naturalistic dualism to explain the nature of consciousness. Instead of majoring in pure physics or pure history, discover a more creative and integrated career path like the famous physicist and historian Thomas Kuhn, who was arguably one of the most influential historians of science in the world. Employers are not looking for college graduates with conventional majors and a one-track mind; they are looking for people who seek new perspectives and are willing to explore new territory. If you want success after college, make full use of all the courses that are available and pursue combinations that no one has tried before.

Originally published on May 3, 2017, in The Miscellany News: Moving beyond the science/humanities dichotomy