The War on Cancer: We Still Think We’re Winning Even Though We’re Not


I must admit, it can be exciting to read about the latest developments in cancer research in the news. There is so much happening in the field of oncology that it’s tempting to imagine a future without cancer just around the corner.

For instance, scientists at Northwestern University have reported finding what they call an ancient “kill switch” in cancer cells. According to the researchers, they may be able to use this mechanism to force cancer cells to kill themselves in multiple ways simultaneously.

Not only that, a revolutionary new form of cancer treatment known as CAR T-cell therapy has swept the scientific community up in an excited fervor. By genetically modifying the T-cells of a cancer patient’s own immune system and then infusing them back into the patient, researchers have successfully destroyed tumors in people who had lost all hope.

According to various news reports, this treatment was so promising that the U.S. Food and Drug Administration (FDA) recently gave it the green light for production and distribution, making it the first gene therapy to become available to patients in the United States.

“We’re entering a new frontier in medical innovation with the ability to reprogram a patient’s own cells to attack a deadly cancer,” FDA Commissioner Dr. Scott Gottlieb stated after approving the treatment.

As with anything that’s showered with positive attention by the media, however, it’s not as simple as it appears. All the hype surrounding cancer research is actually blinding us to the reality that we are not winning the war against cancer. In fact, despite what headlines may claim, we are nowhere close to actually finding the cure for cancer.

While such a claim may sound needlessly pessimistic, it is vital to view the current trajectory of cancer research within the context of its larger history. For one thing, cancer has been with us for a very, very long time: ancient manuscripts describe the disease as far back as 1600 B.C., and fossil evidence pushes its presence back even further. Needless to say, renowned scientists and medical experts throughout history have made countless attempts to understand and combat this terrifying disease. In recent memory, the most notable collective endeavor is America’s War on Cancer, launched by President Nixon in 1971. Since then, the United States has steadily intensified its efforts to find a cure.

Over the past 40 years, the U.S. has poured a total of more than $500 billion into winning this war, and the effort continues to escalate. In 2017, the National Cancer Institute (NCI) received $5.389 billion for the fiscal year, $174.6 million more than it received in 2016. In addition, around 260 different nonprofit organizations in the United States raise money for cancer research and treatment, and their combined budgets top $2.2 billion.

This should be good news, right? All of that money is going toward a worthy cause, and that much is undeniable. The problem, however, is that all that money has translated into very little substantive progress toward a permanent solution. We have made great strides in understanding the nature of cancer cells and how they behave in general. Unfortunately, turning that knowledge into a reliable treatment has so far proven much more difficult than anyone had realized.

Despite receiving billions of dollars in funding and conducting countless expensive and laborious drug trials, scientists have yet to develop a treatment that reliably and meaningfully increases patients’ chances of survival, much less one that actually cures the disease. In fact, a study published earlier this year reported that two-thirds of the cancer drugs approved in the past two years showed no evidence of extending survival at all (USA Today, “Dozens of New Cancer Drugs Do Little to Improve Survival,” 02.09.2017).

When President Nixon announced the War on Cancer, he vowed that cancer would be cured by 1976. Today, cancer remains as deadly as ever. According to the World Health Organization, one in six deaths in the world in 2015 was caused by cancer, a total of 8.8 million deaths. Cancer is thus still the second leading cause of death globally, just behind heart disease. The death toll from heart disease, however, has decreased significantly over the past several decades: between 1950 and 2005, the death rate from heart disease dropped by 64 percent. In contrast, the death rate from cancer fell by a mere five percent during that same period. That’s how little progress we have made, even with billions of dollars funding decades of focused research.

Of course, the last thing I want to do is discourage further cancer research. Despite the rather bleak odds, there is still value in continuing this line of inquiry and searching for other treatment options. The point I’m trying to articulate is that the news you hear about cancer research tends to be so relentlessly positive that it often fails to depict the reality of the situation. No matter where you look, every new insight is a “major breakthrough,” and every experimental treatment is “a miracle in the making.” By exaggerating successes, the media has effectively deceived the general public into believing that the cure for cancer is just around the corner.

Case in point: CAR-T therapy. Remember how I mentioned earlier that this method of cancer treatment showed promising results? When news sources learned that the FDA had approved its use in the United States, they erupted with excitement. They published articles about the miracle of CAR-T therapy, with headlines such as “Latest Car-T Therapy for Cancer Signals New Era for Life-Saving Treatments”, “New Gene Therapy for Cancer Offers Hope to Those With No Options Left” and “Cancer’s Newest Miracle Cure”. In typical fashion, all these articles featured heartwarming stories of cancer patients miraculously saved by a revolutionary new treatment that would surely stop cancer in its tracks.

What these articles fail to mention is that CAR-T therapy can be incredibly dangerous because it needs to bring your body to the brink of death in order to save you. While the genetically engineered T-cells spread and kill the tumor cells, the patient undergoes a series of intense side effects that are so sudden and severe that a team of top specialists must remain on standby to keep the patient alive.

And sometimes, not even that is enough. So far, several patients have died from neurotoxicity complications during the clinical trials, and experts still haven’t pinned down the exact cause of their deaths. Because CAR-T therapy is so risky and complex, experts warn that it will take years before a treatment like this is safe for widespread use. It is certainly not the miracle cure that the media is making it out to be. It isn’t even effective against all cancers: CAR-T therapy has mainly been used to treat blood cancers such as leukemia and lymphoma, and it still struggles against solid tumors like sarcomas.

Does this mean that CAR-T therapy is a lost cause? Absolutely not. Medical experts are justified in calling this immunotherapy treatment a legitimate, revolutionary breakthrough in a field that has largely stagnated over the past several decades. This is a major accomplishment, and the cancer survival stories prove it. But the issue is that for the past 40 years, the media has trumpeted the end of cancer with every trivial development. By bombarding the public with exaggerated tales of success, the media has essentially deluded the country into believing that we are winning the war against cancer and that all cancer patients have a good chance of not only surviving but also returning to their normal lives. Such rose-colored views are far from the truth, and they have broken families apart.

As Dr. Otis Brawley, the chief medical officer at the American Cancer Society, explained, “We have a lot of patients who spend their families into bankruptcy getting a hyped therapy that [many] know is worthless…[Some choose a medicine that] has a lot of hype around it and unfortunately lose their chance for a cure.”

It’s already heartbreaking for patients and their loved ones to learn that they have cancer. It feels infinitely worse to undergo several “life-saving” treatments performed by doctors at the country’s best hospitals only to learn that none of it actually works. Consider the tragic story of Michael Uvanni and his brother James, a patient with skin cancer. After hearing about all the miracle treatments that were supposedly available and seeing happy commercials of cancer patients hugging their grandchildren, they felt confident that the odds were in James’ favor. That optimism led to crushing disappointment when his health continued to suffer, even after trying immunotherapy and several other experimental treatments. Three years after his diagnosis, James passed away from metastatic melanoma.

“I thought they were going to save him…You get your hopes up, and then you are dropped off the edge of a cliff. That’s the worst thing in the world,” confessed Michael Uvanni.

This sort of duplicitous optimism, unfortunately, permeates the entire field of oncology. Newspapers hype research results to attract readers, drug companies make outrageous promises to boost sales, and hospitals draw in paying customers by appealing to their hopes and overstating their accomplishments. Many scientists have fallen victim to this mindset as well, exaggerating the successes of their own research to attract investors. Back in 2003, Dr. Andrew von Eschenbach, then the director of the National Cancer Institute, announced the possibility of “eliminating suffering and death due to cancer by 2015.” Even President Obama contributed to the illusion when he announced the Cancer Moonshot project in 2016 by saying, “Let’s make America the country that cures cancer once and for all.”

Given all these overly positive messages, it’s no wonder that so many cancer patients believe that their lives are guaranteed to be saved, only to feel crushed when they learn the awful truth. Let’s be clear: There is no miracle cure for cancer. According to the American Cancer Society, the percentage of people who are alive five years after being diagnosed with stomach cancer is 29 percent. For lung and bronchus cancer patients, the number is 18 percent. For pancreatic cancer patients, it’s 7 percent. Patients with metastatic melanoma typically die within a year of diagnosis. Despite what you may hear, immunotherapy can cause fatal immune system attacks on the lungs, kidneys, and heart. There are no approved immunotherapies for breast cancer, colon cancer or prostate cancer. Not only that, studies have found that immunotherapy only benefits about 10 percent of all cancer patients.

As grim as all this may be, we must remember that not all hope is lost. That said, the last thing cancer patients need right now is to be blindsided by all the fanfare that seems to accompany every piece of cancer news.

Originally published on October 26, 2017, in The Miscellany News as “Cancer research advancements overstated.”


The New Americana: More Teens Are Suffering From Anxiety, Depression Than Ever Before


For a long time, teenagers have been characterized—generally by those older than them—as overly moody, self-centered and irrational. It’s not uncommon for adults to complain about how millennials are emotionally unstable, or to brush aside their problems as typical “teenage angst.” But in reality, these millennials have been rather upstanding.

Illegal drug use among teens has been declining for several years, and far fewer adolescents are smoking cigarettes and drinking alcohol than at almost any point on record. Not only that, the National Center for Health Statistics has reported a record low in the U.S. teen birth rate, while high school graduation rates reached an all-time high of 83.2 percent in 2015.

Yet despite all the good news, researchers have noticed a disturbing trend: American adolescents are developing serious mental health problems at an alarming rate. According to the Department of Health and Human Services, about three million teenagers ages 12 to 17 had at least one major depressive episode in 2015 alone, and more than two million teens reported experiencing depression that impairs their daily activities. What’s even more startling is that this number is predicted to increase. According to a study that tracked depression among young adults across the country, the number of teenagers who reported symptoms such as low self-esteem and problems with sleep and concentration rose by 37 percent between 2015 and 2016.

And it’s not just depression. Researchers have found that cases of anxiety have spiked in recent years. According to the Anxiety and Depression Association of America (ADAA), anxiety disorders have become the most common mental illness in the United States, affecting 18.1 percent of adults every year. In fact, the National Institute of Mental Health reported that about 6.3 million teens in the U.S. have an anxiety disorder of some kind. Unfortunately, this widespread phenomenon is not limited to middle and high school students. Anxiety has overtaken depression as the most common reason college students seek counseling services, and according to the American College Health Association, the share of undergraduates reporting “overwhelming anxiety” rose significantly, from 50 percent in 2011 to 62 percent in 2016.

It’s not normal “teen angst” anymore; it’s a full-scale epidemic that is bound to get worse if ignored. But what could be causing such a shocking national trend? Unfortunately, not even the researchers know for sure. Usually, there are conspicuous reasons for adolescents to feel depressed or anxious: being raised in an abusive household, living in poverty or being surrounded by violence are all understandable causes of emotional instability. Yet teenagers who live in well-off communities, and who seemingly should have nothing to worry about, often suffer the most. What could possibly be causing these adolescents such grief?

Rather than one definite answer, it is most likely the result of several interwoven factors. For instance, anxiety and depression have been shown to have a biological component. Scientists have already located several genes that may influence the risk of developing an anxiety disorder, such as variants of the GLRB gene, which have been linked to responses in the brain that cause us to become startled or overly fearful. However, there are other relevant biological factors besides genetics. Recently, scientists discovered that our gut bacteria may influence the functioning of brain regions such as the amygdala and the prefrontal cortex, both of which are heavily linked to anxiety and depression. These studies found that mice with an imbalance in their gut microbiome were more likely to display anxious and depressive behaviors.

However, many experts agree that environment likely plays a larger role in the rise of mental health issues in adolescents than genetics or gut bacteria. More specifically, researchers suspect that this epidemic of intense anxiety and depression in teens may be caused by the overwhelming pressure placed on them not only to succeed but to perform better than everyone else. As a result of this pressure, both high school and college students have reported that their biggest stressor is the fact that no matter what they do, it’s never enough.

“Teenagers used to tell me, ‘I just need to get my parents off my back.’ [But now,] so many students have internalized the anxiety. The kids at this point are driving themselves crazy,” stated Madeline Levine, a practicing psychologist and a founder of a nonprofit that works on school reform. This news probably comes as a surprise to no one. In 2013, the American Psychological Association reported that American teenagers had become the most stressed age group in the United States. Various culprits are likely at fault, including sleep deprivation, uncertainty surrounding job security and the fear of not living up to people’s expectations.

Researchers have also assigned blame to the prevalence of social media and technology. With everyone connected to the internet, it’s difficult for teens to avoid constantly comparing themselves with their peers and worrying about their digital image. Unsurprisingly, many anxious teenagers agree that social media has had a negative influence on their mental health. According to accounts by teenagers attending Mountain Valley, a residential treatment facility for adolescents suffering from severe anxiety disorder, social media played a large role in lowering self-esteem and provoking feelings of anxiety. Not only that, the students also talked about how their smartphones provided a false sense of control, which they could use to avoid talking to people and escape the stresses of school.

As a result, several experts suspect that there may be a connection between the extreme spike in anxiety and depression in recent years and the widespread adoption of the iPhone. As Jean Twenge, a professor of psychology at San Diego State University, puts it, these dramatic trends in teen mental health issues started “exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.”

In the end, researchers have yet to find a conclusive explanation for this troubling phenomenon. However, they agree that there is a disturbing lack of resources available to help young adults who are currently struggling with these problems. Studies show that despite the rise in mental health issues, there hasn’t been a corresponding increase in mental health treatment for teenagers and young adults. Not only that, the number of adolescents actually struggling with anxiety and depression is likely greater than the reported figures, since many people choose not to seek help. In fact, the Child Mind Institute reported in 2015 that only 20 percent of young people with a diagnosable anxiety disorder get treatment.

Thus, it is important to understand the true gravity of the situation and reach out to those who need help. During these times of uncertainty and hardship, it’s crucial for us to take the time to understand these individuals and aid them as best as we can rather than brush their problems aside as mere trivialities.

Originally published on October 19, 2017, in The Miscellany News as “Millennial mental health issues spike, prompt response.”

Our Infatuation with Solar, Wind Could Make Climate Change Worse


During these troubling times of environmental turmoil, in which dangerous levels of carbon dioxide emissions threaten to destabilize the global climate, it’s no surprise that a lot of people are pushing vehemently for greater investment in renewable energy. In fact, despite the childish clamoring of several anti-science government officials, the idea of renewable energy, especially solar and wind energy, is incredibly popular among the vast majority of Americans.

In 2016, the Pew Research Center reported that 89 percent of Americans favor building more solar panel farms and 83 percent favor constructing more wind turbine farms. In contrast, only about 41 percent of Americans wanted to expand coal mining. These numbers aren’t meaningless, either: according to the Renewables 2016 Global Status Report (GSR), renewable energy saw its largest annual increase in energy contribution ever in 2015, despite low prices for fossil fuels.

It’s pretty clear that a large majority of people hold solar and wind energy in high regard. I’d even go so far as to say that in this modern, socially conscious age, there isn’t a term more associated with pure good than renewable energy. However, this blind infatuation may end up jeopardizing our entire fight against climate change. How in the world could renewable energy possibly be a bad thing?

To better illustrate my point, consider the incredible amount of attention and fanfare that the Idaho-based startup Solar Roadways Inc. received for its idea to replace America’s roads with structurally engineered solar panels that could generate electricity while withstanding vehicle traffic. Founded in 2006, the startup presented a vision of a world in which solar roadways not only light up streets and display configurable road markings with embedded LEDs but also power entire cities, creating a cleaner, greener world.

When people heard about this revolutionary new idea, they fell madly in love with the concept of solar roadways. More than 50,000 backers supported the project’s Indiegogo crowdfunding drive, and the startup raised more than $2 million, making it the most popular Indiegogo campaign ever at the time. But it wasn’t just green-energy enthusiasts who contributed financially to the enterprise. Even the Department of Transportation stepped in and invested more than $1.6 million in the project.

Unfortunately, all of it turned out to be a bust. When 30 solar roadway panels were finally installed on a public walkway in 2016, 25 of them broke down within a week, and more malfunctions appeared once it rained. Even more disappointing, the highly anticipated solar roadway, even when fully operational, generated an average of 0.62 kilowatt-hours of electricity per day, not even enough energy to power a hairdryer, much less an entire city.
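
To put that 0.62-kilowatt-hour figure in perspective, here is a rough back-of-envelope comparison. This is only a sketch: the 1,800-watt hairdryer is a typical value I’m assuming, not a number from the original reporting.

```python
# Back-of-envelope: what does 0.62 kWh per day actually buy you?
# Assumption: a typical hairdryer draws about 1,800 W (1.8 kW).

daily_output_kwh = 0.62  # reported average output of the walkway
hairdryer_kw = 1.8       # assumed hairdryer power draw

# Average continuous power delivered over a 24-hour day
average_power_w = daily_output_kwh / 24 * 1000
print(f"Average output: {average_power_w:.0f} W")        # ~26 W

# How long the entire day's output could run the hairdryer
runtime_min = daily_output_kwh / hairdryer_kw * 60
print(f"Hairdryer runtime: {runtime_min:.0f} minutes")   # ~21 minutes
```

In other words, a full day of generation would run one hairdryer for about 20 minutes; while in use, the hairdryer draws roughly 70 times more power than the walkway produced on average.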

But solar roadways aren’t the only invention that took advantage of people’s infatuation with renewable energy. In February, a startup raised more than $350,000 on Indiegogo by promoting the Fontus, a self-filling water bottle that uses solar energy to extract water from the air. According to the campaign video, the Fontus is designed to draw air into the bottle and capture moisture through condensation as the air cools. The device would be powered by a small, mousepad-sized solar panel, making it seemingly perfect for backpackers and cyclists. Once again, problems appeared when scientists pointed out that a solar panel that small could never produce the energy needed to make the whole thing work. In fact, it would take a huge, 250-watt, 16-square-foot solar panel working at 100 percent efficiency under ideal circumstances for the Fontus to even come close to fulfilling its promise.

It’s not just solar energy, either. In 2016, the startup VICI Labs made headlines when it promoted the Waterseer, a device that used the wind to “provide up to 11 gallons of safe drinking water” from the air every day. Raising more than $330,000 on Indiegogo, the inventors behind the Waterseer made it seem as if their invention could end all water shortages thanks to the clean power of wind energy, and they managed to persuade UC Berkeley and the National Peace Corps Association to contribute to its development. Once again, the power of green energy was overestimated: several thermodynamicists have pointed out that the Waterseer wouldn’t work in dry, arid areas, the very places that need water the most.
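
The physics behind that skepticism is easy to sketch. Condensing water out of air releases its latent heat, roughly 2,450 kilojoules per kilogram at ambient temperatures, and a condensation device has to pump all of that heat away. The snippet below is a back-of-envelope sketch under deliberately generous assumptions (a cooler that moves one joule of heat per joule of electricity, and no other losses); real devices do considerably worse.

```python
# Back-of-envelope: minimum energy needed to condense water from air.
# Physics: condensation releases ~2,450 kJ per kg of water (latent
# heat at ambient temperature), which must be pumped away as heat.
# Generous assumption: the cooler moves 1 J of heat per 1 J of
# electricity (COP = 1); small thermoelectric units do far worse.

LATENT_HEAT_KJ_PER_KG = 2450
COP = 1.0
SECONDS_PER_DAY = 24 * 3600

def min_power_watts(liters_per_day: float) -> float:
    """Minimum continuous electrical power (W) to condense the given
    daily volume of water, ignoring every other loss."""
    heat_joules = liters_per_day * LATENT_HEAT_KJ_PER_KG * 1000  # 1 L ~ 1 kg
    return heat_joules / (COP * SECONDS_PER_DAY)

# The Waterseer's claim: up to 11 gallons (~41.6 liters) per day
print(f"Waterseer claim: {min_power_watts(41.6):,.0f} W continuous")  # ~1,180 W

# Even a modest 12 liters per day needs a sizable power source
print(f"12 L per day:   {min_power_watts(12):,.0f} W continuous")     # ~340 W
```

So the headline claim implies well over a kilowatt of continuous cooling power before accounting for any inefficiency, and in hot, dry air the device must first chill the air well below its dew point, pushing the real requirement higher still.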

The reason all these bogus crowdfunding campaigns made so much money despite being scientifically dubious is that so many people were willing to believe that renewable energy sources could accomplish anything, even the impossible. They had such a positive outlook on solar panels and wind turbines that they didn’t stop to consider the limitations of those technologies. Of course, this overly optimistic mindset is a natural product of today’s society, in which increasingly alarming news of humanity’s pollutant-ridden path toward ruin makes it seem as if renewable energy is our only hope for survival. But no matter how beneficial it may be, renewable energy should not be placed on a pedestal. We can’t afford to treat it like some kind of magical energy source that provides unlimited free electricity without any restrictions or drawbacks.

For example, many people tend to think solar panels can provide unlimited energy because they draw their power from sunlight, which should be infinite, right? In reality, however, a typical solar panel converts only about 20 percent of the solar energy that reaches it into electricity. In addition, unless it is specifically designed to track the movement of the sun, a fixed panel can miss up to 60 percent of the available sunlight on top of that lackluster 20 percent conversion rate. Not only that, the hotter a solar panel gets, the less energy it produces. It may sound counterintuitive, but for every degree a typical panel rises above 25 degrees Celsius, its maximum power output drops by about 0.5 percent.
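
That temperature penalty is usually expressed as a linear derating coefficient. Here is a minimal sketch using the figure above; the 300-watt panel rating and the cell temperatures are illustrative assumptions, not measurements from any particular product.

```python
# Sketch of the temperature derating described above: output falls
# ~0.5% for every degree Celsius the cells rise above 25 °C.

TEMP_COEFF = -0.005   # fractional power change per °C above rating
RATED_TEMP_C = 25.0

def panel_power(rated_watts: float, cell_temp_c: float) -> float:
    """Estimated maximum power output at a given cell temperature."""
    derate = 1 + TEMP_COEFF * max(0.0, cell_temp_c - RATED_TEMP_C)
    return rated_watts * derate

# A nominally 300 W panel; in full summer sun, cells can run far
# hotter than the surrounding air.
for temp_c in (25, 45, 60):
    print(f"{temp_c} °C: {panel_power(300, temp_c):.0f} W")
# 25 °C: 300 W,  45 °C: 270 W,  60 °C: 248 W
```

On a hot day, then, a panel can quietly give up roughly a sixth of its rated output to heat alone, on top of the conversion and angle losses described above.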

This isn’t to say that renewable energy is terrible or that we should give up on it. While not perfectly efficient, solar and wind power still produce electricity without consuming any limited resources. Yet we can’t delude ourselves into thinking that solving climate change is as simple as building more solar farms and wind turbines.

In fact, doing so without proper planning might do more harm than good. One major consequence of our infatuation with green energy is the rapid decline of nuclear power, the main source of zero-carbon electricity in the United States. Thanks to the popularity of solar and wind farms, nuclear power plants all across the world are on the verge of shutting down for good, which could severely damage our efforts in fighting climate change.

First of all, despite the negative press that it gets, nuclear energy remains quite possibly the cleanest and most viable form of energy that we currently possess. No matter what sort of Greenpeace propaganda you may have heard, nuclear energy is the safest way of producing reliable energy, a statement backed by the World Health Organization, the Centers for Disease Control and the National Academy of Sciences. In fact, a 2010 study by those three organizations found that nuclear power is 40 percent less deadly than the next safest form of energy, wind power. Nuclear energy is also tied for having the lowest carbon footprint, and unlike solar and wind energy, nuclear energy actually stands a chance against the natural gas and coal industries. According to the U.S. Energy Information Administration, solar and wind power made up a combined seven percent of U.S. electricity generation in 2016, while nuclear energy provided 20 percent.

But if the problem is that renewable energy isn’t contributing as much as nuclear energy, can’t we solve this issue by building more solar and wind farms? No, it’s not that simple. One of the biggest problems with solar and wind energy is that they are entirely dependent on the weather. When the sun doesn’t shine or the wind stops blowing, energy production plummets. Of course, this wouldn’t be an issue if we could store the excess energy generated on an especially sunny or windy day, but as of right now, no large-scale method of storing the electricity generated by solar and wind farms exists. As a result, whenever the weather is unfavorable, state governments must find an alternative energy source. What do they turn to now that many of the expensive nuclear plants are shut down? Natural gas and coal.

This isn’t just a hypothetical scenario. In South Australia, a state that gets more than a quarter of its electricity from wind, the government had to switch a mothballed gas-fired plant back on when electricity prices spiked during a period of light wind. Meanwhile, despite investing heavily in green energy, the German government is reportedly paying billions to keep coal generators in reserve in case the weather suddenly becomes unfavorable. This could be why carbon emissions are still rising in Germany, even though Germans pay the highest electricity rates in Europe.

The loss of nuclear energy is serious. According to a Bloomberg New Energy Finance analysis, reactors that produce up to 56 percent of America’s nuclear power may shut down and eventually be replaced by much cheaper gas-fired generators. If that were to happen, the report estimates, an additional 200 million tons of carbon dioxide would be spewed into the atmosphere annually.

But even if nuclear plants weren’t shutting down, we would still lack the infrastructure required to actually make use of the green energy we generate. We may spend heavily on building countless wind and solar farms, but much of that capacity is wasted if we have no way to distribute the electricity, especially since most farms are hundreds of miles away from the nearest city. Even worse, some estimates posit that constructing all the high-voltage lines needed to transport the electricity could take several decades.

This is a huge problem with solar and wind farms right now. Since there is no infrastructure in place to distribute the power and no way to store the energy generated, solar and wind farms across the United States, from Texas to California, are often turned off or left idle, leading to massive energy waste.

Again, despite everything mentioned above, renewable energy is not a bad thing. It is much better to take advantage of solar and wind energy as soon as possible than to wait and do nothing. But mindlessly building more and more solar and wind farms simply because solar and wind energy is “objectively good” will only drag us further from our goal of a cleaner future. It is undeniable that renewable energy can help save the Earth, but that doesn’t mean we should worship it blindly.

Originally published on October 4, 2017, in The Miscellany News as “Renewable energy, while urgent, necessitates skepticism.”