Welcome to SparktheScience, a website that collects all the science articles I have written. As a science enthusiast, my goal is to spark excitement for science and technology in others, and I believe the best way to do so is through clear, clean, and fun writing. This website covers topics ranging from quantum computing to biomimicry, a personal favorite of mine. I hope you enjoy everything this site has to offer!
“Science is simply the word we use to describe a method of organizing our curiosity.” ~Tim Minchin
Some have called it a magic wand. Others have referred to it as the beginning of a new scientific revolution. Regardless of how you may see it, it’s a subject matter that shouldn’t be discussed by only scientists.
CRISPR-Cas9 is the latest state-of-the-art gene editing tool that has taken over the scientific community in recent years. While the concept of modifying DNA is certainly not a new invention, CRISPR’s main strength lies in its transformation of the complicated process of gene editing into something quick, efficient, precise and ridiculously cheap. In other words, it has the potential to cut out undesirable segments of DNA, eradicate hereditary diseases and even guide humanity to a future where people can shape their bodies however they want. It’s why many have stopped dismissing prospects like designer babies as “unlikely” and started calling them “inevitable.”
One area of CRISPR research that has gained a lot of attention recently is the development of gene drive technology, which may give humans the power to modify or even exterminate entire species in the wild. According to evolutionary biologist and gene drive pioneer Kevin Esvelt, the purpose of a gene drive is to use CRISPR to override the traditional rules of Mendelian inheritance and introduce a genetic change in an organism that will be passed down to nearly all of its descendants.
In a typical situation, a parent organism passes a given gene down to only half of its offspring, as per the rules of inheritance discovered by Gregor Mendel, the father of modern genetics. As a result, even if scientists were able to genetically modify organisms in the past, they would still encounter immense difficulty in forcing specific genetic changes across an entire population. With gene drive, however, that 50-50 chance of inheritance can skyrocket to as high as 99 percent. This, of course, has groundbreaking implications.
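To see why that jump matters, here is a minimal sketch of the arithmetic. This is my own illustrative toy model, not one taken from Esvelt’s work: it tracks the frequency of an edited gene variant under random mating, where ordinary Mendelian inheritance means a carrier of one copy transmits it 50 percent of the time, while a gene drive pushes that chance toward 99 percent. Fitness costs and resistance are deliberately ignored.

```python
def next_gen(p, h):
    """Frequency of the edited variant in the next generation.

    p: current frequency of the edited variant
    h: probability that an organism carrying one copy transmits it
       (0.5 = ordinary Mendelian inheritance; ~0.99 = gene drive)
    Under random mating, homozygous carriers (fraction p^2) always
    transmit the variant, while single-copy carriers (fraction
    2p(1-p)) transmit it with probability h.
    """
    return p * p + 2 * p * (1 - p) * h

def spread(p0, h, generations):
    """Track the variant's frequency over successive generations."""
    p = p0
    freqs = [p]
    for _ in range(generations):
        p = next_gen(p, h)
        freqs.append(p)
    return freqs

# Release modified organisms at 1 percent of the population.
mendelian = spread(0.01, 0.50, 20)
drive = spread(0.01, 0.99, 20)

print(f"Mendelian after 20 generations: {mendelian[-1]:.3f}")  # stays ~0.01
print(f"Gene drive after 20 generations: {drive[-1]:.3f}")     # nears 1.0
```

With ordinary inheritance the frequency never moves (the recursion reduces to p' = p when h = 0.5), but with a drive the edited variant roughly doubles each generation at first and sweeps through the whole population within a couple of dozen generations.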
“The ability to edit populations of sexual species would offer substantial benefits to humanity and the environment. For example, RNA-guided gene drives could potentially prevent the spread of disease, support agriculture by reversing pesticide and herbicide resistance in insects and weeds, and control damaging invasive species… [G]ene drives will be capable of influencing entire ecosystems for good or for ill,” stated Esvelt when he first introduced the possibility of using CRISPR to develop gene drives.
We possess the technology to change the world’s ecosystems, but does that mean we should use it? Many people certainly seem to think so, and the proposed benefits seem irrefutable. For instance, one innovative project currently underway is the use of gene drives to eliminate malaria from mosquitoes. Scientists are working on genetically modifying the Anopheles gambiae mosquito, a species known for spreading the malaria parasite, so that the female mosquitoes become sterile. That way, once these modified mosquitoes are released into the wild, they can spread the sterility trait through the population until the species effectively dies off. Other scientists are looking to use gene drives to wipe out invasive species and save endangered native animals.
Esvelt himself has become heavily involved in gene drive technology. His current project aims to reduce the rate of Lyme disease on Nantucket Island in Massachusetts by genetically modifying the island’s white-footed mice to become immune to the disease. Then, ticks will be unable to transfer the bacteria that cause the disease, and the entire transmission cycle will collapse.
However, as promising as all this may sound, it’s doubtful that gene drives will provide a lasting, viable solution. In fact, it’s possible that this technology allows scientists to deal with these serious issues in the wrong way. We may have become too infatuated with how sleek and shiny CRISPR appears to consider better, less risky solutions.
For one thing, ecosystems aren’t so simple that we can just inject new variants of a species into the wild and expect everything to go exactly as we planned. There are too many nebulous factors involved for scientists to be able to correctly predict the outcome of every ecological experiment. One of the test subjects may escape into a different environment or a completely unrelated species may become caught in the crossfire. Most of the time, as Esvelt notes, the gene drive may have little to no effect on the ecosystem at all. Ultimately, it’s arrogant to treat the ecosystem like a math problem with a simple, clean answer.
Even Esvelt seems aware of these limitations, stating, “Let me be the first to say that we do not understand how ecosystems work. They are fantastically complex.”
As if affirming this admission of ignorance, nature itself seems to have knocked gene drive down several pegs. According to a recent report by population geneticist Philipp Messer, the genetically modified mosquitoes that his team designed to pass down an infertility mutation to all their offspring started developing a resistance to the gene drive. In other words, gene drives may not be the permanent solution that many people claimed them to be. “In the long run, even with a gene drive, evolution wins in the end,” Esvelt commented in response to the news.
But that’s not even the worst part. Upon creating a detailed mathematical model that describes what happens when genetically modified organisms are released, Esvelt discovered that the chances of altered genes spreading to unintended parts of the ecosystem were much higher than he originally predicted.
“I [feel] like I’ve blown it … [Championing this idea was] an embarrassing mistake,” Esvelt admitted.
To be honest, the entire idea of gene drives seemed faulty to begin with, mainly because the desired population modifications were not introduced naturally. Instead of working hand-in-hand with evolution, gene drives attempt to solve ecological problems by creating yet another unsustainable arms race, akin to the one between antibiotics and bacterial diseases. For instance, even if gene drives eradicated one species of malaria-spreading mosquito, it wouldn’t be long before a different mosquito species emerged that could spread the parasite to human hosts.
Instead of making sudden, irreversible changes to the ecosystem, a much more reasonable solution is the one offered by evolutionary biologist Dr. Sharon Moalem in his book The Survival of the Sickest. In it, Dr. Moalem describes how the best way to combat diseases like malaria is to change the conditions of the environment so that the nature of the disease evolves in a way that works in our favor. For example, consider how the widespread use of mosquito nets would not only stop mosquitoes from infecting humans but essentially invalidate mosquitoes in general as vectors for the disease. As a result, evolution may provide an alternative way for malaria to spread, perhaps one that wouldn’t cause the parasite to completely incapacitate the body and instead only slightly weaken it so that the disease can spread similarly to the common cold.
Rather than risk a high-stakes gamble on gene-editing technology, it may be wiser in the long run to contemplate less invasive methods to solve our ecological problems. Humans don’t have a great track record to begin with, after all.
For some reason, there is something dignified and respectable about being a scientist. Seeing as science is a career dedicated to the pursuit of truth and knowledge about the natural world using logic and evidence, it’s no surprise that so many people regard scientific research as a noble, almost illustrious profession brimming with success. According to a 2013 report by the Pew Research Center, public views of scientists are largely positive, with 65 percent of Americans believing that scientists contribute a great deal to society—behind only medical doctors, teachers, and the military.
In general, it seems clear that we as a society hold scientists and their work in very high esteem, almost to the point of worship. As a result, ambitious college students and overbearing parents tend to think that the career path of a scientist in academia guarantees a respectable level of fortune and recognition. However, blindly revering anything, from renewable energy to cancer research, often leads to serious consequences.
In short, over-glorifying the scientific profession may motivate people to pursue careers in science, but it also instills in people a set of unrealistic expectations that may crush them in the face of harsh reality.
For instance, when we think of what it’s like to become a scientist, an idyllic story comes into mind: A young but passionate individual enters a prestigious graduate school and immediately begins work on the research project of their dreams. Soon, the experiment becomes wildly successful and the results are published in an esteemed academic publication like Science or Nature, and thus follows a life of wonder and scientific discovery for our intrepid fledgling scientist who aspires to change the world.
Needless to say, you would need the devil’s luck for that to happen to you because scientific research isn’t nearly as idealistic or forgiving as most people want to believe.
For one thing, despite constant calls for more people in the sciences, reports show that the United States is producing too many research scientists—to the point of extreme industry congestion, in fact. According to the 2014 Survey of Earned Doctorates by the National Center for Science and Engineering Statistics, over 54,000 research doctorate degrees were awarded in the U.S. in 2014, the highest number ever recorded by the survey. Of those doctorates, 75 percent were in science and engineering fields, up from 66 percent in 2004.
Although there is an overwhelming number of qualified scientists out there today, there simply aren’t enough desirable science jobs available to support everyone. For many science graduates, the prospect of obtaining a tenure-track professorship at a university is the ultimate goal because it’s one of the few positions in academia that features cutting-edge research and permanent financial security. However, there is such a surplus of PhDs in most fields that the odds of actually achieving that goal are around one in six.
“Whether we like to admit it or not, science today is a pyramid scheme. Over the last two decades, there has been a period of unsustainable growth … As a consequence, it’s child’s play to get a PhD position but almost impossible to secure a faculty job,” remarked David Keays, a biomedical researcher at the Research Institute of Molecular Pathology.
As a result, an overwhelming number of science PhDs in academia end up spending their next four or five years as a postdoc working under a professor for very little pay and meager benefits. For instance, the average postdoc in biomedicine gets paid an annual salary of about $45,000. To put that into perspective, the Bureau of Labor Statistics reports that a typical librarian has a median annual salary of $55,370, while the median annual salary of a postal service mail carrier is $57,200.
Not only that, a recent study found that ex-postdocs in biomedicine make significantly lower wages in the first 15 years of their career than their peers outside academia. And yet, people are desperately vying for these postdoc positions because of the severe lack of academic jobs. Ironically, the 2014 Survey of Earned Doctorates reported that the highest rates of academic employment are reported by doctorate recipients in the humanities and other non-STEM fields with a rate of almost 80 percent, while the lowest rates are reported by engineering—15 percent—and physical science—29 percent—doctorates.
But even once you become a certified scientist, the cutthroat competition doesn’t end. Every year, scientists from around the nation must compete for funding and grants to conduct their experiments. However, grant money is always in short supply and can’t keep up with the rate of young scientists entering the workforce. For instance, the National Institutes of Health, a major funding source for scientists, has been suffering from severe budget cuts for the past several years—all while the cost of conducting experiments has skyrocketed as well. As a result, only about 17 percent of NIH grant applications get approved, a significant decrease from 30 percent in 2000.
The high rejection rate for grant money has fueled a cascade of worrying patterns. For instance, a survey run by Nature found that academic researchers of all ages devote so much time to writing applications and handling other administrative tasks that only about 40 percent of their time goes to actual research. Not only that, researchers are being worked to the bone to juggle all these tasks. A recent poll of more than 8,000 scientists showed that almost 40 percent of respondents work more than 60 hours per week.
Even worse, the ironclad law of “publish or perish” dictates that all scientists must pump out as many research papers as they can, as quickly as possible, or else they risk putting their careers in serious jeopardy. As a result, many scientists in all fields resort to desperate measures to stay afloat, rushing experiments, exaggerating results and cherry-picking evidence. In 2012, researchers at the biotech firm Amgen could reproduce only six of 53 “landmark” studies in cancer research. In fact, one in three researchers stated that they know of a colleague who has “pepped up a paper” through shady means. Many have even turned to profit-driven “predatory journals” to publish their papers, casting scientific credibility into doubt and feeding the dangerous culture of pseudoscience.
It should be clear that the science profession is not as pure and righteous as many may believe, but that doesn’t mean being a scientist is a dead-end job either. Despite all the hardships they face, the majority of scientists seem content with what they do; surveys show that at least 60 percent of scientists are satisfied with their careers.
Simply put, pursuing a career in scientific research is no better or worse than any other job option you may end up choosing. It goes to show that all career paths, no matter how highly society may view them, are fraught with challenges, and those most likely to succeed are the ones who genuinely love what they do.
I must admit, it can be exciting to read about the latest developments in cancer research in the news. There is so much happening in the field of oncology that it’s tempting to imagine a future without cancer just around the corner.
For instance, scientists at Northwestern University have reported that they have found what they call an ancient kill switch in cancer cells. According to the researchers, they may be able to use this mechanism to force cancer cells to kill themselves in multiple ways simultaneously.
Not only that, a revolutionary new form of cancer treatment known as CAR T-cell therapy has swept up the scientific community in an excited fervor. By modifying the T-cells of a cancer patient’s own immune system with gene therapy and then reinjecting them into the patient, researchers have successfully destroyed tumors in people who had lost all hope.
According to various news reports, this treatment was so promising that the U.S. Food and Drug Administration (FDA) recently gave it the green light for production and distribution, making it the first gene therapy to become available to patients in the United States.
“We’re entering a new frontier in medical innovation with the ability to reprogram a patient’s own cells to attack a deadly cancer,” FDA Commissioner Dr. Scott Gottlieb stated after approving the treatment.
As with anything that’s showered with positive attention by the media, however, it’s not as simple as it appears. All the hype surrounding cancer research is actually blinding us to the reality that we are not winning the war against cancer. In fact, despite what headlines may claim, we are nowhere close to actually finding the cure for cancer.
While such a claim may sound needlessly pessimistic, it is vital to view the current trajectory of cancer research within the context of its larger history. For one thing, cancer has plagued humanity for all of recorded history, with fossil evidence and ancient manuscripts dating its pervasiveness as far back as 1600 B.C. Needless to say, countless attempts have been made by renowned scientists and medical experts throughout history in a collective effort to understand and combat this disease. In recent memory, the most notable endeavor is America’s War on Cancer, launched by President Nixon in 1971. From that moment on, the United States has devoted increasingly intensified efforts to finding a cure.
Over the past 40 years, the U.S. has poured a total of more than $500 billion into winning this war. Even now, that war continues to escalate. In 2017, the National Cancer Institute (NCI) received $5.389 billion for the fiscal year, which is $174.6 million more than what the organization received in 2016. In addition, we have around 260 different nonprofit organizations in the United States that raise money for cancer research and treatment. Together, those nonprofit organizations have budgets that top $2.2 billion.
This should be good news, though, right? All of that money is going towards a worthy cause, after all. Indeed, that much is undeniable. However, the problem is that all that money is translating to very little substantive progress in terms of developing a permanent solution. So far, we have made great strides in understanding the nature of cancer cells and how they behave in general. Unfortunately, utilizing that knowledge to create a reliable treatment has so far proven to be much more difficult than anyone had realized.
Despite receiving billions of dollars in funding and conducting countless expensive and laborious drug trials, scientists have yet to develop anything that can meaningfully increase a patient’s chances of survival, much less actually cure the disease. In fact, a recent study published earlier this year reported that two-thirds of all cancer drugs that were approved in the past two years showed no evidence of extending survival at all (USA Today, “Dozens of New Cancer Drugs Do Little to Improve Survival,” 02.09.2017).
When President Nixon announced the War on Cancer, he vowed that cancer would be cured by 1976. Today, cancer remains as deadly as ever. According to the World Health Organization, one in six deaths in the world in 2015 was caused by cancer, resulting in a total of 8.8 million deaths. As a result, cancer is still the second leading cause of death globally, just behind heart disease. However, the death toll from heart disease has decreased significantly over the past several decades. In fact, between 1950 and 2005, the death rate of heart disease dropped by 64 percent. In contrast, the death rate for cancer fell by a mere five percent during that same time period. That’s how little progress we have made, even with billions of dollars in funding supporting decades of scientists’ focused research.
Of course, the last thing I want to do is discourage further cancer research. Despite the rather bleak odds, there are still benefits to continuing this line of inquiry and searching for other treatment options. The point I’m trying to articulate is that the news you hear regarding cancer research tends to be so overly positive that it often fails to accurately depict the reality of the situation. No matter where you look, every new insight is a “major breakthrough,” and every new test product is “a miracle in the making.” By exaggerating successes, the media has effectively deceived the general public into believing that the cure for cancer is just around the corner.
What these articles fail to mention is that CAR-T therapy can be incredibly dangerous because it needs to bring your body to the brink of death in order to save you. While the genetically engineered T-cells spread and kill the tumor cells, the patient undergoes a series of intense side effects that are so sudden and severe that a team of top specialists must remain on standby to keep the patient alive.
And sometimes, not even that is enough. So far, several patients have died from neurotoxicity complications during the clinical trials, and experts still haven’t pinned down the exact cause of their deaths. Because CAR-T therapy is so risky and complex, experts warn that it’ll take years before a treatment like this is safe for patients to use. It is certainly not the miracle cure that the media is making it out to be. It’s not even effective against all cancers; CAR-T therapy has mainly been used to treat blood cancers like leukemia and struggles against solid tumors like sarcomas.
Does this mean that CAR-T therapy is a lost cause? Absolutely not. Medical experts are justified to claim that this immunotherapy treatment is a legitimate revolutionary breakthrough in a field that has largely stagnated over the past several decades. This is a major accomplishment, and the cancer survival stories prove that fact. But the issue is that for the past 40 years, the media has consistently trumpeted the end of cancer with every trivial development. By bombarding the public with overly exaggerated tales of successes, the media has essentially deluded the country into believing that we are winning the war against cancer and that all cancer patients have a good chance of not only surviving but also returning to their normal lives. But such rose-colored views are far from the truth and have broken families apart.
As Dr. Otis Brawley, the chief medical officer at the American Cancer Society, explained, “We have a lot of patients who spend their families into bankruptcy getting a hyped therapy that [many] know is worthless…[Some choose a medicine that] has a lot of hype around it and unfortunately lose their chance for a cure.”
It’s already heartbreaking for patients and their loved ones to learn that they have cancer. It feels infinitely worse to undergo several “life-saving” treatments performed by doctors at the country’s best hospitals only to learn that none of it actually works. Consider the tragic story of Michael Uvanni and his brother James, a patient with skin cancer. After hearing about all the miracle treatments that were supposedly available and seeing happy commercials of cancer patients hugging their grandchildren, they felt confident that the odds were in James’ favor. That optimism led to crushing disappointment when his health continued to suffer, even after trying immunotherapy and several other experimental treatments. Three years after his diagnosis, James passed away from metastatic melanoma.
“I thought they were going to save him…You get your hopes up, and then you are dropped off the edge of a cliff. That’s the worst thing in the world,” confessed Michael Uvanni.
This sort of duplicitous optimism, unfortunately, permeates the entire field of oncology. While newspapers hype research results to attract readers, drug companies make outrageous promises to boost sales and hospitals draw in paying customers by appealing to their hopes and overstating their accomplishments. Many scientists have also fallen victim to this mindset, often exaggerating the successes of their own research results to attract investors. Back in 2003, Dr. Andrew von Eschenbach, the director of the National Cancer Institute, announced the possibility of “eliminating suffering and death due to cancer by 2015.” Even President Obama contributed to the illusion when he announced the Cancer Moonshot project in 2016 by saying, “Let’s make America the country that cures cancer once and for all.”
Given all these overly positive messages, it’s no wonder that so many cancer patients believe that their lives are guaranteed to be saved, only to feel crushed when they learn the awful truth. Let’s be clear: There is no miracle cure for cancer. According to the American Cancer Society, the percentage of people who are alive five years after being diagnosed with stomach cancer is 29 percent. For lung and bronchus cancer patients, the number is 18 percent. For pancreatic cancer patients, it’s 7 percent. Patients with metastatic melanoma typically die within a year of diagnosis. Despite what you may hear, immunotherapy can cause fatal immune system attacks on the lungs, kidneys, and heart. There are no approved immunotherapies for breast cancer, colon cancer or prostate cancer. Not only that, studies have found that immunotherapy only benefits about 10 percent of all cancer patients.
As grim as all this may be, we must remember that not all hope is lost. That said, the last thing cancer patients need right now is to be blindsided by all the fanfare that seems to accompany every piece of cancer news.
For a long time, teenagers have been characterized—generally by those older than them—as overly moody, self-centered and irrational. It’s not uncommon for adults to complain about how millennials are emotionally unstable, or to brush aside their problems as typical “teenage angst.” But in reality, today’s teenagers have been rather upstanding: rates of smoking, drinking and teen pregnancy have all declined over the past few decades.
Yet despite all the good news, researchers have noticed a disturbing trend: American adolescents are developing serious mental health problems at an alarming rate. According to the Department of Health and Human Services, about three million teenagers ages 12 to 17 had at least one major depressive episode in 2015 alone, and more than two million teens reported experiencing depression that impairs their daily activities. What’s even more startling is that this number is predicted to increase. According to a study that tracked depression among young adults across the country, the number of teenagers who reported having symptoms of low self-esteem and problems with sleep and concentration rose by 37 percent just between 2015 to 2016.
And it’s not just depression. Researchers have found that cases of anxiety have spiked in recent times. According to the Anxiety and Depression Association of America (ADAA), anxiety disorders have become the most common mental illness in the United States, affecting 18.1 percent of Americans every year. In fact, the National Institute of Mental Health reported that about 6.3 million teens in the U.S. have an anxiety disorder of some kind. Unfortunately, this widespread phenomenon is not just affecting middle- and high-school students. Anxiety has overtaken depression as the most common reason college students seek counseling services. According to the American College Health Association, the share of undergraduates reporting “overwhelming anxiety” rose significantly from 50 percent in 2011 to 62 percent in 2016.
It’s not normal “teen angst” anymore; it’s a full-scale epidemic that is bound to get worse over time if ignored. But what can be the cause of such a shocking national trend? Unfortunately, not even the researchers know for sure. Usually, there are several conspicuous reasons for adolescents to feel depressed or anxious. Being raised in abusive households, living in poverty or being surrounded by violence are all understandable causes of emotional instability. Yet, teenagers who live in well-off communities and who seemingly should have nothing to worry about tend to suffer the most. What could possibly be causing these adolescents such grief?
Rather than one definite answer, it is most likely the result of several interwoven factors. For instance, anxiety and depression are shown to have a biological component. Scientists have already located several genes that may influence the risk of developing an anxiety disorder, such as variants of the GLRB gene, which has been linked to responses in the brain that cause us to become startled or overly fearful. However, there are other relevant biological factors besides genetics. Just recently, scientists have discovered that our gut bacteria may influence the functioning of brain regions such as the amygdala and the prefrontal cortex, both of which are heavily linked to anxiety and depression. These studies found that mice with an imbalance in their gut microbiome were more likely to display anxious and depressive behaviors.
However, many experts agree that environment likely plays a larger role in the rise of mental health issues in adolescents than genetics or gut bacteria. More specifically, researchers suspect that this epidemic of intense anxiety and depression in teens may be caused by the overwhelming pressure placed on them not only to succeed but to perform better than everyone else. As a result of this pressure, both high school and college students have reported that their biggest stressor is the fact that no matter what they do, it’s never enough.
“Teenagers used to tell me, ‘I just need to get my parents off my back.’ [But now,] so many students have internalized the anxiety. The kids at this point are driving themselves crazy,” stated Madeline Levine, a practicing psychologist and a founder of a non-profit that works on school reform. This news probably comes as a surprise to no one. In 2013, the American Psychological Association reported that American teenagers have become the most stressed age-group in the United States. Various culprits are likely at fault, including sleep deprivation, the uncertainty surrounding job security and the fear of not living up to people’s expectations.
Researchers have also assigned blame to the prevalence of social media and technology. With everyone connected to the internet, it’s difficult for teens to avoid constantly comparing themselves with their peers and worrying about their digital image. Unsurprisingly, many anxious teenagers agree that social media has had a negative influence on their mental health. According to accounts by teenagers attending Mountain Valley, a residential treatment facility for adolescents suffering from severe anxiety disorder, social media played a large role in lowering self-esteem and provoking feelings of anxiety. Not only that, the students also talked about how their smartphones provided a false sense of control, which they could use to avoid talking to people and escape the stresses of school.
As a result, several experts suspect that there may be a connection between the extreme spike in anxiety and depression in recent years and the widespread adoption of the iPhone. As Jean Twenge, a professor of psychology at San Diego State University, puts it, these dramatic trends in teen mental health issues started “exactly the moment when the proportion of Americans who owned a smartphone surpassed 50 percent.”
In the end, researchers have yet to find a conclusive answer to this troubling phenomenon. However, they agree that there is a disturbing lack of resources available to help young adults who are currently struggling with these problems. Studies show that despite the rise in mental health issues, there hasn’t been a corresponding increase in mental health treatment for teenagers and young adults. Not only that, it’s highly likely that the number of adolescents who are actually struggling with anxiety and depression is greater than the reported figure, since many people choose not to seek help. In fact, the Child Mind Institute reported in 2015 that only 20 percent of young people with a diagnosable anxiety disorder get treatment.
Thus, it is important to understand the true gravity of the situation and reach out to those who need help. During these times of uncertainty and hardship, it’s crucial for us to take the time to understand these individuals and aid them as best as we can rather than brush their problems aside as mere trivialities.
During these troubling times of environmental turmoil, in which dangerous levels of carbon dioxide emissions threaten to destabilize the global climate, it’s no surprise that a lot of people are pushing vehemently for greater investment in renewable energy. In fact, despite the childish clamoring of several anti-science government officials, the idea of renewable energy, especially solar and wind energy, is incredibly popular among the vast majority of Americans.
In 2016, the Pew Research Center reported that 89 percent of Americans favor building more solar panel farms and 83 percent favor constructing more wind turbine farms. In contrast, only about 41 percent of Americans wanted to expand the coal mining sector, and these numbers aren’t meaningless, either. According to the Renewables 2016 Global Status Report (GSR), renewable energy saw its largest annual increase in energy contribution ever in 2015, despite low prices for fossil fuels.
It’s pretty clear that a large majority of people hold solar and wind energy in high regard. I’d even go so far as to say that in this modern, socially conscious age, there isn’t a term more associated with pure good than renewable energy. However, this blind infatuation may end up jeopardizing our entire fight against climate change. But how in the world could renewable energy possibly be a bad thing?
To better illustrate my point, consider the incredible amount of attention and fanfare that the Idaho-based startup company Solar Roadways Inc. got for its idea to replace all the roads in America with structurally engineered solar panels that could generate backup electricity while withstanding vehicle traffic. Founded in 2006, this startup presented a vision of a world in which solar panel roadways not only use LED lights to light up the streets and change the road design but also power entire cities to create a cleaner, greener world.
When people heard about this revolutionary new idea, they fell madly in love with the concept of solar roadways. During the crowdfunding drive at Indiegogo, more than 50,000 backers supported the project and the startup raised more than $2 million, making it the most popular Indiegogo campaign ever. But it wasn’t just green-energy enthusiasts who contributed financially to this enterprise. Even the Department of Transportation stepped in and invested more than $1.6 million into the project.
Unfortunately, all of it turned out to be a bust. When 30 solar roadway panels were finally installed on a public walkway in 2016, 25 of them broke down within a week, and more malfunctions appeared once it rained. But even more disappointing was that the highly anticipated solar roadway, even when fully operational, generated an average of just 0.62 kilowatt-hours of electricity per day—enough to run a typical hairdryer for about 25 minutes, and nowhere near enough to power an entire city.
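To put that figure in perspective, here’s a quick back-of-the-envelope check. The 0.62 kWh/day number comes from the installation described above; the 1,500-watt hairdryer rating is my own assumed typical value, not a figure from the project:

```python
# Back-of-the-envelope check of the solar roadway's reported output.
# DAILY_OUTPUT_KWH comes from the 2016 installation; the hairdryer
# rating is an assumed typical value.

DAILY_OUTPUT_KWH = 0.62   # reported average daily generation (kWh)
HAIRDRYER_KW = 1.5        # assumed power draw of a typical hairdryer (kW)

minutes_of_hairdrying = DAILY_OUTPUT_KWH / HAIRDRYER_KW * 60
print(f"One day's output runs a hairdryer for {minutes_of_hairdrying:.0f} minutes")
```

In other words, an entire day of generation buys you roughly one blow-dry.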
But solar roadways aren’t the only inventions that took advantage of people’s infatuation with renewable energy. In February, a startup company raised more than $350,000 on Indiegogo when it promoted the Fontus water bottle, a self-filling water bottle that uses solar energy to extract water from the air. According to the campaign video, Fontus is designed to draw air into the bottle and capture moisture through condensation as the air cools. Not only that, the device would be powered by a small, mousepad-sized solar panel, making the Fontus perfect for backpackers and bikers going on a trip. Again, problems appeared when scientists pointed out that a solar panel that small is never going to produce the amount of energy needed to make the whole thing work. In fact, it would require a huge, 250-watt, 16-square-foot solar panel working at 100 percent efficiency under ideal circumstances for the Fontus to even come close to fulfilling its promise.
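The physics behind that estimate can be sketched in a few lines of arithmetic: condensing water vapor releases latent heat that the device must continuously pump away, so the latent heat of condensation sets a hard floor on the power required. The half-liter-per-hour target and the generous coefficient of performance (COP) of 1 below are my own illustrative assumptions, not figures from the Fontus campaign:

```python
# Rough lower bound on the electrical power a Fontus-style condenser
# would need. The production target and COP are illustrative
# assumptions; real Peltier coolers typically perform worse than COP 1.

LATENT_HEAT_J_PER_KG = 2.26e6   # heat released when water vapor condenses
TARGET_KG_PER_HOUR = 0.5        # assumed production target (0.5 L/hour)
COP = 1.0                       # optimistic coefficient of performance

heat_to_remove_w = TARGET_KG_PER_HOUR * LATENT_HEAT_J_PER_KG / 3600
electrical_power_w = heat_to_remove_w / COP
print(f"Minimum electrical power: {electrical_power_w:.0f} W")
```

Even under these charitable assumptions, the floor lands in the hundreds of watts—orders of magnitude beyond what a mousepad-sized panel can deliver.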
It’s not just solar energy, either. In 2016, the startup VICI Labs made headlines when it promoted the Waterseer, a device that uses the wind to “provide up to 11 gallons of safe drinking water” from the air every day. Raising more than $330,000 on Indiegogo, the inventors behind the Waterseer made it seem as if their invention could end all water shortages thanks to the clean power of wind energy, persuading UC Berkeley and the National Peace Corps Association to contribute to its development. Once again, the power of green energy was overestimated: several thermodynamicists have pointed out that the Waterseer wouldn’t work in dry, arid areas—the places that need water the most.
The reason all these bogus crowdfunding campaigns made so much money despite being scientifically dubious is that so many people were willing to believe that renewable energy sources could accomplish anything, even the impossible. They had such a positive outlook on solar panels and wind turbines that they didn’t stop to consider the possible limitations of those technologies. Of course, this overly optimistic mindset is a natural product of today’s society, in which the increasingly alarming news of humanity’s pollutant-ridden path toward ruin makes it seem as if renewable energy is our only hope for survival. But no matter how beneficial it may be, renewable energy should not be placed on a pedestal. We can’t afford to treat it like some kind of magical energy source that provides unlimited free electricity without any restrictions or drawbacks.
For example, many people tend to think solar panels can provide unlimited energy because they are powered by sunlight, which should be infinite, right? In reality, however, a typical solar panel converts only about 20 percent of the sunlight that strikes it into electricity. In addition, unless it is specifically designed to track the movement of the sun, a fixed panel can miss up to 60 percent of the sunlight a tracking panel would capture, on top of that lackluster 20 percent conversion rate. Not only that, the hotter a solar panel gets, the less efficient it becomes. It may sound counterintuitive, but for every degree Celsius a typical panel’s temperature climbs above 25 degrees, its maximum power output drops by about 0.5 percent.
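Stacked together, those derating factors add up quickly. Here’s a rough sketch of the arithmetic, using an assumed 300-watt nameplate rating and the round-number factors quoted above:

```python
# Sketch of how the derating factors described above compound, applied
# to a nominal 300 W panel (an assumed rating, not from the article).

NOMINAL_W = 300.0          # panel rating under standard test conditions
FIXED_MOUNT_FACTOR = 0.4   # a fixed (non-tracking) panel may capture as
                           # little as 40% of what a tracking mount would
TEMP_COEFF = 0.005         # ~0.5% of output lost per degree C above 25 C

def effective_output(cell_temp_c, tracking=False):
    """Very rough effective output in watts under the stated assumptions."""
    power = NOMINAL_W if tracking else NOMINAL_W * FIXED_MOUNT_FACTOR
    overheat = max(0.0, cell_temp_c - 25.0)
    return power * (1 - TEMP_COEFF * overheat)

# A fixed panel whose cells run at 45 C on a hot day:
print(effective_output(45))   # 300 W * 0.4 * 0.9 = 108 W
```

A “300-watt” panel quietly delivering around a third of its rating on a hot afternoon is exactly the kind of gap between nameplate and reality that gets glossed over.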
This isn’t to say that renewable energy is terrible or that we should give up on it. While not entirely efficient, solar and wind power still produce electricity without consuming any limited resources. Yet we can’t delude ourselves into thinking that solving climate change is as simple as building more solar farms and wind turbines.
In fact, doing so without proper planning might do more harm than good. One major consequence of our infatuation with green energy is the rapid decline of nuclear power, the main source of zero-carbon electricity in the United States. Thanks to the popularity of solar and wind farms, nuclear power plants all across the world are on the verge of shutting down for good, which could severely damage our efforts in fighting climate change.
First of all, despite the negative press that it gets, nuclear energy remains quite possibly the cleanest and most viable form of energy that we currently possess. No matter what sort of Greenpeace propaganda you may have heard, nuclear energy is the safest way of producing reliable energy, a statement backed by the World Health Organization, the Centers for Disease Control and the National Academy of Sciences. In fact, a 2010 study by those three organizations found that nuclear power is 40 percent less deadly than the next safest form of energy, wind power. Nuclear energy is also tied for having the lowest carbon footprint, and unlike solar and wind energy, nuclear energy actually stands a chance against the natural gas and coal industries. According to the U.S. Energy Information Administration, solar and wind power made up a combined seven percent of U.S. electricity generation in 2016, while nuclear energy provided 20 percent.
But if the problem is that renewable energy isn’t contributing as much as nuclear energy, then can’t we solve this issue by building more solar and wind farms? No, it’s not that simple. One of the biggest problems with solar and wind energy is that they are entirely dependent on the current weather. When the sun doesn’t shine or the winds stop blowing, energy production plummets. Of course, this wouldn’t be an issue if one could store the excess energy generated on an especially sunny or windy day, but as of right now, a large-scale method of storing the electricity generated by solar and wind farms does not exist. As a result, whenever the weather is unfavorable, state governments must find an alternative energy source. What do they turn to now that many of the expensive nuclear plants are shut down? Answer: natural gas and fossil fuels.
This isn’t just a hypothetical scenario. In South Australia, a state where wind energy makes up more than a quarter of total electricity generation, the government had to switch a mothballed gas-fired plant back on when electricity prices spiked during a period of light wind. Meanwhile, despite investing heavily in green energy, the German government is reportedly paying billions to keep coal generators in reserve in case the weather suddenly becomes unfavorable. This could be why carbon emissions are still rising in Germany, even though Germans pay some of the most expensive electricity rates in Europe.
The loss of nuclear energy is serious. According to a Bloomberg New Energy Finance analysis, reactors that produce up to 56 percent of America’s nuclear power may shut down and eventually end up becoming replaced by the much cheaper gas-fired generators. If that were to happen, the report estimates, an additional 200 million tons of carbon dioxide would be spewed into the atmosphere annually.
But even if nuclear plants weren’t shutting down, we still lack the infrastructure required to actually utilize the green energy we generate in the first place. We may spend heavily on building countless wind and solar farms, but much of that investment is wasted if we don’t have a way to distribute the electricity, especially since most farms are hundreds of miles away from the nearest city. Even worse, some estimates posit that constructing all the high-voltage lines needed to transport the electricity could take several decades.
This is a huge problem with solar and wind farms right now. Since there is no infrastructure in place to distribute the power and no way to store the energy generated, solar and wind farms across the United States, from Texas to California, are often turned off or left idle, leading to massive energy waste.
Again, despite everything mentioned above, renewable energy is not a bad thing. It is far better to take advantage of solar and wind energy as soon as possible than to wait and do nothing. But mindlessly building more and more solar and wind farms simply because solar and wind energy is “objectively good” will only drag us further away from our goal of a cleaner future. It is undeniable that renewable energy can help save the Earth, but that doesn’t mean we should worship it blindly.
Let’s do a little experiment. Read the following headlines from these recently published scientific articles and try to find the one thing that all of them have in common: “The Pancreas Provides a Potential Drug Candidate for Brain Disease,” “Chimera Viruses Can Help the Fight Against Lymphomas,” “What Was Once Considered Cell Waste Could Now Treat Pancreatic Cancer,” “Cellular Tango: Immune and Nerve Cells Work Together to Fight Gut Infections,” “Scientists Reveal Fire Ant Venom Could be Used as a Skin Treatment.” The answer? All of the listed studies are based on the results of experiments conducted on mice. And that is a huge problem.
Using lab mice to understand how the human body works is nothing new. The practice officially began in 1902, when French biologist Lucien Cuénot used mice to research the nature of genes. Inspired by the work of Gregor Mendel, the father of modern genetics, Cuénot wanted to see if Mendel’s laws of inheritance applied to more than just peas. Until then, Mendelian inheritance had only been demonstrated in plants, so Cuénot’s discovery that animals follow the same laws sent shockwaves across the scientific community.
Not long after, more scientists began to use mice to explore the field of genetics, establishing mating programs that created inbred strains of mice and leading efforts to fully map the mouse genome. As decades went by, lab mice skyrocketed in popularity and ended up contributing to numerous award-winning discoveries. Out of the 106 times the Nobel Prize for Physiology or Medicine has been awarded so far, 42 of them involved research on mice or rats in some major way. These studies include the discovery of penicillin, the yellow fever vaccine, the polio vaccine and the HIV-AIDS virus.
It is easy to see how the lab mice became such an iconic symbol of biomedical research. Xavier Montagutelli, the head of the animal facilities at Institut Pasteur in Paris, explains, “[Mice] are small and inexpensive, they reproduce quickly… and they age quickly too, making them ideal for studying age-related complaints. We know how to freeze their embryos, sperm, and ova. We now know how to manipulate their genes…They are remarkable tools.”
Unfortunately, the acceptance of mice as the ideal test subject has led to the rigid assumption that they are some kind of prototypical “blank slate” mammal rather than a species with its own unique features and body mechanisms. As a result, the field of biomedicine has built an entire infrastructure of knowledge around these rodents and has become dependent on their bodily responses to measure clinical success. But mice simply don’t work well as models of human disease, much less as stand-ins for human drug response.
For instance, scientists have used mice to research tuberculosis treatments for decades. However, mice respond to the disease in a drastically different manner than humans do. For one thing, mice don’t cough and aren’t contagious when they have the disease. In addition, the human body mounts an immune response when it detects the bacterium responsible for tuberculosis; mice lack this response—they simply get the disease and die. So it’s no surprise that scientists have found an antibiotic, linezolid, that works spectacularly well on human patients but not on mice.
The opposite can happen as well. In the late 1950s, German doctors prescribed thalidomide under the drug name Contergan to pregnant women to alleviate morning sickness. Since the drug had appeared safe in mice, they assumed the same would hold for humans. Instead, Contergan led to countless birth defects, and only 40 percent of the affected children survived. And this isn’t just a fluke, either. Dr. Jean-Marc Cavaillon, head of the cytokines and inflammation unit at Institut Pasteur, explained how researchers discovered a monoclonal antibody that treats inflammatory conditions in mice but would send human patients to intensive care. “Mice are great for basic research, for understanding overall patterns and grasping mechanisms. But once you start modeling a human disease to find the right treatment, you run up against major differences between us and mice,” he said.
As a result, drug treatments that were successfully tested in mice have a high chance of failure when tested on humans. According to a 2014 study on animal models, researchers have found that, on average, less than eight percent of experimental cancer treatments have successfully transitioned from animal testing to clinical cancer trials. Similarly, researchers trying to find a cure for ALS have submitted about a dozen experimental treatments for clinical testing over the past decade after finding success in mice. But when tested on humans, all but one of them failed and the one that didn’t only showed marginal benefits.
It also doesn’t help that these clinical trials are ridiculously expensive—we’re talking about hundreds of millions of dollars and years’ worth of time. In October 2014, the New England Journal of Medicine published a report about how the clinical trials of three tuberculosis treatments ended in complete failure, despite promising results in lab mice. According to the head researcher, the clinical trials alone cost more than $200 million.
But that raises the question: Can we find a suitable replacement for the lab mouse? Unfortunately, no one can say for sure. It’s not as if replacing mice with a different animal would solve everything, since animal testing as a whole remains rather dubious. So far, there are only two major alternatives, computer models and in vitro cell cultures, neither of which offers much of a substitute, since neither captures the complex interactions of living systems.
In addition, the push to stop using lab mice has been very controversial within the scientific community, especially among those who would rather turn a blind eye to the issue. Simply put, lab mice are incredibly cheap, convenient and easy to handle. Such an initiative would also place a large portion of biomedical research in jeopardy and cast a shadow of doubt across countless pre-existing studies on disease treatment. Scientists today continue to experiment on mice and spend millions of dollars on clinical trials, only to wonder why their product didn’t work. But what other choice do they have?
A survey of the National Library of Medicine’s database showed that experiments conducted on mice and rats make up almost half of the 20 million academic citations across the field of biomedicine. Despite all the problems they have caused, lab mice remain deeply entrenched in the field of medical research. Clearly, this isn’t a problem that can be solved in a single day.
But what’s even worse is that many news publications are making it seem as if these experimental treatments have worked on human patients and are bound to hit the shelves in the near future. Remember those headlines mentioned in the beginning of this article? All those articles were based on mouse studies and yet none of them mentioned the word “mice” in the headline. It’s sloppy journalism like this that helps fuel people’s doubt and confusion toward the sciences. In the end, one must always remain diligent when reading about the latest discoveries and findings. Science is already a difficult field to grasp, and diving into the literature blindly won’t make things any easier in the long run.
Earlier this September, billionaire entrepreneur Elon Musk stirred up a huge Twitter-storm when he posted that global competition in artificial intelligence (AI) superiority could potentially lead to World War III. This tweet came after Russian President Vladimir Putin declared, “Whoever becomes the leader in [artificial intelligence] will rule the world,” to which Musk tweeted, “It begins….”
Of course, Elon Musk is rather infamous for making grandiose predictions and promises that often fail to come true. In 2016, he announced that his company SpaceX would master space travel and colonize Mars as early as 2024, only to walk the plan back less than a year later when its timeline proved unfeasible. Still, his tweet about World War III is only the latest of many warnings he has issued about the dangers of artificial intelligence, some going so far as to reference the “Terminator” movies.
“AI is a fundamental existential risk for human civilization, and I don’t think people fully appreciate that,” he stated at the 2017 National Governors Association in Rhode Island.
But is the situation really that dire? Massimiliano Versace, CEO of the robotics and computing company Neurala, argues that these doomsday predictions surrounding AI are largely unsubstantiated. In fact, his biggest complaint so far is that non-experts like Musk, who have no clue how AI actually works, seem to be dominating the discussion. In contrast to Musk’s warnings, Versace says that it is much too early to start regulating AI and that doing so would hinder innovation.
Several other critics have also voiced opinions on the robot-apocalypse scenario that Musk seems to predict. Larry Page, co-founder and CEO of Google, made the case that AI is designed to make people’s lives easier so that they have more time to pursue their own interests. Likewise, Facebook’s Mark Zuckerberg compared fears about AI to early fears about airplanes and encouraged people to “choose hope over fear.”
On the other hand, it’s not like Musk is the only dissenting voice in the room. Renowned theoretical physicist Stephen Hawking similarly expressed how artificial intelligence could spell the end of the human race, and Microsoft’s Bill Gates voiced his worries that AI might become a problem after it becomes intelligent enough to dominate the workforce.
However, rather than a “Terminator”-style takeover, the bigger concern for me from a cultural perspective is the direction that AI might take the world in.
It’s undeniable that today’s society places a disproportionate amount of attention on science and technology over any other discipline. Given how dependent on machines we’ve become, it’s no surprise that so many people hold degrees in math-intensive STEM subjects such as computer science, robotics, and electrical engineering and that we place these individuals on lofty pedestals. As a result, pursuing a degree in the humanities is widely seen as a high-risk gamble considering the increasingly bloodthirsty modern arena known as the job market. But the problem here is that the widespread implementation of AI will likely exacerbate this issue even further.
Last March, U.S. Treasury Secretary Steve Mnuchin brushed aside all concerns about AI and stated that “In terms of artificial intelligence taking over the jobs, I think we’re so far away from that that it’s not even on my radar screen.” Unfortunately, Mnuchin couldn’t be more wrong. In reality, AI has already started to seep into the workforce.
Let’s list some examples. In San Francisco, Simbe Robotics’s Tally robot can navigate around human shoppers at the supermarket to make sure that everything is stocked, placed and priced properly. Meanwhile, in Japan, Fukoku Mutual Life Insurance has already replaced 30 of its employees with an AI system that can analyze and interpret data better and much faster than a human can. Artificial intelligence is also replacing financial analysts in the business sector simply because it can predict market patterns faster.
Not only that, careers thought to be safe from the encroaching tech revolution—such as journalism and teaching—are now at risk as well. For instance, companies such as Narrative Science and Automated Insights have created AI bots that write countless business and sports articles for clients like Forbes and the Associated Press. The United States military also relies on a computer-generated virtual therapist to screen soldiers in Afghanistan for PTSD, and physical robots are being used in Japan and Korea to teach English. Even actors could be replaced by some kind of technological innovation like with Grand Moff Tarkin in “Rogue One: A Star Wars Story.” Given the efficient and cost-effective nature of AI, it won’t be long until these systems are used in practically every industry.
Of course, there are various reassuring arguments out there. A common response is that new jobs will naturally emerge as old ones disappear. But exactly what kind of job do you think will be in demand once more and more companies adopt AI? A really insightful article by Clive Thompson has a headline that states it best: “The Next Big Blue-Collar Job Is Coding.” Sure, jobs won’t completely disappear, but I predict that the tech industry will be the only area in dire need of employees.
Another common response is that a greater focus on STEM education will eventually solve everything. Jenny Dearborn, an executive at the software company SAP, argues that young people today have a responsibility to become more educated in technology. “If you want to do health care, major in tech with a healthcare focus. If you want to be a neuroscientist, major in tech with that focus,” she emphasized.
However, that’s easier said than done. The United States already lags behind in STEM education compared to the rest of the world, and considering how our current Secretary of Education is a billionaire who has spent millions of dollars fighting against government regulations and crippling teachers’ unions by taking away their right to strike, I’m not feeling too optimistic. Plus, what if you’re simply not naturally inclined toward skills in STEM? What about people who just don’t enjoy it?
Obviously, the last thing I want to do is bash the STEM disciplines and discourage people from pursuing STEM careers. I truly believe that science and technology can inspire wonder and excitement for everyone. However, I worry that students who discover their passions in the humanities will likely end up squeezed to death under the STEM-oriented educational system even more than they do today. As a college student who once had plans of majoring in the humanities, I’d hate to imagine what job searching will be like in a future where AI made has made that notoriously grueling, overly competitive process even harder.