Brief Overview

Welcome to SparktheScience, a website that collects and assembles all the science articles I have written. As a science enthusiast, my goal is to encourage and generate excitement for science and technology in other people, and I believe the best way to do so is through clear, clean, and fun writing. This website will cover topics ranging from quantum computing to biomimicry, a personal favorite of mine. I hope you enjoy everything this site has to offer!

“The imagination of nature is far, far greater than the imagination of man.” ~Richard Feynman


An Unexpected Risk: Why Deodorant, Perfume Are Not Necessary for Good Hygiene

Picture Credit: DontSmellBad.com

When it comes to presenting yourself in public, one of the biggest worries many people have is body odor. I’m sure many of us were given “the talk” during puberty and told that we have to start paying attention to how we smell. Even in middle school and high school, we’re bombarded with messages about how we have to do everything we can to make our bodies smell as pleasant as a field of wildflowers. During health class, our health/P.E. teachers never fail to mention how using deodorant is crucial for good hygiene. At the same time, girls are taught by both their friends and popular media to incorporate perfume into their personal grooming.

As a result, applying these chemicals to our bodies has become a normal part of our lives. It isn’t uncommon to see advertisements that depict expensive cologne as something “sexy,” or magazines like Cosmopolitan giving popular advice like “Carry a travel foot spray in your purse” and “Spray your bare torso with fragrance.”

As such, it should come as no surprise that the deodorant and perfume industries make an enormous profit every year. According to Emmanuelle Moeglin, Global Fragrance and Colour Cosmetics Analyst at Mintel, the United States continues to be the biggest market for deodorant worldwide, hitting over $3 billion in sales in 2015 and showing growth of nearly 5 percent in a single year.

The perfume industry is even bigger: the global perfume market generates about $28.95 billion in annual sales revenue, with the U.S. market making up $6.1 billion of that total. Perfumes aren’t exactly known for being cheap, either. About 46 percent of designer perfume brands are priced at over $75, and a single bottle can cost up to $440.

“[I]t should come as no surprise that the deodorant and perfume industries make an enormous profit every year.”

But is all this necessary? We’re told that these products are essential for our daily lives, but is that really true? According to researchers, our frequent use of aerosol products like deodorant and perfume may not only be unnecessary but may also contribute to the declining health of our planet. The first misconception to clear up is how deodorant and perfume actually work.

For one thing, deodorant doesn’t actually target the underlying cause of the bad smell you’re trying to prevent. Despite its notoriety, sweating is a pivotal mechanism for maintaining proper homeostasis. It not only helps regulate body temperature but also flushes toxins out of your skin, prevents the buildup of excess salt and calcium and helps fight dangerous pathogens. Sweat also typically doesn’t have a smell. The terrible odor we often associate with sweat is actually caused by skin bacteria that break down the components of sweat.

What deodorant does is mask the smell with a more pleasant fragrance or kill the bacteria on the skin that are causing it. There is also a subcategory of deodorants called antiperspirants, which use aluminum salts to block the sweat glands, cutting off the sweat that odor-causing bacteria feed on.

However, the most important thing to know about deodorants is that they’re not as necessary as we tend to believe. In fact, when the first deodorant came out in the late 19th century, very few people actually used it, mainly because they handled the body odor problem by washing regularly. Deodorants only started gaining popularity when advertisers began targeting the insecurities of young women in the early 1900s by convincing them that they “carried repellent odor.” The marketing strategy worked, and by 1927, sales of deodorant had reached $1 million.

“Deodorants only started gaining popularity when advertisers started targeting the insecurities of young women in the early 1900s…”

According to Dr. Joshua Zeichner, director of cosmetic and clinical research at Mount Sinai Hospital, the use of deodorant is dictated more by social norms than by good health practices. In fact, several experts question the safety of some of the chemicals found in many deodorants and antiperspirants. For instance, research has shown that chemical compounds known as parabens, which may interfere with the body’s hormone levels, are often used as preservatives in deodorant. While there is no conclusive evidence linking parabens to cancer, lab results suggest that they may promote the growth of cancer cells in both men and women.

Another disturbing ingredient class found in deodorant is phthalates, which may impact fetal development in pregnant women. Scientists have also linked phthalates to higher rates of asthma.

Of course, this doesn’t mean that deodorant actively harms your body; there isn’t any concrete evidence of that yet. However, several studies have shown that many people who use deodorant don’t smell bad and don’t even need it, yet they continue using it because the habit has become so ingrained in our society.

On the other hand, what about perfume? Typically, a perfume product is made up of alcohol, water and various molecules that are designed to evaporate at room temperature. A fun fact about perfumes is that they don’t produce the desired fragrance all at once. Instead, almost all perfumes are engineered so that three different types of chemicals become active during three different phases.

The first phase is composed of “top notes,” chemicals that you smell immediately when you apply the perfume, but which evaporate completely after 15 minutes. In the second phase, chemicals known as “heart notes” come into play after about three hours. The smells produced by the heart notes are what you typically associate with the perfume. Finally, the “base note” chemicals appear five hours after application; these boost the strength of the other scent notes.
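To make that timeline a bit more concrete, here is a purely illustrative sketch in Python of the three-phase structure described above. The cutoffs come straight from the rough timings quoted in this article; they are not a model of real fragrance chemistry, and the function name and structure are my own invention for demonstration.

```python
# A toy sketch of the three-phase structure described above. The cutoffs are
# the rough timings quoted in the article, not real fragrance chemistry, and
# real perfumes blend these phases continuously rather than switching cleanly.

def active_notes(minutes_since_application: float) -> list[str]:
    """Return which note families you would roughly expect to perceive."""
    notes = []
    if minutes_since_application <= 15:
        notes.append("top notes")    # volatile molecules you smell right away
    if minutes_since_application >= 180:
        notes.append("heart notes")  # the scent most associated with the perfume
    if minutes_since_application >= 300:
        notes.append("base notes")   # heavier molecules that boost the other notes
    return notes or ["transitioning between phases"]

for minutes in (5, 60, 200, 320):
    print(f"{minutes:>3} min after application: {', '.join(active_notes(minutes))}")
```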

Are there any health risks to applying perfume? According to scientists, it’s unclear, because perfume ingredients are fiercely guarded to protect trade secrets. While most sprays will use trace amounts of natural essences, they also contain potentially hazardous synthetic chemicals, some of which may be derived from petroleum.

“A rose may be a rose…But that rose-like fragrance in your perfume may be something else entirely, concocted from any number of the fragrance industry’s 3,100 stock chemical ingredients, the blend of which is almost always kept hidden from the consumer,” stated The Environmental Working Group (EWG), an American environmental organization that specializes in researching toxic chemicals to protect public health.

“While most sprays will use trace amounts of natural essences, they also contain potentially hazardous synthetic chemicals…”

According to a report by the EWG, the average fragrance product tested contained 14 secret chemicals not listed on the label, among them chemicals associated with hormone disruption and allergic reactions, such as diethyl phthalate, which has been linked to sperm damage. This could explain why some people experience symptoms such as contact dermatitis and why about one in 10 people have allergic reactions to chemicals commonly used in fragrances.

However, it’s important to note that the health risks of using deodorant and perfume are nothing to panic over. For the most part, you won’t notice anything, and you’re probably about as safe as you are when using any other hygiene product. The more significant problem is that these aerosol products pose a danger to the environment.

According to a recent study published in the journal Science, both deodorants and perfumes are contributing significantly to air pollution at levels as high as emissions from cars and trucks.

This type of news is not unheard of. For much of the 20th century, it wasn’t uncommon for aerosol products like deodorant to contain chemicals called chlorofluorocarbons. These man-made compounds were very popular with manufacturers because they were non-flammable, non-toxic and unreactive with most other compounds. As a result, they were also commonly used in refrigerators and air conditioners.

“[B]oth deodorants and perfumes are contributing significantly to air pollution at levels as high as emissions from cars and trucks.”

Unfortunately, scientists in the mid-1970s discovered that chlorofluorocarbons have a shocking side effect: They contribute heavily to the thinning of Earth’s ozone layer, which protects us from the sun’s ultraviolet rays. By the mid-1980s, researchers had gathered compelling evidence that chlorofluorocarbons were the culprit, and in 1987 the Montreal Protocol was adopted, an international treaty (eventually ratified by every country in the world) that phased out their production.

Yet despite the phasing out of chlorofluorocarbons, modern aerosol sprays still emit volatile organic compounds (VOCs) that contribute to the formation of ground-level ozone, a key component of smog that scars the lungs and can cause heart attacks and lung cancer. In other words, the use of deodorants (both the spray and the stick kind) and perfumes contributes to the creation of smog and other air pollutants just as much as vehicle exhaust does.

To many people, this discovery might sound unbelievable. How can something like deodorant or perfume release as many VOCs as a car? According to the researchers behind the study, automobiles used to produce a large share of VOC emissions, but improvements in engine and emissions technology have greatly reduced how much air pollution they cause. Nowadays, even though many drivers burn several gallons of gasoline every week, most of it is converted to carbon dioxide rather than VOCs (those carbon dioxide emissions may not form smog, but they do contribute to climate change). In contrast, the VOCs found in products like deodorant and perfume add up and may heavily pollute the air we breathe.

Therefore, it may be a wise idea to start reducing our use of these hygiene products. While their negative effects on our health may still be in doubt, it’s clear that we must do everything we can to stop these VOC emissions and seek out other hidden threats that may harm our planet.

Originally published on February 28, 2018, in The Miscellany News: Deodorant, perfumes contribute to smog

The Science of Smoke Detectors: How to Solve Vassar’s Infamous Fire Alarm Problem

The front of Jewett House at Vassar, one of the dorms on campus where the fire alarm goes off frequently (Picture Credit: Collin Knopp-Schwyn | Wikipedia)

Editor’s Note: This article was written as an investigative piece looking into the cause of the frequent fire alarms going off in the dorms at Vassar College.

Let’s face it: Vassar College has a fire alarm problem. For a typical resident on campus, it isn’t exactly rare to be sleeping in your room one moment and standing outside the dorm building in your pajamas the next because the fire alarms have gone off. In fact, these bi-weekly evacuations have become so commonplace that most students just groan and contemplate living off-campus. What makes the whole ordeal so much more frustrating is that almost all of these dorm evacuations are the result of nuisance alarms triggered by something completely unrelated to an actual emergency.

The drowsy students aren’t the only people greatly inconvenienced by these frequent false alarms. Every time the lights flash and the sirens shriek in a dorm building, the good people at the Arlington Fire District (AFD) have to rush onto campus in their fire engines to respond. According to a recent investigation by The Miscellany News in November 2017, the AFD responded to a total of 126 campus fire calls in the 2016-17 school year. That investigation also found that Vassar College had nearly three times the number of calls in 2016 as peer colleges like Middlebury and Colgate, as well as the highest cost per call, at around $2,871. Add to this the fact that the Vassar administration pays the AFD only $40,000 each year while draining an estimated $1 million of its funds, and we have a rather shameful problem.

Needless to say, something has to change. While many students may attribute the frequency of these nuisance alarms to the poor decisions of their fellow residents, another common target of blame is the smoke detectors themselves, which are assumed to be either faulty or too sensitive. But how valid is this claim? Is there really something wrong with the way smoke detectors are set up at Vassar? Most of us don’t really know how they work, much less how to pinpoint what exactly is faulty about them, so it may be a good idea to learn more about these fire-protection devices that we complain about on a weekly basis.

“Vassar College had nearly three times the number of calls in 2016 compared to peer colleges like Middlebury and Colgate, as well as the highest cost per call of around $2,871.”

The first thing to remember about smoke detectors is that they really do make a difference in saving lives. According to a 2015 report by the National Fire Protection Association (NFPA), fires in homes with no smoke alarms caused an average of 940 deaths per year from 2009 to 2013. However, the data showed that homes with at least one smoke detector had a 40 percent lower death rate from fires than homes that didn’t have any. It’s also important to note that the likelihood of a smoke detector suddenly malfunctioning is pretty slim. While the report mentions that 21 percent of deaths were caused by fires in homes where the smoke detectors were present but failed to operate, the primary reasons behind these failures were either improper care or residents intentionally disabling them to stop nuisance alarms. As much as we hate our smoke detectors for making us stand outside in the freezing cold, we would definitely be worse off if we did not have them around.

With that being said, smoke detectors are far from perfect. In fact, some experts argue that their flaws have cost many people their lives. This is because in many cases, by the time the smoke detector activates, it’s too late—the smoke and flames have already spread everywhere. But how could that be possible? According to Joseph Fleming, a deputy fire chief with the Boston Fire Department, the fault lies with ionization smoke detectors.

As it turns out, there are two main types of smoke detectors. The most common type is the ionization smoke detector, which relies on an ionization chamber and a small amount of a radioactive element called Americium-241. Essentially, this radioactive material emits alpha particles that ionize (or remove an electron from) the oxygen and nitrogen atoms in the air within the ionization chamber. As a result, a small but reliable current is created inside the smoke detector. However, when smoke enters this chamber, the smoke particles disturb the current and the alarm is triggered.

It may surprise people that these everyday smoke detectors contain a radioactive element, but they pose few, if any, health hazards. A 2001 report by the U.S. Nuclear Regulatory Commission stated that this minuscule amount of Americium-241 gives off a radiation dose of less than 0.002 millirems each year. That’s less than the background radiation you get from walking around on the East Coast for about twelve hours. Just make sure not to swallow the radioactive element inside the device and you’ll be fine.

But the problem with ionization smoke detectors has little to do with their radiation dose. Rather, Fleming argues that they have difficulty responding quickly to smoke from smoldering fires.

“There are tons of studies that conclude that an ionization smoke detector will not give you enough time to get out of the house in a smoldering fire. I believe that somewhere between 10,000 and 15,000 people have died unnecessarily over the past 20 years because they didn’t have adequate information about their smoke detectors,” states Fleming.

According to Fleming, ionization smoke detectors are excellent at detecting smoke caused by very hot, fast-moving fires. While that is helpful, most of these fast-moving fires occur when people are awake and can quickly put out the fire or escape the building. These are the types of fires that often occur in cooking accidents, in which the smoke detector is treated like a nuisance rather than a life-saving device. In contrast, ionization smoke detectors are slower to detect smoke from smoldering fires, which begin quietly and often suffocate their victims first. Often caused by unattended cigarettes and faulty electrical wiring, these types of fires are extremely common at night when residents are sleeping and depend on their smoke detectors to wake them up.

As a result, Fleming urges everyone to use photoelectric smoke detectors instead. Unlike ionization smoke detectors, photoelectric smoke detectors use a T-shaped chamber where an LED sends a beam of light across the top. When smoke enters the photoelectric smoke detector, the light hits the smoke particles and scatters, striking the photocell at the base of the T-shaped chamber. Once a certain amount of light hits this photocell, the alarm is set off. According to studies by the National Institute of Standards and Technology (NIST), a photoelectric smoke detector senses smoldering fires on average 30 minutes faster than an ionization smoke detector does, giving occupants much more time to escape.
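To make the contrast between the two designs more concrete, here is a purely illustrative sketch in Python of the two trigger mechanisms described above. The function names, thresholds and units are invented for demonstration; real detectors are calibrated to safety standards, not to these numbers.

```python
# A toy sketch of the two trigger mechanisms described above. All numbers and
# names here are illustrative; real detectors are calibrated to safety
# standards rather than to these made-up thresholds.

def ionization_alarm(chamber_current: float, clean_air_current: float = 100.0) -> bool:
    """Ionization type: smoke particles disturb the small current flowing
    through the chamber, so the alarm fires once the measured current drops
    well below its clean-air baseline."""
    return chamber_current < 0.7 * clean_air_current

def photoelectric_alarm(light_on_photocell: float, clean_air_level: float = 0.02) -> bool:
    """Photoelectric type: in clean air almost no light reaches the photocell,
    but smoke scatters the LED beam onto it, so the alarm fires once the
    measured light rises well above the clean-air level."""
    return light_on_photocell > 10 * clean_air_level

# A smoldering fire produces lots of large, light-scattering smoke particles,
# so in this toy picture the photoelectric unit trips while the ionization
# unit is still waiting for a bigger drop in its chamber current.
print(ionization_alarm(chamber_current=85.0))      # False
print(photoelectric_alarm(light_on_photocell=0.5)) # True
```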

“[I]onization smoke detectors are slower to detect smoke from smoldering fires, which begin quietly and often suffocate their victims first.”

However, an important thing to remember is that each type of smoke detector has its own strengths and weaknesses. While photoelectric detectors react faster to smoldering fires than ionization detectors, critics are quick to point out that photoelectric detectors have more difficulty responding to the fast-moving fires that ionization detectors handle easily. This is because photoelectric detectors are less sensitive: there has to be a fair amount of smoke, enough to scatter a significant amount of light onto the photocell, before they activate. They have no difficulty detecting smoldering fires, since those produce a lot of smoke, but they don’t perform as well with fast-moving flames.

While photoelectric detectors definitely have their disadvantages, the flaws of the two types of smoke detectors are nowhere close to being on the same scale. As stated earlier, ionization detectors respond, on average, 30 minutes slower to deadly smoldering fires than photoelectric detectors do. In contrast, photoelectric detectors are only about 50 seconds slower than ionization detectors at sensing fast-moving flames. To be fair, fast-moving fires spread much more quickly than smoldering fires, so one can argue that those extra 50 seconds are still valuable.

Nevertheless, many experts, including Fleming, have been pushing for legislative change to require the use of photoelectric smoke detectors instead of ionization smoke detectors. Because photoelectric detectors are more expensive than ionization detectors ($26 compared to $13) and most state laws remain silent on which one to use, the majority of buildings only use ionization detectors. Yet, studies have shown that smoldering fires account for 54 percent of deaths while fast-moving fires account for only 16 percent.

Photoelectric smoke detectors also have an additional major advantage over traditional ionization detectors: They are less likely to cause nuisance calls. In fact, a 2000 study conducted in Alaska found that homes with ionization detectors had more than eight times the rate of false alarms as homes with photoelectric detectors. As it turns out, it doesn’t take much to trigger a nuisance alarm from an ionization detector. Anything from burnt toast to shower steam to high humidity can set these alarms off. This high sensitivity is not only a source of annoyance but also an actual health hazard: Six people died in a 1990 Boston fire because the residents had turned off their ionization smoke detectors after becoming sick of the nuisance alarms going off every time they cooked.

“We know that the photoelectrics are better at alerting people to the fires that are killing more people. A lot of people die from smoke inhalation and not raging fires. The traditional smoke alarms go off frequently when people are cooking or showering and they disable them and forget to reconnect them,” argued former New York City Councilwoman Elizabeth Crowley, a long-time advocate for photoelectric smoke detectors.

Given the evidence stacked against ionization smoke detectors, it’s no surprise that substantial progress is being made in legally requiring the use of photoelectric smoke detectors. After a 2005 apartment fire in Barre, Vermont, the Barre Fire Department conducted a series of experiments testing both photoelectric and ionization detectors and found that some ionization detectors take longer than an hour to go off in the presence of a smoldering fire. Soon afterwards, Vermont lawmakers passed a law requiring all new homes to have photoelectric smoke detectors installed. In Massachusetts, new or renovated homes and apartments must install photoelectric smoke detectors instead of ionization detectors. Two cities in California, Albany and Palo Alto, have done the same.

“If we could wave a magic wand right now and get rid of all the ionization smoke alarms and replace them instantly with photoelectric smoke alarms, we would cut our fire deaths in this country by more than 50 percent,” stated Albany’s fire chief, Marc McGinn.

“[S]ome ionization detectors take longer than an hour to go off in the presence of a smoldering fire.”

So what type of smoke detectors does Vassar use? According to Vassar’s Director of Environmental, Health and Safety James Kelly, the campus uses a mix of photoelectric and ionization detectors. Indeed, the NFPA recommends using both types so that they can cover for each other’s weaknesses. Unfortunately, since all the smoke detectors are connected to a central fire alarm system, if one smoke detector goes off, the whole system goes off.

But is having both truly ideal? This topic is still hotly debated, but the International Association of Fire Fighters (IAFF), the world’s largest firefighter union, made its stance clear all the way back in 2008: All homes should ONLY use photoelectric smoke detectors. Why? Because not only are the benefits of ionization detectors over photoelectric detectors considered “marginal,” but the high frequency of nuisance alarms caused by ionization detectors will also continue to encourage bad fire alarm habits like removing the batteries.

For now, it seems nothing much can be done about the smoke detectors at Vassar. After all, the safety of the students takes priority over their inconvenience. But if the college really does care about the community of Poughkeepsie, then the administration should take a more active role in easing the financial strain it puts on the Arlington Fire District. The firefighters there deserve that much, at the very least.

Originally published on February 21, 2018, in The Miscellany News: Outdated Smoke Detectors Incite Vassar’s Infamous Fire Alarms


Lasting Love: The Science Behind Happy, Fulfilling Relationships

Picture Credit: Getty Images | The Telegraph

Ah yes, Valentine’s Day: a precious little holiday where young couples try their hardest to prove that their relationship is special. It’s not really surprising that so many people are invested in the idea of Valentine’s Day as a sacred time of love. After all, billion-dollar corporations like Hershey, Hallmark Cards, Victoria’s Secret and Tiffany & Co. have been promoting this time-honored tradition for decades, raking in more than $18.2 billion from this one day alone. Nothing says “I love you” quite like rampant consumer capitalism. But whether we like it or not, we are a species that is in love with the idea of falling in love. From Hollywood movies to mediocre young adult romance novels, nothing is as widespread as this notion of two strangers becoming infatuated with each other.

And yet, the media seems very uninterested in what happens after the couple officially gets together. That’s a shame, since so much of what makes a relationship interesting is how the couple behaves in this new situation and the quality of their interactions over a long period of time. Just because a romance seems perfect in the beginning doesn’t mean it will stay that way permanently. According to a longitudinal study by Stanford sociologist Michael Rosenfeld, unmarried heterosexual couples have a 60 percent chance of breaking up within the first year together. Given how tumultuous dating can be, why do some romantic relationships last only a month while others last for decades in harmony and bliss? Thankfully, researchers working in the niche field of relationship science may have some answers.

Established as recently as the 1980s, relationship science is a rather interesting branch of psychology that aims to understand the structure of close relationships, how they operate and the effects different kinds of relationships have on the people in them. According to the experts in this field, countless factors play into how well a romantic relationship can turn out, and some of them are rather intriguing. In a 2013 study, researchers found that heterosexual couples in which the female partner is more attractive than the male partner reported higher levels of satisfaction.

“[These results seem to indicate] that partner physical attractiveness played a larger role in predicting husbands’ marital satisfaction than it did in predicting wives’ marital satisfaction,” the study authors concluded.

Surprisingly, the same results were reported in a similar study in 2008 by researchers at the University of California, Los Angeles. The theory seems to be that men may feel more invested in their romantic partner if they believe that they had “lucked out” by marrying such an attractive wife. Unfortunately, the opposite occurred when the husbands believed that they were more attractive than their wives, and thus they were less willing to help their wives. Typical male behavior.

“Given how tumultuous dating can be, why do some romantic relationships last only for a month while others last for decades in harmony and bliss?”

Psychologists also suspect that money has a huge impact on the stability of a relationship. In a 2009 University of Michigan study involving more than 1,000 married and unmarried adults, researchers found that people who are dissatisfied with their own spending habits tend to gravitate towards their spending opposite in love. In other words, those who spend money luxuriously may end up in a relationship with someone who budgets carefully, and vice versa. However, this study also showed that these relationships often fail: “Even though a spendthrift will have a greater debt when married to another spendthrift than when married to a tightwad, the spendthrift is still less likely to argue about money with the other spendthrift,” stated lead author Scott Rick.

Another interesting observation is that birth order may influence the happiness level of a relationship. According to psychologist Linda Blair, one of the happiest pairings is between a first-born child and a last-born child. The explanation? She believes the success of these relationships comes from pairing one person who is used to taking care of others with one person who enjoys being taken care of.

And of course, there are countless studies showing that having sex often significantly improves romantic relationships. Probably the most famous paper on this topic is the 2004 study by the National Bureau of Economic Research, which sampled 16,000 Americans and found that for both men and women, “The more sex, the happier the person.” But is this conclusion really true? Apparently so: a 2015 study that surveyed more than 30,000 Americans over 40 years found that couples who have sex once a week are the happiest. However, the researchers noted a ceiling to this effect; having sex more than once a week didn’t increase happiness any further. It’s also worth pointing out that a 2012 study by Cornell University found a positive link between waiting over a month to have sex at the beginning of a relationship and long-term satisfaction.

Regardless of what all these theories say, however, all relationships are different, and none of them can be boiled down to a precise formula. Still, there does seem to be one characteristic present in almost all happy relationships. It is called self-expansion, and it is the idea that the perfect romantic partner is not someone who makes you comfortable but someone who makes you a better person.

In a 2017 study, researchers discovered that the strongest and most fulfilling relationships were between two people who felt like their ideal selves in the relationship rather than their actual selves.

“[The results of our studies] contradict the popular sentiment that relational authenticity lies in ‘being yourself’ in the relationship,” the study authors noted.

So, what makes a romantic relationship “perfect”? Despite the various outside forces that seem to be in play, it’s more likely that a couple’s happiness depends on how they help each other reach their full potential in both their ambitions and their personal life. Perhaps it’s this partnership in continual self-improvement that makes a relationship between two people truly special.

Picture Credit: The Huffington Post

Originally published on February 7, 2018, in The Miscellany News: Science Explains Love Just in Time for Valentine’s Day

Is the Doomsday Clock Legitimate?

Picture Credit: Win McNamee | Getty Images | The Week

Somehow, 2018 has only just started and the situation already seems incredibly bleak. At least that’s what the members of the academic publication Bulletin of the Atomic Scientists say. Founded in 1945, this nonprofit organization gauges the probability of a global catastrophe and communicates that threat level to the public with its infamous Doomsday Clock, which represents the countdown to the end of civilization if countermeasures aren’t taken. This metaphorical clock has moved backwards and forwards many times throughout its run, from as far as 17 minutes from midnight in 1991, following the dissolution of the Soviet Union, to as close as two minutes from midnight in 1953, after the testing of the first hydrogen bombs. But even with the Cold War long over, it seems the global situation has only gotten worse. On January 25, 2018, the 19 international experts who make up the Bulletin’s Science and Security Board moved the Doomsday Clock 30 seconds closer to midnight.

“As of today, it is two minutes to midnight,” announced President and CEO of the Bulletin Rachel Bronson during a recent press conference.

Naturally, many people aren’t too thrilled about this grim assessment. Global geopolitical tensions are already at an all-time high thanks to the frosty relations between Russia and the U.S. as well as the belligerent partisan gridlock in Washington, D.C. It also doesn’t help that President Trump continues to play Russian roulette on his Twitter feed and goad celebrities and world leaders alike into schoolyard squabbles. Earlier in January, he taunted North Korean dictator Kim Jong-un on Twitter, bragging that his own Nuclear Button “is a much bigger & more powerful one than his.” At this rate, it really does appear as if the apocalypse is just around the corner.

But despite how dire “two minutes until midnight” might sound, the entire concept of the Doomsday Clock isn’t as helpful as some people make it out to be. For one thing, it doesn’t empirically measure anything. It’s only a metaphor, after all. The Bulletin’s Science and Security Board may consist of experts in their field, but the time displayed on this “clock” only represents what a handful of people think about the state of the world. The rather arbitrary nature of the time set on the Doomsday Clock becomes clearer in the context of the Bulletin’s history. When the concept was first created in 1947, the clock was initially set at seven minutes to midnight. Was there a logical explanation behind this decision? Nope, the Doomsday Clock started at 11:53 p.m. because, according to the original artist, “it looked good to my eye.”

It’s also important to note that the Doomsday Clock hasn’t been a good predictor of actual nuclear risk. For instance, the Bulletin’s Science and Security Board changed the time from two minutes to midnight to seven minutes in 1960 and kept it that way until 1963, citing how “[f]or the first time, the United States and the Soviet Union appear eager to avoid direct confrontation in regional conflicts.” They couldn’t have been more wrong. In 1961, the U.S. government ignored warnings from various defense experts and began deploying “Jupiter” nuclear missiles in Italy and Turkey that could reach deep into the Soviet Union. As predicted, the USSR grew paranoid at the sight of American nuclear missiles aimed at its doorstep and proceeded to deploy its own nuclear missiles to Cuba in 1962 to “[give the U.S.] a little of their own medicine.” This deadly confrontation led to the infamous Cuban Missile Crisis, which historians agree was the moment the two superpowers came closest to nuclear conflict, a time when the Doomsday Clock read seven minutes to midnight.

“[D]espite how dire ‘two minutes until midnight’ might sound, the entire concept of the Doomsday Clock isn’t as helpful as some people make it out to be.”

Of course, some people would argue that we’re not supposed to treat the Doomsday Clock literally. They would claim that its real benefit comes from how it spreads awareness and conveys the urgency of the global situation. However, its current design does less to promote action than to keep people in a heightened state of alarm at all times. In the past, the Doomsday Clock assessed danger specifically based on nuclear weapon proliferation. Sure, it wasn’t always accurate, but at least it had a clear purpose. Since 2007, however, the Bulletin has incorporated other dangers as well. In its 2018 report, the Science and Security Board accounted for threats such as climate change, cyberattacks and advances in CRISPR gene editing.

“Today, technological innovation in biology, artificial intelligence, and cyber are occurring at speeds that challenge society’s ability to keep pace,” stated Bronson ominously.

While it’s true that all these issues (especially climate change) should be addressed with careful consideration, adding more dangers to worry about muddles the original message and makes it much harder to stay focused. The goal has ultimately shifted from reducing nuclear weapons to reducing everything that could possibly pose a threat to humanity, which makes the overflowing mountain of problems overwhelming to even approach. Even the solutions the Bulletin offers to “turn back the Clock” are directed mainly at world leaders rather than everyday people, and they read more like wish lists than detailed plans of action. In addition, setting the Doomsday Clock closer to midnight because of advancements in gene editing and A.I., and grouping those together with nuclear weapon proliferation, is absurd, given how many lives those scientific advances could save. In that regard, the Doomsday Clock is no better than the sensationalist news media that automatically labels new scientific technologies as dangerous without making the effort to understand them properly.

Despite what the Bulletin claims, blindly decrying the end of humanity with this “Doomsday” Clock every year is not going to help anyone. It may have opened some people’s eyes during the Cold War, but in the age of social media and 24-hour news networks, the general public is more than cognizant of the terrible state of the world. The real problem is that many people have become too desensitized to all the alarm and overwhelmed to the point of apathy. The Doomsday Clock really only serves to remind people of what they already know—human society is destroying the world.

The Bulletin has always urged people to pay attention to the Doomsday Clock’s minute hand as it moves closer and closer to midnight, in the hope that people will be “shocked” into action. However, these apocalyptic proclamations are starting to have the opposite effect on people who have heard them so many times. By its very design, the Clock will never actually reach midnight, so at this rate, all the Bulletin can do is keep advancing the Clock by smaller and smaller increments until it is forced to resort to half-seconds to avoid running out of space. While the yearly assessments of the global state of affairs remain important as a way to collect data, perhaps it’s time to retire this Mayan-Calendar-esque Doomsday Clock for good as an outdated relic of the Cold War.

Originally published on January 31, 2018, in The Miscellany News: Is the Doomsday Clock Legitimate?

Masters of Our World: Should We Use Gene Drives to Control the Ecosystem?

Picture Credit: Michael Morgenstern | Science News

Some have called it a magic wand. Others have referred to it as the beginning of a new scientific revolution. Regardless of how you may see it, it’s a subject that shouldn’t be discussed only by scientists.

CRISPR-Cas9 is the latest state-of-the-art gene editing tool that has taken over the scientific community in recent years. While the concept of modifying DNA is certainly not new, CRISPR’s main strength lies in its transformation of the complicated process of gene editing into something quick, efficient, precise and ridiculously cheap. In other words, it has the potential to cut out undesirable segments of DNA, eradicate hereditary diseases and even guide humanity to a future where people can shape their bodies into whatever they want. It’s why many people have stopped describing something like designer babies as “unlikely” and started calling it “inevitable.”

One area of CRISPR research that has gained a lot of attention recently is the development of gene drive technology, which may give humans the power to modify or even exterminate entire species in the wild. According to evolutionary biologist and gene drive pioneer Kevin Esvelt, the purpose of a gene drive is to use CRISPR to override the traditional rules of Mendelian inheritance and introduce a genetic change in organisms that will be passed down to nearly all of its descendants.

In a typical situation, a parent organism passes a given gene variant down to only about half of its offspring, as per the rules of inheritance discovered by Gregor Mendel, the father of modern genetics. As a result, even when scientists were able to genetically modify organisms in the past, they still encountered immense difficulty in forcing specific genetic changes across an entire population. With a gene drive, however, that 50-50 chance of inheritance can skyrocket to as high as 99 percent. This, of course, has groundbreaking implications.
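To see why the jump from 50 percent to 99 percent matters so much, here is a deliberately simplified Monte Carlo sketch in Python. It is not Esvelt’s model: the population size, random-mating assumption and transmission rates are invented for illustration. It simply shows how an allele inherited by roughly 99 percent of offspring can sweep through a population in a handful of generations, while an ordinary Mendelian allele barely spreads at all.

```python
import random

# A deliberately simplified Monte Carlo sketch (not Esvelt's model): compare an
# allele passed to ~50% of offspring (ordinary Mendelian inheritance) with one
# passed to ~99% of offspring (a CRISPR gene drive that copies itself onto the
# partner chromosome). Population size and mating are toy assumptions.

def simulate(transmission_rate: float, pop_size: int = 1000,
             generations: int = 12, initial_carriers: int = 10) -> list[float]:
    """Return the fraction of carriers in each generation under random mating."""
    carriers = initial_carriers
    history = []
    for _ in range(generations):
        freq = carriers / pop_size
        history.append(round(freq, 2))
        next_carriers = 0
        for _ in range(pop_size):
            # An offspring can inherit the allele only if at least one of its
            # two randomly chosen parents carries it.
            has_carrier_parent = random.random() < freq or random.random() < freq
            if has_carrier_parent and random.random() < transmission_rate:
                next_carriers += 1
        carriers = next_carriers
    return history

random.seed(0)
print("Mendelian (50%): ", simulate(transmission_rate=0.50))
print("Gene drive (99%):", simulate(transmission_rate=0.99))
```

Under these toy assumptions, the Mendelian allele hovers near its starting frequency (or is lost to random drift), while the driven allele climbs toward fixation within roughly a dozen generations, which is the basic intuition behind the gene drive’s power to reshape wild populations.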

“The ability to edit populations of sexual species would offer substantial benefits to humanity and the environment. For example, RNA-guided gene drives could potentially prevent the spread of disease, support agriculture by reversing pesticide and herbicide resistance in insects and weeds, and control damaging invasive species… [G]ene drives will be capable of influencing entire ecosystems for good or for ill,” stated Esvelt when he first introduced the possibility of using CRISPR to develop gene drives.

We possess the technology to change the world’s ecosystems, but does that mean we should use it? Many people certainly seem to think so, and the proposed benefits seem irrefutable. For instance, one innovative project currently underway is the use of gene drives to eliminate malaria from mosquitoes. Scientists are working on genetically modifying the Anopheles gambiae mosquito, a species known for spreading the malaria parasite, so that the female mosquitoes become sterile. That way, once these modified mosquitoes are released into the wild, they can breed with wild members of their species and spread the sterility trait until the population effectively dies off. Other scientists are looking toward using gene drives to wipe out invasive species and save endangered native animals.

Esvelt himself has become heavily involved in gene drive technology. His current project aims to reduce the rate of Lyme disease on Nantucket Island in Massachusetts by genetically modifying the island’s white-footed mice to become immune to the disease. Then, ticks will be unable to transfer the bacteria that cause the disease, and the entire transmission cycle will collapse.

However, as promising as all this may sound, it’s doubtful that gene drives will provide a lasting, viable solution. In fact, it’s possible that this technology allows scientists to deal with these serious issues in the wrong way. We may have become too infatuated with how sleek and shiny CRISPR appears to consider better, less risky solutions.

For one thing, ecosystems aren’t so simple that we can just inject new variants of a species into the wild and expect everything to go exactly as we planned. There are too many nebulous factors involved for scientists to be able to correctly predict the outcome of every ecological experiment. One of the test subjects may escape into a different environment or a completely unrelated species may become caught in the crossfire. Most of the time, as Esvelt notes, the gene drive may have little to no effect on the ecosystem at all. Ultimately, it’s arrogant to treat the ecosystem like a math problem with a simple, clean answer.

Even Esvelt seems aware of these limitations, stating, “Let me be the first to say that we do not understand how ecosystems work. They are fantastically complex.”

As if to confirm this admission of ignorance, nature itself seems to have knocked gene drives down several pegs. According to a recent report by population geneticist Philipp Messer, genetically modified mosquitoes designed to pass down an infertility mutation to all their offspring started developing resistance to the gene drive. In other words, gene drives may not be the permanent solution that many people claimed them to be. “In the long run, even with a gene drive, evolution wins in the end,” Esvelt commented in response to the news.

But that’s not even the worst part. Upon creating a detailed mathematical model that describes what happens when genetically modified organisms are released, Esvelt discovered that the chances of altered genes spreading to unintended parts of the ecosystem were much higher than he originally predicted.

“I [feel] like I’ve blown it … [Championing this idea was] an embarrassing mistake,” Esvelt admitted.

To be honest, the entire idea of gene drives seemed faulty to begin with, mainly because the desired population modifications are not introduced naturally. Instead of working hand-in-hand with evolution, gene drives attempt to solve ecological problems by creating more unsustainable arms races, akin to the one between antibiotics and bacteria. For instance, even if gene drives eradicated one species of mosquito that spreads malaria, it likely wouldn’t be long before a different mosquito species emerged that could spread the parasite to human hosts.

Instead of making sudden, irreversible changes to the ecosystem, a much more reasonable approach is the one offered by evolutionary biologist Dr. Sharon Moalem in his book Survival of the Sickest. In it, Dr. Moalem describes how the best way to combat diseases like malaria is to change the conditions of the environment so that the disease evolves in a way that works in our favor. For example, the widespread use of mosquito nets would not only stop mosquitoes from infecting humans but essentially invalidate mosquitoes in general as vectors for the disease. As a result, evolution may favor an alternative way for malaria to spread, perhaps one in which the parasite doesn’t completely incapacitate the body but only slightly weakens it, so that the disease can spread more like the common cold.

Rather than risk a high-stakes gamble on gene-editing technology, it may be wiser in the long run to contemplate less invasive methods to solve our ecological problems. Humans don’t have a great track record to begin with, after all.

Originally published on November 29, 2017, in The Miscellany News: Gene Drives Wrongfully Hailed as Biological Panacea

The Frustrated Scientist: How the Glory of Research is Starting to Flake

Picture Credit: Jean-Philippe Ksiazek | AFP | Los Angeles Times

For some reason, there is something dignified and respectable about being a scientist. Seeing as how science is a career dedicated to the pursuit of truth and knowledge in the natural world using logic and evidence, it’s no surprise that so many people look to the task of scientific research as some kind of noble, almost illustrious profession brimming with success. According to a 2013 report by the Pew Research Center, public views of scientists are largely positive, with 65 percent of Americans believing that scientists contribute a great deal to society—only falling short of medical doctors, teachers, and the military.

In general, it seems very clear that we as a society regard scientists and their work with very high esteem, almost to the point of worship. As a result, ambitious college students and overbearing parents tend to think that the career path of a scientist in academia is one that guarantees a respectable level of fortune and recognition. However, we must understand that blindly revering anything, from renewable energy to cancer research, often leads to serious consequences.

In short, over-glorifying the scientific profession may motivate people to pursue careers in science, but it also instills in people a set of unrealistic expectations that may crush them in the face of harsh reality.

For instance, when we think of what it’s like to become a scientist, an idyllic story comes into mind: A young but passionate individual enters a prestigious graduate school and immediately begins work on the research project of their dreams. Soon, the experiment becomes wildly successful and the results are published in an esteemed academic publication like Science or Nature, and thus follows a life of wonder and scientific discovery for our intrepid fledgling scientist who aspires to change the world.

Needless to say, you would need the devil’s luck for that to happen to you, because scientific research isn’t nearly as idyllic or forgiving as most people want to believe.

For one thing, despite constant calls for more people in the sciences, reports show that the United States is producing too many research scientists, to the point of extreme congestion in the field. According to the 2014 Survey of Earned Doctorates by the National Center for Science and Engineering Statistics, over 54,000 research doctorate degrees were awarded in the U.S. in 2014, the highest number ever recorded by the survey. Of those doctorates, 75 percent were in science and engineering fields, up from 66 percent in 2004.

Although there is an overwhelming number of qualified scientists out there today, there simply aren’t enough desirable science jobs available to support everyone. For many science graduates, the prospect of obtaining a tenure-track professorship at a university is the ultimate goal because it’s one of the few positions in academia that offers both cutting-edge research and permanent financial security. However, there is such a surplus of PhDs in most fields that the odds of actually achieving that goal are around one in six.

“Whether we like to admit it or not, science today is a pyramid scheme. Over the last two decades, there has been a period of unsustainable growth … As a consequence, it’s child’s play to get a PhD position but almost impossible to secure a faculty job,” remarked David Keays, a biomedical researcher at the Research Institute of Molecular Pathology.

As a result, an overwhelming number of science PhDs who stay in academia end up spending the next four or five years as postdocs, working under a professor for very little pay and meager benefits. For instance, the average postdoc in biomedicine gets paid an annual salary of about $45,000. To put that into perspective, the Bureau of Labor Statistics reports that a typical librarian has a median annual salary of $55,370, while the median annual salary of a postal service mail carrier is $57,200.

Not only that, a recent study found that ex-postdocs in biomedicine make significantly lower wages in the first 15 years of their careers than their peers outside academia. And yet, people are desperately vying for these postdoc positions because of the severe lack of academic jobs. Ironically, the 2014 Survey of Earned Doctorates reported that doctorate recipients in the humanities and other non-STEM fields have the highest rates of academic employment, at almost 80 percent, while engineering (15 percent) and the physical sciences (29 percent) have the lowest.

But even once you become a certified scientist, the cutthroat competition doesn’t end. Every year, scientists from around the nation must compete for funding and grants to conduct their experiments. However, grant money is always in short supply and can’t keep up with the rate of young scientists entering the workforce. For instance, the National Institutes of Health, a major funding source for scientists, has been suffering from severe budget cuts for the past several years—all while the cost of conducting experiments has skyrocketed as well. As a result, only about 17 percent of NIH grant applications get approved, a significant decrease from 30 percent in 2000.

The high rejection rate for grant money has fueled a cascade of worrying patterns. For instance, a survey run by Nature found that academic researchers of all ages devote so much time to writing grant applications and handling other administrative tasks that only about 40 percent of their time goes to actual research. Not only that, researchers are being worked to the bone juggling all these tasks. A recent poll of more than 8,000 scientists showed that almost 40 percent of respondents work more than 60 hours per week.

Even worse, the ironclad law of “Publish or Perish” dictates that all scientists must pump out as many research papers as they can as quickly as possible, or else they risk putting their careers in jeopardy. As a result, many scientists in all fields resort to desperate measures to stay afloat, rushing experiments, exaggerating results and cherry-picking evidence. In 2012, researchers at the biotech firm Amgen could reproduce only six of 53 “landmark” studies in cancer research. In fact, one in three researchers has stated that they know of a colleague who has “pepped up a paper” through shady means. Many have even turned to profit-driven “predatory journals” to publish their papers, casting scientific credibility into doubt and feeding a dangerous culture of pseudoscience.

It should be clear that the science profession is not as pure and righteous as many may believe, but that doesn’t mean being a scientist is a dead-end job either. Despite all the hardships they face, the majority of scientists seem content with what they do; surveys show that at least 60 percent of scientists are satisfied with their careers.

Simply put, pursuing a career in scientific research is no better or worse than any other job option you may end up choosing. It goes to show that all career paths, no matter how highly society may view them, will be fraught with challenges, and those most likely to succeed are those who genuinely love what they do.

Originally published on November 8, 2017, in The Miscellany News: Over-glorification of Scientists Belies Realities of Field

The War on Cancer: We Still Think We’re Winning Even Though We’re Not

Picture Credit: Fabian Bimmer | Reuters | Newsweek

I must admit, it can be exciting to read about the latest developments in cancer research in the news. There is so much happening in the field of oncology that it’s tempting to imagine a future without cancer just around the corner.

For instance, scientists at Northwestern University have reported that they have found what they call an ancient kill switch in cancer cells. According to the researchers, they may be able to use this mechanism to force cancer cells to kill themselves in multiple ways simultaneously.

Not only that, a revolutionary new form of cancer treatment known as CAR T-cell therapy has swept up the scientific community in an excited fervor. By manipulating the T-cells of the cancer patient’s own immune system with gene therapy and then reinjecting them back into the patient, researchers have successfully destroyed tumors in people who had lost all hope.

According to various news reports, this treatment was so promising that the U.S. Food and Drug Administration (FDA) has recently given it the green light for production and distribution, making it the first use of medicinal gene editing to be available for patients in the United States.

“We’re entering a new frontier in medical innovation with the ability to reprogram a patient’s own cells to attack a deadly cancer,” FDA Commissioner Dr. Scott Gottlieb stated after approving the treatment.

As with anything that’s showered with positive attention by the media, however, it’s not as simple as it appears. All the hype surrounding cancer research is actually blinding us to the reality that we are not winning the war against cancer. In fact, despite what headlines may claim, we are nowhere close to actually finding the cure for cancer.

While such a claim may sound needlessly pessimistic, it is vital to view the current trajectory of cancer research within the context of its larger history. Cancer has been around for all of human history, with fossil evidence and ancient manuscripts documenting it as far back as 1600 B.C. Needless to say, countless renowned scientists and medical experts across human history have tried to understand and combat this disease. In recent memory, the most notable collective endeavor is America’s War on Cancer, launched by President Nixon in 1971. From that moment on, the United States has devoted increasingly intensified efforts to finding a cure.

Over the past 40 years, the U.S. has poured a total of more than $500 billion into winning this war. Even now, that war continues to escalate. In 2017, the National Cancer Institute (NCI) received $5.389 billion for the fiscal year, which is $174.6 million more than what the organization received in 2016. In addition, we have around 260 different nonprofit organizations in the United States that raise money for cancer research and treatment. Together, those nonprofit organizations have budgets that top $2.2 billion.

This should be good news, though, right? All of that money is going towards a worthy cause, after all. Indeed, that much is undeniable. However, the problem is that all that money is translating to very little substantive progress in terms of developing a permanent solution. So far, we have made great strides in understanding the nature of cancer cells and how they behave in general. Unfortunately, utilizing that knowledge to create a reliable treatment has so far proven to be much more difficult than anyone had realized.

Despite receiving billions of dollars in funding and conducting countless expensive and laborious drug trials, scientists have yet to develop anything that can meaningfully increase a patient’s chances of survival, much less actually cure the disease. In fact, a recent study published earlier this year reported that two-thirds of all cancer drugs that were approved in the past two years showed no evidence of extending survival at all (USA Today, “Dozens of New Cancer Drugs Do Little to Improve Survival,” 02.09.2017).

When President Nixon announced the War on Cancer, he vowed that cancer would be cured by 1976. Today, cancer remains as deadly as ever. According to the World Health Organization, one in six deaths worldwide in 2015 was caused by cancer, a total of 8.8 million deaths. Cancer is still the second leading cause of death globally, just behind heart disease. The death toll from heart disease, however, has decreased significantly over the past several decades: between 1950 and 2005, the death rate from heart disease dropped by 64 percent. In contrast, the death rate for cancer fell by a mere five percent over the same period. That is how little progress we have made, even after decades of focused research backed by billions of dollars in funding.

Of course, the last thing I want to do is discourage further cancer research. Despite the rather bleak odds, there is still value in continuing this line of inquiry and searching for other treatment options. The point I’m trying to make is that the news you hear about cancer research tends to be so overwhelmingly positive that it often fails to depict the reality of the situation accurately. No matter where you look, every new insight is a “major breakthrough,” and every new test product is “a miracle in the making.” By exaggerating these successes, the media has effectively deceived the general public into believing that the cure for cancer is just around the corner.

Case in point: CAR-T therapy. Remember how I mentioned earlier that this method of cancer treatment showed promising results? When news outlets learned that the FDA had approved its use in the United States, they went ballistic with excitement. They published articles about the miracle of CAR-T therapy, with headlines such as “Latest Car-T Therapy for Cancer Signals New Era for Life-Saving Treatments,” “New Gene Therapy for Cancer Offers Hope to Those With No Options Left,” and “Cancer’s Newest Miracle Cure.” In typical fashion, all these articles feature heartwarming stories of cancer patients miraculously saved by a revolutionary new treatment that will surely stop cancer in its tracks.

What these articles fail to mention is that CAR-T therapy can be incredibly dangerous because it needs to bring your body to the brink of death in order to save you. While the genetically engineered T-cells spread and kill the tumor cells, the patient undergoes a series of intense side effects that are so sudden and severe that a team of top specialists must remain on standby to keep the patient alive.

And sometimes, not even that is enough. So far, several patients have died from neurotoxicity complications during the clinical trials, and experts still haven’t pinned down the exact cause of their deaths. Because CAR-T therapy is so risky and complex, experts warn that it will take years before a treatment like this is safe for routine use. It is certainly not the miracle cure that the media makes it out to be. It isn’t even effective against all cancers: CAR-T therapy has mainly been used to treat blood cancers such as leukemia and still struggles against solid tumors like sarcomas.

Does this mean that CAR-T therapy is a lost cause? Absolutely not. Medical experts are justified in calling this immunotherapy a legitimate, revolutionary breakthrough in a field that has largely stagnated over the past several decades. It is a major accomplishment, and the cancer survival stories attest to that fact. The issue is that for the past 40 years, the media has consistently trumpeted the end of cancer with every trivial development. By bombarding the public with exaggerated tales of success, the media has essentially deluded the country into believing that we are winning the war against cancer and that all cancer patients have a good chance of not only surviving but also returning to their normal lives. Such rose-colored views are far from the truth, and they have broken families apart.

As Dr. Otis Brawley, the chief medical officer at the American Cancer Society, explained, “We have a lot of patients who spend their families into bankruptcy getting a hyped therapy that [many] know is worthless…[Some choose a medicine that] has a lot of hype around it and unfortunately lose their chance for a cure.”

It’s already heartbreaking for patients and their loved ones to learn that they have cancer. It feels infinitely worse to undergo several “life-saving” treatments performed by doctors at the country’s best hospitals only to learn that none of it actually works. Consider the tragic story of Michael Uvanni and his brother James, a patient with skin cancer. After hearing about all the miracle treatments that were supposedly available and seeing happy commercials of cancer patients hugging their grandchildren, they felt confident that the odds were in James’ favor. That optimism led to crushing disappointment when his health continued to suffer, even after trying immunotherapy and several other experimental treatments. Three years after his diagnosis, James passed away from metastatic melanoma.

“I thought they were going to save him…You get your hopes up, and then you are dropped off the edge of a cliff. That’s the worst thing in the world,” confessed Michael Uvanni.

This sort of duplicitous optimism, unfortunately, permeates the entire field of oncology. Newspapers hype research results to attract readers, drug companies make outrageous promises to boost sales, and hospitals draw in paying patients by appealing to their hopes and overstating their accomplishments. Many scientists have fallen victim to this mindset as well, often exaggerating the success of their own research to attract investors. Back in 2003, Dr. Andrew von Eschenbach, then the director of the National Cancer Institute, announced the possibility of “eliminating suffering and death due to cancer by 2015.” Even President Obama contributed to the illusion when he announced the Cancer Moonshot project in 2016, declaring, “Let’s make America the country that cures cancer once and for all.”

Given all these overly positive messages, it’s no wonder that so many cancer patients believe their lives are guaranteed to be saved, only to feel crushed when they learn the awful truth. Let’s be clear: There is no miracle cure for cancer. According to the American Cancer Society, the five-year survival rate for stomach cancer is 29 percent. For lung and bronchus cancer, it is 18 percent; for pancreatic cancer, 7 percent. Patients with metastatic melanoma typically die within a year of diagnosis. Despite what you may hear, immunotherapy can trigger fatal immune attacks on the lungs, kidneys, and heart. There are no approved immunotherapies for breast, colon, or prostate cancer. Not only that, studies have found that immunotherapy benefits only about 10 percent of all cancer patients.

As grim as all this may be, we must remember that not all hope is lost. That said, the last thing cancer patients need right now is to be blindsided by all the fanfare that seems to accompany every piece of cancer news.

Originally published on October 26, 2017, in The Miscellany News as “Cancer research advancements overstated.”