Nanopore Sequencing & the Problem With Patents

Picture Credit: DNA | Public Domain Pictures

In 2003, researchers from all over the world completed one of the greatest scientific endeavors of their time: identifying and mapping the entire human genome. With over 20,000 genes analyzed, the scientific community reaped the benefits of the age of genomics, in which scientists could identify the nucleotide sequences involved in specific genetic diseases like Huntington’s and pinpoint the mutations that underlie different forms of cancer.

But now, a device from Oxford Nanopore Technologies could bring the same power of DNA sequencing from the laboratory into the palm of your hand. It’s called the MinION and it can sequence the DNA of any given sample in a matter of hours.

For decades, conventional DNA sequencing was widely regarded as a tedious, time-consuming process. In order to identify the genome of a particular sample, a researcher would have to create numerous identical copies of the DNA molecules, break each of those copies into tiny pieces for the machine to read, sequence each fragment individually and finally reassemble those pieces together again. It’s the equivalent of reading a book by shredding it to read each word separately and then taping the pages back together again. In addition, this cumbersome process involved expensive machines the size of refrigerators and took days or weeks to run.

Due to these practical limitations, many researchers have to rely on the products and services of large corporations to obtain the DNA sequences of their samples. The company that currently dominates the sequencing market is Illumina, Inc., a corporate giant worth billions of dollars. Illumina provides machines for almost every large sequencing center in the world and now holds an almost complete monopoly in the industry. However, Oxford Nanopore Technologies intends to bring down this powerful behemoth with a revolutionary new way of reading DNA called nanopore sequencing, which identifies the nucleotide bases directly without breaking apart the DNA molecule.

The idea is rather brilliant. A nanopore is simply a very tiny hole, about 2.5 nanometers wide. Nanopore sequencing relies on an incredibly thin synthetic membrane studded with numerous nanopores and nanopore sensors. When the membrane is submerged in liquid by itself and a current is run through it, a steady electrical pattern is measured as ions pass through the tiny holes.

These patterns change once a DNA sample is placed on the membrane. When the electrical current pulls a DNA molecule through a nanopore, the nucleotide bases block the pore and stop some of the ions from passing by. This blockage alters the current that the sensor is reading and ultimately causes the electrical pattern to dip. What makes this method so effective is that each nucleotide base of DNA blocks the pore in different ways and generates a unique and identifiable change in the current. In other words, one can identify the DNA sequence by simply reading the various spikes in the electrical pattern.
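The read-out described above can be pictured with a toy sketch. The current ranges below are invented for illustration and do not reflect the MinION's real signal levels or base-calling algorithm; the point is only that each base is assumed to produce a distinct, recognizable dip.

```python
# Toy model of nanopore base-calling: match each dip in the measured
# current to the base assumed to have caused it. All numbers here are
# hypothetical, chosen only to illustrate the idea.

# Hypothetical mean current (picoamps) while each base blocks the pore.
DIP_RANGES = {
    "A": (70, 80),
    "C": (55, 65),
    "G": (40, 50),
    "T": (25, 35),
}

def call_base(current_pa):
    """Return the base whose characteristic dip range contains the reading."""
    for base, (low, high) in DIP_RANGES.items():
        if low <= current_pa <= high:
            return base
    return "N"  # signal doesn't match any known dip

def decode(signal):
    """Translate a series of current readings into a DNA sequence."""
    return "".join(call_base(reading) for reading in signal)

print(decode([75, 60, 30, 45]))  # -> ACTG
```

A real base-caller must contend with noise, multiple bases occupying the pore at once, and variable translocation speed, which is why the actual software is statistical rather than a simple lookup like this.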

In addition to its speed, ease of use and portability, the MinION also boasts a 99.99 percent accuracy rating based on runs performing at 90 percent without any false positives. Not only that, Oxford Nanopore Technologies set the price of its new, revolutionary sequencing gadget at a mere $1,000. When the MinION was first revealed to the world in 2012, one scientist tweeted: “I felt a great disturbance in the force, as if a million Illumina investors cried out in pain.”

The idea of genetically identifying any organic substance at any place and time has enormous implications. A DNA sequencer like the MinION could be used not only in a lab but also in the field with little to no difficulty. During the Ebola outbreak in 2015, microbiologist Nick Loman used his newly bought MinION to track the progress of the epidemic in real time while other scientists had to wait weeks for the results of their analyses to arrive.

For something as time-sensitive as a deadly epidemic, nanopore sequencing could save tens of thousands of lives. Not only that, Oxford Nanopore aims to make its product available to everyone everywhere. From NASA astronauts in space to high school students, the company envisions a future where DNA sequencing devices become like telescopes: a formerly expensive scientific instrument that is now available to the everyday consumer.

Unsurprisingly, Illumina is trying everything in its power to stop the MinION’s momentum. Last February, the sequencing industry monopolist filed several lawsuits against Oxford Nanopore Technologies, claiming that the British company committed patent infringement by using bacteria-derived pores known as Mycobacterium smegmatis porin (Msp) to create its synthetic membrane.

At the moment, Illumina holds the patents for any system that uses these Msp pores. Oxford Nanopore responded almost immediately, accusing the corporate giant of acting on unsubstantiated speculation to prevent the MinION from ever reaching the market, all so that Illumina can maintain its monopoly.

This move by Illumina illustrates just one of numerous legal issues that stand in the way of scientific progress. The scientific community is often plagued by patent aggregators, people or companies who enforce patent rights to turn a profit or to keep such patents away from those who may pose a threat to them. Despite not using their patents for research or manufacturing purposes, these entities prey on smaller companies to force them out of business. Having never demonstrated the ability to produce its own nanopore sequencer, Illumina could very well be yet another patent aggregator trying to neutralize an incoming threat to its business.

Even if the MinION does not contain Msp pores, Illumina could still invoke the doctrine of equivalents. Under this aspect of patent law, Oxford Nanopore Technologies could still be liable for patent infringement as long as the product in question performs the same function as the patented invention in the same way. Originally created to accommodate the difficulty of describing an invention exactly, the doctrine can now be used to back companies like Oxford Nanopore into a corner.

Depending on the outcome of this legal battle, the entire course of scientific progress could be altered. With such great scientific advancements at risk due to capitalistic greed, it’s time to take another look at our patent system to prevent other innovations from becoming similarly obstructed. Overhauling the patent system is essential to taking money and special interests out of scientific research and thereby crafting an atmosphere more conducive to intellectual cohabitation and progress.

According to phylogenomics researcher Joe Parker, nanopore sequencing could bring about a second age of genomics. If that future never comes to fruition, the same bleak stasis will surely sabotage other shining opportunities for society as well.

Originally published on May 4, 2016, in The Miscellany News as “Nanopore sequencing research should be encouraged.”


Turning to Biomimicry: The Unrecognized Importance of Studying Animals

Picture Credit: Jiuguang Wang | Snake Robot at Robotics Institute | 2011 | Flickr

What comes to mind when you think of biology and making a difference in the world? Cancer research, gene therapy, or the treatment of diseases are generally the most common responses, since these fields never fail to generate public buzz. Unsurprisingly, most people probably wouldn’t think of the study of animals. Zoology is the field responsible for that pursuit. Or at least it used to be.

Truth be told, zoology is not an area of study that attracts much attention or respect. In the eyes of most people, studying animals for a living seems more like a hobby or even a career fantasy that a naive child would imagine having as an adult. Unless you want to work for a zoo, some might say, it’s silly and unrealistic. Working with animals just seems too much like playing and lacks the seriousness that biochemistry, genetics, and medicine entail. Even on the Internet, the most common answer to why one should study zoology is “because it’s fun.” Another common answer is “because we must protect endangered species.” Given these rather lukewarm responses, it’s no wonder that most people don’t associate zoology with making an impact in the world.

Yet despite signs of zoology’s rapidly fading reputation, the study of animals is still going strong. It just happens to fall under a plethora of different names.

“Zoology is already dead,” stated John Long, Jr., a biology and cognitive science professor at Vassar College. “This old field has been pulled apart and its pieces put into new disciplines like biomimicry, animal behavior, evolution, biomechanics, biorobotics [and so on]. While we call the study of animals ‘zoology,’ no one calls themselves a ‘zoologist’ anymore.”

While the term “zoology” is now considered outdated, the study of animals has spread across a wide range of different fields from robotics to cognitive science. Scientists and engineers alike have started to use animals to learn more about the workings of machines and the world. This integration has led to more far-reaching contributions to society than one might expect. Of the many categories, two main fields come to mind: robot biomimicry and animal-inspired innovations.

Robot biomimicry refers to machines or robots that imitate the structure and behavior of real animals in a way that takes advantage of that animal’s survival skill. Scientists and engineers study the design and mechanics behind different animals and attempt to make simpler yet more efficient copies of the mechanism. For example, a team of scientists at Stanford researched how geckos use their toes to climb vertically in order to design a robot that can easily scale walls. A gecko’s toe contains hundreds of flap-like ridges, each of which has millions of tiny hairs with even tinier split ends. This special feature allows geckos to exploit weak intermolecular attractions called van der Waals forces in order to stick to walls and ceilings at the molecular level. Using an adhesive that incorporates the same strategy, the Stanford team is currently building robots that can climb rough concrete as well as smooth glass surfaces, making them perfect for reaching places that humans cannot normally access.

Similarly, roboticist Howie Choset of Carnegie Mellon University teamed up with researchers to study the locomotion of sidewinders, a species of desert snake, to build a robot that can travel across rough terrain without getting stuck in ruts. By studying patterns in a sidewinder’s movements, Choset and his team not only built a robot that can help archaeologists explore dangerous archaeological sites, but they also learned more about the snake species in general.

On the other hand, animal-inspired designs use aspects of certain animals to improve something we already have. For instance, scientists at Harvard University looked into why humpback whales are so agile in the water despite weighing more than 60,000 pounds. They found that the bumps on a whale’s flippers allow it to swim with great speed and flexibility. Excited by their discovery, the team designed turbine blades with similar bumps that were so effective at reducing drag that Canada’s largest producer of ventilation fans licensed the design. The same innovation could also be applied to transportation, for example, to stabilize airplanes or boost the speed of submarines.

Of course, there are countless other stories of researchers inspired by the creativity found in animals. In Japan, the design of a kingfisher bird’s bill was studied to improve the country’s famous bullet trains. Boat companies around the world are researching shark skin to design boats that are both faster and self-cleaning. Some experts even believe that examining the bioluminescence from fireflies or deep-sea squids could lead to an eco-friendly replacement of public street lamps. Studying animals allows us to use nature as our guide to create revolutionary designs and products. Every species possesses a unique survival mechanism or trait molded by countless centuries of evolution, and many of these could benefit humanity in unimaginable ways. Tapping into this rich reserve of creativity is our way to find new ideas when our own brainstorming comes up dry.

With all this promise, why does the study of animals suffer from such a dearth of public awareness and excitement? It could be because so many people maintain the stereotype that working with animals is synonymous with simply playing with them. The preconception of this type of work as a lackadaisical, frivolous endeavor unfortunately remains deeply embedded in society.

Surprisingly, an interesting parallel can be drawn between the study of animals and environmentalism. In his essay “Are You an Environmentalist or Do You Work for a Living?”, historian Richard White argues that public disdain toward environmentalism stems from its perceived detachment from work. Whether it’s logging, mining, or ranching, many environmentalists protest these encroaching forms of industry and argue that nature should be left pristine and untouched. While the popular slogan “save the forest” isn’t a bad message, prioritizing the purity of a piece of land over the livelihood of other people has left a negative impression of the movement as a whole. It sends a disturbing signal that a person’s right to enjoy nature and its beauty overrules a person’s will to work in order to feed a family and find success. As White remarks, “Nature has become an arena for human play and leisure. Saving an old-growth forest or creating a wilderness area is certainly a victory for some of the creatures that live in those places, but it is just as certainly a victory for backpackers and a defeat for loggers. It is a victory for leisure and a defeat for work.”

Although White’s paper stirred up some controversy among environmentalists, there has been a noticeable shift toward environmental work that directly benefits society. Environmentalism now places greater focus on chemical testing of water sources and on technology that benefits both nature and humans. As White stated in his paper, environmentalists have to promote a form of environmentalism that directly advances society for the movement to be taken seriously.

Similarly, the study of animals is now going down the same path. With the rise of new animal-inspired inventions, a greater focus on benefiting society may change the public outlook on the field. Thus, we should promote discussions of creative solutions inspired by nature rather than place emphasis on how fun it is to work with animals. Answering how and why different animals survive and flourish in a world ruled by natural selection could inspire wonder within people and ultimately ignite public interest.

After all, research into animals is perhaps humankind’s greatest source of ingenuity and imagination. With it, revolutionary ideas infused with the genius of nature await humankind in the future.

Originally published on April 23, 2016, in Boilerplate Magazine as “When Humans Don’t Have All the Answers – A Turn to Biomimicry.”

Introducing D-Wave: The New Era of Quantum Computing


Picture Credit: Eric Nost | D-Wave Quantum Computer | 2014 | Edge Effects

In 1965, Intel co-founder Gordon Moore made a famous prediction: the number of transistors on a microprocessor chip, and thus its performance, would continue to double every two years. Widely known as Moore’s law, this bold claim transformed the semiconductor industry as processor chip manufacturers rushed to fulfill the prediction year after year. While those attempts have been successful for the past 50 years, transistors are now starting to reach their physical limitations, and Moore’s law finally seems doomed to fail. But the quantum computing company D-Wave Systems, Inc. has a solution: if processors can no longer speed up on a classical level, then why not design processors that function on a quantum level?
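Moore's prediction is simple compound doubling, which is why the numbers escalate so quickly. A quick sketch makes the arithmetic concrete (the starting count below is illustrative, roughly the scale of an early-1970s microprocessor, not any specific chip):

```python
# Moore's law as arithmetic: the transistor count doubles every two years.
def transistors(start_count, years):
    """Project a transistor count forward, doubling once per two years."""
    return start_count * 2 ** (years // 2)

# Hypothetical starting point: 2,300 transistors.
print(transistors(2300, 10))  # 5 doublings -> 73,600
print(transistors(2300, 40))  # 20 doublings -> about 2.4 billion
```

Twenty doublings multiply the count by over a million, which is exactly the trajectory the industry has ridden and the reason the physical limits of transistor size now loom so large.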

While the topic of quantum mechanics may appear daunting at first, the general basis behind this fascinating science is rather simple. To provide a brief summary, quantum mechanics is a relatively new branch of physics that focuses on processes happening at the atomic level. It arose when scientists realized that the rules of physics governing the world at a large scale don’t seem to match the behavior of subatomic particles like electrons and photons. While objects in classical mechanics exist in a specific place at a specific time, objects in quantum mechanics can exist in numerous different places and be different things at the same time. Additionally, quantum mechanics isn’t just confined to theoretical physics. Companies that can incorporate this new variable into their machinery could potentially change their entire industry, which is ultimately what D-Wave is trying to accomplish.

For the past several decades, the semiconductor industry has built faster, more efficient processor chips by making increasingly smaller transistors. The idea is that the smaller the transistors are, the more you can squeeze onto a microprocessor chip and the faster the chip can process information. So far, the most advanced microprocessors have circuit features that are only 14 nanometers long, which is smaller than most viruses. Experts estimate that in a few years, manufacturers may start building transistors that are only 10 atoms long. But despite the industry’s successful track record, there is a definite limit to how small a transistor can be before it becomes too unreliable to use. That is why D-Wave turned to the bizarre rules of quantum physics to create a machine that it claims can process data far faster than classical computing will ever be capable of. In 2015, the company announced the construction of the D-Wave 2X system, billed as the world’s first fully operational quantum computer.

The big question for most people, however, is how does a quantum computer work? Canadian Prime Minister Justin Trudeau gained huge media buzz on the internet when he offered his own explanation: “What quantum states allow for is much more complex information to be encoded into a single bit. A regular computer bit is either a 1 or 0—on or off. A quantum state can be much more complex than that because as we know, things can be both particle and wave at the same time and the uncertainty around quantum states allows us to encode more information into a much smaller computer.”

Although initially impressive, Trudeau’s explanation is not quite correct. While it is true that conventional computers use bits that are either a 1 or a 0, the quantum states of quantum computers don’t squeeze more information into a single bit. A quantum bit, or qubit, is in a state of complete mystery until it is measured, at which point it becomes a normal 0 or 1. What makes quantum computers so powerful is that this state of mystery, known as superposition, allows a qubit to be 0, 1 or both at the same time. This characteristic is what gives quantum computers like the D-Wave system their most valuable feature: the ability to consider all possibilities simultaneously and choose the best one.
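Superposition and measurement can be mimicked with a toy simulation. This is a sketch of the textbook qubit picture, not of how D-Wave's annealing hardware actually works: a qubit carries two amplitudes until it is measured, at which point it collapses to a definite 0 or 1.

```python
import random

# Toy qubit: two amplitudes (for the 0 and 1 states) whose squared
# magnitudes sum to 1. An equal superposition puts 1/sqrt(2) on each.
class Qubit:
    def __init__(self, amp0, amp1):
        self.amp0, self.amp1 = amp0, amp1

    def measure(self):
        """Collapse to 0 or 1 with probability given by the squared amplitude."""
        p0 = self.amp0 ** 2
        outcome = 0 if random.random() < p0 else 1
        # After measurement the superposition is gone: the qubit is
        # now definitely in the observed state.
        self.amp0, self.amp1 = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
        return outcome

q = Qubit(2 ** -0.5, 2 ** -0.5)  # equal superposition of 0 and 1
first = q.measure()              # randomly 0 or 1
assert q.measure() == first      # re-measuring gives the same answer
```

Note what the toy deliberately leaves out: real quantum speedups come from interference between many entangled qubits before measurement, not from the random collapse itself, which is exactly why Trudeau's "more information per bit" framing misses the point.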

While a classical computer solves a problem by considering each possible solution one at a time, experts estimate that a quantum computer can process and solve a problem up to 100 million times faster than a conventional computer. Given this extraordinary speed, it’s little wonder that Google, NASA, Lockheed Martin and the U.S. Department of Energy have all shown great interest in D-Wave’s product.

Quantum computing allows the tech industry not only to meet the predictions of Moore’s law, but also to greatly surpass them. The D-Wave 2X system is expected to become an irreplaceable asset in solving optimization problems that require sifting through enormous stockpiles of data. Examples of the D-Wave computer’s applications range from finding more accurate patterns in weather to becoming a cutting-edge tool in financial analysis worth millions of dollars. Its ability would usher in a new age of computing that scientists in the past would have deemed impossible.

Despite all the promise it offers, quantum computing remains a field in its infancy. The D-Wave system is not perfect, and its processing capabilities are currently limited. In addition, quantum computers like D-Wave’s cost between $10 million and $15 million due to the difficulty of building one. These machines require liquid helium to cool their hardware to just above absolute zero (-273.15 C) in order to maintain the quantum state, and any interference, whether from outside or inside, could destroy the fragile balance of the system. Yet despite the difficulties involved, the field of quantum computing has attracted the attention of organizations around the world interested in a slice of this revolutionary new benchmark in scientific achievement. In the end, we should expect further details about the semiconductor industry’s latest and greatest solution, as it seems Moore’s law won’t be broken anytime soon.

Originally published on April 27, 2016, in The Miscellany News as “More quantum computing research needed.”

The Ecological Gamble with Deep-Sea Mining

Picture Credit: AErchie | Diagram of Deep-Sea Mining | 2011 | The Curmudgeon’s Magazine

Thanks to our insatiable demand for Earth’s natural resources, science never fails to find new ways to take advantage of what the planet offers. From the sunlight to the bedrock, companies have succeeded in extracting energy and materials from the Earth in the most creative and often destructive ways. This time, the Australian-Canadian company Nautilus Minerals Inc. claims that the next area of focus should be the bottom of the ocean.

The deep sea remains humankind’s last unexplored frontier on Earth; we know more about the surface of Mars than the bottom of the ocean. Hidden in the depths of the dark sea floor is an abundance of precious metals more valuable than any treasure one might read about in a pirate book.

The ocean not only contains huge nodules of manganese, nickel, and copper, but it also has rich deposits of high-grade zinc, gold, and silver beneath its hydrothermal vents, along with mineral layers made of cobalt and platinum.

With such an immense collection of riches right in front of us, it’s no surprise that companies are racing to claim rights to these seafloor territories. The first to do so is Nautilus Minerals, a pioneer in what experts are calling “deep-sea mining.” Using its new and revolutionary underwater mining machines, the company plans to cut up parts of the seafloor and use a collection machine to send the material up to a ship on the ocean surface.

There, the sediments are filtered to separate the precious minerals from seawater and other substances. Based on existing technology used to dig trenches for oil and gas pipelines, these 50-foot-long mining machines are remote-controlled, which allows the company to extract the ores without sending workers more than a mile below the ocean surface.

Unsurprisingly, the efforts of Nautilus Minerals have caught the attention of several environmental groups, who have denounced deep-sea mining as destructive to the entire marine ecosystem. While it’s true that deep-sea mining may lead to massive habitat destruction and species extinction, the greatest concern is that no one knows exactly what will happen as a consequence of extracting energy and minerals from deep in the ocean.

“The truth is that we don’t know what the true environmental impacts of deep seabed mining are as yet. We know little about the ecology of the deep sea and the resilience of the system, and the effectiveness of the proposed efforts to assist natural recovery are unknown,” stated Greenpeace Oceans Campaigner Richard Page.

The ocean is more than just an expansive body of water. It not only serves as one of the largest sinks for greenhouse gases on Earth, but it also holds some of the largest reservoirs of methane gas beneath the seafloor. If something went wrong with the carbon dioxide absorption or if the trapped methane escaped into the atmosphere, the effects of climate change would rapidly worsen and cause unimaginable harm to the planet’s atmosphere.

The bottom line is that Nautilus Minerals’ efforts to extract precious metals from the ocean floor will no doubt damage the ecosystem, but the scope of that damage remains frightfully unknown. Given the unique nature of the ocean floor, anything can happen in only a short amount of time.

Although deep-sea mining could have a significant effect on the environment, Nautilus Minerals argues that its overall impact will not be as severe as that of a terrestrial mine. According to Chief Financial Officer Shontel Norgate, there won’t be issues involving community displacement, the use of freshwater supplies, or erosion and loss of land. Not only is the procedure itself minimally disruptive, but the minerals the project will collect, especially copper, are crucial for green energy technologies like wind and solar power and electric cars.

“If we’re saying no to fossil fuels, we’re effectively saying yes to more copper. Where is that copper coming from?” asked Norgate.

In addition, Nautilus Minerals has stated that it wants to pave a responsible path toward deep-sea mining by setting an example for other companies. The company asks the global community and all the skeptics to give it a chance to prove itself.

“I certainly believe that if we get this right…it does have the potential to start a new industry and change the way we’ve been mining copper for decades. We have a clean piece of paper here to decide how we want to do this, how we want this industry to be,” stated Norgate in a July 2015 interview.

As well-intentioned as Nautilus Minerals might be, deep-sea mining leaves open too many risks of unforeseen consequences. After obtaining permission from Papua New Guinea’s government in 2014, the company expects to begin its mining project off the country’s coast in 2018. If the project is successful, the company may collect at least 80,000 tons of copper and 150,000 ounces of gold per year.

In our market-driven world, the success of Nautilus Minerals will only provide an incentive for numerous other companies to do the same. Nautilus Minerals may be the first to experiment with deep-sea mining, but it certainly won’t be the last. After all, with the ocean floor rich with precious minerals, it’s only natural for people to want to take advantage of them before anyone else does.

Already, other corporations, such as Lockheed Martin, are making plans to commercially explore the seafloor. So far, the International Seabed Authority, the United Nations body regulating this growing industry, has issued a total of 19 licenses to different organizations. While the benefits of deep-sea mining may outweigh the costs for the moment, those costs will grow exponentially as more and more firms join the bandwagon.

As the industry grows, the possibility of things going wrong, like a disastrous chemical spill, will rapidly increase as well. Eventually, an entire swarm of underwater mining machines will be drilling into the ocean floor, ravaging the planet on an astronomical scale. Even Nautilus Minerals itself plans on expanding to other areas if the project is successful. What is to stop others from doing the same?

Deep-sea mining presents itself as a glittering, attractive new way to squeeze more natural resources from the environment. But, as with similar past endeavors, once the industry gathers enough momentum, it will become almost impossible to stop and will leave behind a trail of destruction in its wake.

Originally published on April 21, 2016, in The Miscellany News as “Deep sea mining potentially detrimental to environment.”

A Bright, Eco-Friendly Future: Bioluminescence as Our Next Light Source

Picture Credit: Lit by Bioluminescence | Glowee

Imagine a world where the streets glow with a dreamlike shade of blue as if you’re walking in the presence of ethereal spirits wandering the city. While that image sounds too mythical to be real, one start-up company is working to create this otherworldly environment for the future. Glowee, a French company planning on harnessing the power of bioluminescent bacteria, has officially debuted after successfully crowd-funding in May 2015. Their goal: to replace the electric street lamps of France with blue microbial lamps.

Bioluminescence is an organism’s ability to generate light in the dark. This is different from fluorescence, which involves absorbing light from an external source and immediately re-emitting a modified version of that light. While fluorescence is a physical process, bioluminescence is a chemical one that occurs due to an enzyme, luciferase. In the biochemical reaction, luciferase catalyzes the reaction of the light-emitting pigment luciferin with oxygen to produce light. For humans, bioluminescence has the potential to become a valuable source of renewable lighting.

Consider the latest global push toward reducing CO2 emissions and fighting climate change. At the 2015 UN Climate Change Conference, world leaders came to an agreement that everyone must do everything they can to cut down our energy consumption. While politicians can promise to limit emissions, real progress cannot occur without a viable green energy solution. Rather than an immediate transition to green energy, what if we tackled the problem one chunk at a time? This is where inspirations from nature and the creativity of science mesh together. For instance, bioluminescence doesn’t require any electricity to produce light. Given this fact, researchers are investigating engineered bioluminescence as a possible alternative to regular street lighting.

Replacing electric lamps with bioluminescent ones may seem almost trivial in the face of cutting global energy consumption, but reducing the number of public street lamps is a very necessary first step. In truth, lighting up the streets every night is an incredibly expensive task. According to the U.S. Energy Information Administration, the U.S. spent a total of $11 billion on outdoor lighting in 2012, 30 percent of which went to waste in areas that didn’t use or need that light. Furthermore, a recent study estimated that there are currently about 300 million streetlights around the world and that the number will grow to 340 million by 2025. Given the severe drawbacks that come with electrical lighting, bioluminescent light offers a way to alleviate some if not most of that cost.
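The figures above make the waste concrete. A quick back-of-the-envelope check (using only the numbers cited in this paragraph; the script is just illustrative arithmetic):

```python
# Back-of-the-envelope check on the outdoor-lighting figures cited above.
outdoor_spend = 11_000_000_000   # U.S. outdoor lighting spend, 2012 (USD)
wasted_share = 0.30              # portion lighting areas that didn't need it

wasted = outdoor_spend * wasted_share
print(f"Wasted outdoor-lighting spend: ${wasted / 1e9:.1f} billion")

streetlights_now = 300_000_000   # streetlights worldwide today
streetlights_2025 = 340_000_000  # projected by 2025
growth = (streetlights_2025 - streetlights_now) / streetlights_now
print(f"Projected growth in streetlights: {growth:.0%}")
```

Roughly $3.3 billion wasted in a single country in a single year, with the global installed base still growing, which is the scale of the opportunity any electricity-free light source would be chipping away at.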

Today, the race to find the best form of engineered bioluminescence continues to bring us various creative inventions and solutions. At Syracuse University, a small team of scientists led by Rabeka Alam discovered a way to chemically attach genetically altered luciferase enzymes from fireflies directly onto the surface of nanorods to make them glow. In a process known as Bioluminescence Resonance Energy Transfer (BRET), the nanorod produces a bright light whenever the luciferase enzyme interacts with the fuel source, and it can produce different colors depending on the size of the rod. According to one scientist on the team, “It’s conceivable that someday firefly-coated nanorods could be inserted into LED-type lights that you don’t have to plug in.”

On the other side of the world, Dutch designer Daan Roosegaarde has been working together with the tech company Bioglow to create bioluminescent trees to light up the streets. Incorporating research from the University of Cambridge, Roosegaarde and his team spliced DNA containing the light-emitting properties of bioluminescent organisms into the chloroplasts of plants. As a result, those plants can produce both luciferase and luciferin, which allows them to glow at night.

For the French startup Glowee, the plan is to harness bioluminescence by using Aliivibrio fischeri, a species of bioluminescent bacteria found in certain marine animals like the Hawaiian bobtail squid. The team first produces a gel containing the bioluminescent bacteria along with various nutrients that keep them alive. Then, the gel is used to fill small, transparent containers, allowing the light to shine through. This method makes the light source not only wireless but also customizable depending on its purpose and design. These bioluminescent lamps would certainly appeal to shop owners in France, especially since the French government recently passed a law that requires businesses to turn off their lights at 1 a.m. to fight light pollution.

Unfortunately, despite countless efforts toward perfecting engineered bioluminescence, it may still be a long while before our streets are lit by genetically-altered plants or bacteria. The two main obstacles are the relatively dim nature of the lights and their short lifespan. Even Glowee’s current prototype can only produce light for up to three days. Some argue that the cost and production of these bioluminescent products greatly overshadow their benefits, saying that such eco-friendly alternatives can never catch up to electrical lighting. While there may be limitations, all these projects by businesses and institutions signify the public’s growing desire for real change.

Notably, many of these projects were funded not by governments but through Kickstarter and other crowdfunding platforms. Perhaps many of the backers were simply mesmerized by the aesthetic appeal, but the public nevertheless recognizes the potential behind engineered bioluminescence. With continued effort and scientific innovation, a town or neighborhood powered by living organisms instead of electricity could become a reality. By following the ghostly blue light ahead, we would take a tremendous first step toward a world where humans and nature can truly coexist.

Originally published on March 30, 2016, in The Miscellany News: Scientists note perks of bioluminescence

Precision Medicine: It’s Best Not to Get Hopes Up

Picture Credit: President Obama | 2016

In an ideal world, every person would receive customized healthcare that matches their biological makeup. We may not live in an ideal world today, but the latest efforts in precision medicine aim to come as close to it as possible. Despite the enthusiasm for this movement, similarly grand ambitions in the past have shown that results often fall short of the promises made. Personalized medicine is a model of healthcare in which every practice and treatment is tailored specifically to each patient; the idea was to collect genetic information from all individuals to create an all-encompassing database. However, personalized medicine has undergone some changes over the years and has recently redefined itself as “precision medicine.”

Rather than creating drugs or medical devices that are unique to a single patient, precision medicine classifies individuals into small groups based on their susceptibility to a particular disease or their response to a specific treatment. These groups let physicians know what sort of care a patient needs depending on which sub-group the patient belongs to.

Don’t let the definition change fool you. The name change mainly aims to let the movement start afresh. By rebranding itself as precision medicine, the practice gains a second chance after its earlier failures. Even with this fresh start, precision medicine remains vulnerable to the same obstacles that personalized medicine stumbled over in the past. For instance, electronically recording the genome of every individual remains expensive, and collecting all that information and acquiring the technology to store it is not something to take lightly.

There are also fears regarding patient privacy and legal liability: sharing patient data can easily end badly for both the doctor and the patient. Nevertheless, precision medicine has only grown more popular since its rebranding. What makes it so revolutionary is its focus on individuals rather than on a demographic. Instead of using a one-size-fits-all approach, it takes into account individual differences, from genetic makeup to personal lifestyle. The hope is that precision medicine can accelerate the creation of tailored treatments for diseases like cancer.

By expanding genetically-based patient trials, scientists and doctors will have much more information to work with when leading research and providing treatment. A nationwide database of patients’ genetic and medical information could help guide treatment and reduce uncertainty. Precision medicine also attempts to ensure that drug companies spend time developing treatments for specific groups of patients, whereas most firms currently optimize profits by producing drugs that benefit large groups of people.

While beneficial to many, this approach ignores the plight of those with rare medical conditions, who must go to extreme lengths to get proper care. Precision medicine aims to promote the creation of treatments for a wide range of diseases, common and rare.

There are countless reasons to push for precision medicine. However, I am suspicious of the growing hype over precision medicine as the next great landmark achievement in healthcare. Even with the aforementioned risks, the rebranding has succeeded spectacularly. Precision medicine has entered the forefront of national discussion, and the public often views it as the herald of a new age of healthcare.

In his 2015 State of the Union address, President Obama announced the Precision Medicine Initiative (PMI) to push the nation to adopt this movement, asserting, “My hope is that this becomes the foundation, the architecture, whereby 10 years from now we can look back and say we’ve revolutionized medicine.” The President asked Congress for $215 million to support the initiative. Thanks to Obama’s support, the PMI Cohort Program plans to enroll one million U.S. volunteers.

Despite the optimistic outlook, precision medicine is far from ideal. In addition to the costs and legal issues, there are concerns as to whether a database of genetic information would even be significantly useful. Back in 2003, scientists discovered that even after mapping out the human genome, a person’s genetic code remains perplexingly complex.

There are too many risks involved in interpreting genetic information. In one case, a woman underwent extreme surgery and had her uterus removed due to an incorrect reading of her genetic-test results. Unfortunately, such accidents are not uncommon. There are also problems outside the scope of the medical field. Once an all-encompassing patient database is established, countless ethical issues arise. Say the ideal scenario of establishing precision medicine comes to fruition. Who would claim ownership of this data? How do we make sure this information isn’t abused and used to deny insurance coverage or jobs? What is to stop insurance companies from raising premiums once a person’s genetic information is available?

These are all valid concerns for an issue as complicated as this one, and they lead directly to questions of security. Hospitals and other havens of digital medical information are easy targets for cyber-attacks. Just recently, a string of hospitals in California, Kentucky, and Maryland fell victim to information technology breaches and were forced to pay a ransom to convince the hackers to restore their systems. If something similar happened to the precision medicine patient database, the consequences could be catastrophic.

The most important message is that we should always carefully consider all the possibilities before launching headfirst into what seems like a great idea. The extreme hype over precision medicine as some great benchmark in healthcare will only blind us to the pitfalls along the way. Sure, the likelihood of disaster may be small, but if it does happen, there will be plenty of blame to go around.

Precision medicine promises to dramatically improve quality of life with one simple end goal. However, one must not get carried away by delusions of grandeur. For now, it’s best to approach the issue with caution.

Originally published on April 13, 2016, in The Miscellany News: Risks of precision medicine need review

Losing Our Last Resort: The Rise of Antibiotic-Resistant Bacteria

Picture Credit: Bubonic Plague Bacteria | The National Library of Medicine

As we head into 2016, we have much to feel grateful for in this modern age. Technological marvels such as computers and high-speed Internet define an era of advancement that has exponentially sped up our society’s growth and capabilities. But amidst this impressive and fast-paced development, one crucial feature of humankind’s modern society is perilously close to collapsing. Of the many things we take for granted in the 21st century, protection against bacteria probably ranks the highest in terms of human impact. Of course, given our track record against such killer pathogens, this shouldn’t come as much of a surprise. Our species has lost repeatedly to plagues and disease since the start of human history. Pathogens such as bacteria, viruses, and other microorganisms are our oldest enemy.

When it comes to fatal illnesses, a large portion of human history was spent without an adequate solution. In Europe, terrors such as the Bubonic Plague brought death to every door and we had no way of fighting back. It wasn’t until 1796 that our first real counterattack came with the invention of vaccines by English physician Edward Jenner. About 130 years later, Scottish biologist Alexander Fleming discovered penicillin, a powerful antibiotic produced by the Penicillium mold, cementing our defenses against pathogens and saving millions of lives.

But the bad news is that we only thought we vanquished our invisible adversaries. In fact, they have only gotten stronger. You see, microorganisms like bacteria are not fearsome monsters that disappear once you slay them. They’re much tinier, but so numerous that they are almost impossible to eradicate completely. And thanks to evolution, the few that survive due to some chance mutation multiply uncontrollably until we’re faced with an upgraded version of our old foe. Every time this has happened in the past, scientists have responded with stronger, more potent antibiotics, which deadly bacteria eventually thwart. Thus, advancements in antibiotics have always led to more resistant bacterial strains with new ways to survive, fueling an endless microbiological arms race that grows more tenacious with each cycle.

So, how long until our microbial enemies catch up to our highly sophisticated, advanced medicine? A hundred years? Two hundred? Actually, they already have. In a study conducted this year, a team of experts in China discovered strains of E. coli bacteria in livestock that could not be killed by antibiotics. Normal, right? Except the antibiotics in question were polymyxins, a class of antibiotics that has remained effective for the roughly sixty years since its discovery. These drugs represent the most potent weapons in our arsenal against bacterial infections, our “last resort,” and they have proven useless against this new, impervious strain.

Upon further investigation, the team identified the gene responsible as MCR-1. Unfortunately, they also found this gene in 15% of the meat samples from food markets and 21% of the livestock tested in Southern China over the span of four years. Even worse, E. coli carrying this gene has already spread to humans: of 1,322 samples from patients with bacterial infections, 16 contained the MCR-1 gene.

According to Mark Woolhouse, a Professor of Infectious Disease Epidemiology at the University of Edinburgh, infections from antibiotic-resistant bacteria are already causing the deaths of tens of thousands of people every year. Taking the spread of the MCR-1 gene into consideration, that number will surely increase in the future. This could very well start an era of “pandrug-resistant” bacteria or, as some others have called it, the “antibiotic apocalypse.”

But how did this become such a widespread problem so quickly? It turns out that the MCR-1 gene is found on plasmids, mobile pieces of DNA that can jump from one organism to another. Bacteria can therefore easily pass the gene along through a process called horizontal gene transfer, one of the main reasons antibiotic resistance spreads so quickly in the first place. This is an awfully serious development, since bugs like E. coli are “the most common form of hospital-acquired infection.” Scientists worry that there may soon come a time when more patients fall ill from bacterial infections and doctors won’t be able to do a thing about it.

To emphasize, this isn’t just some isolated, freak incident. A study from 2011 similarly found that the number of cases involving bacteria resistant to carbapenems, one of the strongest classes of antibiotics in our possession, increased dramatically from just 3 cases in 2003 to 333 cases in 2010. That’s an increase of 11,000% in just seven years.
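As a quick sanity check on that figure, percent increase is just (new − old) / old × 100, sketched here with the case counts reported above:

```python
# Carbapenem-resistance cases reported in the 2011 study cited above.
old_cases = 3    # cases in 2003
new_cases = 333  # cases in 2010

# Percent increase: (new - old) / old * 100
percent_increase = (new_cases - old_cases) / old_cases * 100
print(percent_increase)  # prints 11000.0
```

In other words, cases grew 111-fold, which corresponds to an increase of exactly 11,000 percent.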

Experts have been dreading this day since they first realized bacteria could adapt to penicillin. Sure, scientists could just make a new, even stronger antibiotic, but unfortunately, we have long since passed the age of rapid antibiotic development; in fact, we’ve fallen several decades behind. Our progress in antibiotic discovery has steadily slowed to a plateau, and bacteria have finally caught up and passed us.

It is rather ironic that this terrible news came in the middle of the first World Antibiotic Awareness Week. Just as this global campaign was trying to raise awareness and encourage strict regulations on antibiotics, this study further demonstrated the urgency of the situation. But as much as this crisis seemed inevitable, it really wasn’t. Just as in any story with a moral, we essentially did this to ourselves. Almost every one of us contributed to the problem without even being aware of it, because the main reason everything spiraled out of control was our love of meat.

It turns out that we have been using antibiotics beyond recklessly in the agriculture industry. According to reports, farmers around the world feed 63,000 tonnes of antibiotics to pigs, cattle, and chickens every year, a number estimated to grow by 67% to 106,000 tonnes by 2030. That’s right: we have been feeding humanity’s most powerful antibiotics, our last resort against disease, into the mouths of livestock by the truckload. This is because people all over the world have grown more prosperous in recent years and are buying more meat products. According to the UN Food and Agriculture Organization (FAO), people in developing countries “now eat 50 per cent more meat per person, on average, than they did in 1983.” Livestock products, including fish as well as eggs and dairy, have become a fast-growing market that we can no longer live without or get enough of.

Thanks to skyrocketing demand, quick and efficient factory farms have become the norm. To keep the animals alive and fat, these farms feed them high doses of antibiotics. A whopping 80% of antibiotics consumed in the United States go toward livestock, and America is only in second place. On the list of excessive antibiotic use, China is the worst offender, consuming 50% more than the U.S. at a total of 15,000 tonnes per year, a number projected to double by 2030. India, Brazil, Indonesia, and Nigeria are all showing a worrisome upward trend in antibiotic use as well.

Even if countries started banning the use of antibiotics as growth promoters in livestock, it’s too late to preserve the effectiveness of the antibiotics that already exist. Epidemiologists compare such bans to “closing the barn door after the horse has bolted.” By the time resistant bacteria are multiplying in humans (which they are), the problem is far beyond the control of farmers.

There are other contributors to this problem besides livestock. Any unnecessary use of antibiotics only serves to further tip the scale in favor of deadly bacteria. Careless use of antibiotics to treat colds and the flu contributes to antibiotic resistance, since those illnesses are caused by viruses, not bacteria, and antibiotics do nothing against them. Overall, a lack of knowledge seems to be the biggest factor in all this. A report by the World Health Organization (WHO) showed that 64% of respondents who were aware of antibiotic resistance still believed that antibiotics could be used for colds and the flu.

However, our doom isn’t quite sealed yet. Despite the grim forecast, experts still say that rigorously limiting the use of antibiotics could help greatly. The U.S. Food and Drug Administration states that while banning antibiotics in animals may not stop all resistant strains, it can keep bacteria like Salmonella, which sometimes contaminates meat, eggs, and dairy, from reaching the same danger levels. Additionally, the U.S. Centers for Disease Control and Prevention advises people to take antibiotics exactly as the doctor prescribes them, to never share leftover antibiotics, and to not ask for antibiotics when the doctor doesn’t think they’re necessary.

While the situation does look bleak now, it still holds more hope than it did before the creation of vaccines and penicillin. Unlike before, we have weapons and defenses that stand a chance against one of the most powerful forces of nature. Under a united effort, humankind can still achieve a turnabout of miraculous proportions.

Originally published on January 28, 2016, in Boilerplate Magazine: Losing Our Last Resort