Diets High in Whole Grains and Low in Meat Decrease Risk for Colorectal Cancer

Research Highlight by Kurtis Chien

Colorectal cancer is the third most common cancer among men and women in the United States; approximately 371 people are diagnosed each day. A substantial proportion of these cases could be prevented through lifestyle changes. An analysis of 99 papers covering a combined 29 million subjects concluded that simple factors such as diet and physical activity strongly influence the likelihood of developing colorectal cancer. For example, diets heavy in red meat, defined as consumption of 500 g or more per week, raise the chances of developing the disease. Processed meats, such as bacon, hot dogs, and sausages, were also deemed carcinogenic. Overconsumption of alcoholic beverages (two or more servings per day) puts a person at risk, as do being overweight or obese and living a mostly sedentary lifestyle.

On the other hand, certain foods and behaviors can reduce the probability of colorectal cancer. For example, eating 90 g of whole grains or other fibrous foods every day could bring the risk down by 17 percent. An active lifestyle was also found to contribute to better colon health. Overall, nearly half of U.S. colorectal cancer cases per year were directly attributable to lifestyle; that is, these cases could have been prevented with a healthy diet and exercise. It may seem intuitive that a healthy diet and lifestyle can prevent disease, but having the data to support such a hypothesis is valuable. The report provides evidence of the link between specific lifestyle choices and the development of colorectal cancer, and such information may help those at risk recognize their risk factors. At the very least, it can encourage people to make decisions that will ultimately better their lives. Dr. Edward L. Giovannucci, lead author of the report, puts it this way: “Many of the ways to help prevent colorectal cancer are important for overall health” (American Institute for Cancer Research).

American Institute for Cancer Research. "Whole grains decrease colorectal cancer risk, processed meats increase the risk: Report analyzing the global research finds hot dogs and other processed meats increase risk of colorectal cancer, eating more whole grains and being physically active lowers risk." ScienceDaily. ScienceDaily, 7 September 2017.

Complications of a Stroke: Susceptibility to Infections

By Min Seo Jeong

Having a stroke causes many detrimental effects on the brain, but research shows that a stroke can also have long-term effects on the body’s immune system.

Researchers at the University of Edinburgh’s Roslin Institute examined the effects of stroke in mice and found that mice that had experienced a stroke were more likely to develop bacterial lung infections. During a stroke, nerves release a chemical called noradrenaline, which has multiple effects: increasing blood supply, raising heart rate, and releasing stored energy. However, noradrenaline also limits the number of marginal zone B cells, immune cells that produce antibodies. The resulting loss of these cells weakens the immune response, making stroke patients more prone to infections that can be life-threatening.

In the mice, the researchers used therapies to block the effects of noradrenaline, ultimately decreasing their susceptibility to infections. They cautioned against applying this treatment directly to humans and emphasized the importance of finding new treatments to help stroke patients recover.

Dr. Barry McColl, one of the researchers at the institute, says that the goal is now to "build on our findings by developing and testing new treatments that can block or bypass these immune deficits with B cells a particular target."


University of Edinburgh. "Immune discovery points to therapies to improve stroke recovery." ScienceDaily. ScienceDaily, 20 April 2017.

Say that Again: Independent language development in bilingual children

By Meg Thode

When a child is bilingual, the linguistic development of one language takes place independently from that of the other. That is what a team of researchers from Florida Atlantic University found in their work with English-Spanish bilingual children. It has previously been established that expanding a child’s vocabulary is correlated with more sophisticated use of grammar, and vice versa. This team wanted to know whether exposure to one language was correlated with development or sophistication in the other. For this study, vocabulary and level of grammatical development were measured at six-month intervals in children who spoke Spanish and English as their first languages, between the ages of two and a half and four years. The finding was that the two are not related, although in some children the development of English skills made their Spanish more vulnerable.

The study leader, Erika Hoff, puts it this way: "There is something about differences among the children and the quality of English they hear that make some children acquire vocabulary and grammar more rapidly in English and other children develop more slowly.” She continues, "I think the key takeaway from our study is that it's not the quantity of what the children are hearing; it's the quality of their language exposure that matters. They need to experience a rich environment." This research has a number of applications, including in education policy and child care programs.


Florida Atlantic University. (2017, April 20). In young bilingual children, two languages develop simultaneously but independently: Study also shows Spanish is vulnerable to being taken over by English, but English is not vulnerable to being taken over by Spanish. ScienceDaily. Retrieved April 21, 2017.

What is the impact of climate change on the virulence of the diseases ticks and mosquitoes carry?

By Carolyn Burtt 

With the number of reported cases of Lyme disease having tripled in just over a quarter century and malaria causing thousands of deaths annually, ticks, mosquitoes, and other disease-spreading creatures are on the minds of many scientists.

The association between the increased spread of Lyme disease and climate change has only begun to be explored, and it carries valuable information for many people in colder northern climates, where ticks have historically not thrived because of long freezing periods. As climate change pushes spring earlier each year, ticks have more warm, humid days in which to forage for their next meal.

Mosquitoes, the insects responsible for diseases such as “chikungunya, dengue, malaria, West Nile virus, yellow fever and Zika,” also prefer warm weather to long winters, but the spread of the diseases mosquitoes carry does not increase significantly as the warm season lengthens. Mosquitoes may migrate farther north than they have in the past, but there is an inverse correlation between how warm the weather is and how long a mosquito lives. While the pathogens responsible for mosquito-borne diseases favor warm weather, since heat accelerates their development, the mosquitoes themselves do not survive long in particularly warm climates. As warm seasons grow longer, mosquito season may lengthen, but the virulence of these diseases does not necessarily change. It remains to be seen how the warming of the earth will affect the spread of mosquito-borne diseases; in the meantime, the best preparation is to stay vigilant and continue protecting ourselves against mosquitoes.


National Public Radio. (2017, April 21). #CuriousGoat: Will Climate Change Help Ticks and Mosquitoes Spread Disease? NPR. Retrieved April 22, 2017.

Nutrition Links to Non-Alcoholic Fatty Liver Disease

By Kurtis Chien-Young

A study from Rotterdam surveyed a population of overweight, elderly citizens in the Netherlands and found that diets high in animal protein were significantly correlated with a higher risk of developing non-alcoholic fatty liver disease (NAFLD). NAFLD occurs when fat builds up in more than 5% of the hepatocytes, or liver cells. The condition stems from the consumption of animal products and processed foods. It is often comorbid with obesity and can lead to burdensome health outcomes such as liver-tissue scarring, liver failure, and cancer.

Approximately 1 billion people suffer from NAFLD, and the disease is most prevalent in developed Western countries. This may result from the higher density of animal protein and calories in a typical Western diet. Consumption of animal protein has been linked with disruptions to the body’s homeostasis and its ability to metabolize glucose.

The study recognizes that consumption of macronutrients such as animal protein is only one risk factor for the development of NAFLD, and that a greater concern lies in the variety of one’s diet. Simply substituting other macronutrients for animal protein yielded no meaningful results in the study’s participants. Given this, the authors recommend focusing on a nutrient-diverse diet rather than on removing any single nutrient, such as fat or animal protein.


European Association for the Study of the Liver. "Diet high in animal protein is associated with NAFLD in overweight people: Significant associations between macronutrients and non-alcoholic fatty liver disease were found predominantly in overweight individuals." ScienceDaily. ScienceDaily, 21 April 2017.

Studies point toward patient complacency in post-heart attack regimens

By Mohamad Hamze

Statins are drugs prescribed to individuals classified as at risk for heart attack or stroke in order to lower cholesterol levels in the blood. When a person suffers a heart attack, their risk of a future heart attack or stroke is elevated over the course of the next year or more, further warranting a statin regimen. However, even given the elevated risk, studies found that over half of heart attack survivors who were prescribed a high-intensity statin after their first incident did not maintain their initial regimen. According to researchers from the Icahn School of Medicine at Mount Sinai in New York City, only 42 percent of those prescribed a high-intensity statin were still taking it regularly two years after their heart attack. Thirteen percent had switched to a medium- or low-intensity statin, 19 percent were still taking high-intensity statins but not regularly, and around 20 percent were no longer taking a statin of any kind.

Dr. Robert Rosenson of Mount Sinai has stressed the long-term health benefits of statin use in individuals at high cardiovascular risk, noting that the decreases in bodily and vascular inflammation and the plaque stabilization conferred by high-intensity statin regimens are very important in preventing future heart attacks and strokes. Why, then, are patients not continuing with regimens that may very well save their lives? Dr. David Pearle, a cardiologist at the MedStar Heart & Vascular Institute in Washington, D.C., offers a few possible reasons. Many patients have complained of muscle pain as a side effect of high-intensity statins, prompting their physicians to lower the intensity or stop the prescriptions altogether. He also cites increased cost, decreased cardiologist follow-up, and lower participation in cardiac rehabilitation clinics as factors in declining statin use over time, pointing to the need for physicians, cardiologists, and health care providers to develop more tolerable and affordable plans that patients can sustain for years after discharge.


"Many Heart Attack Patients Fail to Stick With Statins: MedlinePlus Health News." MedlinePlus. HealthDay, 19 Apr. 2017. Web.

Gut Bacteria May Influence Eating Disorders

By: Katie Campbell

Gut bacteria research is a hot topic in the medical community right now, as the gut microbiome is proving to have more influence on the human body than previously thought. A recent study performed at the University of North Carolina School of Medicine has found a potential link between anorexia nervosa and gut bacteria, adding to the growing body of research on the connection between the gut microbiota and the brain, known as the “gut-brain axis.”

Anorexia nervosa is a serious eating disorder that affects more than 3 million Americans and has the highest mortality rate of any psychological disorder. The study collected fecal samples from women with anorexia nervosa upon admission to the UNC Center of Excellence for Eating Disorders and again at discharge, when their weight had been restored to approximately 85 percent of their ideal weight. The researchers found significant differences in the gut microbiota: the samples taken at admission showed less intestinal diversity than those taken at discharge, and both were significantly less diverse than samples from healthy individuals.

Interestingly, as patients’ weight increased, their mood improved as well. The researchers therefore posit that there may be a connection between improved microbial diversity or abundance and the psychological symptoms of the eating disorder. They will continue to investigate this potential connection with a $2.5 million grant to perform studies in mice. They hope that one day they may be able to provide microbial therapy to patients with eating disorders to minimize readmissions and maximize cure rates.


University of North Carolina Health Care. "Gut bacteria population, diversity linked to anorexia nervosa: Studying the 'gut-brain axis,' researchers find evidence of an association." ScienceDaily. ScienceDaily, 5 October 2015.

Vitamin B: The Way to Your Polluted Heart

By Leili Najmabadi 

Many city dwellers are aware of the effect pollution has on their hearts, and it’s not just the bad mood brought on by a gray day. Air pollution has been consistently linked to cardiovascular and respiratory diseases, and as little as two hours of exposure has been shown to negatively affect white blood cell levels and heart rate.

However, Dr. Andrea Baccarelli and Dr. Alan Mensch have found that taking vitamin B supplements can reverse this harm. Baccarelli, chair of environmental health sciences at Columbia University, and Mensch, senior vice president of medicine at Northwell Health’s Plainview Hospital, tested 20 participants, all healthy, nonsmoking adults. Half of the participants were assigned to the placebo group, and the other half took the vitamin supplements after their two-hour exposure to the pollution. All participants were exposed to microscopic particles 2.5 micrometers in diameter, a size the researchers deemed “potentially the most dangerous form of air pollution due to their ability to penetrate deep in the lungs and adjacent blood stream.” Air with a high concentration of these particles can be responsible for increases in inflammation, heart attacks, lung cancer, DNA mutations, and premature births and deaths. An estimated 3.7 million premature deaths worldwide every year can be linked to this pollution.

The experimental group took the vitamin B supplements for four weeks before another exposure to the fine-particle air pollution. The researchers found that the supplements could restore the negative reactions to their normal states. Though this is a positive step in treating adverse effects, it remains most important to prevent air pollution from worsening in the first place.


Preidt, Robert. "Could a Daily Vitamin Curb Smog's Effect on the Heart?: MedlinePlus Health News." MedlinePlus Trusted Health Information for You. HealthDay, 14 Apr. 2017. Web. 17 Apr. 2017.

Prejudiced AI

By Alexander Pan

Artificial intelligence research has taken another step toward creating machines with human emotions and biases. In a new study, researchers from Princeton University determined that artificial intelligence systems acquire cultural biases when learning human languages. When trained on a specific language, the AI picks up the cultural biases embedded in the patterns and usage of its words. As machines are used more frequently for communication, the researchers recognize the problem of machines harboring internal biases. Words such as “roses” and “flowers” were associated with pleasant emotions of care and love, while words such as “ant” and “moth” were associated with unpleasant emotions of disgust and filth. The researchers then tested whether the AI associated certain words with race and gender. In the results, the AI was more likely to associate words like “engineer,” “scientist,” and “programmer” with the male gender, while words such as “nurse,” “marriage,” and “wedding” were more likely to be associated with the female gender. The researchers wish to eliminate these ingrained cultural biases from AI communication systems, and to that end programmers are beginning to look toward mathematical ways of training AI to reason more objectively without retaining cultural biases.


Princeton University, Engineering School. (2017, April 13). Biased bots: Human prejudices sneak into artificial intelligence systems. ScienceDaily. Retrieved April 16, 2017 from

What will SHERLOCK discover?

By Grace Materne

In the last decade, research has uncovered the potential and value of CRISPR-related technologies, whose RNA-guided enzymes can be used to detect specific sequences in target DNA or RNA. SHERLOCK, a new form of CRISPR technology, has emerged through recent research performed by Omar Abudayyeh and Jonathan Gootenberg.

SHERLOCK (Specific High-Sensitivity Enzymatic Reporter unLOCKing) has a wide range of applications in both research and clinical settings, including detecting viral and bacterial outbreaks, antibiotic resistance, and cancer. The technique utilizes the Cas13a enzyme, described as promiscuous because of its collateral cleavage activity after it cleaves its target molecule. Recently, researchers have increased the sensitivity of Cas13a a millionfold, allowing it to detect even trace amounts of a target nucleic acid. This sensitivity and specificity increase Cas13a’s potential use in diagnostics, especially in detecting cancerous DNA and tracking the frequency of antibiotic-resistant bacteria.

In addition, SHERLOCK is a paper-based test that requires no refrigeration, a characteristic that lets it be used in rural areas with very limited resources, such as a field clinic during an epidemic. Deb Hung, co-author and co-director of the Broad's Infectious Disease and Microbiome Program, comments on SHERLOCK’s future: “There is still much work to be done, but if SHERLOCK can be developed to its full potential it could fundamentally change the diagnosis of common and emerging infectious diseases.” SHERLOCK is an exciting medical technology because it is complex in design but simple in application, allowing it to bring diagnostics from the lab to a rural crisis.


Broad Institute of MIT and Harvard. "New CRISPR-based diagnostic platform unveiled: New system adapts tool known for gene editing for rapid, inexpensive disease diagnosis." ScienceDaily. ScienceDaily, 13 April 2017.

Ants rescue their injured companions

By Akari Miki

The life of Megaponera analis ants resembles that of warriors: they search for a nest of termites, assemble an army, and attack. After about twenty minutes of intense fighting, the ants carry the dead termites back to their nest and feed on them. During these fierce battles, some ants lose legs or collapse under the weight of dead termites, making them slower than their unscathed companions. Biologist Erik Frank at the University of Würzburg in Germany noticed that some ants were rescuing and carrying their injured companions. After this accidental observation, he performed experiments in which he painted the wounded ants and monitored them. He found that after being carried to safety, they recovered quickly, learning to walk with fewer legs. According to Frank, injured ants emit chemical signals that prompt rescue. This form of communication and rescue behavior evolved because it maintains the colony’s population, ensuring that the army stays large enough to successfully attack the termites.

In a commentary on this study, Peggy Mason, a neurobiologist at the University of Chicago, contrasted the rescue behavior of ants with that of mammals. Her research has focused on how rats rescue each other from traps. In one experiment, her group administered an anti-anxiety drug to rats and found that they became less inclined to release an entrapped rat, a discovery suggesting that rats exhibit empathy toward companions in distress. In contrast, the ants show no such emotional response and simply react to chemical signals. Nevertheless, regardless of how the ants are motivated to help each other, natural selection has favored their rescue behavior.


Source: Greenfieldboyce, N. (2017, April 12). No Ant Left Behind: Warrior Ants Carry Injured Comrades Home. Shots: Health News from NPR. Retrieved April 16, 2017.

Notched Raven Bone Provides New Insight into Neanderthal Cognition

By Nicole Loranger

A recently discovered raven bone has opened new discussion of Neanderthals’ cognitive capacity. Found at the Zaskalnaya (ZSK) site in Crimea, the bone stands out for the two evenly spaced notches engraved on its surface. Seeing this, researchers began to speculate that it might be the work of Neanderthals who passed through the area, a hypothesis that would suggest a higher level of thinking ability than previously accepted. To verify the legitimacy of the artifact, researchers conducted an experiment in which volunteers made equally spaced notches on turkey bones of similar size; the results indicated that, even when human accuracy errors were accounted for, the notches in the raven bone were comparable to those made by the volunteers. The raven bone was further compared with similarly etched bones from other archeological sites. After examining all the available data, scientists concluded that it is very possible these notches were engraved by Neanderthals, potentially in an attempt to decorate the bone in a meaningful way.

Similar finds have led researchers to hypothesize about the purpose of these decorated bones. According to the article, the leading interpretation is “personal ornaments, as opposed to butchery tools or activities.” This particular finding, however, is one of the first to suggest intentional modification of bird bones, leading some to conclude that Neanderthals may have possessed greater mental capacity than they are generally given credit for.


PLOS. "A decorated raven bone discovered in Crimea may provide insight into Neanderthal cognition: Two extra notches found in raven bone may have been a symbolic addition." ScienceDaily. ScienceDaily, 30 March 2017.

The Power of Waiting: How Vending Machine Delays Can Help Stop Obesity

By Ursula Biba

Obesity is epidemic in the United States and, coupled with a poor diet, is one of the strongest risk factors for heart disease, stroke, and diabetes. As more people become obese, healthcare workers, scientists, and engineers need to get creative in crafting interventions. With 1.3 million vending machines currently in the U.S., they are the most popular source of high-calorie, nutrient-poor snacks in the country. Preventive medicine experts at Rush University Medical Center suggest that consumers are more likely to purchase healthier snacks from a vending machine if an order delay is placed on the less healthy options, as the delay makes those snacks less desirable. Other vending machine interventions involve eliminating unhealthy snacks or the machines altogether, but these are unfavorable because they decrease profits.

A study comparing vending machine systems with taxes on unhealthy snacks, discounts on healthy snacks, time delays on unhealthy snacks, and a combination of all three showed increased healthy snack purchases in the time-delay, discount, and tax conditions. These findings propelled the creation of the Delays to Improve Snack Choices (DISC) system, in which a bar separates healthy and unhealthy snacks, a 25-second delay is placed on purchasing unhealthy snacks, and a live countdown lets the customer swap an unhealthy snack for a healthier one. In this study, healthy snacks were defined as those with more than 1 g of fiber per serving, no trans fat, and less than 250 calories, 35% of calories from fat, 350 mg of sodium, 5% of the daily value of saturated fat, and 10 g of added sugar per serving. As implementation of the DISC produced a 2-5% increase in healthy snack purchases without harming overall sales, these machines may be the future of snacking in the U.S., encouraging consumers to make healthier choices more often.


Rush University Medical Center. (2017, March 31). Time delays in vending machines prompt healthier snack choices: Researchers develop new vending machine technology to help improve snack habits. ScienceDaily. Retrieved April 8, 2017.

The Importance of Catching Some Zs

By Rebecca Moragne, TuftScope Research-Highlights Editor

No matter an individual’s age, sleep is essential to health. As humans age, however, its importance increases due to the connection between sleep and cognitive disease. The University of California, Berkeley released a study earlier this month showing that lack of sleep in older adults increases their risk of memory loss and associated mental and physical disorders. Matthew Walker, a UC Berkeley professor of psychology and neuroscience and the article’s senior author, stated, “Nearly every disease killing us in later life has a causal link to lack of sleep” (University of California-Berkeley, 2017). The decline in sleep among the elderly is not only in quantity, but also in quality.

As humans age, the brain regions that control deep sleep begin to deteriorate. Deep sleep produces slow waves and “sleep spindles” that convert short-term memories in the hippocampus into long-term information in the prefrontal cortex. Sleep deterioration in the elderly is also due to decreased regulation of neurochemicals such as galanin and orexin, which stabilize sleep and support the transition between sleep and wakefulness. According to the study, sleep deterioration due to these neurological changes “has been linked to such conditions as Alzheimer’s disease, heart disease, obesity, diabetes and stroke” (University of California-Berkeley, 2017). To prevent these diseases, both pharmaceutical and non-pharmaceutical interventions are being researched. While many people turn to sleeping pills, Walker believes these are a poor option because they lead to sedation, not sleep, which are very different states: “Sleeping pills sedate the brain, rather than help it sleep naturally. We must find better treatments for restoring healthy sleep in older adults, and that is now one of our dedicated research missions” (University of California-Berkeley, 2017). Instead of pills, electrical stimulation that amplifies slow brain waves during sleep is a potential remedy for a deficiency in deep sleep. Sleep is essential for daily recuperation and restoration, and because an aging brain’s ability to process information weakens, this rest becomes even more important. Hopefully, further research will show how to ensure quality of sleep in the elderly, while older adults learn the importance of securing sufficient quantity.


University of California - Berkeley. (2017, April 5). Deep sleep may act as fountain of youth in old age: Restorative, sedative-free slumber can ward off mental and physical ailments, suggests research. ScienceDaily. Retrieved April 8, 2017.

How mammals survived the dinosaurs by chewing differently

By Dominic Kleinknecht
Eberhard-Karls-University Tübingen

A new study at the University of Chicago found that mammal jaws evolved to allow side-to-side motions as well. As a result, food could not only be bitten but also ground with the molar teeth, which meant that mammals could have a more diversified diet. The study goes further, stating that the resulting dietary advantage helped mammals survive and thrive after the mass extinction at the end of the Cretaceous Period 66 million years ago that saw the demise of the dinosaurs.

David Grossnickel, a UChicago graduate student and author of the study, looked into functional advantages that mammals might have had over dinosaurs to explain why we find ourselves in the age of mammals rather than still in the age of dinosaurs. He analyzed the structure of teeth, jaw bones, and the attachment sites of controlling muscles using 2D images of early mammal fossils and 3D data collected from modern specimens. He found that while the molar teeth evolved to fit better into the corresponding molars of the opposing jaw, the musculature simultaneously evolved to allow side-to-side grinding motions. This meant that both a pitch rotation, producing the up-and-down movements of biting, and a yaw movement, grinding food like a pestle in a mortar, were possible. This duality of movement is shared by almost all modern mammals and meant that early mammals could eat a broader range of available foods and be more resourceful during times of food scarcity. According to Grossnickel, these adaptations and dietary advantages may have been crucial to early mammals’ survival and a key characteristic heralding the start of the mammalian era.


University of Chicago Medical Center. "How chewing like a cow helped early mammals thrive: Study shows how mammal jaws evolved to help our earliest ancestors eat a more diversified diet." ScienceDaily. ScienceDaily, 23 March 2017.

Proposed Bill Threatens to Undo Measures for Genetic Privacy in the Workplace

By Mohamad Hamze

For the most part, we are afforded genetic privacy thanks to legislation such as the Americans with Disabilities Act and the Genetic Information Nondiscrimination Act (GINA) of 2008. It is also common for workplaces to run wellness programs in which participation in the form of voluntary genetic testing is allowed, so long as no penalties or incentives are tied to providing one’s genetic information.

However, a bill on its way to a US House committee would allow employers to demand genetic test results from their employees and punish those who refuse with premiums of up to 30 percent of their existing health insurance costs, a fee averaging over $5,000 per year for employees across the country. The bill, entitled the Preserving Employee Wellness Programs Act (HR 1313), was introduced by House Republicans in the wake of the failure of the recent GOP health care act, with the intent of reducing health care costs for employees.

Proponents of the bill argue that it could cut through unwieldy federal regulations that make wellness programs difficult for workplaces to administer, thereby improving employee health and decreasing health care costs. Opponents counter that the bill would eliminate the voluntary nature of workplace wellness programs, and that it is a thinly veiled means of evading established legislation that provides genetic privacy and protection against discrimination. While the bill has been shelved for the time being, its progress so far suggests that the case for genetic privacy is not closed just yet.


Sun, Lena H. "Employees Who Decline Genetic Testing Could Face Penalties under Proposed Bill." The Washington Post. WP Company, 11 Mar. 2017.

Too Close to Home?

By Carolyn Burtt

Dengue infections tend to occur in the physical vicinity of one another, according to a study performed by the Johns Hopkins Bloomberg School of Public Health and the University of Florida. Because dengue is transmitted through blood that a particular species of mosquito carries from one human to the next, having multiple people within a 200-meter area increases the likelihood that a single strain of dengue will spread efficiently.

            Dengue-infected individuals who lived within 200 meters of another infected person had a 60% chance of carrying the same strain of the virus, but this figure dropped to 3% for individuals living one to five kilometers apart. This reflects the sheer number of dengue strains in circulation, as well as the natural competition among strains, shaped by the geographic proximity of people in high-density areas and by how far a mosquito can fly.

            With over 300 million individuals infected with dengue annually, two million of whom develop dengue hemorrhagic fever, this research is particularly pertinent to the medical community in its efforts to prevent dengue infections. Vaccines and control measures deployed in residences and densely populated areas of dengue-prone nations could be tailored to these results and to the patterns of infection they reveal. Approximately 40% of the human population lives in regions inhabited by the Aedes aegypti mosquito, and is therefore at risk of contracting a dengue infection.


Johns Hopkins University Bloomberg School of Public Health. (2017, March 23). Most dengue infections transmitted in and around home: Findings could aid in interrupting transmission chains and reducing severe illness. ScienceDaily. Retrieved April 2, 2017.

The Recipe to Improving Family Mealtimes

By Min Seo Jeong

Researchers at The Ohio State University conducted a study on the relationship between family meals and obesity rates. Rachel Tumin, a survey and population health analyst manager at the Ohio Colleges of Medicine Government Resource Center, and Sarah Anderson, an associate professor of epidemiology at the University’s College of Public Health, focused on two family mealtime practices: home-cooked meals and watching TV during meals.

The study showed that the odds of obesity were lower in adults whose family meals were all home-cooked than in those who ate some or no home-cooked meals. Additionally, adults who did not watch TV or videos during family meals had lower odds of obesity than those who did. Adults who combined home-cooked meals with no TV during family meals had the lowest odds of all.
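
Epidemiological studies like this one typically report odds rather than raw rates. As a hedged illustration of what "lower odds" means, the sketch below computes an odds ratio from invented counts; these numbers are not the study's data.

```python
# Hedged illustration: an odds ratio compares the odds of an outcome
# between two groups. All counts here are INVENTED for illustration.
def odds(cases: int, non_cases: int) -> float:
    """Odds of the outcome (e.g., obesity) within one group."""
    return cases / non_cases

# Hypothetical group A: every family meal home-cooked
obese_a, not_obese_a = 20, 80
# Hypothetical group B: few or no home-cooked meals
obese_b, not_obese_b = 40, 60

odds_ratio = odds(obese_a, not_obese_a) / odds(obese_b, not_obese_b)
print(round(odds_ratio, 3))  # → 0.375, i.e., group A has lower odds
```

An odds ratio below 1 means the first group has lower odds of the outcome; an odds ratio is not the same as a ratio of rates (20% vs. 40% here), which is why studies phrased in odds should not be read directly as percentages.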

An important aspect of the study’s results is that the frequency of family meals did not significantly affect obesity compared to the practices themselves. Adults who ate family meals every day of the week had similar odds of obesity to those who ate family meals only once or twice a week. Tumin regards this as a plus. "Families have a lot of demands and they can feel pressured to do things 'right' all the time,” she said. “This study showed potential benefits regardless of how often you eat a family meal at home."

Although the study does not establish a direct causal link between family meals and obesity, previous studies have suggested that the social and emotional benefits of family meals are associated with lower obesity rates.

The analysis drew on data from the 2012 Ohio Medicaid Assessment Survey, which asked participants how often they ate family meals and whether they watched TV during those meals, along with collecting measurements of height and weight.


Ohio State University. "Cooking family meals, skipping TV during those meals linked to lower odds of obesity." ScienceDaily. ScienceDaily, 23 March 2017.

Excess cholesterol may impair receptor function in brain cells, with implications for Alzheimer’s disease

By Kanika Kamal 

New research by the Hospital del Mar Medical Research Institute and Universitat Pompeu Fabra has demonstrated the potentially negative effects of cholesterol on important proteins and receptors in the brain. Specifically, the work builds on previous studies which found that cholesterol in brain cell membranes can interfere with the function of the adenosine receptor in brain cells. The adenosine receptor is a type of G protein-coupled receptor (GPCR), a family of proteins central to signal transduction and cell-to-cell communication, especially between brain cells. Befitting this function, adenosine receptors are located in the cell membrane. GPCRs are involved in several crucial physiological processes, such as vision, taste, and smell, regulation of the immune response, and modulation of behavior.

In diseases such as Alzheimer’s, a neurodegenerative disease, cholesterol levels in the cell membrane are elevated. This cholesterol can thus directly affect the function of the adenosine receptor and other GPCRs. Originally, cholesterol was thought to change the protein’s activity either by altering the properties of the cell membrane itself or by binding to the surface of the protein. The new findings, however, demonstrate that cholesterol from the brain cell membrane actually binds to the adenosine receptor’s active site. This discovery has led scientists to a better understanding of how the high cholesterol levels seen in diseases like Alzheimer’s block the adenosine receptor’s function, and they believe this blockage could be directly related to the symptoms patients exhibit.

Although the exact mechanism by which cholesterol affects the adenosine receptor’s activity is unknown, the finding opens new avenues for investigation. As the discovery is quite new, further research is needed to confirm the relationship. With this knowledge, however, scientists could develop cholesterol-like molecules and use them to modulate GPCR activity. The next logical research step is thus to determine whether cholesterol affects only adenosine receptors, or whether this molecular mechanism extends to other GPCRs as well. If it does, these findings could be a major step toward understanding the mechanisms of, and developing treatments for, a wider range of neurodegenerative diseases.


IMIM (Hospital del Mar Medical Research Institute). "New role of cholesterol in regulating brain proteins discovered: May be key in central nervous system diseases such as Alzheimer's." ScienceDaily. ScienceDaily, 23 February 2017.

Getting Too Old for This: The connection between age and disease spread

By Meg Thode

            Researchers from the University of Edinburgh’s School of Biological Sciences have found a connection between a population’s age distribution and the spread of disease in animals. Using computer modeling and laboratory experiments with water fleas (small aquatic crustaceans), they simulated outbreaks of bacterial infections in populations with different demographics. They found that age at the time of exposure, along with the age at which females become mothers, is most significant in determining the rate of disease spread, with younger populations being more vulnerable. Unexpectedly, they also found that high death rates can cause a disease to spread even faster, contradicting the usual expectation that a high death rate would slow the progression of an epidemic as the population becomes less dense.

The team hopes to further develop its mathematical model of how the spread of disease affects organisms and populations in the long term. Study leader Jess Clark places the findings in the context of today’s demographic shifts: "Many societies around the world are experiencing aging populations, and investigating the impact of this might lend valuable insight into how such populations might respond to an outbreak of disease."


University of Edinburgh. (2017, March 24). Spread of ages is key to impact of disease, animal study finds. ScienceDaily. Retrieved March 29, 2017.