Brain defects behind tinnitus and chronic pain identified
September 24, 2015
BRAIN DEFECTS BEHIND TINNITUS AND CHRONIC PAIN IDENTIFIED
In a new study published in Trends in Cognitive Sciences, researchers say they have identified the defects in the brain that lead to tinnitus and chronic pain. The team hopes the findings will be the first step to developing therapies for these common complaints.
The authors of this new research - from Georgetown University Medical Center (GUMC) in Washington, DC, and Technische Universität München (TUM) in Germany - explain that the mechanisms in the brain that would usually control noise and pain signals lose the ability to do so, which in turn leads to a perception of noise or pain long after the initial injury occurred.
They describe these controls as a kind of "gate."
The scientists were able to follow the flow of these signals through the brain and show where controls should be occurring.
Prof. Josef Rauschecker, director of the Laboratory for Integrative Neuroscience and Cognition at GUMC, describes it as the brain "reorganizing" itself in response to injury.
Tinnitus can follow when the ears are damaged by loud noise, something the brain continues to "hear." In a similar way, chronic pain from an injury can remain inside the brain long after the injury has healed.
"Some people call these phantom sensations, but they are real, produced by a brain that continues to 'feel' the initial injury because it cannot down-regulate the sensations enough," says Prof. Rauschecker.
"Both conditions are extraordinarily common, yet no treatment gets to the root of these disorders."
The areas of the brain that are responsible are the nucleus accumbens and several areas of the prefrontal cortex and the anterior cingulate cortex.
Prof. Rauschecker explains that "these areas act as a central 'gatekeeping system' for perceptual sensations, which evaluate the affective meaning of sensory stimuli, whether produced externally or internally, and modulates information flow in the brain. Tinnitus and chronic pain occur when this system is compromised."
Fast facts about tinnitus
- Roughly 10% of the US adult population has experienced tinnitus lasting at least 5 minutes in the last year
- The prevalence of tinnitus among people ages 65-84 years is approximately 27%
- Around 13 million people in the US report tinnitus without hearing loss.
Brain plasticity both produces these effects and offers a solution
The researchers also found that depression, anxiety, and uncontrollable or long-term stress, all modulated by the nucleus accumbens, operate in close synchronicity with tinnitus or chronic pain, or with both together.
The very brain plasticity that produces some of these effects, the researchers add, suggests that the proper gatekeeping controls could be restored.
Dr. Markus Ploner, a consultant neurologist and Heisenberg professor of human pain research at TUM, says:
"Better understanding could also lead to standardized assessment of individuals' risk to develop chronic tinnitus and chronic pain, which in turn might allow for earlier and more targeted treatment."
Written by Jonathan Vernon
BP Targets Far Below Guidelines Cut Mortality
September 20, 2015
BP TARGETS FAR BELOW GUIDELINES CUT MORTALITY
A large study funded by the National Institutes of Health (NIH) has found that a more intensive strategy of lowering blood pressure—one that aims to achieve a systolic blood-pressure target of 120 mm Hg—reduces the risk of death and cardiovascular events when compared with a strategy that lowers systolic blood pressure to conventional targets.
In the Systolic Blood Pressure Intervention Trial (SPRINT), investigators report that treating high-risk hypertensive adults 50 years of age and older to a target of 120 mm Hg significantly reduced cardiovascular events by 30% and reduced all-cause mortality by nearly 25% when compared with patients treated to a target of 140 mm Hg.
"This study shows that intensive blood-pressure management can prevent the cardiovascular complications of hypertension and save lives," Dr Jackson Wright (Case Western Reserve University, Cleveland, OH), one of the SPRINT primary investigators, said during a media briefing announcing the top-line results.
The study, which included hypertensive patients with one additional cardiovascular risk factor or preexisting kidney disease, was stopped earlier than the planned 2018 completion date, given the benefit of the intensive strategy, according to investigators.
The SPRINT investigators did not disclose event rates or the absolute reduction in risk with any of the end points, including the primary composite end point of MI, acute coronary syndrome, stroke, heart failure, or cardiovascular disease death. During the media briefing, they said only that the reduction in the event rate was sufficiently large for the SPRINT data safety and monitoring board (DSMB) to stop the trial early.
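The distinction between the relative reductions quoted in the briefing and the undisclosed absolute event rates matters for interpretation. A minimal sketch of the arithmetic, using entirely hypothetical counts (SPRINT's actual figures had not been released at the time of the briefing), shows how the two measures can diverge:

```python
# Relative vs. absolute risk reduction.
# The counts below are HYPOTHETICAL illustrations - SPRINT's actual
# event rates had not been disclosed at the media briefing.

def risk_reductions(events_control, n_control, events_treated, n_treated):
    """Return (relative_risk_reduction, absolute_risk_reduction)."""
    risk_control = events_control / n_control
    risk_treated = events_treated / n_treated
    rrr = (risk_control - risk_treated) / risk_control
    arr = risk_control - risk_treated
    return rrr, arr

# A 30% relative reduction can correspond to a small absolute
# reduction when the underlying event rate is low.
rrr, arr = risk_reductions(events_control=100, n_control=4650,
                           events_treated=70, n_treated=4650)
print(f"Relative risk reduction: {rrr:.0%}")   # 30%
print(f"Absolute risk reduction: {arr:.2%}")   # 0.65%
```

This is why commentators in the article stress waiting for the full data set: the headline relative reduction alone does not reveal how many patients must be treated intensively to prevent one event.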
The SPRINT investigators said they plan to submit their findings to a medical journal for peer review and expect to see the paper published before the end of 2015.
Dr Sripal Bangalore (New York University School of Medicine, NY), who was not affiliated with the trial, said the past decade in hypertension has seen some "ups and downs" in terms of what the ideal blood-pressure target should be. "I think now we have a clear answer from a very large, robust study," he told heartwire.
Bangalore stressed the importance of analyzing the full data set before making firm conclusions, but right now, given the clear reduction in cardiovascular complications and death, the findings from SPRINT are sure "to shake everything up."
Similarly, Dr Franz Messerli (St Luke's-Roosevelt Hospital, New York) told heartwire that "if the results are iron-clad, and since SPRINT was prematurely terminated, there is little doubt that they are, they will reduce some of our previous thoughts on the J-curve to hogwash."
Messerli said this does not mean that the J-curve—a hypothesized increase in cardiovascular events and mortality at very low blood pressures—has disappeared altogether, noting that a systolic blood pressure of zero will still result in death, but the "optimal on-treatment blood pressure is obviously lower than what was previously documented in many post hoc studies, including some of our own."
Finishing the SPRINT Faster Than Expected
In SPRINT, conducted across 100 clinical centers in the US and Puerto Rico, approximately 9300 patients were randomized to two treatment strategies. Dr Gary Gibbons, director of the National Heart, Lung, and Blood Institute (NHLBI), said SPRINT was conceived more than 10 years ago, and that while there is a consensus that treating high blood pressure reduces the risk of cardiovascular events, there remain uncertainties over just how much blood pressure should be lowered. He called the data from SPRINT "potentially lifesaving information."
SPRINT was designed as a target-based study, which gave physicians and patients flexibility in selecting antihypertensive medications to achieve the assigned degree of blood-pressure control. SPRINT investigators excluded patients with diabetes and those with a history of stroke. Approximately 25% of patients in the study were 75 years of age and older.
In the first treatment arm, patients were randomized to intensive blood-pressure control, the goal being a systolic blood pressure less than 120 mm Hg. In the intensive-therapy arm, patients were treated with three or more antihypertensive medications, including diuretics, such as chlorthalidone; the calcium-channel blocker amlodipine; and the ACE inhibitor lisinopril. These "evidence-based" drugs, said co–primary investigator Dr Suzanne Oparil (University of Alabama, Birmingham), have been shown in previous studies to lower not only blood pressure but also rates of cardiovascular disease and mortality.
With the second strategy, patients were randomized to standard blood-pressure control, the aim of which was to achieve a target of less than 140 mm Hg. Patients were treated with an average of two antihypertensive medications.
The last patient visit was scheduled for 2016, but, as noted, the trial was stopped early, given the statistically significant 30% reduction in the primary composite end point and approximate 25% reduction in all-cause mortality, a secondary end point.
The SPRINT study also includes a substudy, known as SPRINT-MIND, which is currently ongoing and will assess whether the lower blood-pressure target reduces the incidence of dementia, slows the decline in cognitive function, and results in less cerebral small-vessel disease (assessed by MRI). The effects of treatment on kidney function are also still being evaluated.
What the Guidelines Say
Throughout the media briefing, the SPRINT investigators said it is too early to speculate on how the results will change clinical practice or alter the hypertension guidelines. They said the data still need to be analyzed by the SPRINT investigators and reviewed by other experts. Only once that happens will the data be "digested" by physicians and various guideline writing committees, said Wright. Once that process occurs, clinical recommendations based on SPRINT—namely, whether physicians should treat patients to a target of less than 120 mm Hg—can be made.
"I think it would be premature for us to make recommendations at this time," said Wright. "I will say, though, having said that, I am quite convinced all will be quite impressed when the SPRINT data are made available."
Still, the SPRINT investigators admitted the results are likely to shake up the management of patients with hypertension, especially given the controversies surrounding current guidelines.
Late in 2013, the Eighth Joint National Committee (JNC 8) released new guidelines on the management of adult hypertension, which contained departures from previous recommendations. The JNC 8 expert writing group, led by Dr Paul James (University of Iowa, Iowa City), relaxed the target blood-pressure and treatment-initiation thresholds in elderly patients and those younger than 60 years of age with diabetes and kidney disease.
For patients 60 years of age and older, the JNC 8 guidelines recommend treating to a target of 150/90 mm Hg and to 140/90 mm Hg in everybody else. The relaxing of the targets, however, was not without controversy. In fact, five members of JNC 8 published a letter outlining their concerns with increasing the systolic blood pressure target from 140 mm Hg to 150 mm Hg in patients 60 years of age and older.
Speaking during the briefing, Oparil acknowledged the current JNC 8 recommendations are contentious. While relaxing the target to less than 150 mm Hg caused consternation, Oparil said the SPRINT results are more than likely to challenge the conventional 140-mm-Hg target.
"This is a time of enlightenment," she said. "The NHLBI and other institutes have given us powerful new information."
To heartwire, Messerli said that given that SPRINT included patients 50 years of age and older, the 2013 JNC 8 decision to relax blood-pressure goals in those 60 years and older to less than 150 mm Hg appears to be wrong. "We merely hope that, unlike for the JNC 8, it will not take more than a decade for a JNC 9 to digest these seminal findings before providing US physicians with evidence-based and clinically useful recommendations," he said.
Bangalore noted the inclusion criteria for entry in SPRINT would apply to a vast majority of US patients with hypertension, which will make the impressive results useful to physicians in practice.
The European Society of Hypertension (ESH) and European Society of Cardiology (ESC) guidelines, released in 2013, state that physicians should treat most patients to a target of less than 140 mm Hg. For elderly patients, the recommendation is a systolic blood pressure target between 140 and 150 mm Hg, although physicians can go lower than 140 mm Hg if the patient is fit and healthy.
Written by Michael O'Riordan
Global Antibiotic Use and Resistance in 'Dire' Situation
September 19, 2015
GLOBAL ANTIBIOTIC USE AND RESISTANCE IN "DIRE" SITUATION
Antibiotic resistance rates across the globe are alarming, and the only sustainable solutions are to limit overuse and misuse of antibiotics, according to the Center for Disease Dynamics, Economics & Policy (CDDEP).
CDDEP and the Global Antibiotic Resistance Partnership have coreleased a new report, The State of the World's Antibiotics, 2015, on the state of global antibiotic use and antibiotic resistance in humans and livestock.
The Global Antibiotic Resistance Partnership is a CDDEP project funded by the Bill & Melinda Gates Foundation, which is assisting eight low- and middle-income countries in South Asia, East Africa, and southern Africa in developing local solutions to antibiotic resistance while sustaining antibiotic access.
In addition to the report, CDDEP has developed a new tracking tool that presents the latest global trends in antibiotic use in 69 countries, and drug resistance in 39 countries.
"For the first time, we have data on low- and middle-income countries, where antibiotic resistance is a serious problem but rarely the focus of policy solutions," Ramanan Laxminarayan, CDDEP director and report coauthor, said in a CDDEP blog.
"We hope this report, together with the Resistance Map online tool, will help empower these countries to understand the burden of antibiotic resistance in their region and then take coordinated, research-backed action to limit it," he said.
State of the World's Antibiotics
CDDEP says drug conservation should be prioritized over new research and development efforts.
"We need to focus 80 percent of our global resources on stewardship and no more than 20 percent on drug development," said Laxminarayan. "No matter how many new drugs come out, if we continue to misuse them, they might as well have never been discovered."
A major problem with prioritizing drug development as a solution is that new antibiotics cost far more than those currently available, putting them out of reach for patients in low- and middle-income countries.
The World Health Organization recently emphasized the need for country-level antibiotic resistance strategies when it endorsed the global action plan on antimicrobial resistance in May 2015. The plan asks all countries to adopt national strategies within 2 years.
The new report features specific recommendations for achieving this goal.
Effective policies include antibiotic stewardship campaigns, hospital infection control, limiting infections by improving vaccination coverage, and reducing the need for antibiotics.
"Our research shows that antibiotic resistance and misuse is a dire — and growing — problem in every country on earth," said Laxminarayan. "The good news is that every country can work on solving it."
The Resistance Map data address infections caused by 12 common and potentially fatal bacteria, including Escherichia coli (E. coli), Salmonella, and methicillin-resistant Staphylococcus aureus (MRSA).
The tracking tool presents data from low- and middle-income countries including India, Kenya, Vietnam, Thailand, and South Africa. It features line graphs that illustrate trends in antibiotic use and resistance over time, column charts that compare antibiotic use and resistance rates between countries, world maps that compare country-level differences, and subnational maps that compare state and regional variations.
"Though wealthy countries still use far more antibiotics per capita, high rates in the low- and middle-income countries where surveillance data is now available — such as India, Kenya, and Vietnam — sound a warning to the world," according to the CDDEP blog.
"For example, in India, 57 percent of the infections caused by Klebsiella pneumoniae, a dangerous superbug found in hospitals, were found to be resistant to one type of last-resort drug in 2014, up from 29 percent in 2008."
These drugs, known as carbapenems, still work against 90% of Klebsiella infections in the United States and more than 95% of cases in most of Europe.
"Carbapenem antibiotics are for use in the most dire circumstances — when someone's life is in danger and no other drug will cure the infection," Sumanth Gandra, an infectious diseases physician and CDDEP resident scholar in New Delhi, India, said in the blog. "We're seeing unprecedented resistance to these precious antibiotics globally, and especially in India. If these trends continue, infections that could once be treated in a week or two could become routinely life threatening and endanger millions of lives."
Written by Troy Brown
How can aspirin help to cure cancer?
September 05, 2015
HOW CAN ASPIRIN HELP TO CURE CANCER?
A recent study, published in the journal Cell, suggests that aspirin could be effective in boosting the immune system in patients suffering from breast, skin and bowel cancer.
While researchers warn that the use of aspirin in the fight against cancer is still some way off, experiments on mice have proven encouraging. Immunotherapy is growing in strength as a weapon against the disease, as research increasingly focuses on ways in which cancer apparently "tricks" the immune system into allowing it to develop.

One way in which cancer avoids the immune system is through "befriending" T cells, which seek out unwanted elements such as bacteria and viruses in the body's fight against disease but, mysteriously, do not attack cancer cells. In the 1990s, a molecule that Japanese scientists called "Programmed Death 1" (PD-1) was found on the surface of T cells. US researchers then found that cancer tumors often produced a matching molecule, "Programmed Death-Ligand 1" (PD-L1). In this way, the cancer is able to "trick" the T cells into joining it instead of fighting it, thus circumventing the immune system. This discovery led to the development of a group of drugs known as "immune checkpoint blockade therapies."

Another way in which cancers appear to subvert the immune system involves prostaglandin E2 (PGE2). PGE2 normally causes an inflammatory response and fever in bacterial and viral infections, but it has been known for some time to promote tumor growth in the gastrointestinal tract. One theory is that the inflammatory process does not always end when it should. Chronic inflammation can eventually cause changes, such as the formation of new blood vessels and DNA mutations, which can give rise to tumors. Cells involved in certain types of inflammation have been found to produce secretions that promote tumors.
Fast facts about cancer
- The four main cancers in the US are: breast, lung, prostate and colorectal
- There are around 1,658,370 new cancer cases expected in 2015
- This year, there are expected to be 589,430 cancer deaths in the US.
Reawakening the immune system
According to the team from the UK's Francis Crick Institute, who carried out this project, PGE2 molecules "dampen down" the response of the immune system, which enables the cancer cells to "hide." If PGE2 molecules can be destroyed, they say, the immune system will "reawaken," find and kill the cancer cells.

PGE2 is produced in the body by the cyclooxygenase enzymes COX-1 and COX-2. COX inhibitors are currently in the spotlight as a way to prevent the production of PGE2 in cancer patients. One way of inhibiting COX is through nonsteroidal anti-inflammatory drugs (NSAIDs), such as aspirin.
This study found that certain types of cancer in mice were substantially slowed by combining aspirin or other COX inhibitors with immunotherapy.
Given the "conservation of signature" across mouse and human melanoma, plus the fact that COX inhibitors appear to reduce gastrointestinal and breast tumors as well as melanoma, the team is hopeful that aspirin and similar drugs can be effectively used alongside current immunotherapy treatments to tackle bowel, breast and skin cancer.

Citing a study published in the Journal of the National Cancer Institute, the American Cancer Society suggests that low-dose aspirin could also be useful in treating and preventing recurrence of esophageal, ovarian, stomach and prostate cancer. Concerns that regular low-dose use of aspirin can increase the risk of cardiovascular problems or internal bleeding in cancer patients with gastrointestinal problems have not been definitively proven, according to the US National Cancer Institute.

As Peter Johnson, Cancer Research UK's chief clinician, notes regarding the current research, "there is still some way to go [...] but it's an exciting finding that could offer a simple way to dramatically improve the response to treatment in a range of cancers."
Written by Yvette Brazier
Corticosteroid injections may be ineffective for low back pain
August 27, 2015
CORTICOSTEROID INJECTIONS MAY BE INEFFECTIVE FOR LOW BACK PAIN
People who suffer from low back pain often turn to epidural corticosteroid injections for some relief. According to new research, however, such treatment fails to provide long-term respite, if any.
Lead study author Dr. Roger Chou, of the Oregon Health & Science University in Portland, and colleagues publish their findings in the Annals of Internal Medicine.

Low back pain is the leading cause of disability worldwide. In the US, around half of all workers admit to experiencing symptoms of back pain each year, and approximately 80% of us will suffer a back problem at some point in our lives.

Primary treatment for low back pain involves nonsurgical options, such as narcotic pain medication and nonsteroidal anti-inflammatory drugs (NSAIDs). Other nonsurgical treatments include epidural corticosteroid injections, administered directly to the epidural space in the spine.
Epidural corticosteroid injections work by reducing inflammation and, in turn, relieving pain. According to Dr. Chou and colleagues, the injections are commonly used for radiculopathy (inflammation of a spinal nerve) and spinal stenosis (narrowing of the spinal canal) - two conditions that cause radiating low back pain. Use of epidural corticosteroid injections for these conditions is increasing, despite the fact that numerous studies have questioned their effectiveness for low back pain.
Fast facts about back pain
- Around 31 million Americans experience back pain at any given time
- Back pain is one of the most common reasons for missed work days, and it is the second most common reason for doctors' visits
- Back pain costs Americans around $50 billion annually.
Epidural corticosteroid injections had no effect on spinal stenosis
For their study, the team reviewed 30 trials assessing the short- and long-term effects of epidural corticosteroid injections for individuals with radiculopathy or spinal stenosis, comparing them with a placebo. Specifically, the researchers looked at how epidural corticosteroid injections impacted patients' pain, function and risk for surgery. While the injections provided greater immediate pain relief for radiculopathy than a placebo, the team found that this effect was small and short term. What is more, the treatment did not prevent patients' need for surgery in the long term.
For spinal stenosis, the researchers found epidural corticosteroid injections offered patients no significant pain relief compared with placebo.
These findings remained regardless of what injection techniques and corticosteroids were used, according to the authors. "We found that the injection technique used (transforaminal, interlaminar, caudal), type or dose of corticosteroid, selection of patients with imaging guidance, and other patient and technical factors had no impact on the findings," Dr. Chou told Medical News Today. "Really, the results were the same no matter how you sliced the data."

While severe side effects from corticosteroid injections were rare, some minor side effects were identified, which included bleeding, blood clots and nerve root irritation.

Explaining why epidural corticosteroid injections appear to be ineffective for radiculopathy and spinal stenosis, Dr. Chou told us: "Corticosteroids are supposed to reduce inflammation and associated swelling. Perhaps it is that inflammation is not a prominent factor in most patients. It's also possible that patients improve over time with or without a treatment. Finally, we know that low back pain treatments have strong placebo effect, so that could be what we're seeing."
Patients should be aware of alternative treatment options
Based on these findings and those from previous studies, Dr. Chou told MNT it is important patients are aware of alternative treatment options for both radiculopathy and spinal stenosis:
"I think that it's important for patients to understand that the benefits of epidural corticosteroid injections for radiculopathy appear small and short-lived.
Patients should be aware of options that range from simple analgesics like acetaminophen or NSAIDs, exercise and non-pharmacological therapies like manipulation, massage, acupuncture etc., and surgery - for which there is evidence of benefit. For some patients, the short-term benefits might be worth it, if they have tried noninvasive therapies and aren't interested in surgery or not a good candidate."
"For spinal stenosis," he added, "the evidence to date indicates no benefit, so it would seem appropriate to consider the alternatives described above."

However, the results of this study have been met with some criticism. Dr. Zack McCormick, of the Northwestern University Feinberg School of Medicine in Chicago, IL, told Reuters that the trials analyzed in this research were of low quality, so the findings "cannot be applied to the realistic day-to-day practice of spine medicine." He noted, however, that the aim of epidural corticosteroid injections is to improve short-term symptoms and quality of life for the patient, not to provide a long-term cure. As such, he says the treatment should "not be used as an isolated therapy."

Earlier this year, a study published in The BMJ found acetaminophen to be ineffective for lower back pain and osteoarthritis.
Written by Honor Whiteman
What happens to the body when you drink Coca-Cola?
August 01, 2015
Sugary drinks are considered a major contributor to health conditions such as obesity, type 2 diabetes and tooth decay. But have you ever wondered exactly what these beverages do to your body after consumption? One researcher has created an infographic that explains what happens to the body within an hour of drinking a can of Coca-Cola.
According to the Centers for Disease Control and Prevention (CDC), around half of the US population drink sugary beverages on any given day, with consumption of these drinks highest among teenagers and young adults.
There are approximately 10 teaspoons of added sugar in a single can of cola. The World Health Organization (WHO) recommend consuming no more than 6 teaspoons of added sugar daily, meaning drinking just one serving of cola a day could take us well above these guidelines.
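The teaspoon figures above can be translated into grams for a rough check, assuming roughly 4.2 g of sugar per teaspoon (a common conversion, not a figure from the article itself):

```python
# Rough check of the added-sugar arithmetic in the article.
# Assumes ~4.2 g of sugar per teaspoon (a common conversion,
# not a figure reported in the article).

GRAMS_PER_TEASPOON = 4.2

teaspoons_per_can = 10       # reported for a single can of cola
who_limit_teaspoons = 6      # WHO recommended daily maximum

sugar_per_can_g = teaspoons_per_can * GRAMS_PER_TEASPOON
who_limit_g = who_limit_teaspoons * GRAMS_PER_TEASPOON

print(f"One can: ~{sugar_per_can_g:.0f} g of added sugar")       # ~42 g
print(f"WHO daily limit: ~{who_limit_g:.0f} g")                  # ~25 g
print(f"Excess over limit: {teaspoons_per_can / who_limit_teaspoons - 1:.0%}")
```

On these assumptions, a single can carries roughly two-thirds more added sugar than the WHO's recommended daily maximum, which is why one serving alone takes a drinker past the guideline.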
As such, it is no surprise that sugary drink consumption is associated with an array of health conditions. According to the Harvard School of Public Health, people who drink 1-2 cans of sugary beverages daily are 26% more likely to develop type 2 diabetes, and last month, Medical News Today reported on a study claiming 184,000 global deaths each year are attributable to sugary drink consumption.
Now, an infographic created by British pharmacist Niraj Naik - based on research by health writer Wade Meredith - shows the damage a 330 ml can of Coca-Cola can do to the body within 1 hour of consumption.
Coca-Cola 'comparable to heroin' in how it stimulates the brain's reward and pleasure centers
According to Naik, the intense sweetness of Coca-Cola as a result of its high sugar content should make us vomit as soon as it enters the body. However, the phosphoric acid in the beverage dulls the sweetness, enabling us to keep the drink down.
Blood sugar levels increase dramatically within 20 minutes of drinking the Cola, explains Naik, causing a burst of insulin. The liver then turns the high amounts of sugar circulating our body into fat.
Within 40 minutes, the body has absorbed all of the caffeine from the Cola, causing a dilation of pupils and an increase in blood pressure. By this point, the adenosine receptors in the brain have been blocked, preventing fatigue.
Five minutes later, production of dopamine has increased - a neurotransmitter that helps control the pleasure and reward centers of the brain. According to the infographic, the way Coca-Cola stimulates these centers is comparable to the effects of heroin, making us want another can.
An hour after drinking the beverage, a sugar crash will begin, causing irritability and drowsiness. In addition, the water from the Cola will have been cleared from the body via urination, along with nutrients that are important for our health.
According to Naik, the infographic is not only applicable to Coca-Cola, but to all caffeinated fizzy drinks.
"Coke is not just high in high fructose corn syrup, but it is also packed with refined salts and caffeine," writes Naik on his blog The Renegade Pharmacist. "Regular consumption of these ingredients in the high quantities you find in Coke and other processed foods and drinks, can lead to higher blood pressure, heart disease, diabetes and obesity."
"However a small amount now and then won't do any major harm," he adds. "The key is moderation."
In a press statement, a spokesperson for Coca-Cola says the beverage is "perfectly safe to drink and can be enjoyed as part of a balanced diet and lifestyle."
Written by Honor Whiteman
Antibiotics are an alternative to appendectomy, study suggests
June 18, 2015
ANTIBIOTICS ARE AN ALTERNATIVE TO APPENDECTOMY, STUDY SUGGESTS
A randomized controlled trial has brought into question the established medical doctrine that appendicitis should be treated by surgical removal, finding that a level of success can alternatively be achieved by use of antibiotics.
Published in the journal JAMA, the results show that patients assigned to receive a 10-day course of antibiotics rather than surgery had a success rate of just under 73%, measured by whether they avoided appendectomy within a year of initial treatment.
Of the 256 patients available for 1-year follow-up in the antibiotic group, 186 did not require the later appendectomy.
However, compared with surgical treatment, the antibiotic option did not meet the noninferiority threshold established during the design of the study.
Over a quarter (70) of the patients assigned to antibiotic treatment went on to undergo surgical intervention within a year of initially presenting with appendicitis.
The researchers' hypothesis was that antibiotic treatment would not be worse than appendectomy. The threshold set - but not met by the results - was that the benefits from avoiding surgery would be worthwhile even if there was up to a 24% failure rate in the antibiotic group. The failure rate found, though, was 27.3%.
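The success and failure rates reported above follow directly from the trial counts; a quick sketch of that arithmetic, using the figures quoted in this article:

```python
# The antibiotic-arm arithmetic, using the counts reported above.

n_followed = 256      # antibiotic-group patients with 1-year follow-up
n_no_surgery = 186    # did not require a later appendectomy
n_surgery = n_followed - n_no_surgery   # 70 went on to appendectomy

success_rate = n_no_surgery / n_followed
failure_rate = n_surgery / n_followed

# The prespecified threshold: benefits of avoiding surgery would be
# worthwhile with up to a 24% failure rate in the antibiotic group.
noninferiority_threshold = 0.24

print(f"Success rate: {success_rate:.1%}")   # 72.7%
print(f"Failure rate: {failure_rate:.1%}")   # 27.3%
print("Within threshold:", failure_rate <= noninferiority_threshold)  # False
```

The observed 27.3% failure rate exceeds the 24% margin, which is why the trial could not declare antibiotics noninferior even though nearly three quarters of patients avoided surgery.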
The safety of delaying appendectomy to first try antibiotics was, however, shown by a lack of intra-abdominal abscesses or other major complications.
Routine appendectomy is a 100-year-old idea
An editorial article about the study in the same issue of the journal - written by Dr. Edward Livingston, JAMA's deputy editor, and Dr. Corrine Vons, from the Jean Verdier Hospital in Bondy, France - explains that since late in the 19th century, surgery has remained unquestioned in the treatment of appendicitis.
But recent changes, the article outlines, have happened in the management of appendicitis, "even if appendectomy remains the end result."
There is now "almost perfect" diagnostic accuracy achieved by imaging via computed tomography (CT) scanning, and the use of antibiotics perioperatively is changing the condition's natural history.
The editorial goes on to praise some of the study's strengths - noting its large sample size and the use of CT scans, which also allowed the researchers to exclude from analysis cases that would require surgery anyway (because of perforation, abscess, and so on).
Drs. Livingston and Vons reach this conclusion:
"The time has come to consider abandoning routine appendectomy for patients with uncomplicated appendicitis.
The operation served patients well for more than 100 years. With development of more precise diagnostic capabilities like CT and effective broad-spectrum antibiotics, appendectomy may be unnecessary for uncomplicated appendicitis, which now occurs in the majority of acute appendicitis cases."
The authors of the study conclude:
"These results suggest that patients with CT-proven uncomplicated acute appendicitis should be able to make an informed decision between antibiotic treatment and appendectomy."
They call on future studies to focus on early identification of complicated acute appendicitis patients needing surgery, and to prospectively evaluate the optimal use of antibiotic treatment in patients with uncomplicated acute appendicitis.
Written by Markus MacGill
World's first successful penis transplant patient due to become a father
June 13, 2015
The recipient of the world's first penis transplant is due to become a father, according to his surgeon. The news comes just months after the operation was performed, surpassing the expectations of the surgeon.
"We are happy that there were no complications and his penis is functioning well," Dr. Andre Van der Merwe - of Stellenbosch University in South Africa - told AFP.
Dr. Van der Merwe was surprised at how soon the man had recovered his sexual function, having initially aimed for his patient to be fully functional again after 2 years. Despite the surgery only being carried out in December, the patient reports his girlfriend is around 4 months pregnant.
The patient had his penis amputated 3 years ago after a traditional circumcision went wrong, leaving him with only 1 cm of the organ.
In many countries, circumcisions are carried out routinely in safe environments, but in South Africa, circumcision can be a risky procedure. There are many reports of boys being maimed or even killed during traditional initiation ceremonies each year.
Although there can be deadly complications with botched circumcisions, penis transplants are far from routine procedures. Dr. Van der Merwe described the operation as more difficult than a kidney transplant due to the fact that the blood vessels in the penis are significantly narrower than those in the kidneys.
Penis transplants 'can bring patients back to life'
As well as being a thoroughly complex procedure to carry out, the surgeons also faced a struggle in getting approval from authorities to be allowed to perform the surgery. The team had to argue the case that the benefits of the surgery would outweigh the risks.
"You may say it doesn't save their life, but many of these young men when they have penile amputations are ostracized, stigmatized and take their own life," Dr. Van der Merwe told the BBC. "If you don't have a penis you are essentially dead, if you give a penis back you can bring them back to life."
One of the risks of transplants is that transplanted organs can be rejected by their new host body. A previous penis transplant, attempted in China, was initially believed to have gone well, only for the donated penis to later be rejected.
Following the 9-hour operation to attach the donated penis at Tygerberg Hospital in Cape Town, South Africa, the anonymous 21-year-old patient has been able to pass urine, maintain an erection, orgasm and ejaculate, even though full sensation has yet to return to the organ.
The fact that the patient has been able to father a child is unsurprising to Dr. Van der Merwe, however. "There was nothing preventing him from having children because his sperm wasn't affected," he explained to AFP.
Due to the success of the procedure, Dr. Van der Merwe and his team have been flooded with requests from men who have had penis amputations. Unfortunately, they are currently unable to meet the demand.
Dr. Van der Merwe hopes that as news spreads of the procedure's success, more penis donors will become available.
"Right now we have about nine people on our program," he told AFP. "I don't think it would be easy but I believe people will now come forward because of this positive case."
Written by James McIntosh
Heart attack risk may rise by a fifth with use of common antacid
June 12, 2015
Proton pump inhibitors are a form of antacid drug commonly taken by adults for a range of health conditions. However, a new study suggests people may need to be cautious about their use, finding that adults taking these drugs are 16-21% more likely to have a heart attack than people not using them.
A different type of antacid drug known as an H2 blocker was not associated with an increase in heart attack risk, however.
Proton pump inhibitors (PPIs) are frequently prescribed to treat a variety of conditions, including gastroesophageal reflux disease (GERD) and Helicobacter pylori infection. Names for these drugs - lansoprazole and omeprazole, for example - always feature the suffix "-prazole."
In 2009, they were the third most commonly taken type of drug in the US, and the Food and Drug Administration (FDA) estimates 1 in 14 Americans have used them. Over time, however, experts have begun to question the safety of these drugs.
Experts initially believed that use of PPIs was only risky for patients with coronary artery disease who were also using the antiplatelet drug clopidogrel, considering the risk to be caused by a drug-drug interaction. More recent studies, however, have indicated that the risk may extend further.
"Our earlier work identified that the PPIs can adversely affect the endothelium, the Teflon-like lining of the blood vessels," reports senior author Dr. John Cooke. "That observation led us to hypothesize that anyone taking PPIs may be at greater risk for heart attack."
For the study, published in PLOS ONE, researchers from Houston Methodist and Stanford University compared heart attack risk for patients using PPIs with patients using other forms of stomach medication.
Data were collected from 16 million clinical documents for around 2.9 million patients. These documents were obtained from two databases: the Stanford Translational Research Integrated Database Environment (STRIDE) and Practice Fusion, an electronic medical records company.
PPIs associated with increased heart attack risk, unlike H2 blockers
The researchers extracted information from these databases for any patients reported to have been prescribed PPIs or other similar drugs, such as H2 blockers, and then looked to see if these patients had also experienced a major cardiovascular event such as a heart attack.
H2 blockers such as cimetidine and ranitidine are another form of antacid. Unlike PPIs, they have yet to be associated with an increased risk of heart attack or cardiovascular disease.
"By looking at data from people who were given PPI drugs primarily for acid reflux and had no prior history of heart disease, our data-mining pipeline signals an association with a higher rate of heart attacks," says lead author Nigam H. Shah, an assistant professor of biomedical informatics at Stanford, adding:
"Our results demonstrate that PPIs appear to be associated with elevated risk of heart attack in the general population, and H2 blockers show no such association."
Due to uncertainty in the estimation process, the researchers report that the increase in heart attack risk lies between 16% and 21%.
As the observational data used in the study is vulnerable to confounding in multiple ways, the researchers hope to conduct a more reliable large, prospective, randomized trial to confirm whether PPIs are harmful to a wider population of patients.
"Our report raises concerns that these drugs - which are available over the counter and are among the most commonly prescribed drugs in the world - may not be as safe as we previously assumed," concludes principal investigator Dr. Nicholas J. Leeper, a vascular medicine specialist at Stanford.
Written by James McIntosh
Pregnancy test: how early can you take a pregnancy test?
June 10, 2015
A pregnancy test may let you know, one way or the other, if you are pregnant.
This test can also be performed to diagnose abnormal conditions that can raise HCG levels, or to monitor the development of the pregnancy during the first 2 months (quantitative test only).
What is a pregnancy test?
A pregnancy test measures the amount of pregnancy hormone, human chorionic gonadotropin (HCG), circulating within a woman's body.
HCG can be present in the blood and urine approximately 10-14 days following conception, and levels peak between 8 and 11 weeks of gestation.
HCG is known as the pregnancy hormone, as it is produced by the cells that form the placenta and provides nourishment to the growing embryo. A negative HCG result is a level less than 5 mIU/ml, while a positive HCG for pregnancy is greater than or equal to 25 mIU/ml.
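The thresholds above lend themselves to a simple interpretation rule. A minimal sketch (note: the article defines only the negative and positive cutoffs; treating the 5-25 mIU/ml gap as "indeterminate, repeat the test" is an assumption for illustration):

```python
def interpret_hcg(level_miu_per_ml: float) -> str:
    """Interpret a quantitative HCG result using the article's cutoffs.

    Negative: less than 5 mIU/ml.
    Positive: 25 mIU/ml or greater.
    The in-between range is not defined in the article; calling it
    indeterminate (repeat test) is an assumption.
    """
    if level_miu_per_ml < 5:
        return "negative"
    if level_miu_per_ml >= 25:
        return "positive"
    return "indeterminate - repeat test"

print(interpret_hcg(3))   # negative
print(interpret_hcg(40))  # positive
```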
How are pregnancy tests performed?
Pregnancy tests can be performed in one of two ways: urine or blood testing. When testing at home or in an office setting using urine, the sample is placed on a chemical strip, which produces a result in approximately 1-2 minutes, although result times may vary by brand.
These tests can be performed by collecting urine in a collection cup and either dipping the pregnancy test stick into the urine or placing the urine into a container using an eyedropper. Alternatively, a pregnancy stick can be placed into the urine stream.
Each brand presents its results differently, so it is important to read the package insert to determine how the results are given.
Some tests change in color, have line changes, or provide a symbol such as a plus, minus, etc. Other tests, such as digital ones, may provide you with a simple answer in the test window, for example the words "pregnant" or "not pregnant."
When testing using blood, there are two forms of testing: a quantitative and qualitative HCG test.
A quantitative blood test is used to measure the amount of HCG present in the blood, while the qualitative test simply tells you if HCG is present in the blood stream.
While there are advantages to blood testing such as early pregnancy detection and the ability to measure HCG concentration, there are also disadvantages such as high cost, long result wait time and the need to have the test completed in a medical office.
When should a pregnancy test be taken and what do the results mean?
When taking a home urine test, waiting until a period is missed is recommended. However, 14 days from possible conception would be the earliest possible time to take the test. Some tests on the market can be taken earlier, depending on their individual sensitivity; read the package insert to determine the best and earliest time to take that particular test. The best time of day to take a urine pregnancy test is when you wake up in the morning, as urine is most concentrated then.
A positive home pregnancy test simply means that HCG is present in the urine, while a negative test can mean a variety of things.
Negative tests can be truly negative or mean that the test was either taken too early to detect HCG or the test was taken improperly.
How accurate are pregnancy tests?
When taken correctly, urine pregnancy tests offer approximately 97% accuracy. However, incorrect use of the test can result in a reading that is potentially inaccurate. If at first you receive a negative result but are exhibiting symptoms of pregnancy, it is recommended that you re-test in 1 week or speak with your health care provider about a blood test instead.
Written by Lori Smith, NP, nurse practitioner
Could owning a cat raise the risk of mental illness?
June 9, 2015
They are cute, fluffy and have that wide-eyed glare that few of us can resist; it is no wonder more than 95 million of us own a cat. But there may be a darker side to our four-legged friends. New research claims the animals could increase our risk of mental illnesses, including schizophrenia and bipolar disorder.
Two studies published in the journals Schizophrenia Research and Acta Psychiatrica Scandinavica attribute this association to Toxoplasma gondii - a parasite found in the intestines of cats. Humans can become infected with the parasite by accidentally swallowing it after coming into contact with the animal's feces.
T. gondii is the cause of a disease known as toxoplasmosis. According to the Centers for Disease Control and Prevention (CDC), more than 60 million people in the US are infected with the parasite, though the majority of people are not aware of it.
People with a healthy immune system often stave off T. gondii infection, so it does not present any symptoms. However, pregnant women and people with weakened immune systems are more susceptible to infection and may experience flu-like symptoms - such as muscle aches and pains and swollen lymph nodes - as a result, while more severe infection may cause blindness and even death.
Previous studies have also linked T. gondii infection to greater risk of mental disorders. In November 2014, for example, Medical News Today reported on a study claiming the parasite is responsible for around a fifth of schizophrenia cases. Now, new research provides further evidence of this association.
T. gondii infection 'may double schizophrenia risk'
For one study, Dr. Robert H. Yolken, of the Stanley Laboratory of Developmental Neurovirology at Johns Hopkins University School of Medicine in Baltimore, MD, and colleagues assessed the results of two previous studies.
These studies had identified a link between cat ownership in childhood and development of later-life schizophrenia and other mental disorders, comparing them with the results of a 1982 National Alliance for the Mentally Ill (NAMI) questionnaire.
The NAMI questionnaire - conducted around a decade before any data was published on cat ownership and mental illness - revealed that around 50% of individuals who had a cat as a family pet during childhood were diagnosed with schizophrenia or other mental illnesses later in life, compared with 42% who did not have a cat during childhood.
The questionnaire, the researchers say, produced similar results to those of the two previous studies, suggesting that "cat ownership in childhood is significantly more common in families in which the child later becomes seriously mentally ill."
"If true," the authors add, "an explanatory mechanism may be T. gondii. We urge our colleagues to try and replicate these findings to clarify whether childhood cat ownership is truly a risk factor for later schizophrenia."
In another study, A. L. Sutterland, of the Academic Medical Centre in Amsterdam, the Netherlands, and colleagues conducted a meta-analysis of more than 50 studies that established a link between T. gondii and increased risk of schizophrenia.
They found that people infected with T. gondii are at more than double the risk of developing schizophrenia than those not infected with the parasite.
The team also identified a link between T. gondii infection and greater risk of bipolar disorder, obsessive-compulsive disorder (OCD) and addiction.
"These findings suggest that T. gondii infection is associated with several psychiatric disorders and that in schizophrenia, reactivation of latent T. gondii infection may occur," note the authors.
The CDC recommend changing a cat's litter box every day to reduce the risk of T. gondii infection, noting that the parasite does not become infectious until 1-5 days after it has been shed in the animal's feces.
They also recommend feeding cats only canned or dried commercial foods or well-cooked meats; feeding them raw or undercooked meats can increase the presence of T. gondii in a cat's feces.
It is important to note that cat feces are not the only source of T. gondii infection. Humans can contract the parasite through consuming undercooked or contaminated meats and by drinking contaminated water.
Written by Honor Whiteman
Questionnaire Predicts Risk for Death Within 5 Years
June 6, 2015
Answers to a brief questionnaire can predict risk of dying within 5 years for people aged 40 to 70 years in the United Kingdom, according to a new large-scale study published online June 4 in the Lancet.
Andrea Ganna, PhD, from the Karolinska Institutet in Stockholm and Uppsala University in Sweden, and Erik Ingelsson, MD, also from Uppsala University, analyzed data from UK Biobank, a large databank project that, between 2006 and 2010, collected 655 measurements - including blood samples, bone density, and family history - from 498,103 UK volunteers aged 40 to 70 years.
The researchers followed the volunteers until February 2014. For those who died, a cause of death was assigned using information from the Health & Social Care Information Centre and National Health Service Central Register. The authors used a statistical survival model to assess the probability that specific demographic, lifestyle, and health measurements could predict death from any cause and six specific causes in men and women separately.
"This is the first study of its kind which is based on a very large study sample, and is not limited to specific populations, single types of risk, or requiring laboratory testing," Dr Ingelsson said in a journal news release.
Questionnaires Beat Physical Tests
The authors found that the variables that most accurately predicted death from all causes within 5 years were not the physical measures, but those reported on the questionnaires.
For example, asking people to rate their overall health and to describe their usual walking pace were two of the strongest predictors in both men and women for different causes of death.
Some findings differed by sex. Self-reported health (C-index including age, 0.74; 95% confidence interval [CI], 0.73 - 0.75) was the strongest predictor of all-cause mortality in men, and a previous cancer diagnosis (C-index including age, 0.73; 95% CI, 0.72 - 0.74) was the strongest predictor of all-cause mortality in women.
Walking pace was a stronger predictor of death than smoking habits and other lifestyle measurements. Men aged 40 to 52 years who said their walking pace was "slow" had a 3.7 times higher risk for death within 5 years than those who answered "steady average pace."
When researchers examined only people who did not have any major diseases, smoking habits were the strongest predictors of death within 5 years.
Dr Ganna and Dr Ingelsson conclude: "The prediction score we have developed accurately predicts 5 year all-cause mortality and can be used by individuals to improve health awareness, and by health professionals and organisations to identify high-risk individuals and guide public policy."
The authors say it is likely the prediction would work similarly in countries comparable to the United Kingdom in demographic and socioeconomic factors, provision of healthcare, and lifestyle and risk factors, although only studies in each country could present that evidence.
Comment Writers Question Value
In an accompanying comment, Simon G. Thompson, DSc, director of research in biostatistics in the Department of Public Health and Primary Care at the University of Cambridge, United Kingdom, and Peter Willeit, MD, PhD, a chronic disease epidemiologist with the department, say the study's large size offers both advantages and disadvantages.
Because so many factors were investigated, "some of the stronger associations are likely to be exaggerated," they write.
"This study reinforces our evidence that increases in physical activity, smoking cessation, and having a healthy diet can increase longevity," the authors say. "However, the challenge lies in how these changes can be achieved, rather than in removing any uncertainty in scientific understanding."
They add that 5-year risk for death is easier to predict than long-term morbidity or quality of life, "which are more important to individuals and to society."
They note, however, that the accompanying website is intriguing in helping people to determine their "Ubble age" (the age where the average risk in the population is most similar to the estimated risk of the individual), and they say Biobank is beginning to show its true potential.
UK Biobank is available to researchers whose application is approved. So far, more than 1800 scientists have registered to study a wide range of diseases.
Immunotherapy heralds 'new era' for cancer treatment
June 3, 2015
A "whole new era" for cancer treatment is upon us, according to experts. Two new studies published in the New England Journal of Medicine provide further evidence that immunotherapy - the use of drugs to stimulate immune response - is highly effective against the disease.
Recently presented at the 2015 American Society for Clinical Oncology annual meeting, one study revealed that a drug combination of ipilimumab and nivolumab (an immune therapy drug) reduced tumor size in almost 60% of individuals with advanced melanoma - the deadliest form of skin cancer - compared with ipilimumab alone, while another study found nivolumab reduced the risk of lung cancer death by more than 40%.
Nivolumab is a drug already approved by the Food and Drug Administration (FDA) for the treatment of metastatic melanoma in patients who have not responded to ipilimumab or other medications. It is also approved for the treatment of non-small cell lung cancer (NSCLC) that has metastasized during or after chemotherapy.
According to cancer experts, however, the results of these latest studies indicate that nivolumab and other immune therapy drugs could one day become standard treatment for cancer, replacing chemotherapy. Prof. Roy Herbst, chief of medical oncology at Yale Cancer Center in New Haven, CT, believes this could happen in the next 5 years.
"I think we are seeing a paradigm shift in the way oncology is being treated," he told The Guardian. "The potential for long-term survival, effective cure, is definitely there."
Nivolumab plus ipilimumab reduced tumor size by at least a third for almost 1 year
Nivolumab belongs to a class of drugs known as "checkpoint inhibitors." It works by blocking the activation of PD-L1 and PD-1 - proteins that help cancer cells hide from immune cells, avoiding attack. In a phase 3 trial, Dr. Rene Gonzalez, of the University of Colorado Cancer Center, and colleagues tested the effectiveness of nivolumab combined with ipilimumab - a drug that stimulates immune cells to help fight cancer - or ipilimumab alone in 945 patients with advanced melanoma (stage III or stage IV) who had received no prior treatment.
While 19% of patients who received ipilimumab alone experienced a reduction in tumor size for a period of 2.5 months, the tumors of 58% of patients who received nivolumab plus ipilimumab reduced by at least a third for almost a year.
Commenting on these findings, study co-leader Dr. James Larkin, of the Royal Marsden Hospital in the UK, told BBC News:
"By giving these drugs together you are effectively taking two brakes off the immune system rather than one, so the immune system is able to recognize tumors it wasn't previously recognizing and react to that and destroy them.
For immunotherapies, we've never seen tumor shrinkage rates over 50% so that's very significant to see. This is a treatment modality that I think is going to have a big future for the treatment of cancer."
Dr. Gonzalez and colleagues also demonstrated the effectiveness of another immune therapy drug called pembrolizumab in patients with advanced melanoma. While 16% of 179 patients treated with chemotherapy alone experienced no disease progression after 6 months, the team found that disease progression was halted for 36% of 361 patients treated with pembrolizumab after 6 months.
Dr. Gonzalez notes that while a combination of nivolumab and ipilimumab shows greater efficacy against advanced melanoma than pembrolizumab, it also presents greater toxicity. Around 55% of patients treated with nivolumab plus ipilimumab had severe side effects, such as fatigue and colitis, with around 36% of these patients discontinuing treatment.
Dr. Gonzalez says such treatment may be better for patients whose cancer does not involve overexpression of the PD-L1 protein. "Maybe PDL1-negative patients will benefit most from the combination, whereas PDL1-positive patients could use a drug targeting that protein with equal efficacy and less toxicity," he adds. "In metastatic melanoma, all patients and not just those who are PD-L1-positive may benefit from pembrolizumab."
Nivolumab almost doubled patient survival from NSCLC
In another study, Dr. Julie Brahmer, director of the Thoracic Oncology Program at the Johns Hopkins Kimmel Cancer Center, and colleagues tested the effectiveness of nivolumab against standard chemotherapy in patients with advanced NSCLC. The team found that patients who received nivolumab had longer overall survival than those treated with standard chemotherapy, at 9.2 months versus 6 months.
At 1 year after treatment, the researchers found nivolumab almost doubled patient survival. Around 42% of patients who received nivolumab were alive after 1 year, compared with only 24% of patients who received chemotherapy.
The study results also demonstrated a longer period of halted disease progression for patients who received nivolumab compared with those who had chemotherapy, at 3.5 months versus 2.8 months. Overall, the researchers estimated that, compared with patients who received chemotherapy, those who received nivolumab were at 41% lower risk of death from NSCLC.
Commenting on these findings, Dr. Brahmer says:
"This solidifies immunotherapy as a treatment option in lung cancer. In the 20 years that I've been in practice, I consider this a major milestone."
While both studies show promise for the use of immunotherapy in cancer treatment, experts note that such treatment would be expensive. The use of nivolumab plus ipilimumab for the treatment of advanced melanoma, for example, would cost at least $200,000 per patient. As such, researchers say it is important that future research determines which cancer patients would be most likely to benefit from immunotherapy.
Written by Honor Whiteman
Alterations to the eye microbiome of contact lens wearers may increase infections
June 1, 2015
Contact lens wearers - ever wondered why you are more likely to experience eye infections than your contacts-less friends? Researchers from NYU Langone Medical Center in New York City think they may have found the answer, in a study that used high-precision genetic tests to map the human microbiome.
Presenting their work at the annual meeting of the American Society for Microbiology on May 31st in New Orleans, LA, the NYU Langone researchers report that micro-organisms residing in the eyes of people who wear contact lenses daily more closely resemble micro-organisms residing in eyelid skin than the bacteria usually found in the eyes of people who do not wear contacts.
The researchers took hundreds of swabs of different parts of the eye, including the skin directly beneath the eye. Genetic analysis of swabs and used contact lenses allowed the team to identify which bacteria were present.
Comparing nine contact lens wearers with 11 non-contacts users, the team found three times the usual proportion of the bacteria Methylobacterium, Lactobacillus, Acinetobacter and Pseudomonas on the eye surfaces (conjunctiva) of contact lens wearers than on the eye surfaces of the control group.
Examining the bacterial diversity using a plotted graph, the team observed that the eye microbiome of contact lens wearers is more similar in composition to the microbiome of their skin than the eye microbiome of non-lens wearers.
Interestingly, the researchers say, Staphylococcus bacteria were found in greater amounts in the eyes of non-lens wearers. Staphylococcus is linked with eye infections but is usually more prominent on the skin. However, the researchers are unable to explain why non-lens wearers have greater amounts of these bacteria, despite this group traditionally having fewer eye infections than people who wear contacts.
Next, the team will investigate whether changes in the eye microbiome of people who wear contacts are caused by the direct pressure of the lens altering the immune system in the eye, and they hope to identify in greater detail which bacteria thrive or are suppressed in this environment.
Putting a foreign object on the eye 'is not a neutral act'
Study author Dr. Jack Dodick, professor and chair of ophthalmology at NYU Langone, says:
"There has been an increase in the prevalence of corneal ulcers following the introduction of soft contact lenses in the 1970s.
A common pathogen implicated has been Pseudomonas. This study suggests that because the offending organisms seem to emanate from the skin, greater attention should be directed to eyelid and hand hygiene to decrease the incidence of this serious occurrence."
"Our research clearly shows that putting a foreign object, such as a contact lens, on the eye is not a neutral act," says senior study investigator and NYU Langone microbiologist Maria Gloria Dominguez-Bello, PhD. "These findings should help scientists better understand the longstanding problem of why contact-lens wearers are more prone to eye infections than non-lens wearers," she adds.
Written by David McNamee
Seasonal allergies: tips and remedies
May 29, 2015
While this time of year usually brings cheerful weather and the growth of beautiful plants, millions of people will be gearing up once again to do battle with a problem that recurs every year. Itchy eyes, repetitive sneezing, a permanently runny nose - the symptoms of seasonal allergies.
For many people, the emergence of marauding ticks at this time of year is the least of their worries. The real struggle for these people is with seasonal allergies, also referred to as hay fever or allergic rhinitis.
If these common symptoms seem to develop for weeks and months on end at the same time each year, it is likely that you could be affected by seasonal allergies. The condition affects many in the US; in 2010, around 11.1 million visits to physicians' offices led to a primary diagnosis of hay fever.
Thankfully, despite how infuriating and disruptive seasonal allergies can be, there are many steps that can be taken to lessen their impact. In this Spotlight, we take a look at what seasonal allergies are and what the best strategies are for handling them.
What causes such allergies?
People develop allergies when their body's immune system reacts to a substance as though it is a threat like an infection, producing antibodies to fight it. These substances are referred to as allergens.
The next time that the body encounters the allergen, it produces more antibodies in anticipation, releasing histamine and chemical mediators in the body that lead to an allergic reaction. It is these chemicals that typically cause symptoms in the nose, throat, eyes and other areas of the body.
Jan Batten, a British Lung Foundation (BLF) Helpline nurse, explained to Medical News Today that as the summer months approach, certain allergies begin to cause more problems, such as allergies to flower pollen, grass pollen, tree molds and fungi. The drier days around this time of year help the allergens to remain in the atmosphere for longer.
"Summer allergies start to pick up around May and those affected will usually get itchy and runny eyes, a runny nose and inflamed, swollen sinuses. Breathing through your nose can be difficult too, and you might have a cough," she explained.
The American College of Allergy, Asthma & Immunology (ACAAI) report that allergies are the sixth leading cause of chronic illness in the US. According to the American Academy of Allergy, Asthma & Immunology (AAAAI), around 7.8% of people aged 18 and above have hay fever. Worldwide, the condition affects 10-30% of the population.
Most people with hay fever understand that their symptoms are set off by pollen, the fine powder released from flowering plants in order to reproduce. Pollens are spread by the wind and can be inhaled or land in the eyes or on the skin.
The most common trigger of seasonal allergies is pollen - from trees, grasses and weeds - though mold spores can also set off symptoms. Dealing with seasonal allergies, however, is not merely a matter of knowing when these airborne allergens are most prevalent and trying to avoid them. There are a few added complications to keep you on your toes.
Avoiding triggers - be aware of what sets you off
"People focus on the highs and lows of pollen counts," says Dr. James Sublett, president of ACAAI. "What they don't realize is that a high total pollen count doesn't always mean you will have allergy symptoms. The pollen from the plant you are allergic to may not be high. The key is to know what you're allergic to, and how to treat your particular symptoms."
Different kinds of pollen are prevalent at different times of the year, as well as varying from location to location. Between January and April, pollen is typically released from trees including pine, ash, birch, elm and poplar. During the summer months, grass pollens dominate, and in the fall, weed pollen is most prevalent.
People can determine whether they have an allergy or not by consulting their primary care physician and undergoing allergy testing. Dr. Andrew S. Kim, an allergist from the Allergy & Asthma Centers in Fairfax and Fredericksburg, VA, told MNT that sometimes people confuse having allergies with the flu or common cold.
"Allergies may share some similarities with sneezing and sniffling but the length of time is a big difference. Allergy symptoms usually last for weeks and months and patients typically complain of itchy nose, throat and eyes as well," he said.
"Allergy patients usually do not have fever. They do have dry cough and clear nasal drainage versus infectious cough which is characterized by yellow, or greenish nasal drainage. Some people may have asthma symptoms, such as cough, wheeze and chest tightness."
Once an individual knows that they have a seasonal allergy and is aware of what triggers it, they are in a much better position to avoid debilitating allergic reactions. Keeping track of pollen forecasts is a good place to start. It is good to remember that these change by the hour, and can be boosted when it is warm, dry and windy.
To reduce the chances of an allergic reaction, it is recommended that you stay inside when pollen counts are at their highest. These usually peak around the morning hours and maintain high levels during the afternoon.
If you do need to go outside, there are a number of steps that can be taken to reduce the chances of coming into contact with allergens. Wearing wraparound sunglasses offers protection to the eyes, and applying a small amount of petroleum jelly to the insides of the nostrils can prevent some allergens from reaching the sensitive lining of the nose.
Delegating outdoor chores to people who do not have seasonal allergies is a sensible approach. If there is no escaping lawn mowing or weed pulling, however, wear an N95 filter mask - rated by the National Institute for Occupational Safety and Health (NIOSH) - to keep allergens out.
Laundry should not be hung to dry outside, despite the conditions being perfect for it. Pollen can stick to sheets and towels and be brought into the home - normally a haven from pollens. In fact, when tackling seasonal allergies, ensuring that your home is your castle is a great strategy.
Minimizing the risk indoors
It is impossible to remove all allergens from the air inside the home, but there are certainly steps that can help reduce levels of exposure. Keeping the windows shut is a simple strategy that should be one of the first to be adopted.
Shutting the windows might be the last thing on your mind when temperatures start to rise. To stay cool without the threat of pollen looming large, use air conditioning in the house and car. It is preferable that high-efficiency air filters are used and that units follow regular maintenance schedules.
Whenever you venture outside, there is the chance that you will bring pollen back inside with you on your clothes and hair. For this reason, people should wash their hair and clothes more regularly during periods when the pollen count is high.
If you are drying clothes indoors and keeping the windows closed, you may need to use a dehumidifier to keep the indoor air dry. Keeping the air dry indoors helps prevent the growth of other allergens such as molds.
Keeping the home clean with a vacuum cleaner that has a high-efficiency particulate air filter and using a damp duster to stop pollens moving about the home also helps to clean up any allergens that are present, reducing the chances of them getting onto and into the body.
"Simple changes like wearing wraparound sunglasses, washing your clothes and hair more regularly, keeping your home clean, avoiding open, grassy spaces where possible and keeping your windows shut can help lessen the effect of summer allergies," Jan Batten told Medical News Today.
All of these measures are relatively simple to take and can go a long way toward protecting the body from seasonal allergies. However, as stated before, it is nigh-on impossible to completely avoid exposure to allergens. Particularly for people who experience severe reactions to pollen, the best route to ease symptoms is often a medical one.
Medicine and other treatment
People tend to have unique allergic responses, so the treatment that works best for each individual will vary accordingly. While some people will be able to cope with seasonal allergies with over-the-counter medication and being careful about their exposure to allergens, others may require personal treatment plans drawn up by specially trained allergists.
There is a wide range of nonprescription medication available for people who have seasonal allergies. Oral antihistamines relieve symptoms such as sneezing, itching and runny noses. Decongestants relieve nasal stuffiness and come in both oral and nasal form. Some medications contain a combination of the two.
Two types of immunotherapy are available to those who require relief from severe symptoms. These are allergy shots and tablets, and they are provided and prescribed by allergists. Allergy testing will need to be carried out first to determine precisely what allergens trigger symptoms.
Allergy shots consist of injecting a patient with diluted extracts of an allergen. Increasing doses are administered until a maintenance dose is established. This process helps the body to build up a form of resistance to the allergen and reduces the severity of symptoms.
Tablets can currently be used to treat allergies to grass and ragweed pollens. Beginning at least 3 months before the relevant pollen season begins, patients take one tablet daily, with the treatment continuing for as long as 3 years.
Dr. Kim told MNT that one of the best ways to reduce the influence of seasonal allergies is to start taking medication - such as topical nasal steroids - about a week before the beginning of the allergy season:
"Don't wait until symptoms kick in and you're already feeling bad before taking allergy medication. Instead, prepare by taking medications just before the season starts to minimize the symptoms of seasonal allergies."
A number of alternative treatments are also available, including natural remedies that feature extracts of butterbur and spirulina. It is recommended that any use of alternative treatments is discussed with a physician first, as some remedies may not be entirely safe for use.
There are many options for alleviating seasonal allergies
Allergies can be worrying, especially for people who are otherwise healthy and unused to experiencing sudden debilitating symptoms. If left unchecked, seasonal allergies can often turn an otherwise enjoyable time of year for many into misery.
Thankfully, there are many routes available for people with seasonal allergies to alleviate their symptoms. As ever, if there are any concerns or worries, it is best to speak with a health care professional who will be able to offer advice, provide treatment or refer on to a specialist.
Although there is no cure at present for seasonal allergies, the multiple options for treatment should hopefully provide some relief until winter rolls around again and we can shiver together, happy in the fact that pollen has gone for another year.
Written by James McIntosh
Metabolic syndrome could increase cardiovascular risks
May 26, 2015
Metabolic syndrome could be more of a risk to people's health than originally thought, according to new research. A study published in the Journal of Clinical Endocrinology & Metabolism suggests that people with metabolic syndrome are more likely to die from cardiovascular disease than those without the condition. Meanwhile, another new study, published in the Journal of the American Heart Association, suggests that metabolic syndrome may increase cardiovascular risk more in black women than in white women.
"It appeared that the cardiovascular disease risk was elevated in black women by the presence of only two or three metabolic abnormalities to a degree that would require four or more metabolic abnormalities among white women," says Dr. Michelle Schmiegelow, author of the Journal of the American Heart Association study and research fellow at University Hospital Gentofte, Denmark.
Metabolic syndrome is a cluster of risk factors that occur together and increase the risk of stroke and diabetes. The risk factors are increased blood pressure, high triglyceride levels, low levels of "good" cholesterol, impaired glucose metabolism and abdominal obesity.
Previous studies have indicated that obesity without metabolic syndrome (defined here as having at least three of the risk factors) is not associated with an increase in cardiovascular disease risk. However, these studies focused predominantly on white participants.
Researchers analyzed a multiethnic group of postmenopausal women aged 50-79 recruited by the Women's Health Initiative, assessing cardiovascular disease risk according to weight and metabolic health status.
Of the 14,364 participants, around 47% were white, 36% were black and 18% were Hispanic. Participants were classified as "overweight" if approximately 10% over their ideal body weight for size and "obese" if around 30 pounds over their ideal weight.
Participants were followed up for 13 years. During this time, 1,101 women had either developed coronary heart disease or had an ischemic stroke for the first time.
The researchers found that, among black women with 2-3 metabolic risk factors, the relative risk of cardiovascular disease increased by 117% in those who were obese and by 77% in those who were overweight.
In comparison, white women with 2-3 metabolic risk factors who were obese or overweight experienced cardiovascular events as often as white women with normal weight and without any metabolic disorders.
In the absence of metabolic syndrome, black women who were obese or overweight had a higher risk of cardiovascular disease compared with normal weight black women. In contrast, white women without metabolic syndrome had a similar risk of cardiovascular disease regardless of weight classification.
Dr. Schmiegelow suggests the findings imply that metabolic syndrome may underestimate cardiovascular disease risk in black women and overestimate it in white women, at least in postmenopausal women.
Diabetes and high blood pressure 'elevate the risk of death'
For the other study, published in the Journal of Clinical Endocrinology & Metabolism, researchers assessed the findings of a health screening program at Kangbuk Samsung Hospital in South Korea, in which 155,971 people participated between 2002 and 2009.
Data were collected by conducting questionnaires and measuring the body weight, body mass index, blood pressure, cholesterol and blood sugar of each participant. Death records from the Korea National Statistical Office were also obtained to measure the mortality of the participants.
"Our research found people who had metabolic syndrome had a 1.6-fold-increase in cardiovascular mortality compared to those who did not have the condition," says Prof. Ki-Chul Sung. "Women who have metabolic syndrome faced a greater risk of death from any cause than their counterparts who did not."
A total of 12.6% of the participants had metabolic syndrome when first screened. While the findings indicate that people with metabolic syndrome face a greater risk of death from cardiovascular disease than those without the condition, this difference disappeared when participants with diabetes or high blood pressure were removed from the analysis.
"The analysis tells us diabetes and high blood pressure are significant factors that elevate the risk of death from cardiovascular disease among people with metabolic syndrome," states study author Prof. Eun-Jung Rhee, of Sungkyunkwan University School of Medicine.
"Younger people who have metabolic syndrome should be aware of the risk, particularly those who have diabetes and high blood pressure."
In the US, metabolic syndrome is commonplace. Medical News Today recently reported on a study published in JAMA that found more than a third of adults in the US have metabolic syndrome, with almost half of adults aged 60 and above affected by the condition.
Written by James McIntosh
New treatment for middle ear infection found in anti-stroke drug
May 24, 2015
Middle ear infection, or otitis media, is the most common childhood bacterial infection and the leading cause of conductive hearing loss - which can occur during critical stages of children's speech and language development. Now, a new animal study suggests that repurposing an existing drug - vinpocetine, which has long been used to treat stroke and other neurological disorders - may provide a much needed nonantibiotic treatment.
The drug - which acts by limiting overproduction of mucus as opposed to targeting the bacteria causing the ear infection - may also address the urgent need for nonantibiotic treatments that reduce inflammation without side effects.
Vinpocetine is a synthetic ethyl ester of apovincamine, an alkaloid obtained from the leaves of the lesser periwinkle (Vinca minor) and discovered in the late 1960s. Also known by the trade name Cavinton, vinpocetine is used as an anti-stroke drug in most countries and as a dietary supplement worldwide.
In the new study, published in the Journal of Immunology, researchers at Georgia State University and the University of Rochester showed that vinpocetine could be an effective treatment for middle ear infection.
They describe how the drug suppressed mucus overproduction, improved bacterial clearance and reduced hearing loss in mice with middle ear infection caused by Streptococcus pneumoniae, the most common cause of the condition.
The team suggests the drug could be repurposed as a new, nonantibiotic treatment for otitis media, possibly through topical delivery (that is, applied to the affected area rather than given by injection or as a tablet).
Senior author Dr. Jian-Dong Li, director of the Institute for Biomedical Sciences at Georgia State, suggests:
"Our proposed studies may lead to developing novel, nonantibiotic therapeutic strategies to control immunopathology, reduce mucus overproduction, improve hearing loss and enhance host defense for otitis media."
There is an urgent need for nonantibiotic drugs to suppress overactive inflammation without significant side effects - especially as inappropriate antibiotic use has led to increased resistance. And vaccines against S. pneumoniae have limited effect in otitis media, says Dr. Li.
However, because we do not know much about how S. pneumoniae causes infection in the middle ear, there are currently no nonantibiotic treatments available.
Vinpocetine inhibits mucin production
We do know that mucin, the main component of mucus, plays an important role in clearing away unwanted bacteria. But the body can produce too much mucus, resulting in conductive hearing loss and less effective clearance of bacteria.
In their study, the researchers found that in cultured middle ear epithelial cells and in the middle ear of mice with middle ear infection, vinpocetine inhibited S. pneumoniae's upregulation of a gene that produces mucin.
Repurposing an existing drug has many advantages over proposing a new, experimental drug. It saves time and cost, and should also reduce safety risks - the authors note there have been no reports of significant adverse effects or toxicity of vinpocetine used in therapeutic doses in adults and children.
In the US, pediatric ear infections cost the health care system billions of dollars a year.
A 2014 study by researchers at Harvard University and the University of California, Los Angeles found that children with ear infections had an average of two additional outpatient visits, 0.2 emergency room visits and 1.6 prescriptions filled, compared with those without ear infections.
In that study, the team estimated that ear infections were associated with an increased cost of $314 per child per year for outpatient care, and an average of $17 in medication costs. Across the US, this added up to $2.88 billion of direct cost of health care for children's ear infections every year. This figure excludes costs associated with work and school days missed, and cost of travel to and from hospitals and clinics.
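As a rough, back-of-envelope check on these figures - an illustrative sketch only, since the simple assumption that the national total equals the per-child cost times the number of affected children is ours, not the study's - the arithmetic can be laid out like this:

```python
# Hypothetical sanity check on the ear-infection cost figures above.
# Per-child annual costs reported in the 2014 study:
per_child_outpatient = 314   # USD per child per year, outpatient care
per_child_medication = 17    # USD per child per year, medication
per_child_total = per_child_outpatient + per_child_medication  # 331 USD

# Reported US-wide direct health care cost per year:
national_total = 2.88e9      # USD

# Number of affected children implied by these figures:
implied_children = national_total / per_child_total
print(f"Implied affected children per year: {implied_children / 1e6:.1f} million")
# -> Implied affected children per year: 8.7 million
```

That implied figure of roughly 8.7 million children is consistent in scale with ear infections being the most common childhood bacterial infection, though the study's actual estimation method was more detailed than this simple division.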
Written by Catharine Paddock PhD
Long-term acetaminophen use during pregnancy may affect boys' fertility
May 23, 2015
Research in mice has found that exposing fetal testicular tissue to the painkiller acetaminophen lowers production of testosterone. The lab work suggests that a pregnant woman taking the drug for a single day would not affect her unborn boy - but several days of exposure to the analgesic could.
The risks of low testosterone in male fetuses cited by the study authors include common male reproductive disorders that manifest at birth, such as an undescended testis (cryptorchidism) and the defect known as hypospadias, a urethral malformation that results in the boy's urine outlet not being in the normal position at the end of the penis.
Other disorders linked to low testosterone that appear in young adulthood include low sperm counts and testicular germ cell cancer.
One of the authors, consultant pediatric endocrinologist Dr. Rod Mitchell, a Wellcome Trust intermediate clinical research fellow at the UK's University of Edinburgh, says:
"This study adds to existing evidence that prolonged use of acetaminophen in pregnancy may increase the risk of reproductive disorders in male babies." His advice for expectant women is:
"We would advise that pregnant women should follow current guidance that the painkiller be taken at the lowest effective dose for the shortest possible time."
The study, conducted by Dr. Sander van den Driesche and colleagues at the University of Edinburgh, has been published in Science Translational Medicine, a journal of the American Association for the Advancement of Science.
Mice with human testes
To examine the effects of acetaminophen, also known as paracetamol, on testosterone production, the researchers used a xenograft model in which fragments of human fetal testes were transplanted into castrated mice.
To overcome the limitations of ordinary animal testing, as well as the obvious problems that would be encountered by trying to take measures of testosterone production in unborn boys and link these with drug exposure in pregnant women, the team used a xenograft technique they had developed and validated as a model of human fetal testicular development.
The xenograft model "reflects physiological development and can be used to test the effects of chemical exposures on testosterone production."
In the grafted mice treated 3 times a day for 7 days with a human-equivalent dose of acetaminophen (20 mg per kg of bodyweight):
- Testosterone levels in the blood dropped by 45%
- The weight of the seminal vesicle glands fell by 18%.
The seminal vesicles secrete a large part of the seminal fluid, and the researchers used their weight as a biomarker of testosterone exposure. The percentage drops are relative to a placebo treatment with no acetaminophen.
Exposure to the drug - available over the counter to women in the US under brands including Tylenol - for a single day, however, did not affect these measures of testosterone production.
The researchers point out that while it is too early to say how applicable these lab findings are to human use of the analgesic, "the findings caution against extended use of acetaminophen during pregnancy."
The team, from the MRC Centre for Reproductive Health at the university, say further research would be needed to understand how acetaminophen could be having the effect on testosterone production in male fetuses.
The human fetal testis tissue used in the medical research was donated by women who had undergone pregnancy termination.
The laboratory research offers a more direct examination of acetaminophen's effects on fetal testicular tissue, building on previous studies cited by the authors that link prolonged use of the painkiller during pregnancy to an increased occurrence of, for example, undescended testes in boys.
'Short-term treatment of fever outweighs potential risk'
The UK's Royal College of Paediatrics and Child Health has responded to the study with a reminder that pregnant women should consult a doctor when seeking treatment of pain or fever.
Dr. Martin Ward-Platt, speaking on behalf of the college, says the study sends a clear message that expectant mothers should avoid prolonged acetaminophen use and keep to such medical guidance.
The US Food and Drug Administration says: "Pregnant women should always consult with their health care professional before taking any prescription or over-the-counter medicine."
Dr. Ward-Platt adds that the study specifically relates to acetaminophen/paracetamol use over at least several days, but that "there are times where one or two doses is needed to treat one-off episodes of fever for example."
"Fever during pregnancy can be harmful to the developing embryo, with links to a significant increase in the rates of spina bifida and heart malformations, so small doses of paracetamol are sometimes necessary."
Dr. Ward-Platt concludes: "My message to expectant mothers is clear - avoid overuse of paracetamol, but if you do have a fever, or any other sort of pain where you would normally use paracetamol, seek medical advice."
Written by Markus MacGill
Lifelong flu jab steps closer as researchers reveal importance of immune cell memory
May 20, 2015
In 2013, an elderly man in China fell ill after catching a virus from a live chicken his wife had asked him to buy at a local market. He was patient zero in the first outbreak of a new strain of avian flu called A (H7N9) that killed over 30% of humans infected and hospitalized 99% of survivors.
Since the first cases were reported in March 2013, 640 people have become infected and 224 have died from A (H7N9) globally through March 2015 - most in mainland China. The virus passes from infected birds to humans.
As that first outbreak ensued, Australian and Chinese scientists teamed up to investigate why some people died while others survived, and why some survivors recovered more quickly than others.
The researchers found the clue lay in the way a certain group of white blood cells - called CD8+ T cells - responded to the new virus.
They believe their discovery could advance flu vaccine technology and bring closer the day when people just have one flu jab that protects them for life.
One of the investigators, Katherine Kedzierska, an associate professor at the University of Melbourne in Australia, explains how the team - which included specialists from Melbourne and virologists from Fudan University in Shanghai, China - reacted to the outbreak:
"We'd never seen anything like H7N9. The virus was infecting more people rapidly and nobody had immunity. Thankfully, we did manage to contain the virus but we knew we had come face-to-face with a potential pandemic that could kill millions of people around the world if the virus became able to spread between humans."
Early, virus-specific CD8+ T cell response was key to faster recovery in survivors
CD8+ T cells are the body's "assassins" when it comes to taking out new viruses - they kill cells infected with the virus. The new study - published in Nature Communications - explains how these cells memorize the viruses.
When the immune system is faced with a new virus, a cascade of defenses kicks in, involving different groups of cells at different times. First, the innate immune system responds with some generalized defenses to try and stop the virus multiplying and spreading.
This is followed by the adaptive immune response that specifically targets the virus and, if all goes well, clears it from the body.
The speed and pattern of the immune response, particularly how quickly and precisely it targets the new virus, is a big factor in how well it succeeds in overcoming the pathogen.
In the case of the 2013 Chinese A (H7N9) outbreak, the Australian and Chinese team found patients who recovered quickly showed evidence of early CD8+ T cell responses that were specific to the new virus.
The patients who took longer to recover showed a similar - but delayed - CD8+ T cell response, coupled with late recruitment of CD4+ T cells and antibodies, later boosted by natural killer (NK) cells.
But in the patients who died, the researchers found little evidence of immune cell response that was specific to A (H7N9), and no sign of any T cell activation.
"After collecting samples from infected patients we found that people who couldn't make these T cell flu assassins were dying," Prof. Kedzierska notes.
Step closer to 'one-off universal flu vaccine shot'
The researchers concluded that it was the ability of flu-killing CD8+ T cells to memorize distinct strains that protected the patients who recovered most quickly from being severely affected by the new influenza A virus.
The team believes their discovery could lead to new cellular memory-implant technologies that help flu vaccine developers move from targeting specific flu strains toward a universal one that is based on T cells. Prof. Kedzierska concludes:
"Our extraordinary breakthrough could lead to the development of a vaccine component that can protect against all new influenza viruses, with the potential for future development of a one-off universal flu vaccine shot."
She says their findings should also help doctors predict how well their patients' immune systems are likely to respond to viruses so they can intervene early with artificial ventilation and other treatments, especially if patients are at risk of dying.
In October 2014, Medical News Today learned how another research group led by Brown University in Providence, RI, showed that once CD8+ T cells tackle one virus, they will fight others.
In that study, the team challenged the traditional view that CD8+ T cells can only deal with one pathogen. They discovered that once CD8+ T cells fight one pathogen, they join the innate immune system, ready to respond to cytokine signals that are set off by a wide variety of infections.
Written by Catharine Paddock PhD
Antibiotic use in infancy, gut microbe disruption, and disease later in life are all linked, say researchers
May 17, 2015
Antibiotics are the most common prescription drug given to children, and a third of such prescriptions are not necessary. This would not necessarily be a cause for concern, except that antibiotics disrupt gut microbes, imbalances in which are linked to disease in later life, say researchers after reviewing current knowledge linking antibiotic use in infants, imbalance in gut microbes and adult disease.
Evidence of the potential harm to emotional and physical health caused by imbalances in gut microbes - called dysbiosis - is mounting daily. Dysbiosis has been linked to infectious diseases, allergies and other autoimmune disorders, and even obesity, later in life.
For instance, a recent report described how gut microbes are important for the production of serotonin, a brain chemical traditionally associated with regulation of emotions and behavior. Imbalances in serotonin production outside the brain are now also linked to diseases ranging from irritable bowel syndrome and cardiovascular disease to osteoporosis.
Writing in the journal Cell Host & Microbe, the researchers behind this new study - including members from the University of Minnesota in Minneapolis - highlight the complex nature of the connection between how microbes in the infant gut react to antibiotics and the development of disease later in life.
Senior author Dan Knights, an assistant professor specializing in computational biology at the University of Minnesota, says:
"Diseases related to metabolism and the immune system are increasing dramatically, and in many cases we don't know why."
Framework for studying antibiotic-related dysbiosis in children
To make it easier to navigate their synthesis of current knowledge, the team created a framework that can be used to study antibiotic-related dysbiosis in children. Prof. Knights explains:
"Previous studies showed links between antibiotic use and unbalanced gut bacteria, and others showed links between unbalanced gut bacteria and adult disease. Over the past year we synthesized hundreds of studies and found evidence of strong correlations between antibiotic use, changes in gut bacteria, and disease in adulthood."
The framework is depicted in Figure 1 of the Cell Host & Microbe paper.
For example, in the case of allergies, they found use of antibiotics may destroy communities of gut bacteria that help immune cells mature. Even if these colonies return, the immune system remains impaired.
In relation to obesity, they found antibiotic-induced imbalances in gut microbiota led to increased levels of short-chain fatty acids that affect metabolism.
The team also showed how you can predict an infant's age to within 1.3 months from the maturity of their gut microbiota. This discovery could lead to a test and treatments for children whose microbiome is underdeveloped because of antibiotic use, or other reasons.
Prof. Knights concludes:
"We think these findings help develop a roadmap for future research to determine the health consequences of antibiotic use and for recommendations for prescribing them. The clinical test we demonstrated would also allow us to think about interventions at an early age."
He and his colleagues call for more research into four areas of antibiotic-related dysbiosis in children: loss of key populations, loss of diversity, effect on metabolism and overgrowth of potentially harmful bacteria.
They also say there is a need to form a large and diverse cohort of children to establish a baseline for healthy microbiome development.
Such a baseline cohort is "essential to advancing diagnosis, interpretation, and eventual treatment of pediatric dysbiosis," they urge.
Meanwhile, another recent study shows how gut microbiome diversity appears to be diminished by Western lifestyles. There, researchers compared the gut microbiomes of people living in Papua New Guinea with those of people living in the US. They found that 50 bacterial types present in Papua New Guinean microbiomes were missing from the US ones.
Written by Catharine Paddock PhD
Extra nuts or oil with Mediterranean diet could protect memory
May 14, 2015
Researchers in Spain have suggested that following a Mediterranean diet supplemented with additional portions of antioxidant-rich extra virgin olive oil or mixed nuts could protect cognitive functioning in older adults.
The study, published in JAMA Internal Medicine, was a randomized clinical trial that followed cognitive change over time among volunteers assigned to follow one of three different diets. Volunteers were cognitively healthy, had a high cardiovascular risk and an average age of 67.
Following the Mediterranean diet has been recommended by the Dietary Guidelines for Americans as a way to promote health and prevent disease. Emphasis is placed on eating primarily plant-based foods, basing every meal on foods such as fruits, vegetables, whole grains, legumes and nuts.
The diet discourages the use of saturated fats and trans fats, both associated with heart disease. Instead, healthier types of fat are obtained from sources such as olive oil, predominantly containing monounsaturated fat which can improve cholesterol levels.
Vegetables and healthier fats are also good sources of antioxidants that play an important role in counteracting oxidative stress. Oxidative stress occurs when the body is unable to detoxify itself fully, and the process is believed to play a significant role in cognitive decline.
"Oxidative stress and vascular impairment are believed to partly mediate age-related cognitive decline, a strong risk factor for development of dementia," write the authors. "Epidemiologic studies suggest that a Mediterranean diet, an antioxidant-rich cardioprotective dietary pattern, delays cognitive decline, but clinical trial evidence is lacking."
To test this hypothesis, Dr. Emilio Ros of the Institut d'Investigacions Biomediques August Pi Sunyer, Hospital Clinic, Barcelona, and coauthors examined the effects of Mediterranean diets supplemented with olive oil or nuts compared with a low-fat control diet.
A total of 155 participants were randomly assigned to follow a Mediterranean diet supplemented with one liter of extra virgin olive oil per week. Another 147 participants were assigned to follow a Mediterranean diet supplemented with 30 grams per day of mixed nuts (walnuts, hazelnuts and almonds).
The researchers measured cognitive change in the participants over time using multiple neuropsychological tests focusing on memory, global cognition and frontal cognition (attention and executive function). These participants were compared with a control group of 145 participants following a diet where they were advised to reduce dietary fat.
Extra olive oil or nuts 'may counteract age-related cognitive decline'
After a median of 4 years of the dietary intervention, follow-up cognitive tests were available in 334 participants. The researchers found there were 37 cases of mild cognitive impairment among the participants: 17 (13.4%) in the group that received extra olive oil, eight (7.1%) in the mixed nuts group and 12 (12.6%) in the control group.
Participants in the control group experienced significant decreases in each measured composite of cognitive function. The researchers noted, however, that the two Mediterranean diet arms of the study experienced different improvements in cognitive function.
"The group with nuts did better compared to the control group in memory tests, memorizing names or words, while the olive oil group did better on tests that require speed of thought, your frontal function, your executive function," explains Dr. Ros.
Although the study was a randomized clinical trial, it has its limitations. Not all participants received follow-up cognitive testing, and adherence to all three diets cannot be guaranteed. According to the researchers, further investigation is warranted.
"Our results suggest that in an older population, a Mediterranean diet supplemented with olive oil or nuts may counteract age-related cognitive decline," conclude the authors. "The lack of effective treatments for cognitive decline and dementia points to the need of preventive strategies to delay the onset and/or minimize the effects of these devastating conditions."
In an interview with JAMA, Dr. Ros states that the research team is currently conducting a study into the effects of walnuts on neurodegenerative disease, comparing a walnut diet with a control diet.
Last week, Medical News Today reported on a similar study in which researchers monitored the diets of older adults for 5 years and tested for cognitive decline. The researchers discovered that those who ate healthily experienced only a small drop in brain power.
Written by James McIntosh
Study shows dopamine may play role in chronic pain
May 11, 2015
The brain chemical dopamine - already known to be important for thinking, memory, movement and reward - may also play a key role in maintaining chronic pain, says a new study published in The Journal of Neuroscience.
Researchers from the University of Texas (UT) at Dallas and others traced the path of pain signals between the brain and spinal cord in mice and found removing a group of dopamine-containing cells selectively reduced chronic pain.
Senior author Ted Price, associate professor in behavioral and brain sciences at UT Dallas, says the study reveals a new role for dopamine in helping maintain chronic pain states, and suggests:
"This may open up new opportunities to target medicines that could reverse chronic pain."
In acute pain, when we suffer an injury, pain signals travel like electricity from the site of the injury to the spinal cord, which relays them as chemical and electrical pulses to brain cells that distribute them throughout the brain.
In people with chronic pain, their nerve cells continue to send pain signals to the brain - even in the absence of injury - but the causes of this are not known.
The brain has several pain centers, and evidence suggests chronic pain alters how they are activated.
Targeting A11 dopamine-containing cells permanently reversed chronic pain in mice
The new study shows that a group of dopamine-containing cells called A11 do not affect acute pain, but they appear to have a profound effect on chronic pain.
In their paper, Prof. Price and colleagues describe how they permanently reversed a chronic pain state in mice by targeting A11 cells.
The idea for the study came from noticing that previous work on chronic pain had focused on other brain chemicals, such as norepinephrine and serotonin, so the team decided to take a look at dopamine. Prof. Price explains what they did:
"We used a toxin that affected A11 neurons, and that's when we found that acute pain signals were still normal, but chronic pain was absent."
The researchers conclude their study increases our understanding of the causes of chronic pain and the factors that contribute to it, which should eventually lead to more effective treatments. Prof. Price adds:
"In future studies, we would like to gain a better understanding of how stress interacts with A11. And we'd like to know more about the interaction between molecular mechanisms that promote chronic pain and dopamine."
In 2011, the Institute of Medicine estimated that more than 100 million Americans suffer from chronic pain, amounting to a national burden of around $600 billion each year in medical care and lost productivity.
The University of Texas at Dallas, the National Institutes of Health and the Rita Allen Foundation helped fund the study.
Written by Catharine Paddock PhD
Obesity as a "Brain Disease"
May 9, 2015
Obesity is a complex multifactorial disease that acts as a gateway to many other chronic conditions — with resultant enormous impact — and should actually be viewed as a brain disease, one expert argues.
Speaking at the 2015 European Congress on Obesity today, bariatric surgeon Carel Le Roux, MBChB, PhD, of University College Dublin, Ireland, explained, "Obese people have a functional deficiency in many of the hormones that should rise after a meal," but in fact the receptors for these hormones lie in the brain, and in addition the gut relays messages about satiety to the brain via the vagus nerve.
"We used to think of obesity and the brain in terms of psychology," he explained in a media master class here. But in genome-wide association studies for obesity, "although we only get a few hits, most point to the brain," and, in support of this observation, the effects of bariatric surgery are mainly mediated in the brain, he added.
Labeling obesity as a brain disease "is controversial," he acknowledged, but it is necessary to "allow us to shift our thinking" and better understand the physiology of the condition for the development of new treatments.
Indeed, a shift in thinking is required to enable doctors worldwide to treat the obesity epidemic, recognizing that it is important to differentiate "how we prevent obesity from how we treat it," he observed.
Personalizing Treatment for Obesity
Immediate past president of the European Association for the Study of Obesity, Gema Frühbeck, MD, PhD, of the University of Navarra, Pamplona, Spain, told assembled journalists that it is important for doctors to recognize that, for any one person, there are different causes of obesity.
Going forward, it will be key to bring the concept of personalized medicine to obesity and "have much more detailed phenotypes in each individual," she noted.
Genetics is known to play a role, but only around 20% of cases of obesity are accounted for by genetic variations, explained Luc van Gaal, from the University of Antwerp, Belgium.
There are, however, numerous other factors that contribute to the mix, including endocrine disruptors; medicines that affect body weight — such as corticosteroids and antidepressants; psychological factors — such as lack of sleep and anxiety and depression; and physiological explanations, such as hypothyroidism — although the last accounts for a very small minority of cases, he admitted.
Going forward, "understanding the physiology of obesity is very important for the development of new treatments," Dr van Gaal stressed.
Renewed understanding of the role of hormones such as ghrelin and leptin, as well as determining how, for example, brown fat fits into the mix, may help with future pharmacological approaches, he added.
Managing Expectations for Treatment of Obesity
It is also vital for doctors to understand that "management of obesity is feasible," but patients need to be properly informed of the anticipated benefits, Dr Le Roux said.
Current president of the European Association for the Study of Obesity, Hermann Toplak, MD, of the University of Graz, Austria, told journalists that "we only need a 5% to 10% weight loss to get benefits in terms of a reduction in cardiovascular risk, for example, but this often doesn't meet the expectation of the patient." Newer obesity drugs will generally provide only this level of weight loss, and it is key to get patients to understand that they also need to institute — and maintain — lifestyle changes, in terms of diet and physical activity, to gain maximum benefit, he noted.
And such obesity agents work effectively only in patients who "respond" to them, Dr Toplak explained, although he added that, thankfully, it is possible to identify, very early on — within the first 3 months of use — which patients will respond to a given drug.
Indeed, for the first time, the European Medicines Agency has instituted a "stopping" rule in its recent approval of two new obesity agents, liraglutide (Saxenda, Novo Nordisk) and naltrexone/bupropion (Mysimba, Orexigen Therapeutics), such that if a patient hasn't lost a certain amount of weight by week 16, therapy should be ceased.
This will be "a big advantage" in helping to limit treatment to responders only, Dr van Gaal explained.
And although it is too early to predict the uptake of these new obesity drugs in Europe — the first new agents approved there in many years — as reimbursement is still being sorted out country by country, he suggested that national health authorities could tie reimbursement to response when authorizing use of these agents.
New Aim: 15% Weight Loss With Pharmacological Agents
And while current pharmacologic agents only provide 5% to 10% weight loss, "regardless of the mechanism," there is a big contrast with bariatric surgery, where weight loss can be much greater, up to 30% to 40% in some patients, Dr van Gaal commented.
But if a 15% weight loss could be achieved through pharmacological means, this would be a huge step forward, he said.
"If we want to see the effects on total mortality [in obesity] that so far have only been seen in surgery, we will need combination [pharmacological] therapy," in the same way as hypertension and diabetes are treated today, he added.
Dr van Gaal explained to Medscape Medical News that he is not talking about combining existing obesity agents; rather, "it must be based on physiology."
Going forward, for example, "I think combining peptide therapy — for instance liraglutide plus leptin, or liraglutide plus [peptide tyrosine-tyrosine] PYY — could be beneficial. There must be a scientific background."
And Dr Le Roux stressed that "losing weight is relatively easy, but maintaining weight loss is hard."
With surgery, "we know we can maintain 25% weight loss at 20 years" in some individuals, and "I agree that if we can get 15% weight loss after 10 years" with pharmacological agents, this will be a huge advance.
"There is no one silver bullet, it must be a combination of approaches."
Healthy diets make brighter brains
May 7, 2015
A study of nearly 28,000 people aged 55 and older at high cardiovascular risk, which monitored their diets for 5 years and tracked performance on thinking and memory tests, found a smaller drop in brain power among those who ate well.
The American Academy of Neurology has published the results in the journal Neurology. The healthy eating linked to stronger cognitive health was a diet low in red meat, with moderate alcohol and plenty of fruits, vegetables, nuts and fish.
The 27,860 over-55s included for the analysis, from across 40 countries, were studied over an average of around 5 years.
Although all participants were at high risk of cardiovascular disease, certain health conditions were grounds for exclusion at the start of the study: none of the participants had diabetes or a history of heart disease, stroke or peripheral artery disease, nor had any recently experienced serious outcomes such as a stroke or congestive heart failure.
Participants who experienced heart disease or stroke during the study were no longer followed for diet and mental power.
To take a baseline measure of cognitive health and monitor any decline, thinking and memory skills were tested at the start of the study, then 2 years and about 5 years later.
The thinking and memory tests were scored out of a maximum of 30 points, and cognitive decline was noted when scores dropped by 3 points or more, which happened for 17% of participants overall - a total of 4,699 people.
Measuring cognitive health
This new study linking brain power and diet used a test of cognitive health that is also used in dementia diagnosis. The mini-mental state examination (MMSE) measures:
- Orientation to time and place
- Word recall
- Language abilities
- Attention and calculation
- Visuospatial skills.
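The 3-point decline criterion described above can be sketched as a simple check. This is an illustrative sketch only; the scores below are hypothetical, not study data:

```python
def cognitive_decline(baseline_score, followup_score, threshold=3):
    """Flag cognitive decline when the test score (out of 30) drops
    by `threshold` points or more between baseline and follow-up."""
    return (baseline_score - followup_score) >= threshold

# Hypothetical participants: (baseline score, follow-up score)
scores = [(29, 27), (30, 26), (28, 25), (27, 27)]
declined = sum(cognitive_decline(b, f) for b, f in scores)
print(declined)  # 2 of the 4 hypothetical participants show decline
```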
Cognitive decline lowest among those who reported healthiest diets
The proportion registering a decline was lower for people reporting the healthiest diets: 14% of these showed a drop in thinking and memory, compared with 18% of the people eating the least healthy diets.

For the measure of diet, the participants were asked at the start of the study to say how often they ate certain foods, including vegetables, nuts and soy proteins, whole grains and deep-fried foods. They also reported levels of alcohol intake and gave data to produce a ratio of fish to meat and eggs in their diets.
The measure of diet quality was a modified version of the healthy eating index used by the US government.
Among the 5,687 people with the healthiest diet, 782 made up the 13.8% having cognitive decline, while of the 5,459 people with the least healthy diets, 987 accounted for the 18.1%.
The relative difference from these figures produces a 24% lower likelihood of a drop in thinking and memory for people eating well.
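The 24% figure can be reproduced from the counts reported above. Note that it is a relative, not absolute, difference in risk:

```python
# Counts taken from the article's figures.
healthiest_declined, healthiest_total = 782, 5687
least_healthy_declined, least_healthy_total = 987, 5459

risk_healthiest = healthiest_declined / healthiest_total    # ~0.138 (13.8%)
risk_least = least_healthy_declined / least_healthy_total   # ~0.181 (18.1%)

# Relative reduction in the likelihood of cognitive decline.
relative_reduction = 1 - risk_healthiest / risk_least
print(f"{relative_reduction:.0%}")  # 24%
```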
The researchers accounted for factors that could have affected the results, such as physical activity, high blood pressure and history of cancer.
Study author Dr. Andrew Smyth, of McMaster University in Hamilton, Ontario, Canada, and the National University of Ireland in Galway, says diet in later life is only part of the picture:
"Adoption of a healthy diet probably begins early in life, and a healthy diet might also go along with adoption of other healthy behaviors."
For their data, the authors examined participants from randomized drug trials in cardiovascular disease supported by pharmaceutical company Boehringer Ingelheim.
In background to their work, the authors cite previous brain health links to healthy diet but point out that using the large multinational prospective cohort study allows observation of "more precise associations between diet (assessed using standardized methodology) and cognitive outcomes."
Explaining what biological explanations may lie behind the emerging evidence, the authors say: "Dietary intake may modify the risk of cognitive decline through multiple mechanisms, including increased risk of stroke (both overt and covert) and through deficiency of nutrients required for neuronal regeneration (for example, group B vitamins, and vitamin C)."
The risk factors for dementia listed by the US National Institute of Neurological Disorders and Stroke include a number that can be modified by dietary and lifestyle measures.
The new study ends by stating:
"In conclusion, we report that higher diet quality is associated with a reduced risk of cognitive decline. Improved diet quality represents an important potential target for reducing the global burden of cognitive decline."
Written by Markus MacGill
Two-minute walk every hour may reduce hazard of prolonged sitting
May 5, 2015
Numerous studies have shown that prolonged sitting day after day is linked to poorer health and early death. Now, a new study suggests even a small change can make a difference. The researchers say a 2-minute walk every hour may offset the risk of death linked to prolonged sitting.
The researchers report their study in the Clinical Journal of the American Society of Nephrology (CJASN). They conclude that low-intensity activity such as standing may not be enough to offset the effect of sitting for long periods, but adding just 2 minutes of walking per hour to a weekly routine of more moderate exercise may have an effect.
Previous studies have linked daily prolonged sitting to increased risk of premature death, as well as heart disease, diabetes and other chronic health problems.
To reduce these risks, bodies like the American Heart Association recommend adults do at least 150 minutes a week of moderate-intensity exercise, or 75 minutes a week of vigorous exercise. However, the researchers note that 80% of Americans do not meet this recommendation, so they wondered whether more achievable goals - trading some sitting time for brief, light activities - could make a difference.

For their analysis, the team used data on 3,243 participants in the 2003-04 National Health and Nutrition Examination Survey (NHANES). To monitor the intensity of their activity during waking hours, the survey had asked participants to wear accelerometers for several days. The participants were followed for 3 years after the activity data were collected, during which time 137 of them died.
2 minutes of walking per hour linked to 33% lower risk of death
On the basis of the accelerometer readings, the researchers worked out how many minutes per hour participants spent, on average, in sedentary (fewer than 100 accelerometer counts per minute), low (100-499), light (500-2019), and moderate to vigorous (2020 and over) activity. They then examined how mortality was linked to replacing 2 minutes per hour of sedentary time with low, light, or moderate/vigorous activity. The researchers found no benefit to trading 2 minutes of sitting per hour for 2 minutes of low-intensity activity, such as standing.
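The accelerometer cut points described above map directly to intensity bands. A minimal sketch of that classification, using the study's reported thresholds:

```python
def classify_activity(counts_per_minute):
    """Map accelerometer counts per minute to the study's
    intensity bands: <100 sedentary, 100-499 low,
    500-2019 light, 2020+ moderate/vigorous."""
    if counts_per_minute < 100:
        return "sedentary"
    if counts_per_minute < 500:
        return "low"
    if counts_per_minute < 2020:
        return "light"
    return "moderate/vigorous"

print(classify_activity(80))    # sedentary
print(classify_activity(600))   # light (e.g., casual walking)
print(classify_activity(2500))  # moderate/vigorous
```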
However, they found trading 2 minutes of sitting for 2 minutes of light-intensity activity - such as casual walking, light gardening or cleaning - was linked to a 33% lower risk of premature death.
Lead author Srinivasan Beddhu, professor of internal medicine at the University of Utah School of Medicine in Salt Lake City, says: "It was fascinating to see the results because the current national focus is on moderate or vigorous activity. To see that light activity had an association with lower mortality is intriguing."
Even short periods of light activity add up
Prof. Beddhu notes that even short periods of light activity add up over the course of a week. Assuming you are awake for 16 hours a day, strolling for 2 minutes an hour adds about 400 kcal a week to your energy expenditure - not far off the 600 kcal it takes to meet the goal of moderate exercise. He concludes:
"Based on these results we would recommend adding 2 minutes of walking each hour in combination with normal activities, which should include 2.5 hours of moderate exercise each week."
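The weekly arithmetic behind the 400 kcal figure can be checked with a quick sketch. The per-minute energy cost of casual walking here is an illustrative assumption chosen to be consistent with the article's figure, not a value from the study:

```python
# 2 minutes of strolling per waking hour, 16 waking hours a day.
waking_hours_per_day = 16
minutes_per_week = 2 * waking_hours_per_day * 7  # 224 minutes of walking

# Assumed energy cost of casual walking (kcal per minute) - illustrative.
kcal_per_minute = 1.8

weekly_kcal = minutes_per_week * kcal_per_minute
print(round(weekly_kcal))  # ~403 kcal, close to the ~400 kcal cited
```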
Prof. Beddhu says that moderate exercise brings health benefits in ways that light exercise cannot - for example, it strengthens the heart, muscles and bones.

Because of its design, this type of study cannot prove that doing 2 minutes of light activity an hour reduces the risk of premature death - it can only show a strongly suggestive link. Prof. Beddhu says only large, randomized interventional trials can show whether replacing some sitting time with light activity leads to better health.

The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) and the University of Utah Study Design and Biostatistics Center helped to fund the study.
Written by Catharine Paddock PhD
What are the health benefits of tea?
May 3, 2015
Tea is the most widely consumed beverage in the world after water. All tea, including the four main types (black, white, green and oolong), originates from the same plant, Camellia sinensis; the differences arise in harvesting and processing.
Green tea is made by picking fresh Camellia sinensis leaves and immediately steaming or pan-frying them to halt oxidation and fermentation, which results in a fresher, lighter flavor. White tea is made in a similar fashion using the newest, youngest buds of the plant. The leaves of black and oolong tea are withered and then rolled and crushed; oolong tea is partially fermented, while black tea is fully fermented.

Matcha tea, which has recently been gaining popularity, is green tea finely ground or milled into a powder. Traditional matcha powder is sifted into a bowl and whisked with hot water until frothy. When drinking matcha, you consume the whole tea leaf instead of just the infusion, which experts say makes matcha nutritionally superior to green tea.
Black tea contains 2-3 times more caffeine than green tea.
Herbal tea is made from a variety of plants, herbs and spices and in many countries cannot be called "tea," since it does not come from the Camellia sinensis plant.

This MNT Knowledge Center feature is part of a collection of articles on the health benefits of popular foods. It provides a nutritional breakdown of tea, an in-depth look at its possible health benefits, how to incorporate more tea into your diet and any potential health risks of consuming tea.
Contents of this article:
1. Nutritional breakdown of tea
2. Possible health benefits of consuming tea
3. How to incorporate more tea into your diet
4. Potential health risks of consuming tea
Nutritional breakdown of tea
According to the USDA National Nutrient Database, one cup of black tea (approximately 237 grams) contains 2 calories, 1 gram of carbohydrate, 0 grams of sugar, 0 grams of fiber and 0 grams of protein, as well as 26% of daily manganese needs and small amounts of riboflavin, folate, magnesium, potassium and copper. Unsweetened brewed green tea is a zero-calorie beverage. The caffeine content of a cup of tea varies with the infusion time and the amount of tea used.

Overall, tea contains few calories, helps with hydration and is a good source of antioxidants. Catechins, potent antioxidants found primarily in green tea, are known for their beneficial anti-inflammatory and anti-carcinogenic properties.
Possible health benefits of consuming tea
Most available studies on the potential health benefits of consuming tea have used green tea.
A study published in the Journal of Physiological Anthropology looked at the effects of green tea, white tea and water consumption on stress levels in 18 students. The study suggested that both green and white tea lowered stress levels, with white tea having the greater effect. Larger studies are needed to confirm this possible health benefit.
Boosting the brain
Some studies suggest that tea may be beneficial in reducing the risk of dementia and even enhancing our brain's cognitive functions, particularly the working memory.
A 2006 study published in the Journal of the American Medical Association concluded that green tea consumption is associated with reduced mortality due to all causes, including cardiovascular disease. The study followed over 40,000 Japanese participants between the ages of 40 and 79 for 11 years, starting in 1994. The participants who drank at least 5 cups of green tea per day had a significantly lower risk of dying (especially from cardiovascular disease) than those who drank less than one cup of tea per day. Another study found that consuming 10 cups of green tea per day can lower total cholesterol; however, consuming 4 cups or fewer had no effect.
Decreasing cancer risk
Epigallocatechin-3-gallate (EGCG) is the most studied and bioactive polyphenol in tea and has been shown to be the most effective at eliminating free radicals. According to the National Cancer Institute, the polyphenols in tea have been shown to decrease tumor growth in laboratory and animal studies and may protect against damage caused by ultraviolet B (UVB) radiation.

In countries where green tea consumption is high, cancer rates tend to be lower. However, it is impossible to know for sure whether it is the green tea that prevents cancer in these populations or other lifestyle factors. One large-scale clinical study compared green tea drinkers with non-drinkers and found that those who drank the most tea were less likely to develop pancreatic cancer; women in particular were 50% less likely to develop the disease. Studies have also shown positive effects of green tea on breast, bladder, ovarian, colorectal, esophageal, lung, prostate, skin and stomach cancer.

Other studies, though, have found no preventive effect of tea on cancer, and the amount of tea required for cancer-preventive effects has varied widely across studies - from 2-10 cups per day. In 2005, the FDA stated that "there is no credible evidence to support qualified health claims for green tea consumption and a reduced risk of gastric, lung, colon/rectal, esophageal, pancreatic, ovarian, and combined cancers."
Green tea may have an anti-arthritic effect by suppressing overall inflammation. According to Registered Dietitian Joy Bauer, "studies suggest that EGCG works to stop the production of certain inflammatory chemicals in the body, including those involved in arthritis. Preliminary research suggests that EGCG and other catechins in tea may prevent cartilage from breaking down, possibly helping to preserve joints longer."
How to incorporate more tea into your diet
For optimal health benefits, tea should be steeped as long as possible to maximize its flavonoid content, and 2-3 cups should be consumed per day, according to Tori Crawford, MS, RD, LD. Bottled tea does not seem to be as beneficial as brewed tea and contains smaller amounts of beneficial polyphenols. You can not only drink tea but also incorporate it into your cooking. Check out these healthy and delicious recipes developed by registered dietitians that incorporate tea:
Green tea honey vinaigrette dressing
Gingery green tea smoothie
Matcha tea waffles
Matcha and pumpkin seed crusted salmon
Peppermint mocha matcha tea
Matcha vegetable curry.
Potential health risks of consuming tea
People who are extremely sensitive to caffeine may experience insomnia, anxiety, irritability or an upset stomach when consuming tea. Those taking anticoagulant drugs such as Coumadin (warfarin) should drink green tea with caution due to its vitamin K content. Tea has also been found to decrease the bioavailability of iron when taken with meals, so people with a history of iron deficiency should avoid drinking tea when taking iron supplements or eating an iron-rich meal. If taken with other stimulant drugs, tea could increase blood pressure and heart rate.
Green tea supplements contain high levels of active substances that can interact with medications and other herbs and supplements. Green tea supplements are unregulated by the FDA and may contain untested substances with unproven health benefits. Always check with a physician before starting any herb or supplement regimen.
In particular, pregnant or breastfeeding women, those with heart problems or high blood pressure, kidney or liver problems, stomach ulcers, or anxiety disorders should not take green tea supplements or extracts. It is the total diet or overall eating pattern that is most important in disease prevention and achieving good health. It is better to eat a diet with variety than to concentrate on individual foods as the key to good health.
Written by Megan Ware
The dangers of working in an office
May 2, 2015
The office here at Medical News Today HQ is a pleasant place to work. It is a largely tranquil place (until somebody decides to use the shredder) where the tea is plentiful and occasionally a passing dog can be spotted through the window.
On the idyllic surface, it seems as though it would be a perfectly safe and healthy place to work. There are certainly no obvious hazards of the kind that are commonplace on construction sites or in factories, and many office workers enjoy the benefits of fixed working hours, rather than having their body clocks thrown off by changing shift patterns.
But offices are not without their hazards, even if these are not as overt as those in other environments. One significant problem comes from sitting at a desk for most of the day: a recent study has suggested that the amount of time spent sitting each day is associated with a higher risk of various diseases.
According to the US Bureau of Labor Statistics (BLS), around 21,638,470 people are employed in jobs defined as office and administrative support occupations. However, this figure does not include other occupations, such as management roles, business and financial operations occupations or computer occupations that are also likely to be based in office environments.
In this Spotlight, we investigate what the health-conscious office worker needs to be wary of if they are going to complete their 9-5 with both body and mind intact, and if there are any ways for them to maintain peak fitness during their employment.
Chained to your desk
As mentioned above, sitting for long periods of time every day is bad for your health. While sitting reduces the amount of time individuals can spend exercising, researchers have demonstrated that prolonged sitting time is associated with poor health outcomes regardless of the amount of physical activity performed.
Although sitting at a desk is a seemingly simple task, it is an easy one for people to do wrong. Workers often complain of sore wrists and pain in the back and neck, and this will frequently be due to the way they position their body while working.
If an individual is sitting or typing in an unhealthy way, it is likely that they will be putting strain on their body for most of their working day. That is a lot of strain for a body to take over the course of a week. Unsurprisingly, back pain is one of the most common reasons for employees missing work and is the second most common reason for visits to the doctor.
The American Chiropractic Association (ACA) state that back pain can be caused by poor posture, obesity and psychological stress, among other factors - all of which can easily come into play in an office environment if work is tense and employees are unable to step away from their desks.
Good posture at the desk is the first step to be taken in protecting your health when working in the office. This can be achieved with efficient office ergonomics. Making sure that all objects that will be needed are situated close by reduces excessive stretching.
When sitting in front of a computer, the body should be positioned centrally to the monitor and keyboard. You should sit up straight with feet rested flat on the floor. If this is not possible, a footrest should be used. Thighs should ideally be horizontal, with the knees level with the hips.
The forearms should also be level or tilted up slightly. When typing, wrists should be in a straight and natural position. Using a wrist rest can reduce stress on the wrists and help prevent specific awkward positioning. There are a number of common posture mistakes that can be made when sitting and can easily become part of a routine if not addressed:
- Slouching - this position places a lot of pressure on the lower back, damaging the ligaments, joints and soft tissue in this area and can lead to hunching
- Sitting cross-legged - this position tucks in the hip, making it difficult to sit up straight and leading to slouching. Sitting cross-legged can also lead to muscle imbalances in the hips that cause pain and stiffness
- Hunching forward - can lead to a tight chest and weak upper back, potentially leading to the development of a rounded upper back that is susceptible to pain and stiffness
- Poking the chin forward - often an attempt to compensate for a hunched back or a seat that is too low, this can lead to muscle weakness around the neck
- Phone cradling - employees that have to use a phone frequently may hold their phone handset between their ear and shoulder in order to leave their hands free to operate a computer or write. This can weaken the neck muscles and lead to muscle imbalances that cause headaches.
Office workers are advised to get up and move around whenever they can. The nature of many office jobs, however, usually results in long periods of sitting down. If you are going to be sitting down at a desk for any length of time, it is a good idea to get the basics right.
"The number one thing that gets people into trouble as far as a downgrade in their health is their posture," says Luis Feigenbaum, a director of sports physical therapy at the University of Miami's Miller School of Medicine, in conversation with ABC News.
Computers: one-eyed monsters of the office
These days, most people sitting at a desk will have a computer sitting right in front of them. Although they make a lot of jobs easier, they also make keeping healthy in the office a lot harder. Firstly, where a computer and its related hardware are positioned can drastically influence posture. The height of a computer monitor will affect the height of an office chair - a monitor should be positioned directly in front of the user, about an arm's length away, with the top of the screen just below eye level.
As well as posture, using a computer can wear down other parts of the body that are directly using it, namely the eyes and the wrists.
To avoid eye strain, both the computer monitor and the office lighting need to be addressed. The screen should be adjusted so that its brightness and contrast levels suit the lighting conditions in the room, which should not be too bright.
Screen glare is a major cause of eyestrain and can be reduced by ensuring that monitors are not positioned opposite windows where possible. If situated close to a window, use shades and blinds to reduce the amount of light that falls on the monitor.
If the font size of text being read on a computer is too small it can lead to eyestrain as well as harming posture, as a worker may be inclined to hunch forward to read text more closely. Increasing font size or zooming in on a page that is being read protects employees from this risk.
Typing is a repetitive action that puts the hands and wrists under great pressure. If performed forcefully enough and for long enough periods of time, it can lead to disabling pain. In office workers, it can lead to repetitive strain injuries, whereby the tissue surrounding the joints becomes inflamed or stress fractures develop.
Wrist injuries through typing can be prevented or at least reduced by maintaining a good typing posture. As mentioned earlier, wrists should be kept in a relaxed, natural position. Foam or gel wrist supports can provide extra protection.
One of the key messages when it comes to using computers in the office is how important it is to take regular breaks. The US Occupational Safety and Health Administration (OSHA) recommend that workers take a 10-minute break for every hour spent on a computer, allowing the body to recover and reducing the risk of strain.
These breaks can include working on other tasks that do not involve using a computer. They also represent an opportunity for employees to get out of the sitting position. Alternatively, if employees have the freedom to do so, breaks could involve seeking sustenance to refuel their bodies.
Here be vending machines
The office environment is often full of temptation when it comes to eating healthily. Many offices are home to vending machines filled with sugary drinks and fatty snacks that sing out to workers eager to get a quick energy boost. A desire for this kind of unhealthy food is increased if an individual hasn't eaten properly in the morning or obtained enough sleep the night before. Finding time for both sleep and breakfast helps reduce the lure of unhealthy food throughout the working day.
Bringing a lunch and snacks to work also helps keep office workers away from vending machines and restaurants, as well as saving them money. Snacking is fine if it is done healthily, and while vending machines are unlikely to stock fruit, vegetables, hummus and seeds, workers can bring these in themselves.
"It's really important to eat at least every four hours," Beth Thayer tells ABC. "You need to make sure you're setting some time aside to make sure you're getting food in."
Thayer, a registered dietitian and spokesperson for the American Dietetic Association, recommends packaging and preparing your own meals. "Small bags of nuts or snack mix you make yourself, or a small bag of fruit like apples or grapes," she suggests. "Fruit works well for people who drive a lot."
Eating is a great opportunity for workers to escape from their workstations, but few take advantage of it. According to a survey conducted by the American Dietetic Association in 2011, 62% of Americans eat lunch at their desks.
As well as preventing workers from getting away from work and keeping them sitting down in the same place, eating at the desk can lead to a build-up of bacteria if the correct hygiene precautions are not taken.
"We need to wash our hands and clean up the area after we eat at our desks," Thayer warns. "Don't let desks become places for bacterial growth."
Leaving the desk for a break allows workers to regroup and collect themselves away from their work. Doing this can be particularly important in mentally demanding roles. Taking a proper break can help reduce stress levels that can be responsible for a wide range of health problems.
How to improve your fitness at work
Although the office can often be a comfortable place to work, it is important that workers do not allow unhealthy practices to become comfortable and routine. Remaining sedentary, using office equipment incorrectly and eating unhealthily can eventually lead to debilitating health problems that could stop individuals from working altogether.
Thankfully, office work also provides a number of options for keeping fit, and if these are incorporated into a working routine then there is no reason why working in an office should condemn employees to a life of ill health.
- Travel to work by walking or biking. Get off public transport a stop earlier than normal or park your car further away from the office
- Stand instead of sitting when working as much as possible. Find as many excuses to get out of your chair as possible
- Spend time during breaks to go for a brisk walk or do some stretching to keep the muscles loose and strong.
On the surface, working in an office appears to be a simple form of employment. While that may be true in comparison with some other jobs, it is important that office workers do not get complacent and sit idly as their health runs away from them.
Dr. Timothy Church, from the Preventive Medicine Laboratory at Pennington Biomedical Research Center, Louisiana State University, told MNT that the biggest risk to the health of office workers is the sedentary lifestyle.
"The answer is getting active," he said. "Get up at least every 45 minutes and obtain at least 7,000 steps per day."
Written by James McIntosh
Telomere biomarker may lead to blood test that predicts cancer years in advance
May 1, 2015
A distinct pattern of changes in blood telomeres appears to predict cancer years before diagnosis. This was the result of a new study believed to be the first to follow what happens to the protective ends of DNA strands over time in people who go on to develop cancer.
Researchers from Northwestern and Harvard Universities report their findings in the new journal EBioMedicine.
They found blood telomeres age faster and then stop aging for a few years in the period leading up to a cancer diagnosis.
Lead author Lifang Hou, associate professor in Preventive Medicine - Cancer Epidemiology and Prevention at Northwestern University Feinberg School of Medicine, says:
"Understanding this pattern of telomere growth may mean it can be a predictive biomarker for cancer."
Telomeres are sequences of DNA on the ends of chromosomes - like the plastic caps on the ends of shoelaces - that stop them fraying and losing their integrity.
They gradually shorten as we age - by the time we grow up they are half the length they were when we were born, then they halve again as we enter old age.
Rapid shortening of telomeres followed by 3-4 years of stabilization
Scientists consider blood telomeres to be a marker of biological age, but they have also been looking at how they change in people developing cancer.
However, studies exploring blood telomere changes in relation to cancer have reached inconsistent conclusions: some say people developing cancer have shorter blood telomeres, others say they are longer, and some find no links at all.
In the new study, the team investigated how telomeres change over time as opposed to taking just a single snapshot. They found that a distinct pattern in the changing length of blood telomeres can predict cancer years before people are diagnosed.
The distinctive pattern shows a rapid shortening of the blood telomeres followed by 3-4 years where not much happens to the length.
The researchers say this distinctive pattern could serve as a biomarker that predicts cancer development with a blood test. As far as they know, theirs is the first study to report how telomere length changes in the years leading up to a cancer diagnosis, before treatment begins. This is significant because treatment for cancer can affect telomere length.
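The shape of the signal the researchers report - a few years of rapid shortening followed by a plateau - lends itself to a simple illustration. The sketch below is purely hypothetical: the units, thresholds and detection logic are invented for illustration and are not taken from the study, which modeled telomere change statistically across its cohort.

```python
# Hypothetical illustration (not the study's actual method): flag the
# "rapid shortening then plateau" pattern in a series of annual
# telomere-length measurements. All thresholds are invented.

def yearly_changes(lengths):
    """Length change between consecutive annual measurements."""
    return [b - a for a, b in zip(lengths, lengths[1:])]

def shows_shorten_then_plateau(lengths, rapid=-0.05, stable=0.02):
    """True if an early rapid-shortening phase (change below `rapid`
    per year) is followed by at least 3 years of stability
    (|change| below `stable` per year). Thresholds are illustrative,
    not clinically derived."""
    changes = yearly_changes(lengths)
    rapid_idx = [i for i, c in enumerate(changes) if c < rapid]
    if not rapid_idx:
        return False
    after = changes[rapid_idx[-1] + 1:]
    return len(after) >= 3 and all(abs(c) < stable for c in after)

# Telomere length in arbitrary units, measured yearly:
declining = [7.0, 6.8, 6.6, 6.4, 6.39, 6.385, 6.38]    # rapid drop, then plateau
normal = [7.0, 6.98, 6.96, 6.94, 6.92, 6.90, 6.88]     # steady, slow aging

print(shows_shorten_then_plateau(declining))  # True
print(shows_shorten_then_plateau(normal))     # False
```

A real biomarker would need longitudinal modeling across many participants, as in the study's 792-person cohort; this toy detector only illustrates the qualitative shape of the pattern.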
For their study, the researchers measured telomere length several times over a 13-year period in 792 people. One hundred and thirty-five of the participants eventually developed various cancers, including leukemia, and prostate, skin and lung cancer.
The telomeres of the participants who were later diagnosed with cancer aged much faster - that is, they shortened more rapidly - in the first few years.
In the participants who developed cancer, the telomeres looked as much as 15 years older than those of the participants who did not develop cancer. But what was surprising was that the accelerated aging stopped 3-4 years before cancer diagnosis. Prof. Hou adds:
"Because we saw a strong relationship in the pattern across a wide variety of cancers, with the right testing these procedures could be used to eventually diagnose a wide variety of cancers."
She explains that the inconsistency of previous findings may be because they did not spot the stabilizing period:
"We saw the inflection point at which rapid telomere shortening stabilizes. We found cancer has hijacked the telomere shortening in order to flourish in the body."
Telomeres shorten every time a cell divides, which is why they get progressively shorter as we age. If the telomeres of a cell become too short, they can cause the cell to become faulty, and normally the cell self-destructs.
This raises a puzzling scientific question: since cancer cells divide more rapidly than normal cells, why don't they self-destruct when their telomeres become dangerously short?
This study may have the answer: it suggests cancer cells have found a way somehow to stop telomeres getting shorter.
Prof. Hou says if we can find out how cancer cells hijack this normal cell process, then perhaps we can develop treatments that cause them to self-destruct without harming healthy cells.
Funds for the study came from the National Institute of Environmental Health Sciences of the National Institutes of Health.
Written by Catharine Paddock PhD
Mitochondria's other job is to control stem cell development
April 30, 2015
In a remarkable discovery, scientists show that blocking the action of a key enzyme in mitochondria stops stem cells from developing into egg cells in fruit flies.
Mitochondria - tiny power plants found inside nearly every cell of the body - are traditionally known for their vital role in generating energy for cells to function.
In the new study, published in Nature Cell Biology, a team led by researchers from NYU Langone Medical Center, NY, shows that mitochondria's role in the development of stem cells is entirely distinct from that of producing energy for cell metabolism.
In their traditional role, mitochondria provide cells with units of energy in the form of adenosine triphosphate (ATP). The chemical reaction that produces ATP relies on an important enzyme called ATP synthase.
The new research shows that ATP synthase is also important for normal stem cell development. The enzyme directly controls the growth and maintenance of "cristae" - the wrinkled, folded membranes inside mitochondria - as the stem cells divide and form the specific cell components of the female germ cell or egg.
Because ATP synthase-driven energy production is common to all cells with a nucleus, the researchers say it is very likely that what they have found in the fruit fly experiments will be true of all mammals, including humans.
Blocking any of 13 key ATP synthase proteins stopped egg development
Senior investigator Ruth Lehmann, a cell biology professor at NYU Langone, says earlier studies have discovered damaged or immature cristae in several animal species with faulty ATP synthase, but this is the first study to show a link to stem cell development.
In their experiments, Prof. Lehmann and colleagues found that blocking any of the 13 key proteins linked to ATP synthase disrupted or stalled egg development in the fruit flies.
They also found that blocking other enzymes involved in ATP production - before ATP synthase steps in - did not damage egg development.
The study took 2 years as the team screened more than 8,000 fruit fly genes thought to be involved in the development of stem cells that lead to egg or sperm. ATP synthase stood out - they noticed how it remained active even when other enzymes involved in ATP production were turned off.
Team plans to investigate whether mitochondria play any other vital roles
Prof. Lehmann explains how she and her colleagues will continue the research:
"Our team plans further investigations into precisely how ATP synthase biologically controls cristae development, and whether other developmental roles are influenced by mitochondria."
Funds for the study came from the US National Cancer Institute and the National Institute of Child Health and Human Development, both parts of the National Institutes of Health. The Boehringer Ingelheim Fonds and the American Cancer Society also supported the study.
Mitochondria hit the headlines earlier this week because of a breakthrough study that holds promise for families affected by potentially severe diseases caused by faulty mitochondrial DNA that passes to children from their mothers.
Written by Catharine Paddock PhD
New tickborne disease found in China may pose substantial health threat
April 28, 2015
Tickborne illnesses - such as Lyme disease, tularemia and Rocky Mountain spotted fever - can be serious and sometimes deadly. They are a major public health problem around the world. Now, a new study reports the discovery in northern China of a tickborne illness in humans that has never been seen before.
Ticks are small, blood-sucking arthropods, and like their cousins - mites, spiders and scorpions - they have eight legs. There are many different species of tick, with different ones biting and sucking the blood of different animals, and sometimes this includes humans.
Some ticks carry pathogens like viruses and bacteria that enter the bloodstream of the animals and people that they bite. There are many different tickborne illnesses caused by a range of pathogens.
The Centers for Disease Control and Prevention (CDC) list at least 14 different types of tickborne diseases known in the US. Tickborne diseases usually cause fever, chills, aches, pains and rash. Symptoms range from mild reactions that are treatable at home to severe infections requiring hospitalization.
Entirely new species of bacteria
The new discovery, reported in The Lancet Infectious Diseases, is the work of a team of researchers from China and the US. In their paper, they say it is possible that the newly-discovered disease could be a "substantial" threat to human and animal health in the region where the tick prevails.
They name the newly discovered pathogen - a bacterium - Anaplasma capra, because it appears to be common in goats. "Capra" is the Latin word for "goat."
The bacterium is related to other Anaplasma bacteria, some of which can also cause illness when transmitted from ticks to humans.
However, the researchers note they are not sure how widespread A. capra and the tick that carries it might be and whether they bite other animals as well as goats.
Co-author J. Stephen Dumler, a professor of pathology at the University of Maryland School of Medicine in Baltimore and an expert with global experience of tickborne diseases, says:
"This is an entirely new species of bacteria. This had never been seen in humans before. We still have a lot to learn about this species, but it may be that this bacteria is infecting humans over a wide area."
Prof. Dumler himself discovered another Anaplasma bacterium that causes the disease human anaplasmosis 2 decades ago.
For the study, he and his colleagues - including researchers from the Beijing Institute of Microbiology and Epidemiology, the Mudanjiang Forestry Central Hospital and Shanghai Jiaotong University, all in China - tested 477 patients in northeast China who had been bitten by a tick over the period of a month in the spring of 2014.
They found that 6% of the patients - 28 individuals - were infected by the new species of bacteria - A. capra.
The symptoms of infection by A. capra include fever, muscle aches, headache, tiredness and dizziness. The patients recovered after treatment with antibiotics, particularly doxycycline.
A. capra probably transmitted by the taiga tick
Not much is known about A. capra. It is not easy to diagnose - there is no simple blood test.
The researchers say A. capra is probably transmitted by the taiga tick - a close relative of the deer tick. The tick is widespread in Eastern Europe and Asia - including Russia, China and Japan.
If the taiga tick spreads A. capra throughout this region, then human infection may be common, says Prof. Dumler, who notes that more than a billion people live in areas where the tick is prevalent. He and his colleagues conclude:
"The emergence of A. capra as a cause of human disease suggests that individuals living in or traveling to endemic regions in northern China should take precautions to reduce their risk of exposure to this novel tickborne pathogen."
The Natural Science Foundation of China and the US National Institutes of Health funded the study.
Written by Catharine Paddock PhD
Surgery is not an option for two thirds of global population
April 27, 2015
Worldwide, 5 billion people are excluded from surgery that could often save lives or avert disability, finds a study published in The Lancet. This represents two thirds of the world's population not having access to safe and affordable surgery and anesthesia when they need it - and it is largely the "poor, marginalized and rural" who "face impossible hurdles."
These are the comments of Prof. John Meara and Dr. Sarah Greenberg, both from an initiative at Harvard Medical School in Boston, MA, named the Program in Global Surgery and Social Change. Their commentary introduces new studies on global surgical care forming part of a campaign launched by the medical journal.
"Surgery has, until now, been overlooked as a critical need for the health of the world's population," says The Lancet, which established a commission of 25 leading experts in surgery and anesthesia, and has taken contributions from more than 110 countries.
One of the lead authors in the commission, Dr. Lars Hagander, from Lund University in Sweden, says: "The problem is especially acute in the low- and middle-income countries of eastern, western and central sub-Saharan Africa, and South and Southeast Asia.
"Too many people are dying from common, treatable surgical conditions, such as appendicitis, obstructed labor and fractures."
Doctors deal with, broadly speaking, either surgical or medical conditions, and we hear more about the burden of disease from the latter. Yet conditions that could have been treated with surgery accounted for a total of 16.9 million deaths in 2010, the journal has found - which was just under a third (32.9%) of all deaths that year, "well surpassing the number of deaths from HIV/AIDS, TB and malaria combined."
The group of experts leading the studies calls for a global investment of $420 billion by 2030, an amount the commission says would give acceptable levels of access to surgery in those countries that have the worst availability.
They believe this would be an achievable cost "far outweighed by the devastating economic cost to countries, communities, and families incurred by the current global shortfall in access to surgery."
Prof. Meara, who is an associate professor of surgery at Boston Children's Hospital in addition to holding a professorship in global surgery at Harvard, believes the "scale-up costs are large, [but] the costs of inaction are higher" and that creating access to essential surgery where it is presently lacking would be a "highly cost-effective investment, rather than a cost." He adds the following call to action:
"Surgical conditions - whether cancers, injuries, congenital anomalies, childbirth complications or infectious disease manifestations - are ubiquitous, growing and marginalizing to those who are afflicted by them.
The good news is that we believe it is possible to turn this dire situation around within the next 2 decades - but only if the international community wakes up to the enormous scale of the problem, and commits to the provision of better global surgical and anesthesia care wherever it is needed."
'Quarter of surgical patients incur financial catastrophe'
The commission has been gathering evidence from a collection of studies in its campaign, and in its report, "Global surgery 2030: evidence and solutions for achieving health, welfare and economic development" it lists statistics that adversely affect low- and middle-income countries (LMICs) the most. The authors say that:
- 9 in every 10 people "cannot access basic surgical care" in LMICs, and an extra 143 million surgical procedures "are needed in these countries to save lives and prevent disability"
- Need is greatest in eastern, western, and central sub-Saharan Africa, and south Asia
- Across the world, 33 million individuals each year face "catastrophic health expenditure" on surgery and anesthesia. "A quarter of people who have a surgical procedure will incur financial catastrophe as a result of seeking care"
- Without urgent investment in surgery, "LMICs will continue to have losses in economic productivity, estimated cumulatively at 12.3 trillion US dollars" between 2015 and 2030
- Surgery is an "indivisible, indispensable part of health care" and "should be an integral component of a national health system in countries at all levels of development."
A campaigning video produced by The Lancet on YouTube uses infographics to give a picture of the obstacles to surgical care that are faced by many:
Source: The Lancet
Simple strategy could revive first-line antibiotics
April 26, 2015
Using a computer model, researchers have identified a simple way to optimize dosing that could bring back a whole arsenal of first-line antibiotics and preserve last-resort antibiotics in the fight against drug-resistant bacteria.
Writing in the journal PLOS Computational Biology, researchers at Duke University in Durham, NC, describe how a computer simulation developed in their lab shows that a dosing regimen based on the recovery time of the target bacterium could eliminate an otherwise resistant strain.
In his Nobel Prize acceptance speech in 1945, Sir Alexander Fleming - the doctor and bacteriologist who revolutionized medicine with his discovery of penicillin - warned that there will come a time when penicillin will be so easy to acquire that "the ignorant man may easily underdose himself and by exposing his microbes to non-lethal quantities of the drug, make them resistant."
Seventy years later, the era Fleming predicted is upon us. The World Health Organization (WHO) describe antibiotic resistance as a major threat to global public health; bacteria are becoming resistant faster than we can develop new drugs to fight them.
This means there is a pressing need to use the antibiotics we already have more effectively.
First author Hannah Meredith, creator of the computer model and biomedical engineering graduate fellow at Duke, says:
"We hope this research will help hospitals improve patient outcomes while also making our antibiotics last as long as possible."
Computer model focuses on period before beta-lactamase degrades the antibiotic
The computer model simulates the relationship between bacteria and antibiotics while focusing on the activity of a bacterial enzyme called beta-lactamase. The enzyme attacks beta-lactam antibiotics - one of the largest and most-used classes of antibiotics.
Many beta-lactam antibiotics are overlooked because doctors believe the infection they are treating is completely resistant to them, even when lab tests show them to be effective.
However, the computer model shows there is a period - before the beta-lactamase degrades the drug - when the bacterium is sensitive to the antibiotic.
Senior author Lingchong You, an associate professor of biomedical engineering who heads the lab Meredith works in, explains:
"You can think of this as a race between the cells and the antibiotics. Before their beta-lactamase degrades the antibiotics, the cells are still sensitive and can be killed. But the antibiotics degrade faster than the cell population declines, allowing some cells to survive and repopulate."
Database of recovery times could bring back first-line antibiotics
When doctors realize an infection is resistant, they often go straight to a last-resort antibiotic. But the study suggests if they were to stick to the first-line antibiotic and change the dosing frequency so each dose hits the bacteria during their recovery period while they are weak, some infections could be cleared without reaching for the last resort.
The team says, in theory, a database of recovery times for different combinations of bacteria and antibiotics could allow first-line antibiotics to clear many resistant infections.
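The "race" Prof. You describes can be sketched with a toy simulation. Everything below - the growth, kill and degradation rates, the MIC threshold, and both dosing schedules - is invented for illustration; the published model is considerably more detailed and this sketch makes no claim to reproduce it.

```python
# Toy model of the race between antibiotic killing and enzymatic
# drug degradation. All parameters are invented for illustration.

def simulate(dose_times, dose, hours=48, dt=0.1,
             growth=0.3, kill=1.5, degrade=0.8, mic=0.2):
    """Simulate a bacterial population under a dosing schedule.
    Cells are killed only while the drug stays above the MIC;
    beta-lactamase-style degradation removes the drug over time."""
    cells, drug = 1.0, 0.0
    dose_set = {round(float(t), 1) for t in dose_times}
    for step in range(int(hours / dt)):
        t = round(step * dt, 1)
        if t in dose_set:
            drug += dose
        # Population shrinks while the drug is above the MIC,
        # and regrows once degradation pushes it below.
        rate = growth - (kill if drug >= mic else 0.0)
        cells += dt * cells * rate
        drug -= dt * degrade * drug
        if cells < 1e-6:  # effectively cleared
            return 0.0
    return cells

# Same total amount of drug, two schedules:
single = simulate(dose_times=[0.0], dose=8.0)                # one big dose
timed = simulate(dose_times=range(0, 24, 3), dose=1.0)       # repeated hits

print(single > 1.0)  # True: drug degrades, survivors repopulate
print(timed)         # 0.0: doses keep landing during recovery
```

With these made-up parameters, the single large dose is degraded before the population is eliminated, so the survivors rebound, while smaller doses spaced a few hours apart keep striking the cells while they are weak until the infection clears - the qualitative behavior the study describes.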
There are other important considerations when administering antibiotics. For example, doctors need to take care not to wipe out native populations of bacteria that are important for health.
A database of the responses of different strains to different antibiotics could allow the simulation to find the optimum dosing regimen that keeps total exposure to a minimum. It could also show whether multiple doses are likely to work and let doctors know when it is time to reach for the stronger drug.
Prof. You explains that others are already working on how to determine antibiotic dosing schedules, but they are typically using models based on complex biological mechanisms. Such approaches are time-consuming, he adds, and there are thousands of new strains evolving all the time - it is impossible to keep up.
"We're trying to see if this one, easy-to-test metric of recovery time can make a good enough prediction without years of study," says Prof. You.
Work on developing the database has already started and early results are promising, says Meredith:
"Our preliminary data have confirmed many of the clinical aspects of the model's predictions, so we are tremendously excited by those. If this strategy is successful, it could potentially reintroduce a large number of first-line antibiotics for patient treatment."
The National Science Foundation, the National Institutes of Health and a number of Fellowship awards helped fund the study.
Written by Catharine Paddock PhD
Scientists 'incredibly excited' by asthma treatment breakthrough
April 25, 2015
A breakthrough study has uncovered a potential root cause of asthma and a drug that reversed symptoms in lab tests. The finding brings hope to the 300 million asthma sufferers worldwide who are plagued by debilitating bouts of coughing, wheezing, shortness of breath and tightness in the chest.
The study - led by Cardiff University in the UK - reveals for the first time that the calcium-sensing receptor (CaSR) plays a key role in causing the airway disease.
The team used human airway tissue from asthmatic and nonasthmatic people and lab mice with asthma to reach their findings.
In the journal Science Translational Medicine, they describe how manipulating CaSR with an existing class of drugs known as calcilytics reversed all symptoms.
Calcilytics block the calcium-sensing receptor and were originally developed for the treatment of osteoporosis - a condition that makes bones more likely to break - also referred to as "brittle bone disease."
Crucially, the symptoms the drug reversed include airway narrowing, airway twitchiness and inflammation - all of which make breathing more difficult.
Daniela Riccardi, principal investigator and a professor in Cardiff's School of Biosciences, describes their findings as "incredibly exciting," because for the first time they have linked airway inflammation - which can be triggered for example by cigarette smoke and car fumes - with airway twitchiness. She adds:
"Our paper shows how these triggers release chemicals that activate CaSR in airway tissue and drive asthma symptoms like airway twitchiness, inflammation, and narrowing. Using calcilytics, nebulized directly into the lungs, we show that it is possible to deactivate CaSR and prevent all of these symptoms."
While the finding is likely to be welcomed by all asthma sufferers, it will particularly excite the 1 in 12 patients who do not respond to current treatments and who account for around 90% of health care costs associated with the disease.
Could be treating asthma patients in 5 years - huge implications for other airway diseases
Calcilytics were first developed about 15 years ago for the treatment of osteoporosis, but while they proved safe and well tolerated in trials, results have been disappointing in patients with osteoporosis.
However, the fact they have already been developed and tested gives researchers the unique opportunity to repurpose them and hugely reduce the time it usually takes to bring a new drug to market.
Once funding is secured, the team hopes to be testing the drugs on humans within the next 2 years. Prof. Riccardi concludes:
"If we can prove that calcilytics are safe when administered directly to the lung in people, then in 5 years we could be in a position to treat patients and potentially stop asthma from happening in the first place."
The researchers believe their findings about the role of CaSR in airway tissue could have important implications for other respiratory conditions, such as chronic obstructive pulmonary disease (COPD) and chronic bronchitis. There is currently no cure for these diseases, which predictions suggest will be the third biggest killer worldwide by 2020. In the following video, Prof. Riccardi and colleagues talk about their findings and a patient with asthma describes her excitement about the potential implications.
Asthma UK, the Cardiff Partnership Fund and the Biotechnology and Biological Sciences Research Council (BBSRC) helped finance the study.
Written by Catharine Paddock PhD
Pets: are you aware of the risks to human health?
April 24, 2015
PETS: ARE YOU AWARE OF THE RISKS TO HUMAN HEALTH?
There is no doubt America is a nation of animal lovers. In 2012, more than 62% of American households included at least one pet. But while most of us are aware of the numerous benefits of pet ownership, are you aware of its risks to human health?
Those of you who have a cat, dog, bird or any other animal in your household will likely consider that pet to be a member of your family, and rightly so.
Pets offer comfort and companionship, and we can't help but love them. In fact, when it comes to dogs, a recent study found the famous "puppy dog eyes" gaze triggers a whopping 300% increase in owners' oxytocin levels - the "love hormone" involved in maternal bonding.
What is more, pets offer a number of benefits to human health. In December 2014, Medical News Today reported on a study that associated household pets with stronger social skills in children with autism. And in May 2013, a study published in the journal Circulation linked pet ownership to reduced risk of heart disease.
But while pets can benefit our health in a number of ways, they also have the potential to spread infection and cause human illness. In this Spotlight, we take a look at some of the health risks associated with ownership of many of the nation's most-loved animals.
Most of us have heard of Campylobacter. The bacterium is one of the most common causes of diarrhea in the US, estimated to affect more than 1.3 million people annually.
As well as diarrhea, infection with Campylobacter - called campylobacteriosis - can cause cramping, abdominal pain and fever within 2-5 days of exposure to the bacteria.
While most cases are caused by exposure to contaminated food - particularly meat and eggs - and water, it can also be contracted through exposure to stool of an infected animal - including dogs and cats.
According to PetMD, around 49% of dogs and 45% of stray cats carry Campylobacter and shed it in their feces. It is most common in puppies and kittens younger than 6 months.
It should be noted that infection with Campylobacter is rarely life-threatening, though individuals with weak immune systems, young children and the elderly are most at risk.
Tapeworm, hookworm and roundworm
Dipylidium caninum is the most common tapeworm in both dogs and cats in the US. Infection occurs when an animal ingests fleas that carry the tapeworm larvae - which can happen when the animal grooms itself.
D. caninum can be passed to humans, though the risk of infection is very low. It most commonly occurs in young children who accidentally swallow an infected flea.
According to the Centers for Disease Control and Prevention (CDC), flea control is the best way to reduce the risk of D. caninum infection in both pets and humans.
Ancylostoma brazilense, A. caninum, A. ceylanicum and Uncinaria stenocephala are just some of the species of hookworm that can infect cats and dogs.
The hookworm parasite can be shed in the feces of animals, and humans can contract it by coming into contact with infected feces or contaminated soil and sand where such feces have been.
Hookworm infection in humans most commonly causes a skin condition called cutaneous larva migrans (CLM), in which the hookworm larvae penetrate the skin. This causes a red, itchy and sometimes painful rash.
In rare cases, specific strains of hookworm can infect the intestines of humans, causing abdominal pain and diarrhea.
Toxocariasis is an infection caused by the transmission of Toxocara - parasitic roundworms - from dogs and cats to humans. According to the CDC, almost 14% of Americans have Toxocara antibodies, indicating that millions of us have been exposed to the parasite.
In dogs and cats infected with Toxocara, eggs of the parasite are shed in their feces. Humans can contract the parasite by accidentally swallowing dirt that has been contaminated with these feces.
Though it appears human exposure to Toxocara is high, most people infected with it do not develop symptoms or become sick. In the rare cases people do become ill from toxocariasis, the condition may cause inflammation and vision loss in one eye (ocular toxocariasis), or abdominal pain, fever, fatigue and coughing due to damage to various organs (visceral toxocariasis).
Though not as cute and fluffy as kittens and puppies, reptiles - such as turtles, snakes and lizards - are owned by around 3% of households in the US.
There is no doubt reptiles are interesting creatures and can make brilliant pets, but they are also carriers of Salmonella - a bacterium responsible for salmonellosis. Humans can contract the bacteria simply by touching a reptile and ingesting the germs.
According to the CDC, more than 1 million people in the US become ill from Salmonella infection each year. Of these illnesses, more than 70,000 are caused by contact with reptiles.
Within 12-72 hours of being infected with Salmonella, people may experience diarrhea, fever and abdominal cramps that last around 4-7 days. While most people fully recover without treatment, others may need to be hospitalized.
Turtles are a main culprit of Salmonella infection in the US. The sale of turtles less than 4 inches was even banned by the US Food and Drug Administration (FDA) in 1975 because of their high disease risk - particularly among young children, the elderly and people with weak immune systems.
Rabies is one of the most severe diseases that humans can contract from dogs and cats, as well as smaller animals such as ferrets. A recent study reported by MNT found the disease kills around 59,000 people worldwide every year.
Rabies is a disease that infects the central nervous system (CNS). Usually transmitted through the bite of an infected animal, it causes fever, headache and weakness before progressing to more severe symptoms - including hallucinations, full or partial paralysis, insomnia, anxiety and difficulty swallowing. Death normally occurs within days of these more serious symptoms appearing.
According to the CDC, domestic animals accounted for 8% of all rabid animals reported in 2010.
In the US, the most common way domestic animals can contract rabies is through a bite from infected wild animals, particularly foxes, raccoons, skunks and bats. Symptoms normally occur 1-3 days after infection and include excess salivation, paralysis and unusual shyness or aggression.
If an owner suspects their pet may have been bitten by a rabid animal, they must take them to a veterinarian for care immediately, even if they have been vaccinated against the virus. Any person who believes they may have been bitten by a rabid animal must seek immediate medical care.
Despite its name, parrot fever does not only occur in parrots - all birds can be affected. However, human transmission of the disease most commonly involves parrots, parakeets, macaws, cockatiels and poultry - particularly turkeys and ducks.
Also known as psittacosis, parrot fever is a bacterial disease caused by a bacterium called Chlamydia psittaci that humans can contract through inhalation of birds' secretions, including urine and feces.
If a person becomes infected with C. psittaci, symptoms usually appear around 10 days after exposure. These may include fever, nausea and vomiting, diarrhea, fatigue, chest pain and shortness of breath.
In more severe cases, infection with C. psittaci can cause inflammation of the brain, liver and other internal organs. It can also reduce lung function and cause pneumonia.
It is important to note, however, that parrot fever in humans is very rare in the US. According to the CDC, fewer than 50 people a year are infected, and this has been the case since 1996.
Toxoplasmosis is a disease caused by a single-celled parasite - Toxoplasma gondii. It is most commonly contracted in humans through ingestion of undercooked or contaminated meat.
However, humans can also contract T. gondii by coming into contact with cat feces or any area or object contaminated with cat feces, as felines are carriers of it. T. gondii cannot be absorbed through skin, but infection can occur if the parasite is accidentally ingested.
It is estimated that more than 60 million people in the US are infected with T. gondii. However, very few people become ill from the infection as the human immune system is normally able to fight it.
If the infection does present symptoms, these may include swollen glands and muscle aches and pains. In very severe cases, T. gondii infection may cause damage to the brain and other organs, or eye damage.
Pregnant women, elderly individuals, young children and people with weakened immune systems are at highest risk of developing symptoms from T. gondii infection.
Although our cute little kitties very rarely mean to scratch us, it does happen. And while many of us think nothing of a small graze from a cat's claw, it has the potential to cause more damage than you may think.
Cat-scratch disease (CSD) is caused by a bacterium called Bartonella henselae, which around 40% of cats carry at some point in their lifetime, though most show no signs of illness.
B. henselae is most common in kittens under the age of 1 year, and since kittens are more likely to scratch during playtime, they are most likely to spread the bacterium to humans.
An early sign of CSD can be an infection at the site of the scratch around 3-14 days after it occurred, characterized by swelling, pain and tenderness. Headache, fever, loss of appetite and fatigue may also present, and in very rare cases, CSD can affect the brain, heart and other organs.
Children under the age of 5 years and individuals with weakened immune systems are most likely to experience severe symptoms from CSD.
What can be done to prevent pet-related infections?
It is clear pets can harbor an abundance of germs that can be passed to humans, but there are a number of ways pet-related infections can be prevented:
- Wash your hands - hygiene is key for preventing the majority of pet-related infections. After coming into contact with pets, their saliva or feces, hands should be washed thoroughly with warm, soapy water. A scratch or bite from a pet should also be cleaned immediately
- Pick up and dispose of feces - quickly disposing of your pet's feces, particularly in areas where children may play - can prevent the spread of disease to humans and other animals
- Avoid scratches and bites - the best way to avoid infections from pet bites and scratches is to avert them in the first place. If you are scratched by a cat, dog or other animal, clean the wound immediately with warm, soapy water. A cat or dog bite may require medical attention due to the risk of rabies or other serious infection
- Get your pet vaccinated and routinely evaluated - visit a veterinarian regularly to ensure your pet is healthy and to prevent infectious diseases. Also, ensure your pet is up-to-date with the required vaccinations.
It is important to note that the likelihood of a person catching a disease from their pet is low, particularly if the correct precautions are taken. With this in mind, there is no reason why the millions of pet owners in the US can't enjoy the companionship and joy their animals provide.
Written by Honor Whiteman
Researcher warns of increased cancer risk with excess supplement use
April 22, 2015
RESEARCHER WARNS OF INCREASED CANCER RISK WITH EXCESS SUPPLEMENT USE
Popular notion holds that dietary supplements are good for our health. But increasingly, research is suggesting otherwise. At the 2015 American Association for Cancer Research Annual Meeting, one researcher discusses a number of studies associating excess use of dietary supplements with increased risk of cancer.
The use of dietary supplements is common in the US, with more than half of Americans reporting regular use of at least one form, according to a National Health and Nutrition Examination Survey (NHANES).
The purpose of dietary supplements is to help the body obtain essential nutrients it may not be getting from food, though numerous studies have suggested they offer additional health benefits.
Last month, for example, Medical News Today reported on a study suggesting vitamin D supplements may slow or reverse the progression of low-grade prostate tumors.
But in recent years, a number of studies have suggested that dietary supplements may raise the risk of cancer rather than reduce it. Dr. Tim Byers, of the University of Colorado Cancer Center, is one researcher who has been involved in such studies.
Dietary supplements no substitute for good, nutritional food
According to Dr. Byers, initial tests of dietary supplements in animal models indicated they protect against cancer development.
But on testing these dietary supplements in humans, Dr. Byers and colleagues found they may have the opposite effect for some individuals.
"We studied thousands of patients for 10 years who were taking dietary supplements and placebos," he says. "We found that the supplements were actually not beneficial for their health. In fact, some people actually got more cancer while on the vitamins."
Dr. Byers points to one study that investigated the effects of beta carotene supplements in humans. The results of the study revealed that individuals who took beta carotene at doses higher than the recommended levels were at 20% increased risk of both lung cancer and heart disease.
What is more, Dr. Byers speaks of another study that associated use of folic acid supplements with an increased number of polyps in patients with colon cancer, despite this particular supplement being previously associated with a reduction in the number of polyps.
Dr. Byers notes that these findings should not deter people from taking vitamins and minerals; at the correct dosage, they can benefit health. It is when people take too many that problems occur. He adds:
"At the end of the day we have discovered that taking extra vitamins and minerals do more harm than good."
In addition, he stresses "there is no substitute for good, nutritional food," noting that the majority of people can get all the vitamins and minerals they need through a healthy diet.
Earlier this month, Medical News Today reported on a study published in Drug Testing and Analysis, which revealed that many dietary supplements still contain an amphetamine-like stimulant 2 years after the Food and Drug Administration (FDA) first flagged the issue.
Written by Honor Whiteman
Study finds increased risk of type 2 diabetes with statin use
April 21, 2015
STUDY FINDS INCREASED RISK OF TYPE 2 DIABETES WITH STATIN USE
A new study published in the journal Diabetologia finds the use of statins - drugs commonly used to lower cholesterol - may significantly increase the risk of type 2 diabetes, and that this risk remains even after accounting for confounding factors, including age, smoking status and body mass index.
The link between statin use and higher risk of diabetes is not new. Back in 2013, for example, Medical News Today reported on a study published in The BMJ that found certain statins - particularly atorvastatin (Lipitor), rosuvastatin (Crestor) and simvastatin (Zocor) - raised the risk of diabetes by up to 22%.
But according to the researchers of this latest study - including Prof. Markku Laakso of the Institute of Clinical Medicine at the University of Eastern Finland and Kuopio University Hospital in Finland - such studies have had numerous limitations.
The team explains that many of these studies have included selective populations, such as those at high risk of cardiovascular disease. As a result, findings may not be applicable to the general population.
The researchers also note that these studies have often included participants whose diabetes has been self-reported or based on their fasting glucose measurements, which may underestimate the actual number of incident diabetes cases.
Increased risk 'most likely linked to statins that reduce insulin sensitivity and secretion'
For their study, Prof. Laakso and colleagues analyzed the effects of statin use on 8,749 nondiabetic Caucasian men aged 45-73 years who were part of the Finland-based Metabolic Syndrome in Men (METSIM) study.
During the 5.9-year follow-up, 625 men were diagnosed with type 2 diabetes, as determined by either an oral glucose tolerance test (OGTT), an HbA1c level of at least 6.5%, or the commencement of antidiabetic medication.
The results of the analysis revealed that men who were treated with statins were at 46% higher risk of diabetes than men who were not treated with statins.
This 46% increased diabetes risk was present even after adjusting for the men's age, body mass index (BMI), waist circumference, physical activity levels, smoking status, alcohol intake, family history of diabetes and treatment with beta-blockers and diuretic medications.
The researchers also assessed changes in insulin resistance and insulin secretion among men who were treated with statins. They found that statins led to a 24% reduction in insulin sensitivity during follow-up, as well as a 12% reduction in insulin secretion.
For two statins - simvastatin and atorvastatin - the researchers found the associated risk of type 2 diabetes was dose-dependent, as were the reductions in insulin sensitivity and insulin secretion among the men taking these statins.
After accounting for the aforementioned confounding factors, the team found high-dose simvastatin was linked to a 44% higher risk of type 2 diabetes, while a lower dose was linked to a 28% increased risk. High-dose atorvastatin was associated with a 37% increased risk of type 2 diabetes.
Of the study participants, 53% were taking atorvastatin and 29% were taking simvastatin.
Based on their results, the researchers say:
"Statin therapy was associated with a 46% increased risk of type 2 diabetes after adjustment for confounding factors, suggesting a higher risk of diabetes in the general population than previously reported.
The association of statin use with increased risk of developing diabetes is most likely directly related to statins decreasing both insulin sensitivity and secretion."
Prof. Laakso and colleagues say that while one strength of this study is its large size, the fact that all participants were male and Caucasian means the findings may not be generalizable to women or those of other ethnicities.
Written by Honor Whiteman
Antibiotic resistance genes found in bacteria of remote South American tribe
April 20, 2015
ANTIBIOTIC RESISTANCE GENES FOUND IN BACTERIA OF REMOTE SOUTH AMERICAN TRIBE
Scientists have discovered antibiotic resistance genes in the bacteria of remote tribespeople who have had no contact with the industrialized world or exposure to antibiotic drugs. This discovery suggests that the ability to resist antibiotics was already in the human body long before today's antibiotic drugs were developed.
The mountains of southern Venezuela are home to an isolated tribe of Yanomami Amerindians whose ancestors first settled in South America over 10,000 years ago.
Before their discovery by Westerners in 2009, the tribespeople had had no contact with the modern world or exposure to modern antibiotics.
In the journal Science Advances, researchers from the US and Venezuela describe how they analyzed bacteria from the skin, mouth and intestines of the Yanomami tribe members and found they contained antibiotic resistance genes.
Inappropriate use and overuse of antibiotics in medicine and agriculture are fueling a growing global drug resistance problem, in which once powerful drugs are losing their ability to kill emerging "superbug" strains of disease-causing bacteria.
One explanation for the emergence of antibiotic-resistant bacteria is that random mutations in the microbes coupled with their ability to swap genes is spurring the evolution of resistant strains.
But the new study suggests that resistance genes have been around in the human microbiome (the trillions of bacteria that live in and on the body) for thousands of years - long before antibiotic drugs were invented.
Speculating on the findings, the team says antibiotics are not just a human invention. Bacteria evolved strategies to kill each other long before we were around - they were the first inventors of antibiotics. And to defend against this, they developed resistance mechanisms.
Tribespeople's bacteria had resistance genes that deactivate range of antibiotics
The discovery of the Yanomami village gave the team - including Erica C. Pehrsson of the Washington University School of Medicine, St. Louis, MO - the opportunity to study their bacterial flora and compare it to what we already know from Western human populations.
Pehrsson says the tribespeople's only exposure to antibiotics would have been through ingestion of soil bacteria that make naturally occurring versions of today's modern antibiotics, and notes:
"Yet we were able to identify several genes in bacteria from their fecal and oral samples that deactivate natural, semi-synthetic and synthetic drugs."
The researchers also found that the tribespeople's microbiome was much more diverse than that of the typical Western person.
They do not know if the diversity of specific bacteria they found in the tribe improves or harms human health, but note that the microbiome of the typical Westerner is about 40% less diverse than that of the Yanomami.
Decreased bacterial diversity, modern diet and antibiotics linked to disease
Senior author Maria Dominguez-Bello, associate professor of medicine at New York University Langone Medical Center, says:
"Our results bolster a growing body of data suggesting a link between, on one hand, decreased bacterial diversity, industrialized diets and modern antibiotics, and on the other, immunological and metabolic diseases - such as obesity, asthma, allergies and diabetes, which have dramatically increased since the 1970s.
We believe there is something occurring in the environment during the past 30 years that has been driving these diseases, and we think the microbiome could be involved."
Prof. Dominguez-Bello and colleagues exposed bacteria from the tribe to 23 different antibiotics and found the drugs were able to kill all of them.
But when they ran further tests, they found the bacteria contained "silent" resistance genes that were activated by exposure to the antibiotics.
The results showed cultured bacteria from the tribe members contained many resistance genes that can fight off many modern antibiotics.
And when they tested bacteria that are hard to culture, the scientists found even more resistance genes.
Scientists 'alarmed' to find genes resistant to synthetic antibiotics
The team was surprised to find that many of the resistance genes in the tribespeople's bacteria deactivated not only natural antibiotics but also synthetic and semi-synthetic antibiotics, including third- and fourth-generation cephalosporins, which are normally reserved for fighting the worst infections.
Co-author Gautam Dantas, associate professor of pathology and immunology at Washington University, says:
"It was alarming to find genes from the tribespeople that would deactivate these modern, synthetic drugs."
One explanation for this finding is the idea of cross-resistance, where genes that help bacteria resist natural antibiotics can also help them resist related synthetic drugs, as Prof. Dantas explains:
"We've seen resistance emerge in the clinic to every new class of antibiotics, and this appears to be because resistance mechanisms are a natural feature of most bacteria and are just waiting to be activated or acquired with exposure to antibiotics."
In December 2014, Medical News Today learned how the microbiome may be shaping the human age structure. In the journal mBio, researchers at NYU's Langone Medical Center and Vanderbilt University describe how they created a model of an early hunter-gatherer population to see what role the microbiome might have played. They concluded evolution may have acted on the human microbiome to favor bacteria that help their hosts live longer.
Written by Catharine Paddock PhD
Preventable rabies kills 160 people worldwide every day
April 19, 2015
PREVENTABLE RABIES KILLS 160 PEOPLE WORLDWIDE EVERY DAY
Researchers have found that around the world, 160 people die each day from canine rabies. An estimated 59,000 people are thought to die every year as a result of this preventable disease.
The study, published in PLOS Neglected Tropical Diseases, is the first to assess the impact of canine rabies on a global scale, estimating its public health and economic burden and measuring the extent of worldwide control efforts.
"This ground-breaking study is an essential step towards improved control and eventual elimination of rabies," reports Prof. Louis Nel, Executive Director of the Global Alliance for Rabies Control (GARC). "An understanding of the actual burden helps us determine and advocate for the resources needed to tackle this fatal disease."
The study was conducted by GARC's Partners for Rabies Prevention Group and led by Dr. Katie Hampson of the University of Glasgow, Scotland.
In addition to the large number of deaths caused by rabies, the authors estimate that annual economic losses caused by the disease total around $8.6 billion. This cost is attributed primarily to premature deaths, but also includes vaccination spending and lost income for victims.
Rabies is a difficult disease to track and underreporting is believed to be commonplace. As rabies is close to 100% fatal, a large number of rabies victims never report to health facilities and are never diagnosed. In regions where malaria is prevalent, misdiagnosis is also frequent.
In light of these problems, the researchers decided that an updated assessment of the global rabies burden was necessary. For their study, they combined all available data sources into a modeling framework that allowed them to estimate as accurately as possible any missing information.
"The breadth of data used in this study, from surveillance reports to epidemiological study data to global vaccine sales figures, is far greater than ever analyzed before, allowing this more detailed output," Dr. Hampson explains.
Dog vaccination programs could reduce medical sector costs
Rabies is a fatal viral disease usually acquired when humans are bitten by infected animals, most typically domestic dogs. The disease is entirely preventable through the prompt administration of a fast-acting shot after a bite, yet it remains prevalent in populations with limited access to health care.
The researchers found the greatest risk of canine rabies was in the world's poorest countries. Sub-Saharan Africa is home to the highest death rates while India reports the highest number of human fatalities - approximately 20,800 deaths per year, over 35% of the global rabies burden.
Canine rabies can be controlled through the mass vaccination of domestic dogs, yet the researchers also found that in nearly all African and Asian countries, the proportion of dogs vaccinated for the disease is far below the level necessary to control it.
Mass vaccination of dogs accounted for a very small proportion of the economic burden of rabies - less than 1.5%. The researchers report that outside of North America and Europe, a large investment in dog vaccination has only been sustained in the Americas, leading to a small rabies burden in this region.
"Generally, medical sector costs were much higher than veterinary costs," write the study authors, "but investment in dog vaccination could bring down costs to the medical sector, demonstrating the need for intersectoral coordination."
The study authors state that collaboration between the animal and human health sectors - vaccinating dogs and improving access to human vaccines - is necessary in order to save lives and reduce the burden of this disease on some of the world's most vulnerable economies.
In addition, effective systems for reporting rabies cases are essential to rabies elimination efforts, as they allow for public health officials to monitor and evaluate prevention measures.
"No one should die of rabies and GARC and its partners will continue to work together using a One Health approach towards global rabies elimination," Prof. Nel states.
Written by James McIntosh
How worried should we be about ticks?
April 18, 2015
HOW WORRIED SHOULD WE BE ABOUT TICKS?
For many people, spring has well and truly arrived. The time is now right to enjoy the outdoors and to roam freely in the woods and the long grass. But be wary! It is not just humans that like to get out and about at this time of year...
Popular singer Avril Lavigne found this out to her cost last year. Her 30th birthday was disrupted by the onset of Lyme disease, an illness that left her bedridden for 5 months.
"I felt like I couldn't breathe, I couldn't talk and I couldn't move," she said in an interview with People. "I thought I was dying." But what was it that caused the bacterial infection? Lavigne believes that she was bitten by a tick at some point in the spring.
It seems strange that something as seemingly innocuous as a tick, often close in size to a pinhead, could damage someone's health to the extent that they are fearful for their life, but Lavigne's situation is shared by many. According to the Centers for Disease Control and Prevention (CDC), a total of 27,203 cases of Lyme disease were confirmed in the US in 2013.
This figure does not tell the whole story when it comes to ticks, however. Although closely associated with Lyme disease, these small arthropods are capable of carrying a wide variety of other pathogens that can cause human disease.
How worried should we be? In this Spotlight feature, we place the diminutive tick under the microscope and find out how much of a danger to health these little bugs can be, as well as what steps should be taken if you are unfortunate enough to be bitten by one.
What are ticks?
Although they look similar to both insects and spiders, a tick is neither of these two creatures. Ticks are arthropods - invertebrates with jointed legs - that belong to the same class of arachnids as mites. They are small external parasites that feed purely on the blood of other creatures.
There are two main kinds of tick: hard ticks (Ixodidae) and soft ticks (Argasidae). The difference between the two is that hard ticks have a rigid protective plate on their backs that restricts the rate at which they can feed. Soft ticks are more leathery and, unencumbered by such a plate, are able to feed more quickly.
Some ticks will only feed on a particular type of animal, while some are far less selective and will happily feed on other creatures if their regular host animal is unavailable. As well as between different species, the feeding habits of ticks can vary across the four stages of their life cycle: egg, larva, nymph and adult. Ticks locate potential hosts by detecting breath, odors, body heat, moisture, vibrations and even shadows in some cases. As ticks are unable to fly or jump, they wait for hosts on the tips of grasses and shrubs in a position known as "questing." When questing, ticks hold onto the grass or shrub with their back pairs of legs while their first pair of legs is outstretched, ready to climb onto a host when they brush past.
When feeding, ticks do not burrow into the skin. Rather, a tick will grasp the surface of the skin and insert its feeding tube. Some tick species will secure themselves further with barbs on their feeding tubes, or by secreting a cement-like substance.
Ticks can be very difficult to notice if you are not actively looking for them. In addition to being quite small, ticks can also secrete saliva with anesthetic properties, numbing the area where the tick is feeding and preventing the host from feeling that the tick has attached itself.
Once attached, a tick will begin to feed. The amount of time taken to feed varies between species, but hard ticks can take as long as several days to feed fully. When most ticks finish feeding, they drop off their host and prepare for the next stage of their life cycle.
While tick bites can provide a small amount of discomfort, the real danger with ticks comes from the pathogens that some ticks carry. If feeding on a host animal with a bloodborne infection, ticks can ingest the pathogens along with the blood. These pathogens can then be transmitted to other hosts the next time a tick attaches itself to feed.
Ticks and Lyme disease
Ticks are almost synonymous with Lyme disease, a bacterial infection characterized by fatigue, fever, headaches and a skin rash (these symptoms are common to many tickborne diseases). Untreated, Lyme disease can spread through the body, affecting the heart, joints and nervous system.
As a bacterial infection, Lyme disease is frequently treated with antibiotic medication such as doxycycline or amoxicillin. If the disease is allowed to develop over a course of several weeks, patients may require the administration of intravenous antibiotics, depending on the severity of the disease's progression.
However, being on the receiving end of a tick bite is by no means a guarantee that Lyme disease has been contracted. The chances of catching Lyme disease depend on a number of factors, including the type of tick that has been encountered and the length of time for which it was feeding.
Specifically, Lyme disease bacteria are only transmitted in the US by blacklegged ticks, also known as deer ticks. Ticks are not born carrying Lyme disease pathogens and will only acquire the infection after feeding on an infected animal - typically a mouse. For this reason, larval deer ticks will not transmit these pathogens.
Blacklegged ticks are only located in specific areas of the country. The CDC report that most Lyme disease infections are found in these endemic locations:
- North-central states, mainly Wisconsin and Minnesota
- Northeast and mid-Atlantic areas, from northeastern Virginia to Maine
- The West Coast, particularly northern California.
In 2013, 95% of confirmed Lyme disease cases in the US were reported in just 14 states: Connecticut, Delaware, Maine, Maryland, Massachusetts, Minnesota, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, Vermont, Virginia and Wisconsin.
Being bitten by a blacklegged tick in one of these states still does not guarantee the transmission of Lyme disease. In most cases a tick carrying the Lyme disease pathogens needs to be attached for at least 36-48 hours before the bacteria are transmitted. Removing a tick promptly after being bitten greatly reduces the risk of acquiring the disease.
The disease was first recognized in the Lyme area of Connecticut in 1975 and takes its name from there. However, although Lyme disease is the most common tickborne illness in North America and Europe, it is not the only one. Recently, news reports have suggested that another tickborne disease is beginning to emerge in Connecticut.
Other ticks, other diseases
Although it shares many symptoms with Lyme disease, the Powassan virus differs in that there is no treatment currently available for it. According to Dr. Theodore Andreadis, of the Connecticut Agricultural Experiment Station, the virus can be fatal in some cases. Powassan virus can also be transmitted much more quickly than Lyme disease. "These ticks will transmit this virus when they feed within a matter of hours, whereas with Lyme disease, for example, ticks generally have to feed up to 2 days before they're capable of transmitting it," Dr. Andreadis told CBS New York.
Dr. Andreadis also stated that there have yet to be any reported human cases of the virus in this region, but people should be more careful than ever when venturing into woodland environments that could be home to ticks carrying these pathogens.
Lyme disease and Powassan virus are not the only conditions that can be spread by the blacklegged tick. Other diseases transmitted by this species include anaplasmosis and babesiosis.
Of course, blacklegged ticks are not the only species of tick known to spread disease to humans. Across the US, for instance, a number of different species can be found that carry a variety of different pathogens potentially dangerous to humans.
The lone star tick is a repeat offender when it comes to spreading disease. Found in southcentral and eastern states in the US, this hard tick can carry pathogens that cause diseases such as ehrlichiosis, southern tick-associated rash illness (STARI) and tularemia. Recent studies suggest that it may also transmit heartland virus.
Another tick to look out for is the Rocky Mountain wood tick, found in the Rocky Mountain states at elevations of 4,000 to 10,500 feet. These creatures can carry pathogens that cause Colorado tick fever, Rocky Mountain spotted fever (RMSF) and tularemia.
It is not just the slow-feeding hard ticks that can carry harmful pathogens either. Tick-borne relapsing fever (TBRF) is transmitted by swifter soft ticks and cases have been reported in 15 states so far: Arizona, California, Colorado, Idaho, Kansas, Montana, Nevada, New Mexico, Ohio, Oklahoma, Oregon, Texas, Utah, Washington and Wyoming.
These examples illustrate the fact that although Lyme disease typically occurs in very specific areas of the US, there are other tickborne diseases that can be found in other areas, so long as the environment is suited to the ticks.
Preventing and treating tick bites
There are a number of precautions that can be taken in order to reduce the chances of a tick attaching, feeding and potentially transmitting an infection. When questing, ticks are most likely to be found in wooded and bushy areas with high grass and leaf litter, so either avoid or be cautious in these types of environment.
Clothing can provide some protection from ticks. Wearing long-sleeved tops can protect the arms, and tucking pant legs into socks and boots can prevent ticks from having easy access to legs. Repellents are also available that can be applied to both skin and clothing. Those containing 20-30% DEET (N,N-diethyl-m-toluamide) offer several hours of protection.
After being out in an environment that could be home to ticks, it is recommended that you conduct a full-body tick check, especially as it is hard to notice them without actively searching.
As stated before, prompt removal of ticks is crucial to reducing the risk of infection. Although specialized tick removal devices are available, a regular pair of fine-tipped tweezers is more than adequate.
Using the tweezers, grasp the tick as close to the surface of the skin as possible. With steady, even pressure, pull upwards. Twisting and jerking the tick can cause some of its mouth-parts to remain embedded in the skin. If this occurs, carefully attempt to remove the remaining parts with the tweezers.
Once removed, clean the affected area and your hands, and dispose of the tick by submerging it in alcohol, placing it in a sealed container or flushing it down the toilet. Do not crush a tick with your fingers.
Ticks are most active in the warmer months, between April and September, so now is the time to be particularly wary of these questing bugs. Although ticks are capable of spreading harmful diseases, with proper caution, these little beasties should not prevent you from being able to enjoy the great outdoors.
Written by James McIntosh
Alcohol: does it really offer health benefits?
April 17, 2015
ALCOHOL: DOES IT REALLY OFFER HEALTH BENEFITS?
Many of us like the odd drink or two, particularly after a hard day at work. In fact, some studies have suggested moderate alcohol consumption is good for our health. On the other hand, some studies claim this may not be the case. Such conflicting findings raise the question: should we succumb to the occasional glass of wine?
According to the National Institute on Alcohol Abuse and Alcoholism (NIAAA), moderate drinking is defined as consuming up to one alcoholic drink a day for women and up to two alcoholic drinks a day for men.
The definition of a "standard" alcoholic drink depends on the alcohol content of the beverage. In the US, the NIAAA consider one alcoholic drink to be 5 oz of wine, 12 oz of beer or 1.5 oz of spirits.
Most of us have had a taste of at least one alcoholic drink at some point in our lives. According to the NIAAA, more than 70% of Americans aged 18 and older report having drunk alcohol in the past year, and 56% have consumed alcohol in the past month.
It is well established that drinking too much alcohol - either at once or over a long period of time - is detrimental to health. It can cause heart problems, liver disease and even cancer. In fact, 88,000 deaths in the US each year are alcohol-related, making it the third leading preventable cause of death in the country.
But many studies suggest that, if consumed moderately, alcohol may actually be beneficial for health, protecting against some of the problems drinking too much of it can cause.
Reduced risk of heart problems
In January, Medical News Today reported on a study published in the European Heart Journal, in which researchers from Brigham and Women's Hospital in Boston, MA, found consuming up to seven alcoholic drinks a week could protect against heart failure.
In this study, the researchers defined one alcoholic beverage as 14 g of alcohol - equivalent to a 125 ml glass of wine or just under one shot of liquor.
From an analysis of almost 15,000 participants, the team found that - compared with participants who consumed no alcohol - men who drank up to seven alcoholic drinks a week had a 20% reduced risk of heart failure, while women who consumed up to seven drinks weekly were at 16% lower risk of heart failure.
And this is not the only study to associate moderate alcohol consumption with improved heart health. According to the Harvard School of Public Health in Boston, MA, more than 100 prospective studies have suggested moderate alcohol use may protect against stroke, heart attack, heart disease, sudden cardiovascular death and other cardiovascular conditions, as well as improve overall mortality.
But what are the mechanisms underlying the link between moderate alcohol use and good heart health?
According to Prakash Deedwania, chief of the cardiology division at the University of California-San Francisco School of Medicine, drinking a glass of wine - and possibly other alcoholic beverages - can benefit the heart by increasing our levels of high-density lipoprotein (HDL), or "good" cholesterol.
"The grape skin provides flavonoids and other antioxidant substances that protect the heart and vessels from the damaging effects of free oxygen radicals produced by our body," she explains.
"The strongest evidence is in favor of wine, but some evidence recently showed beer and other types of alcohol may provide the same benefits related to increasing good cholesterol."
The Harvard School of Public Health note that moderate alcohol consumption may also prevent the formation of small blood clots that block arteries in the heart, neck and brain - a common cause of heart attack and stroke.
Lower risk of diabetes
Numerous studies have associated heavy or binge drinking with increased risk of diabetes. But others have found that a moderate intake could reduce the likelihood of developing the condition. In 2005, a study published in Diabetes Care, a journal of the American Diabetes Association, found participants who consumed moderate amounts of alcohol - around 6-48 g a day - were 30% less likely to develop type 2 diabetes, compared with heavy drinkers and nondrinkers.
These findings were supported by a 2010 study published in the American Journal of Nutrition. Researchers from the Netherlands found that, not only does moderate alcohol consumption lower the risk of type 2 diabetes, the association is independent of other factors that may contribute to reduced diabetes risk.
But what is behind this association? Researchers have linked moderate alcohol consumption with increased insulin sensitivity, which may lower diabetes risk. The researchers from the Netherlands, for example, point to studies indicating that moderate alcohol use increases circulating concentrations of adiponectin - a protein involved in regulating glucose levels.
The aforementioned suggestion that moderate alcohol consumption can increase levels of HDL cholesterol may also explain its association with reduced diabetes risk.
Improved memory
While many can attest that a few too many drinks play havoc with memory, some studies have associated light or moderate alcohol consumption with improved memory.
In June 2014, a study published in The Journal of Nutrition found that moderate alcohol consumption improved participants' memory and thinking skills, particularly for women and those aged 70 and older.
And in October 2014, MNT reported on a study published in the American Journal of Alzheimer's Disease and Other Dementias finding that older individuals who engaged in light or moderate drinking had higher episodic memory - the ability to remember events. They also had a larger volume in the hippocampus - a brain region that plays an important role in memory.
These findings, the researchers say, support animal studies indicating that moderate alcohol consumption plays a role in protecting hippocampal volume through boosting the growth of new nerve cells in that region of the brain.
Data not sufficient to recommend drinking to anyone
While an array of studies have hailed moderate alcohol use for its health benefits, Dr. Geoff Kane, chair of the Medical-Scientific Committee at the National Council on Alcoholism and Drug Dependence (NCADD), told MNT he believes there is not enough evidence to suggest that nondrinkers should up their alcohol intake:
"The main way modest drinking is thought to benefit health is by increasing HDL; there are other means to do that, such as diet and exercise, that do not carry the dangers of drinking.
If someone does not have a drinking problem already and is not at high risk, and if they choose to drink, then they may take satisfaction in the possibility they derive health benefits. My opinion - and so far as I know that of other authorities - is that the data are not at all sufficient to recommend drinking to anyone."
Dr. Kane cautioned that individuals with alcohol addiction or a less severe alcohol use disorder should not be swayed to engage in alcohol use based on studies documenting its health benefits. "Any small benefits are extremely likely to be overpowered by the many adverse health risks of heavy drinking," he added.
What is more, in January, MNT reported on a study claiming the health benefits associated with moderate alcohol consumption may be "overestimated."
From a study of more than 18,000 adults, the team found that mortality benefits were only identified in men aged 50-64 who consumed 15-20 units of alcohol a week, and women aged 65 and over who drank up to 10 units of alcohol a week.
The researchers say their findings indicate that past studies demonstrating health benefits from moderate alcohol use "may, in part, be attributable to an inappropriate selection of a referent group and weak adjustment for confounders."
"The effect of such biases should therefore be borne in mind when evaluating findings from alcohol health studies - particularly when seeking to extrapolate results to the population level," they added.
Alcohol 'may not benefit everyone who drinks moderately'
From the studies mentioned in this Spotlight, one thing is clear: it is quite possible that alcohol poses potential health benefits if consumed in moderation. But don't go reaching for the wine just yet - it may not benefit all of us.
As the NIAAA state:
"Expanding our understanding of the relationship between moderate alcohol consumption and potential health benefits remains a challenge, and although there are positive effects, alcohol may not benefit everyone who drinks moderately."
April is Alcohol Awareness Month - an annual campaign launched in 1987 and sponsored by the NCADD that aims to increase public awareness of alcoholism and its related issues.
Written by Honor Whiteman
Breath test could predict, diagnose stomach cancer
April 15, 2015
BREATH TEST COULD PREDICT, DIAGNOSE STOMACH CANCER
A new study published in the journal Gut reveals how researchers have created a breath test that could be used to diagnose stomach cancer, as well as predict whether an individual is at high risk for the disease.
It is estimated that around 24,590 people in the US will be diagnosed with stomach cancer, or gastric cancer, this year. It is most common among older people, with 69 being the average age of diagnosis in the US.
Stomach cancer rarely causes symptoms in its early stages, making it hard to detect. If it does cause symptoms, these include poor appetite, weight loss, abdominal pain and nausea, which can often be mistaken for other conditions.
As such, only 1 in 5 stomach cancers in the US are diagnosed before the cancer has spread to other parts of the body, emphasizing the need for tools that can detect the disease in its early stages - something that would dramatically improve treatment outcomes.
In this latest study, Prof. Hossam Haick, of the Israel Institute of Technology, and colleagues looked to the use of nanoarray analysis for the early identification of stomach cancer. The technology can detect tiny changes in gut compounds that are exhaled in an individual's breath.
The researchers say past studies have investigated nanoarray analysis for the detection of stomach cancer, but they note that these studies have not assessed whether the technology could be used to identify precancerous changes.
Fast facts about stomach cancer
- The chance of developing stomach cancer over the course of a lifetime is around 1 in 111
- Men are at higher risk for developing stomach cancer than women
- The average 5-year survival rate in the US for people with stomach cancer is around 29%.
Nanoarray analysis a potential 'accurate, noninvasive and low-cost' screening tool
Prof. Haick and colleagues obtained two breath samples from 484 individuals, of whom 99 had already been diagnosed with stomach cancer but had not yet been treated with chemotherapy or radiotherapy. The participants had fasted for 12 hours prior to the breath samples being taken and had refrained from smoking for at least 3 hours beforehand. In addition, subjects were tested for Helicobacter pylori infection - an established risk factor for stomach cancer - and their smoking and drinking habits were analyzed.
The researchers used gas chromatography-mass spectrometry (GCMS) - a technology that measures the levels of volatile organic compounds (VOCs) in exhaled breath - to analyze the first breath sample from each participant, while a combination of nanoarray analysis and pattern recognition was used to analyze the second breath sample.
The GCMS analysis identified 130 VOCs in participants' exhaled breath. On comparing the breath samples of participants with stomach cancer with those of participants who had changes in VOC levels considered to be precancerous, the researchers identified eight distinctive "breath-print" compositions.
On applying the nanoarray technique to the breath samples, the team found it was effectively able to distinguish between breath-print compositions in participants with stomach cancer and those at low and high risk of the condition. The method achieved 73% sensitivity, 98% specificity and 92% accuracy, according to the results.
These results remained even after accounting for potential confounding factors, such as age, alcohol intake and use of proton pump inhibitors - drugs that reduce stomach acid production - according to the team.
While GCMS technology cannot be used for stomach cancer screening because of its high cost and complexity, the researchers say nanoarray analysis may offer a highly accurate and cheaper alternative.
The researchers add:
"The attraction of this test lies in its noninvasiveness, ease of use (therefore high compliance would be expected), rapid predictiveness, insensitivity to confounding factors and potentially low cost."
Written by Honor Whiteman
Could your dog give you norovirus?
April 14, 2015
COULD YOUR DOG GIVE YOU NOROVIRUS?
Norovirus is the leading cause of foodborne illness in the US. You can also catch it from infected people and contaminated surfaces. Now, new research raises the question of whether humans can catch it from dogs.
Writing in the Journal of Clinical Microbiology, veterinarian Sarah Caddy and colleagues explain how they found some dogs can mount an immune response to human norovirus - a strong clue that they have been infected by the bug.
Caddy, who is working toward her PhD at the University of Cambridge and Imperial College London in the UK, says:
"We also confirmed that human norovirus can bind to the cells of the canine gut, which is the first step required for infection of cells."
Together with evidence that human norovirus has been isolated from domestic dogs in Europe, the findings raise concerns that people could catch the bug from animals.
Norovirus is a leading cause of gastroenteritis, or "stomach flu," causing vomiting and diarrhea in both adults and children. It is very contagious and can infect anyone. You can catch it from an infected person, contaminated food or water, or from contaminated surfaces.
According to the Centers for Disease Control and Prevention (CDC), in the US every year norovirus is responsible for 19-21 million cases of acute gastroenteritis and contributes to 570-800 deaths, mostly among young children and the elderly.
Human norovirus particles can bind to dog intestinal tissue
For their study, Caddy and colleagues used noninfectious human norovirus particles - comprising just the bug's outer protein coat, or capsid. The capsid is the part of the virus that binds to host cells. Capsids alone cannot cause infection because they lack the internal machinery of the virus.
The team studied the ability of capsids to bind to tissue samples from dog intestines in test tubes. They found evidence that seven different strains of human norovirus may be able to bind to canine gastrointestinal tissue. This suggests "that infection is at least theoretically possible," they note.
The researchers also carried out other tests to discover if dogs can carry human norovirus.
While they found no trace of virus in stool samples from 248 dogs (including some with diarrhea), they did find evidence of antibodies to human norovirus in blood samples from 43 out of 325 dogs.
It is currently not known whether human norovirus can cause clinical disease in dogs. Assuming that it can, the study found no evidence that dogs can shed it in sufficient quantities to infect humans. However, the authors note that other studies have suggested as few as 18 virus particles can cause human infection.
There is also little evidence that dogs or animals are involved in spreading norovirus among people when large outbreaks occur, such as on cruise ships and in hospitals.
Evidence from this study is sufficient to warrant further investigation
"There are plenty of anecdotal cases of dogs and humans in the same household, having simultaneous gastroenteritis, but very little rigorous scientific research is conducted in this area. Until more definitive data is available, sensible hygiene precautions should be taken around pets, especially when gastroenteritis in either humans or dogs is present in a household."
Written by Catharine Paddock PhD
Carefully alternating antibiotics can prevent bacteria developing resistance, say researchers
April 11, 2015
CAREFULLY ALTERNATING ANTIBIOTICS CAN PREVENT BACTERIA DEVELOPING RESISTANCE, SAY RESEARCHERS
In a surprising new study, researchers show it is possible to kill drug-resistant bacteria by alternating two antibiotics at doses that would ordinarily boost bacterial resistance and survival when used alone or combined.
Writing in the journal PLOS Biology, the international team - led by Robert Beardmore, a biosciences professor at the University of Exeter in the UK - describes how carefully devised "sequential treatments" using antibiotics may help combat the rise of resistant bacteria.
Prof. Beardmore says they found a complex relationship between dose, density of bacteria population and drug resistance, and:
"As we demonstrate, it is possible to reduce bacterial load to zero at dosages that are usually said to be sublethal and, therefore, are assumed to select for increased drug resistance."
Another crucial finding of the study is that the technique may also reduce the risk of bacteria developing resistance to antibiotics, thereby extending their useful lifetime.
The researchers decided to carry out the study because for decades research has focused on using drugs as "cocktails" where the drugs help each other as a "synergistic combination." But what if there is also an effect from "sequential synergy?"
Sequential treatment worked with antibiotic doses that singly or combined had no effect
For their investigation, the team used a simple test-tube model of an Escherichia coli bacterial infection where the bacteria had some antibiotic resistance genes.
They tested the effect of two commonly prescribed antibiotics - erythromycin and doxycycline - on the bacteria in three ways: given singly, given as a combination cocktail and given as a sequential treatment.
The results showed that antibiotics given in certain sequential treatments cleared the infection - even when those same drugs given at higher doses singly or combined failed to have any effect.
Further tests showed that while sequential treatments did not stop drug resistance mutations in the bacteria altogether, the first drug made the bacteria sensitive to the second drug and thereby reduced the risk of resistance developing.
The researchers suggest that the fluctuating environment created by a well-designed sequence of alternating precise doses is what sensitizes the bacteria to concentrations that, in a steady environment, would normally induce drug resistance and extend survival.
They note that these findings are still at the experimental stage, and there is still a lot of work to do before any sequential treatments are ready for clinical use.
The team expects the findings will trigger a series of studies looking at sequential ways to use antibiotics at doses lower than previously thought possible.
The study was funded by the Engineering and Physical Sciences Research Council (EPSRC) - the main body for funding research in engineering and the physical sciences in the UK. Prof. Beardmore is an EPSRC Leadership Fellow in Mathematical Biosciences.
Meanwhile, Medical News Today recently learned that a multi-drug resistant intestinal bug is spreading in the US. The authorities say the Shigella bacteria are entering the country in infected travelers and causing a series of outbreaks.
Written by Catharine Paddock PhD
Personalized vaccine shows promise against melanoma recurrence
April 5, 2015
PERSONALIZED VACCINE SHOWS PROMISE AGAINST MELANOMA RECURRENCE
Early data from a "first-in-people" clinical trial at Washington University School of Medicine in St. Louis, MO, suggests that personalized melanoma vaccines may provide a powerful immune response against tumor mutations.
Such vaccines have been attempted before, but they have worked by targeting proteins expressed at high levels in certain cancers - the problem with this approach being that the same proteins can also be found in healthy cells. As such, it is difficult to provoke a strong immune response with this method.
The new vaccines were created by first sequencing the genomes of the patients' tumors, as well as samples of healthy tissue from each patient, so that mutated proteins called neoantigens - which are unique to tumor cells - could be identified. Computer algorithms and laboratory tests were then used to predict which neoantigens would be most likely to cause a potent immune response that could be used in a vaccine.
"You can think of a neoantigen as a flag on each cancer cell," explains first author of the study Beatriz Carreno, associate professor of medicine. "Each patient's melanoma can have hundreds of different flags. As part of validating candidate vaccine neoantigens, we were able to identify the flags on the patients' cancer cells. Then we created customized vaccines to a select group of flags on each patient's tumor."
Three patients with advanced melanoma received these vaccines. After receiving the vaccines, blood samples were taken from the patients every week for 4 months. The researchers found that each patient had an increased number and diversity of T cells fighting the tumors.
The patients had previously had surgery to remove their tumors, but cancer cells had spread to their lymph nodes, which often predicts a return of melanoma. According to the researchers, the vaccine's stimulation of T cell clones suggests this approach may also be useful in activating a range of T cells that target mutations in lung cancer, bladder cancer or colorectal cancer - other cancers with similarly high mutation rates to melanoma.
Study suggests short-term safety and a strong immune response, but further trials are needed
"This proof-of-principle study shows that these custom-designed vaccines can elicit a very strong immune response," says senior author Dr. Gerald Linette, a Washington University medical oncologist leading the clinical trial at Siteman Cancer Center at Barnes-Jewish Hospital in St. Louis. Dr. Linette adds:
"The tumor antigens we inserted into the vaccines provoked a broad response among the immune system's killer T cells responsible for destroying tumors. Our results are preliminary, but we think the vaccines have therapeutic potential based on the breadth and remarkable diversity of the T cell response."
Due to the success reported so far, a phase 1 vaccine trial has now been approved by the Food and Drug Administration (FDA) that will enroll six patients. If testing in these patients confirms the vaccines to be effective, then the vaccines could one day be used to stimulate an anticancer immune attack following surgery. These immune attacks could seek out and destroy any remaining cancer cells, preventing recurrence.
However, the researchers caution that it is too early to say whether the vaccines will be effective in the long term, as the study was designed to assess safety and immune response. Encouragingly, none of the trial participants have experienced adverse effects from their personalized vaccines.
"Our team has developed a new strategy for personalized cancer immunotherapy," Dr. Linette says.
"Many researchers have hypothesized that it would be possible to use neoantigens to broadly activate the human immune system, but we didn't know that for sure until now. We still have much more work to do, but this is an important first step and opens the door to personalized immune-based cancer treatments."
Written by David McNamee
Could passive exposure to bleach increase infection rates in kids?
April 3, 2015
COULD PASSIVE EXPOSURE TO BLEACH INCREASE INFECTION RATES IN KIDS?
A team of European researchers has suggested that passive exposure to bleach in the home could increase the frequency of respiratory and other infections among school-age children.
The study, published in Occupational & Environmental Medicine, examined the effects of exposure to bleach in the home among children from schools in Finland, Spain and the Netherlands.
Previous studies have suggested the use of cleaning agents in the home may increase the risk of respiratory infections and wheezing during the first year of life, and airway inflammation at school age.
Bleach is a cleaning agent that is used widely across the world. According to the researchers, a previous cross-sectional study reported that school-age children living in homes where bleach was used had an increased risk of recurrent bronchitis, although they also received some protection against asthma and allergies.
For the new study, the researchers examined the impact of bleach use in the homes of 9,102 children aged 6-12 attending 19 schools in Utrecht, The Netherlands, 18 schools in Barcelona, Spain, and 17 schools located in Eastern and Central Finland.
The parents of participating children completed questionnaires detailing whether they used bleach to clean their homes once a week and the number of times their children had developed any of a set of specified infections over the course of the past 12 months.
Parents had the choice of reporting infection frequency as "never," "once," "twice" or "more than three times." Bleach use varied in the countries participating in the study. While 72% of respondents from Spain reported using bleach, only 7% of those from Finland did. Additionally, all of the Spanish schools involved with the study were cleaned with bleach whereas none of the Finnish schools were.
Effects reported in the study are a public health concern, authors state
After adjusting for other potentially influential factors such as passive smoking, household mold and the use of bleach to clean school premises, the researchers found that the number and frequency of infections were highest among children whose parents regularly used bleach in their homes. The researchers noted statistically significant differences for flu, tonsillitis and any infection. Among children whose parents used bleach in the home, the risk of one episode of flu in the past 12 months was 20% higher. For recurrent tonsillitis, the risk was 35% higher, and for any infection it was 18% higher. The findings were the same in all three countries.
"Our results suggest that passive exposure to cleaning bleach at home is associated with an increased frequency of respiratory and related infections in school-age children," write the researchers.
A number of limitations to the study are identified by the authors. "Unfortunately, we did not have information on the use of other cleaning products and we cannot exclude the possibility that the observed results are due to the use of other irritants or to their combinations," they state.
Equally, only basic information was gathered for bleach use in the home, making it difficult for the authors to differentiate exposure levels between participating families.
As the study is observational, the authors cannot draw any definitive conclusions about causation, though they believe their findings support those from other studies suggesting a link between cleaning products and respiratory infections.
They suggest that further studies in this area should be conducted, including more detailed descriptions of bleach use and objective measurements of exposure and health outcomes in order to confirm their findings.
"Nevertheless, the high frequency of use of disinfecting cleaning products - caused by erroneous belief, reinforced by advertising, that our homes should be free of microbes - makes the modest effects reported in our study of public health concern," the authors conclude.
Written by James McIntosh
Acetaminophen does not work for lower back pain or osteoarthritis
April 1, 2015
ACETAMINOPHEN DOES NOT WORK FOR LOWER BACK PAIN OR OSTEOARTHRITIS
Acetaminophen - also known as paracetamol and marketed under brand names such as Mapap, Panadol and Tylenol - is not effective for the treatment of lower back pain and offers little value for osteoarthritis of the hip or knee, according to a study published in The BMJ.
The systematic review and meta-analysis is a synthesis of the research evidence from 13 randomized controlled trials designed to investigate the safety and efficacy of acetaminophen in the management of spinal pain - lower back or neck - and osteoarthritis.
The paper concludes that the widely used painkiller is ineffective against lower back pain and offers only "minimal short-term benefit" for people with osteoarthritis of the hip or knee.
The authors of the study call for updates to guidelines that currently recommend acetaminophen as the first analgesic option.
Lead author Gustavo Machado, of The George Institute for Global Health in the UK and the University of Sydney in Australia, says:
"Worldwide, paracetamol is the most widely used over-the-counter medicine for musculoskeletal conditions, so it is important to reconsider treatment recommendations given this new evidence."
Opinion leaders in the fields of general practice and rheumatology write in an editorial article about the study that some of its findings are not surprising.
Professors Christian Mallen and Elaine Hay say in the same issue of The BMJ that the new evidence reopens a debate that has already raised questions about the value of the painkiller. They cite, for example, the changing advice of the UK's drug-rationing body on the prescription of acetaminophen for osteoarthritis.
These questions leave "patients and clinicians wondering what is left that can help to manage these common, painful and highly disabling conditions."
The editorial urges physical treatments as the way forward, including exercise. It concludes:
"Ongoing and ever-increasing concerns about pharmacological management of musculoskeletal pain highlights the importance of nonpharmacological options, which form the cornerstone of self-management of spinal pain and osteoarthritis."
The review of the studies comparing acetaminophen against placebo found "high-quality" evidence that:
- The painkiller is ineffective in patients with low back pain for reducing pain intensity and disability
- The analgesic produces a significant but "clinically unimportant" effect on pain and disability in patients with osteoarthritis.
The authors of the present study also found that acetaminophen increased the likelihood of having abnormal results on liver function tests compared with placebo.
Machado explains: "Use of paracetamol for low back pain and osteoarthritis was also shown to be associated with higher risk of liver toxicity in patients." The clinical relevance of this, however - the way it affects patients - remains uncertain, say the authors.
Levels of overall adverse side effects across the studies were not found to be higher for the painkiller compared with placebo. However, acetaminophen, like any drug, is not 100% safe, and the evidence concerning this has developed recently.
Safety issues with acetaminophen
Prof. David Hunter, an osteoarthritis expert from the University of Sydney but not one of the authors, cites recent evidence to show that "paracetamol can be associated with an increasing incidence of mortality, and increased risk of cardiovascular, gastrointestinal and renal disease in the general adult population."
We reported earlier this month on the study Prof. Hunter refers to - it suggested the risks of acetaminophen have been underestimated.
Acetaminophen has been the subject of tighter controls from the US drug regulator the Food and Drug Administration (FDA) in recent years. Since 2013, new warnings have been included on labels because of the risk of rare but serious skin reactions.
More recently, the maximum dose of acetaminophen in any tablet or capsule that combines the drug with an opioid painkiller has been restricted to 325 mg. Since March 2014, no manufacturer has marketed the treatments with a dose above this. The FDA explains this latest safety control in a YouTube video.
Source: Medical News Today
Does an apple a day really keep the doctor away?
March 31, 2015
DOES AN APPLE A DAY REALLY KEEP THE DOCTOR AWAY?
The proverbial advice to eat an apple a day first appeared in print in 1866. Nearly 150 years later, a medical journal has used the excuse of April Fool's Day to publish a study that asks - seriously - whether this wisdom really does keep the doctor away.
The study tells us that the "an apple a day keeps the doctor away" aphorism was coined in 1913, but that it was based on an older form with a different rhyme, which appeared some 149 years ago in Pembrokeshire, Wales: "Eat an apple on going to bed and you'll keep the doctor from earning his bread."

The University of Michigan School of Nursing researchers in Ann Arbor believe giving such medical proverbs an empirical evaluation "may allow us to profit from the wisdom of our predecessors." For the study's measure of keeping the doctor away, Matthew Davis, PhD, and co-authors evaluated an outcome of no more than one visit a year to the doctor, comparing daily apple eaters with non-apple eaters.

So did a daily apple succeed in keeping the doctor away? No, it did not. There was no statistically meaningful difference in visits to the doctor between daily apple eaters and non-apple eaters in the analysis. But the study did find that an apple a day kept the pharmacist away.
'Avoiding the use of health care services'
When socio-demographic and health-related characteristics such as education and smoking were taken into account, daily apple eating was not associated with successfully keeping to a maximum of one self-reported doctor visit a year.

Of the 8,399 participants who answered a questionnaire to recall their dietary intakes, 9% (753) were apple eaters and the remainder, 7,646, were non-apple eaters. The apple eaters showed higher educational attainment, were more likely to be from a racial or ethnic minority, and were less likely to smoke. The data for the analysis came from the National Health and Nutrition Examination Survey conducted during 2007-08 and 2009-10.

"While the direction of the associations we observed supports the superiority of apple eaters over non-apple eaters at avoiding the use of health care services, these differences largely lacked statistical significance," say the authors after accounting for the differences in apple eaters that - beyond the effects of the apple eating itself - could have explained why they used health care services less.
An apple a day means one of at least 7 cm diameter
To analyze apple-eating against visits to the doctor, the researchers compared daily apple eaters with non-apple eaters. An apple a day counted if the participants answered that they had at least 149 g of raw apple. Eating less than this amount counted as no daily apple-eating, and apple consumption based purely on juices or sauces was also excluded. The study also looked for any response to increasing the amount of daily apple-eating by comparing doctor visits from people who ate no apples with those who ate one small apple, one medium apple or one large apple daily.
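The study's eligibility rule reduces to a simple classification. A minimal sketch in Python (the function name is ours; the 149 g cutoff and the juice/sauce exclusion are from the study):

```python
def is_daily_apple_eater(raw_apple_grams, juice_or_sauce_only=False):
    """Apply the study's rule: at least 149 g of raw apple counts as
    'an apple a day'; consumption purely as juice or sauce is excluded."""
    if juice_or_sauce_only:
        return False
    return raw_apple_grams >= 149
```

So a respondent reporting a 160 g raw apple counts as a daily apple eater, while one reporting only applesauce, however much of it, does not.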
The analysis shows no relationship between apple "dose" and the likelihood of keeping the doctor away in terms of "avoiding health care services." Except, found the authors, for avoidance of prescription medications.
The study found that apple eaters were more likely to keep the doctor away, but this was before adjusting for the socio-demographic and health characteristics of the survey respondents: 39.0% of apple eaters avoided more than one yearly doctor visit, compared with 33.9% of non-apple eaters.

The daily apple eaters were also more likely to successfully avoid prescription medication use (47.7% versus 41.8%), and this difference survived statistical adjustment. The association between eating an apple a day and keeping the pharmacist away, then, was a statistically significant finding, whereas keeping the doctor away failed to hold true. Nor did the proverb show any effect in an analysis of overnight hospital stays or mental health visits; there was no difference for apple eaters in the likelihood of keeping either of these two away.

The overall conclusion of this study was that only one finding supported the long-standing wisdom: apple eaters "were somewhat more likely to avoid prescription medication use than non-apple eaters." The authors say in their final analysis that promotion of apple consumption may have only "limited benefit" in reducing national health care spending, adding:
"In the age of evidence-based assertions, however, there may be merit to saying, 'An apple a day keeps the pharmacist away.'"
Source: Medical News Today
Scientists identify neural mechanism responsible for chronic pain
March 30, 2015
SCIENTISTS IDENTIFY NEURAL MECHANISM RESPONSIBLE FOR CHRONIC PAIN
In a new study published in the journal Neuron, scientists from the University of Bern in Switzerland identify a mechanism in the brain they suggest is responsible for chronic pain. The researchers hope their discovery will lead toward new treatments for chronic pain.
"The constant perception of pain severely influences the quality of life of the patients and represents an extraordinary emotional burden," says lead author Thomas Nevian from the Department of Physiology at the University of Bern. Nevian explains that despite chronic pain affecting more than 1 million people in Switzerland alone (about 100 million Americans are affected by the condition), proper treatment strategies are missing in many cases. "Thus," he says, "understanding the development of chronic pain is of utmost importance for neuroscience research."
Neurons in the gyrus cinguli create a 'pain memory'
Nevian and colleagues' discovery is the identification of a cellular mechanism in a brain region called the gyrus cinguli, which is typically associated with the emotional aspects of pain. In a mouse model, the researchers found that neurons in this region are modified by chronic pain, establishing a form of "pain memory."

"The neurons are constantly activated by a noxious stimulus," explains Nevian, "thus building a memory trace for pain that becomes irreversible. Our idea was to understand this mechanism better to derive potential new treatment strategies."

Because pain is perceived via electrical impulses in neurons, the researchers looked for electrical fluctuations among neurons in the limbic system. They found such changes - "more excitable" neurons - in the gyrus cinguli.
They believe the neurons were more excitable here due to the down-regulation of an ion channel that normally regulates the electrical properties of these cells; its loss leads to an increased number of nerve impulses, which the brain perceives as pain.
Next, the researchers attempted to restore the function of the ion channel. They succeeded in doing this by activating a receptor sensitive to serotonin. Nevian explains the role serotonin plays in reactivating the ion channel:
"It has been known for some time that serotonin can modulate pain perception and the function of some drugs is based on this. Nevertheless, what is new in our study now is that we were able to identify a specific subtype of serotonin receptor that reduced the perception of pain more efficiently. This is an important result, which might help to treat chronic pain more efficiently in the future."
An interesting additional finding of this research is that the results suggest a mechanism to explain how tricyclic antidepressants work. Previously, it has been assumed that tricyclics work by acting on the spinal cord, but the Bern study shows that they also work directly on pain perception in the brain. However, Nevian says that even though he believes an important step has been made in this research, it will be some time before novel drugs are designed based on these results.
Source: Medical News Today
Higher Fitness Levels Linked to Lower Risk of Some Cancers and Death
March 27, 2015
HIGHER FITNESS LEVELS LINKED TO LOWER RISK OF SOME CANCERS AND DEATH
Higher fitness levels among middle-aged men might be associated with a lower risk for later lung and colorectal cancer, but not prostate cancer, according to a new study. The study also linked higher fitness levels in midlife to a lower risk for later death from cancer or cardiovascular disease. The findings were published online March 26 in JAMA Oncology. This study is the first to show that cardiorespiratory fitness (CRF) predicts the incidence of certain types of cancer and the risk for death from cancer or cardiovascular disease after being diagnosed with cancer, the authors report. "Among the men who developed cancer, those who were more fit at middle age had a lower risk of dying from all three cancers studied, as well as from cardiovascular disease," said first author Susan G. Lakoski, MD, MS, from the University of Vermont in Burlington.
"Even a small improvement in fitness (by 1 MET) made a significant difference in survival, reducing the risk of dying from cancer by 10% and from cardiovascular disease by 25%," she told Medscape Medical News. A 1-MET difference equates to the difference between running an 11.5-minute mile (9 METs) and a 12.0-minute mile (8 METs), Dr Lakoski explained. Dr Lakoski, a former professional athlete for whom training has "always been an important part of my life," said her training has evolved over the years because of age-related changes and work schedules. "I understand the importance of staying committed to a regular exercise routine," she said. "I know my limits and how hard to push myself based on my prior fitness testing, which we advocate for in this study."
The study used information from the Cooper Center Longitudinal Study, which was conducted at a preventive medicine clinic in Dallas. It involved 13,949 men who underwent comprehensive medical examination, cardiovascular risk assessment, and treadmill testing to evaluate CRF from 1971 to 2009. The mean age of the participants was 49 years, and 98% of the cohort was white. The researchers used Medicare claims data from 1999 to 2009 to assess lung, prostate, and colorectal cancer diagnosed in people 65 years and older. Over a follow-up period of 6.5 years, 1310 men developed prostate cancer, 200 developed lung cancer, and 181 developed colorectal cancer.
The study revealed that a high CRF in midlife, compared with a low CRF, was associated with a 55% reduction in lung cancer risk (adjusted hazard ratio [HR], 0.45; 95% confidence interval [CI], 0.29 - 0.68) and a 44% reduction in colorectal cancer risk (adjusted HR, 0.56; 95% CI, 0.36 - 0.87). This association did not apply to prostate cancer (adjusted HR, 1.22; 95% CI, 1.02 - 1.46; P = .0004). "The relation between fitness and prostate cancer risk is controversial," Dr Lakoski explained. "It is possible that men with higher CRF are more likely to undergo more frequent preventive healthcare screening and/or detection visits, and have a greater opportunity to be diagnosed with localized prostate cancer than men with lower CRF." However, men who developed prostate cancer had a lower risk of ultimately dying from cancer or cardiovascular disease if they had higher fitness levels before diagnosis, she pointed out.
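The percentage reductions quoted alongside these hazard ratios follow from the usual rule of thumb, risk change = (1 - HR) x 100. A minimal sketch (the helper name is ours):

```python
def percent_reduction(hazard_ratio):
    """Convert a hazard ratio into the percent risk reduction quoted in
    the article: e.g. HR 0.45 means a (1 - 0.45) * 100 = 55% reduction."""
    return round((1 - hazard_ratio) * 100)

print(percent_reduction(0.45))  # lung cancer: 55
print(percent_reduction(0.56))  # colorectal cancer: 44
```

The same conversion gives the 32% and 68% mortality-reduction figures reported further on (HRs of 0.68 and 0.32, respectively).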
"This speaks to the importance of being fit in midlife to improve survival, even if a man ultimately develops lung, prostate, or colorectal cancer later in life," Dr Lakoski continued.

In addition, a high CRF in midlife, compared with a low CRF, was associated with a 32% reduction in all cancer-related death in men 65 years and older who developed lung, colorectal, or prostate cancer (adjusted HR, 0.68; 95% CI, 0.47 - 0.98). And a high CRF was associated with a 68% reduction in death from cardiovascular disease after receiving a cancer diagnosis (adjusted HR, 0.32; 95% CI, 0.16 - 0.64).

Study limitations include the inability to assess the length and intensity of smoking, cancer stage, changes in CRF from the initial health screening to cancer diagnosis, and outcomes that occurred between study entry and Medicare eligibility.

"We did not capture the individual workouts of each participant," Dr Lakoski reported. "However, it is known that increasing both the intensity and duration of exercise improves fitness levels. Patients should talk to their physician about what amount of exercise is right for them to start with. This is the key first step."

"We propose that fitness testing is an objective guide to help physicians counsel their patients on this topic," she emphasized. "These findings provide support for the utility of CRF assessment in preventive healthcare settings, and possibly following a diagnosis of cancer."

"Future studies are needed to test these results across all major cancers in men and women, and to address how much an individual must change their fitness to see cancer prevention benefit," she concluded.
Dr Lakoski reports receiving partial funding from the National Institute of General Medical Sciences/National Institute of Health. Dr Jones reports receiving research grants from the National Cancer Institute.
UK can expect mosquito-borne diseases as climate warms, say experts
March 25, 2015
UK CAN EXPECT MOSQUITO-BORNE DISEASES AS CLIMATE WARMS, SAY EXPERTS
A new report from public health experts warns that climate change could accelerate the arrival in the UK of vector-borne diseases, which are transmitted by insects such as mosquitoes and ticks.
While not blaming the anticipated arrival of chikungunya, dengue fever and West Nile virus in the UK in the next decades entirely on climate change, the effect of rising temperatures is likely to accelerate it, say the authors, two experts from the Emergency Response Department at Public Health England. Dr Jolyon Medlock and Professor Steve Leach report their concerns and the evidence they base them on in a review published in The Lancet Infectious Diseases journal.
They warn that vector-borne diseases are rising and spreading across Europe. The last decade has seen malaria arrive in Greece, West Nile virus in eastern Europe, and chikungunya in Italy and France.
The paper follows a United Nations report in December 2014 that warned dengue poses a serious threat to large parts of Europe and South America. Both the UN report and the new review say that these regions are currently too cold for the larvae and eggs of dengue-carrying mosquitoes to survive the winter, but that as global temperatures rise, the areas offering this protection will shrink.

Prof. Leach says climate change is one of several factors driving the increase in vector-borne diseases in the UK and Europe. Other factors that should be considered, he notes, include socioeconomic development, urbanization, migration and widespread changes in land use.
Used tires are an ideal breeding ground for disease-carrying mosquitoes
Public Health England has been monitoring seaports, airports and even motorway service stations for signs of disease-carrying mosquitoes. So far, they have not found any non-native mosquitoes in the UK, but Dr. Medlock says a better system is needed to monitor imported used tires, a favored place for mosquitoes to lay their eggs.
Fast facts about dengue virus
- Dengue fever is caused by any one of four related viruses spread through mosquito bites
- There are no vaccines against dengue and the only protection is to reduce the risk of being bitten by mosquitoes
- Dengue currently infects 400 million people a year and is a leading cause of death in the tropics and subtropics
Used tires are a good example of where it is not just the change in climate but also human activity that increases the spread of vector-borne diseases.
Used tires are an ideal breeding ground for several species of disease-carrying mosquitoes. They fill with leaf litter and rain, making conditions right for mosquito larvae to grow. For example, in the US, they have been targeted as an entry point for vector-borne diseases since the mid-1980s when a substantial breeding population of Asian Tiger (Aedes albopictus) mosquitoes was discovered in Houston, TX. These mosquitoes are known carriers of dengue and chikungunya viruses. The US authorities concluded it was likely the insects arrived from Japan as eggs deposited in used tires. Medlock and Leach suggest that increased rainfall and milder climate are likely to spur the Asian Tiger mosquito to breed and expand in the UK, particularly in the south of England.
Climate change models predict chikungunya virus spread in London by early 2040s
The authors note that climate change models predict that by the early 2040s, conditions will be suitable for 1 month of chikungunya virus transmission in London, and up to 3 months in southeast England by the early 2070s. And just a 2°C rise in temperature is enough to extend the active season for dengue-carrying mosquitoes by 1 month, and increase their geographical spread by up to 30% by 2030, they add. There are already several mosquitoes in the UK capable of spreading West Nile virus. At present the number of mosquitoes that bite humans (the Culex species) is too low, but rising temperatures could alter this, warn the authors, who also note that a number of sites harboring Culex modestus - the number 1 carrier of West Nile virus in Europe - have been found recently in Kent. Prof. Leach concludes:
"Lessons from the outbreaks of West Nile virus in North America and chikungunya in the Caribbean emphasize the need to assess future vector-borne disease risks and prepare contingencies for future outbreaks."
The paper's release is planned to coincide with the Impact of Environmental Changes on Infectious Diseases 2015 meeting that is taking place in Sitges, Spain, this week.
Source: Medical News Today
Study Found Nine Modifiable Triggers for Low Back Pain
March 24, 2015
STUDY FOUND NINE MODIFIABLE TRIGGERS FOR LOW BACK PAIN
Distraction was the greatest risk factor by far for new-onset acute low back pain (LBP), according to a new case-crossover study. The study, published online February 24 and in the March issue of Arthritis Care & Research, also found that onset was most likely between 7:00 am and noon, that LBP risk was substantially increased by a number of modifiable physical and psychosocial triggers, and that people older than 60 years were less at risk from heavy loads than younger participants, perhaps because they have learned how to lift safely.
"The key message for clinicians is that even brief exposure to heavy loads and awkward postures will drastically increase our chances of developing back pain," study coauthor Manuela L. Ferreira, PhD, associate professor at the George Institute for Global Health and the Institute of Bone and Joint Research, Sydney Medical School, University of Sydney, Australia, told Medscape Medical News. "In other words, not only people who are exposed to lifting activities on an ongoing basis are at risk. In addition, being distracted and fatigued during manual tasks will likely increase our risk of having back pain. The next step would be to develop and test prevention strategies based on these results."
The researchers, led by Daniel Steffens, BPT, from the University of Sydney, recruited 999 patients with new episodes of acute LBP from 300 primary care clinics in Sydney, Australia, between October 2011 and November 2012. Subjects were asked to report exposure to 12 possible LBP triggers during the 96 hours before pain onset. Dr Ferreira said most of the included triggers have been listed as potentially hazardous activities in the Australian Code of Practice for the prevention of musculoskeletal disorders and added that the researchers also chose activities that could be modified and targeted in back pain prevention campaigns.
The possible physical triggers studied were "heavy load; awkward positioning; handling of objects far from the body; handling people or animals and unstable loading; a slip, trip, or fall; engagement in moderate or vigorous physical activity; and sexual activity." The possible psychosocial triggers were alcohol consumption, fatigue, and being distracted during an activity or task. Patients were interviewed by telephone within 7 days of presenting with acute LBP, using a guided interview script that included date and time of pain onset as well as time and duration (if any) of exposure to each of the 12 possible triggers. The researchers used a case-crossover design, so that each person served as his or her own control; the design aims to reduce the potential for between-person confounding and to eliminate potential confounders such as genetic and lifestyle influences.
The greatest risk for an episode of acute LBP was associated with the psychosocial trigger of being distracted during a task or activity, which had an odds ratio (OR) of 25.0 (Table). All of the physical triggers except sexual activity were strongly associated with increased risk for back pain, with ORs ranging from 2.7 to 8.0; the most dangerous physical trigger was a manual task involving awkward positioning. By contrast, alcohol consumption was not linked to increased risk of LBP. "While many of the triggers included in the study had been previously seen as hazardous activities, especially in the workplace, we were surprised that being distracted and fatigued during manual tasks will drastically increase our chances of developing back pain," Dr Ferreira said.
Table. Risk for Acute LBP Associated With Modifiable Trigger Factors
Trigger: OR (95% CI)
Distracted during activity: 25.0 (3.4 - 184.5)
Awkward position: 8.0 (5.5 - 11.8)
Objects not close to body: 6.2 (2.4 - 15.9)
Manual task involving people or animals: 5.8 (2.3 - 15.0)
Unstable, unbalanced, or difficult to grasp: 5.1 (2.4 - 10.9)
Heavy loads: 5.0 (3.3 - 7.4)
Vigorous physical activity only: 3.9 (2.4 - 6.3)
Fatigued or tired: 3.7 (2.2 - 6.3)
Moderate or vigorous physical activity: 2.7 (2.0 - 3.6)
Alcohol consumption: 1.5 (0.6 - 3.7)
Sexual activity: 0.7 (0.3 - 1.8)
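For readers unfamiliar with the OR (95% CI) notation, an odds ratio and its Woolf confidence interval can be computed from a 2x2 exposure table; the interval is constructed on the log scale. The counts here are made up for illustration, not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts:
or_, lo, hi = odds_ratio_ci(40, 10, 20, 30)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # prints: 6.0 2.5 14.7
```

A wide interval like the 3.4 - 184.5 reported for distraction reflects how few discordant observations drive that estimate.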
An unexpected finding was a diurnal variation in LBP onset. The authors write that 35.2% of participants reported pain onset between 7:00 and 10:00 am. This inspired exploratory post hoc analyses of risk associated with being exposed to triggers between 7:00 am and noon vs between 1:00 pm and 6:00 pm. Although the study lacked sufficient power to fully resolve this issue, the researchers found that morning exposure to either awkward posture or manual tasks involving unstable loading was strongly associated with risk for LBP onset. Dr Ferreira said, "While we are not sure why back pain risk is highest in the morning, a possible explanation is that spinal discs swell with fluid overnight, and therefore the lower back will be more susceptible to stress and strains in the morning."

The researchers found no interaction between any trigger and habitual participation in physical activity, body mass index, number of previous LBP episodes, depression, or anxiety. The interaction analysis did reveal that the OR associated with manual tasks involving heavy loads decreased from 13.6 (95% confidence interval [CI], 5.4 - 34.5) at 20 years to 2.7 (95% CI, 1.5 - 4.7) at 60 years. "One potential reason for this may be that older people have learned to lift correctly or are more careful when handling heavy loads," the authors write.
"This is an excellent study, carefully and cleverly designed, well-powered, and well-reported by a great team with well-known world experts on the topic of [LBP]," Wolf E. Mehling, MD, associate professor of clinical family and community medicine, Osher Center for Integrative Medicine and Department of Family and Community Medicine, University of California, San Francisco, who was not involved in the study, told Medscape Medical News. "The strongest triggers were postural (how a manual task was done) and inattention towards the task at the moment of the task. The former is not really major news, as back school programs and occupational prevention programs are aware of this. The latter sounds trivial but may be a major topic, as present-moment awareness and mindfulness could be preventive measures." Dr Mehling suggested that morning hours and younger age may be associated with "a belief that one can do whatever one wants, a sense of having no limits, which with maturation of individuals or sustained back problems may slowly wane."
He added, "It is very reassuring that sex is not a trigger and that getting older may not be a problem." The researchers point out that their findings have major public health implications, highlighting the need for research on programs that modify the LBP triggers. They write, "[T]he burden of disease due to road traffic injury is far less than that for back pain, yet many countries devote considerable resources to controlling behaviors that increase the risk of road crashes." Dr Ferreira said, "For the first time, we were able to identify that brief exposure to hazardous activities will increase the risk of developing a sudden episode of moderate to severe back pain, strong enough to lead you to seek care for it. Until now, back pain was believed to be associated only with repetitive and ongoing exposure to these activities, so our results add important information on the length of exposure. We also add important information on the role of being fatigued and distracted while engaged in physical tasks, as this will dramatically increase our risk of developing back pain."
Early detection of osteoarthritis via blood test in sight, says study
March 23, 2015
EARLY DETECTION OF OSTEOARTHRITIS VIA BLOOD TEST IN SIGHT, SAYS STUDY
The first blood test for early-stage osteoarthritis could soon be developed, say researchers who suggest the biomarker they have identified can detect the painful joint condition before bone damage occurs.
The research, led by the University of Warwick in the UK, is published in the journal Scientific Reports.
The authors found that testing for citrullinated proteins (CPs) in the blood could lead to osteoarthritis (OA) being diagnosed years before physical symptoms emerge.
They also found that CPs may serve as a reliable way to detect early-stage rheumatoid arthritis (RA).
Lead researcher Dr. Naila Rabbani, reader of experimental systems biology at Warwick, says:
"This is a remarkable and unexpected finding. It could help bring early-stage and appropriate treatment for arthritis, which gives the best chance of effective treatment."
Dr. Rabbani and colleagues note that while there are established biomarker tests for early-stage RA, there are none for OA and suggest their findings could lead to a test for both that also distinguishes between the two.
CPs elevated in patients with early-stage OA, RA
Osteoarthritis is a joint disease associated with aging. It develops when the protective cartilage layer inside joints wears away because of continually being stressed over a person's lifetime. This type of arthritis normally affects the knees, hips, fingers and lower spine and rarely strikes before the age of 60.
The World Health Organization (WHO) estimate that about 9.6% of men and 18.0% of women over 60 have symptomatic osteoarthritis.
RA is a chronic systemic disease that affects not only the joints but also tendons, connective tissue, muscle and fibrous tissue. It is a disabling condition that often causes pain and deformity and tends to strike between the ages of 20 and 40.
According to WHO, RA affects up to 3% of the global population. The disease is more common in women and in developed countries, where around half of those who develop it are unable to hold a full-time job within 10 years of onset.
For their study, the researchers developed a method based on mass spectrometry to measure CPs in body fluids. They found that CPs were elevated in patients with early-stage OA and early-stage RA.
Previous studies had already established that people with RA have antibodies to CPs, but such antibodies are not found in early-stage OA.
Single blood test to detect and distinguish the two major forms of arthritis in early stages
In the next stage of their work, the team developed an algorithm that combined three biomarkers - CPs, anti-CP antibodies and hydroxyproline, a bone-derived compound - into one test.
Using the algorithm, they found that with a single blood test they could potentially detect and distinguish between the two types of arthritis before bone damage took place.
Dr. Rabbani explains that the algorithm uses the presence of autoimmunity to CPs in early-stage RA (antibodies are present) and the absence of such autoimmunity in early-stage OA (no antibodies) to distinguish between the two.
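The paper's exact algorithm is not given in this article, so the following is only a toy sketch of how the three markers might be combined into a single decision rule; the function name, marker scales, and thresholds are all hypothetical:

```python
def classify(cp, anti_cp, hyp, cp_cut=1.0, hyp_cut=1.0, anti_cp_cut=1.0):
    """Toy decision rule (all names and thresholds hypothetical):
    elevated CPs or hydroxyproline flag early joint damage;
    anti-CP antibodies then separate RA (present) from OA (absent)."""
    if cp <= cp_cut and hyp <= hyp_cut:
        return "no arthritis detected"
    return "early RA" if anti_cp > anti_cp_cut else "early OA"

print(classify(cp=2.3, anti_cp=0.2, hyp=1.8))  # prints: early OA
print(classify(cp=2.1, anti_cp=3.5, hyp=1.6))  # prints: early RA
```

The point of the sketch is the branching logic the researchers describe: damage markers detect disease, and the presence or absence of autoimmunity distinguishes its type.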
She explains that the team would have been satisfied with discovering the basis of a test for OA - but also finding they could discriminate between early-stage RA and other joint diseases was an added bonus:
"This discovery raises the potential of a blood test that can help diagnose both RA and OA several years before the onset of physical symptoms."
Meanwhile, Medical News Today recently reported how research is bringing closer the day when patients with knee osteoarthritis can benefit from stem cell therapy to regenerate damaged cartilage. In a study published in Stem Cells Translational Medicine, researchers reported successfully regrowing cartilage in rats using embryonic stem cells.
Written by Catharine Paddock PhD
Physical Activity May Reduce Cataract Risk
March 22, 2015
PHYSICAL ACTIVITY MAY REDUCE CATARACT RISK
High levels of total and long-term physical activity, as well as specific types of physical activity, may decrease the risk for age-related cataract later in life, researchers report in an article published in the February issue of Ophthalmology. Of 52,660 men and women 45 to 83 years of age who completed questionnaires to assess physical activity as part of two large population-based cohorts, 11,580 developed age-related cataract during a 12-year follow-up period, write Jinjin Zheng Selin, MSc, from the Division of Nutritional Epidemiology, Institute of Environmental Medicine, Karolinska Institutet, Stockholm, Sweden, and colleagues.
Participants in the highest quartile of physical activity had a 13% decreased risk of developing cataracts relative to those with the lowest levels of physical activity, after adjustments for multiple factors including fruit and vegetable intake, antioxidant supplement use, and alcohol intake (hazard ratio [HR], 0.87; 95% confidence interval [CI], 0.83 - 0.92). In addition, increased amounts of long-term total physical activity both at 30 years of age and at the beginning of the study (mean age, 59.4 years) decreased the risk for cataract by 24% compared with low levels of activity, according to the researchers (HR, 0.76; 95% CI, 0.69 - 0.85). Participants answered six questions about physical activity and inactivity habits during the previous year at baseline. The specific types and time spent on different physical activities included walking or bicycling (hardly ever to more than 1.5 hours daily), leisure time exercise (less than 1 hour to more than 5 hours weekly), work or occupational activity (mostly sitting to heavy manual labor), home or housework (less than 1 hour to more than 8 hours daily), and inactive leisure time (less than 1 hour to more than 6 hours daily).
When the investigators looked at specific activities, they found that walking or bicycling 60 minutes per day or more decreased the risk for cataract by 12% compared with hardly ever walking or bicycling, and work or occupational activity requiring heavy manual labor decreased the risk for cataract by 16% compared with mostly sedentary occupations. Compared with individuals reporting less than 1 hour of leisure time inactivity per day, those who were physically inactive for 6 or more hours of leisure time daily were 27% more likely to develop age-related cataract, they write.

"Our results on different types of physical activity suggest that being physically active on a regular daily basis may contribute to decreased risk of cataract, rather than short weekly episodes of exercising/training," Selin explained in an interview with Medscape Medical News. No association was observed between home or housework and risk for cataract. "This may be due to different types of tasks and activity levels performed," Selin said.

There may be several possible mechanisms behind the inverse association between high levels of physical activity and cataract risk, including lower levels of oxidative stress and inflammation and improved insulin resistance and lipid profiles, according to Selin. "Physical activity may also be related to a healthy lifestyle," he said. "Because this is an observational study, we cannot rule out the possibility of unmeasured or residual confounding. However, we have adjusted for several potential confounders in the analyses, and the results remained similar," Selin explained.

The new findings are consistent with previous data, according to Manuel B. Datiles III, MD, medical officer and senior investigator and senior attending ophthalmologist from the National Institutes of Health National Eye Institute in Bethesda, Maryland.
"In one of the previous papers, published in 2009, the author contends that when one is well conditioned, there is good evidence that one's defense against oxidative stress is also high, and this helps prevent cataract formation," Dr Datiles explained to Medscape Medical News. "Oxidative stress is one of the main causes of cataract, and antioxidants help lower one's risk for cataract." In addition, according to Dr Datiles, who was not involved in the current investigation, "there are many papers that deal with obesity — indirectly reflecting on lack of physical activity — and higher risk of cataracts. And researchers have also studied body mass index and correlated it with cataract risk, again a reflection of lack of physical activity." The current study extends the earlier data by prospectively examining the association of total and specific types of physical activity, as well as inactivity, with the risk for age-related cataract in a general population, Dr Datiles explained. At this time, surgical extraction is the only treatment for cataract. For this reason, "it is of particular importance to identify protective factors as well as risk factors for cataract," the authors write.
Researchers identify enzyme that causes heart failure
March 21, 2015
RESEARCHERS IDENTIFY ENZYME THAT CAUSES HEART FAILURE
A new treatment for heart failure could soon be on the cards, according to a new study. A research team - including scientists from Johns Hopkins Medicine - claims to have discovered an enzyme that triggers the condition, and medications that block this enzyme are already being tested for other diseases.
Senior investigator Dr. David Kass - professor of medicine at the Heart and Vascular Institute at Johns Hopkins University School of Medicine - and colleagues publish their findings in the journal Nature.
More than 5 million Americans have heart failure. The condition occurs when the heart is no longer able to pump enough blood and oxygen around the body to fuel other organs.
Common causes of heart failure include diabetes, high blood pressure and heart disease. But until now, it has been unclear as to exactly what happens in the heart to trigger the condition.
For the heart to function normally, the team explains, two signaling pathways need to be in working order. The chemicals nitric oxide and natriuretic peptide stimulate each pathway to produce a signaling molecule called cGMP, which activates a protein called PKG - the protector of the heart muscle. A breakdown in both of the signaling pathways is a cause of most heart failure cases.
Past research from Dr. Kass and colleagues found that an enzyme called PDE-5 is responsible for the breakdown in the first of the heart's signaling pathways. In this latest study, the team found that an enzyme called PDE-9 is responsible for the breakdown in the second signaling pathway.
How does PDE-9 trigger heart failure?
Earlier research revealed that an excess of PDE-5 causes damage in the first of the heart's signaling pathways by interfering with the signaling molecule cGMP and the protein PKG. In this latest study, the team found that too much PDE-9 triggers heart failure by specifically interfering with the type of cGMP produced by the second heart signaling pathway.
In detail, PDE-9 speeds up the breakdown of cGMP, which reduces PKG activation. This leaves the heart cells susceptible to defects, which can lead to scarring and damage in the heart muscle.
Commenting on the discovery, lead investigator Dong Lee, a cardiology research associate at Johns Hopkins University School of Medicine, says:
"Like a play with multiple characters, heart muscle function is the result of a complex but perfectly synchronized interaction of several proteins, enzymes and hormones.
Our findings reveal that, like two subplots that converge in the end of the play, PDE-5 and PDE-9 are independent rogue operators, each leading to heart muscle damage but doing so through different means."
Preserving function of just one signaling pathway 'could avert clinical disease'
The team notes that medications that block PDE-9 activity are currently undergoing testing for Alzheimer's disease - another condition in which the enzyme is believed to play a part - suggesting it may not be too long before the drugs can be used to treat heart failure.
On testing such medications in mouse models of heart failure, the researchers found they halted enlargement and scarring of the heart muscle. What is more, the drugs almost completely reversed the condition.
"We believe the identification of PDE-9 puts us on the cusp of creating precision therapies that target the second pathway or developing combined therapies that avert glitches in both pathways," says Dr. Kass.
In another experiment, the team gave mice with heart failure either a PDE-5 inhibitor, a PDE-9 inhibitor or a placebo for 4 weeks. They found that mice treated with the PDE-5 inhibitor or the PDE-9 inhibitor showed significant improvements in heart muscle size and function, and the drugs nearly restored the heart's pumping ability to normal.
Next, the researchers gave half of the treated mice a chemical that deactivated the signaling pathway regulated by PDE-5. On treating these mice with PDE-5 inhibitors, the team found the drugs had no effect on heart function. Treating mice with a PDE-9 inhibitor, however, led to significant improvements in heart function.
These findings, the researchers say, support the idea that heart failure is triggered by faults in two separate signaling pathways, with one regulated by PDE-5 and the other regulated by PDE-9.
"In practical terms," adds Dr. Kass, "this affirms that preserving the function in one pathway can avert clinical disease, even if the other one goes bad."
Written by Honor Whiteman
Breastfeeding for longer leads to smarter adults
March 19, 2015
BREASTFEEDING FOR LONGER LEADS TO SMARTER ADULTS
A study has found that prolonged breastfeeding is linked to higher intelligence, longer schooling and greater earnings as an adult.
The study, published in The Lancet Global Health, followed 3,493 infants born in Pelotas, Brazil. After an average of 30 years, the researchers measured their IQs and collected further information about their educational achievement and income.
"The effect of breastfeeding on brain development and child intelligence is well established, but whether these effects persist into adulthood is less clear," says lead author Dr. Bernardo Lessa Horta of Federal University of Pelotas in Brazil.
"Our study provides the first evidence that prolonged breastfeeding not only increases intelligence until at least the age of 30 years but also has an impact both at an individual and societal level by improving educational attainment and earning ability."
In the short-term, breastfeeding is known to reduce the prevalence of infectious diseases and mortality from them among infants. Exclusive breastfeeding is commonly recommended for the first 6 months after birth, to be continued alongside solid foods until at least the age of 1 year.
The Mayo Clinic describe breast milk as "the gold standard for infant nutrition" as it contains the right balance of nutrients for the baby while boosting its immune system.
Many previous observational studies of breastfeeding have been limited by social patterning. Mothers who breastfeed for longer durations have typically held higher socioeconomic positions, and their improved access to health care may lead to overestimating the health benefits of breastfeeding.
While previous studies have been criticized for this potential confounding factor, the authors of the new study address this issue.
"What is unique about this study is the fact that, in the population we studied, breastfeeding was not more common among highly educated, high-income women, but was evenly distributed by social class," Dr. Horta explains.
Results also suggest the amount of milk consumed could play a role
For the study, the subjects were divided into five groups based on the length of time they were breastfed as infants. The analysis also controlled for 10 variables that may have contributed to increases in IQ, such as family income at birth, maternal age and parental schooling.
The researchers found that breastfeeding led to increases in adult intelligence, longer schooling and higher adult earnings, and that the magnitude of the benefits was greater the longer a child was breastfed, up to 12 months.
Compared with infants who were breastfed for less than one month, infants breastfed for 12 months had four more IQ points, 0.9 years more schooling and earned $104 per month more on average. Dr. Horta believes there is a biological mechanism for the study's findings:
"The likely mechanism underlying the beneficial effects of breast milk on intelligence is the presence of long-chain polyunsaturated fatty acids (DHAs) found in breast milk, which are essential for brain development. Our finding that predominant breastfeeding is positively related to IQ in adulthood also suggests that the amount of milk consumed plays a role."
Although the researchers did not measure the characteristics of the infants' home environment or maternal-infant bonding, the researchers state that previous research suggests breastfed subjects have demonstrated improved cognitive functioning even after controlling for home environment and stimulation.
"Our results suggest that breastfeeding not only improves intelligence up to adulthood, but also has an effect at both the individual and societal level, by increasing educational attainment and earning ability," the authors conclude.
In contrast to the new research, Medical News Today reported on a study last year that suggested that breastfeeding may be no more beneficial than bottle-feeding for many long-term health outcomes.
Written by James McIntosh
Statin drug rosuvastatin 'does not deserve' best-selling slot
March 19, 2015
STATIN DRUG ROSUVASTATIN DOES NOT DESERVE BEST-SELLING SLOT
The best-selling statin drug, rosuvastatin, which is sold under the Crestor brand, "should not be used," according to a doctor writing in The BMJ - because the evidence of benefit has been weak, and there is growing evidence of side-effects.
Writing as founder of the health research arm of the consumer group Public Citizen, Dr. Sidney Wolfe says he hopes Crestor's position as the most prescribed brand-name drug in the US in 2014 "declines" - because, the opinion piece in The BMJ argues, the evidence of clinical benefit has "fallen" amid "more evidence of risks."
Dr. Wolfe suggests that Crestor's annual multibillion dollar success is explained by rosuvastatin having, milligram for milligram, the best cholesterol-lowering potency of all statins - a "fact exploited in advertising campaigns."
In spite of its success, the drug should have been withdrawn, the article argues, when Public Citizen first called on the US Food and Drug Administration (FDA) to consider "serious problems [that] were identified before rosuvastatin's [FDA] approval."
Dr. Wolfe expresses his exasperation at the persistent use of the statin brand, which is licensed for prevention of heart disease and stroke as well as to lower high cholesterol levels.
He asks: "Given the evidence of more serious risks and less clinical benefit than other statins, how has the drug fared so well for so long?"
The FDA license for Crestor specifies preventive prescribing for "slowing the progression of atherosclerosis" in addition to treating primary hyperlipidemia and other disorders of cholesterol levels. In the US in 2014, some 22.3 million prescriptions were filled for the drug.
The FDA license was updated in late 2010 to include further preventive use, but this "later approval to prevent heart attacks in a very selected group of people was based on the results of a study which was stopped early," says The BMJ in a press release, "prompting concern that the treatment effects may have been overestimated."
"There is also growing evidence that the drug carries a higher risk of serious adverse effects compared with other statins, such as an increased risk of developing diabetes."
Marketing campaigns in the 'statins war'
Dr. Wolfe believes safety concerns have not been taken into account in marketing activities for the drug amid a so-called statins war.
He describes a row that played out in another leading medical journal, The Lancet, about whether AstraZeneca, the pharmaceutical company responsible for Crestor, "pushed its marketing machine too hard and too fast."
That editorial was in 2003, and Dr Wolfe goes on to cite a 2004 warning from the FDA against AstraZeneca's marketing - specifically, a clarification about the accuracy of an advert that the company took out in response to Public Citizen's campaign against the drug, he says.
The FDA was concerned on that occasion about how far the company claimed its statin was safer than the other drugs in the class, and later warned the company again, writes Dr. Wolfe, about the way it made a claim in other promotions for the comparative efficacy of Crestor.
Dr. Wolfe concludes his argument against rosuvastatin by saying he hopes "the drug's disadvantages will lead to a sharp decline in its use" before the AstraZeneca patent for rosuvastatin expires in 2016.
He worries that, without the points he raises being widely heard, the drug could continue to enjoy success in the same way that other statin drugs have done so after coming off patent. Dr. Wolfe says:
"When patents expired for simvastatin, pravastatin and atorvastatin, the rise in generic prescriptions quickly equaled or exceeded the sharp decreases in brand name prescriptions."
Dr. Wolfe ends The BMJ's feature article by expressing hope that, because "AstraZeneca's need to promote" the drug will end in 2016 with the loss of the patent, the campaign against rosuvastatin will take effect "for the sake of the public's health."
MNT asked a spokesperson for AstraZeneca to respond to Dr. Wolfe's article. The company said: "Crestor is an effective treatment for lowering LDL-cholesterol and raising HDL-cholesterol, when compared to other statins, and it has been shown to slow the progression of atherosclerosis."
The company added that it took its commitment to patient safety "extremely seriously" and that Crestor "has a well-established safety profile." AstraZeneca also responded that Crestor "is approved by health care authorities in over 109 countries and used by tens of millions of patients worldwide."
In other news about cholesterol-lowering drugs this week, a new type of treatment, a monoclonal antibody, could be more effective than statins. The drug, evolocumab, has been submitted to the FDA and UK and EU regulators in application for marketing licenses.
Written by Markus MacGill
Fish-Oil Supplementation May Lower Thrombosis Risk
March 18, 2015
FISH-OIL SUPPLEMENTATION MAY LOWER THROMBOSIS RISK
Although lively debate has raged about whether fish-oil supplementation (FOS) really is beneficial for preventing cardiovascular events, new research suggests that it may reduce the "overall atherothrombotic risk profile" in patients with suspected coronary artery disease (CAD).
Results from the Multi-Analyte, Thrombogenic, and Genetic Markers of Atherosclerosis (MAGMA) study, presented in a poster here at the American College of Cardiology (ACC) 2015 Scientific Sessions, showed a significant association between FOS and decreased inflammation, thrombogenicity, and lipid markers. The effect was especially strong for those who were not taking lipid-lowering medications vs those who were, with significantly lower LDL-C, total VLDL-C, and triglycerides.
"In clinical-trial data, there's a lot of controversy about whether these supplements really affect outcomes," lead author Dr Paul A Gurbel (Sinai Center for Thrombosis Research, Baltimore, MD), told heartwire from Medscape.
"But if we personalize therapy with the fish oil to assess response based on fairly validated biomarkers, then I think there's hope for fish oil as a beneficial antithrombotic strategy—and maybe a competitor for statins," added Gurbel.
Although past research has shown that omega-3 polyunsaturated fatty acids (PUFAs) from fish oil are associated with cardiovascular benefits, how FOS affects thrombosis is "incompletely understood," note the investigators.
For the current study, they sought to measure lipid profile, inflammation, and thrombogenicity markers by using thromboelastography, aggregation, and "urinary 11-dehydrothromboxane B2 immediately prior to elective coronary angiography." Vertical density-gradient ultracentrifugation and AtherOx tests were also used.
A total of 600 patients with suspected CAD were enrolled in MAGMA. Of these, 128 were on FOS (67.2% men; mean age 64.4 years; body-mass index [BMI] 30.7) vs 472 who were not (61.7% men; mean age also 64.4 years; BMI 30.9). In addition, 70.3% of the FOS group was on lipid-lowering therapy (including statins, nonstatins, or both) vs 71.2% of the non-FOS group.
"This is a study we've been doing in our hospital of patients who have come to our cath laboratory. And we found some interesting results," reported Gurbel.
In the patient population as a whole, taking FOS was significantly associated with lower total VLDL-C (P=0.002), intermediate-density-lipoprotein cholesterol (P=0.02), triglycerides (P=0.04), and AtherOx-tested levels (P=0.02) vs not taking FOS. FOS users also had significantly lower urinary 11-dehydrothromboxane B2 levels (P=0.0007).
Patients who took FOS and were not on lipid-lowering medications also showed significantly lower levels of these measures, as well as lower LDL-C, remnant lipoproteins, and collagen-induced platelet aggregation. On the thrombogenicity profile, they also had lower thrombin-induced platelet fibrin clot strength (P=0.01) and "G," signifying clot firmness (P=0.0003).
FOS was not significantly associated with an influence on any of these measures in patients who also took statins.
Overall, the findings suggest that FOS "may have its most potent antiatherothrombotic effects in patients not on lipid-lowering therapy," write the investigators, adding that future studies should directly compare the effects of FOS vs statins.
"We really need to look at how these biomarkers in response to fish oil correlate with clinical outcomes. For example, if there's an anti-inflammatory response to fish oil, maybe that's a reason to keep a patient on fish oil. That's the kind of personalization that we're interested in," said Gurbel.
"Good News," but Outcomes Trials Needed
Dr Norman E Lepor (Cedars Sinai Medical Center, Los Angeles, CA) echoed Gurbel's comments to heartwire that the issue of using FOS "has been quite controversial" because of the lack of prospective, randomized outcome studies.
"At this moment, we don't really know whether fish oil prevents cardiovascular events. What's interesting is that this study looked at its effect on a variety of specific indicators of the propensity to clot and for the development of inflammation," said Lepor, who is not involved with this research.
"We can't necessarily say that benefits in either one or both of these will lead to a reduction in heart attacks, stroke, and death, which is what we're really concerned about," he noted. "However, these data show that patients who take [FOS] seem less predisposed to blood clotting, their platelets seemed less sticky, and at least the components of inflammation that were measured in the trial were reduced."
So, there is evidence that FOS provides some benefits, he said, adding that this is good news for cardiologists.
"But we're still waiting for clinical data that answer, at the end of the day: if we give these to patients, will they live longer? Are they less likely to have a heart attack? It's really the outcomes that's going to drive cardiologists to prescribe these medications."
Lepor noted that FOS is readily accepted now by the public and is often used by cardiologists to treat high triglyceride levels. However, "you'll see me prescribe more fish oils once we have the outcome data showing that there is a reduction in actual events."
Diabetes drugs may promote heart failure, study finds
March 17, 2015
DIABETES DRUGS MAY PROMOTE HEART FAILURE, STUDY FINDS
Patients who manage type 2 diabetes with drugs that lower glucose or blood sugar may be at higher risk for heart failure.
This was the finding of a comprehensive analysis of clinical trials covering more than 95,000 patients, reported in The Lancet Diabetes & Endocrinology. The study was also presented at the 64th Annual Scientific Session of the American College of Cardiology in San Diego, CA, earlier this week.
Heart failure - where the heart does not pump enough blood around the body at the right pressure - is a common condition in patients with type 2 diabetes.
Heart failure has a major impact on the quality of life of patients and is a major driver of health care costs in the US.
The Centers for Disease Control and Prevention (CDC) estimate heart failure costs the nation $32 billion each year. This figure includes the cost of health care services, medications and missed days of work.
For the new study, the investigators searched libraries of published studies for large, randomized controlled trials of type 2 diabetes glucose-lowering drugs or strategies that assessed cardiovascular outcomes.
Fourteen trials involving a total of 95,502 participants matched their criteria. They pooled and analyzed the data to calculate the relative risks of heart failure posed by each of the type 2 diabetes medications or treatments.
A heart failure event occurred in 4% of patients during the individual trials they participated in, while 9.8% suffered a major cardiovascular event, such as heart attack or stroke.
Fast facts about heart failure
- Breathlessness, tiredness and swollen ankles are the main symptoms of heart failure
- While it is a serious condition, it does not mean the heart has stopped beating
- About 5.1 million people in the US have heart failure.
14% increased risk of heart failure in patients on sugar-lowering drugs to manage diabetes
Lead investigator Dr. Jacob Udell, of the Peter Munk Cardiac Centre at the University Health Network (UHN) and the Women's College Hospital (WCH), both in Toronto, Canada, says they found:
"Patients randomized to new or more intensive blood sugar-lowering drugs or strategies to manage diabetes showed an overall 14% increased risk for heart failure."
He explains that the "increased risk was directly associated with the type of diabetes therapy that was chosen, with some drugs more likely to cause heart failure than others, compared with placebo or standard care."
Senior author Dr. Michael Farkouh, chair of the Peter Munk Centre of Excellence in Multinational Clinical Trials, adds:
"While some drugs showed an increased risk, other strategies tested, such as intensive weight loss to control blood sugar, showed a trend towards a lower risk for heart failure.
Overall, the results show that for every kilo of weight gain due to sugar-lowering diabetes treatment, there was an associated 7% higher risk of heart failure directly linked to that treatment.
The authors note that the relative increase in the risk of heart failure outweighed a 5% fall in heart attacks.
They also calculated that for around every 200 patients treated, there was one extra hospital admission for heart failure after an average follow-up of 4 years.
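The "one extra admission per 200 patients" figure is an example of a number needed to harm (NNH), the reciprocal of the absolute risk increase. A minimal sketch of the arithmetic (the event rates below are illustrative placeholders, not the study's patient-level data):

```python
# Illustrative sketch: number needed to harm (NNH) from an absolute
# risk difference. The rates used here are made-up placeholders,
# not rates reported in the study.
def number_needed_to_harm(risk_treated, risk_control):
    """NNH = 1 / absolute risk increase."""
    absolute_risk_increase = risk_treated - risk_control
    return 1 / absolute_risk_increase

# An absolute risk increase of 0.5 percentage points over follow-up
# corresponds to one extra event per 200 patients treated.
nnh = number_needed_to_harm(0.045, 0.040)
print(round(nnh))  # → 200
```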
Dr. Barry Rubin, medical director of the Peter Munk Cardiac Centre at UHN, says:
"The results of this study could prove to be the catalyst for how diabetes patients at risk for heart disease are managed moving forward."
Written by Catharine Paddock PhD
New cholesterol-lowering drug 'could halve risk of heart attack, stroke'
March 17, 2015
NEW CHOLESTEROL-LOWERING DRUG COULD HALVE RISK OF HEART ATTACK, STROKE
Currently, statin therapy is the standard treatment for many patients with high cholesterol. But a new study published in The New England Journal of Medicine claims a drug called evolocumab could be much more effective; it reduced cholesterol levels so dramatically that patients' risk of cardiovascular events - such as heart attack and stroke - fell by more than half, compared with those receiving standard therapy alone.
Lead study author Dr. Marc Sabatine, a senior physician at Brigham and Women's Hospital in Boston, MA, and colleagues recently presented their findings at the American College of Cardiology's 64th Annual Scientific Session in San Diego, CA.
The study was a 1-year extension of 12 phase 2 and 3 clinical trials that had assessed evolocumab's ability to reduce levels of low-density lipoprotein (LDL) cholesterol - commonly referred to as "bad" cholesterol because of the role it plays in blocking the arteries.
According to the Centers for Disease Control and Prevention (CDC), around 71 million Americans have high LDL cholesterol - blood levels at 160 milligrams per deciliter (mg/dL) or higher. High LDL cholesterol can raise the risk of heart attack, stroke and heart disease.
The 4,465 patients involved in the study had been a part of at least one of the previous trials investigating evolocumab, which works by blocking a protein that stops the liver from removing LDL cholesterol from the blood - called proprotein convertase subtilisin-kexin 9 (PCSK9).
Of the participants, 2,976 were randomized to receive an injection of evolocumab under the skin every 2 or 4 weeks plus standard therapy, while 1,489 patients received standard therapy alone, which mostly involved moderate- or high-intensity statin therapy. The average follow-up duration was 11.1 months.
The study was open-label, meaning the participants were fully aware of the treatment they were receiving, as were the researchers. However, a central committee that reviewed the data - assessing the effects of evolocumab on LDL cholesterol levels and reporting any cardiovascular events during follow-up - was blinded to the treatment groups.
Evolocumab linked to 53% reduction in cardiovascular events
At study baseline, the average LDL cholesterol level among participants was 120 mg/dL, which the researchers say is similar to the average level found among the general population.
However, the team found that patients treated with evolocumab experienced an average 61% reduction in LDL cholesterol levels. Within 12 weeks, LDL cholesterol levels fell to less than 100 mg/dL - defined as the optimal range - in 90.2% of evolocumab-treated patients, while levels reached 70 mg/dL or less in 73.6% of patients who received the drug. These reductions were sustained throughout the entire follow-up period, the researchers report.
In comparison, only 26% of patients who received standard therapy alone saw their LDL cholesterol levels fall below 100 mg/dL, while only 3.8% had such levels fall below 70 mg/dL.
What is more, compared with patients who received standard therapy alone, those treated with evolocumab experienced a 53% reduction in cardiovascular events, including heart attack, stroke, hospitalization, angioplasty and death; evolocumab-treated patients had a 0.95% risk of a cardiovascular event during follow-up, while standard-therapy patients had a 2.18% risk.
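For readers wanting to check the arithmetic, a relative risk reduction compares the two event rates directly. Using the rates quoted above (0.95% vs 2.18%), the crude reduction works out at roughly 56%; the 53% headline figure comes from the study's adjusted hazard-based analysis, so a small discrepancy between the two is expected. A quick sketch:

```python
def relative_risk_reduction(risk_treated, risk_control):
    """Fractional reduction in risk relative to the control group."""
    return (risk_control - risk_treated) / risk_control

# Event rates quoted in the article, as fractions rather than percentages
rrr = relative_risk_reduction(0.0095, 0.0218)
print(f"{rrr:.0%}")  # → 56%
```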
The team says these results remained even after accounting for patients' age, baseline LDL levels, statin use, primary or secondary prevention and incidence of valve disease. Evolocumab was also found to be well tolerated by patients.
However, the researchers admit their study is subject to some limitations. They note, for example, that the number of cardiovascular events in the study was relatively small, with only 60 identified.
Still, the team believes the findings show that not only is evolocumab effective for dramatically reducing cholesterol levels, but it may be effective for rapid risk reduction of cardiovascular events. Dr. Sabatine adds:
"The reduction in LDL was profound and that may be why we saw a marked reduction in cardiovascular events so quickly. It suggests that if we can drive a patient's LDL cholesterol down a large amount to a very low level, we may start to see a benefit sooner than would be expected with a more modest intervention."
The researchers note that because evolocumab works differently from statins - which block an enzyme in the liver that is responsible for making cholesterol - the drug also holds hope for patients who do not respond to statins or who are unable to tolerate them.
Evolocumab is currently undergoing further testing in a clinical trial involving more than 27,500 patients, the results of which are expected in 2017. Though Dr. Sabatine notes that no definitive conclusions about evolocumab's effectiveness can be made until then, this current study shows promise.
"We know from previous research that evolocumab lowers LDL cholesterol, but these data offer support for their potential to reduce major adverse cardiovascular events in our patients," Dr. Sabatine adds.
The study was funded by Amgen - the biopharmaceutical company that manufactures evolocumab.
Written by Honor Whiteman
Folic acid supplementation linked to reduced risk of first stroke in people with hypertension
March 16, 2015
FOLIC ACID SUPPLEMENTATION LINKED TO REDUCED RISK OF FIRST STROKE IN PEOPLE WITH HYPERTENSION
A new study published in JAMA finds that a combination of folic acid supplementation and hypertension medication may be an effective way to reduce the risk of first stroke among adults with high blood pressure.
Each year, more than 795,000 people in the US have a stroke. Of these, around 610,000 are first-time strokes. High blood pressure, or hypertension, is a known risk factor for stroke. According to the Centers for Disease Control and Prevention (CDC), around 8 in 10 first-time strokes are among people with high blood pressure.

Past studies looking at the effects of folic acid supplementation for prevention of cardiovascular disease have indicated that the vitamin may be effective for reducing stroke risk. But the investigators of this latest research - including Dr. Yong Huo of Peking University First Hospital in Beijing, China - say no studies have had stroke as the primary outcome, making it difficult to make a firm connection between the two.

As such, the team set out to assess the link between folic acid supplementation and stroke risk among 20,702 adults from China aged 45-75 years. All adults had hypertension, but they had no history of stroke or heart attack at study baseline. Variations in the MTHFR C677T genotypes (CC, CT or TT) - which can affect folate levels - were assessed among participants, and their folate levels were measured at study baseline. Between May 2008 and August 2013, participants were randomized to receive either 10 mg of enalapril - a drug commonly used to treat high blood pressure - and 0.8 mg of folic acid daily, or a daily 10 mg dose of enalapril alone.

Folic acid is a B vitamin that the body needs for healthy cell production. A lack of folic acid can lead to anemia and other health complications. It is highly recommended that women increase their intake of folic acid prior to and during pregnancy, as studies have suggested it can significantly reduce the risk of major birth defects, such as spina bifida and anencephaly.
Treatment with folic acid and enalapril reduced stroke risk by 21%
During the median 4.5-year follow-up period, 282 (2.7%) participants who were treated with both enalapril and folic acid had a first stroke, compared with 355 (3.4%) participants treated with enalapril.
The team calculated that participants treated with both enalapril and folic acid were at 21% lower risk of stroke, compared with participants treated with enalapril alone. Treatment with enalapril and folic acid also represented a 0.7% reduction in absolute risk of first-time stroke, the researchers found.
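The 21% relative reduction and 0.7% absolute reduction both follow from the two event rates (2.7% vs 3.4%), and the reciprocal of the absolute difference gives a rough number needed to treat. A sketch of the calculation:

```python
# Event rates reported in the article, as fractions
risk_combo = 0.027      # first stroke with enalapril + folic acid
risk_enalapril = 0.034  # first stroke with enalapril alone

absolute_reduction = risk_enalapril - risk_combo
relative_reduction = absolute_reduction / risk_enalapril
nnt = 1 / absolute_reduction  # patients treated per stroke averted

print(f"{relative_reduction:.0%}")  # → 21%
print(f"{absolute_reduction:.1%}")  # → 0.7%
print(round(nnt))                   # → 143
```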
A lower relative risk of ischemic stroke was also identified among participants treated with enalapril and folic acid, and these participants were also at lower risk of combinations of cardiovascular events, including heart attack, stroke and cardiovascular death. The team found that participants with TT genotypes were most likely to benefit from combination treatment with enalapril and folic acid, as were participants who had low folate levels at study baseline. Commenting on their findings, the authors say:
"We speculate that even in countries with folic acid fortification and widespread use of folic acid supplements, such as in the United States and Canada, there may still be room to further reduce stroke incidence using more targeted folic acid therapy - in particular, among those with the TT genotype and low or moderate folate levels."
In an editorial linked to the study, Dr. Meir Stampfer and Dr. Walter Willet, of the Harvard T. H. Chan School of Public Health and Channing Division of Network Medicine in Boston, MA, say these findings hold important implications for the prevention of stroke across the globe, noting that the results are also likely to apply to populations without high blood pressure. "Ideally, adequate folate levels would be achieved from food sources such as vegetables (especially dark green leafy vegetables), fruits and fruit juices, nuts, beans and peas. However, for many populations, achieving adequate levels from diet alone is difficult because of expense or availability," they add. "This study seems to support fortification programs where feasible, and supplementation should be considered where fortification will take more time to implement."
Written by Honor Whiteman
Satisfying thirst and the kidneys: the importance of drinking water
March 12, 2015
SATISFYING THIRST AND THE KIDNEYS: THE IMPORTANCE OF DRINKING WATER
Many people may take drinking water for granted, but keeping hydrated can have a huge impact on overall health. Despite how crucial it is that people drink enough water, a significant number of people may be failing to drink recommended levels of fluids each day.
Around 70% of the body is composed of water, and around 71% of the planet's surface is covered by water. Perhaps it is the ubiquitous nature of water that means that drinking enough of it each day is not at the top of many people's lists of healthy priorities?

One part of the body that relies on adequate water intake is the kidneys. The kidneys are organs that might not get as much attention as the heart or lungs, but they are responsible for many functions that help keep the body as healthy as possible. But what happens to the kidneys when we do not drink enough water? And what can be done to improve our levels of hydration? On World Kidney Day, we take a look at the role of drinking enough water for two of the most important organs in the body.
Why do we need to drink water?
Water is needed by all the cells and organs in the body in order for them to function properly. It is also used to lubricate the joints, protect the spinal cord and other sensitive tissues, regulate body temperature and assist the passage of food through the intestines.

Although some of the water required by the body is obtained through foods with a high water content - soups, tomatoes, oranges - the majority is gained through drinking water and other beverages. During normal everyday functioning, water is lost by the body, and this needs to be replaced. It is noticeable that we lose water through activities such as sweating and urination, but water is even lost when breathing.

Drinking water, be it from the tap or a bottle, is the best source of fluid for the body. Beverages such as milk and juices are also decent sources of water, but beverages containing alcohol and caffeine, such as soft drinks, coffee and beer, are less than ideal due to their diuretic properties, meaning that they cause the body to release water.

The recommended amount of water that should be drunk per day varies from person to person depending on factors such as how active they are and how much they sweat. There is no universally agreed upon threshold of water consumption that must be reached, but there is a general consensus as to what a healthy amount is.
According to the Institute of Medicine (IOM), an adequate intake for men is approximately 13 cups (3 liters) a day. For women, an adequate intake is around 9 cups (2.2 liters).
Many people may have heard the phrase, "Drink eight 8-ounce glasses of water a day," which works out at around 1.9 liters and is close to the IOM's recommendation for women. Drinking "8 by 8" is an easy-to-remember amount that can put people on the right track in terms of water consumption. Water also helps dissolve minerals and nutrients so that they are more accessible to the body, as well as helping transport waste products out of the body. It is these two functions that make water so vital to the kidneys.
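The "8 by 8" figure is easy to verify: 8 glasses of 8 US fluid ounces comes to roughly 1.9 liters. A small conversion sketch:

```python
# Convert the "eight 8-ounce glasses" rule of thumb to liters
US_FL_OZ_ML = 29.5735  # milliliters per US fluid ounce

glasses = 8
ounces_per_glass = 8
total_liters = glasses * ounces_per_glass * US_FL_OZ_ML / 1000

print(round(total_liters, 2))  # → 1.89
```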
What do the kidneys do?
The kidneys are two fist-sized organs shaped like beans. They are located in the middle of the back, on either side of the spine, just below the rib cage. Despite their importance, the kidneys can sometimes receive less attention than other organs in the body.

"The role of the kidneys is often underrated when we think about our health," state Kidney Health Australia. "In fact, the kidneys play an important role in the daily workings of our body. They are so important to health that nature gave us two kidneys to cover the possibility that one might be lost to an injury. They are so important that with no kidney function, death occurs within a few days."

A crucial function of the kidneys is to remove waste products and excess fluid from the body via urine. The kidneys also regulate the levels of salt, potassium and acid in the body and produce hormones that influence the performance of other organs.

When we eat and drink, nutrients and minerals enter the bloodstream in order to be transported around the body and used for energy, growth, maintenance or repair. The blood also passes through the kidneys, where it is filtered, and any waste products and excess nutrients and water are removed and sent to the bladder for expulsion. Every day, the kidneys filter around 200 quarts of fluid. Of these, approximately 2 quarts are removed from the body in the form of urine, and 198 are recovered by the bloodstream.

If the kidneys do not function properly because of kidney disease, waste products and excess fluid can build up inside the body. Untreated, chronic kidney disease can lead to kidney failure, whereby the organs stop working and either dialysis or kidney transplantation is required.

Water is important for the workings of the kidneys, not only for helping to initially dissolve the nutrients, but for ensuring that waste products, bacteria and proteins do not build up in the kidneys and the bladder.
These can lead to dangerous infections and painful kidney stones.
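The filtration figures above imply that the kidneys return about 99% of filtered fluid to the bloodstream, which is why even modest shortfalls in water intake matter. A quick check of the numbers:

```python
# Daily kidney filtration figures quoted in the article
filtered_quarts = 200  # fluid filtered by the kidneys per day
excreted_quarts = 2    # removed from the body as urine

recovered = filtered_quarts - excreted_quarts
fraction_recovered = recovered / filtered_quarts

print(recovered)                    # → 198
print(f"{fraction_recovered:.0%}")  # → 99%
```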
How does not drinking enough affect the kidneys?
Urinary tract infections (UTIs) are the second most common type of infection in the body and account for around 8.1 million visits to health care providers in the US every year. If infections spread to the upper urinary tract, including the kidneys, permanent damage can be caused. Sudden (acute) kidney infections can be life-threatening, particularly if septicemia occurs.

Drinking plenty of water is one of the simplest ways to reduce the risk of developing a UTI and is also advised for people who have developed an infection. The presence of kidney stones can complicate UTIs, as they can compromise how the kidneys work. Complicated UTIs tend to require longer courses of antibiotics, typically lasting between 7 and 14 days.

The leading cause of kidney stones is a lack of water, and they are commonly reported in people who do not drink the recommended daily amount of water. As well as complicating UTIs, research has suggested that kidney stones also increase the risk of developing chronic kidney disease. In November 2014, the American College of Physicians issued new guidelines for people who have previously developed kidney stones, stating that increasing fluid intake to enable 2 liters of urination a day could decrease the risk of stone recurrence by at least half, with no side effects.

Dehydration - using and losing more water than the body takes in - can also lead to an imbalance in the body's electrolytes. Electrolytes, such as potassium, phosphate and sodium, help carry electrical signals between cells. The levels of electrolytes in the body are kept stable by properly functioning kidneys. When the kidneys are unable to maintain a balance in the levels of electrolytes, these electrical signals become mixed up, which can lead to seizures, involving involuntary muscle movements and loss of consciousness. In severe cases, dehydration can also lead to kidney failure, a potentially life-threatening outcome.
Possible complications of chronic kidney failure include anemia, damage to the central nervous system, heart failure and a compromised immune system. There are a considerable number of health problems that can occur simply through not drinking enough water, and yet researchers have found that a significant number of Americans may be failing to obtain the recommended levels of fluid intake every day.
Does the US not drink enough water?
A study carried out by the Centers for Disease Control and Prevention (CDC) in 2013 analyzed data from the National Cancer Institute's 2007 Food Attitudes and Behaviors Survey, in order to assess the characteristics of people who have a low intake of drinking water. Out of a sample of 3,397 adults, the researchers found the following:
- 7% of adults reported no daily consumption of drinking water
- 36% of adults reported drinking 1-3 cups of drinking water a day
- 35% of adults reported drinking 4-7 cups of drinking water a day
- 22% of adults reported drinking 8 cups or more a day.
People were more likely to drink less than 4 cups of drinking water daily if they consumed 1 cup or less of fruits or vegetables a day. The study indicates that among this sample, a large number of people may well have not been drinking the suggested 8 cups of fluid a day. Although the study only measured the intake of drinking water and fluid can be gained from other beverages, water is the ideal source of fluid due to it being readily available, calorie-free, caffeine-free and alcohol-free.
The fact that 7% of respondents reported drinking no water at all, and that low water intake was associated with low levels of fruit and vegetable consumption, suggests that a certain number of people are risking their health by not getting enough fluid.
Even if the respondents reporting low levels of water intake were obtaining enough fluid, it is likely that they would be obtaining it from sources that could potentially compromise their health in other ways. "The biologic requirement for water may be met with plain water or via foods and other beverages," write the study authors. "Results from previous epidemiologic studies indicate that water intake may be inversely related to volume of calorically sweetened beverages and other fluid intake."
The CDC make a number of suggestions that could help people increase the amount of water they normally drink:
- Carrying a water bottle with you means that fluid can be accessed when out and about, at work or running errands
- This water can be frozen in freezer-safe water bottles to provide a supply of ice-cold water all day long, which can be more satisfying than other beverages in certain situations
- Adding a wedge of lime or lemon to water can give it a different edge that may improve its taste without affecting its nutritional value.
Drinking enough should be an easily achievable health goal. "Under normal conditions, most people can drink enough fluids to meet their water needs," state the CDC. Although it is a relatively simple step to take, it can easily get overlooked as part of increasingly hectic lifestyles. On World Kidney Day, it is worth remembering the risks that can come from not getting enough fluids, and to raise a glass - ideally filled with water - to those underrated organs that do so much for the health of the body.
Written by James McIntosh
A 'big first step' toward slowing the aging process with new class of drugs
March 11, 2015
A BIG FIRST STEP TOWARD SLOWING THE AGING PROCESS WITH NEW CLASS OF DRUGS
The results of a new study have pushed researchers one step closer to developing drugs that slow the human aging process. Scientists from The Scripps Research Institute in Jupiter, FL, the Mayo Clinic in Rochester, MN, and colleagues have identified a new class of drugs that dramatically improved cardiac function, reduced symptoms of frailty and prolonged the healthy lifespan of mice.
Published in the journal Aging Cell, the study reveals how the newly discovered drugs - named "senolytics" - successfully target and kill aging-related senescent cells without damaging other cells nearby.

Senescent cells are cells that stop dividing as we age. They accumulate in various body tissues, secreting proteins that cause damage to surrounding healthy cells and tissues. Senescent cells speed up the aging process and play a significant role in the development of age-related diseases.

The research team - led by Prof. Paul Robbins and Dr. Laura Niedernhofer of The Scripps Research Institute (TSRI) - already knew that killing senescent cells in mice could increase their healthy lifespan, and hypothesized that doing so in humans would have a similar effect. However, the team needed to find a way to target and kill the senescent cells while avoiding harm to surrounding cells.

In their study, the researchers found that - just like cancer cells - senescent cells have increased expression of "pro-survival networks" that allow them to resist programmed cell death, or apoptosis. As such, the team set out to identify drugs that target senescent cells and induce apoptosis.
A single dose produced significant anti-aging effects
From testing 46 drugs on human senescent cells in culture, the researchers identified two that showed promise: a cancer drug called dasatinib (brand name Sprycel), and an antihistamine and anti-inflammatory supplement called quercetin. Used together, the researchers found the compounds effectively induced apoptosis in senescent cells.
On testing a combination of the two drugs in mouse models, the team found they significantly improved cardiovascular function, boosted exercise endurance, reduced osteoporosis and frailty, and dramatically extended the animals' lifespan. "Remarkably, in some cases, these drugs did so with only a single course of treatment," says Dr. Niedernhofer.
In detail, a single dose of senolytics improved the cardiovascular function of older mice within 5 days, while one dose was found to significantly boost exercise endurance in weak mice that had been exposed to radiation therapy. The team says these effects lasted for at least 7 months. Among mice with accelerated aging, the researchers found regular administration of the senolytics delayed age-related symptoms, spine degeneration and osteoporosis, and increased their healthy lifespan. Senior study author Dr. James Kirkland, head of the Mayo Clinic Kogod Center on Aging, explained the findings further in an accompanying video.
Commenting on their findings, Prof. Robbins says the team sees this study as a "big first step" toward developing drugs that extend patients' healthy lifespan and tackle age-related diseases. "When senolytic agents, like the combination we identified, are used clinically, the results could be transformative," he adds. While both drugs are already approved for separate use in humans, the researchers note that further testing is required to determine if combination use would be safe. They point out that the drugs may have side effects, particularly if used long term. Still, the team remains very optimistic about the findings. Dr. Kirkland says:
"If translatable to humans - which makes sense as we were using human cells in many of the tests - this type of therapy could keep the effects of aging at bay and significantly extend the healthspan of patients."
Written by Honor Whiteman
Largest Risk for Diabetes With Statins Yet Seen, in New Study
March 5, 2015
LARGEST RISK FOR DIABETES WITH STATINS YET SEEN, IN NEW STUDY
Statin therapy appears to increase the risk for type 2 diabetes by 46%, even after adjustment for confounding factors, a large new population-based study concludes. This suggests a higher risk for diabetes with statins in the general population than has previously been reported, which has been in the region of a 10% to 22% increased risk, report the researchers, led by Henna Cederberg, MD, PhD, from the University of Eastern Finland and Kuopio University Hospital, and colleagues, who published their study online March 4 in Diabetologia.
The majority of people in this new study were taking atorvastatin and simvastatin, and the risk for diabetes was dose-dependent for these two agents, the researchers found. Nevertheless, senior author Markku Laakso, MD, from the University of Eastern Finland and Kuopio University Hospital, told Medscape Medical News: "Even if statin treatment is increasing the risk of getting diabetes, statins are very effective in reducing cardiovascular risk.
"Therefore I wouldn't make a conclusion from my study that people should stop statin treatment, especially those patients who have a history of myocardial infarction or so on. "But what I would say is that people who are at the higher risk, if they are obese, if they have diabetes in the family, etc, should try to lower their statin dose, if possible, because high-dose statin treatment increases the risk vs lower-dose statin treatment," he continued.
Asked to comment, Alvin C Powers, MD, from Vanderbilt University School of Medicine, Nashville, Tennessee, explained that there were limitations to the conclusions that could be drawn from this study. Speaking as part of the Endocrine Society, he said: "The first thing is that this study did not examine the benefits of statin therapy, it examined only the risk of diabetes." With every treatment, there are risks and benefits, and the benefits of statins have been clearly proven in certain situations. In those instances, "the benefit would outweigh the increased risk of diabetes for many people," Dr Powers told Medscape Medical News.
Statins Appear to Affect Insulin Secretion and Sensitivity
Dr Cederberg and colleagues explain that previous studies have suggested an increased risk of developing diabetes, of varying levels, associated with statin use. However, in many of these, study populations have been selective, especially in statin trials, which have included participants at high risk for cardiovascular disease. Hence, the risk for diabetes in clinical trials is likely to differ from that in the general population. And very often, in previous studies the diagnosis of diabetes has been based on self-reported diabetes or fasting glucose measurement, leading to an underestimation of the actual numbers of incident diabetes cases.
In this new study, the authors investigated the effects of statin treatment on blood glucose control and the risk for type 2 diabetes in 8749 nondiabetic men aged 45 to 73 years in a 6-year follow-up of the population-based Metabolic Syndrome in Men (METSIM) trial, based in Kuopio, Finland. The authors also investigated the mechanisms of statin-induced diabetes by evaluating changes in insulin resistance and insulin secretion.

Diabetes was diagnosed via an oral glucose tolerance test (OGTT), HbA1c levels ≥ 6.5% (48 mmol/mol), or by having started glucose-lowering medication. During the follow-up, 625 of the participants were diagnosed with diabetes. OGTT-derived indices were used to assess insulin sensitivity and secretion.

Statins were taken by 2412 individuals. The drugs were associated with an increased risk for type 2 diabetes even after adjustment for age, body mass index, waist circumference, physical activity, smoking, alcohol intake, family history of diabetes, and beta-blocker and diuretic treatment, at a hazard ratio (HR) of 1.46. The risk was found to be dose-dependent for simvastatin and atorvastatin, which were taken by 388 and 1409 participants, respectively. High-dose simvastatin was associated with an HR of 1.44 for diabetes vs 1.28 for low-dose therapy, while the HR for diabetes with high-dose atorvastatin was 1.37.
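The headline "46% increased risk" is simply the overall hazard ratio of 1.46 expressed as a percentage increase over baseline risk (HR − 1), and the dose-dependence the authors describe can be read off the same way. A minimal sketch using the hazard ratios quoted above:

```python
def hr_to_percent_increase(hr):
    """Express a hazard ratio as a percent increase in risk vs baseline."""
    return (hr - 1) * 100

# Hazard ratios reported in the study
hazard_ratios = {
    "statins overall": 1.46,
    "high-dose simvastatin": 1.44,
    "low-dose simvastatin": 1.28,
    "high-dose atorvastatin": 1.37,
}

for label, hr in hazard_ratios.items():
    print(f"{label}: +{hr_to_percent_increase(hr):.0f}%")
# statins overall: +46%, high-dose simvastatin: +44%, etc.
```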
Statin therapy was also associated with a significant increase in 2-hour glucose (P = .001) and the glucose area under the curve at follow-up (P < .001), as well as a nominally significant increase in fasting plasma glucose (P = .037). Furthermore, individuals taking statins had a 24% decrease in insulin sensitivity and a 12% reduction in insulin secretion compared with those not receiving the drugs. These decreases were again dose-dependent for atorvastatin and simvastatin. Although pravastatin, fluvastatin, and lovastatin were found to be less diabetogenic than atorvastatin and simvastatin, the number of participants taking these agents was too small to reliably estimate their individual effects on the risk for diabetes, the research team notes.
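The article does not spell out which OGTT-derived indices were used, but a widely used insulin sensitivity measure computed from a 2-hour OGTT is the Matsuda index. The sketch below uses purely illustrative glucose and insulin values, not METSIM data:

```python
from statistics import mean

def matsuda_index(glucose_mgdl, insulin_uUml):
    """Matsuda whole-body insulin sensitivity index from OGTT samples.
    Expects parallel lists of glucose (mg/dL) and insulin (uU/mL) taken
    at 0, 30, 60, 90, and 120 minutes; index 0 is the fasting sample."""
    g0, i0 = glucose_mgdl[0], insulin_uUml[0]
    g_mean, i_mean = mean(glucose_mgdl), mean(insulin_uUml)
    return 10000.0 / (g0 * i0 * g_mean * i_mean) ** 0.5

# Illustrative values only (not from the study):
glucose = [95, 160, 140, 120, 105]   # mg/dL over the 2-hour OGTT
insulin = [8, 60, 50, 35, 25]        # uU/mL
print(round(matsuda_index(glucose, insulin), 1))
```

Lower values indicate poorer insulin sensitivity, so a 24% decrease in such an index among statin users would correspond to a proportional drop in this number.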
Which Patients Should Take Statins?
Discussing the take-home message for prescribers seeking to balance the risk for diabetes with the benefits of statin therapy, Dr Laakso reiterated that individuals with a history of cardiovascular events and high LDL cholesterol "should definitely take statins." However, he emphasized that the main aim of statins is to prevent a recurrent cardiovascular event, so individuals need to have had one event to start statin therapy. "But in primary prevention, especially in women, who are at a lower risk of getting cardiovascular disease, maybe we should be more careful when we start statin treatment?" he ventured. "Statins are not meant to be a treatment for everybody." Dr Powers observed that this new study doesn't provide any information about whether people who have diabetes who are on a statin should continue with the statin, "but there are clear benefits for statin therapy in people who have diabetes. People who have diabetes who are on a statin should continue with the statin.…This increased risk of diabetes, to me, is not relevant to their reason for taking the statin," he commented.
And in diabetes patients who have heart disease and are taking a statin, "the risk/benefit ratio would clearly be in the direction of benefit," Dr Powers observed. In individuals who do not have diabetes and who are taking a statin, for example to reduce their risk for cardiovascular disease, "statin therapy has to be considered in the context of what's the benefit of the statin therapy in that group…especially in individuals who are genetically susceptible to type 2 diabetes or who have prediabetes," he continued. "Those individuals will need to be monitored for the development of diabetes." "People who are taking statins should keep taking statins, if there's an appropriate reason for them taking a statin. The risk/benefit ratio in most people is in favor of benefit; the risk is outweighed by that benefit," he concluded. This work has been supported by the Academy of Finland, the Finnish Diabetes Research Foundation, the Finnish Cardiovascular Research Foundation, the Strategic Research Funding from the University of Eastern Finland, Kuopio, and a grant from Kuopio University Hospital. The authors have reported no relevant financial relationships.
Acetaminophen (Paracetamol) Risks May Have Been Underestimated
March 3, 2015
ACETAMINOPHEN (PARACETAMOL) RISKS MAY HAVE BEEN UNDERESTIMATED
Paracetamol, known as acetaminophen in the United States, may have more risks than originally thought, particularly when it is taken at the higher end of standard therapeutic doses, according to a new systematic review. The authors and an outside expert recommend caution when interpreting the data, as they are observational in nature and are subject to uncontrolled confounders. That said, the authors do note that the dose–response curves seen for each adverse outcome examined suggest "a considerable degree of paracetamol toxicity especially at the upper end of standard analgesic doses." Emmert Roberts, from South London and the Maudsley Mental Health Trust, Maudsley Hospital, London, United Kingdom, and colleagues present their findings in an article published online March 1 in BMJ.
"Paracetamol is the most widely used over-the-counter and prescription analgesic worldwide. It is the first step on the [World Health Organization] pain ladder and is currently recommended as first-line pharmacological therapy by a variety of international guidelines for a multitude of acute and chronic painful conditions," the authors write. They conducted a systematic literature review to determine the adverse event profile of paracetamol by searching Medline and Embase from the date of inception to May 1, 2013. They identified observational studies written in English that reported mortality, cardiovascular, gastrointestinal, or renal adverse events in adults in the general population who took standard analgesic doses of paracetamol. Ultimately, they included eight of 1888 studies retrieved. All of the included studies were cohort studies. The researchers assessed study quality using Grading of Recommendations Assessment, Development and Evaluation. They pooled or adjusted summary statistics for each outcome.
Both studies that examined mortality risk among adults who took paracetamol and those who did not found an elevation in overall risk. In one study, the standardized mortality ratio was 1.9 (95% confidence interval [CI], 1.88 - 1.94) for those taking the drug. The other study showed an overall risk of 1.28 (95% CI, 1.26 - 1.30), as well as a dose-response increase in the relative rate of mortality from 0.95 (95% CI, 0.92 - 0.98) at the lowest exposure, compared with nonusers, to 1.63 (95% CI, 1.58 - 1.68) at the highest exposure. All four studies that reported cardiovascular adverse events found a dose-response, with one study demonstrating an increase in the risk ratio for all cardiovascular events from 1.19 (95% CI, 0.81 - 1.75) at the lowest exposure to 1.68 (95% CI, 1.10 - 2.57) at the highest. One study that reported gastrointestinal adverse events found a dose-response, with the relative rate of gastrointestinal adverse events or bleeding increasing from 1.11 (95% CI, 1.04 - 1.18) to 1.49 (95% CI, 1.34 - 1.66). Four studies reported renal adverse events; of those, three found a dose-response, with one reporting an odds ratio for a 30% or greater decrease in estimated glomerular filtration rate increasing from 1.40 (95% CI, 0.79 - 2.48) to 2.19 (95% CI, 1.40 - 3.43).
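Confidence intervals for relative rates like these are conventionally computed on the log scale, which is why they sit slightly asymmetrically around the point estimate. As a sketch (using the highest-exposure mortality estimate above, 1.63 with 95% CI 1.58 - 1.68), the standard error can be recovered from a reported interval and used to rebuild it:

```python
import math

def log_scale_se(lower, upper):
    """Approximate the standard error of a log relative rate from its
    reported 95% CI: SE = (ln(upper) - ln(lower)) / (2 * 1.96)."""
    return (math.log(upper) - math.log(lower)) / (2 * 1.96)

def ci_from_rr(rr, se):
    """Rebuild a 95% CI for a relative rate on the log scale."""
    return (math.exp(math.log(rr) - 1.96 * se),
            math.exp(math.log(rr) + 1.96 * se))

# Highest-exposure mortality estimate from the review: 1.63 (1.58 - 1.68)
se = log_scale_se(1.58, 1.68)
lower, upper = ci_from_rr(1.63, se)
print(f"{lower:.2f} - {upper:.2f}")
```

The rebuilt interval matches the published one to two decimal places, which is just a consistency check on the log-scale convention, not a re-analysis of the underlying data.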
"Because this literature review was based on long-term observational data, there are many potential biases that could influence the results, so it cannot be called 'hard' data at all," study author Philip Conaghan, MBBS, PhD, professor of musculoskeletal medicine, University of Leeds; consultant rheumatologist, Leeds Teaching Hospitals National Health Service Trust; National Institute for Health Research senior investigator; and deputy director, National Institute for Health Research Leeds Musculoskeletal Biomedical Research Unit, United Kingdom, told Medscape Medical News. "For example, one confounder that is impossible to measure is the use of over-the-counter medicines, which are usually not recorded and can include drugs with significant side effects, such as ibuprofen. Of course it's almost impossible to get long-term data from clinical trials: They usually don't run for many years, so we are dependent on this sort of imperfect data to explore long-term potential drug side-effects," Dr Conaghan said. "I don't think this study is reproducible because of the softness of the data. [Also], that kind of risk profile is very hard to imagine is meaningful," Norton M. Hadler, MD, emeritus professor of medicine and microbiology/immunology, University of North Carolina at Chapel Hill, told Medscape Medical News.
Implications for Clinical Practice
The first thing clinicians should do when reading studies like this is to closely examine the methods of the study and not simply rely on the abstract, Dr Hadler noted. Moreover, clinicians should ask themselves whether a patient needs medication in the first place, Dr Hadler said. Although over-the-counter medications are generally safe, it makes sense for clinicians and patients to try nonmedication ways of relieving pain first. Dr Conaghan agrees. "First they should assess if paracetamol is needed for a given patient. It might not add much to people also taking other pain killers such as [nonsteroidal anti-inflammatory drugs] or opioids. Second, they should ask their patients about all their pain killers, including over-the-counter pills, to get a complete picture of analgesic use (note [that nonsteroidal anti-inflammatory drugs] are analgesics too). Thirdly, they should be conscious that people using moderate to high doses of paracetamol over long periods of time may be more prone to certain side effects that they need to look out for," Dr Conaghan added.
"I am assuming that the common long-term use of paracetamol is for musculoskeletal pain in this response.... [T]here is a massive need for pain control with ageing communities, increased levels of back pain and osteoarthritic joint pain, and lots of people can't tolerate aspirin and ibuprofen," he concluded. He also noted that it is worth reassessing every so often whether the drug is still helping the patient. "That might mean stopping it for a couple of days and seeing if it makes much difference to their pain. Then they have to consider if they are doing the simple things that effectively improve joint pain (if that's their problem) without side effects; for example, muscle strengthening exercises followed by increased physical activity, and weight loss if needed, all help knee pain. Fitting these things into busy lives is difficult, but ultimately they are more effective and safer than pills," Dr Conaghan explained. "[W]e should consider the benefit-risk ratio for particular conditions, and would need to see where paracetamol has demonstrated benefits. A recent study in Lancet suggested paracetamol wasn't effective for treating acute lower back pain, although its safety was good over the 4-week period of that study," Dr Conaghan said.
The authors and Dr Conaghan have disclosed no relevant financial relationships. Source: Lancet.com
Beliefs about nicotine 'may override its effects on the brain'
March 2, 2015
BELIEFS ABOUT NICOTINE MAY OVERRIDE ITS EFFECTS ON THE BRAIN
Nicotine replacement therapy and prescription medications such as varenicline are often used as smoking cessation aids. But a new study suggests there may be another way to quit the habit: by manipulating the brain's reward system through beliefs.
Published in the Proceedings of the National Academy of Sciences, the study revealed that participants who were told their cigarettes contained no nicotine showed less activity in the reward-learning pathways - the areas of the brain that drive addiction - suggesting that a person's beliefs about nicotine may influence their addiction to it.
Smoking is the leading preventable cause of death in the US. While it is other toxic agents in tobacco that are responsible for the damaging health effects of smoking, it is nicotine that causes tobacco addiction.
According to the research team, led by Read Montague, director of the Computational Psychiatry Unit at the Virginia Tech Carilion Research Institute, nicotine stimulates neural pathways in the brain associated with pleasure and reward, which is what drives nicotine addiction.
In their study, Montague and his team set out to investigate whether smokers' beliefs about nicotine, rather than their actual nicotine intake, could modify activity in reward-learning pathways of the brain.
The researchers point out that beliefs are known to contribute to the "placebo effect" - the idea that a "sham" treatment will have a positive effect based on the expectation that it will.
"A subject's belief that he or she is receiving a treatment could lead to observable improvement even in the absence of active drugs," the authors note. "These treatment effects are putatively accomplished by neurobiological processes usually associated with pharmacological actions of active drugs, even though active drugs are not administered."
Lower reward-learning activity for those who believed cigarettes were nicotine-free
The team tested whether a similar effect would be seen in 24 smokers who were divided into two groups. The subjects in one group were told the cigarettes they were about to smoke were nicotine free. In fact, both groups smoked conventional nicotine-containing cigarettes.
After smoking, all participants underwent functional magnetic resonance imaging (fMRI). During the brain scans, they played a reward-based learning game, in which they were given money, shown a historical stock price graph and asked to make an investment. This allowed the researchers to measure both their activity in the reward-learning pathways of the brain and the effect on choice behavior.
The study results revealed that the participants who believed they had smoked nicotine had much higher activity in their reward-learning pathways than those who believed their cigarettes were nicotine-free. The two groups also made very different choices in the reward-based learning game.
These findings, the researchers say, "go beyond the placebo effect," suggesting that belief alone can either eliminate or boost the brain effects of nicotine. They add:
"These results provide compelling evidence demonstrating that prior beliefs about nicotine have the capacity to override the presence of a powerful neuroactive drug like nicotine by selectively modulating biophysically described processes in a fashion that correlates with measurable impact on learning and choice behavior."
Montague believes these findings may be useful for developing new treatments for addiction. "Just as drugs micromanage the belief state," he says, "maybe we can micromanage beliefs to better effect behavior change in addiction."
In an editorial linked to the study, Nora D. Volkow and Ruben Baler, of the National Institute on Drug Abuse (NIDA), say the research enhances understanding of why drug abusers perceive a drug to be more pleasurable when they expect it to be compared with when they do not.
"The report [...] represents an important step forward in this context because it offers new insights into how the power of belief modulates nicotine-driven learning signals related to nondrug rewards (money), as well as non-drug-related decisions (choice behavior)," they say. "More specifically, this work illuminates the mechanisms whereby belief can influence nonconscious learned association by modulating how the brain performs risk decisions while under the effects of nicotine."
Earlier this month, Medical News Today reported on a study published in JAMA, which found varenicline (brand name Chantix) could be useful for helping smokers to quit the habit gradually.
Written by Honor Whiteman. Medical News Today
World's first 'bionic reconstructions' of hands performed
February 26, 2015
WORLD'S FIRST 'BIONIC RECONSTRUCTIONS' OF HANDS PERFORMED
Scientists have successfully given three Austrian men mind-controlled robotic prosthetic hands to treat debilitating nerve injuries. The results of these groundbreaking procedures - referred to as "bionic reconstructions" - are published in The Lancet.
The men had been living with injuries to their brachial plexus, a network of nerves that begins in the neck region and connects to the majority of nerves responsible for controlling the movement of the upper limbs. Brachial plexus injuries most commonly occur in high-speed collisions or collision sports such as football or rugby. Sustained as a result of climbing and motor vehicle accidents, the men's injuries had led to poor hand function. Prof. Oskar Aszmann, a developer of bionic reconstruction, describes such injuries as representing "an inner amputation, irreversibly separating the hand from neural control." The new technique entails a combination of nerve and muscle transfers, amputation and the utilization of a robotic prosthesis that responds to electrical impulses in the muscles. Prior to amputation, the patients required a significant amount of cognitive training, supported with comprehensive rehabilitation afterward. Prof. Aszmann, from the Medical University of Vienna, Austria, developed bionic reconstruction alongside colleagues from the University Medical Center Göttingen in Germany. "Existing surgical techniques for such injuries are crude and ineffective and result in poor hand function," he explains. "The scientific advance here was that we were able to create and extract new neural signals via nerve transfers amplified by muscle transplantation. These signals were then decoded and translated into solid mechatronic hand function." Cognitive training consisted of the patients learning to activate their muscles and then use the resulting electric signals to control a virtual hand. Following this, they practiced further using a prosthetic hand attached to their nonfunctioning hands. The patients spent an average of 9 months before their elective amputations undergoing this training.
'No limitations to prevent procedure being conducted at similar medical centers'
The scientists found that, 3 months after amputation, the robotic prostheses had given the patients significantly improved functional movement in their hands, along with less pain and a better quality of life.
Everyday tasks of varying complexity are now possible for the men. These tasks range from relatively simple actions, such as picking up a ball or using a key, to more complex actions requiring some finesse, such as using two hands to undo buttons.
"So far, bionic reconstruction has only been done in our center in Vienna," says Prof. Aszmann. "However, there are no technical or surgical limitations that would prevent this procedure from being done in centers with similar expertise and resources." In a linked comment, Prof. Simon Kay - the surgeon who conducted the UK's first hand transplant - states that these findings are encouraging, as the technique provides additional neural inputs into prosthetic systems that would not otherwise exist. However, he believes overall success can only be determined over time: "However, the final verdict will depend on long-term outcomes, which should include assessment of in what circumstances and for what proportion of their day patients wear and use their prostheses. Compliance declines with time for all prostheses, and motorized prostheses are heavy, need power, and are often noisy, as well as demanding skilled repair when damaged." For now, bionic reconstruction marks an exciting new development in the world of prosthetics, and three men can attest that the new technique can make a difference, at least in the short term. Last year, Medical News Today reported on the story of a prosthetic arm connected directly to the bone that can be controlled by the brain, described as a "union between the body and the machine."
Written by James McIntosh
'Bionic' eye allows man to see wife for first time in a decade
February 25, 2015
BIONIC EYE ALLOWS MAN TO SEE WIFE FOR FIRST TIME IN A DECADE
A blind man is now able to see objects and people again, including his wife and family, for the first time in a decade. How? With the help of a bionic eye implant.
Affected by a degenerative condition known as retinitis pigmentosa, Allen Zderad was effectively blind, unable to see anything but a bright light. As the condition has no cure, Zderad, from Minneapolis-Saint Paul, MN, was forced to quit his professional career. He made adjustments to his lifestyle and was able to continue woodworking through his sense of touch and spatial awareness. However, with the help of his new retinal prosthesis, Zderad is now able to make out the outlines of objects and people, and could even register his reflection in a window. "I would like to say I think he's a remarkable man, when you consider what he's overcome in dealing with his visual disability," says Dr. Raymond Iezzi Jr., an ophthalmologist from the Mayo Clinic. "To be able to have offered him the retinal prosthesis to enhance what he can already do was a great honor for me." Retinitis pigmentosa is an inherited condition that causes the degeneration of specific cells in the retina called photoreceptors. The disease can cause some people to lose their entire vision. Mr. Zderad's grandson has the disease in its early stages and, after seeing him, Dr. Iezzi asked if he could meet his grandfather. The eye implant that Zderad now has works by bypassing the damaged retina and sending light wave signals directly to the optic nerve. A small chip was attached to the back of the eye with multiple electrodes offering 60 points of stimulation.
'Not like any form of vision that he's had before'
Wires from the device on the retinal surface connect to a pair of glasses worn by Mr. Zderad. The glasses have a camera at the bridge of the nose that relays images to a small computer worn in a belt pack. These images are then processed and transmitted as visual information to the implant, which in turn interprets them, passing them on to the retina and eventually the brain. "Mr. Zderad is experiencing what we call artificial vision," explains Dr. Iezzi. "It's not like any form of vision that he's had before. He's receiving pulses of electrical signal that are going on to his retina and those are producing small flashes of light called electro-phosphenes. These small flashes of light are sort of like the points of light on a scoreboard at a baseball game." There are only 60 of these flashes of light, but it is enough for Zderad to reconstruct scenes and objects. Although he will not be able to see the details of faces or read, Mr. Zderad will now be able to navigate through crowded environments without the use of a cane, significantly improving his quality of life.
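The scoreboard analogy can be made concrete: a camera image reduced to 60 on/off points carries only coarse shape information. The sketch below is purely illustrative - it assumes a hypothetical 6 x 10 electrode layout and a simple brightness threshold, and is not a description of the device's actual image processing:

```python
def phosphene_map(image, rows=6, cols=10, threshold=0.5):
    """Downsample a grayscale image (2D list, values 0.0-1.0) to a
    coarse grid of on/off points, mimicking a 60-electrode array."""
    h, w = len(image), len(image[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the pixels that fall inside this grid cell.
            block = [image[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            row.append(1 if sum(block) / len(block) > threshold else 0)
        grid.append(row)
    return grid

# A 12x20 "image" of a bright vertical bar on a dark background.
image = [[1.0 if 8 <= x < 12 else 0.0 for x in range(20)] for y in range(12)]
for row in phosphene_map(image):
    print("".join("#" if v else "." for v in row))
```

Even at this resolution, the bar's position and orientation survive the downsampling, which is roughly the kind of information 60 points of stimulation can convey - enough for navigation, not for faces or text.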
Dr. Iezzi would like to see the technology expanded to patients who have lost the use of their eyes, such as wounded soldiers or people with advanced diabetes or glaucoma. "In addition, while Mr. Zderad has 60 points of stimulation, if we were able to increase that number to several hundred points of stimulation, I think we could extend the technology so that patients could recognize faces and perhaps even read," he concludes. "It's crude, but it's significant," said Zderad happily, as he first used the device. "It'll work." Zderad will now be able to see his family again, including his 10 grandchildren and his wife, Carmen. And how does he distinguish her, having not seen her for a decade? "It's easy," says Zderad, "she's the most beautiful one in the room." At the end of last year, Medical News Today reported on the story of a woman with quadriplegia who is now able to use her mind to move a robotic arm, demonstrating "10° brain control" of the prosthetic.
Written by James McIntosh
Breastfeeding may influence immune system development in early life
February 22, 2015
BREASTFEEDING MAY INFLUENCE IMMUNE SYSTEM DEVELOPMENT IN EARLY LIFE
A series of studies set to be presented at the American Academy of Allergy, Asthma and Immunology's Annual Meeting in Houston, TX, suggests an infant's immune system development and susceptibility to asthma and allergies may be influenced by a number of factors that shape which bacteria are in their gut, such as gestational age at birth, breastfeeding and delivery by Cesarean section.
The research team, including Dr. Christine Cole Johnson, chair of the Department of Public Health Sciences at Henry Ford Hospital in Detroit, MI, says the findings further support the "hygiene hypothesis" - the idea that early childhood exposure to pathogens affects later-life risk of disease.
"For years now, we've always thought that a sterile environment was not good for babies. Our research shows why. Exposure to these micro-organisms, or bacteria, in the first few months after birth actually help stimulate the immune system," says Dr. Johnson.
"The immune system is designed to be exposed to bacteria on a grand scale," she adds. "If you minimize those exposures, the immune system won't develop optimally." Other studies have supported this claim. In June 2014, for example, Medical News Today reported on a study published in the Journal of Allergy and Clinical Immunology, in which researchers found exposing babies to bacteria and allergens in the first year of life may reduce the risk of allergies, wheezing and asthma later in life.
Breastfed babies 'at lower risk of pet-related allergies'
In this latest research - consisting of six studies - Dr. Johnson and colleagues set out to determine whether maternal or birth factors, as well as breastfeeding, affect the composition of gut bacteria - or the gut microbiome - in infants, and whether these compositions influence their risk of developing allergies or asthma. In addition, the team assessed whether specific compositions of gut bacteria influenced the development of regulatory T cells (Treg) - white blood cells that regulate the immune system. To reach their findings, the researchers analyzed data from the Wayne County Health, Environment, Allergy and Asthma Longitudinal Study (WHEALS), which investigates how environmental and biological factors influence the development of allergies and asthma in early life. The researchers analyzed stool samples collected from babies at 1 and 6 months following birth. The results of their analysis revealed that a mother's race/ethnicity, an infant's gestational age at birth, prenatal and postnatal tobacco smoke exposure, the presence of pets in the home and whether a baby was born via Cesarean section or vaginal delivery influenced an infant's gut microbiome composition.
They also found that babies who were breastfed at 1 and 6 months had specific gut microbiome compositions, compared with babies who were not breastfed, which the researchers say may affect immune system development. In addition, babies who were breastfed at 1 month were at lower risk of pet-related allergies.
The researchers also identified a specific gut microbiome composition among children with asthma who experienced flare-ups or night-time coughing within the first year of life. What is more, they found - for the first time - that an infant's gut microbiome composition was associated with levels of Treg cells. Commenting on their findings, Dr. Johnson says: "The research is telling us that exposure to a higher and more diverse burden of environmental bacteria and specific patterns of gut bacteria appear to boost the immune system's protection against allergies and asthma."
Written by Honor Whiteman
FDA Approves New Varicose Vein Treatment
February 21, 2015
FDA APPROVES NEW VARICOSE VEIN TREATMENT
The US Food and Drug Administration (FDA) has approved the VenaSeal closure system (Covidien LLC), the first device to permanently treat varicose veins by sealing them with an adhesive agent. Varicose veins often cause no symptoms, but some patients may experience mild to moderate pain, blood clots, skin ulcers, or other problems. In these cases, treatment may include compression stockings or medical procedures to remove or close the affected veins.
The VenaSeal system gives patients "another treatment option for this common condition," William Maisel, MD, MPH, acting director of the Office of Device Evaluation in the FDA's Center for Devices and Radiological Health, said in a news release. "Because the VenaSeal system does not incorporate heat application or cutting, the in-office procedure can allow patients to quickly return to their normal activities, with less bruising," he noted.
The VenaSeal system is intended for patients with symptomatic superficial varicose veins of the legs. The sterile kit includes a specially formulated n-butyl-2-cyanoacrylate adhesive, catheter, guidewire, dispenser gun, dispenser tips, and syringes. "The device must be used as a system and differs from procedures that use drugs, laser, radio waves or cuts in the skin to close or remove veins," the FDA explains. "A trained healthcare professional inserts the catheter through the skin into the diseased vein to allow injection of the VenaSeal adhesive, a clear liquid that polymerizes into solid material. The healthcare professional monitors proper placement of the catheter using ultrasound imaging during delivery of the adhesive into the diseased vein to seal it," the agency explains.
The FDA reviewed data for the VenaSeal system from three clinical studies sponsored by the manufacturer, which showed the device was safe and effective for the treatment of symptomatic superficial varicose veins of the legs. The VenaSeal system should not be used in patients who have a known hypersensitivity to the VenaSeal adhesive, acute inflammation of the veins resulting from blood clots, or acute whole-body infection. Adverse events observed in the trials, and generally associated with treatments for varicose veins, included phlebitis and paresthesia in the treatment area.
Fecal matters: treating infection with stool transplants
February 19, 2015
FECAL MATTERS: TREATING INFECTION WITH STOOL TRANSPLANTS
Having someone else's stool placed inside your body sounds more like grounds for treatment than treatment itself. Yet fecal microbiota transplant is a procedure that has been found to be particularly effective for treating Clostridium difficile infection.
Of all the forms of transplantation that currently exist, it is safe to say that a transplant of fecal matter - also known as stool - is one of the strangest that a patient can have. But at the same time, it could also be one of the most important, representing a solution to the problem of antimicrobial resistance that affects many areas of medicine. Fecal microbiota transplants (FMT) are a form of therapy known as bacteriotherapy, whereby harmless bacteria are utilized to displace harmful organisms. In addition to treating infections, bacteriotherapy leaves the natural bacteria that exist within the body undisturbed, unlike some antibacterial agents. One story that has been prominently featured in various health news outlets is the case of a woman who became obese after fecal transplantation from an overweight donor. For many people, this story may well have been the first time they had heard of this form of treatment. How exactly does the procedure work? Where did it come from? And why is it being used and promoted by doctors ahead of other forms of treatment? In this Spotlight, we investigate and attempt to answer these questions.
The origins of FMT
The first example of FMT can be traced all the way back to China in the 4th century, where literature of the time makes reference to the use of stool transplantation in the treatment of food poisoning and diarrhea. Later, in the 16th century, an influential herbalist called Li Shizhen is known to have treated abdominal diseases using remedies referred to as "yellow soup" and "golden syrup" that contained fresh, dried or fermented stool.
In 16th century veterinary medicine, a treatment that is still used today known as transfaunation was carried out among ruminating animals. The process involved the transfer of micro-organisms from the stomach of healthy donor animals to those of sick animals.
During World War II, German soldiers confirmed that a Bedouin remedy for bacterial dysentery - the consumption of fresh camel dung - was effective. The utilization of stool in the treatment of disease is rooted deeply within the history of medicine, and it has been portrayed as effective from the 4th century to the 20th. Medical practice has moved on considerably since these times, however, and the process in which stool is utilized has been refined since the days of Li Shizhen.
How does FMT work?
These days, the FMT procedure is relatively straightforward. It begins with the selection of a healthy donor, who donates a sample of their stool to be used. The stool sample is then mixed with a solution and strained to remove particulate matter before it is transplanted into the patient. The sample can then be placed inside the patient in a number of ways: doctors can use enemas, endoscopy, colonoscopy or sigmoidoscopy. No one method has been found to be better than the others, so the needs of the patient often determine which approach is used. Micro-organisms found in the gut have been identified as playing an important role in keeping us healthy. Dr. Henning Gerke, a specialist in gastroenterology and hepatology at the University of Iowa, explains: "These organisms - bacteria, fungi, protozoa - start colonizing the bowel in infancy. They appear to be important in training our immune systems and keeping pathogens (organisms that cause disease) in check." The purpose of FMT is, therefore, to create a diversity of micro-organisms within the bowel of the patient, to fight off disease and prevent future afflictions. Dr. Gerke writes that the idea of exposure to bacteria and parasites being beneficial to health reflects the "hygiene hypothesis," whereby lack of exposure to micro-organisms in early childhood can make individuals more susceptible to disease. "We know, for instance, that certain autoimmune diseases are less common in countries with lower hygiene standards than those in the industrialized world," explains Dr. Gerke.
The problem with C. diff
Currently, FMT is most commonly used to treat patients with C. diff infection, a bacterial infection that occurs due to a shortage of healthy bacteria in the body and attacks the lining of the intestine. Symptoms caused by C. diff infection include diarrhea and abdominal pain. The infection frequently occurs when an individual takes antibiotics to treat another condition.

Although antibiotics can be an effective way of treating bacterial infections, they can also have an adverse effect on the gut microbiota. "Antibiotics are lifesavers, but anytime we give them to a patient to eradicate one pathogen, there's collateral damage, in that along with the bad bacteria we wipe off some good organisms that help keep the complex workings of our gut in perfect balance," says Dr. Maria Oliva-Hemker, director of pediatric gastroenterology at Johns Hopkins Children's Center in Baltimore, MD.

C. diff infection has often been treated with antibiotics in the past, a practice that Dr. Gerke describes as "fighting fire with fire." And ultimately, as Dr. Suchitra Hourigan explains, this ignores what caused the illness in the first place: "When we administer an antibiotic to treat the C. diff infection, we destroy some of the bad bacteria, but that does not address the other half of the problem - the loss of good bacteria that might have led to the infection to begin with, so we never truly restore the balance in the gut and often the diarrhea returns with a vengeance in a matter of weeks."

Losing good bacteria is not the only problem with continued antibiotic use, however. When antibiotics fail to kill off a strain of bacteria completely, the bacteria can develop a resistance to the medication. As a result, future strains can develop that are heavily resistant to previously effective treatments. According to the authors of a paper published in Clinical Gastroenterology and Hepatology, C. diff infections are increasing in incidence, severity and mortality.
In addition, existing treatment options are limited and many appear to be losing efficacy. FMT could offer an effective solution to these problems, and according to research cited by the authors, the process is safe, inexpensive and effective with success rates of over 90%. The Mayo Clinic even describe a randomized controlled trial that had to be stopped early due to overwhelmingly positive results.
Current regulation of the procedure
A number of stumbling blocks currently prevent widespread adoption of the procedure, one of which is how it is regulated in the US. The Food and Drug Administration (FDA) have yet to fully approve FMT. Instead, they exercise enforcement discretion, allowing the use of the procedure for patients who are not responding to standard therapies, provided that informed consent is given and that the use of FMT is regarded as investigational. Fecal transplants are also classified as drugs under FDA regulation.

Experts believe that this approach to FMT may be problematic. "It's going to give a monopoly to whatever company gets the drug approved," Mark Smith told The New York Times last year. "We think it should be regulated, but unlike most products the FDA oversees, there's a real risk of the black market. If you restrict access, there's going to be lots of people doing it underground."

In 2012, Mark Smith and colleagues opened the first human stool bank in the US - OpenBiome - in order to make FMT safer, cheaper and more widely available to clinicians and patients. They provide hospitals with frozen stool samples that are ready to be used in FMT.
"People are dying, and it's crazy because we know what the solution is," Smith said. "People are doing fecal transplants in their basements and may not be doing any of the right screening or sterile preparation. We need an intermediate solution until there are commercial products on the market."
The Fecal Transplant Foundation state that only a small number of physicians in the US provide FMT for the large number of patients who would benefit from the procedure. Additionally, many patients do not have a healthy donor who would be able to assist them.
The unknown and the future
At present, a number of unknown factors still need to be explored and understood. Scientists do not yet know how donated bacteria specifically alter the patient's gut microbiota, and out of the trillions of bacteria contained in stool, it is unknown which are beneficial, which are dangerous and which have no influence. Clinicians also need to know how to guarantee that FMT can be used safely. As the previously mentioned case of the patient becoming obese after FMT attests, an element of uncertainty still surrounds the procedure, even if that particular case involved just a single patient.
Dr. Gerke points out that methods of treatment with stool transplantation are not currently standardized. "More work needs to be done to determine what constitutes the ideal stool donor, optimal method of stool preparation and best route of administration," he states.
The Fecal Transplant Foundation maintain that not a single serious side effect has ever been reported from FMT, in all documentation from 4th-century China to the present day. Randomized controlled trials are currently being conducted with a view to putting an end to this uncertainty.

As well as treating C. diff infection, FMT may prove useful against other conditions related to the micro-organisms of the gut. Inflammatory bowel disease, irritable bowel syndrome, obesity and type 2 diabetes could be potential targets for future research programs.

Some of the concerns about the variability of stool samples could be eliminated by the development of synthetic samples to replace human fecal matter. Already, researchers are developing synthetic stool from bacterial cultures and putting human stool in gel capsules for easy consumption. "In less than a decade, we'll have lab-cooked poop that we can administer to restore balance in the guts of people with a wide array of conditions caused by the imbalance between good and bad germs," predicts Dr. Oliva-Hemker.

While on the surface it may seem like a messy business, FMT represents both the past and the future of medicine. With antimicrobial resistance a growing problem, this will be an important area of research in the coming years.
Written by James McIntosh
Acupuncture back pain success determined by psychological factors
February 16, 2015
ACUPUNCTURE BACK PAIN SUCCESS DETERMINED BY PSYCHOLOGICAL FACTORS
According to new research, people being treated for lower back pain with acupuncture are likely to gain less benefit from the treatment if they have low expectations of how effective it is.
The study, published in The Clinical Journal of Pain, also suggests that patients who are positive about their back pain and feel in control of their symptoms go on to experience less back-related disability while receiving acupuncture. "The analysis showed that psychological factors were consistently associated with back-related disability," says study author Dr. Felicity Bishop. "People who started out with very low expectations of acupuncture - who thought it probably would not help them - were more likely to report less benefit as treatment went on."

Well established as a form of complementary therapy, acupuncture is commonly used to treat a wide range of health problems. The World Health Organization (WHO) has stated that acupuncture is an effective form of treatment for 28 conditions, including lower back pain and the following:
- Allergic rhinitis
- Essential hypertension
Evidence also suggests that acupuncture could be beneficial in the treatment of many other diseases, symptoms and conditions, although the WHO believe further proof is needed.
However, previous research has also found that factors other than the insertion of needles into specific areas of the skin play a part in how effective acupuncture is. These factors include the patient's belief in the therapy and the relationship between the patient and acupuncturist.
For the study, 485 people receiving acupuncture for lower back pain were recruited, being seen by a total of 83 acupuncturists. The participants completed questionnaires prior to the commencement of their treatment, and then again after 2 weeks, 3 months and 6 months. The questionnaires measured demographic characteristics and lower back disability, as well as variables from four different psychological theories for predicting lower back pain outcomes: the fear-avoidance model, the common sense model, expectancy theory and social-cognitive theory.
'Processing of different emotions in relation to treatment can influence outcomes'
As hypothesized by the authors, psychological variables were associated with changes in disability among the participants and accounted for two thirds of the variance in disability. Dr. Bishop explains that when individual patients were able to see their pain in a more positive light, they went on to experience less back-related disability: "In particular, they experienced less disability over the course of treatment when they came to see their back pain as more controllable, when they felt they had better understanding of their back pain, when they felt better able to cope with it, were less emotional about it, and when they felt their back pain was going to have less of an impact on their lives."

The authors acknowledge that it is difficult to assess how representative the sample of patients in the study is. Compared with a British survey of acupuncture users, the participants were similar in age and sex, but fewer had previous acupuncture experience. Dr. Stephen Simpson, director of research at Arthritis Research UK, says that the study emphasizes how the placebo effect influences pain. "The process whereby the brain's processing of different emotions in relation to their treatment can influence outcome is a really important area for research," he adds.
The authors of the study suggest future research should test whether integrating acupuncture with psychological interventions targeting illness and self-perceptions can improve patient outcomes.
"Factors such as the relationship between practitioner and the patient can inform this and we should be able to understand the biological pathways by which this happens," concludes Dr. Simpson. "This understanding could lead in the future to better targeting of acupuncture and related therapies in order to maximize patient benefit."
Written by James McIntosh
Raising Systolic BP Target May Hike Stroke Risk
February 13, 2015
RAISING SYSTOLIC BP TARGET MAY HIKE STROKE RISK
Older patients without diabetes or kidney disease who have a systolic blood pressure (SBP) of 140 to 149 mm Hg have a risk for stroke higher than that of similar patients with an SBP under 140 mm Hg, and a risk similar to that of patients with a level of 150 mm Hg and over, a new study shows. This is particularly true for African American and Hispanic patients, the researchers found.

In light of these findings, recent recommendations to increase the target SBP from 140 to 150 mm Hg in older patients could have a detrimental effect on stroke risk, especially among minority populations, according to the study authors. The research was presented at the International Stroke Conference (ISC) 2015.

Last year, the Eighth Joint National Committee (JNC 8) recommended increasing the current SBP target of 140 mm Hg to 150 mm Hg in patients aged 60 and older without diabetes mellitus or chronic kidney disease. The recommendations were published February 5, 2014, in JAMA. Researchers for the current study aimed to find out whether raising this threshold would have a harmful effect, explained lead author Ralph Sacco, MD, Department of Neurology, University of Miami, Florida.
"We wanted to look at what impact this would have in our study on stroke because there is a lot of evidence that hypertension is one of the most important risk factors for stroke," Dr Sacco said.

The study included 1706 patients over 60 years of age who had not had a stroke and did not have chronic kidney disease or diabetes. These patients were part of the Northern Manhattan Study (NOMAS), a prospective longitudinal study started in 1993. The mean age of this cohort was 72 years; 37% of patients were male. About a quarter were white, another quarter were non-Hispanic black, and almost 50% were Hispanic. About 41% were receiving antihypertensive medication.
Antibiotic use has more unwanted effects than previously thought
February 11, 2015
ANTIBIOTIC USE HAS MORE UNWANTED EFFECTS THAN PREVIOUSLY THOUGHT
We have known for some time that one of the unwanted side effects of taking antibiotics is their disruption of friendly microbes in the gut. But now a new study that takes a closer look suggests the consequences of long-term antibiotic use could be even more far-reaching than we thought.
Writing in the journal Gut, Andrey Morgun, an assistant professor at the College of Pharmacy at Oregon State University in Corvallis, and colleagues hope the study will increase understanding of the widespread damage antibiotics cause to the gut and will offer new ways to investigate and offset the consequences.

Antibiotic use is widespread - around 40% of adults and 70% of children take at least one antibiotic a year, and billions of animals are treated with them. When used properly, antibiotics eliminate life-threatening infections, but around 1 in 10 people treated with them suffer adverse side effects.

Scientists are beginning to discover that antibiotic use - and overuse especially - is associated with a range of problems that affect, among other things, glucose metabolism, the immune system, food digestion and behavior. They also suspect it is linked to obesity and stress. Prof. Morgun says: "Just in the past decade a whole new universe has opened up about the far-reaching effects of antibiotic use, and now we're exploring it. The study of microbiota is just exploding. Nothing we find would surprise me at this point."
Antibiotics kill intestinal epithelium cells
For their study, the team used mice to look at the effects of four antibiotics commonly given to lab animals.
Previously, it was thought the antibiotics only killed gut bacteria and blocked some immune functions in the gut. But the new study shows they also destroy cells in the intestinal epithelium.
The intestinal epithelium is a velvet-like layer of specialized cells that lines the intestine and helps absorb water, glucose and essential nutrients into the bloodstream. It is also a barrier between the rest of the body and the huge colonies of bacteria that live in the gut. The velvet-like appearance of the intestinal epithelium is due to the millions of tiny projections called villi that maximize the surface area of the epithelium. The intestinal epithelium is home to an abundance of immune cells that live alongside the trillions of gut bacteria, with which they are in constant dialogue to maintain the delicate stability of the partnership between the host body and its bacterial colonies.
Antibiotics disrupt mitochondria and host-microbe signaling
The team also discovered that antibiotics affect a gene that is critical to the communication between host and gut bacteria. Prof. Morgun notes: "When the host microbe communication system gets out of balance it can lead to a chain of seemingly unrelated problems."

Disruption in host-microbe dialogue can not only disturb digestion and cause diarrhea and ulcerative colitis; new research is also linking it to immune function, obesity, food absorption, depression, sepsis, asthma and allergies.

The team also found that antibiotics - and the bacteria that have developed resistance to them - cause significant changes to mitochondria, leading to more cell death. Mitochondria are tiny compartments inside cells that act like batteries - they convert food into energy for the cell. They also play an important role in cell signaling and growth, and need to function properly for good health. In evolutionary terms, mitochondria are descended from bacteria, which may explain why antibiotics attack the cell components that most closely resemble them.
Studies like this support the idea that killing bad bacteria with antibiotics is perhaps not a good way to deal with infection, given the increasing list of side effects and problems they cause. Prof. Morgun suggests that boosting the healthy bacteria so they outcompete the unwanted ones might be a better approach.
The Medical Research Foundation of Oregon and the National Institutes of Health helped fund the study. In January 2015, Medical News Today reported a study that suggested travelers taking antibiotics could be helping to spread antibiotic resistance. The researchers found travelers who take antibiotics for diarrhea are not only increasing their chances of contracting resistant intestinal bacteria, they could also be spreading them to their own countries.
Written by Catharine Paddock PhD