Ensuring Biomedical Research Remains Trustworthy and Transparent

Reforms at Johns Hopkins Medicine include expanded education for investigators and a better system for storing and sharing primary data.

Published in Dome - May/June 2017

Biomedical research illuminates the world and saves lives. It’s also messy, tedious and often frustrating.

Just ask Donna Dang. Working in Rajini Rao’s physiology lab, the school of medicine doctoral student spends about 60 hours a week, including weekends, in a quest to figure out how a certain protein, SPCA2, makes HER2-positive breast cancer cells multiply.

It took her about two years to figure out how to grow cancer cells in petri dishes and mix in just the right amount of protein. Four years and many failed experiments into her work, she now runs a makeshift assembly line, maintaining as many as 30 dishes at a time in an incubator set at human body temperature.

Dang measures the growth, death and movement of cancer cells using a digital camera attached to a microscope. She sends the images to her computer, where they are converted to dense rows of numbers reflecting the quantity, activity and health of the cells.  
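That conversion step can be pictured with a minimal sketch like the one below, which counts the cells in a single microscope frame by thresholding and labeling; the file name and the measurements reported are illustrative assumptions, not the Rao lab’s actual pipeline.

```python
# Minimal sketch of turning a microscope image into rows of numbers,
# assuming scikit-image is available; "frame_001.tif" is hypothetical.
from skimage import io, filters, measure

image = io.imread("frame_001.tif", as_gray=True)   # one camera frame
threshold = filters.threshold_otsu(image)          # separate cells from background
labels = measure.label(image > threshold)          # tag each connected cell region

# Summarize each detected cell as a row of numbers: size and position.
for region in measure.regionprops(labels):
    print(region.label, region.area, region.centroid)
print(f"{labels.max()} cells detected")
```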

Dang is finally starting to see patterns. Too much of the protein seems to spur growth of the cancer cells, while too little appears to make them migrate. If she can find that sweet spot, with just the right amount of protein to kill the cancer cells, her work could eventually lead to breast cancer therapies.

But only if her research proves reliable.  

That’s why the young cellular and molecular medicine investigator repeats her most promising experiments several times and even asks others in the lab to try to get similar results.
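What “similar results” means in practice can be as simple as checking that repeated measurements cluster tightly. The toy sketch below uses invented numbers purely for illustration.

```python
# Toy illustration of replicate checking, assuming three repeats of the
# same growth measurement; the values are invented for illustration.
import statistics

replicates = [1.42, 1.38, 1.47]        # e.g., fold-change in cell growth
mean = statistics.mean(replicates)
spread = statistics.stdev(replicates)

# A small spread relative to the mean suggests the result is reproducible.
print(f"mean = {mean:.2f}, CV = {spread / mean:.1%}")
```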

Dang is demonstrating a bedrock principle of science: testing findings by reproducing them. Yet the past few years have seen a “reproducibility crisis” that is eroding trust in research.

Nationally, in several highly publicized cases, investigators painstakingly followed the steps of earlier experiments but did not get similar results. The failures often occurred after pharmaceutical companies invested heavily in hopes of turning the now-questionable findings into treatments. Resources were squandered, the march toward therapies took a detour and the research community was embarrassed.

The school of medicine, which produces more than 5,000 published research papers a year, is working to avoid the problem. A robust education program for investigators is being expanded, and a recently convened reproducibility task force has outlined recommendations that include creation of a secure system for storing and sharing primary data.

The reforms will ensure that research at Johns Hopkins Medicine remains trustworthy, says Paul Rothman, dean of the medical faculty and CEO of Johns Hopkins Medicine. “Clearly now is the time for the U.S. research enterprise to re-evaluate our processes and incentive systems.”

Worldwide, retractions of published papers are growing, says Stuart Ray, vice chair for data integrity and analytics for the Department of Medicine. In 2015, 720 papers in the PubMed database of biomedical literature were retracted, a more than tenfold increase from 2004, while the number of publications per year increased just twofold during the same time period.
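A back-of-the-envelope calculation shows why that trend is alarming even after accounting for growth in publishing. In the sketch below, the 2004 retraction count is inferred from the “more than tenfold” figure rather than sourced directly.

```python
# Back-of-the-envelope check on the retraction trend; the 2004 count
# (~72) is inferred from "a more than tenfold increase," not sourced.
retractions_2015 = 720
retractions_2004 = 72          # assumed: roughly a tenth of the 2015 figure
publication_growth = 2.0       # publications per year roughly doubled

rate_change = (retractions_2015 / retractions_2004) / publication_growth
print(f"Retraction rate per paper rose roughly {rate_change:.0f}-fold")  # ~5
```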

At least one Johns Hopkins researcher, no longer with the institution, has retracted some of his papers, citing study design flaws, according to the Retraction Watch website.

In the most egregious cases of retracted research, investigators fabricate data or results, as with a British physician’s 1998 study falsely linking vaccines and autism. But outright falsifications are rare, according to experts.

Far more often, they say, research can’t be reproduced because of factors such as poor design, too-small data sets, mathematical errors or journal articles that leave out the details needed to replicate the experiment.

Avoiding the Pitfalls

Research involves studying the existing literature, forming a hypothesis, gathering and analyzing data, and publishing the findings. Other researchers then build on those results, creating an accumulation of knowledge that leads to treatments and cures. 

It sounds simple, but pitfalls can trip up scientists every step of the way. To help investigators spot and avoid stumbling blocks, the school of medicine’s Responsible Conduct of Research Program requires that they learn research rules and best practices through research integrity colloquia, online courses and department meetings devoted to research conduct.

Now, Rao, who directs the Graduate Program in Cellular and Molecular Medicine, is using National Institutes of Health funding to develop an online education series with topics including how to design an experiment and how to analyze large data sets. The course could be ready by fall, she says. 

Another goal is to create a unified, secure and easy-to-use data storage system—a measure recommended by the school of medicine’s recently launched Research Reproducibility Task Force. “We need to archive, store and secure these data so that the steps are traceable,” says neuroscientist Alex Kolodkin, task force chair.
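One way to make stored data traceable is a checksum manifest: a record that makes any later change to a primary data file detectable. The sketch below assumes a hypothetical primary_data folder and manifest format; it illustrates the idea, not the task force’s actual design.

```python
# Minimal sketch of traceable data archiving, assuming primary data live
# under a "primary_data/" folder; the manifest format is illustrative.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a SHA-256 checksum so later tampering or loss is detectable."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

with open("manifest.txt", "w") as manifest:
    for data_file in sorted(Path("primary_data").rglob("*")):
        if data_file.is_file():
            manifest.write(f"{fingerprint(data_file)}  {data_file}\n")
```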

More journals are asking researchers to provide links to their primary data so subsequent investigators can see how the conclusions were reached. Those data are a priceless resource for continued investigation, and they become crucial evidence if the integrity of the research is ever questioned.

Currently, though, investigators jot down observations in physical notebooks, stash data on flash drives that can be destroyed by a coffee spill, or build charts on personal laptops that crash or get replaced. When students graduate, they may leave behind primary data that are poorly organized or labeled.

“In our lab, our researchers use everything from pen and paper to laptops to smartphones and computers,” says Rao. “If a journal asked to see original data, I don’t know how we would deal with that.” 

Dang, for example, writes information in pen on sticky notes before transferring the numbers to her laptop and a Johns Hopkins server. She wants the information available for the next researcher in Rao’s lab, who may build on her work after she graduates, she says. “Our rule of thumb is don’t delete anything,” she says.

She estimates that her digital interpretations of breast cancer cells have consumed about 200 gigabytes over the past two years, enough to hold roughly 40,000 songs at a typical 5 megabytes apiece. Analyzing all that data can be daunting, she says, which is why she sometimes turns to biostatistician friends for help.

In many labs, the principal investigators who receive the grants may have little knowledge of how to handle today’s large data sets. Data volumes have increased to a staggering degree, complicating both storage and interpretation.

Thirty-five years ago, when molecular biologist Randall Reed received his doctorate, he based his thesis paper on a single image. Now, his research might be based on 100 million DNA sequences, analyzed by researchers in his lab before he ever sees them.

“The data are more complex, and the principal investigator is farther from primary data than a decade or two ago,” points out Reed, a member of the task force and assistant dean for research.

Meanwhile, inexperienced researchers are working in a pressure cooker environment, where an exciting study, published in a prestigious journal, can help new graduates land academic jobs. Experiments that don’t yield the intended results are far less likely to get published or help investigators win their next grants.  

Rao, whose lab supports five to seven researchers at a time, encourages her graduate students to follow the research where it leads, without worrying about “shiny and immediate” results. “I like to tell them if the experiment is interesting, the particular result isn’t what’s important,” she says.  

Dang, who aspires to a career in the biotechnology industry, appreciates that philosophy. “Some experiments take weeks to set up and don’t yield useful data,” she says. “You can’t force a result. If it doesn’t work, I just go back to the drawing board.”
