In 2016, the 21st Century Cures Act was signed into law, mandating that the FDA “establish a program to evaluate the use of real-world evidence (RWE) to help support the approval of new indications for a drug and to help to support or satisfy post-approval study requirements.” Along with advances in the availability and quality of real-world data (RWD) from sources like electronic health records (EHRs), registries, medical claims and pharmacy data, the Cures Act has been a catalyst for increased emphasis on using RWE in clinical and regulatory decision-making.
Today, RWE is playing an increasingly important role in life sciences research. In the last 10 years, close to 1,600 registered clinical trials have been informed by RWE. In May 2019, the FDA released draft guidance on how to submit RWE for drug and biologic approvals and has already granted approvals for submissions using RWD in the areas of efficacy, label expansion, and rare diseases. One well-known example is Pfizer’s breast cancer treatment Ibrance. Although the drug was initially approved only for post-menopausal women, the FDA granted a label expansion after Pfizer provided RWE drawn from EHRs demonstrating its effectiveness in male patients.
However, many questions remain about how to gather, analyze, and submit RWE to the FDA and other regulators. When submitting RWE, life sciences organizations must not only feel confident in the data they provide, but also be able to show that it will stand up to regulatory review. The main question these companies must ask themselves is: how can we trust the evidence we are providing? And, subsequently, how can we demonstrate that the evidence is trustworthy? Establishing this trust requires several key steps and can be confirmed with a system I have used throughout my career in analytics that I call the three “Rs” – repeatability, reproducibility, and replicability.
Ensuring Trustworthy RWE
The most important foundational step in gathering RWE lies with the data itself. With so many types of RWD available from various sources, organizations must ensure they are using data sources that are relevant and fit for purpose for their study. Because RWD is gleaned from routine care rather than controlled trials, researchers must address a series of questions that may affect their analyses: Are the key data elements available? Are the data complete – that is, do the data exist for the time periods needed? And is anything missing with regard to key variables?
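To make these checks concrete, here is a minimal sketch in Python, assuming a hypothetical EHR extract loaded into a pandas DataFrame; the column names and values are illustrative only, not drawn from any particular platform or dataset:

```python
import pandas as pd

# Hypothetical EHR extract; column names and values are illustrative only.
records = pd.DataFrame({
    "patient_id": [1, 2, 3, 4],
    "index_date": pd.to_datetime(["2018-03-01", "2019-07-15", None, "2020-01-20"]),
    "diagnosis_code": ["C50.9", None, "C50.9", "C50.1"],
    "sex": ["F", "F", "M", None],
})

key_variables = ["patient_id", "index_date", "diagnosis_code", "sex"]

# 1. Are the key data elements available?
missing_columns = [col for col in key_variables if col not in records.columns]
print("Missing columns:", missing_columns or "none")

# 2. Do the data exist for the time periods needed?
print("Date coverage:", records["index_date"].min(), "to", records["index_date"].max())

# 3. Is anything missing with regard to key variables?
print(records[key_variables].isna().mean().rename("fraction_missing"))
```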
Given these considerations, researchers must examine and understand their data thoroughly via data characterization before determining the appropriate study design and analysis. Data characterization is conceptually simple, but this important step can add time to the process. Proven data analytics platforms can help accelerate the work of analyzing, reviewing, and understanding the data. A robust platform can help researchers assess a data source’s feasibility for answering a particular question and rapidly surface its limitations through descriptive statistics and graphical depictions.
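In the same spirit, here is a short, hypothetical sketch of what basic data characterization might look like – simple descriptive statistics plus one graphical depiction – again using made-up data:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical cohort extract; values are illustrative only.
cohort = pd.DataFrame({
    "age": [54, 61, 47, 70, 58],
    "sex": ["F", "F", "M", "F", "M"],
    "index_year": [2018, 2019, 2019, 2020, 2020],
})

# Descriptive statistics for a continuous and a categorical variable.
print(cohort["age"].describe())
print(cohort["sex"].value_counts(normalize=True))

# A simple graphical depiction: cohort entry by year.
cohort["index_year"].value_counts().sort_index().plot(kind="bar")
plt.xlabel("Index year")
plt.ylabel("Patients")
plt.show()
```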
Beyond the initial study design steps of confirming fit-for-purpose data and feasibility, analytics platforms can be used to perform a wide array of analyses, from simple descriptive summaries to more sophisticated models that adjust or control for confounders. Even with these advanced capabilities to speed up the process, researchers must still ensure that their results are credible and valid.
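One common way to adjust for confounders in RWD analyses – by no means the only one – is inverse probability of treatment weighting (IPTW). Below is a minimal sketch on simulated data; the data-generating process and effect size are purely illustrative so that the adjusted estimate can be checked against a known truth:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Simulated data: age confounds both treatment choice and outcome.
n = 1000
age = rng.normal(60, 10, n)
treated = (rng.random(n) < 1 / (1 + np.exp(-(age - 60) / 10))).astype(int)
outcome = 0.3 * treated + 0.05 * age + rng.normal(0, 1, n)

# Fit a propensity model and form inverse-probability-of-treatment weights.
ps = LogisticRegression().fit(age.reshape(-1, 1), treated).predict_proba(age.reshape(-1, 1))[:, 1]
weights = np.where(treated == 1, 1 / ps, 1 / (1 - ps))

# Weighted difference in mean outcomes approximates the confounder-adjusted effect.
effect = (np.average(outcome[treated == 1], weights=weights[treated == 1])
          - np.average(outcome[treated == 0], weights=weights[treated == 0]))
print(f"IPTW-adjusted treatment effect: {effect:.3f}")  # true simulated effect is 0.3
```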
Validate results with a proven method – the three “Rs”
Researchers can increase the credibility of their results through various methods of validation. One system I have long used to test validity involves evaluating repeatability, reproducibility, and replicability – or, as I call them, the three “Rs”:
– The most basic and fundamental “R” is repeatability. An analysis is repeatable if, when everything is held constant – the data, the code, the computing environment – running it again produces identical results each time (see the sketch after this list).
– The second “R” is reproducibility. If a different research team or analytics tool repeats the same study, with all other factors held constant, the end results should be the same.
– Finally, replicability refers to the ability to recreate an analysis when several factors are changed. For example, if researchers use different teams, different data, and different methods, they can still feel confident in their findings if the results point in the same direction with similar magnitudes.
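To make the most basic “R” concrete, below is a minimal, hypothetical repeatability check: the same code run twice on the same data in the same environment must produce identical results. The analysis function is a stand-in, not any particular platform’s method:

```python
import numpy as np
import pandas as pd

def run_analysis(data: pd.DataFrame, seed: int = 0) -> float:
    """A stand-in for a study analysis: a bootstrap mean with a fixed seed."""
    rng = np.random.default_rng(seed)
    resamples = [data["value"].sample(frac=1.0, replace=True,
                                      random_state=rng.integers(1 << 31)).mean()
                 for _ in range(100)]
    return float(np.mean(resamples))

data = pd.DataFrame({"value": [1.2, 3.4, 2.2, 5.1, 4.0]})

# Repeatability: same data, same code, same environment -> identical results.
first = run_analysis(data)
second = run_analysis(data)
assert first == second, "analysis is not repeatable"
print("Repeatable result:", first)
```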
Reproducing or replicating research by any method can be time-consuming, but analytics platforms can expedite it. Because such platforms can run the same program against different databases or data sources, they reduce the coding time required to evaluate different datasets and can rapidly compare studies to build confidence in the results generated. Additionally, some analytics platforms enable transparency by default, which aids reproducibility and replicability. If the three “Rs” can be tested and the results point to similar conclusions, then the RWE, in my view, is valid and trustworthy enough to be used as a proof point in regulatory submissions.
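As a simple illustration of this workflow, the sketch below applies one stand-in analysis function across several simulated data sources (the source names are hypothetical) and checks whether the estimates agree in direction:

```python
import numpy as np
import pandas as pd

def estimate_effect(df: pd.DataFrame) -> float:
    """Stand-in analysis: difference in mean outcome, treated vs. untreated."""
    return df.loc[df.treated == 1, "outcome"].mean() - df.loc[df.treated == 0, "outcome"].mean()

rng = np.random.default_rng(7)

def simulate_source(n: int, noise: float) -> pd.DataFrame:
    """Simulate one data source with a true treatment effect of 0.5."""
    treated = rng.integers(0, 2, n)
    outcome = 0.5 * treated + rng.normal(0, noise, n)
    return pd.DataFrame({"treated": treated, "outcome": outcome})

# The same program applied to several (here, simulated) data sources.
sources = {"claims_db": simulate_source(2000, 1.0),
           "ehr_db": simulate_source(1500, 1.5),
           "registry": simulate_source(800, 2.0)}

estimates = {name: estimate_effect(df) for name, df in sources.items()}
print(estimates)

# Replicability check: do the estimates agree in direction?
directions = {np.sign(e) for e in estimates.values()}
print("Consistent directionality:", len(directions) == 1)
```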
The future of RWE
While RWE has long been used for purposes like analyzing drug utilization and estimating incidence and prevalence, its use in regulatory submissions is still in its early days. However, the growing demand for data-driven healthcare and the acceptance of RWE by regulatory agencies worldwide suggest that its use will continue to grow. By implementing the right analytics tools, testing studies to demonstrate their trustworthiness, and carefully planning study parameters, life sciences companies can more confidently apply and submit RWE for label expansions and other regulatory submissions. As trust in data, analytics technology, and data science professionals continues to grow, we will likely see the use of RWE expand into exciting new areas in the near future.
About Tiffany Siu Woodworth
Tiffany Siu Woodworth, MPH, MBA, is the Senior Director of Analytic Solutions at Panalgo, providing analytic and operational expertise in support of the company’s Instant Health Data (IHD) platform. Tiffany has more than 15 years of varied health outcomes research experience, from leading safety analyses for regulators to supporting traditional health economics and outcomes research. Tiffany was instrumental in the establishment and organization of FDA’s Mini-Sentinel and Sentinel Operations Center at Harvard Pilgrim. Prior to joining Panalgo, she served as the Sentinel Data Operations Lead, overseeing the execution of hundreds of FDA-initiated data requests that supported regulatory decision-making.