Clinithink’s Russ Anderson and Dr. Tielman Van Vleck break down the potential of big data by exploring the progress and promise unfolding from it.
Surely, you’ve heard all about big data and its promise to improve healthcare by now. If you haven’t, here’s the scoop: big data could save the U.S. healthcare system more than $300 billion annually, according to a McKinsey report. Yes, it seems big data is unstoppable and its potential undeniable. What remains unclear, however, is how that potential will translate into real value for healthcare.
While we don’t have all the answers yet, Clinithink’s Russ Anderson and Dr. Tielman Van Vleck have provided us with a few good examples of how capturing and unlocking big data’s value has the potential to lead us down powerful new paths in medicine, research, treatment and genomics.
The UK-based software provider has developed its own solution, CLiX, to help healthcare providers tap into that much-desired value. Thus, Anderson, Vice President of Product Management, and Van Vleck, Director of Language Processing, are all too familiar with the challenges big data poses, as well as the opportunities it promises.
Still, where is all this data coming from? How is big data enabling change today that will lead to more promising developments tomorrow? And what can healthcare do to properly capture and unlock the true value that big data brings with it?
According to Anderson, before you can break down big data definitively, you first have to separate it from the buzz that surrounds it. “There is a lot of noise in the market about big data today, and it’s difficult to separate the noise from the signal,” he said.
Russell L. Anderson, VP, Product Management
“A common way to look at big data is to evaluate it across four dimensions: volume, velocity, variety and complexity. It’s easy to get stuck and overwhelmed by the volume, but it’s critical to recognize the value that lies within all that data.”
Anderson has a point. Volume does seem to be a prominent aspect of big data, since there is plenty of it. Over the past few years, there has been an explosion in the amount of available healthcare data thanks to emerging technologies. By some estimates, it’s as high as 152 exabytes and expected to grow to over 40 times that amount by 2020, said Anderson.
Naturally, the data landslide stems partly from the increasing availability of electronic medical data produced by EHRs/EMRs. Other clinical contributors include genetic sequencing (as it becomes more economical) as well as wearable, biomedical-monitoring devices. However, plenty of devices capture critical medical data outside of clinician use, thanks to the proliferation of mobile devices, health-related apps, social media channels and online forums. According to Anderson, that data is equally valuable for inspiring change within healthcare.
“Privacy concerns around standard clinical data make the opportunities to leverage non-clinical data particularly compelling,” he said. “For example, social media sites have been trawled for patients’ public disclosures, such as adverse reactions to medication. Another example is the Google Flu project, which demonstrated that analysis of user-search terms can track the spread of influenza with accuracy approaching that of CDC monitoring. Clearly, the value is out there, if you know where to look.”
According to Anderson, the potential of big data is slowly shifting into a recognizable reality as healthcare begins to make headway in what the Institute for Healthcare Improvement (IHI) refers to as the Triple Aim: enhancing patient care, improving population health, and reducing per capita costs.
Here are just a few examples provided by Anderson and Van Vleck as to how big data is playing a part in all three:
Patient Care/Treatment
Cancer researchers at Washington University have sequenced the tumor cells of more than 700 cancer patients. By comparing tumor cells to healthy cells, they can pinpoint the mutations underlying a patient’s cancer. This is pushing researchers to classify and treat tumors based on their genetic makeup rather than their location in the body. By targeting the genetic makeup and associated mutations, they can customize treatment to the individual patient and potentially improve the outcome.
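To make the tumor/normal comparison concrete, here is a toy sketch of the core idea: variants found in the tumor but absent from the patient’s healthy tissue are candidate somatic mutations. The variant sets below are illustrative placeholders; real somatic-variant calling works on raw sequencing reads with statistical models, not simple set arithmetic.

```python
# Toy illustration: candidate somatic mutations are the set difference
# between variants seen in tumor tissue and variants seen in healthy
# (germline) tissue. Coordinates and alleles here are made up.

# Variants as (chromosome, position, reference_base, alternate_base).
normal_variants = {
    ("chr7", 55191822, "T", "G"),   # inherited (germline) variant
    ("chr17", 7674220, "C", "T"),
}
tumor_variants = {
    ("chr7", 55191822, "T", "G"),
    ("chr17", 7674220, "C", "T"),
    ("chr12", 25245350, "C", "A"),  # tumor-only: candidate somatic mutation
}

somatic_candidates = tumor_variants - normal_variants
for chrom, pos, ref, alt in sorted(somatic_candidates):
    print(f"{chrom}:{pos} {ref}>{alt}  candidate somatic mutation")
```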
Population Health/Improving Outcomes
At the University of Rochester, Dr. Henry Kautz is looking at ways of using social media data streams from sites such as Twitter to identify people who are complaining of being ill. He then applies a geo-mapping overlay to that data, creating a visualization that can predict the spread of diseases such as influenza.
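As a rough illustration of that approach, the sketch below filters a handful of sample posts for illness keywords and bins them onto a coarse latitude/longitude grid so hotspots can be counted. The sample data, keyword list, and grid size are all illustrative assumptions, not details of Dr. Kautz’s actual pipeline.

```python
# Minimal sketch: filter geotagged posts for sickness keywords, then
# snap each hit to a coarse lat/lon grid cell and count per cell.
from collections import Counter

ILLNESS_TERMS = {"flu", "fever", "sick", "cough", "sore throat"}

posts = [
    {"text": "Stuck home with the flu again", "lat": 43.16, "lon": -77.61},
    {"text": "Beautiful day for a run!", "lat": 43.15, "lon": -77.59},
    {"text": "Fever and a cough, staying in bed", "lat": 43.17, "lon": -77.62},
]

def mentions_illness(text: str) -> bool:
    """Crude keyword match; real systems use NLP to filter out
    non-literal uses like 'sick beat'."""
    lowered = text.lower()
    return any(term in lowered for term in ILLNESS_TERMS)

def grid_cell(lat: float, lon: float, size: float = 0.05) -> tuple:
    """Snap a coordinate to a coarse grid cell for aggregation."""
    return (round(lat / size) * size, round(lon / size) * size)

counts = Counter(
    grid_cell(p["lat"], p["lon"]) for p in posts if mentions_illness(p["text"])
)
for cell, n in counts.most_common():
    print(f"cell {cell}: {n} illness mention(s)")
```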
Reducing Costs
In 2010, Blue Shield of California and the California Public Employees’ Retirement System (CalPERS) launched a shared savings program designed to reduce costs while improving outcomes. They started with a lofty goal of $15 million in savings. To achieve this, they shifted their focus from the typical cost-based premium pricing to a price-based costing model: establish a benchmark price, then develop a program to achieve that objective.
Blue Shield used massive amounts of data over a span of three years to target key cost drivers and put in place key strategies such as evidence-based treatment, targeted care delivery, and intervention. They implemented rapid exchange of crucial clinical data, delivered at the point of care, to hone treatments and reduce adverse events. Not only did they reach their stated goal, they exceeded it by $5 million, all while significantly reducing readmissions, length of stay, and the number of inpatient admissions.
Those examples do show the merits of what’s to come—but what about right now? How can a majority of healthcare institutions start taking advantage of the swell of data now arriving at their fingertips?
Van Vleck explained:
Dr. Tielman Van Vleck, Director of Language Processing
“Big data will change healthcare in the U.S., because it is the linchpin of attaining the triple aim. However, for the U.S. healthcare market to fully actualize the benefits of big data, rapid access to large, disparate yet clinically relevant data sources, along with data integration, is essential. Of course, underpinning all of this is improving the quality of the data as well.”
Data access, integration, and quality: those are the common pillars of data management. The challenge big data brings to the table, however, is how to enter, classify, and transfer that data in a timely manner while keeping it intuitive and consumable for human use. Not all data arrives in the same form: some is structured, and much of it is unstructured. Extracting valuable information from those unstructured forms is essential not only to enhancing data quality, but also to performing better at the point of care for patients.
The increasing use of natural language processing (NLP) and standard clinical ontologies, such as SNOMED CT, is helping to apply meaning to the unstructured data that doesn’t fit into the structured EHR or EMR fields clinicians have come to know and loathe. Data fields are useful, but they aren’t always conducive to the way people communicate. That’s exactly the problem companies like Clinithink have tasked themselves with solving through NLP-powered technologies such as CLiX.
“We rely on structured tools like EHRs and EMRs to document data and drive workflow, but the problem is, this isn’t the way physicians like to document their care,” said Van Vleck. “They prefer to tell a story by using their own unique language. NLP unlocks the narrative and organizes it in a way that systems can then consume.”
Clinical content has its own unique framework, so relying on NLP alone isn’t enough. Tools that use clinical frameworks such as SNOMED CT to structure the data and apply clinical meaning are also essential, as is cross-mapping to other standards such as ICD-10, RxNorm and LOINC.
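For a flavor of what that concept-mapping step looks like, here is a minimal sketch that matches phrases in a clinical note against a tiny stand-in SNOMED CT dictionary, then cross-maps each concept to ICD-10. Real engines such as CLiX use full terminologies and far more sophisticated NLP; the dictionary, note, and mapping table below are simplified assumptions for illustration.

```python
# Minimal sketch: find known clinical phrases in free-text narrative,
# attach a SNOMED CT concept code, and cross-map to ICD-10.

SNOMED_LOOKUP = {
    "chest pain": "29857009",        # SNOMED CT: chest pain (finding)
    "type 2 diabetes": "44054006",   # SNOMED CT: type 2 diabetes mellitus
}

CROSS_MAP = {
    "29857009": {"ICD-10": "R07.9"},   # chest pain, unspecified
    "44054006": {"ICD-10": "E11.9"},   # type 2 diabetes w/o complications
}

def extract_concepts(note: str) -> list[dict]:
    """Naive phrase match; a production NLP pipeline would also handle
    negation ('denies chest pain'), abbreviations, and context."""
    lowered = note.lower()
    found = []
    for phrase, code in SNOMED_LOOKUP.items():
        if phrase in lowered:
            found.append({
                "phrase": phrase,
                "snomed": code,
                "maps": CROSS_MAP.get(code, {}),
            })
    return found

note = "Patient presents with chest pain. History of type 2 diabetes."
for concept in extract_concepts(note):
    print(concept)
```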
“Big data is all about giving researchers the power to see the forest for the trees,” said Anderson. “This means tools to store massive amounts of data, along with tools to analyze it – both algorithmically and through visualization for human processing. But it’s also about aggregating that data in the first place. This is why enabling access to dramatically more data from narrative is so crucial.”
Companies like Clinithink are not just hunting through the haystacks to discover, decipher and classify data; they are also devising better ways of inputting the data for intuitive use. To achieve that, you have to know as much about human workflow as you do about technical limits. Thus, the trend of decoding big data continues, and so does the promise of what’s possible because of it.
“By making sense out of massive volumes of data, you are able to separate the noise from the signal and come to meaningful conclusions that are not obvious on their own,” said Anderson. “This paves the way for new ways of treating patients or investigating alternative medications. Think mass customization. It also positions providers with restricted resources to target their care by delivering the right care at the right time in the right setting. The promise of progress thanks to big data is here. We just need time to figure out how we can best capture and communicate it all.”