It only takes one data breach to paralyze a whole industry. Hospitality offers a cautionary example. In 2018, hotel chain Marriott International disclosed a severe cyber attack, dating back to 2014, in which the information of up to 500 million guests was exposed. The scandal spilled into the political sphere when the attack was attributed to hackers working on behalf of the Chinese government. Hotel reservations are one thing; imagine the reaction to a similar hack exposing the medical records of entire nations.
Attacks on hospitals and medical care centers are not a question of if, but when. Protenus’ annual ‘Breach Barometer’ report details that 41.1 million private patient records were breached across 572 detected incidents in 2020. IBM estimates that the cost of a single data breach targeting a major healthcare provider is at least $7 million. The coronavirus pandemic has made the territory yet more precarious, with an Interpol report attesting to an alarming increase in cyberattacks “exploiting the fear and uncertainty caused by the unstable social and economic situation.”
As healthcare becomes wholly digitized, so too does the huge amount of private patient data the sector generates. The electronic health record (EHR) data for a single patient can average several gigabytes per month.
Telehealth, which has grown exponentially due to the pandemic, has deepened the data troves still further. In 2020, 43.5% of patients in the US received primary care via telehealth, meaning health providers require fast and secure communication between several data storage services to handle the influx. It is also paramount that highly sensitive Personally Identifiable Information (PII) and Protected Health Information (PHI) be safely guarded.
To manage such a quantity of sensitive data, there is only one long-term solution: structuring data along healthy data management and storage principles and migrating it to the cloud on a cloud-native architecture. While complex, providers must face the challenges posed by data migration head-on, because every day data remains in older structures and systems, it grows more vulnerable.
Sticking With Legacy
The healthcare industry is a late adopter of cloud computing, putting providers and their patients at risk. Until now, most healthcare providers have delayed data migration by prolonging the use of legacy systems. Migrating data is a complex process, as it calls for taking multiple technical, process, personnel, and business considerations into account.
Preserving business continuity and ensuring no disruption to user experience adds to the complexity, timelines, and costs. In addition, businesses have to find creative ways to provide analytics, reporting, and intelligence from this data to stay competitive.
Against this backdrop, it is tempting to resort to approaches that create a quick win or a sense of comfort. But this invariably translates into myopic measures: network-level security perimeters, sophisticated authentication routines, niche security tools, data warehouses that aggregate data, and controlled on-demand access for authorized individuals.
While these measures can deliver certain gains, they invariably end up creating a false sense of security. The bigger problem, however, is that they delay comprehensive steps toward enduring data security and business intelligence. In the meantime, providers continue storing most of their sensitive data on outdated physical storage systems, in an ecosystem that grows more vulnerable each day.
Pre-pandemic, migrating data systems was costly and cumbersome. The situation worsened with the pandemic, which saw enterprises shifting rapidly to the cloud, causing a spike in demand and price. Some companies registered cost increases of 20% to 50%.
If healthcare providers cannot afford to outsource migration to large vendors, their IT personnel are left to go it alone. However, migrating data requires expertise not only in IT, but also in healthcare and in regulatory requirements. When migrating data to cloud systems, even a small error can sharply increase the likelihood of a successful cyberattack.
Hospitals are chronically understaffed. A study by the Healthcare Information and Management Systems Society found that more than half of hospital respondents said their organization did not employ even one information or technology executive. If providers are brutally honest with themselves, they should accept the shortage of manpower and expertise they face when considering migrating data on their own. Even when fully staffed with skilled professionals who know best practices for handling data, the time a migration project takes, alongside maintaining business continuity, alone makes for an extremely difficult process.
How to Migrate
Data migration includes:
- restructuring data for PII/PHI separation and encryption,
- moving data from legacy systems to the cloud,
- ensuring that the systems are built to be cloud native, and
- ensuring security through network isolation controls and least privilege.
It must also take into account:
- constructs for on-demand and continuous access,
- communication between data repositories both at rest and in stream,
- physical data isolation per tenant in multi-tenant systems (see the sketch after this list), and
- an architecture that natively lends itself to robust data analytics, reporting, and intelligence.
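To make the isolation item concrete, here is a minimal sketch of per-tenant physical separation in which each tenant’s records live in their own datastore, resolved at request time. The storage root and the SQLite backend are illustrative assumptions, not a prescription:

```python
# Minimal sketch: each tenant gets a physically separate database,
# resolved at request time. The path and SQLite backend are
# hypothetical stand-ins for per-tenant databases in the cloud.
import sqlite3
from pathlib import Path

TENANT_DB_ROOT = Path("/var/data/tenants")  # assumed storage root

def connection_for_tenant(tenant_id: str) -> sqlite3.Connection:
    """Open the datastore that belongs to exactly one tenant.

    Because the separation is physical rather than logical, a leaked
    credential or a query bug scoped to one tenant cannot read another
    tenant's records.
    """
    db_path = TENANT_DB_ROOT / f"{tenant_id}.db"
    if not db_path.exists():
        raise LookupError(f"No datastore provisioned for tenant {tenant_id!r}")
    return sqlite3.connect(str(db_path))
```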
To accomplish this effectively, healthcare systems should avoid taking their “current baggage” with them when moving to a new system.
This can be achieved through an automated strategy that brings healthy engineering principles into every layer of the ecosystem. With the right constructs guiding the thought process, appropriate planning, and “Cloud Engineering Automation”, it can be done as part of the migration process itself.
The migration process may be viewed as multiple iterative cycles in which each application, together with its data, is migrated from its source to its new cloud destination, one application at a time. One tactic is to use machine learning to reliably spot errors or missing data points when collating data from various applications.
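As one hedged illustration of that tactic, the sketch below combines a rule-based completeness check with an unsupervised outlier detector while collating records from several source applications. The required columns and the choice of model are assumptions; the point is only that suspect records get flagged for review rather than silently migrated:

```python
# Sketch: flag suspect records while collating data from several
# source applications. Column names are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

REQUIRED = ["patient_id", "encounter_date", "provider_id"]  # assumed schema

def flag_suspect_records(frames: list[pd.DataFrame]) -> pd.DataFrame:
    combined = pd.concat(frames, ignore_index=True)

    # Rule-based pass: records missing required fields are flagged outright.
    combined["missing_required"] = combined[REQUIRED].isna().any(axis=1)

    # ML pass: an isolation forest scores the numeric columns for
    # anomalies such as impossible ages or wildly out-of-range values.
    numeric = combined.select_dtypes("number").fillna(0)
    model = IsolationForest(contamination=0.01, random_state=0)
    combined["anomaly"] = model.fit_predict(numeric) == -1

    # Everything flagged goes to a reconciliation queue, not straight
    # into the new cloud destination.
    return combined[combined["missing_required"] | combined["anomaly"]]
```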
Automation can also ensure compliance with the security standards and corporate policies set by the organization’s CTO/CIO/CISO. The DevOps team can introduce phase gates to ensure policies are adhered to throughout the data life cycle.
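A phase gate can be as simple as a pipeline step that refuses to promote a migration batch until every provisioned resource satisfies policy. The resource inventory below is a hypothetical stand-in; a real gate would query the cloud provider’s APIs for the same facts:

```python
# Sketch of a phase gate: a CI/CD step that halts deployment when any
# resource violates the organization's security policies.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    encrypted_at_rest: bool
    publicly_accessible: bool
    tls_required: bool

def phase_gate(resources: list[Resource]) -> None:
    """Raise before deployment if any resource violates policy."""
    violations = []
    for r in resources:
        if not r.encrypted_at_rest:
            violations.append(f"{r.name}: encryption at rest disabled")
        if r.publicly_accessible:
            violations.append(f"{r.name}: publicly accessible")
        if not r.tls_required:
            violations.append(f"{r.name}: TLS not enforced")
    if violations:
        raise SystemExit("Policy gate failed:\n" + "\n".join(violations))
```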
For compliance, PII/PHI data must be physically separated from the rest of the operational data. Appropriate data encryption and least privilege, combined with physical separation of data per tenant in multi-tenant systems, reduce the possibility of data being held for ransom. This also greatly reduces the blast radius in case of a breach.
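One way to sketch that separation: PHI lives only in an encrypted vault keyed per tenant, while the operational store holds nothing but an opaque token. The in-memory dictionaries and locally generated Fernet key below stand in for separate databases and a managed key service; both are assumptions, not the article’s prescription:

```python
# Sketch: PHI is tokenized and encrypted in a separate vault; the
# operational record holds only the token, never the PHI itself.
import uuid
from cryptography.fernet import Fernet

phi_vault: dict[str, bytes] = {}    # physically separate store in production
operational: dict[str, dict] = {}   # holds no PHI, only opaque tokens

tenant_key = Fernet(Fernet.generate_key())  # stand-in for a per-tenant KMS key

def store_patient(record_id: str, name: str, diagnosis: str) -> None:
    token = str(uuid.uuid4())
    phi_vault[token] = tenant_key.encrypt(f"{name}|{diagnosis}".encode())
    operational[record_id] = {"phi_token": token, "status": "active"}

def read_phi(record_id: str) -> str:
    token = operational[record_id]["phi_token"]
    return tenant_key.decrypt(phi_vault[token]).decode()
```

Even if the operational store is breached, the attacker holds tokens rather than health records, and the per-tenant key limits the blast radius further.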
Further, a well-developed open API standard allows a security team to see how interactions take place between data repositories and how APIs communicate with databases. The machinery behind the API system requires network isolation controls and the enforcement of least privilege, which allows applications to interact with one another while remaining separate and secure. Security teams can then observe the interactions between databases and analyze potential threats.
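At its simplest, least privilege between APIs and databases reduces to a deny-by-default authorization gate that every data access passes through. The service names and scopes here are hypothetical placeholders:

```python
# Sketch: deny-by-default scopes between services and data stores.
ALLOWED_SCOPES = {
    "scheduling-api": {"appointments:read", "appointments:write"},
    "billing-api": {"claims:read"},
}

def authorize(service: str, scope: str) -> None:
    """Permit the call only if the service was explicitly granted the scope."""
    if scope not in ALLOWED_SCOPES.get(service, set()):
        raise PermissionError(f"{service} is not granted {scope}")

def read_claims(service: str) -> list[str]:
    authorize(service, "claims:read")  # single, auditable choke point
    return ["claim-001"]               # placeholder query result
```

Here read_claims("billing-api") succeeds while read_claims("scheduling-api") raises PermissionError, and that one choke point is exactly where a security team can observe and log every interaction.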
Finally, it is important that all communications be encrypted. No one can expect constant vigilance from those using communication channels; at some point, patients or providers will share highly sensitive information. Encryption ensures that information cannot fall into the wrong hands.
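In code, enforcing encrypted transport costs only a few lines. This sketch pins a minimum TLS version for a service-to-service call; the endpoint URL is a placeholder, and the TLS 1.2 floor reflects common practice rather than anything mandated here:

```python
# Sketch: refuse any channel weaker than TLS 1.2 for service calls.
import ssl
import urllib.request

context = ssl.create_default_context()           # verifies server certificates
context.minimum_version = ssl.TLSVersion.TLSv1_2

with urllib.request.urlopen(
    "https://ehr.example.internal/api/records",  # hypothetical endpoint
    context=context,
) as resp:
    payload = resp.read()
```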
Staying Safe in a World of Threats
Migrating data to the cloud is not merely an efficiency initiative; there is immediate urgency for healthcare providers to move their data to the cloud and make their systems cloud native. In today’s environment, however, providers must take the broader view to ensure enduring security, efficiency in both capital and resources, and to make data a corporate strategic asset.
The central idea should always be the end state: one based on healthy engineering and data architecture principles. Technology leaders and architects should remember that the problem lies less with the tools they are using than with an ecosystem that inevitably forces them to compromise data and system integrity, resulting in vulnerabilities.
It is important to understand that enduring security and data integration can only result from solid, holistic architecture, not from patching niche tools over antiquated systems, inconsistent configurations, and implementations.
Sashank Purighalla is the Founder/CEO at BOS Framework, a Cloud Engineering Platform that automates seamless transition to Microservices and DevSecOps, enabling businesses to drive Data and Product strategies with security, scale and compliance.