This is the final installment of a series focused on interoperability, a key to continuing the momentum started by the Meaningful Use program, highlighted in “2016 HIT Trends: Consensus Predictions.”
A number of technologies and trends will guide the way interoperability evolves over the next several years. These are detailed in the Office of the National Coordinator's 10-year plan for interoperability in the US, which provides a set of interoperability goals for the next decade and a blueprint for reaching them that encompasses standards, policies, technologies, and market forces.
There's quite a lot of content in the 77-page report, so this article will focus on just three aspects that are relevant to the consensus predictions for 2016 summarized earlier.
1. Learning Health System
The overarching aim of the interoperability roadmap is to create a Learning Health System (LHS). A learning health system is briefly described as one in which encounter data is collected and analyzed to gain new insights about diseases and treatments. The insights are rapidly turned into best practices, advancing the state of the art and improving future patient encounters.
This is, of course, how the practice of medicine has evolved throughout its history. The important difference with an LHS is that enabling technologies accelerate the cycle time. The LHS relies on data flowing from healthcare facilities to research institutions, public health departments, and payers, so interoperability is a fundamental prerequisite.
2. Fast Healthcare Interoperability Resources (FHIR)
Featured prominently in the ONC’s roadmap, but less so in the 2016 predictions by industry publications, is a new technology called Fast Healthcare Interoperability Resources (FHIR).
Of the 81 predictions gathered from 10 articles about trends and predictions for 2016, only two mentioned FHIR. Chilmark predicted that APIs would gain momentum, but FHIR would not yet catch on. John Halamka, MD, in Healthcare IT News, predicted that a new breed of apps would use FHIR APIs to add a new layer of functionality on top of transactional systems.
The technical details behind FHIR are discussed in this presentation, but the non-technical aspects are what make it truly unique. First, FHIR leverages well-adopted technologies such as REST, JSON, XML, and OAuth 2.0, which means developers don't have to learn and implement unfamiliar technologies to facilitate healthcare interoperability. Second, FHIR strikes a compromise between HL7 v2.x's granular, real-time messages and CDA's compendious XML documents; FHIR resources are just the right size for business applications. Third, because of the first two points, FHIR is gaining broad praise and acceptance among developers, and the impact of an enthusiastic developer community cannot be overstated. A long and growing list of vendors has formally expressed interest in FHIR, and other organizations besides these are working with it as well.
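To make this concrete, here is a minimal sketch of working with a FHIR resource in code. The Patient JSON below is trimmed to a few fields and follows the R4-style layout (field shapes have varied across FHIR draft versions), so treat it as illustrative rather than spec-complete.

```python
import json

# A trimmed-down FHIR Patient resource. A RESTful read of this resource
# would look like: GET [base]/Patient/example
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient: dict) -> str:
    """Render a human-readable name from a FHIR Patient resource."""
    name = patient["name"][0]
    return f'{" ".join(name["given"])} {name["family"]}'

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print(display_name(patient))  # Peter James Chalmers
```

Because the payload is plain JSON over HTTP, any developer who has consumed a modern web API can read and manipulate it with standard tooling, which is precisely the adoption advantage described above.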
3. Interoperability Measurement
The ONC's plan calls for measuring the current level of interoperability with an eye toward improving the metrics over time. The private sector has begun to define what successful interoperability looks like: KLAS Research held a summit of 12 large EMR vendors that resulted in agreement on a method for measuring interoperability, and KLAS will begin publishing its measurements as a way to inform the market about how vendors are performing.
While the ONC’s roadmap contains a number of ambitious objectives, the early success of FHIR and steps toward quantifying interoperability are building momentum toward a Learning Health System.
This is the second installment of a series focused on interoperability, a key to continuing the momentum started by the Meaningful Use program, highlighted in “2016 HIT Trends: Consensus Predictions.”
As an industry, healthcare IT has succeeded in implementing the first phase of interoperability by solving the problem of internal integration. Despite best intentions, the next phase, integrating across institutions, is off to a slow start.
Solving the first interoperability challenge focused on getting siloed systems within an individual hospital or facility to talk with one another. Time savings, error reduction and improved workflow provided the motivation to move on this initiative. Fortunately, when a hospital wants to integrate its existing systems:
It owns the software and data network.
The employees share a common mission and management structure.
Standards like HL7 version 2.x and TCP/IP are equal to the task of integrating systems within a hospital's environment.
Internal integration is a largely solved problem today as a result of many years of hard work by dedicated teams of integration specialists.
The new interoperability challenge before us is integrating data across different institutions. In addition to the technical difficulties discussed in my last article, integration across institutions involves the difficulties of managing a project across two (or more) different legal entities with their own management, processes, policies and priorities. The complexities of this problem remain unresolved, and that's why it appears on the list of predictions for 2016.
In addition to communications and project management challenges, the second phase of interoperability is driven by different, more ambitious motivators. Some of these (1-3 below) are discussed at length in the AHA's publication, Why Interoperability Matters.
Top six motivators for integration between institutions
1. Care Coordination
Care coordination means making sure a patient transitions smoothly among care settings, including their home. A recent Chilmark blog post provides an example of a patient with an injury who starts in an urgent care center, visits an orthopedist, and looks for his record in a patient portal. Patient data, including medication lists, discharge instructions, and progress notes, needs to flow through the system in tandem with the patient in order to produce the best outcomes and reduce admission risks.
2. Public Health Reporting
Public health departments in all jurisdictions require clinicians to report on certain conditions when they are detected during a patient visit. Reporting processes are largely manual, which creates extra work for providers and results in untimely reports. Completely solving the public health reporting problem will require considerable effort, but interim progress would prove helpful, even with partial solutions.
3. Patient Engagement
Patients become disengaged when the clinicians providing their care have poor communication tools. The AHA reports that one third of patients have had problems related to clinician information exchange, such as having to bring their own X-rays to an appointment. Interoperability isn't the only factor that limits patient engagement, but it's an important one to get under control.
4. Population Health
In Healthcare IT News, John Halamka, M.D. defines population health as the capability to "automatically aggregate data from multiple provider, payer and patient sources then create lists of patients with care gaps to be closed". This is only possible when the various sources are connected.
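A toy sketch of the capability Halamka describes, assuming the data from provider, payer, and patient sources has already been merged. The patient records, condition names, and 180-day HbA1c rule below are all invented for illustration:

```python
from datetime import date

# Hypothetical merged records from provider, payer, and patient sources.
patients = [
    {"id": "p1", "conditions": ["diabetes"], "last_a1c": date(2015, 3, 1)},
    {"id": "p2", "conditions": ["diabetes"], "last_a1c": date(2015, 12, 15)},
    {"id": "p3", "conditions": ["asthma"],   "last_a1c": None},
]

def a1c_gap_list(patients, as_of, max_age_days=180):
    """Return IDs of diabetic patients overdue for an HbA1c test."""
    gaps = []
    for p in patients:
        if "diabetes" not in p["conditions"]:
            continue
        last = p["last_a1c"]
        if last is None or (as_of - last).days > max_age_days:
            gaps.append(p["id"])
    return gaps

print(a1c_gap_list(patients, as_of=date(2016, 1, 31)))  # ['p1']
```

The logic is trivial; the hard part, as the article argues, is getting all of those sources connected and flowing in the first place.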
5. The Learning Health System
To build the learning health system envisioned by the ONC's 10-year interoperability roadmap, it is necessary to run analytics on clinical, financial, and other data drawn from multiple sources within and across institutions. This requires the ability to move data between and among facilities.
6. The Internet of Things (IoT)
Integrating data from medical and personal devices is required to support innovative patterns of healthcare delivery. The new types and formats of data coming from these devices present a novel challenge to integration tools and practitioners.
Meaningful Use may have been intended as a motivator of integration initiatives, but the real drivers are healthcare system reforms, such as new payment models and an emphasis on high quality and outcomes. Meaningful Use was an important incentive that helped realize these changes and build the infrastructure to support them, but it was never a primary driver (and going forward it will be replaced).
My next article will discuss emerging technologies and policies that will impact interoperability in the near future.
This article is the first of a series focused on interoperability, a key to continuing the momentum started by the Meaningful Use program, highlighted in “2016 HIT Trends: Consensus Predictions.”
Now that a critical mass of healthcare data is stored electronically rather than on paper, it's natural that we would expect it to flow freely among electronic systems. The healthcare community is learning what integration specialists have always known: interoperability is hard!
Why? Because there are many details that have to be exactly aligned in order to get two systems talking to one another. Here are four aspects of interoperability with a rating of how we're doing overall in each.
1. Protocol-level interoperability. This means simply being able to get a message from one system to another, regardless of the message content. For example, messages can be exchanged over a raw TCP connection, a heavyweight SOAP web service, or any number of other standard protocols. After agreeing on a protocol, sending and receiving systems must also agree on security methods. While there are a lot of details to handle, these are usually reasonably easy to resolve. Grade: A
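As a concrete example of this layer: HL7 v2 messages sent over a raw TCP connection are conventionally framed with MLLP (Minimal Lower Layer Protocol), which brackets each message with fixed control bytes. A sketch of that framing (the sample message content is invented):

```python
# MLLP (Minimal Lower Layer Protocol) framing, the de facto wrapper for
# HL7 v2 messages transported over raw TCP.
START_BLOCK = b"\x0b"       # vertical tab marks the start of a message
END_BLOCK = b"\x1c\x0d"     # file separator + carriage return mark the end

def mllp_wrap(message: bytes) -> bytes:
    """Frame an HL7 message for transmission over TCP."""
    return START_BLOCK + message + END_BLOCK

def mllp_unwrap(frame: bytes) -> bytes:
    """Strip MLLP framing from a received frame."""
    if not (frame.startswith(START_BLOCK) and frame.endswith(END_BLOCK)):
        raise ValueError("not a valid MLLP frame")
    return frame[len(START_BLOCK):-len(END_BLOCK)]

msg = b"MSH|^~\\&|LAB|HOSP|EMR|HOSP|20160101||ORU^R01|1|P|2.4"
assert mllp_unwrap(mllp_wrap(msg)) == msg
```

The framing itself is trivial, which is exactly why this layer earns an A: once both sides pick MLLP (or SOAP, or anything else), there is little left to argue about.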
2. Syntactic interoperability. Syntax is about transmitting messages that can be read. As with protocol interoperability, a number of standards are in common use: XML, HL7, X12, and many others. Once there is agreement on the general syntax (e.g., HL7 version 2.4), there is still a lot of work to do to ensure that, between sending and receiving systems, all the fields are in the same place (e.g., is the name "firstname lastname" or "lastname, firstname"?), field lengths match, and so on. This typically takes much longer to negotiate, but usually the receiving system can get what it needs from the sender. Grade: B+
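To illustrate the field-placement negotiation described above, here is a deliberately simplified HL7 v2 parsing sketch. It ignores real-world details such as escape sequences, field repetition, and the special numbering of the MSH segment (where the field separator itself counts as MSH-1), and the sample message is invented:

```python
def parse_hl7_field(message: str, segment_id: str, field_index: int) -> str:
    """Return a field from the first matching segment of an HL7 v2 message.
    field_index follows HL7 numbering for non-MSH segments (PID-5 -> 5)."""
    for segment in message.strip().split("\r"):
        fields = segment.split("|")
        if fields[0] == segment_id:
            return fields[field_index]
    raise KeyError(segment_id)

hl7 = "MSH|^~\\&|ADT|HOSP|EMR|HOSP|20160101||ADT^A01|42|P|2.4\rPID|1||12345||DOE^JANE"

# PID-5 carries the patient name; HL7 puts the family name first,
# which is exactly the kind of ordering detail the two sides must agree on.
name = parse_hl7_field(hl7, "PID", 5)
family, given = name.split("^")
print(given, family)  # JANE DOE
```

If the sender instead put the given name first in PID-5, the message would still parse cleanly, and the error would only surface later as wrong data, which is why this layer takes longer to negotiate than the protocol layer.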
3. Workflow interoperability. After the systems are successfully exchanging messages, the applications need to agree on what is appropriate to say. For example, imagine an EMR that receives lab results from a laboratory information management system (LIMS). The EMR might reasonably require that a lab result must be in the final status before a correction is accepted. But perhaps the LIMS allows the user to skip the final lab result if a correction is available at the time of transmission. If the LIMS transmits the correction without first sending the final result, the EMR will reject the message due to the unexpected status. These types of interoperability mismatches tend to be difficult to smooth over because each system has different business rules to define expected messages. These rules are usually hard-coded in the application rather than handled in the interface layer, where more flexibility is available. Often a resolution is only possible with compromise, which can sometimes impact the seamless flow of data. Grade: C
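The receiving EMR's business rule can be shown in miniature with the sketch below. The status codes P, F, and C echo HL7's OBX-11 result-status values, but the rule itself is invented for illustration:

```python
# A toy receiver-side rule: accept a correction ("C") only after a final
# result ("F") has been seen -- the mismatch described above, in miniature.
class ResultTracker:
    def __init__(self):
        self.finalized = set()

    def accept(self, result_id: str, status: str) -> bool:
        if status == "F":          # final result: remember it
            self.finalized.add(result_id)
            return True
        if status == "C":          # correction: only valid after a final
            return result_id in self.finalized
        return status == "P"       # preliminary results are always accepted

tracker = ResultTracker()
assert tracker.accept("obs-1", "F") is True
assert tracker.accept("obs-1", "C") is True    # correction after final: OK
assert tracker.accept("obs-2", "C") is False   # correction first: rejected
```

Because rules like this live inside each application rather than in the interface layer, reconciling them usually means changing application behavior on one side or the other, not just remapping fields.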
4. Semantic interoperability. Semantic interoperability means the sender and receiver have achieved a common understanding by exchanging a message. This is only possible if the sender and receiver have the same meanings for each data field and use the same code sets to encode those fields. For example, does "diagnosis" mean admission diagnosis or discharge diagnosis? Is it coded with ICD-9 or ICD-10? (Or SNOMED?). Translations among coding systems are possible, but usually imperfect. They can only be implemented if an interface developer realizes there is a mismatch. Grade: D
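A sketch of why code-set translations are imperfect: a lookup-based crosswalk can only return what is in its table, and a missing entry is exactly the mismatch an interface developer has to notice and handle. The single mapping shown (ICD-9 250.00 to ICD-10 E11.9, type 2 diabetes without complications) follows the published crosswalks as far as I know, but real tables contain thousands of entries, many of them one-to-many or only approximate:

```python
from typing import Optional

# A tiny, illustrative crosswalk. Real General Equivalence Mappings (GEMs)
# are far larger and include approximate and one-to-many mappings.
ICD9_TO_ICD10 = {
    "250.00": "E11.9",  # type 2 diabetes mellitus without complications
}

def translate(code: str, mapping: dict) -> Optional[str]:
    """Return the mapped code, or None when no translation exists --
    the silent gap a developer must detect rather than ignore."""
    return mapping.get(code)

assert translate("250.00", ICD9_TO_ICD10) == "E11.9"
assert translate("999.99", ICD9_TO_ICD10) is None
```

Even a perfect lookup table cannot fix the deeper problem described above: if the sender means "admission diagnosis" and the receiver stores it as "discharge diagnosis", the translated code is semantically wrong despite being syntactically valid.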
Other industries have standards that make interoperability relatively easy. Unfortunately, such standards are not available in healthcare, for two reasons. First, vendors have used different data models and business rules to represent healthcare concepts in their systems. Rather than implementing standards with strict adherence to the specifications, vendors often implement them loosely so that their own model of healthcare fits more easily into the standard. (Using Z-segments in HL7 version 2.x is the textbook example of this kind of loose implementation.) Second, the richness and importance of healthcare data make it difficult to capture meaning and nuance electronically at all, let alone to have one system communicate them to another in an understandable way.
The path to complete interoperability in healthcare is going to be long and difficult. But it's not all doom and gloom: remarkable progress has already been made, and the better care it promises for all makes healthcare interoperability well worth the pursuit.
My next posts will drill a little deeper into the prediction that interoperability is the key to continuing the momentum started by the Meaningful Use program.
The first post in this series was an overview of the 2016 Health IT Trends project (including the list of sources). This article summarizes and condenses the 81 individual predictions into a top 10 list of meta-predictions for 2016.
I have listed them (approximately) in order from most frequently predicted to least.
1. The role of IT (and therefore also the role of the CIO) will shift starting in 2016. Maintaining and improving core infrastructure and legacy systems will remain important, but IT will also need to step up its development efforts, using customer-focused, agile techniques to meet the requirements of a rapidly evolving business environment.
2. Major changes in provider and payer organizations, including mergers, acquisitions, divestitures, and partnerships, will favor organizations poised to capitalize on new models of reimbursement. Government may block or regulate some of these activities to ensure a competitive environment.
3. Where market forces and industry self-policing fail to keep costs and quality in line, government will step in to enact reforms. Drug costs, lab-developed tests, mobile apps, and medical necessity are among the candidates for regulation.
4. Security threats will continue to be a top priority for IT departments. New techniques and best practices will become more widespread to help reduce some threats.
5. Interoperability is perennially on the list of areas needing improvement. In 2016, vendors and IT departments must make significant advances or face government intervention. Meaningful Use Stages 1 and 2 helped migrate a majority of medical records from paper to electronic format and established an interoperability baseline. To build on this momentum, data needs to flow freely among systems.
6. Simple patient care activities that have historically taken place in the primary care provider's office will occur in other settings, notably retail locations (such as pharmacies) and remotely via telemedicine.
7. Mobile apps, wearables, and the Internet of Things (IoT) will enhance and automate remote management of patients with chronic conditions. The same technologies will also help healthy patients maintain good health and watch for early warning signs of medical problems, transmitting this patient-generated data electronically to their care providers. Exactly how clinicians will use consumer device data, and whether the government will regulate it, are still open questions.
8. Population health will move from a buzzword to actual practice. Precision medicine, which takes into account each patient's unique social and genomic determinants of health, will emerge as the new cutting edge for healthcare providers to proactively render care.
9. Employers and other purchasers of healthcare insurance will begin to scrutinize costs, demanding more predictability and transparency.
10. The healthcare system will require a different set of tools as its focus shifts from treating ill patients to maintaining healthy populations. Analytics, care coordination systems, and secure communication tools will gain prominence, while the primacy of the EHR/EMR declines.
Future installments in this series will examine these consensus predictions in more detail with an eye toward tracking what's happening today, where the trends may lead by the end of 2016, and which prognosticators made the most accurate predictions.
Each new year starts with a flood of predictions, and it can be difficult to know which ones are truly prescient and which are just a lazy way to fill column space during the holidays. I had initially planned a project that would catalog as many 2016 predictions in the field of Healthcare IT as possible and track which came true throughout the year, but two major problems arose. First, there are simply too many predictions to track. Second, most of the predictions are subjective and untestable. The predictions are mostly about trends, not events. Even for those that contain quantified metrics (e.g., the wearables market will hit $6 billion), it can be difficult to determine whether the target was achieved.
I decided to address the first problem by choosing 10 publications from which to draw the lists of 2016 predictions. To solve the second problem, I've rolled up the predictions into a consensus. The next blog post will document the consensus predictions. I will explore each one in detail over the next several installments.
By the time I have explored each prediction, it will be deep enough into the year to ascertain how the predictions are tracking against reality. Because the predictions are imprecise, my analysis will be qualitative rather than a yes/no on whether each one came true.
The predictions come from the following 10 sources:
7. MDDI: 5 Bold Predictions For Medtech in 2016
MDDI serves the Medical Device and Diagnostic Industry, which might not be the best bellwether for the broader HIT industry, but this set of predictions adds some richness to the overall data set.
All together, the 10 articles contain 81 distinct predictions. No two predictions were exactly the same, but many of them cover the same territory. My next post will highlight the common themes and create a list of consensus predictions for 2016.