Healthcare IT News – Authored by John Andrews, Contributing Editor.

The promise of genomics and personalized care is closer than many realize, but clinical systems and EHRs are not ready yet. While policymakers and innovators play catch-up, here’s a look at what you need to know.

Considering how fast technology advances in the healthcare industry, it seems natural that a once-innovative concept could become obsolete in the span of, say, a dozen years. Knowledge, comprehension and capabilities continue moving forward, and if the instruments of support don’t keep pace, a rift can appear. Left unaddressed, it can escalate into a seismic event.

Some contend that this situation exists with the rapid advancement of precision medicine continually outstripping the static state of electronic health records. Medical research is forging ahead with genomic discoveries, while EHRs remain essentially the same as when the Office of the National Coordinator for Health Information Technology launched the interoperability initiative in 2004.

Over that time, healthcare provider IT teams have worked tirelessly to implement EHR-capable systems and to achieve industry-wide interoperability. If the relationship between science and infrastructure has hit an intractable bottleneck, what are the reasons for it?

“It depends on how you look at it,” noted Nephi Walton, MD, biomedical informaticist and genetics fellow at Washington University School of Medicine in St. Louis. “One of the problems I have seen is when new functionality is created in EHRs, it is not necessarily well integrated into the overall data structure and many EHRs as a result have a database structure underneath them that is unintelligible with repetitive data instances. We often seem to be patching holes and building on top of old architecture instead of tearing down and remodeling the right way.”

Walton addressed the disconnect between the growth in precision medicine and the limitations of healthcare IT infrastructure at a presentation during the recent HIMSS Big Data and Healthcare Analytics Forum in San Francisco.


“IT in healthcare tends to lag a bit behind other industries for a number of reasons,” he said. “One of them is that healthcare IT is seen as a cost center rather than a revenue-generating center at most institutions, so fewer resources are put into it.”

Overall, EHR limitations have resonated negatively among providers since they were introduced, said Dave Bennett, executive vice president of product and strategy at Orion Health in Scottsdale, Ariz.

“The EHR reality has fallen painfully short of the promise of patient-centric quality care, happy practitioners and reduced costs,” he said. “In recent surveys, EHRs are identified as the top driver of dissatisfaction among providers. According to the clinical end-users of EHRs, it takes too long to manage menial tasks, it decreases face-to-face time with patients, and it degrades the quality of documentation. In one sentence, it does not bring value to providers and consumers.”

Despite the limitations, though, EHR designs aren’t to blame, Bennett said.

“It is not the technology in itself – it is the technology usability that needs a new approach to successfully deliver data-driven healthcare,” he said. “We need to redesign the EHR with the patient in mind and build a technology foundation that allows the EHR full integration into the care system. Today’s EHRs are good for billing and documenting but are not really designed to be real-time and actionable. They cannot support an ecosystem of real-time interactions, and they lack the data-driven approaches that retail, financial, and high tech industries have taken to optimize their ecosystems.”

Strengthening weak links
Technological disparity doesn’t just exist between medical research and EHRs, but also in how EHRs are used within health systems, added Jon Elwell, CEO of Boise, Idaho-based Kno2.

“One of the biggest struggles in healthcare IT today is the widely uneven distribution of healthcare providers, facilities and systems along the maturity continuum towards a totally digital healthcare future,” he said. “One healthcare system is only as technologically advanced as the least mature provider or facility within its network.”

For example, he said an advanced accountable care organization may be using EHRs in every department and using direct messaging to exchange patient charts and other content with others in the network. However, he said, it is still common for some to be using faxes to communicate, “thrusting the advanced system back into the dark ages.”

As an industry, providers “have to work harder to develop solutions that prevent early adopters from being dragged down to the lowest common technology denominator,” Elwell said. “These new solutions should extend a hand up to less-advanced providers and facilities by providing easy ways for them to adopt digital processes, particularly when it comes to interoperability.”

Aligning vision with reality
The Office of the National Coordinator for Health IT, which evolved alongside EHRs over the past 12 years, hasn’t sat idly by as the imbalance has gradually appeared. Jon White, MD, deputy national coordinator, is fully aware of the situation and says it is time to take a fresh look at precision medicine and EHRs.

“What we need to do is bring reality in with our vision,” he said. “It’s not just science, but the IT infrastructure that supports it.”

With roots going back to 2000, precision medicine sprang from genome sequencing and has continued to map its route forward. White said that at its inception the ONC realized the information infrastructure needed improvement, and the EHR initiative was designed to get the process moving.

“The view of precision medicine and the vision for precision medicine has broadened considerably beyond the genome, which is still a viable part of the precision medicine field,” White said. “But it is really about people’s data and understanding how it relates to one another.”

Precision medicine is being given a cross-institutional approach, with new types of science and analysis emerging and a new methodology being envisioned, White said. For IT, a solid and dynamic infrastructure has been built “where little existed before, and over the past seven years EHR adoption has gone from 30 percent of physicians to 95 percent now.”

So the vast majority of provider organizations are now using EHRs and the systems are operating with the clinical utility that was expected, White said. Next steps for interoperability and enhanced functionality, he added, are a logical part of the long-term process.

“EHRs are doing a lot of the things we want them to do,” he said. “We’re at a place where we have the information structure and need to understand how to best use it as well as continuing to adapt and evolve the systems.”


More introspection needed
In order for EHRs to gain more functionality and interoperability to achieve a wider scope of utilization, more has to be done with the inner workings of the process, Walton said.

“I don’t think there has been much of a focus on interoperability between systems, especially now that you have a few major players that have pretty much taken over the market,” he said. “I fear that as we have fewer choices, there will be less innovation, and I sense now that EHR vendors are more likely to dictate what you want than to give in to what you need. The overarching problem with interoperability is that there is no common data model – not only between vendors, but between instances of a particular vendor. There really needs to be a standard data model for healthcare.”
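To make the “no common data model” complaint concrete, here is a minimal, purely hypothetical sketch: the same lab result arrives from two invented vendor systems in different shapes and units, and every consumer has to reinvent the normalization before the values can even be compared. Field names and payloads are made up for illustration; they do not depict any real vendor.

```python
# Illustrative only: two hypothetical vendor payloads for the same lab
# result, each with its own field names and units, normalized into one
# common record -- the mapping work every integration must redo when
# no shared data model exists.

def normalize_vendor_a(rec):
    """Hypothetical vendor A reports glucose in mg/dL under ad-hoc keys."""
    return {
        "patient_id": rec["pt"],
        "test": rec["obs_name"].lower(),
        "value_mg_dl": float(rec["val"]),
    }

def normalize_vendor_b(rec):
    """Hypothetical vendor B reports the same test in mmol/L with different keys."""
    return {
        "patient_id": rec["patientIdentifier"],
        "test": rec["testName"].lower(),
        "value_mg_dl": round(float(rec["result"]) * 18.016, 1),  # mmol/L -> mg/dL
    }

a = normalize_vendor_a({"pt": "123", "obs_name": "Glucose", "val": "99"})
b = normalize_vendor_b({"patientIdentifier": "123", "testName": "Glucose",
                        "result": "5.5"})
print(a["test"] == b["test"])  # same test, finally comparable
```

A shared data model would make both adapters unnecessary; without one, this glue code multiplies with every vendor pair.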

Yet while precision medicine – especially as it relates to genomics – continues to emerge, analysts like Eric Just, vice president of technology at Salt Lake City-based Health Catalyst, aren’t sure IT infrastructure is solely to blame for the problem.

“I’m not really convinced that EHR interoperability is the true rate limiter here, save for a few very advanced institutions,” he said. “Practical application of genomics in a clinical setting requires robust analytics, the ability to constantly ingest new genomic evidence, and there needs to be a clinical vision for actually incorporating this data into clinical care. But very few organizations have all of these pieces in place to actually push up against the EHR limits.”

To be sure, White acknowledged that academic institutions that pushed EHRs for research purposes do want more functionality and capability from electronic records.

“Those large academic institutions have been telling their vendors that when it comes to EHRs, ‘this is our business and we need you to meet our needs,’” he said.

When presenting on the topic of precision medicine and EHRs, Just said he senses “a big rift” between academic and non-academic centers on the topic.

“Our poll shows that maybe the issue is not EHRs, but the science that needs to be worked out,” he said. “A lot of progress is being made, but analyzing the whole genome and bringing it to the medical record is not an agenda that many organizations are pushing. And those that are don’t have a clear vision of what they’re looking for.”

Charting new horizons
Because precision medicine’s advancement is growing so rapidly, it is understandable that EHRs will be limited, Just said.

“These new analyses have workflows no one has seen before, they need to be developed and current technology won’t allow it,” he said. “EHRs are good at established workflows, but we need to open workflows so that third parties can develop extensions to the EHR.”

As it exists today, the healthcare IT infrastructure is “simply genomic unaware,” said Chris Callahan, vice president of Cambridge, Mass.-based GeneInsight, meaning the records make no accommodation for genetic data.

“Epic and Cerner don’t have a data field in their system called ‘variant,’ for example, the basic unit of analysis in genetics,” he said. “It’s simply not represented in the system. They are not genomic ready.”
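To illustrate what a structured “variant” field might hold if it existed, here is a hypothetical sketch. The field names follow common genetics reporting conventions (HGVS notation, ClinVar-style significance terms), but this is not any vendor’s actual schema, and the example values are illustrative.

```python
# A hypothetical sketch of the structured "variant" record that the
# quoted EHRs lack. Fields follow common genetics reporting conventions
# (HGVS notation, ClinVar-style significance), not any vendor's schema.
from dataclasses import dataclass

@dataclass
class Variant:
    gene: str          # HGNC gene symbol, e.g. "BRCA1"
    chromosome: str    # "1".."22", "X", "Y"
    position: int      # genomic coordinate
    ref: str           # reference allele
    alt: str           # observed allele
    hgvs: str          # HGVS description of the change
    significance: str  # e.g. "pathogenic", "benign", "uncertain"

    def is_reportable(self) -> bool:
        """A deliberately simple triage rule, for illustration only."""
        return self.significance in ("pathogenic", "likely pathogenic")

v = Variant("BRCA1", "17", 43104121, "C", "T",
            "NM_007294.3:c.5123C>T", "pathogenic")
print(v.is_reportable())  # True
```

Without a field like this, a lab’s variant call ends up as a PDF or free-text note, invisible to decision support.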

Enakshi Singh is a genomic researcher who sees firsthand academia’s quest for higher EHR functionality. As a senior product manager for genomics and healthcare for SAP in Palo Alto, Calif., she is at the center of Stanford School of Medicine’s efforts to apply genomic data at the point of care. In this role, she works with multidisciplinary teams to develop software solutions for real-time analyses of large-scale biological, wearable and clinical data.

“The interoperability win will be when patients can seamlessly add data to their EHRs,” she said. “But at this point, today’s EHR systems can’t handle genomic data or wearable data streams.”

EHRs may not be equipped for ultra-sophisticated data processing and storage, but Singh also understands that they reflect the limitations of the medical establishment when it comes to genomic knowledge. Every individual’s genomic code contains approximately three billion characters, with roughly three million variants specific to each person.

“General practitioners aren’t equipped to understand the three million characteristics that make each individual unique,” she said.

One reason for precision medicine’s growth is how the cost of sequencing has shrunk, Singh said. The first genomic sequence in 2000 took 13 years and $13 billion from a large consortium to produce. Today a genome can be sequenced for $1,000, which has led to a stampede of consumers wanting to find out their genetic predispositions, she said.

Singh’s colleague Carlos Bustamante, professor of biomedical data science and genetics at Stanford, calls the trend “a $1,000 genome for a $1 million interpretation.”

The frontier for genomics and precision medicine remains vast, Singh said, because of those three million variants, “we only know a fraction of what that means. When we talk about complex diseases, it’s an interplay of multiple different characters and mutations and how it’s related to environment and lifestyle. Integrating this hasn’t evolved yet.”

The other challenge is connecting with clinical data sets that have been shown to play a role in disease, integrating them at the point of care and creating assertions based on profile information. Singh is involved with building new software that takes new data streams and provides for quick interpretation. The Stanford hospital clinic is in the process of piloting a genomic service, where anyone at the hospital can refer patients to the service for a swab and sequencing.

“They will work the process and curate it, filter down what’s not important and go down the list related to symptoms,” Singh said. “This replaces searching through databases. What we have done is create a prototype app that automates the workflow and streamlines it for interpreting more patients. The current workflow without the prototype is 50 hours per patient, and ours dramatically cuts that time down. It’s not close to being in clinical decision support yet, but it did go through 30 patients with the genomic service.”
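The article doesn’t publish Stanford’s pipeline, but the filtering step Singh describes, keeping variants in genes linked to the patient’s symptoms and surfacing the most significant first, can be sketched with invented data. The gene-phenotype links, variants and ranking below are all hypothetical.

```python
# Hypothetical sketch of phenotype-driven variant triage like the
# workflow described above: keep only variants in genes linked to the
# patient's symptoms, then sort by clinical significance. All data
# and rankings here are invented for illustration.

SIGNIFICANCE_RANK = {"pathogenic": 0, "likely pathogenic": 1,
                     "uncertain": 2, "benign": 3}

def triage(variants, gene_to_phenotypes, symptoms):
    symptoms = set(symptoms)
    relevant = [
        v for v in variants
        if symptoms & set(gene_to_phenotypes.get(v["gene"], []))
    ]
    return sorted(relevant,
                  key=lambda v: SIGNIFICANCE_RANK.get(v["significance"], 9))

variants = [
    {"gene": "MYH7", "significance": "pathogenic"},
    {"gene": "CFTR", "significance": "pathogenic"},
    {"gene": "MYBPC3", "significance": "uncertain"},
]
links = {"MYH7": ["cardiomyopathy"], "MYBPC3": ["cardiomyopathy"],
         "CFTR": ["pulmonary disease"]}
shortlist = triage(variants, links, ["cardiomyopathy"])
print([v["gene"] for v in shortlist])  # ['MYH7', 'MYBPC3']
```

Automating this kind of filtering, rather than manually searching databases variant by variant, is what turns a 50-hour interpretation into something clinicians could actually use.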


Workflow and analytics
With a background in EHRs, Jeffrey Wu, director of product development for Health Catalyst, specializes in population health. To adequately utilize EHRs for genomics, Wu is developing an analytics framework capable of bringing data in from different sources, which could include genomics as part of the much broader precision medicine field. Ultimately, he said it’s about giving EHRs the capability to handle a more complete patient profile.

“Right now there is minimal differentiation between patients, which makes it harder to distinguish between them,” Wu said. “Standardizing the types of genomes and the type of care for those genomes will make EHRs more effective.”

Wu explained that his project has two spaces – the EHRs are the workflow space, coinciding with a separate analytics engine for large computations and complex algorithms.

“These two architectures live separately,” he said. “Our goal is to get those integration points together to host the capabilities and leverage up-and-coming technologies to get the data in real time.”

Stoking the FHIR
A key tool in helping vendors expand the EHR’s functionality is FHIR – Fast Healthcare Interoperability Resources, an open healthcare standard from HL7 that has been available for trial use since 2014.

SMART on FHIR is the latest platform offering, designed to provide a complete open standards-based technology stack. SMART on FHIR is designed so developers can integrate a vast array of clinical data with ease.

Joshua Mandel, MD, research scientist in biomedical informatics at Harvard and lead architect of the SMART Health IT project, is optimistic that SMART on FHIR and a pilot project called Sync for Science will give vendors the incentive and the platform to move EHR capability in a direction that can accommodate advancing medical science.

“When ONC and the National Institutes of Health were looking for forward-thinking ways to incorporate EHR data into research, using the SMART on FHIR API was a natural fit,” he said. “It’s a technology that works for research, but also provides a platform for other kinds of data access as well. The technology fits into the national roadmap for providing patient API access, where patients can use whatever apps they choose, and connect those apps to EHR data. In that sense, research is just one use case – if we have a working apps ecosystem, then researchers can leverage that ecosystem just the same as any other app developer.”
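At the data level, “connecting apps to EHR data” through FHIR means issuing REST searches and parsing the JSON Bundle the server returns. The bundle structure below (`resourceType`, `entry`, `resource`) follows the FHIR standard, but the sample values are hand-written for illustration, not real patient data.

```python
import json

# A FHIR server answers a search such as
#   GET [base]/Observation?patient=123&category=laboratory
# with a JSON "Bundle" whose entries wrap individual resources.
# This bundle is a hand-written minimal example, not real data.
bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Observation",
                  "code": {"text": "Hemoglobin A1c"},
                  "valueQuantity": {"value": 6.2, "unit": "%"}}},
    {"resource": {"resourceType": "Observation",
                  "code": {"text": "Glucose"},
                  "valueQuantity": {"value": 99, "unit": "mg/dL"}}}
  ]
}
""")

def lab_results(bundle):
    """Yield (test name, value, unit) for each Observation in the bundle."""
    for entry in bundle.get("entry", []):
        res = entry["resource"]
        if res.get("resourceType") != "Observation":
            continue
        q = res.get("valueQuantity", {})
        yield res["code"]["text"], q.get("value"), q.get("unit")

print(list(lab_results(bundle)))
# [('Hemoglobin A1c', 6.2, '%'), ('Glucose', 99, 'mg/dL')]
```

Because every SMART on FHIR app reads the same bundle shape, a research app and a consumer app can share the exact same parsing code.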

With Sync for Science, Mandel’s team at the Harvard Department of Biomedical Informatics is leading a technical coordination effort that is funded initially for 12 months to work with seven EHR vendors – Allscripts, athenahealth, Cerner, drchrono, eClinicalWorks, Epic, and McKesson/RelayHealth – to ensure that each of these vendors can implement a consistent API that allows patients to share their clinical data with researchers.

Sync for Science – known as S4S – is designed to help any research study ask for (and receive, if the patient approves) patient-level electronic health record data, Mandel said. One important upcoming study is the Precision Medicine Initiative.


“It’s important to keep in mind that much of the most interesting work will involve aggregation of data from multiple modalities, including self-reports, mobile device/sensors, ‘omics’ data, and the EHR,” he said. “S4S is focused on this latter piece – making the connection to the EHR. This will help keep results grounded in traditional clinical concepts like historical diagnoses and lab results.”

The project is focused on a relatively small “summary” data set, known as the Meaningful Use Common Clinical Data Set. It includes the kind of basic structured clinical data that makes up the core of a health record, including allergies, medications, lab results, immunizations, vital signs, procedure history, and smoking status. The timeline is structured so that the pilot should be completed by the end of December and Mandel expects that the technical coordination work will be finished by that time. The next step, he says, is to test the deployments with actual patients.
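The Common Clinical Data Set categories above correspond to standard FHIR resource types. As a rough sketch, an S4S-style app might build its search URLs like this; the resource names are real FHIR (DSTU2-era), but the base URL is hypothetical and the mapping is a plausible reading of the data set, not the official S4S profile.

```python
# Plausible (not official) mapping from Common Clinical Data Set
# categories to FHIR search paths. The base URL is hypothetical;
# resource names (AllergyIntolerance, Observation, etc.) are real FHIR.
CCDS_TO_FHIR = {
    "allergies": "AllergyIntolerance?patient={pid}",
    "medications": "MedicationOrder?patient={pid}",
    "lab results": "Observation?patient={pid}&category=laboratory",
    "immunizations": "Immunization?patient={pid}",
    "vital signs": "Observation?patient={pid}&category=vital-signs",
    "procedures": "Procedure?patient={pid}",
    "smoking status": "Observation?patient={pid}&code=72166-2",  # LOINC smoking status
}

def search_urls(base, patient_id):
    """Build one search URL per data-set category for a given patient."""
    return {cat: f"{base}/{path.format(pid=patient_id)}"
            for cat, path in CCDS_TO_FHIR.items()}

urls = search_urls("https://ehr.example.org/fhir", "123")
print(urls["lab results"])
# https://ehr.example.org/fhir/Observation?patient=123&category=laboratory
```

In a real S4S flow, the patient’s OAuth authorization would accompany each of these requests; the URLs themselves are the easy part.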

“We’re still working out the details of how these tests will happen,” Mandel said. “One possibility is that the Precision Medicine Initiative Cohort Program will be able to run these tests as part of their early participant app data collection workflow.”

Built on the FHIR foundation, S4S is positioned as the linchpin for broadening interoperability to research, clinical data and patient access. FHIR is organized around profiles, which define the data and data types that characterize a use case. S4S is building a FHIR profile so that data such as demographics, medications and laboratory results can be accessed and donated to precision medicine research.

As a proponent of S4S, the ONC sees the program as an extension of “the fundamental building blocks for interoperability,” White said. The APIs that are integral to the S4S effort have been used in EHRs for a long time, but he said vendors kept them proprietary.

“When we told vendors in 2015 that they would need to open APIs so that there could be appropriate access to data, they agreed, and moreover, they said they would lead the charge,” White said.

MU and MACRA influence
When the industry started on the EHR and interoperability initiative in 2004, meaningful use hadn’t been conceived of yet. With MU’s arrival as part of President Obama’s ARRA program, healthcare providers were suddenly diverted from the original development plan with an extra layer of bureaucracy.

Walton talks about its impact on the overall effort: “Meaningful use had some value but largely missed the goals of its intention,” he said. “I think a lot of people essentially played the system to achieve the financial benefit of meaningful use without necessarily being concerned about how that translated into benefits for patients. Meaningful use has pushed people to start talking about interoperability, which is good, but it has not gone much further than that. Most of the changes in EHRs around meaningful use were driven by billing and financial reimbursement, but it has opened the door to more possibilities.”

The broader problem, says Wayne Oxenham, president of Orion Health’s North America operation, is that a business-to-business model did not really exist in healthcare, “so incentives were not aligned, and MU was only focusing on EHR interoperability and quality measures that provide no value versus proactive care models.”

In essence, Oxenham said “MU did not deliver much. The program tried to do too much by wanting to digitize healthcare in 10 years and curiously, their approach was only focused on the technology instead of focusing on the patient and creating value. The point was to improve outcomes and stabilize costs, not to exchange documents that did not necessarily need to be shared, and they brought no value when stored in a locker deep in a clinical portal. MU missed the point – it just helped digitize processes that were and are still oriented towards billing, but aren’t focused on optimizing care and using the data in meaningful ways.”

As with MU, new certification requirements for the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) could also influence the dynamics of precision medicine and genomics, but Walton contends that it isn’t an issue at this point.

“I don’t think MU has inhibited the development at all, and people are still trying to wrap their heads around MACRA,” Walton said. “A big part of the problem is that there has not really been a financial incentive to pursue this and many healthcare IT innovations are driven by billing and trying to increase revenue. I think that MU has tied up a lot of healthcare IT resources but I don’t know that I can say they would have been working on precision medicine if they were not tied up.”

Eric Rock, CEO of Plano, Texas-based Vivify, calls MU “a measuring stick used to drive quality through financial penalties or gains, the strongest driver for healthcare industry change.” While he considers MU a “good move, it perhaps wasn’t strong enough to make a ‘meaningful’ difference with interoperability or on the cost of care.”

Forthcoming CMS bundles, such as the recent Comprehensive Care for Joint Replacement model, could advance the MU incentive component further, he said.

“The impact that CMS bundles and other value-based care models will have is a much stronger demand by providers towards healthcare interoperability in a landmark effort to reduce costs,” Rock said. “As a result, winning large contracts may require a commitment to a new level of interoperability.”


Altering the trajectory
If the current trajectory of precision medicine-EHR imbalance continues, it won’t be for a lack of trying by medical science and the healthcare IT industry to correct it. Programs like Sync for Science need time to develop and produce results. At this point, however, there are a lot of questions about how the “technology gap” issue will proceed and whether it will continue to widen.

From a technology perspective, Walton believes the focus needs to be on scaling horizontally.

“Right now EHRs are primarily based on massive servers rather than distributing tasks across multiple computers,” he said. “You can’t handle the data from precision medicine this way – it’s just not going to work, especially when you have multiple departments trying to access and process it at the same time.”

True cloud computing, whether internally or externally hosted, is needed for this type of data, Walton said, because “the database infrastructure behind EHRs and clinical data warehouses is not geared towards precision medicine and cannot handle the data generated. There are clinical data warehouses that can handle the data better but they are not usually updated in real time, which you need for an effective system for precision medicine.  This will require investments in very fast hardware and distributed computing and we have a ways to go on this front.”

On the current trajectory, precision medicine is “slowly sweeping into standards of care and what we are doing is going little-step by little-step to find places where personalized medicine is applicable and can be used,” Callahan said.

The only way the current trajectory will change is if reimbursement patterns change, he said.

“If and when payers latch onto the idea that personalized medicine is actually a key enabler of population health, they will pay for it as an investment,” he said. “That will be a game changer, a new trajectory. Right now the payer community views precision medicine and genetics as just another costly test and people don’t know what it means or what the clinical utility of it is. That is the exact wrong way to think about it. Precision medicine and genetics are key enablers for population management. When you can get your head around that idea, when you can marry the idea, then you really start to see things change.”
