With healthcare’s continuing shift toward electronic health records (EHRs), interoperability between systems and devices is a key driver of successful communication among all parties in the healthcare environment. For patients, this means our health information must be available to anyone who may treat us, presented in a meaningful, value-driven manner. Implementing an EHR system is, in many cases, a complicated task requiring coordination across multiple disciplines within healthcare. And although migrating to an electronic health system is a step toward process optimization and efficiency, it can also foster over-dependence among staff members in their everyday work.
Following the Ebola case at Texas Health Presbyterian Hospital, the Dallas hospital initially stated that a flaw in its electronic health record had contributed to the failure to admit Thomas Eric Duncan, a Liberian man unknowingly infected with the virus. At the time, Mr. Duncan’s health information was not passed along to other care providers via the EHR: physicians never received the travel history nurses had documented, owing to a “flawed” setup in the record system. The hospital later retracted this explanation in a statement from Texas Health Resources:
“We would like to clarify a point made in the statement released earlier in the week. As a standard part of the nursing process, the patient's travel history was documented and available to the full care team in the electronic health record (EHR), including within the physician’s workflow. There was no flaw in the EHR in the way the physician and nursing portions interacted related to this event.”
The confusion over the root cause of the issue is an alarming example of how little visibility engineers and quality experts can have into the processes of health systems. Delivering quality standards down to the design level, for both data and workflows, is key to driving process improvement and optimization amid the constantly changing technology of a healthcare facility.
As a consultant working in healthcare data integration, I commonly see these kinds of high-demand data systems requested and built on short timelines. These environments introduce varying levels of project risk and can jeopardize data quality standards. Though data quality activities are often overlooked, their effects eventually trickle up to the level of care the patient receives: these data pipelines shape the direction physicians and nurses take during different stages of a patient’s care process.
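To make that concrete, here is a minimal sketch in Python of the kind of validation gate a pipeline can enforce before a record moves downstream. The record shape, field names, and review-queue routing are all hypothetical simplifications, not any specific vendor’s interface; the point is that incomplete records are flagged for review rather than silently forwarded.

```python
from dataclasses import dataclass, field

# Hypothetical inbound patient record, simplified from what a
# parsed admission feed might carry.
@dataclass
class PatientRecord:
    patient_id: str
    chief_complaint: str = ""
    travel_history: str = ""
    allergies: list = field(default_factory=list)

# Fields the downstream workflow depends on; illustrative only.
REQUIRED_FIELDS = ["patient_id", "chief_complaint", "travel_history"]

def validate(record: PatientRecord) -> list:
    """Return a list of data-quality issues; empty means the
    record is safe to pass downstream."""
    issues = []
    for name in REQUIRED_FIELDS:
        if not str(getattr(record, name)).strip():
            issues.append(f"missing or empty field: {name}")
    return issues

def route(record: PatientRecord, downstream: list, review_queue: list):
    """Pipeline stage: incomplete records go to a review queue
    instead of disappearing from the care team's view."""
    issues = validate(record)
    if issues:
        review_queue.append((record, issues))
    else:
        downstream.append(record)

if __name__ == "__main__":
    downstream, review = [], []
    route(PatientRecord("MRN-001", "fever", "Liberia, Sept 2014"), downstream, review)
    route(PatientRecord("MRN-002", "fever"), downstream, review)
    print(len(downstream), "forwarded;", len(review), "flagged for review")
```

Even a check this simple makes the quality standard explicit at the design level, which is exactly where short timelines tend to cut corners.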
Understanding the kind of data that must be applied at different points of care is critical for hospital staff to do their jobs effectively. Analyzing the types of data that must be presented and the processes that adhere to requirements – and understanding how they impact the value of care the patient receives – are core concepts of Just-In-Time (JIT) level design. When applied correctly through quality analysis, JIT and other continuous improvement methodologies can be used to better analyze a patient’s care process throughout their health lifecycle while also minimizing a hospital’s process variability. Quality processes and data workflows ensure a standardized level of care, ultimately minimizing hospital risk and error and reducing the chance that an uncommon disease jeopardizes the quality of care under the most extreme of circumstances – such as the Ebola case late last month.
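As an illustrative sketch of what JIT-level design can mean for a data workflow (all roles, field names, and rules here are hypothetical, not drawn from any real EHR), consider declaring, per point of care, exactly which data elements must be surfaced, and promoting high-risk combinations into explicit alerts rather than relying on someone to notice a buried field:

```python
# Each point of care declares the data it needs, just in time,
# instead of every role digging through the full chart.
POINT_OF_CARE_VIEWS = {
    "triage":    ["chief_complaint", "vitals", "travel_history"],
    "physician": ["chief_complaint", "vitals", "travel_history", "labs"],
    "pharmacy":  ["allergies", "active_medications"],
}

def view_for(role: str, chart: dict) -> dict:
    """Project the chart down to the fields this point of care needs."""
    return {k: chart.get(k) for k in POINT_OF_CARE_VIEWS.get(role, [])}

def screening_flags(chart: dict) -> list:
    """Hypothetical rule: fever plus documented travel triggers an
    alert visible in every view, not just a field in one section."""
    flags = []
    fever = chart.get("vitals", {}).get("temp_f", 0) >= 100.4
    if fever and chart.get("travel_history"):
        flags.append("Fever + travel history: consider infectious disease screening")
    return flags

chart = {
    "chief_complaint": "fever, abdominal pain",
    "vitals": {"temp_f": 103.0},
    "travel_history": "Liberia, Sept 2014",
}
print(view_for("physician", chart))
print(screening_flags(chart))
```

The design choice worth noting is that the screening rule lives in the workflow itself: whether the nursing or physician view is open, the same flag fires, which is the kind of standardization that keeps a documented field from being functionally invisible.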
Read more about our healthcare solutions:
- Maximizing the Capabilities of Health Cloud
- I'm the Patient, Remember Me?
- Is Salesforce HIPAA Compliant?
- 5 Steps to an Effective Data Governance Plan