Idea HePaaS

Why Idea HePaaS

Leveraging the valuable data that enterprises have to build compelling products and insights is key to the success of any organization.

It is vital for every organization to leverage its valuable data to build compelling products and insights. In the case of healthcare organizations, data is all the more available from in-house sources such as EHRs, patient portals, payer systems and, more recently, medical devices and wearables.

With the availability of such rich and diverse data, the complexity of managing it also increases exponentially. Every healthcare enterprise diverts significant effort into building the horizontal components necessary to manage this data, when it should instead be focusing on developing the business use cases specific to it.

HePaaS is our solution to this problem. We have abstracted all the functionality required by an enterprise-grade healthcare data platform and built it into HePaaS. This enables our customers to marshal their data and develop compelling business use cases.



Healthcare-specific Complexities

Compliance
Healthcare organizations must meet stringent requirements pertaining to HIPAA, 21 CFR and Meaningful Use, given that they handle highly sensitive ePHI.
Security
There is a need for high-grade security (SOC 2 compliance) because PII is involved.
Data standards
Healthcare data standards (HL7, FHIR) ensure interoperability, but variations persist. Advanced mapping and auto-configuration are essential for consistency.

Platform Functionalities

HePaaS has the capability to handle the following use cases:

Ingesting Real-Time Streaming Data
Idea HePaaS can ingest real-time streaming data from any source publishing healthcare data, supporting a wide variety of protocols including HL7 V2/V3, CDP, FHIR, etc. It can process large volumes of data efficiently and supports encryption in flight and at rest.
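For illustration, the sketch below parses a sample HL7 v2 ADT message with the open-source python-hl7 library and pulls out the patient identifier and name, the kind of field extraction performed at ingest. The library choice and the message contents are assumptions for the example, not a description of the platform's internal pipeline.

```python
import hl7  # pip install hl7 (python-hl7)

# Minimal sample ADT message; HL7 v2 segments are separated by carriage returns.
raw = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|20240101120000||ADT^A01|MSG00001|P|2.5",
    "PID|1||12345^^^HOSP^MR||DOE^JOHN||19800101|M",
])

msg = hl7.parse(raw)
pid = msg.segment("PID")

print("Patient ID:  ", str(pid[3]))  # 12345^^^HOSP^MR
print("Patient name:", str(pid[5]))  # DOE^JOHN
```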
ML/AI based Schema Mapping & Data Preparation
Map various data sources to our canonical data model by leveraging AI/ML. Frequently required data-preparation features such as imputation are also available.
Data Cleansing & Preparation of Structured Healthcare Data
Healthcare data is often messy and ill-prepared for analysis. The platform facilitates the preparation of internal healthcare data for cohort building and analytics. The purpose of this use case is to process and clean multiple administrative datasets and build a cohort of patients.
Process Unstructured Data
Unstructured data such as clinical notes is an important element of any healthcare platform. Idea HePaaS facilitates the processing and analysis of unstructured data such as free text and images, supporting its ingestion, parsing and storage.
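As a minimal sketch of the kind of processing involved, the example below runs named-entity recognition over a clinical note with spaCy. spaCy and the en_core_web_sm model are stand-ins chosen for the example; the platform's actual NLP components are not specified here.

```python
import spacy  # pip install spacy && python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")

note = (
    "Patient John Doe was admitted on 5 March 2024 with shortness of breath "
    "and was started on 40 mg furosemide daily."
)

# Extract entities (persons, dates, quantities, etc.) from the free-text note.
doc = nlp(note)
for ent in doc.ents:
    print(ent.text, "->", ent.label_)
```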
Pre-built Public Datasets
Many analytical use cases can be improved by enriching the enterprise's data with publicly available datasets, such as healthcare-related and socio-economic data. The platform has integrations with many public APIs to ingest rich datasets, and it has also imported publicly available datasets published by institutions such as Harvard and MIT.
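The enrichment pattern itself is a join on a shared key. The toy example below, with invented columns and values, merges patient records with a socio-economic dataset keyed by ZIP code using pandas.

```python
import pandas as pd

# Illustrative enterprise data keyed by patient, with a ZIP code for linkage.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "zip_code": ["02138", "02139", "10001"],
})

# Illustrative stand-in for a public socio-economic dataset keyed by ZIP code.
public = pd.DataFrame({
    "zip_code": ["02138", "02139", "10001"],
    "median_income": [85000, 78000, 67000],
})

# Left join keeps every patient and attaches the public attributes where available.
enriched = patients.merge(public, on="zip_code", how="left")
print(enriched)
```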
AI/ML
A key focus of the platform is to build, train and implement Machine Learning models in clinical practice, incorporating a variety of datasets. Idea HePaaS therefore comes with integrated functionality for building, training and operationalizing ML models.

Architecture

This architecture supports the following use cases for the data platform.

Data Ingestion from Heterogeneous Sources

Problem Statement

  • The platform should be able to ingest data available in different silos and make use of it to derive insights in real time and enable analytics.
  • The chosen platform should be able to ingest real-time streaming data from hospital enterprise service buses.
  • The ingested data should be available for analysis, with role-based access controls implemented.

Our Solution

The reference architecture design supports this use case objective with the following Azure components and implementation.

  • Our design includes the FHIR server to process data from EHR systems. Since FHIR is evolving as a dominant standard, we have included it as a principal component in the design to ensure interoperability.
  • FHIR is used to standardize the exchange of healthcare information, enabling healthcare providers and administrators to easily share patient information even though they are using different software systems.
  • The aim of FHIR is to address the growing digitization of the healthcare industry and the need for patient records to be readily available, discoverable, and understandable.
  • Our design includes Azure API for FHIR with legacy FHIR conversion software to process data from incoming HL7 streaming messages (a minimal interaction sketch follows this list).
  • Our design also supports integration of the Azure API for FHIR with EHR systems.
  • Our design includes data ingestion using Azure Data Factory and custom ETL scripts and storing the data in Azure Data Lake.
  • Our design ensures data encryption in transit and at rest by using the configuration available in different Azure components.
  • Our design enables granular access controls for different users and different levels of data by using the Role-Based Access Control (RBAC) module in Azure.
  • Our design includes data analytics and reporting on real-time data using Power BI.
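As a concrete illustration of the FHIR interaction, the sketch below creates a Patient resource against an Azure API for FHIR endpoint using the standard FHIR REST API. The endpoint URL and token are placeholders; acquiring an Azure AD access token and production error handling are out of scope here.

```python
import requests

# Placeholders: your Azure API for FHIR endpoint and an Azure AD access token.
FHIR_BASE = "https://<your-fhir-service>.azurehealthcareapis.com"
TOKEN = "<azure-ad-access-token>"

patient = {
    "resourceType": "Patient",
    "identifier": [{"system": "urn:example:mrn", "value": "12345"}],
    "name": [{"family": "Doe", "given": ["John"]}],
    "gender": "male",
    "birthDate": "1980-01-01",
}

# Standard FHIR REST create: POST {base}/Patient with an application/fhir+json body.
resp = requests.post(
    f"{FHIR_BASE}/Patient",
    json=patient,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/fhir+json",
    },
)
resp.raise_for_status()
print(resp.status_code, resp.headers.get("Location"))  # 201 and the new resource URL
```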

Key considerations for using Azure API for FHIR

  • High performance, low latency
  • Enterprise grade, managed FHIR service
  • Role Based Access Control (RBAC) – allowing you to manage access to your data at scale
  • Audit log tracking for access, creation, modification, and reads within each data store
  • Secure compliance in the cloud: ISO 27001:2013 certified, supports HIPAA and GDPR, and built on the HITRUST certified Azure platform
  • Data is isolated to a unique database per API instance
  • Protection of your data with multi-region failover

Data Cleansing & Standardization

Problem Statement

  • Healthcare data is often messy and ill-prepared for analysis.
  • Data platforms should have the ability to process and clean multiple administrative datasets for cohort building and analytics.
  • It must process multiple internal healthcare datasets, which must be linked based on patient information.
  • Data processing should include rapid exploration, deduplication and data cleaning.

Our Solution

  • Our platform has the capability to import bulk data via secure FTP using Azure Cloud Services and the FTP2Azure application
  • The data is ingested into Azure Blob Storage system
  • It utilizes the Apache Spark framework in Azure Databricks for performing data mining, data transformation and other data operations (see the sketch following this list) such as:
  • de-identification, masking, suppression and data profiling
  • cleaning, deduplication and validation of field types and values
  • feature engineering
  • It includes building a secure data pipeline between Azure Blob Storage and Azure Databricks for data mining.
  • Idea HePaaS handles archival and historical storage of data in Azure Data Lake.
  • Supports creation of a decision support system using Azure Synapse Analytics to support business and organizational decision-making activities.
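A minimal PySpark sketch of the cleansing step described above is shown here, assuming an illustrative claims dataset with claim_id, patient_id and claim_amount columns. The paths and column names are placeholders, and inside Azure Databricks a SparkSession is already provided as spark.

```python
from pyspark.sql import SparkSession, functions as F

# Creating a session keeps the sketch self-contained outside Databricks.
spark = SparkSession.builder.appName("claims-cleansing").getOrCreate()

# Illustrative path; the real datasets live in Azure Blob Storage / Data Lake mounts.
claims = spark.read.csv("/mnt/raw/claims.csv", header=True, inferSchema=True)

cleaned = (
    claims
    .dropDuplicates(["claim_id"])                              # deduplication
    .withColumn("patient_id", F.upper(F.trim("patient_id")))   # normalize the linkage key
    .filter(F.col("claim_amount").isNotNull())                 # drop unusable rows
)

# Write the curated output for downstream cohort building and analytics.
cleaned.write.mode("overwrite").parquet("/mnt/curated/claims/")
```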

Pre-built Analytics & Predictions

Problem Statement

  • Build the data platform infrastructure to train and implement Machine Learning models in clinical practice, incorporating a variety of datasets.
  • Building, training and operationalizing a predictive model for mental health admissions that incorporates an external dataset.
  • Collaborative development, validation, optimization and lifecycle management of the model.

Our Solution

  • Idea HePaaS supports the full lifecycle of ML from data analysis to building/testing and deployment.
  • It supports ingestion of external data sources using Azure Data Factory and storing the information in Azure Data Lake.
  • Handles data preparation for the Machine Learning model using the Apache Spark framework in Azure Databricks.
  • Handles building and evaluating machine learning models and measuring accuracy using Azure Machine Learning (a minimal training sketch follows this list).
  • Supports lifecycle management and democratization of Machine Learning models using Azure Machine Learning pipelines.
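The sketch below shows a minimal version of that build/evaluate/track loop using scikit-learn with MLflow tracking, which both Azure Machine Learning and Databricks support. The dataset is synthetic and the model choice is illustrative; it is not the platform's prebuilt admissions model.

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a prepared admissions dataset; real features would come
# from the curated data in Azure Data Lake.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

with mlflow.start_run():
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    mlflow.log_metric("test_auc", auc)        # tracked experiment metric
    mlflow.sklearn.log_model(model, "model")  # versioned model artifact for deployment
```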

Prebuilt Suite of AI models

  • Intensive Care Unit (ICU) Readmission Prediction
  • Patient Appointment No-Shows Prediction
  • Prediction of Early Onset of Sepsis
  • DocSearch – a Natural Language Processing (NLP) based Answering System
  • Face Mask Detector using Deep Learning (PyTorch)

Machine Learning based Schema Mapping

Problem Statement

  • Healthcare datasets are highly varied, with different metadata types and data formats.
  • Mapping schema from different EHR and other healthcare sources is a challenge and takes a lot of manual effort.
  • While rule / template-based mapping might work for a few data sources, it might not be a feasible solution when data sources scale up.

Our Solution

  • Using ML-based schema mapping, input data labels are predicted and classified (a minimal sketch follows this list).
  • The ML model can be integrated with any input data source (HL7 V2, FHIR, etc.).
  • ML based schema mapping can enable seamless data integration.
  • An expert rule engine on top of the ML solution makes it more accurate, further reducing manual integration effort.
  • Auto data-preparation features such as data imputation, outlier detection, normalization and scaling enable data preparation with minimal manual effort and coding.
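To make the idea concrete, the sketch below trains a toy classifier that maps source field labels to fields of a hypothetical canonical model using character n-gram features. The training pairs, field names and model choice are illustrative only, not the platform's mapping model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative training pairs: source field labels seen in past integrations,
# mapped to fields of a (hypothetical) canonical data model.
source_labels = ["pat_dob", "date_of_birth", "birth_dt",
                 "pt_gender", "sex_code", "gender",
                 "mrn", "medical_rec_no", "patient_identifier"]
canonical_fields = ["birthDate", "birthDate", "birthDate",
                    "gender", "gender", "gender",
                    "identifier", "identifier", "identifier"]

# Character n-grams cope well with abbreviations and naming-convention noise.
mapper = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
mapper.fit(source_labels, canonical_fields)

# Predicted canonical fields for unseen source labels.
print(mapper.predict(["dob", "patient_sex", "med_record_number"]))
```

An expert rule engine, as noted above, can then confirm or override low-confidence predictions before the mapping is applied.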

Connect with Us

We'd love to brainstorm your Gen AI initiatives and contribute to the best outcomes.