Data Collection is a set of highly robust, scalable, near real-time services designed to integrate healthcare systems and devices through Socket, RESTful, and MQTT API interfaces. The API- and metadata-driven approach provides the ability to define, configure, and receive information from these systems and devices without updating or changing the communications protocol.
Data Collection services expose a set of granular RESTful, Socket, and MQTT APIs to enable communication with a variety of domains such as healthcare devices, IT systems, IoT devices, databases, and other systems. To communicate with these endpoints, simple translator connectors are developed and published through the APIs mentioned above. Generally, connectivity and communication to an endpoint can be established in just a few hours.
In addition to physical devices, services are available to perform actions such as running continuous queries against databases, interfacing with NiFi flows, interfacing with a published Spark streaming job, or making polling calls to an external system. All of these actions are initiated by simple API service calls. Once the connection is established, no further action is needed from your application: the Platform continues to ingest the data and provide normalized information to your application.
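As a concrete illustration, registering a continuous database query as a data source might look like the sketch below. The endpoint path, field names, and `build_continuous_query_request` helper are hypothetical; the actual Data Collection API contract is not shown in this document.

```python
import json

# Hypothetical request body a client would POST to a Data Collection
# endpoint (e.g. /data-collection/v1/sources) to start a continuous query.
def build_continuous_query_request(source_id, jdbc_url, query, poll_seconds):
    """Assemble an illustrative registration payload for a continuous query."""
    return {
        "sourceId": source_id,
        "type": "continuous-query",
        "connection": {"jdbcUrl": jdbc_url},
        "query": query,
        "pollIntervalSeconds": poll_seconds,
    }

body = build_continuous_query_request(
    "icu-vitals-db",
    "jdbc:postgresql://db.example/vitals",
    "SELECT * FROM vitals WHERE ts > :last_seen",
    30,
)
print(json.dumps(body, indent=2))
```

After a single call like this, the Platform would poll the source on the configured interval with no further involvement from the client application.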
As information from various endpoints is consumed into the Platform, it can be analyzed in near real-time for specific events or conditions of interest. Rules, Machine Learning Models, and Event Management services give you control over converting your inbound data into actionable information.
Rules defined in the Rules, Machine Learning Models, and Event Management service can trigger a variety of actions, from simple stakeholder notifications to complex application notification methods. Actions can also interact with other Platform services, such as Device and Data Source Management, to change the behavior of an endpoint.
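The rule-to-action flow can be sketched as below. The rule shape, the SpO2 threshold, and both action callbacks are invented for illustration; they stand in for the notification and Device Management interactions described above, not for the service's real rule format.

```python
# Minimal sketch: a rule pairs a condition with a list of actions.
def evaluate(rule, reading):
    """Run every action of a rule whose condition matches the reading."""
    if rule["condition"](reading):
        return [action(reading) for action in rule["actions"]]
    return []

# Illustrative actions: notify a stakeholder, and ask Device Management
# to raise the device's sample rate.
notify = lambda r: f"notify:on-call-nurse spo2={r['spo2']}"
retune = lambda r: f"device-mgmt:increase-sample-rate device={r['deviceId']}"

spo2_rule = {
    "condition": lambda r: r["spo2"] < 90,  # hypothetical threshold
    "actions": [notify, retune],
}

fired = evaluate(spo2_rule, {"deviceId": "pulse-ox-7", "spo2": 86})
```

A reading above the threshold would fire no actions; one below it triggers both the notification and the endpoint behavior change in a single pass.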
As the number of data sources in the network increases, the traffic generated by these sources can become enormous. Many tenants find that some level of edge intelligence is critical to prevent network overload. Using the Platform's Edge/Gateway deployment model, tenants use the Platform's gateways to receive data locally, filter it, and relay only the actionable data into the Platform for further processing, avoiding traffic overload.
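The edge-filtering idea reduces to dropping in-range readings at the gateway and relaying only outliers. The heart-rate field and the normal-range bounds below are assumptions for the sketch:

```python
def edge_filter(readings, normal_range):
    """Keep only readings outside the normal range; drop the rest at the edge."""
    low, high = normal_range
    return [r for r in readings if r["heart_rate"] < low or r["heart_rate"] > high]

# Four raw readings arrive at the gateway; only two are actionable.
readings = [
    {"heart_rate": 72},
    {"heart_rate": 180},
    {"heart_rate": 70},
    {"heart_rate": 38},
]
relayed = edge_filter(readings, (50, 150))
```

Here the gateway forwards two readings instead of four, and in practice the reduction on steady telemetry is far larger, since most readings are normal.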
A mechanism matches tenant attributes against metadata and policies right in the datastore. An attribute-based approach requires metadata, but it also needs a way to describe the tenants and applications that want to access the data. Attributes are easily changed and flexible: if they are managed effectively and the identity system is sufficiently flexible, tenant attributes can be extended to cover just about any combination of organization, facility, unit, role, authorization, device, relationship, and so on.
In-datastore policy management ensures that access rules are enforced automatically. Every data access request is mediated by the governance policies of the Platform's policy management framework, which provides a decision for each request based solely on the characteristics of the data and the attributes of the tenant. With policy-based access control driven by metadata and tenant attributes, tenants get data security along with the flexibility needed to manage data sharing and collaboration.
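The attribute-based decision described in the two paragraphs above can be sketched as a pure function of the policy, the tenant's attributes, and the data's metadata. The specific attribute names (role, facility, sensitivity) are hypothetical examples of the organization/facility/role dimensions mentioned earlier:

```python
def decide(policy, tenant_attrs, data_meta):
    """Allow access only when the tenant's attributes and the data's
    metadata jointly satisfy every requirement in the policy."""
    return (
        tenant_attrs.get("role") in policy["allowed_roles"]
        and tenant_attrs.get("facility") == data_meta.get("facility")
        and data_meta.get("sensitivity", 0) <= policy["max_sensitivity"]
    )

policy = {"allowed_roles": {"clinician", "auditor"}, "max_sensitivity": 2}
meta = {"facility": "unit-icu-3", "sensitivity": 2}

allowed = decide(policy, {"role": "clinician", "facility": "unit-icu-3"}, meta)
denied = decide(policy, {"role": "billing", "facility": "unit-icu-3"}, meta)
```

Because the decision depends only on attributes and metadata, granting a new application access is a matter of assigning attributes, not rewriting policy code.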
Data Lake (DL) – the raw data storage area where the live data stream from various data sources is ingested. The data lake stores raw data in whatever form the data source provides. No assumptions are made about the schema of the data; each data source can use whatever schema it likes, and it is up to the consumer products' data ponds to make sense of that data for their own purposes. The data lake is schemaless: source systems decide what schema to use, and consumer products' data ponds work out how to deal with the resulting chaos.
Data Pond (DP) – data extracted from the Data Lake and made 'fit for purpose' to support the Platform's product needs. Data ponds curate and organize the data for the products' analytics use cases.
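The lake/pond split can be sketched in a few lines: the lake keeps every payload verbatim and source-tagged, while a pond extracts only the fields its product needs. The in-memory list, source names, and field names are illustrative stand-ins for the actual storage layer:

```python
import json

lake = []  # schemaless raw storage: every record kept exactly as received

def ingest(source, raw_payload):
    """Store the payload verbatim; no schema is imposed at ingest time."""
    lake.append({"source": source, "raw": raw_payload})

def heart_rate_pond():
    """Curate a fit-for-purpose view: only records carrying a heart rate."""
    out = []
    for rec in lake:
        doc = json.loads(rec["raw"])
        if "hr" in doc:
            out.append({"source": rec["source"], "heart_rate": doc["hr"]})
    return out

# Two sources with entirely different schemas land in the same lake.
ingest("monitor-a", '{"hr": 71, "spo2": 98}')
ingest("lab-sys", '{"wbc": 6.1}')
pond = heart_rate_pond()
```

Note that the lab record is retained in the lake untouched; only the pond, built for a specific analytics use case, decides it is out of scope.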
Metadata (MD) – holds data describing the Platform and its products. It plays a very important role in creating the data lake and data ponds. Used intelligently, metadata can structure a whole data lake so that it is available to tenants who can use it, and reuse it, to create value, as well as to those whose job it is to audit it, cleanse it, and supervise it responsibly.
Indexing and cataloging capabilities make data findable. Automatic indexing, a result of an effective metadata strategy, is one of the most important ways tenants can reduce lookup time and the amount of time spent hunting for data. The Platform's rowkey-based index maintains integrity with the data being stored. Tenants can elaborate on and enrich their metadata as their requirements evolve, and they can structure and link it to show relationships between data items.
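One common shape for a rowkey-based index is a composite key of tenant, source, and a reversed timestamp, so that a scan over a source prefix returns the newest records first. The exact key layout below (separator, width, reversal scheme) is an assumption for illustration, not the Platform's documented format:

```python
def make_rowkey(tenant, source, ts_epoch):
    """Compose a lexicographically sortable rowkey; the timestamp is
    reversed so the newest record for a source sorts first."""
    reversed_ts = 10**10 - ts_epoch  # hypothetical reversal scheme
    return f"{tenant}#{source}#{reversed_ts:010d}"

k_new = make_rowkey("t1", "monitor-a", 1_700_000_100)
k_old = make_rowkey("t1", "monitor-a", 1_700_000_000)
```

With this layout, "latest reading per device" becomes a short prefix scan rather than a full lookup, which is where the reduced hunting time comes from.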
API Management centralizes and unifies functionality in one place, builds efficient distributed architectures ready to scale, and lets you expand functionality from one place with a simple command.
Every request made to the API hits the API Gateway first and is then proxied to the final API. Between requests and responses, the API Gateway executes any plugins the tenant has decided to install, empowering your APIs. The API Gateway is effectively the entry point for every API request.
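The request flow above can be sketched as a plugin chain that each request passes through before being proxied upstream. The plugin names, request shape, and hook signature are invented for the sketch; a real gateway's plugin API will differ:

```python
def auth_plugin(request):
    """Reject requests without credentials before they reach the API."""
    if "api_key" not in request:
        raise PermissionError("missing api_key")
    return request

def logging_plugin(request):
    """Record that the request passed through the gateway."""
    request.setdefault("trace", []).append("logged")
    return request

def gateway(request, plugins, upstream):
    for plugin in plugins:      # every installed plugin sees the request first
        request = plugin(request)
    return upstream(request)    # only then is the call proxied to the final API

response = gateway(
    {"api_key": "abc", "path": "/v1/devices"},
    [auth_plugin, logging_plugin],
    lambda req: {"status": 200, "path": req["path"]},  # stand-in upstream API
)
```

Because every request funnels through the same chain, concerns like authentication and logging are added once at the gateway instead of in every backing API.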
We have augmented on-premise EMRs with advanced analytics on our platform for certain providers/tenants, and we provision a Healthcare-Analytics-as-a-Service platform for other tenants.
When a patient goes into cardiac arrest, medical staff issue a code blue alert. Staff stop what they are doing to attend to the patient, and yet fewer than 70% of victims survive. We are working on predicting these medical crises before they happen, allowing doctors and nurses to intervene before patients go into cardiac arrest.