
    A Privacy Preserving Framework for RFID Based Healthcare Systems

    RFID (Radio Frequency IDentification) is anticipated to be a core technology used in many practical applications of our lives in the near future. It has received considerable attention within healthcare for almost a decade now. The technology’s promise to efficiently track hospital supplies, medical equipment, medications and patients is an attractive proposition to the healthcare industry. However, the prospect of widespread use of RFID tags in healthcare has also triggered discussions regarding privacy, particularly because RFID data in transit may easily be intercepted and used to track its user (owner). In a nutshell, this technology has not yet realised its true potential in the healthcare industry because the privacy concerns raised by tag bearers are not properly addressed by existing identification techniques. Two major types of privacy preservation techniques are required in an RFID based healthcare system: (1) a privacy preserving authentication protocol for sensing RFID tags for different identification and monitoring purposes, and (2) a privacy preserving access control mechanism to restrict unauthorized access to private information while providing healthcare services using the tag ID. In this paper, we propose a framework (PriSens-HSAC) that addresses these two privacy issues. To the best of our knowledge, it is the first framework to provide increased privacy in RFID based healthcare systems by combining RFID authentication with access control techniques.
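    The abstract does not give the details of PriSens-HSAC itself, but the two ingredients it names are well-known building blocks. As a purely illustrative sketch (not the paper's protocol), the following Python combines a generic hash-based challenge-response authentication, in which a tag never transmits its ID in the clear, with a simple role-based access check that filters which record fields a caller may see; all names and parameters here are invented for the example.

```python
# Illustrative sketch only: a generic hash-based challenge-response between a
# reader and a back-end server, plus a role-based check before releasing
# patient data. It is NOT the PriSens-HSAC protocol; names are invented.
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """Hash helper shared by tag and back-end server."""
    m = hashlib.sha256()
    for p in parts:
        m.update(p)
    return m.digest()

class Tag:
    def __init__(self, tag_id: bytes, key: bytes):
        self.tag_id, self.key = tag_id, key

    def respond(self, challenge: bytes) -> bytes:
        # The tag never sends its ID in the clear; only a keyed digest.
        return h(self.key, challenge, self.tag_id)

class BackEnd:
    def __init__(self):
        self.tags = {}   # tag_id -> shared key
        self.acl = {}    # role   -> set of permitted record fields

    def authenticate(self, challenge: bytes, response: bytes):
        # Linear search over known tags; real schemes avoid this cost.
        for tag_id, key in self.tags.items():
            if h(key, challenge, tag_id) == response:
                return tag_id
        return None

    def read_record(self, role: str, record: dict) -> dict:
        # Release only the fields this role is permitted to see.
        allowed = self.acl.get(role, set())
        return {k: v for k, v in record.items() if k in allowed}

# Usage
server = BackEnd()
tag_id, key = b"tag-001", secrets.token_bytes(16)
server.tags[tag_id] = key
server.acl = {"nurse": {"name", "ward"}, "doctor": {"name", "ward", "medication"}}

tag = Tag(tag_id, key)
nonce = secrets.token_bytes(16)   # reader's fresh challenge
assert server.authenticate(nonce, tag.respond(nonce)) == tag_id
print(server.read_record("nurse", {"name": "A. Patient", "ward": "E3", "medication": "X"}))
```

    The linear search over candidate keys in authenticate reflects the usual privacy/efficiency trade-off in hash-based tag identification; published protocols use more scalable key-lookup structures.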

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic yet dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare processes. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD) and business process modelling notation (BPMN), are intuitive but imprecise: they cannot fully capture the complexities of the types of activities and the full extent of the temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g., the finish-start barrier. It is imperative to be able to specify temporal constraints between the start and/or end of processes, e.g., that the beginning of a process A precedes the start (or end) of a process B; however, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and in modelling it consistently, which is not possible with the existing modelling techniques. This thesis addresses the above issues by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while a corresponding instance is considered its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. I have therefore evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in generalising the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, resolution theorem proving is used to show the structural features of the theory (model) and to establish that it is sound and complete. Once the theory is established as sound and complete, the next step is to provide its instantiation, which is achieved by mapping the core components of the theory to their corresponding instances.
    Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics presented by the axiomatic system. A real-life case, the trauma patient pathway from the King’s College Hospital accident and emergency (A&E) department, is used to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma who arrives at A&E, undergoes a procedure and is subsequently discharged. The hospital’s staff relied upon UML AD and BPMN to model these patient flows; an evaluation of their representation shows the shortfalls of those modelling standards for modelling patient flows. The last step is to model the patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
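    To make the kind of constraint the thesis reasons about concrete, the display below shows one conventional point-based reading of interval ordering; the notation ($A^-$, $A^+$ for the start and end points of a process $A$) is illustrative and is not claimed to match the thesis's own axiomatic system.

```latex
% Illustrative only: a minimal point-based reading of interval constraints,
% not the thesis's exact PITL axioms. A^- and A^+ denote the start and end
% points of a process (interval) A.
\begin{align*}
  & A^- < A^+
      && \text{every process has positive duration} \\
  & A^- < B^-
      && \text{``the beginning of $A$ precedes the start of $B$''} \\
  & A^+ \le B^-
      && \text{finish-start: $B$ cannot begin until $A$ has ended} \\
  & (A^+ \le B^-) \wedge (B^+ \le C^-) \Rightarrow A^+ \le C^-
      && \text{such constraints compose transitively}
\end{align*}
```

    Read this way, the finish-start barrier of CPM/PERT is just one of many expressible relations rather than the only one, which is the gap a point-and-interval logic is intended to close.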

    Information Accountability Framework for a Trusted Health Care System

    Trusted health care outcomes are patient centric. Requirements that ensure both the quality and the sharing of patients’ health records are key to better clinical decision making. In the context of maintaining quality health care, the sharing of data and information between professionals and patients is paramount. This information sharing is challenging and costly if patients’ trust and institutional accountability are not established. Establishing an Information Accountability Framework (IAF) is the approach taken in this paper. The concepts behind the IAF requirements are transparent responsibilities, relevance of the information being used, and the establishment and evidence of accountability, all of which lead to the desired outcome of a Trusted Health Care System. Once the IAF is in place, the trust component between the public and professionals can be constructed. Preservation of the confidentiality and integrity of patients’ information will lead to trusted health care outcomes.
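    As a hypothetical illustration of what "evidence of accountability" might look like in practice (the paper does not prescribe a schema, and the field names below are invented), a minimal accountability log could record who used which parts of a patient's record, when, and for what stated purpose, so that the patient or an auditor can later review every use.

```python
# Hypothetical sketch of an accountability record: every access to a patient
# record is logged with who, what, when, and a stated purpose, so that
# responsibility is transparent and auditable. The schema is invented.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class AccessEvent:
    accessor: str          # professional or system that used the record
    patient_id: str
    fields_used: List[str]
    purpose: str           # why the information was relevant to the care episode
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class AccountabilityLog:
    def __init__(self):
        self._events: List[AccessEvent] = []

    def record(self, event: AccessEvent) -> None:
        self._events.append(event)

    def audit(self, patient_id: str) -> List[AccessEvent]:
        """Let a patient (or auditor) see every use of their record."""
        return [e for e in self._events if e.patient_id == patient_id]

# Usage
log = AccountabilityLog()
log.record(AccessEvent("dr.smith", "p-42", ["medication"], "review before prescribing"))
for e in log.audit("p-42"):
    print(e)
```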

    Jefferson Digital Commons quarterly report: April-June 2019

    This quarterly report includes: Articles; CREATE Day Presentations; Dissertations; From the Archives; Grand Rounds and Lectures; House Staff Quality Improvement and Patient Safety Posters; JCIPE Student Hotspotting Posters; Journals and Newsletters; MPH Capstone Presentations; Posters; Sigma Xi Research Day; and What People are Saying About the Jefferson Digital Commons.

    National Health Policy


    A State Policymakers' Guide to Federal Health Reform: Part I: Anticipating How Federal Health Reform Will Affect State Roles

    Examines how federal healthcare reform will affect states' tools and roles in connecting people to services, promoting coordination and integration, improving care for those with complex needs, being results-oriented, and increasing efficiencies.

    Bending the Curve: Options for Achieving Savings and Improving Value in Health Spending

    Analyzes the potential of fifteen federal health policy options to lower spending over the next ten years and yield higher value on investments in health care.

    CamFlow: Managed Data-sharing for Cloud Services

    A model of cloud services is emerging whereby a few trusted providers manage the underlying hardware and communications, while many companies build on this infrastructure to offer higher-level, cloud-hosted PaaS services and/or SaaS applications. From the start, strong isolation between cloud tenants was seen to be of paramount importance, provided first by virtual machines (VMs) and later by containers, which share the operating system (OS) kernel. Increasingly, applications also require facilities to effect isolation and protection of the data they manage. They also require flexible data sharing with other applications, often across the traditional cloud-isolation boundaries; for example, when government provides many related services for its citizens on a common platform. Similar considerations apply to the end-users of applications. In particular, the incorporation of cloud services within ‘Internet of Things’ architectures is driving the requirements for both protection and cross-application data sharing. These concerns relate to the management of data. Traditional access control is application and principal/role specific, applied at policy enforcement points, after which there is no subsequent control over where data flows; a crucial issue once data has left its owner's control and is handled by cloud-hosted applications and cloud services. Information Flow Control (IFC), in addition, offers system-wide, end-to-end flow control based on the properties of the data. We discuss the potential of cloud-deployed IFC for enforcing owners' dataflow policy with regard to protection and sharing, as well as safeguarding against malicious or buggy software. In addition, the audit log associated with IFC provides transparency, giving configurable system-wide visibility over data flows. [...] Comment: 14 pages, 8 figures
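    As a toy model of the label-based flow checking the abstract describes (not CamFlow's actual kernel implementation or API), the sketch below tags data and processes with secrecy and integrity labels, permits a flow only when the destination's labels dominate the source's, and records every decision so that an audit log can provide the transparency mentioned above; all names are invented for illustration.

```python
# Toy model of tag-based Information Flow Control: a flow src -> dst is
# allowed only if dst carries all of src's secrecy tags and src carries all
# of dst's required integrity tags. Not CamFlow's real implementation.
from dataclasses import dataclass
from typing import FrozenSet, List, Tuple

@dataclass(frozen=True)
class Label:
    secrecy: FrozenSet[str]    # tags that must not be leaked
    integrity: FrozenSet[str]  # tags vouching for the data's provenance

def can_flow(src: Label, dst: Label) -> bool:
    # Secrecy may only grow along a flow; integrity may only shrink.
    return src.secrecy <= dst.secrecy and dst.integrity <= src.integrity

audit_log: List[Tuple[str, str, bool]] = []

def send(name_src: str, src: Label, name_dst: str, dst: Label) -> bool:
    allowed = can_flow(src, dst)
    audit_log.append((name_src, name_dst, allowed))  # every decision is auditable
    return allowed

# Usage: a 'medical' secrecy tag keeps patient data away from a public endpoint.
patient_db = Label(secrecy=frozenset({"medical"}), integrity=frozenset())
analytics  = Label(secrecy=frozenset({"medical"}), integrity=frozenset())
public_api = Label(secrecy=frozenset(), integrity=frozenset())

print(send("patient_db", patient_db, "analytics", analytics))    # True: labels dominate
print(send("patient_db", patient_db, "public_api", public_api))  # False: would leak 'medical'
print(audit_log)
```

    The point of the audit list here is only to mirror the abstract's claim that IFC decisions can be logged system-wide; real systems attach such provenance at the OS level rather than in application code.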