Computerization of workflows, guidelines and care pathways: a review of implementation challenges for process-oriented health information systems
There is a need to integrate the various theoretical frameworks and formalisms for modeling clinical guidelines, workflows, and pathways, in order to move beyond providing support for individual clinical decisions and toward the provision of process-oriented, patient-centered, health information systems (HIS). In this review, we analyze the challenges in developing process-oriented HIS that formally model guidelines, workflows, and care pathways. A qualitative meta-synthesis was performed on studies published in English between 1995 and 2010 that addressed the modeling process and reported the exposition of a new methodology, model, system implementation, or system architecture. Thematic analysis, principal component analysis (PCA) and data visualisation techniques were used to identify and cluster the underlying implementation 'challenge' themes. One hundred and eight relevant studies were selected for review. Twenty-five underlying 'challenge' themes were identified. These were clustered into 10 distinct groups, from which a conceptual model of the implementation process was developed. We found that the development of systems supporting individual clinical decisions is evolving toward the implementation of adaptable care pathways on the semantic web, incorporating formal, clinical, and organizational ontologies, and the use of workflow management systems. These architectures now need to be implemented and evaluated on a wider scale within clinical settings.
Coreference resolution in clinical discharge summaries, progress notes, surgical and pathology reports: a unified lexical approach
We developed a lexical rule-based system that uses a unified approach to resolving coreference across a wide variety of clinical records comprising discharge summaries, progress notes, pathology, radiology and surgical reports from two corpora (Ontology Development and Information Extraction (ODIE) and i2b2/VA) provided for the fifth i2b2/VA shared task. Taking the unweighted mean between 4 coreference metrics, validation of the system against the i2b2/VA corpus attained an overall F-score of 87.7% across all mention classes, with a maximum of 93.1% for coreference of persons, and a minimum of 77.2% for coreference of tests. For the ODIE corpus the overall F-score across all mention classes was 79.4%, with a maximum of 82.0% for coreference of persons and a minimum of 13.1% for coreference of diagnostic reagents. For the ODIE corpus our results are comparable to the mean reported inter-annotator agreement with the gold standard. We discuss the four categories of errors we identified, and how these might be addressed. The system uses a number of reusable modules and techniques that may be of benefit to the research community.
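The overall scores above are unweighted means across four coreference metrics (the i2b2/VA evaluation commonly used MUC, B-cubed, CEAF and BLANC). A minimal sketch of the averaging, with hypothetical per-metric values rather than the paper's actual figures:

```python
def unweighted_mean_f(scores):
    """Average F-scores across metrics, each metric weighted equally."""
    return sum(scores.values()) / len(scores)

# Hypothetical per-metric F-scores for one mention class (not from the paper)
person_scores = {"MUC": 0.95, "B3": 0.93, "CEAF": 0.92, "BLANC": 0.90}
print(round(unweighted_mean_f(person_scores), 3))  # 0.925
```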
A tool for enhancing MetaMap performance when annotating clinical guideline documents with UMLS concepts
We developed a tool that integrates the National Library of Medicine's MetaMap software with GATE, an open-source text analytics framework. The tool allows non-ASCII encoded documents of numerous formats to be annotated with UMLS concepts. We created a GATE pipeline to chunk cardiovascular disease guideline text into default segments (blank-line delimited), XML element content, sentences and phrases, which were sequentially submitted to MetaMap for annotation. XML element, sentence and phrase chunking allowed term extraction and mapping to be completed in around 1/3 of the time taken with default chunking, although with slight loss of accuracy (F1.0s=0.94-0.99). However, phrase chunking allows more complex input to be processed in real time, which is not possible with the other approaches. We discuss the results in relation to use of MetaMap's --term_processing option for generating pre- and post-coordinated mappings from composite phrases.
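The "default segments (blank-line delimited)" chunking mentioned above can be sketched as follows; this is a simplification for illustration, not the tool's actual GATE implementation:

```python
import re

def blank_line_segments(text):
    """Split a document into segments at runs of blank lines
    (the blank-line-delimited 'default' chunking described above)."""
    return [seg.strip() for seg in re.split(r"\n\s*\n", text) if seg.strip()]

doc = "Aspirin 75 mg daily.\n\nStatins are recommended\nfor secondary prevention."
print(blank_line_segments(doc))
# ['Aspirin 75 mg daily.', 'Statins are recommended\nfor secondary prevention.']
```

Finer-grained chunking (sentences, phrases) would further split each segment before submission to the annotator.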
Automated recognition and post-coordination of complex clinical terms
One of the key tasks in integrating guideline-based decision support systems with the electronic patient record is the mapping of clinical terms contained in both guidelines and patient notes to a common, controlled terminology. However, a vocabulary of pre-coordinated terms cannot cover every possible variation - clinical terms are often highly compositional and complex. We present a rule-based approach for automated recognition and post-coordination of clinical terms using minimal, morpheme-based thesauri, neoclassical combining forms and part-of-speech analysis. The process integrates MetaMap with the open-source GATE framework.
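The morpheme-based decomposition of compositional terms described above can be illustrated with a minimal sketch; the tiny thesaurus and the greedy strategy are assumptions for illustration, not the paper's actual resources or rules:

```python
# Illustrative combining-form thesaurus (not the paper's actual thesauri)
COMBINING_FORMS = {
    "hepato": "liver",
    "spleno": "spleen",
    "megaly": "enlargement",
    "cardio": "heart",
}

def decompose(term):
    """Greedy left-to-right split of a term into known combining forms."""
    parts, i = [], 0
    while i < len(term):
        for form in sorted(COMBINING_FORMS, key=len, reverse=True):
            if term.startswith(form, i):
                parts.append(form)
                i += len(form)
                break
        else:
            return None  # unknown morpheme: fall back to other strategies
    return parts

print(decompose("hepatosplenomegaly"))  # ['hepato', 'spleno', 'megaly']
```

A post-coordination step would then map each recovered morpheme to a concept and compose them into a structured expression.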
Enabling initiation of a lean management system in SMEs: a case study of a high performance plastics manufacturer
This paper investigates the challenges in developing a Lean Management System in a typical Small to Medium Size Enterprise (SME) in the UK. Through a series of pilot projects measured and implemented by action research, this case study reflects on the changes in mind-set and behaviours that are required on the part of the researcher in order to implement a Lean Manufacturing System. The researcher's suggestions to adopt observational methods such as visual data management and Hoshin Kanri, and their implementation, were part of the action research. The data gathered influenced management strategy and planning to incorporate lean practices in the organisation. The actions and results were achieved through the workforce's commitment to ensure embedding and sustainability for the future. Benefits realised included a 21% increase in on-time delivery performance, and cross-functional problem-solving actions resulted in a lead time reduction from 8 to 4 weeks.
Minimizing the Time of Spam Mail Detection by Relocating Filtering System to the Sender Mail Server
Unsolicited Bulk Emails (also known as Spam) are undesirable emails sent to a massive number of users. Spam emails consume network resources and cause many security concerns. As our study shows, the location where the spam filter operates is an important parameter for preserving network resources. Although there are many different methods for blocking spam emails, most developers only aim to block spam emails from being delivered to their clients. In this paper, we introduce a new and efficient approach to prevent spam emails from being transferred at all. The results show that if we focus on developing a filtering method for spam emails in the sender mail server rather than the receiver mail server, we can detect spam emails in the shortest time and consequently avoid wasting network resources.
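The idea of filtering at the sender's server can be sketched as an outbound check applied before a message is transferred; the keyword list and threshold below are illustrative assumptions, not the paper's actual filtering method:

```python
# Hypothetical keyword set for illustration only
SPAM_KEYWORDS = {"winner", "free money", "click here"}

def outbound_allowed(body, threshold=2):
    """Return False when the message matches too many spam keywords,
    blocking it at the sender before it consumes network resources."""
    hits = sum(1 for kw in SPAM_KEYWORDS if kw in body.lower())
    return hits < threshold

print(outbound_allowed("Meeting moved to 3pm"))               # True
print(outbound_allowed("WINNER! Click here for free money"))  # False
```

The design point is where the check runs, not the scoring itself: rejecting at the sender's mail server stops the message before any transfer cost is incurred.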
A Volume-of-Fluid Based Numerical Simulation of Solidification in Binary Alloys on Fixed Non-uniform Co-located Grids
In this work, the author presents a platform for the modeling of mold filling and solidification of binary alloys with properties similar to Mg alloys. A volume-of-fluid (VOF) based method to capture the interface between solid and liquid in a solidification process on a fixed 2D non-uniform grid, developed for implementation in a co-located finite volume framework, is presented. Contrary to other works, to update the volume fraction of fluid in the field, a link between source-based type of energy equation and VOF algorithm is described and implemented. A new approximation to the pressure gradient is presented to remove all "spurious currents" [1] resulting from pressure jumps in the vicinity of the interface.
Based upon the work presented, it is concluded that the present combination of equations is not only computationally simple to implement and extend to a 3D problem, but also provides an excellent platform to capture the interface between constituents in a die-casting process, including solidification and mold filling. This will lead to a better understanding of the die-casting process.
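The volume-of-fluid method mentioned above evolves a volume fraction field F in each cell; as a general statement of the technique (not the paper's exact formulation, which couples F to a source-based energy equation), F is advected by the velocity field according to:

```latex
\frac{\partial F}{\partial t} + \nabla \cdot \left( \mathbf{u}\, F \right) = 0,
\qquad 0 \le F \le 1
```

In a solidification problem, the latent-heat source term of the energy equation is then tied to changes in F, which is the link between the energy equation and the VOF algorithm that the abstract describes.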
Subcritical water extraction of antioxidant compounds from canola meal
Antioxidant compounds were extracted from canola meal by subcritical water extraction (SWE), hot water (80°C) extraction and ethanolic (95%) extraction. The highest extract yields were obtained with SWE at 160°C, and the lowest with ethanolic extraction (SWE 160°C > SWE sequential > SWE 135°C > SWE 110°C = hot water extraction > ethanolic extraction). Ethanolic extracts exhibited the highest total phenolics contents and Trolox equivalent antioxidant capacity (TEAC) values on a per gram of extract basis, and hot water extracts, the lowest (ethanolic extraction > SWE 110°C > SWE 160°C > hot water extraction). Extraction pressure (3.44-6.89 MPa) had no effect on the yields, total phenolics contents or TEAC values of extracts from SWE. The use of buffered water (pH 2-8) for SWE increased extract yield but had adverse effects on the total phenolics contents and TEAC values of extracts. No increase in efficacy of SWE at 110 or 160°C was observed at extraction times longer than 25-30 min.
The total phenolics contents and antioxidant capacities of extracts were assessed by the total phenolics assay, the 2,2-diphenyl-1-picrylhydrazyl (DPPH•) scavenging method, the TEAC method, the β-carotene-linoleic acid (linoleate) model system, the reducing power assay and the stripped oil model system. Ethanolic extracts exhibited the highest total phenolic contents and antioxidant capacities on a per gram of extract basis. Subcritical water extraction at 160°C exhibited the highest total phenolic contents and antioxidant capacities on a per gram of meal basis. Results from the total phenolics assay and the antioxidant capacity assays were significantly correlated, with the exception of those from the stripped oil model system.
Conceptual analysis of a diverse set of healthcare quality indicators
Ontologies, described as a specification of a representational vocabulary for a shared domain of discourse [1], can facilitate automated quality monitoring by categorising and establishing relationships between concepts. In terms of ontology development, conceptualisation is the informal representation of domain terms in the form of concepts, instances, relations, and properties [2]. Chan et al [3] suggest a need for research into attributes of quality indicators to support electronic health record (EHR) compatibility. Identification of levels of indicator relationships can serve as a step towards repackaging formulas into reusable components.
A 2009 set of over 200 indicators, collated by the English National Health Service Health and Social Care Information Centre (NHS HSCIC), was chosen to address some of the gaps in research exploring ontologies and healthcare quality indicators [4]. The gaps included: research on healthcare quality indicator purposes, an ontology for healthcare quality indicators that is not dependent on data available in EHRs, an ontology that covers many clinical subject areas, and a healthcare quality indicator ontology that does not require a framework for indicator development.
We set out to identify relationships and layers of inclusion and exclusion criteria for a large, diverse set of quality indicators from the English NHS. The indicators, originating from different sources ranging from the UK Renal Registry to the NHS Quality and Outcomes Framework, are measures related to processes and outcomes. Our analysis served as the conceptualisation stage for an ontology for the indicators.
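As an illustration of the conceptualisation stage described above, inclusion and exclusion criteria can be modelled as reusable components attached to indicator concepts. This is a minimal sketch; the class and field names are assumptions for illustration, not the NHS HSCIC indicator schema:

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    description: str
    include: bool  # True = inclusion criterion, False = exclusion

@dataclass
class Indicator:
    name: str
    measure_type: str  # e.g. "process" or "outcome"
    criteria: list = field(default_factory=list)

# Hypothetical indicator, loosely in the style of a renal-registry measure
ind = Indicator(
    name="Patients on renal replacement therapy with Hb in target range",
    measure_type="outcome",
    criteria=[
        Criterion("On renal replacement therapy", include=True),
        Criterion("Aged under 18", include=False),
    ],
)
print(sum(c.include for c in ind.criteria))  # 1
```

Shared criteria such as age bands could then be defined once and reused across indicators, which is the repackaging into reusable components that the analysis above works toward.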