
    Integrated patient-to-room and nurse-to-patient assignment in hospital wards

    Assigning patients to rooms and nurses to patients are critical tasks within hospitals that directly affect patient and staff satisfaction, quality of care, and hospital efficiency. Both patient-to-room assignments and nurse-to-patient assignments are typically agreed upon at the ward level, and they interact in several ways, such as jointly determining the walking distances nurses must cover between patient rooms. This motivates considering both problems jointly in an integrated fashion. This paper presents the first optimization models and algorithms for the integrated patient-to-room and nurse-to-patient assignment problem. We provide a mixed-integer programming formulation of the integrated problem that considers the typical objectives from the single problems as well as additional objectives that can only be properly evaluated when integrating both problems. Moreover, motivated by the inherent complexity that results from integrating these two NP-hard and already computationally challenging problems, we devise an efficient heuristic for the integrated problem. To evaluate the running time and solution quality of the heuristic, we conduct extensive computational experiments on both artificial and real-world instances. The artificial instances are generated by a parameterized instance generator for the integrated problem that is made freely available.
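The coupling described above can be illustrated with a toy greedy heuristic. This is not the paper's algorithm; the capacities, room ordering, and the "fill each nurse room by room" rule are illustrative assumptions that stand in for the walking-distance objective:

```python
# Toy sketch of an integrated assignment heuristic (NOT the paper's method).
# Rooms are assumed to be listed in corridor order, so concentrating a
# nurse's patients in consecutive rooms is a proxy for short walking paths.

def assign(patients, rooms, nurses, room_capacity, nurse_capacity):
    """Greedily place patients into rooms, then group room-mates under
    the same nurse so each nurse covers as few rooms as possible."""
    # Patient-to-room: first-fit in corridor order.
    room_of, load = {}, {r: 0 for r in rooms}
    for p in patients:
        for r in rooms:
            if load[r] < room_capacity:
                room_of[p] = r
                load[r] += 1
                break
    # Nurse-to-patient: walk the rooms in order and fill one nurse at a
    # time, so each nurse's patients sit in adjacent rooms.
    nurse_of, it, count = {}, iter(nurses), 0
    nurse = next(it)
    for r in rooms:
        for p in (q for q in patients if room_of.get(q) == r):
            if count == nurse_capacity:
                nurse, count = next(it), 0
            nurse_of[p] = nurse
            count += 1
    return room_of, nurse_of

room_of, nurse_of = assign(list(range(6)), ["A", "B", "C"], ["n1", "n2"],
                           room_capacity=2, nurse_capacity=3)
print(room_of)   # patients 0,1 -> A; 2,3 -> B; 4,5 -> C
print(nurse_of)  # n1 covers rooms A-B, n2 covers rooms B-C
```

A real integrated model would score both assignments against the joint objectives simultaneously; this sketch only shows why sequencing the two decisions couples them.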

    In situ X-ray imaging of hot cracking and porosity during LPBF of Al-2139 with TiB2 additions and varied process parameters

    Laser powder bed fusion (LPBF) additive manufacturing of 2XXX series Al alloys could be used for low-volume specialist aerospace components; however, such alloys exhibit hot cracking susceptibility that can lead to component failure. In this study, we show two approaches to suppress the formation of hot cracks by controlling solidification behaviour using: (1) TiB2 additions; and (2) optimisation of LPBF process parameters. Using high-speed synchrotron X-ray radiography, we monitored LPBF of Al-2139 in situ, with and without TiB2, under a range of process conditions. In situ X-ray radiography captured crack growth over 1.0 ms at a rate of ca. 110 mm s⁻¹, as well as pore evolution, wetting behaviour, and build height. High-resolution synchrotron X-ray computed tomography (sCT) was used to measure the volume fraction of defects, e.g. hydrogen pores and microcracks, in the as-built LPBF samples. Our results show that adding TiB2 to Al-2139 reduces the volume of cracks by up to 79% under a volume energy density of 1000 to 5000 J mm⁻³, as well as reducing the average length, breadth, and surface area of cracks.
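The two reported figures pin down the crack extension visible in a single imaging window. A back-of-the-envelope check, using only the numbers in the abstract:

```python
# Crack extension during one radiography window, from the reported values:
# growth rate ca. 110 mm/s observed over a 1.0 ms window.
growth_rate_mm_s = 110.0   # reported crack growth rate, mm per second
window_s = 1.0e-3          # imaging window, 1.0 ms in seconds

crack_extension_mm = growth_rate_mm_s * window_s
print(crack_extension_mm)  # ca. 0.11 mm of crack growth per 1.0 ms window
```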

    Collective Particle Flow through Random Media

    A simple model for the nonlinear collective transport of interacting particles in a random medium with strong disorder is introduced and analyzed. A finite threshold for the driving force divides the behavior into two regimes characterized by the presence or absence of a steady-state particle current. Below this threshold, transient motion is found in response to an increase in the force, while above threshold the flow approaches a steady state with motion only on a network of channels which is sparse near threshold. Some of the critical behavior near threshold is analyzed via mean field theory, and analytic results on the statistics of the moving phase are derived. Many of the results should apply, at least qualitatively, to the motion of magnetic bubble arrays and to the driven motion of vortices in thin-film superconductors when the randomness is strong enough to destroy the tendencies to lattice order even on short length scales. Various history-dependent phenomena are also discussed.

    Comment: 63 preprint pages plus 6 figures. Submitted to Phys. Rev.
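The threshold picture can be caricatured with independent channels, each pinned by a random force scale. This is a deliberately crude sketch, not the paper's interacting model: interactions, transients, and history dependence are ignored, and the uniform threshold distribution is an assumption.

```python
import random

# Caricature of threshold transport (NOT the paper's model): each of N
# independent channels has a random pinning threshold, and a channel
# carries steady current only when the drive F exceeds its threshold.
# Just above the smallest thresholds, the flowing network is sparse.

def moving_fraction(F, thresholds):
    """Fraction of channels whose pinning threshold is below the drive F."""
    return sum(t < F for t in thresholds) / len(thresholds)

random.seed(0)
thresholds = [random.random() for _ in range(10_000)]  # uniform on (0, 1)

print(moving_fraction(0.05, thresholds))  # sparse: few channels flow
print(moving_fraction(0.90, thresholds))  # dense: most channels flow
```

In the actual model the channels are coupled, so depinning is collective rather than a sum of independent events; the sketch only illustrates the sparse-near-threshold geometry of the flowing network.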

    SciHadoop: Array-based query processing in Hadoop

    Hadoop has become the de facto platform for large-scale data analysis in commercial applications, and increasingly so in scientific applications. However, Hadoop's byte-stream data model causes inefficiencies when used to process scientific data that is commonly stored in highly structured, array-based binary file formats. This limits the scalability of Hadoop applications in science. We introduce SciHadoop, a Hadoop plugin that allows scientists to specify logical queries over array-based data models. SciHadoop executes queries as map/reduce programs defined over the logical data model. We describe the implementation of a SciHadoop prototype for NetCDF data sets and quantify the performance of five separate optimizations that address the following goals for a representative holistic aggregate function query: reduce total data transfers, reduce remote reads, and reduce unnecessary reads. Two optimizations allow holistic functions to be evaluated opportunistically during the map phase; two additional optimizations intelligently partition input data to increase read locality; and one optimization avoids block scans by examining the data dependencies of an executing query to prune input partitions. Experiments involving a holistic function show run-time improvements of up to 8x, with drastic reductions of I/O, both locally and over the network.
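The core idea, a logical array query evaluated per physical chunk in the map phase, with chunk pruning from the query's index range, can be sketched in plain Python. This is not SciHadoop's API; the chunk layout, the max aggregate, and the pruning test are illustrative stand-ins:

```python
# Sketch of a logical array query over chunked storage (NOT SciHadoop's
# actual API). A holistic-style aggregate (max) over a logical index
# range is evaluated per chunk in the "map" phase and combined in the
# "reduce" phase; chunks outside the range are pruned before any read.

def map_chunk(chunk, lo, hi):
    """Partial max over the chunk's cells that fall inside [lo, hi)."""
    vals = [v for i, v in chunk if lo <= i < hi]
    return max(vals) if vals else None

def reduce_partials(partials):
    """Combine per-chunk partial results into the final answer."""
    vals = [p for p in partials if p is not None]
    return max(vals) if vals else None

# A 12-cell logical array stored as three 4-cell chunks of (index, value).
chunks = [[(i, i * i % 7) for i in range(c, c + 4)] for c in (0, 4, 8)]

lo, hi = 0, 4  # logical query range [0, 4)
# Prune chunks whose index extent cannot intersect the query range.
pruned = [ch for ch in chunks if ch[-1][0] >= lo and ch[0][0] < hi]
partials = [map_chunk(ch, lo, hi) for ch in pruned]
print(len(pruned), reduce_partials(partials))  # only 1 of 3 chunks read
```

Here pruning drops two of the three chunks without scanning them, which is the same effect the abstract attributes to the block-scan-avoiding optimization.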

    Life cycle assessment of emerging technologies: Evaluation techniques at different stages of market and technical maturity

    Life cycle assessment (LCA) analysts are increasingly being asked to conduct life cycle-based, systems-level analysis at the earliest stages of technology development. While early assessments provide the greatest opportunity to influence design and, ultimately, environmental performance, this is the stage with the least available data, the greatest uncertainty, and a paucity of analytic tools for addressing these challenges. While the fundamental approach to conducting an LCA of emerging technologies is akin to that of LCA of existing technologies, emerging technologies pose additional challenges. In this paper, we present a broad set of market and technology characteristics that typically influence an LCA of emerging technologies and identify questions that researchers must address to account for the most important aspects of the systems they are studying. The paper presents: (a) guidance to identify the specific technology characteristics and dynamic market context that are most relevant and unique to a particular study; (b) an overview of the challenges faced by early-stage assessments that are unique because of these conditions; (c) questions that researchers should ask themselves for such a study to be conducted; and (d) illustrative examples from the transportation sector to demonstrate the factors to consider when conducting LCAs of emerging technologies. The paper is intended to be used as an organizing platform to synthesize existing methods, procedures, and insights and to guide researchers, analysts, and technology developers to better recognize key study design elements and to manage expectations of study outcomes.

    Risk estimation of distant metastasis in node-negative, estrogen receptor-positive breast cancer patients using an RT-PCR based prognostic expression signature

    Background: Given the large number of genes purported to be prognostic for breast cancer, it would be optimal if the genes identified are not confounded by the continuously changing systemic therapies. The aim of this study was to discover and validate a breast cancer prognostic expression signature for distant metastasis in untreated, early-stage, lymph node-negative (N-), estrogen receptor-positive (ER+) patients with extensive follow-up times.

    Methods: 197 genes previously associated with metastasis and ER status were profiled from 142 untreated breast cancer subjects. A "metastasis score" (MS) representing fourteen differentially expressed genes was developed and evaluated for its association with distant-metastasis-free survival (DMFS). Categorical risk classification was established from the continuous MS and further evaluated on an independent set of 279 untreated subjects. A third set of 45 subjects was tested to determine the prognostic performance of the MS in tamoxifen-treated women.

    Results: A 14-gene signature was found to be significantly associated (p < 0.05) with distant metastasis in a training set and subsequently in an independent validation set. In the validation set, the hazard ratios (HR) of the high-risk compared to low-risk groups were 4.02 (95% CI 1.91–8.44) for the endpoint of DMFS and 1.97 (95% CI 1.28–3.04) for overall survival after adjustment for age, tumor size, and grade. The low and high MS risk groups had 10-year estimates (95% CI) of 96% (90–99%) and 72% (64–78%), respectively, for DMFS, and 91% (84–95%) and 68% (61–75%), respectively, for overall survival. Performance characteristics of the signature in the two sets were similar. Ki-67 labeling index (LI) was predictive for recurrent disease in the training set, but lost significance after adjustment for the expression signature. In a study of tamoxifen-treated patients, the HR for DMFS in high compared to low risk groups was 3.61 (95% CI 0.86–15.14).

    Conclusion: The 14-gene signature is significantly associated with risk of distant metastasis. The signature has a predominance of proliferation genes, which have prognostic significance above that of Ki-67 LI, and may aid in prioritizing future mechanistic studies and therapeutic interventions.
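The continuous-score-plus-cutoff scheme described above is a standard pattern that can be sketched directly. The gene names, weights, and cutoff below are entirely hypothetical, since the 14-gene signature's actual coefficients are not given in the abstract; only the structure (weighted expression sum, thresholded into low/high risk) follows the text:

```python
# Sketch of a continuous expression score thresholded into risk groups.
# Gene names, weights, and cutoff are HYPOTHETICAL placeholders; the
# real signature uses 14 genes with coefficients not given here.

WEIGHTS = {"geneA": 0.8, "geneB": -0.3, "geneC": 0.5}  # hypothetical
CUTOFF = 1.0                                           # hypothetical

def metastasis_score(expression):
    """Weighted sum of expression values over the signature genes."""
    return sum(WEIGHTS[g] * expression[g] for g in WEIGHTS)

def risk_group(expression):
    """Dichotomize the continuous score into the categorical classes."""
    return "high" if metastasis_score(expression) >= CUTOFF else "low"

patient = {"geneA": 1.5, "geneB": 0.2, "geneC": 0.4}  # illustrative values
print(metastasis_score(patient), risk_group(patient))
```

In the study itself, the cutoff and weights would be fit on the 142-subject training set and then frozen before being applied to the 279-subject validation set.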