
    ENABLING EFFICIENT FLEET COMPOSITION SELECTION THROUGH THE DEVELOPMENT OF A RANK HEURISTIC FOR A BRANCH AND BOUND METHOD

    In the foreseeable future, autonomous mobile robots (AMRs) will become a key enabler for increasing productivity and flexibility in material handling in warehousing facilities, distribution centers and manufacturing systems. The objective of this research is to develop and validate parametric models of AMRs, develop a ranking heuristic using a physics-based algorithm within the framework of the Branch and Bound method, integrate the ranking algorithm into a Fleet Composition Optimization (FCO) tool, and finally conduct simulations under various scenarios to verify the suitability and robustness of the developed tool in a factory equipped with AMRs. Kinematic equations are used to compute both energy and time consumption. Multivariate linear regression, a data-driven method, is used to design the ranking heuristic. The results indicate that the unique physical structures and parameters of each robot are the main factors contributing to differences in energy and time consumption. A reduction in computation time was achieved when the heuristic-based search was compared with the non-heuristic-based search. This research is expected to significantly improve the current nested fleet composition optimization tool by reducing computation time without sacrificing optimality. From a practical perspective, greater efficiency in reducing energy and time costs can be achieved.
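
    The abstract gives no implementation details, so the following is only a minimal sketch of the general idea it describes: a best-first Branch and Bound search over fleet compositions in which child nodes are ranked by a linear (regression-style) heuristic. The robot parameters, cost model, and regression coefficients are hypothetical placeholders, not values from the research.

```python
# Minimal sketch of a rank-heuristic Branch and Bound for fleet composition
# selection (hypothetical data; not the FCO tool described in the abstract).
import heapq
import itertools

# Hypothetical robot types: (name, throughput, energy cost per unit, time cost per unit)
ROBOTS = [("AMR-A", 12.0, 3.0, 2.0), ("AMR-B", 8.0, 1.8, 2.5), ("AMR-C", 5.0, 1.0, 3.5)]
REQUIRED_THROUGHPUT = 40.0
MAX_PER_TYPE = 6

# Stand-in for the multivariate-linear-regression ranking heuristic:
# predicted remaining cost as a linear function of unmet throughput (assumed coefficients).
W0, W1 = 0.5, 0.21

def heuristic(unmet_throughput):
    return max(0.0, W0 + W1 * unmet_throughput)

def cost(counts):
    return sum(n * (e + t) for n, (_, _, e, t) in zip(counts, ROBOTS))

def throughput(counts):
    return sum(n * tp for n, (_, tp, _, _) in zip(counts, ROBOTS))

def branch_and_bound():
    best_cost, best_fleet = float("inf"), None
    tie = itertools.count()                      # tie-breaker so the heap never compares count tuples
    heap = [(heuristic(REQUIRED_THROUGHPUT), next(tie), (0,) * len(ROBOTS))]
    seen = set()
    while heap:
        _, _, counts = heapq.heappop(heap)
        c = cost(counts)
        if c >= best_cost:                       # bound: adding robots can only increase cost
            continue
        if throughput(counts) >= REQUIRED_THROUGHPUT:
            best_cost, best_fleet = c, counts    # feasible and cheaper than the incumbent
            continue
        unmet = REQUIRED_THROUGHPUT - throughput(counts)
        for i, (_, tp, _, _) in enumerate(ROBOTS):   # branch: add one robot of type i
            child = counts[:i] + (counts[i] + 1,) + counts[i + 1:]
            if counts[i] >= MAX_PER_TYPE or child in seen:
                continue
            seen.add(child)
            rank = cost(child) + heuristic(max(0.0, unmet - tp))   # heuristic ranking of branches
            heapq.heappush(heap, (rank, next(tie), child))
    return best_fleet, best_cost

if __name__ == "__main__":
    fleet, total_cost = branch_and_bound()
    print(dict(zip((name for name, *_ in ROBOTS), fleet)), round(total_cost, 2))
```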

    Qluster: An easy-to-implement generic workflow for robust clustering of health data

    The exploration of health data with clustering algorithms allows the populations of interest to be described in more detail by identifying the sub-profiles that compose them. This reinforces medical knowledge, whether about a disease or a targeted real-life population. Nevertheless, contrary to so-called conventional biostatistical methods, for which numerous guidelines exist, the standardization of data science approaches in clinical research remains a little-discussed subject. This results in significant variability in the execution of data science projects, whether in terms of the algorithms used or the reliability and credibility of the designed approach. Taking the path of a parsimonious and judicious choice of both algorithms and implementations at each stage, this article proposes Qluster, a practical workflow for performing clustering tasks. This workflow strikes a compromise between (1) genericity of application (e.g. usable on small or big data, on continuous, categorical or mixed variables, and on databases of high dimensionality or not), (2) ease of implementation (few packages, few algorithms, few parameters, ...), and (3) robustness (e.g. use of proven algorithms and robust packages, evaluation of cluster stability, management of noise and multicollinearity). The workflow can be easily automated and/or routinely applied to a wide range of clustering projects. It can be useful both for data scientists with little experience in the field, to make data clustering easier and more robust, and for more experienced data scientists looking for a straightforward and reliable solution for routine preliminary data mining. A synthesis of the literature on data clustering, as well as the scientific rationale supporting the proposed workflow, is also provided. Finally, a detailed application of the workflow to a concrete use case is presented, along with a practical discussion for data scientists. An implementation on the Dataiku platform is available upon request to the authors.
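
    The abstract describes the workflow only at a high level, so the snippet below is not Qluster itself but a hedged, generic illustration of two ingredients it mentions: a proven clustering algorithm with few parameters, and an evaluation of cluster stability. The data are synthetic, and the choice of k-means, silhouette-based model selection, and bootstrap resampling are assumptions made for illustration.

```python
# Generic "robust clustering" sketch (not Qluster): scale, pick k by silhouette,
# then check cluster stability by bootstrap resampling (adjusted Rand index).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score, adjusted_rand_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))            # placeholder for a real health dataset
X[:150] += 3.0                           # inject two crude sub-profiles
X = StandardScaler().fit_transform(X)

# 1) choose the number of clusters by silhouette score
scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X))
          for k in range(2, 6)}
best_k = max(scores, key=scores.get)

# 2) fit the final model and assess stability on bootstrap resamples
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
stability = []
for b in range(20):
    idx = rng.choice(len(X), size=len(X), replace=True)
    boot = KMeans(n_clusters=best_k, n_init=10, random_state=b).fit(X[idx])
    stability.append(adjusted_rand_score(labels, boot.predict(X)))

print(f"k={best_k}, silhouette={scores[best_k]:.2f}, "
      f"median bootstrap ARI={np.median(stability):.2f}")
```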

    Targeted proteomics links virulence factor expression with clinical severity in staphylococcal pneumonia

    Introduction: The bacterial pathogen Staphylococcus aureus harbors numerous virulence factors that impact infection severity. Beyond virulence gene presence or absence, the expression level of virulence proteins is known to vary across S. aureus lineages and isolates. However, the impact of expression level on severity is poorly understood due to the lack of high-throughput quantification methods for virulence proteins. Methods: We present a targeted proteomic approach able to monitor 42 staphylococcal proteins in a single experiment. Using this approach, we compared the quantitative virulomes of 136 S. aureus isolates from a nationwide cohort of French patients with severe community-acquired staphylococcal pneumonia, all requiring intensive care. We used multivariable regression models adjusted for patient baseline health (Charlson comorbidity score) to identify the virulence factors whose in vitro expression level predicted pneumonia severity markers, namely leukopenia and hemoptysis, as well as patient survival. Results: We found that leukopenia was predicted by higher expression of HlgB, Nuc, and Tsst-1 and lower expression of BlaI and HlgC, while hemoptysis was predicted by higher expression of BlaZ and HlgB and lower expression of HlgC. Strikingly, mortality was independently predicted in a dose-dependent fashion by a single phage-encoded virulence factor, the Panton-Valentine leucocidin (PVL), in both logistic (OR 1.28; 95% CI [1.02; 1.60]) and survival (HR 1.15; 95% CI [1.02; 1.30]) regression models. Discussion: These findings demonstrate that the in vitro expression level of virulence factors can be correlated with infection severity using targeted proteomics, a method that may be adapted to other bacterial pathogens.
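
    As a hedged illustration of the kind of multivariable model described in the Methods (not the study's actual analysis or data), the sketch below fits a logistic regression relating standardized virulence-protein abundances to a binary severity marker while adjusting for the Charlson comorbidity score. All variable names, coefficients, and data are synthetic.

```python
# Illustrative only: multivariable logistic regression of a severity marker on
# virulence-protein abundance, adjusted for Charlson score (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 136                                            # cohort size quoted in the abstract
df = pd.DataFrame({
    "pvl": rng.normal(size=n),                     # standardized protein abundances (synthetic)
    "hlgb": rng.normal(size=n),
    "charlson": rng.integers(0, 8, size=n),        # baseline comorbidity score (synthetic)
})
logit_p = -1.0 + 0.25 * df["pvl"] + 0.4 * df["hlgb"] + 0.1 * df["charlson"]
df["leukopenia"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["pvl", "hlgb", "charlson"]])
fit = sm.Logit(df["leukopenia"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)                   # OR per 1-SD increase in abundance
print(pd.DataFrame({"OR": odds_ratios, "p": fit.pvalues}).round(3))
```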

    Measurement of telescope transmission using a Collimated Beam Projector

    With the increasingly large number of type Ia supernovae being detected by current-generation survey telescopes, and even more expected with the upcoming Rubin Observatory Legacy Survey of Space and Time, the precision of cosmological measurements will become limited by systematic uncertainties in flux calibration rather than by statistical noise. One major source of systematic error in determining SNe Ia color evolution (needed for distance estimation) is uncertainty in telescope transmission, both within and between surveys. We introduce here the Collimated Beam Projector (CBP), which is designed to measure telescope transmission using collimated light. The collimated beam mimics a stellar wavefront more closely than flat-field-based instruments do, allowing more precise handling of systematic errors such as those from ghosting and the filter angle-of-incidence dependence. As a proof of concept, we present CBP measurements of the StarDICE prototype telescope, achieving a standard (1 sigma) uncertainty of 3% on average over the full wavelength range measured with a single beam illumination.
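
    At its core, a transmission measurement of this kind reduces to a wavelength-by-wavelength ratio of the signal detected through the telescope to the amount of light injected by the collimated source, as tracked by a reference detector. The sketch below is only a schematic of that ratio with synthetic numbers; the actual CBP/StarDICE reduction is considerably more involved.

```python
# Schematic transmission-as-a-ratio sketch (synthetic arrays, not CBP data).
import numpy as np

wavelength_nm = np.arange(400, 1001, 50)
injected_charge = np.full_like(wavelength_nm, 1.0e6, dtype=float)  # reference detector (arbitrary units)
detected_flux = 0.55 * np.exp(-0.5 * ((wavelength_nm - 650) / 250.0) ** 2) * injected_charge

transmission = detected_flux / injected_charge
rel_err = 0.03                                   # ~3% average 1-sigma, as quoted in the abstract
for wl, t in zip(wavelength_nm, transmission):
    print(f"{wl:4d} nm  T = {t:.3f} +/- {t * rel_err:.3f}")
```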

    A generalised multi-factor deep learning electricity load forecasting model for wildfire-prone areas

    This paper proposes a generalised and robust multi-factor Gated Recurrent Unit (GRU) based Deep Learning (DL) model to forecast electricity load in distribution networks during wildfire seasons. The flexible modelling methods consider data input structure, calendar effects and correlation-based leading temperature conditions. Compared with the regular use of instantaneous temperature, the proposed input feature selection and leading temperature relationships decrease the Mean Absolute Percentage Error (MAPE) by 30.73%. Our model is generalised and applied to eight real distribution networks in Victoria, Australia, during the wildfire seasons of 2015-2020. We demonstrate that the GRU-based model consistently outperforms another DL model, Long Short-Term Memory (LSTM), at every step, giving average improvements in Mean Squared Error (MSE) and MAPE of 10.06% and 12.86%, respectively. The sensitivity to large-scale climate variability in the training data sets, e.g. El Niño or La Niña years, is considered to understand the possible consequences for load forecasting performance stability, and is shown to have minimal impact. Other factors, such as regional poverty rate and large-scale off-peak electricity use, are potential avenues to further improve forecast performance. The proposed method achieves an average forecast MAPE of around 3%, giving a potential annual energy saving of AU$80.46 million for the state of Victoria.
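
    For concreteness, here is a minimal PyTorch sketch of a GRU-based load regressor of the general kind the paper describes. The window length, feature count, layer sizes, and single training step are assumptions for illustration and are not the paper's architecture.

```python
# Minimal GRU load-forecasting sketch (assumed shapes and hyperparameters).
import torch
import torch.nn as nn

class GRULoadForecaster(nn.Module):
    def __init__(self, n_features=6, hidden=64, n_layers=2):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, num_layers=n_layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, window, n_features)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])        # predict the next-step load

model = GRULoadForecaster()
x = torch.randn(32, 48, 6)                     # e.g. 48 half-hourly steps, 6 features (load, calendar, temperature, ...)
y = torch.randn(32, 1)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.MSELoss()(model(x), y)               # MSE objective, matching the metric reported in the paper
loss.backward()
opt.step()
print(float(loss))
```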

    Kirchhoff-Love shell representation and analysis using triangle configuration B-splines

    This paper presents the application of triangle configuration B-splines (TCB-splines) to representing and analyzing Kirchhoff-Love shells in the context of isogeometric analysis (IGA). The Kirchhoff-Love shell formulation requires globally C^1-continuous basis functions. Nonuniform rational B-spline (NURBS)-based IGA has been extensively used for developing Kirchhoff-Love shell elements. However, shells with complex geometries inevitably need multiple patches and trimming techniques, where stitching patches with high continuity is a challenge. On the other hand, due to their unstructured nature, TCB-splines can accommodate general polygonal domains, allow local refinement, and are flexible enough to model complex geometries with C^1 continuity, which naturally fits the Kirchhoff-Love shell formulation for complex geometries. Therefore, we propose to use TCB-splines as basis functions for geometric representation and solution approximation. We apply our method to both linear and nonlinear benchmark shell problems, where the accuracy and robustness are validated. The applicability of the proposed approach to shell analysis is further exemplified by performing geometrically nonlinear Kirchhoff-Love shell simulations of a pipe junction and a front bumper represented by a single patch of TCB-splines.
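
    For context on why global C^1 continuity is needed (standard Kirchhoff-Love theory rather than anything specific to this paper): the bending part of the shell energy involves changes of curvature, i.e. second derivatives of the midsurface displacement, so the discrete basis must be at least C^1 across element boundaries. Schematically:

```latex
% Schematic Kirchhoff-Love shell energy (standard form, not taken from the paper):
% membrane strains \varepsilon_{\alpha\beta} involve first derivatives of the
% midsurface displacement u, while bending strains \kappa_{\alpha\beta} involve
% second derivatives, hence the need for globally C^1-continuous basis functions.
\begin{equation}
  \Pi(u) \;=\; \frac{1}{2}\int_{\Omega}
    \Big( \varepsilon_{\alpha\beta}(u)\, \mathbb{A}^{\alpha\beta\gamma\delta}\, \varepsilon_{\gamma\delta}(u)
        + \kappa_{\alpha\beta}(u)\, \mathbb{D}^{\alpha\beta\gamma\delta}\, \kappa_{\gamma\delta}(u) \Big)\, \mathrm{d}\Omega
    \;-\; \int_{\Omega} u \cdot f \,\mathrm{d}\Omega .
\end{equation}
```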

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals using the vertical dimension via multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting, and could benefit complex sectors that have only scarce data with which to predict business viability. To begin the execution of the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Learnings from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks, which are organised into a risk taxonomy. Labour was the most commonly reported top challenge, so research was conducted to explore lean principles for improving productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enables flexible computation, improving the accuracy of economic estimation even without precise production or financial data. The model assessed two VPF cases (one in the UK and another in Japan), providing the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and the viability of the UK and Japan cases. An environmental impact assessment model was also developed, allowing VPF operators to evaluate their carbon footprint relative to traditional agriculture using life-cycle assessment. I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is taken into account. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
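
    As a hedged illustration of the probabilistic financial-risk idea described above (not the thesis's DSS, its imprecise-data techniques, or its data), the sketch below runs a simple Monte Carlo over uncertain yield, price, and operating cost for a hypothetical VPF business case and reports downside-risk metrics on the net present value. Every distribution and figure is an assumption.

```python
# Hedged sketch: Monte Carlo financial risk for a hypothetical vertical-farm case.
# All distributions, prices and the 10-year horizon are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_sims, years, discount = 10_000, 10, 0.08

capex = 2.0e6                                              # up-front investment (assumed)
yield_kg = rng.triangular(60_000, 80_000, 95_000, n_sims)  # annual crop yield per scenario
price = rng.normal(7.0, 1.0, n_sims)                       # selling price per kg
opex = rng.normal(4.5e5, 5.0e4, n_sims)                    # labour, energy and other annual costs

# Assume each scenario's annual cash flow is constant over the horizon.
annual_cash = yield_kg * price - opex
npv = -capex + sum(annual_cash / (1 + discount) ** t for t in range(1, years + 1))

print(f"median NPV:           {np.median(npv):,.0f}")
print(f"P(NPV < 0):           {np.mean(npv < 0):.1%}")     # simple downside-risk metric
print(f"5th-95th percentiles: {np.percentile(npv, 5):,.0f} to {np.percentile(npv, 95):,.0f}")
```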

    A direct-laser-written heart-on-a-chip platform for generation and stimulation of engineered heart tissues

    In this dissertation, we first develop a versatile microfluidic heart-on-a-chip model to generate 3D-engineered human cardiac microtissues in highly-controlled microenvironments. The platform, which is enabled by direct laser writing (DLW), has tailor-made attachment sites for cardiac microtissues and comes with integrated strain actuators and force sensors. Application of external pressure waves to the platform results in controllable time-dependent forces on the microtissues. Conversely, oscillatory forces generated by the microtissues are transduced into measurable electrical outputs. After characterization of the responsivity of the transducers, we demonstrate the capabilities of this platform by studying the response of cardiac microtissues to prescribed mechanical loading and pacing. Next, we tune the geometry and mechanical properties of the platform to enable parametric studies on engineered heart tissues. We explore two geometries: a rectangular seeding well with two attachment sites, and a stadium-like seeding well with six attachment sites. The attachment sites are placed symmetrically in the longitudinal direction. The former geometry promotes uniaxial contraction of the tissues; the latter additionally induces diagonal fiber alignment. We systematically increase the length for both configurations and observe a positive correlation between fiber alignment at the center of the microtissues and tissue length. However, progressive thinning and “necking” is also observed, leading to the failure of longer tissues over time. We use the DLW technique to improve the platform, softening the mechanical environment and optimizing the attachment sites for generation of stable microtissues at each length and geometry. Furthermore, electrical pacing is incorporated into the platform to evaluate the functional dynamics of stable microtissues over the entire range of physiological heart rates. Here, we typically observe a decrease in active force and contraction duration as a function of frequency. Lastly, we use a more traditional μTUG platform to demonstrate the effects of subthreshold electrical pacing on the rhythm of the spontaneously contracting cardiac microtissues. Here, we observe periodic M:N patterns, in which there are M cycles of stimulation for every N tissue contractions. Using electric field amplitude, pacing frequency, and homeostatic beating frequencies of the tissues, we provide an empirical map for predicting the emergence of these rhythms.
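
    To make the M:N notion concrete (a schematic only, not the dissertation's analysis), the snippet below counts synthetic stimulation and contraction events over a fixed window and reduces their counts to a lowest-terms M:N ratio; the rates and window length are arbitrary assumptions.

```python
# Schematic only: reduce counts of stimuli and tissue contractions in a window
# to a lowest-terms M:N ratio (synthetic event times, assumed rates).
from math import gcd
import numpy as np

stim_rate_hz, beat_rate_hz, window_s = 1.5, 1.0, 20.0      # assumed 3:2 relationship
stim_times = np.arange(0.0, window_s, 1.0 / stim_rate_hz)  # stimulation events
beat_times = np.arange(0.0, window_s, 1.0 / beat_rate_hz)  # observed contractions

m, n = len(stim_times), len(beat_times)
g = gcd(m, n)
print(f"{m} stimuli : {n} contractions  ->  {m // g}:{n // g} pattern")
```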