
    Continuous maintenance and the future – Foundations and technological challenges

    High-value, long-life products require continuous maintenance throughout their life cycle to achieve the required performance at optimum through-life cost. This paper presents the foundations and technologies required to offer such a maintenance service. Component- and system-level degradation science, assessment and modelling, along with life-cycle 'big data' analytics, are the two most important knowledge and skill bases required for continuous maintenance. Advanced computing and visualisation technologies will improve the efficiency of maintenance and reduce the through-life cost of the product. The future of continuous maintenance within the Industry 4.0 context also involves the role of IoT, standards and cyber security.
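    As a concrete illustration of the component-level degradation assessment the abstract refers to, the following is a minimal sketch, assuming a synthetic health indicator, an exponential degradation trend and an arbitrary failure threshold (none of these come from the paper).

```python
# Minimal sketch (assumptions only, not the paper's method): fit a simple
# exponential degradation trend to condition-monitoring data and project the
# time at which the health indicator crosses an assumed failure threshold.
import numpy as np
from scipy.optimize import curve_fit

def exp_degradation(t, a, b):
    """Exponential degradation trend: health indicator grows as a * exp(b * t)."""
    return a * np.exp(b * t)

# Hypothetical health-indicator history (e.g., vibration RMS) sampled daily.
rng = np.random.default_rng(0)
t = np.arange(0, 120)
y = 0.5 * np.exp(0.015 * t) + rng.normal(0, 0.05, t.size)

(a, b), _ = curve_fit(exp_degradation, t, y, p0=(0.5, 0.01))

failure_threshold = 5.0  # assumed threshold for the health indicator
t_failure = np.log(failure_threshold / a) / b
print(f"Projected time to threshold: {t_failure:.1f} days; "
      f"remaining useful life: {t_failure - t[-1]:.1f} days")
```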

    Data mining as a tool for environmental scientists

    Over recent years a huge library of data mining algorithms has been developed to tackle a variety of problems in fields such as medical imaging and network traffic analysis. Many of these techniques are far more flexible than more classical modelling approaches and could be usefully applied to data-rich environmental problems. Certain techniques, such as Artificial Neural Networks, Clustering, Case-Based Reasoning and, more recently, Bayesian Decision Networks, have found application in environmental modelling, while other methods, for example classification and association rule extraction, have not yet been taken up on any wide scale. We propose that these and other data mining techniques could be usefully applied to difficult problems in the field. This paper introduces several data mining concepts and briefly discusses their application to environmental modelling, where data may be sparse, incomplete, or heterogeneous.
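    The sketch below is our own illustration, not an example from the paper: a small classification tree trained on invented environmental observations, with its decision rules printed, since classification and rule extraction are among the techniques the authors note are not yet widely used in environmental modelling.

```python
# Minimal sketch (hypothetical data and variable names): a classification tree
# on synthetic environmental observations, with the learned rules exported as
# text to show the kind of interpretable output rule-based mining can give.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 300
water_temp = rng.uniform(5, 30, n)      # degrees C
nitrate = rng.uniform(0.1, 8.0, n)      # mg/L
# Synthetic label: bloom risk when the water is warm and nutrient-rich.
bloom = ((water_temp > 20) & (nitrate > 4.0)).astype(int)

X = np.column_stack([water_temp, nitrate])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, bloom)
print(export_text(clf, feature_names=["water_temp", "nitrate"]))
```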

    Dynamic Resource Allocation For Coordination Of Inpatient Operations In Hospitals

    Healthcare systems face difficult challenges such as the increasing complexity of processes, inefficient utilization of resources, high pressure to enhance the quality of care and services, and the need to balance and coordinate staff workload. The need for effective and efficient processes for delivering healthcare services therefore increases. Data-driven approaches, including operations research and predictive modeling, can help overcome these challenges and improve the performance of health systems in terms of quality, cost, patient health outcomes and satisfaction. Hospitals are a key component of healthcare systems, with many scarce resources such as caregivers (nurses, physicians) and expensive facilities/equipment. Most hospital systems in the developed world have adopted some form of Electronic Health Record (EHR) system in recent years to improve information flow and health outcomes and to reduce costs. While EHR systems form a critical data backbone, there is a need for platforms that allow coordinated orchestration of relatively complex healthcare operations. Information available in EHR systems can play a significant role in providing better operational coordination between different departments/services in the hospital through optimized task/resource allocation. In this research, we propose a dynamic real-time coordination framework for resource and task assignment to improve patient flow and resource utilization across the emergency department (ED) and inpatient unit (IU) network within hospitals. The scope of patient flow coordination includes the ED, IUs, environmental services responsible for room/bed cleaning and turnaround, and patient transport services. EDs across the U.S. routinely suffer from extended patient waiting times during admission from the ED to the hospital's inpatient units, also known as ED patient 'boarding'. ED patient boarding not only compromises patient health outcomes but also blocks access to ED care for new patients because of increased bed occupancy; it also carries significant cost implications and increases stress and hazards for staff. We carry out this research with the goal of enabling two modes of coordination across the ED-to-IU network to reduce ED patient boarding: reactive and proactive. The proposed 'reactive' coordination approach is relatively easy to implement in the presence of modern EHR and hospital IT management systems because it relies only on real-time information readily available in most hospitals. This approach focuses on managing the flow of patients who are at the end of their ED care and being admitted to specific inpatient units. We developed a deterministic dynamic real-time coordination model for resource and task assignment across the ED-to-IU network using mixed-integer programming (MIP). The proposed 'proactive' coordination approach relies on the power of predictive analytics to anticipate ED patient admissions into the hospital while patients are still undergoing ED care. The proactive approach potentially allows additional lead time for coordinating downstream resources; however, it requires the ability to accurately predict ED patient admission, the target IU for admission, and the remaining length of stay (care) within the ED. Numerous other studies have demonstrated that modern EHR systems combined with advances in data mining and machine learning methods can indeed facilitate such predictions with reasonable accuracy.
    The proposed proactive coordination optimization model extends the reactive deterministic MIP model to account for uncertainties associated with ED patient admission predictions, leading to an effective and efficient proactive stochastic MIP model. Both the reactive and proactive coordination methods have been developed to account for numerous real-world operational requirements (e.g., a rolling planning horizon, event-based optimization and task assignments, schedule stability management, patient overflow management, gender-matching requirements for IU rooms with double occupancy, patient isolation requirements, equity in staff utilization and equity in reducing ED patient waiting times) and for computational efficiency (e.g., through model decomposition and efficient construction of scenarios for proactive coordination). We demonstrate the effectiveness of the proposed models using data from a leading healthcare facility in SE Michigan, U.S. Results suggest that even the highly practical optimization-enabled reactive coordination can lead to a dramatic reduction in ED patient boarding times. Results also suggest that significant additional reductions in patient boarding are possible through the proposed proactive approach in the presence of reliable analytics models for predicting ED patient admissions and remaining ED length of stay. Future research can focus on further extending the scope of coordination to include admissions management (including any necessary approvals from insurance), coordination needs for admissions that originate outside the ED (e.g., elective surgeries), and ambulance diversions to manage patient flows across regional and hospital networks.
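    To make the reactive coordination idea concrete, here is a toy assignment MIP in PuLP. It is a minimal sketch under invented data and is far simpler than the dissertation's model, which adds rolling horizons, cleaning/transport tasks, stochastic scenarios and the other requirements listed above; patient and bed names and costs are placeholders.

```python
# Minimal sketch (assumptions, not the authors' model): match boarding ED
# patients to compatible, available IU beds while minimising total expected
# boarding time. Incompatible patient-bed pairings are simply omitted.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum

patients = ["p1", "p2", "p3"]          # ED patients awaiting admission
beds = ["icu_1", "med_1", "med_2"]     # clean, available IU beds
# Hypothetical boarding cost (expected extra wait, minutes) per feasible pairing.
cost = {("p1", "icu_1"): 30, ("p1", "med_1"): 90,
        ("p2", "med_1"): 40, ("p2", "med_2"): 45,
        ("p3", "med_2"): 20, ("p3", "icu_1"): 120}

model = LpProblem("reactive_bed_assignment", LpMinimize)
x = {pb: LpVariable(f"x_{pb[0]}_{pb[1]}", cat=LpBinary) for pb in cost}

model += lpSum(cost[pb] * x[pb] for pb in cost)           # total boarding cost
for p in patients:                                        # each patient gets exactly one bed
    model += lpSum(x[pb] for pb in cost if pb[0] == p) == 1
for b in beds:                                            # each bed serves at most one patient
    model += lpSum(x[pb] for pb in cost if pb[1] == b) <= 1

model.solve()
print({pb: int(x[pb].value()) for pb in cost if x[pb].value() > 0.5})
```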

    The Technological Emergence of AutoML: A Survey of Performant Software and Applications in the Context of Industry

    As with most technical fields, there exists a delay between fundamental academic research and practical industrial uptake. Whilst some sciences have robust and well-established processes for commercialisation, such as the pharmaceutical practice of regimented drug trials, other fields face transitory periods in which fundamental academic advancements diffuse gradually into the space of commerce and industry. For the still relatively young field of Automated/Autonomous Machine Learning (AutoML/AutonoML), that transitory period is under way, spurred on by a burgeoning interest from broader society. Yet, to date, little research has been undertaken to assess the current state of this dissemination and its uptake. Thus, this review makes two primary contributions to knowledge around this topic. Firstly, it provides the most up-to-date and comprehensive survey of existing AutoML tools, both open-source and commercial. Secondly, it motivates and outlines a framework for assessing whether an AutoML solution designed for real-world application is 'performant'; this framework extends beyond the limitations of typical academic criteria, considering a variety of stakeholder needs and the human-computer interactions required to service them. Thus, additionally supported by an extensive assessment and comparison of academic and commercial case studies, this review evaluates mainstream engagement with AutoML in the early 2020s, identifying obstacles and opportunities for accelerating future uptake.
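    For readers unfamiliar with how such tools are engaged in practice, the sketch below uses FLAML as one example of the open-source AutoML tools this kind of survey catalogues; it is our own illustration, not drawn from the review, and the dataset and time budget are arbitrary.

```python
# Minimal sketch (illustrative only): a typical AutoML workflow where a tool
# searches models and hyperparameters within a fixed time budget.
from flaml import AutoML
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

automl = AutoML()
automl.fit(X_train=X_train, y_train=y_train,
           task="classification", time_budget=30)   # seconds of search

pred = automl.predict(X_test)
print(automl.best_estimator, accuracy_score(y_test, pred))
```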

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge depends heavily on data quality. Unfortunately, real data tends to contain noise, uncertainty, errors, redundancies or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. In this paper some details are provided about how this can be achieved, and the role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems is discussed.
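    The following is a minimal sketch of the kind of pre-processing the paper discusses, applied to an invented environmental table: gross-outlier screening, imputation and scaling before any mining step. The data, thresholds and column names are assumptions, not taken from the paper.

```python
# Minimal sketch (hypothetical data): typical KDD pre-processing for a noisy
# environmental table - robust outlier screening, imputation and scaling.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

raw = pd.DataFrame({
    "river_flow":  [12.1, np.nan, 11.8, 250.0, 12.4, 11.9],   # 250.0 is a sensor spike
    "nitrate_mgl": [1.2, 1.1, np.nan, 1.3, 1.2, 40.0],        # 40.0 is implausible
})

# 1. Screen gross outliers with a robust z-score and mark them as missing.
median = raw.median()
mad = (raw - median).abs().median()
robust_z = (raw - median) / (1.4826 * mad)
clean = raw.mask(robust_z.abs() > 3.5)

# 2. Impute the remaining gaps and standardise for downstream mining.
imputed = pd.DataFrame(SimpleImputer(strategy="median").fit_transform(clean),
                       columns=clean.columns)
scaled = pd.DataFrame(StandardScaler().fit_transform(imputed),
                      columns=imputed.columns)
print(scaled.round(2))
```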

    Edge Impulse: An MLOps Platform for Tiny Machine Learning

    Edge Impulse is a cloud-based machine learning operations (MLOps) platform for developing embedded and edge ML (TinyML) systems that can be deployed to a wide range of hardware targets. Current TinyML workflows are plagued by fragmented software stacks and heterogeneous deployment hardware, making ML model optimizations difficult and unportable. We present Edge Impulse, a practical MLOps platform for developing TinyML systems at scale. Edge Impulse addresses these challenges and streamlines the TinyML design cycle by supporting various software and hardware optimizations to create an extensible and portable software stack for a multitude of embedded systems. As of Oct. 2022, Edge Impulse hosts 118,185 projects from 50,953 developers.
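    As background on the kind of model optimization such a TinyML pipeline performs, here is a generic post-training int8 quantization sketch using the TensorFlow Lite converter. This is not Edge Impulse's own API; the model, shapes and calibration data are placeholders of our own.

```python
# Minimal, generic sketch (not Edge Impulse's API): post-training int8
# quantization with the TensorFlow Lite converter, the kind of optimization a
# TinyML pipeline applies before deploying to a microcontroller target.
import numpy as np
import tensorflow as tf

# A tiny placeholder model standing in for a trained TinyML classifier.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

def representative_data():
    # Calibration samples the converter uses to pick quantization ranges.
    for _ in range(100):
        yield [np.random.rand(1, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Quantized model size: {len(tflite_model)} bytes")
```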

    Optimization of Fluid Bed Dryer Energy Consumption for Pharmaceutical Drug Processes through Machine Learning and Cloud Computing Technologies

    High energy costs, the constant regulatory measures applied by administrations to keep healthcare costs low, and the changes in healthcare regulations introduced in recent years have all significantly impacted the pharmaceutical and healthcare industry. The Industry 4.0 paradigm encompasses changes in the traditional production model of the pharmaceutical industry through the inclusion of technologies beyond traditional automation. The primary goal is to achieve more cost-efficient drugs through the optimal incorporation of technologies such as advanced analytics. The manufacturing process of the pharmaceutical industry has different stages (mixing, drying, compacting, coating, packaging, etc.), and one of the most energy-expensive stages is the drying process. This process aims to extract the liquid content, such as water, by injecting warm, dry air into the system. The drying time is usually predetermined and depends on the volume and type of pharmaceutical product units to be dehydrated. The preheating phase, on the other hand, can vary depending on parameters such as the operator's experience. It is therefore safe to assume that optimization of this process through advanced analytics is possible and can have a significant cost-reducing effect on the whole manufacturing process. Because of the high cost of the machinery involved in the drug production process, it is common practice in the pharmaceutical industry to maximize the useful life of these machines, which are not equipped with the latest sensors. Thus, a machine learning model using advanced analytics platforms, such as cloud computing, can be implemented to analyze potential energy consumption savings. This thesis focuses on improving the energy consumption of the preheating process of a fluid bed dryer by defining and implementing an IIoT (Industrial Internet of Things)-cloud computing platform. This architecture hosts and runs a machine learning algorithm based on CatBoost modeling to predict the optimum time to stop the process, reducing its duration and, consequently, its energy consumption. Experimental results show that it is possible to shorten the preheating process by 45% of its duration and consequently reduce energy consumption by up to 2.8 MWh per year.
    Barriga Rodríguez, R. (2023). Optimization of Fluid Bed Dryer Energy Consumption for Pharmaceutical Drug Processes through Machine Learning and Cloud Computing Technologies [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/19584
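    The sketch below illustrates the modelling step only: a CatBoost regressor trained on invented dryer sensor features to predict remaining preheating time. The feature names, synthetic target and hyperparameters are our assumptions, not the thesis's actual model or data.

```python
# Minimal sketch (synthetic data): a CatBoost regressor predicting remaining
# preheating minutes from dryer sensor readings, the kind of model an
# IIoT-cloud platform could host to decide when to stop the preheating phase.
import numpy as np
import pandas as pd
from catboost import CatBoostRegressor

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "inlet_air_temp_c": rng.uniform(40, 80, n),
    "exhaust_temp_c":   rng.uniform(25, 60, n),
    "product_temp_c":   rng.uniform(20, 55, n),
    "airflow_m3h":      rng.uniform(800, 1500, n),
})
# Synthetic target: remaining preheating time shrinks as product temperature rises.
y = np.clip(60 - 0.9 * X["product_temp_c"] + rng.normal(0, 3, n), 0, None)

model = CatBoostRegressor(iterations=300, depth=6, learning_rate=0.1,
                          loss_function="RMSE", verbose=False)
model.fit(X, y)
print(model.predict(X.iloc[:3]))
```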

    Prognostics and health management for maintenance practitioners - Review, implementation and tools evaluation.

    In the literature, prognostics and health management (PHM) systems have been studied by researchers from many different engineering fields to increase system reliability, availability and safety and to reduce the maintenance cost of engineering assets. Much of the work in PHM research concentrates on designing robust and accurate models to assess the health state of components for particular applications to support decision making. Models that involve mathematical interpretations, assumptions and approximations make PHM hard to understand and implement in real-world applications, especially for maintenance practitioners in industry. Prior knowledge for implementing PHM in complex systems is crucial to building highly reliable systems. To fill this gap and motivate industry practitioners, this paper provides a comprehensive review of the PHM domain and discusses important issues on uncertainty quantification and implementation aspects, alongside prognostics feature and tool evaluation. The PHM implementation steps consist of: (1) critical component analysis; (2) appropriate sensor selection for condition monitoring (CM); (3) prognostics feature evaluation under data analysis; and (4) prognostics methodology and tool evaluation matrices derived from the PHM literature. Besides PHM implementation aspects, the paper also reviews previous and ongoing research on high-speed train bogies to highlight problems faced in the train industry and emphasize the significance of PHM for further investigations.
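    As a small, self-contained example of the prognostics feature evaluation step, the sketch below computes a monotonicity score, one criterion commonly used to rank condition-monitoring features; it is our illustration under synthetic data, not the paper's tooling.

```python
# Minimal sketch (synthetic signals): a monotonicity score for ranking
# prognostic features - features that trend steadily with degradation score
# close to 1, noisy features score close to 0.
import numpy as np

def monotonicity(feature: np.ndarray) -> float:
    """|#positive increments - #negative increments| / (n - 1), in [0, 1]."""
    diffs = np.diff(feature)
    n = len(diffs)
    return abs(np.sum(diffs > 0) - np.sum(diffs < 0)) / n if n else 0.0

# Hypothetical condition-monitoring features over a bearing's life.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
rms = 0.2 + t**2 + rng.normal(0, 0.01, t.size)       # degrades steadily
kurtosis = rng.normal(3.0, 0.3, t.size)              # mostly noise

print(f"RMS monotonicity:      {monotonicity(rms):.2f}")
print(f"Kurtosis monotonicity: {monotonicity(kurtosis):.2f}")
```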

    Software development to estimate the Pavement Interaction effects on vehicle fuel consumption
