    Exponential Smoothing Time Series Forecasting Model for the Average Mechanical Availability of Caterpillar Cat 777D Off-Highway Trucks

    Full text link
    A low availability value is driven by large total downtime losses. Total downtime losses are caused by frequent breakdowns as well as long machine idle and machine setup times. The availability value varies over time and can be accumulated as a monthly average over an operating period, and from these data the availability level can be forecast, one suitable method being Exponential Smoothing. Based on a review of journal references and of the literature supporting this research, the state of the art of the underlying problem is the application of the exponential smoothing method in a broader scope: in prior studies the model has mostly been applied in industrial settings, and no work has yet applied it to build an exponential smoothing forecasting model of availability for heavy equipment, in particular equipment operated in the mining sector. Given how widely exponential smoothing is used in forecasting, this research applies the method to the mechanical availability of coal-mine truck units and builds a regression model for those units, so that it can serve as a reference regression model for the mechanical availability of mine trucks. From the discussion it can be concluded that the model built is adequate for forecasting the MA value of the OHT 777D three years ahead. 36.0% of the future Mechanical Availability value is explained by the pattern of past and present Mechanical Availability values, while the remainder (64.0%) is influenced by other factors. The average accuracy of the model in predicting the MA value of the OHT 777D is 81.564%.
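
    The forecasting step itself is compact. As a minimal illustrative sketch (the monthly values, the smoothing constant, and the accuracy measure below are assumptions for illustration, not the study's data or parameters; 100% minus MAPE is one common accuracy definition, which may or may not be the one used here), simple exponential smoothing applied to a monthly mechanical-availability series looks like this:

        # Simple exponential smoothing: s_t = alpha*y_t + (1-alpha)*s_{t-1}.
        # Data and alpha are illustrative, not the values from the study.
        import numpy as np

        def exp_smooth(y, alpha):
            s = np.empty(len(y))
            s[0] = y[0]                      # initialise with first observation
            for t in range(1, len(y)):
                s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
            return s

        # Hypothetical monthly MA values (%) for one truck unit.
        ma = np.array([82.1, 79.5, 84.0, 80.2, 78.9, 83.3, 81.7, 80.8])
        s = exp_smooth(ma, alpha=0.3)
        print(f"next-month forecast: {s[-1]:.2f}%")   # flat forecast ahead
        # Mean absolute percentage error of one-step-ahead forecasts:
        mape = np.mean(np.abs((ma[1:] - s[:-1]) / ma[1:])) * 100
        print(f"in-sample MAPE: {mape:.2f}%  (accuracy ~ {100 - mape:.2f}%)")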

    Sensing and Signal Processing in Smart Healthcare

    Get PDF
    In the last decade, we have witnessed the rapid development of electronic technologies that are transforming our daily lives. Such technologies are often integrated with various sensors that facilitate the collection of human motion and physiological data and are equipped with wireless communication modules such as Bluetooth, radio frequency identification, and near-field communication. In smart healthcare applications, designing ergonomic and intuitive human–computer interfaces is crucial because a system that is not easy to use creates a huge obstacle to adoption and may significantly reduce the efficacy of the solution. Signal and data processing is another important consideration in smart healthcare applications because it must ensure high accuracy with a high level of confidence in order for the applications to be useful to clinicians making diagnosis and treatment decisions. This Special Issue is a collection of 10 articles selected from a total of 26 contributions. These contributions span the areas of signal processing and smart healthcare systems and were mostly contributed by authors from Europe, including Italy, Spain, France, Portugal, Romania, Sweden, and the Netherlands. Authors from China, Korea, Taiwan, Indonesia, and Ecuador are also included.

    Imaging photoplethysmography: towards effective physiological measurements

    Get PDF
    Since its conception decades ago, photoplethysmography (PPG), the non-invasive opto-electronic technique that measures arterial pulsations in vivo, has proven its worth by achieving and maintaining its rank as a compulsory standard of patient monitoring. However successful, the conventional contact monitoring mode is not suitable in certain clinical and biomedical situations, e.g., in the case of skin damage, or when unconstrained movement is required. With the advance of computer and photonics technologies, there has been a resurgence of interest in PPG, and one potential route to overcome the abovementioned issues has been increasingly explored: imaging photoplethysmography (iPPG). The emerging field of iPPG offers nascent opportunities for effective and comprehensive interpretation of physiological phenomena, indicating a promising alternative to conventional PPG. Heart and respiration rate, perfusion mapping, and pulse rate variability have all been accessed using iPPG. To effectively and remotely access physiological information through this emerging technique, a number of key issues remain to be addressed. The engineering issues of iPPG, particularly the influence of motion artefacts on signal quality, are addressed in this thesis, where an engineering model based on the revised Beer-Lambert law was developed and used to describe opto-physiological phenomena relevant to iPPG. An iPPG setup consisting of both hardware and software elements was developed to investigate its reliability and reproducibility in the context of effective remote physiological assessment. Specifically, a first study was conducted to acquire vital physiological signs under various exercise conditions, i.e., resting, light, and heavy cardiovascular exercise, in ten healthy subjects. The physiological parameters derived from the images captured by the iPPG system exhibited functional characteristics comparable to conventional contact PPG: the maximum heart rate difference was <3 bpm, and a significant (p < 0.05) correlation between the two measurements was also revealed. Using a method for attenuation of motion artefacts, heart rate and respiration rate information was successfully assessed from different anatomical locations even in high-intensity physical exercise. This study thereby opens a new avenue for non-contact sensing of vital signs and remote physiological assessment, with clear and promising applications in clinical triage and sports training. A second study was conducted to remotely assess pulse rate variability (PRV), which is considered a valuable indicator of autonomic nervous system (ANS) status. The PRV information was obtained using the iPPG setup to appraise the ANS in ten normal subjects. The performance of the iPPG system in accessing PRV was evaluated via comparison with the readings from a contact PPG sensor. Strong correlation and good agreement between these two techniques verify the effectiveness of iPPG in the remote monitoring of PRV, thereby promoting iPPG as a potential alternative for interpreting physiological dynamics related to the ANS. The outcomes presented in the thesis point towards a robust non-contact technique for cardiovascular monitoring and evaluation.
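
    The core signal path from frames to heart rate can be illustrated with a minimal sketch. The frame rate, the synthetic trace, and the pass band below are assumptions (a real pipeline would start from the spatially averaged pixel intensity of each video frame), not the thesis setup:

        # Estimate pulse rate from an iPPG-style intensity trace:
        # band-pass filter, then take the dominant spectral peak.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 30.0                              # assumed camera frame rate (Hz)
        t = np.arange(0, 30, 1 / fs)
        rng = np.random.default_rng(0)
        # Synthetic stand-in for the mean ROI intensity: a 1.2 Hz pulse
        # (72 bpm) plus slow baseline drift and sensor noise.
        x = (0.02 * np.sin(2 * np.pi * 1.2 * t)
             + 0.5 * np.sin(2 * np.pi * 0.05 * t)
             + 0.01 * rng.normal(size=t.size))

        # Band-pass 0.7-4.0 Hz (~42-240 bpm) to isolate the cardiac band.
        b, a = butter(3, [0.7 / (fs / 2), 4.0 / (fs / 2)], btype="band")
        pulse = filtfilt(b, a, x)

        freqs = np.fft.rfftfreq(pulse.size, 1 / fs)
        spec = np.abs(np.fft.rfft(pulse))
        band = (freqs >= 0.7) & (freqs <= 4.0)
        hr_bpm = 60 * freqs[band][np.argmax(spec[band])]
        print(f"estimated heart rate: {hr_bpm:.1f} bpm")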

    Decision support system for cardiovascular problems

    Get PDF
    The DISHEART project aims at developing a new computer-based decision support system (DSS) integrating medical image data, modelling, simulation, computational Grid technologies, and artificial intelligence methods for assisting clinical diagnosis and intervention in cardiovascular problems. The RTD goal is to improve and link existing state-of-the-art technologies in order to build a computerised cardiovascular model for the analysis of the heart and blood vessels. The resulting DISHEART DSS interfaces computational biomechanical analysis tools with the information coming from multimodal medical images. The computational model is coupled to an artificial neural network (ANN) based decision model that can be educated for each particular patient with data coming from his/her images and/or analyses. The DISHEART DSS is validated in trials of clinical diagnosis, surgical intervention, and subject-specific design of medical devices in the cardiovascular domain. It also contributes to a better understanding of cardiovascular morphology and function as inferred from routine imaging examinations. Four reputable medical centers in Europe took an active role in the validation and dissemination of the DISHEART DSS as well as in the elaboration of computational material and medical images. The integrated DISHEART DSS supports health professionals in promptly taking the best possible decisions for prevention, diagnosis, and treatment. Emphasis was put on the development of user-friendly, fast, and reliable tools and interfaces providing access to heterogeneous health information sources, as well as on new methods for decision support and risk analysis. The use of Grid computing technology is essential in order to optimise and distribute the heavy computational work required for physical modelling and numerical simulations, and especially for the parametric analysis required to educate the DSS for every particular application. The four end-user SMEs participating in the project benefit from the new DISHEART DSS. The companies COMPASS, QUANTECH, and Heartcore will market the DSS among public and private organizations related to the cardiovascular field. EndoArt will exploit the DISHEART DSS as a support for enhanced design and production of clinical devices. The partnership was assembled to gather the maximum complementarity of skills for the successful development of the DISHEART DSS, requiring experts in mechanical sciences, medical sciences, informatics, and FEM techniques to carry out the tests.
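
    The idea of "educating" an ANN decision model on parametric simulation results can be illustrated with a minimal sketch: a tiny feed-forward network fitted to hypothetical (simulation feature, risk label) pairs. The features, labels, and network size are all assumptions for illustration; this is not the DISHEART network.

        # Tiny one-hidden-layer ANN trained by gradient descent on
        # hypothetical outputs of parametric biomechanical analyses.
        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical features per simulated case: [peak wall stress, mean flow].
        X = rng.normal(size=(200, 2))
        # Hypothetical label: elevated risk when a weighted sum is large.
        y = (X @ np.array([1.5, -1.0]) + 0.3 * rng.normal(size=200) > 0).astype(float)

        W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
        W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
        sigmoid = lambda z: 1 / (1 + np.exp(-z))

        for _ in range(2000):
            h = np.tanh(X @ W1 + b1)             # hidden layer
            p = sigmoid(h @ W2 + b2).ravel()     # predicted risk probability
            d2 = (p - y)[:, None] / len(y)       # cross-entropy output gradient
            d1 = (d2 @ W2.T) * (1 - h ** 2)      # backprop through tanh
            for param, grad in ((W1, X.T @ d1), (b1, d1.sum(0)),
                                (W2, h.T @ d2), (b2, d2.sum(0))):
                param -= 1.0 * grad              # plain gradient descent step
        print(f"training accuracy: {np.mean((p > 0.5) == y):.2f}")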

    Fast Objective Coupled Planar Illumination Microscopy

    Get PDF
    Among optical imaging techniques, light sheet fluorescence microscopy stands out as one of the most attractive for capturing high-speed biological dynamics unfolding in three dimensions. The technique is potentially millions of times faster than point-scanning techniques such as two-photon microscopy. This potential is especially relevant for neuroscience applications, because interactions between neurons transpire over mere milliseconds within tissue volumes spanning hundreds of cubic microns. However, current-generation light sheet microscopes are limited by volume scanning rate and/or camera frame rate. We begin by reviewing the optical principles underlying light sheet fluorescence microscopy and the origin of these rate bottlenecks. We present an analysis leading us to the conclusion that Objective Coupled Planar Illumination (OCPI) microscopy is a particularly promising technique for recording the activity of large populations of neurons at high sampling rate. We then present speed-optimized OCPI microscopy, the first fast light sheet technique that avoids compromising image quality or photon efficiency. We pursue two strategies to develop the fast OCPI microscope. First, we devise a set of optimizations that increase the rate of the volume scanning system to 40 Hz for volumes up to 700 microns thick. Second, we introduce Multi-Camera Image Sharing (MCIS), a technique that scales imaging rate by incorporating additional cameras. MCIS can be applied not only to OCPI but to any widefield imaging technique, circumventing the limitations imposed by a single camera. Detailed design drawings are included to aid dissemination to other research groups. We also demonstrate fast calcium imaging of the larval zebrafish brain and find a heartbeat-induced motion artifact. We recommend a new preprocessing step that removes the artifact through filtering; this step requires a minimal sampling rate of 15 Hz, and we expect it to become a standard procedure in zebrafish imaging pipelines. In the last chapter, we describe essential computational considerations for controlling a fast OCPI microscope and processing the data it generates. We introduce a new image processing pipeline developed to maximize computational efficiency when analyzing these multi-terabyte datasets, including a novel calcium imaging deconvolution algorithm. Finally, we demonstrate how combined innovations in microscope hardware and software enable inference of predictive relationships between neurons, a promising complement to more conventional correlation-based analyses.
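
    The abstract does not specify the filter design; as a minimal sketch of one plausible form of the recommended preprocessing step, assuming the heartbeat artifact occupies a known band (2-4 Hz here, an assumption) and a volume rate that satisfies the stated 15 Hz minimum:

        # Suppress a heartbeat-band artifact in a calcium trace with a
        # band-stop filter. Band, rate, and the synthetic trace are
        # illustrative assumptions, not the thesis parameters.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 20.0                                 # assumed volume rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(0)
        # Synthetic slow calcium transients plus a ~3 Hz heartbeat artifact.
        kernel = np.exp(-np.arange(40) / 10.0)
        calcium = np.convolve(rng.poisson(0.02, t.size), kernel)[: t.size]
        trace = (calcium + 0.3 * np.sin(2 * np.pi * 3.0 * t)
                 + 0.05 * rng.normal(size=t.size))

        b, a = butter(3, [2.0 / (fs / 2), 4.0 / (fs / 2)], btype="bandstop")
        cleaned = filtfilt(b, a, trace)

        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        band = (freqs >= 2.0) & (freqs <= 4.0)
        power = lambda v: np.sum(np.abs(np.fft.rfft(v))[band] ** 2)
        print(f"2-4 Hz power: before={power(trace):.1f}, after={power(cleaned):.1f}")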

    Positioning of a wireless relay node for useful cooperative communication

    Get PDF
    Given the exorbitant amount of data transmitted and the increasing demand for data connectivity in the 21st century, it has become imperative to search for proactive and sustainable solutions that effectively alleviate the overwhelming burden imposed on wireless networks. In this study, a Decode-and-Forward cooperative relay channel is analyzed, with Maximal Ratio Combining employed at the destination node as the method of diversity combining. The system framework is based on a three-node relay channel with a source node, a relay node, and a destination node. A model of the wireless communications channel is formulated so that simulations can investigate the performance impact of relaying for a node placed at the edge of the cell. Firstly, an AWGN channel is used before the effect of Rayleigh fading is taken into consideration. Results show that the performance of cooperative relaying is always superior or similar to that of conventional relaying. Additionally, relaying is beneficial when the relay is placed closer to the receiver.
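
    A minimal Monte Carlo sketch of the Decode-and-Forward/MRC setup follows. It assumes BPSK over Rayleigh fading, an illustrative path-loss geometry, and the simplification that the relay always forwards its hard decision; these are assumptions for illustration, not the study's exact model.

        # BER of DF relaying with MRC at the destination versus relay
        # position on the unit-length source-destination line.
        import numpy as np

        rng = np.random.default_rng(0)

        def ber_df_mrc(d_sr, snr_db, n=200_000, alpha=3.0):
            d_rd = 1.0 - d_sr
            snr = 10 ** (snr_db / 10)
            bits = rng.integers(0, 2, n)
            x = 2.0 * bits - 1.0                          # BPSK symbols
            # Average link gains follow a distance^-alpha path-loss law.
            fade = lambda g: rng.rayleigh(np.sqrt(g / 2), n)
            noise = lambda: rng.normal(0, np.sqrt(1 / (2 * snr)), n)
            h_sd, h_sr, h_rd = fade(1.0), fade(d_sr ** -alpha), fade(d_rd ** -alpha)
            y_sd = h_sd * x + noise()                     # direct link
            x_rly = np.sign(h_sr * x + noise())           # relay hard decision (DF)
            y_rd = h_rd * x_rly + noise()                 # relay -> destination
            z = h_sd * y_sd + h_rd * y_rd                 # MRC (equal noise power)
            return np.mean((z > 0) != (bits == 1))

        for d in (0.2, 0.5, 0.8):
            print(f"relay at d={d:.1f}: BER ~ {ber_df_mrc(d, snr_db=10):.4f}")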