Tool life prediction and management for an integrated tool selection system
In machining, it is often difficult to select appropriate tools (tool holder and insert), machining parameters (cutting speed, feed rate and depth of cut) and tool replacement times for all tools, due to the wide variety of tooling options and the complexity of machining operations. Of particular interest are the complex interrelationships between tool selection, cutting data calculation and tool life prediction and control. Numerous techniques and methods of measuring and modelling tool wear, particularly in turning operations, were reviewed. The characteristics of these methods were analysed and it was found that most tool wear studies were self-contained, without any obvious interface with tool selection. The work presented herein deals with the development of an integrated, off-line tool life control system (TLC). The TLC system predicts tool life for the various turning operations and for a wide variety of workpiece materials. TLC is a closed-loop system combining algorithms with feedback based on direct measurement of flank wear. TLC has been developed using Crystal, a rule-based shell, together with statistical techniques such as multiple regression and the least-squares method. TLC consists of five modules, namely the technical planning of the cutting operation (TPO), tool life prediction (TLP), tool life assessor (TLA), tool life management (TLM) and tool wear balancing and requirement planning (TRP). The TPO module contains a procedure to select tools and generate efficient machining parameters (cutting velocity, feed rate and depth of cut) for turning and boring operations. For any selected insert grade, material sub-class, type of cut (finishing, medium-roughing and roughing) and type of cutting fluid, the TLP module calculates a theoretical tool life value (T_sugg) based on tool life coefficients derived from tool manufacturers' data.
For the selected operation, the tool life assessor (TLA) runs a dynamic multiple regression to calculate the approved tool life constants (ln C, 1/α, 1/β) based on real tool life data collected from experiments. These approved constants are used to calculate a modified tool life value (T_mod) for the given operation. The stochastic nature of tool life, as well as the uncertainty of the available information, is taken into account by introducing a 95% confidence level for tool life. The tool life management (TLM) module studies the variations between the tool life data predicted by TLP and TLA and the approved tool life data collected from the shop floor, and provides feedback concerning the accuracy of the tool life predictions. Finally, the tool wear balancing and requirement planning (TRP) methods address the problem of controlling and balancing the wear rate of each cutting edge by appropriate alteration of cutting conditions, so that each one machines the number of parts that optimizes the overall tool changing strategy. Two new tool changing strategies were developed based on minimum production cost, with very encouraging results. Cutting experiments proved that the state of wear and the tool life can be predicted efficiently by the proposed model. The resulting software can be used by machine manufacturers, tool consultants or process planners to achieve integrated planning and control of tool life as part of the tool selection and cutting data calculation activity.
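The regression step attributed to the TLA module corresponds to fitting an extended Taylor tool-life equation. A minimal sketch of such a fit, assuming the common form ln T = ln C - (1/α) ln v - (1/β) ln f and hypothetical cutting data (the paper's actual coefficient sets and software are not reproduced here):

```python
import numpy as np

def fit_tool_life(v, f, T):
    """Least-squares fit of ln T = ln C - (1/alpha) ln v - (1/beta) ln f.

    v: cutting speeds, f: feed rates, T: measured tool lives (arrays).
    Returns (ln C, 1/alpha, 1/beta), mirroring the constants the TLA
    module is described as approving.  Illustrative only.
    """
    # Design matrix: intercept, ln v, ln f
    X = np.column_stack([np.ones_like(v), np.log(v), np.log(f)])
    coef, *_ = np.linalg.lstsq(X, np.log(T), rcond=None)
    # Signs flipped because the fitted slopes are -(1/alpha), -(1/beta)
    return coef[0], -coef[1], -coef[2]
```

With noise-free experimental data the fit recovers the generating constants exactly; with real shop-floor data the residuals would feed the 95% confidence interval the abstract mentions.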
Predicting Intermediate Storage Performance for Workflow Applications
Configuring a storage system to better serve an application is a challenging
task complicated by a multidimensional, discrete configuration space and the
high cost of space exploration (e.g., by running the application with different
storage configurations). To enable selecting the best configuration in a
reasonable time, we design an end-to-end performance prediction mechanism that
estimates the turn-around time of an application using a storage system under a
given configuration. This approach focuses on a generic object-based storage
system design, supports exploring the impact of optimizations targeting
workflow applications (e.g., various data placement schemes) in addition to
other, more traditional, configuration knobs (e.g., stripe size or replication
level), and models the system operation at data-chunk and control message
level.
This paper presents our experience to date with designing and using this
prediction mechanism. We evaluate this mechanism using micro- as well as
synthetic benchmarks mimicking real workflow applications, and a real
application. A preliminary evaluation shows that we are on a good track to
meet our objectives: it can scale to model a workflow application run on an
entire cluster while offering an over 200x speedup factor (normalized by
resource) compared to running the actual application, and can achieve, in the
limited number of scenarios we study, a prediction accuracy that enables
identifying the best storage system configuration
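The selection loop the abstract describes (rank configurations by predicted turnaround time rather than by real runs) can be sketched as follows, with a purely illustrative stand-in cost model in place of the paper's prediction mechanism:

```python
from itertools import product

def predict_turnaround(stripe_size_mb, replication):
    # Toy stand-in cost model: larger stripes amortize per-chunk overhead,
    # extra replicas add write cost.  Numbers are illustrative only.
    return 100.0 / stripe_size_mb + 5.0 * replication

def best_config(stripe_sizes, replication_levels):
    # Exhaustive sweep of the discrete configuration space, ranked by
    # predicted (not measured) turnaround time.
    return min(product(stripe_sizes, replication_levels),
               key=lambda c: predict_turnaround(*c))
```

The point of the paper's mechanism is that each call to the predictor is orders of magnitude cheaper than actually running the workflow under that configuration.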
Realising intelligent virtual design
This paper presents a vision and focus for the CAD Centre research: the Intelligent Design Assistant (IDA). The vision is based upon the assumption that the human and computer can operate symbiotically, with the computer providing support for the human within the design process. Recently, however, the focus has been towards the development of integrated design platforms that provide general support, irrespective of the domain, to a number of distributed collaborative designers. This is illustrated within the successfully completed Virtual Reality Ship (VRS) virtual platform, and the challenges are discussed further within the NECTISE, SAFEDOR and VIRTUE projects
Maintenance Strategies to Reduce Downtime Due to Machine Positional Errors
Manufacturing strives to reduce waste and increase Overall Equipment Effectiveness (OEE). When managing machine tool maintenance, a manufacturer must apply an appropriate decision technique in order to reveal the hidden costs associated with production losses, reduce equipment downtime effectively and assess each machine's performance.
Total productive maintenance (TPM) is a maintenance program that involves concepts for maintaining plant and equipment effectively. OEE is a powerful metric of manufacturing performance incorporating measures of the utilisation, yield and efficiency of a given process, machine or manufacturing line. It supports TPM initiatives by accurately tracking progress towards achieving “perfect production.”
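The OEE metric referred to above is conventionally the product of three factors. A minimal sketch of that standard calculation (the factor definitions follow common TPM practice; the variable names and numbers are illustrative, not from the paper):

```python
def availability(run_time, planned_time):
    # Share of planned production time the machine actually ran.
    return run_time / planned_time

def performance(ideal_cycle_time, total_count, run_time):
    # Actual throughput relative to the ideal cycle time.
    return ideal_cycle_time * total_count / run_time

def quality(good_count, total_count):
    # Share of produced parts that met specification.
    return good_count / total_count

def oee(a, p, q):
    # "Perfect production" corresponds to OEE = 1.0.
    return a * p * q
```

Downtime caused by positional error calibration reduces the availability factor, which is how the calibration decision ties back into the OEE figure.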
This paper presents a review of maintenance management methodologies and their application to positional error calibration decision-making. The purpose of this review is to evaluate the contribution of maintenance strategies, in particular TPM, towards improving manufacturing performance, and how they could be applied to reduce downtime due to machine inaccuracy. The aim is to find a balance between predictive calibration, on-machine checking and production lost due to inaccuracy.
This work redefines the role of maintenance management techniques and develops a framework to support the process of implementing a predictive calibration program as a prime method of supporting the change of philosophy for machine tool calibration decision making.
Keywords—maintenance strategies, down time, OEE, TPM, decision making, predictive calibration
A Process to Implement an Artificial Neural Network and Association Rules Techniques to Improve Asset Performance and Energy Efficiency
In this paper, we address the problem of asset performance monitoring, with the intention
of both detecting any potential reliability problem and predicting any loss of energy consumption
efficiency. This is an important concern for many industries and utilities with very intensive
capitalization in very long-lasting assets. To overcome this problem, in this paper we propose an
approach to combine an Artificial Neural Network (ANN) with Data Mining (DM) tools, specifically
with Association Rule (AR) Mining. The combination of these two techniques can now be done
using software which can handle large volumes of data (big data), but the process still needs to
ensure that the required amount of data will be available during the assets’ life cycle and that its
quality is acceptable. The combination of these two techniques in the proposed sequence differs
from previous works found in the literature, giving researchers new options to face the problem.
Practical implementation of the proposed approach may lead to novel predictive maintenance models
(emerging predictive analytics) that may detect with unprecedented precision any asset’s lack of
performance and help manage assets’ O&M accordingly. The approach is illustrated using specific
examples where asset performance monitoring is rather complex under normal operational conditions.
Ministerio de Economía y Competitividad DPI2015-70842-
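The Association Rule mining step rests on two basic quantities, support and confidence. A small pure-Python sketch of both, using hypothetical condition-monitoring itemsets rather than the paper's dataset:

```python
def support(transactions, itemset):
    # Fraction of transactions containing every item in the itemset.
    itemset = set(itemset)
    return sum(itemset <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    # Conditional frequency of the consequent given the antecedent,
    # i.e. how often the rule "antecedent -> consequent" holds.
    joint = set(antecedent) | set(consequent)
    return support(transactions, joint) / support(transactions, antecedent)
```

Rules whose support and confidence clear chosen thresholds are the ones an AR miner would report; in the paper's pipeline these rules are combined with the ANN's output rather than used alone.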
A novel haptic model and environment for maxillofacial surgical operation planning and manipulation
This paper presents a practical method and a new haptic model to support manipulations of bones and their segments during the planning of a surgical operation in a virtual environment using a haptic interface. To perform an effective dental surgery it is important to have all the operation-related information about the patient available beforehand, in order to plan the operation and avoid any complications. A haptic interface with a virtual and accurate patient model to support the planning of bone cuts is therefore critical, useful and necessary for the surgeons. The proposed system uses DICOM images taken from a digital tomography scanner and creates a mesh model of the filtered skull, from which the jaw bone can be isolated for further use. A novel solution for cutting the bones has been developed: the haptic tool is used to determine and define the bone-cutting plane, and this new approach creates three new meshes from the original model. Using this approach the computational power is optimized and real-time feedback can be achieved during all bone manipulations. During mesh-cutting movements, a predefined friction profile in the haptic system simulates the force-feedback feel of different densities in the bone.
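The mesh-cutting step hinges on classifying geometry against the haptic-defined plane. A minimal sketch of that signed-distance test (an assumption about the geometric core of such a cut, not the paper's implementation):

```python
import numpy as np

def split_vertices(vertices, plane_point, plane_normal):
    # Signed distance of each vertex to the cutting plane; the sign
    # decides which side of the cut the vertex belongs to.
    d = (vertices - plane_point) @ plane_normal
    return vertices[d > 0], vertices[d < 0], vertices[d == 0]
```

A full cutter would also split the triangles straddling the plane and cap the cut surfaces, which is where the third mesh in the abstract's "three new meshes" comes from.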
Specification-Driven Predictive Business Process Monitoring
Predictive analysis in business process monitoring aims at forecasting the
future information of a running business process. The prediction is typically
made based on the model extracted from historical process execution logs (event
logs). In practice, different business domains might require different kinds of
predictions. Hence, it is important to have a means for properly specifying the
desired prediction tasks, and a mechanism to deal with these various prediction
tasks. Although there have been many studies in this area, they mostly focus on
a specific prediction task. This work introduces a language for specifying
the desired prediction tasks that is able to express various kinds of
prediction tasks, together with a mechanism for automatically creating the
corresponding prediction model from a given specification. Unlike previous
studies, which focus on a particular prediction task, this approach handles
various prediction tasks driven by the given specification. We also
provide an implementation of the approach which is used to conduct experiments
using real-life event logs.
Comment: This article significantly extends the previous work in
https://doi.org/10.1007/978-3-319-91704-7_7, which has a technical report in
arXiv:1804.00617. This article and the previous work have a coauthor in
common
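Prediction over event logs of this kind typically starts from trace prefixes. A small illustrative sketch of building (prefix, next activity) training pairs from a trace (a generic preprocessing step, not the paper's specification language):

```python
def prefix_pairs(trace):
    # Each prefix of a running trace becomes a training input whose label
    # is the activity that actually followed it in the historical log.
    return [(tuple(trace[:i]), trace[i]) for i in range(1, len(trace))]
```

A specification-driven system would generalize the label: instead of always the next activity, it could be the remaining time, the final outcome, or any other target the specification language can express.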
State-of-the-art on research and applications of machine learning in the building life cycle
Fueled by big data, powerful and affordable computing resources, and advanced algorithms, machine learning has been explored and applied in buildings research over the past decades and has demonstrated its potential to enhance building performance. This study systematically surveyed how machine learning has been applied at different stages of the building life cycle. By conducting a literature search on the Web of Knowledge platform, we found 9579 papers in this field and selected 153 papers for an in-depth review. The number of published papers is increasing year by year, with a focus on building design, operation, and control. However, no study was found using machine learning in building commissioning. There are successful pilot studies on fault detection and diagnosis of HVAC equipment and systems, load prediction, energy baseline estimation, load shape clustering, occupancy prediction, and learning occupant behaviors and energy use patterns. None of the existing studies has been adopted broadly by the building industry, due to common challenges including (1) the lack of large-scale labeled data to train and validate models, (2) the lack of model transferability, which prevents a model trained on one data-rich building from being used in another building with limited data, (3) the lack of strong justification of the costs and benefits of deploying machine learning, and (4) performance that might not be reliable and robust for the stated goals, as a method might work for some buildings but not generalize to others. Findings from the study can inform future machine learning research to improve occupant comfort, energy efficiency, demand flexibility, and resilience of buildings, as well as inspire young researchers in the field to explore multidisciplinary approaches that integrate building science, computing science, data science, and social science