Data analytics to reduce stop-on-fail test in electronics manufacturing
The use of data-driven techniques is popular in smart manufacturing. Machine learning, statistics, or a combination of both have been used to improve processes in electronics manufacturing. This paper presents the application of classical techniques to reduce production cycle time by compacting a production test sequence. This set of tests is run in a stop-on-fail scenario for quality assurance of an electronic device. Data generated by the production test set in a stop-on-fail scenario challenges the traditional application of data-driven techniques because of its missing-data characteristic. The developed computational procedures handle this application-specific data attribute. The novelty of this work is the developed algorithm, which applies classical techniques in an iterative environment as a strategy to analyse incomplete datasets. Results show that the method can reduce a production test set with parametric and non-parametric tests by building an accurate prognostic model. The results can reduce production cycle time and costs. The paper details and discusses the advantages and limitations of the proposed algorithms.
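The missing-data characteristic of stop-on-fail test logs can be illustrated with a minimal sketch (the paper's actual procedures are richer; the data, function names, and removal criterion below are invented for illustration). Tests after the first failure are never executed, so any statistic must be computed only over units that actually reached a given test:

```python
# Hedged sketch: identify candidate tests for removal from a stop-on-fail
# sequence, using only the rows where each test was actually executed.
# None marks a test skipped because an earlier test already failed
# (the missing-data characteristic described in the abstract).

def observed_fail_rate(records, test_idx):
    """Fail rate of one test over the units that reached it."""
    executed = [r[test_idx] for r in records if r[test_idx] is not None]
    if not executed:
        return None
    return executed.count("fail") / len(executed)

def compaction_candidates(records, n_tests, threshold=0.0):
    """Tests whose observed fail rate is <= threshold are candidates for
    removal; rerunning after each removal gives the iterative strategy."""
    candidates = []
    for t in range(n_tests):
        rate = observed_fail_rate(records, t)
        if rate is not None and rate <= threshold:
            candidates.append(t)
    return candidates

# Three units through a three-test stop-on-fail sequence:
data = [
    ["pass", "pass", "pass"],
    ["pass", "fail", None],   # test 2 failed, so test 3 never ran
    ["pass", "pass", "fail"],
]
print(compaction_candidates(data, 3))  # test 0 never fails -> [0]
```

Note how test 3's fail rate is computed over two units, not three, because the second unit stopped early; ignoring this would bias every downstream statistic.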
Continuous maintenance and the future – Foundations and technological challenges
High-value, long-life products require continuous maintenance throughout their life cycle to achieve the required performance at optimum through-life cost. This paper presents the foundations and technologies required to offer such a maintenance service. Component- and system-level degradation science, assessment, and modelling, along with life cycle ‘big data’ analytics, are the two most important knowledge and skill bases required for continuous maintenance. Advanced computing and visualisation technologies will improve the efficiency of maintenance and reduce the through-life cost of the product. The discussion of the future of continuous maintenance within the Industry 4.0 context also identifies the roles of IoT, standards, and cyber security.
Disparate data integration case for connected factories using timestamps
Manufacturing data integration of machine, process, and sensor data from the shop floor remains an important issue in achieving the anticipated business value of fully connected factories. Integrated manufacturing data has been a hallmark of Industry 4.0 initiatives because integrated data enables better decision-making for cost, schedule, and system optimizations. In this paper, we extend work on optimizing manufacturing costs, describing an algorithm that uses timestamps to integrate previously unassociated quality and test information, enabling us to better identify and eliminate redundant tests. Results are provided and discussed, and we suggest that the described approach may be valuable for some types of heterogeneous manufacturing data integration where timestamps and event chronologies are available.
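The core of timestamp-based integration is an as-of join: each test record is matched to the latest process event that precedes it in time. A minimal sketch, with invented field names and data (the paper's algorithm is not reproduced here):

```python
import bisect
from datetime import datetime

# Hedged sketch: attach to each test record the latest process event
# whose timestamp is not after the test's timestamp (an "as-of" join).

def integrate_by_timestamp(process_events, test_records):
    events = sorted(process_events, key=lambda e: e["ts"])
    keys = [e["ts"] for e in events]
    joined = []
    for rec in test_records:
        i = bisect.bisect_right(keys, rec["ts"]) - 1
        joined.append({**rec, "process": events[i] if i >= 0 else None})
    return joined

events = [{"ts": datetime(2020, 1, 1, 9, 0), "station": "SMT-1"},
          {"ts": datetime(2020, 1, 1, 10, 0), "station": "SMT-2"}]
tests = [{"ts": datetime(2020, 1, 1, 9, 30), "result": "pass"}]
print(integrate_by_timestamp(events, tests)[0]["process"]["station"])  # SMT-1
```

With sorted event keys, each lookup is a binary search, so the join scales to large shop-floor logs; a library as-of join (e.g. pandas `merge_asof`) offers the same semantics with tolerance windows.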
Failure mode & effect analysis for improving data veracity and validity
Failure Mode & Effect Analysis (FMEA) is a method that has been used to improve the reliability of products, processes, designs, and software across different applications, including electronics manufacturing. In this paper we propose a modification of this method that extends its application to data veracity and validity improvement. The proposed DVV-FMEA method is based on engineering features and, in addition, provides transparency and understandability of the data and its pre-processing, making it reproducible and trustworthy.
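Classical FMEA ranks failure modes by a Risk Priority Number, RPN = severity × occurrence × detection, each scored on a 1-10 scale. A minimal sketch of that standard scoring applied to data failure modes (the DVV-FMEA specifics are not given here; the failure modes and scores below are invented examples):

```python
# Hedged sketch: standard FMEA risk-priority scoring applied to
# hypothetical data failure modes. RPN = severity * occurrence * detection.

def rpn(severity, occurrence, detection):
    """Risk Priority Number on the usual 1-10 scales."""
    return severity * occurrence * detection

failure_modes = [
    ("duplicated sensor readings", 4, 6, 3),
    ("timestamp drift between stations", 7, 4, 5),
    ("out-of-range measurement stored", 8, 2, 2),
]
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{rpn(s, o, d):3d}  {name}")  # highest-risk mode first
```

The ranking directs pre-processing effort to the data defects with the highest combined risk rather than the most visible ones.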
WeighstEd
The purpose of this design thesis is to outline and describe the design project WeighstEd. WeighstEd is a data collection, storage, and analysis system for food waste, built to help Santa Clara University’s Sustainability Center reach a quantifiable food waste reduction goal of 10% by 2020 by using data to make informed cafeteria changes. The report outlines the entire engineering design process from ideation to manufacture, including analysis techniques and benchmark testing, and serves as the written documentation of three mechanical engineers’ Senior Design Project completed at Santa Clara University. WeighstEd will be implemented at on-campus events and in the university cafeteria beginning in the 2019-2020 school year.
Failure mode & effect analysis and another methodology for improving data veracity and validity
Failure Mode & Effect Analysis (FMEA) is a method that has been used to improve the reliability of products, processes, designs, and software across different applications. In this paper we extend its use to data veracity and validity improvement in the context of big data analysis and discuss its application to an electronics manufacturing test procedure consisting of a sequence of tests. Finally, we describe a further methodology, developed as a result of the DVV-FMEA application, which aims to improve the tests' repeatability and failure detection capabilities as well as to monitor their reliability.
Data driven predictive model to compact a production stop-on-fail test set for an electronic device
The decision tree is a popular machine learning algorithm used for fault detection and classification in industry. In this paper, this modelling technique is used to compact a production test set defined for quality assurance of an electronic asset. The novelty of this work is the proposed method, which builds decision trees iteratively until an accurate predictive model that meets a classification accuracy target in a stop-on-fail test scenario is obtained. The generated test data is characterized by missing values, which is a major challenge for the traditional use of decision trees. The developed computational procedure handles this application-specific data attribute. Exemplary results show that the method is able to significantly reduce a production test set with parametric and non-parametric tests and generate a faithful prognostic model. In addition, the method is computationally efficient and easy to implement. It could also be combined with other test compaction strategies such as variable association analysis. Furthermore, the proposed method offers the flexibility of exploring the trade-off between the number of tests removed from the production test set and the prediction accuracy. The results can enable production cost reduction without impacting quality detection accuracy. The paper details and discusses the advantages and limitations of the proposed algorithm.
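The iterative accept-or-reject loop described above can be sketched as follows. This is only an illustration of the loop with invented data: a toy lookup model stands in for the decision tree (e.g. sklearn's `DecisionTreeClassifier`) to keep the sketch dependency-free, and accuracy is measured on training data for brevity where a held-out set belongs:

```python
from collections import Counter, defaultdict

# Hedged sketch of an iterative test-compaction loop: repeatedly drop a
# candidate test, refit a model on the remaining tests to predict the
# final verdict, and keep the removal only while accuracy stays on target.

def fit_table(rows, labels):
    """Toy stand-in for a decision tree: memorize the majority label per row."""
    by_key = defaultdict(list)
    for r, y in zip(rows, labels):
        by_key[tuple(r)].append(y)
    table = {k: Counter(v).most_common(1)[0][0] for k, v in by_key.items()}
    default = Counter(labels).most_common(1)[0][0]
    return lambda r: table.get(tuple(r), default)

def accuracy(model, rows, labels):
    return sum(model(r) == y for r, y in zip(rows, labels)) / len(rows)

def compact(rows, labels, n_tests, target=0.9):
    keep = list(range(n_tests))
    for t in reversed(range(n_tests)):
        trial = [i for i in keep if i != t]
        reduced = [[r[i] for i in trial] for r in rows]
        model = fit_table(reduced, labels)
        if accuracy(model, reduced, labels) >= target:
            keep = trial  # removal accepted; continue with the smaller set
    return keep

# Three-test sequence where test 1 alone determines the final verdict:
rows = [["pass", "pass", "pass"], ["pass", "fail", "pass"],
        ["fail", "pass", "pass"], ["pass", "fail", "fail"]]
labels = ["pass", "fail", "pass", "fail"]
print(compact(rows, labels, 3))  # [1]
```

Lowering `target` trades prediction accuracy for more removed tests, which is the trade-off the abstract mentions.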
Supply chain risk management: an interactive simulation model in a big data context
Peer-review under responsibility of the scientific committee of the International Conference on Industry 4.0 and Smart Manufacturing. Aligned with the Industry 4.0 research and innovation agenda, a Decision Support System is currently being developed with the purpose of enhancing decision-making in risk scenarios in supply chains. It comprises a Big Data Warehouse and a simulation model. The former stores and provides integrated real data to the simulation model, which models the respective material and information flows. The purpose of this paper is thus to present this tool being used to test scenarios that, contrary to the traditional simulation approach, incorporate disruptions interactively, meaning that users may fire such events at any desired simulation time and with different parameters. The tool is thus used to assess the impact of disruptions on the performance of the system. The conclusions of this paper highlight the benefits that can be obtained with the proposed interactive approach, as it allows a virtualization of the real system to be obtained while, at the same time, using the simulation model to assess what the impact of certain disruptions would be. This work has been supported by national funds through FCT – Fundação para a Ciência e Tecnologia within the Project Scope UID/CEC/00319/2019 and by the doctoral scholarship PDE/BDE/114566/2016 funded by FCT, the Portuguese Ministry of Science, Technology and Higher Education, through national funds, and co-financed by the European Social Fund (ESF) through the Operational Programme for Human Capital (POCH).
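The idea of firing disruption events at arbitrary simulation times can be sketched with a minimal discrete-time model (the paper's Decision Support System couples a richer simulation to a Big Data Warehouse; the model, parameters, and numbers below are invented):

```python
# Hedged sketch: a toy supply-chain simulation where user-fired
# disruption events change capacity at chosen times.

def simulate(horizon, base_throughput, disruptions):
    """disruptions maps time step -> capacity factor; returns units shipped
    per step, so the impact of each fired event is directly visible."""
    factor, shipped = 1.0, []
    for t in range(horizon):
        if t in disruptions:          # a user-fired event takes effect here
            factor = disruptions[t]
        shipped.append(base_throughput * factor)
    return shipped

# Fire a 50% capacity loss at t=3 and a recovery at t=6:
out = simulate(8, 100, {3: 0.5, 6: 1.0})
print(out)  # [100.0, 100.0, 100.0, 50.0, 50.0, 50.0, 100.0, 100.0]
```

Comparing the disrupted trace against a baseline run with an empty `disruptions` map gives the what-if impact assessment the abstract describes.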
Designing Predictive Maintenance for Agricultural Machines
The Digital Transformation alters business models in all fields of application, but not all industries transform at the same speed. While recent innovations in smart products, big data, and machine learning have profoundly transformed business models in the high-tech sector, less digitalized industries—like agriculture—have only begun to capitalize on these technologies. Inspired by predictive maintenance strategies for industrial equipment, the purpose of this paper is to design, implement, and evaluate a predictive maintenance method for agricultural machines that predicts future defects of a machine’s components based on a data-driven analysis of service records. An evaluation with 3,407 real-world service records shows that the method predicts damaged parts with a mean accuracy of 86.34%. The artifact is an exaptation of previous design knowledge from high-tech industries to agriculture—a sector in which machines move through rough territory and adverse weather conditions, are utilized extensively for short periods, and do not provide sensor data to service providers. Deployed on a platform, the prediction method enables co-creating a predictive maintenance service that helps farmers avoid resource shortages during harvest seasons, while service providers can plan and conduct maintenance preemptively and with increased efficiency.
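A simple baseline for service-record-driven prediction is a per-model frequency ranking of replaced components (the paper's actual method is richer and achieves the reported 86.34% accuracy; the field names and records below are invented for illustration):

```python
from collections import Counter, defaultdict

# Hedged sketch: for each machine model, rank components by historical
# replacement counts in the service records; the top entries are the
# most likely next defects under this frequency baseline.

def rank_components(records):
    by_model = defaultdict(Counter)
    for model, component in records:
        by_model[model][component] += 1
    return {m: [c for c, _ in cnt.most_common()] for m, cnt in by_model.items()}

records = [("harvester-A", "belt"), ("harvester-A", "belt"),
           ("harvester-A", "bearing"), ("tractor-B", "filter")]
print(rank_components(records)["harvester-A"])  # ['belt', 'bearing']
```

Because it needs only service records, not sensor streams, such a baseline fits the sensorless setting the abstract highlights; a learned model then refines it with machine age, usage, and season features.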