
    ATMSeer: Increasing Transparency and Controllability in Automated Machine Learning

    To relieve the pain of manually selecting machine learning algorithms and tuning hyperparameters, automated machine learning (AutoML) methods have been developed to search automatically for good models. Because the model search space is huge, it is impossible to try all models; users tend to distrust automatic results and increase the search budget as much as they can, thereby undermining the efficiency of AutoML. To address these issues, we design and implement ATMSeer, an interactive visualization tool that supports users in refining the search space of AutoML and analyzing the results. To guide the design of ATMSeer, we derive a workflow of using AutoML based on interviews with machine learning experts. A multi-granularity visualization is proposed to enable users to monitor the AutoML process, analyze the searched models, and refine the search space in real time. We demonstrate the utility and usability of ATMSeer through two case studies, expert interviews, and a user study with 13 end users.
    Comment: Published in the ACM Conference on Human Factors in Computing Systems (CHI), 2019, Glasgow, Scotland, UK

    Application of Audible Signals in Tool Condition Monitoring using Machine Learning Techniques

    Machining is always accompanied by difficulties such as tool wear, tool breakage, improper machining conditions, non-uniform workpiece properties and other irregularities, which are among the major barriers to highly automated operations. An effective tool condition monitoring (TCM) system provides a way to monitor such irregular machining processes and prompt operators to take appropriate actions. Although a wide variety of monitoring techniques have been developed for the online detection of tool condition, finding a reliable, simple and cheap solution remains an unsolved problem. This research focuses on developing a real-time tool condition monitoring model that detects tool condition and part quality in the machining process through sound monitoring with machine learning techniques. The study develops a process model capable of on-line process monitoring, analyzing the sound signals collected during machining and training the proposed system to predict the cutting phenomena. A decision-making system based on the Support Vector Machine approach is developed, trained with pre-processed data, and tested; it showed significant prediction accuracy in different applications, proving to be an effective model for on-line process monitoring of machining. In addition, the system proves to be cheap, compact and invariant to sensor position. The successful development of the proposed TCM system can provide a practical tool to reduce downtime for tool changes and minimize the amount of scrap in the metal cutting industry.
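    As a sketch of the decision-making step described above, the snippet below trains a Support Vector Machine on two illustrative sound features (RMS energy and spectral centroid of an audio frame); the features and the synthetic data are assumptions for demonstration, not the paper's actual feature set:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic sound-frame features: [RMS energy, spectral centroid (Hz)].
# A worn tool is assumed to produce louder, higher-pitched cutting noise.
rng = np.random.default_rng(0)
sharp = rng.normal([0.2, 2000.0], [0.05, 150.0], size=(50, 2))
worn = rng.normal([0.6, 3500.0], [0.05, 150.0], size=(50, 2))
X = np.vstack([sharp, worn])
y = np.array([0] * 50 + [1] * 50)  # 0 = sharp tool, 1 = worn tool

# Scale features, then fit an RBF-kernel SVM, mirroring common TCM practice.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new frame; loud, high-centroid noise should flag a worn tool.
print(clf.predict([[0.58, 3400.0]])[0])
```

    In a real deployment the features would be computed from microphone frames captured during cutting, and the labels would come from measured tool wear.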

    THE CNC VIRTUAL AS TEACHING AND TRAINING AID OF CNC PROGRAMMING

    CNC machine tools are the most important practical aids for teaching and training CNC programming in vocational high schools. Their relatively high price puts them beyond the reach of many schools, so the teaching of CNC programming in vocational high schools mostly proceeds without a CNC machine, and as a result many students fail to reach the standard competence in applied CNC programming. The unavailability of CNC machine tools is addressed by using a CNC simulator, which consists of a Virtual CNC and a CNC Machine Simulator. It is a medium for simulating the execution of an NC part program: the simulation displays the tool path of the machining process on the monitor, and a simulated NC part program can be sent to the control unit of the CNC Machine Simulator. Implementation of the CNC simulator in teaching and training CNC programming begins with building the Virtual CNC, a piece of software that renders the CNC machine environment visually on the monitor; it was developed using the Research and Development (R&D) method. Implementation of the CNC simulator in teaching CNC programming shows that: (1) students are very interested and excited to use the Virtual CNC, actively trying the virtual numpad simulation on the monitor, inputting data on the virtual panel, and simulating or executing the CNC program on the CNC Machine Simulator; (2) students practice writing and executing CNC programs individually, in the classroom or outside it; (3) the Virtual CNC can be used as a teaching and training medium classically (in the classroom), for individual learning, and even for e-learning.

    A system of remote patients' monitoring and alerting using the machine learning technique

    Machine learning has become an essential tool in daily life and a powerful tool in most areas we wish to optimize. It is used to create techniques that can learn from labelled or unlabelled information, as well as from their surroundings. Machine learning is utilized in various areas, notably the healthcare industry, where it provides significant advantages via appropriate decision and prediction methods. The proposed work introduces a remote system that can continuously monitor a patient and produce an alert whenever necessary. The proposed methodology makes use of different machine learning algorithms along with cloud computing for continuous data storage. Over the years, these technologies have led to significant advancements in the healthcare industry. Medical professionals utilize machine learning tools and methods to analyse medical data in order to detect hazards and offer an appropriate diagnosis and treatment. The scope of remote healthcare ranges from tracking chronically ill patients and elderly people to preterm children and accident victims. The current study explores the capability of machine learning technologies to monitor remote patients and report their current condition through the remote system. New advances in contactless observation show that the patient need only be within a few metres of the sensors for them to work. Sensors attached to the body and environmental sensors placed in the surroundings are examples of the technology available.

    Darknet Traffic Analysis: A Systematic Literature Review

    The primary objective of an anonymity tool is to protect the anonymity of its users through the implementation of strong encryption and obfuscation techniques. As a result, it becomes very difficult to monitor and identify users' activities on these networks. Moreover, such systems have strong defensive mechanisms to protect users against potential risks, including the extraction of traffic characteristics and website fingerprinting. However, the strong anonymity feature also functions as a refuge for those involved in illicit activities who aim to avoid being traced on the network. Consequently, a substantial body of research has been undertaken to examine and classify encrypted traffic using machine learning techniques. This paper presents a comprehensive examination of the existing approaches to categorizing anonymous and encrypted network traffic inside the darknet, together with an analysis of machine learning methods for monitoring and identifying traffic attacks inside the darknet.
    Comment: 35 pages, 13 figures
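    Many of the approaches surveyed in such reviews train a supervised classifier on statistical flow features. A minimal sketch of that pattern, using entirely synthetic flow features (mean packet size, flow duration, mean inter-arrival time) rather than any dataset cited in the review:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic flow-level features: [mean packet size (bytes), flow duration (s),
# mean inter-arrival time (s)]. The two traffic classes and their statistics
# are invented for illustration only.
rng = np.random.default_rng(1)
class_a = rng.normal([512.0, 30.0, 0.50], [40.0, 5.0, 0.10], size=(200, 3))
class_b = rng.normal([900.0, 10.0, 0.05], [40.0, 2.0, 0.01], size=(200, 3))
X = np.vstack([class_a, class_b])
y = np.array([0] * 200 + [1] * 200)

# Hold out a test split and fit a random forest on the remaining flows;
# flow statistics remain observable even when payloads are encrypted.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
```

    Real studies replace the synthetic arrays with features extracted from captured traffic traces; the classifier choice varies across the surveyed papers.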

    A generalized model for monitor units determination in ocular proton therapy using machine learning: A proof-of-concept study

    Objective. Determining and verifying the number of monitor units is crucial to achieving the desired dose distribution in radiotherapy and maintaining treatment efficacy. However, current commercial treatment planning systems dedicated to ocular passive eyelines in proton therapy do not provide the number of monitor units for patient-specific plan delivery. Performing specific pre-treatment field measurements, which is time- and resource-consuming, is the usual gold-standard practice. This proof-of-concept study reports on the development of a multi-institutional generalized model for monitor units determination in proton therapy for eye melanoma treatments. Approach. To cope with the small number of patients being treated in proton centers, three European institutes participated in this study. Measurement data were collected to address output factor differences across the institutes, especially as a function of field size, spread-out Bragg peak modulation width, residual range, and air gap. A generic model for monitor units prediction, built from 3748 patients with a broad diversity of tumor patterns, was evaluated using six popular machine learning algorithms: (i) decision tree, (ii) random forest, (iii) extra trees, (iv) K-nearest neighbors, (v) gradient boosting, and (vi) support vector regression. The features used as inputs to each machine learning pipeline were spread-out Bragg peak width, range, air gap, fraction dose and calibration dose. Performance was scored using the mean absolute error between predicted and real monitor units, the latter collected from institutional gold-standard methods. Main results. Predictions across algorithms were accurate within 3% uncertainty for up to 85.2% of the plans, and within 10% uncertainty for up to 98.6% of the plans with the extra trees algorithm. Significance.
A proof-of-concept of machine learning-based generic monitor units determination in ocular proton therapy has been demonstrated. This could trigger the development of an independent monitor units calculation tool for clinical use.
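    A minimal sketch of such a pipeline, using scikit-learn's ExtraTreesRegressor (the best-performing algorithm above) on the five named inputs; the data and the monitor-unit relation below are synthetic assumptions, not clinical values:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for the five inputs named above: SOBP width (mm),
# range (mm), air gap (mm), fraction dose (Gy), calibration dose (Gy).
rng = np.random.default_rng(2)
X = rng.uniform([5, 10, 20, 1, 1], [30, 35, 80, 20, 2], size=(2000, 5))
# Hypothetical monitor-unit relation used only to generate training targets;
# it does not reflect any institution's real calibration.
mu = 10.0 * X[:, 3] + 2.0 * X[:, 1] - 0.5 * X[:, 0] + rng.normal(0, 1, 2000)

# Evaluate against a held-out split with the paper's metric (MAE).
X_tr, X_te, y_tr, y_te = train_test_split(X, mu, random_state=0)
model = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
mae = mean_absolute_error(y_te, model.predict(X_te))
print(f"test MAE: {mae:.1f} MU")
```

    In the study, the targets would instead be the monitor units measured by each institute's gold-standard method, and the trained model would serve as an independent check on plan delivery.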


    Use of new generation geospatial data and technology for low cost drought monitoring and SDG reporting solution : a thesis presented in partial fulfillment of the requirement for the degree of Master of Science in Computer Science at Massey University, Manawatū, New Zealand

    Food security is dependent on ecosystems including forests, lakes and wetlands, which in turn depend on water availability and quality. The importance of water availability and of monitoring drought has been highlighted in the Sustainable Development Goals (SDGs) within the 2030 Agenda under indicator 15.3. In this context the UN member countries, which agreed to the SDGs, have an obligation to report their information to the UN. The objective of this research is to develop a methodology to monitor drought and help countries report their findings to the UN in a cost-effective manner. The Standardized Precipitation Index (SPI) is a drought indicator which requires long-term precipitation data collected from weather stations, as per the World Meteorological Organization recommendation. However, weather stations cannot monitor large areas, and many developing countries currently struggling with drought do not have access to a large number of weather stations due to lack of funds and expertise. Therefore, alternative methodologies should be adopted to monitor SPI. In this research, SPI values were calculated from available weather stations in Iran and New Zealand, and Google Earth Engine (GEE), Sentinel-1 and Sentinel-2 imagery, and other complementary data were used to estimate SPI values. Two genetic algorithms were created: one constructed additional features using indices calculated from Sentinel-2 imagery, and the other performed feature selection over the Sentinel-2 indices, including the constructed features. Following the feature-selection process, two datasets were created containing the Sentinel-1 and Sentinel-2 data and other complementary information such as seasonal data and Shuttle Radar Topography Mission (SRTM) derived information. The automated machine learning tool TPOT was used to create optimized machine learning pipelines using genetic programming.
The resulting models yielded an average of 90 percent accuracy in 10-fold cross-validation for the Sentinel-1 dataset and an average of approximately 70 percent for the Sentinel-2 dataset. The final model achieved a test accuracy of 80 percent in classifying short-term SPI (SPI-1 and SPI-3) and an accuracy of 65 percent for SPI-6 using the Sentinel-1 test dataset. However, the results generated using the Sentinel-2 dataset were lower than for Sentinel-1 (45 percent for SPI-1 and 65 percent for SPI-6), with the exception of SPI-3, which had an accuracy of 85 percent. The research shows that it is possible to monitor short-term SPI adequately using cost-free satellite imagery, in particular Sentinel-1 imagery, and machine learning. In addition, this methodology reduces the workload on countries' statistical offices in reporting information to the SDG framework for indicator 15.3. It emerged that Sentinel-1 imagery alone cannot be used to monitor SPI, so complementary data are required for the monitoring process. In addition, the use of Sentinel-2 imagery did not yield accurate results for SPI-1 and SPI-6, but did yield adequate results for SPI-3. Further research is required to investigate how using Sentinel-2 imagery together with Sentinel-1 imagery impacts the accuracy of the models.
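    For context, the SPI itself is computed by fitting a gamma distribution to a precipitation record and mapping the cumulative probabilities through the inverse standard normal. A minimal sketch of that computation on synthetic data (omitting the separate handling of zero-precipitation periods used operationally):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for a precipitation record:
    fit a gamma distribution, then map its CDF through the inverse
    standard normal so values are in standard-deviation units."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)  # fix location at 0
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

# Illustrative 30-value monthly precipitation record in mm (synthetic data).
rng = np.random.default_rng(3)
precip = rng.gamma(shape=2.0, scale=40.0, size=30)
values = spi(precip)
print(values.round(2))  # negative values flag drier-than-normal months
```

    The thesis estimates these station-derived SPI values from satellite imagery instead, so that regions without dense weather-station networks can still be monitored.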

    Tool Wear Prediction Upgrade Kit for Legacy CNC Milling Machines in the Shop Floor

    The operation of CNC milling is expensive because of the cost-intensive use of cutting tools, whose wear and tear determines tool lifetime. Today's machines are not capable of accurately estimating tool abrasion during the machining process. Manufacturers therefore rely on reactive maintenance (a tool change after breakage) or a preventive maintenance approach (a tool change according to predefined tool specifications). In either case, maintenance costs are high due to lost machine utilization or premature tool changes. To find the optimal point of tool change, it is necessary to monitor CNC process parameters during machining and use advanced data analytics to predict tool abrasion. However, data science expertise is limited in small and medium-sized manufacturing companies, and the long operating life of machines often does not justify investment in new machines before the end of that life. The publication describes a cost-efficient approach to upgrading legacy CNC machines with a Tool Wear Prediction Upgrade Kit. A practical solution is presented with a holistic hardware/software setup, including an edge device and multiple sensors. The prediction of tool wear is based on machine learning, and the user interface visualizes the machine condition for maintenance personnel on the shop floor. The approach is conceptualized and discussed based on industry requirements. Future work is outlined.
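    The prediction step of such a kit can be sketched as a regression from process signals to tool wear, combined with a change-out threshold; the features, the wear relation, and the 0.3 mm threshold below are illustrative assumptions, not values from the publication:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic training data: [spindle load (kW), vibration RMS (g)] against
# measured flank wear (mm). Both the features and the linear wear relation
# are invented for illustration.
rng = np.random.default_rng(4)
X = rng.uniform([0.5, 0.01], [5.0, 0.2], size=(300, 2))
wear = 0.05 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0, 0.005, 300)

model = GradientBoostingRegressor(random_state=0).fit(X, wear)

def needs_tool_change(features, threshold=0.3):
    """Flag a tool change when predicted flank wear exceeds the threshold."""
    return bool(model.predict([features])[0] > threshold)

# Heavy load and high vibration imply wear above the change-out threshold.
print(needs_tool_change([4.5, 0.18]))
```

    On the kit described above, a model of this kind would run on the edge device against live sensor streams, with the flag surfaced in the maintenance user interface.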