
    Air pollution forecasts: An overview

    Full text link
    © 2018 by the authors. Licensee MDPI, Basel, Switzerland. Air pollution is a phenomenon harmful to the ecological system and to the normal conditions of human existence and development, arising when substances in the atmosphere exceed certain concentrations. In the face of increasingly serious environmental pollution, scholars have conducted a significant body of related research, in which the forecasting of air pollution has been of paramount importance. Air pollution forecasts are the basis for taking effective pollution control measures, so accurate forecasting has become an important task. Extensive research indicates that air pollution forecasting methods can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, hybrid models have been proposed that can improve forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of these forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies.
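The statistical category of forecasting methods mentioned in the abstract can be illustrated with a minimal autoregressive sketch. The function names and the PM2.5 values below are hypothetical, not taken from the review:

```python
import numpy as np

def fit_ar1(series):
    """Fit y[t] = a*y[t-1] + b by least squares (a minimal AR(1) model)."""
    x, y = series[:-1], series[1:]
    a, b = np.polyfit(x, y, 1)
    return a, b

def forecast(series, steps, a, b):
    """Iterate the fitted AR(1) model forward from the last observation."""
    preds = []
    last = series[-1]
    for _ in range(steps):
        last = a * last + b
        preds.append(last)
    return preds

# Hypothetical daily PM2.5 concentrations (µg/m³)
pm25 = np.array([35.0, 40.0, 38.0, 45.0, 50.0, 47.0, 52.0])
a, b = fit_ar1(pm25)
print(forecast(pm25, 3, a, b))
```

Real statistical forecasters (e.g. ARIMA with seasonal terms) extend this same idea with more lags and differencing; the hybrid models surveyed typically combine such a statistical stage with an AI component.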

    The Application of ANN and ANFIS Prediction Models for Thermal Error Compensation on CNC Machine Tools

    Get PDF
    Thermal errors can have significant effects on Computer Numerical Control (CNC) machine tool accuracy. The errors come from thermal deformations of the machine elements caused by heat sources within the machine structure or from ambient temperature change. The effect of temperature can be reduced by error avoidance or numerical compensation. The performance of a thermal error compensation system essentially depends upon the accuracy and robustness of the thermal error model and its input measurements. This thesis first reviews different methods of designing thermal error models, before concentrating on employing Artificial Intelligence (AI) methods to design different thermal prediction models. In this research work the Adaptive Neuro-Fuzzy Inference System (ANFIS) is used as the backbone for thermal error modelling. The choice of inputs to the thermal model is a non-trivial decision which is ultimately a compromise between the ability to obtain data that sufficiently correlates with the thermal distortion and the cost of implementation of the necessary feedback sensors. In this thesis, temperature measurement was supplemented by direct distortion measurement at accessible locations. The location of temperature measurement must also provide a representative measurement of the change in temperature that will affect the machine structure. The number of sensors and their locations are not always intuitive, and the time required to identify the optimal locations is often prohibitive, resulting in compromise and poor results. In this thesis, a new intelligent system for reducing thermal errors of machine tools using data obtained from thermography is introduced. Different groups of key temperature points on a machine can be identified from thermal images using a novel schema based on Grey system theory and the Fuzzy C-Means (FCM) clustering method.
This novel method simplifies the modelling process, enhances the accuracy of the system and reduces the overall number of inputs to the model, since otherwise a much larger number of thermal sensors would be required to cover the entire structure. An Adaptive Neuro-Fuzzy Inference System with Fuzzy C-Means clustering (ANFIS-FCM) is then employed to design the thermal prediction model. In order to optimise the approach, a parametric study is carried out by changing the number of inputs and the number of Membership Functions (MFs) in the ANFIS-FCM model, and comparing the relative robustness of the designs. The proposed approach has been validated on three different machine tools under different operating conditions. The proposed system has thus been shown to be robust to different internal heat sources and ambient changes, and is easily extensible to other CNC machine tools. Finally, the proposed method is shown to compare favourably against alternative approaches such as an Artificial Neural Network (ANN) model and different Grey models.
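The Fuzzy C-Means step used to group key temperature points can be sketched as follows. This is a minimal generic FCM implementation, not the thesis code, and the example feature vectors (mean temperature, temperature rise rate) are hypothetical:

```python
import numpy as np

def fcm(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal Fuzzy C-Means: returns cluster centres and the membership
    matrix U, where U[i, k] is the degree to which point i belongs to
    cluster k (rows sum to 1). m is the fuzziness exponent."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m
        # Centres are membership-weighted means of the points
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        # Standard FCM membership update from point-to-centre distances
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Hypothetical per-sensor features: [mean temperature (°C), rise rate (°C/min)]
points = np.array([[20.1, 0.1], [20.3, 0.2], [45.0, 2.1], [44.5, 2.0]])
centres, U = fcm(points, n_clusters=2)
print(centres)
```

In the thesis, such memberships (combined with Grey relational analysis of the thermal images) are what allow a few representative temperature points per group to stand in for many physical sensors.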

    Spatial relational learning and foraging in cotton-top tamarins

    Get PDF
    Spatial relational learning can be defined as the use of the spatial (geometric) relationship between two or more cues (landmarks) in order to locate additional points in space (O'Keefe and Nadel, 1979). An internal spatial representation enables an animal to compute novel locations and travel routes from familiar landmarks and routes (Dyer, 1993). A spatial representation is an internal construct mediating between perceived stimuli in the environment and the behaviour of the animal (Tolman, 1948). In this type of spatial representation the information encoded must be isomorphic with the physical environment such that the geometric relations of distance, angle and direction are maintained or can be computed from the stored information (Gallistel, 1990). A series of spatial and foraging task experiments were conducted to investigate the utilisation of spatial relational learning as a spatial strategy available to cotton-top tamarins (Saguinus oedipus oedipus). The apparatus used was an 8x8 matrix of holes set in an upright wooden board to allow for the manipulation of visual cues and hidden food items such that the spatial configuration of cues and food could be transformed (translated or rotated) with respect to the perimeter of the board. The definitive test of spatial relational learning was whether the monkeys relied upon the spatial relationship between the visual cues to locate the position of the hidden food items. In a control experiment testing for differential use of perceptual information the results showed that if given the choice, tamarins relied on visual over olfactory cues in a foraging task. Callitrichids typically depend on olfactory communication in socio-sexual contexts, so it was unusual that olfaction did not also play a significant role in foraging. In the first spatial learning experiment, the tamarins were found to rely on the three visually presented cues to locate the eleven hidden food items.
However, their performance was not very accurate. In the next experiment the task was simplified so that the types of spatial strategies the monkeys were using to solve the foraging task could be clearly identified. In this experiment, only two visual cues were presented on either end of a line of four hidden food items. Once the monkeys were trained to these cues, the cues and food were translated and/or rotated on the board. Data from the beginning and middle of each testing session were used in the final analysis: in a previous analysis it was found that the monkeys initially searched the baited holes at the beginning of a testing session and thereafter predominantly searched unbaited holes. This suggests that they followed a win-stay/lose-shift foraging strategy, a finding that is supported by other studies of tamarins in captivity (Menzel and Juno, 1982) and the wild (Garber, 1989). The results also showed that the monkeys were searching predominantly between the cues and not outside or around them, indicating that they were locating the hidden food by using the spatial relationship between the visual cues. This provides evidence for the utilisation of spatial relational learning as a foraging strategy by cotton-top tamarins and the existence of complex internal spatial representations. Further studies are suggested to test captive monkeys' spatial relational capabilities and their foraging strategies. In addition, comparative and field studies are outlined that would provide information regarding New World monkeys' spatial learning abilities, neurophysiological organisation and the evolution of complex computational processes.

    Modeling and Optimization of Micro-EDM Operation for Fabrication of Micro Holes

    Get PDF
    Based on the experimental results, an analysis was made to identify the performance of various electrodes during fabrication of micro holes, considering Inconel 718 as well as titanium as workpiece materials. It was found that platinum, followed by graphite and copper, as electrode material exhibited higher MRR for both workpiece materials, but on the other hand platinum showed higher values of overcut (OC), recast layer thickness (RCL) and taper angle (TA) when compared to graphite and copper. The variation of temperature distribution in the radial and depth directions with different process parameters has been determined for Inconel 718 and Titanium 5. Theoretical cavity volume was calculated for different process parameter settings for both workpiece materials, and it was found that Titanium 5 exhibited higher cavity volume than Inconel 718. This research work offers new insights into the performance of micro-EDM of Inconel 718 and Titanium 5 using different electrodes. The optimum process parameters have been identified to determine multi-objective machinability criteria such as MRR, angle of taper of the micro-hole, thickness of the recast layer and overcut for fabrication of micro-holes.

    A novel approach to handwritten character recognition

    Get PDF
    A number of new techniques and approaches for off-line handwritten character recognition are presented which individually make significant advancements in the field. First, an outline-based vectorization algorithm is described which gives improved accuracy in producing vector representations of the pen strokes used to draw characters. Later, vectorization and other types of preprocessing are criticized and an approach to recognition is suggested which avoids separate preprocessing stages by incorporating them into later stages. Apart from the increased speed of this approach, it allows more effective alteration of the character images since more is known about them at the later stages. It also allows the possibility of alterations being corrected if they are initially detrimental to recognition. A new feature measurement, the Radial Distance/Sector Area feature, is presented which is highly robust, tolerant to noise, distortion and style variation, and gives high accuracy results when used for training and testing in a statistical or neural classifier. A very powerful classifier is therefore obtained for recognizing correctly segmented characters. The segmentation task is explored in a simple system of integrated over-segmentation, character classification and approximate dictionary checking. This can be extended to a full system for handprinted word recognition. In addition to the advancements made by these methods, a powerful new approach to handwritten character recognition is proposed as a direction for future research. This proposal combines the ideas and techniques developed in this thesis in a hierarchical network of classifier modules to achieve context-sensitive, off-line recognition of handwritten text. A new type of "intelligent" feedback is used to direct the search to contextually sensible classifications. A powerful adaptive segmentation system is proposed which, when used as the bottom layer in the hierarchical network, allows initially incorrect segmentations to be adjusted according to the hypotheses of the higher level context modules.
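The general idea behind a radial, sector-based character feature can be sketched as follows. This is a plausible reading of such features, not the thesis's exact Radial Distance/Sector Area definition; the image and function names are hypothetical:

```python
import numpy as np

def radial_sector_features(img, n_sectors=8):
    """Sketch of a radial-distance feature: mean distance of ink pixels
    from the character centroid, binned into angular sectors and
    normalised by image size for scale tolerance."""
    ys, xs = np.nonzero(img)               # coordinates of ink pixels
    cy, cx = ys.mean(), xs.mean()          # character centroid
    dy, dx = ys - cy, xs - cx
    r = np.hypot(dx, dy)                   # radial distance of each pixel
    theta = np.arctan2(dy, dx)             # angle of each pixel
    sector = ((theta + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    feats = np.zeros(n_sectors)
    for s in range(n_sectors):
        mask = sector == s
        if mask.any():
            feats[s] = r[mask].mean()
    return feats / max(img.shape)          # normalise for scale invariance

# Hypothetical 8x8 binary image containing a vertical stroke
img = np.zeros((8, 8), dtype=int)
img[2:6, 3] = 1
print(radial_sector_features(img))
```

A fixed-length vector of this kind is what makes the feature suitable as direct input to a statistical or neural classifier.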

    Advanced Data Analytics Methodologies for Anomaly Detection in Multivariate Time Series Vehicle Operating Data

    Get PDF
    Early detection of faults in vehicle operating systems is a research domain of high significance for sustaining full control of the systems, since anomalous behaviors usually cause performance loss for a long time before they are detected as critical failures. In other words, operating systems exhibit degradation when failure begins to occur. Indeed, repeated failures in system performance are not only signals of anomalous behavior but also show that taking maintenance actions to preserve system performance is vital. Maintaining systems at nominal performance over their lifetime with the lowest maintenance cost is extremely challenging, and it is important to be aware of imminent failure before it arises and to implement the best countermeasures to avoid extra losses. In this context, timely anomaly detection in the performance of the operating system is worthy of investigation. Early detection of imminent anomalous behaviors of the operating system is difficult without appropriate modeling, prediction, and analysis of the time series records of the system. Data-based technologies have provided a strong foundation for developing advanced methods for modeling and prediction of time series data streams. In this research, we propose novel methodologies to predict the patterns of multivariate time series operational data of the vehicle and recognize the second-wise unhealthy states. These approaches help with the early detection of abnormalities in the behavior of the vehicle based on multiple data channels whose second-wise records cover different functional working groups in the operating systems of the vehicle. Furthermore, a real case study data set is used to validate the accuracy of the proposed prediction and anomaly detection methodologies.
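The prediction-then-detection pattern described above can be sketched with a minimal residual-based detector: predict each point from its recent history, then flag points whose prediction error is unusually large. The rolling-mean predictor and the temperature values are hypothetical stand-ins for the thesis's models and data:

```python
import numpy as np

def detect_anomalies(series, window=5, k=3.0):
    """Flag points whose deviation from a rolling-mean prediction exceeds
    k rolling standard deviations (a simple residual-based detector)."""
    flags = np.zeros(len(series), dtype=bool)
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std() + 1e-9
        flags[t] = abs(series[t] - mu) > k * sigma
    return flags

# Hypothetical second-wise engine-temperature channel with one spike
signal = np.array([70.0, 70.5, 69.8, 70.2, 70.1, 70.3, 95.0, 70.2, 70.0])
print(np.nonzero(detect_anomalies(signal))[0])
```

In the multivariate setting, the same residual test is applied per channel (or to a joint prediction over channels), so that second-wise unhealthy states can be localised to a functional working group.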

    A Comprehensive Survey on Enterprise Financial Risk Analysis: Problems, Methods, Spotlights and Applications

    Full text link
    Enterprise financial risk analysis aims at predicting the enterprises' future financial risk. Due to its wide application, enterprise financial risk analysis has always been a core research issue in finance. Although there are already some valuable and impressive surveys on risk management, these surveys introduce approaches in a relatively isolated way and lack the recent advances in enterprise financial risk analysis. Due to the rapid expansion of enterprise financial risk analysis, especially from the computer science and big data perspective, it is both necessary and challenging to comprehensively review the relevant studies. This survey attempts to connect and systematize the existing enterprise financial risk research, as well as to summarize and interpret the mechanisms and strategies of enterprise financial risk analysis in a comprehensive way, which may help readers gain a better understanding of the current research status and ideas. This paper provides a systematic literature review of over 300 articles published on enterprise risk analysis modelling over a 50-year period, 1968 to 2022. We first introduce the formal definition of enterprise risk as well as the related concepts. Then, we categorize the representative works in terms of risk type and summarize the three aspects of risk analysis. Finally, we compare the analysis methods used to model enterprise financial risk. Our goal is to clarify current cutting-edge research and its possible future directions for modelling enterprise risk, aiming to fully understand the mechanisms of enterprise risk communication and influence and their application to corporate governance, financial institutions and government regulation.

    Change blindness: eradication of gestalt strategies

    Get PDF
    Arrays of eight texture-defined rectangles were used as stimuli in a one-shot change blindness (CB) task where there was a 50% chance that one rectangle would change orientation between two successive presentations separated by an interval. CB was eliminated by cueing the target rectangle in the first stimulus, reduced by cueing in the interval and unaffected by cueing in the second presentation. This supports the idea that a representation was formed that persisted through the interval before being 'overwritten' by the second presentation (Landman et al., 2003, Vision Research, 43, 149–164). Another possibility is that participants used some kind of grouping or Gestalt strategy. To test this we changed the spatial position of the rectangles in the second presentation by shifting them along imaginary spokes (by ±1 degree) emanating from the central fixation point. There was no significant difference in performance between this and the standard task [F(1,4)=2.565, p=0.185]. This may suggest two things: (i) Gestalt grouping is not used as a strategy in these tasks, and (ii) further weight is given to the argument that objects may be stored in and retrieved from a pre-attentional store during this task.

    Optimization of Surface Roughness, Material Removal Rate and cutting Tool Flank Wear in Turning Using Extended Taguchi Approach

    Get PDF
    Quality and productivity play a significant role in today's manufacturing market. From the customers' viewpoint quality is very important, because the quality of the procured item (or product) influences the degree of satisfaction of the consumers during usage of the procured goods. Therefore, every manufacturing or production unit should be concerned about the quality of its products. Apart from quality, there exists another criterion, called productivity, which is directly related to the profit level and also the goodwill of the organization. Every manufacturing industry aims at producing a large number of products within relatively less time. But it is felt that reduction in manufacturing time may cause severe quality loss. In order to embrace these two conflicting criteria it is necessary to check the quality level of the item either on-line or off-line. The purpose is to check whether quality lies within the desired tolerance level that can be accepted by the customers. Quality of a product can be described by various quality attributes. The attributes may be quantitative or qualitative. In on-line quality control, a controller and related equipment are provided with the job under operation and the quality is continuously monitored. If quality falls below the expected level the controller supplies feedback in order to reset the process environment. In off-line quality control the method is either to check the quality of a few products from a batch or lot (acceptance sampling) or to evaluate the best process environment capable of producing a product of the desired quality. This invites an optimization problem which seeks identification of the best process condition or parametric combination for the said manufacturing process. If the problem is related to a single quality attribute then it is called single objective (or response) optimization.
If more than one attribute comes into consideration, it is very difficult to select an optimal setting which can achieve all quality requirements simultaneously; optimizing one quality feature may lead to severe quality loss in other quality characteristics which may not be accepted by the customers. In order to tackle such a multi-objective optimization problem, the present study applied an extended Taguchi method through a case study in straight turning of mild steel bar using an HSS tool. The study aimed at evaluating the best process environment which could simultaneously satisfy requirements of both quality as well as productivity, with special emphasis on reduction of cutting tool flank wear, because reduction in flank wear ensures increased tool life. The predicted optimal setting ensured minimization of surface roughness and height of flank wear of the cutting tool, and maximization of MRR (Material Removal Rate). Since the traditional Taguchi method cannot solve a multi-objective optimization problem, grey relational theory has been coupled with the Taguchi method to overcome this limitation. Furthermore, the basic assumption of the Taguchi method is that quality attributes should be uncorrelated or independent, but in practical cases this may not be so. To overcome this shortcoming the study applied Principal Component Analysis (PCA) to eliminate the correlation that exists between the responses and to evaluate independent or uncorrelated quality indices called Principal Components. Finally the study combined PCA, grey analysis, the utility concept and the Taguchi method for predicting the optimal setting. The optimal result was verified through a confirmatory test. This indicates the application feasibility of the aforesaid techniques for correlated multi-response optimization and off-line quality control in turning operations.
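The grey relational step that converts the three conflicting responses into a single grade can be sketched as follows. The formula is the standard grey relational coefficient with distinguishing coefficient ζ = 0.5; the turning data below are hypothetical, not from the study:

```python
import numpy as np

def grey_relational_grade(responses, larger_better, zeta=0.5):
    """Grey relational grade for multi-response optimisation.
    responses: (runs x responses) matrix; larger_better: bool per column
    (True for MRR, False for roughness and flank wear)."""
    R = np.asarray(responses, dtype=float)
    norm = np.empty_like(R)
    for j, lb in enumerate(larger_better):
        lo, hi = R[:, j].min(), R[:, j].max()
        # Normalise each response to [0, 1] with 1 = ideal
        norm[:, j] = (R[:, j] - lo) / (hi - lo) if lb else (hi - R[:, j]) / (hi - lo)
    delta = 1.0 - norm                       # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                # equal weights over responses

# Hypothetical runs: [surface roughness (µm), flank wear (mm), MRR (mm³/min)]
runs = [[1.2, 0.30, 120.0],
        [0.8, 0.25, 100.0],
        [1.5, 0.40, 150.0]]
grade = grey_relational_grade(runs, larger_better=[False, False, True])
print(grade.argmax())   # index of the run with the best overall grade
```

In the full method, PCA would first replace the correlated responses with uncorrelated principal components (with utility-based weights) before the grade is computed and the Taguchi analysis picks the optimal parameter levels.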