
    Fuzzy stability analysis of regenerative chatter in milling

    During machining, unstable self-excited vibrations known as regenerative chatter can occur, causing excessive tool wear or failure and a poor surface finish on the machined workpiece. Consequently, it is desirable to predict, and hence avoid, the onset of this instability. Regenerative chatter is a function of empirical cutting coefficients and the structural dynamics of the machine-tool system. There can be significant uncertainties in the underlying parameters, so the predicted stability limits do not necessarily agree with those found in practice. In the present study, fuzzy arithmetic techniques are applied to the chatter stability problem. It is first shown that techniques based upon interval arithmetic are not suitable for this problem owing to the issue of recursiveness. An implementation of fuzzy arithmetic is then developed based upon the work of Hanss and Klimke. The arithmetic is applied to two techniques for predicting milling chatter stability: the classical approach of Altintas and the time-finite-element method of Mann. It is shown that in some cases careful programming can reduce the computational effort to acceptable levels. The problem of milling chatter uncertainty is then considered within the framework of Ben-Haim's information-gap theory. It is shown that the presented approach can be used to solve process design problems with robustness to the uncertain parameters. Finally, the fuzzy stability bounds are compared with previously published data to investigate how uncertainty propagation techniques can offer more insight into the accuracy of chatter predictions.
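The "recursiveness" objection to plain interval arithmetic can be made concrete with a minimal sketch (values and names here are illustrative, not taken from the paper): when the same uncertain quantity appears more than once in an expression, naive interval evaluation treats each occurrence as independent and inflates the result.

```python
# Minimal sketch of the interval "dependency" (recursiveness) problem that
# motivates moving beyond plain interval arithmetic for chatter stability.

def interval_sub(a, b):
    """Subtract interval b from interval a: [a_lo - b_hi, a_hi - b_lo]."""
    return (a[0] - b[1], a[1] - b[0])

def interval_mul(a, b):
    """Multiply two intervals: min/max over all endpoint products."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# A hypothetical uncertain cutting coefficient lying in [0.9, 1.1].
k = (0.9, 1.1)

# Naive interval evaluation of k - k: the two occurrences of k are treated
# as independent, so the true result {0} inflates to roughly [-0.2, 0.2].
print(interval_sub(k, k))
```

Because chatter stability expressions reuse the same cutting coefficients and modal parameters many times, this overestimation compounds, which is why the study turns to a fuzzy-arithmetic implementation instead.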

    Algorithm of arithmetical operations with fuzzy numerical data

    In this article, a theoretical generalization of the representation of arithmetic operations with fuzzy numbers is considered. Fuzzy numbers are generalized by means of fuzzy measures. On the basis of this generalization, a new fuzzy-arithmetic algorithm that uses the principle of maximum entropy is developed. As an example, the summation of two fuzzy numbers is considered. The algorithm is implemented in the software "Fuzzy for Microsoft Excel".
    Keywords: fuzzy measure (Sugeno); fuzzy integral (Sugeno); fuzzy numbers; arithmetical operations; principle of entropy maximum
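For context, the conventional alpha-cut approach to fuzzy-number summation (which the article's entropy-maximum algorithm generalizes) can be sketched as follows; the entropy-based construction itself is not reproduced here, and the triangular numbers are purely illustrative.

```python
# Conventional alpha-cut addition of triangular fuzzy numbers (a, m, b):
# at each membership level alpha, the fuzzy number reduces to an interval,
# and the sum is ordinary interval addition level by level.

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_add(x, y, levels=(0.0, 0.5, 1.0)):
    """Sum two triangular fuzzy numbers by adding their alpha-cut intervals."""
    result = {}
    for alpha in levels:
        xl, xu = alpha_cut(x, alpha)
        yl, yu = alpha_cut(y, alpha)
        result[alpha] = (xl + yl, xu + yu)
    return result

# "about 2" + "about 3" -> "about 5", with support [3, 7] and core {5}.
print(fuzzy_add((1, 2, 3), (2, 3, 4)))
```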

    An improved artificial dendrite cell algorithm for abnormal signal detection

    In the dendrite cell algorithm (DCA), the abnormality of a data point is determined by comparing the multi-context antigen value (MCAV) with an anomaly threshold. The limitation of the existing threshold is that its value must be determined before mining, based on previous information, and the existing MCAV is inefficient when exposed to extreme values. This causes the DCA to fail to detect new data points whose pattern differs from previous information, and it degrades detection accuracy. This paper proposes an improved anomaly-threshold solution for the DCA using the statistical cumulative sum (CUSUM), with the aim of improving its detection capability. In the proposed approach, the MCAV values are normalized with the upper CUSUM, and the new anomaly threshold is calculated at run time by considering the acceptance value and the minimum MCAV. In experiments on 12 benchmark datasets and two outbreak datasets, the improved DCA is shown to achieve better detection results than its previous version in terms of sensitivity, specificity, false detection rate and accuracy.
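The upper CUSUM statistic underlying the proposed normalization can be sketched as below; the parameter names and values are assumptions for illustration, not taken from the paper.

```python
# Illustrative upper CUSUM, the statistic the improved DCA uses to normalise
# MCAV values: it accumulates only upward drift beyond a target plus slack,
# so a sustained rise stands out while isolated extreme values decay.

def upper_cusum(values, target, slack):
    """Running upper CUSUM: S_i = max(0, S_{i-1} + x_i - target - slack)."""
    s, out = 0.0, []
    for x in values:
        s = max(0.0, s + x - target - slack)
        out.append(s)
    return out

# Hypothetical MCAV stream: the statistic stays at zero until the values
# rise above target + slack, then accumulates.
mcav = [0.1, 0.1, 0.2, 0.9, 0.8, 0.1]
print(upper_cusum(mcav, target=0.2, slack=0.1))
```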

    A knowledge based system for valuing variations in civil engineering works: a user centred approach

    There is much evidence that valuing variations in construction projects can lead to conflicts and disputes, resulting in loss of time, efficiency, and productivity. One of the reasons for these conflicts and disputes is the subjectivity of the project stakeholders involved in the process. One way to minimise this is to capture and collate the knowledge and perceptions of the different parties involved in order to develop a robust mechanism for valuing variations. Focusing on such a mechanism, the development of a Knowledge Based System (KBS) for valuing variations in civil engineering works is described. Evaluation of the KBS involved demonstrating it to practitioners in the construction industry to validate the contents of the knowledge base and the perceived usability and acceptance of the system. The results support the novelty, contents, usability, and acceptance of the system, and also identify potential further developments of the KBS.

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As far as possible, for each reference an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added; their purpose is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily country of publication) and the language of the document. After a description of the scope of the review, the classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstracts, classifications and comments.
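For readers unfamiliar with the artefact being surveyed, a decision table simply maps combinations of condition outcomes to actions; the tiny example below (with invented conditions and actions) shows the structure in its most common programmatic form.

```python
# A minimal decision table: each combination of condition outcomes indexes
# exactly one action. Conditions and actions here are purely illustrative.

decision_table = {
    # (customer_is_member, order_over_100): action to take
    (True,  True):  "apply 15% discount",
    (True,  False): "apply 5% discount",
    (False, True):  "apply 10% discount",
    (False, False): "no discount",
}

def decide(is_member, over_100):
    """Look up the action for a given combination of conditions."""
    return decision_table[(is_member, over_100)]

print(decide(True, False))  # "apply 5% discount"
```

Because every combination of conditions appears exactly once, such tables make completeness and consistency easy to check mechanically, which is one reason they recur across the application areas classified in the survey.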

    Consolidity: Mystery of inner property of systems uncovered

    This paper uncovers the mystery of consolidity, an inner property of systems that has remained hidden. Consolidity also reveals why strongly stable and highly controllable systems are not invulnerable to falling and collapsing. Consolidity is measured by its Consolidity Index, defined as the ratio of the overall change of the output parameters to the combined change of the input and system parameters, all operating in a fully fuzzy environment. Under this notion, systems are classified into consolidated, quasi-consolidated, neutrally consolidated, unconsolidated, quasi-unconsolidated and mixed types. The strategy for implementing consolidity is elaborated for both natural and man-made existing systems as well as newly developed ones. An important critique arises: the by-product consolidity of a natural or built-as-usual system could trap such systems in a completely undesired unconsolidated state. This suggests that the ample number of conventional techniques that do not take system consolidity into account should gradually be changed and adjusted towards improved consolidity-based techniques. Four Golden Rules are highlighted for handling system consolidity and applied to several illustrative case studies. These case studies cover the consolidity analysis of the Drug Concentration problem, the Predator-Prey Population problem, the Spread of Infectious Disease problem, the AIDS Epidemic problem and the Arms Race model. It is demonstrated that consolidity changes are contrary (opposite in sign) to changes in both stability and controllability. This is a very significant result: our present practice of stressing the building of strongly stable and highly controllable systems may already have jeopardized the consolidity behavior of an ample family of existing real-life systems. It is strongly recommended that the four Golden Rules of consolidity be enforced as future strict regulations for the modeling, analysis, design and building of systems across different scientific disciplines. With the mystery of consolidity uncovered, the door is now wide open to launching a new generation of systems with superior consolidity in various sciences and disciplines. Examples of these disciplines are basic sciences, evolutionary systems, engineering, astronautics, astronomy, biology, ecology, medicine, pharmacology, economics, finance, commerce, political and management sciences, humanities, social sciences, literature, psychology, philosophy, mass communication, and education.
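The Consolidity Index definition quoted in the abstract (overall output change over combined input and system change) can be illustrated with a crisp stand-in; the paper computes it in a fully fuzzy environment, so this toy version is purely illustrative and the numbers are invented.

```python
# Crisp sketch of the Consolidity Index from the abstract:
# |sum of output changes| / (|input changes| + |system changes|).
# An index below 1 indicates a consolidated system (outputs move less than
# the perturbations); above 1 indicates an unconsolidated one.

def consolidity_index(output_changes, input_changes, system_changes):
    """Ratio of overall output change to combined input + system change."""
    num = sum(abs(c) for c in output_changes)
    den = sum(abs(c) for c in input_changes) + sum(abs(c) for c in system_changes)
    return num / den

# Hypothetical relative parameter changes for some system under study.
idx = consolidity_index(output_changes=[0.02, 0.01],
                        input_changes=[0.05],
                        system_changes=[0.05])
print(idx)  # ~0.3 -> consolidated under this classification
```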

    A review on analysis and synthesis of nonlinear stochastic systems with randomly occurring incomplete information

    Copyright © 2012 Hongli Dong et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
    In the context of systems and control, incomplete information refers to a dynamical system in which knowledge about the system states is limited owing to the difficulty of modeling complexity quantitatively. Well-known types of incomplete information include parameter uncertainties and norm-bounded nonlinearities. Recently, in response to the development of network technologies, the phenomenon of randomly occurring incomplete information has become increasingly prevalent. Such a phenomenon typically appears in a networked environment. Examples include, but are not limited to, randomly occurring uncertainties, randomly occurring nonlinearities, randomly occurring saturation, randomly missing measurements and randomly occurring quantization. Randomly occurring incomplete information, if not properly handled, can seriously degrade the performance of a control system. In this paper, we survey recent advances on the analysis and synthesis problems for nonlinear stochastic systems with randomly occurring incomplete information. Developments in the filtering, control and fault detection problems are systematically reviewed. The latest results on the analysis and synthesis of nonlinear stochastic systems are discussed in detail. In addition, various distributed filtering technologies over sensor networks are highlighted. Finally, some concluding remarks are given and possible future research directions are pointed out.
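A common way this literature models randomly missing measurements is to gate the output equation with a Bernoulli random variable, e.g. y_k = γ_k·C·x_k + v_k, where γ_k = 1 when the measurement arrives. The sketch below illustrates that model; the specific parameter values are assumptions for illustration.

```python
# Sketch of the Bernoulli-gated measurement model for randomly missing
# measurements: gamma_k = 1 with probability p_arrive, 0 otherwise.
import random

def measure(x, C=1.0, p_arrive=0.8, noise_std=0.01, rng=random):
    """Return (y_k, gamma_k) for scalar state x under random data dropout."""
    gamma = 1 if rng.random() < p_arrive else 0   # 1 = measurement received
    v = rng.gauss(0.0, noise_std)                  # additive measurement noise
    return gamma * C * x + v, gamma

random.seed(0)
samples = [measure(1.0) for _ in range(1000)]
arrival_rate = sum(g for _, g in samples) / len(samples)
print(f"empirical arrival rate ~ {arrival_rate:.2f}")  # close to p_arrive = 0.8
```

Randomly occurring nonlinearities, saturation and quantization are modelled with the same device: a Bernoulli (or Markov) switch multiplying the phenomenon in question.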

    Affine arithmetic-based methodology for energy hub operation-scheduling in the presence of data uncertainty

    In this study, the role of self-validated computing in solving the energy hub scheduling problem in the presence of multiple, heterogeneous sources of data uncertainty is explored, and a new solution paradigm based on affine arithmetic is conceptualised. The benefits deriving from the application of this methodology are analysed in detail, and several numerical results are presented and discussed.
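The key advantage of affine arithmetic as a self-validated computing model is that it tracks correlations between uncertain quantities through shared noise symbols, so correlated terms cancel instead of inflating the bounds. A minimal sketch (with an invented load figure; a real energy-hub scheduler would also need multiplication and division of affine forms):

```python
# Minimal affine-arithmetic sketch: a quantity is a centre value plus a sum
# of partial deviations, each tied to a named noise symbol in [-1, 1].
# Shared symbols cancel under subtraction, unlike plain intervals.

class Affine:
    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})   # noise symbol -> partial deviation

    def __add__(self, other):
        terms = dict(self.terms)
        for sym, dev in other.terms.items():
            terms[sym] = terms.get(sym, 0.0) + dev
        return Affine(self.center + other.center, terms)

    def __sub__(self, other):
        terms = dict(self.terms)
        for sym, dev in other.terms.items():
            terms[sym] = terms.get(sym, 0.0) - dev
        return Affine(self.center - other.center, terms)

    def interval(self):
        """Enclosing interval: centre +/- sum of absolute deviations."""
        radius = sum(abs(d) for d in self.terms.values())
        return (self.center - radius, self.center + radius)

# Hypothetical uncertain electrical load: 100 kW +/- 10 kW on symbol "e1".
load = Affine(100.0, {"e1": 10.0})
print((load - load).interval())   # (0.0, 0.0): the shared symbol cancels
print((load + load).interval())   # (180.0, 220.0)
```

Interval arithmetic would instead give [-20, 20] for `load - load`, which is exactly the overestimation that makes self-validated scheduling bounds overly conservative.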