    Assessing productive efficiency of banks using integrated Fuzzy-DEA and bootstrapping: a case of Mozambican banks

    Performance analysis has become a vital part of management practice in the banking industry. There are numerous applications using DEA models to estimate efficiency in banking, and most of them assume that inputs and outputs are known with absolute precision. Here, we propose new Fuzzy-DEA α-level models to assess the underlying uncertainty. Further, bootstrap truncated regressions with fixed factors are used to measure the impact of each model on the efficiency scores and to identify the most relevant contextual variables for efficiency. The proposed models are demonstrated in an application to Mozambican banks. Findings reveal that fuzziness is predominant over randomness in interpreting the results. In addition, fuzziness can be used by decision-makers to identify missing variables that help in interpreting the results. Price of labor, price of capital, and market share were found to be the significant factors in measuring bank efficiency. Managerial implications are addressed.
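
    The abstract names two computational pieces: α-level (α-cut) Fuzzy-DEA models and bootstrap truncated regression. The sketch below illustrates only the first, in simplified form and not as the authors' exact formulation: an input-oriented CCR DEA linear program solved at one α-cut, where each fuzzy input and output has been reduced to an interval [lo, hi], and the optimistic efficiency bound is obtained by giving the evaluated bank its most favourable endpoints. All names and data shapes are illustrative; the regression stage is not reproduced.

```python
# A minimal sketch, assuming interval data from an alpha-cut; this is a
# standard input-oriented CCR DEA model, not the authors' exact Fuzzy-DEA
# formulation, and the bootstrap truncated-regression stage is omitted.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(x, y, o):
    """Crisp input-oriented CCR efficiency of DMU o.
    x: (n_dmus, n_inputs) inputs; y: (n_dmus, n_outputs) outputs."""
    n, m = x.shape
    s = y.shape[1]
    c = np.zeros(1 + n)        # decision vector [theta, lambda_1..lambda_n]
    c[0] = 1.0                 # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):         # sum_j lambda_j * x[j,i] <= theta * x[o,i]
        A_ub.append(np.concatenate(([-x[o, i]], x[:, i])))
        b_ub.append(0.0)
    for r in range(s):         # sum_j lambda_j * y[j,r] >= y[o,r]
        A_ub.append(np.concatenate(([0.0], -y[:, r])))
        b_ub.append(-y[o, r])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=bounds, method="highs")
    return res.x[0]            # theta* in (0, 1]

def optimistic_alpha_efficiency(x_lo, x_hi, y_lo, y_hi, o):
    """Upper efficiency bound at one alpha-cut: DMU o takes its most
    favourable interval endpoints, its peers their least favourable
    (a common interval-DEA convention, assumed here for illustration)."""
    x, y = x_hi.copy(), y_lo.copy()
    x[o], y[o] = x_lo[o], y_hi[o]
    return dea_efficiency(x, y, o)
```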

    Semantics of fuzzy quantifiers

    The aim of this thesis is to discuss the semantics of FQs (fuzzy quantifiers), formal semantics in particular. The approach used is fuzzy semantics based on fuzzy set theory (Zadeh 1965, 1975), i.e. we explore primarily the denotational meaning of FQs represented by membership functions. Some empirical data from both Chinese and English is used for illustration. A distinguishing characteristic of the semantics of FQs like about 200 students and many students, as opposed to other sorts of quantifiers like every student and no students, is that they have fuzzy meaning boundaries. There is considerable evidence to suggest that the doctrine that a proposition is either true or false has limited application in natural languages, which raises a serious question for any linguistic theory based on a binary assumption. In other words, the number of elements in a domain that must satisfy a predicate is not precisely given by an FQ, and so a proposition containing one may be more or less true depending on how closely the number of elements approximates a given norm. The most significant conclusion drawn here is that FQs are compositional, in that FQs of the same type function in the same way to generate a constant semantic pattern. It is argued that although basic membership functions are subject to modification depending on context, they vary only within certain limits (i.e. FQs are motivated: neither completely predictable nor completely arbitrary), which does not deny compositionality in any way. A distinctive combination of compositionality and motivation of FQs makes my formal semantic framework of FQs unique in the way that although some specific values, such as a norm, have to be determined pragmatically, semantic and inferential patterns are systematic and predictable. A number of interdisciplinary implications, such as semantic, general linguistic, logical and psychological ones, are discussed. The study here seems to be a somewhat troublesome but potentially important area for developing theories (and machines) capable of dealing with, and accounting for, natural languages.
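
    As a concrete illustration of fuzzy meaning boundaries, the sketch below gives toy membership functions for "about 200" and "many". The functional forms and all parameters (the norm, the spread, the thresholds) are hypothetical choices for illustration, not values from the thesis.

```python
# Toy membership functions, a sketch only: the thesis's actual functions
# and contextually determined norms are not reproduced.
def about(norm, spread):
    """Triangular membership for 'about <norm>': truth falls off linearly
    with distance from the norm, reaching 0 at +/- spread."""
    def mu(n):
        return max(0.0, 1.0 - abs(n - norm) / spread)
    return mu

def many(lo, hi):
    """Smoothstep variant of an S-shaped curve for 'many': 0 below lo,
    1 above hi, a smooth rise in between."""
    def mu(n):
        if n <= lo:
            return 0.0
        if n >= hi:
            return 1.0
        t = (n - lo) / (hi - lo)
        return 3 * t**2 - 2 * t**3
    return mu

about_200 = about(200, 30)   # norm 200, spread 30: both hypothetical
print(about_200(190))        # ~0.67: fairly true of 190 students
print(many(20, 60)(45))      # ~0.68: 'many students' partially true of 45
```

    A proposition containing an FQ then receives a degree of truth rather than a binary value, which is exactly the departure from two-valued logic the abstract describes.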

    A heuristic information retrieval study: an investigation of methods for enhanced searching of distributed data objects exploiting bidirectional relevance feedback

    A thesis submitted for the degree of Doctor of Philosophy of the University of Luton.
    The primary aim of this research is to investigate methods of improving the effectiveness of current information retrieval systems. This aim can be achieved by accomplishing numerous supporting objectives. A foundational objective is to introduce a novel bidirectional, symmetrical fuzzy logic theory which may prove valuable to information retrieval, including internet searches of distributed data objects. A further objective is to design, implement and apply the novel theory to an experimental information retrieval system called ANACALYPSE, which automatically computes the relevance of a large number of unseen documents from expert relevance feedback on a small number of documents read. A further objective is to define a methodology used in this work as an experimental information retrieval framework consisting of multiple tables, including various formulae, which allow a plethora of syntheses of similarity functions, term weights, relative term frequencies, document weights, bidirectional relevance feedback and history-adjusted term weights. The evaluation of bidirectional relevance feedback reveals a better correspondence between system ranking of documents and users' preferences than feedback-free system ranking. The assessment of similarity functions reveals that the Cosine and Jaccard functions perform significantly better than the DotProduct and Overlap functions. The evaluation of history tracking of the documents visited from a root page reveals better system ranking of documents than tracking-free information retrieval. The assessment of stemming reveals that system information retrieval performance remains unaffected, while stop-word removal does not appear to be beneficial and can sometimes be harmful. The overall evaluation of the experimental system, in comparison to a leading-edge commercial information retrieval system and to the expert's gold standard of judged relevance according to established statistical correlation methods, reveals enhanced information retrieval effectiveness.
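
    The abstract reports that Cosine and Jaccard outperformed DotProduct and Overlap. The sketch below implements standard textbook forms of those four similarity functions over sparse term-weight vectors; the thesis's own formula tables, weighting schemes and the bidirectional feedback machinery are not reproduced, and the dictionary representation is an assumption.

```python
# A minimal sketch of four standard similarity functions over sparse
# term-weight vectors (dict: term -> weight); the exact formulae used in
# ANACALYPSE may differ.
import math

def dot_product(q, d):
    return sum(w * d.get(t, 0.0) for t, w in q.items())

def cosine(q, d):
    nq = math.sqrt(sum(w * w for w in q.values()))
    nd = math.sqrt(sum(w * w for w in d.values()))
    return dot_product(q, d) / (nq * nd) if nq and nd else 0.0

def jaccard(q, d):
    """Extended (Tanimoto) Jaccard coefficient for weighted vectors."""
    dot = dot_product(q, d)
    denom = (sum(w * w for w in q.values())
             + sum(w * w for w in d.values()) - dot)
    return dot / denom if denom else 0.0

def overlap(q, d):
    dot = dot_product(q, d)
    denom = min(sum(w * w for w in q.values()),
                sum(w * w for w in d.values()))
    return dot / denom if denom else 0.0

q = {"fuzzy": 1.0, "retrieval": 0.7}
d = {"fuzzy": 0.8, "logic": 0.4, "retrieval": 0.5}
print(cosine(q, d), jaccard(q, d), overlap(q, d))
```

    Note that Cosine and Jaccard both normalise by the magnitudes of the two vectors, whereas DotProduct does not normalise at all and Overlap normalises only by the smaller vector, which is one plausible reason for the reported performance gap.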

    Death of Paradox: The Killer Logic Beneath the Standards of Proof

    The prevailing but contested view of proof standards is that factfinders should determine facts by probabilistic reasoning. Given imperfect evidence, they should ask themselves what they think the chances are that the burdened party would be right if the truth were to become known; they then compare those chances to the applicable standard of proof. I contend that for understanding the standards of proof, the modern versions of logic — in particular, fuzzy logic and belief functions — work better than classical probability. This modern logic suggests that factfinders view evidence of an imprecisely perceived and described reality to form a fuzzy degree of belief in a fact’s existence; they then apply the standard of proof in accordance with the theory of belief functions, by comparing their belief in a fact’s existence to their belief in its negation. This understanding explains how the standard of proof actually works in the law world. It gives a superior mental image of the factfinders’ task, conforms more closely to what we know of people’s cognition, and captures better what the law says its standards are and how it manipulates them. One virtue of this conceptualization is that it is not a radically new view. Another virtue is that it nevertheless manages to resolve some stubborn problems of proof, including the infamous conjunction paradox.
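
    Since the key move is comparing belief in a fact's existence with belief in its negation, rather than requiring the two to sum to one, a toy numeric sketch may help. The comparison rules and thresholds below are hypothetical illustrations, not the article's formal proposals.

```python
# A toy sketch of deciding by comparing belief in a fact with belief in its
# negation, leaving the remainder uncommitted (Shafer-style belief
# functions). The numeric thresholds for the higher standards are
# hypothetical stand-ins for the legal standards named.
def meets_standard(bel, bel_neg, standard):
    assert 0.0 <= bel and 0.0 <= bel_neg and bel + bel_neg <= 1.0
    if standard == "preponderance":
        return bel > bel_neg           # belief need only exceed disbelief
    if standard == "clear_and_convincing":
        return bel > 2.0 * bel_neg     # hypothetical strengthening
    if standard == "beyond_reasonable_doubt":
        return bel > 0.9 and bel_neg < 0.05  # hypothetical thresholds
    raise ValueError(f"unknown standard: {standard}")

# Preponderance can be met with much belief left uncommitted:
print(meets_standard(0.40, 0.25, "preponderance"))  # True; 0.35 uncommitted
```

    On this picture, conjunction is typically evaluated with the minimum of the conjuncts' degrees rather than their product, which is one reason the conjunction paradox loses its force.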

    Experimental analysis of fuzzy economic optimization

    Intelligent Pattern Analysis of the Foetal Electrocardiogram

    The aim of the project on which this thesis is based is to develop reliable techniques for foetal electrocardiogram (ECG) based monitoring, to reduce incidents of unnecessary medical intervention and foetal injury during labour. Worldwide, electronic foetal monitoring is based almost entirely on the cardiotocogram (CTG), a continuous display of the foetal heart rate (FHR) pattern together with the contractions of the womb. Despite the widespread use of the CTG, there has been no significant improvement in foetal outcome. In the UK alone it is estimated that birth-related negligence claims cost the health authorities over £400M per annum. An expert system, known as INFANT, has recently been developed to assist CTG interpretation. However, the CTG alone does not always provide all the information required to improve the outcome of labour. The widespread use of ECG analysis has been hindered by poor signal quality and by the difficulty of applying the specialised knowledge required for interpreting ECG patterns, in association with other events in labour, in an objective way. A fundamental investigation and development of optimal signal enhancement techniques that maximise the available information in the ECG signal, along with different techniques for detecting individual waveforms from poor-quality signals, has been carried out. To automate the visual interpretation of the ECG waveform, novel techniques have been developed that allow reliable extraction of key features and hence a detailed ECG waveform analysis. Fuzzy logic is used to classify the ECG waveform shape automatically from these features, using knowledge elicited from expert sources and derived from example data. This allows subtle changes in the ECG waveform to be detected automatically in relation to other events in labour, improving the clinician's position for making an accurate diagnosis. To ensure the interpretation is based on reliable information and takes place in the proper context, a new and sensitive index for assessing the quality of the ECG has been developed. New techniques to capture, for the first time in machine form, the clinical expertise and guidelines for electronic foetal monitoring have been developed based on fuzzy logic and finite state machines. The software model provides a flexible framework to further develop and optimise rules for ECG pattern analysis. The signal enhancement, QRS detection and pattern recognition of important ECG waveform shapes have been tested extensively and results are presented. Results show that no significant loss of information is incurred as a result of the signal enhancement and feature extraction techniques.
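
    The fuzzy classification step can be sketched generically. The example below classifies a single hypothetical waveform feature (a T/QRS amplitude ratio) into shape classes with triangular membership functions; the thesis's actual features, elicited rules and class labels are not reproduced, so every name and number here is illustrative.

```python
# An illustrative sketch only: fuzzy classification of one hypothetical ECG
# feature into shape classes via triangular membership functions. The real
# system combines many features with expert-elicited rules.
def tri(a, b, c):
    """Triangular membership function rising from a to b, falling to c."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# Hypothetical linguistic terms for the T/QRS ratio.
LOW = tri(-0.1, 0.0, 0.1)
NORMAL = tri(0.05, 0.15, 0.3)
HIGH = tri(0.2, 0.5, 1.0)

def classify(t_qrs_ratio):
    """Return the best-matching shape class and all membership degrees."""
    degrees = {"normal": NORMAL(t_qrs_ratio),
               "elevated": HIGH(t_qrs_ratio),
               "depressed": LOW(t_qrs_ratio)}
    return max(degrees, key=degrees.get), degrees

print(classify(0.27))  # ('elevated', {...}): partial membership in two classes
```

    The graded degrees, rather than a hard class boundary, are what let subtle waveform changes register gradually as labour progresses.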

    Geometric Fuzzy Logic Systems

    There has recently been a significant increase in academic interest in the field of type-2 fuzzy sets and systems. Type-2 fuzzy systems offer the ability to model and reason with uncertain concepts. When faced with uncertainties, type-2 fuzzy systems should, theoretically, give an increase in performance over type-1 fuzzy systems. However, the computational complexity of generalised type-2 fuzzy systems is significantly higher than that of type-1 systems. A direct consequence of this is that, prior to this thesis, generalised type-2 fuzzy logic had not been applied in a time-critical domain, such as control. Control applications are the main application area of type-1 fuzzy systems, with the literature reporting many successes in this area. Clearly the computational complexity of type-2 fuzzy logic is holding the field back. This restriction on the development of type-2 fuzzy systems is tackled in this research. This thesis presents the novel approach of defining fuzzy sets as geometric objects - geometric fuzzy sets. The logical operations for geometric fuzzy sets are defined as geometric manipulations of these sets. This novel geometric approach is applied to type-1, type-2 interval and generalised type-2 fuzzy sets and systems. The major contribution of this research is the reduction in the computational complexity of type-2 fuzzy logic that results from the application of the geometric approach. This reduction in computational complexity is so substantial that generalised type-2 fuzzy logic has, for the first time, been successfully applied to a control problem - mobile robot navigation. A detailed comparison between the performance of the generalised type-2 fuzzy controller and the performance of the type-1 and type-2 interval controllers is given. The results indicate that the generalised type-2 fuzzy logic controller outperforms the other robot controllers. This outcome suggests that generalised type-2 fuzzy systems can offer improved performance over type-1 and type-2 interval systems.
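
    A heavily simplified sketch of the geometric idea for the type-1 case: treat a membership function as a polygon over a sampled domain and realise fuzzy AND as a geometric (pointwise-minimum) operation on two such polygons. The thesis's actual algorithms for type-2 interval and generalised type-2 sets are far more involved; the functions and parameters below are illustrative assumptions.

```python
# A much-simplified sketch: type-1 fuzzy sets as piecewise-linear polygons
# over a sampled domain, with logical AND realised as a pointwise minimum
# of the polygons' upper edges. Not the thesis's type-2 algorithms.
import numpy as np

def tri_mf(domain, a, b, c):
    """Triangular membership polygon sampled over `domain`."""
    left = np.clip((domain - a) / (b - a), 0.0, 1.0)
    right = np.clip((c - domain) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

domain = np.linspace(0.0, 10.0, 101)
near_3 = tri_mf(domain, 1.0, 3.0, 5.0)      # hypothetical set "near 3"
near_4 = tri_mf(domain, 2.0, 4.0, 6.0)      # hypothetical set "near 4"
both = np.minimum(near_3, near_4)           # geometric intersection (AND)
print(domain[both.argmax()], both.max())    # peak of the intersection: 3.5, 0.75
```

    Working directly on the sampled geometry keeps the cost of each logical operation linear in the number of sample points, which gives a flavour of how a geometric representation can cut computational complexity.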