10 research outputs found

    Rough set approach for categorical data clustering

    Several rough set techniques exist for clustering categorical data by grouping objects with similar characteristics, but their performance suffers from low accuracy, high computational complexity and poor cluster purity. This work proposes a new technique, Maximum Dependency Attributes (MDA), to address these issues. The technique is based on rough set theory and exploits the dependency of attributes in an information system. Its main contribution is a method for classifying objects in categorical datasets that performs better than the baseline techniques. The algorithm was implemented in MATLAB® version 7.6.0.324 (R2008a) and executed sequentially on an Intel Core 2 Duo processor with 1 Gigabyte of main memory under Windows XP Professional SP3. Results of experiments on four small datasets and thirteen UCI benchmark datasets for selecting a clustering attribute show that the proposed MDA technique is more efficient than the BC, TR and MMR techniques in terms of accuracy and computational complexity. For cluster purity, the results on the Soybean and Zoo datasets show that MDA improves purity by up to 17% and 9%, respectively. An experiment on supply chain management clustering also demonstrates how MDA can contribute to a practical system, improving computational complexity and cluster purity by up to 90% and 23%, respectively.
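    The attribute-dependency idea behind MDA can be illustrated with a small rough-set sketch. The snippet below (in Python rather than the paper's MATLAB) computes the degree to which one categorical attribute depends on another via indiscernibility classes, then ranks attributes by summed pairwise dependency; the toy table and the summing rule are illustrative assumptions, not necessarily the paper's exact selection criterion.

```python
from collections import defaultdict

def partition(table, attrs):
    """Equivalence classes of the indiscernibility relation IND(attrs)."""
    blocks = defaultdict(set)
    for i, row in enumerate(table):
        blocks[tuple(row[a] for a in attrs)].add(i)
    return list(blocks.values())

def dependency(table, b, d):
    """Dependency degree of attribute d on attribute b: the fraction of objects
    whose b-class is wholly contained in some d-class (the positive region)."""
    d_blocks = partition(table, [d])
    pos = sum(len(blk) for blk in partition(table, [b])
              if any(blk <= dblk for dblk in d_blocks))
    return pos / len(table)

# Illustrative categorical table: 4 objects, attributes indexed 0..2.
U = [("a", "x", "p"), ("a", "x", "q"), ("b", "y", "q"), ("b", "y", "q")]
# MDA-style choice (assumed aggregation): pick the attribute on which the others depend most.
scores = {a: sum(dependency(U, b, a) for b in range(3) if b != a) for a in range(3)}
print(max(scores, key=scores.get), scores)
```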

    I believe it's possible it might be so.... : Exploiting Lexical Clues for the Automatic Generation of Evidentiality Weights for Information Extracted from English Text

    Information formulated in natural language is being created at an incredible pace, far more quickly than we can make sense of it. Computer algorithms for various kinds of text analytics have therefore been developed to try to find nuggets of new, pertinent and useful information. However, information extracted from text is not always credible or reliable; often buried in sentences are lexical and grammatical structures that indicate the uncertainty of the proposition. Such clues include hedges such as modal adverbs and adjectives, as well as hearsay markers, indicators of inference or belief ("mindsay"), and verb forms identifying future actions which may not take place. In this thesis, we demonstrate how these lexical and grammatical markers of uncertainty can be automatically analyzed to assign an evidential weight to a proposition, which can then be used to assess the credibility of information extracted from English text.
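    As an illustration of the general idea only (not the thesis's actual model), the sketch below discounts a proposition's evidential weight once per class of hedge cue found in a sentence; the cue lexicon, the discount factors and the multiplicative combination are all invented for this example.

```python
import re

# Illustrative cue lexicon: each hedge class maps to a discount factor in (0, 1].
# The classes follow the abstract (modal adverbs/adjectives, hearsay, "mindsay",
# future forms); the specific words and weights are assumptions for this sketch.
CUES = {
    r"\b(possibly|perhaps|probably|likely|maybe)\b": 0.7,   # modal adverbs/adjectives
    r"\b(reportedly|allegedly|according to)\b": 0.6,        # hearsay markers
    r"\b(believe[sd]?|think[s]?|suspect[s]?)\b": 0.65,      # inference / "mindsay"
    r"\b(will|plans? to|intends? to)\b": 0.8,               # future actions that may not occur
}

def evidential_weight(sentence: str) -> float:
    """Start from full confidence and apply one discount per matched cue class."""
    weight = 1.0
    for pattern, factor in CUES.items():
        if re.search(pattern, sentence, flags=re.IGNORECASE):
            weight *= factor
    return weight

print(evidential_weight("I believe it's possible it might be so."))   # discounted
print(evidential_weight("The bridge was completed in 1932."))         # full weight 1.0
```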

    GA approach for finding Rough Set decision rules based on bireducts

    Feature selection plays an important role in knowledge discovery and data mining. In traditional rough set theory, feature selection using a reduct - a minimal discerning set of attributes - is an important area. Nevertheless, the original definition of a reduct is restrictive, so previous research proposed taking into account not only the horizontal reduction of information by feature selection, but also a vertical reduction that considers suitable subsets of the original set of objects. Following that work, a new approach to generating bireducts using a multi-objective genetic algorithm is proposed. Although genetic algorithms have been used to compute reducts in earlier works, we did not find any work in which they were adopted to compute bireducts. Compared to prior work in this area, the proposed method has less randomness in generating bireducts. The genetic algorithm estimates the quality of each bireduct by the values of two objective functions as evolution progresses, so a set of bireducts with optimized values of these objectives is obtained. Different fitness evaluation methods and genetic operators, such as crossover and mutation, were applied and the prediction accuracies were compared. Five datasets were used to test the proposed method and two datasets were used for a comparison study. Statistical analysis using the one-way ANOVA test was performed to determine the significance of differences between the results. The experiments showed that the proposed method is able to reduce the number of bireducts needed to achieve good prediction accuracy. The influence of different genetic operators and fitness evaluation strategies on prediction accuracy was also analyzed. The prediction accuracies of the proposed method are comparable with the best results reported in the machine learning literature, and some of them outperform those results.
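    To make the two-objective view of bireducts concrete, the sketch below checks whether a candidate pair (attribute subset, object subset) is consistent and scores it with two competing objectives a multi-objective GA could optimise: few attributes and many covered objects. The encoding, the penalty for inconsistent candidates and the toy decision table are assumptions for illustration, not the specific method proposed in the paper.

```python
from itertools import combinations

def is_consistent(table, decisions, attrs, objects):
    """A candidate (attrs, objects) is consistent if every pair of objects in
    `objects` with different decisions is discerned by some attribute in `attrs`."""
    for i, j in combinations(sorted(objects), 2):
        if decisions[i] != decisions[j] and all(table[i][a] == table[j][a] for a in attrs):
            return False
    return True

def objectives(table, decisions, attrs, objects):
    """Two objectives (both minimised): attribute-subset size and negated object coverage.
    Inconsistent candidates get a penalty so that valid bireducts dominate them."""
    if not is_consistent(table, decisions, attrs, objects):
        return (len(table[0]) + 1, 0)
    return (len(attrs), -len(objects))

# Illustrative decision table: 4 objects, 3 condition attributes, 1 decision column.
table = [("a", "x", "p"), ("a", "y", "p"), ("b", "x", "q"), ("b", "y", "q")]
decisions = ["yes", "yes", "no", "no"]
print(objectives(table, decisions, attrs={0}, objects={0, 1, 2, 3}))   # (1, -4)
```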

    The Hunt for Lost Blood: Nazi Germanization Policy in Occupied Europe

    Throughout the Second World War, the National Socialist regime enacted a wide-ranging campaign to enhance the German nation by assimilating conquered populations into its demographic structure. At the axis of this multifaceted enterprise stood the Re-Germanization Procedure, or WED – a special program designed to absorb “racially valuable” foreigners into the German body politic by sending them to live with host families in the very heart of the Third Reich. The following dissertation provides the first ever study of the Re-Germanization Procedure and examines the momentous influence this initiative exerted over Nazi policy-making in occupied Europe. It is a story of the nexus between popular opinion on the home front and imperialism abroad, a fresh inquiry into the dynamics of German rule and their basis in the experiences of ordinary human beings, a kaleidoscopic portrait detailing a signature aspect of the National Socialist era that has largely eluded the scrutiny of historical analysis. The WED created a space where German and non-German civilians could articulate their understandings of race, community, and national belonging from within the settings of everyday life. Drawing on methodological tools from the fields of critical race studies and postcolonial theory, my research probes the extraordinary degree to which their interactions with state actors, and with each other, helped shape the classification of indigenous peoples across the length and breadth of Hitler’s empire – a place where identity politics often meant the difference between life and death. By situating this process within a global context of nation-building and colonialism, my project reveals an unfamiliar side of an infamous epoch in order to show how, under the wartime Third Reich, discourses of race came to function not just as an impetus for genocidal violence, but as a transformative framework of inclusion

    Impact and key factors of introducing innovative mining equipment: the case of an underground mine

    Mining companies operate in a cyclical economic environment driven by market prices, compounded by increasing social pressure regarding working conditions and worker safety; companies in this sector therefore face a highly competitive context. To remain competitive, one of the solutions they favour is the acquisition of innovative equipment. However, the introduction of innovative equipment does not happen without difficulty. Several studies have shown that the arrival of new equipment that is bigger, more powerful and more sophisticated has also had negative effects, among them longer-than-expected adaptation periods. Moreover, this equipment is implicated in a good number of accidents and fatalities, both internationally and in Quebec mines. Faced with this record of mixed success, it appears essential to better understand the success factors involved in implementing innovative mining equipment. In this thesis we propose an in-depth study of this subject through a case study carried out in an underground gold mine in the Abitibi-Témiscamingue region. First, our approach measures the impact of ten innovative projects on performance indicators for productivity and occupational health and safety (OHS). Second, we propose the use of a decision-support tool, the dominance-based rough set approach, to identify the key factors favouring the implementation of this innovative equipment. Among the results obtained, two factors were identified as the most relevant across all the performance indicators studied: the level of skill required to master the technology and the level of acceptance of the technology by the operators. In addition to these two factors, seat quality and operator experience were also identified as relevant for explaining the OHS results, while the level of standardization of the new equipment proved relevant for explaining the productivity results. Our work thus enables our industrial partner to target and prioritize its needs so that the implementation of innovative equipment henceforth leads to improved productivity and OHS performance. Although our results come from, and are limited to, a single case study, the innovative and rigorous approach we propose to the scientific and industrial community can be applied by any underground mining company wishing to identify its own success factors within its own environment. Other limitations and perspectives offer potential avenues of research, on which our thesis concludes. In this regard, we propose additional performance indicators, such as the number of tonnes hauled by trucks and the injury severity rate. In addition, a similar study that also takes into account accidents involving the employees of mining contractors, as well as accidents occurring during repair or maintenance, would add complementary and interesting knowledge on the subject developed in this thesis.

    Rehabilitation of old buildings according to the EnerPHit requirements

    Master's thesis in Civil Engineering. This thesis addresses the application of the Passive House concept to an old building from the early twentieth century in the Polish climate, focusing on the city of Jarocin. All the work is based on the EnerPHit requirements for building retrofits (certified thermal modernization with approved quality, using quality-approved components for passive construction - EnerPHit). The aim of this study is to find solutions to the problem of achieving a low heating demand for an old building in a colder climate, in accordance with the EnerPHit requirements. The study begins with an introduction to Passive House concepts for new and retrofitted buildings, followed by examples of construction solutions and materials and a comparison of their thermal performance. The software "Passive House Planning Package" was adopted for the thermal balance calculation. In summary, this study presents the Passive House concept for building retrofitting applied to a historic building located in central Poland, and concludes that the requirements of this standard can be met.
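    As a rough illustration of the kind of annual heat balance the "Passive House Planning Package" performs, the sketch below subtracts utilised solar and internal gains from transmission and ventilation losses and compares the specific demand with 25 kWh/(m²a), the commonly cited EnerPHit heating-demand criterion for cool-temperate climates. All input values (conductances, gains, degree hours, treated floor area) are placeholders, not figures from the Jarocin case study.

```python
def annual_heating_demand(H_T, H_V, degree_hours_kKh, Q_solar, Q_internal, eta_gains=0.85):
    """Simplified balance: losses = (transmission + ventilation conductance in W/K)
    * heating degree hours in kKh; demand = losses minus utilised gains, in kWh/a."""
    losses = (H_T + H_V) * degree_hours_kKh      # W/K * kKh = kWh
    gains = eta_gains * (Q_solar + Q_internal)   # utilisation factor applied to gains
    return max(losses - gains, 0.0)

# Placeholder retrofit: conductances in W/K, gains in kWh/a, 80 kKh heating period.
Q_h = annual_heating_demand(H_T=120.0, H_V=35.0, degree_hours_kKh=80.0,
                            Q_solar=3500.0, Q_internal=2500.0)
area = 400.0                          # treated floor area in m2
specific = Q_h / area                 # specific heating demand in kWh/(m2.a)
print(specific, specific <= 25.0)     # compare with the 25 kWh/(m2.a) EnerPHit criterion
```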

    Towards Better Performance in the Face of Input Uncertainty while Maintaining Interpretability in AI

    Uncertainty is a pervasive element of many real-world applications, and existing sources of uncertainty (e.g. atmospheric conditions, economic parameters or the precision of measurement devices) often have a detrimental impact on the inputs, and ultimately the results, of decision-support systems. The ability to handle input uncertainty is therefore a valuable component of real-world decision-support systems. There is a vast literature on handling uncertainty in decision-support systems. While such systems handle uncertainty and deliver good performance, providing insight into the decision process (e.g. why or how results are produced) is another important asset for building trust in, and enabling 'debugging' of, given decisions. Fuzzy set theory provides the basis for fuzzy logic systems, which are often associated with the ability to handle uncertainty and with mechanisms that provide a degree of interpretability. Non-singleton fuzzy logic systems, in particular, are essential for dealing with uncertainty that affects the input, one of the main sources of uncertainty in real-world systems. In this thesis, we therefore comprehensively explore enhancing the capabilities of non-singleton fuzzy logic systems, considering both the capture and handling of uncertainty and the maintenance of interpretability. To that end, three key aspects are investigated: (i) faithfully mapping input uncertainty to system outputs; (ii) proposing a new framework for dynamically adapting the system on the fly in changing real-world environments; and (iii) maintaining the level of interpretability while improving system performance. The first aspect concerns mapping uncertainty from the inputs to the outputs of a system through the interaction between the input and antecedent fuzzy sets, i.e. the firing strengths. In the context of non-singleton fuzzy logic systems, recent studies have shown that the standard technique for determining firing strengths risks losing information about the interaction between the input uncertainty and the antecedent fuzzy sets. This thesis explores and puts forward novel approaches to generating firing strengths that faithfully map the uncertainty affecting system inputs to the outputs. Time-series forecasting experiments are used to evaluate the proposed alternative firing-strength generation technique under different levels of input uncertainty. The analysis of the results shows that the proposed approach is a suitable method for generating firing levels that map the different uncertainty levels likely to occur in real-world circumstances from the input to the output of the FLS. The second aspect is to provide systems with dynamic adaptive behaviour at run-time under the changing conditions common in real-world environments. Traditionally, the fuzzification step of non-singleton fuzzy logic systems is limited to the selection of a single type of input fuzzy set to capture the input uncertainty, whereas input uncertainty levels tend to vary over time in the real world. In this thesis, input uncertainty is therefore modelled where it specifically arises, in an online manner, providing adaptive behaviour that captures varying input uncertainty levels. A framework that generates type-1 or interval type-2 input fuzzy sets is presented, called the ADaptive Online Non-singleton fuzzy logic System (ADONiS). In the proposed framework, an uncertainty estimation technique is applied to a sequence of observations to continuously update the input fuzzy sets of the non-singleton fuzzy logic system. Both the type-1 and interval type-2 versions of the ADONiS framework remove the limitation of selecting a specific type of input fuzzy set. The framework also enables input fuzzy sets to adapt to unknown uncertainty levels that were not foreseen at the design stage of the model. Time-series forecasting experiments show that the proposed framework provides performance advantages over traditional counterpart approaches, particularly in environments with high variation in noise levels, which are common in real-world applications. In addition, a real-world medical application study is designed to test the deployability of the ADONiS framework and to provide initial insight into its viability in replacing traditional approaches. The third aspect is to maintain the level of interpretability while increasing system performance. When a decision-support model delivers good performance, providing insight into the decision process is also an important asset in terms of trustworthiness, safety and ethics. Fuzzy logic systems are considered to possess mechanisms that can provide a degree of interpretability. Traditionally, while optimisation procedures provide performance benefits in fuzzy logic systems, they often alter components (e.g. the rule set, parameters or fuzzy partitioning structures), which can lead to higher accuracy but commonly does not preserve the interpretability of the resulting model. In this thesis, the state of the art in fuzzy logic system interpretability is advanced by capturing input uncertainty in the fuzzification step, where it arises, and by handling it in the inference engine step. In doing so, while a performance increase is achieved, the proposed methods confine any optimisation impact to the fuzzification and inference engine steps, which protects key components of the FLS (e.g. fuzzy sets, rule parameters) and maintains the given level of interpretability.
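    A minimal sketch of the non-singleton machinery discussed above is given below: the standard sup-min firing strength between a Gaussian input fuzzy set and a Gaussian antecedent, with the input set's width re-estimated online from a sliding window of recent observations in the spirit of ADONiS. The grid approximation, the window-standard-deviation update and all numerical values are illustrative assumptions, not the thesis's exact method.

```python
import numpy as np

def gaussian(x, c, sigma):
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def ns_firing_strength(x_meas, sigma_in, c_ant, sigma_ant):
    """Standard non-singleton firing strength: the supremum over x of
    min(input membership, antecedent membership), approximated on a grid."""
    lo = min(x_meas - 4 * sigma_in, c_ant - 4 * sigma_ant)
    hi = max(x_meas + 4 * sigma_in, c_ant + 4 * sigma_ant)
    grid = np.linspace(lo, hi, 2001)
    return float(np.max(np.minimum(gaussian(grid, x_meas, sigma_in),
                                   gaussian(grid, c_ant, sigma_ant))))

# ADONiS-style idea (illustrative): re-estimate the input set's width online from a
# sliding window of recent observations, so fuzzification tracks the current noise
# level instead of using a width fixed at design time.
window = np.array([5.1, 4.8, 5.4, 5.0, 4.7, 5.3])
sigma_in = max(float(np.std(window)), 1e-3)   # guard against a degenerate singleton
print(ns_firing_strength(x_meas=window[-1], sigma_in=sigma_in,
                         c_ant=6.0, sigma_ant=1.0))
```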

    [Bibliographies] [Electronic resource]


    Hungarian bibliography of architectural history and historic monuments, 2001-2015
