
    Empirical Study of the Associative Approach in the Context of Classification Problems

    Research carried out by the scientific community has shown that the performance of classifiers depends not only on the learning rule but also on the complexities inherent in the data sets. Some traditional classifiers have been commonly used in the context of classification problems (three neural networks, C4.5, SVM, among others). However, the associative approach has been explored more in the retrieval context than in the classification task, and its performance has hardly been analyzed when several complexities are present in the data. The present investigation analyzes the performance of the associative approach (CHA, CHAT and the original Alpha Beta) under three classification problems: class imbalance, overlapping, and atypical patterns. The results show that the CHAT algorithm recognizes the minority class better than the rest of the classifiers in the context of class imbalance, whereas the CHA model ignores the minority class in most cases. In addition, the CHAT algorithm requires well-defined decision boundaries: when Wilson's method is applied, its performance increases. It was also noted that when a balance between the rates is emphasized, the performance of three classifiers (RB, RFBR and CHAT) increases. The original Alpha Beta model continues to show poor performance even when the data are pre-processed. The performance of the classifiers increases significantly when the SMOTE method is applied, which does not happen without pre-processing or with sub-sampling, in the context of class imbalance.
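
    The pre-processing comparison summarized above (no resampling, sub-sampling, SMOTE) can be illustrated with standard tooling. The sketch below is a minimal example, assuming the scikit-learn and imbalanced-learn libraries and a synthetic two-class set rather than the study's data; the decision tree is only a stand-in for C4.5.

        # Minimal sketch of the class-imbalance pre-processing step discussed above.
        # Assumes scikit-learn and imbalanced-learn; the data are synthetic, not the
        # collections used in the study, and the decision tree is a stand-in for C4.5.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import recall_score
        from imblearn.over_sampling import SMOTE

        # Imbalanced two-class problem (roughly 10% minority class).
        X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        # Baseline: train on the imbalanced data as-is.
        base = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
        print("minority recall, no resampling:",
              recall_score(y_te, base.predict(X_te), pos_label=1))

        # SMOTE: synthesize minority-class examples before training.
        X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
        smoted = DecisionTreeClassifier(random_state=0).fit(X_res, y_res)
        print("minority recall, SMOTE:",
              recall_score(y_te, smoted.predict(X_te), pos_label=1))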

    Anxiety and Posttraumatic Stress Disorder in the Context of Human Brain Evolution: A Role for Theory in DSM-V?

    The “hypervigilance, escape, struggle, tonic immobility” evolutionarily hardwired acute peritraumatic response sequence is important for clinicians to understand. Our commentary supplements the useful article on human tonic immobility (TI) by Marx, Forsyth, Gallup, Fusé and Lexington (2008). A hallmark sign of TI is peritraumatic tachycardia, which others have documented as a major risk factor for subsequent posttraumatic stress disorder (PTSD). TI is evolutionarily highly conserved (uniform across species) and underscores the need for DSM-V planners to consider the inclusion of evolution theory in the reconceptualization of anxiety and PTSD. We discuss the relevance of evolution theory to the DSM-V reconceptualization of acute dissociative-conversion symptoms and of epidemic sociogenic disorder (epidemic “hysteria”). Both are especially in need of attention in light of the increasing threat of terrorism against civilians. We provide other pertinent examples. Finally, evolution theory is not ideology driven (and makes testable predictions regarding etiology in “both directions”). For instance, it predicted the unexpected finding that some disorders conceptualized in DSM-IV-TR as innate phobias are conditioned responses and thus better conceptualized as mild forms of PTSD. Evolution theory may offer a conceptual framework in DSM-V both for treatment and for research on psychopathology.

    Contextualizing concepts using a mathematical generalization of the quantum formalism

    We outline the rationale and preliminary results of using the State Context Property (SCOP) formalism, originally developed as a generalization of quantum mechanics, to describe the contextual manner in which concepts are evoked, used, and combined to generate meaning. The quantum formalism was developed to cope with problems arising in the description of (1) the measurement process, and (2) the generation of new states with new properties when particles become entangled. Similar problems arising with concepts motivated the formal treatment introduced here. Concepts are viewed not as fixed representations, but as entities existing in states of potentiality that require interaction with a context, a stimulus or another concept, to `collapse' to observable form as an exemplar, prototype, or other (possibly imaginary) instance. The stimulus situation plays the role of the measurement in physics, acting as context that induces a change of the cognitive state from a superposition state to a collapsed state. The collapsed state is more likely to consist of a conjunction of concepts for associative than analytic thought because more stimulus or concept properties take part in the collapse. We provide two contextual measures of conceptual distance, one using collapse probabilities and the other weighted properties, and show how they can be applied to conjunctions using the pet fish problem.
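
    As a purely illustrative toy, not the SCOP formalism itself, the sketch below treats a concept as a weighted superposition over exemplars and a context as a compatibility reweighting that yields collapse probabilities; all exemplars and numbers are invented for the example.

        # Toy illustration only (not the SCOP formalism): a concept as a weighted
        # superposition over exemplars, a context as a compatibility reweighting,
        # and "collapse" probabilities obtained by renormalisation.
        # All exemplars and weights below are invented for the example.

        def collapse_probabilities(state, context):
            """Reweight exemplar weights by context compatibility and renormalise."""
            raw = {ex: w * context.get(ex, 0.0) for ex, w in state.items()}
            total = sum(raw.values())
            return {ex: v / total for ex, v in raw.items()} if total else raw

        # The concept "pet" without context: a state of potentiality over exemplars.
        pet = {"dog": 0.5, "cat": 0.35, "goldfish": 0.1, "snake": 0.05}

        # The context "fish" is highly compatible with goldfish, barely with mammals.
        fish_context = {"dog": 0.05, "cat": 0.05, "goldfish": 1.0, "snake": 0.1}

        print(collapse_probabilities(pet, fish_context))
        # The collapsed state is dominated by "goldfish", mirroring the pet fish
        # problem discussed above, where the conjunction shifts typicality.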

    A functional-cognitive framework for attitude research

    In attitude research, behaviours are often used as proxies for attitudes and attitudinal processes. This practice is problematic because it conflates the behaviours that need to be explained (explanandum) with the mental constructs that are used to explain these behaviours (explanans). In the current chapter we propose a meta-theoretical framework that resolves this problem by distinguishing between two levels of analysis. According to the proposed framework, attitude research can be conceptualised as the scientific study of evaluation. Evaluation is defined not in terms of mental constructs but in terms of elements in the environment, more specifically, as the effect of stimuli on evaluative responses. From this perspective, attitude research provides answers to two questions: (1) Which elements in the environment moderate evaluation? (2) What mental processes and representations mediate evaluation? Research on the first question provides explanations of evaluative responses in terms of elements in the environment (functional level of analysis); research on the second question offers explanations of evaluation in terms of mental processes and representations (cognitive level of analysis). These two levels of analysis are mutually supportive, in that better explanations at one level lead to better explanations at the other level. However, their mutually supportive relation requires a clear distinction between the concepts of their explanans and explanandum, which are conflated if behaviours are treated as proxies for mental constructs. The value of this functional-cognitive framework is illustrated by applying it to four central questions of attitude research

    Cognitive processes in categorical and associative priming: a diffusion model analysis

    Cognitive processes and mechanisms underlying different forms of priming were investigated using a diffusion model approach. In a series of 6 experiments, effects of prime-target associations and of a semantic and affective categorical match of prime and target were analyzed for different tasks. Significant associative and categorical priming effects were found in standard analyses of response times (RTs) and error frequencies. Results of diffusion model analyses revealed that priming effects of associated primes were mapped on the drift rate parameter (v), while priming effects of a categorical match on a task-relevant dimension were mapped on the extradecisional parameters (t(0) and d). These results support a spreading activation account of associative priming and an explanation of categorical priming in terms of response competition. Implications for the interpretation of priming effects and the use of priming paradigms in cognitive psychology and social cognition are discussed
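
    For readers unfamiliar with the parameters named above, the following minimal sketch simulates a basic Wiener diffusion process: a higher drift rate v (the locus of the associative priming effect) speeds responses, while a longer non-decision time (t(0), one of the extradecisional parameters) shifts response times without changing evidence accumulation. Parameter values are arbitrary illustrations, not estimates from the experiments.

        # Illustrative simulation of a basic Wiener diffusion process.
        # Parameter values are arbitrary; they are not estimates from the experiments.
        import random

        def simulate_trial(v, a=1.0, z=0.5, t0=0.3, s=1.0, dt=0.001):
            """Accumulate evidence with drift v until hitting 0 or a, then add non-decision time."""
            x, t = z * a, 0.0
            while 0.0 < x < a:
                x += v * dt + s * (dt ** 0.5) * random.gauss(0.0, 1.0)
                t += dt
            return t + t0, x >= a  # (response time, upper-boundary response?)

        def mean_rt(v, t0, n=1000):
            random.seed(0)
            return sum(simulate_trial(v, t0=t0)[0] for _ in range(n)) / n

        # Higher drift rate (e.g. a target preceded by an associated prime): faster responses.
        print("v=1.0:", round(mean_rt(1.0, 0.3), 3), "  v=2.0:", round(mean_rt(2.0, 0.3), 3))
        # A longer non-decision time shifts response times without changing accumulation.
        print("t0=0.3:", round(mean_rt(1.5, 0.3), 3), "  t0=0.4:", round(mean_rt(1.5, 0.4), 3))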

    A Corpus-Based Investigation of Definite Description Use

    We present the results of a study of definite description use in written texts aimed at assessing the feasibility of annotating corpora with information about definite description interpretation. We ran two experiments, in which subjects were asked to classify the uses of definite descriptions in a corpus of 33 newspaper articles, containing a total of 1412 definite descriptions. We measured the agreement among annotators about the classes assigned to definite descriptions, as well as the agreement about the antecedent assigned to those definites that the annotators classified as being related to an antecedent in the text. The most interesting result of this study from a corpus annotation perspective was the rather low agreement (K=0.63) that we obtained using versions of Hawkins' and Prince's classification schemes; better results (K=0.76) were obtained using the simplified scheme proposed by Fraurud that includes only two classes, first-mention and subsequent-mention. The agreement about antecedents was also not complete. These findings raise questions concerning the strategy of evaluating systems for definite description interpretation by comparing their results with a standardized annotation. From a linguistic point of view, the most interesting observations were the great number of discourse-new definites in our corpus (in one of our experiments, about 50% of the definites in the collection were classified as discourse-new, 30% as anaphoric, and 18% as associative/bridging) and the presence of definites which did not seem to require a complete disambiguation. Comment: 47 pages, uses fullname.sty and palatino.st
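
    The agreement figures quoted above are kappa coefficients. The short sketch below shows how such chance-corrected agreement between two annotators is computed; it uses scikit-learn and invented labels, not the study's annotations.

        # Chance-corrected inter-annotator agreement (Cohen's kappa).
        # The label sequences are invented for illustration; they are not the study's data.
        from sklearn.metrics import cohen_kappa_score

        # Two annotators classifying ten definite descriptions with a two-class
        # scheme in the spirit of Fraurud: first-mention (F) vs subsequent-mention (S).
        annotator_a = ["F", "F", "S", "F", "S", "S", "F", "F", "S", "F"]
        annotator_b = ["F", "F", "S", "S", "S", "S", "F", "F", "S", "S"]

        print("kappa:", cohen_kappa_score(annotator_a, annotator_b))
        # Values near 1 mean agreement well above chance; the study reports K=0.63
        # for the fine-grained schemes and K=0.76 for the two-class scheme.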

    A review of associative classification mining

    Associative classification mining is a promising approach in data mining that utilizes association rule discovery techniques to construct classification systems, also known as associative classifiers. In the last few years, a number of associative classification algorithms have been proposed, e.g. CPAR, CMAR, MCAR, MMAC and others. These algorithms employ several different rule discovery, rule ranking, rule pruning, rule prediction and rule evaluation methods. This paper focuses on surveying and comparing the state-of-the-art associative classification techniques with regard to the above criteria. Finally, future directions in associative classification, such as incremental learning and mining low-quality data sets, are also highlighted in this paper.
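
    As a hedged illustration of the rule ranking and rule prediction steps shared by the surveyed algorithms (not a reimplementation of CPAR, CMAR, MCAR or MMAC), the sketch below ranks class association rules by confidence and support and predicts with the highest-ranked rule whose antecedent matches the case; the rules and the test case are invented.

        # Illustrative sketch of the rule ranking and prediction steps shared by many
        # associative classifiers (not a reimplementation of CPAR, CMAR, MCAR or MMAC).
        # The rules and the test case are invented for the example.

        # Class association rules: (antecedent item set, predicted class, confidence, support).
        rules = [
            ({"outlook=sunny", "humidity=high"}, "no",  0.90, 0.20),
            ({"outlook=overcast"},               "yes", 0.95, 0.25),
            ({"humidity=high"},                  "no",  0.70, 0.35),
        ]

        # A common ranking: higher confidence first, ties broken by support.
        ranked = sorted(rules, key=lambda r: (r[2], r[3]), reverse=True)

        def predict(case, default="yes"):
            """Fire the first (highest-ranked) rule whose antecedent is contained in the case."""
            for antecedent, label, _, _ in ranked:
                if antecedent <= case:
                    return label
            return default  # fall back to a default class when no rule covers the case

        print(predict({"outlook=sunny", "humidity=high", "wind=weak"}))  # prints "no"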

    Representing complex data using localized principal components with application to astronomical data

    Often the relation between the variables constituting a multivariate data space might be characterized by one or more of the terms: "nonlinear", "branched", "disconnected", "bended", "curved", "heterogeneous", or, more generally, "complex". In these cases, simple principal component analysis (PCA) as a tool for dimension reduction can fail badly. Of the many alternative approaches proposed so far, local approximations of PCA are among the most promising. This paper will give a short review of localized versions of PCA, focusing on local principal curves and local partitioning algorithms. Furthermore we discuss projections other than the local principal components. When performing local dimension reduction for regression or classification problems it is important to focus not only on the manifold structure of the covariates, but also on the response variable(s). Local principal components only achieve the former, whereas localized regression approaches concentrate on the latter. Local projection directions derived from the partial least squares (PLS) algorithm offer an interesting trade-off between these two objectives. We apply these methods to several real data sets. In particular, we consider simulated astrophysical data from the future Galactic survey mission Gaia. Comment: 25 pages. In "Principal Manifolds for Data Visualization and Dimension Reduction", A. Gorban, B. Kegl, D. Wunsch, and A. Zinovyev (eds), Lecture Notes in Computational Science and Engineering, Springer, 2007, pp. 180--204, http://www.springer.com/dal/home/generic/search/results?SGWID=1-40109-22-173750210-
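
    One of the simplest local-partitioning variants mentioned above can be sketched as: partition the data, then fit a separate PCA within each partition. The code below does this with k-means and scikit-learn on synthetic data; it is not the paper's local principal curve or local PLS method.

        # Minimal sketch of a local-partitioning approach to localized PCA:
        # partition the data with k-means, then fit a separate PCA in each partition.
        # Synthetic data only; not the paper's local principal curve or local PLS method.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        # A "curved" one-dimensional structure embedded in 2-D: a noisy half circle.
        theta = rng.uniform(0.0, np.pi, 500)
        X = np.column_stack([np.cos(theta), np.sin(theta)]) + rng.normal(0.0, 0.05, (500, 2))

        # Global PCA is forced to use one straight direction for the whole set.
        global_pca = PCA(n_components=1).fit(X)
        print("global variance explained:", global_pca.explained_variance_ratio_[0])

        # Local PCA adapts to the curvature within each partition.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
        for k in range(4):
            local = PCA(n_components=1).fit(X[labels == k])
            print(f"cluster {k} variance explained:", local.explained_variance_ratio_[0])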