
    Contextual Outlier Interpretation

    Outlier detection plays an essential role in many data-driven applications by identifying isolated instances that differ from the majority. While many statistical learning and data mining techniques have been used to develop more effective outlier detection algorithms, the interpretation of detected outliers has received little attention. Interpretation is becoming increasingly important for helping people trust and evaluate detection models, because it provides intrinsic reasons why particular instances are flagged as outliers. It is difficult, if not impossible, to simply apply feature selection to explain outliers, owing to the distinct characteristics of various detection models, the complicated structure of data in certain applications, and the imbalanced distribution of outliers and normal instances. In addition, the role of the contrastive contexts in which outliers are located, as well as the relation between outliers and their contexts, is usually overlooked in interpretation. To tackle these issues, this paper proposes a novel Contextual Outlier INterpretation (COIN) method to explain the abnormality of outliers spotted by detectors. Interpretability for an outlier is achieved from three aspects: an outlierness score, the attributes that contribute to the abnormality, and a contextual description of its neighborhood. Experimental results on various types of datasets demonstrate the flexibility and effectiveness of the proposed framework compared with existing interpretation approaches.
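
    A minimal Python sketch of the three-aspect interpretation idea in the abstract (outlierness score, contributing attributes, neighborhood description). This is not the authors' COIN implementation; the neighborhood size k and the z-score heuristic are illustrative assumptions.

```python
# Illustrative sketch of contextual outlier interpretation, NOT the
# authors' COIN method: explain one detected outlier via (1) an
# outlierness score, (2) contributing attributes, (3) its neighborhood.
import numpy as np

def interpret_outlier(X, outlier, k=20):
    """Explain a detected outlier against its local context in X."""
    # Context: the k nearest instances around the outlier.
    dists = np.linalg.norm(X - outlier, axis=1)
    context = X[np.argsort(dists)[:k]]

    # Attribute contributions: per-attribute deviation from the context,
    # scaled by the context's spread (a simple z-score heuristic).
    mu = context.mean(axis=0)
    sigma = context.std(axis=0) + 1e-9
    contrib = np.abs(outlier - mu) / sigma

    return {
        "outlierness": float(np.linalg.norm(contrib)),    # aggregate deviation
        "top_attributes": np.argsort(contrib)[::-1][:3],  # most abnormal dims
        "context_center": mu,                             # neighborhood summary
    }

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                  # synthetic normal instances
outlier = np.array([0.0, 6.0, 0.0, 0.0, 0.0])  # abnormal on attribute 1
print(interpret_outlier(X, outlier))
```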

    Radiocarbon evidence for the pace of the M-/L-PPNB transition in the 8th millennium BC south-west Asia

    The transition from the Middle to the Late Pre-Pottery Neolithic B (PPNB) happened throughout southwest Asia in the mid-8th millennium cal BC. It entailed the abandonment of a number of sites, the rapid growth of others, and the wide spread of morphologically domestic caprines. What remains unknown is how rapid these processes were in real time. Over the period when the transition was taking place, the calibration curve has two shallow sections divided by a sudden drop, which for many of the older dates creates the illusion of a sudden cultural break around 7600–7500 cal BC. Yet the more detailed study presented in this paper suggests that the transition could have been spread over a more extended period of time. This, however, is still far from certain owing to the risks of old-wood effects and the complexities of site formation.
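
    A toy numerical illustration of the calibration effect described above, using an invented curve rather than IntCal: on the steep drop, measurements tens of radiocarbon years apart all calibrate into one narrow calendar window (mimicking a sudden break), while a date on a shallow section smears across a wide window.

```python
# Invented calibration curve: a shallow section near 8600 BP, a steep
# drop between 7560 and 7540 cal BC, then a shallow section near 8450 BP.
# All numbers are illustrative, not IntCal values.
import numpy as np

cal_years = np.arange(-7800, -7299)          # calendar years (negative = cal BC)
c14_curve = np.interp(cal_years,
                      [-7800, -7560, -7540, -7300],
                      [8610.0, 8600.0, 8450.0, 8440.0])

def calibrate(c14_age, error=25.0):
    """Return the cal BC range consistent with a measurement at 1 sigma."""
    hits = cal_years[np.abs(c14_curve - c14_age) <= error]
    return -hits.min(), -hits.max()          # (older bound, younger bound)

for bp in (8560, 8520, 8480):                # 80 14C years apart, same drop
    print(bp, "BP ->", *calibrate(bp), "cal BC (narrow window on the drop)")
print(8450, "BP ->", *calibrate(8450), "cal BC (shallow section: wide window)")
```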

    Examining Recent Expert Elicitation Judgment Guidelines: Value Assumptions and the Prospects for Rationality

    This paper was presented at the VALDOR Symposium, Stockholm, June 1999. The author examines the value assumptions in U.S. Department of Energy and Nuclear Regulatory Commission guidance on the use of expert judgment in the selection of a high-level nuclear waste disposal site.

    Exploring site formation and building local contexts through wiggle-match radiocarbon dating: re-dating of the Firth of Clyde Crannogs, Scotland

    There are at least four wooden intertidal platforms, also known as marine crannogs, in the Firth of Clyde on the west coast of Scotland. The interpretation of these sites partly depends on their dating: if coeval, they could point to the presence of a native maritime hub. Furthermore, their spatial coincidence with the terminus of the Antonine Wall has led to speculation about the role they may have played in Roman-native interaction during the occupation of southern Scotland in the early first millennium cal AD. A better absolute chronology is therefore essential for evaluating whether the marine crannogs were contemporary with one another and whether they relate to any known historic events. This article presents the results of a wiggle-match dating project aimed at resolving these uncertainties at two of the sites in question, the Dumbuck and Erskine Bridge crannogs. The results show that the construction of both sites pre-dates direct Roman influence in Scotland. Furthermore, the results indicate that the two sites were built at least 300 years apart, forcing us to consider the possibility that they functioned in very different historical contexts. Other findings include technical observations on the fine shape of the radiocarbon calibration curve near the turn of the first millennium BC/AD and potential evidence for persistent contamination in decayed and exposed sections of waterlogged alder.
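
    A minimal sketch of the wiggle-matching idea: radiocarbon measurements from tree rings with known spacing are slid along the calibration curve as one block, and the placement that minimizes the chi-square misfit is kept. The curve, ring spacings, and errors below are invented for illustration.

```python
# Toy wiggle-match: find the felling year that best fits a dated ring
# sequence to a (toy) calibration curve. Not the paper's data or code.
import numpy as np

cal_years = np.arange(-400, 401)                                    # toy calendar axis
curve = 2000.0 - 0.8 * cal_years + 15.0 * np.sin(cal_years / 20.0)  # toy curve (BP)

ring_offsets = np.array([40, 30, 20, 10, 0])   # rings counted back from the bark edge
errors = np.full(ring_offsets.size, 20.0)      # 1-sigma measurement errors

rng = np.random.default_rng(1)
true_felling = -50                             # pretend the tree was felled 50 BC
measured = np.interp(true_felling - ring_offsets, cal_years, curve) \
           + rng.normal(0.0, 10.0, ring_offsets.size)

def chi_square(felling_year):
    """Misfit of the whole dated sequence for a candidate felling year."""
    expected = np.interp(felling_year - ring_offsets, cal_years, curve)
    return float(np.sum(((measured - expected) / errors) ** 2))

candidates = cal_years[ring_offsets.max():]    # keep every ring on the curve
best = min(candidates, key=chi_square)
print("best felling year:", best, " chi-square:", round(chi_square(best), 2))
```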

    Predictive intelligence to the edge through approximate collaborative context reasoning

    We focus on Internet of Things (IoT) environments in which a network of sensing and computing devices is responsible for locally processing contextual data, reasoning over them, and collaboratively inferring the occurrence of a specific phenomenon (event). Pushing processing and knowledge inference to the edge of the IoT network allows the complexity of the event reasoning process to be distributed into many manageable pieces and physically located at the source of the contextual information. This makes it possible to process in real time a huge volume of rich data streams that would be prohibitively complex and costly to deliver to a traditional centralized Cloud system. We propose a lightweight, energy-efficient, distributed, adaptive, multiple-context-perspective event reasoning model under uncertainty that runs on each IoT device (sensor/actuator). Each device senses and processes context data and infers events based on three local context perspectives: (i) expert knowledge on event representation, (ii) outlier inference, and (iii) deviation from locally predicted context. This novel approximate reasoning paradigm is achieved through a contextualized, collaborative belief-driven clustering process, in which clusters of devices form according to their belief in the presence of events. Our distributed and federated intelligence model efficiently identifies localized abnormalities in the contextual data by aggregating local degrees of belief, and it updates and adjusts its knowledge in response to contextual outliers and novel observations. We provide a comprehensive experimental and comparative assessment of our model on real contextual data against other localized and centralized event detection models, and we show the benefits of its adoption, achieving up to three orders of magnitude lower energy consumption together with high quality of inference.
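
    A minimal sketch of the collaborative reasoning idea in the abstract, with illustrative weights and thresholds rather than the paper's model: each device blends the three local context perspectives into a degree of belief, and a cluster of devices flags the event when the aggregated belief is high enough.

```python
# Illustrative edge-side belief fusion; the perspectives mirror those in
# the abstract, but all weights and thresholds are invented assumptions.
import numpy as np

def local_belief(reading, rule_threshold, history):
    """Blend three local perspectives into one degree of belief in [0, 1]."""
    expert = 1.0 if reading > rule_threshold else 0.0        # (i) expert rule
    mu, sigma = np.mean(history), np.std(history) + 1e-9
    outlier = min(abs(reading - mu) / (3 * sigma), 1.0)      # (ii) outlier score
    predicted = history[-1]                                  # naive local forecast
    deviation = min(abs(reading - predicted) / (3 * sigma), 1.0)  # (iii) deviation
    return (expert + outlier + deviation) / 3.0              # equal weights

def fuse(beliefs, quorum=0.5):
    """Cluster-level decision: flag the event if the mean belief is high."""
    return np.mean(beliefs) >= quorum

histories = [np.random.default_rng(i).normal(20, 1, 50) for i in range(5)]
readings = [27.0, 26.5, 25.8, 21.0, 26.9]    # one device saw nothing unusual
beliefs = [local_belief(r, 25.0, h) for r, h in zip(readings, histories)]
print(np.round(beliefs, 2), "->", "EVENT" if fuse(beliefs) else "no event")
```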

    Quality of Information in Mobile Crowdsensing: Survey and Research Challenges

    Smartphones have become the most pervasive devices in people's lives and are clearly transforming the way we live and perceive technology. Today's smartphones benefit from almost ubiquitous Internet connectivity and come equipped with a plethora of inexpensive yet powerful embedded sensors, such as the accelerometer, gyroscope, microphone, and camera. This unique combination has enabled revolutionary applications based on the mobile crowdsensing paradigm, such as real-time road traffic monitoring, air and noise pollution monitoring, crime control, and wildlife monitoring, to name a few. Unlike in prior sensing paradigms, humans are now the primary actors in the sensing process, since they are fundamental to retrieving reliable and up-to-date information about the event being monitored. As humans may behave unreliably or maliciously, assessing and guaranteeing Quality of Information (QoI) becomes more important than ever. In this paper, we provide a new framework for defining and enforcing QoI in mobile crowdsensing and analyze in depth the current state of the art on the topic. We also outline novel research challenges, along with possible directions for future work. Comment: To appear in ACM Transactions on Sensor Networks (TOSN).
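
    The survey proposes a broad QoI framework; as one concrete building block, the sketch below shows a simple truth-discovery loop often used to enforce QoI, where contributions are aggregated with reliability weights and reliabilities are re-estimated from agreement with the aggregate. It is illustrative, not the authors' algorithm.

```python
# Simple iterative truth discovery for crowdsensed numeric reports;
# one standard QoI mechanism, sketched here for illustration only.
import numpy as np

def truth_discovery(reports, iters=10):
    """reports[i, j]: value reported by contributor i for task j."""
    n, _ = reports.shape
    weights = np.ones(n) / n                       # start trusting everyone equally
    for _ in range(iters):
        truth = weights @ reports                  # weighted estimate per task
        err = np.mean((reports - truth) ** 2, axis=1) + 1e-9
        weights = (1.0 / err) / np.sum(1.0 / err)  # reliable contributors weigh more
    return truth, weights

rng = np.random.default_rng(0)
true_vals = rng.uniform(0, 100, 8)                 # e.g. noise levels at 8 sites
honest = true_vals + rng.normal(0, 1, (4, 8))      # 4 careful contributors
sloppy = true_vals + rng.normal(0, 25, (2, 8))     # 2 unreliable ones
truth, w = truth_discovery(np.vstack([honest, sloppy]))
print(np.round(w, 3))                              # low weight for the sloppy pair
```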

    Underpowered samples, false negatives, and unconscious learning

    Get PDF
    The scientific community has witnessed growing concern about the high rate of false positives and unreliable results in the psychological literature, but the harmful impact of false negatives has been largely ignored. False negatives are particularly concerning in research areas where demonstrating the absence of an effect is crucial, such as studies of unconscious or implicit processing. Research on implicit processes seeks evidence of above-chance performance on some implicit behavioral measure together with chance-level performance (that is, a null result) on an explicit measure of awareness. A systematic review of 73 studies of contextual cuing, a popular implicit learning paradigm, involving 181 statistical analyses of awareness tests, reveals how underpowered studies can lead to a failure to reject a false null hypothesis. Among the studies that reported sufficient information, the meta-analytic effect size across awareness tests was d_z = 0.31 (95% CI 0.24–0.37), showing that participants' learning in these experiments was conscious. The unusually large number of null results in this literature cannot be explained by selective publication. Instead, our analyses demonstrate that these awareness tests are typically insensitive and underpowered to detect small-to-medium, but true, effects. These findings challenge a widespread and theoretically important claim about the extent of unconscious human cognition.
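
    A quick check of the power argument, assuming a two-sided one-sample t-test at the meta-analytic effect d_z = 0.31; the sample sizes looped over below are illustrative, not taken from the review.

```python
# Power of a two-sided one-sample t-test at d_z = 0.31, computed via the
# noncentral t distribution. Sample sizes are illustrative assumptions.
from scipy import stats

def power_one_sample_t(dz, n, alpha=0.05):
    """Two-sided one-sample t-test power for effect size dz and n subjects."""
    df = n - 1
    nc = dz * n ** 0.5                       # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

for n in (16, 24, 32, 64, 84):
    print(f"n = {n:3d}: power = {power_one_sample_t(0.31, n):.2f}")
# Roughly n = 84 is needed for 80% power at d_z = 0.31; awareness tests
# with typical small samples are therefore prone to false negatives.
```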