
    Measurement Invariance, Entropy, and Probability

    We show that the natural scaling of measurement for a particular problem defines the most likely probability distribution of observations taken from that measurement scale. Our approach extends the method of maximum entropy to use measurement scale as a type of information constraint. We argue that a very common measurement scale is linear at small magnitudes, grading into logarithmic at large magnitudes, leading to observations that often follow Student's probability distribution, which has a Gaussian shape for small fluctuations from the mean and a power-law shape for large fluctuations from the mean. An inverse scaling often arises in which measures naturally grade from logarithmic to linear as one moves from small to large magnitudes, leading to observations that often follow a gamma probability distribution, which has a power-law shape for small magnitudes and an exponential shape for large magnitudes. The two measurement scales are natural inverses connected by the Laplace integral transform. This inversion connects the two major scaling patterns commonly found in nature. We also show that superstatistics is a special case of an integral transform, and thus can be understood as a particular way to change the scale of measurement. Incorporating information about measurement scale into maximum entropy provides a general approach to the relations between measurement, information, and probability.
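    As a hedged illustration of the construction described above (the scale function T(y) and the parameters λ and ν below are illustrative notation of ours, not taken from the abstract), maximizing entropy subject to a constraint on the mean of a transformed measurement yields the two distribution families as follows:

```latex
% Sketch of maximum entropy with a measurement-scale constraint.
% T(y), lambda, and nu are illustrative choices, not the paper's notation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Maximizing entropy subject to a constraint on $\langle T(y)\rangle$ gives
$p(y) \propto e^{-\lambda T(y)}$. For a scale that is linear at small $y$
and logarithmic at large $y$, take $T(y) = \log\!\left(1 + y^{2}/\nu\right)$, so
\[
  p(y) \propto \left(1 + \frac{y^{2}}{\nu}\right)^{-\lambda},
\]
a Student-like form: when $y^{2} \ll \nu$, $p(y) \approx e^{-\lambda y^{2}/\nu}$
(Gaussian), and when $y^{2} \gg \nu$, $p(y) \sim y^{-2\lambda}$ (power law).
For the inverse, logarithmic-to-linear scale, take $T(y) = \log y + y/\nu$, so
\[
  p(y) \propto y^{-\lambda}\, e^{-\lambda y/\nu},
\]
a gamma-like form: a power law at small $y$ and an exponential tail at large $y$.
\end{document}
```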

    A tool for subjective and interactive visual data exploration

    We present SIDE, a tool for Subjective and Interactive Visual Data Exploration, which lets users explore high-dimensional data via subjectively informative 2D data visualizations. Many existing visual analytics tools are either restricted to specific problems and domains, or they aim to find visualizations that align with the user's current beliefs about the data. In contrast, our generic tool computes data visualizations that are surprising given the user's current understanding of the data, where the user's belief state is represented as a set of projection tiles. This user-awareness offers an efficient way to interactively explore yet-unknown features of complex high-dimensional datasets.
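    The abstract does not specify how "surprising" is computed; purely as a hypothetical sketch, the snippet below operationalizes it as a deflation heuristic: project out the directions the user has already viewed, then show the top principal directions of the residual. This is our assumption for illustration, not SIDE's actual algorithm.

```python
import numpy as np

def surprising_projection(X, seen_dirs):
    """Illustrative heuristic (not SIDE's algorithm): remove the subspace the
    user has already inspected, then return the 2D projection with maximal
    residual variance.

    X         : (n_samples, n_features) data matrix
    seen_dirs : list of (n_features,) vectors already shown to the user
    """
    X = X - X.mean(axis=0)                      # center the data
    for d in seen_dirs:                         # deflate viewed directions
        d = d / np.linalg.norm(d)
        X = X - np.outer(X @ d, d)
    # Top-2 principal directions of what is left: the "most surprising"
    # 2D view under this (hypothetical) variance-based notion of surprise.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:2].T                                # (n_features, 2) projection
    return X @ W, W

# Usage: the first view is plain PCA; the second avoids what was already seen.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Y1, W1 = surprising_projection(X, seen_dirs=[])
Y2, W2 = surprising_projection(X, seen_dirs=[W1[:, 0], W1[:, 1]])
```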

    Inclusion of seasonal variation in river system microbial communities and phototroph activity increases environmental relevance of laboratory chemical persistence tests

    Regulatory tests assess the environmental fate and toxicity of crop protection products before approval for commercial use. Although globally applied laboratory tests can assess biodegradation, they lack environmental complexity. Microbial communities are subject to temporal and spatial variation, but these microbial dynamics receive little consideration in the laboratory. Here, we investigated seasonal variation in the microbial composition of water and sediment from a UK river across a two-year time course and determined its effect on the outcome of water-sediment (OECD 308) and water-only (OECD 309) biodegradation tests, using the fungicide isopyrazam. These OECD tests are performed under dark conditions, so test systems incubated under non-UV light:dark cycles were also included to determine the impact on both inoculum characteristics and biodegradation. Isopyrazam degradation was faster under non-UV light at all collection times in water-sediment microcosms, suggesting that phototrophic communities can metabolise isopyrazam throughout the year. Degradation rate varied seasonally between inoculum collection times only in microcosms incubated in the light, but isopyrazam mineralisation to ¹⁴CO₂ varied seasonally under both light and dark conditions, suggesting that heterotrophic communities may also play a role in degradation. Bacterial and phototroph communities varied over time, but there was no clear link between water or sediment microbial composition and variation in degradation rate. During the test period, inoculum microbial community composition changed, particularly in microcosms incubated under non-UV light. Overall, we show that regulatory test outcome is not influenced by temporal variation in microbial community structure; however, biodegradation rates from higher-tier studies with improved environmental realism, e.g. through addition of non-UV light, may be more variable. These data suggest that standardised OECD tests can provide a conservative estimate of pesticide persistence endpoints and that additional tests including non-UV light could help bridge the gap between standard tests and field studies.
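    For context on the persistence endpoints mentioned above: regulatory endpoints such as DT50 are commonly derived by fitting a kinetic model to the measured degradation data. The first-order model and the numbers below are illustrative assumptions of ours, as the abstract does not state the kinetics used.

```latex
% Hedged background sketch: first-order kinetics is a common (here assumed,
% not stated) model for deriving persistence endpoints such as DT50.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Under first-order decay the parent concentration follows
\[
  C(t) = C_{0}\, e^{-kt},
  \qquad
  \mathrm{DT}_{50} = \frac{\ln 2}{k},
\]
so, for example, a fitted rate of $k = 0.014\,\mathrm{d}^{-1}$ corresponds to
$\mathrm{DT}_{50} \approx 50$ days (illustrative numbers only).
\end{document}
```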

    Flexible delivery of Er:YAG radiation at 2.94 µm with negative-curvature silica glass fibers: a new solution for minimally invasive surgical procedures

    We present the delivery of high-energy microsecond pulses through a hollow-core negative-curvature fiber at 2.94 µm. The energy densities delivered far exceed those required for biological tissue manipulation and are of the order of 2300 J/cm². Tissue ablation was demonstrated on hard and soft tissue in dry and aqueous conditions with no detrimental effects to the fiber or catastrophic damage to the end facets. The energy is guided in a well-confined single mode, allowing a small and controllable focused spot to be delivered flexibly to the point of operation. Hence, a mechanically and chemically robust alternative to existing Er:YAG delivery systems is proposed, which opens new routes to minimally invasive surgical laser procedures.
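    To relate pulse energy to the quoted energy density, the fluence of a pulse focused to a spot of radius r is E_pulse/(πr²); the pulse energy and spot size in the worked example below are illustrative assumptions, not values from the abstract.

```latex
% Worked fluence example; the pulse energy and spot radius are assumed values.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  F = \frac{E_{\text{pulse}}}{\pi r^{2}}, \qquad
  F = \frac{1.8\times10^{-3}\,\text{J}}
           {\pi\,(5\times10^{-4}\,\text{cm})^{2}}
    \approx 2.3\times10^{3}\,\text{J/cm}^{2},
\]
i.e.\ a ${\sim}1.8$\,mJ pulse focused to a $10\,\mu\text{m}$-diameter spot
reaches the ${\sim}2300\,\text{J/cm}^{2}$ regime quoted above.
\end{document}
```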

    Efficient estimation of AUC in a sliding window

    In many applications, monitoring the area under the ROC curve (AUC) in a sliding window over a data stream is a natural way of detecting changes in the system. The drawback is that computing AUC in a sliding window is expensive, especially if the window size is large and the data flow is significant. In this paper we propose a scheme for maintaining an approximate AUC in a sliding window of length k. More specifically, we propose an algorithm that, given ε, estimates AUC within ε/2, and can maintain this estimate in O((log k)/ε) time per update as the window slides. This provides a speed-up over the exact computation of AUC, which requires O(k) time per update. The speed-up becomes more significant as the size of the window increases. Our estimate is based on grouping the data points together, and using these groups to calculate AUC. The grouping is designed carefully such that (i) the groups are small enough, so that the error stays small, (ii) the number of groups is small, so that enumerating them is not expensive, and (iii) the definition is flexible enough so that we can maintain the groups efficiently. Our experimental evaluation demonstrates that the average approximation error in practice is much smaller than the approximation guarantee ε/2, and that we can achieve significant speed-ups with only a modest sacrifice in accuracy.
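    As a toy illustration of the grouping idea (not the paper's algorithm: this sketch recomputes the groups on each call rather than maintaining them incrementally in O((log k)/ε) time), bucketing scores and treating scores within a bucket as tied yields an approximate AUC:

```python
import random
from collections import deque

def exact_auc(window):
    """Exact AUC: probability that a random positive outranks a random
    negative, with ties counted as 1/2. O(k^2) pairwise form, for reference."""
    pos = [s for s, y in window if y == 1]
    neg = [s for s, y in window if y == 0]
    if not pos or not neg:
        return float("nan")
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def grouped_auc(window, num_groups=32):
    """Approximate AUC: bucket scores into equal-width groups and treat
    scores in the same group as tied. Small groups keep the error small;
    few groups keep the pass over them cheap."""
    scores = [s for s, _ in window]
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / num_groups or 1.0        # guard against hi == lo
    pos = [0] * num_groups
    neg = [0] * num_groups
    for s, y in window:
        g = min(int((s - lo) / width), num_groups - 1)
        (pos if y == 1 else neg)[g] += 1
    total_pos, total_neg = sum(pos), sum(neg)
    if total_pos == 0 or total_neg == 0:
        return float("nan")
    wins, neg_seen = 0.0, 0
    for g in range(num_groups):                  # groups in increasing score
        wins += pos[g] * (neg_seen + 0.5 * neg[g])
        neg_seen += neg[g]
    return wins / (total_pos * total_neg)

# Sliding-window usage: deque(maxlen=k) evicts the oldest pair on append.
random.seed(1)
stream = [(random.random() + 0.3 * y, y) for y in random.choices([0, 1], k=500)]
window = deque(stream[:100], maxlen=100)
for item in stream[100:]:
    window.append(item)
print(exact_auc(window), grouped_auc(window))
```

    Finer buckets shrink the approximation error at the cost of enumerating more groups, mirroring the trade-off the abstract describes.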