
    A randomized trial in a massive online open course shows people don't know what a statistically significant relationship looks like, but they can learn

    Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI: 45.1%-49.7%) of statistically significant relationships, and 74.6% (95% CI: 72.5%-76.6%) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest that: (1) evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users; (2) data analysts can be trained to improve detection of statistically significant results with practice; but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values, available at http://glimmer.rstudio.com/afisher/EDA/. Comment: 7 pages, including 2 figures and 1 table
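    As a rough illustration of the task the subjects faced (not the authors' web tool; the function name and parameter values below are made up for the sketch), the following Python snippet draws a sample with a chosen true correlation and reports whether the fitted linear relationship is significant at the P < 0.05 level:

```python
import numpy as np
from scipy import stats

def simulate_relationship(n=100, rho=0.2, seed=0):
    """Draw n (x, y) points with true correlation rho and test whether the slope differs from 0."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    # y shares correlation ~rho with x, plus independent noise
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    result = stats.linregress(x, y)   # t-test on the slope, equivalent to testing rho = 0
    return result.pvalue

p = simulate_relationship(n=100, rho=0.2)
print(f"p-value = {p:.3f}; significant at 0.05: {p < 0.05}")
```

    Plotting the same (x, y) sample with and without a best fit line is one way to reproduce the visual-aid comparison described in the abstract.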

    Link Travel Time Estimation in Double-Queue-Based Traffic Models

    The double-queue concept has gained popularity in dynamic user equilibrium (DUE) modeling because it can properly model real traffic dynamics. While directly solving such double-queue-based DUE problems is extremely challenging, a recent study proposed an approximation scheme, called the first-order approximation, to simplify the link travel time estimation of DUE problems without evaluating its properties and performance. This paper focuses on directly investigating the First-In-First-Out (FIFO) property and the performance of the first-order approximation in link travel time estimation by designing and modeling dynamic network loading (DNL) on single-line stretch networks. After model formulation, we analyze the FIFO property of the first-order approximation. Then a series of numerical experiments is conducted to demonstrate the FIFO property of the first-order approximation, and to compare its performance with those using the second-order approximation, a point queue model, and the cumulative inflow and exit flow curves. The numerical results show that the first-order approximation does not guarantee FIFO and also suggest that the second-order approximation is recommended, especially when the link exit flow is increasing. The study provides guidance for further work on new methods to better estimate link travel times.
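    For context, a minimal sketch of the cumulative-curve baseline mentioned in the abstract (not the first-order approximation itself; the helper name and toy flow rates are assumptions) reads the travel time of an entering vehicle as the horizontal distance between the cumulative inflow and exit curves:

```python
import numpy as np

def link_travel_time(times, cum_inflow, cum_exit, t_enter):
    """Horizontal distance between cumulative curves at the entering vehicle's count."""
    n_enter = np.interp(t_enter, times, cum_inflow)  # vehicles that have entered by t_enter
    t_exit = np.interp(n_enter, cum_exit, times)     # time at which that many have exited
    return t_exit - t_enter

# Toy example: constant inflow of 10 veh/min into a link with an 8 veh/min exit
# capacity, so a queue grows at 2 veh/min. Free-flow time is taken as zero here,
# so the result is the pure queuing delay.
times = np.arange(0.0, 61.0)      # minutes
cum_inflow = 10.0 * times
cum_exit = 8.0 * times
print(link_travel_time(times, cum_inflow, cum_exit, t_enter=30.0))  # 7.5 minutes
```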

    SmartTrack: Efficient Predictive Race Detection

    Widely used data race detectors, including the state-of-the-art FastTrack algorithm, incur performance costs that are acceptable for regular in-house testing, but miss races detectable from the analyzed execution. Predictive analyses detect more data races in an analyzed execution than FastTrack detects, but at significantly higher performance cost. This paper presents SmartTrack, an algorithm that optimizes predictive race detection analyses, including two analyses from prior work and a new analysis introduced in this paper. SmartTrack's algorithm incorporates two main optimizations: (1) epoch and ownership optimizations from prior work, applied to predictive analysis for the first time; and (2) novel conflicting critical section optimizations introduced by this paper. Our evaluation shows that SmartTrack achieves performance competitive with FastTrack, a qualitative improvement in the state of the art for data race detection. Comment: Extended arXiv version of PLDI 2020 paper (adds Appendices A-E)
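    For background, the vector-clock machinery that FastTrack-style detectors build on can be sketched as follows (a textbook-style happens-before check, not SmartTrack's predictive analysis; all names are illustrative):

```python
from dataclasses import dataclass, field

@dataclass
class VectorClock:
    clocks: dict = field(default_factory=dict)    # thread id -> logical time

    def happens_before(self, other: "VectorClock") -> bool:
        # self <= other componentwise means self's access is ordered before other's
        return all(t <= other.clocks.get(tid, 0) for tid, t in self.clocks.items())

thread_clock = {"T1": VectorClock({"T1": 1}), "T2": VectorClock({"T2": 1})}
last_write_to_x = None                            # vector clock of the last write to x

def write_x(tid):
    """Report a write-write race if the previous write is not ordered before this one."""
    global last_write_to_x
    current = thread_clock[tid]
    if last_write_to_x is not None and not last_write_to_x.happens_before(current):
        print(f"potential write-write race on x at thread {tid}")
    last_write_to_x = VectorClock(dict(current.clocks))

write_x("T1")
write_x("T2")   # T1 and T2 never synchronize, so this write is reported as racy
```

    FastTrack's epoch optimization replaces most of these full vector clocks with a single (thread, time) pair; SmartTrack applies that idea, plus ownership and conflicting-critical-section optimizations, to the predictive setting.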

    Role of Copper Oxide Layer on Pool Boiling Performance with Femtosecond Laser Processed Surfaces

    Copper pool boiling surfaces are tested for pool boiling enhancement due to femtosecond laser surface processing (FLSP). FLSP creates self-organized micro/nanostructures on metallic surfaces, producing highly wetting and wicking surfaces with permanent surface features. In this study, two series of samples were created. The first series consists of three flat FLSP copper surfaces with varying microstructures, and the second series is an open microchannel configuration with laser processing over the horizontal surfaces of the microchannels. These microchannels range in height from 125 microns to 380 microns. Each of these surfaces was tested for pool boiling performance. All the processed surfaces except one resulted in a decrease in critical heat flux (CHF) and heat transfer coefficient (HTC) compared to an unprocessed surface, and the laser fluence parameter played a significant role in whether CHF or HTC increased. A cross-sectioning technique was employed to study the different layers of the microstructure and to understand how FLSP could have a negative effect on the CHF and HTC. A thick oxide layer forms during the FLSP process of copper in an open-air atmosphere, and the thickness and uniformity of this layer are highly dependent on the laser fluence. A low-fluence sample results in an inconsistent oxide layer of nonuniform thickness and subsequently an increase in CHF and HTC. A high-fluence sample results in a uniformly thick oxide layer which increases the thermal resistance of the sample and allows for a premature CHF and decrease in HTC.
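    A back-of-envelope sketch of why a uniformly thick oxide layer raises thermal resistance: the added conduction resistance per unit area is R'' = t/k. The conductivities, thicknesses, and heat flux below are illustrative assumptions, not measurements from the study:

```python
# Added conduction resistance per unit area of a surface layer: R'' = t / k.
# All property values here are rough illustrative numbers only.
k_copper = 400.0        # W/(m K), bulk copper
k_oxide = 20.0          # W/(m K), assumed order of magnitude for a copper oxide layer
heat_flux = 100.0e4     # W/m^2 (100 W/cm^2, an assumed pool boiling heat flux scale)

for thickness_um in (1.0, 5.0, 20.0):
    t = thickness_um * 1e-6                 # layer thickness in metres
    r_oxide = t / k_oxide                   # m^2 K / W
    r_copper = t / k_copper                 # same thickness of pure copper, for contrast
    delta_T = r_oxide * heat_flux           # extra superheat needed to drive the same flux
    print(f"{thickness_um:5.1f} um: R''_oxide = {r_oxide:.1e}, "
          f"R''_copper = {r_copper:.1e} m^2K/W, extra dT = {delta_T:.2f} K")
```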

    An Advanced Apparatus for Integrating Nanophotonics and Cold Atoms

    Integrating nanophotonics with cold atoms permits the exploration of novel paradigms in quantum optics and many-body physics. We realize an advanced apparatus that enables the delivery of single-atom tweezer arrays in the vicinity of photonic crystal waveguides.

    Towards Reactive Acoustic Jamming for Personal Voice Assistants

    Personal Voice Assistants (PVAs) such as the Amazon Echo are commonplace, and it is now likely that one is always within range of at least one PVA. Although the devices are very helpful, they also continuously monitor conversations. When a PVA detects a wake word, the immediately following conversation is recorded and transported to a cloud system for further analysis. In this paper we investigate an active protection mechanism against PVAs: reactive jamming. A Protection Jamming Device (PJD) is employed to observe conversations. Upon detection of a PVA wake word, the PJD emits an acoustic jamming signal. The PJD must detect the wake word faster than the PVA such that the jamming signal still prevents wake word detection by the PVA. The paper presents an evaluation of the effectiveness of different jamming signals. We quantify the impact of jamming signal and wake word overlap on jamming success. Furthermore, we quantify the jamming false positive rate in dependence of the overlap. Our evaluation shows that 100% jamming success can be achieved with an overlap of at least 60%, with a negligible false positive rate. Thus, reactive jamming of PVAs is feasible without creating a system perceived as a noise nuisance.
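    A toy calculation of the overlap constraint (the wake word length and detection latencies below are assumed; only the 60% threshold comes from the abstract): if the PJD needs some latency to recognise the wake word before it starts jamming, the jamming signal can only overlap the remaining portion of the wake word.

```python
# Toy sketch of the overlap constraint described in the abstract. The wake word
# duration and PJD latencies are illustrative assumptions.
def overlap_fraction(wake_word_s, detection_latency_s):
    """Fraction of the wake word that a jammer starting after `detection_latency_s` still covers."""
    return max(0.0, (wake_word_s - detection_latency_s) / wake_word_s)

WAKE_WORD_S = 0.8          # assumed length of a spoken wake word, seconds
REQUIRED_OVERLAP = 0.60    # threshold reported in the abstract

for latency in (0.1, 0.3, 0.5):
    f = overlap_fraction(WAKE_WORD_S, latency)
    verdict = "meets" if f >= REQUIRED_OVERLAP else "misses"
    print(f"latency {latency:.1f}s -> overlap {f:.0%} ({verdict} the 60% threshold)")
```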