26,074 research outputs found

    GraFIX: a semiautomatic approach for parsing low- and high-quality eye-tracking data

    Fixation durations (FD) have been used widely as a measurement of information processing and attention. However, issues like data quality can seriously influence the accuracy of the fixation detection methods and, thus, affect the validity of our results (Holmqvist, Nyström, & Mulvey, 2012). This is crucial when studying special populations such as infants, where common issues with testing (e.g., high degree of movement, unreliable eye detection, low spatial precision) result in highly variable data quality and render existing FD detection approaches highly time consuming (hand-coding) or imprecise (automatic detection). To address this problem, we present GraFIX, a novel semiautomatic method consisting of a two-step process in which eye-tracking data are initially parsed using velocity-based algorithms whose input parameters are adapted by the user and then manipulated using the graphical interface, allowing accurate and rapid adjustments of the algorithms’ outcome. The present algorithms (1) smooth the raw data, (2) interpolate missing data points, and (3) apply a number of criteria to automatically evaluate and remove artifactual fixations. The input parameters (e.g., velocity threshold, interpolation latency) can easily be adapted manually to fit each participant. Furthermore, the present application includes visualization tools that facilitate the manual coding of fixations. We assessed this method by performing an intercoder reliability analysis in two groups of infants presenting low- and high-quality data and compared it with previous methods. Results revealed that our two-step approach with adaptable FD detection criteria gives rise to more reliable and stable measures in low- and high-quality data.
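
    The automatic first pass described above can be illustrated with a minimal velocity-threshold sketch in Python. This is not the GraFIX implementation: the function name, the parameter defaults (a 30 deg/s velocity threshold, a 100 ms minimum fixation duration), and the simple moving-average smoother are illustrative assumptions standing in for the adaptable parameters the abstract describes.

        import numpy as np

        def detect_fixations(x, y, t, velocity_threshold=30.0, min_fixation_ms=100.0):
            """Illustrative velocity-threshold fixation parser (not GraFIX itself).

            x, y: gaze coordinates in degrees of visual angle (NaN = missing sample).
            t:    timestamps in milliseconds.
            Returns a list of (onset_ms, offset_ms) fixation intervals.
            """
            x, y, t = (np.asarray(a, dtype=float) for a in (x, y, t))

            # 1. Smooth the raw signal with a short moving-average window.
            kernel = np.ones(3) / 3.0
            xs = np.convolve(x, kernel, mode="same")
            ys = np.convolve(y, kernel, mode="same")

            # 2. Interpolate missing samples (NaNs, including those the smoothing
            #    step spread around gaps) linearly over time.
            for arr in (xs, ys):
                nan = np.isnan(arr)
                if nan.any() and (~nan).any():
                    arr[nan] = np.interp(t[nan], t[~nan], arr[~nan])

            # 3. Point-to-point velocity in deg/s; samples below the threshold
            #    are candidate fixation samples.
            vel = np.hypot(np.diff(xs), np.diff(ys)) / (np.diff(t) / 1000.0)
            is_fix = np.concatenate([[False], vel < velocity_threshold])

            # 4. Merge runs of fixation samples and drop artifactually short ones.
            fixations, start = [], None
            for i, flag in enumerate(is_fix):
                if flag and start is None:
                    start = i
                elif not flag and start is not None:
                    if t[i - 1] - t[start] >= min_fixation_ms:
                        fixations.append((t[start], t[i - 1]))
                    start = None
            if start is not None and t[-1] - t[start] >= min_fixation_ms:
                fixations.append((t[start], t[-1]))
            return fixations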

    Multi-path Summation for Decoding 2D Topological Codes

    Fault tolerance is a prerequisite for scalable quantum computing. Architectures based on 2D topological codes are effective for near-term implementations of fault tolerance. To obtain high performance with these architectures, we require a decoder which can adapt to the wide variety of error models present in experiments. The typical approach to the problem of decoding the surface code is to reduce it to minimum-weight perfect matching in a way that provides a suboptimal threshold error rate, and is specialized to correct a specific error model. Recently, optimal threshold error rates for a variety of error models have been obtained by methods which do not use minimum-weight perfect matching, showing that such thresholds can be achieved in polynomial time. It is an open question whether these results can also be achieved by minimum-weight perfect matching. In this work, we use belief propagation and a novel algorithm for producing edge weights to increase the utility of minimum-weight perfect matching for decoding surface codes. This allows us to correct depolarizing errors using the rotated surface code, obtaining a threshold of 17.76 ± 0.02%. This is larger than the threshold achieved by previous matching-based decoders (14.88 ± 0.02%), though still below the known upper bound of ~18.9%.
    Comment: 19 pages, 13 figures, published in Quantum, available at https://quantum-journal.org/papers/q-2018-10-19-102
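
    The role of belief propagation here is to supply per-pair error probabilities that become the edge weights fed to the matcher. Below is a hedged sketch of that hand-off, using a common log-likelihood weighting and networkx's matcher purely for illustration; a production decoder would use a dedicated blossom implementation, and pair_probability is a hypothetical stand-in for the belief-propagation output, not the paper's algorithm.

        import math
        import networkx as nx

        def mwpm_pairing(defects, pair_probability):
            """Pair syndrome defects by minimum-weight perfect matching.

            defects:          an even-sized list of defect identifiers.
            pair_probability: callable returning the probability (strictly in
                              (0, 1)), e.g. estimated by belief propagation,
                              that an error chain connects two given defects.
            """
            G = nx.Graph()
            for i, d1 in enumerate(defects):
                for d2 in defects[i + 1:]:
                    p = pair_probability(d1, d2)
                    # Log-likelihood edge weight: unlikely pairings get large
                    # weights and are avoided by the matcher.
                    weight = -math.log(p / (1.0 - p))
                    # networkx maximises weight, so negate to minimise.
                    G.add_edge(d1, d2, weight=-weight)
            return nx.max_weight_matching(G, maxcardinality=True)

        # Toy example: four defects with made-up pair probabilities.
        table = {frozenset({"s1", "s2"}): 0.30, frozenset({"s3", "s4"}): 0.25,
                 frozenset({"s1", "s3"}): 0.05, frozenset({"s2", "s4"}): 0.05,
                 frozenset({"s1", "s4"}): 0.02, frozenset({"s2", "s3"}): 0.02}
        pairs = mwpm_pairing(["s1", "s2", "s3", "s4"],
                             lambda a, b: table[frozenset({a, b})])
        print(pairs)   # pairs s1 with s2 and s3 with s4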

    Human activities and global warming: a cointegration analysis

    Using econometric tools for selecting I(1) and I(2) trends, we found the existence of static and dynamic long-run steady-state relations between temperature and the radiative forcing of solar irradiance and a set of three greenhouse-gas series. Estimates of the adjustment coefficients indicate that the temperature series error-corrects around 5-65% of the disequilibria each year, depending on the type of long-run relation. The estimates of the I(1) and I(2) trends indicate that they are driven by linear combinations of the three greenhouse gases, and their loadings indicate a strong impact on the temperature series. The equilibrium temperature change for a doubling of carbon dioxide is between 2.15 and 3.4 °C, which is in agreement with past literature and with the 2001 IPCC report based on 15 different general circulation models.
    Keywords: Global warming; Radiative forcing; Cointegration; I(1) process; I(2) process; Unit roots
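
    The quoted climate-sensitivity range can be reproduced from a long-run temperature/forcing coefficient with a short back-of-the-envelope calculation. The simplified CO2 forcing expression dF = 5.35 ln(C/C0) W/m^2 is the standard Myhre et al. (1998) approximation; the coefficient values below are back-calculated illustrations, not the paper's estimates.

        import math

        # Simplified CO2 forcing (Myhre et al., 1998): dF = 5.35 * ln(C / C0) in W/m^2.
        def co2_forcing(concentration_ratio):
            return 5.35 * math.log(concentration_ratio)

        # Equilibrium warming implied by a long-run temperature/forcing coefficient
        # lambda_hat (degC per W/m^2), e.g. taken from a cointegrating relation.
        def equilibrium_warming(lambda_hat, concentration_ratio=2.0):
            return lambda_hat * co2_forcing(concentration_ratio)

        # A CO2 doubling adds about 3.71 W/m^2, so coefficients of roughly
        # 0.58-0.92 degC/(W/m^2) reproduce the 2.15-3.4 degC range quoted above.
        for lam in (0.58, 0.92):
            print(f"lambda = {lam:.2f} degC/(W/m^2) -> ECS = {equilibrium_warming(lam):.2f} degC")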

    Reliability assessment of null allele detection: inconsistencies between and within different methods

    Microsatellite loci are widely used in population genetic studies, but the presence of null alleles may lead to biased results. Here we assessed five methods that indirectly detect null alleles, and found large inconsistencies among them. Our analysis was based on 20 microsatellite loci genotyped in a natural population of Microtus oeconomus sampled over 8 years, together with 1200 simulated populations without null alleles, but experiencing bottlenecks of varying duration and intensity, and 120 simulated populations with known null alleles. In the natural population, 29% of positive results were consistent between the methods in pairwise comparisons, and in the simulated dataset this proportion was 14%. The positive results were also inconsistent between different years in the natural population. In the null-allele-free simulated dataset, the number of false positives increased with increasing bottleneck intensity and duration. We also found low concordance in null allele detection between the original simulated populations and their 20% random subsets. In the populations simulated to include null alleles, between 22% and 42% of true null alleles remained undetected, which highlights that detection errors are not restricted to false positives. None of the evaluated methods clearly outperformed the others when both false positive and false negative rates were considered. Accepting only the positive results consistent between at least two methods should considerably reduce the false positive rate, but this approach may increase the false negative rate. Our study demonstrates the need for novel null allele detection methods that can be reliably applied to natural populations.
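
    For context on how such indirect checks typically work, two widely used heterozygosity-based estimators of null allele frequency are sketched below. They are shown for illustration only; the abstract does not name the five methods it evaluates, and the toy numbers are invented.

        def chakraborty_null_freq(he, ho):
            """Chakraborty et al. (1992): r = (He - Ho) / (He + Ho)."""
            return (he - ho) / (he + ho)

        def brookfield_null_freq(he, ho):
            """Brookfield (1996), estimator 1: r = (He - Ho) / (1 + He)."""
            return (he - ho) / (1.0 + he)

        # Toy locus with a heterozygote deficit (made-up values): the two
        # estimators already disagree on the inferred null allele frequency,
        # echoing the between-method inconsistencies reported above.
        he, ho = 0.80, 0.65   # expected vs. observed heterozygosity
        print(round(chakraborty_null_freq(he, ho), 3))   # 0.103
        print(round(brookfield_null_freq(he, ho), 3))    # 0.083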

    Predicting growing stock volume of Eucalyptus plantations using 3-D point clouds derived from UAV imagery and ALS data

    Estimating forest inventory variables is important in monitoring forest resources and mitigating climate change. In this respect, forest managers require flexible, non-destructive methods for estimating volume and biomass. High-resolution and low-cost remote sensing data are increasingly available to measure three-dimensional (3D) canopy structure and to model forest structural attributes. The main objective of this study was to evaluate and compare individual tree volume estimates derived from high-density point clouds obtained from airborne laser scanning (ALS) and digital aerial photogrammetry (DAP) in Eucalyptus spp. plantations. Object-based image analysis (OBIA) techniques were applied for individual tree crown (ITC) delineation. The ITC algorithm applied correctly detected and delineated 199 trees from ALS-derived data, while 192 trees were correctly identified using DAP-based point clouds acquired from Unmanned Aerial Vehicles (UAVs), representing accuracy levels of 62% and 60%, respectively. For volume modelling, a non-linear regression fit based on individual tree height and individual crown area derived from the ITC gave the following results: Model Efficiency (Mef) = 0.43 and 0.46, Root Mean Square Error (RMSE) = 0.030 m3 and 0.026 m3, rRMSE = 20.31% and 19.97%, and approximately unbiased results (0.025 m3 and 0.0004 m3) for the DAP- and ALS-based estimations, respectively. No significant difference was found between the observed values (field data) and the volume estimates from ALS and DAP (p-values from t-test = 0.99 and 0.98, respectively). The proposed approaches could also be used to estimate basal area or biomass stocks in Eucalyptus spp. plantations.
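
    The reported fit statistics (Mef, RMSE, rRMSE, bias) can be computed as below for a generic height-and-crown-area volume model. The power-law functional form, the starting values, and the helper names are assumptions made for illustration; the abstract does not specify the non-linear model actually fitted.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical allometric form: volume (m^3) as a power function of
        # tree height (m) and crown area (m^2).
        def volume_model(X, a, b, c):
            height, crown_area = X
            return a * height**b * crown_area**c

        def fit_and_evaluate(height, crown_area, observed_volume):
            height, crown_area, observed_volume = map(
                np.asarray, (height, crown_area, observed_volume))
            params, _ = curve_fit(volume_model, (height, crown_area),
                                  observed_volume, p0=(0.01, 1.0, 1.0), maxfev=10000)
            predicted = volume_model((height, crown_area), *params)
            residuals = observed_volume - predicted
            rmse = np.sqrt(np.mean(residuals**2))                  # m^3
            rrmse = 100.0 * rmse / np.mean(observed_volume)        # % of mean volume
            mef = 1.0 - np.sum(residuals**2) / np.sum(
                (observed_volume - observed_volume.mean())**2)     # model efficiency
            bias = np.mean(residuals)                              # m^3
            return params, rmse, rrmse, mef, bias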