
    Efficacy Of A Structured Free Recall Intervention To Improve Rating Quality In Performance Evaluations

    This experiment investigated the effects of rater training on halo errors and accuracy during performance evaluations. 408 participants were randomly assigned to one of three groups (n=136 each), where they were presented with a structured free recall intervention (SFRI), frame of reference training (FoRT), or no training. The purpose of this study was to further investigate the efficacy of SFRI against prominent training methods and against no training at all. Results were not significant and did not support previous findings in the literature. Further explanations are offered, and a discussion is presented as to why these results were obtained.
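The three-group random assignment described above can be sketched as follows (the group labels come from the abstract; the function name and approach are illustrative assumptions, not the study's actual procedure):

```python
import random

def assign_groups(n_participants=408, groups=("SFRI", "FoRT", "control")):
    """Randomly assign participants to equally sized groups."""
    size = n_participants // len(groups)          # 136 per group here
    ids = list(range(n_participants))
    random.shuffle(ids)                           # random order of participant IDs
    return {g: ids[i * size:(i + 1) * size] for i, g in enumerate(groups)}

assignment = assign_groups()
```

Because 408 divides evenly by three, every participant lands in exactly one group of 136.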

    Bronco Ember An Edge Computing Acceleration Platform with Computer Vision

    Bronco Ember is a nascent wildfire detection system that leverages edge computing capabilities, multi-spectral imaging, and artificial intelligence to greatly increase the performance of small satellite remote sensing payloads. The core hardware onboard is a SWIR InGaAs camera imaging in the 900–1700 nm wavelength range and a GPU-enabled single-board computer. Artificial intelligence is used for fire detection and analysis through computer vision and neural networks, which are able to detect fires filling only a few pixels in each image. The system is based on traditional CNN architectures and includes time series analysis, giving the system an 85% success rate in detecting wildfires of about 50 m diameter during a high-altitude balloon technology demonstration flight. The neural net is trained to monitor the movement and spread of the fire against prediction maps, which greatly reduces the number of false positives detected. The development of this payload has been supported through the NASA TechLeap Autonomous Observation Challenge No. 1, which has pushed the technology from concept to test flight in less than one calendar year. The system acts as a rapid-response remote sensing technology.

    Individualized Learning Plans in Guiding Career-Technical Course-Taking and Achieving Post-High-School Employment Goals

    Most adolescents and young adults in the U.S. seek employment after high school regardless of their education or work status, yet career readiness and work preparation have not received the same attention as college readiness and preparation at the secondary level. Using data from the High School Longitudinal Study of 2009 (HSLS:2009), we explored possible connections between individualized learning plans (ILPs) and both secondary Career-Technical Education (CTE) course taking and employment goal attainment in the U.S. Results showed that ILPs were positively associated with establishing employment goals, securing employment, and achieving employment goals after high school. Students who had employment goals were likely to earn more CTE credits and had higher probabilities of working after high school. However, ILPs moderated neither the relationship between employment goals and earned CTE credits nor the relationship between employment goals and work activities. Findings point to an oversight in integrated college and career readiness preparation and to underutilization of school-based career education resources. Keywords: individualized learning plan; career and technical education; employment; school-based career development; college and career readiness. DOI: 10.7176/JEP/14-16-04. Publication date: June 30th 202
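The moderation question above (does ILP status change the strength of the goal-to-CTE-credits relationship?) can be illustrated with a simple-slopes sketch. The data and threshold below are entirely synthetic and hypothetical; the actual HSLS:2009 analysis would use proper regression with interaction terms and survey weights:

```python
def slope(xs, ys):
    """Ordinary least-squares slope of y on x (for binary x, the group mean difference)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic illustration: employment goal (0/1) vs. CTE credits earned,
# split by ILP status. Similar slopes in both subgroups => no moderation.
ilp    = {"goal": [0, 0, 1, 1, 0, 1, 1, 0], "credits": [1.0, 1.5, 3.0, 3.5, 1.2, 3.1, 3.4, 1.1]}
no_ilp = {"goal": [0, 1, 0, 1, 1, 0, 1, 0], "credits": [1.1, 3.2, 1.0, 3.3, 3.0, 1.4, 3.6, 1.2]}

s_ilp = slope(ilp["goal"], ilp["credits"])        # goal->credits slope, ILP group
s_no  = slope(no_ilp["goal"], no_ilp["credits"])  # goal->credits slope, no-ILP group
moderated = abs(s_ilp - s_no) > 0.5               # crude threshold, illustration only
```

In this toy data the two slopes are nearly identical, mirroring the paper's null moderation finding.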

    Maximum discharges and maximum runoffs in Poland

    Published in: Natural environment of Poland and its protection, in Łódź University Geographical Research, edited by E. Kobojek and T. Marsza

    Maximum Fidelity

    The most fundamental problem in statistics is the inference of an unknown probability distribution from a finite number of samples. For a specific observed data set, answers to the following questions would be desirable: (1) Estimation: Which candidate distribution provides the best fit to the observed data? (2) Goodness-of-fit: How concordant is this distribution with the observed data? (3) Uncertainty: How concordant are other candidate distributions with the observed data? A simple unified approach for univariate data that addresses these traditionally distinct statistical notions is presented, called "maximum fidelity". Maximum fidelity is a strict frequentist approach that is fundamentally based on model concordance with the observed data. The fidelity statistic is a general information measure based on the coordinate-independent cumulative distribution and on critical yet previously neglected symmetry considerations. An approximation for the null distribution of the fidelity allows its direct conversion to absolute model concordance (p value). Fidelity maximization allows identification of the most concordant model distribution, generating a method for parameter estimation, with neighboring, less concordant distributions providing the "uncertainty" in this estimate. Maximum fidelity provides an optimal approach for parameter estimation (superior to maximum likelihood) and a generally optimal approach for goodness-of-fit assessment of arbitrary models applied to univariate data. Extensions to binary data, binned data, multidimensional data, and classical parametric and nonparametric statistical tests are described. Maximum fidelity provides a philosophically consistent, robust, and seemingly optimal foundation for statistical inference. All findings are presented in an elementary way to be immediately accessible to all researchers utilizing statistical analysis. Comment: 66 pages, 32 figures, 7 tables, submitted
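The "fidelity maximization" idea, scanning candidate distributions and keeping the one most concordant with the data's cumulative distribution, can be sketched as follows. The paper's actual fidelity statistic is not reproduced here; this sketch substitutes a Cramér–von Mises-style CDF discrepancy as an assumed stand-in, so only the scan-and-maximize structure reflects the abstract:

```python
import math
import random

def normal_cdf(x, mu, sigma):
    """CDF of the normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def cvm_discrepancy(data, mu, sigma):
    """Cramer-von Mises-style discrepancy between the model CDF and the
    empirical CDF; lower means the model is more concordant with the data."""
    xs = sorted(data)
    n = len(xs)
    return sum((normal_cdf(x, mu, sigma) - (2 * i + 1) / (2 * n)) ** 2
               for i, x in enumerate(xs))

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(500)]

# Concordance maximization in spirit: scan candidate means, keep the most
# concordant model; nearby candidates with similar discrepancy convey uncertainty.
grid = [mu / 100 for mu in range(150, 251)]       # candidate means 1.50 .. 2.50
best_mu = min(grid, key=lambda mu: cvm_discrepancy(data, mu, 1.0))
```

With 500 samples drawn at mean 2.0, the most concordant candidate mean lands close to 2.0.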

    Distributed Approximation of Maximum Independent Set and Maximum Matching

    We present a simple distributed Δ-approximation algorithm for maximum weight independent set (MaxIS) in the CONGEST model which completes in O(MIS(G) · log W) rounds, where Δ is the maximum degree, MIS(G) is the number of rounds needed to compute a maximal independent set (MIS) on G, and W is the maximum weight of a node. Whether our algorithm is randomized or deterministic depends on the MIS algorithm used as a black box. Plugging in the best known algorithm for MIS gives a randomized solution in O(log n · log W) rounds, where n is the number of nodes. We also present a deterministic O(Δ + log* n)-round algorithm based on coloring. We then show how to use our MaxIS approximation algorithms to compute a 2-approximation for maximum weight matching without incurring any additional round penalty in the CONGEST model. We use a known reduction for simulating algorithms on the line graph while incurring congestion, but we show our algorithm is part of a broad family of local aggregation algorithms for which we describe a mechanism that allows the simulation to run in the CONGEST model without additional overhead. Next, we show that for maximum weight matching, relaxing the approximation factor to (2+ε) allows us to devise a distributed algorithm requiring O(log Δ / log log Δ) rounds for any constant ε > 0. For the unweighted case, we can even obtain a (1+ε)-approximation in this number of rounds. These algorithms are the first to achieve the provably optimal round complexity with respect to dependency on Δ.
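The 2-approximation target for maximum weight matching is the same guarantee achieved by the classic sequential greedy algorithm (take the heaviest remaining edge whose endpoints are both free). The paper's distributed CONGEST algorithm is different; the sketch below only illustrates the approximation notion on a toy graph:

```python
def greedy_matching(edges):
    """Sequential greedy 2-approximation for maximum weight matching.

    edges: list of (weight, u, v) tuples. Scans edges in decreasing weight
    order and takes an edge whenever both endpoints are still unmatched.
    """
    matched = set()
    matching = []
    for w, u, v in sorted(edges, reverse=True):
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

# Path a-b-c-d with weights 3, 4, 3: greedy takes (b, c) for weight 4,
# while the optimum is {(a, b), (c, d)} for weight 6 -- within factor 2.
m = greedy_matching([(3, "a", "b"), (4, "b", "c"), (3, "c", "d")])
```

The path example is the standard tight-ish case showing why greedy cannot beat a factor of 2 in general.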

    Maximum entanglement of formation for a two-mode Gaussian state over passive operations

    We quantify the maximum amount of entanglement of formation (EoF) that can be achieved by continuous-variable states under passive operations, which we refer to as EoF-potential. Focusing in particular on two-mode Gaussian states, we derive analytical expressions for the EoF-potential for specific classes of states. For more general states, we demonstrate that this quantity can be upper-bounded by the minimum amount of squeezing needed to synthesize the Gaussian modes, a quantity called squeezing of formation. Our work thus provides a new link between the non-classicality of quantum states and the non-classicality of correlations. Comment: Revised version