
    Looking Good With Flickr Faves: Gaussian Processes for Finding Difference Makers in Personality Impressions

    Flickr allows its users to create galleries of "faves", i.e., pictures that they have tagged as favourites. According to recent studies, faves are predictive of the personality traits that people attribute to Flickr users. This article investigates the phenomenon and shows that faves allow one to predict whether a Flickr user is perceived to be above the median or not with respect to each of the Big-Five traits (accuracy up to 79% depending on the trait). The classifier, based on Gaussian Processes with a new kernel designed for this work, also allows one to identify the visual characteristics of faves that best account for the prediction outcome.
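
    As a loose illustration of the setup described above, the sketch below trains a Gaussian process classifier on toy per-user feature vectors. The paper's purpose-built kernel is not reproduced here; a standard RBF kernel stands in, and all data, dimensions and names are invented.

        # Hypothetical sketch: binary GP classification of above/below-median
        # perceived trait scores from aggregated visual features of a user's
        # faves. Toy data; the paper's bespoke kernel is replaced by an RBF.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 16))                  # per-user fave features (toy)
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # above/below median, one trait

        kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
        clf = GaussianProcessClassifier(kernel=kernel, random_state=0)
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
        print(f"CV accuracy: {acc:.2f}")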

    Discontinuous Reception for Multiple-Beam Communication

    Discontinuous reception (DRX) techniques have been successfully proposed for energy savings in 4G radio access systems, which are deployed on legacy 2 GHz spectrum bands with omni-directional signal propagation. Upcoming 5G systems will also utilize higher frequency spectrum bands. Unfortunately, higher frequency bands suffer more significant path loss and therefore require directional beamforming to concentrate the radiated signal in a certain direction. We therefore propose a DRX scheme for multiple-beam (DRXB) communication scenarios. The proposed DRXB scheme is designed to avoid unnecessary energy- and time-consuming beam-training procedures, which enables longer sleep periods and shorter wake-up latency. We provide an analytical model to investigate the receiver-side energy efficiency and transmission latency of the proposed scheme. Through simulations, our approach is shown to have clear performance improvements over the conventional DRX scheme, in which beam training is conducted in each DRX cycle. Funded by the Swedish Research Council, the National Natural Science Foundation of China, and European Union Horizon 2020.
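
    The energy argument can be illustrated with a toy Monte Carlo model, not the paper's analytical model: the per-cycle energy costs and packet-arrival probability below are invented, and the only mechanism captured is that a DRXB-style receiver skips beam training in cycles with no traffic.

        # Toy comparison: beam training every DRX cycle (conventional) vs.
        # training only when a packet actually arrives (DRXB idea, simplified).
        # All costs and probabilities are illustrative assumptions.
        import random

        E_TRAIN, E_ON, E_SLEEP = 5.0, 1.0, 0.05   # assumed energy units per phase
        P_ARRIVAL = 0.1                           # chance a packet arrives per cycle
        CYCLES = 100_000

        def simulate(train_every_cycle: bool) -> float:
            total = 0.0
            for _ in range(CYCLES):
                arrived = random.random() < P_ARRIVAL
                # Conventional DRX retrains the beam each cycle; the DRXB-style
                # scheme defers training until there is traffic to receive.
                if train_every_cycle or arrived:
                    total += E_TRAIN
                total += E_ON if arrived else E_SLEEP
            return total / CYCLES

        random.seed(1)
        print(f"conventional DRX : {simulate(True):.3f} energy/cycle")
        print(f"DRXB-style scheme: {simulate(False):.3f} energy/cycle")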

    Gaussian Approximation Potentials: the accuracy of quantum mechanics, without the electrons

    We introduce a class of interatomic potential models that can be automatically generated from data consisting of the energies and forces experienced by atoms, as derived from quantum mechanical calculations. The resulting model does not have a fixed functional form and hence is capable of modeling complex potential energy landscapes. It is systematically improvable with more data. We apply the method to bulk carbon, silicon and germanium and test it by calculating properties of the crystals at high temperatures. Using the interatomic potential to generate the long molecular dynamics trajectories required for such calculations saves orders of magnitude in computational cost. Comment: v3-4: added new material and references.
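
    The core idea, learning a potential by nonparametric regression on quantum-mechanical reference data, can be sketched with ordinary GP regression on a toy one-dimensional descriptor. The descriptor, kernel and data below are illustrative stand-ins, not those of the paper.

        # Toy sketch: regress atomic energies on a 1-D environment descriptor
        # with a Gaussian process. Reference energies come from a Lennard-
        # Jones-like form plus noise, standing in for QM calculations.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(0)
        r = rng.uniform(0.8, 2.0, size=(300, 1))   # toy descriptor: neighbour distance
        E = 4.0 * (r[:, 0] ** -12 - r[:, 0] ** -6) + rng.normal(0, 0.01, 300)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-4)
        gp.fit(r, E)

        r_test = np.linspace(0.9, 1.9, 5).reshape(-1, 1)
        E_pred, E_std = gp.predict(r_test, return_std=True)
        for ri, e, s in zip(r_test[:, 0], E_pred, E_std):
            # The GP also reports predictive uncertainty, which grows away
            # from the training data -- the "systematically improvable" part.
            print(f"r={ri:.2f}  E={e:+.3f} +/- {s:.3f}")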

    Parallel Batch-Dynamic Graph Connectivity

    In this paper, we study batch parallel algorithms for the dynamic connectivity problem, a fundamental problem that has received considerable attention in the sequential setting. The most well-known sequential algorithm for dynamic connectivity is the elegant level-set algorithm of Holm, de Lichtenberg and Thorup (HDT), which achieves $O(\log^2 n)$ amortized time per edge insertion or deletion, and $O(\log n / \log\log n)$ time per query. We design a parallel batch-dynamic connectivity algorithm that is work-efficient with respect to the HDT algorithm for small batch sizes, and is asymptotically faster when the average batch size is sufficiently large. Given a sequence of batched updates, where $\Delta$ is the average batch size of all deletions, our algorithm achieves $O(\log n \log(1 + n/\Delta))$ expected amortized work per edge insertion and deletion and $O(\log^3 n)$ depth w.h.p. Our algorithm answers a batch of $k$ connectivity queries in $O(k \log(1 + n/k))$ expected work and $O(\log n)$ depth w.h.p. To the best of our knowledge, our algorithm is the first parallel batch-dynamic algorithm for connectivity. Comment: This is the full version of the paper appearing in the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA), 2019.
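
    For orientation, the sketch below shows only the easy, sequential half of the problem: applying a batch of edge insertions with union-find and answering a batch of connectivity queries. Deletions, the case that requires HDT's level structure and the paper's parallel machinery, are deliberately omitted.

        # Minimal sequential sketch: batch insertions + batch queries via
        # union-find. Not the paper's algorithm; deletions are unsupported.
        class UnionFind:
            def __init__(self, n: int):
                self.parent = list(range(n))

            def find(self, x: int) -> int:
                while self.parent[x] != x:
                    self.parent[x] = self.parent[self.parent[x]]  # path halving
                    x = self.parent[x]
                return x

            def union(self, a: int, b: int) -> None:
                ra, rb = self.find(a), self.find(b)
                if ra != rb:
                    self.parent[ra] = rb

        def apply_batch(uf, insertions, queries):
            for a, b in insertions:   # in the paper, batches run in parallel
                uf.union(a, b)
            return [uf.find(a) == uf.find(b) for a, b in queries]

        uf = UnionFind(6)
        print(apply_batch(uf, [(0, 1), (1, 2), (3, 4)], [(0, 2), (0, 3)]))
        # -> [True, False]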

    Properties of the mechanosensitive channel MscS pore revealed by tryptophan scanning mutagenesis

    Funding: This work was supported by a Wellcome Trust Programme grant [092552/A/10/Z awarded to I.R.B., S.M., J. H. Naismith (University of St Andrews, St Andrews, U.K.), and S. J. Conway (University of Oxford, Oxford, U.K.)] (T.R. and M.D.E.), by a BBSRC grant (A.R.) [BB/H017917/1 awarded to I.R.B., J. H. Naismith, and O. Schiemann (University of St Andrews)], by a Leverhulme Emeritus Fellowship (EM-2012-060\2), and by a CEMI grant to I.R.B. from the California Institute of Technology. The research leading to these results received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under Grant PITN-GA-2011-289384 (FP7-PEOPLE-2011-ITN NICHE) (H.G.) (awarded to S.M.).

    Sensitivity analysis of the reactor safety study

    Originally presented as the first author's M.S. thesis in the M.I.T. Dept. of Nuclear Engineering, 1979. The Reactor Safety Study (RSS), or WASH-1400, developed a methodology for estimating the public risk from light water nuclear reactors. To give further insight into this study, a sensitivity analysis has been performed to determine the significant contributors to risk for both the PWR and BWR. The sensitivity to variation of the point values of the failure probabilities reported in the RSS was determined for the safety systems identified therein, as well as for many of the generic classes from which individual failures contributed to system failures. Increasing as well as decreasing point values were considered. An analysis of the sensitivity to increasing uncertainty in system failure probabilities was also performed. The sensitivity parameters chosen were release category probabilities, core melt probability, and the risk parameters of early fatalities, latent cancers and total property damage; the latter three are adequate for describing all public risks identified in the RSS. The results indicate reductions of public risk by less than a factor of two for factor reductions in system or generic failure probabilities as high as one hundred. There also appears to be more benefit in monitoring the most sensitive systems to verify adherence to RSS failure rates than in backfitting present reactors. The sensitivity analysis results do indicate, however, possible benefits in reducing human error rates. Final report for a research project sponsored by Northeast Utilities Service Company and Yankee Atomic Electric Company under the M.I.T. Energy Laboratory Electric Utility Program.
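
    The flavour of such a point-value sensitivity study can be shown with an invented three-system risk model: scaling one system's failure probability by factors as large as one hundred moves the overall risk by much less when other contributors dominate. All numbers below are illustrative, not RSS values.

        # Toy point-value sensitivity study. Crude model: core melt requires
        # sys_A failing together with either sys_B or sys_C (independence
        # assumed). Probabilities and consequences are invented.
        BASE_FAILURE_P = {"sys_A": 1e-3, "sys_B": 5e-4, "sys_C": 2e-3}
        CONSEQUENCE = 1.0e4   # assumed consequence units per core-melt event

        def risk(failure_p: dict) -> float:
            p_melt = failure_p["sys_A"] * (failure_p["sys_B"] + failure_p["sys_C"])
            return p_melt * CONSEQUENCE

        base = risk(BASE_FAILURE_P)
        for factor in (0.01, 0.1, 1, 10, 100):   # decreases as well as increases
            p = dict(BASE_FAILURE_P, sys_B=BASE_FAILURE_P["sys_B"] * factor)
            print(f"sys_B x{factor:>6}: risk changes by {risk(p) / base:.2f}x")
        # Cutting sys_B's failure probability 100-fold barely moves the risk
        # (sys_C dominates), echoing the abstract's less-than-2x finding.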

    Pressurized water reactor loss-of-coolant accidents by hypothetical vessel rupture

    Also issued by the first author as an Sc.D. thesis, Massachusetts Institute of Technology, Dept. of Nuclear Engineering, 1972. Includes bibliographical references (leaves 331-349).

    Common cause analysis : a review and extension of existing methods

    The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments, are used to propagate uncertainties through the analysis, and the two capabilities of the code are then compared. When component failure rates are assumed lognormally distributed, bounded lognormal (Sb) distributions are used to evaluate the higher moment terms required by the Method-of-Moments, in order to minimize the effect of the tail of the lognormal. A code using the discrete probability distribution (DPD) method is developed for analyzing system unavailability due to common initiating events (internal and external). Sample problems demonstrating each approach are also presented.
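
    The two propagation approaches can be contrasted on a toy example: a parallel pair of components with lognormally distributed failure rates, propagated once by Monte Carlo sampling and once by combining the first two moments analytically. The lognormal parameters and system model are invented for illustration.

        # Toy uncertainty propagation for a two-component parallel system
        # whose unavailability is the product of the two failure rates.
        import numpy as np

        rng = np.random.default_rng(0)
        MU, SIGMA = np.log(1e-3), 0.8      # assumed lognormal parameters

        # Monte Carlo: sample the rates, push them through the system model.
        lam1 = rng.lognormal(MU, SIGMA, 100_000)
        lam2 = rng.lognormal(MU, SIGMA, 100_000)
        q_mc = lam1 * lam2
        print(f"Monte Carlo      : mean={q_mc.mean():.3e}  var={q_mc.var():.3e}")

        # Method-of-Moments: combine the first two moments analytically.
        m1 = np.exp(MU + SIGMA**2 / 2)     # E[lambda] for a lognormal
        m2 = np.exp(2 * MU + 2 * SIGMA**2) # E[lambda^2]
        mean_q = m1 * m1                   # independence: E[l1*l2] = E[l1]E[l2]
        var_q = m2 * m2 - mean_q**2        # Var = E[(l1*l2)^2] - (E[l1*l2])^2
        print(f"Method-of-Moments: mean={mean_q:.3e}  var={var_q:.3e}")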