
    A well-separated pairs decomposition algorithm for k-d trees implemented on multi-core architectures

    Variations of k-d trees represent a fundamental data structure in computational geometry, with numerous applications in science: for example, particle track fitting in the software of the LHC experiments, and simulations of N-body systems in the study of the dynamics of interacting galaxies, particle beam physics, and molecular dynamics in biochemistry. The many-body tree methods devised by Barnes and Hut in the 1980s and the Fast Multipole Method introduced in 1987 by Greengard and Rokhlin use variants of k-d trees to reduce the computation-time upper bound from O(n²) to O(n log n) and even O(n). We present an algorithm that uses the principle of well-separated pair decomposition to always produce compressed trees in O(n log n) work. We present and evaluate parallel implementations of the algorithm that can take advantage of multi-core architectures. Funding: Science and Technology Facilities Council, UK.
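    The decomposition rests on a simple geometric predicate: two point sets are s-well-separated if each fits inside a ball of radius r and the two balls are at least s·r apart. Below is a minimal Python sketch of that test applied to k-d tree bounding boxes; the names (Box, is_well_separated) and the conservative use of the larger of the two radii are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the s-well-separated test used by WSPD algorithms.
# Two point sets are s-well-separated if they fit in balls of radius r
# whose gap is at least s * r. Names are illustrative, not from the paper.
import math

class Box:
    """Axis-aligned bounding box of a k-d tree node."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi          # corner coordinates, length-k lists

    def radius(self):
        # Radius of the ball enclosing the box: half the diagonal length.
        return 0.5 * math.dist(self.lo, self.hi)

    def center(self):
        return [(a + b) / 2 for a, b in zip(self.lo, self.hi)]

def is_well_separated(a: Box, b: Box, s: float) -> bool:
    """True if the enclosing balls of a and b are separated by >= s * r."""
    r = max(a.radius(), b.radius())        # conservative: use the larger radius
    gap = math.dist(a.center(), b.center()) - 2 * r
    return gap >= s * r

# Example: two unit squares far apart are well-separated for s = 2.
print(is_well_separated(Box([0, 0], [1, 1]), Box([10, 10], [11, 11]), 2.0))
```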

    Science and Ideology in Economic, Political, and Social Thought

    This paper has two sources. One is my own research in three broad areas: business cycles, economic measurement, and social choice. In all of these fields I attempted to apply the basic precepts of the scientific method as it is understood in the natural sciences. I found that my effort at using natural-science methods in economics was met with little understanding and often considerable hostility. I found economics to be driven less by common sense and empirical evidence than by various ideologies that exhibited either a political or a methodological bias, or both. This brings me to the second source: several books have appeared recently that describe in historical terms the ideological forces that have shaped either the specific areas in which I worked or their broader background. These books taught me that the ideological forces in the social sciences are even stronger than I had imagined on the basis of my own experiences. The scientific method is the antipode to ideology. I feel that the scientific work I have done on specific, long-standing, and fundamental problems in economics and political science has given me additional insights into the destructive role of ideology beyond the history-of-thought orientation of the works I will be discussing.

    Lifespan extension and the doctrine of double effect

    Recent developments in biogerontology—the study of the biology of ageing—suggest that it may eventually be possible to intervene in the human ageing process. This, in turn, offers the prospect of significantly postponing the onset of age-related diseases. The biogerontological project, however, has met with strong resistance, especially from deontologists. They consider the act of intervening in the ageing process impermissible on the grounds that it would (most probably) bring about an extended maximum lifespan—a state of affairs that they deem intrinsically bad. In a bid to convince their deontological opponents of the permissibility of this act, proponents of biogerontology invoke an argument grounded in the doctrine of double effect. Surprisingly, this argument, which we refer to as the ‘double effect argument’, has gone unnoticed. This article exposes and critically evaluates the ‘double effect argument’. To this end, we first review a series of excerpts from the ethical debate on biogerontology in order to substantiate the presence of double effect reasoning. Next, we attempt to determine the role that the ‘double effect argument’ is meant to fulfil within this debate. Finally, we assess whether the act of intervening in ageing can actually be justified using double effect reasoning.

    Artificial intelligence, systemic risks, and sustainability

    Automated decision making and predictive analytics through artificial intelligence, in combination with rapid progress in technologies such as sensors and robotics, are likely to change the way individuals, communities, governments, and private actors perceive and respond to climate and ecological change. Methods based on various forms of artificial intelligence are already being applied in a number of research fields related to climate change and environmental monitoring. Investments in applications of these technologies in agriculture, forestry, and the extraction of marine resources also seem to be increasing rapidly. Despite growing interest in, and deployment of, AI technologies in domains critical for sustainability, few have explored the possible systemic risks in depth. This article offers a global overview of the progress of such technologies in sectors with high impact potential for sustainability, such as farming, forestry, and the extraction of marine resources. We also identify possible systemic risks in these domains, including (a) algorithmic bias and allocative harms; (b) unequal access and benefits; (c) cascading failures and external disruptions; and (d) trade-offs between efficiency and resilience. We explore these emerging risks, identify critical questions, and discuss the limitations of current governance mechanisms in addressing AI sustainability risks in these sectors.

    Paths Explored, Paths Omitted, Paths Obscured: Decision Points & Selective Reporting in End-to-End Data Analysis

    Drawing reliable inferences from data involves many, sometimes arbitrary, decisions across phases of data collection, wrangling, and modeling. As different choices can lead to diverging conclusions, understanding how researchers make analytic decisions is important for supporting robust and replicable analysis. In this study, we pore over nine published research studies and conduct semi-structured interviews with their authors. We observe that researchers often base their decisions on methodological or theoretical concerns, but are subject to constraints arising from the data, their expertise, or perceived interpretability. We confirm that researchers may experiment with choices in search of desirable results, but also identify other reasons why researchers explore alternatives yet omit findings. In concert with our interviews, we also contribute visualizations for communicating decision processes throughout an analysis. Based on our results, we identify design opportunities for strengthening end-to-end analysis, for instance via tracking and meta-analysis of multiple decision paths.

    LSST: from Science Drivers to Reference Design and Anticipated Data Products

    (Abridged) We describe here the most ambitious survey currently planned in the optical, the Large Synoptic Survey Telescope (LSST). A vast array of science will be enabled by a single wide-deep-fast sky survey, and LSST will have unique survey capability in the faint time domain. The LSST design is driven by four main science themes: probing dark energy and dark matter, taking an inventory of the Solar System, exploring the transient optical sky, and mapping the Milky Way. LSST will be a wide-field ground-based system sited at Cerro Pachón in northern Chile. The telescope will have an 8.4 m (6.5 m effective) primary mirror, a 9.6 deg² field of view, and a 3.2 Gigapixel camera. The standard observing sequence will consist of pairs of 15-second exposures in a given field, with two such visits in each pointing in a given night. With these repeats, the LSST system is capable of imaging about 10,000 square degrees of sky in a single filter in three nights. The typical 5σ point-source depth in a single visit in r will be ∌24.5 (AB). The project is in the construction phase and will begin regular survey operations by 2022. The survey area will be contained within 30,000 deg² with ÎŽ < +34.5°, and will be imaged multiple times in six bands, ugrizy, covering the wavelength range 320–1050 nm. About 90% of the observing time will be devoted to a deep-wide-fast survey mode which will uniformly observe an 18,000 deg² region about 800 times (summed over all six bands) during the anticipated 10 years of operations, and yield a coadded map to r ∌ 27.5. The remaining 10% of the observing time will be allocated to projects such as a Very Deep and Fast time domain survey. The goal is to make LSST data products, including a relational database of about 32 trillion observations of 40 billion objects, available to the public and scientists around the world. Comment: 57 pages, 32 color figures; version with high-resolution figures available from https://www.lsst.org/overvie
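    The quoted coadded depth is roughly what the standard √N stacking rule predicts from the single-visit depth: for background-limited point sources, coadding N equal visits deepens the 5σ limit by 2.5·log₁₀(√N) = 1.25·log₁₀(N) magnitudes. A back-of-the-envelope sketch (the per-band visit counts below are rough assumptions, not figures from the abstract):

```python
# Back-of-the-envelope check (not from the paper) of how the coadded depth
# follows from the single-visit depth: stacking N equal visits reduces the
# point-source noise by sqrt(N), deepening the 5-sigma limit by
# 1.25 * log10(N) magnitudes (background-limited case).
import math

def coadded_depth(m_single: float, n_visits: int) -> float:
    """5-sigma point-source depth of a coadd of n_visits equal exposures."""
    return m_single + 1.25 * math.log10(n_visits)

# Assume ~800 total visits split over six bands, so on the order of
# 150-200 visits land in r; with a single-visit depth of r ~ 24.5:
for n in (150, 200):
    print(n, round(coadded_depth(24.5, n), 2))
# prints ~27.2-27.4, within a few tenths of the quoted r ~ 27.5
```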

    Monte Carlo of Trapped Ultracold Neutrons in the UCNτ Trap

    In the UCNτ experiment, ultracold neutrons (UCN) are confined by magnetic fields and the Earth’s gravitational field. Field-trapping mitigates the problem of UCN loss on material surfaces, which caused the largest correction in prior neutron experiments using material bottles. However, the neutron dynamics in field traps differ qualitatively from those in material bottles. In the latter case, neutrons bounce off material surfaces with significant diffusivity and the population quickly reaches a static spatial distribution with a density gradient induced by the gravitational potential. In contrast, the field-confined UCN—whose dynamics can be described by Hamiltonian mechanics—do not exhibit the stochastic behaviors typical of an ideal gas model as observed in material bottles. In this report, we describe our efforts to simulate UCN trapping in the UCNτ magneto-gravitational trap. We compare the simulation output to the experimental results to determine the parameters of the neutron detector and the input neutron distribution. The tuned model is then used to understand the phase-space evolution of neutrons observed in the UCNτ experiment. We discuss the implications of chaotic dynamics for controlling systematic effects, such as spectral cleaning and microphonic heating, in a UCN lifetime experiment aiming at a 0.01% level of precision.
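    Because the trapped motion is Hamiltonian, long tracking runs call for a symplectic integrator, which conserves energy over many oscillation periods. The toy 1-D sketch below illustrates the idea (this is not the collaboration's code; the surface field B0 and decay length delta are invented values): a low-field-seeking neutron above a permanent-magnet array sees a potential U(z) = m·g·z + ÎŒ·B0·exp(−z/ÎŽ), and its bounce is integrated with a kick-drift-kick leapfrog step.

```python
# Toy sketch (not the collaboration's code) of symplectic tracking of a UCN
# in a 1-D magneto-gravitational potential:
#   U(z) = m*g*z + mu*B0*exp(-z/delta)
# B0 and delta are made-up values for illustration.
import math

M  = 1.675e-27        # neutron mass, kg
G  = 9.81             # gravitational acceleration, m/s^2
MU = 9.66e-27         # |neutron magnetic moment|, J/T (~60 neV/T)
B0, DELTA = 1.0, 0.02 # surface field (T) and decay length (m): assumptions

def force(z):
    # F = -dU/dz: gravity pulls down, the decaying field pushes the
    # low-field seeker up.
    return -M * G + (MU * B0 / DELTA) * math.exp(-z / DELTA)

def leapfrog(z, v, dt, steps):
    """Kick-drift-kick leapfrog; symplectic, so energy stays bounded."""
    for _ in range(steps):
        v += 0.5 * dt * force(z) / M
        z += dt * v
        v += 0.5 * dt * force(z) / M
    return z, v

z, v = leapfrog(z=0.05, v=0.0, dt=1e-4, steps=100_000)  # ~10 s of motion
print(f"z = {z:.4f} m, v = {v:.4f} m/s")
```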

    Characteristics of Sexual Abuse in Childhood and Adolescence Influence Sexual Risk Behavior in Adulthood

    Childhood and adolescent sexual abuse has been associated with subsequent (adult) sexual risk behavior, but the effects of force and type of sexual abuse on sexual behavior outcomes have been less well studied. The present study investigated the associations between sexual abuse characteristics and later sexual risk behavior, and explored whether gender of the child/adolescent moderated these relations. Patients attending an STD clinic completed a computerized survey that assessed history of sexual abuse as well as lifetime and current sexual behavior. Participants were considered sexually abused if they reported a sexual experience (1) before age 13 with someone 5 or more years older, (2) between the ages of 13 and 16 with someone 10 or more years older, or (3) before the age of 17 involving force or coercion. Participants who were sexually abused were further categorized based on two abuse characteristics, namely, use of penetration and force. Analyses included 1177 participants (n = 534 women; n = 643 men). Those who reported sexual abuse involving penetration and/or force reported more adult sexual risk behavior, including a greater number of lifetime partners and more previous STD diagnoses, than those who were not sexually abused and those who were abused without force or penetration. There were no significant differences in sexual risk behavior between nonabused participants and those who reported sexual abuse without force and without penetration. Gender of the child/adolescent moderated the association between sexual abuse characteristics and adult sexual risk behavior: for men, sexual abuse with force and penetration was associated with the greatest number of episodes of sex trading, whereas for women, those who were abused with penetration, regardless of whether the abuse involved force, reported the most episodes of sex trading. These findings indicate that more severe sexual abuse is associated with riskier adult sexual behavior.

    Correlated Evolution of Nearby Residues in Drosophilid Proteins

    Here we investigate the correlations between coding-sequence substitutions as a function of their separation along the protein sequence. We consider both substitutions between the reference genomes of several Drosophilids and polymorphisms in a population sample of Zimbabwean Drosophila melanogaster. We find that amino acid substitutions are “clustered” along the protein sequence; that is, the frequency of additional substitutions is strongly enhanced within ≈10 residues of a first such substitution. No such clustering is observed for synonymous substitutions, supporting a “correlation length” associated with selection on proteins as the causative mechanism. Clustering is stronger between substitutions that arose in the same lineage than between substitutions that arose in different lineages. We consider several possible origins of clustering, concluding that epistasis (interactions between amino acids within a protein that affect function) and positional heterogeneity in the strength of purifying selection are primarily responsible. The role of epistasis is directly supported by the tendency of nearby substitutions that arose on the same lineage to preserve the total charge of the residues within the correlation length, and by the preferential cosegregation of neighboring derived alleles in our population sample. We interpret the observed length scale of clustering as a statistical reflection of the functional locality (or modularity) of proteins: amino acids that are near each other on the protein backbone are more likely to contribute to, and collaborate toward, a common subfunction.
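    One simple way to quantify this kind of clustering (a sketch of the general idea, not the authors' pipeline) is to compare the fraction of substitution pairs that fall within ~10 residues of each other against the same statistic for positions placed uniformly at random along the protein; the positions below are toy data.

```python
# Minimal sketch (not the authors' pipeline) of a clustering statistic:
# are substitution pairs enriched within ~10 residues of each other,
# relative to uniform placement along the protein?
import random
from itertools import combinations

def frac_close_pairs(positions, window=10):
    """Fraction of substitution pairs separated by <= window residues."""
    pairs = list(combinations(sorted(positions), 2))
    if not pairs:
        return 0.0
    return sum(1 for a, b in pairs if b - a <= window) / len(pairs)

def null_frac(n_subs, length, window=10, trials=1000, seed=0):
    """Same statistic averaged over uniformly random placements."""
    rng = random.Random(seed)
    draws = [frac_close_pairs(rng.sample(range(length), n_subs), window)
             for _ in range(trials)]
    return sum(draws) / trials

observed = [12, 15, 18, 90, 230, 236]     # toy substitution positions
print(frac_close_pairs(observed))          # observed clustering (~0.27)
print(null_frac(len(observed), 300))       # uniform-null expectation (~0.07)
```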

    Absorbing customer knowledge: how customer involvement enables service design success

    Customers are a knowledge resource outside the firm that can be utilized for new service success by involving them in the design process. However, existing research on the impact of customer involvement (CI) is inconclusive. Knowledge about customers’ needs and about how best to serve those needs (articulated in the service concept) is best obtained from customers themselves. However, codesign runs the risk of losing control of the service concept. This research argues that the processes of external knowledge acquisition (via CI), customer knowledge assimilation, and concept transformation together form a capability that enables the firm to exploit customer knowledge in the form of a successful new service. Data from a survey of 126 new service projects show that the impact of CI on new service success is fully mediated by customer knowledge assimilation (the deep understanding of customers’ latent needs) and concept transformation (the modification of the service concept in light of customer insights). The impact of CI is more nuanced, however: CI exhibits an “∩”-shaped relationship with transformation, indicating that there is a limit to its beneficial effect, while its relationship with assimilation is “U”-shaped, suggesting a problem of cognitive inertia in which initial learnings are ignored. Customer knowledge assimilation directly impacts success, while concept transformation only helps success in the presence of resource slack. An evolving new service design is only beneficial if the firm has the flexibility to adapt to change.
