
    Efficacy of non-lead ammunition for culling elk at Theodore Roosevelt National Park

    From 2010 to 2013, park staff and public volunteers culled 983 elk (Cervus elaphus) from Theodore Roosevelt National Park (United States) using non-lead rifle ammunition as part of a sanctioned herd management operation. Because there is little empirical evidence available on the performance of non-lead ammunition, staff recorded information on tools and techniques relevant to the scenarios under which elk were culled and the outcome of each engagement. We also conducted a firing range experiment to evaluate the precision of non-lead ammunition used in park firearms. Specific objectives were to identify program factors predicting efficient destruction of elk with non-lead ammunition and to evaluate the precision of non-lead ammunition in National Park Service (NPS) firearms to facilitate accurate shot placement. To address these objectives, we conducted multivariate ordinal regression analyses of 13 variables, including bullet type, marksman type, shot distance, initial shot impact location, number of shots fired, and need for a killing shot, as predictors of distance traveled by elk after being shot. Among 921 elk removals evaluated, mean shot distance was 182 m, and the median and mode of distance traveled were 46 m and 0 m, respectively. Multivariate analyses revealed that shots to the head and neck were most effective, followed by those striking the shoulder and chest. Heavier bullets should be used whenever practical. Mean group size for non-lead ammunition fired through NPS firearms was 50 mm at 91 m, with minimum and maximum group sizes of 18.8 and 98.6 mm, respectively. We found that non-lead ammunition provided the necessary precision for accurate shot placement in spot-and-stalk hunting conditions and that these bullets typically accomplished instantaneous or near-instantaneous incapacitation of elk whenever vital areas of the body were impacted. We conclude that non-lead bullets are effective for wildlife management and hunting scenarios.
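    A rough sketch of the kind of multivariate ordinal analysis described above is shown below: an ordered-logit model of a binned distance-traveled outcome regressed on a few shot variables. The data file, column names, and distance bins are hypothetical placeholders, not the study's actual variables.

```python
# Hypothetical sketch of an ordinal regression of distance traveled on
# shot variables; column names and bins are illustrative, not the study's.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("elk_removals.csv")  # hypothetical data file

# Bin distance traveled into an ordered categorical outcome (bins assumed)
df["dist_cat"] = pd.cut(df["distance_traveled_m"],
                        bins=[-1, 0, 50, 200, 10_000],
                        labels=["none", "short", "medium", "long"],
                        ordered=True)

# One-hot encode the categorical predictor; keep continuous ones as-is
exog = pd.get_dummies(df[["shot_distance_m", "bullet_mass_gr",
                          "impact_location"]],
                      columns=["impact_location"],
                      drop_first=True, dtype=float)

model = OrderedModel(df["dist_cat"], exog, distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```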

    Sign-time distribution for a random walker with a drifting boundary

    We present a derivation of the exact sign-time distribution for a random walker in the presence of a boundary moving with constant velocity. Comment: 5 pages.
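    The quantity in question can be explored numerically. The minimal Monte Carlo sketch below (not the paper's exact derivation) estimates the distribution of the fraction of time a discrete random walker spends above a boundary drifting at constant velocity; all parameters are illustrative.

```python
# Monte Carlo sketch: distribution of the fraction of time ("sign time")
# a random walker spends above a boundary moving at constant velocity v.
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_walkers, v = 1_000, 20_000, 0.05

steps = rng.choice([-1.0, 1.0], size=(n_walkers, n_steps))
paths = np.cumsum(steps, axis=1)             # walker positions x(t)
boundary = v * np.arange(1, n_steps + 1)     # drifting boundary x = v t

sign_time = (paths > boundary).mean(axis=1)  # fraction of time above

hist, edges = np.histogram(sign_time, bins=50, range=(0, 1), density=True)
for density, lo in zip(hist, edges):
    print(f"{lo:4.2f}  {'#' * int(density * 10)}")
```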

    Underground mine scheduling under uncertainty

    17 USC 105 interim-entered record; under review. The article of record as published may be found at http://dx.doi.org/10.1016/j.ejor.2021.01.011
    Underground mine schedules seek to determine start dates for activities related to the extraction of ore, often with an objective of maximizing net present value; constraints enforce geotechnical precedence between activities and restrict resource consumption on a per-time-period basis, e.g., development footage and extracted tons. Strategic schedules address these start dates at a coarse level, whereas tactical schedules must account for the day-to-day variability of underground mine operations, such as unanticipated equipment breakdowns and ground conditions, both of which might slow production. At the time of this writing, the underground mine scheduling literature is dominated by a deterministic treatment of the problem, usually modeled as a Resource-Constrained Project Scheduling Problem (RCPSP), which precludes mine operators from reacting to unforeseen circumstances. Therefore, we propose a stochastic integer programming framework that: (i) characterizes uncertainty in duration and economic value for each underground mining activity; (ii) formulates a new stochastic variant of the RCPSP; (iii) suggests an optimization-based heuristic; and (iv) produces implementable, tactical schedules in a practical amount of time and provides corresponding managerial insights.
    National Institute of Occupational Safety and Health; National Agency for Research and Development (ANID)
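    To fix ideas about the base model the stochastic variant extends, here is a minimal deterministic, time-indexed RCPSP sketch with an NPV objective. The toy data, the PuLP/CBC solver choice, and the discounting scheme are assumptions, not the authors' formulation.

```python
# Toy deterministic time-indexed RCPSP with an NPV objective (PuLP/CBC).
import pulp

T = range(8)                       # time periods
dur = {"a": 2, "b": 3, "c": 2}     # activity durations (toy)
val = {"a": 5, "b": 8, "c": 6}     # economic value per activity (toy)
res = {"a": 2, "b": 1, "c": 2}     # per-period resource use (toy)
cap, disc = 3, 0.95                # resource capacity, discount factor
prec = [("a", "c")]                # c cannot start until a finishes

m = pulp.LpProblem("rcpsp", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (dur, T), cat="Binary")  # x[j][t]: j starts at t

# Maximize NPV: activity value discounted by start period
m += pulp.lpSum(val[j] * disc ** t * x[j][t] for j in dur for t in T)
for j in dur:                      # each activity starts exactly once
    m += pulp.lpSum(x[j][t] for t in T) == 1
for i, j in prec:                  # geotechnical precedence i -> j
    m += (pulp.lpSum(t * x[j][t] for t in T)
          >= pulp.lpSum(t * x[i][t] for t in T) + dur[i])
for t in T:                        # per-period resource capacity
    m += pulp.lpSum(res[j] * x[j][s] for j in dur
                    for s in T if s <= t < s + dur[j]) <= cap

m.solve(pulp.PULP_CBC_CMD(msg=False))
for j in dur:
    start = next(t for t in T if x[j][t].value() > 0.5)
    print(f"activity {j} starts at period {start}")
```

    A stochastic variant along the lines the abstract describes would replicate the duration and value data across scenarios and couple the scenario schedules, which is where the paper's contribution lies.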

    A novel product representation to highlight cross-assembly dependencies and product robustness

    Manufacturing industry has traditionally used Bills of Materials (BOMs) and Product Lifecycle Management (PLM) tools to track components and sub-assemblies within a product. These apply a hierarchical structure to product assemblies and sub-assemblies. Impacts of change to one or more components can easily be traced throughout the assembly tree; however, changes impacting another component not directly or explicitly connected to the first are not considered. Here the authors present the novel Kendrick Reticulated Ontology Model (KROM), a mesh component network to highlight cross-assembly dependencies. Nth-order connections are considered through user-inputted links between otherwise unconnected components. Unexpected emergent behaviours can therefore be anticipated. Network analysis was applied to the resulting graph, quantifying the design's robustness through centrality measures. Considering both product components and assembly-associated tooling and jigging demonstrates the true propagating impact of design change. It is shown that core component connectedness order is changed when tooling becomes part of the network. This is particularly significant when considering the regular omission of tooling in BOMs. Here, a disconnection between Design Engineering and Production Engineering after design finalisation has been determined and a solution presented.
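    The network-analysis step can be illustrated with a short sketch: build the component graph with and without tooling nodes and compare centrality rankings. The component and tooling names below are invented, and KROM's actual model is considerably richer.

```python
# Illustrative sketch: how adding tooling nodes to a component network
# changes centrality-based robustness rankings. Names are invented.
import networkx as nx

edges_components = [("spar", "rib1"), ("spar", "rib2"),
                    ("rib1", "skin"), ("rib2", "skin")]
edges_tooling = [("jig_A", "spar"), ("jig_A", "rib1"),
                 ("fixture_B", "skin")]

g_parts = nx.Graph(edges_components)
g_full = nx.Graph(edges_components + edges_tooling)

for label, g in [("components only", g_parts), ("with tooling", g_full)]:
    centrality = nx.betweenness_centrality(g)
    ranking = sorted(centrality, key=centrality.get, reverse=True)
    print(f"{label}: {ranking}")
```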

    Measuring Redshift-Space Distortions using Photometric Surveys

    We outline how redshift-space distortions (RSD) can be measured from the angular correlation function w(θ) of galaxies selected from photometric surveys. The natural degeneracy between RSD and galaxy bias can be minimized by comparing results from bins with top-hat galaxy selection in redshift, and bins based on the radial position of galaxy pair centres. This comparison can also be used to test the accuracy of the photometric redshifts. The presence of RSD will be clearly detectable with the next generation of photometric redshift surveys. We show that the Dark Energy Survey (DES) will be able to measure f(z)σ_8(z) to a 1σ accuracy of (17 × b)%, using galaxies drawn from a single narrow redshift slice centered at z = 1. Here b is the linear bias, and f is the logarithmic rate of change of the linear growth rate with respect to the scale factor. Extending to measurements of w(θ) for a series of bins of width 0.02(1 + z) over 0.5 < z < 1.4 will measure γ to a 1σ accuracy of 25%, given the model f = Ω_m(z)^γ, and assuming a linear bias model that evolves such that b = 0.5 + z (and fixing other cosmological parameters). The accuracy of our analytic predictions is confirmed using mock catalogs drawn from simulations conducted by the MICE collaboration. Comment: Accepted by MNRAS; revisions include fixing of typos and clarification of the text.
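    The quoted growth-rate model is straightforward to evaluate. The sketch below computes f(z) = Ω_m(z)^γ together with the abstract's evolving bias b = 0.5 + z; the flat-ΛCDM parameters (Ω_m0 = 0.3, γ ≈ 0.55 for general relativity) are illustrative.

```python
# Evaluate f(z) = Omega_m(z)^gamma and b = 0.5 + z on a redshift grid.
# Flat LCDM with Omega_m0 = 0.3 and gamma = 0.55 are illustrative values.
import numpy as np

omega_m0, gamma = 0.3, 0.55

def omega_m(z):
    e2 = omega_m0 * (1 + z) ** 3 + (1 - omega_m0)  # H(z)^2 / H0^2, flat LCDM
    return omega_m0 * (1 + z) ** 3 / e2

for z in np.arange(0.5, 1.41, 0.3):
    f = omega_m(z) ** gamma
    b = 0.5 + z                                    # bias model from the text
    print(f"z={z:.1f}  f={f:.3f}  b={b:.2f}  f/b={f/b:.3f}")
```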

    Distance, Growth Factor, and Dark Energy Constraints from Photometric Baryon Acoustic Oscillation and Weak Lensing Measurements

    Baryon acoustic oscillations (BAOs) and weak lensing (WL) are complementary probes of cosmology. We explore the distance and growth factor measurements from photometric BAO and WL techniques and investigate the roles of the distance and growth factor in constraining dark energy. We find for WL that the growth factor has a great impact on dark energy constraints but is much less powerful than the distance. Dark energy constraints from WL are concentrated in considerably fewer distance eigenmodes than those from BAO, with the largest contributions from modes that are sensitive to the absolute distance. Both techniques have some well determined distance eigenmodes that are not very sensitive to the dark energy equation of state parameters w_0 and w_a, suggesting that they can accommodate additional parameters for dark energy and for the control of systematic uncertainties. A joint analysis of BAO and WL is far more powerful than either technique alone, and the resulting constraints on the distance and growth factor will be useful for distinguishing dark energy and modified gravity models. The Large Synoptic Survey Telescope (LSST) will yield both WL and angular BAO over a sample of several billion galaxies. Joint LSST BAO and WL can yield 0.5% level precision on ten comoving distances evenly spaced in log(1+z) between redshift 0.3 and 3 with cosmic microwave background priors from Planck. In addition, since the angular diameter distance, which directly affects the observables, is linked to the comoving distance solely by the curvature radius in the Friedmann-Robertson-Walker metric solution, LSST can achieve a pure metric constraint of 0.017 on the mean curvature parameter Ω_k of the universe simultaneously with the constraints on the comoving distances. Comment: 15 pages, 9 figures, details and references added, ApJ accepted.
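    The metric relation behind the curvature constraint, that the angular diameter distance follows from the comoving distance and the curvature alone in an FRW solution, can be made concrete with a short sketch; the cosmological parameters are illustrative.

```python
# Comoving distance D_C and angular diameter distance D_A in an FRW
# cosmology; D_A depends on D_C only through the curvature Omega_k.
import numpy as np
from scipy.integrate import quad

c, H0 = 299792.458, 70.0            # km/s and km/s/Mpc (illustrative)
omega_m, omega_k = 0.3, 0.0         # toy model; vary omega_k to explore
omega_de = 1.0 - omega_m - omega_k

def inv_E(z):
    return 1.0 / np.sqrt(omega_m * (1 + z) ** 3
                         + omega_k * (1 + z) ** 2 + omega_de)

def comoving_distance(z):           # D_C in Mpc
    return c / H0 * quad(inv_E, 0.0, z)[0]

def angular_diameter_distance(z):   # D_A for open, closed, or flat cases
    dc, dh = comoving_distance(z), c / H0
    if omega_k > 0:
        dm = dh / np.sqrt(omega_k) * np.sinh(np.sqrt(omega_k) * dc / dh)
    elif omega_k < 0:
        dm = dh / np.sqrt(-omega_k) * np.sin(np.sqrt(-omega_k) * dc / dh)
    else:
        dm = dc
    return dm / (1 + z)

for z in (0.3, 1.0, 3.0):
    print(f"z={z}: D_C={comoving_distance(z):7.1f} Mpc, "
          f"D_A={angular_diameter_distance(z):7.1f} Mpc")
```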

    Persistence exponents of non-Gaussian processes in statistical mechanics

    Motivated by certain problems of statistical physics we consider a stationary stochastic process in which deterministic evolution is interrupted at random times by upward jumps of a fixed size. If the evolution consists of linear decay, the sample functions are of the "random sawtooth" type and the level-dependent persistence exponent θ can be calculated exactly. We then develop an expansion method valid for small curvature of the deterministic curve. The curvature parameter g plays the role of the coupling constant of an interacting particle system. The leading-order curvature correction to θ is proportional to g^(2/3). The expansion applies in particular to exponential decay in the limit of large level, where the curvature correction considerably improves the linear approximation. The Langevin equation, with Gaussian white noise, is recovered as a singular limiting case. Comment: 20 pages, 3 figures.
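    A Monte Carlo sketch of the random-sawtooth process may make the persistence exponent concrete: simulate linear decay at unit rate interrupted by Poisson jumps of fixed size, then fit the exponential decay rate of the probability of staying above a level. The discretization and parameter values below are assumptions; the paper's calculation is analytic.

```python
# Estimate the level-dependent persistence exponent theta of a random
# sawtooth: unit-rate linear decay plus Poisson upward jumps of size a.
import numpy as np

rng = np.random.default_rng(1)
rate, jump, level = 1.0, 1.0, 0.5    # jump rate, jump size a, level x
dt, n_steps, n_paths = 0.01, 4_000, 5_000

x = np.full(n_paths, 2.0)            # start well above the level
alive = np.ones(n_paths, dtype=bool) # paths that have stayed above so far
surv = []
for _ in range(n_steps):
    x -= dt                                  # deterministic linear decay
    jumps = rng.random(n_paths) < rate * dt  # Poisson jump arrivals
    x[jumps] += jump
    alive &= x > level
    surv.append(alive.mean())

surv = np.asarray(surv)
t = dt * np.arange(1, n_steps + 1)
mask = (surv > 0.01) & (t > 5)       # fit only the exponential tail
theta = -np.polyfit(t[mask], np.log(surv[mask]), 1)[0]
print(f"estimated persistence exponent theta ~ {theta:.3f}")
```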

    Dirty Money: The Role of Moral History in Economic Judgments

    Although traditional economic models posit that money is fungible, psychological research abounds with examples that deviate from this assumption. Across eight experiments, we provide evidence that people construe physical currency as carrying traces of its moral history. In Experiments 1 and 2, people report being less likely to want money with negative moral history (i.e., stolen money). Experiments 3–5 provide evidence against an alternative account that people's judgments merely reflect beliefs about the consequences of accepting stolen money rather than moral sensitivity. Experiment 6 examines whether an aversion to stolen money may reflect contamination concerns, and Experiment 7 indicates that people report they would donate stolen money, thereby counteracting its negative history with a positive act. Finally, Experiment 8 demonstrates that, even in their recall of actual events, people report a reduced tendency to accept tainted money. Altogether, these findings suggest a robust tendency to evaluate money based on its moral history, even though it is designed to participate in exchanges that effectively erase its origins. Peer Reviewed.
    https://deepblue.lib.umich.edu/bitstream/2027.42/136744/1/cogs12464_am.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/136744/2/cogs12464.pdf