Discovery of Five New Pulsars in Archival Data
Reprocessing of the Parkes Multibeam Pulsar Survey has resulted in the
discovery of five previously unknown pulsars and several as-yet-unconfirmed
candidates. PSR J0922-52 has a period of 9.68 ms and a DM of 122.4 pc cm^-3.
PSR J1147-66 has a period of 3.72 ms and a DM of 133.8 pc cm^-3. PSR J1227-6208
has a period of 34.53 ms, a DM of 362.6 pc cm^-3, is in a 6.7 day binary orbit,
and was independently detected in an ongoing high-resolution Parkes survey by
Thornton et al. and also in independent processing by Einstein@Home volunteers.
PSR J1546-59 has a period of 7.80 ms and a DM of 168.3 pc cm^-3. PSR J1725-3853
is an isolated 4.79-ms pulsar with a DM of 158.2 pc cm^-3. These pulsars were
likely missed in earlier processing efforts because of their short periods, their
high DMs, and the large number of candidates that had to be inspected.
These discoveries suggest that further pulsars are awaiting discovery in the
multibeam survey data.
Comment: 12 pages, 2 figures, 2 tables, accepted to Ap
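For context on why high-DM millisecond pulsars are easy to miss: the dispersive delay across the observing band can smear a pulse over many rotation periods unless many trial DMs are searched. A minimal sketch computing that delay for the DMs quoted above (the 1.23-1.52 GHz band edges are an illustrative assumption, not taken from the survey description):

```python
# Sketch: dispersive delay across an observing band for the DMs quoted
# above. The dispersion constant (~4.149 ms GHz^2 pc^-1 cm^3) is standard;
# the 1.23-1.52 GHz band is an assumed illustrative choice.

K_DM = 4.149  # ms GHz^2 pc^-1 cm^3 (dispersion constant, approximate)

def dispersion_delay_ms(dm, f_lo_ghz, f_hi_ghz):
    """Delay of the low-frequency band edge relative to the high edge."""
    return K_DM * dm * (f_lo_ghz**-2 - f_hi_ghz**-2)

pulsars = {"J0922-52": 122.4, "J1147-66": 133.8, "J1227-6208": 362.6,
           "J1546-59": 168.3, "J1725-3853": 158.2}

for name, dm in pulsars.items():
    delay = dispersion_delay_ms(dm, 1.23, 1.52)
    print(f"PSR {name}: DM {dm:6.1f} pc cm^-3 -> {delay:6.1f} ms across band")
```

Under these assumed band edges, the cross-band delay for PSR J0922-52 is many times its 9.68 ms period, which is why a fine grid of trial DMs, and hence a large candidate list, is unavoidable.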
Formal Verification of Nonlinear Inequalities with Taylor Interval Approximations
We present a formal tool for verification of multivariate nonlinear
inequalities. Our verification method is based on interval arithmetic with
Taylor approximations. Our tool is implemented in the HOL Light proof assistant
and can verify multivariate nonlinear polynomial and
non-polynomial inequalities on rectangular domains. One of the main features of
our work is an efficient implementation of the verification procedure which can
prove non-trivial high-dimensional inequalities in several seconds. We
developed the verification tool as a part of the Flyspeck project (a formal
proof of the Kepler conjecture). The Flyspeck project includes about 1000
nonlinear inequalities. We successfully tested our method on more than 100
Flyspeck inequalities and estimated that the formal verification procedure is
about 3000 times slower than an informal verification method implemented in
C++. We also describe future work and prospective optimizations for our method.
Comment: 15 page
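The core idea, stripped of Taylor remainders and formal-proof bookkeeping, can be sketched as interval evaluation plus bisection: evaluate the function with interval arithmetic, and subdivide wherever the enclosure straddles zero. This toy example uses an illustrative polynomial and domain (not a Flyspeck inequality), ignores directed rounding, and runs outside HOL Light:

```python
# Minimal sketch of interval-arithmetic verification: certify that a
# polynomial is positive on an interval by evaluating it with interval
# arithmetic and bisecting where the enclosure straddles zero.
# Caveat: plain floats, no directed rounding; a formal tool must bound
# rounding error as well.

def iadd(a, b): return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    ps = (a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1])
    return (min(ps), max(ps))

def enclose(x):
    """Interval enclosure of f(x) = x^2 - 2x + 2 on the interval x."""
    return iadd(iadd(imul(x, x), imul((-2.0, -2.0), x)), (2.0, 2.0))

def verify_positive(lo, hi, depth=0, max_depth=30):
    enc = enclose((lo, hi))
    if enc[0] > 0.0:                  # certified positive on this piece
        return True
    if enc[1] <= 0.0 or depth >= max_depth:
        return False                  # refuted, or gave up
    mid = 0.5 * (lo + hi)             # enclosure straddles zero: bisect
    return (verify_positive(lo, mid, depth + 1, max_depth) and
            verify_positive(mid, hi, depth + 1, max_depth))

print(verify_positive(0.0, 3.0))      # f(x) = (x-1)^2 + 1 >= 1 everywhere
```

Taylor approximations enter as a refinement of `enclose`: bounding f by its Taylor expansion at the box centre plus an interval remainder gives much tighter enclosures than naive interval evaluation, so far fewer subdivisions are needed.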
Evaluating Medical Devices Remotely: Current Methods and Potential Innovations
Objective: We present examples of laboratory and remote studies, with a focus on studies appropriate for medical device design and evaluation. From this review and description of extant options for remote testing, we provide methods and tools to achieve research goals remotely.
Background: The FDA mandates human factors evaluation of medical devices. Studies show similarities and differences in results collected in laboratories compared to data collected remotely in non-laboratory settings. Remote studies show promise, though many of these are behavioral studies related to cognitive or experimental psychology. Remote usability studies are rare but increasing, as technologies allow for synchronous and asynchronous data collection.
Method: We reviewed methods of remote evaluation of medical devices, from testing labels and instructions to usability testing and simulated use. Each method was coded for the attributes (e.g., supported media) that need consideration in usability studies.
Results: We present examples of how published usability studies of medical devices could be moved to remote data collection. We also present novel systems for creating such tests, such as the use of 3D printed or virtual prototypes. Finally, we advise on targeted participant recruitment.
Conclusion: Remote testing will bring opportunities and challenges to the field of medical device testing. Current methods are adequate for most purposes, excepting the validation of Class III devices.
Application: The tools we provide enable the remote evaluation of medical devices. Evaluations have specific research goals, and our framework of attributes helps to select or combine tools for valid testing of medical devices.
A Parallel Monte Carlo Code for Simulating Collisional N-body Systems
We present a new parallel code for computing the dynamical evolution of
collisional N-body systems with up to N~10^7 particles. Our code is based on
the Hénon Monte Carlo method for solving the Fokker-Planck equation, and
makes assumptions of spherical symmetry and dynamical equilibrium. The
principal algorithmic developments involve optimizing data structures, and the
introduction of a parallel random number generation scheme, as well as a
parallel sorting algorithm, required to find nearest neighbors for interactions
and to compute the gravitational potential. The new algorithms we introduce
along with our choice of decomposition scheme minimize communication costs and
ensure optimal distribution of data and workload among the processing units.
The implementation uses the Message Passing Interface (MPI) library for
communication, which makes it portable to many different supercomputing
architectures. We validate the code by calculating the evolution of clusters
with initial Plummer distribution functions up to core collapse with the number
of stars, N, spanning three orders of magnitude, from 10^5 to 10^7. We find
that our results are in good agreement with self-similar core-collapse
solutions, and the core collapse times generally agree with expectations from
the literature. We also observe good total energy conservation, to better
than 0.04% throughout all simulations. We analyze the performance of the code,
and demonstrate near-linear scaling of the runtime with the number of
processors up to 64 processors for N=10^5, 128 for N=10^6 and 256 for N=10^7.
Beyond these limits the runtime saturates as more processors are added, a
characteristic of the parallel sorting algorithm. The
resulting maximum speedups we achieve are approximately 60x, 100x, and 220x,
respectively.
Comment: 53 pages, 13 figures, accepted for publication in ApJ Supplement
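The parallel random-number scheme matters because each processing unit must draw from a statistically independent, reproducible stream, so that results do not depend on how particles are distributed across ranks. A minimal sketch of that idea using NumPy's `SeedSequence` spawning, used here as a stand-in for whatever generator the code actually employs:

```python
# Sketch of per-rank random streams: each (simulated) MPI rank gets an
# independent, reproducible generator derived from one root seed.
# NumPy's SeedSequence is an illustrative stand-in, not the paper's
# actual generator.

import numpy as np

def make_streams(n_ranks, root_seed=12345):
    """One statistically independent Generator per rank."""
    return [np.random.default_rng(s)
            for s in np.random.SeedSequence(root_seed).spawn(n_ranks)]

streams = make_streams(4)
# Each rank samples its own particle velocities; no stream is shared,
# so no inter-rank communication is needed for random draws.
for rank, rng in enumerate(streams):
    print(f"rank {rank}: {rng.normal(size=3)}")
```

Rerunning with the same root seed reproduces every rank's draws exactly, which is what makes parallel runs debuggable and results independent of the domain decomposition.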
Examination of direct-photon and pion production in proton-nucleon collisions
We present a study of inclusive direct-photon and pion production in hadronic
interactions, focusing on a comparison of the ratio of gamma/pi0 yields with
expectations from next-to-leading order perturbative QCD (NLO pQCD). We also
examine the impact of a phenomenological model involving k_T smearing (which
approximates effects of additional soft-gluon emission) on absolute predictions
for photon and pion production and their ratio.
Comment: 20 pages, 12 figures. Minor changes in wording and in figure
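The qualitative effect of k_T smearing can be illustrated numerically: convolving a steeply falling p_T spectrum with a Gaussian transverse kick enhances the yield at fixed p_T, because more events feed up from the populous low-p_T region than leak out above. A toy Monte Carlo in which the power-law index, p_T range, and k_T width are all invented for illustration:

```python
# Toy illustration of k_T smearing on a steeply falling spectrum.
# The spectral index (n=6), pT floor (2 GeV), bin (5.5-6.5 GeV), and
# Gaussian width (1 GeV) are invented for illustration only.

import numpy as np

rng = np.random.default_rng(0)

def sample_pt(n_ev, n=6.0, pt_min=2.0):
    """Power-law spectrum dN/dpT ~ pT^-n above pt_min (inverse-CDF sampling)."""
    u = rng.uniform(size=n_ev)
    return pt_min * u ** (-1.0 / (n - 1.0))

def count_in_bin(pt, lo=5.5, hi=6.5):
    return int(np.sum((pt >= lo) & (pt < hi)))

pt_true = sample_pt(200_000)
pt_smeared = pt_true + rng.normal(0.0, 1.0, size=pt_true.size)  # Gaussian k_T kick

raw = count_in_bin(pt_true)
smeared = count_in_bin(pt_smeared)
print(f"yield in 5.5-6.5 GeV bin: {raw} raw vs {smeared} smeared")
```

Because the smeared yield exceeds the unsmeared one at fixed p_T, such a model shifts absolute NLO predictions upward, which is the effect probed in the gamma/pi0 comparison.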
The Relationship Between Belief and Credence
Sometimes epistemologists theorize about belief, a tripartite attitude on which one can believe, withhold belief, or disbelieve a proposition. In other cases, epistemologists theorize about credence, a fine-grained attitude that represents one’s subjective probability or confidence level toward a proposition. How do these two attitudes relate to each other? This article explores the relationship between belief and credence in two categories: descriptive and normative. It then explains the broader significance of the belief-credence connection and concludes with general lessons from the debate thus far.
Bioavailability in soils
The consumption of locally-produced vegetables by humans may be an important exposure pathway for soil contaminants in many urban settings and for agricultural land use. Hence, prediction of metal and metalloid uptake by vegetables from contaminated soils is an important part of the Human Health Risk Assessment procedure. The behaviour of metals (cadmium, chromium, cobalt, copper, mercury, molybdenum, nickel, lead and zinc) and metalloids (arsenic, boron and selenium) in contaminated soils depends to a large extent on the intrinsic charge, valence and speciation of the contaminant ion, and soil properties such as pH, redox status and contents of clay and/or organic matter. However, the chemistry and behaviour of the contaminant in soil alone cannot predict soil-to-plant transfer. Root uptake, root selectivity, ion interactions, rhizosphere processes, leaf uptake from the atmosphere, and plant partitioning are important processes that ultimately govern the accumulation of metals and metalloids in edible vegetable tissues. Mechanistic models to accurately describe all these processes have not yet been developed, let alone validated under field conditions. Hence, to estimate risks from vegetable consumption, empirical models have been used to correlate concentrations of metals and metalloids in contaminated soils, soil physico-chemical characteristics, and concentrations of elements in vegetable tissues. These models should only be used within the bounds of their calibration, and often need to be re-calibrated or validated using local soil and environmental conditions on a regional or site-specific basis.
Mike J. McLaughlin, Erik Smolders, Fien Degryse, and Rene Rietr
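The empirical-model approach mentioned above is commonly a log-linear regression of plant-tissue concentration on soil concentration and soil properties such as pH. The sketch below fits such a model; the data, coefficients, and noise level are entirely synthetic, whereas a real model is calibrated against field measurements:

```python
# Sketch of an empirical soil-to-plant transfer model:
#   log10(C_plant) = a + b * log10(C_soil) + c * pH
# All data here are synthetic; the "true" coefficients (0.2, 0.8, -0.25)
# are invented for illustration, with uptake falling as pH rises.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic "calibration" survey: 50 soils with varying Cd and pH.
log_cd_soil = rng.uniform(-1.0, 1.0, size=50)            # log10 mg/kg
ph = rng.uniform(4.5, 7.5, size=50)
log_cd_plant = (0.2 + 0.8 * log_cd_soil - 0.25 * ph
                + rng.normal(0.0, 0.05, size=50))        # measurement noise

# Ordinary least squares for the three coefficients.
X = np.column_stack([np.ones_like(ph), log_cd_soil, ph])
coef, *_ = np.linalg.lstsq(X, log_cd_plant, rcond=None)
a, b, c = coef
print(f"fit: log(plant) = {a:.2f} + {b:.2f}*log(soil) + {c:.2f}*pH")
```

The fitted slopes recover the generating values, but, as the abstract stresses, such a model is only trustworthy inside the range of soils used to calibrate it.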
Combined Decision Techniques for the Existential Theory of the Reals
Methods for deciding quantifier-free non-linear arithmetical conjectures over ℝ are crucial in the formal verification of many real-world systems and in formalised mathematics. While non-linear (rational function) arithmetic over ℝ is decidable, it is fundamentally infeasible: any general decision method for this problem is worst-case exponential in the dimension (number of variables) of the formula being analysed. This is unfortunate, as many practical applications of real algebraic decision methods require reasoning about high-dimensional conjectures. Despite this inherent infeasibility, a number of different decision methods have been developed, most of which have "sweet spots", i.e., types of problems for which they perform much better than they do in general. Such "sweet spots" can in many cases be heuristically combined to solve problems that are out of reach of the individual decision methods when used in isolation. RAHD ("Real Algebra in High Dimensions") is a theorem prover that works to combine a collection of real algebraic decision methods in ways that exploit their respective "sweet spots". We discuss high-level mathematical and design aspects of RAHD and illustrate its use on a number of examples.
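The combination strategy can be pictured as a portfolio: try cheap, incomplete procedures first and fall back to a complete but expensive one only when they give up. The sketch below is an illustrative stand-in, not RAHD's actual proof-procedure architecture; the three "methods" are toys that mimic real ones (interval bounding, Gröbner-basis reasoning, CAD):

```python
# Illustrative portfolio of decision procedures. Each procedure may
# answer True/False or give up with None; the combiner exploits each
# method's "sweet spot" by trying cheap methods first. The goal
# encoding and the methods themselves are toy stand-ins.

from typing import Optional

Goal = dict  # toy representation of a quantifier-free conjecture

def interval_method(goal: Goal) -> Optional[bool]:
    # Cheap numeric bounding: only decides goals it can bound.
    return goal.get("bounded_result")          # None = give up

def groebner_method(goal: Goal) -> Optional[bool]:
    # Algebraic method: only applies to purely equational goals.
    return goal.get("equational_result")

def cad_method(goal: Goal) -> Optional[bool]:
    # Complete but worst-case very expensive: always answers.
    return goal["truth"]

def combined_decide(goal: Goal,
                    methods=(interval_method, groebner_method, cad_method)) -> bool:
    for m in methods:                          # first definite answer wins
        verdict = m(goal)
        if verdict is not None:
            return verdict
    raise RuntimeError("no method decided the goal")

print(combined_decide({"bounded_result": True, "truth": True}))   # interval wins
print(combined_decide({"truth": False}))                          # falls to CAD
```

The design point is that ordering matters: a goal decided by an early cheap method never pays the cost of the complete fallback, which is what makes high-dimensional instances reachable in practice.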
Blueberry Advisory Committee Extension Report
The 1986 edition of the Blueberry Advisory Committee Extension Reports was prepared for the Maine Blueberry Commission and the University of Maine Blueberry Advisory Committee by researchers with the Maine Agricultural Experiment Station and Maine Cooperative Extension Service at the University of Maine, Orono. Projects in this report include:
1. Fertility Levels
2. Insect and Disease Fact Sheets
3. Development of Insect ID Information for Growers
4. Effect of Pruning Practices on Blueberry Insect Abundance
5. Control of Blueberry Maggot (Alternatives to Guthion)
6. Economic Thresholds and Control of Secondary Blueberry Pests
7. Chemical Control of Mummyberry Disease
8. Chemical Control of Botrytis Bloom Blight
9. Effects of Late Summer Fungicide Applications
10. Mowing vs. Burning - Comparisons of Disease Incidence
11. Long-term Effects of N and NPK Fertilizer on Plant Growth and Yield
12. Effect of Several Mulches on Frost Heaving, Soil Moisture, Soil Temperature and Rhizome Development
13. Interaction of Fertility and Pruning Practices on Soil
14. Effect of Block Freezing on Physical Characterization and Sugar Migration on Lowbush Blueberries
15. Demonstration of the Rota-Cone Vacuum Drying Process on Lowbush Blueberries
16. Production of a Blueberry Gelatin
17. Isolation and Characterization of Blueberry Pectin
18. The Effect of pH, Chemicals and Holding time-temperature on the color of Blueberry Puree
19. Effect of Hexazinone on Species Distribution in Lowbush Blueberry Fields
20. Evaluation of Postemergent Herbicides for Grass Control
21. Evaluation of Sulfonylurea and Imidazoline Compounds for Bunchberry Control
22. Use of Mechanical wiper with glyphosate or dicamba for control of dogbane
23. Hand-wiper Applications of Herbicides on Woody Weeds
24. Dogbane Control with 2% Glyphosate
25. Low Volume Solution of Asulam for Bracken Fern Control
26. Integrated Weed Management
27. 1986 Annual Report to the Maine Lowbush Blueberry Commission