    Recent Progress in Neutron Star Theory

    This review contains chapters discussing: Energy density functionals of nuclear matter, Many-body theory of nucleon matter, Hadronic and quark matter, Mixtures of phases in dense matter, and Neutron star observations and predictions. Comment: 33 pages + 13 figs., Ann. Rev. Nucl. & Part. Science, 200

    Robots that can adapt like animals

    As robots leave the controlled environments of factories to autonomously function in more complex, natural environments, they will have to respond to the inevitable fact that they will become damaged. However, while animals can quickly adapt to a wide variety of injuries, current robots cannot "think outside the box" to find a compensatory behavior when damaged: they are limited to their pre-specified self-sensing abilities, can diagnose only anticipated failure modes, and require a pre-programmed contingency plan for every type of potential damage, an impracticality for complex robots. Here we introduce an intelligent trial-and-error algorithm that allows robots to adapt to damage in less than two minutes, without requiring self-diagnosis or pre-specified contingency plans. Before deployment, a robot exploits a novel algorithm to create a detailed map of the space of high-performing behaviors: this map represents the robot's intuitions about what behaviors it can perform and their value. If the robot is damaged, it uses these intuitions to guide a trial-and-error learning algorithm that conducts intelligent experiments to rapidly discover a compensatory behavior that works in spite of the damage. Experiments reveal successful adaptations for a legged robot injured in five different ways, including damaged, broken, and missing legs, and for a robotic arm with joints broken in 14 different ways. This new technique will enable more robust, effective, autonomous robots, and suggests principles that animals may use to adapt to injury.
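
    The adaptation step can be pictured as a map-guided search. Below is a minimal sketch in Python, assuming a precomputed behavior-performance map (descriptor -> predicted performance) and a hypothetical try_behavior callback that executes a behavior on the damaged robot and returns its measured performance; the names and the simple "replace prior with observation" update are illustrative only, whereas the published method uses a Gaussian-process model that propagates each measurement to similar behaviors.

        # Illustrative sketch of map-guided trial-and-error adaptation (not the authors' code).
        def adapt(behavior_map, try_behavior, stop_fraction=0.9, max_trials=20):
            """behavior_map: dict descriptor -> performance predicted before deployment.
            try_behavior: function(descriptor) -> performance measured on the damaged robot."""
            expected = dict(behavior_map)      # current beliefs, initialised from the prior map
            tested = {}                        # descriptor -> measured performance
            best_prior = max(behavior_map.values())
            for _ in range(max_trials):
                untried = [b for b in expected if b not in tested]
                if not untried:
                    break
                b = max(untried, key=lambda k: expected[k])   # most promising untested behavior
                perf = try_behavior(b)                        # one physical trial on the robot
                tested[b] = perf
                expected[b] = perf                            # replace belief with observation
                if perf >= stop_fraction * best_prior:        # "good enough" stopping rule
                    break
            if not tested:
                raise ValueError("empty behavior map")
            best = max(tested, key=tested.get)
            return best, tested[best]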

    Three-Nucleon Electroweak Capture Reactions

    Recent advances in the study of the p-d radiative and mu-3He weak capture processes are presented and discussed. The three-nucleon bound and scattering states are obtained using the correlated-hyperspherical-harmonics method, with realistic Hamiltonians consisting of the Argonne v14 or Argonne v18 two-nucleon and Tucson-Melbourne or Urbana IX three-nucleon interactions. The electromagnetic and weak transition operators include one- and two-body contributions. The theoretical accuracy achieved in these calculations allows for interesting comparisons with experimental data. Comment: 12 pages, 4 figures, invited talk at the CFIF Fall Workshop: Nuclear Dynamics, from Quarks to Nuclei, Lisbon, 31st of October - 1st of November 200
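
    For orientation, the realistic Hamiltonians referred to above have the generic nuclear-physics form (a standard decomposition, written here for reference rather than quoted from the talk)

        H = \sum_{i} \frac{p_i^2}{2m} + \sum_{i<j} v_{ij} + \sum_{i<j<k} V_{ijk},

    with v_{ij} the Argonne v14 or v18 two-nucleon interaction and V_{ijk} the Tucson-Melbourne or Urbana IX three-nucleon interaction.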

    The detection of the imprint of filaments on cosmic microwave background lensing

    Galaxy redshift surveys, such as 2dF, SDSS, 6dF, GAMA and VIPERS, have shown that the spatial distribution of matter forms a rich web, known as the cosmic web. The majority of galaxy survey analyses measure the amplitude of galaxy clustering as a function of scale, ignoring information beyond a small number of summary statistics. Since the matter density field becomes highly non-Gaussian as structure evolves under gravity, we expect other statistical descriptions of the field to provide us with additional information. One way to study the non-Gaussianity is to study filaments, which evolve non-linearly from the initial density fluctuations produced in the primordial Universe. In our study, we report the first detection of CMB (Cosmic Microwave Background) lensing by filaments and we apply a null test to confirm our detection. Furthermore, we propose a phenomenological model to interpret the detected signal and we quantify how filaments trace the matter distribution on large scales through the filament bias, which we measure to be around 1.5. Our study provides a new avenue for understanding the environmental dependence of galaxy formation. In the future, the joint analysis of lensing and Sunyaev-Zel'dovich observations might reveal the properties of `missing baryons', the vast majority of the gas, which resides in the intergalactic medium and has so far evaded most observations.
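
    A filament bias of about 1.5 means that, on large scales, the filament overdensity traces the matter overdensity roughly linearly; schematically (our notation, not necessarily the estimator used in the paper)

        \delta_{\rm fil}(\mathbf{x}) \simeq b_{\rm fil}\,\delta_{m}(\mathbf{x}), \qquad b_{\rm fil} \approx 1.5,

    so the filament-CMB-lensing cross-correlation is expected to scale as b_{\rm fil} times the matter-lensing correlation in the linear regime.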

    Minimum Sensitivity Based Robust Beamforming with Eigenspace Decomposition

    An enhanced eigenspace-based beamformer (ESB), derived using a minimum sensitivity criterion, is proposed with significantly improved robustness against steering vector errors. The sensitivity function is defined as the squared norm of the appropriately scaled weight vector; since the sensitivity of an array to perturbations becomes very large in the presence of steering vector errors, it can be used to find the best projection for the ESB, irrespective of the distribution of the additive noise. As demonstrated by simulation results, the proposed method performs better than the classic ESBs and a previously proposed uncertainty-set-based approach.
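
    A common way to write the sensitivity function described above, for weight vector w and presumed steering vector a (notation chosen here for illustration; the paper's exact scaling may differ), is

        T_{se}(\mathbf{w}) = \frac{\mathbf{w}^{H}\mathbf{w}}{\left|\mathbf{w}^{H}\mathbf{a}\right|^{2}},

    i.e. the squared weight-vector norm normalised by the array gain toward the presumed look direction; the proposed beamformer selects the eigenspace projection that keeps this quantity small.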

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
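
    For concreteness, the conventional decomposition of the default factor of 100 discussed here (standard WHO/IPCS values, added for illustration) is

        \text{ADI} = \frac{\text{NOAEL}}{10_{\text{inter}} \times 10_{\text{intra}}} = \frac{\text{NOAEL}}{100}, \qquad 10_{\text{inter}} = 4.0_{\text{TK}} \times 2.5_{\text{TD}}, \qquad 10_{\text{intra}} = 3.16_{\text{TK}} \times 3.16_{\text{TD}},

    where TK and TD denote the toxicokinetic and toxicodynamic sub-components whose claimed conservatism the article examines.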

    Fluids in cosmology

    We review the role of fluids in cosmology, first introducing them in General Relativity and then applying them to an FRW model of the Universe. We describe how relativistic and non-relativistic components evolve in the background dynamics. We also introduce scalar fields to show that they can drive accelerated expansion at very early times (inflation) and at late times (quintessence). We then study the thermodynamical properties of the fluids and, lastly, their perturbed kinematics. We emphasise the constraints on parameters from recent cosmological probes. Comment: 34 pages, 4 figures, version accepted as an invited review for the book "Computational and Experimental Fluid Mechanics with Applications to Physics, Engineering and the Environment". Version 2: typos corrected and references expanded
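
    The background evolution summarised above follows from the continuity equation for each fluid with a barotropic equation of state p = w\rho (standard results, included here for reference)

        \dot{\rho} + 3H(\rho + p) = 0 \;\Longrightarrow\; \rho \propto a^{-3(1+w)},

    giving \rho \propto a^{-3} for non-relativistic matter (w = 0), \rho \propto a^{-4} for radiation (w = 1/3) and \rho = \text{const} for a cosmological constant (w = -1); a slowly rolling scalar field with w \simeq -1 is what drives inflation at early times and quintessence at late times.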

    The factor structure of the Forms of Self-Criticising/Attacking & Self-Reassuring Scale in thirteen distinct populations

    There is considerable evidence that self-criticism plays a major role in the vulnerability to and recovery from psychopathology. Methods to measure this process, and its change over time, are therefore important for research in psychopathology and well-being. This study examined the factor structure of a widely used measure, the Forms of Self-Criticising/Attacking & Self-Reassuring Scale (FSCRS), in thirteen nonclinical samples (N = 7510) from twelve different countries: Australia (N = 319), Canada (N = 383), Switzerland (N = 230), Israel (N = 476), Italy (N = 389), Japan (N = 264), the Netherlands (N = 360), Portugal (N = 764), Slovakia (N = 1326), Taiwan (N = 417), the United Kingdom 1 (N = 1570), the United Kingdom 2 (N = 883), and the USA (N = 331). This study used more advanced analyses than prior reports: a bifactor item-response theory model, a two-tier item-response theory model, and a non-parametric item-response theory (Mokken) scale analysis. Although the original three-factor solution for the FSCRS (distinguishing between Inadequate-Self, Hated-Self, and Reassured-Self) had an acceptable fit, two-tier models with two general factors (Self-criticism and Self-reassurance) demonstrated the best fit across all samples. This study provides preliminary evidence suggesting that this two-factor structure can be used in a range of nonclinical contexts across countries and cultures. Inadequate-Self and Hated-Self might not be distinct factors in nonclinical samples. Future work may benefit from distinguishing between self-correction versus shame-based self-criticism.
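
    As a rough guide to the models compared here, a bifactor item-response model lets every item load on one general factor and one group-specific factor; in a dichotomous 2PL form (generic IRT notation, not the study's exact parameterisation, and the polytomous FSCRS items would in practice require a graded-response extension)

        P(x_{ij} = 1 \mid \boldsymbol{\theta}_i) = \operatorname{logit}^{-1}\!\left(a_{j}^{G}\theta_{i}^{G} + a_{j}^{S(j)}\theta_{i}^{S(j)} - b_{j}\right),

    while the two-tier variant replaces the single general factor with two correlated general factors, here Self-criticism and Self-reassurance.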