
    Recent Progress in Neutron Star Theory

    This review contains chapters discussing: Energy density functionals of nuclear matter, Many-body theory of nucleon matter, Hadronic and quark matter, Mixtures of phases in dense matter, and Neutron star observations and predictions.
    Comment: 33 pages + 13 figs., Ann. Rev. Nucl. & Part. Science, 200

    Robots that can adapt like animals

    As robots leave the controlled environments of factories to autonomously function in more complex, natural environments, they will have to respond to the inevitable fact that they will become damaged. However, while animals can quickly adapt to a wide variety of injuries, current robots cannot "think outside the box" to find a compensatory behavior when damaged: they are limited to their pre-specified self-sensing abilities, can diagnose only anticipated failure modes, and require a pre-programmed contingency plan for every type of potential damage, an impracticality for complex robots. Here we introduce an intelligent trial-and-error algorithm that allows robots to adapt to damage in less than two minutes, without requiring self-diagnosis or pre-specified contingency plans. Before deployment, a robot exploits a novel algorithm to create a detailed map of the space of high-performing behaviors: this map represents the robot's intuitions about what behaviors it can perform and their value. If the robot is damaged, it uses these intuitions to guide a trial-and-error learning algorithm that conducts intelligent experiments to rapidly discover a compensatory behavior that works in spite of the damage. Experiments reveal successful adaptations for a legged robot injured in five different ways, including damaged, broken, and missing legs, and for a robotic arm with joints broken in 14 different ways. This new technique will enable more robust, effective, autonomous robots, and suggests principles that animals may use to adapt to injury.
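    The two-stage scheme described above (a precomputed behavior-performance map, then map-guided trial and error after damage) can be sketched in a few lines. Everything here is a toy stand-in: the one-dimensional gait parameter, both performance models, and the map-update rule are assumptions for illustration, not the authors' algorithm.

```python
# Hypothetical one-dimensional behavior space: each behavior is a gait parameter.
behaviors = [i / 19 for i in range(20)]

# Before deployment: map of predicted performance per behavior
# (a toy model of walking speed for the intact robot, peaking near 0.7).
perf_map = {b: 1.0 - (b - 0.7) ** 2 for b in behaviors}

def damaged_performance(b):
    """Toy damaged robot: the best intact gait no longer works;
    a compensatory gait near b = 0.3 now performs best."""
    return 1.0 - (b - 0.3) ** 2

def adapt(perf_map, evaluate, threshold=0.95, penalty=0.2, radius=0.3):
    """Map-guided trial and error: test the most promising untried
    behavior; after each failed trial, down-weight the map near it."""
    est = dict(perf_map)
    trials = []
    for _ in range(len(est)):
        b = max(est, key=est.get)       # most promising remaining behavior
        score = evaluate(b)             # one physical trial on the damaged robot
        trials.append((b, score))
        if score >= threshold:
            return b, trials
        for other in est:               # behaviors similar to a failed one
            est[other] -= penalty * max(0.0, radius - abs(other - b))
        est[b] = float("-inf")          # never retry the same behavior
    return max(trials, key=lambda t: t[1])[0], trials

best, trials = adapt(perf_map, damaged_performance)
```

    The precomputed map plays the role of the robot's "intuitions": it concentrates trials on behaviors still believed to be good instead of searching the space blindly.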

    Three-Nucleon Electroweak Capture Reactions

    Recent advances in the study of the p-d radiative and mu-3He weak capture processes are presented and discussed. The three-nucleon bound and scattering states are obtained using the correlated-hyperspherical-harmonics method, with realistic Hamiltonians consisting of the Argonne v14 or Argonne v18 two-nucleon and Tucson-Melbourne or Urbana IX three-nucleon interactions. The electromagnetic and weak transition operators include one- and two-body contributions. The theoretical accuracy achieved in these calculations allows for interesting comparisons with experimental data.
    Comment: 12 pages, 4 figures, invited talk at the CFIF Fall Workshop: Nuclear Dynamics, from Quarks to Nuclei, Lisbon, 31st of October - 1st of November 200

    The detection of the imprint of filaments on cosmic microwave background lensing

    Galaxy redshift surveys, such as 2dF, SDSS, 6dF, GAMA and VIPERS, have shown that the spatial distribution of matter forms a rich web, known as the cosmic web. The majority of galaxy survey analyses measure the amplitude of galaxy clustering as a function of scale, ignoring information beyond a small number of summary statistics. Since the matter density field becomes highly non-Gaussian as structure evolves under gravity, we expect other statistical descriptions of the field to provide us with additional information. One way to study the non-Gaussianity is to study filaments, which evolve non-linearly from the initial density fluctuations produced in the primordial Universe. In our study, we report the first detection of CMB (Cosmic Microwave Background) lensing by filaments and we apply a null test to confirm our detection. Furthermore, we propose a phenomenological model to interpret the detected signal and we measure how filaments trace the matter distribution on large scales through filament bias, which we measure to be around 1.5. Our study offers a new avenue for understanding the environmental dependence of galaxy formation. In the future, the joint analysis of lensing and Sunyaev-Zel'dovich observations might reveal the properties of `missing baryons', the vast majority of the gas which resides in the intergalactic medium and has so far evaded most observations.
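    In the linear regime, the filament bias quoted above is just the proportionality between filament and matter overdensities, delta_f = b_f * delta_m, estimated from a cross-correlation. A minimal sketch of that estimator follows; the Gaussian toy fields, the noise level, and the pixel count are assumptions, not the paper's data.

```python
import random

random.seed(1)

TRUE_BIAS = 1.5  # assumed value, matching the measurement quoted above

# Toy matter overdensity field: zero-mean Gaussian draws standing in
# for pixels of a density map.
matter = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Linear-bias model: filament overdensity traces matter plus stochastic noise.
filament = [TRUE_BIAS * m + random.gauss(0.0, 0.5) for m in matter]

# Estimator b_f = <delta_f delta_m> / <delta_m delta_m>, the real-space
# analogue of taking the ratio of cross- to auto-power spectra.
cross = sum(f * m for f, m in zip(filament, matter)) / len(matter)
auto = sum(m * m for m in matter) / len(matter)
bias_hat = cross / auto
```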

    Fluids in cosmology

    We review the role of fluids in cosmology by first introducing them in General Relativity and then by applying them to a FRW universe model. We describe how relativistic and non-relativistic components evolve in the background dynamics. We also introduce scalar fields to show that they are able to yield an inflationary dynamics at very early times (inflation) and at late times (quintessence). Then, we proceed to study the thermodynamical properties of the fluids and, lastly, their perturbed kinematics. We emphasize the constraints on parameters from recent cosmological probes.
    Comment: 34 pages, 4 figures, version accepted as invited review to the book "Computational and Experimental Fluid Mechanics with Applications to Physics, Engineering and the Environment". Version 2: typos corrected and references expanded
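    The background evolution of the components mentioned above reduces, for a perfect fluid with constant equation of state w = p / (rho c^2), to rho(a) = rho_0 * a^(-3(1+w)). A minimal sketch (the normalization rho_0 = 1 and the scale factor a = 0.5 are arbitrary illustrative choices):

```python
def density(rho0, a, w):
    """rho(a) = rho0 * a**(-3*(1+w)) for a constant equation of state w."""
    return rho0 * a ** (-3.0 * (1.0 + w))

# At half the present scale factor (a = 0.5, with a = 1 today):
matter = density(1.0, 0.5, 0.0)        # w = 0: dilutes as a^-3 -> 8.0
radiation = density(1.0, 0.5, 1 / 3)   # w = 1/3: dilutes as a^-4 -> 16.0
vacuum = density(1.0, 0.5, -1.0)       # w = -1: constant -> 1.0
```

    Radiation dilutes faster than matter, so it dominates going back in time, while a w = -1 component comes to dominate at late times, as in quintessence-like scenarios.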

    Minimum Sensitivity Based Robust Beamforming with Eigenspace Decomposition

    An enhanced eigenspace-based beamformer (ESB) derived using the minimum sensitivity criterion is proposed, with significantly improved robustness against steering vector errors. The sensitivity function is defined as the squared norm of the appropriately scaled weight vector. Since the sensitivity of an array to perturbations becomes very large in the presence of steering vector errors, it can be used to find the best projection for the ESB, irrespective of the distribution of the additive noise. As demonstrated by simulation results, the proposed method outperforms the classic ESBs and the previously proposed uncertainty-set-based approach.
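    The two ingredients named above, the eigenspace projection and the sensitivity function, can be illustrated with a short NumPy sketch. The array geometry, noise levels, and the fixed rank-one signal subspace are assumptions for illustration; the paper's contribution is selecting the subspace by minimizing the sensitivity, which this sketch only evaluates for one fixed choice.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 2000                      # hypothetical 8-element ULA, 2000 snapshots

def steering(theta):
    """Steering vector of a half-wavelength-spaced uniform linear array."""
    return np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

a_true = steering(0.0)              # actual signal direction
a_err = steering(0.05)              # presumed (mismatched) steering vector

# Simulated snapshots: desired signal plus white noise.
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(a_true, s) + noise
R = X @ X.conj().T / N              # sample covariance

# Eigenspace projection: project the presumed steering vector onto the
# dominant (signal) subspace, removing much of the mismatch.
_, U = np.linalg.eigh(R)            # eigh sorts eigenvalues ascending
Es = U[:, -1:]                      # rank-1 signal subspace in this toy
a_proj = Es @ (Es.conj().T @ a_err)

# MVDR-style weights from the projected steering vector, with light loading.
w = np.linalg.solve(R + 1e-3 * np.eye(M), a_proj)
w = w / (a_proj.conj() @ w)

# Sensitivity function: squared norm of the weight vector scaled to give
# unit response toward a_proj (the quantity minimized in the abstract).
sensitivity = float(np.linalg.norm(w) ** 2)

# Alignment with the true steering vector, before and after projection.
align_before = abs(np.vdot(a_err, a_true)) / M
a_unit = a_proj / np.linalg.norm(a_proj)
align_after = abs(np.vdot(a_unit, a_true)) / np.sqrt(M)
```

    In this toy the projected vector aligns far better with the true direction than the mismatched one, which is the mechanism behind the ESB's robustness.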

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions.
    Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
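    The dependence on distributional choices noted above is easy to demonstrate. Suppose, purely as an illustrative assumption, that each sub-factor of 10 is the 95th percentile of a lognormal distribution with median 1. The deterministic product of the two percentiles is 100, but the 95th percentile of the product distribution is far smaller:

```python
import math
import random

random.seed(0)
N = 200_000

# Assumption: each sub-factor is lognormal with median 1 and 95th
# percentile 10, so sigma = ln(10) / z_0.95 with z_0.95 = 1.645.
sigma = math.log(10.0) / 1.645

def sub_factor():
    return math.exp(random.gauss(0.0, sigma))

# Monte Carlo product of the toxicokinetic and toxicodynamic sub-factors.
products = sorted(sub_factor() * sub_factor() for _ in range(N))
p95_product = products[int(0.95 * N)]

# p95_product comes out around 26, not 100: the deterministic product of
# the two 95th percentiles (10 * 10 = 100) sits near the 99th percentile
# of the product distribution, i.e. a different protection level than
# the same sub-factors imply when combined probabilistically.
```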

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb⁻¹ of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with pT > 120 GeV and no leptons. Nine signal regions are considered, with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
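    Underlying such exclusion limits is a Poisson counting experiment per signal region: any signal that would have made the observed count too unlikely is excluded. A minimal sketch follows; the event counts are invented, and real ATLAS limits use the CLs method with systematic uncertainties, which this omits.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu), accumulated term by term."""
    term = math.exp(-mu)
    total = term
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def upper_limit(n_obs, bkg, cl=0.95, step=0.1):
    """Smallest signal yield s excluded at the given confidence level:
    the first s with P(N <= n_obs | bkg + s) <= 1 - cl."""
    s = 0.0
    while poisson_cdf(n_obs, bkg + s) > 1.0 - cl:
        s += step
    return s

# Toy signal region: 100 events observed, 100 expected from background.
limit = upper_limit(100, 100.0)     # roughly 18 signal events excluded
```

    In practice the limit on the event yield is then divided by luminosity and acceptance to constrain the cross-section of each model.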

    The deuteron: structure and form factors

    A brief review of the history of the discovery of the deuteron is provided. The current status of both experiment and theory for elastic electron scattering is then presented.
    Comment: 80 pages, 33 figures, submitted to Advances in Nuclear Physics

    Predictor variables and screening protocol for depressive and anxiety disorders in cancer outpatients

    Background Cancer patients are at increased risk of persistent depressive and anxiety symptoms and disorders compared to the general population. However, these issues are not always identified, which may worsen the prognosis and increase morbidity and mortality. Therefore, the objectives of this study are to identify predictor variables (demographic and clinical) for the development of mood and anxiety disorders in cancer outpatients and to propose a probabilistic screening protocol considering these variables and certain standardized screening instruments. Methods A total of 1,385 adults, of both genders, receiving outpatient cancer care were evaluated using a questionnaire and screening instruments. Thereafter, 400 of these subjects responded to the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (SCID-IV) by telephone to confirm or rule out the presence of a Current Major Depressive Episode (CMDE) or Anxiety Disorder (AD). Results Of the patients surveyed, 64% met the criteria for CMDE and 41% for AD. Female gender was found to be a risk factor for both disorders, and previous psychiatric history and marital status (divorced and widowed) were risk factors for anxiety disorders. Scores above the recommended cutoff on the screening instruments also indicated a risk of the studied disorders. Based on these findings, a screening protocol and nomograms were created for the quantification, combination and probabilistic estimation of risk, with accuracy indicators >0.68. Conclusion The prevalence rates for the disorders under study are extremely high in cancer patients. The use of the proposed protocol and nomogram can facilitate rapid and wide screening, thus refining triage and supporting the establishment of criteria for referral to mental health professionals, so that patients can be properly diagnosed and treated.
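    A nomogram of the kind proposed is, underneath, a points-based rendering of a logistic regression: each predictor contributes a weighted score, and the total maps to a probability. The sketch below illustrates the mechanism only; the coefficients and intercept are invented, not the fitted values from this study.

```python
import math

def risk(predictors, coefs, intercept):
    """Logistic model behind a screening nomogram: probability of a
    positive diagnosis given binary predictor values."""
    z = intercept + sum(c * x for c, x in zip(coefs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: female gender, prior psychiatric history,
# divorced/widowed status, screening score above cutoff.
coefs = [0.8, 0.9, 0.5, 1.4]        # invented weights
intercept = -1.2

p_high = risk([1, 1, 0, 1], coefs, intercept)   # several risk factors present
p_low = risk([0, 0, 0, 0], coefs, intercept)    # none present
```

    Thresholding such probabilities is what turns the nomogram into the referral criteria described in the conclusion.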