
    A model for fossil energy use in Danish agriculture used to compare organic and conventional farming

    Knowledge about fossil energy use in agricultural systems is needed, because it can improve the understanding of how to reduce the unsustainable use of limited energy resources and the associated greenhouse gas emissions. This study describes and validates a model to assess fossil energy use in Danish agriculture; gives an example of how the model can be used to compare organic and conventional farming; and discusses the implications and potentials of using the model to simulate energy use in scenarios of agricultural production. The model is a development of an existing model, which was too coarse to predict measured energy use on Danish farms. The model was validated at the field-operation, crop-type, and national levels, and can supplement the Intergovernmental Panel on Climate Change manual to quantify fossil energy use and subsequent carbon dioxide emissions from agriculture. The model can be used to model energy use as one indicator in a multi-criteria evaluation of sustainability, alongside other agro-ecological and socio-economic indicators. As an example, energy use for eight conventional and organic crop types on loamy, sandy, and irrigated sandy soil was compared. Energy use was generally lower in the organic than in the conventional system, but yields were also lower. Consequently, conventional crop production had the highest energy production, whereas organic crop production had the highest energy efficiency. Generally, grain cereals such as wheat have a lower energy use per area than roughage crops such as beets. However, because of higher roughage crop yields per area, energy use per feed unit was lower in the roughage crops. Energy use for both conventional cattle and pig production was found to be higher than that for organic production. With respect to fossil energy use per livestock unit produced, agro-ecosystems producing pigs were in both cases less energy efficient than those producing cattle. 
Fossil energy use for three scenarios of conversion to organic farming with increasing fodder import was compared to current conventional farming in Denmark. The scenario with the highest fodder import showed the highest energy use per livestock unit produced. In all scenarios, the energy use per unit produced was lower than in the present situation. However, the total Danish crop production was also lower. In conclusion, the model can be used to simulate scenarios, which can add new information to the discussion of future, sustainable agricultural production.
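The trade-off described above (conventional systems produce more energy, organic systems use fossil energy more efficiently) follows from the ratio of energy output to fossil input. A minimal sketch, with all per-hectare figures invented purely for illustration and not taken from the Danish model:

```python
# Hypothetical illustration of the abstract's energy accounting.
# All numbers are invented for demonstration.
def energy_efficiency(energy_output_gj, fossil_input_gj):
    """Energy efficiency = energy produced per unit of fossil energy used."""
    return energy_output_gj / fossil_input_gj

# Invented per-hectare figures (GJ/ha) for one crop on one soil type.
conventional = {"output": 60.0, "input": 15.0}
organic      = {"output": 45.0, "input": 9.0}

eff_conv = energy_efficiency(conventional["output"], conventional["input"])  # 4.0
eff_org  = energy_efficiency(organic["output"], organic["input"])            # 5.0

# Conventional yields more total energy, but organic uses fossil energy
# more efficiently -- the trade-off the abstract reports.
assert conventional["output"] > organic["output"]
assert eff_org > eff_conv
```

The same ratio also explains the per-feed-unit comparison: energy use per feed unit is (energy per area) divided by (yield per area), so a higher-yielding crop can have lower energy use per feed unit despite higher energy use per area.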

    Storage tests of nitrogen tetroxide and hydrazine in aluminum containers

    Nitrogen tetroxide and hydrazine compatibility with aluminum alloy storage tank

    Allograft and Xenograft Acceptance under FK‐506 and Other Immunosuppressant Treatment

    We will focus on two issues, both involving, but not confined to, FK-506: first, the meaning of graft acceptance, which is, after all, the objective of immunosuppression for the transplant surgeon; and second, how to take the next great step of xenotransplantation.

    The art of HIV elimination: past and present science

    Introduction: Remarkable strides have been made in controlling the HIV epidemic, although not enough to achieve epidemic control. More recently, interest in biomedical HIV control approaches has increased, but substantial challenges with the HIV cascade of care hinder successful implementation. We summarise all available HIV prevention methods and make recommendations on how to address current challenges. Discussion: In the early days of the epidemic, behavioural approaches to HIV control dominated, and the few available evidence-based interventions demonstrated to reduce HIV transmission were applied independently of one another. More recently, it has become clear that combination prevention strategies targeted to high-transmission geographies and people at most risk of infection are required to achieve epidemic control. Biomedical strategies such as medical male circumcision and antiretroviral therapy, used for treatment in HIV-positive individuals and as pre-exposure prophylaxis in HIV-negative individuals, provide immense promise for the future of HIV control. In resource-rich settings, the threat of HIV treatment optimism resulting in increased sexual risk taking has been observed, and there are concerns that as ART roll-out matures in resource-poor settings and the benefits of ART become clearly visible, behavioural disinhibition may also become a challenge in those settings. Unfortunately, an efficacious vaccine, a strategy which could potentially halt the HIV epidemic, remains elusive. Conclusion: Combination HIV prevention offers a logical approach to HIV control, although what and how the available options should be combined is contextual. Therefore, knowledge of the local or national drivers of HIV infection is paramount. Problems with the HIV care continuum remain of concern, hindering progress towards the UNAIDS target of 90-90-90 by 2020. 
Research is needed on combination interventions that address all the steps of the cascade, as the steps are not independent of each other. Until these issues are addressed, HIV elimination may remain an unattainable goal.
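The point that the cascade steps are not independent can be made quantitative: the UNAIDS 90-90-90 target means 90% of people living with HIV diagnosed, 90% of those on ART, and 90% of those virally suppressed, so each step applies only to the population retained by the previous one and the targets multiply:

```python
# The UNAIDS 90-90-90 cascade: the targets are conditional on the previous
# step, so overall suppression is their product.
diagnosed = 0.90   # fraction of people living with HIV who know their status
on_art = 0.90      # fraction of the diagnosed who are on antiretroviral therapy
suppressed = 0.90  # fraction of those on ART who are virally suppressed

overall_suppression = diagnosed * on_art * suppressed
print(f"{overall_suppression:.1%}")  # 72.9% of all people living with HIV
```

A shortfall at any single step therefore caps the whole cascade, which is why interventions addressing individual steps in isolation fall short of the combined target.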

    Vibration Isolation Design for the Micro-X Rocket Payload

    Micro-X is a NASA-funded, sounding rocket-borne X-ray imaging spectrometer that will allow high-precision measurements of the velocity structure, ionization state, and elemental composition of extended astrophysical systems. One of the biggest challenges in the payload design is to maintain the temperature of the detectors during launch. There are several vibration damping stages to prevent energy transmission from the rocket skin to the detector stage, which would cause heating during launch. Each stage should be more rigid than the outer stages to achieve vibrational isolation. We describe a major design effort to tune the resonance frequencies of these vibration isolation stages to reduce heating problems prior to the projected launch in the summer of 2014.
    Comment: 6 pages, 7 figures, LTD15 Conference Proceeding
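The requirement that each inner stage be stiffer than the one outside it can be seen from the single-stage natural frequency, f = (1/2π)√(k/m). A minimal sketch, with all stage stiffnesses and masses invented for illustration (this is not the Micro-X design):

```python
import math

# Natural frequency of an idealised single spring-mass isolation stage.
def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Hypothetical stages, ordered outer -> inner; each inner stage is stiffer
# per unit mass, so its resonance sits above the one outside it.
stages = [            # (k in N/m, m in kg) -- invented values
    (4.0e4, 40.0),    # outer skin mount
    (2.0e5, 20.0),    # intermediate frame
    (1.0e6, 5.0),     # detector stage
]
freqs = [natural_frequency_hz(k, m) for k, m in stages]
assert freqs == sorted(freqs)  # resonances increase toward the detector
```

Ordering the resonances this way keeps each stage's resonance above the passband of the stage outside it, so vibrational energy from the rocket skin is attenuated before it reaches the detector stage.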

    The Mu2e crystal calorimeter

    The Mu2e experiment at Fermilab will search for the coherent, neutrino-less conversion of negative muons into electrons in the field of an aluminum nucleus, μ− + Al → e− + Al. Data collection is planned to start at the end of 2021. The dynamics of this charged lepton flavour violating (CLFV) process is well modelled by a two-body decay, resulting in a mono-energetic electron with an energy slightly below the muon rest mass. If no events are observed in three years of running, Mu2e will set an upper limit on the ratio between the conversion and the capture rates, R_μe = Γ(μ− + A(Z,N) → e− + A(Z,N)) / Γ(μ− + A(Z,N) → ν_μ + A(Z−1,N)), of ≤ 6 × 10^(−17) (@ 90% C.L.). This will improve on the current best limit by four orders of magnitude. Mu2e complements and extends the current search for the μ → e γ decay at MEG as well as the direct searches for new physics at the LHC. The observation of such a CLFV process would be clear evidence for New Physics beyond the Standard Model. Given its sensitivity, Mu2e will be able to probe New Physics at a scale inaccessible to direct searches at either present or planned high-energy colliders. To search for the muon conversion process, a very intense pulsed beam of negative muons (~ 10^(10) μ/sec) is stopped on an aluminum target inside a very long solenoid where the detector is also located. The Mu2e detector is composed of a straw-tube tracker and a CsI crystal electromagnetic calorimeter. An external veto for cosmic rays surrounds the detector solenoid. In 2016, Mu2e passed the final approval stage from the DOE and started its construction phase. An overview of the physics motivations for Mu2e, the current status of the experiment, and the required performance and design details of the calorimeter are presented.
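Why the signal electron is "slightly below the muon rest mass" follows from two-body kinematics: the electron energy is the muon mass minus the muonic binding energy in aluminum and the nuclear recoil. A back-of-the-envelope sketch with approximate constants (these are rough textbook values, not Mu2e's official numbers):

```python
# Approximate constants (MeV) -- illustrative, not authoritative.
M_MU = 105.658        # muon mass
B_MU_AL = 0.48        # approx. 1s binding energy of a muon in aluminum
M_AL = 27 * 931.494   # approx. Al-27 nuclear mass

e_guess = M_MU - B_MU_AL          # ignore recoil on a first pass
recoil = e_guess**2 / (2 * M_AL)  # small nuclear recoil correction
e_conversion = e_guess - recoil
print(f"{e_conversion:.2f} MeV")  # just below the muon rest mass
```

This mono-energetic line, sitting above the endpoint of ordinary muon decay-in-orbit background, is what the tracker and calorimeter are designed to resolve.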

    Structured networks and coarse-grained descriptions: a dynamical perspective

    This chapter discusses the interplay between structure and dynamics in complex networks. Given a particular network with an endowed dynamics, our goal is to find partitions aligned with the dynamical process acting on top of the network. We thus aim to gain a reduced description of the system that takes into account both its structure and dynamics. In the first part, we introduce the general mathematical setup for the types of dynamics we consider throughout the chapter. We provide two guiding examples, namely consensus dynamics and diffusion processes (random walks), motivating their connection to social network analysis, and provide a brief discussion of the general dynamical framework and its possible extensions. In the second part, we focus on the influence of graph structure on the dynamics taking place on the network, highlighting three concepts that allow us to gain insight into this notion. First, we describe how time-scale separation can appear in the dynamics on a network as a consequence of graph structure. Second, we discuss how the presence of particular symmetries in the network gives rise to invariant dynamical subspaces that can be precisely described by graph partitions. Third, we show how this dynamical viewpoint can be extended to study dynamics on networks with signed edges, which allows us to discuss connections to concepts in social network analysis, such as structural balance. In the third part, we discuss how to use dynamical processes unfolding on the network to detect meaningful network substructures. We then show how such dynamical measures can be related to seemingly different algorithms for community detection and coarse-graining proposed in the literature. We conclude with a brief summary and highlight interesting open future directions.
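The guiding example of consensus dynamics, dx/dt = −Lx with L the graph Laplacian, can be sketched in a few lines. The graph (a 4-node path) and the Euler step size below are illustrative choices, not taken from the chapter:

```python
# Consensus dynamics dx/dt = -L x on a 4-node path graph, integrated with
# explicit Euler steps. Each node moves toward the average of its neighbours.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
x = [0.0, 1.0, 4.0, 3.0]  # initial node states
avg = sum(x) / n          # the dynamics conserves the average

dt = 0.05
for _ in range(2000):
    dx = [0.0] * n
    for i, j in edges:    # dx_i/dt = sum_j (x_j - x_i), i.e. -L x
        dx[i] += x[j] - x[i]
        dx[j] += x[i] - x[j]
    x = [xi + dt * dxi for xi, dxi in zip(x, dx)]

# All states converge to the initial average.
assert all(abs(xi - avg) < 1e-6 for xi in x)
```

The convergence rate is governed by the Laplacian's spectral gap, which is exactly where the chapter's notion of time-scale separation enters: a small gap (e.g. two weakly connected clusters) produces a slow mode whose support reveals the partition.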

    Fermi gamma-ray `bubbles' from stochastic acceleration of electrons

    Gamma-ray data from Fermi-LAT reveal a bi-lobular structure extending up to 50 degrees above and below the galactic centre, which presumably originated in some form of energy release there less than a few million years ago. It has been argued that the gamma-rays arise from hadronic interactions of high-energy cosmic rays which are advected out by a strong wind, or from inverse-Compton scattering of relativistic electrons accelerated at plasma shocks present in the bubbles. We explore the alternative possibility that the relativistic electrons are undergoing stochastic 2nd-order Fermi acceleration by plasma wave turbulence through the entire volume of the bubbles. The observed gamma-ray spectral shape is then explained naturally by the resulting hard electron spectrum and inverse-Compton losses. Rather than a constant volume emissivity as in other models, we predict a nearly constant surface brightness, and reproduce the observed sharp edges of the bubbles.
    Comment: 4 pages, 4 figures; REVTeX4-1; discussion amended and one figure added; to appear in PR
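The distinction between constant volume emissivity and constant surface brightness is a projection effect. A minimal illustration (not the paper's calculation): the line-of-sight integral through a uniformly emitting sphere of radius R gives I(b) ∝ 2√(R² − b²) at impact parameter b, i.e. brightest at the centre and falling smoothly to zero at the edge, unlike a flat, sharp-edged profile:

```python
import math

# Projected brightness of a uniformly emitting sphere of radius R = 1:
# the chord length along the line of sight at impact parameter b.
R = 1.0
def brightness_uniform(b):
    return 2 * math.sqrt(max(R * R - b * b, 0.0))

profile = [brightness_uniform(b / 10) for b in range(11)]
assert profile[0] == 2.0                         # centre is brightest
assert profile == sorted(profile, reverse=True)  # smooth decline to the edge
```

A nearly flat observed profile with sharp edges therefore requires the emissivity to be enhanced toward the bubble surface, which is the signature the model predicts.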

    Locating bugs without looking back

    Bug localisation is a core program comprehension task in software maintenance: given the observation of a bug, e.g. via a bug report, where is it located in the source code? Information retrieval (IR) approaches see the bug report as the query and the source code files as the documents to be retrieved, ranked by relevance. Such approaches have the advantage of not requiring expensive static or dynamic analysis of the code. However, current state-of-the-art IR approaches rely on project history, in particular previously fixed bugs or previous versions of the source code. We present a novel approach that directly scores each current file against the given report, thus not requiring past code and reports. The scoring method is based on heuristics identified through manual inspection of a small sample of bug reports. We compare our approach to eight others, using their own five metrics on their own six open source projects. Out of 30 performance indicators, we improve on 27 and equal 2. Over the projects analysed, on average we find one or more affected files in the top 10 ranked files for 76% of the bug reports. These results show the applicability of our approach to software projects without history.
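The "report as query, files as documents" framing can be sketched with a generic IR baseline. This is not the paper's heuristic scorer, just a plain bag-of-words cosine ranking; the file names and contents are hypothetical:

```python
import math
import re
from collections import Counter

# Generic IR baseline: rank source files against a bug report by cosine
# similarity of their term-count vectors.
def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_files(report, files):
    q = Counter(tokens(report))
    scores = {name: cosine(q, Counter(tokens(body))) for name, body in files.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical project: the parser file shares the most vocabulary
# with the report, so it ranks first.
files = {
    "parser.py": "parse token stream raise error on unexpected token",
    "ui.py": "render window button click handler",
}
ranking = rank_files("crash with parse error on token", files)
assert ranking[0] == "parser.py"
```

A scorer like this needs no project history at all, which is the property the paper exploits; the authors' contribution is replacing the generic similarity with manually identified heuristics.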