
    Assessing Livestock Water Productivity in Mixed Farming Systems of Gumara Watershed, Ethiopia

    A monitoring study was carried out in the Gumara watershed, upper Blue Nile basin, with the objective of evaluating livestock water productivity (LWP) using a life cycle assessment method. Sixty-two smallholder farmers were selected for the study, implemented between November 2006 and February 2008. Data on crop and livestock production were collected to allow assessment of livestock water productivity. Study sites were situated in three different rainfed mixed crop/livestock farming systems: the barley/potato-based system (BPS), the tef/finger millet-based system (TMS), and the rice/noug-based system (RNS). LWP was found to be significantly lower (p < 0.01) in RNS (0.057 USD m⁻³ water) than in TMS (0.066 USD m⁻³ water) or BPS (0.066 USD m⁻³ water). Notably, the water requirement per kg live weight of cattle increased towards the lower-altitude area (in RNS), mainly because of increased evapotranspiration. As a result, 20% more water was required per kg live weight of cattle in the low-lying RNS compared to BPS, situated in the upstream parts of the study area. Cattle herd management that involved early offtake increased LWP by 28% over the practice of late offtake. Crop water productivity expressed in monetary units (0.39 USD m⁻³ water) was higher than LWP (0.063 USD m⁻³ water) across the mixed farming systems of the Gumara watershed. Strategies for improving LWP from its present low level could include keeping only the more productive animals, increasing pasture productivity, and linking production to marketing. These strategies would also ease the imbalance between the existing high livestock population and the declining carrying capacity of natural pasture. Peer Reviewed
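The LWP metric used above is a ratio of monetary output to water depleted. A minimal sketch of that calculation, with purely illustrative numbers (not values from the Gumara watershed data):

```python
# Hedged sketch: livestock water productivity (LWP) as a monetary value of
# livestock outputs divided by the water depleted to produce them.
# All input numbers below are hypothetical illustrations.

def water_productivity(output_value_usd, water_depleted_m3):
    """Return productivity in USD per cubic metre of water."""
    return output_value_usd / water_depleted_m3

# Illustrative herd: value of beneficial outputs (milk, meat, draught power)
# versus evapotranspiration of the feed the herd consumed in the same period.
livestock_value_usd = 1_150.0   # hypothetical annual output value
feed_water_m3 = 18_000.0        # hypothetical water depleted for feed

lwp = water_productivity(livestock_value_usd, feed_water_m3)
print(f"LWP = {lwp:.3f} USD per m^3")   # → LWP = 0.064 USD per m^3
```

With these assumed inputs the result lands in the same order of magnitude as the values reported for the three farming systems, which is the point of the ratio: either raising the numerator (more productive animals, better marketing) or lowering the denominator (better pasture) improves LWP.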

    Desynchronization and Wave Pattern Formation in MPI-Parallel and Hybrid Memory-Bound Programs

    Analytic, first-principles performance modeling of distributed-memory parallel codes is notoriously imprecise. Even for applications with extremely regular and homogeneous compute-communicate phases, simply adding communication time to computation time often does not yield a satisfactory prediction of parallel runtime, due to deviations from the expected simple lockstep pattern caused by system noise, variations in communication time, and inherent load imbalance. In this paper, we highlight the specific cases of provoked and spontaneous desynchronization of memory-bound, bulk-synchronous pure MPI and hybrid MPI+OpenMP programs. Using simple microbenchmarks, we observe that although desynchronization can introduce increased waiting time per process, it does not necessarily cause lower resource utilization but can lead to an increase in available bandwidth per core. In the case of significant communication overhead, even natural noise can push the system into a state of automatic overlap of communication and computation, improving the overall time to solution. The saturation point, i.e., the number of processes per memory domain required to achieve full memory bandwidth, is pivotal in the dynamics of this process and the emerging stable wave pattern. We also demonstrate how hybrid MPI+OpenMP programming can prevent desirable desynchronization by eliminating the bandwidth bottleneck among processes. A Chebyshev filter diagonalization application is used to demonstrate some of the observed effects in a realistic setting. Comment: 18 pages, 8 figures
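The gap between the naive prediction and the desynchronized regime can be sketched with a toy cost model: in lockstep every iteration pays computation plus communication, whereas with full automatic overlap the per-iteration cost approaches the larger of the two. All timings below are illustrative assumptions, not measurements from the paper:

```python
# Hedged sketch: why adding communication time to computation time can
# mispredict runtime. Lockstep execution costs t_comp + t_comm per
# iteration; a fully desynchronized "wave" that overlaps communication
# with computation approaches max(t_comp, t_comm). Numbers are made up.

def lockstep_time(n_iter, t_comp, t_comm):
    # Naive model: every iteration serializes compute and communicate.
    return n_iter * (t_comp + t_comm)

def overlapped_time(n_iter, t_comp, t_comm):
    # Ideal automatic overlap of communication and computation.
    return n_iter * max(t_comp, t_comm)

n_iter, t_comp, t_comm = 1000, 2.0e-3, 0.5e-3
print(lockstep_time(n_iter, t_comp, t_comm))    # 2.5 s predicted
print(overlapped_time(n_iter, t_comp, t_comm))  # 2.0 s with full overlap
```

The real dynamics are richer (noise injection, the memory-bandwidth saturation point, stable wave patterns), but this bound shows why a desynchronized state can beat the lockstep prediction.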

    A Survey of Satisfiability Modulo Theory

    Satisfiability modulo theory (SMT) consists of testing the satisfiability of first-order formulas over linear integer or real arithmetic, or other theories. In this survey, we explain the combination of propositional satisfiability and decision procedures for conjunctions known as DPLL(T), and the alternative "natural domain" approaches. We also cover quantifiers, Craig interpolants, polynomial arithmetic, and how SMT solvers are used in automated software analysis. Comment: Computer Algebra in Scientific Computing, Sep 2016, Bucharest, Romania.
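To make the decision problem concrete, here is a deliberately naive bounded search over a Boolean combination of linear integer constraints; a real SMT solver (e.g. via DPLL(T)) decides such formulas without enumerating values, but the question asked is the same:

```python
# Hedged sketch of the SMT decision problem, NOT of a solver algorithm.
# We ask whether the formula
#   (x + y <= 5) AND (x - y >= 1 OR x >= 4)
# over the integers has a model, using brute-force bounded enumeration.
from itertools import product

def satisfiable(bound=10):
    for x, y in product(range(-bound, bound + 1), repeat=2):
        if (x + y <= 5) and (x - y >= 1 or x >= 4):
            return (x, y)   # a satisfying assignment (a "model")
    return None

model = satisfiable()
print(model)  # some pair satisfying both constraints
```

DPLL(T) instead treats each atom (x + y ≤ 5, x − y ≥ 1, x ≥ 4) as a propositional variable, searches over truth assignments, and consults a theory solver for linear integer arithmetic to check that the chosen conjunction of atoms is consistent.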

    Application of the quick scan audit methodology in an industrial filter production process

    The quick scan audit methodology (QSAM) is an established investigative tool for assessing the health of business processes and supply chains within short schedules. This study extends the standard QSAM procedure to include a simulation step. It also extends QSAM to a wider industry platform by applying it to the precision mechanical engineering industry, where managers have been under competitive pressure to reduce the lead time of an industrial filter production process. Following a review of the relevant literature, this paper presents the research design adopted in the study. The QSAM was conducted using various data collection techniques (such as observations, process activity mapping, interviews, questionnaires, brainstorming, and access to company documents) and data analysis methods (including cause-and-effect analysis, Pareto analysis, and time series plots). This is followed by the development of a set of improvement strategies, namely direct information sharing, priority planning, and additional data recording and analysis. In addition to testing the potential benefits of changing scheduling approaches for the paint plant, simulation was used in this study as a means of communication to increase employee participation in the QSAM process and enhance audit accuracy. It also provided the case company with a better understanding of the behaviour and characteristics of the system under study, thus facilitating more thoughtful decisions to improve the system. The paper concludes with further research opportunities derived from this study.
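The kind of scheduling comparison the simulation step enables can be sketched minimally: one bottleneck station (standing in for the paint plant) processes a fixed job list under FIFO versus a priority rule, and we compare resulting lead times. Job data and rules here are hypothetical illustrations, not the case company's actual process:

```python
# Hedged sketch: comparing two scheduling rules at a single bottleneck
# station. Jobs are (job_id, processing_time, priority), lower priority
# value = more urgent; all jobs are available at time zero.

def lead_times(jobs, priority_rule=False):
    order = sorted(jobs, key=lambda j: j[2]) if priority_rule else list(jobs)
    clock, finished = 0.0, {}
    for job_id, proc, _prio in order:
        clock += proc
        finished[job_id] = clock    # lead time = completion time here
    return finished

jobs = [("A", 5.0, 2), ("B", 1.0, 1), ("C", 3.0, 1)]
fifo = lead_times(jobs)
prio = lead_times(jobs, priority_rule=True)
print(fifo["B"], prio["B"])  # prints 6.0 1.0: urgent job B finishes earlier
```

Even a toy model like this serves the communication purpose the paper highlights: employees can see concretely how a priority-planning rule reorders work and shortens lead times for urgent jobs, before any change is made on the shop floor.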

    A new Late Agenian (MN2a, Early Miocene) fossil assemblage from Wallenried (Molasse Basin, Canton Fribourg, Switzerland)

    Excavations of two fossiliferous layers in the Wallenried sand and marl pit produced a highly diversified vertebrate fauna. New material allows reassessment of the taxonomic position of the ruminant taxa Andegameryx andegaviensis and the endemic Friburgomeryx wallenriedensis. An emended diagnosis for the second species is provided, and additional material of large and small mammals, as well as ectothermic vertebrates, is described. The recorded Lagomorpha show interesting morphological deviations from other Central European material and probably represent a unique transitional assemblage with a co-occurrence of Titanomys, Lagopsis and Prolagus. Rodentia and Eulipotyphla belong to typical and well-known species of the Agenian of the Swiss Molasse Basin. Abundant small mammal teeth have allowed us to pinpoint the biostratigraphic age of Wallenried to late MN2a. This age conforms to data derived from the charophyte assemblages and confirms the oldest occurrence of venomous snake fangs. The palaeoenvironmental context is quite complex: sedimentary structures and fauna (fishes, frogs, salamanders, ostracods) are characteristic of a humid, lacustrine environment within a floodplain system.

    Cosmology from LOFAR Two-metre Sky Survey Data Release 2: Angular Clustering of Radio Sources

    Covering ∼5600 deg² to rms sensitivities of ∼70–100 μJy beam⁻¹, the LOFAR Two-metre Sky Survey Data Release 2 (LoTSS-DR2) provides the largest low-frequency (∼150 MHz) radio catalogue to date, making it an excellent tool for large-area radio cosmology studies. In this work, we use LoTSS-DR2 sources to investigate the angular two-point correlation function of galaxies within the survey. We discuss systematics in the data and an improved methodology for generating random catalogues, compared to that used for LoTSS-DR1, before presenting the angular clustering for ∼900,000 sources with flux density ≥ 1.5 mJy and peak signal-to-noise ≥ 7.5 across ∼80% of the observed area. Using the clustering, we infer the bias assuming two evolutionary models. When fitting angular scales of 0.5° ≤ θ < 5° using a linear bias model, we find LoTSS-DR2 sources are biased tracers of the underlying matter, with a bias of b_C = 2.14 (+0.22/−0.20) (assuming constant bias) and b_E(z=0) = 1.79 (+0.15/−0.14) (for an evolving model, inversely proportional to the growth factor), corresponding to b_E = 2.81 (+0.24/−0.22) at the median redshift of our sample, assuming the LoTSS Deep Fields redshift distribution is representative of our data. This reduces to b_C = 2.02 (+0.17/−0.16) and b_E(z=0) = 1.67 (+0.12/−0.12) when allowing preferential redshift distributions from the Deep Fields to model our data. Whilst the clustering amplitude is slightly lower than for LoTSS-DR1 (≥ 2 mJy), our study benefits from larger samples and improved redshift estimates.
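The evolving-bias model quoted above is b_E(z) = b_E(0)/D(z), with D(z) the linear growth factor. A minimal numerical sketch, assuming an illustrative flat ΛCDM cosmology (Ωm = 0.3) and an assumed median redshift of 0.9 (the survey's exact parameters and redshift distribution are not reproduced here):

```python
# Hedged sketch: bias evolving inversely with the linear growth factor,
# b_E(z) = b_E(0) / D(z). Cosmological parameters and the median redshift
# below are illustrative assumptions, not the paper's exact values.
import math

def growth_factor(z, omega_m=0.3, steps=20000, z_max=100.0):
    """Linear growth factor D(z) for flat LambdaCDM, normalized to D(0)=1."""
    def E(zz):  # dimensionless Hubble rate H(z)/H0
        return math.sqrt(omega_m * (1 + zz) ** 3 + (1 - omega_m))
    def integral(z0):
        # trapezoidal integral of (1+z')/E(z')^3 from z0 to z_max
        h = (z_max - z0) / steps
        s = 0.5 * ((1 + z0) / E(z0) ** 3 + (1 + z_max) / E(z_max) ** 3)
        for i in range(1, steps):
            zz = z0 + i * h
            s += (1 + zz) / E(zz) ** 3
        return s * h
    def unnorm(zz):
        return E(zz) * integral(zz)
    return unnorm(z) / unnorm(0.0)

b0 = 1.79       # b_E(z=0) from the constant-distribution fit quoted above
z_med = 0.9     # assumed median redshift, for illustration only
b_at_z = b0 / growth_factor(z_med)
print(round(b_at_z, 2))
```

With these assumptions the result lands close to the b_E ≈ 2.8 quoted at the sample's median redshift, illustrating how a modest bias today maps to a larger bias for the higher-redshift radio population.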

    Security Games with Market Insurance

    Security games are characterized by multiple players who strategically adjust their defenses against an abstract attacker, represented by realizations of nature. The defense strategies include both actions where security generates positive externalities and actions that do not. When the players are assumed to be risk averse, market insurance enters as a third strategic option. We formulate a one-shot security game with market insurance, characterize its pure equilibria, and describe how the equilibria compare to established results. Simplifying assumptions include homogeneous players, fair insurance premiums, and complete information except for realizations of nature. The results add more realism to the interpretation of analytical models of security games and might inform policy makers on adjusting incentives to improve network security and foster the development of a market for cyber-insurance.
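The single-player decision underlying the insurance option can be sketched with expected utility: a risk-averse agent facing a random loss compares bearing the risk with buying full insurance at an actuarially fair premium. Parameters and the CARA utility choice are illustrative assumptions; the paper's game adds strategic interaction and protection externalities on top of this:

```python
# Hedged sketch: why risk-averse players value fair market insurance.
# CARA utility u(w) = -exp(-a*w); the agent compares bearing a random
# loss with full insurance at the actuarially fair premium p * loss.
# All parameter values are illustrative.
import math

def cara_utility(w, a=0.5):
    return -math.exp(-a * w)

def expected_utility(wealth, loss, p_loss, premium=None, a=0.5):
    if premium is not None:            # full insurance: wealth is certain
        return cara_utility(wealth - premium, a)
    return (p_loss * cara_utility(wealth - loss, a)
            + (1 - p_loss) * cara_utility(wealth, a))

wealth, loss, p = 10.0, 6.0, 0.2
fair_premium = p * loss                # actuarially fair: expected loss
eu_risky = expected_utility(wealth, loss, p)
eu_insured = expected_utility(wealth, loss, p, premium=fair_premium)
print(eu_insured > eu_risky)  # prints True: fair full cover is preferred
```

This is the textbook result that makes insurance a credible third strategy alongside self-protection: under fair premiums a risk-averse player always prefers full cover to facing the loss, so equilibria can shift once a functioning insurance market is introduced.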