
    What LFA beef and sheep farmers should do and why they should do it

    This paper describes how representative farm business models were employed to identify optimal beef and sheep production systems for Less Favoured Area (LFA) farms in Northern Ireland. The bio-economic models identify the optimal farming system for these farms under various market and policy assumptions and are therefore useful in helping to develop industry strategy. The models indicate that, under current market and policy conditions, a dairy-based beef system is likely to be the most profitable beef enterprise. However, depending on land quality, livestock housing resources, and the market and policy environment, suckler-based beef systems can also feature in the profit-maximising enterprise mix. The results also suggest that the optimal sheep system is consistent with the stratified sheep systems traditionally operated in Northern Ireland. In general, beef production appears to have some advantages over sheep production: depending on relative prices and resource availabilities, it is often better to replace sheep with cattle and employ the released labour off-farm than to replace cattle with sheep and invest the released capital off-farm. In some situations, farmers should significantly reduce their capital and labour inputs to the farm business by substantially reducing stocking rates or even abandoning land completely.
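    The cattle-versus-sheep substitution logic can be sketched as a simple comparison of two whole-farm plans on a fixed land area. Every number below (gross margins, labour and capital coefficients, wage, rate of return) is an invented assumption for illustration, not a figure from the paper's models.

    ```python
    # Compare two plans for the same land: cattle plus surplus labour sold
    # off-farm, versus sheep plus the released capital invested off-farm.
    # All coefficients are illustrative assumptions.
    LAND_HA = 40.0
    CATTLE = {"margin": 350.0, "hours": 20.0, "capital": 1200.0, "ha": 0.8}
    SHEEP  = {"margin": 45.0,  "hours": 4.0,  "capital": 120.0,  "ha": 0.15}
    WAGE = 12.0             # off-farm wage per hour (assumed)
    CAPITAL_RETURN = 0.04   # annual return on off-farm investment (assumed)
    HOURS_AVAILABLE = 2200.0

    def plan(stock):
        """Farm income plus off-farm earnings from surplus labour."""
        head = LAND_HA / stock["ha"]
        surplus_hours = max(0.0, HOURS_AVAILABLE - head * stock["hours"])
        income = head * stock["margin"] + surplus_hours * WAGE
        return income, head * stock["capital"]

    cattle_income, cattle_capital = plan(CATTLE)
    sheep_income, sheep_capital = plan(SHEEP)
    # The sheep plan ties up less capital; credit the difference with an
    # off-farm return.
    sheep_income += (cattle_capital - sheep_capital) * CAPITAL_RETURN
    print(f"cattle plan: {cattle_income:,.0f}  sheep plan: {sheep_income:,.0f}")
    ```

    With these assumed coefficients the cattle-plus-off-farm-labour plan comes out ahead, mirroring the abstract's conclusion; different prices or resource availabilities can reverse the ranking.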

    Developing the Spatial Dimension of Farm Business Models

    A non-linear mathematical farm business optimisation model, set within a spatial economic framework, has been developed. The model incorporates factors such as location, spatial market orientation and technology use, and identifies the business strategy that is optimal in different market and policy environments. Farm household time use is central to the model, enabling it to examine how on-farm and off-farm activities compete for limited farm household human resources. The model is applied to a beef and sheep farm that can choose between selling livestock to meat processors or processing on-farm and selling direct to consumers. Model simulations reveal when it is optimal for the farm business to innovate in this way and how this decision is affected by changes in key parameters. The farm business model is solved using the GAMS/LINDOGlobal mathematical programming software package. While traditional nonlinear programming and mixed-integer nonlinear programming algorithms are guaranteed to converge only under certain convexity assumptions, GAMS/LINDOGlobal finds guaranteed globally optimal solutions to general nonlinear problems. The model and model results are discussed within the context of theoretical underpinnings, model tractability, and potential applications.
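    A toy version of this kind of model can be sketched with off-the-shelf tools. The two marketing channels (processor sales versus labour-intensive direct sales with a price that falls as local demand saturates), all prices, and all labour and land coefficients below are invented assumptions, and SciPy's local SLSQP solver stands in for the global GAMS/LINDOGlobal solver used by the authors; this is a sketch of the modelling idea, not the paper's model.

    ```python
    # Toy nonlinear farm-business optimisation: choose processor sales,
    # direct sales, and off-farm work hours to maximise profit subject to
    # household labour and land constraints. All parameters are assumptions.
    from scipy.optimize import minimize

    WAGE = 12.0          # off-farm wage per hour (assumed)
    PROC_PRICE = 900.0   # price per animal sold to a processor (assumed)
    DIRECT_BASE = 2000.0 # direct-sales price before saturation (assumed)
    SATURATION = 20.0    # price drop per extra animal sold direct (assumed)
    LABOUR = 2500.0      # household labour hours available
    HRS_PROC = 30.0      # hours per animal, processor channel
    HRS_DIRECT = 55.0    # hours per animal, incl. on-farm processing
    LAND_CAP = 60.0      # herd size the land can carry

    def neg_profit(x):
        n_proc, n_direct, off_hours = x
        # Direct-sales revenue is nonlinear: unit price declines with volume.
        direct_rev = n_direct * (DIRECT_BASE - SATURATION * n_direct)
        return -(n_proc * PROC_PRICE + direct_rev + off_hours * WAGE)

    cons = [
        # Household time: on-farm and off-farm activities compete.
        {"type": "ineq",
         "fun": lambda x: LABOUR - (HRS_PROC * x[0] + HRS_DIRECT * x[1] + x[2])},
        # Land: total herd cannot exceed carrying capacity.
        {"type": "ineq", "fun": lambda x: LAND_CAP - (x[0] + x[1])},
    ]
    res = minimize(neg_profit, x0=[10.0, 10.0, 100.0],
                   bounds=[(0.0, None)] * 3, constraints=cons)
    n_proc, n_direct, off_hours = res.x
    print(f"processor sales {n_proc:.1f}, direct sales {n_direct:.1f}, "
          f"off-farm hours {off_hours:.1f}, profit {-res.fun:,.0f}")
    ```

    With these assumed parameters the optimum mixes both channels and sells some household labour off-farm; shifting the wage or the saturation rate moves the mix, which is the kind of comparative-statics question the full model addresses.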

    Spectacular Neolithic finds emerge from the lochs of Lewis


    Identifying Robust Milk Production Systems

    The European dairy industry faces an increasingly uncertain world. There is uncertainty about subsidy payment levels and compliance conditions, global competition, price variability, consumer demand, carbon footprints, water quality, biodiversity, landscapes, animal welfare, food safety, etc. Because the future cannot be reliably predicted, the industry must adopt production systems that will be financially robust over a wide range of possible circumstances. Adding to the uncertainty is a lack of consensus regarding the specific characteristics of these sustainable production systems. In this interdisciplinary research project we developed a profit-maximizing whole-farm model and employed it to identify robust milk production systems for Northern Ireland under varying market, policy and farm family conditions. The milk production systems incorporated into the model involve variations in date of calving, quantity of concentrate fed, and nature of forage utilized. The model also incorporates a disaggregated specification of time use within farm households and links intra-household resource allocation to the process of agricultural technology adoption. This work illustrates how profit-maximizing whole-farm models can play a decision-support role in helping farmers, agricultural researchers, agribusiness advisers and agricultural policy makers to identify economically sustainable agricultural production systems.
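    The core idea of screening systems for robustness across scenarios can be sketched as a maximin comparison. The three candidate systems, the price and subsidy scenarios, and every margin figure below are invented for illustration; the paper's model is a full whole-farm optimisation, not this simple enumeration.

    ```python
    # Screen candidate milk-production systems across market/policy scenarios
    # and pick the one with the best worst-case margin (maximin). All numbers
    # are illustrative assumptions.
    systems = {
        "spring calving, low concentrate":   {"milk_l": 5500, "conc_t": 0.5},
        "autumn calving, high concentrate":  {"milk_l": 8000, "conc_t": 2.2},
        "split calving, medium concentrate": {"milk_l": 6800, "conc_t": 1.2},
    }
    # Scenarios: (milk price per litre, concentrate price per tonne,
    # subsidy per cow) -- assumed values spanning good and bad conditions.
    scenarios = [(0.28, 250, 300), (0.34, 250, 300),
                 (0.28, 380, 150), (0.34, 380, 150)]
    FIXED_COST = 900.0  # other costs per cow, assumed equal across systems

    def margin_per_cow(sys, scen):
        milk_p, conc_p, subsidy = scen
        return sys["milk_l"] * milk_p - sys["conc_t"] * conc_p + subsidy - FIXED_COST

    for name, sys in systems.items():
        margins = [margin_per_cow(sys, s) for s in scenarios]
        print(f"{name}: worst {min(margins):6.0f}  "
              f"mean {sum(margins) / len(margins):6.0f}")

    # The robust choice maximises the worst-case margin across scenarios.
    robust = max(systems,
                 key=lambda n: min(margin_per_cow(systems[n], s) for s in scenarios))
    print("maximin-robust system:", robust)
    ```

    Under these assumed numbers the intermediate system is most robust even though it is not the most profitable in the best scenario, which is the general pattern that motivates robustness analysis.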

    Resilience engineering as a quality improvement method in healthcare

    Current approaches to quality improvement identify how to improve quality by finding past problems through incident reporting and audits, or by using Lean principles to eliminate waste. In contrast, Resilience Engineering (RE) is based on insights from complexity science: quality results from clinicians' ability to adapt safely to difficult situations, such as a surge in patient numbers, missing equipment or difficult unforeseen physiological problems. Despite the theoretical developments, progress in applying these insights to improve quality has been slow. In this chapter we describe a study in the Emergency Department of a large hospital in which we used RE principles to identify opportunities for quality improvement interventions. In-depth observational fieldwork and interviews with clinicians were used to gather data about the key challenges faced, the misalignments between demand and capacity, the adaptations that were required, and the four resilience abilities: responding, monitoring, anticipating and learning. Data were transcribed and used to write extended resilience narratives describing the work system. The narratives were analysed thematically using a combined deductive/inductive approach. A structured process was then used to identify potential interventions to improve quality. We describe one intervention to improve monitoring of patient flow and organisational learning about patient flow interventions. The approach we describe is challenging and requires close collaboration with clinicians to ensure accurate results. We found that using RE principles to improve quality is feasible and results in a focus on strengthening processes and supporting the challenges that clinicians face in their daily work.

    Assembly and use of new task rules in fronto-parietal cortex

    Severe capacity limits, closely associated with fluid intelligence, arise in the learning and use of new task rules. We used fMRI to investigate these limits in a series of multirule tasks involving different stimuli, rules, and response keys. Data were analyzed both during presentation of instructions and during later task execution. Between tasks, we manipulated the number of rules specified in task instructions, and within tasks, we manipulated the number of rules operative in each trial block. Replicating previous results, rule failures were strongly predicted by fluid intelligence and increased with the number of operative rules. In fMRI data, analyses of the instruction period showed that the bilateral inferior frontal sulcus, intraparietal sulcus, and pre-supplementary motor area were phasically active with presentation of each new rule. In a broader range of frontal and parietal regions, baseline activity gradually increased as successive rules were instructed. During task performance, we observed contrasting fronto-parietal patterns of sustained (block-related) and transient (trial-related) activity. Block, but not trial, activity showed effects of task complexity. We suggest that, as a new task is learned, a fronto-parietal representation of relevant rules and facts is assembled for future control of behavior. Capacity limits in learning and executing new rules, and their association with fluid intelligence, may be mediated by this load-sensitive fronto-parietal network.

    DNA lesion bypass and the stochastic dynamics of transcription coupled repair

    DNA base damage is a major source of oncogenic mutations and disruption to gene expression. The stalling of RNA polymerase II (RNAP) at sites of DNA damage and the subsequent triggering of repair processes have major roles in shaping the genome-wide distribution of mutations, clearing barriers to transcription, and minimizing the production of miscoded gene products. Despite its importance for genetic integrity, key mechanistic features of this transcription-coupled repair (TCR) process are controversial or unknown. Here, we exploited a well-powered in vivo mammalian model system to explore the mechanistic properties and parameters of TCR for alkylation damage at fine spatial resolution and with discrimination of the damaged DNA strand. For rigorous interpretation, a generalizable mathematical model of DNA damage and TCR was developed. Fitting experimental data to the model and simulation revealed that RNA polymerases frequently bypass lesions without triggering repair, indicating that small alkylation adducts are unlikely to be an efficient barrier to gene expression. Following a burst of damage, the efficiency of transcription-coupled repair gradually decays through gene bodies, with implications for the occurrence and accurate inference of driver mutations in cancer. The reinitiation of transcription from the repair site is not a general feature of transcription-coupled repair, and the observed data are consistent with reinitiation never taking place. Collectively, these results reveal how the directional but stochastic activity of TCR shapes the distribution of mutations following DNA damage.
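    The qualitative picture described here, where frequent bypass plus no reinitiation makes repair efficiency decay through the gene body, can be illustrated with a minimal Monte Carlo simulation. The bypass probability, gene length, and initiation counts below are arbitrary assumptions chosen for illustration, not fitted parameters from the study.

    ```python
    # Minimal stochastic sketch of TCR with lesion bypass and no reinitiation:
    # a polymerase traversing the gene either bypasses each remaining lesion
    # (probability P_BYPASS) or stalls, triggers repair of that lesion, and is
    # lost. Deep lesions are reached only if every upstream lesion is
    # bypassed, so repair decays roughly as P_BYPASS ** position.
    import random

    random.seed(1)
    GENE_LEN = 20    # lesion positions along the transcribed strand (assumed)
    P_BYPASS = 0.8   # read-through probability per lesion (assumed)
    N_POLS = 5       # transcription initiations per gene copy after the burst
    N_CELLS = 2000   # independent gene copies simulated

    repaired_at = [0] * GENE_LEN
    for _ in range(N_CELLS):
        lesions = [True] * GENE_LEN
        for _ in range(N_POLS):
            for pos in range(GENE_LEN):
                if lesions[pos] and random.random() > P_BYPASS:
                    lesions[pos] = False  # stalling triggers repair here
                    repaired_at[pos] += 1
                    break                 # no reinitiation downstream

    frac = [n / N_CELLS for n in repaired_at]
    print(f"repair fraction at positions 0, 10, 19: "
          f"{frac[0]:.2f}, {frac[10]:.2f}, {frac[19]:.2f}")
    ```

    Running this shows the 5' end of the gene repaired far more often than the 3' end after a fixed number of initiations, reproducing the directional decay that the abstract describes.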

    The Size of the Radio-Emitting Region in Low-luminosity Active Galactic Nuclei

    We have used the VLA to study radio variability in a sample of 18 low-luminosity active galactic nuclei (LLAGNs), on time scales of a few hours to 10 days. The goal was to measure or limit the sizes of the LLAGN radio-emitting regions, in order to use the size measurements as input to models of the radio emission mechanisms in LLAGNs. We detect variability on typical time scales of a few days, at a confidence level of 99%, in half of the target galaxies. The data are consistent either with variability intrinsic to the radio-emitting regions or with scintillation in the Galactic interstellar medium. For either interpretation, the brightness temperature of the emission is below the inverse-Compton limit for all of our LLAGNs and has a mean value of about 1E10 K. The variability measurements plus VLBI upper limits imply that the typical angular size of the LLAGN radio cores at 8.5 GHz is 0.2 milliarcseconds, plus or minus a factor of two. The ~1E10 K brightness temperature strongly suggests that a population of high-energy nonthermal electrons must be present, in addition to a hypothesized thermal population in an accretion flow, in order to produce the observed radio emission.
    Comment: 61 pages, 17 figures, 10 tables. Accepted for publication in the Astrophysical Journal.
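    The arithmetic behind this argument, causality limiting the source size to the light-crossing distance of the variability timescale, and the Rayleigh-Jeans brightness temperature following from flux and angular size, can be sketched as below. The flux density, distance and timescale are illustrative assumptions, not values for any galaxy in the sample, so the output shows the scaling rather than the paper's mean of ~1E10 K.

    ```python
    # Light-crossing size limit and brightness temperature for a variable
    # radio core. Input values are illustrative assumptions (cgs units).
    import math

    C = 2.998e10        # speed of light, cm/s
    K_B = 1.381e-16     # Boltzmann constant, erg/K
    MPC_CM = 3.086e24   # centimetres per megaparsec

    nu = 8.5e9          # observing frequency, Hz
    flux_mjy = 5.0      # core flux density, mJy (assumed)
    dist_mpc = 20.0     # distance to the galaxy, Mpc (assumed)
    dt_days = 3.0       # intrinsic variability timescale, days (assumed)

    # Causality: an intrinsically varying source is no larger than c * dt.
    size_cm = C * dt_days * 86400.0
    theta_rad = size_cm / (dist_mpc * MPC_CM)
    theta_mas = theta_rad * (180.0 / math.pi) * 3600.0 * 1000.0

    # Rayleigh-Jeans brightness temperature, solid angle ~ theta^2.
    s_cgs = flux_mjy * 1e-26  # 1 mJy = 1e-26 erg s^-1 cm^-2 Hz^-1
    t_b = s_cgs * C**2 / (2.0 * K_B * nu**2 * theta_rad**2)

    print(f"size limit: {theta_mas:.3f} mas, T_b ~ {t_b:.2e} K")
    ```

    For these assumed inputs the implied size is a few hundredths of a milliarcsecond and T_b sits between 1E10 and the 1E12 K inverse-Compton limit; larger assumed sizes (such as the paper's 0.2 mas) drive T_b down toward the quoted mean, since T_b scales as the inverse square of the angular size.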