
    On the effects of turbulence on a screw dynamo

    In an experiment at the Institute of Continuous Media Mechanics in Perm (Russia), a non-stationary screw dynamo is intended to be realized with a helical flow of liquid sodium in a torus. The flow is necessarily turbulent, that is, it may be considered as a mean flow with a superimposed turbulence. In this paper the induction processes of the turbulence are investigated within the framework of mean-field electrodynamics. They include, of course, a part that leads to an enhanced dissipation of the mean magnetic field. As a consequence of the helical mean flow there are also helical structures in the turbulence. These lead to a kind of α-effect, which might in principle support the screw dynamo. The peculiarity of this α-effect explains measurements made at a smaller version of the device envisaged for the dynamo experiment. The helical structures of the turbulence also lead to other effects, which in combination with rotational shear are potentially capable of dynamo action; a part of them can in principle support the screw dynamo. Under the conditions of the experiment, all induction effects of the turbulence prove to be rather weak in comparison with that of the main flow. Numerical solutions of the mean-field induction equation show that the induction effects of the turbulence together raise the screw dynamo threshold only slightly, by at most one per cent. The numerical results also give some insight into the action of the individual induction effects of the turbulence. Comment: 15 pages, 7 figures, in GAFD prin
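
    The abstract refers to the mean-field induction equation without stating it. For orientation only, a standard textbook form is sketched below; the notation and the schematic closure for the mean electromotive force are ours, not taken from the paper.

        % Mean-field induction equation (standard form; not the paper's specific model).
        % \overline{\mathbf{B}}: mean magnetic field, \overline{\mathbf{U}}: mean (helical) flow,
        % \boldsymbol{\mathcal{E}}: mean electromotive force due to the turbulence, \eta: magnetic diffusivity.
        \frac{\partial \overline{\mathbf{B}}}{\partial t}
          = \nabla \times \left( \overline{\mathbf{U}} \times \overline{\mathbf{B}}
          + \boldsymbol{\mathcal{E}} \right)
          + \eta \nabla^{2} \overline{\mathbf{B}},
        \qquad
        \boldsymbol{\mathcal{E}} \approx \alpha \overline{\mathbf{B}} - \eta_{t} \nabla \times \overline{\mathbf{B}}
        % The \alpha term is the alpha-effect discussed in the abstract; the \eta_{t} term models
        % the turbulence-enhanced dissipation of the mean field.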

    Parametric hazard rate models for long-term sickness absence

    PURPOSE: In research on the time to onset of sickness absence and the duration of sickness absence episodes, Cox proportional hazards models are in common use. However, parametric models are to be preferred when time itself is considered as an independent variable. This study compares parametric hazard rate models for the onset of long-term sickness absence and return to work. METHOD: Prospective cohort study on sickness absence with four years of follow-up of 53,830 employees working in the private sector in the Netherlands. The time to onset of long-term (>6 weeks) sickness absence and the time to return to work were modelled by parametric hazard rate models. RESULTS: The exponential parametric model with a constant hazard rate most accurately described the time to onset of long-term sickness absence. Gompertz-Makeham models with monotonically declining hazard rates best described return to work. CONCLUSIONS: Parametric models offer more possibilities than commonly used models for time-dependent processes such as sickness absence and return to work. However, the advantages of parametric models over Cox models apply mainly to return to work and less to the onset of long-term sickness absence.
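
    For reference, the hazard functions behind the two model families named in the results are standard; the notation below is ours, not the paper's.

        % Exponential model: constant hazard rate.
        h_{\mathrm{exp}}(t) = \lambda, \qquad \lambda > 0
        % Gompertz-Makeham model: a constant (Makeham) term plus a Gompertz term.
        h_{\mathrm{GM}}(t) = \lambda + \alpha e^{\beta t}, \qquad \lambda, \alpha \ge 0
        % The Gompertz-Makeham hazard declines monotonically towards \lambda when \beta < 0,
        % matching the monotonically declining return-to-work hazards reported above.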

    Noncommutative generalizations of theorems of Cohen and Kaplansky

    This paper investigates situations where a property of a ring can be tested on a set of "prime right ideals." Generalizing theorems of Cohen and Kaplansky, we show that every right ideal of a ring is finitely generated (resp. principal) iff every "prime right ideal" is finitely generated (resp. principal), where the phrase "prime right ideal" can be interpreted in one of many different ways. We also use our methods to show that other properties can be tested on special sets of right ideals, such as the right artinian property and various homological properties. Applying these methods, we prove the following noncommutative generalization of a result of Kaplansky: a (left and right) noetherian ring is a principal right ideal ring iff all of its maximal right ideals are principal. A counterexample shows that the left noetherian hypothesis cannot be dropped. Finally, we compare our results to earlier generalizations of Cohen's and Kaplansky's theorems in the literature. Comment: 41 pages. To appear in Algebras and Representation Theory. Minor changes were made to the numbering system, in order to remain consistent with the published version.
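
    As background, the classical commutative theorems being generalized are usually stated as follows; this is our paraphrase, not text from the paper.

        \textbf{Cohen.} A commutative ring $R$ is noetherian if and only if
        every prime ideal of $R$ is finitely generated.
        \textbf{Kaplansky.} A commutative noetherian ring $R$ is a principal ideal ring
        if and only if every maximal ideal of $R$ is principal.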

    Eighteenth century Yersinia pestis genomes reveal the long-term persistence of an historical plague focus

    The 14th-18th century pandemic of Yersinia pestis caused devastating disease outbreaks in Europe for almost 400 years. The reasons for plague's persistence and abrupt disappearance in Europe are poorly understood, but could have been due either to the presence of now-extinct plague foci in Europe itself or to successive disease introductions from other locations. Here we present five Y. pestis genomes from one of the last European outbreaks of plague, from 1722 in Marseille, France. The lineage identified has not been found in any extant Y. pestis foci sampled to date, and has its ancestry in strains obtained from victims of the 14th century Black Death. These data suggest the existence of a previously uncharacterized historical plague focus that persisted for at least three centuries. We propose that this disease source may have been responsible for the many resurgences of plague in Europe following the Black Death.

    Cluster randomised trial of a tailored intervention to improve the management of overweight and obesity in primary care in England

    Background: Tailoring is a frequent component of approaches for implementing clinical practice guidelines, although evidence on how to maximise the effectiveness of tailoring is limited. In England, overweight and obesity are common, and national guidelines have been produced by the National Institute for Health and Care Excellence. However, the guidelines are not routinely followed in primary care. Methods: A tailored implementation intervention was developed following an analysis of the determinants of practice influencing the implementation of the guidelines on obesity and the selection of strategies to address those determinants. General practices in the East Midlands of England were invited to take part in a cluster randomised controlled trial of the intervention. The primary outcome measure was the proportion of overweight or obese patients offered a weight loss intervention. Secondary outcomes were the proportions of patients with (1) a BMI or waist circumference recorded, (2) a record of lifestyle assessment, (3) a referral to weight loss services, and (4) any change in weight during the study period. We also assessed the mean weight change over the study period. Follow-up was for 9 months after the intervention. A process evaluation was undertaken, involving interviews with samples of participating health professionals. Results: There were 16 general practices in the control group and 12 in the intervention group. At follow-up, 15.08 % of patients in the control group and 13.19 % in the intervention group had been offered a weight loss intervention, odds ratio (OR) 1.16, 95 % confidence interval (CI) (0.72, 1.89). For the secondary outcomes: BMI/waist circumference recorded, 42.71 % control vs 39.56 % intervention, OR 1.15 (CI 0.89, 1.48); referral to weight loss services, 5.10 % vs 3.67 %, OR 1.45 (CI 0.81, 2.63); weight management in the practice, 9.59 % vs 8.73 %, OR 1.09 (CI 0.55, 2.15); lifestyle assessment, 23.05 % vs 23.86 %, OR 0.98 (CI 0.76, 1.26); weight loss of at least 1 kg, 42.22 % vs 41.65 %, OR 0.98 (CI 0.87, 1.09). Health professionals reported that the intervention increased their confidence in managing obesity and provided them with practical resources. Conclusions: The tailored intervention did not improve the implementation of the guidelines on obesity, despite systematic approaches to the identification of the determinants of practice. The methods of tailoring require further development to ensure that interventions target those determinants that most influence implementation.
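
    The unadjusted odds ratio and Wald confidence interval used in reports like this can be reproduced from a 2x2 table. The sketch below is illustrative only: the counts are hypothetical (the abstract reports percentages, not the underlying patient numbers), and it ignores the clustering adjustment a cluster-randomised analysis would apply.

        import math

        # Textbook, unadjusted odds ratio with a Wald 95 % confidence interval.
        # The counts passed in below are hypothetical, and no cluster adjustment is made.
        def odds_ratio_wald_ci(a, b, c, d, z=1.96):
            """a/b = events/non-events in one group, c/d = events/non-events in the other."""
            or_ = (a * d) / (b * c)
            se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo = math.exp(math.log(or_) - z * se_log_or)
            hi = math.exp(math.log(or_) + z * se_log_or)
            return or_, (lo, hi)

        if __name__ == "__main__":
            # Hypothetical counts: 132/1000 patients offered a weight loss intervention
            # in one group versus 151/1000 in the other.
            or_, (lo, hi) = odds_ratio_wald_ci(132, 868, 151, 849)
            print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")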

    Approximating Optimal Behavioural Strategies Down to Rules-of-Thumb: Energy Reserve Changes in Pairs of Social Foragers

    Functional explanations of behaviour often propose optimal strategies for organisms to follow. These 'best' strategies could be difficult to perform given biological constraints such as neural architecture and physiology. Instead, simple heuristics or 'rules-of-thumb' that approximate these optimal strategies may be used. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between the two modelling techniques, but some differences arose, particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour.
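
    To make the modelling approach concrete, the sketch below is a minimal two-forager simulation in which each individual follows a hypothetical threshold rule-of-thumb ("forage when reserves fall below a threshold, otherwise rest"). The rule, parameters and energy dynamics are invented for illustration and are not the paper's game or simulation.

        import random

        STEPS = 200
        GAIN = 1.0            # energetic gain from a successful foraging step
        FIND_PROB = 0.6       # probability that a foraging step finds food
        METABOLIC_COST = 0.4  # reserves lost every step regardless of behaviour

        def rule_of_thumb(reserves, threshold):
            """Hypothetical heuristic: forage if reserves are below the threshold."""
            return reserves < threshold

        def simulate_pair(thresholds=(5.0, 5.0), start=(5.0, 5.0), seed=1):
            """Track the energy reserves of a pair of foragers over time."""
            random.seed(seed)
            reserves = list(start)
            history = []
            for _ in range(STEPS):
                for i in range(2):
                    if rule_of_thumb(reserves[i], thresholds[i]):
                        if random.random() < FIND_PROB:
                            reserves[i] += GAIN
                    reserves[i] = max(reserves[i] - METABOLIC_COST, 0.0)
                history.append(tuple(reserves))
            return history

        if __name__ == "__main__":
            trajectory = simulate_pair(thresholds=(5.0, 6.5))  # non-identical pair members
            print("final reserves:", trajectory[-1])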

    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism, networked buffering, is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome-proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
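
    The two conditions above can be illustrated with a toy capability network: each agent can perform several functions (versatility) and functions can be performed by overlapping sets of agents (degeneracy), so losing one agent can be buffered by reassigning the rest. The sketch below uses made-up agents and functions and a simple bipartite matching test; it is not the genome-proteome model of the paper.

        CAPABILITIES = {          # agent -> functions it can perform (made-up example)
            "a1": {"f1", "f2"},
            "a2": {"f2", "f3"},
            "a3": {"f3", "f4"},
            "a4": {"f4", "f1"},
            "a5": {"f1", "f3"},   # partially overlapping "spare" capacity
        }
        FUNCTIONS = {"f1", "f2", "f3", "f4"}

        def max_matching(capabilities):
            """Maximum agent-to-function assignment (Kuhn's augmenting-path algorithm)."""
            match = {}  # function -> agent assigned to it

            def try_assign(agent, seen):
                for f in capabilities[agent]:
                    if f in FUNCTIONS and f not in seen:
                        seen.add(f)
                        if f not in match or try_assign(match[f], seen):
                            match[f] = agent
                            return True
                return False

            for agent in capabilities:
                try_assign(agent, set())
            return match

        def covers_all_functions(capabilities):
            return len(max_matching(capabilities)) == len(FUNCTIONS)

        if __name__ == "__main__":
            print("intact system covers all functions:", covers_all_functions(CAPABILITIES))
            for removed in sorted(CAPABILITIES):
                reduced = {a: c for a, c in CAPABILITIES.items() if a != removed}
                print(f"remove {removed}: all functions still covered ->",
                      covers_all_functions(reduced))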

    Initiative, Personality and Leadership in Pairs of Foraging Fish

    Studies of coordinated movement have found that, in many animal species, bolder individuals are more likely to initiate movement and shyer individuals to follow. Here, we show that in pairs of foraging stickleback fish, leadership is not merely a passive consequence of temperamental differences. Instead, the act of initiating a joint foraging trip out of cover itself brings about a change in the role that an individual plays throughout the subsequent trip, and success in recruiting a partner affects an individual's tendency to initiate the next trip. On each joint trip, whichever fish takes the initiative in leading out of cover gains greater influence over its partner's behaviour, and this influence persists even after several changes in position (i.e. termination attempts and re-joining). During any given trip, the initiator is less responsive to its partner's movements than during trips initiated by the partner. An individual's personality had an important effect on its response to failure to recruit a partner: while bold fish were unaffected by failures to initiate a joint trip, shy individuals were less likely to attempt another initiation after a failure. This difference provides a positive feedback mechanism that can partially stabilise social roles within the pair, but it is not strong enough to prevent occasional swaps, with individuals dynamically adjusting their responses to one another as they exchange roles.

    Mesoscopic organization reveals the constraints governing C. elegans nervous system

    One of the biggest challenges in biology is to understand how activity at the cellular level of neurons, as a result of their mutual interactions, leads to the observed behavior of an organism responding to a variety of environmental stimuli. Investigating the intermediate or mesoscopic level of organization in the nervous system is a vital step towards understanding how the integration of micro-level dynamics results in macro-level functioning. In this paper, we have considered the somatic nervous system of the nematode Caenorhabditis elegans, for which the entire neuronal connectivity diagram is known. We focus on the organization of the system into modules, i.e., neuronal groups having relatively higher connection density compared to that of the overall network. We show that this mesoscopic feature cannot be explained exclusively in terms of considerations such as optimizing for resource constraints (viz., total wiring cost) and communication efficiency (i.e., network path length). Comparison with other complex networks designed for efficient transport (of signals or resources) implies that neuronal networks form a distinct class. This suggests that the principal function of the network, viz., processing of sensory information resulting in appropriate motor response, may be playing a vital role in determining the connection topology. Using modular spectral analysis, we make explicit the intimate relation between function and structure in the nervous system. This is further brought out by identifying functionally critical neurons purely on the basis of patterns of intra- and inter-modular connections. Our study reveals how the design of the nervous system reflects several constraints, including its key functional role as a processor of information. Comment: Published version, minor modifications, 16 pages, 9 figures
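
    As a generic illustration of module detection by spectral methods (not the paper's specific analysis of the C. elegans connectome), the sketch below applies Newman's leading-eigenvector bisection of the modularity matrix to a small made-up graph.

        import numpy as np

        def modularity_bisection(A):
            """Split an undirected graph (adjacency matrix A) into two modules using the
            sign of the leading eigenvector of the modularity matrix; return the groups
            and the modularity Q of the split."""
            k = A.sum(axis=1)                      # node degrees
            m = A.sum() / 2.0                      # number of edges
            B = A - np.outer(k, k) / (2.0 * m)     # modularity matrix
            eigvals, eigvecs = np.linalg.eigh(B)
            leading = eigvecs[:, np.argmax(eigvals)]
            s = np.where(leading >= 0, 1.0, -1.0)  # +1 / -1 group labels
            Q = (s @ B @ s) / (4.0 * m)
            groups = (np.where(s > 0)[0].tolist(), np.where(s < 0)[0].tolist())
            return groups, Q

        if __name__ == "__main__":
            # Toy graph: two triangles joined by a single edge.
            A = np.array([
                [0, 1, 1, 0, 0, 0],
                [1, 0, 1, 0, 0, 0],
                [1, 1, 0, 1, 0, 0],
                [0, 0, 1, 0, 1, 1],
                [0, 0, 0, 1, 0, 1],
                [0, 0, 0, 1, 1, 0],
            ], dtype=float)
            groups, Q = modularity_bisection(A)
            print("modules:", groups, "modularity Q = %.3f" % Q)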

    WebCARMA: a web application for the functional and taxonomic classification of unassembled metagenomic reads

    Gerlach W, Jünemann S, Tille F, Goesmann A, Stoye J. WebCARMA: a web application for the functional and taxonomic classification of unassembled metagenomic reads. BMC Bioinformatics. 2009;10(1):430. Background: Metagenomics is a new field of research on natural microbial communities. High-throughput sequencing techniques like 454 or Solexa-Illumina promise new possibilities, as they are able to produce huge amounts of data in much shorter time and with less effort and cost than the traditional Sanger technique. However, the reads produced are even shorter (35-100 base pairs with Illumina, 100-500 base pairs with 454 sequencing). CARMA is a new software pipeline for the characterisation of species composition and the genetic potential of microbial samples using short, unassembled reads. Results: In this paper, we introduce WebCARMA, a refined version of CARMA available as a web application for the taxonomic and functional classification of unassembled (ultra-)short reads from metagenomic communities. In addition, we have analysed the applicability of ultra-short reads in metagenomics. Conclusions: We show that unassembled reads as short as 35 bp can be used for the taxonomic classification of a metagenome. The web application is freely available at http://webcarma.cebitec.uni-bielefeld.d
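
    For illustration only, the sketch below shows a deliberately naive k-mer voting scheme for assigning a short read to a reference taxon. This is not the tool's method (CARMA classifies reads via matches to protein families), and the reference sequences and taxon names are made up.

        from collections import Counter

        K = 8  # k-mer length used for the toy comparison

        REFERENCES = {  # made-up reference sequences and taxon names
            "Taxon_A": "ATGGCGTACGTTAGCGGATCCGATCGTACGATCGGCTAAGCT",
            "Taxon_B": "TTGACCGTAGGCTTAACCGGTTAGCCGATAGGCTTAACGGTA",
        }

        def kmers(seq, k=K):
            """All k-mers of a sequence, as a set."""
            return {seq[i:i + k] for i in range(len(seq) - k + 1)}

        REF_KMERS = {taxon: kmers(seq) for taxon, seq in REFERENCES.items()}

        def classify(read, min_hits=2):
            """Assign the read to the taxon sharing the most k-mers with it, or None."""
            votes = Counter({t: len(kmers(read) & ks) for t, ks in REF_KMERS.items()})
            taxon, hits = votes.most_common(1)[0]
            return taxon if hits >= min_hits else None

        if __name__ == "__main__":
            reads = [
                "GCGTACGTTAGCGGATCCGATCGTACGATCGGCTA",  # 35 bp drawn from Taxon_A
                "ACCGTAGGCTTAACCGGTTAGCCGATAGGCTTAAC",  # 35 bp drawn from Taxon_B
            ]
            for read in reads:
                print(read[:12] + "...", "->", classify(read))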