
    An appropriate tool for entrepreneurial learning in SMEs? The case of the 20Twenty Leadership Programme

    The 20Twenty Leadership Programme was developed by Cardiff Metropolitan University as an executive education programme to be delivered to small businesses in South Wales. It is funded by the European Social Fund (ESF), administered by the Welsh European Funding Office, and has the key aim of developing SMEs' growth potential via a range of leadership and management skills, including a focus on 'soft' skills. This paper places the 20Twenty Leadership Programme within the wider context of entrepreneurship policy, and of SME training initiatives in particular, and then examines the rationale and delivery methods of the Programme in relation to these. It also reflects, where possible, on the Programme's success (or otherwise) to date. Finally, the paper suggests fruitful areas of further research, both on the 20Twenty Leadership Programme itself and on evaluation of other parallel programmes and of SME training initiatives more generally.

    Mapping genetic determinants of host susceptibility to Pseudomonas aeruginosa lung infection in mice.

    Background: P. aeruginosa is one of the top three causes of opportunistic human bacterial infections. The remarkable variability in the clinical outcomes of this infection is thought to be associated with genetic predisposition. However, the genes underlying host susceptibility to P. aeruginosa infection are still largely unknown. Results: As a step towards mapping these genes, we applied a genome-wide linkage analysis approach to a mouse model. A large F2 intercross population, obtained by mating P. aeruginosa-resistant C3H/HeOuJ and susceptible A/J mice, was used for quantitative trait locus (QTL) mapping. The F2 progeny were challenged with a P. aeruginosa clinical strain and monitored for survival time up to 7 days post-infection as a disease-associated trait. Selected phenotypic extremes of the F2 distribution were genotyped with high-density single nucleotide polymorphism (SNP) markers, and QTL analysis was subsequently performed. A significant locus was mapped on chromosome 6 and named P. aeruginosa infection resistance locus 1 (Pairl1). The most promising candidate genes, including Dok1, Tacr1, Cd207, Clec4f, Gp9, Gata2 and Foxp1, are related to pathogen sensing, neutrophil and macrophage recruitment, and inflammatory processes. Conclusions: We propose a set of genes involved in the pathogenesis of P. aeruginosa infection that may be explored to complement human studies.
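The mapping strategy described above can be sketched in a few lines: phenotype an F2 cross for survival time, genotype only the phenotypic extremes at SNP markers, and scan each marker for association with the trait. This is a minimal illustration with simulated data; a simple t-test stands in for the proper interval mapping and LOD-score analysis the study would use, and the marker count, effect size and selection fraction are invented.

```python
# Hedged sketch of selective-genotyping QTL mapping (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, n_markers, qtl = 300, 50, 17
geno = rng.integers(0, 3, size=(n, n_markers))        # F2 genotypes coded 0/1/2
surv = rng.normal(5.0, 1.0, n) + 0.8 * geno[:, qtl]   # survival days; QTL at marker 17

# Selective genotyping: keep only the lower and upper 25% of the phenotype
order = np.argsort(surv)
extremes = np.r_[order[: n // 4], order[-(n // 4):]]
g, y = geno[extremes], surv[extremes]

# Single-marker scan: compare the two homozygote classes at each marker
pvals = np.array([stats.ttest_ind(y[g[:, m] == 0], y[g[:, m] == 2]).pvalue
                  for m in range(n_markers)])
print("most associated marker:", pvals.argmin())
```

With a real effect at marker 17, the scan recovers it easily even from the extremes alone, which is the economy the selective-genotyping design buys.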

    Pleosporales

    One hundred and five generic types of Pleosporales are described and illustrated. A brief introduction and detailed history, with short notes on morphology and molecular phylogeny, as well as a general conclusion, are provided for each genus. For those genera where the type or a representative specimen is unavailable, a brief note is given. Altogether 174 genera of Pleosporales are treated. Phaeotrichaceae, as well as Kriegeriella, Zeuctomorpha and Muroia, are excluded from Pleosporales. Based on the multigene phylogenetic analysis, the suborder Massarineae is emended to accommodate five families, viz. Lentitheciaceae, Massarinaceae, Montagnulaceae, Morosphaeriaceae and Trematosphaeriaceae.

    Considerations on the Castrop formula for calculation of intraocular lens power

    Background: To explain the concept of the Castrop lens power calculation formula and to show its application and results on a large dataset in comparison with classical formulae. Methods: The Castrop vergence formula is based on a pseudophakic model eye with 4 refractive surfaces. It was compared against the SRKT, Hoffer-Q, Holladay1, simplified Haigis (1 optimized constant) and Haigis (3 optimized constants) formulae. A large dataset of preoperative biometric values, lens power data and postoperative refraction data was split into training and test sets. The training data were used for formula constant optimization, and the test data for cross-validation. Constant optimization was performed for all formulae using nonlinear optimization, minimising the root mean squared prediction error. Results: The constants for all formulae were derived with the Levenberg-Marquardt algorithm. Applying these constants to the test data, the Castrop formula showed slightly better performance than the classical formulae in terms of prediction error and absolute prediction error. Using the Castrop formula, the standard deviation of the prediction error was lowest at 0.45 dpt, and 95% of all eyes in the test data were within 0.9 dpt of prediction error. Conclusion: The calculation concept of the Castrop formula and one potential option for optimization of the 3 Castrop formula constants (C, H, and R) are presented. In a large dataset of 1452 data points the performance of the Castrop formula was slightly superior to that of the classical formulae such as SRKT, Hoffer-Q, Holladay1 or Haigis.
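The constant-optimization workflow described in the Methods (train/test split, nonlinear least squares with Levenberg-Marquardt, minimising the RMS prediction error) can be sketched generically. The Castrop formula itself is not reproduced here; as a stand-in we use the classic thin-lens emmetropia vergence formula P = n/(AL − d) − n/(n/K − d) with a single effective-lens-position constant d, and all data below are synthetic.

```python
# Hedged sketch: formula-constant optimization by Levenberg-Marquardt,
# using a thin-lens vergence formula as a stand-in for the Castrop formula.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n = 1.336                                   # aqueous refractive index
AL = rng.normal(23.5, 1.0, 200) / 1000      # axial length [m]
K = rng.normal(43.5, 1.5, 200)              # corneal power [dpt]
d_true = 5.2e-3                             # "true" effective lens position [m]

def predicted_power(d, AL, K):
    return n / (AL - d) - n / (n / K - d)

# Synthetic "observed" lens powers with measurement noise
P_obs = predicted_power(d_true, AL, K) + rng.normal(0, 0.3, 200)

# Train/test split: the constant is optimized on the training data only
train, test = slice(0, 150), slice(150, 200)
res = least_squares(
    lambda d: predicted_power(d[0], AL[train], K[train]) - P_obs[train],
    x0=[4.0e-3], method="lm")               # Levenberg-Marquardt
d_fit = res.x[0]

# Cross-validation: prediction error on the held-out test set
err = predicted_power(d_fit, AL[test], K[test]) - P_obs[test]
print(f"fitted d = {d_fit * 1000:.2f} mm, SD of prediction error = {err.std():.2f} dpt")
```

The real study minimises refraction prediction error rather than power error, and optimizes three constants rather than one, but the fitting pattern is the same.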

    Tracking the spatial diffusion of influenza and norovirus using telehealth data: A spatiotemporal analysis of syndromic data

    Background: Telehealth systems have a large potential for informing public health authorities in an early stage of outbreaks of communicable disease. Influenza and norovirus are common viruses that cause significant respiratory and gastrointestinal disease worldwide. Data about these viruses are not routinely mapped for surveillance purposes in the UK, so the spatial diffusion of national outbreaks and epidemics is not known as such incidents occur. We aim to describe the geographical origin and diffusion of rises in fever and vomiting calls to a national telehealth system, and consider the usefulness of these findings for influenza and norovirus surveillance. Methods: Data about fever calls (5- to 14-year-old age group) and vomiting calls (≥5-year-old age group) in school-age children, proxies for influenza and norovirus respectively, were extracted from the NHS Direct national telehealth database for the period June 2005 to May 2006. The SaTScan space-time permutation model was used to retrospectively detect statistically significant clusters of calls on a week-by-week basis. These syndromic results were validated against existing laboratory and clinical surveillance data. Results: We identified two distinct periods of elevated fever calls. The first originated in the North-West of England during November 2005 and spread in a south-east direction; the second began in Central England during January 2006 and moved southwards. The timing, geographical location, and age structure of these rises in fever calls were similar to a national influenza B outbreak that occurred during winter 2005–2006. We also identified significantly elevated levels of vomiting calls in South-East England during winter 2005–2006. Conclusion: Spatiotemporal analyses of telehealth data, specifically fever calls, provided a timely and unique description of the evolution of a national influenza outbreak.
In a similar way the tool may be useful for tracking norovirus, although the lack of consistent comparison data makes this more difficult to assess. In interpreting these results, care must be taken to consider other infectious and non-infectious causes of fever and vomiting. The scan statistic should be considered for spatial analyses of telehealth data elsewhere and will be used to initiate prospective geographical surveillance of influenza in England.
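The core idea of the space-time permutation scan statistic used above is that, given case (call) counts alone, the expected count for an area in a given week is the product of that area's total and that week's total divided by the grand total; clusters are area-week blocks where the observed count clearly exceeds this expectation. A minimal sketch with made-up counts (SaTScan additionally scans over cylinders of areas and weeks and assigns significance by Monte Carlo permutation, which is omitted here):

```python
# Hedged sketch of the expected counts underlying the space-time
# permutation model; the count table below is invented for illustration.
import numpy as np

counts = np.array([[12,  9, 10,  8],       # rows: areas, cols: weeks
                   [ 7, 30,  6,  5],       # area 1, week 1 looks elevated
                   [11, 10,  9, 12]])
total = counts.sum()

# Expected count for (area a, week i) = row total * column total / grand total
expected = np.outer(counts.sum(axis=1), counts.sum(axis=0)) / total
ratio = counts / expected                   # observed / expected
print(np.round(ratio, 2))
```

The largest observed/expected ratio flags the elevated cell, which is the signal the full scan statistic then tests for significance.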

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001–0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
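The point that probabilistic multiplication of sub-factors depends on the assumed distributions is easy to demonstrate numerically. In the Monte Carlo sketch below, both distribution choices are centred on the same nominal 10 × 10 = 100 default, yet their products have clearly different upper percentiles; the specific distributions and parameters are our illustrative assumptions, not ones from the article.

```python
# Hedged sketch: the combined uncertainty factor from multiplying
# toxicokinetic and toxicodynamic sub-factors depends on the assumed
# distributions, even at the same nominal default of 100.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Choice A: lognormal sub-factors with median 10 (illustrative sigma)
tk_a = rng.lognormal(mean=np.log(10), sigma=0.4, size=N)   # toxicokinetics
td_a = rng.lognormal(mean=np.log(10), sigma=0.4, size=N)   # toxicodynamics

# Choice B: uniform sub-factors on [5, 15] (mean 10)
tk_b = rng.uniform(5, 15, N)
td_b = rng.uniform(5, 15, N)

for label, prod in [("lognormal", tk_a * td_a), ("uniform", tk_b * td_b)]:
    print(f"{label}: median = {np.median(prod):6.1f}, "
          f"95th percentile = {np.percentile(prod, 95):6.1f}")
```

The medians of both products sit near 100, but the lognormal choice yields a much heavier upper tail, so any "conservative" percentile read off the product is an artefact of the distributional assumption.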

    A case–control study of risk of leukaemia in relation to mobile phone use

    Background: Mobile phone use is now ubiquitous, and scientific reviews have recommended research into its relation to leukaemia risk, but no large studies have been conducted. Methods: In a case–control study in South East England investigating the relation of acute and non-lymphocytic leukaemia risk to mobile phone use, 806 cases with leukaemia incident during 2003–2009 at ages 18–59 years (50% of those identified as eligible) and 585 non-blood relatives as controls (provided by 392 cases) were interviewed about mobile phone use and other potentially aetiological variables. Results: No association was found between regular mobile phone use and risk of leukaemia (odds ratio (OR) = 1.06, 95% confidence interval (CI) = 0.76, 1.46). Analyses of risk in relation to years since first use, lifetime years of use, cumulative number of calls and cumulative hours of use produced no significantly raised risks, and there was no evidence of any trends. A non-significantly raised risk was found in people who first used a phone 15 or more years ago (OR = 1.87, 95% CI = 0.96, 3.63). Separate analyses of analogue and digital phone use and of leukaemia subtype produced results similar to those overall. Conclusion: This study suggests that use of mobile phones does not increase leukaemia risk, although the possibility of an effect after long-term use, while biologically unlikely, remains open.
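The headline measure in such a study, an odds ratio with a 95% confidence interval, comes from a 2×2 exposure table. A minimal sketch using the standard Woolf (log-based) interval; the cell counts below are invented, since the study's actual tables are not given here.

```python
# Hedged sketch: odds ratio and Woolf 95% CI from a 2x2 case-control table.
# Counts are invented for illustration.
import math

a, b = 410, 396   # cases: exposed (regular users), unexposed
c, d = 290, 295   # controls: exposed, unexposed

or_ = (a * d) / (b * c)                       # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR)
lo, hi = (or_ * math.exp(s * se) for s in (-1.96, 1.96))
print(f"OR = {or_:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

An interval straddling 1, as here, is exactly the "no association" pattern the abstract reports for regular use.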

    Variational Methods for Biomolecular Modeling

    Structure, function and dynamics of many biomolecular systems can be characterized by the energetic variational principle and the corresponding systems of partial differential equations (PDEs). This principle allows us to focus on the identification of essential energetic components, the optimal parametrization of energies, and the efficient computational implementation of energy variation or minimization. Given that complex biomolecular systems are structurally non-uniform and that their interactions occur through contact interfaces, their free energies are associated with various interfaces as well, such as the solute-solvent interface, molecular binding interfaces, lipid domain interfaces, and membrane surfaces. This fact motivates the inclusion of interface geometry, particularly its curvatures, in the parametrization of free energies. Applications of such interface-geometry-based energetic variational principles are illustrated through three concrete topics: the multiscale modeling of biomolecular electrostatics and solvation that includes the curvature energy of the molecular surface; the formation of microdomains on lipid membranes due to the geometric and molecular mechanics at the lipid interface; and the mean-curvature-driven protein localization on membrane surfaces. By further representing the interface implicitly with a phase field function over the entire domain, one can simulate the dynamics of the interface and the corresponding energy variation by evolving the phase field function, achieving a significant reduction in the number of degrees of freedom and in computational complexity. Strategies for improving the efficiency of computational implementations and for extending applications to coarse-graining or multiscale molecular simulations are outlined.
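A generic form of the interface-based free energies discussed above, combining surface tension, a Helfrich-type curvature energy and an electrostatic contribution, can be written as follows. This is a standard textbook sketch, not the authors' exact functional:

```latex
E[\Gamma] \;=\; \gamma \int_{\Gamma} \mathrm{d}S
  \;+\; \frac{k_c}{2} \int_{\Gamma} \left( 2H - c_0 \right)^2 \mathrm{d}S
  \;+\; E_{\mathrm{elec}},
```

where \(\Gamma\) is the interface, \(\gamma\) the surface tension, \(k_c\) the bending modulus, \(H\) the mean curvature, \(c_0\) the spontaneous curvature, and \(E_{\mathrm{elec}}\) the electrostatic energy. Setting the first variation \(\delta E/\delta\Gamma\) to zero yields the governing PDEs, which is the variational step the abstract refers to.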

    Learning to Eat Vegetables in Early Life: The Role of Timing, Age and Individual Eating Traits

    Vegetable intake is generally low among children, who appear to be especially fussy during the pre-school years. Repeated exposure is known to enhance intake of a novel vegetable in early life, but individual differences in response to familiarisation have emerged from recent studies. In order to understand the factors which predict different responses to repeated exposure, data from the same experiment conducted in three groups of children from three countries (n = 332) aged 4–38 months (18.9 ± 9.9 months) were combined and modelled. During the intervention period each child was given between 5 and 10 exposures to a novel vegetable (artichoke puree) in one of three versions (basic, sweet or added energy). Intake of basic artichoke puree was measured both before and after the exposure period. Overall, younger children consumed more artichoke than older children. Four distinct patterns of eating behaviour during the exposure period were defined. Most children were "learners" (40%), who increased intake over time. 21% consumed more than 75% of what was offered each time and were labelled "plate-clearers". 16% were considered "non-eaters", eating less than 10 g by the 5th exposure, and the remainder were classified as "others" (23%) since their pattern was highly variable. Age was a significant predictor of eating pattern, with older pre-school children more likely to be non-eaters. Plate-clearers had higher enjoyment of food and lower satiety responsiveness than non-eaters, who scored highest on food fussiness. Children in the added-energy condition showed the smallest change in intake over time, compared to those in the basic or sweetened artichoke conditions. Clearly, whilst repeated exposure familiarises children with a novel food, alternative strategies that focus on encouraging initial tastes of the target food might be needed for the fussier and older pre-school children.
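The four eating patterns defined above can be expressed as a simple classification rule over each child's per-exposure intakes. The thresholds follow the text (more than 75% of the offered portion every time = plate-clearer; less than 10 g by the 5th exposure = non-eater; rising intake = learner; otherwise = other), but the portion size and the crude rising-trend check are our assumptions, not the study's exact algorithm.

```python
# Hedged sketch of the four eating-pattern categories; thresholds from the
# abstract, portion size and trend test assumed for illustration.

def classify(intakes_g, offered_g=100.0):
    """intakes_g: grams eaten at each of the 5-10 exposures, in order."""
    if all(x > 0.75 * offered_g for x in intakes_g):
        return "plate-clearer"
    if intakes_g[4] < 10:                        # less than 10 g at the 5th exposure
        return "non-eater"
    n = len(intakes_g)
    first, last = sum(intakes_g[: n // 2]), sum(intakes_g[n - n // 2:])
    if last > first:                             # crude rising-trend check
        return "learner"
    return "other"

print(classify([5, 12, 20, 35, 50, 60]))         # rising intake -> "learner"
```

A child eating 80-95 g of a 100 g portion at every exposure would come out as a "plate-clearer", and a flat 2-4 g pattern as a "non-eater", matching the descriptions in the text.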

    A Synthesis of Tagging Studies Examining the Behaviour and Survival of Anadromous Salmonids in Marine Environments

    This paper synthesizes tagging studies to highlight the current state of knowledge concerning the behaviour and survival of anadromous salmonids in the marine environment. Scientific literature was reviewed to quantify the number and type of studies that have investigated behaviour and survival of anadromous forms of Pacific salmon (Oncorhynchus spp.), Atlantic salmon (Salmo salar), brown trout (Salmo trutta), steelhead (Oncorhynchus mykiss), and cutthroat trout (Oncorhynchus clarkii). We examined three categories of tags: electronic (e.g. acoustic, radio, archival), passive (e.g. external marks, Carlin, coded wire, passive integrated transponder [PIT]), and biological (e.g. otolith, genetic, scale, parasites). Based on 207 papers, survival rates and behaviour in marine environments were found to be extremely variable spatially and temporally, with some of the most influential factors being temperature, population, physiological state, and fish size. Salmonids at all life stages were consistently found to swim at an average speed of approximately one body length per second, which likely corresponds with the speed at which transport costs are minimal. We found that relatively little research has been conducted on open-ocean migrating salmonids, and that some species (e.g. masu [O. masou] and amago [O. rhodurus]) are underrepresented in the literature. The most common forms of tagging used across life stages were various forms of external tags, coded wire tags, and acoustic tags; however, the majority of studies did not measure tagging/handling effects on the fish, tag loss/failure, or tag detection probabilities when estimating survival. Through the interdisciplinary application of existing and novel technologies, future research examining the behaviour and survival of anadromous salmonids could incorporate important drivers such as oceanography, tagging/handling effects, predation, and physiology.