
    Lighting during grow-out and Salmonella in broiler flocks

    Background: Lighting is used during conventional broiler grow-out to modify bird behaviour, meet production goals and improve bird welfare, but protocols for lighting intensity vary. In a field study, we evaluated whether lighting practices affect the burden of Salmonella in broiler flocks.
    Methods: Conventional grow-out flocks reared in Alabama, Mississippi and Texas, USA, from 2003 to 2006 were sampled 1 week before harvest (n = 58) and upon arrival for processing (n = 56) by collecting feathered-carcass rinsate, the crop and one cecum from each of 30 birds, and during processing by collecting rinsate of 30 carcasses at the pre-chilling (n = 56) and post-chilling points (n = 54). Litter samples and drag swabs of litter were collected from the grow-out houses after bird harvest (n = 56). Lighting practices for these flocks were obtained with a questionnaire completed by the growers. Associations between lighting practices and the burden of Salmonella in the flocks were tested while accounting for variation between the grow-out farms, their production complexes and companies.
    Results: A longer relative duration of reduced lighting during the grow-out period was associated with reduced detection of Salmonella on the exterior of birds 1 week before harvest and on broiler carcasses at the post-chilling point of processing. In addition, starting reduced lighting for ≥18 hours per day later in the grow-out period was associated with decreased detection of Salmonella on the exterior of broilers arriving for processing and in post-harvest drag swabs of litter from the grow-out house.
    Conclusions: The results of this field study show that lighting practices implemented during broiler rearing can affect the burden of Salmonella in the flock. The underlying mechanisms are likely to be interactive.

    Non-western immigrants' satisfaction with the general practitioners' services in Oslo, Norway

    Background: Over the last few years the number of immigrants from non-western parts of the world living in Oslo has increased considerably, and we need to know whether these immigrants are satisfied with the health services they are offered. The aim of this study was to assess whether the immigrants' level of satisfaction with visits to general practitioners was comparable with that of ethnic Norwegians.
    Methods: Two population-based surveys, the Oslo Health Study and the Oslo Immigrant Health Study, were performed on selected groups of Oslo citizens in 2000 and 2002, with response rates of 46% and 33%, respectively. In all, 11,936 Norwegians and 1,102 non-western immigrants from the Oslo Health Study, and 1,774 people from the Oslo Immigrant Health Study, were included in this analysis. Non-western immigrants' and ethnic Norwegians' levels of satisfaction with visits to general practitioners were analysed with respect to age, gender, health, working status and use of translators. Bivariate (chi-square) and multivariate (logistic regression) analyses were performed.
    Results: Most participants were either moderately or very satisfied with their last visit to a general practitioner. Non-western immigrants were less satisfied than Norwegians. Dissatisfaction among the immigrants was associated with young age, a feeling of not having good health, and coming from Turkey, Iran, Pakistan or Vietnam as compared with Sri Lanka. Attendance rates in the surveys were rather low, and lowest among the non-western immigrants.
    Conclusion: Although the degree of satisfaction with primary health care was relatively high among the participants in these surveys, the non-western immigrants were less satisfied than ethnic Norwegians with their last visit to a general practitioner. The rather low response rates open up the possibility that the degree of satisfaction may not be representative of all immigrants.
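    The bivariate chi-square analysis described above can be sketched as a Pearson test of independence on a 2x2 satisfaction-by-group table. The counts below are hypothetical, not the survey's data; this is a minimal stdlib-only illustration of the statistic, not the study's code.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: ethnic Norwegians, non-western immigrants
# Columns: satisfied, dissatisfied (hypothetical counts)
table = [[900, 100], [160, 40]]
print(round(chi_square_2x2(table), 2))  # 16.17; compare against chi-square with 1 df
```

    A larger statistic on 1 degree of freedom indicates stronger evidence that satisfaction differs between the groups.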

    Family composition and age at menarche: findings from the international Health Behaviour in School-Aged Children Study

    This research was funded by The University of St Andrews and NHS Health Scotland.
    Background: Early menarche has been associated with father absence, stepfather presence and adverse health consequences in later life. This article assesses the association of different family compositions with age at menarche, and explores pathways which may explain any association between family characteristics and pubertal timing.
    Methods: Cross-sectional, international data on age at menarche, family structure and covariates (age, psychosomatic complaints, media consumption, physical activity) were collected from the 2009–2010 Health Behaviour in School-aged Children (HBSC) survey. The sample focuses on 15-year-old girls, comprising 36,175 individuals across 40 countries in Europe and North America (N = 21,075 for age at menarche). The study examined the association of different family characteristics with age at menarche; regression and path analyses were applied, incorporating multilevel techniques to adjust for the nesting of data within countries.
    Results: Living with mother (Cohen's d = .12), father (d = .08), brothers (d = .04) and sisters (d = .06) was independently associated with later age at menarche. Living in a foster home (d = −.16), with 'someone else' (d = −.11), a stepmother (d = −.10) or a stepfather (d = −.06) was associated with earlier menarche. Path models show that up to 89% of these effects can be explained through lifestyle and psychological variables.
    Conclusions: Earlier menarche is reported amongst those with living conditions other than a family consisting of two biological parents. This can partly be explained by girls' higher Body Mass Index in these families, a biological determinant of early menarche. Lower physical activity and elevated psychosomatic complaints were also found more often among girls in these family environments.
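    The effect sizes above are Cohen's d values, the standardized mean difference. A minimal sketch of the computation with a pooled standard deviation follows; the means, SDs and group sizes are hypothetical, not the HBSC estimates.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: standardized mean difference with pooled SD."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Hypothetical example: mean age at menarche 13.0 y in one family type
# vs 12.4 y in another, both SD 1.2, n = 50 per group
d = cohens_d(13.0, 1.2, 50, 12.4, 1.2, 50)
print(round(d, 2))  # 0.5, conventionally a "medium" effect
```

    By the usual rule of thumb, d around .1, as reported for most family-composition contrasts above, is a small effect.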

    Pyrokinin β-Neuropeptide Affects Necrophoretic Behavior in Fire Ants (S. invicta), and Expression of β-NP in a Mycoinsecticide Increases Its Virulence

    Fire ants are among the world's most damaging invasive pests, with few means for their effective control. Although ecologically friendly alternatives to chemical pesticides, such as the insecticidal fungus Beauveria bassiana, have been suggested for the control of fire ant populations, their use has been limited by the low virulence of the fungus and the length of time it takes to kill its target. We present a means of increasing the virulence of the fungal agent by expressing a fire ant neuropeptide. Expression of the fire ant (Solenopsis invicta) pyrokinin β-neuropeptide (β-NP) by B. bassiana increased fungal virulence six-fold towards fire ants and decreased the LT50, but did not affect virulence towards the lepidopteran Galleria mellonella. Intriguingly, ants killed by the β-NP-expressing fungus were disrupted in the removal of dead colony members, i.e. necrophoretic behavior. Furthermore, synthetic C-terminally amidated β-NP, but not the non-amidated peptide, had a dramatic effect on necrophoretic behavior. These data link chemical sensing of a specific peptide to a complex social behavior. Our results also confirm a new approach to insect control in which expression of host molecules in an insect pathogen can be exploited for target-specific augmentation of virulence. The minimization of the development of potential insect resistance by our approach is discussed.
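    The LT50 referred to above is the time to 50% mortality in the treated population. One simple way to estimate it from time-course mortality data is linear interpolation between the observation times that bracket 50%; the mortality series below is hypothetical, not the paper's bioassay data, and published studies more often fit a probit or logistic model.

```python
def lt50(times, mortality):
    """Interpolate the time at which cumulative mortality first crosses 0.5.

    times: observation times (increasing); mortality: cumulative fraction dead.
    """
    pairs = list(zip(times, mortality))
    for (t0, m0), (t1, m1) in zip(pairs, pairs[1:]):
        if m0 < 0.5 <= m1:
            # linear interpolation within the bracketing interval
            return t0 + (0.5 - m0) / (m1 - m0) * (t1 - t0)
    raise ValueError("mortality never crosses 50%")

# days post-exposure vs. cumulative fraction of ants dead (hypothetical)
days = [2, 4, 6, 8]
dead = [0.1, 0.3, 0.6, 0.9]
print(round(lt50(days, dead), 2))  # 5.33 days
```

    A more virulent strain shifts the mortality curve left, giving a smaller LT50.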

    Quality assessment of an interferon-gamma release assay for tuberculosis infection in a resource-limited setting

    Background: When a test for the diagnosis of infectious diseases is introduced in a resource-limited setting, monitoring quality is a major concern. An optimized design of experiment and statistical models are required for this assessment.
    Methods: An interferon-gamma release assay to detect tuberculosis (TB) infection from whole blood was tested in Hanoi, Viet Nam. A balanced incomplete block design (BIBD) was planned, and fixed-effect models with heterogeneous error variance were used for analysis. In the first trial, whole blood from 12 donors was incubated with nil, TB-specific antigens or mitogen. Across 72 measurements, two laboratory members exchanged their roles in harvesting plasma and testing for interferon-gamma release using the enzyme-linked immunosorbent assay (ELISA) technique. After an intervention including a check of all steps and standard operating procedures, a second trial was implemented in the same manner.
    Results: The lack of precision in the first trial was clearly demonstrated. Large within-individual error was significantly affected by both the harvester and the ELISA operator, indicating that both steps had problems. After the intervention, overall within-individual error was significantly reduced (P < 0.0001) and error variance was no longer affected by the laboratory personnel in charge, indicating a marked, objectively observable improvement.
    Conclusion: BIBD and analysis of fixed-effect models with heterogeneous variance are suitable and useful for objective, individualized assessment of proficiency in a multistep diagnostic test for infectious diseases in a resource-constrained laboratory. An action plan based on our findings would be worth considering when on-site monitoring for internal quality control is difficult.
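    A BIBD is specified by five parameters: v treatments, b blocks, r replications per treatment, block size k, and λ, the number of blocks in which each treatment pair co-occurs. Any valid design must satisfy two counting identities, sketched below. The parameter sets shown are textbook examples (the Fano-plane design), not the trial's own layout.

```python
def bibd_conditions_hold(v, b, r, k, lam):
    """Check the two necessary BIBD parameter identities:
    b*k == v*r   (both sides count total experimental units)
    lam*(v-1) == r*(k-1)   (both sides count co-occurrences per treatment)."""
    return b * k == v * r and lam * (v - 1) == r * (k - 1)

print(bibd_conditions_hold(7, 7, 3, 3, 1))  # classic (7,7,3,3,1) design: True
print(bibd_conditions_hold(7, 7, 3, 3, 2))  # pair-balance identity fails: False
```

    The identities are necessary but not sufficient; a design with valid parameters must still be constructed explicitly.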

    Response rates and selection problems, with emphasis on mental health variables and DNA sampling, in large population-based, cross-sectional and longitudinal studies of adolescents in Norway

    Background: Selection bias is a threat to the internal validity of epidemiological studies. In light of a growing number of studies that aim to collect DNA, and a considerable number of invitees who decline to participate, we discuss response rates, predictors of loss to follow-up and of failure to provide DNA, and the presence of possible selection bias, based on five samples of adolescents.
    Methods: We included nearly 7,000 adolescents from two longitudinal studies of 18/19-year-olds with two corresponding cross-sectional baseline studies at age 15/16 (10th graders), and one cross-sectional study of 13th graders (18/19 years old). DNA was sampled from the cheek mucosa of the 18/19-year-olds. Predictors of loss to follow-up and of failure to provide DNA were studied by Poisson regression. Selection bias in the follow-up at age 18/19 was estimated by comparing prevalence ratios (PRs) between selected exposures (physical activity, smoking) and outcome variables (general health, mental distress, externalizing problems) measured at baseline.
    Results: Of the 5,750 who participated at age 15/16, we lost 42% at follow-up at age 18/19. The percentage of participants who consented to DNA provision was as high as the percentage that consented to linkage of data with other health registers and surveys, approximately 90%. Significant predictors of loss to follow-up and of failure to provide DNA samples in the present genetic epidemiological study were: male gender; non-western ethnicity; postal rather than school-based survey; low educational plans; low education and income of the father; low perceived family economy; unmarried parents; poor self-reported health; externalized symptoms; and smoking, with some differences in subgroups of ethnicity and gender. The association measures (PRs) were quite similar among participants and among all invitees, with some minor discrepancies in subgroups of non-western boys and girls.
    Conclusions: Loss to follow-up had a marginal impact on the estimated prevalence ratios, and it is not likely that the invitation to provide DNA influenced the response rates of the 18/19-year-olds. Non-western ethnicity, male gender and baseline characteristics related to low social class and to general and mental health problems are associated with loss to follow-up and failure to provide DNA.
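    The prevalence ratio compared above is the prevalence of the outcome among the exposed divided by the prevalence among the unexposed; comparing PRs computed among participants versus all invitees is one check for selection bias. The counts below are hypothetical, not the study's data.

```python
def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """PR = prevalence among exposed / prevalence among unexposed."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# Hypothetical example: mental distress in 30 of 100 smokers
# vs 15 of 100 non-smokers
pr = prevalence_ratio(30, 100, 15, 100)
print(pr)  # 2.0: the outcome is twice as prevalent among the exposed
```

    If the PR computed among responders alone differs materially from the PR among all invitees, non-response is distorting the exposure-outcome association.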

    Disruption of Murine mp29/Syf2/Ntc31 Gene Results in Embryonic Lethality with Aberrant Checkpoint Response

    Human p29 is a putative component of spliceosomes, but its role in pre-mRNA splicing remains elusive. By siRNA knockdown and stable overexpression, we previously demonstrated that human p29 is involved in the DNA damage response and the Fanconi anemia pathway in cultured cells. In this study, we generated p29 knockout mice (mp29GT/GT) using mp29 gene-trap embryonic stem cells to study the role of mp29 in the DNA damage response in vivo. Disruption of mp29 at both alleles resulted in embryonic lethality. Embryonic abnormality occurred as early as E6.5 in mp29GT/GT mice, accompanied by decreased mRNA levels of α-tubulin and Chk1. The reduction of α-tubulin and Chk1 mRNAs is likely due to an impaired post-transcriptional event. An aberrant G2/M checkpoint was found in mp29 gene-trap embryos exposed to aphidicolin and UV light. The embryonic lethality was rescued by crossing with mp29 transgenic mice. Additionally, knockdown of zfp29 in zebrafish resulted in embryonic death at 72 hours post-fertilization (hpf), and a lower level of acetylated α-tubulin was observed in zfp29 morphants. Together, these results illustrate an indispensable role for mp29 in the DNA checkpoint response during embryonic development.

    Stability and change in screen-based sedentary behaviours and associated factors among Norwegian children in the transition between childhood and adolescence

    Background: In order to inform interventions to prevent sedentariness, more longitudinal studies are needed that focus on stability and change over time in multiple sedentary behaviours. This paper investigates patterns of stability and change in TV/DVD use, computer/electronic game use and total screen time (TST), and factors associated with these patterns, among Norwegian children in the transition between childhood and adolescence.
    Methods: The baseline of this longitudinal study took place in September 2007 and included 975 students from the 25 control schools of an intervention study, the HEalth In Adolescents (HEIA) study. The first follow-up took place in May 2008 and the second in May 2009, with 885 students participating at all time points (average age at baseline 11.2 years, standard deviation ± 0.3). Time spent on TV/DVD and computer/electronic games was self-reported, and a TST variable (hours/week) was computed. Tracking analyses based on absolute and rank measures were conducted, as were regression analyses of factors associated with change in TST and with tracking high TST.
    Results: Time spent on all the sedentary behaviours investigated increased in both genders. Findings based on absolute and rank measures revealed a fair to moderate level of tracking over the 2-year period. High parental education was inversely related to an increase in TST among females. In males, self-efficacy related to barriers to physical activity and living with married or cohabiting parents were inversely related to an increase in TST. Factors associated with tracking high vs. low TST in the multinomial regression analyses were low self-efficacy and an ethnic minority background among females, and low self-efficacy, being overweight/obese and not living with married or cohabiting parents among males.
    Conclusions: Use of TV/DVD and computer/electronic games increased with age and tracked over time in this group of 11- to 13-year-old Norwegian children. Interventions targeting these sedentary behaviours should thus be introduced early. The identified modifiable and non-modifiable factors associated with change in TST and with tracking of high TST should be taken into consideration when planning such interventions.
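    Rank-based tracking of the kind described above asks whether children keep their relative position in the screen-time distribution over time, which Spearman's rank correlation captures. The sketch below uses the no-ties shortcut formula on hypothetical data; it is an illustration of the rank measure, not the study's analysis.

```python
def spearman_no_ties(x, y):
    """Spearman's rho via 1 - 6*sum(d^2)/(n*(n^2-1)); assumes no tied values."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# weekly screen-time hours at baseline vs. two years later (hypothetical)
baseline = [10, 14, 21, 25, 30]
follow_up = [12, 11, 26, 22, 33]
print(round(spearman_no_ties(baseline, follow_up), 2))  # 0.8
```

    A rho near 1 means children largely keep their rank over the follow-up period, i.e. high tracking; values around .3 to .5 are commonly labelled fair to moderate.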

    A new mouse model for renal lesions produced by intravenous injection of diphtheria toxin A-chain expression plasmid

    BACKGROUND: Various animal models of renal failure have been produced and used to investigate mechanisms underlying renal disease and to develop therapeutic drugs. Most available methods for producing such models involve subtotal nephrectomy or intravenous administration of antibodies raised against the basement membrane of glomeruli. In this study, we developed a novel method for producing mouse models of renal failure by intravenous injection of a plasmid carrying a toxic gene, the diphtheria toxin A-chain (DT-A) gene. DT-A is known to kill cells by inhibiting protein synthesis. METHODS: An expression plasmid carrying the cytomegalovirus enhancer/chicken β-actin promoter linked to the DT-A gene was mixed with lipid (FuGENE™6), and the resulting complexes were intravenously injected into adult male B6C3F1 mice every day for up to 6 days. After the final injection, the kidneys of these mice were sampled on day 4 and at weeks 3 and 5. RESULTS: H-E staining of the kidney specimens sampled on day 4 revealed remarkable alterations in glomerular compartments, exemplified by mesangial cell proliferation and the formation of extensive deposits in the glomerular basement membrane. At weeks 3 and 5, gradual recovery of these tissues was observed. The mice exhibited proteinuria and disease resembling sub-acute glomerulonephritis. CONCLUSIONS: Repeated intravenous injection of DT-A expression plasmid DNA/lipid complexes caused temporary abnormalities, mainly in the glomeruli of mouse kidneys, and the resulting disease resembles sub-acute glomerulonephritis. These DT-A plasmid-injected mice will be useful as animal models in the fields of nephrology and regenerative medicine.

    Systemic Immune Activation in HIV Infection Is Associated with Decreased MDC Responsiveness to TLR Ligand and Inability to Activate Naive CD4 T-Cells

    HIV infection is characterized by ineffective anti-viral T-cell responses and impaired dendritic cell (DC) functions, including the response to Toll-like receptor (TLR) ligands. Because TLR responsiveness may affect a host's response to virus, we examined TLR ligand-induced myeloid and plasmacytoid DC (MDC and PDC) activation of naïve T-cells in HIV+ subjects. Freshly purified MDC and PDC obtained from HIV+ subjects and healthy controls were cultured in the presence and absence of TLR ligands (poly I:C or R-848). We evaluated indices of maturation/activation (CD83, CD86 and HLA-DR expression), cytokine secretion (IFN-alpha and IL-6), and the ability to activate allogeneic naïve CD4 T-cells to secrete IFN-gamma and IL-2. MDC from HIV+ subjects had increased spontaneous IL-6 production and increased CD83 and CD86 expression compared with MDC of controls, and MDC IL-6 expression was associated with plasma HIV level. At the same time, poly I:C-induced HLA-DR up-regulation on MDC was reduced in HIV+ persons compared with controls; this finding was associated with an impaired ability of MDC from HIV+ subjects to activate allogeneic naïve CD4 T-cells. PDC from HIV+ persons had increased spontaneous and TLR ligand-induced IL-6 expression, and increased HLA-DR expression at baseline; the latter was associated with an intact ability of HIV+ PDC to activate allogeneic naïve CD4 T-cells. These results have implications for the ability of the HIV+ host to form innate and adaptive responses to HIV and other pathogens.