    Family Digital Literacy Practices and Children’s Mobile Phone Use

    Smartphones are ubiquitous in everyday life and are having a major impact on work, education, social relationships and modes of communication. Children are the fastest-growing population of smartphone users, and their use often centers on internet access: 1 in 3 internet users in the UK, for example, is under 18 years of age. Despite this widespread use, relatively little is known about the factors that underpin children's use. The home is a significant ecological context of development, and recent research has highlighted the importance of the home environment in promoting and supporting the development of both safe and unsafe online behavior. Yet the importance of these influences currently remains relatively unrecognized. In this paper we therefore present a narrative review of evidence examining parental practices concerning digital communication technologies and applications, with a particular focus on smartphones, and how these practices relate to children's use of technology. Emerging evidence indicates that two important factors are at play. Firstly, parental technology use is closely related to that of the child. Secondly, despite parents' frequently voiced concerns about the nature and extent of their child's mobile phone use, parents themselves often engage in a number of unsafe internet behaviors and excessive phone use in the home environment. Our review identifies two crucial lines of enquiry that have yet to be comprehensively pursued: firstly, the adoption of a psychological perspective on children's emergent behaviors with mobile devices, and secondly, the influential role of context. Given parental concerns about the possible negative impact of these technologies, parental awareness should be raised about the influence of parents' own behavior on internet safety, together with the adoption of good digital literacy practices. It is anticipated that a comprehensive characterization of the contextual factors influencing smartphone use will serve as a catalyst for debate, discussion and future research.

    Long-Acting β2-Agonists Increase Fluticasone Propionate-Induced Mitogen-Activated Protein Kinase Phosphatase 1 (MKP-1) in Airway Smooth Muscle Cells

    Mitogen-activated protein kinase phosphatase 1 (MKP-1) represses MAPK-driven signalling and plays an important anti-inflammatory role in asthma and airway remodelling. Although MKP-1 is corticosteroid-responsive and increased by cAMP-mediated signalling, the upregulation of this critical anti-inflammatory protein by long-acting β2-agonists and clinically used corticosteroids has been incompletely examined to date. To address this, we investigated MKP-1 gene expression and protein upregulation induced by two long-acting β2-agonists (salmeterol and formoterol), alone or in combination with the corticosteroid fluticasone propionate (abbreviated as fluticasone), in primary human airway smooth muscle (ASM) cells in vitro. β2-agonists increased MKP-1 protein in a rapid but transient manner, while fluticasone induced sustained upregulation. In combination, long-acting β2-agonists increased fluticasone-induced MKP-1 and modulated ASM synthetic function (measured by interleukin 6 (IL-6) and interleukin 8 (IL-8) secretion). As IL-6 expression (like that of MKP-1) is cAMP/adenylate cyclase-mediated, the long-acting β2-agonist formoterol increased IL-6 mRNA expression and secretion. Nevertheless, when added in combination with fluticasone, β2-agonists significantly repressed IL-6 secretion induced by tumour necrosis factor α (TNFα). As IL-8 is not cAMP-responsive, β2-agonists in combination with fluticasone also significantly inhibited TNFα-induced IL-8, where fluticasone alone was without repressive effect. In summary, long-acting β2-agonists increase fluticasone-induced MKP-1 in ASM cells and repress the synthetic function of this immunomodulatory airway cell type.

    Temporal variations in quality of acute stroke care and outcomes in London hyperacute stroke units: a mixed-methods study

    Background: Seven-day working in hospitals is a current priority of international health research and policy. Previous research has shown variability in delivering evidence-based clinical interventions across different times of the day and week. We aimed to identify factors influencing such variations in London hyperacute stroke units.
    Objectives: To investigate variations in quality of acute stroke care and outcomes by day and time of admission in London hyperacute stroke units, and to identify factors influencing such variations.
    Design: A prospective cohort study using anonymised patient-level data from the Sentinel Stroke National Audit Programme. Factors influencing variations in care and outcomes were studied through interview and observation data.
    Setting: Acute stroke services in London hyperacute stroke units.
    Participants: A total of 7094 patients with a primary diagnosis of stroke took part. We interviewed hyperacute stroke unit staff (n = 76), including doctors, nurses, therapists and administrators, and 31 patients and carers. We also conducted non-participant observations of delivery of care at different times of the day and week (n = 45, ≈102 hours).
    Intervention: A hub-and-spoke model for care of suspected acute stroke patients in London, with performance standards designed to deliver uniform access to high-quality hyperacute stroke unit care across the week.
    Main outcome measures: Indicators of quality of acute stroke care, mortality at 3 days after admission, disability at the end of the inpatient spell and length of stay.
    Data sources: Sentinel Stroke National Audit Programme data for all patients in London hyperacute stroke units with a primary diagnosis of stroke between 1 January and 31 December 2014, and nurse staffing data for all eight London hyperacute stroke units for the same period.
    Results: We found no variation in quality of care by day and time of admission across the week in terms of stroke nursing assessment, brain scanning and thrombolysis in London hyperacute stroke units, nor in 3-day mortality or disability at hospital discharge. Other quality-of-care measures varied significantly by day and time of admission. Quality of care was better if the nurse in charge was at a higher band and/or there were more nurses on duty. Staff delivered ‘front-door’ interventions consistently by taking on additional responsibilities out of hours, creating continuity between day and night, building trusting relationships and prioritising ‘front-door’ interventions.
    Limitations: We were unable to measure long-term outcomes, as our request to the Sentinel Stroke National Audit Programme, the Healthcare Quality Improvement Partnership and NHS Digital for Sentinel Stroke National Audit Programme data linked with patient mortality status was not fulfilled.
    Conclusions: Organisational factors influence the 24 hours a day, 7 days a week (24/7) provision of stroke care, creating temporal patterns of provision reflected in patient outcomes, including mortality, length of stay and functional independence.
    Future work: Further research would help to explore 24/7 stroke systems in other contexts. We need a clearer understanding of variations by looking at absolute time intervals rather than achievement of targets. Research is needed with longer-term mortality and modified Rankin Scale data, and a more meaningful range of outcomes.
    Funding: National Institute for Health Research (NIHR).
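
    The core quantitative comparison in a study like this is between admissions at different times of the day and week. The sketch below illustrates one way such a comparison can be set up, classifying admissions as in-hours or out-of-hours and summarising a quality indicator for each group; the records, field names and the 08:00-18:00 weekday definition of in-hours are illustrative assumptions, not the study's actual SSNAP variables or cut-offs.

```python
from datetime import datetime

def out_of_hours(ts: datetime) -> bool:
    """Weekend, or a weekday admission outside 08:00-18:00."""
    return ts.weekday() >= 5 or not (8 <= ts.hour < 18)

# Hypothetical admission records; real SSNAP fields differ.
admissions = [
    {"admitted": datetime(2014, 3, 3, 10, 15), "scanned_within_1h": True},
    {"admitted": datetime(2014, 3, 5, 23, 5), "scanned_within_1h": False},
    {"admitted": datetime(2014, 3, 8, 2, 40), "scanned_within_1h": True},
]

in_hours = [a for a in admissions if not out_of_hours(a["admitted"])]
off_hours = [a for a in admissions if out_of_hours(a["admitted"])]
for label, group in (("in-hours", in_hours), ("out-of-hours", off_hours)):
    if group:  # avoid dividing by zero for an empty group
        rate = sum(a["scanned_within_1h"] for a in group) / len(group)
        print(f"{label}: {rate:.0%} scanned within 1 hour (n={len(group)})")
```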

    Overlapping presentation of fungal tubulointerstitial nephritis in an immunosuppressed pediatric patient

    With the expanding use of immunosuppressive therapies and broad-spectrum antibiotics, Candida species have become an increasingly important cause of infection, particularly in the presence of anti-tumor necrosis factor-α therapy. We report the case of a 17-year-old female with ulcerative colitis who developed oliguric renal failure following immunosuppressive and nephrotoxic therapy. Although urine cultures and urinary tract imaging were negative in the face of fungemia, renal biopsy was the key to establishing the diagnosis of fungal tubulointerstitial nephritis as the primary reversible cause of the renal failure.

    Effect of multivitamin and multimineral supplementation on cognitive function in men and women aged 65 years and over : a randomised controlled trial

    Background: Observational studies have frequently reported an association between cognitive function and nutrition in later life, but randomised trials of B vitamins and antioxidant supplements have mostly found no beneficial effect. We examined the effect of daily supplementation with 11 vitamins and 5 minerals on cognitive function in older adults to assess the possibility that this could help to prevent cognitive decline.
    Methods: The study was carried out as part of a randomised, double-blind, placebo-controlled trial of micronutrient supplementation based in six primary care health centres in North East Scotland. 910 men and women aged 65 years and over living in the community were recruited and randomised: 456 to active treatment and 454 to placebo. The active treatment consisted of a single tablet containing eleven vitamins and five minerals in amounts ranging from 50–210% of the UK Reference Nutrient Intake, or a matching placebo tablet, taken daily for 12 months. Digit span forward and verbal fluency tests, which assess immediate memory and executive functioning respectively, were conducted at the start and end of the intervention period. Risk of micronutrient deficiency at baseline was assessed by a simple risk questionnaire.
    Results: For digit span forward there was no evidence of an effect of supplements in all participants or in subgroups defined by age or risk of deficiency. For verbal fluency there was no evidence of a beneficial effect in the whole study population, but there was weak evidence for a beneficial effect of supplementation in the two pre-specified subgroups: in those aged 75 years and over (n = 290; mean difference between supplemented and placebo groups 2.8 (95% CI -0.6, 6.2) units) and in those at increased risk of micronutrient deficiency assessed by the risk questionnaire (n = 260; mean difference between supplemented and placebo groups 2.5 (95% CI -1.0, 6.1) units).
    Conclusion: The results provide no evidence for a beneficial effect of daily multivitamin and multimineral supplements on these domains of cognitive function in community-living people over 65 years. However, the possibility of beneficial effects in older people and those at greater risk of nutritional deficiency deserves further attention.
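
    The subgroup contrasts quoted above are two-sample mean differences with 95% confidence intervals. As a minimal sketch of that calculation, assuming made-up verbal fluency scores (the trial's raw data are not given in the abstract), a normal-approximation interval can be computed like this:

```python
import statistics

def mean_diff_ci(active, placebo, z=1.96):
    """Difference in means with a normal-approximation 95% CI."""
    diff = statistics.mean(active) - statistics.mean(placebo)
    se = (statistics.variance(active) / len(active)
          + statistics.variance(placebo) / len(placebo)) ** 0.5
    return diff, diff - z * se, diff + z * se

# Hypothetical verbal fluency scores for a supplemented and a placebo group.
active = [38, 42, 35, 44, 40, 37, 43, 39]
placebo = [36, 39, 34, 41, 38, 35, 40, 36]
diff, low, high = mean_diff_ci(active, placebo)
print(f"mean difference {diff:.1f} (95% CI {low:.1f}, {high:.1f})")
```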

    Resource Modelling: The Missing Piece of the HTA Jigsaw?

    Within health technology assessment (HTA), cost-effectiveness analysis and budget impact analyses have been broadly accepted as important components of decision making. However, whilst they address efficiency and affordability, the issue of implementation and feasibility has been largely ignored. HTA commonly takes place within a deliberative framework that captures issues of implementation and feasibility in a qualitative manner. We argue that only through a formal quantitative assessment of resource constraints can these issues be fully addressed. This paper argues the need for resource modelling to be considered explicitly in HTA. First, economic evaluation and budget impact models are described, along with their limitations in evaluating feasibility. Next, resource modelling is defined and its usefulness described, with examples of resource modelling from the literature. Then the important issues that need to be considered when undertaking resource modelling are described, before recommendations are set out for the use of resource modelling in HTA.
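
    To make the idea concrete: a resource model checks projected demand for an intervention against the capacity of a constrained input, something cost-effectiveness and budget impact models do not capture. The sketch below uses entirely hypothetical figures for specialist-nurse hours and per-patient delivery time to show the basic shape of such a check.

```python
# Minimal resource-constraint sketch; all figures are assumptions,
# not drawn from the paper.
CAPACITY_HOURS_PER_YEAR = 8_000   # available specialist-nurse hours
HOURS_PER_PATIENT = 6             # delivery time per treated patient
projected_demand = [800, 1_100, 1_500, 1_800, 2_000]  # patients, years 1-5

for year, demand in enumerate(projected_demand, start=1):
    treatable = CAPACITY_HOURS_PER_YEAR // HOURS_PER_PATIENT
    served = min(demand, treatable)
    shortfall = demand - served  # demand the constrained resource cannot meet
    print(f"Year {year}: demand {demand}, capacity {treatable}, "
          f"unmet {shortfall}")
```

    Even when an intervention is cost-effective and affordable, a positive shortfall in a model like this signals that it is not feasible to implement at the projected scale without expanding the constrained resource.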

    From DNA sequence to application: possibilities and complications

    The development of sophisticated genetic tools during the past 15 years has facilitated a tremendous increase in fundamental and application-oriented knowledge of lactic acid bacteria (LAB) and their bacteriophages. This knowledge relates both to the assignment of open reading frames (ORFs) and to the function of non-coding DNA sequences. Comparison of the complete nucleotide sequences of several LAB bacteriophages has revealed that their chromosomes have a fixed, modular structure, each module having a set of genes involved in a specific phase of the bacteriophage life cycle. LAB bacteriophage genes and DNA sequences have been used for the construction of temperature-inducible gene expression systems, gene-integration systems, and bacteriophage defence systems. The functions of several LAB open reading frames and transcriptional units have been identified and characterized in detail. Many of these could find practical applications, such as induced lysis of LAB to enhance cheese ripening and re-routing of carbon fluxes for the production of a specific amino acid enantiomer. More knowledge has also become available concerning the function and structure of non-coding DNA positioned at or in the vicinity of promoters. In several cases the mRNA produced from this DNA contains a transcriptional terminator-antiterminator pair, in which the antiterminator can be stabilized either by uncharged tRNA or by interaction with a regulatory protein, thus preventing formation of the terminator so that mRNA elongation can proceed. Evidence has also accumulated that, in LAB as in other bacteria, carbon catabolite repression is mediated by specific DNA elements in the vicinity of promoters governing the transcription of catabolic operons. Although some biological barriers have yet to be overcome, the vast body of scientific information presently available allows the construction of tailor-made genetically modified LAB. Today, it appears that societal constraints rather than biological hurdles impede the use of genetically modified LAB.
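
    Computationally, the ORF assignment mentioned above amounts to scanning each reading frame for start-to-stop codon stretches. Below is a minimal single-strand sketch; real annotation pipelines also scan the reverse complement and apply length and codon-usage filters, and the demo sequence is made up.

```python
START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

def find_orfs(seq: str, min_codons: int = 10):
    """Yield (start, end) indices of ORFs from ATG to an in-frame stop."""
    seq = seq.upper()
    for frame in range(3):
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i + 3] == START:
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j + 3] in STOPS:
                        if (j - i) // 3 >= min_codons:
                            yield (i, j + 3)
                        i = j  # resume scanning after this ORF's stop codon
                        break
            i += 3

demo = "CCATGAAACCCGGGTTTAAACCCGGGTTTAAACCCGGGTAGCC"
print(list(find_orfs(demo, min_codons=5)))  # [(2, 41)]
```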

    Long Delays and Missed Opportunities in Diagnosing Smear-Positive Pulmonary Tuberculosis in Kampala, Uganda: A Cross-Sectional Study

    BACKGROUND: Early detection and treatment of tuberculosis cases are the hallmark of successful tuberculosis control. We conducted a cross-sectional study at public primary health facilities in Kampala city, Uganda, to quantify diagnostic delay among pulmonary tuberculosis (PTB) patients, assess associated factors, and describe the trajectories of patients' health care seeking.
    METHODOLOGY/PRINCIPAL FINDINGS: We conducted semi-structured interviews with new smear-positive PTB patients (aged ≥ 15 years) registered for treatment. Between April 2007 and April 2008, 253 patients were studied. The median total delay was 8 weeks (inter-quartile range [IQR] 4-12), the median patient delay was 4 weeks (IQR 1-8) and the median health service delay was 4 weeks (IQR 2-8). Long total delay (>14 weeks) was observed for 61/253 (24.1%) of patients, long health service delay (>6 weeks) for 71/242 (29.3%) and long patient delay (>8 weeks) for 47/242 (19.4%). Patients who knew that TB was curable were less likely to have long total delay (adjusted odds ratio [aOR] 0.28; 95% CI 0.11-0.73) or long patient delay (aOR 0.36; 95% CI 0.13-0.97). Being female (aOR 1.98; 95% CI 1.06-3.71), staying for more than 5 years at the current residence (aOR 2.24; 95% CI 1.18-4.27) and having been tested for HIV before (aOR 3.72; 95% CI 1.42-9.75) were associated with long health service delay. Health service delay contributed 50% of the total delay. Ninety-one percent (231) of patients had visited one or more health care providers before they were diagnosed, with a median of 4 visits (range 1-30). All but four patients had systemic symptoms by the time the diagnosis of TB was made.
    CONCLUSIONS/SIGNIFICANCE: Diagnostic delay among tuberculosis patients in Kampala is common and long. This reflects patients waiting too long before seeking care and health services waiting until systemic symptoms are present before examining sputum smears; the result is missed opportunities for diagnosis.
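
    The adjusted odds ratios above come from multivariable logistic regression, which the abstract does not detail. As a minimal sketch of the underlying quantity, an unadjusted odds ratio with a Woolf 95% confidence interval can be computed from a 2×2 table; the counts below are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table: a,b = delayed/not delayed among exposed;
    c,d = delayed/not delayed among unexposed. Woolf (log) CI."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    low = math.exp(math.log(or_) - z * se_log)
    high = math.exp(math.log(or_) + z * se_log)
    return or_, low, high

# Hypothetical counts: long patient delay by knowledge that TB is curable.
or_, low, high = odds_ratio_ci(a=12, b=88, c=35, d=107)
print(f"OR {or_:.2f} (95% CI {low:.2f}-{high:.2f})")
```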

    How good is probabilistic record linkage to reconstruct reproductive histories? Results from the Aberdeen children of the 1950s study

    BACKGROUND: Probabilistic record linkage is widely used in epidemiology, but studies of its validity are rare. Our aim was to validate its use to identify births to a cohort of women drawn from a large cohort of people born in Scotland in the early 1950s.
    METHODS: The Children of the 1950s cohort includes 5868 females born in Aberdeen in 1950–56 who were in primary schools in the city in 1962. In 2001 a postal questionnaire was sent to the cohort members resident in the UK requesting information on offspring. Probabilistic record linkage (based on surname, maiden name, initials, date of birth and postcode) was used to link the females in the cohort to birth records held by the Scottish Maternity Record System (SMR 2).
    RESULTS: We attempted to mail a total of 5540 women; 3752 (68%) returned a completed questionnaire. Of these, 86% reported having had at least one birth. Linkage to SMR 2 was attempted for 5634 women; one or more maternity records were found for 3743. There were 2604 women who reported at least one birth in the questionnaire and who were linked to one or more SMR 2 records. When judged against the questionnaire information, the linkage correctly identified 4930 births and missed 601 others. Most of the missed births occurred outside Scotland (147) or prior to full coverage by SMR 2 (454). There were 134 births incorrectly linked to SMR 2.
    CONCLUSION: Probabilistic record linkage to routine maternity records, applied to a population-based cohort using name, date of birth and place of residence, can have high specificity, and as such may be reliably used in epidemiological research.
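
    Probabilistic linkage of this kind typically scores candidate record pairs by summing field-level agreement weights in the Fellegi–Sunter style. The sketch below shows the mechanics; the m- and u-probabilities, field names and records are illustrative assumptions, as the study's actual weights, blocking strategy and thresholds are not given in the abstract.

```python
import math

# m = P(fields agree | true match), u = P(fields agree | non-match).
# These probabilities are hypothetical placeholders.
FIELDS = {
    "surname":       {"m": 0.95, "u": 0.01},
    "maiden_name":   {"m": 0.90, "u": 0.01},
    "initials":      {"m": 0.95, "u": 0.05},
    "date_of_birth": {"m": 0.98, "u": 0.001},
    "postcode":      {"m": 0.85, "u": 0.02},
}

def match_weight(rec_a: dict, rec_b: dict) -> float:
    """Sum log2 likelihood ratios over the comparison fields."""
    total = 0.0
    for field, p in FIELDS.items():
        if rec_a.get(field) and rec_a.get(field) == rec_b.get(field):
            total += math.log2(p["m"] / p["u"])              # agreement
        else:
            total += math.log2((1 - p["m"]) / (1 - p["u"]))  # disagreement
    return total

# Pairs scoring above an upper threshold are accepted as links; a middle
# band can be set aside for clerical review.
cohort_rec = {"surname": "SMITH", "maiden_name": "BROWN", "initials": "AJ",
              "date_of_birth": "1953-06-01", "postcode": "AB10 1AA"}
smr2_rec = {"surname": "SMITH", "maiden_name": "BROWN", "initials": "AJ",
            "date_of_birth": "1953-06-01", "postcode": "AB10 1AB"}
print(match_weight(cohort_rec, smr2_rec))
```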