
    Associations between Internet-Based Professional Social Networking and Emotional Distress

    Professional social networking websites are commonly used among young professionals. In light of emerging concerns regarding social networking use and emotional distress, the purpose of this study was to investigate the association between frequency of use of LinkedIn, the most commonly used professional social networking website, and depression and anxiety among young adults. In October 2014, we assessed a nationally representative sample of 1,780 U.S. young adults between the ages of 19 and 32 regarding frequency of LinkedIn use, depression and anxiety, and socio-demographic covariates. We measured depression and anxiety using validated Patient-Reported Outcomes Measurement Information System measures. We used bivariable and multivariable logistic regression to assess the association between LinkedIn use and depression and anxiety while controlling for age, sex, race, relationship status, living situation, household income, education level, and overall social media use. In weighted analyses, 72% of participants did not report use of LinkedIn, 16% reported at least some use but less than once each week, and 12% reported use at least once per week. In multivariable analyses controlling for all covariates, compared with those who did not use LinkedIn, participants using LinkedIn at least once per week had significantly greater odds of increased depression (adjusted odds ratio [AOR] = 2.10, 95% confidence interval [CI] = 1.31-3.38) and increased anxiety (AOR = 2.79, 95% CI = 1.72-4.53). LinkedIn use was significantly related to both outcomes in a dose-response fashion. Future research should investigate the directionality of this association and possible reasons for it.
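The association measure reported above can be illustrated numerically. The sketch below computes an unadjusted odds ratio with a Wald 95% confidence interval from a hypothetical 2x2 table; the counts are invented for illustration and chosen so the odds ratio lands near the reported AOR of 2.10. The study itself used multivariable logistic regression with covariate adjustment, which this simple calculation does not reproduce.

```python
import math

# Hypothetical 2x2 counts: exposure (weekly LinkedIn use) vs. outcome (depression).
# Rows: exposed / unexposed; columns: outcome yes / no. All values are illustrative.
a, b = 60, 150   # exposed:   depressed / not depressed
c, d = 160, 840  # unexposed: depressed / not depressed

or_ = (a * d) / (b * c)                    # odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of log(OR), Wald method
lo = math.exp(math.log(or_) - 1.96 * se)   # lower 95% confidence limit
hi = math.exp(math.log(or_) + 1.96 * se)   # upper 95% confidence limit
print(f"OR = {or_:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```

The Wald interval is computed on the log-odds scale, where the estimator is approximately normal, and then exponentiated back to the odds-ratio scale.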

    Understanding innovators' experiences of barriers and facilitators in implementation and diffusion of healthcare service innovations: A qualitative study

    This article is made available through the Brunel Open Access Publishing Fund. Copyright @ 2011 Barnett et al. Background: Healthcare service innovations are considered to play a pivotal role in improving organisational efficiency and responding effectively to healthcare needs. Nevertheless, healthcare organisations encounter major difficulties in sustaining and diffusing innovations, especially those which concern the organisation and delivery of healthcare services. The purpose of the present study was to explore how healthcare innovators of process-based initiatives perceived and made sense of factors that either facilitated or obstructed the innovation implementation and diffusion. Methods: A qualitative study was designed. Fifteen primary and secondary healthcare organisations in the UK, which had received health service awards for successfully generating and implementing service innovations, were studied. In-depth, semi-structured interviews were conducted with the organisational representatives who conceived and led the development process. The data were recorded, transcribed and thematically analysed. Results: Four main themes were identified in the analysis of the data: the role of evidence, the function of inter-organisational partnerships, the influence of human-based resources, and the impact of contextual factors. "Hard" evidence operated as a proof of effectiveness, a means of dissemination and a pre-requisite for the initiation of innovation. Inter-organisational partnerships and people-based resources, such as champions, were considered an integral part of the process of developing, establishing and diffusing the innovations. Finally, contextual influences, both intra-organisational and extra-organisational, were seen as critical in either impeding or facilitating innovators' efforts.
Conclusions: Reflecting on their experiences of implementing, stabilising and diffusing novel service initiatives, the innovators pointed to a range of factors occurring in different combinations. Even though the innovations studied varied in content and originated from diverse organisational contexts, the innovators' accounts converged on the significant role of an evidential base of success, of inter-personal and inter-organisational networks, and of the inner and outer context. The innovators themselves, operating as important champions and often willing to lead implementation efforts in different contexts, can contribute significantly to the promulgation and spread of these novelties. This research was supported financially by the Multidisciplinary Assessment of Technology Centre for Healthcare (MATCH).

    On the experimental verification of quantum complexity in linear optics

    The first quantum technologies to solve computational problems that are beyond the capabilities of classical computers are likely to be devices that exploit characteristics inherent to a particular physical system, to tackle a bespoke problem suited to those characteristics. Evidence implies that the detection of ensembles of photons, which have propagated through a linear optical circuit, is equivalent to sampling from a probability distribution that is intractable to classical simulation. However, it is probable that the complexity of this type of sampling problem means that its solution is classically unverifiable within a feasible number of trials, and the task of establishing correct operation becomes one of gathering sufficiently convincing circumstantial evidence. Here, we develop scalable methods to experimentally establish correct operation for this class of sampling algorithm, which we implement with two different types of optical circuits for 3, 4, and 5 photons, on Hilbert spaces of up to 50,000 dimensions. With only a small number of trials, we establish a confidence >99% that we are not sampling from a uniform distribution or a classical distribution, and we demonstrate a unitary-specific witness that functions robustly for small amounts of data. Like the algorithmic operations they endorse, our methods exploit the characteristics native to the quantum system in question. Here we observe and apply a "bosonic clouding" phenomenon, interesting in its own right, where photons are found in local groups of modes superposed across two locations. Our broad approach is likely to be practical for all architectures for quantum technologies where formal verification methods for quantum algorithms are either intractable or unknown.
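The statistical idea behind ruling out the uniform distribution with only a few trials can be sketched as a likelihood-ratio (Bayesian model comparison) calculation. Everything below is a toy illustration: an 8-outcome distribution stands in for the predicted photonic sampling distribution, and the tally of detection events is invented. The paper's actual witnesses are unitary-specific and considerably more sophisticated.

```python
import math

# Toy stand-in for the predicted (non-uniform) sampling distribution over 8 outcomes;
# the null hypothesis is the uniform distribution over the same outcomes.
p_quantum = [0.30, 0.22, 0.16, 0.12, 0.08, 0.06, 0.04, 0.02]
p_uniform = [1 / 8] * 8

# Hypothetical tally of 40 detection events, roughly following p_quantum.
counts = [12, 9, 6, 5, 3, 2, 2, 1]

# Bayesian model comparison with equal priors: each observed event multiplies the
# posterior odds by the likelihood ratio P(x)/Q(x), i.e. adds log(P(x)/Q(x)).
log_odds = sum(n * math.log(p_quantum[i] / p_uniform[i])
               for i, n in enumerate(counts))
confidence = 1 / (1 + math.exp(-log_odds))  # posterior probability of "non-uniform"
print(f"confidence that sampling is non-uniform: {confidence:.5f}")
```

Because the expected log-likelihood ratio per sample equals the KL divergence between the two distributions, the posterior odds grow exponentially with the number of trials, which is why high confidence is reached from modest data.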

    Modelling the impact of larviciding on the population dynamics and biting rates of Simulium damnosum (s.l.): implications for vector control as a complementary strategy for onchocerciasis elimination in Africa

    Background: In 2012, the World Health Organization set goals for the elimination of onchocerciasis transmission by 2020 in selected African countries. Epidemiological data and mathematical modelling have indicated that elimination may not be achieved with annual ivermectin distribution in all endemic foci. Complementary and alternative treatment strategies (ATS), including vector control, will be necessary. Implementation of vector control will require that the ecology and population dynamics of Simulium damnosum sensu lato be carefully considered. Methods: We adapted our previous SIMuliid POPulation dynamics (SIMPOP) model to explore the impact of larvicidal insecticides on S. damnosum (s.l.) biting rates in different ecological contexts and to identify how frequently and for how long vector control should be continued to sustain substantive reductions in vector biting. SIMPOP was fitted to data from large-scale aerial larviciding trials in savannah sites (Ghana) and small-scale ground larviciding trials in forest areas (Cameroon). The model was validated against independent data from Burkina Faso/Côte d’Ivoire (savannah) and Bioko (forest). Scenario analysis explored the effects of ecological and programmatic factors such as pre-control daily biting rate (DBR) and larviciding scheme design on reductions and resurgences in biting rates. Results: The estimated efficacy of large-scale aerial larviciding in the savannah was greater than that of ground-based larviciding in the forest. Small changes in larvicidal efficacy can have large impacts on intervention success. At 93% larvicidal efficacy (a realistic value based on field trials), 10 consecutive weekly larvicidal treatments would reduce DBRs by 96% (e.g. from 400 to 16 bites/person/day). At 70% efficacy, and for 10 weekly applications, the DBR would decrease by 67% (e.g. from 400 to 132 bites/person/day). 
Larviciding is more likely to succeed in areas with lower water temperatures and where blackfly species have longer gonotrophic cycles. Conclusions: Focal vector control can reduce vector biting rates in settings where a high larvicidal efficacy can be achieved and an appropriate duration and frequency of larviciding can be ensured. Future work linking SIMPOP with onchocerciasis transmission models will permit evaluation of the impact of combined anti-vectorial and anti-parasitic interventions on accelerating elimination of the disease.
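A toy weekly-timestep model can illustrate why repeated weekly applications drive biting rates down further than a single application's kill fraction would suggest: aquatic cohorts are exposed to several treatments before emerging, so survival compounds across applications. This is not SIMPOP, and every parameter below (adult weekly survival, larval development time, baseline DBR) is hypothetical.

```python
def simulate_dbr(eff, weeks_treated, dbr0=400.0, larval_weeks=2, s=0.5, horizon=20):
    """Toy model: larvae spend `larval_weeks` weeks in water; each weekly larvicide
    application kills a fraction `eff` of every aquatic cohort; adults survive a
    week with probability `s`. Biting rate is proportional to adult abundance."""
    recruit = dbr0 * (1 - s)             # recruitment that sustains the baseline DBR
    adults = dbr0
    cohorts = [recruit] * larval_weeks   # larvae due to emerge in the coming weeks
    history = []
    for week in range(horizon):
        if week < weeks_treated:
            cohorts = [c * (1 - eff) for c in cohorts]  # larvicide hits all cohorts
        emerging = cohorts.pop(0)        # oldest cohort emerges as biting adults
        cohorts.append(recruit)          # eggs are laid at the untreated rate
        adults = adults * s + emerging
        history.append(adults)
    return history

h = simulate_dbr(eff=0.93, weeks_treated=10)
print(f"min DBR during treatment: {min(h[:10]):.1f} (baseline 400)")
```

In this toy setup each cohort is treated twice before emerging, so emergence is scaled by roughly (1 - eff)^2, and the biting rate falls well below the per-application kill fraction alone would imply; it rebounds once treatment stops, mirroring the resurgence dynamics the scenario analysis explores.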

    A train-the-trainer education and promotion program: chronic fatigue syndrome – a diagnostic and management challenge

    Background: Chronic fatigue syndrome (CFS) is a complicated illness for providers and patients. Fewer than 20% of persons with CFS have been diagnosed and treated. For providers, the challenge of making a diagnosis is compounded by the lack of a biomedical marker. Methods: The objective of the CFS diagnosis and management curriculum was to instruct core trainers in the evaluation, diagnosis, and management of CFS. Over a two-year period, 79 primary care physicians, physician assistants, and nurse practitioners from diverse regions in the U.S. participated as core trainers in a two-day Train-the-Trainer (TTT) workshop. As core trainers, the workshop participants were expected to show increases in knowledge, self-efficacy, and management skills, with the primary goal of conducting secondary presentations. Results: The optimal goal for each core trainer to present secondary training to 50 persons in the health care field was not reached. However, the combined core trainer group successfully reached 2,064 primary care providers. Eighty-two percent of core trainers responded "Very good" or "Excellent" in a post-test survey of self-efficacy expectation and CFS diagnosis. Data from the Chicago workshops showed significant improvement on the Primary Care Opinion Survey (p < 0.01) and on the Relevance and Responsibility Factors of the CAT survey (p = 0.03 and p = 0.04, respectively). Dallas workshop data show a significant change from pre- to post-test scores on the CFS Knowledge test (p = 0.001). Qualitative and process evaluation data revealed that target audience and administrative barriers impacted the feasibility of secondary training. Conclusion: The data show the workshop was successful in meeting the objectives of increasing CFS knowledge and raising perceived self-efficacy towards making a diagnosis. The CFS TTT program informed an educational provider project by shifting the format for physicians to grand rounds and continuing medical education design while retaining TTT aspects for nurse practitioners and physician assistants. Evaluations also indicate that secondary trainings may be more readily employed and accepted if administrative barriers are addressed early in the planning phases.

    Patient- and system-related barriers for the earlier diagnosis of colorectal cancer

    Background: A cohort of colorectal cancer (CRC) patients represents an opportunity to study missed opportunities for earlier diagnosis. Primary objective: to study the epidemiology of diagnostic delays and failures to offer/complete CRC screening. Secondary objective: to identify system- and patient-related factors that may contribute to diagnostic delays or failures to offer/complete CRC screening. Methods: Setting: a rural Veterans Administration (VA) healthcare system. Participants: CRC cases diagnosed within the VA between 1/1/2000 and 3/1/2007. Data sources: progress notes, orders, and pathology, laboratory, and imaging results obtained between 1/1/1995 and 12/31/2007. Completed CRC screening was defined as a fecal occult blood test or flexible sigmoidoscopy (both within five years), or colonoscopy (within 10 years); delayed diagnosis was defined as a gap of more than six months between an abnormal test result and evidence of clinician response. A summary abstract of the antecedent clinical care for each patient was created by a certified gastroenterologist (GI), who jointly reviewed and coded the abstracts with a general internist (TW). Results: The study population consisted of 150 CRC cases that met the inclusion criteria. The mean age was 69.04 (range 35-91); 99 (66%) were diagnosed due to symptoms; 61 cases (46%) had delays associated with system factors; of these, 57 (38% of the total) had delayed responses to abnormal findings. Fifteen of the cases (10%) had prompt symptom evaluations but received no CRC screening; no patient factors were identified as potentially contributing to the failure to screen/offer to screen. In total, 97 (65%) of the cases had missed opportunities for early diagnosis and 57 (38%) had patient factors that likely contributed to the diagnostic delay or apparent failure to screen/offer to screen. Conclusion: Missed opportunities for earlier CRC diagnosis were frequent. Additional studies of clinical data management, focusing on following up abnormal findings and offering/completing CRC screening, are needed.
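The study's operational definitions of completed screening and delayed diagnosis are mechanical enough to express as code. The sketch below is a hypothetical encoding: the function names, the 183-day approximation of "more than six months", and the example data are all invented for illustration, not taken from the study's chart-abstraction instrument.

```python
from datetime import date

def screening_completed(tests, as_of):
    """tests: list of (date, kind) with kind in {'fobt', 'flex_sig', 'colonoscopy'}.
    Completed screening: FOBT or flexible sigmoidoscopy within 5 years,
    or colonoscopy within 10 years, of the reference date."""
    for when, kind in tests:
        window_days = 10 * 365 if kind == "colonoscopy" else 5 * 365
        if 0 <= (as_of - when).days <= window_days:
            return True
    return False

def diagnosis_delayed(abnormal_result, first_response):
    """Delayed diagnosis: more than six months (~183 days) between an abnormal
    test result and the first documented clinician response."""
    return (first_response - abnormal_result).days > 183

tests = [(date(1998, 6, 1), "colonoscopy")]
print(screening_completed(tests, as_of=date(2005, 3, 1)))       # colonoscopy within 10 y -> True
print(diagnosis_delayed(date(2004, 1, 10), date(2004, 9, 20)))  # gap > 6 months -> True
```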

    Computer vision and machine learning for robust phenotyping in genome-wide studies

    Traditional evaluation of crop biotic and abiotic stresses is time-consuming and labor-intensive, limiting the ability to dissect the genetic basis of quantitative traits. A machine learning (ML)-enabled image-phenotyping pipeline for the genetic studies of the abiotic stress iron deficiency chlorosis (IDC) of soybean is reported. IDC classification and severity for an association panel of 461 diverse plant-introduction accessions was evaluated using an end-to-end phenotyping workflow. The workflow consisted of a multi-stage procedure including: (1) optimized protocols for consistent image capture across plant canopies, (2) canopy identification and registration from cluttered backgrounds, (3) extraction of domain-expert-informed features from the processed images to accurately represent IDC expression, and (4) supervised ML-based classifiers that linked the automatically extracted features with expert-rating-equivalent IDC scores. ML-generated phenotypic data were subsequently utilized for the genome-wide association study and genomic prediction. The results illustrate the reliability and advantage of the ML-enabled image-phenotyping pipeline by identifying a previously reported locus and a novel locus harboring a gene homolog involved in iron acquisition. This study demonstrates a promising path for integrating the phenotyping pipeline into genomic prediction, and provides a systematic framework enabling robust and quicker phenotyping through ground-based systems.
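Step (4) of the workflow, mapping image-derived features to expert-equivalent IDC scores, can be sketched in miniature. Everything below is hypothetical: the two color features, the training values, and the expert scores are invented, and a nearest-centroid classifier stands in for the supervised ML models actually used in the study.

```python
import math

# (mean_yellowness, mean_greenness) per canopy image, paired with a hypothetical
# expert IDC score from 1 (healthy) to 3 (severe chlorosis). All values illustrative.
train = [((0.10, 0.80), 1), ((0.15, 0.75), 1),
         ((0.45, 0.50), 2), ((0.50, 0.45), 2),
         ((0.80, 0.15), 3), ((0.85, 0.10), 3)]

def centroids(data):
    """Mean feature vector per IDC score label."""
    sums = {}
    for (x, y), label in data:
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {lab: (sx / n, sy / n) for lab, (sx, sy, n) in sums.items()}

def predict_idc(feature, cents):
    """Assign the label whose centroid is nearest in feature space."""
    return min(cents, key=lambda lab: math.dist(feature, cents[lab]))

cents = centroids(train)
print(predict_idc((0.78, 0.18), cents))  # near the severe-chlorosis centroid
```

In the actual pipeline, the features come from stages (1)-(3) of the image-processing workflow rather than being hand-entered, and the classifier is trained against expert visual ratings.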

    Barriers to obesity management: a pilot study of primary care clinicians

    BACKGROUND: Obesity is an increasing epidemic in both the US and veteran populations, yet it remains largely understudied in the Veterans Health Administration (VHA) setting. The purpose of our study was to identify barriers to the effective management of obesity in VHA primary care settings. METHODS: Three focus groups of clinicians from a Veterans Affairs Medical Center (VAMC) and an affiliated Community Based Outpatient Clinic (CBOC) were conducted to identify potential barriers to obesity management. The focus groups and previously published studies then informed the creation of a 47-item survey that was disseminated to and completed by 55 primary care clinicians. RESULTS: The focus groups identified provider, system, and patient barriers to obesity care. Lack of obesity training during medical school and residency was associated with lower rates of discussing diet and exercise with obese patients (p < 0.05). Clinicians who watched their own diets vigorously were more likely to calculate BMI for obese patients than other clinicians (42% vs. 13%, p < 0.05). Many barriers identified in previous studies (e.g., attitudes toward obese patients, lack of insurance payments for obesity care) were not prevalent barriers in the current study. CONCLUSION: Many VHA clinicians do not routinely provide weight management services for obese patients. The most prevalent barriers to obesity care were poor education during medical school and residency and the lack of information provided by the VHA to both clinicians and patients about available weight management services.