Problematic mukbang watching and its relationship to disordered eating and internet addiction: a pilot study among emerging adult mukbang watchers
Internet technology has facilitated a wide variety of activities and applications in online contexts. One such activity is watching mukbang (i.e., videos of “eating broadcasts” in which someone eats a large amount of food while interacting with viewers). The present study examined the relationship of problematic mukbang watching with disordered eating and internet addiction. Participants were 140 emerging adults who had watched mukbang at least once in the past 30 days (66% female; mean age = 21.66 years, SD = 1.88, range = 19–29). Structural equation modeling indicated that problematic mukbang watching was positively associated with both disordered eating and internet addiction. The present study is the first to explore the predictive role of problematic mukbang watching in adverse consequences; it suggests that mukbang watching may be problematic for a minority of emerging adults and that the impact of problematic mukbang watching on mental health and wellbeing warrants further examination.
Early Detection of Tuberculosis Outbreaks among the San Francisco Homeless: Trade-Offs Between Spatial Resolution and Temporal Scale
BACKGROUND: San Francisco has the highest rate of tuberculosis (TB) in the U.S., with recurrent outbreaks among the homeless and marginally housed. It has been shown for syndromic data that when exact geographic coordinates of individual patients are used as the spatial base for outbreak detection, higher detection rates and accuracy are achieved than when data are aggregated into administrative regions such as zip codes and census tracts. We examine the effect of varying the spatial resolution of the TB data within the San Francisco homeless population on detection sensitivity, timeliness, and the amount of historical data needed to achieve better performance measures. METHODS AND FINDINGS: We apply a variation of the space-time permutation scan statistic to the TB data, in which a patient's location is represented either by its exact coordinates or by the centroid of its census tract. We show that the detection sensitivity and timeliness of the method generally improve when exact locations are used to identify real TB outbreaks. When outbreaks are simulated, detection timeliness is consistently improved when exact coordinates are used, while detection sensitivity varies depending on the size of the spatial scanning window and the number of tracts in which cases are simulated. Finally, we show that when exact locations are used, a smaller amount of historical data is required to train the model. CONCLUSION: Systematic characterization of the spatio-temporal distribution of TB cases can widely benefit real-time surveillance and guide public health investigations of TB outbreaks as to what level of spatial resolution results in improved detection sensitivity and timeliness. Trading higher spatial resolution for better performance is ultimately a tradeoff between maintaining patient confidentiality and improving public health when sharing data. Understanding such tradeoffs is critical to managing the complex interplay between public policy and public health. This study is a step forward in this direction.
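The core of the method described above can be sketched in a few lines. The following is a minimal, illustrative Python implementation of a space-time permutation scan over cylindrical windows (Kulldorff-style log-likelihood ratio); it is not the authors' code, and the coordinate, radius, and time-window choices are placeholders. In the study's terms, passing exact case coordinates as `centers` corresponds to the high-resolution analysis, while passing census-tract centroids corresponds to the aggregated one.

```python
import math

def sptp_llr(c, mu, total):
    """Log-likelihood ratio for a cylinder with c observed cases and
    mu expected cases under the permutation null, out of `total` cases."""
    if c <= mu:
        return 0.0  # only excesses over expectation count as clusters
    llr = c * math.log(c / mu)
    if c < total:
        llr += (total - c) * math.log((total - c) / (total - mu))
    return llr

def scan(cases, centers, radius, windows):
    """Exhaustive scan over candidate space-time cylinders.

    cases:   list of (x, y, t) tuples, one per case
    centers: candidate circle centers -- exact case coordinates or
             census-tract centroids, depending on spatial resolution
    windows: list of (t0, t1) time intervals to scan
    Returns (center, window, llr) of the most anomalous cylinder."""
    total = len(cases)
    best = (None, None, 0.0)
    for cx, cy in centers:
        in_circle = [t for x, y, t in cases
                     if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
        for t0, t1 in windows:
            c = sum(t0 <= t <= t1 for t in in_circle)
            n_time = sum(t0 <= t <= t1 for _, _, t in cases)
            # permutation-model expectation: space and time marginals
            # are treated as independent
            mu = len(in_circle) * n_time / total
            llr = sptp_llr(c, mu, total)
            if llr > best[2]:
                best = ((cx, cy), (t0, t1), llr)
    return best
```

In practice the statistic's significance is assessed by Monte Carlo permutation of case timestamps, which is omitted here for brevity.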
Laparoscopic sacrocolpopexy with bone anchor fixation: short-term anatomic and functional results
INTRODUCTION AND HYPOTHESIS: The aim of this study was to evaluate the short-term anatomic and functional outcomes and safety of laparoscopic sacrocolpopexy with bone anchor fixation. METHODS: A prospective cohort study of women undergoing laparoscopic sacrocolpopexy between 2004 and 2009. Anatomic outcome was assessed using the Pelvic Organ Prolapse Quantification (POP-Q) score. Functional outcomes were assessed using the Urogenital Distress Inventory, the Defecatory Distress Inventory, and the Incontinence Impact Questionnaire, preoperatively and at 6 months postoperatively. The Wilcoxon signed-rank test was used to test differences between related samples. RESULTS: Forty-nine women underwent laparoscopic sacrocolpopexy. The objective success rate in the apical compartment was 98%; the subjective success rate was 79%. One mesh exposure (2%) was found. One conversion was necessary due to injury to the ileum. CONCLUSIONS: Laparoscopic sacrocolpopexy with bone anchor fixation is a safe and efficacious treatment for apical compartment prolapse. It provides excellent apical support and good functional outcome at 6 months postoperatively.
Regulation of N-WASP and the Arp2/3 Complex by Abp1 Controls Neuronal Morphology
Polymerization and organization of actin filaments into complex superstructures is indispensable for the structure and function of neuronal networks. We report here that knockdown of the F-actin-binding protein Abp1, which is important for endocytosis and synaptic organization, results in changes in axon development virtually identical to Arp2/3 complex inhibition, i.e., a selective increase in axon length. Our in vitro and in vivo experiments demonstrate that Abp1 interacts directly with N-WASP, an activator of the Arp2/3 complex, and releases the autoinhibition of N-WASP in cooperation with Cdc42, thereby promoting N-WASP-triggered, Arp2/3 complex-mediated actin polymerization. In line with our mechanistic studies and the colocalization of Abp1, N-WASP, and Arp2/3 at sites of actin polymerization in neurons, we reveal an essential role of Abp1, and its cooperativity with Cdc42, in N-WASP-induced rearrangements of the neuronal cytoskeleton. We furthermore show that introduction of N-WASP mutants lacking the ability to bind Abp1 or Cdc42, Arp2/3 complex inhibition, Abp1 knockdown, N-WASP knockdown, and Arp3 knockdown all cause identical neuromorphological phenotypes. Our data thus strongly suggest that these proteins and their complex formation are important for the cytoskeletal processes underlying neuronal network formation.
Utilisation of an operative difficulty grading scale for laparoscopic cholecystectomy
Background
A reliable system for grading operative difficulty of laparoscopic cholecystectomy would standardise description of findings and reporting of outcomes. The aim of this study was to validate a difficulty grading system (Nassar scale), testing its applicability and consistency in two large prospective datasets.
Methods
Patient- and disease-related variables and 30-day outcomes were identified in two prospective cholecystectomy databases: the multi-centre prospective cohort of 8820 patients from the recent CholeS Study and a single-surgeon series of 4089 patients. Operative data and patient outcomes were correlated with the Nassar operative difficulty scale, using Kendall's tau for dichotomous variables and Jonckheere–Terpstra tests for continuous variables. A ROC curve analysis was performed to quantify the predictive accuracy of the scale for each outcome, with continuous outcomes dichotomised prior to analysis.
Results
A higher operative difficulty grade was consistently associated with worse patient outcomes in both the reference and CholeS cohorts. The median length of stay increased from 0 to 4 days, and the 30-day complication rate from 7.6% to 24.4%, as the difficulty grade increased from 1 to 4/5 (both p < 0.001). In the CholeS cohort, a higher difficulty grade was most strongly associated with conversion to open surgery and 30-day mortality (AUROC = 0.903 and 0.822, respectively). On multivariable analysis, the Nassar operative difficulty scale was a significant independent predictor of operative duration, conversion to open surgery, 30-day complications, and 30-day reintervention (all p < 0.001).
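The AUROC figures quoted above measure how well an ordinal predictor such as a difficulty grade discriminates a binary outcome. As a minimal sketch (not the study's analysis code, and with made-up example data), the AUROC of a graded predictor can be computed directly from the Mann-Whitney U statistic, with ties among grades contributing 0.5 by the standard convention:

```python
def auroc(grades, outcomes):
    """AUROC of an ordinal predictor (e.g. a 1-5 difficulty grade)
    against a binary outcome (1 = event, 0 = no event)."""
    pos = [g for g, y in zip(grades, outcomes) if y == 1]
    neg = [g for g, y in zip(grades, outcomes) if y == 0]
    # fraction of (positive, negative) pairs the predictor ranks correctly
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A value of 0.903, as reported for conversion to open surgery, means that for a randomly chosen converted/non-converted pair, the converted case carries the higher difficulty grade about 90% of the time.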
Conclusion
We have shown that an operative difficulty scale can standardise the description of operative findings by surgeons of multiple grades, facilitating audit, training assessment, and research. It provides a tool for reporting operative findings, disease severity, and technical difficulty, and can be used in future research to reliably compare outcomes according to case mix and intra-operative difficulty.
Smoking Cessation Pharmacogenetics: Analysis of Varenicline and Bupropion in Placebo-Controlled Clinical Trials
Despite effective therapies for smoking cessation, most smokers find quitting difficult and most successful quitters relapse. Considerable evidence supports a genetic risk for nicotine dependence; however, less is known about the pharmacogenetics of smoking cessation. In the first pharmacogenetic investigation of the efficacy of varenicline and bupropion, we examined whether genes important in the pharmacodynamics and pharmacokinetics of these drugs and of nicotine predict medication efficacy and adverse events. Subjects participated in randomized, double-blind, placebo-controlled smoking cessation clinical trials comparing varenicline, a nicotinic acetylcholine receptor (nAChR) partial agonist, with bupropion, a norepinephrine/dopamine reuptake inhibitor, and placebo. The primary analysis included 1175 smokers of European ancestry and 785 single nucleotide polymorphisms from 24 genes, representing 254 linkage disequilibrium (LD) bins (genes included nAChR subunits, additional varenicline-specific genes, and genes involved in nicotine or bupropion metabolism). For varenicline, continuous abstinence (weeks 9–12) was associated with multiple nAChR subunit genes (including CHRNB2, CHRNA5, and CHRNA4) (OR=1.76; 95% CI: 1.23–2.52) (p<0.005); for bupropion, abstinence was associated with CYP2B6 (OR=1.78; 95% CI: 1.27–2.50) (p<0.001). Incidence of nausea was associated with several nAChR subunit genes (OR=0.50; 95% CI: 0.36–0.70) (p<0.0001), and time to relapse after quitting was associated with HTR3B (HR=1.97; 95% CI: 1.45–2.68) (p<0.0001). These data provide evidence for multiple genetic loci contributing to smoking cessation and therapeutic response. Different loci are associated with varenicline vs. bupropion response, suggesting that additional research may identify clinically useful markers to guide treatment decisions.
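The odds ratios with 95% confidence intervals reported above are the standard effect measure for such case-control-style genotype comparisons. As a minimal illustration (not the study's analysis, and with invented counts), an OR and its Woolf log-scale confidence interval can be computed from a 2x2 table of genotype carrier status by abstinence outcome:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
         risk-allele carriers:  a abstinent, b relapsed
         non-carriers:          c abstinent, d relapsed
    z=1.96 gives a two-sided 95% interval."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR) by Woolf's method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An interval excluding 1.0, as for CHRNB2/CHRNA5/CHRNA4 (1.23–2.52) and CYP2B6 (1.27–2.50) above, indicates a nominally significant association; genome-wide analyses additionally correct such p-values for the number of tested loci.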
Knowledge systems, health care teams, and clinical practice: a study of successful change
Clinical teams are of growing importance to healthcare delivery, but little is known about how teams learn and change their clinical practice. We examined how teams in three US hospitals succeeded in making significant practice improvements in the area of antimicrobial resistance. This was a qualitative cross-case study employing Soft Knowledge Systems as a conceptual framework. The purpose was to describe how teams produced, obtained, and used knowledge and information to bring about successful change. A purposeful sampling strategy was used to maximize variation between cases. Data were collected through interviews, archival document review, and direct observation. Individual case data were analyzed through a two-phase coding process, followed by cross-case analysis. Project teams varied in size and were multidisciplinary. Each project had more than one champion, only some of whom were physicians. Team members obtained relevant knowledge and information from multiple sources, including the scientific literature, experts, external organizations, and their own experience. The success of these projects hinged on the teams' ability to blend scientific evidence, practical knowledge, and clinical data. Practice change was a longitudinal, iterative learning process during which teams continued to acquire, produce, and synthesize relevant knowledge and information and to test different strategies until they found a workable solution to their problem. This study adds to our understanding of how teams learn and change, showing that innovation can take the form of an iterative, ongoing process in which bits of knowledge and information are assembled from multiple sources into potential solutions that are then tested. It suggests that existing approaches to assessing the impact of continuing education activities may overlook significant contributions, and that more attention should be given to the role that practical knowledge, in addition to scientific knowledge, plays in the change process.
Reef fishes at all trophic levels respond positively to effective marine protected areas
Marine Protected Areas (MPAs) offer a unique opportunity to test the assumption that fishing pressure affects some trophic groups more than others. Removal of larger predators through fishing is often suggested to have positive flow-on effects for some lower trophic groups, in which case protection from fishing should result in suppression of lower trophic groups as predator populations recover. We tested this by assessing differences in the trophic structure of reef fish communities associated with 79 MPAs and open-access sites worldwide, using a standardised quantitative dataset on reef fish community structure. The biomass of all major trophic groups (higher carnivores, benthic carnivores, planktivores, and herbivores) was significantly greater (by 40–200%) in effective no-take MPAs relative to fished open-access areas. This effect was most pronounced for individuals in large size classes, but no size class of any trophic group showed signs of depressed biomass in MPAs, as would be predicted from higher predator abundance. Thus, greater biomass in effective MPAs implies that exploitation on shallow rocky and coral reefs negatively affects the biomass of all fish trophic groups and size classes. These direct effects of fishing on trophic structure appear stronger than any top-down effects on lower trophic levels that would be imposed by intact predator populations. We propose that exploitation affects fish assemblages at all trophic levels, and that local ecosystem function is generally modified by fishing.