
    Analysis of weighted networks

    The connections in many networks are not merely binary entities, either present or not, but have associated weights that record their strengths relative to one another. Recent studies of networks have, by and large, steered clear of such weighted networks, which are often perceived as being harder to analyze than their unweighted counterparts. Here we point out that weighted networks can in many cases be analyzed using a simple mapping from a weighted network to an unweighted multigraph, allowing us to apply standard techniques for unweighted graphs to weighted ones as well. We give a number of examples of the method, including an algorithm for detecting community structure in weighted networks and a new and simple proof of the max-flow/min-cut theorem.
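
    As a concrete illustration of the mapping described above, the sketch below expands each integer edge weight into that many parallel edges of a multigraph, after which unweighted quantities such as degree coincide with the weighted strengths of the original network. This is only a minimal sketch of the idea using networkx with a made-up three-node example, not the authors' code.

```python
# Minimal sketch of the weighted-network -> unweighted-multigraph mapping,
# assuming integer edge weights; the three-node graph is illustrative only.
import networkx as nx

weighted = nx.Graph()
weighted.add_weighted_edges_from([("a", "b", 3), ("b", "c", 1), ("a", "c", 2)])

multi = nx.MultiGraph()
for u, v, data in weighted.edges(data=True):
    # An edge of weight w becomes w parallel unweighted edges.
    for _ in range(int(data["weight"])):
        multi.add_edge(u, v)

# Unweighted notions now carry over: the multigraph degree of a node equals
# its weighted "strength" in the original network.
print(dict(multi.degree()))                     # {'a': 5, 'b': 4, 'c': 3}
print(dict(weighted.degree(weight="weight")))   # same values
```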

    Self-reported pregnancy exposures and placental DNA methylation in the MARBLES prospective autism sibling study.

    Human placenta is a fetal-derived tissue that offers a unique sample of epigenetic and environmental exposures present in utero. In the MARBLES prospective pregnancy study of high-risk younger siblings of children with autism spectrum disorder (ASD), pregnancy and environmental factors collected by maternal interviews were examined as predictors of placental DNA methylation, including partially methylated domains (PMDs), an embryonic feature of the placental methylome. DNA methylation data from MethylC-seq analysis of 47 placentas of children clinically diagnosed at 3 years with ASD or typical development using standardized assessments were examined in relation to: child's gestational age, birth weight, and diagnosis; maternal pre-pregnancy body mass index, smoking, education, parity, height, prenatal vitamin and folate intake; home ownership; and pesticides professionally applied to lawns or gardens or inside homes, and pet flea/tick pouches, collars, or soaps/shampoos used in the 3 months prior to or during pregnancy. Sequencing run, order, and coverage, and child race and sex were considered as potential confounders. The Akaike information criterion was used to select the most parsimonious among candidate models. Final prediction models used sandwich estimators to produce heteroscedasticity-robust estimates of the 95% confidence interval (CI), and P-values were adjusted to control the false discovery rate at 5%. The strongest, most robust associations were between pesticides professionally applied outside the home and higher average methylation over PMDs [0.45 (95% CI 0.17 to 0.72), P = 0.03] and a reduced proportion of the genome in PMDs [−0.42 (95% CI −0.67 to −0.17), P = 0.03]. Pesticide exposures could alter placental DNA methylation more than other factors.
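
    The workflow sketched below is a hypothetical illustration of the analysis strategy described in the abstract (AIC-based selection among candidate models, sandwich-estimator standard errors, and FDR control at 5%), written with statsmodels. The variable names and simulated data are assumptions for illustration, not the study's actual variables or code.

```python
# Hypothetical sketch: AIC model selection, sandwich (robust) standard errors,
# and Benjamini-Hochberg FDR adjustment. Data and column names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "pmd_methylation": rng.normal(size=47),         # outcome (illustrative)
    "pesticide_outside": rng.integers(0, 2, 47),    # exposure (illustrative)
    "gestational_age": rng.normal(39, 1.5, 47),
    "seq_coverage": rng.normal(10, 2, 47),
})

candidates = [
    "pmd_methylation ~ pesticide_outside",
    "pmd_methylation ~ pesticide_outside + gestational_age",
    "pmd_methylation ~ pesticide_outside + gestational_age + seq_coverage",
]
fits = [smf.ols(f, data=df).fit() for f in candidates]
best = min(fits, key=lambda m: m.aic)               # most parsimonious by AIC

# Refit summary with a heteroscedasticity-robust (sandwich) covariance.
robust = best.get_robustcov_results(cov_type="HC3")
print(robust.summary())

# FDR adjustment of the resulting p-values (in the study, FDR was controlled
# across the exposures tested; here it is shown on one model's coefficients).
reject, p_adjusted, _, _ = multipletests(robust.pvalues, alpha=0.05, method="fdr_bh")
print(p_adjusted)
```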

    Thermal effects in the Input Optics of the Enhanced Laser Interferometer Gravitational-Wave Observatory interferometers

    We present the design and performance of the LIGO Input Optics subsystem as implemented for the sixth science run of the LIGO interferometers. The Initial LIGO Input Optics experienced thermal side effects when operating with 7 W of input power. We designed, built, and implemented improved versions of the Input Optics for Enhanced LIGO, an incremental upgrade to the Initial LIGO interferometers designed to run with 30 W of input power. At four times the power of Initial LIGO, the Enhanced LIGO Input Optics demonstrated improved performance, including better optical isolation, less thermal drift, minimal thermal lensing, and higher optical efficiency. The success of the Input Optics design fosters confidence in its ability to perform well in Advanced LIGO.

    Improving distributed video coding side information by intelligently combining macro-blocks from multiple algorithms

    The performance of distributed video coding (DVC) relies greatly on the quality of the side information (SI). This paper investigates a novel way of producing SI by intelligently combining macroblocks (MBs) produced by two SI generation algorithms, namely higher-order piecewise temporal trajectory interpolation (HOPTTI) and adaptive overlapped block motion compensation (AOBMC). The two algorithms address the problem differently: HOPTTI attempts to improve the motion estimation using higher-order trajectory interpolation, while AOBMC addresses the blocking and overlapping problems caused by inaccurate block matching. By judiciously selecting when to use AOBMC in place of HOPTTI, an improvement in SI quality, measured by peak signal-to-noise ratio (PSNR), can be obtained. Two switching mechanisms, which exploit the spatial-temporal correlation at the macroblock level, have been investigated, and the RST-based intelligent mode switching (IMS) algorithm is found to produce enhanced SI quality. Experimental results show that the basic mode switching algorithm gives a PSNR improvement of up to 1.8 dB in SI quality compared to using only HOPTTI. The more intelligent RST-based switching provides a further PSNR enhancement of up to 1.1 dB for certain test sequences.
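
    The toy sketch below illustrates the general idea of macroblock-level combination of two SI candidates; it is not the paper's HOPTTI/AOBMC pipeline or its RST-based switching rule. For each 16x16 block it keeps whichever candidate is closer, in sum of absolute differences, to the co-located block of the previously decoded frame, a crude stand-in for the spatial-temporal criterion, and it includes the PSNR measure used to report SI quality. All function and variable names are hypothetical.

```python
# Toy per-macroblock switching between two side-information candidates.
# Assumes grayscale frames whose dimensions are multiples of the block size.
import numpy as np

MB = 16  # macroblock size in pixels

def combine_side_information(si_hoptti, si_aobmc, prev_decoded):
    """Build an SI frame block-by-block from two candidate SI frames."""
    h, w = prev_decoded.shape
    out = np.empty_like(prev_decoded)
    for y in range(0, h, MB):
        for x in range(0, w, MB):
            ref = prev_decoded[y:y+MB, x:x+MB].astype(np.int32)
            sad_a = np.abs(si_hoptti[y:y+MB, x:x+MB].astype(np.int32) - ref).sum()
            sad_b = np.abs(si_aobmc[y:y+MB, x:x+MB].astype(np.int32) - ref).sum()
            # Keep whichever candidate block better matches the reference block.
            src = si_hoptti if sad_a <= sad_b else si_aobmc
            out[y:y+MB, x:x+MB] = src[y:y+MB, x:x+MB]
    return out

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio, the SI quality measure quoted above."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```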

    Increased emergency department use by adolescents and young adults with eating disorders

    Objective: This study describes patterns of emergency department (ED) utilization by patients who screen positive for eating disorders. Method: ED patients aged 14–20 years (n = 1,920) completed a computerized questionnaire. The analyses compared the rates of ED use between patients who screened positive for an eating disorder and those who did not, and examined the reasons for ED use among patients with eating disorders. Results: ED patients who screened positive for eating disorders were significantly more likely to have previously visited the ED and, on average, utilized the ED at a rate 1.6 times higher than patients who screened negative for eating disorders. The most common chief complaints among patients who screened positive for eating disorders were abdominal pain and other gastrointestinal-related problems. Discussion: Patients with eating disorders utilize the ED more frequently than those without and commonly present for complaints seemingly unrelated to their eating disorder. © 2012 by Wiley Periodicals, Inc. (Int J Eat Disord 2013)

    Cost-benefit analysis for commissioning decisions in GEO600

    Gravitational wave interferometers are complex instruments, requiring years of commissioning to achieve the sensitivities required for the detection of gravitational waves, of order 10^-21 in dimensionless detector strain, in the tens of Hz to several kHz frequency band. Investigations carried out by the GEO600 detector characterisation group have shown that detector characterisation techniques are useful when planning commissioning work. At the time of writing, GEO600 is the only large-scale laser interferometer currently in operation, running with a high duty factor of 70%, limited chiefly by the time spent commissioning the detector. The number of observable gravitational wave sources scales as the product of the volume of space to which the detector is sensitive and the observation time, so the goal of commissioning is to improve the detector sensitivity with the least possible detector down time. We demonstrate a method for increasing the number of sources observable by such a detector by assessing the severity of non-astrophysical noise contaminations to efficiently guide commissioning. This method will be particularly useful in the early stages and during the initial science runs of the aLIGO and adVirgo detectors, as they are brought up to design performance.
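
    A back-of-the-envelope sketch of that scaling argument follows: expected detections scale as sensitive volume times observing time, with volume growing as the cube of the detector range (inverse strain noise), so a commissioning intervention pays off only when the cubed sensitivity gain outweighs the observing time it costs. The rate, run length, downtime, and range-gain figures below are invented for illustration and are not GEO600 numbers.

```python
# Hypothetical cost-benefit check for a commissioning intervention:
# detections ~ rate * (sensitive volume) * (observing time), volume ~ range**3.
def expected_detections(relative_range, observing_days, rate_per_volume_day=1e-7):
    """Expected detections for a detector range quoted relative to baseline."""
    volume = relative_range ** 3
    return rate_per_volume_day * volume * observing_days

run_length = 365.0   # days in the science run (assumed)
downtime = 30.0      # commissioning days the fix would cost (assumed)
range_gain = 1.15    # 15% range improvement from removing a noise source (assumed)

baseline = expected_detections(1.0, run_length)
with_fix = expected_detections(range_gain, run_length - downtime)
# Commission only if the expected number of detections increases.
print(f"worth doing: {with_fix > baseline}")
```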

    Carbon removals from nature restoration are no substitute for steep emission reductions

    The role of nature restoration in mitigating the impacts of climate change is receiving increasing attention, yet its mitigation potential is often assessed in terms of carbon removal rather than the ability to meet temperature goals, such as those outlined in the Paris Agreement. Here, we estimate the global removal potential from nature restoration constrained by a “responsible development” framework and the contribution this would make to a 1.5°C temperature limit. Our constrained restoration options result in a median of 103 GtC (5%–95% range of −91 to 196 GtC) in cumulative removals between 2020 and 2100. When combined with deep-decarbonization scenarios, our restoration scenario briefly exceeds 1.5°C of warming before temperatures decline to between 1.25°C and 1.5°C by 2100 (median, 50% probability). We conclude that additional carbon sequestration via nature restoration is unlikely to be deployed quickly enough to notably reduce the global peak temperatures expected in the next few decades. Land restoration is an important option for tackling climate change but cannot compensate for delays in reducing fossil fuel emissions.