
    Let Your CyberAlter Ego Share Information and Manage Spam

    Almost all of us have multiple cyberspace identities, and these cyber-alter egos are networked together to form a vast cyberspace social network. This network is distinct from the world-wide web (WWW), which is queried and mined to the tune of billions of dollars every day; the social network itself has, until recently, gone largely unexplored. Empirically, cyberspace social networks have been found to possess many of the same complex features that characterize their real-world counterparts, including scale-free degree distributions, low diameter, and extensive connectivity. We show that these topological features make the latent networks particularly suitable for exploration and management via local-only messaging protocols. Cyber-alter egos can communicate via their direct links (i.e., using only their own address books) and set up a highly decentralized and scalable message-passing network that allows large-scale sharing of information and data. As one particular example of such collaborative systems, we provide a design for a spam-filtering system, and our large-scale simulations show that the system achieves a spam detection rate close to 100%, while the false positive rate is kept around zero. This system has several advantages over other recent proposals: (i) it uses an already existing network, created by the same social dynamics that govern our daily lives, so no dedicated peer-to-peer (P2P) or centralized server-based systems need be constructed; (ii) it uses a percolation search algorithm that keeps query-generated traffic scalable; (iii) the network has a built-in trust system (just as in social networks) that can be used to thwart malicious attacks; (iv) it can be implemented right now as a plugin to popular email programs, such as MS Outlook, Eudora, and Sendmail.
    Comment: 13 pages, 10 figures
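    The local-only message passing and percolation-style query propagation described above can be sketched in a small simulation. The graph model, forwarding rule, and parameter values below are illustrative assumptions, not the authors' implementation:

```python
import random

def preferential_attachment_graph(n, m=2, seed=7):
    """Scale-free contact graph: each new node attaches to m existing
    nodes chosen with probability proportional to current degree."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}
    pool = [0, 1]                      # each node appears once per unit of degree
    for v in range(2, n):
        chosen = set()
        while len(chosen) < min(m, v):
            chosen.add(rng.choice(pool))
        adj[v] = chosen
        for u in chosen:
            adj[u].add(v)
            pool.extend([u, v])        # one degree each for both endpoints
    return adj

def percolation_query(adj, source, p, seed=1):
    """Probabilistic flooding: every node that receives the query
    forwards it along each of its links independently with prob. p."""
    rng = random.Random(seed)
    reached, frontier = {source}, [source]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in reached and rng.random() < p:
                    reached.add(v)
                    nxt.append(v)
        frontier = nxt
    return reached

g = preferential_attachment_graph(2000)
coverage = len(percolation_query(g, 0, p=0.3)) / len(g)
print(f"query coverage at p=0.3: {coverage:.2f}")
```

    On heavy-tailed graphs such as this one, even modest forwarding probabilities tend to reach a large fraction of nodes through the hubs, which is the scalability property the abstract's percolation search exploits.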

    Parallel gene synthesis in a microfluidic device

    The ability to synthesize custom de novo DNA constructs rapidly, accurately and inexpensively is highly desired by researchers, as synthetic genes and longer DNA constructs enable numerous powerful applications in both traditional molecular biology and the emerging field of synthetic biology. However, the current cost of de novo synthesis, driven largely by reagent and handling costs, is a significant barrier to the widespread availability of such technology. In this work, we demonstrate, to our knowledge, the first gene synthesis in a microfluidic environment. The use of microfluidic technology greatly reduces reaction volumes and the corresponding reagent and handling costs. Additionally, microfluidic technology enables large numbers of complex reactions to be performed in parallel. Here, we report the fabrication of a multi-chamber microfluidic device and its use in carrying out the syntheses of several DNA constructs. Genes up to 1 kb in length were synthesized in parallel at minute starting oligonucleotide concentrations (10–25 nM) in four 500 nl reactors. Such volumes are one to two orders of magnitude lower than those utilized in conventional gene synthesis. The identity of all target genes was verified by sequencing, and the resultant error rate was determined to be 1 per 560 bases.
    Massachusetts Institute of Technology, Center for Bits and Atoms; National Science Foundation (U.S.) (CBA grant CCR-0122419)
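    The reported error rate of 1 per 560 bases determines what fraction of full-length products can be expected to be error-free. The back-of-the-envelope calculation below, assuming independent per-base errors, is an illustration and not from the paper:

```python
import math

def error_free_fraction(length_bases, error_rate=1 / 560):
    """P(construct has no errors), assuming independent per-base errors."""
    return (1 - error_rate) ** length_bases

# Fraction of error-free molecules for constructs of increasing length
for length in (200, 500, 1000):
    print(f"{length:>5} b: {error_free_fraction(length):.1%} error-free")
```

    At the paper's 1 kb upper length, only roughly one in six molecules is expected to be fully correct, which is why sequencing verification (or error-correction steps) matters for longer constructs.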

    Preferential survival in models of complex ad hoc networks

    There has been a rich interplay in recent years between (i) empirical investigations of real-world dynamic networks, (ii) analytical modeling of the microscopic mechanisms that drive the emergence of such networks, and (iii) harnessing of these mechanisms to either manipulate existing networks or engineer new networks for specific tasks. We continue in this vein, and study the deletion phenomenon in the web by following two different sets of web-sites (each comprising more than 150,000 pages) over a one-year period. Empirical data show that there is a significant deletion component in the underlying web networks, but the deletion process is not uniform. This motivates us to introduce a new mechanism of preferential survival (PS), where nodes are removed according to a degree-dependent deletion kernel. We use the mean-field rate equation approach to study a general dynamic model driven by Preferential Attachment (PA), Double PA (DPA), and a tunable PS, where c nodes (c<1) are deleted per node added to the network, and verify our predictions via large-scale simulations. One of our results shows that, unlike in the case of uniform deletion, the PS kernel, when coupled with the standard PA mechanism, can lead to heavy-tailed power law networks even in the presence of extreme turnover in the network. Moreover, a weak DPA mechanism, coupled with PS, can help make the network even more heavy-tailed, especially in the limit when deletion and insertion rates are almost equal and the overall network growth is minimal. The dynamics reported in this work can be used to design and engineer stable ad hoc networks and explain the stability of the power law exponents observed in real-world networks.
    Comment: 9 pages, 6 figures
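    The insertion-deletion dynamics described above can be sketched in a toy simulation: each step adds one node via preferential attachment, then deletes a node with probability c under a degree-dependent survival kernel. The specific kernel (inverse-degree deletion, so hubs survive) is an assumption for illustration, not the paper's exact model:

```python
import random

def grow_with_ps(steps, c=0.5, m=2, seed=3):
    """Dynamic network with PA insertion and preferential survival:
    per node added, with probability c delete one node, choosing
    victims with weight inversely proportional to degree."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}
    for t in range(2, steps + 2):
        # --- preferential attachment insertion of node t ---
        nodes = list(adj)
        weights = [len(adj[u]) + 1 for u in nodes]  # +1 keeps isolated nodes attachable
        targets = set()
        while len(targets) < min(m, len(nodes)):
            targets.add(rng.choices(nodes, weights)[0])
        adj[t] = targets
        for u in targets:
            adj[u].add(t)
        # --- preferential survival deletion ---
        if rng.random() < c and len(adj) > m + 1:
            nodes = list(adj)
            inv = [1.0 / (len(adj[u]) + 1) for u in nodes]  # low degree = likely victim
            victim = rng.choices(nodes, inv)[0]
            for u in adj.pop(victim):
                adj[u].discard(victim)
    return adj

net = grow_with_ps(3000, c=0.8)
degrees = sorted((len(v) for v in net.values()), reverse=True)
print(f"size={len(net)}, max degree={degrees[0]}")
```

    Comparing the degree distribution of this run against one with uniform random deletion is a quick way to see the stabilizing effect the abstract attributes to the PS kernel.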

    Spectroscopy of Broad Line Blazars from 1LAC

    We report on optical spectroscopy of 165 Flat Spectrum Radio Quasars (FSRQs) in the Fermi 1LAC sample, which has enabled a nearly complete study of this population. Fermi FSRQs show significant evidence for non-thermal emission even in the optical; the degree depends on the gamma-ray hardness. They also have smaller virial estimates of black hole mass than the optical quasar sample. This appears to be largely due to a preferred (axial) view of the gamma-ray FSRQs and a non-isotropic (H/R ~ 0.4) distribution of broad-line velocities. Even after correction for this bias, the Fermi FSRQs show higher mean Eddington ratios than the optical population. A comparison of optical spectral properties with Owens Valley Radio Observatory radio flare activity shows no strong correlation.
    Comment: Accepted for publication in Ap

    Single-nucleotide polymorphisms are associated with cognitive decline at Alzheimer's disease conversion within mild cognitive impairment patients

    INTRODUCTION: The growing public health threat of Alzheimer's disease (AD) has raised the urgency of quantifying the degree of cognitive decline during the conversion from mild cognitive impairment (MCI) to AD and its underlying genetic pathway. The aim of this article was to test genetic common variants associated with accelerated cognitive decline after the conversion of MCI to AD. METHODS: Of 583 subjects with MCI enrolled in the Alzheimer's Disease Neuroimaging Initiative (ADNI; ADNI-1, ADNI-GO, and ADNI-2), 245 MCI participants converted to AD at follow-up. We tested the interaction effects between individual single-nucleotide polymorphisms and AD diagnosis trajectory on longitudinal Alzheimer's Disease Assessment Scale-Cognition scores. RESULTS: Our findings reveal six genes, BDH1, ST6GAL1, RAB20, PDS5B, ADARB2, and SPSB1, which are directly or indirectly related to MCI conversion to AD. DISCUSSION: This genome-wide association study sheds light on a genetic mechanism of longitudinal cognitive changes during the transition period from MCI to AD.
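    The SNP-by-trajectory interaction test described in METHODS amounts to regressing longitudinal cognitive scores on time, genotype, and their product, and asking whether the interaction coefficient differs from zero. The minimal ordinary-least-squares sketch below, on synthetic data, illustrates the idea; the study itself uses the full ADNI data and a genome-wide testing framework, and all names and numbers here are invented:

```python
import random

def ols(X, y):
    """Least squares via normal equations (X^T X) b = X^T y, solved
    with Gaussian elimination with partial pivoting (fine for tiny designs)."""
    k = len(X[0])
    A = [[sum(row[p] * row[q] for row in X) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(len(X))) for p in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for cc in range(col, k):
                A[r][cc] -= f * A[col][cc]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic longitudinal scores: carriers (snp=1) decline 2 extra
# points per visit, i.e. the true interaction coefficient is 2.0.
rng = random.Random(0)
X, y = [], []
for subj in range(200):
    snp = rng.randint(0, 1)
    for time in range(5):
        X.append([1.0, time, snp, time * snp])
        y.append(10 + 1.0 * time + 0.5 * snp + 2.0 * time * snp
                 + rng.gauss(0, 1))
b0, b_time, b_snp, b_inter = ols(X, y)
print(f"estimated interaction effect: {b_inter:.2f}")
```

    A significant interaction term means the genotype changes the slope of decline, not just its level, which is exactly the "accelerated decline after conversion" effect the abstract tests for.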

    Soil Moisture Active Passive (SMAP) Mission Level 4 Carbon (L4_C) Product Specification Document

    This is the Product Specification Document (PSD) for Level 4 Surface and Root Zone Soil Moisture (L4_SM) data for the Science Data System (SDS) of the Soil Moisture Active Passive (SMAP) project. The L4_SM data product provides estimates of land surface conditions based on the assimilation of SMAP observations into a customized version of the NASA Goddard Earth Observing System, Version 5 (GEOS-5) land data assimilation system (LDAS). This document applies to any standard L4_SM data product generated by the SMAP Project.

    An Outcome-based Approach for the Creation of Fetal Growth Standards: Do Singletons and Twins Need Separate Standards?

    Contemporary fetal growth standards are created by using theoretical properties (percentiles) of birth weight (for gestational age) distributions. The authors used a clinically relevant, outcome-based methodology to determine if separate fetal growth standards are required for singletons and twins. All singleton and twin livebirths between 36 and 42 weeks' gestation in the United States (1995–2002) were included, after exclusions for missing information and other factors (n = 17,811,922). A birth weight range was identified, at each gestational age, over which serious neonatal morbidity and neonatal mortality rates were lowest. Among singleton males at 40 weeks, serious neonatal morbidity/mortality rates were lowest between 3,012 g (95% confidence interval (CI): 3,008, 3,018) and 3,978 g (95% CI: 3,976, 3,980). The low end of this optimal birth weight range for females was 37 g (95% CI: 21, 53) less. The low optimal birth weight was 152 g (95% CI: 121, 183) less for twins compared with singletons. No differences were observed in low optimal birth weight by period (1999–2002 vs. 1995–1998), but small differences were observed for maternal education, race, parity, age, and smoking status. Patterns of birth weight-specific serious neonatal morbidity/neonatal mortality support the need for plurality-specific fetal growth standards.
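    The core of the outcome-based approach above is identifying, at each gestational age, the contiguous birth-weight window over which morbidity/mortality is lowest. A toy version on binned rates is sketched below; the bin widths, rates, and tolerance rule are invented for illustration and are not the authors' method:

```python
def optimal_weight_range(bins, rates, tolerance=1.05):
    """Return (lo, hi) birth-weight bounds of the longest contiguous run
    of bins whose outcome rate is within `tolerance` times the minimum
    observed rate (a simplified selection rule)."""
    floor = min(rates) * tolerance
    best, start = (0, 0), None
    for i, r in enumerate(rates + [float("inf")]):   # sentinel closes a trailing run
        if r <= floor and start is None:
            start = i
        elif r > floor and start is not None:
            if i - start > best[1] - best[0]:
                best = (start, i)
            start = None
    lo, hi = best
    return bins[lo], bins[hi - 1]

# Synthetic U-shaped risk curve over 250 g birth-weight bins
bins = [2500, 2750, 3000, 3250, 3500, 3750, 4000, 4250]
rates = [0.080, 0.030, 0.012, 0.010, 0.011, 0.012, 0.030, 0.090]
print(optimal_weight_range(bins, rates, tolerance=1.3))
```

    Running this rule separately on singleton and twin rate curves would reproduce the paper's comparison: if the two optimal windows differ materially (the paper finds a 152 g shift for twins), plurality-specific standards are warranted.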