    A Self-Organized Method for Computing the Epidemic Threshold in Computer Networks

    Full text link
    In many cases, tainted information in a computer network can spread in a way similar to an epidemic in the human world. On the other hand, information-processing paths are often redundant, so a single infection occurrence can easily be "reabsorbed". Randomly checking the information with a central server is equivalent to lowering the infection probability, but at a certain cost (for instance, processing time), so it is important to quickly evaluate the epidemic threshold for each node. We present a method for obtaining such information without resorting to repeated simulations. As for human epidemics, local information about the infection level (risk perception) can be an important factor, and we show that our method can be applied to this case, too. Finally, when the process to be monitored is more complex and includes "disruptive interference", one has to use actual simulations, which however can be carried out "in parallel" for many possible infection probabilities.
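
    A point of reference for the spread model itself (this is the conventional mean-field result, not the self-organized method presented in the paper): for SIS-type dynamics on a network, the epidemic threshold is commonly estimated as the inverse of the largest eigenvalue of the adjacency matrix. A minimal sketch, assuming a networkx/numpy setup and a stand-in random graph:

        # Conventional spectral estimate of the SIS epidemic threshold,
        # shown only as a baseline; the paper's self-organized method differs.
        import networkx as nx
        import numpy as np

        G = nx.erdos_renyi_graph(n=500, p=0.02, seed=1)  # stand-in network (assumption)
        A = nx.to_numpy_array(G)
        lambda_max = np.max(np.linalg.eigvalsh(A))       # symmetric adjacency matrix

        # Mean-field result: infections with effective probability below tau_c tend to
        # die out, so lowering the per-node infection probability (e.g. by random
        # checks against a central server) pushes the system below threshold.
        tau_c = 1.0 / lambda_max
        print(f"Estimated epidemic threshold: {tau_c:.4f}")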

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    Get PDF
    Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can account both for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
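
    To make the point about probability distributions concrete (a toy illustration, not an analysis from the paper; the sub-factor distributions and parameters below are assumptions): multiplying toxicokinetic and toxicodynamic sub-factors probabilistically yields a combined factor whose upper percentiles depend strongly on which distributions are assumed.

        # Toy Monte Carlo: the combined uncertainty factor obtained by multiplying
        # toxicokinetic (TK) and toxicodynamic (TD) sub-factors depends on the
        # distributions assumed for them. All values below are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Choice 1: lognormal sub-factors, median 10, geometric SD of 2 (assumed)
        tk_logn = rng.lognormal(mean=np.log(10), sigma=np.log(2), size=n)
        td_logn = rng.lognormal(mean=np.log(10), sigma=np.log(2), size=n)

        # Choice 2: uniform sub-factors on [3, 10] (assumed)
        tk_unif = rng.uniform(3, 10, size=n)
        td_unif = rng.uniform(3, 10, size=n)

        for label, combined in [("lognormal", tk_logn * td_logn),
                                ("uniform", tk_unif * td_unif)]:
            print(f"{label:9s}: 95th percentile of combined factor = "
                  f"{np.percentile(combined, 95):.1f}")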

    Human Computation and Convergence

    Full text link
    Humans are the most effective integrators and producers of information, directly and through the use of information-processing inventions. As these inventions become increasingly sophisticated, the substantive role of humans in processing information will tend toward capabilities that derive from our most complex cognitive processes, e.g., abstraction, creativity, and applied world knowledge. Through the advancement of human computation - methods that leverage the respective strengths of humans and machines in distributed information-processing systems - formerly discrete processes will combine synergistically into increasingly integrated and complex information-processing systems. These new, collective systems will exhibit an unprecedented degree of predictive accuracy in modeling physical and techno-social processes, and may ultimately coalesce into a single unified predictive organism, with the capacity to address society's most wicked problems and achieve planetary homeostasis. Comment: Pre-publication draft of chapter. 24 pages, 3 figures; added references to pages 1 and 3, and corrected typos.

    Management of Hypertriglyceridemia in the Diabetic Patient

    Get PDF
    The hypertriglyceridemia of diabetes can be classified into mild to moderate (triglycerides between 150–499 mg/dL) and severe hypertriglyceridemia (triglycerides ≥500 mg/dL). As in other individuals with hypertriglyceridemia, secondary causes need to be excluded. The management of severe hypertriglyceridemia (chylomicronemia syndrome) includes aggressive reduction of triglycerides with intravenous insulin, fibrates, omega-3 fatty acids, and/or niacin therapy to avert the risk of pancreatitis. In patients with mild to moderate hypertriglyceridemia, the treatment of choice is statin therapy to achieve the low-density lipoprotein (LDL) and non-high-density lipoprotein (HDL) target goals. The evidence base would favor niacin therapy in combination with statin therapy to achieve the goals pertaining to LDL cholesterol and non-HDL cholesterol. The data about the combination of fibrate therapy with statin therapy are disappointing.
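
    The triglyceride cut-offs quoted above translate directly into a simple classification rule; a minimal sketch (the function name and the treatment of values below 150 mg/dL are assumptions, since the abstract defines only the two elevated categories):

        def classify_hypertriglyceridemia(tg_mg_dl: float) -> str:
            """Classify triglycerides using the cut-offs quoted in the abstract."""
            if tg_mg_dl >= 500:
                return "severe (risk of pancreatitis / chylomicronemia syndrome)"
            if tg_mg_dl >= 150:
                return "mild to moderate"
            return "normal"  # below 150 mg/dL (assumed label)

        print(classify_hypertriglyceridemia(620))  # severe
        print(classify_hypertriglyceridemia(250))  # mild to moderate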

    Pauli's Principle in Probe Microscopy

    Get PDF
    Exceptionally clear images of intramolecular structure can be attained in dynamic force microscopy through the combination of a passivated tip apex and operation in what has become known as the "Pauli exclusion regime" of the tip-sample interaction. We discuss, from an experimentalist's perspective, a number of aspects of the exclusion principle which underpin this ability to achieve submolecular resolution. Our particular focus is on the origins, history, and interpretation of Pauli's principle in the context of interatomic and intermolecular interactions. Comment: This is a chapter from "Imaging and Manipulation of Adsorbates using Dynamic Force Microscopy", a book which is part of the "Advances in Atom and Single Molecule Machines" series published by Springer [http://www.springer.com/series/10425]. To be published late 201

    Piperidinols that show anti-tubercular activity as inhibitors of arylamine N-acetyltransferase: an essential enzyme for mycobacterial survival inside macrophages

    Get PDF
    Latent M. tuberculosis infection presents one of the major obstacles in the global eradication of tuberculosis (TB). Cholesterol plays a critical role in the persistence of M. tuberculosis within the macrophage during latent infection. Catabolism of cholesterol contributes to the pool of propionyl-CoA, a precursor that is incorporated into cell-wall lipids. Arylamine N-acetyltransferase (NAT) is encoded within a gene cluster that is involved in cholesterol sterol-ring degradation and is essential for intracellular survival. The ability of the NAT from M. tuberculosis (TBNAT) to utilise propionyl-CoA links it to the cholesterol-catabolism pathway. Deleting the nat gene or inhibiting the NAT enzyme prevents intracellular survival and results in depletion of cell-wall lipids. TBNAT has been investigated as a potential target for TB therapies. From a previous high-throughput screen, 3-benzoyl-4-phenyl-1-methylpiperidinol was identified as a selective inhibitor of prokaryotic NAT that exhibited antimycobacterial activity. The compound resulted in time-dependent irreversible inhibition of NAT activity when tested against NAT from M. marinum (MMNAT). To further evaluate the antimycobacterial activity and the NAT inhibition of this compound, four piperidinol analogues were tested. All five compounds exert potent antimycobacterial activity against M. tuberculosis, with MIC values of 2.3-16.9 µM. Treatment of the MMNAT enzyme with this set of inhibitors resulted in an irreversible time-dependent inhibition of NAT activity. Here we investigate the mechanism of NAT inhibition by studying protein-ligand interactions using mass spectrometry in combination with enzyme analysis and structure determination. We propose a covalent mechanism of NAT inhibition that involves the formation of a reactive intermediate and selective cysteine residue modification. These piperidinols present a unique class of antimycobacterial compounds that have a novel mode of action, different from known anti-tubercular drugs.

    G-protein inwardly rectifying potassium channel 1 (GIRK 1) gene expression correlates with tumor progression in non-small cell lung cancer

    Get PDF
    BACKGROUND: G-protein inwardly rectifying potassium channel 1 (GIRK1) is thought to play a role in cell proliferation in cancer, and GIRK1 gene expression level may define a more aggressive phenotype. We detected GIRK1 expression in tissue specimens from patients with non-small cell lung cancers (NSCLCs) and assessed their clinical characteristics. METHODS: Using reverse transcription-polymerase chain reaction (RT-PCR) analyses, we quantified the expression of GIRK1 in 72 patients with NSCLCs to investigate the relationship between GIRK1 expression and clinicopathologic factors and prognosis. RESULTS: Of the 72 NSCLC patients, 50 (69%) samples were evaluated as having high GIRK1 gene expression and 22 (31%) as having low GIRK1 gene expression. GIRK1 gene expression was significantly associated with lymph node metastasis and stage (p = 0.0194 and p = 0.0207, respectively). The overall and stage I survival rates for patients whose tumors had high GIRK1 gene expression were significantly worse than for those whose tumors had low GIRK1 expression (p = 0.0004 for the overall group; p = 0.0376 for stage I). CONCLUSIONS: These data indicate that GIRK1 may contribute to tumor progression and that GIRK1 gene expression can serve as a useful prognostic marker in overall and stage I NSCLCs.

    Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis

    Get PDF
    Background Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy. Methods We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random effects meta-regression was used to identify study-level covariates that predicted diagnostic performance. Results We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity (95% confidence interval) was 94.2% (93.2 to 95.0) for proximal DVT and 63.5% (59.8 to 67.0) for distal DVT, and specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT, 71.2% (64.6 to 77.2) for distal DVT and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT, 75.2% (67.7 to 81.6) for distal DVT and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT, 56.8% (49.0 to 66.4) for distal DVT and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with a higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these cases being confirmed by venography. Conclusion Combined colour Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and based upon limited data.
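
    A minimal sketch of the kind of random-effects pooling described above (DerSimonian-Laird estimation on logit-transformed sensitivities); the study counts are invented placeholders, not data from the review:

        # DerSimonian-Laird random-effects pooling of sensitivities on the logit scale.
        # The (true positives, patients with DVT) pairs are invented placeholders.
        import numpy as np

        studies = [(45, 48), (88, 95), (30, 34), (120, 131)]  # (TP, diseased) -- assumptions

        tp = np.array([s[0] for s in studies], dtype=float)
        n = np.array([s[1] for s in studies], dtype=float)

        y = np.log(tp / (n - tp))          # logit(sensitivity) per study
        v = 1.0 / tp + 1.0 / (n - tp)      # approximate within-study variance (logit scale)

        w = 1.0 / v                        # fixed-effect weights
        y_fe = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fe) ** 2)    # Cochran's Q heterogeneity statistic
        k = len(studies)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1.0 / (v + tau2)            # random-effects weights
        y_re = np.sum(w_re * y) / np.sum(w_re)
        pooled_sens = 1.0 / (1.0 + np.exp(-y_re))   # back-transform to a proportion
        print(f"Pooled sensitivity: {pooled_sens:.3f}  (tau^2 = {tau2:.3f})")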

    Molecular characterisation of protist parasites in human-habituated mountain gorillas (Gorilla beringei beringei), humans and livestock, from Bwindi Impenetrable National Park, Uganda

    Get PDF
    Over 60% of human emerging infectious diseases are zoonotic, and there is growing evidence of the zooanthroponotic transmission of diseases from humans to livestock and wildlife species, with major implications for public health, economics, and conservation. Zooanthroponoses are of relevance to critically endangered species; amongst these is the mountain gorilla (Gorilla beringei beringei) of Uganda. Here, we assess the occurrence of Cryptosporidium, Cyclospora, Giardia, and Entamoeba infecting mountain gorillas in the Bwindi Impenetrable National Park (BINP), Uganda, using molecular methods. We also assess the occurrence of these parasites in humans and livestock species living in overlapping/adjacent geographical regions.

    Theory of Multidimensional Solitons

    Full text link
    We review a number of topics germane to higher-dimensional solitons in Bose-Einstein condensates. For dark solitons, we discuss dark band and planar solitons; ring dark solitons and spherical shell solitons; solitary waves in restricted geometries; vortex rings and rarefaction pulses; and multi-component Bose-Einstein condensates. For bright solitons, we discuss instability, stability, and metastability; bright soliton engineering, including pulsed atom lasers; solitons in a thermal bath; soliton-soliton interactions; and bright ring solitons and quantum vortices. A thorough reference list is included. Comment: review paper, to appear as Chapter 5a in "Emergent Nonlinear Phenomena in Bose-Einstein Condensates: Theory and Experiment," edited by P. G. Kevrekidis, D. J. Frantzeskakis, and R. Carretero-Gonzalez (Springer-Verlag).