216 research outputs found

    Effects of multiple-dose ponesimod, a selective S1P1 receptor modulator, on lymphocyte subsets in healthy humans

    This study investigated the effects of ponesimod, a selective S1P1 receptor modulator, on T lymphocyte subsets in 16 healthy subjects. Lymphocyte subset proportions and absolute numbers were determined at baseline and on Day 10, after once-daily administration of ponesimod (10 mg, 20 mg, and 40 mg, each consecutively for 3 days) or placebo (ratio 3:1). The overall change from baseline in lymphocyte count was -1,292 +/- 340 x 10^6 cells/L and 275 +/- 486 x 10^6 cells/L in ponesimod- and placebo-treated subjects, respectively. This included a decrease in both T and B lymphocytes following ponesimod treatment. A decrease in naive CD4+ T cells (CD45RA+CCR7+) from baseline was observed only after ponesimod treatment (-113 +/- 98 x 10^6 cells/L; placebo: 0 +/- 18 x 10^6 cells/L). The numbers of T-cytotoxic (CD3+CD8+) and T-helper (CD3+CD4+) cells were significantly altered following ponesimod treatment compared with placebo. Furthermore, ponesimod treatment resulted in marked decreases in CD4+ T-central memory (CD45RA-CCR7+) cells (-437 +/- 164 x 10^6 cells/L) and CD4+ T-effector memory (CD45RA-CCR7-) cells (-131 +/- 57 x 10^6 cells/L). In addition, ponesimod treatment led to a decrease of -228 +/- 90 x 10^6 cells/L in gut-homing T cells (CLA- integrin beta7+). In contrast, when compared with placebo, CD8+ T-effector memory and natural killer (NK) cells were not significantly reduced following multiple-dose administration of ponesimod. In summary, ponesimod treatment led to a marked reduction in overall T and B cells. Further investigation revealed that the number of CD4+ cells was dramatically reduced, whereas CD8+ and NK cells were less affected, allowing the body to preserve critical viral-clearing functions.

    The Computational Complexity of Generating Random Fractals

    In this paper we examine a number of models that generate random fractals. The models are studied using the tools of computational complexity theory from the perspective of parallel computation. Diffusion-limited aggregation and several widely used algorithms for equilibrating the Ising model are shown to be highly sequential; it is unlikely they can be simulated efficiently in parallel. This is in contrast to Mandelbrot percolation, which can be simulated in constant parallel time. Our research helps shed light on the intrinsic complexity of these models relative to each other and to different growth processes that have recently been studied using complexity theory. In addition, the results may serve as a guide to simulation physics.
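
    The contrast can be made concrete: in Mandelbrot percolation every retention decision is independent of every other, so with one processor per cell all decisions at all levels can be drawn simultaneously, which is the intuition behind the constant-parallel-time result. Purely as an illustration (the paper gives the formal construction), here is a minimal sketch of the model itself; the subdivision factor b and retention probability p are generic parameters, not values from the paper.

    ```python
    import numpy as np

    def mandelbrot_percolation(levels, b=2, p=0.7, seed=None):
        """Mandelbrot (fractal) percolation on a square lattice.

        Each level expands every surviving cell into a b x b block and
        then retains each sub-cell independently with probability p.
        Because all decisions are i.i.d., the whole construction is
        embarrassingly parallel.
        """
        rng = np.random.default_rng(seed)
        mask = np.ones((1, 1), dtype=bool)
        for _ in range(levels):
            mask = mask.repeat(b, axis=0).repeat(b, axis=1)
            mask &= rng.random(mask.shape) < p
        return mask

    fractal = mandelbrot_percolation(levels=6, seed=0)
    print(fractal.shape, fractal.mean())  # 64x64 grid, surviving fraction
    ```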

    The Computational Complexity of the Lorentz Lattice Gas

    The Lorentz lattice gas is studied from the perspective of computational complexity theory. It is shown that using massive parallelism, particle trajectories can be simulated in a time that scales logarithmically in the length of the trajectory. This result characterizes the "logical depth" of the Lorentz lattice gas and allows us to compare it to other models in statistical physics.
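
    One standard route to logarithmic time for a deterministic trajectory is parallel pointer jumping: fix the scatterer configuration, view one time step as a map on (site, direction) states, and square that map repeatedly, so the state after T steps needs only O(log T) composition rounds, each a fully parallel gather. Below is a minimal sketch of that idea, assuming fixed (non-flipping) scatterers so the dynamics form a permutation of states; the paper's actual construction may differ.

    ```python
    def simulate_fast(step, state, t):
        """Advance a deterministic dynamics by t steps in O(log t) rounds.

        `step[i]` is the successor of state i under one time step. Each
        squaring round recomputes jump[i] = the state reached from i in
        twice as many steps; on a PRAM with one processor per state,
        a round takes constant time.
        """
        jump = list(step)
        while t:
            if t & 1:
                state = jump[state]
            jump = [jump[j] for j in jump]  # compose the map with itself
            t >>= 1
        return state

    # Example: a 4-state cycle, advanced 10**9 steps in ~30 rounds.
    print(simulate_fast([1, 2, 3, 0], 0, 10**9))  # 10**9 % 4 == 0 -> 0
    ```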

    Theoretically Efficient Parallel Graph Algorithms Can Be Fast and Scalable

    There has been significant recent interest in parallel graph processing due to the need to quickly analyze the large graphs available today. Many graph codes have been designed for distributed memory or external memory. However, today even the largest publicly-available real-world graph (the Hyperlink Web graph with over 3.5 billion vertices and 128 billion edges) can fit in the memory of a single commodity multicore server. Nevertheless, most experimental work in the literature reports results on much smaller graphs, and the studies that process the Hyperlink graph use distributed or external memory. Therefore, it is natural to ask whether we can efficiently solve a broad class of graph problems on this graph in memory. This paper shows that theoretically-efficient parallel graph algorithms can scale to the largest publicly-available graphs using a single machine with a terabyte of RAM, processing them in minutes. We give implementations of theoretically-efficient parallel algorithms for 20 important graph problems. We also present the optimizations and techniques that we used in our implementations, which were crucial in enabling us to process these large graphs quickly. We show that the running times of our implementations outperform existing state-of-the-art implementations on the largest real-world graphs. For many of the problems that we consider, this is the first time they have been solved on graphs at this scale. We have made the implementations developed in this work publicly available as the Graph-Based Benchmark Suite (GBBS).
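
    GBBS itself is written in C++; purely for orientation, the sketch below shows in Python the frontier-based pattern that work-efficient parallel graph algorithms of this kind share: each round touches only the current frontier's edges (O(n + m) work overall) and the number of rounds is bounded by the graph diameter. The function and variable names are ours, not GBBS APIs.

    ```python
    def bfs_frontiers(adj, source):
        """Frontier-based BFS. In a parallel runtime the two inner loops
        become parallel-fors and the parent check becomes a compare-and-
        swap; the sequential version below keeps the same structure.
        """
        parent = {source: source}
        frontier = [source]
        while frontier:
            next_frontier = []
            for u in frontier:            # parallel-for over the frontier
                for v in adj[u]:          # parallel-for over outgoing edges
                    if v not in parent:   # CAS in the parallel setting
                        parent[v] = u
                        next_frontier.append(v)
            frontier = next_frontier
        return parent

    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
    print(bfs_frontiers(adj, 0))  # {0: 0, 1: 0, 2: 0, 3: 1}
    ```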

    Dental attendance, restoration and extractions in adults with intellectual disabilities compared with the general population: a record linkage study

    Background: Oral health may be poorer in adults with intellectual disabilities (IDs), who rely on carer support and take medications with increased dental risks. Methods: Record linkage study of dental outcomes, and their associations with anticholinergic (e.g. antipsychotic) and sugar-containing liquid medications, in adults with IDs compared with age-, sex- and neighbourhood-deprivation-matched general population controls. Results: A total of 2933/4305 (68.1%) adults with IDs and 7761/12,915 (60.1%) without IDs attended dental care: odds ratio (OR) = 1.42 [1.32, 1.53]; 1359 (31.6%) with IDs versus 5233 (40.5%) without IDs had restorations: OR = 0.68 [0.63, 0.73]; and 567 (13.2%) with IDs versus 2048 (15.9%) without IDs had dental extractions: OR = 0.80 [0.73, 0.89]. Group differences in attendance were greatest at younger ages, and differences in restorations/extractions were greatest at older ages. Adults with IDs were more likely to be prescribed anticholinergics (2493 (57.9%) vs. 6235 (48.3%): OR = 1.49 [1.39, 1.59]) and sugar-containing liquids (1641 (38.1%) vs. 2315 (17.9%): OR = 2.89 [2.67, 3.12]). Conclusion: Carers support dental appointments, but dentists may be less likely to restore teeth, possibly extracting multiple teeth at individual appointments instead.
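
    For readers checking the arithmetic, the unadjusted odds ratios follow directly from the reported counts; the sketch below reproduces the attendance OR. Note that the paper's estimates may additionally account for the matched design, so this is the crude calculation only.

    ```python
    def odds_ratio(cases_a, n_a, cases_b, n_b):
        """Unadjusted odds ratio from a 2x2 table of counts."""
        return (cases_a / (n_a - cases_a)) / (cases_b / (n_b - cases_b))

    # Dental attendance: 2933/4305 adults with IDs vs. 7761/12,915 controls.
    print(round(odds_ratio(2933, 4305, 7761, 12915), 2))  # 1.42
    ```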

    The Parallel Complexity of Growth Models

    This paper investigates the parallel complexity of several non-equilibrium growth models. Invasion percolation, Eden growth, ballistic deposition and solid-on-solid growth are all seemingly highly sequential processes that yield self-similar or self-affine random clusters. Nonetheless, we present fast parallel randomized algorithms for generating these clusters. The running times of the algorithms scale as O(log^2 N), where N is the system size, and the number of processors required scales as a polynomial in N. The algorithms are based on fast parallel procedures for finding minimum-weight paths; they illuminate the close connection between growth models and self-avoiding paths in random environments. In addition to their potential practical value, our algorithms serve to classify these growth models as less complex than other growth models, such as diffusion-limited aggregation, for which fast parallel algorithms probably do not exist.
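
    For orientation, the sketch below is the standard sequential definition of one of these models, invasion percolation: always invade the boundary site of smallest random weight. This greedy rule is exactly what ties the grown cluster to minimum-weight paths in a random environment; the paper's contribution is a parallel algorithm for the same model, which the sketch does not attempt to reproduce.

    ```python
    import heapq
    import random

    def invasion_percolation(n, steps, seed=0):
        """Grow an invasion-percolation cluster on an n x n grid by
        repeatedly invading the boundary site with the smallest weight.
        """
        random.seed(seed)
        weight = [[random.random() for _ in range(n)] for _ in range(n)]
        cluster = {(n // 2, n // 2)}
        boundary = []  # min-heap of (weight, x, y) candidate sites

        def push(x, y):
            if 0 <= x < n and 0 <= y < n and (x, y) not in cluster:
                heapq.heappush(boundary, (weight[x][y], x, y))

        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            push(n // 2 + dx, n // 2 + dy)
        while boundary and len(cluster) < steps:
            _, x, y = heapq.heappop(boundary)
            if (x, y) in cluster:
                continue  # stale heap entry
            cluster.add((x, y))
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                push(x + dx, y + dy)
        return cluster

    print(len(invasion_percolation(n=64, steps=500)))  # 500
    ```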

    Parallel Algorithm and Dynamic Exponent for Diffusion-limited Aggregation

    A parallel algorithm for "diffusion-limited aggregation" (DLA) is described and analyzed from the perspective of computational complexity. The dynamic exponent z of the algorithm is defined with respect to the probabilistic parallel random-access machine (PRAM) model of parallel computation according to T ~ L^z, where L is the cluster size, T is the running time, and the algorithm uses a number of processors polynomial in L. It is argued that z = D - D_2/2, where D is the fractal dimension and D_2 is the second generalized dimension. Simulations of DLA are carried out to measure D_2 and to test scaling assumptions employed in the complexity analysis of the parallel algorithm. It is plausible that the parallel algorithm attains the minimum possible value of the dynamic exponent, in which case z characterizes the intrinsic history dependence of DLA.
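
    For orientation, the model being parallelized is easy to state sequentially: walkers released far from the cluster random-walk until they touch it and stick. Each particle's fate depends on the cluster built so far, which is the history dependence that z quantifies. Below is a minimal on-lattice sketch; the release points, escape box and parameters are our simplifications, and it is practical only at small sizes.

    ```python
    import random

    def dla(n_particles, radius=20, seed=1):
        """Minimal sequential diffusion-limited aggregation on Z^2."""
        random.seed(seed)
        cluster = {(0, 0)}
        moves = ((1, 0), (-1, 0), (0, 1), (0, -1))
        for _ in range(n_particles):
            # Release each walker on the axis at distance `radius`.
            x, y = random.choice(((radius, 0), (-radius, 0),
                                  (0, radius), (0, -radius)))
            while True:
                dx, dy = random.choice(moves)
                x, y = x + dx, y + dy
                if abs(x) > 2 * radius or abs(y) > 2 * radius:
                    break  # walker wandered off; abandon it
                if any((x + mx, y + my) in cluster for mx, my in moves):
                    cluster.add((x, y))  # touched the cluster: stick
                    break
        return cluster

    print(len(dla(100)))  # cluster size after 100 released walkers
    ```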

    Cardiometabolic risk factors, peripheral arterial tonometry and metformin in adults with type 1 diabetes participating in the REducing with MetfOrmin Vascular Adverse Lesions trial

    BACKGROUND: Peripheral arterial tonometry (PAT) provides non-invasive measures of vascular health. Beneficial effects of metformin on vascular function have been reported in youth with type 1 diabetes (T1D). In the REducing with MetfOrmin Vascular Adverse Lesions (REMOVAL) trial in adults with T1D and high cardiovascular risk, we examined: (i) the extent to which routinely measured cardiometabolic risk factors explain variance in baseline PAT; and (ii) the effects of metformin on PAT measures. METHODS: Cross-sectional univariable and multivariable analyses of baseline reactive hyperaemia index (RHI) and augmentation index (AI) (EndoPAT®, Itamar, Israel), and analysis of 36 months of metformin versus placebo on vascular tonometry. RESULTS: In 364 adults ((mean ± SD) age 55.2 ± 8.5 years, T1D duration 34.0 ± 10.6 years, HbA1c 64.5 ± 9.0 mmol/mol (8.1 ± 0.8%)), RHI was 2.26 ± 0.74 and AI was 15.9 ± 19.2%. In an exhaustive search, independent associates of (i) RHI were smoking, waist circumference, systolic blood pressure and vitamin B12 (adjusted R^2 = 0.11) and of (ii) AI were male sex, pulse pressure, heart rate and waist circumference (adjusted R^2 = 0.31). Metformin did not significantly affect RHI or AI. CONCLUSION: Cardiometabolic risk factors explained only a modest proportion of the variance in PAT measures of vascular health in adults with T1D and high cardiovascular risk. PAT measures were not affected by metformin.

    Allopurinol and cardiovascular outcomes in patients with ischaemic heart disease: the ALL-HEART RCT and economic evaluation

    Background: Allopurinol is a xanthine oxidase inhibitor that lowers serum uric acid and is used to prevent acute gout flares in patients with gout. Observational and small interventional studies have suggested beneficial cardiovascular effects of allopurinol. Objective: To determine whether allopurinol improves major cardiovascular outcomes in patients with ischaemic heart disease. Design: Prospective, randomised, open-label, blinded-endpoint multicentre clinical trial. Setting: Four hundred and twenty-four UK primary care practices. Participants: Adults aged 60 years and over with ischaemic heart disease but no gout. Interventions: Participants were randomised (1:1) using a central web-based randomisation system to receive allopurinol up to 600 mg daily added to usual care, or to continue usual care alone. Main outcome measures: The primary outcome was the composite of non-fatal myocardial infarction, non-fatal stroke or cardiovascular death. Secondary outcomes were non-fatal myocardial infarction, non-fatal stroke, cardiovascular death, all-cause mortality, hospitalisation for heart failure, hospitalisation for acute coronary syndrome, coronary revascularisation, hospitalisation for acute coronary syndrome or coronary revascularisation, all cardiovascular hospitalisations, quality of life and cost-effectiveness. The hazard ratio (allopurinol vs. usual care) in a Cox proportional hazards model was assessed for superiority in a modified intention-to-treat analysis. Results: From 7 February 2014 to 2 October 2017, 5937 participants were enrolled and randomised to the allopurinol arm (n = 2979) or the usual care arm (n = 2958). A total of 5721 randomised participants (2853 allopurinol; 2868 usual care) were included in the modified intention-to-treat analysis population (mean age 72.0 years; 75.5% male). There was no difference between the allopurinol and usual care arms in the primary endpoint: 314 (11.0%) participants in the allopurinol arm (2.47 events per 100 patient-years) versus 325 (11.3%) in the usual care arm (2.37 events per 100 patient-years); hazard ratio 1.04 (95% confidence interval 0.89 to 1.21); p = 0.65. Two hundred and eighty-eight (10.1%) participants in the allopurinol arm and 303 (10.6%) participants in the usual care arm died: hazard ratio 1.02 (95% confidence interval 0.87 to 1.20); p = 0.77. The pre-specified health economic analysis plan was to perform a 'within-trial' cost-utility analysis if there was no statistically significant difference in the primary endpoint, so NHS costs and quality-adjusted life-years were estimated over a 5-year period. Costs were £115 higher for allopurinol (95% confidence interval £17 to £210), with no difference in quality-adjusted life-years (95% confidence interval −0.061 to +0.060). We conclude that there is no evidence that allopurinol used in line with the study protocol is cost-effective. Limitations: The results may not be generalisable to younger populations, other ethnic groups or patients with more acute ischaemic heart disease. One thousand six hundred and thirty-seven participants (57.4%) in the allopurinol arm withdrew from randomised treatment, but an on-treatment analysis gave similar results to the main analysis. Conclusions: The ALL-HEART study showed that treatment with allopurinol 600 mg daily did not improve cardiovascular outcomes compared with usual care in patients with ischaemic heart disease. We conclude that allopurinol should not be recommended for the secondary prevention of cardiovascular events in patients with ischaemic heart disease but no gout.