
    Aquatic treadmill running reduces muscle soreness following intense sprint exercise in trained men

    Delayed-onset muscle soreness is associated with muscle damage, disturbances in proprioception, and decreases in muscular power. The purpose was to determine whether short-duration aquatic treadmill (ATM) running reduces muscle soreness following intense sprint exercise in trained men. Twenty trained men (180.3±4.4 cm, 86.3±5.8 kg, 20±1 yr) were recruited and randomly divided into two groups: ATM recovery (ATMRec) and passive recovery (PRec). During testing, subjects performed a warm-up followed by sixteen 110 yd cutback runs consisting of a 60 yd sprint, a sharp change of direction, and a 50 yd return sprint. The work-to-rest ratio was set at 1:3. Following exercise, the ATMRec group performed ATM running on a HydroWorx® treadmill at 5 mph, 50% maximal jet resistance, and chest-depth water (33°C) for 10 min. Both groups then rated their soreness/pain on a numerical rating scale (NRS: 0-10, 0 = no pain, 10 = worst pain) immediately post exercise (IPE) and at 24 h and 48 h post exercise for the following regions: ARMS, LEGS, BACK, CHEST, SHOULDERS, HIPS, ABDOMEN, NECK, OVERALL. Data were analyzed for group × time interactions using a 2×3 generalized linear mixed model for non-parametric data (α≤0.05). For significant interactions, the same procedure was used to analyze between-group differences at the same measurement timepoint (α≤0.05).
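    A minimal sketch of how such a group × time analysis of NRS scores might be set up in Python, assuming a long-format table with hypothetical columns subject, group, time, region and nrs; the linear mixed model below treats the 0-10 score as continuous and is only a rough stand-in for the non-parametric generalized linear mixed model reported above.

        # Rough stand-in for the 2x3 group x time analysis described above.
        # Assumes a long-format CSV with one row per subject/timepoint/region;
        # the file name and column names are hypothetical, not from the study.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("nrs_scores.csv")            # subject, group, time, region, nrs
        overall = df[df["region"] == "OVERALL"]

        # Random intercept per subject; the group:time interaction terms are the
        # quantities of interest for the recovery-mode effect at 24 h and 48 h.
        model = smf.mixedlm("nrs ~ C(group) * C(time)", data=overall,
                            groups=overall["subject"])
        result = model.fit()
        print(result.summary())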

    A contemporary overview of percutaneous coronary interventions The American College of Cardiology–National Cardiovascular Data Registry (ACC–NCDR)

    Objectives: The American College of Cardiology (ACC) established the National Cardiovascular Data Registry (ACC–NCDR) to provide a uniform and comprehensive database for analysis of cardiovascular procedures across the country. The initial focus has been the high-volume, high-profile procedures of diagnostic cardiac catheterization and percutaneous coronary intervention (PCI). Background: Several large-scale multicenter efforts have evaluated diagnostic catheterization and PCI, but these have been limited by lack of standard definitions and relatively nonuniform data collection and reporting methods. Methods: Both clinical and procedural data, and adverse events occurring up to hospital discharge, were collected and reported according to uniform guidelines using a standard set of 143 data elements. Datasets were transmitted quarterly to a central facility for quality-control screening, storage and analysis. This report is based on PCI data collected from January 1, 1998, through September 30, 2000. Results: A total of 139 hospitals submitted data on 146,907 PCI procedures. Of these, 32% (46,615 procedures) were excluded because data did not pass quality-control screening. The remaining 100,292 procedures (68%) were included in the analysis set. Average age was 64 ± 12 years; 34% were women, 26% had diabetes mellitus, 29% had histories of prior myocardial infarction (MI), 32% had prior PCI and 19% had prior coronary bypass surgery. In 10% the indication for PCI was acute MI ≤6 h from onset, while in 52% it was class II to IV or unstable angina. Only 5% of procedures did not have a class I indication by ACC criteria, but this varied by hospital from a low of 0% to a high of 38%. A coronary stent was placed in 77% of procedures, but this varied by hospital from a low of 0% to a high of 97%. The frequencies of in-hospital Q-wave MI, coronary artery bypass graft surgery and death were 0.4%, 1.9% and 1.4%, respectively. Mortality varied by hospital from a low of 0% to a high of 4.2%. Conclusions: This report presents the first data collected and analyzed by the ACC–NCDR. It portrays a contemporary overview of coronary interventional practices and outcomes, using uniform data collection and reporting standards. These data reconfirm overall acceptable results that are consistent with other reported data, but also confirm large variations between individual institutions.

    Development of a risk adjustment mortality model using the American College of Cardiology–National Cardiovascular Data Registry (ACC–NCDR) experience: 1998–2000

    Objectives: We sought to develop and evaluate a risk adjustment model for in-hospital mortality following percutaneous coronary intervention (PCI) procedures using data from a large, multi-center registry. Background: The 1998–2000 American College of Cardiology–National Cardiovascular Data Registry (ACC–NCDR) dataset was used to overcome limitations of prior risk-adjustment analyses. Methods: Data on 100,253 PCI procedures collected at the ACC–NCDR between January 1, 1998, and September 30, 2000, were analyzed. A training set/test set approach was used. Separate models were developed for presentation with and without acute myocardial infarction (MI) within 24 h. Results: Factors associated with increased risk of PCI mortality (with odds ratios in parentheses) included cardiogenic shock (8.49), increasing age (2.61 to 11.25), salvage (13.38), urgent (1.78) or emergent PCI (5.75), pre-procedure intra-aortic balloon pump insertion (1.68), decreasing left ventricular ejection fraction (0.87 to 3.93), presentation with acute MI (1.31), diabetes (1.41), renal failure (3.04) and chronic lung disease (1.33); treatment approaches including thrombolytic therapy (1.39) and non-stent devices (1.64); and lesion characteristics including left main (2.04), proximal left anterior descending disease (1.97) and Society for Cardiac Angiography and Interventions lesion classification (1.64 to 2.11). Overall, excellent discrimination was achieved (C-index = 0.89), and application of the model to high-risk patient groups demonstrated C-indexes exceeding 0.80. Patient factors were more predictive in the MI model, while lesion and procedural factors were more predictive in the analysis of non-MI patients. Conclusions: A risk adjustment model for in-hospital mortality after PCI was successfully developed using a contemporary multi-center registry. This model is an important tool for valid comparison of in-hospital mortality after PCI.
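    A hedged sketch of the usual train/test risk-adjustment workflow the abstract describes: fit a logistic regression for in-hospital mortality, read odds ratios off the exponentiated coefficients, and measure discrimination with the C-index (equivalent to the ROC AUC). The file and column names are hypothetical placeholders, not the actual ACC–NCDR data elements.

        # Generic PCI risk-adjustment sketch: logistic regression + C-index.
        # Column names below are placeholders, not the registry's definitions.
        import numpy as np
        import pandas as pd
        from sklearn.model_selection import train_test_split
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        pci = pd.read_csv("pci_procedures.csv")
        predictors = ["cardiogenic_shock", "age_band", "salvage", "urgent", "emergent",
                      "pre_procedure_iabp", "low_ef", "acute_mi", "diabetes",
                      "renal_failure", "chronic_lung_disease"]
        X, y = pci[predictors], pci["in_hospital_death"]

        # Training set / test set approach, as in the abstract.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.5, stratify=y, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

        odds_ratios = pd.Series(np.exp(model.coef_[0]), index=predictors)
        c_index = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(odds_ratios.round(2))
        print(f"C-index on held-out procedures: {c_index:.2f}")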

    Controlled in vitro delivery of voriconazole and diclofenac to the cornea using contact lenses for the treatment of Acanthamoeba keratitis

    Acanthamoeba keratitis is caused by a protozoal infection of the cornea, with 80% of cases involving the improper use of contact lenses. The infection causes intense pain and is potentially blinding. However, early diagnosis improves treatment efficacy and the chances of healing. Despite the apparent accessibility of the cornea, patients do not always respond well to current eye drop treatments, largely because of rapid dose loss from blinking and nasolacrimal drainage. Here, topical delivery of voriconazole, alone and in combination with diclofenac, from drug-loaded contact lenses was investigated in vitro. The contact lenses were applied onto excised porcine eyeballs maintained at 32°C under constant irrigation, with simulated tear fluid applied to mimic in vivo conditions. The drug delivered to the corneas was quantified by HPLC analysis. The system was further tested for cytotoxicity and in a scratch-wound repopulation model using corneal epithelial cells. Sustained drug delivery to the cornea was achieved, and for voriconazole the minimum inhibitory concentration (MIC) against Acanthamoeba castellanii was attained both alone and in combination with diclofenac. MTT and scratch-wound data showed reasonable cell proliferation and wound repopulation at the drug doses used, supporting further development of the system for the treatment of Acanthamoeba keratitis.

    Virtual Machine Support for Many-Core Architectures: Decoupling Abstract from Concrete Concurrency Models

    The upcoming many-core architectures require software developers to exploit concurrency in order to utilize the available computational power. Today's high-level language virtual machines (VMs), which are a cornerstone of software development, do not provide sufficient abstractions for concurrency concepts. We analyze concrete and abstract concurrency models and identify the challenges they impose on VMs. To provide sufficient concurrency support in VMs, we propose integrating concurrency operations into VM instruction sets. Since there will always be VMs optimized for special purposes, our goal is to develop a methodology for designing instruction sets with concurrency support. To that end, we also propose a list of trade-offs that have to be investigated to guide the design of such instruction sets. As a first experiment, we implemented one instruction set extension for shared-memory concurrency and one for non-shared-memory concurrency. From our experimental results, we derived a list of requirements for a fully fledged experimental environment for further research.
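    Purely as a toy illustration of what "concurrency operations in a VM instruction set" could look like (not the extensions actually implemented in this work), the sketch below contrasts shared-memory style operations with non-shared-memory (message-passing) style operations.

        # Hypothetical, illustrative opcode sets only; the names are invented here
        # and do not correspond to the paper's instruction set extensions.
        from enum import Enum, auto

        class SharedMemoryOp(Enum):
            SPAWN_THREAD = auto()   # start a thread that shares the heap
            LOCK = auto()           # acquire a monitor/mutex
            UNLOCK = auto()         # release it
            CAS = auto()            # atomic compare-and-swap on a heap cell

        class MessagePassingOp(Enum):
            SPAWN_ACTOR = auto()    # start an isolated actor with its own state
            SEND = auto()           # asynchronously send a message to an actor
            RECEIVE = auto()        # block until a message arrives
            YIELD = auto()          # hand control back to the scheduler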

    Mitochondria are required for pro-ageing features of the senescent phenotype

    Cell senescence is an important tumour suppressor mechanism and driver of ageing. Both functions are dependent on the development of the senescent phenotype, which involves an overproduction of pro-inflammatory and pro-oxidant signals. However, the exact mechanisms regulating these phenotypes remain poorly understood. Here, we show the critical role of mitochondria in cellular senescence. In multiple models of senescence, absence of mitochondria reduced a spectrum of senescence effectors and phenotypes while preserving ATP production via enhanced glycolysis. Global transcriptomic analysis by RNA sequencing revealed that a vast number of senescence-associated changes are dependent on mitochondria, particularly the pro-inflammatory phenotype. Mechanistically, we show that the ATM, Akt and mTORC1 phosphorylation cascade integrates signals from the DNA damage response (DDR) towards PGC-1β-dependent mitochondrial biogenesis, contributing to a ROS-mediated activation of the DDR and cell cycle arrest. Finally, we demonstrate that the reduction in mitochondrial content in vivo, by either mTORC1 inhibition or PGC-1β deletion, prevents senescence in the ageing mouse liver. Our results suggest that mitochondria are a candidate target for interventions to reduce the deleterious impact of senescence in ageing tissues.

    The effectiveness of convalescent plasma and hyperimmune immunoglobulin for the treatment of severe acute respiratory infections of viral etiology: a systematic review

    Background: Administration of convalescent plasma, serum, or hyperimmune immunoglobulin may be of clinical benefit for treatment of severe acute respiratory infections (SARIs) of viral etiology. We conducted a systematic review and exploratory meta-analysis to assess the overall evidence. Methods: Healthcare databases and sources of grey literature were searched in July 2013. All records were screened against the protocol eligibility criteria using a 3-stage process. Data extraction and risk of bias assessments were undertaken. Results: We identified 32 studies of SARS coronavirus infection and severe influenza. Narrative analyses revealed consistent evidence for a reduction in mortality, especially when convalescent plasma is administered early after symptom onset. Exploratory post hoc meta-analysis showed a statistically significant reduction in the pooled odds of mortality following treatment, compared with placebo or no therapy (odds ratio, 0.25; 95% confidence interval, 0.14–0.45; I² = 0%). Studies were commonly of low or very low quality, lacked control groups, and were at moderate or high risk of bias. Sources of clinical and methodological heterogeneity were identified. Conclusions: Convalescent plasma may reduce mortality and appears safe. This therapy should be studied within the context of a well-designed clinical trial or other formal evaluation, including for treatment of Middle East respiratory syndrome coronavirus (MERS-CoV) infection.
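    A minimal sketch of the inverse-variance (fixed-effect) pooling that underlies such an exploratory meta-analysis, together with the heterogeneity statistic I² derived from Cochran's Q. The 2×2 counts below are illustrative placeholders, not data from the review.

        # Fixed-effect pooling of study odds ratios plus I^2; placeholder counts only.
        import numpy as np

        # (deaths_treated, total_treated, deaths_control, total_control) per study
        studies = [(2, 40, 8, 40), (1, 20, 5, 21), (3, 80, 12, 75)]

        log_or, var = [], []
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c                      # survivors in each arm
            log_or.append(np.log((a * d) / (b * c)))   # log odds ratio
            var.append(1/a + 1/b + 1/c + 1/d)          # its approximate variance

        log_or, var = np.array(log_or), np.array(var)
        w = 1 / var                                    # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        ci_lo, ci_hi = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])

        q = np.sum(w * (log_or - pooled) ** 2)         # Cochran's Q
        i2 = max(0.0, (q - (len(studies) - 1)) / q) * 100 if q > 0 else 0.0

        print(f"Pooled OR {np.exp(pooled):.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f}), I^2 = {i2:.0f}%")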

    Search for Early Gamma-ray Production in Supernovae Located in a Dense Circumstellar Medium with the Fermi LAT

    Supernovae (SNe) exploding in a dense circumstellar medium (CSM) are hypothesized to accelerate cosmic rays in collisionless shocks and to emit GeV gamma rays and TeV neutrinos on a time scale of several months. We perform the first systematic search for gamma-ray emission in Fermi LAT data in the energy range from 100 MeV to 300 GeV from an ensemble of 147 Type IIn SNe exploding in dense CSM. We search for a gamma-ray excess at each SN location in a one-year time window. In order to enhance a possible weak signal, we simultaneously study the closest and optically brightest sources of our sample in a joint-likelihood analysis in three different time windows (1 year, 6 months and 3 months). For the most promising source of the sample, SN 2010jl (PTF10aaxf), we repeat the analysis with an extended time window lasting 4.5 years. We do not find a significant excess in gamma rays for any individual source or for the combined sources, and we provide model-independent flux upper limits for both cases. In addition, we derive limits on the gamma-ray luminosity and on the gamma-ray-to-optical luminosity ratio as a function of the index of the proton injection spectrum, assuming a generic gamma-ray production model. Furthermore, we present detailed flux predictions based on multi-wavelength observations and the corresponding flux upper limit at 95% confidence level (CL) for the source SN 2010jl (PTF10aaxf).
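    A toy sketch of the joint-likelihood idea, assuming simple Poisson counts at each source position with a shared signal-strength parameter: the per-source log-likelihoods are summed, and the 95% CL upper limit is taken where twice the log-likelihood drops by 2.71 below its maximum. Counts, backgrounds and exposures are illustrative placeholders, not LAT measurements.

        # Toy joint-likelihood upper limit over several sources (placeholder numbers).
        import numpy as np
        from scipy.stats import poisson

        observed   = np.array([12, 7, 9])        # counts in the on-source regions
        background = np.array([11.0, 6.5, 8.2])  # expected background counts
        exposure   = np.array([1.0, 0.8, 1.2])   # relative exposure per source

        def joint_loglike(mu):
            expected = background + mu * exposure
            return np.sum(poisson.logpmf(observed, expected))

        mu_grid = np.linspace(0.0, 30.0, 3001)
        loglike = np.array([joint_loglike(m) for m in mu_grid])

        # One-sided 95% CL: keep mu where 2*(lnL_max - lnL) <= 2.71, take the maximum.
        allowed = mu_grid[2 * (loglike.max() - loglike) <= 2.71]
        print(f"95% CL upper limit on the common signal strength: mu < {allowed.max():.2f}")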

    Search for extended gamma-ray emission from the Virgo galaxy cluster with Fermi-LAT

    Galaxy clusters are among the prime sites to search for dark matter (DM) annihilation signals. Depending on the substructure of the DM halo of a galaxy cluster and the cross sections of the DM annihilation channels, these signals might be detectable by the latest generation of γ-ray telescopes. Here we use three years of Fermi Large Area Telescope (LAT) data, which are the most suitable for searching for very extended emission in the vicinity of the nearby Virgo galaxy cluster. Our analysis reveals statistically significant extended emission which can be well characterized by a uniformly emitting disk profile with a radius of 3° that is, moreover, offset from the cluster center. We demonstrate that the significance of this extended emission strongly depends on the adopted interstellar emission model (IEM) and is most likely an artifact of our incomplete description of the IEM in this region. We also search for and find new point source candidates in the region. We then derive conservative upper limits on the velocity-averaged DM pair annihilation cross section from Virgo. We take into account the potential γ-ray flux enhancement due to DM sub-halos and the cluster's complex morphology as a merging system. For DM annihilating into $b\bar{b}$, assuming a conservative sub-halo model setup, we find limits that are between 1 and 1.5 orders of magnitude above the expectation from the thermal cross section for $m_{\mathrm{DM}} \lesssim 100\,\mathrm{GeV}$. In a more optimistic scenario, we exclude $\langle\sigma v\rangle \sim 3\times10^{-26}\,\mathrm{cm^{3}\,s^{-1}}$ for $m_{\mathrm{DM}} \lesssim 40\,\mathrm{GeV}$ for the same channel. Finally, we derive upper limits on the γ-ray flux produced by hadronic cosmic-ray interactions in the intracluster medium. We find that the volume-averaged cosmic-ray-to-thermal pressure ratio is less than ∼6%.
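    For context, the standard relation (stated here from general knowledge, not taken from this paper's specific modelling) that links a γ-ray flux limit to a limit on the annihilation cross section is

        \[
          \Phi_{\gamma}(E > E_{\min}) \;=\;
          \frac{\langle\sigma v\rangle}{8\pi\, m_{\mathrm{DM}}^{2}}
          \int_{E_{\min}}^{m_{\mathrm{DM}}} \frac{dN_{\gamma}}{dE}\, dE
          \;\times\;
          \underbrace{\int_{\Delta\Omega}\int_{\mathrm{l.o.s.}} \rho_{\mathrm{DM}}^{2}\bigl(r(\ell,\Omega)\bigr)\, d\ell\, d\Omega}_{J\text{-factor}}
        \]

    so that, once the J-factor (including any sub-halo boost) and the annihilation spectrum $dN_{\gamma}/dE$ of the chosen channel are fixed, a flux upper limit translates directly into an upper limit on $\langle\sigma v\rangle$ at each DM mass.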