394 research outputs found
Control intervention design for preclinical and clinical trials: Consensus-based core recommendations from the third Stroke Recovery and Rehabilitation Roundtable
Control comparator selection is a critical trial design issue. Preclinical and clinical investigators conducting trials of stroke recovery and rehabilitation interventions must carefully consider the appropriateness and relevance of their chosen control comparator, as the benefit of an experimental intervention is established relative to a comparator. Establishing a strong rationale for a selected comparator improves the integrity of the trial and the validity of its findings. This Stroke Recovery and Rehabilitation Roundtable (SRRR) taskforce used a graph theory voting system to rank the importance and ease of addressing challenges during control comparator design. “Identifying appropriate type of control” was ranked easy to address and very important, “variability in usual care” was ranked hard to address and of low importance, and “understanding the content of the control and how it differs from the experimental intervention” was ranked very important but not easy to address. The CONtrol DeSIGN (CONSIGN) decision support tool was developed to address the identified challenges and enhance comparator selection, description, and reporting. CONSIGN is a web-based tool comprising seven steps that guide the user through control comparator design. The tool was refined through multiple rounds of pilot testing involving more than 130 people working in neurorehabilitation research. Four hypothetical exemplar trials, spanning preclinical, mood, aphasia, and motor recovery, demonstrate how the tool can be applied in practice. Six consensus recommendations are defined that span research domains, professional disciplines, and international borders.
Timing and Dose of Upper Limb Motor Intervention After Stroke: A Systematic Review
This systematic review aimed to investigate the timing, dose, and efficacy of upper limb intervention during the first 6 months poststroke. Three online databases were searched up to July 2020. Titles, abstracts, and full texts were reviewed independently by 2 authors. Randomized and nonrandomized studies that enrolled people within the first 6 months poststroke, aimed to improve upper limb recovery, and completed preintervention and postintervention assessments were included. Risk of bias was assessed using Cochrane reporting tools. Studies were examined by timing (recovery epoch), dose, and intervention type. Two hundred and sixty-one studies were included, representing 228 unique data sets (n=9704 participants). The number of studies completed increased from one (n=37 participants) between 1980 and 1984 to 91 (n=4417 participants) between 2015 and 2019. Timing of intervention start has not changed (median 38 days, interquartile range [IQR] 22–66) and study sample sizes remain small (median n=30, IQR 20–48). Most studies (62%) were rated high risk of bias. Study participants were enrolled at different recovery epochs: 1 hyperacute (<24 hours), 13 acute (1–7 days), 176 early subacute (8–90 days), 34 late subacute (91–180 days), and 4 unable to be classified to an epoch. For both the intervention and control groups, the median dose was 45 (IQR, 600–1430) min/session, 1 (IQR, 1–1) session/d, 5 (IQR, 5–5) d/wk for 4 (IQR, 3–5) weeks. The most common interventions tested were electromechanical (n=55 studies), electrical stimulation (n=38 studies), and constraint-induced movement (n=28 studies) therapies. Despite a large and growing body of research, the intervention dose and sample size of included studies were often too small to detect clinically important effects. Furthermore, interventions remain focused on subacute stroke recovery, with little change in recent decades.
A united research agenda that establishes a clear biological understanding of timing, dose, and intervention type is needed to progress stroke recovery research. International Prospective Register of Systematic Reviews (PROSPERO) IDs: CRD42018019367/CRD42018111629.
Atypical audiovisual speech integration in infants at risk for autism
The language difficulties often seen in individuals with autism might stem from an inability to integrate audiovisual information, a skill important for language development. We investigated whether 9-month-old siblings of older children with autism, who are at an increased risk of developing autism, are able to integrate audiovisual speech cues. We used an eye-tracker to record where infants looked when shown a screen displaying two faces of the same model, one articulating /ba/ and the other /ga/, with one face congruent with the syllable sound being presented simultaneously and the other face incongruent. This method was successful in showing that infants at low risk can integrate audiovisual speech: they looked for the same amount of time at the mouths in both the fusible visual /ga/ − audio /ba/ and the congruent visual /ba/ − audio /ba/ displays, indicating that the auditory and visual streams fuse into a McGurk-type syllabic percept in the incongruent condition. It also showed that low-risk infants could perceive a mismatch between auditory and visual cues: they looked longer at the mouth in the mismatched, non-fusible visual /ba/ − audio /ga/ display compared with the congruent visual /ga/ − audio /ga/ display, demonstrating that they perceive an uncommon, and therefore interesting, speech-like percept when looking at the incongruent mouth (repeated-measures ANOVA, displays × fusion/mismatch conditions interaction: F(1,16) = 17.153, p = 0.001). The looking behaviour of high-risk infants did not differ according to the type of display, suggesting difficulties in matching auditory and visual information (repeated-measures ANOVA, displays × conditions interaction: F(1,25) = 0.09, p = 0.767), in contrast to low-risk infants (repeated-measures ANOVA, displays × conditions × low/high-risk groups interaction: F(1,41) = 4.466, p = 0.041). In some cases this reduced ability might lead to the poor communication skills characteristic of autism.
Procalcitonin (PCT) and C-reactive Protein (CRP) as severe systemic infection markers in febrile neutropenic adults
Abstract

Background

Procalcitonin (PCT) is an inflammatory marker that has been used as an indicator of severe bacterial infection. We evaluated PCT concentrations as a marker of systemic infection, compared with C-reactive protein (CRP), in febrile neutropenic patients.

Methods

52 adult patients were enrolled in the study. Blood samples were collected at the onset of fever to determine serum concentrations of PCT, CRP and other hematological parameters. The patients were divided into 2 groups: one with severe infection (n = 26) and one in which the patients did not present such an infection (n = 26). PCT and CRP concentrations at fever onset were then compared between groups using nonparametric statistical tests, ROC curves, sensitivity, specificity, likelihood ratios, and Spearman's correlation coefficient.

Results

Mean PCT was significantly higher in the group with severe infection (6.7 ng/mL versus 0.6 ng/mL; p = 0.0075), in contrast to CRP. A PCT serum concentration of 0.245 ng/mL displayed 100% sensitivity and 69.2% specificity. A PCT concentration of 2.145 ng/mL presented a likelihood ratio of 13, which was not observed for any concentration of CRP.
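As an illustrative aside (not the study's own code or data), the sensitivity, specificity and positive likelihood ratio reported for such cut-offs can be computed from marker values and infection labels as follows; the marker values and labels below are invented for the sketch:

```python
def diagnostic_stats(values, has_infection, threshold):
    """Sensitivity, specificity and positive likelihood ratio for the
    rule 'marker >= threshold predicts severe infection'."""
    tp = sum(1 for v, d in zip(values, has_infection) if d and v >= threshold)
    fn = sum(1 for v, d in zip(values, has_infection) if d and v < threshold)
    tn = sum(1 for v, d in zip(values, has_infection) if not d and v < threshold)
    fp = sum(1 for v, d in zip(values, has_infection) if not d and v >= threshold)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    return sens, spec, lr_pos

# Hypothetical PCT values (ng/mL) and severe-infection labels, illustration only.
pct = [0.1, 0.2, 0.3, 0.5, 0.9, 2.5, 4.0, 8.0]
severe = [False, False, True, False, True, False, True, True]
sens, spec, lr = diagnostic_stats(pct, severe, threshold=0.9)
# sens = 0.75, spec = 0.75, lr = 3.0
```

The positive likelihood ratio is sensitivity / (1 − specificity), so the reported ratio of 13 means a result above that cut-off is 13 times more likely in patients with severe infection than in those without.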
Conclusion

PCT seems to be a useful marker for the diagnosis of systemic infection in febrile neutropenic patients, probably better than CRP.
Theoretically optimal forms for very long span bridges under gravity loading
Electronic supplementary material is available online at https://doi.org/10.6084/m9.figshare.c.4218686. © 2018 The Authors. Long-span bridges have traditionally employed suspension or cable-stayed forms, comprising vertical pylons and networks of cables supporting a bridge deck. However, the optimality of such forms over very long spans appears never to have been rigorously assessed, and the theoretically optimal form for a given span carrying gravity loading has remained unknown. To address this, we here describe a new numerical layout optimization procedure capable of intrinsically modelling the self-weight of the constituent structural elements, and use this to identify the form requiring the minimum volume of material for a given span. The bridge forms identified are complex and differ markedly from traditional suspension and cable-stayed bridge forms. Simplified variants incorporating split pylons are also presented. Although these would still be challenging to construct in practice, a benefit is that they are capable of spanning much greater distances for a given volume of material than traditional suspension and cable-stayed forms employing vertical pylons, particularly when very long spans (e.g. over 2 km) are involved. Engineering and Physical Sciences Research Council and Expedition Engineering Ltd.
Biological variation of measured and estimated glomerular filtration rate in patients with chronic kidney disease
When assessing changes in glomerular filtration rate (GFR) it is important to differentiate pathological change from intrinsic biological and analytical variation. GFR is measured using complex reference methods (e.g. iohexol clearance). In clinical practice, measurements of creatinine and cystatin C are used in equations (e.g. Modification of Diet in Renal Disease [MDRD] or Chronic Kidney Disease Epidemiology Collaboration [CKD-EPI]) to provide estimated GFR. We studied the biological variability of measured and estimated GFR in twenty nephrology outpatients (10 male, 10 female; median age 71, range 50-80 years) with moderate CKD (GFR 30-59 mL/min/1.73 m2). Patients underwent weekly GFR measurement by iohexol clearance over four consecutive weeks. Simultaneously, GFR was estimated using the MDRD, CKD-EPIcreatinine, CKD-EPIcystatinC and CKD-EPIcreatinine+cystatinC equations. Within-subject biological variation (CVI), expressed as a percentage [95% CI], was broadly equivalent for the MDRD (5.0% [4.3-6.1]), CKD-EPIcreatinine (5.3% [4.5-6.4]), CKD-EPIcystatinC (5.3% [4.5-6.5]), and CKD-EPIcreatinine+cystatinC (5.0% [4.3-6.2]) equations. CVI values for MDRD and CKD-EPIcreatinine+cystatinC were lower (p=0.027 and p=0.022 respectively) than that of measured GFR (6.7% [5.6-8.2]). Reference change values (RCVs), the point at which a true change in a biomarker in an individual can be inferred to have occurred with 95% probability, were calculated: using the MDRD equation, the positive and negative RCVs were 15.1% and 13.1% respectively. If an individual's baseline MDRD estimated GFR (mL/min/1.73 m2) was 59, significant increases or decreases would be to values >68 or <51 respectively. Within-subject variability of estimated GFR is lower than that of measured GFR. RCVs can be used to understand GFR changes in clinical practice.
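The asymmetric positive and negative RCVs quoted above are characteristic of a log-normal treatment of variability. As a rough sketch, assuming a total within-subject CV of 5.0% (the MDRD CVI alone, ignoring analytical variation, so the figures will not exactly match the paper's 15.1%/13.1%):

```python
import math

def rcv_lognormal(cv, z=1.96):
    """Positive and negative reference change values (as fractions)
    under a log-normal model of within-subject variation."""
    sigma = math.sqrt(math.log(1 + cv ** 2))  # SD on the log scale
    delta = z * math.sqrt(2) * sigma          # change required at 95% probability
    return math.exp(delta) - 1, math.exp(-delta) - 1

pos, neg = rcv_lognormal(0.05)  # CVI = 5.0% for the MDRD equation
# pos ~ +0.149 (+14.9%), neg ~ -0.129 (-12.9%)
```

Applied to a baseline eGFR of 59 mL/min/1.73 m2, these fractions give limits of roughly 59 × 1.149 ≈ 68 and 59 × 0.871 ≈ 51, in line with the values in the abstract.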
The characterisation of microsatellite markers reveals tetraploidy in the Greater Water Parsnip, Sium latifolium (Apiaceae).
BACKGROUND:
The Greater Water Parsnip, Sium latifolium (Apiaceae), is a marginal aquatic perennial currently endangered in England and consequently the focus of a number of conservation translocation projects. Microsatellite markers were developed for S. latifolium to facilitate comparison of genetic diversity and composition between natural and introduced populations.
RESULTS:
We selected 65 S. latifolium microsatellite sequences from MiSeq data and designed primer pairs for these. Primer sets were tested in 32 individuals. We found 15 polymorphic loci that amplified consistently. For these 15 loci, the number of alleles per locus ranged from 8 to 17. Across all loci, individual S. latifolium plants displayed up to four alleles, indicating polyploidy in this species.
CONCLUSIONS:
These are the first microsatellite loci developed for S. latifolium, and each individual displayed 1-4 alleles per locus, suggesting polyploidy in this species. These markers provide a valuable resource for evaluating the population genetic composition of this endangered species and will thus be useful for guiding conservation and future translocations of the species.
The dpsA Gene of Streptomyces coelicolor: Induction of Expression from a Single Promoter in Response to Environmental Stress or during Development
The DpsA protein plays a dual role in Streptomyces coelicolor, both as part of the stress response and by contributing to nucleoid condensation during sporulation. Promoter mapping experiments indicated that dpsA is transcribed from a single promoter dependent on SigB-like sigma factors. Expression studies implicate SigH and SigB as the sigma factors responsible for dpsA expression, while the contribution of other SigB-like factors is indirect, by means of controlling sigH expression. The promoter is massively induced in response to osmotic stress, in part due to its sensitivity to changes in DNA supercoiling. In addition, we determined that WhiB is required for dpsA expression, particularly during development. Gel retardation experiments revealed a direct interaction between apoWhiB and the dpsA promoter region, providing the first evidence for a direct WhiB target in S. coelicolor.
Utility of total lymphocyte count as a surrogate marker for CD4 counts in HIV-1 infected children in Kenya
Background

In resource-limited settings, such as Kenya, access to CD4 testing is limited. Therefore, evaluation of less expensive laboratory diagnostics is urgently needed to diagnose immuno-suppression in children.

Objectives

To evaluate the utility of total lymphocyte count (TLC) as a surrogate marker for CD4 count in HIV-infected children.

Methods

This was a hospital-based retrospective study conducted in three HIV clinics in Kisumu and Nairobi in Kenya. TLC, CD4 count and CD4 percent data were abstracted from hospital records of 487 antiretroviral-naïve HIV-infected children aged 1 month - 12 years.

Results

TLC and CD4 count were positively correlated (r = 0.66, p < 0.001), with the highest correlation seen in children with severe immuno-suppression (r = 0.72, p < 0.001) and children >59 months of age (r = 0.68, p < 0.001). Children were considered to have severe immuno-suppression if they met the following WHO CD4 count thresholds: age below 12 months, CD4 count < 1500 cells/mm3; age 12-35 months, CD4 count < 750 cells/mm3; age 36-59 months, CD4 count < 350 cells/mm3; and age above 59 months, CD4 count < 200 cells/mm3. WHO recommended TLC threshold values for severe immuno-suppression of 4000, 3000, 2500 and 2000 cells/mm3 for age categories <12, 12-35, 36-59 and >59 months had low sensitivity of 25%, 23%, 33% and 62% respectively in predicting severe immuno-suppression using CD4 count as the gold standard. Raising TLC thresholds to 7000, 6000, 4500 and 3000 cells/mm3 for each of the stated age categories increased sensitivity to 71%, 64%, 56% and 86%, with positive predictive values of 85%, 61%, 37% and 68% respectively, but reduced specificity to 73%, 62%, 54% and 68%, with negative predictive values of 54%, 65%, 71% and 87% respectively.

Conclusion

TLC is positively correlated with absolute CD4 count in children, but current WHO age-specific thresholds had low sensitivity to identify severely immunosuppressed Kenyan children. Sensitivity, and therefore the utility of TLC to identify immuno-suppressed children, may be improved by raising the TLC cut-off levels across the various age categories.
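As a hedged illustration of how such predictive values depend on disease prevalence (the prevalence below is invented for the sketch; the study's group-specific prevalences are not given here), PPV and NPV follow from sensitivity and specificity via Bayes' rule:

```python
def predictive_values(sens, spec, prevalence):
    """Positive and negative predictive values from sensitivity,
    specificity and disease prevalence, via Bayes' rule."""
    tp = sens * prevalence              # true-positive fraction of population
    fp = (1 - spec) * (1 - prevalence)  # false-positive fraction
    fn = (1 - sens) * prevalence        # false-negative fraction
    tn = spec * (1 - prevalence)        # true-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

# E.g. the >59-month category with the raised TLC cut-off (sens 86%, spec 68%);
# a prevalence of 0.5 is a hypothetical value chosen for illustration.
ppv, npv = predictive_values(0.86, 0.68, 0.5)
```

Because PPV rises and NPV falls as prevalence increases, the same TLC cut-off can perform quite differently across clinics with different case mixes.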