
    Quantification of Renal Stone Contrast with Ultrasound in Human Subjects

    Purpose: Greater visual contrast between calculi and tissue would improve ultrasound (US) imaging of urolithiasis and potentially expand clinical use. The color Doppler twinkling artifact has been suggested to provide enhanced contrast of stones compared with brightness mode (B-mode) imaging, but results are variable. This work provides the first quantitative measure of stone contrast in humans for B-mode and color Doppler mode, forming the basis to improve US for the detection of stones. Materials and Methods: Using a research ultrasound system, B-mode imaging was tuned for detecting stones by applying a single transmit angle and reduced signal compression. Stone twinkling with color Doppler was tuned by using low-frequency transmit pulses, longer pulse durations, and a high pulse repetition frequency. Data were captured from 32 subjects, with 297 B-mode and Doppler images analyzed from 21 subjects exhibiting twinkling signals. The signal-to-clutter ratio (SCR; i.e., stone relative to background tissue) was used to compare the contrast of a stone on B-mode with color Doppler, and the contrast between stone twinkling and blood-flow signals within the kidney. Results: The stone was the brightest object in only 54% of B-mode images and in 100% of Doppler images containing stone twinkling. On average, stones were isoechoic with the tissue clutter on B-mode (SCR = 0 dB). Stone twinkling averaged 37 times greater contrast than B-mode (16 dB, p < 0.0001) and 3.5 times greater contrast than blood-flow signals (5.5 dB, p = 0.088). Conclusions: This study provides the first quantitative measure of US stone-to-tissue contrast in humans. Stone twinkling contrast is significantly greater than the contrast of a stone on B-mode. There was also a trend toward stone twinkling signals having greater contrast than blood-flow signals in the kidney. Dedicated optimization of B-mode and color Doppler stone imaging could improve US detection of stones.
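
    The reported ratios and dB values are consistent with a power-ratio (10·log10) definition of contrast; that convention is an inference from the numbers, not stated explicitly in the abstract. The sketch below illustrates the conversion and a generic SCR calculation on hypothetical stone and background image regions (the study's actual region selection and processing are not reproduced here).

```python
import numpy as np

def scr_db(stone_region, background_region):
    """Signal-to-clutter ratio in dB: mean stone signal power vs. mean background power.

    A generic illustration; the study's exact region selection and averaging
    are not specified here.
    """
    stone_power = np.mean(np.asarray(stone_region, dtype=float) ** 2)
    clutter_power = np.mean(np.asarray(background_region, dtype=float) ** 2)
    return 10.0 * np.log10(stone_power / clutter_power)

# Converting the reported linear contrast ratios to dB under a 10*log10 convention:
print(10 * np.log10(37))   # ~15.7 dB, in line with the reported 16 dB twinkling vs. B-mode difference
print(10 * np.log10(3.5))  # ~5.4 dB, in line with the reported 5.5 dB twinkling vs. blood-flow difference
```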

    First-in-human clinical trial of ultrasonic propulsion of kidney stones

    PURPOSE: Ultrasonic propulsion is a new technology using focused ultrasound energy applied transcutaneously to reposition kidney stones. We report what are, to our knowledge, the findings from the first human investigational trial of ultrasonic propulsion toward the applications of expelling small stones and dislodging large obstructing stones. MATERIALS AND METHODS: Subjects underwent ultrasonic propulsion while awake without sedation in clinic, or during ureteroscopy while anesthetized. Ultrasound and a pain questionnaire were completed before, during and after propulsion. The primary outcome was to reposition stones in the collecting system. Secondary outcomes included safety, controllable movement of stones, and movement of stones less than 5 mm and 5 mm or greater. Adverse events were assessed weekly for 3 weeks. RESULTS: Kidney stones were repositioned in 14 of 15 subjects. Of the 43 targets, 28 (65%) showed some level of movement while 13 (30%) were displaced greater than 3 mm to a new location. Discomfort during the procedure was rare, mild, brief and self-limited. Stones were moved in a controlled direction, with more than 30 fragments passed by 4 of the 6 subjects who had previously undergone a lithotripsy procedure. The largest stone moved was 10 mm. One patient experienced pain relief during treatment of a large stone at the ureteropelvic junction. In 4 subjects a seemingly large stone was determined to be a cluster of small passable stones after they were moved. CONCLUSIONS: Ultrasonic propulsion was able to successfully reposition stones and facilitate the passage of fragments in humans. No adverse events were associated with the investigational procedure.

    A Randomized Feasibility Trial of a Novel, Integrative, and Intensive Virtual Rehabilitation Program for Service Members Post-Acquired Brain Injury.

    INTRODUCTION: Acquired brain injury, whether resulting from traumatic brain injury (TBI) or cerebrovascular accident (CVA), represents a major health concern for the Department of Defense and the nation. TBI has been referred to as the signature injury of recent U.S. military conflicts in Iraq and Afghanistan, affecting approximately 380,000 service members from 2000 to 2017, whereas CVA has been estimated to affect 795,000 individuals each year in the United States. TBI and CVA often present with similar motor, cognitive, and emotional deficits; therefore, the treatment interventions for both often overlap. The Defense Health Agency and Veterans Health Administration would benefit from enhanced rehabilitation solutions to treat deficits resulting from acquired brain injuries (ABI), including both TBI and CVA. The purpose of this study was to evaluate the feasibility of implementing a novel, integrative, and intensive virtual rehabilitation system for treating symptoms of ABI in an outpatient clinic. The secondary aim was to evaluate the system's clinical effectiveness. MATERIALS AND METHODS: Military healthcare beneficiaries with ABI diagnoses completed a 6-week randomized feasibility study of the BrightBrainer Virtual Rehabilitation (BBVR) system in an outpatient military hospital clinic. Twenty-six candidates were screened, consented, and randomized, 21 of whom completed the study. The BBVR system is an experimental adjunct ABI therapy program which utilizes virtual reality and repetitive bilateral upper extremity training. Four self-report questionnaires measured participant and provider acceptance of the system. Seven clinical outcomes included the Fugl-Meyer Assessment of Upper Extremity, Box and Blocks Test, Jebsen-Taylor Hand Function Test, Automated Neuropsychological Assessment Metrics (ANAM), Neurobehavioral Symptom Inventory, Quick Inventory of Depressive Symptomatology-Self-Report, and Post Traumatic Stress Disorder Checklist-Civilian Version. The statistical analyses used bootstrapping, non-parametric statistics, and multilevel/hierarchical modeling as appropriate. This research was approved by the Walter Reed National Military Medical Center and Uniformed Services University of the Health Sciences Institutional Review Boards. RESULTS: All of the participants and providers reported moderate to high levels of utility, ease of use, and satisfaction with the BBVR system (x̄ = 73-86%). Adjunct therapy with the BBVR system trended towards statistical significance for the measure of cognitive function (ANAM: x̄ = -1.07, 95% CI -2.27 to 0.13, p = 0.074); however, none of the other effects approached significance. CONCLUSION: This research provides evidence for the feasibility of implementing the BBVR system into an outpatient military setting for treatment of ABI symptoms. It is believed these data justify conducting a larger, randomized trial of the clinical effectiveness of the BBVR system.
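
    As a rough illustration of one of the resampling methods named in the abstract, the sketch below computes a percentile-bootstrap confidence interval for a mean change score. The data, sample size handling, and interval type are assumptions for illustration; they do not reproduce the study's actual multilevel models or outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(scores, n_boot=10_000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean of a change score.

    A generic sketch of the kind of resampling analysis described in the
    abstract; it does not reproduce the study's models or data.
    """
    scores = np.asarray(scores, dtype=float)
    boot_means = np.array([
        rng.choice(scores, size=scores.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boot_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return scores.mean(), (lo, hi)

# Example with made-up pre/post change scores for 21 hypothetical completers:
changes = rng.normal(loc=-1.0, scale=2.5, size=21)
print(bootstrap_ci(changes))
```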

    Neurodevelopmental Outcome of Young Children with Biliary Atresia and Native Liver: Results from the ChiLDReN Study

    OBJECTIVES: To assess neurodevelopmental outcomes among participants with biliary atresia with their native liver at ages 12 months (group 1) and 24 months (group 2), and to evaluate variables predictive of neurodevelopmental impairment. STUDY DESIGN: Participants enrolled in a prospective, longitudinal, multicenter study underwent neurodevelopmental testing with either the Bayley Scales of Infant Development, 2nd edition, or Bayley Scales of Infant and Toddler Development, 3rd edition. Scores (normative mean = 100 ± 15) were categorized as ≥100, 85-99, and <85 for χ² analysis. Risk for neurodevelopmental impairment (defined as ≥1 score of <85 on the Bayley Scales of Infant Development, 2nd edition, or Bayley Scales of Infant and Toddler Development, 3rd edition, scales) was analyzed using logistic regression. RESULTS: There were 148 children who completed 217 Bayley Scales of Infant and Toddler Development, 3rd edition, examinations (group 1, n = 132; group 2, n = 85). Neurodevelopmental score distributions were significantly shifted downward compared with test norms at 1 and 2 years of age. Multivariate analysis identified ascites (OR, 3.17; P = .01) and low length z-scores at the time of testing (OR, 0.70; P < .04) as risk factors for physical/motor impairment, and low weight z-score (OR, 0.57; P = .001) and ascites (OR, 2.89; P = .01) as risk factors for mental/cognitive/language impairment at 1 year of age. An unsuccessful hepatoportoenterostomy was predictive of both physical/motor (OR, 4.88; P < .02) and mental/cognitive/language impairment (OR, 4.76; P = .02) at 2 years of age. CONCLUSION: Participants with biliary atresia surviving with native livers after hepatoportoenterostomy are at increased risk for neurodevelopmental delays at 12 and 24 months of age. Those with unsuccessful hepatoportoenterostomy are >4 times more likely to have neurodevelopmental impairment compared with those with successful hepatoportoenterostomy. Growth delays and/or complications indicating advanced liver disease should alert clinicians to the risk for neurodevelopmental delays and expedite appropriate interventions.
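
    The odds ratios quoted above come from logistic regression; the sketch below shows the general form of such a model and how exponentiated coefficients become odds ratios. It uses synthetic data and assumes the statsmodels package is available; the predictors and coding are illustrative only, not the study's analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 150

# Made-up predictors loosely mirroring those in the abstract (not study data):
ascites = rng.integers(0, 2, n)          # 1 = ascites present
weight_z = rng.normal(-0.5, 1.0, n)      # weight-for-age z-score
logit = -1.0 + 1.0 * ascites - 0.6 * weight_z
impaired = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = any Bayley score < 85

X = sm.add_constant(np.column_stack([ascites, weight_z]))
fit = sm.Logit(impaired, X).fit(disp=False)

# Exponentiated coefficients are odds ratios, the quantity reported in the abstract:
print(np.exp(fit.params))   # [intercept OR, ascites OR, weight z-score OR]
```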

    Improving burst wave lithotripsy effectiveness for small stones and fragments by increasing frequency: theoretical modeling and ex vivo study

    Introduction and Objective: In clinical trial NCT03873259, a 2.6-mm lower pole stone was treated transcutaneously and ex vivo with 390-kHz burst wave lithotripsy (BWL) for 40 minutes and failed to break. The stone was subsequently fragmented with 650-kHz BWL after a 4-minute exposure. This study investigated how to fragment small stones and why varying BWL frequency may more effectively fragment stones to dust. Methods: A linear elastic model was used to calculate the stress created inside stones by shock wave lithotripsy (SWL) and by different BWL frequencies, mimicking the stone's size, shape, lamellar structure, and composition. To test model predictions about the impact of BWL frequency, matched pairs of stones (1-5 mm) were treated at 1) 390 kHz, 2) 830 kHz, and 3) 390 kHz followed by 830 kHz. The mass of fragments greater than 1 mm and greater than 2 mm was measured over 10 minutes of exposure. Results: The linear elastic model predicts that the maximum principal stress inside a stone increases to more than 5.5 times the pressure applied by the ultrasound wave as frequency is increased, regardless of the composition tested. The threshold frequency for stress amplification is proportional to the wave speed divided by the stone diameter. Thus, smaller stones are more likely to fragment at higher frequencies and not at frequencies below this threshold. Unlike with SWL, this amplification in BWL occurs consistently for spherical and irregularly shaped stones. In water tank experiments, stones smaller than the threshold size broke fastest at high frequency (p = 0.0003), whereas larger stones broke equally well to sub-millimeter dust at high, low, or mixed frequency. Conclusions: For small stones and fragments, increasing the frequency of BWL can produce amplified stress in the stone, causing the stone to break. Using the strategies outlined here, stones of all sizes may be turned to dust efficiently with BWL.
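
    A minimal sketch of the scaling relationship stated above (threshold frequency proportional to wave speed divided by stone diameter) is given below. The proportionality constant and the assumed wave speed are placeholders for illustration; the abstract does not report their values.

```python
# Sketch of the stated scaling: f_threshold ~ k * c / d.
# k (dimensionless) and the stone wave speed are assumptions for illustration,
# not values reported in the abstract.

def threshold_frequency_hz(wave_speed_m_s, diameter_m, k=1.0):
    """Threshold frequency above which internal stress amplification is predicted."""
    return k * wave_speed_m_s / diameter_m

c_stone = 3000.0  # assumed longitudinal wave speed in stone material, m/s
for d_mm in (1.0, 2.6, 5.0):
    f = threshold_frequency_hz(c_stone, d_mm * 1e-3)
    print(f"{d_mm} mm stone: threshold ~ {f / 1e3:.0f} kHz")
```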

    The influence of groundwater abstraction on interpreting climate controls and extreme recharge events from well hydrographs in semi-arid South Africa

    There is a scarcity of long-term groundwater hydrographs from sub-Saharan Africa with which to investigate groundwater sustainability, processes and controls. This paper presents an analysis of 21 hydrographs from semi-arid South Africa. Hydrographs from 1980 to 2000 were converted to standardised groundwater level indices and rationalised into four types (C1–C4) using hierarchical cluster analysis. Mean hydrographs for each type were cross-correlated with standardised precipitation and streamflow indices. Relationships with the El Niño–Southern Oscillation (ENSO) were also investigated. The four hydrograph types show a transition of autocorrelation over increasing timescales and increasingly subdued responses to rainfall. Type C1 relates strongly to rainfall, responding in most years, whereas C4 responds notably to only a single extreme event in 2000 and has a limited relationship with rainfall. Types C2, C3 and C4 have stronger statistical relationships with standardised streamflow than with standardised rainfall. C3 and C4 changes are significantly (p < 0.05) correlated with the mean wet-season ENSO anomaly, indicating a tendency for substantial or minimal recharge to occur during extreme negative and positive ENSO years, respectively. The range of different hydrograph types, sometimes within only a few kilometres of each other, appears to be a result of abstraction interference and cannot be confidently attributed to variations in climate or hydrogeological setting. It is possible that high groundwater abstraction near C3/C4 sites masks the frequent small-scale recharge events observed at C1/C2 sites, resulting in extreme events associated with negative ENSO years being more visible in the time series.
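
    The sketch below outlines the general workflow described above: standardise each well's hydrograph and group the wells by hierarchical clustering. It uses a simple z-score in place of the full standardised groundwater level index calculation and synthetic data, so it is an illustration of the approach rather than the study's method.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
n_wells, n_months = 21, 240               # e.g., monthly records, 1980-2000 (synthetic)
levels = rng.normal(size=(n_wells, n_months)).cumsum(axis=1)  # fake groundwater levels

# Standardise each well's hydrograph to zero mean and unit variance
# (a simplification of a standardised groundwater level index).
sgi = (levels - levels.mean(axis=1, keepdims=True)) / levels.std(axis=1, keepdims=True)

# Hierarchical (Ward) clustering of the standardised hydrographs into four types
Z = linkage(sgi, method="ward")
types = fcluster(Z, t=4, criterion="maxclust")
print(types)  # cluster label (1-4) assigned to each well
```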

    Dynamic Evolution of Pathogenicity Revealed by Sequencing and Comparative Genomics of 19 Pseudomonas syringae Isolates

    Closely related pathogens may differ dramatically in host range, but the molecular, genetic, and evolutionary basis for these differences remains unclear. In many Gram-negative bacteria, including the phytopathogen Pseudomonas syringae, type III effectors (TTEs) are essential for pathogenicity, instrumental in structuring host range, and exhibit wide diversity between strains. To capture the dynamic nature of virulence gene repertoires across P. syringae, we screened 11 diverse strains for novel TTE families and coupled this nearly saturating screen with the sequencing and assembly of 14 phylogenetically diverse isolates from a broad collection of diseased host plants. TTE repertoires vary dramatically in size and content across all P. syringae clades; surprisingly few TTEs are conserved and present in all strains. Those that are conserved likely provide the basal requirements for pathogenicity. We demonstrate that functional divergence within one conserved locus, hopM1, leads to dramatic differences in pathogenicity, and we demonstrate that phylogenetics-informed mutagenesis can be used to identify functionally critical residues of TTEs. The dynamism of the TTE repertoire is mirrored by diversity in pathways affecting the synthesis of secreted phytotoxins, highlighting the likely role of both types of virulence factors in the determination of host range. We used these 14 draft genome sequences, plus five additional genome sequences reported previously, to identify the core genome of P. syringae, and we compared this core to that of two closely related non-pathogenic pseudomonad species. These data revealed the recent acquisition of a 1 Mb megaplasmid by a sub-clade of cucumber pathogens. This megaplasmid encodes a type IV secretion system and a diverse set of unknown proteins, which dramatically increases both the genomic content of these strains and the pan-genome of the species.
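
    The core-genome and pan-genome concepts used above reduce, in the simplest case, to set intersection and union over per-strain gene (ortholog) families. The toy example below is illustrative only; the strain names and gene families are placeholders, not the study's ortholog calls.

```python
# Core genome = gene families present in every strain; pan-genome = union across strains.
# Placeholder strains and gene families for illustration (hopM1 appears in the abstract).
genomes = {
    "strain_A": {"hopM1", "hrpA", "gene_x", "gene_y"},
    "strain_B": {"hopM1", "hrpA", "gene_y", "gene_z"},
    "strain_C": {"hopM1", "hrpA", "gene_w"},
}

core = set.intersection(*genomes.values())
pan = set.union(*genomes.values())
print("core genome:", sorted(core))
print("pan-genome size:", len(pan))
```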

    Physiogenomic comparison of human fat loss in response to diets restrictive of carbohydrate or fat

    Background: Genetic factors that predict responses to diet may ultimately be used to individualize dietary recommendations. We used physiogenomics to explore associations among polymorphisms in candidate genes and changes in relative body fat (Δ%BF) to low fat and low carbohydrate diets. Methods: We assessed Δ%BF using dual energy X-ray absorptiometry (DXA) in 93 healthy adults who consumed a low carbohydrate diet (carbohydrate ~12% total energy) (LC diet) and in 70 who consumed a low fat diet (fat ~25% total energy) (LF diet). Fifty-three single nucleotide polymorphisms (SNPs) selected from 28 candidate genes involved in food intake, energy homeostasis, and adipocyte regulation were ranked according to probability of association with the change in %BF using multiple linear regression. Results: Dieting reduced %BF by 3.0 ± 2.6% (absolute units) for LC and 1.9 ± 1.6% for LF (p < 0.01). SNPs in nine genes were significantly associated with Δ%BF, with four significant after correction for multiple statistical testing: rs322695 near the retinoic acid receptor beta (RARB) (p < 0.005), rs2838549 in the hepatic phosphofructokinase (PFKL), and rs3100722 in the histamine N-methyl transferase (HNMT) genes (both p < 0.041) due to LF; and the rs5950584 SNP in the angiotensin receptor Type II (AGTR2) gene due to LC (p < 0.021). Conclusion: Fat loss under LC and LF diet regimes appears to have distinct mechanisms, with PFKL, HNMT, and RARB involved in fat restriction and AGTR2 involved in carbohydrate restriction. These discoveries could provide clues to important physiologic mechanisms underlying the Δ%BF response to low carbohydrate and low fat diets.
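
    The SNP ranking described above is, in essence, a per-SNP regression of Δ%BF on genotype followed by a multiple-testing correction. The sketch below shows that pattern with synthetic data and a simple Bonferroni adjustment; the study's covariates, genotype coding, and exact correction method are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n_subjects, n_snps = 93, 53

# Synthetic genotypes (0/1/2 minor-allele counts) and change in % body fat:
genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))
delta_bf = rng.normal(-3.0, 2.6, size=n_subjects)

# Simple per-SNP linear regression of the outcome on genotype:
pvals = np.array([
    stats.linregress(genotypes[:, j], delta_bf).pvalue
    for j in range(n_snps)
])

# Rank SNPs by p-value and apply an assumed Bonferroni correction across 53 tests:
order = np.argsort(pvals)
bonferroni = np.minimum(pvals * n_snps, 1.0)
for j in order[:5]:
    print(f"SNP {j}: p = {pvals[j]:.3g}, Bonferroni-adjusted p = {bonferroni[j]:.3g}")
```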

    Summary Statement: Novel Agents in the Treatment of Lung Cancer: Fifth Cambridge Conference Assessing Opportunities for Combination Therapy

    The promise of effective targeted therapy for lung cancer requires rigorous identification of potential targets combined with intensive discovery and development efforts aimed at developing effective "drugs" for these targets. We now recognize that getting the right drug to the right target in the right patient is more complicated than one could have imagined a decade ago. As knowledge of targets and development of agents have proliferated and advanced, so too have data demonstrating the biologic heterogeneity of tumors. The finding that lung cancers are genetically diverse and can exhibit several pathways of resistance in response to targeted agents makes the prospect for curative therapy more daunting. It is becoming increasingly clear that single-agent treatment will be the exception rather than the rule. This information raises important new questions about the development and assessment of novel agents in lung cancer treatment: (1) How do we identify the most important drug targets for tumor initiation and maintenance? (2) What is the best way to assess drug candidates that may only be relevant in a small fraction of patients? (3) What models do we use to predict clinical response and identify effective combinations? And (4) how do we bring combination regimens to the clinic, particularly when the agents are not yet approved individually and may be under development by different companies? The Fifth Cambridge Conference on Novel Agents in the Treatment of Lung Cancer was held in Cambridge, Massachusetts, on October 1-2, 2007, to discuss these questions by reviewing recent progress in the field and advancing recommendations for research and patient care. New information, conclusions, and recommendations considered significant for the field by the program faculty are summarized here and presented at greater length in the individual articles and accompanying discussions that comprise the full conference proceedings. A CME activity based on this summary is also available at www.informedicalcme.com/cme.