
    Self-directed learning of basic musculoskeletal ultrasound among rheumatologists in the United States

    Objective Because musculoskeletal ultrasound (MSUS) is highly user dependent, we aimed to establish whether non-mentored learning of MSUS is sufficient to achieve the same level of diagnostic accuracy and scanning reliability as has been achieved by rheumatologists recognized as international experts in MSUS. Methods A group of 8 rheumatologists with more experience in MSUS and 8 rheumatologists with less experience in MSUS participated in an MSUS exercise to assess patients with musculoskeletal abnormalities commonly seen in a rheumatology practice. Patients' established diagnoses were obtained from chart review (gout, osteoarthritis, rotator cuff syndrome, rheumatoid arthritis, and seronegative arthritis). Two examining groups were formed, each composed of 4 less experienced and 4 more experienced examiners. Each group scanned 1 predefined body region (hand, wrist, elbow, shoulder, knee, or ankle) in each of 8 patients, blinded to medical history and physical examination. Structural abnormalities were noted with dichotomous answers, and an open-ended answer was used for the final diagnosis. Results Less experienced and more experienced examiners achieved the same diagnostic accuracy (US-established diagnosis versus chart review diagnosis). The interrater reliability for tissue pathology was slightly higher for more experienced versus less experienced examiners (κ = 0.43 versus κ = 0.34; P = 0.001). Conclusion Non-mentored training in MSUS can lead to the achievement of diagnostic accuracy in MSUS comparable to that achieved by highly experienced international experts. Reliability may increase slightly with additional experience. Further study is needed to determine the minimal training requirement to achieve proficiency in MSUS.
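
    The interrater reliability values quoted above are kappa statistics for dichotomous findings. As a minimal illustration only (a two-rater sketch with hypothetical arrays, not the study's multi-examiner analysis), such an agreement statistic can be computed with scikit-learn:

```python
# Illustrative only: Cohen's kappa for two raters' dichotomous ultrasound findings.
# The example arrays are hypothetical, not data from the study.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # 1 = abnormality present, 0 = absent
rater_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")
```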

    Exploring the Diversity of Groups at 0.1<z<0.8 with X-ray and Optically Selected Samples

    We present the global group properties of two samples of galaxy groups containing 39 high quality X-ray selected systems and 38 optically (spectroscopically) selected systems in coincident spatial regions at 0.12<z<0.79. Only nine optical systems are associable with X-ray systems. We discuss the confusion inherent in the matching of both galaxies to extended X-ray emission and of X-ray emission to already identified optical systems. Extensive spectroscopy has been obtained and the resultant redshift catalog and group membership are provided here. X-ray, dynamical, and total stellar masses of the groups are also derived and presented. We explore the effects of applying three different kinds of radial cut to our systems: a constant cut of 1 Mpc and two r200 cuts, one based on the velocity dispersion of the system and the other on the X-ray emission. We find that an X-ray based r200 results in less scatter in scaling relations and less dynamical complexity as evidenced by results of the Anderson-Darling and Dressler-Shectman tests, indicating that this radius tends to isolate the virialized part of the system. The constant and velocity dispersion based cuts can overestimate membership and can work to inflate velocity dispersion and dynamical and stellar mass. We find that the Lx-sigma and Mstellar-Lx scaling relations for X-ray and optically selected systems are not dissimilar. The mean fraction of mass found in stars for our systems is approximately 0.014 with a logarithmic standard deviation of 0.398 dex. We also define and investigate a sample of groups which are X-ray underluminous given the total group stellar mass. For these systems the fraction of stellar mass contributed by the most massive galaxy is typically lower than that found for the total population of groups, implying that there may be less IGM contributed from the most massive member in these systems. (Abridged) Comment: Accepted for publication in the Astrophysical Journal (ApJ). 27 pages, 14 figures, 12 tables.
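
    The abstract contrasts a fixed 1 Mpc aperture with r200 radii estimated from the velocity dispersion or from the X-ray emission. A rough sketch of the velocity dispersion based version follows; it uses the common approximation r200 ≈ √3 σ / (10 H(z)) and an assumed flat ΛCDM cosmology, which may differ from the exact prescription adopted in the paper:

```python
# Rough sketch: r200 estimated from a group's line-of-sight velocity dispersion,
# using the common approximation r200 ~ sqrt(3) * sigma / (10 * H(z)).
# Illustrative only; not necessarily the prescription used in the paper.
import math

H0 = 70.0                 # km/s/Mpc, assumed Hubble constant
Omega_m, Omega_L = 0.3, 0.7

def hubble(z):
    """H(z) in km/s/Mpc for an assumed flat LCDM cosmology."""
    return H0 * math.sqrt(Omega_m * (1 + z) ** 3 + Omega_L)

def r200_from_sigma(sigma_kms, z):
    """Approximate r200 in Mpc from velocity dispersion (km/s) and redshift."""
    return math.sqrt(3) * sigma_kms / (10.0 * hubble(z))

print(r200_from_sigma(400.0, 0.4))  # ~0.8 Mpc for a 400 km/s group at z = 0.4
```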

    Glucose Control Predicts 2-Year Change in Lipid Profile in Youth with Type 1 Diabetes

    To test the hypothesis that a change in A1c over a follow-up interval of approximately 2 years would be associated with concomitant changes in fasting lipids in individuals with type 1 diabetes (T1D).

    Change in adiposity minimally affects the lipid profile in youth with recent onset type 1 diabetes

    Dyslipidemia contributes to the increased risk of cardiovascular disease in persons with type 1 diabetes (T1D). Weight control is commonly recommended as a treatment for dyslipidemia. However, the extent to which decreases in weight affect the lipid profile in youth with T1D is not known. Therefore, we tested the hypothesis that decreases in BMI-z score (BMIz) were associated with concomitant changes in the lipid profile in youth with T1D.

    Design and implementation of a generalized laboratory data model

    Background: Investigators in the biological sciences continue to exploit laboratory automation methods and have dramatically increased the rates at which they can generate data. In many environments, the methods themselves also evolve in a rapid and fluid manner. These observations point to the importance of robust information management systems in the modern laboratory. Designing and implementing such systems is non-trivial and it appears that in many cases a database project ultimately proves unserviceable. Results: We describe a general modeling framework for laboratory data and its implementation as an information management system. The model utilizes several abstraction techniques, focusing especially on the concepts of inheritance and meta-data. Traditional approaches commingle event-oriented data with regular entity data in ad hoc ways. Instead, we define distinct regular entity and event schemas, but fully integrate these via a standardized interface. The design allows straightforward definition of a "processing pipeline" as a sequence of events, obviating the need for separate workflow management systems. A layer above the event-oriented schema integrates events into a workflow by defining "processing directives", which act as automated project managers of items in the system. Directives can be added or modified in an almost trivial fashion, i.e., without the need for schema modification or re-certification of applications. Association between regular entities and events is managed via simple "many-to-many" relationships. We describe the programming interface, as well as techniques for handling input/output, process control, and state transitions. Conclusion: The implementation described here has served as the Washington University Genome Sequencing Center's primary information system for several years. It handles all transactions underlying a throughput rate of about 9 million sequencing reactions of various kinds per month and has handily weathered a number of major pipeline reconfigurations. The basic data model can be readily adapted to other high-volume processing environments.
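
    The abstract describes separating regular entity data from event data, linking the two through many-to-many relationships, and driving the workflow with "processing directives". A minimal sketch of that separation is given below; all table and column names are invented for illustration and are not the Genome Sequencing Center's actual schema:

```python
# Hypothetical sketch of the entity/event separation described in the abstract.
# Table and column names are invented; this is not the actual production schema.
import sqlite3

ddl = """
CREATE TABLE entity (                  -- regular entities (samples, clones, ...)
    entity_id   INTEGER PRIMARY KEY,
    entity_type TEXT NOT NULL,
    name        TEXT NOT NULL
);
CREATE TABLE event (                   -- things that happen to entities
    event_id    INTEGER PRIMARY KEY,
    event_type  TEXT NOT NULL,
    occurred_at TEXT NOT NULL
);
CREATE TABLE entity_event (            -- simple many-to-many association
    entity_id   INTEGER REFERENCES entity(entity_id),
    event_id    INTEGER REFERENCES event(event_id),
    PRIMARY KEY (entity_id, event_id)
);
CREATE TABLE directive (               -- "processing directive": which event type
    directive_id INTEGER PRIMARY KEY,  -- should follow which, for a given pipeline
    pipeline     TEXT NOT NULL,
    prior_event  TEXT NOT NULL,
    next_event   TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
```

    In a layout like this, adding or changing a pipeline step amounts to inserting or updating a directive row rather than altering the schema, which mirrors the abstract's claim that directives can be modified without schema changes or application re-certification.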

    The ACTIVE cognitive training trial and predicted medical expenditures

    Background: Health care expenditures for older adults are disproportionately high and increasing at both the individual and population levels. We evaluated the effects of the three cognitive training interventions (memory, reasoning, or speed of processing) in the ACTIVE study on changes in predicted medical care expenditures. Methods: ACTIVE was a multisite randomized controlled trial of older adults (≥ 65). Five-year follow-up data were available for 1,804 of the 2,802 participants. Propensity score weighting was used to adjust for potential attrition bias. Changes in predicted annual medical expenditures were calculated at the first and fifth annual follow-up assessments using a new method for translating functional status scores. Multiple linear regression methods were used in this cost-offset analysis. Results: At one and five years post-training, annual predicted expenditures declined by $223 (p = .024) and $128 (p = .309), respectively, in the speed of processing treatment group, but there were no statistically significant changes in the memory or reasoning treatment groups compared to the no-contact control group at either period. Statistical adjustment for age, race, education, MMSE scores, ADL and IADL performance scores, EPT scores, chronic condition counts, and the SF-36 PCS and MCS scores at baseline did not alter the one-year ($244; p = .012) or five-year ($143; p = .250) expenditure declines in the speed of processing treatment group. Conclusion: The speed of processing intervention significantly reduced subsequent annual predicted medical care expenditures at the one-year post-baseline comparison, but annual savings were no longer statistically significant at the five-year post-baseline comparison.
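
    The analysis combines propensity score weighting for attrition with multiple linear regression on expenditure change. A generic sketch of that two-step approach is shown below using statsmodels; the file, column names, and covariates are hypothetical and this is not the ACTIVE study's actual analysis code:

```python
# Generic sketch: inverse-probability-of-attrition weighting followed by a
# weighted linear regression, as in a cost-offset analysis.
# File and column names are hypothetical placeholders.
import statsmodels.api as sm
import pandas as pd

df = pd.read_csv("active_like_data.csv")  # hypothetical dataset

# Step 1: model the probability of being retained at follow-up from baseline covariates.
X_attr = sm.add_constant(df[["age", "education", "mmse_baseline"]])
p_retained = sm.Logit(df["retained"], X_attr).fit(disp=0).predict()

# Weight retained participants by the inverse of that probability.
weights = 1.0 / p_retained
kept = df["retained"] == 1

# Step 2: weighted regression of expenditure change on treatment group indicators.
X = sm.add_constant(df.loc[kept, ["speed_group", "memory_group", "reasoning_group"]])
model = sm.WLS(df.loc[kept, "expenditure_change"], X, weights=weights[kept]).fit()
print(model.summary())
```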

    Metformin treatment in diabetes and heart failure: when academic equipoise meets clinical reality

    Objective: Metformin has had a 'black box' contraindication in diabetic patients with heart failure (HF), but many believe it to be the treatment of choice in this setting. Therefore, we attempted to conduct a pilot study to evaluate the feasibility of undertaking a large randomized controlled trial with clinical endpoints. Study Design: The pilot study was a randomized double blinded placebo controlled trial. Patients with HF and type 2 diabetes were screened in hospitals and HF clinics in Edmonton, Alberta, Canada (population ~1 million). Major exclusion criteria included the current use of insulin or high dose metformin, decreased renal function, or a glycosylated hemoglobin <7%. Patients were to be randomized to 1500 mg of metformin daily or matching placebo and followed for 6 months for a variety of functional outcomes, as well as clinical events. Results: Fifty-eight patients were screened over a six month period and all were excluded. Because of futility with respect to enrollment, the pilot study was abandoned. The mean age of screened patients was 77 (SD 9) years and 57% were male. The main reasons for exclusion were: use of insulin therapy (n = 23; 40%), glycosylated hemoglobin <7% (n = 17; 29%), and current use of high dose metformin (n = 12; 21%). Overall, contraindicated metformin therapy was the most commonly prescribed oral antihyperglycemic agent (n = 27; 51%). On average, patients were receiving 1,706 mg (SD 488 mg) of metformin daily and 12 (44%) used only metformin. Conclusion: Despite uncertainty in the scientific literature, there does not appear to be clinical uncertainty with regard to the safety or effectiveness of metformin in HF, making a definitive randomized trial virtually impossible. Trial registration: ClinicalTrials.gov Identifier: NCT00325910

    Development and pilot of an internationally standardized measure of cardiovascular risk management in European primary care

    BACKGROUND: Primary care can play an important role in providing cardiovascular risk management in patients with established Cardiovascular Diseases (CVD), patients with a known high risk of developing CVD, and potentially for individuals with a low risk of developing CVD but who have unhealthy lifestyles. To describe and compare cardiovascular risk management, internationally valid quality indicators and standardized measures are needed. As part of a large project in 9 European countries (EPA-Cardio), we have developed and tested a set of standardized measures, linked to previously developed quality indicators. METHODS: A structured stepwise procedure was followed to develop the measures. First, the research team allocated 106 validated quality indicators to one of the three target populations (established CVD, at high risk, at low risk) and to different data-collection methods (data abstraction from the medical records, a patient survey, an interview with the lead practice GP/a practice survey). Secondly, we selected a number of other validated measures to enrich the assessment. A pilot study was performed to test feasibility. Finally, we revised the measures based on the findings. RESULTS: The EPA-Cardio measures consisted of abstraction forms for the medical records of patients with established Coronary Heart Disease (CHD) and of high-risk groups, a patient questionnaire for each of the 3 groups, an interview questionnaire for the lead GP, and a questionnaire for practice teams. The measures were feasible and accepted by general practices in different countries. CONCLUSIONS: An internationally standardized measure of cardiovascular risk management, linked to validated quality indicators and tested for feasibility in general practice, is now available. Careful development and pilot testing of the measures are crucial in international studies of quality of healthcare.