
    Influence of prior activity (warm-up) and inspiratory muscle training upon between- and within-day reliability of maximal inspiratory pressure measurement

    BACKGROUND: A specific inspiratory muscle ‘warm-up’ (IWU) prior to assessment of maximal inspiratory mouth pressure (PImax) may reduce the number of measurements required to obtain reproducible, representative estimates of PImax. The influence of inspiratory muscle training (IMT) upon this phenomenon is unknown. OBJECTIVE: To compare the impact of an IWU on the between- and within-day reliability of PImax before and after IMT. METHOD: Eight participants were assessed on 4 separate occasions: 2 trials preceded IMT and 2 followed it. At each assessment, the highest of 3 initial efforts was recorded as the pre-IWU value (PI). The highest of 9 subsequent efforts that followed 2 sets of 30 breaths at 40% PI was recorded as PImax. Following 4 weeks of IMT, the trials were repeated. RESULTS: IWU increased PI by 11–17% (p ≤ 0.01), irrespective of IMT status. After IWU, 5–6 efforts were required to determine PImax, irrespective of IMT status. PImax was similar between the 2 trials before IMT and the 2 trials after IMT (p ≥ 0.05), and was 21% higher after IMT (p ≤ 0.01). The coefficient of variation was excellent before and after IWU, both before IMT (1.9 and 0.6%, respectively) and after IMT (1.1 and 0.3%, respectively). Limits of agreement and sample sizes for effect sizes ≤10% were substantially smaller after IWU in all trials. CONCLUSIONS: (1) IWU enhances the between-day reliability of PImax measurement, and this is unaffected by IMT; and (2) judgements about the acceptability of PImax reliability should be made relative to analytical goals, and we present data to facilitate this.
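
As an illustration of the reliability statistics reported in this abstract, a within-subject coefficient of variation and Bland–Altman 95% limits of agreement can be computed from paired between-day measurements. The sketch below uses hypothetical PImax values, not the study's data:

```python
import statistics

def within_subject_cv(day1, day2):
    """Between-day coefficient of variation (%) from paired measurements,
    taken as the root mean square of the per-subject CVs."""
    cv_squares = []
    for a, b in zip(day1, day2):
        mean = (a + b) / 2
        sd = statistics.stdev([a, b])  # sample SD of the duplicate pair
        cv_squares.append((sd / mean) ** 2)
    return 100 * (sum(cv_squares) / len(cv_squares)) ** 0.5

def limits_of_agreement(day1, day2):
    """Bland-Altman 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(day1, day2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical PImax values (cmH2O) for five subjects on two days
day1 = [120, 95, 110, 130, 105]
day2 = [118, 97, 112, 128, 103]
print(round(within_subject_cv(day1, day2), 2))
print(tuple(round(x, 1) for x in limits_of_agreement(day1, day2)))
```

Narrower limits of agreement after IWU, as reported above, translate directly into smaller sample sizes needed to detect a given effect size.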

    Statistical analysis of high-dimensional biomedical data: a gentle introduction to analytical goals, common approaches and challenges

    Background: In high-dimensional data (HDD) settings, the number of variables associated with each observation is very large. Prominent examples of HDD in biomedical research include omics data with a large number of variables such as many measurements across the genome, proteome, or metabolome, as well as electronic health records data that have large numbers of variables recorded for each patient. The statistical analysis of such data requires knowledge and experience, sometimes of complex methods adapted to the respective research questions. Methods: Advances in statistical methodology and machine learning methods offer new opportunities for innovative analyses of HDD, but at the same time require a deeper understanding of some fundamental statistical concepts. Topic group TG9 “High-dimensional data” of the STRATOS (STRengthening Analytical Thinking for Observational Studies) initiative provides guidance for the analysis of observational studies, addressing particular statistical challenges and opportunities for the analysis of studies involving HDD. In this overview, we discuss key aspects of HDD analysis to provide a gentle introduction for non-statisticians and for classically trained statisticians with little experience specific to HDD. Results: The paper is organized with respect to subtopics that are most relevant for the analysis of HDD, in particular initial data analysis, exploratory data analysis, multiple testing, and prediction. For each subtopic, main analytical goals in HDD settings are outlined. For each of these goals, basic explanations for some commonly used analysis methods are provided. Situations are identified where traditional statistical methods cannot, or should not, be used in the HDD setting, or where adequate analytic tools are still lacking. Many key references are provided. Conclusions: This review aims to provide a solid statistical foundation for researchers, including statisticians and non-statisticians, who are new to research with HDD or simply want to better evaluate and understand the results of HDD analyses.
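
On the multiple-testing subtopic mentioned above, one procedure commonly used in HDD settings is Benjamini–Hochberg false-discovery-rate control. The following minimal sketch is an illustration of that standard procedure, not code from the paper:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: return the indices of the
    hypotheses rejected while controlling the false discovery rate at q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ranks by p-value
    k = 0  # largest rank whose p-value passes its BH threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])  # reject the k smallest p-values

# Illustrative p-values from, say, eight candidate biomarkers
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))  # → [0, 1]
```

Unlike a Bonferroni correction, which controls the family-wise error rate and becomes very conservative when thousands of variables are tested, FDR control scales naturally to omics-sized problems.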

    Spatial planning, territorial development, and territorial impact assessment

    This article debates the possibilities and advantages of using territorial impact assessment (TIA) policy evaluation methodologies to assess the implementation of spatial planning instruments. It builds on existing literature to define key analytical goals, dimensions, and respective components to monitor and evaluate the implementation of spatial plans, at all territorial levels, to be used as a TIA evaluation matrix. It concludes that, despite the inherent complexity associated with the process of evaluating spatial planning processes, there are manifest advantages to using TIA tools to evaluate them, mostly at the ex post phase.

    Sample Return Missions Where Contamination Issues are Critical: Genesis Mission Approach

    The Genesis Mission sought the challenging analytical goals of accurately and precisely measuring the elemental and isotopic composition of the Sun to levels useful for planetary science, requiring sensitivities of ppm to ppt in the outer 100 nm of collector materials. Analytical capabilities were further challenged when the hard landing in 2004 broke open the canister containing the super-clean collectors. Genesis illustrates that returned samples allow flexibility and creativity to recover from setbacks.

    You can't always sketch what you want: Understanding Sensemaking in Visual Query Systems

    Visual query systems (VQSs) empower users to interactively search for line charts with desired visual patterns, typically specified using intuitive sketch-based interfaces. Despite decades of past work on VQSs, these efforts have not translated to adoption in practice, possibly because VQSs are largely evaluated in unrealistic lab-based settings. To remedy this gap in adoption, we collaborated with experts from three diverse domains (astronomy, genetics, and material science) via a year-long user-centered design process to develop a VQS that supports their workflow and analytical needs, and evaluate how VQSs can be used in practice. Our study results reveal that ad-hoc sketch-only querying is not as commonly used as prior work suggests, since analysts are often unable to precisely express their patterns of interest. In addition, we characterize three essential sensemaking processes supported by our enhanced VQS. We discover that participants employ all three processes, but in different proportions, depending on the analytical needs in each domain. Our findings suggest that all three sensemaking processes must be integrated in order to make future VQSs useful for a wide range of analytical inquiries. Comment: Accepted for presentation at IEEE VAST 2019, held October 20-25 in Vancouver, Canada; also published in a special issue of IEEE Transactions on Visualization and Computer Graphics (TVCG).

    Strategies to define performance specifications in laboratory medicine: 3 years on from the Milan Strategic Conference

    Measurements in clinical laboratories produce results needed in the diagnosis and monitoring of patients. These results are always characterized by some uncertainty. What quality is needed and what measurement errors can be tolerated without jeopardizing patient safety should therefore be defined and specified for each analyte having clinical use. When these specifications are defined, the total examination process will be "fit for purpose" and the laboratory professionals should then set up rules to control the measuring systems to ensure they perform within specifications. The laboratory community has used different models to set performance specifications (PS). Recently, it was felt that there was a need to revisit different models and, at the same time, to emphasize the presuppositions for using the different models. Therefore, in 2014 the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) organized a Strategic Conference in Milan. It was felt that there was a need for more detailed discussions on, for instance, PS for EQAS, which measurands should use which models to set PS, and how to set PS for the extra-analytical phases. There was also a need to critically evaluate the quality of data from biological variation studies and to further discuss the use of the total error (TE) concept. Consequently, EFLM established five Task and Finish Groups (TFGs) to address each of these topics. The TFGs are finishing their activity in 2017, and the content of this paper includes deliverables from these groups.
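
As a hedged illustration of the total error (TE) concept debated by the TFGs, a Westgard-style combination of bias and imprecision can be sketched as below. The 1.65 multiplier (covering roughly 95% one-sided) and the example bias and CV figures are illustrative assumptions, not values from the conference:

```python
def total_error(bias_pct, cv_pct, z=1.65):
    """Total analytical error (%) in the classic linear model:
    TE = |bias| + z * CV, with z = 1.65 for ~95% one-sided coverage."""
    return abs(bias_pct) + z * cv_pct

# Hypothetical assay with 2% bias and 3% analytical imprecision
print(round(total_error(2.0, 3.0), 2))  # → 6.95
```

Part of the debate the abstract refers to is precisely whether such a linear addition of bias and imprecision is an adequate model for setting and verifying performance specifications.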

    Developing Predictive Molecular Maps of Human Disease through Community-based Modeling

    The failure of biology to identify the molecular causes of disease has led to disappointment in the rate of development of new medicines. By combining the power of community-based modeling with broad access to large datasets on a platform that promotes reproducible analyses, we can work towards more predictive molecular maps that can deliver better therapeutics.

    In vitro determination of hemoglobin A1c for diabetes diagnosis and management: technology update

    It is fascinating to consider the analytical improvements that have occurred since glycated hemoglobin was first used in routine clinical laboratories for diabetes monitoring around 1977; at that time, methods displayed poor precision, and there were no calibrators or materials with assayed values for quality control purposes. This review outlines the major improvements in hemoglobin A1c (HbA1c) measurement that have occurred since its introduction, and reflects on the increased importance of this hemoglobin fraction in the monitoring of glycemic control. The use of HbA1c as a diagnostic tool is discussed in addition to its use in monitoring the patient with diabetes; the biochemistry of HbA1c formation is described, and how these changes to the hemoglobin molecule have been used to develop methods to measure this fraction. Standardization of HbA1c is described in detail; the development of the IFCC Reference Measurement Procedure for HbA1c has enabled global standardization to be achieved, which has allowed global targets to be set for glycemic control and diagnosis. The importance of factors that may interfere in the measurement of HbA1c is highlighted.
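
The global standardization described here rests on a defined relationship between IFCC units (mmol/mol) and NGSP units (%). The sketch below applies the published NGSP master equation, NGSP% = 0.09148 × IFCC + 2.152; the helper-function names are illustrative, not from the review:

```python
def ifcc_to_ngsp(ifcc_mmol_mol):
    """Convert an IFCC HbA1c result (mmol/mol) to NGSP units (%)
    via the published master equation NGSP = 0.09148 * IFCC + 2.152."""
    return 0.09148 * ifcc_mmol_mol + 2.152

def ngsp_to_ifcc(ngsp_pct):
    """Inverse conversion: NGSP (%) back to IFCC units (mmol/mol)."""
    return (ngsp_pct - 2.152) / 0.09148

# The 48 mmol/mol diagnostic threshold corresponds to 6.5% NGSP
print(round(ifcc_to_ngsp(48), 1))  # → 6.5
```

Having a fixed master equation is what allows a single diagnostic cut-off (48 mmol/mol, i.e. 6.5%) to be quoted consistently across laboratories reporting in either unit system.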