
    A computational theory of visuo-spatial mental imagery

    The thesis develops a new theory of visuo-spatial mental imagery: the perceptual instantiation theory. The theory is concretized in a formal framework and implemented as a computational model. The theory and its model are evaluated against a set of empirical phenomena and compared with contemporary theories of mental imagery. The new theory is shown to provide explanations for the considered phenomena that partly go beyond those of the contemporary theories.

    Differences between Spatial and Visual Mental Representations

    This article investigates the relationship between visual and spatial mental representations in human visuo-spatial processing. By comparing two common theories of visuo-spatial processing – mental model theory and the theory of mental imagery – we identified two open questions: (1) which representations are modality-specific, and (2) what role the two representations play in reasoning. Two experiments examining eye movements and preferences for under-specified problems were conducted to investigate these questions. We found that significant spontaneous eye movements along the processed spatial relations occurred only when a visual mental representation was employed, not when a spatial mental representation was used. Furthermore, the preferred answers to the under-specified problems differed between the two mental representations. The results challenge assumptions made by mental model theory and the theory of mental imagery.

    Influence of the gas composition on the efficiency of ammonia stripping of biogas digestate

    The impact of strip gas composition on side-stream ammonia stripping, a technology aimed at reducing high ammonia levels in anaerobic reactors, was investigated. Evaluation of the effect of oxygen contact during air stripping showed a distinct, though lower than expected, inhibition of the anaerobic microflora. To circumvent this, the feasibility and possible constraints of biogas and flue gas as alternatives in side-stream stripping were studied. Experiments with an ammonium bicarbonate model solution and with digestate were conducted. It was demonstrated that stripping performance is negatively correlated with the CO2 level in the strip gas, with a progressive performance loss towards higher concentrations. In contrast to biogas with its high CO2 content, the efficiency reduction observed for flue gas was significantly less pronounced. The latter provides the additional benefit that its high thermal energy can be re-utilized in the stripping unit, and it is therefore considered a viable alternative to air.
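
    One plausible chemical reading of the CO2 sensitivity: absorbed CO2 acidifies the liquid phase, and only the un-ionized NH3 fraction is strippable. A minimal Python sketch of this standard relationship, using the Emerson et al. (1975) pKa correlation (the pH and temperature values below are illustrative assumptions, not measurements from this study):

        def free_ammonia_fraction(ph, temp_c):
            # Temperature-dependent pKa of NH4+ (Emerson et al., 1975 correlation).
            pka = 0.09018 + 2729.92 / (temp_c + 273.15)
            # Fraction of total ammoniacal nitrogen present as strippable free NH3.
            return 1.0 / (1.0 + 10.0 ** (pka - ph))

        # Illustrative values only: CO2 absorption pulls the pH down,
        # which sharply reduces the strippable NH3 fraction.
        for ph in (9.5, 8.5):
            print(f"pH {ph} at 55 deg C: {free_ammonia_fraction(ph, 55.0):.0%} free NH3")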

    Detecting CTP truncation artifacts in acute stroke imaging from the arterial input and the vascular output functions

    Background: Current guidelines for CT perfusion (CTP) in acute stroke suggest acquiring scans with a minimal duration of 60-70 s. But even then, CTP analysis can be affected by truncation artifacts. Conversely, shorter acquisitions are still widely used in clinical practice and may sometimes be sufficient to reliably estimate lesion volumes. We aim to devise an automatic method that detects scans affected by truncation artifacts. Methods: Shorter scan durations are simulated from the ISLES’18 dataset by consecutively removing the last CTP time-point until reaching a 10 s duration. For each truncated series, perfusion lesion volumes are quantified and used to label the series as unreliable if the lesion volumes considerably deviate from the original, untruncated ones. Afterwards, nine features from the arterial input function (AIF) and the vascular output function (VOF) are derived and used to fit machine-learning models with the goal of detecting unreliably truncated scans. The methods are compared against a baseline classifier based solely on the scan duration, which is the current clinical standard. The ROC-AUC, precision-recall AUC and F1-score are measured in a 5-fold cross-validation setting. Results: The best performing classifier obtained an ROC-AUC of 0.982, a precision-recall AUC of 0.985 and an F1-score of 0.938. The most important feature was the AIF coverage, measured as the time difference between the scan duration and the AIF peak. When the AIF coverage was used to build a single-feature classifier, an ROC-AUC of 0.981, a precision-recall AUC of 0.984 and an F1-score of 0.932 were obtained. In comparison, the baseline classifier obtained an ROC-AUC of 0.954, a precision-recall AUC of 0.958 and an F1-score of 0.875. Conclusions: Machine-learning models fed with AIF and VOF features accurately detected unreliable stroke lesion measurements due to insufficient acquisition duration. The AIF coverage was the most predictive feature of truncation and identified unreliable short scans almost as well as the machine-learning models. We conclude that AIF/VOF-based classifiers are more accurate than the scan duration for detecting truncation. These methods could be transferred to perfusion analysis software in order to increase the interpretability of CTP outputs.
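
    As a rough sketch of the AIF coverage feature and the single-feature classifier evaluation described above (the AIF curves, the 15 s label threshold and all data below are synthetic stand-ins, not the paper's dataset or thresholds):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        def aif_coverage(aif_curve, dt):
            # Time between the AIF peak and the end of the acquisition (seconds);
            # small values suggest the bolus passage was truncated.
            return (len(aif_curve) - 1) * dt - np.argmax(aif_curve) * dt

        # Synthetic stand-in AIFs: Gaussian bolus shapes with random scan
        # lengths, sampled at dt = 1 s; scans whose peak falls near the end
        # of the series get small coverage values.
        rng = np.random.default_rng(0)
        curves = []
        for _ in range(200):
            n = rng.integers(20, 90)
            t_peak = rng.uniform(5.0, 45.0)
            t = np.arange(n)
            curves.append(np.exp(-0.5 * ((t - t_peak) / 4.0) ** 2))

        X = np.array([[aif_coverage(c, dt=1.0)] for c in curves])
        # Noisy "unreliable" labels around an assumed 15 s coverage threshold.
        y = (X[:, 0] + rng.normal(0.0, 5.0, 200) < 15.0).astype(int)

        print(cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc"))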

    AIFNet: Automatic Vascular Function Estimation for Perfusion Analysis Using Deep Learning

    Perfusion imaging is crucial in acute ischemic stroke for quantifying the salvageable penumbra and the irreversibly damaged core lesions. As such, it helps clinicians to decide on the optimal reperfusion treatment. In perfusion CT imaging, deconvolution methods are used to obtain clinically interpretable perfusion parameters that allow identifying brain tissue abnormalities. Deconvolution methods require the selection of two reference vascular functions as inputs to the model: the arterial input function (AIF) and the venous output function, with the AIF being the most critical model input. When performed manually, the vascular function selection is time demanding, suffers from poor reproducibility and depends on the professionals' experience. This leads to potentially unreliable quantification of the penumbra and core lesions and, hence, might harm the treatment decision process. In this work we automate the perfusion analysis with AIFNet, a fully automatic and end-to-end trainable deep learning approach for estimating the vascular functions. Unlike previous methods using clustering or segmentation techniques to select vascular voxels, AIFNet is directly optimized for vascular function estimation, which allows it to better recognise the time-curve profiles. Validation on the public ISLES18 stroke database shows that AIFNet reaches inter-rater performance for the vascular function estimation and, subsequently, for the parameter maps and core lesion quantification obtained through deconvolution. We conclude that AIFNet has potential for clinical transfer and could be incorporated in perfusion deconvolution software.
    Comment: Preprint submitted to Elsevier
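
    The deconvolution step that consumes the estimated AIF is not spelled out in the abstract; for orientation, a minimal truncated-SVD (sSVD) deconvolution sketch, the standard approach in perfusion CT, though not necessarily the implementation paired with AIFNet:

        import numpy as np

        def svd_deconvolve(tissue_curve, aif, dt, rel_threshold=0.2):
            # Lower-triangular Toeplitz matrix A so that A @ residue ~= tissue
            # curve, i.e. the tissue curve is the AIF convolved with the
            # residue function k(t) = CBF * R(t).
            n = len(aif)
            A = dt * np.array([[aif[i - j] if i >= j else 0.0
                                for j in range(n)] for i in range(n)])
            U, s, Vt = np.linalg.svd(A)
            # Truncate small singular values to suppress noise-driven
            # oscillations in the recovered residue function.
            s_inv = np.where(s > rel_threshold * s.max(), 1.0 / s, 0.0)
            residue = Vt.T @ (s_inv * (U.T @ np.asarray(tissue_curve)))
            return residue, residue.max()   # peak ~ CBF up to scaling constants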

    A Radiomics Approach to Traumatic Brain Injury Prediction in CT Scans

    Computed Tomography (CT) is the gold standard technique for brain damage evaluation after acute Traumatic Brain Injury (TBI). It allows identification of most lesion types and determines the need for surgical or alternative therapeutic procedures. However, the traditional approach to lesion classification is restricted to visual image inspection. In this work, we characterize and predict TBI lesions by using CT-derived radiomics descriptors. Relevant shape, intensity and texture biomarkers characterizing the different lesions are isolated, and a lesion predictive model is built using Partial Least Squares. On a dataset containing 155 scans (105 train, 50 test) the methodology achieved 89.7% accuracy on the unseen data. When a model was built using only texture features, 88.2% accuracy was obtained. Our results suggest that selected radiomics descriptors could play a key role in brain injury prediction. Moreover, the proposed methodology comes close to reproducing radiologists' decision making. These results open new possibilities for radiomics-inspired brain lesion detection, segmentation and prediction.
    Comment: Submitted to ISBI 201
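
    A minimal sketch of the modelling step as PLS discriminant analysis (synthetic features stand in for the CT-derived radiomics descriptors; the feature extraction itself is omitted, and the counts only loosely echo the paper's 155 scans):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        # Hypothetical stand-ins: X is a radiomics feature matrix
        # (shape/intensity/texture descriptors per scan), y a lesion class.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(155, 40))
        y = rng.integers(0, 3, 155)

        X_std = StandardScaler().fit_transform(X)
        Y = np.eye(3)[y]                             # one-hot targets (PLS-DA)
        pls = PLSRegression(n_components=5).fit(X_std, Y)
        y_hat = pls.predict(X_std).argmax(axis=1)    # class = largest score
        print("train accuracy:", (y_hat == y).mean())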

    The increase in health care costs associated with muscle weakness in older people without long-term illnesses in the Czech Republic:results from the Survey of Health, Ageing and Retirement in Europe (SHARE)

    Muscle weakness and associated diseases are likely to place a considerable economic burden on government health care expenditure. Therefore, our aim in this study was to estimate the direct and indirect costs associated with muscle weakness in the Czech Republic. We applied a cost-of-illness approach using data from the Survey of Health, Ageing and Retirement in Europe (SHARE). Six hundred and eighty-nine participants aged 70 years and over and without any long-term illnesses were included in our study. A generalized linear model with gamma distribution was used, and odds ratios (OR) were calculated in order to explore the effect of muscle weakness on direct and indirect costs. For both genders, muscle weakness had a statistically significant impact on direct costs (OR = 2.11), but not on indirect costs (OR = 1.08) or on total costs (OR = 1.51). Muscle weakness had the greatest statistically significant impact on direct costs in females (OR = 2.75). In conclusion, our study has shown that muscle weakness may lead to increased direct costs and consequently place a burden on health care expenditure. The results of this study could therefore lead to greater interest in the prevention of muscle weakness among older people in the Czech Republic.
    Keywords: direct cost, indirect cost, economic burden, sarcopenia, frailty
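
    A minimal sketch of the reported modelling approach, a generalized linear model with gamma distribution and log link, on synthetic stand-in data (with a log link the exponentiated coefficient reads as a cost ratio; the paper reports odds ratios, so this is an interpretation, not a reproduction of its exact specification):

        import numpy as np
        import statsmodels.api as sm

        # Synthetic stand-in data: direct health care costs regressed on a
        # binary muscle-weakness indicator for 689 hypothetical participants.
        rng = np.random.default_rng(1)
        weak = rng.integers(0, 2, 689)
        costs = rng.gamma(shape=2.0, scale=200.0 * (1 + weak))

        X = sm.add_constant(weak.astype(float))
        model = sm.GLM(costs, X,
                       family=sm.families.Gamma(link=sm.families.links.Log()))
        res = model.fit()
        print(res.summary())
        # Exponentiated coefficient: multiplicative effect on mean costs.
        print("cost ratio for muscle weakness:", np.exp(res.params[1]))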

    Numerical Simulations and Experiments of Ignition of Solid Particles in a Laminar Burner:Effects of Slip Velocity and Particle Swelling

    Ignition and combustion of pulverized solid fuel is investigated in a laminar burner. The two-dimensional OH radical field is measured in the experiments, providing information on the first onset of ignition and a detailed characterization of the flame structure around the single particle. In addition, particle velocity and diameter are tracked in time in the experiments. Simulations are carried out with a Lagrangian point-particle approach fully coupled with an Eulerian solver for the gas phase, which includes detailed chemistry and transport. The numerical simulation results are compared with the experimental measurements in order to investigate the ignition characteristics. The effect of the slip velocity, i.e. the initial velocity difference between the gas phase and the particle, is investigated numerically. For increasing slip velocity, the ignition delay time decreases. For large slip velocities, the decrease in ignition delay time is found to saturate at a value about 40% smaller than the ignition delay time at zero slip velocity. A simulation neglecting the dependency of the Nusselt number on the slip velocity shows that this dependency does not play a significant role. Instead, the decrease of ignition delay time induced by the slip velocity is due to modifications of the temperature field around the particle: the low-temperature fluid associated with the energy sink of particle heating is transported away from the particle position when the slip velocity is non-zero, so the particle is exposed to higher temperatures. Finally, the effect of particle swelling is investigated using a swelling model based on the CPD framework. With this model, we observed negligible differences in ignition delay time compared to the case in which swelling is not included, which is explained by the negligible swelling predicted by this model before ignition. However, this is inconsistent with the experimental measurements of particle diameter, which show a significant increase in diameter even before ignition. In further simulations, the measured swelling was therefore prescribed directly, using an analytical fit at the given conditions. With this approach, it is found that the inclusion of swelling reduces the ignition delay time by about 20% for small particles, while the effect is negligible for large particles.
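
    The slip-velocity dependence of the Nusselt number discussed above is commonly modelled with the Ranz-Marshall correlation; a minimal lumped-capacitance sketch under that assumption (illustrative only, not the paper's coupled Eulerian-Lagrangian solver; radiation and devolatilization are neglected, and all parameter values are made up):

        import numpy as np

        def nusselt_ranz_marshall(re_p, pr):
            # Convective heat transfer to a sphere; Nu -> 2 as the slip
            # velocity (and hence the particle Reynolds number) -> 0.
            return 2.0 + 0.6 * np.sqrt(re_p) * pr ** (1.0 / 3.0)

        def particle_heating_rate(t_gas, t_p, d_p, k_gas, re_p, pr, rho_p, cp_p):
            # dT_p/dt from a lumped-capacitance energy balance on the particle.
            nu = nusselt_ranz_marshall(re_p, pr)
            h = nu * k_gas / d_p                     # heat transfer coefficient
            area = np.pi * d_p ** 2
            mass = rho_p * np.pi * d_p ** 3 / 6.0
            return h * area * (t_gas - t_p) / (mass * cp_p)

        # Example: a 100-micron particle in hot gas, without and with slip.
        for re_p in (0.0, 5.0):
            print(re_p, particle_heating_rate(1800.0, 400.0, 100e-6,
                                              0.1, re_p, 0.7, 1200.0, 1500.0))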

    Irrigation Water Use in the Danube Basin: Facts, Governance and Approach to Sustainability

    In this paper we assess irrigation water use in the Danube Basin, highlight its complexity, identify future challenges and show the relevance of a basin-wide integrative irrigation management plan as part of a more holistic and coherent resource policy. To this end, we base our integrative regional assessments of the water-food-energy nexus on insights from an extensive review and scientific synthesis of the Danube Basin and region, experimental field studies on irrigation and agricultural water consumption, current irrigation-related policies and strategies in most of the Danube countries, and regulatory frameworks on resources at the European Union level. We show that a basin-wide integrative approach to water use calls for the evaluation of resource-use trade-offs, resonates with the need for transdisciplinary research in addressing nexus challenges, and supports integrative resource management policies of which irrigation water use is an inherent part. In this respect, we propose a transdisciplinary research framework on sustainable irrigation water use in the Danube Basin. The findings are summarized into four interconnected problem areas in the Danube Basin, which directly or indirectly relate to irrigation strategies and resource policies: prospective water scarcity and Danube water connectedness, agricultural droughts, present and future levels of potential yields, and science-based proactive decision-making.

    Psychometric evaluation of the Near Activity Visual Questionnaire Presbyopia (NAVQ-P) and additional patient-reported outcome items

    Background: The Near Activity Visual Questionnaire Presbyopia (NAVQ-P) is a patient-reported outcome (PRO) measure that was developed in a phakic presbyopia population to assess impacts on near vision function. The study refined and explored the psychometric properties and score interpretability of the NAVQ-P and of additional PRO items assessing near vision correction independence (NVCI), near vision satisfaction (NVS), and near vision correction preference (NVCP). Methods: This was a psychometric validation study conducted using PRO data collected as part of a Phase IIb clinical trial (CUN8R44 A2202) consisting of 235 randomized adults with presbyopia from the US, Japan, Australia, and Canada. Data collected at baseline, week 2, and months 1, 2, and 3 during the 3-month trial treatment period were included in the analyses to assess item (question) properties, NAVQ-P dimensionality and scoring, reliability, validity, and score interpretation. Results: Item responses were distributed across the full response scale for most NAVQ-P and additional PRO items. Confirmatory factor analysis supported the pre-defined unidimensional structure and the calculation of a NAVQ-P total score as a measure of near vision function. Item deletion, informed by item response distributions, dimensionality analyses, item response theory, and previous qualitative findings including clinical input, supported retention of 14 NAVQ-P items. The 14-item NAVQ-P total score had excellent internal consistency (α = 0.979) and high test-retest reliability (intraclass correlation coefficients ≥ 0.898). There was good evidence of construct-related validity for all PROs, supported by strong correlations with concurrent measures. Excellent results were also demonstrated for known-groups validity and ability-to-detect-change analyses. Anchor-based and distribution-based methods supported the interpretation of scores through the generation of group-level and within-individual estimates of meaningful change thresholds. A meaningful within-patient change in the range of an 8- to 15-point improvement on the NAVQ-P total score (score range 0–42) was recommended, including a more specific responder definition of a 10-point improvement. Conclusions: The NAVQ-P, NVCI, and NVS are valid and reliable instruments with the ability to detect change over time. The findings strongly support the use of these measures as outcome assessments in clinical and research studies and in clinical practice in the presbyopia population.
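
    For orientation, the internal consistency figure quoted above is Cronbach's alpha over the item-score matrix; a minimal sketch on synthetic data (14 items scored 0-3 so that totals span 0-42 as for the NAVQ-P; the simulated responses are not trial data):

        import numpy as np

        def cronbach_alpha(items):
            # items: (n_respondents, n_items) matrix of item scores.
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_vars / total_var)

        # Simulated respondents share a latent "near vision difficulty" trait,
        # so the 14 items correlate and alpha comes out high.
        rng = np.random.default_rng(2)
        ability = rng.normal(size=(200, 1))
        scores = np.clip(np.round(1.5 + ability + rng.normal(0, 0.5, (200, 14))), 0, 3)
        print(f"alpha = {cronbach_alpha(scores):.3f}")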