
    A Comparison of Cranial Cavity Extraction Tools for Non-contrast Enhanced CT Scans in Acute Stroke Patients

    Cranial cavity extraction is often the first step in quantitative neuroimaging analyses. However, few automated, validated extraction tools have been developed for non-contrast enhanced CT scans (NECT). The purpose of this study was to compare and contrast freely available tools in an unseen dataset of real-world clinical NECT head scans to assess the performance and generalisability of these tools. This study included data from a demographically representative sample of 428 patients who had completed NECT scans following hospitalisation for stroke. In a subset of the scans (n = 20), the intracranial spaces were segmented using automated tools and compared to the gold standard of manual delineation to calculate accuracy, precision, recall, and dice similarity coefficient (DSC) values. Further, three readers independently performed regional visual comparisons of the quality of the results in a larger dataset (n = 428). Three tools were found; one of these had unreliable performance, so subsequent evaluation was discontinued. The remaining tools included one that was adapted from the FMRIB software library (fBET) and a convolutional neural network-based tool (rBET). Quantitative comparison showed comparable accuracy, precision, recall and DSC values (fBET: 0.984 ± 0.002; rBET: 0.984 ± 0.003; p = 0.99) between the tools; however, intracranial volume was overestimated. Visual comparisons identified characteristic regional differences in the resulting cranial cavity segmentations. Overall, fBET had the highest visual quality ratings and was preferred by the readers in the majority of subject results (84%). However, both tools produced high quality extractions of the intracranial space and our findings should improve confidence in these automated CT tools. Pre- and post-processing techniques may further improve these results. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s12021-021-09534-7.
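For reference, the voxel-overlap metrics named in this abstract can be computed directly from a predicted binary mask and a gold-standard mask. The sketch below is a minimal illustration: the function name is invented, and a toy 2-D array stands in for a 3-D NECT volume.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Voxel-wise overlap metrics between a predicted binary mask
    and a manually delineated gold-standard mask."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.sum(pred & truth)     # true positives
    tn = np.sum(~pred & ~truth)   # true negatives
    fp = np.sum(pred & ~truth)    # false positives
    fn = np.sum(~pred & truth)    # false negatives
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    dsc = 2 * tp / (2 * tp + fp + fn)  # dice similarity coefficient
    return accuracy, precision, recall, dsc

# toy 2-D example standing in for a 3-D intracranial segmentation
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 1, 0], [1, 1, 0]])
acc, prec, rec, dsc = segmentation_metrics(pred, truth)
```

The DSC is the harmonic mean of precision and recall, which is why it is the usual headline figure for segmentation overlap.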

    Fabric first: is it still the right approach?

    ‘Fabric first’ describes an approach to improving the thermal performance of residential buildings by prioritising the improvement of fabric. It has historically been widely advocated. However, the urgency of complete decarbonisation challenges this approach in existing buildings. Heat decarbonisation is necessary to deliver zero-carbon goals. In many cases, no additional fabric improvement is needed to decarbonise heating; a heat pump, or other zero-carbon heat supply, will be enough. Retrofitting fabric first may not be feasible across the whole housing stock on timescales necessary for rapid decarbonisation and could therefore slow housing decarbonisation. However, fabric improvement will continue to have an important role. Energy use in buildings with a ‘heat pump only’ retrofit will be higher than if insulation were also improved. Fabric should continue to be prioritised in new buildings and where low-cost insulation measures are available. Fabric improvement can have other benefits: lower running costs, improved comfort, reduced damp risk, better heat pump performance, reduced overheating risk and lower requirements for electricity capacity increases. The suitability of a heat-pump-only approach to building decarbonisation should therefore be decided building by building. For national building stocks, complete decarbonisation of heating systems is required, but stock average fabric improvement may be 30–50%.

    The Analyticity of a Generalized Ruelle's Operator

    In this work we propose a generalization of the concept of Ruelle operator for one-dimensional lattices used in thermodynamic formalism and ergodic optimization, which we call the generalized Ruelle operator, and which generalizes both the Ruelle operator proposed in [BCLMS] and the Perron Frobenius operator defined in [Bowen]. We suppose the alphabet is given by a compact metric space, and consider a general a priori measure to define the operator. We also consider the case where the set of symbols that can follow a given symbol of the alphabet depends on that symbol, which is an extension of the original concept of transition matrices from the theory of subshifts of finite type. We prove the analyticity of the Ruelle operator and present some examples.
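    In the notation of this abstract, a Ruelle operator with an a priori measure typically takes the following form; this is a sketch of the standard definition with such a measure, not necessarily the paper's exact formulation. For a compact alphabet M, an a priori measure μ on M, and a potential A,

        (\mathcal{L}_A f)(x) = \int_M e^{A(ax)} f(ax) \, d\mu(a),

    where x is a one-sided sequence of symbols and ax denotes the sequence obtained by prepending the symbol a to x. When transitions are restricted, as in the subshift-of-finite-type extension the abstract describes, the integral runs only over those symbols a that are allowed to precede the first symbol of x.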

    Modelling the distribution of white matter hyperintensities due to ageing on MRI images using Bayesian inference

    White matter hyperintensities (WMH), also known as white matter lesions, are localised white matter areas that appear hyperintense on MRI scans. WMH commonly occur in the ageing population, and are often associated with several factors such as cognitive disorders, cardiovascular risk factors, cerebrovascular and neurodegenerative diseases. Although some links between lesion location and parametric factors such as age have already been established, the relationship between voxel-wise spatial distribution of lesions and these factors is not yet well understood. Hence, it would be of clinical importance to model the distribution of lesions at the population level and quantitatively analyse the effect of various factors on the lesion distribution model. In this work we compare various methods, including our proposed method, to generate voxel-wise distributions of WMH within a population with respect to various factors. Our proposed Bayesian spline method models the spatio-temporal distribution of WMH with respect to a parametric factor of interest, in this case age, within a population. Our probabilistic model takes as input the lesion segmentation binary maps of subjects belonging to various age groups and provides a population-level parametric lesion probability map as output. We used a spline representation to ensure a degree of smoothness in space and the dimension associated with the parameter, and formulated our model using a Bayesian framework. We tested our algorithm output on simulated data and compared our results with those obtained using various existing methods with different levels of algorithmic and computational complexity. We then compared the better performing methods on a real dataset, consisting of 1000 subjects of the UK Biobank, divided into two groups based on hypertension diagnosis. Finally, we applied our method to a clinical dataset of patients with vascular disease.
On the simulated dataset, the results from our algorithm showed a mean square error (MSE) value lower than the MSE value reported in the literature, with the advantage of being robust and computationally efficient. In the UK Biobank data, we found that the lesion probabilities are higher for the hypertension group compared to the non-hypertension group and further verified this finding using a statistical t-test. Finally, when applying our method to patients with vascular disease, we observed that the overall probability of lesions is significantly higher in later age groups, which is in line with the current literature.
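The group comparison described for the UK Biobank data can be illustrated with a simplified sketch that skips the Bayesian spline smoothing entirely: it builds empirical per-voxel lesion frequency maps for two groups and compares them with a Welch t statistic. All data, group sizes, and voxel counts below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative stand-in data: binary lesion maps (subjects x voxels)
# for a hypertension and a non-hypertension group
hyp = rng.random((40, 100)) < 0.30     # higher per-voxel lesion rate
no_hyp = rng.random((40, 100)) < 0.15  # lower per-voxel lesion rate

# population-level lesion probability maps: per-voxel lesion frequency
p_hyp = hyp.mean(axis=0)
p_no = no_hyp.mean(axis=0)

def welch_t(a, b):
    """Per-voxel Welch two-sample t statistic between groups a and b."""
    ma, mb = a.mean(axis=0), b.mean(axis=0)
    va, vb = a.var(axis=0, ddof=1), b.var(axis=0, ddof=1)
    return (ma - mb) / np.sqrt(va / a.shape[0] + vb / b.shape[0])

t = welch_t(hyp.astype(float), no_hyp.astype(float))
```

The paper's spline representation would additionally smooth these frequency maps over space and over the parametric dimension (age); here the raw frequencies serve only to show the shape of the comparison.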

    Interpenetration isomers in isoreticular amine-tagged zinc MOFs

    The effect of increasing steric size of pendant amine substituents on structural isoreticulation has been studied systematically in a series of Zn-MOFs. Linear biphenyl dicarboxylic acids tagged with pendant primary amine (H2bpdc-NH2), allylamine (H2bpdc-NHallyl), diallylamine (H2bpdc-N(allyl)2) and dimethylamine (H2bpdc-NMe2) groups react with zinc nitrate in DMF to yield a set of interpenetrated MOFs, WUF-11-14, respectively, that are structurally akin to IRMOF-9. The allylated amine ligands undergo C-N cleavage reactions under the synthesis conditions, yielding WUF-12 and WUF-13 as multivariate MOFs. Single-crystal X-ray crystallography on this set of MOFs was not straightforward, and we give a salutary account of the difficulties encountered. Gas adsorption measurements combined with surface area calculations provide invaluable support for the crystallographic assignments. The crystallographic analyses reveal subtle differences in the relative positions of the interpenetrating frameworks, and we present a classification system for this type of MOF and analyse related examples available in the literature. CO2 adsorption measurements revealed that WUF-14, which features the strongest Brønsted basic dimethylamine tag group, has the highest capacity, isosteric heat of adsorption, and CO2/N2 selectivity.

    The increase of the functional entropy of the human brain with age

    We use entropy to characterize intrinsic ageing properties of the human brain. Analysis of fMRI data from a large dataset of individuals, using resting state BOLD signals, demonstrated that a functional entropy associated with brain activity increases with age. During an average lifespan, the entropy, which was calculated from a population of individuals, increased by approximately 0.1 bits, due to correlations in BOLD activity becoming more widely distributed. We attribute this to the number of excitatory neurons and the excitatory conductance decreasing with age. Incorporating these properties into a computational model leads to quantitatively similar results to the fMRI data. Our dataset involved males and females and we found significant differences between them. The entropy of males at birth was lower than that of females. However, the entropies of the two sexes increase at different rates, and intersect at approximately 50 years; after this age, males have a larger entropy.
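One common way to quantify how widely correlations are distributed is the entropy, in bits, of the normalised eigenvalue spectrum of the region-by-region correlation matrix. The sketch below assumes that definition, which may differ from the paper's exact estimator, and uses random data in place of real resting-state BOLD signals.

```python
import numpy as np

rng = np.random.default_rng(1)

# illustrative stand-in for resting-state BOLD: time points x brain regions
bold = rng.standard_normal((200, 30))

# region-by-region Pearson correlation matrix
corr = np.corrcoef(bold, rowvar=False)

# entropy (in bits) of the normalised eigenvalue spectrum: low when a few
# modes dominate the activity, high when correlations are spread widely
# across many modes
eigvals = np.linalg.eigvalsh(corr)
eigvals = np.clip(eigvals, 1e-12, None)  # guard against tiny negatives
p = eigvals / eigvals.sum()
entropy_bits = -np.sum(p * np.log2(p))
```

Under this definition the entropy is bounded above by log2 of the number of regions, attained when all eigenvalues are equal, so an increase of roughly 0.1 bits over a lifespan is a small but measurable shift of the spectrum toward uniformity.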

    Wigs, disguises and child's play: solidarity in teacher education

    It is generally acknowledged that much contemporary education takes place within a dominant audit culture, in which accountability becomes a powerful driver of educational practices. In this culture both pupils and teachers risk being configured as a means to an assessment and target-driven end: pupils are schooled within a particular paradigm of education. The article discusses some ethical issues raised by such schooling, particularly the tensions arising for teachers, and by implication, teacher educators who prepare and support teachers for work in situations where vocational aims and beliefs may be in conflict with instrumentalist aims. The article offers De Certeau’s concept of ‘la perruque’ to suggest an opening to playful engagement for human ends in education, as a way of contending with and managing the tensions generated. I use the concept to recover a concept of solidarity for teacher educators and teachers to enable ethical teaching in difficult times.

    Extinction times in the subcritical stochastic SIS logistic epidemic

    Many real epidemics of an infectious disease are not straightforwardly super- or sub-critical, and the understanding of epidemic models that exhibit such complexity has been identified as a priority for theoretical work. We provide insights into the near-critical regime by considering the stochastic SIS logistic epidemic, a well-known birth-and-death chain used to model the spread of an epidemic within a population of a given size N. We study the behaviour of the process as the population size N tends to infinity. Our results cover the entire subcritical regime, including the "barely subcritical" regime, where the recovery rate exceeds the infection rate by an amount that tends to 0 as N → ∞ but more slowly than N^{-1/2}. We derive precise asymptotics for the distribution of the extinction time and the total number of cases throughout the subcritical regime, give a detailed description of the course of the epidemic, and compare to numerical results for a range of parameter values. We hypothesise that features of the course of the epidemic will be seen in a wide class of other epidemic models, and we use real data to provide some tentative and preliminary support for this theory.
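The birth-and-death chain described in this abstract is straightforward to simulate. The following Gillespie-style sketch, with parameter values chosen purely for illustration, estimates the mean extinction time in a subcritical setting where the recovery rate exceeds the infection rate.

```python
import numpy as np

def sis_extinction_time(N, lam, mu, i0, rng):
    """Gillespie simulation of the stochastic SIS logistic epidemic.
    With I infectives, new infections occur at rate lam * I * (N - I) / N
    and recoveries at rate mu * I; returns the time until I hits 0."""
    t, i = 0.0, i0
    while i > 0:
        up = lam * i * (N - i) / N   # infection (birth) rate
        down = mu * i                # recovery (death) rate
        total = up + down
        t += rng.exponential(1.0 / total)   # time to next event
        i += 1 if rng.random() < up / total else -1
    return t

rng = np.random.default_rng(2)
# subcritical: recovery rate mu exceeds infection rate lam
times = [sis_extinction_time(N=1000, lam=0.8, mu=1.0, i0=50, rng=rng)
         for _ in range(200)]
mean_ext = np.mean(times)
```

In the subcritical regime the infective count drifts downward, so every run terminates; repeating the simulation over a range of lam close to mu gives a numerical picture of the near-critical behaviour the paper analyses asymptotically.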