
    An investigation on the applicability of multi-microprocessing in the two dimensional digital filtering problem

    Digital image processing has been receiving an increasing amount of development in recent years, largely because high-speed digital computers are becoming readily available. In addition, the advent of the microprocessor has revolutionized the capabilities of compact electronic systems. This thesis examines the applicability of microprocessors to digital image processing. Using a Z8000 microprocessor as a baseline, the computation time for a two-dimensional fast Fourier transform is estimated for various microprocessor architectures. These results are then compared to the manufacturer's specified computation times for a Floating Point Systems AP-120B Array Processor.
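
    As an illustration of the row-column structure on which such timing estimates are built, here is a minimal NumPy sketch (the array size is an assumption for demonstration, not the thesis's benchmark configuration):

```python
# Row-column decomposition of the 2D FFT: 1D-transform every row,
# then every column. Architecture studies estimate total cost by
# counting the 1D butterflies this structure requires.
import numpy as np

def fft2_row_column(x):
    rows = np.fft.fft(x, axis=1)     # 1D FFT along each row
    return np.fft.fft(rows, axis=0)  # then along each column

image = np.random.rand(256, 256)     # illustrative image size
assert np.allclose(fft2_row_column(image), np.fft.fft2(image))
```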

    A Semi-parametric Technique for the Quantitative Analysis of Dynamic Contrast-enhanced MR Images Based on Bayesian P-splines

    Dynamic Contrast-enhanced Magnetic Resonance Imaging (DCE-MRI) is an important tool for detecting subtle kinetic changes in cancerous tissue. Quantitative analysis of DCE-MRI typically involves the convolution of an arterial input function (AIF) with a nonlinear pharmacokinetic model of the contrast agent concentration. Parameters of the kinetic model are biologically meaningful, but optimization of the nonlinear model poses significant computational issues: in practice, convergence of the optimization algorithm is not guaranteed and the accuracy of the model fitting may be compromised. To overcome these problems, this paper proposes a semi-parametric penalized spline smoothing approach in which the AIF is convolved with a set of B-splines to produce a design matrix, using locally adaptive smoothing parameters based on Bayesian penalized spline models (P-splines). It is shown that kinetic parameter estimates can be obtained from the resulting deconvolved response function, which also captures the onset of contrast enhancement. Detailed validation of the method, with both simulated and in vivo data, is provided.
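
    A minimal sketch of the design-matrix construction described above: each B-spline basis function is convolved with the AIF to form one column, and a fixed quadratic roughness penalty stands in for the locally adaptive Bayesian P-spline prior. The knot placement, toy AIF, and smoothing weight are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from scipy.interpolate import BSpline

n_t = 120
t = np.linspace(0.0, 6.0, n_t)   # acquisition times in minutes (assumed)
dt = t[1] - t[0]

# Toy bi-exponential stand-in for the arterial input function (AIF)
aif = 5.0 * (np.exp(-1.5 * t) - np.exp(-6.0 * t))

# Cubic B-spline basis over the acquisition window (clamped knots)
k = 3
knots = np.concatenate(([0.0] * k, np.linspace(0.0, 6.0, 12), [6.0] * k))
n_basis = len(knots) - k - 1
B = np.column_stack([BSpline(knots, np.eye(n_basis)[i], k)(t)
                     for i in range(n_basis)])

# Design matrix: the AIF convolved with each basis function
X = np.column_stack([dt * np.convolve(aif, B[:, i])[:n_t]
                     for i in range(n_basis)])

# Penalized least-squares fit with a second-difference roughness penalty,
# a fixed-lambda simplification of the Bayesian P-spline machinery
D = np.diff(np.eye(n_basis), n=2, axis=0)
lam = 1.0                                     # smoothing weight (assumed)
y = X @ np.ones(n_basis) + 0.01 * np.random.randn(n_t)  # toy tissue curve
coef = np.linalg.solve(X.T @ X + lam * (D.T @ D), X.T @ y)
response = B @ coef          # deconvolved contrast response function
```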

    Neuroconductor: an R platform for medical imaging analysis

    Neuroconductor (https://neuroconductor.org) is an open-source platform for rapid testing and dissemination of reproducible computational imaging software. The goals of the project are to: (i) provide a centralized repository of R software dedicated to image analysis, (ii) disseminate software updates quickly, (iii) train a large, diverse community of scientists using detailed tutorials and short courses, (iv) increase software quality via automatic and manual quality controls, and (v) promote reproducibility of image data analysis. Based on the programming language R (https://www.r-project.org/), Neuroconductor starts with 51 interoperable packages that cover multiple areas of imaging, including visualization, data processing and storage, and statistical inference. Neuroconductor accepts new R package submissions, which are subject to formal review and continuous automated testing. We provide a description of the purpose of Neuroconductor and of the user and developer experience.

    BIOH 202N.00: Human Anatomy and Physiology I - Laboratory


    Sub‐phenotyping Metabolic Disorders Using Body Composition: An Individualized, Nonparametric Approach Utilizing Large Data Sets

    Objective: This study performed individual-centric, data-driven calculations of propensity for coronary heart disease (CHD) and type 2 diabetes (T2D), utilizing magnetic resonance imaging-acquired body composition measurements, for sub-phenotyping of obesity and nonalcoholic fatty liver disease (NAFLD). Methods: A total of 10,019 participants from the UK Biobank imaging substudy were included and analyzed for visceral and abdominal subcutaneous adipose tissue, muscle fat infiltration, and liver fat. An adaptation of the k-nearest neighbors algorithm was applied to the imaging variable space to calculate individualized CHD and T2D propensity and to explore metabolic sub-phenotyping within obesity and NAFLD. Results: The ranges of CHD and T2D propensity for the whole cohort were 1.3% to 58.0% and 0.6% to 42.0%, respectively. The diagnostic performance, as area under the receiver operating characteristic curve (95% CI), using disease propensities for CHD and T2D detection was 0.75 (0.73-0.77) and 0.79 (0.77-0.81), respectively. Exploring individualized disease propensity, CHD, T2D, comorbid, and metabolically healthy phenotypes were all found within obesity and NAFLD. Conclusions: The adaptive k-nearest neighbors algorithm allowed an individual-centric assessment of each individual's metabolic phenotype, moving beyond discrete categorizations of body composition. Within obesity and NAFLD, this may help identify which comorbidities a patient may develop and consequently enable optimization of treatment.
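
    The core of the individual-centric calculation can be sketched as follows: a subject's disease propensity is the prevalence of the diagnosis among their k nearest neighbors in the standardized body-composition space. The variable names, the choice of k, and the standardization step are assumptions for illustration; the paper's adapted algorithm is not reproduced here:

```python
import numpy as np

def knn_propensity(features, labels, k=100):
    """Propensity per subject: fraction of the k nearest neighbors
    (self excluded) in body-composition space carrying the diagnosis."""
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    propensity = np.empty(len(z))
    for i in range(len(z)):
        d = np.linalg.norm(z - z[i], axis=1)   # Euclidean distances
        d[i] = np.inf                          # exclude the subject itself
        nearest = np.argpartition(d, k)[:k]    # indices of k closest
        propensity[i] = labels[nearest].mean()
    return propensity

# Toy usage with 4 imaging variables (e.g. VAT, ASAT, MFI, liver fat)
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = (rng.random(1000) < 0.1).astype(float)     # 10% toy prevalence
p = knn_propensity(X, y, k=50)
```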

    Living and Learning Communities: One University's Journey

    University housing has the capacity to offer more than comfortable living spaces, and campuses across the U.S., including our own, are exploring models of residential learning communities that provide both academic and social support to students while cultivating a strong sense of community. In this article, we describe our campus's foray into offering a new residential learning community model. We explain its origins, its evolution, and the questions we face now that we have successfully created a second approach to living-learning communities on our campus.

    XUV Opacity of Aluminum between the Cold-Solid to Warm-Plasma Transition

    We present calculations of the free-free XUV opacity of warm, solid-density aluminum at photon energies between the plasma frequency at 15 eV and the L-edge at 73 eV, using both density functional theory combined with molecular dynamics and a semi-analytical model in the RPA framework with the inclusion of local field corrections. As the temperature is increased from room temperature to 10 eV, with the ion and electron temperatures equal, we calculate an increase in the opacity in the range over which the degree of ionization is constant. The effect is less pronounced if only the electron temperature is allowed to increase. The physical significance of these increases is discussed in terms of intense XUV laser-matter interactions on both femtosecond and picosecond time scales.
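
    For orientation, the textbook chain from a model dielectric function to the absorption such calculations target can be written as below; this is a generic Drude/RPA-style relation, not the authors' specific expressions with local field corrections:

```latex
% Free-electron (Drude-like) dielectric function with collision rate \nu
\epsilon(\omega) = 1 - \frac{\omega_p^2}{\omega\,(\omega + i\nu)},
\qquad
\omega_p^2 = \frac{n_e e^2}{\epsilon_0 m_e},
% and the absorption coefficient follows from the complex refractive index:
\alpha(\omega) = \frac{2\omega}{c}\,\operatorname{Im}\sqrt{\epsilon(\omega)}.
```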

    A Multiresolution Census Algorithm for Calculating Vortex Statistics in Turbulent Flows

    The fundamental equations that model turbulent flow do not provide much insight into the size and shape of observed turbulent structures. We investigate the efficient and accurate representation of structures in two-dimensional turbulence by applying statistical models directly to the simulated vorticity field. Rather than extracting the coherent portion of the image from the background variation, as in the classical signal-plus-noise model, we present a model for individual vortices using the non-decimated discrete wavelet transform. A template image, supplied by the user, provides the features to be extracted from the vorticity field. By transforming the vortex template into the wavelet domain, specific characteristics present in the template, such as size and symmetry, are broken down into components associated with spatial frequencies. Multivariate multiple linear regression is used to fit the vortex template to the vorticity field in the wavelet domain. Since all levels of the template decomposition may be used to model each level of the field decomposition, the resulting model need not be identical to the template. We describe an application to a vortex census algorithm that records quantities of interest (such as size, peak amplitude, and circulation) as the vorticity field evolves. The multiresolution census algorithm extracts coherent structures of all shapes and sizes in simulated vorticity fields and is able to reproduce known physical scaling laws when processing a set of vorticity fields that evolve over time.
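
    A minimal sketch of the template-matching step, with PyWavelets' stationary (non-decimated) wavelet transform standing in for the transform and a per-subband least-squares slope standing in for the full multivariate cross-level regression; the wavelet choice, decomposition level, and toy Gaussian template are assumptions:

```python
import numpy as np
import pywt

def fit_template_swt(field, template, wavelet="sym4", level=3):
    """Regress each detail subband of the field on the matching subband
    of the template (one slope per subband; the paper fits a richer
    multivariate model across all levels)."""
    f_coeffs = pywt.swt2(field, wavelet, level=level)
    t_coeffs = pywt.swt2(template, wavelet, level=level)
    betas = []
    for (_, f_details), (_, t_details) in zip(f_coeffs, t_coeffs):
        for fb, tb in zip(f_details, t_details):
            x = tb.ravel()
            betas.append(float(x @ fb.ravel() / (x @ x)))  # LS slope
    return betas

# Toy usage: Gaussian "vortex" template embedded in a noisy 64x64 field
n = 64
yy, xx = np.mgrid[:n, :n]
template = np.exp(-((xx - n / 2) ** 2 + (yy - n / 2) ** 2) / 50.0)
field = template + 0.1 * np.random.randn(n, n)
print(fit_template_swt(field, template))
```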

    Gaps in Adolescent Tobacco Prevention and Counseling in Vermont

    Introduction: Tobacco use remains the leading cause of preventable death in Vermont. While the Vermont Blueprint for Health includes compensation for adult tobacco counseling, it makes no specific mention of pediatric populations. Research questions: To what extent are tobacco assessment and cessation efforts occurring in the primary care setting with pediatric patients? What factors influence providers' practices? Methods: A 12-question electronic survey, modeled on an American Academy of Pediatrics survey, was distributed to primary care providers throughout Vermont through the UVM departments of pediatrics and family medicine, the Vermont Medical Society, and the Vermont Area Health Education Center. We received 70 completed surveys. Results: 70% of the surveyed primary care providers begin tobacco counseling at the age recommended by the Vermont Department of Health (11 years). Only 45.71% of providers are confident in their understanding of the recommendations for adolescent health screening written in the Blueprint for Health. Additionally, only 67.1% of providers expressed confidence in their ability to provide guidance regarding the harmful effects of E-cigarettes, compared with 92.8% who felt confident regarding conventional cigarettes. 70% of providers listed time constraints as a significant factor in their decision not to counsel adolescents on tobacco use. Discussion: The Blueprint for Health is a guiding document for provider practices that is not well understood and does not specifically include pediatric tobacco prevention. In an environment where E-cigarette use is rising among adolescents, it is especially critical that physicians are confident in their counseling practices.