
    Galaxy Distances in the Nearby Universe: Corrections For Peculiar Motions

    By correcting redshift-dependent distances for peculiar motions through a number of peculiar velocity field models, we recover the true distances of a wide, all-sky sample of nearby galaxies (~6400 galaxies with velocities cz < 5500 km/s), which is complete up to the blue magnitude B = 14 mag. Relying on catalogs of galaxy groups, we treat ~2700 objects as members of galaxy groups and the remaining objects as field galaxies. We model the peculiar velocity field using: i) a cluster dipole reconstruction scheme; ii) a multi-attractor model fitted to the Mark II and Mark III catalogs of galaxy peculiar velocities. According to the Mark III data, the Great Attractor has a smaller influence on local dynamics than previously believed, whereas the Perseus-Pisces and Shapley superclusters acquire a specific dynamical role. Remarkably, the Shapley structure, which is found to account for nearly half the peculiar motion of the Local Group, is placed by the Mark III data closer to the zone of avoidance than its optical position would suggest. Our multi-attractor model based on Mark III data favors a cosmological density parameter Omega ~ 0.5 (irrespective of a biasing factor of order unity). Differences among the distance estimates are less pronounced in the ~2000-4000 km/s distance range than at larger or smaller distances. In the latter regions, these differences have a serious impact on the 3D maps of the galaxy distribution and on the local galaxy density on small scales.
    Comment: 24 pages including 9 EPS figures and 7 tables. Figures 1, 2, 3, and 4 are available only upon request. Accepted by Ap
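    The correction step described above can be sketched minimally in Python: subtract the model-predicted radial peculiar velocity from the observed recession velocity before applying the Hubble law. The Hubble constant value and the function name are illustrative assumptions, not taken from the paper, which uses full velocity field models rather than a single scalar correction.

    ```python
    # Minimal sketch (not the authors' code) of correcting a
    # redshift-derived distance for a model-predicted peculiar velocity.
    H0 = 75.0  # Hubble constant in km/s/Mpc; illustrative value only

    def corrected_distance(cz, v_pec_radial):
        """Recover a peculiar-motion-corrected distance (Mpc) from the
        observed recession velocity cz (km/s) by subtracting the
        model-predicted radial peculiar velocity."""
        return (cz - v_pec_radial) / H0

    # Example: a galaxy at cz = 4000 km/s with a predicted line-of-sight
    # peculiar velocity of +300 km/s toward an attractor.
    d = corrected_distance(4000.0, 300.0)
    ```

    In a real application the predicted peculiar velocity itself depends on the assumed distance, so the correction is solved iteratively or via a full velocity field model.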

    Assessment of Image Quality of a PET/CT Scanner for a Standardized Imaging Situation Using a NEMA Body Phantom: "The Impact of Different Image Reconstruction Parameters on Image Quality"

    Radiologists and medical practitioners work daily with images from integrated Positron Emission Tomography/Computed Tomography (PET/CT) scanners in order to detect potentially lethal diseases. It is thus very important to ensure that these images have adequate image quality. For the staff responsible for quality assurance of the scanner, it is important to ensure that the reconstruction procedures and image protocols in use enable the acquisition of images of high quality with respect to resolution and contrast, while the data sets contain as little noise as possible. The goal of the quality assurance work is to continuously make sure that the data acquisition settings, and especially the reconstruction procedure utilized for routine clinical purposes, enable lesions, cancerous tissue, and disease to be detected. This master's thesis project aims at evaluating an iterative reconstruction algorithm and some key parameters applied in image reconstruction: the selected filter (Gaussian, median, Hann, or Butterworth), the selected full width at half maximum (FWHM: 3, 5, and 7 mm), and the image matrix size (128 x 128 or 168 x 168 pixels), in order to provide information on how these key parameters affect image quality. The National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) Body Phantom Set was used in this work. It consists of a lid with six fillable spheres (with internal diameters of 37, 28, 22, 17, 13, and 10 mm), a lung insert, the body phantom (which represents the background volume), and a test phantom. The work in this thesis project has been carried out using the radiopharmaceutical tracer F-18 FDG (fluorodeoxyglucose), produced with a General Electric PETtrace 6 cyclotron at the Center for Nuclear Medicine/PET at Haukeland University Hospital in Bergen, Norway.
The applied F-18 FDG was produced in a 2.5 ml target volume at the cyclotron. After production, this volume was delivered from the cyclotron into a sealed 20 ml cylindrical glass vial already containing 17.5 ml of non-radioactive water. The activity level of this 20 ml solution of F-18 FDG and water was measured in a dose calibrator (ISOMED 2010™). The solution was then diluted further, in an iterative process, a number of times in order to obtain the activity concentrations required for both the selected hot spheres and the background volume. The aim was to obtain activity concentrations giving sphere-to-background ratios of either 4:1 or 8:1. The sphere-to-background ratio in this work is the ratio between the radioactivity concentration in the four small spheres (diameters 22, 17, 13, and 10 mm, with a total volume of 9.8 ml) and the radioactivity concentration in the main body of the phantom, the so-called background volume (9708 ml). The two larger spheres (28 and 37 mm) were filled with non-radioactive water to represent areas without radioactivity, i.e. "cold spheres". Once the spheres and background volume were filled with the desired activity and the activity levels were measured, the spheres were mounted in the body phantom and the phantom was sealed to avoid spillage. The prepared NEMA IEC body phantom was placed on the table of a Siemens Biograph 40 PET/CT scanner in a predetermined, reproducible position and scanned using a standard clinical whole-body PET/CT protocol. The acquired images were reconstructed. Three repeated studies were performed for each concentration ratio; in every experiment the sphere-to-background ratio was either 4:1 or 8:1. A selection of different standardized reconstruction parameters and image corrections was applied.
This was done in order to study the impact that changes of the reconstruction parameters have on image quality, where image quality is defined by a quantification of the measured relative contrast in the images studied. The procedures followed while performing the PET/CT scans complied with the procedure recommended in the NEMA NU 2-2007 manual (from the manufacturer of the NEMA IEC body phantom described above). The reconstructed images were analyzed manually on a PET/CT workstation and also automatically with Python software developed specifically for this work. The image quality results obtained from analyses of the images reconstructed with different parameters were then compared to those from the standardized protocol for reconstruction of PET/CT images. Lastly, the results were compared with other similar work on the same subject by Bergmann et al. (2005).
Master's thesis in Medical Biology.
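    The relative-contrast quantification used to define image quality above can be sketched along the lines of the NEMA NU 2 percent-contrast definitions for hot and cold spheres; the formulas below follow the standard NEMA convention, and the variable names are illustrative, not drawn from the thesis software.

    ```python
    # Hedged sketch of NEMA NU 2-style percent-contrast figures of merit.
    # c_sphere and c_bkg are mean ROI counts in the sphere and background.

    def hot_contrast(c_sphere, c_bkg, ratio):
        """Percent contrast for a hot sphere at a known
        activity-concentration ratio (4:1 -> ratio = 4, 8:1 -> ratio = 8)."""
        return 100.0 * (c_sphere / c_bkg - 1.0) / (ratio - 1.0)

    def cold_contrast(c_sphere, c_bkg):
        """Percent contrast for a cold (water-filled) sphere."""
        return 100.0 * (1.0 - c_sphere / c_bkg)

    # Example: an ideally recovered hot sphere at an 8:1 ratio gives 100%.
    q = hot_contrast(8.0, 1.0, 8.0)  # -> 100.0
    ```

    In practice, partial volume effects and the chosen filter FWHM pull the measured hot-sphere contrast well below 100%, especially for the smallest spheres, which is exactly the dependence the thesis sets out to measure.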

    Determining the Quantitative Principles of T Cell Response to Antigenic Disparity in Stem Cell Transplantation

    Alloreactivity compromising clinical outcomes in stem cell transplantation is observed despite HLA matching of donors and recipients. This has its origin in the variation between the exomes of the two, which provides the basis for minor histocompatibility antigens (mHA). The mHA presented on HLA class I and II molecules, and the ensuing T cell response to these antigens, result in graft-vs.-host disease. In this paper, results of a whole exome sequencing study are presented, with the resulting alloreactive polymorphic peptides and their HLA class I and HLA class II (DRB1) binding affinities quantified. Large libraries of potentially alloreactive recipient peptides binding both sets of molecules were identified, with HLA-DRB1 generally presenting a greater number of peptides. These results are used to develop a quantitative framework for understanding the immunobiology of transplantation. A tensor-based approach is used to derive the equations needed to determine the alloreactive donor T cell response from the mHA-HLA binding affinity and protein expression data. This approach may be used in future studies to simulate the magnitude of the expected donor T cell response and to determine the risk of alloreactive complications in HLA-matched or -mismatched hematopoietic cell and solid organ transplantation.
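    The peptide-library step above (keeping polymorphic peptides by predicted HLA binding affinity) can be illustrated with a minimal sketch. The commonly used IC50 < 500 nM binder cutoff, the function name, and the example peptides are assumptions for illustration, not details taken from the paper.

    ```python
    # Illustrative sketch (not the authors' pipeline): selecting candidate
    # alloreactive peptides by predicted HLA binding affinity. The ~500 nM
    # IC50 binder threshold is a common convention assumed here.

    def select_binders(peptides, ic50_nM, threshold=500.0):
        """Keep peptides whose predicted IC50 (nM) falls below the
        threshold; a lower IC50 means stronger HLA binding."""
        return [p for p, a in zip(peptides, ic50_nM) if a < threshold]

    # Hypothetical peptides with hypothetical predicted affinities:
    candidates = select_binders(["SIINFEKL", "GILGFVFTL"], [45.0, 1200.0])
    # only the first peptide passes the 500 nM cut
    ```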

    Proceedings of the 2011 New York Workshop on Computer, Earth and Space Science

    The purpose of the New York Workshop on Computer, Earth and Space Sciences is to bring together the New York area's finest astronomers, statisticians, computer scientists, and space and Earth scientists to explore potential synergies between their respective fields. The 2011 edition (CESS2011) was a great success, and we would like to thank all of the presenters and participants for attending. This year was also special, as it included authors from the upcoming book "Advances in Machine Learning and Data Mining for Astronomy". Over two days, the latest advanced techniques used to analyze the vast amounts of information now available for the understanding of our universe and our planet were presented. These proceedings attempt to provide a small window into the current state of research in this vast interdisciplinary field, and we would like to thank the speakers who took the time to contribute to this volume.
    Comment: Author lists modified. 82 pages. Workshop proceedings from CESS 2011 in New York City, Goddard Institute for Space Studies

    CHILES: HI morphology and galaxy environment at z=0.12 and z=0.17

    We present a study of 16 HI-detected galaxies found in 178 hours of observations from Epoch 1 of the COSMOS HI Large Extragalactic Survey (CHILES). We focus on two redshift ranges, 0.108 <= z <= 0.127 and 0.162 <= z <= 0.183, which are among the worst affected by radio frequency interference (RFI). While this represents only 10% of the total frequency coverage and 18% of the total expected time on source of the full CHILES survey, we demonstrate that our data reduction pipeline recovers high-quality data even in regions severely impacted by RFI. We report on our in-depth testing of an automated spectral line source finder to produce HI total intensity maps, which we present side by side with significance maps to evaluate the reliability of the morphology recovered by the source finder. We recommend that this become a commonplace manner of presenting data from upcoming HI surveys of resolved objects. We use the COSMOS 20k group catalogue, and we extract filamentary structure using the topological DisPerSE algorithm, to evaluate the HI morphology in the context of both local and large-scale environments, and we discuss the shortcomings of both methods. Many of the detections show disturbed HI morphologies, suggesting they have undergone a recent interaction that is not evident from deep optical imaging alone. Overall, the sample showcases the broad range of ways in which galaxies interact with their environment. This is a first look at the population of galaxies and their local and large-scale environments observed in HI by CHILES at redshifts beyond the z = 0.1 Universe.
    Comment: 23 pages, 12 figures, 1 interactive 3D figure, accepted to MNRAS
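    The HI total intensity (moment-0) maps mentioned above can be sketched, under simple assumptions, as a masked sum over the channels of a spectral cube; this is an illustrative outline, not the CHILES pipeline or its source finder.

    ```python
    import numpy as np

    # Minimal sketch (not the CHILES pipeline): a total-intensity
    # (moment-0) map from a spectral cube, summing only the voxels the
    # line finder flagged as real signal.

    def moment0(cube, mask, channel_width):
        """cube: (nchan, ny, nx) flux densities; mask: boolean source
        mask from a line finder; channel_width: velocity width per
        channel. Returns the (ny, nx) integrated-intensity map."""
        return (cube * mask).sum(axis=0) * channel_width

    # Toy cube: a single bright voxel in the middle channel.
    cube = np.zeros((3, 4, 4))
    cube[1, 2, 2] = 5.0
    mom0 = moment0(cube, cube > 1.0, channel_width=10.0)
    ```

    The significance maps discussed in the abstract address the weakness of this approach: a moment-0 map alone does not show which features of the recovered morphology are above the noise.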

    Collisions of inhomogeneous pre-planetesimals

    In the framework of the coagulation scenario, kilometre-sized planetesimals form by successive collisions of pre-planetesimals with sizes from centimetres to hundreds of metres. Pre-planetesimals are fluffy, porous dust aggregates, which are inhomogeneous owing to their collisional history. Planetesimal growth can be prevented by catastrophic disruption in pre-planetesimal collisions above the destruction velocity threshold. We develop an inhomogeneity model based on the density distribution of dust aggregates, which is assumed to be Gaussian with a well-defined standard deviation. As a second input parameter, we consider the typical size of an inhomogeneous clump. These input parameters are easily accessible by laboratory experiments. For the simulation of the dust aggregates, we utilise a smoothed particle hydrodynamics (SPH) code with extensions for modelling porous solid bodies. The porosity model was previously calibrated for the simulation of silica dust, which commonly serves as an analogue for pre-planetesimal material. The inhomogeneity is imposed as an initial condition on the SPH particle distribution. We carry out collisions of centimetre-sized dust aggregates of intermediate porosity, varying the standard deviation of the inhomogeneous distribution at fixed typical clump size. The collision outcome is categorised according to the four-population model. We show that inhomogeneous pre-planetesimals are more prone to destruction than homogeneous aggregates: even slight inhomogeneities can lower the threshold for catastrophic disruption. For a fixed collision velocity, the sizes of the fragments decrease with increasing inhomogeneity. Pre-planetesimals with an active collisional history tend to be weaker. This is a possible obstacle to collisional growth and needs to be taken into account in future studies of the coagulation scenario.
    Comment: 12 pages, 9 figures, 4 tables
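    The initial condition described above, a Gaussian density distribution with a chosen standard deviation imposed on the SPH particles, can be sketched as follows; the positive density floor, the seed, and all names are illustrative assumptions, not the authors' code, and the clump-size parameter is omitted.

    ```python
    import numpy as np

    # Hedged sketch of the abstract's inhomogeneity model: each SPH
    # particle gets a density drawn from a Gaussian around the mean,
    # with the standard deviation as the model's input parameter.

    rng = np.random.default_rng(42)

    def inhomogeneous_density(n_particles, mean_density, sigma):
        """Draw per-particle densities from N(mean_density, sigma),
        clipped at a small positive floor so no particle receives a
        non-physical (zero or negative) density."""
        rho = rng.normal(mean_density, sigma, size=n_particles)
        return np.clip(rho, 0.01 * mean_density, None)

    # Example: 1000 particles, mean density 1.0, moderate inhomogeneity.
    rho = inhomogeneous_density(1000, mean_density=1.0, sigma=0.2)
    ```

    Sweeping `sigma` at fixed clump size is the numerical experiment the abstract describes: larger standard deviations correspond to stronger inhomogeneity and, per the paper's result, a lower disruption threshold.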