
    Weighted ICP Algorithm for Alignment of Stars from Scanned Astronomical Photographic Plates

    ACM Computing Classification System (1998): I.2.8, I.2.10, I.5.1, J.2.

    Given the coarse celestial coordinates of the centre of a plate scan and the field of view, we look for a mapping between the stars extracted from the image and the stars from a catalogue, where the stars from both sources are represented by their stellar magnitudes and their coordinates relative to the image centre. In previous work we demonstrated the application of the Iterative Closest Point (ICP) algorithm to the alignment problem, with stars represented only by their geometric coordinates. ICP produces a translation and rotation of the initial points: the correction required for one set of stars to fit over the other. This paper extends the previous work by demonstrating a significant improvement of ICP obtained by using the stellar magnitudes as point weights. The improvement is a large decrease in the number of iterations until convergence, which helps in the case of highly "misaligned" initial states. The essential properties of the ICP method, such as tolerance to false or missing stars, are preserved.

    This work is partially supported by the following projects: (1) Creative Development Support of Doctoral Students, Post-Doctoral and Young Researchers in the Field of Computer Science, BG 051-PO-001-3.3.04/13, European Social Fund 2007–2013, Operational programme "Human resources development", and (2) Astroinformatics, grant DO-02-275/2008 of the National Science Fund of the Bulgarian Ministry of Education, Youth and Science.
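The magnitude-weighted variant described above can be sketched in a few lines: magnitudes are converted to fluxes and used as least-squares weights in the rigid-fit (Kabsch) step of each ICP iteration. This is a minimal 2-D illustration under assumed function names and convergence details, not the paper's actual implementation:

```python
import numpy as np

def weighted_icp(src, dst, weights, n_iter=50, tol=1e-9):
    """Align 2-D point set `src` to `dst` by iterating nearest-neighbour
    matching and a weighted least-squares rigid fit (Kabsch algorithm).
    `weights` are per-point weights, e.g. fluxes derived from stellar
    magnitudes so that brighter stars count more."""
    R_total, t_total = np.eye(2), np.zeros(2)
    cur = src.copy()
    w = weights / weights.sum()
    prev_err = np.inf
    for _ in range(n_iter):
        # match each image star to its nearest catalogue star
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # weighted centroids and weighted cross-covariance
        mu_s = (w[:, None] * cur).sum(0)
        mu_d = (w[:, None] * matched).sum(0)
        H = (w[:, None] * (cur - mu_s)).T @ (matched - mu_d)
        # SVD gives the optimal rotation (with reflection correction)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.linalg.det(Vt.T @ U.T)])
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = (w * np.sqrt(((cur - matched) ** 2).sum(-1))).sum()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total, cur
```

With correct correspondences the weighted Kabsch step recovers an exact rigid transform in one iteration, which is why down-weighting likely-mismatched faint stars can cut the iteration count.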

    The Cryogenic AntiCoincidence detector for Athena X-IFU

    Athena is an ESA project for a space telescope for X-ray astrophysics. The scientific goal is to study the Universe by measuring the evolution of baryonic matter in large-scale structures, such as the warm-hot intergalactic medium, as well as in energetic compact objects. Because most of the baryonic component of the Universe is locked up in hot gas at temperatures of about a million degrees, and because of the extreme energetics of the processes close to the event horizon of black holes, understanding the hot and energetic Universe requires space-based observations in the X-ray band. This requires spatially resolved X-ray spectroscopy and deep wide-field X-ray spectral imaging with capabilities far beyond those of current observatories like XMM-Newton and Chandra. The observatory will be a 12-meter fixed-focus telescope with two instruments: the innovative X-ray Integral Field Unit (X-IFU), based on cryogenic detectors, and the Wide Field Imager (WFI). These two instruments combine the high spectral resolution of X-IFU with the high spatial resolution of WFI to achieve the scientific goals, covering an energy range from 0.5 to 10 keV. X-IFU is based on Transition Edge Sensors (TES) cooled to 50 mK, which exploit the metal-superconductor transition. These can provide the required energy resolution while offering exceptional efficiency compared to the spectrometers on the current generation of X-ray observatories. Since the telescope will operate in an environment rich in cosmic rays, it would be impossible to separate signal from background using the X-ray detector alone. In X-IFU this problem is solved by an active anticoincidence layer, which makes it possible to achieve the scientific goals for the spectroscopy of faint or distant sources. The work done in this thesis focused on the anticoincidence detector, one of the core parts of the instrument.
Its purpose is to reduce the signal background by about two orders of magnitude, and it will be positioned only 1 mm below the spectrometer. The Demonstration Model (DM) of the detector has been studied, realized, and tested, with particular attention to improving the understanding and technology of microfabrication of superconducting devices. The detector is fabricated using optical microlithography together with PLD, electron-beam evaporation, and RF-sputtering film deposition systems. The DM active area consists of 96 Ir/Au TES films connected in parallel with superimposed Nb strip lines, insulated with a SiO film, and four heaters on a Si absorber. The pixel is freestanding and attached to a gold frame with four Si beams. The frame is needed for strong coupling to a cryostat, since the operating point is below 1 K, and the heaters and the beams are needed to control the decoupling of the active area. Measurements are performed at temperatures around 0.1 K (the theoretical operating point of X-IFU) in a dilution cryostat, reading signals from radiation sources such as Am-241 at 60 keV or Fe-55 at 5 keV. The very low impedance of TES sensors requires a SQUID to read the output signal. In addition, some structural models of the detector have been fabricated and vibrated to understand the structural characteristics and to test the response to the stresses the detector will experience during launch. Variations of the detector were studied to test its spectroscopic capabilities and to measure its thermal characteristics. To better understand the overall signal generation inside the absorber, a model and simulation of the phonon distribution of the athermal transient were developed. Finally, the detector was tested in conjunction with the NASA spectrometer to verify its anticoincidence performance.
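As a rough illustration of what an anticoincidence layer contributes to the science channel, the sketch below flags spectrometer events that arrive within a short time window of a hit on the anticoincidence detector and keeps the rest as X-ray candidates. The window length and function names are invented for this example; the real X-IFU processing chain is far more sophisticated:

```python
import numpy as np

def veto_events(det_times, ac_times, window=1e-4):
    """Return the detector event times that are NOT coincident (within
    `window` seconds) with any hit on the anticoincidence layer.
    Coincident events are attributed to particle background and rejected."""
    det = np.asarray(det_times, dtype=float)
    ac = np.sort(np.asarray(ac_times, dtype=float))
    if ac.size == 0:
        return det
    # index of the first anticoincidence hit at or after each event
    idx = np.searchsorted(ac, det)
    prev_hit = ac[np.clip(idx - 1, 0, ac.size - 1)]
    next_hit = ac[np.clip(idx, 0, ac.size - 1)]
    coincident = ((idx > 0) & (det - prev_hit <= window)) | \
                 ((idx < ac.size) & (next_hit - det <= window))
    return det[~coincident]
```

Sorting the anticoincidence hits once and using binary search keeps the veto O((n+m) log m), which matters at the event rates a cosmic-ray-rich orbit produces.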

    Science Mission Directorate TechPort Records for 2019 STI-DAA Release

    The role of the Science Mission Directorate (SMD) is to enable NASA to achieve its science goals in the context of the Nation's science agenda. SMD's strategic decisions regarding future missions and scientific pursuits are guided by Agency goals, input from the science community (including the recommendations set forth in the National Research Council (NRC) decadal surveys), and a commitment to preserve a balanced program across the major science disciplines. Toward this end, each of the four SMD science divisions -- Heliophysics, Earth Science, Planetary Science, and Astrophysics -- develops fundamental science questions upon which to base future research and mission programs.

    Large-Scale periodic solar velocities: An observational study

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive iron line at 5124 Å, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order, and of the signal produced by displacing the solar image relative to the polarizing optics or diffraction grating.
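The centre-versus-limb differential idea can be illustrated numerically. The toy sketch below samples a unit solar disk on a grid and subtracts the mean line-of-sight velocity in a narrow limb annulus from that in a small central aperture; the aperture radii and the example velocity field (a uniform radial outflow, whose line-of-sight component scales as mu = sqrt(1 - r^2)) are illustrative assumptions, not the observatory's actual configuration:

```python
import numpy as np

def center_limb_difference(v_los, n=400):
    """Mean line-of-sight velocity in a small central aperture (r < 0.2)
    minus the mean in a narrow limb annulus (0.9 < r < 1.0) on the unit
    solar disk. `v_los(x, y)` gives the line-of-sight velocity field.
    Aperture radii are illustrative choices."""
    xs = np.linspace(-1.0, 1.0, n)
    x, y = np.meshgrid(xs, xs)
    r = np.hypot(x, y)
    v = v_los(x, y)
    center = v[r < 0.2].mean()
    limb = v[(r > 0.9) & (r < 1.0)].mean()
    return center - limb

# Uniform radial outflow: its line-of-sight component on the visible disk
# is v_r * cos(heliocentric angle) = v_r * sqrt(1 - r^2).
radial = lambda x, y: np.sqrt(np.clip(1.0 - x**2 - y**2, 0.0, None))
```

For this field the central aperture sees nearly the full radial amplitude while the limb annulus sees only a fraction of it, so the differential signal isolates the motion much as the polarization-switched measurement above does.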

    Natural Sciences in Archaeology and Cultural Heritage

    A Special Issue of the international journal Sustainability, under the section Sustainability of Culture and Heritage, has been produced, entitled Natural Sciences in Archaeology and Cultural Heritage. The bridge between science and technology and the humanities (archaeology, anthropology, history of art, and cultural heritage) has formed a well-established interdisciplinary subject with several sub-disciplines; it is growing exponentially, spurred by the fast development of technology in other fields (space exploration, medical, military, and industrial applications). On the other hand, art and culture struggle to survive due to neglect, lack of funding, or the dangers of events such as natural disasters and war. This volume strengthens and documents the sustainability that emerges from the outcomes of such research and from the application of this dual link. The sustainable dimension emerges from society, education, and economics through the impact of cultural growth, all of which produce a balanced society in which prosperity, harmony, and development are merged at a sustainable local, regional, national, and social level. A wide range of subjects linking the applied natural sciences with archaeology and cultural heritage, covering innovative research and applications, is presented in this volume.

    Significant Accomplishments in Science and Technology at Goddard Space Flight Center, 1969

    Aerospace scientific and technological studies in 1969 for satellite systems and spacecraft missions.

    From nanometers to centimeters: Imaging across spatial scales with smart computer-aided microscopy

    Microscopes have been an invaluable tool throughout the history of the life sciences, as they allow researchers to observe the minuscule details of living systems in space and time. However, modern biology studies complex and non-obvious phenotypes and their distributions in populations, and thus requires that microscopes evolve from visual aids for anecdotal observation into instruments for objective and quantitative measurements. To this end, many cutting-edge developments in microscopy are fuelled by innovations in the computational processing of the generated images. Computational tools can be applied in the early stages of an experiment, where they allow for reconstruction of images with higher resolution and contrast or more colors compared to the raw data. In the final analysis stage, state-of-the-art image analysis pipelines seek to extract interpretable and humanly tractable information from the high-dimensional space of images. In the work presented in this thesis, I performed super-resolution microscopy and wrote image analysis pipelines to derive quantitative information about multiple biological processes. I contributed to studies on the regulation of DNMT1 by implementing machine learning-based segmentation of replication sites in images and performed quantitative statistical analysis of the recruitment of multiple DNMT1 mutants. To study the spatiotemporal distribution of the DNA damage response, I performed STED microscopy and could provide a lower bound on the size of the elementary spatial units of DNA repair. In this project, I also wrote image analysis pipelines and performed statistical analysis to show a decoupling of DNA density and heterochromatin marks during repair. More on the experimental side, I helped in the establishment of a protocol for many-fold color multiplexing by iterative labelling of diverse structures via DNA hybridization.
Turning from small-scale details to the distribution of phenotypes in a population, I wrote a reusable pipeline for fitting models of cell cycle stage distribution and inhibition curves to high-throughput measurements, to quickly quantify the effects of innovative antiproliferative antibody-drug conjugates. The main focus of the thesis is BigStitcher, a tool for the management and alignment of terabyte-sized image datasets. Such enormous datasets are nowadays generated routinely with light-sheet microscopy and sample preparation techniques such as clearing or expansion. Their sheer size, high dimensionality, and unique optical properties pose a serious bottleneck for researchers and require specialized processing tools, as the images often do not fit into the main memory of most computers. BigStitcher primarily allows for fast registration of such many-dimensional datasets on conventional hardware using optimized multi-resolution alignment algorithms. The software can also correct a variety of aberrations such as fixed-pattern noise, chromatic shifts, and even complex sample-induced distortions. A defining feature of BigStitcher, as well as of the various image analysis scripts developed in this work, is their interactivity. A central goal was to leverage the user's expertise at key moments and bring innovations from the big-data world to the lab, with its smaller and much more diverse datasets, without replacing scientists with automated black-box pipelines. To this end, BigStitcher was implemented as a user-friendly plug-in for the open-source image processing platform Fiji and provides users with a nearly instantaneous preview of the aligned images and opportunities for manual control of all processing steps. With its powerful features and ease of use, BigStitcher paves the way to the routine application of light-sheet microscopy and other methods producing equally large datasets.
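BigStitcher's registration machinery is far richer than any toy, but the basic building block of estimating the translation between two overlapping tiles can be sketched with phase correlation. The function below is a minimal illustration of that classic idea, not BigStitcher's actual code:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) such that `b` is approximately
    `np.roll(a, (dy, dx), axis=(0, 1))`, using the normalized cross-power
    spectrum. Real stitching pipelines add subpixel refinement and
    multi-resolution search on top of this building block."""
    cross = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
    cross /= np.maximum(np.abs(cross), 1e-12)   # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks in the upper half of each axis correspond to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```

Because the work happens in Fourier space, the cost is dominated by FFTs and is independent of the shift magnitude, which is one reason phase correlation scales well to very large tiled acquisitions.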
