11 research outputs found

    The Mauritius radio telescope and a study of selected supernova remnants associated with pulsars

    The just-completed Mauritius Radio Telescope, a 2 km by 1 km T-shaped synthesis array designed to repeat the 6C survey but for the Southern hemisphere, is described. Full details of the instrument hardware, and of the software designed to interpret its output, are presented. The early results from the instrument, in the form of maps of known SNRs with possibly associated pulsars, are shown. In combination with publicly available X-ray data and published maps at other frequencies, conclusions on the associations are drawn. The sources described are G5.4-2.3 with PSR 1757-23, G8.7-0.1 with PSR 1800-21, G315.4-2.3 with SN 185, G320.4-1.2 with PSR 1509-58 and G343.1-2.3 with PSR 1706-44. The MRT project is an example of hardware simplifications being made possible by the increasing power and sophistication of software and computation. The sheer speed with which calculations can now be done has allowed corrections to be applied post factum, where previously they would have had to be fixed in hardware (at much greater cost).

    Discrete Wavelet Methods for Interference Mitigation: An Application To Radio Astronomy

    The field of wavelets concerns the analysis and alteration of signals at various resolutions. This is achieved through the use of analysis functions referred to as wavelets. A wavelet is a signal defined for some brief period of time that contains oscillatory characteristics. Generally, wavelets are intentionally designed to possess particular qualities relevant to a particular signal processing application. This research project makes use of wavelets to mitigate interference, and documents how wavelets are effective in the suppression of Radio Frequency Interference (RFI) in the context of radio astronomy. The study begins with the design of a library of smooth orthogonal wavelets well suited to interference suppression. This is achieved through a multi-parameter optimization applied to a trigonometric parameterization of the wavelet filters used to implement the Discrete Wavelet Transform (DWT). This is followed by the design of a simplified wavelet interference suppression system, from which measures of performance and suitability are considered. It is shown that the optimal performance metrics for the suppression system are Shannon's entropy, the Root Mean Square Error (RMSE) and normality testing using the Lilliefors test. From the application of these heuristics, the optimal thresholding mechanism was found to be the universal adaptive threshold, and entropy-based measures were found to be optimal for matching wavelets to interference. This in turn resulted in the implementation of the wavelet suppression system, which consists of a bank of matched filters used to determine which interference source is present in a sampled time-domain vector. From this, the astronomy-based application was documented and results were obtained. It is shown that the wavelet-based interference suppression system outperforms existing flagging techniques. This is demonstrated by considering the number of sources within a radio image of the Messier 83 (M83) galaxy and the power of the main source in the image: the designed system yields a 27% increase in the number of sources in the recovered radio image at the cost of only a 1.9% loss of power in the main source.
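    To illustrate the thresholding step described above, the following is a minimal sketch of one common wavelet-based RFI excision scheme: decompose the time series, estimate the noise level from the finest detail band, clip coefficients whose magnitude exceeds the Donoho-Johnstone universal threshold (where sparse, strong RFI concentrates), and reconstruct. It uses the PyWavelets library and a stock Daubechies wavelet in place of the custom optimized wavelet bank from the thesis, so the function and parameter choices are illustrative rather than the author's exact system.

    import numpy as np
    import pywt

    def suppress_rfi(signal, wavelet="db8", level=5):
        # Decompose the time series into approximation + detail coefficients.
        coeffs = pywt.wavedec(signal, wavelet, level=level)

        # Robust noise estimate from the finest detail band (median absolute
        # deviation), so strong RFI spikes do not bias the estimate.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745

        # Universal threshold: sigma * sqrt(2 ln N).
        thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))

        # Clip detail coefficients above the threshold (assumed RFI-dominated),
        # keep the approximation band untouched, and reconstruct.
        cleaned = [coeffs[0]] + [np.where(np.abs(c) > thresh, 0.0, c)
                                 for c in coeffs[1:]]
        return pywt.waverec(cleaned, wavelet)[:len(signal)]

    # Example: a weak tone plus noise, contaminated by an impulsive RFI burst.
    t = np.linspace(0.0, 1.0, 4096)
    rfi = np.zeros_like(t)
    rfi[2000:2050] = 5.0
    recovered = suppress_rfi(np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size) + rfi)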

    Flexible Integration and Efficient Analysis of Multidimensional Datasets from the Web

    If numeric data from the Web are brought together, natural scientists can compare climate measurements with estimations, financial analysts can evaluate companies based on balance sheets and daily stock market values, and citizens can explore the GDP per capita from several data sources. However, the heterogeneity and size of the data remain a problem. This work presents methods to query a uniform view - the Global Cube - of available datasets from the Web, building on Linked Data query approaches.
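    As a rough illustration of the kind of Linked Data querying such a uniform view builds on, the sketch below queries a hypothetical SPARQL endpoint that publishes observations with the RDF Data Cube (qb:) vocabulary. The endpoint URL and the GDP-per-capita measure URI are placeholders, not identifiers from the thesis.

    from SPARQLWrapper import SPARQLWrapper, JSON

    # Hypothetical endpoint and measure URI, used only to show the query shape.
    endpoint = SPARQLWrapper("http://example.org/sparql")
    endpoint.setReturnFormat(JSON)
    endpoint.setQuery("""
        PREFIX qb:       <http://purl.org/linked-data/cube#>
        PREFIX sdmx-dim: <http://purl.org/linked-data/sdmx/2009/dimension#>

        SELECT ?area ?year ?gdpPerCapita WHERE {
          ?obs a qb:Observation ;
               sdmx-dim:refArea   ?area ;
               sdmx-dim:refPeriod ?year ;
               <http://example.org/measure/gdpPerCapita> ?gdpPerCapita .
        }
        ORDER BY ?area ?year
        LIMIT 50
    """)

    # Each binding is one observation from the (virtual) cube.
    for row in endpoint.query().convert()["results"]["bindings"]:
        print(row["area"]["value"], row["year"]["value"], row["gdpPerCapita"]["value"])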

    Self-calibration and improving image fidelity for ALMA and other radio interferometers

    This manual is intended to help ALMA and other interferometer users improve images by recognising limitations, understanding how to overcome them, and deciding when and how to use self-calibration. The images provided by the ALMA Science Archive are calibrated using standard observing and data processing routines, including a quality assurance process to make sure that the observations meet the proposer's science requirements. This may not represent the full potential of the data, since any interferometry observation can be imaged with a range of resolutions and surface brightness sensitivities. The separation between the phase calibration source and the target usually limits the target dynamic range to a few hundred (or 50--100 in challenging conditions), but if the noise in the target field has not reached the thermal limit, improvements may be possible using self-calibration. This often requires judgements based on the target properties and is not yet automated for all situations. This manual provides background on the instrumental and atmospheric causes of visibility phase and amplitude errors, their effects on imaging, and how to improve the signal-to-noise ratio and image fidelity by self-calibration. We introduce the conditions for self-calibration to be useful and how to estimate calibration parameter values for a range of observing modes (continuum, spectral line, etc.). We also summarise more general error recognition and other techniques to tackle imaging problems. The examples are drawn from ALMA interferometric data processed using CASA, but the principles are generally applicable to most similar cm to sub-mm imaging. Comment: 76 pages, 55 figures, ALMA Memo series.
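    For orientation, the following is a minimal sketch of a single phase-only self-calibration cycle using standard CASA tasks (tclean, gaincal, applycal), of the kind the manual walks through. The measurement set name, image parameters, solution interval and reference antenna are placeholders; in practice each choice requires the judgement the manual describes, and the cycle is repeated only while the signal-to-noise ratio and image fidelity keep improving.

    from casatasks import tclean, gaincal, applycal

    vis = 'target.ms'   # placeholder measurement set

    # 1. Image the target and save the model so it can be used for calibration.
    tclean(vis=vis, imagename='target_selfcal0', imsize=1024, cell='0.05arcsec',
           niter=1000, savemodel='modelcolumn')

    # 2. Solve for antenna-based phase corrections against that model.
    gaincal(vis=vis, caltable='target.pcal1', solint='60s', refant='DA45',
            calmode='p', gaintype='G')

    # 3. Apply the phase solutions back to the data.
    applycal(vis=vis, gaintable=['target.pcal1'], interp='linear')

    # 4. Re-image and compare noise and dynamic range with the previous image
    #    before attempting shorter solution intervals or amplitude self-cal.
    tclean(vis=vis, imagename='target_selfcal1', imsize=1024, cell='0.05arcsec',
           niter=2000, savemodel='modelcolumn')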

    Radio galaxies at low frequencies: high spatial and spectral resolution studies with LOFAR

    This thesis uses novel observations from the Low Frequency Array to address open questions on the topic of galaxy evolution. The highest-resolution images available at ultra-low radio frequencies are used to investigate the physical processes present in the radio emission from distant galaxies. Detections of spectral features from carbon atoms in a nearby galaxy are also presented and used to constrain the temperature and density of the cold gas that is a key component of all galaxies.

    A pilot wide-field VLBI survey of the GOODS-North field

    Very Long Baseline Interferometry (VLBI) has significant advantages in disentangling active galactic nuclei (AGN) from star formation, particularly at intermediate to high redshift, due to its high angular resolution and insensitivity to dust. Surveys using VLBI arrays are only just becoming practical over wide areas, thanks to numerous developments and innovations (such as multi-phase-centre techniques) in observation and data analysis. However, fully automated pipelines for VLBI data analysis are based on old software packages and are unable to incorporate new calibration and imaging algorithms. In this work, the researcher developed a pipeline for VLBI data analysis which integrates a recent wide-field imaging algorithm, RFI excision, and a purpose-built source-finding algorithm specifically developed for the 64k x 64k pixel wide-field VLBI images. The researcher used this novel pipeline to process 6% (~9 arcmin^2 of the total 160 arcmin^2) of the data from the CANDELS GOODS-North extragalactic field at 1.6 GHz. The milli-arcsecond-scale images have an average rms of ~10 uJy/beam. Forty-four (44) candidate sources were detected, most of which are at sub-mJy flux densities, having brightness temperatures and luminosities of >5x10^5 K and >6x10^21 W Hz^-1 respectively. This work demonstrates that automated post-processing pipelines for wide-field, uniform-sensitivity VLBI surveys are feasible and indeed made more efficient with new software, wide-field imaging algorithms and purpose-built source-finders. This broadens the discovery space for future wide-field surveys with upcoming arrays such as the African VLBI Network (AVN), MeerKAT and the Square Kilometre Array (SKA).
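    As a pointer to how brightness temperatures such as the quoted >5x10^5 K follow from a measured flux density and component size, the sketch below evaluates the Rayleigh-Jeans relation T_b = S c^2 / (2 k nu^2 Omega) for an elliptical Gaussian component. The numbers in the example are illustrative only and are not taken from the survey catalogue.

    import numpy as np

    def brightness_temperature(flux_jy, freq_ghz, bmaj_mas, bmin_mas):
        """Rayleigh-Jeans brightness temperature of an elliptical Gaussian component."""
        c = 2.99792458e8        # speed of light [m/s]
        k = 1.380649e-23        # Boltzmann constant [J/K]
        mas = np.pi / (180.0 * 3600.0 * 1000.0)   # one milliarcsecond in radians

        S = flux_jy * 1e-26     # Jy -> W m^-2 Hz^-1
        nu = freq_ghz * 1e9     # GHz -> Hz
        # Solid angle of an elliptical Gaussian of the given FWHM axes.
        omega = np.pi * (bmaj_mas * mas) * (bmin_mas * mas) / (4.0 * np.log(2.0))
        return S * c**2 / (2.0 * k * nu**2 * omega)

    # Illustrative numbers: a 100 uJy component confined within ~5 mas at 1.6 GHz
    # gives T_b of order 10^6 K, comfortably above the ~10^5 K usually associated
    # with star formation alone.
    print(f"T_b ~ {brightness_temperature(100e-6, 1.6, 5.0, 5.0):.2e} K")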

    Methodology for implementing information management models within enterprise resource planning systems. Application to small and medium-sized enterprises

    The Next Generation of Manufacturing Systems (SGSF, from its Spanish acronym) seeks to meet the requirements of new business models, in contexts of intelligence, agility and adaptability within a global and virtual environment. Enterprise Resource Planning (ERP) with support for product data management (PDM) and product lifecycle management (PLM) provides business management solutions based on a coherent use of information technologies for implementation in CIM (Computer-Integrated Manufacturing) systems, with a high degree of adaptability to the desired organisational structure. In general, such implementations have long been under way in large companies, with far less (almost no) uptake among SMEs. This doctoral thesis defines and develops a new implementation methodology for the automatic generation of information in the business processes of companies whose requirements are adapted to the needs of the SGSF, within enterprise resource planning (ERP) systems, taking the influence of the human factor into account. The validity of the theoretical model underlying this methodology has been verified by implementing it in an SME in the engineering sector. To establish the state of the art of the topic, a specific methodology based on the Shewhart/Deming continuous improvement cycle was designed and applied, using the bibliographic search and analysis tools available online with access to the relevant databases.