    A 6-12 GHz Analogue Lag-Correlator for Radio Interferometry

    Aims: We describe a 6-12 GHz analogue correlator that has been developed for use in radio interferometers. Methods: We use a lag-correlator technique to synthesise eight complex spectral channels. Two schemes were considered for sampling the cross-correlation function, using either real or complex correlations, and we developed prototypes for both. We opted for the "add and square" detection scheme using Schottky diodes over the more commonly used active multipliers because the stability of the device is less critical. Results: We encountered an unexpected problem: the lag spacings were in error by up to ten percent of the unit spacing. To overcome this, we developed a calibration method using astronomical sources which corrects for the effects of the non-uniform sampling as well as gain error and dispersion in the correlator.
    Comment: 14 pages, 21 figures, accepted for publication in A&A
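    The core of such a design is turning correlation measurements at a set of time lags back into spectral channels. The sketch below (Python, with all array sizes and the ten-percent spacing errors as assumed illustrative values, not figures taken from the hardware) shows how a least-squares inversion of the actual, non-uniform lag positions can recover complex channel amplitudes; the paper's own calibration uses astronomical sources rather than this idealised known-geometry setup.

        import numpy as np

        # Hypothetical parameters: 16 real lags spanning the band,
        # synthesising 8 complex spectral channels (values assumed).
        n_lags, n_chan = 16, 8
        bandwidth = 6e9                      # 6 GHz of RF bandwidth
        unit = 1.0 / (2 * bandwidth)         # Nyquist lag spacing

        # Nominal lag positions, perturbed by up to ~10% of the unit
        # spacing to mimic the errors described in the abstract.
        rng = np.random.default_rng(0)
        nominal = (np.arange(n_lags) - n_lags / 2 + 0.5) * unit
        actual = nominal + rng.uniform(-0.1, 0.1, n_lags) * unit

        # Channel centre frequencies across the band (baseband form).
        freqs = (np.arange(n_chan) + 0.5) * bandwidth / n_chan

        # Design matrix: each measured lag samples the band-limited
        # cross-correlation, a cos/sin sum over spectral channels.
        A = np.hstack([np.cos(2 * np.pi * np.outer(actual, freqs)),
                       np.sin(2 * np.pi * np.outer(actual, freqs))])

        def lags_to_spectrum(lag_data):
            """Least-squares inversion of the (non-uniform) lag sampling,
            returning a complex visibility for each spectral channel."""
            coef, *_ = np.linalg.lstsq(A, lag_data, rcond=None)
            return coef[:n_chan] + 1j * coef[n_chan:]

    Because the design matrix is built from the measured lag positions, the same inversion absorbs the non-uniform spacing that a plain FFT over the nominal lags would misinterpret.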

    Adaptation Strategies for Automated Machine Learning on Evolving Data

    Automated Machine Learning (AutoML) systems have been shown to efficiently build good models for new datasets. However, it is often not clear how well they can adapt when the data evolves over time. The main goal of this study is to understand the effect of data stream challenges such as concept drift on the performance of AutoML methods, and which adaptation strategies can be employed to make them more robust. To that end, we propose six concept drift adaptation strategies and evaluate their effectiveness on a variety of AutoML approaches for building machine learning pipelines, including those that leverage Bayesian optimization, genetic programming, and random search with automated stacking. These are evaluated empirically on real-world and synthetic data streams with different types of concept drift. Based on this analysis, we propose ways to develop more sophisticated and robust AutoML techniques.
    Comment: 12 pages, 7 figures (14 counting subfigures), submitted to TPAMI - AutoML Special Issue
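    One of the simplest adaptation strategies in this family is "detect and retrain": evaluate prequentially (test, then train) and trigger a fresh model search when the recent error rate degrades. A minimal sketch, assuming a crude drift test that compares recent and long-run error rates; the window and factor are illustrative values, and `search` merely stands in for an AutoML run rather than the API of any specific system:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def drift_detected(errors, window=200, factor=1.5):
            """Crude drift signal: the recent error rate exceeds the
            long-run error rate by a fixed factor (assumed values)."""
            if len(errors) < 2 * window:
                return False
            recent = np.mean(errors[-window:])
            return recent > factor * max(np.mean(errors[:-window]), 1e-6)

        def prequential_detect_and_retrain(stream, search):
            """Test-then-train over an iterator of (X, y) batches;
            `search` stands in for a full AutoML pipeline search."""
            X0, y0 = next(stream)
            model, errors = search(X0, y0), []
            for X, y in stream:
                errors.extend(model.predict(X) != y)   # evaluate first
                if drift_detected(np.asarray(errors, dtype=float)):
                    model, errors = search(X, y), []   # restart on new concept
            return model

        # Usage, with a plain sklearn model standing in for an AutoML search:
        # model = prequential_detect_and_retrain(
        #     batches, lambda X, y: RandomForestClassifier().fit(X, y))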

    A new measure of resilience: An application to the London Underground

    The many varied views on resilience indicate that it is an important concept with significance in many disciplines, from ecology to psychology to risk/disaster management. It is therefore important to be able to quantifiably measure the resilience of systems, and thus to make decisions on how the resilience of a system can be improved. In this paper we work with the definition, due to Pimm (1991), that resilience is "how fast a variable that has been displaced from equilibrium returns to it." We think of a system as being more or less resilient depending on the speed with which it recovers from disruptive events or shocks. Here we consider systems which revert to an equilibrium state after shocks, and introduce a measure of resilience by quantifying the rapidity of these systems' recovery from shocks. We use a mean-reverting stochastic model to study the diffusive effects of shocks and we apply this model to the case of the London Underground. As a shock diffuses through the network, passenger flow recovers from the shock. The speed with which passenger counts return to normal is an indicator of how quickly the line is able to recover from the shock and thereafter resume normal operations.
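    A standard mean-reverting model of this kind is the Ornstein-Uhlenbeck process, whose rate parameter can serve directly as a resilience measure: the larger it is, the faster the return to equilibrium. The sketch below assumes this specific model and uses synthetic data (the paper's Underground passenger counts are not reproduced here); it simulates a shock and recovers the rate from the observed path:

        import numpy as np

        def simulate_ou(theta, mu, sigma, x0, dt, n, seed=0):
            """Euler-Maruyama simulation of dX = theta*(mu - X)dt + sigma*dW:
            a mean-reverting variable recovering from a displacement."""
            rng = np.random.default_rng(seed)
            x = np.empty(n)
            x[0] = x0
            for i in range(1, n):
                x[i] = (x[i-1] + theta * (mu - x[i-1]) * dt
                        + sigma * np.sqrt(dt) * rng.standard_normal())
            return x

        def estimate_theta(x, dt):
            """Estimate the mean-reversion rate by regressing X_{t+dt}
            on X_t (AR(1) discretisation): slope = exp(-theta * dt)."""
            slope, _ = np.polyfit(x[:-1], x[1:], 1)
            return -np.log(slope) / dt

        # Hypothetical shock: counts start 40% below their equilibrium level.
        path = simulate_ou(theta=0.5, mu=1.0, sigma=0.05, x0=0.6, dt=0.1, n=500)
        print(f"recovered theta ~ {estimate_theta(path, 0.1):.2f}")  # larger = more resilient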

    Continual learning from stationary and non-stationary data

    Continual learning aims at developing models that are capable of working on constantly evolving problems over a long time horizon. In such environments, we can distinguish three essential aspects of training and maintaining machine learning models: incorporating new knowledge, retaining it, and reacting to changes. Each of them poses its own challenges, constituting a compound problem with multiple goals. Remembering previously incorporated concepts is the main property required of a model when dealing with stationary distributions. In non-stationary environments, models should be capable of selectively forgetting outdated decision boundaries and adapting to new concepts. Finally, a significant difficulty lies in combining these two abilities within a single learning algorithm, since in such scenarios we have to balance remembering and forgetting instead of focusing on only one aspect. This dissertation addresses these problems in an exploratory way. Its main goal was to grasp the continual learning paradigm as a whole, analyze its different branches, and tackle identified issues covering various aspects of learning from sequentially incoming data. In doing so, this work not only fills several gaps in current continual learning research but also emphasizes the complexity and diversity of the challenges in this domain. Comprehensive experiments conducted for all of the presented contributions demonstrate their effectiveness and substantiate the validity of the stated claims.
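    A common device for the remembering side of this balance is experience replay with a bounded buffer: replaying stored examples counters forgetting, while the fixed capacity lets outdated concepts fade. A minimal sketch, assuming reservoir sampling and an illustrative capacity (not a method claimed by the dissertation):

        import random

        class ReservoirReplay:
            """Fixed-capacity replay buffer via reservoir sampling, so the
            buffer stays a uniform sample of everything seen so far."""

            def __init__(self, capacity=1000, seed=0):
                self.capacity, self.buffer = capacity, []
                self.seen = 0
                self.rng = random.Random(seed)

            def add(self, example):
                self.seen += 1
                if len(self.buffer) < self.capacity:
                    self.buffer.append(example)
                else:
                    j = self.rng.randrange(self.seen)
                    if j < self.capacity:
                        self.buffer[j] = example   # keep sample uniform over all seen items

            def sample(self, k):
                """Draw up to k stored examples to mix into the current batch."""
                return self.rng.sample(self.buffer, min(k, len(self.buffer)))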

    Perspectives and recent advances in super-resolution spectroscopy: Stochastic and disordered-based approaches

    Spectroscopic applications are characterized by the constant effort to combine high spectral resolution with large bandwidth. A trade-off typically exists between these two aspects, but the recent development of super-resolved spectroscopy techniques is bringing new opportunities into this field. This is particularly relevant for all applications where compact and cost-effective instruments are needed, such as sensing, quality control, environmental monitoring, or biometric authentication, to name a few. These unconventional approaches exploit several strategies for spectral investigation, taking advantage of concepts such as sparse sampling, artificial intelligence, or post-processing reconstruction algorithms. In this Perspective, we discuss the main strengths and weaknesses of these methods, tracing promising future directions for their further development and widespread adoption.
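    Sparse sampling with post-processing reconstruction can be illustrated in a few lines: if the spectrum contains only a few narrow lines, far fewer disordered-filter measurements than wavelength bins suffice, and an L1-regularised fit recovers the lines. A minimal sketch (all dimensions, the noise level, and the regularisation weight are assumed illustrative values):

        import numpy as np
        from sklearn.linear_model import Lasso

        # Hypothetical setup: 40 random broadband filters measuring a
        # spectrum of 200 wavelength bins containing 3 narrow lines.
        rng = np.random.default_rng(1)
        n_bins, n_meas = 200, 40
        spectrum = np.zeros(n_bins)
        spectrum[rng.choice(n_bins, 3, replace=False)] = rng.uniform(0.5, 1.0, 3)

        # Each row is one disordered filter's transmission profile.
        filters = rng.uniform(0.0, 1.0, (n_meas, n_bins))
        measurements = filters @ spectrum + 0.001 * rng.standard_normal(n_meas)

        # L1-regularised least squares exploits sparsity to cope with the
        # n_meas < n_bins undersampling (one of the strategies named above).
        recon = Lasso(alpha=1e-3, positive=True, max_iter=10000).fit(
            filters, measurements).coef_
        print("recovered peaks at bins:", np.nonzero(recon > 0.1)[0])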