
    Non-linear Kalman filters for calibration in radio interferometry

    We present a new calibration scheme based on a non-linear version of the Kalman filter that aims at estimating the physical terms appearing in the Radio Interferometry Measurement Equation (RIME). We enrich the filter's structure with a tunable data representation model, together with an augmented measurement model for regularization. We show using simulations that it can properly estimate the physical effects appearing in the RIME. We find that this approach is particularly useful in the most extreme cases, such as when ionospheric and clock effects are simultaneously present. Combined with the ability to provide prior knowledge of the expected structure of the physical instrumental effects (expected physical state and dynamics), we obtain a fairly cheap algorithm that we believe to be robust, especially in the low signal-to-noise regime. The use of such filters and similar methods can potentially improve calibration in radio interferometry, provided the effects corrupting the visibilities are understood and analytically stable. Recursive algorithms are particularly well adapted to pre-calibration and sky model estimation in a streaming fashion. This may be useful for SKA-type instruments, which produce huge amounts of data that have to be calibrated before being averaged.
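    A single predict/update cycle of a non-linear (extended) Kalman filter of the kind described above can be sketched as follows. This is a minimal illustration only: the quasi-static state dynamics, the function names, and the scalar usage below are assumptions for the sketch, not the paper's actual filter.

```python
import numpy as np

def ekf_step(x, P, z, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter.

    x, P  : prior state estimate and covariance
    z     : new measurement vector
    h     : measurement function h(x)
    H_jac : function returning the Jacobian of h at x
    Q, R  : process and measurement noise covariances
    """
    # Predict: the state is assumed quasi-static here (identity dynamics)
    x_pred = x
    P_pred = P + Q

    # Update: linearize the measurement model around the prediction
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

    Feeding repeated measurements of a non-linear observable (e.g. z = x² with z = 4) drives the state estimate towards the consistent value x = 2, which is the recursive, streaming behaviour the abstract advocates.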

    Radio interferometric gain calibration as a complex optimization problem

    Recent developments in optimization theory have extended some traditional algorithms for least-squares optimization of real-valued functions (Gauss-Newton, Levenberg-Marquardt, etc.) into the domain of complex functions of a complex variable. This employs a formalism called the Wirtinger derivative, and derives a full-complex Jacobian counterpart to the conventional real Jacobian. We apply these developments to the problem of radio interferometric gain calibration, and show how the general complex Jacobian formalism, when combined with conventional optimization approaches, yields a whole new family of calibration algorithms, including those for the polarized and direction-dependent gain regimes. We further extend the Wirtinger calculus to an operator-based matrix calculus for describing the polarized calibration regime. Using approximate matrix inversion results in computationally efficient implementations; we show that some recently proposed calibration algorithms, such as StefCal and peeling, can be understood as special cases of this, and place them in the context of the general formalism. Finally, we present an implementation and some applied results of CohJones, another specialized direction-dependent calibration algorithm derived from the formalism. Comment: 18 pages; 6 figures; accepted by MNRAS
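    As an illustration of the alternating least-squares family that the abstract places StefCal in, here is a minimal sketch of a StefCal-style gain solver. The visibility convention V[p, q] ≈ g[p] · M[p, q] · conj(g[q]), the damping factor, and all names are assumptions for this sketch, not the paper's formalism.

```python
import numpy as np

def stefcal(V, M, n_iter=200, tol=1e-10):
    """Alternating least-squares estimate of per-antenna complex gains g
    such that V[p, q] ~ g[p] * M[p, q] * conj(g[q]).

    V : observed visibility matrix (n_ant x n_ant)
    M : model visibility matrix (n_ant x n_ant)
    """
    n = V.shape[0]
    g = np.ones(n, dtype=complex)
    for _ in range(n_iter):
        g_old = g.copy()
        g = np.empty_like(g_old)
        for p in range(n):
            # Row p of the model, scaled by the current conjugate gains
            z = M[p, :] * np.conj(g_old)
            # Closed-form 1-D least squares for g[p], all other gains fixed
            g[p] = np.vdot(z, V[p, :]) / np.vdot(z, z)
        # Average with the previous iterate to damp oscillations
        g = 0.5 * (g + g_old)
        if np.linalg.norm(g - g_old) < tol * np.linalg.norm(g):
            break
    return g
```

    Gains are only recoverable up to an overall phase, so a sensible check is whether the reconstructed visibilities diag(g) M diag(g)ᴴ match the observed ones, rather than comparing g element-wise.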

    Quantitative Analysis of Saliency Models

    Previous saliency detection research required the reader to evaluate performance qualitatively, based on renderings of saliency maps on a few shapes. This qualitative approach meant it was unclear which saliency models were better, or how well they compared to human perception. This paper provides a quantitative evaluation framework that addresses this issue. In the first quantitative analysis of 3D computational saliency models, we evaluate four computational saliency models and two baseline models against ground-truth saliency collected in previous work. Comment: 10 pages
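    The abstract does not name its evaluation metrics; one common way to score a predicted saliency map against ground truth quantitatively is the Pearson linear correlation coefficient (CC), sketched here as a hypothetical example of such a metric.

```python
import numpy as np

def saliency_cc(pred, gt):
    """Pearson linear correlation (CC) between two saliency maps:
    1 = perfect agreement, 0 = no linear relation, -1 = anti-correlated."""
    p = (pred - pred.mean()) / pred.std()
    g = (gt - gt.mean()) / gt.std()
    return float((p * g).mean())
```

    Because the maps are standardized first, the score is invariant to affine rescaling of either map, so models that predict saliency on different numeric scales can still be compared fairly.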

    Distributed texture-based terrain synthesis

    Terrain synthesis is an important field of computer graphics that deals with the generation of 3D landscape models for use in virtual environments. The field has evolved to a stage where large and even infinite landscapes can be generated in real time. However, user control of the generation process is still minimal, as is the ability to create virtual landscapes that mimic real terrain. This thesis investigates the use of texture synthesis techniques on real landscapes to improve realism, and the use of sketch-based interfaces to enable intuitive user control.

    Using baseline-dependent window functions for data compression and field-of-interest shaping in radio interferometry

    In radio interferometry, observed visibilities are intrinsically sampled at some interval in time and frequency. Modern interferometers are capable of producing data at very high time and frequency resolution; practical limits on storage and computation costs require that some form of data compression be imposed. The traditional form of compression is a simple averaging of the visibilities over coarser time and frequency bins. This has an undesired side effect: the resulting averaged visibilities "decorrelate", and do so differently depending on the baseline length and averaging interval. This translates into a non-trivial signature in the image domain known as "smearing", which manifests itself as an attenuation in amplitude towards off-centre sources. With the increasing fields of view and/or longer baselines employed in modern and future instruments, the trade-off between data rate and smearing becomes increasingly unfavourable. In this work we investigate alternative approaches to low-loss data compression. We show that averaging of the visibility data can be treated as a form of convolution by a boxcar-like window function, and that by employing alternative baseline-dependent window functions a more optimal interferometer smearing response may be induced. In particular, we show improved amplitude response over a chosen field of interest, and better attenuation of sources outside the field of interest. The main cost of this technique is a reduction in nominal sensitivity; we investigate the smearing vs. sensitivity trade-off, and show that in certain regimes a favourable compromise can be achieved. We show the application of this technique to simulated data from the Karl G. Jansky Very Large Array (VLA) and the European Very Long Baseline Interferometry Network (EVN).
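    The idea that averaging is convolution with a boxcar window, and that an alternative window can be swapped in, can be sketched as follows. The Gaussian taper and its width parameter are illustrative assumptions for this sketch, not the baseline-dependent windows derived in the paper.

```python
import numpy as np

def windowed_average(vis, w):
    """Compress a run of visibility samples into one output sample by
    weighted averaging; a boxcar w reproduces plain averaging."""
    w = np.asarray(w, dtype=float)
    return np.sum((w / w.sum()) * np.asarray(vis))

def boxcar(n):
    """The traditional averaging window."""
    return np.ones(n)

def gaussian_window(n, sigma_frac):
    """A tapered alternative; in a baseline-dependent scheme sigma_frac
    would be chosen per baseline (hypothetical choice of window)."""
    t = np.linspace(-1.0, 1.0, n)
    return np.exp(-0.5 * (t / sigma_frac) ** 2)
```

    A source far from the phase centre appears as a fast fringe across the averaging bin; the boxcar's sinc-like sidelobes let some of it leak through, whereas a tapered window suppresses it more strongly, at the cost of effectively averaging fewer samples (the sensitivity loss discussed above).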

    Model consent clauses for rare disease research

    Background: Rare disease research has seen tremendous advancements over the last decades, with the development of new technologies, various global collaborative efforts, and improved data sharing. To maximize the impact of and to further build on these developments, there is a need for model consent clauses for rare disease research, in order to improve data interoperability, to meet the informational needs of participants, and to ensure proper ethical and legal use of data sources and participants' overall protection. Methods: A global Task Force was set up to develop model consent clauses specific to rare disease research that are comprehensive, harmonized, readily accessible, and internationally applicable, facilitating the recruitment and consent of rare disease research participants around the world. Existing consent forms and notices of consent were analyzed and classified under different consent themes, which were used as background to develop the model consent clauses. Results: The IRDiRC-GA4GH MCC Task Force met in September 2018 to discuss and design model consent clauses. Based on the analyzed consent forms, they listed generic core elements and designed the following rare-disease-research-specific core elements: Rare Disease Research Introductory Clause, Familial Participation, Audio/Visual Imaging, Collecting, Storing and Sharing of Rare Disease Data, Recontact for Matching, Data Linkage, Return of Results to Family Members, Incapacity/Death, and Benefits. Conclusion: The model consent clauses presented in this article have been drafted to highlight consent elements that bear in mind the trends in rare disease research, while providing a tool to help foster harmonization and collaborative efforts.

    Shading Curves: Vector-Based Drawing With Explicit Gradient Control

    A challenge in vector graphics is to define primitives that offer flexible manipulation of colour gradients. We propose a new primitive, called a shading curve, that supports explicit and local gradient control. This is achieved by associating shading profiles to each side of the curve. These shading profiles, which can be manually manipulated, represent the colour gradient out from their associated curves. Such explicit and local gradient control is challenging to achieve via the diffusion curve process, introduced in 2008, because it offers only implicit control of the colour gradient. We resolve this problem by using subdivision surfaces that are constructed from shading curves and their shading profiles. This is the final version of the article. It first appeared from Wiley via http://dx.doi.org/10.1111/cgf.1253