400 research outputs found

    A Framework for Generalising the Newton Method and Other Iterative Methods from Euclidean Space to Manifolds

    The Newton iteration is a popular method for minimising a cost function on Euclidean space. Various generalisations to cost functions defined on manifolds appear in the literature. In each case, the convergence rate of the generalised Newton iteration had to be established from first principles. The present paper presents a framework for generalising iterative methods from Euclidean space to manifolds that ensures local convergence rates are preserved. It applies to any (memoryless) iterative method computing a coordinate-independent property of a function (such as a zero or a local minimum). All possible Newton methods on manifolds are believed to come under this framework. Changes of coordinates, and not any Riemannian structure, are shown to play a natural role in lifting the Newton method to a manifold. The framework also gives new insight into the design of Newton methods in general. Comment: 36 pages
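As a point of reference, the Euclidean Newton iteration that such frameworks generalise can be sketched in a few lines (a minimal illustration, not the paper's manifold framework; the example cost function is made up):

```python
import math
import numpy as np

def newton_min(grad, hess, x0, iters=50, tol=1e-12):
    # Classical Newton iteration on Euclidean space:
    #   x_{k+1} = x_k - H(x_k)^{-1} g(x_k).
    # Lifting it to a manifold amounts to applying this same update in a
    # local coordinate chart around the current iterate.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Example cost f(x) = exp(x) - 2x, minimised at x = ln 2
grad = lambda x: np.array([math.exp(x[0]) - 2.0])
hess = lambda x: np.array([[math.exp(x[0])]])
x_star = newton_min(grad, hess, [0.0])
```

Near the minimiser the step size shrinks quadratically, which is the local convergence rate the framework is designed to preserve on manifolds.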

    Prospects for near-infrared characterisation of hot Jupiters with VSI

    In this paper, we study the feasibility of obtaining near-infrared spectra of bright extrasolar planets with the 2nd-generation VLTI Spectro-Imager instrument (VSI), which has the angular resolution required to resolve nearby hot Extrasolar Giant Planets (EGPs) from their host stars. Taking into account fundamental noises, we simulate closure-phase measurements of several extrasolar systems using four 8-m telescopes at the VLT and a low spectral resolution (R = 100). Synthetic planetary spectra from T. Barman are used as input. Standard chi2-fitting methods are then used to reconstruct planetary spectra from the simulated data. These simulations show that low-resolution spectra in the H and K bands can be retrieved with good fidelity for half a dozen targets in a reasonable observing time (about 10 hours, spread over a few nights). Such observations would strongly constrain the planetary temperature and albedo, the energy redistribution mechanisms, and the chemical composition of their atmospheres. Systematic errors, not included in our simulations, could seriously limit these performance estimates. The use of integrated optics is, however, expected to provide the required instrumental stability (around 10^-4 on the closure phase) to enable the first thorough characterisation of extrasolar planetary emission spectra in the near-infrared. Comment: 10 pages, 8 figures, Proc. SPIE conference 7013 "Optical and Infrared Interferometry" (Marseille 2008)
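The retrieval step can be caricatured as a chi2 fit of a template amplitude to noisy closure phases (a toy sketch: the linear closure-phase model, the geometric factor, the template spectrum, and the noise level are all invented, not the actual VSI simulation):

```python
import numpy as np

rng = np.random.default_rng(0)

# For a faint companion, the closure-phase signal is roughly linear in the
# planet/star contrast, so fitting a template amplitude is weighted least
# squares. Everything below is an assumed toy setup.
lam = np.linspace(1.5, 2.4, 30)                  # H+K wavelength grid [micron]
template = 1e-4 * (1.0 + 0.5 * np.sin(4 * lam))  # assumed contrast spectrum
geom = 2.0                                       # hypothetical geometric factor
sigma = 1e-4                                     # fundamental noise per channel

cp_obs = geom * template + rng.normal(0, sigma, lam.size)  # simulated data

# chi2(a) = sum((cp_obs - a * geom * template)^2) / sigma^2 is quadratic in a,
# so the best-fit amplitude has a closed form:
model = geom * template
a_hat = (cp_obs @ model) / (model @ model)
```

With per-channel noise comparable to the signal, the amplitude is still recovered to within a few tens of percent, which is the sense in which low-resolution spectra can be retrieved "with good fidelity".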

    Improving Interferometric Null Depth Measurements using Statistical Distributions: Theory and First Results with the Palomar Fiber Nuller

    A new "self-calibrated" statistical analysis method has been developed for the reduction of nulling interferometry data. The idea is to use the statistical distributions of the fluctuating null depth and beam intensities to retrieve the astrophysical null depth (or, equivalently, the object's visibility) in the presence of fast atmospheric fluctuations. The approach yields an accuracy much better (by about an order of magnitude) than is presently possible with standard data reduction methods, because the astrophysical null depth accuracy is no longer limited by the magnitude of the instrumental phase and intensity errors, but by uncertainties on their probability distributions. This approach was tested on the sky with the two-aperture fiber nulling instrument mounted on the Palomar Hale telescope. Using our new data analysis approach alone, without any observations of calibrators, we find that error bars on the astrophysical null depth as low as a few 10^-4 can be obtained in the near-infrared, which means that null depths lower than 10^-3 can be reliably measured. This statistical analysis is not specific to our instrument and may be applicable to other interferometers.
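The idea can be caricatured in a few lines: simulate the distribution of instantaneous null depths under random phase errors, then recover the astrophysical null by matching the whole histogram rather than averaging (a toy model; the quadratic leakage law and all parameter values are invented, not the instrument's actual error budget):

```python
import numpy as np

def null_samples(n_astro, sigma_phi, n=50_000, seed=1):
    # Toy model: instantaneous null depth = astrophysical null + phase leakage,
    # with leakage ~ (dphi/2)^2 for small random phase errors dphi.
    # A fixed seed gives common random numbers across candidate parameters.
    dphi = np.random.default_rng(seed).normal(0.0, sigma_phi, n)
    return n_astro + (dphi / 2.0) ** 2

# "Observed" sequence: unknown astrophysical null 1.5e-3, phase jitter 0.1 rad
obs = null_samples(1.5e-3, 0.1, seed=0)
edges = np.linspace(0.0, np.quantile(obs, 0.99), 60)
h_obs = np.histogram(obs, bins=edges, density=True)[0]

# Retrieve both parameters by matching the whole distribution, not its mean:
candidates = [(na, s)
              for na in np.linspace(0.0, 3e-3, 31)
              for s in np.linspace(0.05, 0.15, 11)]
score = lambda p: np.sum(
    (np.histogram(null_samples(*p), bins=edges, density=True)[0] - h_obs) ** 2)
best = min(candidates, key=score)
```

Because the left edge of the distribution sits exactly at the astrophysical null, the histogram fit separates it from the fluctuating instrumental leakage, which is the self-calibration effect described above.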

    Evidence for a circumplanetary disk around protoplanet PDS 70 b

    We present the first observational evidence for a circumplanetary disk around the protoplanet PDS 70 b, based on a new spectrum in the K band acquired with VLT/SINFONI. We tested three hypotheses to explain the spectrum: atmospheric emission from the planet with either (1) a single value of extinction or (2) variable extinction, and (3) a combined atmospheric and circumplanetary disk model. Goodness-of-fit indicators favour the third option, suggesting circumplanetary material contributing excess thermal emission, most prominent at λ ≳ 2.3 μm. Inferred accretion rates (~10^-7.8 to 10^-7.3 M_J yr^-1) are compatible with observational constraints based on the Hα and Brγ lines. For the planet, we derive an effective temperature of 1500-1600 K, surface gravity log(g) ~ 4.0, radius ~1.6 R_J, mass ~10 M_J, and possible thick clouds. Models with variable extinction lead to slightly worse fits. However, the amplitude (ΔA_V ≳ 3 mag) and timescale of variation (≲ years) required for the extinction would also suggest circumplanetary material. Comment: 8 pages, 2 figures, 1 table. This is a pre-copyedited, author-produced PDF of an article accepted for publication in ApJL on 2019 May 1

    Low-rank optimization for semidefinite convex problems

    We propose an algorithm for solving nonlinear convex programs defined in terms of a symmetric positive semidefinite matrix variable X. This algorithm rests on the factorization X = YY^T, where the number of columns of Y fixes the rank of X. It is thus very effective for solving programs that have a low-rank solution. The factorization X = YY^T leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method. It furthermore provides some conditions on the rank of the factorization that ensure equivalence with the original problem. The efficiency of the proposed algorithm is illustrated on two applications: the maximal cut of a graph and the sparse principal component analysis problem. Comment: submitted
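A first-order sketch of the low-rank idea on the max-cut relaxation (the paper derives a second-order method on the quotient manifold; this gradient-ascent toy, with an arbitrary random graph and step size, only illustrates the factorisation X = YY^T under the unit-diagonal constraint):

```python
import numpy as np

rng = np.random.default_rng(0)

# Max-cut SDP:  max tr(L X)/4  s.t.  X PSD, diag(X) = 1.
# Factorising X = Y Y^T with unit-norm rows of Y enforces both constraints
# by construction; gradient ascent then runs on the small variable Y.
A = rng.integers(0, 2, (12, 12))
A = np.triu(A, 1); A = A + A.T                 # random graph adjacency
L = np.diag(A.sum(axis=1)) - A                 # graph Laplacian

p = 3                                          # chosen rank of the factorisation
Y = rng.normal(size=(12, p))
Y /= np.linalg.norm(Y, axis=1, keepdims=True)

for _ in range(500):
    G = L @ Y                                  # gradient of tr(L Y Y^T), up to a factor 2
    G -= np.sum(G * Y, axis=1, keepdims=True) * Y   # project onto tangent space
    Y += 0.01 * G                              # ascent step
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)   # retract to unit-norm rows

X = Y @ Y.T
cut_bound = np.trace(L @ X) / 4.0              # relaxation objective at feasible X
```

The variable Y has 12 × 3 entries instead of the 12 × 12 entries of X, which is the computational advantage when the solution has low rank.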

    Integrated scheme of rapid environmental assessment for shallow water acoustics

    Predicting sound propagation in shallow or very shallow water environments requires that the frequency-dependent acoustic properties be assessed for all components of the waveguide, i.e., the water column, the sea bottom, and the sea-surface interface. During the Maritime Rapid Environmental Assessment (MREA/BP'07) sea trial in April-May 2007, south of Elba Island in the Mediterranean Sea, an integrated MREA scheme was implemented to provide a full 4D (3D+T) environmental picture that is directly exploitable by acoustic propagation models. Based on a joint multi-disciplinary effort, several standard and advanced techniques of environmental characterization covering the fields of underwater acoustics, physical oceanography and geophysics have been combined within a coherent scheme of data acquisition, processing and assimilation. The paper presents the overall architecture of the implemented scheme. Based on a preliminary analysis of the MREA/BP'07 data, advantages and drawbacks of the approach are discussed. Finally, ways ahead for further improvement and perspectives are outlined.

    A geometric Newton method for Oja's vector field

    Newton's method for solving the matrix equation F(X) ≡ AX - XX^T AX = 0 runs up against the fact that its zeros are not isolated. This is due to a symmetry of F under the action of the orthogonal group. We show how differential-geometric techniques can be exploited to remove this symmetry and obtain a "geometric" Newton algorithm that finds the zeros of F. The geometric Newton method does not suffer from the degeneracy issue that stands in the way of the original Newton method.
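The symmetry that defeats the plain Newton method can be checked numerically: the field F is equivariant under right multiplication by orthogonal matrices, so its zeros come in continuous orbits and are never isolated (a short sketch; the matrix sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2
M = rng.normal(size=(n, n))
A = (M + M.T) / 2.0                            # symmetric matrix

def F(X):
    # Oja's vector field: F(X) = A X - X X^T A X
    return A @ X - X @ X.T @ A @ X

X = rng.normal(size=(n, p))
Q, _ = np.linalg.qr(rng.normal(size=(p, p)))   # random orthogonal p x p matrix

# Equivariance F(X Q) = F(X) Q follows from Q Q^T = I:
lhs, rhs = F(X @ Q), F(X) @ Q

# A genuine zero: p orthonormal eigenvectors of A (then A X0 = X0 D and
# X0^T X0 = I, so the two terms of F cancel). Every X0 Q is then also a zero.
V = np.linalg.eigh(A)[1]
X0 = V[:, :p]
```

Because the whole orbit {X0 Q} consists of zeros, the Jacobian of F is singular along the orbit directions, which is exactly the degeneracy the geometric Newton method removes.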

    Direct exoplanet detection and characterization using the ANDROMEDA method: Performance on VLT/NaCo data

    Context. The direct detection of exoplanets with high-contrast imaging requires advanced data processing methods to disentangle potential planetary signals from bright quasi-static speckles. Among them, angular differential imaging (ADI) permits potential planetary signals with a known rotation rate to be separated from instrumental speckles that are either static or slowly variable. The method presented in this paper, called ANDROMEDA (ANgular Differential OptiMal Exoplanet Detection Algorithm), is based on a maximum-likelihood approach to ADI and is used to estimate the position and flux of any point source present in the field of view. Aims. In order to optimize and experimentally validate this previously proposed method, we applied ANDROMEDA to real VLT/NaCo data. In addition to its pure detection capability, we investigated the possibility of defining simple and efficient criteria for automatic point-source extraction, able to support the processing of large surveys. Methods. To assess the performance of the method, we applied ANDROMEDA to VLT/NaCo data of TYC-8979-1683-1, which is surrounded by numerous bright stars, and to which we added synthetic planets of known position and flux in the field. In order to accommodate the real data properties, it was necessary to add pre-processing and post-processing steps to the initially proposed algorithm. We then investigated its performance in the challenging case of a well-known target, β Pictoris, whose companion is close to the detection limit, and we compared our results to those obtained with another method based on principal component analysis (PCA). Results. Application to VLT/NaCo data demonstrates the ability of ANDROMEDA to automatically detect and characterize point sources present in the image field. The result is a robust method delivering consistent results with a sensitivity similar to recently published algorithms, with only two parameters to be fine-tuned. Moreover, the companion flux estimates are not biased by the algorithm parameters and do not require a posteriori corrections. Conclusions. ANDROMEDA is an attractive alternative to current standard image processing methods that can be readily applied to on-sky data.
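The core ADI maximum-likelihood idea can be reduced to a one-dimensional cartoon: speckles stay fixed between two frames while the planet moves with the field rotation, so the frame difference cancels the speckles and leaves a known positive/negative planet signature whose amplitude is the maximum-likelihood flux (a heavily simplified sketch, not the ANDROMEDA implementation; the PSF, positions, and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.arange(200.0)
psf = lambda c: np.exp(-0.5 * ((x - c) / 2.0) ** 2)      # Gaussian PSF at centre c
speckles = 5.0 * np.sin(x / 7.0) ** 2                    # static speckle pattern
a_true = 1.3                                             # planet flux

img1 = speckles + a_true * psf(80) + rng.normal(0, 0.1, x.size)  # planet at 80
img2 = speckles + a_true * psf(95) + rng.normal(0, 0.1, x.size)  # rotated to 95

diff = img1 - img2                      # speckles cancel exactly in this toy
pattern = psf(80) - psf(95)             # expected planet signature in `diff`
a_ml = (diff @ pattern) / (pattern @ pattern)   # ML flux under white Gaussian noise
```

Because the flux estimate is a closed-form linear projection onto the expected signature, it is unbiased regardless of tuning parameters, which mirrors the unbiased flux estimates reported above.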

    Auto-RSM: An automated parameter-selection algorithm for the RSM map exoplanet detection algorithm

    Context. Most of the high-contrast imaging (HCI) data-processing techniques used over the last 15 years have relied on the angular differential imaging (ADI) observing strategy, along with subtraction of a reference point spread function (PSF) to generate exoplanet detection maps. Recently, a new algorithm called the regime-switching model (RSM) map has been proposed to take advantage of these numerous PSF-subtraction techniques; RSM uses several of them to generate a single probability map. Selection of the optimal parameters for these PSF-subtraction techniques, as well as for the RSM map, is not straightforward, is time consuming, and can be biased by assumptions made as to the underlying data set. Aims: We propose a novel optimisation procedure that can be applied to each of the PSF-subtraction techniques alone, or to the entire RSM framework. Methods: The optimisation procedure consists of three main steps: (i) definition of the optimal set of parameters for the PSF-subtraction techniques, using the contrast as the performance metric; (ii) optimisation of the RSM algorithm; and (iii) selection of the optimal set of PSF-subtraction techniques and ADI sequences used to generate the final RSM probability map. Results: The optimisation procedure is applied to the data sets of the exoplanet imaging data challenge, which provides tools to compare the performance of HCI data-processing techniques. The data sets consist of ADI sequences obtained with three state-of-the-art HCI instruments: SPHERE, NIRC2, and LMIRCam. The results of our analysis demonstrate the benefits of the proposed optimisation procedure, with better performance metrics than the earlier version of RSM, as well as other HCI data-processing techniques.
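Step (i) above amounts to a search over each technique's parameter space scored by the achieved contrast; a plain grid search conveys the shape of it (a minimal sketch: the parameter names and the scoring function `reduce_and_measure_contrast` are hypothetical stand-ins, not the Auto-RSM implementation):

```python
import itertools

# Score each parameter combination of a PSF-subtraction technique by the
# contrast it achieves and keep the best.
grid = {"ncomp": [5, 10, 20], "nsegments": [1, 4], "delta_rot": [0.5, 1.0]}

def reduce_and_measure_contrast(ncomp, nsegments, delta_rot):
    # Toy scoring function (higher is better); a real one would run the
    # PSF subtraction and compute a contrast curve.
    return -abs(ncomp - 10) - 2.0 * abs(delta_rot - 1.0) + nsegments

best = max(
    (dict(zip(grid, vals)) for vals in itertools.product(*grid.values())),
    key=lambda params: reduce_and_measure_contrast(**params),
)
```

In practice the grid grows combinatorially with the number of techniques and parameters, which is why an automated, metric-driven selection procedure is worthwhile.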