
    Hidden Markov model tracking of continuous gravitational waves from a neutron star with wandering spin

    Gravitational-wave searches for continuous-wave signals from neutron stars are especially challenging when the star's spin frequency is unknown a priori from electromagnetic observations and wanders stochastically under the action of internal (e.g. superfluid or magnetospheric) or external (e.g. accretion) torques. It is shown that frequency tracking by hidden Markov model (HMM) methods can be combined with existing maximum-likelihood coherent matched filters, such as the F-statistic, to surmount some of the challenges raised by spin wandering. Specifically, it is found that, for an isolated, biaxial rotor whose spin frequency walks randomly, HMM tracking of the F-statistic output from coherent segments of duration T_drift = 10 d over a total observation time of T_obs = 1 yr can detect signals with wave strains h0 > 2e-26 at a noise level characteristic of the Advanced Laser Interferometer Gravitational Wave Observatory (Advanced LIGO). For a biaxial rotor with a randomly walking spin in a binary orbit, whose orbital period and semi-major axis are known approximately from electromagnetic observations, HMM tracking of the Bessel-weighted F-statistic output can detect signals with h0 > 8e-26. An efficient, recursive HMM solver based on the Viterbi algorithm is demonstrated, which requires ~10^3 CPU-hours for a typical broadband (0.5-kHz) search for the low-mass X-ray binary Scorpius X-1, including generation of the relevant F-statistic input. In a "realistic" observational scenario, Viterbi tracking successfully detects 41 of the 50 synthetic signals without spin wandering in Stage I of the Scorpius X-1 Mock Data Challenge convened by the LIGO Scientific Collaboration, down to a wave strain of h0 = 1.1e-25, recovering the frequency with a root-mean-square accuracy of <= 4.3e-3 Hz.
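    As a minimal illustration of the tracking idea (with made-up bin counts and an invented signal strength, not the paper's actual F-statistic pipeline), a Viterbi recursion over frequency bins with a one-bin-per-segment wandering constraint can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_segments = 100, 50

# Synthetic spin wandering: the frequency bin performs a random walk.
true_path = np.empty(n_segments, dtype=int)
true_path[0] = 50
for t in range(1, n_segments):
    true_path[t] = np.clip(true_path[t - 1] + rng.integers(-1, 2), 0, n_bins - 1)

# Emission log-likelihoods per (segment, bin): Gaussian noise plus a
# clear excess in the true bin, standing in for the F-statistic output.
log_like = rng.normal(0.0, 1.0, size=(n_segments, n_bins))
log_like[np.arange(n_segments), true_path] += 5.0

# Viterbi recursion: the spin may move by at most one bin per segment.
offsets = (-1, 0, 1)
score = log_like[0].copy()
back = np.zeros((n_segments, n_bins), dtype=int)
for t in range(1, n_segments):
    prev = np.stack([np.roll(score, -d) for d in offsets])
    prev[0, 0] = prev[2, -1] = -np.inf     # mask wrapped-around entries
    choice = np.argmax(prev, axis=0)
    back[t] = np.take(offsets, choice)
    score = prev[choice, np.arange(n_bins)] + log_like[t]

# Backtrack the most probable frequency path.
path = np.empty(n_segments, dtype=int)
path[-1] = int(np.argmax(score))
for t in range(n_segments - 1, 0, -1):
    path[t - 1] = path[t] + back[t, path[t]]
```

    Because the transition prior confines competing paths to the true path's neighbourhood, the backtracked `path` recovers the injected random walk almost everywhere.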

    Kalman-filter control schemes for fringe tracking. Development and application to VLTI/GRAVITY

    The implementation of fringe tracking for optical interferometers is inevitable when optimal exploitation of the instrumental capacities is desired. Fringe tracking allows continuous fringe observation, considerably increasing the sensitivity of the interferometric system. In addition to correcting atmospheric path-length differences, a decent control algorithm should correct for disturbances introduced by instrumental vibrations, and deal with other errors propagating in the optical trains. We attempt to construct control schemes based on Kalman filters. Kalman filtering is an optimal data-processing algorithm for tracking and correcting a system on which observations are performed. As a direct application, control schemes are designed for GRAVITY, a future four-telescope near-infrared beam combiner for the Very Large Telescope Interferometer (VLTI). We base our study on recent work in adaptive-optics control. The technique is to describe perturbations of fringe phases in terms of an a priori model. The model allows us to optimize the tracking of fringes, in that it is adapted to the prevailing perturbations. Since the model is of a parametric nature, a parameter identification needs to be included. Different possibilities exist to generalize to the four-telescope fringe tracking that is useful for GRAVITY. On the basis of a two-telescope Kalman-filtering control algorithm, a set of two properly working control algorithms for four-telescope fringe tracking is constructed. The control schemes are designed to take into account flux problems and low-signal baselines. First simulations of the fringe-tracking process indicate that the defined schemes meet the requirements for GRAVITY and allow us to distinguish between their performances. In a future paper, we will compare the performance of classical fringe tracking to our Kalman-filter control. Comment: 17 pages, 8 figures, accepted for publication in A&A
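    A scalar sketch of the underlying mechanism, assuming an illustrative AR(1) model for the phase perturbation (the numbers are invented, not GRAVITY values, and the real controller is multi-baseline):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Assumed AR(1) model for the piston-phase perturbation (atmosphere
# plus vibrations); all parameters below are illustrative.
a, q = 0.995, 0.01      # AR coefficient, process-noise variance
r = 0.25                # measurement-noise variance (photon noise)

# Simulate the true phase and its noisy phase-delay measurements.
phase = np.zeros(n)
for t in range(1, n):
    phase[t] = a * phase[t - 1] + rng.normal(0.0, np.sqrt(q))
meas = phase + rng.normal(0.0, np.sqrt(r), size=n)

# Scalar Kalman filter: predict with the AR model, correct with the data.
x, p = 0.0, 1.0
est = np.empty(n)
for t in range(n):
    x, p = a * x, a * a * p + q                  # predict
    k = p / (p + r)                              # Kalman gain
    x, p = x + k * (meas[t] - x), (1.0 - k) * p  # update
    est[t] = x

print(np.std(meas - phase), np.std(est - phase))
```

    Because the filter exploits the a priori temporal model, the residual phase error falls well below the raw measurement noise, which is the point of model-based fringe tracking.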

    Geometric Cross-Modal Comparison of Heterogeneous Sensor Data

    In this work, we address the problem of cross-modal comparison of aerial data streams. A variety of simulated automobile trajectories are sensed using two different modalities: full-motion video, and radio-frequency (RF) signals received by detectors at various locations. The information represented by the two modalities is compared using self-similarity matrices (SSMs) corresponding to time-ordered point clouds in feature spaces of each of these data sources; we note that these feature spaces can be of entirely different scale and dimensionality. Several metrics for comparing SSMs are explored, including a cutting-edge time-warping technique that can simultaneously handle local time warping and partial matches, while also controlling for the change in geometry between feature spaces of the two modalities. We note that this technique is quite general, and does not depend on the choice of modalities. In this particular setting, we demonstrate that the cross-modal distance between SSMs corresponding to the same trajectory type is smaller than the cross-modal distance between SSMs corresponding to distinct trajectory types, and we formalize this observation via precision-recall metrics in experiments. Finally, we comment on promising implications of these ideas for future integration into multiple-hypothesis tracking systems. Comment: 10 pages, 13 figures, Proceedings of IEEE Aeroconf 201

    Approximate Dynamic Programming via Sum of Squares Programming

    We describe an approximate dynamic programming method for stochastic control problems on infinite state and input spaces. The optimal value function is approximated by a linear combination of basis functions with coefficients as decision variables. By relaxing the Bellman equation to an inequality, one obtains a linear program in the basis coefficients with an infinite set of constraints. We show that a recently introduced method, which obtains convex quadratic value function approximations, can be extended to higher order polynomial approximations via sum of squares programming techniques. An approximate value function can then be computed offline by solving a semidefinite program, without having to sample the infinite constraint. The policy is evaluated online by solving a polynomial optimization problem, which also turns out to be convex in some cases. We experimentally validate the method on an autonomous helicopter testbed using a 10-dimensional helicopter model. Comment: 7 pages, 5 figures. Submitted to the 2013 European Control Conference, Zurich, Switzerland
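    The Bellman-inequality linear program can be written down exactly on a toy finite MDP (hypothetical numbers; the paper's contribution is the sum-of-squares generalization to infinite spaces, which this sketch does not attempt). It assumes SciPy's `linprog` is available:

```python
import numpy as np
from scipy.optimize import linprog

# Toy finite MDP with hypothetical numbers: 2 states, 2 inputs.
gamma = 0.9
cost = np.array([[1.0, 4.0],
                 [2.0, 0.5]])                  # cost[x, u]
P = np.array([[[0.8, 0.2], [0.3, 0.7]],
              [[0.5, 0.5], [0.1, 0.9]]])       # P[u, x, x']
n_states, n_inputs = cost.shape

# Relaxed Bellman equation as linear constraints:
#   V(x) - gamma * sum_x' P(x'|x,u) V(x') <= cost(x, u)  for all (x, u),
# maximizing sum(V); on a finite MDP the LP optimum is the exact V*.
A_ub, b_ub = [], []
for x in range(n_states):
    for u in range(n_inputs):
        row = -gamma * P[u, x, :].copy()
        row[x] += 1.0
        A_ub.append(row)
        b_ub.append(cost[x, u])
res = linprog(c=-np.ones(n_states), A_ub=np.array(A_ub),
              b_ub=np.array(b_ub), bounds=[(None, None)] * n_states)
V_lp = res.x

# Cross-check against plain value iteration.
V = np.zeros(n_states)
for _ in range(500):
    V = np.min(cost + gamma * np.einsum('uxy,y->xu', P, V), axis=1)
print(V_lp, V)
```

    On infinite spaces the constraint set becomes infinite; replacing each constraint by a sum-of-squares certificate is what turns this LP into the semidefinite program described above.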

    Integrated Approach to Airborne Laser Communication

    Lasers offer tremendous advantages over RF communication systems in bandwidth and security, due to their ultra-high frequency and narrow spatial beamwidth. Atmospheric turbulence causes severe received power variations and high bit error rates (BERs) in airborne laser communication. Airborne optical communication systems require special considerations in size, complexity, power, and weight. Conventional adaptive optics systems correct for the phase only and cannot correct for strong scintillation, but here the two transmission paths are separated sufficiently so that the strong scintillation is averaged out by incoherently summing the two beams in the receiver. This requisite separation distance is derived for multiple geometries, turbulence conditions, and turbulence effects. Integrating multiple techniques into a system alleviates the deleterious effects of turbulence without bulky adaptive optics systems. Wave optics simulations show multiple transmitters, receiver and transmitter trackers, and adaptive thresholding significantly reduce the BER (by over 10,000 times).
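    The variance-averaging claim is easy to check numerically. The following sketch assumes log-normal irradiance fading with an illustrative (not derived) scintillation strength:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
sigma = 0.5   # assumed log-irradiance std dev (moderate turbulence)

# Unit-mean log-normal irradiance samples for each propagation path.
draw = lambda: rng.lognormal(-sigma**2 / 2.0, sigma, n)
one_beam = draw()
two_beams = 0.5 * (draw() + draw())  # incoherent sum of two separated paths

# Scintillation index: normalized irradiance variance.
def scint_index(I):
    return I.var() / I.mean() ** 2

print(scint_index(one_beam), scint_index(two_beams))
```

    Summing two uncorrelated beams of equal mean power halves the scintillation index, which is the mechanism that makes the derived separation distance worthwhile.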

    Fusion of Head and Full-Body Detectors for Multi-Object Tracking

    In order to track all persons in a scene, the tracking-by-detection paradigm has proven to be a very effective approach. Yet, relying solely on a single detector is also a major limitation, as useful image information might be ignored. Consequently, this work demonstrates how to fuse two detectors into a tracking system. To obtain the trajectories, we propose to formulate tracking as a weighted graph labeling problem, resulting in a binary quadratic program. As such problems are NP-hard, the solution can only be approximated. Based on the Frank-Wolfe algorithm, we present a new solver that is crucial to handle such difficult problems. Evaluation on pedestrian tracking is provided for multiple scenarios, showing superior results over single detector tracking and standard QP-solvers. Finally, our tracker ranks 2nd on the MOT16 benchmark and 1st on the new MOT17 benchmark, outperforming over 90 trackers. Comment: 10 pages, 4 figures; Winner of the MOT17 challenge; CVPRW 201
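    A minimal Frank-Wolfe loop on a convex stand-in problem (the paper's binary quadratic labeling program is indefinite and harder, and its solver adds problem-specific machinery not shown here) illustrates the two ingredients: a linear oracle and a line search.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20

# Convex toy objective f(x) = 0.5 x'Qx + c'x over the probability simplex.
A = rng.normal(size=(n, n))
Q = A @ A.T + np.eye(n)               # positive definite
c = rng.normal(size=n)
f = lambda x: 0.5 * x @ Q @ x + c @ x
grad = lambda x: Q @ x + c

x = np.ones(n) / n                    # feasible start
f0 = f(x)
for _ in range(500):
    g = grad(x)
    s = np.zeros(n)
    s[np.argmin(g)] = 1.0             # linear oracle: best simplex vertex
    d = s - x
    # Exact line search for a quadratic objective, clipped to [0, 1].
    dQd = d @ Q @ d
    step = np.clip(-(g @ d) / dQd, 0.0, 1.0) if dQd > 0 else 1.0
    x = x + step * d
print(f0, f(x))
```

    Every iterate stays feasible by construction, because each update is a convex combination of the current point and a vertex; this projection-free property is why Frank-Wolfe suits constrained labeling relaxations.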

    Real-time data analysis at the LHC: present and future

    The Large Hadron Collider (LHC), which collides protons at an energy of 14 TeV, produces hundreds of exabytes of data per year, making it one of the largest sources of data in the world today. At present it is not possible to even transfer most of this data from the four main particle detectors at the LHC to "offline" data facilities, much less to permanently store it for future processing. For this reason the LHC detectors are equipped with real-time analysis systems, called triggers, which process this volume of data and select the most interesting proton-proton collisions. The LHC experiment triggers reduce the data produced by the LHC by between 1/1000 and 1/100000, to tens of petabytes per year, allowing its economical storage and further analysis. The bulk of the data-reduction is performed by custom electronics which ignores most of the data in its decision making, and is therefore unable to exploit the most powerful known data analysis strategies. I cover the present status of real-time data analysis at the LHC, before explaining why the future upgrades of the LHC experiments will increase the volume of data which can be sent off the detector and into off-the-shelf data processing facilities (such as CPU or GPU farms) to tens of exabytes per year. This development will simultaneously enable a vast expansion of the physics programme of the LHC's detectors, and make it mandatory to develop and implement a new generation of real-time multivariate analysis tools in order to fully exploit this new potential of the LHC. I explain what work is ongoing in this direction and motivate why more effort is needed in the coming years. Comment: Contribution to the proceedings of the HEPML workshop NIPS 2014. 20 pages, 5 figures
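    The quoted reduction factors can be sanity-checked with round, assumed numbers (100 EB/year of raw data is an illustrative figure, not an experiment-specific one):

```python
def kept_pb_per_year(raw_eb_per_year, reduction):
    """Data surviving the trigger, in PB/year (1 EB = 1000 PB)."""
    return raw_eb_per_year * 1000.0 * reduction

# The quoted 1/1000 to 1/100000 reductions bracket the
# "tens of petabytes per year" kept for offline storage.
for reduction in (1e-3, 1e-5):
    print(reduction, kept_pb_per_year(100, reduction), "PB/year")
```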

    Improved detection limits using a hand-held optical imager with coregistration capabilities

    Optical imaging is emerging as a non-invasive and non-ionizing method for breast cancer diagnosis. A hand-held optical imager has been developed with coregistration facilities towards flexible imaging of different tissue volumes and curvatures in near real-time. Herein, fluorescence-enhanced optical imaging experiments are performed to demonstrate deeper target detection under perfect and imperfect (100:1) uptake conditions in (liquid) tissue phantoms and in vitro. Upon summation of multiple scans (fluorescence intensity images), fluorescent targets are detected at greater depths than from a single scan alone.
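    A one-dimensional toy version of the scan-summation argument (hypothetical target amplitude, noise level, and pixel counts):

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical 1-D sketch: a weak fluorescent target (amplitude 0.5) in
# unit-variance detector noise; averaging coregistered scans grows the
# signal-to-noise ratio roughly as the square root of the scan count.
signal = np.zeros(64)
signal[30:34] = 0.5                       # target pixels
target, bg = np.s_[30:34], np.r_[0:30, 34:64]

def snr(img):
    return img[target].mean() / img[bg].std()

single = signal + rng.normal(size=64)     # one scan
n_scans = 100
summed = sum(signal + rng.normal(size=64) for _ in range(n_scans)) / n_scans

print(snr(single), snr(summed))
```

    The target signal is identical in every coregistered scan while the noise is independent, so averaging suppresses the noise floor and pulls deeper (weaker) targets above the detection limit.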