
    The Fine Guidance System of the PLATO Mission

    PLATO - PLAnetary Transits and Oscillations of stars - is a medium-class mission in the European Space Agency (ESA) Cosmic Vision programme, whose launch is foreseen by 2026. The objective is the detection and characterization of terrestrial exoplanets up to the habitable zone of solar-type stars by means of their transit signature in front of a very large sample of bright stars. The seismic oscillations of the parent stars orbited by these planets are measured in order to understand the properties of the exoplanetary systems. The PLATO payload consists of an instrument with 26 cameras for star observation: 24 normal cameras grouped in four subsets with six cameras each, and two fast cameras. Besides providing scientific data for very bright stars, the fast cameras also serve as two redundant Fine Guidance Systems (FGS) and will be an integral part of the Attitude and Orbit Control System (AOCS). This ensures the very high pointing precision that is needed to achieve a high photometric precision. Working as a star tracker, the FGS calculates the attitude from guide star positions on the focal plane and their reference directions given by a star catalogue. Compared to predecessor missions like CoRoT, Kepler, or TESS, the precision of the fine guidance algorithm needs to be increased significantly. This is especially challenging as the optical design is identical for all cameras and optimized to meet the science objectives rather than to serve as a star tracker. Therefore, a novel approach based on a Gaussian fit is proposed. The presented algorithm provides a noise-optimal estimation of the guide star positions, which propagates to an optimal attitude estimation. Although computationally more expensive than conventional methods, its suitability for a real-time on-board application is proven with an implementation on the target hardware. Furthermore, its robustness and precision are assessed theoretically and with simulated star image sequences.
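    The Gaussian-fit centroiding idea can be illustrated with a small sketch: a symmetric 2D Gaussian point spread function plus a flat background is fitted by least squares to the pixel window around a guide star, and the fitted centre is taken as the star position. This is only a minimal illustration; the PSF model, window size, and noise weighting of the actual FGS algorithm are assumptions here.

```python
# Minimal sketch of Gaussian-fit centroiding on a single star window.
# The PSF model (symmetric Gaussian + flat background) and the window
# size are illustrative assumptions, not the flight algorithm.
import numpy as np
from scipy.optimize import curve_fit

def gaussian_psf(coords, amp, x0, y0, sigma, background):
    """Symmetric 2D Gaussian plus constant background, flattened for curve_fit."""
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2))
            + background).ravel()

# Simulate a 9x9 pixel window around a guide star (arbitrary values).
size = 9
y, x = np.mgrid[0:size, 0:size]
rng = np.random.default_rng(0)
true_centre = (4.3, 3.7)  # sub-pixel star position (x0, y0)
window = gaussian_psf((x, y), 1000.0, *true_centre, 1.2, 50.0).reshape(size, size)
window = rng.poisson(window).astype(float)  # photon (shot) noise

# Least-squares fit; the fitted (x0, y0) is the estimated guide-star position.
p0 = (window.max(), size / 2, size / 2, 1.5, np.median(window))
popt, pcov = curve_fit(gaussian_psf, (x, y), window.ravel(), p0=p0)
print("estimated centre:", popt[1], popt[2])
print("1-sigma uncertainty:", np.sqrt(np.diag(pcov))[1:3])
```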

    Real-Time Hardware-in-the-Loop Test Configuration: Use Case for the Fine Guidance System of PLATO

    The presentation describes a real-time hardware-in-the-loop test configuration with payload hardware units and simulators, used for the verification and validation of the Fine Guidance System of the PLATO mission. The presentation covers the motivation behind combining real and emulated hardware, a description of the PLATO mission and its Fine Guidance System, the test configuration, the timing expected during tests, and tools for test automation. Moreover, a solution for a data archiving tool is provided.

    PlatoSim: an end-to-end PLATO camera simulator for modelling high-precision space-based photometry

    Context. PLAnetary Transits and Oscillations of stars (PLATO) is the ESA M3 space mission dedicated to detecting and characterising transiting exoplanets, including information from the asteroseismic properties of their stellar hosts. The uninterrupted and high-precision photometry provided by space-borne instruments such as PLATO requires long preparatory phases. An exhaustive list of tests is paramount to design a mission that meets the performance requirements, and as such simulations are an indispensable tool in the mission preparation. Aims. To accommodate PLATO's need for versatile simulations prior to mission launch that, at the same time, accurately describe its innovative yet complex multi-telescope design, we present in this work the end-to-end PLATO simulator specifically developed for that purpose, namely PlatoSim. We show, step by step, the algorithms embedded in the software architecture of PlatoSim that allow the user to simulate photometric time series of charge-coupled device (CCD) images and light curves in accordance with the expected observations of PLATO. Methods. In the context of the PLATO payload, a general formalism for modelling, end-to-end, incoming photons from the sky to the final measurement in digital units is discussed. Following the light path through the instrument, we present an overview of the stellar field and sky background, the short- and long-term barycentric pixel displacement of the stellar sources, the cameras and their optics, the modelling of the CCDs and their electronics, and all main random and systematic noise sources. Results. We show the strong predictive power of PlatoSim through its diverse applicability and contribution to numerous working groups within the PLATO mission consortium. This involves the ongoing mechanical integration and alignment, performance studies of the payload, the pipeline development, and assessments of the scientific goals. Conclusions. PlatoSim is a state-of-the-art simulator that is able to produce the expected photometric observations of PLATO to a high level of accuracy. We demonstrate that PlatoSim is a key software tool for the PLATO mission in the preparatory phases until mission launch and prospectively beyond.
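    The photon-to-digital-unit chain described above can be sketched with a toy model: the incident photon rate is converted to detected photons with shot noise, electrons accumulate with dark current and read noise, the signal saturates at the full well, and the ADC digitises it via the gain and bit depth. All numerical values below are placeholder assumptions and do not reflect PlatoSim or the PLATO cameras.

```python
# Toy photon-to-ADU conversion chain; all constants are illustrative
# placeholders, not actual PLATO camera or PlatoSim parameters.
import numpy as np

rng = np.random.default_rng(1)

def observe(photon_rate, exposure_s, qe=0.8, dark_e_per_s=2.0,
            read_noise_e=10.0, gain_e_per_adu=2.0, full_well_e=200_000,
            adc_bits=16):
    """Convert an incident photon rate (photons/s/pixel) into ADU counts."""
    photons = rng.poisson(photon_rate * exposure_s)                  # shot noise
    electrons = rng.binomial(photons, qe)                            # quantum efficiency
    electrons = electrons + rng.poisson(dark_e_per_s * exposure_s)   # dark current
    electrons = electrons + rng.normal(0.0, read_noise_e)            # read noise
    electrons = np.clip(electrons, 0, full_well_e)                   # saturation
    adu = electrons / gain_e_per_adu                                 # gain
    return int(np.clip(adu, 0, 2 ** adc_bits - 1))                   # ADC quantisation

print(observe(photon_rate=5_000.0, exposure_s=25.0))
```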

    High Resolution Mapping Using CCD-line Camera and Laser Scanner with Integrated Position and Orientation System

    The fusion of panoramic data with laser scanner data is a new approach and allows the combination of high-resolution image and depth data. Application areas are city modelling, computer vision, and documentation of the cultural heritage. Panoramic recording of image data is realized by a CCD line, which is precisely rotated around the projection centre. In the case of other possible movements, the actual position of the projection centre and the viewing direction have to be measured. Linearly moving panoramas, e.g. along a wall, are an interesting extension of such rotational panoramas. Here, the instantaneous position and orientation determination can be realized with an integrated navigation system comprising differential GPS and an inertial measurement unit. This paper investigates the combination of a panoramic camera and a laser scanner with a navigation system for indoor and outdoor applications. First, laboratory experiments are reported which were carried out to obtain valid parameters for the surveying accuracy achievable with both sensors, the panoramic camera and the laser scanner, respectively. Then, outdoor surveying results using a position and orientation system as the navigation sensor are presented and discussed.
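    A core step in fusing scanner or line-camera data with a position and orientation system is direct georeferencing: each measurement taken in the sensor frame is rotated by the instantaneous attitude and translated by the instantaneous position. The sketch below shows this for a single laser scanner point; the Euler-angle convention, lever arm, and numerical pose are illustrative assumptions, not values from the paper.

```python
# Minimal direct-georeferencing sketch: sensor-frame point -> world frame
# using the instantaneous pose from a position and orientation system.
# Lever arm, angle convention, and pose values are illustrative assumptions.
import numpy as np

def rotation_zyx(yaw, pitch, roll):
    """Rotation matrix from body to world frame (Z-Y-X Euler angles, radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def georeference(point_sensor, position_world, attitude_rpy, lever_arm_body):
    """Transform a laser point from the sensor frame to world coordinates."""
    r_body_to_world = rotation_zyx(*attitude_rpy[::-1])  # (roll, pitch, yaw) -> zyx
    return position_world + r_body_to_world @ (lever_arm_body + point_sensor)

# One scanner return 12.5 m ahead of the sensor, platform at a known pose.
p_world = georeference(point_sensor=np.array([12.5, 0.0, 0.0]),
                       position_world=np.array([4581000.0, 367000.0, 52.0]),
                       attitude_rpy=np.radians([1.0, -0.5, 35.0]),
                       lever_arm_body=np.array([0.30, 0.05, -0.10]))
print(p_world)
```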

    Stereo-Vision-Aided Inertial Navigation

    Knowledge of position and attitude plays a decisive role in many applications. This work demonstrates the tight coupling of inertial navigation and optical information to obtain this knowledge. The presented method is based on purely passive measurements and is independent of external references. It is therefore suitable for navigation in unknown environments, both indoors and outdoors. A main focus lies on deriving motion information from optical systems. Supported by inertial motion data, natural features of the environment are tracked in order to derive the ego-motion of a stereo camera, which in turn helps to compensate for the inertial sensor drift. Using the inertial measurements contributes significantly to avoiding feature mismatches and to reducing the required computing power. The measured gravitational acceleration serves as a vertical reference that further stabilises the navigation solution. Inertial navigation, a dead-reckoning method, is of great importance and is the subject of much research. In dead reckoning, the current position is determined from the last known position together with the measured velocity and elapsed time. Additional long-term-stable external measurements, such as GPS, complement the good short-term characteristics of inertial navigation and bound the accumulated errors. Although this approach works very well for many applications, it shows weaknesses when the aiding measurement is erroneous or unavailable. The use of independent systems such as optical sensors, barometers, or odometers is therefore a sensible complement. First, I present a general approach for a multi-sensor system for position and attitude measurement. I consider the complete system chain, starting with the design of the hardware components, through data acquisition and calibration, to the derivation of higher-level information from fused sensor data. In particular, the detailed examination of possible error sources makes an important contribution to the understanding of the system. Using several indoor and outdoor navigation tasks, I illustrate the result of integrating optical and inertial measurement data.
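    The dead-reckoning behaviour described above can be illustrated with a minimal sketch: acceleration measurements are integrated to velocity and position, an uncompensated accelerometer bias makes the position error grow over time, and an occasional external velocity measurement (standing in for visual-odometry aiding) bounds the drift. The 1D model, sensor rates, and noise values are assumptions for illustration only.

```python
# Minimal 1D dead-reckoning sketch: integrating biased accelerometer data
# drifts, while occasional aiding velocity updates (e.g. from visual
# odometry) bound the error. All rates and noise values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
dt = 0.01                     # 100 Hz IMU
steps = 6000                  # 60 s of data
accel_bias = 0.05             # m/s^2, uncompensated bias
true_accel = np.zeros(steps)  # platform at rest

for aided in (False, True):
    vel, pos = 0.0, 0.0
    for k in range(steps):
        meas = true_accel[k] + accel_bias + rng.normal(0.0, 0.02)
        vel += meas * dt                # integrate acceleration to velocity
        pos += vel * dt                 # integrate velocity to position
        if aided and k % 100 == 0:      # 1 Hz aiding measurement
            vel = rng.normal(0.0, 0.01)  # external velocity (true value = 0)
    print(f"aided={aided}: position error after 60 s = {pos:.2f} m")
```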

    Stereobildgestützte inertiale Navigation

    Reliable information about position and attitude is an essential requirement for many applications. The work expounded in this paper aims at a tight integration of low-cost inertial navigation and stereo vision to obtain this information. The method I present here is based on passive measurements and does not rely on external referencing. Thus, it provides a navigation solution for unknown indoor and outdoor environments.

    Test Input Partitioning for Automated Testing of Satellite On-board Image Processing Algorithms

    On-board image processing technologies in the satellite domain are subject to extremely strict requirements with respect to reliability and accuracy in hard real-time. Due to their large input domain, it is infeasible to execute all possible test cases. To overcome this problem, we define a novel test approach that efficiently and systematically captures the input domain of satellite on-board image processing applications. To achieve this, we first present a dedicated partitioning into equivalence classes for each input parameter. Then, we define multidimensional coverage criteria to assess a given test suite for its coverage of the complete input domain. Finally, we present a test generation algorithm that automatically inserts missing test cases into a given test suite based on our multidimensional coverage criteria. This results in a reasonably small test suite that covers the whole input domain of satellite on-board image processing applications. We demonstrate the effectiveness of our approach with experimental results from the ESA medium-class mission PLATO.
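    The combination of equivalence-class partitioning, a multidimensional coverage check, and automatic completion of a test suite can be sketched as follows. The input parameters, class boundaries, and the pairwise (two-dimensional) coverage criterion below are illustrative assumptions, not the partitioning defined for PLATO.

```python
# Sketch of equivalence-class partitioning with a simple pairwise
# (2-dimensional) coverage check and greedy completion of a test suite.
# Parameters and class boundaries are illustrative assumptions.
from itertools import combinations, product

# Equivalence classes per input parameter (class name -> representative value).
classes = {
    "star_magnitude": {"bright": 6.0, "nominal": 9.0, "faint": 12.0},
    "background":     {"low": 10.0, "high": 200.0},
    "defocus_px":     {"sharp": 0.0, "blurred": 2.5},
}

def covered_pairs(test_suite):
    """All (parameter, class) pairs jointly exercised by the suite."""
    pairs = set()
    for case in test_suite:
        for (p1, c1), (p2, c2) in combinations(sorted(case.items()), 2):
            pairs.add(((p1, c1), (p2, c2)))
    return pairs

def required_pairs():
    """Every combination of classes for every pair of parameters."""
    req = set()
    for p1, p2 in combinations(sorted(classes), 2):
        for c1, c2 in product(classes[p1], classes[p2]):
            req.add(((p1, c1), (p2, c2)))
    return req

def complete(test_suite):
    """Greedily add test cases until every class pair is covered."""
    missing = required_pairs() - covered_pairs(test_suite)
    while missing:
        (p1, c1), (p2, c2) = missing.pop()
        case = {p: next(iter(classes[p])) for p in classes}  # default classes
        case.update({p1: c1, p2: c2})
        test_suite.append(case)
        missing -= covered_pairs([case])
    return test_suite

suite = [{"star_magnitude": "nominal", "background": "low", "defocus_px": "sharp"}]
for case in complete(suite):
    print(case)
```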

    Uncertainty Model for Template Feature Matching

    Using visual odometry and inertial measurements, indoor and outdoor positioning systems can perform an accurate self-localization in unknown, unstructured environments where absolute positioning systems (e.g. GNSS) are unavailable. However, the achievable accuracy is highly affected by the residuals of calibration, the quality of the noise model, etc. These unavoidable uncertainties of the sensors and the data processing therefore have to be taken into account and handled via error propagation, which allows them to be propagated through the entire system. The central filter of the system (e.g. a Kalman filter) can then make use of the enhanced statistical model and use the propagated errors to calculate the optimal result. In this paper, we focus on the uncertainty calculation of an elementary part of optical navigation, the template feature matcher. First of all, we propose a method to model the image noise. Then we use Taylor's theorem to extend two very popular and efficient template feature matchers, the sum of absolute differences (SAD) and the normalized cross-correlation (NCC), to obtain sub-pixel matching results. Based on the proposed noise model and the extended matchers, we propagate the image noise to the uncertainties of the sub-pixel matching results. Although SAD and NCC are used here, the image noise model can easily be combined with other feature matchers. We evaluate our method with an Integrated Positioning System (IPS) developed by the German Aerospace Center. The experimental results show that our method improves the quality of the measured trajectory. Moreover, it increases the robustness of the system.
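    The overall idea of pushing an image-noise model through a sub-pixel template matcher can be illustrated numerically: the sketch below performs a 1D NCC search with parabolic sub-pixel refinement and propagates an assumed Gaussian image noise to the uncertainty of the matched position by Monte Carlo sampling. Note that the paper derives this propagation analytically via Taylor's theorem; the noise level and signal here are illustrative assumptions.

```python
# Sketch: NCC template matching with parabolic sub-pixel refinement, and a
# Monte Carlo propagation of an assumed Gaussian image-noise model to the
# uncertainty of the sub-pixel result (the paper instead derives this
# analytically via Taylor's theorem; noise level is an assumption).
import numpy as np

rng = np.random.default_rng(3)

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_subpixel(image_row, template):
    """1D NCC search plus a parabola fit around the best integer shift."""
    n = len(image_row) - len(template) + 1
    scores = np.array([ncc(image_row[k:k + len(template)], template) for k in range(n)])
    k = int(np.argmax(scores))
    if 0 < k < n - 1:
        s_l, s_c, s_r = scores[k - 1], scores[k], scores[k + 1]
        k += 0.5 * (s_l - s_r) / (s_l - 2 * s_c + s_r)  # vertex of the parabola
    return k

# Synthetic 1D signal and a template extracted at a known position.
signal = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.linspace(0, 1, 200)
template = signal[80:110].copy()

# Propagate an assumed sigma = 0.02 image noise through the matcher.
sigma = 0.02
estimates = [match_subpixel(signal + rng.normal(0, sigma, signal.shape), template)
             for _ in range(500)]
print("mean position:", np.mean(estimates))
print("std (propagated uncertainty):", np.std(estimates))
```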

    Accuracy Evaluation of Stereo Vision Aided Inertial Navigation for Indoor Environments

    Accurate knowledge of position and orientation is a prerequisite for many applications regarding unmanned navigation, mapping, or environmental modelling. GPS-aided inertial navigation is the preferred solution for outdoor applications. Nevertheless, a similar solution is needed for navigation tasks in difficult environments with erroneous or no GPS data. Therefore, a stereo vision aided inertial navigation system is presented which is capable of providing real-time local navigation for indoor applications. A method is described to reconstruct the ego-motion of a stereo camera system aided by inertial data. This, in turn, is used to constrain the inertial sensor drift. The optical information is derived from natural landmarks, extracted and tracked over consecutive stereo image pairs. Using inertial data for feature tracking effectively reduces computational costs and at the same time increases the reliability due to constrained search areas. Mismatched features, e.g. at repetitive structures typical for indoor environments, are avoided. An Integrated Positioning System (IPS) was deployed and tested on an indoor navigation task. IPS was evaluated for accuracy, robustness, and repeatability in a common office environment. In combination with a dense disparity map derived from the navigation cameras, a high-density point cloud is generated to show the capability of the navigation algorithm.
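    The benefit of inertial aiding for feature tracking is that the predicted camera motion lets the matcher search only a small window around the predicted feature location instead of the whole image. A minimal sketch of this prediction step follows; the pinhole intrinsics, the predicted relative pose, and the window size are illustrative assumptions, not values from the paper.

```python
# Sketch: predict where a tracked 3D feature reappears in the next frame
# using the inertially predicted camera motion, and restrict the matching
# search to a small window around that prediction. Camera intrinsics,
# relative pose, and window size are illustrative assumptions.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],   # focal lengths / principal point (pixels)
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def project(point_cam):
    """Pinhole projection of a 3D point given in the camera frame."""
    uvw = K @ point_cam
    return uvw[:2] / uvw[2]

def predicted_search_window(point_cam_prev, r_rel, t_rel, half_size=8):
    """Predict the feature's pixel position in the next frame and return
    the bounding box of the constrained search area."""
    point_next = r_rel @ point_cam_prev + t_rel  # inertially predicted motion
    u, v = project(point_next)
    return (u - half_size, v - half_size, u + half_size, v + half_size)

# Feature 4 m in front of the camera; the IMU predicts a small forward motion
# combined with a slight rotation about the vertical axis.
theta = np.radians(1.5)
r_rel = np.array([[np.cos(theta), 0, np.sin(theta)],
                  [0, 1, 0],
                  [-np.sin(theta), 0, np.cos(theta)]])
t_rel = np.array([0.02, 0.0, -0.10])
print(predicted_search_window(np.array([0.5, -0.2, 4.0]), r_rel, t_rel))
```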