
    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which must strictly adhere to the project schedule in order to be ready for launch and flight operations. To guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the Verification and Validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
    Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
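
    As an illustration of the housekeeping validation idea described above (injecting known parameter values into real HK packets and comparing them with the Level 1 timelines), the following minimal Python sketch shows the structure of such a check. The packet representation and all names (inject_hk_values, build_timeline, validate) are hypothetical stand-ins, not the actual LFI-DPC software.

```python
# Minimal sketch of the HK validation idea: inject known parameter values
# into housekeeping packets, run them through a Level 1-like decoding step,
# and compare the resulting timeline with the injected reference values.
# All names here are hypothetical; they do not refer to the real LFI-DPC code.

def inject_hk_values(packets, parameter, values):
    """Overwrite `parameter` in each raw HK packet with a known value."""
    injected = []
    for packet, value in zip(packets, values):
        patched = dict(packet)          # packets modelled as simple dicts
        patched[parameter] = value
        injected.append(patched)
    return injected

def build_timeline(packets, parameter):
    """Stand-in for the Level 1 processing: extract a (time, value) timeline."""
    return [(p["obt"], p[parameter]) for p in packets]

def validate(packets, parameter, reference_values, tolerance=0.0):
    """Compare the Level 1 timeline against the injected reference values."""
    timeline = build_timeline(
        inject_hk_values(packets, parameter, reference_values), parameter)
    mismatches = [
        (t, got, expected)
        for (t, got), expected in zip(timeline, reference_values)
        if abs(got - expected) > tolerance
    ]
    return len(mismatches) == 0, mismatches
```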

    Telescience Testbed Pilot Program

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the university technical reports.

    EChO Payload electronics architecture and SW design

    EChO is a three-module (VNIR, SWIR, MWIR), highly integrated spectrometer covering the wavelength range from 0.55 μm to 11.0 μm. The baseline design includes the goal wavelength extension to 0.4 μm, while an optional LWIR module extends the range to the goal wavelength of 16.0 μm. An Instrument Control Unit (ICU) is foreseen as the main electronic subsystem interfacing the spacecraft and collecting data from all the payload spectrometer modules. The ICU is in charge of two main tasks: the overall payload control (Instrument Control Function) and the digital processing of housekeeping and scientific data (Data Processing Function), including lossless compression prior to storing the science data in the Solid State Mass Memory of the Spacecraft. These two main tasks are accomplished by the Payload On Board Software (P-OBSW) running on the ICU CPUs.
    Comment: Experimental Astronomy - EChO Special Issue 201
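
    As a rough illustration of the Data Processing Function's compression step, the sketch below uses general-purpose zlib compression as a stand-in for whatever lossless algorithm the P-OBSW actually implements; the frame layout, data type and function names are assumptions made for the example.

```python
import zlib
import numpy as np

def compress_science_frame(frame: np.ndarray) -> bytes:
    """Losslessly compress a detector frame before it is handed to mass memory.
    zlib is used here purely as a stand-in for the mission's actual algorithm."""
    raw = frame.astype(np.uint16).tobytes()
    return zlib.compress(raw, level=9)

def decompress_science_frame(blob: bytes, shape) -> np.ndarray:
    """Invert the compression on ground; the round trip must be bit-exact."""
    raw = zlib.decompress(blob)
    return np.frombuffer(raw, dtype=np.uint16).reshape(shape)

# Round-trip check: lossless means the decompressed frame equals the original.
frame = np.random.randint(0, 2**16, size=(64, 64), dtype=np.uint16)
restored = decompress_science_frame(compress_science_frame(frame), frame.shape)
assert np.array_equal(frame, restored)
```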

    Engineering simulations for cancer systems biology

    Computer simulation can be used to inform in vivo and in vitro experimentation, enabling rapid, low-cost hypothesis generation and directing experimental design in order to test those hypotheses. In this way, in silico models become a scientific instrument for investigation, and so should be developed to high standards, be carefully calibrated, and have their findings presented in such a way that they may be reproduced. Here, we outline a framework that supports developing simulations as scientific instruments, and we select cancer systems biology as an exemplar domain, with a particular focus on cellular signalling models. We consider the challenges of lack of data, incomplete knowledge and modelling in the context of a rapidly changing knowledge base. Our framework comprises a process to clearly separate scientific and engineering concerns in model and simulation development, and an argumentation approach to documenting models as a rigorous way of recording assumptions and knowledge gaps. We propose interactive, dynamic visualisation tools to enable the biological community to interact with cellular signalling models directly for experimental design. There is a mismatch in scale between these cellular models and the tissue structures that are affected by tumours, and bridging this gap requires substantial computational resources. We present concurrent programming as a technology to link scales without losing important details through model simplification. We discuss the value of combining this technology, interactive visualisation, argumentation and model separation to support the development of multi-scale models that represent biologically plausible cells arranged in biologically plausible structures, modelling cell behaviour, interactions and response to therapeutic interventions.
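
    As a toy illustration of using concurrency to couple many cell-level models into a tissue-scale update, the sketch below runs independent cell updates in parallel and then applies a shared tissue-level coupling. It is not the authors' framework; the dynamics, state representation and names are placeholders chosen for the example.

```python
from concurrent.futures import ProcessPoolExecutor

def cell_step(state):
    """Toy cell-level update: each cell integrates its own signalling state.
    A real model would evaluate a signalling network here."""
    signal, alive = state
    signal = 0.9 * signal + 0.1            # placeholder dynamics
    return (signal, alive and signal < 5.0)

def tissue_step(cells):
    """Advance every cell concurrently, then apply a tissue-level coupling."""
    with ProcessPoolExecutor() as pool:
        updated = list(pool.map(cell_step, cells))
    mean_signal = sum(s for s, _ in updated) / len(updated)
    # Tissue-scale feedback: every cell feels a field derived from all cells.
    return [(0.5 * (s + mean_signal), a) for s, a in updated]

if __name__ == "__main__":
    cells = [(1.0, True)] * 100
    for _ in range(10):
        cells = tissue_step(cells)
```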

    Study of application of space telescope science operations software for SIRTF use

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be usable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

    Workshop proceedings: Information Systems for Space Astrophysics in the 21st Century, volume 1

    The Astrophysical Information Systems Workshop was one of the three Integrated Technology Planning workshops. Its objectives were to develop an understanding of future mission requirements for information systems, the potential role of technology in meeting these requirements, and the areas in which NASA investment might have the greatest impact. Workshop participants were briefed on the astrophysical mission set with an emphasis on those missions that drive information systems technology, the existing NASA space-science operations infrastructure, and the ongoing and planned NASA information systems technology programs. Program plans and recommendations were prepared in five technical areas: Mission Planning and Operations; Space-Borne Data Processing; Space-to-Earth Communications; Science Data Systems; and Data Analysis, Integration, and Visualization.

    Robot Autonomy for Surgery

    Autonomous surgery involves having surgical tasks performed by a robot operating under its own will, with partial or no human involvement. There are several important advantages of automation in surgery, which include increased precision of care due to sub-millimeter robot control, real-time utilization of biosignals for interventional care, improvements to surgical efficiency and execution, and computer-aided guidance under various medical imaging and sensing modalities. While these methods may displace some tasks of surgical teams and individual surgeons, they also enable new capabilities in interventions that are too difficult for, or go beyond the skills of, a human. In this chapter, we provide an overview of robot autonomy in commercial use and in research, and present some of the challenges faced in developing autonomous surgical robots.

    The digital data processing concepts of the LOFT mission

    The Large Observatory for X-ray Timing (LOFT) is one of the five mission candidates that were considered by ESA for an M3 mission (with a launch opportunity in 2022 - 2024). LOFT features two instruments: the Large Area Detector (LAD) and the Wide Field Monitor (WFM). The LAD is a 10 m²-class instrument with approximately 15 times the collecting area of the largest timing mission so far (RXTE), for the first time combined with CCD-class spectral resolution. The WFM will continuously monitor the sky and recognise changes in source states, detect transient and bursting phenomena, and allow the mission to respond to them. Observing the brightest X-ray sources with the effective area of the LAD leads to enormous data rates that need to be processed at several levels, filtered and compressed in real time already on board. The WFM data processing, on the other hand, puts rather low constraints on the data rate but requires algorithms to find the photon interaction location on the detector and then to deconvolve the detector image in order to obtain the sky coordinates of observed transient sources. In the following, we give an overview of the data handling concepts that were developed during the study phase.
    Comment: Proc. SPIE 9144, Space Telescopes and Instrumentation 2014: Ultraviolet to Gamma Ray, 91446
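
    As a rough illustration of the WFM image deconvolution step mentioned above, the sketch below assumes a coded-mask-style detector image and recovers source positions by FFT-based balanced cross-correlation with the mask pattern; the mask, array sizes and function names are assumptions for the example and do not represent the flight algorithm.

```python
import numpy as np

def deconvolve_detector_image(detector_image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Cross-correlate the detector image with the (balanced) mask pattern.
    For a coded-mask instrument this yields a sky image whose peaks mark the
    source positions; peak coordinates can then be mapped to sky angles."""
    balanced = mask - mask.mean()          # balanced mask suppresses the flat background
    f_det = np.fft.rfft2(detector_image)
    f_mask = np.fft.rfft2(balanced, s=detector_image.shape)
    return np.fft.irfft2(f_det * np.conj(f_mask), s=detector_image.shape)

# Toy usage: a detector image built from a shifted mask pattern plus noise
# produces a correlation peak at the shift, i.e. the source position.
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(64, 64)).astype(float)
detector = np.roll(mask, shift=(5, 9), axis=(0, 1)) + rng.poisson(0.1, size=mask.shape)
sky = deconvolve_detector_image(detector, mask)
peak = np.unravel_index(np.argmax(sky), sky.shape)   # expected near (5, 9)
```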