1,684 research outputs found

    VoIP Quality Assessment Technologies

    Get PDF

    Optimization of Planck/LFI on--board data handling

    Get PDF
    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on ground by a reversing step. This paper illustrates the LFI scientific onboard processing used to fit the allowed data rate. This is a lossy process tuned by a set of five parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters and describes the method used to optimize the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests, or data taken from LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which ends with optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white noise rms, well within the requirements. Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0, 10 Nov 2009; submitted to JINST 23 Jun 2009, accepted 10 Nov 2009, published 29 Dec 2009; this is a preprint, not the final version
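
    The interplay between the quantization step q, the offset O, the distortion EpsilonQ, and the compression rate Cr can be illustrated with a short sketch. The snippet below is not the LFI pipeline: the averaging over Naver samples and the mixing parameters r1 and r2 are omitted, and the 16-bit raw sample width is an assumption. It simply quantizes a simulated white-noise timeline and estimates EpsilonQ as the quantization error relative to the noise rms, and Cr from the entropy of the quantized symbols.

        # Minimal sketch (not the flight pipeline; Naver averaging and the mixing
        # parameters r1, r2 are omitted, and a 16-bit raw sample width is assumed):
        # quantize a simulated white-noise timeline with step q and offset O, then
        # estimate EpsilonQ and a rough compression rate Cr from the symbol entropy.
        import numpy as np

        rng = np.random.default_rng(0)
        sigma = 1.0                        # white-noise rms (arbitrary units)
        x = rng.normal(0.0, sigma, 100_000)

        q, O = 0.3, 0.0                    # quantization step and offset (illustrative)
        k = np.round((x - O) / q)          # integer symbols passed to the lossless coder
        x_rec = k * q + O                  # reconstruction performed on ground

        eps_q = np.std(x - x_rec) / sigma              # distortion relative to noise rms
        _, counts = np.unique(k, return_counts=True)
        p = counts / counts.sum()
        entropy_bits = -(p * np.log2(p)).sum()         # ideal bits/sample after compression
        cr = 16.0 / entropy_bits                       # vs. an assumed 16-bit raw sample

        print(f"EpsilonQ ~ {eps_q:.1%} of the noise rms, Cr ~ {cr:.2f}")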

    Computer Program and Method for Detecting and Predicting Valve Failure in a Reciprocating Compressor

    Get PDF
    Embodiments of the present invention provide a method implemented by a computer program for detecting and identifying valve failure in a reciprocating compressor and further for predicting valve failure in the compressor. Embodiments of the present invention detect and predict the valve failure using wavelet analysis, logistic regression, and neural networks. A pressure signal from the valve of the reciprocating compressor presents a non-stationary waveform from which features can be extracted using wavelet packet decomposition. The extracted features, along with temperature data for the valve, are used to train a logistic regression model to classify defective and normal operation of the valve. The wavelet features extracted from the pressure signal are also used to train a neural network model to predict the future trend of the pressure signal of the system, which is used as an indicator for performance assessment and for root cause detection of the compressor valve failures
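
    A minimal sketch of the feature-extraction and classification stage described above is given below. It is illustrative only, not the patented implementation: the data, wavelet choice, and decomposition level are placeholders, and it uses the PyWavelets and scikit-learn libraries to compute wavelet-packet energies and fit a logistic-regression classifier.

        # Illustrative sketch only (not the patented implementation): wavelet-packet
        # energies of a valve pressure trace, plus the valve temperature, feed a
        # logistic-regression classifier separating normal from defective cycles.
        import numpy as np
        import pywt
        from sklearn.linear_model import LogisticRegression

        def wp_energy_features(signal, wavelet="db4", level=3):
            """Energy of each terminal wavelet-packet node (2**level features)."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            return np.array([np.sum(node.data ** 2)
                             for node in wp.get_level(level, order="natural")])

        # Hypothetical training data: one pressure trace per compressor cycle and a
        # valve temperature reading, labelled 0 = normal, 1 = defective.
        rng = np.random.default_rng(1)
        traces = rng.normal(size=(200, 1024))          # placeholder pressure signals
        temps = rng.normal(80.0, 5.0, size=(200, 1))   # placeholder temperatures
        labels = rng.integers(0, 2, size=200)          # placeholder labels

        X = np.hstack([np.array([wp_energy_features(s) for s in traces]), temps])
        clf = LogisticRegression(max_iter=1000).fit(X, labels)
        print("training accuracy:", clf.score(X, labels))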

    Proceedings of the Second International Mobile Satellite Conference (IMSC 1990)

    Get PDF
    Presented here are the proceedings of the Second International Mobile Satellite Conference (IMSC), held June 17-20, 1990 in Ottawa, Canada. Topics covered include future mobile satellite communications concepts, aeronautical applications, modulation and coding, propagation and experimental systems, mobile terminal equipment, network architecture and control, regulatory and policy considerations, vehicle antennas, and speech compression

    Fog Computing in Medical Internet-of-Things: Architecture, Implementation, and Applications

    Full text link
    In the era when the market segment of the Internet of Things (IoT) tops the chart in various business reports, it is widely envisioned that the field of medicine stands to gain a large benefit from the explosion of wearables and internet-connected sensors that surround us to acquire and communicate unprecedented data on symptoms, medication, food intake, and daily-life activities impacting one's health and wellness. However, IoT-driven healthcare would have to overcome many barriers, such as: 1) There is an increasing demand for data storage on cloud servers, where the analysis of the medical big data becomes increasingly complex, 2) The data, when communicated, are vulnerable to security and privacy issues, 3) The communication of the continuously collected data is not only costly but also energy hungry, 4) Operating and maintaining the sensors directly from the cloud servers are non-trivial tasks. This book chapter defines Fog Computing in the context of medical IoT. Conceptually, Fog Computing is a service-oriented intermediate layer in IoT, providing the interfaces between the sensors and cloud servers for facilitating connectivity, data transfer, and a queryable local database. The centerpiece of Fog Computing is a low-power, intelligent, wireless, embedded computing node that carries out signal conditioning and data analytics on raw data collected from wearables or other medical sensors and offers efficient means to serve telehealth interventions. We implemented and tested a fog computing system using the Intel Edison and Raspberry Pi that allows acquisition, computing, storage, and communication of various medical data, such as pathological speech data of individuals with speech disorders, Phonocardiogram (PCG) signals for heart rate estimation, and Electrocardiogram (ECG)-based Q, R, S detection.Comment: 29 pages, 30 figures, 5 tables. Keywords: Big Data, Body Area Network, Body Sensor Network, Edge Computing, Fog Computing, Medical Cyberphysical Systems, Medical Internet-of-Things, Telecare, Tele-treatment, Wearable Devices, Chapter in Handbook of Large-Scale Distributed Computing in Smart Healthcare (2017), Springer
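
    As a rough illustration of the edge-side analytics such a fog node might run, the sketch below estimates heart rate from a raw ECG segment by band-pass filtering and peak detection, so that only compact results rather than the raw waveform need to be forwarded to the cloud. The sampling rate, filter band, and thresholds are illustrative assumptions, not parameters from the chapter.

        # Minimal sketch of the edge-side analytics a fog node might run: R-peak
        # detection and heart-rate estimation from a raw ECG segment, so that only
        # compact results (not the raw waveform) need to be sent to the cloud.
        # Sampling rate, filter band, and thresholds are illustrative assumptions.
        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        FS = 250  # assumed ECG sampling rate in Hz

        def heart_rate_bpm(ecg):
            # Band-pass around the QRS energy (~5-15 Hz) to suppress drift and noise.
            b, a = butter(2, [5 / (FS / 2), 15 / (FS / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            # R peaks: strong maxima at least 0.4 s apart (refractory-period assumption).
            peaks, _ = find_peaks(filtered, distance=int(0.4 * FS),
                                  height=0.5 * np.max(filtered))
            if len(peaks) < 2:
                return None
            rr = np.diff(peaks) / FS                  # R-R intervals in seconds
            return 60.0 / np.mean(rr)

        # Placeholder signal: a real fog node would read this from an ECG sensor.
        rng = np.random.default_rng(4)
        t = np.arange(0, 10, 1 / FS)
        ecg = np.sin(np.pi * t) ** 200 + 0.05 * rng.normal(size=t.size)
        print("estimated heart rate:", heart_rate_bpm(ecg), "bpm")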

    THE APPLICATION OF REAL-TIME SOFTWARE IN THE IMPLEMENTATION OF LOW-COST SATELLITE RETURN LINKS

    Get PDF
    Digital Signal Processors (DSPs) have evolved to a level where it is feasible for digital modems with relatively low data rates to be implemented entirely with software algorithms. With current technology it is still necessary to perform analogue processing between the RF input and a low-frequency IF but, as DSP technology advances, it will become possible to shift the interface between the analogue and digital domains ever closer to the RF input. The software radio concept is a long-term goal which aims to realise software-based digital modems which are completely flexible in terms of operating frequency, bandwidth, modulation format and source coding. The ideal software radio cannot be realised until DSP, Analogue to Digital (A/D) and Digital to Analogue (D/A) technology has advanced sufficiently. Until these advances have been made, it is often necessary to sacrifice optimum performance in order to achieve real-time operation. This thesis investigates practical real-time algorithms for carrier frequency synchronisation, symbol timing synchronisation, modulation, demodulation and FEC. Included in this work are novel software-based transceivers for continuous-mode transmission, burst-mode transmission, frequency modulation, phase modulation and orthogonal frequency division multiplexing (OFDM). Ideal applications for this work combine the requirement for flexible baseband signal processing with a relatively low data rate. Suitable applications were identified in low-cost satellite return links, and specifically in asymmetric satellite Internet delivery systems. These systems employ a high-speed (>>2 Mbps) DVB channel from service provider to customer and a low-cost, low-speed (32-128 kbps) return channel. This thesis also discusses asymmetric satellite Internet delivery systems, practical considerations for their implementation and the techniques that are required to map TCP/IP traffic onto low-cost satellite return links
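
    As a flavour of the software-based baseband processing the thesis deals with, the sketch below implements a toy BPSK demodulator: a non-data-aided squaring estimator recovers an unknown carrier frequency offset, the signal is derotated, a residual common phase is removed, and the bits are sliced. The symbol rate, offset, and noise level are illustrative; the thesis transceivers are considerably more elaborate (pulse shaping, symbol timing recovery, FEC).

        # Minimal software-demodulator sketch (illustrative, not the thesis code):
        # BPSK symbols with an unknown carrier frequency offset are recovered with a
        # classic squaring (non-data-aided) frequency estimator, derotation, removal
        # of the residual common phase, and hard-decision slicing.
        import numpy as np

        rng = np.random.default_rng(2)
        n_sym = 2048
        fsym = 64_000.0                 # assumed symbol rate (64 kbaud, illustrative)
        df_true = 350.0                 # unknown carrier frequency offset in Hz
        bits = rng.integers(0, 2, n_sym)
        symbols = 2 * bits - 1          # BPSK mapping: 0 -> -1, 1 -> +1

        t = np.arange(n_sym) / fsym     # one complex sample per symbol
        rx = symbols * np.exp(2j * np.pi * df_true * t)
        rx = rx + 0.1 * (rng.normal(size=n_sym) + 1j * rng.normal(size=n_sym))

        # Squaring removes the BPSK modulation; the spectral peak then sits at 2*df.
        n_fft = 16 * n_sym              # zero-padding for finer frequency resolution
        spec = np.abs(np.fft.fft(rx ** 2, n=n_fft))
        freqs = np.fft.fftfreq(n_fft, d=1 / fsym)
        df_est = freqs[np.argmax(spec)] / 2.0

        # Derotate, remove the residual common phase (the remaining 180-degree
        # ambiguity would normally be resolved by differential coding), and slice.
        derot = rx * np.exp(-2j * np.pi * df_est * t)
        derot = derot * np.exp(-0.5j * np.angle(np.mean(derot ** 2)))
        bits_hat = (derot.real > 0).astype(int)
        print(f"offset estimate {df_est:.1f} Hz, bit errors: {np.sum(bits != bits_hat)}")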

    Comparison of CELP speech coder with a wavelet method

    Get PDF
    This thesis compares the speech quality of the Code Excited Linear Prediction (CELP, Federal Standard 1016) speech coder with a new wavelet method for compressing speech. The performance of both is compared by performing subjective listening tests. The test signals used are clean signals (i.e. with no background noise), speech signals with room noise, and speech signals with artificial noise added. Results indicate that for clean signals and signals with predominantly voiced components the CELP standard performs better than the wavelet method, but for signals with room noise the wavelet method performs much better than CELP. For signals with artificial noise added, the results are mixed and depend on the level of added noise, with CELP performing better at low noise levels and the wavelet method performing better at higher noise levels
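
    A toy version of the wavelet compression idea, not the coder evaluated in the thesis, is sketched below: a frame is decomposed with PyWavelets, only the largest coefficients are kept, and the frame is reconstructed; the fraction of retained coefficients is a crude proxy for the bit-rate saving.

        # Toy sketch of a wavelet compression step (not the coder evaluated in the
        # thesis): decompose a frame, keep only the largest coefficients by magnitude,
        # and reconstruct; the fraction kept is a crude proxy for the bit-rate saving.
        import numpy as np
        import pywt

        def wavelet_compress(frame, wavelet="db8", level=5, keep=0.10):
            coeffs = pywt.wavedec(frame, wavelet, mode="periodization", level=level)
            flat = np.concatenate(coeffs)
            thresh = np.quantile(np.abs(flat), 1.0 - keep)   # keep the top 10 %
            coeffs = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]
            return pywt.waverec(coeffs, wavelet, mode="periodization")[: len(frame)]

        # Placeholder "speech" frame; real input would be e.g. 8 kHz sampled speech.
        rng = np.random.default_rng(3)
        frame = np.cumsum(rng.normal(size=2048))             # smooth, low-frequency trend
        rec = wavelet_compress(frame)
        snr = 10 * np.log10(np.sum(frame ** 2) / np.sum((frame - rec) ** 2))
        print(f"reconstruction SNR keeping 10% of coefficients: {snr:.1f} dB")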

    Monitoring multicast traffic in heterogeneous networks

    Get PDF
    Internship carried out at INESC - Porto and supervised by Prof. Dr. Ricardo Morla. Integrated master's thesis. Electrical and Computer Engineering - Telecommunications major. Faculty of Engineering, University of Porto. 200

    Data Processing and Investigations for the GRACE Follow-On Laser Ranging Interferometer

    Get PDF
    This thesis presents the first in-depth results of the Laser Ranging Interferometer (LRI) onboard the Gravity Recovery and Climate Experiment Follow-On (GRACE Follow-On) mission. The LRI is a novel instrument, developed in a U.S.-German collaboration including the Albert Einstein Institute (AEI) in Hanover. It successfully demonstrated the feasibility of ranging measurements by means of laser interferometry between two distant spacecraft and will push space-borne gravimetry missions to the next sensitivity level. The author of this thesis contributed to the project by programming a comprehensive framework for ground processing of LRI telemetry and by analyzing various kinds of instrument data streams. Accordingly, the title of this thesis covers both topics: data processing and investigations within the data. The thesis gives an introduction to laser interferometry and presents the various payloads of the GRACE Follow-On satellites. Furthermore, the design of the LRI itself is discussed, in order to make the underlying causal relations clear when getting into the details of the investigations. The various kinds of telemetry data and their processing levels are presented, giving an insight into the variety of data sets that are downlinked from the satellites. The investigations cover several major topics. They range from models for assessing the absolute laser frequency, which sets the scale for converting the raw phase measurements into the corresponding inter-satellite displacements, to a detailed investigation of the carrier-to-noise ratio, which provides information about the signal quality. Furthermore, the laser beam's far-field properties are investigated in terms of intensity and phase front; these investigations even led to a proposal for a new scan pattern, which has since been performed. Last but not least, a comprehensive assessment of the LRI spectrum was carried out, which reveals correlations between the satellites' attitude and orbit control system (AOCS), i.e. the star cameras for attitude determination and the thruster activations for attitude control, and the ranging signal measured by the LRI. In summary, this thesis is concerned with several aspects of LRI characterization and data analysis. Since the overall data quality and sensitivity of the LRI exceed the needs and expectations of the current gravimetric mission, many of the discussed effects are mainly of academic interest, e.g. deepening the LRI team's understanding of the instrument and informing the development of future missions in geodesy or space-based gravitational-wave detection (the LISA mission)
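
    The scale-factor conversion mentioned above can be written down compactly. The snippet below is a simplified sketch under stated assumptions (a round-trip factor of two for the transponder link and an approximate 1064 nm laser frequency); it is not the mission's processing code.

        # Simplified sketch (assumptions: round-trip factor of 2, nominal 1064 nm laser):
        # the absolute laser frequency nu sets the scale that turns the raw LRI phase,
        # counted in cycles, into a biased inter-satellite displacement.
        C = 299_792_458.0      # speed of light in m/s
        NU = 281.8e12          # assumed laser frequency in Hz (roughly c / 1064 nm)

        def phase_to_range_m(phase_cycles):
            """Biased range in metres from accumulated LRI phase in cycles."""
            return (C / (2.0 * NU)) * phase_cycles

        print(phase_to_range_m(1.0))   # one cycle ~ half a wavelength, about 0.53 micrometres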