
    Direct exoplanet detection and characterization using the ANDROMEDA method: Performance on VLT/NaCo data

    Context. The direct detection of exoplanets with high-contrast imaging requires advanced data processing methods to disentangle potential planetary signals from bright quasi-static speckles. Among them, angular differential imaging (ADI) permits potential planetary signals with a known rotation rate to be separated from instrumental speckles that are either static or slowly varying. The method presented in this paper, called ANDROMEDA for ANgular Differential OptiMal Exoplanet Detection Algorithm, is based on a maximum likelihood approach to ADI and is used to estimate the position and the flux of any point source present in the field of view. Aims. In order to optimize and experimentally validate this previously proposed method, we applied ANDROMEDA to real VLT/NaCo data. In addition to its pure detection capability, we investigated the possibility of defining simple and efficient criteria for automatic point source extraction able to support the processing of large surveys. Methods. To assess the performance of the method, we applied ANDROMEDA to VLT/NaCo data of TYC-8979-1683-1, which is surrounded by numerous bright stars, and to which we added synthetic planets of known position and flux. In order to accommodate the real data properties, it was necessary to add pre-processing and post-processing steps to the initially proposed algorithm. We then investigated its performance in the challenging case of a well-known target, β Pictoris, whose companion is close to the detection limit, and we compared our results to those obtained by another method based on principal component analysis (PCA). Results. Application to VLT/NaCo data demonstrates the ability of ANDROMEDA to automatically detect and characterize point sources present in the image field. The result is a robust method that gives results consistent with recently published algorithms, with a similar sensitivity and only two parameters to be fine-tuned. Moreover, the companion flux estimates are not biased by the algorithm parameters and do not require a posteriori corrections. Conclusions. ANDROMEDA is an attractive alternative to current standard image processing methods that can be readily applied to on-sky data.
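The ADI principle this abstract builds on can be sketched in a few lines. This is the classical median-subtraction variant, not ANDROMEDA's maximum-likelihood estimator, and all array shapes and values are invented for illustration:

```python
# Illustrative sketch of classical angular differential imaging (ADI):
# the quasi-static speckle pattern is estimated as the temporal median
# of the raw image cube, subtracted, and the residuals are derotated by
# the parallactic angles and stacked so that a real companion (which
# rotates with the field) would add up coherently.
import numpy as np
from scipy.ndimage import rotate

def adi_median_subtract(cube, parallactic_angles):
    """cube: (n_frames, ny, nx) array; angles in degrees."""
    speckle_model = np.median(cube, axis=0)        # quasi-static speckles
    residuals = cube - speckle_model               # companion signal survives
    derotated = np.stack([
        rotate(frame, -ang, reshape=False, order=1)
        for frame, ang in zip(residuals, parallactic_angles)
    ])
    return derotated.mean(axis=0)                  # final detection map

# Synthetic demo: noise plus a static "speckle" pattern (no planet).
rng = np.random.default_rng(0)
angles = np.linspace(0.0, 60.0, 20)
cube = rng.normal(0.0, 0.05, (20, 64, 64))
cube += np.exp(-((np.arange(64) - 32) ** 2)[:, None] / 50
               - ((np.arange(64) - 32) ** 2)[None, :] / 50)
final = adi_median_subtract(cube, angles)
print(final.shape)  # (64, 64)
```

Because the field rotates with the parallactic angle while the instrumental speckles do not, derotating the residuals aligns any real companion before stacking while smearing residual speckle noise.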

    First performance measurements with the Analysis Grand Challenge

    The IRIS-HEP Analysis Grand Challenge (AGC) is designed to be a realistic environment for investigating how analysis methods scale to the demands of the HL-LHC. The analysis task is based on publicly available Open Data and allows for comparing the usability and performance of different approaches and implementations. It includes all relevant workflow aspects from data delivery to statistical inference. The reference implementation for the AGC analysis task is heavily based on tools from the HEP Python ecosystem. It makes use of novel pieces of cyberinfrastructure and modern analysis facilities in order to address the data processing challenges of the HL-LHC. This contribution compares multiple different analysis implementations and studies their performance. Differences between the implementations include the use of multiple data delivery mechanisms and caching setups for the analysis facilities under investigation. Comment: Submitted as proceedings for the 21st International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2022) to Journal of Physics: Conference Series.

    Investigating parallel multi-step vibration processing pipelines for planetary stage fault detection in wind turbine drivetrains

    This paper proposes a signal processing approach for wind turbine gearbox vibration signals based on employing multiple analysis pipelines. These so-called pipelines consist of combinations of various advanced signal processing methods that have been proven effective in the literature when applied to wind turbine vibration signals. The performance of the pipelines is examined on vibration data containing different wind turbine gearbox faults in the planetary stages. Condition indicators are extracted from every pipeline to evaluate the fault detection capability for such incipient failures. The results indicate that the multipronged approach with the different pipelines increases the reliability of successfully detecting incipient planetary stage gearbox faults. The type, location, and severity of the fault influence the choice of the appropriate processing method combination. It is therefore often insufficient to utilize only a single processing pipeline for vibration analysis of wind turbine gearbox faults. Besides investigating the performance of the different processing techniques, the main outcome and recommendation of this paper is thus to employ a diversified analysis methodology, not limited to a sole method combination, to improve the early detection rate of planetary stage gearbox faults.
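One plausible instance of such a multi-step pipeline is band-pass filtering, Hilbert envelope demodulation, and a kurtosis-based condition indicator. The abstract does not name its exact method combinations, so the filter bands and indicator chosen here are assumptions for illustration only:

```python
# Hypothetical single pipeline: band-pass filter -> Hilbert envelope ->
# kurtosis of the envelope as an impulsiveness (fault) indicator.
# Running it with several band choices mimics the parallel-pipeline idea.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from scipy.stats import kurtosis

def envelope_kurtosis(x, fs, band):
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, x)
    envelope = np.abs(hilbert(filtered))   # demodulated envelope
    return kurtosis(envelope)              # high values suggest impacts

fs = 10_000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)
healthy = rng.normal(size=t.size)
# Simulated fault: short periodic impacts superimposed on the noise.
faulty = healthy + 5 * (np.sin(2 * np.pi * 20 * t) > 0.999)

for band in [(500, 1500), (2000, 4000)]:
    print(band, envelope_kurtosis(faulty, fs, band))
```

Comparing indicators from several such pipelines, rather than trusting one band and one method, is exactly the diversification the paper recommends.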

    Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify diagnostic information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, requiring immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation of an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that the ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
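The coherence analysis singled out above can be illustrated with standard tools. The actual ASAL routines are not public, so the signals and frequencies here are synthetic stand-ins:

```python
# Magnitude-squared coherence between two sensor channels: frequency
# bands where coherence is high indicate a shared mechanical source,
# the basis of coherence-based anomaly screening.
import numpy as np
from scipy.signal import coherence

fs = 2048
t = np.arange(0, 8.0, 1 / fs)
rng = np.random.default_rng(2)
common = np.sin(2 * np.pi * 120 * t)     # shared 120 Hz component
ch_a = common + rng.normal(scale=1.0, size=t.size)
ch_b = common + rng.normal(scale=1.0, size=t.size)

f, Cxy = coherence(ch_a, ch_b, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(Cxy)]
print(peak_freq)  # expect roughly 120 Hz
```

In an automated system, a rule engine can then flag bands whose coherence drifts away from a healthy baseline.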

    Practical application of random vibration signal analysis on structural dynamics

    The use of random signal analysis in vibration and structural analysis is discussed and documented. The need to minimize product performance and durability risks due to competitive pressure in the marketplace has forced industry to streamline methods of product design, processing, and evaluation. Engineers need to be experienced in much more advanced techniques than they were even a decade ago. Random vibration signals are used to simulate real-world environments and are another tool in structural analysis. This investigation outlines the basics of random vibrations, frequency analyzers, and data analysis, along with the associated noise and error problems. A case study using an aluminum beam is used to demonstrate the concepts presented in this work.
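The frequency-analyzer workflow outlined above comes down to power spectral density (PSD) estimation. A minimal sketch, assuming a simulated response with a resonance near 400 Hz rather than the paper's aluminum-beam data:

```python
# Welch's method estimates the PSD of a random vibration record;
# averaging over segments trades frequency resolution against the
# random error (variance) of the estimate.
import numpy as np
from scipy.signal import welch

fs = 5000
t = np.arange(0, 4.0, 1 / fs)
rng = np.random.default_rng(3)
# Broadband random excitation plus a strong tonal response near 400 Hz.
response = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 400 * t)

f, psd = welch(response, fs=fs, nperseg=2048)
print(f[np.argmax(psd)])  # the 400 Hz component dominates the PSD
```

Doubling `nperseg` halves the bin width but also halves the number of averaged segments, which is the resolution-versus-random-error trade-off the paper discusses.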

    A Common Knowledge Engineering Framework for Data Assimilation, Correlation, and Extrapolation (DACE)

    The Common Knowledge Engineering Framework for Data Assimilation, Correlation, and Extrapolation (DACE) project is focused on providing a software-centric general framework for advanced processing and analysis of data. This means that researchers, scientists, engineers, and system architects do not have to program a new application but rather define the system configuration, process, and processing needed to perform a specific functionality. The application is thus limited only by the end user's ability to fully define the functional requirements and set up the framework accordingly. This doctoral project provides the details of the system definition, standards, metrics, schedule, and evaluation that were utilized in the performance of this project. The project's framework allows multiple analysis methods to be utilized either individually or concurrently, depending on the end user's configuration. The architecture does not place limitations on what can be done; it allows the end user to configure and define the analysis method to use.
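The configuration-over-programming idea can be sketched as a method registry driven by a user-supplied config. All names here are invented; the abstract does not describe DACE's actual schema or implementation:

```python
# Toy framework: the user declares which registered analysis methods to
# run (individually or together); no new application code is written.
import statistics

REGISTRY = {
    "mean": statistics.mean,
    "stdev": statistics.stdev,
    "max": max,
}

def run_pipeline(config, data):
    # Dispatch each configured method name to its implementation.
    return {name: REGISTRY[name](data) for name in config["methods"]}

config = {"methods": ["mean", "max"]}
print(run_pipeline(config, [1.0, 2.0, 3.0, 10.0]))  # {'mean': 4.0, 'max': 10.0}
```

The framework's limit is then exactly what the abstract states: the user's ability to express the required functionality in the configuration.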

    Does Body Mass Index Modify Memory, Reasoning, and Speed of Processing Training Effects in Older Adults

    Objective To describe 10-year trajectories of cognitive performance by body mass index (BMI) class and to investigate BMI differences in response to memory, reasoning, and speed of processing training in older adults. Methods This is a secondary analysis of the multisite, randomized trial Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE). There were 701 older adults with normal weight, 1,081 with overweight, and 902 with obesity (mean age 73.6) randomized to memory training, reasoning training, speed of processing training, or a no-training control group. Participants completed memory, reasoning, and speed of processing tests. Baseline sociodemographic, health, and chronic disease measures were included as covariates in analyses. Results The 10-year trajectories of memory, reasoning, and speed of processing performance did not differ by BMI status among the participants randomized to the untrained control arm. The training effect on the reasoning and speed of processing outcomes did not differ by BMI status. The training effect on the memory outcome in participants with a BMI indicating obesity, however, was just 38% of that observed in participants with normal-weight BMI. Conclusions These analyses of data from the largest trial of cognitive training ever conducted suggest that older adults with obesity may be less responsive to memory training.

    A Unified Approach for Comprehensive Analysis of Various Spectral and Tissue Doppler Echocardiography

    Doppler echocardiography offers critical insights into cardiac function and phases by quantifying blood flow velocities and evaluating myocardial motion. However, previous methods for automating Doppler analysis, ranging from early signal processing techniques to advanced deep learning approaches, have been constrained by their reliance on electrocardiogram (ECG) data and their inability to process Doppler views collectively. We introduce a novel unified framework using a convolutional neural network for comprehensive analysis of spectral and tissue Doppler echocardiography images that combines automatic measurements and end-diastole (ED) detection into a single method. The network automatically recognizes key features across various Doppler views, with novel Doppler shape embedding and anti-aliasing modules enhancing interpretation and ensuring consistent analysis. Empirical results show consistent gains in performance metrics, including the Dice similarity coefficient (DSC) and intersection over union (IoU). The proposed framework demonstrates strong agreement with clinicians in automatic Doppler measurements and competitive performance in ED detection.
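The reported metrics, DSC and IoU, have standard definitions for binary masks. A minimal sketch (this is generic evaluation code, not the authors' implementation):

```python
# Dice similarity coefficient and intersection over union for binary
# segmentation masks, as commonly defined.
import numpy as np

def dice(a, b):
    inter = np.logical_and(a, b).sum()
    return 2 * inter / (a.sum() + b.sum())   # 2|A∩B| / (|A|+|B|)

def iou(a, b):
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union                     # |A∩B| / |A∪B|

pred = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)
true = np.array([[1, 0, 0], [0, 1, 1]], dtype=bool)
print(dice(pred, true), iou(pred, true))  # ≈0.667 and 0.5
```

DSC always exceeds IoU for partial overlap (DSC = 2·IoU/(1+IoU)), so the two metrics rank methods identically but differ in scale.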

    Dual-Use Space Technology Transfer Conference and Exhibition

    This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.

    Space/time/frequency methods in adaptive radar

    Radar signals may be processed with various space, time, and frequency techniques. Advanced radar systems are required to detect targets in the presence of jamming and clutter. This work studies two such applications. It is well known that targets moving along-track within a Synthetic Aperture Radar (SAR) field of view are imaged as defocused objects. The SAR stripmap mode is tuned to stationary ground targets, and the mismatch between the SAR processing parameters and the target motion parameters causes the energy to spill over to adjacent image pixels, thus hindering target feature extraction and reducing the probability of detection. The problem can be remedied by generating the image using a filter matched to the actual target motion parameters, effectively focusing the SAR image on the target. For a fixed rate of motion, the target velocity can be estimated from the slope of the Doppler frequency characteristic. The problem is similar to the classical problem of estimating the instantaneous frequency of a linear FM signal (chirp). The Wigner-Ville distribution, the Gabor expansion, the short-time Fourier transform, and the continuous wavelet transform are compared with respect to their performance in estimating the instantaneous Doppler frequency of noisy range-compressed SAR data. It is shown that these methods exhibit sharp signal-to-noise threshold effects.

    The space-time radar problem is well suited to the application of techniques that take advantage of the low-rank property of the space-time covariance matrix. It is shown that reduced-rank methods outperform full-rank space-time adaptive processing when the space-time covariance matrix is estimated from a dataset with limited support. The utility of reduced-rank methods is demonstrated by theoretical analysis, simulations, and analysis of real data. It is shown that reduced-rank processing has two effects on performance: increased statistical stability, which tends to improve performance, and the introduction of a bias, which lowers the signal-to-noise ratio. A method for evaluating the theoretical conditioned SNR for fixed reduced-rank transforms is also presented.
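The instantaneous-frequency estimation problem described for the SAR case can be sketched with the short-time Fourier transform, one of the four methods compared. The chirp parameters and noise level here are invented for illustration:

```python
# A moving target's Doppler history is modeled as a linear FM chirp;
# the per-frame STFT spectral peak traces its instantaneous frequency,
# and the slope of that track gives the chirp rate (hence, in the SAR
# problem, the target's motion parameters).
import numpy as np
from scipy.signal import stft, chirp

fs = 1000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(4)
signal = chirp(t, f0=50, f1=250, t1=1.0) + rng.normal(scale=0.3, size=t.size)

f, seg_t, Z = stft(signal, fs=fs, nperseg=128)
inst_freq = f[np.argmax(np.abs(Z), axis=0)]   # spectral peak per frame

slope = np.polyfit(seg_t, inst_freq, 1)[0]
print(slope)  # roughly 200 Hz/s for this 50 -> 250 Hz chirp
```

At low SNR the per-frame peak starts jumping to noise bins, which is one way the sharp signal-to-noise threshold effect noted above manifests itself.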