Production of Reliable Flight Crucial Software: Validation Methods Research for Fault Tolerant Avionics and Control Systems Sub-Working Group Meeting
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software was considered, and thirteen software development projects were discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
Fluorescent amplification for next generation sequencing (FA-NGS) library preparation.
BACKGROUND: Next-generation sequencing (NGS) has become a universal practice in modern molecular biology. As the throughput of sequencing experiments increases, the preparation of conventional multiplexed libraries becomes more labor-intensive. Conventional library preparation typically requires quality control (QC) testing for individual libraries, such as evaluation of amplification success and quantification, none of which occurs until the end of the library preparation process. RESULTS: In this study, we address the need for a more streamlined high-throughput NGS workflow by tethering real-time quantitative PCR (qPCR) to conventional workflows to save time and implement single-tube, single-reagent QC. We modified two distinct library preparation workflows by replacing PCR and quantification with qPCR using SYBR Green I. qPCR enabled individual library quantification for pooling in a single tube without the need for additional reagents. Additionally, a melting curve analysis was implemented as an intermediate QC test to confirm successful amplification. Sequencing analysis showed comparable percentages of reads for each indexed library, demonstrating that pooling calculations based on qPCR allow for an even representation of sequencing reads. To aid the modified workflow, a software toolkit was developed and used to generate pooling instructions and analyze qPCR and melting curve data. CONCLUSIONS: We successfully applied fluorescent amplification for next-generation sequencing (FA-NGS) library preparation to both plasmids and bacterial genomes. Because qPCR is used for quantification and the workflow proceeds directly to library pooling, the modified library preparation has fewer overall steps; we therefore speculate that the FA-NGS workflow has less risk of user error. The melting curve analysis provides the necessary QC test to identify and troubleshoot library failures prior to sequencing. While this study demonstrates the value of FA-NGS for plasmid or gDNA libraries, we speculate that its versatility could lead to successful application across other library types.
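The pooling arithmetic the abstract alludes to can be sketched in a few lines. This is a hypothetical illustration, not the authors' toolkit: assuming roughly 100% amplification efficiency, relative library concentration scales as 2^(-Cq), and each library's pipetting volume is chosen inversely proportional to its concentration so all libraries contribute equally to the pool.

```python
# Hypothetical sketch of qPCR-based pooling math (not the published toolkit):
# relative concentration is inferred from the quantification cycle (Cq)
# assuming ~100% amplification efficiency, i.e. concentration ∝ 2**(-Cq).

def pooling_volumes(cq_values, total_volume_ul=20.0):
    """Return per-library volumes (µL) that equalize molar input to the pool."""
    rel_conc = {lib: 2.0 ** (-cq) for lib, cq in cq_values.items()}
    # Volume is inversely proportional to concentration.
    inv = {lib: 1.0 / c for lib, c in rel_conc.items()}
    scale = total_volume_ul / sum(inv.values())
    return {lib: v * scale for lib, v in inv.items()}

# A library that amplifies one cycle later (higher Cq) is half as
# concentrated, so it receives twice the volume.
vols = pooling_volumes({"libA": 15.0, "libB": 16.0}, total_volume_ul=30.0)
```

Here `libA`/`libB` and the Cq values are invented for illustration; a real workflow would also apply a standard curve to convert Cq to absolute concentration.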
Tsunamis generated by fast granular landslides: 3D experiments and empirical predictors
Landslides falling into water bodies can generate impulsive waves, a type of tsunami. The propagating wave may be highly destructive for hydraulic structures, civil infrastructure, and people living along the shorelines. A facility to study this phenomenon was set up in the laboratory of the Technical University of Catalonia. The set-up consists of a new device releasing granular material at high velocity into a wave basin. A system employing laser sheets and high-speed, high-definition cameras was designed to accurately measure the high velocity and geometry of the sliding mass, as well as the produced water displacement in time and space. The analysis of the experimental data helped to develop empirical relationships linking the landslide parameters to the produced wave amplitude, propagation features, and energy, which are useful tools for hazard assessment. The empirical relationships were successfully tested against the 2007 event that occurred in Chehalis Lake (Canada).
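Empirical predictors of this kind are typically obtained by regressing a dimensionless wave parameter against dimensionless slide parameters over the experimental data set. The sketch below is purely illustrative, with synthetic data and a hypothetical power-law form (the paper's actual functional forms and coefficients are not reproduced here): it fits A = a·F^b in log-log space, where F stands in for a slide parameter such as the slide Froude number and A for relative wave amplitude.

```python
# Hedged sketch with synthetic data and an assumed power-law form
# A = a * F**b; the published predictors and coefficients differ.
import math

def fit_power_law(F, A):
    """Least-squares fit of A = a * F**b via linear regression in log-log space."""
    n = len(F)
    x = [math.log(f) for f in F]
    y = [math.log(v) for v in A]
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    a = math.exp(ybar - b * xbar)
    return a, b

# Synthetic "experiments" generated from A = 0.2 * F**1.5; the fit
# recovers the coefficient and exponent exactly.
F = [0.5, 1.0, 2.0, 4.0]
A = [0.2 * f ** 1.5 for f in F]
a, b = fit_power_law(F, A)
```

With noisy laboratory measurements the same regression would return best-fit coefficients with confidence intervals rather than an exact recovery.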
One-step deposition of nano-to-micron-scalable, high-quality digital image correlation patterns for high-strain in-situ multi-microscopy testing
Digital Image Correlation (DIC) is of vital importance in the field of experimental mechanics; yet producing suitable DIC patterns for demanding in-situ mechanical tests remains challenging, especially for ultra-fine patterns, despite the large number of patterning techniques in the literature. Therefore, we propose a simple, flexible, one-step technique (requiring only a conventional deposition machine) to obtain scalable, high-quality, robust DIC patterns, suitable for a range of microscopic techniques, by deposition of a low-melting-temperature solder alloy in so-called 'island growth' mode, without elevating the substrate temperature. Proof of principle is shown by (near-)room-temperature deposition of InSn patterns, yielding highly dense, homogeneous DIC patterns over large areas with a feature size that can be tuned from as small as 10 nm to 2 µm, and with control over the feature shape and density by changing the deposition parameters. Pattern optimization, in terms of feature size, density, and contrast, is demonstrated for imaging with atomic force microscopy, scanning electron microscopy (SEM), optical microscopy, and profilometry. Moreover, the performance of the InSn DIC patterns and their robustness to large deformations are validated in two challenging case studies of in-situ micro-mechanical testing: (i) self-adaptive isogeometric digital height correlation of optical surface height profiles of a coarse, bimodal InSn pattern providing microscopic 3D deformation fields (illustrated for delamination of aluminum interconnects on a polyimide substrate) and (ii) DIC on SEM images of a much finer InSn pattern allowing quantification of high strains near fracture locations (illustrated for rupture of a Fe foil). As such, the high controllability, performance, and scalability of the DIC patterns offer a promising step towards more routine DIC-based in-situ micro-mechanical testing. (Accepted for publication in Strain.)
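The core operation such speckle patterns enable can be illustrated with a minimal integer-pixel correlation sketch. This is not the authors' correlation code (their work uses isogeometric digital height correlation and subset refinement); it only shows the basic idea: a reference subset is located in the deformed image by maximizing the zero-normalized cross-correlation (ZNCC), yielding a displacement vector.

```python
# Minimal, illustrative DIC matching step (integer-pixel, exhaustive
# search); real DIC adds subpixel interpolation and shape functions.
import random

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length intensity lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def subset(img, r, c, size):
    """Flatten a size x size subset with top-left corner (r, c)."""
    return [img[r + i][c + j] for i in range(size) for j in range(size)]

def track(ref_img, def_img, r0, c0, size):
    """Integer-pixel displacement of the subset at (r0, c0) maximizing ZNCC."""
    ref = subset(ref_img, r0, c0, size)
    best, best_d = -2.0, (0, 0)
    rows, cols = len(def_img), len(def_img[0])
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            score = zncc(ref, subset(def_img, r, c, size))
            if score > best:
                best, best_d = score, (r - r0, c - c0)
    return best_d

# Synthetic speckle image and a rigid shift of (2, 1) pixels.
random.seed(0)
ref_img = [[random.randrange(256) for _ in range(12)] for _ in range(12)]
def_img = [[ref_img[i - 2][j - 1] if i >= 2 and j >= 1 else 0
            for j in range(12)] for i in range(12)]
dr, dc = track(ref_img, def_img, 3, 3, 5)  # recovers the imposed shift
```

The quality of the speckle (density, contrast, feature size relative to the subset) is exactly what determines how sharp and unambiguous the ZNCC peak is, which is why pattern fabrication matters.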
3D mechanical analysis of aeronautical plain bearings: Validation of a finite element model from measurement of displacement fields by digital volume correlation and optical scanning tomography
On Airbus aircraft, spherical plain bearings are used on many components, in particular to link the engine to the pylon or the pylon to the wing. The design of bearings is based on the contact pressure distribution on spherical surfaces. To determine this distribution, a 3D analysis of the mechanical behaviour of an aeronautical plain bearing is presented in this paper. A numerical model has been built and validated through comparison with 3D experimental measurements of kinematic components. For that purpose, digital volume correlation (DVC) coupled with optical scanning tomography (OST) is employed to study the mechanical response of a plain bearing model made of epoxy resin. Experimental results have been compared with those obtained from the simulated model. This comparison enables us to study the influence of various boundary conditions used to build the FE model. Some factors have been highlighted, such as the fitting behaviour, which can radically change the contact pressure distribution. This work shows the contribution of a representative mechanical environment to precisely studying the mechanical response of aeronautical plain bearings.
A system performance throughput model applicable to advanced manned telescience systems
As automated space systems become more complex, autonomous, and opaque to the flight crew, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues related to total system validation are addressed. An evaluative throughput model is presented which can be used to generate a human-operator-related benchmark, or figure of merit, for a given system that involves humans at the input and output ends as well as other automated intelligent agents. The concept of sustained and accurate command/control data information transfer is introduced. The first two input parameters of the model involve nominal and off-nominal predicted events: the first calls for a detailed task analysis, while the second is a contingency event assessment. The last two required input parameters involve actual (measured) events, namely human performance and continuous semi-automated system performance. An expression combining these four parameters was found, using digital simulations and identical, representative, random data, to yield the smallest variance.
Development and Psychometric Properties of A Screening Tool for Assessing Developmental Coordination Disorder in Adults
Background: Developmental Coordination Disorder (DCD) is a neurodevelopmental disorder affecting motor coordination. Evidence suggests this disorder persists into adulthood and may be associated with biomechanical dysfunction and pain. We report on the development and initial validation of a questionnaire to assess for DCD in adults. Methods: An initial item pool (13 items) was derived from the American Psychiatric Association criteria and the World Health Organisation definition for DCD. An expert panel assessed face and content validity, which led to a 9-item Functional Difficulties Questionnaire (FDQ-9) with possible scores ranging from 9 to 36 (higher scores indicating greater functional difficulties). The FDQ-9 was piloted on individuals recruited from convenience samples. The underlying factor structure and aspects of reliability, validity, and accuracy were tested. The receiver operating characteristic curve was employed to evaluate the diagnostic accuracy of the test, using self-reported dyspraxia as the reference standard. Results: Principal axis factoring yielded a two-factor solution relating to gross and fine motor skills; for conceptual parsimony these were combined. Internal reliability was high (0.81), the mean inter-item correlation was 0.51, and preliminary findings suggested satisfactory construct validity. The area under the curve was 0.918 [95% CI 0.84-1.00], indicating a diagnostic test with high accuracy. A cut-off score was established with a sensitivity of 86% [95% CI 78%-89%] and a specificity of 81% [95% CI 73%-89%]. Test-retest reliability was good (ICC 0.96 [95% CI 0.92 to 0.98]). Conclusion: The psychometric properties of the FDQ-9 appear promising. Further psychometric evaluations on new samples are required, together with application of the scale in clinical practice.
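The two accuracy statistics reported here have simple computational definitions, sketched below with toy scores (not the FDQ-9 data): the AUC equals the probability that a randomly chosen case scores above a randomly chosen control (the Mann-Whitney formulation), and one common way to choose a cut-off, assumed here for illustration, is maximizing Youden's J = sensitivity + specificity - 1.

```python
# Illustrative AUC and cut-off selection on invented scores; the actual
# FDQ-9 sample and chosen cut-off are not reproduced here.

def auc(cases, controls):
    """Mann-Whitney AUC: P(random case score > random control score)."""
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

def best_cutoff(cases, controls):
    """Cut-off maximizing Youden's J, scoring >= t as test-positive."""
    best_j, best_t = -1.0, None
    for t in sorted(set(cases + controls)):
        sens = sum(c >= t for c in cases) / len(cases)
        spec = sum(k < t for k in controls) / len(controls)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

cases = [30, 28, 25, 24, 20]      # toy scores: self-reported dyspraxia
controls = [22, 18, 15, 14, 12]   # toy scores: no reported dyspraxia
```

In practice the cut-off choice also weighs the clinical costs of false positives versus false negatives, not J alone.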
Evaluating the Differences of Gridding Techniques for Digital Elevation Models Generation and Their Influence on the Modeling of Stony Debris Flows Routing: A Case Study From Rovina di Cancia Basin (North-Eastern Italian Alps)
Debris flows are among the most hazardous phenomena in mountain areas. To cope with debris flow hazard, it is common to delineate the risk-prone areas through routing models. The most important input to debris flow routing models is the topographic data, usually in the form of Digital Elevation Models (DEMs). The quality of DEMs depends on the accuracy, density, and spatial distribution of the sampled points; on the characteristics of the surface; and on the applied gridding methodology. Therefore, the choice of the interpolation method affects the realistic representation of the channel and fan morphology, and thus potentially the debris flow routing modeling outcomes. In this paper, we initially investigate the performance of common interpolation methods (i.e., linear triangulation, natural neighbor, nearest neighbor, Inverse Distance to a Power, ANUDEM, Radial Basis Functions, and ordinary kriging) in building DEMs of the complex topography of a debris flow channel located in the Venetian Dolomites (North-eastern Italian Alps), using small-footprint full-waveform Light Detection And Ranging (LiDAR) data. The investigation is carried out through a combination of statistical analysis of vertical accuracy, algorithm robustness, spatial clustering of vertical errors, and a multi-criteria shape reliability assessment. After that, we examine the influence of the tested interpolation algorithms on the performance of a Geographic Information System (GIS)-based cell model for simulating stony debris flow routing. In detail, we investigate both the correlation between the uncertainty in DEM heights resulting from the gridding procedure and that of the corresponding simulated erosion/deposition depths, and the effect of the interpolation algorithms on simulated areas, erosion and deposition volumes, solid-liquid discharges, and channel morphology after the event. The comparison among the tested interpolation methods highlights that the ANUDEM and ordinary kriging algorithms are not suitable for building DEMs of complex topography. Conversely, linear triangulation, the natural neighbor algorithm, and the thin-plate spline plus tension and completely regularized spline functions ensure the best trade-off between accuracy and shape reliability. Nevertheless, the evaluation of the effects of gridding techniques on debris flow routing modeling reveals that the choice of the interpolation algorithm does not significantly affect the model outcomes.
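The evaluation procedure described above amounts to gridding scattered elevation samples with different interpolators and scoring each by its vertical error at held-out check points. The sketch below illustrates that workflow on a smooth synthetic surface with two of the simpler methods mentioned (nearest neighbor and Inverse Distance Weighting); the paper's full method set, LiDAR data, and shape-reliability criteria are not reproduced here.

```python
# Illustrative interpolator comparison on a synthetic surface (not the
# study's LiDAR data): grid by two methods, score by vertical RMSE at
# independent check points.
import math
import random

def nearest(samples, x, y):
    """Elevation of the closest sample point (nearest-neighbor gridding)."""
    return min(samples, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)[2]

def idw(samples, x, y, power=2.0):
    """Inverse Distance Weighted elevation estimate over all samples."""
    num = den = 0.0
    for px, py, z in samples:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return z  # exact hit on a sample point
        w = d2 ** (-power / 2.0)
        num += w * z
        den += w
    return num / den

def rmse(pred, true):
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

def surface(x, y):
    """Smooth synthetic 'terrain' standing in for the real topography."""
    return math.sin(x) + math.cos(y)

random.seed(1)
samples = [(x, y, surface(x, y))
           for x, y in [(random.uniform(0, 3), random.uniform(0, 3))
                        for _ in range(200)]]
checks = [(random.uniform(0, 3), random.uniform(0, 3)) for _ in range(50)]
true = [surface(x, y) for x, y in checks]
err_nn = rmse([nearest(samples, x, y) for x, y in checks], true)
err_idw = rmse([idw(samples, x, y) for x, y in checks], true)
```

On real channel topography, the study shows that accuracy statistics like these must be complemented by shape-reliability checks, since a low RMSE can still hide a poorly rendered channel or fan morphology.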