
    Cross-layer system reliability assessment framework for hardware faults

    System reliability estimation during early design phases facilitates informed decisions for the integration of effective protection mechanisms against different classes of hardware faults. When not all system abstraction layers (technology, circuit, microarchitecture, software) are factored into such an estimation model, the delivered reliability reports are excessively pessimistic and thus lead to unacceptably expensive, over-designed systems. We propose a scalable, cross-layer methodology and a supporting suite of tools for accurate yet fast estimation of computing system reliability. The backbone of the methodology is a component-based Bayesian model, which calculates system reliability from the masking probabilities of individual hardware and software components, considering their complex interactions. Our detailed experimental evaluation for different technologies, microarchitectures, and benchmarks demonstrates that the proposed model delivers very accurate reliability estimates (FIT rates) compared to statistically significant but slow fault-injection campaigns at the microarchitecture level.
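    The core idea of combining per-component masking probabilities into a system-level FIT rate can be illustrated with a minimal sketch. This is not the paper's actual Bayesian model: the components, FIT rates, and masking probabilities below are hypothetical, and masking is assumed independent across layers for simplicity.

```python
# Minimal sketch (hypothetical numbers, independence assumed): derate a
# component's raw FIT rate by the probability that a fault escapes
# masking at every abstraction layer, then sum over components.

def effective_fit(raw_fit, masking_probs):
    """Scale a raw FIT rate by the probability a fault is NOT masked
    at any layer (layers assumed independent)."""
    survive = 1.0
    for p in masking_probs:
        survive *= (1.0 - p)      # fault must escape masking at every layer
    return raw_fit * survive

# Hypothetical components: (raw FIT, [circuit, uarch, software] masking).
components = {
    "regfile": (100.0, [0.60, 0.40, 0.70]),
    "alu":     (50.0,  [0.50, 0.30, 0.80]),
}

system_fit = sum(effective_fit(fit, masks)
                 for fit, masks in components.values())
print(round(system_fit, 2))
```

    In a real cross-layer model the masking probabilities interact (e.g. a software workload masks different faults depending on the microarchitectural state), which is what the component-based Bayesian formulation captures.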

    Deriving Models for Software Project Effort Estimation By Means of Genetic Programming

    Keywords: software engineering, effort estimation, genetic programming, symbolic regression. This paper presents the application of a computational intelligence methodology to effort estimation for software projects. Namely, we apply a genetic programming model for symbolic regression, aiming to produce mathematical expressions that (1) are highly accurate and (2) can be used for estimating development effort by revealing relationships between a project's features and the required work. We chose to investigate the effectiveness of this methodology in two software engineering domains. The system proved able to generate models in the form of handy mathematical expressions that are more accurate than those found in the literature.
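    The genetic-programming-for-symbolic-regression setting can be sketched in miniature. This toy loop is not the paper's system: the expression grammar, operators, fitness function, and the one-feature "effort = f(size)" target below are all illustrative assumptions.

```python
import random, operator

# Toy GP for symbolic regression (illustrative only). Expressions are
# nested tuples (op, left, right); leaves are the variable "x" or a
# numeric constant. Fitness is mean squared error on the data.

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", round(random.uniform(-2, 2), 2)])
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if not isinstance(tree, tuple):
        return tree                       # numeric constant leaf
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mse(tree, data):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data) / len(data)

def mutate(tree):
    # Crude mutation: replace the whole tree half the time.
    return random_tree() if random.random() < 0.5 else tree

# Hypothetical target relationship: effort = 2 * size + 1.
data = [(x, 2 * x + 1) for x in range(1, 6)]
random.seed(0)
pop = [random_tree() for _ in range(200)]
for _ in range(30):                       # keep the fitter half, mutate copies
    pop.sort(key=lambda t: mse(t, data))
    pop = pop[:100] + [mutate(t) for t in pop[:100]]
best = min(pop, key=lambda t: mse(t, data))
```

    A production GP system would also use subtree crossover and depth limits; the appeal noted in the abstract is that the winning tree reads off as a plain mathematical expression.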

    An estimate of necessary effort in the development of software projects

    International Workshop on Intelligent Technologies for Software Engineering (WITSE'04), 19th IEEE International Conference on Automated Software Engineering (Linz, Austria, September 20-25, 2004). The estimation of effort in the development of software projects has already been studied in the field of software engineering. For this purpose, different measures, such as lines of code and function points, generally intended to relate software size to project cost (effort), have been used. In this work we present a research project in this field, using machine learning techniques to predict software project cost. Several public data sets are used. The analysed data sets relate only the effort invested in the development of software projects to the size of the resulting code; in this sense, the available data are poor. Despite that, the results obtained are good, as they improve on those of previous analyses. To get results closer to reality, we would need larger data sets that take more variables into account, offering more possibilities to obtain solutions more efficiently.
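    The size-versus-effort setting the abstract describes can be sketched with the simplest possible learner. This is not the paper's machine learning technique, and the figures below are hypothetical, not drawn from the public data sets it uses.

```python
# Minimal sketch: fit effort = a * size + b by ordinary least squares
# on a toy size/effort table (all numbers hypothetical).

def fit_linear(sizes, efforts):
    n = len(sizes)
    mx = sum(sizes) / n
    my = sum(efforts) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(sizes, efforts))
         / sum((x - mx) ** 2 for x in sizes))
    return a, my - a * mx                 # slope, intercept

sizes   = [10, 25, 40, 60, 80]            # KLOC (hypothetical)
efforts = [24, 55, 89, 130, 172]          # person-months (hypothetical)
a, b = fit_linear(sizes, efforts)
predicted = a * 50 + b                    # estimate for a 50 KLOC project
```

    The abstract's point about "poor" data is visible here: with size as the only feature, no learner, however sophisticated, can account for team, domain, or process effects.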

    Elicitation of structured engineering judgement to inform a focussed FMEA

    The practical use of Failure Mode and Effects Analysis (FMEA) has been criticised because it is often implemented too late and in a manner that does not allow information to be fed back to inform the product design. Lessons learnt from the use of elicitation methods to gather structured expert judgement about engineering concerns for a new product design have led to an enhancement of the approach for implementing design and process FMEA. We refer to this variant as a focussed FMEA, since the goal is to enable relevant engineers to contribute to the analysis and to act upon the outcomes in such a way that all activities focus upon the design needs. The paper begins with a review of the proposed process to identify and quantify engineering concerns. The pros and cons of using elicitation methods, originally designed to support construction of a Bayesian prior, to inform a focussed FMEA are analysed, and a comparison of the proposed process with the existing standards is made. An industrial example is presented to illustrate customisation of the process and to discuss the impact on the design process.
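    The standard quantification step that a design or process FMEA builds on can be shown in a few lines. The failure modes and scores below are hypothetical, and this sketch shows only the conventional Risk Priority Number ranking, not the paper's elicitation-based enhancement.

```python
# Sketch of the conventional FMEA quantification step: rank failure
# modes by Risk Priority Number = severity * occurrence * detection
# (all modes and 1-10 scores below are hypothetical).

failure_modes = [
    # (description, severity, occurrence, detection), each scored 1-10
    ("seal leaks under thermal cycling", 8, 4, 6),
    ("connector fatigue fracture",       9, 2, 3),
    ("sensor drift out of tolerance",    5, 6, 7),
]

ranked = sorted(
    ((s * o * d, name) for name, s, o, d in failure_modes),
    reverse=True,
)
for rpn, name in ranked:
    print(rpn, name)
```

    The focussed-FMEA argument is precisely that such scores are more useful when elicited early and in a structured way from the engineers who can act on them.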

    Validity of telemetric-derived measures of heart rate variability: a systematic review

    Heart rate variability (HRV) is a widely accepted indirect measure of autonomic function with widespread application across many settings. Although traditionally measured from the 'gold standard' criterion electrocardiography (ECG), the development of wireless telemetric heart rate monitors (HRMs) extends the scope of HRV measurement. However, the validity of telemetric-derived data against criterion ECG data is unclear. Thus, the purpose of this study was twofold: (a) to systematically review the validity of telemetric HRM devices to detect inter-beat intervals and aberrant beats; and (b) to determine the accuracy of HRV parameters computed from HRM-derived inter-beat interval time series data against criterion ECG-derived data in healthy adults aged 19 to 62 years. A systematic review of research evidence was conducted. Four electronic databases were accessed to obtain relevant articles (PubMed, EMBASE, MEDLINE and SPORTDiscus). Articles published in English between 1996 and 2016 were eligible for inclusion. Outcome measures included temporal and power spectral indices (Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology, 1996). The review confirmed that modern HRMs (Polar® V800™ and Polar® RS800CX™) accurately detected inter-beat interval time-series data. The HRV parameters computed from the HRM-derived time series data were interchangeable with the ECG-derived data. The accuracy of the automatic in-built manufacturer error detection and the HRV algorithms was not established.
    Notwithstanding acknowledged limitations (a single reviewer, language bias, and the restricted selection of HRV parameters), we conclude that modern Polar® HRMs offer a valid and useful alternative to the ECG for the acquisition of inter-beat interval time series data, and that the HRV parameters computed from Polar® HRM-derived inter-beat interval time series data accurately reflect ECG-derived HRV metrics, provided the inter-beat interval data are processed and analysed using identical protocols, validated algorithms and software, particularly under controlled and stable conditions.
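    Two of the standard time-domain HRV parameters compared in such validation studies, SDNN and RMSSD, are straightforward to compute from an inter-beat (RR) interval series. The interval values below are illustrative, not measured data from any of the reviewed studies.

```python
import math

# Sketch of two standard time-domain HRV parameters computed from an
# RR-interval series in milliseconds (values below are illustrative).

def sdnn(rr):
    """Standard deviation of the RR intervals."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 835, 801, 820, 795]       # ms, hypothetical series
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

    Because both metrics are computed directly from the interval series, identical artefact-correction and analysis protocols on the HRM and ECG sides matter as much as the accuracy of beat detection itself.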

    On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis

    In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has been increasing in recent years. Automated image-based inspections of the structural surface provide valuable data for extracting quantitative information about deteriorations, such as crack patterns. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components; this enables the transfer to structural analysis. This approach sets two further requirements: availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype thus allows interaction at subsequent model stages. The latter has the advantage that interpretation steps can be separated individually into a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies.
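    The core interpretation step, attaching quantified damage artifacts to the building components they belong to so that structural analysis can consume them, can be sketched as a small data model. The class and field names below are hypothetical and do not reproduce the paper's IFC schema mapping.

```python
from dataclasses import dataclass, field

# Sketch of interpreted information transfer: crack observations from
# image-based inspection are attached to building components, so a
# downstream structural-analysis tool can query them per component.
# (Names and thresholds are hypothetical, not the paper's schema.)

@dataclass
class CrackObservation:
    width_mm: float
    length_m: float

@dataclass
class Component:
    name: str                              # e.g. the label of an IFC element
    damages: list = field(default_factory=list)

    def max_crack_width(self):
        return max((d.width_mm for d in self.damages), default=0.0)

girder = Component("main girder G1")
girder.damages.append(CrackObservation(width_mm=0.3, length_m=1.2))
girder.damages.append(CrackObservation(width_mm=0.5, length_m=0.4))

# A structural-analysis back end could then flag components for review:
needs_review = girder.max_crack_width() > 0.4
```

    The user-controlled interpretation steps in the prototype sit between stages like these: deciding, for example, whether a detected surface artifact is a load-relevant crack before it enters the structural model.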