
    Understanding the Sources of Variation in Software Inspections

    In a previous experiment, we determined how various changes in three structural elements of the software inspection process (team size, and the number and sequencing of sessions) altered effectiveness and interval. Our results showed that such changes did not significantly influence the defect detection rate, but that certain combinations of changes dramatically increased the inspection interval. We also observed a large amount of unexplained variance in the data, indicating that other factors must be affecting inspection performance. The nature and extent of these other factors now have to be determined to ensure that they have not biased our earlier results. Also, identifying these other factors might suggest additional ways to improve the efficiency of inspection. Acting on the hypothesis that the "inputs" into the inspection process (reviewers, authors, and code units) were significant sources of variation, we modeled their effects on inspection performance. We found that they were responsible for much more variation in defect detection than was process structure. This leads us to conclude that better defect detection techniques, not better process structures, are the key to improving inspection effectiveness. The combined effects of process inputs and process structure accounted for only a small percentage of the variance in inspection interval. Therefore, there still remain other factors which need to be identified. (Also cross-referenced as UMIACS-TR-97-22)
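
    A minimal sketch of the kind of variance decomposition the abstract describes, assuming a hypothetical per-inspection dataset with columns defects_found, team_size, sessions, reviewer, author, and code_unit (these names are illustrative, not the study's actual data):

```python
# Illustrative variance decomposition: how much variation in defect detection
# is attributable to process structure vs. process inputs (hypothetical data).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Assumed schema: one row per inspection.
inspections = pd.read_csv("inspections.csv")

# Linear model with structure factors (team_size, sessions) and
# input factors (reviewer, author, code_unit) as categoricals.
model = ols(
    "defects_found ~ C(team_size) + C(sessions)"
    " + C(reviewer) + C(author) + C(code_unit)",
    data=inspections,
).fit()

# Type-II ANOVA: compare the sum of squares attributable to each factor.
print(sm.stats.anova_lm(model, typ=2))
print("R-squared:", model.rsquared)  # unexplained variance = 1 - R^2
```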

    On systematic approaches for interpreted information transfer of inspection data from bridge models to structural analysis

    In conjunction with improved methods of monitoring damage and degradation processes, interest in the reliability assessment of reinforced concrete bridges has increased in recent years. Automated image-based inspections of the structural surface provide valuable data from which quantitative information about deteriorations, such as crack patterns, can be extracted. However, the knowledge gain results from processing this information in a structural context, i.e. relating the damage artifacts to building components. This way, transformation to structural analysis is enabled. This approach sets two further requirements: availability of structural bridge information and standardized storage for interoperability with subsequent analysis tools. Since the large datasets involved can only be processed efficiently in an automated manner, this work targets the implementation of the complete workflow from damage and building data to structural analysis. First, domain concepts are derived from the back-end tasks: structural analysis, damage modeling, and life-cycle assessment. The common interoperability format, the Industry Foundation Classes (IFC), and the processes in these domains are further assessed. The need for user-controlled interpretation steps is identified, and the developed prototype thus allows interaction at subsequent model stages. The latter has the advantage that interpretation steps can be separated into either a structural analysis model, a damage information model, or a combination of both. This approach to damage information processing from the perspective of structural analysis is then validated in different case studies
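
    As a rough illustration of the interoperability step, the sketch below reads building components from an IFC file with the open-source ifcopenshell library and pairs them with externally detected crack records. The crack-record schema (element_guid, width_mm, length_mm) is invented for illustration; only the ifcopenshell calls are real, and this is not the paper's actual prototype:

```python
# Illustrative linking of image-based damage records to IFC building components.
import ifcopenshell

model = ifcopenshell.open("bridge.ifc")

# Hypothetical output of an image-based crack detector, keyed by element GUID.
crack_records = [
    {"element_guid": "2O2Fr$t4X7Zf8NOew3FLKI", "width_mm": 0.4, "length_mm": 320},
]

# Index structural elements by their IFC GlobalId.
elements = {e.GlobalId: e for e in model.by_type("IfcElement")}

# Relate each damage artifact to its building component for structural analysis.
for crack in crack_records:
    element = elements.get(crack["element_guid"])
    if element is not None:
        print(f"{element.is_a()} '{element.Name}': crack width {crack['width_mm']} mm")
```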

    Industrial implementation of intelligent system techniques for nuclear power plant condition monitoring

    As the nuclear power plants within the UK age, there is an increased requirement for condition monitoring to ensure that the plants are still able to operate safely. This paper describes the novel application of Intelligent Systems (IS) techniques to provide decision support for the condition monitoring of Nuclear Power Plant (NPP) reactor cores within the UK. The resulting system, BETA (British Energy Trace Analysis), is deployed within the UK’s nuclear operator and provides automated decision support for the analysis of refuelling data, a lead indicator of the health of AGR (Advanced Gas-cooled Reactor) nuclear power plant cores. The key contribution of this work is the improvement of existing manual, labour-intensive analysis through the application of IS techniques to provide decision support for NPP reactor core condition monitoring. This enables an existing source of condition monitoring data to be analysed in a rapid and repeatable manner, providing additional information relating to core health on a more regular basis than routine inspection data allows. The application of IS techniques addresses two issues with the existing manual interpretation of the data, namely the limited availability of expertise and the variability of assessment between different experts. Decision support is provided by four applications of intelligent systems techniques. Two instances of a rule-based expert system are deployed: the first automatically identifies key features within the refuelling data, and the second classifies specific types of anomaly. Clustering techniques are applied to support the definition of benchmark behaviour, which is used to detect the presence of anomalies within the refuelling data. Finally, data mining techniques are used to track the evolution of the normal benchmark behaviour over time. This results in a system that not only provides support for analysing new refuelling events but also provides the platform to allow future events to be analysed. The BETA system has been deployed within the nuclear operator in the UK and is used at both the engineering offices and on station to support the analysis of refuelling events from two AGR stations, with a view to expanding it to the rest of the fleet in the near future
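
    The clustering-based benchmark step might look roughly like the following sketch: cluster historical refuelling traces, then flag a new trace that sits far from every benchmark cluster centre. The feature vectors and the threshold are assumptions for illustration; BETA's actual implementation is not given in this abstract:

```python
# Illustrative benchmark-and-detect scheme for refuelling traces.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical standardized feature vectors extracted from refuelling traces
# (e.g., peak load, mean friction, trace duration).
historical = rng.normal(size=(200, 3))
new_trace = np.array([[4.0, 4.0, 4.0]])  # deliberately anomalous

# Define benchmark behaviour as a small set of cluster centres.
benchmark = KMeans(n_clusters=4, n_init=10, random_state=0).fit(historical)

# Distance of the new trace to its nearest benchmark centre.
distance = np.min(np.linalg.norm(benchmark.cluster_centers_ - new_trace, axis=1))
THRESHOLD = 3.0  # assumed cut-off; in practice derived from historical distances
print("anomaly" if distance > THRESHOLD else "normal", f"(distance={distance:.2f})")
```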

    WORKSTEP modernisation funds evaluation


    Monitoring water-soil dynamics and tree survival using soil sensors under a big data approach

    The high importance of green urban planning to ensure access to green areas requires modern, multi-source decision-support tools. The integration of remote sensing data and sensor developments can contribute to improved decision-making in urban forestry. This study proposes a novel big data-based methodology that combines real-time information from soil sensors and climate data to monitor the establishment of a new urban forest in semi-arid conditions. Water-soil dynamics and their implications for tree survival were analyzed considering the application of different restoration treatment techniques oriented to facilitate the recovery of tree and shrub vegetation in the degraded area. The synchronized data-capturing scheme made it possible to evaluate hourly, daily, and seasonal changes in soil-water dynamics. The spatial variation of soil-water dynamics was captured by the sensors and contributed substantially to explaining the observed ground measurements of tree survival. The methodology showed how the efficiency of treatments varied depending on species selection and across the experimental design. The use of retainers for improving soil moisture content and adjusting tree-watering needs was, on average, the most successful restoration technique. The results and the applied calibration of the sensor technology highlighted the random behavior of water-soil dynamics despite the small-scale scope of the experiment. The results showed the potential of this methodology to assess watering needs and adjust watering resources to the vegetation status using real-time atmospheric and soil data
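
    A minimal sketch of the hourly/daily aggregation such a pipeline needs, assuming a hypothetical readings table with timestamp, sensor_id, and volumetric soil_moisture columns (the schema and the watering threshold are illustrative assumptions):

```python
# Illustrative resampling of raw soil-sensor readings to hourly and daily means.
import pandas as pd

# Assumed schema: timestamped volumetric soil-moisture readings per sensor.
readings = pd.read_csv("soil_readings.csv", parse_dates=["timestamp"])
readings = readings.set_index("timestamp")

# Hourly and daily soil-moisture dynamics per sensor.
hourly = readings.groupby("sensor_id")["soil_moisture"].resample("1h").mean()
daily = readings.groupby("sensor_id")["soil_moisture"].resample("1D").mean()

# Simple watering trigger: flag sensors whose daily mean drops below a threshold.
WILTING_THRESHOLD = 0.12  # assumed volumetric water content cut-off
needs_water = daily[daily < WILTING_THRESHOLD]
print(needs_water.head())
```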

    Quality and standards in secondary initial teacher training: inspected 1999/2002


    SPC techniques in the non-manufacturing areas

    For a company to maintain optimal process stability and make the continuous improvements needed to survive and thrive, management needs to know and use certain tools and methods in quality management. Organizations which embrace TQM concepts should recognize the value of SPC techniques in areas such as sales, purchasing, invoicing, finance, distribution, and training, which lie outside production or operations, the traditional area for SPC use. A Pareto analysis, a histogram, a flow chart, or a control chart is a vehicle for communication. Statistical Process Control (SPC) is a term used to describe the set of statistical tools used by quality professionals. Before starting a new business (start-up), a proper feasibility and profitability study should be carried out and a business plan prepared for banks or potential investors in order to secure financing. Statistics can be used to produce a practical business plan which will be of interest to potential partners or financial institutions. Statistical data are used for control, storage (archiving), and retrieval. When using statistical data, a wide range of needs must be taken into account, such as: the needs of users, data quality, a complete data inventory when entering documents, data protection and security, comparability of data over time, timeliness of final data, financial implications, and public understanding and acceptance
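
    As one concrete instance of the SPC tools named above, the sketch below computes standard individuals control-chart limits for a non-manufacturing metric such as invoice processing time. The data are invented for illustration:

```python
# Illustrative individuals (X) control chart for a non-manufacturing metric,
# e.g. days taken to process each invoice. The data here are made up.
import numpy as np

days_to_process = np.array([4.2, 3.8, 5.1, 4.6, 4.0, 6.9, 4.4, 3.9, 5.0, 4.3])

centre = days_to_process.mean()
moving_range = np.abs(np.diff(days_to_process))
mr_bar = moving_range.mean()

# Standard individuals-chart limits: centre +/- 2.66 * average moving range
# (2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2).
ucl = centre + 2.66 * mr_bar
lcl = max(centre - 2.66 * mr_bar, 0.0)

print(f"CL={centre:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
for i, x in enumerate(days_to_process):
    if x > ucl or x < lcl:
        print(f"point {i} ({x}) is out of control")
```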