    Integrating testing techniques through process programming

    Integration of multiple testing techniques is required to demonstrate high software quality. Technique integration has three basic goals: incremental testing capabilities, extensive error detection, and cost-effective application. We are experimenting with the use of process programming as a mechanism for integrating testing techniques. Having set out to integrate DATA FLOW testing and RELAY, we proposed synergistic use of these techniques to achieve all three goals. We developed a testing process program much as we would develop a software product, from requirements through design to implementation and evaluation. We found process programming to be effective for explicitly integrating the techniques and achieving the desired synergism. Used in this way, process programming also mitigates many of the other problems that plague testing in the software development process.

    Lazy training of radial basis neural networks

    Proceedings of the 16th International Conference on Artificial Neural Networks (ICANN 2006), Athens, Greece, September 10-14, 2006. Usually, training data are not evenly distributed in the input space. This makes non-local methods, like neural networks, less accurate in those cases. On the other hand, local methods face the problem of knowing which examples are best for each test pattern. In this work, we present a way of striking a trade-off between local and non-local methods: a Radial Basis Neural Network (RBNN) is used as the learning algorithm, and a selection of the training patterns is made for each query. Moreover, the RBNN initialization algorithm has been modified in a deterministic way to eliminate any influence of initial conditions. Finally, the new method has been validated in two time-series domains, an artificial one and a real-world one. This article was financed by the Spanish MEC-funded research project OPLINK::UC3M, Ref: TIN2005-08818-C04-0.
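
    A minimal sketch of the lazy, query-local scheme described above, assuming Gaussian radial basis functions, k-nearest-neighbour selection of the training patterns for each query, and evenly spaced local patterns as a stand-in for the paper's deterministic center initialization (none of these details are fixed by the abstract):

        import numpy as np

        def rbf_design(X, centers, width):
            """Gaussian RBF activations for each pattern/center pair."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2.0 * width ** 2))

        def lazy_rbnn_predict(X_train, y_train, x_query, k=20, n_centers=5):
            """Train a small RBF network only on the k patterns nearest the query."""
            # Local step: select the k training patterns closest to the query.
            idx = np.argsort(((X_train - x_query) ** 2).sum(axis=1))[:k]
            Xl, yl = X_train[idx], y_train[idx]
            # Deterministic initialization (assumed form): evenly spaced
            # local patterns serve as centers instead of a random subset.
            centers = Xl[np.linspace(0, k - 1, n_centers, dtype=int)]
            width = np.mean(np.linalg.norm(Xl - Xl.mean(axis=0), axis=1)) + 1e-9
            # Non-local step: fit the output weights by least squares.
            H = rbf_design(Xl, centers, width)
            w, *_ = np.linalg.lstsq(H, yl, rcond=None)
            return float(rbf_design(x_query[None, :], centers, width) @ w)

        rng = np.random.default_rng(0)
        X = rng.uniform(-3, 3, size=(500, 1))
        y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
        print(lazy_rbnn_predict(X, y, np.array([1.0])))  # close to sin(1) = 0.84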

    Medication adherence and predictive factors in patients with cardiovascular disease: A cross-sectional study

    © 2020 John Wiley & Sons Australia, Ltd. Adherence to cardiac medications makes a significant contribution to the avoidance of morbidity and premature mortality in patients with cardiovascular disease. This quantitative study used a cross-sectional survey design to evaluate medication adherence and contributing factors among patients with cardiovascular disease, comparing patients who were admitted to a cardiac ward (n = 89) and those attending outpatient cardiac rehabilitation (n = 31) in Australia. Data collection was completed between October 2016 and December 2017. Descriptive and regression analyses were conducted to identify medication adherence and determine factors independently predictive of medication adherence. Participants from cardiac rehabilitation had significantly lower adherence to cardiac medications than those recruited from the cardiac ward (58.1% vs 64.0%, respectively). Self-efficacy was significantly associated with participants' medication adherence in both groups. The ability to refill medications and beliefs about cardiac medications were independently and significantly predictive of cardiac medication adherence. These findings indicate areas where clinical nurses could expand their role to improve cardiac patients' medication self-management.

    Behaviour change interventions to improve medication adherence in patients with cardiac disease: Protocol for a mixed methods study including a pilot randomised controlled trial

    © 2017 Australian College of Nursing Ltd. Background: Suboptimal adherence to medication increases mortality and morbidity; individually tailored supportive interventions can improve patients' adherence to their medication regimens. Aims: The study aims to use a pilot randomised controlled trial (RCT) to test the hypothesis that a theory-based, nurse-led, multi-faceted intervention comprising motivational interviewing techniques and text message reminders in addition to standard care will promote medication adherence in cardiac patients better than standard care alone. The pilot study will assess self-reported adherence or non-adherence to cardiovascular medication in patients referred to a cardiac rehabilitation program following hospital admission for an acute cardiac event, and test the feasibility of the intervention. The study will examine the role of individual, behavioural and environmental factors in predicting medication non-adherence in patients with cardiovascular disease (CVD). Methods: This is a mixed-methods study including a nested pilot RCT. Twenty-eight cardiac patients will be recruited; an estimated sample of nine patients in each group will be required for the pilot RCT to have 80% power to detect a moderate effect size at 5% significance, assuming 50% loss to follow-up over the six-month intervention. Participants will complete a paper-based survey (Phase one), followed by a brief semi-structured interview (Phase two) to identify their level of adherence to medication and determine factors predictive of non-adherence. Participants identified as 'non-adherent' will be eligible for the pilot randomised trial, where they will be randomly allocated to receive either the motivational interview plus text message reminders and standard care, or standard care alone. Discussion: Nurse-led multi-faceted interventions have the potential to enhance adherence to cardiac medications. The results of this study may have relevance for cardiac patients in other settings, and for long-term medication users with other chronic diseases.

    Feature weighting techniques for CBR in software effort estimation studies: A review and empirical evaluation

    Context: Software effort estimation is one of the most important activities in the software development process. Unfortunately, estimates are often substantially wrong. Numerous estimation methods have been proposed, including Case-based Reasoning (CBR). In order to improve CBR estimation accuracy, many researchers have proposed feature weighting techniques (FWT). Objective: Our purpose is to systematically review the empirical evidence to determine whether FWT lead to improved predictions. In addition, we evaluate these techniques from the perspectives of (i) approach, (ii) strengths and weaknesses, (iii) performance, and (iv) experimental evaluation approach, including the data sets used. Method: We conducted a systematic literature review of published, refereed primary studies on FWT (2000-2014). Results: We identified 19 relevant primary studies. These reported a range of different techniques. 17 out of 19 make benchmark comparisons with standard CBR, and 16 out of 17 studies report improved accuracy. Using a one-sample sign test, this positive impact is significant (p = 0.0003). Conclusion: The actionable conclusion from this study is that our review of all relevant empirical evidence supports the use of FWT, and we recommend that researchers and practitioners give serious consideration to their adoption.
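
    The reported significance can be reproduced with a standard two-sided sign test on the 16-of-17 outcome; a minimal check in Python (the review does not say which software it used):

        from scipy.stats import binomtest

        # Sign test: 16 of the 17 comparative studies reported improved
        # accuracy; under H0, improvement and degradation are equally likely.
        result = binomtest(16, n=17, p=0.5, alternative="two-sided")
        print(f"p = {result.pvalue:.4f}")  # 0.0003, matching the review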

    Determining appropriate approaches for using data in feature selection

    Feature selection is increasingly important in data analysis and machine learning in the big data era. However, how to use the data in feature selection, i.e. using either ALL or PART of a dataset, has become a serious and tricky issue. Whilst the conventional practice of using all the data in feature selection may lead to selection bias, using part of the data may, on the other hand, lead to underestimating the relevant features under some conditions. This paper investigates these two strategies systematically in terms of reliability and effectiveness, and then determines their suitability for datasets with different characteristics. Reliability is measured by the Average Tanimoto Index and the Inter-method Average Tanimoto Index, and effectiveness is measured by the mean generalisation accuracy of classification. The computational experiments are carried out on ten real-world benchmark datasets and fourteen synthetic datasets. The synthetic datasets are generated with a pre-set number of relevant features, varied numbers of irrelevant features and instances, and different levels of added noise. The results indicate that the PART approach is more effective in reducing the bias when the size of a dataset is small, but starts to lose its advantage as the dataset size increases.
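
    A minimal sketch of how a stability score of this kind can be computed, assuming the Average Tanimoto Index is the mean pairwise Tanimoto (Jaccard) overlap between the feature subsets selected on different resamples of a dataset (a common reading; the paper's precise definition may differ in detail):

        from itertools import combinations

        def tanimoto(a: set, b: set) -> float:
            """Tanimoto (Jaccard) index between two selected-feature sets."""
            return len(a & b) / len(a | b) if a | b else 1.0

        def average_tanimoto_index(selections: list[set]) -> float:
            """Mean pairwise Tanimoto index over the subsets a selector
            returns on different resamples of the same dataset."""
            pairs = list(combinations(selections, 2))
            return sum(tanimoto(a, b) for a, b in pairs) / len(pairs)

        # Feature subsets chosen by one selector on three resamples:
        runs = [{"f1", "f2", "f3"}, {"f1", "f2", "f4"}, {"f1", "f3", "f4"}]
        print(average_tanimoto_index(runs))  # 0.5; identical runs give 1.0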

    The Cyborg Astrobiologist: Testing a Novelty-Detection Algorithm on Two Mobile Exploration Systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    (ABRIDGED) In previous work, two platforms were developed for testing computer-vision algorithms for robotic planetary exploration (McGuire et al. 2004b, 2005; Bartolo et al. 2007). The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone-camera platform has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon color, (ii) integrate a field-capable digital microscope on the wearable-computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone-camera connected to a netbook computer at the Mars Desert Research Station in Utah. Together, this systems engineering and field testing have allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain, consisting of various rock types and colors, to test this algorithm. The algorithm robustly recognized previously observed units by their color, while requiring only a single image or a few images to learn colors as familiar, demonstrating its fast learning capability. Comment: 28 pages, 12 figures, accepted for publication in the International Journal of Astrobiology.
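
    A minimal sketch of color-based novelty detection with a Hopfield network, assuming images are reduced to binarized (+/-1) color signatures that are stored as attractors by Hebbian learning; a signature that the network's recall moves far from its input is flagged as novel. The paper's actual pipeline (microscope imagery, real-time operation) is far more elaborate:

        import numpy as np

        class HopfieldNoveltyDetector:
            def __init__(self, n_bits):
                self.n = n_bits
                self.W = np.zeros((n_bits, n_bits))

            def learn(self, pattern):
                """Hebbian storage of a +/-1 color signature as an attractor."""
                self.W += np.outer(pattern, pattern) / self.n
                np.fill_diagonal(self.W, 0.0)

            def recall(self, pattern, steps=20):
                s = pattern.copy()
                for _ in range(steps):  # synchronous updates to a fixed point
                    s_new = np.sign(self.W @ s)
                    s_new[s_new == 0] = 1
                    if np.array_equal(s_new, s):
                        break
                    s = s_new
                return s

            def is_novel(self, pattern, tol=0.1):
                """Novel if recall flips more than tol of the bits, i.e. the
                signature is not near any learned color attractor."""
                return np.mean(self.recall(pattern) != pattern) > tol

        rng = np.random.default_rng(1)
        familiar = rng.choice([-1, 1], size=64)        # binarized rock colors
        det = HopfieldNoveltyDetector(64)
        det.learn(familiar)
        print(det.is_novel(familiar))                  # False: learned before
        print(det.is_novel(rng.choice([-1, 1], 64)))   # True: unfamiliar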

    SEIR immune strategy for instance-weighted naive Bayes classification

    © Springer International Publishing Switzerland 2015. Naive Bayes (NB) has been popularly applied in many classification tasks. However, in real-world applications, the pronounced advantage of NB is often challenged by insufficient training samples. Specifically, high variance may occur with respect to a limited number of training samples: the estimated class distribution of an NB classifier is inaccurate if the number of training instances is small. To handle this issue, in this paper we propose a SEIR (Susceptible, Exposed, Infectious and Recovered) immune-strategy-based instance weighting algorithm for naive Bayes classification, namely SWNB. The immune instance weighting allows the SWNB algorithm to adjust itself to the data without explicit specification of functional or distributional forms of the underlying model. Experiments and comparisons on 20 benchmark datasets demonstrate that the proposed SWNB algorithm outperforms the existing state-of-the-art instance-weighted NB algorithm and other related computational intelligence methods.
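
    A minimal sketch of the instance-weighting idea SWNB builds on, assuming a Gaussian naive Bayes whose class priors, means, and variances are weighted averages over training instances; the SEIR immune strategy that actually produces the weights is the paper's contribution and is stood in for here by uniform placeholder weights:

        import numpy as np

        def weighted_gnb_fit(X, y, w):
            """Gaussian NB where instance i contributes with weight w[i]
            (in SWNB these weights would come from the SEIR strategy)."""
            model = {}
            for c in np.unique(y):
                Xc, wc = X[y == c], w[y == c]
                mean = np.average(Xc, axis=0, weights=wc)
                var = np.average((Xc - mean) ** 2, axis=0, weights=wc) + 1e-9
                model[c] = (wc.sum() / w.sum(), mean, var)
            return model

        def weighted_gnb_predict(model, X):
            classes = sorted(model)
            scores = []
            for c in classes:
                prior, mean, var = model[c]
                log_lik = -0.5 * (np.log(2 * np.pi * var) + (X - mean) ** 2 / var)
                scores.append(np.log(prior) + log_lik.sum(axis=1))
            return np.array(classes)[np.argmax(scores, axis=0)]

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        w = np.ones(len(y))  # placeholder: SWNB sets these adaptively
        model = weighted_gnb_fit(X, y, w)
        print((weighted_gnb_predict(model, X) == y).mean())  # ~0.99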

    Data Mining and Machine Learning in Astronomy

    We review the current state of data mining and machine learning in astronomy. 'Data mining' can have a somewhat mixed connotation from the point of view of a researcher in this field. If used correctly, it can be a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data, promising great scientific advances. However, if misused, it can be little more than the black-box application of complex computing algorithms that may give little physical insight and provide questionable results. Here, we give an overview of the entire data mining process, from data collection through to the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines; applications from a broad range of astronomy, emphasizing those where data mining techniques directly resulted in improved science; and important current and future directions, including probability density functions, parallel algorithms, petascale computing, and the time domain. We conclude that, so long as one carefully selects an appropriate algorithm and is guided by the astronomical problem at hand, data mining can be very much a powerful tool rather than a questionable black box. Comment: Published in IJMPD. 61 pages, uses ws-ijmpd.cls. Several extra figures, some minor additions to the text.
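
    As a toy illustration of the kind of supervised classification the review surveys (for example, separating source classes by photometric colors with a support vector machine), a minimal scikit-learn sketch; the class names, feature values, and separation here are invented for illustration only:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(42)
        # Synthetic two-color features for two invented source classes,
        # loosely mimicking star/galaxy color-color separation.
        stars = rng.normal([0.3, 0.1], 0.15, (500, 2))
        galaxies = rng.normal([0.8, 0.6], 0.20, (500, 2))
        X = np.vstack([stars, galaxies])
        y = np.array([0] * 500 + [1] * 500)  # 0 = star, 1 = galaxy

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")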