
    REISCH: incorporating lightweight and reliable algorithms into healthcare applications of WSNs

    Healthcare institutions require advanced technology to collect patients' data accurately and continuously. Traditional technologies still suffer from two problems: performance and security efficiency. Existing research has serious drawbacks when using public-key mechanisms such as digital signature algorithms. In this paper, we propose a Reliable and Efficient Integrity Scheme for Data Collection in HWSN (REISCH) to alleviate these problems by using secure and lightweight signature algorithms. The results of the performance analysis indicate that our scheme provides high efficiency in data integrity between sensors and the server (saving more than 24% of live sensors compared to traditional algorithms). Additionally, we use Automated Validation of Internet Security Protocols and Applications (AVISPA) to validate the security procedures in our scheme. The security analysis results confirm that REISCH is safe against several well-known attacks.
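
    The abstract does not reproduce REISCH's construction, but the general pattern of signature-based integrity for sensor readings can be sketched with a lightweight modern signature such as Ed25519. The sketch below is an illustration of that pattern only, not the REISCH protocol; the key handling, field names, and payload format are assumptions.

        # Sketch: lightweight signature-based integrity for sensor readings
        # (illustrative only; not the REISCH scheme itself).
        import json
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        # Each sensor holds a signing key; the server keeps the public half.
        sensor_key = Ed25519PrivateKey.generate()
        server_verifier = sensor_key.public_key()

        def sign_reading(reading: dict):
            """Serialize a reading deterministically and sign it."""
            payload = json.dumps(reading, sort_keys=True).encode()
            return payload, sensor_key.sign(payload)

        def verify_reading(payload: bytes, signature: bytes) -> bool:
            """Server-side integrity check; False means tampering or corruption."""
            try:
                server_verifier.verify(signature, payload)
                return True
            except InvalidSignature:
                return False

        payload, sig = sign_reading({"sensor_id": 7, "hr_bpm": 72, "t": 1700000000})
        assert verify_reading(payload, sig)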

    Bandwidth Selection for Multivariate Kernel Density Estimation Using MCMC

    We provide Markov chain Monte Carlo (MCMC) algorithms for computing the bandwidth matrix for multivariate kernel density estimation. Our approach is based on treating the elements of the bandwidth matrix as parameters to be estimated, which we do by optimizing the likelihood cross-validation criterion. Numerical results show that the resulting bandwidths are superior to all existing methods; for dimensions greater than two, our algorithm is the first practical method for estimating the optimal bandwidth matrix. Moreover, the MCMC algorithm for multivariate bandwidth selection does not become more difficult as the dimension of the data increases.

    Keywords: bandwidth selection, cross-validation, multivariate kernel density estimation, sampling algorithms.
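
    A minimal sketch of the idea, under simplifying assumptions (a diagonal bandwidth matrix, a Gaussian kernel, and a random-walk Metropolis sampler targeting the leave-one-out likelihood cross-validation criterion); the names and tuning constants are illustrative, not the authors' implementation.

        # Sketch: MCMC bandwidth selection for multivariate KDE.
        import numpy as np
        from scipy.stats import multivariate_normal

        def loo_log_likelihood(X, h):
            """Leave-one-out log-likelihood of a Gaussian-kernel KDE with
            diagonal bandwidths h (one entry per dimension)."""
            n, d = X.shape
            cov = np.diag(h ** 2)
            total = 0.0
            for i in range(n):
                others = np.delete(X, i, axis=0)
                dens = multivariate_normal.pdf(others, mean=X[i], cov=cov).mean()
                total += np.log(dens + 1e-300)
            return total

        def mcmc_bandwidth(X, n_iter=500, step=0.05, seed=0):
            """Random-walk Metropolis over log-bandwidths; returns the
            posterior-mean bandwidths from the second half of the chain."""
            rng = np.random.default_rng(seed)
            n, d = X.shape
            log_h = np.log(X.std(axis=0) * n ** (-1.0 / (d + 4)))  # rule-of-thumb start
            ll = loo_log_likelihood(X, np.exp(log_h))
            draws = []
            for _ in range(n_iter):
                prop = log_h + step * rng.standard_normal(d)
                ll_prop = loo_log_likelihood(X, np.exp(prop))
                if np.log(rng.uniform()) < ll_prop - ll:  # Metropolis acceptance
                    log_h, ll = prop, ll_prop
                draws.append(np.exp(log_h))
            return np.mean(draws[n_iter // 2:], axis=0)

        X = np.random.default_rng(1).standard_normal((100, 3))
        print(mcmc_bandwidth(X))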

    Validation procedures in radiological diagnostic models. Neural network and logistic regression

    The objective of this paper is to compare the performance of two predictive radiological models, logistic regression (LR) and neural network (NN), under five different resampling methods. One hundred and sixty-seven patients with proven calvarial lesions as the only known disease were enrolled. Clinical and CT data were used for the LR and NN models. Both models were developed with cross-validation, leave-one-out, and three different bootstrap algorithms. The final results of each model were compared by error rate and by the area under the receiver operating characteristic curve (Az). The neural network obtained a statistically higher Az than LR with cross-validation. The remaining resampling validation methods did not reveal statistically significant differences between the LR and NN rules. The neural network classifier performs better than the one based on logistic regression. This advantage is well detected by three-fold cross-validation, but remains unnoticed when leave-one-out or bootstrap algorithms are used.

    Keywords: skull, neoplasms, logistic regression, neural networks, receiver operating characteristic curve, statistics, resampling.
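
    The comparison design (two classifiers, several resampling schemes, AUC as the figure of merit) can be sketched as follows; the synthetic data, model settings, and bootstrap count are placeholders, not the paper's calvarial-lesion dataset or exact protocol.

        # Sketch: LR vs. NN under cross-validation, leave-one-out, and bootstrap.
        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import LeaveOneOut, cross_val_score
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=167, n_features=10, random_state=0)
        models = {
            "LR": LogisticRegression(max_iter=1000),
            "NN": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
        }

        for name, model in models.items():
            # Three-fold cross-validated AUC (the setting that favored the NN).
            auc_cv = cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

            # Leave-one-out: pool the held-out probabilities, then one AUC.
            loo_prob = np.empty(len(y))
            for tr, te in LeaveOneOut().split(X):
                loo_prob[te] = model.fit(X[tr], y[tr]).predict_proba(X[te])[:, 1]
            auc_loo = roc_auc_score(y, loo_prob)

            # Simple bootstrap: train on a resample, score on out-of-bag cases.
            rng = np.random.default_rng(0)
            boot = []
            for _ in range(100):
                idx = rng.integers(0, len(y), len(y))
                oob = np.setdiff1d(np.arange(len(y)), idx)
                p = model.fit(X[idx], y[idx]).predict_proba(X[oob])[:, 1]
                boot.append(roc_auc_score(y[oob], p))

            print(name, round(auc_cv, 3), round(auc_loo, 3), round(float(np.mean(boot)), 3))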

    An empirical learning-based validation procedure for simulation workflow

    Simulation workflow is a top-level model for the design and control of a simulation process. It connects multiple simulation components under time and interaction constraints to form a complete simulation system. Before the component models are constructed and evaluated, validating the upper-layer simulation workflow is of primary importance in a simulation system. However, methods specifically for validating simulation workflows are very limited, and many existing validation techniques are domain-dependent, relying on cumbersome questionnaire design and expert scoring. This paper therefore presents an empirical learning-based validation procedure that implements a semi-automated evaluation of simulation workflows. First, representative features of general simulation workflows and their relations to validation indices are proposed. The calculation of workflow credibility based on the Analytic Hierarchy Process (AHP) is then introduced. To make full use of historical data and enable more efficient validation, four learning algorithms, back propagation neural network (BPNN), extreme learning machine (ELM), evolving neo-fuzzy neuron (eNFN), and fast incremental Gaussian mixture model (FIGMN), are introduced to construct the empirical relation between workflow credibility and workflow features. A case study on a landing-process simulation workflow tests the feasibility of the proposed procedure. The experimental results also provide a useful overview of state-of-the-art learning algorithms for the credibility evaluation of simulation models.
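
    The AHP step of such a procedure can be illustrated as follows: index weights are taken from the principal eigenvector of a pairwise comparison matrix, checked for consistency, and combined with per-index scores into one credibility value. The comparison matrix and scores below are made-up placeholders, not the paper's actual indices.

        # Sketch: AHP-based credibility aggregation.
        import numpy as np

        def ahp_weights(A):
            """Principal-eigenvector weights of a reciprocal comparison matrix."""
            vals, vecs = np.linalg.eig(A)
            k = np.argmax(vals.real)
            w = np.abs(vecs[:, k].real)
            return w / w.sum()

        def consistency_ratio(A, w):
            """Saaty's consistency check; CR < 0.1 is conventionally acceptable."""
            n = A.shape[0]
            lam = (A @ w / w).mean()
            ci = (lam - n) / (n - 1)
            ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # random-index table (partial)
            return ci / ri

        # Pairwise comparisons of three validation indices (illustrative values).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])
        w = ahp_weights(A)
        scores = np.array([0.9, 0.7, 0.8])  # per-index credibility scores in [0, 1]
        print("weights:", w.round(3), "CR:", round(consistency_ratio(A, w), 3))
        print("workflow credibility:", round(float(w @ scores), 3))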

    Validating multi-photon quantum interference with finite data

    Multi-particle interference is a key resource for quantum information processing, as exemplified by Boson Sampling. Hence, given its fragile nature, an essential desideratum is a solid and reliable framework for its validation. However, while several protocols have been introduced to this end, the approach is still fragmented and fails to build a big picture for future developments. In this work, we propose an operational approach to validation that encompasses and strengthens the state of the art for these protocols. To this end, we consider Bayesian hypothesis testing and the statistical benchmark as the most favorable protocols for small- and large-scale applications, respectively. We numerically investigate their operation with finite sample size, extending previous tests to larger dimensions, and against two adversarial algorithms for classical simulation: the Mean-Field sampler and the Metropolized Independent Sampler. To demonstrate the actual need for refined validation techniques, we show how the assessment of numerically simulated data depends on the available sample size, as well as on the internal hyper-parameters and other practically relevant constraints. Our analyses provide general insights into the challenge of validation, and can inspire the design of algorithms with a measurable quantum advantage.
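
    The finite-data question at the heart of the paper (how quickly a Bayesian hypothesis test can discriminate the targeted sampler from a classical adversary as samples accumulate) can be sketched with two toy distributions standing in for the quantum and classical models; the outcome space and probabilities below are placeholders, not the paper's samplers.

        # Sketch: sequential Bayesian hypothesis testing with finite samples.
        import numpy as np

        rng = np.random.default_rng(0)
        n_outcomes = 16
        p_q = rng.dirichlet(np.ones(n_outcomes))  # stand-in for the quantum model
        p_c = rng.dirichlet(np.ones(n_outcomes))  # stand-in for the classical model

        def posterior_prob_quantum(samples, prior_q=0.5):
            """P(quantum model | data) after a finite sequence of outcomes."""
            log_odds = np.log(prior_q) - np.log(1 - prior_q)
            log_odds += np.sum(np.log(p_q[samples]) - np.log(p_c[samples]))
            return 1.0 / (1.0 + np.exp(-log_odds))

        # Draw a finite sample from the "true" device and watch the
        # confidence grow with sample size.
        true_samples = rng.choice(n_outcomes, size=200, p=p_q)
        for n in (10, 50, 200):
            print(n, round(posterior_prob_quantum(true_samples[:n]), 4))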

    Tracking filter and multi-sensor data fusion

    In this paper, factorization filtering, a fusion filtering strategy, and related algorithms are presented. Some results of their implementation and validation on realistic data are given.
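
    The abstract gives no algorithmic detail, but the simplest instance of multi-sensor fusion, combining independent noisy measurements by inverse-variance weighting (the static special case of Kalman-style fusion), can be sketched as follows; the readings and variances are illustrative.

        # Sketch: minimum-variance fusion of independent sensor measurements.
        import numpy as np

        def fuse(estimates, variances):
            """Inverse-variance weighted fusion of independent estimates."""
            w = 1.0 / np.asarray(variances)
            fused = np.sum(w * np.asarray(estimates)) / w.sum()
            fused_var = 1.0 / w.sum()
            return fused, fused_var

        # Two sensors tracking the same range, with different noise levels;
        # the fused estimate leans toward the less noisy sensor.
        print(fuse([101.2, 99.5], [4.0, 1.0]))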