    Estimation of Analog Parametric Test Metrics Using Copulas

    A new technique for the estimation of analog parametric test metrics at the design stage is presented in this paper. This technique employs copula theory to estimate the joint distribution of random variables that represent the performances and the test measurements of the circuit under test (CUT). A copula-based model separates the dependencies between these random variables from their marginal distributions, providing a complete and scale-free description of dependence that is more suitable for modeling with well-known multivariate parametric laws. The model can be readily used to generate an arbitrarily large sample of CUT instances. This sample is then used to estimate parametric test metrics such as defect level (or test escapes) and yield loss. We demonstrate the usefulness of the proposed technique for evaluating a built-in test technique for a radio-frequency low-noise amplifier and for setting test limits that achieve a desired trade-off between test metrics. In addition, we compare the proposed technique with previous ones that rely on direct density estimation.
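    The abstract does not name a specific copula family, so as a rough illustration of the workflow it describes (empirical marginals, a fitted dependence model, a large synthetic sample, metric estimation), the following Python sketch uses a Gaussian copula with one performance and one test measurement. The variable names, pass/fail conventions, and exact metric definitions are assumptions for illustration, not the paper's.

    import numpy as np
    from scipy import stats

    def estimate_test_metrics(perf, meas, spec_lo, test_lo, n=100_000, seed=0):
        """Illustrative copula-based estimation of defect level and yield loss.

        perf, meas : arrays of simulated performance / test-measurement values
        spec_lo    : performance specification (device is functional if perf >= spec_lo)
        test_lo    : test limit (device passes the test if meas >= test_lo)
        """
        rng = np.random.default_rng(seed)
        data = np.column_stack([perf, meas])
        # 1. Map each variable to uniforms via its empirical CDF (the marginals).
        u = (stats.rankdata(data, axis=0) - 0.5) / len(data)
        # 2. Fit the dependence structure; a Gaussian copula here for simplicity.
        z = stats.norm.ppf(u)
        corr = np.corrcoef(z, rowvar=False)
        # 3. Draw an arbitrarily large synthetic sample from the copula ...
        z_new = rng.multivariate_normal(np.zeros(2), corr, size=n)
        u_new = stats.norm.cdf(z_new)
        # 4. ... and push it back through the empirical marginal quantiles.
        perf_new = np.quantile(perf, u_new[:, 0])
        meas_new = np.quantile(meas, u_new[:, 1])
        # 5. Estimate the parametric test metrics from the synthetic sample.
        functional = perf_new >= spec_lo
        passes = meas_new >= test_lo
        defect_level = np.mean(~functional & passes) / max(np.mean(passes), 1e-12)
        yield_loss = np.mean(functional & ~passes) / max(np.mean(functional), 1e-12)
        return defect_level, yield_loss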

    Analog Performance Prediction Based on Archimedean Copulas Generation Algorithm

    Testing analog circuits is a complex and very time-consuming task. In contrast to digital circuits, testing analog circuits requires different configurations, each of which targets a certain set of output parameters, namely the performances and the test measures. One solution that simplifies the test task and optimizes test time is to reduce the number of performances to be tested by eliminating redundant ones. However, the main problem with such a solution is the identification of redundant performances. Traditional methods based on calculating the correlation between different performances, or on the defect level, have been shown to be insufficient. This paper presents a new method based on the Archimedean copula generation algorithm. It predicts the performance value from each output parameter value based on the dependence (copula) between the two values. Therefore, different performances can be represented by a single output parameter; as a result, fewer test configurations are required. To validate the proposed approach, a CMOS imager with two performances and one test measure is used. The simulation results show that the two performances can be replaced by a single test measure. Industrial results are also reported to confirm the superiority of the proposed approach.
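    The abstract leaves the Archimedean family and the fitting procedure open. Assuming, for illustration, a Clayton copula with a known parameter theta, the conditional quantile of one variable given the other has a closed form, so predicting a performance value from an observed test measure might look like the sketch below; the function names and the empirical-CDF mapping are assumptions.

    import numpy as np

    def clayton_conditional_quantile(u, p, theta):
        """Quantile of V given U = u under a Clayton copula: solves C_{V|U}(v|u) = p."""
        return (u**(-theta) * (p**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)

    def predict_performance(test_meas, meas_samples, perf_samples, theta, p=0.5):
        # Map the observed test measure to a pseudo-observation via its empirical CDF.
        u = np.searchsorted(np.sort(meas_samples), test_meas) / len(meas_samples)
        u = np.clip(u, 1e-6, 1 - 1e-6)
        # Conditional median (p = 0.5) of the performance pseudo-observation ...
        v = clayton_conditional_quantile(u, p, theta)
        # ... mapped back through the performance's empirical quantile function.
        # For Clayton, theta can be fitted from Kendall's tau: theta = 2*tau/(1-tau).
        return np.quantile(perf_samples, v)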

    A New Method for Estimation of Missing Data Based on Sampling Methods for Data Mining

    Today we collect large amounts of data, often more than we can handle. The accumulated data are frequently raw and far from being of good quality: they contain missing values and noise. The presence of missing values is a major disadvantage for most data mining algorithms. Intuitively, the pertinent information is embedded in many attributes, and its extraction is only possible if the original data are cleaned and pre-treated. In this paper we propose a new preprocessing technique that estimates the missing values in order to obtain representative samples of good quality, and to ensure that the extracted information is safer and more reliable.
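    The abstract does not spell out the sampling-based estimator, so the following is only a minimal stand-in conveying the idea of filling holes by sampling rather than by a fixed constant: each missing entry is drawn from the empirical distribution of the observed values in the same attribute (a simple random hot-deck; the function name and data layout are assumptions).

    import numpy as np

    def hot_deck_impute(X, seed=0):
        """Fill missing entries (NaN) by sampling from each attribute's observed values,
        so the imputed column keeps a representative spread instead of collapsing
        to the mean."""
        rng = np.random.default_rng(seed)
        X = X.astype(float).copy()
        for j in range(X.shape[1]):
            col = X[:, j]
            missing = np.isnan(col)
            observed = col[~missing]
            if missing.any() and observed.size:
                col[missing] = rng.choice(observed, size=missing.sum())
        return X

    X = np.array([[1.0, np.nan], [2.0, 5.0], [np.nan, 6.0], [4.0, 7.0]])
    print(hot_deck_impute(X))  # NaNs replaced by values sampled from each column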

    Secure and Efficient Sharing Aggregation Scheme for Data Protection in WSNs

    Wireless sensor networks (WSNs) are omnipresent in a multitude of applications. One important requirement common to these applications is data security. Indeed, the data exchanged in WSNs are often a preferred target and can be subject to several threats, such as eavesdropping, replay, falsification, and alteration. Another important common requirement of WSN applications is data aggregation: the limitations of such networks in terms of energy, bandwidth, and storage accentuate the need for it. In this paper, we address these two issues. We propose a new efficient approach for protecting data integrity and credibility in WSNs while ensuring data aggregation. We consider a cluster-based network architecture, where sensor nodes are equally distributed among clusters. Each sensor node is in charge of delivering one bit of the sensed data while observing the remaining parts through a parity-control-based encryption approach. In this manner, the sensed data can be effectively and securely controlled with a low overhead compared to classical aggregation approaches, where all nodes transmit the sensed data individually. To validate the proposed protocol, we simulated it using the CupCarbon simulator; to evaluate its energy efficiency, we developed a prototype on the TelosB platform, where the obtained results show that our method consumes less energy.
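    The abstract describes the scheme only at a high level (each node delivers one bit and observes the rest through parity control), so the toy model below, assuming one bit-slice per node and a plain XOR parity with no encryption, is merely meant to make the division of labour concrete; it does not reproduce the paper's actual protocol.

    def parity_shares(value, n_nodes):
        """Split a sensed reading's bits across n_nodes cluster members.

        Node i is in charge of delivering bit i and, as a cheap integrity
        check, the parity (XOR) of the remaining bits it has observed.
        """
        bits = [(value >> i) & 1 for i in range(n_nodes)]
        shares = []
        for i in range(n_nodes):
            parity_rest = 0
            for j, b in enumerate(bits):
                if j != i:
                    parity_rest ^= b
            shares.append((bits[i], parity_rest))
        return shares

    def verify_and_reassemble(shares):
        """Cluster-head side: rebuild the value and check every parity report."""
        total = 0
        for own, _ in shares:
            total ^= own
        ok = all(parity == total ^ own for own, parity in shares)
        value = sum(own << i for i, (own, _) in enumerate(shares))
        return value, ok

    shares = parity_shares(0b1011, 4)
    print(verify_and_reassemble(shares))  # -> (11, True)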

    New techniques for selecting test frequencies for linear analog circuits

    In this paper we show that the problem of minimizing the number of test frequencies necessary to detect all possible faults in a multi-frequency test approach for linear analog circuits can be modeled as a set covering problem. We show, in particular, that under some conditions on the considered faults, the coefficient matrix of the problem has the strong consecutive-ones property, and hence the corresponding set covering problem can be solved in polynomial time. For an efficient solution of the problem, an interval graph formulation is also used, and a polynomial algorithm exploiting the interval graph structure is suggested. The optimization of test frequencies for a case-study biquadratic filter is presented for illustration purposes. Numerical simulations with a set of randomly generated problem instances demonstrate that two different implementation approaches solve the optimization problem very quickly, with good time complexity.
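    Under the consecutive-ones property the abstract points to, each fault is detectable over a contiguous band of test frequencies, and the set covering instance becomes the classic problem of stabbing a set of intervals with as few points as possible, solvable by a greedy scan. A minimal sketch follows (the interval endpoints are made-up example values, not the paper's case study).

    def min_test_frequencies(intervals):
        """Minimum set of frequencies hitting every fault's detection interval.

        Sort the intervals [lo, hi] by right endpoint and greedily pick that
        endpoint whenever an interval is not yet hit; this is optimal for
        interval stabbing and runs in O(n log n).
        """
        chosen = []
        for lo, hi in sorted(intervals, key=lambda iv: iv[1]):
            if not chosen or chosen[-1] < lo:
                chosen.append(hi)
        return chosen

    # Four faults, three of which share a common band around 2.0 kHz.
    print(min_test_frequencies([(0.5, 2.0), (1.0, 3.0), (1.8, 2.5), (4.0, 5.0)]))
    # -> [2.0, 5.0]: two test frequencies suffice to detect all four faults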

    Secure Key Exchange Against Man-in-the-Middle Attack: Modified Diffie-Hellman Protocol

    One of the most famous key exchange protocols is the Diffie-Hellman Protocol (DHP), a widely used technique on which key exchange systems around the world depend. This protocol is simple and uncomplicated, and its robustness is based on the Discrete Logarithm Problem (DLP). Despite this, it is considered weak against the man-in-the-middle attack. This article presents a completely different version of the DHP. The proposed version is based on two verification stages. In the first stage, we check whether the pseudo-random value α that Alice sends to Bob has been manipulated; in the second stage, we make sure that the random value β that Bob sends to Alice has not been manipulated. The man-in-the-middle attacker, Eve, cannot impersonate Alice or Bob, manipulate their exchanged values, or discover the secret encryption key.
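    The abstract does not detail the two verification stages themselves, so for context the sketch below shows only the textbook, unauthenticated DHP exchange that the man-in-the-middle attack exploits; the toy 32-bit prime is for illustration only.

    import secrets

    # Textbook (unauthenticated) DHP with a toy modulus; real deployments use
    # standardized 2048-bit groups or elliptic curves.
    P = 4294967291          # largest 32-bit prime, illustration only
    G = 2

    a = secrets.randbelow(P - 3) + 2        # Alice's secret exponent
    b = secrets.randbelow(P - 3) + 2        # Bob's secret exponent
    alpha = pow(G, a, P)                    # value Alice sends to Bob
    beta = pow(G, b, P)                     # value Bob sends to Alice
    assert pow(beta, a, P) == pow(alpha, b, P)  # both sides derive the same key

    # Man-in-the-middle: because neither alpha nor beta is authenticated, Eve
    # can substitute her own g^e mod p in each direction and share a different
    # key with each party -- the gap the paper's two verification stages target.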

    Intelligent Data Mining Techniques for Emergency Detection in Wireless Sensor Networks

    Presented at the 2nd IEEE International Conference on Cloud and Big Data Computing (CBDCom 2016), Toulouse, France, 18-21 July.
    Event detection is an important part of many Wireless Sensor Network (WSN) applications, such as forest fire and environmental pollution monitoring. In this kind of application, the event must be detected early in order to reduce threats and damage. In this paper, we propose a new approach for early forest fire detection based on the integration of data mining techniques into sensor nodes. The idea is to partition the node set into clusters so that each node can individually detect fires using classification techniques. Once a fire is detected, the corresponding node sends an alert to its cluster-head. This alert is then routed via gateways and other cluster-heads to the sink in order to inform the firefighters. The approach is validated using the CupCarbon simulator. The results show that our approach can provide a fast reaction to forest fires with efficient energy consumption. This work was supported by the French National Research Agency (ANR).
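    The abstract mentions node-local classification without naming a model, so the sketch below uses a small decision tree as a plausible stand-in; the temperature/humidity features, the training data, and the send_alert_to_cluster_head helper are all hypothetical.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical training data per reading: [temperature degC, relative humidity %],
    # labelled 1 for fire conditions and 0 for normal conditions.
    X_train = np.array([[22, 60], [25, 55], [30, 45], [55, 20], [60, 15], [70, 10]])
    y_train = np.array([0, 0, 0, 1, 1, 1])

    clf = DecisionTreeClassifier(max_depth=2).fit(X_train, y_train)

    def send_alert_to_cluster_head(temp_c, humidity_pct):
        # Hypothetical radio call; on a real node this would go over the WSN link.
        print(f"ALERT: possible fire (T={temp_c} C, RH={humidity_pct}%)")

    def on_new_reading(temp_c, humidity_pct):
        """Run on each sensor node: classify locally, alert the cluster-head on fire."""
        if clf.predict([[temp_c, humidity_pct]])[0] == 1:
            send_alert_to_cluster_head(temp_c, humidity_pct)

    on_new_reading(65, 12)   # -> ALERT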

    Co-simulation of multiple vehicle routing problem models

    Complex systems are often designed in a decentralized and open way so that they can operate on heterogeneous entities that communicate with each other. Numerous studies consider the simulation of a complex system's components a proven approach to realistically predicting its behavior or effectively managing its complexity. The simulations of different complex system components can be coupled via co-simulation to reproduce the behavior emerging from their interaction. Multi-agent simulations, in turn, have been widely applied to complex system modeling and simulation. In our approach, each multi-agent simulator's role is to solve one objective of the Vehicle Routing Problem (VRP). These simulators interact within a co-simulation platform called MECSYCO to ensure the integration of the various proposed VRP models. This paper presents the VRP simulation results in several respects, where the main goal is to satisfy several client demands. The experiments show the performance of the proposed VRP multi-model and demonstrate its improvement in terms of computational complexity.
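    The MECSYCO coupling itself is beyond what the abstract gives, but the role of a single agent/simulator in the multi-model can be made concrete. The sketch below, with invented coordinates, solves one VRP objective (total travelled distance) with a nearest-neighbour heuristic; in the paper's setting, each such solver would exchange its partial solution with the others through the co-simulation platform.

    import math

    def nearest_neighbour_route(depot, clients):
        """One agent's job in the co-simulation: build a route addressing a single
        VRP objective (total travelled distance) with a nearest-neighbour heuristic."""
        route, pos, todo = [depot], depot, list(clients)
        while todo:
            nxt = min(todo, key=lambda c: math.dist(pos, c))  # closest unvisited client
            todo.remove(nxt)
            route.append(nxt)
            pos = nxt
        route.append(depot)   # return to the depot once every demand is served
        return route

    depot = (0.0, 0.0)
    clients = [(2.0, 1.0), (5.0, 4.0), (1.0, 3.0), (6.0, 0.5)]
    print(nearest_neighbour_route(depot, clients))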