
    DISTANCE: a framework for software measure construction.

    In this paper we present a framework for software measurement that is specifically suited to satisfy the measurement needs of empirical software engineering research. The framework offers an approach to measurement that builds upon the easily imagined, detected and visualised concepts of similarity and dissimilarity between software entities. These concepts are used both to model the software attributes of interest and to define the corresponding software measures. Central to the framework is a process model that embeds constructive procedures for attribute modelling and measure construction into a goal-oriented approach to empirical software engineering studies. The underlying measurement-theoretic principles of our approach ensure the construct validity of the resulting measures. The approach was tested on a popular suite of object-oriented design measures. We further show that our measure construction method compares favourably to related work.
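
    One way to make the dissimilarity idea concrete is to abstract each software entity as a set and quantify an attribute as a set distance. The sketch below uses a set-of-dependencies abstraction, the symmetric-difference metric and an empty reference set; all three are illustrative assumptions, not the paper's own definitions.

        def set_distance(a, b):
            # Size of the symmetric difference: a genuine metric on finite sets.
            return len(a ^ b)

        # Abstraction of a class as the set of classes it depends on (illustrative).
        order_service = {"Customer", "Product", "PaymentGateway"}
        invoice_service = {"Customer", "Product"}

        # Dissimilarity between two entities ...
        print(set_distance(order_service, invoice_service))   # 1

        # ... and a coupling measure obtained as the distance to a reference point,
        # here the hypothetical entity with no dependencies at all.
        print(set_distance(order_service, set()))             # 3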

    Quantum Cryptography Beyond Quantum Key Distribution

    Quantum cryptography is the art and science of exploiting quantum mechanical effects in order to perform cryptographic tasks. While the most well-known example of this discipline is quantum key distribution (QKD), there exist many other applications such as quantum money, randomness generation, secure two- and multi-party computation and delegated quantum computation. Quantum cryptography also studies the limitations and challenges resulting from quantum adversaries, including the impossibility of quantum bit commitment, the difficulty of quantum rewinding and the definition of quantum security models for classical primitives. In this review article, aimed primarily at cryptographers unfamiliar with the quantum world, we survey the area of theoretical quantum cryptography, with an emphasis on the constructions and limitations beyond the realm of QKD. (Comment: 45 pages, over 245 references)

    Weak Measurement Theory and Modified Cognitive Complexity Measure

    Measurement is one of the open problems in software engineering. Since traditional measurement theory has difficulty defining empirical observations on software entities in terms of their measured quantities, Morasca proposed weak measurement theory to address this problem. In this paper, we evaluated the applicability of weak measurement theory by applying it to a newly proposed Modified Cognitive Complexity Measure (MCCM). We also investigated the applicability of the weak extensive structure for deciding on the type of scale for MCCM. It is observed that MCCM is on a weak ratio scale.

    Information and the reconstruction of quantum physics

    The reconstruction of quantum physics has been connected with the interpretation of the quantum formalism, and has continued to be so with the recent deeper consideration of the relation of information to quantum states and processes. This recent form of reconstruction has mainly involved conceiving quantum theory on the basis of informational principles, providing new perspectives on physical correlations and entanglement that can be used to encode information. In contrast to the traditional, interpretational approach to the foundations of quantum mechanics, which attempts directly to establish the meaning of the elements of the theory and often touches on metaphysical issues, the newer, more purely reconstructive approach sometimes defers this task, focusing instead on the mathematical derivation of the theoretical apparatus from simple principles or axioms. In its purest form, this sort of theory reconstruction is fundamentally the mathematical derivation of the elements of theory from explicitly presented, often operational principles involving a minimum of extra-mathematical content. Here, a representative series of specifically information-based treatments, from partial reconstructions that make connections with information to rigorous axiomatizations (including those involving the theories of generalized probability and abstract systems), is reviewed. (Accepted manuscript)

    If physics is an information science, what is an observer?

    Interpretations of quantum theory have traditionally assumed a "Galilean" observer, a bare "point of view" implemented physically by a quantum system. This paper investigates the consequences of replacing such an informationally-impoverished observer with an observer that satisfies the requirements of classical automata theory, i.e. an observer that encodes sufficient prior information to identify the system being observed and recognize its acceptable states. It shows that, with reasonable assumptions about the physical dynamics of information channels, the observations recorded by such an observer will display the typical characteristics predicted by quantum theory, without requiring any specific assumptions about the observer's physical implementation. (Comment: 30 pages, comments welcome; v2: significant revisions, results unchanged)

    Squeeziness: An information theoretic measure for avoiding fault masking

    Fault masking can reduce the effectiveness of a test suite. We propose an information theoretic measure, Squeeziness, as the theoretical basis for avoiding fault masking. We begin by explaining fault masking and the relationship between collisions and fault masking. We then define Squeeziness and demonstrate by experiment that there is a strong correlation between Squeeziness and the likelihood of collisions. We conclude with comments on how Squeeziness could be the foundation for generating test suites that minimise the likelihood of fault masking.
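
    To give a flavour of the kind of quantity involved, the sketch below computes the information (entropy) a function loses over a finite input domain, assuming uniformly distributed inputs and taking loss = H(inputs) - H(outputs); this formulation and the uniform-input assumption are stated here as assumptions rather than as the paper's exact definition of Squeeziness.

        import math
        from collections import Counter

        def information_loss(f, domain):
            # Entropy of a uniform input distribution minus entropy of the induced
            # output distribution: the bits of information about the input that
            # are destroyed by collisions in f.
            n = len(domain)
            h_in = math.log2(n)
            counts = Counter(f(x) for x in domain)
            h_out = -sum((c / n) * math.log2(c / n) for c in counts.values())
            return h_in - h_out

        # A function with many collisions loses more information and therefore
        # offers more opportunity for fault masking; an injective function loses none.
        print(information_loss(lambda x: x % 4, range(256)))  # 6.0 bits lost
        print(information_loss(lambda x: x, range(256)))      # 0.0 bits lost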

    Information Flow for Security in Control Systems

    This paper considers the development of information flow analyses to support resilient design and active detection of adversaries in cyber-physical systems (CPS). The area of CPS security, though well studied, suffers from fragmentation. In this paper, we consider control systems as an abstraction of CPS. Here, we extend the notion of information flow analysis, a well-established set of methods developed in software security, to obtain a unified framework that captures and extends system-theoretic results in control system security. In particular, we propose the Kullback-Leibler (KL) divergence as a causal measure of information flow, which quantifies the effect of adversarial inputs on sensor outputs. We show that the proposed measure characterizes the resilience of control systems to specific attack strategies by relating the KL divergence to optimal detection techniques. We then relate information flows to stealthy attack scenarios where an adversary can bypass detection. Finally, this article examines active detection mechanisms where a defender intelligently manipulates control inputs or the system itself in order to elicit information flows from an attacker's malicious behavior. In all of these cases, we demonstrate an ability to investigate and extend existing results by utilizing the proposed information flow analyses.
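
    A minimal numerical sketch of the idea: compare the distribution of a sensor residual under nominal operation with its distribution under an attack, and use the KL divergence between the two as a detectability measure. The Gaussian residual model, the additive attack bias and the histogram binning below are illustrative assumptions, not the paper's specific system model.

        import numpy as np

        rng = np.random.default_rng(0)

        def kl_divergence(p, q, eps=1e-12):
            # Discrete KL divergence D(p || q) in nats over shared histogram bins.
            p = p + eps
            q = q + eps
            return float(np.sum(p * np.log(p / q)))

        bins = np.linspace(-6, 6, 61)
        nominal = rng.normal(0.0, 1.0, 100_000)    # residual with no adversary
        attacked = rng.normal(0.8, 1.0, 100_000)   # residual with a biased attack
        p, _ = np.histogram(attacked, bins)
        q, _ = np.histogram(nominal, bins)
        p = p / p.sum()
        q = q / q.sum()

        # A larger divergence means more information about the attack leaks into
        # the sensor output, so the attack is easier to detect; a stealthy attack
        # keeps this number small.
        print(kl_divergence(p, q))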

    Enhanced secure key exchange systems based on the Johnson-noise scheme

    We introduce seven new versions of the Kirchhoff-Law-Johnson-(like)-Noise (KLJN) classical physical secure key exchange scheme and a new transient protocol for practically perfect security. While these practical improvements offer progressively enhanced security and/or speed under non-ideal conditions, the fundamental physical laws providing the security remain the same. In the "intelligent" KLJN (iKLJN) scheme, Alice and Bob exploit the fact that they know exactly not only their own resistor value but also the stochastic time function of their own noise, which they generate before feeding it into the loop. In the "multiple" KLJN (MKLJN) system, Alice and Bob have publicly known identical sets of different resistors with a proper, publicly known truth table for the bit interpretation of their combination. In the "keyed" KLJN (KKLJN) system, by using secure communication with a previously shared key, Alice and Bob share a proper time-dependent truth table for the bit interpretation of the resistor situation at each secure bit exchange step while generating the next key. The remaining four KLJN schemes are combinations of the above protocols that synergistically enhance the security properties. These are the "intelligent-multiple" (iMKLJN), the "intelligent-keyed" (iKKLJN), the "keyed-multiple" (KMKLJN) and the "intelligent-keyed-multiple" (iKMKLJN) KLJN key exchange systems. Finally, we introduce a new transient protocol offering practically perfect security without privacy amplification, which is not needed in practical applications but is shown for the sake of ongoing discussions. (Comment: This version is accepted for publication.)
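
    As a logic-level illustration of the resistor truth-table idea, the toy below simulates which bit periods of a basic KLJN exchange yield a shared secret bit. The bit convention (Alice's high resistor meaning 1 in a mixed period) and the purely combinatorial treatment, which ignores the Johnson-noise physics that actually provides the security, are illustrative assumptions.

        import random

        def kljn_bit(rng):
            # Each party privately picks a high (H) or low (L) resistor for this
            # bit period; the channel reveals only the unordered combination, so
            # an eavesdropper cannot distinguish HL from LH.
            alice, bob = rng.choice("HL"), rng.choice("HL")
            if alice != bob:
                # Mixed period: each party infers the other's choice from its own,
                # giving one shared secret bit (convention: Alice's H means 1).
                return 1 if alice == "H" else 0
            return None  # HH and LL periods reveal the choices and are discarded

        rng = random.Random(42)
        key = [b for b in (kljn_bit(rng) for _ in range(64)) if b is not None]
        print(f"{len(key)} secure bits from 64 periods: {key}")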

    Rank Minimization over Finite Fields: Fundamental Limits and Coding-Theoretic Interpretations

    This paper establishes information-theoretic limits in estimating a finite field low-rank matrix given random linear measurements of it. These linear measurements are obtained by taking inner products of the low-rank matrix with random sensing matrices. Necessary and sufficient conditions on the number of measurements required are provided. It is shown that these conditions are sharp and that the minimum-rank decoder is asymptotically optimal. The reliability function of this decoder is also derived by appealing to de Caen's lower bound on the probability of a union. The sufficient condition also holds when the sensing matrices are sparse, a scenario that may be amenable to efficient decoding. More precisely, it is shown that if the n × n sensing matrices contain, on average, Ω(n log n) entries, the number of measurements required is the same as when the sensing matrices are dense and contain entries drawn uniformly at random from the field. Analogies are drawn between the above results and rank-metric codes in the coding theory literature. In fact, we are also strongly motivated by understanding when minimum rank distance decoding of random rank-metric codes succeeds. To this end, we derive distance properties of equiprobable and sparse rank-metric codes. These distance properties provide a precise geometric interpretation of the fact that the sparse ensemble requires as few measurements as the dense one. Finally, we provide a non-exhaustive procedure to search for the unknown low-rank matrix. (Comment: Accepted to the IEEE Transactions on Information Theory; presented at IEEE International Symposium on Information Theory (ISIT) 201)
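
    A small GF(2) sketch of the measurement model: a hidden rank-r matrix X is observed only through inner products with random sensing matrices, and a brute-force minimum-rank decoder returns the lowest-rank matrix consistent with all measurements. The tiny sizes and the exhaustive search are illustrative choices so the example runs quickly; they are not the paper's decoding procedure.

        import itertools
        import numpy as np

        def gf2_rank(m):
            # Rank over GF(2) by Gaussian elimination with XOR row operations.
            m = m.copy() % 2
            rank = 0
            for col in range(m.shape[1]):
                pivot = next((i for i in range(rank, m.shape[0]) if m[i, col]), None)
                if pivot is None:
                    continue
                m[[rank, pivot]] = m[[pivot, rank]]
                for i in range(m.shape[0]):
                    if i != rank and m[i, col]:
                        m[i] ^= m[rank]
                rank += 1
            return rank

        rng = np.random.default_rng(1)
        n, r, k = 3, 1, 8                                  # matrix size, true rank, measurements
        X = (rng.integers(0, 2, (n, r)) @ rng.integers(0, 2, (r, n))) % 2  # hidden rank-<=r matrix
        H = rng.integers(0, 2, (k, n, n))                  # random GF(2) sensing matrices
        y = np.einsum("kij,ij->k", H, X) % 2               # measurements <H_i, X> over GF(2)

        # Minimum-rank decoding by exhaustive search over all 2^(n*n) candidates;
        # with enough random measurements the lowest-rank consistent matrix is the
        # hidden one with high probability.
        best = None
        for bits in itertools.product((0, 1), repeat=n * n):
            Z = np.array(bits).reshape(n, n)
            if np.array_equal(np.einsum("kij,ij->k", H, Z) % 2, y):
                if best is None or gf2_rank(Z) < gf2_rank(best):
                    best = Z

        print("true rank:", gf2_rank(X), "| decoded rank:", gf2_rank(best),
              "| exact recovery:", np.array_equal(best, X))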