
    Modular Arithmetic Expressions and Primality Testing via DNA Self-Assembly

    Self-assembly is a fundamental process by which supramolecular species form spontaneously from their components. The process is ubiquitous throughout the chemistry of life and is central to biological information processing. Algorithms for solving many mathematical and computational problems via tile self-assembly have been proposed by many researchers over the last decade. In particular, tile sets for performing basic arithmetic on two inputs have been given. In this work we give tile sets for performing basic arithmetic (addition, subtraction, multiplication) on n inputs and subsequently computing the result modulo a given number. We also present a tile set for primality testing. Finally, we present a software tool, 'xtilemod', for modular arithmetic, which simplifies the task of creating input files for the xgrow simulator for basic (addition, subtraction, multiplication and division) as well as modular arithmetic on n inputs. Similar software for creating tile sets for primality testing is also given.
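    As a point of reference for what these tile sets compute (not for how the self-assembly itself proceeds, which the paper encodes in tile glues), a minimal Python sketch of n-input modular arithmetic might look like the following; the function name and interface are illustrative, not taken from xtilemod:

```python
from functools import reduce

def mod_arith(op, inputs, modulus):
    """Fold a basic operation over n inputs, then reduce modulo `modulus`.

    op: 'add', 'sub', or 'mul', mirroring the operations the tile sets support.
    Purely a software reference point for the tile-assembly computation.
    """
    ops = {
        'add': lambda a, b: a + b,
        'sub': lambda a, b: a - b,
        'mul': lambda a, b: a * b,
    }
    return reduce(ops[op], inputs) % modulus

# Example: (7 + 12 + 9) mod 5 == 3
print(mod_arith('add', [7, 12, 9], 5))
```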

    Experimental Progress in Computation by Self-Assembly of DNA Tilings

    Approaches to DNA-based computing by self-assembly require the use of DNA nanostructures, called tiles, that have efficient chemistries, expressive computational power, and convenient input and output (I/O) mechanisms. We have designed two new classes of DNA tiles, TAO and TAE, both of which contain three double helices linked by strand exchange. Structural analysis of a TAO molecule has shown that the molecule assembles efficiently from its four component strands. Here we demonstrate a novel method for I/O whereby multiple tiles assemble around a single-stranded (input) scaffold strand. Computation by tiling theoretically results in the formation of structures that contain single-stranded (output) reporter strands, which can then be isolated for subsequent steps of computation if necessary. We illustrate the advantages of the TAO and TAE designs by detailing two examples of massively parallel arithmetic: construction of complete XOR and addition tables by linear assemblies of DNA tiles. The three-helix structures provide flexibility for topological routing of strands in the computation, allowing the implementation of string tile models.
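    For concreteness, the complete tables that such linear assemblies produce in parallel can be written down directly; this sketch simply enumerates them sequentially in Python and makes no claim about the tile-level encoding:

```python
# Complete XOR and 1-bit addition tables over binary inputs: the same
# input/output pairs that the linear tile assemblies generate in parallel.
xor_table = {(a, b): a ^ b for a in (0, 1) for b in (0, 1)}
add_table = {(a, b): (a ^ b, a & b) for a in (0, 1) for b in (0, 1)}  # (sum, carry)

print(xor_table)  # {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(add_table)  # {(0, 0): (0, 0), (0, 1): (1, 0), (1, 0): (1, 0), (1, 1): (0, 1)}
```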

    State of the art parallel approaches for RSA public key based cryptosystem

    RSA is one of the most popular Public Key Cryptography based algorithms, mainly used for digital signatures and encryption/decryption. It is based on the mathematical problem of factoring very large integers, a compute-intensive process that takes a very long time, as well as significant power, to perform. Scientists throughout the world are working to increase the speed and decrease the power consumption of the RSA algorithm while keeping its security intact. One popular technique for enhancing the performance of RSA is parallel programming. In this paper we present a survey of various parallel implementations of the RSA algorithm, covering a variety of hardware and software approaches.
    Comment: IJCSA February 201
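    The compute-intensive core that such implementations target is modular exponentiation over very large integers; a minimal square-and-multiply sketch (textbook RSA, not any particular implementation from the survey) is:

```python
def modexp(base, exp, mod):
    """Right-to-left square-and-multiply: the hot loop of RSA
    encryption/decryption that parallel implementations accelerate."""
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:                        # fold in the current square when the bit is set
            result = (result * base) % mod
        base = (base * base) % mod
        exp >>= 1
    return result

# Toy parameters (real RSA uses 2048-bit or larger moduli):
# n = 3233 = 61 * 53, e = 17, d = 2753; encrypt then decrypt recovers 65.
assert modexp(modexp(65, 17, 3233), 2753, 3233) == 65
```

    It is loops like this one, over multi-precision integers, that the surveyed hardware and software approaches aim to speed up.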

    New Design of Reversible Full Adder/Subtractor using R gate

    Quantum computers require quantum processors. An important part of any computer's processor is the arithmetic unit, which performs binary addition, subtraction, division and multiplication; multiplication can, however, be performed using repeated addition, and division using repeated subtraction. In this paper we present two designs using the reversible R^3 gate to perform the quantum half adder/subtractor and the quantum full adder/subtractor. The proposed half adder/subtractor design can be used to perform different logical operations, such as AND, XOR, NAND, XNOR, NOT and copy of basis. The proposed designs are compared with previous designs in terms of the number of gates used, the number of constant bits, the garbage bits, the quantum cost and the delay. The proposed designs are implemented and tested using the GAP software.
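    The paper's R^3-gate designs are not reproduced here, but the reversible-logic idea can be illustrated with the standard CNOT/Toffoli construction of a half adder, which keeps the inputs recoverable from the outputs; this is a generic textbook sketch, not the proposed circuit:

```python
def toffoli(a, b, c):
    """CCNOT: flips target c iff both controls are 1."""
    return a, b, c ^ (a & b)

def cnot(a, b):
    """CNOT: flips target b iff control a is 1."""
    return a, b ^ a

def reversible_half_adder(a, b):
    """Map (a, b, 0) -> (a, a XOR b, a AND b): sum and carry, reversibly.

    Input a survives unchanged, so the mapping is invertible and no
    information (hence, ideally, no energy) is lost.
    """
    a, b, carry = toffoli(a, b, 0)  # ancilla becomes the carry: a AND b
    a, b = cnot(a, b)               # b becomes the sum: a XOR b
    return a, b, carry

for a in (0, 1):
    for b in (0, 1):
        print((a, b), '->', reversible_half_adder(a, b))
```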

    The "crisis of noosphere" as a limiting factor to achieve the point of technological singularity

    One of the most significant developments in the history of humankind is the invention of a way of keeping records of human knowledge, thoughts and ideas. In 1926, the work of thinkers such as Edouard Le Roy, Vladimir Vernadsky and Teilhard de Chardin led to the concept of the noosphere: the idea that human cognition and knowledge transform the biosphere, coming to be something like the planet's thinking layer. At present, it is commonly accepted by some thinkers that the Internet is the medium that brings the noosphere to life. According to Vinge and Kurzweil's technological singularity hypothesis, the noosphere would in the future be the natural environment in which 'human-machine superintelligence' emerges after the point of technological singularity is reached. In this paper we show by means of a numerical model that it is impossible for our civilization to reach the point of technological singularity in the near future. We propose that this point may be reached when Internet data centers are based on computing machines that are far more effective in terms of power consumption than current ones. We speculate about what we have called the 'Nooscomputer', or N-computer, a hypothetical machine which would consume far less power, allowing our civilization to reach the point of technological singularity.

    Combinatorial Entropy Encoding

    This paper proposes a novel entropy encoding technique for lossless data compression. Representing a message string by its lexicographic index among the permutations of its symbols results in a compressed version matching the Shannon entropy of the message. Commercial data compression standards make use of Huffman or arithmetic coding at some stage of the compression process. In the proposed method, as in arithmetic coding, the entire string is mapped to an integer, but the mapping is not based on fractional numbers. Unlike both arithmetic and Huffman coding, no prior entropy model of the source is required. A simple, intuitive algorithm based on multinomial coefficients is developed for entropy encoding that adaptively uses fewer bits for more frequent symbols. Correctness of the algorithm is demonstrated by an example.
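    The central idea, ranking a string among the permutations of its own symbols via multinomial coefficients, can be sketched directly; this is a straightforward reading of the described scheme, and the helper names are ours, not the paper's:

```python
from collections import Counter
from math import factorial

def multinomial(counts):
    """Number of distinct permutations of a multiset with these symbol counts."""
    result = factorial(sum(counts.values()))
    for c in counts.values():
        result //= factorial(c)
    return result

def lex_index(msg):
    """Lexicographic rank of msg among the distinct permutations of its symbols."""
    counts = Counter(msg)
    rank = 0
    for sym in msg:
        for smaller in sorted(c for c in counts if c < sym and counts[c] > 0):
            counts[smaller] -= 1          # place a smaller symbol here...
            rank += multinomial(counts)   # ...and count the completions skipped
            counts[smaller] += 1
        counts[sym] -= 1                  # consume the actual symbol
    return rank

print(lex_index("bac"))  # 2: abc(0), acb(1), bac(2), bca(3), cab(4), cba(5)
```

    The rank together with the symbol counts suffices to reconstruct the message, and the rank needs roughly log2 of the multinomial coefficient bits, which is what ties the scheme to the Shannon entropy of the message.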

    Carbon--The First Frontier of Information Processing

    Information is often encoded as an aperiodic chain of building blocks. Modern digital computers use bits as the building blocks, but in general the choice of building blocks depends on the nature of the information to be encoded. What are the optimal building blocks for encoding structural information? This can be analysed by substituting the operations of addition and multiplication of conventional arithmetic with translation and rotation. It is argued that at the molecular level, the best component for encoding discretised structural information is carbon. Living organisms discovered this billions of years ago and used carbon as the backbone for constructing proteins that function according to their structure. Structural analysis of polypeptide chains shows that an efficient and versatile structural language of 20 building blocks is needed to implement all the tasks carried out by proteins. Properties of amino acids indicate that the present triplet genetic code was preceded by a more primitive one, coding for 10 amino acids using two nucleotide bases.
    Comment: (v1) 9 pages, revtex. (v2) 10 pages. Several arguments expanded to make the article self-contained and to increase clarity. Applications pointed out. (v3) 11 pages. Published version. Well-known properties of proteins shifted to an appendix. Reformatted according to journal style.

    Review on the Advancements of DNA Cryptography

    Since security is one of the most important issues, the evolution of cryptography and cryptographic analysis are fields of ongoing research. The latest development in this field is DNA cryptography, which emerged after the discovery of the computational ability of Deoxyribonucleic Acid (DNA). DNA cryptography uses DNA as the computational tool, along with several molecular techniques to manipulate it. Due to the very high storage capacity of DNA, this field is becoming very promising. Currently it is in the development phase, and it requires much further work and research to reach a mature stage. By reviewing the potential and cutting-edge technology of current research, this paper shows the directions that need to be addressed further in the field of DNA cryptography.

    CoHSI V: Identical multiple scale-independent systems within genomes and computer software

    A mechanism-free and symbol-agnostic conservation principle, the Conservation of Hartley-Shannon Information (CoHSI), is predicted to constrain the structure of discrete systems regardless of their origin or function. Despite their distinct provenance, genomes and computer software share a simple structural property: they are linear symbol-based discrete systems, and thus they present an opportunity to test the predictions of CoHSI in a comparative context. Here, without any consideration of, or relevance to, their role in specifying function, we identify that 10 representative genomes (from microbes to human) and a large collection of software contain identically structured nested subsystems. In the case of base sequences in genomes, CoHSI predicts that if we split the genome into n-tuples (a 2-tuple is a pair of consecutive bases, a 3-tuple a trio, and so on), without regard for whether or not a region is coding, then each collection of n-tuples will constitute a homogeneous discrete system and will obey a power law in the frequency of occurrence of the n-tuples. We consider 1-, 2-, 3-, 4-, 5-, 6-, 7- and 8-tuples for ten species and demonstrate that the predicted power-law behavior is emphatically present and, furthermore, as predicted, is insensitive to the start window for the tuple extraction, i.e. the reading frame is irrelevant. We go on to provide a proof of Chargaff's second parity rule and, on the basis of this proof, predict higher-order tuple parity rules, which we then identify in the genome data. CoHSI predicts precisely the same behavior in computer software. This prediction was tested and confirmed using 2-, 3- and 4-tuples of the hexadecimal representation of machine code in multiple computer programs, underlining the fundamental role played by CoHSI in defining the landscape in which discrete symbol-based systems must operate.
    Comment: 22 pages, 13 figures, 35 references
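    The tuple-extraction step described above is simple to state precisely; the sketch below is our reading of the procedure, with an explicit frame offset to make the reading-frame insensitivity testable:

```python
from collections import Counter

def ntuple_frequencies(seq, n, frame=0):
    """Split seq into consecutive, non-overlapping n-tuples starting at
    offset `frame`, and count how often each distinct tuple occurs."""
    tuples = (seq[i:i + n] for i in range(frame, len(seq) - n + 1, n))
    return Counter(tuples)

# Rank-frequency view: CoHSI predicts a power law in these sorted counts,
# regardless of the chosen frame (0, 1, ..., n-1).
genome = "ACGTACGGTTACGATCGATTACGGCATG"  # stand-in for a real sequence
freqs = ntuple_frequencies(genome, 3, frame=1)
print(freqs.most_common())
```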

    A Memcomputing Pascaline

    The original Pascaline was a mechanical calculator able to add and subtract integers. It encoded information in the angles of mechanical wheels and, through a set of gears aided by gravity, performed the calculations. Here, we show that this concept can be realized in electronics using memory elements such as memristive systems. Using memristive emulators, we have demonstrated experimentally a memcomputing version of the mechanical Pascaline, capable of processing and storing numerical results in the multiple levels of each memristive element. Our result is the first experimental demonstration of multidigit arithmetic with multi-level memory devices and further emphasizes the versatility and potential of memristive systems for future massively parallel, high-density computing architectures.
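    As a software analogue of the device's digit handling (purely illustrative, since the actual work is done in hardware by multi-level memristive elements), multidigit addition with carry propagation across base-10 cells looks like:

```python
def pascaline_add(digits_a, digits_b, base=10):
    """Add two numbers stored least-significant-digit first, one digit per
    'cell', propagating carries the way the Pascaline's wheels do."""
    width = max(len(digits_a), len(digits_b))
    a = digits_a + [0] * (width - len(digits_a))
    b = digits_b + [0] * (width - len(digits_b))
    out, carry = [], 0
    for da, db in zip(a, b):
        total = da + db + carry
        out.append(total % base)   # the level this cell settles at
        carry = total // base      # passed on to the next cell
    if carry:
        out.append(carry)
    return out

print(pascaline_add([7, 4, 2], [5, 8, 9]))  # 247 + 985 = 1232 -> [2, 3, 2, 1]
```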