
    Trust models in ubiquitous computing

    No full text
    We recapture some of the arguments for trust-based technologies in ubiquitous computing, followed by a brief survey of some of the models of trust that have been introduced in this respect. Based on this, we argue for the need for more formal and foundational trust models.

    The New Generation of Computer Literacy

    Get PDF
    A tremendous mismatch is developing between two of the most critical components of any computer literacy course: the textbooks and the students. We are encountering a new generation of students (literally as well as figuratively!) who are much better acquainted with computer usage than their earlier counterparts. Yet many textbooks with increasing emphasis on those same computer tools continue to appear. There are signs of a coming change, in that a few authors and publishers apparently are becoming aware of the need for innovations in texts for non-scientists. These textbooks open the door to a new orientation to principles in the teaching of computer literacy.

    On Finding the Jaccard Center

    Get PDF
    We initiate the study of finding the Jaccard center of a given collection N of sets. For two sets X, Y, the Jaccard index is defined as |X ∩ Y| / |X ∪ Y| and the corresponding distance is 1 - |X ∩ Y| / |X ∪ Y|. The Jaccard center is a set C minimizing the maximum distance to any set of N. We show that the problem is NP-hard to solve exactly, and that it admits a PTAS while no FPTAS can exist unless P = NP. Furthermore, we show that the problem is fixed-parameter tractable in the maximum Hamming norm between the Jaccard center and any input set. Our algorithms are based on a compression technique similar in spirit to coresets for the Euclidean 1-center problem. In addition, we also show that, contrary to the previously studied median problem by Chierichetti et al. (SODA 2010), the continuous version of the Jaccard center problem admits a simple polynomial-time algorithm.
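    To make the objective concrete, here is a minimal Python sketch (illustrative only, not taken from the paper): it implements the Jaccard distance and evaluates the 1-center objective while restricting the candidate center to the input sets themselves. Since the Jaccard distance is a metric, this restriction yields a radius at most twice the optimum; the paper's algorithms address the exact and (1+ε)-approximate problem over arbitrary center sets.

        def jaccard_distance(x: frozenset, y: frozenset) -> float:
            # Jaccard distance 1 - |X ∩ Y| / |X ∪ Y|; two empty sets are at distance 0.
            if not x and not y:
                return 0.0
            return 1.0 - len(x & y) / len(x | y)

        def center_among_inputs(collection):
            # Pick the input set whose maximum distance to the collection is smallest.
            # This is a heuristic restriction, not the exact Jaccard center.
            sets = [frozenset(s) for s in collection]
            return min(sets, key=lambda c: max(jaccard_distance(c, s) for s in sets))

        N = [{1, 2, 3}, {2, 3, 4}, {1, 3, 5}]
        c = center_among_inputs(N)
        print(sorted(c), max(jaccard_distance(c, frozenset(s)) for s in N))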

    Chaotic Evolution in Quantum Mechanics

    Full text link
    A quantum system is described whose wave function has a complexity which increases exponentially with time. Namely, for any fixed orthonormal basis, the number of components required for an accurate representation of the wave function increases exponentially.

    Aiding Self-motivation with Readings in Introductory Computing

    Get PDF
    Students can achieve self-motivation and a broader appreciation of computing by reading widely about computing. This paper advocates discussing self-motivation with students and suggesting that they read widely as a means to that end. A discussion of how to present these ideas effectively, and an annotated list of suggested readings appropriate for undergraduate majors in computing, are included.

    Software engineering for 'quantum advantage'

    Get PDF
    Software is a critical factor in the reliability of computer systems. While the development of hardware is assisted by mature science and engineering disciplines, software science is still in its infancy. This situation is likely to worsen in the future with quantum computer systems. Indeed, while quantum computing is quickly coming of age, with potentially groundbreaking impact on many different fields, such benefits come at a price: quantum programming is hard, and finding new quantum algorithms is far from straightforward. Thus, the need for suitable formal techniques in quantum software development is even greater than in classical computation. A lack of reliable approaches to quantum computer programming will put the expected quantum advantage of the new hardware at risk. This position paper argues for the need for a proper quantum software engineering discipline, benefiting from precise foundations and calculi, capable of supporting algorithm development and analysis. This work was supported by ERDF, through the COMPETE 2020 Programme, and by FCT (Fundação para a Ciência e a Tecnologia), the Portuguese funding agency, within project POCI-01-0145-FEDER-030947.

    Law and Order in Algorithmics

    Get PDF
    An algorithm is the input-output effect of a computer program; mathematically, the notion of algorithm comes close to the notion of function. Just as arithmetic is the theory and practice of calculating with numbers, so is ALGORITHMICS the theory and practice of calculating with algorithms. Just as a law in arithmetic is an equation between numbers, like a(b+c) = ab + ac, so is a LAW in algorithmics an equation between algorithms. The goal of the research is to extend algorithmics by the systematic detection and use of laws for algorithms. To this end category theory (a branch of mathematics) is used to formalise the notion of algorithm, and to formally prove theorems and laws about algorithms.
    The underlying motivation for the research is the conviction that algorithmics may be of help in the construction of computer programs, just as arithmetic is of help in solving numeric problems. In particular, algorithmics provides the means to derive computer programs by calculation, from a given specification of the input-output effect.
    In Chapter 2 the systematic detection and use of laws is applied to category theory itself. The result is a way to conduct and present proofs in category theory that is an alternative to the conventional way (diagram chasing).
    In Chapters 3 and 4 several laws are formally derived in a systematic fashion. These laws facilitate calculating with those algorithms that are defined by induction on their input, or on their output. Technically, initial algebras and terminal co-algebras play a crucial role here.
    In Chapter 5 a category-theoretic formalisation of the notion of law itself is derived and investigated. This result provides a tool to formulate and prove theorems about laws-in-general, and, more specifically, about equationally specified datatypes.
    Finally, in Chapter 6 laws are derived for arbitrary recursive algorithms. Here the notion of ORDER plays a crucial role. The results are relevant for current functional programming languages.
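    As a small, self-contained illustration of what an equation between algorithms looks like (a generic example, not one drawn from the thesis), the Python sketch below checks the map-fusion law: mapping g over a list and then mapping f is the same algorithm as a single map of the composition of f and g, in the same way that a(b+c) = ab + ac is an equation between numbers.

        def compose(f, g):
            # Function composition: (f . g)(x) = f(g(x)).
            return lambda x: f(g(x))

        def map_list(f, xs):
            # List map written out as an explicit algorithm.
            out = []
            for x in xs:
                out.append(f(x))
            return out

        # Map-fusion law: map f . map g  =  map (f . g)
        f = lambda n: n + 1
        g = lambda n: 2 * n
        xs = [1, 2, 3, 4]
        assert map_list(f, map_list(g, xs)) == map_list(compose(f, g), xs)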

    The Semantic Web: Apotheosis of annotation, but what are its semantics?

    Get PDF
    This article discusses what kind of entity the proposed Semantic Web (SW) is, principally by reference to the relationship of natural language structure to knowledge representation (KR). There are three distinct views on this issue. The first is that the SW is basically a renaming of the traditional AI KR task, with all its problems and challenges. The second view is that the SW will be, at a minimum, the World Wide Web with its constituent documents annotated so as to yield their content, or meaning structure, more directly. This view makes natural language processing central as the procedural bridge from texts to KR, usually via some form of automated information extraction. The third view is that the SW is about trusted databases as the foundation of a system of Web processes and services. There is also a fourth view, which is much more difficult to define and discuss: if the SW just keeps moving as an engineering development and is lucky, then real problems won't arise. This article is part of a special issue called Semantic Web Update.

    Computational Theology: A Metaobject-Based Implementation of Models of Generalised Trinitarian Logic

    Get PDF
    This paper analyses an amazingly close analogy between models of generalised trinitarian logics on the one hand and class hierarchies in the field of object-oriented programming on the other, thus linking philosophy of religion and computer science. In order to bring out this analogy as clearly and precisely as possible, we utilise a metaobject protocol for the actual implementation of the theological models. These formal implementations lead to the insight that the analogy can be pushed even further, and we lay bare and analyse the close relation between the theological notion of subordination of divine persons and precedence in structures of multiple inheritance. The implementation of theoretical godheads finally leads to new metaobject programming techniques, thus underlining the cross-fertilisation between theology and computer science.
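    For readers less familiar with the object-oriented side of the analogy, the Python sketch below (generic class names, not the paper's metaobject protocol) shows how a precedence list, the method resolution order, decides which inherited definition wins under multiple inheritance; it is this kind of precedence that the paper relates to subordination among divine persons.

        class Base:
            def describe(self):
                return "base"

        class Left(Base):
            def describe(self):
                return "left"

        class Right(Base):
            def describe(self):
                return "right"

        class Derived(Left, Right):
            # No definition of its own: the method resolution order decides.
            pass

        print([c.__name__ for c in Derived.__mro__])  # Derived, Left, Right, Base, object
        print(Derived().describe())                    # 'left': Left precedes Right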