    Mapping Big Data into Knowledge Space with Cognitive Cyber-Infrastructure

    Big data research has attracted great attention in science, technology, industry and society. It is developing alongside the evolving scientific paradigm, the fourth industrial revolution, and transformational innovation in technologies. However, its nature and fundamental challenges have not yet been clearly recognized, and a methodology of its own has not been formed. This paper explores and answers the following questions: What is big data? What are the basic methods for representing, managing and analyzing big data? What is the relationship between big data and knowledge? Can we find a mapping from big data into knowledge space? What kind of infrastructure is required to support not only big data management and analysis but also knowledge discovery, sharing and management? What is the relationship between big data and the scientific paradigm? What are the nature and the fundamental challenge of big data computing? A multi-dimensional perspective is presented toward a methodology of big data computing.
    Comment: 59 pages

    Actor-network procedures: Modeling multi-factor authentication, device pairing, social interactions

    As computation spreads from computers to networks of computers, and migrates into cyberspace, it ceases to be globally programmable, but it remains programmable indirectly: network computations cannot be controlled, but they can be steered by local constraints on network nodes. The tasks of "programming" global behaviors through local constraints belong to the area of security. The "program particles" that assure that a system of local interactions leads towards some desired global goals are called security protocols. As computation spreads beyond cyberspace, into physical and social spaces, new security tasks and problems arise. As networks are extended by physical sensors and controllers, including humans, and interlaced with social networks, the engineering concepts and techniques of computer security blend with the social processes of security. These new connectors for computational and social software require a new "discipline of programming" of global behaviors through local constraints. Since the new discipline seems to be emerging from a combination of established models of security protocols with older methods of procedural programming, we use the name procedures for these new connectors, which generalize protocols. In the present paper we propose actor-networks as a formal model of computation in heterogeneous networks of computers, humans and their devices; and we introduce Procedure Derivation Logic (PDL) as a framework for reasoning about security in actor-networks. Along the way, we survey the guiding ideas of Protocol Derivation Logic (also PDL), which evolved through our work in security over the last 10 years. Both formalisms are geared towards graphic reasoning and tool support. We illustrate their workings by analysing a popular form of two-factor authentication, and a multi-channel device pairing procedure devised for this occasion.
    Comment: 32 pages, 12 figures, 3 tables; journal submission; extended references, added discussion
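
    To make the flavour of such a procedure concrete, the following minimal Python sketch models a two-factor login of the kind the paper analyses: a password travels over the primary cyber channel, and a one-time code is relayed by the human from a second device. This is our own illustration, not the paper's PDL formalism; all function names and parameters are hypothetical.

```python
# Hypothetical sketch of a two-factor authentication procedure: a password on
# the primary channel plus a one-time code delivered over a second channel.
# It illustrates the kind of multi-channel procedure the paper analyses; it is
# not the paper's formalism.
import hashlib
import hmac
import secrets

def start_login(password_db: dict, user: str, password: str):
    """First factor: verify the password, then issue a one-time code to be
    delivered over a separate channel (e.g. the user's phone)."""
    stored = password_db.get(user)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    if stored is None or not hmac.compare_digest(stored, supplied):
        return None
    return f"{secrets.randbelow(10**6):06d}"  # 6-digit code for the second channel

def finish_login(issued_code: str, typed_code: str) -> bool:
    """Second factor: the human relays the code from the device back to the
    server; constant-time comparison avoids a timing side channel."""
    return hmac.compare_digest(issued_code, typed_code)

# Usage: the server, the human and the phone act as nodes of an actor-network.
db = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
code = start_login(db, "alice", "correct horse")
assert code is not None and finish_login(code, code)
```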

    Advanced Cyberinfrastructure for Science, Engineering, and Public Policy

    Progress in many domains increasingly benefits from our ability to view systems through a computational lens, i.e., using computational abstractions of the domains, and from our ability to acquire, share, integrate, and analyze disparate types of data. These advances would not be possible without advanced data and computational cyberinfrastructure and tools for data capture, integration, analysis, modeling, and simulation. However, despite, and perhaps because of, advances in "big data" technologies for data acquisition, management and analytics, the other largely manual and labor-intensive aspects of the decision-making process, e.g., formulating questions, designing studies, organizing, curating, connecting, correlating and integrating cross-domain data, drawing inferences and interpreting results, have become the rate-limiting steps to progress. Advancing the capability and capacity for evidence-based improvements in science, engineering, and public policy requires support for (1) computational abstractions of the relevant domains, coupled with computational methods and tools for their analysis, synthesis, simulation, visualization, sharing, and integration; (2) cognitive tools that leverage and extend the reach of human intellect and partner with humans on all aspects of the activity; (3) nimble and trustworthy data cyberinfrastructures that connect and manage a variety of instruments, multiple interrelated data types and associated metadata, data representations, processes, protocols and workflows, and that enforce applicable security, data access and use policies; and (4) organizational and social structures and processes for collaborative and coordinated activity across disciplinary and institutional boundaries.
    Comment: A Computing Community Consortium (CCC) white paper, 9 pages. arXiv admin note: text overlap with arXiv:1604.0200

    Gaming security by obscurity

    Shannon sought security against the attacker with unlimited computational powers: *if an information source conveys some information, then Shannon's attacker will surely extract that information*. Diffie and Hellman refined Shannon's attacker model by taking into account the fact that real attackers are computationally limited. This idea became one of the greatest new paradigms in computer science, and led to modern cryptography. Shannon also sought security against the attacker with unlimited logical and observational powers, expressed through the maxim that "the enemy knows the system". This view is still endorsed in cryptography. The popular formulation, going back to Kerckhoffs, is that "there is no security by obscurity", meaning that the algorithms cannot be kept obscured from the attacker, and that security should rely only upon the secret keys. In fact, modern cryptography goes even further than Shannon or Kerckhoffs in tacitly assuming that *if there is an algorithm that can break the system, then the attacker will surely find that algorithm*. The attacker is no longer viewed as an omnipotent computer, but he is still construed as an omnipotent programmer. So the Diffie-Hellman step from unlimited to limited computational powers has not been extended into a step from unlimited to limited logical or programming powers. Is the assumption that all feasible algorithms will eventually be discovered and implemented really different from the assumption that everything that is computable will eventually be computed? The present paper explores some ways to refine the current models of the attacker, and of the defender, by taking into account their limited logical and programming powers. If the adaptive attacker actively queries the system to seek out its vulnerabilities, can the system gain some security by actively learning the attacker's methods and adapting to them?
    Comment: 15 pages, 9 figures, 2 tables; final version appeared in the Proceedings of New Security Paradigms Workshop 2011 (ACM 2011); typos corrected
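
    The closing question invites a toy model. The sketch below is our own illustration, not the paper's construction: a repeated hide-and-probe game in which the attacker probes where it has succeeded before, and the defender "learns the attacker's methods" by relocating a secret away from heavily probed slots. All parameters are invented for the example.

```python
# Toy hide-and-probe game (our illustration, not the paper's model): can a
# defender gain security by adapting to an adaptive attacker's probing habits?
import random
from collections import Counter

N_SLOTS = 8

def play(rounds: int = 1000, adaptive_defender: bool = True) -> float:
    attacker_hits = Counter()   # attacker's memory: where probes paid off
    defender_view = Counter()   # defender's memory: where probes were seen
    secret = random.randrange(N_SLOTS)
    breaches = 0
    for _ in range(rounds):
        # The attacker exploits its historically best slot, exploring 10% of the time.
        if attacker_hits and random.random() > 0.1:
            probe = attacker_hits.most_common(1)[0][0]
        else:
            probe = random.randrange(N_SLOTS)
        defender_view[probe] += 1
        if probe == secret:
            breaches += 1
            attacker_hits[probe] += 1
        # The adaptive defender relocates to the slot it has seen probed least.
        if adaptive_defender:
            secret = min(range(N_SLOTS), key=lambda s: defender_view[s])
    return breaches / rounds

print("static defender breach rate:  ", play(adaptive_defender=False))
print("adaptive defender breach rate:", play(adaptive_defender=True))
```

    By construction, the static defender is eventually located and then breached on almost every round, whereas the relocating defender keeps the breach rate near the random-guessing baseline, which is the kind of adaptive gain the paper's question asks about.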

    Active learning based laboratory towards engineering education 4.0

    Universities play an essential role in ensuring knowledge and the development of competencies in the current fourth industrial revolution, known as Industry 4.0. Industry 4.0 promotes a set of digital technologies that enable the convergence of information technology and operational technology towards smarter factories. Within this new framework, multiple initiatives are being carried out worldwide in response to this evolution, particularly from the engineering education point of view. In this regard, this paper introduces an initiative under way at the Technical University of Catalonia, Spain: the Industry 4.0 Technologies Laboratory, I4Tech Lab. The I4Tech laboratory represents a technological environment for the academic, research and industrial promotion of the related technologies. First, this work discusses some of the main aspects considered in the definition of so-called engineering education 4.0. Next, the proposed laboratory architecture and objectives, as well as the technologies considered, are explained. Finally, the basis of the proposed academic method, supported by an active-learning approach, is presented.
    Postprint (published version)

    Antifragility = Elasticity + Resilience + Machine Learning: Models and Algorithms for Open System Fidelity

    We introduce a model of the fidelity of open systems, fidelity being interpreted here as the compliance between corresponding figures of interest in two separate but communicating domains. A special case of fidelity is given by real-timeliness and synchrony, in which the figures of interest are physical time and the system's notion of time. Our model covers two orthogonal aspects of fidelity, the first focusing on a system's steady state and the second capturing that system's dynamic and behavioural characteristics. We discuss how the two aspects correspond respectively to elasticity and resilience, and we highlight each aspect's qualities and limitations. We then sketch the elements of a new model coupling both aspects of the first model and complementing them with machine learning. Finally, we conjecture that the new model may represent a first step towards compositional criteria for antifragile systems.
    Comment: Preliminary version submitted to the 1st International Workshop "From Dependable to Resilient, from Resilient to Antifragile Ambients and Systems" (ANTIFRAGILE 2014), https://sites.google.com/site/resilience2antifragile
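
    As a concrete reading of the real-timeliness special case, the following sketch (our assumption-laden illustration, not the paper's model) measures fidelity as the fraction of clock ticks whose physically elapsed duration complies with the system's intended tick length; the tolerance and sampling scheme are invented for the example.

```python
# Minimal numeric sketch of real-timeliness fidelity: compliance between the
# system's notion of a tick and physically elapsed time. The threshold and
# sampling scheme are our assumptions, not the paper's.
import time

def timeliness_fidelity(samples: int = 50, tick: float = 0.01,
                        tolerance: float = 0.005) -> float:
    """Fraction of ticks whose measured duration stays within `tolerance`
    of the intended duration; 1.0 means perfect compliance between the
    system's notion of time and physical time."""
    compliant = 0
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(tick)                       # system's notion: one tick elapsed
        elapsed = time.perf_counter() - start  # physical-domain measurement
        if abs(elapsed - tick) <= tolerance:
            compliant += 1
    return compliant / samples

print(f"real-timeliness fidelity: {timeliness_fidelity():.2f}")
```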