1,227 research outputs found

    Trustworthy content push

    Delivery of content to mobile devices gains increasing importance in industrial environments to support employees in the field. An important application is e-mail push services like the fashionable Blackberry. These systems face security challenges regarding the transport of data to, and the storage of data on, the end-user equipment. The emerging Trusted Computing technology offers new answers to these open questions.
    Comment: 4 pages, 4 EPS figures

    The AliEn system, status and perspectives

    AliEn is a production environment that implements several components of the Grid paradigm needed to simulate, reconstruct and analyse HEP data in a distributed way. The system is built around Open Source components, and uses the Web Services model and standard network protocols to implement the computing platform that is currently being used to produce and analyse Monte Carlo data at over 30 sites on four continents. The aim of this paper is to present the current AliEn architecture and outline its future developments in the light of emerging standards.
    Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics (CHEP03), La Jolla, CA, USA, March 2003, 10 pages, Word, 10 figures. PSN MOAT00

    Handling Confidential Data on the Untrusted Cloud: An Agent-based Approach

    Cloud computing allows shared computer and storage facilities to be used by a multitude of clients. While cloud management is centralized, the information resides in the cloud and information sharing can be implemented via off-the-shelf techniques for multiuser databases. Users, however, are wary of ceding full control over their sensitive data. Untrusted database-as-a-server techniques are neither readily extendable to the cloud environment nor easily understandable by non-technical users. To solve this problem, we present an approach in which agents share reserved data securely by means of simple grant-and-revoke permissions on shared data.
    Comment: 7 pages, 9 figures, Cloud Computing 201
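    The grant-and-revoke scheme the abstract describes can be illustrated with a minimal sketch. All names here (`SharedResource`, `grant`, `revoke`) are invented for illustration and are not taken from the paper; the sketch only shows the key idea that revocation rotates the resource key so previously issued copies stop working.

    ```python
    # Minimal sketch of grant-and-revoke sharing on protected cloud data.
    # Names and key handling are illustrative, not the paper's protocol.
    import os

    class SharedResource:
        """A data item whose key is re-issued whenever access is revoked."""
        def __init__(self, payload: bytes):
            self._key = os.urandom(16)          # per-resource symmetric key
            self._payload = payload
            self._granted: set[str] = set()     # agents holding the current key

        def grant(self, agent_id: str) -> bytes:
            # A real system would wrap the key with the agent's public key.
            self._granted.add(agent_id)
            return self._key

        def revoke(self, agent_id: str) -> None:
            # Rotating the key makes the revoked agent's copy useless;
            # the new key is then re-distributed to remaining grantees.
            self._granted.discard(agent_id)
            self._key = os.urandom(16)

        def can_read(self, agent_id: str, key: bytes) -> bool:
            return agent_id in self._granted and key == self._key

    res = SharedResource(b"reserved record")
    k = res.grant("agent-a")
    assert res.can_read("agent-a", k)
    res.revoke("agent-a")
    assert not res.can_read("agent-a", k)   # stale key no longer opens the data
    ```

    The design choice sketched here, re-keying on revocation, is one common way to make revocation effective against an untrusted store that may retain old ciphertexts.
    
    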

    Software architecture for modeling and distributing virtual environments


    Design of Participatory Virtual Reality System for visualizing an intelligent adaptive cyberspace

    The concept of 'Virtual Intelligence' is proposed as an intelligent adaptive interaction between the simulated 3-D dynamic environment and the 3-D dynamic virtual image of the participant in the cyberspace created by a virtual reality system. A system design for such interaction is realised utilising only a stereoscopic optical head-mounted LCD display with an ultrasonic head tracker, a pair of gesture-controlled fibre optic gloves and a speech recognition and synthesiser device, which are all connected to a Pentium computer. A 3-D dynamic environment is created by physically-based modelling and rendering in real-time, and by modification of existing object description files by a fractals-based Morph software. It is supported by an extensive library of audio and video functions, and functions characterising the dynamics of various objects. The multimedia database files so created are retrieved or manipulated by intelligent hypermedia navigation and intelligent integration with existing information. Speech commands control the dynamics of the environment and the corresponding multimedia databases. The concept of a virtual camera developed by Zelter as well as Thalmann and Thalmann, as automated by Noma and Okada, can be applied for dynamically relating the orientation and actions of the virtual image of the participant with respect to the simulated environment. Utilising the fibre optic gloves, gesture-based commands are given by the participant for controlling his 3-D virtual image using a gesture language. Optimal estimation methods and dataflow techniques enable synchronisation between the commands of the participant expressed through the gesture language and his 3-D dynamic virtual image. Utilising a framework, developed earlier by the author, for adaptive computational control of distributed multimedia systems, the data access required for the environment as well as the virtual image of the participant can be endowed with adaptive capability.

    Bio-signal data gathering, management and analysis within a patient-centred health care context

    The healthcare service is under pressure to do more with less, and changing the way the service is modelled could be the key to saving resources and increasing efficacy. This change could be made possible by patient-centric care models. Such a model would include straightforward, easy-to-use telemonitoring devices and a flexible data management structure. The structure would maintain its state by ingesting many sources of data, then tracking that data through cleaning and processing into models and estimates, yielding values the patient can use. By automating data management, the system can become less disease-focused and more health-focused: preventative in nature, and allowing patients to be more proactive and involved in their care. This work presents the development of a new device together with a data management and analysis system that utilises the data from this device and supports data processing, along with two examples of its use: signal-quality and blood-pressure estimation. This system could aid in the creation of patient-centric telecare systems.
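    The cleaning-and-processing stage mentioned above can be sketched with a toy signal-quality check. The thresholds and function names below are invented for illustration and are not taken from the thesis; the sketch only shows the pattern of gating a bio-signal window on a quality index before it feeds any downstream model.

    ```python
    # Illustrative pipeline stage: a simple signal-quality index (SQI)
    # that rejects bio-signal windows dominated by saturated samples.
    # Thresholds are invented for illustration only.
    def signal_quality(samples: list[float], lo: float = -5.0, hi: float = 5.0) -> float:
        """Return the fraction of samples inside the plausible range [lo, hi]."""
        if not samples:
            return 0.0
        ok = sum(1 for s in samples if lo <= s <= hi)
        return ok / len(samples)

    def clean(samples, lo=-5.0, hi=5.0, min_quality=0.9):
        """Pass a window downstream only if its quality clears the threshold."""
        if signal_quality(samples, lo, hi) < min_quality:
            return None                                 # reject: too noisy to model
        return [min(max(s, lo), hi) for s in samples]   # clip residual spikes
    ```

    Gating on quality this early keeps obviously unusable windows out of later estimators, which is one plausible reading of why the thesis pairs signal quality with blood-pressure estimation.
    
    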

    Multiprotocol Authentication Device for HPC and Cloud Environments Based on Elliptic Curve Cryptography

    Multifactor authentication is a relevant tool in securing IT infrastructures, combining two or more credentials. Smartcards and hardware tokens can be used to strengthen the authentication process, but they have some limitations. Users connect these devices to the client node to log in or request access to services. Alternatively, if an application wants to use these resources, its code has to be amended with bespoke solutions to provide access. Thanks to advances in system-on-chip devices, we can integrate cryptographically robust, low-cost solutions. In this work, we present an autonomous device that allows multifactor authentication in client–server systems in a transparent way, which facilitates its integration into High-Performance Computing (HPC) and cloud systems through a generic gateway. The proposed electronic token (eToken), based on the system-on-chip ESP32, provides an extra layer of security based on elliptic curve cryptography. Secure communications between elements use Message Queuing Telemetry Transport (MQTT) to facilitate their interconnection. We have evaluated different types of possible attacks and their impact on communications. The proposed system offers an efficient solution to increase security in access to services and systems.
    Spanish Ministry of Science, Innovation and Universities (MICINN) PGC2018-096663-B-C44
    European Union (EU)
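    The gateway/eToken exchange described above follows a familiar challenge–response shape. The sketch below is a stand-in, not the paper's protocol: it substitutes stdlib HMAC for the eToken's elliptic-curve signatures so it runs without external libraries, and the message layout and function names are invented for illustration.

    ```python
    # Challenge-response flow sketched for a gateway/token exchange.
    # HMAC stands in for the eToken's ECDSA signature; in the real system
    # the device would sign the nonce with its elliptic-curve private key
    # and publish the response over MQTT.
    import hashlib
    import hmac
    import json
    import os

    SECRET = os.urandom(32)   # shared secret provisioned at enrolment (illustrative)

    def gateway_challenge() -> bytes:
        """Gateway issues a fresh nonce (e.g. published on an auth topic)."""
        return os.urandom(16)

    def etoken_response(challenge: bytes) -> str:
        """Token authenticates the nonce (here: HMAC; on the device: ECDSA)."""
        tag = hmac.new(SECRET, challenge, hashlib.sha256).hexdigest()
        return json.dumps({"challenge": challenge.hex(), "tag": tag})

    def gateway_verify(challenge: bytes, message: str) -> bool:
        """Gateway accepts only a valid tag over the nonce it issued."""
        msg = json.loads(message)
        expected = hmac.new(SECRET, challenge, hashlib.sha256).hexdigest()
        return msg["challenge"] == challenge.hex() and hmac.compare_digest(msg["tag"], expected)

    nonce = gateway_challenge()
    assert gateway_verify(nonce, etoken_response(nonce))
    ```

    Binding the response to a fresh nonce is what blocks simple replay, one of the attack classes a design like this has to evaluate.
    
    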

    View-based textual modelling

    This work introduces the FURCAS approach, a framework for view-based textual modelling. FURCAS includes means that allow software language engineers to define partial and overlapping textual modelling languages. Furthermore, FURCAS provides an incremental update approach that enables modellers to work with multiple views on the same underlying model. The approach is validated against a set of formal requirements, as well as several industrial case studies showing its practical applicability.

    Extending functional databases for use in text-intensive applications

    This thesis continues research exploring the benefits of using functional databases based around the functional data model for advanced database applications, particularly those supporting investigative systems. This is a growing generic application domain covering areas such as criminal and military intelligence, which are characterised by significant data complexity, large data sets and the need for high-performance, interactive use. An experimental functional database language was developed to provide the requisite semantic richness. However, heavy use in a practical context has shown that language extensions and implementation improvements are required, especially in the crucial areas of string matching and graph traversal. In addition, an implementation on multiprocessor, parallel architectures is essential to meet the performance needs arising from existing and projected database sizes in the chosen application area. [Continues.]
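    The graph-traversal workload singled out above, e.g. link analysis over an entity graph in an investigative system, can be sketched as a plain bounded breadth-first search. The graph schema and names below are invented for illustration and are not drawn from the thesis.

    ```python
    # Bounded BFS over an entity-link graph: which entities are reachable
    # from a starting entity within a given number of links? (Illustrative
    # schema; an investigative system would traverse typed relationships.)
    from collections import deque

    def reachable(graph: dict[str, list[str]], start: str, max_hops: int) -> set[str]:
        """Entities reachable from `start` within `max_hops` links."""
        seen = {start}
        frontier = deque([(start, 0)])
        while frontier:
            node, hops = frontier.popleft()
            if hops == max_hops:
                continue                      # hop budget spent on this path
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, hops + 1))
        return seen

    links = {"a": ["b"], "b": ["c"], "c": ["d"]}
    ```

    It is exactly this kind of repeated frontier expansion over large data sets that motivates the parallel, multiprocessor implementation the abstract calls for.
    
    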