
    Enforcing Data Geolocation Policies in Public Cloud using Trusted Computing

    With continuing advances in technology, cloud computing keeps delivering solutions that automate and simplify complex computational tasks. Advantages such as no maintenance cost, accessibility, data backup, pay-per-use models, and virtually unlimited storage and processing power encourage individuals and businesses to migrate their workloads to the cloud. Despite these numerous advantages, the geolocation of data in the cloud environment is a major concern, since it affects performance and determines which government legislation applies to the data. Uncertainty about data geolocation can therefore raise compliance issues. In this work, we present a technique that allows users to restrict the geolocation of their data in the cloud environment. We use trusted computing mechanisms to remotely attest the host and its geolocation. In this model, the user uploads data whose decryption key is shared only with a third-party attestation server. After successful attestation, the decryption key is sealed to the TPM of the host, guaranteeing the authorized geolocation and platform state.
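The attestation-gated key release described above can be sketched as follows. This is a minimal illustration, not the paper's protocol: the PCR digest, region policy, and function names are assumptions, and a real deployment would verify a TPM quote signed by an attestation key and seal the key to the TPM's PCR state rather than compare plain values.

```python
import hashlib
import secrets

# Hypothetical policy held by the third-party attestation server.
EXPECTED_PCR_DIGEST = hashlib.sha256(b"known-good-platform-state").hexdigest()
ALLOWED_REGIONS = {"EU", "DE"}

def attest_and_release(host_pcr_digest: str, host_region: str, data_key: bytes) -> bytes:
    """Release the data decryption key only if the host's measured
    platform state and reported geolocation satisfy the user's policy."""
    if host_pcr_digest != EXPECTED_PCR_DIGEST:
        raise PermissionError("platform state attestation failed")
    if host_region not in ALLOWED_REGIONS:
        raise PermissionError("host geolocation outside policy")
    # In practice the key would be sealed to the host TPM so that it is
    # only usable while the PCRs retain the attested values.
    return data_key

data_key = secrets.token_bytes(32)
released = attest_and_release(EXPECTED_PCR_DIGEST, "DE", data_key)
assert released == data_key
```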

    State of The Art and Hot Aspects in Cloud Data Storage Security

    Along with the evolution of cloud computing and cloud storage towards maturity, researchers have analyzed an increasing range of cloud computing security aspects, data security being an important topic in this area. In this paper, we examine the state of the art in cloud storage security through an overview of selected peer-reviewed publications. We address the question of defining cloud storage security and its different aspects, as well as enumerate the main vectors of attack on cloud storage. The reviewed papers present techniques for key management and controlled disclosure of encrypted data in cloud storage, while novel ideas regarding secure operations on encrypted data and methods for protection of data in fully virtualized environments provide a glimpse of the toolbox available for securing cloud storage. Finally, new challenges such as emergent government regulation call for solutions to problems that did not receive enough attention in earlier stages of cloud computing, such as the geographical location of data. The methods presented in the papers selected for this review represent only a small fraction of the wide research effort within cloud storage security. Nevertheless, they serve as an indication of the diversity of problems that are being addressed.

    "One of our hosts in another country": Challenges of data geolocation in cloud storage

    Physical location of data in cloud storage is an increasingly urgent problem. In a short time, it has evolved from the concern of a few regulated businesses to an important consideration for many cloud storage users. One of the characteristics of cloud storage is fluid transfer of data both within and among the data centres of a cloud provider. However, this has weakened the guarantees with respect to control over data replicas, protection of data in transit and physical location of data. This paper addresses the lack of reliable solutions for data placement control in cloud storage systems. We analyse the currently available solutions and identify their shortcomings. Furthermore, we describe a high-level architecture for a trusted, geolocation-based mechanism for data placement control in distributed cloud storage systems, which is the basis of ongoing work to define the detailed protocol and a prototype of such a solution. This mechanism aims to provide granular control over the capabilities of tenants to access data placed on geographically dispersed storage units comprising the cloud storage.

    Horizon Report 2009

    The annual Horizon Report investigates, identifies, and classifies the emerging technologies that its panel of experts predicts will have an impact on teaching and learning, research, and creative production in higher education. It also examines the key trends that shape how these technologies will be used and the challenges they pose for the classroom. Each edition identifies six technologies or practices: two whose adoption is expected in the immediate future (one year or less), two expected in the mid term (two to three years), and two expected in the longer term (five years).

    Ancient Cartographies as a Basis for Geolocation Models in Public Space: The Case of Giambattista Nolli and its Heritage Application

    In 1748, the architect and surveyor Giambattista Nolli mapped an abstract reality of the city of Rome. Challenging the inherited projections, his plan represented the city by mixing streets, halls, corridors, churches, baths and markets as parts of a single public space network. The possibility of containing all kinds of public space, including the interiors of public buildings, in a single layer opened a new way to design public space and rethink the whole urban system. Despite this, Nolli's plan fell into disuse once the hegemony of automobile mobility emerged as the pre-eminent system. This research seeks to understand how applying the methodology of ancient cartographies can improve pedestrian mobility in historic cities by enhancing the graphic value of Giambattista Nolli's system. Today, open public space is represented as empty and built space as solid. This proposal would reverse that reified conception of the city, understanding this baroque representation as an instrument for the identification and assessment of transitional heritage. The clues unveiled by Nolli appear capable of integrating the plans of public buildings into the urban tissue, a step towards the full integration of cartography and mobility. The success of the comprehensive tools offered by large providers such as Alphabet Inc. (Google) or Bing Maps confirms the suitability of combining new technologies and Big Data with urban planning, towards the synchronisation of Smart Cities. Open public space can now be 'walked in' from any electronic device; consequently, applying the "Nolli methodology" would extend the model of urban geolocation to assimilate interior public spaces.
    In the creation of a great global map of public space, a chimaera can be glimpsed, and discussed within a tangible reality: every open public space is already housed in Big Data and accessible through geolocation tools. Including the interiors of public buildings would contribute to greater permeability between city and citizens. Furthermore, this representation would optimize pedestrian travel times and would be able to expand the geolocation system network as a documentary repository.

    Optimal interpolation of satellite and ground data for irradiance nowcasting at city scales

    We use a Bayesian method, optimal interpolation, to improve satellite-derived irradiance estimates at city scales using ground sensor data. Optimal interpolation requires error covariances in the satellite estimates and ground data, which define how information from the sensor locations is distributed across a large area. We describe three methods to choose such covariances, including a covariance parameterization that depends on the relative cloudiness between locations. Results are computed with ground data from 22 sensors over a 75×80 km area centered on Tucson, AZ, using two satellite-derived irradiance models. The improvements in standard error metrics for both satellite models indicate that our approach is applicable to additional satellite-derived irradiance models. We also show that optimal interpolation can nearly eliminate mean bias error and improve the root mean squared error by 50%.
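The optimal-interpolation update used above has the standard form x_a = x_b + K(y − Hx_b) with gain K = BHᵀ(HBHᵀ + R)⁻¹. The sketch below illustrates this on a toy 1-D grid; the covariance values and sensor layout are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

def optimal_interpolation(x_b, y, H, B, R):
    """Analysis x_a = x_b + K (y - H x_b), gain K = B H^T (H B H^T + R)^-1.
    x_b: background (satellite) field, y: ground observations,
    H: observation operator, B/R: background/observation error covariances."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_b + K @ (y - H @ x_b)

# Toy example: 5 grid points, ground sensors at grid points 1 and 3.
x_b = np.array([500.0, 510.0, 480.0, 470.0, 490.0])  # satellite estimate (W/m^2)
y = np.array([520.0, 465.0])                         # ground measurements
H = np.zeros((2, 5))
H[0, 1] = H[1, 3] = 1.0                              # picks out sensor locations
dist = np.abs(np.subtract.outer(np.arange(5.0), np.arange(5.0)))
B = 100.0 * np.exp(-dist / 2.0)                      # spatially correlated bkg errors
R = 4.0 * np.eye(2)                                  # independent sensor errors

x_a = optimal_interpolation(x_b, y, H, B, R)
# The analysis moves toward the observations at the sensor locations and
# spreads the correction to neighbouring grid points through B.
```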

    Earth observations from DSCOVR EPIC instrument

    The National Oceanic and Atmospheric Administration (NOAA) Deep Space Climate Observatory (DSCOVR) spacecraft was launched on 11 February 2015 and in June 2015 achieved its orbit at the first Lagrange point (L1), 1.5 million km from Earth toward the sun. There are two National Aeronautics and Space Administration (NASA) Earth-observing instruments on board: the Earth Polychromatic Imaging Camera (EPIC) and the National Institute of Standards and Technology Advanced Radiometer (NISTAR). The purpose of this paper is to describe various capabilities of the DSCOVR EPIC instrument. EPIC views the entire sunlit Earth from sunrise to sunset at the backscattering direction (scattering angles between 168.5° and 175.5°) with 10 narrowband filters: 317, 325, 340, 388, 443, 552, 680, 688, 764, and 779 nm. We discuss a number of preprocessing steps necessary for EPIC calibration, including the geolocation algorithm and the radiometric calibration for each wavelength channel in terms of EPIC counts per second for conversion to reflectance units. The principal EPIC products are total ozone (O3) amount, scene reflectivity, erythemal irradiance, ultraviolet (UV) aerosol properties, sulfur dioxide (SO2) for volcanic eruptions, surface spectral reflectance, vegetation properties, and cloud products including cloud height. Finally, we describe the observation of horizontally oriented ice crystals in clouds and the unexpected use of the O2 B-band absorption for vegetation properties.
    The NASA GSFC DSCOVR project is funded by the NASA Earth Science Division. We gratefully acknowledge the work by S. Taylor and B. Fisher for help with the SO2 retrievals and Marshall Sutton, Carl Hostetter, and the EPIC NISTAR project for help with EPIC data. We also would like to thank the EPIC Cloud Algorithm team, especially Dr. Gala Wind, for the contribution to the EPIC cloud products. Accepted manuscript.

    IDMoB: IoT Data Marketplace on Blockchain

    Today, Internet of Things (IoT) devices are the powerhouse of data generation with their ever-increasing numbers and widespread penetration. Similarly, artificial intelligence (AI) and machine learning (ML) solutions are being integrated into all kinds of services, making products significantly "smarter". The centerpiece of these technologies is data. IoT device vendors should be able to keep up with the increased throughput and come up with new business models, while AI/ML solutions will produce better results if training data is diverse and plentiful. In this paper, we propose a blockchain-based, decentralized and trustless data marketplace where IoT device vendors and AI/ML solution providers may interact and collaborate. By facilitating a transparent data exchange platform, access to consented data will be democratized and the variety of services targeting end-users will increase. The proposed data marketplace is implemented as a smart contract on the Ethereum blockchain, and Swarm is used as the distributed storage platform.
    Comment: Presented at Crypto Valley Conference on Blockchain Technology (CVCBT 2018), 20-22 June 2018 - published version may differ.

    Extending P4 in-band telemetry to user equipment for latency-and localization-aware autonomous networking with AI forecasting

    In beyond-5G networks, detailed end-to-end monitoring of specific application traffic will be required along the access-backhaul-cloud continuum to enable low-latency services via local edge steering. Current monitoring solutions are confined to specific network segments. In-band network telemetry (INT) technologies for software-defined network (SDN) programmable data planes based on the P4 language are effective in the backhaul network segment, although limited to inter-switch latency; therefore, link latencies including wireless and optical segments are excluded from INT monitoring. Moreover, information such as user equipment (UE) geolocation would allow detailed mobility monitoring and improved cloud-edge steering policies. However, the synchronization between latency and location information, typically provided by different platforms, is hard to achieve with current monitoring systems. In this paper, P4-based INT is proposed to be thoroughly extended to involve the UE. The INT mechanism is designed to provide synchronized and accurate end-to-end latency and geolocation information, enabling decentralized steering policies, i.e., involving the UE and selected switches, without SDN controller intervention. The proposal also includes an artificial-intelligence-assisted forecast system able to predict latency and geolocation in advance and trigger faster edge steering.
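The idea of extending per-hop INT metadata to the UE so that latency and position stay synchronized can be sketched as below. The field and function names are illustrative assumptions, not the P4-INT wire format or the paper's protocol.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class IntHop:
    """One INT metadata entry appended by a hop (UE, switch, or edge node)."""
    node_id: str
    ingress_ns: int
    egress_ns: int

@dataclass
class UeReport:
    """INT report extended with UE geolocation, so latency and position
    come from the same packet and need no cross-platform synchronization."""
    hops: List[IntHop] = field(default_factory=list)
    lat: float = 0.0
    lon: float = 0.0

def end_to_end_latency_ns(report: UeReport) -> int:
    """Total path latency from first ingress to last egress timestamp."""
    return report.hops[-1].egress_ns - report.hops[0].ingress_ns

def per_hop_residence_ns(report: UeReport) -> dict:
    """Per-hop residence times, localizing where delay accumulates."""
    return {h.node_id: h.egress_ns - h.ingress_ns for h in report.hops}

report = UeReport(
    hops=[IntHop("ue", 0, 100), IntHop("sw1", 400, 450), IntHop("edge", 900, 950)],
    lat=43.72, lon=10.40,
)
assert end_to_end_latency_ns(report) == 950
```

A steering policy at the UE or a selected switch could then act on both `end_to_end_latency_ns` and the reported position without consulting the SDN controller.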

    Shortcuts through Colocation Facilities

    Network overlays, running on top of the existing Internet substrate, are of perennial value to Internet end-users in the context of, e.g., real-time applications. Such overlays can employ traffic relays to yield path latencies lower than the direct paths, a phenomenon known as Triangle Inequality Violation (TIV). Past studies identify the opportunities of reducing latency using TIVs. However, they do not investigate the gains of strategically selecting relays in Colocation Facilities (Colos). In this work, we answer the following questions: (i) how do Colo-hosted relays compare with other relays, as well as with the direct Internet, in terms of latency (RTT) reductions; (ii) what are the best locations for placing the relays to yield these reductions. To this end, we conduct a large-scale one-month measurement of inter-domain paths between RIPE Atlas (RA) nodes as endpoints, located at eyeball networks. We employ as relays PlanetLab nodes, other RA nodes, and machines in Colos. We examine the RTTs of the overlay paths obtained via the selected relays, as well as the direct paths. We find that Colo-based relays perform the best and can achieve latency reductions against direct paths, ranging from a few to 100s of milliseconds, in 76% of the total cases; 75% (58% of total cases) of these reductions require only 10 relays in 6 large Colos.
    Comment: In Proceedings of the ACM Internet Measurement Conference (IMC '17), London, GB, 2017.
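The TIV condition that makes a relay a "shortcut" is simply rtt(a,r) + rtt(r,b) < rtt(a,b). The following sketch selects the best relay from an RTT matrix; the endpoint names and latency values are illustrative, not measurements from the study.

```python
# Detecting Triangle Inequality Violations (TIVs) in a symmetric RTT matrix:
# a relay r shortcuts the direct path a-b whenever
#   rtt[a][r] + rtt[r][b] < rtt[a][b].
rtt = {
    ("A", "B"): 120.0,  # direct path (ms)
    ("A", "R"): 30.0, ("R", "B"): 40.0,   # R: a hypothetical Colo-hosted relay
    ("A", "C"): 50.0, ("C", "B"): 90.0, ("R", "C"): 45.0,
}

def d(x: str, y: str) -> float:
    """Symmetric lookup of the measured RTT between x and y."""
    return rtt.get((x, y)) or rtt.get((y, x))

def best_relay(a: str, b: str, relays: list):
    """Return (relay, overlay_rtt) if a relay beats the direct path,
    else (None, direct_rtt)."""
    best = (None, d(a, b))
    for r in relays:
        overlay = d(a, r) + d(r, b)
        if overlay < best[1]:
            best = (r, overlay)
    return best

relay, latency = best_relay("A", "B", ["R", "C"])
# R yields 30 + 40 = 70 ms, a TIV against the 120 ms direct path.
assert relay == "R" and latency == 70.0
```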