4 research outputs found

    Citizens as Sensors for Crisis Events: Sensor Web Enablement for Volunteered Geographic Information

    One development within the field of geosensors is to engage citizens to act as sensors, thus providing so-called Volunteered Geographic Information (VGI). There is a long tradition of non-specialists contributing to the collection of geo-referenced information. Furthermore, thanks to the recent convergence of greater access to broadband connections, the availability of Global Positioning Systems at affordable prices, and more participative forms of interaction on the Web (Web 2.0), vast numbers of individuals are able to create and share geographic information. The potential of up to 6 billion human sensors to monitor the state of the environment, validate global models with local knowledge, contribute to awareness of crisis situations, and provide information that only humans can capture (e.g. emotions and perceptions such as fear of crime) is vast and has yet to be fully exploited. However, integrating VGI into Spatial Data Infrastructures (SDI) is a major challenge, as it is often regarded as insufficiently structured, documented, or validated according to scientific standards. Early instances of SDIs had limited ability to manage and process geosensor-based data (beyond remotely sensed imagery snapshots), which tend to arrive in continuous streams of real-time information. Current work on standards for Sensor Web Enablement (SWE) aims to fill this gap. This paper shows how such SWE standards can be applied to VGI, thus converting it into a timely, cost-effective and valuable source of information for SDIs. By doing so, we extend previous work describing a workflow for VGI integration into SDI and further advance an initial set of VGI Sensing and event detection techniques. In particular, an example of how such VGI Sensing techniques can support crisis information systems is provided.
    JRC.DDG.H.6 - Spatial data infrastructure
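
    As a rough illustration of how a volunteered report could be encoded against the SWE Observations & Measurements model, the minimal sketch below maps a single geolocated citizen message onto an O&M-style observation structure. The VGIReport container and the field names are illustrative assumptions, not the schema used in the paper; a real deployment would insert such records into a Sensor Observation Service.

```python
# Illustrative sketch only: encode one geolocated citizen report as an
# O&M-style observation. Field names are assumptions for readability,
# not the paper's actual schema.
from dataclasses import dataclass
from datetime import datetime, timezone
import json


@dataclass
class VGIReport:
    """A single volunteered report, e.g. a geotagged message."""
    user_id: str          # the citizen acting as a "sensor"
    lat: float
    lon: float
    text: str
    observed_at: datetime


def to_om_observation(report: VGIReport, observed_property: str) -> dict:
    """Map a VGI report onto an O&M-like observation structure."""
    return {
        "type": "Observation",
        "procedure": f"citizen-sensor:{report.user_id}",   # who produced the observation
        "observedProperty": observed_property,              # e.g. a crisis phenomenon
        "phenomenonTime": report.observed_at.isoformat(),
        "featureOfInterest": {
            "geometry": {"type": "Point", "coordinates": [report.lon, report.lat]},
        },
        "result": report.text,                              # free-text human observation
    }


if __name__ == "__main__":
    report = VGIReport(
        user_id="user-42",
        lat=45.07, lon=7.68,
        text="River overflowing near the old bridge",
        observed_at=datetime(2011, 3, 1, 14, 30, tzinfo=timezone.utc),
    )
    print(json.dumps(to_om_observation(report, "flooding"), indent=2))
```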

    Citizen-based sensing of crisis events: sensor web enablement for volunteered geographic information

    Thanks to the recent convergence of greater access to broadband connections, the availability of Global Positioning Systems in small packages at affordable prices, and more participative forms of interaction on the Web (Web 2.0), vast numbers of individuals have become able to create and share Volunteered Geographic Information (VGI). The potential of up to six billion persons to monitor the state of the environment, validate global models with local knowledge, contribute to awareness of crisis situations, and provide information that only humans can capture is vast and has yet to be fully exploited. Integrating VGI into Spatial Data Infrastructures (SDI) is a major challenge, as it is often regarded as insufficiently structured, documented, or validated according to scientific standards. Early instances of SDIs had limited ability to manage and process geosensor-based data (beyond remotely sensed imagery), which tend to arrive in continuous streams of real-time information. Current work on standards for Sensor Web Enablement fills this gap. This paper shows how such standards can be applied to VGI, thus converting it into a timely, cost-effective and valuable source of information for SDIs. By doing so, we extend previous efforts describing a workflow for VGI integration into SDI and further advance an initial set of VGI Sensing and event detection techniques. Examples of how such VGI Sensing techniques can support crisis information systems are provided. The presented approach provides central building blocks for a Digital Earth's nervous system, which is required to develop the next generation of (geospatial) information infrastructures.
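
    The abstract refers to VGI Sensing and event detection techniques without detailing them. The sketch below shows one generic, simplified idea of such detection, not the technique developed in the paper: count keyword-matching geotagged reports per coarse grid cell and hourly window, and flag cells that exceed a threshold. Grid size, window and threshold are arbitrary placeholder values.

```python
# Toy event-detection sketch over a stream of geotagged reports; a generic
# illustration, not the paper's VGI Sensing method.
from collections import Counter
from datetime import datetime


def grid_cell(lat: float, lon: float, cell_deg: float = 0.1) -> tuple:
    """Snap a coordinate to a coarse grid cell."""
    return (round(lat / cell_deg), round(lon / cell_deg))


def detect_events(reports, keyword: str, threshold: int = 5):
    """Return (cell, hour, count) triples with unusually many keyword reports."""
    counts = Counter()
    for ts, lat, lon, text in reports:
        if keyword.lower() in text.lower():
            hour = ts.replace(minute=0, second=0, microsecond=0)  # hourly time bucket
            counts[(grid_cell(lat, lon), hour)] += 1
    return [(cell, hour, n) for (cell, hour), n in counts.items() if n >= threshold]


if __name__ == "__main__":
    t = datetime(2011, 3, 1, 14, 5)
    reports = [(t, 45.07 + 0.001 * i, 7.68, "Flood near the old bridge") for i in range(6)]
    print(detect_events(reports, "flood"))
```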

    Doping as a Possible Means to create Superconductivity in Graphene

    The possibility of creating superconductivity in Highly Oriented Pyrolytic Graphite (HOPG) by means of doping was investigated. Bulk HOPG samples were doped with phosphorus using either ion implantation or Chemical Vapor Deposition growth with phosphine in the gas mixture. Once doped, the samples were tested by performing resistance versus temperature (R vs. T) measurements and by observing whether superconductive characteristics were suppressed in a strong applied magnetic field, signaling the presence of the Meissner effect. Before doping, the R vs. T characteristic of the HOPG was measured. The R vs. T characteristic was measured again after doping, and for surface multilayers of graphene exfoliated from the post-doping bulk sample. A 100 to 350 mT magnetic field was applied, and the R vs. T characteristic was re-measured on a number of samples. Phosphorus-implanted HOPG samples exhibit deviations from the expected rise in resistance as the temperature is reduced to some point above 100 K. The application of a modest magnetic field reverses this trend. A step in resistance at a temperature of approximately 50-60 K is clearly observed in all of the samples, as well as a second step at 100-120 K, a third at 150-180 K, and a fourth at about 200-240 K. A response consistent with the presence of magnetic-flux pancake vortices has been observed in phosphorus-implanted HOPG and in phosphorus-doped exfoliated multilayer graphene. The lack of zero resistance at low temperatures is also consistent with pancake vortex behaviour in the flux-flow regime. The presence of magnetic vortices requires, and is direct evidence of, superconductivity.
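
    As a purely illustrative aside, step-like features such as those described above can be located numerically in R vs. T data. The sketch below does this on synthetic data by thresholding the derivative dR/dT; it is a generic signal-processing sketch and not the analysis used in the thesis.

```python
# Generic sketch: locate step-like features in an R vs. T curve by flagging
# temperatures where |dR/dT| is unusually large. Synthetic data only.
import numpy as np


def find_resistance_steps(temperature, resistance, n_sigma: float = 3.0):
    """Return temperatures where |dR/dT| exceeds n_sigma times its std dev."""
    dRdT = np.gradient(resistance, temperature)
    threshold = n_sigma * np.std(dRdT)
    return temperature[np.abs(dRdT) > threshold]


if __name__ == "__main__":
    # Synthetic R(T): a smooth background plus an artificial step near 55 K.
    T = np.linspace(10, 300, 600)
    R = 1.0 + 0.002 * T + 0.05 * (T > 55)
    print(find_resistance_steps(T, R))
```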

    Selected Applications in Data Intensive Computing

    As science and technology advance, vast amounts of data are generated every day, at an exponentially growing rate, through different means such as scientific instruments, computer simulations, and many other methods. How to mine valuable nuggets of knowledge from such large amounts of data efficiently, in order to make informed decisions, is challenging. However, the development of distributed computing techniques and high-speed networks provides good opportunities to solve big data problems. In this thesis, I focus on developing data-intensive computing algorithms and applying data mining methods to analyze massive biological and medical data in cloud computing environments. There are many approaches to parallelizing an existing data mining algorithm in a cloud computing environment, and achieving better performance by manipulating data in an intelligent way has attracted a lot of attention. In this thesis, I propose two different approaches to parallelize the existing random decision tree algorithm; both have been implemented in the Sector/Sphere cloud environment. Comparisons of cost and accuracy between these two implementations are also presented. Recently, with the development of ChIP-chip and ChIP-seq technology, huge amounts of genome-wide protein-DNA binding site data have become available for many transcription factors and chromatin regulators across many species. Previous studies have already shown that the distribution of their localizations and modifications can offer novel insight into the mechanisms of regulation. As it is strongly believed that multiple chromatin factors can work together to regulate a common target, I formally define this problem and propose a novel graph-based algorithm called Patterns of Marks (PoM) to efficiently identify these types of geometric patterns in massive genomic data. In addition, as the amount of data grows it becomes impossible to integrate data manually; therefore, I propose two algorithms to automatically integrate big tabular data. I also conduct an experimental study by developing a customizable lightweight web crawler to collect various data from the Internet.
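
    The thesis's parallel random decision tree implementations run on Sector/Sphere, which is not reproduced here. The minimal sketch below only illustrates the general data-parallel idea: partition the training data, train one randomized tree per partition in parallel, and combine predictions by majority vote. It assumes scikit-learn's randomized decision tree as a stand-in for the random decision tree algorithm and Python's ProcessPoolExecutor in place of the cloud environment.

```python
# Data-parallel sketch, not the thesis's Sector/Sphere implementation:
# one randomized tree per data partition, combined by majority vote.
# Assumes scikit-learn and NumPy are available.
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from sklearn.tree import DecisionTreeClassifier


def train_partition(args):
    """Train one randomized decision tree on a single data partition."""
    X_part, y_part = args
    tree = DecisionTreeClassifier(splitter="random", max_features="sqrt", random_state=0)
    return tree.fit(X_part, y_part)


def majority_vote(trees, X):
    """Combine per-partition trees by majority vote over their predictions."""
    votes = np.stack([t.predict(X) for t in trees])
    return np.apply_along_axis(lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    n_partitions = 4
    parts = list(zip(np.array_split(X, n_partitions), np.array_split(y, n_partitions)))
    with ProcessPoolExecutor(max_workers=n_partitions) as pool:
        trees = list(pool.map(train_partition, parts))

    print("training accuracy:", (majority_vote(trees, X) == y).mean())
```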