
    Global-Scale Resource Survey and Performance Monitoring of Public OGC Web Map Services

    One of the most widely implemented service standards provided by the Open Geospatial Consortium (OGC) to the user community is the Web Map Service (WMS). WMS is widely employed globally, but there is limited knowledge of the global distribution, adoption status, or service quality of these online WMS resources. To fill this void, we investigated global WMS resources and performed distributed performance monitoring of these services. This paper explicates a crawling method to discover WMSs and a distributed monitoring framework that was used to monitor 46,296 WMSs continuously for over one year. We analyzed server locations, provider types, themes, the spatiotemporal coverage of map layers, and the service versions for 41,703 valid WMSs. Furthermore, we appraised the stability and performance of the basic operations (i.e., GetCapabilities and GetMap) for 1,210 selected WMSs. We discuss the major reasons for request errors and performance issues, as well as the relationship between service response times and the spatiotemporal distribution of client monitoring sites. This paper will help service providers, end users, and developers of standards to grasp the status of global WMS resources and to understand the adoption status of OGC standards. The conclusions drawn in this paper can benefit geospatial resource discovery and service performance evaluation, and guide service performance improvements. Comment: 24 pages; 15 figures
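    The monitoring described above revolves around timing the basic WMS operations. The paper's own probe code is not reproduced here; a minimal sketch of what one GetCapabilities probe might look like, using only the Python standard library (the endpoint URL and function names are hypothetical):

```python
import time
import urllib.parse
import urllib.request

def capabilities_url(base_url, version="1.3.0"):
    """Build a WMS GetCapabilities request URL for a given service endpoint."""
    params = urllib.parse.urlencode({
        "SERVICE": "WMS",
        "REQUEST": "GetCapabilities",
        "VERSION": version,
    })
    return f"{base_url}?{params}"

def probe_wms(base_url, timeout=30):
    """Issue the request and return (elapsed_seconds, http_status).

    The response body is fully read so the measured time covers the
    whole transfer, not just the time to first byte.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(capabilities_url(base_url), timeout=timeout) as resp:
        resp.read()
        return time.perf_counter() - start, resp.status
```

    A GetMap probe would differ only in the request parameters (adding LAYERS, BBOX, CRS, WIDTH, HEIGHT, and FORMAT); running such probes from geographically distributed clients yields the response-time distributions the paper analyzes.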

    GeoFog4Health: a fog-based SDI framework for geospatial health big data analysis

    Spatial Data Infrastructure (SDI) is an important framework for sharing geospatial big data using the web. The integration of SDI with cloud computing led to the emergence of Cloud-SDI as a tool for the transmission, processing, and analysis of geospatial data. Fog computing is a paradigm in which embedded computers are employed to increase throughput and reduce latency at the edge of the network. In this study, we developed and evaluated a fog-based SDI framework named GeoFog4Health for mining analytics from geo-health big data. We built prototypes using the Intel Edison and Raspberry Pi to study their comparative performance. We conducted a case study on maps of malaria (a vector-borne disease) cases in Maharashtra state, India. The proposed framework provides lossless data compression for reduced data transfer, and overlay analysis of geospatial data was also implemented. In addition, we discuss the energy savings, cost, and scalability of the proposed framework with respect to efficient data processing. We compared the performance of the proposed framework with the state-of-the-art Cloud-SDI in terms of analysis time. The results show the efficacy of the proposed system for enhanced analysis of geo-health big data generated from a variety of sensing frameworks.
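    The lossless-compression step mentioned above can be illustrated with a short sketch. The abstract does not specify the codec or data format, so the following assumes GeoJSON-style features and zlib as the compressor (both are assumptions; the function names are hypothetical):

```python
import json
import zlib

def pack_layer(features):
    """Serialize a GeoJSON-style feature list and compress it losslessly
    before transferring it from a fog node to the cloud."""
    doc = {"type": "FeatureCollection", "features": features}
    return zlib.compress(json.dumps(doc).encode("utf-8"), 9)

def unpack_layer(blob):
    """Invert pack_layer: decompress and deserialize, recovering the
    original features exactly (no information is lost)."""
    return json.loads(zlib.decompress(blob).decode("utf-8"))["features"]
```

    Because the compression is lossless, the cloud side recovers the exact feature set for overlay analysis, while the fog node ships fewer bytes over the network.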

    Cloud data security and various cryptographic algorithms

    Cloud computing has spread widely among different organizations due to its advantages, such as cost reduction, resource pooling, broad network access, and ease of administration. It increases the capabilities of physical resources by optimizing shared use. Clients’ valuable assets (data and applications) are moved outside of regulatory supervision into a shared environment where many clients are grouped together. However, this process poses security concerns, such as sensitive information theft and personally identifiable data leakage. Many researchers have contributed to reducing the problem of data security in cloud computing by developing a variety of technologies to secure cloud data, including encryption. In this study, a set of encryption algorithms (the Advanced Encryption Standard (AES), the Data Encryption Standard (DES), Blowfish, Rivest-Shamir-Adleman (RSA) encryption, and the International Data Encryption Algorithm (IDEA)) was compared in terms of security, data encipherment capacity, memory usage, and encipherment time to determine the optimal algorithm for securing cloud information from hackers. The results show that RSA and IDEA are less secure than AES, Blowfish, and DES. The AES algorithm encrypts a large amount of data, takes the least encipherment time, and is faster than the other algorithms, while the Blowfish algorithm requires the least memory.
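    The encipherment-time metric used in the comparison can be sketched as a small timing harness. The paper's benchmark code is not given, so this sketch uses a toy XOR stream cipher purely as a stand-in for AES, DES, Blowfish, RSA, or IDEA (the XOR cipher is not secure, and all names here are illustrative):

```python
import time

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher used purely as a timing stand-in.
    It is NOT secure and does not model the internals of real ciphers."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def measure_encipherment(cipher, data, key, runs=5):
    """Return the best-of-N wall-clock time for one encipherment pass,
    alongside the ciphertext, mirroring an encipherment-time benchmark."""
    best = float("inf")
    out = b""
    for _ in range(runs):
        start = time.perf_counter()
        out = cipher(data, key)
        best = min(best, time.perf_counter() - start)
    return best, out
```

    Feeding each real algorithm's encrypt function into the same harness, with identical plaintexts and repeated runs, is one way to obtain the comparable encipherment times the study reports.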

    Technologies and Applications for Big Data Value

    This open access book explores cutting-edge solutions and best practices for big data and data-driven AI applications in the data-driven economy. It provides the reader with a basis for understanding how technical issues can be overcome to offer real-world solutions to major industrial areas. The book starts with an introductory chapter that provides an overview of the book by positioning the following chapters in terms of their contributions to technology frameworks which are key elements of the Big Data Value Public-Private Partnership and the upcoming Partnership on AI, Data and Robotics. The remainder of the book is arranged in two parts. The first part, “Technologies and Methods”, contains horizontal contributions of technologies and methods that enable data value chains to be applied in any sector. The second part, “Processes and Applications”, details experience reports and lessons from using big data and data-driven approaches in processes and applications. Its chapters are co-authored with industry experts and cover domains including health, law, finance, retail, manufacturing, mobility, and smart cities. Contributions emanate from the Big Data Value Public-Private Partnership and the Big Data Value Association, which have acted as the European data community's nucleus to bring together businesses with leading researchers to harness the value of data to benefit society, business, science, and industry. The book is of interest to two primary audiences: first, undergraduate and postgraduate students and researchers in various fields, including big data, data science, data engineering, and machine learning and AI; and second, practitioners and industry experts engaged in data-driven systems and software design and deployment projects who are interested in employing these advanced methods to address real-world problems.

    Guidelines for Cloud Computing architecture: development process

    Cloud computing (CC) has received significant attention from businesses and industries of all types and has emerged as a new utility for business activities. The philosophy behind CC shows great potential to transform a major part of the IT industry, making computing environments and resources even more attractive as cost-effective services and changing the way IT hardware is designed and purchased. As a result, more and more small, medium, and large enterprises are adopting different types of CC services. However, the strongly competitive market environment for migrating existing IT services to a CC environment imposes various challenges on the CC architect. Developing a CC architecture in any organisation is a very complex process, and success depends on proper architecture design and development according to business requirements. This paper identifies the major key factors from the literature and provides guidelines for organisations to support the CC architecture development process. Finally, it explains how the different types of CC services that make up a CC architecture work together, and provides guidance on situations where specific types of CC services are not the best option for an organisation.

    Cloud technology options towards Free Flow of Data

    This whitepaper collects the technology solutions that the projects in the Data Protection, Security and Privacy Cluster propose to address the challenges raised by the working areas of the Free Flow of Data initiative. The document describes the technologies, methodologies, models, and tools researched and developed by the clustered projects, mapped to the ten areas of work of the Free Flow of Data initiative. The aim is to facilitate the identification of the state of the art of technology options for solving the data security and privacy challenges posed by the Free Flow of Data initiative in Europe. The document gives reference to the Cluster, the individual projects, and the technologies produced by them.