
    Case Studies on the Exploitation of Crowd-Sourcing with Web 2.0 Functionalities

    Crowd-sourcing appears more promising with Web 2.0 functionality, and businesses have started using it for a wide range of activities that are better completed by a crowd than by any specific pool of knowledge workers. However, relatively little is known about how a business can leverage collective intelligence and capture user-generated value for competitive advantage. This empirical study uses the principles of interpretive field research to validate the case findings with a descriptive multiple-case-study methodology. An extended theoretical framework is proposed to identify the important considerations, at the strategic and functional levels, for the effective use of crowd-sourcing. The analytic framework uses five Business Strategy Components: Vision and Strategy, Human Capital, Infrastructure, Linkage and Trust, and External Environment. It also uses four Web 2.0 Functional Components: Social Networking, Interaction Orientation, Customization & Personalization, and User-added Value. Using these components as analytic lenses, the case research examines how successful e-commerce firms may deploy Web 2.0 functionalities for the effective use of crowd-sourcing. Prioritizing these functional considerations may be favorable in some cases to best fit particular situations and limitations. In conclusion, it is important that the alignment between the strategy and functional components is maintained.

    OGC SWE-based Data Acquisition System Development for EGIM on EMSODEV EU Project

    EMSODEV[1] (European Multidisciplinary Seafloor and water column Observatory DEVelopment) is an EU project whose general objective is the full implementation and operation of the EMSO distributed Research Infrastructure (RI), through the development, testing, and deployment of an EMSO Generic Instrument Module (EGIM). This research infrastructure will provide accurate records of marine environmental changes from distributed local nodes around Europe. These observations are critical for responding accurately to social and scientific challenges such as climate change, changes in marine ecosystems, and marine hazards. In this paper we present the design and development of the EGIM data acquisition system. EGIM is able to operate on any EMSO node: mooring line, seabed station (cabled or non-cabled), and surface buoy. In fact, a central function of EGIM within the EMSO infrastructure is to have a number of ocean locations where the same set of core variables is measured homogeneously: using the same hardware, the same sensor references, the same qualification and calibration methods, the same data format and access, and the same maintenance procedures.
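The abstract does not give implementation details, but a client of an OGC SWE-based acquisition system would typically retrieve the core-variable records through a Sensor Observation Service (SOS) request. A minimal sketch of building a SOS 2.0 GetObservation request using the standard KVP binding follows; the endpoint, offering, and observed-property identifiers are hypothetical, not taken from the EGIM deployment:

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint, offering, observed_property, t_begin, t_end):
    """Build an OGC SOS 2.0 GetObservation request URL (KVP binding)."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
        # Temporal filter on the observation's phenomenonTime interval.
        "temporalFilter": f"om:phenomenonTime,{t_begin}/{t_end}",
    }
    return endpoint + "?" + urlencode(params)

url = sos_get_observation_url(
    "https://example.org/egim/sos",      # hypothetical service endpoint
    "EGIM-node-1",                       # hypothetical offering identifier
    "sea_water_temperature",
    "2017-01-01T00:00:00Z", "2017-01-02T00:00:00Z",
)
print(url)
```

The response would be an O&M-encoded observation document, which each EMSO node could serve identically thanks to the shared data format the abstract describes.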

    CYCLONE Unified Deployment and Management of Federated, Multi-Cloud Applications

    Various Cloud layers have to work in concert to manage and deploy complex multi-cloud applications, executing sophisticated workflows for Cloud resource deployment, activation, adjustment, interaction, and monitoring. While there are ample solutions for managing individual Cloud aspects (e.g. network controllers, deployment tools, and application security software), there are no well-integrated suites for managing an entire multi-cloud environment with multiple providers and deployment models. This paper presents the CYCLONE architecture, which integrates a number of existing solutions to create an open, unified, holistic Cloud management platform for multi-cloud applications, tailored to the needs of research organizations and SMEs. It discusses major challenges in providing a network and security infrastructure for the Intercloud, and concludes by demonstrating how the architecture is implemented in a real-life bioinformatics use case.
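The CYCLONE architecture itself is not reproduced here, but the coordination problem it addresses, mapping the components of one application onto several providers and driving each provider's deployment tooling with a consistent plan, can be illustrated with a small sketch. All component, provider, and model names below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical deployment plan for a multi-cloud application:
# each component is pinned to a provider and a deployment model.
plan = [
    {"component": "web-frontend",  "provider": "cloud-A", "model": "public"},
    {"component": "genome-db",     "provider": "cloud-B", "model": "private"},
    {"component": "align-workers", "provider": "cloud-A", "model": "public"},
]

def by_provider(plan):
    """Group components by provider, so that each provider-specific
    deployment tool can be invoked once with its batch of components."""
    groups = defaultdict(list)
    for entry in plan:
        groups[entry["provider"]].append(entry["component"])
    return dict(groups)

print(by_provider(plan))
# → {'cloud-A': ['web-frontend', 'align-workers'], 'cloud-B': ['genome-db']}
```

A unified platform of the kind the paper describes would add the missing pieces around such a plan: cross-provider networking, security federation, and monitoring.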

    Technical Report: A Trace-Based Performance Study of Autoscaling Workloads of Workflows in Datacenters

    To improve customer experience, datacenter operators offer support for simplifying application and resource management. For example, running workloads of workflows on behalf of customers is desirable, but requires increasingly sophisticated autoscaling policies, that is, policies that dynamically provision resources for the customer. Although selecting and tuning autoscaling policies is a challenging task for datacenter operators, so far relatively few studies have investigated the performance of autoscaling for workloads of workflows. Complementing previous knowledge, in this work we propose the first comprehensive performance study in the field. Using trace-based simulation, we compare state-of-the-art autoscaling policies across multiple application domains, workload arrival patterns (e.g., burstiness), and system utilization levels. We further investigate the interplay between autoscaling and regular allocation policies, and the complexity cost of autoscaling. Our quantitative study focuses not only on traditional performance metrics and on state-of-the-art elasticity metrics, but also on time- and memory-related autoscaling-complexity metrics. Our main results give strong, quantitative evidence of previously unreported operational behavior, for example, that autoscaling policies perform differently across application domains, and by how much they differ. (Technical report for the CCGrid 2018 submission "A Trace-Based Performance Study of Autoscaling Workloads of Workflows in Datacenters".)
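As a concrete illustration of what such a trace-driven study simulates, the sketch below runs a simple threshold-based autoscaling policy over a synthetic bursty arrival trace. The thresholds, per-VM capacity, and trace are invented; the state-of-the-art policies the report actually compares are more sophisticated than this:

```python
def simulate_autoscaler(trace, capacity_per_vm=10, low=0.3, high=0.8,
                        min_vms=1, max_vms=20):
    """Trace-driven simulation of an illustrative threshold policy:
    scale out by one VM when utilization exceeds `high`,
    scale in by one VM when it drops below `low`."""
    vms = min_vms
    history = []
    for demand in trace:                       # tasks arriving this step
        utilization = demand / (vms * capacity_per_vm)
        if utilization > high and vms < max_vms:
            vms += 1
        elif utilization < low and vms > min_vms:
            vms -= 1
        history.append(vms)
    return history

# A bursty arrival pattern: quiet, spike, quiet again.
trace = [5, 5, 40, 60, 60, 60, 10, 5, 5]
print(simulate_autoscaler(trace))
# → [1, 1, 2, 3, 4, 5, 4, 3, 2]
```

Note the lag visible in the output: the policy needs several steps to catch up with the spike and to release capacity afterwards, which is exactly the kind of elasticity behavior the report's metrics quantify.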

    Development of Distributed Research Center for analysis of regional climatic and environmental changes

    We present the approach and first results of a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, UNH, USA. Its main objective is the development of a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes in the Northern extratropical areas. The DRC should provide specialists working in climate-related sciences, as well as decision-makers, with accurate and detailed climatic characteristics for a selected area, along with reliable and affordable tools for in-depth statistical analysis and for studies of the effects of climate change. Within the framework of the project, new approaches to cloud processing and analysis of the large geospatial datasets (big geospatial data) inherent to climate change studies are developed and deployed on the technical platforms of both institutions. We discuss the state of the art in this domain, describe the web-based information-computational systems developed by the partners, justify the methods chosen to reach the project goal, and briefly list the results obtained so far.