
    DCDIDP: A Distributed, Collaborative, and Data-driven IDP Framework for the Cloud

    Recent advances in distributed computing, grid computing, virtualization mechanisms, and utility computing have led to Cloud Computing becoming one of the industry buzzwords of our decade. As the popularity of the services provided in the cloud environment grows exponentially, the exploitation of possible vulnerabilities grows at the same pace. Intrusion Detection and Prevention Systems (IDPSs) are among the most popular and fundamental front-line tools for defending computation and communication infrastructures from intruders. In this poster, we propose a distributed, collaborative, and data-driven IDP (DCDIDP) framework for cloud computing environments. Both cloud providers and cloud customers will benefit significantly from DCDIDP, which dynamically evolves and gradually mobilizes resources in the cloud as suspicion about attacks increases. Such a system will provide a homogeneous IDPS for all collaborating cloud providers. It will respond to attacks by collaborating with other peers in a distributed manner, as near as possible to the attack sources and at different levels of operation (e.g. network, host, VM). We present the DCDIDP framework and explain its components; further elaboration is part of our ongoing work.
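
    The abstract does not spell out how resources are mobilized as suspicion increases, so the following Python sketch is purely hypothetical: the suspicion thresholds, level names, and peer-notification function are illustrative assumptions, not part of the published DCDIDP design.

    SUSPICION_LEVELS = [
        (0.3, "network"),  # lightweight inspection at the network edge
        (0.6, "host"),     # host-based monitoring on suspected machines
        (0.9, "vm"),       # deep per-VM inspection and containment
    ]

    def mobilize(suspicion: float) -> list[str]:
        """Return the operational levels to activate for a given suspicion score."""
        return [level for threshold, level in SUSPICION_LEVELS if suspicion >= threshold]

    def share_alert(peers: list[str], source_ip: str, suspicion: float) -> None:
        """Notify collaborating providers so they can respond near the attack source."""
        for peer in peers:
            print(f"notify {peer}: suspicious traffic from {source_ip} (score {suspicion:.2f})")

    # Example: a mid-level suspicion score activates network- and host-level responses.
    score = 0.72
    print("activate levels:", mobilize(score))
    share_alert(["provider-a", "provider-b"], "203.0.113.7", score)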

    Addressing Employee Use of Personal Clouds

    Cloud computing is one of the most useful innovations in the digital age. While much of the attention on recent advances has focused on smartphones, tablet computers, and wearable technology, the cloud is perhaps unrivaled in its utility for organizations. From simplified data storage to innovative software platforms, enterprise-grade cloud solutions provide cost-effective alternatives to acquiring expensive computer hardware and software. Enterprise clouds also offer a collaborative work environment for a mobile and widespread workforce, enabling businesses to maximize worker productivity.

    The Value of Green IT: a Theoretical Framework and Exploratory Assessment of Cloud Computing

    The phenomenon of climate change and the resulting focus on energy consumption have created a consensus among businesses about the need for a collective reduction in carbon emissions. A range of new Green technologies, such as Cloud computing, provide unprecedented opportunities to improve the efficiency of business operations and represent a realistic opportunity to reduce energy costs and combat global warming. Enterprises are equally concerned with optimising the business value derived from investments in Green IT. However, evidence from the literature shows that the measurement of IT value is a complex challenge involving multiple stakeholders. Green IT, such as Cloud computing, adds further complexity to the understanding of IT value, as the expected operational and business benefits should also be complemented by environmental and societal concerns. This paper contributes to the area of Cloud computing and Green IT research through the development of a framework that measures the value of Green IT. The framework expands on the work of Corbett (2010) and was developed as a result of a comprehensive literature review of both seminal works in the IT value field and the most recent studies in Green IT. This paper demonstrates the efficacy of the value framework by reporting on a series of case studies involving Cloud computing. Using exploratory case studies, this study highlights the utility of the model and its applicability to multiple contexts. Although the framework can be applied in multiple settings, the findings highlight a number of areas that are prominent in SMEs and in need of further attention.

    Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space, and high-performance processing capability. Processing and distributing this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work being conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes incorporating sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open-source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, to an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
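
    The band math behind two of the products named above (NDVI and NDMI) is straightforward; the following is a minimal numpy sketch under the assumption that the bands have already been read into arrays. A real pipeline would load them from imagery held in the data processing cloud (e.g. via GDAL or rasterio); the synthetic band values here are illustrative only.

    import numpy as np

    def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
        nir, red = nir.astype(np.float64), red.astype(np.float64)
        den = nir + red
        return np.divide(nir - red, den, out=np.zeros_like(den), where=den != 0)

    def ndmi(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
        """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
        nir, swir = nir.astype(np.float64), swir.astype(np.float64)
        den = nir + swir
        return np.divide(nir - swir, den, out=np.zeros_like(den), where=den != 0)

    # Synthetic 2x2 "bands" standing in for image tiles pulled from cloud storage.
    nir = np.array([[0.5, 0.6], [0.7, 0.8]])
    red = np.array([[0.2, 0.3], [0.1, 0.4]])
    swir = np.array([[0.3, 0.2], [0.4, 0.5]])
    print(ndvi(nir, red))
    print(ndmi(nir, swir))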

    Parallel Differential Evolution approach for Cloud workflow placements under simultaneous optimization of multiple objectives

    The recent rapid expansion of Cloud computing facilities poses a challenge to facility providers and users: finding methods for the optimal placement of workflows on distributed resources under the often-contradictory impulses of minimizing makespan, energy consumption, and other metrics. Evolutionary optimization techniques, which from theoretical principles are guaranteed to provide globally optimal solutions, are among the most powerful tools to achieve such optimal placements. Multi-Objective Evolutionary Algorithms by design work on contradictory objectives, gradually evolving across generations towards a converged Pareto front representing optimal decision variables, in this case the mapping of tasks to resources on clusters. However, the computation time such algorithms take to converge makes them prohibitive for real-time placements because of the adverse impact on makespan. This work describes the parallelization, on the same cluster, of a Multi-Objective Differential Evolution method (NSDE-2) for optimizing workflow placement, and the attendant speedups that bring the implicit accuracy of the method into the realm of practical utility. Experimental validation is performed on a real-life testbed using diverse Cloud traces. The solutions under different scheduling policies demonstrate a significant reduction in energy consumption along with some improvement in makespan.
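
    As an illustration of the optimization core being parallelized, here is a generic single-threaded sketch of a Differential Evolution step (DE/rand/1/bin) over a continuous encoding of a task-to-resource mapping, with toy makespan and energy surrogates. It is not the paper's NSDE-2 algorithm; the encoding, objective functions, and parameter values (F, CR) are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    N_TASKS, N_RESOURCES = 8, 4   # assumed workflow and cluster sizes
    POP, F, CR = 20, 0.8, 0.9     # population size, mutation factor, crossover rate

    def decode(vector):
        """Map a continuous individual in [0, 1) to per-task resource indices."""
        return np.minimum((vector * N_RESOURCES).astype(int), N_RESOURCES - 1)

    def objectives(vector):
        """Toy makespan/energy surrogates; real values would come from a cloud model."""
        load = np.bincount(decode(vector), minlength=N_RESOURCES)
        return float(load.max()), float(np.count_nonzero(load))

    def dominates(x, y):
        """Pareto dominance: no worse in every objective, strictly better in one."""
        return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

    population = rng.random((POP, N_TASKS))
    for generation in range(50):
        for i in range(POP):
            others = [j for j in range(POP) if j != i]
            a, b, c = population[rng.choice(others, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), 0.0, 1.0)       # DE/rand/1 mutation
            cross = rng.random(N_TASKS) < CR                  # binomial crossover
            trial = np.where(cross, mutant, population[i])
            # A plain dominance check stands in for NSGA-II-style non-dominated sorting.
            if dominates(objectives(trial), objectives(population[i])):
                population[i] = trial

    best = min(population, key=lambda v: objectives(v))
    print("placement:", decode(best), "objectives:", objectives(best))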

    Investigating elastic cloud based RDF processing

    The Semantic Web was proposed as an extension of the traditional Web to give Web data context and meaning by using the Resource Description Framework (RDF) data model. The recent growth in the adoption of RDF, together with the massive growth of RDF data, has led numerous efforts to focus on the challenges of processing this data. To this end, many approaches have focused on vertical scalability, utilising powerful hardware, or on horizontal scalability, utilising always-on physical computer clusters or peer-to-peer networks. However, these approaches rely on fixed, high-specification computer clusters that require considerable upfront and ongoing investment to keep pace with data growth. In recent years cloud computing has seen wide adoption due to its unique elasticity and utility billing features. This thesis addresses some of the issues related to the processing of large RDF datasets by utilising cloud computing. Initially, the thesis reviews the background literature on related distributed RDF processing work and issues, in particular distributed rule-based reasoning and dictionary encoding, followed by a review of the cloud computing paradigm and related literature. Then, in order to fully utilise features that are specific to cloud computing, such as elasticity, the thesis designs and fully implements a Cloud-based Task Execution framework (CloudEx), a generic framework for efficiently distributing and executing tasks on cloud environments. Subsequently, some of the large-scale RDF processing issues are addressed by using the CloudEx framework to develop algorithms for processing RDF with cloud computing. These algorithms perform efficient dictionary encoding and forward reasoning using cloud-based columnar databases. The algorithms are collectively implemented as an Elastic Cost Aware Reasoning Framework (ECARF), a cloud-based RDF triple store. This thesis presents original results and findings that advance the state of the art of distributed cloud-based RDF processing and forward reasoning.
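
    Dictionary encoding, one of the two RDF processing steps the thesis distributes, is easy to illustrate in miniature: each distinct RDF term is mapped to an integer ID so triples can be stored and joined compactly. The sketch below is a generic single-process illustration with made-up example triples, not ECARF's distributed, columnar-database implementation.

    class TermDictionary:
        """Bidirectional mapping between RDF terms and integer IDs."""

        def __init__(self):
            self.term_to_id = {}
            self.id_to_term = []

        def encode(self, term):
            """Return the ID for a term, assigning a new one on first sight."""
            if term not in self.term_to_id:
                self.term_to_id[term] = len(self.id_to_term)
                self.id_to_term.append(term)
            return self.term_to_id[term]

        def decode(self, term_id):
            return self.id_to_term[term_id]

    d = TermDictionary()
    triples = [
        ("<http://example.org/alice>", "<http://xmlns.com/foaf/0.1/knows>", "<http://example.org/bob>"),
        ("<http://example.org/bob>", "<http://xmlns.com/foaf/0.1/name>", '"Bob"'),
    ]
    encoded = [tuple(d.encode(term) for term in triple) for triple in triples]
    print(encoded)                                  # e.g. [(0, 1, 2), (2, 3, 4)]
    print(tuple(d.decode(i) for i in encoded[0]))   # round-trips back to the original terms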