
    Dynamically Partitioning Workflow over Federated Clouds for Optimising the Monetary Cost and Handling Run-Time Failures

    Several real-world problems in the domains of healthcare, large-scale scientific simulation, and manufacturing are organised as workflow applications. Efficiently managing workflow applications on Cloud computing data-centres is challenging due to the following problems: (i) they need to perform computation over sensitive data (e.g. healthcare workflows), leading to additional security and legal risks, especially in public cloud environments, and (ii) the dynamism of the cloud environment can lead to several run-time problems, such as data loss and abnormal termination of workflow tasks due to failures of computing, storage, and network services. To tackle the above challenges, this paper proposes a novel workflow management framework called DoFCF (Deploy on Federated Cloud Framework) that can dynamically partition scientific workflows across federated cloud (public/private) data-centres to minimise financial cost and adhere to security requirements, while gracefully handling run-time failures. The framework is validated in a cloud simulation tool (CloudSim) as well as in a realistic workflow-based cloud platform (e-Science Central). The results show that our approach is practical, successfully meets users' security requirements, reduces overall cost, and dynamically adapts to run-time failures.
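
    The abstract does not give DoFCF's partitioning algorithm; as a loose illustration of the idea, the sketch below greedily assigns workflow tasks to private or public data-centres, keeping sensitive tasks private and sending the rest wherever they cost less. The task fields, prices, and greedy rule are all invented for this sketch, not taken from the paper.

        from dataclasses import dataclass

        @dataclass
        class Task:
            name: str
            cpu_hours: float
            sensitive: bool  # sensitive tasks must stay on the private cloud

        # Hypothetical per-CPU-hour prices for each data-centre.
        PRICES = {"private": 0.12, "public": 0.04}

        def partition(tasks, private_capacity):
            """Greedy assignment: sensitive tasks always go private (a hard
            security constraint); others go to the cheaper data-centre,
            subject to remaining private capacity."""
            placement, used = {}, 0.0
            for t in sorted(tasks, key=lambda t: t.cpu_hours, reverse=True):
                cheaper_private = PRICES["private"] < PRICES["public"]
                fits = used + t.cpu_hours <= private_capacity
                if t.sensitive or (cheaper_private and fits):
                    placement[t.name] = "private"
                    used += t.cpu_hours
                else:
                    placement[t.name] = "public"
            return placement

        tasks = [Task("ingest", 2.0, True), Task("align", 8.0, False),
                 Task("report", 1.0, False)]
        print(partition(tasks, private_capacity=4.0))

    On a run-time failure, a scheme like this could simply be re-run over the tasks that have not yet completed, which is one way to read the paper's claim of dynamic adaptation.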

    Simulation and modelling of cloud environments for machine learning experimentation

    Cloud computing has emerged as one of the most promising computing paradigms. The idea of selling software as a service has prompted IT enterprises to bet on this new paradigm. Cloud computing aims to power the next generation of data centers, offering not only software as a service but also virtual services such as hardware, data storage capacity, and application logic. The increasing use of Cloud-based applications will also increase the power dedicated to the data centers that support these Clouds. Research in Cloud computing requires solutions that have to be tested in real environments, yet it is difficult and expensive to set up suitable test-beds for large-scale cluster applications. Simulation can fulfill the needs of Cloud computing experimentation: a large data-center simulator can save a great deal of time and effort in Cloud research. This project presents the design and development of extensions to an existing virtualized data-center simulator for Cloud computing research. The simulator reproduces the behaviour of a real Cloud framework, and the execution information it reports makes it suitable for testing and research purposes. The final goal of the project was to extend the heterogeneity of the tests the simulator can run, so that it can serve as a test-bed for machine learning experimentation.
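
    The simulator itself is not described in this abstract; as a generic illustration of what such a test-bed provides, the toy sketch below steps simulated hosts through discrete time, places arriving load on the least-loaded host, and records a utilisation trace that a machine learning model could be trained on. All names and numbers are hypothetical.

        import random

        class Host:
            def __init__(self, cpus):
                self.cpus, self.load = cpus, 0.0

        def simulate(hosts, arrivals, steps):
            """Toy discrete-time data-center simulation: each step, place the
            arriving VM load on the least-loaded host and log utilisation."""
            trace = []
            for _ in range(steps):
                target = min(hosts, key=lambda h: h.load / h.cpus)
                target.load = min(target.cpus, target.load + arrivals())
                trace.append([h.load / h.cpus for h in hosts])
                for h in hosts:
                    h.load *= 0.9  # load decays as VMs finish
            return trace  # e.g. training data for an ML placement model

        hosts = [Host(8), Host(16)]
        trace = simulate(hosts, arrivals=lambda: random.uniform(0, 4), steps=100)
        print(trace[-1])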

    Technology Trends in ICT – Towards Data-Driven, Farmer-Centered and Knowledge-Based Hybrid Cloud Architectures for Smart Farming

    Over the past four decades, advances in Information and Communication Technology (ICT) have resulted in unprecedented opportunities and innovation for improving farming outcomes. Ongoing innovations such as mobile, social media, agricultural drones, the Internet of Things (IoT), Big Data, and cloud computing present new challenges and opportunities for agribusinesses to redefine and rethink the role of ICT in achieving better farming outcomes. Recent advances in infrastructure, in data collection, storage, and retrieval, and in the understanding of all aspects of the food chain present further challenges and opportunities. Unstructured data is now generated in real time, in large volumes, at high speed, and of unknown quality; this challenges current approaches to decision making and requires a focus on analytics. These new sources of data create the opportunity to shift decision making from the highly intuitive to the data-driven, processed in real time. This paper highlights recent trends in ICT and introduces a hybrid cloud architecture for smart farming. The proposed architecture emphasizes data-driven, farmer-centered, and knowledge-based decision tools through service integration, aggregation, and interoperation. As a customized solution for farmers, the proposed architecture contains components for 1) integration of data from on-farm sensors and public sources, 2) farm management modules, 3) knowledge-based software solutions from different providers, 4) service integration, aggregation, and interoperation, and 5) a customized dashboard focused on usefulness and usability. This cloud-based solution allows the integration of business services, things, and technology from any channel and can be used anywhere. Hybrid cloud environments have so far shown promise in integrating these different services and providing smart farming solutions to both large and smallholder farmers.
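
    Component 1 of the proposed architecture, integrating on-farm sensor data with public sources, can be pictured with a small sketch. The field names, the feed schema, and the join rule below are invented for illustration and are not from the paper.

        import json, urllib.request

        def fetch_public_weather(url):
            """Fetch a public weather feed, assumed to return JSON rows with
            'date' and 'rain_mm' fields (an illustrative schema). Shown for
            shape only; the demo below uses inline data instead."""
            with urllib.request.urlopen(url) as resp:
                return {row["date"]: row["rain_mm"] for row in json.load(resp)}

        def integrate(sensor_rows, rain_by_date):
            """Join daily on-farm soil-moisture readings with public rainfall
            so dashboards and decision tools see one record per day."""
            return [{**row, "rain_mm": rain_by_date.get(row["date"])}
                    for row in sensor_rows]

        sensor_rows = [{"date": "2024-05-01", "soil_moisture": 0.31},
                       {"date": "2024-05-02", "soil_moisture": 0.28}]
        print(integrate(sensor_rows, {"2024-05-01": 4.2, "2024-05-02": 0.0}))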

    Shaping the Future of Animation: The Role of 3D Simulation Technology in Animation Film and Television

    The application of 3D simulation technology has revolutionized animation film and television art, opening new possibilities and creative opportunities for visual storytelling. This research explores the various aspects of applying 3D simulation technology in animation film and television art. It examines how 3D simulation technology enhances the creation of realistic characters, environments, and special effects, contributing to immersive and captivating storytelling experiences. The research also investigates the technical aspects of integrating 3D cloud simulation technology into the animation production pipeline, including modeling, texturing, rigging, and animation techniques. This paper explores the application of optimization algorithms in the context of cloud-based 3D environments, focusing on enhancing the efficiency and performance of 3D simulations. Black Widow Optimization and Spider Monkey Optimization can be used to optimize the placement and distribution of 3D assets in cloud storage systems, improving data access and retrieval times. These algorithms can also optimize the scheduling of rendering tasks in cloud-based rendering pipelines, leading to more efficient and cost-effective rendering processes. The integration of 3D cloud environments and optimization algorithms enables real-time optimization and adaptation of 3D simulations, allowing simulation parameters to be adjusted dynamically as conditions change, with improved accuracy and responsiveness. Moreover, the research examines the impact of 3D cloud simulation technology on the artistic process: how it influences artistic vision, aesthetics, and narrative possibilities in animation film and television. The findings highlight the advantages and challenges of using 3D simulation technology in animation, shedding light on its potential future developments and its role in shaping the future of animation film and television art.
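
    The abstract names Black Widow and Spider Monkey Optimization but gives no algorithmic detail. As a much simpler stand-in for those metaheuristics, the sketch below uses plain hill climbing to schedule render tasks across cloud nodes so the slowest node finishes as early as possible; the frame durations and node count are hypothetical.

        import random

        def makespan(assignment, durations, nodes):
            """Finish time of the busiest node under a task->node mapping."""
            load = [0.0] * nodes
            for task, node in enumerate(assignment):
                load[node] += durations[task]
            return max(load)

        def schedule(durations, nodes, iters=5000):
            """Hill climbing: move one random task to a random node and keep
            the change whenever it shortens the makespan."""
            best = [random.randrange(nodes) for _ in durations]
            for _ in range(iters):
                cand = best[:]
                cand[random.randrange(len(durations))] = random.randrange(nodes)
                if makespan(cand, durations, nodes) < makespan(best, durations, nodes):
                    best = cand
            return best

        frame_render_times = [random.uniform(1, 10) for _ in range(40)]
        plan = schedule(frame_render_times, nodes=4)
        print(round(makespan(plan, frame_render_times, 4), 2))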

    ROUTER: Fog Enabled Cloud based Intelligent Resource Management Approach for Smart Home IoT Devices

    There is a growing requirement for Internet of Things (IoT) infrastructure to ensure low response times for latency-sensitive real-time applications such as health monitoring, disaster management, and smart homes. Fog computing offers a means to meet such requirements via a virtualized intermediate layer that provides data, computation, storage, and networking services between Cloud datacenters and end users. A key element within such Fog computing environments is resource management. While resource managers for Fog computing exist, they focus only on a subset of the parameters important to Fog resource management, which encompass system response time, network bandwidth, energy consumption, and latency. To date, no existing Fog resource manager considers these parameters simultaneously for decision making, which will become increasingly important in the context of smart homes. In this paper, we propose a novel resource management technique (ROUTER) for Fog-enabled Cloud computing environments, which leverages Particle Swarm Optimization to optimize all of these parameters simultaneously. The approach is validated within an IoT-based smart home automation scenario and evaluated in the iFogSim toolkit, driven by empirical models derived from a small-scale smart home experiment. Results demonstrate that our approach achieves a reduction of 12% in network bandwidth, 10% in response time, 14% in latency, and 12.35% in energy consumption.
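
    ROUTER's exact fitness function is not given in the abstract; the sketch below shows the general shape of a Particle Swarm Optimization run over a weighted sum of the four parameters the paper names. The cost model and weights are invented for illustration, not ROUTER's actual formulation.

        import random

        WEIGHTS = (0.25, 0.25, 0.25, 0.25)  # bandwidth, response, latency, energy

        def fitness(x):
            """Hypothetical cost model: x is a resource-allocation vector in
            [0, 1]^4; lower combined cost is better."""
            bw, rt, lat, en = (1 - xi for xi in x)
            return sum(w * c for w, c in zip(WEIGHTS, (bw, rt, lat, en)))

        def pso(dim=4, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
            """Textbook PSO: particles track personal bests and a global best."""
            X = [[random.random() for _ in range(dim)] for _ in range(particles)]
            V = [[0.0] * dim for _ in range(particles)]
            P = [x[:] for x in X]                  # personal bests
            g = min(P, key=fitness)[:]             # global best
            for _ in range(iters):
                for i in range(particles):
                    for d in range(dim):
                        V[i][d] = (w * V[i][d]
                                   + c1 * random.random() * (P[i][d] - X[i][d])
                                   + c2 * random.random() * (g[d] - X[i][d]))
                        X[i][d] = min(1.0, max(0.0, X[i][d] + V[i][d]))
                    if fitness(X[i]) < fitness(P[i]):
                        P[i] = X[i][:]
                        if fitness(P[i]) < fitness(g):
                            g = P[i][:]
            return g

        print(pso())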

    Federating cloud systems for collaborative construction and engineering

    The construction industry has undergone a transformation in the use of data to drive its processes and outcomes, especially through Building Information Modelling (BIM). In particular, project collaboration in the construction industry can involve multiple stakeholders (architects, engineers, consultants) who exchange data at different project stages. The use of Cloud computing in construction projects has therefore continued to increase, primarily due to the ease of access, availability, and scalability in data storage and analysis that such platforms offer. Federation of cloud systems can provide greater flexibility in choosing a Cloud provider, enabling different members of the construction project to select a provider based on their cost-to-benefit requirements. When multiple construction disciplines collaborate online, the risk associated with project failure increases, as the capability of a provider to deliver on the project cannot be assessed a priori. In such uncontrolled industrial environments, “trust” can be an efficacious mechanism for more informed decision making, adaptive to the evolving nature of such multi-organisation dynamic collaborations in construction. This paper presents a trust-based Cooperation Value Estimation (CoVE) approach to enable and sustain collaboration among disciplines in construction projects, focusing mainly on data privacy, security, and performance. The proposed approach is demonstrated with data and processes from a real highway bridge construction project, describing the entire process of selecting a cloud provider. The selection process uses the audit and assessment process of the Cloud Security Alliance (CSA) and real-world performance data from construction industry workloads. Other application domains can also adopt the proposed approach by adapting it to their respective specifications. Experimental evaluation has shown that the proposed approach ensures on-time completion of projects and enhanced…
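
    The abstract does not define how CoVE combines its criteria; a minimal sketch of trust-weighted provider selection, with invented criteria, weights, and scores, might look like the following.

        # Hypothetical assessment scores in [0, 1] per provider, e.g. distilled
        # from a CSA-style audit plus measured workload performance.
        PROVIDERS = {
            "provider_a": {"privacy": 0.9, "security": 0.8, "performance": 0.6},
            "provider_b": {"privacy": 0.7, "security": 0.9, "performance": 0.9},
        }
        WEIGHTS = {"privacy": 0.4, "security": 0.4, "performance": 0.2}

        def trust_score(scores):
            """Weighted sum of assessment criteria; a simple stand-in for
            CoVE's cooperation-value estimate."""
            return sum(WEIGHTS[k] * v for k, v in scores.items())

        best = max(PROVIDERS, key=lambda p: trust_score(PROVIDERS[p]))
        print(best, round(trust_score(PROVIDERS[best]), 3))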

    Towards Effective and Efficient Data Management in Embedded Systems and Internet of Things

    The majority of today's low-end, low-cost embedded devices work in dynamic environments under several constraints, such as low power, reduced memory, and limited processing and communication capabilities; their data management is therefore critical. We introduce here a general method for data representation, storage, and transmission in embedded systems based on a compact representation scheme and a set of heuristics. This method has been implemented, tested, and evaluated within a vehicle tracking system that uses an in-house, very low-cost microcontroller-based telemetry device, which provides for near-real-time remote vehicle monitoring, energy-consumption tracking, ubiquitous health applications, and more. However, our method is general and can be used with any type of low-cost, resource-constrained embedded device that communicates data to the Internet (or cloud). Its efficiency and effectiveness are demonstrated by significant reductions in the mobile data transmitted, as our case study shows. Further benefits are reduced power consumption and transmission costs.
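
    The abstract describes a compact representation scheme without specifying it; a common approach for telemetry of this kind, shown purely as an illustration, is to delta-encode successive samples and pack them into fixed-size binary records rather than sending verbose text.

        import struct

        def pack_samples(samples):
            """Delta-encode 16-bit readings: a full-width first sample followed
            by signed-byte deltas; assumes successive readings change slowly."""
            out = bytearray(struct.pack(">h", samples[0]))
            prev = samples[0]
            for s in samples[1:]:
                delta = s - prev
                if not -128 <= delta <= 127:
                    raise ValueError("delta too large for this toy scheme")
                out += struct.pack(">b", delta)
                prev = s
            return bytes(out)

        readings = [2410, 2412, 2411, 2415, 2414]   # e.g. raw ADC values
        packed = pack_samples(readings)
        print(len(packed), "bytes instead of", len(str(readings)))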

    REAL-TIME SENSOR DATA ANALYTICS AND VISUALIZATION IN CLOUD-BASED SYSTEMS FOR FOREST ENVIRONMENT MONITORING

    Forest environment monitoring is essential for natural resource management. Technological improvements in sensor networks mean that sensors deployed across forests can now collect massive volumes of data. The Raspberry Pi, a flexible and inexpensive single-board computer, sits at the core of the system, connecting and interfacing with the many sensors spread throughout it. These sensors collect crucial information about the forest environment, such as weather conditions, humidity, and temperature. Acting as the data collection device, the Raspberry Pi acquires and processes data from the various sensors in real time. The system uses cloud-based services to overcome the limitations of on-premises data processing and storage. After receiving transmissions from the Raspberry Pi, a fusion technique on the cloud platform combines and analyzes the data from the various sensors. The cloud service provides live monitoring and other visualizations that greatly help interpret the data in real time. These visuals can be accessed remotely, allowing users to monitor the forest from any location. The combination of these technologies for collecting, analyzing, and evaluating sensor data enables improved understanding and management of forest environments.
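
    The abstract does not specify how the Raspberry Pi ships readings to the cloud; a minimal sketch of a collection loop posting JSON to a hypothetical HTTPS endpoint is shown below. The read_dht22() stub and the endpoint URL are assumptions, not details from the paper.

        import json, time, random, urllib.request

        ENDPOINT = "https://example.com/forest/ingest"   # hypothetical

        def read_dht22():
            """Stub standing in for a real humidity/temperature sensor driver."""
            return {"temperature_c": random.uniform(10, 30),
                    "humidity_pct": random.uniform(40, 90)}

        def publish(sample):
            """POST one reading as JSON to the cloud ingest endpoint."""
            req = urllib.request.Request(
                ENDPOINT,
                data=json.dumps(sample).encode(),
                headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:
                return resp.status

        while True:                        # collection loop on the Pi
            sample = {"ts": time.time(), **read_dht22()}
            try:
                publish(sample)            # cloud side fuses and visualises
            except OSError as err:
                print("upload failed, will retry:", err)
            time.sleep(60)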

    Improving Online Education Using Big Data Technologies

    In a world undergoing full digital transformation, where new information and communication technologies are constantly evolving, the current challenge for Computing Environments for Human Learning (CEHL) is to find the right way to integrate and harness the power of these technologies. These environments face many challenges: increased demand for learning, huge growth in the number of learners, heterogeneity of available resources, and problems related to the complexity of intensive processing and real-time analysis of the data produced by e-learning systems, which goes beyond the limits of traditional infrastructures and relational database management systems. This chapter presents a number of solutions for CEHL built around two major paradigms, namely cloud computing and Big Data. The first part of this work presents an approach for integrating both the emerging technologies of the Big Data ecosystem and the on-demand services of the cloud in the e-learning field. It aims to enrich and enhance the quality of e-learning platforms by relying on cloud services accessible via the Internet, and introduces distributed storage and parallel computing of Big Data to provide robust solutions to the requirements of intensive processing, predictive analysis, and massive storage of learning data. To this end, a methodology describing the integration process is presented and applied. The chapter also addresses the deployment of a distributed e-learning architecture combining several recent Big Data tools, based on a strategy of data decentralization and parallelization of processing across a cluster of nodes. Finally, the chapter develops a Big Data solution for online learning platforms based on the LMS Moodle. A course recommendation system has been designed and implemented using machine learning techniques to help learners select the most relevant learning resources according to their interests, through the analysis of learning traces. The system is realised using learning data collected from the ESTenLigne platform and the Spark framework deployed on a Hadoop infrastructure.
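
    The abstract says the recommender relies on machine learning over learning traces but does not name the algorithm. One plausible realisation on Spark, shown purely as a sketch, is collaborative filtering with ALS from pyspark.ml; the tiny inline dataset and the idea of deriving ratings from traces are assumptions, not the chapter's actual pipeline.

        from pyspark.sql import SparkSession
        from pyspark.ml.recommendation import ALS

        spark = SparkSession.builder.appName("course-recs").getOrCreate()

        # (learner, course, rating derived from learning traces, e.g. time
        # spent or completion) -- invented example values.
        traces = spark.createDataFrame(
            [(0, 10, 4.0), (0, 11, 1.0), (1, 10, 5.0), (1, 12, 3.0), (2, 12, 4.0)],
            ["userId", "courseId", "rating"])

        als = ALS(userCol="userId", itemCol="courseId", ratingCol="rating",
                  rank=8, maxIter=10, regParam=0.1, coldStartStrategy="drop")
        model = als.fit(traces)

        # Top-3 course suggestions per learner.
        model.recommendForAllUsers(3).show(truncate=False)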
