416 research outputs found

    Electrochemical treatment of industrial sulfidic spent caustic streams for sulfide removal and caustic recovery

    Alkaline spent caustic streams (SCS) produced in the petrochemical and chemical manufacturing industries contain high concentrations of reactive sulfide (HS⁻) and caustic soda (NaOH). Common treatment methods entail high operational costs while not recovering the resources that SCS contain. Here we studied the electrochemical treatment of SCS from a chemical manufacturing industry in an electrolysis cell, aiming at anodic HS⁻ removal and cathodic recovery of sulfide-free NaOH. Using a synthetic SCS, we first evaluated the HS⁻ oxidation product distribution over time, as well as the HS⁻ removal and the NaOH recovery, as a function of current density. In a second step, we investigated the operational aspects of such treatment for the industrial SCS at a fixed current density of 300 A m⁻². In an electrolysis cell receiving 205 ± 60 g S L⁻¹ d⁻¹ of HS⁻ over 20 days of continuous operation, HS⁻ was removed with 38.0 ± 7.7% removal efficiency and ~80% coulombic efficiency, with concomitant recovery of a ~12 wt.% NaOH solution. The low cell voltage obtained (1.75 ± 0.12 V) resulted in low energy requirements of 3.7 ± 0.6 kW h kg⁻¹ S and 6.3 ± 0.4 kW h kg⁻¹ NaOH, suggesting the techno-economic viability of this process.
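
    The reported energy figure for sulfide can be cross-checked against the other numbers in the abstract. The sketch below is a back-of-the-envelope check only: it assumes a two-electron oxidation of HS⁻ to elemental sulfur (the abstract does not state the dominant product) and takes the ~80% coulombic efficiency and 1.75 V cell voltage as averages.

    ```python
    # Back-of-the-envelope check of the reported specific energy for sulfide removal.
    # Assumption (not stated in the abstract): HS- is oxidised in a 2-electron step
    # (HS- -> S0 + H+ + 2e-); efficiency and voltage are the reported averages.
    F = 96485.0           # Faraday constant, C per mol of electrons
    M_S = 0.032           # molar mass of sulfur, kg/mol
    n_e = 2               # electrons per mol of HS- oxidised (assumed product: S0)
    U_cell = 1.75         # average cell voltage, V
    eta_coulombic = 0.80  # reported coulombic efficiency (~80 %)

    charge_per_kg_S = n_e * F / M_S                        # C per kg S at 100 % efficiency
    energy_kWh_per_kg = U_cell * charge_per_kg_S / eta_coulombic / 3.6e6
    print(f"{energy_kWh_per_kg:.1f} kWh per kg S")         # ~3.7, matching the reported 3.7 ± 0.6
    ```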

    Tunneling Appropriate Computational Models from Laser Scanning Data

    Tunneling projects often require computational models of existing structures. To this end, this paper demonstrates the viability of automatically and robustly reconstructing an individual building model from laser scanning data for further computational modeling without any manual intervention. The resulting model is appropriate for immediate importation into a commercial finite element method (FEM) program. The method combines a voxel-based technique with an angle criterion. Initially, the voxelization model is used to represent the façade model, while an angle criterion is implemented to determine the boundaries of the façade and its openings (doors and windows). The algorithm overcomes common problems of occlusions or artefacts that arise during data acquisition. The resulting relative errors of the overall dimensions and opening areas of the geometric models were less than 2% and 6%, respectively, which are generally within industry standards for this type of building modeling. Science Foundation Ireland (SFI/PICA/I850); European Union Grant ERC StG 2012-307836-RETURN.
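
    As a rough illustration of the two ingredients named above (voxelisation plus an angle criterion for boundary detection), here is a minimal sketch; the cell size, neighbourhood selection, and 90° threshold are placeholder values, not the parameters or implementation used in the paper.

    ```python
    # Illustrative sketch: voxelise facade points and flag boundary points with an
    # angle criterion. All numeric parameters are placeholders.
    import numpy as np

    def voxelize(points_2d, cell=0.1):
        """Map facade points (already projected onto the facade plane) to occupied voxel indices."""
        idx = np.floor(np.asarray(points_2d) / cell).astype(int)
        return {tuple(i) for i in idx}

    def is_boundary(p, neighbours, max_gap_deg=90.0):
        """Angle criterion: flag p as a boundary point if the largest angular gap
        between the directions to its neighbours exceeds the threshold."""
        d = np.asarray(neighbours) - np.asarray(p)
        angles = np.sort(np.arctan2(d[:, 1], d[:, 0]))
        gaps = np.diff(np.concatenate([angles, angles[:1] + 2 * np.pi]))
        return np.degrees(gaps.max()) > max_gap_deg
    ```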

    P4SINC – An Execution Policy Framework for IoT Services in the Edge

    Internet of Things (IoT) services are increasingly deployed at the edge to access and control Things. The execution of such services needs to be monitored to provide information for security, service contract, and system operation management. Although different techniques have been proposed for deploying and executing IoT services in IoT gateways and edge servers, there is a lack of generic policy frameworks for the instrumentation and assurance of various types of execution policies for IoT services. In this paper, we present P4SINC as an execution policy framework that covers various functionalities for IoT services deployed in software-defined machines in IoT infrastructures. P4SINC supports the instrumentation and enforcement of IoT services during their deployment and execution, and can thus be leveraged for other purposes such as security and service contract management. We illustrate our prototype with realistic examples.
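
    The abstract does not detail P4SINC's policy model or APIs, so the following is only a hypothetical sketch of what defining and enforcing an execution policy for an edge-deployed IoT service could look like; the class and field names are invented for illustration.

    ```python
    # Hypothetical sketch, not the P4SINC API: a simple execution policy and an
    # enforcement check an edge gateway could run before invoking a service.
    from dataclasses import dataclass, field

    @dataclass
    class ExecutionPolicy:
        max_cpu_share: float = 0.5                            # fraction of a core the service may use
        allowed_endpoints: set = field(default_factory=set)   # outbound hosts permitted by contract
        log_invocations: bool = True                          # instrumentation switch for monitoring

    def enforce(policy: ExecutionPolicy, endpoint: str, cpu_share: float) -> bool:
        """Return True if the requested invocation complies with the policy."""
        if cpu_share > policy.max_cpu_share:
            return False
        if policy.allowed_endpoints and endpoint not in policy.allowed_endpoints:
            return False
        return True

    # Example: check a service call against the policy before executing it.
    policy = ExecutionPolicy(allowed_endpoints={"sensor-hub.local"})
    assert enforce(policy, "sensor-hub.local", cpu_share=0.2)
    ```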

    Context-driven Policies Enforcement for Edge-based IoT Data Sharing-as-a-Service

    Sharing real-time data originating from connected devices is crucial to real-world intelligent Internet of Things (IoT) applications, i.e., those based on artificial intelligence/machine learning (AI/ML). Such IoT data sharing involves multiple parties for different purposes and is usually based on data contracts that might depend on dynamic changes in IoT data variety and velocity. It is still an open challenge to support multiple parties (aka tenants) with these dynamic contracts based on the data value for their specific contextual purposes. This work addresses these challenges by introducing a novel dynamic context-based policy enforcement framework to support IoT data sharing (on-Edge) based on dynamic contracts. Our enforcement framework allows IoT Data Hub owners to define extensible rules and metrics to govern the tenants in accessing the shared data on the Edge, based on policies defined with static and dynamic contexts. We have developed a proof-of-concept prototype for sharing sensitive data, such as surveillance camera videos, to illustrate our proposed framework. The experimental results demonstrated that our framework could soundly enforce context-based policies at runtime in a timely manner with moderate overhead. Moreover, context and policy changes are correctly reflected in the system in nearly real time.
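
    The framework's concrete rule language and metrics are not given in the abstract; the sketch below only illustrates the general idea of granting access based on a static context (which tenant is asking) and a dynamic context (the stream's current velocity). All attribute names and thresholds are hypothetical.

    ```python
    # Hypothetical sketch of context-based policy evaluation at request time.
    from typing import Callable, Dict, Any

    Rule = Callable[[Dict[str, Any]], bool]

    def make_policy(*rules: Rule) -> Rule:
        """A policy grants access only if every rule holds for the current context."""
        return lambda ctx: all(rule(ctx) for rule in rules)

    # Static context: which tenants may access the surveillance video at all.
    tenant_allowed: Rule = lambda ctx: ctx["tenant"] in {"city-traffic", "security-ops"}
    # Dynamic context: throttle sharing when the stream's velocity rises.
    low_velocity: Rule = lambda ctx: ctx["frame_rate_fps"] <= 15

    policy = make_policy(tenant_allowed, low_velocity)
    print(policy({"tenant": "city-traffic", "frame_rate_fps": 10}))  # True -> share
    print(policy({"tenant": "city-traffic", "frame_rate_fps": 30}))  # False -> deny
    ```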

    Programming Elasticity and Commitment in Dynamic Processes

    In the past, elasticity and commitment in business processes were underexplored. But as businesses increasingly exploit pay-per-use resources in the cloud for on-demand needs, elasticity and commitment have become important issues. Here, the authors discuss the value of using elastic resources and commitments to create more dynamic organizations that can easily balance the need to be adaptable and flexible, while also retaining a high level of manageability. Junta de Andalucía P12-TIC-1867; Ministerio de Economía y Competitividad TIN2012-32273; Junta de Andalucía TIC-590.

    SEMF – Service Evolution Management Framework

    Manufacturing process data analysis pipelines: a requirements analysis and survey

    Smart manufacturing is strongly correlated with the digitization of all manufacturing activities. This increases the amount of data available to drive productivity and profit through data-driven decision-making programs. The goal of this article is to assist data engineers in designing big data analysis pipelines for manufacturing process data. Thus, this paper characterizes the requirements for process data analysis pipelines and surveys existing platforms from the academic literature. The results demonstrate a stronger focus on the storage and analysis phases of pipelines than on the ingestion, communication, and visualization stages. Results also show a tendency towards custom tools for ingestion and visualization, and relational data tools for storage and analysis. Tools for handling heterogeneous data are generally well represented throughout the pipeline. Finally, batch processing tools are more widely adopted than real-time stream processing frameworks, and most pipelines opt for a common script-based data processing approach. Based on these results, recommendations are offered for each phase of the pipeline.
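
    To make the phase decomposition concrete, here is a minimal script-based batch pipeline in the spirit of the surveyed phases (ingestion, storage, analysis, visualization, with communication implicit and in-process); the stage functions and sample record are placeholders, not taken from any surveyed platform.

    ```python
    # Minimal script-based batch pipeline mirroring the survey's phase decomposition.
    # Stage bodies are placeholders for illustration only.
    from typing import Callable, Iterable, List

    Stage = Callable[[List[dict]], List[dict]]

    def run_pipeline(records: Iterable[dict], stages: List[Stage]) -> List[dict]:
        """Apply each pipeline phase to the whole batch, in order."""
        batch = list(records)
        for stage in stages:
            batch = stage(batch)
        return batch

    def ingest(batch):     # drop malformed sensor readings
        return [r for r in batch if "machine_id" in r and r.get("total")]

    def store(batch):      # stand-in for writing to a relational store
        return batch

    def analyse(batch):    # derive a simple per-machine quality metric
        return [dict(r, yield_rate=r["good"] / r["total"]) for r in batch]

    def visualise(batch):  # stand-in for a dashboard or report
        for r in batch:
            print(r)
        return batch

    run_pipeline([{"machine_id": 7, "good": 90, "total": 100}],
                 [ingest, store, analyse, visualise])
    ```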