
    Big data reference architecture for Industry 4.0: including economic and ethical implications

    The rapid progress in Industry 4.0 is achieved through innovations in several fields, e.g., manufacturing, big data, and artificial intelligence. The thesis motivates the need for a Big Data architecture to apply artificial intelligence in Industry 4.0 and presents a cognitive architecture for artificial intelligence – CAAI – as a possible solution, which is especially suited to the challenges of small and medium-sized enterprises. The work examines the economic and ethical implications of those technologies and highlights the benefits, but also the challenges, for countries, companies and individual workers. The "Industry 4.0 Questionnaire for SMEs" was conducted to gain insights into small and medium-sized companies' requirements and needs. The new CAAI architecture presents a software design blueprint and provides a set of open-source building blocks to support companies during implementation. Different use cases demonstrate the applicability of the architecture, and the subsequent evaluation verifies its functionality.

    A Comprehensive Methodology for the Optimization of the Operating Strategy of Hybrid Electric Vehicles

    The sustainable exploitation of energy and the reduction of pollutant emissions are major concerns in our society. Driven by increasingly stringent international standards, automobile manufacturers are developing new technologies such as Hybrid Electric Vehicles (HEVs). These innovative systems combine the main benefits of traditional Internal Combustion Engines (ICEs) with those of Battery Electric Vehicles (BEVs), while overcoming their main drawbacks. HEVs can offer significant improvements in the efficiency of the propulsion system, but they also lead to higher complexity in design and control. In order to exploit all the expected advantages, a dedicated optimization of the Hybrid Operating Strategy (HOS) is required. In this framework, simulation plays a key role in identifying the optimal HOS, where the primary design targets are fuel economy, emission reduction and improvement in vehicle performance (including acceleration, driving range, operational flexibility and noise). With such a perspective, a simulation study was performed involving the implementation, in the Matlab environment, of zero-dimensional models of a Series Hybrid Electric Vehicle (SHEV) and a Parallel Hybrid Electric Vehicle (PHEV). As far as the hybrid operating strategy is concerned, three different approaches were investigated:
    - A novel Benchmark Optimizer (BO), which determines the best possible operating strategy for the selected target, mission profile and powertrain design. Each candidate solution is characterized by a vector in which every scalar independently defines the mechanical power of the electric machine (for the PHEV) or the engine speed (for the SHEV) at each time step of the selected driving cycle.
    - A real-time optimizer based on the Minimization of the Total system Losses (TLM). It uses a vector approach to select, at each time step, the power split that guarantees the minimum system losses. It requires a reduced number of calibration parameters and is therefore computationally fast and suitable for real-world applications. Based on this technique, two different methodologies concerning the engine component are considered: the Total engine losses (TLM TOT) and the Recoverable (with respect to the optimal operating point) engine losses (TLM REC).
    - A real-time optimizer based on Total Load Switch Thresholds. It switches the operating mode depending on the load and speed signals. It uses a scalar approach, requires a reduced number of calibration parameters and is by far the method requiring the least computational effort.
    In all three cases, the numerical optimizer is based on Genetic Algorithm (GA) techniques. GAs are inspired by the mechanism of natural selection, in which better individuals are likely to be the winners in a competing environment. They form a statistical approach able to solve optimization problems whose objective function is non-continuous, non-differentiable, stochastic or highly non-linear. The study analyses the optimization of the well-to-wheel CO2 emissions of a Parallel and a Series Hybrid Electric Vehicle along the New European Driving Cycle (NEDC) and the Artemis Driving Cycles. For the compression ignition engine only, NOx emissions were also considered as an optimization criterion along the NEDC.
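
    As a rough illustration of the GA-based approach described above, the sketch below evolves a power-split vector (one electric-machine power value per time step) against a toy fuel-cost surrogate. The driving cycle, power limits, fitness function and GA settings are invented placeholders, not values from the thesis.

```python
# Minimal GA sketch for a power-split vector over a driving cycle.
# Fitness function, power limits and cycle length are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_STEPS = 200          # time steps of the (hypothetical) driving cycle
P_EM_MAX = 30.0        # assumed electric-machine power limit in kW
POP, GENS, MUT = 50, 100, 0.1

def fitness(p_em: np.ndarray) -> float:
    """Toy surrogate for fuel use: penalise engine power and battery drain."""
    p_demand = 20.0 + 10.0 * np.sin(np.linspace(0, 6 * np.pi, N_STEPS))  # fake load profile
    p_ice = p_demand - p_em                          # engine covers the remainder
    fuel = np.sum(np.maximum(p_ice, 0.0) ** 1.2)     # convex fuel proxy
    soc_penalty = abs(np.sum(p_em)) * 5.0            # keep the battery roughly balanced
    return fuel + soc_penalty

pop = rng.uniform(-P_EM_MAX, P_EM_MAX, size=(POP, N_STEPS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)
    parents = pop[order[: POP // 2]]                 # truncation selection
    # uniform crossover between random parent pairs
    idx = rng.integers(0, len(parents), size=(POP, 2))
    mask = rng.random((POP, N_STEPS)) < 0.5
    pop = np.where(mask, parents[idx[:, 0]], parents[idx[:, 1]])
    # sparse Gaussian mutation, then clip to the actuator limits
    pop += MUT * P_EM_MAX * rng.standard_normal(pop.shape) * (rng.random(pop.shape) < 0.1)
    pop = np.clip(pop, -P_EM_MAX, P_EM_MAX)
    pop[0] = parents[0]                              # elitism: keep the best individual

best = pop[np.argmin([fitness(ind) for ind in pop])]
print("best surrogate fuel cost:", fitness(best))
```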

    A formal ontology for industrial maintenance

    The rapid advancement of information and communication technologies has resulted in a variety of maintenance support systems and tools covering all sub-domains of maintenance. Most of these systems are based on different models that are sometimes redundant or incoherent and always heterogeneous. This problem has led to the development of maintenance platforms integrating all of these support systems. The main problem confronted by these integration platforms is to provide semantic interoperability between different applications within the same environment. To this end, we have developed an ontology for the field of industrial maintenance, called IMAMO (Industrial MAintenance Management Ontology), adopting the METHONTOLOGY approach to manage its development life cycle. This ontology can be used not only to ensure semantic interoperability but also to generate new knowledge that supports decision making in the maintenance process. This paper provides and discusses several tests to evaluate the ontology and to show how it can ensure semantic interoperability and generate new knowledge within the platform.
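
    Purely as an illustration of how a shared vocabulary supports semantic interoperability, the Python/rdflib sketch below declares a tiny class hierarchy and links an asset to a failure mode. The names (Equipment, Pump, FailureMode, hasFailureMode) are invented for the example and are not the actual IMAMO concepts.

```python
# Illustrative sketch only: a few RDF triples in the spirit of a maintenance
# ontology. The vocabulary and namespace are invented for this example.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

MAINT = Namespace("http://example.org/maintenance#")
g = Graph()
g.bind("maint", MAINT)

# Class hierarchy: a Pump is a kind of Equipment.
g.add((MAINT.Equipment, RDF.type, RDFS.Class))
g.add((MAINT.Pump, RDFS.subClassOf, MAINT.Equipment))
g.add((MAINT.FailureMode, RDF.type, RDFS.Class))

# A shared property lets different maintenance tools exchange failure data
# against one agreed vocabulary, which is the interoperability idea above.
g.add((MAINT.hasFailureMode, RDF.type, RDF.Property))
g.add((MAINT.pump42, RDF.type, MAINT.Pump))
g.add((MAINT.bearingWear, RDF.type, MAINT.FailureMode))
g.add((MAINT.pump42, MAINT.hasFailureMode, MAINT.bearingWear))
g.add((MAINT.pump42, RDFS.label, Literal("Cooling water pump #42")))

print(g.serialize(format="turtle"))
```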

    Life cycle assessment of ground mounted photovoltaic panels

    Nowadays, the problem of carbon emissions attracts a lot of attention worldwide. Many solutions have been proposed to meet greenhouse gas emission reduction targets, and increasing the share of renewable energy is a feasible and promising approach for achieving this goal. Solar power and wind power are considered the two dominant renewable sources, contributing significantly to power generation as well as to reducing CO₂ emissions. In this study, ground-mounted photovoltaic plants are taken as an approach for achieving this target. The objective of the study was to answer three research questions: (1) What are the life-cycle environmental impacts of ground-mounted photovoltaic (GMPV) systems? (2) What data are missing to perform a life cycle assessment (LCA) of GMPV? (3) What are the future development projections for GMPV and how would they impact its LCA? Furthermore, the state of the art of GMPV technology is also reviewed. The thesis is based on the data of Ecoinvent v3.3, available in openLCA, together with six case studies on GMPV, and gives an evaluation of the state of the art of the technology and of the data gaps for GMPV in Ecoinvent v3.3. LCA is a quantitative approach used to evaluate the whole life cycle of a product; its four steps are goal and scope definition, inventory analysis, impact assessment and interpretation. Based on the six case studies from the literature, data gaps were identified regarding the power output, number of modules, module performance and degradation rate, and the materials in the mounting system. These data gaps are important because they have a significant impact on the implementation of the LCA approach; if they were filled, operators would be able to evaluate GMPV systems more precisely. It was concluded that multicrystalline silicon modules are the commercially available material with the highest efficiency but, because of their high cost, development is shifting towards CdTe thin-film materials. CdTe thin film is gradually establishing its position in the commercial photovoltaic (PV) market because of growing efficiency and reasonable cost, which are very important for large-scale GMPV systems. Finally, it was suggested that third-generation technology, which combines Generation I and Generation II technologies to achieve high efficiency at reasonable cost, has the highest potential for application in GMPV.
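
    The sketch below shows, in a back-of-envelope way, how the data gaps listed above (rated power, performance ratio, degradation rate, lifetime) propagate into a carbon-intensity figure. Every number is a placeholder assumption, not Ecoinvent v3.3 data or a result from the thesis.

```python
# Toy propagation of PV plant parameters into a g CO2/kWh figure.
# All values below are placeholder assumptions for illustration only.
def lifetime_yield_kwh(p_rated_kwp, irradiation_kwh_per_kwp_year,
                       performance_ratio, degradation_per_year, lifetime_years):
    """Sum the yearly output of a GMPV plant with compounding degradation."""
    total = 0.0
    for year in range(lifetime_years):
        derate = (1.0 - degradation_per_year) ** year
        total += p_rated_kwp * irradiation_kwh_per_kwp_year * performance_ratio * derate
    return total

embodied_kg_co2 = 1.5e6                 # assumed cradle-to-gate emissions of the whole plant
yield_kwh = lifetime_yield_kwh(
    p_rated_kwp=1000.0,                 # 1 MWp plant (assumption)
    irradiation_kwh_per_kwp_year=1100.0,
    performance_ratio=0.8,
    degradation_per_year=0.007,         # 0.7 %/year, a commonly quoted figure
    lifetime_years=30,
)
print(f"carbon intensity ≈ {1000.0 * embodied_kg_co2 / yield_kwh:.1f} g CO2/kWh")
```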

    Irish Building Services News


    Real-time Condition Monitoring and Asset Management of Oil-Immersed Power Transformers

    This research pioneers a comprehensive asset management methodology relying solely on online dissolved gas analysis. Integrating advanced AI algorithms, the model was trained and rigorously tested on real-world data, demonstrating its efficacy in optimizing asset performance and reliability.
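
    As a hedged illustration of the kind of AI pipeline such a methodology might rely on, the sketch below trains a generic classifier on dissolved-gas features. It uses random data so the snippet runs end to end; it is not the model or the data from this research.

```python
# Generic classifier on dissolved-gas features (H2, CH4, C2H2, C2H4, C2H6, CO)
# mapped to fault labels. The data here is random noise purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
gas_ppm = rng.lognormal(mean=3.0, sigma=1.0, size=(500, 6))   # fake DGA samples
fault = rng.integers(0, 3, size=500)                          # fake labels: 0=normal, 1=thermal, 2=arcing

X_train, X_test, y_train, y_test = train_test_split(gas_ppm, fault, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```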

    Local Learning Strategies for Data Management Components

    In a world with an ever-increasing amount of data being processed, providing tools for high-quality and fast data processing is imperative. Database Management Systems (DBMSs) are complex adaptive systems supplying reliable and fast data analysis and storage capabilities. To boost the usability of DBMSs even further, a core research area of databases is performance optimization, especially for query processing. With the successful application of Artificial Intelligence (AI) and Machine Learning (ML) in other research areas, the question arises in the database community whether ML can also be beneficial for better data processing in DBMSs. This question has spawned various works successfully replacing DBMS components with ML models. However, these global models share four drawbacks stemming from their large, complex, and inflexible one-size-fits-all structures: high complexity of the model architectures, lower prediction quality, slow training, and slow forward passes. All of these drawbacks stem from the core expectation of solving a given problem with one large model at once. The full potential of ML models as DBMS components cannot be reached with a global model because the model's complexity is outmatched by the problem's complexity. Therefore, we present a novel general strategy for using ML models to solve data management problems and to replace DBMS components. The novel strategy is based on four advantages derived from the four disadvantages of global learning strategies. In essence, our local learning strategy uses divide-and-conquer to place less complex but more expressive models that specialize in sub-problems of a data management problem. It splits the problem space into less complex parts that can be solved with lightweight models, circumventing the one-size-fits-all characteristics and drawbacks of global models. We show that this approach and the lower complexity of the specialized local models lead to better problem-solving quality and DBMS performance. The local learning strategy is applied and evaluated in three crucial use cases for replacing DBMS components with ML models: cardinality estimation, query optimizer hinting, and integer algorithm selection. In all three applications, the benefits of the local learning strategy are demonstrated and compared to related work. We also generalize the strategy's usability for broader application and formulate best practices with instructions for others.
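
    A minimal sketch of the divide-and-conquer idea behind such a local learning strategy: partition the input domain, fit a lightweight specialist model per partition and route each query to its partition, instead of fitting one global model. The partitioning scheme and models below are illustrative assumptions, not the thesis implementation.

```python
# Local-vs-global learning on a toy piecewise function: one linear model per
# fixed-width partition of the input versus a single global linear model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=(2000, 1))
y = np.where(x[:, 0] < 5, 2.0 * x[:, 0], 50.0 - 3.0 * x[:, 0]) + rng.normal(0, 0.5, 2000)

# Split the domain into simpler sub-problems (here: four fixed-width bins on x).
edges = np.linspace(0, 10, 5)
local_models = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (x[:, 0] >= lo) & (x[:, 0] < hi)
    local_models.append(LinearRegression().fit(x[mask], y[mask]))

def predict_local(x_new: float) -> float:
    """Route a point to its partition's specialist model."""
    i = min(int(np.searchsorted(edges, x_new, side="right")) - 1, len(local_models) - 1)
    return float(local_models[i].predict([[x_new]])[0])

global_model = LinearRegression().fit(x, y)        # one-size-fits-all baseline
print("local :", predict_local(7.0))
print("global:", float(global_model.predict([[7.0]])[0]))
```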

    Vertical Structures in the Global Liquefied Natural Gas Market: Empirical Analyses Based on Recent Developments in Transaction Cost Economics

    During the last decade, the global liquefied natural gas (LNG) market changed substantially. Significant investments have been realized, traded volumes increased and contracting structures gained in flexibility. Various governance forms co-exist, including the poles of spot market transactions and vertical integration as well as numerous hybrid forms such as long-term contracts, joint ventures, and strategic partnerships. This dissertation empirically investigates, based on transaction cost economics and recent extensions thereof, which motivations drive companies towards the choice of hierarchical governance forms. First, the likelihood of vertical integration and the impact of inter-organizational trust as a shift parameter accounting for differences in the institutional environment are analyzed. Estimation results confirm transaction cost economics by showing that relationship-specific investments in an uncertain environment drive LNG companies to invest in successive stages along the value chain. Furthermore, the presence of inter-organizational trust increases the likelihood of less hierarchical governance modes. Second, alternative theories of the firm are linked in order to explain the menu of strategic positions recently observed in this dynamic market. Estimation results support the positioning-economizing perspective of the firm: the three strategic choices of target market position, resource profile, and organizational structure are interdependent. Third, the determinants of optimal contract length are discussed as a trade-off between minimizing the transaction costs of repeated bilateral bargaining and the risk of being bound in an inflexible agreement in an uncertain environment. Estimation results show that high asset specificity results in longer contracts, whereas the need for flexibility in today's LNG market supports shorter agreements. When firms have experience in bilateral trading, contract duration decreases. In addition, countries heavily reliant on natural gas imports via LNG are often willing to forgo some flexibility in favor of supply security. Contracts dedicated to competitive downstream markets are on average shorter than those concluded with customers in non-liberalized importing countries.
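
    The following sketch shows the general shape of such a contract-length estimation on synthetic data: regressing contract duration on proxies for asset specificity, trading experience and import dependence. Variable names and coefficients are invented for illustration and do not reproduce the dissertation's models or data.

```python
# OLS on synthetic data mimicking a "determinants of contract length" setup.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
asset_specificity = rng.uniform(0, 1, n)    # e.g. dedicated LNG infrastructure (assumed proxy)
trading_experience = rng.poisson(3, n)      # prior bilateral deals between the parties (assumed)
import_dependence = rng.uniform(0, 1, n)    # buyer's reliance on LNG imports (assumed)
duration_years = (5 + 15 * asset_specificity - 0.8 * trading_experience
                  + 6 * import_dependence + rng.normal(0, 2, n))

X = sm.add_constant(np.column_stack([asset_specificity, trading_experience, import_dependence]))
result = sm.OLS(duration_years, X).fit()
print(result.summary(xname=["const", "asset_specificity", "experience", "import_dependence"]))
```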

    An investigation of the socio-economic, technical and appliance related factors affecting high electrical energy demand in UK homes

    The amount of electricity used in individual UK homes varies considerably. Previous UK energy research has identified that high electricity consuming homes not only use more electricity than others but also appear to be consuming even more electricity over time. Furthermore, there is additional evidence showing that high consuming dwellings have a greater potential to make energy savings than those that consume less. It has been suggested that future UK energy policy might focus on reducing the demand of high electricity consumers in order to reduce overall CO2 emissions; understanding what drives high usage in domestic buildings is therefore essential to support informed decisions. This thesis asserts that, to improve knowledge and understanding of the factors affecting high electrical energy consumption in UK domestic buildings, it is necessary to combine an analysis of the occupants' socio-economic characteristics, the dwelling's technical characteristics and appliance-related aspects with detailed monitoring of the ownership, power demand and occupants' use of electrical appliances. Using a sample of 315 UK homes, the influence of socio-economic, technical and appliance-related characteristics on the probability of a household being a high electrical energy consumer was investigated (odds ratio analysis). Detailed appliance monitoring data were collected from 27 UK homes to establish the contributions of appliance ownership, power demand and use to high electrical energy demand (Appliance Electricity Use Survey). The research found similarly skewed electricity distributions towards high electricity consumers for both the 315- and 27-home cohorts. Conflicting results were, however, obtained from the two household samples with regard to whether high electricity consumers are increasing their electrical energy demand over time. The results of the odds ratio analysis and the Appliance Electricity Use Survey suggest that high electricity consumption in domestic buildings is related to a combination of the socio-economic characteristics of the building occupants, the technical characteristics of the dwelling, and the ownership, power demand and use of electrical appliances.
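
    As a sketch of the odds-ratio analysis described above, the snippet below fits a logistic model of being a high consumer and reads the exponentiated coefficients as odds ratios. The predictors and data are invented so the example is self-contained; they are not the study's variables or results.

```python
# Logistic regression with odds ratios on synthetic household data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 315
occupants = rng.integers(1, 6, n)            # socio-economic factor (assumed)
floor_area = rng.normal(90, 25, n)           # dwelling technical factor in m² (assumed)
n_appliances = rng.integers(10, 40, n)       # appliance-related factor (assumed)

# Generate a binary "high consumer" outcome from a known logistic model.
logit = -6 + 0.4 * occupants + 0.02 * floor_area + 0.05 * n_appliances
high_consumer = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([occupants, floor_area, n_appliances]))
fit = sm.Logit(high_consumer, X).fit(disp=False)
odds_ratios = np.exp(fit.params)             # OR > 1 means the factor raises the odds
print(dict(zip(["const", "occupants", "floor_area", "n_appliances"], odds_ratios.round(2))))
```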