    A metric to represent the evolution of CAD/analysis models in collaborative design

    Computer Aided Design (CAD) and Computer Aided Engineering (CAE) models are often used during product design. The various interactions between the different models must be managed for the designed system to be robust and in accordance with the initially defined specifications. Research published to date has, for example, considered the link between the digital mock-up and analysis models. However, design/analysis integration must take into account the large number of models (digital mock-up and simulation) that results from model evolution over time, as well as considering system engineering. To effectively manage modifications made to the system, the dependencies between the different models must be known, and the nature of each modification must be characterised in order to estimate its impact throughout the dependent models. We propose a technique to describe the nature of a modification, which may be used to determine its consequences within other models, as well as a way to qualify the modified information. To achieve this, a metric is proposed that allows the qualification and evaluation of data or information, based on the maturity and validity of information and models.
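    The abstract does not state the metric's actual formula. As a purely illustrative sketch, a qualification score could combine a maturity level and a validity level per piece of model information; every name, scale, and weight below is hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class ModelDatum:
    """A piece of CAD/CAE model information (hypothetical structure)."""
    maturity: float   # 0.0 (draft) .. 1.0 (released); assumed scale
    validity: float   # 0.0 (invalidated) .. 1.0 (verified); assumed scale

def qualification_score(datum: ModelDatum, w_maturity: float = 0.5) -> float:
    """Combine maturity and validity into one score (illustrative weighting)."""
    return w_maturity * datum.maturity + (1.0 - w_maturity) * datum.validity

def needs_reevaluation(score_before: float, score_after: float,
                       threshold: float = 0.2) -> bool:
    """Flag dependent models when a modification degrades the score sharply."""
    return (score_before - score_after) > threshold
```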

    Empowering citizens' cognition and decision making in smart sustainable cities

    Advances in Internet technologies have made it possible to gather, store, and process large quantities of data, often in real time. In smart and sustainable cities, this big data generates useful information and insights for citizens, service providers, and policy makers. Transforming this data into knowledge empowers citizens' cognition and supports decision-making routines. However, several operational and computing issues need to be taken into account: 1) efficient data description and visualization, 2) forecasting citizens' behavior, and 3) supporting decision making with intelligent algorithms. This paper identifies several challenges associated with the use of data analytics in smart sustainable cities and proposes the use of hybrid simulation-optimization and machine learning algorithms as an effective approach to empower citizens' cognition and decision making in such ecosystems.
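    The paper proposes hybrid simulation-optimization with machine learning, but the abstract gives no algorithmic detail. As a rough illustration of the general simulation-optimization pattern only, the toy objective, noise model, and search move below are invented for this sketch.

```python
import random

def simulate(solution: list[float], n_runs: int = 30) -> float:
    """Toy stochastic simulation: noisy evaluation of a candidate solution."""
    base = sum((x - 1.0) ** 2 for x in solution)   # invented objective
    return sum(base + random.gauss(0, 0.1) for _ in range(n_runs)) / n_runs

def neighbour(solution: list[float]) -> list[float]:
    """Perturb one decision variable (simple local-search move)."""
    s = solution[:]
    s[random.randrange(len(s))] += random.uniform(-0.5, 0.5)
    return s

def simheuristic(dim: int = 3, iters: int = 200) -> list[float]:
    """Hybrid loop: a metaheuristic proposes, the simulation evaluates under noise."""
    best = [random.uniform(-2, 2) for _ in range(dim)]
    best_cost = simulate(best)
    for _ in range(iters):
        cand = neighbour(best)
        cost = simulate(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best
```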

    The interaction of lean and building information modeling in construction

    Lean construction and Building Information Modeling are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which they might be improved by applying either of these paradigms independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, fifty-six interactions have been identified, all but four of which represent constructive interaction. Although evidence for the majority of these has been found, the matrix is not considered complete, but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers and developers of IT systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
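    The interaction matrix juxtaposing BIM functionalities with lean principles can be pictured as a simple lookup structure. The miniature below is purely illustrative; its rows, columns, and entries are placeholders, not the paper's actual fifty-six interactions.

```python
# Hypothetical miniature of the BIM/lean interaction matrix: rows are BIM
# functionalities, columns are lean principles, and each cell records whether
# an identified interaction is constructive ("+") or negative ("-").
bim_functionalities = ["visualization of form", "clash checking", "4D scheduling"]
lean_principles = ["reduce variability", "reduce cycle time", "increase transparency"]

interactions: dict[tuple[str, str], str] = {
    ("clash checking", "reduce variability"): "+",            # placeholder entry
    ("4D scheduling", "reduce cycle time"): "+",              # placeholder entry
    ("visualization of form", "increase transparency"): "+",  # placeholder entry
}

constructive = sum(1 for v in interactions.values() if v == "+")
print(f"{constructive} of {len(interactions)} identified interactions are constructive")
```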

    RH-RT: A Data Analytics Framework for Reducing Wait Time at Emergency Departments and Centres for Urgent Care

    Right Hospital – Right Time (RH-RT) is the conceptualization of the use of descriptive, predictive, and prescriptive analytics with real-time data from Accident & Emergency (A&E)/Emergency Departments (ED) and centres for urgent care; its objective is to derive maximum value from wait-time data by using data analytics techniques and making them available to both patients and healthcare organizations. The paper presents an architecture for the implementation of RH-RT that is specific to the authors' current work on a digital platform (NHSquicker) that makes available live waiting times from multiple centres of urgent care (e.g., A&E/ED, Minor Injury Units) in Devon and Cornwall. The focus of the paper is on the development of a Hybrid Systems Model (HSM) comprising healthcare business intelligence, forecasting techniques, and computer simulation. The contribution of the work is the conceptual RH-RT framework and its implementation architecture, which relies on near real-time data from NHSquicker. Funding: Torbay Medical Research Fund; Economic and Social Research Council (ESRC); Academic Health Science Network.
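    The abstract does not detail the HSM's forecasting component. As a minimal illustration of the kind of step such a layer might perform, the sketch below smooths a stream of live wait-time observations to project the next interval; the data and smoothing constant are invented, not from the paper.

```python
def exponential_smoothing(waits_minutes: list[float], alpha: float = 0.3) -> float:
    """Forecast the next wait time from a stream of observations.

    Simple exponential smoothing: recent observations carry more weight.
    alpha is a hypothetical smoothing constant chosen for this sketch.
    """
    forecast = waits_minutes[0]
    for w in waits_minutes[1:]:
        forecast = alpha * w + (1 - alpha) * forecast
    return forecast

# Toy example: live waits (minutes) reported by an urgent-care centre.
recent_waits = [42.0, 55.0, 50.0, 61.0, 58.0]
print(f"Next-interval forecast: {exponential_smoothing(recent_waits):.1f} min")
```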

    Investigation of degradation and upgradation models for flexible unit systems: a systematic literature review

    Research on flexible unit systems (FUS) in the context of descriptive, predictive, and prescriptive analysis has progressed remarkably in recent times, and is now reinforced in the current Industry 4.0 era by the increased focus on the integration of distributed and digitalized systems. In the existing literature, most of the work has focused on the individual contributions of the three analyses mentioned above. Moreover, the current literature is unclear with respect to the integration of degradation and upgradation models for FUS. This paper presents a systematic literature review that considers degradation, residual life distribution, workload adjustment strategy, upgradation, and predictive maintenance as major performance measures for investigating the performance of FUS. In order to identify the key issues and research gaps in the existing literature, the 59 most relevant papers from 2009 to 2020 have been sorted and analyzed. Finally, we identify promising research opportunities that could expand the scope and depth of FUS research. The project is funded by the Department of Science and Technology, Science & Engineering Research Board (DST-SERB), Statutory Body Established through an Act of Parliament: SERB Act 2008, Government of India, with Sanction Order No ECR/2016/001808, and also by FCT (Fundação para a Ciência e Tecnologia) through the R&D Units Project Scope UIDB/00319/2020.

    The Structured Process Modeling Method (SPMM) : what is the best way for me to construct a process model?

    More and more organizations turn to the construction of process models to support strategic and operational tasks. At the same time, reports indicate quality issues for a considerable part of these models, caused by modeling errors. The research described in this paper therefore investigates the development of a practical method to determine and train an optimal process modeling strategy that aims to decrease the number of cognitive errors made during modeling. Such cognitive errors originate in inadequate cognitive processing caused by the inherent complexity of constructing process models. The method helps modelers derive their personal cognitive profile and the related optimal cognitive strategy that minimizes these cognitive failures. The contribution of the research consists of the conceptual method and an automated modeling-strategy selection and training instrument. These two artefacts were positively evaluated in a laboratory experiment covering multiple modeling sessions and involving a total of 149 master's students at Ghent University.

    Numerical simulation of flooding from multiple sources using adaptive anisotropic unstructured meshes and machine learning methods

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain imposes a high computational burden. In this thesis, a 2D control-volume and finite-element (DCV-FEM) flood model using adaptive unstructured mesh technology has been developed. This technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and of wetting and drying fronts while reducing the computational cost, and complex topographic features are represented accurately during the flooding process. The mesh can be modified dynamically (both coarsened and refined) to achieve a desired precision, thus better capturing transient and complex flow dynamics as the flow evolves. A flooding event that occurred in 2002 in Glasgow, Scotland, United Kingdom has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The comparison shows that the 2D adaptive mesh model provides accurate results at a low computational cost.

    The adaptive mesh flooding model (named Floodity) has been further developed by introducing (1) an anisotropic dynamic mesh optimization technique (anisotropic-DMO); (2) multiple flooding sources (extreme rainfall and sea-level events); and (3) a unique combination of anisotropic-DMO and high-resolution Digital Terrain Model (DTM) data. It has been applied to a densely urbanized area within Greve, Denmark, and results from MIKE 21 FM are utilized to validate the model. To assess uncertainties in the model predictions, a sensitivity analysis of the flooding results to extreme sea levels, rainfall, and mesh resolution has been undertaken. The use of anisotropic-DMO enables high-resolution topographic features (buildings, rivers and streets) to be captured only where and when needed, providing improved flooding predictions while reducing the computational cost, and allows the evolving flow features (wetting-drying fronts) to be better captured.

    To provide real-time spatio-temporal flood predictions, an integrated long short-term memory (LSTM) and reduced order model (ROM) framework has been developed. This integrated LSTM-ROM is capable of representing the spatio-temporal distribution of floods since it takes advantage of both ROM and LSTM. To reduce the dimensional size of the large spatial datasets fed to the LSTM, the proper orthogonal decomposition (POD) and singular value decomposition (SVD) approaches are introduced. The performance of the LSTM-ROM developed here has been evaluated using the Okushiri tsunami as a test case, and the results have been compared with those from the full model (Fluidity). The promising results indicate that LSTM-ROM can provide flood predictions in seconds, enabling real-time flood prediction and timely warning of the public, reducing injuries and fatalities.

    Additionally, data-driven optimal sensing for reconstruction (DOSR) and data assimilation (DA) have been further introduced into LSTM-ROM. This linkage between modelling and experimental data/observations allows model errors to be minimized and uncertainties to be determined, thus improving the accuracy of modelling. It should be noted that once the DA approach is introduced, the prediction errors are significantly reduced at the time levels where an assimilation procedure is performed, illustrating the ability of DOSR-LSTM-DA to substantially improve model performance. By using DOSR-LSTM-DA, the predictive horizon can be extended to three times the initial horizon and, more importantly, the online CPU cost of DOSR-LSTM-DA is only 1/3 of that required to run the full model.
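    The thesis abstract names POD/SVD compression feeding an LSTM; the sketch below illustrates that general LSTM-ROM pattern only. The snapshot matrix, rank, layer sizes, and training loop are arbitrary illustrative choices, not those of the thesis.

```python
import numpy as np
import torch
import torch.nn as nn

# Toy snapshot matrix: each column is a flood-field snapshot at one time step.
n_cells, n_steps, rank = 500, 80, 8
snapshots = np.random.rand(n_cells, n_steps)

# POD via truncated SVD: the leading left singular vectors form the basis.
U, S, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :rank]                  # (n_cells, rank) POD modes
coeffs = basis.T @ snapshots         # (rank, n_steps) reduced coordinates

# The LSTM learns the dynamics of the reduced coefficients: predict t+1 from t.
seq = torch.tensor(coeffs.T, dtype=torch.float32)    # (n_steps, rank)
x, y = seq[:-1].unsqueeze(0), seq[1:].unsqueeze(0)   # batch of one sequence

lstm = nn.LSTM(input_size=rank, hidden_size=32, batch_first=True)
head = nn.Linear(32, rank)
opt = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()), lr=1e-3)

for _ in range(200):                 # short training loop for illustration
    opt.zero_grad()
    out, _ = lstm(x)
    loss = nn.functional.mse_loss(head(out), y)
    loss.backward()
    opt.step()

# Full-field prediction: map predicted coefficients back through the POD basis.
with torch.no_grad():
    next_coeffs = head(lstm(x)[0])[0, -1].numpy()    # next-step coefficients
reconstructed = basis @ next_coeffs                  # (n_cells,) flood field
```

    Because only `rank` coefficients are advanced in time rather than the full `n_cells` field, inference is cheap, which is the property the thesis exploits to obtain predictions in seconds.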