    Data science for engineering design: State of the art and future directions

    Engineering design (ED) is the process of solving technical problems within requirements and constraints to create new artifacts. Data science (DS) is the interdisciplinary field that uses computational systems to extract knowledge from structured and unstructured data. The synergies between these two fields have a long history, and over the past decades ED has increasingly benefited from integration with DS. We present a literature review at the intersection of ED and DS, identifying the tools, algorithms and data sources that show the most potential to contribute to ED, and identifying a set of challenges that future data scientists and designers should tackle to maximize the potential of DS in supporting effective and efficient design. A rigorous scoping review approach, supported by Natural Language Processing techniques, is used to review research across two disciplines with fuzzy boundaries. The paper identifies challenges related to the two fields of research and to their interfaces. The main gaps in the literature revolve around adapting computational techniques to the peculiar context of design, identifying data sources to boost design research, and properly featurizing these data. The challenges have been classified by their impact on ED phases and the applicability of DS methods, giving a map for future research across the fields. The scoping review shows that to take full advantage of DS tools, collaboration between design practitioners and researchers must increase in order to open new data-driven opportunities.
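
    The abstract notes that the scoping review was supported by Natural Language Processing techniques. As a rough illustration of what such support might look like (not the authors' actual pipeline), the sketch below clusters a small invented set of abstracts into themes using TF-IDF features and non-negative matrix factorisation; the corpus, topic count and parameters are all assumptions made for illustration.

```python
# A minimal, hypothetical sketch of NLP support for a scoping review:
# cluster abstracts into themes with TF-IDF features and NMF topic modelling.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

abstracts = [  # invented stand-ins for harvested paper abstracts
    "data-driven design optimisation of mechanical components",
    "machine learning for requirements elicitation in engineering design",
    "sensor data featurization for product usage analysis",
    "deep learning surrogate models for simulation-driven design",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(abstracts)

nmf = NMF(n_components=2, random_state=0)   # topic count chosen arbitrarily
doc_topics = nmf.fit_transform(X)           # document-topic weights
terms = tfidf.get_feature_names_out()

for k, component in enumerate(nmf.components_):
    top_terms = [terms[i] for i in component.argsort()[-5:][::-1]]
    print(f"theme {k}: {', '.join(top_terms)}")
```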

    Operational Research: Methods and Applications

    Throughout its history, Operational Research (OR) has evolved to include a variety of methods, models and algorithms that have been applied to a diverse and wide range of contexts. This encyclopedic article consists of two main sections: methods and applications. The first aims to summarise up-to-date knowledge and provide an overview of the state-of-the-art methods and key developments in the various subdomains of the field. The second offers a wide-ranging list of areas where Operational Research has been applied. The article is meant to be read in a nonlinear fashion and used as a point of reference or first port of call for a diverse pool of readers: academics, researchers, students, and practitioners. The entries within the methods and applications sections are presented in alphabetical order. The authors dedicate this paper to the victims of the 2023 Turkey/Syria earthquake; we sincerely hope that advances in OR will play a role in minimising the pain and suffering caused by this and future catastrophes.

    Managing Distributed Cloud Applications and Infrastructure

    The emergence of the Internet of Things (IoT), combined with greater heterogeneity not only within cloud computing architectures but across the cloud-to-edge continuum, introduces new challenges for managing applications and infrastructure across this continuum. The scale and complexity are now such that it is no longer realistic for IT teams to manually foresee potential issues and manage the dynamism and dependencies across an increasingly interdependent chain of service provision. This Open Access Pivot explores these challenges and offers a solution for the intelligent and reliable management of physical infrastructure and the optimal placement of applications for the provision of services on distributed clouds. The book provides a conceptual reference model for reliable capacity provisioning for distributed clouds and discusses how data analytics and machine learning, application and infrastructure optimization, and simulation can deliver quality-of-service requirements cost-efficiently in this complex feature space. These ideas are illustrated through a series of case studies in cloud computing, telecommunications, big data analytics, and smart cities.
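
    The book's theme of optimally placing applications on distributed clouds can be illustrated with a toy heuristic. The sketch below is not the book's reference model: it simply assigns each application, greedily, to the lowest-latency site that still has spare capacity, and all site names, capacities and demands are hypothetical.

```python
# Minimal sketch: greedy placement of applications onto cloud/edge sites,
# preferring the lowest-latency site that still has spare CPU capacity.
# All data below is hypothetical.

sites = {                      # capacity in CPU cores, latency in ms to users
    "edge-1":  {"capacity": 8,  "latency": 5},
    "edge-2":  {"capacity": 4,  "latency": 8},
    "cloud-1": {"capacity": 64, "latency": 40},
}
apps = [("video-analytics", 6), ("iot-ingest", 3), ("batch-report", 20)]

def place(apps, sites):
    free = {name: s["capacity"] for name, s in sites.items()}
    plan = {}
    for app, demand in apps:
        candidates = [n for n in sites if free[n] >= demand]
        if not candidates:
            plan[app] = None                   # would need scaling or rejection
            continue
        best = min(candidates, key=lambda n: sites[n]["latency"])
        free[best] -= demand
        plan[app] = best
    return plan

print(place(apps, sites))
# {'video-analytics': 'edge-1', 'iot-ingest': 'edge-2', 'batch-report': 'cloud-1'}
```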

    BIM-based software for construction waste analytics using artificial intelligence hybrid models

    The construction industry generates about 30% of the total waste in the UK. The current high cost of landfill and the severe environmental impact of waste reveal the need to reduce waste generated from construction activities. Although the literature shows that the best approach to Construction Waste (CW) management is minimisation at the design stage, current tools are not robust enough to support architects and design engineers. A review of extant literature reveals that the key limitations of existing CW management tools are that they are not integrated with the design process and that they lack Building Information Modelling (BIM) compliance, because the tools are external to the BIM design tools used by architects and design engineers. This study therefore investigates BIM-based strategies for CW management and develops Artificial Intelligence (AI) hybrid models to predict CW at the design stage. The model was then integrated into Autodesk Revit as an add-in (BIMWaste) to provide CW analytics. Based on a critical realism paradigm, the study adopts an exploratory sequential mixed-methods design, which combines qualitative and quantitative methods in a single study. The study starts with a review of extant literature and Focus Group Interviews (FGIs) with industry practitioners. The transcripts of the FGIs were subjected to thematic analysis to identify prevalent themes from the quotations. The factors from the literature review and FGIs were then combined into a questionnaire survey and distributed to industry practitioners. The questionnaire responses were subjected to a rigorous statistical process to identify key strategies for a BIM-based approach to waste-efficient design coordination. Factor analysis revealed five groups of BIM strategies for CW management: (i) improved collaboration for waste management, (ii) waste-driven design process and solutions, (iii) lifecycle waste analytics, (iv) innovative technologies for waste intelligence and analytics, and (v) improved documentation for waste management. The results improve the understanding of BIM functionalities and how they could improve the effectiveness of existing CW management tools. The key strategies were then developed into a holistic BIM framework for CW management, incorporating industrial and technological requirements for BIM-enabled waste management into an integrated system. The framework guided the development of the AI hybrid models and the BIM-based tool for CW management. An Adaptive Neuro-Fuzzy Inference System (ANFIS) model was developed for CW prediction, and mathematical models were developed for CW minimisation. Based on historical Construction Waste Records (CWR) from 117 building projects, the model development reveals that the two key predictors of CW are "GFA" and "Construction Type". The final models were then incorporated into Autodesk Revit to enable the prediction of CW from building designs. The performance of the final tool was tested using a test plan and two test cases. The results show that the tool performs well and predicts CW by waste type, element type, and building level. The study generated several implications of interest to stakeholders in the construction industry. In particular, it provides a clear direction on how CW management strategies could be integrated into a BIM platform to streamline CW analytics.
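
    The thesis reports that the two key predictors of construction waste are GFA (gross floor area) and construction type, and fits an ANFIS model to records from 117 projects. As a rough, accessible stand-in (a plain linear regression, not the ANFIS model itself), the sketch below predicts waste from those two predictors on a handful of invented records.

```python
# A minimal stand-in (plain linear regression, not ANFIS) for predicting
# construction waste from the two reported predictors: gross floor area (GFA)
# and construction type. All records and figures below are invented.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import OneHotEncoder
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

records = pd.DataFrame({
    "gfa_m2": [1200, 800, 2500, 1500, 600],
    "construction_type": ["steel frame", "timber frame", "concrete frame",
                          "steel frame", "timber frame"],
})
waste_tonnes = [45.0, 22.0, 110.0, 58.0, 16.0]   # hypothetical waste records

preprocess = ColumnTransformer(
    [("type", OneHotEncoder(handle_unknown="ignore"), ["construction_type"])],
    remainder="passthrough",                      # keep GFA as a numeric feature
)
model = make_pipeline(preprocess, LinearRegression())
model.fit(records, waste_tonnes)

new_design = pd.DataFrame({"gfa_m2": [1000], "construction_type": ["steel frame"]})
print(model.predict(new_design))                  # predicted waste in tonnes
```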

    CPS Data Streams Analytics based on Machine Learning for Cloud and Fog Computing: A Survey

    Cloud and fog computing have emerged as a promising paradigm for the Internet of Things (IoT) and cyber-physical systems (CPS). One characteristic of CPS is the reciprocal feedback loops between physical processes and cyber elements (computation, software and networking), which implies that data stream analytics is one of the core components of CPS. The reasons for this are: (i) it extracts insights and knowledge from the data streams generated by the various sensors and other monitoring components embedded in the physical systems; (ii) it supports informed decision making; (iii) it enables feedback from the physical processes to the cyber counterparts; and (iv) it ultimately facilitates the integration of cyber and physical systems. There have been many successful applications of data stream analytics, powered by machine learning techniques, to CPS. It is therefore necessary to survey the particularities of applying machine learning techniques in the CPS domain. In particular, we explore how machine learning methods should be deployed and integrated in cloud and fog architectures to better fulfil the requirements, such as mission criticality and time criticality, that arise in CPS domains. To the best of our knowledge, this paper is the first to systematically study machine learning techniques for CPS data stream analytics from various perspectives, especially with a view to guiding how CPS machine learning methods should be deployed in a cloud and fog architecture.
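
    One deployment pattern of the kind the survey discusses is splitting analytics between fog and cloud to meet time-critical requirements: a lightweight model runs at the fog node, and only suspicious readings are forwarded to the cloud for heavier analysis. The sketch below illustrates that idea with an assumed set-up; the sliding-window z-score detector, thresholds and sensor values are illustrative inventions, not drawn from the paper.

```python
# Minimal sketch of a fog/cloud split for CPS stream analytics: a lightweight
# anomaly filter runs at the fog node, forwarding only flagged readings to the
# cloud tier. Detector, thresholds and sensor values are illustrative only.
from collections import deque
import statistics

class FogAnomalyFilter:
    """Sliding-window z-score detector intended for a resource-limited fog node."""
    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def process(self, value):
        flagged = False
        if len(self.window) >= 10:                     # need some history first
            mean = statistics.fmean(self.window)
            stdev = statistics.pstdev(self.window) or 1e-9
            flagged = abs(value - mean) / stdev > self.threshold
        self.window.append(value)
        return flagged

def forward_to_cloud(reading):
    # Placeholder for an MQTT/HTTP call to the cloud analytics tier.
    print(f"forwarded to cloud: {reading}")

detector = FogAnomalyFilter()
for reading in [20.1, 20.3, 19.8] * 10 + [45.0]:       # synthetic sensor stream
    if detector.process(reading):
        forward_to_cloud(reading)
```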

    Many-objective design of reservoir systems - Applications to the Blue Nile

    This work proposes a multi-criteria optimization-based approach for supporting the negotiated design of multi-reservoir systems. The research addresses the multi-reservoir system design problem (selecting among alternative options, reservoir sizing), the capacity expansion problem (timing the activation of new assets and the filling of new large reservoirs), and the management of multi-reservoir systems at various expansion stages. The aim is to balance multiple long- and short-term performance objectives of relevance to stakeholders with differing interests. The work also investigates how problem re-formulations can be used to improve computational efficiency at the design and assessment stage, and proposes a framework for post-processing many-objective optimization results to facilitate negotiation among multiple stakeholders. The proposed methods are demonstrated on the Blue Nile in a suite of proof-of-concept studies. Results take the form of Pareto-optimal trade-offs, where each point on the curve or surface represents a water resource system design (i.e., asset choice, size, implementation dates of reservoirs, and operating policy) and coordination strategy (e.g., cost sharing and power trade), and where further benefits in one measure necessarily come at the expense of another. Technical chapters aim to offer practical Nile management and/or investment recommendations derived from the analysis, which could be refined in future, more detailed studies.
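
    The Pareto-optimal trade-offs mentioned here rest on the notion of dominance: a design stays on the trade-off surface only if no other design is at least as good on every objective and strictly better on at least one. The sketch below applies that filter to a few invented candidate designs with three objectives to minimise; the thesis itself uses many-objective evolutionary optimization rather than this brute-force check, and all names and numbers are assumptions.

```python
# Minimal sketch of Pareto-dominance filtering over candidate reservoir designs.
# Each candidate carries three objectives to minimise, e.g.
# (cost in $bn, energy deficit in TWh/yr, downstream flow deficit). Invented data.
candidates = [
    ("design-A", (4.0, 1.2, 0.30)),
    ("design-B", (3.5, 1.5, 0.25)),
    ("design-C", (4.2, 1.1, 0.35)),
    ("design-D", (4.5, 1.6, 0.40)),   # dominated by design-A
]

def dominates(a, b):
    """True if objective vector a is no worse than b everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [
    (name, obj) for name, obj in candidates
    if not any(dominates(other, obj) for _, other in candidates if other != obj)
]
print(pareto)   # design-D drops out; A, B and C form the trade-off set
```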

    Mechanisms Driving Digital New Venture Creation & Performance: An Insider Action Research Study of Pure Digital Entrepreneurship in EdTech

    Digitisation has ushered in a new era of value creation in which cross-border data flows generate more economic value than traditional flows of goods. The powerful new combination of digital and traditional forms of innovation has seen several new industries branded with a ‘tech’ suffix. In the education technology (EdTech) sector, the industry context of this research, digitisation is driving double-digit growth towards a projected $240 billion industry by 2021. Yet, despite its contemporary significance, the field of entrepreneurship has paid little attention to the phenomenon of digital entrepreneurship. As several scholars observe, digitisation challenges core organising axioms of entrepreneurship, with significant implications for the new venture creation process in new sectors such as EdTech. New venture creation no longer appears to follow discrete and linear models of innovation, as spatial and temporal boundaries are compressed. Given this paradigmatic shift, the study investigates three interrelated themes. Firstly, it seeks to determine how a Pure Digital Entrepreneurship (PDE) process develops over time and, more importantly, how the journey challenges extant assumptions of the entrepreneurial process. Secondly, it strives to identify and theorise the deep structures that underlie the PDE process through mechanism-based explanations; consequently, the study also seeks to determine the causal pathways and enablers which overtly or covertly interrelate to power new venture emergence and performance. Thirdly, it aims to offer practical guidelines for nurturing the growth of PDE ventures and for the development of supportive ecosystems. To meet these objectives, the study uses an Insider Action Research (IAR) approach to inquiry, which incorporates reflective practice, collaborative inquiry and design research for third-person knowledge production. This three-pronged approach allows a PDE journey to be enacted in real time while acquiring a holistic narrative in the ‘swampy lowlands’ of new venture creation. The findings indicate that the PDE process is differentiated by the centrality of digital artifacts in new venture ideas, which in turn results in less-bounded processes that deliver temporal efficiencies – hence the shorter new venture creation processes compared with traditional forms of entrepreneurship. Further, PDE action is defined by two interrelated events – digital product development and digital growth marketing – characterised by the constant forking, merging and termination of diverse activities. Concurrent enactment and piecemeal co-creation were found to be consequential mechanisms driving temporal efficiencies in digital product development, while data-driven operation and flexibility combine in digital growth marketing to form higher-order mechanisms that considerably reduce task-specific and outcome uncertainties. Finally, the study finds that digital growth marketing is differentiated from traditional marketing by the critical role of algorithmic agencies in their capacity as gatekeepers; unlike traditional marketing, which emphasises customer sovereignty, digital growth marketing involves a dual focus on the needs of human and algorithmic stakeholders. Based on the findings, the research develops a pragmatic model of pure digital new venture creation and suggests critical policy guidelines for nurturing the growth of PDE ventures and ecosystems.