
    State-of-the-art in aerodynamic shape optimisation methods

    Aerodynamic optimisation has become an indispensable component of aerodynamic design over the past 60 years, with applications to aircraft, cars, trains, bridges, wind turbines, internal pipe flows, and cavities, among others; it is thus relevant in many facets of technology. With advancements in computational power, automated design optimisation procedures have become more capable; however, the literature remains ambiguous and biased with regard to the relative performance of optimisation architectures and the algorithms they employ. This paper provides a balanced critical review of the dominant optimisation approaches that have been integrated with aerodynamic theory for the purpose of shape optimisation. A total of 229 papers, published in more than 120 journals and conference proceedings, have been classified into six optimisation algorithm approaches. The material cited includes some of the most well-established authors and publications in the field of aerodynamic optimisation. This paper aims to eliminate bias toward particular algorithms by analysing the limitations, drawbacks, and benefits of the most widely used optimisation approaches. The review offers comprehensive but straightforward insight for non-specialists and a detailed reference on the current state of the art for specialist practitioners.

    Investigation of Energy Modelling Methods of Multiple Fidelities: A Case Study

    Building energy modelling has become an integral part of building design due to energy consumption concerns in sustainable buildings. As such, energy modelling methods have evolved to the point of including higher-order physics and complex interconnected components and sub-systems. Despite advances in computer capacity, the cost of generating and running complex energy simulations makes it impractical to rely exclusively on such higher-fidelity energy modelling for exploring a large set of design alternatives. This challenge might be overcome by using surrogate models to generalize across the large design space from higher-fidelity evaluations of a sparse subset of design alternatives, or by using a set of multi-fidelity models in combination to evaluate the design space efficiently. Given the variety of building energy modelling methods available for energy estimation, multi-fidelity modelling could be a promising approach for broad exploration of design spaces to identify sustainable building designs. Hence, this study investigates energy estimates from three energy modelling methods (modified bin, degree day, EnergyPlus) over a range of design variables and climatic regions. The goal is to better understand how their outputs compare to each other and whether they might be suitable for a multi-fidelity modelling approach. The results show that the modified bin and degree day methods yield energy use estimates of similar magnitude to each other but typically higher than results from EnergyPlus. The differences in the results were traced, as expected, to the heating and cooling end-uses, and specifically to the heat gain and heat loss through opaque (i.e., walls, floors, roofs) and window surfaces. The observed trends show the potential for these methods to be used in multi-fidelity modelling, thereby allowing building designers to broadly consider and compare more design alternatives earlier in the design process.
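    The degree-day method named above can be illustrated with a minimal sketch: heating degree days are summed from daily mean temperatures, then scaled by an overall heat-loss coefficient and system efficiency. The base temperature, UA value, efficiency, and temperatures below are illustrative assumptions, not values from the study.

```python
# Minimal degree-day heating-energy sketch. The 18 °C base temperature,
# UA = 250 W/K, and 90% efficiency are illustrative assumptions.

def heating_degree_days(daily_mean_temps_c, base_c=18.0):
    """Sum of (base - T) over days whose mean temperature is below base."""
    return sum(max(base_c - t, 0.0) for t in daily_mean_temps_c)

def annual_heating_energy_kwh(ua_w_per_k, hdd_k_days, efficiency=0.9):
    """Q = UA * HDD * 24 h / efficiency, converted from W.h to kWh."""
    return ua_w_per_k * hdd_k_days * 24.0 / efficiency / 1000.0

temps = [-2, 5, 10, 20, 25, 8]  # made-up daily mean temperatures (deg C)
hdd = heating_degree_days(temps)
print(hdd, round(annual_heating_energy_kwh(250.0, hdd), 1))
```

    A surrogate or multi-fidelity workflow would treat a cheap estimate like this as the low-fidelity model and EnergyPlus runs as the high-fidelity one.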

    Glance behaviours when using an in-vehicle smart driving aid : a real-world, on-road driving study

    In-vehicle information systems (IVIS) are commonplace in modern vehicles, from the initial satellite navigation and in-car infotainment systems to the more recent driving-related Smartphone applications. Investigating how drivers interact with such systems when driving is key to understanding what factors need to be considered in order to minimise distraction and workload issues while maintaining the benefits the systems provide. This study investigates the glance behaviours of drivers, assessed from video data, when using a smart driving Smartphone application (providing both eco-driving and safety feedback in real time) in an on-road study over an extended period of time. Findings presented in this paper show that using the in-vehicle smart driving aid during real-world driving resulted in the drivers spending an average of 4.3% of their time looking at the system, at an average of 0.43 s per glance, with no glances of greater than 2 s, and accounting for 11.3% of the total glances made. This allocation of visual resource could be considered to be taken from ‘spare’ glances, defined by this study as glances to the road, but off-centre. Importantly, glances to the mirrors, driving equipment and the centre of the road did not reduce with the introduction of the IVIS in comparison to a control condition. In conclusion, an ergonomically designed in-vehicle smart driving system providing feedback to the driver via an integrated and adaptive interface does not lead to visual distraction, with the task being integrated into normal driving.
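    Metrics of the kind reported above (percentage of time on-device, mean glance duration, the 2 s threshold check) can be computed from video-coded glance intervals. A hypothetical sketch follows; the glance durations and drive time are invented for illustration, not the study's data.

```python
# Hypothetical glance-metric computation from coded video annotations.
# The durations and drive time below are invented for illustration.

def glance_metrics(glance_durations_s, drive_time_s):
    """Summarise device glances coded from video footage."""
    total = sum(glance_durations_s)
    return {
        "pct_time_on_device": 100.0 * total / drive_time_s,
        "mean_glance_s": total / len(glance_durations_s),
        "any_over_2s": any(g > 2.0 for g in glance_durations_s),
    }

metrics = glance_metrics([0.4, 0.5, 0.3, 0.5], drive_time_s=60.0)
print(metrics)
```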

    Machine learning for estimation of building energy consumption and performance:a review

    Ever-growing population and progressive municipal business demands for constructing new buildings are known as the foremost contributors to greenhouse gases. Improving the energy efficiency of the building sector has therefore become an essential target for reducing both gas emissions and fossil fuel consumption. One of the most effective approaches to reducing CO2 emissions and energy consumption in new buildings is to consider energy efficiency at a very early design stage. On the other hand, efficient energy management and smart refurbishments can enhance the energy performance of the existing stock. All these solutions entail accurate energy prediction for optimal decision making. In recent years, artificial intelligence (AI) in general, and machine learning (ML) techniques in particular, have been proposed for forecasting building energy consumption and performance. This paper provides a substantial review of the four main ML approaches, including artificial neural networks, support vector machines, Gaussian-based regression, and clustering, which have commonly been applied to forecasting and improving building energy performance.
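    Of the four approaches named above, Gaussian-based regression is compact enough to sketch directly. Below is a minimal Gaussian-process-style regression in pure NumPy; the kernel, length scale, noise level, and toy energy-vs-temperature data are illustrative assumptions, not drawn from the review.

```python
# Minimal GP regression sketch (zero-mean prior, RBF kernel).
# Kernel parameters and the toy data are illustrative assumptions.
import numpy as np

def rbf_kernel(a, b, length=5.0):
    """Squared-exponential kernel between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-3):
    """Posterior mean of a zero-mean GP at the test inputs."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    return rbf_kernel(x_test, x_train) @ np.linalg.solve(K, y_train)

# Toy "monthly heating energy vs. outdoor temperature" data (made up).
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
y = np.array([50.0, 40.0, 28.0, 18.0, 12.0])
print(float(gp_predict(x, y, np.array([7.5]))[0]))
```

    With the small noise term the posterior mean nearly interpolates the training points, which is why GP regression suits sparse, expensive-to-obtain building energy samples.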

    Adaptive Simulation Modelling Using The Digital Twin Paradigm

    Structural Health Monitoring (SHM) involves the application of qualified standards, by competent people, using appropriate processes and procedures throughout the structure's life cycle, from design to decommissioning. The main goal is to ensure, through an ongoing process of risk management, that the structure's continued fitness-for-purpose (FFP) is maintained, allowing for optimal use of the structure with a minimal chance of downtime and catastrophic failure. While undertaking the SHM task, engineers use models to predict the risk to the structure from degradation mechanisms such as corrosion and cracking. These predictive models are physics-based, data-driven, or hybrid. Building these predictive models typically involves processing input parameters related to the material properties (e.g., mass density, modulus of elasticity, polarisation current curve) and/or the environment in order to calibrate the model before using it for predictive simulation. The accuracy of the predictions is therefore strongly dependent upon the input data describing the properties of the materials and/or the environmental conditions the structure experiences. For structures with non-uniform and complex degradation behaviour, this process is repeated over the structure's lifetime, i.e., whenever a new survey is performed (or new data become available), and the survey data are then used to infer changes in the material or environmental properties. This conventional parameter tuning and updating approach is computationally expensive and time-consuming, as multiple simulations are needed and manual intervention is expected to determine the optimal model parameters. There is therefore a need for a fundamental paradigm shift to address the shortcomings of conventional approaches. The Digital Twin (DT) offers such a paradigm shift in that it integrates ultra-high-fidelity simulation models with other related structural data to mirror the structural behaviour of its corresponding physical twin. The DT's inherent ability to handle large volumes of data allows the inclusion of an evolving set of data relating to the structure over time, and provides for adaptation of the simulation model with very little need for human intervention. This research project investigated the DT as an alternative to the existing model calibration and adaptation approach. It developed a design-of-experiment platform for online model validation and adaptation (i.e., parameter-updating) solvers within the Digital Twin paradigm. The platform provided a basis upon which an approach based on the creation of surrogates and a reduced-order-model (ROM)-assisted parameter search was developed to improve the efficiency of model calibration and adaptation. Furthermore, the developed approach formed a basis for developing solvers that provide the self-calibration and self-adaptation capability required for the prediction and analysis of an asset's structural behaviour over time. The research successfully demonstrated that such solvers can be used to efficiently calibrate an ultra-high-fidelity simulation model within a DT environment for accurate prediction of the status of a real-world engineering structure.
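    The surrogate-assisted calibration idea described above can be sketched in miniature: sample an expensive model sparsely, fit a cheap surrogate, then search the surrogate for the parameter that best matches a new measurement. The "expensive" model, surrogate form, and measured value below are illustrative assumptions, not the thesis's solvers.

```python
# Surrogate-assisted parameter calibration sketch. The stand-in model,
# quadratic surrogate, and survey value are illustrative assumptions.
import numpy as np

def expensive_model(k):
    """Stand-in for a high-fidelity simulation: response for parameter k."""
    return 2.0 * k ** 2 + 1.0

# 1) Sample the expensive model sparsely (design of experiments).
k_samples = np.linspace(0.0, 3.0, 5)
responses = np.array([expensive_model(k) for k in k_samples])

# 2) Fit a cheap quadratic surrogate to the samples.
surrogate = np.poly1d(np.polyfit(k_samples, responses, deg=2))

# 3) Calibrate: search the surrogate densely for the parameter whose
#    predicted response best matches the measured survey value.
measured = 9.0  # new survey observation
k_grid = np.linspace(0.0, 3.0, 3001)
k_best = k_grid[np.argmin(np.abs(surrogate(k_grid) - measured))]
print(round(float(k_best), 2))
```

    Only step 1 touches the expensive model; steps 2 and 3 are cheap and can be rerun each time a survey arrives, which is the efficiency argument made above.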

    Ultrasound IMT measurement on a multi-ethnic and multi-institutional database: Our review and experience using four fully automated and one semi-automated methods

    Automated and high-performance carotid intima-media thickness (IMT) measurement is gaining increasing importance in clinical practice for assessing the cardiovascular risk of patients. In this paper, we compare four fully automated IMT measurement techniques (CALEX, CAMES, CARES and CAUDLES) and one semi-automated technique (FOAM). We present our experience using these algorithms, whose lumen-intima and media-adventitia border estimation methods can be: (a) edge-based; (b) training-based; (c) feature-based; or (d) directional Edge-Flow based. Our database (DB) consisted of 665 images that represented a multi-ethnic group and was acquired using four OEM scanners. The performance evaluation protocol adopted error measures, reproducibility measures, and a Figure of Merit (FoM). FOAM showed the best performance, with an IMT bias equal to 0.025 ± 0.225 mm and a FoM equal to 96.6%. Among the four automated methods, CARES showed the best results, with a bias of 0.032 ± 0.279 mm and a FoM equal to 95.6%, statistically comparable to FOAM in terms of accuracy and reproducibility. This is the first time that completely automated and user-driven techniques have been compared on a multi-ethnic dataset acquired using multiple original equipment manufacturer (OEM) machines with different gain settings, representing normal and pathologic cases.
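    The bias and FoM figures quoted above can be computed from paired estimates and ground-truth readings. The sketch below uses one common FoM formulation (100% minus the relative error of the mean estimate); that formula and the sample IMT values are illustrative assumptions, not the paper's data.

```python
# IMT accuracy-metric sketch: bias (mean +/- SD of per-image error) and a
# Figure of Merit. FoM formula and sample values are assumptions.
import statistics

def imt_metrics(est_mm, gt_mm):
    errors = [e - g for e, g in zip(est_mm, gt_mm)]
    bias = statistics.mean(errors)
    sd = statistics.stdev(errors)
    mean_gt = statistics.mean(gt_mm)
    fom = 100.0 * (1.0 - abs(statistics.mean(est_mm) - mean_gt) / mean_gt)
    return bias, sd, fom

est = [0.82, 0.95, 0.70, 1.10]  # algorithm IMT estimates (mm), made up
gt = [0.80, 0.90, 0.75, 1.05]   # ground-truth readings (mm), made up
print(imt_metrics(est, gt))
```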

    Measuring Directed Functional Connectivity Using Non-Parametric Directionality Analysis : Validation and Comparison with Non-Parametric Granger Causality

    BACKGROUND: 'Non-parametric directionality' (NPD) is a novel method for estimation of directed functional connectivity (dFC) in neural data. The method has previously been verified in its ability to recover causal interactions in simulated spiking networks in Halliday et al. (2015). METHODS: This work presents a validation of NPD in continuous neural recordings (e.g. local field potentials). Specifically, we use autoregressive models to simulate time delayed correlations between neural signals. We then test for the accurate recovery of networks in the face of several confounds typically encountered in empirical data. We examine the effects of NPD under varying: a) signal-to-noise ratios, b) asymmetries in signal strength, c) instantaneous mixing, d) common drive, e) data length, and f) parallel/convergent signal routing. We also apply NPD to data from a patient who underwent simultaneous magnetoencephalography and deep brain recording. RESULTS: We demonstrate that NPD can accurately recover directed functional connectivity from simulations with known patterns of connectivity. The performance of the NPD measure is compared with non-parametric estimators of Granger causality (NPG), a well-established methodology for model-free estimation of dFC. A series of simulations investigating synthetically imposed confounds demonstrate that NPD provides estimates of connectivity that are equivalent to NPG, albeit with an increased sensitivity to data length. However, we provide evidence that: i) NPD is less sensitive than NPG to degradation by noise; ii) NPD is more robust to the generation of false positive identification of connectivity resulting from SNR asymmetries; iii) NPD is more robust to corruption via moderate amounts of instantaneous signal mixing. 
CONCLUSIONS: The results in this paper highlight that, to be practically applied to neural data, connectivity metrics should not only be accurate in their recovery of causal networks but also resistant to the confounding effects often encountered in experimental recordings of multimodal data. Taken together, these findings position NPD at the state of the art for the estimation of directed functional connectivity in neuroimaging.
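A toy version of the validation setup described above can be built in a few lines: two AR(1) signals with a time-delayed drive from x to y, and a lagged cross-correlation used as a crude stand-in for a directed-connectivity estimate. NPD and NPG themselves are not reimplemented here; the model orders, delay, and coupling strength are illustrative assumptions.

```python
# Toy directed-coupling simulation: x drives y with a 5-sample delay.
# Lagged cross-correlation is a crude stand-in for NPD/NPG, used only
# to show that the simulated directionality is recoverable.
import numpy as np

rng = np.random.default_rng(0)
n, delay, coupling = 5000, 5, 0.8

x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    drive = coupling * x[t - delay] if t >= delay else 0.0
    y[t] = 0.5 * y[t - 1] + drive + rng.standard_normal()

def lagged_corr(a, b, lag):
    """Correlation of a[t] with b[t + lag]."""
    return np.corrcoef(a[:-lag], b[lag:])[0, 1]

fwd = lagged_corr(x, y, delay)  # x leading y: should be strong
rev = lagged_corr(y, x, delay)  # y leading x: should be near zero
print(round(fwd, 2), round(rev, 2))
```

Confounds like those tested in the paper (added noise, signal mixing, shorter records) can be injected into this kind of simulation to probe how quickly the forward/reverse asymmetry degrades.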

    Vision 2040: A Roadmap for Integrated, Multiscale Modeling and Simulation of Materials and Systems

    Over the last few decades, advances in high-performance computing, new materials characterization methods, and, more recently, an emphasis on integrated computational materials engineering (ICME) and additive manufacturing have been a catalyst for multiscale modeling and simulation-based design of materials and structures in the aerospace industry. While these advances have driven significant progress in the development of aerospace components and systems, that progress has been limited by persistent technology and infrastructure challenges that must be overcome to realize the full potential of integrated materials and systems design and simulation modeling throughout the supply chain. As a result, NASA's Transformational Tools and Technology (TTT) Project sponsored a study (performed by a diverse team led by Pratt & Whitney) to define the potential 25-year future state required for integrated multiscale modeling of materials and systems (e.g., load-bearing structures) to accelerate the pace and reduce the expense of innovation in future aerospace and aeronautical systems. This report describes the findings of this 2040 Vision study (e.g., the 2040 vision state; the required interdependent core technical work areas, or Key Elements (KEs); identified gaps and actions to close those gaps; and major recommendations). It constitutes a community consensus document, as it is the result of input from over 450 professionals obtained via: 1) four society workshops (AIAA, NAFEMS, and two TMS); 2) a community-wide survey; and 3) the establishment of 9 expert panels (one per KE), consisting on average of 10 non-team members from academia, government, and industry, to review and update content and prioritize gaps and actions.
The study envisions the development of a cyber-physical-social ecosystem comprising experimentally verified and validated computational models, tools, and techniques, along with the associated digital tapestry, that impacts the entire supply chain to enable cost-effective, rapid, and revolutionary design of fit-for-purpose materials, components, and systems. Although the vision focused on aeronautics and space applications, other engineering communities (e.g., automotive and biomedical) can benefit as well from the proposed framework with only minor modifications. Finally, it is TTT's hope that this vision provides the strategic guidance both public and private research and development decision makers need to make the proposed 2040 vision state a reality, and thereby significantly advance the United States' global competitiveness.