
    A Decomposed Data Analysis Approach to Assessing City Sustainable Development Performance: A Network DEA Model with a Slack-Based Measure

    This paper deals with urban sustainable development in China. We propose a network data envelopment analysis (DEA) model with a slack-based measure (SBM) to analyze the eco-efficiency of 284 Chinese cities. The network structure opens the “black box” of conventional DEA models, introduces social well-being factors into the model, and captures the role of local government in providing public services and improving social well-being. We frame urban development as a two-stage process. The first stage is a production system translating inputs and natural resources into GDP and waste, which become inputs to the second stage of distribution and consumption, where social welfare and environmental protection are realized. The results show that the eco-efficiency of Chinese cities decreased significantly from 2005 to 2016, which is mainly attributable to the distribution and consumption stage. Structural differences are described by region, administrative level and cluster. These results are compared with an existing urban sustainability index system developed by McKinsey, and an ANOVA is conducted to reveal differences between cities across regions and clusters. This article sheds new light on the understanding of urban sustainable construction and development in China with regard to the service performance of local government.
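    For reference, the slack-based measure at the core of such models is, in its standard single-stage form (Tone, 2001), the fractional program below. The network variant described in the abstract links two such stages through the intermediate outputs (GDP and waste), so this should be read as the building block rather than the paper's exact formulation:

        \rho^{*} = \min_{\lambda,\, s^{-},\, s^{+}}
        \frac{1 - \frac{1}{m}\sum_{i=1}^{m} s_{i}^{-}/x_{io}}
             {1 + \frac{1}{q}\sum_{r=1}^{q} s_{r}^{+}/y_{ro}}
        \quad \text{s.t.} \quad
        x_{o} = X\lambda + s^{-}, \quad
        y_{o} = Y\lambda - s^{+}, \quad
        \lambda \ge 0,\; s^{-} \ge 0,\; s^{+} \ge 0

    Here x_o and y_o are the inputs and outputs of the city under evaluation, s^- and s^+ are the input and output slacks, and \rho^* = 1 if and only if the city is SBM-efficient.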

    Probabilistic-based approach for evaluating the thermal response of concrete slabs under fire loading

    Performance-based design for fire safety has been introduced in several international design frameworks. Fire models and simulations include various assumptions and simplifications, and current fire resistance evaluation is based on deterministic approaches, leading to uncertainty in the performance of structural members exposed to fire. An alternative is the application of probabilistic methodologies to assess the fire resistance of structural members. The authors present an efficient probabilistic methodology to perform a sensitivity analysis that identifies the critical variables of a thermal model of a structural element exposed to characteristic fire loading, and to determine the reliability of the structural element. The methodology combines the elementary effects method with variance-based methods to rank the influence of the governing variables of the thermal and fire models on the thermal performance of a reinforced concrete slab and to determine their uncertainty contribution to the time-dependent thermal response. Furthermore, the Monte Carlo method is applied to calculate the probability of failure and the reliability index of the structural member exposed to fire loading. The critical governing variables from the fire model are the firefighting measures index, which accounts for the firefighting measures used in the compartment (FFMi); the characteristic fuel load density (qf,k); the compartment opening factor (O); and the ratio of the compartment's floor area to total area (Af/At). The critical governing variables from the thermal model are the coefficient of convection (h), concrete specific heat (cc), concrete density (dc), and concrete conductivity (kc). As one moves away from the exposed surface, h, qf,k, and Af/At become less influential in the thermal response. The uncertainties of FFMi, O, cc, and h are the primary sources of the thermal response's uncertainty. Considering the variability of the input variables, a low reliability index is determined for buildings with no basic firefighting measures; adding intervention measures, sprinkler systems, and detection systems increases the reliability index by 53%, 85%, and 89%, respectively.
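    As a minimal sketch of the Monte Carlo reliability step described above (not the authors' code): sample the governing variables, count exceedances of a critical temperature, and convert the failure probability to a reliability index via the inverse standard normal CDF. The input distributions, the 500 °C critical temperature, and the placeholder thermal model are all illustrative assumptions; a real analysis would call a heat-transfer solver.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        n = 100_000

        # Illustrative input distributions (assumed, not from the paper):
        q_fk = rng.gumbel(420.0, 120.0, n)   # fuel load density [MJ/m^2]
        h    = rng.normal(25.0, 4.0, n)      # convection coefficient [W/m^2K]
        c_c  = rng.normal(900.0, 90.0, n)    # concrete specific heat [J/kgK]

        def rebar_temperature(q_fk, h, c_c):
            # Placeholder linear response surface standing in for the
            # finite-element thermal model of the slab.
            return 300.0 + 0.4 * q_fk + 2.0 * h - 0.1 * c_c

        T = rebar_temperature(q_fk, h, c_c)
        T_crit = 500.0                        # assumed critical temperature [C]

        p_f = np.mean(T > T_crit)             # Monte Carlo failure probability
        beta = -norm.ppf(p_f)                 # reliability index
        print(f"P_f = {p_f:.4f}, beta = {beta:.2f}")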

    Case‑based tuning of a metaheuristic algorithm exploiting sensitivity analysis and design of experiments for reverse engineering applications

    Due to its capacity to explore a large solution space, the Simulated Annealing (SA) algorithm has shown very promising results for the reverse engineering of editable CAD geometries, including parametric 2D sketches, 3D CAD parts and assemblies. However, parameter setting is a key factor for its performance, and it is also awkward work. This paper addresses how an SA-based reverse engineering technique can be enhanced by identifying its optimal default setting parameters for the fitting of CAD geometries to point clouds of digitized parts. The method integrates a sensitivity analysis to characterize the impact of variations in the parameters of a CAD model on the evolution of the deviation between the CAD model itself and the point cloud to be fitted. The principles underpinning the adopted fitting algorithm are briefly recalled. A framework that uses design of experiments (DOE) is introduced to identify and save in a database the best setting parameter values for given CAD models. This database is then exploited when considering the fitting of a new CAD model: using similarity assessment, the best setting parameter values of the most similar CAD model found in the database can be reused. The applied sensitivity analysis is described, together with a comparison of the resulting sensitivity evolution curves with the changes in the CAD model parameters imposed by the SA algorithm. Possible improvements suggested by the analysis are implemented to enhance the efficiency of SA-based fitting. The overall approach is illustrated on the fitting of single mechanical parts, but it can be directly extended to the fitting of parts' assemblies. It is particularly interesting in the context of Industry 4.0 to update and maintain the coherence of digital twins with respect to the evolution of the associated physical products and systems.
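    A minimal, generic SA loop of the kind whose parameters the paper tunes is sketched below. The cad_distance stand-in (deviation of a point cloud from a parametric sphere), the geometric cooling schedule, and the default parameter values are assumptions for illustration, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(1)

        def cad_distance(params, cloud):
            # Stand-in deviation metric: RMS distance from cloud points to a
            # sphere with centre params[:3] and radius params[3]. A real system
            # would regenerate the CAD model and measure point-to-surface error.
            d = np.linalg.norm(cloud - params[:3], axis=1) - params[3]
            return np.sqrt(np.mean(d ** 2))

        def simulated_annealing(cloud, x0, t0=1.0, cooling=0.995,
                                steps=2000, scale=0.05):
            x, fx = x0.astype(float), cad_distance(x0, cloud)
            best, fbest = x, fx
            t = t0
            for _ in range(steps):
                cand = x + rng.normal(0.0, scale, size=x.shape)  # random move
                fc = cad_distance(cand, cloud)
                # Accept improvements always, worse moves with Boltzmann prob.
                if fc < fx or rng.random() < np.exp((fx - fc) / t):
                    x, fx = cand, fc
                    if fx < fbest:
                        best, fbest = x, fx
                t *= cooling                                     # cool down
            return best, fbest

        # Synthetic "digitized part": noisy points on a sphere of radius 2
        pts = rng.normal(size=(500, 3))
        pts = 2.0 * pts / np.linalg.norm(pts, axis=1, keepdims=True)
        pts += rng.normal(0.0, 0.01, (500, 3))
        best, dev = simulated_annealing(pts, x0=np.array([0.1, -0.1, 0.0, 1.0]))
        print(best, dev)

    The setting parameters the paper optimizes via DOE correspond to arguments like t0, cooling, steps and scale here; their best values depend on the CAD model being fitted, hence the case-based reuse through similarity assessment.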

    The Future of Sensitivity Analysis: An essential discipline for systems modeling and policy support

    Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet to be fully realized, both for advancing mechanistic and data-driven modeling of human and natural systems, and in support of decision making. In this perspective paper, a multidisciplinary group of researchers and practitioners revisit the current status of SA and outline research challenges regarding both theoretical frameworks and their application to real-world problems. Six areas that warrant further attention are discussed: (1) structuring and standardizing SA as a discipline, (2) realizing the untapped potential of SA for systems modeling, (3) addressing the computational burden of SA, (4) progressing SA in the context of machine learning, (5) clarifying the relationship and role of SA with respect to uncertainty quantification, and (6) evolving the use of SA in support of decision making. An outlook for the future of SA is provided that underlines how SA must underpin a wide variety of activities to better serve science and society.

    John Jakeman's work was supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program. Joseph Guillaume received funding from an Australian Research Council Discovery Early Career Award (project no. DE190100317). Arnald Puy worked on this paper on a Marie Sklodowska-Curie Global Fellowship, grant number 792178. Takuya Iwanaga is supported through an Australian Government Research Training Program (AGRTP) Scholarship and the ANU Hilda-John Endowment Fund.

    Advances on Mechanics, Design Engineering and Manufacturing III

    Get PDF
    This open access book gathers contributions presented at the International Joint Conference on Mechanics, Design Engineering and Advanced Manufacturing (JCM 2020), held as a web conference on June 2–4, 2020. It reports on cutting-edge topics in product design and manufacturing, such as industrial methods for integrated product and process design; innovative design; and computer-aided design. Further topics covered include virtual simulation and reverse engineering; additive manufacturing; product manufacturing; engineering methods in medicine and education; representation techniques; and nautical, aeronautical and aerospace design and modeling. The book is organized into four main parts, reflecting the focus and primary themes of the conference. The contributions presented here not only provide researchers, engineers and experts in a range of industrial engineering subfields with extensive information to support their daily work; they are also intended to stimulate new research directions, advanced applications of the methods discussed, and future interdisciplinary collaborations.

    Global sensitivity analysis for optimization with variable selection

    The optimization of high-dimensional functions is a key issue in engineering problems, but it often comes at an unacceptable cost since it usually involves a complex and expensive computer code. In practice, engineers usually overcome this limitation by first identifying which parameters drive the function variations the most: non-influential variables are set to a fixed value and the optimization procedure is then carried out with the remaining influential variables only [1]. However, such variable selection is performed through influence measures typically designed for regression problems, and does not account for the specific structure of an optimization problem. Ideally, we would like to identify which variables have an impact on constraint satisfaction and lead to low values of the objective function. In this paper, we propose a new sensitivity analysis that incorporates the specific aspects of optimization problems. In particular, we introduce an influence measure based on the Hilbert-Schmidt Independence Criterion [2] to characterize whether a design variable matters for reaching low values of the objective function and for satisfying the constraints. This measure makes it possible to sort the inputs and reduce the problem dimension. We estimate the sensitivity-for-optimization measure from a design of experiments and propose a random and a greedy strategy for setting the values of the non-influential variables before conducting a local optimization. We apply our methods to several test cases from common optimization benchmarks. Our results show how variable selection for optimization and the greedy strategy can significantly reduce the number of function evaluations while still attaining satisfying minima.

    References
    [1] Zabalza-Mezghani, I., Manceau, E., Feraille, M., Jourdan, A. (2004). Uncertainty management: From geological scenarios to production scheme optimization.
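    For concreteness, a minimal sketch of the biased empirical HSIC estimator that such an influence measure builds on: HSIC = (1/n^2) trace(K H L H) with RBF Gram matrices K, L and centering matrix H. The median-heuristic bandwidth, the thresholding of the objective at a low quantile, and the toy screening problem are illustrative assumptions, not the paper's exact estimator.

        import numpy as np

        def rbf_gram(z, bandwidth=None):
            # Pairwise squared distances and an RBF (Gaussian) Gram matrix.
            sq = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
            if bandwidth is None:                 # median heuristic (assumed)
                bandwidth = np.sqrt(0.5 * np.median(sq[sq > 0]))
            return np.exp(-sq / (2.0 * bandwidth ** 2))

        def hsic(x, y):
            # Biased empirical HSIC: (1/n^2) trace(K H L H), H = I - 11^T/n.
            n = x.shape[0]
            K = rbf_gram(x.reshape(n, -1))
            L = rbf_gram(y.reshape(n, -1))
            H = np.eye(n) - np.ones((n, n)) / n
            return np.trace(K @ H @ L @ H) / n ** 2

        # Toy screening: rank design variables by dependence with membership
        # in a low sub-level set of the objective (assumed 20% quantile).
        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(300, 4))
        f = X[:, 0] ** 2 + 0.1 * X[:, 1]          # x3, x4 are non-influential
        low = (f <= np.quantile(f, 0.2)).astype(float)
        scores = [hsic(X[:, j], low) for j in range(X.shape[1])]
        print(np.round(scores, 4))                 # larger = more influential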
