
    Assessment of Fidelity to Data and Robustness to Uncertainty to Assure Credible Predictions in the Modeling of Wind Turbine Blades

    In the field of wind energy, modeling and simulation techniques provide an efficient and economical alternative to experimentation for studying the behavior of wind turbines. Numerical models, however, are approximations of reality, making it crucial to evaluate the various sources of uncertainty that influence model predictions. Credibility of a numerical model rests on the model's ability to replicate existing experimental data, widely known as fidelity-to-data. This dissertation advocates that fidelity-to-data, while necessary, is insufficient to claim credibility of a numerical model. Herein, the objective is to develop numerical models that not only agree with experimental data, but also remain consistent (robust) as unavoidable uncertainties are considered. The focus in this dissertation is on the development of models that are simplified yet consistent with experiments, which offer the possibility of large-scale simulations for rapid prototyping and prognostics. This dissertation presents a completely integrated Verification and Validation (V&V) procedure that includes solution and code verification, sensitivity analysis, calibration, validation, and uncertainty quantification in the development of a finite element (FE) model of the CX-100 wind turbine blade that is simplified yet consistent with experiments. This integrated V&V procedure implements a comprehensive evaluation of uncertainties, including experimental, numerical, and parametric uncertainties, to evaluate the effect of assumptions encountered in the model development process. Mesh refinement studies are performed to ensure that the mesh size is chosen such that the effect of numerical uncertainty does not exceed experimental uncertainty. A main-effect screening is performed to identify and eliminate the model parameters to which the model output is least sensitive, reducing computational demands by calibrating only the parameters that significantly influence model predictions. Model calibration is performed in a two-step procedure to de-couple boundary-condition effects from the material properties: first against the natural frequencies of the free-free experimental data, and second against the natural frequencies of the fixed-free experimental data. The predictive capability of the calibrated model is successfully validated by comparing model predictions against an independent dataset. Through these V&V activities, this dissertation demonstrates the development of an FE model that is simplified yet consistent with experiments to simulate the low-order vibrations of wind turbine blades. Confidence in model predictions increases when the model has been validated against experimental evidence. However, numerical models that provide excellent fidelity to data after calibration and validation exercises may run the risk of generalizing poorly to other, non-tested settings. Such issues with generalization typically occur if the model is overly complex with many uncertain calibration parameters. As a result, small perturbations in the calibrated input parameter values may result in significant variability in model predictions. Therefore, this dissertation posits that credible model predictions should simultaneously provide fidelity-to-data and robustness-to-uncertainty.
This concept, which relies on the trade-off between fidelity and robustness, is demonstrated in the selection of a model from among a suite of models of varying complexity for the CX-100 wind turbine blade in a configuration with added masses. Robustness to uncertainty is evaluated through info-gap decision theory (IGDT), while fidelity to data is determined with respect to the experimentally obtained natural frequencies of the CX-100 blade. Finally, because fidelity and robustness are conflicting objectives, model calibration can result in multiple plausible solutions with comparable fidelity to data and robustness to uncertainty, raising concerns about non-uniqueness. This dissertation states that, to mitigate such non-uniqueness concerns, the self-consistency of model predictions must also be evaluated. This concept is demonstrated in the development of a one-dimensional simplified beam model to replace the three-dimensional finite element model of the CX-100 wind turbine blade. The findings demonstrate that fidelity-to-data, robustness-to-uncertainty, and self-consistency are all conflicting objectives and thus must be considered simultaneously. When all three objectives are considered during calibration, it is observed that the fidelity-optimal model is both the least robust and the least self-consistent, suggesting that robustness and self-consistency are necessary attributes to consider during model calibration.
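The fidelity-robustness trade-off described above can be illustrated with a small info-gap-style calculation: robustness is the largest uncertainty horizon within which the worst-case mismatch between predicted and measured natural frequencies stays below a tolerance. The surrogate model, parameter names, and frequency values in the sketch below are placeholders for illustration, not quantities from the CX-100 study.

```python
import numpy as np

# Hypothetical sketch of an info-gap robustness evaluation, following the
# fidelity/robustness trade-off described in the abstract. The surrogate
# model, parameters, and measured frequencies are illustrative placeholders.

measured_freqs = np.array([4.4, 7.7, 13.0])        # Hz, placeholder data

def predicted_freqs(params):
    """Placeholder surrogate for the FE model's first three natural frequencies."""
    stiffness, density = params
    return measured_freqs * np.sqrt(stiffness / density)

def fidelity_error(params):
    """Root-mean-square error between predicted and measured frequencies."""
    return float(np.sqrt(np.mean((predicted_freqs(params) - measured_freqs) ** 2)))

def worst_case_error(nominal, alpha, n_samples=2000, seed=0):
    """Worst fidelity error over an interval-bound uncertainty set of size alpha."""
    rng = np.random.default_rng(seed)
    lo, hi = nominal * (1.0 - alpha), nominal * (1.0 + alpha)
    samples = rng.uniform(lo, hi, size=(n_samples, nominal.size))
    return max(fidelity_error(p) for p in samples)

def robustness(nominal, critical_error, alphas=np.linspace(0.0, 0.5, 51)):
    """Largest uncertainty horizon whose worst-case error remains acceptable."""
    feasible = [a for a in alphas if worst_case_error(nominal, a) <= critical_error]
    return max(feasible) if feasible else 0.0

nominal_params = np.array([1.0, 1.0])               # calibrated (nominal) values
print(robustness(nominal_params, critical_error=0.5))
```

In this sketch, tightening the critical error shrinks the robust horizon, which is the kind of trade-off the dissertation exploits when comparing candidate models of different complexity.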

    Methods and concepts for the multi-criteria synthesis of ship structures


    Cognitive Biases and Organizational Correctives: Do Both Disease and Cure Depend on the Politics of the Beholder?

    The study reported here assessed the impact of managers' philosophies of human nature on their reactions to influential academic claims and counter-claims of when human judgment is likely to stray from rational-actor standards and of how organizations can correct these biases. Managers evaluated scenarios that depicted decision-making processes at micro, meso, and macro levels of analysis: alleged cognitive biases of individuals, strategies of structuring and coping with accountability relationships between supervisors and employees, and strategies that corporate entities use to cope with accountability demands from the broader society. Political ideology and cognitive style emerged as consistent predictors of the value spins that managers placed on decisions at all three levels of analysis. Specifically, conservative managers with strong preferences for cognitive closure were most likely (a) to defend simple heuristic-driven errors such as overattribution and overconfidence and to warn of the mirror-image mistakes of failing to hold people accountable and of diluting sound policies with irrelevant side-objectives; (b) to be skeptical of complex strategies of structuring or coping with accountability and to praise those who lay down clear rules and take decisive stands; (c) to prefer simple philosophies of corporate governance (the shareholder over the stakeholder model) and to endorse organizational norms such as hierarchical filtering that reduce cognitive overload on top management by short-circuiting unnecessary argumentation. Intuitive theories of good judgment apparently cut across levels of analysis and are deeply grounded in personal epistemologies and political ideologies.

    From stakeholders analysis to cognitive mapping and Multi Attribute Value Theory: an integrated approach for policy support

    One of the fundamental features of policy processes in contemporary societies is complexity. It follows from the plurality of points of view actors adopt in their interventions, and from the plurality of criteria upon which they base their decisions. In this context, collaborative multicriteria decision processes seem appropriate to address part of the complexity challenge. This study discusses a decision support framework that guides policy makers in their strategic decisions by using a multi-method approach based on the integration of three tools: (i) stakeholders analysis, to identify the multiple interests involved in the process; (ii) cognitive mapping, to define the shared set of objectives for the analysis; and (iii) Multi Attribute Value Theory, to measure the level of achievement of the previously defined objectives by the policy options under investigation. The integrated decision support framework has been tested on a real-world project concerning the location of new parking areas in a UNESCO site in Southern Italy. The purpose of this study was to test the operability of an integrated analytical approach to support policy decisions by investigating the combined and synergistic effect of the three aforementioned tools. The ultimate objective was to propose policy recommendations for a sustainable parking-area development strategy in the region under consideration. The obtained results illustrate the importance of integrated approaches for the development of accountable public decision processes and consensus policy alternatives. The proposed integrated methodological framework will, hopefully, stimulate the application of other collaborative decision processes in public policy making.
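As a rough illustration of the Multi Attribute Value Theory step in such a framework, an additive value model scores each policy option as a weighted sum of normalised single-attribute values. The objectives, weights, and option scores below are invented for illustration and are not taken from the parking-area case study.

```python
# Minimal sketch of a Multi Attribute Value Theory (MAVT) aggregation step.
# Objectives, weights, and scores are hypothetical placeholders.

weights = {"accessibility": 0.4, "heritage_protection": 0.35, "cost": 0.25}

# Single-attribute values already normalised to [0, 1] (1 = best).
options = {
    "parking_area_A": {"accessibility": 0.8, "heritage_protection": 0.4, "cost": 0.6},
    "parking_area_B": {"accessibility": 0.5, "heritage_protection": 0.9, "cost": 0.7},
}

def mavt_score(values, weights):
    """Additive value model: weighted sum of normalised single-attribute values."""
    return sum(weights[attr] * values[attr] for attr in weights)

# Rank the policy options by their aggregate value.
ranking = sorted(options, key=lambda name: mavt_score(options[name], weights), reverse=True)
for name in ranking:
    print(name, round(mavt_score(options[name], weights), 3))
```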

    Bench-Ranking: a prescriptive analysis method for queries over large knowledge graphs

    Leveraging relational Big Data (BD) processing frameworks to process large knowledge graphs has generated great interest in optimizing query performance. Modern BD systems are, however, complicated data systems whose configurations notably affect performance. Benchmarking different frameworks and configurations provides the community with best practices for better performance. However, most of these benchmarking efforts can be classified as descriptive and diagnostic analytics, and there is no standard for comparing these benchmarks based on quantitative ranking techniques. Moreover, designing mature pipelines for processing big graphs entails additional design decisions that emerge with the non-native (relational) graph processing paradigm. Those design decisions cannot be made automatically, e.g., the choice of the relational schema, partitioning technique, and storage formats. This thesis discusses how our work fills this timely research gap. In particular, we first show the impact of the trade-offs among those design decisions on the replicability of BD systems' performance when querying large knowledge graphs. We also show the limitations of descriptive and diagnostic analyses of BD frameworks' performance for querying large graphs. We then investigate how to enable prescriptive analytics via ranking functions and multi-dimensional optimization techniques (called "Bench-Ranking"). This approach abstracts away the complexity of descriptive performance analysis, guiding the practitioner directly to actionable, informed decisions. https://www.ester.ee/record=b553332
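A minimal sketch of the kind of ranking function such prescriptive "Bench-Ranking" analytics can rely on: each configuration is ranked along every performance dimension, and an aggregate rank prescribes a choice. The configuration names and measurements below are invented for illustration, not results from the thesis.

```python
# Hypothetical sketch of rank-based prescriptive analytics over benchmark results.
# Configuration names, runtimes, and replicability scores are invented.

from statistics import mean

results = {
    "single_table/horizontal/parquet": {"runtime": 12.1, "replicability": 0.9},
    "vertical_partitioning/csv":       {"runtime": 18.4, "replicability": 0.7},
    "property_table/orc":              {"runtime": 14.8, "replicability": 0.8},
}

def dimension_ranks(results, dimension, lower_is_better=True):
    """Rank configurations (1 = best) along one performance dimension."""
    ordered = sorted(results, key=lambda c: results[c][dimension],
                     reverse=not lower_is_better)
    return {config: rank for rank, config in enumerate(ordered, start=1)}

runtime_ranks = dimension_ranks(results, "runtime", lower_is_better=True)
replic_ranks = dimension_ranks(results, "replicability", lower_is_better=False)

# Aggregate rank across dimensions; the lowest mean rank is the prescribed choice.
aggregate = {c: mean([runtime_ranks[c], replic_ranks[c]]) for c in results}
best = min(aggregate, key=aggregate.get)
print(best, aggregate[best])
```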

    Run-time management for future MPSoC platforms

    In recent years, we are witnessing the dawning of the Multi-Processor System-on-Chip (MPSoC) era. In essence, this era is triggered by the need to handle more complex applications while reducing the overall cost of embedded (handheld) devices. This cost will mainly be determined by the cost of the hardware platform and the cost of designing applications for that platform. The cost of a hardware platform will partly depend on its production volume. In turn, this means that flexible, (easily) programmable multi-purpose platforms will exhibit a lower cost. A multi-purpose platform not only requires flexibility, but should also combine a high performance with a low power consumption. To this end, MPSoC devices integrate computer architectural properties of various computing domains. Just like large-scale parallel and distributed systems, they contain multiple heterogeneous processing elements interconnected by a scalable, network-like structure. This helps in achieving scalable high performance. As in most mobile or portable embedded systems, there is a need for low-power operation and real-time behavior. The cost of designing applications is equally important. Indeed, the actual value of future MPSoC devices is not contained within the embedded multiprocessor IC, but in their capability to provide the user of the device with a range of services or experiences. So from an application viewpoint, MPSoCs are designed to efficiently process multimedia content in applications like video players, video conferencing, 3D gaming, augmented reality, etc. Such applications typically require a lot of processing power and a significant amount of memory. To keep up with ever-evolving user needs and with new application standards appearing at a fast pace, MPSoC platforms need to be easily programmable. Application scalability, i.e. the ability to use just enough platform resources according to the user requirements and with respect to the device capabilities, is also an important factor. Hence scalability, flexibility, real-time behavior, a high performance, a low power consumption and, finally, programmability are key components in realizing the success of MPSoC platforms. The run-time manager is logically located between the application layer and the platform layer. It has a crucial role in realizing these MPSoC requirements. As it abstracts the platform hardware, it improves platform programmability. By deciding on resource assignment at run-time, based on the performance requirements of the user, the needs of the application and the capabilities of the platform, it contributes to flexibility, scalability and low-power operation. As it has an arbiter function between different applications, it enables real-time behavior. This thesis details the key components of such an MPSoC run-time manager and provides a proof-of-concept implementation. These key components include application quality management algorithms linked to MPSoC resource management mechanisms and policies, adapted to the provided MPSoC platform services. First, we describe the role, the responsibilities and the boundary conditions of an MPSoC run-time manager in a generic way. This includes a definition of the multiprocessor run-time management design space, a description of the run-time manager design trade-offs and a brief discussion on how these trade-offs affect the key MPSoC requirements.
This design space definition and the trade-offs are illustrated based on ongoing research and on existing commercial and academic multiprocessor run-time management solutions. Consequently, we introduce a fast and efficient resource allocation heuristic that considers FPGA fabric properties such as fragmentation. In addition, this thesis introduces a novel task assignment algorithm for handling soft IP cores denoted as hierarchical configuration. Hierarchical configuration managed by the run-time manager enables easier application design and increases the run-time spatial mapping freedom. In turn, this improves the performance of the resource assignment algorithm. Furthermore, we introduce run-time task migration components. We detail a new run-time task migration policy closely coupled to the run-time resource assignment algorithm. In addition to detailing a design-environment-supported mechanism that enables moving tasks between an ISP and fine-grained reconfigurable hardware, we also propose two novel task migration mechanisms tailored to the Network-on-Chip environment. Finally, we propose a novel mechanism for task migration initiation, based on reusing debug registers in modern embedded microprocessors. We propose a reactive on-chip communication management mechanism. We show that by exploiting an injection rate control mechanism it is possible to provide a communication management system capable of providing a soft (reactive) QoS in a NoC. We introduce a novel, platform-independent run-time algorithm to perform quality management, i.e. to select an application quality operating point at run-time based on the user requirements and the available platform resources, as reported by the resource manager. This contribution also proposes a novel way to manage the interaction between the quality manager and the resource manager. In order to have a realistic, reproducible and flexible run-time manager testbench with respect to applications with multiple quality levels and implementation trade-offs, we have created an input data generation tool denoted Pareto Surfaces For Free (PSFF). The PSFF tool is, to the best of our knowledge, the first tool that generates multiple realistic application operating points either based on profiling information of a real-life application or based on a designer-controlled random generator. Finally, we provide a proof-of-concept demonstrator that combines these concepts and shows how these mechanisms and policies can operate for real-life situations. In addition, we show that the proposed solutions can be integrated into existing platform operating systems.
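As a rough sketch of the run-time quality-management idea described above, a quality manager can pick the highest-quality application operating point whose resource needs fit within the budget reported by the resource manager. The operating points and resource figures below are invented for illustration and do not come from the thesis or the PSFF tool.

```python
# Hypothetical sketch of run-time quality management: select the best-quality
# operating point that fits the currently available platform resources.

from dataclasses import dataclass

@dataclass
class OperatingPoint:
    quality: int          # perceived quality level (higher is better)
    processors: int       # processing elements required
    memory_kb: int        # memory footprint required

# Pareto-style operating points for one application (e.g. a video decoder).
operating_points = [
    OperatingPoint(quality=3, processors=4, memory_kb=2048),
    OperatingPoint(quality=2, processors=2, memory_kb=1024),
    OperatingPoint(quality=1, processors=1, memory_kb=512),
]

def select_operating_point(points, free_processors, free_memory_kb):
    """Return the best-quality point that fits the currently free resources."""
    feasible = [p for p in points
                if p.processors <= free_processors and p.memory_kb <= free_memory_kb]
    return max(feasible, key=lambda p: p.quality) if feasible else None

# The resource manager reports two free processing elements and 1.5 MB of memory.
print(select_operating_point(operating_points, free_processors=2, free_memory_kb=1536))
```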

    Conflicting Objectives in Decisions

    This book deals with quantitative approaches to making decisions when conflicting objectives are present. This problem is central to many applications of decision analysis, policy analysis, operational research, etc., in a wide range of fields, for example, business, economics, engineering, psychology, and planning. The book surveys different approaches to the same problem area, and each approach is discussed in considerable detail so that the coverage of the book is both broad and deep. The problem of conflicting objectives is of paramount importance in both planned and market economies, and this book represents a cross-cultural mixture of approaches from many countries to the same class of problem.