
    Selected Problems of Determining Critical Loads in Structures with Stable Post-Critical Behaviour.

    Get PDF
    This paper presents selected cases in which theory-based methods of determining critical loads are inapplicable to thin-walled composite tubes. Eight-layer composite tubes with a square cross-section were subjected to static compression, and two measuring systems were employed to register the experimental data: strain gauges and the ARAMIS® Digital Image Correlation system. Once the measurement data had been collected, five theory-based methods were applied to determine the critical loads. Cases in which certain methods could not be applied, or in which the correctness of the results was in doubt, are presented and analyzed. Moreover, where possible, the theory was equivalently transformed so as to fit the experimental data and allow the critical loads to be calculated.
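
    The five theory-based methods are not named in this abstract. As one hedged illustration of how a critical load can be extracted from measured load-deflection data, the sketch below applies the classical Southwell plot, in which the ratio δ/P regressed against δ yields a straight line of slope 1/P_cr. The data values, units, and variable names are hypothetical, not the paper's measurements.

```python
import numpy as np

# Hypothetical load-deflection data (P in kN, delta in mm); not from the paper.
P = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
delta = np.array([0.05, 0.12, 0.22, 0.38, 0.65, 1.20])

# Southwell plot: delta/P = delta/P_cr + delta0/P_cr, so a straight-line fit
# of delta/P against delta has slope 1/P_cr and intercept delta0/P_cr.
slope, intercept = np.polyfit(delta, delta / P, 1)

P_cr = 1.0 / slope           # estimated critical load (kN)
delta0 = intercept * P_cr    # estimated initial imperfection amplitude (mm)

print(f"Estimated critical load: {P_cr:.1f} kN")
print(f"Estimated initial imperfection: {delta0:.3f} mm")
```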

    Sustainability-Based Product Design in a Decision Support Semantic Framework

    Get PDF
    The design of products for sustainability involves holistic consideration of a complex diversity of objectives and requirements over a product’s life cycle related to the environment, economics, and the stakeholders in society. These objectives may only be considered effectively when they are represented transparently to design participants early in a design process. Life Cycle Assessment (LCA) provides a credible prescription to account for environmental impacts. However, LCA methods are time-consuming to use and are intended to assess the impacts of a completely defined design. Thus, more capable methods are needed to efficiently identify more sustainable design concepts. To this end, this work introduces a fundamental approach to formulating models for normative decision analysis that accurately account for these multiple objectives. Salient features of this novel approach include the direct accounting of the LCA formulations via mathematical relationships and their integration with derived expressions for compatible life cycle cost models, as well as a methodical approach to accounting for significant sources of uncertainty. Here, a semantic ontological framework integrates the information associated with decision criteria with that of the standards and regulations applicable to a design situation. Because this framework shares the context and meaning of this information, along with the design rationale, transparently among design participants across domains of knowledge, the approach can steer a design toward sustainability while keeping the design compliant with regulations and standards. The hypothetical equivalents and inequivalents method is presented and deployed to consistently model a designer’s preferences among the criteria. Material selection is a highly significant factor in the optimal concept selection of a product’s components. A new method is detailed to estimate the impacts of material alternatives across an entire design space. Here, a new surrogate model construction technique, which is much more efficient than the construction of complete LCA models, can prune the design space with adequate robustness for near-optimal concept selection. This new technique introduces a feasible approximation of a Latin Hypercube design at the first of two sampling stages to overcome the issues with sampling from discrete data sets of material property variables.
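
    The surrogate construction above is specific to this work; the sketch below is only a generic illustration, under stated assumptions, of the two ideas it names: a Latin Hypercube sample over material-property ranges whose points are snapped to the nearest entries of a discrete material database (a "feasible approximation"), and a cheap surrogate fitted to a placeholder impact score standing in for a full LCA evaluation. All material values, property choices, and the impact function are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discrete material database: (density kg/m^3, embodied CO2 kg/kg).
materials = np.array([
    [2700.0,  8.2],   # e.g. aluminium alloy
    [7850.0,  1.9],   # e.g. steel
    [1600.0, 21.0],   # e.g. CFRP
    [1150.0,  6.5],   # e.g. nylon
    [ 950.0,  1.8],   # e.g. polyethylene
])

def latin_hypercube(n, d, rng):
    """n points in [0,1]^d with one point per equal-probability stratum per dimension."""
    samples = (np.arange(n)[:, None] + rng.random((n, d))) / n
    for j in range(d):
        samples[:, j] = rng.permutation(samples[:, j])
    return samples

lo, hi = materials.min(axis=0), materials.max(axis=0)
pts = lo + latin_hypercube(8, 2, rng) * (hi - lo)

# "Feasible approximation": snap each continuous LHS point to the nearest
# material actually available in the discrete database.
dist = np.linalg.norm((pts[:, None, :] - materials[None, :, :]) / (hi - lo), axis=2)
chosen = materials[dist.argmin(axis=1)]

def impact(props):
    # Placeholder impact score, standing in for a full LCA evaluation.
    rho, co2 = props[:, 0], props[:, 1]
    return 0.002 * rho + 3.0 * co2

# Fit a cheap quadratic surrogate to the sampled materials.
X = np.column_stack([np.ones(len(chosen)), chosen, chosen**2])
coef, *_ = np.linalg.lstsq(X, impact(chosen), rcond=None)
print("surrogate coefficients:", np.round(coef, 4))
```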

    Entanglement Typicality

    Full text link
    We provide a summary of both seminal and recent results on typical entanglement. By typical values of entanglement, we refer here to values of entanglement quantifiers that (given a reasonable measure on the manifold of states) appear with arbitrarily high probability for quantum systems of sufficiently high dimensionality. We work within the Haar measure framework for discrete quantum variables, where we report on results concerning the average von Neumann and linear entropies as well as arguments implying the typicality of such values in the asymptotic limit. We then proceed to discuss the generation of typical quantum states with random circuitry. Different phases of entanglement and the connection between typical entanglement and thermodynamics are discussed. We also cover approaches to measures on the non-compact set of Gaussian states of continuous-variable quantum systems. Comment: Review paper with two quotes and a minimalist figure.
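
    As a small numerical companion (added here, not part of the original abstract), the sketch below samples Haar-random bipartite pure states, computes the entanglement entropy of the smaller subsystem from its Schmidt spectrum, and compares the sample mean with Page's asymptotic value ln d_A - d_A/(2 d_B) for d_A <= d_B. Dimensions and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
dA, dB, n_samples = 4, 16, 2000

def haar_state(d, rng):
    # A normalised vector of iid complex Gaussians is Haar-distributed on the unit sphere.
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

entropies = []
for _ in range(n_samples):
    psi = haar_state(dA * dB, rng).reshape(dA, dB)
    # Squared singular values are the Schmidt coefficients, i.e. the reduced-state spectrum.
    p = np.linalg.svd(psi, compute_uv=False) ** 2
    p = p[p > 1e-15]
    entropies.append(-(p * np.log(p)).sum())

page = np.log(dA) - dA / (2 * dB)   # Page's asymptotic average for dA <= dB
print(f"sampled mean entropy: {np.mean(entropies):.4f}")
print(f"Page value:           {page:.4f}")
```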

    Complexity Characterization in a Probabilistic Approach to Dynamical Systems Through Information Geometry and Inductive Inference

    Full text link
    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this article, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by use of statistical inductive inference and information geometry. We review the Maximum Relative Entropy (MrE) formalism and the theoretical structure of the information geometrodynamical approach to chaos (IGAC) on statistical manifolds. Special focus is devoted to the description of the roles played by the sectional curvature, the Jacobi field intensity, and the information geometrodynamical entropy (IGE). These quantities serve as powerful information geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on the statistical manifold. Finally, the application of such information geometric techniques to several theoretical models is presented. Comment: 29 pages.
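
    The statistical manifolds above are not specified in this abstract. As a hedged illustration of the basic object involved, the sketch below estimates the Fisher-Rao metric of the univariate Gaussian family N(mu, sigma) by Monte Carlo averaging of score outer products and compares it with the closed form diag(1/sigma^2, 2/sigma^2); with this metric the Gaussian family is a manifold of constant negative curvature, the kind of geometric quantity (sectional curvature, Jacobi fields) that IGAC uses as a complexity measure. Parameter values and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 1.0, 2.0, 1_000_000

# Samples from the model N(mu, sigma); its parameter space is treated as a
# statistical manifold with metric g_ij = E[d_i log p(x|theta) * d_j log p(x|theta)].
x = rng.normal(mu, sigma, size=n)

# Score (gradient of the log-likelihood) for the univariate Gaussian family.
s_mu = (x - mu) / sigma**2
s_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
scores = np.column_stack([s_mu, s_sigma])

G = scores.T @ scores / n            # Monte Carlo estimate of the Fisher-Rao metric
print("estimated Fisher-Rao metric:\n", np.round(G, 4))
print("closed form diag(1/sigma^2, 2/sigma^2):", 1 / sigma**2, 2 / sigma**2)
```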

    State-of-the-art in aerodynamic shape optimisation methods

    Get PDF
    Aerodynamic optimisation has become an indispensable component of any aerodynamic design over the past 60 years, with applications to aircraft, cars, trains, bridges, wind turbines, internal pipe flows, and cavities, among others, and is thus relevant in many facets of technology. With advancements in computational power, automated design optimisation procedures have become more capable; however, there is ambiguity and bias throughout the literature with regard to the relative performance of optimisation architectures and the algorithms employed. This paper provides a well-balanced critical review of the dominant optimisation approaches that have been integrated with aerodynamic theory for the purpose of shape optimisation. A total of 229 papers, published in more than 120 journals and conference proceedings, have been classified into six different optimisation algorithm approaches. The material cited includes some of the most well-established authors and publications in the field of aerodynamic optimisation. This paper aims to eliminate bias toward certain algorithms by analysing the limitations, drawbacks, and benefits of the most widely used optimisation approaches. This review provides comprehensive but straightforward insight for non-specialists and a reference detailing the current state of the field for specialist practitioners.

    Automation of Process Planning for Automated Fiber Placement

    Get PDF
    Process planning represents an essential stage of the Automated Fiber Placement (AFP) workflow. It develops useful and efficient machine processes based upon the working material, composite design, and manufacturing resources. The current state of process planning requires a high degree of interaction from the process planner and could greatly benefit from increased automation. Therefore, a list of key steps and functions is created to identify the more difficult and time-consuming phases of process planning. Additionally, a set of metrics must exist by which to evaluate the effectiveness of the manufactured laminate produced from the machine code created during the process planning stage. This work begins with a ranking process performed through a survey of the Advanced Composites Consortium (ACC) Collaborative Research Team (CRT). Members with practical process planning experience in the composites industry were interviewed. The process planning survey collected general input on the overall importance and time requirements for each function and on which functions would benefit most from semi-automation or full automation. Layup strategies, together with dog ears, stagger shifts, steering constraints, and starting points, represented the group of functions labeled as process optimization and ranked the highest in terms of priority for automation. The laminates resulting from the selected parameters are evaluated through the occurrences of principal defect metrics such as fiber gaps, overlaps, angle deviations, and steering violations. This document presents an automated software solution to the layup strategy and starting point selection phase of process planning. A series of ply scenarios is generated with variations of these ply parameters and evaluated according to a set of metrics entered by the process planner. These metrics are generated through use of the Analytical Hierarchy Process (AHP), in which the relative importance between the fiber features is defined. The ply scenarios that reduce the overall fiber-feature scores, based on the defects the process planner wishes to minimize, are then selected.
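
    The survey rankings and AHP weights are specific to the study described above. The sketch below only illustrates, with assumed numbers, how an AHP pairwise-comparison matrix over the four defect metrics named above is turned into priority weights (the normalised principal eigenvector) together with a standard consistency check.

```python
import numpy as np

# Hypothetical AHP pairwise-comparison matrix over four defect metrics:
# [gaps, overlaps, angle deviation, steering violations].  A[i, j] is the
# relative importance of metric i over metric j on Saaty's 1-9 scale; the
# numbers are illustrative, not values from the survey in the document.
A = np.array([
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/3],
    [1/5, 1/3, 1.0, 1/5],
    [1.0, 3.0, 5.0, 1.0],
])

# Priority weights = normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI(n).
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.90                      # Saaty's random index for n = 4
CR = CI / RI

print("weights (gaps, overlaps, angle dev., steering):", np.round(w, 3))
print(f"consistency ratio: {CR:.3f}  (conventionally acceptable if < 0.1)")
```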

    Changes from Classical Statistics to Modern Statistics and Data Science

    Full text link
    A coordinate system is a foundation of every quantitative science, of engineering, and of medicine. Classical physics and statistics are based on the Cartesian coordinate system, and classical probability and hypothesis-testing theory can only be applied to Euclidean data. However, modern real-world data come from natural language processing, mathematical formulas, social networks, transportation and sensor networks, computer vision, automation, and biomedical measurements, and the Euclidean assumption is not appropriate for such non-Euclidean data. This perspective addresses the urgent need to overcome these fundamental limitations and encourages extensions of classical probability theory and hypothesis testing, diffusion models, and stochastic differential equations from Euclidean to non-Euclidean spaces. Artificial intelligence methods such as natural language processing, computer vision, graph neural networks, manifold regression and inference theory, manifold learning, and compositional diffusion models for the automatic compositional generation of concepts and the demystification of machine learning systems have developed rapidly, and differential manifold theory is a mathematical foundation of deep learning and data science as well. We urgently need to shift the paradigm of data analysis from classical Euclidean analysis to the analysis of both Euclidean and non-Euclidean data, and to develop innovative methods for describing, estimating, and inferring the non-Euclidean geometries of modern real datasets. A general framework for the integrated analysis of Euclidean and non-Euclidean data, together with composite AI, decision intelligence, and edge AI, provides powerful ideas and strategies for fundamentally advancing AI. We are expected to marry statistics with AI, develop a unified theory of modern statistics, and drive the next generation of AI and data science. Comment: 37 pages.
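
    As a small, hedged illustration of what extending classical statistics to non-Euclidean data can mean in practice (the example is added here and is not from the perspective itself), the sketch below computes the intrinsic Fréchet mean of directional data on the unit sphere by gradient descent on the sum of squared geodesic distances, and contrasts it with the naively normalised Euclidean mean.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_map(p, q):
    """Tangent vector at p pointing toward q along the great circle (unit sphere)."""
    d = np.arccos(np.clip(p @ q, -1.0, 1.0))
    v = q - (p @ q) * p
    nv = np.linalg.norm(v)
    return np.zeros_like(p) if nv < 1e-12 else d * v / nv

def exp_map(p, v):
    """Point reached by following tangent vector v from p along the sphere."""
    t = np.linalg.norm(v)
    return p if t < 1e-12 else np.cos(t) * p + np.sin(t) * v / t

# Synthetic directional data clustered around the north pole.
pts = rng.normal([0.0, 0.0, 1.0], 0.3, size=(50, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)

# Fréchet mean: minimise the sum of squared geodesic distances by iterating
# mean <- Exp(mean, average of Log(mean, x_i)).
mean = pts[0].copy()
for _ in range(100):
    grad = np.mean([log_map(mean, q) for q in pts], axis=0)
    mean = exp_map(mean, grad)
    if np.linalg.norm(grad) < 1e-10:
        break

print("intrinsic (Fréchet) mean: ", np.round(mean, 4))
print("normalised Euclidean mean:", np.round(
    pts.mean(axis=0) / np.linalg.norm(pts.mean(axis=0)), 4))
```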

    Eccentric connectivity index

    Full text link
    The eccentric connectivity index $\xi^c$ is a novel distance-based molecular structure descriptor that has recently been used for the mathematical modeling of biological activities of diverse nature. It is defined as $\xi^c(G) = \sum_{v \in V(G)} \deg(v) \cdot \epsilon(v)$, where $\deg(v)$ and $\epsilon(v)$ denote the degree and eccentricity of the vertex $v$, respectively. We survey some mathematical properties of this index and further support its use as a topological structure descriptor. We present the extremal trees and unicyclic graphs with maximum and minimum eccentric connectivity index subject to certain graph constraints. A sharp lower bound and an asymptotic upper bound for all graphs are given, and various connections with other important graph invariants are established. In addition, we present explicit formulae for the eccentric connectivity index of several families of composite graphs and design a linear algorithm for calculating the eccentric connectivity index of trees. Some open problems and related indices for further study are also listed. Comment: 25 pages, 5 figures.
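
    A minimal computational sketch (not the linear tree algorithm referred to above): it evaluates $\xi^c(G)$ directly from the definition by running a BFS from every vertex of an unweighted connected graph, with the path P4 as an arbitrary example.

```python
from collections import deque

def eccentric_connectivity_index(adj):
    """xi^c(G) = sum over v of deg(v) * ecc(v), for an unweighted connected
    graph given as an adjacency list {vertex: iterable of neighbours}."""
    def eccentricity(src):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        return max(dist.values())

    return sum(len(adj[v]) * eccentricity(v) for v in adj)

# Example: the path P4 (a tree), vertices 0-1-2-3.
P4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(eccentric_connectivity_index(P4))   # degrees 1,2,2,1; eccentricities 3,2,2,3 -> 14
```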