
    Optimal aeroelastic trim for rotorcraft with constrained, non-unique trim solutions

    New rotorcraft configurations are emerging, such as the optimal speed helicopter and the slowed-rotor compound helicopter, which, due to variable rotor speed and redundant lifting components, have non-unique trim solution spaces. The combination of controls and rotor speed that produces the best steady-flight condition is sought among all possible solutions. This work develops the concept of optimal rotorcraft trim and explores its application to advanced rotorcraft configurations with non-unique, constrained trim solutions. The optimal trim work is based on the generalized reduced gradient (GRG) method of nonlinear programming and is integrated into a multi-body, comprehensive aeroelastic rotorcraft code. In addition to the concept of optimal trim, two further developments are presented that extend optimal trim to rotorcraft whose rotors operate over a wide range of rotor speeds. The first is the concept of variable rotor speed trim, with special application to rotors operating in steady autorotation. The technique developed herein treats rotor speed as a trim variable and uses a Newton-Raphson iterative method to drive the average rotor torque to zero simultaneously with the other dependent trim variables. The second additional contribution of this thesis is a novel way to rapidly approximate elastic rotor blade stresses and strains in the aeroelastic trim analysis for structural constraints. For rotors that operate over large angular velocity ranges, rotor resonance and increased flapping conditions are encountered that can drive the maximum cross-sectional stress and strain beyond endurance limits; such conditions must be avoided. The method developed herein captures the maximum cross-sectional stress/strain with an artificial neural network (ANN) surrogate trained as a function of the 1-D beam forces and moments.
The stresses/strains are computed simultaneously with the optimal trim and are used as constraints in the optimal trim solution. Finally, an optimal trim analysis is applied to a high-speed compound gyroplane configuration, which has two distinct rotor speed control methods, with the purpose of maximizing the vehicle cruise efficiency while maintaining rotor blade strain below endurance limit values.
Ph.D. Committee Chair: Dimitri N. Mavris; Committee Co-Chair: Daniel P. Schrage; Committee Member: David A. Peters; Committee Member: Dewey H. Hodges; Committee Member: J.V.R. Prasa
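The variable-rotor-speed trim idea above can be sketched as a scalar Newton-Raphson iteration that treats rotor speed as the trim variable and drives the average torque residual to zero. This is a minimal sketch with a made-up quadratic torque model Q(omega) = c1*omega**2 - c2 (the constants and the model are illustrative assumptions; the thesis solves this coupled with the full set of aeroelastic trim variables):

```python
import numpy as np

def newton_trim(residual, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Scalar Newton-Raphson with a finite-difference derivative,
    used here to drive an average-torque residual to zero."""
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            break
        drdx = (residual(x + h) - r) / h   # forward-difference slope
        x -= r / drdx                      # Newton update
    return x

# Hypothetical average-torque model: Q(omega) = c1*omega**2 - c2.
c1, c2 = 2.0e-3, 1.8
omega_trim = newton_trim(lambda om: c1 * om**2 - c2, x0=40.0)
```

In the thesis setting the residual evaluation is a full aeroelastic analysis and the update is one row of a multivariable Newton step, but the zero-average-torque logic is the same.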

    Designing structured tight frames via an alternating projection method

    Tight frames, also known as general Welch-bound-equality sequences, generalize orthonormal systems. Numerous applications - including communications, coding, and sparse approximation - require finite-dimensional tight frames that possess additional structural properties. This paper proposes an alternating projection method that is versatile enough to solve a huge class of inverse eigenvalue problems (IEPs), which includes the frame design problem. To apply this method, one needs only to solve a matrix nearness problem that arises naturally from the design specifications. Therefore, it is fast and easy to develop versions of the algorithm that target new design problems. Alternating projection will often succeed even when algebraic constructions are unavailable. To demonstrate that alternating projection is an effective tool for frame design, the paper studies several important structural properties in detail. First, it addresses the most basic design problem: constructing tight frames with prescribed vector norms. Then, it discusses equiangular tight frames, which are natural dictionaries for sparse approximation. Finally, it examines tight frames whose individual vectors have low peak-to-average-power ratio (PAR), which is a valuable property for code-division multiple-access (CDMA) applications. Numerical experiments show that the proposed algorithm succeeds in each of these three cases. The appendices investigate the convergence properties of the algorithm.
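For the most basic design problem (unit-norm tight frames), the alternating projection idea can be sketched in a few lines: alternate between the structural constraint set (unit column norms) and the spectral constraint set (tight frames, i.e. all singular values equal to sqrt(N/d)). A minimal NumPy sketch of this scheme, with illustrative dimensions, not the paper's full algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
d, N = 3, 8                       # frame of N vectors in R^d
F = rng.standard_normal((d, N))   # random starting frame

for _ in range(500):
    # Projection 1 (structural set): nearest matrix with unit-norm columns.
    F /= np.linalg.norm(F, axis=0, keepdims=True)
    # Projection 2 (spectral set): nearest tight frame, obtained by
    # rescaling every singular value to sqrt(N/d) via the SVD.
    U, _, Vt = np.linalg.svd(F, full_matrices=False)
    F = np.sqrt(N / d) * U @ Vt

F /= np.linalg.norm(F, axis=0, keepdims=True)   # finish on the structural set
G = F @ F.T                                     # frame operator
```

At convergence the columns have unit norm and the frame operator G is (N/d) times the identity, which is exactly the unit-norm tight frame condition.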

    Optimal compromise between incompatible conditional probability distributions, with application to Objective Bayesian Kriging

    Models are often defined through conditional rather than joint distributions, but it can be difficult to check whether the conditional distributions are compatible, i.e., whether there exists a joint probability distribution which generates them. When they are compatible, a Gibbs sampler can be used to sample from this joint distribution. When they are not, the Gibbs sampling algorithm may still be applied, resulting in a "pseudo-Gibbs sampler". We show its stationary probability distribution to be the optimal compromise between the conditional distributions, in the sense that it minimizes a mean squared misfit between them and its own conditional distributions. This allows us to perform Objective Bayesian analysis of correlation parameters in Kriging models by using univariate conditional Jeffreys-rule posterior distributions instead of the widely used multivariate Jeffreys-rule posterior. This strategy makes the full-Bayesian procedure tractable. Numerical examples show it has near-optimal frequentist performance in terms of prediction interval coverage.
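A minimal illustration of a pseudo-Gibbs sampler: take Gaussian conditionals X|Y ~ N(a*y, 1) and Y|X ~ N(b*x, 1). For a bivariate normal with unit conditional variances the two regression slopes must be equal, so a != b makes the pair incompatible; yet alternating draws still form a stable Markov chain with a stationary "compromise" distribution. The parameter values below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
a, b = 0.8, 0.4   # a != b  =>  the two conditionals are incompatible

# Pseudo-Gibbs sampler: alternate draws from the incompatible conditionals.
n = 200_000
x = 0.0
xs = np.empty(n)
for t in range(n):
    y = b * x + rng.standard_normal()   # draw Y | X ~ N(b x, 1)
    x = a * y + rng.standard_normal()   # draw X | Y ~ N(a y, 1)
    xs[t] = x

# Substituting gives x_{t+1} = a*b*x_t + a*e1 + e2, a stable AR(1) chain
# for |a*b| < 1, so a stationary distribution exists; its x-marginal
# variance is (a^2 + 1) / (1 - (a*b)^2).
var_theory = (a**2 + 1) / (1 - (a * b)**2)
```

The empirical marginal of the chain matches this stationary variance, which is the distribution the paper characterizes as the optimal compromise between the two conditionals.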

    Doctor of Philosophy

    Platelet aggregation, an important part of the development of blood clots, is a complex process involving both mechanical interaction between platelets and blood, and chemical transport on and off the surfaces of those platelets. Radial Basis Function (RBF) interpolation is a meshfree method for the interpolation of multidimensional scattered data, and is therefore well-suited for the development of meshfree numerical methods. This dissertation explores the use of RBF interpolation for the simulation of both the chemistry and mechanics of platelet aggregation. We first develop a parametric RBF representation for closed platelet surfaces represented by scattered nodes in both two and three dimensions. We compare this new RBF model to Fourier models in terms of computational cost and errors in shape representation. We then augment the Immersed Boundary (IB) method, a method for fluid-structure interaction, with our RBF geometric model. We apply the resultant method to a simulation of platelet aggregation, and present comparisons against the traditional IB method. We next consider a two-dimensional problem where platelets are suspended in a stationary fluid, with chemical diffusion in the fluid and chemical reaction-diffusion on platelet surfaces. To tackle the latter, we propose a new method based on RBF-generated finite differences (RBF-FD) for solving partial differential equations (PDEs) on surfaces embedded in 2D domains. To robustly tackle the former, we remove a limitation of the Augmented Forcing method (AFM), a method for solving PDEs on domains containing curved objects, using RBF-based symmetric Hermite interpolation. Next, we extend our RBF-FD method to the numerical solution of PDEs on surfaces embedded in 3D domains, proposing a new method of stabilizing RBF-FD discretizations on surfaces. We perform convergence studies and present applications motivated by biology.
We conclude with a summary of the thesis research and present an overview of future research directions, including spectrally-accurate projection methods, an extension of the Regularized Stokeslet method, RBF-FD for variable-coefficient diffusion, and boundary conditions for RBF-FD.
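As background for the RBF machinery this dissertation builds on, plain RBF interpolation of scattered 2D data amounts to solving a small linear system for kernel weights. A generic Gaussian-kernel sketch (this is ordinary RBF interpolation with an illustrative shape parameter, not the dissertation's parametric surface model or its RBF-FD discretizations):

```python
import numpy as np

def rbf_interpolant(nodes, values, eps=4.0):
    """Build a Gaussian-RBF interpolant of scattered 2D data."""
    def kernel(r):
        return np.exp(-(eps * r) ** 2)
    # Symmetric system A w = values, A_ij = kernel(|x_i - x_j|).
    diff = nodes[:, None, :] - nodes[None, :, :]
    A = kernel(np.linalg.norm(diff, axis=-1))
    w = np.linalg.solve(A, values)          # interpolation weights

    def evaluate(points):
        d = np.linalg.norm(points[:, None, :] - nodes[None, :, :], axis=-1)
        return kernel(d) @ w
    return evaluate

rng = np.random.default_rng(1)
nodes = rng.random((30, 2))                 # scattered nodes in the unit square
values = np.sin(3 * nodes[:, 0]) * np.cos(3 * nodes[:, 1])
f = rbf_interpolant(nodes, values)
```

Because no mesh connects the nodes, the same construction works unchanged for nodes scattered on curves and surfaces, which is what makes it attractive for the moving platelet geometries described above.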

    Taxonomy of datasets in graph learning : a data-driven approach to improve GNN benchmarking

    The core research of this thesis, mostly comprising chapter four, has been accepted to the Learning on Graphs (LoG) 2022 conference for a spotlight presentation as a standalone paper, under the title "Taxonomy of Benchmarks in Graph Representation Learning", and is to be published in the Proceedings of Machine Learning Research (PMLR) series. As a main author of the paper, my specific contributions cover problem formulation, design and implementation of our taxonomy framework and experimental pipeline, collation of our results, and the writing of the article.
Deep learning on graphs has attained unprecedented levels of success in recent years thanks to Graph Neural Networks (GNNs), specialized neural network architectures that have unequivocally surpassed prior graph learning approaches. GNNs extend the success of neural networks to graph-structured data by accounting for their intrinsic geometry. While extensive research has been done on developing GNNs with superior performance according to a collection of graph representation learning benchmarks, current benchmarking procedures are insufficient to provide fair and effective evaluations of GNN models. Perhaps the most prevalent and at the same time least understood problem with respect to graph benchmarking is "domain coverage": despite the growing number of available graph datasets, most of them do not provide additional insights and, on the contrary, reinforce potentially harmful biases in GNN model development. This problem stems from a lack of understanding with respect to which aspects of a given model are probed by graph datasets. For example, to what extent do they test the ability of a model to leverage graph structure vs. node features? Here, we develop a principled approach to taxonomize benchmarking datasets according to a "sensitivity profile" that is based on how much GNN performance changes due to a collection of graph perturbations. Our data-driven analysis provides a deeper understanding of which benchmarking data characteristics are leveraged by GNNs. Consequently, our taxonomy can aid in the selection and development of adequate graph benchmarks and better-informed evaluation of future GNN methods. Finally, our approach and implementation in the GTaxoGym package (https://github.com/G-Taxonomy-Workgroup/GTaxoGym) are extendable to multiple graph prediction task types and future datasets.
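The sensitivity-profile idea can be sketched without any GNN library: score a model on a dataset, re-score it after each perturbation (e.g. dropping edges, shuffling node features), and record the performance change per perturbation. Everything below is a toy stand-in, a synthetic two-community graph and a one-step neighbor-averaging "model", not the GTaxoGym pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-community graph (stand-in for a benchmark dataset).
n = 200
labels = np.repeat([0, 1], n // 2)
prob = np.where(labels[:, None] == labels[None, :], 0.10, 0.01)
A = np.triu(rng.random((n, n)) < prob, k=1)
A = A | A.T
feats = labels + 0.8 * rng.standard_normal(n)    # noisy 1-D node features

def evaluate(A, feats):
    """Stand-in for 'train a GNN, report accuracy': classify each node
    by the mean of its neighbors' features (one propagation step)."""
    deg = A.sum(1)
    agg = np.where(deg > 0, A @ feats / np.maximum(deg, 1), feats)
    return np.mean((agg > 0.5) == labels)

def drop_edges(A, frac=0.5):
    """Randomly delete a fraction of edges, keeping the graph symmetric."""
    keep = np.triu(rng.random(A.shape) > frac, k=1)
    B = A & keep
    return B | B.T

perturbations = {
    "drop_half_edges": lambda: evaluate(drop_edges(A), feats),
    "shuffle_features": lambda: evaluate(A, rng.permutation(feats)),
}
base = evaluate(A, feats)
profile = {name: base - run() for name, run in perturbations.items()}
```

The resulting dictionary of score drops is one dataset's sensitivity profile; clustering such profiles across many datasets is what yields the taxonomy described above.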

    Characterization and Generation of 3D Realistic Geological Particles with Metaball Descriptor based on X-Ray Computed Tomography

    The morphology of geological particles is crucial in determining their granular characteristics and assembly responses. In this paper, Metaball-function-based solutions are proposed for the morphological characterization and generation of three-dimensional realistic particles from X-ray Computed Tomography (XRCT) images. For characterization, we develop a geometric Metaball-Imaging algorithm. This algorithm captures the main contour of parental particles with a series of non-overlapping spheres and refines surface-texture details through gradient search. Four types of particles, comprising hundreds of samples, are used for evaluation. The results show good matches on key morphological indicators (i.e., volume, surface area, sphericity, circularity, Corey shape factor, nominal diameter, and surface-equivalent-sphere diameter), confirming the characterization precision. For generation, we propose the Metaball Variational Autoencoder. Assisted by deep neural networks, this method can generate 3D particles in Metaball form while retaining the essential morphological features of the parental particles. Additionally, this method allows control over the generated shapes through an arithmetic pattern, enabling the generation of particles with specific shapes. Two sets of XRCT images, differing in sample number and geometric features, are chosen as parental data. From each training set, one thousand particles are generated for validation. The generation fidelity is demonstrated through comparisons of morphologies and shape-feature distributions between generated and parental particles. Examples are also provided to demonstrate controllability of the generated shapes. Combined with the Metaball-based simulation frameworks previously proposed by the authors, these methods have the potential to provide valuable insights into the properties and behavior of actual geological particles.
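For intuition, the classic metaball construction represents a blob as a sum of inverse-square fields from sphere-like primitives, with the particle surface taken as a level set of that field. A minimal sketch of the generic construction (the centers, weights, and unit threshold are illustrative; the paper fits such primitives to XRCT data rather than placing them by hand):

```python
import numpy as np

def metaball_field(points, centers, weights):
    """Classic inverse-square metaball field:
    f(p) = sum_i w_i / |p - c_i|^2, with the surface at f = 1."""
    d2 = np.sum((points[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    return np.sum(weights / np.maximum(d2, 1e-12), axis=1)

# Two overlapping metaballs along the x-axis.
centers = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
weights = np.array([0.25, 0.25])

# Field values: between the blobs (inside) vs. far away (outside).
inside = metaball_field(np.array([[0.5, 0.0, 0.0]]), centers, weights)
outside = metaball_field(np.array([[3.0, 0.0, 0.0]]), centers, weights)
```

Because the fields of nearby primitives add, the f = 1 isosurface blends them into a single smooth blob, which is what lets a modest number of spheres capture a particle's contour and texture.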