964 research outputs found

    Robust evolutionary methods for multi-objective and multidisciplinary design optimisation in aeronautics


    Multiscale structural optimisation with concurrent coupling between scales

    A robust three-dimensional multiscale topology optimisation framework with concurrent coupling between scales is presented. Concurrent coupling ensures that only the microscale data required to evaluate the macroscale model during each iteration of optimisation are collected, which results in considerable computational savings. This represents the principal novelty of the framework and permits a previously intractable number of design variables to be used in the parametrisation of the microscale geometry, which in turn gives access to a greater range of mechanical point properties during optimisation. Additionally, the microscale data collected during optimisation are stored in a re-usable database, further reducing the computational expense of subsequent iterations or of entirely new optimisation problems. Application of this methodology enables the derivation of structures with precise functionally graded mechanical properties over two scales, satisfying one or multiple functional objectives. For all applications of the framework presented within this thesis, only a small fraction of the microstructure database is required to derive the optimised multiscale solutions, demonstrating a significant reduction in the computational expense of optimisation in comparison to contemporary sequential frameworks. The derivation and integration of novel additive manufacturing constraints for open-walled microstructures within the concurrently coupled multiscale topology optimisation framework is also presented. Problematic fabrication features are discouraged through the application of an augmented projection filter and two relaxed binary integral constraints, which prohibit the formation of unsupported members, isolated assemblies of overhanging members and slender members during optimisation.
Through the application of these constraints, it is possible to derive self-supporting, hierarchical structures with varying topology, suitable for fabrication through additive manufacturing processes.
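The computational saving from concurrent coupling can be illustrated with a toy sketch (the function names and the SIMP-style stiffness law are invented stand-ins, not the thesis's finite element homogenisation): the microscale model is evaluated only for the distinct cells the macroscale design actually references, and a cache plays the role of the re-usable microstructure database.

```python
from functools import lru_cache

# Hypothetical microscale evaluation: maps a quantised lattice density
# parameter to a homogenised stiffness. In the thesis this would be a full
# microscale FE homogenisation; here it is a stand-in closed-form law.
@lru_cache(maxsize=None)
def homogenised_stiffness(density_key):
    rho = density_key / 100.0          # dequantise the cached key
    return rho ** 3                    # SIMP-like stiffness interpolation

def macroscale_compliance(densities):
    # Only the microstructures actually referenced by the current design are
    # evaluated; repeated queries hit the re-usable database (the cache).
    return sum(1.0 / max(homogenised_stiffness(round(d * 100)), 1e-9)
               for d in densities)

design = [0.2, 0.5, 0.5, 0.9, 0.2]     # five macro elements, three unique cells
c = macroscale_compliance(design)
evaluations = homogenised_stiffness.cache_info().misses  # distinct microscale runs
```

With five macroscale elements but only three distinct microstructures, only three microscale evaluations are performed; repeated cells and subsequent iterations reuse cached entries, mirroring the re-usable database described above.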

    Managing uncertainty in integrated environmental modelling: the UncertWeb framework

    Web-based distributed modelling architectures are gaining increasing recognition as potentially useful tools for building holistic environmental models, combining individual components in complex workflows. However, existing web-based modelling frameworks currently offer no support for managing uncertainty. On the other hand, the rich array of modelling frameworks and simulation tools which support uncertainty propagation in complex and chained models typically lack the benefits of web-based solutions, such as ready publication, discoverability and easy access. In this article we describe the developments within the UncertWeb project which are designed to provide uncertainty support in the context of the proposed ‘Model Web’. We give an overview of uncertainty in modelling, review uncertainty management in existing modelling frameworks and consider the semantic and interoperability issues raised by integrated modelling. We describe the scope and architecture required to support uncertainty management as developed in UncertWeb. This includes tools which support elicitation, aggregation/disaggregation, visualisation and uncertainty/sensitivity analysis. We conclude by highlighting areas that require further research and development in UncertWeb, such as model calibration and inference within complex environmental models.
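A minimal sketch of the core idea, propagating uncertainty through a chain of two model components by passing an ensemble of samples rather than a single value (the models and their distributions are invented for illustration; UncertWeb's actual services exchange standardised uncertainty encodings between web components):

```python
import random
import statistics

# Uncertain upstream component: rainfall in mm, drawn from a distribution
# rather than reported as a single deterministic number.
def rainfall_model(rng):
    return rng.gauss(50.0, 10.0)

# Deterministic downstream component of the chained workflow.
def runoff_model(rainfall):
    return 0.6 * rainfall + 2.0

rng = random.Random(42)
# The ensemble carries the uncertainty through the whole chain.
ensemble = [runoff_model(rainfall_model(rng)) for _ in range(10_000)]
mean, sd = statistics.mean(ensemble), statistics.stdev(ensemble)
# Analytically, a linear model gives mean = 0.6*50 + 2 = 32 and sd = 0.6*10 = 6,
# so the Monte Carlo estimates should land close to those values.
```

For linear chains the propagated moments can be checked analytically, as in the comment; for the nonlinear, chained models targeted by UncertWeb, sampling-based propagation of this kind is one of the few generally applicable options.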

    Interactive optimisation for high-lift design.

    Interactivity always involves two entities, one of which by default is a human user. The specialised subject of human factors is introduced in the context of computational aerodynamics and optimisation, specifically a high-lift aerofoil. The trial-and-error nature of a design process hinges on the designer’s knowledge, skill and intuition. A basic, important assumption of a man-machine system is that, in solving a problem, there are some steps in which the computer has the edge, while in others the human dominates. Computational technologies are now an indispensable part of aerospace technology; algorithms involving significant user interaction, either during the process of generating solutions or as a component of post-optimisation evaluation where human decision making is involved, are increasingly popular, and multi-objective particle swarm optimisation is one such optimiser. Many design optimisation problems in engineering are by nature multi-objective; the interest of a designer lies in simultaneous optimisation against two or more objectives which are usually in conflict. Interactive optimisation allows the designer to understand trade-offs between various objectives, and is generally used as a tool for decision making. The solution to a multi-objective problem, in which improvement in one objective comes at the expense of deterioration in at least one other, is called a Pareto set. There are multiple solutions to a problem and multiple improvement ideas for an already existing design. The final responsibility for identifying an optimal solution or idea rests with the design engineers, and decision making is based on quantitative metrics, displayed as numbers or graphs. However, visualisation, ergonomics and human factors influence and impact this decision-making process.
A visual, graphical depiction of the Pareto front is often used as a design aid for decision making, since chances of errors and fallacies fundamentally exist in engineering design. An effective visualisation tool benefits complex engineering analyses by providing the decision-maker with a good image of the most important information. Two high-lift aerofoil data-sets have been used as test-case examples; the module comprises a multi-element solver, an optimiser based on a swarm intelligence technique, and visual techniques which include parallel coordinates, heat map, scatter plot, self-organising map and radial coordinate visualisation. Factors that affect optima and various evaluation criteria have been studied in light of the human user. This research enquires into interactive optimisation by adapting three interactive approaches: information trade-off, reference point and classification, and investigates selected visualisation techniques which act as chief aids in the context of high-lift design trade studies. Human-in-the-loop engineering and man-machine interaction and interface, along with influencing factors, reliability, validation and verification in the presence of design uncertainty, are considered. The research structure, choice of optimiser and visual aids adopted in this work are influenced by, and streamlined to fit with, the parallel on-going development work on Airbus’ Python-based tool. Results and analysis, together with a literature survey, are presented in this report. The words human, user, engineer, aerodynamicist, designer, analyst and decision-maker (DM) are synonymous, and are used interchangeably in this research. In a virtual engineering setting, a suitable visualisation tool is a crucial prerequisite for an efficient interactive optimisation task.
The underlying premise of this work is that the various optimisation design tools and methods are most useful when combined with a human engineer's insight; questions such as why, what and how might help aid aeronautical technical innovation.
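The Pareto set defined above can be computed with a straightforward dominance check. This brute-force sketch (with illustrative objective values, minimising both objectives) is not the swarm optimiser used in the thesis, only the underlying dominance relation:

```python
def pareto_set(points):
    """Return the non-dominated subset, minimising every objective.

    A point q dominates p if q is no worse than p in every objective and
    strictly better in at least one.
    """
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p))) and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Two-objective example, e.g. (drag, -lift) for high-lift aerofoil candidates.
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (1.5, 4.5)]
front = pareto_set(designs)
# (2.5, 3.5) is dominated by (2.0, 3.0); the other four points trade off
# against each other and form the Pareto front.
```

Visual aids such as scatter plots or parallel coordinates then operate on exactly this non-dominated subset, which is where the human decision-maker enters the loop.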

    Multiscale optimisation of dynamic properties for additively manufactured lattice structures

    A framework for tailoring the dynamic properties of functionally graded lattice structures through the use of multiscale optimisation is presented in this thesis. The multiscale optimisation utilises a two-scale approach to allow complex lattice structures to be simulated in real time at a computational expense similar to that of traditional finite element problems. The micro and macro scales are linked by a surrogate model that predicts the homogenised material properties of the underlying lattice geometry from the lattice design parameters. Optimisation constraints on the resonant frequencies and the modal assurance criterion are implemented that can induce the structure to resonate at specific frequencies whilst simultaneously tracking and ensuring that the correct mode shapes are maintained. This is where the novelty of the work lies, as dynamic properties have not previously been optimised in a multiscale, functionally graded lattice structure. Multiscale methods offer numerous benefits and increased design freedom when generating optimal structures for dynamic environments. These benefits are showcased in a series of optimised cantilever structures. The results show a significant improvement in dynamic behaviour compared to the unoptimised case, as well as to a single-scale topology-optimised structure. The validation of the resonant properties of the lattice structures is performed through a series of mechanical tests on additively manufactured lattices. These tests address both the micro and the macro scale of the multiscale method. The homogeneity and surrogate-model assumptions of the micro scale are investigated through both compression and tensile tests of uniform lattice samples. The resonant frequency predictions of the macroscale optimisation are verified through mechanical shaker testing and computed tomography scans of the lattice structure.
Sources of discrepancy between the predicted and observed behaviour are also investigated and explained.
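As a hedged illustration of the two-scale idea (the surrogate below is an invented closed-form law, not the trained model from the thesis), a surrogate maps a lattice design parameter to homogenised properties, and the macroscale model turns these into a resonant-frequency constraint using the first-mode Euler-Bernoulli cantilever formula:

```python
import math

# Hypothetical surrogate: maps a lattice parameter t (relative strut
# thickness) to homogenised Young's modulus E (Pa) and density rho (kg/m^3).
# In the thesis this mapping is learned from microscale simulations.
def surrogate(t):
    E = 70e9 * t ** 2
    rho = 2700.0 * t
    return E, rho

def first_resonance(t, L=0.2, h=0.01):
    # Euler-Bernoulli first bending mode of a rectangular cantilever:
    # f1 = (1.875^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)), with I/A = h^2/12.
    E, rho = surrogate(t)
    return (1.875 ** 2 / (2 * math.pi * L ** 2)) * math.sqrt(E * h ** 2 / (12 * rho))

# A constraint f1 >= 150 Hz can then be enforced by searching the lattice
# parameter range for the lightest admissible design.
t = min(x / 100 for x in range(10, 101) if first_resonance(x / 100) >= 150.0)
```

Because the surrogate is cheap, the frequency constraint can be checked for the whole parameter range at negligible cost, which is what makes constraint-driven multiscale optimisation tractable.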

    Holistic Approach for Authoring Immersive and Smart Environments for the Integration in Engineering Education

    The fourth industrial revolution and rapid technological progress are challenging established educational structures and traditional teaching practices. In engineering education in particular, lifelong learning demands that one constantly improve one's knowledge and skills to remain competitive in the labour market. A paradigm shift in education and training towards new technologies such as virtual reality and artificial intelligence is needed. However, incorporating these technologies into an educational programme is not as simple as investing in new equipment or software. New educational programmes must be created, or old ones redesigned from the ground up. These are complex and extensive processes encompassing decision-making, design and development, and they entail considerable challenges that require overcoming many obstacles. This thesis presents a methodology that addresses the challenges of using virtual reality and artificial intelligence as key technologies in engineering education. The methodology aims to guide the main stakeholders in improving the learning process and enabling novel, efficient learning experiences. Since every educational programme is unique, the methodology follows a holistic approach to support the creation of tailored courses or training programmes. To this end, the interactions between different aspects are taken into account, grouped into three levels: education, technology and management. The methodology emphasises the influence of the technologies on instructional design and on management processes, and provides methods for decision-making based on a comprehensive pedagogical, technological and economic analysis.
Furthermore, it supports the process of didactic design through a comprehensive categorisation of the advantages and disadvantages of immersive learning environments, showing which of their properties can improve the learning process. Particular emphasis is placed on the systematic design of immersive systems and the efficient creation of immersive applications using methods from the field of artificial intelligence. Four use cases with different educational programmes are presented to validate the methodology. Each educational programme has its own objectives, and in combination they cover the validation of all levels of the methodology. The methodology was iteratively refined and improved with each validation project. The results show that the methodology is reliable and transferable to many scenarios, as well as to most educational levels and domains. By applying the methods presented in this thesis, stakeholders can integrate immersive technologies into their teaching practice effectively and efficiently. Moreover, based on the proposed approaches, they can save effort, time and cost in planning, developing and maintaining the immersive systems. The technology shifts the role of the teacher towards that of a moderator. In addition, teachers gain the opportunity to support learners individually and to concentrate on their higher-order cognitive skills. As the main outcome, learners receive an appropriate, high-quality and up-to-date education that makes them more qualified, successful and satisfied.

    Machine learning algorithms for efficient process optimisation of variable geometries at the example of fabric forming

    For optimal operation, modern production systems require careful tuning of the manufacturing processes involved. Physics-based simulations can effectively support process optimisation, but their computation times are often a substantial obstacle. One way to save computation time is surrogate-based optimisation (SBO). Surrogates are computationally efficient, data-driven substitute models that guide the optimiser through the search space. They generally improve convergence, but prove unwieldy for changing optimisation tasks, such as frequent part adaptations to customer requirements. To solve such variable optimisation tasks efficiently as well, this thesis investigates how recent advances in machine learning (ML), in particular neural networks, can complement existing SBO techniques. Three main aspects are considered: first, their potential as a classical surrogate for SBO; second, their suitability for efficiently assessing the manufacturability of new part designs; and third, their possibilities for efficient process optimisation for variable part geometries. These questions are in principle applicable across technologies and are investigated here using fabric forming as an example. The first part of this thesis (Chapter 3) discusses the suitability of deep neural networks as surrogates for SBO. Various network architectures are examined and several ways of integrating them into an SBO framework are compared. The results demonstrate their suitability for SBO: for a fixed example geometry, all variants successfully minimise the objective function, and do so faster than a reference algorithm (a genetic algorithm). To assess the manufacturability of variable part geometries, Chapter 4 then investigates how geometry information can be incorporated into a process surrogate.
To this end, two ML approaches are compared: a feature-based and a raster-based approach. The feature-based approach scans a part for individual, process-relevant geometric features, whereas the raster-based approach interprets the geometry as a whole. Both approaches can in principle learn the process behaviour, but the raster-based approach proves easier to transfer to new geometry variants. The results also show that it is mainly the diversity, rather than the quantity, of the training data that determines this transferability. Finally, Chapter 5 combines the surrogate techniques for flexible geometries with variable process parameters to achieve efficient process optimisation for variable parts. For this, an ML algorithm interacts with generic geometry examples in a simulation environment and learns which geometry requires which forming parameters. After training, the algorithm is able to produce useful recommendations even for non-generic part geometries. Furthermore, the recommendations converge to the true process optimum at a similar rate to classical SBO, but without requiring part-specific a-priori sampling; once trained, the developed approach is therefore more efficient. Overall, this thesis shows how ML techniques can extend current SBO methods and thereby efficiently support process and product optimisation at early stages of development. The findings lead to follow-up questions for further development of the methods, such as the integration of physical balance equations to make the model predictions more physically consistent.
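The basic surrogate-based optimisation loop can be sketched as follows (a quadratic fit through the best samples stands in for the neural-network surrogates studied in the thesis, and the "simulation" is an invented cheap function, not a forming simulation):

```python
# SBO loop: evaluate a few expensive samples, fit a cheap surrogate,
# minimise the surrogate, evaluate the true model there, and repeat.
def simulation(x):
    # Hypothetical expensive process model (e.g. a forming simulation).
    return (x - 1.3) ** 2 + 0.5

def quadratic_vertex(pts):
    # Fit y = a*x^2 + b*x + c through three points; return the vertex -b/(2a).
    (x1, y1), (x2, y2), (x3, y3) = pts
    d = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / d
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / d
    return -b / (2 * a)

samples = [(x, simulation(x)) for x in (0.0, 2.0, 4.0)]
for _ in range(5):
    # Surrogate step: fit through the three best *distinct* samples so far.
    seen = {}
    for x, y in sorted(samples, key=lambda s: s[1]):
        seen.setdefault(round(x, 9), (x, y))
    best3 = list(seen.values())[:3]
    x_new = quadratic_vertex(best3)
    samples.append((x_new, simulation(x_new)))  # expensive evaluation

x_opt = min(samples, key=lambda s: s[1])[0]
```

The loop converges towards the true minimiser (here 1.3) using only a handful of expensive evaluations; the thesis replaces the quadratic with neural networks so that the surrogate can also absorb geometry information.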

    Optimal shape design with automatically differentiated CAD parametrisations

    A typical engineering workflow for aerodynamic design can be considered a three-stage process: modelling of a new component in a CAD system, detailed aerodynamic analysis on a computational grid using flow simulations (CFD), and manufacturing of the CAD component. Numerical shape optimisation is becoming an essential industrial method for improving the aerodynamic performance of shapes immersed in fluids. High-fidelity optimisation requires fine design spaces with many design variables, which can only be tackled with gradient-based optimisation methods. Adjoint CFD can efficiently calculate the necessary flow sensitivities on computational grids; ideally, the CAD parametrisation should also be kept inside the loop to maintain a consistent CAD model during the optimisation and streamline the design process. However, (i) typical commercial CAD systems do not offer derivative computation and (ii) standard CAD parametrisations may not define a suitable design space for the optimisation. This thesis presents an automatically differentiated (AD) version of the open-source CAD kernel OpenCascade Technology (OCCT), which robustly provides shape derivatives with respect to CAD parameters. The developed block-vector AD mode outperforms commonly used finite difference approaches in both efficiency and accuracy. Coupling OCCT with an adjoint CFD solver provides, for the first time, a fully differentiated design chain. The extension of OCCT to perform shape optimisation is demonstrated using CAD parametrisations based on (a) user-defined parametric CAD models and (b) BRep (NURBS) models. The imposition of geometric constraints, a salient part of industrial design, is shown for both approaches. Novel parametrisation techniques are demonstrated that can handle components with surface-surface intersections or simultaneously incorporate approaches (a) and (b) for the optimisation of a single component.
The CAD-based methodology is successfully applied to the aerodynamic shape optimisation of three industrial test cases. Additionally, the advantages of the differentiated CAD kernel are showcased for the commonly occurring CAD re-parametrisation and mesh-to-CAD fitting problems.
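The mechanism behind a differentiated CAD kernel can be illustrated with forward-mode automatic differentiation via dual numbers. This toy class (not OCCT's actual implementation, and using an invented parametric quantity) differentiates the area of a filleted rectangle with respect to the fillet radius, exactly and without a finite-difference step size:

```python
import math

class Dual:
    """Dual number: carries a value and its derivative through arithmetic."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagates the derivative alongside the value.
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def area(w, h, r):
    # "CAD model": rectangle w*h with four corner fillets of radius r.
    return w * h - (4 - math.pi) * r * r

# d(area)/dr at (w, h, r) = (2, 1, 0.1): seed r with derivative 1.
a = area(Dual(2.0), Dual(1.0), Dual(0.1, 1.0))
# a.val is the area; a.dot is the exact derivative -(4 - pi) * 2r.
```

The same seeding idea, applied throughout the kernel's geometry routines, is what yields shape derivatives of B-spline surfaces with respect to CAD parameters; the block-vector mode mentioned above amounts to carrying many derivative directions at once.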

    Methodologies for the analysis of value from delay-tolerant inter-satellite networking

    In a world that is becoming increasingly connected, both in the sense of people and of devices, it is no surprise that users of the data enabled by satellites are exploring the potential brought about by a more connected Earth-orbit environment. Lower data latency, higher revisit rates and higher volumes of information are the order of the day, and inter-connectivity is one of the ways in which this could be achieved. Within this dissertation, three main topics are investigated and built upon. First, the process of routing data through intermittently connected delay-tolerant networks is examined and a new routing protocol, called Spae, is introduced. The consideration of downstream resource limitations forms the heart of this novel approach, which is shown to provide improvements in data routing that closely match those of a theoretically optimal scheme. Next, the value of inter-satellite networking is derived in such a way that removes the difficult task of costing the enabling inter-satellite link technology. Instead, value is defined as the price one should be willing to pay for the technology while retaining a mission value greater than that of its non-networking counterpart. This is achieved through the use of multi-attribute utility theory, trade-space analysis and system modelling, and demonstrated in two case studies. Finally, the effects of uncertainty in the form of sub-system failure are considered. Inter-satellite networking is shown to increase a system's resilience to failure through the introduction of additional, partially failed states made possible by data relay. The lifetime value of a system is then captured using a semi-analytical approach exploiting Markov chains, validated with a numerical Monte Carlo simulation approach.
It is evident that while inter-satellite networking may offer more value in general, it does not necessarily result in a decrease in the loss of utility over the lifetime.
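The Markov-chain treatment of partially failed states can be sketched as follows (the states, transition probabilities and per-epoch utilities are illustrative, not values from the dissertation): expected utility is accumulated over the state distribution until the absorbing failure state dominates.

```python
# Discrete-time Markov chain over three system states. The partially failed
# state is the one enabled by inter-satellite data relay.
NOMINAL, PARTIAL, FAILED = 0, 1, 2
P = [                       # per-epoch transition probabilities
    [0.95, 0.04, 0.01],     # nominal can degrade or fail outright
    [0.00, 0.90, 0.10],     # partially failed, still delivering some data
    [0.00, 0.00, 1.00],     # total failure is absorbing
]
utility = [1.0, 0.4, 0.0]   # utility delivered per epoch in each state

def expected_lifetime_utility(epochs=1000):
    dist = [1.0, 0.0, 0.0]              # start in the nominal state
    total = 0.0
    for _ in range(epochs):
        total += sum(d * u for d, u in zip(dist, utility))
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return total

v = expected_lifetime_utility()
# Sanity check: expected epochs in NOMINAL = 1/0.05 = 20; each such epoch
# feeds 0.04 probability into PARTIAL, which lasts 1/0.1 = 10 epochs, so
# v -> 20*1.0 + 20*0.04*10*0.4 = 23.2 as epochs grows.
```

Removing the relay-enabled partial row (sending all degradation straight to failure) lowers the accumulated utility, which is the mechanism by which networking adds lifetime value in the dissertation's model; a Monte Carlo simulation over sampled state trajectories would validate the same number.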