
    A sharp interface isogeometric strategy for moving boundary problems

    The proposed methodology is first utilized to model stationary and propagating cracks. The crack face is enriched with the Heaviside function, which captures the displacement discontinuity, while the crack tips are enriched with asymptotic displacement functions to reproduce the tip singularity. The enriching degrees of freedom associated with the crack tips are chosen as stress intensity factors (SIFs), so that these quantities can be extracted directly from the solution without a posteriori integral calculation.
    As a second application, the Stefan problem is modeled with a hybrid function/derivative enriched interface. Since the interface geometry is explicitly defined, normals and curvatures can be obtained analytically at any point on the interface, allowing complex boundary conditions that depend on curvature or normal direction to be imposed naturally. The enriched approximation thus captures the interfacial discontinuity in the temperature gradient and enables the imposition of the Gibbs-Thomson condition during solidification simulation.
    Lastly, shape optimization through the configuration of finite-sized heterogeneities is studied. The optimization relies on the recently derived configurational derivative, which describes the sensitivity of an arbitrary objective with respect to arbitrary design modifications of a heterogeneity inserted into a domain. THB-splines, which serve as the underlying approximation, produce a solution that is sufficiently smooth near the boundaries of the heterogeneity for accurate calculation of the configurational derivatives. (Abstract shortened by ProQuest.)
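    As a sketch of the partition-of-unity enrichment the abstract describes (the notation and tip basis below are the standard ones from the enriched finite element literature, assumed here rather than quoted from the thesis), the displacement approximation augments the spline basis functions N_i with Heaviside and asymptotic tip terms:

        u^h(\mathbf{x}) = \sum_{i \in I} N_i(\mathbf{x})\,\mathbf{u}_i
                        + \sum_{j \in J} N_j(\mathbf{x})\,H(\mathbf{x})\,\mathbf{a}_j
                        + \sum_{k \in K} N_k(\mathbf{x}) \sum_{\ell=1}^{4} F_\ell(r,\theta)\,\mathbf{b}_{k\ell},
        \qquad
        F_\ell(r,\theta) \in \sqrt{r}\,\bigl\{\sin\tfrac{\theta}{2},\ \cos\tfrac{\theta}{2},\ \sin\tfrac{\theta}{2}\sin\theta,\ \cos\tfrac{\theta}{2}\sin\theta\bigr\},

    where H(\mathbf{x}) jumps across the crack face and the tip coefficients \mathbf{b}_{k\ell} are tied to the stress intensity factors, which is what allows the SIFs to be read off the solution directly. For the solidification application, a common form of the Gibbs-Thomson condition is T_\Gamma = T_m\,(1 - \gamma\kappa/(\rho L)), which sets the interface temperature from the local curvature \kappa and thus requires exactly the analytic curvature information the explicit interface provides.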

    Non-acyclicity of coset lattices and generation of finite groups


    A Two Dimensional Vertically Integrated Moving Boundary Hydrodynamic Model in Curvilinear Coordinates.

    A two-dimensional, vertically averaged hydrodynamic mathematical model on an adaptive boundary-fitted grid is developed. This model addresses numerous drawbacks of traditional finite difference models and incorporates such features as moving boundaries due to tidal incursion and the ability to resolve irregular coastline geometries. A semi-implicit finite difference scheme is developed in order to overcome time step restrictions imposed by the gravity wave stability criterion. An Eulerian-Lagrangian formulation is used to solve the hydrodynamic equations on unsteady grid systems. The model is applied to study circulation features in Lake Pontchartrain (a medium-sized physical system) and in the Bayou Chitigue channel-pond system (a very small physical system) to demonstrate its ability to model systems of varying sizes and shapes.
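    For context, the gravity wave stability criterion that motivates the semi-implicit scheme can be stated as follows (the notation here is generic, not taken from the dissertation): with the free-surface terms of the vertically integrated equations treated explicitly, the time step is limited to roughly

        \Delta t \le \frac{\Delta x}{\sqrt{g\,H_{\max}}},

    where g is the gravitational acceleration, H_max is the maximum water depth, and \Delta x is the grid spacing. Treating the surface-gradient and divergence terms implicitly removes this restriction, and the Eulerian-Lagrangian treatment of advection likewise relaxes the advective limit, so the time step can be chosen on accuracy grounds alone.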

    The Twin-Probe Method: Improving the Accuracy of Langmuir Probes on Small Spacecraft

    A Langmuir probe (LP) is a versatile and effective in-situ space plasma instrument for measuring ion and electron densities and electron temperatures. However, utilizing LPs on very small spacecraft presents challenges that are not experienced on larger, more traditional spacecraft. In particular, a key issue for LP operation on these very small satellites is the negative spacecraft potential induced during LP sweeps, due to the limited ion current collection to the spacecraft relative to the electron current collected by the LP. This induced spacecraft charging reduces the accuracy of measurements made by the LP. To mitigate these charging effects, laboratory plasma experiments and computer modeling confirmed that the spacecraft potential can be tracked during LP sweeps using a second, identical probe configured for high-impedance potential measurements. By correcting for changes to the spacecraft potential, the LP sweeps can be reconstructed as if they were referenced against a stable potential, providing more accurate measurements of the ambient plasma's properties. This dual-probe measurement is referred to here as the twin-probe method (TPM).
    This dissertation focuses on the efficacy of the twin-probe method and identifies barriers that must be addressed to maximize its impact. Particle-in-cell simulations were performed using the NASA/Air Force Spacecraft Charging Analyzer Program (NASCAP-2K) to understand which physical processes and system parameters are most critical when analyzing spacecraft charging behavior. A separate MATLAB program, the Plasma-Spacecraft Interaction Codes for Low Earth Orbit (PSIC-LEO), was developed using analytic equations to model spacecraft charging effects on LP current-voltage (I-V) curves. Finally, an experimental campaign, performed at NASA Marshall Space Flight Center (MSFC), studied the TPM in a laboratory plasma that approximates a high-density, low-Earth-orbit environment.
    Through these investigations, it was determined that induced spacecraft charging effects result in LP I-V characteristics that overestimate electron temperature and underestimate electron density. Furthermore, regions of the I-V curves exhibit additional non-linear characteristics due to the spacecraft's induced potential, making traditional Langmuir probe theory more difficult to apply. The TPM is shown to correct I-V curves to provide more accurate estimates of plasma properties. The magnitude of the TPM correction depends on the area ratio, defined as the conductive spacecraft surface area divided by the probe surface area. Greater spacecraft charging, and consequently larger I-V curve corrections when using the TPM, are observed as the area ratio decreases. The method's largest impact occurs for area ratios below 300. While the TPM is effective for area ratios greater than 300, overlap between the measurement uncertainty and the magnitude of the correction prevents definitive claims of a maximum area ratio for which twin-probe implementation is necessary. Moreover, since the TPM mitigates the effects of spacecraft charging but does not mitigate the charging itself, a minimum area ratio of 50 is recommended for this method. Below this area ratio, the TPM can still be used, but the spacecraft may charge too negatively to allow the Langmuir probe to reach the plasma potential, reducing the number of useful plasma properties obtained from the incomplete I-V curve.
    Finally, novel capabilities brought about by using a combination of Langmuir probes and other satellite instruments are identified. These capabilities include expanding the measurable range of plasma ion distributions using charged-particle energy analyzers and calibrating for environmental effects (like photoelectron current).
    PhD, Applied Physics, University of Michigan, Horace H. Rackham School of Graduate Studies
    http://deepblue.lib.umich.edu/bitstream/2027.42/162984/1/omarleon_1.pd
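    As an illustration of the correction the twin-probe method enables, here is a minimal Python sketch (the function names, the sign convention for the spacecraft potential, and the retardation-region temperature fit are assumptions for illustration, not the dissertation's code). The swept bias is re-referenced against the spacecraft potential tracked by the high-impedance twin probe, after which standard Langmuir analysis applies:

        import numpy as np

        def twin_probe_correct(v_applied, i_probe, v_sc):
            """Re-reference a Langmuir probe sweep using the spacecraft
            potential tracked by a second, high-impedance 'twin' probe.

            v_applied : bias applied to the swept probe, relative to
                        spacecraft ground, per sweep point (V)
            i_probe   : collected probe current per sweep point (A)
            v_sc      : spacecraft potential relative to the ambient plasma
                        (negative when the spacecraft charges down), per
                        sweep point (V)
            """
            v_applied = np.asarray(v_applied, dtype=float)
            v_sc = np.asarray(v_sc, dtype=float)
            # The probe's true potential relative to the plasma is the applied
            # bias plus the (drifting) spacecraft potential it is referenced to.
            return v_applied + v_sc, np.asarray(i_probe, dtype=float)

        def electron_temperature_eV(v, i_e):
            """Estimate T_e (in eV) from the electron retardation region,
            where ln(I_e) is linear in V with slope 1/T_e."""
            slope, _ = np.polyfit(v, np.log(i_e), 1)
            return 1.0 / slope

    Applying electron_temperature_eV to the corrected rather than the raw sweep is what removes the temperature overestimate described above: the uncorrected sweep is stretched along the voltage axis by the induced negative excursion of the spacecraft potential, which flattens the retardation-region slope.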

    Algorithms for Geometric Optimization and Enrichment in Industrialized Building Construction

    The burgeoning use of industrialized building construction, coupled with advances in digital technologies, is unlocking new opportunities to improve the status quo of construction projects being over budget, delayed, and of undesirable quality. Yet several objective barriers must still be overcome in order to realize the full potential of these innovations. Analysis of the literature and examples from industry reveal the following notable barriers: (1) geometric optimization methods need to be developed for the stricter dimensional requirements in industrialized construction, (2) methods are needed to preserve model semantics during the process of generating an updated as-built model, (3) semantic enrichment methods are required for the end-of-life stage of industrialized buildings, and (4) pragmatic approaches are needed to ensure algorithms achieve the required computational efficiency. The common thread across these examples is the need to develop algorithms that optimize and enrich geometric models. To date, a comprehensive approach paired with pragmatic solutions remains elusive. This research fills this gap by presenting a new approach for algorithm development along with pragmatic implementations for the industrialized building construction sector.
    Computational algorithms are effective for driving the design, analysis, and optimization of geometric models. As such, this thesis develops new computational algorithms for the design, fabrication and assembly, onsite construction, and end-of-life stages of industrialized buildings. A common theme throughout this work is the development and comparison of varied algorithmic approaches (i.e., exact vs. approximate solutions) to determine which is optimal for a given process. This is implemented in the following ways. First, a probabilistic method is used to simulate the accumulation of dimensional tolerances in order to optimize geometric models during design. Second, a series of exact and approximate algorithms is used to optimize the topology of 2D panelized assemblies to minimize material use during fabrication and assembly. Third, a new approach to automatically updating geometric models is developed whereby initial model semantics are preserved during the process of generating an as-built model. Finally, a series of algorithms is developed to semantically enrich geometric models so that industrialized buildings can be disassembled and reused.
    The developments made in this research form a rational and pragmatic approach to addressing the existing challenges faced in industrialized building construction. These developments are shown not only to be effective in improving the status quo in the industry (i.e., reducing cost, shortening project duration, and improving quality), but also to facilitate continuous innovation in construction.
    By way of assessing the potential impact of this work, the proposed algorithms can reduce rework risk during fabrication and assembly (a 65% rework reduction in the case study for the new tolerance simulation algorithm), reduce waste during manufacturing (an 11% waste reduction in the case study for the new panel unfolding and nesting algorithms), improve the accuracy and automation of as-built model generation (a model error reduction from 50.4 mm to 5.7 mm in the case study for the new parametric BIM updating algorithms), reduce the lifecycle cost of adapting industrialized buildings (a 15% reduction in capital costs in the computational building configurator), and reduce the lifecycle impacts of reusing structural systems from industrialized buildings (a 54% to 95% reduction in average lifecycle impacts for the approach illustrated in Appendix B).
    From a computational standpoint, the novelty of the algorithms developed in this research can be described as follows. Complex geometric processes can be codified based solely on the innate properties of geometry; that is, by parameterizing geometry and using methods such as combinatorial optimization, topology can be optimized and semantics can be automatically enriched for building assemblies. Functional discretization (whereby continuous variable domains are converted into discrete variable domains) is shown to be highly effective for complex geometric optimization. Finally, the algorithms encapsulate and balance the benefits of both parametric and non-parametric schemas, achieving both high representational accuracy and semantically rich information (which has not previously been achieved or demonstrated).
    In summary, this thesis makes several key improvements to industrialized building construction. One of the key findings is that rather than pre-emptively determining the best-suited algorithm for a given process or problem, it is often more pragmatic to derive both an exact and an approximate solution and then decide which is optimal for a given process. Generally, most tasks related to optimizing or enriching geometric models are best solved using approximate methods. To this end, this research presents a series of key techniques that can be followed to improve the temporal performance of algorithms. The new approach for developing computational algorithms and the pragmatic demonstrations of geometric optimization and enrichment are expected to move the industry forward and overcome many of the barriers it currently faces.
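    The probabilistic tolerance simulation is not spelled out in the abstract; as a generic Monte Carlo stack-up sketch of that kind of analysis (the assembly, names, and normal +/-3-sigma assumption below are hypothetical), the distribution of an assembled dimension can be estimated by sampling each component dimension and summing:

        import numpy as np

        rng = np.random.default_rng(seed=0)

        def simulate_stackup(nominals, tolerances, n_trials=100_000):
            """Monte Carlo accumulation of dimensional tolerances along an
            assembly: each component dimension is drawn from a normal
            distribution whose +/-3-sigma band matches its stated tolerance."""
            nominals = np.asarray(nominals, dtype=float)
            sigmas = np.asarray(tolerances, dtype=float) / 3.0
            samples = rng.normal(nominals, sigmas, size=(n_trials, len(nominals)))
            return samples.sum(axis=1)

        # Hypothetical wall assembly: four panels of 1200 mm +/- 2 mm each.
        totals = simulate_stackup([1200.0] * 4, [2.0] * 4)
        lo, hi = np.percentile(totals, [0.135, 99.865])  # +/-3-sigma band
        print(f"assembled length: {totals.mean():.1f} mm, "
              f"3-sigma range [{lo:.1f}, {hi:.1f}] mm")

    The statistical stack-up (here about +/-4 mm over four panels) is tighter than the worst-case sum of +/-8 mm, which is precisely the kind of margin a probabilistic method can recover when checking assemblies against strict dimensional requirements.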

    Poly-algorithmic Techniques in Real Quantifier Elimination


    Efficient abstractions for visualization and interaction

    Abstractions, such as functions and methods, are an essential tool for any programmer. Abstractions encapsulate the details of a computation: the programmer only needs to know what the abstraction achieves, not how it achieves it. However, using abstractions can come at a cost: the resulting program may be inefficient. This can lead programmers to avoid some abstractions and instead write the entire functionality from the ground up. In this thesis, we present several results that make this situation less likely when programming interactive visualizations. We present results that make abstractions more efficient in the areas of graphics, layout, and events.
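    As a toy illustration of the kind of cost an abstraction can hide (this example and its names are hypothetical, not drawn from the thesis), consider a layout routine that an interactive visualization re-evaluates every frame even when its inputs have not changed; memoizing it keeps the abstraction's interface intact while removing the redundant work:

        from functools import lru_cache

        def layout_naive(widths, container):
            """Horizontal layout abstraction: callers only know it returns
            x-positions that fit the container, not how they are computed."""
            positions, x = [], 0.0
            for w in widths:
                positions.append(x)
                x += w
            if x <= container:
                return positions
            # Scale down proportionally when the content overflows.
            return [p * container / x for p in positions]

        @lru_cache(maxsize=None)
        def layout_cached(widths, container):
            """Same abstraction and result, but repeated calls with the same
            (hashable) inputs, e.g. once per frame, hit the cache."""
            return tuple(layout_naive(list(widths), container))

        print(layout_cached((100.0, 50.0, 75.0), 300.0))  # (0.0, 100.0, 150.0)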