UMSL Bulletin 2023-2024
The 2023-2024 Bulletin and Course Catalog for the University of Missouri-St. Louis.
Rigorous Experimentation For Reinforcement Learning
Scientific fields make advancements by leveraging the knowledge created by others to push the boundary of understanding. The primary tool in many fields for generating knowledge is empirical experimentation. Although common, generating accurate knowledge from empirical experiments is often challenging due to inherent randomness in execution and confounding variables that can obscure the correct interpretation of the results. As such, researchers must hold themselves and others to a high degree of rigor when designing experiments. Unfortunately, most reinforcement learning (RL) experiments lack this rigor, making the knowledge generated from experiments dubious. This dissertation proposes methods to address central issues in RL experimentation.
Evaluating the performance of an RL algorithm is the most common type of experiment in the RL literature. Yet most performance evaluations are incapable of answering a specific research question and can produce misleading results. Thus, the first issue we address is how to create a performance evaluation procedure that holds up to scientific standards.
Despite the prevalence of performance evaluation, these types of experiments produce limited knowledge, e.g., they can only show how well an algorithm worked and not why, and they require significant amounts of time and computational resources. As an alternative, this dissertation proposes that scientific testing, the process of conducting carefully controlled experiments designed to further the knowledge and understanding of how an algorithm works, should be the primary form of experimentation.
Lastly, this dissertation provides a case study using policy gradient methods, showing how scientific testing can replace performance evaluation as the primary form of experimentation. In doing so, this dissertation aims to motivate others in the field to adopt more rigorous experimental practices.
Parallel Longest Increasing Subsequence and van Emde Boas Trees
This paper studies parallel algorithms for the longest increasing subsequence (LIS) problem. Let $n$ be the input size and $k$ be the LIS length of the input. Sequentially, LIS is a simple problem that can be solved using dynamic programming (DP) in $O(n \log n)$ work. However, parallelizing LIS is a long-standing challenge. We are unaware of any parallel LIS algorithm that has optimal work and non-trivial parallelism (i.e., $\tilde{O}(k)$ or $o(n)$ span).
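The sequential DP baseline referred to above is easy to state concretely. Here is a minimal sketch of the standard $O(n \log n)$ patience-sorting formulation (the classic sequential method, not the paper's parallel algorithm):

```python
import bisect

def lis_length(a):
    """Length of the longest strictly increasing subsequence of a,
    via the classic patience-sorting DP in O(n log n) work."""
    # tails[i] = smallest possible tail of an increasing subsequence of length i+1
    tails = []
    for x in a:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)   # x extends the longest subsequence found so far
        else:
            tails[i] = x      # x gives a smaller tail for length i+1
    return len(tails)
```

The inner binary search is what makes this inherently sequential: each element's placement depends on all earlier placements, which is precisely the dependence the paper's parallel algorithm has to break.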
This paper proposes a parallel LIS algorithm that costs $O(n \log k)$ work, $\tilde{O}(k)$ span, and $O(n)$ space, and is much simpler than the previous
parallel LIS algorithms. We also generalize the algorithm to a weighted version
of LIS, which maximizes the weighted sum for all objects in an increasing
subsequence. To achieve a better work bound for the weighted LIS algorithm, we
designed parallel algorithms for the van Emde Boas (vEB) tree, which has the
same structure as the sequential vEB tree, and supports work-efficient parallel
batch insertion, deletion, and range queries.
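For intuition, the weighted variant can be solved sequentially in $O(n \log n)$ time using a Fenwick (binary indexed) tree that maintains prefix maxima over value ranks. The sketch below is a sequential illustration assuming nonnegative weights; it does not attempt the paper's batch-parallel vEB tree machinery:

```python
def weighted_lis(vals, weights):
    """Maximum total weight over a strictly increasing subsequence,
    via a Fenwick tree of prefix maxima over compressed value ranks.
    Assumes nonnegative weights. O(n log n) sequential work."""
    # coordinate-compress the values to ranks 1..m
    order = {v: i + 1 for i, v in enumerate(sorted(set(vals)))}
    m = len(order)
    tree = [0] * (m + 1)  # tree[i] holds a running prefix maximum

    def update(i, x):      # raise the max at rank i to at least x
        while i <= m:
            tree[i] = max(tree[i], x)
            i += i & (-i)

    def query(i):          # max DP value over ranks 1..i
        best = 0
        while i > 0:
            best = max(best, tree[i])
            i -= i & (-i)
        return best

    ans = 0
    for v, w in zip(vals, weights):
        r = order[v]
        cur = query(r - 1) + w   # best subsequence ending in a smaller value, plus w
        update(r, cur)
        ans = max(ans, cur)
    return ans
```

With all weights equal to 1 this reduces to the unweighted LIS length, which is a handy sanity check.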
We also implemented our parallel LIS algorithms. Our implementation is lightweight, efficient, and scalable. On input size $10^9$, our LIS algorithm outperforms a highly-optimized sequential algorithm (with $O(n \log k)$ cost) on inputs with $k \le 3 \times 10^5$. Our algorithm is also much faster than the best existing parallel implementation by Shen et al. (2022) on all
input instances.
Comment: to be published in Proceedings of the 35th ACM Symposium on Parallelism in Algorithms and Architectures (SPAA '23).
Process Simulation and Mechanical Analysis of High Temperature Resistance Composite Materials
Department of Mechanical Engineering
High-temperature resistance composite materials are rapidly replacing conventional metal alloys used for thermal protection systems (TPS) of spacecraft and missiles exiting and/or reentering the atmosphere. Recently, South Korea succeeded in launching Nuri, its first domestically developed space rocket. Ongoing South Korean space programs include the commercialization of the Nuri technologies as well as the development of next-generation space launch vehicles and spacecraft for exploring the moon and deep space. The Korean government is also actively developing a long-range reentry missile now that the ballistic-missile range limits have been abolished. These rockets and missiles are subjected to extremely high temperature and pressure when they pass through the atmosphere and are thus typically designed with a TPS to protect internal devices and human pilots. High-temperature-resistant yet lightweight TPS materials are preferred in order to reduce the gross launch payload. Ceramic- or carbon-based composite materials are not only much lighter than metals but also excellent thermal insulators with exceptional dimensional stability at elevated temperatures.
The ceramic and carbon-based composite materials are denoted as ceramic matrix composites (CMCs) and carbon-carbon (CC) composites, respectively. CMCs consist of ceramic fibers embedded in a ceramic matrix; the carbon matrix of CC composites is reinforced with carbon fibers. The most typical manufacturing method for CMCs and CC composites is the chemical vapor infiltration (CVI) process. In the CVI process, a porous fibrous preform is commonly used as the initial skeleton of a composite. The preform is placed in a CVI reactor, and the reactor is then pressurized and heated before a precursor gas is supplied. When this gas chemically reacts at the pore surfaces inside the preform, a pyrolytic carbon or ceramic layer is deposited onto the preform surfaces. The deposition process slowly converts the precursor into the matrix, filling the empty spaces of the preform.
Although the CVI process is seemingly the only viable method to produce large-scale products, porosity in the final product is not completely avoidable because infiltrated fibers may block the path of the precursor gas into internal voids. This porosity is considered a defect and a source of degradation of mechanical properties. In the present PhD study, comprehensive numerical analyses have been performed, ranging from CVI process simulation to micro- and meso-scale mechanical analysis of the composite materials. First, the mechanisms of porosity formation are examined by developing a physico-chemical model. The effects of the porosity on mechanical performance are then investigated using microscale and mesoscale composite models.
In the first part, a fully three-dimensional (3D) physicochemical CVI model is developed to simulate an isothermal CVI process for fabricating bulk carbon-carbon composites using methane as a precursor gas and a multi-layered preform consisting of a non-crimp fabric and felt. The flow inside the CVI reactor was modeled using the Navier-Stokes equations, coupled with the convection-diffusion equation, to simulate the dispersive behavior of the reactive gases inside the porous preform. The interactive molecular diffusion of methane (CH4), ethylene (C2H4), acetylene (C2H2), and benzene (C6H6) was modeled by considering the multi-step hydrocarbon reactions between the species. The hydrocarbon concentration changes resulting from carbon deposition on the preform surface were computed to predict the evolution of the preform density and porosity. The current surface area of the preform was then determined from the current porosity. The numerical results for the average preform density agreed well with the experimental data. In addition, the present model can provide detailed simulations of the temporal and spatial evolution of the preform density that cannot be experimentally observed. The effectiveness and utility of the developed model could benefit the design of CVI reactors and processes and minimize the need for test runs when processing conditions change.
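The coupled diffusion-deposition mechanism described above can be illustrated with a deliberately simplified model: one spatial dimension, a single lumped precursor species, a first-order surface reaction, and explicit finite differences. All parameter values, and the porosity-update rule, are illustrative assumptions rather than quantities taken from the dissertation:

```python
import numpy as np

def simulate_cvi_1d(nx=50, nt=20000, L=0.01, D=1e-6, k=5e-3, dt=1e-3):
    """Toy 1D infiltration sketch: a precursor diffuses into a porous slab
    from the exposed face at x=0, deposits matrix at a rate proportional to
    local concentration and remaining porosity, and the deposition in turn
    reduces the local porosity. Illustrative parameters only."""
    dx = L / (nx - 1)
    c = np.zeros(nx)          # normalized precursor concentration
    phi = np.full(nx, 0.7)    # initial preform porosity
    for _ in range(nt):
        c[0] = 1.0            # fixed concentration at the exposed surface
        # deposition rate falls as pores fill (less surface, less porosity)
        rate = k * phi * c
        lap = np.zeros(nx)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        c = c + dt * (D * lap - rate)          # diffusion minus consumption
        c[-1] = c[-2]                          # zero-flux at the closed end
        phi = np.maximum(phi - dt * rate * 0.1, 0.0)  # deposition fills pores
    return c, phi
```

Even this toy model reproduces the qualitative effect the abstract describes: deposition is strongest near the exposed surface, so porosity near the surface drops fastest and can eventually choke off infiltration of the interior.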
In the second part, the results obtained in the micromechanical analysis are passed into a meso-scale thick 3D woven textile composite (T3DWC) model, a TPS candidate for a reentry missile. Finite element analysis is performed to virtually measure homogenized thermal and mechanical properties. For measurements over a wide range of temperatures, temperature-dependent thermal and mechanical properties of the constituents are considered. A two-step homogenization approach is adopted. The first-step homogenization is carried out at the tow level using an analytical homogenization scheme as well as the micromechanical analysis described in the third part. Fiber tows are homogenized and assigned effective elastic and thermal properties. The solid tows are then implemented into a representative volume element that accounts for the unique in-plane periodic fiber architecture of the thick composite material. Due to this unique in-plane periodicity, conventional periodic boundary conditions for thermal and mechanical loading are reformulated. The anisotropic thermal conductivity of the T3DWC is obtained from the second-step homogenization, based on virtual thermal tests performed at ambient to elevated temperatures.
In the third part, the micromechanical behavior of the CVI-produced porous composite materials is studied. In particular, the microstructural fracture behavior of a ceramic matrix composite (CMC) with nonuniformly distributed fibers is examined. A comprehensive numerical analysis package is developed to study the effect of nonuniform fiber dimensions and locations on the microstructural fracture behavior. The package starts with an optimization algorithm for generating representative volume element (RVE) models that are statistically equivalent to experimental measurements. Experimentally measured statistical data are used as constraints while the optimization algorithm runs. Virtual springs are placed between adjacent fibers to nonuniformly distribute the coated fibers in the RVE model. The virtual springs combined with the optimization algorithm can efficiently generate multiple RVEs that are statistically identical to each other. A smeared crack approach (SCA) is implemented to capture the fracture behavior of the CMC material in a mesh-objective manner. The RVEs are subjected to tension as well as shear loading conditions. The SCA is capable of predicting different fracture patterns, uniquely defined not only by the fiber arrangement but also by the specific loading type. In addition, global stress-strain curves show that the microstructural fracture behavior of the RVEs is highly dependent on the fiber distributions.
Online $k$-Median with Consistent Clusters
We consider the online $k$-median clustering problem in which $n$ points arrive online and must be irrevocably assigned to a cluster on arrival. As there are lower bound instances showing that an online algorithm cannot achieve a competitive ratio that is a function of $n$ and $k$, we consider a beyond-worst-case analysis model in which the algorithm is provided a priori with a predicted budget $B$ that upper bounds the optimal objective value. We give an algorithm that achieves a competitive ratio that is exponential in the number of clusters $k$, and show that the competitive ratio of every algorithm must be linear in $k$. To the best of our knowledge this is the first
investigation in the literature that considers cluster consistency using competitive analysis.
Comment: 28 pages, 7 figures
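For intuition only, the flavor of budget-guided online clustering can be sketched with a naive rule: an arriving point opens a new cluster when it is far from every open center, and otherwise irrevocably joins the nearest one. This is not the paper's algorithm, and the threshold `budget / k` is an illustrative assumption:

```python
def online_cluster(points, k, budget):
    """Naive online k-median-style assignment over 1D points.
    Open a new center when the arriving point is farther than budget/k
    from every open center (and fewer than k centers are open);
    otherwise assign it to the nearest center. Assignments are
    irrevocable, matching the online model. Illustrative only."""
    centers, assignment, cost = [], [], 0.0
    threshold = budget / k   # assumed threshold, not from the paper
    for p in points:
        if centers:
            d, j = min((abs(p - c), j) for j, c in enumerate(centers))
        else:
            d, j = float("inf"), -1
        if d > threshold and len(centers) < k:
            centers.append(p)                 # open a new cluster at p
            assignment.append(len(centers) - 1)
        else:
            assignment.append(j)              # join nearest open cluster
            cost += d
    return centers, assignment, cost
```

The failure modes of such a rule (opening centers too eagerly on early outliers, or paying large assignment costs once all $k$ centers are committed) are exactly why the worst-case problem is hard and why a predicted budget helps.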
Tidal Energy and Coastal Models: Improved Turbine Simulation
Marine renewable energy is a continually growing topic in both commercial and academic research sectors. While not as developed as other renewable technologies, such as those deployed within the wind sector, substantial technological crossover, coupled with the inherent high energy density of water, has helped push marine renewables onto the wider renewable agenda. Thus, an ever-expanding range of projects are in various stages of development.
As with all technological developments, a range of factors can contribute to the rate of development or eventual success. One of the main difficulties, when comparing marine renewable technologies to other energy generation technologies, is that the operational environment is physically more complex: energy must be supplied in diverse physical conditions that fluctuate over a range of time scales. There are constant questions about the interaction with the local ecology, the increased operational fatigue of deployed devices, and the financial risk associated with a young sector.
This work presents continued research related to the computational development of different marine renewable technologies that were under development by several institutional bodies at the time of writing. The scope is deliberately wide, as the nature of novel projects means that the project failure rate is high. Thus, for a combination of reasons related to finances, useful purpose, and intellectual property, the research covers distinct projects.
Modelling, Monitoring, Control and Optimization for Complex Industrial Processes
This reprint includes 22 research papers and an editorial, collected from the Special Issue "Modelling, Monitoring, Control and Optimization for Complex Industrial Processes", highlighting recent research advances and emerging research directions in complex industrial processes. This reprint aims to promote the research field and to benefit readers from both academic communities and industrial sectors.
Designing similarity functions
The concept of similarity is important in many areas of cognitive science, computer science, and statistics. In machine learning, functions that measure similarity between two instances form the core of instance-based classifiers. Past similarity measures have been primarily based on simple Euclidean distance. As machine learning has matured, it has become obvious that a simple numeric instance representation is insufficient for most domains. Similarity functions for symbolic attributes have been developed, and simple methods for combining these functions with numeric similarity functions were devised. This sequence of events has revealed three important issues, which this thesis addresses.
The first issue is concerned with combining multiple measures of similarity. There is no equivalence between units of numeric similarity and units of symbolic similarity. Existing similarity functions for numeric and symbolic attributes have no common foundation, and so various schemes have been devised to avoid biasing the overall similarity towards one type of attribute. The similarity function design framework proposed by this thesis produces probability distributions that describe the likelihood of transforming between two attribute values. Because common units of probability are employed, similarities may be combined using standard methods. It is empirically shown that the resulting similarity functions treat different attribute types coherently.
The second issue relates to the instance representation itself. The current choice of numeric and symbolic attribute types is insufficient for many domains, in which more complicated representations are required. For example, a domain may require varying numbers of features, or features with structural information. The framework proposed by this thesis is sufficiently general to permit virtually any type of instance representation: all that is required is that a set of basic transformations that operate on the instances be defined. To illustrate the framework's applicability to different instance representations, several example similarity functions are developed.
The third, and perhaps most important, issue concerns the ability to incorporate domain knowledge within similarity functions. Domain information plays an important part in choosing an instance representation. However, even given an adequate instance representation, domain information is often lost. For example, numeric features that are modulo (such as the time of day) can be perfectly represented as a numeric attribute, but simple linear similarity functions ignore the modulo nature of the attribute. Similarly, symbolic attributes may have inter-symbol relationships that should be captured in the similarity function. The design framework proposed by this thesis allows domain information to be captured in the similarity function, both in the transformation model and in the probability assigned to basic transformations. Empirical results indicate that such domain information improves classifier performance, particularly when training data is limited.
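The time-of-day example can be made concrete: a plain linear distance mis-ranks values near the wrap-around boundary, whereas a modulo distance captures the cycle. The exponential mapping from distance to a [0, 1] similarity below is an illustrative choice, not the thesis's transformation-probability framework:

```python
import math

def linear_sim(a, b, scale=12.0):
    """Similarity from plain linear distance (ignores the cycle)."""
    return math.exp(-abs(a - b) / scale)

def cyclic_sim(a, b, period=24.0, scale=12.0):
    """Similarity from wrap-around (modulo) distance on a cycle,
    e.g. hours of the day with period=24."""
    d = abs(a - b) % period
    d = min(d, period - d)   # go the short way around the cycle
    return math.exp(-d / scale)
```

For 23:00 versus 01:00 the linear distance is 22 hours while the cyclic distance is 2, so only the modulo version reports these times as highly similar, which is the kind of domain knowledge the thesis argues a similarity function should encode.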