UMSL Bulletin 2023-2024
The 2023-2024 Bulletin and Course Catalog for the University of Missouri-St. Louis.
DATA AUGMENTATION FOR SYNTHETIC APERTURE RADAR USING ALPHA BLENDING AND DEEP LAYER TRAINING
Human-based object detection in synthetic aperture RADAR (SAR) imagery is complex and technical, laboriously slow yet time-critical: the perfect application for machine learning (ML). Training an ML network for object detection requires very large image datasets with embedded objects that are accurately and precisely labeled. Unfortunately, no such SAR datasets exist. Therefore, this paper proposes a method to synthesize wide field of view (FOV) SAR images by combining two existing datasets: SAMPLE, which is composed of both real and synthetic single-object chips, and MSTAR Clutter, which is composed of real wide-FOV SAR images. Synthetic objects are extracted from SAMPLE using threshold-based segmentation before being alpha-blended onto patches from MSTAR Clutter. To validate the novel synthesis method, individual object chips are created and classified using a simple convolutional neural network (CNN); testing is performed against the measured SAMPLE subset. A novel technique is also developed to investigate training activity in deep layers. The proposed data augmentation technique produces a 17% increase in the accuracy of measured SAR image classification. This improvement shows that any residual artifacts from segmentation and blending do not negatively affect ML, which is promising for future use in wide-area SAR synthesis.
Outstanding Thesis. Major, United States Air Force. Approved for public release; distribution is unlimited.
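The segment-then-blend step described above can be sketched in a few lines. This is an illustrative toy, not the authors' pipeline: pixel values, the threshold, and the alpha value are assumptions, and real SAR chips would be processed as arrays rather than nested lists.

```python
# Minimal sketch of the chip-synthesis idea: threshold-segment a bright
# object from a single-object chip, then alpha-blend it onto a clutter
# patch. Pixel values are floats in [0, 1]; all values are illustrative.

def segment_mask(chip, threshold=0.5):
    """Binary mask: 1 where the chip is brighter than the threshold."""
    return [[1.0 if px > threshold else 0.0 for px in row] for row in chip]

def alpha_blend(chip, clutter, mask, alpha=0.8):
    """Blend masked object pixels onto the clutter patch."""
    out = []
    for r in range(len(clutter)):
        row = []
        for c in range(len(clutter[0])):
            a = alpha * mask[r][c]          # blend only inside the mask
            row.append(a * chip[r][c] + (1.0 - a) * clutter[r][c])
        out.append(row)
    return out

chip    = [[0.9, 0.1], [0.8, 0.2]]   # toy "object" chip
clutter = [[0.3, 0.3], [0.3, 0.3]]   # toy clutter patch
mask = segment_mask(chip)
blended = alpha_blend(chip, clutter, mask)
```

Outside the mask the clutter pixel passes through unchanged, which is why residual blending artifacts are confined to the object boundary.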
Energy Supplies in the Countries from the Visegrad Group
The purpose of this Special Issue was to collect and present research results and experiences on energy supply in the Visegrad Group countries. This research considers both macroeconomic and microeconomic aspects. It was important to determine how the V4 countries deal with energy management, how they have undergone or are undergoing energy transformation, and in what direction they are heading. The articles concerned aspects of the energy balance in the V4 countries compared to the EU, including the production of renewable energy, as well as changes in its individual sectors (transport and food production). The energy efficiency of low-emission vehicles in public transport and goods deliveries is also discussed, as well as the energy efficiency of farms and energy storage facilities and the impact of the energy sector on the quality of the environment.
Sonic heritage: listening to the past
History is so often told through objects, images and photographs, but the potential of sound to reveal place and space is often neglected. Our research project, 'Sonic Palimpsest', explores the potential of sound to evoke impressions and new understandings of the past, embracing the sonic as a tool to understand what was, in a way that complements and adds to our predominantly visual understandings. Our work includes the expansion of the Oral History archives held at Chatham Dockyard to include women's voices and experiences, and the creation of sonic works to engage the public with their heritage. Our research highlights the social and cultural value of oral history and field recordings in the transmission of knowledge to both researchers and the public. Together these recordings document how buildings and spaces within the dockyard were used and experienced by those who worked there. We can begin to understand the social and cultural roles of these buildings within the community, both past and present.
Using simulation to investigate the impact of different approaches to coordination on a healthcare system's resilience to disasters
Many disasters in recent decades have caused shortages of healthcare resources and changes in healthcare activities. Coordination of healthcare facilities is one of the emergency medical response strategies used to ensure the continued provision of medical services during disasters. The importance of coordination in healthcare systems during disasters is well recognised in the literature, but to the best of our knowledge there has been no review of the published research in this area. In this thesis, a focused literature review of models for coordination in the healthcare system is provided. Additionally, measures of coordination effectiveness that denote resilience are discussed. In the field of medical management, there are two types of coordination: integrative care and collaborative care. Both aim to improve the emergency medical response by ensuring the continuity of medical services and improving healthcare capability during disasters. Integrative care mainly concerns resource allocation within a common governance, whereas collaborative care is mainly focused on the sharing of healthcare resources across governances. Thus, integrative care is mainly implemented within a healthcare provider setting, while collaborative care is mainly implemented between settings. However, resilience is usually perceived at the community level rather than at the level of an individual institution when responding to disasters. Improving resilience during disasters requires the combined capability of different healthcare providers, which can be achieved by collaborative care rather than integrative care. In addition, the literature has commonly addressed collaborative care using an optimisation approach rather than a simulation approach. In this regard, this study presents simulation models for the resilience of the healthcare network during disasters.
In collaboration with health authorities and medical staff in Thailand who have experienced a number of disasters, we investigated the real-world activities that take place in emergency medical responses. We developed novel discrete-event simulation models of collaboration in an emergency medical response in a healthcare network during disasters, with the aim of improving the resilience of the healthcare network. Three strategies for collaboration in the healthcare network were defined: non-collaborative care, semi-collaborative care, and a newly proposed collaborative care. The non-collaborative care strategy was in place in response to the tsunami in Phuket in 2004, while the semi-collaborative care strategy is the current strategy, implemented during the boat capsizing in Phuket in 2018. We propose a new collaborative care strategy defined by considering the disadvantages of the current semi-collaborative care strategy. It introduces a new collaboration in the network that enables information sharing and the classification of healthcare providers. The strategies differ with respect to the first treatment provision of patients, the sharing of resources, and patient transportation. The simulation models were validated and verified using the boat-capsizing real-world event. The model validations were in line with the available system outputs, including the number of patients in different categories, resource allocation, patient allocation, and average patient waiting times at healthcare providers. A generic metric of resilience proposed in the literature was adapted for use in the healthcare context. Our analysis yielded managerial insights into emergency planning as follows. In all defined scenarios, the new collaborative care strategy had a considerable impact on improving resilience and enabled a faster return to the pre-disaster state of the healthcare network than the other strategies.
The semi-collaborative care strategy frequently provided the worst resilience in almost all the defined scenarios; however, it provided better resilience than the non-collaborative care strategy when the number of affected patients was relatively small. Even though simulation enabled investigation of the impact of the different collaboration strategies on the resilience of the network, the resulting patient allocation might not be optimal. We therefore developed a mixed integer programming model to address the allocation of patients in collaborative care, in which ambulances transport multiple patients to healthcare providers in one trip. The developed model provides further insights into collaborative care in disaster management.
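The contrast between non-collaborative and collaborative routing can be illustrated with a toy discrete-event model. This is a sketch of the general idea only, not the thesis's models: arrival times, bed capacities, the service time, and the least-relative-load rerouting rule are all assumptions, and a real study would use a full simulation framework.

```python
import heapq

# Toy discrete-event sketch of collaborative patient routing: when a
# provider is at capacity, the patient is rerouted to the least loaded
# provider in the network. All times and capacities are illustrative.

def simulate(arrivals, capacity, service_time, collaborative=True):
    load = {p: 0 for p in capacity}               # busy beds per provider
    events = []                                    # heap of (finish_time, provider)
    waits = 0
    for t, preferred in arrivals:
        # release beds whose treatment finished before this arrival
        while events and events[0][0] <= t:
            _, prov = heapq.heappop(events)
            load[prov] -= 1
        prov = preferred
        if load[prov] >= capacity[prov] and collaborative:
            prov = min(capacity, key=lambda p: load[p] / capacity[p])
        if load[prov] >= capacity[prov]:
            waits += 1                             # no bed anywhere: patient waits
        else:
            load[prov] += 1
            heapq.heappush(events, (t + service_time, prov))
    return waits

arrivals = [(0, "A"), (1, "A"), (2, "A"), (3, "A")]   # all prefer provider A
cap = {"A": 2, "B": 2}
waits_non = simulate(arrivals, cap, service_time=10, collaborative=False)  # 2
waits_col = simulate(arrivals, cap, service_time=10, collaborative=True)   # 0
```

Even in this two-provider toy, sharing capacity across the network eliminates the waits that the non-collaborative strategy incurs, which is the qualitative effect the thesis measures with its resilience metric.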
Designing, Developing, and Implementing a Personalized Gamified Goal-setting Mechanism for a Sleep-tracking Mobile Application
Goal setting is a commonly employed game element and an implicit effect of gamification. Research in goal-setting theory has found that specific and difficult goals increase the effectiveness of mobile health interventions. Previous work has indicated a lack of research on the impact of different types of goals, especially those with time constraints. This work introduces a goal-setting mechanism with two types of goals, continuous and time-bound, to the Sleep Revolution app, a mobile application for sleep healthcare that enables users to set goals for several activities they track in the app's sleep diary. The implementation follows concepts and recommendations from previous work. To assess and compare the effects of the different goal types on user compliance with health recommendations, a four-week randomized controlled trial was conducted. In a second step, the feasibility of extending the mechanism with personalized goal recommendations based on the data collected in the trial is explored. Different machine learning algorithms are compared regarding their applicability to this problem and their potential effectiveness.
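The distinction between the two goal types can be made concrete with a small data structure. The class and field names below are illustrative assumptions, not the Sleep Revolution app's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch of continuous vs. time-bound goals (names and
# fields are assumptions, not the actual app data model).

@dataclass
class Goal:
    metric: str                        # e.g. diary entries logged
    target: int                        # required count
    deadline_day: Optional[int] = None # None -> continuous goal

    def is_met(self, counts_by_day):
        """counts_by_day: list of daily counts since the goal was set."""
        if self.deadline_day is None:                 # continuous: ever reached
            return sum(counts_by_day) >= self.target
        window = counts_by_day[: self.deadline_day]   # time-bound: by deadline
        return sum(window) >= self.target

continuous = Goal("diary_entries", target=5)
timebound  = Goal("diary_entries", target=5, deadline_day=3)
days = [1, 1, 1, 2, 2]   # 7 entries over 5 days, only 3 by day 3
```

Under these toy counts the continuous goal is eventually met while the time-bound goal is not, which is exactly the kind of compliance difference the trial was designed to measure.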
Efficient Adoption of Residential Energy Technologies Through Improved Electric Retail Rate Design
This dissertation combines methods from engineering, operations research, and economics to analyze how emerging residential energy technologies can be effectively used to reduce both energy costs and carbon emissions. Our most important finding is that air-source heat pumps can be used to reduce both energy costs and carbon emissions in four out of the five major climate regions studied, but that electric retail rate reform is needed to provide customers with appropriate incentives.
In cold climates, it may be advantageous to use heat pumps in tandem with fossil fuel-powered furnaces; in warmer regions, furnaces can be cost-effectively abandoned altogether. We do not find that distributed rooftop solar panels or distributed battery storage are effective tools for reducing the cost of energy services. Rather, in our simulations, customers adopt these technologies in response to poor price signaling by electric utilities. By reforming electric retail rates so that the prices paid by consumers better reflect the cost of energy services, utilities can promote the adoption of technologies that reduce both aggregate costs and carbon emissions.
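The heat-pump-versus-furnace comparison reduces to simple arithmetic on efficiency and fuel price. The numbers below are illustrative assumptions, not the dissertation's data; the point is only the structure of the calculation:

```python
# Toy annual-cost comparison of heat-pump vs. gas-furnace heating.
# All prices, efficiencies, and the heat load are assumed values.

def annual_heating_cost(heat_demand_kwh, efficiency, price_per_kwh):
    """Energy purchased = heat delivered / conversion efficiency."""
    return heat_demand_kwh / efficiency * price_per_kwh

heat_demand   = 12_000   # kWh of delivered heat per year (assumed)
cop_heat_pump = 3.0      # heat out per unit electricity in (assumed)
furnace_eff   = 0.92     # gas furnace efficiency (assumed)
price_elec    = 0.15     # $/kWh electricity (assumed)
price_gas     = 0.05     # $/kWh gas-equivalent (assumed)

hp_cost      = annual_heating_cost(heat_demand, cop_heat_pump, price_elec)
furnace_cost = annual_heating_cost(heat_demand, furnace_eff, price_gas)
```

With these assumed prices the heat pump wins despite electricity costing three times as much per kWh, because the COP of 3 triples each purchased kWh. In cold weather the COP drops, which is why hybrid heat-pump/furnace operation can be advantageous in cold climates.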
Endogenous measures for contextualising large-scale social phenomena: a corpus-based method for mediated public discourse
This work presents an interdisciplinary methodology for developing endogenous measures of group membership through analysis of pervasive linguistic patterns in public discourse. Focusing on political discourse, this work critiques the conventional approach to the study of political participation, which is premised on decontextualised, exogenous measures to characterise groups. Considering the theoretical and empirical weaknesses of decontextualised approaches to large-scale social phenomena, this work suggests that contextualisation using endogenous measures might provide a complementary perspective to mitigate such weaknesses.
This work develops a sociomaterial perspective on political participation in mediated discourse as affiliatory action performed through language. While the affiliatory function of language is often performed consciously (such as statements of identity), this work is concerned with unconscious features (such as patterns in lexis and grammar). This work argues that pervasive patterns in such features that emerge through socialisation are resistant to change and manipulation, and thus might serve as endogenous measures of sociopolitical contexts, and thus of groups.
In terms of method, the work takes a corpus-based approach to the analysis of data from the Twitter messaging service, whereby patterns in users' speech are examined statistically in order to trace potential community membership. The method is applied in the US state of Michigan during the second half of 2018, 6 November having been the date of the midterm (i.e. non-Presidential) elections in the United States. The corpus is assembled from the original posts of 5,889 users, who are nominally geolocalised to 417 municipalities. These users are clustered according to pervasive language features. Comparing the linguistic clusters according to the municipalities they represent reveals regular sociodemographic differentials across clusters. This is understood as an indication of social structure, suggesting that endogenous measures derived from pervasive patterns in language may indeed offer a complementary, contextualised perspective on large-scale social phenomena.
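The core move of clustering users by pervasive language features can be sketched in miniature. The feature list, toy texts, and similarity-to-seed grouping below are illustrative assumptions, not the study's actual corpus method:

```python
import math

# Minimal sketch of the corpus-based idea: represent each user by the
# relative frequencies of pervasive function words, then group users by
# cosine similarity of these vectors. Users and texts are invented.

FEATURES = ["the", "a", "of", "to", "y'all", "gonna"]

def vector(text):
    words = text.lower().split()
    return [words.count(f) / max(len(words), 1) for f in FEATURES]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

users = {
    "u1": "the results of the vote are going to matter",
    "u2": "y'all gonna love the results of the vote",
    "u3": "the outcome of the election is going to matter a lot",
}
seeds = ["u1", "u2"]
clusters = {s: [s] for s in seeds}
for name in ("u3",):   # assign each remaining user to its nearest seed
    best = max(seeds, key=lambda s: cosine(vector(users[name]), vector(users[s])))
    clusters[best].append(name)
```

Note that the features are unconscious function-word patterns rather than topical vocabulary, mirroring the study's focus on features resistant to conscious manipulation; the real analysis clusters thousands of users over a much richer feature set.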
Rare-Event Estimation and Calibration for Large-Scale Stochastic Simulation Models
Stochastic simulation has been widely applied in many domains. More recently, however, the rapid surge of sophisticated problems such as safety evaluation of intelligent systems has posed various challenges to conventional statistical methods. Motivated by these challenges, in this thesis, we develop novel methodologies with theoretical guarantees and numerical applications to tackle them from different perspectives.
In particular, our works can be categorized into two areas: (1) rare-event estimation (Chapters 2 to 5) where we develop approaches to estimating the probabilities of rare events via simulation; (2) model calibration (Chapters 6 and 7) where we aim at calibrating the simulation model so that it is close to reality.
In Chapter 2, we study rare-event simulation for a class of problems where the target hitting sets of interest are defined via modern machine learning tools such as neural networks and random forests. We investigate an importance sampling scheme that integrates the dominating point machinery in large deviations and sequential mixed integer programming to locate the underlying dominating points. We provide efficiency guarantees and numerical demonstration of our approach.
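The dominating-point idea in Chapter 2 can be illustrated on the simplest possible example. This is a textbook sketch, not the chapter's method: for P(X >= a) with X standard normal, the dominating point of the set {x >= a} is a itself, and exponential tilting shifts the sampling mean there.

```python
import math, random

# Importance sampling with a single dominating point, for the toy
# problem P(X >= a), X ~ N(0,1). We sample from the tilted N(a,1) and
# reweight by the likelihood ratio phi(x)/phi_a(x) = exp(-a*x + a^2/2).

def is_estimate(a, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)                      # sample from tilted N(a,1)
        if x >= a:                                 # indicator of the rare set
            total += math.exp(-a * x + a * a / 2)  # likelihood ratio
    return total / n

a = 4.0
p_hat = is_estimate(a, n=100_000)
p_true = 0.5 * math.erfc(a / math.sqrt(2))  # exact Gaussian tail, ~3.2e-5
```

Crude Monte Carlo would see roughly three hits per 100,000 samples here, whereas roughly half the tilted samples land in the rare set, which is why the relative error stays controlled as a grows.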
In Chapter 3, we propose a new efficiency criterion for importance sampling, which we call probabilistic efficiency. Conventionally, an estimator is regarded as efficient if its relative error is sufficiently controlled. It is widely known that when a rare-event set contains multiple "important regions" encoded by the dominating points, importance sampling needs to account for all of them via mixing to achieve efficiency. We argue that the traditional analysis recipe could suffer from intrinsic looseness by using relative error as an efficiency criterion. Thus, we propose the new efficiency notion to tighten this gap. In particular, we show that under the standard Gärtner-Ellis large deviations regime, an importance sampling that uses only the most significant dominating points is sufficient to attain this efficiency notion.
In Chapter 4, we consider the estimation of rare-event probabilities using sample proportions output by crude Monte Carlo. Due to the recent surge of sophisticated rare-event problems, efficiency-guaranteed variance reduction may face implementation challenges, which motivate one to look at naive estimators. In this chapter we construct confidence intervals for the target probability using this naive estimator from various techniques, and then analyze their validity as well as tightness respectively quantified by the coverage probability and relative half-width.
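Two of the standard interval constructions that such an analysis compares can be written down directly. This is a generic illustration of the setting, not Chapter 4's specific results: note how the normal (Wald) interval degenerates when crude Monte Carlo sees zero hits, while the Wilson interval still yields a usable upper bound.

```python
import math

# Confidence intervals for a rare-event probability from crude Monte
# Carlo sample proportions. z = 1.96 gives ~95% nominal coverage.

def wald_ci(hits, n, z=1.96):
    p = hits / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(p - half, 0.0), p + half

def wilson_ci(hits, n, z=1.96):
    p = hits / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return max(center - half, 0.0), center + half

# zero observed hits: Wald collapses, Wilson still bounds the tail
wald = wald_ci(0, 10_000)       # (0.0, 0.0)
wilson = wilson_ci(0, 10_000)   # upper bound ~ z^2/n = 3.8e-4
```

The coverage probability and relative half-width that the chapter analyzes are exactly the two ways these intervals can fail: the Wald interval here has zero width (no coverage of any positive probability), while the Wilson interval trades that for a conservative half-width.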
In Chapter 5, we propose the use of extreme value analysis, in particular the peak-over-threshold method which is popularly employed for extremal estimation of real datasets, in the simulation setting. More specifically, we view crude Monte Carlo samples as data to fit on a generalized Pareto distribution. We test this idea on several numerical examples. The results show that in the absence of efficient variance reduction schemes, it appears to offer potential benefits to enhance crude Monte Carlo estimates.
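The peak-over-threshold recipe can be sketched end to end on synthetic data. This sketch uses moment estimators for the generalized Pareto fit for simplicity (maximum likelihood is more common in practice) and draws the exceedances from a known GPD so the fit can be checked; it is an illustration of the technique, not the chapter's implementation.

```python
import random

# Peak-over-threshold sketch: fit a generalized Pareto distribution
# (GPD) to exceedances via moment estimators and recover its parameters.
# Exceedances are drawn from a known GPD (xi=0.1, sigma=1) by inverse CDF.

def gpd_sample(xi, sigma, rng):
    u = rng.random()
    return sigma * ((1 - u) ** (-xi) - 1) / xi   # inverse-CDF sampling

def gpd_moment_fit(exceedances):
    """Moment estimators: xi = (1 - m^2/v)/2, sigma = m(m^2/v + 1)/2."""
    n = len(exceedances)
    m = sum(exceedances) / n
    v = sum((y - m) ** 2 for y in exceedances) / n
    xi = 0.5 * (1 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1)
    return xi, sigma

rng = random.Random(42)
data = [gpd_sample(0.1, 1.0, rng) for _ in range(50_000)]
xi_hat, sigma_hat = gpd_moment_fit(data)   # should be near (0.1, 1.0)
```

In the simulation setting of the chapter, the exceedances would instead be crude Monte Carlo outputs above a chosen threshold, and the fitted GPD would be extrapolated beyond the largest observed sample to estimate the rare-event probability.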
In Chapter 6, we investigate a framework for developing calibration schemes in parametric settings that satisfies rigorous frequentist statistical guarantees via a basic notion we call the eligibility set, designed to bypass non-identifiability through set-based estimation. We investigate a feature extraction-then-aggregation approach to construct these sets targeting multivariate outputs. We demonstrate our methodology on several numerical examples, including an application to the calibration of a limit order book market simulator.
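The eligibility-set notion can be illustrated with a deliberately tiny example. This is an assumed toy, not the chapter's construction: a one-parameter exponential simulator, an assumed observed mean and standard error, and a grid search that keeps every parameter whose simulated output statistic falls inside a 95% interval around the data.

```python
import random

# Toy eligibility set: rather than one point estimate, retain every
# candidate parameter whose simulated mean is consistent with the
# observed data. Model, grid, and data summaries are all assumed.

def simulate_mean(rate, n, seed):
    rng = random.Random(seed)
    return sum(rng.expovariate(rate) for _ in range(n)) / n

observed_mean, observed_se = 2.0, 0.05   # assumed summary of "real" data
lo = observed_mean - 1.96 * observed_se
hi = observed_mean + 1.96 * observed_se

grid = [0.40 + 0.01 * k for k in range(21)]   # candidate exponential rates
eligible = [r for r in grid if lo <= simulate_mean(r, 20_000, seed=1) <= hi]
```

Because an exponential with rate r has mean 1/r, every rate near 0.5 is consistent with an observed mean of 2.0; the set-based answer keeps all of them instead of arbitrarily picking one, which is the sense in which the construction bypasses non-identifiability.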
In Chapter 7, we study a methodology to tackle the NASA Langley Uncertainty Quantification Challenge, a model calibration problem under both aleatory and epistemic uncertainties. Our methodology is based on an integration of distributionally robust optimization and importance sampling. The main computation machinery in this integrated methodology amounts to solving sampled linear programs. We present theoretical statistical guarantees of our approach via connections to nonparametric hypothesis testing, and numerical performances including parameter calibration and downstream decision and risk evaluation tasks.
A Simulation of the Impacts of Climate Change on Civil Aircraft Takeoff Performance
Climate change affects the near-surface environmental conditions that prevail at airports worldwide. Among these, air density and headwind speed are major determinants of takeoff performance, and their sensitivity to global warming carries potential operational and economic implications for the commercial air transport industry. Previous archival and prospective research observed a weakening in headwind strength and predicted an increase in near-surface temperatures, respectively, resulting in an increase in takeoff distances and weight restrictions. The main purpose of the present study was to update and generalize the extant prospective research using a more representative sample of worldwide airports, a wider range of climate scenarios, and next-generation climate models. The research questions included how much additional thrust and payload removal will be required to offset the centurial changes in takeoff conditions. This study relied on a quantitative method using the simulation instrument. Forecast climate data corresponding to four shared socioeconomic pathways (SSP1-2.6, SSP2-4.5, SSP3-7.0, and SSP5-8.5) over the available 2015-2100 period were sourced from a high-resolution CMIP6 global circulation model. These data were used to characterize the six-hourly near-surface environmental conditions prevailing at all 881 airports worldwide having at least one million passengers in pre-COVID-19 traffic. The missing air density was numerically derived from the air temperature, pressure, and humidity variables, while the headwind speed for each airport's active runway configuration was triangulated from the wind vector components. Separately, a direct takeoff-dynamics simulation model was developed from first principles and calibrated against published performance data under international standard atmospheric conditions for two narrowbody and two widebody aircraft.
The model was used to simulate 1.8 billion unique takeoffs, each initiated at 75% of maximum takeoff thrust and 100% of maximum takeoff mass. When the resulting takeoff distance required exceeded that available, the takeoff thrust was gradually increased to 100%, after which the takeoff mass was gradually decreased to an estimated breakeven load factor. In total, 65 billion takeoff iterations were simulated. Longitudinal changes to takeoff thrust, distance, and payload were recorded and examined by aircraft type, climate scenario, and climate zone. The results show that despite a marked centurial increase in the global mean air temperature of 9.4%-18.0% relative to the year 2015 under SSP2-4.5 and SSP3-7.0, air density will only decrease by 0.6%-1.1% due to its weak sensitivity to temperature. Likewise, mean headwinds were observed to remain almost unchanged relative to the 2015 baseline. As a result, the global mean takeoff thrust was found to increase by no more than 0.3 percentage points, while payload removals did not exceed 1.1 passengers. Significant deviations from the mean were observed at climatic outlier airports, including those located around the Siberian plateau, where takeoff operations may become more difficult. This study contributes to the air transport climate adaptation body of knowledge by providing contrasting results relative to earlier research that reported strong impacts of global warming on takeoff performance.
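The two derivations named in the abstract, moist-air density from temperature, pressure, and humidity, and headwind from the wind vector components, follow from standard formulas. The sketch below uses the Tetens saturation-pressure approximation and ideal-gas partial pressures; it is an illustrative reconstruction of the physics, not the study's code, and the sign convention for the wind components is an assumption.

```python
import math

# Moist-air density from T, p, RH (Tetens saturation formula plus
# ideal-gas partial pressures), and along-runway headwind from the
# eastward/northward wind components. Constants are standard values.

R_DRY, R_VAP = 287.05, 461.5   # specific gas constants, J/(kg K)

def air_density(temp_c, pressure_pa, rel_humidity):
    """Moist-air density in kg/m^3; rel_humidity in [0, 1]."""
    e_sat = 610.78 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # Tetens, Pa
    p_vap = rel_humidity * e_sat
    p_dry = pressure_pa - p_vap
    t_k = temp_c + 273.15
    return p_dry / (R_DRY * t_k) + p_vap / (R_VAP * t_k)

def headwind(u_wind, v_wind, runway_heading_deg):
    """Along-runway wind component (positive = headwind).
    u_wind, v_wind: eastward/northward components of the wind vector."""
    h = math.radians(runway_heading_deg)
    return -(u_wind * math.sin(h) + v_wind * math.cos(h))

rho_isa = air_density(15.0, 101_325, 0.0)   # ISA sea level, dry: ~1.225
rho_hot = air_density(45.0, 101_325, 0.5)   # hot, humid day: noticeably lower
```

Because density enters through the absolute temperature in kelvin, a large rise in Celsius temperature translates into only a small fractional density change, which is the mechanism behind the study's finding of a 0.6%-1.1% density decrease despite the marked warming.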