Preservation of Semantic Properties during the Aggregation of Abstract Argumentation Frameworks
An abstract argumentation framework can be used to model the argumentative
stance of an agent at a high level of abstraction, by indicating for every pair
of arguments that is being considered in a debate whether the first attacks the
second. When modelling a group of agents engaged in a debate, we may wish to
aggregate their individual argumentation frameworks to obtain a single such
framework that reflects the consensus of the group. Even when agents disagree
on many details, there may well be high-level agreement on important semantic
properties, such as the acceptability of a given argument. Using techniques
from social choice theory, we analyse under what circumstances such semantic
properties agreed upon by the individual agents can be preserved under
aggregation. (In Proceedings TARK 2017, arXiv:1707.0825)
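To make the aggregation setting concrete, here is a minimal sketch (an invented Python example, not code from the paper) in which each agent's framework is a set of directed attack pairs, the group framework keeps an attack if and only if a strict majority of agents report it, and one simple semantic property, conflict-freeness of the set {a, c}, is checked before and after aggregation.

    # Hypothetical illustration of majority aggregation of abstract
    # argumentation frameworks; arguments and attacks are invented.
    ARGS = {"a", "b", "c"}

    # Each agent's framework: a set of directed attacks (attacker, target).
    agent_afs = [
        {("a", "b"), ("b", "c")},
        {("a", "b"), ("c", "b")},
        {("a", "b")},
    ]

    def majority_aggregate(afs):
        """Keep an attack iff a strict majority of agents report it."""
        quota = len(afs) / 2
        pairs = {(x, y) for x in ARGS for y in ARGS if x != y}
        return {p for p in pairs if sum(p in af for af in afs) > quota}

    def conflict_free(attacks, subset):
        """No member of the subset attacks another member."""
        return all((x, y) not in attacks for x in subset for y in subset)

    collective = majority_aggregate(agent_afs)
    print(collective)                                              # {('a', 'b')}
    print(all(conflict_free(af, {"a", "c"}) for af in agent_afs))  # True
    print(conflict_free(collective, {"a", "c"}))                   # True

Here every agent finds {a, c} conflict-free, and majority voting preserves that; the paper's question is which semantic properties and which aggregation rules guarantee such preservation in general.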
Blurred Lines: border crossing between Macau and Zhuhai
Since Macau returned to China at the end of the colonial era in 1999, the relationship between Macau and the mainland has seen deeper and more diversified exchanges. Although Macau and the inland cities lie within one country, they differ in culture, politics and economy. This thesis studies the unique architectural types in Macau and shows the cultural characteristics of Macau's changes against the historical background of different eras. Political factors have caused Macau's sense of nationality to shift several times.
Since its completion, the Gongbei Port, which connects the two places, has witnessed the historical changes on both sides of the border and remains the physical boundary joining the Chinese mainland and Macau. The port is expanding to accommodate more travellers, and in the future, with the deepening of exchanges, it will, as the largest customs clearance port, carry an even deeper significance and practical role. In the colonial period it served as a boundary; now it has become a bridge between different peoples, cultures and economies. The anthropologist Victor Turner proposed that the threshold is an intermediate zone, a process of change and ambiguity; here at Gongbei Port, it is a process of integration. This thesis aims to project that ambiguity onto the architecture, so that the boundary can host both cultures and respond to the cultural characteristics and history of the two places. My intervention mainly conveys a concept of transition through architecture: I want to strengthen the site's cultural attributes in space. Travellers passing through the gateway can also visit exhibitions and learn about the history of the two places, reflected in an architectural form drawn from both cultures. The design works through the single element of the wall, which breaks the original division and forms an interactive, shared space from Macau to Zhuhai and from Zhuhai to Macau.
Four Essays in Applied Microeconomics
This dissertation comprises four essays. The first two essays investigate the sensitivity of the two largest components of health care expenditure, hospital care expenditure (HOCEXP) and physician and clinical services expenditure (DOCLNEXP), to changes in income, and ask how much of the estimated sensitivity is due to purchasing more care versus purchasing better care. Although the two essays share the same decomposition model, the estimation differs in the second essay due to data limitations. Using 1999-2008 panel data for the 50 US states, we estimate the income elasticities of HOCEXP and DOCLNEXP and decompose each into its quantity and quality components. Our findings suggest that the rise in both HOCEXP and DOCLNEXP has more to do with changes in quality than in quantity. The results mirror the literature in indicating that both hospital care and physician and clinical services are normal goods and technical necessities at the state level. The third essay analyzes the effect of insurance coverage on the likelihood of an emergency department (ED) visit being non-urgent or primary-care-sensitive (PCS). We analyze the Tennessee Hospital Outpatient Discharge Data for 2008 and identify non-urgent and PCS ED visits following a widely used ED classification algorithm. The results of a logit quasi-likelihood model show that being uninsured is associated with a higher probability of non-urgent and PCS visits than having private insurance. The predicted effect of insurance coverage under the PPACA depends on the mix of insurance types. The fourth essay explores the determinants and effects of confidence on academic and labor market outcomes using an information-rich nationwide survey of Graduate Management Admission Test (GMAT) registrants. We discuss several ways to define and measure confidence. Our results suggest that many confidence measures differ by race, gender, observed ability and managerial experience. These confidence measures have some predictive power for eventual academic outcomes, and more so for labor market outcomes.
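The quantity/quality decomposition can be made explicit. A plausible reading, sketched here in a form the essays may not use verbatim, treats expenditure as quantity times a quality (expenditure-per-unit) component, so that the income elasticity splits additively:

    % Assumed form of the decomposition, for illustration only.
    % E is expenditure (HOCEXP or DOCLNEXP), Q the quantity of care,
    % P the quality (expenditure per unit of care), and I income.
    \[
      E = Q \cdot P, \qquad \ln E = \ln Q + \ln P,
    \]
    \[
      \eta_{E,I} = \frac{\partial \ln E}{\partial \ln I}
                 = \frac{\partial \ln Q}{\partial \ln I}
                 + \frac{\partial \ln P}{\partial \ln I}
                 = \eta_{Q,I} + \eta_{P,I}.
    \]

Under this identity, finding that the quality term dominates is what the essays summarize as expenditure growth having more to do with quality than with quantity.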
Enabling wide-field, high-spatial-resolution fast transient searches with modern interferometry
Fast transients reveal the energetic universe to us. They are usually the products of rapid, enormous releases of energy. To study the physical nature behind them, we must know the subtle temporal structure of the signal and precisely where it comes from. Radio observations have been an indispensable part of the study of fast transients, thanks to their insensitivity to daylight and their tolerance of weather. Ever more powerful single-dish telescopes have been used in pulsar observations and fast radio transient surveys, yielding substantial results. However, some sporadic or even one-off fast radio transients have brought challenges to single-dish observations. Besides high time resolution, efficient fast transient surveys require a wide field of view and high spatial resolution. While larger single-dish telescopes are harder and much more expensive to build, interferometry provides a practical alternative. Its high spatial resolution, scalability, flexibility and cost-efficiency have made interferometry widely used in modern radio astronomy. With beamforming techniques, interferometric arrays can carry out wide-field, high-resolution surveys for fast radio transients.
In this thesis, I begin the introduction with an instrument-wise history of radio observation and its importance in astronomy. Following that is an introduction to fast radio transients, including the various types of pulsars and fast radio bursts. I then present radio interferometry as a solution to the challenges of fast radio transient surveys. A brief history of radio interferometry and its achievements is also provided.
The second chapter presents the requirements for fast radio surveys with interferometry, such as high time and frequency resolution, a large field of view and high spatial resolution. I then explain how beamformed observations with interferometric arrays can meet these requirements. A comparison between raw-voltage beamforming and visibility beamforming is presented from the perspective of computational complexity. At the end of the chapter, I briefly outline the configuration of the MeerKAT telescope, its various backends and its planned science projects. An overview of the MeerKAT transient search system based on the beamforming techniques above is presented, and the reason MeerKAT uses the raw-voltage beamforming method is explained.
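As a rough illustration of the raw-voltage approach and its cost scaling, here is a toy Python sketch with invented array parameters, not the MeerKAT implementation:

    # Toy delay-and-sum (raw-voltage) beamformer; geometry, delays and
    # data are synthetic placeholders.
    import numpy as np

    n_ant, n_samp = 64, 4096
    rng = np.random.default_rng(0)

    # Complex baseband voltages, one row per antenna.
    voltages = (rng.standard_normal((n_ant, n_samp))
                + 1j * rng.standard_normal((n_ant, n_samp)))

    # Geometric delays toward the target direction, applied as phases at
    # the observing frequency (values invented for the example).
    freq_hz = 1.4e9
    delays_s = rng.uniform(0.0, 1e-9, n_ant)
    weights = np.exp(-2j * np.pi * freq_hz * delays_s)

    # One tied-array beam: O(n_ant) per sample per beam, so n_beam beams
    # cost O(n_ant * n_beam) per sample; visibility beamforming instead
    # pays an O(n_ant**2) correlation step that is amortised over beams.
    beam = (weights[:, None] * voltages).sum(axis=0)
    power = np.abs(beam) ** 2
    print(power.mean())

The scaling noted in the comments is the crux of the comparison: roughly speaking, which approach is cheaper depends on the number of beams relative to the number of antennas and on the required time resolution.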
The third chapter describes the general beamforming technique in detail and provides its implementation. Following that, I provide solutions to the challenges that come with this technique, such as characterizing volatile beam shapes, generating efficient tilings of beams, and predicting the evolution of a tiling through time. To give a more realistic perspective, the integration of these techniques into the MeerKAT telescope is illustrated, and the corresponding capacity and statistics are provided. As an evaluation, a real beamformed observation of 47 Tucanae using these techniques is presented, and the localization capability with multiple beams is demonstrated. The development and deployment of this technique have proven successful and effective.
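The tiling step can be pictured with a toy example. The sketch below is a hypothetical Python stand-in; a real tiling derives the spacing from the modelled beam shape and a chosen overlap level. It lays out beam centres on a hexagonal lattice:

    # Hypothetical hexagonal tiling of beam centres around boresight.
    import numpy as np

    def hex_tiling(n_rings, spacing):
        """(x, y) offsets of a hexagonally packed tiling with n_rings rings."""
        centres = []
        for q in range(-n_rings, n_rings + 1):
            r_lo = max(-n_rings, -q - n_rings)
            r_hi = min(n_rings, -q + n_rings)
            for r in range(r_lo, r_hi + 1):
                # axial hex coordinates (q, r) mapped to the plane
                centres.append((spacing * (q + r / 2.0),
                                spacing * (np.sqrt(3) / 2.0) * r))
        return np.array(centres)

    beams = hex_tiling(n_rings=3, spacing=0.01)  # spacing in degrees, say
    print(len(beams))                            # 1 + 3*3*(3+1) = 37 beams

Because the synthesised beam is generally elliptical and changes with hour angle, the real tiling must be recomputed, or its evolution predicted, over the course of an observation, which is the problem this chapter addresses.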
The performance of this beamforming platform has been further tested in the numerous observations and surveys detailed in the fourth chapter: for example, the TRAPUM project, which includes, but is not limited to, surveys of globular clusters and nearby galaxies and targeted pulsar searches of unidentified Fermi gamma-ray sources. Another important survey is the MGPS project, which searches for pulsars in the Galactic plane at both L-band and S-band. At the time of writing, the two projects have discovered a total of 66 pulsars.
In the fifth chapter, I report a search for giant pulses from selected pulsars. Many models try to explain the emission mechanism behind giant pulses, some of which invoke reconnection events near the light cylinder. Meanwhile, the known population of giant-pulse emitters shares similar properties, such as strong magnetic fields near the light cylinder and high-energy emission. I selected several pulsars with these properties as candidates; they were observed using the Effelsberg 100-metre telescope. I created a pipeline based on Heimdall to search for giant pulses in the data and found no credible detection. The conclusion from the non-detection is that either these pulsars do not emit giant pulses detectable by our observations, or a high magnetic field strength near the light cylinder together with high-energy emission is not a sufficient condition for the occurrence of giant pulses.
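For concreteness, the core detection step of such a pipeline can be sketched as follows (a simplified Python stand-in for Heimdall, which is a GPU application; data and numbers here are synthetic): convolve the dedispersed time series with boxcars of increasing width and keep the peak signal-to-noise ratio.

    # Toy single-pulse search: matched filtering with boxcar templates.
    import numpy as np

    rng = np.random.default_rng(1)
    series = rng.standard_normal(100_000)
    series[50_000:50_008] += 3.0        # injected 8-sample "giant pulse"

    def boxcar_snr(ts, widths):
        """Peak matched-filter S/N over a set of boxcar widths."""
        ts = (ts - np.median(ts)) / np.std(ts)
        best = 0.0
        for w in widths:
            # sum over a window of width w, normalised by sqrt(w)
            smoothed = np.convolve(ts, np.ones(w), mode="same") / np.sqrt(w)
            best = max(best, smoothed.max())
        return best

    print(boxcar_snr(series, widths=[1, 2, 4, 8, 16, 32]))  # about 8 sigma

Candidates above a signal-to-noise threshold would then be sifted against radio-frequency interference and known pulsar periods before being counted as detections.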
A Reliability Case Study on Estimating Extremely Small Percentiles of Strength Data for the Continuous Improvement of Medium Density Fiberboard Product Quality
The objective of this thesis is to better estimate extremely small percentiles of strength distributions for measuring the failure process in continuous improvement initiatives. These percentiles are of great interest to companies, oversight organizations, and consumers concerned with product safety and reliability. The thesis investigates the lower percentiles of the quality of medium density fiberboard (MDF). The international industrial standard for measuring MDF quality is internal bond (IB, a tensile strength test). The results of the thesis indicate that the small percentiles, especially the first percentile and below, are crucial.
The thesis starts by introducing the background, study objectives, and previous work done in the area of MDF reliability. The thesis also reviews key components of total quality management (TQM) principles, strategies for reliability data analysis and modeling, information and data quality philosophy, and data preparation steps that were used in the research study.
Like many real-world cases, the internal bond data in material failure analysis do not perfectly follow the normal distribution. There was evidence from the study to suggest that MDF has potentially different failure modes for early failures. Forcing the normality assumption may lead to inaccurate predictions and poor product quality. We introduce a novel forced censoring technique that more closely fits the lower tails of strength distributions, where these small percentiles matter most. In this thesis, the forced censoring technique is implemented as a software module, using the JMP® Scripting Language (JSL), to expedite data processing, which is key for real-time manufacturing settings.
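Since the thesis implements the technique in JSL, the following is only a hypothetical Python analogue of the idea: observations above a chosen threshold are treated as right-censored, so the likelihood, and hence the fitted distribution, is driven by the lower tail.

    # Hypothetical sketch of forced censoring with a censored Weibull MLE;
    # the data are synthetic stand-ins for internal bond (IB) strengths.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import weibull_min

    rng = np.random.default_rng(2)
    ib = weibull_min.rvs(c=4.0, scale=100.0, size=500, random_state=rng)

    threshold = np.percentile(ib, 30)     # keep the weakest 30% as exact values
    exact = ib[ib <= threshold]
    n_cens = int((ib > threshold).sum())  # the rest are forced to be censored

    def neg_loglik(log_params):
        shape, scale = np.exp(log_params)   # log-scale keeps both positive
        ll = weibull_min.logpdf(exact, c=shape, scale=scale).sum()
        ll += n_cens * weibull_min.logsf(threshold, c=shape, scale=scale)
        return -ll

    fit = minimize(neg_loglik, x0=np.log([1.0, np.median(ib)]))
    shape_hat, scale_hat = np.exp(fit.x)
    print(weibull_min.ppf(0.01, c=shape_hat, scale=scale_hat))  # 1st percentile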
Results show that the Weibull distribution models the data best and provides percentile estimates that are neither too conservative nor too risky. Further analyses build an accelerated common-shape Weibull model for the two product types using the JMP® Survival and Reliability platform. The JMP® Scripting Language helps to automate the tasks of fitting the accelerated Weibull model and testing homogeneity of the shape parameter. At the end of the modeling stage, a packaged script provides field engineers with customized reporting for model visualization, parameter estimation, and percentile forecasting.
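The shape-homogeneity test can also be sketched outside JMP. This hypothetical Python version (synthetic data standing in for the two product types) fits each type separately and then with a shared shape parameter, comparing the two fits with a likelihood ratio test:

    # Hypothetical likelihood ratio test of a common Weibull shape.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.stats import chi2, weibull_min

    rng = np.random.default_rng(3)
    type_a = weibull_min.rvs(c=4.0, scale=100.0, size=300, random_state=rng)
    type_b = weibull_min.rvs(c=4.0, scale=120.0, size=300, random_state=rng)

    def loglik(data, c, scale):
        return weibull_min.logpdf(data, c=c, scale=scale).sum()

    # Separate two-parameter Weibull fits (location fixed at zero).
    ca, _, sa = weibull_min.fit(type_a, floc=0)
    cb, _, sb = weibull_min.fit(type_b, floc=0)
    ll_sep = loglik(type_a, ca, sa) + loglik(type_b, cb, sb)

    def nll_common(log_c):
        # Profile likelihood: with shape c fixed, the MLE of each scale
        # has the closed form (mean(x**c))**(1/c).
        c = np.exp(log_c)
        s_a = np.mean(type_a ** c) ** (1 / c)
        s_b = np.mean(type_b ** c) ** (1 / c)
        return -(loglik(type_a, c, s_a) + loglik(type_b, c, s_b))

    res = minimize_scalar(nll_common, bounds=(-2, 4), method="bounded")
    lr = 2 * (ll_sep + res.fun)              # 2 * (ll_sep - ll_common)
    print("p-value:", chi2.sf(lr, df=1))     # large p: common shape tenable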
Furthermore, using the tools of SPLIDA and S-PLUS, bootstrap estimates of the small percentiles demonstrate the improved intervals obtained with our forced censoring approach and the fitted model, including the common-shape assumption. Additionally, relatively more advanced Bayesian methods are employed to predict the low percentiles of this particular product type, which has a rather limited number of observations. Model interpretability, cross-validation strategy, comparison of results, and habitual assessment of practical significance are stressed and exercised throughout the thesis.
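As a flavour of the bootstrap step, here is a minimal parametric-bootstrap sketch in Python rather than SPLIDA/S-PLUS, again with synthetic data:

    # Hypothetical parametric bootstrap interval for the first percentile.
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(4)
    data = weibull_min.rvs(c=4.0, scale=100.0, size=200, random_state=rng)

    c_hat, _, s_hat = weibull_min.fit(data, floc=0)
    boot_p01 = []
    for _ in range(1000):
        resample = weibull_min.rvs(c=c_hat, scale=s_hat, size=len(data),
                                   random_state=rng)
        c_b, _, s_b = weibull_min.fit(resample, floc=0)
        boot_p01.append(weibull_min.ppf(0.01, c=c_b, scale=s_b))

    low, high = np.percentile(boot_p01, [2.5, 97.5])
    print(f"95% interval for the 1st percentile: [{low:.1f}, {high:.1f}]")

With forced censoring, the same loop would refit the censored likelihood instead of the complete-data MLE.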
Overall, the approach in the thesis is parsimonious and suitable for real-time manufacturing settings. It follows a consistent strategy of statistical analysis, leading to more accurate product conformance evaluation. It may also reduce the cost of destructive testing and data management by reducing the frequency of testing. If adopted, the approach may prevent field failures and improve product safety. The philosophy and analytical methods presented in the thesis also apply to other strength distributions and lifetime data.