8,273 research outputs found

    A Political Theory of Engineered Systems and A Study of Engineering and Justice Workshops

    Since there are good reasons to think that some engineered systems are socially undesirable—for example, internal combustion engines that cause climate change, algorithms that are racist, and nuclear weapons that can destroy all life—there is a well-established literature that attempts to identify best practices for designing and regulating engineered systems in order to prevent harm and promote justice. Most of this literature, especially the design theory and engineering justice literature meant to help guide engineers, focuses on environmental, physical, social, and mental harms such as ecosystem and bodily poisoning, racial and gender discrimination, and urban alienation. However, the literature that focuses on how engineered systems can produce political harms—harms to how we shape the way we live in community together—is not well established. The first part of this thesis contributes to identifying how particular types of engineered systems can harm a democratic politics. Building on democratic theory, philosophy of collective harms, and design theory, it argues that engineered systems that extend in space and time beyond a certain threshold subvert the knowledge and empowerment necessary for a democratic politics. For example, the systems of global shipping and the internet that fundamentally shape our lives are so large that people can attain neither the knowledge necessary to regulate them well nor the empowerment necessary to shape them. The second part of this thesis is an empirical study of a workshop designed to encourage engineering undergraduates to understand how engineered systems can subvert a democratic politics, with the ultimate goal of supporting students in incorporating that understanding into their work. Thirty-two Dartmouth undergraduate engineering students participated in the study: half were assigned to a workshop group and half to a control group. The workshop group participants took a pretest; then participated in a 3-hour, semi-structured workshop with 4 participants per session (as well as a discussion leader and note-taker) over lunch or dinner; and then took a posttest. The control group participants took the same pre- and post-tests but had no suggested activity in the intervening 3 hours. We find that the students who participated in workshops showed a statistically significant test-score improvement compared to the control group (Brunner-Munzel test, p < .001). Using thematic analysis methods, we show that the data are consistent with the hypothesis that the workshops produced a score improvement because of their structure (small size, long duration, discussion-based format, homemade food) and content (theoretically rich, challenging). Thematic analysis also reveals workshop failures and areas for improvement (too much content for the duration, insufficient organization). The thesis concludes with a discussion of limitations and suggestions for future theoretical, empirical, and pedagogical research.
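
    The group comparison above hinges on the Brunner-Munzel test. As a minimal sketch of how such a comparison runs, assuming hypothetical score-improvement numbers (the study's raw data are not reproduced here):

```python
# Hedged sketch: compares hypothetical test-score gains between a workshop
# group and a control group using the Brunner-Munzel test, a rank-based
# two-sample test that does not assume equal variances across groups.
import numpy as np
from scipy.stats import brunnermunzel

rng = np.random.default_rng(0)
# Hypothetical gains (posttest minus pretest) for 16 participants per group.
workshop_gain = rng.normal(loc=12.0, scale=5.0, size=16)
control_gain = rng.normal(loc=2.0, scale=5.0, size=16)

stat, p_value = brunnermunzel(workshop_gain, control_gain)
print(f"Brunner-Munzel statistic = {stat:.3f}, p = {p_value:.4f}")
```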

    UMSL Bulletin 2023-2024

    The 2023-2024 Bulletin and Course Catalog for the University of Missouri-St. Louis.

    Three Essays on Substructural Approaches to Semantic Paradoxes

    This thesis consists of three papers on substructural approaches to semantic paradoxes. The first paper introduces a formal system, based on a nontransitive substructural logic, which has exactly the valid and antivalid inferences of classical logic at every level of (meta)inference, but which I argue is still not classical logic. In the second essay, I introduce infinite-premise versions of several semantic paradoxes and show that noncontractive substructural approaches do not solve them. In the third essay, I introduce an infinite metainferential hierarchy of validity Curry paradoxes and argue that providing a uniform solution to the paradoxes in this hierarchy makes substructural approaches less appealing. Together, the three essays illustrate a problem for substructural approaches: substructural logics simply do not do everything that we need a logic to do, and so cannot solve semantic paradoxes in every context in which they appear. A new strategy, with a broader conception of what constitutes a uniform solution, is needed.

    2023-2 Dynamic and Stochastic Rational Behavior

    We analyze consumer demand behavior using the Dynamic Random Utility Model (DRUM). Under DRUM, a consumer draws a utility function from a stochastic utility process in each period and maximizes this utility subject to her budget constraint. DRUM allows unrestricted time correlation and cross-section heterogeneity in preferences. We fully characterize DRUM for panel data of consumer choices and budgets. DRUM is linked to a finite mixture of deterministic behaviors, represented as the Kronecker product of static rationalizable behavior. We provide a generalization of the Weyl-Minkowski theorem that uses this link to convert the characterizations of the static Random Utility Model (RUM) of McFadden-Richter (1990) to its dynamic form. DRUM is more flexible than Afriat's (1967) framework for time series and more informative than RUM. We show the feasibility of a statistical test of DRUM in a Monte Carlo study.
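
    A toy illustration of the Kronecker-product link described above, with made-up 0/1 choice-pattern matrices (column j encoding one deterministic rationalizable type, row i one choice situation); the paper's actual characterization is more general than this sketch:

```python
# Hedged sketch: dynamic deterministic behavior as the Kronecker product of
# static rationalizable behavior, on hypothetical 2x2 pattern matrices.
import numpy as np

A_t1 = np.array([[1, 0],
                 [0, 1]])  # period-1 deterministic choice patterns
A_t2 = np.array([[1, 1],
                 [0, 1]])  # period-2 deterministic choice patterns

# Each column of the Kronecker product pairs one period-1 type with one
# period-2 type, enumerating all cross-period deterministic behaviors.
A_dynamic = np.kron(A_t1, A_t2)
print(A_dynamic)  # 4 x 4 matrix of two-period choice patterns
```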

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume.

    UMSL Bulletin 2022-2023

    The 2022-2023 Bulletin and Course Catalog for the University of Missouri-St. Louis.

    Unpublishing the News: An Assessment of U.S. Public Opinion, Newsroom Accountability, and Journalists' Authority as "The First Draft of History"

    Unpublishing, or the manipulation, deindexing, or removal of published content on a news organization's website, is a hotly debated issue in the news industry that disrupts fundamental beliefs about the nature of news and the roles of journalists. This dissertation's premise is that unpublishing as a phenomenon challenges the authority of journalism as "the first draft of history," questions the assumed relevance of traditional norms, and creates an opportunity to reconsider how news organizations demonstrate their accountability to the public. The study identifies public opinions related to unpublishing practices and approval of related journalistic norms through a public opinion survey of 1,350 U.S. adults. In tandem, a qualitative analysis of 62 editorial policies related to unpublishing offers the first inventory and assessment of emerging journalistic practices and the normative values journalists demonstrate through them. These contributions are valuable to both the academy and the news industry, as they identify a path forward for future research and provide desired guidance to U.S. news organizations. Findings suggest that in response to the unpublishing phenomenon, American journalists defend their professionalism primarily through the traditional professional paradigm of accuracy, invoking it to legitimize new guidelines whether those policies permitted or denounced unpublishing as a newsroom practice. Findings also show that newsrooms are pledging increased levels of accountability to their communities and society at large, but policy discourse was silent on how they might demonstrate that accountability in concrete, tactical terms. In addition, both American adults and news organizations place a high value on the accuracy of previously published news content, yet the two groups' temporal conceptions of accuracy must be reconciled. Ultimately, the unpublishing phenomenon presents an opportunity for journalists to redefine their notions of accountability to their communities. Based on these findings, the study concludes with a call for American news organizations to abandon claims to the "first draft of history" in the digital era and assume the role of information custodians, proactively establishing and managing the lifecycle of content.
    Doctor of Philosophy

    Can Bayesian Network empower propensity score estimation from Real World Data?

    A new method, based on Bayesian networks, is proposed for estimating propensity scores, with the purpose of drawing causal inference from real-world data on the average treatment effect in the case of a binary outcome and discrete covariates. The proposed method confers maximum likelihood properties on the estimated propensity score, i.e., asymptotic efficiency, thus outperforming other available approaches. Two point estimators via inverse probability weighting are then proposed, and their main distributional properties are derived for constructing confidence intervals and for testing the hypothesis of no treatment effect. Empirical evidence of the substantial improvements offered by the proposed methodology over standard logistic modelling of the propensity score is provided in simulation settings that mimic the characteristics of a real dataset of prostate cancer patients from Milan San Raffaele Hospital.
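
    To make the baseline concrete, here is a minimal sketch of the standard logistic propensity-score model with inverse probability weighting (IPW) that the proposed Bayesian-network estimator is compared against; the data are synthetic, and the Bayesian-network estimator itself is not reproduced:

```python
# Hedged sketch: logistic propensity scores plus an IPW estimate of the
# average treatment effect (ATE), on synthetic data with discrete
# covariates and a binary outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
X = rng.integers(0, 2, size=(n, 3))                  # discrete covariates
p_treat = 1.0 / (1.0 + np.exp(-(X.sum(axis=1) - 1.5)))
T = rng.binomial(1, p_treat)                         # treatment indicator
Y = rng.binomial(1, 0.2 + 0.3 * T + 0.05 * X[:, 0])  # binary outcome

# Estimated propensity score e(x) = P(T = 1 | X = x).
e_hat = LogisticRegression().fit(X, T).predict_proba(X)[:, 1]

# Horvitz-Thompson style inverse-probability-weighted ATE estimate.
ate_ipw = np.mean(T * Y / e_hat - (1 - T) * Y / (1 - e_hat))
print(f"IPW ATE estimate: {ate_ipw:.3f} (true effect 0.30)")
```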

    Non-parametric online market regime detection and regime clustering for multidimensional and path-dependent data structures

    In this work we present a non-parametric online market regime detection method for multidimensional data structures, using a path-wise two-sample test derived from a maximum mean discrepancy-based similarity metric on path space that uses rough path signatures as a feature map. The latter similarity metric has been developed and applied as a discriminator in recent generative models for small data environments, and is optimised here for the setting where the size of new incoming data is particularly small, for faster reactivity. On the same principles, we also present a path-wise method for regime clustering which extends our previous work. The presented regime clustering techniques were designed as ex-ante market analysis tools that can identify periods of approximately similar market activity, but the new results also apply to path-wise, high-dimensional, and non-Markovian settings, as well as to data structures that exhibit autocorrelation. We demonstrate our clustering tools on easily verifiable synthetic datasets of increasing complexity, and also show how the outlined regime detection techniques can be used as fast online automatic regime-change detectors or as outlier detection tools, including in a fully automated pipeline. Finally, we apply the fine-tuned algorithms to real-world historical data, including high-dimensional baskets of equities and the recent price evolution of crypto assets, and show that our methodology swiftly and accurately indicated historical periods of market turmoil.
    Comment: 65 pages, 52 figures
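
    The core building block is a maximum mean discrepancy (MMD) two-sample statistic on featurized paths. A minimal sketch follows, using an RBF kernel on raw path increments; the paper's method uses rough-path signatures as the feature map, which this sketch does not reproduce:

```python
# Hedged sketch: unbiased squared-MMD two-sample statistic between two
# batches of path features (here plain increment vectors, not signatures).
import numpy as np

def rbf_mmd2(X, Y, sigma=1.0):
    """Unbiased squared MMD between samples X (m, d) and Y (n, d)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    m, n = len(X), len(Y)
    np.fill_diagonal(Kxx, 0.0)  # drop i == j terms for the unbiased estimate
    np.fill_diagonal(Kyy, 0.0)
    return (Kxx.sum() / (m * (m - 1)) + Kyy.sum() / (n * (n - 1))
            - 2.0 * Kxy.mean())

rng = np.random.default_rng(2)
calm = rng.normal(0.0, 0.5, size=(50, 5))     # low-volatility increments
turmoil = rng.normal(0.0, 2.0, size=(50, 5))  # high-volatility increments
print(f"MMD^2 within regime:  {rbf_mmd2(calm[:25], calm[25:]):.4f}")
print(f"MMD^2 across regimes: {rbf_mmd2(calm[:25], turmoil[:25]):.4f}")
```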

    Depth Functions for Partial Orders with a Descriptive Analysis of Machine Learning Algorithms

    We propose a framework for descriptively analyzing sets of partial orders based on the concept of depth functions. Despite intensive study of depth functions in linear and metric spaces, there is very little discussion of depth functions for non-standard data types such as partial orders. We introduce an adaptation of the well-known simplicial depth to the set of all partial orders, the union-free generic (ufg) depth. Moreover, we utilize our ufg depth for a comparison of machine learning algorithms based on multidimensional performance measures. Concretely, we analyze the distribution of different classifier performances over a sample of standard benchmark data sets. Our results promisingly demonstrate that our approach differs substantially from existing benchmarking approaches and therefore adds a new perspective to the vivid debate on the comparison of classifiers.
    Comment: Accepted to ISIPTA 2023; forthcoming in Proceedings of Machine Learning Research
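
    For orientation, here is a minimal sketch of classical simplicial depth in the plane, the notion the ufg depth adapts to sets of partial orders; the partial-order adaptation itself is combinatorial and is not reproduced here:

```python
# Hedged sketch: simplicial depth of a point = fraction of sample
# triangles that contain it (the classical planar definition).
import numpy as np
from itertools import combinations

def in_triangle(p, a, b, c):
    """Same-side-of-every-edge test for p against triangle abc."""
    def cross(o, u, v):
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    s1, s2, s3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    return (s1 >= 0 and s2 >= 0 and s3 >= 0) or (s1 <= 0 and s2 <= 0 and s3 <= 0)

def simplicial_depth(p, sample):
    """Fraction of all sample triangles containing point p."""
    triples = list(combinations(sample, 3))
    return sum(in_triangle(p, *t) for t in triples) / len(triples)

rng = np.random.default_rng(3)
pts = rng.normal(size=(30, 2))
print(f"depth at the center: {simplicial_depth(np.zeros(2), pts):.3f}")
print(f"depth at an outlier: {simplicial_depth(np.array([3.0, 3.0]), pts):.3f}")
```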