Improving Cross-Lingual Transfer Learning for Event Detection
The widespread adoption of applications powered by Artificial Intelligence (AI) backbones has unquestionably changed the way we interact with the world around us. Applications such as automated personal assistants, automatic question answering, and machine translation systems have become mainstays of modern culture thanks to recent considerable advances in Natural Language Processing (NLP) research. Nonetheless, with over 7,000 spoken languages in the world, a considerable number of marginalized communities remain unable to benefit from these technological advancements, largely due to the language they speak. Cross-Lingual Learning (CLL) addresses this issue by transferring the knowledge acquired from a popular, high-resource source language (e.g., English, Chinese, or Spanish) to a less favored, lower-resource target language (e.g., Urdu or Swahili). This dissertation leverages the Event Detection (ED) sub-task of Information Extraction (IE) as a testbed and presents three novel approaches that improve cross-lingual transfer learning from distinct perspectives: (1) direct knowledge transfer, (2) hybrid knowledge transfer, and (3) few-shot learning.
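As a rough illustration of the direct-transfer setting described above, the sketch below fine-tunes a multilingual encoder on English event-trigger labels and then applies it zero-shot to a target-language sentence. The model checkpoint, label set, and data are illustrative assumptions, not the dissertation's actual experimental setup.

```python
# Minimal sketch of zero-shot cross-lingual transfer for event detection:
# fine-tune a multilingual encoder on English trigger labels, then apply it
# unchanged to a lower-resource target language. Model, labels, and data are
# illustrative assumptions only.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-Attack", "B-Movement"]             # hypothetical event-type tags
tok = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=len(labels)
)

# One toy English training example (word-level tags aligned to subwords).
words = ["Rebels", "attacked", "the", "convoy"]
word_tags = [0, 1, 0, 0]
enc = tok(words, is_split_into_words=True, return_tensors="pt")
tag_ids = [
    word_tags[w] if w is not None else -100          # -100 = ignored by the loss
    for w in enc.word_ids(0)
]
enc["labels"] = torch.tensor([tag_ids])

model.train()
optim = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**enc).loss                             # supervised step on English
loss.backward()
optim.step()

# Zero-shot application to a target-language sentence (Swahili here).
model.eval()
with torch.no_grad():
    out = model(**tok("Waasi walishambulia msafara", return_tensors="pt"))
print(out.logits.argmax(-1))                         # predicted tag ids per subword
```

In practice the English side would be a full annotated ED corpus rather than one sentence; the point of the sketch is only that the shared multilingual representation is what carries the knowledge across languages.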
Mapping the Focal Points of WordPress: A Software and Critical Code Analysis
Programming languages or code can be examined through numerous analytical lenses. This project is a critical analysis of WordPress, a prevalent web content management system, applying four modes of inquiry. The project draws on theoretical perspectives and areas of study in media, software, platforms, code, language, and power structures. The applied research is based on Critical Code Studies, an interdisciplinary field of study that holds the potential as a theoretical lens and methodological toolkit to understand computational code beyond its function. The project begins with a critical code analysis of WordPress, examining its origins and source code and mapping selected vulnerabilities. An examination of the influence of digital and computational thinking follows this. The work also explores the intersection of code patching and vulnerability management and how code shapes our sense of control, trust, and empathy, ultimately arguing that a rhetorical-cultural lens can be used to better understand code's controlling influence. Recurring themes throughout these analyses and observations are the connections to power and vulnerability in WordPress' code and how cultural, processual, rhetorical, and ethical implications can be expressed through its code, creating a particular worldview. Code's emergent properties help illustrate how human values and practices (e.g., empathy, aesthetics, language, and trust) become encoded in software design and how people perceive the software through its worldview. These connected analyses reveal cultural, processual, and vulnerability focal points and the influence these entanglements have concerning WordPress as code, software, and platform. WordPress is a complex sociotechnical platform worthy of further study, as is the interdisciplinary merging of theoretical perspectives and disciplines to critically examine code. Ultimately, this project helps further enrich the field by introducing focal points in code, examining sociocultural phenomena within the code, and offering techniques to apply critical code methods.
A Theistic Critique of Secular Moral Nonnaturalism
This dissertation is an exercise in Theistic moral apologetics. It develops both a critique of secular nonnaturalist moral theory (moral Platonism) at the level of metaethics and a positive form of the moral argument for the existence of God that follows from this critique. The critique focuses on the work of five prominent metaethical theorists of secular moral nonnaturalism: David Enoch, Eric Wielenberg, Russ Shafer-Landau, Michael Huemer, and Christopher Kulp. Each of these thinkers is critically examined. Following this critique, the positive moral argument for the existence of God is developed, combining a cumulative, abductive argument that follows from filling in the content of a succinct apagogic argument. The cumulative abductive argument and the apagogic argument together, with a transcendental and modal component, are presented to make the case that Theism is the best explanation for the kind of moral, rational beings we are and the kind of universe in which we live: a rational, intelligible universe.
Method versatility in analysing human attitudes towards technology
Various research domains are facing new challenges brought about by growing volumes of data. To make optimal use of them, and to increase the reproducibility of research findings, method versatility is required. Method versatility is the ability to flexibly apply widely varying data analytic methods depending on the study goal and the dataset characteristics.
Method versatility is an essential characteristic of data science, but in other areas of research, such as educational science or psychology, its importance is yet to be fully accepted. Versatile methods can enrich the repertoire of specialists who validate psychometric instruments, conduct data analysis of large-scale educational surveys, and communicate their findings to the academic community, which corresponds to three stages of the research cycle: measurement, research per se, and communication. In this thesis, studies related to these stages have a common theme of human attitudes towards technology, as this topic becomes vitally important in our age of ever-increasing digitization.
The thesis is based on four studies, in which method versatility is introduced in four different ways: the consecutive use of methods, the toolbox choice, the simultaneous use, and the range extension. In the first study, different methods of psychometric analysis are used consecutively to reassess psychometric properties of a recently developed scale measuring affinity for technology interaction. In the second, the random forest algorithm and hierarchical linear modeling, as tools from machine learning and statistical toolboxes, are applied to data analysis of a large-scale educational survey related to students’ attitudes to information and communication technology. In the third, the challenge of selecting the number of clusters in model-based clustering is addressed by the simultaneous use of model fit, cluster separation, and the stability of partition criteria, so that generalizable separable clusters can be selected in the data related to teachers’ attitudes towards technology. The fourth reports the development and evaluation of a scholarly knowledge graph-powered dashboard aimed at extending the range of scholarly communication means.
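As a rough sketch of the third study's idea of combining several criteria when choosing the number of clusters, the example below evaluates Gaussian mixture solutions by BIC (model fit), silhouette width (cluster separation), and a bootstrap agreement score (stability of partition). The data, library calls, and number of resamples are illustrative assumptions rather than the procedures used in the thesis.

```python
# Sketch: pick the number of clusters in model-based clustering by looking at
# model fit (BIC), cluster separation (silhouette), and partition stability
# (agreement between bootstrap resamples). Illustrative data and settings only.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score, adjusted_rand_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
rng = np.random.default_rng(0)

for k in range(2, 7):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    labels = gm.predict(X)
    bic = gm.bic(X)                                  # model fit (lower is better)
    sil = silhouette_score(X, labels)                # separation (higher is better)

    # Stability: refit on bootstrap samples, compare partitions on the full data.
    agreements = []
    for _ in range(10):
        idx = rng.integers(0, len(X), len(X))
        boot = GaussianMixture(n_components=k, random_state=0).fit(X[idx])
        agreements.append(adjusted_rand_score(labels, boot.predict(X)))
    stability = np.mean(agreements)                  # stability (higher is better)

    print(f"k={k}  BIC={bic:9.1f}  silhouette={sil:.2f}  stability={stability:.2f}")
```

A solution is chosen only when the three criteria point in the same direction, which is the sense in which the study uses them simultaneously rather than relying on any single index.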
The findings of the thesis can be helpful for increasing method versatility in various research areas. They can also facilitate methodological advancement of academic training in data analysis and aid further development of scholarly communication in accordance with open science principles.
A Methodology to Enable Concurrent Trade Space Exploration of Space Campaigns and Transportation Systems
Space exploration campaigns detail the ways and means to achieve the goals of our human spaceflight programs. Executing them requires significant strategic, financial, and programmatic investments over long timescales, which must therefore be justified to decision makers. To support an informed down-selection, many alternative campaign designs are presented at the conceptual level, each as a set and sequence of individual missions that meets the technical and programmatic goals and constraints of the campaign. Each mission is executed by in-space transportation systems, which deliver crew or cargo payloads to various destinations. The design of each of these transportation systems is highly dependent on campaign goals, and even small changes in subsystem design parameters can prompt significant changes in the overall campaign strategy. However, the current state of the art describes campaign and vehicle design processes that are generally performed independently, which limits the ability to assess these sensitive interactions. The objective of this research is to establish a methodology for space exploration campaign design that represents transportation systems as collections of subsystems and integrates their design process to enable concurrent trade space exploration. More specifically, the goal is to identify existing campaign and vehicle design processes to use as a foundation for improvement and eventual integration.
In the past two decades, researchers have adapted terrestrial logistics and supply chain optimization processes to the space campaign design problem by accounting for the challenges that accompany space travel. Fundamentally, a space campaign is formulated as a network design problem in which destinations, such as orbits or the surfaces of planetary bodies, are represented as nodes, with the routes between them as arcs. The objective of this design problem is to optimize the flow of commodities within the network using the available transportation systems. Given the dynamic nature and the number of commodities involved, each campaign can be modeled as a time-expanded, generalized multi-commodity network flow and solved using mixed-integer programming. To address the challenge of modeling complex concepts of operations (ConOps), this formulation was extended to include paths as sets of arcs, further enabling the inclusion of vehicle stacks and payload transfers in the campaign optimization process. Further, given this research's focus on transportation systems, the typically fixed orbital nodes in the logistics network are modified to represent ranges of orbits, categorized by their characteristic energy. This allows the vehicle design process to vary each orbit in a mission to find the best one per vehicle.
By extension, once the processes are integrated, the ΔV and Δt arc costs are updated at each iteration. Once campaign goals and external constraints are included, the formulated campaign design process generates alternatives at the conceptual level, each identifying an optimal set and sequence of missions to perform.
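For intuition, a heavily simplified commodity-flow view of the kind described above might look like the following sketch, which routes a single cargo commodity through a three-node network while minimizing total ΔV-weighted flow. The node names, arc costs, and use of the PuLP solver are illustrative assumptions, not the time-expanded, multi-commodity formulation developed in this work.

```python
# Minimal sketch of a network-flow view of a campaign: nodes are destinations,
# arcs are transfers with a dV-like cost, and one cargo commodity is shipped
# from LEO to the lunar surface at minimum total cost. Purely illustrative.
import pulp

nodes = ["LEO", "NRHO", "LunarSurface"]
arcs = {("LEO", "NRHO"): 3.9, ("NRHO", "LunarSurface"): 2.7, ("LEO", "LunarSurface"): 6.1}
demand = {"LEO": -10, "NRHO": 0, "LunarSurface": 10}   # 10 t leaves LEO, arrives on the surface

prob = pulp.LpProblem("campaign_flow", pulp.LpMinimize)
flow = {a: pulp.LpVariable(f"x_{a[0]}_{a[1]}", lowBound=0) for a in arcs}

# Objective: minimize dV-weighted mass moved across the network.
prob += pulp.lpSum(cost * flow[a] for a, cost in arcs.items())

# Flow conservation at every node (inflow - outflow = demand).
for n in nodes:
    prob += (
        pulp.lpSum(flow[a] for a in arcs if a[1] == n)
        - pulp.lpSum(flow[a] for a in arcs if a[0] == n)
        == demand[n]
    )

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for a, v in flow.items():
    print(a, v.value())
```

The real formulation adds time expansion, multiple commodities, vehicle stacks, and integer variables, but the node/arc/conservation skeleton is the same.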
Representing transportation systems as collections of subsystems introduces challenges in the design of each vehicle, with a high degree of coupling between the subsystems as well as the driving mission. Additionally, the sizing of each subsystem can have many inputs and outputs linked across the system, resulting in a complex multidisciplinary analysis and optimization problem. By leveraging the ontology within the Dynamic Rocket Equation Tool (DYREQT), this problem can be solved rapidly by defining each system as a hierarchy of elements and subelements, the latter corresponding to external subsystem-level sizing models. DYREQT also enables the construction of individual missions as a series of events, which can be driven and generated directly by the mission set found by the campaign optimization process. This process produces sized vehicles iteratively using the mission input, subsystem-level sizing models, and the ideal rocket equation.
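The role of the ideal rocket equation in such a sizing loop can be sketched as follows: propellant mass follows from the mission ΔV and the vehicle's final mass, but tank (and other subsystem) mass depends on propellant mass, so the two are iterated to a fixed point. The numbers, the simple tank mass fraction, and the tolerance are illustrative assumptions, not DYREQT's actual models.

```python
# Sketch: size a single stage with the ideal rocket equation, iterating because
# propellant mass and dry mass depend on each other. Numbers are illustrative.
import math

G0 = 9.80665          # m/s^2
ISP = 450.0           # s, hypothetical LH2/LOX-class engine
DV = 4200.0           # m/s, mission delta-V from the campaign process
PAYLOAD = 15000.0     # kg
FIXED_DRY = 2000.0    # kg, structure/avionics independent of tank size
TANK_FRACTION = 0.10  # kg of tank mass per kg of propellant (assumed)

mass_ratio = math.exp(DV / (G0 * ISP))              # m_initial / m_final

prop = 0.0
for _ in range(50):
    dry = FIXED_DRY + TANK_FRACTION * prop          # dry mass grows with propellant
    final_mass = PAYLOAD + dry
    new_prop = final_mass * (mass_ratio - 1.0)      # ideal rocket equation
    if abs(new_prop - prop) < 1e-6:
        break
    prop = new_prop

print(f"propellant ~ {prop:,.0f} kg, dry ~ {dry:,.0f} kg, "
      f"initial mass ~ {prop + final_mass:,.0f} kg")
```

DYREQT generalizes this idea across a hierarchy of elements and external subsystem models, but the convergence behavior of the loop is what couples the vehicle process to the campaign process.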
A literature review of campaign and vehicle design processes identifies the different pieces of the overall methodology, but not its structure. The specific iterative solver, the corresponding convergence criteria, and the initialization scheme are the primary areas of experimentation in this thesis. Using NASA's reference 3-element Human Landing System campaign, the results of these experiments show that the methodology performs best when initialized with the vehicle sizing and synthesis process and a path guess that minimizes ΔV. Further, a converged solution is found faster using nonlinear Gauss-Seidel fixed-point iteration rather than Jacobi iteration, together with a set of convergence criteria that covers vehicle masses and mission data.
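The difference between the Jacobi and Gauss-Seidel coupling schemes mentioned here can be illustrated on a toy pair of coupled analyses: Gauss-Seidel feeds each discipline the other's freshest output within the same iteration, while Jacobi uses only values from the previous iteration. The two made-up "analysis" functions and the tolerance below are placeholders for the campaign and vehicle design processes, not the thesis's actual models.

```python
# Toy comparison of Jacobi vs. nonlinear Gauss-Seidel fixed-point iteration for
# two coupled analyses (stand-ins for the campaign and vehicle processes).
def campaign(vehicle_mass):          # hypothetical: mission demand grows with mass
    return 10.0 + 0.05 * vehicle_mass

def vehicle(mission_demand):         # hypothetical: vehicle mass grows with demand
    return 50.0 + 2.0 * mission_demand

def iterate(gauss_seidel, tol=1e-9, max_iter=200):
    demand, mass = 10.0, 50.0                        # initial guesses
    for i in range(1, max_iter + 1):
        new_demand = campaign(mass)
        # Gauss-Seidel uses the freshly updated demand; Jacobi uses the old one.
        new_mass = vehicle(new_demand if gauss_seidel else demand)
        if abs(new_demand - demand) < tol and abs(new_mass - mass) < tol:
            return i, new_demand, new_mass
        demand, mass = new_demand, new_mass
    return max_iter, demand, mass

print("Jacobi      :", iterate(gauss_seidel=False))
print("Gauss-Seidel:", iterate(gauss_seidel=True))
```

Both schemes reach the same fixed point on this toy problem; Gauss-Seidel simply does so in fewer iterations, which mirrors the convergence behavior reported above.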
To show improvement over the state of the art, and how it enables concurrent trade studies, this methodology is used at scale in a demonstration based on NASA's Design Reference Architecture 5.0. The LH2 Nuclear Thermal Propulsion (NTP) option is traded against NH3 and H2O at the vehicle level to show the impacts of alternative propellants on vehicle sizing and campaign strategy. Martian surface stay duration is traded at the campaign level through two options: long-stay and short-stay. The methodology produced four alternative campaigns over the course of two weeks, providing data about the launch and aggregation strategy, mission profiles, high-level figures of merit, and subsystem-level vehicle sizes for each alternative. As expected, with their lower specific impulses, the alternative NTP propellants showed significant growth in the overall mass required to execute each campaign, subsequently reflected in the number of drop tanks and launches. Further, the short-stay campaign option showed an overall required mass similar to its long-stay counterpart, but higher overall costs even with fewer elements required. Both trade studies supported the overall hypothesis that integrating the campaign and vehicle design processes addresses the coupling between them and directly shows the impacts of their sensitivities on each other. As a result, the research objective was fulfilled by producing a methodology that addresses the key gaps identified in the current state of the art.
Modern meat: the next generation of meat from cells
Modern Meat is the first textbook on cultivated meat, with contributions from over 100 experts within the cultivated meat community.
The Sections of Modern Meat comprise 5 broad categories of cultivated meat: Context, Impact, Science, Society, and World.
The 19 chapters of Modern Meat, spread across these 5 sections, provide detailed entries on cultivated meat. They extensively tour a range of topics, including the impact of cultivated meat on humans and animals, the bioprocess of cultivated meat production, how cultivated meat may become a food option in Space and on Mars, and how cultivated meat may impact the economy, culture, and tradition of Asia.
Computer Vision and Architectural History at Eye Level: Mixed Methods for Linking Research in the Humanities and in Information Technology
Information on the history of architecture is embedded in our daily surroundings, in vernacular and heritage buildings and in physical objects, photographs, and plans. Historians study these tangible and intangible artefacts and the communities that built and used them. Valuable insights are thus gained into the past and the present, which also provide a foundation for designing the future. Given that our understanding of the past is limited by the inadequate availability of data, the article demonstrates that advanced computer tools can help obtain more, and better linked, data from the past. Computer vision can make a decisive contribution to the identification of image content in historical photographs. This application is particularly interesting for architectural history, where visual sources play an essential role in understanding the built environment of the past, yet a lack of reliable metadata often hinders the use of these materials. Automated recognition contributes to making a variety of image sources usable for research.
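As a loose illustration of the kind of image-content identification the article discusses, the snippet below runs a general-purpose pretrained classifier over a historical photograph. The model checkpoint, the file name, and the use of the Hugging Face pipeline API are assumptions for the sake of the example; the article's own approach (architectural categories, training data, metadata linking) is not reproduced here.

```python
# Sketch: identify the content of a historical photograph with a pretrained
# classifier. The checkpoint and image path are illustrative assumptions; a real
# architectural-history pipeline would use domain-specific labels and metadata.
from transformers import pipeline

classifier = pipeline("image-classification", model="google/vit-base-patch16-224")

# Any local scan of a historical photograph would do here.
predictions = classifier("historical_facade_photo.jpg")
for p in predictions:
    print(f"{p['label']:30s} {p['score']:.2f}")
```

In a research setting, such predictions would typically be reviewed by historians and attached to the photographs as metadata, which is where the "linking" between humanities research and information technology takes place.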
Measuring the impact of COVID-19 on hospital care pathways
Care pathways in hospitals around the world reported significant disruption during the recent COVID-19 pandemic, but measuring the actual impact is more problematic. Process mining can help hospital management measure the conformance of real-life care to what might be considered normal operations. In this study, we aim to demonstrate that process mining can be used to investigate process changes associated with complex disruptive events. We studied perturbations to accident and emergency (A&E) and maternity pathways in a UK public hospital during the COVID-19 pandemic. Coincidentally, the hospital had implemented a Command Centre approach to patient-flow management, affording an opportunity to study both the planned improvement and the disruption due to the pandemic. Our study proposes and demonstrates a method for measuring and investigating the impact of such planned and unplanned disruptions affecting hospital care pathways. We found that during the pandemic, both A&E and maternity pathways had measurable reductions in mean length of stay and a measurable drop in the percentage of pathways conforming to normative models. There were no distinctive patterns in the monthly mean length of stay or conformance throughout the phases of the installation of the hospital's new Command Centre. Due to a deficit in the available A&E data, the findings for A&E pathways could not be interpreted.
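The two measurements mentioned above, mean length of stay and the share of pathways matching a normative model, can be sketched on a toy event log as below. The column names, the simplistic "exact activity sequence" notion of conformance, and the data are assumptions for illustration; the study itself relies on process-mining conformance checking against discovered normative models.

```python
# Sketch: per-month mean length of stay and a naive conformance rate for a toy
# event log. Column names, the normative sequence, and the data are illustrative.
import pandas as pd

log = pd.DataFrame({
    "case_id":   [1, 1, 1, 2, 2, 3, 3, 3],
    "activity":  ["arrive", "triage", "discharge",
                  "arrive", "discharge",
                  "arrive", "triage", "discharge"],
    "timestamp": pd.to_datetime([
        "2020-02-01 10:00", "2020-02-01 11:00", "2020-02-01 15:00",
        "2020-03-05 09:00", "2020-03-05 12:00",
        "2020-03-20 08:00", "2020-03-20 09:30", "2020-03-21 08:00",
    ]),
})

normative = ["arrive", "triage", "discharge"]        # assumed normative pathway

cases = log.sort_values("timestamp").groupby("case_id")
stats = pd.DataFrame({
    "month": cases["timestamp"].min().dt.to_period("M"),
    "los_hours": (cases["timestamp"].max() - cases["timestamp"].min())
                 .dt.total_seconds() / 3600,
    "conformant": cases["activity"].apply(lambda s: list(s) == normative),
})

print(stats.groupby("month").agg(mean_los=("los_hours", "mean"),
                                 pct_conformant=("conformant", "mean")))
```

Tracking these two summaries month by month, before and during the disruption, is the basic shape of the comparison the study performs with full process-mining tooling.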
(b2023 to 2014) The UNBELIEVABLE similarities between the ideas of some people (2006-2016) and my ideas (2002-2008) in physics (quantum mechanics, cosmology), cognitive neuroscience, philosophy of mind, and philosophy (this manuscript would require a REVOLUTION in international academy environment!)
Towards a Peaceful Development of Cyberspace - Challenges and Technical Measures for the De-escalation of State-led Cyberconflicts and Arms Control of Cyberweapons
Cyberspace, already a few decades old, has become a matter of course for most of us, part of our everyday life. At the same time, this space and the global infrastructure behind it are essential for our civilizations, the economy, and administration, and thus an essential expression and lifeline of a globalized world. However, these developments also create vulnerabilities, and thus cyberspace is increasingly developing into an intelligence and military operational area – for the defense and security of states, but also as a component of offensive military planning, visible in the creation of military cyber-departments and the integration of cyberspace into states' security and defense strategies. In order to contain and regulate the conflict and escalation potential of technology used by military forces, a complex tool set of transparency, de-escalation, and arms control measures has been developed and proven over the last decades. Unfortunately, many of these established measures do not work for cyberspace due to its specific technical characteristics. Moreover, the concept of what constitutes a weapon – an essential requirement for regulation – starts to blur in this domain. Against this background, this thesis aims to answer how measures for the de-escalation of state-led conflicts in cyberspace and arms control of cyberweapons can be developed. To answer this question, the dissertation takes a specifically technical perspective on these problems and on the underlying political challenges of state behavior and international humanitarian law in cyberspace, in order to identify starting points for technical measures of transparency, arms control, and verification. Based on this approach of adapting existing technical measures from other fields of computer science, the thesis provides proof-of-concept approaches for some of these challenges, such as a classification system for cyberweapons based on technically measurable features, an approach for the mutual reduction of vulnerability stockpiles, and an approach to plausibly assure non-involvement in a cyberconflict as a de-escalation measure. All these initial approaches, and the questions of how and by which measures arms control and conflict reduction can work for cyberspace, are still quite new and have so far been the subject of relatively little debate. Indeed, the approach of deliberately restricting the capabilities of technology in order to serve a larger goal, such as the reduction of its destructive use, is not yet very common in the engineering thinking of computer science. Therefore, this dissertation also aims to provide some impulses regarding the responsibility and creative options of computer science with a view to the peaceful development and use of cyberspace.