
    LIPIcs, Volume 251, ITCS 2023, Complete Volume


    Posthuman Creative Styling: can a creative writer’s style of writing be described as procedural?

    This thesis is about creative styling — the styling a creative writer might use to make their writing unique. It addresses the question of whether such styling can be described as procedural. Creative styling is part of the technique a creative writer uses when writing. It is how they make the text more ‘lively’ through tips and tricks they have either learned or discovered. In essence these are rules, ones the writer accrues over time through practice. The thesis argues that the use and invention of these rules can be set out as procedures, and so describes creative styling as procedural. The thesis follows from questioning why machines and algorithms have, so far, been incapable of producing creative writing of value. Machine-written novels do not abound on the bookshelves, and writing styled by computers is, on the whole, dull in comparison to human-crafted literature. The project came about by asking how it would be possible to reach a point where writing by people and procedural writing are considered to have equal value. For this reason the thesis is set in a posthuman context, where the differences between machines and people are erased. The thesis uses practice to inform an original conceptual space model, based on quality dimensions and the dynamic inter-operation of spaces. This model gives an example of the procedures which a posthuman creative writer uses when engaged in creative styling. It suggests an original formulation for the conceptual blending of conceptual spaces, based on the casting of qualities from one space to another. In support of and informing its arguments are ninety-nine examples of creative writing practice which show the procedures by which style has been applied, created and assessed. It provides a route forward for further joint research into both computational and human-coded creative writing.
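
    The abstract does not spell out the blending formulation, but the idea of casting qualities from one conceptual space to another can be sketched concretely. The Python fragment below is a minimal, hypothetical illustration under assumed representations (a space as a named set of quality dimensions); the class `ConceptualSpace`, the function `cast`, and the example qualities are illustrative assumptions, not the thesis's actual model.

```python
# Hypothetical sketch: a conceptual space as a set of quality dimensions,
# and blending as casting selected qualities from a source space onto a target.
from dataclasses import dataclass, field

@dataclass
class ConceptualSpace:
    name: str
    qualities: dict[str, float] = field(default_factory=dict)  # dimension -> value

def cast(source: ConceptualSpace, target: ConceptualSpace,
         dimensions: list[str]) -> ConceptualSpace:
    """Blend by copying chosen quality dimensions from source into target."""
    blended = ConceptualSpace(
        name=f"{target.name}+{source.name}",
        qualities=dict(target.qualities),
    )
    for dim in dimensions:
        if dim in source.qualities:
            blended.qualities[dim] = source.qualities[dim]
    return blended

# Example: cast the 'tempo' and 'warmth' of a music space onto a prose space.
music = ConceptualSpace("music", {"tempo": 0.9, "warmth": 0.7})
prose = ConceptualSpace("prose", {"formality": 0.4, "warmth": 0.2})
print(cast(music, prose, ["tempo", "warmth"]).qualities)
```

    On this toy view, blending is a selective overwrite of quality values; the thesis's formulation is richer, but the sketch shows where ‘casting’ could be made computational.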

    Peering into the Dark: Investigating dark matter and neutrinos with cosmology and astrophysics

    The ΛCDM model of modern cosmology provides a highly accurate description of our universe. However, it relies on two mysterious components, dark matter and dark energy. The cold dark matter paradigm does not provide a satisfying description of its particle nature, nor any link to the Standard Model of particle physics. I investigate the consequences for cosmological structure formation in models with a coupling between dark matter and Standard Model neutrinos, as well as probes of primordial black holes as dark matter. I examine the impact that such an interaction would have through both linear perturbation theory and nonlinear N-body simulations. I present limits on the possible interaction strength from cosmic microwave background, large-scale structure, and galaxy population data, as well as forecasts of future sensitivity. I provide an analysis of what is necessary to distinguish the cosmological impact of interacting dark matter from other effects with similar signatures. Intensity mapping of the 21 cm line of neutral hydrogen at high redshift using next-generation observatories, such as the SKA, would provide the strongest constraints yet on such interactions, and may be able to distinguish between different scenarios causing suppressed small-scale structure. I also present a novel type of probe of structure formation, using the cosmological gravitational wave signal of high-redshift compact binary mergers to provide information about structure formation, and thus the behaviour of dark matter. Such observations would also provide competitive constraints. Finally, I investigate primordial black holes as an alternative dark matter candidate, presenting an analysis and framework for the evolution of extended mass populations over cosmological time and computing the present-day gamma ray signal, as well as the allowed local evaporation rate. This is used to set constraints on the allowed population of low-mass primordial black holes, and the likelihood of witnessing an evaporation.
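
    The abstract does not reproduce the perturbation equations, but in the interacting dark matter–neutrino literature the coupling typically enters linear theory as a drag term in the Euler equations of the two fluids. A representative form (conformal Newtonian gauge; notation varies by author, and the thesis's exact equations may differ) is

    \[
    \dot{\theta}_{\nu} = k^{2}\psi + k^{2}\!\left(\tfrac{1}{4}\delta_{\nu} - \sigma_{\nu}\right) - \dot{\mu}\,(\theta_{\nu} - \theta_{\chi}),
    \]
    \[
    \dot{\theta}_{\chi} = k^{2}\psi - \mathcal{H}\,\theta_{\chi} - S^{-1}\dot{\mu}\,(\theta_{\chi} - \theta_{\nu}),
    \qquad S \equiv \frac{3\rho_{\chi}}{4\rho_{\nu}},
    \]

    where \(\dot{\mu} = a\,\sigma_{\chi\nu}\,n_{\chi}\) is the scattering rate for an assumed constant cross section \(\sigma_{\chi\nu}\). The drag suppresses small-scale power, which is why the probes listed above are sensitive to the interaction strength.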

    Once More, With Feeling: Partnering With Learners to Re-see the College Experience Through Metaphor and Sensory Language

    This study focuses on better understanding students and their internal worlds through conceptual metaphor theory and sensory language. Using a phenomenological and arts-based approach, I examined students’ metaphorical constructions of their college experiences and the sensory language and information informing those constructions. By engaging participants in a multimodal process to re-see their experience through connoisseurship and criticism, I explored the following research questions: How do students metaphorically structure their college experience? What sensory language do college students use to describe the metaphorical dimensions of their college experience? How does sensory information shape the metaphorical structuring of their college experience? Through conversations centered on participant-generated images and chosen sensory language, I identified five complex metaphors that represented participants’ constructions of their college experience: college is an unwieldy package; college is up, forward, and out; college is current and future nostalgia; college is a prism; and college is a movie and peers are the soundtrack. By considering these themes, it may be possible for educators to better partner with diverse learners to design personally meaningful experiences that support student development and success. This dissertation is available in open access at AURA (https://aura.antioch.edu) and OhioLINK ETD Center (https://etd.ohiolink.edu).

    The politics of content prioritisation online: governing prominence and discoverability on digital media platforms

    This thesis examines the governing systems and industry practices shaping online content prioritisation processes on digital media platforms. Content prioritisation, and the relative prominence and discoverability of content, are investigated through a critical institutional lens as digital decision guidance processes that shape online choice architecture and influence users’ access to content online. This thesis thus shows how prioritisation is never neutral or static and cannot be explained solely by political economic or neoclassical economics approaches. Rather, prioritisation is dynamically shaped by the institutional environment and by the clash between existing media governance systems and those emerging for platform governance. As prioritisation processes influence how audiovisual media services are accessed online, posing questions about the public interest in such forms of intermediation is key. In that context, this research asks how content prioritisation is governed on digital media platforms, and what the elements of a public interest framework for these practices might be. To address these questions, I use a within-case-study comparative research design focused on the United Kingdom, collecting data by means of semi-structured interviews and document analysis. Through a thematic analysis, I then investigate how institutional arrangements influence both organisational strategies and interests, as well as the relationships among the industry and policy actors involved, namely platform organisations, pay-TV operators, technology manufacturers, content providers including public service media, and regulators. The results provide insights into the ‘black box’ of content prioritisation across three interconnected dimensions: technical, market, and regulatory. In each dimension, a battle between industry and policy actors emerges to influence prioritisation online. As the UK Government and regulator intend to develop new prominence rules, the dispute takes on a normative dimension and gives rise to contested visions of which audiovisual services should be prioritised to final users, and which private- and public-interest-driven criteria are (or should be) used to determine that. Finally, the analysis shows why it is crucial to reflect on how the public interest is interpreted and operationalised as new prominence regulatory regimes emerge, with a variety of sometimes contradictory implications for media pluralism, diversity and audience freedom of choice. The thesis therefore indicates the need for new institutional arrangements and a public-interest-driven framework for prioritisation on digital media platforms. Such a framework conceives of public interest content standards as an institutional imperative for media and platform organisations and prompts regulators to develop new online content regulation that is appropriate to changing forms of digital intermediation and emerging audiovisual market conditions. While the empirical focus is on the UK, the implications of the research findings are also considered in the light of developments in the European Union and Council of Europe initiatives that bear on the future discoverability of public interest media services and related prominence regimes.

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on the combined application of synthetic aperture radar (SAR) and deep learning technology. It aims to further promote the development of intelligent SAR image interpretation. A synthetic aperture radar is an important active microwave imaging sensor, whose day-and-night, all-weather imaging capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in the remote sensing community, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is valuable and meaningful, therefore, to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, autonomous driving, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations with multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to address these significant challenges and to present their innovative and cutting-edge research results on applying deep learning to SAR in various manuscript types, e.g., articles, letters, reviews and technical reports.
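
    As a concrete, hedged illustration of the kind of model this reprint surveys, the PyTorch sketch below defines a small convolutional classifier for single-channel SAR image patches. The architecture, the 64x64 patch size, and the ten-class output are illustrative assumptions, not a model taken from the reprint.

```python
# Minimal sketch: a small CNN for classifying single-channel SAR patches.
# Architecture, patch size (64x64), and class count (10) are assumptions.
import torch
import torch.nn as nn

class SARPatchClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # SAR amplitude: 1 channel
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32 -> 16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Usage: a batch of four 64x64 patches.
model = SARPatchClassifier()
logits = model(torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 10])
```

    In practice, SAR-specific issues such as speckle noise, complex-valued data, and scarce labelled samples motivate the more specialised architectures collected in this reprint.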

    Connected World: Insights from 100 academics on how to build better connections


    Twilight of the American State

    The sudden emergence of the Trump nation surprised nearly everyone, including journalists, pundits, political consultants, and academics. When Trump won in 2016, his ascendancy was widely viewed as a fluke. Yet time showed it was instead the rise of a movement—angry, militant, revanchist, and unabashedly authoritarian. How did this happen? Twilight of the American State offers a sweeping exploration of how law and legal institutions helped prepare the ground for this rebellious movement. The controversial argument is that, viewed as a legal matter, the American state is not just a liberal democracy, as most Americans believe. Rather, the American state is composed of an uneasy and unstable combination of different versions of the state—liberal democratic, administered, neoliberal, and dissociative. Each of these versions arose through its own law and legal institutions. Each emerged at a different time historically. Each was prompted by deficits in the prior versions. Each has survived displacement by succeeding versions. All remain active in the contemporary moment—creating the political-legal dysfunction America confronts today. Pierre Schlag maps out a big-picture view of the tribulations of the American state. The book abjures conventional academic frameworks, sets aside prescriptions for quick fixes, dispenses with lamentations about polarization, and bypasses historical celebrations of the American Spirit.

    Optimising multimodal fusion for biometric identification systems

    Biometric systems are automatic means of imitating the human brain’s ability to identify and verify other humans by their behavioural and physiological characteristics. A system which uses more than one biometric modality at the same time is known as a multimodal system. Multimodal biometric systems consolidate the evidence presented by multiple biometric sources and typically provide better recognition performance than systems based on a single biometric modality. This thesis addresses some issues related to the implementation of multimodal biometric identity verification systems. The thesis assesses the feasibility of using commercial off-the-shelf products to construct a deployable multimodal biometric system. It also identifies multimodal biometric fusion as a challenging optimisation problem when one considers the presence of several configurations and settings, in particular the verification thresholds adopted by each biometric device and the decision fusion algorithm implemented for a particular configuration. The thesis proposes a novel approach to the optimisation of multimodal biometric systems, based on the use of genetic algorithms to solve some of the problems associated with these different settings. The proposed optimisation method also addresses some of the problems associated with score normalisation. In addition, the thesis presents an analysis of the performance of different fusion rules when characterising the system users as sheep, goats, lambs and wolves. The results presented indicate that the proposed optimisation method can be used to solve the problems associated with threshold settings, demonstrating a valuable strategy for setting a priori thresholds for the different biometric devices before deployment. The proposed optimisation architecture also addresses the problem of score normalisation, enabling an effective “plug-and-play” approach to system implementation. The results further indicate that the optimisation approach can be used to determine effectively the weight settings, which are used in many applications to vary the relative importance of different performance parameters.
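
    The abstract does not reproduce the genetic algorithm itself, so the following Python sketch only illustrates the general idea: a toy GA (truncation selection plus Gaussian mutation, no crossover) searching over fusion weights and a decision threshold for weighted-sum fusion of two matchers’ scores. The data, the fitness function, and all names are hypothetical.

```python
# Hypothetical sketch: a toy genetic algorithm tuning fusion weights and a
# decision threshold for weighted-sum fusion of two biometric matchers.
import random

random.seed(0)

# Toy match scores: (score_modality_1, score_modality_2, is_genuine).
SAMPLES = [(random.gauss(0.7, 0.1), random.gauss(0.65, 0.1), True) for _ in range(50)] + \
          [(random.gauss(0.4, 0.1), random.gauss(0.45, 0.1), False) for _ in range(50)]

def fitness(genome):
    """Accuracy of weighted-sum fusion: accept if w1*s1 + w2*s2 >= threshold."""
    w1, w2, threshold = genome
    correct = sum((w1 * s1 + w2 * s2 >= threshold) == genuine
                  for s1, s2, genuine in SAMPLES)
    return correct / len(SAMPLES)

def mutate(genome, scale=0.05):
    """Gaussian perturbation of each gene, clamped to [0, 1]."""
    return tuple(min(1.0, max(0.0, g + random.gauss(0, scale))) for g in genome)

# Evolve a small population of (w1, w2, threshold) genomes.
population = [tuple(random.random() for _ in range(3)) for _ in range(20)]
for _ in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                  # truncation selection
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

best = max(population, key=fitness)
print(f"best genome {best}, accuracy {fitness(best):.2f}")
```

    A real system would optimise against error-rate targets (e.g., false accept/false reject trade-offs) over per-device score distributions, and would normalise scores across devices, rather than maximise raw accuracy on toy data.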