

    Screen Genealogies

    Against the grain of the growing literature on screens, *Screen Genealogies* argues that the present excess of screens cannot be understood as an expansion and multiplication of the movie screen or of the video display. Rather, screens continually exceed the optical histories in which they are most commonly inscribed. As contemporary screens become increasingly decomposed into a distributed field of technologically interconnected surfaces and interfaces, we more readily recognize the deeper spatial and environmental interventions that have long been a property of screens. For most of its history, a screen was a filter, a divide, a shelter, or a camouflage. A genealogy stressing transformation and descent rather than origins and roots emphasizes a deeper set of intersecting and competing definitions of the screen, enabling new thinking about what the screen might yet become.

    FinBook: literary content as digital commodity

    This short essay explains the significance of the FinBook intervention, and invites the reader to participate. We have associated each chapter within this book with a financial robot (FinBot), and created a market whereby book content will be traded with financial securities. As human labour increasingly consists of unstable and uncertain work practices and as algorithms replace people on the virtual trading floors of the world’s markets, we see members of society taking advantage of FinBots to invest and make extra funds. Bots of all kinds are making financial decisions for us, searching online on our behalf to help us invest, to consume products and services. Our contribution to this compilation is to turn the collection of chapters in this book into a dynamic investment portfolio, and thereby play out what might happen to the process of buying and consuming literature in the not-so-distant future. By attaching identities (through QR codes) to each chapter, we create a market in which the chapter can ‘perform’. Our FinBots will trade based on features extracted from the authors’ words in this book: the political, ethical and cultural values embedded in the work, and the extent to which the FinBots share authors’ concerns; and the performance of chapters amongst those human and non-human actors that make up the market, and readership. In short, the FinBook model turns our work and the work of our co-authors into an investment portfolio, mediated by the market and the attention of readers. By creating a digital economy specifically around the content of online texts, our chapter and the FinBook platform aim to challenge the reader to consider how their personal values align them with individual articles, and how these become contested as they perform different value judgements about the financial performance of each chapter and the book as a whole.
At the same time, by introducing ‘autonomous’ trading bots, we also explore the differing ‘network’ affordances of paper-based books, whose scarcity derives from their analogue form, and digital books, whose uniqueness is achieved through encryption. We thereby speak to wider questions about the conditions of an aggressive market in which algorithms subject cultural and intellectual items – books – to economic parameters, and the increasing ubiquity of data bots as actors in our social, political, economic and cultural lives. We understand that our marketization of literature may be an uncomfortable juxtaposition with the conventionally imagined way a book is created, enjoyed and shared: it is intended to be.
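The mechanism the abstract describes (a chapter identity, textual features, and a bot that prices attention) can be sketched as a toy. This is purely illustrative: the class names, feature set, and weights are our own assumptions, not the FinBook implementation.

```python
# Toy sketch of the FinBook idea (all names and weights are hypothetical):
# each chapter carries an identity (standing in for its QR code), a FinBot
# extracts crude features from its text, and prices it with an attention premium.
from dataclasses import dataclass

@dataclass
class Chapter:
    chapter_id: str   # stands in for the QR-code identity attached to the chapter
    text: str
    attention: int    # e.g. number of reader scans of the QR code

def extract_features(chapter: Chapter) -> dict:
    """Crude textual features a FinBot might trade on."""
    words = chapter.text.lower().split()
    return {
        "length": len(words),
        # proxy for the 'political, ethical and cultural values' in the text
        "value_terms": sum(w in {"ethics", "labour", "market"} for w in words),
    }

def finbot_price(chapter: Chapter, base: float = 10.0) -> float:
    """Price = base + weighted features + attention premium (illustrative weights)."""
    f = extract_features(chapter)
    return base + 0.01 * f["length"] + 2.0 * f["value_terms"] + 0.5 * chapter.attention

ch = Chapter("QR-07", "Algorithms reshape labour and the market for ethics", attention=4)
print(round(finbot_price(ch), 2))
```

A real market would of course let many such bots trade against each other; the point here is only how a chapter's 'performance' could be made a function of its content and its readership.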

    Books of Life: Post-DNA Life Science in 1960s American Fiction

    Following the discovery of the DNA double helix in 1953, concepts of genetic code and program emerged to redefine life. A range of complementary assumptions—about the cryptographic behavior of language, the transcriptional nature of creative writing, and the mechanistic constitution of the human organism—buttressed this new, textual explanation for living beings. In this dissertation, I analyze how 1960s fiction by three writers (Ken Kesey’s One Flew Over the Cuckoo’s Nest, John Barth’s Giles Goat-Boy, and Chester Himes’s detective novels) responds to this epistemic shift within the life sciences. While the loudly heralded “genomic book of life” written in the double helix appeared to co-opt the novel’s age-old endeavor to describe life, it also proved a compelling invitation to writers who could reconceive these molecular metaphors as compositional resources. Drawing on intellectual histories of the post-WWII life sciences to establish the heavily rhetorical character of this episode in biology, I demonstrate how Kesey, Barth, and Himes mobilized biological metaphors to dual purpose. By employing these new concepts to parody the anachronistic organic logics of literary criticism, they challenged received notions of literary form. Simultaneously, they harnessed the truth-value of scientific metaphors in a complex speculative impulse, which, by taking the new biology’s claims literally, satirized the rhetorical bombast of scientific discourse while flaunting the period’s nostalgic literary-critical investments in the “Great American Novel.” Each text pursues post-DNA biological theory as theme and formal architecture, but ultimately arrives at a more fundamental reckoning with the poetics of literality that, at this historical juncture, worked to elide the distance between life and text. 
These analyses contribute to critical conversations around the Anthropocene, posthumanism, scale critique, biopolitics, and comparative methods for the interdisciplinary study of science and literature. They also promise to complicate dominant accounts of the postwar novel that have tended to minimize the contributions of 1960s writers, and to augment our understanding of the postwar novel’s debts to contemporaneous scientific discourse.

    Spimes: A Multidimensional Lens for Designing Future Sustainable Internet Connected Devices

    There are numerous loud and powerful voices promoting the Internet of Things (IoT) as a catalyst for changing many aspects of our lives for the better. Healthcare, energy, transport, finance, entertainment and in the home – billions of everyday objects across all sorts of sectors are being connected to the Internet to generate data so that we can make quicker and more efficient decisions about many facets of our lives. But is this technological development completely benign? I argue that, despite all their positive potential, IoT devices are still being designed, manufactured and disposed of in the same manner that most other ‘non-connected’ consumer products have been for decades – unsustainably. Further, while much fanfare is made of the IoT’s potential utility for reducing energy usage through pervasive monitoring, little discourse recognises the intrinsically unsustainable nature of the IoT devices themselves. In response to this growing unsustainable product culture, my thesis centres on the role that sustainability can potentially play in the design of future IoT devices. I propose the recharacterisation of IoT devices as spimes in order to provide an alternative approach for facilitating sustainable Internet-connected product design practice. The concept of spimes was first introduced in 2004 by the futurist Bruce Sterling and then outlined further a year later in his book Shaping Things. When viewed simply, a spime would be a type of near future, internet-connected device which marries physical and digital elements with innate sustainable characteristics. Whereas the majority of sustainable design theory and practice has focused on the development of sustainable non-connected devices, a credible strategy for the design of environmentally friendly Internet-connected physical objects has yet to be put forward. 
In light of this, I argue that now is the right time to develop the spimes concept in greater depth so that it may begin to serve as a viable counterpoint to the increasing unsustainability of the IoT. To make this case, my thesis explores three key questions: What are spimes? Can we begin to design spimes? And what does spime-orientated research mean for unsustainable Internet-connected design practice? I outline how, in order to explore these questions, I utilised a Research through Design approach to unpack and augment the notion of spimes through three Design Fiction case studies. Each case study concretises different key design criteria for spime devices, while also probing the broader implications that could arise as a result of adopting such spime designs in the near future. I discuss the significance of reflecting upon my Spime-based Design Fiction Practice and how this enabled me to develop the spimes concept into a multidimensional lens which, I contend, other designers can potentially harness as a means to reframe their IoT praxis with sustainability baked in. The key aspects of my process and its outputs are also summarised in the form of a design manifesto, with the aim of inspiring prospective designers and technologists to create future sustainable Internet-connected devices.

    Process Bifurcation and the Digital Chain in Architecture

    This thesis investigates the impact of digital technology on the methods of design and production in architecture. Through research of history, theory, technology, and methods, the work determines whether the current use of new technologies should be considered an iterative development of architectural practice, or a radically new paradigm. The computer has now matured into the primary working medium for architectural design. Digital tools are increasingly expanding their role, from design and visualization to empirical simulation and evaluation, digital fabrication, and on-site construction validation. This combination of linked tools into a single project-wide solution system is called the "digital chain". This procedure is a significant opportunity for process optimization, but it can also be seen as a catalyst to reinforce creative collaboration between disparate professionals in a design team. Because of this change to both methods and relationships, the digital chain and its components are understood to be disruptive technologies. The digital chain has caused a bifurcation in architectural productivity; because of this disruption, many theorists and critics claim that digital technologies now define a new era of architecture. This thesis seeks to understand whether this inflection can also be considered a new paradigm. The contemporary computer, as an "information machine", is characterized by three specific capabilities: control, prediction, and processing. Research and project work to examine the digital chain have been undertaken and categorized using these three investigation channels. Each topic for investigation has been instigated with a pedagogic work. Thereafter, additional "proof of concept" projects have been undertaken to advance the work to the professional level. 
The results and findings occur at two levels: practical production of architecture, but also conclusions about "digital learning" and the scope of technological adoption within the architectural profession. The conclusions of the thesis state that a technologically induced paradigm shift in architecture has not occurred. The current implementation of digital tools still only qualifies as iterative innovation of traditional methods. The technology and use of digital tools in architecture have developed significantly in the last decade. Emerging technologies, new methods, and the prognosis for future developments will have a significant effect on architectural design and production. Because architecture has a well-established precedent of "re-purposing" technology from other disciplines, the thesis concludes that there is strong potential that a paradigm shift may yet happen. By examining emerging innovations from other industries, it is possible to make highly informed predictions about incoming innovations in materials, production methods, and conceptual systems. This insight into the next wave of potential catalytic influences is presented in the discussion, and it is concluded that a broader range of new innovations may yet invoke a technologically driven paradigm shift in architecture.

    PenChain: A Blockchain-Based Platform for Penalty-Aware Service Provisioning

    Service provisioning is of paramount importance as we are now heading towards a world of integrated services giving rise to the next generation of service ecosystems. The huge number of service offerings that will be available to customers in future scenarios requires a novel approach to service registry and discovery that allows customers to choose the offerings that best match their preferences. One way to achieve this is to introduce the provider’s reputation, i.e., a quality indicator of the provisioned service, as an additional search criterion. Now, with blockchain technology in our hands, automated regulation of service-level agreements (SLAs) that capture mutual agreements between all involved parties has regained momentum. In this article, we report on our full-fledged work on the conception, design, and construction of a platform for SLA-minded service provisioning called PenChain. With our work, we demonstrate that penalty-aware SLAs of general services, if represented in machine-readable logic and assisted by distributed ledger technology, are programmatically enforceable. We devise algorithms for ranking services in a search result, taking into account the digitized values of the SLAs. We offer two scenario-based evaluations of PenChain in the field of precision agriculture and in the domain of automotive manufacturing. Furthermore, we examine the scalability and data security of PenChain for precision agriculture.
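The ranking idea the abstract describes, combining a customer's preference match with a reputation signal derived from on-ledger penalty records, can be sketched minimally. This is not the PenChain API; the class names, scoring formula, and weights below are our own assumptions for illustration.

```python
# Illustrative sketch (names and scoring are assumptions, not PenChain's):
# rank service offerings by a score that blends preference match with a
# reputation derived from digitized SLA penalty records.
from dataclasses import dataclass

@dataclass
class ServiceOffering:
    name: str
    preference_match: float   # 0..1, how well the offer matches the customer's query
    penalties_incurred: int   # SLA penalties recorded on the ledger
    agreements_total: int     # SLAs the provider has entered

def reputation(s: ServiceOffering) -> float:
    """Fraction of agreements fulfilled without penalty (1.0 if no history)."""
    if s.agreements_total == 0:
        return 1.0
    return 1.0 - s.penalties_incurred / s.agreements_total

def rank(offerings, w_pref=0.6, w_rep=0.4):
    """Sort descending by a weighted blend of preference match and reputation."""
    score = lambda s: w_pref * s.preference_match + w_rep * reputation(s)
    return sorted(offerings, key=score, reverse=True)

offers = [
    ServiceOffering("harvest-sensor-a", 0.9, 5, 10),  # strong match, poor SLA record
    ServiceOffering("harvest-sensor-b", 0.7, 0, 8),   # weaker match, clean record
]
print([s.name for s in rank(offers)])
```

With these weights, the clean-record provider outranks the better-matching but frequently penalized one, which is the behaviour a penalty-aware registry is after; the actual digitized SLA values and weighting would come from the machine-readable agreements themselves.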

    Not invented here: Power and politics in public key infrastructure (PKI) institutionalisation at two global organisations.

    This dissertation explores the impact of power and politics in Public Key Infrastructure (PKI) institutionalisation. We argue that this process can be understood in terms of power and politics because the infrastructure skews the control of organisational action in favour of dominant individuals and groups. Indeed, as our case studies show, shifting power balances are not only a desired outcome of PKI deployment; power itself drives institutionalisation. Therefore, despite the rational goals of improving security and reducing the total cost of ownership for IT, the PKIs in our field organisations have actually been catalysts for power and politics. Although current research focuses on external technical interoperation, we believe the emphasis should be on the interaction between the at once restrictive and flexible technical features of PKI, organisational structures, the goals of sponsors, and potential user resistance. We use the Circuits of Power (CoP) framework to explain how a PKI conditions and is conditioned by power and politics. Drawing on the concepts of infrastructure and institution, we submit that PKIs are politically explosive in pluralistic, distributed global organisations because, by limiting freedom of action in favour of stability and security, they set a stage for disaffection. The result of antipathy towards the infrastructure would not be a major concern if public key cryptography, which underpins PKI, had a centralised mechanism for enforcing the user discipline it relies on to work properly. However, since this discipline is not automatic, a PKI bereft of support from existing power arrangements faces considerable institutionalisation challenges. We assess these ideas in two case studies in London and Switzerland. In London, we explain how an oil company used its institutional structures to implement PKI as part of a desktop standard covering 105,000 employees. 
In Zurich and London, we give a power analysis of attempts by a global financial services firm to roll out PKI to over 70,000 users. Our dissertation makes an important contribution by showing that where PKI supporters engage in a shrewdly orchestrated campaign to knit the infrastructure into the existing institutional order, it becomes an accepted part of organisational life without much ceremony. In sum, we both fill gaps in the information security literature and extend knowledge of the efficacy of the Circuits of Power framework in conducting IS institutionalisation studies.
