Animate Being: Extending a Practice of the Image to New Mediums via Speculative Game Design
This post-disciplinary practice-as-research thesis examines the potential of Carl Jung's therapeutic method of active imagination as a strategy for engaging with an increasingly complex and interconnected technological reality. Embracing a non-clinical, practice-driven approach, I harness James Hillman's notion of the image and the imaginal to investigate the interdisciplinary capacity and ethical dimensions of an expansive mode of image-work. My approach to practice theoretically and practically intertwines analytical psychology, feminist worlding and design speculation. Building upon Susan Rowland's work, I study image-work as an ecological alchemical craft that seeks to matter the immaterial. Through the cyclic, iterative design of a video game, I mobilise and respond to image-work as a mode of myth-making that may facilitate dialogue between human and non-human intelligences. Departing from the essentialism of the hero's journey, I adopt Le Guin's Carrier Bag (1986/2019) as a feminist video game form; utilising the framework of a video game (Bogost, 2007; Flanagan, 2013), I transform the alchemical processes of image-work into novel interactive game mechanics. The game I design is both a vessel and a portal to an imaginal ecological realm: an open-world, procedurally generated 'living world' sandbox exploration game. It integrates real-time, real-world data streams to invite the non-human to enter into play as player two, facilitating experimentation with possible new forms of cross-species dialogue, collaboration and healing.
How to deploy security mechanisms online (consistently)
To mitigate a myriad of Web attacks, modern browsers support client-side security policies shipped through HTTP response headers. To enforce these policies, the operator sets response headers that the server then communicates to the client. We have shown that one of these mechanisms, the Content Security Policy (CSP), requires massive engineering effort to be deployed in a non-trivially bypassable way. As a result, many policies deployed on Web sites are misconfigured. Because CSP can also defend against framing-based attacks, its functionality overlaps with that of the X-Frame-Options header. We have shown that this overlap leads to inconsistent behavior across browsers, as well as inconsistent deployment on real-world Web applications. Overloaded defense mechanisms are not the only source of security inconsistencies: we showed that, owing to the structure of the Web itself, misconfigured origin servers or geolocation-dependent CDN caches can also cause unwanted security inconsistencies. Given the high number of CSP misconfigurations, we also took a closer look at the deployment process of the mechanism. By conducting a semi-structured interview study, including a coding task, we were able to shed light on the motivations, strategies, and roadblocks of CSP deployment. However, due to the wide usage of CSP, drastic changes are generally considered impractical. We therefore also evaluated whether one of the newest Web security features, Trusted Types, can be improved.
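A minimal sketch of the kind of "trivially bypassable" CSP configurations the thesis refers to, as an illustrative checker (the function and its warning strings are invented for this example, not taken from the work): a `script-src` allowing `'unsafe-inline'` without a nonce or hash, or wildcard sources, can be bypassed by any injected inline script.

```python
# Illustrative CSP audit sketch: flag script-src configurations that
# are trivially bypassable. Names and messages are hypothetical.
def audit_csp(policy: str) -> list[str]:
    """Return warnings for a serialized CSP header value."""
    warnings = []
    directives = {}
    for part in policy.split(";"):
        tokens = part.split()
        if tokens:
            directives[tokens[0]] = tokens[1:]

    # Browsers fall back to default-src when script-src is absent.
    sources = directives.get("script-src", directives.get("default-src", []))
    has_nonce_or_hash = any(
        s.startswith(("'nonce-", "'sha256-", "'sha384-", "'sha512-"))
        for s in sources
    )
    if "'unsafe-inline'" in sources and not has_nonce_or_hash:
        warnings.append("script-src allows 'unsafe-inline' (trivially bypassable)")
    if "*" in sources or "data:" in sources:
        warnings.append("script-src allows overly broad sources ('*' or data:)")
    if not sources:
        warnings.append("no script-src or default-src: scripts are unrestricted")
    return warnings
```

A policy such as `script-src 'self' 'nonce-abc123'` passes cleanly, while `script-src 'self' 'unsafe-inline'` is flagged.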
Data replication and update propagation in XML P2P data management systems
XML P2P data management systems are P2P systems that use XML as the underlying data format shared between peers in the network. These systems aim to bring the benefits of XML and P2P systems to the distributed data management field. However, P2P systems are known for their lack of central control and high degree of autonomy. Peers may leave the network at will at any time, increasing the risk of data loss. Despite this, most research in XML P2P systems focuses on novel and efficient XML indexing and retrieval techniques; mechanisms for ensuring data availability in XML P2P systems have received comparatively little attention. This project attempts to address this issue. We design an XML P2P data management framework to improve data availability. This framework includes mechanisms for wide-spread data replication, replica location and update propagation. It allows XML documents to be broken down into fragments. By doing so, we aim to reduce the cost of replicating data by distributing smaller XML fragments throughout the network rather than entire documents. To tackle the data replication problem, we propose a suite of selection and placement algorithms that may be interchanged to form a particular replication strategy. To support the placement of replicas anywhere in the network, we use a Fragment Location Catalogue, a global index that maintains the locations of replicas. We also propose a lazy update propagation algorithm to propagate updates to replicas. Experiments show that the data replication algorithms improve data availability in our experimental network environment. We also find that breaking XML documents into smaller pieces and replicating those instead of whole documents considerably reduces the replication cost, but at the price of some loss in data availability. For the update propagation tests, we find that the probability that queries return up-to-date results increases, but improvements to the algorithm are necessary to handle environments with high update rates.
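The trade-off of lazy update propagation can be sketched as follows (class and method names are invented for illustration and are not the framework's API): an update is applied at one replica immediately and deferred for the others, so queries against those replicas return stale data until a propagation step runs.

```python
# Hypothetical sketch of lazy update propagation for replicated
# XML fragments; names are illustrative, not from the thesis.
class Replica:
    def __init__(self):
        self.version = 0
        self.content = ""

class FragmentReplicaSet:
    """One XML fragment replicated across several peers."""
    def __init__(self, n_replicas: int):
        self.replicas = [Replica() for _ in range(n_replicas)]
        self.pending = []  # updates not yet pushed to all replicas

    def update(self, new_content: str):
        # Apply at one replica immediately; defer the rest.
        primary = self.replicas[0]
        primary.version += 1
        primary.content = new_content
        self.pending.append((primary.version, new_content))

    def propagate(self):
        # Lazy step: run periodically or when peers reconnect.
        for version, content in self.pending:
            for r in self.replicas[1:]:
                if r.version < version:
                    r.version, r.content = version, content
        self.pending.clear()

    def query(self, replica_idx: int) -> tuple[int, str]:
        r = self.replicas[replica_idx]
        return r.version, r.content
```

Between `update` and `propagate`, a query routed to a non-primary replica observes the old version, which is why higher update rates degrade the probability of up-to-date results.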
Lockdown Cultures
Lockdown Cultures is both a cultural response to our extraordinary times and a manifesto for the arts and humanities and their role in our post-pandemic society.
This book offers a unique response to the question of how the humanities commented on and were impacted by one of the dominant crises of our times: the Covid-19 pandemic. While the role of engineers, epidemiologists and, of course, medics is assumed, Lockdown Cultures illustrates some of the ways in which the humanities understood and analysed 2020–21, the year of lockdown and plague. Though the impulse behind the book was topical, underpinning the richly varied and individual essays is a lasting concern with the value of the humanities in the twenty-first century. Each contributor approaches this differently but there are two dominant strands: how art and culture can help us understand the Covid crisis; and how the value of the humanities can be demonstrated by engaging with cultural products from the past.
The result is a book that serves as testament to the humanities’ reinvigorated and reforged sense of identity, from the perspective of UCL and one of the leading arts and humanities faculties in the world. It bears witness to a globally impactful event while showcasing interdisciplinary thinking and examining how the pandemic has changed how we read, watch, write and educate. More than thirty individual contributions collectively reassert the importance of the arts and humanities for contemporary society
Fiskerton
Fiskerton, located in the Witham valley of Lincoln, is one of only a handful of excavated sites in Europe to reveal the Iron Age practice of ritually destroying special and elite objects by placing them in a body of water. This volume reports on the 1981 excavations on the bank of the River Witham and provides fascinating insights into this important aspect of Iron Age religion and culture. A remarkable group of Iron Age and Roman artefacts was found in association with a wooden causeway in use from at least 457 to 321 BC, including bronze and iron weapons and tools (some decorated with ornamental motifs), bone tools, stone tools, jewellery and pottery. The Iron Age finds are earlier than those from similar watery sites such as La Tène in Switzerland and Llyn Cerrig Bach in Wales, and the precise dating of the Fiskerton causeway by dendrochronology establishes it as one of the earliest known structures in Europe belonging to the La Tène culture. This report provides detailed descriptions of the Iron Age, Roman and Medieval artefacts and the human and animal bones found at the site. The authors compare the Fiskerton evidence with other British, Irish and European examples of ritual or votive deposition in water; they discuss the construction and the appearance of the causeway; and they examine the significance of Fiskerton as a religious site, especially in terms of its topographical context, as a river crossing and as a boundary or liminal area between mainland Britain and the former island of Lindsey.
Toward High-Performance Blockchains
The decentralized nature of blockchains has attracted many applications to build atop them, such as cryptocurrencies, smart contracts, and non-fungible tokens. The health and performance of the underlying blockchain systems considerably influence these applications. Bootstrapping new nodes by replaying all transactions on the ledger is not sustainable for ever-growing blockchains. In addition, poor performance impedes the adoption of blockchains in large-scale applications with high transaction rates.
First, in order to address the bootstrapping problem of already-deployed UTXO-based blockchains, this thesis proposes a snapshot synchronization approach. This approach allows new nodes to synchronize themselves with the rest of the network by downloading a snapshot of the system state, thereby avoiding verifying transactions since the genesis block. In addition, snapshots are stored efficiently on disk by taking advantage of the system state database.
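The core idea of snapshot synchronization can be illustrated with a small sketch (the function names and the JSON-based commitment are invented for this example; the thesis's actual snapshot format and verification differ): the new node accepts a downloaded state snapshot only if its digest matches a trusted state commitment, so no transaction replay from genesis is needed.

```python
# Illustrative sketch of snapshot-based bootstrapping, not the
# thesis's actual protocol.
import hashlib
import json

def state_commitment(utxo_set: dict) -> str:
    # Deterministic digest over the serialized system state.
    blob = json.dumps(utxo_set, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def bootstrap_from_snapshot(snapshot: dict, trusted_commitment: str) -> dict:
    """Accept the snapshot only if it matches the trusted commitment."""
    if state_commitment(snapshot) != trusted_commitment:
        raise ValueError("snapshot does not match the trusted state commitment")
    return snapshot  # node state, obtained without replaying from genesis
```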
Second, although sharding improves the performance of blockchains by distributing the workload among shards, it leaves the duplicated efforts within a shard unhandled. Specifically, every node has to verify all transactions on the ledger of its shard, thus limiting shard performance to the processing power of individual nodes. Aiming to improve the performance of individual shards, this thesis proposes Collaborative Transaction Verification, which enables nodes to share transaction verification results and thus reduces the per-node workload. Dependency graphs are employed to ensure that nodes reach the same system state despite different transaction verification and execution orders.
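The role of the dependency graph can be sketched as follows (a simplified toy, not Collaborative Transaction Verification itself): transactions touching the same key conflict and must be ordered deterministically, while independent transactions may be verified and executed in any order, and the final state is the same either way.

```python
# Toy sketch of order-independent execution via conflict grouping;
# the transaction format and ordering rule are invented.
from collections import defaultdict

def apply_tx(state, key, op, arg):
    if op == "set":
        state[key] = arg
    elif op == "add":
        state[key] = state.get(key, 0) + arg

def execute_deterministically(txs, state):
    """txs: iterable of (tx_id, key, op, arg). Mutates and returns state."""
    # Group conflicting transactions by the key they touch.
    by_key = defaultdict(list)
    for tx in txs:
        by_key[tx[1]].append(tx)
    # Fix a deterministic order (here: by tx_id) within each conflict
    # group; groups are independent, so their relative order is free.
    for key in sorted(by_key):
        for tx_id, k, op, arg in sorted(by_key[key]):
            apply_tx(state, k, op, arg)
    return state
```

Feeding the same transactions in two different arrival orders yields an identical final state, which is the property the dependency graph guarantees despite nodes verifying in different orders.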
Finally, cross-shard transactions rely on expensive atomic commit protocols to ensure inter-shard state consistency, thus impairing the performance of sharded blockchains. This thesis explores ways of lessening the impact of cross-shard transactions. On the one hand, a dependency-aware transaction placement algorithm is proposed to reduce cross-shard transactions. On the other hand, the processing cost of the remaining cross-shard transactions is reduced by optimizing the atomic commit protocol and parallelizing dependent transaction verification with the atomic commit protocol.
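A greedy sketch conveys the intuition behind dependency-aware placement (this toy heuristic is invented for illustration and is not the algorithm proposed in the thesis): co-locating accounts that transact together turns would-be cross-shard transactions into intra-shard ones.

```python
# Hypothetical greedy placement sketch: co-locate transacting
# accounts, balancing new pairs onto the least-loaded shard.
def place_accounts(txs, n_shards):
    """txs: list of (sender, receiver). Returns account -> shard map."""
    shard_of = {}
    load = [0] * n_shards

    def assign(account, shard):
        shard_of[account] = shard
        load[shard] += 1

    for a, b in txs:
        if a in shard_of and b not in shard_of:
            assign(b, shard_of[a])          # co-locate with known partner
        elif b in shard_of and a not in shard_of:
            assign(a, shard_of[b])
        elif a not in shard_of and b not in shard_of:
            s = load.index(min(load))       # least-loaded shard
            assign(a, s)
            assign(b, s)
    return shard_of

def cross_shard_count(txs, shard_of):
    # Transactions whose endpoints live on different shards need the
    # (expensive) atomic commit protocol.
    return sum(1 for a, b in txs if shard_of[a] != shard_of[b])
```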
The above techniques are devoted to addressing the bootstrapping and performance problems of blockchains. Our evaluation shows that the first technique can significantly expedite the initial synchronization of new nodes, and the other techniques can greatly boost the performance of sharded blockchains
Testing Ontology Embedding Visualization
This dissertation presents an experiment conducted with human participants on human-information interaction with visualizations of ontologies. The research question is whether embedding visualizations or graph-based visualizations lead to better task performance for human-information interaction. A literature review of word embeddings, information retrieval applications, Cartesian and radial visualizations, and knowledge graph visualizations is conducted, grounded in a facet analysis of the intersecting topics of the central research question. The context of embeddings as used for information retrieval in the 20th century, as opposed to more recent 21st-century inventions such as Google's word2vec, is explored. A training ontology, the African Wildlife Ontology (AWO), was selected and extended using public lexical resources taken from the internet to include classes of common African plants and animals. This ontology was then visualized both as vector-space embeddings and as a classical graph visualization. Participants were presented with one of four different knowledge graph visualizations: WebVOWL, OntoGraf, SquareVis and CircleVis, and had to perform a specific information retrieval task: to record as many African animals as they could find on the chart. The results are analyzed in terms of precision, recall, spam and average time. Although ultimately the results do not reject the null hypothesis, there is an opportunity for further research in the visualization of embeddings of knowledge graphs, especially for information retrieval.
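The precision and recall analysis of the retrieval task can be computed as in the following sketch (the example entries are invented; a non-animal recorded by a participant, such as a plant, counts against precision, which is presumably what the "spam" metric captures):

```python
# Minimal precision/recall computation for the animal-finding task.
def precision_recall(found: set, relevant: set) -> tuple[float, float]:
    true_positives = len(found & relevant)
    precision = true_positives / len(found) if found else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall
```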