The text is reading you: teaching language in the age of the algorithm
Most accounts of the way digital technologies have changed practices of reading and writing have focused on surface aspects of digital texts (such as hypertextuality, multimodality and the development of new registers). There are, however, less visible aspects of digital communication environments that have had an equally profound effect on reading and writing, namely the algorithms that lie behind texts, which monitor the actions of readers and writers and alter the form and content of the texts they are exposed to. Algorithms have the potential to affect not just local communication practices, but also broader social practices, as they work to encourage and reinforce patterns of language use, communication and consumption. This paper describes the results of a two-year-long participatory project in which university students in Hong Kong and the United Kingdom explored the communication and inference-forming practices they engage in when interacting with algorithms. The participants articulated six primary metaphors through which they and their classmates understand how algorithms work: 1) Algorithm as agent; 2) Algorithm as authority; 3) Algorithm as adversary; 4) Algorithm as communicative resource; 5) Algorithm as audience; and 6) Algorithm as oracle. Engaging learners in articulating the “folk beliefs” that govern people’s interaction with algorithms, it is argued, can contribute to the development of the kinds of digital literacies they will need to better understand the ways algorithms affect the kinds of information they are exposed to, the kinds of inferences they form about this information, and the ways their own acts of reading and writing can be used by algorithms to manipulate them.
The value of critical destruction: Evaluating multispectral image processing methods for the analysis of primary historical texts
Multispectral imaging, a method for acquiring image data over a series of wavelengths across the light spectrum, is becoming a valuable tool within the cultural heritage sector for the recovery and enhancement of information contained within primary historical texts. However, most applications of this technique to date have been bespoke, analysing particular documents of historic importance. Little prior work has evaluated this technique in a structured fashion so as to provide recommendations on how best to capture and process images when working with damaged and abraded textual material. This paper introduces a new approach for evaluating the efficacy of image processing algorithms in recovering information from multispectral images of deteriorated primary historical texts. We present a series of experiments that deliberately degrade samples cut from a real historical document to provide a set of images acquired before and after damage. These images then allow us to compare, both objectively and quantitatively, the effectiveness of multispectral imaging and image processing for recovering information from damaged text. We develop a methodological framework for the continuing study of the techniques involved in the analysis and processing of multispectral images of primary historical texts, and a dataset which will be of use to others interested in advanced digitisation techniques within the cultural heritage sector.
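The abstract above does not name the specific metrics used for its objective, quantitative before/after comparison; one common choice for this kind of image comparison is peak signal-to-noise ratio (PSNR), used here purely as an illustrative stand-in. A minimal sketch, assuming grayscale images flattened to lists of 0–255 pixel values (the `psnr` helper and the toy patches are hypothetical, not from the paper):

```python
import math

def psnr(reference, test, max_value=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences.

    Higher PSNR means the processed image is closer to the undamaged
    reference; identical images give infinity.
    """
    if len(reference) != len(test):
        raise ValueError("images must have the same number of pixels")
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_value ** 2 / mse)

# Toy example: a flat grey patch before damage, and a recovered version
# that is uniformly off by 10 grey levels.
before = [200.0] * 16
recovered = [210.0] * 16
print(round(psnr(before, recovered), 2))  # → 28.13
```

In the experimental design described above, such a metric would be computed between the image of a sample before deliberate degradation and the output of each image processing method applied after damage, allowing the methods to be ranked numerically rather than by visual inspection alone.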