
    Variable Format: Media Poetics and the Little Database

    This dissertation explores the situation of twentieth-century art and literature becoming digital. Focusing on relatively small online collections, I argue for materially invested readings of works of print, sound, and cinema from within a new media context. With bibliographic attention to the avant-garde legacy of media specificity and the little magazine, I argue that the “films,” “readings,” “magazines,” and “books” indexed on a series of influential websites are marked by meaningful transformations that continue to shape the present through a dramatic reconfiguration of the past. I maintain that the significance of an online version of a work is not only transformed in each instance of use, but that these versions fundamentally change our understanding of each historical work in turn. Here, I offer the analogical coding of these platforms as “little databases” after the little magazines that served as the vehicle of modernism and the historical avant-garde. Like the study of the full run of a magazine, these databases require a bridge between close and distant reading. Rather than contradicting each other, as is often argued, in this instance a combined macro- and microscopic mode of analysis yields valuable information not readily available by either method in isolation. In both directions, the social networks and technical protocols of database culture inscribe the limits of potential readings. Bridging the material orientation of bibliographic study with the format theory of recent media scholarship, this work constructs a media poetics for reading analog works situated within the windows, consoles, and networks of the twenty-first century.

    Personality Identification from Social Media Using Deep Learning: A Review

    Social media enables the sharing of ideas and information among people scattered around the world and thus helps in creating communities, groups, and virtual networks. Identifying personality is significant in many applications, such as detecting a person's mental state or character, predicting job satisfaction and the success of professional and personal relationships, and building recommendation systems. Personality is also an important factor in explaining individual variation in thoughts, feelings, and conduct. According to the 2018 Global social media research survey, there were approximately 3.196 billion social media users worldwide, and the numbers are expected to grow rapidly with the spread of smart mobile devices and advances in technology. Support vector machines (SVM), Naive Bayes (NB), multilayer perceptron neural networks, and convolutional neural networks (CNN) are some of the machine learning techniques used for personality identification in the literature. This paper presents the various studies conducted on identifying the personality of social media users with the help of machine learning approaches, and reviews recent studies that aim to predict the personality of online social media (OSM) users.
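    Among the techniques the review names, Naive Bayes is the simplest to illustrate. The sketch below is a minimal multinomial Naive Bayes text classifier in pure Python; the toy corpus, trait labels, and function names are hypothetical, chosen only to show the bag-of-words-plus-log-posterior mechanics, not any system from the reviewed studies.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy corpus: posts labeled with one binary personality
# trait ("extravert" vs "introvert"); texts are illustrative only.
TRAIN = [
    ("love parties and meeting new people", "extravert"),
    ("had a great time at the concert with friends", "extravert"),
    ("enjoy quiet evenings reading alone", "introvert"),
    ("prefer staying home with a good book", "introvert"),
]

def train_nb(examples):
    """Collect per-class word counts, class priors, and the vocabulary."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    vocab = set()
    for text, label in examples:
        class_counts[label] += 1
        for w in text.split():
            word_counts[label][w] += 1
            vocab.add(w)
    return word_counts, class_counts, vocab

def predict(text, word_counts, class_counts, vocab):
    """Return the class with the highest log-posterior (Laplace smoothing)."""
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label, c in class_counts.items():
        lp = math.log(c / total)                      # log prior
        n = sum(word_counts[label].values())          # words seen in class
        for w in text.split():
            lp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

word_counts, class_counts, vocab = train_nb(TRAIN)
print(predict("meeting friends at a party", word_counts, class_counts, vocab))
```

The reviewed systems replace this hand-rolled counting with feature pipelines and learned models (SVM, MLP, CNN), but the structure of the task — text in, trait label out — is the same.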

    The Cryptographic Imagination

    Originally published in 1996. In The Cryptographic Imagination, Shawn Rosenheim uses the writings of Edgar Allan Poe to pose a set of questions pertaining to literary genre, cultural modernity, and technology. Rosenheim argues that Poe's cryptographic writing—his essays on cryptography and the short stories that grew out of them—requires that we rethink the relation of poststructural criticism to Poe's texts and, more generally, reconsider the relation of literature to communication. Cryptography serves not only as a template for the language, character, and themes of much of Poe's late fiction (including his creation, the detective story) but also as a "secret history" of literary modernity itself. "Both postwar fiction and literary criticism," the author writes, "are deeply indebted to the rise of cryptography in World War II." Still more surprising, in Rosenheim's view, Poe is not merely a source for such literary instances of cryptography as the codes in Conan Doyle's "The Dancing-Men" or in Jules Verne, but, through his effect on real cryptographers, Poe's writing influenced the outcome of World War II and the development of the Cold War. However unlikely such ideas sound, The Cryptographic Imagination offers compelling evidence that Poe's cryptographic writing clarifies one important avenue by which the twentieth century called itself into being. "The strength of Rosenheim's work extends to a revisionistic understanding of the entirety of literary history (as a repression of cryptography) and then, in a breathtaking shift of register, interlinks Poe's exercises in cryptography with the hyperreality of the CIA, the Cold War, and the Internet. What enables this extensive range of applications is the stipulated tension Rosenheim discerns in the relationship between the forms of the literary imagination and the condition of its mode of production. 
Cryptography, in this account, names the technology of literary production—the diacritical relationship between decoding and encoding—that the literary imagination dissimulates as hieroglyphics—the hermeneutic relationship between a sign and its content."—Donald E. Pease, Dartmouth College

    The Advanced Framework for Evaluating Remote Agents (AFERA): A Framework for Digital Forensic Practitioners

    Digital forensics experts need a dependable method for evaluating evidence-gathering tools. Limited research and resources challenge this process, and the lack of multi-endpoint data validation hinders reliability in distributed digital forensics. A framework was designed to evaluate distributed agent-based forensic tools while enabling practitioners to self-evaluate and demonstrate evidence reliability as required by the courts. Grounded in Design Science, the framework features guidelines, data, criteria, and checklists. Expert review enhances its quality and practicality.

    On Improving Generalization of CNN-Based Image Classification with Delineation Maps Using the CORF Push-Pull Inhibition Operator

    Deployed image classification pipelines typically depend on images captured in real-world environments. This means that images might be affected by different sources of perturbation (e.g. sensor noise in low-light environments). The main challenge arises from the fact that image quality directly impacts the reliability and consistency of classification tasks, and this challenge has hence attracted wide interest within the computer vision communities. We propose a transformation step that attempts to enhance the generalization ability of CNN models in the presence of unseen noise in the test set. Concretely, the delineation maps of given images are determined using the CORF push-pull inhibition operator. Such an operation transforms an input image into a space that is more robust to noise before it is processed by a CNN. We evaluated our approach on the Fashion MNIST data set with an AlexNet model. The proposed CORF-augmented pipeline achieved results on noise-free images comparable to those of a conventional AlexNet classification model without CORF delineation maps, but it consistently achieved significantly superior performance on test images perturbed with different levels of Gaussian and uniform noise.
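    The full CORF operator works on 2-D images with spatially arranged excitatory and inhibitory subfields. As a loose 1-D illustration of the push-pull principle only — not the operator from the paper — the sketch below rectifies a filter's response to a signal (the "push") and subtracts a weighted rectified response to the polarity-inverted signal (the "pull"); the kernel and alpha value are illustrative choices.

```python
# Simplified 1-D push-pull sketch: a genuine positive-going edge excites
# only the push branch, while opposite-polarity structure feeds the pull
# branch and is subtracted, which is the intuition behind the operator's
# noise suppression.

def convolve(signal, kernel):
    """Valid-mode 1-D convolution."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Half-wave rectification."""
    return [max(0.0, x) for x in xs]

def push_pull(signal, kernel, alpha=1.0):
    """Push (excitation) minus alpha-weighted pull (inhibition)."""
    push = relu(convolve(signal, kernel))
    pull = relu(convolve([-x for x in signal], kernel))
    return [p - alpha * q for p, q in zip(push, pull)]

edge = [0, 0, 0, 1, 1, 1]      # clean step edge
kernel = [-1.0, 1.0]           # first-difference edge detector
print(push_pull(edge, kernel))  # responds only at the edge position
```

In the paper's pipeline the analogous 2-D response map (the delineation map) is what gets fed to the CNN in place of the raw image.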

    The Treatment of Advanced Persistent Threats on Windows Based Systems

    Advanced Persistent Threat (APT) is the name given to individuals or groups who write malicious software (malware) with the intent to perform actions detrimental to the victim or the victim's organisation. This thesis investigates ways in which it is possible to treat APTs before, during and after the malware has been laid down on the victim's computer. The scope of the thesis is restricted to desktop and laptop computers with hard disc drives. APTs have different motivations for their work, and this thesis is agnostic towards their origin and intent. Anti-malware companies freely present the work of APTs in many ways but summarise it mainly in the form of white papers. Individually, pieces of these works give an incomplete picture of an APT, but in aggregate it is possible to construct a view of APT families and pan-APT commonalities by comparing and contrasting the work of many anti-malware companies; it is as if there are a lot of the pieces of a jigsaw puzzle but no box lid available with the complete picture. In addition, academic papers provide proof-of-concept attacks and observations, some of which may come to be used by malware writers. Gaps in, and extensions to, the public knowledge may be filled through inference, implication, interpolation and extrapolation, and these form the basis for this thesis. The thesis presents a view of where APTs lie on Windows-based systems. It uses this view to create and build generic views of where APTs lie on hard disc drives on Windows-based systems using the Lockheed Martin Cyber Kill Chain. This is then used to treat APTs on Windows-based IT systems using purpose-built software in such a way that the malware is negated. The thesis does not claim to find all malware, but it demonstrates how to increase the cost of doing business for APTs, for example by overwriting unused disc space so APTs cannot place malware there.
The software developed was able to find Indicators of Compromise on all eight hard disc drives provided for analysis. Separately, from a corpus of 228 files known to be associated with malware, it identified approximately two thirds as Indicators of Compromise.
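    The thesis's purpose-built software is far richer than any sketch, but one common way to flag Indicators of Compromise is to compare file digests against a set of known-bad hashes. The following is a minimal sketch of that idea under stated assumptions: the hash set, file names, and directory are all hypothetical, and real tooling would also examine unallocated space, registry artefacts, and behaviour.

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk=65536):
    """Stream a file in chunks so large files need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def scan(root, known_bad):
    """Walk a directory tree; report files whose digest is known bad."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if sha256_of(path) in known_bad:
                hits.append(path)
    return hits

# Demonstration against a throwaway directory with one planted file.
with tempfile.TemporaryDirectory() as root:
    planted = os.path.join(root, "dropper.bin")
    with open(planted, "wb") as f:
        f.write(b"malicious payload")        # stand-in content
    known_bad = {sha256_of(planted)}
    print(scan(root, known_bad))             # reports the planted file
```

Hash matching only catches known files, which is one reason the thesis also raises the attacker's costs structurally, e.g. by denying APTs the use of unused disc space.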

    An Object-Based Multimedia Forensic Analysis Tool

    With the enormous increase in the use and volume of photographs and videos, multimedia-based digital evidence now plays an increasingly fundamental role in criminal investigations. However, with the increase, it is becoming time-consuming and costly for investigators to analyse content manually. Within the research community, work on multimedia content has tended to focus on highly specialised scenarios such as tattoo identification, number plate recognition, and child exploitation. An investigator’s ability to search multimedia data based on keywords (an approach that already exists within forensic tools for character-based evidence) could provide a simple and effective approach for identifying relevant imagery. This thesis proposes and demonstrates the value of using a multi-algorithmic approach via fusion to achieve the best image annotation performance. The results show that, of the existing systems, the highest average recall was achieved by Imagga with 53%, while the proposed multi-algorithmic system achieved 77% across the selected datasets. Subsequently, a novel Object-based Multimedia Forensic Analysis Tool (OM-FAT) architecture was proposed. The OM-FAT automates the identification and extraction of annotation-based evidence from multimedia content. Besides making multimedia data searchable, the OM-FAT system enables investigators to perform various forensic analyses (search using annotations, metadata, object matching, text similarity and geo-tracking) to help investigators understand the relationships between artefacts, thus reducing the time taken to perform an investigation and the investigator’s cognitive load. It will enable investigators to ask higher-level and more abstract questions of the data, then find answers to the essential questions in the investigation: what, who, why, how, when, and where.
The research includes a detailed illustration of the architectural requirements, engines, and complete design of the system workflow, which represents a full case management system. To highlight the ease of use and demonstrate the system’s ability to correlate between multimedia, a prototype was developed. The prototype integrates the functionalities of the OM-FAT tool and demonstrates how the system would help digital investigators find pieces of evidence among a large number of images, starting from the acquisition stage and ending in the reporting stage, with less effort and in less time. The Higher Committee for Education Development in Iraq (HCED)
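    The recall gain from fusing multiple annotators can be seen with a toy example. The sketch below is not the thesis's fusion method; the tag sets, system names, and ground truth are hypothetical. It shows why combining annotators raises recall — each system misses tags that another finds — and contrasts union fusion with a stricter voting rule.

```python
from collections import Counter

# Hypothetical outputs of three image-annotation systems for one photo.
GROUND_TRUTH = {"car", "street", "person", "traffic light"}
SYSTEM_TAGS = {
    "system_a": {"car", "street"},
    "system_b": {"car", "person"},
    "system_c": {"street", "traffic light", "tree"},
}

def recall(predicted, truth):
    """Fraction of true tags that were predicted."""
    return len(predicted & truth) / len(truth)

def fuse_union(outputs):
    """Union fusion: keep every tag any system proposes (max recall)."""
    fused = set()
    for tags in outputs.values():
        fused |= tags
    return fused

def fuse_vote(outputs, min_votes=2):
    """Voting fusion: keep tags proposed by at least min_votes systems."""
    counts = Counter(t for tags in outputs.values() for t in tags)
    return {t for t, c in counts.items() if c >= min_votes}

for name, tags in SYSTEM_TAGS.items():
    print(name, recall(tags, GROUND_TRUTH))      # each system: 0.5
print("union", recall(fuse_union(SYSTEM_TAGS), GROUND_TRUTH))  # 1.0
```

Union fusion maximises recall at the cost of admitting spurious tags (here, "tree"); a voting threshold trades some of that recall back for precision, which is the usual design tension in multi-algorithmic fusion.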

    Variations and Application Conditions of the Data Type »Image« - The Foundation of Computational Visualistics

    A few years ago, the department of computer science of the University of Magdeburg introduced a completely new diploma programme called 'computational visualistics', a curriculum dealing with all aspects of computational pictures. Only isolated aspects had been studied so far in computer science, particularly in the independent domains of computer graphics, image processing, information visualization, and computer vision. So is there indeed a coherent domain of research behind such a curriculum? The answer to that question depends crucially on a data structure that acts as a mediator between general visualistics and computer science: the data structure "image". The present text investigates that data structure, its components, and its application conditions, and thus elaborates the very foundations of computational visualistics as a unique and homogenous field of research. Before concentrating on that data structure, the theory of pictures in general and the definition of pictures as perceptoid signs in particular are closely examined. This includes an act-theoretic consideration of resemblance as the crucial link between image and object, the communicative function of context building as the central concept for comparing pictures and language, and several modes of reflection underlying the relation between image and image user. In the main chapter, the data structure "image" is analyzed at length under the perspectives of syntax, semantics, and pragmatics. While syntactic aspects mostly concern image processing, semantic questions form the core of computer graphics and computer vision. Pragmatic considerations are particularly involved with interactive pictures but also extend to the field of information visualization and even to computer art. Four case studies provide practical applications of various aspects of the analysis.

    Entangled Knowledge

    The intimate relationship between global European expansion since the early modern period and the concurrent beginnings of the scientific revolution has long been acknowledged. The contributions in this volume approach the entanglement of science and cultural encounters – many of them in colonial settings – from a variety of perspectives. Historical and historiographical survey essays sketch a transcultural history of knowledge and conduct a critical dialogue between the recent academic fields of Postcolonial Studies and Science & Empire Studies; a series of case studies explores the topos of Europe’s ‘great inventions’, the scientific exploitation of culturally unfamiliar people and objects, the representation of indigenous cultures in discourses of geographical exploration, as well as non-European scientific practices. ‘Entangled Knowledges’ also refers to the critical practices of scholarship: various essays investigate scholarship’s own failures in self-reflexivity, arising from an uncritical appropriation of cultural stereotypes and colonial myths, of which the discourse of Orientalism in historiography and residual racialist assumptions in modern genetics serve as examples. The volume thus contributes to the study of cultural and colonial relations as well as to the history of science and scholarship.
