345 research outputs found

    The Geo-Doc: Remediating the Documentary Film as an Instrument of Social Change with Locative Theory and Technology

    The documentary film has had a long history as an influential communications tool with the ability to effect social change. Its inherent claims to representing the truth provide a foundation of credibility that the filmmaker uses to inform and persuade their audience, with the goal of causing them to take action that ideally leads to social change. This goal has been seen to be achieved when the documentary film employs certain methods and technologies. My research questions are these: What methods and technologies are most effective in bolstering the documentary film's ability to effect social change, and what new and emerging methods and technologies extend that ability? How can the documentary film be remediated to incorporate these attributes, and would such a project achieve some measure of success in effecting social change when tested in the field? These questions are answered through an investigation of various disciplines of study. The history of the documentary film as an instrument of social change is examined from its origins to the present day. This examination also identifies those methods and technologies that have advanced the documentary's ability to serve as a successful communication tool between filmmaker and changemaker. Focussed investigations into the theory and practice of the documentary film yield specific approaches and techniques that prove to be most successful, such as the Participatory Mode, Ecocinema and Semiotic Storytelling, the Multilinear and the Database Documentary, and the distinct digital affordances provided by Geomedia. Once identified and explained, the most effective theories and practices are combined in an altogether new and remediated documentary form: the geo-doc. The geo-doc is the term I apply to a multilinear, interactive, database documentary film project presented on the platform of a Geographic Information System map. The project was made specifically for an audience of changemakers, with the general public in mind as a secondary audience. In collaboration with the changemaker, content and interface suggestions are made to the filmmaker to augment the project's effectiveness as a communications tool.

    Open parliaments. Technological enactment in state legislatures

    This thesis starts with a simple research question, asking why parliaments that share the same level of functions and competencies produce different results in terms of the level of development of their websites. The research is divided into three stages: comparative website analysis, quantitative analysis and case studies. Looking at 93 state Legislatures in Brazil, Spain and the United States, each of the stages of the research presents findings that contribute to the literature on e-democracy and open government. The comparative website analysis shows a varying degree of development amongst state Legislature websites. This heterogeneous level of development is contrasted with a common denominator amongst most websites: while the majority of efforts are towards the provision of Legislative information, the prospects for participation and deliberation are extremely limited. Standing out against these rather predictable results, findings also suggest that certain institutional traits such as electoral systems may influence the design of websites in terms of both their content and features. The quantitative analyses single out a number of factors that influence the differences in levels of development of Legislative websites. First of all, contrary to what has been suggested by a portion of the e-democracy literature, neither resources nor partisanship seem to matter for the development of Legislative websites. Conversely, the quantitative findings suggest that matters of institutional design (e.g. parliaments’ autonomy) and demand (e.g. Internet access, population) may play a significant role in the performance of Parliamentary websites. The case studies - the core of this research - follow an institutional approach to the process of ICT usage within public organizations, through a detailed analysis of the inner workings of three different Legislatures in Brazil (Rio Grande do Sul, Minas Gerais and Rio Grande do Norte). This analysis evinces the role played by factors largely ignored by the majority of the e-democracy research until now. It shows how the different institutional arrangements ultimately shape the very configuration of websites, impacting each of them in terms of their features and contents. By reversing this interpretation, we surmise, the configuration of Legislative websites per se may provide external observers with information regarding institutional arrangements and policy-innovation cycles and processes within a Legislature. Finally, the comparative perspective taken sheds light on the role played by civil servants in the technological enactment process within Legislatures. All other things being equal, it is the relationships among civil servants and MPs, and the relationships between the two groups - mediated by institutional arrangements - that will ultimately affect the level of development of parliamentary websites.

    Network Narrative: Prose Narrative Fiction and Participatory Cultural Production in Digital Information and Communication Networks

    In this study of prose narrative created explicitly for participatory network communications environments I argue that network narratives constitute an important, born-networked form of literary and cultural expression. In the first half of the study I situate network narratives within a rich, dynamic process of reciprocity and codependence between the technological, material and formal properties of communication media on the one hand, and the uses of these media in cultural practices and forms of expression on the other. I point out how the medial and cultural flows that characterize contemporary network culture promote a codependent relation between narrative and information. This relation supports literary cultural expressions that invoke everyday communication practices increasingly shaped by mobile, networked computing devices. In the second half of this study, I extend theoretical work in the field of electronic literature and digital media to propose a set of four characteristics through which network narratives may be understood as distinct modes of networked, literary cultural expression. Network narratives, I suggest, are multimodal, distributed, participatory, and emergent. These attributes are present in distinct ways, within distinct topological layers of the narratives: in the story, discourse, and character networks of the narrative structure; in the formal and navigational structures; and in the participatory circuits of production, circulation and consumption. Attending to these topological layers and their interrelationships by using concepts derived from graph theory and network analysis offers a methodology that links the particular, closely read attributes and content of network narratives to a more distant understanding of changing patterns in broader, networked cultural production. Finally, I offer readings of five examples of network narratives. These include Kate Pullinger and Chris Joseph’s Flight Paths, Penguin Books and De Montfort University’s collaborative project A Million Penguins, the Apple iOS application The Silent History, Tim Burton’s collaboration with TIFF, BurtonStory, and a project by NFB Interactive, Out My Window. Each of these works incorporates user participation into its production circuits using different strategies, each with different implications for narrative and navigational structures. I conclude by describing these distinct strategies as additive participation – participation that becomes embedded within the work itself – and delineating different approaches that are employed independently or in combination by the authors and producers.

    Protecting applications using trusted execution environments

    While cloud computing has been broadly adopted, companies that deal with sensitive data are still reluctant to do so due to privacy concerns or legal restrictions. Vulnerabilities in complex cloud infrastructures, resource sharing among tenants, and malicious insiders pose a real threat to the confidentiality and integrity of sensitive customer data. In recent years trusted execution environments (TEEs), hardware-enforced isolated regions that can protect code and data from the rest of the system, have become available as part of commodity CPUs. However, designing applications for execution within TEEs requires careful consideration of the elevated threats that come with running in a fully untrusted environment. Interaction with the environment should be minimised, but some cooperation with the untrusted host is required, e.g. for disk and network I/O, via a host interface. Implementing this interface while maintaining the security of sensitive application code and data is a fundamental challenge. This thesis addresses this challenge and discusses how TEEs can be leveraged to secure existing applications efficiently and effectively in untrusted environments. We explore this in the context of three systems that deal with the protection of TEE applications and their host interfaces: SGX-LKL is a library operating system that can run full unmodified applications within TEEs with a minimal general-purpose host interface. By providing broad system support inside the TEE, the reliance on the untrusted host can be reduced to a minimal set of low-level operations that cannot be performed inside the enclave. SGX-LKL provides transparent protection of the host interface, for both disk and network I/O. Glamdring is a framework for the semi-automated partitioning of TEE applications into an untrusted and a trusted compartment. Based on source-level annotations, it uses either dynamic or static code analysis to identify sensitive parts of an application. Taking into account the objectives of a small TCB size and low host interface complexity, it defines an application-specific host interface and generates partitioned application code. EnclaveDB is a secure database using Intel SGX based on a partitioned in-memory database engine. The core of EnclaveDB is its logging and recovery protocol for transaction durability. For this, it relies on the database log managed and persisted by the untrusted database server. EnclaveDB protects against advanced host interface attacks and ensures the confidentiality, integrity, and freshness of sensitive data.
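    The annotation-driven partitioning that the abstract attributes to Glamdring can be pictured with a toy sketch. The sketch below is not Glamdring's implementation (which targets C applications and relies on static or dynamic analysis); it is a minimal, hypothetical Python illustration of the general pattern of marking sensitive functions so a partitioner can place them in the trusted compartment and leave host I/O outside.

# Toy sketch of annotation-based partitioning (hypothetical; Glamdring itself
# targets C code and uses static/dynamic analysis rather than decorators).

TRUSTED: set[str] = set()    # functions to run inside the TEE
UNTRUSTED: set[str] = set()  # functions left on the untrusted host

def sensitive(fn):
    """Annotation: this code handles secrets and belongs in the trusted compartment."""
    TRUSTED.add(fn.__name__)
    return fn

def host(fn):
    """Annotation: this code only performs I/O via the untrusted host interface."""
    UNTRUSTED.add(fn.__name__)
    return fn

@sensitive
def decrypt_record(key: bytes, blob: bytes) -> bytes:
    # Touches plaintext secrets, so it must stay inside the enclave.
    raise NotImplementedError

@host
def read_blob(path: str) -> bytes:
    # Disk I/O has to cross the host interface in any case.
    raise NotImplementedError

if __name__ == "__main__":
    # A real partitioner would now generate host-interface stubs between the
    # two compartments; here we only report the split.
    print("trusted compartment:", sorted(TRUSTED))
    print("untrusted compartment:", sorted(UNTRUSTED))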

    Washington University Record, April 2, 1998

    https://digitalcommons.wustl.edu/record/1789/thumbnail.jp

    A cultural analysis of information technology offshore outsourcing: an exercise in multi-sited ethnography of virtual work

    This study is an exploration of how ethnography and anthropological analysis can provide new understanding of transnational, multi-sited research phenomena. The research focused on the work activities of one American client organization and its Indian IT service provider, situated in the global virtual field of Information Technology (IT) offshore outsourcing. The Principal Investigator adapted and applied an ethnographic approach for her fieldwork in order to understand the norms, beliefs, and values about work, as well as the relationship between cultural differences and virtual communication. Dissertation findings offer new insight for anthropological discussions of globalization and suggest further development of studies of the virtual environment as a site for ethnographic inquiry.

    Soft cinematic hypertext (other literacies)

    This research demonstrates the relevance of academic hypertext theory and practice to humanities research, and uses this as a model to explore the specificity of digital humanities practice in the context of scholarly writing. This establishes terms for reconsidering cinema from the point of view of a hypertextual logic of wholes and parts, which is then used to develop a new form of online interactive video known as softvideo.

    Planning and optimisation of 4G/5G mobile networks and beyond

    As mobile networks continue to evolve, two major problems persist that greatly affect the quality of service users experience: (1) efficient resource management for users at the cell edge and users in network coverage holes, and (2) network coverage planning that improves the quality of service for users while keeping deployment costs low. In this study, two novel algorithms, the Collaborative Resource Allocation Algorithm (CRAA) and the Memetic-Bee-Swarm Site Location-Allocation Algorithm (MBSSLAA), are proposed to solve these problems. The CRAA is inspired by lending and welfare systems from the field of political economy and is developed as a market game. It allows cell-edge users and users with less than the required Signal-to-Interference-plus-Noise Ratio (SINR) to collaborate through coalition formation so that they can transmit at a satisfactory Quality of Service; the resulting coalition payoff is distributed using the Shapley value, computed with Owen's Multilinear Extension function. The MBSSLAA combines the behaviour of the Memetic algorithm and the Bee Swarm algorithm for site location-allocation. A series of system-level simulations and numerical evaluations were run to assess the performance of the algorithms. Numerical evaluation and simulation results show that the CRAA outperforms two popular Long Term Evolution-Advanced algorithms when assessed on throughput, spectral efficiency and fairness, achieving an average improvement of 30% in throughput and spectral efficiency for the users of the network. Simulations of the MBSSLAA with realistic network design parameter values show significantly higher performance for users in the coverage region of interest and underline the importance of ultra-dense small-cell networks in supporting the Internet of Things in future telecommunications services. The results also show that the MBSSLAA can reduce the number of small cells in an ultra-dense small-cell network while providing the requisite high data coverage, and that this can be achieved while maintaining high SINR values and throughput for users, giving them a satisfactory level of quality of service, which is a significant requirement in the Fifth Generation network specification. Overall, compared with existing solutions in the literature, both proposed algorithms give higher performance on these problems.
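    As a point of reference for the payoff-distribution step mentioned above: the Shapley value has a standard combinatorial definition, sketched below in minimal Python with a hypothetical three-user coalition game (the thesis computes it via Owen's Multilinear Extension; the coalition values here are invented purely for illustration and are not results from the work).

# Textbook Shapley value for a small coalition game (illustration only; not the
# thesis's CRAA, which uses Owen's Multilinear Extension to compute it).
from itertools import combinations
from math import factorial

def shapley_values(players, v):
    """Return each player's Shapley value for characteristic function v(frozenset)."""
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(n):
            for subset in combinations(others, r):
                S = frozenset(subset)
                # Weight |S|! (n - |S| - 1)! / n! for player i joining coalition S.
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (v(S | {i}) - v(S))
        phi[i] = total
    return phi

# Hypothetical coalition throughputs (Mb/s): a cell-edge user A cooperating
# with two better-placed users B and C.
def v(S):
    gains = {frozenset(): 0, frozenset("A"): 1, frozenset("B"): 4, frozenset("C"): 3,
             frozenset("AB"): 7, frozenset("AC"): 6, frozenset("BC"): 8,
             frozenset("ABC"): 12}
    return gains[frozenset(S)]

if __name__ == "__main__":
    # The payoffs sum to v({A, B, C}) = 12, i.e. the full coalition value is shared.
    print(shapley_values(["A", "B", "C"], v))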