260 research outputs found

    UK Lockdown Governmentalities: What Does It Mean to Govern in 2020?

    Focusing on the United Kingdom, this paper examines the mechanisms of 2020’s ‘lockdown’ strategy from a governmental perspective, with ‘governmentality’ being defined as the art of, or rationale behind, governing populations at a given time. By investigating a series of recent imperatives given to the population by the UK government, and comparing these with the previously dominant form of governmentality (neoliberalism), I hope to shed light on some new features of the current art of government. Indeed, the paper argues that neoliberalism is no longer the dominant form of governmentality in the UK, although some important legacies remain. I therefore argue that new forms of governmentality have risen to prominence. In particular, I use the concept of ‘algorithmic governmentality’ to address features of lockdown subjectivity and economy, such as the ‘doppelgĂ€nger logic’ of consumption and production, as well as the government’s attempts to continuously manage and re-manage the population based on biometric data. However, I also show that this concept does not adequately encompass contemporary realities of surveillance, exposition and coercion. As such, I introduce ‘instrumentarian governmentality’ to denote the use of digital surveillance instruments to control the behaviour of the population. Additionally, the term is intended to denote an ‘authoritarian’ turn in the ways in which people are governed. Overall, what it means to govern in 2020 is posited as a fluctuating composite of three key forms of governmentality: neoliberal, algorithmic, and instrumentarian.

    Offshore Accounts: Insider's Summary of FATCA and Its Potential Future


    The Prosecutor as a Final Safeguard Against False Convictions: How Prosecutors Assist with Exoneration

    Prosecutors have helped secure an unprecedented number of recent exonerations. This development, combined with the rapid emergence of district attorney-initiated conviction integrity units (CIUs), raises several questions. How do prosecutors’ offices review postconviction innocence claims? How do they make decisions about the merits of those claims? How do CIU processes differ from non-CIU processes? This study examines the circumstances surrounding prosecutor-assisted exoneration cases through semi-structured interviews with 20 prosecutors and 19 defense attorneys. It draws from a sample of both CIU and non-CIU prosecutors, thereby enabling comparisons. Respondents were asked about their experiences and decision-making structures in specific, post-2005 exoneration cases as well as their impressions of postconviction practices more broadly. Their responses revealed the salience of office hierarchies and appellate principles for prosecutors’ postconviction discretion. Unlike earlier stages—such as charging and plea bargaining—few decisions appear to be delegated to the line prosecutor in the postconviction stage. Therefore, I incorporate organizational accident theory to understand key decisions, such as which prosecutor should be tasked with reviewing innocence claims, how to screen claims, and how to decide the outcome of an innocence claim. I find that prosecutors work within a system that emphasizes procedural errors over factual ones and that allows for a narrow and belated discovery of false convictions. Thus, prosecutors’ postconviction efforts do not appear likely to create a new pathway to exoneration for pro se defendants, or to identify false convictions that have not already been discovered by trusted defense attorneys, innocence organizations, and/or journalists.
Although most prosecutors described processes that could reliably facilitate unbiased case review and reinvestigation, these processes were reserved for only the most extraordinary of innocence claims. The significance of this research for policy (in the crafting of legislation and court rules) and practice (in the processing of innocence claims through prosecutors’ offices) is discussed.

    Re-examining and re-conceptualising enterprise search and discovery capability: towards a model for the factors and generative mechanisms for search task outcomes.

    Many organizations are trying to re-create the Google experience, to find and exploit their own corporate information. However, there is evidence that finding information in the workplace using search engine technology has remained difficult, with socio-technical elements largely neglected in the literature. Explication of the factors and generative mechanisms (ultimate causes) of effective search task outcomes (user satisfaction, search task performance and serendipitous encountering) may provide a first step towards making improvements. A transdisciplinary (holistic) lens was applied to Enterprise Search and Discovery capability, combining critical realism and activity theory with complexity theories, within one of the world's largest corporations. Data collection included an in-situ exploratory search experiment with 26 participants, focus groups with 53 participants and interviews with 87 business professionals. Thousands of user feedback comments and search transactions were analysed. Transferability of findings was assessed through interviews with eight industry informants and ten organizations from a range of industries. A wide range of informational needs were identified for search filters, including a need to be intrigued. Search term word co-occurrence algorithms facilitated serendipity to a greater extent than existing methods deployed in the organization surveyed. No association was found between user satisfaction (or self-assessed search expertise) and search task performance; overall performance was poor, although most participants had been satisfied with their performance. Eighteen factors were identified that influence search task outcomes, ranging from user and task factors, informational and technological artefacts, through to a wide range of organizational norms. Modality Theory (Cybersearch culture, Simplicity and Loss Aversion bias) was developed to explain the study observations.
It proposes that at all organizational levels there are tendencies for reductionist (unimodal) mind-sets towards search capability, leading to fixes that fail. The factors and mechanisms were identified in other industry organizations, suggesting some theory generalizability. This is the first socio-technical analysis of Enterprise Search and Discovery capability. The findings challenge existing orthodoxy, such as the criticality of search literacy (agency), which has been neglected in the practitioner literature in favour of structure. The resulting multifactorial causal model and strategic framework for improvement present opportunities to update existing academic models in the IR, LIS and IS literature, such as the DeLone and McLean model for information system success. There are encouraging signs that Modality Theory may enable a reconfiguration of organizational mind-sets that could transform search task outcomes and ultimately business performance.
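The abstract reports that search-term word co-occurrence algorithms facilitated serendipitous encountering more than the organization's existing methods. A minimal sketch of the underlying idea follows; the function names, window size, and toy corpus are illustrative assumptions, not the thesis's actual implementation:

```python
from collections import Counter

def cooccurrence_counts(documents, window=5):
    """Count how often two terms appear within `window` words of each other."""
    pairs = Counter()
    for doc in documents:
        tokens = doc.lower().split()
        for i, term in enumerate(tokens):
            for other in tokens[i + 1 : i + 1 + window]:
                if other != term:
                    pairs[tuple(sorted((term, other)))] += 1
    return pairs

def related_terms(pairs, query_term, top_n=3):
    """Suggest terms that most often co-occur with the query term."""
    scored = [(a if b == query_term else b, n)
              for (a, b), n in pairs.items() if query_term in (a, b)]
    return [t for t, _ in sorted(scored, key=lambda x: -x[1])[:top_n]]

# Toy corpus standing in for enterprise documents (invented).
docs = [
    "drilling report north sea platform maintenance",
    "north sea platform inspection drilling schedule",
    "pipeline maintenance schedule north sea",
]
pairs = cooccurrence_counts(docs)
print(related_terms(pairs, "north", top_n=1))  # ['sea']
```

Surfacing strongly co-occurring terms alongside results is one way such an algorithm could prompt the serendipitous encounters the study describes.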

    Information Management for Digital Learners : Introduction, Challenges, and Concepts of Personal Information Management for Individual Learners

    The current cultural transition of our society into a digital society influences all aspects of human life. New technologies like the Internet and mobile devices enable unobstructed access to knowledge in worldwide networks. These advancements bring with them great freedom in the decisions and actions of individuals, but also a growing demand for an appropriate mastering of this freedom of choice and of the amount of knowledge that has become available today. Naturally, this observable rise and progress of new technologies—gently but emphatically becoming part of people’s everyday lives—changes not only the way people work, communicate, and shape their leisure but also the way people learn. This thesis is dedicated to an examination of how learners can meet these requirements with the support that modern technology is able to provide. More precisely, this thesis places a particular emphasis that is absent from previous work in the field and thus makes it distinctive: the explicit focus on individual learners. As a result, the main concern of this thesis can be described as the examination, development, and implementation of personal information management in learning. Altogether, two different steps towards a solution have been chosen: the development of a theoretical framework and its practical implementation in a comprehensive concept. To establish a theoretical framework for personal information management in learning, the spheres of learning, e-learning, and personalised learning have been combined with theories of organisational and personal knowledge management to form a hitherto unique holistic view of personal information management in learning. The development of this framework involves the identification of characteristics, needs, and challenges that distinguish individual learners from within the larger crowd of uniform learners.
The theoretical framework defined within the first part is transferred to a comprehensive technical concept for personal information management in learning. The realisation and design of this concept as well as its practical implementation are strongly characterised by the utilisation of information retrieval techniques to support individual learners. The characteristic feature of the resulting system is a flexible architecture that enables the unified acquisition, representation, and organisation of information related to an individual’s learning and supports an improved findability of personal information across all relevant sources of information. The most important results of this thesis have been validated by a comparison with current projects in related areas and within a user study.

The current cultural transition of our society into a digital society has far-reaching effects on all aspects of human life. New technologies such as the Internet, and the mobile devices used to access them, enable almost unhindered access to knowledge in worldwide networks. This progress brings with it, on the one hand, a great degree of freedom in the decisions and actions of individuals and, on the other, an ever louder demand for strategies for dealing adequately with this freedom and with the amount of information now available. Naturally, this progress and its technologies change not only our working lives and private everyday routines but also the way we learn. This thesis addresses the question of how learners can meet these new demands and how modern technologies can support them in adequate information management. Its particular focus lies exclusively on individual learners, more precisely on those who proceed independently along individual learning paths.

In summary, this thesis therefore investigates possibilities for personalised information management for learners. The question is examined on two levels. The first level comprises a theoretical examination of the topic. To this end, an overarching framework for the personal information management of learners is developed that permits a holistic view of the question. This framework is characterised in particular by a fusion of the domains of e-learning and knowledge management. Within the theoretical examination, defining facets of learning are described and theories of organisational knowledge management are examined for their usefulness in mastering personal information management. This finally leads to a characterisation of individual learners, the identification of fundamental challenges facing these learners, and a model describing individual information and knowledge management. The second level comprises the implementation of the developed framework in a practical concept for the efficient management of the personal learning content and information of individual learners. The resulting system is characterised by its attention to the information needs of individual learners and, in particular, by the targeted use of information retrieval techniques to support these learners. The constitutive feature of the system is therefore a flexible architecture that allows learning objects to be captured with particular regard to the learning context.

Considered in more detail, the capture of basic information in the form of learning objects, combined with hierarchical and non-hierarchical supplementary information, enables an individual and comprehensive management of learning content and information that also supports improved retrievability of this information at a later point in time. The most important results of this thesis are compared with current developments and projects in related areas and validated in a user study.
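The abstract describes a system whose key feature is the unified acquisition and improved findability of learning-related information across all relevant sources. A loose sketch of that idea as a minimal inverted index over items from heterogeneous sources; the class name, item ids, and sample data are invented for illustration and are not taken from the thesis:

```python
from collections import defaultdict

class PersonalIndex:
    """Minimal inverted index over learning items from heterogeneous sources."""

    def __init__(self):
        self.postings = defaultdict(set)  # term -> ids of items containing it
        self.items = {}                   # item id -> (source, text)

    def add(self, item_id, source, text):
        self.items[item_id] = (source, text)
        for term in text.lower().split():
            self.postings[term].add(item_id)

    def search(self, query):
        """Ids of items containing every query term, regardless of source."""
        sets = [self.postings[t] for t in query.lower().split()]
        return set.intersection(*sets) if sets else set()

idx = PersonalIndex()
idx.add("n1", "notes", "graph algorithms lecture summary")
idx.add("m1", "mail", "question about graph traversal homework")
idx.add("w1", "web", "tutorial on sorting algorithms")
print(sorted(idx.search("graph")))  # ['m1', 'n1']
```

Because every source feeds the same index, one query spans notes, mail, and web captures alike, which is the "unified acquisition, representation, and organisation" the abstract argues for.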

    Students’ Strategies for Writing Arguments from Online Sources of Information

    This study builds on previous work on writing (e.g., Bereiter & Scardamalia, 1987; Hayes & Flower, 1980) and writing from sources (e.g., Spivey, 1997). Its purpose was to investigate processes and strategies for writing from online sources of information. High-achieving Grade 12 students were recorded as they researched on the Internet and wrote arguments about cosmetics testing on animals. Data included think-aloud protocols, video recordings of participants and computer screens, writing products, and interviews. Data were analyzed using narrative summaries and cross-case comparisons. A coding scheme was developed and applied in order to establish interrater reliability. Writers used one of three overall processes: 1) Writers alternated between researching online and structuring content into an outline, and then drafted a text; 2) Writers researched online, writing notes and a separate outline, and then drafted a text, drawing on both documents; 3) Writers drafted the text, conducting their research as they drafted. Each process comprised subordinate strategies and operations. Two contributions of this work are discussed. First, the strategies of participants were similar in that they demonstrated translations between content and rhetorical problem spaces (cf. Bereiter & Scardamalia, 1987). These translations occurred during researching, as well as drafting and reviewing, and were apparent through students’ Internet activity. Second, participants constructed different task environments (cf. Hayes & Flower, 1980) and used different strategies; all were adapted to the affordances and constraints of the Internet, the electronic writing medium, and internal cognition. Final sections address writing instruction, the method, and future research.
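The abstract mentions that a coding scheme was applied in order to establish interrater reliability. Cohen's kappa is one common statistic for agreement between two coders; the study does not say which measure it used, and the function and sample codes below are purely illustrative:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(codes_a)
    # Proportion of segments on which the two coders agree.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Agreement expected by chance from each coder's marginal code frequencies.
    ca, cb = Counter(codes_a), Counter(codes_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented codes two raters might assign to the same six protocol segments.
a = ["search", "note", "draft", "search", "draft", "note"]
b = ["search", "note", "draft", "note", "draft", "note"]
print(round(cohens_kappa(a, b), 2))  # 0.75
```

Kappa discounts agreement that would occur by chance, which is why it is preferred over raw percent agreement when reporting coding-scheme reliability.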

    Approaching algorithmic power

    Contemporary power manifests in the algorithmic. Emerging quite recently as an object of study within media and communications, cultural research, gender and race studies, and urban geography, the algorithm often seems ungraspable. Framed as code, it becomes proprietary property, black-boxed and inaccessible. Framed as a totality, it becomes overwhelmingly complex, incomprehensible in its operations. Framed as a procedure, it becomes a technique to be optimised, bracketing out the political. In struggling to adequately grasp the algorithmic as an object of study, to unravel its mechanisms and materialities, these framings offer limited insight into how algorithmic power is initiated and maintained. This thesis instead argues for an alternative approach: firstly, that the algorithmic is coordinated by a coherent internal logic, a knowledge-structure that understands the world in particular ways; second, that the algorithmic is enacted through control, a material and therefore observable performance which purposively influences people and things towards a predetermined outcome; and third, that this complex totality of architectures and operations can be productively analysed as strategic sociotechnical clusters of machines. This method of inquiry is developed with and tested against four contemporary examples: Uber, Airbnb, Amazon Alexa, and Palantir Gotham. Highly profitable, widely adopted and globally operational, they exemplify the algorithmic shift from whiteboard to world. But if the world is productive, it is also precarious, consisting of frictional spaces and antagonistic subjects. Force cannot be assumed to be unilinear, but is incessantly negotiated—operations of parsing data and processing tasks forming broader operations that strive to establish subjectivities and shape relations. These negotiations can fail, destabilised by inadequate logics and weak control.
A more generic understanding of logic and control enables a historiography of the algorithmic. The ability to index information, to structure the flow of labor, to exert force over subjects and spaces—these did not emerge with the microchip and the mainframe, but are part of a longer lineage of calculation. Two moments from this lineage are examined: house-numbering in the Habsburg Empire and punch-card machines in the Third Reich. Rather than revolutionary, this genealogy suggests an evolutionary process, albeit uneven, linking the computation of past and present. The thesis makes a methodological contribution to the nascent field of algorithmic studies. But more importantly, it renders algorithmic power more intelligible as a material force. Structured and implemented in particular ways, the design of logic and control constructs different versions, or modalities, of algorithmic power. This power is political: it calibrates subjectivities towards certain ends, prioritises space in specific ways, and privileges particular practices whilst suppressing others. In apprehending operational logics, the practice of method thus foregrounds the sociopolitical dimensions of algorithmic power. As the algorithmic increasingly infiltrates and governs the everyday, the ability to understand, critique, and intervene in this new field of power becomes more urgent.

    Bringing Nordic mathematics education into the future : Proceedings of Norma 20 : The ninth Nordic conference on mathematics education, Oslo, 2021
