
    Von Nischen und Infrastrukturen: Herausforderungen und neue AnsÀtze politischer Technologien

    New technologies from the activist milieu offer radical alternatives to cultural niche formation and to centralized Web 2.0 infrastructures. The social and political realities of digitization and networking are today shaped by two constitutive but fundamentally different, in part even opposed, dynamics. Both confront media activism with new challenges. On the one hand, we can observe a flourishing of new cultural niches and horizontal forms of organization. On the other hand, we are simultaneously witnessing an enormous centralization and concentration at the level of the platforms that provide a large part of the infrastructural basis for the growth of these niches and new patterns of cooperation. While the building of alternative infrastructures (magazines, TV channels, and Internet platforms) played a major role in the first 30 years of media-activist projects (Stalder 2008), these questions have receded into the background over the last 10 years. The complexity of the infrastructures grew steadily, making them ever more costly to operate, and the new, open platforms typical of Web 2.0 put powerful tools at everyone's disposal, seemingly without restrictions. Why run your own platform when large professional providers offer the same thing better, more securely, and free of charge? Today, however, the problems of these developments are clearly visible. In what follows, the challenges of this niche formation, the dark side of centralized infrastructures, and the new designs for decentralized infrastructures that respond to them are outlined.

    WikiLeaks: neue Dimensionen des Medienaktivismus

    "Mit WikiLeaks hat der Medienaktivismus eine neue Dimension erreicht. WikiLeaks versteht die neuen sozio-technischen Möglichkeiten und institutionellen WidersprĂŒche, die die gegenwĂ€rtige Phase der Entwicklung der Netzwerkgesellschaft kennzeichnen, fĂŒr sein Projekt nutzbar zu machen. Politisch bleibt das Projekt allerdings schwer kategorisierbar, da es gleichzeitig eine markt-libertĂ€re und eine institutionskritische Haltung vertritt, gleichermassen staatliche wie privat-wirtschaftliche Akteure mit einschliesst. Solche WidersprĂŒche können ausgehalten werden, denn anders als traditionelle aktivistische Medien versucht WikiLeaks sein Material nicht in einen erklĂ€renden Zusammenhang zu stellen, sondern ĂŒberlĂ€sst die Interpetation anderen." (Autorenreferat

    Evaluation of the Arts-based Research Programme of the Austrian Science Fund (PEEK)

    The subject of this evaluation is the Arts-based Research Programme (PEEK) of the Austrian Science Fund (FWF). With the introduction of this programme in 2009, the FWF responded to the equal status of the universities of the arts with other universities postulated in the 2002 amendment to the Universities Act, which was anchored in a corresponding amendment to the Research and Technology Act in 2007. For the universities of the arts, the field of activity "development and advancement of the arts" was adopted as the equivalent of "science". Its inclusion in the Research and Technology Act was also meant to signal the upgrading of the art colleges to universities of the arts and their equal status with the other public universities. To ensure that research approaches adequate to the character and scope of the universities of the arts could be offered, arts-based research was identified as a promising approach.

    The effects of positive end-expiratory pressure on cardiac function: a comparative echocardiography-conductance catheter study.

    BACKGROUND Echocardiographic parameters of diastolic function depend on cardiac loading conditions, which are altered by positive pressure ventilation. The direct effects of positive end-expiratory pressure (PEEP) on cardiac diastolic function are unknown. METHODS Twenty-five patients without apparent diastolic dysfunction undergoing coronary angiography were ventilated noninvasively at PEEP levels of 0, 5, and 10 cmH2O (in randomized order). Echocardiographic diastolic assessment and pressure-volume-loop analysis from conductance catheters were compared. The time constant for pressure decay (τ) was modeled with exponential decay. End-diastolic and end-systolic pressure-volume relationships (EDPVRs and ESPVRs, respectively) from temporary caval occlusion were analyzed with generalized linear mixed-effects and linear mixed models. Transmural pressures were calculated using esophageal balloons. RESULTS τ values for intracavitary cardiac pressure increased with PEEP (n = 25; no PEEP, 44 ± 5 ms; 5 cmH2O PEEP, 46 ± 6 ms; 10 cmH2O PEEP, 45 ± 6 ms; p < 0.001). This increase disappeared when corrected for transmural pressure and diastole length. The transmural EDPVR was unaffected by PEEP. The ESPVR increased slightly with PEEP. Echocardiographic mitral inflow parameters and tissue Doppler values decreased with PEEP [peak E wave (n = 25): no PEEP, 0.76 ± 0.13 m/s; 5 cmH2O PEEP, 0.74 ± 0.14 m/s; 10 cmH2O PEEP, 0.68 ± 0.13 m/s; p = 0.016; peak A wave (n = 24): no PEEP, 0.74 ± 0.12 m/s; 5 cmH2O PEEP, 0.7 ± 0.11 m/s; 10 cmH2O PEEP, 0.67 ± 0.15 m/s; p = 0.014; E' septal (n = 24): no PEEP, 0.085 ± 0.016 m/s; 5 cmH2O PEEP, 0.08 ± 0.013 m/s; 10 cmH2O PEEP, 0.075 ± 0.012 m/s; p = 0.002]. CONCLUSIONS PEEP does not affect active diastolic relaxation or passive ventricular filling properties. Dynamic echocardiographic filling parameters may reflect changing loading conditions rather than intrinsic diastolic function. PEEP may have slight positive inotropic effects. CLINICAL TRIAL REGISTRATION https://clinicaltrials.gov/ct2/show/NCT02267291, registered 17 October 2014.
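    As a minimal illustration of the τ estimation named in the methods, the sketch below fits an exponential decay P(t) = P_inf + (P0 - P_inf)·exp(-t/τ) to a synthetic pressure trace. The trace, the baseline term, and the initial guesses are illustrative assumptions, not the study's analysis code.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pressure_decay(t, P0, tau, P_inf):
        # Exponential isovolumic pressure fall: P(t) = P_inf + (P0 - P_inf) * exp(-t / tau)
        return P_inf + (P0 - P_inf) * np.exp(-t / tau)

    t = np.linspace(0.0, 0.12, 120)                    # time after peak -dP/dt [s]
    P = pressure_decay(t, 80.0, 0.045, 5.0)            # synthetic trace with tau = 45 ms
    P_noisy = P + np.random.normal(0.0, 0.5, t.size)   # add measurement noise [mmHg]

    popt, _ = curve_fit(pressure_decay, t, P_noisy, p0=(70.0, 0.05, 0.0))
    print(f"fitted tau: {popt[1] * 1e3:.1f} ms")       # close to the 44-46 ms reported
    ```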

    Bearing account-able witness to the ethical algorithmic system

    This paper explores how accountability might make otherwise obscure and inaccessible algorithms available for governance. The potential import and difficulty of accountability are made clear in the compelling narrative reproduced across recent popular and academic reports. Through this narrative we are told that algorithms trap us and control our lives, undermine our privacy, and have power and an independent agential impact, while at the same time being inaccessible, reducing our opportunities for critical engagement. The paper suggests that STS sensibilities can provide a basis for scrutinizing the terms of the compelling narrative, disturbing the notion that algorithms have a single, essential characteristic and a predictable power or agency. In place of taking the terms of the compelling narrative for granted, ethnomethodological work on sense-making accounts is drawn together with more conventional approaches to accountability focused on openness and transparency. The paper uses empirical material from a study of the development of an "ethical," "smart" algorithmic video surveillance system. It introduces the "ethical" algorithmic surveillance system, the approach to accountability developed, and some of the challenges of attempting algorithmic accountability in action. The paper concludes with reflections on future questions of algorithms and accountability.

    Identification and reconstruction of low-energy electrons in the ProtoDUNE-SP detector

    Measurements of electrons from ν_e interactions are crucial for the Deep Underground Neutrino Experiment (DUNE) neutrino oscillation program, as well as for searches for physics beyond the standard model, supernova neutrino detection, and solar neutrino measurements. This article describes the selection and reconstruction of low-energy (Michel) electrons in the ProtoDUNE-SP detector. ProtoDUNE-SP is one of the prototypes for the DUNE far detector, built and operated at CERN as a charged-particle test beam experiment. A sample of low-energy electrons produced by the decay of cosmic muons is selected with a purity of 95%. This sample is used to calibrate the low-energy electron energy scale with two techniques. An electron energy calibration based on a cosmic-ray muon sample uses calibration constants derived from measured and simulated cosmic-ray muon events. Another calibration technique makes use of the theoretically well-understood Michel electron energy spectrum to convert reconstructed charge to electron energy. In addition, the effects of the detector response on the low-energy electron energy scale and its resolution, including readout electronics threshold effects, are quantified. Finally, the relation between the theoretical and reconstructed low-energy electron energy spectra is derived and the energy resolution is characterized. The low-energy electron selection presented here accounts for about 75% of the total electron deposited energy. After the addition of lost energy using a Monte Carlo simulation, the energy resolution improves from about 40% to 25% at 50 MeV. These results are used to validate the expected capabilities of the DUNE far detector to reconstruct low-energy electrons.
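    As a rough illustration of the Michel-spectrum calibration idea described above (anchoring reconstructed charge to the known kinematic endpoint of the Michel spectrum, about 52.8 MeV), here is a hedged sketch. The toy charge sample, the quantile-based endpoint proxy, and all names are assumptions for illustration, not the paper's procedure.

    ```python
    import numpy as np

    MICHEL_ENDPOINT_MEV = 52.8            # kinematic endpoint of the Michel spectrum

    # Toy sample of reconstructed charge per Michel candidate [electrons].
    reco_charge = np.random.gamma(4.0, 2.5e5, 10_000)

    # Use a high quantile as an endpoint proxy to be robust against outliers,
    # then choose the single scale factor that maps it onto the known endpoint.
    charge_endpoint = np.quantile(reco_charge, 0.99)
    cal_constant = MICHEL_ENDPOINT_MEV / charge_endpoint   # [MeV per electron]

    reco_energy = cal_constant * reco_charge
    print(f"calibration constant: {cal_constant:.3e} MeV/e-")
    ```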

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

    A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova, should one occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the ν_e component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(E_ν) for charged-current ν_e absorption on argon. In the context of a simulated extraction of supernova ν_e spectral parameters from a toy analysis, we investigate the impact of σ(E_ν) modeling uncertainties on DUNE's supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(E_ν) must be substantially reduced before the ν_e flux parameters can be extracted reliably: in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(E_ν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(E_ν). A direct measurement of low-energy ν_e-argon scattering would be invaluable for improving the theoretical precision to the needed level.
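    A sketch of the kind of toy analysis described above, assuming the standard pinched-thermal parameterization of the supernova ν_e spectrum (mean energy and pinching parameter α) and a purely illustrative toy cross section. It shows the core mechanism: if the fit model's cross-section scale is off by 10%, the inferred integrated luminosity absorbs a bias of roughly the same size.

    ```python
    import numpy as np

    def pinched_spectrum(E, E_mean, alpha):
        """Pinched-thermal spectrum, normalized to unit integral on the grid."""
        phi = (E / E_mean) ** alpha * np.exp(-(alpha + 1.0) * E / E_mean)
        return phi / (phi.sum() * (E[1] - E[0]))

    E = np.linspace(1.0, 60.0, 600)           # neutrino energy grid [MeV]
    dE = E[1] - E[0]
    phi = pinched_spectrum(E, E_mean=9.5, alpha=2.5)   # illustrative flux parameters
    sigma_true = 1.0e-43 * (E / 10.0) ** 2             # toy cross section [cm^2]

    # The observed rate follows the true cross section; a fit whose
    # cross-section scale is 10% too high must pull the luminosity down
    # by a matching factor to reproduce the same rate.
    rate_obs = np.sum(phi * sigma_true) * dE
    rate_per_unit_lum = np.sum(phi * 1.10 * sigma_true) * dE
    print(f"luminosity bias: {rate_obs / rate_per_unit_lum - 1.0:+.1%}")  # about -9%
    ```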

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version: the simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
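    The abstract describes writing simulation steps in Python and compiling them to CUDA kernels with Numba. The following is a minimal sketch of that pattern under stated assumptions: a toy electron-drift attenuation step with one GPU thread per ionization segment. The kernel, parameter values, and simplified physics are illustrative, not the actual DUNE near-detector simulator code.

    ```python
    import math
    import numpy as np
    from numba import cuda

    @cuda.jit
    def drift_electrons(z, n_electrons, lifetime, velocity, out):
        # One thread per ionization segment: attenuate the electron count
        # by capture during drift (a toy stand-in for the full microphysics).
        i = cuda.grid(1)
        if i < z.size:
            drift_time = z[i] / velocity                 # [us]
            out[i] = n_electrons[i] * math.exp(-drift_time / lifetime)

    z = np.random.uniform(0.0, 300.0, 100_000).astype(np.float32)  # drift distance [cm]
    n = np.full_like(z, 1.0e4)                                     # electrons per segment
    out = np.zeros_like(z)

    # forall picks a launch configuration with one thread per element.
    drift_electrons.forall(z.size)(z, n, 2.2e3, 0.16, out)   # lifetime [us], v [cm/us]
    ```

    The same function body, written in plain Python with NumPy-style indexing, is what Numba compiles into a CUDA kernel at first call; this is the mechanism that lets the simulation chain stay in Python while running on the GPU.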

    Bourgeois anarchism and authoritarian democracies

    Digital communication is profoundly affecting the constitution of (civil) society by drastically lowering the costs of speaking across time and space with individuals and groups of any size, and by producing abundant records of all activities conducted through these media. This is accelerating two contradictory trends. On the one hand, a new breed of social organizations based on principles of weak cooperation and peer production is sharply expanding the scope of what civil society can achieve. These are voluntary organizations, with flat hierarchies and trust-based principles. They are focused on producing commons-based resources rather than individual property. In general, they are transformative, not revolutionary, in character. This phenomenon is termed "bourgeois anarchism." On the other hand, the liberal state, in a crisis of legitimacy and under pressure from such new organizations, both peaceful (civil society) and violent (terrorism), is reorganizing itself around an increasingly authoritarian core, expanding surveillance into the capillary system of society, overriding civil liberties, and reducing democratic oversight in exchange for the promise of security. This phenomenon is termed "authoritarian democracies."