
    Beyond opening up the black box: Investigating the role of algorithmic systems in Wikipedian organizational culture

    Scholars and practitioners across domains are increasingly concerned with algorithmic transparency and opacity, interrogating the values and assumptions embedded in automated, black-boxed systems, particularly in user-generated content platforms. I report from an ethnography of infrastructure in Wikipedia to discuss an often understudied aspect of this topic: the local, contextual, learned expertise involved in participating in a highly automated socio-technical environment. Today, the organizational culture of Wikipedia is deeply intertwined with various data-driven algorithmic systems, which Wikipedians rely on to help manage and govern the "anyone can edit" encyclopedia at a massive scale. These bots, scripts, tools, plugins, and dashboards make Wikipedia more efficient for those who know how to work with them, but, like all organizational culture, they must be learned by newcomers who want to fully participate. I illustrate how cultural and organizational expertise is enacted around algorithmic agents through two autoethnographic vignettes drawn from my personal experience as a Wikipedia veteran. I present thick descriptions of how governance and gatekeeping practices are articulated through, and in alignment with, these automated infrastructures. Over the past 15 years, Wikipedian veterans and administrators have made specific decisions to support administrative and editorial workflows with automation in particular ways and not others. I use these cases of Wikipedia's bot-supported bureaucracy to discuss several issues in the fields of critical algorithm studies, critical data studies, and fairness, accountability, and transparency in machine learning -- most principally arguing that scholarship and practice must go beyond trying to "open up the black box" of such systems and also examine sociocultural processes like newcomer socialization.
    Comment: 14 pages, typo fixed in v

    Randomized Algorithms for the Loop Cutset Problem

    We show how to find a minimum weight loop cutset in a Bayesian network with high probability. Finding such a loop cutset is the first step in the method of conditioning for inference. Our randomized algorithm for finding a loop cutset outputs a minimum loop cutset after O(c · 6^k · kn) steps with probability at least 1 - (1 - 1/6^k)^(c·6^k), where c > 1 is a constant specified by the user, k is the minimal size of a minimum weight loop cutset, and n is the number of vertices. We also show empirically that a variant of this algorithm often finds a loop cutset that is closer to the minimum weight loop cutset than the ones found by the best known deterministic algorithms.
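    A minimal sketch of the random-restart idea behind such algorithms (an unweighted feedback-vertex-set toy on the network's undirected skeleton, with hypothetical helper names, not the authors' exact weighted procedure): each trial strips vertices that lie on no cycle, then repeatedly picks a vertex with probability proportional to its degree (equivalent to picking a random endpoint of a random edge) as a cutset candidate, and the smallest cutset over many trials is kept.

        import random

        def reduce_graph(adj):
            """Repeatedly delete degree <= 1 vertices; they lie on no cycle."""
            changed = True
            while changed:
                changed = False
                for v in list(adj):
                    if len(adj[v]) <= 1:
                        for u in adj[v]:
                            adj[u].discard(v)
                        del adj[v]
                        changed = True

        def random_cutset(adj):
            """One randomized trial: a vertex set whose removal leaves the graph acyclic."""
            adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
            cutset = set()
            reduce_graph(adj)
            while adj:  # a nonempty reduced graph has min degree >= 2, hence a cycle
                pool = [v for v in adj for _ in range(len(adj[v]))]  # degree-biased pick
                v = random.choice(pool)
                cutset.add(v)
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                reduce_graph(adj)
            return cutset

        def min_loop_cutset(adj, trials=1000):
            """Keep the smallest cutset found over many independent trials."""
            return min((random_cutset(adj) for _ in range(trials)), key=len)

    Repeating the trial on the order of c · 6^k times is what turns a per-trial success probability of at least 1/6^k into the overall bound quoted above.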

    Flash of photons from the early stage of heavy-ion collisions

    The dynamics of partonic cascades may be an important aspect of particle production in relativistic collisions of nuclei at CERN SPS and BNL RHIC energies. Within the Parton-Cascade Model, we estimate the production of single photons from such cascades due to scattering of quarks and gluons q g -> q gamma, quark-antiquark annihilation q qbar -> g gamma or gamma gamma, and electromagnetic bremsstrahlung of quarks q -> q gamma. We find that the latter QED branching process plays the dominant role for photon production, just as the QCD branchings q -> q g and g -> g g play a crucial role for parton multiplication. We conclude therefore that photons accompanying the parton cascade evolution during the early stage of heavy-ion collisions shed light on the formation of a partonic plasma.
    Comment: 4 pages including 3 postscript figures
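    Written out, the three production channels named above are (the channel labels are added for readability; the "dominant" tag follows the abstract's conclusion):

        \begin{align*}
          q\,g      &\to q\,\gamma                              && \text{(QCD Compton scattering)} \\
          q\,\bar q &\to g\,\gamma \ \text{or}\ \gamma\,\gamma  && \text{(quark-antiquark annihilation)} \\
          q         &\to q\,\gamma                              && \text{(electromagnetic bremsstrahlung; dominant)}
        \end{align*}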

    Field validation of a competitive ELISA for the detection of contagious bovine pleuropneumonia in Botswana

    The competitive ELISA recently described for detecting contagious bovine pleuropneumonia was used at the Botswana National Veterinary Laboratory in Gaborone. The serum sample set included a significant number of sera collected during the 1995 epizootic and again in 1998, after the entire cattle population of the infected zone had been slaughtered. The results showed the excellent specificity of the test, with only one negative serum out of 895 having a titre slightly above the positivity cutoff. A comparison with two other tests, complement fixation and slide agglutination, on sera collected during the 1995 pleuropneumonia epizootic showed that the three tests had equivalent sensitivities. The main advantages of the competitive ELISA are its specificity, its reproducibility, and the reliable quality control it allows when used with internationally developed software such as the ELISA Data Interchange (EDI). (Author's abstract)
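    Taking the figures quoted above at face value (895 sera from the post-slaughter, presumed disease-free population, of which a single serum scored above the cutoff), the implied specificity works out to:

        \mathrm{specificity} = \frac{895 - 1}{895} = \frac{894}{895} \approx 0.9989 \;(\approx 99.9\%)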

    Advanced recovery systems wind tunnel test report

    Pioneer Aerospace Corporation (PAC) conducted parafoil wind tunnel testing in the NASA Ames 80- by 120-foot test section of the National Full-Scale Aerodynamics Complex, Moffett Field, CA. The investigation was conducted to determine the aerodynamic characteristics of two scale ram-air wings in support of air drop testing and full-scale development of Advanced Recovery Systems for the Next Generation Space Transportation System. Two models were tested during this investigation. Both the primary test article, a 1/9 geometric scale model with a wing area of 1200 square feet, and the secondary test article, a 1/36 geometric scale model with a wing area of 300 square feet, had an aspect ratio of 3. The test results show that both models were statically stable about a model reference point at angles of attack from 2 to 10 degrees. The maximum lift-drag ratio decreased from 2.9 to 2.4 with increasing wing loading.
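    Assuming the standard definition of aspect ratio, AR = b^2/S, the quoted areas and aspect ratio fix the spans of the two models (the subscripts below label the 1/9 and 1/36 scale articles):

        AR = \frac{b^2}{S} \;\Rightarrow\; b = \sqrt{AR \cdot S}:
        \qquad b_{1/9} = \sqrt{3 \times 1200\,\mathrm{ft}^2} = 60\,\mathrm{ft},
        \qquad b_{1/36} = \sqrt{3 \times 300\,\mathrm{ft}^2} = 30\,\mathrm{ft}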

    Stochastic Yield Catastrophes and Robustness in Self-Assembly

    A guiding principle in self-assembly is that, for high production yield, nucleation of structures must be significantly slower than their growth. However, details of the mechanism that impedes nucleation are broadly considered irrelevant. Here, we analyze self-assembly into finite-sized target structures using mathematical modeling. We investigate two key scenarios for delaying nucleation: (i) introducing a slow activation step for the assembling constituents, and (ii) decreasing the dimerization rate. These scenarios have widely different characteristics. While the dimerization scenario exhibits robust behavior, the activation scenario is highly sensitive to demographic fluctuations, which ultimately disfavor growth relative to nucleation and can suppress the yield completely. The occurrence of this stochastic yield catastrophe does not depend on model details but is generic as soon as number fluctuations between constituents are taken into account. From a broader perspective, our results reveal that stochasticity is an important limiting factor for self-assembly and that the specific implementation of the nucleation process plays a significant role in determining the yield.
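    A minimal stochastic (Gillespie-type) sketch of the activation scenario, in the spirit of the setup described above but not the paper's actual model; all names and rate constants are illustrative. Since only the final yield matters, the sketch simulates the jump chain and ignores waiting times.

        import random

        def gillespie_yield(N=1000, L=10, alpha=0.05, mu=1e-4, nu=1.0, seed=0):
            """Toy self-assembly with a slow activation step.
            N: inactive monomers, L: target structure size,
            alpha: activation rate per monomer,
            mu: dimerization (nucleation) rate constant,
            nu: growth (attachment) rate constant per polymer."""
            rng = random.Random(seed)
            inactive, active, finished = N, 0, 0
            polymers = []  # sizes of growing structures, 2 <= size < L
            while True:
                r_act  = alpha * inactive
                r_nuc  = mu * active * (active - 1) / 2
                r_grow = nu * active * len(polymers)
                total = r_act + r_nuc + r_grow
                if total == 0:
                    break  # absorbing state; leftover polymers stay incomplete
                u = rng.random() * total
                if u < r_act:                        # activation
                    inactive -= 1; active += 1
                elif u < r_act + r_nuc:              # nucleation (dimerization)
                    active -= 2; polymers.append(2)
                else:                                # growth of a random polymer
                    i = rng.randrange(len(polymers))
                    active -= 1; polymers[i] += 1
                    if polymers[i] == L:
                        finished += 1; polymers.pop(i)
            return finished * L / N  # fraction of monomers in finished structures

    Comparing runs with small alpha (slow activation) against runs with large alpha but small mu (slow dimerization), over many seeds, is one way to probe the robustness contrast the abstract describes.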

    Heavy resonance production in high energy nuclear collisions

    We estimate freezeout conditions for s sbar, c cbar, and b bbar quark pairs in high energy nuclear collisions. Freezeout is due either to loss of thermal contact or to particles "wandering" out of the region of hot matter. We then develop a thermal recombination model in which both single-particle (quark and antiquark) and two-particle (quark-antiquark) densities are conserved. Conservation of two-particle densities is necessary because quarks and antiquarks are always produced in coincidence, so that the local two-particle density can be much larger than the product of the single-particle densities. We use the freezeout conditions and recombination model to discuss heavy resonance production at zero baryon density in high energy nuclear collisions.
    Comment: revtex, 15 pages, no figures, KSUCNR-009-9
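    Schematically, the two-particle density argument can be stated as follows (the notation here is assumed for illustration, not taken from the paper): because each heavy quark is created together with its antiquark, the local pair density exceeds the product of the singles densities, and the recombination yield must be computed from the former.

        n_{q\bar q}(x) \;\gg\; n_q(x)\, n_{\bar q}(x)
        \quad\Longrightarrow\quad
        N_{\mathrm{res}} \propto \int d^3x \; n_{q\bar q}(x)
        \;\neq\; \int d^3x \; n_q(x)\, n_{\bar q}(x)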

    MoodBar: Increasing new user retention in Wikipedia through lightweight socialization

    Socialization in online communities allows existing members to welcome and recruit newcomers, introduce them to community norms and practices, and sustain their early participation. However, socializing newcomers does not come for free: in large communities, socialization can result in a significant workload for mentors and is hard to scale. In this study we present results from an experiment that measured the effect of a lightweight socialization tool on the activity and retention of newly registered users attempting to edit Wikipedia for the first time. Wikipedia is struggling with the retention of newcomers, and our results indicate that a mechanism to elicit lightweight feedback and to provide early mentoring to newcomers improves their chances of becoming long-term contributors.
    Comment: 9 pages, 5 figures, accepted for presentation at CSCW'1
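    A minimal sketch of how a retention effect of this kind can be tested, using a standard two-proportion z-test; the counts in the example are hypothetical placeholders, not the paper's data.

        from math import sqrt

        def retention_uplift(treat_retained, treat_n, ctrl_retained, ctrl_n):
            """Difference in retention rates and its two-proportion z-score."""
            p1 = treat_retained / treat_n
            p2 = ctrl_retained / ctrl_n
            p  = (treat_retained + ctrl_retained) / (treat_n + ctrl_n)  # pooled rate
            se = sqrt(p * (1 - p) * (1 / treat_n + 1 / ctrl_n))
            return p1 - p2, (p1 - p2) / se

        # Hypothetical example: 120/2000 newcomers retained with the tool
        # vs 90/2000 in the control group.
        uplift, z = retention_uplift(120, 2000, 90, 2000)
        print(f"uplift = {uplift:.2%}, z = {z:.2f}")  # z > 1.96 -> significant at 5%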

    Analysis of reaction dynamics at RHIC in a combined parton/hadron transport approach

    We introduce a transport approach which combines partonic and hadronic degrees of freedom on an equal footing and discuss the resulting reaction dynamics. The initial parton dynamics is modeled in the framework of the parton cascade model, hadronization is performed via a cluster hadronization model and configuration-space coalescence, and the hadronic phase is described by a microscopic hadronic transport approach. The resulting reaction dynamics indicates a strong influence of hadronic rescattering on the space-time pattern of hadronic freeze-out and on the shape of transverse mass spectra. Freeze-out times and transverse radii increase by factors of 2-3, depending on the hadron species.
    Comment: 10 pages, 4 eps figures included