
    Algorithmic Elections

    Artificial intelligence (AI) has entered election administration. Across the country, election officials are beginning to use AI systems to purge voter records, verify mail-in ballots, and draw district lines. Already, these technologies are having a profound effect on voting rights and democratic processes. However, they have received relatively little attention from AI experts, advocates, and policymakers. Scholars have sounded the alarm on a variety of “algorithmic harms” resulting from AI’s use in the criminal justice system, employment, healthcare, and other civil rights domains. Many of these same algorithmic harms manifest in elections and voting but have been underexplored and remain unaddressed. This Note offers three contributions. First, it documents the various forms of “algorithmic decisionmaking” that are currently present in U.S. elections. This is the most comprehensive survey of AI’s use in elections and voting to date. Second, it explains how algorithmic harms resulting from these technologies are disenfranchising eligible voters and disrupting democratic processes. Finally, it identifies several unique characteristics of the U.S. election administration system that are likely to complicate reform efforts and must be addressed to safeguard voting rights.

    Cross-Document Pattern Matching

    We study a new variant of the string matching problem called cross-document string matching, which is the problem of indexing a collection of documents to support an efficient search for a pattern in a selected document, where the pattern itself is a substring of another document. Several variants of this problem are considered, and efficient linear-space solutions are proposed with query time bounds that either do not depend at all on the pattern size or depend on it in a very limited way (doubly logarithmic). As a side result, we propose an improved solution to the weighted level ancestor problem.
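    To make the query model concrete, here is a brute-force Python illustration (all names are hypothetical; the paper's contribution is a linear-space index that answers such queries in time nearly independent of the pattern length, which this naive scan does not achieve):

```python
# Cross-document string matching, naive version: the pattern is specified as a
# substring docs[j][a:b] of one document and searched for in another selected
# document docs[i]. Helper name and signature are illustrative only.

def cross_doc_find(docs, i, j, a, b):
    """Return the first position of docs[j][a:b] in docs[i], or -1 if absent."""
    pattern = docs[j][a:b]        # the pattern is a substring of document j
    return docs[i].find(pattern)  # brute-force scan of document i

docs = ["banana", "bandana"]
# pattern = docs[1][0:3] = "ban", searched in docs[0] = "banana"
print(cross_doc_find(docs, 0, 1, 0, 3))  # -> 0
```

The indexed solutions in the paper avoid ever materialising the pattern, which is what lets the query time shed its dependence on the pattern size.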

    Cache-Oblivious Persistence

    Partial persistence is a general transformation that takes a data structure and allows queries to be executed on any past state of the structure. The cache-oblivious model is the leading model of a modern multi-level memory hierarchy. We present the first general transformation for making cache-oblivious data structures partially persistent.

    Two-dimensional one-component plasma on a Flamm's paraboloid

    We study the classical non-relativistic two-dimensional one-component plasma at Coulomb coupling Gamma = 2 on the Riemannian surface known as Flamm's paraboloid, which is obtained from the spatial part of the Schwarzschild metric. At this special value of the coupling constant, the statistical mechanics of the system is exactly solvable analytically. The asymptotic expansion of the Helmholtz free energy for a large system has been found. The density of the plasma, in the thermodynamic limit, has been carefully studied in various situations.

    Compressed Subsequence Matching and Packed Tree Coloring

    We present a new algorithm for subsequence matching in grammar compressed strings. Given a grammar of size n compressing a string of size N and a pattern string of size m over an alphabet of size σ, our algorithm uses O(n + nσ/w) space and O(n + nσ/w + m log N log w · occ) or O(n + (nσ/w) log w + m log N · occ) time. Here w is the word size and occ is the number of occurrences of the pattern. Our algorithm uses less space than previous algorithms and is also faster for occ = o(n/log N) occurrences. The algorithm uses a new data structure that allows us to efficiently find the next occurrence of a given character after a given position in a compressed string. This data structure is in turn based on a new data structure for the tree color problem, where the node colors are packed in bit strings. Comment: To appear at CPM '1
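    The "next occurrence after a given position" primitive is easy to picture on an uncompressed string. The Python sketch below (hypothetical names; a deliberate simplification that ignores the grammar compression, which is the hard part of the paper) shows how repeated next-occurrence queries drive greedy subsequence matching:

```python
# Per-character sorted position lists plus binary search give the
# next-occurrence query; greedy scanning with it decides subsequence matching.
import bisect
from collections import defaultdict

class NextOccurrence:
    def __init__(self, s):
        self.pos = defaultdict(list)      # char -> sorted list of positions
        for i, c in enumerate(s):
            self.pos[c].append(i)

    def next(self, c, p):
        """Smallest index of character c strictly greater than p, or -1."""
        lst = self.pos.get(c, [])
        k = bisect.bisect_right(lst, p)
        return lst[k] if k < len(lst) else -1

def match_subsequence(nx, pattern):
    """Greedy left-to-right subsequence test driven by next-occurrence queries."""
    p = -1
    for c in pattern:
        p = nx.next(c, p)
        if p == -1:
            return False
    return True

nx = NextOccurrence("abracadabra")
print(nx.next("a", 0))                # -> 3
print(match_subsequence(nx, "abd"))   # -> True
```

The paper's contribution is supporting this same query directly on the grammar-compressed representation, with the tree-color structure packing node colors into machine words to stay within O(n + nσ/w) space.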

    Supersymmetric solutions of PT-/non-PT-symmetric and non-Hermitian Screened Coulomb potential via Hamiltonian hierarchy inspired variational method

    The supersymmetric solutions of PT-symmetric and Hermitian/non-Hermitian forms of quantum systems are obtained by solving the Schrödinger equation for the Exponential-Cosine Screened Coulomb potential. The Hamiltonian hierarchy inspired variational method is used to obtain the approximate energy eigenvalues and corresponding wave functions. Comment: 13 pages

    Testing The Asteroseismic Scaling Relations For Red Giants With Eclipsing Binaries Observed By Kepler

    Given the potential of ensemble asteroseismology for understanding fundamental properties of large numbers of stars, it is critical to determine the accuracy of the scaling relations on which these measurements are based. From several powerful validation techniques, all indications so far show that stellar radius estimates from the asteroseismic scaling relations are accurate to within a few percent. Eclipsing binary systems hosting at least one star with detectable solar-like oscillations constitute the ideal test objects for validating asteroseismic radius and mass inferences. By combining radial velocity (RV) measurements and photometric time series of eclipses, it is possible to determine the masses and radii of each component of a double-lined spectroscopic binary. We report the results of a four-year RV survey performed with the échelle spectrometer of the Astrophysical Research Consortium's 3.5 m telescope and the APOGEE spectrometer at Apache Point Observatory. We compare the masses and radii of 10 red giants (RGs) obtained by combining radial velocities and eclipse photometry with the estimates from the asteroseismic scaling relations. We find that the asteroseismic scaling relations overestimate RG radii by about 5% on average and masses by about 15% for stars at various stages of RG evolution. Systematic overestimation of mass leads to underestimation of stellar age, which can have important implications for ensemble asteroseismology used for Galactic studies. As part of a second objective, where asteroseismology is used for understanding binary systems, we confirm that oscillations of RGs in close binaries can be suppressed enough to be undetectable, a hypothesis that was proposed in a previous work.
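    For reference, the scaling relations under test are the standard solar-calibrated ones relating the global oscillation observables (frequency of maximum power ν_max and large frequency separation Δν) to stellar mass and radius; this is the textbook form, not a quotation from the paper:

```latex
\frac{M}{M_\odot} \simeq
  \left(\frac{\nu_{\mathrm{max}}}{\nu_{\mathrm{max},\odot}}\right)^{3}
  \left(\frac{\Delta\nu}{\Delta\nu_{\odot}}\right)^{-4}
  \left(\frac{T_{\mathrm{eff}}}{T_{\mathrm{eff},\odot}}\right)^{3/2},
\qquad
\frac{R}{R_\odot} \simeq
  \left(\frac{\nu_{\mathrm{max}}}{\nu_{\mathrm{max},\odot}}\right)
  \left(\frac{\Delta\nu}{\Delta\nu_{\odot}}\right)^{-2}
  \left(\frac{T_{\mathrm{eff}}}{T_{\mathrm{eff},\odot}}\right)^{1/2}
```

    Because mass depends on the observables to the third and fourth powers while radius depends only on the first and second, the same measurement biases propagate roughly three times more strongly into mass, consistent with the ~15% mass versus ~5% radius offsets reported above.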

    Can forest management based on natural disturbances maintain ecological resilience?

    Given the increasingly global stresses on forests, many ecologists argue that managers must maintain ecological resilience: the capacity of ecosystems to absorb disturbances without undergoing fundamental change. In this review we ask: Can the emerging paradigm of natural-disturbance-based management (NDBM) maintain ecological resilience in managed forests? Applying resilience theory requires careful articulation of the ecosystem state under consideration, the disturbances and stresses that affect the persistence of possible alternative states, and the spatial and temporal scales of management relevance. Implementing NDBM while maintaining resilience means recognizing that (i) biodiversity is important for long-term ecosystem persistence, (ii) natural disturbances play a critical role as a generator of structural and compositional heterogeneity at multiple scales, and (iii) traditional management tends to produce forests more homogeneous than those disturbed naturally and increases the likelihood of unexpected catastrophic change by constraining variation of key environmental processes. NDBM may maintain resilience if silvicultural strategies retain the structures and processes that perpetuate desired states while reducing those that enhance resilience of undesirable states. Such strategies require an understanding of harvesting impacts on slow ecosystem processes, such as seed-bank or nutrient dynamics, which in the long term can lead to ecological surprises by altering the forest's capacity to reorganize after disturbance.

    GABRIELA : a new detector array for gamma-ray and conversion electron spectroscopy of transfermium elements

    With the aid of the Geant4 Monte Carlo simulation package, a new detection system has been designed for the focal plane of the recoil separator VASSILISSA situated at the Flerov Laboratory of Nuclear Reactions, JINR, Dubna. GABRIELA (Gamma Alpha Beta Recoil Investigations with the Electromagnetic Analyser VASSILISSA) has been optimised to detect the arrival of reaction products and their subsequent radioactive decays involving the emission of alpha- and beta-particles, fission fragments, gamma- and X-rays, and conversion electrons. The new detector system is described and the results of the first commissioning experiments are presented. Comment: 24 pages, Submitted to NIM

    A 6% measurement of the Hubble parameter at z~0.45 : direct evidence of the epoch of cosmic re-acceleration

    MM, LP and AC acknowledge financial contributions by grants ASI/INAF I/023/12/0 and PRIN MIUR 2010-2011 "The dark Universe and the cosmic evolution of baryons: from current surveys to Euclid". RJ and LV thank the Royal Society for financial support and the ICIC at Imperial College for hospitality while this work was being completed. LV is supported by the European Research Council under the European Community's Seventh Framework Programme FP7-IDEAS-Phys.LSS 240117. Funding for this work was partially provided by the Spanish MINECO under projects AYA2014-58747-P and MDM-2014-0369 of ICCUB (Unidad de Excelencia "Maria de Maeztu"). Funding for SDSS-III has been provided by the Alfred P. Sloan Foundation, the Participating Institutions, the National Science Foundation, and the U.S. Department of Energy Office of Science.

    Deriving the expansion history of the Universe is a major goal of modern cosmology. To date, the most accurate measurements have been obtained with Type Ia Supernovae (SNe) and Baryon Acoustic Oscillations (BAO), providing evidence for the existence of a transition epoch at which the expansion rate changes from decelerated to accelerated. However, these results have been obtained within the framework of specific cosmological models that must be implicitly or explicitly assumed in the measurement. It is therefore crucial to obtain measurements of the accelerated expansion of the Universe independently of assumptions on cosmological models. Here we exploit the unprecedented statistics provided by the Baryon Oscillation Spectroscopic Survey (BOSS, [1-3]) Data Release 9 to provide new constraints on the Hubble parameter H(z) using the cosmic chronometers approach. We extract a sample of more than 130,000 of the most massive and passively evolving galaxies, obtaining five new cosmology-independent H(z) measurements in the redshift range 0.3 < z < 0.5, with an accuracy of ~11–16% incorporating both statistical and systematic errors. Once combined, these measurements yield a 6% accuracy constraint of H(z = 0.4293) = 91.8 ± 5.3 km/s/Mpc. The new data are crucial to provide the first cosmology-independent determination of the transition redshift at high statistical significance, measuring zt = 0.4 ± 0.1, and to significantly disfavor the null hypothesis of no transition between decelerated and accelerated expansion at the 99.9% confidence level. This analysis highlights the wide potential of the cosmic chronometers approach: it allows us to derive constraints on the expansion history of the Universe that are competitive with those from standard probes and, most importantly, because the estimates are independent of the cosmological model, it can constrain cosmologies beyond (and including) the ΛCDM model. Postprint. Peer reviewed.
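    The cosmic chronometers method rests on the differential-age relation H(z) = -1/(1+z) · dz/dt: the redshifts of two bins of passively evolving galaxies plus the difference of their stellar ages give H at the midpoint redshift, with no cosmological model assumed. A minimal numerical sketch, using illustrative numbers rather than the paper's data:

```python
# Cosmic-chronometers estimator H(z) = -1/(1+z) * dz/dt.
# Ages in Gyr, H returned in km/s/Mpc. Numbers below are illustrative only.

GYR_TO_S = 3.156e16    # seconds per gigayear
MPC_TO_KM = 3.086e19   # kilometres per megaparsec

def hubble_cc(z1, age1, z2, age2):
    """Differential-age H at the midpoint redshift of two passive-galaxy bins."""
    z_mid = 0.5 * (z1 + z2)
    dz_dt = (z2 - z1) / ((age2 - age1) * GYR_TO_S)  # 1/s; ages shrink as z grows
    h_si = -dz_dt / (1.0 + z_mid)                   # 1/s
    return h_si * MPC_TO_KM                         # km/s/Mpc

# Example: ~0.9 Gyr of differential age between bins at z = 0.38 and z = 0.48
print(round(hubble_cc(0.38, 9.1, 0.48, 8.2), 1))  # -> 76.0
```

Because only the age *difference* between nearby redshift bins enters, systematics that shift all absolute ages together largely cancel, which is what makes the resulting H(z) values cosmology-independent.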