
    The Crisis of Actually Existing Socialism (La crise du socialisme réellement existant)

    The Soviet Union and its East European glacis are experiencing their second major wave of crisis since the death of Stalin. But unlike the "de-Stalinization" crisis of the second half of the 1950s, which was largely political and was overcome through a combination of repression and reform that left the system basically intact, the present crisis cannot be weathered so easily and threatens to usher in a period of political upheaval, as it already has in Poland and, to some extent, in Romania.

    On the most obvious level, the crisis manifests itself in the constant decline of the economic growth rate since the late 1950s, which has put an end to the slow but steady rise in living standards, the basis upon which the tacit post-Stalin accord between the bureaucracy and society was founded. The roots of this crisis are deeply structural, but structural reforms, in particular the introduction of the "regulated market mechanism", which appears to be the only viable alternative open to the bureaucracy, will meet with strong opposition from important sectors of that elite, especially the provincial party bosses, and threaten to create a split in its ranks.

    At the same time, such a reform is politically unfeasible without important concessions to the working class in the direction of democratization or, at the least, the right to organize into independent trade unions to protect itself against management, whose powers would be greatly enhanced by the reform. But such concessions to the working class, as Poland shows, are perceived by the bureaucracy as a threat to its very existence. Moreover, the working class today is potentially a much more formidable political force than at any time since the civil war. The leadership is, therefore, in a dilemma. The 1980s are likely to see the explosive combination of a simultaneous crisis "at the top" and "at the bottom".

    Communicating Uncertainty in Warning Intelligence

    On November 23, 2021, Dr. David Mandel presented Communicating Uncertainty in Warning Intelligence at the 2021 CASIS West Coast Security Conference. The primary concepts of Dr. Mandel's presentation centered on the use of verbal versus numeric probabilities, the variability in how verbal probabilities are understood, and the relationship between confidence levels and event probabilities. Dr. Mandel's presentation was followed by a question-and-answer period with a group of panelists, allowing the audience and CASIS Vancouver executives to directly engage with the content of each speaker's presentation.
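    The contrast between verbal and numeric probabilities can be made concrete with a lexicon that maps phrases to numeric ranges, as in standardized intelligence-community or IPCC-style schemes. The specific phrases and ranges below are illustrative assumptions, not taken from Dr. Mandel's presentation:

```python
# Illustrative verbal-probability lexicon. The phrase-to-range mapping here
# is an assumed example; real lexicons (e.g., NATO or IPCC conventions)
# differ in both phrases and cutoffs.
VERBAL_TO_RANGE = {
    "highly unlikely": (0.00, 0.10),
    "unlikely":        (0.10, 0.40),
    "even chance":     (0.40, 0.60),
    "likely":          (0.60, 0.90),
    "highly likely":   (0.90, 1.00),
}

def consistent(phrase: str, numeric_probability: float) -> bool:
    """Return True if a numeric probability falls inside the range
    conventionally associated with the verbal phrase."""
    lo, hi = VERBAL_TO_RANGE[phrase]
    return lo <= numeric_probability <= hi
```

    One source of the variability discussed in the talk is that different readers assign different ranges to the same phrase, so two analysts can both say "likely" while meaning materially different probabilities.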

    "Perestroika" and Women Workers


    Artificial General Intelligence, Existential Risk, and Human Risk Perception

    Artificial general intelligence (AGI) does not yet exist, but given the pace of technological development in artificial intelligence, it is projected to reach human-level intelligence within roughly the next two decades. After that, many experts expect it to far surpass human intelligence, and to do so rapidly. The prospect of superintelligent AGI poses an existential risk to humans because there is no reliable method for ensuring that AGI goals stay aligned with human goals. Drawing on publicly available forecaster and opinion data, the author examines how experts and non-experts perceive risk from AGI. The findings indicate that the perceived risk of a world catastrophe or extinction from AGI is greater than for other existential risks. The increase in perceived risk over the last year is also steeper for AGI than for other existential threats (e.g., nuclear war or human-caused climate change). That AGI is a pressing existential risk is something on which experts and non-experts agree, but the basis for such agreement currently remains obscure.

    Constructing A Flexible Likelihood Function For Spectroscopic Inference

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that are commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line "outliers." By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
    Comment: Accepted to ApJ. Incorporated referees' comments. New figures 1, 8, 10, 12, and 14. Supplemental website: http://iancze.github.io/Starfish
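    The core idea of a covariance-aware spectroscopic likelihood can be sketched as follows. This is a minimal illustration of evaluating a Gaussian likelihood of a residual spectrum under a covariance matrix built from a diagonal noise term plus a global squared-exponential kernel; it is not the Starfish implementation, which additionally uses local kernels at outlier lines and a different global kernel form:

```python
import numpy as np

def log_likelihood(residual, wavelengths, sigma, a_global, ell):
    """Gaussian log-likelihood of a residual spectrum (data minus model).

    Covariance sketch: C = diag(sigma^2) + a_global^2 * exp(-dlambda^2 / (2 ell^2)),
    i.e., per-pixel noise plus a global kernel that correlates nearby pixels.
    The paper's local "outlier" kernels are omitted here for brevity.
    """
    n = len(residual)
    # Pairwise wavelength separations between pixels
    dl = wavelengths[:, None] - wavelengths[None, :]
    C = np.diag(sigma**2) + a_global**2 * np.exp(-0.5 * dl**2 / ell**2)
    _, logdet = np.linalg.slogdet(C)          # stable log-determinant
    alpha = np.linalg.solve(C, residual)      # C^{-1} r without explicit inverse
    return -0.5 * (residual @ alpha + logdet + n * np.log(2 * np.pi))
```

    When `a_global = 0` this reduces to the familiar independent-pixel (chi-squared) likelihood; turning the kernel on inflates the effective uncertainty for correlated residual structure, which is what prevents the underestimated parameter uncertainties described above.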

    The Treatment Versus Experimentation Dilemma in Dose-finding Studies

    Phase I clinical trials are conducted in order to find the maximum tolerated dose (MTD) of a given drug from a finite set of doses. For ethical reasons, these studies are usually sequential, treating patients or groups of patients with the best available dose according to the current knowledge. However, it is proved here that such designs, and, more generally, designs that concentrate on one dose from some time on, cannot provide consistent estimators for the MTD unless very strong parametric assumptions hold. We describe a family of sequential designs that treat individuals with one of the two closest doses to the estimated MTD, and prove that such designs, under general conditions, concentrate eventually on the two closest doses to the MTD and estimate the MTD consistently. It is shown that this family contains randomized designs that assign the MTD with probability that approaches 1 as the size of the experiment goes to infinity. We compare several designs by simulations, studying their performances in terms of correct estimation of the MTD and the proportion of individuals treated with the MTD.
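    The two-closest-doses idea can be illustrated with a toy simulation. The design below is an assumed sketch in the spirit of the abstract, not the authors' exact procedure: after each patient, per-dose toxicity is estimated empirically (with a weak prior to handle unvisited doses), and the next patient is randomized between the two doses whose estimated toxicity is closest to the target rate:

```python
import random

def run_trial(true_tox, target, n_patients, seed=0):
    """Toy sequential dose-finding design (illustrative, not the paper's).

    true_tox: true toxicity probability of each dose (unknown in practice).
    target:   target toxicity rate defining the MTD.
    Returns the index of the estimated MTD and per-dose patient counts.
    """
    rng = random.Random(seed)
    k = len(true_tox)
    n = [0] * k       # patients treated at each dose
    tox = [0] * k     # toxicities observed at each dose
    dose = 0          # conventionally start at the lowest dose
    for _ in range(n_patients):
        n[dose] += 1
        if rng.random() < true_tox[dose]:
            tox[dose] += 1
        # Posterior-mean-style estimate; the +0.5/+1.0 pseudo-counts give
        # unvisited doses an estimate of 0.5 instead of dividing by zero.
        est = [(tox[i] + 0.5) / (n[i] + 1.0) for i in range(k)]
        # Randomize the next patient between the two doses whose estimated
        # toxicity is closest to the target -- the "two closest doses" idea.
        order = sorted(range(k), key=lambda i: abs(est[i] - target))
        dose = rng.choice(order[:2])
    best = min(range(k), key=lambda i: abs(est[i] - target))
    return best, n
```

    Because assignment is spread over two doses rather than locked onto one, the design keeps experimenting near the MTD while still treating most patients at nearly optimal doses, which is the treatment-versus-experimentation trade-off the paper analyzes.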

    Formation of the first three gravitational-wave observations through isolated binary evolution

    During its first 4 months of taking data, Advanced LIGO has detected gravitational waves from two binary black hole mergers, GW150914 and GW151226, along with the statistically less significant binary black hole merger candidate LVT151012. We use our rapid binary population synthesis code COMPAS to show that all three events can be explained by a single evolutionary channel -- classical isolated binary evolution via mass transfer including a common envelope phase. We show all three events could have formed in low-metallicity environments (Z = 0.001) from progenitor binaries with typical total masses ≳ 160 M⊙, ≳ 60 M⊙, and ≳ 90 M⊙ for GW150914, GW151226, and LVT151012, respectively.
    Comment: Published in Nature Communications