
    Bond-Propagation Algorithm for Thermodynamic Functions in General 2D Ising Models

    Recently, we developed and implemented the bond-propagation algorithm for calculating the partition function and correlation functions of random bond Ising models in two dimensions. The algorithm is the fastest available for calculating these quantities near the percolation threshold. In this paper, we show how to extend the bond-propagation algorithm to directly calculate thermodynamic functions by applying the algorithm to derivatives of the partition function, and we derive explicit expressions for this transformation. We also discuss variations of the original bond-propagation procedure within the larger context of Y-Delta-Y reducibility and discuss the relation of this class of algorithms to other algorithms developed for Ising systems. We conclude with a discussion of the outlook for applying similar algorithms to other models.
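
    As a rough illustration of the identity the paper builds on, namely that thermodynamic functions are derivatives of the partition function (e.g. $U = -\partial \ln Z / \partial \beta$ and $C = \beta^2\,\mathrm{Var}(E)$ in units where $k_B = 1$), the sketch below evaluates them by brute-force enumeration on a tiny uniform-coupling lattice. This is not the bond-propagation algorithm itself; the lattice size and coupling are made up for the example.

```python
import itertools
import math

# Brute-force partition function for a tiny 2D Ising model with open
# boundaries. Illustrates only the derivative identities the paper
# exploits (U = -d ln Z / d beta, C = beta^2 Var(E)); it is NOT the
# bond-propagation algorithm, which avoids this exponential enumeration.

L = 3    # lattice side; 2^(L*L) = 512 configurations, cheap to enumerate
J = 1.0  # uniform coupling (the paper treats random bonds J_ij)

def energy(spins):
    """E = -J * sum of s_i * s_j over nearest-neighbour pairs."""
    e = 0.0
    for i in range(L):
        for j in range(L):
            s = spins[i * L + j]
            if i + 1 < L:
                e -= J * s * spins[(i + 1) * L + j]
            if j + 1 < L:
                e -= J * s * spins[i * L + j + 1]
    return e

def thermo(beta):
    """Return (ln Z, U, C) at inverse temperature beta by enumeration."""
    z = e_sum = e2_sum = 0.0
    for spins in itertools.product((-1, 1), repeat=L * L):
        e = energy(spins)
        w = math.exp(-beta * e)
        z += w
        e_sum += e * w
        e2_sum += e * e * w
    u = e_sum / z                       # U = <E> = -d ln Z / d beta
    c = beta**2 * (e2_sum / z - u * u)  # C = beta^2 * Var(E)
    return math.log(z), u, c

print(thermo(0.4))
```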

    High rate locally-correctable and locally-testable codes with sub-polynomial query complexity

    In this work, we construct the first locally-correctable codes (LCCs) and locally-testable codes (LTCs) with constant rate, constant relative distance, and sub-polynomial query complexity. Specifically, we show that there exist binary LCCs and LTCs with block length $n$, constant rate (which can even be taken arbitrarily close to 1), constant relative distance, and query complexity $\exp(\tilde{O}(\sqrt{\log n}))$. Previously such codes were known to exist only with $\Omega(n^{\beta})$ query complexity (for constant $\beta > 0$), and there were several, quite different, constructions known. Our codes are based on a general distance-amplification method of Alon and Luby. We show that this method interacts well with local correctors and testers, and obtain our main results by applying it to suitably constructed LCCs and LTCs in the non-standard regime of sub-constant relative distance. Along the way, we also construct LCCs and LTCs over large alphabets, with the same query complexity $\exp(\tilde{O}(\sqrt{\log n}))$, which additionally have the property of approaching the Singleton bound: they have almost the best-possible relationship between their rate and distance. This has the surprising consequence that asking for a large-alphabet error-correcting code to further be an LCC or LTC with $\exp(\tilde{O}(\sqrt{\log n}))$ query complexity does not require any sacrifice in terms of rate and distance! Such a result was previously not known for any $o(n)$ query complexity. Our results on LCCs also immediately give locally-decodable codes (LDCs) with the same parameters.
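
    For readers unfamiliar with local correction, the classic two-query local corrector for the Hadamard code makes the notion concrete: the codeword bit at position $x$ equals the XOR of the bits at a random position $r$ and at $x \oplus r$, so a majority vote over a few such query pairs recovers it despite a constant fraction of corruptions. This toy sketch is unrelated to the constant-rate constructions above; the message and corruption rate are arbitrary.

```python
import random

# Toy two-query local corrector for the Hadamard code, included only to
# make "local correction" concrete. The paper's codes are far stronger:
# constant rate with exp(~O(sqrt(log n))) queries, whereas the Hadamard
# code has vanishing rate (k message bits -> 2^k codeword bits).

k = 10                       # message length; block length n = 2^k

def inner(x, y):
    """Inner product <x, y> mod 2 of two k-bit vectors."""
    return bin(x & y).count("1") % 2

def encode(m):
    """Hadamard codeword: position x holds <m, x> mod 2."""
    return [inner(m, x) for x in range(1 << k)]

def locally_correct(word, x, trials=21):
    """Recover the codeword bit at position x using 2 queries per trial
    and a majority vote; succeeds w.h.p. if < 1/4 of bits are flipped."""
    votes = 0
    for _ in range(trials):
        r = random.randrange(1 << k)
        votes += word[r] ^ word[x ^ r]  # <m,r> XOR <m,x^r> = <m,x>
    return int(votes > trials // 2)

m = 0b1011001110                         # arbitrary 10-bit message
w = encode(m)
for _ in range((1 << k) // 5):           # flip ~20% of positions
    w[random.randrange(1 << k)] ^= 1
x = 123
print(locally_correct(w, x), inner(m, x))  # corrected bit vs. true bit
```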

    Nuclear-spin relaxation of $^{207}$Pb in ferroelectric powders

    Motivated by a recent proposal by O. P. Sushkov and co-workers to search for a P,T-violating Schiff moment of the $^{207}$Pb nucleus in a ferroelectric solid, we have carried out a high-field nuclear magnetic resonance study of the longitudinal and transverse spin relaxation of the lead nuclei from room temperature down to 10 K for powder samples of lead titanate (PT), lead zirconate titanate (PZT), and a PT monocrystal. For all powder samples, and independently of temperature, transverse relaxation times were found to be $T_2 \approx 1.5$ ms, while the longitudinal relaxation times exhibited a temperature dependence, with $T_1$ of over an hour at the lowest temperatures, decreasing to $T_1 \approx 7$ s at room temperature. At high temperatures, the observed behavior is consistent with a two-phonon Raman process, while in the low-temperature limit, the relaxation appears to be dominated by a single-phonon (direct) process involving magnetic impurities. This is the first study of temperature-dependent nuclear-spin relaxation in PT and PZT ferroelectrics at such low temperatures. We discuss the implications of the results for the Schiff-moment search.
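
    As a sketch of how a longitudinal relaxation time is typically extracted from an inversion-recovery measurement, the snippet below fits $M(t) = M_0(1 - 2e^{-t/T_1})$ to synthetic recovery data. This is standard NMR practice, not necessarily the authors' analysis pipeline, and the numbers are invented (loosely matching the room-temperature $T_1 \approx 7$ s quoted above).

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative T1 extraction from an inversion-recovery curve; synthetic
# data stand in for real spectra, and parameters are made up.

def inversion_recovery(t, m0, t1):
    """M(t) = M0 * (1 - 2 exp(-t / T1)) after a pi pulse."""
    return m0 * (1.0 - 2.0 * np.exp(-t / t1))

t1_true, m0_true = 7.0, 1.0     # e.g. T1 ~ 7 s at room temperature
t = np.linspace(0.1, 40.0, 25)  # recovery delays in seconds
m = inversion_recovery(t, m0_true, t1_true)
m += 0.02 * np.random.default_rng(0).normal(size=t.size)  # noise

(m0_fit, t1_fit), _ = curve_fit(inversion_recovery, t, m, p0=(0.5, 1.0))
print(f"fitted T1 = {t1_fit:.2f} s (true {t1_true} s)")
```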

    Did the Dependent Coverage Mandate Reduce Crime?

    The Affordable Care Act’s dependent coverage mandate (DCM) induced approximately 2 million young adults to join parental employer-sponsored health insurance plans. This study is the first to explore the impact of the DCM on crime, a potentially important externality. Using data from the National Incident-Based Reporting System, we find that the DCM induced a 2–5 percent reduction in property crime incidents involving young adult arrestees ages 22–25 relative to those ages 27–29. This finding is supported by supplemental analysis using data from the Uniform Crime Reports. An examination of the underlying mechanisms suggests that declines in large out-of-pocket expenditures for health care, increased educational attainment, and increases in cohabitation of parents and adult children may explain these declines in crime. Back-of-the-envelope calculations suggest that the DCM generated approximately $371–$512 million in annual social benefits from crime reduction among young adults.
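
    The identification strategy described above, comparing ages 22 to 25 (treated by the DCM) with ages 27 to 29 (control) before and after the mandate, is a difference-in-differences design. The sketch below shows the corresponding regression on synthetic data; all column names, effect sizes, and the use of statsmodels are illustrative assumptions, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal difference-in-differences sketch of the paper's design:
# treated = aged 22-25, control = aged 27-29, before vs. after the DCM.
# All data below are simulated; nothing here reproduces the study.

rng = np.random.default_rng(1)
n = 4000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # 1 = aged 22-25
    "post": rng.integers(0, 2, n),     # 1 = after the DCM took effect
})
# Simulate log property-crime incidents with a -4% treatment effect.
df["log_crime"] = (2.0 - 0.04 * df["treated"] * df["post"]
                   + 0.05 * rng.normal(size=n))

model = smf.ols("log_crime ~ treated * post", data=df).fit(cov_type="HC1")
print(model.params["treated:post"])    # DiD estimate, close to -0.04
```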

    Exploring Interpretability for Predictive Process Analytics

    Modern predictive analytics underpinned by machine learning techniques has become a key enabler of automated, data-driven decision making. In the context of business process management, predictive analytics has been applied to making predictions about the future state of an ongoing business process instance, for example, when the process instance will complete and what the outcome will be upon completion. Machine learning models can be trained on event log data recording historical process executions to build the underlying predictive models. Multiple techniques have been proposed so far that encode the information available in an event log and construct the input features required to train a predictive model. While accuracy has been the dominant criterion in the choice among these techniques, they are often applied as a black box in building predictive models. In this paper, we derive explanations using interpretable machine learning techniques to compare and contrast the suitability of multiple predictive models of high accuracy. The explanations allow us to gain an understanding of the underlying reasons for a prediction and highlight scenarios where accuracy alone may not be sufficient for assessing the suitability of the techniques used to encode event log data into features for a predictive model. Findings from this study motivate the need to incorporate interpretability in predictive process analytics.
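
    One way to compare equally accurate models by their explanations, in the spirit of the paper, is permutation feature importance: shuffle one encoded feature at a time and measure how much held-out performance drops. The sketch below does this with scikit-learn on a synthetic "event log" encoding; the feature names and the choice of model and importance method are illustrative assumptions, not necessarily the techniques used in the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Sketch: inspect which encoded event-log features a high-accuracy
# predictive model actually relies on. Features and labels are synthetic.

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(1, 20, n),    # number of events observed so far
    rng.exponential(5.0, n),   # elapsed time (days)
    rng.integers(0, 2, n),     # whether activity "approve" has occurred
])
y = ((X[:, 1] > 5) & (X[:, 2] == 1)).astype(int)  # process outcome label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))

# Permutation importance on held-out data exposes the model's reasons.
imp = permutation_importance(clf, X_te, y_te, n_repeats=10, random_state=0)
for name, mean in zip(["n_events", "elapsed_days", "approved"],
                      imp.importances_mean):
    print(f"{name}: {mean:.3f}")
```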

    Does Scientific Progress Consist in Increasing Knowledge or Understanding?

    Bird argues that scientific progress consists in increasing knowledge. Dellsén objects that increasing knowledge is neither necessary nor sufficient for scientific progress, and argues that scientific progress instead consists in increasing understanding. Dellsén also contends that, unlike Bird’s view, his view can account for the scientific practices of using idealizations and of choosing simple theories over complex ones. I argue that Dellsén’s criticisms of Bird’s view fail, and that increasing understanding cannot account for scientific progress if acceptance, as opposed to belief, is required for scientific understanding.

    Maximum likelihood drift estimation for a threshold diffusion

    We study the maximum likelihood estimator of the drift parameters of a stochastic differential equation with both drift and diffusion coefficients constant on the positive and negative axes, yet discontinuous at zero. This threshold diffusion is called drifted oscillating Brownian motion. For this continuously observed diffusion, the maximum likelihood estimator coincides with a quasi-likelihood estimator with constant diffusion term. We show that this estimator is the limit, as observations become dense in time, of the (quasi-)maximum likelihood estimator based on discrete observations. Over long times, the asymptotic behavior of the positive and negative occupation times governs that of the estimators. Unlike most known results in the literature, we do not restrict ourselves to the ergodic framework: indeed, depending on the signs of the drift, the process may be ergodic, transient, or null recurrent. For each regime, we establish whether or not the estimators are consistent; if they are, we prove the convergence in long time of the properly rescaled difference of the estimators towards a normal or mixed normal distribution. These theoretical results are backed by numerical simulations.
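
    A minimal sketch of the discretized estimator described above: simulate the drifted oscillating Brownian motion by Euler-Maruyama, then estimate the drift on each side of zero as the sum of the increments observed on that side divided by the occupation time there. Parameter values are made up, the discretization is the naive one, and this is not the paper's analysis.

```python
import numpy as np

# Euler-Maruyama simulation of a drifted oscillating Brownian motion,
# followed by a discretized quasi-likelihood drift estimator per side.
# This choice of drift signs gives the ergodic (mean-reverting) regime.

rng = np.random.default_rng(42)
theta_pos, theta_neg = -0.5, 0.8   # drift on {x > 0} and on {x <= 0}
sigma_pos, sigma_neg = 1.0, 0.5    # diffusion coefficient on each side
dt, n = 1e-3, 500_000              # step size and number of steps

x = np.empty(n + 1)
x[0] = 0.0
dw = rng.normal(scale=np.sqrt(dt), size=n)
for i in range(n):
    if x[i] > 0:
        x[i + 1] = x[i] + theta_pos * dt + sigma_pos * dw[i]
    else:
        x[i + 1] = x[i] + theta_neg * dt + sigma_neg * dw[i]

side = x[:-1] > 0                  # side on which each increment starts
dx = np.diff(x)
theta_pos_hat = dx[side].sum() / (dt * side.sum())
theta_neg_hat = dx[~side].sum() / (dt * (~side).sum())
print(theta_pos_hat, theta_neg_hat)  # should approach -0.5 and 0.8
```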

    Small grid embeddings of 3-polytopes

    We introduce an algorithm that embeds a given 3-connected planar graph as a convex 3-polytope with integer coordinates. The size of the coordinates is bounded by $O(2^{7.55n}) = O(188^{n})$. If the graph contains a triangle we can bound the integer coordinates by $O(2^{4.82n})$. If the graph contains a quadrilateral we can bound the integer coordinates by $O(2^{5.46n})$. The crucial part of the algorithm is to find a convex plane embedding whose edges can be weighted such that the sum of the weighted edges, seen as vectors, cancels at every point. It is well known that this can be guaranteed for the interior vertices by applying a technique of Tutte. We show how to extend Tutte's ideas to construct a plane embedding where the weighted vector sums also cancel at the vertices of the boundary face.
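
    The interior-vertex part of the construction is the classical Tutte embedding: pin the boundary face on a convex polygon and place every interior vertex at the barycentre of its neighbours, which amounts to solving one linear system. A sketch with unit weights is below (extending the cancellation to the boundary vertices is the paper's contribution); the octahedron example is arbitrary.

```python
import numpy as np

# Classic Tutte embedding: boundary vertices pinned on a convex polygon,
# each interior vertex placed at the average of its neighbours. This is
# only the well-known "interior" half of the paper's construction.

def tutte_embedding(adj, boundary):
    """adj: dict vertex -> set of neighbours; boundary: outer-face cycle."""
    verts = sorted(adj)
    idx = {v: i for i, v in enumerate(verts)}
    n = len(verts)
    A = np.zeros((n, n))
    b = np.zeros((n, 2))
    # Pin the boundary vertices on a regular polygon.
    for k, v in enumerate(boundary):
        ang = 2 * np.pi * k / len(boundary)
        A[idx[v], idx[v]] = 1.0
        b[idx[v]] = (np.cos(ang), np.sin(ang))
    # Each interior vertex is the barycentre of its neighbours:
    # deg(v) * p_v - sum of neighbour positions = 0.
    for v in verts:
        if v in boundary:
            continue
        i = idx[v]
        A[i, i] = len(adj[v])
        for u in adj[v]:
            A[i, idx[u]] = -1.0
    return dict(zip(verts, np.linalg.solve(A, b)))

# Octahedron graph: outer face (0, 1, 2), antipodal pairs (0,5), (1,3), (2,4).
adj = {0: {1, 2, 3, 4}, 1: {0, 2, 4, 5}, 2: {0, 1, 3, 5},
       3: {0, 2, 4, 5}, 4: {0, 1, 3, 5}, 5: {1, 2, 3, 4}}
print(tutte_embedding(adj, boundary=[0, 1, 2]))
```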