
    Resolution and binary decision diagrams cannot simulate each other polynomially

    There are many different ways of proving formulas in propositional logic. Many of these can easily be characterized as forms of resolution. Others use so-called binary decision diagrams (BDDs). Experimental evidence suggests that BDDs and resolution-based techniques are fundamentally different, in the sense that their performance can differ greatly on benchmarks. In this paper we confirm these findings by mathematical proof. We provide examples that are easy for BDDs and exponentially hard for any form of resolution, and, vice versa, examples that are easy for resolution and exponentially hard for BDDs.
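
    For readers unfamiliar with the resolution rule referenced above, the sketch below shows a single resolution step on two clauses encoded as sets of signed integer literals. The encoding and the function name are illustrative assumptions, not code from the paper.

```python
# Minimal sketch of the propositional resolution rule (illustrative; not from the paper).
# A clause is a frozenset of integer literals: positive ints are variables,
# negative ints their negations.

def resolve(c1, c2):
    """Return all resolvents of clauses c1 and c2 (usually zero or one)."""
    resolvents = []
    for lit in c1:
        if -lit in c2:
            # Resolve on the complementary pair (lit, -lit): merge the remaining literals.
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return resolvents

# Example: resolving (x1 OR x2) with (NOT x1 OR x3) yields (x2 OR x3).
print(resolve(frozenset({1, 2}), frozenset({-1, 3})))  # [frozenset({2, 3})]
```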

    On Tackling the Limits of Resolution in SAT Solving

    The practical success of Boolean Satisfiability (SAT) solvers stems from the CDCL (Conflict-Driven Clause Learning) approach to SAT solving. However, from a propositional proof complexity perspective, CDCL is no more powerful than the resolution proof system, for which many hard examples exist. This paper proposes a new problem transformation, which enables reducing the decision problem for formulas in conjunctive normal form (CNF) to the problem of solving maximum satisfiability over Horn formulas. Given the new transformation, the paper proves a polynomial bound on the number of MaxSAT resolution steps for pigeonhole formulas. This result is in clear contrast with earlier results on the length of proofs of MaxSAT resolution for pigeonhole formulas. The paper also establishes the same polynomial bound in the case of modern core-guided MaxSAT solvers. Experimental results, obtained on CNF formulas known to be hard for CDCL SAT solvers, show that these can be efficiently solved with modern MaxSAT solvers.
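
    As a concrete point of reference for the pigeonhole formulas discussed above, the sketch below generates the clauses of the standard CNF encoding of the pigeonhole principle PHP_n ("n+1 pigeons into n holes"); the encoding and helper names are generic assumptions, not the paper's transformation.

```python
# Sketch of the standard pigeonhole principle formula PHP_n in CNF (illustrative encoding).
# Variable x[p][h] is true iff pigeon p sits in hole h; clauses are lists of signed ints.

def php_clauses(n):
    """CNF for 'n+1 pigeons into n holes': unsatisfiable and hard for resolution."""
    var = lambda p, h: p * n + h + 1          # map (pigeon, hole) to a DIMACS-style variable
    clauses = []
    # Every pigeon goes into at least one hole.
    for p in range(n + 1):
        clauses.append([var(p, h) for h in range(n)])
    # No two pigeons share a hole.
    for h in range(n):
        for p1 in range(n + 1):
            for p2 in range(p1 + 1, n + 1):
                clauses.append([-var(p1, h), -var(p2, h)])
    return clauses

print(len(php_clauses(3)))  # 4 at-least-one clauses + 3*C(4,2) = 18 at-most-one clauses = 22
```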

    Results from the solar maximum mission

    The major results from SMM (Solar Maximum Mission) are presented as they relate to the understanding of the energy release and particle transport processes that lead to the high-energy X-ray aspects of solar flares. Evidence is reviewed for a 152- to 158-day periodicity in various aspects of solar activity, including the rate of occurrence of hard X-ray and gamma-ray flares. The statistical properties of over 7000 hard X-ray flares detected with the Hard X-Ray Burst Spectrometer are presented, including the spectrum of peak rates and the distribution of the photon number spectrum. A flare classification scheme is used to divide flares into three different types. Type A flares have purely thermal, compact sources with very steep hard X-ray spectra. Type B flares are impulsive bursts which show double footpoints in hard X-rays and soft-hard-soft spectral evolution. Type C flares have gradually varying hard X-ray and microwave fluxes from high altitudes and show hardening of the X-ray spectrum through the peak and on the decay. SMM data are presented for examples of Type B and Type C events. New results are presented showing coincident hard X-ray, O V, and UV continuum observations in Type B events with a time resolution of 128 ms. The subsecond variations in the hard X-ray flux during 10% of the stronger events are discussed, and the fastest observed variation, on a time scale of 20 ms, is presented. The properties of Type C flares are presented as determined primarily from the non-imaged hard X-ray and microwave spectral data. A model based on the association of Type C flares and coronal mass ejections is presented to explain many of the characteristics of these gradual flares.
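
    The 152- to 158-day periodicity mentioned above is the kind of signal one would typically search for with a periodogram of the flare occurrence rate. The sketch below is purely illustrative, using SciPy's Lomb-Scargle routine on synthetic, unevenly sampled data; it is not the analysis applied to the SMM observations.

```python
# Illustrative sketch only: searching an unevenly sampled flare-rate series for a ~155-day
# period with a Lomb-Scargle periodogram (synthetic data, not SMM measurements).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2000, 400))                  # observation times in days (uneven)
rate = 10 + 3 * np.sin(2 * np.pi * t / 155) + rng.normal(0, 1, t.size)  # toy flares/day

periods = np.linspace(50, 400, 2000)                    # trial periods in days
ang_freqs = 2 * np.pi / periods                         # lombscargle expects angular frequencies
power = lombscargle(t, rate - rate.mean(), ang_freqs, normalize=True)

print(f"best-fit period: {periods[np.argmax(power)]:.1f} days")  # near 155 for this toy signal
```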

    Focused Ion Beams (FIB) — Novel Methodologies and Recent Applications for Multidisciplinary Sciences

    Considered the newest field of electron microscopy, focused ion beam (FIB) technologies are used in many fields of science for site-specific analysis, imaging, milling, deposition, micromachining, and manipulation. Dual-beam platforms, combining a high-resolution scanning electron microscope (HR-SEM) and an FIB column, additionally equipped with precursor-based gas injection systems (GIS), micromanipulators, and chemical analysis tools (such as energy-dispersive spectroscopy (EDS) or wavelength-dispersive spectroscopy (WDS)), serve as multifunctional tools for direct lithography in terms of nano-machining and nano-prototyping, while advanced specimen preparation for transmission electron microscopy (TEM) can practically be carried out with ultrahigh precision. Especially when hard materials and material systems with hard substrates are concerned, FIB is the only technique for site-specific micro- and nanostructuring. Moreover, FIB sectioning and sampling techniques are frequently used for revealing the structural and morphological distribution of material systems with a three-dimensional (3D) network at the micro-/nanoscale. This book chapter includes many examples of conventional and novel processes of FIB technologies, ranging from the analysis of semiconductors to electron tomography-based imaging of hard materials such as nanoporous ceramics and composites. In addition, recent studies concerning the active use of dual-beam platforms are mentioned.

    Non-parametric strong lens inversion of Cl 0024+1654: illustrating the monopole degeneracy

    The cluster lens Cl 0024+1654 is undoubtedly one of the most beautiful examples of strong gravitational lensing, providing five large images of a single source with well-resolved substructure. Using the information contained in the positions and the shapes of the images, combined with the null-space information, a non-parametric technique is used to infer the strong-lensing mass map of the central region of this cluster. This yields a strong-lensing mass of 1.60 x 10^14 M_sun within a 0.5' radius around the cluster center. This mass distribution is then used as a case study of the monopole degeneracy, which may be one of the most important degeneracies in gravitational lensing studies and which is extremely hard to break. We illustrate the monopole degeneracy by adding circularly symmetric density distributions with zero total mass to the original mass map of Cl 0024+1654. These redistribute mass in certain areas of the mass map without affecting the observed images in any way. We show that the monopole degeneracy and the mass-sheet degeneracy together lie at the heart of the discrepancies between different gravitational lens reconstructions that can be found in the literature for a given object, and that many images/sources, with an overall high image density in the lens plane, are required to construct an accurate, high-resolution mass map based on strong-lensing data. Comment: 9 pages, accepted for publication by MNRAS.
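
    A short numerical illustration of why the monopole degeneracy described above leaves the images untouched: for a circularly symmetric surface density, the deflection at projected radius R depends only on the mass enclosed within R, so a circularly symmetric perturbation with zero total mass confined inside some radius contributes nothing to the deflection outside that radius. The toy profile and numbers below are assumptions for illustration, not the Cl 0024+1654 reconstruction.

```python
# Toy numerical illustration of the monopole degeneracy (not the paper's reconstruction).
# For a circularly symmetric surface density Sigma(r), the deflection at radius R scales with
# the enclosed projected mass M(<R)/R, so a zero-total-mass perturbation confined to r < r_max
# adds nothing to the deflection at any R >= r_max, i.e. at image positions outside it.
import numpy as np

r = np.linspace(1e-3, 1.0, 20000)           # projected radius, arbitrary units (uniform grid)
dr = r[1] - r[0]

def enclosed_mass(sigma, R):
    """Projected mass inside radius R for a circularly symmetric profile sampled on r."""
    mask = r <= R
    return float(np.sum(2 * np.pi * r[mask] * sigma[mask]) * dr)

sigma0 = 1.0 / (r + 0.1)                    # smooth toy lens profile

# Zero-total-mass perturbation confined to r < 0.3: positive core, compensating negative ring.
bump = np.where(r < 0.15, 1.0, 0.0)
ring = np.where((r >= 0.15) & (r < 0.3), 1.0, 0.0)
ring *= -enclosed_mass(bump, 0.3) / enclosed_mass(ring, 0.3)
perturbation = bump + ring

for R in (0.3, 0.5, 0.9):                   # radii at or beyond the perturbation
    dM = enclosed_mass(sigma0 + perturbation, R) - enclosed_mass(sigma0, R)
    print(f"R={R:.1f}: change in enclosed mass = {dM:.2e}")  # ~0, so the deflection is unchanged
```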

    Fragmenting densely mineralised acellular protrusions from articular calcified cartilage: a role in osteoarthritis?

    A. Boyde (a), G.R. Davis (a), D. Mills (a), T. Zikmund (a), V.L. Adams (b), L.R. Ranganath (b), N. Jeffery (b), J.A. Gallagher (b)
    (a) Dental Physical Sciences, Oral Growth and Development, Barts and The London School of Medicine and Dentistry, Queen Mary University of London, London, UK; (b) Department of Musculoskeletal Biology, Institute of Ageing and Chronic Disease, University of Liverpool, Liverpool, UK

    Objectives: High density mineralised protrusions (HDMP) from the tidemark mineralising front into hyaline articular cartilage (HAC) were first discovered in Thoroughbred racehorse fetlock joints and later in Icelandic horse hock joints. If these fragment, they could make a significant contribution to joint destruction in osteoarthritis. We looked for them in human material.

    Methods: Whole femoral heads removed at operation for joint replacement or from dissection room cadavers were studied by MRI DESS at 0.23 mm resolution and by 26 micron resolution high-contrast X-ray microtomography (XMT), then sectioned and embedded in PMMA; block faces were polished and the blocks re-imaged with 6 micron resolution XMT. Tissue mineralisation density was imaged qualitatively by backscattered electron SEM (BSE SEM) at 20 kV using uncoated samples at 50 Pa chamber pressure to achieve charge neutralisation. HAC histology was studied by BSE SEM after staining block faces with ammonium triiodide solution. Block surfaces were sequentially repolished and restained.

    Results: [Figure: 3D rendering of a 6 micron voxel resolution XMT data set showing an HDMP complex projecting above the subchondral bone plate; human femoral head removed at arthroplasty.] We found examples of HDMP in HAC in human hips. Their 3D shapes are complex and may show cutting-blade forms. Their mineral content (a) exceeds that of articular calcified cartilage (ACC), otherwise the densest tissue in the joint, and (b) is not uniform. The mineral phase morphology frequently shows the agglomeration of many fine particles into larger concretions. Cracks within them are frequent. Dense fragments may be found within damaged HAC.

    Conclusions: HDMP arise via the extrusion of an uncharacterised matrix into clefts in HAC. Little evidence of their existence remains after tissue has been decalcified with usual histological protocols. Their formation may be an extension of a normal but poorly recognised crack self-healing mechanism found in bone and ACC. They are surrounded by HAC, are dense and brittle, and show innumerable fault lines within them. We provide evidence that they break in vivo by being able to find matching fragments in HAC. We conclude that these hard and sharp particles contribute to the shredding destruction of HAC. The osteoarthritis research community should be aware of their existence so that the frequency and possible clinical significance can be assessed in the future. Larger HDMP can be detected with the best MRI imaging.

    From average case complexity to improper learning complexity

    The basic problem in the PAC model of computational learning theory is to determine which hypothesis classes are efficiently learnable. There is presently a dearth of results showing hardness of learning problems. Moreover, the existing lower bounds fall short of the best known algorithms. The biggest challenge in proving complexity results is to establish hardness of improper learning (a.k.a. representation-independent learning). The difficulty in proving lower bounds for improper learning is that the standard reductions from NP-hard problems do not seem to apply in this context. There is essentially only one known approach to proving lower bounds on improper learning. It was initiated in (Kearns and Valiant 89) and relies on cryptographic assumptions. We introduce a new technique for proving hardness of improper learning, based on reductions from problems that are hard on average. We put forward a (fairly strong) generalization of Feige's assumption (Feige 02) about the complexity of refuting random constraint satisfaction problems. Combining this assumption with our new technique yields far-reaching implications. In particular: 1. Learning DNFs is hard. 2. Agnostically learning halfspaces with a constant approximation ratio is hard. 3. Learning an intersection of ω(1) halfspaces is hard. Comment: 34 pages.
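
    Feige's assumption mentioned above concerns the hardness of refuting random constraint satisfaction problems. As a concrete point of reference, the sketch below samples a random 3-CNF instance at a chosen clause density; this is only the generic construction, not the specific distributions or reductions used in the paper.

```python
# Sketch: sampling a random 3-CNF formula at clause density alpha = m/n (illustrative only;
# the paper's reduction and Feige-style distributions are not reproduced here).
import random

def random_3cnf(n_vars, density, seed=0):
    """Return m = int(density * n_vars) random 3-clauses over variables 1..n_vars."""
    rng = random.Random(seed)
    m = int(density * n_vars)
    clauses = []
    for _ in range(m):
        vs = rng.sample(range(1, n_vars + 1), 3)                   # three distinct variables
        clauses.append([v if rng.random() < 0.5 else -v for v in vs])  # random polarities
    return clauses

# Well above the satisfiability threshold (~4.27), such instances are almost surely
# unsatisfiable, yet refuting them efficiently is conjectured to be hard.
print(random_3cnf(10, 6.0)[:3])
```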

    New spectral classification technique for X-ray sources: quantile analysis

    We present a new technique called "quantile analysis" to classify the spectral properties of X-ray sources with limited statistics. Quantile analysis is superior to conventional approaches such as X-ray hardness ratios or X-ray color analysis for studying relatively faint sources, or for investigating a certain phase or state of a source in detail, where poor statistics do not allow spectral fitting using a model. Instead of working with predetermined energy bands, we determine the energy values that divide the detected photons into predetermined fractions of the total counts, such as the median (50%), terciles (33% & 67%), and quartiles (25% & 75%). We use these quantiles as an indicator of the X-ray hardness or color of the source. We show that the median is an improved substitute for the conventional X-ray hardness ratio. The median and other quantiles form a phase space, similar to the conventional X-ray color-color diagrams. The quantile-based phase space is more evenly sensitive over various spectral shapes than the conventional color-color diagrams, and it is naturally arranged to properly represent the statistical similarity of various spectral shapes. We demonstrate the new technique in the 0.3-8 keV energy range using the Chandra ACIS-S detector response function and a typical aperture photometry involving background subtraction. The technique can be applied in any energy band, provided the energy distribution of photons can be obtained. Comment: 11 pages, 9 figures, accepted for publication in ApJ.
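
    In its simplest form, the quantile computation described above amounts to finding, for the list of detected photon energies in the working band, the energies below which fixed fractions of the counts lie. The sketch below uses synthetic photon energies and ignores background subtraction and the detector response; the variable names are illustrative.

```python
# Sketch of the basic quantile computation described above: given detected photon energies in
# the working band, find the energies that split the counts into fixed fractions.
# (Synthetic photon energies; background subtraction and detector response are ignored here.)
import numpy as np

rng = np.random.default_rng(1)
band = (0.3, 8.0)                                        # keV, as in the abstract
energies = rng.exponential(scale=1.5, size=500)          # toy spectrum
energies = energies[(energies > band[0]) & (energies < band[1])]

fractions = {"quartile 25%": 0.25, "tercile 33%": 1/3, "median 50%": 0.5,
             "tercile 67%": 2/3, "quartile 75%": 0.75}
for name, q in fractions.items():
    e_q = np.quantile(energies, q)                       # energy below which fraction q of counts lie
    print(f"{name}: E = {e_q:.2f} keV")

# The median (and its companions) then serves as a hardness indicator: harder sources
# concentrate their counts at higher energies, pushing these quantile energies upward.
```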