    Fusion Hindrance in the Heavy Ion Reactions -- Border Between the Normal and Hindered Fusions

    The fusion hindrance in heavy ion collisions is studied in the framework of the two-center liquid drop model. It appears that both the neck and the radial degrees of freedom may be hampered by an inner potential barrier on the path from the contact configuration to the compound nucleus. Heavy ion reactions with and without the two kinds of fusion hindrance are classified through systematic calculations. It is found that the number of reactions without radial fusion hindrance is much smaller than the number without neck fusion hindrance, and that for both kinds of hindrance the number of unhindered reactions at small mass-asymmetry parameter $\alpha$ is smaller than at large $\alpha$. In the formation of a given compound nucleus, if a reaction with $\alpha_c$ is not hindered, then reactions with $\alpha > \alpha_c$ are also not hindered, as is well known experimentally. Comment: 14 pages, 7 figures

    Group Testing with Pools of Fixed Size

    In the classical combinatorial (adaptive) group testing problem, one is given two integers $d$ and $n$, where $0 \le d \le n$, and a population of $n$ items, exactly $d$ of which are known to be defective. The question is to devise an optimal sequential algorithm that, at each step, tests a subset of the population and determines whether such subset is contaminated (i.e., contains defective items) or not. The problem is solved only when the $d$ defective items are identified. The minimum number of steps that an optimal sequential algorithm takes in general (i.e., in the worst case) to solve the problem is denoted by $M(d, n)$. The computation of $M(d, n)$ appears to be very difficult, and a general formula is known only for $d = 1$. We consider here a variant of the original problem, where the size of the subsets to be tested is restricted to be a fixed positive integer $k$. The corresponding minimum number of tests by a sequential optimal algorithm is denoted by $M^{[k]}(d, n)$. In this paper we start the investigation of the function $M^{[k]}(d, n)$.
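    The classical $d = 1$ case described above can be illustrated by a minimal sketch (my own illustration, not from the paper): adaptive binary splitting with unconstrained pool sizes identifies the single defective item in $\lceil \log_2 n \rceil$ tests, which is exactly the known value of $M(1, n)$. The paper's variant differs in that every pool must have the same fixed size $k$, which rules out the halving used here.

    ```python
    # Sketch of adaptive group testing for d = 1 via binary splitting.
    # Pool sizes vary freely here; the paper's M^[k](d, n) variant would
    # instead force every tested pool to have a fixed size k.

    def test(pool, defective):
        """One group test: does the pool contain the defective item?"""
        return defective in pool

    def find_defective(n, defective):
        """Adaptive binary splitting; returns (item, number of tests used)."""
        candidates = list(range(n))
        tests = 0
        while len(candidates) > 1:
            half = candidates[:len(candidates) // 2]
            tests += 1
            if test(half, defective):
                candidates = half          # defective is in the tested half
            else:
                candidates = candidates[len(candidates) // 2:]
        return candidates[0], tests

    # 16 items, one defective: ceil(log2 16) = 4 tests suffice.
    item, used = find_defective(16, 11)
    ```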

    Structure and mechanical properties of artificial protein hydrogels assembled through aggregation of leucine zipper peptide domains

    Artificial protein hydrogels made from a triblock protein (designated AC10A, where A is an acidic zipper domain and C10 comprises 10 repeats of a nonapeptide sequence) exhibit normalized plateau storage moduli (G/nkT) less than 0.13 at all concentrations, pH values, and ionic strengths examined. These gels are surprisingly soft because loops form at the expense of bridges between physical junctions. Molecular-level evidence of loop formation is provided by strong fluorescence resonance energy transfer (FRET) between distinct chromophores placed at the C- and N-termini of labelled chains diluted in an excess of unlabelled chains. The tendency to form loops originates from the compact size of the random-coil midblock (mean RH(C10) of 20 Å, determined from quasi-elastic light scattering of C10), and is facilitated by the ability of the leucine zipper domains to form antiparallel aggregates. Although the aggregation number of the leucine zipper domains is small (tetrameric, determined from multi-angle static light scattering of the AC10 diblock), the average center-to-center distance between aggregates is roughly 1.5 times the average end-to-end distance of the C10 domain in a 7% w/v network. To avoid stretching the C10 domain, the chains tend to form loops. Changes in pH or ionic strength that expand the polyelectrolyte midblock favor bridging, leading to greater G as long as the leucine zipper endblocks do not dissociate. This understanding of the network structure provided successful design strategies for increasing the rigidity of these hydrogels. In contrast to intuitive design concepts for rubber and gel materials, it was shown that increasing either the length or the charge density of the midblock increases rigidity, because fewer chains are wasted in loop formation.

    Evaluating probability forecasts

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans, and in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events, and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event. Comment: Published at http://dx.doi.org/10.1214/11-AOS902 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
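    As a concrete point of reference for the scoring rules mentioned above, here is a minimal sketch (my own illustration, not the authors' martingale-based method) of the quadratic scoring rule, i.e. the Brier score: the mean squared difference between each forecast probability and the 0/1 outcome, where a lower average score indicates better forecasts. The forecaster names and numbers are made up for the example.

    ```python
    # Brier score: a classical scoring rule for probability forecasts.

    def brier_score(forecasts, outcomes):
        """Mean squared difference between forecast probability and outcome (0 or 1)."""
        return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)

    # Hypothetical data: forecaster A is sharper and better calibrated than B.
    outcomes    = [1, 0, 1, 1, 0]
    forecasts_a = [0.9, 0.2, 0.8, 0.7, 0.1]
    forecasts_b = [0.6, 0.5, 0.5, 0.6, 0.4]

    score_a = brier_score(forecasts_a, outcomes)   # lower (better) score
    score_b = brier_score(forecasts_b, outcomes)
    ```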

    A General Two-Step Approach to Learning-Based Hashing

    Most existing approaches to hashing apply a single form of hash function, and an optimization process that is typically deeply coupled to this specific form. This tight coupling restricts the flexibility of the method to respond to the data, and can result in complex optimization problems that are difficult to solve. Here we propose a flexible yet simple framework that is able to accommodate different types of loss functions and hash functions. This framework allows a number of existing approaches to hashing to be placed in context, and simplifies the development of new problem-specific hashing methods. Our framework decomposes the hashing learning problem into two steps: hash bit learning, and hash function learning based on the learned bits. The first step can typically be formulated as a binary quadratic problem, and the second step can be accomplished by training standard binary classifiers. Both problems have been extensively studied in the literature. Our extensive experiments demonstrate that the proposed framework is effective and flexible, and outperforms the state of the art. Comment: 13 pages. Appearing in Int. Conf. Computer Vision (ICCV) 201
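    The two-step decomposition above can be sketched in a few lines of NumPy, under simplifying assumptions that are mine rather than the paper's exact formulation: step 1 takes the signs of the leading eigenvectors of a pairwise similarity matrix as the target bits (a spectral relaxation of a binary quadratic problem), and step 2 fits one linear classifier per bit by least squares. The similarity kernel, bit count, and classifier choice are all illustrative.

    ```python
    # Illustrative two-step hashing sketch: learn target bits, then learn
    # hash functions that reproduce them.  Not the paper's exact algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))            # toy data: 100 points, 5 features

    # Step 1: hash bit learning.  Build a Gaussian similarity matrix and take
    # signs of its leading eigenvectors as target codes (spectral relaxation).
    S = np.exp(-np.linalg.norm(X[:, None] - X[None, :], axis=2) ** 2)
    eigvals, eigvecs = np.linalg.eigh(S)     # eigenvalues in ascending order
    num_bits = 4
    B = np.sign(eigvecs[:, -num_bits:])      # (100, 4) target bits in {-1, +1}

    # Step 2: hash function learning.  Fit a linear map W so that sign(X @ W)
    # reproduces the learned bits; each column acts as one binary classifier.
    W, *_ = np.linalg.lstsq(X, B, rcond=None)

    train_bits = np.sign(X @ W)
    accuracy = (train_bits == B).mean()      # fraction of target bits reproduced
    ```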