
    A chain rule for an entropy notion H(.) states that the entropy H(X) of a variable X decreases by at most l when conditioned on an l-bit string A, i.e., H(X|A) >= H(X) - l. More generally, an entropy notion satisfies a chain rule for conditional entropy if H(X|Y,A) >= H(X|Y) - l. All natural information-theoretic entropy notions we are aware of (such as Shannon entropy or min-entropy) satisfy some kind of chain rule for conditional entropy. Moreover, many computational entropy notions (such as Yao entropy, unpredictability entropy, and several variants of HILL entropy) satisfy the chain rule for conditional entropy, though here not only does the quantity decrease by l, the quality of the entropy also degrades exponentially in l. For the standard notion of conditional HILL entropy (the computational equivalent of min-entropy), however, the existence of such a rule had been open until now. In this paper, we prove that no meaningful chain rule exists for conditional HILL entropy, assuming the existence of one-way permutations: there exist distributions X, Y, A, where A is a distribution over a single bit, but H(X|Y) >> H(X|Y,A), even if we simultaneously allow for a massive degradation in the quality of the entropy. The idea underlying our construction is based on a surprising connection between the chain rule for HILL entropy and deniable encryption.
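
    For readability, the three inequalities discussed above can be restated in display form; the HILL superscript below is our own labeling for the conditional HILL variant, not notation taken from the abstract itself:

        \begin{align*}
          H(X \mid A) &\ge H(X) - \ell               && \text{chain rule}\\
          H(X \mid Y, A) &\ge H(X \mid Y) - \ell     && \text{chain rule for conditional entropy}\\
          H^{\mathrm{HILL}}(X \mid Y) &\gg H^{\mathrm{HILL}}(X \mid Y, A) && \text{this paper's counterexample, with } |A| = 1
        \end{align*}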

    A NEVER-ENDING STORY OF RHEUMATOID ARTHRITIS

    Among the many distinct rheumatic disorders, rheumatoid arthritis (RA) is believed to be one of the most prevalent. RA is an autoimmune disorder characterized by persistent inflammation and the presence of auto-antibodies. Joint inflammation, stiffness with loss of motion, and joint tenderness are the most common findings in patients. Deformity of the joints can be prevented by early diagnosis and treatment. The severity of the disease can be reduced more effectively by a well-profiled combination of drugs than by a single medication. A treat-to-target approach leads to superior outcomes in RA, and the ACR, EULAR, and other professional organizations have endorsed treat-to-target as a basic therapeutic strategy for RA. Novel treatment methods have improved the course of the disorder, and most patients achieve remission of clinical manifestations if the disease is identified early. This review article is based on a study of journal articles published between 1997 and 2019.

    High-Dimensional Inference with the generalized Hopfield Model: Principal Component Analysis and Corrections

    We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space and is of rank much smaller than N. We show that Maximum Likelihood inference is deeply related to Principal Component Analysis when the amplitude of the pattern components, xi, is negligible compared to N^(1/2). Using techniques from statistical mechanics, we calculate the corrections to the patterns to first order in xi/N^(1/2). We stress that it is important to generalize the Hopfield model and include both attractive and repulsive patterns in order to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude xi. The inference approach is illustrated on synthetic and biological data.
    Comment: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics (2011), to appear
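
    As a rough illustration of the PCA connection described above, here is a minimal sketch (ours, not the authors' code). The data matrix samples, the function name infer_patterns, and the Marchenko-Pastur-style noise cutoff are all assumptions; the cutoff is only a simplified stand-in for the paper's geometrical criterion.

        import numpy as np

        def infer_patterns(samples):
            """PCA-style pattern inference from binary samples (sketch).

            samples: (B, N) array of +/-1 spins, i.e. B sampled configurations
            of N variables. Returns (attractive, repulsive) pattern matrices,
            one pattern per column.
            """
            B, N = samples.shape
            mu = samples.mean(axis=0)                       # frequencies (magnetizations)
            C = samples.T @ samples / B - np.outer(mu, mu)  # connected pairwise correlations
            evals, evecs = np.linalg.eigh(C)                # eigenvalues in ascending order

            # Simplified sampling-noise cutoff (a stand-in for the paper's
            # geometrical criterion): keep modes outside the Marchenko-Pastur bulk.
            sigma2 = np.diag(C).mean()
            upper = sigma2 * (1.0 + np.sqrt(N / B)) ** 2
            lower = sigma2 * (1.0 - np.sqrt(N / B)) ** 2

            attractive = evecs[:, evals > upper]  # large eigenvalues: attractive patterns
            repulsive = evecs[:, evals < lower]   # small eigenvalues: repulsive patterns
            return attractive, repulsive

    With too few samples (small B) the noise bulk widens and fewer patterns survive the cutoff, mirroring the abstract's point that the number of sampled configurations required for good inference grows with N and xi.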

    From Big Data To Knowledge – Good Practices From Industry

    Recent advancements in data-gathering technologies have led to the rise of large amounts of data from which useful insights and ideas can be derived. These data sets are typically too large to process using traditional data processing tools and applications, and are thus known in the popular press as 'big data'. It is essential to extract the hidden meanings in the available data sets by aggregating big data into knowledge, which may then positively contribute to decision making. One way to engage in data-driven strategy is to gather contextually relevant data on specific customers, products, and situations, and determine optimised offerings that are most appealing to the target customers based on sound analytics. Corporations around the world have been increasingly applying analytics tools and technologies to capture, manage, and process such data, and to derive value out of the huge volumes of data generated by individuals. The detailed intelligence on consumer behaviour, user patterns, and other hidden knowledge that could not be derived via traditional means can now be used to facilitate important business processes such as real-time control and demand forecasting. The aim of our research is to understand and analyse the significance and impact of big data in today's industrial environment and to identify the good practices that can help derive useful knowledge out of this wealth of information, based on a content analysis of 34 firms that have initiated big data analytics projects. Our descriptive and network analysis shows that the goals of a big data initiative are extensible and highlights the importance of data representation. We also find that the data analytical techniques adopted are heavily dependent on the project goals.
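
    To make the network-analysis step concrete, the sketch below builds a goal co-occurrence network; the three example projects and their goal labels are hypothetical (the study's 34-firm coding is not reproduced here), and networkx is merely our choice of library:

        from itertools import combinations

        import networkx as nx

        # Hypothetical coding of three firms' big-data projects by stated goals
        # (illustrative labels only, not the study's actual data set).
        projects = [
            {"demand forecasting", "real-time control"},
            {"customer insight", "demand forecasting"},
            {"customer insight", "real-time control", "data representation"},
        ]

        # Co-occurrence network: goals are nodes; an edge's weight counts how
        # often two goals appear together in the same project.
        G = nx.Graph()
        for goals in projects:
            for a, b in combinations(sorted(goals), 2):
                if G.has_edge(a, b):
                    G[a][b]["weight"] += 1
                else:
                    G.add_edge(a, b, weight=1)

        # Degree centrality hints at which goals anchor the most initiatives.
        for goal, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
            print(f"{goal}: {score:.2f}")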

    Two-Species Reaction-Diffusion System with Equal Diffusion Constants: Anomalous Density Decay at Large Times

    We study a two-species reaction-diffusion model with the reactions A+A->0, A+B->0, and B+B->0, with annihilation rates lambda0, delta0 > lambda0, and lambda0, respectively. The initial particle configuration is taken to be randomly mixed with mean densities nA(0) > nB(0), and the two species A and B diffuse with the same diffusion constant. A field-theoretic renormalization group analysis suggests that, contrary to expectation, the large-time density of the minority species decays at the same rate as that of the majority when d <= 2. Monte Carlo data support the field-theory prediction in d=1, while in d=2 the logarithmically slow convergence to the large-time asymptotics makes a numerical test difficult.
    Comment: revised version (more figures, claim on exactness of d=2 treatment removed), 5 pages, 3 figures, RevTeX; see the related paper Phys. Rev. E, R3787 (1999) or cond-mat/9901147; to appear in Phys. Rev.
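
    For intuition about the kind of Monte Carlo test mentioned above, here is a minimal 1D lattice sketch (our illustration, not the authors' code). It works in the instant-reaction limit, so the distinct rates lambda0 and delta0 drop out, and it reports only the surviving densities; recording them over time would let one compare the decay of nB(t) against nA(t).

        import random

        def simulate(L=10_000, nA0=0.4, nB0=0.2, sweeps=1_000, seed=0):
            """1D lattice Monte Carlo for A+A->0, A+B->0, B+B->0 (sketch).

            Any two particles meeting on a site annihilate instantly, so the
            separate rates lambda0 and delta0 play no role in this limit.
            Both species hop with the same rate (equal diffusion constants).
            Returns the surviving densities (nA, nB) after `sweeps` sweeps.
            """
            rng = random.Random(seed)
            lattice = []
            for _ in range(L):
                r = rng.random()
                lattice.append('A' if r < nA0 else 'B' if r < nA0 + nB0 else None)

            for _ in range(sweeps):
                for _ in range(L):                     # one sweep = L attempted hops
                    i = rng.randrange(L)
                    if lattice[i] is None:
                        continue
                    j = (i + rng.choice((-1, 1))) % L  # periodic boundary conditions
                    if lattice[j] is None:
                        lattice[j], lattice[i] = lattice[i], None   # diffuse
                    else:
                        lattice[i] = lattice[j] = None              # annihilate on contact

            return lattice.count('A') / L, lattice.count('B') / L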

    The sparticle spectrum in Minimal gaugino-Gauge Mediation

    We compute the sparticle mass spectrum in the minimal four-dimensional construction that interpolates between gaugino mediation and ordinary gauge mediation.
    Comment: 21 pages, 9 figures; V2: refs. added; V3: some typos corrected