235 research outputs found
Bayesian model of HIV/AIDS in India: A spatial analysis
Background: Bayesian models are flexible enough to incorporate spatial correlation and to adjust toward the overall mean ratio when relatively few cases exist, whereas conventional disease mapping has several limitations and requires the Standardized Incidence Ratio, which compares observed cases to what would be expected in a larger population. This study explains the importance and advantages of Bayesian methods in disease mapping.
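As a rough illustration (not taken from the abstract), the Standardized Incidence Ratio it refers to is the ratio of observed to expected case counts, where the expected count is each region's population share of the overall rate. All figures below are hypothetical:

```python
# Hypothetical case counts and populations for three regions.
observed = [12, 5, 30]
population = [10_000, 8_000, 25_000]

# Overall rate across all regions.
overall_rate = sum(observed) / sum(population)

# Expected cases per region if every region followed the overall rate.
expected = [p * overall_rate for p in population]

# Standardized Incidence Ratio: observed / expected per region.
sir = [o / e for o, e in zip(observed, expected)]
```

An SIR above 1 indicates more cases than expected; the abstract's point is that with few cases these raw ratios are unstable, which Bayesian models address by shrinking them toward the overall mean.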
Super-polylogarithmic hypergraph coloring hardness via low-degree long codes
We prove improved inapproximability results for hypergraph coloring using the
low-degree polynomial code (aka, the 'short code' of Barak et al. [FOCS 2012])
and the techniques proposed by Dinur and Guruswami [FOCS 2013] to incorporate
this code for inapproximability results. In particular, we prove
quasi-NP-hardness of the following problems on n-vertex hypergraphs:
* Coloring a 2-colorable 8-uniform hypergraph with [bound omitted in source] colors.
* Coloring a 4-colorable 4-uniform hypergraph with [bound omitted in source] colors.
* Coloring a 3-colorable 3-uniform hypergraph with [bound omitted in source] colors.
In each of these cases, the hardness results obtained are (at least)
exponentially stronger than what was previously known for the respective cases.
In fact, prior to this result, polylog n colors was the strongest quantitative
bound on the number of colors ruled out by inapproximability results for
O(1)-colorable hypergraphs.
The fundamental bottleneck in obtaining coloring inapproximability results
using the low-degree long code was a multipartite structural restriction in
the PCP construction of Dinur-Guruswami. We are able to get around this
restriction by simulating the multipartite structure implicitly by querying
just one partition (albeit requiring 8 queries), which yields our result for
2-colorable 8-uniform hypergraphs. The result for 4-colorable 4-uniform
hypergraphs is obtained via a 'query doubling' method. For 3-colorable
3-uniform hypergraphs, we exploit the ternary domain to design a test with an
additive (as opposed to multiplicative) noise function, and analyze its
efficacy in killing high weight Fourier coefficients via the pseudorandom
properties of an associated quadratic form.
Cloud based multicasting using fat tree data confidential recurrent neural network
With the progress of cloud computing, more users are attracted by its powerful and cost-effective computation capability. Nevertheless, whether Cloud Service Providers (CSPs) can efficiently protect Cloud Users' (CUs') data confidentiality (DC) remains a demanding issue. A CU may execute several applications with multicast needs, and different techniques have been used in the cloud to provide DC under multicast requirements. In this work, we aim to ensure DC in the cloud using a two-step technique called Fat Tree Data Confidential Recurrent Neural Network (FT-DCRNN). The first step constructs a Fat Tree based on the multicast model; the motivation for combining the two is that the multicast model propagates traffic across multiple links. The Degree-Restricted Multicast Fat Tree construction algorithm uses a reference function to measure the minimum average between two links, and performing multicast over these measured links improves the throughput and efficiency of the cloud service. Then, to provide DC for the multicast data or messages, the DCRNN model is applied. By handling complex non-linear relationships, the non-linear Recurrent Neural Network with a logistic activation function also reduces the average response time.
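To make the recurrent step concrete, here is a minimal scalar sketch of a recurrent unit with a logistic (sigmoid) activation, as the abstract describes. The weights and the input sequence are hypothetical; the abstract does not specify the actual FT-DCRNN architecture:

```python
import math

def logistic(x):
    # Logistic (sigmoid) activation: 1 / (1 + e^(-x)), output in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def rnn_step(h_prev, x, w_h, w_x, b):
    # One recurrent update: the new hidden state combines the previous
    # state and the current input, squashed through the logistic function.
    return logistic(w_h * h_prev + w_x * x + b)

# Unroll over a hypothetical input sequence with hypothetical weights.
h = 0.0
for x in [0.5, -1.0, 2.0]:
    h = rnn_step(h, x, w_h=0.8, w_x=0.5, b=0.0)
```

The bounded, smooth output of the logistic function is what lets such a unit model the non-linear relationships the abstract refers to.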