Neural Natural Language Inference Models Enhanced with External Knowledge
Modeling natural language inference is a very challenging task. With the
availability of large annotated data, it has recently become feasible to train
complex models, such as neural-network-based inference models, which have been
shown to achieve state-of-the-art performance. Although relatively large
annotated data are available, can machines learn all the knowledge needed to
perform natural language inference (NLI) from these data? If not, how can
neural-network-based NLI models benefit from external knowledge, and how can we
build NLI models to leverage it? In this paper, we enrich state-of-the-art
neural natural language inference models with external knowledge. We
demonstrate that the proposed models improve neural NLI models and achieve
state-of-the-art performance on the SNLI and MultiNLI datasets.
Comment: Accepted by ACL 201
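The abstract does not spell out the mechanism, but one common way to enrich an inference model with external knowledge is to add relation features (e.g. from a lexical resource such as WordNet) into the co-attention scores between premise and hypothesis tokens. The following numpy sketch illustrates that idea; the shapes, the relation matrix, the weight lam, and the ReLU combination are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical setup: premise of 3 tokens, hypothesis of 4 tokens, 8-dim encodings.
rng = np.random.default_rng(1)
a = rng.normal(size=(3, 8))   # premise token encodings
b = rng.normal(size=(4, 8))   # hypothesis token encodings
r = rng.random(size=(3, 4))   # external-knowledge relation features per token pair
                              # (e.g. 1.0 if a lexical relation links the two words)
lam = 1.0                     # weight on the knowledge term (assumed)

# Knowledge-enriched co-attention scores: e_ij = a_i . b_j + lam * relu(r_ij)
e = a @ b.T + lam * np.maximum(r, 0.0)

# Row-wise softmax gives, for each premise token, a knowledge-aware soft
# alignment over hypothesis tokens, used to build aligned representations.
w = np.exp(e - e.max(axis=1, keepdims=True))
attn = w / w.sum(axis=1, keepdims=True)
aligned_b = attn @ b          # shape (3, 8)
print(aligned_b.shape)
```

Pairs connected by an external relation thus receive a boosted attention weight even when their learned encodings are dissimilar, which is one plausible route for knowledge to help when training data alone is insufficient.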
Anonymous and Adaptively Secure Revocable IBE with Constant Size Public Parameters
In Identity-Based Encryption (IBE) systems, key revocation is non-trivial.
This is because a user's identity is itself a public key. Moreover, the private
key corresponding to the identity needs to be obtained from a trusted key
authority through an authenticated and secrecy protected channel. So far, there
exist only a very small number of revocable IBE (RIBE) schemes that support
non-interactive key revocation, in the sense that the user is not required to
interact with the key authority or some kind of trusted hardware to renew her
private key without changing her public key (or identity). These schemes are
either proven to be only selectively secure or have public parameters that
grow linearly in the security parameter. In this paper, we present two
constructions of non-interactive RIBE that satisfy all the following three
attractive properties: (i) proven to be adaptively secure under the Symmetric
External Diffie-Hellman (SXDH) and the Decisional Linear (DLIN) assumptions;
(ii) have constant-size public parameters; and (iii) preserve the anonymity of
ciphertexts---a property that none of the existing schemes achieves.
RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets
In this paper, we propose a class of robust stochastic subgradient methods
for distributed learning from heterogeneous datasets in the presence of an unknown
number of Byzantine workers. The Byzantine workers, during the learning
process, may send arbitrary incorrect messages to the master due to data
corruptions, communication failures or malicious attacks, and consequently bias
the learned model. The key to the proposed methods is a regularization term
incorporated into the objective function so as to robustify the learning task
and mitigate the negative effects of Byzantine attacks. The resultant
subgradient-based algorithms are termed Byzantine-Robust Stochastic Aggregation
methods, justifying our acronym RSA used henceforth. In contrast to most of the
existing algorithms, RSA does not rely on the assumption that the data are
independent and identically distributed (i.i.d.) across the workers, and hence
fits a wider class of applications. Theoretically, we show that: i) RSA
converges to a near-optimal solution with the learning error dependent on the
number of Byzantine workers; ii) the convergence rate of RSA under Byzantine
attacks is the same as that of the stochastic gradient descent method in the
absence of Byzantine attacks. Numerically, experiments on a real dataset
corroborate the competitive performance of RSA and its reduced complexity
compared to the state-of-the-art alternatives.
Comment: To appear in AAAI 201
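The regularization idea above can be sketched in a few lines: each worker is tied to the master variable through an l1 penalty, so the master's update only sees bounded sign terms and no single worker, however corrupted, can move it by more than a fixed amount per step. The quadratic per-worker losses, the values of lam and the step size, and the update order below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

dim, n_workers, n_byz = 5, 10, 3
lam, lr = 0.1, 0.05

# Hypothetical heterogeneous (non-i.i.d.) data: each worker i minimizes
# f_i(x) = 0.5 * ||x - target_i||^2 for its own target.
targets = rng.normal(size=(n_workers, dim))

x_master = np.zeros(dim)
x_workers = np.zeros((n_workers, dim))

for step in range(200):
    # Honest workers take a subgradient step on their own loss plus the
    # l1 penalty lam * ||x_i - x_master||_1 tying them to the master.
    for i in range(n_byz, n_workers):
        grad = x_workers[i] - targets[i]
        x_workers[i] -= lr * (grad + lam * np.sign(x_workers[i] - x_master))
    # Byzantine workers report arbitrary, wildly wrong states.
    x_workers[:n_byz] = rng.normal(scale=100.0, size=(n_byz, dim))
    # The master aggregates only through bounded sign terms, so each
    # worker's per-step influence is at most lr * lam per coordinate,
    # regardless of what it sends.
    penalty = lam * np.sign(x_master - x_workers).sum(axis=0)
    x_master -= lr * penalty

print(np.abs(x_master).max())
```

The key contrast with plain gradient averaging is visible in the `penalty` line: averaging would let a single Byzantine worker inject an unbounded vector, while the sign-based aggregation clips every worker's contribution to the same fixed magnitude.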
The global attractors and their Hausdorff and fractal dimensions estimation for the higher-order nonlinear Kirchhoff-type equation
We investigate the global well-posedness and the long-time dynamics of solutions for the higher-order Kirchhoff-type equation with nonlinear strong dissipation: $u_{tt} + (-\Delta)^m u_t + \phi(\|D^m u\|^2)(-\Delta)^m u + g(u) = f(x)$. Under proper assumptions, the main results are as follows: the existence and uniqueness of the solution are proved by a priori estimates and the Galerkin method; the existence of a finite-dimensional global attractor is established; and the Hausdorff and fractal dimensions of the global attractor are estimated.
The global attractors and their Hausdorff and fractal dimensions estimation for the higher-order nonlinear Kirchhoff-type equation with nonlinear strongly damped terms
In this paper, we study the long-time behavior of solutions to the initial-boundary value problem for the higher-order Kirchhoff-type equation with nonlinear strong dissipation. First, we prove the existence and uniqueness of the solution by a priori estimates and the Galerkin method; then we establish the existence of the global attractor; finally, we obtain estimates of the upper bounds of the Hausdorff and fractal dimensions of the global attractor.