18,808 research outputs found

    On $k$-normality and Regularity of Normal Toric Varieties

    We give a bound on $k$ for a very ample lattice polytope to be $k$-normal. Equivalently, we give a new combinatorial bound for the Castelnuovo-Mumford regularity of normal projective toric varieties. Comment: Updated version with the improved main result.
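
    For orientation, a brief reminder of the standard definition behind the abstract, in our own notation (a lattice polytope $P \subset \mathbb{R}^n$); this is assumed background, not a statement of the paper's bound:

    ```latex
    % k-normality of a (very ample) lattice polytope P: every lattice point of the
    % k-th dilate kP decomposes as a sum of k lattice points of P, i.e.
    \[
      kP \cap \mathbb{Z}^n
      \;=\;
      \underbrace{(P \cap \mathbb{Z}^n) + \cdots + (P \cap \mathbb{Z}^n)}_{k\ \text{times}} .
    \]
    % P is called normal when this holds for every k >= 1.
    ```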

    An Eisenbud-Goto-type Upper Bound for the Castelnuovo-Mumford Regularity of Fake Weighted Projective Spaces

    We give an upper bound for the $k$-normality of very ample lattice simplices, and then give an Eisenbud-Goto-type bound for some special classes of projective toric varieties.
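
    As assumed context for the title, the classical Eisenbud-Goto inequality that bounds of this type are modeled on, stated here in its conjectural form for a nondegenerate irreducible projective variety (not the specific bound proved in the paper):

    ```latex
    % Eisenbud-Goto-type regularity bound for X \subseteq \mathbb{P}^N, nondegenerate and irreducible:
    \[
      \operatorname{reg}(X) \;\le\; \deg(X) \;-\; \operatorname{codim}(X) \;+\; 1 .
    \]
    ```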

    Photoenhanced spin/valley polarization and tunneling magnetoresistance in ferromagnetic-normal-ferromagnetic silicene junction

    We theoretically demonstrate a simple way to significantly enhance the spin/valley polarizations and tunneling magnetoresistance (TMR) in a ferromagnetic-normal-ferromagnetic (FNF) silicene junction by applying circularly polarized light in the off-resonant regime to the second ferromagnetic (FM) region. We show that a fully spin-polarized current can be realized in certain ranges of light intensity. Increasing the incident energy in the presence of light induces a transition of the perfect spin polarization from positive to negative or vice versa, depending on the magnetic configuration (parallel or anti-parallel) of the FNF junction. Additionally, under circularly polarized light the valley polarization is very sensitive to the electric field, and perfect valley polarization can be achieved even when the staggered electric field is much smaller than the exchange field. The most important result we would like to emphasize is that the perfect spin polarization and 100% TMR induced by circularly polarized light are completely independent of the barrier height in the normal region. Furthermore, a sign reversal of the TMR can be observed when the polarization direction of the light is changed. A condition for observing the 100% TMR is also reported. Our results are expected to be informative for real applications of FNF silicene junctions, especially in spintronics.
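
    A schematic low-energy Hamiltonian of the kind commonly written for silicene under off-resonant circularly polarized light, given here only as assumed background; the notation, and in particular the light-induced gap $\Delta_\Omega$, is ours and not taken from the paper:

    ```latex
    % Dirac Hamiltonian near valley \eta = \pm 1, for spin s = \pm 1 (schematic):
    \[
      H_{\eta s} \;=\; \hbar v_F \left( \eta k_x \tau_x + k_y \tau_y \right)
      \;+\; \left( \eta s\, \lambda_{SO} - \ell E_z + \eta\, \Delta_\Omega \right) \tau_z
      \;-\; s\, h ,
    \]
    % where \lambda_{SO} is the spin-orbit gap, \ell E_z the staggered electric potential,
    % h the exchange field in the ferromagnetic regions, and \Delta_\Omega the gap opened by
    % the off-resonant circularly polarized light, whose sign follows the light polarization.
    ```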

    Neural-based Natural Language Generation in Dialogue using RNN Encoder-Decoder with Semantic Aggregation

    Natural language generation (NLG) is an important component in spoken dialogue systems. This paper presents a model called Encoder-Aggregator-Decoder, an extension of a Recurrent Neural Network based Encoder-Decoder architecture. The proposed Semantic Aggregator consists of two components: an Aligner and a Refiner. The Aligner is a conventional attention calculated over the encoded input information, while the Refiner is another attention or gating mechanism stacked over the attentive Aligner in order to further select and aggregate the semantic elements. The proposed model can be jointly trained on both sentence planning and surface realization to produce natural language utterances. The model was extensively assessed on four different NLG domains, and the experimental results showed that the proposed generator consistently outperforms the previous methods on all the NLG domains. Comment: To appear at SIGDIAL 2017. arXiv admin note: text overlap with arXiv:1706.00134, arXiv:1706.0013
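
    A minimal sketch of the Aligner-plus-Refiner idea described above, in PyTorch-style code with our own naming and dimensions; the actual model's gating form, layer sizes, and training procedure are not specified here:

    ```python
    import torch
    import torch.nn as nn

    class SemanticAggregator(nn.Module):
        """Aligner (attention over encoder states) followed by a Refiner (gate on the attended context)."""

        def __init__(self, enc_dim: int, dec_dim: int):
            super().__init__()
            self.score = nn.Linear(enc_dim + dec_dim, 1)        # additive attention scorer (Aligner)
            self.gate = nn.Linear(enc_dim + dec_dim, enc_dim)   # gating layer (Refiner)

        def forward(self, enc_states: torch.Tensor, dec_state: torch.Tensor) -> torch.Tensor:
            # enc_states: (T, enc_dim) encoded input; dec_state: (dec_dim,) current decoder state
            expanded = dec_state.unsqueeze(0).expand(enc_states.size(0), -1)
            pairs = torch.cat([enc_states, expanded], dim=-1)
            # Aligner: conventional attention over the encoded input information
            weights = torch.softmax(self.score(pairs).squeeze(-1), dim=0)
            context = (weights.unsqueeze(-1) * enc_states).sum(dim=0)
            # Refiner: gating mechanism stacked over the attentive Aligner output
            gate = torch.sigmoid(self.gate(torch.cat([context, dec_state], dim=-1)))
            return gate * context

    # toy usage: 5 encoded slots of size 8, decoder state of size 6
    agg = SemanticAggregator(enc_dim=8, dec_dim=6)
    out = agg(torch.randn(5, 8), torch.randn(6))
    print(out.shape)  # torch.Size([8])
    ```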

    Privacy-Preserving Deep Learning via Weight Transmission

    This paper considers the scenario in which multiple data owners wish to apply a machine learning method over the combined dataset of all owners to obtain the best possible learning output, but do not want to share their local datasets owing to privacy concerns. We design systems for the scenario in which the stochastic gradient descent (SGD) algorithm is used as the machine learning method, because SGD (or its variants) is at the heart of recent deep learning techniques over neural networks. Our systems differ from existing systems in the following features: (1) any activation function can be used, meaning that no privacy-preserving-friendly approximation is required; (2) gradients computed by SGD are not shared, but the weight parameters are shared instead; and (3) robustness against colluding parties even in the extreme case that only one honest party exists. We prove that our systems, while privacy-preserving, achieve the same learning accuracy as SGD and hence retain the merit of deep learning with respect to accuracy. Finally, we conduct several experiments using benchmark datasets and show that our systems outperform previous systems in terms of learning accuracy. Comment: Full version of a conference paper at NSS 201
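
    A toy illustration of the weight-transmission idea: each party runs SGD locally and only the weight vector travels between parties, never the gradients or the data. The round-robin schedule, the logistic-regression model, and all names below are our own simplifying assumptions; a real system would also encrypt the channel:

    ```python
    import numpy as np

    def local_sgd_pass(weights, X, y, lr=0.1):
        """One local SGD epoch on a toy logistic-regression model; gradients never leave the party."""
        for xi, yi in zip(X, y):
            pred = 1.0 / (1.0 + np.exp(-xi @ weights))      # sigmoid prediction
            weights = weights - lr * (pred - yi) * xi       # plain SGD step on the log-loss gradient
        return weights

    def train_round_robin(parties, dim, rounds=5):
        """Parties take turns: each trains locally, then transmits only the updated weight vector."""
        weights = np.zeros(dim)
        for _ in range(rounds):
            for X, y in parties:     # in a real system this hand-off would go over an encrypted channel
                weights = local_sgd_pass(weights, X, y)
        return weights

    # toy usage: two parties holding private datasets of 20 samples, 3 features each
    rng = np.random.default_rng(0)
    party_a = (rng.normal(size=(20, 3)), rng.integers(0, 2, size=20))
    party_b = (rng.normal(size=(20, 3)), rng.integers(0, 2, size=20))
    print(train_round_robin([party_a, party_b], dim=3))
    ```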

    Mott transitions in a three-component Falicov-Kimball model: A slave boson mean-field study

    Metal-insulator transitions in a three-component Falicov-Kimball model are investigated within the Kotliar-Ruckenstein slave boson mean-field approach. The model describes a mixture of two interacting fermion atom species loaded into an optical lattice at ultralow temperature. One species consists of two-component atoms, which can hop in the optical lattice, and the other of single-component atoms, which are localized. Different correlation-driven metal-insulator transitions are observed depending on the atom filling conditions and the local interactions. These metal-insulator transitions are classified by the band renormalization factors and the double occupancies of the atom species. The filling conditions and the critical values of the local interactions for these metal-insulator transitions are also established analytically. The obtained results not only are in good agreement with dynamical mean-field theory for the three-component Falicov-Kimball model but also clarify the nature and properties of the metal-insulator transitions within a simple physical picture.
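
    A schematic Hamiltonian of the kind the abstract refers to, written in our own notation as assumed background; the couplings and labels may differ from the paper's conventions:

    ```latex
    % Three-component Falicov-Kimball model: mobile two-component atoms c_{i\sigma}
    % (\sigma = \uparrow,\downarrow) and a localized single-component species f_i.
    \[
      H \;=\; -t \sum_{\langle i j \rangle, \sigma} c^{\dagger}_{i\sigma} c_{j\sigma}
      \;+\; U \sum_{i} n^{c}_{i\uparrow} n^{c}_{i\downarrow}
      \;+\; U_f \sum_{i,\sigma} n^{c}_{i\sigma}\, n^{f}_{i} ,
    \]
    % with n^{c}_{i\sigma} = c^{\dagger}_{i\sigma} c_{i\sigma} and n^{f}_{i} = f^{\dagger}_{i} f_{i}.
    ```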

    The classification of constant weighted curvature curves in the plane with a log-linear density

    In this paper, we classify constant weighted curvature curves in the plane with a log-linear density or, in other words, classify all traveling curved fronts with a constant forcing term in $\mathbb{R}^2$. The classification gives some interesting phenomena and consequences, including: the family of curves converges to a round point when the weighted curvature of the curves (or, equivalently, the forcing term of the traveling curved fronts) goes to infinity; a simple proof of a main result in [13]; and some well-known facts concerning the isoperimetric problem in the plane with density $e^y$.
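
    For orientation, the standard notion of weighted curvature in the plane with density $e^{\varphi}$, specialized to a log-linear density; the notation is ours and is meant only to indicate why constant weighted curvature curves and traveling fronts with constant forcing coincide:

    ```latex
    % Weighted curvature of a plane curve with unit normal N in the plane with density e^{\varphi}:
    \[
      \kappa_{\varphi} \;=\; \kappa \;-\; \langle \nabla\varphi,\, N \rangle .
    \]
    % For a log-linear density \varphi(x,y) = \langle a, (x,y) \rangle with a constant vector a,
    % the equation \kappa_{\varphi} = c becomes \kappa = \langle a, N \rangle + c,
    % which is the equation of a traveling curved front with constant forcing term c.
    ```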

    On the Convergence Proof of AMSGrad and a New Version

    The adaptive moment estimation algorithm Adam (Kingma and Ba) is a popular optimizer in the training of deep neural networks. However, Reddi et al. have recently shown that the convergence proof of Adam is problematic and proposed a variant of Adam called AMSGrad as a fix. In this paper, we show that the convergence proof of AMSGrad is also problematic. Concretely, the problem in the convergence proof of AMSGrad lies in handling the hyper-parameters, treating them as equal while they are not. This is also a neglected issue in the convergence proof of Adam. We provide an explicit counter-example in a simple convex optimization setting to demonstrate this issue. Depending on how the hyper-parameters are handled, we present various fixes for this issue. We provide a new convergence proof for AMSGrad as the first fix. We also propose a new version of AMSGrad, called AdamX, as another fix. Our experiments on the benchmark dataset also support our theoretical results. Comment: Update publication information.
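
    For reference, a minimal NumPy sketch of the standard AMSGrad update, whose distinguishing step is the running maximum over the second-moment estimate; this is the published algorithm of Reddi et al., not the AdamX variant proposed in the paper:

    ```python
    import numpy as np

    def amsgrad_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One AMSGrad update; `state` carries the moment estimates between calls."""
        m, v, v_hat = state
        m = beta1 * m + (1 - beta1) * grad           # first-moment (momentum) estimate
        v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
        v_hat = np.maximum(v_hat, v)                 # AMSGrad's fix: keep v_hat non-decreasing
        theta = theta - lr * m / (np.sqrt(v_hat) + eps)
        return theta, (m, v, v_hat)

    # toy usage on f(theta) = ||theta||^2 / 2, whose gradient is simply theta
    theta = np.array([1.0, -2.0])
    state = (np.zeros(2), np.zeros(2), np.zeros(2))
    for _ in range(500):
        theta, state = amsgrad_step(theta, theta.copy(), state, lr=0.05)
    print(theta)  # should end up close to the minimizer at the origin
    ```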

    An introduction to food safety issues in Vietnam

    Bernstein type theorem for entire weighted minimal graphs in $\mathbb{G}^n\times\mathbb{R}$

    Based on a calibration argument, we give a simple proof of a Bernstein type theorem for entire minimal graphs over the Gauss space $\mathbb{G}^n$.
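
    As assumed background, one common way to write the weighted minimal graph equation in this setting; the density normalization and sign conventions below are ours:

    ```latex
    % Gauss space \mathbb{G}^n: \mathbb{R}^n endowed with the Gaussian density e^{-|x|^2/2}.
    % A graph x_{n+1} = u(x) in \mathbb{G}^n \times \mathbb{R} is weighted minimal when its
    % weighted mean curvature vanishes, which for this density reads (up to sign conventions)
    \[
      \operatorname{div}\!\left( \frac{\nabla u}{\sqrt{1+|\nabla u|^2}} \right)
      \;-\; \frac{\langle x,\, \nabla u \rangle}{\sqrt{1+|\nabla u|^2}} \;=\; 0 .
    \]
    ```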