
    Quantization of Black Holes

    We show that black holes can be quantized in an intuitive and elegant way, with results that agree with the conventional understanding of black holes, by applying Bohr's idea of quantizing the motion of an electron inside the atom. We find that the properties of black holes can also be derived from an Ansatz of quantized entropy, $\Delta S = 4\pi k\,\Delta R/\lambdabar$, which was suggested in a previous work to unify the black-hole entropy formula with Verlinde's conjecture that gravity is an entropic force. This Ansatz also explains gravity as an entropic force arising from a quantum effect, suggesting a way to unify gravity with quantum theory. Several interesting and surprising results for black holes follow, from which we predict the existence of primordial black holes ranging from Planck scale in both size and energy to ones large in size but with low-energy behavior.
    Comment: LaTeX, 7 pages, no figures
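
    As a quick consistency check (our own, and hinging on the assumption that $\lambdabar$ denotes the hole's reduced Compton wavelength $\hbar/(Mc)$ and that $\Delta R$ tracks the Schwarzschild radius $R = 2GM/c^2$), the Ansatz integrates directly to the Bekenstein-Hawking entropy:

        % illustrative derivation under the stated assumptions
        dS = 4\pi k\,\frac{dR}{\lambdabar}
           = 4\pi k \left(\frac{2G\,dM}{c^{2}}\right) \frac{Mc}{\hbar}
           = \frac{8\pi k G}{\hbar c}\, M\, dM,
        \qquad
        S = \int_{0}^{M} \frac{8\pi k G}{\hbar c}\, M'\, dM'
          = \frac{4\pi k G M^{2}}{\hbar c}
          = \frac{k c^{3} A}{4 G \hbar},
        \quad A = 4\pi R^{2} = \frac{16\pi G^{2} M^{2}}{c^{4}}.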

    Hints of Standard Model Higgs Boson at the LHC and Light Dark Matter Searches

    The most recent results of searches at the LHC for the Higgs boson h have turned up possible hints of such a particle with mass m_h of about 125 GeV, consistent with standard model (SM) expectations. This has many potential implications for the SM and beyond. We consider some of them in the context of a simple Higgs-portal dark matter (DM) model, the SM plus a real gauge-singlet scalar field D as the DM candidate, and a couple of its variations. In the simplest model with one Higgs doublet and three or four generations of fermions, for D mass m_D < m_h/2 the invisible decay h -> DD tends to have a substantial branching ratio. If future LHC data confirm the preliminary Higgs indications, m_D will have to exceed m_h/2. To keep the DM lighter than m_h/2, one will need to extend the model and also satisfy constraints from DM direct searches. The latter can be accommodated if the model provides sizable isospin violation in the DM-nucleon interactions. We explore this in a two-Higgs-doublet model combined with the scalar field D. This model can offer a 125-GeV SM-like Higgs and a light DM candidate having isospin-violating interactions with nucleons at roughly the required level, albeit with some degree of fine-tuning.
    Comment: 17 pages, 4 figures, slightly revised, main conclusions unchanged, references added, matches published version
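
    For concreteness, the portal coupling such models typically employ (the abstract does not display it; this is the standard real-singlet form, with the quartic and portal couplings \lambda_D, \lambda and bare mass m_0 as illustrative symbols) is

        % standard real-singlet Higgs-portal Lagrangian (assumed form)
        \mathcal{L}_D = \tfrac{1}{2}\,\partial^{\mu} D\, \partial_{\mu} D
                      - \tfrac{1}{2}\, m_{0}^{2} D^{2}
                      - \tfrac{\lambda_D}{4}\, D^{4}
                      - \lambda\, D^{2} H^{\dagger} H,

    so that after electroweak symmetry breaking the term -\lambda v\, h\, D^2 drives the invisible decay h -> DD, with m_D^2 = m_0^2 + \lambda v^2.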

    Generating Adversarial Examples with Adversarial Networks

    Deep neural networks (DNNs) have been found to be vulnerable to adversarial examples resulting from adding small-magnitude perturbations to inputs. Such adversarial examples can mislead DNNs into producing adversary-selected results. Different attack strategies have been proposed to generate adversarial examples, but producing them with high perceptual quality and high efficiency still requires further research. In this paper, we propose AdvGAN, which generates adversarial examples with generative adversarial networks (GANs) that can learn and approximate the distribution of original instances. Once the AdvGAN generator is trained, it can generate adversarial perturbations efficiently for any instance, potentially accelerating adversarial training as a defense. We apply AdvGAN in both semi-whitebox and black-box attack settings. In semi-whitebox attacks, there is no need to access the original target model after the generator is trained, in contrast to traditional white-box attacks. In black-box attacks, we dynamically train a distilled model for the black-box model and optimize the generator accordingly. Adversarial examples generated by AdvGAN on different target models achieve high attack success rates under state-of-the-art defenses compared to other attacks. Our attack placed first, with 92.76% accuracy, on a public MNIST black-box attack challenge.
    Comment: Accepted to IJCAI 2018
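
    To make the GAN-based attack concrete, below is a minimal PyTorch sketch of an AdvGAN-style training objective. This is our illustration under stated assumptions, not the authors' code: the tiny generator, the discriminator and classifier call signatures, the untargeted adversarial loss, and the loss weights are all placeholder choices.

        # AdvGAN-style generator objective (illustrative sketch): the generator G maps
        # an input x to a perturbation G(x), and x_adv = clip(x + G(x)) must (i) look
        # real to a discriminator, (ii) fool the target classifier, and (iii) stay
        # within a perturbation budget.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class Generator(nn.Module):
            """Tiny conv net emitting a bounded perturbation for 1x28x28 inputs."""
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 1, 3, padding=1), nn.Tanh(),  # values in [-1, 1]
                )

            def forward(self, x):
                return self.net(x)

        def generator_loss(G, discriminator, target_model, x, y, eps=0.3):
            """Sum of the three AdvGAN-style terms; `discriminator` is assumed to
            return one logit per image and `target_model` class logits."""
            perturb = G(x)
            x_adv = torch.clamp(x + perturb, 0.0, 1.0)
            # (i) GAN term: the discriminator should judge x_adv as real.
            real = torch.ones(x.size(0), 1, device=x.device)
            loss_gan = F.binary_cross_entropy_with_logits(discriminator(x_adv), real)
            # (ii) Adversarial term (untargeted here): push the target model away
            # from the true labels y by maximizing its cross-entropy.
            loss_adv = -F.cross_entropy(target_model(x_adv), y)
            # (iii) Hinge term: penalize perturbation L2 norms above the budget.
            norms = perturb.flatten(1).norm(dim=1)
            loss_hinge = torch.clamp(norms - eps, min=0.0).mean()
            return loss_gan + 10.0 * loss_adv + loss_hinge  # weights are placeholders

    A full training loop would alternate this with a standard discriminator update on real versus perturbed images; in the black-box setting described above, target_model would be the distilled surrogate rather than the victim model.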

    Tree-Structured Neural Machine for Linguistics-Aware Sentence Generation

    Unlike other sequential data, sentences in natural language are structured by linguistic grammar. Previous generative conversational models with a chain-structured decoder ignore this structure in human language and may generate plausible responses with less satisfactory relevance and fluency. In this study, we aim to incorporate the results of linguistic analysis into the process of sentence generation for high-quality conversation generation. Specifically, we use a dependency parser to transform each response sentence into a dependency tree and construct a training corpus of sentence-tree pairs. A tree-structured decoder is developed to learn the mapping from a sentence to its tree, where different types of hidden states are used to depict the local dependencies from an internal tree node to its children. To accelerate training, we propose a tree canonicalization method that transforms trees into equivalent ternary trees. Then, with a proposed tree-structured search method, the model is able to generate the most probable responses in the form of dependency trees, which are finally flattened into sequences as the system output. Experimental results demonstrate that the proposed X2Tree framework outperforms baseline methods, with an increase of over 11.15% in acceptance ratio.
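
    The abstract does not spell out the canonicalization itself. One standard way to bound every node's out-degree at three (our assumption; the paper's exact scheme may differ) is to keep a node's first two children in place and chain the remaining siblings through reserved continuation nodes; the transform is invertible, so the ternary tree is equivalent to the original:

        # Hypothetical ternary canonicalization sketch: each node keeps at most two
        # original children plus one "<cont>" child chaining the rest. Invertible,
        # so no information is lost. (Illustrative only; not X2Tree's exact scheme.)
        from dataclasses import dataclass, field

        @dataclass
        class Node:
            label: str
            children: list = field(default_factory=list)

        CONT = "<cont>"  # reserved label; assumed not to occur in real trees

        def to_ternary(node: Node) -> Node:
            kids = [to_ternary(c) for c in node.children]
            out = Node(node.label)
            cur = out
            while len(kids) > 3:
                cur.children = kids[:2]      # keep two real children here
                cont = Node(CONT)            # third slot chains the remainder
                cur.children.append(cont)
                cur, kids = cont, kids[2:]
            cur.children = kids              # at most three remain
            return out

        def from_ternary(node: Node) -> Node:
            kids = list(node.children)
            while kids and kids[-1].label == CONT:
                kids.extend(kids.pop().children)  # splice chained siblings back in
            return Node(node.label, [from_ternary(c) for c in kids])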