
    Quantum cryptography: key distribution and beyond

    Uniquely among the sciences, quantum cryptography has driven both foundational research and practical real-life applications. We review the progress of quantum cryptography in the last decade, covering quantum key distribution and other applications.
    Comment: It's a review on quantum cryptography and it is not restricted to QKD
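
    To make the key-distribution idea concrete, here is a minimal, illustrative BB84-style sifting simulation in Python. It is a toy sketch of the generic protocol, not anything specific to this review; it assumes a noiseless channel and no eavesdropper.

    ```python
    import random

    def bb84_sift(n_qubits=1000, seed=0):
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
        alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0: rectilinear, 1: diagonal
        bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]
        # Ideal channel: when bases match, Bob reads Alice's bit exactly;
        # when they differ, his measurement outcome is uniformly random.
        bob_bits = [a if ab == bb else rng.randint(0, 1)
                    for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
        # Sifting: both parties keep only the positions where the bases matched.
        key_a = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
        key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
        assert key_a == key_b          # identical keys in the noiseless case
        return key_a

    print(len(bb84_sift()))            # roughly n_qubits / 2 sifted bits
    ```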

    Recent advances on recursive filtering and sliding mode design for networked nonlinear stochastic systems: A survey

    Some recent advances on the recursive filtering and sliding mode design problems for nonlinear stochastic systems with network-induced phenomena are surveyed. The network-induced phenomena under consideration include missing measurements, fading measurements, signal quantization, probabilistic sensor delays, sensor saturations, randomly occurring nonlinearities, and randomly occurring uncertainties. With respect to these phenomena, developments on the filtering and sliding mode design problems are systematically reviewed. In particular, some recent results are given on recursive filtering for time-varying nonlinear stochastic systems and on sliding mode design for time-invariant nonlinear stochastic systems. Finally, conclusions are drawn and some potential directions for future research are pointed out.
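
    As one concrete instance of the "missing measurements" phenomenon the survey lists, here is a hedged sketch of a scalar Kalman-style recursive filter whose measurements arrive with Bernoulli probability. The linear-Gaussian model and every parameter value below are assumptions for illustration, not the nonlinear systems treated in the surveyed papers.

    ```python
    import random

    a, c = 0.95, 1.0          # state and measurement coefficients (assumed)
    q, r = 0.01, 0.04         # process / measurement noise variances (assumed)
    p_arrive = 0.8            # probability a measurement reaches the filter (assumed)

    def step(x_hat, P, y, arrived):
        # Time update (prediction)
        x_pred = a * x_hat
        P_pred = a * a * P + q
        if not arrived:        # missing measurement: keep the prediction
            return x_pred, P_pred
        # Measurement update
        K = P_pred * c / (c * c * P_pred + r)
        x_new = x_pred + K * (y - c * x_pred)
        P_new = (1 - K * c) * P_pred
        return x_new, P_new

    rng = random.Random(1)
    x, x_hat, P = 1.0, 0.0, 1.0
    for k in range(50):
        x = a * x + rng.gauss(0, q ** 0.5)                    # true state
        arrived = rng.random() < p_arrive                     # Bernoulli arrival
        y = c * x + rng.gauss(0, r ** 0.5) if arrived else None
        x_hat, P = step(x_hat, P, y, arrived)
    print(f"final estimate {x_hat:.3f} vs true state {x:.3f}")
    ```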

    Beyond Stemming and Lemmatization: Ultra-stemming to Improve Automatic Text Summarization

    In Automatic Text Summarization, preprocessing is an important phase for reducing the space of textual representation. Classically, stemming and lemmatization have been widely used for normalizing words. However, even with normalization, on large texts the curse of dimensionality can hurt the performance of summarizers. This paper describes a new word-normalization method that further reduces the space of representation: we propose to reduce each word to its initial letters, a form of Ultra-stemming. The results show that Ultra-stemming not only preserves the content of the summaries produced from this representation, but can often dramatically improve the performance of the systems. Summaries over trilingual corpora were evaluated automatically with Fresa. The results confirm an increase in performance regardless of the summarizer used.
    Comment: 22 pages, 12 figures, 9 tables
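
    A minimal illustration of the ultra-stemming idea described above, reducing each word to its first k letters. The truncation length k and the simple regex tokenization are assumptions of this sketch, not parameters taken from the paper.

    ```python
    import re

    def ultra_stem(text, k=1):
        # Tokenize on word characters, lowercase, and keep only the first
        # k letters of each word (ultra-stemming as described in the abstract).
        words = re.findall(r"\w+", text.lower(), flags=re.UNICODE)
        return [w[:k] for w in words]

    print(ultra_stem("Stemming and lemmatization normalize words"))
    # ['s', 'a', 'l', 'n', 'w']
    print(ultra_stem("Stemming and lemmatization normalize words", k=3))
    # ['ste', 'and', 'lem', 'nor', 'wor']
    ```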

    The VWFA: It's not just for words anymore

    Reading is an important but phylogenetically new skill. While neuroimaging studies have identified brain regions used in reading, it is unclear to what extent these regions become specialized for use predominantly in reading vs. other tasks. Over the past several years, our group has published three studies addressing this question, focusing in particular on whether the putative visual word form area (VWFA) is used predominantly in reading, or whether it is used more generally across a number of tasks. Our three studies utilize a range of neuroimaging techniques, including task-based fMRI experiments, a seed-based resting state functional connectivity (RSFC) experiment, and a network-based RSFC experiment. Overall, our studies indicate that the VWFA is not used specifically, or even predominantly, for reading. Rather, the VWFA is a general-use region whose processing properties make it particularly useful for reading, and it continues to be used in any task that requires those properties. Our network-based RSFC analysis extends this finding to other regions typically thought to be used predominantly for reading. Here, we review these findings and describe how the three studies complement each other. We then argue that conceptualizing the VWFA as a brain region with specific processing characteristics, rather than a brain region devoted to a specific stimulus class, allows us to better explain the activity seen in this region during a variety of tasks. This conceptualization not only provides a better understanding of the VWFA but also provides a framework for understanding other brain regions, as it affords an explanation of function in keeping with the long history of studying the brain in terms of the type of information processing performed (Posner, 1978).
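
    For readers unfamiliar with seed-based RSFC, here is a hedged sketch of the basic computation on synthetic data: correlate a seed region's mean time series with every voxel's time series. It illustrates the generic method only and makes no claim about the authors' preprocessing, seed definition, or statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_timepoints, n_voxels = 200, 1000
    bold = rng.standard_normal((n_timepoints, n_voxels))   # synthetic BOLD data
    seed_ts = bold[:, :10].mean(axis=1)                    # assumed seed: mean of 10 voxels

    # Pearson correlation of the seed time series with every voxel's time series
    bold_c = bold - bold.mean(axis=0)
    seed_c = seed_ts - seed_ts.mean()
    conn_map = (bold_c * seed_c[:, None]).sum(axis=0) / (
        np.sqrt((bold_c ** 2).sum(axis=0)) * np.sqrt((seed_c ** 2).sum()))

    # Voxels inside the assumed seed correlate at about 1/sqrt(10) with its
    # mean; the rest hover near zero for this independent synthetic data.
    print(conn_map[:10].mean(), conn_map[10:].mean())
    ```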

    Classical light vs. nonclassical light: Characterizations and interesting applications

    We briefly review the ideas that have shaped modern optics and have led to various applications of light, ranging from spectroscopy to astrophysics and from street lights to quantum communication. The review focuses primarily on modern applications of classical and nonclassical light. Specific attention is given to applications of squeezed, antibunched, and entangled states of the radiation field. Applications of Fock states (especially single-photon states) in the field of quantum communication are also discussed.
    Comment: 32 pages, 3 figures, a review on applications of light
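
    One standard way to characterize antibunched (nonclassical) light is the second-order correlation g2(0) = <a†a†aa>/<a†a>^2, where g2(0) < 1 is impossible for classical light. Below is a hedged numerical sketch in a truncated number basis; the truncation size N and the state choices are assumptions of this demo, not anything from the review.

    ```python
    import math
    import numpy as np

    N = 40                                        # Fock-space truncation (assumed)
    a = np.diag(np.sqrt(np.arange(1, N)), k=1)    # annihilation operator in the number basis
    ad = a.conj().T                               # creation operator

    def g2(psi):
        """Second-order correlation g2(0) = <a+ a+ a a> / <a+ a>^2 for state psi."""
        num = psi.conj() @ (ad @ ad @ a @ a) @ psi
        den = (psi.conj() @ (ad @ a) @ psi) ** 2
        return (num / den).real

    fock1 = np.zeros(N); fock1[1] = 1.0           # single-photon state |1>
    fock2 = np.zeros(N); fock2[2] = 1.0           # two-photon state |2>
    alpha = 1.5                                    # coherent-state amplitude (assumed)
    coh = np.exp(-abs(alpha) ** 2 / 2) * np.array(
        [alpha ** n / math.sqrt(math.factorial(n)) for n in range(N)])

    print(g2(fock1))   # 0.0  (perfect antibunching, nonclassical)
    print(g2(fock2))   # 0.5  (equals 1 - 1/2, still antibunched)
    print(g2(coh))     # ~1.0 (coherent, classical-like light)
    ```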

    Interactions, structure and properties in poly(lactic acid)/thermoplastic polymer blends

    Blends were prepared from poly(lactic acid) (PLA) and three thermoplastics: polystyrene (PS), polycarbonate (PC), and poly(methyl methacrylate) (PMMA). Rheological and mechanical properties, structure, and component interactions were determined by various methods. The results showed that the structure and properties of the blends cover a relatively wide range. All three blends have a heterogeneous structure, but the size of the dispersed particles differs by an order of magnitude, indicating dissimilar interactions for the corresponding pairs. Properties change accordingly: the blend containing the smallest dispersed particles has the largest tensile strength, while the PLA/PS blends with the coarsest structure have the smallest and are also very brittle. Component interactions were estimated by four different methods: determination of the size of the dispersed particles, calculation of the Flory-Huggins interaction parameter from solvent absorption, its estimation from solubility parameters, and quantitative evaluation of the composition dependence of tensile strength. All approaches led to the same result, indicating strong interactions for the PLA/PMMA pair and weak ones for PLA/PS. A general correlation was established between interactions and the mechanical properties of the blends.
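
    To illustrate the solubility-parameter route to the Flory-Huggins interaction parameter mentioned above, a hedged sketch using chi ~ V_ref (delta1 - delta2)^2 / (R T). The delta values below are representative handbook-style numbers and the reference volume and temperature are assumed, so the results are not those determined in the paper.

    ```python
    # chi ~ V_ref * (delta1 - delta2)^2 / (R * T); all inputs are illustrative.
    R = 8.314          # gas constant, J/(mol K)
    T = 298.0          # temperature, K (assumed)
    V_ref = 1e-4       # reference molar volume, m^3/mol (assumed ~100 cm^3/mol)

    # Representative handbook-style solubility parameters in Pa^0.5,
    # NOT the values determined in the paper.
    delta = {"PLA": 20.2e3, "PS": 18.6e3, "PMMA": 19.4e3, "PC": 19.5e3}

    for other in ("PS", "PC", "PMMA"):
        chi = V_ref * (delta["PLA"] - delta[other]) ** 2 / (R * T)
        print(f"chi(PLA/{other}) ~ {chi:.3f}")
    # Smaller chi suggests better miscibility, but the ranking is sensitive
    # to the delta values chosen, which is one reason the paper cross-checks
    # interactions with four independent methods.
    ```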

    Tracing the Ingredients for a Habitable Earth from Interstellar Space through Planet Formation

    We use the C/N ratio as a monitor of the delivery of key ingredients of life to nascent terrestrial worlds. Total elemental C and N contents, and their ratio, are examined for the interstellar medium, comets, chondritic meteorites, and terrestrial planets; we include an updated estimate for the Bulk Silicate Earth (C/N = 49.0 +/- 9.3). Using a kinetic model of disk chemistry, and the sublimation/condensation temperatures of primitive molecules, we suggest that organic ices and macromolecular (refractory or carbonaceous dust) organic material are the likely initial C and N carriers. Chemical reactions in the disk can produce nebular C/N ratios of ~1-12, comparable to those of comets and to the low end estimated for planetesimals. An increase in the C/N ratio is traced between volatile-rich pristine bodies and larger volatile-depleted objects subjected to thermal/accretional metamorphism. The C/N ratios of the dominant materials accreted to terrestrial planets should therefore be higher than those seen in carbonaceous chondrites or comets. During planetary formation, we explore scenarios leading to further volatile loss and associated C/N variations owing to core formation and atmospheric escape. Key processes include relative enrichment of nitrogen in the atmosphere and preferential sequestration of carbon by the core. The high BSE C/N ratio is therefore best satisfied by accretion of thermally processed objects followed by large-scale atmospheric loss; these two effects must be even more profound if volatile sequestration in the core is effective. The stochastic nature of these processes hints that the surface/atmospheric abundances of biosphere-essential materials will likely be variable.
    Comment: Accepted by PNAS per http://www.pnas.org/content/early/2015/07/01/1500954112.abstract?sid=9fd8abea-9d33-46d8-b755-217d10b1c24
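
    A hedged back-of-the-envelope sketch of the mass-balance argument: if core formation preferentially removes carbon from the silicate Earth while atmospheric escape preferentially removes nitrogen, the observed BSE C/N constrains the two loss fractions. Every input number below is an illustrative assumption, not a value from the paper.

    ```python
    def bse_c_to_n(cn_accreted, f_c_to_core, f_n_lost):
        """C/N of the bulk silicate Earth after the two loss processes."""
        c = 1.0 * (1.0 - f_c_to_core)               # carbon left in the silicate Earth
        n = (1.0 / cn_accreted) * (1.0 - f_n_lost)  # nitrogen left after escape
        return c / n

    cn_accreted = 20.0   # assumed C/N of thermally processed accreted material
    for f_core in (0.0, 0.5):
        for f_atm in (0.0, 0.5, 0.8):
            print(f"core C loss {f_core:.0%}, atmospheric N loss {f_atm:.0%}: "
                  f"C/N = {bse_c_to_n(cn_accreted, f_core, f_atm):.0f}")
    # Reaching C/N ~ 49 from C/N ~ 20 requires large N loss, and even larger
    # loss if the core also sequesters carbon, matching the abstract's argument.
    ```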

    Generation and manipulation of nonclassical light using photonic crystals

    Photonic crystal cavities can localize light into nanoscale volumes with high quality factors. This permits a strong interaction between light and matter, which is important for the construction of classical light sources with improved properties (e.g., low-threshold lasers) and of nonclassical light sources (such as single and entangled photon sources) that are crucial hardware components of quantum information processing systems. This article reviews some of our recent experimental and theoretical results on the interaction between single quantum dots and photonic crystal cavity fields, and on the integration of multiple photonic crystal devices into functional circuits for quantum information processing.
    Comment: 6 pages, 6 figures; replaced with revised version
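
    The light-matter enhancement the abstract refers to is commonly quantified by the Purcell factor F_P = (3 / 4 pi^2) (lambda/n)^3 (Q/V). A hedged numerical sketch with assumed, order-of-magnitude values for a GaAs photonic crystal cavity, not the devices in the paper:

    ```python
    import math

    wavelength = 950e-9          # m, near typical quantum-dot emission (assumed)
    n = 3.45                     # refractive index of GaAs (approximate)
    Q = 10_000                   # cavity quality factor (assumed)
    V = 0.7 * (wavelength / n) ** 3   # mode volume of order (lambda/n)^3 (assumed)

    # Purcell factor: spontaneous-emission enhancement for an emitter on
    # resonance, ideally placed and oriented in the cavity field.
    F_P = (3 / (4 * math.pi ** 2)) * (wavelength / n) ** 3 * (Q / V)
    print(f"Purcell factor ~ {F_P:.0f}")   # ~1000: strong emission enhancement
    ```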

    What May Visualization Processes Optimize?

    In this paper, we present an abstract model of visualization and inference processes and describe an information-theoretic measure for optimizing such processes. To obtain such an abstraction, we first examined six classes of workflows in data analysis and visualization, and identified four levels of typical visualization components, namely disseminative, observational, analytical, and model-developmental visualization. We noticed a common phenomenon across these levels: the transformation of data spaces (referred to as alphabets) usually corresponds to a reduction of maximal entropy along a workflow. Based on this observation, we establish an information-theoretic cost-benefit measure that may be used as a cost function for optimizing a data visualization process. To demonstrate the validity of this measure, we examined a number of successful visualization processes in the literature, and showed that the measure can mathematically explain the advantages of such processes over possible alternatives.
    Comment: 10 pages
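
    To make the entropy bookkeeping concrete, a hedged toy sketch: each pipeline stage maps an alphabet (data space) to a smaller one, reducing the maximal entropy log2|alphabet|. The pipeline below is an assumed example only; the paper should be consulted for its actual cost-benefit definition, which also accounts for processing cost and distortion.

    ```python
    import math

    # Assumed toy pipeline: raw values -> binned values -> colormap levels
    alphabets = {
        "raw data (16-bit samples)": 2 ** 16,
        "binned histogram (256 bins)": 256,
        "colormap levels (8 colors)": 8,
    }

    stages = list(alphabets.items())
    for (name_in, size_in), (name_out, size_out) in zip(stages, stages[1:]):
        reduction = math.log2(size_in) - math.log2(size_out)
        print(f"{name_in} -> {name_out}: maximal entropy drops by {reduction:.0f} bits")
    # A cost-benefit measure can then weigh this alphabet compression against
    # the processing cost and any distortion the compression introduces.
    ```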