419 research outputs found

    Cutting the same fraction of several measures

    We study some measure partition problems: cut the same positive fraction of $d+1$ measures in $\mathbb{R}^d$ with a hyperplane, or find a convex subset of $\mathbb{R}^d$ on which $d+1$ given measures have the same prescribed value. For both problems, positive answers are given under some additional assumptions. Comment: 7 pages, 2 figures
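    The two problems can be stated in display form. This is a hedged restatement of the abstract, not the authors' exact notation:

    ```latex
    % Problem 1: cut the same positive fraction with a hyperplane.
    \exists \text{ half-space } H \subset \mathbb{R}^d,\ \alpha > 0 :\quad
      \mu_1(H) = \mu_2(H) = \cdots = \mu_{d+1}(H) = \alpha .

    % Problem 2: prescribed common value on a convex set.
    \text{Given } \alpha \in (0,1) :\quad
      \exists\, C \subseteq \mathbb{R}^d \text{ convex with }
      \mu_i(C) = \alpha \quad \text{for } i = 1, \dots, d+1 .
    ```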

    Nanofabrication of Surface-Enhanced Raman Scattering Device by an Integrated Block-Copolymer and Nanoimprint Lithography Method

    The integration of block copolymers and nanoimprint lithography presents a novel and cost-effective route to nanoscale patterning. The authors demonstrate the fabrication of a surface-enhanced Raman scattering device using templates created by the integrated block-copolymer and nanoimprint lithography method.

    Generalized Density-Functional Tight-Binding Repulsive Potentials from Unsupervised Machine Learning

    We combine the approximate density-functional tight-binding (DFTB) method with unsupervised machine learning. This allows us to improve transferability and accuracy, make use of large quantum chemical data sets for the parametrization, and efficiently automate the parametrization process of DFTB. For this purpose, generalized pair potentials are introduced, where the chemical environment is included during the learning process, leading to more specific effective two-body potentials. We train on energies and forces of equilibrium and nonequilibrium structures of 2100 molecules, and test on ∼130 000 organic molecules containing O, N, C, H, and F atoms. Atomization energies of the reference method can be reproduced within an error of ∼2.6 kcal/mol, indicating a drastic improvement over standard DFTB.
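    The core idea of fitting effective two-body potentials to reference energies can be illustrated with a toy least-squares fit. This is a hedged, numpy-only sketch under an assumed inverse-power basis; the data and names are synthetic placeholders, and the paper's generalized pair potentials additionally encode the chemical environment of each pair:

    ```python
    import numpy as np

    # Hedged sketch: learn an effective two-body repulsive potential
    # V(r) = c1/r + c2/r^2 + c3/r^3 from reference energies by linear least
    # squares. The basis and data are illustrative assumptions.

    rng = np.random.default_rng(0)
    POWERS = (1, 2, 3)

    def design_row(distances):
        # One structure contributes, for each basis power k, the sum of r^-k
        # over all of its pair distances.
        return [np.sum(distances ** -float(k)) for k in POWERS]

    # Synthetic reference data: random pair-distance lists with energies from a
    # known potential V(r) = 4/r - 1/r^2 (so the true coefficients are 4, -1, 0).
    structures = [rng.uniform(1.0, 3.0, size=rng.integers(3, 8)) for _ in range(50)]
    energies = np.array([np.sum(4.0 / d - 1.0 / d**2) for d in structures])
    energies += rng.normal(scale=1e-6, size=len(structures))  # tiny "method noise"

    A = np.array([design_row(d) for d in structures])
    coeffs, *_ = np.linalg.lstsq(A, energies, rcond=None)
    print(np.round(coeffs, 3))  # close to [4, -1, 0]
    ```

    Because the model is linear in the basis coefficients, the fit reduces to one `lstsq` call; an environment-dependent variant would replace the fixed basis with features of each pair's surroundings.
    
    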

    Detection of Ventricular Tachycardia Using Scanning Correlation Analysis

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/72896/1/j.1540-8159.1990.tb06919.x.pd

    Fair Mixing: the Case of Dichotomous Preferences

    We consider a setting in which agents vote to choose a fair mixture of public outcomes. The agents have dichotomous preferences: each outcome is liked or disliked by an agent. We discuss three outstanding voting rules. The Conditional Utilitarian rule, a variant of the random dictator rule, is strategyproof and guarantees any group of like-minded agents an influence proportional to its size. It is easier to compute and more efficient than the familiar Random Priority rule. Its worst-case (resp. average) inefficiency is provably (resp. in numerical experiments) low if the number of agents is low. The efficient Egalitarian rule protects individual agents but not coalitions. It is excludable strategyproof: an agent has no incentive to lie if she cannot consume the outcomes she claims to dislike. The efficient Nash Max Product rule offers the strongest welfare guarantees to coalitions, which can force any outcome with a probability proportional to their size. But it fails even the excludable form of strategyproofness.
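    The Conditional Utilitarian rule can be sketched in a few lines. This is a hedged reading of the abstract: each of the n agents controls probability 1/n and spends it, uniformly, on the outcomes she likes that enjoy the highest total approval. The function name and the uniform tie-breaking are assumptions for illustration:

    ```python
    from collections import defaultdict
    from fractions import Fraction

    def conditional_utilitarian(approvals, outcomes):
        """Conditional Utilitarian lottery over outcomes.

        Hedged sketch: `approvals` is a list of sets, one per agent, of the
        outcomes that agent likes. Each agent's 1/n probability share is split
        uniformly over her liked outcomes of maximal total approval.
        """
        n = len(approvals)
        support = {a: sum(a in liked for liked in approvals) for a in outcomes}
        lottery = defaultdict(Fraction)
        for liked in approvals:
            if not liked:  # an agent who likes nothing abstains in this sketch
                continue
            best = max(support[a] for a in liked)
            tops = [a for a in liked if support[a] == best]
            for a in tops:
                lottery[a] += Fraction(1, n) / len(tops)
        return dict(lottery)

    # Three agents, outcomes x, y, z: y is the most-approved liked outcome of
    # agents 1 and 2, so both spend their 1/3 on y; agent 3 spends hers on z.
    print(conditional_utilitarian([{"x", "y"}, {"y"}, {"z"}], ["x", "y", "z"]))
    # {'y': Fraction(2, 3), 'z': Fraction(1, 3)}
    ```

    Exact `Fraction` arithmetic makes the proportional-influence property easy to check: a group of k like-minded agents always receives total probability k/n on outcomes it likes.
    
    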

    The Fairness Challenge in Computer Networks

    In this paper, the concept of fairness in computer networks is investigated. We motivate the need to examine fairness issues by providing example future application scenarios where fairness support is needed in order to experience sufficient service quality. Fairness definitions from political science and their application to computer networks are described, and a state-of-the-art overview of research activities in fairness is given, from issues such as queue management and TCP-friendliness to fairness in layered multi-rate multicast scenarios. With this paper we contribute to the ongoing research activities by defining the fairness challenge, with the purpose of helping direct future investigations to white spots on the map of research in fairness.

    Data Mining and Machine Learning in Astronomy

    We review the current state of data mining and machine learning in astronomy. 'Data mining' can have a somewhat mixed connotation from the point of view of a researcher in this field. If used correctly, it can be a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data and promising great scientific advances. However, if misused, it can be little more than the black-box application of complex computing algorithms that may give little physical insight and provide questionable results. Here, we give an overview of the entire data mining process, from data collection through to the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines; applications from a broad range of astronomy, emphasizing those where data mining techniques directly resulted in improved science; and important current and future directions, including probability density functions, parallel algorithms, petascale computing, and the time domain. We conclude that, so long as one carefully selects an appropriate algorithm and is guided by the astronomical problem at hand, data mining can be very much a powerful tool, and not a questionable black box. Comment: Published in IJMPD. 61 pages, uses ws-ijmpd.cls. Several extra figures, some minor additions to the text

    From the discrete to the continuous - towards a cylindrically consistent dynamics

    Discrete models usually represent approximations to continuum physics. Cylindrical consistency provides a framework in which discretizations mirror the continuum limit exactly. As it is a standard tool for the kinematics of loop quantum gravity, we propose a coarse graining procedure that aims at constructing a cylindrically consistent dynamics in the form of transition amplitudes and Hamilton's principal functions. The coarse graining procedure, which is motivated by tensor network renormalization methods, provides a systematic approximation scheme towards this end. A crucial role in this coarse graining scheme is played by embedding maps that allow the interpretation of discrete boundary data as continuum configurations. These embedding maps should be selected according to the dynamics of the system, as the choice of embedding maps determines a truncation of the renormalization flow. Comment: 22 pages

    Quality evaluation of olive oil by statistical analysis of multicomponent stable isotope dilution assay data of aroma active compounds

    An instrumental method for the evaluation of olive oil quality was developed. Twenty-one relevant aroma-active compounds were quantified in 95 olive oil samples of different quality by headspace solid-phase microextraction (HS-SPME) and dynamic headspace coupled to GC-MS. On the basis of these stable isotope dilution assay results, statistical evaluation by partial least-squares discriminant analysis (PLS-DA) was performed. Important variables were the odor activity values of ethyl isobutanoate, ethyl 2-methylbutanoate, 3-methylbutanol, butyric acid, (E,E)-2,4-decadienal, hexanoic acid, guaiacol, and 2-phenylethanol, and the sum of the odor activity values of (Z)-3-hexenal, (E)-2-hexenal, (Z)-3-hexenyl acetate, and (Z)-3-hexenol. Classification performed with these variables predicted the quality of 88% of the olive oils correctly. Additionally, the aroma compounds that are characteristic of some off-flavors were dissolved in refined plant oil. Sensory evaluation of these models demonstrated that the off-flavors rancid, fusty, and vinegary could be successfully simulated by a limited number of odorants.
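    The PLS-DA step can be illustrated with a minimal, numpy-only sketch: a NIPALS-style PLS1 regression on a 0/1 quality label, thresholded at 0.5 for classification. The data, the 8 variables, and all names are synthetic placeholders, not the paper's 21 quantified odorants:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def pls_da_fit(X, y, n_comp=2):
        """NIPALS-style PLS1: extract latent scores, regress the 0/1 label on them."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xr, yc = X - x_mean, y - y_mean
        W, P, q = [], [], []
        for _ in range(n_comp):
            w = Xr.T @ yc
            w /= np.linalg.norm(w)           # weight vector
            t = Xr @ w                       # latent score
            p = Xr.T @ t / (t @ t)           # X loading
            q.append((yc @ t) / (t @ t))     # y loading
            Xr = Xr - np.outer(t, p)         # deflate X
            W.append(w)
            P.append(p)
        W, P = np.array(W).T, np.array(P).T
        B = W @ np.linalg.inv(P.T @ W) @ np.array(q)  # regression coefficients
        return x_mean, y_mean, B

    def pls_da_predict(model, X):
        x_mean, y_mean, B = model
        return ((X - x_mean) @ B + y_mean > 0.5).astype(int)

    # Synthetic data: 8 "odor activity" variables, class 1 shifted on three of them.
    X0 = rng.normal(size=(60, 8))
    X1 = rng.normal(size=(60, 8)) + np.array([1.5, 1.5, 1.5, 0, 0, 0, 0, 0])
    X = np.vstack([X0, X1])
    y = np.array([0] * 60 + [1] * 60, dtype=float)

    model = pls_da_fit(X, y)
    acc = (pls_da_predict(model, y) if False else (pls_da_predict(model, X) == y)).mean()
    print(f"training accuracy: {acc:.2f}")
    ```

    In practice a library implementation (e.g. `sklearn.cross_decomposition.PLSRegression`) with cross-validation would be used; this sketch only shows the mechanics of scoring, deflation, and thresholding.
    
    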