    AlGaAs lasers with micro-cleaved mirrors suitable for monolithic integration

    A technique has been developed for cleaving the mirrors of AlGaAs lasers without cleaving the substrate. Micro-cleaving involves cleaving a suspended heterostructure cantilever by ultrasonic vibrations. Lasers with micro-cleaved mirrors have threshold currents and quantum efficiencies identical to those of similar devices with conventionally cleaved mirrors.

    Embedded error estimation and adaptive step-size control for optimal explicit strong stability preserving Runge--Kutta methods

    We construct a family of embedded pairs for optimal strong stability preserving explicit Runge-Kutta methods of order 2 ≤ p ≤ 4, to be used to obtain numerical solutions of spatially discretized hyperbolic PDEs. In this construction, the goals include non-defective methods, a large region of absolute stability, and optimal error measurement as defined in [5,19]. The new family of embedded pairs offers strong stability preserving (SSP) methods the ability to adapt by varying the step size based on the local error estimate while maintaining their inherent nonlinear stability properties. Through several numerical experiments, we assess the overall effectiveness in terms of precision versus work, while also taking accuracy and stability into consideration. Comment: 22 pages, 49 figures
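    As an illustration of the embedded-pair idea described above, here is a minimal Python sketch: a two-stage, second-order SSP Runge-Kutta step whose first (forward Euler) stage doubles as an embedded first-order solution, driving a standard accept/reject step-size controller. This is a generic sketch, not the optimal pairs constructed in the paper; the method, tolerance, and controller constants are illustrative assumptions.

```python
import numpy as np

def ssprk22_step(f, t, u, h):
    """One step of the two-stage, second-order SSP Runge-Kutta method
    (Shu-Osher form). The forward Euler stage k1 is reused for free as
    an embedded first-order solution to estimate the local error."""
    k1 = u + h * f(t, u)                           # forward Euler stage
    u2 = 0.5 * u + 0.5 * (k1 + h * f(t + h, k1))   # SSPRK(2,2) update
    err = np.linalg.norm(u2 - k1)                  # embedded error estimate
    return u2, err

def integrate(f, t0, u0, t_end, tol=1e-6, h=1e-2):
    """Standard accept/reject controller: reject and retry when the
    estimated error exceeds tol, and rescale h with exponent 1/(p+1),
    p = 2, clipped to conservative growth/shrink factors."""
    t, u = t0, np.asarray(u0, dtype=float)
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        u_new, err = ssprk22_step(f, t, u, h)
        if err <= tol:                             # accept the step
            t, u = t + h, u_new
        h *= min(2.0, max(0.1, 0.9 * (tol / max(err, 1e-16)) ** (1.0 / 3.0)))
    return u

# simple check on u' = -u, whose exact solution is exp(-t)
u_end = integrate(lambda t, u: -u, 0.0, [1.0], 1.0)
```

    Because the higher-order solution is propagated, the estimated error controls the (larger) error of the embedded first-order solution, so the accepted solution is typically well inside the tolerance.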

    The effect of network structure on phase transitions in queuing networks

    Recently, De Martino et al. have presented a general framework for the study of transportation phenomena on complex networks. One of their most significant achievements was a deeper understanding of the phase transition from the uncongested to the congested phase at a critical traffic load. In this paper, we also study phase transitions in transportation networks, using a discrete-time random walk model. Our aim is to establish a direct connection between the structure of the graph and the value of the critical traffic load. Applying spectral graph theory, we show that the original result of De Martino et al. -- that the critical loading depends only on the degree sequence of the graph, suggesting that different graphs with the same degree sequence have the same critical loading if all other circumstances are fixed -- is valid only if the graph is dense enough. For sparse graphs, higher-order corrections related to the local structure of the network appear. Comment: 12 pages, 7 figures
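    The uncongested/congested transition the abstract refers to can be seen in a toy model. The sketch below is an illustrative discrete-time random-walk transport simulation on a ring (an assumption for this example, not the paper's exact model): packets are injected at rate p per node, each non-empty queue serves one packet per step, and a served packet is delivered with probability q per hop, so queues stay bounded roughly while the load p/q per node is below 1 and grow linearly above it.

```python
import random

def simulate(n, p, q=0.2, steps=2000, seed=1):
    """Toy random-walk transport on a ring of n nodes. Each step every
    node injects a packet with probability p; every node with a
    non-empty queue serves one packet, which is delivered (removed)
    with probability q or forwarded to a random neighbour otherwise.
    Returns the total queue length after `steps` steps."""
    rng = random.Random(seed)
    queues = [0] * n
    for _ in range(steps):
        for i in range(n):              # injection
            if rng.random() < p:
                queues[i] += 1
        moves = []
        for i in range(n):              # one service event per busy node
            if queues[i] > 0:
                queues[i] -= 1
                if rng.random() >= q:   # not delivered: hop to a neighbour
                    moves.append(rng.choice([(i - 1) % n, (i + 1) % n]))
        for j in moves:
            queues[j] += 1
    return sum(queues)

# below the critical load queues stay bounded; above it they grow linearly
light = simulate(20, 0.05)   # subcritical: total queue stays small
heavy = simulate(20, 0.5)    # supercritical: queues grow with time
```

    In this toy setting the critical injection rate is roughly p ≈ q; the paper's point is that on real graphs the critical load also picks up structure-dependent corrections beyond such mean-field estimates.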

    On the Entropy of a Family of Random Substitutions

    The generalised random Fibonacci chain is a stochastic extension of the classical Fibonacci substitution, defined as the rule mapping 0 ↦ 1 and 1 ↦ 1^i 0 1^(m-i) with probability p_i, where p_i ≥ 0 with ∑_{i=0}^{m} p_i = 1, and where the random rule is applied each time it acts on a 1. We show that the topological entropy of this object is given by the growth rate of the set of inflated generalised random Fibonacci words. Comment: a more appropriate title and minor misprints corrected, compared to the previous version
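    The random substitution rule quoted in the abstract is easy to sample directly. Below is a minimal Python sketch that draws the exponent i afresh each time the rule acts on a 1; the function name and parameter layout are illustrative.

```python
import random

def random_fib_word(m, probs, steps, seed=0):
    """Iterate the generalised random Fibonacci substitution:
    0 -> 1, and 1 -> 1^i 0 1^(m-i) with probability probs[i],
    drawing i independently each time the rule acts on a 1."""
    rng = random.Random(seed)
    word = [1]
    for _ in range(steps):
        out = []
        for s in word:
            if s == 0:
                out.append(1)
            else:
                i = rng.choices(range(m + 1), weights=probs)[0]
                out.extend([1] * i + [0] + [1] * (m - i))
        word = out
    return word

# with m = 1 and probs = (0, 1) every 1 maps to 10, recovering the
# classical Fibonacci substitution: word lengths follow 1, 2, 3, 5, 8, 13
w = random_fib_word(1, (0.0, 1.0), 5)
```

    Counting how many distinct words arise at each inflation step (over all random choices) gives the exponentially growing set whose growth rate the paper identifies with the topological entropy.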

    Impacts of good practice policies on regional and global greenhouse gas emissions

    The report looks at the impact of "good practice" emission reduction policies in nine different areas globally and across six countries: China, Brazil, India, the US, Russia and Japan. These include renewable energy, a variety of energy efficiency standards (buildings, car fuel efficiency, appliances and lighting, industry), hydrofluorocarbons (HFCs), emissions from fossil fuel production, electric cars and forestry. The authors looked at the most ambitious "good practice" policies around the world that are being implemented now, and calculated the difference these would make if everybody were to apply them. If all governments followed those that currently adopt the best climate policies in just nine different areas, they could reduce emissions close to the levels needed to stay on track to hold global warming below 2 degrees C. The implementation of good practice policies is projected to stabilise greenhouse gas emissions at 49-50 GtCO2e by 2020 and to decrease them to 44-47 GtCO2e by 2030, close to the 2 degrees C emissions range (30-44 GtCO2e) by 2030. Direct replication of good practice policies is projected to halt emissions growth in most regions significantly before 2030. In contrast, current policies are expected to see emissions increase to around 54 GtCO2e by 2020 and 59-60 GtCO2e by 2030.

    Sensory evaluation and electronic tongue for sensing grafted and non-grafted watermelon taste attributes

    The objective of our study was to analyse the results of two measuring methods (sensory evaluation and electronic tongue) and to find differences in taste between grafted and non-grafted watermelon fruit. The trained sensory panel evaluated three differently treated watermelon fruits in each of two years. The studied fruit samples were produced in the same growing areas in both years, but with different growing technologies. The experiment used the non-grafted/self-rooted watermelon as control sample, while the other two treatments were grafted on two rootstock types: a Lagenaria and an interspecific squash hybrid rootstock. The electronic tongue measurement showed that it is the environment/growing technology that mainly determines the characteristics of the fruit quality, not grafting. The two measurement methods can complement each other in a detailed and practical way, as technology and growing area strongly influence the quality of watermelon fruit. The research also showed that it is possible to obtain similar watermelon fruit quality independently of the rootstock type used.

    Multiboost: a multi-purpose boosting package

    http://jmlr.csail.mit.edu/papers/v13/benbouzid12a.html

    The MultiBoost package provides a fast C++ implementation of multi-class/multi-label/multi-task boosting algorithms. It is based on AdaBoost.MH, but it also implements popular cascade classifiers and FilterBoost. The package contains common multi-class base learners (stumps, trees, products, Haar filters). Further base learners and strong learners following the boosting paradigm can be easily implemented in a flexible framework.
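    MultiBoost itself is a C++ package, so the following is not its API. It is a minimal Python sketch of the two-class AdaBoost scheme underlying the boosting paradigm the package implements (MultiBoost's AdaBoost.MH is the multi-class/multi-label generalisation), using one-dimensional threshold stumps as base learners.

```python
import numpy as np

def train_adaboost(X, y, rounds=20):
    """Minimal binary AdaBoost with threshold stumps. y must be in
    {-1, +1}. Each round: pick the stump with lowest weighted error,
    weight it by alpha, then up-weight the examples it got wrong."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # example weights
    ensemble = []                     # (alpha, feature, threshold, sign)
    for _ in range(rounds):
        best = None
        for f in range(X.shape[1]):
            for thr in np.unique(X[:, f]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, f] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, f, thr, sign)
        err, f, thr, sign = best
        err = min(max(err, 1e-12), 1 - 1e-12)      # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)      # stump weight
        pred = sign * np.where(X[:, f] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)             # reweight examples
        w /= w.sum()
        ensemble.append((alpha, f, thr, sign))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.where(X[:, f] > t, 1, -1) for a, f, t, s in ensemble)
    return np.sign(score)

# toy one-dimensional data, separable by x > 0
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1, -1, 1, 1])
model = train_adaboost(X, y, rounds=5)
```

    The "strong learner" is just the weighted vote of the stumps; swapping in trees, products, or Haar filters as base learners is exactly the kind of extension the package's flexible framework is designed for.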

    How network-based and set-based visualizations aid consistency checking in ontologies

    © 2017 ACM. Ontologies describe complex world knowledge in that they consist of hierarchical relations, such as is-a, which can be expressed by quantifiers or sets, and various binary relations, which can be expressed by links or networks. Should hierarchical relations be distinguished from other binary relations as essentially different when building cognitively accessible ontology systems? In this study, two kinds of ontology visualizations, a network-based visualization (SOVA) and a set-based visualization (concept diagrams), are empirically compared on a consistency-checking task. Participants were presented with one diagram and asked whether its meaning was contradictory. Our results showed that SOVA is more effective than concept diagrams, suggesting that representing both the hierarchical and the binary relations of ontologies as networks suits human cognition when checking the consistency of ontologies.

    Approximate Consensus in Highly Dynamic Networks: The Role of Averaging Algorithms

    In this paper, we investigate the approximate consensus problem in highly dynamic networks in which the topology may change continually and unpredictably. We prove that in both synchronous and partially synchronous systems, approximate consensus is solvable if and only if the communication graph in each round has a rooted spanning tree, i.e., there is a coordinator at each time. The striking point in this result is that the coordinator is not required to be unique and can change arbitrarily from round to round. Interestingly, the class of averaging algorithms, which are memoryless and require no process identifiers, entirely captures the solvability issue of approximate consensus, in that the problem is solvable if and only if it can be solved using any averaging algorithm. Concerning the time complexity of averaging algorithms, we show that approximate consensus can be achieved with precision ε in O(n^(n+1) log(1/ε)) synchronous rounds in a coordinated network model, and in O(Δ n^(nΔ+1) log(1/ε)) rounds when the maximum round delay for a message to be delivered is Δ. While in general an upper bound on the time complexity of averaging algorithms has to be exponential, we investigate various network models in which this exponential bound in the number of nodes reduces to a polynomial bound. We apply our results to networked systems with a fixed topology and classical benign fault models, and deduce both known and new results for approximate consensus in these systems. In particular, we show that for solving approximate consensus, a complete network can tolerate up to 2n-3 arbitrarily located link faults at every round, in contrast with the impossibility result established by Santoro and Widmayer (STACS '89) showing that exact consensus is not solvable with n-1 link faults per round originating from the same node.
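    A minimal sketch of an averaging algorithm of the kind the abstract analyses: memoryless, identifier-free nodes repeatedly move to the midpoint of the extreme values they hear, over a communication graph that may change every round. The specific midpoint rule and the example topology (a star whose centre rotates, so each round's graph has a rooted spanning tree with a different root) are illustrative assumptions, not the paper's exact construction.

```python
def averaging_round(values, graph):
    """One round of a simple averaging rule: each node moves to the
    midpoint of the minimum and maximum values it hears (its own value
    plus those of its in-neighbours in this round's graph)."""
    new = []
    for i in range(len(values)):
        heard = [values[j] for j in graph[i]] + [values[i]]
        new.append((min(heard) + max(heard)) / 2.0)
    return new

def approximate_consensus(values, graphs, eps):
    """Apply averaging rounds over a sequence of communication graphs
    until all values agree to within eps."""
    for g in graphs:
        if max(values) - min(values) <= eps:
            break
        values = averaging_round(values, g)
    return values

n = 4
def star(c):
    # directed star: every node hears node c, so the graph is rooted at c
    return {i: [c] for i in range(n)}

# the coordinator (star centre) changes every round, yet values converge
vals = approximate_consensus([0.0, 1.0, 2.0, 3.0],
                             (star(r % n) for r in range(100)), 1e-3)
```

    On this sequence the spread halves every round even though the root keeps moving, illustrating the paper's point that the coordinator need not be unique or stable; the values also never leave the convex hull of the inputs, which is what makes averaging algorithms nonlinearly well-behaved.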

    Hierarchical Aggregation for Information Visualization: Overview, Techniques, and Design Guidelines
