    Economic benefit of the National Broadband Network

    Executive summary: This paper is a partial summary of a study undertaken in the Centre for Energy-Efficient Telecommunications (CEET) at the University of Melbourne. The study focuses on the potential economic impact of Australia's NBN, which affects the economy by making online services more widely available. Taking a conservative approach, we have considered just six categories of online services (cloud computing, electronic commerce, online higher education, telehealth practice, teleworking, and entertainment) for which there are documented economic benefits. We have attributed to the NBN only the additional benefit derived from its deployment over and above what we estimate the broadband situation in Australia would have been without the NBN; that is, we have not assumed that broadband availability would have stagnated without the NBN. We do expect, however, that future services will require higher access speeds, generally in the range 10-25 Mb/s. With this assumption, and using a well-attested model of the Australian economy, we show that, in the long term, real GDP can be boosted by about 1.8% and real household consumption (a measure of national welfare) by about 2.0%. When we take into account the need to repay the cost of the NBN, GDP increases slightly but the benefit to real household consumption is reduced to 1.4%. Most of the benefit comes from telehealth and teleworking. Because the access speeds (downstream and upstream) required for the services are quite uncertain, we have examined the effect of access speed. If all the services except entertainment can be provided with no more than 2.5 Mb/s down and up (typical of implementations today), then the costs of the NBN outweigh the benefits: real GDP increases by less than 0.2% but real household consumption declines by 0.4%. That is, building an NBN just for entertainment is not economically viable. An analysis of the regional distribution of benefits shows that all regions benefit from the NBN, but the economic effects are greater in the major cities because of their larger economic activity.

    Using Bad Learners to find Good Configurations

    Finding the optimally performing configuration of a software system for a given setting is often challenging. Recent approaches address this challenge by learning performance models based on a sample set of configurations. However, building an accurate performance model can be very expensive (and is often infeasible in practice). The central insight of this paper is that exact performance values (e.g. the response time of a software system) are not required to rank configurations and to identify the optimal one. As shown by our experiments, models that are cheap to learn but inaccurate (with respect to the difference between actual and predicted performance) can still be used to rank configurations and hence to find the optimal configuration. This novel rank-based approach allows us to significantly reduce the cost (in terms of the number of measurements of sample configurations) as well as the time required to build models. We evaluate our approach on 21 scenarios based on 9 software systems and demonstrate that it is beneficial in 16 scenarios; for the remaining 5 scenarios, an accurate model can be built from very few samples anyway, without the need for a rank-based approach.
    Comment: 11 pages, 11 figures
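
    A minimal sketch of the rank-based idea: a regressor trained on very few measured configurations can be wildly inaccurate in absolute terms yet still order configurations well enough to locate a near-optimal one. Everything below (the CART learner, the synthetic performance function, the sample budget of 30) is an illustrative assumption, not the paper's exact setup.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)

    # Hypothetical space: 500 configurations with 6 binary options each and a
    # synthetic performance function that the learner never sees directly.
    configs = rng.integers(0, 2, size=(500, 6))
    true_perf = configs @ np.array([5.0, 3.0, -2.0, 8.0, 0.5, -1.0]) \
                + rng.normal(0.0, 1.0, size=500)

    # Measure only a small sample of configurations (the expensive step).
    sample = rng.choice(len(configs), size=30, replace=False)
    model = DecisionTreeRegressor(max_depth=4)
    model.fit(configs[sample], true_perf[sample])

    # Rank all configurations by *predicted* performance; absolute accuracy is
    # irrelevant, only the induced ordering matters.
    predicted = model.predict(configs)
    best_by_rank = int(np.argmin(predicted))      # lower = better (e.g. latency)
    print("rank-based pick:", true_perf[best_by_rank], "true optimum:", true_perf.min())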

    Efficient photovoltaic and electroluminescent perovskite devices

    Planar diode structures employing hybrid organic-inorganic methylammonium lead iodide perovskites lead to multifunctional devices exhibiting both high photovoltaic efficiency and good electroluminescence. The electroluminescence improves strongly at the higher current densities applied using a pulsed driving method.

    Strongly Refuting Random CSPs Below the Spectral Threshold

    Random constraint satisfaction problems (CSPs) are known to exhibit threshold phenomena: given a uniformly random instance of a CSP with $n$ variables and $m$ clauses, there is a value of $m = \Omega(n)$ beyond which the CSP will be unsatisfiable with high probability. Strong refutation is the problem of certifying that no variable assignment satisfies more than a constant fraction of clauses; this is the natural algorithmic problem in the unsatisfiable regime (when $m/n = \omega(1)$). Intuitively, strong refutation should become easier as the clause density $m/n$ grows, because the contradictions introduced by the random clauses become more locally apparent. For CSPs such as $k$-SAT and $k$-XOR, there is a long-standing gap between the clause density at which efficient strong refutation algorithms are known, $m/n \ge \widetilde{O}(n^{k/2-1})$, and the clause density at which instances become unsatisfiable with high probability, $m/n = \omega(1)$. In this paper, we give spectral and sum-of-squares algorithms for strongly refuting random $k$-XOR instances with clause density $m/n \ge \widetilde{O}(n^{(k/2-1)(1-\delta)})$ in time $\exp(\widetilde{O}(n^{\delta}))$ or in $\widetilde{O}(n^{\delta})$ rounds of the sum-of-squares hierarchy, for any $\delta \in [0,1)$ and any integer $k \ge 3$. Our algorithms provide a smooth transition between the clause density at which polynomial-time algorithms are known at $\delta = 0$, and brute-force refutation at the satisfiability threshold when $\delta = 1$. We also leverage our $k$-XOR results to obtain strong refutation algorithms for SAT (or any other Boolean CSP) at similar clause densities. Our algorithms match the known sum-of-squares lower bounds due to Grigoriev and Schoenebeck, up to logarithmic factors. Additionally, we extend our techniques to give new results for certifying upper bounds on the injective tensor norm of random tensors.
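
    To make the spectral side concrete, here is a toy refutation certificate for the simplest case, random 2-XOR in a dense regime (the paper's algorithms handle general $k$-XOR at far sparser densities). A clause $x_i \oplus x_j = b$ is satisfied by $x \in \{\pm 1\}^n$ iff $x_i x_j = c$ with $c = 1 - 2b$, so the number of satisfied clauses is $m/2 + \tfrac{1}{4} x^{\top} A x$, and the top eigenvalue of the clause matrix $A$ certifies an upper bound on the satisfiable fraction. The instance sizes are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    n, m = 200, 20000                      # dense regime, so the bound is nontrivial

    # Symmetric clause matrix: entry (i, j) accumulates the sign c of each
    # clause on the pair {x_i, x_j}.
    A = np.zeros((n, n))
    for _ in range(m):
        i, j = rng.choice(n, size=2, replace=False)
        c = rng.choice([-1.0, 1.0])        # random right-hand side
        A[i, j] += c
        A[j, i] += c

    # For any x in {-1,+1}^n:  satisfied = m/2 + x^T A x / 4 <= m/2 + n*lam_max/4.
    lam_max = np.linalg.eigvalsh(A)[-1]    # eigvalsh returns ascending eigenvalues
    bound = 0.5 + n * lam_max / (4 * m)
    print(f"no assignment satisfies more than {bound:.3f} of the clauses")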

    Efficient DMA transfers management on embedded Linux PSoC for Deep-Learning gestures recognition: Using Dynamic Vision Sensor and NullHop one-layer CNN accelerator to play RoShamBo

    This demonstration shows a Dynamic Vision Sensor able to capture visual motion at a speed equivalent to a high-speed camera (20k fps). The collected visual information is presented as a normalized histogram to a CNN accelerator hardware, called NullHop, that is able to process a pre-trained CNN to play RoShamBo against a human. The CNN designed for this purpose consists of 5 convolutional layers and a fully connected layer. The latency for processing one histogram is 8 ms. NullHop is deployed on the FPGA fabric of a PSoC from Xilinx, the Zynq 7100, which is based on a dual-core ARM computer and a Kintex-7 with 444K logic cells, integrated in the same chip. The ARM computer runs Linux, and a specific C++ controller runs the whole demo. This controller runs in user space in order to extract the maximum throughput thanks to an efficient use of the AXIStream, based on DMA transfers. The short delay needed to process one visual histogram allows us to average several consecutive classification outputs, providing the best estimation of the symbol that the user presents to the visual sensor. This output is then mapped to present the winning symbol within the 60 ms latency that the brain considers acceptable before suspecting a trick.
    Ministerio de Economía y Competitividad TEC2016-77785-
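
    A minimal sketch of the output-averaging step described above: with one classification every ~8 ms, several CNN outputs fit inside the ~60 ms budget, so averaging the most recent softmax vectors yields a steadier winning symbol. The window length and label set below are illustrative assumptions, not taken from the demo code.

    from collections import deque
    import numpy as np

    SYMBOLS = ["rock", "paper", "scissors", "background"]
    window = deque(maxlen=7)                  # 7 x 8 ms ~= 56 ms, under the budget

    def update(softmax_out):
        """Push one per-histogram CNN output; return the smoothed winner."""
        window.append(softmax_out)
        mean_scores = np.mean(np.stack(list(window)), axis=0)
        return SYMBOLS[int(np.argmax(mean_scores))]

    # Example: noisy single-frame outputs, stable averaged decision.
    rng = np.random.default_rng(3)
    for _ in range(7):
        frame = np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(0.0, 0.05, 4)
        winner = update(frame)
    print("winner:", winner)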

    Transparent and efficient shared-state management for optimistic simulations on multi-core machines

    Traditionally, the Logical Processes (LPs) forming a simulation model store their execution information in disjoint simulation states, forcing data to be communicated between LPs via event exchange. In this work we propose the design and implementation of an extension to the traditional Time Warp (optimistic) synchronization protocol for parallel/distributed simulation, targeted at shared-memory/multi-core machines, which allows LPs to share parts of their simulation states through global variables. In order to preserve optimism's intrinsic properties, global variables are transparently mapped to multi-version variables, so as to avoid any form of safety-predicate verification upon updates. Execution consistency is ensured by introducing a new rollback scheme, triggered upon detection of an incorrect read of a global variable. At the same time, execution efficiency is guaranteed by exploiting non-blocking algorithms to manage the multi-version variables' lists. Furthermore, our proposal is integrated with the simulation model's code through software instrumentation, so that the application-level programmer does not need any specific API to mark updates to global variables or to inform the simulation kernel of them. Thus we support full transparency. An assessment of our proposal, comparing it with a traditional message-passing implementation of multi-version variables, is provided as well. © 2012 IEEE
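
    The multi-version mapping can be sketched in a few lines, under heavily simplified semantics: writes append timestamped versions, reads return the latest version not after the reading event's timestamp, and a write arriving "in the past" reports which LPs read state it invalidates, i.e. which must be rolled back. The paper relies on non-blocking lists and software instrumentation; this toy uses a plain sorted list and explicit calls purely for clarity.

    import bisect

    class MultiVersionVar:
        def __init__(self, initial):
            self.versions = [(0.0, initial)]   # (timestamp, value), kept sorted
            self.reads = []                    # (read_ts, lp_id) audit trail

        def read(self, ts, lp_id):
            """Return the value of the latest version with version_ts <= ts."""
            self.reads.append((ts, lp_id))
            idx = bisect.bisect_right(self.versions, (ts, float("inf"))) - 1
            return self.versions[idx][1]

        def write(self, ts, value):
            """Install a version at ts; report LPs whose reads it invalidates."""
            must_rollback = {lp for (read_ts, lp) in self.reads if read_ts >= ts}
            bisect.insort(self.versions, (ts, value))
            return must_rollback

    var = MultiVersionVar(0)
    var.read(5.0, lp_id=1)                     # LP 1 reads at virtual time 5.0
    print(var.write(3.0, 42))                  # straggler write at 3.0 -> {1}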

    Efficient realization of a threshold voter for self-purging redundancy

    The self-purging technique is not commonly used, mainly due to the lack of practical implementations of its key component, the threshold voter. A very efficient implementation of this voter is presented which uses a decomposition technique to substantially reduce circuit complexity and delay compared to alternative implementations.
    Comisión Interministerial de Ciencia y Tecnología TIC97-064
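
    The paper's specific decomposition is not reproduced here, but the generic recursive decomposition of a k-of-n threshold function gives the flavor: split off one input and reuse two smaller voters, which maps onto a shallow gate network rather than a full adder/comparator tree. A threshold of k = 2 is the usual choice for the self-purging voter, since purged modules are forced to output 0. A minimal sketch:

    def threshold(inputs, k):
        """True iff at least k of the Boolean inputs are 1.

        Decomposition: T(x1..xn, k) = T(x1..x_{n-1}, k)
                                      OR (xn AND T(x1..x_{n-1}, k-1)).
        """
        if k <= 0:
            return True                  # zero ones always suffice
        if len(inputs) < k:
            return False                 # too few inputs remain
        *rest, last = inputs
        return bool(threshold(rest, k) or (last and threshold(rest, k - 1)))

    # 2-of-5 voter: two healthy modules asserting 1 drive the output high.
    print(threshold([0, 1, 0, 1, 0], 2))   # True
    print(threshold([0, 0, 0, 1, 0], 2))   # False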

    Measured unsteady transonic aerodynamic characteristics of an elastic supercritical wing with an oscillating control surface

    Transonic steady and unsteady aerodynamic data were measured on a large elastic wing in the NASA Langley Transonic Dynamics Tunnel. The wing had a supercritical airfoil shape and a leading-edge sweepback of 28.8 deg. The wing was heavily instrumented to measure both static and dynamic pressures and deflections. A hydraulically driven outboard control surface was oscillated to generate unsteady airloads on the wing. Representative results from the wind tunnel tests are presented and discussed, and the unexpected occurrence of an unusual dynamic wing instability, which was sensitive to angle of attack, is reported.

    Objective and efficient terahertz signal denoising by transfer function reconstruction

    As an essential processing step in many disciplines, signal denoising efficiently improves data quality at no extra cost. However, it remains relatively under-utilized in terahertz spectroscopy. The main reported technique uses wavelet denoising in the time domain, which has a fuzzy physical meaning and limited performance in the low-frequency and water-vapor regions. Here, we work from a new perspective by reconstructing the transfer function to remove noise-induced oscillations. The method is fully objective, with no need to define a threshold. Both reflection and transmission imaging were conducted. The experimental results show that low- and high-frequency noise and the water-vapor influence were efficiently removed. The spectral accuracy was also improved, and the image contrast was significantly enhanced. The signal-to-noise ratio of the leaf image was increased by up to 10 dB, with the 6 dB bandwidth extended by over 0.5 THz.
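
    As a toy illustration of the transfer-function view (not the paper's reconstruction algorithm), the sketch below divides a noisy sample spectrum by a reference spectrum, smooths the resulting transfer function to suppress noise-induced oscillations, and maps it back to the time domain. The pulse shapes, noise level, regularization, and the Savitzky-Golay smoother are all synthetic assumptions standing in for the objective reconstruction described above.

    import numpy as np
    from scipy.signal import savgol_filter

    rng = np.random.default_rng(2)
    t = np.linspace(0, 50e-12, 2048)                       # 50 ps time axis

    reference = np.exp(-((t - 10e-12) / 0.5e-12) ** 2)     # reference THz pulse
    sample = 0.6 * np.exp(-((t - 12e-12) / 0.6e-12) ** 2)  # delayed, attenuated echo
    sample = sample + rng.normal(0.0, 0.01, t.size)        # measurement noise

    # Transfer function H(f) = sample spectrum / reference spectrum,
    # lightly regularized where the reference spectrum vanishes.
    R, S = np.fft.rfft(reference), np.fft.rfft(sample)
    H = S / (R + 1e-9 * np.abs(R).max())

    # Noise appears as rapid oscillations on H(f); reconstruct a smooth H
    # (real and imaginary parts smoothed separately).
    H_smooth = savgol_filter(H.real, 51, 3) + 1j * savgol_filter(H.imag, 51, 3)

    # Denoised signal: the smoothed transfer function applied to the reference.
    denoised = np.fft.irfft(H_smooth * R, n=t.size)

    tail = t > 30e-12                                      # pulse-free region
    print("noise RMS before:", sample[tail].std(), "after:", denoised[tail].std())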