
    The weather affects air conditioner purchases to fill the energy efficiency gap

    Energy efficiency improvement is often hindered by the energy efficiency gap. This paper examines the effect of short-run temperature fluctuations on Energy Star air conditioner purchases in the United States from 2006 to 2019 using transaction-level data. Results show that the probability of purchasing an Energy Star air conditioner increases as the weekly temperature before the transaction deviates from 20–22 °C. Larger responses are associated with fewer cooling degree days in the preceding years; higher electricity prices, income, educational levels, age, and home-ownership rates; more widespread use of electricity; and stronger concern about climate change. A 1 °C increase or decrease from 21 °C would reduce total energy expenditure nationwide by 35.46 and 17.73 million dollars (0.13% and 0.06% of the annual total energy expenditure on air conditioning), respectively. Our findings have important policy implications for demand-side interventions that incorporate the potential impact of the ambient physical environment.
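    Cooling degree days (CDD), used above as a measure of prior cooling demand, have a standard definition: the cumulative excess of the daily mean temperature over a base temperature. A minimal sketch, assuming the common 18 °C base (the paper's exact base may differ):

```python
# Cooling degree days: sum of daily-mean-temperature excess over a base.
# The 18 degC base is a common convention and an assumption here; the
# paper may use a different base temperature.

def cooling_degree_days(daily_mean_temps_c, base_c=18.0):
    """Return sum of max(0, T - base) over daily mean temperatures."""
    return sum(max(0.0, t - base_c) for t in daily_mean_temps_c)

# Example: a warm week accumulates 63.5 cooling degree days.
week = [25.0, 27.5, 30.0, 28.0, 26.0, 24.0, 29.0]
print(cooling_degree_days(week))  # 63.5
```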

    Earning Extra Performance from Restrictive Feedbacks

    Many machine learning applications encounter a situation where model providers are required to further refine a previously trained model to satisfy the specific needs of local users. This problem reduces to the standard model tuning paradigm if the target data can be permissibly fed to the model. However, it is rather difficult in a wide range of practical cases where the target data is not shared with the model provider, although some evaluations of the model are commonly accessible. In this paper, we formally set up a challenge named Earning eXtra PerformancE from restriCTive feEDbacks (EXPECTED) to describe this form of model tuning problem. Concretely, EXPECTED allows a model provider to access the operational performance of a candidate model multiple times via feedback from a local user (or a group of users). The goal of the model provider is to eventually deliver a satisfactory model to the local user(s) by utilizing this feedback. Unlike existing model tuning methods, where the target data is always available for calculating model gradients, the model provider in EXPECTED sees only feedback that can be as simple as scalars, such as inference accuracy or usage rate. To enable tuning in this restrictive circumstance, we propose to characterize the geometry of the model performance with regard to the model parameters by exploring the parameters' distribution. In particular, for deep models whose parameters are distributed across multiple layers, we further tailor a more query-efficient algorithm that conducts layerwise tuning, paying more attention to the layers that pay off better. Our theoretical analyses justify the proposed algorithms in terms of both efficacy and efficiency. Extensive experiments on different applications demonstrate that our work forges a sound solution to the EXPECTED problem.
    Comment: Accepted by IEEE TPAMI in April 2023
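    The abstract describes tuning a model from scalar feedback without access to gradients. As a rough illustration of that setting (not the paper's actual algorithm), here is a minimal evolution-strategies-style sketch that estimates an ascent direction from scalar feedback alone; `query_feedback` and all hyperparameters are hypothetical stand-ins:

```python
import numpy as np

def tune_from_feedback(theta, query_feedback, iters=100, pop=20, sigma=0.05, lr=0.1):
    """Sketch: a generic evolution-strategies loop, not the paper's method.

    theta          -- flat parameter vector of the candidate model
    query_feedback -- black-box callable returning a scalar score (e.g.
                      inference accuracy) from the local user; this is the
                      only signal available, no gradients.
    """
    for _ in range(iters):
        noise = np.random.randn(pop, theta.size)           # random perturbations
        scores = np.array([query_feedback(theta + sigma * n) for n in noise])
        scores = (scores - scores.mean()) / (scores.std() + 1e-8)  # normalize
        # Estimate an ascent direction from scalar feedback alone.
        grad_est = (scores[:, None] * noise).mean(axis=0) / sigma
        theta = theta + lr * grad_est                      # move toward better feedback
    return theta
```

    In this sketch's terms, a layerwise variant like the one the paper describes would perturb only one layer's slice of `theta` per query round and allocate more queries to the layers whose feedback improves fastest.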

    Loss of profit in the hotel industry of the United States due to climate change

    Tourism has been identified as a key economic sector vulnerable to climate change, yet direct empirical evidence on the tourism industry's economic gains and losses due to climate change is still lacking. Here we find that temperature significantly affects the profits of the hotel industry, with both spatial and seasonal heterogeneity. Using a rich dataset of the monthly financial records of more than 1700 hotels in 50 US states during 2016–2018 (approximately 3.2% of hotels nationally), we show that a deviation of the monthly average temperature from 18–20 °C leads to a decrease in the profit rate. The effect is driven by fewer customers, less revenue, and a higher cost per occupied room, partly due to increased usage of electricity and water. The effect can persist over time and is weaker for higher chain-scale hotels. Under future GHG emission scenarios, climate change will lead to a loss of profit in most climate zones, particularly the southern regions, with higher GHG emissions producing a more severe effect. This study contributes to the literature on how climate change affects human activities and helps refine the damage function of climate change on tourism in existing climate models.
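    The estimation implied here is a panel regression of hotel profit rate on deviations from the 18–20 °C comfort band. A minimal sketch of one plausible specification, assuming hypothetical column names (`profit_rate`, `temp_c`, `hotel_id`, `month`); the paper's actual model may differ:

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_temperature_response(df: pd.DataFrame):
    """Sketch: illustrative two-way fixed-effects regression, assumed columns.

    Deviations below and above the 18-20 degC band enter as separate
    regressors so the response can be asymmetric.
    """
    df = df.copy()
    df["dev_cold"] = (18.0 - df["temp_c"]).clip(lower=0.0)  # degrees below band
    df["dev_hot"] = (df["temp_c"] - 20.0).clip(lower=0.0)   # degrees above band
    # Hotel and month fixed effects absorb location and seasonality.
    model = smf.ols("profit_rate ~ dev_cold + dev_hot + C(hotel_id) + C(month)",
                    data=df)
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["hotel_id"]})
```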

    ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks

    Spiking neural networks (SNNs) have demonstrated remarkable advantages in power consumption and event-driven processing during inference. To take full advantage of their low power consumption and further improve their efficiency, pruning methods have been explored to find sparse SNNs without redundant connections after training. However, parameter redundancy still hinders the efficiency of SNNs during training. In the human brain, the rewiring process of neural networks is highly dynamic, while synaptic connections remain relatively sparse during brain development. Inspired by this, we propose an efficient evolutionary structure learning (ESL) framework for SNNs, named ESL-SNNs, to implement sparse SNN training from scratch. The pruning and regeneration of synaptic connections in SNNs evolve dynamically during learning, while keeping structural sparsity at a fixed level. As a result, ESL-SNNs can search for optimal sparse connectivity by exploring all possible parameters across time. Our experiments show that the proposed ESL-SNNs framework learns SNNs with sparse structures effectively while incurring only a limited accuracy loss. ESL-SNNs achieve merely 0.28% accuracy loss with 10% connection density on the DVS-Cifar10 dataset. Our work presents a brand-new approach for the sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms, closing the expressibility gap between sparse and dense training. Hence, it has great potential for lightweight SNN training and inference with low power consumption and a small memory footprint.
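    The prune-and-regrow cycle described above follows the general pattern of dynamic sparse training. A minimal sketch of one rewiring step on a weight matrix with a binary mask, in the spirit of SET-style methods rather than the exact ESL-SNNs algorithm:

```python
import numpy as np

def rewire_step(weights, mask, rewire_frac=0.1):
    """Sketch: generic prune-and-regrow step; ESL-SNNs' criteria may differ.

    weights -- dense weight matrix; mask -- binary matrix of the same shape.
    Prunes the smallest-magnitude active connections and regrows the same
    number at random inactive positions, keeping overall sparsity constant.
    """
    active = np.flatnonzero(mask)
    n_rewire = max(1, int(rewire_frac * active.size))
    # Prune: deactivate the weakest active connections.
    flat_mag = np.abs(weights).ravel()
    prune_idx = active[np.argsort(flat_mag[active])[:n_rewire]]
    mask.ravel()[prune_idx] = 0
    # Regrow: activate an equal number of random inactive connections
    # (a real method might exclude the positions just pruned).
    inactive = np.flatnonzero(mask.ravel() == 0)
    grow_idx = np.random.choice(inactive, size=n_rewire, replace=False)
    mask.ravel()[grow_idx] = 1
    weights.ravel()[grow_idx] = 0.0  # new connections start at zero
    return weights, mask
```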