
    Energy and Economic Growth

    Physical theory shows that energy is necessary for economic production and therefore for growth, but mainstream theories of economic growth, except for specialized resource-economics models, pay no attention to the role of energy. This paper reviews the relevant biophysical theory, mainstream and resource-economics models of growth, the critiques of mainstream models, and the various mechanisms that can weaken the links between energy and growth. Finally, we review the empirical literature, which finds that energy used per unit of economic output has declined, but that this is to a large extent due to a shift from poorer-quality fuels such as coal to higher-quality fuels, especially electricity. Furthermore, time-series analysis shows that energy and GDP cointegrate and that energy use Granger-causes GDP when additional variables such as energy prices or other production inputs are included. As a result, prospects for further large reductions in energy intensity appear limited.
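    The cointegration and Granger-causality findings summarized above rest on standard time-series tests. As a rough illustration only, the sketch below runs both tests with statsmodels on placeholder series; the variable names, simulated data, and lag choice are assumptions for illustration, not the paper's data or specification.

        import numpy as np
        from statsmodels.tsa.stattools import coint, grangercausalitytests

        rng = np.random.default_rng(0)
        n = 200
        energy = np.cumsum(rng.normal(size=n))              # stand-in for log energy use (an I(1) series)
        gdp = 0.8 * energy + rng.normal(scale=0.5, size=n)  # stand-in for log GDP sharing its stochastic trend

        # Engle-Granger cointegration test: a small p-value points to a long-run relationship.
        t_stat, p_value, _ = coint(gdp, energy)
        print(f"cointegration p-value: {p_value:.3f}")

        # Granger causality: do lagged values of energy (column 1) help predict GDP (column 0)
        # beyond GDP's own lags?
        data = np.column_stack([gdp, energy])
        results = grangercausalitytests(data, maxlag=2)
        print("Granger F-test p-value at lag 2:", results[2][0]["ssr_ftest"][1])

    In the empirical literature the same tests are typically run with additional controls, such as energy prices or other production inputs, in line with the multivariate results noted above.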

    Large Language Models, Prompting, and Synthetic Data Generation for Continual Named Entity Recognition

    With the ever-growing amount of textual data, Named Entity Recognition (NER) is a vital task in Natural Language Processing (NLP), the field concerned with enabling computers to understand and manipulate human language. NER enables the extraction of information from unstructured text, and accurate information extraction is crucial for applications ranging from information retrieval to question answering. To ensure that NER models are robust to changes in data distributions and capable of recognizing new entity types, one may consider expanding the capabilities of an existing model. Continual learning is a machine-learning paradigm that studies learning new information incrementally without forgetting previously learned knowledge. A central concern in continual learning is catastrophic forgetting, where training a neural network on new information leads to a significant degradation in performance on previously learned information. Reannotating existing data for new entity types and then training a new model is costly and time-consuming, prompting the need for better strategies. Generating and using synthetic data to combat forgetting has been studied in continual learning for vision models and, to a limited extent, for NER models using long short-term memory (LSTM) generators or inverted models. One way to achieve this is to use generative large language models to create synthetic data. This work focuses on building the foundation for a generative-replay approach. We aim to determine the efficacy of using OpenAI's GPT-4 model to generate synthetic data to supplement the training of NER systems, and to answer the following questions: Can synthetic data be generated to mimic the format of authentic NER training data? Is the synthetic data similar to the authentic data? Does the addition of synthetic data improve model performance? Is solely using synthetic data enough to achieve performance on par with a baseline? How do different prompting strategies for generating synthetic data affect model performance? We conducted experiments using the 2018 TAC SRIE dataset and a DeBERTa-V3-based model with broadcast linear and softmax classification layers. We successfully generated synthetic data using GPT-4 and two different prompting strategies, and we found improved performance when supplementing authentic data with synthetic data, even in small amounts. This work contributes a novel finding on NER and synthetic data generation with generative large language models and lays the foundation for a novel generative-replay approach to continual NER.
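    As a rough sketch of the kind of prompting involved, the snippet below asks a chat model for synthetic NER lines through the OpenAI Python client; the prompt wording, entity labels, and output format are illustrative assumptions, not the prompting strategies or the TAC SRIE label set used in this work.

        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        system_msg = (
            "You generate synthetic training sentences for named entity recognition. "
            "Return one token per line in the format: token<TAB>BIO-label, with a blank "
            "line between sentences."
        )
        user_msg = (
            "Generate 5 sentences about clinical drug administration, labelled with "
            "B-DRUG, I-DRUG, B-DOSE, I-DOSE, and O tags."  # hypothetical label set
        )

        response = client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": system_msg},
                {"role": "user", "content": user_msg},
            ],
            temperature=0.7,
        )
        synthetic_example = response.choices[0].message.content
        print(synthetic_example)

    The generated lines would then be parsed, filtered for label validity, and mixed with the authentic training data, which is the supplementation step evaluated above.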

    Report on the first round of the Mock LISA Data Challenges

    The Mock LISA Data Challenges (MLDCs) have the dual purpose of fostering the development of LISA data-analysis tools and capabilities, and of demonstrating the technical readiness already achieved by the gravitational-wave community in distilling a rich science payoff from the LISA data output. The first round of MLDCs has just been completed: nine challenges, consisting of data sets containing simulated gravitational-wave signals produced either by galactic binaries or by massive black hole binaries embedded in simulated LISA instrumental noise, were released in June 2006, with a deadline for submission of results at the beginning of December 2006. Ten groups participated in this first round of challenges. Every challenge had at least one entry that characterized the signal with a correlation better than 95%, with phasing ambiguities accounted for. Here we describe the challenges, summarize the results, and provide a first critical assessment of the entries.
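    For reference, agreement of this kind is usually quantified with a noise-weighted correlation (overlap) between the submitted waveform $h$ and the reference signal $s$, maximized over the unconstrained phase; one common convention, not necessarily the exact MLDC figure of merit, is

        \[
        \langle a \mid b \rangle = 4\,\mathrm{Re}\int_0^\infty \frac{\tilde a(f)\,\tilde b^*(f)}{S_n(f)}\,\mathrm{d}f ,
        \qquad
        C = \max_{\phi_0}\; \frac{\langle h \mid s \rangle}{\sqrt{\langle h \mid h \rangle\,\langle s \mid s \rangle}} ,
        \]

    so that the quoted 95% threshold corresponds to $C > 0.95$.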

    Consumption inequality and income uncertainty

    This paper places the debate over using consumption or income in studies of inequality growth in a formal intertemporal setting. It highlights the importance of permanent and transitory income uncertainty in the evaluation of growth in consumption inequality. We derive conditions under which the growth of the variances and covariances of income and consumption can be used to separately identify the growth in the variances of permanent and transitory income shocks. Household data from Britain for the period 1968-1992 are used to show strong growth in transitory inequality toward the end of this period, while younger cohorts are shown to face significantly higher levels of permanent inequality.
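    A minimal version of the identification argument, assuming a random-walk permanent component and serially uncorrelated transitory shocks (the paper's conditions may be more general and also involve consumption moments): write log income as $y_{it} = p_{it} + v_{it}$ with $p_{it} = p_{i,t-1} + \varepsilon_{it}$, so that

        \[
        \Delta y_{it} = \varepsilon_{it} + \Delta v_{it},
        \qquad
        \operatorname{var}(\varepsilon_{it}) = \operatorname{cov}\!\bigl(\Delta y_{it},\ \Delta y_{i,t-1} + \Delta y_{it} + \Delta y_{i,t+1}\bigr),
        \qquad
        \operatorname{var}(v_{it}) = -\operatorname{cov}\!\bigl(\Delta y_{it},\ \Delta y_{i,t+1}\bigr),
        \]

    and growth in these moments over time then traces out the growth of permanent versus transitory inequality.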

    Gravitational Wave Chirp Search: Economization of PN Matched Filter Bank via Cardinal Interpolation

    The final inspiral phase in the evolution of a compact binary consisting of black holes and/or neutron stars is among the most probable events that a network of ground-based interferometric gravitational-wave detectors is likely to observe. Gravitational radiation emitted during this phase will have to be dug out of noise by matched-filtering (correlating) the detector output with a bank of several $10^5$ templates, making the computational resources required quite demanding, though not formidable. We propose an interpolation method for evaluating the correlation between template waveforms and the detector output and show that the method is effective in substantially reducing the number of templates required. Indeed, the number of templates needed could be a factor of $\sim 4$ smaller than required by the usual approach, when the minimal overlap between the template bank and an arbitrary signal (the so-called minimal match) is 0.97. The method is amenable to easy implementation, and the various detector projects might benefit from adopting it to reduce the computational costs of inspiraling neutron star and black hole binary searches. Comment: scheduled for publication in Phys. Rev. D 6
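    To illustrate the idea of cardinal interpolation in isolation, the toy sketch below reconstructs a smooth correlation curve from coarse samples with the Whittaker-Shannon (sinc) kernel; the Gaussian ambiguity function and spacings are made-up stand-ins, not the post-Newtonian template-bank correlation of the paper.

        import numpy as np

        def correlation(tau):
            """Stand-in for the match between the signal and a template offset by tau."""
            return np.exp(-0.5 * (tau / 2.0) ** 2)

        dx = 1.0                             # coarse template spacing (arbitrary units)
        taus = np.arange(-10, 11) * dx       # one correlation sample per coarse template
        samples = correlation(taus)

        def cardinal_interp(x):
            """Estimate the correlation at an arbitrary offset from the coarse samples."""
            return np.sum(samples * np.sinc((x - taus) / dx))

        for x in (0.25, 0.5, 1.75):
            print(f"tau={x:.2f}  exact={correlation(x):.5f}  interpolated={cardinal_interp(x):.5f}")

    Because the correlation can be recovered between samples, the bank itself can be laid out more coarsely, which is the source of the reduction in template count described above.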

    Wide parameter search for isolated pulsars using the Hough transform

    We use the Hough transform to analyze data from the second science run of the LIGO interferometers to look for gravitational waves from isolated pulsars. We search over the whole sky and over a large range of frequencies and spin-down parameters. Our search method is based on the Hough transform, which is a semi-coherent, computationally efficient, and robust pattern-recognition technique. We also present a validation of the search pipeline using hardware signal injections. Comment: Presented at GWDAW-9 in Annecy, France (Dec. 2004). 11 pages, 5 figures. To appear in Classical and Quantum Gravity
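    A toy sketch of the semi-coherent idea, purely for illustration (the segment lengths, band, noise-peak rate, and grids below are made up and bear no relation to the LIGO S2 pipeline): per-segment time-frequency peaks are accumulated into number counts over a grid of frequency and spin-down templates, and the template collecting the most peaks flags a candidate.

        import numpy as np

        rng = np.random.default_rng(0)
        n_seg, t_seg = 64, 1800.0           # number and length (s) of short segments
        band = (100.0, 101.0)               # analysed frequency band (Hz)
        df = 1.0 / t_seg                    # per-segment frequency resolution

        f0_true, fdot_true = 100.4, -2e-7   # injected "signal" parameters (illustrative)

        # Peak list: one signal peak per segment following the spin-down track,
        # plus random noise peaks.
        peaks = []
        for k in range(n_seg):
            t_mid = (k + 0.5) * t_seg
            peaks.append((t_mid, f0_true + fdot_true * t_mid))
            peaks += [(t_mid, rng.uniform(*band)) for _ in range(20)]

        # Hough map: accumulate number counts over the (frequency, spin-down) grid.
        f0_grid = np.arange(band[0], band[1], df)
        fdot_grid = np.linspace(-5e-7, 0.0, 11)
        counts = np.zeros((f0_grid.size, fdot_grid.size))
        for t_mid, f_peak in peaks:
            for j, fdot in enumerate(fdot_grid):
                f0_implied = f_peak - fdot * t_mid   # f0 that would place this peak on the track
                i = int(round((f0_implied - band[0]) / df))
                if 0 <= i < f0_grid.size:
                    counts[i, j] += 1

        i_max, j_max = np.unravel_index(np.argmax(counts), counts.shape)
        print(f"loudest template: f0 ~ {f0_grid[i_max]:.4f} Hz, fdot ~ {fdot_grid[j_max]:.1e} Hz/s")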

    Testing Alternative Theories of Gravity using LISA

    We investigate the possible bounds which could be placed on alternative theories of gravity using gravitational wave detection from inspiralling compact binaries with the proposed LISA space interferometer. Specifically, we estimate lower bounds on the coupling parameter \omega of scalar-tensor theories of the Brans-Dicke type and on the Compton wavelength of the graviton \lambda_g in hypothetical massive graviton theories. In these theories, modifications of the gravitational radiation damping formulae or of the propagation of the waves translate into a change in the phase evolution of the observed gravitational waveform. We obtain the bounds through the technique of matched filtering, employing the LISA Sensitivity Curve Generator (SCG), available online. For a neutron star inspiralling into a 10^3 M_sun black hole in the Virgo Cluster, in a two-year integration, we find a lower bound \omega > 3 * 10^5. For lower-mass black holes, the bound could be as large as 2 * 10^6. The bound is independent of LISA arm length, but is inversely proportional to the LISA position noise error. Lower bounds on the graviton Compton wavelength ranging from 10^15 km to 5 * 10^16 km can be obtained from one-year observations of massive binary black hole inspirals at cosmological distances (3 Gpc), for masses ranging from 10^4 to 10^7 M_sun. For the highest-mass systems (10^7 M_sun), the bound is proportional to (LISA arm length)^{1/2} and to (LISA acceleration noise)^{-1/2}. For the others, the bound is independent of these parameters because of the dominance of white-dwarf confusion noise in the relevant part of the frequency spectrum. These bounds improve and extend earlier work which used analytic formulae for the noise curves. Comment: 16 pages, 9 figures, submitted to Classical & Quantum Gravity
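    Bounds of this kind are commonly estimated with a Fisher-matrix analysis of the waveform phase. The sketch below shows the generic recipe on a toy frequency-domain model; the waveform, noise curve, extra-phase term, and parameter values are invented for illustration and are not the paper's LISA model.

        import numpy as np

        f = np.linspace(1e-4, 1e-2, 4000)        # frequency grid (Hz), illustrative
        df_step = f[1] - f[0]
        Sn = 1e-40 * (1.0 + (1e-3 / f) ** 4)     # toy noise power spectral density

        def htilde(params):
            amp, t_c, phi_c, alpha = params      # alpha: toy "extra phase" parameter to bound
            psi = 2 * np.pi * f * t_c + phi_c + alpha * (f / 1e-3) ** (-5.0 / 3.0)
            return amp * f ** (-7.0 / 6.0) * np.exp(1j * psi)

        def inner(a, b):
            """Noise-weighted inner product on the uniform frequency grid."""
            return 4.0 * np.real(np.sum(a * np.conj(b) / Sn)) * df_step

        theta0 = np.array([1e-21, 0.0, 0.0, 0.0])
        eps = np.array([1e-23, 1.0, 1e-3, 1e-3])  # finite-difference step per parameter

        # Numerical derivatives of the waveform with respect to each parameter.
        dh = []
        for i in range(theta0.size):
            dp = np.zeros_like(theta0)
            dp[i] = eps[i]
            dh.append((htilde(theta0 + dp) - htilde(theta0 - dp)) / (2 * eps[i]))

        fisher = np.array([[inner(a, b) for b in dh] for a in dh])
        cov = np.linalg.inv(fisher)
        print("1-sigma bound on the toy extra-phase parameter:", np.sqrt(cov[-1, -1]))

    The bound on the physical coupling then follows from how the extra phase term depends on it, which is the step where the Brans-Dicke or massive-graviton modifications enter.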

    Carbon Free Boston: Technical Summary

    Part of a series of reports that includes: Carbon Free Boston: Summary Report; Carbon Free Boston: Social Equity Report; Carbon Free Boston: Buildings Technical Report; Carbon Free Boston: Transportation Technical Report; Carbon Free Boston: Waste Technical Report; Carbon Free Boston: Energy Technical Report; Carbon Free Boston: Offsets Technical Report. Available at http://sites.bu.edu/cfb/. OVERVIEW: This technical summary is intended to augment the rest of the Carbon Free Boston technical reports, which seek to achieve the goal of deep mitigation. This document provides: a rationale for carbon neutrality; a high-level description of Carbon Free Boston’s analytical approach; a summary of cross-sector strategies; a high-level analysis of air quality impacts; and a brief analysis of off-road and street-light emissions. Published version

    Metrological characterization of the pulsed Rb clock with optical detection

    We report on the implementation and the metrological characterization of a vapor-cell Rb frequency standard working in pulsed regime. The three main parts that compose the clock (physics package, optics, and electronics) are described in detail in the paper. The prototype is designed and optimized to detect the clock transition in the optical domain. Specifically, the reference atomic transition, excited with a Ramsey scheme, is detected by observing the interference pattern on a laser absorption signal. The metrological analysis includes the observation and characterization of the clock signal and the measurement of frequency stability and drift. In terms of Allan deviation, the measured frequency stability is as low as $1.7\times 10^{-13}\,\tau^{-1/2}$, $\tau$ being the averaging time, and reaches a few units of $10^{-15}$ for $\tau = 10^{4}$ s, an unprecedented achievement for a vapor-cell clock. We discuss in the paper the physical effects leading to this result, with particular attention to the laser and microwave noise transferred to the clock signal. The frequency drift, probably related to temperature, stays below $10^{-14}$ per day, and no evidence of a flicker floor is observed. We also mention some possible improvements that in principle would lead to a clock stability below the $10^{-13}$ level at 1 s and to a drift of a few units of $10^{-15}$ per day.
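    For reference, the Allan deviation quoted here is the square root of the standard two-sample variance of consecutive fractional-frequency averages $\bar y_k$ taken over intervals of duration $\tau$,

        \[
        \sigma_y^2(\tau) = \tfrac{1}{2}\,\bigl\langle (\bar y_{k+1} - \bar y_k)^2 \bigr\rangle ,
        \]

    and white frequency noise gives $\sigma_y(\tau) \propto \tau^{-1/2}$, the scaling behind the quoted $1.7\times 10^{-13}\,\tau^{-1/2}$.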

    Gravitational Waves from Neutron Stars with Large Toroidal B-fields

    We show that NSs with large toroidal B-fields tend naturally to evolve into potent gravitational-wave (gw) emitters. The toroidal field B_t tends to distort the NS into a prolate shape, and this magnetic distortion can easily dominate over the oblateness "frozen into" the NS crust. An elastic NS with a frozen-in B-field of this magnitude is clearly secularly unstable: the wobble angle between the NS's angular momentum J^i and the star's magnetic axis n_B^i grows on a dissipation timescale until J^i and n_B^i are orthogonal. This final orientation is the optimal one for gw emission. The basic cause of the instability is quite general, so we conjecture that the same final state is reached for a realistic NS. Assuming this, we show that for LMXBs with B_t of order 10^{13} G, the spindown from gws is sufficient to balance the accretion torque, supporting a suggestion by Bildsten. The spindown rates of most millisecond pulsars can also be attributed to gw emission sourced by toroidal B-fields, and both these sources could be observed by LIGO II. While the first-year spindown of a newborn NS is most likely dominated by electromagnetic processes, reasonable values of B_t and the (external) dipolar field B_d can lead to detectable levels of gw emission for a newborn NS in our own galaxy. Comment: 7 pages; submitted to PRD; only minor revision
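    For orientation, torque-balance arguments of this kind typically rest on the standard quadrupole spindown formula for a rotating star with ellipticity $\epsilon$; a sketch of the usual expressions (not necessarily the exact ones adopted in the paper) is

        \[
        \dot E_{\rm gw} = \frac{32\,G}{5\,c^5}\, I^2 \epsilon^2 \Omega^6 ,
        \qquad
        N_{\rm gw} = \frac{\dot E_{\rm gw}}{\Omega} \simeq N_{\rm acc} \approx \dot M \sqrt{G M R} ,
        \]

    where the second relation expresses the balance between gravitational-wave and accretion torques invoked for the LMXBs, with $\epsilon$ set here by the magnetic distortion.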