2,081 research outputs found

    The Changing Role of Capital in the U.S. Private Business Sector: Evidence for a "New Economy"

    Economists differ in their explanations of changes in the rate of U.S. economic growth in the latter half of the 20th century, particularly for the "new economy" period from 1982 to 2000. Adherents of the Neoclassical Growth Model have emphasized that, as the capital/labor ratio rises, the aggregate production function is subject to diminishing returns, so that economies asymptotically approach a steady state in output per worker and output per unit of capital. Endogenous Growth theorists have emphasized upward shifts in production functions that offset diminishing returns. Both theories have neglected to incorporate into their growth models the effects of the systematic shifts in the composition of output that accompany economic growth. The paper analyzes the Private Business Sector (excluding Government, Residential Housing, and Not-For-Profits), using a more restrictive measure of output, Net National Income rather than Gross Domestic Product, and a more general measure of labor input, Persons Engaged in Production rather than Full-Time Equivalent Employment or labor hours. Using BEA data sets for the stock of physical capital and gross product originating by SIC sector and industry, the paper demonstrates that about half the increase in labor and capital productivity in the new economy has been the result of endogenous growth within sectors and industries, while the other half is attributable to shifts in the composition of output away from more physical-capital-intensive industries toward more labor-intensive industries. After falling steadily from 1966 to 1982, both the nominal output/capital (Y/C) and real output/capital (Q/K) ratios rise steadily from 1982 to 2000. Growth in the real capital/labor (K/N) ratio slows during this period, so that, in marked contrast to earlier periods, half of the growth in real output per worker (Q/N) is attributable to increases in capital productivity. The increase in the Y/C ratio is shown, by counterfactual analysis, to depend partly on the shift of output from more to less capital-intensive industries. The paper also demonstrates that half of the change in the nominal Y/C ratio is due to "real" rather than relative-price changes, and that changes in capacity utilization over the business cycle explain only a negligible part of the increase.
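
    A rough sketch of the shift-share logic behind such a counterfactual analysis (the notation below is ours, not the paper's): aggregate capital productivity is a capital-share-weighted average of industry capital productivities, and its change splits into a within-industry term and a composition term,

    \[
    \frac{Y}{C} \;=\; \sum_i w_i\,\frac{Y_i}{C_i}, \qquad w_i = \frac{C_i}{C},
    \]
    \[
    \Delta\!\left(\frac{Y}{C}\right) \;\approx\; \underbrace{\sum_i \bar{w}_i\,\Delta\!\left(\frac{Y_i}{C_i}\right)}_{\text{within-industry growth}} \;+\; \underbrace{\sum_i \overline{\left(\frac{Y_i}{C_i}\right)}\,\Delta w_i}_{\text{composition shift}},
    \]

    where bars denote period-average values. Freezing the weights $w_i$ at their 1982 values, as in a counterfactual of the kind described above, isolates the within-industry component; the abstract attributes roughly half of the 1982-2000 productivity gain to each term.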

    REEDS computer code

    The REEDS (rocket exhaust effluent diffusion, single layer) computer code is used to estimate certain rocket exhaust effluent concentrations and dosages, and their distributions near the Earth's surface, following a rocket launch event. Output from REEDS is used to produce near-real-time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO
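
    The abstract does not spell out the REEDS algorithm itself; as a generic illustration of the kind of single-layer Gaussian diffusion estimate such codes produce, the sketch below computes a ground-level concentration and a crude exposure dosage for an elevated effluent cloud. All numerical values, and the function name, are hypothetical placeholders, not launch data or REEDS internals.

```python
import math

def ground_level_concentration(q, u, sigma_y, sigma_z, y, h_eff):
    """Textbook Gaussian-plume ground-level concentration (g/m^3) for a
    continuous source of strength q (g/s), mean wind speed u (m/s),
    crosswind offset y (m), dispersion coefficients sigma_y/sigma_z (m),
    and effective release height h_eff (m), with full ground reflection."""
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2.0 * sigma_y**2))
            * math.exp(-h_eff**2 / (2.0 * sigma_z**2)))

# Hypothetical illustrative values (not measured launch parameters):
q = 5.0e3                          # HCl source strength, g/s
u = 4.0                            # mean transport wind speed, m/s
sigma_y, sigma_z = 300.0, 120.0    # dispersion at some downwind distance, m
h_eff = 800.0                      # effective stabilized-cloud height, m

c = ground_level_concentration(q, u, sigma_y, sigma_z, y=0.0, h_eff=h_eff)
dosage = c * 600.0                 # dosage ~ concentration x exposure time (10 min, in s)
print(f"centerline concentration ~ {c:.3e} g/m^3, 10-min dosage ~ {dosage:.3e} g*s/m^3")
```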

    The Role of Computer Science in a Liberal Arts College

    The question the panel has been asked to discuss actually encompasses two related, but distinct, questions - each of which, in turn, has two subquestions. The first question concerns the place of computer science as a major program of study among the offerings of a liberal arts college. The two subquestions here are as follows: First, is computer science an appropriate field of study to offer in a liberal arts college? (Is it a liberal art?) Second, is a liberal arts college an appropriate home for a computer science program? (Can a high-quality program be offered in such an environment?) The second question concerns the place of computer science as part of the core program of studies in a liberal arts college. Here, the subquestions are these: First, what (if anything) ought every liberally educated person to know about computer science? (What are the minimum essentials?) Second, how can computer science enrich and inform the other liberal disciplines (thus taking its place among the “liberating” branches of learning)?

    A Tutorial on PROLOG

    PROLOG is a programming language based on the use of mathematical logic—specifically the first-order predicate calculus. The name is a contraction of “Programming in Logic”. PROLOG was developed in 1972 by Philippe Roussel of the AI Group (Groupe d’Intelligence Artificielle) of the University of Marseille. Specifically, it is an outgrowth of research there on automatic theorem proving. PROLOG has been widely used by AI researchers in Europe and Japan. In fact, the Japanese have made it the basis for the software side of their “Fifth Generation” computer project. PROLOG is currently used in a wide variety of areas, not just for automatic theorem proving. It is an excellent tool when one wants to do symbolic (as opposed to numerical) computation. Until recently, PROLOG has been less widely known in this country—perhaps due to the “not invented here” syndrome.
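
    No PROLOG code appears in this abstract; as a paradigm illustration only, the Python sketch below mimics the fact/rule/query style of logic programming with a tiny, invented family-relations knowledge base (in PROLOG this would be written directly as clauses such as parent(tom, bob). and ancestor(X, Y) :- parent(X, Y).).

```python
# Facts: parent(X, Y) means X is a parent of Y. Names are invented.
parent_facts = {("tom", "bob"), ("bob", "ann"), ("bob", "pat")}

def derive_ancestors(parents):
    """Forward-chain the rules
         ancestor(X, Y) :- parent(X, Y).
         ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
    until no new facts can be derived (a fixed point is reached)."""
    ancestors = set(parents)
    changed = True
    while changed:
        changed = False
        for (x, z) in parents:
            for (z2, y) in list(ancestors):
                if z == z2 and (x, y) not in ancestors:
                    ancestors.add((x, y))
                    changed = True
    return ancestors

# Query: who are tom's descendants? (Order of derived facts is not fixed.)
print(sorted(y for (x, y) in derive_ancestors(parent_facts) if x == "tom"))
# ['ann', 'bob', 'pat']
```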

    The utility of twins in developmental cognitive neuroscience research: How twins strengthen the ABCD research design

    The ABCD twin study will elucidate the genetic and environmental contributions to a wide range of mental and physical health outcomes in children, including substance use, brain and behavioral development, and their interrelationship. Comparisons within and between monozygotic and dizygotic twin pairs, further powered by multiple assessments, provide information about genetic and environmental contributions to developmental associations and enable stronger tests of causal hypotheses than do comparisons involving unrelated children. Thus, a sub-study of 800 pairs of same-sex twins was embedded within the overall Adolescent Brain and Cognitive Development (ABCD) design. The ABCD Twin Hub comprises four leading centers for twin research in Minnesota, Colorado, Virginia, and Missouri. Each site is enrolling 200 twin pairs, as well as singletons. The twins are recruited from registries of all twin births in each state during 2006–2008. Singletons at each site are recruited following the same school-based procedures as the rest of the ABCD study. This paper describes the background and rationale for the ABCD twin study, the ascertainment of twin pairs and implementation strategy at each site, and the details of the proposed analytic strategies to quantify genetic and environmental influences and test hypotheses critical to the aims of the ABCD study. Keywords: Twins, Heritability, Environment, Substance use, Brain structure, Brain function.
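
    As a back-of-the-envelope sketch of how MZ/DZ comparisons separate genetic from environmental variance (the ABCD analyses themselves rely on full biometric structural-equation models rather than this shortcut), Falconer's approximation estimates the variance components from the within-pair phenotypic correlations $r_{MZ}$ and $r_{DZ}$:

    \[
    h^2 \approx 2\,(r_{MZ} - r_{DZ}), \qquad c^2 \approx 2\,r_{DZ} - r_{MZ}, \qquad e^2 \approx 1 - r_{MZ},
    \]

    where $h^2$, $c^2$, and $e^2$ are the additive-genetic, shared-environment, and nonshared-environment components. With hypothetical correlations $r_{MZ} = 0.6$ and $r_{DZ} = 0.4$, for example, this gives $h^2 \approx 0.4$, $c^2 \approx 0.2$, and $e^2 \approx 0.4$.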

    Uncertainty reconciles complementarity with joint measurability

    The fundamental principles of complementarity and uncertainty are shown to be related to the possibility of joint unsharp measurements of pairs of noncommuting quantum observables. A new joint measurement scheme for complementary observables is proposed. The measured observables are represented as positive operator valued measures (POVMs), whose intrinsic fuzziness parameters are found to satisfy an intriguing pay-off relation reflecting the complementarity. At the same time, this relation represents an instance of a Heisenberg uncertainty relation for measurement imprecisions. A model-independent consideration shows that this uncertainty relation is logically connected with the joint measurability of the POVMs in question. Comment: 4 pages, RevTeX. Title of previous version: "Complementarity and uncertainty - entangled in joint path-interference measurements". This new version focuses on the "measurement uncertainty relation" and its role, disentangling this issue from the special context of path-interference duality. See also http://www.vjquantuminfo.org (October 2003).
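
    For readers unfamiliar with unsharp joint measurements, a standard qubit illustration (a textbook example, not necessarily the specific scheme proposed in this paper) uses smeared versions of two complementary spin observables,

    \[
    E^{x}_{\pm} = \tfrac{1}{2}\bigl(\mathbb{1} \pm \lambda\,\sigma_x\bigr), \qquad
    E^{z}_{\pm} = \tfrac{1}{2}\bigl(\mathbb{1} \pm \mu\,\sigma_z\bigr), \qquad 0 \le \lambda, \mu \le 1.
    \]

    For such unbiased effects along orthogonal axes, the two POVMs are jointly measurable if and only if $\lambda^2 + \mu^2 \le 1$, so making one observable sharp ($\lambda \to 1$) forces the other to become maximally unsharp ($\mu \to 0$), which is the kind of pay-off relation between fuzziness parameters described above.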

    Trinity University's Summer Bridge Program: Navigating the Changing Demographics in Higher Education

    Our article is divided into five sections. First, our study explores the demographic, economic, and cultural changes influencing higher education. We also explain the tangible and intangible benefits of a college education for first-generation, underrepresented students (FGUS). Second, we provide a brief discussion of the history of Trinity University and our Summer Bridge program. Third, our study describes our Summer Bridge program. Fourth, we examine how our Summer Bridge students’ grades and retention rates compare to those of our other first-year students, using the data we collected. Fifth, our article concludes with a discussion of future directions for our Summer Bridge program and how it may apply to other institutions of higher education. In particular, we offer recommendations for other student affairs professionals who will also be experiencing an increase in first-generation, underrepresented students.

    Mid-Infrared Time-Resolved Frequency Comb Spectroscopy of Transient Free Radicals

    We demonstrate time-resolved frequency comb spectroscopy (TRFCS), a new broadband absorption spectroscopy technique for the study of trace free radicals on the microsecond timescale. We apply TRFCS to study the time-resolved, mid-infrared absorption of the deuterated hydroxyformyl radical trans-DOCO, an important short-lived intermediate along the OD + CO reaction path. Directly after photolysis of the chemical precursor acrylic acid-d1, we measure absolute trans-DOCO product concentrations with a sensitivity of 5 × 10^10 cm^-3 and observe its subsequent loss with a time resolution of 25 μs. The multiplexed nature of TRFCS allows us to detect simultaneously the time-dependent concentrations of several other photoproducts and thus unravel primary and secondary chemical reaction pathways.
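
    The conversion from measured absorption to the absolute concentrations quoted above follows, in outline, the Beer-Lambert relation (the symbols here are generic; the actual retrieval depends on the trans-DOCO line intensities and the effective absorption path of the instrument):

    \[
    N(t) \;=\; \frac{-\ln\bigl(I(t)/I_0\bigr)}{\sigma(\nu)\,L_{\mathrm{eff}}},
    \]

    where $I(t)/I_0$ is the fractional transmitted intensity at optical frequency $\nu$, $\sigma(\nu)$ is the absorption cross section, $L_{\mathrm{eff}}$ is the effective path length, and $N(t)$ is the number density in cm$^{-3}$; recording $I(t)$ for every comb tooth in parallel is what gives the technique its multiplexed, microsecond-scale time resolution.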