The Changing Role of Capital in the U.S. Private Business Sector: Evidence for a "New Economy"
Economists differ in their explanations of changes in the rate of U.S. economic growth in the latter half of the 20th century, particularly for the "new economy" period from 1982 to 2000. Adherents of the Neoclassical Growth Model have emphasized that, with the increase in the capital/labor ratio, the aggregate production function would be subject to diminishing returns, so that economies would asymptotically approach a steady state in terms of output per worker and output per unit of capital. Endogenous Growth theorists have emphasized upward shifts in production functions that offset diminishing returns. Both theories have neglected to incorporate into their growth models the effects of systematic shifts in the composition of output that accompany economic growth. The paper analyzes the Private Business Sector (excluding Government, Residential Housing, and Not-For-Profit institutions), using a more restrictive measure of output, Net National Income, rather than Gross Domestic Product, and a more general measure of labor input, Persons Engaged in Production, rather than Full-Time Equivalent Employment or labor hours. Using BEA data sets for the stock of physical capital and gross product originating by SIC sector and industry, the paper demonstrates that about half of the increase in labor and capital productivity in the new economy has been the result of endogenous growth within sectors and industries; the other half is attributable to shifts in the composition of output away from more physical-capital-intensive industries toward more labor-intensive industries. After falling steadily from 1966 to 1982, both the nominal output/capital (Y/C) and real output/capital (Q/K) ratios rise steadily from 1982 to 2000. Growth in the real capital/labor (K/N) ratio slows during this period, so that, in marked contrast to earlier periods, half of the growth in real output per worker (Q/N) is attributable to increases in capital productivity.
The increase in the Y/C ratio is shown, by counterfactual analysis, to depend partly on the shift of output from more to less capital-intensive industries. The paper also demonstrates that half of the change in the nominal Y/C ratio is due to "real" rather than relative price changes and that changes in capacity utilization over the business cycle explain only a negligible part of the increase.
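The counterfactual logic can be sketched in a few lines: hold each industry's capital share fixed at its base-period value while applying end-period industry output/capital ratios, and attribute the remainder of the aggregate change to composition shifts. The industries and numbers below are illustrative assumptions, not the paper's BEA figures.

```python
# Illustrative shift-share decomposition of a change in the aggregate
# output/capital (Y/C) ratio into a "within-industry" effect and a
# "composition" effect. All names and figures are hypothetical.

def decompose(base, final):
    """Each argument maps industry -> (capital_share, output_capital_ratio).
    Returns (within_effect, composition_effect) for the change in the
    aggregate Y/C ratio between the two periods."""
    agg = lambda d: sum(s * r for s, r in d.values())
    # Counterfactual aggregate: base-period capital shares combined
    # with final-period industry Y/C ratios.
    counterfactual = sum(base[i][0] * final[i][1] for i in base)
    within = counterfactual - agg(base)        # ratio changes within industries
    composition = agg(final) - counterfactual  # capital shifting across industries
    return within, composition

base = {"manufacturing": (0.6, 0.30), "services": (0.4, 0.50)}
final = {"manufacturing": (0.4, 0.35), "services": (0.6, 0.55)}

within, comp = decompose(base, final)
print(round(within, 3), round(comp, 3))  # 0.05 0.04
```

Here the aggregate Y/C ratio rises by 0.09, of which 0.05 comes from within-industry productivity gains and 0.04 from the shift of capital toward the higher-Y/C industry.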
REEDS computer code
The REEDS (Rocket Exhaust Effluent Diffusion Single-layer) computer code is used to estimate certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near-real-time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.
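As a rough illustration of the kind of dispersion estimate such a code produces, here is a textbook Gaussian plume calculation of ground-level concentration downwind of an elevated source. The function, parameter values, and source terms are illustrative assumptions and are not drawn from REEDS itself.

```python
import math

# Illustrative textbook Gaussian plume estimate of ground-level
# concentration; not the REEDS single-layer model or its inputs.

def ground_level_concentration(Q, u, sigma_y, sigma_z, y, H):
    """Q: emission rate (g/s); u: wind speed (m/s); sigma_y, sigma_z:
    lateral/vertical dispersion coefficients (m); y: crosswind offset (m);
    H: effective release height (m). Returns concentration in g/m^3."""
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-H**2 / (2 * sigma_z**2)))

# Hypothetical example: an HCl source at 100 g/s, 5 m/s wind,
# centerline receptor (y = 0) with illustrative dispersion values.
c = ground_level_concentration(Q=100.0, u=5.0, sigma_y=80.0, sigma_z=40.0,
                               y=0.0, H=50.0)
print(f"{c:.2e} g/m^3")
```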
The Role of Computer Science in a Liberal Arts College
The question the panel has been asked to discuss actually encompasses two related, but distinct, questions - each of which, in turn, has two subquestions.
The first question concerns the place of computer science as a major program of study among the offerings of a liberal arts college. The two subquestions here are as follows: First, is computer science an appropriate field of study to offer in a liberal arts college? (Is it a liberal art?) Second, is a liberal arts college an appropriate home for a computer science program? (Can a high-quality program be offered in such an environment?)
The second question concerns the place of computer science as part of the core program of studies in a liberal arts college. Here, the subquestions are these: First, what (if anything) ought every liberally educated person to know about computer science? (What are the minimum essentials?) Second, how can computer science enrich and inform the other liberal disciplines (thus taking its place among the “liberating” branches of learning)?
A Tutorial on PROLOG
PROLOG is a programming language based on the use of mathematical logic—specifically the first-order predicate calculus. The name is a contraction for “Programming in Logic”. PROLOG was developed in 1972 by Philippe Roussel of the AI Group (Groupe d’Intelligence Artificielle) of the University of Marseille. Specifically, it is an outgrowth of research there on automatic theorem proving. PROLOG has been widely used by AI researchers in Europe and Japan. In fact, the Japanese have made it the basis for the software side of their “Fifth Generation” computer project. PROLOG is currently used in a wide variety of areas, not just for automatic theorem proving. It is an excellent tool when one wants to do symbolic (as opposed to numerical) computation. Until recently, PROLOG has been less widely known in this country—perhaps due to the “not invented here” syndrome.
The utility of twins in developmental cognitive neuroscience research: How twins strengthen the ABCD research design
The ABCD twin study will elucidate the genetic and environmental contributions to a wide range of mental and physical health outcomes in children, including substance use, brain and behavioral development, and their interrelationship. Comparisons within and between monozygotic and dizygotic twin pairs, further powered by multiple assessments, provide information about genetic and environmental contributions to developmental associations, and enable stronger tests of causal hypotheses, than do comparisons involving unrelated children. Thus, a sub-study of 800 pairs of same-sex twins was embedded within the overall Adolescent Brain and Cognitive Development (ABCD) design. The ABCD Twin Hub comprises four leading centers for twin research in Minnesota, Colorado, Virginia, and Missouri. Each site is enrolling 200 twin pairs, as well as singletons. The twins are recruited from registries of all twin births in each state during 2006–2008. Singletons at each site are recruited following the same school-based procedures as the rest of the ABCD study. This paper describes the background and rationale for the ABCD twin study, the ascertainment of twin pairs and implementation strategy at each site, and the details of the proposed analytic strategies to quantify genetic and environmental influences and test hypotheses critical to the aims of the ABCD study.
Keywords: Twins, Heritability, Environment, Substance use, Brain structure, Brain function
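One standard way twin comparisons quantify genetic and environmental influence is the classical Falconer decomposition from MZ and DZ pair correlations. The sketch below uses made-up correlations for illustration; it is not the ABCD analytic pipeline, which relies on more elaborate biometric models.

```python
# Illustrative Falconer decomposition of trait variance from twin-pair
# correlations. The correlation values are hypothetical.

def falconer(r_mz, r_dz):
    """Split variance into additive genetic (A), shared environment (C),
    and nonshared environment (E) components."""
    a = 2 * (r_mz - r_dz)  # MZ pairs share twice the additive genes of DZ pairs
    c = r_mz - a           # MZ resemblance not explained by genes
    e = 1 - r_mz           # nonshared environment (plus measurement error)
    return a, c, e

a, c, e = falconer(r_mz=0.70, r_dz=0.45)
print(round(a, 2), round(c, 2), round(e, 2))  # 0.5 0.2 0.3
```

With these made-up correlations, half the trait variance would be attributed to additive genetic effects, a pattern the within- and between-pair comparisons described above are designed to detect.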
Uncertainty reconciles complementarity with joint measurability
The fundamental principles of complementarity and uncertainty are shown to be related to the possibility of joint unsharp measurements of pairs of noncommuting quantum observables. A new joint measurement scheme for complementary observables is proposed. The measured observables are represented as positive operator valued measures (POVMs), whose intrinsic fuzziness parameters are found to satisfy an intriguing pay-off relation reflecting the complementarity. At the same time, this relation represents an instance of a Heisenberg uncertainty relation for measurement imprecisions. A model-independent consideration shows that this uncertainty relation is logically connected with the joint measurability of the POVMs in question.
Comment: 4 pages, RevTeX. Title of previous version: "Complementarity and uncertainty - entangled in joint path-interference measurements". This new version focuses on the "measurement uncertainty relation" and its role, disentangling this issue from the special context of path-interference duality. See also http://www.vjquantuminfo.org (October 2003)
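For orientation, a measurement-imprecision relation of the Heisenberg type referred to in the abstract can be written schematically as follows; the paper's specific fuzziness parameters and pay-off relation may be normalized differently.

```latex
% Schematic form only: a Heisenberg-type relation bounding the joint
% imprecisions \epsilon(A), \epsilon(B) of unsharp measurements of two
% noncommuting observables A and B. The paper's pay-off relation for
% the POVM fuzziness parameters may take a different form.
\epsilon(A)\,\epsilon(B) \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|
```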
Trinity University's Summer Bridge Program: Navigating the Changing Demographics in Higher Education
Our article is divided into five sections. First, our study explores the demographic, economic, and cultural changes influencing higher education. We also explain the tangible and intangible benefits of a college education for first-generation, underrepresented students (FGUS). Second, we provide a brief discussion of the history of Trinity University and our Summer Bridge program. Third, our study describes our Summer Bridge program. Fourth, we examine how our Summer Bridge students’ grades and retention rates compare to those of our other first-year students. And fifth, our article concludes with a discussion of future directions for our Summer Bridge program and how it may apply to other higher education institutions. In particular, we offer recommendations for other student affairs professionals who will also be experiencing an increase in first-generation, underrepresented students.
Beta contamination monitor energy response
Beta contamination is monitored at Los Alamos National Laboratory (LANL) with portable handheld probes and their associated counters, smear counters, air-breathing continuous air monitors (CAM), personnel contamination monitors (PCM), and hand and foot monitors (HFM). The response of these monitors was measured using a set of anodized-aluminum beta sources for five isotopes: Carbon-14, Technetium-99, Cesium-137, Chlorine-36, and Strontium/Yttrium-90. The surface emission rates of the sources are traceable to the National Institute of Standards and Technology (NIST) with a precision of one relative standard deviation equal to 1.7%. All measurements were made in reproducible geometry, mostly using aluminum source holders. All counts significantly above background were collected to a precision of 1% or better. The study of the hand-held probes included measurements at six air gaps from 0.76 to 26.2 mm. The energy response of the detectors is well parameterized as a function of the average beta energy of the isotopes (C14 = 50 keV, Tc99 = 85, Cs137 = 188, Cl36 = 246, and Sr/Y90 = 934). The authors conclude that Chlorine-36 is a suitable beta emitter for routine calibration. They recommend that a pancake Geiger-Mueller (GM) or gas-proportional counter be used primarily for beta contamination surveys, with an air gap not to exceed 6 mm. The energy response varies by about 30% from Tc99 to Sr/Y90 for the pancake GM detector. Dual alpha/beta probes have poor to negligible efficiency for low-energy betas. The rugged anodized sources represent partially embedded contamination found in the field, and they are provided with precise, NIST-traceable emission rates for reliable calibration.
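The quoted counting precisions follow directly from Poisson statistics: the relative standard deviation of N counts is 1/sqrt(N), so a target precision fixes the number of counts needed. A minimal sketch (the function name is ours, not from the report):

```python
import math

# Poisson counting statistics: for N counts the relative standard
# deviation is sqrt(N)/N = 1/sqrt(N), so meeting a target relative
# precision requires N >= 1/precision^2 counts.

def counts_for_precision(rel_sd):
    """Counts required so that 1/sqrt(N) <= rel_sd."""
    return math.ceil(1.0 / rel_sd**2)

print(counts_for_precision(0.01))   # 1% precision -> 10000 counts
print(counts_for_precision(0.017))  # the 1.7% precision of the NIST-traceable sources
```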