3,820 research outputs found
Innovation Heterogeneity, Schumpeterian Growth and Evolutionary Theorizing
Schumpeterian growth models revolve around two tacit assumptions that are at odds with the empirical evidence, namely: all innovations are equally important for economic growth (equipollent innovation) and all innovations occur in one sector only (confined innovation). The present paper shows that it is possible to dispose of both implicit assumptions by disaggregating the "ideas production function" without altering the gist of the theoretical framework. The paper refers briefly to the concepts of macro and microinventions, and introduces the concept of "innovatory discontinuity". The extended theoretical framework developed here throws light on the ongoing controversy between neoclassical and evolutionary theorizing.
Keywords: Innovation heterogeneity, ideas production function, scale effects problem, innovatory discontinuity, neoclassical and evolutionary theorizing
The Double Paradox of Elementary Economics Education
Elementary economics textbooks have become less attractive to students requiring only an introduction to economics, given that their content is pervaded by mathematical diagrams and simple equations. They are also of relatively little value to those interested in, for example, attempting to gain an understanding of the New Economy, for they rarely emphasise business innovation and its crucial dynamic role. These factors engender something of a double paradox. First (the paradox of the tools and the audience), newcomers are frequently 'turned off' by existing economics textbooks due to the pervasive use of mathematics. Second (the paradox of the content and relevance), those newcomers who are not initially turned off tend to be disenchanted with economics because they perceive that it is of little use in understanding the New Economy in which they work, or will come to work. We suggest an integrated solution to both paradoxes. The implementation entails a minor reorientation of the traditional pedagogical strategy for teaching introductory economics.
Keywords: Double paradox, elementary economics education
Unexpected properties of bandwidth choice when smoothing discrete data for constructing a functional data classifier
The data functions that are studied in the course of functional data analysis
are assembled from discrete data, and the level of smoothing that is used is
generally that which is appropriate for accurate approximation of the
conceptually smooth functions that were not actually observed. Existing
literature shows that this approach is effective, and even optimal, when using
functional data methods for prediction or hypothesis testing. However, in the
present paper we show that this approach is not effective in classification
problems. There, a useful rule of thumb is that undersmoothing is often
desirable, but there are several surprising qualifications to that rule.
First, the effect of smoothing the training data can be more significant than
that of smoothing the new data set to be classified; second, undersmoothing is
not always the right approach, and in fact in some cases using a relatively
large bandwidth can be more effective; and third, these perverse results are
the consequence of very unusual properties of error rates, expressed as
functions of smoothing parameters. For example, the orders of magnitude of
optimal smoothing parameter choices depend on the signs and sizes of terms in
an expansion of error rate, and those signs and sizes can vary dramatically
from one setting to another, even for the same classifier.
Comment: Published at http://dx.doi.org/10.1214/13-AOS1158 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org/).
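The phenomenon described in this abstract can be illustrated with a toy simulation. The sketch below is not the authors' experimental setup; it assumes a simple two-class setting (hypothetical sinusoidal templates observed with noise on a discrete grid), a Nadaraya–Watson kernel smoother, and a nearest-centroid functional classifier, and it reports test error at several bandwidths so one can observe how error depends on the smoothing parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)  # discrete observation grid on [0, 1]

def sample_curves(n, cls):
    """Noisy discrete observations of a smooth class-template curve.

    The templates (phase-shifted sinusoids) and noise level are
    illustrative assumptions, not taken from the paper.
    """
    template = np.sin(2 * np.pi * t) if cls == 0 else np.sin(2 * np.pi * t + 0.3)
    return template + rng.normal(scale=0.5, size=(n, t.size))

def smooth(X, h):
    """Nadaraya-Watson (Gaussian kernel) smoother with bandwidth h,
    applied row-wise to a matrix of discretely observed curves."""
    W = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    W /= W.sum(axis=1, keepdims=True)   # normalize kernel weights per output point
    return X @ W.T

def centroid_error(h, n_train=100, n_test=200):
    """Test error of a nearest-centroid classifier built on smoothed curves.

    Both the training curves (used to form class centroids) and the new
    curves to be classified are smoothed with the same bandwidth h here;
    the paper's point is that these two smoothing choices matter differently.
    """
    m0 = smooth(sample_curves(n_train, 0), h).mean(axis=0)
    m1 = smooth(sample_curves(n_train, 1), h).mean(axis=0)
    X_test = np.vstack([sample_curves(n_test, 0), sample_curves(n_test, 1)])
    y_true = np.repeat([0, 1], n_test)
    S = smooth(X_test, h)
    pred = (np.linalg.norm(S - m1, axis=1) < np.linalg.norm(S - m0, axis=1)).astype(int)
    return float(np.mean(pred != y_true))

for h in (0.01, 0.05, 0.3):
    print(f"h = {h:<5} test error = {centroid_error(h):.3f}")
```

Varying `h` from severe undersmoothing to heavy oversmoothing in this sketch gives a rough feel for the error-versus-bandwidth curves the abstract discusses, though the sketch cannot reproduce the paper's asymptotic analysis.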
A New Taxonomy of Economic Sectors with a View to Policy Implications
This paper is an attempt to tease out a taxonomy of economic sectors based on a systems approach to innovation and economic growth that may be useful for policy analysis. The taxonomy explored here revolves around novel products rather than ethereal knowledge-producing entities. This insight goes back to Allyn Young (1928) and Joseph Schumpeter (1934), who argued that the introduction of new goods was the engine of economic growth. More precisely, our taxonomy of sectors focuses on novel products which are efficiency-enhancing within and between sectors through the market mechanism. The scheme revolves around the relationship between 'Enabling' and 'Recipient' sectors (which gives the taxonomy its name: ER), and offers a lens for viewing and interpreting a substantive part of the mechanics of modern economic growth. The last part of the paper briefly discusses a few immediate policy implications, although the taxonomy has the potential for greater use and value in this regard.
Keywords: innovation, economic growth, enabling linkages approach, knowledge-based economies, novel products, efficiency-enhancing innovations
A study of self-actualization among various groups of male intercollegiate athletes at the University of the Pacific
The purpose of this study was to determine the degree of self-actualization (SA), as measured by the Personal Orientation Inventory (POI), among various groups of male intercollegiate athletes at the University of the Pacific. The subjects (N = 111) were drawn from golf (N = 8), tennis (N = 5), swimming (N = 6), baseball (N = 25), basketball (N = 14), and football (N = 53). The subjects were categorized into three sub-groupings: (1) individual sport members (golf, tennis, swimming) and team sport members (baseball, basketball, football); (2) good players, regular players, and substitutes; and (3) upper-division athletes and lower-division athletes.
Mathematics Is Physics
In this essay, I argue that mathematics is a natural science---just like
physics, chemistry, or biology---and that this can explain the alleged
"unreasonable" effectiveness of mathematics in the physical sciences. The main
challenge for this view is to explain how mathematical theories can become
increasingly abstract and develop their own internal structure, whilst still
maintaining an appropriate empirical tether that can explain their later use in
physics. In order to address this, I offer a theory of mathematical
theory-building based on the idea that human knowledge has the structure of a
scale-free network and that abstract mathematical theories arise from a
repeated process of replacing strong analogies with new hubs in this network.
This allows mathematics to be seen as the study of regularities, within
regularities, within ..., within regularities of the natural world. Since
mathematical theories are derived from the natural world, albeit at a much
higher level of abstraction than most other scientific theories, it should come
as no surprise that they so often show up in physics.
This version of the essay contains an addendum responding to Sylvia
Wenmackers' essay and to comments that were made on the FQXi website.
Comment: 15 pages, LaTeX. Second prize winner in the 2015 FQXi Essay Contest (see http://fqxi.org/community/forum/topic/2364).
Session D-4: From Guernica to Nuremberg: Teaching Human Rights Themes in mid-20th-Century History
A critical turning point in modern warfare (aviation, civilian casualties, and population displacement during the Spanish Civil War and World War II) led to the Nuremberg Tribunals and the UN Declaration of Human Rights, foreshadowing contemporary debates about bombing, drones, refugees/immigration, and interventionist foreign policies. This session will span World History and US History, drawing from free archival primary sources (graphic and textual) that reveal changing perceptions of warfare.