
    Network of Tinkerers: A Model of Open-Source Technology Innovation

    Airplanes were invented by hobbyists and experimenters, and some personal computers were as well. Similarly, many open-source software developers are interested in the software they make and are not focused on profit. Motivated by these cases, this paper presents a model of agents called tinkerers who want to improve a technology for their own reasons, by their own criteria, and who see no way to profit from it. Under these conditions, they would rather share their technology than work alone. The members of the agreement form an information network. The network's members optimally specialize based on their opportunities, either in particular aspects of the technology or in expanding or managing the network. Endogenously, there are incentives to standardize on designs and descriptions of the technology. A tinkerer in the network who sees an opportunity to produce a profitable product may exit the network to create a startup firm and conduct focused research and development. Thus a new industry can arise.

    Keywords: Technological Change, Open Source Software, Uncertainty, Innovation, Invention, Collective Invention, Hackers, Hobbyists, Experimenters, Airplane

    Turbulence, Inequality, and Cheap Steel

    Iron and steel production grew dramatically in the U.S. when mass production technologies for steel were adopted in the 1860s. According to new measures presented in this study, earnings inequality rose within the iron and steel industries around 1870, perhaps because technological uncertainty led to gambles and turbulence. Firms made a variety of technological choices and began formal research and development. Professional associations and journals for mechanical engineers and chemists appeared. A national market replaced local markets for iron and steel. An industrial union replaced craft unions. As new ore sources and cheap water transportation were introduced, new plants along the Great Lakes outcompeted existing plants elsewhere. Because new iron and steel plants in the 1870s were larger than any U.S. plants had ever been, cost accounting appeared in the industry and grew in importance. Uncertainty explains the rise in inequality better than a skill-bias account, according to which differences among individuals generate greater differences in wages. Analogous issues of inequality come up with respect to recent information technology.

    Keywords: technological change, Bessemer steel, technological uncertainty, turbulence, inequality, innovation

    Metrics for Graph Comparison: A Practitioner's Guide

    Comparison of graph structure is a ubiquitous task in data analysis and machine learning, with diverse applications in fields such as neuroscience, cyber security, social network analysis, and bioinformatics, among others. Discovery and comparison of structures such as modular communities, rich clubs, hubs, and trees in data in these fields yields insight into the generative mechanisms and functional properties of the graph. Often, two graphs are compared via a pairwise distance measure, with a small distance indicating structural similarity and vice versa. Common choices include spectral distances (also known as λ distances) and distances based on node affinities. However, there has not yet been a comparative study of the efficacy of these distance measures in discerning between common graph topologies and different structural scales. In this work, we compare commonly used graph metrics and distance measures, and demonstrate their ability to discern between common topological features found in both random graph models and empirical datasets. We put forward a multi-scale picture of graph structure, in which the effect of global and local structure upon the distance measures is considered. We make recommendations on the applicability of different distance measures to empirical graph data problems based on this multi-scale view. Finally, we introduce the Python library NetComp, which implements the graph distances used in this work.
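    A spectral (λ) distance of the kind the abstract mentions can be sketched in a few lines. This is a minimal illustration using NetworkX and NumPy, not NetComp's actual implementation: it compares the sorted adjacency-matrix spectra of two graphs, padding the shorter spectrum with zeros, and optionally restricts attention to the k largest eigenvalues (a common way to emphasize global structure).

    ```python
    import numpy as np
    import networkx as nx

    def spectral_distance(G1, G2, k=None):
        """Euclidean distance between sorted adjacency spectra (a simple lambda-distance).

        Illustrative sketch only; NetComp's own distances differ in detail.
        """
        # Eigenvalues of the adjacency matrices, sorted in descending order.
        s1 = np.sort(nx.adjacency_spectrum(G1).real)[::-1]
        s2 = np.sort(nx.adjacency_spectrum(G2).real)[::-1]
        # Pad the shorter spectrum with zeros so both have equal length.
        n = max(len(s1), len(s2))
        s1 = np.pad(s1, (0, n - len(s1)))
        s2 = np.pad(s2, (0, n - len(s2)))
        if k is not None:  # compare only the k largest eigenvalues
            s1, s2 = s1[:k], s2[:k]
        return float(np.linalg.norm(s1 - s2))
    ```

    For example, `spectral_distance(nx.path_graph(5), nx.cycle_graph(5))` is positive because the cycle's spectrum contains the eigenvalue 2 while the path's does not, whereas the distance of any graph to itself is zero.
    
    
    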

    Primary Cosmic Ray and Solar Protons II

    During July and August 1961, the energy spectrum of primary cosmic ray protons was investigated in the energy range from 80 to 350 MeV. The observations were made in five high altitude balloon flights at geomagnetic latitudes lambda greater than 73 degrees N. Solar flare and quiet day spectra were obtained. A comparison of the 1960 and 1961 results leads to the following conclusions: 1. A significant flux of low energy protons is continually present in the primary radiation in the years of high solar activity. 2. This flux decreases with the declining level of solar activity as the galactic cosmic ray flux increases; it is, therefore, suggested that it is of solar origin. 3. The time dependence of the observed proton flux suggests the following alternatives: a) the particles are produced or released more or less continuously by the sun and do not originate only in the large flare events; or b) the particles are produced in individual large solar flares and subsequently stored over long periods of time. This second alternative would require a new and as yet unknown storage mechanism with a characteristic time of about 30 or more days.

    Using HP Filtered Data for Econometric Analysis: Some Evidence from Monte Carlo Simulations

    The Hodrick-Prescott (HP) filter has become a widely used tool for detrending integrated time series in applied econometric analysis. Even though the theoretical time series literature sums up an extensive catalogue of severe criticism against an econometric analysis of HP filtered data, the original Hodrick and Prescott (1980, 1997) suggestion to measure the strength of association between (macro-)economic variables by a regression analysis of corresponding HP filtered time series still appears to be popular. This contradictory situation might be justified only if HP-induced distortions were quantitatively negligible in empirical applications. However, this hypothesis can hardly be maintained, as the simulation results presented within this paper indicate that HP filtered series can give rise to seriously spurious regression results.

    Keywords: HP filter, spurious regression, detrending
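    The kind of exercise the abstract describes can be sketched as follows: filter two independent random walks with the HP filter and regress one cycle component on the other. This is an illustrative sketch (not the paper's Monte Carlo code) using the `hpfilter` function from statsmodels; the point is only to show the mechanics, since a single draw does not by itself demonstrate spuriousness.

    ```python
    # Illustrative sketch: HP-filter two independent random walks and
    # regress one cycle on the other. The paper's claim is that, across
    # many such simulated draws, filter-induced distortions make these
    # regressions look spuriously informative.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.filters.hp_filter import hpfilter

    rng = np.random.default_rng(0)
    T = 200
    x = np.cumsum(rng.standard_normal(T))  # independent random walk
    y = np.cumsum(rng.standard_normal(T))  # independent random walk

    # lamb=1600 is the conventional smoothing value for quarterly data.
    cycle_x, trend_x = hpfilter(x, lamb=1600)
    cycle_y, _ = hpfilter(y, lamb=1600)

    # Regress one filtered "cyclical component" on the other.
    ols = sm.OLS(cycle_y, sm.add_constant(cycle_x)).fit()
    print(ols.params, ols.rsquared)
    ```

    Repeating this over many simulated pairs and tabulating the rejection rate of the zero-slope hypothesis is the Monte Carlo design the abstract alludes to.
    
    
    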

    Top-Quark Physics at the LHC

    The top quark is the heaviest of all known elementary particles. It was discovered in 1995 by the CDF and D0 experiments at the Tevatron. With the start of the LHC in 2009, an unprecedented wealth of measurements of the top quark's production mechanisms and properties has been performed by the ATLAS and CMS collaborations, most of these resulting in smaller uncertainties than those achieved previously. At the same time, huge progress was made on the theoretical side, yielding significantly improved predictions up to next-to-next-to-leading order in perturbative QCD. Due to the vast number of events containing top quarks, a variety of new measurements became feasible and opened a new window to precision tests of the Standard Model and to contributions of new physics. In this review, originally written for a recent book on the results of LHC Run 1, top-quark measurements obtained so far from the LHC Run 1 are summarised and put in context with the current understanding of the Standard Model.

    Comment: 35 pages, 25 figures. To appear in "The Large Hadron Collider -- Harvest of Run 1", Thomas Schörner-Sadenius (ed.), Springer, 2015 (532 pages, 253 figures; ISBN 978-3-319-15000-0; eBook ISBN 978-3-319-15001-7; for more details, see http://www.springer.com/de/book/9783319150000)