
    An empirical evaluation of the technology cycle time indicator as a measure of the pace of technological progress in superconductor technology

    The technology cycle time indicator (TCT) is a new measure of technological progress. The TCT is the median age of the patents cited on the front page of a patent document. The measure assumes that the younger the cited patents, the more quickly one generation of inventions is replacing another. The main purpose of the study was to evaluate the TCT in a dynamic context to determine how accurately it measures the pace of technological progress. This study found that the trend in TCT changed abruptly from gradually increasing (slowing cycle time) to steadily decreasing (speeding cycle time) following the discovery of high-temperature superconductors. The methodology prescribed in this study could potentially be used to assess the pace of progress for different technologies, or for different nations in the same technology.
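    The TCT statistic described above is straightforward to compute: collect the citation years from a patent's front page and take the median age. A minimal sketch, with an illustrative function name and toy citation years (not data from the study):

```python
from statistics import median

def technology_cycle_time(citing_year, cited_years):
    """Median age (in years) of the patents cited by a patent.

    A shorter TCT suggests the invention builds on newer prior art,
    i.e. a faster pace of technological turnover.
    """
    ages = [citing_year - y for y in cited_years]
    return median(ages)

# A hypothetical 1990 patent citing prior patents from five earlier years:
print(technology_cycle_time(1990, [1975, 1982, 1986, 1988, 1989]))  # -> 4
```

    Tracking this median across a technology's patents year by year gives the trend the study examines.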

    Come back Marshall, all is forgiven?: Complexity, evolution, mathematics and Marshallian exceptionalism

    Marshall was the great synthesiser of neoclassical economics. Yet with his qualified assumption of self-interest, his emphasis on variation in economic evolution and his cautious attitude to the use of mathematics, Marshall differs fundamentally from other leading neoclassical contemporaries. Metaphors inspire more specific analogies and ontological assumptions, and Marshall used the guiding metaphor of Spencerian evolution. Unfortunately, the further development of a Marshallian evolutionary approach was undermined in part by theoretical problems within Spencer's theory. Yet some things can be salvaged from the Marshallian evolutionary vision. They may even be placed in a more viable Darwinian framework. Peer reviewed; final accepted version.

    Shrinking a large dataset to identify variables associated with increased risk of Plasmodium falciparum infection in Western Kenya

    Large datasets are often not amenable to analysis using traditional single-step approaches. Here, our general objective was to apply imputation techniques, principal component analysis (PCA), elastic net regularization and generalized linear models in a systematic approach to extract the most meaningful predictors of a health outcome from a large dataset. We extracted predictors of Plasmodium falciparum infection from a large covariate dataset with a limited number of observations, using data from the People, Animals, and their Zoonoses (PAZ) project to demonstrate these techniques: data collected from 415 homesteads in western Kenya contained over 1500 variables describing the health, environment, and social factors of the humans, livestock, and the homesteads in which they reside. The wide, sparse dataset was reduced to 42 predictors of P. falciparum malaria infection, and wealth rankings were produced for all homesteads. The 42 predictors make biological sense and are supported by previous studies. This systematic data-mining approach would make many large datasets more manageable and informative for decision-making processes and health policy prioritization.
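    The variable-selection step in such a pipeline hinges on the elastic net's ability to drive the coefficients of weak predictors exactly to zero. A toy pure-Python coordinate-descent sketch of that mechanism (not the PAZ pipeline itself; function names, penalties and data are illustrative):

```python
def soft_threshold(z, gamma):
    """Soft-thresholding operator used in elastic-net coordinate descent."""
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def elastic_net(X, y, l1=0.1, l2=0.1, iters=200):
    """Tiny elastic-net fit by cyclic coordinate descent.

    The L1 penalty (l1) zeroes out weak predictors, performing selection;
    the L2 penalty (l2) shrinks and stabilises the surviving coefficients.
    """
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # Partial covariance of feature j with the residual excluding j.
            rho = sum(
                X[i][j] * (y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j))
                for i in range(n)) / n
            denom = sum(X[i][j] ** 2 for i in range(n)) / n + l2
            beta[j] = soft_threshold(rho, l1) / denom
    return beta

# Feature 0 drives the outcome; feature 1 is noise and is zeroed out.
X = [[1, 0.1], [2, -0.1], [3, 0.1], [4, -0.1]]
y = [1, 2, 3, 4]
print(elastic_net(X, y))  # noise coefficient ends up exactly 0.0
```

    Scaled up over 1500 candidate variables, the same mechanism is what shrinks the covariate set down to a handful of meaningful predictors.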

    Global Search for New Physics with 2.0/fb at CDF

    Data collected in Run II of the Fermilab Tevatron are searched for indications of new electroweak-scale physics. Rather than focusing on particular new physics scenarios, CDF data are analyzed for discrepancies with the standard model prediction. A model-independent approach (Vista) considers gross features of the data, and is sensitive to new large cross-section physics. Further sensitivity to new physics is provided by two additional algorithms: a Bump Hunter searches invariant mass distributions for "bumps" that could indicate resonant production of new particles; and the Sleuth procedure scans for data excesses at large summed transverse momentum. This combined global search for new physics in 2.0/fb of ppbar collisions at sqrt(s)=1.96 TeV reveals no indication of physics beyond the standard model. Comment: 8 pages, 7 figures. Final version which appeared in Physical Review D Rapid Communications.
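    The Bump Hunter idea — scan a binned invariant-mass distribution for a localized excess over a background estimated from the neighbouring sidebands — can be illustrated with a toy sliding-window scan. This is a simplification of the actual CDF algorithm (which fits the background and computes a Poisson significance); names and the flat-sideband background model here are assumptions:

```python
def bump_hunt(counts, window=3):
    """Scan a binned histogram for the window with the largest excess
    over a background estimated as the mean of its two sidebands.

    Returns (start_bin, excess). A real bump hunter would convert the
    excess into a statistical significance rather than a raw count.
    """
    best = (0, float("-inf"))
    for start in range(window, len(counts) - 2 * window + 1):
        signal = sum(counts[start:start + window])
        background = (sum(counts[start - window:start]) +
                      sum(counts[start + window:start + 2 * window])) / 2.0
        excess = signal - background
        if excess > best[1]:
            best = (start, excess)
    return best

# Toy mass histogram: flat background of 10 with a "resonance" in bins 4-6.
hist = [10, 10, 10, 10, 30, 30, 30, 10, 10, 10, 10]
print(bump_hunt(hist))  # -> (4, 60.0)
```

    Sleuth applies an analogous scan, but in summed transverse momentum rather than invariant mass.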

    Observation of Orbitally Excited B_s Mesons

    We report the first observation of two narrow resonances consistent with states of orbitally excited (L=1) B_s mesons using 1 fb^{-1} of ppbar collisions at sqrt{s} = 1.96 TeV collected with the CDF II detector at the Fermilab Tevatron. We use two-body decays into K^- and B^+ mesons reconstructed as B^+ \to J/\psi K^+, J/\psi \to \mu^+ \mu^- or B^+ \to \bar{D}^0 \pi^+, \bar{D}^0 \to K^+ \pi^-. We deduce the masses of the two states to be m(B_{s1}) = 5829.4 +- 0.7 MeV/c^2 and m(B_{s2}^*) = 5839.7 +- 0.7 MeV/c^2. Comment: Version accepted and published by Phys. Rev. Lett.

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (->\mu+\mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s)= 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. Comment: 8 pages, 2 figures, Phys. Rev. Lett. 109, 171802 (2012).

    Genetic determinism: how not to interpret behavioral genetics

    Recently, investigators in behavioral genetics have found loci on the genome (so-called ‘quantitative trait loci’ or QTLs) that are associated with complex mental traits, such as anxiety or novelty seeking. The interpretation of these findings raises interesting theoretical questions. At first sight, the discovery of ‘genes-for-personality’ seems to support genetic determinism and reductionism. Genetic determinism is the view that the phenotype is precoded in or determined by the genotype. However, evidence from developmental biology and neural modeling indicates that development is a result of interactive processes at many levels, not only the genome, so genetic determinism must be rejected. Identifying QTLs, and perhaps also the causal paths in the tangle of top-down and bottom-up influences between genome, organism and environment, is best seen as a simplification. It amounts to considerably less than reduction in the classical sense of replacement via bridge laws or elimination. It is argued that higher (psychological and physiological) levels are functionally characterized and are irreducible to molecular-genetic levels. Therefore, it is to be expected that ideas about inter-level relations may be useful in clarifying the relation between loci on the genome (QTLs), gene products, the nervous system, behavior and personality, and in helping to identify the contribution of genetic factors in behavioral genetics. © 2000, Sage Publications. All rights reserved.

    Statistical Mechanics of Horizontal Gene Transfer in Evolutionary Ecology

    The biological world, especially its majority microbial component, is strongly interacting and may be dominated by collective effects. In this review, we provide a brief introduction for statistical physicists to the way in which living cells communicate genetically through transferred genes, as well as the ways in which they can reorganize their genomes in response to environmental pressure. We discuss how genome evolution can be thought of as related to the physical phenomenon of annealing, and describe the sense in which genomes can be said to exhibit an analogue of information entropy. As a direct application of these ideas, we analyze the variation with ocean depth of transposons in marine microbial genomes, predicting trends that are consistent with recent observations using metagenomic surveys. Comment: Accepted by Journal of Statistical Physics.
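    The information-entropy analogue mentioned above reduces, in its simplest form, to the Shannon entropy of a sequence's base composition. A toy illustration of that baseline quantity (not the review's actual statistic; the function name and sequences are illustrative):

```python
from collections import Counter
from math import log2

def sequence_entropy(seq):
    """Shannon entropy (bits per base) of a nucleotide sequence's composition.

    A uniform A/C/G/T mix gives the maximum of 2 bits per base; a biased
    composition, as selection or annealing-like reorganization might
    produce, gives less.
    """
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

print(sequence_entropy("ACGT" * 10))  # uniform composition -> 2.0
print(sequence_entropy("AAAAAAAAAA"))  # single base -> 0.0
```

    Richer genome-level analogues would weight k-mers or gene content rather than single bases, but the same entropy formula underlies them.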