3,573 research outputs found

    Industrial production and capacity utilization: the 2005 annual revision

    On November 7, 2005, the Board of Governors of the Federal Reserve System issued revisions to its index of industrial production (IP) and the related measures of capacity and capacity utilization for the period from January 1972 through September 2005. For this period, both the levels and the rates of change were revised. For years before 1972, the levels, but not the rates of change, were also revised. Overall, the changes to total industrial production were small. Besides the revisions to the monthly data for IP and capacity utilization starting in 1972, the comparison base year for all production and capacity indexes was changed: the indexes are now expressed as percentages of output in 2002 instead of 1997. The rebasing affects all series from their start dates: 1919 for total IP and manufacturing IP, 1948 for manufacturing capacity, and 1967 for total industrial capacity.
    Keywords: Industrial production index; Industrial capacity
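
    The rebasing described above amounts to dividing every value of a series by its level in the new base period and multiplying by 100, which leaves growth rates untouched. A minimal sketch with made-up numbers (not Federal Reserve data):

```python
import numpy as np

def rebase(values, years, new_base_year):
    """Re-express an index series as a percentage of its average level in new_base_year."""
    base_level = np.mean([v for v, y in zip(values, years) if y == new_base_year])
    return [round(100.0 * v / base_level, 1) for v in values]

# Illustrative annual series originally expressed with 1997 = 100 (not real data).
years = list(range(1995, 2006))
old_index = [94.0, 97.1, 100.0, 103.2, 107.5, 112.0, 108.3, 108.9, 110.1, 113.0, 116.4]
new_index = rebase(old_index, years, 2002)   # now 2002 = 100
# Rebasing divides every value by the same constant, so growth rates are unchanged.
print(new_index)
```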

    Diagnostic techniques in deflagration and detonation studies

    Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity and, finally, the detonation process. As this is a wide field, the focus is on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson bar and plate impact studies. Studies made with inert simulants are also included, as these are important in differentiating between reactive response and purely mechanical behaviour.

    Sparse experimental design: an effective and efficient way of discovering better genetic algorithm structures

    The focus of this paper is the demonstration that sparse experimental design is a useful strategy for developing Genetic Algorithms. It is increasingly apparent from a number of reports and papers within a variety of different problem domains that the 'best' structure for a GA may depend on the application. The GA structure is defined as both the types of operators and the parameter settings used during operation. The differences observed may be linked to the nature of the problem, the type of fitness function, or the depth or breadth of the problem under investigation. This paper demonstrates that advanced experimental design may be adopted to increase the understanding of the relationships between the GA structure and the problem domain, facilitating the selection of improved structures with a minimum of effort.
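
    As a toy illustration of the idea rather than the authors' code, the sketch below runs a small GA on the OneMax problem for a handful of operator/parameter combinations drawn from the full factorial grid; a genuine sparse (fractional factorial) design would pick that subset systematically instead of at random.

```python
import itertools
import random

def run_ga(pop_size, mut_rate, crossover, generations=50, n_bits=40):
    """Tiny GA on the OneMax problem; returns the best fitness found."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)
        parents = pop[:pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            if crossover == "one_point":
                cut = random.randrange(1, n_bits)
                child = a[:cut] + b[cut:]
            else:                              # uniform crossover
                child = [random.choice(pair) for pair in zip(a, b)]
            child = [1 - g if random.random() < mut_rate else g for g in child]
            children.append(child)
        pop = parents + children
    return max(sum(ind) for ind in pop)

# A full factorial over these settings would need 3 * 3 * 2 = 18 runs;
# a sparse design deliberately covers the space with far fewer.
grid = list(itertools.product([20, 50, 100], [0.005, 0.02, 0.05], ["one_point", "uniform"]))
subset = random.sample(grid, 6)                # stand-in for a formal fractional design
for pop_size, mut_rate, crossover in subset:
    print(pop_size, mut_rate, crossover, run_ga(pop_size, mut_rate, crossover))
```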

    Emission Targets and Equilibrium Choice of Technique

    We study the technological pre-conditions for a cost-minimizing choice of technique in the presence of government emission targets on by-products of production. Whether a by-product is a desirable commodity or an undesirable pollutant is determined endogenously as part of the price-quantity equilibrium solution. Non-trivial counter-examples highlight the potential risk of over-ambitious pollution targets. We show that pollution targets can be supported by appropriate taxes provided that the technology allows for a certain type of labour-intensive pollution abatement activity. Our proof is constructive: the tax equilibria we posit can be computed by the Lemke Complementary Pivoting Algorithm.
    Keywords: Multisectoral Growth Theory, Choice of Technique, Pollution Taxes, Permit Markets, Lemke Algorithm
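
    The sketch below is not the Lemke algorithm itself; it is a brute-force linear complementarity (LCP) solver that enumerates complementary bases, workable only for very small problems, with an illustrative M and q rather than anything derived from the paper's model. It shows the kind of problem, find z >= 0 with w = Mz + q >= 0 and z'w = 0, that the Lemke pivoting procedure solves.

```python
import itertools
import numpy as np

def solve_lcp_bruteforce(M, q, tol=1e-9):
    """Find z >= 0 with w = M z + q >= 0 and z'w = 0 by enumerating complementary bases."""
    n = len(q)
    for k in range(n + 1):
        for basis in itertools.combinations(range(n), k):
            idx = list(basis)
            z = np.zeros(n)
            if idx:
                try:
                    z[idx] = np.linalg.solve(M[np.ix_(idx, idx)], -q[idx])
                except np.linalg.LinAlgError:
                    continue
            w = M @ z + q
            if (z >= -tol).all() and (w >= -tol).all():
                return z, w
    return None

# Illustrative 2x2 problem (not derived from the paper's model).
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-5.0, -6.0])
print(solve_lcp_bruteforce(M, q))
```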

    Fault-Tolerant Thresholds for Encoded Ancillae with Homogeneous Errors

    I describe a procedure for calculating thresholds for quantum computation as a function of error model, given the availability of ancillae prepared in logical states with independent, identically distributed errors. The thresholds are determined via a simple counting argument performed on a single qubit of an infinitely large CSS code. I give concrete examples of thresholds thus achievable for both Steane- and Knill-style fault-tolerant implementations and investigate their relation to threshold estimates in the literature.
    Comment: 14 pages, 5 figures, 3 tables; v2 minor edits, v3 completely revised, submitted to PR
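
    For orientation only: the paper's thresholds come from a counting argument on a single qubit of an infinitely large CSS code, not from the textbook concatenation recursion sketched below, but the recursion p_{l+1} ≈ A·p_l² shows why a threshold near 1/A exists for an assumed prefactor A.

```python
# Standard concatenation recursion, used here only to illustrate what a threshold is:
# below p_th the logical error rate shrinks with each level, above it the rate grows.
A = 300.0                      # illustrative combinatorial prefactor (an assumption)
p_th = 1.0 / A
for p in (0.5 * p_th, 1.5 * p_th):
    rates = [p]
    for _ in range(4):         # four levels of concatenation
        rates.append(A * rates[-1] ** 2)
    print(f"p = {p:.4g}: per-level logical error rates {[f'{r:.2e}' for r in rates]}")
```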

    Econometric modeling of business telecommunications demand using Retina and finite mixtures

    In this paper we estimate the business telecommunications demands for local, intra-LATA and inter-LATA services, using US data from a Bill Harvesting® survey carried out during 1997. We model heterogeneity, which is present among firms due to a variety of different business telecommunication needs, by estimating normal heteroskedastic mixture regressions. The results show that a three-component mixture model fits the demand for local services well, while a two-component structure is used to model intra-LATA and inter-LATA demand. We characterize the groups in terms of the differences among their coefficients, and then use Retina to perform automatic model selection over an expanded candidate regressor set which includes heterogeneity parameters as well as transformations of the original variables.
    Keywords: Telecommunication Demand Models, Local calls, Intra-LATA calls, Inter-LATA calls, Retina, Flexible Functional Forms, Heterogeneity, Finite Mixtures
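
    A self-contained toy analogue, not the authors' estimator or data: EM for a two-component mixture of linear regressions with component-specific variances, fitted to synthetic data, showing how mixture regressions capture unobserved heterogeneity across firms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two latent groups of "firms" with different demand equations.
n = 400
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])   # intercept + one regressor
group = rng.integers(0, 2, n)
betas_true = np.array([[1.0, 2.0], [8.0, -0.5]])
y = np.einsum("ij,ij->i", X, betas_true[group]) + rng.normal(0, 1.0, n)

# EM for a K-component mixture of linear regressions with component-specific variances.
K = 2
weights = np.full(K, 1.0 / K)
beta = np.linalg.solve(X.T @ X, X.T @ y) + rng.normal(scale=0.5, size=(K, X.shape[1]))
sigma2 = np.full(K, y.var())

for _ in range(200):
    # E-step: responsibility of each component for each observation.
    resid = y[:, None] - X @ beta.T                         # shape (n, K)
    dens = np.exp(-0.5 * resid**2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
    r = weights * dens
    r /= r.sum(axis=1, keepdims=True)
    # M-step: weighted least squares and variance update per component.
    for k in range(K):
        w = r[:, k]
        Xw = X * w[:, None]
        beta[k] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        sigma2[k] = (w * (y - X @ beta[k])**2).sum() / w.sum()
    weights = r.mean(axis=0)

print("mixing weights:", weights)
print("estimated coefficients per component:\n", beta)
```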