
    The US economy from 1992 to 1998: historical and decomposition simulations with the USAGE model

    USAGE is a 500-industry dynamic computable general equilibrium model of the US economy being developed at Monash University in collaboration with the US International Trade Commission. In common with the MONASH model of Australia, USAGE is designed for four modes of analysis: Historical, where we estimate changes in technology and consumer preferences; Decomposition, where we explain periods of economic history in terms of driving factors such as changes in technology and consumer preferences; Forecast, where we derive basecase forecasts for industries, occupations and regions that are consistent with trends from historical simulations and with available expert opinions; and Policy, where we derive deviations from basecase forecast paths caused by assumed policies. This paper reports our first set of historical and decomposition results. The historical results quantify several aspects of technical change in US industries for the period 1992 to 1998, including: intermediate-input-saving technical change; primary-factor-saving technical change; labor-capital bias in technical change; and import-domestic bias in technical change. The historical results also quantify shifts in consumer preferences between commodities. The decomposition results are applied in illustrative analyses of growth in US international trade between 1992 and 1998 and of growth in the US steel industry over this period.
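    The decomposition mode described above attributes observed changes to groups of exogenous driving factors. As an illustration only, and not the USAGE implementation, the toy sketch below performs a Johansen-style linearized decomposition: percentage changes in endogenous variables are treated as elasticity-weighted sums of exogenous shocks, and the contribution of each factor group is obtained by applying that group's shocks in isolation. The elasticity matrix and shock values are entirely hypothetical.

    # Illustration only: a Johansen-style linearized decomposition, not the USAGE code.
    # y = E @ x, where y holds % changes in endogenous variables, x holds exogenous
    # shocks, and E is a (hypothetical) matrix of elasticities. The contribution of a
    # factor group is E applied to that group's shocks with all other shocks set to zero.
    import numpy as np

    E = np.array([[0.8, -0.3, 0.5],     # hypothetical elasticities: rows = endogenous
                  [0.2,  0.9, -0.1]])   # variables, columns = exogenous shocks

    shocks = {                          # hypothetical % shocks, grouped by driving factor
        "technology":   np.array([2.0, 0.0, 0.0]),
        "preferences":  np.array([0.0, -1.0, 0.0]),
        "trade_policy": np.array([0.0, 0.0, 0.5]),
    }

    contributions = {name: E @ x for name, x in shocks.items()}
    total = sum(contributions.values())

    for name, c in contributions.items():
        print(f"{name:>12}: {c}")
    print(f"{'total':>12}: {total}")    # equals E @ (sum of all shocks) in a linear model

    In a linear model the factor contributions add up exactly to the total change, which is what makes this style of decomposition a clean accounting of economic history.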

    Study of Metal Magnetic Memory (MMM) Technique Using Permanently Installed Magnetic Sensor Arrays

    The metal magnetic memory (MMM) technique is marketed as a non-destructive testing method that evaluates stress concentration in ferromagnetic materials by measuring variations of the self-magnetic leakage field (SMLF) distribution on the specimen surface. The method has been adopted as a periodic screening inspection tool by several companies and has been reported to detect stress concentration in some instances. However, recent literature suggests that its performance can be unreliable, with many false calls, and the reason for these false calls has not been satisfactorily investigated. In this paper, the normal and tangential components of the stress-induced SMLF on the surface of specimens made from L80 steel, which is commonly used for pipes in the petrochemical industry, were measured by two permanently installed magnetic sensor arrays under the earth’s magnetic field. The results show that the SMLF changes are very small (about 10 to 30 A/m) in all cases for the material tested and occur mainly in the first elastic load cycle. A finite element (FE) model has been used to predict the spatial distribution of the magnetic field due to localized MMM. The FE studies show that the defect size and the applied background magnetic field influence the SMLF signal; however, the signal due to MMM could be swamped by any ferromagnetic material near the magnetic sensors. The simulation and experimental results demonstrate the high probability of false calls with the MMM technique even if there were a significant MMM effect in the material used.
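    A localized MMM source is often approximated analytically as a small magnetic dipole at the stress concentration. The sketch below is an illustration under that assumption, not the paper's FE model: it evaluates the normal and tangential field components along a scan line above a point dipole, which gives a feel for how a signal of a few tens of A/m arises and why nearby ferromagnetic clutter can swamp it. The dipole moment, lift-off and scan geometry are hypothetical.

    # Illustration only: H field of a point magnetic dipole along a scan line, a crude
    # stand-in for a localized MMM source; not the paper's finite element model.
    import numpy as np

    def dipole_H(moment, r_vec):
        """H field (A/m) of a point dipole `moment` (A*m^2) at offset r_vec (m):
        H = (3*(m.rhat)*rhat - m) / (4*pi*r**3)."""
        r = np.linalg.norm(r_vec)
        r_hat = r_vec / r
        return (3.0 * np.dot(moment, r_hat) * r_hat - moment) / (4.0 * np.pi * r**3)

    m = np.array([0.0, 0.0, 1e-6])           # hypothetical dipole moment, normal to surface
    liftoff = 2e-3                           # hypothetical sensor lift-off: 2 mm

    for x in np.linspace(-10e-3, 10e-3, 5):  # scan positions along the surface (m)
        H = dipole_H(m, np.array([x, 0.0, liftoff]))
        print(f"x = {x*1e3:+5.1f} mm  H_tangential = {H[0]:+8.2f} A/m  H_normal = {H[2]:+8.2f} A/m")

    With these made-up values the peak field is of order 10 to 20 A/m, comparable to the measured SMLF changes quoted above, which is far below typical stray fields from nearby steel structures.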

    Forecasting and Policy Analysis with a Dynamic CGE Model of Australia

    The main ideas in this paper are: (i) that CGE models can be used in forecasting; and (ii) that forecasts matter for policy analysis. We demonstrate these ideas by describing an application of MONASH, a dynamic CGE model of Australia, to the Australian motor vehicle industry over the period 1987 to 2016. The key to generating believable forecasts is to use detailed information available from expert groups specializing in the analysis of different aspects of the economy. In MONASH we incorporate forecasts by specialists: on the domestic macro economy; on Australian economic policy; on world commodity markets; on international tourism; on production technologies; and on consumer preferences. We have found that CGE forecasts incorporating such specialist information are readily saleable to public and private organizations concerned with investment, employment, training and education issues. This is partly because the economy-wide consistency guaranteed by the CGE approach enables users of economic intelligence to see the disparate forecasts dealing with different parts and aspects of the economy within an integrated perspective. Over the last thirty-five years, CGE models have been used almost exclusively as aids to "what if" (usually policy) analysis. In almost all cases it has been assumed that the effects of the shock under consideration are independent of the future path of the economy. Thus, for "what if" analysis, a common implicit view is that realistic basecase forecasts are unnecessary. Contrary to this view, we find that "what if" answers depend significantly on the basecase forecasts. This is not surprising when we are concerned with unemployment and other adjustment costs. However, we find that basecase forecasts are critical even when our concern is the long-run welfare implications of a policy change. For example, we find that the simulated long-run effects of a tariff cut on imported cars are strongly influenced by the basecase forecast of the rate of technical progress in the car industry relative to that in other industries.
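    The policy results discussed above are reported as deviations from a basecase forecast path. As a purely schematic illustration, and not the MONASH model, the sketch below evaluates a toy import/domestic sourcing rule under two assumed rates of technical progress and reports the long-run effect of a tariff cut as a deviation from the corresponding basecase; the deviation differs across basecases, which is the point made in the abstract. Every parameter is hypothetical.

    # Schematic illustration only (not MONASH): the long-run effect of a tariff cut on a
    # toy domestic industry, reported as a deviation from the basecase, depends on the
    # basecase rate of technical progress assumed for that industry.
    def domestic_share(years, tech_progress, tariff, sigma=4.0):
        """Toy sourcing rule: technical progress lowers the domestic price each year;
        the tariff raises the import price; shares follow a CES-style formula."""
        p_dom = (1.0 - tech_progress) ** years      # domestic price after `years` of progress
        p_imp = 1.0 + tariff                        # import price including the tariff
        return p_dom ** -sigma / (p_dom ** -sigma + p_imp ** -sigma)

    for tech in (0.01, 0.04):                       # slow vs. fast basecase technical progress
        base   = domestic_share(10, tech, tariff=0.10)
        policy = domestic_share(10, tech, tariff=0.05)   # the policy: halve the tariff
        deviation = 100.0 * (policy / base - 1.0)
        print(f"tech progress {tech:.0%}: long-run deviation in domestic share = {deviation:.1f}%")

    In this toy, faster assumed technical progress leaves the domestic industry more price-competitive in the basecase, so the same tariff cut produces a smaller proportional deviation.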

    Quantum Fields in a Big Crunch/Big Bang Spacetime

    We consider quantum field theory on a spacetime representing the Big Crunch/Big Bang transition postulated in the ekpyrotic or cyclic cosmologies. We show via several independent methods that an essentially unique matching rule holds connecting the incoming state, in which a single extra dimension shrinks to zero, to the outgoing state in which it re-expands at the same rate. For free fields in our construction there is no particle production from the incoming adiabatic vacuum. When interactions are included, the total particle production for fixed external momentum is finite at tree level. We discuss a formal correspondence between our construction and quantum field theory on de Sitter spacetime.

    A global analysis of genetic interactions in Caenorhabditis elegans

    Background: Understanding gene function and genetic relationships is fundamental to our efforts to better understand biological systems. Previous studies systematically describing genetic interactions on a global scale have either focused on core biological processes in protozoans or surveyed catastrophic interactions in metazoans. Here, we describe a reliable high-throughput approach capable of revealing both weak and strong genetic interactions in the nematode Caenorhabditis elegans.
    Results: We investigated interactions between 11 'query' mutants in conserved signal transduction pathways and hundreds of 'target' genes compromised by RNA interference (RNAi). Mutant-RNAi combinations that grew more slowly than controls were identified, and genetic interactions were inferred through an unbiased global analysis of the interaction matrix. A network of 1,246 interactions was uncovered, establishing the largest metazoan genetic-interaction network to date. We refer to this approach as systematic genetic interaction analysis (SGI). To investigate how genetic interactions connect genes on a global scale, we superimposed the SGI network on existing networks of physical, genetic, phenotypic and coexpression interactions. We identified 56 putative functional modules within the superimposed network, one of which regulates fat accumulation and is coordinated by interactions with bar-1(ga80), which encodes a homolog of β-catenin. We also discovered that SGI interactions link distinct subnetworks on a global scale. Finally, we showed that the properties of genetic networks are conserved between C. elegans and Saccharomyces cerevisiae, but that the connectivity of interactions within the current networks is not.
    Conclusions: Synthetic genetic interactions may reveal redundancy among functional modules on a global scale, which is a previously unappreciated level of organization within metazoan systems. Although the buffering between functional modules may differ between species, studying these differences may provide insight into the evolution of divergent form and function.
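    Interaction calls of this kind rest on comparing observed double-perturbation growth with an expectation derived from the single perturbations. The paper's own scoring is an unbiased global analysis of the full query x target matrix; the sketch below instead shows the common multiplicative-expectation rule as a simplified, hypothetical stand-in, using made-up fitness values and an arbitrary cutoff.

    # Simplified, hypothetical illustration of genetic-interaction scoring; the paper's
    # actual analysis is an unbiased global treatment of the whole query x target matrix.
    # Under a multiplicative model, the expected double-perturbation fitness is the product
    # of the single-perturbation fitnesses; a large negative residual suggests an interaction.
    single_mutant = {"queryA": 0.90, "queryB": 0.95}   # hypothetical relative growth rates
    single_rnai = {"gene1": 0.85, "gene2": 0.98}
    double = {                                         # hypothetical measured combinations
        ("queryA", "gene1"): 0.45,                     # much slower than expected
        ("queryA", "gene2"): 0.87,
        ("queryB", "gene1"): 0.80,
        ("queryB", "gene2"): 0.94,
    }

    THRESHOLD = -0.15                                  # hypothetical cutoff on the residual

    for (query, target), observed in double.items():
        expected = single_mutant[query] * single_rnai[target]
        score = observed - expected
        call = "interaction" if score < THRESHOLD else "no call"
        print(f"{query} x {target}: observed={observed:.2f} expected={expected:.2f} "
              f"score={score:+.2f} -> {call}")

    A global treatment of the matrix, as in SGI, additionally corrects for systematic row and column effects before calling interactions, which is what makes weaker interactions detectable.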

    Revising ethical guidance for the evaluation of programmes and interventions not initiated by researchers

    Public health and service delivery programmes, interventions and policies (collectively, ‘programmes’) are typically developed and implemented for the primary purpose of effecting change rather than generating knowledge. Nonetheless, evaluations of these programmes may produce valuable learning that helps determine effectiveness and costs as well as inform the design and implementation of future programmes. Such studies might be termed ‘opportunistic evaluations’, since they are responsive to emergent opportunities rather than being studies of interventions that are initiated or designed by researchers. However, current ethical guidance and registration procedures make little allowance for scenarios where researchers have played no role in the development or implementation of a programme, but nevertheless plan to conduct a prospective evaluation. We explore the limitations of the guidance and procedures with respect to opportunistic evaluations, providing a number of examples. We propose that one key missing distinction in current guidance is moral responsibility: researchers can only be held accountable for those aspects of a study over which they have control. We argue that requiring researchers to justify an intervention, programme or policy that would occur regardless of their involvement prevents or hinders research in the public interest without providing any further protections to research participants. We recommend that trial consent and ethics procedures allow for a clear separation of responsibilities for the intervention and the evaluation.

    White Dwarfs in Globular Clusters: HST Observations of M4

    Using WFPC2 on the Hubble Space Telescope, we have isolated a sample of 258 white dwarfs (WDs) in the Galactic globular cluster M4. Fields at three radial distances from the cluster center were observed and sizeable WD populations were found in all three. The location of these WDs in the color-magnitude diagram, their mean mass of 0.51 ± 0.03 M⊙, and their luminosity function confirm basic tenets of stellar evolution theory and support the results from current WD cooling theory. The WDs are used to extend the cluster main-sequence mass function upward to stars that have already completed their nuclear evolution. The WD/red dwarf binary frequency in M4 is investigated and found to be at most a few percent of all the main-sequence stars. The most ancient WDs found are about 9 Gyr old, a level which is set solely by the photometric limits of our data. Even though this is less than the age of M4, we discuss how these cooling WDs can eventually be used to check the turnoff ages of globular clusters and hence constrain the age of the Universe.
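    The claim that the faintest detected WDs set only a lower bound on the cluster age follows from the monotonic fading of cooling white dwarfs. In the rough Mestel-style approximation (a standard simplification, not the detailed cooling models used in the paper), the cooling age scales with luminosity as

        t_{\mathrm{cool}} \propto \left( \frac{M}{L} \right)^{5/7}

    so at fixed mass a WD observed at lower luminosity must be older; a photometric limit therefore truncates the observable cooling sequence (here at about 9 Gyr) rather than revealing the true cluster age.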