
    Validating Dynamic General Equilibrium Model Forecasts

    The maintained hypotheses embodied in structural general equilibrium models calibrated to data have tended to make economists and policy makers insecure about their empirical foundation. Advances in dynamic general equilibrium (DGE) theory and its empirical application have exacerbated this insecurity, since the forecasts provided by these models bring questions of validation to the forefront. Here, simple-to-implement methods are developed to measure the magnitude of bias in DGE forecasts. We adopt the concordance correlation measure and introduce a time function method to assess the bias in DGE forecasts. A time-series confidence interval method is also introduced to formally distinguish "good" forecasts from "bad" ones. A calibrated DGE model is used to illustrate these methods. The time function method allows the choice of a functional form and an upper bound on forecast error. The time-series confidence interval method allows the DGE results to be evaluated against the standard of rival time-series models. If the DGE results are as good as time-series forecasts, the DGE model is the superior framework because of its advantage in providing not only "good" forecasts but also insights into the economic structure generating the results. To illustrate these methods, we calibrate a multi-sector Ramsey-based DGE model to Taiwanese data for the year 1988. The model is shown to forecast various dimensions of the economy with surprisingly good, but varying, accuracy. The proposed validation measures prove effective in distinguishing among diverse model parameter values and in detecting model improvements. The measures are also statistically meaningful and require no arbitrary probabilistic assumptions on the distribution of either the results or the data.
    Research Methods/Statistical Methods
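    The concordance correlation measure referenced above is, in its usual formulation, Lin's concordance correlation coefficient, which jointly penalises imprecision and location/scale bias between a forecast series and the observed data. A minimal sketch of how such a measure could be computed follows; the function name and the use of NumPy are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def concordance_correlation(forecast, observed):
    """Lin's concordance correlation coefficient (CCC).

    Returns 1 for perfect agreement; values below the Pearson
    correlation reflect additional location/scale bias between
    the forecast and observed series.
    """
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    mean_f, mean_o = forecast.mean(), observed.mean()
    var_f, var_o = forecast.var(), observed.var()  # population variances
    covariance = ((forecast - mean_f) * (observed - mean_o)).mean()
    return 2.0 * covariance / (var_f + var_o + (mean_f - mean_o) ** 2)

# Example: a forecast that tracks the data but runs slightly high
observed = np.array([1.0, 1.2, 1.5, 1.9, 2.4])
forecast = observed * 1.05 + 0.1
print(concordance_correlation(forecast, observed))  # close to, but below, 1
```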

    Transitioning Applications to Semantic Web Services: An Automated Formal Approach

    Semantic Web Services have been recognized as a promising technology that exhibits huge commercial potential and attracts significant attention from both industry and the research community. Despite expectations being high, the industrial take-up of Semantic Web Service technologies has been slower than expected. One of the main reasons is that many systems have been developed without considering the potential of the web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a very tedious and expensive process, which carries a definite risk of failure. There is an urgent need for strategies that allow the migration of legacy systems to Semantic Web Services platforms, and for tools to support such strategies. In this paper we propose a methodology for transitioning these applications to Semantic Web Services by taking advantage of rigorous mathematical methods. Our methodology allows users to migrate their applications to a Semantic Web Services platform automatically or semi-automatically.

    An Anatomy of Moroccan Agricultural Trade

    Morocco is engaged in a number of economic reforms to better position the country's integration into world markets. Her agricultural sector is particularly important, as its shares of trade, GDP, and employment are relatively large. We analyze Morocco's agricultural trade growth trends over the past 40 years (1962-2004) using SITC 4-digit bilateral agricultural trade data. The data are analyzed using the trend and cycles decomposition (TCD) approach and a measurement of trade growth at the intensive and extensive margins. We find a high concentration of agricultural trade in both commodities and trading partners. Morocco has also lost export share in the EU to other EU countries in her top exported commodities. Another finding is that agricultural export growth for Morocco occurred at the intensive rather than the extensive margin. This poses a great challenge for Morocco if she is to expand trade at the extensive margin.
    International Relations/Trade
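    One common way to operationalise the intensive/extensive margin distinction is to split export growth between partner-commodity pairs traded in both periods (intensive) and pairs that enter or exit (extensive). The sketch below illustrates that accounting; the data layout, function name, and decomposition rule are illustrative assumptions, not the paper's actual procedure:

```python
# Decompose export growth into intensive and extensive margins.
# Exports are keyed by (partner, SITC-4 commodity) pairs; this layout
# and the decomposition rule are illustrative assumptions.

def margin_decomposition(exports_t0, exports_t1):
    """Split the change in total exports between two periods into:
    - intensive margin: growth in pairs traded in both periods
    - extensive margin: value of entering pairs minus exiting pairs
    """
    common = exports_t0.keys() & exports_t1.keys()
    intensive = sum(exports_t1[k] - exports_t0[k] for k in common)
    entering = sum(v for k, v in exports_t1.items() if k not in exports_t0)
    exiting = sum(v for k, v in exports_t0.items() if k not in exports_t1)
    return intensive, entering - exiting

exports_1962 = {("France", "0511"): 30.0, ("Germany", "0542"): 12.0}
exports_2004 = {("France", "0511"): 55.0, ("Spain", "0544"): 20.0}
print(margin_decomposition(exports_1962, exports_2004))  # (25.0, 8.0)
```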

    A novel integrative risk index of papillary thyroid cancer progression combining genomic alterations and clinical factors.

    Although the majority of papillary thyroid cancer (PTC) is indolent, a subset of PTC behaves aggressively despite the best available treatment. A major clinical challenge is to reliably distinguish, early on, those patients who need aggressive treatment from those who do not. Using a large cohort of PTC samples obtained from The Cancer Genome Atlas (TCGA), we analyzed the association between disease progression and multiple forms of genomic data, such as the transcriptome, somatic mutations, and somatic copy number alterations, and found that genes related to the FOXM1 signaling pathway were significantly associated with PTC progression. Integrative genomic modeling was performed, controlling for demographic and clinical characteristics including patient age, gender, TNM stage, histological subtype, and history of other malignancy, using a leave-one-out elastic net model with 10-fold cross-validation. For each subject, the model fitted to the remaining subjects was used to determine the risk index, defined as a linear combination of the clinical and genomic variables selected by the elastic net model, and the stability of the risk index distribution was assessed through 2,000 bootstrap resamples. We developed a novel approach combining genomic alterations and patient-related clinical factors that delineates the subset of patients who have more aggressive disease from those whose tumors are indolent and will likely require less aggressive treatment and surveillance (p = 4.62 × 10⁻¹⁰, log-rank test). Our results suggest that risk index modeling combining genomic alterations with current staging systems provides an opportunity for more effective anticipation of disease prognosis and therefore enhanced precision management of PTC.
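    As a rough illustration of the leave-one-out risk-index scheme described above, the sketch below fits an elastic net to all subjects except one and scores the held-out subject with the resulting linear combination. The scikit-learn usage, penalty grid, and synthetic data are assumptions made for illustration; the actual TCGA pipeline and outcome model are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_subjects, n_features = 60, 25  # synthetic stand-in for clinical + genomic variables
X = rng.normal(size=(n_subjects, n_features))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_subjects)  # toy progression outcome

risk_index = np.empty(n_subjects)
for train_idx, test_idx in LeaveOneOut().split(X):
    # Elastic net with penalty strength/mixing chosen by 10-fold CV
    # on the remaining subjects, as in the abstract.
    model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=10, max_iter=10_000)
    model.fit(X[train_idx], y[train_idx])
    # Risk index: linear combination of the held-out subject's variables.
    risk_index[test_idx] = model.predict(X[test_idx])

print(risk_index[:5])
```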

    Polydimethylsiloxane (PDMS)-based microfluidic channel with integrated commercial pressure sensors

    The precise characterisation of boiling in microchannels is essential for the optimisation of applications requiring two-phase cooling. In this paper, polydimethylsiloxane (PDMS) is employed to make microchannels for characterising microboiling. In particular, the material properties of PDMS facilitate rapid prototyping, and its optical transparency provides the capability to directly view the fluid flow. The production of microchannels is complicated by the need to integrate custom-made sensors. This paper presents a PDMS microfluidic device with integrated commercial pressure sensors, which has been used to perform a detailed characterisation of microboiling phenomena. The proposed approach of integrating commercial pressure sensors into the channel also has potential applications in a range of other microsystems.

    Old supernova dust factory revealed at the Galactic center

    Dust formation in supernova ejecta is currently the leading candidate to explain the large quantities of dust observed in the distant, early Universe. However, it is unclear whether the ejecta-formed dust can survive the hot interior of the supernova remnant (SNR). We present infrared observations of $\sim 0.02\,M_\odot$ of warm (~100 K) dust seen near the center of the ~10,000 yr-old Sgr A East SNR at the Galactic center. Our findings signify the detection of dust within an older SNR that is expanding into a relatively dense surrounding medium ($n_e \sim 100\ \mathrm{cm}^{-3}$) and has survived the passage of the reverse shock. The results suggest that supernovae may indeed be the dominant dust production mechanism in the dense environment of early Universe galaxies.
    Comment: 25 pages, 5 figures. Includes supplementary materials. Published online March 19, 2015 in Science Express.

    A Framework for Interaction and Cognitive Engagement in Connectivist Learning Contexts

    Interaction has always been highly valued in education, especially in distance education (Moore, 1989; Anderson, 2003; Chen, 2004a; Woo & Reeves, 2007; Wang, 2013; Conrad, in press). It has been associated with motivation (Mahle, 2011; Wen-chi et al., 2011), persistence (Tello, 2007; Joo, Lim, & Kim, 2011), deep learning (Offir et al., 2008), and other components of effective learning. With the development of interactive technologies and related connectivist learning theories (Siemens, 2005a; Downes, 2005), interaction theory has expanded to include interactions not only with human actors, but also with machines and digital artifacts. This paper explores the characteristics and principles of connectivist learning in an increasingly open and connected age. A theory-building methodology is used to create a new theoretical model which we hope can be used by researchers and practitioners to examine and support multiple types of effective educational interactions. Inspired by the hierarchical model for instructional interaction (HMII) (Chen, 2004b) in distance learning, a framework for interaction and cognitive engagement in connectivist learning contexts has been constructed. Based on cognitive engagement theories, the interaction of connectivist learning is divided into four levels: operation interaction, wayfinding interaction, sensemaking interaction, and innovation interaction. Connectivist learning is thus a networking and recursive process across these four levels of interaction.