
    Unemployment Effects of Trade with a Low-Wage Country: A Minimum-Wage Model with Sector-Specific Factors

    Contrary to conventional wisdom, this paper shows that a high-wage economy can paradoxically reduce its aggregate unemployment by engaging in international trade with a low-wage country. We demonstrate this possibility by introducing a minimum wage into the basic specific-factors model (with immobile capital and mobile labor), even though the opposite result is known to arise in the longer-run framework of the standard Heckscher-Ohlin-Samuelson model (in which both inputs are mobile). Our result provides a cautionary note for public-policy discussions that promote trade barriers as a way to reduce unemployment.

    Improving the Design of EQ-5D Value Set Studies for China and Beyond

    This thesis comprises six studies examining different aspects of EQ-5D use in China. The population norm study provided the first norm based on urban Chinese self-reported health status, which not only offered insight into HRQoL variation among subgroups but also serves as a reference point for quantifying disease burden and intervention effects. The subsequent methodological studies offered suggestions for improving the design of future valuation studies, which may strengthen health technology assessment and cost-utility analysis in China. Since EQ-5D is the most widely used instrument of its kind worldwide, it is a good candidate for this role in China.
    _Chapter 2_ describes a descriptive exercise performed to report EQ-5D-5L norm scores in the urban Chinese population. Additional analysis tested whether self-reported HRQoL varied between demographic groups; HRQoL outcomes did indeed differ by age, gender, education level, health insurance status, employment status, and residence of origin.
    In _Chapter 3_, individual-level inconsistency in the time trade-off (TTO) task was related to factors that varied during the interview, and the interviewer, rather than respondent characteristics, proved most important for reducing individual-level inconsistency. The results suggest that the valuation process may have been influenced by interviewer effects before the introduction of the Quality Control (QC) tool implemented in later EQ-5D-5L TTO research.
    Starting with _Chapter 4_, possible designs for EQ-5D valuation studies were systematically examined and compared. First, in _Chapter 4_, an EQ-5D-3L saturated dataset was used as a gold standard to compare two often-mentioned but somewhat conflicting principles for selecting health states for direct valuation: the commonness (prevalence) of health states versus the statistical efficiency of a design. By simulating the modelling process, it was found that statistical efficiency outweighed commonness in achieving sufficient prediction accuracy for non-valued states. This suggests that the designs of previous valuation studies were not optimal, and that future valuation studies could use a smaller design provided its statistical efficiency is guaranteed.
    Next, the principle of commonness as a proxy for implausibility in health-state selection was examined: in _Chapter 5_, university students valued all EQ-5D-5L states and judged the implausibility of each state. Respondents lacked agreement about which states were implausible. As there was no universally implausible state, the mean value of a state among respondents who judged it implausible was compared with that among respondents who judged it plausible. Values from implausible observations were lower, but still in agreement with values from plausible observations.
    Building on the design-selection experience with EQ-5D-3L, _Chapter 6_ aimed to test the current EQ-VT design and to identify a possible smaller design for EQ-5D-5L valuation studies. The good performance of an orthogonal design was confirmed again with EQ-5D-5L data: an orthogonal design with 25 states performed as well as the EQ-VT design with 86 states in terms of prediction accuracy for all 3,125 states. In _Chapter 7_, the most efficient TTO design (the orthogonal one) was tested against the standard EQ-VT design and was again found favorable.
    In sum, this thesis attempts to understand the possible effects of sample and design choices in previous Chinese valuation studies. Its findings may also generalize to EQ-5D studies in other countries and to valuation studies employing instruments other than EQ-5D.
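The orthogonal-design idea discussed above can be illustrated with a small sketch. The construction below is an illustrative assumption, not the thesis's or the EQ-VT's actual state selection: it builds a 25-state orthogonal array for a 5-dimension, 5-level descriptive system such as EQ-5D-5L from mutually orthogonal Latin squares of order 5.

```python
def orthogonal_25_state_design():
    """Build an OA(25, 5, 5, 2): 25 health states over 5 dimensions with
    5 levels each, such that every pair of dimensions covers all 25
    level combinations exactly once (pairwise orthogonality)."""
    states = []
    for i in range(5):
        for j in range(5):
            # Columns j, i, i+j, i+2j, i+3j (mod 5) are pairwise
            # orthogonal; shift from 0..4 to EQ-5D-style levels 1..5.
            row = (j, i, (i + j) % 5, (i + 2 * j) % 5, (i + 3 * j) % 5)
            states.append(tuple(level + 1 for level in row))
    return states
```

Because any two dimensions jointly exhaust all 25 level pairs, main effects in an additive value model can be estimated without confounding from far fewer than the 3,125 possible states.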

    IT, Production Specialization, and Division of Labor: A Smith-Ricardo Model of International Trade

    A simple Smith-Ricardo model is developed that incorporates the division of labor into the continuum-goods Ricardian model of Dornbusch et al. (1977). The trade-off between the efficiency gain and the coordination cost associated with production specialization determines the efficient level of the division of labor. The model is applied to explain how the recent IT revolution could affect a country's efficient level of production specialization and competitive advantage. In particular, absolute advantage (in the division of labor) and relative labor supply play a crucial role in determining the effects of IT progress on a country's competitive margin in international trade.
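Since the model builds on the Dornbusch et al. (1977) continuum-goods framework, the role of relative labor supply in locating the competitive margin can be sketched numerically. The productivity schedule A(z) = e^(-z) and the uniform expenditure shares below are illustrative assumptions, not the paper's specification:

```python
import math

def dfs_cutoff(A, rel_labor, tol=1e-12):
    """Solve A(z) = (z / (1 - z)) * rel_labor for z in (0, 1) by bisection.

    A: decreasing relative home productivity schedule a*(z)/a(z);
    rel_labor: foreign-to-home labor ratio L*/L.
    In the DFS (1977) model with uniform expenditure shares, the home
    country produces goods z below the cutoff and imports the rest.
    """
    lo, hi = 1e-9, 1.0 - 1e-9
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if A(mid) > (mid / (1.0 - mid)) * rel_labor:
            lo = mid  # home still holds the cost advantage at mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

A = lambda z: math.exp(-z)     # illustrative productivity schedule
z_equal = dfs_cutoff(A, 1.0)   # equal labor supplies
z_small = dfs_cutoff(A, 2.0)   # foreign labor force twice as large
```

A larger relative foreign labor supply cheapens foreign labor in equilibrium and shrinks the home competitive margin (z_small < z_equal); the paper's contribution is that IT progress also shifts the A(z) schedule itself through the efficient division of labor.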

    A New Push on an Old Fundamental: Understanding the Patterns of Outsourcing

    Which kinds of intermediate inputs and services do firms tend to outsource? This paper develops a model of two-stage production in which economies of scope are central to the production of both the intermediate and the final good. The model explains the patterns of outsourcing in terms of the degree of product differentiation, economies of scope, and economies of scale in the production of the intermediate input relative to that of the final good. The recent surge in outsourcing activity is explained by a new push (progress in general-purpose technology, e.g., information technology) on an old fundamental (economies of scope in production).

    Why Take on the Tobacco Industry: The Political Economy of Government Anti-smoking Campaign

    A political-economy model is developed to show that government anti-smoking campaigns can strengthen the government's position in political bargaining with the tobacco industry by reducing the industry's alternative welfare. Although the equilibrium regulation of the tobacco industry tightens as a result of government anti-smoking campaigns, the political contribution from the tobacco industry will not necessarily fall: anti-smoking campaigns reduce the tobacco industry's welfare, but its potential loss from not lobbying increases. An incumbent government or politician will increase its anti-smoking campaign effort as it becomes hungrier for political contributions, and under plausible conditions this can indeed bring in more political contributions from the tobacco industry.

    A Co9S8 microsphere and N-doped carbon nanotube composite host material for lithium-sulfur batteries

    Lithium-sulfur batteries have emerged as highly promising energy storage devices owing to their high specific capacity and energy density, safety, and low cost. Unfortunately, their wide application is hampered by several issues, such as low electronic conductivity, slow redox kinetics, severe volumetric expansion, and the polysulfide “shuttle effect”. To overcome these issues, we design and synthesize a composite sulfur host material of Co9S8 microspheres and N-doped carbon nanotubes, in which the metallic sulfide Co9S8, with its good conductivity, immobilizes the polar lithium polysulfides through strong polar chemisorption, while the one-dimensional N-doped carbon nanotubes provide channels for fast electron and lithium-ion transport. Because the lithium polysulfides are well confined and the redox conversions are promoted, the Co9S8@N-CNTs/S-based lithium-sulfur battery delivers superior energy storage performance, exhibiting a large specific capacity of 1233 mAh g-1 at 0.1 C and outstanding cycling performance, with a low decay of 0.045% per cycle and a Coulombic efficiency above 99% after 1000 cycles.

    Excitonic states of an impurity in a Fermi gas


    Whole-genome sequencing of spermatocytic tumors provides insights into the mutational processes operating in the male germline

    Adult male germline stem cells (spermatogonia) proliferate by mitosis and, after puberty, generate spermatocytes that undergo meiosis to produce haploid spermatozoa. Germ cells are under evolutionary constraint to curtail mutations and maintain genome integrity. Despite constant turnover, spermatogonia very rarely form tumors, so-called spermatocytic tumors (SpT). In line with the previous identification of selfish FGFR3 and HRAS mutations in a subset of cases, candidate-gene screening of 29 SpTs identified an oncogenic NRAS mutation in two cases. To gain insight into the etiology of SpT and into properties of the male germline, we performed whole-genome sequencing of five tumors (4/5 with matched normal tissue). The acquired single-nucleotide variant load was extremely low (~0.2 per Mb), with an average of 6 (2–9) no

    Incorporating rich background knowledge for gene named entity classification and recognition

    <p>Abstract</p> <p>Background</p> <p>Gene named entity classification and recognition are crucial preliminary steps for text mining of the biomedical literature. Machine learning based methods have been used in this area with great success. In most state-of-the-art systems, elaborately designed lexical features, such as words, n-grams, and morphology patterns, have played a central part. However, this type of feature tends to cause extreme sparseness in the feature space. As a result, out-of-vocabulary (OOV) terms in the training data are not modeled well due to lack of information.</p> <p>Results</p> <p>We propose a general framework for gene named entity representation, called feature coupling generalization (FCG). The basic idea is to generate higher-level features using term frequency and co-occurrence information of highly indicative features in a huge amount of unlabeled data. We examine its performance on a named entity classification task designed to remove non-gene entries from a large dictionary derived from online resources. The results show that the new features generated by FCG outperform lexical features by 5.97 in F-score overall and by 10.85 for OOV terms. Within this framework, each extension yields significant improvements, and the sparse lexical features can be transformed into a lower-dimensional and more informative representation. A forward maximum match method based on the refined dictionary produces an F-score of 86.2 on the BioCreative 2 GM test set. We then combined the dictionary with a conditional random field (CRF) based gene mention tagger, achieving an F-score of 89.05, which improves the performance of the CRF-based tagger by 4.46 with little impact on the efficiency of the recognition system. A demo of the NER system is available at <url>http://202.118.75.18:8080/bioner</url>.</p>
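The forward maximum match step can be sketched as follows; the whitespace tokenization, the `max_len` window, and the dictionary entries here are illustrative assumptions, not the authors' actual refined dictionary or matcher:

```python
def forward_max_match(tokens, dictionary, max_len=6):
    """Greedy left-to-right longest-match dictionary lookup.

    tokens: list of word tokens; dictionary: set of entries whose
    multi-word forms are joined by single spaces.
    Returns (start, end, entry) spans over token indices.
    """
    mentions = []
    i = 0
    while i < len(tokens):
        for length in range(min(max_len, len(tokens) - i), 0, -1):
            candidate = " ".join(tokens[i:i + length])
            if candidate in dictionary:
                mentions.append((i, i + length, candidate))
                i += length  # consume the matched span
                break
        else:
            i += 1  # no entry starts here; advance one token
    return mentions

# Hypothetical dictionary for illustration:
gene_dict = {"p53", "tumor necrosis factor", "tumor necrosis factor alpha"}
spans = forward_max_match("the tumor necrosis factor alpha gene".split(),
                          gene_dict)
# -> [(1, 5, "tumor necrosis factor alpha")]
```

Greedily taking the longest match at each position is what lets the matcher prefer "tumor necrosis factor alpha" over its shorter prefix entry.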