66 research outputs found

    Schumpeterian Churn Dynamics and Regional Productivity Performance

    Get PDF
    This study tests empirically the Schumpeterian premise that the incessant turbulence of an economy in motion, apart from a production function comprised of static inputs, is capable of explaining patterns of economic growth and change. Localized employment churn, registered as job creation/destruction dynamics, is used to account for variations in U.S. metro-regional economic productivity performances during the 1986-99 period. The empirical results suggest that employment turnover and replacement dynamics have large and significant positive effects on localized productivity growth, independent of a variety of industrial restructuring processes occurring simultaneously. While employment churn effects are robust across U.S. Census regions, they do not exert a uniform influence on metro-regional productivity performances across time. Until 1996, job creation and destruction dynamics often canceled each other out as metro-regions underwent continued industrial restructuring. Since 1996, however, the positive effects on metro-region productivity growth have been consistently strong. In addition to a strong positive effect on productivity from the emergence of a localized IT sector, both an expanding service sector share of regional employment and a rising public spending share of regional output exert powerful downward pressure on productivity growth rates.
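A job creation/destruction churn measure of the kind the abstract describes can be sketched as follows. This is an illustrative Davis-Haltiwanger-style construction over establishment-level employment counts, not the paper's exact measure; the function name and the average-size denominator are assumptions.

```python
import numpy as np

def churn_rates(emp_prev, emp_curr):
    """Job creation, destruction, and gross churn rates between two periods.

    Illustrative sketch: rates are sums of positive (creation) and negative
    (destruction) employment changes, normalized by average total employment.
    """
    emp_prev = np.asarray(emp_prev, dtype=float)
    emp_curr = np.asarray(emp_curr, dtype=float)
    change = emp_curr - emp_prev
    base = 0.5 * (emp_prev + emp_curr)  # average-size denominator
    creation = change[change > 0].sum() / base.sum()
    destruction = -change[change < 0].sum() / base.sum()
    return creation, destruction, creation + destruction
```

On this construction, a region where one plant adds 10 jobs while another sheds 10 shows zero net growth but a churn rate of 10%, which is exactly the offsetting pattern the paper reports for the pre-1996 period.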

    Efficient Latency-Aware CNN Depth Compression via Two-Stage Dynamic Programming

    Full text link
    Recent works on neural network pruning advocate that reducing the depth of the network is more effective in reducing run-time memory usage and accelerating inference latency than reducing the width of the network through channel pruning. In this regard, some recent works propose depth compression algorithms that merge convolution layers. However, the existing algorithms have a constricted search space and rely on human-engineered heuristics. In this paper, we propose a novel depth compression algorithm which targets general convolution operations. We propose a subset selection problem that replaces inefficient activation layers with identity functions and optimally merges consecutive convolution operations into shallow equivalent convolution operations for efficient end-to-end inference latency. Since the proposed subset selection problem is NP-hard, we formulate a surrogate optimization problem that can be solved exactly via two-stage dynamic programming within a few seconds. We evaluate our methods and baselines by TensorRT for a fair inference latency comparison. Our method outperforms the baseline method with higher accuracy and faster inference speed in MobileNetV2 on the ImageNet dataset. Specifically, we achieve 1.41× speed-up with 0.11%p accuracy gain in MobileNetV2-1.0 on ImageNet. (Comment: ICML 2023; code at https://github.com/snu-mllab/Efficient-CNN-Depth-Compressio)
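The core identity behind merging consecutive convolutions, once the activation between them is replaced by the identity function, is that convolution is associative: two stacked linear convolutions equal one convolution whose kernel is the convolution of the two kernels. The paper's algorithm handles general 2-D convolutions and selects which activations to remove via dynamic programming; the sketch below only verifies the underlying merge identity in 1-D with NumPy.

```python
import numpy as np

# Two consecutive 1-D convolutions with no nonlinearity between them...
x = np.random.default_rng(0).standard_normal(16)
k1 = np.array([1.0, 2.0, 1.0])
k2 = np.array([0.5, -0.5])

y_two = np.convolve(np.convolve(x, k1), k2)  # layer-by-layer

# ...equal a single convolution with the merged (pre-convolved) kernel.
merged = np.convolve(k1, k2)                 # length 3 + 2 - 1 = 4
y_one = np.convolve(x, merged)

assert np.allclose(y_two, y_one)
```

The merged kernel is wider (receptive field is preserved) but the two layers collapse into one pass over the input, which is the source of the latency savings.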

    Is neighborhood poverty harmful to every child? Neighborhood poverty, family poverty, and behavioral problems among young children

    Full text link
    This longitudinal study investigates the association between neighborhood poverty and behavioral problems among young children. This study also examines whether social environments mediate the relationship between neighborhood poverty and behavioral problems. We used data from the third and fourth waves of the Fragile Families and Child Wellbeing study to assess behavioral problems separately for children who experienced no family poverty, moved out of family poverty, moved into family poverty, and experienced long‐term family poverty. Regression models assessed the effect of neighborhood poverty on behavioral problem outcomes among children aged 5 years, after controlling for sociodemographic characteristics and earlier behavioral problems. Results showed an association between neighborhood poverty and lower social cohesion and safety, which led to greater externalizing problems among children with long‐term family poverty living in high‐poverty neighborhoods compared with those in low‐poverty neighborhoods. Policies and community resources need to be allocated to improve neighborhood social environments, particularly for poor children in high‐poverty neighborhoods. Peer reviewed.
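The abstract's lagged-outcome design, regressing a later behavioral score on a neighborhood-poverty indicator while controlling for the earlier score, can be sketched with ordinary least squares. The variable names and simulated data below are illustrative assumptions, not the study's actual variables or estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simulated stand-ins for the study's measures (illustrative only).
prior_behavior = rng.standard_normal(n)          # earlier behavioral score
nbhd_poverty = rng.binomial(1, 0.3, n).astype(float)  # high-poverty indicator
outcome = 0.5 * prior_behavior + 0.3 * nbhd_poverty + rng.standard_normal(n)

# Lagged regression: outcome on neighborhood poverty, net of earlier behavior.
X = np.column_stack([np.ones(n), prior_behavior, nbhd_poverty])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
# beta[2] estimates the neighborhood-poverty effect after the control.
```

Controlling for the earlier score is what lets the coefficient on neighborhood poverty be read as an association with *change* in behavioral problems rather than with pre-existing differences.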

    Joint triplet loss with semi-hard constraint for data augmentation and disease prediction using gene expression data

    No full text
    Abstract The accurate prediction of patients with complex diseases, such as Alzheimer’s disease (AD), as well as disease stages, including early- and late-stage cancer, is challenging owing to substantial variability among patients and limited availability of clinical data. Deep metric learning has emerged as a promising approach for addressing these challenges by improving data representation. In this study, we propose a joint triplet loss model with a semi-hard constraint (JTSC) to represent data in a small number of samples. JTSC strictly selects semi-hard samples by switching anchors and positive samples during the learning process in triplet embedding and combines a triplet loss function with an angular loss function. Our results indicate that JTSC significantly improves the number of appropriately represented samples during training when applied to the gene expression data of AD and to cancer stage prediction tasks. Furthermore, we demonstrate that using an embedding vector from JTSC as an input to the classifiers for AD and cancer stage prediction significantly improves classification performance by extracting more accurate features. In conclusion, we show that feature embedding through JTSC can aid in classification when the number of samples is small relative to the number of features.
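The standard semi-hard constraint the abstract builds on selects a negative that is farther from the anchor than the positive, but still inside the margin band. The sketch below implements that standard criterion with NumPy; it is not the exact JTSC loss, which additionally swaps anchor/positive roles during mining and adds an angular-loss term.

```python
import numpy as np

def triplet_loss_semihard(anchor, positive, negatives, margin=0.2):
    """Triplet loss using a semi-hard negative.

    Semi-hard: d(a, n) > d(a, p) but d(a, n) < d(a, p) + margin.
    Falls back to the hardest negative if no semi-hard one exists.
    Illustrative sketch of standard semi-hard mining, not the JTSC loss.
    """
    d_ap = np.linalg.norm(anchor - positive)
    d_an = np.linalg.norm(negatives - anchor, axis=1)
    band = (d_an > d_ap) & (d_an < d_ap + margin)
    d_neg = d_an[band].min() if band.any() else d_an.min()
    return max(d_ap - d_neg + margin, 0.0)
```

Semi-hard negatives give a non-zero but bounded loss, which avoids both the vanishing gradients of easy negatives and the training collapse that the hardest negatives can cause, a property that matters most in exactly the small-sample regime the paper targets.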

    Radio resource metric estimation

    No full text

    A New Digital Value Chain Model with PLC in Biopharmaceutical Industry: The Implication for Open Innovation

    No full text
    The advancement of technology in the biopharmaceutical industry has accelerated the speed and scale of digitalization in various aspects. New drug development is becoming more complex, costly, and challenging. This paper examines how the pharmaceutical value chain model could integrate with the product-life-cycle perspective to better explain the drug development process changes and how the digital transformation could be implemented at each stage of the drug development process. We suggest a new framework to capture digital value creation in the biopharmaceutical industry by focusing on the specific processes in new drug development and integrating the product life cycle management and value chain model. The new framework has operational implications for the biopharmaceutical industry, where digital transformation can simplify and increase the efficiency of each phase, from drug discovery, clinical trials, regulatory approval, manufacturing, and commercialization to monitoring processes.
