
    Controlled functionalization of crystalline polyolefins and their application in soluble polymer support

    Functionalization of polyolefins has been recognized as a useful methodology for generating new materials with a wide range of applications. Recently, crystalline and semi-crystalline polyolefins have drawn increasing attention in both industry and academia as one of the most interesting classes of engineering plastics, owing to their remarkable physical and mechanical properties. This dissertation describes: (1) novel methods for the direct post-functionalization of crystalline polyolefins to introduce functionality, (2) characterization of the functionalized polymers to analyze their structures, molecular-weight properties, thermal properties, and hydrophilicity, and (3) an application of the modified crystalline polystyrene as a soluble polymer support for recyclable catalysts in green chemistry. Chapter 1 describes the controlled iridium-catalyzed C-H activation of commercial polystyrenes with three types of tacticity. The resulting boronic groups in the polymers were further converted into other versatile groups, such as hydroxy and aryl groups, via subsequent modifications. Chapter 2 addresses the preparation of a soluble syndiotactic polystyrene-supported phosphine ligand. Suzuki-Miyaura cross-coupling was effectively accomplished with the polymer-supported palladium complex, which was recovered quantitatively and recycled several times without any loss of activity and without the addition of fresh base. Chapter 3 studies the controlled electrophilic aromatic bromination of syndiotactic polystyrene; the brominated polymer can serve as a precursor for polyolefins bearing variable functionalities. Chapter 4 describes the synthesis of hydroxy-functionalized isotactic poly(1-butene) using controlled, regioselective rhodium-catalyzed C-H functionalization and subsequent oxidation. Atom transfer radical polymerization could then generate polar or amphiphilic graft copolymers from the functionalized crystalline polyolefin.

    an empirical study of elementary students in Korea

    Thesis (Master) -- KDI School: Master of Development Policy, 2016. Author: Jihoon SHIN.
    Children’s socialization as consumers is influenced by various factors. This study reviews previous theories and models associated with the consumer socialization of children. Its purpose is to explore how the major factors (e.g., family, peer group, and media) affect children and how the effect of each factor differs according to the conditions. The study conducts surveys and applies statistical analyses, such as regression, ANOVA, t-tests, and chi-square tests, to investigate the data. The results provide meaningful implications for the consumer socialization of children and offer managerial suggestions for marketing and sales targeting children.
    Contents: Part Ⅰ. Introduction; Part Ⅱ. Background of Study; Part Ⅲ. Hypotheses Development; Part Ⅳ. Methodology & Results; Part Ⅴ. Conclusion.
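    As a minimal sketch of the kinds of tests listed in the abstract above, the snippet below shows how regression, ANOVA, a t-test, and a chi-square test might be run on child-survey data with statsmodels and scipy. The file name and column names (consumer_score, media_hours, grade, gender, preferred_channel, etc.) are hypothetical placeholders, not the thesis's actual variables.

        # Illustrative only: hypothetical survey columns, not the thesis's actual data.
        import pandas as pd
        from scipy import stats
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        df = pd.read_csv("survey.csv")  # hypothetical file, one row per child

        # Regression: do media exposure, peer influence, and family communication predict the score?
        model = ols("consumer_score ~ media_hours + peer_influence + family_comm", data=df).fit()
        print(model.summary())

        # ANOVA: does the score differ across grade levels?
        anova_table = sm.stats.anova_lm(ols("consumer_score ~ C(grade)", data=df).fit(), typ=2)
        print(anova_table)

        # t-test: compare two gender groups (Welch's t-test)
        boys = df.loc[df["gender"] == "M", "consumer_score"]
        girls = df.loc[df["gender"] == "F", "consumer_score"]
        print(stats.ttest_ind(boys, girls, equal_var=False))

        # Chi-square: association between two categorical responses
        table = pd.crosstab(df["grade"], df["preferred_channel"])
        print(stats.chi2_contingency(table))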

    Incremental Lossless Graph Summarization

    Given a fully dynamic graph, represented as a stream of edge insertions and deletions, how can we obtain and incrementally update a lossless summary of its current snapshot? As large-scale graphs are prevalent, representing them concisely is essential for efficient storage and analysis. Lossless graph summarization is an effective graph-compression technique with many desirable properties. It aims to compactly represent the input graph as (a) a summary graph consisting of supernodes (i.e., sets of nodes) and superedges (i.e., edges between supernodes), which provide a rough description, and (b) edge corrections, which fix the errors induced by that rough description. While a number of batch algorithms, suited for static graphs, have been developed for rapid and compact graph summarization, they are highly inefficient in terms of time and space on dynamic graphs, which are common in practice. In this work, we propose MoSSo, the first incremental algorithm for lossless summarization of fully dynamic graphs. In response to each change in the input graph, MoSSo updates the output representation by repeatedly moving nodes among supernodes. MoSSo decides which nodes to move, and where to move them, carefully but rapidly, based on several novel ideas. Through extensive experiments on 10 real graphs, we show MoSSo is (a) Fast and 'any time': processing each change in near-constant time (less than 0.1 millisecond), up to 7 orders of magnitude faster than running state-of-the-art batch methods, (b) Scalable: summarizing graphs with hundreds of millions of edges, requiring sub-linear memory during the process, and (c) Effective: achieving compression ratios comparable even to those of state-of-the-art batch methods. Comment: to appear at the 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '20).
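    To make the output representation concrete, here is a minimal Python sketch of lossless graph summarization itself (a toy summary state, not the MoSSo algorithm): a summary graph of supernodes and superedges plus positive and negative edge corrections, from which any node's exact neighborhood can be recovered.

        # Minimal sketch of the lossless-summarization representation (not the MoSSo algorithm).
        # A superedge (A, B) declares all member pairs between A and B connected;
        # corrections add the missed edges and delete the wrongly implied ones.

        supernode_of = {"a": "S1", "b": "S1", "c": "S2", "d": "S2"}   # toy partition
        members = {"S1": {"a", "b"}, "S2": {"c", "d"}}
        superedges = {("S1", "S2")}
        corrections_add = {("a", "b")}       # edge missed by the superedges
        corrections_del = {("b", "c")}       # edge wrongly implied by the superedges

        def neighbors(u):
            """Exactly reconstruct u's neighbors from the summary graph and corrections."""
            out = set()
            su = supernode_of[u]
            for (p, q) in superedges:
                if su == p:
                    out |= members[q]
                if su == q:
                    out |= members[p]
            if (su, su) in superedges:       # self-superedge means a clique inside the supernode
                out |= members[su] - {u}
            out |= {v for (x, v) in corrections_add if x == u}
            out |= {x for (x, v) in corrections_add if v == u}
            out -= {v for (x, v) in corrections_del if x == u}
            out -= {x for (x, v) in corrections_del if v == u}
            return out

        print(neighbors("a"))   # contains 'b', 'c', 'd'
        print(neighbors("b"))   # contains 'a', 'd' only ('c' removed by the correction)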

    The importance of collaboration in the knowledge-intensive business services: The efficiency analysis of the industry using a meta-frontier approach

    This study focuses on the increasing global interest in collaboration and highlights its importance in knowledge-intensive business services (KIBS). It classifies service firms into four groups at two different levels of knowledge intensity, based on their business. We examine the efficiency of each group, and identify and explore the perceived significance of collaboration in the industry. The findings indicate that less knowledge-intensive service firms are the most efficient in terms of productivity, whereas highly knowledge-intensive firms show lower efficiency. These results provide valuable insights into how further collaboration improves the performance of the service industry.
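    As a rough illustration of how a meta-frontier efficiency comparison can be set up (the abstract does not state whether the paper uses DEA or a stochastic frontier, and the model, variables, and data below are toy assumptions), the sketch computes input-oriented CCR DEA efficiency scores with scipy's linear-programming solver, once against a group frontier and once against the pooled meta-frontier, and takes their ratio as a technology-gap measure.

        # Illustrative DEA-based meta-frontier sketch; not the paper's exact model or data.
        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(x_o, y_o, X, Y):
            """Input-oriented CCR efficiency of one firm (x_o, y_o) against a reference set.
            X: inputs (m x n), Y: outputs (s x n); columns are firms."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]                  # minimize theta
            A_in = np.hstack([-x_o.reshape(-1, 1), X])   # sum_j lam_j * x_ij <= theta * x_io
            A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum_j lam_j * y_rj >= y_ro
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -y_o]
            bounds = [(0, None)] * (n + 1)               # theta >= 0, lambdas >= 0
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            return res.x[0]

        # Toy data: 2 inputs, 1 output; two groups of firms (columns).
        X_high = np.array([[2.0, 4.0, 3.0], [3.0, 2.0, 5.0]]); Y_high = np.array([[1.0, 1.0, 1.5]])
        X_low  = np.array([[1.0, 2.0], [2.0, 1.0]]);            Y_low  = np.array([[1.0, 1.0]])
        X_meta = np.hstack([X_high, X_low]);                    Y_meta = np.hstack([Y_high, Y_low])

        # Technology-gap ratio of the first "high-intensity" firm: meta efficiency / group efficiency.
        group_eff = ccr_efficiency(X_high[:, 0], Y_high[:, 0], X_high, Y_high)
        meta_eff  = ccr_efficiency(X_high[:, 0], Y_high[:, 0], X_meta, Y_meta)
        print(group_eff, meta_eff, meta_eff / group_eff)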

    TensorCodec: Compact Lossy Compression of Tensors without Strong Data Assumptions

    Many real-world datasets are represented as tensors, i.e., multi-dimensional arrays of numerical values. Storing them without compression often requires substantial space, which grows exponentially with the order. While many tensor compression algorithms are available, many of them rely on strong assumptions about the data's order, sparsity, rank, and smoothness. In this work, we propose TENSORCODEC, a lossy compression algorithm for general tensors that do not necessarily adhere to strong input data assumptions. TENSORCODEC incorporates three key ideas. The first is Neural Tensor-Train Decomposition (NTTD), in which we integrate a recurrent neural network into Tensor-Train Decomposition to enhance its expressive power and alleviate the limitations imposed by the low-rank assumption. The second is to fold the input tensor into a higher-order tensor to reduce the space required by NTTD. The third is to reorder the mode indices of the input tensor to reveal patterns that NTTD can exploit for improved approximation. Our analysis and experiments on 8 real-world datasets demonstrate that TENSORCODEC is (a) Concise: it gives up to 7.38x more compact compression than the best competitor with similar reconstruction error, (b) Accurate: given the same budget for compressed size, it yields up to 3.33x more accurate reconstruction than the best competitor, and (c) Scalable: its empirical compression time is linear in the number of tensor entries, and it reconstructs each entry in logarithmic time. Our code and datasets are available at https://github.com/kbrother/TensorCodec. Comment: Accepted to ICDM 2023 - IEEE International Conference on Data Mining 2023.
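    For context on the decomposition NTTD builds on, the following sketch (plain Tensor-Train Decomposition in NumPy, not the TENSORCODEC implementation) shows how a tensor stored as small TT cores can be reconstructed entry by entry as a chain of tiny matrix products, which is what makes per-entry reconstruction cheap without materializing the full tensor.

        # Plain Tensor-Train (TT) reconstruction sketch -- the classical decomposition that
        # NTTD extends with a recurrent network; this is NOT the TensorCodec implementation.
        import numpy as np

        rng = np.random.default_rng(0)

        # A rank-2 TT representation of a 4 x 5 x 6 tensor:
        # core shapes are (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_3 = 1.
        cores = [rng.standard_normal((1, 4, 2)),
                 rng.standard_normal((2, 5, 2)),
                 rng.standard_normal((2, 6, 1))]

        def tt_entry(cores, idx):
            """Reconstruct a single tensor entry as a chain of small matrix products."""
            vec = np.ones((1, 1))
            for core, i in zip(cores, idx):
                vec = vec @ core[:, i, :]      # (1, r_{k-1}) @ (r_{k-1}, r_k)
            return float(vec[0, 0])

        def tt_full(cores):
            """Materialize the full tensor (only for checking on small examples)."""
            full = cores[0]                    # (1, n_1, r_1)
            for core in cores[1:]:
                full = np.tensordot(full, core, axes=([-1], [0]))
            return full.squeeze(axis=(0, -1))

        T = tt_full(cores)
        print(T.shape)                                             # (4, 5, 6)
        print(np.isclose(T[1, 3, 2], tt_entry(cores, (1, 3, 2))))  # True

        # Storage: 1*4*2 + 2*5*2 + 2*6*1 = 40 numbers vs. 4*5*6 = 120 for the raw tensor.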