
    A study on the vulnerability of Korean shipping companies to cybersecurity threats


    Essays on foreign currency risk management

    This dissertation studies the on-balance-sheet and off-balance-sheet foreign currency risk management of corporate firms and commercial banks. It comprises two essays. The first essay investigates what determines firms' foreign currency spot net asset positions, derivatives hedging, and synthetic hedging positions. We build a model that anticipates a firm's market timing in currency markets and credit markets according to the exchange-rate return and the interest rate differential. Using a unique data set containing the complete foreign currency spot and derivatives positions of Korean exporting firms, we empirically find that currency position-squaring firms have significantly higher firm value. We also find evidence that these firms time the currency market when they manage their currency cash position, while they time the credit market when they determine the use of foreign currency debt. Strikingly, firms still time the market even when they conduct derivatives hedging and synthetic hedging. Our findings are consistent with the market timing theory of capital structure. The second essay examines what determines banks' exposure to foreign currency risks, their management of these risks, and the relationship to the probability of bank failure. Using a unique data set of Korean banks with detailed information on their foreign currency risk exposures and hedging positions, we find that banks' foreign currency position mismatches, maturity mismatches, and debt roll-over risks are significantly attributable to their dollar carry lending strategy, which is stimulated by the market timing of corporate firms, short-maturity dollar borrowings, real estate market booms, and dollar interest rate tightening. We also find that banks' foreign currency exposures significantly increase their likelihood of financial distress through dollar carry lending activities. Finally, we show that, overall, banks that better match their foreign currency positions and maturities are rewarded with lower probabilities of financial distress.
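
    A minimal sketch of the market-timing logic described above: a firm compares the interest rate differential against the expected exchange-rate return when choosing dollar versus local-currency debt, and a "position-squaring" firm offsets spot exposure with derivatives. All function names and figures here are illustrative assumptions, not the dissertation's model.

        def net_fx_position(fx_assets, fx_liabilities, fwd_bought, fwd_sold):
            """Spot net asset position plus derivatives (a squared position is ~0)."""
            spot_net = fx_assets - fx_liabilities
            derivative_net = fwd_bought - fwd_sold
            return spot_net + derivative_net

        def carry_signal(r_domestic, r_dollar, expected_fx_return):
            """Positive when dollar borrowing looks cheap even after expected
            currency moves -- the 'credit market timing' incentive."""
            return (r_domestic - r_dollar) - expected_fx_return

        if __name__ == "__main__":
            # A position-squaring exporter: spot exposure offset by sold forwards.
            print(net_fx_position(100.0, 40.0, 0.0, 60.0))   # 0.0 -> fully squared
            # 3% local rate, 1% dollar rate, 1% expected depreciation:
            print(carry_signal(0.03, 0.01, 0.01))            # 0.01 -> dollar debt tempting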

    An Algorithm to Estimate Muscle Force from Joint Angle Using Simulink

    The purpose of this study was to compare the muscle lengths and muscle forces obtained from a developed model and from commercial software. A Hill-type muscle model consists of springs and a damper, so muscle force changes with muscle length. For this reason, the researchers needed to know the changes in muscle length and developed an algorithm to estimate muscle length from joint angle (angle-to-length). We also implemented a muscle model (length-to-force) based on F. E. Zajac's modification of the Hill-type muscle model. The muscle lengths and muscle forces for the ankle muscles showed significant correlation between the two methods. This indicates that the angle-to-force approach can be adapted to develop a tool for determining muscle forces in real time.
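
    A minimal sketch of the angle-to-length / length-to-force pipeline the abstract describes. The constant moment arm and the Gaussian active force-length curve are common textbook simplifications in Zajac-style normalized models; they are assumptions here, not the authors' Simulink implementation.

        import numpy as np

        L0 = 0.25          # optimal muscle length [m] (assumed)
        R = 0.05           # constant moment arm about the ankle [m] (assumed)
        F_MAX = 1500.0     # maximum isometric force [N] (assumed)

        def angle_to_length(theta_rad, theta0_rad=0.0):
            """Muscle length as a function of joint angle via a constant moment arm."""
            return L0 + R * (theta_rad - theta0_rad)

        def length_to_force(length, activation=1.0):
            """Active force from a Gaussian force-length relationship."""
            l_norm = length / L0
            fl = np.exp(-((l_norm - 1.0) ** 2) / 0.45)   # peaks at optimal length
            return F_MAX * activation * fl

        theta = np.deg2rad(10.0)   # 10 degrees of dorsiflexion (example input)
        force = length_to_force(angle_to_length(theta))
        print(f"estimated muscle force: {force:.1f} N")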

    Depression and PTSD in Pashtun Women in Kandahar, Afghanistan

    Objectives: The objectives were (a) to establish the prevalence of depression and post-traumatic stress disorder (PTSD) in Afghanistan and (b) to investigate the sociodemographic and quality of life variables that predict depression and PTSD. Methods: Translated versions of the Beck Depression Inventory, Impact of Event Scale-Revised, and Quality of Life Inventory were administered to 125 Pashtun women in Kandahar and statistically analyzed. Results: Approximately half of the participants showed moderate to severe levels of depression, and more than half exhibited symptoms of PTSD. Education and income showed significant associations with PTSD symptoms or depression. The way one spends time, general health status, and general feeling towards life predicted low levels of depression and PTSD. Conclusions: The high prevalence of depression and PTSD indicates the continuing need for mental health intervention. While education has been found to be a protective factor for mental health in previous studies, the relationship between education and mental health appears to be more complex among Afghan women. Quality of life variables could be further investigated and incorporated into mental health interventions for Afghan women.
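
    A hedged sketch of the analysis pattern reported above: prevalence from a screening cutoff, then a simple association test between a sociodemographic predictor and symptom scores. The data below are synthetic, and the BDI cutoff is a commonly cited convention, not necessarily the one the study used.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 125                                  # the study's sample size
        years_education = rng.integers(0, 12, size=n)
        bdi = np.clip(30 - 1.2 * years_education + rng.normal(0, 8, size=n), 0, 63)

        moderate_or_worse = bdi >= 20            # a common BDI cutoff for moderate depression
        print(f"prevalence of moderate-or-worse depression: {moderate_or_worse.mean():.0%}")

        # Association between education and depression scores (Pearson r).
        r, p = stats.pearsonr(years_education, bdi)
        print(f"education vs. BDI: r = {r:.2f}, p = {p:.3f}")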

    The Efficacy of Motivational Interviewing with Cognitive Behavioral Treatment on Behavior Changes in Heavy Drinkers

    This study aimed to investigate the efficacy of motivational interviewing (MI) with cognitive behavioral treatment (CBT) on behavioral changes in heavy drinkers. The study used an embedded mixed-methods design that integrates sequential qualitative interviews with quantitative evaluation. Of a total of 47 participants, 24 belonged to the intervention group, which participated in MI with CBT once a week, for 25-30 min on average, over 8 weeks; the remaining 23 were assigned to the control group, which received a 7-page booklet containing information about alcohol. A t-test, a generalized linear model, and qualitative analysis were used to evaluate the effects of MI with CBT. The interview data (n = 13) were analyzed using qualitative content analysis. There was a statistically significant change in participants' beliefs concerning the immediate effects of drinking (F = 3.827, p = 0.025). Additionally, the intervention group had significantly higher drinking refusal self-efficacy than the control group (F = 4.426, p = 0.015). Four themes emerged from the analysis of the qualitative data: reduction of the benefits of drinking, changes in thoughts about the costs of drinking, changes in drinking behavior, and achieving self-efficacy. MI with CBT significantly promoted awareness of problem-drinking behaviors among heavy drinkers and increased their self-efficacy, improving their ability to make positive behavioral changes for themselves. Since this intervention is simple and easy to apply, it will be useful for problem-drinking prevention strategies in the public health sector, and efforts to disseminate it will be worthwhile from a sustainability perspective.
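
    A minimal sketch of the group comparison described above: an independent-samples t-test on drinking refusal self-efficacy between the intervention (n = 24) and control (n = 23) groups. The scores are synthetic; the study's repeated-measures generalized linear model is omitted for brevity.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        intervention = rng.normal(75, 10, size=24)   # MI with CBT group (synthetic)
        control = rng.normal(68, 10, size=23)        # booklet-only group (synthetic)

        t, p = stats.ttest_ind(intervention, control)
        print(f"t = {t:.2f}, p = {p:.3f}")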

    DFX: A Low-latency Multi-FPGA Appliance for Accelerating Transformer-based Text Generation

    Transformer is a deep learning language model widely used for natural language processing (NLP) services in datacenters. Among transformer models, the Generative Pre-trained Transformer (GPT) has achieved remarkable performance in text generation, or natural language generation (NLG), which requires processing a large input context in the summarization stage, followed by a generation stage that produces a single word at a time. Conventional platforms such as GPUs are specialized for the parallel processing of large inputs in the summarization stage, but their performance degrades significantly in the generation stage because of its sequential nature. An efficient hardware platform is therefore required to address the high latency caused by the sequential characteristic of text generation. In this paper, we present DFX, a multi-FPGA acceleration appliance that executes GPT-2 model inference end-to-end with low latency and high throughput in both the summarization and generation stages. DFX uses model parallelism and an optimized dataflow that is model-and-hardware-aware for fast simultaneous workload execution across devices. Its compute cores operate on custom instructions and provide GPT-2 operations end-to-end. We implement the proposed hardware architecture on four Xilinx Alveo U280 FPGAs and utilize all channels of the high-bandwidth memory (HBM) and the maximum number of compute resources for high hardware efficiency. DFX achieves a 5.58x speedup and 3.99x higher energy efficiency over four NVIDIA V100 GPUs on the modern GPT-2 model. DFX is also 8.21x more cost-effective than the GPU appliance, suggesting that it is a promising solution for text generation workloads in cloud datacenters.
    Comment: Extension of HOTCHIPS 2022; accepted at MICRO 2022
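
    An illustrative sketch of the two-stage text-generation flow that motivates DFX: a parallel "summarization" (prefill) pass over the whole prompt, followed by a strictly sequential loop that emits one token per step. The function model_step is a hypothetical stand-in, not DFX's actual interface.

        def model_step(tokens, kv_cache):
            """Stand-in for one transformer forward pass; returns (next_token, cache)."""
            next_token = (sum(tokens) + len(kv_cache)) % 50257   # dummy logic
            return next_token, kv_cache + [next_token]

        def generate(prompt_tokens, max_new_tokens):
            # Summarization stage: the full prompt is processed at once, so large
            # batched matrix multiplies dominate -- GPUs handle this well.
            next_tok, cache = model_step(prompt_tokens, [])
            output = [next_tok]
            # Generation stage: each step depends on the previous token, so the
            # work is sequential and small -- the latency bottleneck DFX targets.
            for _ in range(max_new_tokens - 1):
                next_tok, cache = model_step(output[-1:], cache)
                output.append(next_tok)
            return output

        print(generate([464, 3290, 318], max_new_tokens=5))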

    Search for invisible axion dark matter with a multiple-cell haloscope

    We present the first results of a search for invisible axion dark matter using a multiple-cell cavity haloscope. This cavity concept was proposed to provide a highly efficient approach to high-mass regions compared to the conventional multiple-cavity design, offering a larger detection volume, a simpler detector setup, and a unique phase-matching mechanism. Searches with a double-cell cavity superseded previous reports for the axion-photon coupling over the mass range between 13.0 and 13.9 μeV. This result not only demonstrates the novelty of the cavity concept for high-mass axion searches, but also suggests it can make considerable contributions to next-generation experiments.
    Comment: 6 pages, 5 figures
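
    The quoted axion mass range maps to a microwave resonance through E = hf, which is what sets the required cavity frequency in a haloscope. A short arithmetic check using exact SI constants:

        H_PLANCK = 6.62607015e-34      # Planck constant, J*s (exact, SI 2019)
        EV_TO_J = 1.602176634e-19      # J per eV (exact, SI 2019)

        def axion_mass_to_frequency_ghz(mass_uev):
            """Resonant frequency in GHz for an axion mass given in micro-eV."""
            energy_j = mass_uev * 1e-6 * EV_TO_J
            return energy_j / H_PLANCK / 1e9

        for m in (13.0, 13.9):
            print(f"{m} micro-eV  ->  {axion_mass_to_frequency_ghz(m):.2f} GHz")
        # ~3.14-3.36 GHz: the band probed by the double-cell cavity search.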

    nuQmm: Quantized MatMul for Efficient Inference of Large-Scale Generative Language Models

    The recent advance of self-supervised learning associated with the Transformer architecture enables natural language processing (NLP) models to exhibit extremely low perplexity. Such powerful models demand ever-increasing model sizes and, thus, large amounts of computation and memory. In this paper, we propose an efficient inference framework for large-scale generative language models. As the key to reducing model size, we quantize weights using a non-uniform quantization method. Quantized matrix multiplications are then accelerated by our proposed kernel, called nuQmm, which allows a wide trade-off between compression ratio and accuracy. Our proposed nuQmm reduces not only the latency of each GPU but also that of the entire inference pipeline for large LMs, because a high compression ratio (from low-bit quantization) reduces the minimum required number of GPUs. Assuming 2-bit quantization, we demonstrate that nuQmm can reduce the latency of generating each token for OPT-175B (which requires 8 GPUs without nuQmm) by 47.3% using 8 GPUs, or by 23.2% using only 2 GPUs.
    Comment: 15 pages (including 5 pages of references and appendix), 14 figures, 7 tables
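
    A generic sketch of non-uniform weight quantization: weights are mapped to a small codebook of levels (here via 1-D k-means-style centroids), and the matrix multiply reads dequantized values through a lookup table. nuQmm's actual kernel uses its own low-bit format on GPU; this only illustrates the non-uniform quantization idea.

        import numpy as np

        def quantize_nonuniform(w, bits=2, iters=20):
            """Assign each weight to one of 2**bits centroids (1-D k-means)."""
            levels = np.quantile(w, np.linspace(0, 1, 2 ** bits))  # initial codebook
            for _ in range(iters):
                idx = np.argmin(np.abs(w[:, None] - levels[None, :]), axis=1)
                for k in range(len(levels)):                       # update centroids
                    if np.any(idx == k):
                        levels[k] = w[idx == k].mean()
            return idx.astype(np.uint8), levels

        def quantized_matvec(idx, levels, shape, x):
            """Dequantize through the lookup table, then multiply."""
            w_hat = levels[idx].reshape(shape)
            return w_hat @ x

        rng = np.random.default_rng(0)
        w = rng.normal(size=(64, 64))
        idx, levels = quantize_nonuniform(w.ravel(), bits=2)
        x = rng.normal(size=64)
        err = np.linalg.norm(quantized_matvec(idx, levels, w.shape, x) - w @ x)
        print(f"2-bit codebook: {levels.size} levels, matvec error: {err:.2f}")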