
    Financial Intermediation, Competition, and Risk: A General Equilibrium Exposition

    Get PDF
    We study a simple general equilibrium model in which investment in a risky technology is subject to moral hazard and banks can extract market power rents. We show that more bank competition results in lower economy-wide risk, lower bank capital ratios, more efficient production plans and Pareto-ranked real allocations. Perfect competition supports a second-best allocation and optimal levels of bank risk and capitalization. These results are at variance with those obtained by a large literature that has studied a similar environment in partial equilibrium. Importantly, they are empirically relevant, and they demonstrate the need for general equilibrium modeling in designing financial policies aimed at attaining socially optimal levels of systemic risk in the economy.
    Keywords: General Equilibrium; Bank Competition; Market Power Rents; Risk

    Capital Regulation, Liquidity Requirements and Taxation in a Dynamic Model of Banking

    Get PDF
    This paper formulates a dynamic model of a bank exposed to both credit and liquidity risk, which can resolve financial distress in three costly forms: fire sales, bond issuance and equity issuance. We use the model to analyze the impact of capital regulation, liquidity requirements and taxation on banks' optimal policies and on metrics of efficiency of intermediation and social value. We obtain three main results. First, mild capital requirements increase bank lending, bank efficiency and social value relative to an unregulated bank, but these benefits turn into costs if capital requirements are too stringent. Second, liquidity requirements significantly reduce bank lending, efficiency and social value; they nullify the benefits of mild capital requirements, and their private and social costs increase monotonically with their stringency. Third, increases in corporate income taxes and taxes on bank liabilities reduce bank lending, bank efficiency and social value, with tax receipts increasing with the former but decreasing with the latter. Moreover, the effects of an increase in both forms of taxation are dampened if they are jointly implemented with increases in capital and liquidity requirements.
    Keywords: Capital requirements; liquidity requirements; taxation of liabilities

    DEPAS: A Decentralized Probabilistic Algorithm for Auto-Scaling

    Full text link
    The dynamic provisioning of virtualized resources offered by cloud computing infrastructures allows applications deployed in a cloud environment to automatically increase and decrease the amount of resources they use. This capability is called auto-scaling, and its main purpose is to automatically adjust the scale of the system running the application so that it satisfies the varying workload with minimum resource utilization. The need for auto-scaling is particularly important during workload peaks, during which applications may need to scale up to extremely large-scale systems. Both the research community and the main cloud providers have already developed auto-scaling solutions. However, most research solutions are centralized and not suitable for managing large-scale systems; moreover, cloud providers' solutions are bound to the limitations of a specific provider in terms of resource prices, availability, reliability, and connectivity. In this paper we propose DEPAS, a decentralized probabilistic auto-scaling algorithm integrated into a P2P architecture that is cloud-provider independent, thus allowing the auto-scaling of services over multiple cloud infrastructures at the same time. Our simulations, which are based on real service traces, show that our approach is capable of: (i) keeping the overall utilization of all the instantiated cloud resources in a target range, and (ii) maintaining service response times close to the ones obtained using optimal centralized auto-scaling approaches.
    Comment: Submitted to Springer Computing
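    The core idea of a decentralized probabilistic auto-scaler is that each node decides locally, with a probability tied to how far its estimated utilization sits outside a target range, whether to add or remove an instance; in expectation the independent decisions bring aggregate capacity back into range. The sketch below illustrates that general idea only; the function name, thresholds, and probability formula are illustrative assumptions, not the DEPAS algorithm as published.

    ```python
    import random

    def autoscale_decision(utilization, target_low=0.5, target_high=0.8, rng=random):
        """Local, probabilistic scaling decision for one node.

        Returns +1 (add an instance), -1 (remove one), or 0 (no action).
        The probability of acting grows with the distance of the measured
        utilization from the target range, so the expected number of nodes
        acting in the same round scales with the size of the imbalance.
        """
        if utilization > target_high:
            # Overload: act with probability proportional to the excess.
            p_up = min(1.0, (utilization - target_high) / target_high)
            return 1 if rng.random() < p_up else 0
        if utilization < target_low:
            # Underload: act with probability proportional to the slack.
            p_down = min(1.0, (target_low - utilization) / target_low)
            return -1 if rng.random() < p_down else 0
        return 0  # Inside the target range: do nothing.
    ```

    Because every node uses only its own utilization estimate and a random draw, no coordinator is needed, which is what makes this style of algorithm suitable for large-scale, multi-provider deployments.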

    Detection of vitellogenin in a subpopulation of sea urchin coelomocytes

    Get PDF
    Sea urchin vitellogenin is a high-molecular-weight glycoprotein, the precursor of the major yolk protein present in the unfertilized egg. The processing of vitellogenin into the major yolk protein, and its further enzymatic cleavage during sea urchin embryonic development, have been extensively described, and the adhesive properties of the processed molecule have been studied. The function of vitellogenin in the adult, where it has been found in the coelomic fluid of both male and female individuals, is still unknown, although its role in promoting the adhesion of embryonic cells has been shown. In this report we describe the detection of vitellogenin in lysates of whole circulating coelomocytes of both male and female sea urchins of the species Paracentrotus lividus. Using metrizoic acid gradients, we separated total coelomocytes into six subpopulations, which were tested for the presence of the molecule using vitellogenin-specific polyclonal antibodies. We detected vitellogenin only in the coelomocyte subpopulation called colorless spherule cells, packed in kidney-shaped granules located around the nucleus. We also showed that coelomocytes respond to stress conditions by discharging vitellogenin into the medium. This result, together with previous observations on the adhesive properties of the molecule, suggests a role for vitellogenin in the clotting phenomenon occurring after host invasion.

    Comparing continuous and intermittent exercise. An "isoeffort" and "isotime" approach

    Get PDF
    The present study proposes an alternative way of comparing performance and acute physiological responses to continuous exercise with those of intermittent exercise, ensuring a similar between-protocol overall effort (isoeffort) and the same total duration of exercise (isotime). This approach was expected to overcome some drawbacks of traditional methods of comparison. Fourteen competitive cyclists (20±3 yrs) performed a preliminary incremental test and four experimental 30-min self-paced protocols, i.e. one continuous and three passive-recovery intermittent exercise protocols with different work-to-rest ratios (2 = 40:20 s, 1 = 30:30 s and 0.5 = 20:40 s). A "maximal session effort" prescription was adopted for this experimental design. As expected, a robust perceived exertion template was observed irrespective of exercise protocol. Similar between-protocol pacing strategies further support the use of the proposed approach in competitive cyclists. Total work, oxygen uptake and heart rate mean values were significantly higher (P<0.05) in the continuous protocol than in the intermittent protocols, while lactate values were lower. When manipulating the work-to-rest ratio in intermittent exercise, total work, oxygen uptake and heart rate mean values decreased with the decrease in the work-to-rest ratio, while lactate values increased. Despite this complex physiological picture, all protocols showed similar ventilatory responses and a nearly perfect relationship between respiratory frequency and perceived exertion. In conclusion, our data indicate that overall effort and total duration of exercise are two critical parameters that should both be controlled when comparing continuous with intermittent exercise. On an isoeffort and isotime basis, manipulating the work-to-rest ratio affects physiological responses in a different way from what has been reported in the literature with traditional methods of comparison. Finally, our data suggest that during intermittent exercise respiratory frequency reflects physiological strain better than oxygen uptake, heart rate and blood lactate.
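    The three intermittent protocols differ only in how a fixed-length cycle is split between work and rest, which is what makes them comparable on an isotime basis. A small sketch of that arithmetic (the function name and the assumption of whole cycles over the 30-min session are mine, not from the study):

    ```python
    def protocol_totals(work_s, rest_s, total_min=30):
        """Total work and rest time (minutes) for an intermittent protocol
        defined by a work:rest cycle, assuming whole cycles fill the session."""
        cycle = work_s + rest_s                     # one work+rest cycle, in seconds
        n_cycles = (total_min * 60) // cycle        # whole cycles in the session
        return n_cycles * work_s / 60, n_cycles * rest_s / 60

    # The three ratios used in the study, all over the same 30-min session:
    # 40:20 s (ratio 2), 30:30 s (ratio 1), 20:40 s (ratio 0.5).
    ```

    For example, the 40:20 s protocol yields 20 min of work and 10 min of rest, while 20:40 s yields 10 min of work and 20 min of rest, so total work time halves while total session time stays fixed.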

    Minimax optimal quantile and semi-adversarial regret via root-logarithmic regularizers

    Full text link
    Quantile (and, more generally, KL) regret bounds, such as those achieved by NormalHedge (Chaudhuri, Freund, and Hsu 2009) and its variants, relax the goal of competing against the best individual expert to only competing against a majority of experts on adversarial data. More recently, the semi-adversarial paradigm (Bilodeau, Negrea, and Roy 2020) provides an alternative relaxation of adversarial online learning by considering data that may be neither fully adversarial nor stochastic (i.i.d.). We achieve the minimax optimal regret in both paradigms using FTRL with separate, novel, root-logarithmic regularizers, both of which can be interpreted as yielding variants of NormalHedge. We extend existing KL regret upper bounds, which hold uniformly over target distributions, to possibly uncountable expert classes with arbitrary priors; provide the first full-information lower bounds for quantile regret on finite expert classes (which are tight); and provide an adaptively minimax optimal algorithm for the semi-adversarial paradigm that adapts to the true, unknown constraint faster, leading to uniformly improved regret bounds over existing methods.
    https://arxiv.org/pdf/2110.14804.pdf
    Published version
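    For readers unfamiliar with this family of algorithms: the classic Hedge (exponential-weights) update is the baseline that NormalHedge and the FTRL variants in this abstract refine. The sketch below shows that baseline update only, not the paper's root-logarithmic regularizers; the function name and interface are illustrative.

    ```python
    import math

    def hedge_update(weights, losses, eta):
        """One round of the Hedge / exponential-weights update.

        Each expert's weight is multiplied by exp(-eta * loss), then the
        weights are renormalized, so experts with lower cumulative loss
        receive exponentially more mass over time.
        """
        scaled = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
        z = sum(scaled)
        return [w / z for w in scaled]
    ```

    Quantile-regret methods like NormalHedge replace the fixed learning rate eta with a potential-based scheme, so the learner competes with the top fraction of experts without knowing in advance which fraction that is.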

    Insights into the off-state breakdown mechanisms in power GaN HEMTs

    Get PDF
    We analyze the off-state, three-terminal, lateral breakdown in AlGaN/GaN HEMTs for power switching applications by comparing two-dimensional numerical device simulations with experimental data from device structures with different gate-to-drain spacing and with either an undoped or a carbon-doped GaN buffer layer. Our simulations reproduce the different breakdown-voltage dependence on the gate-to-drain spacing exhibited by the two types of device and attribute the breakdown to: (i) a combination of gate electron injection and source-drain punch-through in the undoped HEMTs; and (ii) avalanche generation triggered by gate electron injection in the C-doped HEMTs.