
    Procyclicality of financial systems: is there a need to modify current accounting and regulatory rules?

    Financial systems have an intrinsic tendency to exacerbate business-cycle fluctuations rather than smooth them out. The current crisis is a perfect illustration of this. Some commentators have argued that the recent reforms to international bank regulation (Basel II) and accounting rules (IAS 39) are likely to increase this intrinsic procyclicality in the future. This article examines whether this accusation is well founded and what policy decisions could be envisaged to alleviate this undesirable feature of financial systems.

    Liquidity regulation and the lender of last resort.

    The recent subprime crisis has brought back to light proposals to regulate banks’ liquidity as a complement to solvency regulations. Based on recent academic research, I suggest that liquidity regulation might indeed be a way to limit the pressure on Central Banks to provide liquidity injections during crisis periods. Another crucial question is the allocation of responsibilities among the Central Bank, the Banking Supervisors and the Treasury in the management of banking crises.

    Scintillation efficiency of liquid argon in low energy neutron-argon scattering

    Experiments searching for weakly interacting massive particles with noble gases such as liquid argon require very low detection thresholds for nuclear recoils. A determination of the scintillation efficiency is crucial to quantify the response of the detector at low energy. We report the results obtained with a small liquid argon cell using a monoenergetic neutron beam produced by a deuterium-deuterium fusion source. The light yield relative to electrons was measured for six argon recoil energies between 11 and 120 keV at zero electric drift field. Comment: 21 pages, 19 figures, 4 tables
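    The quantity measured here, the relative scintillation efficiency, is simply the light yield of argon nuclear recoils divided by the light yield of electron recoils at the same deposited energy. The minimal sketch below illustrates that ratio with hypothetical yield numbers; it is not the authors' analysis code.

        # Minimal sketch: relative scintillation efficiency (quenching factor).
        # The yield numbers below are illustrative placeholders, not measured values.

        def relative_scintillation_efficiency(nr_yield_pe_per_kev: float,
                                              er_yield_pe_per_kev: float) -> float:
            """Nuclear-recoil light yield divided by electron-recoil light yield
            at the same deposited energy (both in photoelectrons per keV)."""
            return nr_yield_pe_per_kev / er_yield_pe_per_kev

        if __name__ == "__main__":
            # Hypothetical example: 1.2 pe/keV for Ar recoils vs 4.8 pe/keV for electrons.
            print(relative_scintillation_efficiency(1.2, 4.8))  # -> 0.25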

    Randomized Revenue Monotone Mechanisms for Online Advertising

    Online advertising is the main source of revenue for many Internet firms. A central component of online advertising is the underlying mechanism that selects and prices the winning ads for a given ad slot. In this paper we study designing a mechanism for the Combinatorial Auction with Identical Items (CAII), in which we are interested in selling k identical items to a group of bidders, each demanding a certain number of items between 1 and k. CAII generalizes important online advertising scenarios such as image-text and video-pod auctions [GK14]. In an image-text auction we want to fill an advertising slot on a publisher's web page with either k text-ads or a single image-ad, and in a video-pod auction we want to fill an advertising break of k seconds with video-ads of possibly different durations. Our goal is to design truthful mechanisms that satisfy Revenue Monotonicity (RM). RM is a natural constraint which states that the revenue of a mechanism should not decrease if the number of participants increases or if a participant increases her bid. [GK14] showed that no deterministic RM mechanism can attain a PoRM of less than ln(k) for CAII, i.e., no deterministic mechanism can attain more than a 1/ln(k) fraction of the maximum social welfare. [GK14] also designed a mechanism with a PoRM of O(ln^2(k)) for CAII. In this paper, we seek to overcome the impossibility result of [GK14] for deterministic mechanisms by using the power of randomization. We show that by using randomization, one can attain a constant PoRM. In particular, we design a randomized RM mechanism with a PoRM of 3 for CAII.
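    To make the CAII setting concrete, the sketch below computes a welfare-maximizing allocation: bidder i demands d_i of the k identical items and bids b_i, and a winning set with total demand at most k is chosen by a standard 0/1 knapsack dynamic program. This covers only the underlying allocation problem; the paper's contribution, a truthful and revenue-monotone mechanism built on top of such allocations, is not reproduced here.

        # Illustrative sketch (not the paper's mechanism): welfare-maximizing
        # allocation for CAII via 0/1 knapsack dynamic programming.
        from typing import List, Tuple

        def max_welfare(k: int, demands: List[int], bids: List[float]) -> Tuple[float, List[int]]:
            """Return the maximum total bid value achievable with at most k items,
            together with the indices of the winning bidders."""
            # dp[c] = (best value using at most c items, winning bidder indices)
            dp = [(0.0, [])] * (k + 1)
            for i, (d, b) in enumerate(zip(demands, bids)):
                new_dp = list(dp)
                for c in range(d, k + 1):
                    cand = dp[c - d][0] + b
                    if cand > new_dp[c][0]:
                        new_dp[c] = (cand, dp[c - d][1] + [i])
                dp = new_dp
            return dp[k]

        if __name__ == "__main__":
            # Hypothetical image-text auction with k = 4 slots: one image ad that
            # needs all four slots versus four text ads that need one slot each.
            print(max_welfare(4, [4, 1, 1, 1, 1], [10.0, 3.0, 3.0, 2.0, 2.0]))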

    Optimal Design of Robust Combinatorial Mechanisms for Substitutable Goods

    In this paper we consider a multidimensional mechanism design problem for selling discrete substitutable items to a group of buyers. Previous work on this problem mostly focuses on a stochastic description of the valuations used by the seller. However, in certain applications, no prior information regarding buyers' preferences is known. To address this issue, we consider uncertain valuations and formulate the problem in a robust optimization framework: the objective is to minimize the maximum regret. For a special case, the revenue-maximizing pricing problem, we present a solution method based on a mixed-integer linear programming formulation.
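    The min-max regret objective can be illustrated with a deliberately simplified single-item example (my own construction, not the paper's formulation): the buyer's valuation is only known to lie in an interval, the buyer purchases whenever the price does not exceed the valuation, and the seller picks the price whose worst-case regret against the best price in hindsight is smallest.

        # Simplified illustration of minimax-regret pricing for one item.
        # Assumption: valuation v lies in [v_lo, v_hi]; the buyer purchases iff price <= v.
        # Regret of price p under valuation v: best revenue in hindsight (v) minus
        # realized revenue (p if p <= v else 0).
        import numpy as np

        def minimax_regret_price(v_lo: float, v_hi: float, grid: int = 2001) -> float:
            prices = np.linspace(v_lo, v_hi, grid)
            values = np.linspace(v_lo, v_hi, grid)
            revenue = np.where(prices[:, None] <= values[None, :], prices[:, None], 0.0)
            regret = values[None, :] - revenue      # regret[i, j]: price i, valuation j
            worst_case = regret.max(axis=1)         # worst valuation for each price
            return float(prices[worst_case.argmin()])

        if __name__ == "__main__":
            # Hypothetical interval [2, 10]: worst-case regret is max(v_hi - p, ~p),
            # so the minimax price lands near v_hi / 2 = 5.
            print(minimax_regret_price(2.0, 10.0))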

    Power-Law Distributions in a Two-sided Market and Net Neutrality

    "Net neutrality" often refers to the policy dictating that an Internet service provider (ISP) cannot charge content providers (CPs) for delivering their content to consumers. Many past quantitative models designed to determine whether net neutrality is a good idea have been rather equivocal in their conclusions. Here we propose a very simple two-sided market model, in which the types of the consumers and the CPs are {\em power-law distributed} --- a kind of distribution known to often arise precisely in connection with Internet-related phenomena. We derive mostly analytical, closed-form results for several regimes: (a) Net neutrality, (b) social optimum, (c) maximum revenue by the ISP, or (d) maximum ISP revenue under quality differentiation. One unexpected conclusion is that (a) and (b) will differ significantly, unless average CP productivity is very high

    Clearing algorithms and network centrality

    I show that the solution of a standard clearing model commonly used in contagion analyses for financial systems can be expressed as a specific form of a generalized Katz centrality measure under conditions that correspond to a system-wide shock. This result provides a formal explanation for earlier empirical results which showed that Katz-type centrality measures are closely related to contagiousness. It also allows assessing the assumptions that one is making when using such centrality measures as systemic risk indicators. I conclude that these assumptions should be considered too strong and that, from a theoretical perspective, clearing models should be given preference over centrality measures in systemic risk analyses.
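    The abstract does not name the clearing model, but a standard choice in this literature is the Eisenberg-Noe framework; the sketch below (an assumption on my part, not code from the paper) iterates the Eisenberg-Noe clearing fixed point on a small interbank liability matrix and, for comparison, computes Katz centrality on the same network.

        # Illustrative sketch: Eisenberg-Noe style clearing vector via fixed-point
        # iteration, plus Katz centrality on the same liabilities network.
        # Model choice and numbers are assumptions, not taken from the paper.
        import numpy as np

        def clearing_vector(L, e, tol=1e-12):
            """L[i, j] = nominal liability of bank i to bank j; e = outside assets."""
            p_bar = L.sum(axis=1)                          # total nominal obligations
            Pi = np.divide(L, p_bar[:, None], out=np.zeros_like(L), where=p_bar[:, None] > 0)
            p = p_bar.copy()
            while True:
                p_new = np.minimum(p_bar, e + Pi.T @ p)    # pay in full, or pay what you have
                if np.max(np.abs(p_new - p)) < tol:
                    return p_new
                p = p_new

        def katz_centrality(A, alpha=0.1, beta=1.0):
            """Katz centrality x = (I - alpha * A^T)^-1 (beta * 1)."""
            n = A.shape[0]
            return np.linalg.solve(np.eye(n) - alpha * A.T, beta * np.ones(n))

        if __name__ == "__main__":
            L = np.array([[0.0, 2.0, 1.0],
                          [1.0, 0.0, 2.0],
                          [2.0, 1.0, 0.0]])
            e = np.array([0.5, 0.5, 0.5])                  # small outside assets: system-wide stress
            print(clearing_vector(L, e))                   # clearing payments
            print(katz_centrality(L))                      # centrality scores for comparison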

    Environmental Effects on Cephalopod Life History and Fisheries

    Editorial for a special issue of the journal Aquatic Living Resources. The present collection of papers arises from a theme session on “Cephalopod Stocks: Review, Analyses, Assessment, and Sustainable Management” at the 2004 ICES Annual Science Conference, Vigo, Spain. The original proposal for the theme session was justified by the availability of much unpublished information on cephalopod biology and fisheries arising from various CEC-funded R&D projects during the last 15 years. The theme session also related directly to the EC-funded Concerted Action CEPHSTOCK (Q5CA-2002-00962), and provided a route for dissemination of the review and synthesis work carried out under this project. The theme session was intended to facilitate the wider dissemination and publication of these results, with the long-term aim of informing future management decisions for the major fished stocks of cephalopods in European waters. Any future European research programme related to cephalopod biology and fisheries will need to take into account the knowledge acquired on cephalopod populations. The theme session aimed to attract scientists working on cephalopod stocks outside the NE Atlantic as well as those from ICES countries. The scope of the theme session was:
    • The current state of knowledge on exploited cephalopods (biology, fisheries, environmental relationships, stock identity) in European waters;
    • Current fishery data collection, stock assessment and management practices for cephalopod capture fisheries world-wide;
    • The current status of cephalopod culture and the prospects for commercial aquaculture;
    • Socio-economic issues related to cephalopod fisheries;
    • Current knowledge of aspects of cephalopod biology and ecology related to their suitability as resource species for capture and culture fisheries, and assessment of environmental factors which affect the immuno-competence and physiology of cephalopods;
    • Assessment and management options for currently unregulated cephalopod fisheries.
    The theme session attracted 28 oral presentations and 12 posters that could be broadly divided into those more concerned with biology and ecology and those focusing on fisheries. Some of these presentations appear elsewhere, e.g. Guerra et al. (2005) on giant squid strandings. The selection of papers presented in Aquatic Living Resources vol. 18, no. 4, 2005, “Environmental effects on cephalopod life history and fisheries”, illustrates how cephalopod studies could contribute to the development of an ecosystem approach to fisheries management (FAO 2003) by analysing a series of environmental effects operating at different scales.
    Environmental effects on life histories. The life-cycle characteristics of cephalopods are the main reason for the large inter-annual fluctuations of population densities (Boyle and Boletzky 1996). In teleost fishes, differences in biological parameters have been analysed in relation to fishing, sometimes considered the main environmental impact (Rochet et al. 2000). As a first step towards transposing this approach to cephalopods, substantial biological data sets and new statistical approaches are applied to answer questions about squid life history. Vidal et al. evaluated the influence of food supply on yolk utilization, metabolism and growth of paralarvae of Loligo vulgaris reynaudii, while Smith et al. re-examined historical life-history data to infer the relationships between nutritional state, growth and maturation in Loligo forbesi. Moreno et al. examined differences in age, size-at-maturity and reproductive investment in different cohorts of Loligo vulgaris in relation to environmental influences. Consequences of such influences on cohort success have to be analysed taking into account the spatial organization of fished populations. Walters et al. (2004) encouraged this approach by presenting “spatial life history trajectories”, which involve nested designs or time-stepping structures.

    Luminescence quenching of the triplet excimer state by air traces in gaseous argon

    While developing a liquid argon detector for dark matter searches we investigate the influence of air contamination on the VUV scintillation yield in gaseous argon at atmospheric pressure. Using a radioactive alpha-source, we determine the photon yield for various partial air pressures and for different reflectors and wavelength shifters. We find for the fast scintillation component a time constant tau1 = 11.3 ± 2.8 ns, independent of gas purity. However, the decay time of the slow component depends on gas purity and is a good indicator of the total VUV light yield. This dependence is attributed to impurities destroying the long-lived argon excimer states. The population ratio between the slowly and the fast decaying excimer states is determined for alpha-particles to be 5.5 ± 0.6 in argon gas at 1100 mbar and room temperature. The measured mean life of the slow component is tau2 = 3.140 ± 0.067 microseconds at a partial air pressure of 2 × 10^-6 mbar. Comment: 7 pages, submitted to NIM
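    The fast and slow time constants quoted above are typically extracted by fitting a two-component exponential to the measured photon-arrival-time spectrum. The sketch below is my own illustration of such a fit on synthetic data, not the authors' analysis code; the generating parameter values are placeholders.

        # Illustrative two-component exponential fit for a scintillation time spectrum.
        # Synthetic data; amplitudes and time constants are placeholders.
        import numpy as np
        from scipy.optimize import curve_fit

        def two_exp(t, a_fast, tau_fast, a_slow, tau_slow):
            """Sum of a fast and a slow exponential decay component."""
            return a_fast * np.exp(-t / tau_fast) + a_slow * np.exp(-t / tau_slow)

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 20000.0, 20001)                 # time axis in ns, 1 ns steps
        truth = two_exp(t, 1.0, 11.0, 0.02, 3100.0)          # assumed shape, arbitrary units
        data = truth + rng.normal(0.0, 1e-3, t.size)         # add measurement noise

        p0 = (1.0, 10.0, 0.01, 2000.0)                       # initial parameter guesses
        popt, _ = curve_fit(two_exp, t, data, p0=p0)
        a_fast, tau_fast, a_slow, tau_slow = popt
        print(f"tau_fast ~ {tau_fast:.1f} ns, tau_slow ~ {tau_slow:.0f} ns")
        # Slow-to-fast population ratio estimated from the component integrals:
        print(f"slow/fast ratio ~ {a_slow * tau_slow / (a_fast * tau_fast):.2f}")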