648 research outputs found

    Adverse selection costs, trading activity and price discovery in the NYSE: An empirical analysis

    This paper studies the role that trading activity plays in the price discovery process of a NYSE-listed stock. We measure the expected information content of each trade by estimating its permanent price impact, which depends on observable trade features and market conditions. We also estimate the time required for quotes to incorporate all the information content of a particular trade. Our results show that price discovery is faster after risky trades and at the extreme intervals of the session. The quote adjustment to trade-related shocks is progressive, which causes risk persistence and unusual short-term market conditions.

    Adverse selection costs, trading activity and liquidity in the NYSE: an empirical analysis in a dynamic context

    This paper measures the adverse selection costs associated with a given trade by estimating its permanent impact on market quotes. The estimation depends on observable trade features and market conditions, and is given by the impulse-response function of a generalization of Hasbrouck's (1991a, b) VAR model. We show that structural microstructure models of quote formation may introduce a downward bias in the estimation of adverse selection costs by assuming that trades have only an immediate impact on prices. Moreover, we observe that market behavior, in terms of liquidity and activity, in the short-term period after a trade depends on the information-asymmetry risk associated with that trade.
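Both abstracts above estimate the permanent price impact of a trade as the long-run response of quotes to a trade innovation. A minimal numerical sketch of the idea, in the spirit of Hasbrouck's VAR approach (this is not the papers' exact estimator; the trade/quote process and impact coefficients below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated (invented) data: signed trades x_t and quote-midpoint returns r_t.
# A buy moves quotes by 0.50 immediately and by a further 0.15 one period
# later, so the true permanent impact is 0.65.
n, b_now, b_lag = 50_000, 0.50, 0.15
x = rng.choice([-1.0, 1.0], size=n)          # +1 buy, -1 sell
r = b_now * x + rng.normal(scale=0.05, size=n)
r[1:] += b_lag * x[:-1]                      # delayed quote adjustment

# Contemporaneous impact: OLS of r_t on x_t.
b0 = (x @ r) / (x @ x)

# Lagged dynamics: VAR(1) of y_t = (r_t, x_t) on y_{t-1}, fit by least squares.
Y = np.column_stack([r[1:], x[1:]])
Z = np.column_stack([r[:-1], x[:-1]])
A = np.linalg.lstsq(Z, Y, rcond=None)[0].T   # y_t ~ A @ y_{t-1}

# Permanent impact: contemporaneous effect plus the cumulated impulse
# response of r to the unit-buy shock propagated through the VAR.
state = np.array([b0, 1.0])                  # (r, x) right after a unit buy
perm = b0
for _ in range(20):                          # horizons until the IRF dies out
    state = A @ state
    perm += state[0]
# b0 recovers roughly 0.50, perm roughly 0.65 (the full permanent impact)
```

The downward bias mentioned in the second abstract corresponds to stopping at `b0`, the immediate impact, and discarding the lagged responses accumulated into `perm`.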

    On the bi-dimensionality of liquidity

    Variations in overall liquidity can be measured by simultaneous changes in both immediacy costs and depth. Liquidity changes, however, are ambiguous whenever the two liquidity dimensions do not reinforce each other. In this paper, ambiguity is characterized using an instantaneous time-varying elasticity concept. Several bi-dimensional liquidity measures that cope with the ambiguity problem are constructed. First, it is shown that bi-dimensional measures are superior, since commonalities in overall liquidity cannot be fully explained by the common factors in one-dimensional proxies of liquidity. Second, it is shown that an infinitesimal variation in either market volatility or trading activity increases the probability of observing an unambiguous liquidity adjustment. Ambiguity strongly depends on the expected (deterministic) component of volatility.
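The ambiguity notion can be made concrete with a toy classifier (a hypothetical helper, not the paper's elasticity-based measure): a liquidity change is unambiguous only when the two dimensions reinforce each other, i.e. immediacy costs and depth move in opposite numerical directions.

```python
def classify(spread_old, spread_new, depth_old, depth_new):
    """Classify a joint (immediacy cost, depth) move.

    Liquidity unambiguously improves when the spread falls while depth
    rises, unambiguously worsens in the mirror case, and is ambiguous
    whenever both quantities move in the same numerical direction
    (one liquidity dimension improves while the other deteriorates).
    """
    ds = (spread_new - spread_old) / spread_old   # relative change in spread
    dd = (depth_new - depth_old) / depth_old      # relative change in depth
    if ds == 0 and dd == 0:
        return "unchanged"
    if ds <= 0 <= dd:
        return "improved"      # cheaper immediacy AND more depth
    if dd <= 0 <= ds:
        return "worsened"      # costlier immediacy AND less depth
    return "ambiguous"         # the two dimensions do not reinforce each other

# A tighter spread with more depth is an unambiguous improvement;
# a tighter spread with less depth cannot be signed and is "ambiguous".
```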

    Evaluation of urban behaviour modifiers affecting earthquake damage: application to the Lorca earthquake

    Seismic vulnerability estimation usually focuses on the structural behaviour of buildings. Only some methodologies, such as the Risk-UE project, consider the influence of other non-structural or urban factors, such as soft storeys, irregularity in elevation, irregularity in plan, etc. These factors, also called behaviour modifiers, can affect the observed damage, and the combination of several of them can substantially change the vulnerability. Behaviour modifiers have been identified empirically, through the observation of typical damage patterns in earthquakes, taking into account visual inspections (ATC 21 1988, Benedetti and Petrini 1984, UNDP/UNIDO 1985) and other proposals (Coburn and Spence 1992). Modifier scores have been assigned by earthquake experts after analysing previous vulnerability assessments and databases of building damage. In this paper we study the modifiers that derive from urban characteristics. This line of research considers that a modifier parameter derives from urban characteristics if it can be regulated in the urban planning regulations of a general urban development plan (Plan General de Ordenación Urbana). Each modifier is described according to each methodology or researcher (Risk-UE, Giovinazzi, Lantada and Feriche), and the different modifier weightings are compared. This analysis provides a first view of the possible quantification of each modifier and of the trend in its calibration from 2003, with the Risk-UE project, to 2012, with Feriche's thesis.
Finally, we present the results of an exploratory study of the urban parameters of three selected zones of the city of Lorca, according to the type of land on which they are located, and identify the parameters that may have influenced the damage caused by the earthquake of May 2011.
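Schematically, behaviour modifiers enter a Risk-UE-style vulnerability estimate as additive corrections to a base value for the building typology. A minimal sketch (the base values and modifier scores below are invented placeholders, not the calibrated weights of Risk-UE, Giovinazzi, Lantada or Feriche):

```python
# Hypothetical base vulnerability index per typology and per-modifier scores.
BASE_INDEX = {"masonry": 0.74, "rc_frame": 0.44}
MODIFIERS = {
    "soft_storey": 0.04,
    "plan_irregularity": 0.02,
    "vertical_irregularity": 0.02,
    "corner_position": 0.04,
}

def vulnerability_index(typology, observed_modifiers):
    """Total index = base typology value + sum of observed modifier scores."""
    return BASE_INDEX[typology] + sum(MODIFIERS[m] for m in observed_modifiers)

# A masonry building with a soft storey and an irregular plan:
# 0.74 + 0.04 + 0.02 = 0.80 (higher index = more vulnerable).
v = vulnerability_index("masonry", ["soft_storey", "plan_irregularity"])
```

The comparison carried out in the paper amounts to asking how the entries of a table like `MODIFIERS` differ across the methodologies.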

    Sumoylation of Smc5 Promotes Error-free Bypass at Damaged Replication Forks

    Replication of a damaged DNA template can threaten the integrity of the genome, requiring the use of various mechanisms to tolerate DNA lesions. The Smc5/6 complex, together with the Nse2/Mms21 SUMO ligase, plays essential roles in genome stability through undefined tasks at damaged replication forks. Various subunits within the Smc5/6 complex are substrates of Nse2, but we currently do not know the role of these modifications. Here we show that sumoylation of Smc5 is targeted to its coiled-coil domain, is upregulated by replication fork damage, and participates in bypass of DNA lesions. smc5-KR mutant cells display defects in formation of sister chromatid junctions and higher translesion synthesis. Also, we provide evidence indicating that Smc5 sumoylation modulates Mph1-dependent fork regression, acting synergistically with other pathways to promote chromosome disjunction. We propose that sumoylation of Smc5 enhances physical remodeling of damaged forks, avoiding the use of a more mutagenic tolerance pathway. Ministerio de Ciencia, Innovación y Universidades (BFU2015-71308-P, PGC2018-097796-B-I00); AGAUR-Generalitat de Catalunya (2017-SGR-569).

    Coarse-grain Load Distribution in Heterogeneous Computing

    HPC heterogeneous clusters are composed of different types of machines (from various manufacturers, with varying computational capacities) and different hardware accelerators. The most common data distribution is the equal division of the data across all the nodes. A more sophisticated data distribution policy is needed to exploit the computational capacity of the entire system.
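A minimal sketch of the non-equal policy the abstract calls for, assuming each node's relative speed has already been measured (the speeds in the example are hypothetical): each node receives a data share proportional to its computational capacity.

```python
def partition(n_items, speeds):
    """Return per-node chunk sizes proportional to `speeds`, summing to n_items."""
    total = sum(speeds)
    sizes = [n_items * s // total for s in speeds]   # floor of proportional share
    # Hand the rounding remainder (at most len(speeds)-1 items) to the
    # fastest nodes first, so the totals add up exactly.
    for i in sorted(range(len(speeds)), key=lambda i: -speeds[i]):
        if sum(sizes) == n_items:
            break
        sizes[i] += 1
    return sizes

# One GPU node 4x faster than each of two CPU nodes, plus an old node:
# partition(1000, [4, 2, 1, 1]) -> [500, 250, 125, 125]
```

The equal division criticized in the abstract is the special case `speeds = [1] * n_nodes`.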

    Seismic vulnerability and damage assessment in Navarre (NE Spain)

    A regional characterization of the seismic vulnerability of the building stock of Navarre (Northern Spain), and the expected damage associated with the ground shaking expected for a 475-year return period, is presented. Besides the initial planning meetings, the work consists of three phases. The first is the field work conducted along different routes crossing the entire region, including the main cities. Two geographical areas with distinctive construction patterns and characteristic typologies were recognised and delimited, together with a transition zone. Several buildings were sampled and documented, and empirical vulnerability distributions were obtained. The second phase relates to cadastral data exploitation and processing, the selection of parcels as working units, and the selection of municipalities and districts as representation units. Based on the age of construction and the associated seismic code requirements, the number of stories, and the empirical distributions derived in the earlier stage, statistical distributions of building vulnerability classes were composed following three vulnerability classifications: the vulnerability classification of the European Macroseismic Scale, the vulnerability index approach, and the Hazus classification. This phase was as important as it was time-consuming, and set the basis for the proper development of the subsequent analyses. The third phase consisted of calculating the expected damage with both empirical and analytical methods, using as seismic input an updated hazard-consistent seismic intensity map of the region. The vulnerability and damage results derived with the three methods are compared and analysed, and their suitability is discussed. The results of this work will be used in the regional seismic risk plan of Navarre (RISNA project).

    Methodology for an effective risk assessment of urban areas: progress and first results of the MERISUR project

    The progress and first results of MERISUR (Methodology for an Effective RISk assessment of URban areas) are presented. This project aims at developing an effective methodology for urban seismic risk assessment that provides solutions to some deficiencies detected after recent damaging events worldwide, including risk mitigation actions based on benefit/cost ratios. In a first stage, the hazard and vulnerability models are developed and improved. A procedure to determine the hazard-controlling seismogenic fault, consistent with different probability levels, is established. Methods to include active faults as individual sources and to consider near-field effects that significantly amplify ground motions are proposed. A more complete description of seismic vulnerability, encompassing structural and non-structural components, is accomplished. Vulnerability modifiers that incorporate the effects of urban parameters on vulnerability classes are also quantified. A distinction is also made between damage to structural and non-structural building elements. For this purpose, a pushover analysis is specifically carried out to model building response and damage trends in non-structural elements. This gives the primary damage. In addition, the area covered by the resulting debris is estimated both in inner spaces (within the building) and in the outer space (public roads and streets). In this way, a volume of debris will be associated with each area unit of the city, and the potential damage to exposed persons and elements, such as urban furniture and vehicles, will be assessed. This constitutes the secondary damage. A static level of occupation (buildings, urban furniture, etc.) and a dynamic level of occupation (persons, vehicles) will be assigned to each area unit of the city, thereby defining the exposure in time and space. Earthquake losses related to primary damage of building components and to secondary damage (such as damage to urban furniture and vehicles) will also be assessed.
Cost/benefit ratios of ex ante risk mitigation measures will be developed in order to decide whether risk transfer or risk retention is preferable for different risk scenarios. This analysis will confer effectiveness on the results of a seismic risk study. Overall, the estimation of earthquake losses and cost/benefit ratios are topics with little presence in the scientific literature concerning damaging earthquakes in Spain. Thus, the results of this study will provide effective solutions to the societal challenge tackled in this proposal.
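A toy version of the benefit/cost criterion mentioned above (all figures are invented): the benefit of an ex ante mitigation measure is the reduction in expected loss it buys, and a ratio above 1 favours investing in the measure.

```python
def benefit_cost_ratio(expected_loss_before, expected_loss_after, measure_cost):
    """Benefit/cost ratio of a mitigation measure.

    Benefit = reduction in expected loss for the risk scenario considered;
    a ratio > 1 means the measure saves more than it costs.
    """
    return (expected_loss_before - expected_loss_after) / measure_cost

# E.g. a retrofit that cuts expected losses from 5.0 M to 1.5 M at a cost
# of 2.0 M yields a ratio of 1.75, so the measure pays for itself.
ratio = benefit_cost_ratio(5.0, 1.5, 2.0)
```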

    Bioaccessibility of arsenic and mercury in foods with potential toxicological risk

    Arsenic (As) and mercury (Hg) are toxic trace elements whose concentrations in foods must be controlled by health authorities because of the adverse health effects associated with their dietary intake. Among the chemical species of both elements present in foods, inorganic arsenic and methylmercury are the most toxic: the International Agency for Research on Cancer (IARC) classifies inorganic arsenic as carcinogenic to humans (Group 1) and methylmercury compounds as possibly carcinogenic to humans (Group 2B). Rice, in the case of arsenic, and seafood products, especially predatory fish, in the case of mercury, are widely consumed foods that tend to accumulate high levels of these pollutants and may therefore pose a risk to consumers' health. In risk assessment, the exposure assessment stage evaluates the extent, duration, frequency and magnitude of exposure to a chemical pollutant. Since diet is the main route of entry of arsenic and mercury for humans, the evaluation of the magnitude of exposure should consider not only the influence of cooking on the concentration of the contaminant, but also its oral bioavailability, i.e. the soluble fraction of the ingested pollutant that is absorbed by the intestinal epithelium and reaches the systemic circulation, where it is available to act on the receiving organism.
A conservative tool for evaluating oral bioavailability is bioaccessibility, the ratio between the bioaccessible (soluble) concentration of a substance in the gastrointestinal environment and its total concentration in the sample. Bioaccessibility indicates the maximum amount that can be absorbed by the intestinal epithelium and is therefore used as an indicator of maximum oral bioavailability. Bioaccessibility can currently be estimated with static and dynamic in vitro gastrointestinal digestion methods that emulate the gastric and intestinal stages of human digestion. Most published studies of arsenic in rice and mercury in fish, however, report concentrations in raw products only, without considering the effect of cooking or bioaccessibility. The present thesis characterizes the concentrations of arsenic, mercury and their chemical species of toxicological interest in samples of rice and seafood products, and evaluates the effect of cooking and bioaccessibility on these concentrations and on the estimation of the risk associated with the consumption of these products.
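The bioaccessibility ratio defined in the abstract is straightforward to compute (the concentrations below are illustrative, not measured values):

```python
def bioaccessibility(soluble_conc, total_conc):
    """Bioaccessible fraction (%) = soluble concentration after simulated
    gastrointestinal digestion / total concentration in the sample * 100."""
    return 100.0 * soluble_conc / total_conc

# E.g. if 0.12 mg/kg of inorganic As is soluble after an in vitro digestion
# of cooked rice containing 0.20 mg/kg total, bioaccessibility is 60 %,
# an upper bound on the orally bioavailable fraction.
ba = bioaccessibility(0.12, 0.20)
```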

    uBench: exposing the impact of CUDA block geometry in terms of performance

    The choice of thread-block size and shape is one of the most important user decisions when a parallel problem is written for any CUDA architecture. The reason is that thread-block geometry has a significant impact on the global performance of the program. Unfortunately, the programmer does not have enough information about the subtle interactions between this choice of parameters and the underlying hardware. This paper presents uBench, a complete suite of micro-benchmarks, to explore the impact on performance of (1) the thread-block geometry choice criteria, and (2) the GPU hardware resources and configurations. Each micro-benchmark has been designed to be as simple as possible, in order to focus on a single effect derived from the hardware and the thread-block parameter choice. As an example of the capabilities of this benchmark suite, the paper shows an experimental evaluation and comparison of the Fermi and Kepler architectures. Our study reveals that, in spite of the new hardware details introduced by Kepler, the principles underlying the block geometry selection criteria are similar for both architectures. This research is partly supported by the Ministerio de Industria, Spain (CENIT OCEANLIDER), MINECO (Spain) and the European Union FEDER (MOGECOPP project TIN2011-25639, CAPAP-H network TIN2010-12011-E and TIN2011-15734-E), Junta de Castilla y León (VA172A12-2), and the HPC-EUROPA2 project (project number 228398) with the support of the European Commission (Capacities Area, Research Infrastructures Initiative).
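A back-of-the-envelope model of why block geometry matters (the SM limits below are Fermi compute-capability-2.x values; the calculation ignores registers and the other per-effect factors uBench isolates): the number of resident blocks per SM is capped jointly by threads, blocks and shared memory, so theoretical occupancy varies with block size.

```python
def blocks_per_sm(block_threads, smem_per_block=0,
                  max_threads=1536, max_blocks=8, smem_bytes=49152):
    """Resident blocks per SM: the tightest of the thread, block and
    shared-memory caps (Fermi CC 2.x limits by default)."""
    limits = [max_threads // block_threads, max_blocks]
    if smem_per_block:
        limits.append(smem_bytes // smem_per_block)
    return min(limits)

def occupancy(block_threads, smem_per_block=0, max_threads=1536):
    """Fraction of the SM's thread slots filled by resident blocks."""
    b = blocks_per_sm(block_threads, smem_per_block, max_threads)
    return b * block_threads / max_threads

# 64-thread blocks: 1536 // 64 = 24 blocks would fit by threads, but the
# 8-block cap limits occupancy to 8 * 64 / 1536 = 1/3.  192-thread blocks
# hit both caps at once and reach full occupancy.
```

This kind of interaction between block size and per-SM caps is exactly the class of effect the micro-benchmarks are designed to expose one at a time.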