
    Granularity of corporate debt [Version 9 May 2013]

    We study to what extent firms spread out their debt maturity dates across time, which we call "granularity of corporate debt." We consider the role of debt granularity using a simple model in which a firm's inability to roll over expiring debt causes inefficiencies, such as costly asset sales or underinvestment. Since multiple small asset sales are less costly than a single large one, firms may diversify debt rollovers across maturity dates. We construct granularity measures using data on corporate bond issuers for the 1991-2011 period and establish a number of novel findings. First, there is substantial variation in granularity: many firms have either very concentrated or highly dispersed maturity structures. Second, our model's predictions are consistent with the observed variation. Corporate debt maturities are more dispersed for larger and more mature firms, and for firms with better investment opportunities, higher leverage ratios, and lower current cash flows. We also show that during the recent financial crisis, firms with valuable investment opportunities in particular implemented more dispersed maturity structures. Finally, granularity plays an important role in bond issuance: we document that newly issued corporate bond maturities complement pre-existing bond maturity profiles.
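    The abstract does not spell out the paper's granularity measures; as a hedged illustration of the idea, one natural dispersion measure is an inverse Herfindahl index over the amounts maturing in each year, which is smallest when debt is concentrated at a single date and largest when it is spread evenly across dates.

```python
# Illustrative sketch only: the paper's exact granularity measure is not
# given in the abstract. One plausible proxy is an inverse Herfindahl
# index over the shares of debt maturing in each year.

def maturity_granularity(amounts_by_year):
    """Inverse Herfindahl index of the maturity profile: 1.0 when all debt
    matures in one year, up to n when it is spread evenly over n years."""
    total = sum(amounts_by_year.values())
    shares = [a / total for a in amounts_by_year.values()]
    return 1.0 / sum(s * s for s in shares)

# Concentrated profile: one large bond due in a single year.
print(maturity_granularity({2025: 500.0}))                          # 1.0
# Dispersed profile: the same face value spread over five years.
print(maturity_granularity({y: 100.0 for y in range(2025, 2030)}))  # 5.0
```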

    Granular technologies to accelerate decarbonization

    Of the 45 energy technologies deemed critical by the International Energy Agency for meeting global climate targets, 38 need to improve substantially in cost and performance while accelerating deployment over the coming decades. Low-carbon technological solutions vary in scale from solar panels, e-bikes, and smart thermostats to carbon capture and storage, light rail transit, and whole-building retrofits. We make three contributions to long-standing debates on the appropriate scale of technological responses in the energy system. First, we focus on the specific needs of accelerated low-carbon transformation: rapid technology deployment, escaping lock-in, and social legitimacy. Second, we synthesize evidence on energy end-use technologies in homes, transport, and industry, as well as electricity generation and energy supply. Third, we go beyond technical and economic considerations to include innovation, investment, deployment, social, and equity criteria for assessing the relative advantage of alternative technologies as a function of their scale. We suggest numerous potential advantages of more-granular energy technologies for accelerating progress toward climate targets, as well as the conditions on which such progress depends.

    Giving patients granular control of personal health information: Using an ethics ‘Points to Consider’ to inform informatics system designers

    Objective: There are benefits and risks to giving patients more granular control of their personal health information in electronic health record (EHR) systems. When designing EHR systems and policies, informaticists and system developers must balance these benefits and risks, and ethical considerations should be an explicit part of this balancing. Our objective was to develop a structured ethics framework to accomplish this. Methods: We reviewed the existing literature on the ethical and policy issues, developed an ethics framework called a "Points to Consider" (P2C) document, and convened a national expert panel to review and critique the P2C. Results: We developed the P2C to aid informaticists designing an advanced query tool for an EHR system in Indianapolis. The P2C consists of six questions ("Points") that frame important ethical issues, apply accepted principles of bioethics and Fair Information Practices, comment on how the questions might be answered, and address implications for patient care. Discussion: The P2C is intended to clarify what is at stake when designers try to accommodate potentially competing ethical commitments and logistical realities. It was developed to guide informaticists designing a query tool, within an existing EHR, that would permit patient granular control. While consideration of ethical issues is coming to the forefront of medical informatics design and development practice, more reflection is needed to facilitate optimal collaboration between designers and ethicists. This report contributes to that discussion.

    A cognitive architecture for emergency response

    Plan recognition, cognitive workload estimation, and human assistance have been extensively studied in the AI and human factors communities, resulting in many techniques being applied to domains of varying levels of realism. These techniques have seldom been integrated and evaluated as complete systems. In this paper, we report on the development of an assistant agent architecture that integrates plan recognition, current and future user information needs, workload estimation, and adaptive information presentation to aid an emergency response manager in making high-quality decisions under time stress while avoiding cognitive overload. We describe the main components of a full implementation of this architecture as well as a simulation developed to evaluate the system. Our evaluation consists of simulating various possible executions of emergency response plans used in the real world and measuring the expected time taken by an unaided human user, as well as by one who receives information assistance from our system. In the agent-assistance condition, we also examine the effects of different error rates in the agent's estimation of the user's state or information needs.
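    The abstract describes the integration only at a high level; a minimal sketch of the kind of decision rule such an assistant might apply (the names, fields, and thresholds below are hypothetical, not the paper's implementation) combines a plan-recognition estimate of whether information will be needed with a workload estimate that gates when it is shown.

```python
# Conceptual sketch, not the paper's implementation: combine a
# plan-recognition probability that an item will be needed with a
# workload estimate that gates when it is presented.

from dataclasses import dataclass

@dataclass
class InfoItem:
    name: str
    need_probability: float  # from plan recognition (hypothetical field)
    urgency: float           # 0..1, how soon it is needed

def should_present(item: InfoItem, workload: float,
                   workload_ceiling: float = 0.8,
                   relevance_floor: float = 0.5) -> bool:
    """Present an item only if it is likely needed and the user's
    estimated cognitive workload leaves room to absorb it."""
    relevance = item.need_probability * item.urgency
    return relevance >= relevance_floor and workload < workload_ceiling

# A highly relevant item is shown while workload is moderate...
print(should_present(InfoItem("casualty report", 0.9, 0.8), workload=0.6))
# ...but deferred when the user is already near overload.
print(should_present(InfoItem("casualty report", 0.9, 0.8), workload=0.9))
```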

    Taking advantage of hybrid systems for sparse direct solvers via task-based runtimes

    The ongoing hardware evolution exhibits an escalation in the number, as well as in the heterogeneity, of computing resources. The pressure to maintain reasonable levels of performance and portability forces application developers to leave traditional programming paradigms and explore alternative solutions. PaStiX is a parallel sparse direct solver based on a dynamic scheduler for modern hierarchical manycore architectures. In this paper, we study the benefits and limits of replacing the highly specialized internal scheduler of the PaStiX solver with two generic runtime systems: PaRSEC and StarPU. The task graph of the factorization step is made available to the two runtimes, giving them the opportunity to process and optimize its traversal in order to maximize the algorithm's efficiency on the targeted hardware platform. A comparative study of the performance of the PaStiX solver on top of its native internal scheduler and the PaRSEC and StarPU frameworks, in different execution environments, is performed. The analysis highlights that these generic task-based runtimes achieve results comparable to the application-optimized embedded scheduler on homogeneous platforms. Furthermore, they are able to significantly speed up the solver in heterogeneous environments by taking advantage of the accelerators while hiding the complexity of their efficient use from the programmer. Comment: Heterogeneity in Computing Workshop (2014)
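    As a hedged illustration of the task-graph idea (not PaStiX, PaRSEC, or StarPU code), the sketch below executes a small dependency graph by releasing each task once its prerequisites finish, which is what a generic runtime does when handed the factorization DAG.

```python
# Conceptual sketch (not PaStiX/PaRSEC/StarPU code): a runtime handed a
# task graph runs each task once all its dependencies complete, leaving
# it free to reorder and map ready tasks onto the available hardware.

from collections import deque

def run_task_graph(tasks, deps):
    """tasks: {name: callable}; deps: {name: set of prerequisite names}."""
    remaining = {t: set(d) for t, d in deps.items()}
    dependents = {t: set() for t in tasks}
    for t, pre in deps.items():
        for p in pre:
            dependents[p].add(t)
    ready = deque(t for t, pre in remaining.items() if not pre)
    while ready:
        t = ready.popleft()   # a real runtime chooses among ready tasks
        tasks[t]()
        for d in dependents[t]:
            remaining[d].discard(t)
            if not remaining[d]:
                ready.append(d)

# Tiny stand-in for a factorization DAG: a panel factorization, two
# independent updates, then a step that needs both.
tasks = {n: (lambda n=n: print("run", n))
         for n in ["factor", "update1", "update2", "solve"]}
deps = {"factor": set(), "update1": {"factor"}, "update2": {"factor"},
        "solve": {"update1", "update2"}}
run_task_graph(tasks, deps)
```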

    The Opacity of Nearby Galaxies from Counts of Background Galaxies: II. Limits of the Synthetic Field Method

    Recently, we have developed and calibrated the Synthetic Field Method (SFM) to derive the total extinction through disk galaxies. The method is based on the number counts and colors of distant background field galaxies that can be seen through the foreground object, and has been successfully applied to NGC 4536 and NGC 3664, two late-type galaxies located at 16 and 11 Mpc, respectively. Here, we study the applicability of the SFM to HST images of galaxies in the Local Group, and show that background galaxies cannot be easily identified through these nearby objects, even with the best resolution available today. In the case of M 31, each pixel in the HST images contains 50 to 100 stars, and the background galaxies cannot be seen because of the intrinsic granularity due to strong surface brightness fluctuations. In the LMC, on the other hand, there is only about one star every six linear pixels, and the lack of detectable background galaxies results from a "secondary" granularity introduced by structure in the wings of the point spread function. The success of the SFM in NGC 4536 and NGC 3664 is a natural consequence of the reduction of the intensity of surface brightness fluctuations with distance. When the dominant confusion factor is structure in the PSF wings, as is the case for HST images of the LMC and would be the case for M 31 images obtained with a 10-m diffraction-limited optical telescope, it becomes possible in principle to improve the detectability of background galaxies by subtracting the stars in the foreground object. However, a much better characterization of optical PSFs than is currently available would be required for an adequate subtraction of the wings. Given the importance of determining the dust content of Local Group galaxies, efforts should be made in that direction. Comment: 45 pages, 10 PostScript figures
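    The calibrated SFM relation is not given in this abstract; as a hedged sketch of the count-based idea, if the cumulative background counts rise with magnitude with slope gamma (N proportional to 10^(gamma*m)), then A magnitudes of foreground extinction suppress the counts by a factor 10^(-gamma*A), and comparing observed to synthetic-field counts yields A.

```python
import math

# Illustrative sketch of the counting argument behind the SFM; the real
# method calibrates against simulated "synthetic fields" rather than this
# closed form. Assumes cumulative counts N(<m) ~ 10**(gamma * m), with
# gamma a hypothetical count slope.

def extinction_from_counts(n_observed, n_synthetic, gamma=0.4):
    """Extinction (magnitudes) implied by the deficit of background
    galaxies seen through the foreground disk."""
    return -math.log10(n_observed / n_synthetic) / gamma

# Seeing 40 galaxies where an extinction-free synthetic field shows 100
# implies roughly one magnitude of extinction for gamma = 0.4.
print(round(extinction_from_counts(40, 100), 2))
```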

    Modulation of 5-Aminolevulinic acid mediated photodynamic therapy induced cell death in a human lung adenocarcinoma cell line

    Photodynamic therapy (PDT) is a cancer treatment involving the administration of a photosensitising drug that selectively accumulates in tumor tissue, followed by irradiation with light of the appropriate wavelength. This triggers photochemical reactions that induce production of reactive oxygen species (ROS), with consequent cellular damage that ultimately leads to cell death. Porphyrins are the only photosensitizers (PSs) synthesized endogenously, by means of administration of the biological precursor 5-aminolevulinic acid (ALA). Several antioxidants and ROS scavenger agents (reduced glutathione (GSH), mannitol (Man), L-tryptophan (Trp), ascorbate (Asc) and trolox (Trx)) were assayed to determine their ability to modulate ALA-based PDT (ALA-PDT). ALA-PDT was performed on A549 human lung adenocarcinoma cells by incubating with 1 mM ALA for 3 hr followed by irradiation, with or without 1 hr pre-incubation with the modulators. The modulators were previously tested for possible cytotoxicity/photoactivity at concentrations ranging from 0.01 to 20 mM. The ratio between cell survival after ALA-PDT in the presence and in the absence of the scavenger agent (the protection grade, PG) was determined, and the concentration showing no cytotoxicity/photoactivity while providing the highest PG was used in the subsequent experiments. ALA-PDT alone induced a high percentage of apoptotic cell death (98.4 ± 3.5%), as revealed by acridine orange/ethidium bromide staining and Annexin V-FITC/propidium iodide labelling. Pre-incubation with the modulators at their highest-PG concentrations significantly reduced apoptotic cells to 48.3 ± 2.7% (Asc), 58.8 ± 4.2% (Trx), 78.5 ± 3.1% (GSH), 64.3 ± 1.6% (Man) and 74.6 ± 2.3% (Trp). ROS involvement in early cell death induction after ALA-PDT was tested by flow cytometry using the fluorescent probes dihydro-dichlorofluorescein diacetate (H2-DCFDA) and methoxyvinylpyrene (MVP) for detection of peroxides and singlet oxygen, respectively. ROS production increased after ALA-PDT (H2-DCFDA-positive cells, control: 1.1 ± 0.1%; 10 min PDT: 69.3 ± 5.6%; MVP-positive cells, control: 0.65 ± 0.35%; 10 min PDT: 83.5 ± 1.9%). Asc prevented peroxide formation (H2-DCFDA-positive cells: 50.7 ± 2.8%) and largely prevented the singlet oxygen increase (MVP-positive cells: 25.4 ± 5.2%), whereas Trx limited peroxide formation (H2-DCFDA-positive cells: 20.8 ± 0.5%) but did not significantly affect singlet oxygen production (MVP-positive cells: 73.6 ± 3.4%). The selective scavenger-mediated protection against PDT-induced cell death, together with direct detection of specific pro-oxidative agents, indicates a strong involvement of ROS in ALA-PDT-mediated tumor eradication, suggesting that undesired photodamage to normal tissue might be attenuated by administration of antioxidant agents.
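    The protection grade defined in the abstract is a simple ratio; a minimal sketch of the calculation (with made-up survival fractions, not the paper's data) is:

```python
# Minimal sketch of the protection grade (PG) defined above: the ratio of
# cell survival after ALA-PDT with a scavenger present to survival without
# it. The survival fractions below are invented illustrative numbers.

def protection_grade(survival_with_scavenger, survival_without):
    return survival_with_scavenger / survival_without

# If 40% of cells survive PDT with ascorbate but only 10% without it,
# the scavenger's protection grade is 4.0 (PG > 1 means protection).
print(protection_grade(0.40, 0.10))
```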

    Anytime Cognition: An information agent for emergency response

    Planning under pressure in time-constrained environments while relying on uncertain information is a challenging task. This is particularly true when planning the response to an ongoing disaster in an urban area, whether a natural disaster or a deliberate attack on the civilian population. As the various activities of the emergency response must be coordinated in response to multiple reports from the disaster site, users quickly become cognitively overloaded. To address this issue, we designed the Anytime Cognition (ANTICO) concept to assist human users working in time-constrained environments by maintaining a manageable level of cognitive workload over time. Based on the ANTICO concept, we develop an agent framework for proactively managing a user's changing information requirements by integrating information management techniques with probabilistic plan recognition. In this paper, we describe a prototype emergency response application in the context of a subset of the attack scenarios devised by the U.S. Department of Homeland Security.
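    The abstract names probabilistic plan recognition without detail; a hedged sketch of the underlying idea (not ANTICO's actual model) is a Bayesian update of a belief over candidate plans as observations of user actions arrive. The plans and likelihoods below are invented for illustration.

```python
# Hedged sketch of probabilistic plan recognition: maintain a belief over
# candidate response plans and update it with Bayes' rule as user actions
# are observed. All plan names and numbers are invented.

def update_belief(belief, likelihoods):
    """belief: {plan: P(plan)}; likelihoods: {plan: P(observation | plan)}."""
    posterior = {p: belief[p] * likelihoods.get(p, 0.0) for p in belief}
    total = sum(posterior.values())
    return {p: v / total for p, v in posterior.items()}

belief = {"evacuate": 0.5, "shelter_in_place": 0.5}
# Observing the user query bus availability is far more likely under an
# evacuation plan, so the belief shifts toward it.
belief = update_belief(belief, {"evacuate": 0.8, "shelter_in_place": 0.1})
print(belief)  # ~{'evacuate': 0.89, 'shelter_in_place': 0.11}
```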

    Developing an open data portal for the ESA climate change initiative

    We introduce the rationale for, and architecture of, the European Space Agency Climate Change Initiative (CCI) Open Data Portal (http://cci.esa.int/data/). The Open Data Portal hosts a richly diverse set of datasets – 13 "Essential Climate Variables" – from the CCI programme in a consistent and harmonised form, providing a single point of access to the (>100 TB) data for broad dissemination to an international user community. These data have been produced by a range of institutions and vary in both scientific and spatio-temporal characteristics. This heterogeneity, together with the range of services to be supported, presented significant technical challenges. An iterative development methodology was key to tackling them: the system exploits a workflow that takes data conforming to the CCI data specification, ingests it into a managed archive, and uses both manual and automatically generated metadata to support data discovery, browse, and delivery services. It utilises both Earth System Grid Federation (ESGF) data nodes and the Open Geospatial Consortium Catalogue Service for the Web (OGC-CSW) interface, serving data into both the ESGF and the Global Earth Observation System of Systems (GEOSS). A key part of the system is a new vocabulary server, populated with CCI-specific terms and relationships, which ties the OGC-CSW and ESGF search services together and was developed through a dialogue between domain scientists and linked-data specialists. These services have enabled the development of a unified user interface for graphical search and visualisation – the CCI Open Data Portal Web Presence.
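    As a hedged example of how a client might discover datasets through an OGC-CSW interface like the portal's, the sketch below uses the OWSLib library; the endpoint URL is a placeholder, not the portal's documented address.

```python
# Hedged example of searching an OGC-CSW catalogue such as the one the
# CCI Open Data Portal exposes, using the OWSLib library. The endpoint
# URL is a placeholder, not a documented address.

from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/csw")  # placeholder endpoint
query = PropertyIsLike("csw:AnyText", "%sea surface temperature%")
csw.getrecords2(constraints=[query], maxrecords=10)

# Print the identifier and title of each matching catalogue record.
for identifier, record in csw.records.items():
    print(identifier, "-", record.title)
```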

    Lanczos eigensolution method for high-performance computers

    The theory, computational analysis, and applications of a Lanczos algorithm on high-performance computers are presented. The computationally intensive steps of the algorithm are identified as the matrix factorization, the forward/backward equation solution, and the matrix-vector multiplies. These steps are optimized to exploit the vector and parallel capabilities of high-performance computers. The savings in computational time from applying optimization techniques such as variable-band and sparse data storage and access, loop unrolling, use of local memory, and compiler directives are presented. Two large-scale structural analysis applications are described: the buckling of a composite blade-stiffened panel with a cutout, and the vibration analysis of a high-speed civil transport. The sequential computational time for the panel problem, executed on a CONVEX computer, was decreased from 181.6 seconds to 14.1 seconds with the optimized vector algorithm. The best computational time for the transport problem, with 17,000 degrees of freedom, was 23 seconds on the Cray Y-MP using an average of 3.63 processors.
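    For readers unfamiliar with the method, the core Lanczos iteration reduces a symmetric matrix to a small tridiagonal one whose extreme eigenvalues approximate those of the original; a minimal dense NumPy sketch (a textbook version, without the shifting, factorization, and reorthogonalization a production structural solver needs) is:

```python
import numpy as np

# Minimal Lanczos sketch: build a k x k tridiagonal matrix T whose extreme
# eigenvalues approximate those of a symmetric matrix A. A production
# structural-analysis solver adds spectral shifts, factorization of the
# shifted operator, and full reorthogonalization, none of which is shown.

def lanczos(A, k, rng=np.random.default_rng(0)):
    n = A.shape[0]
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)                 # normalized starting vector
    q_prev = np.zeros(n)
    alphas, betas = [], []
    beta = 0.0
    for _ in range(k):
        w = A @ q - beta * q_prev          # the matrix-vector multiply step
        alpha = q @ w
        w -= alpha * q                     # orthogonalize against current vector
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        q_prev, q = q, w / beta
    T = np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
    return np.linalg.eigvalsh(T)           # Ritz values, ascending

A = np.diag(np.arange(1.0, 101.0))         # toy symmetric matrix
print(lanczos(A, 20)[-1])                  # approximates the largest eigenvalue, 100
```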