
    Finitary languages

    The class of ω-regular languages provides a robust specification language in verification. Every ω-regular condition can be decomposed into a safety part and a liveness part. The liveness part ensures that something good happens "eventually". Finitary liveness was proposed by Alur and Henzinger as a stronger formulation of liveness: it requires that there exist an unknown, fixed bound b such that something good happens within b transitions. In this work we consider automata with finitary acceptance conditions defined by finitary Büchi, parity and Streett languages. We study the languages expressible by such automata: we give their topological complexity and present a regular-expression characterization. We compare the expressive power of finitary automata and give optimal algorithms for the classical decision questions. We show that the finitary languages are Σ₂-complete; we present a complete picture of the expressive power of various classes of automata with finitary and infinitary acceptance conditions; we show that the languages defined by finitary parity automata exactly characterize the star-free fragment of ωB-regular languages; and we show that emptiness is NLOGSPACE-complete, while universality and language inclusion are PSPACE-complete, for finitary parity and Streett automata.
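    For reference, the "bound b" requirement has a standard formalization in the finitary-games literature in terms of the wait until the next accepting state; the following is a sketch in our notation, not text from the paper.

```latex
% Sketch of the standard definitions (our notation, not quoted from the paper).
% For a run \rho = q_0 q_1 q_2 \ldots of an automaton with accepting states F, let
%   \mathrm{dist}_F(\rho, i) = \min \{\, j - i : j \ge i,\ q_j \in F \,\}
% (with \min \emptyset = \infty) be the wait until the next visit to F. Then:
\[
  \text{B\"uchi: } \forall i.\ \mathrm{dist}_F(\rho, i) < \infty
  \qquad
  \text{finitary B\"uchi: } \limsup_{i \to \infty} \mathrm{dist}_F(\rho, i) < \infty
\]
```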

    The Implementation of IMF Programs: A Conceptual Framework

    IMF-supported programs have conventionally been assessed by examining their effects on intermediate variables and final outcomes. More recently, greater attention has been paid to implementation, on the assumption that programs need to be implemented in order to work. Empirical studies have begun to include political economy variables in an attempt to explain implementation, using the concept of ‘ownership’ to provide a theoretical foundation. This paper provides an alternative conceptual framework based on the marginal benefits and costs of implementation. It goes on to discuss policies that might be expected to improve implementation based on this framework.
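    The marginal-analysis framing can be stated compactly; the following first-order condition is a sketch in our notation, not an equation from the paper.

```latex
% Sketch (our notation, not the paper's). Let i be the degree of
% implementation, B(i) the benefits and C(i) the costs to the government of
% implementing to degree i. With B concave and C convex, the chosen degree
% of implementation i^* satisfies the first-order condition
\[
  B'(i^*) = C'(i^*),
\]
% so policies that raise the marginal benefits or lower the marginal costs
% of implementation should raise the equilibrium degree i^*.
```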

    Methodologies for an improved prediction of the isotopic content in high burnup samples. Application to Vandellos-II reactor core

    Fuel cycles are designed with the aim of extracting as much energy from the fuel as possible. Since higher burnup values are now reached, it is necessary to improve disposal designs, which have traditionally been based on the conservative assumption that they contain fresh fuel. The criticality calculations involved must take burnup into account, making the most of the experimental and computational capabilities developed, respectively, to measure and to predict the isotopic content of spent nuclear fuel. These high-burnup scenarios motivate a review of the computational tools to identify possible weaknesses in the nuclear data libraries, in the methodologies applied and in their applicability range. Experimental measurements of spent nuclear fuel provide the ideal framework for benchmarking the best-known and most established codes in both industry and academic research. For the present paper, SCALE 6.0/TRITON and MONTEBURNS 2.0 have been chosen to follow the isotopic content of four samples irradiated in the Spanish Vandellós-II pressurized water reactor up to burnup values ranging from 40 GWd/MTU to 75 GWd/MTU. By comparison with the experimental data reported for these samples, we can test the applicability of these codes to high-burnup problems. We have developed new computational tools within MONTEBURNS 2.0 that make it possible to handle an irradiation history including geometrical and positional changes of the samples within the reactor core. This paper describes the irradiation scenario against which these codes and our capabilities are to be benchmarked.
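    At the core of such depletion codes is the integration of the Bateman equations for the nuclide inventory. A minimal sketch for a hypothetical two-nuclide chain under constant flux is shown below; the constants are illustrative, and this is not the actual SCALE/TRITON or MONTEBURNS model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal Bateman-equation sketch for a two-nuclide chain A -> B under a
# constant neutron flux. All constants are illustrative, not reactor data.
lam_A, lam_B = 1e-9, 2e-10      # decay constants [1/s]
sig_A, sig_B = 5e-24, 1e-24     # absorption cross sections [cm^2]
phi = 3e14                      # neutron flux [n/(cm^2 s)]

def bateman(t, N):
    NA, NB = N
    dNA = -(lam_A + sig_A * phi) * NA              # A is only destroyed
    dNB = lam_A * NA - (lam_B + sig_B * phi) * NB  # B is fed by A's decay
    return [dNA, dNB]

# Irradiate for one year and report the final inventories.
sol = solve_ivp(bateman, (0.0, 3.15e7), [1e24, 0.0], rtol=1e-8)
print(sol.y[:, -1])
```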

    Anisotropic Fast-Marching on cartesian grids using Lattice Basis Reduction

    We introduce a modification of the Fast Marching algorithm, which solves the generalized eikonal equation associated with an arbitrary continuous Riemannian metric on a two- or three-dimensional domain. The algorithm has logarithmic complexity in the maximum anisotropy ratio of the Riemannian metric, which allows it to handle extreme anisotropies at a reduced numerical cost. We prove the consistency of the algorithm and illustrate its efficiency with numerical experiments. The algorithm relies on the computation, at each grid point, of a special system of coordinates: a reduced basis of the Cartesian grid with respect to the symmetric positive definite matrix encoding the desired anisotropy at that point.
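    In two dimensions, the reduced-basis computation the abstract refers to can be illustrated by Lagrange-Gauss lattice basis reduction with respect to the norm induced by the SPD matrix M; the following is our own minimal sketch, not the paper's code.

```python
import numpy as np

def reduced_basis_2d(M):
    """Lagrange-Gauss reduction of the lattice Z^2 with respect to the
    norm |w|_M = sqrt(w^T M w) induced by an SPD matrix M. Returns lattice
    vectors (u, v) with |u|_M <= |v|_M such that no integer multiple of u
    can be subtracted from v to make it shorter."""
    norm2 = lambda w: float(w @ M @ w)
    u, v = np.array([1, 0]), np.array([0, 1])
    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # Subtract the integer multiple of u that minimizes |v - k*u|_M.
        k = round(float(u @ M @ v) / norm2(u))
        v = v - k * u
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u  # keep the shorter vector first and iterate

# Example: a strongly anisotropic metric.
M = np.array([[100.0, 99.0], [99.0, 100.0]])
print(reduced_basis_2d(M))
```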

    The Implementation of IMF Programmes: A Conceptual Framework and a Policy Agenda

    The success of IMF-supported programmes has conventionally been assessed by examining their effects on intermediate variables, such as fiscal deficits, monetary growth and exchange rates, and on final outcomes, such as the balance of payments, inflation and economic growth. However, little or no distinction has been made between those countries that implement the conditions incorporated into programmes and those that do not. More recently, greater attention has been paid to implementation, on the assumption that programmes need to be implemented in order to work. Empirical studies have begun to include political economy variables in an attempt to explain implementation, using the concept of ‘ownership’ to provide a theoretical framework. This paper provides an alternative conceptual framework based on the marginal benefits and costs of implementation. It goes on to discuss a range of policies that might be expected to improve implementation.

    Data-Driven Robust Optimization

    The last decade witnessed an explosion in the availability of data for operations research applications. Motivated by this growing availability, we propose a novel schema for utilizing data to design uncertainty sets for robust optimization using statistical hypothesis tests. The approach is flexible and widely applicable, and robust optimization problems built from our new sets are computationally tractable, both theoretically and practically. Furthermore, optimal solutions to these problems enjoy a strong, finite-sample probabilistic guarantee. We describe concrete procedures for choosing an appropriate set for a given application and for applying our approach to multiple uncertain constraints. Computational evidence in portfolio management and queueing confirms that our data-driven sets significantly outperform traditional robust optimization techniques whenever data is available.
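    To make the idea concrete, here is a minimal sketch of a data-driven uncertainty set: a simple quantile-based box built from samples, with a worst-case check of one linear constraint. This is our illustration only; the paper derives its sets from statistical hypothesis tests, which this box construction merely gestures at.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n i.i.d. samples of an uncertain coefficient vector a.
# (Made-up distribution, for demonstration only.)
n, d = 500, 3
data = rng.normal(loc=[1.0, 2.0, 0.5], scale=0.2, size=(n, d))

# Box uncertainty set U = [lo, hi] from componentwise empirical quantiles,
# chosen so U covers each component with probability about 1 - eps.
eps = 0.05
lo = np.quantile(data, eps / 2, axis=0)
hi = np.quantile(data, 1 - eps / 2, axis=0)

def worst_case(x, lo, hi):
    """max_{a in U} a^T x over the box U: each a_i takes whichever
    endpoint maximizes a_i * x_i."""
    return float(np.sum(np.where(x >= 0, hi * x, lo * x)))

# Robust feasibility check of a^T x <= b for a candidate decision x.
x, b = np.array([1.0, -0.5, 2.0]), 3.0
print(worst_case(x, lo, hi) <= b)
```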

    Vertical Examination of Reading Environment and Student Engagement in 1st-3rd Grade Classrooms

    The purpose of this study was to examine the relationship between the instructional environment and student engagement during reading instruction. Environment is composed of three key elements: teacher attributes, instructional methods, and the physical classroom setting (Blair, Rupley, & Nichols, 2007; De Naeghel, Van Keer, Vansteenkiste, & Rosseel, 2012; Guthrie, Hoa, Wigfield, Tonks, & Perencevich, 2006; Housand & Reis, 2008). This study examined a first-, second-, and third-grade classroom in one East Tennessee school. Qualitative data were collected through a combination of instructional observations and teacher interviews in order to examine existing practices for successfully engaging young readers. The teacher of each classroom was interviewed; following the interview, each teacher’s classroom was observed three times to examine the teacher’s attributes and most frequently used instructional methods, the physical classroom setting, and the expressed level of engagement of the students in the classroom. The findings indicate that environment, in terms of teacher attributes, instructional methods, and physical classroom setting, affects student reading engagement; classrooms with high levels of organization, novel reading areas, and opportunities for students to select reading material were found particularly effective for promoting reading engagement.

    Effects of process parameters on structure and properties of melt-blown poly(lactic acid) nonwovens for skin regeneration

    Skin regeneration requires a three-dimensional (3D) scaffold for cell adhesion, growth and proliferation. One type of scaffold offering a 3D structure is a nonwoven material produced via the melt-blown technique, whose process parameters can be adapted to improve the cellular response. Polylactic acid (PLA) was used to produce nonwoven scaffolds by the melt-blown technique. The key process parameters, i.e., the head and air temperatures, were varied in the range of 180–270 °C to obtain eight different materials (MB1–MB8). The relationships between the process parameters, morphology, porosity, thermal properties and the cellular response were explored in this study. The mean fiber diameters ranged from 3 to 120 µm. The average material roughness values were between 47 and 160 µm, whereas the pore diameters ranged from 5 to 400 µm. The calorimetry thermograms revealed a correlation between the temperature parameters and crystallization. Keratinocytes and macrophages exhibited higher viability on thicker fibers. The cell-scaffold interaction was observed via SEM after 7 days. These results prove that the features of melt-blown nonwoven scaffolds depend on processing parameters such as the head and air temperatures. Based on these examinations, the most suitable scaffolds for skin tissue regeneration were selected.