700 research outputs found
Existence of Monetary Steady States in a Matching Model: Indivisible Money
Existence of a monetary steady state is established for a random matching model with divisible goods, indivisible money, and take-it-or-leave-it offers by consumers. There is no restriction on individual money holdings. The background environment is that in papers by Shi and by Trejos and Wright. The monetary steady state shown to exist has nice properties: the value function, defined on money holdings, is increasing and strictly concave, and the measure over money holdings has full support.
Optimizing the computation of overriding
We introduce optimization techniques for reasoning in DLN---a recently
introduced family of nonmonotonic description logics whose characterizing
features appear well-suited to model the applicative examples naturally arising
in biomedical domains and semantic web access control policies. Such
optimizations are validated experimentally on large KBs with more than 30K
axioms. Speedups exceed one order of magnitude. For the first time, response
times compatible with real-time reasoning are obtained with nonmonotonic KBs of
this size.
Computing FO-Rewritings in EL in Practice: from Atomic to Conjunctive Queries
A prominent approach to implementing ontology-mediated queries (OMQs) is to
rewrite them into a first-order query, which is then executed using a conventional
SQL database system. We consider the case where the ontology is formulated in
the description logic EL and the actual query is a conjunctive query and show
that rewritings of such OMQs can be efficiently computed in practice, in a
sound and complete way. Our approach combines a reduction with a decomposed
backwards-chaining algorithm for OMQs based on the simpler atomic queries,
which also illuminates the relationship between first-order rewritings of
OMQs based on conjunctive and on atomic queries. Experiments with real-world
ontologies show promising results.
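The core of backwards chaining for atomic queries can be illustrated with a minimal sketch (not the paper's algorithm, and with an invented toy ontology): under concept inclusions of the form A ⊑ B, the rewriting of an atomic query B(x) is the union of B with everything subsumed by it, which a SQL engine can then evaluate as a union of atoms.

```python
# Minimal sketch of first-order rewriting of an atomic query under
# EL-style concept inclusions, via backward chaining. The ontology and
# concept names below are illustrative, not taken from the paper.

def rewrite_atomic(query_concept, inclusions):
    """Return the query concept plus every concept subsumed by it."""
    result = {query_concept}
    frontier = [query_concept]
    while frontier:
        c = frontier.pop()
        for sub, sup in inclusions:      # axiom: sub ⊑ sup
            if sup == c and sub not in result:
                result.add(sub)
                frontier.append(sub)
    return result

# Toy ontology: Professor ⊑ Teacher, Lecturer ⊑ Teacher, AssocProf ⊑ Professor
inclusions = [("Professor", "Teacher"),
              ("Lecturer", "Teacher"),
              ("AssocProf", "Professor")]

# The rewriting of Teacher(x) is the union of these atoms:
print(sorted(rewrite_atomic("Teacher", inclusions)))
# → ['AssocProf', 'Lecturer', 'Professor', 'Teacher']
```

For conjunctive queries the rewriting must additionally decompose the query into atoms, which is where the paper's reduction comes in.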
Real time, confocal imaging of Ca2+ waves in arterially perfused rat hearts
Objective: The aim of this study was to characterize the spatio-temporal dynamics of slow Ca2+ waves (SCWs) with cellular resolution in the arterially perfused rat heart. Methods: Wistar rat hearts were Langendorff-perfused with Tyrode solution containing bovine albumin and Dextran. The heart was loaded with the Ca2+-sensitive dye Fluo-3 AM. Intracellular fluorescence changes reflecting changes in [Ca2+]i were recorded from subepicardial tissue layers using a slit-hole confocal microscope with an image-intensified video camera system at image rates of up to 50/s. Results: SCWs appeared spontaneously during cardiac rest or after trains of electrical stimuli. They were initiated preferentially in the center third of the cell and propagated to the cell borders, suggesting a relation between the cell nucleus and wave initiation. They were suppressed by Ca2+ transients, and their probability of occurrence increased with the Ca2+ resting level. Propagation velocity within myocytes (40 to 180 μm/s) decreased with the resting Ca2+ level. Intercellular propagation was mostly confined to two or three cells and occurred bi-directionally. Intercellular unidirectional conduction block and facilitation of SCWs were occasionally observed. On average, 10 to 20% of cells showed non-synchronized simultaneous SCWs within a given area in the myocardium. Conclusions: SCWs occurring at increased levels of [Ca2+]i in normoxic or ischemic conditions are mostly confined to two or three cells in the ventricular myocardium. Spatio-temporal summation of changes in membrane potential caused by individual SCWs may underlie the generation of triggered electrical ectopic impulses.
Pyrroloquinoline quinone and molecules mimicking its functional domains: Modulators of connective tissue formation?
Changing the surface properties of biodegradable polymers by radio-frequency magnetron sputtering modification
Insulin Glargine in the Intensive Care Unit: A Model-Based Clinical Trial Design
Online 4 Oct 2012
Introduction: Current successful AGC (Accurate Glycemic Control) protocols require extra clinical effort and are impractical in less acute wards, where patients are still susceptible to stress-induced hyperglycemia. The long-acting insulin Glargine has the potential to be used in a low-effort controller. However, potential variability in its efficacy and length of action prevents direct in-hospital use in an AGC framework for less acute wards.
Method: Clinically validated virtual trials, based on data from stable ICU patients in the SPRINT cohort who would be transferred to such an approach, are used to develop a 24-hour AGC protocol robust to different Glargine potencies (1.0x, 1.5x and 2.0x regular insulin) and initial dose sizes (dose = total insulin over the prior 12, 18 or 24 hours). Glycemic control in this period is provided only by varying nutritional inputs. Performance is assessed as %BG in the 4.0-8.0 mmol/L band, and safety as %BG < 4.0 mmol/L.
Results: The final protocol consisted of a Glargine bolus size equal to the insulin given over the previous 18 hours. Compared to SPRINT, there was a 6.9%-9.5% absolute decrease in mild hypoglycemia (%BG < 4.0 mmol/L) and up to a 6.2% increase in %BG between 4.0 and 8.0 mmol/L. When the efficacy is known (1.5x assumed), there were reductions of 27% in BG measurements, 59% in insulin boluses, 67% in nutrition changes, and 6.3% absolute in mild hypoglycemia.
Conclusion: A robust 24-48 hour clinical trial has been designed to safely investigate the efficacy and kinetics of Glargine as a first step towards developing a Glargine-based protocol for less acute wards. Ensuring robustness to variability in Glargine efficacy significantly affects the performance and safety that can be obtained.
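The two outcome metrics used above are straightforward to compute; the following sketch shows them on made-up blood-glucose (BG) data, not data from the study.

```python
# Hedged sketch of the trial's two metrics: the fraction of BG
# measurements inside the 4.0-8.0 mmol/L target band (performance) and
# the fraction below 4.0 mmol/L, i.e. mild hypoglycemia (safety).
# The sample measurements are invented for illustration.

def bg_metrics(bg_values, lo=4.0, hi=8.0):
    """Return (%BG in [lo, hi], %BG below lo) for a list of measurements."""
    n = len(bg_values)
    in_band = sum(lo <= v <= hi for v in bg_values)
    hypo = sum(v < lo for v in bg_values)
    return 100.0 * in_band / n, 100.0 * hypo / n

bg = [5.2, 6.8, 3.7, 7.9, 9.4, 4.1, 6.0, 3.9, 5.5, 8.2]
pct_band, pct_hypo = bg_metrics(bg)
print(f"%BG in 4.0-8.0 mmol/L: {pct_band:.0f}%, %BG < 4.0 mmol/L: {pct_hypo:.0f}%")
# → %BG in 4.0-8.0 mmol/L: 60%, %BG < 4.0 mmol/L: 20%
```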
Get my pizza right: Repairing missing is-a relations in ALC ontologies (extended version)
With the increased use of ontologies in semantically-enabled applications,
the issue of debugging defects in ontologies has become increasingly important.
These defects can lead to wrong or incomplete results for the applications.
Debugging consists of two phases: detection and repair. In this paper we
focus on the repair phase for a particular kind of defect, namely missing
relations in the is-a hierarchy. Previous work has dealt with the case of
taxonomies. In this work we extend the scope to ALC ontologies that
can be represented using acyclic terminologies. We present algorithms and
discuss a system.
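In the taxonomy case, the repair idea can be sketched very simply (this is an illustrative toy, with invented concept names, not the paper's algorithm): a missing is-a relation (A, B) is repaired by adding any edge (X, Y) such that A ⊑ X and Y ⊑ B are already derivable, since A ⊑ B then follows by transitivity.

```python
# Toy sketch of repairing a missing is-a relation in a taxonomy.
# Edges are pairs (sub, sup); derivability is reflexive-transitive closure.

def derivable(edges, a, b):
    """Is a ⊑ b entailed by the reflexive-transitive closure of edges?"""
    seen, stack = {a}, [a]
    while stack:
        c = stack.pop()
        if c == b:
            return True
        for x, y in edges:
            if x == c and y not in seen:
                seen.add(y)
                stack.append(y)
    return False

def candidate_repairs(edges, missing, concepts):
    """All single edges (x, y) whose addition entails the missing is-a."""
    a, b = missing
    return [(x, y) for x in concepts for y in concepts
            if derivable(edges, a, x) and derivable(edges, y, b)]

concepts = ["Margherita", "Pizza", "Food", "VegetarianFood"]
edges = [("Margherita", "Pizza"), ("Pizza", "Food")]
missing = ("Margherita", "VegetarianFood")
print(candidate_repairs(edges, missing, concepts))
# Each candidate, once added, makes Margherita ⊑ VegetarianFood derivable.
```

The interesting part, which the paper addresses, is choosing among such candidates: some repairs (e.g. Food ⊑ VegetarianFood here) are logically sufficient but semantically wrong.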
Verifying Temporal Regular Properties of Abstractions of Term Rewriting Systems
Tree automaton completion is an algorithm for proving safety
properties of systems that can be modeled by a term rewriting system. This
representation and verification technique works well for proving properties of
infinite systems such as cryptographic protocols and, more recently, Java
bytecode programs. The algorithm computes a tree automaton that represents a
(regular) over-approximation of the set of terms reachable by rewriting initial
terms. This approach is limited by the lack of information about the rewriting
relation between terms: terms related by rewriting fall into the same
equivalence class, i.e. they are recognized by the same state of the tree
automaton.
Our objective is to produce an automaton embedding an abstraction of the
rewriting relation sufficient to prove temporal properties of the term
rewriting system.
We propose to extend the algorithm to produce an automaton having more
equivalence classes to distinguish a term or a subterm from its successors
w.r.t. rewriting. While ground transitions are used to recognize equivalence
classes of terms, epsilon-transitions represent the rewriting relation between
terms. From the completed automaton, it is possible to automatically build a
Kripke structure abstracting the rewriting sequence. States of the Kripke
structure are states of the tree automaton and the transition relation is given
by the set of epsilon-transitions. States of the Kripke structure are labelled
by the set of terms recognized using ground transitions. On this Kripke
structure, we define the Regular Linear Temporal Logic (R-LTL) for expressing
properties. Such properties can then be checked using standard model checking
algorithms. The only difference between LTL and R-LTL is that predicates are
replaced by regular sets of acceptable terms.
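The construction above can be sketched concretely (a toy example, not the paper's implementation, using an invented three-state automaton for a rewriting chain a → b → c over constants): states of the Kripke structure are the automaton's states, the transition relation is the set of epsilon-transitions, and each state is labelled with the terms it recognizes via ground transitions.

```python
# Minimal sketch of the Kripke structure built from a completed tree
# automaton. Ground transitions label states with recognized terms;
# epsilon-transitions abstract one rewriting step between terms.
# The automaton below is an invented example over constants only.

ground = {"q0": {"a"}, "q1": {"b"}, "q2": {"c"}}   # state -> recognized terms
epsilon = [("q0", "q1"), ("q1", "q2")]             # rewriting abstraction

kripke = {
    "states": set(ground),
    "transitions": set(epsilon),                   # Kripke transition relation
    "labels": dict(ground),                        # Kripke state labelling
}

# An R-LTL-style atomic predicate is a regular set of acceptable terms;
# a state satisfies it when its label intersects that set.
def satisfies(state, term_set):
    return bool(kripke["labels"][state] & term_set)

print(satisfies("q1", {"b", "c"}))   # → True: q1 recognizes b
print(satisfies("q0", {"b"}))        # → False: q0 recognizes only a
```

A model checker then explores `kripke["transitions"]`, evaluating such predicates on each state's label set instead of on propositional variables.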
A proposal for annotation, semantic similarity and classification of textual documents
In this paper, we present an approach to classifying documents based on a notion of semantic similarity and an effective representation of document content. The content of a document is annotated, and the resulting annotation is represented as a labeled tree whose nodes and edges correspond to concepts from a domain ontology. A reasoning process can then be carried out on annotation trees, allowing documents to be compared with each other for classification or information-retrieval purposes. An algorithm for classifying documents with respect to semantic similarity and a discussion conclude the paper.
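A drastically simplified version of comparing two annotation trees can be sketched as follows. This toy reduces each tree to the set of concept labels it contains and takes their Jaccard overlap; the paper's actual measure reasons over the ontology and the tree structure, which this sketch omits, and the document trees below are invented.

```python
# Toy similarity between annotation trees, each given as
# (concept_label, [child_trees]). Not the paper's measure: it ignores
# edge labels, tree shape, and the ontology's subsumption hierarchy.

def concepts_of(tree):
    """Collect all concept labels occurring in an annotation tree."""
    label, children = tree
    out = {label}
    for child in children:
        out |= concepts_of(child)
    return out

def similarity(t1, t2):
    """Jaccard index of the two trees' concept sets, in [0, 1]."""
    c1, c2 = concepts_of(t1), concepts_of(t2)
    return len(c1 & c2) / len(c1 | c2)

doc1 = ("Document", [("Disease", [("Diabetes", [])]), ("Treatment", [])])
doc2 = ("Document", [("Disease", [("Diabetes", [])]), ("Symptom", [])])
print(similarity(doc1, doc2))   # → 0.6: 3 shared concepts out of 5
```

A classifier can then assign a document to the class whose representative annotation tree maximizes this similarity.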