1,258 research outputs found

    Alʔilbīrī’s Book of the rational conclusions. Introduction, Critical Edition of the Arabic Text and Materials for the History of the Ḫawāṣṣic Genre in Early Andalus

    The Book of the rational conclusions, written perhaps sometime in the 10th c. by a physician from Ilbīrah (Andalus), is a multi-section medical pandect. The author brings together, from a diversity of sources, materials dealing with drug-handling, natural philosophy, therapeutics, the medical applications of the specific properties of things, a regimen, and a dispensatory. This dissertation comprises three parts. First, the transmission of the text, its contents, and its possible context are discussed. Then a critical edition of the Arabic text is offered. Last, but certainly not least, the subject of the specific properties is approached from several points of view. The analysis of Section III of the original book leads to an exploration of the early Andalusī assimilation of this epistemic tradition and to the establishment of a well-defined textual family in which our text must be inscribed. On the other hand, the concept of 'specific property' itself is often misconstrued and usually treated as synonymous with magic and superstition. Upon closer inspection, however, the alleged irrationality of the knowledge of these properties appears to be largely the result of anachronistic interpretation. As a complement to this particular research, and as an illustration of the genre, a sample from an ongoing integral commentary on this section of the book is presented.

    The Book of the rational conclusions, by an unknown physician from Ilbīrah (Andalus), was probably compiled during the second half of the 10th c. It is a rudimentary but remarkably complete kunnāš (an epistemic genre often described as a 'medical encyclopaedia') in which the author gathers materials borrowed (often literally and without attribution) from several genres. The book opens with a section on drug-handling (a kind of apothecaries' manual) but then focuses on the different branches of medicine. After some philosophical prolegomena, the author copies, with minimal linguistic adaptation, an entire treatise on therapeutics, then another on the medical applications of the specific properties of things, a series of fragments related to dietetics (a regimen in traditional terms) and, finally, a collection of medical recipes. Each of these sections shows clear intertextual links that point to an intense synthesising activity across the various traditions allied to medicine in caliphal Andalus. The text is, in fact, a magnificent object on which to apply the methodology of textual and source criticism. The critical edition incorporates the chronological dimension into the apparatus, which thus becomes a contextualising element. As for the study of the sources, while it plays only a secondary role throughout the first part of this thesis, this discipline takes on an almost absolute prominence in the third part, especially in the chapter devoted to the individual analysis of each passage collected in the section on the specific properties of things.

    A foundation for synthesising programming language semantics

    Programming or scripting languages used in real-world systems are seldom designed with a formal semantics in mind from the outset. Therefore, the first step for developing well-founded analysis tools for these systems is to reverse-engineer a formal semantics, which can take months or years of effort. Could we automate this process, at least partially? Though desirable, automatically reverse-engineering semantics rules from an implementation is very challenging, as found by Krishnamurthi, Lerner and Elberty. They propose automatically learning desugaring translation rules, mapping the language whose semantics we seek to a simplified, core version whose semantics are much easier to write. The present thesis contains an analysis of their challenge, as well as the first steps towards a solution. Scaling methods with the size of the language is very difficult due to state space explosion, so this thesis proposes an incremental approach to learning the translation rules. I present a formalisation that both clarifies the informal description of the challenge by Krishnamurthi et al. and re-formulates the problem, shifting the focus to the conditions for incremental learning. The central definition of the new formalisation is the desugaring extension problem, i.e. extending a set of established translation rules by synthesising new ones. In a synthesis algorithm, the choice of search space is important and non-trivial, as it needs to strike a good balance between expressiveness and efficiency. The rest of the thesis focuses on defining search spaces for translation rules via typing rules. Two prerequisites are required for comparing search spaces. The first is a series of benchmarks: a set of source and target languages equipped with intended translation rules between them. The second is an enumerative synthesis algorithm for efficiently enumerating typed programs.
    I show how algebraic enumeration techniques can be applied to enumerating well-typed translation rules, and discuss the properties expected from a type system if typed programs are to be efficiently enumerable. The thesis presents and empirically evaluates two search spaces. A baseline search space yields the first practical solution to the challenge. The second search space is based on a natural heuristic for translation rules, limiting the usage of variables so that each is used exactly once. I present a linear type system, designed for efficient enumeration of translation rules, in which this heuristic is enforced. Through informal analysis and empirical comparison to the baseline, I then show that using linear types can speed up the synthesis of translation rules by an order of magnitude.
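    To make the idea of size-bounded, type-directed enumeration concrete, here is a minimal sketch for a toy language with integer and boolean atoms, variables, and addition. The language, the string representation, and the `terms` function are invented for illustration; they are not the thesis's actual benchmarks or search spaces.

```python
# Hypothetical sketch (not the thesis's code): enumerate all
# well-typed terms of a given type with exactly `size` AST nodes.
from itertools import product

def terms(ty, size, env):
    """Yield terms of type `ty` with exactly `size` AST nodes;
    `env` maps variable names to their types."""
    if size == 1:
        if ty == "Int":
            yield from ("0", "1")
        elif ty == "Bool":
            yield from ("True", "False")
        for name, vty in env.items():
            if vty == ty:
                yield name
    elif size >= 3 and ty == "Int":
        # `+` costs one node; split the remainder between subterms
        for left in range(1, size - 1):
            for a, b in product(terms("Int", left, env),
                                terms("Int", size - 1 - left, env)):
                yield f"(+ {a} {b})"

# 3 atoms of size 1 ("0", "1", "x") give 3 * 3 = 9 sums of size 3
print(len(list(terms("Int", 3, {"x": "Int"}))))  # prints 9
```

    A linear-typing heuristic in the spirit of the thesis would additionally thread `env` through the recursion so that each variable is produced exactly once, shrinking the space of candidate rules.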

    Low-Thrust Optimal Escape Trajectories from Lagrangian Points and Quasi-Periodic Orbits in a High-Fidelity Model

    The abstract is in the attachment.

    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The collected contributions of this volume have either been published or presented after the dissemination of the fourth volume in 2015 in international conferences, seminars, workshops and journals, or they are new. The contributions of each part of this volume are chronologically ordered. The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution rules (PCR) of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignments in the fusion of sources of evidence, with their Matlab codes.
    Because more applications of DSmT have emerged in the years since the appearance of the fourth book on DSmT in 2015, the second part of this volume is about selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negators of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
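    As a small illustration of the PCR5 rule named above, the following sketch combines two basic belief assignments over a two-element frame: the conjunctive consensus is kept on non-empty intersections, and each partial conflict is redistributed back to the two conflicting sets in proportion to their masses. The frame, the mass values, and the `pcr5` helper are made up for illustration, and only the two-source case is shown.

```python
# Illustrative PCR5 combination of two basic belief assignments
# (BBAs), represented as dicts mapping frozenset -> mass.
def pcr5(m1, m2):
    out = {}
    for x, mx in m1.items():
        for y, my in m2.items():
            inter = x & y
            if inter:
                # conjunctive consensus on non-empty intersections
                out[inter] = out.get(inter, 0.0) + mx * my
            else:
                # redistribute the partial conflict mx*my back to
                # x and y, proportionally to their masses
                s = mx + my
                out[x] = out.get(x, 0.0) + mx * mx * my / s
                out[y] = out.get(y, 0.0) + my * my * mx / s
    return out

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.6, frozenset("AB"): 0.4}   # hypothetical source 1
m2 = {B: 0.3, frozenset("AB"): 0.7}   # hypothetical source 2
m = pcr5(m1, m2)
```

    Unlike Dempster's rule, no mass is discarded by normalisation: the combined masses here still sum to 1.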

    Automated and foundational verification of low-level programs

    Formal verification is a promising technique to ensure the reliability of low-level programs like operating systems and hypervisors, since it can show the absence of whole classes of bugs and prevent critical vulnerabilities. However, to realize the full potential of formal verification for real-world low-level programs one has to overcome several challenges, including: (1) dealing with the complexities of realistic models of real-world programming languages; (2) ensuring the trustworthiness of the verification, ideally by providing foundational proofs (i.e., proofs that can be checked by a general-purpose proof assistant); and (3) minimizing the manual effort required for verification by providing a high degree of automation. This dissertation presents multiple projects that advance formal verification along these three axes: RefinedC provides the first approach for verifying C code that combines foundational proofs with a high degree of automation via a novel refinement and ownership type system. Islaris shows how to scale verification of assembly code to realistic models of modern instruction set architectures, in particular Armv8-A and RISC-V. DimSum develops a decentralized approach for reasoning about programs that consist of components written in multiple different languages (e.g., assembly and C), as is common for low-level programs. RefinedC and Islaris rest on Lithium, a novel proof engine for separation logic that combines automation with foundational proofs.

    This research was supported in part by a Google PhD Fellowship, in part by awards from Android Security's ASPIRE program and from Google Research, and in part by a European Research Council (ERC) Consolidator Grant for the project "RustBelt", funded under the European Union's Horizon 2020 Framework Programme (grant agreement no. 683289).
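    As background for the separation-logic machinery mentioned above, the frame rule (in standard textbook notation, not specific to Lithium) is what lets a local proof about one heap fragment be reused unchanged in a larger context:

```latex
% frame rule: a proof of C against P/Q survives any disjoint frame R
\frac{\{P\}\; C\; \{Q\}}
     {\{P \ast R\}\; C\; \{Q \ast R\}}
\qquad (\text{$C$ does not modify variables free in } R)
```

    Much of the automation challenge that proof engines in this style address lies in inferring a suitable frame $R$ without user guidance.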

    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness introduces yet another level of complexity, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools providing abstraction and automation, development can be extremely costly due to the sheer complexity of these systems and the nuances within them. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof via a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems.
    Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.
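    The property-based testing idea mentioned above can be sketched without any library: generate random inputs, check a stated property on each, and report the first counterexample. Everything here (the `reverse` function under test, the generator, `check_property`) is an invented minimal example, not Cogent's actual framework.

```python
# Minimal, library-free sketch of property-based testing.
import random

def reverse(xs):
    """Toy system under test."""
    return xs[::-1]

def check_property(prop, gen, trials=200, seed=0):
    """Run `prop` on `trials` random inputs drawn from `gen`;
    return the first counterexample found, or None if all pass."""
    rng = random.Random(seed)
    for _ in range(trials):
        xs = gen(rng)
        if not prop(xs):
            return xs
    return None

def gen_list(rng):
    return [rng.randint(-9, 9) for _ in range(rng.randint(0, 8))]

# Property: reversing twice is the identity; no counterexample exists.
assert check_property(lambda xs: reverse(reverse(xs)) == xs, gen_list) is None
```

    In a verification workflow such random checking is a cheap first filter: a failing property immediately rules out a candidate specification before any proof effort is spent.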

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    LIPIcs, Volume 261, ICALP 2023, Complete Volume.

    High frequency jet ventilation (HFJV) in clinical practice

    Background: Surgery often requires general anaesthesia, during which a ventilator is often used to secure the patient's breathing, preferably by mimicking normal ventilation. Conventional ventilation inflates and deflates the lung, which in turn makes the diaphragm move up and down in the craniocaudal direction; all organs adjacent to the diaphragm are therefore affected by these breathing-related motions. During liver tumour ablation, a stereotactic technique can be used, in which radiological images are used to optimise needle placement in three dimensions to reach the target tumour. It is of great importance that the target tumour does not move, ensuring that tissue destruction is limited to the tumour and avoiding injury to healthy surrounding tissue. To meet the demand for target organ immobilisation, high frequency jet ventilation (HFJV) has become an interesting option. Contrary to conventional ventilation during surgery, this method uses small tidal volumes at high frequencies, which differs markedly from normal physiological respiration in humans. HFJV has been used for decades, especially for ventilation during airway procedures. Ventilating the patient while minimising abdominal organ movement, and thereby improving surgical conditions during stereotactic ablative procedures, is a novel way of using the benefits of HFJV.

    Aim: This doctoral thesis studied the feasibility and safety of using high frequency jet ventilation for the specific purpose of liver immobilisation during stereotactic ablation procedures. The aim of Study I was to study gas exchange during HFJV in stereotactic ablation of liver tumours. In Study II, post-operative hypertension and its relation to liver tumour ablation techniques and ventilation methods were studied. In Study III, the formation of atelectasis during HFJV was studied. In Study IV, levels of carbon dioxide (CO2) were studied in two groups randomised to different sizes of the endotracheal tube in which the jet catheter was placed during HFJV in liver tumour ablation. In Study IV, continuous transcutaneous carbon dioxide (TcCO2) monitoring was also compared to intermittent measurement of arterial carbon dioxide (PaCO2).

    Methods: Study I is a prospective, observational study. Blood gas analysis was performed every 15 minutes for the first 45 minutes of HFJV in 24 patients undergoing liver tumour ablation. Study II is a retrospective, observational study. Medical chart records were collected and analysed for early post-operative hypertension in 134 patients receiving either HFJV or conventional ventilation (CV) and various ablation methods, microwave ablation (MWA) or irreversible electroporation (IRE). Study III is a prospective, observational study. CT images of the lower part of the lungs were taken in 25 patients every 15 minutes during the first 45 minutes of HFJV and analysed for atelectasis formation using MATLAB. Study IV is a randomised, prospective study. PaCO2 was measured during the first 45 minutes after initiation of HFJV in patients randomised to an endotracheal tube (ETT) with inner diameter (ID) 8 or 9 mm. TcCO2 was also measured during the same period and compared to the gold standard, PaCO2. Airway pause pressures, peak pressures and signs of intubation injuries were also studied.

    Results: In Study I, blood gas analyses showed that none of the 24 patients experienced hypoxemia during the first 45 minutes of high frequency jet ventilation. A statistically significant rise in arterial carbon dioxide (PaCO2) was seen at all time points during HFJV compared to baseline. A further statistically significant rise in PaCO2 was seen during HFJV compared to T=0 at T=30 (p=0.006) and T=45 (p=0.003). A corresponding statistically significant decrease in pH was seen compared to baseline at T=15 (p=0.03), from a mean value of 7.44 to 7.31. A further small drop in pH was seen over time, but with no significant difference between time points. During early recovery in the post-anaesthesia care unit, PaCO2 and pH returned spontaneously to baseline values. All lactate values were within the normal range except for one value in one patient during recovery, which was slightly raised at 2.3 mmol L-1. Study II showed that hypertension was common in post-operative care after liver tumour ablation. The group receiving MWA under HFJV had the highest proportion of patients with at least one episode of severe hypertension (SAP >180 mmHg) when compared to patients receiving IRE under HFJV or MWA under CV. Multiple regression analysis showed increased odds of post-operative hypertension when MWA was used compared to IRE, and when HFJV was used compared to CV. Study III showed that the formation of atelectasis increased over time during the 45 minutes of HFJV studied, from 5.6% to 8.1% of the total lung area. The increase in atelectasis was significant at T=30 (p=0.002) and T=45 (p=0.024). The area of normal ventilation was, however, unchanged. In a subgroup analysis of patients with BMI <30, no significant difference in the amount of atelectasis could be seen between the time points. In Study IV, PaCO2 increased in both groups (ETT ID 8 and 9 mm), but no statistically significant difference between the two groups was seen (p=0.06). TcCO2 was measured and compared to PaCO2; a Bland-Altman plot and an ICC analysis showed good agreement between the two methods.

    Conclusions: The overall results of this thesis indicate that high frequency jet ventilation is feasible and safe during stereotactic ablation of upper abdominal organs for up to 45 minutes. There is a risk of hypertensive events in early recovery following liver tumour ablation when MWA and HFJV are used. Atelectasis increases, but the proportion of normally ventilated lung is preserved. PaCO2 increases but is rapidly reversed during recovery. With regard to PaCO2, an ETT with ID 8 mm can be used in male patients for shorter procedures. TcCO2 monitoring is a feasible technique for following changes in carbon dioxide, although blood gas analysis should be considered in patients in need of haemodynamic monitoring or at risk of carbon dioxide retention.
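    The Bland-Altman analysis used in Study IV reduces to two quantities: the bias (mean of the paired differences) and the 95% limits of agreement (bias ± 1.96 standard deviations). The sketch below computes both; the readings are invented example values, not data from the thesis.

```python
# Illustrative Bland-Altman agreement analysis between two paired
# measurement methods (e.g. TcCO2 vs PaCO2); data here are made up.
from statistics import mean, stdev

def bland_altman(a, b):
    """Return bias (mean difference) and 95% limits of agreement."""
    diffs = [x - y for x, y in zip(a, b)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

tcco2 = [6.1, 6.4, 7.0, 7.2, 6.8]   # hypothetical kPa readings
paco2 = [6.0, 6.5, 6.8, 7.1, 6.9]
bias, (lo, hi) = bland_altman(tcco2, paco2)
```

    Unlike a correlation coefficient, the limits of agreement say how far an individual TcCO2 reading may plausibly sit from the arterial value, which is what matters clinically.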

    Selfish

    Senior Project submitted to The Division of Social Studies of Bard College.

    Linear-Time Temporal Answer Set Programming

    [Abstract]: In this survey, we present an overview of (Modal) Temporal Logic Programming in view of its application to Knowledge Representation and Declarative Problem Solving. The syntax of this extension of logic programs results from combining usual rules with temporal modal operators, as in Linear-time Temporal Logic (LTL). In the paper, we focus on the main recent results on the non-monotonic formalism called Temporal Equilibrium Logic (TEL), which is defined for the full syntax of LTL but involves a model selection criterion based on Equilibrium Logic, a well-known logical characterization of Answer Set Programming (ASP). As a result, we obtain a proper extension of the stable model semantics for the general case of temporal formulas in the syntax of LTL. We recall the basic definitions of TEL and its monotonic basis, the temporal logic of Here-and-There (THT), and study the differences between finite and infinite trace length. We also provide further useful results, such as translations into other formalisms like Quantified Equilibrium Logic and Second-order LTL, and some techniques for computing temporal stable models based on automata constructions. In the remainder of the paper, we focus on practical aspects, defining a syntactic fragment called (modal) temporal logic programs that is closer to ASP, and explaining how this has been exploited in the construction of the solver telingo, a temporal extension of the well-known ASP solver clingo that uses its incremental solving capabilities.

    Xunta de Galicia; ED431B 2019/03. We are thankful to the anonymous reviewers for their thorough work and their useful suggestions that have helped to improve the paper. A special thanks goes to Mirosław Truszczyński for his support in improving the quality of our paper. We are especially grateful to David Pearce, whose help and collaboration on Equilibrium Logic was the seed for a great part of the current paper. This work was partially supported by MICINN, Spain, grant PID2020-116201GB-I00; Xunta de Galicia, Spain (GPC ED431B 2019/03); Région Pays de la Loire, France (projects EL4HC and étoiles montantes CTASP); European Union COST action CA-17124; and DFG grants SCHA 550/11 and 15, Germany.
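    To illustrate the kind of syntax involved (a standard textbook-style example, not drawn from this survey), a temporal rule combines an ASP-style implication with LTL modalities such as always ($\Box$) and next ($\bigcirc$):

```latex
% "always, shooting a loaded gun implies death at the next state"
\Box \bigl( \mathit{shoot} \land \mathit{loaded} \rightarrow \bigcirc\, \mathit{dead} \bigr)
```

    Under TEL, models of such formulas over THT traces are filtered by an equilibrium condition, yielding temporal stable models that extend ASP's stable model semantics to traces.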