99 research outputs found

    Frameworks, models, and case studies

    This thesis focuses on models of conceptual change in science and philosophy. In particular, I develop a new bootstrapping methodology for studying conceptual change, centered around the formalization of several popular models of conceptual change and the collective assessment of their improved formal versions via nine evaluative dimensions. Among the models of conceptual change treated in the thesis are Carnap's explication, Lakatos' concept-stretching, Toulmin's conceptual populations, Waismann's open texture, Mark Wilson's patches and facades, Sneed's structuralism, and Paul Thagard's conceptual revolutions. In order to analyze and compare the conceptions of conceptual change provided by these different models, I rely on several historical reconstructions of episodes of scientific conceptual change. The historical episodes of scientific change that figure in this work include the emergence of the morphological concept of fish in biological taxonomies, the development of scientific conceptions of temperature, the Church-Turing thesis and related axiomatizations of effective calculability, the history of the concept of polyhedron in 17th and 18th century mathematics, Hamilton's invention of the quaternions, the history of pre-abstract group concepts in 18th and 19th century mathematics, the expansion of Newtonian mechanics to phenomena involving viscous fluid forces, and the chemical revolution. I will also present five different formal and informal improvements of four specific models of conceptual change. I will first present two different improvements of Carnapian explication, a formal and an informal one. My informal improvement of Carnapian explication will consist of a more fine-grained version of the procedure that adds an intermediate, third step to the two steps of Carnapian explication. I will show how this novel three-step version of explication is more suitable than its traditional two-step relative for handling complex cases of explication. My second, formal improvement of Carnapian explication will be a full explication of the concept of explication itself within the theory of conceptual spaces. By virtue of this formal improvement, the whole procedure of explication, together with its application procedures and its pragmatic desiderata, will be reconceptualized as a precise procedure involving topological and geometrical constraints inside the theory of conceptual spaces. My third improved model of conceptual change will consist of a formal explication of Darwinian models of conceptual change that will make extensive use of Godfrey-Smith's population-based Darwinism in order to target mathematical conceptual change explicitly. My fourth improvement will instead be dedicated to Wilson's indeterminate model of conceptual change. I will show how Wilson's very informal framework can be explicated within a modified version of the structuralist model-theoretic reconstructions of scientific theories. Finally, the fifth improved model of conceptual change will be a belief-revision-like logical framework that reconstructs Thagard's model of conceptual revolution as specific revision and contraction operations that work on conceptual structures. At the end of this work, a general conception of conceptual change in science and philosophy emerges, thanks to the combined action of the three layers of my methodology. This conception takes conceptual change to be a multi-faceted phenomenon centered around the dynamics of groups of concepts.
    According to this conception, concepts are best reconstructed as plastic and inter-subjective entities equipped with a non-trivial internal structure and subject to a certain degree of localized holism. Furthermore, conceptual dynamics can be judged from a weakly normative perspective, bound to be dependent on shared values and goals. Conceptual change is then best understood, according to this conception, as a ubiquitous phenomenon underlying all of our intellectual activities, from science to ordinary linguistic practices. As such, conceptual change does not pose any particular problem for value-laden notions of scientific progress, objectivity, and realism. At the same time, this conception prompts all our concept-driven intellectual activities, including philosophical and metaphilosophical reflections, to take the phenomenon of conceptual change seriously into consideration. An important consequence of this conception, and of the analysis that generated it, is in fact that an adequate understanding of the dynamics of philosophical concepts is a prerequisite for analytic philosophy to develop a realistic and non-idealized depiction of itself and its activities.

    Operations research: from computational biology to sensor network

    In this dissertation we discuss the deployment of combinatorial optimization methods for modeling and solving real-life problems, with a particular emphasis on two biological problems arising from a common scenario: the reconstruction of the three-dimensional shape of a biological molecule from Nuclear Magnetic Resonance (NMR) data. The first topic is the 3D assignment pathway problem (APP) for an RNA molecule. We prove that APP is NP-hard, and show a formulation of it based on edge-colored graphs. Since interactions between consecutive nuclei in the NMR spectrum differ according to the type of residue along the RNA chain, each color in the graph represents a type of interaction. Thus, we can represent the sequence of interactions as the problem of finding a longest (Hamiltonian) path whose edges follow a given order of colors (i.e., the orderly colored longest path). We introduce three alternative IP formulations of APP, obtained from a max-flow problem on a directed graph with packing constraints over the partitions, and compare them with one another. Since the last two models work on cyclic graphs, we propose for them an algorithm based on solving their relaxation combined with the separation of cycle inequalities in a Branch & Cut scheme. The second topic is the discretizable distance geometry problem (DDGP), a formulation over a discrete search space of the well-known distance geometry problem (DGP). The DGP consists in seeking an embedding in space of an undirected graph, given a set of Euclidean distances between certain pairs of vertices. The DGP has two important applications: (i) finding the three-dimensional conformation of a molecule from a subset of interatomic distances, known as the Molecular Distance Geometry Problem, and (ii) the Sensor Network Localization Problem. We describe a Branch & Prune (BP) algorithm tailored for this problem, and two versions of it solving the DDGP both in protein modeling and in sensor network localization frameworks. BP is an exact and exhaustive combinatorial algorithm that examines all the valid embeddings of a given weighted graph G=(V,E,d), under the hypothesis that a given order exists on V. By comparing the two versions of BP with well-known algorithms we demonstrate the efficiency of BP in both contexts, provided that the order imposed on V is maintained.
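    The abstract only names the BP algorithm; purely as an illustration of the style of search it describes, the following is a minimal, hypothetical Python sketch of a Branch & Prune-type enumeration for a two-dimensional DDGP instance (the molecular application is three-dimensional). The graph encoding, the helper names, and the choice of reference vertices are assumptions made for this sketch, not the dissertation's implementation.

```python
import math

def circle_intersections(p1, r1, p2, r2, eps=1e-9):
    """Points at distance r1 from p1 and r2 from p2 in the plane (0, 1 or 2 of them)."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d < eps or d > r1 + r2 + eps or d < abs(r1 - r2) - eps:
        return []
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d          # unit normal to the center line
    return [(mx + h * ux, my + h * uy)] + ([(mx - h * ux, my - h * uy)] if h > eps else [])

def branch_and_prune(n, dist, eps=1e-6):
    """Enumerate embeddings of vertices 0..n-1 consistent with dist = {(i, j): d_ij}, i < j,
    placing vertices in the given order (the first two are fixed to factor out rigid motions)."""
    solutions = []

    def extend(pos):
        k = len(pos)
        if k == n:
            solutions.append(list(pos))
            return
        refs = [i for i in range(k) if (i, k) in dist]   # placed vertices with a known distance to k
        i, j = refs[-2], refs[-1]                        # branch on the two circles they define
        for cand in circle_intersections(pos[i], dist[(i, k)], pos[j], dist[(j, k)]):
            # Prune: every known distance from k to an already placed vertex must hold.
            if all(abs(math.dist(pos[a], cand) - dak) <= eps
                   for (a, b), dak in dist.items() if b == k and a < k):
                pos.append(cand)
                extend(pos)
                pos.pop()

    extend([(0.0, 0.0), (dist[(0, 1)], 0.0)])
    return solutions

# Toy instance: a unit square; BP finds the embedding and its mirror image.
square = {(0, 1): 1.0, (0, 2): 2 ** 0.5, (1, 2): 1.0, (0, 3): 1.0, (1, 3): 2 ** 0.5, (2, 3): 1.0}
print(branch_and_prune(4, square))
```

    Each vertex is placed, in the given order, at one of the at most two points consistent with the distances to two already placed reference vertices, and candidates violating any other known distance are pruned, mirroring the exact and exhaustive character described in the abstract.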

    Seventh Biennial Report: June 2003 - March 2005


    Mesh generation for voxel-based objects

    A new physically-based approach to unstructured mesh generation via Monte-Carlo simulation is proposed. Geometrical objects to be meshed are represented by systems of interacting particles with a given interaction potential. A new way of distributing nodes in complex domains is proposed, based on the concept of a dynamic equilibrium ensemble, which represents a liquid state of matter. The algorithm is simple, numerically stable, and produces uniform node distributions in domains of complex geometries and different dimensions. Well-shaped triangles or tetrahedra can be created by connecting a set of uniformly spaced nodes. The proposed method has many advantages and potential applications. The new method is applied to the problem of meshing voxel-based objects. By customizing the system's potential energy function to reflect surface features, particles can be distributed into desired locations, such as sharp corners and edges. A feature-preserving surface mesh can then be constructed by connecting the node set. A heuristic algorithm using an advancing-front approach is proposed to generate triangulated surface meshes on voxel-based objects. The resulting surface meshes do not inherit the anisotropy of the underlying hexagonal grid. However, important surface features, such as edges and corners, may not be preserved in the mesh. To overcome this problem, surface features such as edges and corners need to be detected. A new approach to edge capturing is proposed and demonstrated. The approach is based on a Laplace solver with incomplete Jacobi iterations, and as such is very simple and efficient. This edge-capturing approach, combined with the mesh generation methods above, forms a simple and robust technique for unstructured mesh generation on voxel-based objects. A graphical user interface (GUI) capable of complex geometric design and remote simulation control was implemented. The GUI was used in simulations of large fuel-cell stacks. It enables one to set up, run, and monitor simulations remotely through secure shell (SSH2) connections. A voxel-based 3D geometrical modeling module is built along with the GUI. The flexibility of voxel-based geometry representation makes this technique usable for both geometric design and visualization of volume data.
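    The abstract describes the particle-based idea only at a high level; purely as an illustration of that idea (and not the thesis's code), the following Python/NumPy sketch distributes nodes inside a 2D domain with a Metropolis Monte-Carlo simulation of particles repelling each other through a short-range pair potential, so that the system settles into the liquid-like, roughly uniform arrangement described above. The function names, the soft-repulsion potential, and the parameter values are assumptions of the sketch.

```python
import numpy as np

def mc_distribute(n_nodes, in_domain, bbox, r0=0.08, beta=50.0, step=0.03, sweeps=2000, seed=0):
    """Distribute n_nodes roughly uniformly inside a 2D domain with a Metropolis
    Monte-Carlo simulation of particles repelling each other within radius r0.
    in_domain(p) tests membership in the target geometry; bbox = (xmin, xmax, ymin, ymax)."""
    rng = np.random.default_rng(seed)
    xmin, xmax, ymin, ymax = bbox

    def pair_energy(p, others):
        # Soft repulsion: only pairs closer than r0 contribute to the energy.
        d = np.linalg.norm(others - p, axis=1)
        return np.sum(np.clip(r0 - d, 0.0, None) ** 2)

    # Rejection-sample initial positions inside the geometry.
    pts = []
    while len(pts) < n_nodes:
        p = rng.uniform([xmin, ymin], [xmax, ymax])
        if in_domain(p):
            pts.append(p)
    pts = np.array(pts)

    for _ in range(sweeps):
        i = rng.integers(n_nodes)
        trial = pts[i] + rng.normal(scale=step, size=2)
        if not in_domain(trial):
            continue                                        # stay inside the object
        others = np.delete(pts, i, axis=0)
        dE = pair_energy(trial, others) - pair_energy(pts[i], others)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):    # Metropolis acceptance rule
            pts[i] = trial
    return pts

# Example: nodes inside a unit disk, standing in for a voxelised object.
nodes = mc_distribute(200, lambda p: p[0] ** 2 + p[1] ** 2 <= 1.0, (-1, 1, -1, 1))
```

    Connecting such a node set (for instance with a Delaunay triangulation) would then yield the well-shaped triangles mentioned in the abstract.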

    LIPIcs, Volume 258, SoCG 2023, Complete Volume

    LIPIcs, Volume 258, SoCG 2023, Complete Volume

    A Computational Model of Lakatos-style Reasoning

    Lakatos outlined a theory of mathematical discovery and justification, which suggests ways in which concepts, conjectures and proofs gradually evolve via interaction between mathematicians. Different mathematicians may have different interpretations of a conjecture, examples or counterexamples of it, and beliefs regarding its value or theoremhood. Through discussion, concepts are refined and conjectures and proofs modified. We hypothesise that: (i) it is possible to computationally represent Lakatos's theory, and (ii) it is useful to do so. In order to test our hypotheses we have developed a computational model of his theory. Our model is a multiagent dialogue system. Each agent has a copy of a pre-existing theory formation system, which can form concepts and make conjectures which empirically hold for the objects of interest supplied. Distributing the objects of interest between agents means that they form different theories, which they communicate to each other. Agents then find counterexamples and use methods identified by Lakatos to suggest modifications to conjectures, concept definitions and proofs. Our main aim is to provide a computational reading of Lakatos's theory, by interpreting it as a series of algorithms and implementing these algorithms as a computer program. This is the first systematic automated realisation of Lakatos's theory. We contribute to the computational philosophy of science by interpreting, clarifying and extending his theory. We also contribute by evaluating his theory, using our model to test hypotheses about it, and evaluating our extended computational theory on the basis of criteria proposed by several theorists. A further contribution is to automated theory formation and automated theorem proving. The process of refining conjectures, proofs and concept definitions requires a flexibility which is inherently useful in fields which handle ill-specified problems, such as theory formation. Similarly, the ability to automatically modify an open conjecture into one which can be proved is a valuable contribution to automated theorem proving.
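    As a toy illustration of the kind of Lakatosian move the agents apply (and not the dialogue system itself, which builds on a full theory-formation program), the following hypothetical Python sketch shows exception-barring: when counterexamples to a conjecture are found, the conjecture is retained but its domain is narrowed to exclude them.

```python
def exception_barring(conjecture, domain):
    """Test `conjecture` on each object in `domain`; bar every counterexample found,
    returning the narrowed domain on which the conjecture still stands and the exceptions."""
    surviving, barred = [], []
    for x in domain:
        (surviving if conjecture(x) else barred).append(x)
    return surviving, barred

def is_prime(n):
    return n > 1 and all(n % k for k in range(2, int(n ** 0.5) + 1))

# Toy conjecture: "every odd number greater than 1 is prime".
surviving, barred = exception_barring(is_prime, range(3, 30, 2))
print("conjecture now asserted only for:", surviving)
print("barred exceptions:", barred)   # 9, 15, 21, 25, 27
```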

    The abstract domain of polyhedra revisited: constraint representation and formal proof

    The work reported in this thesis revisits in two ways the abstract domain of polyhedra used for static analysis of programs. First, strong guarantees are provided on the soundness of the operations on polyhedra, by using the Coq proof assistant to check the soundness proofs. The means used to ensure correctness do not hinder the performance of the resulting Verimag Polyhedra Library (VPL). It is built on the principle of result verification: computations are performed by an untrusted oracle and their results are verified by a checker whose correctness is proved in Coq. In order to make verification cheap, the oracle computes soundness witnesses along with the results. The other distinguishing feature of VPL is that it relies only on the constraint representation of polyhedra, as opposed to the common practice of using both constraints and generators. Despite this unusual choice, VPL turns out to be a competitive abstract domain of polyhedra, performance-wise. As expected, the join operator of VPL, which computes the convex hull of two polyhedra, is the costliest operator. Since it builds on the projection operator, this thesis also investigates new approaches to performing projections, based on parametric linear programming, and proposes a synthesis of the possible variants and combinations. The thesis closes on a progress report on the design of a new solving algorithm, tailored to the specifics of the encoding so as to achieve good performance.
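    As a small illustration of the result-verification principle described above (a sketch, not the VPL/Coq code), the following Python fragment checks an oracle's claim that a constraint is implied by a polyhedron given in constraint representation: the witness is a vector of nonnegative multipliers, and the checker only has to verify a Farkas-style linear identity, which is cheap and easy to trust. The data layout and function name are assumptions of this sketch.

```python
from fractions import Fraction

def check_implied(constraints, claim, witness):
    """Verify an oracle's claim that c·x <= d follows from the constraints a_i·x <= b_i.
    constraints: list of (a, b) with a a tuple of Fractions and b a Fraction.
    claim: (c, d). witness: one nonnegative multiplier per constraint such that
    sum(lambda_i * a_i) = c and sum(lambda_i * b_i) <= d."""
    c, d = claim
    if len(witness) != len(constraints) or any(lam < 0 for lam in witness):
        return False
    combo = [Fraction(0)] * len(c)
    bound = Fraction(0)
    for (a, b), lam in zip(constraints, witness):
        combo = [ci + lam * ai for ci, ai in zip(combo, a)]
        bound += lam * b
    return combo == list(c) and bound <= d

# Example: from x <= 2 and y <= 3, the oracle claims x + y <= 5, with witness (1, 1).
P = [((Fraction(1), Fraction(0)), Fraction(2)),
     ((Fraction(0), Fraction(1)), Fraction(3))]
claim = ((Fraction(1), Fraction(1)), Fraction(5))
print(check_implied(P, claim, [Fraction(1), Fraction(1)]))   # True
```

    In the setting the abstract describes, the multipliers would come from the untrusted oracle, while only the small checker needs a machine-checked correctness proof.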

    The Essence of Mathematics Through Elementary Problems

    "It is increasingly clear that the shapes of reality – whether of the natural world, or of the built environment – are in some profound sense mathematical. Therefore it would benefit students and educated adults to understand what makes mathematics itself ‘tick’, and to appreciate why its shapes, patterns and formulae provide us with precisely the language we need to make sense of the world around us. The second part of this challenge may require some specialist experience, but the authors of this book concentrate on the first part, and explore the extent to which elementary mathematics allows us all to understand something of the nature of mathematics from the inside. The Essence of Mathematics consists of a sequence of 270 problems – with commentary and full solutions. The reader is assumed to have a reasonable grasp of school mathematics. More importantly, s/he should want to understand something of mathematics beyond the classroom, and be willing to engage with (and to reflect upon) challenging problems that highlight the essence of the discipline. The book consists of six chapters of increasing sophistication (Mental Skills; Arithmetic; Word Problems; Algebra; Geometry; Infinity), with interleaved commentary. The content will appeal to students considering further study of mathematics at university, teachers of mathematics at age 14-18, and anyone who wants to see what this kind of elementary content has to tell us about how mathematics really works.

    Computer Science for Continuous Data: Survey, Vision, Theory, and Practice of a Computer Analysis System

    Building on George Boole's work, Logic provides a rigorous foundation for the powerful tools in Computer Science that underlie the nowadays ubiquitous processing of discrete data, such as strings or graphs. Concerning continuous data, already Alan Turing had applied "his" machines to formalize and study the processing of real numbers: an aspect of his oeuvre that we transform from theory to practice. The present essay surveys the state of the art and envisions the future of Computer Science for continuous data: natively, beyond brute-force discretization, based on, guided by, and extending classical discrete Computer Science, as a bridge between Pure and Applied Mathematics.

