
    Bernal and the structure of water

    Bernal recognised early on the importance of water in biological systems and processes, and hence the need to understand the structure of the liquid if he was to understand its biological functionality. Although the structure of crystalline ice had been solved only a few years earlier, and little was understood about the structures of liquids in general, in 1933 he published what is still regarded as a seminal paper which treated not only the structure of liquid water itself, but also addressed an impressively wide range of problems of ice and ionic solutions. Imaginatively exploiting ideas that were developing at the time, he proposed a model for water that reproduced the main features of its X-ray diffraction pattern. Despite the success of this model, however, he subsequently found it unsatisfactory – ''a delusive approach, postulating a greater degree of order in the liquid than actually exists there''. Building on the very successful ''random packing'' model of simple liquids that he developed in the 1950s and 60s, he was ultimately led to a ''random network'' model that was consistent with the known properties of the individual water molecules, and that again could reproduce a range of experimental data – but this time without the model being too ordered. Today's state-of-the-art experiments essentially verify the underlying validity of his ideal model. And even his 1933 model of the water molecule itself is mimicked in some of the more successful molecular models used in today's computer simulations of aqueous systems.

    On Verifying Complex Properties using Symbolic Shape Analysis

    One of the main challenges in the verification of software systems is the analysis of unbounded data structures with dynamic memory allocation, such as linked data structures and arrays. We describe Bohne, a new analysis for verifying data structures. Bohne verifies data structure operations and shows that 1) the operations preserve data structure invariants and 2) the operations satisfy their specifications expressed in terms of changes to the set of objects stored in the data structure. During the analysis, Bohne infers loop invariants in the form of disjunctions of universally quantified Boolean combinations of formulas. To synthesize loop invariants of this form, Bohne uses a combination of decision procedures for Monadic Second-Order Logic over trees, SMT-LIB decision procedures (currently CVC Lite), and an automated reasoner within the Isabelle interactive theorem prover. This architecture shows that synthesized loop invariants can serve as a useful communication mechanism between different decision procedures. Using Bohne, we have verified operations on data structures such as linked lists with iterators and back pointers, trees with and without parent pointers, two-level skip lists, array data structures, and sorted lists. We have deployed Bohne in the Hob and Jahob data structure analysis systems, enabling us to combine Bohne with analyses of data structure clients and apply it in the context of larger programs. This report describes the Bohne algorithm as well as techniques that Bohne uses to reduce the amount of annotation and the running time of the analysis.
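To make the invariant shape concrete, here is a minimal sketch (in Python, with illustrative names that are not Bohne's API) of a candidate invariant of the kind described – a disjunction of universally quantified Boolean combinations – checked on concrete states of an in-place linked-list reversal loop. A tool like Bohne proves such invariants symbolically; this sketch only evaluates one on finite states.

```python
# Hypothetical illustration: an invariant of the shape
#   (done is empty and todo covers all original nodes)
#   OR (forall n in original: n is in todo XOR n is in done)
# checked at each iteration of an in-place list reversal.

class Node:
    def __init__(self, data, nxt=None):
        self.data = data
        self.next = nxt

def nodes(head):
    """Collect the nodes reachable from head by following next pointers."""
    out = []
    while head is not None:
        out.append(head)
        head = head.next
    return out

def invariant(todo, done, original):
    t, d, o = set(nodes(todo)), set(nodes(done)), set(original)
    # Disjunct 1: loop not yet started.        Disjunct 2: universally
    # quantified Boolean combination over the original nodes.
    return (not d and t == o) or all((n in t) != (n in d) for n in o)

# Build the list 1 -> 2 -> 3 and reverse it, checking the invariant.
head = Node(1, Node(2, Node(3)))
original = nodes(head)
done = None
while head is not None:
    assert invariant(head, done, original)
    head.next, done, head = done, head, head.next
assert invariant(head, done, original)
```

After the loop, `done` points at the reversed list, and the invariant held at every step.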

    Locating Vagueness

    The claim that all vagueness must be a feature of language or thought is the current orthodoxy. This is a claim about the “location” of vagueness. “Locating Vagueness” argues that this claim is false, largely by defending the possibility of borderline cases in the absence of language and thought. If the orthodoxy about the location of vagueness is false, then so too is any account of the “nature” of vagueness that implies that orthodoxy. This paper therefore concludes that various accounts of the nature of vagueness are false – among them, it argues, the standard versions of supervaluationism and the standard versions of epistemicism. Along the way, I present, and uncover ways to motivate, several heretical accounts of the nature of vagueness, including nonstandard versions of both supervaluationism and epistemicism.

    Polymorphic Endpoint Types for Copyless Message Passing

    We present PolySing#, a calculus that models process interaction based on copyless message passing, in the style of Singularity OS. We equip the calculus with a type system that accommodates polymorphic endpoint types, which are a variant of polymorphic session types, and we show that well-typed processes are free from faults, leaks, and communication errors. The type system is essentially linear, although linearity alone may leave room for scenarios where well-typed processes leak memory. We identify a condition on endpoint types that prevents these leaks from occurring. Comment: In Proceedings ICE 2011, arXiv:1108.014

    Dip-coating process: Silicon sheet growth development for the large-area silicon sheet task of the low-cost silicon solar array project

    The objective of this research program is to investigate the technical and economic feasibility of producing solar-cell-quality sheet silicon by coating one surface of carbonized ceramic substrates with a thin layer of large-grain polycrystalline silicon from the melt. The past quarter demonstrated significant progress in several areas. Seeded growth of silicon-on-ceramic (SOC) with an EFG ribbon seed was demonstrated. Different types of mullite were successfully coated with silicon. A new method of deriving the minority carrier diffusion length, L_n, from spectral response measurements was evaluated. ECOMOD cost projections were found to be in good agreement with the interim SAMIS method proposed by JPL. On the less positive side, there was a decrease in cell performance, which we believe to be due to an unidentified source of impurities.

    Memory Management in the PoSSo Solver

    A uniform general purpose garbage collector may not always provide optimal performance. Sometimes an algorithm exhibits a predictable pattern of memory usage that could be exploited, delaying as much as possible the intervention of the collector. This requires a collector whose strategy can be customized to the needs of an algorithm. We present a dynamic memory management framework which allows such customization, while preserving the convenience of automatic collection in the normal case. The Customizable Memory Management (CMM) framework organizes memory in multiple heaps, each one encapsulating a particular storage discipline. The default heap for collectable objects uses the technique of mostly copying garbage collection, providing good performance and memory compaction. Customization of the collector is achieved through object orientation, by specialising the collector methods for each heap class. We describe how the CMM has been exploited in the implementation of the Buchberger algorithm, by using a special heap for temporary objects created during polynomial reduction. The solution drastically reduces the overall cost of memory allocation in the algorithm.
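The multiple-heap idea can be sketched as follows (a toy Python illustration with invented names, not the CMM's C++ interface): intermediate objects of a phase such as one polynomial reduction go to a dedicated temporary heap that is released wholesale when the phase ends, so they never burden the general collector.

```python
# Toy sketch of per-heap storage disciplines: a default heap for surviving
# objects and a temporary heap whose contents are bulk-freed after a phase.
# All names (Heap, reduce_polynomial, ...) are illustrative assumptions.

class Heap:
    def __init__(self, name):
        self.name = name
        self.objects = []

    def allocate(self, obj):
        self.objects.append(obj)  # track the object in this heap
        return obj

    def release_all(self):
        """Discard every object in this heap at once; return how many."""
        n = len(self.objects)
        self.objects.clear()
        return n

default_heap = Heap("default")
temp_heap = Heap("temporary")

def reduce_polynomial(coeffs):
    # Intermediate terms live on the temporary heap ...
    partial = [temp_heap.allocate(c * 2) for c in coeffs]
    # ... and only the final result survives, on the default heap.
    return default_heap.allocate(sum(partial))

result = reduce_polynomial([1, 2, 3])
freed = temp_heap.release_all()  # one bulk release instead of a GC pass
```

The point of the design is that releasing the temporary heap is a constant-time wholesale operation, independent of how many short-lived objects the reduction created.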

    Incremental Network Design with Minimum Spanning Trees

    Given an edge-weighted graph G = (V, E) and a set E_0 ⊂ E, the incremental network design problem with minimum spanning trees asks for a sequence of edges e'_1, …, e'_T ∈ E \ E_0 minimizing Σ_{t=1}^T w(X_t), where w(X_t) is the weight of a minimum spanning tree X_t for the subgraph (V, E_0 ∪ {e'_1, …, e'_t}) and T = |E \ E_0|. We prove that this problem can be solved by a greedy algorithm. Comment: 9 pages, minor revision based on reviewer comment
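The objective can be sketched in a few lines of Python. This is an assumed reading of "greedy" – at each step, add the remaining edge whose insertion yields the cheapest minimum spanning tree – not the paper's own pseudocode; the graph encoding and function names are our own.

```python
# Sketch: greedy edge ordering for incremental network design with MSTs.
# Edges are (weight, u, v) triples over vertices 0..n-1.

def mst_weight(n, edges):
    """Weight of a minimum spanning forest via Kruskal with union-find."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            total += w
    return total

def greedy_sequence(n, e0, candidates):
    """Order the candidate edges greedily; return the order and the
    objective sum of MST weights after each insertion."""
    built, order, cost = list(e0), [], 0
    remaining = list(candidates)
    while remaining:
        best = min(remaining, key=lambda e: mst_weight(n, built + [e]))
        remaining.remove(best)
        built.append(best)
        order.append(best)
        cost += mst_weight(n, built)
    return order, cost

# Triangle on 3 vertices: E_0 holds two heavy edges, one light edge remains.
e0 = [(5, 0, 1), (4, 1, 2)]
candidates = [(1, 0, 2)]
order, cost = greedy_sequence(3, e0, candidates)
```

In this tiny instance there is only one candidate edge, so T = 1 and the objective is the weight of the final MST, 1 + 4 = 5.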