Automatically refining partial specifications for Program Verification
DOI: 10.1007/978-3-642-21437-0_28. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 6664 LNCS, pp. 369-38
Hunter-gatherers in a howling wilderness: Neoliberal capitalism as a language that speaks itself
The 'self-referential' character of evolutionary process noted by Goldenfeld and Woese (2010) can be restated in the context of a generalized Darwinian theory applied to economic process through a 'language' model: the underlying inherited and learned culture of the firm, the short-time cognitive response of the firm to patterns of threat and opportunity that is sculpted by that culture, and the embedding socioeconomic environment are represented as interacting information sources constrained by the asymptotic limit theorems of information theory. If unregulated, the larger, compound source that characterizes high-probability evolutionary paths of this composite then becomes, literally, a self-dynamic language that speaks itself. Such a structure is, for those enmeshed in it, more akin to a primitive hunter-gatherer society at the mercy of internal ecological dynamics than to, say, a neolithic agricultural community in which a highly ordered, deliberately adapted ecosystem is consciously farmed so as to match its productivity to human needs.
Verification of Pointer-Based Programs with Partial Information
The proliferation of software across all aspects of people's lives means that software failure can bring catastrophic results. It is therefore highly desirable to be able to develop software that is verified to meet its expected specification. This has also been identified as a key objective in one of the UK Grand Challenges (GC6) (Jones et al., 2006; Woodcock, 2006). However, many difficult problems remain in achieving this objective, partially due to the wide use of (recursive) shared mutable data structures, which are hard to track statically in a precise and concise way.
This thesis aims to build a verification system for both memory safety and functional correctness of programs manipulating pointer-based data structures, one that can deal with two scenarios in which only partial information about the program is available: the verifier may be supplied with only a partial program specification, or with a full specification but only part of the program code. For the first scenario, previous state-of-the-art works (Nguyen et al., 2007; Chin et al., 2007; Nguyen and Chin, 2008; Chin et al., 2010) generally require users to provide full specifications for each method of the program to be verified. Such approaches demand considerable intellectual effort from users, who are moreover liable to make mistakes when writing such specifications. This thesis proposes a new approach to program verification that allows users to provide only partial specifications for methods. Our approach then refines the given annotations into more complete specifications by discovering missing constraints. The discovered constraints may involve both numerical and multiset properties that can later be confirmed or revised by users. We further augment our approach by requiring partial specifications only for the primary methods of a program; specifications for loops and auxiliary methods are then systematically discovered by our augmented mechanism, with the help of information propagated from the primary methods. This work aims to verify beyond shape properties, with the eventual goal of analysing both memory safety and functional properties of pointer-based data structures. Initial experiments have confirmed that we can automatically refine partial specifications with non-trivial constraints, thus making it easier for users to handle specifications with richer properties.
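The refinement idea in this first scenario can be illustrated with a toy sketch (the names and the elimination-based "discovery" below are illustrative assumptions, not the thesis's actual inference mechanism): a user supplies only the shape part of an append method's specification, and the missing length constraint on the result is discovered by eliminating candidate constraints inconsistent with observed behaviour.

```python
# Toy illustration of refining a partial specification by discovering a
# missing numerical constraint.  Candidate constraints relate the input
# lengths n, m to the result length r; inconsistent ones are eliminated.

def append(xs, ys):
    return xs + ys  # stand-in for a pointer-based list append

# Hypothetical candidate constraints (illustrative, not exhaustive).
candidates = {
    "r = n":     lambda n, m, r: r == n,
    "r = m":     lambda n, m, r: r == m,
    "r = n + m": lambda n, m, r: r == n + m,
}

def refine(candidates, samples):
    """Keep only the candidate constraints consistent with every sample."""
    survivors = dict(candidates)
    for xs, ys in samples:
        n, m, r = len(xs), len(ys), len(append(xs, ys))
        survivors = {name: c for name, c in survivors.items() if c(n, m, r)}
    return survivors

samples = [([], [1]), ([1, 2], [3]), ([1], [2, 3, 4])]
refined = refine(candidates, samples)
print(sorted(refined))  # discovered constraint: ['r = n + m']
```

The surviving constraint would then be proposed to the user for confirmation or revision, as the abstract describes.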
For the second scenario, many programs contain invocations of unknown components, and hence only part of the program code is available to the verifier. As previous works generally require the whole of the program code to be present, we target the verification of memory safety and functional correctness of programs manipulating pointer-based data structures whose code is only partially available due to invocations of unknown components. Provided with a Hoare-style specification ({Pre} prog {Post}) where the program (prog) contains calls to some unknown procedure (unknown), we infer a specification (mspecu) for the unknown part (unknown) from the calling contexts, such that the problem of verifying the program (prog) can be safely reduced to the problem of proving that the unknown procedure (unknown), once its code is available, meets the derived specification (mspecu). The expected specification (mspecu) is automatically calculated using an abduction-based shape analysis specifically designed for a combined abstract domain. We have implemented a system to validate the viability of our approach, with encouraging experimental results.
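The reduction in this second scenario can be sketched in miniature (an illustrative assumption: program states are modelled as plain sets of predicate strings, far simpler than the combined shape/numerical abstract domain the thesis actually uses): the specification of the unknown call is abduced as whatever the postcondition requires but the calling context does not already establish.

```python
# Toy sketch of abducing a specification for an unknown procedure:
# the obligation placed on unknown() is whatever Post needs beyond
# what the known code and precondition already provide.

def abduce_unknown_spec(pre, known_effects, post):
    """Infer {pre_u} unknown {post_u} so that {pre} known; unknown {post} holds."""
    state_before_call = pre | known_effects      # facts after the known prefix
    missing = post - state_before_call           # what unknown() must add
    return state_before_call, state_before_call | missing

pre  = {"ll(x, n)"}                  # x points to a list of length n
post = {"ll(x, n)", "sorted(x)"}     # afterwards the list is also sorted
known_effects = set()                # the known code leaves the heap alone

pre_u, post_u = abduce_unknown_spec(pre, known_effects, post)
print(post_u - pre_u)  # abduced obligation for unknown(): {'sorted(x)'}
```

Once the unknown procedure's code becomes available, verifying it against the abduced pre/post pair discharges the original proof obligation, mirroring the reduction described above.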
Planning dextrous robot hand grasps from range data, using preshapes and digit trajectories
Dextrous robot hands have many degrees of freedom. This enables the manipulation of
objects between the digits of the dextrous hand but makes grasp planning substantially
more complex than for parallel jaw grippers. Much of the work that addresses grasp
planning for dextrous hands concentrates on the selection of contact sites to optimise
stability criteria and ignores the kinematics of the hand. In more complete systems,
the paradigm of preshaping has emerged as dominant. However, the criteria for the
formation and placement of the preshapes have not been adequately examined, and
the usefulness of the systems is therefore limited to grasping simple objects for which
preshapes can be formed using coarse heuristics.

In this thesis a grasp metric based on stability and kinematic feasibility is introduced.
The preshaping paradigm is extended to include consideration of the trajectories that
the digits take during closure from preshape to final grasp. The resulting grasp family
is dependent upon task requirements and is designed for a set of "ideal" object-hand
configurations. The grasp family couples the degrees of freedom of the dextrous hand
in an anthropomorphic manner; the resulting reduction in freedom makes the grasp
planning less complex. Grasp families are fitted to real objects by optimisation of the
grasp metric; this corresponds to fitting the real object-hand configuration as close to
the ideal as possible. First, the preshape aperture, which defines the positions of the
fingertips in the preshape, is found by optimisation of an approximation to the grasp
metric (which makes simplifying assumptions about the digit trajectories and hand
kinematics). Second, the full preshape kinematics and digit closure trajectories are
calculated to optimise the full grasp metric.

Grasps are planned on object models built from laser striper range data from two
viewpoints. A surface description of the object is used to prune the space of possible
contact sites and to allow the accurate estimation of normals, which is required by the
grasp metric to estimate the amount of friction required. A voxel description, built by
ray-casting, is used to check for collisions between the object and the robot hand using
an approximation to the Euclidean distance transform.

Results are shown in simulation for a 3-digit hand model, designed to be like a simplified
human hand in terms of its size and functionality. There are clear extensions of the
method to any dextrous hand with a single thumb opposing multiple fingers and several
different hand models that could be used are described. Grasps are planned on a wide
variety of curved and polyhedral objects.
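The approximate Euclidean distance transform used for collision checking can be sketched with a classic two-pass chamfer transform (a 2-D toy with assumed 3-4 chamfer weights; the thesis operates on 3-D voxel grids built by ray-casting):

```python
# Minimal 2-D chamfer distance transform: each free voxel receives an
# approximate distance (in units of ~3 per voxel) to the nearest
# occupied voxel, computed in two raster passes.

def chamfer_distance(grid):
    """Two-pass 3-4 chamfer transform; grid cells are 1 (occupied) or 0."""
    h, w = len(grid), len(grid[0])
    INF = 10**9
    d = [[0 if grid[y][x] else INF for x in range(w)] for y in range(h)]
    # Forward pass: propagate from top-left (cost 3 axial, 4 diagonal).
    for y in range(h):
        for x in range(w):
            for dy, dx, c in ((0, -1, 3), (-1, 0, 3), (-1, -1, 4), (-1, 1, 4)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    # Backward pass: propagate from bottom-right.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            for dy, dx, c in ((0, 1, 3), (1, 0, 3), (1, 1, 4), (1, -1, 4)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + c)
    return d

# A single occupied voxel in the centre of a 3x3 grid.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
dist = chamfer_distance(grid)  # axial neighbours get 3, diagonals get 4
```

A collision check then reduces to testing whether the distance at a hand voxel falls below a clearance threshold, which is far cheaper than exact geometric intersection tests.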
Configurational meaning and conceptual shifts in design
Configuration is defined as the entailment of a set of co-present relationships embedded in a design, such that we can read a logic into the way in which the design is put together. We discuss conceptual shifts during design, with particular emphasis on the designer's understanding of what kind of configuration the particular design is. The design for the Unitarian Church offers an historical example of such shifts, authorised by Kahn's own post-rationalisation of the design process. We subsequently construct a formal computational experiment in which the generation, description and re-conceptualisation of designs are rendered entirely discursive. The experiment serves to clarify the nature of conceptual shifts in actual design, and the reasons why a reading of such shifts cannot be based on discursive evidence only but necessarily requires us to engage presentational forms of symbolisation as well. Our examples demonstrate how a conceptual shift within a particular design can lead to the discovery of a new potential design world. In the historical case, the conceptualisation of a new design world remains implicit and inadequately specified. The theoretical experiment, however, allows us to make explicit how geometrically similar configurations that arise from the application of one set of generative rules may possess systematic but entirely unanticipated perceptual properties, subsequently incorporated in new generative rules.
Predicting the approximate functional behaviour of physical systems
This dissertation addresses the problem of the computer prediction of the approximate
behaviour of physical systems describable by ordinary differential equations.

Previous approaches to behavioural prediction have either focused on an exact
mathematical description or on a qualitative account. We advocate a middle ground: a
representation more coarse than an exact mathematical solution yet more specific than a
qualitative one. What is required is a mathematical expression, simpler than the exact
solution, whose qualitative features mirror those of the actual solution and whose
functional form captures the principal parameter relationships underlying the behaviour of
the real system. We term such a representation an approximate functional solution.

Approximate functional solutions are superior to qualitative descriptions because they
reveal specific functional relationships, restore a quantitative time scale to a process and
support more sophisticated comparative analysis queries. Moreover, they can be superior to
exact mathematical solutions by emphasizing comprehensibility, adequacy and practical
utility over precision.

Two strategies for constructing approximate functional solutions are proposed. The first
abstracts the original equation, predicts behaviour in the abstraction space and maps this
back to the approximate functional level. Specifically, analytic abduction exploits
qualitative simulation to predict the qualitative properties of the solution and uses this
knowledge to guide the selection of a parameterized trial function which is then tuned with
respect to the differential equation. In order to limit the complexity of a proposed
approximate functional solution, and hence maintain its comprehensibility,
back-of-the-envelope reasoning is used to simplify overly complex expressions in a
magnitude extreme. If no function is recognised which matches the predicted behaviour,
segment calculus is called upon to find a composite function built from known primitives
and a set of operators. At the very least, segment calculus identifies a plausible structure
for the form of the solution (e.g. that it is a composition of two unknown functions).
Equation parsing capitalizes on this partial information to look for a set of termwise
interactions which, when interpreted, expose a particular solution of the equation.

The second, and more direct, strategy for constructing an approximate functional solution is
embodied in the closed form approximation technique. This extends approximation
methods to equations which lack a closed form solution. This involves solving the
differential equation exactly, as an infinite series, and obtaining an approximate functional
solution by constructing a closed form function whose Taylor series is close to that of the
exact solution.

The above techniques dovetail to achieve a style of reasoning closer to that of an
engineer or physicist than to that of a mathematician. The key difference is that the
goal of finding the correct solution of the differential equation is sacrificed in favour
of finding an approximation which is adequate for the purpose to which the knowledge
will be put. Applications to Intelligent Tutoring and Design Support Systems are suggested.
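The closed form approximation technique can be illustrated with a toy sketch (illustrative assumptions: the trial closed form is a*exp(b*t), and the "exact" solution is available only as truncated Taylor coefficients): the free parameters are chosen so that the trial function's leading Taylor coefficients match the series, and higher-order terms measure the quality of the fit.

```python
# Toy closed form approximation: given Taylor coefficients of the exact
# solution, fit a*exp(b*t) (whose series is a + a*b*t + a*b**2/2*t**2 + ...)
# by matching the constant and linear terms.
import math

def fit_exponential(coeffs):
    """Match a*exp(b*t) against series coefficients c0 + c1*t + ..."""
    c0, c1 = coeffs[0], coeffs[1]
    a = c0
    b = c1 / c0
    return a, b

# Series solution of y' = -2*y, y(0) = 3:  c_n = 3*(-2)**n / n!
coeffs = [3 * (-2) ** n / math.factorial(n) for n in range(6)]
a, b = fit_exponential(coeffs)

# The quadratic term of a*exp(b*t) should be close to the series' c2.
residual = abs(a * b ** 2 / 2 - coeffs[2])
print(a, b, residual)  # 3.0 -2.0 0.0
```

Here the fit is exact because the underlying solution really is an exponential; for equations lacking a closed form solution, the residual on higher-order coefficients indicates how adequate the approximation is for its intended purpose.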
Automated Verification of Complete Specification with Shape Inference
Ph.D. (Doctor of Philosophy)
The criminal governance, geography, and network features of extortive offences in El Salvador
This thesis studies offender choices associated with extortive offences in El Salvador, Central America. Super gangs like MS-13 and Barrio 18 have turned extortion into one of the most important impediments to economic development in various countries of the isthmus. The control, influence, and prevalence of the illicit organizations found in Latin American contexts such as El Salvador are rare. Criminal actors take advantage of the state's poor governance to rule over large portions of the territory, sometimes establishing secret deals with official authorities to legitimize their power. Yet little is known about the impact these circumstances have on offender decision-making. Borrowing from political science, the studies in this thesis turn to the criminal governance framework to capture the conditions faced by extortionists during three separate periods in El Salvador's recent history and to examine their impact on offender decisions. The findings suggest that closer partnerships between illicit organizations and state agents remove constraints and add incentives that provide offenders with more options, but that these effects are mediated by features of the crime groups and the contexts in which they operate. Using these results, this thesis proposes a preliminary conceptual model of offender choices in extortion under criminal governance.
Dynamic Functional Connectivity Between Cortex and Muscles
The motor cortex is recognized as the origin of the major direct path from cortex to muscles. Although it has been studied for over a century, relatively little is known about how the motor cortex facilitates reach-to-grasp movements. We collected a rich dataset from monkeys trained to reach for and grasp objects of different shapes, presented at various orientations and spatial locations. We simultaneously recorded single-unit activity from motor cortical areas (mainly the caudal bank of the pre-central gyrus), EMG activity from selected muscles (in the arm, wrist and hand) and high-resolution kinematic data from the wrist and hand. We show that motor-cortical neurons modulate their activity in an object-specific manner, resulting in object-specific co-activation of muscles and joint movements. We studied the multivariate relationships between the firing rates of individual neurons, EMG, joint angles and joint angle velocities, and found that both EMG and kinematic features were encoded in the neural firing rates. Kinematic features were much better predictors of neural firing rates than EMG. We found that the best predictors of neural firing rates were neither individual muscles or joints, nor kinematic or EMG synergies extracted using PCA/ICA, but neuron-specific combinations of EMG and kinematic features. We show better predictions of both muscle activations and joint angle values when combining the activity of a few tens of sequentially recorded neurons, suggesting that neural activity contains synergistic information related to EMG that is not independently present in individual neurons. Using functional connectivity, defined as the probability of observing changes in EMG following spikes from a trigger neuron, we further elucidated the link from motor cortical activity to muscle activation.
By studying both short-time-scale functional connectivity, on the order of milliseconds, and long-time-scale functional connectivity, on the order of hundreds of milliseconds, we found that flexible long-time-scale functional connections between individual neurons and muscles were modulated by kinematic features, which could account for the relatively weaker relation of neural firing rates to EMG. To support our findings, we show examples of simultaneous short-time-scale functional connectivity and conclude that neuronal-muscular functional connectivity is flexible and task-dependent.
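The functional-connectivity measure defined above, the probability of observing an EMG change following spikes from a trigger neuron, can be sketched on toy data (the window length, threshold, and signals below are illustrative assumptions, not the thesis's actual parameters):

```python
# Toy spike-triggered connectivity: the fraction of spikes from a trigger
# neuron that are followed, within a short window, by an EMG rise above
# a threshold relative to the EMG level at spike time.

def functional_connectivity(spikes, emg, window=3, threshold=0.5):
    """Fraction of spikes followed by an EMG rise above threshold."""
    hits = 0
    spike_times = [t for t, s in enumerate(spikes) if s]
    for t in spike_times:
        post = emg[t + 1:t + 1 + window]
        if post and max(post) - emg[t] > threshold:
            hits += 1
    return hits / len(spike_times) if spike_times else 0.0

# Toy signals: the muscle reliably activates two samples after each spike.
spikes = [0] * 50
emg = [0.0] * 50
for t in (5, 15, 25, 35):
    spikes[t] = 1
    emg[t + 2] = 1.0

print(functional_connectivity(spikes, emg))  # 1.0
```

Varying the window length in such a measure is what distinguishes the short- and long-time-scale connectivity contrasted above.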