
    BlueRedMagenta: effective implementation of color, image, animation, and interaction in an original web-based sequential art narrative

    This project reviews and explores the strengths and limitations of the technologies and tools of the web-based sequential art medium in order to find a more efficient process for delivering impactful sequential images that convey a narrative. The information gathered and analyzed here then informs the creative decisions behind the final work, an original web-based sequential art narrative. That final narrative is intended to function as a ‘proof of concept’ for the topics discussed and analyzed.

    Architectural Refinement in HETS

    The main objective of this work is to bring a number of improvements to the Heterogeneous Tool Set HETS, from both a theoretical and an implementation point of view. In the first part of the thesis we present a number of recent extensions of the tool, among which are declarative specifications of logics, generalized theoroidal comorphisms, heterogeneous colimits, and integration of the logic of the term rewriting system Maude. In the second part we concentrate on the CASL architectural refinement language, which we equip with a notion of refinement tree and with calculi for checking the correctness and consistency of refinements. Soundness and completeness of these calculi are also investigated. Finally, we present the integration of the VSE refinement method in HETS as an institution comorphism, so that the proof management component of HETS remains unmodified.
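
    To make the notion of refinement correctness concrete, here is a minimal sketch in Python (not the HETS implementation, which is written in Haskell): specifications are toy nodes whose model classes are plain finite sets, and a refinement step is correct when the refining node's model class is included in that of the node it refines, mirroring the model-class-inclusion reading of refinement.

```python
# Illustrative only: model classes here are finite sets of labels; in HETS
# they are classes of models over institutions, and correctness is
# established via proof obligations rather than set inclusion.
from dataclasses import dataclass, field


@dataclass
class SpecNode:
    """A node of a refinement tree: a named spec with a toy model class."""
    name: str
    models: frozenset
    refinements: list["SpecNode"] = field(default_factory=list)

    def refine(self, child: "SpecNode") -> "SpecNode":
        """Record that `child` refines this specification."""
        self.refinements.append(child)
        return child

    def is_correct(self) -> bool:
        """Correct iff every child's model class is included in this
        node's model class, recursively down the whole tree."""
        return all(
            child.models <= self.models and child.is_correct()
            for child in self.refinements
        )


# Usage: an abstract spec refined in two steps; the check walks the tree.
abstract = SpecNode("Monoid", frozenset({"m1", "m2", "m3"}))
step = abstract.refine(SpecNode("CommMonoid", frozenset({"m1", "m2"})))
step.refine(SpecNode("NatWithPlus", frozenset({"m1"})))
print(abstract.is_correct())  # True: each step shrinks the model class
```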

    Bayesian Uncertainty Analysis and Decision Support for Complex Models of Physical Systems with Application to Production Optimisation of Subsurface Energy Resources

    Important decision-making problems are increasingly addressed using computer models of complex real-world systems. However, there are major limitations to their direct use, including their complex structure, large numbers of inputs and outputs, and the presence of many sources of uncertainty, all further compounded by long evaluation times. Bayesian methodology for the analysis of computer models has been extensively developed to perform inference for the underlying physical systems. In this thesis, the Bayesian uncertainty analysis methodology is extended to provide robust decision support under uncertainty. Bayesian emulators are employed as a fast and efficient statistical approximation to computer models. We establish a hierarchical Bayesian emulation framework that exploits known constrained simulator behaviour in constituents of the decision support utility function. In addition, novel Bayesian emulation methodology is developed for computer models with structured partial discontinuities. We advance the crucial uncertainty quantification methodology to perform a robust decision analysis: we develop a technique to assess and remove linear transformations of the utility function induced by sources of uncertainty to which conclusions are invariant, and we incorporate structural model discrepancy and decision implementation error. These are encompassed within a novel iterative decision support procedure which acknowledges the utility function uncertainty arising from the separation of the analysts and the final decision makers, delivering a robust class of decisions, along with any additional information, for further consideration. The complete toolkit is successfully demonstrated via an application to the problem of optimal petroleum field development, including an internationally and commercially important benchmark challenge.
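
    To ground the emulation idea, the following is a minimal sketch in Python of a Gaussian process emulator for an expensive simulator: the simulator is run only at a handful of design points, and the emulator then provides fast predictions with quantified uncertainty everywhere else. The squared-exponential kernel, its hyperparameters, and the one-dimensional stand-in ‘simulator’ are illustrative assumptions, not the hierarchical framework developed in the thesis.

```python
# A minimal sketch of basic Gaussian process emulation: zero-mean GP,
# squared-exponential kernel, 1-D inputs. Real applications use mean
# functions, fitted hyperparameters, and high-dimensional designs.
import numpy as np


def kernel(x, y, variance=1.0, lengthscale=0.3):
    """Squared-exponential covariance between input arrays x and y."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)


def emulate(x_train, f_train, x_new, nugget=1e-8):
    """Posterior mean and variance of a zero-mean GP emulator at x_new."""
    K = kernel(x_train, x_train) + nugget * np.eye(len(x_train))
    k_star = kernel(x_new, x_train)
    mean = k_star @ np.linalg.solve(K, f_train)
    var = kernel(x_new, x_new).diagonal() - np.einsum(
        "ij,ij->i", k_star, np.linalg.solve(K, k_star.T).T
    )
    return mean, var


# The expensive simulator is evaluated only at a few design points; the
# emulator then predicts (with uncertainty) at hundreds of new inputs.
simulator = lambda x: np.sin(3 * x) + 0.5 * x  # stand-in "computer model"
x_design = np.linspace(0, 2, 8)
mean, var = emulate(x_design, simulator(x_design), np.linspace(0, 2, 100))
```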

    A Bayesian approach to parallel line bioassay

    This thesis considers parallel line bioassay from a Bayesian point of view, along the lines laid out by Lindley (1972) and de Finetti (1975). The mathematical model used for the analysis is a non-linear one in which the log potency appears explicitly as a parameter. This enables prior knowledge about the log potency ratio to be incorporated straightforwardly in the analysis. The method of analysis follows closely the ideas of Lindley and Smith (1972) for the linear model. Extended models accounting for experimental design features such as randomized blocks and Latin squares are also considered, and a method for using prior information to design an assay is given. In addition to the analysis of a single assay, the problem of combining information from several assays is considered, and two different models which combine such information are discussed.
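
    For orientation, the classical parallel-line model can be written as follows; this is the standard textbook parametrisation in which the log potency ratio appears explicitly as a parameter, given as a sketch rather than the thesis's exact non-linear formulation.

```latex
% Standard (S) and test (T) preparations share a common slope
% (parallelism); x is the log dose and \mu is the log potency ratio,
% so the two dose-response lines are horizontally offset by exactly \mu.
\[
  \mathrm{E}[y_S] = \alpha + \beta x ,
  \qquad
  \mathrm{E}[y_T] = \alpha + \beta (x + \mu).
\]
% Prior knowledge about the log potency ratio enters directly as a prior
% density p(\mu), which is what makes this parametrisation convenient.
```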

    From "Mixed" to "Applied" Mathematics: Tracing an important dimension of mathematics and its history

    The workshop investigated historical variations in the ways boundaries were drawn between ‘pure’ mathematics on the one hand and ‘mixed’ or ‘applied’ mathematics on the other, from about 1500 until today. It brought together historians and philosophers of mathematics as well as several mathematicians working on applications. Emphasis was laid upon clarifying the relation between the historical use of the various categories and their historiographical usefulness and philosophical soundness.

    Rhythm, Rhythmanalysis and Algorithm-Analysis

    It has been said that the contemporary Western world has been shaped by, if not actually born from, the algorithm. We live in a computational culture, more specifically an algorithmic culture, as Alexander Galloway pointed out more than a decade ago. One of the excellent New Economics Foundation reports puts it thus: “[algorithms] have morphed from curating online content to curating and influencing our lives.” Indeed, capitalism’s current financialized mode depends entirely on algorithmic calculation, as the basis of derivatives, high-speed trading and the new fintech sector, for example. Platform capitalism relies on algorithmic machine learning and AI, as does manufacturing. Expert systems for medical diagnosis and robot surgery are built from algorithmic machine learning. Political campaigning exploits the micro-targeting of social media messages, as we have learnt from the Cambridge Analytica scandal, not to mention the Snowden revelations of the most extensive government mass surveillance operations the world has ever seen. Pattern-of-life analysis has quite literally been adopted in the algorithms of the “kill chain” of drone bombers.