
    Complex event types for agent-based simulation

    This thesis presents a novel formal modelling language, complex event types (CETs), to describe behaviours in agent-based simulations. CETs are able to describe behaviours at any computationally represented level of abstraction. Behaviours can be specified both in terms of the state transition rules of the agent-based model that generate them and in terms of the state transition structures themselves. Based on CETs, novel computational statistical methods are introduced which allow statistical dependencies between behaviours at different levels to be established. Different dependencies formalise different probabilistic causal relations and Complex Systems constructs such as ‘emergence’ and ‘autopoiesis’. Explicit links are also made between the different types of CET inter-dependency and the theoretical assumptions they represent. With the novel computational statistical methods, three categories of model can be validated and discovered: (i) inter-level models, which define probabilistic dependencies between behaviours at different levels; (ii) multi-level models, which define the set of simulations for which an inter-level model holds; (iii) inferred predictive models, which define latent relationships between behaviours at different levels. The CET modelling language and computational statistical methods are then applied to a novel agent-based model of colonic cancer to demonstrate their applicability to Complex Systems sciences such as Systems Biology. This proof-of-principle model provides a framework for further development of a detailed integrative model of the system, which can progressively incorporate biological data from different levels and scales as these become available.
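
The inter-level dependencies described above can be illustrated with a toy calculation. The sketch below is not the thesis's CET formalism: the voter-style model and the `micro_event`/`macro_event` definitions are hypothetical stand-ins, and the dependency measure is a plain plug-in estimate of mutual information between event occurrences across repeated simulation runs.

```python
import random
from collections import Counter
from math import log2

random.seed(0)

def run_sim(steps=50, n_agents=20):
    # Toy agent-based model (a voter model): each step one agent
    # copies the state of another, randomly chosen agent.
    states = [random.randint(0, 1) for _ in range(n_agents)]
    history = [list(states)]
    for _ in range(steps):
        i, j = random.randrange(n_agents), random.randrange(n_agents)
        states[i] = states[j]
        history.append(list(states))
    return history

# Two hypothetical event types at different levels of abstraction.
def micro_event(history):
    # Agent-level: some agent changed state on the final step.
    return history[-1] != history[-2]

def macro_event(history):
    # Population-level: near-consensus (>80% share one state).
    n = len(history[-1])
    ones = sum(history[-1])
    return max(ones, n - ones) / n > 0.8

def mutual_information(pairs):
    # Plug-in estimate of MI (in bits) between two binary events.
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(
        (c / n) * log2(c * n / (px[x] * py[y]))
        for (x, y), c in joint.items()
    )

pairs = [(micro_event(h), macro_event(h)) for h in (run_sim() for _ in range(500))]
mi = mutual_information(pairs)
print(f"estimated dependency between micro and macro event: {mi:.4f} bits")
```

A nonzero estimate suggests a statistical dependency between the agent-level and population-level behaviours; the thesis's methods go further by classifying such dependencies into inter-level, multi-level, and inferred predictive models.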

    ă€ç ”ç©¶ćˆ†é‡Žćˆ„ă€‘ă‚·ăƒŒă‚ș集 [英èȘžç‰ˆ]

    [English edition]

    Algorithmic Analysis Techniques for Molecular Imaging

    This study addresses image processing techniques for two medical imaging modalities: Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI), which can be used to study human body function and anatomy in a non-invasive manner. In PET, the so-called Partial Volume Effect (PVE) is caused by the low spatial resolution of the modality. The efficiency of a set of PVE-correction methods is evaluated in the present study. These methods use information about tissue borders acquired with the MRI technique. As another contribution, a novel method is proposed for MRI brain image segmentation. A standard approach in brain MRI is to use spatial prior information in image segmentation. While this works for adults and healthy neonates, the large anatomical variations in premature infants preclude its direct application. The proposed technique can be applied to both healthy and non-healthy premature infant brain MR images. Diffusion Weighted Imaging (DWI) is an MRI-based technique that can be used to create images for measuring physiological properties of cells at the structural level. We optimise the scanning parameters of DWI so that the required acquisition time can be reduced while still maintaining good image quality. In the present work, PVE-correction methods and physiological DWI models are evaluated in terms of the repeatability of the results. This gives information on the reliability of the measures produced by the methods. The evaluations are done using physical phantom objects, correlation measurements against expert segmentations, computer simulations with realistic noise modelling, and repeated measurements conducted on real patients. In PET, the applicability and selection of a suitable partial volume correction method was found to depend on the target application. For MRI, the data-driven segmentation offers an alternative when using a spatial prior is not feasible. For DWI, the distribution of b-values turns out to be a central factor affecting the time-quality ratio of the DWI acquisition. An optimal b-value distribution was determined. This helps to shorten the imaging time without hampering the diagnostic accuracy.
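
The role of the b-value distribution can be illustrated with a minimal simulation. This is not the thesis's actual optimisation procedure: the mono-exponential signal model, the noise level, and the two candidate b-value sets below are illustrative assumptions. It shows why the placement of b-values affects the precision of the fitted apparent diffusion coefficient (ADC).

```python
import math
import random

random.seed(1)

S0, ADC = 1000.0, 0.8e-3   # assumed baseline signal and diffusivity (mm^2/s)

def simulate_signal(b, sigma=5.0):
    # Mono-exponential DWI model with additive Gaussian noise.
    return S0 * math.exp(-b * ADC) + random.gauss(0.0, sigma)

def fit_adc(b_values):
    # Log-linear least-squares fit: ln S(b) = ln S0 - b * ADC.
    xs = b_values
    ys = [math.log(max(simulate_signal(b), 1e-9)) for b in b_values]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return -slope  # estimated ADC

# Two candidate b-value distributions (s/mm^2) with equal acquisition cost:
narrow = [0, 100, 200, 300]
spread = [0, 333, 667, 1000]

est_narrow = [fit_adc(narrow) for _ in range(200)]
est_spread = [fit_adc(spread) for _ in range(200)]

def spread_of(estimates):
    # Standard deviation of repeated ADC estimates (lower = more precise).
    m = sum(estimates) / len(estimates)
    return (sum((e - m) ** 2 for e in estimates) / len(estimates)) ** 0.5

print("ADC estimate std, narrow b-range:", spread_of(est_narrow))
print("ADC estimate std, spread b-range:", spread_of(est_spread))
```

Under these assumptions, the wider b-value distribution yields a more precise slope estimate at the same number of acquisitions, which is the kind of time-quality trade-off the optimisation addresses.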

    Applying the Free-Energy Principle to Complex Adaptive Systems

    The free energy principle is a mathematical theory of the behavior of self-organizing systems that originally gained prominence as a unified model of the brain. Since then, the theory has been applied to a plethora of biological phenomena, extending from single-celled and multicellular organisms through to niche construction and human culture, and even the emergence of life itself. The free energy principle tells us that perception and action operate synergistically to minimize an organism’s exposure to surprising biological states, which are more likely to lead to decay. A key corollary of this hypothesis is active inference: the idea that all behavior involves the selective sampling of sensory data so that we experience what we expect to (in order to avoid surprises). Simply put, we act upon the world to fulfill our expectations. It is now widely recognized that the implications of the free energy principle for our understanding of the human mind and behavior are far-reaching and profound. To date, however, its capacity to extend beyond our brain, to more generally explain living and other complex adaptive systems, has only just begun to be explored. The aim of this collection is to showcase the breadth of the free energy principle as a unified theory of complex adaptive systems: conscious, social, living, or not.
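
The core quantity of the theory can be written down in a few lines. The sketch below is a minimal worked example, not a model from the collection: for a two-state world with an assumed prior and likelihood, it computes the variational free energy F(q) = ÎŁ_s q(s)[ln q(s) − ln p(o, s)] and checks that the exact posterior minimises it, with the minimum equal to the surprise −ln p(o).

```python
from math import log

# Hypothetical two-state world: s in {0, 1}. The organism holds a belief
# q(s) and receives an observation o with likelihood p(o | s).
prior = [0.5, 0.5]          # p(s)
likelihood = [0.9, 0.2]     # p(o = 1 | s) for s = 0, 1

def free_energy(q, o_observed=True):
    # Variational free energy: F = sum_s q(s) [ln q(s) - ln p(o, s)].
    # Equivalently F = KL(q || p(s|o)) - ln p(o), so F upper-bounds surprise.
    F = 0.0
    for s in (0, 1):
        p_o_given_s = likelihood[s] if o_observed else 1 - likelihood[s]
        joint = p_o_given_s * prior[s]
        if q[s] > 0:
            F += q[s] * (log(q[s]) - log(joint))
    return F

# The exact posterior p(s | o = 1) minimises F; the minimum is -ln p(o).
evidence = sum(likelihood[s] * prior[s] for s in (0, 1))
posterior = [likelihood[s] * prior[s] / evidence for s in (0, 1)]

print("F at prior belief:    ", free_energy(prior))
print("F at posterior belief:", free_energy(posterior))
print("surprise -ln p(o):    ", -log(evidence))
```

Perception, on this reading, is belief updating that drives F down toward the surprise bound; active inference extends the same idea by letting action change which observations arrive in the first place.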

    New Fundamental Technologies in Data Mining

    The progress of data mining technology and its growing public popularity establish a need for a comprehensive text on the subject. The series of books entitled "Data Mining" addresses this need by presenting in-depth descriptions of novel mining algorithms and many useful applications. In addition to explaining each topic in depth, the two books offer useful hints and strategies for solving the problems discussed in their chapters. The contributing authors have highlighted many future research directions that will foster multi-disciplinary collaborations and hence will lead to significant development in the field of data mining.

    Women in Science 2015

    Women in Science 2015 summarizes research done by Smith College’s Summer Research Fellowship (SURF) Program participants. Ever since its 1967 start, SURF has been a cornerstone of Smith’s science education. In 2015, 162 students participated in SURF (153 hosted on campus and nearby field sites), supervised by 60 faculty mentor-advisors drawn from the Clark Science Center and connected to its eighteen science, mathematics, and engineering departments and programs and associated centers and units. At summer’s end, SURF participants were asked to summarize their research experiences for this publication.

    Detotalization and Retroactive Force: The Semiotics of the Black Pyramid

    The general problem of the dissertation is the integrability of semiotics. Detotalization describes a tradition of semiotics according to which a closed totality is impossible, and which takes psychoanalysis, the critique of ideology, and structural semiology as its principal theoretical coordinates. An important analytical tool is the author's "black pyramid" schema-model, which is used to answer the question: how can a purely differential (internal) system, based on differences, relate to the external? The dissertation then develops a critique of the subjective relativism found in semiotics and introduces the notion of retroactivity, through which external influences are described; the subdomains of semiotics are examined from the standpoint of how retroactivity operates. The "black pyramid" schema-model hybridizes Peirce's and Hjelmslev's semiotics, integrating Peirce with detotalization. The schema distinguishes the domains of sign function and sign production, and through this division merges the Peircean trichotomy with the Saussurean dichotomy. Such a synthesis has two advantages. First, the subjectivist relativism of detotalization is anchored by the empirical and logical applications of cognitive semiotics and biosemiotics. Second, cognitive semiotics and biosemiotics are enriched by the textual procedures of retroactivity, which give access to the external without compromising the definition of the sign. There is thus a scientific explanation of the underlying articulatory matrix, but also a need within scientific semiotics for the textual experimentation characteristic of detotalization. Retroactivity is the unifying concept that binds the two separated domains of semiotics. By also integrating cognitive semiotics and biosemiotics into the picture of detotalized semiotics, the dissertation ultimately offers a non-reductive and empirical answer to the problem of relativism in semiotics, while preserving the theoretical integrity of semiotics and proposing a unified metalanguage for the fragmented social sciences.
    Detotalization describes the tradition of semiotics which takes psychoanalysis, ideology critique, and structural semiology as its major theoretical coordinates. Interest in these coordinates has declined against the ascent of the semiotics of Charles Peirce, and the two approaches are sometimes construed as irreconcilable, but the dissertation seeks to integrate Peirce into the coordinates of detotalization. This integration requires that Peirce be read in the way that Jacques Derrida and Umberto Eco propose to read him, by moderating his realism. This is achieved through a theorization of the notion of retroactivity. Chapters one through four restate the coordinates of detotalization in terms of retroactivity, and chapter five searches the domains of cognitive semiotics and biosemiotics for the Peircean equivalent of retroactivity. The black pyramid schema is a picture of the Peirce-Hjelmslev hybrid, in which Peirce is integrated with detotalization. In the schema, semiotics is organized by the domains of sign function and sign production, and the Peircean trichotomy is reconciled with the Saussurean dichotomy by means of this division. The synthesis has two advantages. In one direction, the subjectivist relativism of detotalization is anchored by the empirical and logical applications of cognitive semiotics and biosemiotics. In the other direction, cognitive semiotics and biosemiotics are enhanced by the textual procedures of retroactivity, which account for the external without compromising the definition of the sign by importing a naĂŻve referent. There is a scientific explanation for the profound articulatory matrix, but there is also a need within scientific semiotics for the textual experimentation characteristic of detotalization. Retroactivity, as the bridge concept between the two divided camps of semiotics, also restores its original ambition: to provide a unifying vocabulary for the fractured social sciences.
    https://www.ester.ee/record=b540146

    ARTIFICIAL INTELLIGENCE, LLC: CORPORATE PERSONHOOD AS TORT REFORM

    Our legal system has long tried to fit the square peg of artificial intelligence (AI) technologies into the round hole of the current tort regime, overlooking the inability of traditional liability schemes to address the nuances of how AI technology creates harms. The current tort regime deals out rough justice, using strict liability for some AI products and the negligence rule for other AI services, both of which are insufficiently tailored to achieve public policy objectives. Under a strict liability regime, where manufacturers are always held liable for the faults of their technology regardless of knowledge or precautionary measures, firms are incentivized to play it safe and stifle innovation. But even with this cautionary stance, the goals of strict liability cannot be met due to the unique nature of AI technology: its mistakes are merely “efficient errors”—they appropriately surpass the human baseline, they are game theory problems intended for a jury, they are necessary to train a robust system, or they are harmless but misclassified. Under a negligence liability regime, where the onus falls entirely on consumers to prove the element of causation, victimized consumers must surmount the difficult hurdle of tracing the vectors of causation through the “black box” of algorithms. Unable to do so, many are left without sufficient recourse or compensation.

    PSA 2018

    These preprints were automatically compiled into a PDF from the collection of papers deposited in PhilSci-Archive in conjunction with PSA 2018.
