
    Digital Ecosystems: Ecosystem-Oriented Architectures

    We view Digital Ecosystems as the digital counterparts of biological ecosystems. Here, we are concerned with the creation of these Digital Ecosystems, exploiting the self-organising properties of biological ecosystems to evolve high-level software applications. We therefore created the Digital Ecosystem, a novel optimisation technique inspired by biological ecosystems, in which optimisation works at two levels: a first optimisation, the migration of agents distributed across a decentralised peer-to-peer network, operates continuously in time; this process feeds a second optimisation, based on evolutionary computing, which operates locally on single peers and aims to find solutions satisfying locally relevant constraints. The Digital Ecosystem was then measured experimentally through simulations, with measures originating from theoretical ecology, evaluating its likeness to biological ecosystems. This included its responsiveness to requests for applications from the user base, as a measure of ecological succession (ecosystem maturity). Overall, we have advanced the understanding of Digital Ecosystems, creating Ecosystem-Oriented Architectures in which the word ecosystem is more than just a metaphor. Comment: 39 pages, 26 figures, journal
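    The two-level scheme the abstract describes can be sketched in miniature. The following is my own illustration, not the paper's implementation: each "peer" holds a small agent population, a migration step circulates agents between peers (the first, network-level optimisation), and a local evolutionary step selects and mutates agents against a peer-specific target (the second, local optimisation). The targets, population size, and migration rate are all hypothetical.

```python
import random

random.seed(0)

TARGETS = [10, 50, 90]   # hypothetical "locally relevant constraints", one per peer
POP_SIZE = 8

def fitness(agent, target):
    """Higher is better: agents closer to the peer's local target score higher."""
    return -abs(agent - target)

def migrate(peers, rate=0.25):
    """First level: decentralised migration -- each peer occasionally
    receives agents drawn from a neighbouring peer's population."""
    for i, pop in enumerate(peers):
        neighbour = peers[(i + 1) % len(peers)]
        for j in range(len(pop)):
            if random.random() < rate:
                pop[j] = random.choice(neighbour)

def evolve(pop, target):
    """Second level: local evolutionary computing on a single peer --
    keep the fitter half, refill with mutated copies."""
    pop.sort(key=lambda a: fitness(a, target), reverse=True)
    survivors = pop[: len(pop) // 2]
    children = [a + random.randint(-3, 3) for a in survivors]  # mutation
    return survivors + children

peers = [[random.randint(0, 100) for _ in range(POP_SIZE)] for _ in TARGETS]
for _ in range(200):
    migrate(peers)                                   # network-level process
    peers = [evolve(pop, t) for pop, t in zip(peers, TARGETS)]  # local process

best = [max(pop, key=lambda a: fitness(a, t)) for pop, t in zip(peers, TARGETS)]
```

After a few hundred rounds each peer's best agent sits near its own local target, even though migration keeps injecting agents evolved for other peers' constraints, which is the division of labour between the two optimisation levels.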

    Cardiotoxicity with vascular endothelial growth factor inhibitor therapy

    Angiogenesis inhibitors targeting the vascular endothelial growth factor (VEGF) signaling pathway (VSP) have been important additions to the therapy of various cancers, especially renal cell carcinoma and colorectal cancer. Bevacizumab, the first VSP inhibitor to receive FDA approval in 2004, targeting all circulating isoforms of VEGF-A, has become one of the best-selling drugs of all time. The second wave of tyrosine kinase inhibitors (TKIs), which target the intracellular site of VEGF receptor kinases, began with the approval of sorafenib in 2005 and sunitinib in 2006. Heart failure was subsequently noted in 2–4% of patients on bevacizumab and in 3–8% of patients on VSP-TKIs. The very fact that the single-targeted monoclonal antibody bevacizumab can induce cardiotoxicity supports a pathomechanistic role for the VSP and the postulate of the “vascular” nature of VSP inhibitor cardiotoxicity. In this review we outline this scenario in greater detail, reflecting on hypertension and coronary artery disease as risk factors for VSP inhibitor cardiotoxicity, as well as similarities with peripartum and diabetic cardiomyopathy. This leads to the concept that any preexisting or coexisting condition that reduces the vascular reserve, or utilizes the vascular reserve for compensatory purposes, may pose a risk factor for cardiotoxicity with VSP inhibitors. These conditions need to be carefully considered in cancer patients who are to undergo VSP inhibitor therapy. Such vigilance is not meant to exclude patients from such prognostically extremely important therapy but to understand the continuum and to recognize and react to any cardiotoxicity dynamics early on, for superior overall outcomes.

    Fatherhood and sperm DNA damage in testicular cancer patients

    Testicular cancer (TC) is one of the most treatable of all malignancies, and the management of the quality of life of these patients is increasingly important, especially with regard to their sexuality and fertility. Survivors must overcome anxiety and fears about reduced fertility and possible pregnancy-related risks as well as health effects in offspring. There is thus a growing awareness of the need for reproductive counseling of cancer survivors. Studies have found a high level of sperm DNA damage in TC patients in comparison with healthy, fertile controls, but no significant difference between these patients and infertile patients. Sperm DNA alterations due to cancer treatment persist from 2 to 5 years after the end of treatment and may be influenced by both the type of therapy and the stage of the disease. Population studies have reported a slightly reduced overall fertility in TC survivors and a more frequent use of assisted reproductive technology (ART) than in the general population, with a success rate of around 50%. Paternity after a diagnosis of cancer is an important issue, and reproductive potential is becoming a major quality of life factor. Sperm chromatin instability associated with genome instability is the most important reproductive side effect related to the malignancy or its treatment. Studies investigating the magnitude of this damage could have considerable translational importance in the management of cancer patients, as they could identify the time needed for the germ cell line to repair nuclear damage and thus produce gametes with a reduced risk for the offspring.

    Regulated MAS: Social Perspective

    This chapter addresses the problem of building normative multi-agent systems in terms of regulatory mechanisms. It describes a static conceptual model through which one can specify normative multi-agent systems, along with a dynamic model to capture their operation and evolution. The chapter proposes a typology of applications and presents some open problems. In the last section, the authors express their individual views on these matters.

    Munindar Singh’s effort was partially supported by the U.S. Army Research Office under grant W911NF-08-1-0105. The content of this paper does not necessarily reflect the position or policy of the U.S. Government; no official endorsement should be inferred or implied. Nicoletta Fornara’s effort is supported by the Hasler Foundation project nr. 11115-KG and by the SER project nr. C08.0114 within the COST Action IC0801 Agreement Technologies. Henrique Lopes Cardoso’s effort is supported by Fundação para a Ciência e a Tecnologia (FCT), under project PTDC/EIA-EIA/104420/2008. Pablo Noriega’s effort has been partially supported by the Spanish Ministry of Science and Technology through the Agreement Technologies CONSOLIDER project under contract CSD2007-0022, and by the Generalitat of Catalunya grant 2009-SGR-1434.

    Peer Reviewed

    Animal models of ischaemic stroke and characterisation of the ischaemic penumbra

    Over the past forty years, animal models of focal cerebral ischaemia have allowed us to identify the critical cerebral blood flow thresholds responsible for irreversible cell death, electrical failure, inhibition of protein synthesis and energy depletion, and thereby the lifespan of the potentially salvageable penumbra. They have allowed us to understand the intricate biochemical and molecular mechanisms within the ‘ischaemic cascade’ that initiate cell death in the first minutes, hours and days following stroke. Models of permanent and transient middle cerebral artery occlusion and of embolic stroke have been developed, each with advantages and limitations when trying to model the complex heterogeneous nature of stroke in humans. Yet despite these advances in understanding the pathophysiological mechanisms of stroke-induced cell death, with numerous targets identified and drugs tested, a lack of translation to the clinic has hampered pre-clinical stroke research. With recent positive clinical trials of endovascular thrombectomy in acute ischaemic stroke, the stroke community has been reinvigorated, opening up the potential for future translation of adjunctive treatments that can be given alongside thrombectomy/thrombolysis. This review discusses the major animal models of focal cerebral ischaemia, highlighting their advantages and limitations. Acute imaging is crucial in longitudinal pre-clinical stroke studies in order to identify the influence of acute therapies on tissue salvage over time. Therefore, the methods of identifying potentially salvageable ischaemic penumbra are discussed.

    The cybernetic Bayesian brain: from interoceptive inference to sensorimotor contingencies

    Is there a single principle by which neural operations can account for perception, cognition, action, and even consciousness? A strong candidate is now taking shape in the form of “predictive processing”. On this theory, brains engage in predictive inference on the causes of sensory inputs by continuous minimization of prediction errors or informational “free energy”. Predictive processing can account, supposedly, not only for perception, but also for action and for the essential contribution of the body and environment in structuring sensorimotor interactions. In this paper I draw together some recent developments within predictive processing that involve predictive modelling of internal physiological states (interoceptive inference), and integration with “enactive” and “embodied” approaches to cognitive science (predictive perception of sensorimotor contingencies). The upshot is a development of predictive processing that originates, not in Helmholtzian perception-as-inference, but rather in 20th-century cybernetic principles that emphasized homeostasis and predictive control. This way of thinking leads to (i) a new view of emotion as active interoceptive inference; (ii) a common predictive framework linking experiences of body ownership, emotion, and exteroceptive perception; (iii) distinct interpretations of active inference as involving disruptive and disambiguatory—not just confirmatory—actions to test perceptual hypotheses; (iv) a neurocognitive operationalization of the “mastery of sensorimotor contingencies” (where sensorimotor contingencies reflect the rules governing sensory changes produced by various actions); and (v) an account of the sense of subjective reality of perceptual contents (“perceptual presence”) in terms of the extent to which predictive models encode potential sensorimotor relations (this being “counterfactual richness”). 
This is rich and varied territory, and surveying its landmarks emphasizes the need for experimental tests of its key contributions.

    Principles and Concepts of Agent-Based Modelling for Developing Geospatial Simulations

    The aim of this paper is to outline fundamental concepts and principles of the Agent-Based Modelling (ABM) paradigm, with particular reference to the development of geospatial simulations. The paper begins with a brief definition of modelling, followed by a classification of model types, and a comment regarding a shift (in certain circumstances) towards modelling systems at the individual level. In particular, automata approaches (e.g. Cellular Automata, CA, and ABM) have been particularly popular, with ABM moving to the fore. A definition of agents and agent-based models is given, identifying their advantages and disadvantages, especially in relation to geospatial modelling. The potential use of agent-based models is discussed, and how-to instructions for developing an agent-based model are provided. Types of simulation / modelling systems available for ABM are defined, supplemented with criteria to consider before choosing a particular system for a modelling endeavour. Information pertaining to a selection of simulation / modelling systems (Swarm, MASON, Repast, StarLogo, NetLogo, OBEUS, AgentSheets and AnyLogic) is provided, categorised by their licensing policy (open source, shareware / freeware and proprietary systems). The evaluation (i.e. verification, calibration, validation and analysis) of agent-based models and their output is examined, and noteworthy applications are discussed. Geographical Information Systems (GIS) are a particularly useful medium for representing model input and output of a geospatial nature. However, GIS are not well suited to dynamic modelling (e.g. ABM). In particular, problems of representing time and change within GIS are highlighted. Consequently, this paper explores the opportunity of linking (through coupling or integration / embedding) a GIS with a simulation / modelling system purposely built, and therefore better suited, to supporting the requirements of ABM.
    This paper concludes with a synthesis of the preceding discussion.
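    The ingredients the paper defines, agents with local state, a behavioural rule, a spatial environment, and a scheduler that advances every agent each tick, can be sketched minimally. This is my own toy example, not taken from the paper: random-walk agents on a small toroidal grid, in the spirit of the StarLogo/NetLogo systems the paper surveys. The grid size, agent count and step count are hypothetical.

```python
import random

random.seed(42)

GRID = 10  # hypothetical grid size; toroidal space, as in many ABM toolkits

class Walker:
    """An agent: local state (position, memory) plus a step() rule."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.visited = {(x, y)}          # local memory of cells seen

    def step(self):
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x = (self.x + dx) % GRID    # wrap around the torus
        self.y = (self.y + dy) % GRID
        self.visited.add((self.x, self.y))

agents = [Walker(random.randrange(GRID), random.randrange(GRID))
          for _ in range(5)]

for _ in range(50):                      # the scheduler: one tick per loop
    for agent in agents:
        agent.step()

# An aggregate, system-level output emerging from individual-level rules:
coverage = len(set().union(*(a.visited for a in agents))) / GRID ** 2
```

The point of the paradigm is visible even at this scale: the model is specified entirely at the individual level (the `step()` rule), while quantities such as `coverage` are emergent, system-level outputs that would be read back into a GIS layer in the coupled GIS/ABM arrangement the paper discusses.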