
    Exploring Cities Using Agent-Based Models and GIS

    Cities are faced with many problems such as urban sprawl, congestion, and segregation. They are also constantly changing. Computer modelling is becoming an increasingly important tool when examining how cities operate. Agent-based models (ABMs) allow for the testing of different hypotheses and theories of urban change, thus leading to a greater understanding of how cities work. This paper presents how ABMs can be developed through their integration with Geographical Information Systems (GIS). To highlight this, a generic ABM is presented and then applied to two model applications: a segregation model and a location model. Both models highlight how different theories can be incorporated into the generic model and demonstrate the importance of space in the modelling process.
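    The segregation application is in the spirit of Schelling's classic model. As a flavour of what such an ABM looks like in code, here is a minimal grid-based sketch (no GIS layer; the grid size, vacancy rate and tolerance threshold are assumptions, not values from the paper):

        import random

        # Minimal Schelling-style segregation sketch (illustrative only):
        # agents of two types relocate until enough neighbours share their type.
        SIZE, EMPTY, THRESHOLD = 20, 0.1, 0.5  # assumed parameters

        def make_grid():
            n_empty = int(SIZE * SIZE * EMPTY)
            cells = [None] * n_empty + \
                    [random.choice("AB") for _ in range(SIZE * SIZE - n_empty)]
            random.shuffle(cells)
            return [cells[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

        def unhappy(grid, r, c):
            """True if too few occupied neighbours share this agent's type."""
            me = grid[r][c]
            if me is None:
                return False
            nbrs = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
            same = sum(1 for n in nbrs if n == me)
            occupied = sum(1 for n in nbrs if n is not None)
            return occupied > 0 and same / occupied < THRESHOLD

        def step(grid):
            """Move every unhappy agent to a randomly chosen empty cell."""
            movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
            empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
            random.shuffle(movers)
            for r, c in movers:
                if not empties:
                    break
                er, ec = empties.pop(random.randrange(len(empties)))
                grid[er][ec], grid[r][c] = grid[r][c], None
                empties.append((r, c))

        grid = make_grid()
        for _ in range(50):
            step(grid)

    Coupling such a model to GIS, as the paper describes, would essentially amount to replacing the abstract grid with georeferenced cells.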

    Integrating transposable elements in the 3D genome

    Chromosome organisation is increasingly recognised as an essential component of genome regulation, cell fate and cell health. Within the realm of transposable elements (TEs), however, the spatial information of how genomes are folded is still only rarely integrated into experimental studies or accounted for in modelling. Whilst polymer physics is recognised as an important tool for understanding the mechanisms of genome folding, in this commentary we discuss its potential applicability to aspects of TE biology. Based on recent work on the relationship between genome organisation and TE integration, we argue that existing polymer models may be extended to create a predictive framework for the study of TE integration patterns. We suggest that these models may offer orthogonal and generic insights into the integration profiles (or "topography") of TEs across organisms. In addition, we provide simple polymer physics arguments and preliminary molecular dynamics simulations of TEs inserting into heterogeneously flexible polymers. By considering this simple model, we show how polymer folding and local flexibility may generically affect TE integration patterns. The preliminary discussion reported in this commentary aims to lay the foundations for a large-scale analysis of TE integration dynamics and topography as a function of the three-dimensional host genome.
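    As a toy illustration of the commentary's central argument, rather than its actual molecular dynamics simulations, one can bias insertion-site choice by a local flexibility profile along a model polymer; the profile and counts below are invented for illustration:

        import random

        # Toy model: a polymer of N monomers, each with a local "flexibility"
        # score (low in a stiff, heterochromatin-like stretch, high elsewhere).
        # A TE inserts at a site with probability proportional to flexibility.
        N = 1000
        flexibility = [0.2 if 300 <= i < 600 else 1.0 for i in range(N)]  # assumed profile

        def insert_sites(n_insertions, weights):
            """Sample insertion positions weighted by local flexibility."""
            return random.choices(range(len(weights)), weights=weights, k=n_insertions)

        sites = insert_sites(10_000, flexibility)
        # Integration "topography": count hits inside vs outside the stiff region.
        inside = sum(1 for s in sites if 300 <= s < 600)
        print(f"fraction in stiff region: {inside / len(sites):.3f}")  # well below 0.3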

    High-level synthesis under I/O Timing and Memory constraints

    The design of complex Systems-on-Chips requires taking communication and memory access constraints into account when integrating dedicated hardware accelerators. In this paper, we present a methodology and a tool that allow the High-Level Synthesis of DSP algorithms under both I/O timing and memory constraints. Based on formal models and a generic architecture, this tool helps the designer find a reasonable trade-off between the required I/O timing behavior and the internal memory access parallelism of the circuit. The interest of our approach is demonstrated on the case study of an FFT algorithm.
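    The trade-off the tool explores can be caricatured with a toy list scheduler in which memory operations compete for a fixed number of ports per cycle; the operation set and the two-port limit are purely illustrative, not the paper's methodology:

        # Toy list scheduler: operations with dependencies, where memory reads
        # compete for a fixed number of ports per cycle. Purely illustrative of
        # the I/O-vs-memory-parallelism trade-off, not the paper's tool.
        MEM_PORTS = 2  # assumed memory access parallelism

        # op -> (predecessors, needs_memory_port)
        ops = {
            "ld_a": ((), True), "ld_b": ((), True), "ld_c": ((), True),
            "mul":  (("ld_a", "ld_b"), False),
            "add":  (("mul", "ld_c"), False),
        }

        def schedule(ops, ports):
            done, cycle, timetable = set(), 0, {}
            while len(done) < len(ops):
                free = ports
                for name, (preds, mem) in ops.items():
                    # Only schedule ops whose predecessors finished in earlier cycles.
                    if name in done or not all(p in timetable and timetable[p] < cycle
                                               for p in preds):
                        continue
                    if mem and free == 0:   # no memory port left this cycle
                        continue
                    timetable[name] = cycle
                    done.add(name)
                    free -= mem
                cycle += 1
            return timetable

        print(schedule(ops, MEM_PORTS))
        # cycle 0: ld_a, ld_b (ports full); cycle 1: ld_c, mul; cycle 2: add

    Raising the assumed port count shortens the schedule at the cost of more memory parallelism, which is the kind of trade-off the abstract describes.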

    Simulation of networks of spiking neurons: A review of tools and strategies

    We review different aspects of the simulation of spiking neural networks. We start by reviewing the different types of simulation strategies and algorithms that are currently implemented. We next review the precision of those simulation strategies, in particular in cases where plasticity depends on the exact timing of the spikes. We overview the different simulators and simulation environments presently available (restricted to those that are freely available, open source and documented). For each simulation tool, its advantages and pitfalls are reviewed, with the aim of allowing the reader to identify which simulator is appropriate for a given task. Finally, we provide a series of benchmark simulations of different types of networks of spiking neurons, including Hodgkin-Huxley-type and integrate-and-fire models, interacting through current-based or conductance-based synapses, using clock-driven or event-driven integration strategies. The same set of models is implemented on the different simulators, and the codes are made available. The ultimate goal of this review is to provide a resource to facilitate identifying the appropriate integration strategy and simulation tool to use for a given modeling problem related to spiking neural networks.
    Comment: 49 pages, 24 figures, 1 table; review article, Journal of Computational Neuroscience, in press (2007)
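    To give a flavour of the clock-driven strategy the review compares against event-driven integration, here is a bare-bones leaky integrate-and-fire neuron stepped on a fixed time grid (generic textbook parameters, not the paper's benchmark settings):

        # Clock-driven simulation of one leaky integrate-and-fire neuron with a
        # constant current-based drive; a bare-bones sketch of the strategy the
        # review compares against event-driven integration.
        dt, T = 0.1, 100.0          # time step and duration, ms
        tau, v_rest, v_th, v_reset = 10.0, -70.0, -54.0, -70.0  # ms, mV
        I = 20.0                    # constant input drive, mV-equivalent

        v, t, spikes = v_rest, 0.0, []
        while t < T:
            # Forward-Euler update of dv/dt = (v_rest - v + I) / tau
            v += dt * (v_rest - v + I) / tau
            if v >= v_th:           # threshold crossing: record spike, reset
                spikes.append(round(t, 1))
                v = v_reset
            t += dt

        print(f"{len(spikes)} spikes, first at t = {spikes[0]} ms")

    An event-driven scheme would instead solve for the exact threshold-crossing times between events, which is why spike-timing-dependent plasticity is sensitive to the choice of strategy.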

    Transparency and control in engineering integrated assessment models.

    Better software engineering, such as archiving releases with version control, writing portable code, and publishing documentation and results closely tied to the code, improves the transparency and control of integrated assessment models. A case study of four climate change policy analysis models found that source code and data were generally available, but for larger models the licenses were more restrictive with respect to modification and redistribution. It is suggested that Free software licenses such as the GNU GPL would improve transparency and control. Moreover, opening the source allows opening the development process, a potentially important tool for improving collaboration, data sharing and model integration.

    Formalization of higher-level intelligence through integration of intelligent tutoring tools : a thesis presented in partial fulfilment of the requirements for the degree of Master of Information Systems, Department of Information Systems, Massey University, Palmerston North, New Zealand

    In contrast with a traditional Intelligent Tutoring System (ITS), which attempts to be fairly comprehensive and to cover enormous chunks of a discipline's subject matter, a basic Intelligent Tutoring Tool (ITT) (Patel & Kinshuk, 1997) has a narrow focus: a single topic or a very small cluster of related topics. An ITT is regarded as a building block of a larger and more comprehensive tutoring system, fundamentally similar to the emerging "Learning Objects" (LOs) technology (LTSC, 2000a). Since an individual ITT or LO focuses on a single topic or a very small cluster of knowledge, the importance of the automatic integration of interrelated ITTs or LOs is clear. This integration can extend the scope of an individual ITT or LO; it can guide the user from a simple working model to a complex working model and provide the learner with a rich learning experience, resulting in a higher level of learning. This study reviews and analyses Learning Objects technology, as well as its advantages and difficulties. In particular, the LO integration mechanisms applied in existing learning systems are discussed in detail. As a result, a new ITT integration framework is proposed which extends and formalizes the former ITT integration structures (Kinshuk & Patel, 1997; Kinshuk et al., 2003) in two ways: identifying and organizing ITTs, and describing and networking ITTs. The proposed ITT integration framework rests on four notions: (1) Ontology, to set up an explicit conceptualisation of a particular domain; (2) Object Design and Sequence Theory, to identify and arrange learning objects in a pedagogical way through the processes of decomposing principled skills, synthesising working models, and placing these models on scales of increasing complexity; (3) Metadata, to describe the identified ITTs and their interrelationships in a cross-platform XML format; and (4) Integration Mechanism, to detect and activate the contextual relationships.
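    A crude sketch of notions (2) and (4), assuming ITTs can be modelled as nodes with prerequisite relations so that a learner is guided from simple to complex working models (all ITT names and relations are invented for illustration):

        from graphlib import TopologicalSorter

        # Crude sketch of the "sequence" and "integration mechanism" notions:
        # ITTs are nodes, prerequisite relations are edges, and a topological
        # sort yields a simple-to-complex learning path.
        prerequisites = {
            "variables":    set(),
            "loops":        {"variables"},
            "arrays":       {"variables"},
            "nested_loops": {"loops", "arrays"},
        }

        path = list(TopologicalSorter(prerequisites).static_order())
        print("suggested learning path:", " -> ".join(path))
        # e.g. variables -> loops -> arrays -> nested_loops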

    Simulating Nonholonomic Dynamics

    This paper develops different discretization schemes for nonholonomic mechanical systems through a discrete geometric approach. The proposed methods are designed to account for the special geometric structure of nonholonomic motion. Two different families of nonholonomic integrators are developed and examined numerically: the geometric nonholonomic integrator (GNI) and the reduced d'Alembert-Pontryagin integrator (RDP). As a result, the paper provides a general tool for engineering applications, i.e. the automatic derivation of numerically accurate and stable dynamics integration schemes applicable to a variety of robotic vehicle models.
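    The following is not the GNI or RDP scheme itself, but it illustrates the underlying idea of respecting the constraint at each discrete step: a naive Euler integrator for a knife-edge/unicycle model that projects the velocity back onto the nonholonomic constraint after every update:

        import math

        # Naive illustration (NOT the paper's GNI/RDP schemes): integrate a
        # knife-edge/unicycle model with Euler steps, then project the velocity
        # back onto the nonholonomic constraint  xdot*sin(th) - ydot*cos(th) = 0.
        dt, steps = 0.01, 1000
        x, y, th = 0.0, 0.0, 0.0
        vx, vy, om = 1.0, 0.0, 0.5   # assumed initial velocities and turn rate

        for _ in range(steps):
            # Euler step of the free dynamics (no forces, constant turn rate)
            x, y, th = x + dt * vx, y + dt * vy, th + dt * om
            # Projection: keep only the velocity component along the heading,
            # so the constraint holds exactly at the end of each step.
            speed = vx * math.cos(th) + vy * math.sin(th)
            vx, vy = speed * math.cos(th), speed * math.sin(th)

        residual = vx * math.sin(th) - vy * math.cos(th)
        print(f"constraint residual after {steps} steps: {residual:.2e}")  # ~ 0

    Geometric integrators like those in the paper build the constraint into the discretization itself rather than enforcing it by projection after the fact.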

    Teacher education for effective technology integration

    About a decade ago, several researchers used Shulman's (1986) framework of Pedagogical Content Knowledge (PCK) - a body of knowledge that constitutes a special amalgam of content, pedagogy, learners, and context - as a theoretical basis for developing TPCK or TPACK: a framework for guiding teachers' cognition about technology integration in teaching and learning (Angeli, Valanides, & Christodoulou, 2016). Different models of TPCK/TPACK have been proposed in the literature, each with a different focus (on practice, instructional design, context, etc.) and with a different theoretical interpretation of the nature and development of the knowledge that teachers need in order to teach with technology (e.g., Angeli & Valanides, 2005, 2009, 2013; Koehler & Mishra, 2008; Niess, 2005). In this direction, research is being carried out to identify TPCK design procedures for initial teacher education. In teaching, when transferring TPCK to design and methodological practices, a number of factors need to be considered, especially: the different modes of adopting technologies; the integration of tool affordances, content and pedagogy; the implementation of learning environments; the operationalization of knowledge; and a detailed analysis of teaching models and approaches.