
    MaaSim: A Liveability Simulation for Improving the Quality of Life in Cities

    Urbanism is no longer planned on paper thanks to powerful models and 3D simulation platforms. However, current work is not open to the public and lacks an optimisation agent that could help in decision making. This paper describes the creation of an open-source simulation based on an existing Dutch liveability score with a built-in AI module. Features are selected using feature engineering and Random Forests. Then, a modified scoring function is built based on the aforementioned liveability classes. The score is predicted using Random Forest regression, achieving a recall of 0.83 with 10-fold cross-validation. Afterwards, Exploratory Factor Analysis is applied to select the actions present in the model. The resulting indicators are divided into 5 groups, and 12 actions are generated. The performance of four optimisation algorithms is compared, namely NSGA-II, PAES, SPEA2 and eps-MOEA, on three established criteria of quality (cardinality, spread of the solutions and spacing), as well as on the resulting score and number of turns. Although all four algorithms show different strengths, eps-MOEA is selected as the most suitable for this problem. Ultimately, the simulation incorporates the model and the selected AI module in a GUI written in the Kivy framework for Python. Tests performed on users show positive responses and encourage further initiatives towards joining technology and public applications.
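    As a rough illustration of the scoring step described above, the sketch below fits a Random Forest regressor and evaluates it with 10-fold cross-validation using scikit-learn. The indicator columns and synthetic data are placeholders, not the actual MaaSim features or Dutch liveability data.

```python
# Minimal sketch of a Random Forest regression scored with 10-fold
# cross-validation. Feature names and data are illustrative placeholders,
# not the MaaSim indicators.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.random((500, 8))                                   # 8 hypothetical liveability indicators
y = X @ rng.random(8) + rng.normal(scale=0.1, size=500)    # synthetic liveability score

model = RandomForestRegressor(n_estimators=200, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
print(f"10-fold CV R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```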

    A Platform for the Analysis of Qualitative and Quantitative Data about the Built Environment and its Users

    There are many scenarios in which it is necessary to collect data from multiple sources in order to evaluate a system, including the collection of both quantitative data (from sensors and smart devices) and qualitative data (such as observations and interview results). However, there are currently very few systems that enable both of these data types to be combined in such a way that they can be analysed side by side. This paper describes an end-to-end system for the collection, analysis, storage and visualisation of qualitative and quantitative data, developed using the e-Science Central cloud analytics platform. We describe the experience of developing the system, based on a case study that involved collecting data about the built environment and its users. In this case study, data was collected from older adults living in residential care: sensors were placed throughout the care home and smart devices were issued to the residents. The sensor data was uploaded to the analytics platform and the processed results were stored in a data warehouse, where they were integrated with qualitative data collected by healthcare and architecture researchers. Visualisations are also presented that were intended to allow the data to be explored and potential correlations between the quantitative and qualitative data to be investigated.
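    The pattern of placing quantitative and qualitative records side by side can be shown with a small, self-contained sketch. The column names, timestamps and matching tolerance below are hypothetical and do not reflect the e-Science Central warehouse schema.

```python
# Illustrative sketch: align quantitative sensor readings with qualitative
# observations on a shared timestamp so they can be examined side by side.
# All column names and values are hypothetical.
import pandas as pd

sensor = pd.DataFrame({
    "timestamp": pd.to_datetime(["2015-03-01 09:00", "2015-03-01 10:00",
                                 "2015-03-01 11:00"]),
    "room": ["lounge", "lounge", "corridor"],
    "movement_events": [12, 3, 25],
})

observations = pd.DataFrame({
    "timestamp": pd.to_datetime(["2015-03-01 09:05", "2015-03-01 10:55"]),
    "room": ["lounge", "corridor"],
    "note": ["group activity in progress", "residents moving to dining room"],
})

# Nearest-in-time join within a 15-minute tolerance, matched per room.
merged = pd.merge_asof(
    observations.sort_values("timestamp"),
    sensor.sort_values("timestamp"),
    on="timestamp", by="room",
    direction="nearest", tolerance=pd.Timedelta("15min"),
)
print(merged)
```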

    Towards a debugging tutor for object-oriented environments

    Programming has provided a rich domain for Artificial Intelligence in Education, and many systems have been developed to advise students about the bugs in their programs, either during program development or post hoc. Surprisingly few systems have been developed specifically to teach debugging. Builders of learning environments have assumed either that students will be taught these skills elsewhere or that they will be learnt piecemeal without explicit advice. This paper reports on two experiments on Java debugging strategy by novice programmers and discusses their implications for the design of a debugging tutor for Java that pays particular attention to how students use the variety of program representations available. The experimental results agree with research in the area suggesting that good debugging performance is associated with a balanced use of the available representations and a sophisticated use of the debugging step facility, which enables programmers to detect and obtain information from critical moments in the execution of the program. A balanced use of the available representations seems to be fostered by providing representations with a higher degree of dynamic linking, as well as by explicit instruction about the representation formalism employed in the program visualisations.

    A grid-enabled problem solving environment for parallel computational engineering design

    This paper describes the development and application of a piece of engineering software that provides a problem solving environment (PSE) capable of launching, and interfacing with, computational jobs executing on remote resources on a computational grid. In particular, it is demonstrated how a complex, serial, engineering optimisation code may be efficiently parallelised, grid-enabled and embedded within a PSE. The environment is highly flexible, allowing remote users from different sites to collaborate, and permitting computational tasks to be executed in parallel across multiple grid resources, each of which may be a parallel architecture. A full working prototype has been built and successfully applied to a computationally demanding engineering optimisation problem. This particular problem stems from elastohydrodynamic lubrication and involves optimising the computational model for a lubricant based on the match between simulation results and experimentally observed data.
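    A minimal sketch of the general pattern follows: independent objective-function evaluations are farmed out to parallel workers. A local process pool stands in for remote grid resources, and the toy objective is only a placeholder for the expensive lubrication simulation described in the paper.

```python
# Sketch of distributing independent simulation evaluations across workers.
# A real grid deployment would submit jobs to remote resources; here a local
# process pool stands in for them, purely for illustration.
from concurrent.futures import ProcessPoolExecutor

def objective(params):
    """Placeholder for an expensive simulation run evaluated for one
    candidate parameter set (hypothetical objective, not the EHL model)."""
    x, y = params
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

if __name__ == "__main__":
    candidates = [(x * 0.5, y * 0.5) for x in range(-4, 5) for y in range(-4, 5)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(objective, candidates))
    best = min(zip(results, candidates))
    print("best value %.3f at parameters %s" % best)
```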

    GOexpress: an R/Bioconductor package for the identification and visualisation of robust gene ontology signatures through supervised learning of gene expression data

    Background: Identification of gene expression profiles that differentiate experimental groups is critical for discovery and analysis of key molecular pathways and also for selection of robust diagnostic or prognostic biomarkers. While integration of differential expression statistics has been used to refine gene set enrichment analyses, such approaches are typically limited to single gene lists resulting from simple two-group comparisons or time-series analyses. In contrast, functional class scoring and machine learning approaches provide powerful alternative methods to leverage molecular measurements for pathway analyses, and to compare continuous and multi-level categorical factors. Results: We introduce GOexpress, a software package for scoring and summarising the capacity of gene ontology features to simultaneously classify samples from multiple experimental groups. GOexpress integrates normalised gene expression data (e.g., from microarray and RNA-seq experiments) and phenotypic information of individual samples with gene ontology annotations to derive a ranking of genes and gene ontology terms using a supervised learning approach. The default random forest algorithm allows interactions between all experimental factors, and competitive scoring of expressed genes to evaluate their relative importance in classifying predefined groups of samples. Conclusions: GOexpress enables rapid identification and visualisation of ontology-related gene panels that robustly classify groups of samples and supports both categorical (e.g., infection status, treatment) and continuous (e.g., time-series, drug concentrations) experimental factors. The use of standard Bioconductor extension packages and publicly available gene ontology annotations facilitates straightforward integration of GOexpress within existing computational biology pipelines.
    Funding: Department of Agriculture, Food and the Marine; European Commission Seventh Framework Programme (FP7); Science Foundation Ireland; University College Dublin.
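    To make the general idea concrete, the sketch below ranks genes by random-forest variable importance and summarises the scores per gene ontology term. It is written in Python with synthetic data purely for illustration; it is not the GOexpress R/Bioconductor API, and the gene identifiers and GO annotations are invented.

```python
# Illustration of the general approach: score genes with random-forest
# variable importance and summarise per GO term. NOT the GOexpress API;
# all data, gene names and GO annotations below are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_samples, n_genes = 60, 30
expression = rng.normal(size=(n_samples, n_genes))          # samples x genes
groups = np.repeat(["infected", "control"], n_samples // 2)
expression[groups == "infected", :5] += 1.5                 # signal in genes 0-4

genes = [f"gene_{i}" for i in range(n_genes)]
# Hypothetical annotation: each GO term maps to a set of gene indices.
go_terms = {"GO:0006955 immune response": range(0, 5),
            "GO:0008150 biological process": range(5, 30)}

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(expression, groups)
importance = rf.feature_importances_

for term, idx in go_terms.items():
    print(f"{term}: mean gene importance = {importance[list(idx)].mean():.4f}")
```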

    Obvious: a meta-toolkit to encapsulate information visualization toolkits. One toolkit to bind them all

    This article describes "Obvious": a meta-toolkit that abstracts and encapsulates information visualization toolkits implemented in the Java language. It intends to unify their use and to postpone the choice of which concrete toolkit(s) to use until later in the development of visual analytics applications. We also report on the lessons we have learned when wrapping popular toolkits with Obvious, namely Prefuse, the InfoVis Toolkit, partly Improvise, JUNG and other data management libraries. We show several examples of the use of Obvious and of how the different toolkits can be combined, for instance by sharing their data models. We also show how Weka and RapidMiner, two popular machine-learning toolkits, have been wrapped with Obvious and can be used directly with all the other wrapped toolkits. We expect Obvious to start a co-evolution process: Obvious is meant to evolve as more components of information visualization systems become consensual. It is also designed to help information visualization systems adhere to best practices, provide a higher level of interoperability and leverage the domain of visual analytics.
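    The wrapping idea can be sketched with a toy adapter layer: client code is written once against a toolkit-neutral interface, while adapters translate to the concrete back ends. Obvious itself is a Java meta-toolkit; the Python fragment below and its two hypothetical adapters only illustrate the pattern, not the Obvious API.

```python
# Toy sketch of a meta-toolkit style abstraction: a toolkit-neutral data
# model with adapters for two hypothetical back ends. Illustrative only.
from abc import ABC, abstractmethod

class GraphModel(ABC):
    """Toolkit-neutral graph interface, analogous in spirit to a shared data model."""
    @abstractmethod
    def add_edge(self, source, target): ...
    @abstractmethod
    def edge_count(self): ...

class ToolkitAAdapter(GraphModel):
    def __init__(self):
        self._edges = []                  # stands in for toolkit A's graph object
    def add_edge(self, source, target):
        self._edges.append((source, target))
    def edge_count(self):
        return len(self._edges)

class ToolkitBAdapter(GraphModel):
    def __init__(self):
        self._adjacency = {}              # stands in for toolkit B's graph object
    def add_edge(self, source, target):
        self._adjacency.setdefault(source, set()).add(target)
    def edge_count(self):
        return sum(len(v) for v in self._adjacency.values())

def build_demo(graph: GraphModel):
    """Client code written once against the abstraction."""
    graph.add_edge("a", "b")
    graph.add_edge("b", "c")
    return graph.edge_count()

print(build_demo(ToolkitAAdapter()), build_demo(ToolkitBAdapter()))
```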

    Mega-City-Regions: on Awareness and Value Chain Approach

    Mega-City-Regions (MCR) as a new large-scale urban phenomenon have been gaining attention recently: in research, empirical studies address their functional consistency, and spatial planning policies underline the strategic role of MCRs for the territorial competitiveness of a country. But increasingly a tension between the functional logic of knowledge-intensive business activities and the territorial and normative approaches of public bodies begins to emerge. Typical conflicts of spatial development in MCRs occur, for example, when globally motivated investment decisions collide with local needs. This paper proposes an integrated view that can help to bridge the gap between the growing factual knowledge about MCRs and the still weak ability to use this knowledge for local and regional development and spatial planning purposes. The proposed integration draws, on the one hand, on the corporate value chain approach: the interaction of analysis of spatio-economic development, its adequate visualization and focussed communication towards stakeholders can provide the initiating momentum for beneficial spatial development. In the context of a diffuse perception of MCRs, whose mere size surpasses our common notions of space, analysis, visualization and communication as methodological components of the spatial planning process add value to sustainable spatial development. The process starts with creating awareness of the often invisible and complex functions, qualities and identities of the MCR spatial scale. New strategies of visualization and communication are needed to improve the insight and motivation of the actors involved. On the other hand, this value chain approach has to be adapted to the varying vertical levels of public bodies with their numerous policies. Thus, "multi-level governance" is to be conceived as a concept to close the gap between the territorial and the functional logic of spatial development. The paper studies this dual approach with the case of the announced expansion of the international airport in Munich. This complex multi-level governance process experiments with a consensus-oriented dialogue platform, the so-called "neighbourhood conference" (NC), bringing together actors with diverse responsibilities and objectives. The NC sits at the interface of global and local objectives that are transformed at the spatial scale of the MCR of Munich. The paper concludes with recommendations for using the spatial value chain approach described above for more efficient multi-level governance.

    Agent Street: An Environment for Exploring Agent-Based Models in Second Life

    Urban models can be seen on a continuum between iconic and symbolic. Generally speaking, iconic models are physical versions of the real world at some scaled-down representation, while symbolic models represent the system in terms of the way it functions, replacing the physical or material system by some logical and/or mathematical formulae. Traditionally, iconic and symbolic models were distinct classes of model, but due to the rise of digital computing the distinction between the two is becoming blurred, with symbolic models being embedded into iconic models. However, such models tend to be single-user. This paper demonstrates how 3D symbolic models in the form of agent-based simulations can be embedded into iconic models using the multi-user virtual world of Second Life. Furthermore, the paper demonstrates Second Life's potential for social science simulation. To demonstrate this, we first introduce Second Life and provide two exemplar models, Conway's Game of Life and Schelling's Segregation Model, which highlight how symbolic models can be viewed in an iconic environment. We then present a simple pedestrian evacuation model which merges the iconic and symbolic together, and extend the model to directly incorporate avatars and agents in the same environment, illustrating how 'real' participants can influence simulation outcomes. Such examples demonstrate the potential for creating highly visual, immersive, interactive agent-based models for social scientists in multi-user real-time virtual worlds. The paper concludes with some final comments on problems with representing models in current virtual worlds and future avenues of research.
    Keywords: Agent-Based Modelling, Pedestrian Evacuation, Segregation, Virtual Worlds, Second Life
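    For readers unfamiliar with the symbolic models mentioned above, the following stand-alone sketch runs a Schelling-style segregation model on a plain grid. It is not the Second Life implementation, and all parameters are illustrative.

```python
# Minimal Schelling-style segregation sketch on a toroidal grid.
# Unhappy agents (too few like-coloured neighbours) relocate to empty cells.
import random

SIZE, EMPTY_FRACTION, SIMILARITY_THRESHOLD, STEPS = 20, 0.1, 0.5, 30
random.seed(0)

def new_grid():
    cells = [None if random.random() < EMPTY_FRACTION
             else random.choice(["red", "blue"])
             for _ in range(SIZE * SIZE)]
    return [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def unhappy(grid, r, c):
    me = grid[r][c]
    if me is None:
        return False
    neighbours = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                  for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                  if (dr, dc) != (0, 0)]
    same = sum(1 for n in neighbours if n == me)
    occupied = sum(1 for n in neighbours if n is not None)
    return occupied > 0 and same / occupied < SIMILARITY_THRESHOLD

grid = new_grid()
for _ in range(STEPS):
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    random.shuffle(movers)
    for (r, c) in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None   # move agent to the empty cell
        empties.append((r, c))                        # vacated cell becomes empty

print("unhappy agents remaining:",
      sum(unhappy(grid, r, c) for r in range(SIZE) for c in range(SIZE)))
```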

    Study of Tools Interoperability

    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out the relations between these notions and how they map onto the interoperability problem. We narrow the problem area to tool development in academia. Tools developed in such an environment have a small basis for development, documentation and maintenance. We scrutinise some of the problems and potential solutions related to tool interoperability in such an environment. Moreover, we look at two tools developed in the Formal Methods and Tools group, and analyse the use of different integration techniques.