Engineering simulations for cancer systems biology
Computer simulation can be used to inform in vivo and in vitro experimentation, enabling rapid, low-cost hypothesis generation and directing experimental design in order to test those hypotheses. In this way, in silico models become a scientific instrument for investigation, and so should be developed to high standards, be carefully calibrated, and have their findings presented in such a way that they may be reproduced. Here, we outline a framework that supports developing simulations as scientific instruments, and we select cancer systems biology as an exemplar domain, with a particular focus on cellular signalling models. We consider the challenges of lack of data, incomplete knowledge and modelling in the context of a rapidly changing knowledge base. Our framework comprises a process to clearly separate scientific and engineering concerns in model and simulation development, and an argumentation approach to documenting models as a rigorous way of recording assumptions and knowledge gaps. We propose interactive, dynamic visualisation tools to enable the biological community to interact with cellular signalling models directly for experimental design. There is a mismatch in scale between these cellular models and the tissue structures that are affected by tumours, and bridging this gap requires substantial computational resources. We present concurrent programming as a technology to link scales without losing important details through model simplification. We discuss the value of combining this technology, interactive visualisation, argumentation and model separation to support the development of multi-scale models that represent biologically plausible cells arranged in biologically plausible structures, modelling cell behaviour, interactions and response to therapeutic interventions.
Critters in the Classroom: A 3D Computer-Game-Like Tool for Teaching Programming to Computer Animation Students
The brewing crisis threatening computer science education is a well-documented fact. To counter this and to increase enrolment and retention in computer science related degrees, it has been suggested to make programming "more fun" and to offer "multidisciplinary and cross-disciplinary programs" [Carter 2006]. The Computer Visualisation and Animation undergraduate degree at the National Centre for Computer Animation (Bournemouth University) is such a programme. Computer programming forms an integral part of the curriculum of this technical arts degree, and as educators we constantly face the challenge of having to encourage our students to engage with the subject.
We intend to address this with our C-Sheep system, a reimagination of the "Karel the Robot" teaching tool [Pattis 1981], using modern 3D computer game graphics that today's students are familiar with. This provides a game-like setting for writing computer programs, using a task-specific set of instructions which allow users to take control of virtual entities acting within a micro world, effectively providing a graphical representation of the algorithms used. Whereas two decades ago, students would be intrigued by a 2D top-down representation of the micro world, the lack of the visual gimmickry found in modern computer games for representing the virtual world now makes it extremely difficult to maintain the interest of students from today's "Plug&Play generation". It is therefore especially important to aim for a 3D game-like representation which is "attractive and highly motivating to today's generation of media-conscious students" [Moskal et al. 2004].
Our system uses a modern, platform-independent games engine, capable of presenting a visually rich virtual environment using a state-of-the-art rendering engine of a type usually found in entertainment systems. Our aim is to entice students to spend more time programming, by providing them with an enjoyable experience.
This paper provides a discussion of the 3D computer game technology employed in our system and presents examples of how this can be exploited to provide engaging exercises to create a rewarding learning experience for our students.
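The abstract does not list C-Sheep's actual instruction set, but the Karel-style idea it describes (a task-specific command set steering a virtual entity through a micro world) can be sketched roughly as follows; all names here are illustrative, not taken from the paper:

```python
# Hypothetical sketch of a Karel-style micro-world command set, in the
# spirit of C-Sheep / "Karel the Robot" (names are invented for
# illustration; the paper does not specify its API).

class Critter:
    DIRS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W

    def __init__(self, x=0, y=0, facing=1):
        # Start at the origin, facing east by default.
        self.x, self.y, self.facing = x, y, facing

    def move(self):
        # Step one grid cell in the current facing direction.
        dx, dy = self.DIRS[self.facing]
        self.x += dx
        self.y += dy

    def turn_left(self):
        # Rotate 90 degrees counter-clockwise.
        self.facing = (self.facing - 1) % 4

# A student "program": walk three cells east, then turn to face north.
c = Critter()
for _ in range(3):
    c.move()
c.turn_left()
```

In a system like the one described, each of these calls would drive an animated 3D entity, giving students a direct graphical trace of their algorithm.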
Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)
The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the data collected with local authorities and the Civil Protection.
Recent advances in the development of a European Mars climate model in Oxford
Since the early 1990s, efforts have been under way in Oxford to develop a range of numerical weather and climate prediction models for various studies of the Martian atmosphere and near-surface environment. Early versions of the Oxford model were more in the way of 'process models', aimed at relatively idealised studies, e.g. of baroclinic instability[1] and low-level western boundary currents in the cross-equatorial solstitial Hadley circulation[2]. Since the mid-1990s, however, the group in Oxford have worked closely with the modelling group at LMD in Paris to develop a joint suite of more sophisticated and comprehensive numerical models of Mars' atmosphere. This collaboration, partly sponsored in recent years by the European Space Agency in connection with the associated development of a climate database for Mars[3], culminated in a suite of global circulation models[4], in which both groups share a library of parametrisation schemes, but in which the Oxford team use a spectral representation of horizontal fields (in the form of spherical harmonics) and the LMD group use a grid-point finite difference representation. These models were described in some detail by Forget et al.[4], and their preliminary validation and use in the construction of first versions of the European Mars Climate Database by Lewis et al.[3]. In the present report, we review further developments which have taken place since those papers were published. Aspects of these developments which are common to both the LMD and Oxford groups will also be covered in the companion contribution by Forget et al. in this meeting, and so will only be touched on briefly here. Instead, we concentrate on those advances which are more specific to the Oxford version of the model. In the following sections, we outline the main new developments to the model formulation since 1999. Subsequent sections then describe some recent examples where the new model is being utilised to advance a diverse range of studies of Mars atmospheric science.
3D Weather – Towards a Real-time 3D Simulation of Localised Weather, presented at EVA 2011
Weather forecasts are nearly always portrayed from either a satellite-view perspective or a numerical or symbol-based representation. None of these methods actually portrays weather visually from the point of view of the observer; that is, they do not represent our experience of weather. This problem presents a challenge to displaying weather using real-time 3D computer graphics. 3D Weather is a proposed method to solve this problem, creating more believable representations of the weather using real weather data. By employing computer graphics techniques and computer game concepts, the project intends to create a localised display of weather using mapping and weather data. Started in 2010, the project has been exploring existing techniques, scoping out the needs of stakeholders (such as the Met Office), and creating a prototype to explore the issues. The paper concludes that the quest for realism with computer graphics can be a double-edged sword. It can lead to expectations of accuracy in the data it is meant to represent, which can be desired; but in the case of the weather forecast, the representation is not necessarily what the weather will be, it is what the weather might be. The continuing project will explore the balance of issues when representing the weather for past events as well as for forecasts.
Natural landscape scenic preference: techniques for evaluation and simulation.
The aesthetic beauty of a landscape is a very subjective issue: every person has their own opinions and their own idea of what beauty is. However, all people have a common evolutionary history and, according to the Biophilia hypothesis, a genetic predisposition to liking certain types of landscapes. It is possible that this common inheritance allows us to attempt to model scenic preference for natural landscapes. The ideal type of model for such predictions is the psychophysical preference model, integrating psychological responses to landscapes with objective measurements of quantitative and qualitative landscape variables. Such models commonly predict two thirds of the variance in the predictions of the general public for natural landscapes. In order to create such a model, three sets of data were required: landscape photographs (surrogates of the actual landscape), landscape preference data and landscape component variable measurements. The Internet was used to run a questionnaire survey; a novel yet flexible, environmentally friendly and simple method of data gathering, resulting in one hundred and eighty responses. A geographic information system was used to digitise ninety landscape photographs and measure their landforms (based on elevation) in terms of areas and perimeters, their colours, and proxies for their complexity and coherence. Landscape preference models were created by running multiple linear regressions using normalised preference data and the landscape component variables, including mathematical transformations of these variables. The eight models created predicted over sixty percent of variance in the responses and had moderate to high correlations with a second set of landscape preference data. A common basis for the models was the set of variables complexity, water and mountain landform; in particular, the presence or absence of water and mountains was noted as significant in determining landscape scenic preference.
In order to fully establish the utility of these models, they were further tested against: changes in weather and season; the addition of cultural structures; different photographers; alternate film types; different focal lengths; and composition. Results showed that weather and season were not significant in determining landscape preference; cultural structures increased preferences for landscapes; and photographs taken by different people did not produce consistent results from the predictive models. It was also found that film type was not significant and that changes in focal length altered preferences for landscapes.
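The multiple linear regression at the core of this modelling approach can be sketched as follows. The variable names (complexity, water) echo the abstract, but the data and coefficients are invented for illustration; the thesis's actual models used many more variables and transformations:

```python
# Minimal sketch of a psychophysical preference model: ordinary least
# squares fitted via the normal equations (X^T X) beta = X^T y.
# Data and coefficients are hypothetical, not from the thesis.

def ols_fit(X, y):
    """Fit y ~ X @ beta; X rows already include an intercept column."""
    n, p = len(X), len(X[0])
    # Build the normal equations.
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Gaussian elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    beta = [0.0] * p
    for i in reversed(range(p)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, p))) / A[i][i]
    return beta

# Example: recover coefficients from data generated by a known linear rule
# preference = 0.5 + 0.3*complexity + 0.2*water.
X = [[1, 0.1, 0], [1, 0.5, 1], [1, 0.9, 1], [1, 0.3, 0], [1, 0.7, 1]]
y = [0.5 + 0.3 * complexity + 0.2 * water for _, complexity, water in X]
beta = ols_fit(X, y)
```

With real survey data the fit is of course noisy; the "sixty percent of variance" figure in the abstract corresponds to the R² of such a fit against the normalised preference responses.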
Decision-making for unmanned aerial vehicle operation in icing conditions
With the increased use of unmanned aerial systems (UAS) for civil and commercial applications, there is a strong demand for new regulations and technology that will eventually permit the integration of UAS in unsegregated airspace. This requires new technology to ensure sufficient safety and a smooth integration process. The absence of a pilot on board a vehicle introduces new problems that do not arise in manned flight. One challenging and safety-critical issue is flight in known icing conditions. Whereas in manned flight dealing with icing is left to the pilot and their appraisal of the situation at hand, in unmanned flight this is no longer an option and new solutions are required. To address this, an icing-related decision-making system (IRDMS) is proposed. The system quantifies in-flight icing based on changes in aircraft performance and measurements of environmental properties, and evaluates what the effects on the aircraft are. Based on this, it determines whether the aircraft can proceed, and whether and which available icing protection systems should be activated. In this way, advice on an appropriate response is given to the operator on the ground, to ensure safe continuation of the flight and avoid possible accidents.
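The decision logic described for the IRDMS (combine a performance-based icing estimate with environmental measurements, then advise the operator) might look roughly like this; the thresholds, severity rules and advisory strings below are invented for the sketch and are not taken from the paper:

```python
# Illustrative decision rule in the spirit of the IRDMS described above.
# All numeric thresholds are hypothetical placeholders.

def irdms_advice(power_increase_pct, temp_c, humidity_pct, ips_available):
    """Return an advisory string for the ground operator."""
    # Environmental indicator: supercooled moisture near/below freezing.
    env_risk = temp_c <= 0 and humidity_pct >= 80
    # Performance-based detection: extra power needed to hold speed/altitude
    # is a common proxy for ice accretion on lifting surfaces.
    perf_degraded = power_increase_pct >= 10
    if perf_degraded and env_risk:
        # Ice is likely accreting: protect if possible, otherwise escape.
        return "activate IPS" if ips_available else "abort: divert and descend"
    if env_risk:
        return "caution: icing conditions, monitor performance"
    return "proceed"
```

A real system would of course fuse noisy sensor streams over time rather than apply one-shot thresholds, but the structure (quantify, evaluate effect, advise) follows the abstract.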
Tobac 1.2: Towards a flexible framework for tracking and analysis of clouds in diverse datasets
We introduce tobac (Tracking and Object-Based Analysis of Clouds), a newly developed framework for tracking and analysing individual clouds in different types of datasets, such as cloud-resolving model simulations and geostationary satellite retrievals. The software has been designed to be used flexibly with any two- or three-dimensional time-varying input. The application of high-level data formats, such as Iris cubes or xarray arrays, for input and output allows for convenient use of metadata in the tracking analysis and visualisation. Comprehensive analysis routines are provided to derive properties like cloud lifetimes or statistics of cloud properties, along with tools to visualise the results in a convenient way. The application of tobac is presented in two examples. We first track and analyse scattered deep convective cells based on maximum vertical velocity and the three-dimensional condensate mixing ratio field in cloud-resolving model simulations. We also investigate the performance of the tracking algorithm for different choices of time resolution of the model output. In the second application, we show how the framework can be used to effectively combine information from two different types of datasets by simultaneously tracking convective clouds in model simulations and in geostationary satellite images based on outgoing longwave radiation. The tobac framework provides a flexible new way to include the evolution of the characteristics of individual clouds in a range of important analyses, like model intercomparison studies or model assessment based on observational data. © 2019 Author(s)
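The detect-then-link workflow the abstract describes can be reduced to a toy sketch. To be clear, this is NOT the tobac API, just an illustration of the general idea: detect features as above-threshold regions in each 2-D frame, then link detections across frames by nearest centroid:

```python
# Toy detect-and-link tracker illustrating the workflow, not tobac itself.
import math

def detect(frame, threshold):
    """Return centroids of above-threshold regions in one 2-D frame.
    Toy simplification: all hot cells are treated as a single feature."""
    pts = [(i, j) for i, row in enumerate(frame)
           for j, v in enumerate(row) if v >= threshold]
    if not pts:
        return []
    n = len(pts)
    return [(sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)]

def link(tracks, centroids, max_dist=2.0):
    """Append each centroid to the nearest existing track within
    max_dist of its last position, or start a new track."""
    for c in centroids:
        best = min(tracks, key=lambda t: math.dist(t[-1], c), default=None)
        if best is not None and math.dist(best[-1], c) <= max_dist:
            best.append(c)
        else:
            tracks.append([c])
    return tracks
```

tobac's real feature detection is multi-threshold and metadata-aware, and its linking (via trackpy) predicts motion rather than using plain nearest-neighbour matching, but the frame-by-frame structure is the same.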
Increasing resilience of ATM networks using traffic monitoring and automated anomaly analysis
Systematic network monitoring can be the cornerstone for the dependable operation of safety-critical distributed systems. In this paper, we present our vision for informed anomaly detection through network monitoring and resilience measurements to increase the operators' visibility of ATM communication networks. We raise the question of how to determine the optimal level of automation in this safety-critical context, and we present a novel passive network monitoring system that can reveal network utilisation trends and traffic patterns at diverse timescales. Using network measurements, we derive resilience metrics and visualisations to enhance the operators' knowledge of the network and traffic behaviour, and allow for network planning and provisioning based on informed what-if analysis.
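One generic form of the automated anomaly analysis argued for above is a rolling-baseline detector over traffic-volume samples; the method and thresholds below are a standard illustration, not the authors' system:

```python
# Hedged sketch: flag traffic-volume samples that deviate from a rolling
# baseline by more than k standard deviations. Window size and k are
# illustrative defaults, not values from the paper.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=5, k=3.0):
    """Return (index, value) pairs of samples far outside the
    rolling baseline built from the preceding `window` samples."""
    hist = deque(maxlen=window)
    out = []
    for i, v in enumerate(samples):
        if len(hist) == window:
            mu, sd = mean(hist), stdev(hist)
            if sd > 0 and abs(v - mu) > k * sd:
                out.append((i, v))
        hist.append(v)
    return out
```

In the ATM setting, such a detector would run per timescale over the passive monitoring feeds, with the flagged samples surfaced to operators through the resilience visualisations rather than acted on automatically.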