
    The London Chalk model

    This report describes the work undertaken to produce the London Chalk Model (LCM) within the catchment of the River Thames. The work was funded by the Environment Agency, Thames Region, to support the production of a new hydrogeological model for the region. STRUCTURE OF REPORT: The introduction describes the background to the project. The second chapter describes the sources of the data used in the model. An account is then given of the processes that led to the generation of the geological model, including notes on the criteria used to subdivide the Chalk according to the new lithostratigraphy and on how faulting was elucidated. A discussion of the structure of the Chalk starts with observations on the kinds of influence exerted on the Chalk by tectonic structures, and on the difficulties of specifically identifying faults in the Chalk. The final chapter ends with a short discussion of the possible timing of fault movements and of how fault movements may have influenced sedimentation of the Chalk.

    A business case study for the Environmental information system for planners (EISP) : prepared under Memorandum of Understanding for the Department for Communities and Local Government

    This report forms the deliverable for work led by the British Geological Survey (BGS) under a Memorandum of Understanding (MOU) with the Department for Communities and Local Government (DCLG) between 1st April 2007 and 31st October 2007. This work (Phase III) followed six years of research effort (Phases I and II, jointly funded by a Natural Environment Research Council URGENT Programme investment of £357,000 and a former Office of the Deputy Prime Minister, ODPM, investment of £347,000) in the development of an Environmental Information System for Planners (EISP). 2. Learning from the technically similar, ODPM-funded, PARSOL-developed expert system, the costs of building production systems within a local planning authority are estimated. The availability and reasonable cost of the nationally collated environmental datasets required to populate production EISPs, alongside local-authority-provided data, are confirmed. The 'off the shelf' annual average cost to an individual Local Planning Authority (LPA) of purchasing and licensing the data for such a production system is estimated at between £13,300 and £36,000, which compares well with other IT systems purchased by LPAs in recent years. 3. Benefits to local authorities of using the planning tools in EISP to implement DCLG environmental planning policies are estimated in terms of time and cost savings and of extra environmental hazard costs avoided. Planning officer staff time saved using an EISP is estimated, costed, and compared with the acquisition cost of such a commercially available production system. The saving is very conservatively estimated at £200,000 per year, giving a conservative benefit-to-cost ratio of between 5.6 and 15 on staff time savings alone. 4. A PARSOL-involved sample of local authorities, introduced to the likely costs and benefits of installing an EISP, concluded that it would be a worthwhile enhancement to e-planning. 5. Telford and Wrekin Council has offered to install a production EISP in 2008/9 with its technology consortium, if this can be funded by DCLG, as with the PARSOL expert systems. That system would be promoted throughout all LPAs as the 'Beacon' system of best practice for Environmental Information Systems in Planning. 6. DCLG is recommended to fund the installation of one or two production EISP systems: one with Telford and Wrekin Council, the second with a local authority currently using the CAP Solutions Uni-form planning system (basic e-planning infrastructure already installed in over 50% of English LPAs). These are costed at approximately £300,000 for the first system and £150,000 for the second.
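The benefit-to-cost range quoted above follows directly from the figures in the summary; a quick arithmetic check (illustrative only, using the report's numbers):

```python
# Check of the benefit-to-cost ratio range quoted in the report summary.
staff_saving_per_year = 200_000        # conservative annual staff-time saving, GBP
cost_low, cost_high = 13_300, 36_000   # annual data cost range per LPA, GBP

bcr_high = staff_saving_per_year / cost_low   # cheapest system gives the highest ratio
bcr_low = staff_saving_per_year / cost_high   # most expensive system gives the lowest

print(f"Benefit-to-cost ratio: {bcr_low:.1f} to {bcr_high:.1f}")  # ~5.6 to ~15.0
```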

    The development of linked databases and environmental modelling systems for decision-making in London

    A basic requirement for a city's growth is the availability of land, raw materials and water. For the continued and sustainable development of today's cities we must be able to meet these basic requirements whilst being mindful of the environment and its relationship with anthropogenic activity. The heterogeneous and complex nature of urban systems, where there are obvious environmental and anthropogenic inter-dependencies, necessitates a more holistic approach to decision-making. New developments such as linked databases of environmental data and integrated environmental modelling systems provide new ways of organising cross-disciplinary information and a means to apply it to explain, explore and predict the urban system's response to environmental change. In this paper we show how accessibility to linked databases, detailed understanding of the geology, and integrated environmental modelling solutions have the potential to provide decision-makers and policy developers with the science-based information needed to understand and address these challenges.

    The application of componentised modelling techniques to catastrophe model generation

    In this paper we show that integrated environmental modelling (IEM) techniques can be used to generate a catastrophe model for groundwater flooding. Catastrophe models are probabilistic models built on sets of events representing the hazard; the likelihood of each event is weighted by its impact, and the result is used to estimate future financial losses. These probabilistic loss estimates often underpin re-insurance transactions. Modelled loss estimates can vary significantly because of the assumptions used within the models. A rudimentary insurance-style catastrophe model for groundwater flooding has been created by linking seven individual components together, each linked to the next using an open modelling framework (an implementation of OpenMI). Finally, we discuss how a flexible model integration methodology, such as that described in this paper, facilitates a better understanding of the assumptions used within the catastrophe model by enabling the interchange of model components created using different, yet appropriate, assumptions.
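The request-reply chaining that frameworks such as OpenMI provide can be illustrated with a much-simplified sketch. The component names, interfaces and numbers below are invented for illustration; they are not the paper's actual seven components, and real OpenMI components expose a far richer ILinkableComponent API with exchange items and time horizons:

```python
# Minimal sketch of componentised model linking: each component pulls its
# input from an upstream component, mimicking OpenMI's request-reply pattern.

class Component:
    """A model component linked to an optional upstream component."""
    def __init__(self, name, transform, upstream=None):
        self.name = name
        self.transform = transform   # this component's model logic
        self.upstream = upstream     # link to the preceding component

    def get_values(self, time):
        # Ask the upstream link for values at the requested time,
        # then apply this component's own model to them.
        inputs = self.upstream.get_values(time) if self.upstream else time
        return self.transform(inputs)

# Illustrative chain: rainfall -> recharge -> groundwater level -> loss
rainfall = Component("rainfall", lambda t: 10.0)                        # mm/day
recharge = Component("recharge", lambda r: 0.3 * r, rainfall)           # mm/day
gw_level = Component("gw level", lambda q: 50.0 + q, recharge)          # m AOD
loss = Component("loss", lambda h: 1e6 if h > 52.0 else 0.0, gw_level)  # GBP

print(loss.get_values(time=0))
```

Because each link only depends on the `get_values` interface, a component built under different assumptions can be swapped in without touching the rest of the chain, which is the interchangeability the paper emphasises.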

    A geological model of the chalk of East Kent

    This report describes the geological modelling of the Chalk in the North Downs of East Kent, within the catchment of the River Great Stour and eastwards to the coast, including the Isle of Thanet. This work was funded by the Environment Agency to support investigations of the local hydrogeology and thereby to enhance catchment management. The whole area is underlain by the Upper Cretaceous Chalk Group, with the Palaeogene succession of the Thanet Sand Formation, the Lambeth Group and the Thames Group overlying it in the northern and central-eastern parts. The project included a desk-study revision of the Chalk of the North Downs, using the new Chalk lithostratigraphy. The revisions to the geology are shown on the 1:50 000 scale geological map that accompanies this report. Together with evidence from boreholes and from seismic surveys, the new outcrop patterns have been incorporated into a geological model, using both computer software (EarthVision) and manual methods. The introduction describes the background to the project. The second chapter describes the sources of the data used in the model: published and unpublished geological maps, borehole records (both lithological and geophysical), seismic surveys, biostratigraphic records, digital topographic information, and the published literature. Each Chalk formation present in the area is then briefly described in the third chapter, noting its relationship to the older lithostratigraphic divisions and to biostratigraphic zones. The local Chalk succession extends from the base of the Chalk Group to the Newhaven Chalk Formation, here represented by the Margate Chalk Member. Evidence for the thickness of each formation is reviewed. The early Palaeogene formations (the Thanet Sand, Upnor, Harwich and London Clay formations) are also briefly described (Chapter 4), and the local superficial deposits are mentioned, with references to detailed descriptions (Chapter 5). Apart from minor adjustments to the outcrop of the basal Palaeogene surface, no revision of these formations was undertaken for this study.

    The development of an underground asset management tool in BGS

    This report describes the work carried out to scope the potential for BGS to develop an Underground Asset Management tool. This work was funded through both a NERC innovation grant and Science Budget funding from the Information Products Theme. The objective of asset management is 'to ensure that assets deliver the required function and level of performance in terms of service, in a sustainable manner, at optimum whole-life cost, without compromising health, safety, environmental performance or the organisation's reputation'. It is in this context that this report discusses the data available within BGS that could be provided to external organisations, and how to communicate this information to potential clients. STRUCTURE OF REPORT: The introduction explains the background to the project and examines why an asset management tool may be required. The second chapter discusses customer need and likely uptake if such a tool were developed. This leads into the third chapter, where current tools already on the UK market, and their limitations, are discussed. In the fourth and fifth chapters an outline is given of the availability of data critical to the creation of an asset management system and of how this could be developed into a tool. We then describe a pilot system developed for the Humber–Trent region. The final chapter summarises the key findings.

    Meta-model : ensuring the widespread access to metadata and data for environmental models : scoping report

    This work is a response to the challenge posed by the NERC Environmental Data call, and is designed to scope out how to meet the following objectives: 1. ensure that the data used to create models are recorded and their sources known; 2. ensure that the models produced are themselves available; 3. ensure that the results produced by these models can be obtained. To scope out how to fulfil these objectives, a series of visits, phone calls and meetings was undertaken, alongside a Survey Monkey (online) questionnaire. The latter involved sending a request to fill out the questionnaire to over three hundred contacts at institutions across the UK, Europe and America, of which 106 responded. The responses have been analysed in conjunction with the information gained from other sources. There is a significant number of standards for both discovery and technical metadata, and a range of services by which metadata can be recorded and the data stored alongside them. NERC itself puts a significant amount of effort into storing data and model results and making the metadata available: for example, there are seven Data Centres and the Data Catalogue Service (DCS) for searching metadata for datasets stored in the NERC data centres. Whilst a significant amount of time and effort has been put into standards, their use is variable. There are a number of different standards for data, mainly related to ISO standards, WaterML, GEMINI, MEDIN and climate-based standards, as well as bespoke standards, but there is a lack of formal standards for model metadata. Storage of data and its associated metadata is facilitated via the NERC data centres, with reasonable uptake. Whilst the standards and approaches for discovery and technical metadata for data are well advanced and, in theory, well used, a number of issues remain:
    - recognition of what the user wants, rather than what the data manager feels is required;
    - consolidation of discovery metadata schemas based on ISO 19115;
    - recording of different file formats, and tools to allow ease of transfer between file formats;
    - retrospective capture of metadata for data and models;
    - incorporation of time-based information into metadata.
    For model metadata, however, the situation is less well advanced. There is no internationally recognised standard for model metadata, and one should be developed to include features such as: model code and version; code guardian contact details; links to further information (URLs to papers, manuals, etc.); details of how to run the models; and the spatial extent of the model instance. Other considerations include an assessment of data quality and uncertainty, which needs to be recorded to enable model uncertainty to be quantified, and the storage of the models themselves, either as model code (via standard repositories) or as executables. These gaps could be filled by a work programme consisting of the development of a metadata standard for models, a portal for the recording and supply of these metadata, testing with appropriate user organisations, and liaison with international standards organisations to ensure that the development is recognised. The results of the whole process should be disseminated through as many channels as possible.
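The model-metadata fields the report proposes could be captured in a record along the following lines; the field names, types and example values here are a hypothetical sketch, not a published standard:

```python
# Sketch of a model-metadata record containing the fields proposed in the
# scoping report. Field names and values are illustrative only.
from dataclasses import dataclass, field, asdict

@dataclass
class ModelMetadata:
    model_code: str                  # name of the model code
    version: str                     # code version the instance was run with
    code_guardian: str               # contact details for the code guardian
    links: list = field(default_factory=list)  # URLs to papers, manuals, etc.
    how_to_run: str = ""             # details of how to run the model
    spatial_extent: tuple = ()       # e.g. bounding box (W, S, E, N), degrees

record = ModelMetadata(
    model_code="example-model",      # hypothetical model name
    version="1.0",
    code_guardian="modelling-team@example.org",
    links=["https://example.org/manual.pdf"],
    how_to_run="See the manual linked above",
    spatial_extent=(-0.5, 51.3, 0.3, 51.7),
)
print(asdict(record))  # serialisable dict, ready for a metadata portal
```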

    Controls on the magnitude-frequency scaling of an inventory of secular landslides

    Linking landslide size and frequency is important at both human and geological timescales for quantifying both landslide hazards and the effectiveness of landslides in removing sediment from evolving landscapes. Landslide inventories are usually compiled following a particular triggering event such as an earthquake or storm, and their magnitude-frequency statistics are often characterised by a power-law relationship with a rollover at small landslide sizes. The occurrence of landslides is expected to be influenced by the material properties of the rock and/or regolith in which failure occurs. Here we explore the statistical behaviour and the controls of a secular landslide inventory (SLI), i.e. one recording events occurring over an indefinite geological time period, consisting of mapped landslide deposits and their underlying lithology (bedrock or superficial) across the United Kingdom. The magnitude-frequency distribution of this secular inventory exhibits an inflected power-law relationship, well approximated by either an inverse gamma or a double Pareto model. The scaling exponent for the power-law scaling of medium to large landslides is β = −1.71 ± 0.02. The small-event rollover occurs at a significantly higher magnitude (1.0–7.0 × 10⁻³ km²) than observed in single-event landslide records (∼4 × 10⁻³ km²). We interpret this as evidence of landscape annealing, from which we infer that the SLI underestimates the frequency of small landslides. This is supported by a subset of the data for which a complete landslide inventory was recently mapped. Large landslides also appear to be under-represented relative to model predictions. There are several possible reasons for this, including an incomplete data set, an incomplete landscape (i.e. relatively steep slopes are under-represented), and/or temporal transience in landslide activity during emergence from the last glacial maximum toward a generally more stable late-Holocene state. The proposed process of landscape annealing and the possibility of a transient hillslope response mean that it is not possible to use the statistical properties of the current SLI database to rigorously constrain probabilities of future landslides in the UK.
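A power-law scaling exponent like the one quoted above is conventionally estimated by maximum likelihood on areas above the rollover. The sketch below applies the standard continuous power-law MLE to synthetic areas, not to the SLI itself; the rollover value and sample size are assumptions for illustration:

```python
# Estimate a power-law exponent for landslide areas above a rollover,
# using the standard continuous MLE: alpha_hat = 1 + n / sum(ln(a_i / a_min)).
# Data are synthetic, drawn from a known power law so the fit can be checked.
import math
import random

random.seed(42)
a_min = 4e-3   # assumed rollover area, km^2
alpha = 1.71   # magnitude of the pdf exponent used to generate the data

# Inverse-transform sampling from p(a) ~ a^(-alpha) for a >= a_min:
# a = a_min * (1 - u)^(-1 / (alpha - 1)), with u uniform on [0, 1).
areas = [a_min * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0))
         for _ in range(5000)]

# Maximum-likelihood estimate of the exponent from the sample.
alpha_hat = 1.0 + len(areas) / sum(math.log(a / a_min) for a in areas)
print(f"estimated exponent: -{alpha_hat:.2f}")  # close to -1.71
```

The same estimator applied only above the rollover is what makes the quoted ±0.02 uncertainty meaningful: below the rollover the inventory is incomplete, so those areas must be excluded from the fit.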

    The development of a GIS methodology to assess the potential for water resource contamination due to new development in the 2012 Olympic Park site, London

    The Initial Screening Tool (IST) has been developed to enable planners to assess the potential risk to ground and surface water from the remobilisation of contaminants by new developments. The IST is a custom-built GIS application that improves upon previous screening tools developed by the British Geological Survey (BGS) through the inclusion of 3-D geological data and an enhanced scoring methodology. The key new feature of the IST is the ability to track individual pollutant linkages from a source of contamination, along multiple possible pathways, to potentially susceptible receptors. A rule-based approach allows the methodology to be easily updated, and as a result the IST has a role in scenario planning. The application provides output in the form of an automatically generated report presenting details of the potential pollutant linkages identified. The initial research area selected was the Olympic Park site, London.
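The source-pathway-receptor linkage tracking at the heart of the IST can be sketched as a simple rule lookup; the rules, names and scores below are invented for illustration and are not the BGS scoring methodology:

```python
# Sketch of rule-based source-pathway-receptor linkage identification.
# Each rule states which pathway can connect a source type to a receptor
# type, with an illustrative relative risk score.
RULES = [
    ("made ground",  "permeable superficial deposits", "aquifer", 3),
    ("made ground",  "surface runoff",                 "river",   2),
    ("fuel storage", "permeable superficial deposits", "aquifer", 5),
]

def find_linkages(sources, pathways, receptors):
    """Return every complete source-pathway-receptor linkage at a site."""
    return [
        (src, path, rec, score)
        for (src, path, rec, score) in RULES
        if src in sources and path in pathways and rec in receptors
    ]

# Hypothetical site: two sources, one pathway, one receptor present.
site_linkages = find_linkages(
    sources={"made ground", "fuel storage"},
    pathways={"permeable superficial deposits"},
    receptors={"aquifer"},
)
for linkage in site_linkages:
    print(linkage)
```

Keeping the rules in a data table separate from the search logic is what makes this style of tool easy to update and suitable for scenario planning: changing a rule or score requires no change to the code that walks the linkages.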