    Impacts of upland open drains upon runoff generation: a numerical assessment of catchment-scale impacts

    Shallow upland drains, grips, have been hypothesized as responsible for increased downstream flow magnitudes. Observations provide counterfactual evidence, often relating to the difficulty of inferring conclusions from statistical correlation and paired catchment comparisons, and the complexity of designing field experiments to test grip impacts at the catchment scale. Drainage should provide drier antecedent moisture conditions, and hence more storage at the start of an event; however, grips have higher flow velocities than overland flow, thus potentially delivering flow more rapidly to the drainage network. We develop and apply a model for assessing the impacts of grips on flow hydrographs. The model was calibrated on the gripped case and then compared with an intact case created by removing all grips. This comparison showed that, even given parameter uncertainty, the intact case had significantly higher flood peaks and lower baseflows, mirroring field observations of the hydrological response of intact peat. The simulations suggest that grip-related delivery effects may fail to translate into catchment-scale impacts for three reasons. First, in our case, the proportions of flow path lengths that were hillslope were not changed significantly by gripping. Second, the structure of the grip network as compared with the structure of the drainage basin militated against grip-related increases in the concentration of runoff in the drainage network, although it did marginally reduce the mean timing of that concentration at the catchment outlet. Third, the effect of the latter upon downstream flow magnitudes can only be assessed by reference to the peak timing of other tributary basins, emphasizing that drain effects are both relative and scale dependent. However, given the importance of hillslope flow paths, we show that if upland drainage causes significant changes in surface roughness on hillslopes, then critical and important feedbacks may impact upon the speed of hydrological response.
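
    A minimal sketch of the travel-time reasoning above, assuming hypothetical path lengths and velocities rather than anything calibrated in the study: because overland flow is much slower than flow in grips or channels, total travel time is dominated by the hillslope segment, so faster grip velocities matter little unless gripping changes hillslope path lengths.

        # Hypothetical illustration of the travel-time argument; all numbers
        # are invented for demonstration and are not values from the study.
        def travel_time(hillslope_len_m, channel_len_m,
                        v_overland=0.01, v_channel=0.5):  # assumed velocities, m/s
            """Total travel time (s) along one flow path."""
            return hillslope_len_m / v_overland + channel_len_m / v_channel

        # Intact case: 200 m of hillslope, 800 m of natural channel.
        t_intact = travel_time(200.0, 800.0)
        # Gripped case: grips replace only a little hillslope length
        # (190 m hillslope, 810 m grip plus channel), because gripping
        # barely changed the proportion of flow path that is hillslope.
        t_gripped = travel_time(190.0, 810.0)
        print(t_intact, t_gripped)  # the hillslope segment dominates both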

    Automated determination of landslide locations after large trigger events: advantages and disadvantages compared to manual mapping

    Earthquakes in mountainous areas can trigger thousands of co-seismic landslides, causing significant damage, hampering relief efforts, and rapidly redistributing sediment across the landscape. Efforts to understand the controls on these landslides rely heavily on manually mapped landslide inventories, but these are costly and time-consuming to collect, and their reproducibility is not typically well constrained. Here we develop a new automated landslide detection algorithm (ALDI) based on pixel-wise NDVI differencing of Landsat time series within Google Earth Engine, accounting for seasonality. We compare the classified inventories to manually mapped inventories from five recent earthquakes: 2005 Kashmir, 2007 Aisen, 2008 Wenchuan, 2010 Haiti, and 2015 Gorkha. We test the ability of ALDI to recover landslide locations (using ROC curves) and landslide sizes (in terms of landslide area-frequency statistics). We find that ALDI identifies landslides more skilfully than the published inventories in 10 of 14 cases when locally optimised, and in 8 of 14 cases both when globally optimised and in holdback testing. These results reflect both the good performance of the automated approach and the surprisingly poor performance of manual mapping, with implications not only for how future classifiers are tested but also for the interpretations based on these inventories. We conclude that ALDI already represents a viable alternative to manual mapping in terms of its ability to identify landslide-affected image pixels. Its fast run-time, cost-free image requirements and near-global coverage make it an attractive alternative with the potential to significantly improve the coverage and quantity of landslide inventories. Its simplicity (pixel-wise analysis only) and parsimony of inputs (optical imagery only) suggest that considerable further improvement should be possible.
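
    The core of the approach, pixel-wise NDVI differencing in Google Earth Engine, can be sketched as below. This is a simplified two-composite version, not ALDI itself (which uses the full time series and models seasonality); the area of interest, dates and the 0.2 change threshold are placeholder assumptions.

        import ee  # Google Earth Engine Python API

        ee.Initialize()

        # Placeholder area of interest and event date (Gorkha-like).
        aoi = ee.Geometry.Rectangle([84.0, 27.5, 86.5, 28.5])
        event = '2015-04-25'

        def ndvi(img):
            # Scale Collection 2 Level 2 surface reflectance, then NDVI.
            sr = img.select(['SR_B5', 'SR_B4']).multiply(0.0000275).add(-0.2)
            return sr.normalizedDifference(['SR_B5', 'SR_B4']).rename('NDVI')

        l8 = ee.ImageCollection('LANDSAT/LC08/C02/T1_L2').filterBounds(aoi)
        pre = l8.filterDate('2013-04-01', event).map(ndvi).median()
        post = l8.filterDate(event, '2016-04-25').map(ndvi).median()

        # Vegetation loss (NDVI drop) as a crude landslide signal; the 0.2
        # threshold is an assumption for illustration only.
        candidate = pre.subtract(post).gt(0.2)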

    Going with the flow? Using participatory action research in physical geography

    This paper critically appraises the idea and practice of ‘participation’ in scientific environmental research, arguing for the wider uptake by physical geographers of a more radical participatory approach. It proposes participatory action research (PAR), which offers an alternative mode of science involving collaboration and co-production of research from question definition through to outcomes. We begin with a critical view of public participation in environmental research and policy-making to date. We argue that much of the rhetoric and practice of participation is shallow, focusing simply on including relevant publics and stakeholders, or carrying an underlying agenda of building trust in science or policy-making. Both orientations diverge drastically from the radical traditions in which participatory research and planning originate. In the rest of the paper, we illustrate an alternative process of knowledge co-production, reporting on a PAR project on farm slurry pollution conducted with a UK Rivers Trust. We evaluate the knowledge co-produced, the responses of participants and the scientific process. Suggesting that we reframe co-production as the circulation of expertise, we argue that PAR can enrich the learning, knowledge and skills of all those involved and lead to innovation and positive environmental outcomes. However, a number of structural and institutional barriers to deep participatory processes remain to be addressed.

    Limits on the validity of infinite length assumptions for modelling shallow landslides

    The infinite slope method is widely used as the geotechnical component of geomorphic and landscape evolution models. Its assumption that shallow landslides are infinitely long (in a downslope direction) is usually considered valid for natural landslides on the basis that they are generally long relative to their depth. However, this is rarely justified, because the critical length/depth (L/H) ratio below which edge effects become important is unknown. We establish this critical L/H ratio by benchmarking infinite slope stability predictions against finite element predictions for a set of synthetic two-dimensional slopes, assuming that the difference between the predictions is due to error in the infinite slope method. We test the infinite slope method at six different L/H ratios to find the critical ratio at which its predictions fall within 5% of those from the finite element method. We repeat these tests for 5000 synthetic slopes with a range of failure plane depths, pore water pressures, friction angles, soil cohesions, soil unit weights and slope angles characteristic of natural slopes. We find that: (1) infinite slope stability predictions are consistently too conservative for small L/H ratios; (2) the predictions always converge to within 5% of the finite element benchmarks by an L/H ratio of 25 (i.e. the infinite slope assumption is reasonable for landslides 25 times longer than they are deep); but (3) they can converge at much lower ratios depending on slope properties, particularly for low-cohesion soils. The implication for catchment-scale stability models is that the infinite length assumption is reasonable if their grid resolution is coarse (e.g. >25 m). However, it may also be valid even at much finer grid resolutions (e.g. 1 m), because spatial organization in the predicted pore water pressure field reduces the probability of short landslides and minimizes the risk that predicted landslides will have L/H ratios less than 25.
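
    For reference, the infinite slope method being benchmarked reduces to a closed-form factor of safety. A minimal sketch, with illustrative parameter values rather than the paper's synthetic slope set, assuming a water table at a fixed fraction of the failure depth:

        from math import cos, sin, tan, radians

        # Classic infinite-slope factor of safety; parameter values are
        # illustrative, not drawn from the paper.
        def infinite_slope_fos(slope_deg, z_m, c_pa=2000.0, phi_deg=35.0,
                               gamma=18000.0, gamma_w=9810.0, m=0.5):
            """FS = resisting / driving stress on a planar failure surface at
            depth z_m; m is the saturated fraction of the soil column."""
            beta = radians(slope_deg)
            u = gamma_w * m * z_m * cos(beta) ** 2  # pore water pressure (Pa)
            resisting = c_pa + (gamma * z_m * cos(beta) ** 2 - u) \
                * tan(radians(phi_deg))
            driving = gamma * z_m * sin(beta) * cos(beta)
            return resisting / driving

        print(infinite_slope_fos(slope_deg=30.0, z_m=1.0))  # FS < 1 implies failure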

    A multi-dimensional stability model for predicting shallow landslide size and shape across landscapes

    The size of a shallow landslide is a fundamental control on both its hazard and geomorphic importance. Existing models are either unable to predict landslide size or are so computationally intensive that they cannot practically be applied across landscapes. We derive a model appropriate for natural slopes that is capable of predicting shallow landslide size but simple enough to be applied over entire watersheds. It accounts for lateral resistance by representing the forces acting on each margin of a potential landslide using earth pressure theory, and represents root reinforcement as an exponential function of soil depth. We test the model’s ability to predict failure for an observed landslide where the relevant parameters are well constrained by field data. The model predicts failure for the observed scar geometry and finds that larger or smaller conformal shapes are more stable. Numerical experiments demonstrate that friction on the boundaries of a potential landslide considerably increases the magnitude of lateral reinforcement, relative to that due to root cohesion alone. We find that there is a critical depth in both cohesive and cohesionless soils, resulting in a minimum size for failure, which is consistent with observed size-frequency distributions. Furthermore, the differential resistance on the boundaries of a potential landslide produces a critical landslide shape that is longer than it is wide, consistent with observed aspect ratios. Finally, our results show that minimum size increases as approximately the square of failure surface depth, consistent with observed landslide depth-area data.
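
    A minimal sketch of the exponential root-reinforcement term described above, assuming placeholder values for surface root cohesion and its e-folding depth (neither is taken from the paper); it shows why depth-integrated root resistance on the margins saturates with depth, one ingredient of the predicted minimum failure size.

        from math import exp

        # Root reinforcement declining exponentially with depth, as in the
        # model; C_R0 (surface root cohesion, Pa) and J (e-folding depth, m)
        # are illustrative placeholders, not calibrated values.
        C_R0, J = 10_000.0, 0.4

        def lateral_root_force(h_scarp_m):
            """Depth-integrated root cohesion per metre of landslide margin
            (N/m): the closed-form integral of C_R0 * exp(-z/J) over 0..H."""
            return C_R0 * J * (1.0 - exp(-h_scarp_m / J))

        # Beyond a depth of a few J, extra depth adds almost no root
        # resistance while the driving force keeps growing with depth.
        for H in (0.5, 1.0, 2.0):
            print(H, round(lateral_root_force(H)))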

    Predicting shallow landslide size and location across a natural landscape: Application of a spectral clustering search algorithm

    Predicting shallow landslide size and location across landscapes is important for understanding landscape form and evolution and for hazard identification. We test a recently-developed model that couples a search algorithm with 3D slope-stability analysis to predict these two key attributes in an intensively studied landscape with a ten-year landslide inventory. We use process-based sub-models to estimate soil depth, root strength, and pore pressure for a sequence of landslide-triggering rainstorms. We parameterize the sub-models with field measurements independently of the slope stability model, without calibrating predictions to observations. The model generally reproduces observed landslide size and location distributions, overlaps 65% of observed landslides, and of these predicts size to within factors of 2 and 1.5 in 55% and 28% of cases, respectively. Five percent of the landscape is predicted unstable, compared to 2% recorded landslide area. Missed landslides are due not to the search algorithm but to the formulation and parameterization of the model and to inaccuracy in the observed landslide maps. Our model does not improve location prediction relative to infinite-slope methods but predicts landslide size, improves process representation, and reduces reliance on effective parameters. Increasing rainfall intensity or root cohesion generally increases landslide size and shifts locations down hollow axes, while increasing cohesion restricts unstable locations to areas with the deepest soils. Our findings suggest that shallow landslide abundance, location, and size are ultimately controlled by co-varying topographic, material, and hydrologic properties. Estimating the spatio-temporal patterns of root strength, pore pressure, and soil depth across a landscape may be the greatest remaining challenge.
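
    The size-agreement statistic quoted above ('within a factor of 2') can be made concrete with a small sketch; the paired predicted and observed areas below are invented for illustration.

        # The "size within a factor f" test: a prediction matches if the
        # ratio of predicted to observed area lies between 1/f and f.
        def within_factor(pred_area, obs_area, f=2.0):
            ratio = pred_area / obs_area
            return 1.0 / f <= ratio <= f

        pairs = [(900.0, 500.0), (150.0, 400.0), (2000.0, 1800.0)]  # m^2
        hits = sum(within_factor(p, o) for p, o in pairs)
        print(f"{hits}/{len(pairs)} predictions within a factor of 2")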

    Modelling the effects of sediment compaction on salt marsh reconstructions of recent sea-level rise

    This paper quantifies the potential influence of sediment compaction on the magnitude of nineteenth- and twentieth-century sea-level rise, as reconstructed from salt marsh sediments. We first develop a database of the physical and compression properties of low-energy intertidal and salt marsh sediments. Key compression parameters are controlled by organic content (loss on ignition), though compressibility is modulated by local-scale processes, notably the potential for desiccation of sediments. Using this database and standard geotechnical theory, we take a numerical modelling approach to generate and subsequently ‘decompact’ a range of idealised intertidal stratigraphies. We find that compression can contribute significantly to reconstructed accelerations in recent sea level, notably in transgressive stratigraphies. The magnitude of this effect can be sufficient to add between 0.1 and 0.4 mm yr⁻¹ of local sea-level rise, depending on the thickness of the stratigraphic column. In contrast, records from shallow (<0.5 m) uniform-lithology stratigraphies, or from shallow near-surface salt marsh deposits in regressive successions, experience negligible compaction. Spatial variations in compression could be interpreted as ‘sea-level fingerprints’ that might, in turn, be wrongly attributed to oceanic or cryospheric processes. However, consideration of existing sea-level records suggests that this is not the case and that compaction cannot be invoked as the sole cause of recent accelerations in sea level inferred from salt marsh sediments.
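
    A minimal sketch of the 'decompaction' step under standard one-dimensional compression theory, assuming illustrative values for the initial void ratio, compression index and stresses (not the database values from the paper):

        from math import log10

        # Standard one-dimensional compression: void ratio falls with the log
        # of vertical effective stress. E0, CC and the stresses below are
        # illustrative placeholders only.
        E0, CC, SIGMA0 = 3.0, 1.5, 1.0

        def void_ratio(sigma_kpa):
            """Void ratio at vertical effective stress sigma_kpa (kPa)."""
            return E0 - CC * log10(sigma_kpa / SIGMA0)

        def decompacted_thickness(h_now_m, sigma_kpa):
            """Restore a layer's pre-burial thickness from conservation of
            solids: h0 = h * (1 + e0) / (1 + e)."""
            return h_now_m * (1.0 + E0) / (1.0 + void_ratio(sigma_kpa))

        # A 0.10 m salt marsh layer now under ~10 kPa of overburden:
        print(decompacted_thickness(0.10, 10.0))  # ~0.16 m before burial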

    Population density controls on microbial pollution across the Ganga catchment

    For millions of people worldwide, sewage-polluted surface waters threaten water security, food security and human health, yet the extent of the problem and its causes are poorly understood. Given rapid, widespread global urbanisation, the relative impact of urban versus rural populations is particularly important but unknown. Exploiting previously unpublished archival data for the Ganga (Ganges) catchment, we find a strong non-linear relationship between upstream population density and microbial pollution, and predict that these river systems would fail faecal coliform standards for irrigation waters available to 79% of the catchment’s 500 million inhabitants. Overall, this work shows that microbial pollution is conditioned by the continental-scale network structure of rivers, compounded by the location of cities, whose growing populations contribute c. 100 times more microbial pollutants per capita than their rural counterparts.
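
    The 'upstream population' predictor implies accumulating population down the river network; a toy sketch, with an invented three-node network rather than the Ganga data, is:

        # Accumulating population down a river network; the network and
        # populations below are invented for illustration.
        downstream = {'A': 'C', 'B': 'C', 'C': 'outlet'}  # node -> next node
        local_pop = {'A': 1_000_000, 'B': 250_000, 'C': 5_000_000, 'outlet': 0}

        upstream_pop = dict(local_pop)
        # Visit nodes in upstream-to-downstream order (fixed here for a toy
        # tree; a real network needs a topological sort).
        for node in ('A', 'B', 'C'):
            upstream_pop[downstream[node]] += upstream_pop[node]

        print(upstream_pop['outlet'])  # total population draining the outlet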

    Optimisation of stereo-matching algorithms using extant DEM data

    Here we present a new method for using existing digital elevation model (DEM) data to optimise the performance of stereo-matching algorithms for digital topographic determination. We show that existing DEM data, even data of poor quality (in precision or resolution), can be used to train stereo-matching algorithms to generate higher-quality DEMs. The existing data are used to identify and remove gross surface errors. We test the method using true vertical aerial imagery for a UK upland study site. Results demonstrate a dramatic improvement in data quality, even where DEM data derived from topographic maps are adopted. Comparison with other methods suggests that using existing DEM data improves error identification and correction significantly. Tests suggest that the method is applicable to both archival and commissioned aerial imagery.
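
    The error-screening idea, comparing stereo-matched elevations against an existing reference DEM and discarding gross disagreements, can be sketched as follows; the 20 m threshold and the toy arrays are assumptions, not values from the paper.

        import numpy as np

        # Flag stereo-matched elevations that disagree with a reference DEM
        # by more than a threshold, then blank them for later infilling.
        def screen_gross_errors(dem_new, dem_ref, max_diff_m=20.0):
            dem_clean = dem_new.astype(float).copy()
            gross = np.abs(dem_new - dem_ref) > max_diff_m
            dem_clean[gross] = np.nan  # mark for interpolation/re-matching
            return dem_clean, gross

        dem_new = np.array([[310.0, 305.0], [420.0, 298.0]])  # 420 m blunder
        dem_ref = np.array([[308.0, 306.0], [301.0, 300.0]])
        clean, mask = screen_gross_errors(dem_new, dem_ref)
        print(mask)  # True where the stereo matcher produced a gross error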

    Digital filtering of generic topographic data in geomorphological research

    High-resolution terrain models generated from widely available Interferometric Synthetic Aperture Radar (IfSAR) and digital photogrammetry are an exciting resource for geomorphological research. However, these data contain error, necessitating pre-processing to improve their quality. We evaluate the ability of digital filters to improve topographic representation, using: (1) a Gaussian noise removal filter; (2) the proprietary filters commonly applied to these datasets; and (3) a terrain-sensitive filter, similar to those applied to laser altimetry data. Topographic representation is assessed in terms of both absolute accuracy, measured with reference to independent check data, and derived geomorphological variables (slope, upslope contributing area, topographic index and landslide failure probability) for a steepland catchment in Northern England. Results suggest that proprietary filters often degrade, or fail to improve, precision. A combination of terrain-sensitive and Gaussian filters performs best for both datasets, improving the precision of photogrammetric digital elevation models (DEMs) by more than 50 per cent relative to the unfiltered data. High-frequency noise and high-magnitude gross errors corrupt geomorphic variables derived from unfiltered photogrammetry DEMs. However, a terrain-sensitive filter effectively removes gross errors, and noise is minimised using a Gaussian filter. These improvements propagate through derived variables in a landslide prediction model, reducing the area of predicted instability by up to 29 per cent of the study area. IfSAR is susceptible to removal of topographic detail by over-smoothing, and its errors are less sensitive to filtering (a maximum improvement in precision of 5 per cent relative to the raw data).
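
    A minimal sketch of the Gaussian noise-removal step on a synthetic DEM, assuming a sigma of one pixel (the paper's terrain-sensitive filter is not reproduced here):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # Smooth a synthetic DEM (a plane plus 1 m Gaussian noise) with a
        # small Gaussian kernel; sigma = 1 pixel is a placeholder choice.
        rng = np.random.default_rng(0)
        dem = np.linspace(100.0, 150.0, 64 * 64).reshape(64, 64)  # smooth ramp
        noisy = dem + rng.normal(0.0, 1.0, dem.shape)

        smoothed = gaussian_filter(noisy, sigma=1.0)
        # Residual error versus the true surface drops after filtering.
        print(np.std(noisy - dem), np.std(smoothed - dem))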