
    Assessing housing quality and its impact on health, safety and sustainability

    Background. The adverse health and environmental effects of poor housing quality are well established. A central requirement for evidence-based policies and programmes to improve housing standards is a valid, reliable and practical way of measuring housing quality that is supported by policy agencies, the housing sector, researchers and the public. Methods. This paper provides guidance on the development of housing quality-assessment tools that link practical measures of housing conditions to their effects on health, safety and sustainability, with particular reference to tools developed in New Zealand and England. Results. The authors describe how information on housing quality can support individuals, agencies and the private sector in making worthwhile improvements to the health, safety and sustainability of housing. The information gathered, and the resulting tools, should be guided by the multiple purposes and end users of this information. Other important issues outlined include deciding on the scope and detailed content of the tool, practical administration issues, and how the information will be analysed and summarised for its intended end users. There are likely to be considerable benefits from increased international collaboration and standardisation of approaches to measuring housing hazards. At the same time, these assessment approaches need to consider local factors such as climate, geography, culture, predominant building practices, important housing-related health issues and existing building codes. Conclusions. An effective housing quality-assessment tool has a central role in supporting improvements to housing. The issues discussed in this paper are designed to motivate and assist the development of such tools.

    A Multi-Factorial Risk Prioritization Framework for Food-Borne Pathogens

    To lower the incidence of human food-borne disease, experts and stakeholders have urged the development of a science- and risk-based management system in which food-borne hazards are analyzed and prioritized. A literature review shows that most approaches to risk prioritization developed to date are based on measures of health outcomes and do not systematically account for other factors that may be important to decision making. The Multi-Factorial Risk Prioritization Framework developed here considers four factors that may be important to risk managers: public health, consumer risk perceptions and acceptance, market-level impacts, and social sensitivity. The framework is based on the systematic organization and analysis of data on these multiple factors. The basic building block of the information structure is a three-dimensional cube based on pathogen-food-factor relationships. Each cell of the cube has an information card associated with it, and data from the cube can be aggregated along different dimensions. The framework is operationalized in three stages, with each stage adding another dimension to decision-making capacity. The first stage is the information cards themselves, which provide systematic information that is not pre-processed or aggregated across factors. The second stage maps the information on the various information cards into cobweb diagrams that create a graphical profile of, for example, a food-pathogen combination with respect to each of the four risk prioritization factors. The third stage is formal multi-criteria decision analysis, in which decision makers place explicit values on different criteria in order to develop risk priorities. The process outlined above produces ‘List A’ of priority food-pathogen combinations according to some aggregate of the four risk prioritization factors. This list is further vetted to produce ‘List B’, which brings in feasibility analysis by ranking those combinations for which practical actions with a significant impact are feasible. Food-pathogen combinations for which too little is known to identify feasible interventions are included in ‘List C’. ‘List C’ highlights areas with significant uncertainty where further research may be needed to enhance the precision of the risk prioritization process. The separation of feasibility and uncertainty issues through the use of Lists A, B and C allows risk managers to focus separately on distinct dimensions of the overall prioritization. The Multi-Factorial Risk Prioritization Framework provides a flexible instrument that compares and contrasts risks along four dimensions. Use of the framework is an iterative process: it can be used to establish priorities across pathogens for a particular food, across foods for a particular pathogen, and/or across specific food-pathogen combinations. This report provides a comprehensive conceptual paper that forms the basis for a wider process of consultation and for case studies applying the framework. Keywords: risk analysis, risk prioritization, food-borne pathogens, benefits and costs
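    As an illustration of the information structure described above, the sketch below shows one possible way to hold the pathogen-food-factor cube in code and to walk through the three stages (information cards, cobweb profiles, weighted multi-criteria scores). All pathogen and food names, factor weights and scores are invented for the example and are not taken from the report.

```python
# Illustrative sketch (names, factors and scores are invented, not from the report):
# the pathogen-food-factor "cube" as a mapping from (pathogen, food, factor) cells
# to information-card scores, plus the three stages of use described above.

FACTORS = ["public_health", "risk_perception", "market_impact", "social_sensitivity"]

# Stage 1: information cards -- one cell per pathogen-food-factor combination,
# here reduced to a single illustrative score per card on a common 0-1 scale.
cube = {
    ("Salmonella", "poultry", "public_health"): 0.9,
    ("Salmonella", "poultry", "risk_perception"): 0.6,
    ("Salmonella", "poultry", "market_impact"): 0.7,
    ("Salmonella", "poultry", "social_sensitivity"): 0.4,
    ("Listeria", "soft cheese", "public_health"): 0.8,
    ("Listeria", "soft cheese", "risk_perception"): 0.5,
    ("Listeria", "soft cheese", "market_impact"): 0.3,
    ("Listeria", "soft cheese", "social_sensitivity"): 0.9,
}

def profile(pathogen, food):
    """Stage 2: the cobweb-diagram profile of one food-pathogen combination."""
    return {f: cube.get((pathogen, food, f)) for f in FACTORS}

def mcda_score(pathogen, food, weights):
    """Stage 3: a simple weighted-sum multi-criteria score used to rank 'List A'."""
    p = profile(pathogen, food)
    return sum(weights[f] * p[f] for f in FACTORS if p[f] is not None)

weights = {"public_health": 0.5, "risk_perception": 0.2,
           "market_impact": 0.2, "social_sensitivity": 0.1}
pairs = [("Salmonella", "poultry"), ("Listeria", "soft cheese")]
ranking = sorted(pairs, key=lambda pf: mcda_score(pf[0], pf[1], weights), reverse=True)
print(ranking)
```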

    Generating Survival Times to Simulate Cox Proportional Hazards Models

    This paper discusses techniques to generate survival times for simulation studies regarding Cox proportional hazards models. In linear regression models, the response variable is directly connected with the considered covariates, the regression coefficients and the simulated random errors; thus, the response variable can be generated from the regression function once the regression coefficients and the error distribution are specified. In the Cox model, however, which is formulated via the hazard function, the effects of the covariates have to be translated from the hazards to the survival times, because the usual software packages for estimating Cox models require individual survival time data. A general formula describing the relation between the hazard and the corresponding survival time of the Cox model is derived. It is shown how the exponential, the Weibull and the Gompertz distribution can be used to generate appropriate survival times for simulation studies. Additionally, the general relation between hazard and survival time can be used to derive custom distributions for special situations and to handle flexibly parameterized proportional hazards models. Using distributions other than the exponential is indispensable for investigating the characteristics of the Cox proportional hazards model, especially in non-standard situations where the partial likelihood depends on the baseline hazard.
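    A minimal sketch of the inversion idea behind this kind of simulation: if U is uniform on (0,1) and H0 is the cumulative baseline hazard, then T = H0^{-1}(-log(U) * exp(-x'beta)) follows a Cox model with that baseline hazard. The code below applies this for exponential, Weibull and Gompertz baseline hazards; the parameter values, covariate and censoring scheme are illustrative only and are not taken from the paper.

```python
# Illustrative inversion sampling for Cox proportional hazards simulation
# (covariate, parameter values and censoring time are made up for the example).
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.binomial(1, 0.5, size=n)     # a single binary covariate
beta = 0.7                           # true log hazard ratio
lin = np.exp(beta * x)               # exp(x'beta)
u = rng.uniform(size=n)

# Exponential baseline hazard h0(t) = lam:  T = -log(U) / (lam * exp(x'beta))
lam = 0.1
t_exp = -np.log(u) / (lam * lin)

# Weibull baseline hazard h0(t) = lam * nu * t**(nu - 1):
# T = (-log(U) / (lam * exp(x'beta)))**(1 / nu)
nu = 1.5
t_wei = (-np.log(u) / (lam * lin)) ** (1.0 / nu)

# Gompertz baseline hazard h0(t) = lam * exp(alpha * t):
# T = (1 / alpha) * log(1 - alpha * log(U) / (lam * exp(x'beta)))
alpha = 0.05
t_gom = (1.0 / alpha) * np.log(1.0 - alpha * np.log(u) / (lam * lin))

# Administrative censoring at a fixed time, as is common in simulation studies.
c = 30.0
time = np.minimum(t_exp, c)
event = (t_exp <= c).astype(int)
```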

    Risk assessment of blasting operations in open pit mines using FAHP method

    Purpose. In mining blasting operations, fragmentation is the most important output, while fly rock, ground vibration, air blast and environmental effects are detrimental side effects. Identifying and ranking the risks of blasting operations is considered the most important stage in project management. Methods. In this research, the problem of identifying and ranking the factors constituting the risk in blasting operations is addressed with the methodology of the Fuzzy Analytical Hierarchy Process (FAHP). Criteria and sub-criteria for designing the hierarchical process were determined based on previous research, field studies and expert opinions. Findings. Based on the FAHP scores, lack of control over the sub-criteria health and safety (C3), blast operation results (C18), and knowledge, skill and staffing (C2), with scores of 0.377, 0.334 and 0.294 respectively, contributes most to blasting operations risk; according to its score, sub-criterion C18 is identified as the most influential sub-criterion in creating that risk. Among the criteria, effects and results of blasting operations (D8), with a score of 0.334, is the most influential, while natural hazards (D10), with a score of 0.015, ranks last among the factors causing blasting operations risk. Originality. Regarding the risk ranking of blasting operations, control of the sub-criteria C3, C18 and C2 and of the criterion D8 is of particular importance for reducing the risk of blasting operations and improving project management. Practical implications. Evaluating human resource performance, raising the level of knowledge, skills and occupational safety, and controlling all outputs of blasting operations are necessary; therefore, selecting the most important project risks and taking action to remove them is essential for risk management. The authors would like to thank the Mining Engineering Department, Islamic Azad University (South Tehran Branch), for supporting this research.
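    For concreteness, the sketch below shows one common way of deriving crisp criterion weights in an FAHP analysis, namely Buckley's geometric-mean method with triangular fuzzy numbers. The 3x3 pairwise judgments are invented, and the paper may use a different FAHP variant (for example Chang's extent analysis), so this is only an illustration of the general weighting step.

```python
# Illustrative FAHP weighting sketch using Buckley's geometric-mean method with
# triangular fuzzy numbers (l, m, u). Pairwise judgments are invented examples.
import numpy as np

def reciprocal(t):
    l, m, u = t
    return (1.0 / u, 1.0 / m, 1.0 / l)

# Fuzzy pairwise comparisons among three example criteria.
eq = (1.0, 1.0, 1.0)
a12 = (2.0, 3.0, 4.0)   # criterion 1 moderately more important than criterion 2
a13 = (4.0, 5.0, 6.0)   # criterion 1 strongly more important than criterion 3
a23 = (1.0, 2.0, 3.0)
M = [
    [eq,              a12,             a13],
    [reciprocal(a12), eq,              a23],
    [reciprocal(a13), reciprocal(a23), eq ],
]

n = len(M)
# Fuzzy geometric mean of each row, computed component-wise.
g = []
for row in M:
    l = np.prod([t[0] for t in row]) ** (1.0 / n)
    m = np.prod([t[1] for t in row]) ** (1.0 / n)
    u = np.prod([t[2] for t in row]) ** (1.0 / n)
    g.append((l, m, u))

# Fuzzy weights: each row mean divided by the (fuzzy) sum of all row means.
sl = sum(t[0] for t in g); sm = sum(t[1] for t in g); su = sum(t[2] for t in g)
fuzzy_w = [(l / su, m / sm, u / sl) for (l, m, u) in g]

# Defuzzify by the centroid and normalise to obtain crisp criterion weights.
crisp = np.array([(l + m + u) / 3.0 for (l, m, u) in fuzzy_w])
weights = crisp / crisp.sum()
print(weights.round(3))
```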

    Soil erosion in the Alps: causes and risk assessment

    The issue of soil erosion in the Alps has long been neglected due to the low economic value of the agricultural land. However, soil stability is a key parameter which affects ecosystem services such as slope stability, water budgets (drinking water reservoirs as well as flood prevention), vegetation productivity, ecosystem biodiversity and nutrient production. In alpine regions, spatial estimates of soil erosion are difficult to derive because the highly heterogeneous biogeophysical structure impedes both the measurement of soil erosion and the applicability of soil erosion models. However, remote sensing and geographic information system (GIS) methods allow spatial estimation of soil erosion by direct detection of erosion features and by supplying input data for soil erosion models. Thus, the main objective of this work is to address the problem of soil erosion risk assessment in the Alps at the catchment scale with remote sensing and GIS tools. Regarding soil erosion processes, the focus is on erosion by water (here, sheet erosion) and by gravity (here, landslides). For these two processes we address i) the monitoring and mapping of erosion features and related causal factors, ii) soil erosion risk assessment, with special emphasis on iii) the validation of existing models for alpine areas. All investigations were carried out in the Urseren Valley (Central Swiss Alps), where the valley slopes are dramatically affected by sheet erosion and landslides. For landslides, a natural susceptibility of the catchment has been indicated by bivariate and multivariate statistical analysis. Geology, slope and stream density are the most significant static landslide causal factors; static factors are defined here as factors that do not change their attributes during the time span considered in the study (45 years), e.g. geology and the stream network. The occurrence of landslides might be significantly increased by the combined effects of global climate and land use change. Thus, our hypothesis is that recent changes in land use and climate have affected the spatial and temporal occurrence of landslides. The 92% increase in landslide area within 45 years in the study site confirmed this hypothesis. In order to identify the cause of the trend in landslide occurrence, time series of landslide causal factors were analysed. The analysis revealed increasing trends in the frequency and intensity of extreme rainfall events and in the stocking of pasture animals; these developments presumably enhanced landslide hazard. Moreover, changes in land cover and land use were shown to have affected landslide occurrence. For instance, abandoned areas and areas with recently emerging shrub vegetation show very low landslide densities. Detailed spatial analysis of the land use with GIS and interviews with farmers confirmed the strong influence of land use management practices on slope stability. The definite identification and quantification of the impact of these non-stationary landslide causal factors (dynamic factors) on the landslide trend was not possible because several factors changed simultaneously. How to consider dynamic factors in statistical landslide susceptibility assessments is still an unsolved problem; neglecting them may lead to erroneous model predictions, especially in times of dramatic environmental change. Thus, we evaluated the effect of dynamic landslide causal factors on the validity of landslide susceptibility maps for spatial and temporal predictions.
    For this purpose, a logistic regression model based on data of the year 2000 was set up. The resulting landslide susceptibility map was valid for spatial predictions; however, the model failed to predict the landslides that occurred in a subsequent event. In order to handle this weakness of statistical landslide modelling, a multitemporal approach was developed. It is based on establishing logistic regression models for two points in time (here 1959 and 2000). Both models could correctly classify more than 70% of the independent spatial validation dataset. By subtracting the 1959 susceptibility map from the 2000 susceptibility map, a deviation susceptibility map was obtained. Our interpretation was that these susceptibility deviations indicate the effect of dynamic causal factors on the landslide probability. The deviation map explained 85% of new independent landslides occurring after 2000. Thus, we believe it to be a suitable tool for adding a time element to a susceptibility map, pointing to areas whose susceptibility is changing due to recently changing environmental conditions or human interventions. In contrast to landslides, which are a direct threat to buildings and infrastructure, sheet erosion attracts less attention because it is often an unseen process. Nonetheless, sheet erosion may account for a major proportion of soil loss. Soil loss by sheet erosion shows high spatial variability; however, in contrast to arable fields, erosion damage on alpine grasslands is long lasting and visible over long time periods. A crucial erosion-triggering parameter that can be derived from satellite imagery is fractional vegetation cover (FVC). Measurements of the radioactive isotope Cs-137, a common tracer for soil erosion, confirm the importance of FVC for soil erosion yield in alpine areas. Linear spectral unmixing (LSU), mixture tuned matched filtering (MTMF) and the spectral index NDVI are applied to estimate the fractional abundance of vegetation and bare soil. To account for the small-scale heterogeneity of the alpine landscape, very high resolution multispectral QuickBird imagery is used. The performance of LSU and MTMF for estimating percent vegetation cover is good (r² = 0.85 and r² = 0.71, respectively). Poorer performance is achieved for bare soil (r² = 0.28 and r² = 0.39, respectively) because, compared to vegetation, bare soil has a less characteristic spectral signature in the wavelength domain detected by the QuickBird sensor. Apart from monitoring erosion-controlling factors, soil erosion is quantified by applying soil erosion risk models. The two established models, the Universal Soil Loss Equation (USLE) and the Pan-European Soil Erosion Risk Assessment (PESERA), are tested for their suitability to model erosion in mountain environments, and Cs-137 is used to verify the resulting erosion rates. PESERA shows no correlation to the measured Cs-137 long-term erosion rates and lower sensitivity to FVC; thus, the USLE is used to model the entire study site, with the LSU-derived FVC map used to adapt its C factor. Compared to the low erosion rates computed with the previously available low-resolution dataset (1:25,000), the satellite-supported USLE map shows “hotspots” of soil erosion of up to 16 t ha⁻¹ a⁻¹. In general, Cs-137 in combination with the USLE is a very suitable method for assessing soil erosion over larger areas, as both give estimates of long-term soil erosion.
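    A minimal sketch of the multitemporal susceptibility-deviation idea described above, assuming two landslide inventories and a handful of static causal factors per grid cell. All data, factor values and model settings are invented for illustration, and scikit-learn's logistic regression simply stands in for the statistical model used in the study.

```python
# Illustrative sketch: fit one logistic regression per point in time on static
# causal factors, then subtract the earlier probability surface from the later
# one to obtain a susceptibility-deviation map. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_cells = 5000
# Static causal factors per grid cell (e.g. slope, stream density, geology class).
X = np.column_stack([
    rng.uniform(0, 45, n_cells),    # slope in degrees
    rng.uniform(0, 5, n_cells),     # stream density
    rng.integers(0, 4, n_cells),    # geology class (encoded)
])

# Hypothetical landslide inventories (presence/absence) for the two dates.
y_1959 = rng.binomial(1, 0.05, n_cells)
y_2000 = rng.binomial(1, 0.10, n_cells)

m_1959 = LogisticRegression(max_iter=1000).fit(X, y_1959)
m_2000 = LogisticRegression(max_iter=1000).fit(X, y_2000)

p_1959 = m_1959.predict_proba(X)[:, 1]
p_2000 = m_2000.predict_proba(X)[:, 1]

# Deviation map: positive values flag cells whose modelled susceptibility has
# grown, interpreted above as the footprint of dynamic causal factors.
deviation = p_2000 - p_1959
print(deviation.min().round(3), deviation.max().round(3))
```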
    Especially for inaccessible alpine areas, GIS and remote sensing proved to be powerful tools that can be used for repeated measurements of erosion features and causal factors. In times of global change it is of crucial importance to account for temporal developments. However, the evaluation of the applied soil erosion risk models revealed that the implementation of temporal aspects, such as varying climate, land use and vegetation cover, is still insufficient. Thus, the proposed validation strategies (spatial, temporal and via Cs-137) are essential. Further case studies in alpine regions are needed to test the methods elaborated for the Urseren Valley. However, the presented approaches are promising with respect to improving the monitoring and identification of soil erosion risk areas in alpine regions.
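    The C-factor adaptation and hotspot mapping can likewise be sketched on a toy raster. The exponential FVC-to-C relation and all parameter values below are generic placeholders, not the calibration used for the Urseren Valley; the 16 t ha⁻¹ a⁻¹ threshold simply echoes the figure quoted in the abstract.

```python
# Illustrative raster sketch of the USLE (A = R * K * LS * C * P) with a
# cover-management factor derived from fractional vegetation cover (FVC).
# All values and the FVC-to-C relation are placeholders on a synthetic grid.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)                 # toy raster grid

R = np.full(shape, 120.0)          # rainfall erosivity
K = np.full(shape, 0.03)           # soil erodibility
LS = rng.uniform(0.5, 8.0, shape)  # slope length and steepness factor
P = np.ones(shape)                 # support practice factor (none assumed)

fvc = rng.uniform(0.0, 1.0, shape)          # fractional vegetation cover [0..1]
C = np.exp(-4.0 * fvc)                      # placeholder: dense cover -> small C

A = R * K * LS * C * P                      # soil loss, e.g. t ha^-1 a^-1
hotspots = A > 16.0                         # threshold echoing the quoted figure
print(A.mean().round(2), int(hotspots.sum()))
```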