6 research outputs found

    Zero Liquid Discharge System for the Tannery Industry—An Overview of Sustainable Approaches

    The tannery industry consumes a large quantity of water, around 30–40 m³ to process 1000 kg of hide or skin. Most of this water becomes wastewater containing about 300 kg of different chemicals, mainly refractory organic compounds, with high chemical oxygen demand (COD), high total dissolved salts (TDS), chromium, and the evolution of toxic gases such as ammonia and sulfides. The residual tanning chemicals are discharged as an effluent that is highly resistant to biological degradation, making it a serious environmental issue, and end-of-pipe treatment alone is usually not sufficient to address it. Among cleaner production options, redesigning existing effluent treatment procedures with alternative or additional techniques that “support resource recovery with no added chemicals” is expected to provide a sustainable solution for managing this toxic effluent. A Zero Liquid Discharge (ZLD) system ensures zero liquid water emission by recycling, recovering, and reusing the treated wastewater through advanced cleanup technology. Internationally, ZLD has been implemented largely under pressure from regulatory agencies. A ZLD system consists of pre-treatment by conventional physicochemical processes, tertiary treatment, softening of the treated effluent, reverse osmosis (RO) for desalination, and thermal evaporation of the saline RO reject to separate the salts. Adopting this system reduces water consumption, and ZLD also contributes to disaster mitigation in areas where the tannery industry is a strong economic actor. With this review, we aim to give an outlook of the current framework
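    As a rough illustration of the water and chemical loads quoted above, the following minimal Python sketch estimates how much reusable permeate a ZLD train could return per tonne of hide. The intake and chemical figures are taken from the abstract, while the RO recovery fraction is a hypothetical value for illustration only, not a reported plant parameter.

```python
# Minimal sketch of a per-tonne-of-hide water balance for a ZLD train.
# Figures are illustrative assumptions consistent with the abstract
# (30-40 m3 of water and ~300 kg of chemicals per 1000 kg of hide);
# the RO recovery fraction is hypothetical, not measured plant data.

def zld_water_balance(raw_water_m3=35.0, chemical_load_kg=300.0, ro_recovery=0.75):
    """Approximate split of wastewater into reusable permeate, saline reject
    sent to thermal evaporation, and the dissolved load recovered as salts."""
    wastewater_m3 = raw_water_m3                 # essentially all process water becomes effluent
    permeate_m3 = ro_recovery * wastewater_m3    # reusable water after pre-treatment, softening and RO
    reject_m3 = wastewater_m3 - permeate_m3      # saline concentrate routed to the evaporator
    salts_kg = chemical_load_kg                  # dissolved load ends up as separated salts/residue
    return {"permeate_m3": permeate_m3, "reject_m3": reject_m3, "salts_kg": salts_kg}

if __name__ == "__main__":
    balance = zld_water_balance()
    print(f"Reusable permeate : {balance['permeate_m3']:.1f} m3 per tonne of hide")
    print(f"Evaporator feed   : {balance['reject_m3']:.1f} m3 per tonne of hide")
    print(f"Salt/residue load : {balance['salts_kg']:.0f} kg per tonne of hide")
```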

    GIS based urban social vulnerability assessment for liquefaction susceptible areas: a case study for greater Chennai, India

    No full text
    Abstract Background Areas prone to geological hazards such as liquefaction need special attention with respect to social vulnerability. Although liquefaction by itself may not cause damage, it can trigger a series of ground failures, such as ground oscillation, lateral spread, and loss of bearing strength, which cause heavy damage. Globally, over the past few decades liquefaction hazard analysis has become an important criterion in seismic risk analysis and mitigation management, especially for urban areas. Greater Chennai is one of the million-plus-population cities in India and has experienced earthquakes and tremors in the past. Method The present study assesses the social vulnerability of the population of the Greater Chennai area to liquefaction susceptibility using GIS technology. The liquefaction susceptibility (hazard) map for Greater Chennai was prepared by integrating geological and geomorphological parameters and was analyzed against socioeconomic parameters (exposure) using a combination of GIS and the analytic hierarchy process (AHP). Results The results showed that around 53% of Greater Chennai’s households and population are highly exposed to liquefaction hazard. Conclusions This study can serve as a baseline for decision-making in land use planning as well as disaster mitigation planning.
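    Since the abstract combines GIS layers through AHP, a minimal sketch of the AHP weighting step may help. The 3x3 pairwise comparison matrix below is purely hypothetical (the study's actual criteria and judgments are not given); only the standard principal-eigenvector weights and consistency-ratio check are shown.

```python
import numpy as np

# Minimal AHP weight calculation: derive criterion weights from a pairwise
# comparison matrix via the principal eigenvector and check consistency.
# The matrix below (e.g. geology vs. geomorphology vs. population density)
# is a hypothetical example, not the judgments used in the study.

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random index for n criteria
cr = ci / ri                                     # consistency ratio; < 0.10 is acceptable

print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```

    The resulting weights would then multiply the rasterized hazard and exposure layers before they are overlaid in the GIS.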

    Tannery: Environmental impacts and sustainable technologies

    No full text
    The tannery is an old, well-developed industrial sector that plays an important role in the global economy and has been heavily industrialized over the years in all countries. In developed countries, legislation forces the tannery sector toward clean and sustainable production. Owing to strict legislative policies, the distribution of processing sites has changed continuously. There are two main reasons for this shift: 1) lower labour costs in some countries than in others; and 2) fewer environmental restrictions in some countries than in others. The current tanning process is still based mainly on the traditional one, which uses chromium salts and yields leather of very high quality despite its severe environmental drawbacks. Since its industrialization, the tannery has shown heavy environmental impacts caused by the operations and processes along the production and supply chain, from raw hides and skins to the final leather. In particular, this sector consumes a huge amount of water, which generates wastewater with high concentrations of pollutants (mainly chromium(III), sodium sulfide, ammonium chloride, biocides, aldehydes, and dyes). Emissions to air and solid waste production also deserve consideration. Finally, health effects on workers can cause problems in both the short and long term. Looking at the current distribution of processing sites, most are in developing countries, where legislation is still weak. Only small amounts of leather come from green tanneries, where the use of chromium is limited or avoided entirely; the reason is the lower quality of the leather produced by the so-called “green technologies”. This paper analyzes the impacts of the tannery industry arising from the conventional chromium process. Reduced-chrome processes and green technologies are also discussed, considering the role of nanotechnologies.

    Partitioning around medoids approach application for computation of regional flood and landslide quantiles

    No full text
    Floods and landslides cause serious damage to the functioning of society, resulting in huge losses of human life and material as well as other environmental impacts. In this paper, a partitioning around medoids (PAM) approach is applied to the assessment of flood quantiles over 145 sites using 11 basin characteristics. The partitioning algorithm classifies the study region into 6 clusters, which are further shown to be homogeneous by a heterogeneity measure test. The study provides regional flood quantile estimates for ungauged sites, derived from L-moments with good accuracy limits for recurrence intervals of 50, 100, 200 and 500 years. As with floods, landslides may be caused by rainfall, especially over long time periods, which both increases the weight of slopes and can lubricate planes of weakness within rock or sediment. It is shown that landslides also fall within some of the clustered zones, depending on the geological conditions of those clusters. Thus, regional flood quantiles, in conjunction with geology and topography, form the basis for landslide activity quantiles.
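    To make the clustering step concrete, here is a minimal, self-contained sketch of the partitioning-around-medoids idea applied to a matrix of basin characteristics. The random data standing in for the 145 sites x 11 attributes and the greedy swap loop are illustrative assumptions, not the paper's actual data or implementation.

```python
import numpy as np

# PAM-style k-medoids sketch on standardized basin characteristics.
# Placeholder data: 145 sites x 11 attributes, clustered into 6 regions.

rng = np.random.default_rng(0)
X = rng.normal(size=(145, 11))
X = (X - X.mean(axis=0)) / X.std(axis=0)                     # standardize attributes

D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)    # pairwise distance matrix

def total_cost(medoids):
    """Sum of distances from every site to its nearest medoid."""
    return D[:, medoids].min(axis=1).sum()

k = 6
medoids = list(rng.choice(len(X), size=k, replace=False))    # initial medoids

improved = True
while improved:                                              # swap phase of PAM
    improved = False
    for i in range(k):                                       # try replacing each medoid...
        for h in range(len(X)):                              # ...with every non-medoid site
            if h in medoids:
                continue
            candidate = medoids.copy()
            candidate[i] = h
            if total_cost(candidate) < total_cost(medoids):
                medoids = candidate
                improved = True

labels = D[:, medoids].argmin(axis=1)                        # cluster label for each site
print("medoid site indices:", medoids)
print("cluster sizes:", np.bincount(labels, minlength=k))
```

    In the study's workflow, each resulting cluster would then be tested for homogeneity and its pooled L-moments used to estimate the regional growth curve for the 50- to 500-year recurrence intervals.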

    Explainable Artificial Intelligence (XAI) Model for Earthquake Spatial Probability Assessment in Arabian Peninsula

    No full text
    Among all natural hazards, earthquake prediction is an arduous task. Although many studies have been published on earthquake hazard assessment (EHA), very few have addressed the use of artificial intelligence (AI) in spatial probability assessment (SPA). The SPA modeling process is highly complex because of the seismological and geophysical factors involved. Recent studies have shown that inserting certain integrated factors, such as ground shaking, seismic gap, and tectonic contacts, into the AI model improves accuracy to a great extent. Because of the black-box nature of AI models, this paper explores the use of an explainable artificial intelligence (XAI) model in SPA. This study aims to develop a hybrid Inception v3-ensemble extreme gradient boosting (XGBoost) model with Shapley additive explanations (SHAP), which efficiently interprets and recognizes the factors’ behavior and their weighted contributions. The work explains which specific factors are responsible for SPA and how important each is. The earthquake inventory data were collected from the US Geological Survey (USGS) for the past 22 years, covering events of magnitude 5 Mw and above. Landsat-8 satellite imagery and digital elevation model (DEM) data were also incorporated in the analysis. Results revealed that the SHAP outputs align with the explanations of the hybrid Inception v3-XGBoost model (87.9% accuracy), indicating the need to add new factors such as seismic gaps and tectonic contacts, without which the prediction model performs poorly. According to the SHAP interpretations, peak ground acceleration (PGA), magnitude variation, seismic gap, and epicenter density are the most critical factors for SPA. The recent Turkey earthquakes (Mw 7.8, 7.5, and 6.7) along the active East Anatolian Fault validate the obtained AI-based earthquake SPA results. The conclusions drawn from the explainable algorithm show the importance of relevant, irrelevant, and new futuristic factors in AI-based SPA modeling.
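    Because the abstract pairs a gradient-boosting classifier with SHAP to rank factor contributions, a minimal sketch of that explanation step follows. The synthetic feature table, the factor names (PGA, seismic gap, epicenter density, etc.), and the model settings are placeholders for illustration; they are not the study's data or its tuned hybrid Inception v3-XGBoost pipeline.

```python
import numpy as np
import shap
import xgboost as xgb

# Minimal sketch: train a gradient-boosted classifier on synthetic "factors"
# and use SHAP (TreeExplainer) to rank their global contributions, mirroring
# the interpretation step described in the abstract. All data are synthetic.

rng = np.random.default_rng(42)
features = ["pga", "magnitude_variation", "seismic_gap", "epicenter_density", "elevation"]
X = rng.normal(size=(500, len(features)))
# Synthetic binary label loosely driven by the first three factors.
y = (X[:, 0] + 0.5 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = xgb.XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)            # per-sample, per-feature contributions

# Mean absolute SHAP value = global importance of each factor.
importance = np.abs(shap_values).mean(axis=0)
for name, score in sorted(zip(features, importance), key=lambda t: -t[1]):
    print(f"{name:20s} {score:.3f}")
```

    In the paper's setting, the same kind of ranking is what surfaces PGA, magnitude variation, seismic gap, and epicenter density as the dominant factors.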