
    Urban Air Pollution Forecasting Using Artificial Intelligence-Based Tools


    MODELLING THE URBAN SUSTAINABLE DEVELOPMENT BY USING FUZZY SETS

    Sustainable urban development is a subject of interest for regional policy makers, and it requires appropriate assessment based on suitable instruments for research as well as for practical purposes (planning and decision making). Although attaining sustainability is a research topic for academia, urban planners and managers, and an ambitious goal for any resource administrator, there is still no precise way of defining and measuring it. The sustainability of urban development policy involves multiple and diverse aspects, from rational exploitation of local resources and a well-structured workforce to environmental issues and the endowment of modern urban facilities and infrastructure. Since urban sustainability is measured with a multitude of basic indicators and requires adequate information for long-term management decisions and planning, the subject is treated here with fuzzy sets, seen as an appropriate way to handle the ambiguity, subjectivity and imprecision of human reasoning when processing large volumes of possibly unstructured and complex data. The paper proposes a modelling approach based on fuzzy sets inspired by SAFE (Sustainability Assessment by Fuzzy Evaluation), a model that provides a mechanism for measuring development sustainability. The paper presents a quantitative methodology for assessing the potential sustainability of urban development (in terms of adequacy) by pointing out failures in pursuing the trends associated with robust growth in urban areas. The advantages of such an approach come from taking into account the multi-criteria and uncertain facets of the phenomenon; moreover, since sustainability remains a vaguely defined, non-clear-cut concept, its non-deterministic character is captured by fuzzy set logic. The proposed model is designed to assess divergence from desired trajectories and weak points in reaching indicator targets (as commonly regarded in good practice); it can then indicate to policy makers action measures in urban administration as they strive towards increasingly sustainable development in the long term. Keywords: sustainability, urban management, indicators, fuzzy approach.
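
    To make the fuzzy-evaluation idea concrete, the following is a minimal Python sketch (not the SAFE model itself) that maps each urban indicator onto a membership degree in a "sustainable" fuzzy set and aggregates the memberships into a weighted index; the indicator names, (worst, best) anchors, and weights are hypothetical placeholders.

```python
def membership_sustainable(value, worst, best):
    """Linear membership in the 'sustainable' fuzzy set: 0 at the `worst`
    anchor, 1 at the `best` anchor, clipped to [0, 1]. Reversing the anchors
    handles indicators where lower values are better."""
    m = (value - worst) / (best - worst)
    return max(0.0, min(1.0, m))

def fuzzy_sustainability_index(indicators, targets, weights):
    """Weighted average of per-indicator membership degrees, in [0, 1]."""
    total_w = sum(weights[k] for k in indicators)
    score = sum(weights[k] * membership_sustainable(v, *targets[k])
                for k, v in indicators.items())
    return score / total_w

if __name__ == "__main__":
    # Hypothetical indicator values for one city (illustrative only).
    indicators = {"green_area_per_capita_m2": 18.0,
                  "pm10_annual_ug_m3": 32.0,
                  "employment_rate_pct": 68.0}
    # (worst, best) anchors; for PM10 the anchors are reversed because
    # lower concentrations are better.
    targets = {"green_area_per_capita_m2": (5.0, 25.0),
               "pm10_annual_ug_m3": (40.0, 20.0),
               "employment_rate_pct": (50.0, 80.0)}
    weights = {"green_area_per_capita_m2": 1.0,
               "pm10_annual_ug_m3": 1.0,
               "employment_rate_pct": 1.0}
    idx = fuzzy_sustainability_index(indicators, targets, weights)
    print(f"fuzzy sustainability index: {idx:.2f}")
```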

    Development of fuzzy system and nonlinear regression models for ozone and PM2.5 air quality forecasts.

    Ozone forecast models using nonlinear regression (NLR) have been successfully applied to daily ozone forecasting for seven metro areas in Kentucky: Ashland, Bowling Green, Covington, Lexington, Louisville, Owensboro, and Paducah. In this study, the updated 2005 NLR ozone forecast models for these metro areas were evaluated on both the calibration data sets and independent data sets. These NLR ozone forecast models explained at least 72% of the variance of the daily peak ozone. Using the models to predict ozone concentrations during the 2005 ozone season, the metro area mean absolute errors (MAEs) of the model hindcasts ranged from 5.90 ppb to 7.20 ppb. For the model raw forecasts, the metro area MAEs ranged from 7.90 ppb to 9.80 ppb. Based on the previously developed NLR ozone forecast models for those areas, Takagi-Sugeno fuzzy system models were developed for the seven metro areas. The fuzzy c-means clustering technique coupled with an optimal output predefuzzification approach (least squares method) was used to train the Takagi-Sugeno fuzzy system. Two types of fuzzy models, basic fuzzy and NLR-fuzzy system models, were developed. The basic fuzzy and NLR-fuzzy models exhibited essentially equivalent performance to the existing NLR models on 2004 ozone season hindcasts and forecasts. Both types of fuzzy models had, on average, slightly lower metro area averaged MAEs than the NLR models. Among the seven Kentucky metro areas, Ashland, Covington, and Louisville are currently designated nonattainment areas for both ground-level O3 and PM2.5. In this study, summer PM2.5 forecast models were developed for providing daily average PM2.5 forecasts for the seven metro areas. The performance of the PM2.5 forecast models was generally not as good as that of the ozone forecast models. For the summer 2004 model hindcasts, the metro-area average MAE was 5.33 µg/m³. Exploratory research was conducted to find the relationship between winter PM2.5 concentrations and meteorological parameters and other derived prediction parameters. Winter PM2.5 forecast models were developed for seven selected metro areas in Kentucky. For the model fits, the MAE for the seven forecast models ranged from 3.23 µg/m³ to 4.61 µg/m³ (~26-28% NMAE). The fuzzy technique was also applied to the PM2.5 forecast models to seek more accurate PM2.5 predictions. The NLR-fuzzy PM2.5 models had slightly better performance than the NLR models.
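
    A minimal sketch of the modelling recipe named above, under the assumption of a generic Takagi-Sugeno setup: fuzzy c-means locates cluster centres in the predictor space, normalised Gaussian memberships around those centres serve as rule firing strengths, and the linear rule consequents are fitted jointly by least squares ("output predefuzzification"). The temperature/wind/ozone data are synthetic and the predictors and model sizes are illustrative only, not the study's calibrated NLR-fuzzy models.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means; returns the cluster centres."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers

def ts_fit(X, y, centers, sigma):
    """Fit Takagi-Sugeno rule consequents (local linear models) by least
    squares, using normalised Gaussian firing strengths around the centres."""
    def firing(Z):
        d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return w / w.sum(axis=1, keepdims=True)
    def design(Z):
        Zb = np.hstack([np.ones((len(Z), 1)), Z])
        W = firing(Z)
        return np.hstack([W[:, [k]] * Zb for k in range(centers.shape[0])])
    theta, *_ = np.linalg.lstsq(design(X), y, rcond=None)
    return lambda Z: design(Z) @ theta

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 300
    temp = rng.uniform(15, 38, n)        # daily max temperature, degC (synthetic)
    wind = rng.uniform(0.5, 8.0, n)      # wind speed, m/s (synthetic)
    ozone = 20 + 2.2 * temp - 3.0 * wind + rng.normal(0, 5, n)  # synthetic peak O3, ppb
    X = np.column_stack([temp, wind])
    centers = fuzzy_c_means(X, c=3)
    predict = ts_fit(X, ozone, centers, sigma=X.std())
    mae = np.abs(predict(X) - ozone).mean()
    print(f"hindcast MAE: {mae:.2f} ppb")
```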

    Air pollution Analysis with a PFCM Clustering Algorithm Applied in a Real Database of Salamanca (Mexico)

    Over the last ten years, Salamanca has been considered among the most polluted cities in México. Nowadays, there is an Automatic Environmental Monitoring Network (AEMN) which measures air pollutants (sulphur dioxide (SO2), particulate matter (PM10), ozone (O3), etc.) as well as environmental variables (wind speed, wind direction, temperature, and relative humidity), taking a sample of each variable every minute. The AEM Network is mainly based on three monitoring stations located at Cruz Roja, DIF, and Nativitas. In this work, we use the PFCM (Possibilistic Fuzzy c-Means) clustering algorithm as a means to obtain a combined measure from the three stations, aiming to provide a tool for better management of contingencies in the city, such that local or general action can be taken according to the pollution level given by each station and the combined measure. In addition, we performed an analysis of the correlation between pollution and environmental variables. The results show a significant correlation between pollutant concentrations and some environmental variables, so the combined measure and the correlations can be used to establish general contingency thresholds.
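
    The correlation part of the analysis can be sketched with a few lines of numpy: Pearson correlations between pollutant concentrations and meteorological variables, here on synthetic stand-in data rather than the actual AEMN minute records.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1440  # one day of minute samples (synthetic)
temperature = rng.normal(22, 4, n)
wind_speed = rng.gamma(2.0, 1.5, n)
# Synthetic pollutants loosely coupled to the meteorology, for illustration only.
pm10 = 60 - 4.0 * wind_speed + rng.normal(0, 8, n)
o3 = 15 + 1.8 * temperature + rng.normal(0, 6, n)
so2 = rng.gamma(3.0, 2.0, n)

variables = {"SO2": so2, "PM10": pm10, "O3": o3,
             "wind_speed": wind_speed, "temperature": temperature}
names = list(variables)
corr = np.corrcoef(np.vstack([variables[k] for k in names]))

print("Pearson correlation matrix")
print("            " + " ".join(f"{k:>12s}" for k in names))
for i, k in enumerate(names):
    print(f"{k:>12s}" + " ".join(f"{corr[i, j]:12.2f}" for j in range(len(names))))
```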

    Development of ozone forecast models for selected Kentucky metropolitan areas.

    Ground-level ozone forecast models were developed for the following middle and small metropolitan areas in Kentucky: Ashland, Bowling Green, Owensboro, and Paducah. These were nonlinear regression models based on models previously developed for Louisville and Lexington. For each of the four cities, the mean absolute errors (MAE) of the model estimates, based on the 1998-2002 model-fitted data sets, were less than 7.7 ppb; the MAE/O3 ratios were less than 12.7%. The models could explain at least 66% of the variance of the daily peak ozone. On average, the errors of the models were within +/- 15.0 ppb on 88% of days, and within +/- 10.0 ppb on 73% of days. Using an alarm threshold of 80 ppb, the detection rates for National Ambient Air Quality Standard (NAAQS) exceedances ranged from 0.48 to 0.67 for the four cities. The corresponding false alarm rates ranged from 0.29 to 0.44. The results of this study demonstrate that the ozone forecast models for each of the four cities can be expected to be useful tools for making next-day forecasts of local ground-level O3 in those areas. Similar models, updated using 2003 data, will be used during the 2004 O3 season to provide daily automated forecasts for these metropolitan areas.
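
    The evaluation statistics quoted above (MAE, detection rate, and false alarm rate at the 80 ppb alarm threshold) can be computed as in the sketch below. The observed/forecast values are illustrative placeholders, and the false alarm rate uses one common definition (false alarms divided by all alarms), since the abstract does not state which definition was used.

```python
import numpy as np

def evaluate(observed, forecast, threshold=80.0):
    """MAE plus exceedance detection and false alarm rates at the threshold."""
    observed = np.asarray(observed, float)
    forecast = np.asarray(forecast, float)
    mae = np.abs(forecast - observed).mean()
    exceed_obs = observed >= threshold
    exceed_fc = forecast >= threshold
    hits = np.sum(exceed_obs & exceed_fc)
    misses = np.sum(exceed_obs & ~exceed_fc)
    false_alarms = np.sum(~exceed_obs & exceed_fc)
    detection_rate = hits / (hits + misses) if (hits + misses) else float("nan")
    false_alarm_rate = false_alarms / (hits + false_alarms) if (hits + false_alarms) else float("nan")
    return mae, detection_rate, false_alarm_rate

if __name__ == "__main__":
    obs = [62, 85, 78, 91, 70, 83, 66, 95]   # daily peak O3, ppb (synthetic)
    fc = [58, 82, 81, 84, 72, 76, 69, 90]    # next-day forecasts (synthetic)
    mae, pod, far = evaluate(obs, fc)
    print(f"MAE={mae:.1f} ppb  detection rate={pod:.2f}  false alarm rate={far:.2f}")
```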

    Rule-Based System Architecting of Earth Observing Systems: Earth Science Decadal Survey

    This paper presents a methodology to explore the architectural trade space of Earth observing satellite systems, and applies it to the Earth Science Decadal Survey. The architecting problem is formulated as a combinatorial optimization problem with three sets of architectural decisions: instrument selection, assignment of instruments to satellites, and mission scheduling. A computational tool was created to automatically synthesize architectures based on valid combinations of options for these three decisions and evaluate them according to several figures of merit, including satisfaction of program requirements, data continuity, affordability, and proxies for fairness, technical, and programmatic risk. A population-based heuristic search algorithm is used to search the trade space. The novelty of the tool is that it uses a rule-based expert system to model the knowledge-intensive components of the problem, such as scientific requirements, and to capture the nonlinear positive and negative interactions between instruments (synergies and interferences), which drive both requirement satisfaction and cost. The tool is first demonstrated on the past NASA Earth Observing System program and then applied to the Decadal Survey. Results suggest that the Decadal Survey architecture is dominated by other more distributed architectures in which DESDYNI and CLARREO are consistently broken down into individual instruments. "La Caixa" Foundation; Charles Stark Draper Laboratory; Goddard Space Flight Center.
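
    The combinatorial formulation described above can be illustrated with a toy sketch: an architecture is an instrument selection plus an assignment of the selected instruments to satellites, and a simple score rewards synergistic pairings, penalises interfering ones, and discourages exceeding a budget. The instrument names, synergy/interference pairs, costs, and the random search are all hypothetical stand-ins for the paper's rule-based evaluation and population-based heuristic.

```python
import itertools
import random

INSTRUMENTS = ["radar", "lidar", "radiometer", "spectrometer"]
COST = {"radar": 300, "lidar": 250, "radiometer": 120, "spectrometer": 150}  # $M, made up
SYNERGY = {frozenset({"radar", "lidar"}): 40}             # bonus if flown together
INTERFERENCE = {frozenset({"radar", "radiometer"}): -30}  # penalty if flown together

def score(architecture, budget=700):
    """architecture: list of satellites, each a set of instrument names."""
    selected = [i for sat in architecture for i in sat]
    total_cost = sum(COST[i] for i in selected)
    value = 50 * len(set(selected))           # crude proxy for requirement satisfaction
    for sat in architecture:
        for pair in itertools.combinations(sat, 2):
            value += SYNERGY.get(frozenset(pair), 0)
            value += INTERFERENCE.get(frozenset(pair), 0)
    if total_cost > budget:
        value -= 5 * (total_cost - budget)    # soft affordability penalty
    return value

def random_architecture(rng, max_sats=3):
    sats = [set() for _ in range(rng.randint(1, max_sats))]
    for inst in INSTRUMENTS:
        if rng.random() < 0.8:                # instrument selection decision
            rng.choice(sats).add(inst)        # assignment decision
    return [s for s in sats if s]

rng = random.Random(0)
best = max((random_architecture(rng) for _ in range(2000)), key=score)
print("best architecture found:", best, "score:", score(best))
```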

    Life Cycle-based Environmental Performance Indicator for the Coal-to-energy Supply Chain: A Chinese Case Application

    Coal consumption and energy production (CCEP) has received increasing attention since coal-fired power plants play a dominant role in the power sector worldwide. In China, coal is expected to retain its primary energy position over the next few decades. However, a large share of CO2 emissions and other environmental hazards, such as SO2 and NOx, are attributed to coal consumption. Therefore, understanding the environmental implications of the life cycle of coal, from its production in coal mines to its consumption at coal-fired power plants, is an essential task. Evaluation of such environmental burdens can be conducted using the life cycle assessment (LCA) tool. The main issues with traditional LCA results are the lack of a numerical magnitude associated with the performance level of the obtained environmental burden values and the inherent uncertainty associated with the output results. These issues were addressed in this research by integrating the traditional LCA methodology with a weighted fuzzy inference system model, which is applied to a Chinese coal-to-energy supply chain system to demonstrate its applicability and effectiveness. For the coal-to-energy supply chain under investigation, the CCEP environmental performance was determined to be "medium performance", with an indicator score of 39.15%. Accordingly, the decision makers suggested additional scenarios (redesign, equipment replacement, etc.) to improve the performance. A scenario-based analysis was designed to identify alternative paths to mitigate the environmental impact of the coal-to-energy supply chain. Finally, limitations and possible future work are discussed, and the conclusions are presented.
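
    A minimal sketch of the weighted fuzzy scoring idea: each impact category's normalised burden gets membership degrees in low/medium/high fuzzy sets, a centre-of-sets style defuzzification turns these into a per-category performance score, and a weighted sum yields an indicator in percent that is mapped to a linguistic label. The category names, breakpoints, and weights are hypothetical, not the values used in the Chinese case study.

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def category_performance(normalised_burden):
    """Map a normalised burden in [0, 1] (1 = worst) to a performance score
    in [0, 1] by weighting low/medium/high burden memberships."""
    low = tri(normalised_burden, -0.5, 0.0, 0.5)
    med = tri(normalised_burden, 0.0, 0.5, 1.0)
    high = tri(normalised_burden, 0.5, 1.0, 1.5)
    # Centre-of-sets style defuzzification: low burden -> good performance.
    return (1.0 * low + 0.5 * med + 0.0 * high) / (low + med + high)

def supply_chain_indicator(burdens, weights):
    """Weighted per-category performance, expressed as a percentage."""
    total_w = sum(weights.values())
    score = sum(weights[k] * category_performance(v) for k, v in burdens.items())
    return 100.0 * score / total_w

if __name__ == "__main__":
    # Hypothetical normalised burdens and weights for three impact categories.
    burdens = {"global_warming": 0.72, "acidification": 0.55, "eutrophication": 0.40}
    weights = {"global_warming": 0.5, "acidification": 0.3, "eutrophication": 0.2}
    pct = supply_chain_indicator(burdens, weights)
    label = "high" if pct >= 66 else "medium" if pct >= 33 else "low"
    print(f"environmental performance indicator: {pct:.2f}% ({label} performance)")
```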

    Layer of protection analysis applied to ammonia refrigeration systems

    Ammonia refrigeration systems are widely used in industry. Demand for these systems is expected to increase due to the advantages of ammonia as a refrigerant and because ammonia is considered a green refrigerant. Therefore, it is important to evaluate the risks in existing and future ammonia refrigeration systems to ensure their safety. LOPA (Layer of Protection Analysis) is one of the best ways to estimate risk; it provides quantified risk results with less effort and time than other methods. LOPA analyses one cause-consequence scenario at a time. It requires failure data and the PFD (probability of failure on demand) of the independent protection layers available to prevent the scenario. Complete application of LOPA requires estimating the severity of the consequences and the mitigated frequency of the initiating event for risk calculations. Especially in existing ammonia refrigeration systems, the information needed to develop LOPA is sometimes scarce and uncertain. In these cases, the analysis relies on expert opinion to determine the values of the variables required for risk estimation. Fuzzy logic has proven useful in this situation, allowing the construction of expert systems. Based on fuzzy logic, the LOPA method was adapted to represent the knowledge available in standards and good industry practices for ammonia refrigeration. Fuzzy inference systems were developed for severity and risk calculation. The severity fuzzy inference system uses the number of life-threatening injuries or deaths, the number of injuries, and the type of medical attention required to calculate a severity risk index. The frequency of the mitigated scenario is calculated using generic data for the initiating event frequency and the PFDs of the independent protection layers. Finally, the risk fuzzy inference system uses the frequency and severity values obtained to determine the risk of the scenario. The methodology was applied to four scenarios; risk indexes were calculated, compared with the traditional approach, and risk decisions were made. In conclusion, the fuzzy logic LOPA method provides good approximations of the risk for ammonia refrigeration systems, and the technique can be useful for risk assessment of existing ammonia refrigeration systems.
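
    The frequency arithmetic behind LOPA can be sketched in a few lines: the mitigated frequency of a scenario is the initiating-event frequency multiplied by the PFDs of the credited independent protection layers, and the result is compared against a tolerable frequency for the assessed severity. All numbers below are generic illustrations, not values from the study or from any standard.

```python
from math import prod

def mitigated_frequency(initiating_freq_per_yr, pfds):
    """Frequency (per year) of the consequence with all protection layers credited."""
    return initiating_freq_per_yr * prod(pfds)

if __name__ == "__main__":
    # Hypothetical scenario: pump seal failure releasing ammonia.
    f_initiating = 0.1            # events per year (generic failure data)
    pfds = [0.1, 0.01]            # e.g. alarm + operator response, relief valve
    f_mitigated = mitigated_frequency(f_initiating, pfds)
    tolerable = 1e-4              # assumed tolerable frequency for this severity
    print(f"mitigated frequency: {f_mitigated:.1e} /yr")
    print("risk decision:",
          "acceptable" if f_mitigated <= tolerable else "additional safeguards needed")
```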

    Developing process control and operation of wastewater treatment plants with data analytics: examples from industry and international treatment plants

    Instrumentation, control and automation are central to the operation of municipal wastewater treatment plants. Treatment performance can be further improved and secured by processing and analyzing the collected process and equipment data. New challenges from resource efficiency, climate change and aging infrastructure increase the demand for understanding and controlling plant-wide interactions. This study reviews what needs, barriers, incentives and opportunities Finnish wastewater treatment plants have for developing current process control and operation systems with data analytics. The study is conducted through interviews, thematic analysis and case studies of real-life applications in process industries and international utilities. Results indicate that for many utilities, additional measures for quality assurance of instruments, equipment and controllers are necessary before advanced control strategies can be applied. Readily available data could already be used to improve the operational reliability of the process. In total, 14 case studies of advanced data processing, analysis and visualization methods used in Finnish and international wastewater treatment plants, as well as in Finnish process industries, are reviewed. Examples include process optimization and quality assurance solutions that have proven benefits in operational use. The applicability of these solutions to the identified development needs is initially evaluated. Some of the examples are estimated to have direct potential for application in Finnish WWTPs; for other case studies, further piloting or research efforts are suggested to assess the feasibility and cost-benefits for WWTPs. As plant operation becomes more centralized and outsourced in the future, the need for applying data analytics is expected to increase.