574 research outputs found

    International Conference on Computer Science

    Get PDF
    UBT Annual International Conference is the 11th international interdisciplinary peer-reviewed conference, which publishes the work of scientists as well as practitioners in the areas where UBT is active in education, research and development. UBT aims to implement an integrated strategy to establish itself as an internationally competitive, research-intensive university, committed to the transfer of knowledge and the provision of a world-class education to the most talented students from all backgrounds. The main purpose of the conference is to bring together scientists and practitioners from different disciplines in the same place, make them aware of recent advances in different research fields, and provide them with a unique forum to share their experiences. It is also a venue that supports new academic staff in carrying out research and publishing their work to an international standard. The conference consists of sub-conferences in different fields: Art and Digital Media; Agriculture, Food Science and Technology; Architecture and Spatial Planning; Civil Engineering, Infrastructure and Environment; Computer Science and Communication Engineering; Dental Sciences; Education and Development; Energy Efficiency Engineering; Integrated Design; Information Systems and Security; Journalism, Media and Communication; Law; Language and Culture; Management, Business and Economics; Modern Music, Digital Production and Management; Medicine and Nursing; Mechatronics, System Engineering and Robotics; Pharmaceutical and Natural Sciences; Political Science; Psychology; Sport, Health and Society; and Security Studies. This conference is the major scientific event of UBT. It is organized annually, always in cooperation with partner universities from the region and Europe. We thank all authors, partners, sponsors and the conference organizing team for making this event a truly international scientific event. 
Edmond Hajrizi, President of UBT. UBT – Higher Education Institution

    A mobile assisted coverage hole patching scheme based on particle swarm optimization for WSNs

    Get PDF
    Wireless sensor networks (WSNs) have drawn much research attention in recent years due to their superior performance in multiple applications, such as military and industrial monitoring, smart homes and disaster restoration. In such applications, massive numbers of sensor nodes are randomly deployed and remain static after deployment in order to fully cover the target sensing area, which usually causes coverage redundancy or coverage holes. In order to deploy sensors effectively to cover the whole area, we present a novel node deployment algorithm based on mobile sensors. First, sensor nodes are randomly deployed in the target area, and they remain static or switch to sleep mode after deployment. Second, we partition the network into grids and calculate the coverage rate of each grid, selecting the grids with lower coverage rates as candidate grids. Finally, we wake mobile sensors from sleep mode to patch the coverage holes; a particle swarm optimization (PSO) algorithm is used to calculate the moving positions of the mobile sensors. Simulation results show that our algorithm can effectively improve the coverage rate of WSNs.
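The grid-partitioning and PSO steps described in the abstract can be sketched as follows. This is an illustrative toy only, not the authors' implementation: the field size, sensing radius, grid resolution and PSO weights are invented assumptions, and coverage is approximated at grid-cell centers.

```python
# Toy sketch: partition the field into grid cells, mark low-coverage cells as
# holes, then run a standard PSO to place one mobile sensor over the holes.
# All parameters (AREA, GRID, SENSE_R, PSO weights) are illustrative.
import math
import random

random.seed(0)

AREA, GRID, SENSE_R = 100.0, 10, 15.0   # field side, cells per side, sensing radius

def covered(pt, sensors):
    return any(math.dist(pt, s) <= SENSE_R for s in sensors)

def grid_coverage(sensors):
    """Coverage of each grid cell, sampled at the cell center (coarse proxy)."""
    step = AREA / GRID
    return {(i, j): 1.0 if covered(((i + 0.5) * step, (j + 0.5) * step), sensors) else 0.0
            for i in range(GRID) for j in range(GRID)}

def fitness(pos, holes):
    """Number of hole centers a mobile sensor at `pos` would cover."""
    return sum(1 for h in holes if math.dist(pos, h) <= SENSE_R)

def pso_place(holes, n_particles=20, iters=50):
    # standard PSO: inertia w, cognitive weight c1, social weight c2
    w, c1, c2 = 0.7, 1.5, 1.5
    parts = [[random.uniform(0, AREA), random.uniform(0, AREA)] for _ in range(n_particles)]
    vels = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in parts]
    pscore = [fitness(p, holes) for p in parts]
    g = max(range(n_particles), key=lambda k: pscore[k])
    gbest, gscore = pbest[g][:], pscore[g]
    for _ in range(iters):
        for k in range(n_particles):
            for d in range(2):
                vels[k][d] = (w * vels[k][d]
                              + c1 * random.random() * (pbest[k][d] - parts[k][d])
                              + c2 * random.random() * (gbest[d] - parts[k][d]))
                parts[k][d] = min(max(parts[k][d] + vels[k][d], 0.0), AREA)
            s = fitness(parts[k], holes)
            if s > pscore[k]:
                pbest[k], pscore[k] = parts[k][:], s
                if s > gscore:
                    gbest, gscore = parts[k][:], s
    return gbest, gscore

static = [(random.uniform(0, AREA), random.uniform(0, AREA)) for _ in range(8)]
holes = [c for c, r in grid_coverage(static).items() if r < 1.0]
hole_centers = [((i + 0.5) * AREA / GRID, (j + 0.5) * AREA / GRID) for i, j in holes]
pos, score = pso_place(hole_centers)
print(pos, score)
```

In the paper's scheme multiple sleeping mobile sensors would be woken and placed iteratively; the sketch shows a single placement step.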

    A dependability framework for WSN-based aquatic monitoring systems

    Get PDF
    Wireless Sensor Networks (WSN) are being progressively used in several application areas, particularly to collect data and monitor physical processes. Moreover, sensor nodes used in environmental monitoring applications, such as aquatic sensor networks, are often subject to harsh environmental conditions while monitoring complex phenomena. Non-functional requirements, like reliability, security or availability, are increasingly important and must be accounted for in application development. For that purpose, there is a large body of knowledge on dependability techniques for distributed systems, which provides a good basis for understanding how to satisfy these non-functional requirements of WSN-based monitoring applications. Given the data-centric nature of monitoring applications, it is of particular importance to ensure that data is reliable or, more generically, that it has the necessary quality. The problem of ensuring the desired quality of data for dependable monitoring using WSNs is studied herein. From a dependability-oriented perspective, the possible impairments to dependability and the prominent existing solutions to solve or mitigate these impairments are reviewed. Despite the variety of components that may form a WSN-based monitoring system, particular attention is given to understanding which faults can affect sensors, how they can affect the quality of the information, and how this quality can be improved and quantified. Open research issues for the specific case of aquatic monitoring applications are also discussed. One of the challenges in achieving dependable system behavior is to overcome the external disturbances affecting sensor measurements and detect failure patterns in sensor data. This is a particular problem in environmental monitoring, due to the difficulty in distinguishing faulty behavior from the representation of a natural phenomenon. 
Existing solutions for failure detection assume that physical processes can be accurately modeled, or that there are large deviations that may be detected using coarse techniques, or, more commonly, that the network is a high-density sensor network with value-redundant sensors. This thesis defines a new methodology for dependable data quality in environmental monitoring systems, aiming to detect faulty measurements and increase the quality of the sensor data. The framework of the methodology is presented through a generically applicable design that can be employed on any environmental sensor network dataset. The methodology is evaluated on various datasets from different WSNs, where machine learning is used to model each sensor's behavior, exploiting the correlated data provided by neighboring sensors. The aim is to explore data fusion strategies in order to effectively detect potential failures for each sensor and, simultaneously, distinguish truly abnormal measurements from deviations due to natural phenomena. This is accomplished with the successful application of the methodology to detect and correct outlier, offset and drifting failures in datasets from real monitoring networks. In the future, the methodology can be applied to optimize the data quality control processes of new and already operating monitoring networks, and to assist in network maintenance operations. Laboratório Nacional de Engenharia Civil
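The core idea of the methodology, modeling each sensor from correlated neighbor sensors and flagging large residuals as potential faults, can be sketched as below. This is a minimal illustration under invented assumptions (synthetic data, a linear least-squares model, a fixed residual threshold), not the thesis code.

```python
# Sketch: predict one sensor from two correlated neighbors via least squares,
# then flag measurements whose residual exceeds 4 standard deviations of the
# training residuals as potential faults (here, an injected offset fault).
import numpy as np

rng = np.random.default_rng(42)

# Synthetic water-temperature signal seen by three neighboring sensors.
t = np.linspace(0, 10, 500)
truth = 15 + 3 * np.sin(t)
neighbors = np.stack([truth + rng.normal(0, 0.1, t.size) for _ in range(2)], axis=1)
target = truth + rng.normal(0, 0.1, t.size)
target[200:210] += 4.0                       # injected offset fault

def fit_predict(X, y, train_mask):
    """Least-squares model y ≈ X @ w + b, trained on fault-free history."""
    A = np.column_stack([X[train_mask], np.ones(train_mask.sum())])
    w, *_ = np.linalg.lstsq(A, y[train_mask], rcond=None)
    return np.column_stack([X, np.ones(len(X))]) @ w

train = np.zeros(t.size, dtype=bool)
train[:150] = True                           # assume early data is validated
pred = fit_predict(neighbors, target, train)
resid = target - pred
thresh = 4 * resid[train].std()
faulty = np.abs(resid) > thresh

print(int(faulty.sum()), bool(faulty[200:210].all()))
```

A real deployment would use richer models and would also have to distinguish correlated natural events from faults, which is exactly the difficulty the abstract highlights.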

    Sustainable marine ecosystems: deep learning for water quality assessment and forecasting

    Get PDF
    An appropriate management of the available resources within oceans and coastal regions is vital to guarantee their sustainable development and preservation, and water quality is a key element. Leveraging a combination of cross-disciplinary technologies including Remote Sensing (RS), the Internet of Things (IoT), Big Data, cloud computing and Artificial Intelligence (AI) is essential to attain this aim. In this paper, we review methodologies and technologies for water quality assessment that contribute to a sustainable management of marine environments. Specifically, we focus on Deep Learning (DL) strategies for water quality estimation and forecasting. The analyzed literature is classified according to the type of task, scenario and architecture. Moreover, several applications, including coastal management and aquaculture, are surveyed. Finally, we discuss open issues still to be addressed and potential research lines where transfer learning, knowledge fusion, reinforcement learning, edge computing and decision-making policies are expected to be the main agents involved. Postprint (published version)

    Remote Sensing

    Get PDF
    This dual conception of remote sensing brought us to the idea of preparing two different books; in addition to the first book, which presents recent advances in remote sensing applications, this book is devoted to new techniques for data processing, sensors and platforms. We do not intend this book to cover all aspects of remote sensing techniques and platforms, since that would be an impossible task for a single volume. Instead, we have collected a number of high-quality, original and representative contributions in those areas.

    Correction of meteorological vehicle-based measurements for road weather monitoring in pursuit of enabling safe automated driving

    Get PDF
    To produce pinpoint forecasts, and thus reliable warnings, of weather-related and potentially hazardous local road conditions in the future, meteorological data with high temporal and spatial resolution are required. This work examines the usability of vehicle-based measurements from the sensors currently installed in series-production vehicles. The goal of this work is to investigate whether, and to what extent, a correction of the vehicle-based data increases their potential to improve the spatial and temporal resolution of meteorological data. The raw vehicle measurements deviate strongly from the reference data used, caused both by stationary effects, such as measurement inaccuracy and the sensor's installation position, and by motion-related effects, such as the influence of engine waste heat at low speeds. To investigate these influences, a globally unique dataset with parallel data from series-production vehicles and reference instruments was first compiled in measurement campaigns. This work then performs quality control and correction for four meteorological parameters: air pressure, air temperature, relative humidity and global radiation. The raw data are of insufficient quality for meteorological applications. The correction methods developed and implemented, both physical and machine-learning-based, achieve significant improvements of the available data for air pressure, air temperature and relative humidity. For air temperature, all tested models achieve comparably good results, whereas for relative humidity the machine-learning-based models deliver higher-quality results than the physical model. For this parameter, the machine learning models bring over 95% of the data within the single measurement uncertainty. Based on the available data, no generally valid statement can be made about the transferability and effectiveness on other vehicles and in scenarios other than those tested here. The correction of global radiation already achieves a significant quality improvement in stationary situations; for the correction of mobile data recorded while driving, the potential for quality improvement has not yet been exhausted. This work demonstrates the necessity of correcting the raw vehicle-based data and shows the potential of the associated quality improvement. Further investigations, above all regarding transferability to fleet data, as well as a larger data basis, are necessary in order to make a generally valid statement about the quality improvement and to develop the corrections further towards series-production readiness
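The correction idea for a motion-related effect can be sketched as follows. The data, the exponential bias shape and the decay scale are invented assumptions for illustration; the thesis uses real campaign data and both physical and machine-learning models.

```python
# Sketch: vehicle air-temperature readings are biased warm at low speeds
# (engine waste heat); learn the speed-dependent bias against a reference
# station by least squares, then subtract it. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
speed = rng.uniform(0, 130, 1000)                  # km/h
ref_temp = 12 + rng.normal(0, 0.2, speed.size)     # reference air temperature, °C
bias = 3.0 * np.exp(-speed / 20.0)                 # stronger bias when driving slowly
vehicle_temp = ref_temp + bias + rng.normal(0, 0.3, speed.size)

# Fit bias ≈ a * exp(-speed/20) + c on a calibration campaign, then correct.
A = np.column_stack([np.exp(-speed / 20.0), np.ones(speed.size)])
coef, *_ = np.linalg.lstsq(A, vehicle_temp - ref_temp, rcond=None)
corrected = vehicle_temp - A @ coef

rmse_raw = np.sqrt(np.mean((vehicle_temp - ref_temp) ** 2))
rmse_corr = np.sqrt(np.mean((corrected - ref_temp) ** 2))
print(round(rmse_raw, 2), round(rmse_corr, 2))
```

After correction the residual error is dominated by the sensor noise alone, which mirrors the abstract's finding that corrected temperature data approach the reference quality.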

    Assessing uncertainties of in situ FAPAR measurements across different forest ecosystems

    Get PDF
    Carbon balances are important for understanding global climate change. Assessing such balances on a local scale depends on accurate measurements of material flows to calculate the productivity of the ecosystem. The productivity of the Earth's biosphere, in turn, depends on the ability of plants to absorb sunlight and assimilate biomass. Over the past decades, numerous Earth observation missions from satellites have created new opportunities to derive so-called “essential climate variables” (ECVs), including important variables of the terrestrial biosphere, that can be used to assess the productivity of our Earth system. One of these ECVs is the “fraction of absorbed photosynthetically active radiation” (FAPAR), which is needed to calculate the global carbon balance. FAPAR relates the available photosynthetically active radiation (PAR) in the wavelength range between 400 and 700 nm to the absorption by plants and thus quantifies the status and temporal development of vegetation. In order to ensure accurate datasets of global FAPAR, the UN/WMO institution “Global Climate Observing System” (GCOS) declared an accuracy target of 10% (or 0.05) as acceptable for FAPAR products. Since current satellite-derived FAPAR products still fail to meet this accuracy target, especially in forest ecosystems, in situ FAPAR measurements are needed to validate FAPAR products and improve them in the future. However, it is known that in situ FAPAR measurements can be affected by significant systematic errors (i.e., bias) as well as statistical errors, depending on the choice of measurement method and the prevailing environmental conditions. So far, uncertainties of in situ FAPAR have been reproduced theoretically in simulations with radiation transfer models (RTMs), but the findings have been validated neither in field experiments nor in different forest ecosystems. However, an uncertainty assessment of FAPAR in field experiments is essential to develop practicable measurement protocols. 
This work investigates the accuracy of in situ FAPAR measurements and sources of uncertainty based on multi-year, 10-minute PAR measurements with wireless sensor networks (WSNs) at three sites on three continents, representing different forest ecosystems: a mixed spruce forest at the site “Graswang” in Southern Germany, a boreal deciduous forest at the site “Peace River” in Northern Alberta, Canada, and a tropical dry forest (TDF) at the site “Santa Rosa”, Costa Rica. The main statements of the research results achieved in this thesis are briefly summarized below. Uncertainties of instantaneous FAPAR in forest ecosystems can be assessed with wireless sensor networks and additional meteorological and phenological observations. In this thesis, two methods for a FAPAR bias assessment have been developed. First, to assess the bias of the so-called two-flux FAPAR estimate, the difference between FAPAR acquired under diffuse light conditions and two-flux FAPAR acquired under clear-sky conditions can be investigated. For this, measurements of incoming and transmitted PAR are required to calculate the two-flux FAPAR estimate, as well as observations of the ratio of diffuse to total incident radiation. Second, to assess the bias of not only the two- but also the three-flux FAPAR estimate, four-flux FAPAR observations must be carried out, i.e. additional measurements of top-of-canopy (TOC) PAR albedo and the PAR albedo of the forest background. Then, to quantify the bias of the two- and three-flux estimates, the difference from the four-flux estimate can be calculated. The main sources of uncertainty in in situ FAPAR measurements are a high solar zenith angle, the occurrence of colored leaves and increased wind speed. At all sites, FAPAR observations exhibited considerable seasonal variability due to the phenological development of the forests (Graswang: 0.89 to 0.99 ±0.02; Peace River: 0.55 to 0.87 ±0.03; Santa Rosa: 0.45 to 0.97 ±0.06). 
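The two-, three- and four-flux FAPAR estimates described above can be written directly in terms of the four measured PAR fluxes. The following sketch uses invented example values; only the flux-balance formulas reflect the text.

```python
# FAPAR estimates from the four PAR fluxes: incoming PAR, transmitted PAR,
# PAR reflected at the top of canopy (TOC), and PAR reflected by the ground.
def fapar_two_flux(par_in, par_trans):
    # two-flux: ignores PAR reflected at the TOC and by the forest background
    return 1.0 - par_trans / par_in

def fapar_three_flux(par_in, par_trans, par_refl_toc):
    # three-flux: additionally subtracts the TOC PAR albedo term
    return 1.0 - par_trans / par_in - par_refl_toc / par_in

def fapar_four_flux(par_in, par_trans, par_refl_toc, par_refl_ground):
    # four-flux: absorbed = (down - up at the TOC) - (down - up at the ground)
    return ((par_in - par_refl_toc) - (par_trans - par_refl_ground)) / par_in

# Invented example: 1500 µmol m⁻² s⁻¹ incoming, 120 transmitted,
# 45 reflected at the TOC, 10 reflected back up by the forest floor.
f2 = fapar_two_flux(1500, 120)
f3 = fapar_three_flux(1500, 120, 45)
f4 = fapar_four_flux(1500, 120, 45, 10)
print(round(f2, 3), round(f3, 3), round(f4, 3))  # 0.92 0.89 0.897
```

The bias assessment in the text is then simply the difference of the two- or three-flux value from the four-flux value for the same observation.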
Under certain environmental conditions, FAPAR was affected by systematic errors, i.e. biases that go beyond phenologically explainable fluctuations. The in situ observations confirmed a significant overestimation of FAPAR by up to 0.06 at solar zenith angles above 60° and by up to 0.05 under the occurrence of colored leaves of deciduous trees. The results confirm theoretical findings from radiation transfer simulations, which could now for the first time be quantified under field conditions. As a new finding, the influence of wind speed could be shown, which was particularly evident at the boreal site, with a significant bias of FAPAR values at wind speeds above 5 m s⁻¹. The uncertainties of the two-flux FAPAR estimate are acceptable under typical summer conditions. Three-flux or four-flux FAPAR measurements do not necessarily increase the accuracy of the estimate. The highest average relative biases of the different FAPAR estimates were 2.1% in Graswang, 8.4% in Peace River and -4.5% in Santa Rosa. Thus, the accuracy threshold of 10% set by the GCOS was generally not exceeded. The two-flux FAPAR estimate was only found to be biased during high wind speeds, as changes in the TOC PAR albedo are not considered in two-flux FAPAR measurements. Under typical summer conditions, i.e. low wind speed, small solar zenith angle and green leaves, two-flux FAPAR measurements can be recommended for the validation of satellite-based FAPAR products. Based on the results obtained, it must be emphasized that the three-flux FAPAR estimate, which has often been preferred in previous studies, is not necessarily more accurate, which was particularly evident at the tropical site. The discrepancies between ground measurements and the current Sentinel-2 FAPAR product still largely exceed the GCOS target accuracy at the respective study sites, even when considering uncertainties of FAPAR ground measurements. 
It was found that the Sentinel-2 (S2) FAPAR product systematically underestimated the ground observations at all three study sites (i.e. negative values for the mean relative bias in percent). The highest agreement was observed at the boreal site Peace River, with a mean relative deviation of -13% (R²=0.67). At Graswang and Santa Rosa, the mean relative deviations were -20% (R²=0.68) and -25% (R²=0.26), respectively. It was argued that these high discrepancies resulted from both the generic nature of the algorithm and the higher ecosystem complexity of the Graswang and Santa Rosa sites. It was also found that the temporal aggregation method for the FAPAR ground data should be chosen carefully for comparison with the S2 FAPAR product, which refers to daily averages, as overestimation of FAPAR during high solar zenith angles could distort validation results. However, even considering uncertainties of the ground measurements, the S2 FAPAR product met the GCOS accuracy requirements only at the boreal study site. Overall, it has been shown that the S2 FAPAR product is already well suited to assess the temporal variability of FAPAR, but due to the low accuracy of the absolute values, the possibilities to feed global production efficiency models and evaluate global carbon balances are currently limited. The accuracy of satellite-derived FAPAR depends on the complexity of the observed forest ecosystem. The highest agreement between the satellite-derived FAPAR product and ground measurements, both in terms of absolute values and spatial variability, was achieved at the boreal site, where the complexity of the ecosystem is lowest considering forest structure variables and species richness. These results have been elaborated and presented in three publications that are at the center of this cumulative thesis. In sum, this work closes a knowledge gap by displaying the interplay of different environmental conditions on the accuracy of in situ FAPAR measurements. 
Since the uncertainties of FAPAR are now quantifiable under field conditions, they should also be considered in future validation studies. In this context, the practical recommendations for the implementation of ground observations given in this thesis can be used to prepare sampling protocols, which are urgently needed to validate and improve global satellite-derived FAPAR observations in the future

    Performance analysis of a two-level polling control system based on LSTM and attention mechanism for wireless sensor networks

    Get PDF
    A continuous-time exhaustive-limited (K = 2) two-level polling control system is proposed to address the growing network scale, service volume and network performance prediction needs of the Internet of Things (IoT), and a Long Short-Term Memory (LSTM) network with an attention mechanism is used for its predictive analysis. First, the central site uses an exhaustive service policy and the common sites use a limited (K = 2) service policy to establish the continuous-time exhaustive-limited (K = 2) two-level polling control system. Second, exact expressions for the average queue length, average delay and cycle period are derived using probability generating functions and Markov chains, and verified in MATLAB simulation experiments. Finally, an LSTM neural network with an attention mechanism is constructed for prediction. The experimental results show that the theoretical and simulated values essentially match, verifying the rationality of the theoretical analysis. The scheme not only differentiates priorities, ensuring that the central site receives quality service while remaining fair to the common sites, but also improves performance by 7.3% and 12.2% compared with the one-level exhaustive service and the one-level limited (K = 2) service, respectively. Compared with the two-level gated-exhaustive service model, the queue length and delay of the central site are smaller, indicating a higher priority for the central site in this model. Compared with the exhaustive-limited (K = 1) two-level model, it increases the number of information packets sent at once and has better latency performance, providing a stable and reliable guarantee for wireless network services with strict latency requirements. Building on this, a fast evaluation method is proposed: neural network prediction, which can accurately predict system performance as the system size increases and simplifies the calculations
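The service discipline can be illustrated with a toy slotted simulation: the server drains the central queue exhaustively, then serves at most K = 2 packets per common site per cycle. This is not the paper's continuous-time model; arrival rates, the number of sites and the one-slot switch-over overhead are invented assumptions.

```python
# Toy slotted simulation of a two-level polling scheme: exhaustive service at
# the central site, limited-K (K = 2) service at each common site.
import random

random.seed(7)

K, N_COMMON = 2, 4
LAM_CENTRAL, LAM_COMMON = 0.2, 0.05   # Bernoulli arrivals per service slot
SLOTS = 200_000

central, common = 0, [0] * N_COMMON
central_area = common_area = 0.0
slots = 0

def arrivals(lam):
    return 1 if random.random() < lam else 0

def step():
    """Advance one service slot: new arrivals, then accumulate queue areas."""
    global central, slots, central_area, common_area
    slots += 1
    central += arrivals(LAM_CENTRAL)
    for j in range(N_COMMON):
        common[j] += arrivals(LAM_COMMON)
    central_area += central
    common_area += sum(common)

while slots < SLOTS:
    # exhaustive service at the central site: serve until empty
    while central > 0 and slots < SLOTS:
        central -= 1
        step()
    # limited-K service: at most K packets per common site per cycle
    for i in range(N_COMMON):
        served = 0
        while common[i] > 0 and served < K and slots < SLOTS:
            common[i] -= 1
            served += 1
            step()
    step()  # one switch-over slot of overhead per polling cycle

avg_central = central_area / slots
avg_common_per_site = common_area / slots / N_COMMON
print(round(avg_central, 3), round(avg_common_per_site, 3))
```

With total load well below capacity the simulated queues stay short and stable, which is the regime in which the paper compares the average queue length and delay of the different policies.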