
    Information-Theoretic Methods for Identifying Relationships among Climate Variables

    Information-theoretic quantities, such as entropy, are used to quantify the amount of information a given variable provides. Entropies can be combined to compute the mutual information, which quantifies the amount of information two variables share. However, accurately estimating these quantities from data is extremely challenging. We have developed a set of computational techniques that allow one to accurately compute marginal and joint entropies. These algorithms are probabilistic in nature and thus provide information on the uncertainty in our estimates, which enables us to establish the statistical significance of our findings. We demonstrate these methods by identifying relations between cloud data from the International Satellite Cloud Climatology Project (ISCCP) and data from other sources, such as equatorial Pacific sea surface temperatures (SST). (3 pages, 3 figures. Presented at, and appears in the Proceedings of, the Earth-Sun System Technology Conference (ESTC 2008), Adelphi, MD. http://esto.nasa.gov/conferences/estc2008/)
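    As a rough illustration of the quantities involved (a minimal sketch, not the paper's probabilistic algorithm), the following Python code computes histogram-based marginal and joint entropies and the mutual information, and attaches bootstrap error bars as a stand-in for the paper's uncertainty estimates; the bin count and all variable names are illustrative assumptions.

```python
# Minimal sketch: plug-in (histogram) entropy and mutual information
# with bootstrap error bars. NOT the authors' probabilistic estimator;
# bin count, names, and data are illustrative assumptions.
import numpy as np

def entropy(samples, bins=16):
    """Plug-in Shannon entropy (bits) from a 1-D histogram."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=16):
    """MI(x; y) = H(x) + H(y) - H(x, y), all from histograms."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = counts / counts.sum()
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return h(p_xy.sum(axis=1)) + h(p_xy.sum(axis=0)) - h(p_xy)

def bootstrap_mi(x, y, bins=16, n_boot=200, seed=0):
    """Bootstrap mean and standard error of the MI estimate."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample (x, y) pairs with replacement
        est.append(mutual_information(x[idx], y[idx], bins))
    return np.mean(est), np.std(est)

# Usage (hypothetical arrays): mi, err = bootstrap_mi(cloud_fraction, sst)
```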

    Revealing Relationships among Relevant Climate Variables with Information Theory

    A primary objective of the NASA Earth-Sun Exploration Technology Office is to understand the observed Earth climate variability, thus enabling the determination and prediction of the climate's response to both natural and human-induced forcing. We are currently developing a suite of computational tools that will allow researchers to calculate, from data, a variety of information-theoretic quantities such as mutual information, which can be used to identify relationships among climate variables, and transfer entropy, which indicates the possibility of causal interactions. Our tools estimate these quantities along with their associated error bars, the latter of which is critical for describing the degree of uncertainty in the estimates. This work is based upon optimal binning techniques that we have developed for piecewise-constant, histogram-style models of the underlying density functions. Two useful side benefits have already been discovered. The first allows a researcher to determine whether there exist sufficient data to estimate the underlying probability density. The second permits one to determine an acceptable degree of round-off when compressing data for efficient transfer and storage. We also demonstrate how mutual information and transfer entropy can be applied so as to allow researchers not only to identify relations among climate variables, but also to characterize and quantify their possible causal interactions. (14 pages, 5 figures. Appears in the Proceedings of the Earth-Sun System Technology Conference (ESTC 2005), Adelphi, MD.)
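    The optimal binning described here, for piecewise-constant histogram models, matches the form of Knuth's published Bayesian binning rule; the sketch below implements that criterion, though whether it is the exact algorithm of this paper is my assumption. A posterior over bin counts of this kind also speaks to the first side benefit: a posterior with no interior maximum arguably signals that the data are too few to resolve the underlying density.

```python
# Sketch of a Bayesian rule (Knuth 2006) for choosing the number of
# bins in a piecewise-constant, equal-width histogram density model.
# Assumed representative of, not identical to, the paper's technique.
import numpy as np
from scipy.special import gammaln

def log_posterior(data, m):
    """Relative log-posterior of an m-bin equal-width histogram."""
    n = len(data)
    counts, _ = np.histogram(data, bins=m)
    return (n * np.log(m)
            + gammaln(m / 2) - m * gammaln(0.5)
            - gammaln(n + m / 2)
            + np.sum(gammaln(counts + 0.5)))

def optimal_bins(data, max_bins=200):
    """Bin count that maximizes the log-posterior."""
    ms = np.arange(1, max_bins + 1)
    return int(ms[np.argmax([log_posterior(data, m) for m in ms])])
```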

    Solutions Network Formulation Report. Integration of OMI and TES Aerosol Products into the EPA Regional Planning Organizations' FASTNET Aerosol Tracking and Analysis Tool

    Every year, more than 280 million visitors tour our Nation's most treasured parks and wilderness areas. Unfortunately, many visitors are unable to see the spectacular vistas they expect because of white or brown haze in the air. Most of this haze is not natural; it is air pollution, often carried by the wind hundreds of miles from its origin. Some of the pollutants have been linked to serious health problems, such as asthma and other lung disorders, and even premature death. In addition, nitrates and sulfates contribute to acid rain formation, which contaminates rivers and lakes and erodes buildings and historical monuments. The U.S. Environmental Protection Agency RPOs (Regional Planning Organizations) have been tasked with monitoring and determining the nature and origin of haze in Class I scenic areas, and with finding ways to reduce haze in order to improve visibility in these areas. The RPOs have developed an Internet-based air quality DST (Decision Support Tool) called FASTNET (Fast Aerosol Sensing Tools for Natural Event Tracking). While FASTNET incorporates a few satellite datasets, most of the data utilized by this DST come from ground-based instrument networks. The problem is that in many areas the sensors are sparsely located, with long distances between them, causing difficulties in tracking haze over the United States, determining its source, and analyzing its content. Satellite data could help to fill in the data gaps and to supplement and verify ground-recorded air quality data. Although satellite data are now being used for air quality research applications, such data are not routinely used for environmental decision support, in part because of limited resources, difficulties with interdisciplinary data interpretation, and the need for advanced inter-agency partnerships. As a result, the validation and verification of satellite data for air quality operational system applications have been limited. This candidate solution evaluates the usefulness of OMI (Ozone Monitoring Instrument) and TES (Tropospheric Emission Spectrometer) air quality data for the RPOs by comparing OMI and TES data with ground-based data acquired during identified episodes of air pollution. The air quality data from OMI and TES cover different spectral ranges than the data from satellites currently included in FASTNET, giving them potential advantages over the existing sources. If the OMI and TES data are shown to be useful to the RPOs, they would then be integrated into the FASTNET DST for use on an operational basis.
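    As a hedged sketch of the comparison step (not an actual FASTNET interface), one might collocate satellite retrievals with ground monitors during an identified pollution episode and quantify their agreement; every array, radius, and variable name below is a hypothetical placeholder.

```python
# Hypothetical sketch: average satellite pixels near each ground site,
# then correlate the collocated values with the ground measurements.
import numpy as np

def collocate(sat_lat, sat_lon, sat_val, site_lat, site_lon, radius_deg=0.25):
    """Mean satellite value within radius_deg of each ground site."""
    out = np.full(len(site_lat), np.nan)
    for i, (la, lo) in enumerate(zip(site_lat, site_lon)):
        near = (np.abs(sat_lat - la) < radius_deg) & (np.abs(sat_lon - lo) < radius_deg)
        if near.any():
            out[i] = sat_val[near].mean()
    return out

# Usage (hypothetical episode data):
# matched = collocate(omi_lat, omi_lon, omi_no2, site_lat, site_lon)
# ok = ~np.isnan(matched)
# r = np.corrcoef(matched[ok], ground_no2[ok])[0, 1]
```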

    A Framework for implementing radiation-tolerant circuits on reconfigurable FPGAs

    The outstanding versatility of SRAM-based FPGAs makes them the preferred choice for implementing complex customizable circuits. To increase the amount of logic available, manufacturers are using nanometric technologies to boost logic density and reduce prices. However, the use of nanometric scales also makes FPGAs particularly vulnerable to radiation-induced faults, especially because of the increasing number of configuration memory cells needed to define their functionality. This paper describes a framework for implementing circuits immune to radiation-induced faults, based on a customized Triple Modular Redundancy (TMR) infrastructure and on a detection-and-fix controller. This controller is responsible for detecting data incoherencies, locating the faulty module, and restoring the original configuration, without affecting the normal operation of the mission logic. A short survey of the most recent published data concerning the impact of radiation-induced faults in FPGAs is presented to support the assumptions underlying our proposed framework. A detailed explanation of the controller functionality is also provided, followed by an experimental case study.
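    A behavioral sketch of the idea, written in Python since the document shows no HDL: three replicas feed a majority voter, and a disagreement check identifies the faulty replica so the controller can restore its configuration in the background. The function names are illustrative, and a real implementation operates in logic over configuration frames, not on Python integers.

```python
# Behavioral model of TMR with a detection-and-fix step. Illustrative
# only; on an FPGA this is done in hardware over configuration frames.
def vote(a: int, b: int, c: int) -> int:
    """Bitwise majority of three replica outputs (masks single faults)."""
    return (a & b) | (a & c) | (b & c)

def detect_faulty(a: int, b: int, c: int):
    """Index of the replica that disagrees with the majority, or None."""
    majority = vote(a, b, c)
    for i, out in enumerate((a, b, c)):
        if out != majority:
            return i  # candidate for partial reconfiguration (scrubbing)
    return None

# The mission logic keeps consuming the voted output while the faulty
# replica's configuration is rewritten, so operation is uninterrupted.
outputs = (0b1011, 0b1011, 0b1111)      # third replica upset by radiation
assert vote(*outputs) == 0b1011
assert detect_faulty(*outputs) == 2
```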

    Debates: Does Information Theory Provide a New Paradigm for Earth Science? Emerging Concepts and Pathways of Information Physics

    Entropy and Information are key concepts not only in Information Theory but also in Physics: historically in the fields of Thermodynamics, Statistical and Analytical Mechanics, and, more recently, in the field of Information Physics. In this paper we argue that Information Physics reconciles and generalizes statistical, geometric, and mechanistic views on information. We start by demonstrating how the use and interpretation of Entropy and Information coincide in Information Theory, Statistical Thermodynamics, and Analytical Mechanics, and how this can be taken advantage of when addressing Earth Science problems in general and hydrological problems in particular. In the second part we discuss how Information Physics provides ways to quantify Information and Entropy from fundamental physical principles. This extends their use to cases where the preconditions to calculate Entropy in the classical manner, as an aggregate statistical measure, are not met. Indeed, these preconditions are rarely met in the Earth Sciences, due either to limited observations or to the far-from-equilibrium nature of evolving systems. Information Physics therefore offers new opportunities for improving the treatment of Earth Science problems.
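    To make the "limited observations" precondition concrete, here is a small numerical illustration (mine, not the paper's): the classical plug-in entropy estimate, an aggregate statistical measure, is biased low whenever the sample count is small relative to the number of system states.

```python
# Illustration: plug-in Shannon entropy underestimates the true entropy
# of a uniform 32-state system when observations are scarce.
import numpy as np

rng = np.random.default_rng(1)
true_h = np.log2(32)  # 5 bits for a uniform distribution over 32 states

for n in (10, 100, 10_000):
    samples = rng.integers(0, 32, n)
    p = np.bincount(samples, minlength=32) / n
    h = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    print(f"n={n:6d}  plug-in H = {h:.2f} bits  (true {true_h:.2f})")
```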

    Evaluation of a Potential for Enhancing the Decision Support System of the Interagency Modeling and Atmospheric Assessment Center with NASA Earth Science Research Results

    NASA's objective for the Applied Sciences Program of the Science Mission Directorate is to expand and accelerate the realization of economic and societal benefits from Earth science, information, and technology. This objective is accomplished by using a systems approach to facilitate the incorporation of Earth observations and predictions into the decision-support tools used by partner organizations to provide essential services to society. These services include forest fire management, coastal zone management, agriculture, weather prediction, hazard mitigation, aviation safety, and homeland security. In this way, NASA's long-term research programs yield near-term, practical benefits to society. The Applied Sciences Program relies heavily on forging partnerships with other Federal agencies to accomplish its objectives. NASA chooses to partner with agencies that have existing connections with end users, information infrastructure already in place, and decision support systems that can be enhanced by the Earth science information that NASA is uniquely poised to provide (NASA, 2004).

    Signature Recognition Using the VFI5 Algorithm with Wavelet Preprocessing

    Signatures are among the biometric traits that are easiest to acquire, whether on paper or through electronic devices. Nevertheless, signature biometrics remains a challenging research topic. The challenges include large within-class variation, low universality and permanence, and the possibility of signature forgery attacks. This study uses an offline signature recognition method. Recognition is performed with the Voting Feature Interval 5 (VFI5) classification algorithm. Before the 40 x 60 pixel signature images are classified, a preprocessing step is applied to reduce the image size. The reduction used is dimensionality reduction via a wavelet transform with five levels of decomposition. The result of this study is that, up to the third decomposition level, with a feature dimension of about 1.5% of the full feature set, a minimum accuracy of 90% is obtained.
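    A plausible reconstruction of the preprocessing step (the wavelet family is my assumption): keeping only the level-3 approximation of a 40 x 60 image with a Haar wavelet leaves 5 x 8 = 40 coefficients, about 1.7% of the 2400 original pixels, consistent with the roughly 1.5% feature dimension quoted above.

```python
# Sketch: level-3 2-D wavelet approximation as the reduced feature
# vector fed to the VFI5 classifier. Wavelet choice is an assumption.
import numpy as np
import pywt

def wavelet_features(image, level=3, wavelet="haar"):
    """Flattened approximation coefficients at the given level."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    return coeffs[0].ravel()  # coeffs[0] is the coarsest approximation

img = np.random.rand(40, 60)        # placeholder signature image
print(wavelet_features(img).shape)  # (40,): 40 features from 2400 pixels
```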

    Principal components’ features of mid-latitude geomagnetic daily variation

    The ionospheric and magnetospheric current systems are responsible for the daily changes of the magnetic field. Recently, the Natural Orthogonal Components (NOC) technique has been applied to model, efficiently and accurately, the physical system responsible for the daily variation of the geomagnetic field (Xu and Kamide, 2004). Indeed, this approach guarantees that the number of parameters used to represent the physical process is as small as possible, and consequently process control for such a system becomes apparent. We focus our present study on the analysis of the hourly means of the magnetic elements H, D and Z recorded at L’Aquila observatory in Italy from 1993 to 2004. We apply the NOC technique to this dataset to reconstruct the 3-dimensional structures of the different ionospheric and magnetospheric current systems that contribute to the geomagnetic daily variations. To support our interpretation in terms of the different ionospheric and magnetospheric current systems, the spectral and statistical features of the time-dependent amplitudes associated with the set of natural orthogonal components are analyzed and compared to those of a set of descriptors of the magnetospheric dynamics and solar wind changes.
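    In practice, the NOC decomposition amounts to a principal-component (empirical orthogonal function) analysis of the day-by-hour data matrix. Below is a minimal sketch on synthetic stand-in data; the matrix layout and the synthetic diurnal signal are my assumptions, not the observatory dataset.

```python
# Sketch: NOC/PCA of hourly means arranged as (days x 24 hours). The
# rows of Vt are the orthogonal daily-variation modes; U*s gives their
# time-dependent amplitudes. Synthetic data stand in for L'Aquila H.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)
days = 365
H = (np.sin(2 * np.pi * hours / 24)[None, :]          # 24 h harmonic
     * (1 + 0.3 * rng.standard_normal((days, 1)))     # day-to-day amplitude
     + 0.1 * rng.standard_normal((days, 24)))         # noise

X = H - H.mean(axis=0)                   # remove the mean daily curve
U, s, Vt = np.linalg.svd(X, full_matrices=False)
modes = Vt                               # natural orthogonal components
amplitudes = U * s                       # one amplitude series per mode
explained = s**2 / np.sum(s**2)
print("variance in first 3 modes:", explained[:3].round(3))
```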

    A Recipe for the Estimation of Information Flow in a Dynamical System

    Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instance in time may provide information about other variables at later instances in time. This is often viewed as a flow of information, and tracking such a flow can reveal relationships among the system variables. Since MI is a symmetric quantity, an asymmetric quantity, called Transfer Entropy (TE), has been proposed to estimate the directionality of the coupling. However, accurate estimation of entropy-based measures is notoriously difficult. Every method has its own free tuning parameter(s), and there is no consensus on an optimal way of estimating the TE from a dataset. We propose a new methodology to estimate TE and apply a set of methods together as an accuracy cross-check, to provide a reliable mathematical tool for any given dataset. We demonstrate both the variability in TE estimation across techniques and the benefits of the proposed methodology for reliably estimating the directionality of coupling among variables.
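    As a hedged illustration of the cross-check idea, the sketch below estimates TE(X -> Y) with a simple histogram estimator run at several bin counts and compares the results; the paper combines genuinely distinct estimation methods, so varying only the bin count is a simplification of mine.

```python
# Sketch: histogram transfer entropy TE(X -> Y), history length 1,
# evaluated at several bin counts as a stability cross-check.
import numpy as np

def transfer_entropy(x, y, bins=8):
    """TE(X -> Y) in bits: I(y_{t+1}; x_t | y_t) via a 3-D histogram."""
    yf, yp, xp = y[1:], y[:-1], x[:-1]           # future, past, source
    joint, _ = np.histogramdd((yf, yp, xp), bins=bins)
    p = joint / joint.sum()                       # p(yf, yp, xp)
    p_yp_xp = p.sum(axis=0, keepdims=True)        # p(yp, xp)
    p_yf_yp = p.sum(axis=2, keepdims=True)        # p(yf, yp)
    p_yp = p.sum(axis=(0, 2), keepdims=True)      # p(yp)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = (p * p_yp) / (p_yf_yp * p_yp_xp)
        terms = np.where(p > 0, p * np.log2(ratio), 0.0)
    return terms.sum()

# Synthetic check: y is driven by past x, so TE(x -> y) should exceed
# zero and stay roughly stable across bin counts.
rng = np.random.default_rng(2)
x = rng.standard_normal(5000)
y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)
for b in (4, 8, 16):
    print(f"bins={b:2d}  TE = {transfer_entropy(x, y, b):.3f} bits")
```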