
    A mixture of experts model for predicting persistent weather patterns

    Weather and atmospheric patterns are often persistent. The simplest weather forecasting method is the so-called persistence model, which assumes that the future state of a system will be similar (or equal) to the present state. Machine learning (ML) models are widely used in different weather forecasting applications, but they need to be compared to the persistence model to analyse whether they provide a competitive solution to the problem at hand. In this paper, we devise a new model for predicting low visibility at airports using the concept of mixture of experts. The visibility level is coded as two ordered categorical variables: cloud height and runway visual height. The underlying system in this application is stagnant in approximately 90% of cases, and standard ML models fail to improve on the performance of the persistence model. Because of this, instead of trying simply to beat the persistence model with ML, we use persistence as a baseline and learn an ordinal neural network model that refines its results by focusing on learning weather fluctuations. The results show that the proposal outperforms persistence and other ordinal autoregressive models, especially for longer prediction horizons and for the runway visual height variable.
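    The refinement idea in this abstract (keep persistence as the baseline and learn only the deviations from it) can be sketched in a few lines of Python. All names below are illustrative assumptions, and the toy delta model stands in for the paper's actual ordinal neural network:

```python
# Illustrative sketch of persistence-plus-refinement forecasting.
# The function names and the damped-trend delta model are assumptions
# for demonstration; the paper's refinement is an ordinal neural network.

def persistence_forecast(series, horizon):
    """Persistence baseline: every future value equals the last observation."""
    return [series[-1]] * horizon

def refined_forecast(series, horizon, delta_model):
    """Refine persistence by adding predicted fluctuations (deltas)."""
    base = persistence_forecast(series, horizon)
    deltas = delta_model(series, horizon)
    return [b + d for b, d in zip(base, deltas)]

def damped_trend(series, horizon):
    """Toy delta model: assume the last observed change continues, damped."""
    d = series[-1] - series[-2]
    return [d * 0.5 ** (k + 1) for k in range(horizon)]
```

    For a series ending ..., 10, 12, the persistence baseline predicts 12 at every horizon, while `refined_forecast` adds a shrinking correction (13.0, 12.5, 12.25, ...), capturing the idea that the model only needs to learn fluctuations around a strong baseline.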

    2022 SDSU Data Science Symposium Presentation Abstracts

    This document contains abstracts for presentations and posters at the 2022 SDSU Data Science Symposium.

    Long Term Predictive Modeling on Big Spatio-Temporal Data

    In the era of massive data, one of the most promising research fields involves the analysis of large-scale spatio-temporal databases to discover exciting and previously unknown but potentially useful patterns from data collected over time and space. A modeling process in this domain must take temporal and spatial correlations into account, but as the dimensionality of the time and space measurements increases, the number of elements potentially contributing to a target sharply grows, making the target's long-term behavior highly complex, chaotic, highly dynamic, and hard to predict. Therefore, two different considerations are taken into account in this work: one is how to identify the most relevant and meaningful features from the original spatio-temporal feature space; the other is how to model complex space-time dynamics with sensitive dependence on initial and boundary conditions. First, identifying strongly related features and removing the irrelevant or less important features with respect to a target feature in large-scale spatio-temporal data sets is a critical and challenging issue in many fields, including tracing the evolutionary history of crime hot spots, uncovering weather patterns, predicting floods, earthquakes, and hurricanes, and determining global warming trends. The optimal sub-feature-set that contains all the valuable information is called the Markov boundary. Unfortunately, existing feature selection methods often focus on identifying a single Markov boundary, when real-world data can have many feature subsets that are equally good boundaries. In our work, we design a new multiple-Markov-boundary-based predictive model, Galaxy, to identify the precursors to heavy precipitation event clusters and predict heavy rainfall with a long lead time.
    We applied Galaxy to an extremely high-dimensional meteorological data set and determined 15 Markov boundaries related to heavy rainfall events in the Des Moines River Basin in Iowa. Our model identified the cold surges along the coast of Asia as an essential precursor to surface weather over the United States, a finding later corroborated by climate experts. Second, chaotic behavior exists in many nonlinear spatio-temporal systems, such as climate dynamics, weather prediction, and the space-time dynamics of virus spread. A reliable solution for these systems must handle their complex space-time dynamics and sensitive dependence on initial and boundary conditions. Deep neural networks' hierarchical feature learning capabilities in both the spatial and temporal domains are helpful for modeling nonlinear spatio-temporal dynamics. However, sensitive dependence on initial and boundary conditions remains challenging for theoretical research and many critical applications. This study proposes a new recurrent architecture, error trajectory tracing, and an accompanying training regime, Horizon Forcing, for prediction in chaotic systems. These methods have been validated on real-world spatio-temporal data sets, including one meteorological dataset, three classic chaotic systems, and four real-world time series prediction tasks with chaotic characteristics. Experimental results show that each proposed model outperforms current baseline approaches.
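    Galaxy's multiple-Markov-boundary search is far more involved than any short snippet, but the relevance screening it generalizes can be illustrated with a simple mutual-information ranking of discrete features against a target. This is a loose stand-in, not the paper's algorithm, and all names are assumptions:

```python
# Illustrative relevance screen: rank discrete features by empirical
# mutual information with the target. A crude stand-in for Markov
# boundary discovery, which also handles redundancy and conditioning.
from collections import Counter
import math

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        # c/n is the joint probability; px[x]/n and py[y]/n are the marginals
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def rank_features(features, target):
    """Sort named feature columns by mutual information with the target."""
    scores = {name: mutual_information(col, target) for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

    A feature identical to the target scores log 2 nats on a balanced binary target, while an independent feature scores zero, so relevant precursors rise to the top of the ranking. A true Markov boundary method must additionally drop redundant features and test conditional independence.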

    How will climate change affect mycotoxins in food?

    This invited review and opinion piece assesses the impact of climate change on mycotoxins in food: a substantial literature search found only one paper and one abstract that addressed the topic directly, and then only in relation to Europe. Climate change is accepted as probable by most scientists. Favourable temperature and water activity are crucial for mycotoxigenic fungi and mycotoxin production. Fungal diseases of crops provide relevant information for pre-harvest mycotoxin contamination. However, the mycotoxin issue also involves post-harvest scenarios. There are no data on how mycotoxins affect competing organisms in crop ecosystems. In general, if the temperature increases in cool or temperate climates, the relevant countries may become more liable to aflatoxins. Tropical countries may become too inhospitable for conventional fungal growth and mycotoxin production. Could this lead to the extinction of the thermotolerant Aspergillus flavus? Currently cold regions may become liable to temperate problems concerning ochratoxin A, patulin and Fusarium toxins (e.g. deoxynivalenol). Regions which can afford to control the environment of storage facilities may be able to avoid post-harvest problems, but at high additional cost. There appears to be a lack of awareness of the issue in some non-European countries. The era will provide numerous challenges for mycotoxicologists. Fundação para a Ciência e a Tecnologia (FCT) - bolsa SFRH/BPD/34879/2007, Commitment to Science ref. C2008-UMINHO-CEB-2

    PICES Press, Vol. 21, No. 1, Winter 2013

    ◾2012 PICES Science: A Note from the Science Board Chairman (pp. 1-6)
    ◾2012 PICES Awards (pp. 7-9)
    ◾GLOBEC/PICES/ICES ECOFOR Workshop (pp. 10-15)
    ◾ICES/PICES Symposium on "Forage Fish Interactions" (pp. 16-18)
    ◾The Yeosu Declaration, the Yeosu Declaration Forum and the Yeosu Project (pp. 19-23)
    ◾2013 PICES Calendar (p. 23)
    ◾Why Do We Need Human Dimensions for the FUTURE Program? (pp. 24-25)
    ◾New PICES MAFF-Sponsored Project on "Marine Ecosystem Health and Human Well-Being" (pp. 26-28)
    ◾The Bering Sea: Current Status and Recent Trends (pp. 29-31)
    ◾Continuing Cool in the Northeast Pacific Ocean (pp. 32, 35)
    ◾The State of the Western North Pacific in the First Half of 2012 (pp. 33-35)
    ◾New Leadership in PICES (pp. 36-39)

    A toolkit modeling approach for sustainable forest management planning: Achieving balance between science and local needs

    To assist forest managers in balancing an increasing diversity of resource objectives, we developed a toolkit modeling approach for sustainable forest management (SFM). The approach inserts a meta-modeling strategy into a collaborative modeling framework grounded in adaptive management philosophy that facilitates participation among stakeholders, decision makers, and local domain experts in the meta-model building process. The modeling team works iteratively with each of these groups to define essential questions, identify data resources, and then determine whether available tools can be applied or adapted, or whether new tools can be rapidly created to fit the need. The desired goal of the process is a linked series of domain-specific models (tools) that balances generalized "top-down" models (i.e., scientific models developed without input from the local system) with case-specific customized "bottom-up" models that are driven primarily by local needs. Information flow between models is organized according to vertical (i.e., between-scale) and horizontal (i.e., within-scale) dimensions. We illustrate our approach within a 2.1 million hectare forest planning district in central Labrador, a forested landscape where social and ecological values receive a higher priority than economic values. However, the focus of this paper is on the process of how SFM modeling tools and concepts can be rapidly assembled and applied in new locations, balancing efficient transfer of science with adaptation to local needs. We use the Labrador case study to illustrate strengths and challenges uniquely associated with a meta-modeling approach to integrated modeling as it fits within the broader collaborative modeling framework. Principal advantages of the approach include the scientific rigor introduced by peer-reviewed models, combined with the adaptability of meta-modeling. A key challenge is the limited transparency of scientific models to different participatory groups.
    This challenge can be overcome by frequent and substantive two-way communication among different groups at appropriate times in the model-building process, combined with strong leadership that includes strategic choices when assembling the modeling team. The toolkit approach holds promise for extending beyond case studies, without compromising the bottom-up flow of needs and information, to inform SFM planning using the best available science.