10 research outputs found

    Advancing Cyberinfrastructure for Collaborative Data Sharing and Modeling in Hydrology

    Hydrologic research is increasingly data- and computation-intensive, and often involves hydrologic model simulation and collaboration among researchers. With the development of cyberinfrastructure, researchers can improve the efficiency, impact, and effectiveness of their research by using online data sharing and hydrologic modeling functionality. However, further efforts are still needed to improve the capability of cyberinfrastructure to serve the hydrologic science community. This dissertation first presents the evaluation of a physically based snowmelt model as an alternative to a temperature index model to improve operational water supply forecasts in the Colorado River Basin. It then presents the design of functionality to share multidimensional space-time data in the HydroShare hydrologic information system, and describes a web application developed to facilitate input preparation and execution of a snowmelt model and the storage of the results in HydroShare. The snowmelt model evaluation provided use cases for evaluating the cyberinfrastructure elements developed. This research explored a new approach to advancing operational water supply forecasts and provided potential solutions for the challenges associated with the design and implementation of cyberinfrastructure for hydrologic data sharing and modeling.

    Integrating Hydrologic Modeling Web Services With Online Data Sharing to Prepare, Store, and Execute Hydrologic Models

    Web-based applications, web services, and online data and model sharing technology are becoming increasingly available to support hydrologic research. This promises benefits in terms of collaboration, computer platform independence, and reproducibility of modeling workflows and results. In this research, we designed an approach that integrates hydrologic modeling web services with an online data sharing system to support web-based simulation for hydrologic models. We used this approach to integrate example systems as a case study to support reproducible snowmelt modeling for a test watershed in the Colorado River Basin, USA. We demonstrated that this approach enabled users to work within an online environment to create, describe, share, discover, repeat, modify, and analyze the modeling work. This approach encourages collaboration and improves research reproducibility. It can also be adopted or adapted to integrate other hydrologic modeling web services with data sharing systems for different hydrologic models.

    HydroDS: Data Services in Support of Physically Based, Distributed Hydrological Models

    Physically based distributed hydrologic models require geospatial and time-series data that take considerable time and effort to process into model inputs. Tools that automate and speed up input processing facilitate the application of these models. In this study, we developed a set of web-based data services called HydroDS to provide hydrologic data processing 'software as a service.' HydroDS provides functions for processing watershed, terrain, canopy, climate, and soil data. The services are accessed through a Python client library that facilitates developing simple but effective data processing workflows with Python. Evaluations of HydroDS, carried out by setting up the Utah Energy Balance and TOPNET models for multiple headwater watersheds in the Colorado River Basin, show that HydroDS reduces input preparation time compared to manual processing. It also removes the requirement for software installation and maintenance by the user, and the Python workflows enhance reproducibility of hydrologic data processing and tracking of provenance.
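The client-library pattern described above can be illustrated with a minimal sketch. The base URL, endpoint names, and parameters below are hypothetical stand-ins for illustration only, not the actual HydroDS API; the point is how a thin Python client turns service calls into a scriptable, provenance-tracked workflow.

```python
# Minimal sketch of a data-service client in the style described above.
# The base URL, function names, and parameters are hypothetical.

class DataServiceClient:
    """Builds requests for a hydrologic data-processing web service."""

    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")
        self.calls = []  # logged calls double as a provenance record

    def request_url(self, function, **params):
        """Compose the URL for one service function call and log it."""
        query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
        url = f"{self.base_url}/{function}?{query}"
        self.calls.append(url)
        return url

# A short workflow: delineate a watershed, then subset a climate dataset to it.
client = DataServiceClient("https://example.org/api/")
client.request_url("delineatewatershed", outlet_x=-111.5, outlet_y=40.6)
client.request_url("subsetclimate", dataset="daymet", watershed="ws1")
print(len(client.calls))
```

Because every call is recorded, rerunning the script reproduces the entire preprocessing chain, which is the reproducibility benefit the abstract describes.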

    An open-data open-model framework for hydrological models’ integration, evaluation and application

    To tackle fundamental scientific questions regarding the health, resilience, and sustainability of water resources, which encompass multiple disciplines, researchers need to be able to easily access diverse data sources and to effectively incorporate these data into heterogeneous models. To address these cyberinfrastructure challenges, a new sustainable and easy-to-use Open Data and Open Modeling framework called Meta-Scientific-Modeling (MSM) is developed. MSM addresses the challenge of accessing heterogeneous data sources via the Open Data architecture, which facilitates integration of various external data sources. Data Agents are used to handle remote data access protocols, metadata standards, and source-specific implementations. The Open Modeling architecture allows different models to be easily integrated into MSM via Model Agents, enabling direct heterogeneous model coupling. MSM adopts a graphical scientific workflow system (VisTrails) and does not require re-compiling or adding interface code to integrate diverse models. A case study is presented to illustrate the merit of MSM.
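The Data Agent / Model Agent idea above can be sketched in a few lines: each external source is wrapped by an agent exposing a uniform interface, and the framework looks agents up by name instead of hard-coding source-specific logic. All class and field names here are illustrative assumptions, not MSM's actual interfaces.

```python
# Sketch of the agent pattern described above. Names are illustrative only.

class Agent:
    """Uniform interface every data or model agent must implement."""
    def fetch(self, query):
        raise NotImplementedError

class ListDataAgent(Agent):
    """Hides one source-specific format behind the common interface."""
    def __init__(self, rows):
        self.rows = rows
    def fetch(self, query):
        return [r for r in self.rows if r["site"] == query]

class Registry:
    """The framework couples components by name, not by implementation."""
    def __init__(self):
        self._agents = {}
    def register(self, name, agent):
        self._agents[name] = agent
    def fetch(self, name, query):
        return self._agents[name].fetch(query)

registry = Registry()
registry.register("gauges", ListDataAgent([{"site": "A", "flow": 3.2},
                                           {"site": "B", "flow": 1.1}]))
print(registry.fetch("gauges", "A"))
```

Adding a new source then means writing one new agent class; nothing downstream of the registry changes, which is what makes such a framework extensible without recompilation.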

    Data visualization tool for student dropouts in Tanzania

    A Dissertation Submitted in Partial Fulfilment of the Requirements for the Degree of Master's in Information and Communication Science and Engineering of the Nelson Mandela African Institution of Science and Technology. Education is a crucial sector and a key component of many governments' agendas. Despite that, student dropout has been among the persistent challenges in education, experienced from basic education through colleges and universities. This work presents a study on data visualization in education, with a glimpse of data visualization in other domains, and proposes a web-based data visualization tool for student dropouts in Tanzania, targeting primary and secondary schools. It also presents users' feedback on the developed web-based tool. This study collected data from the Government Basic Statistics Portal and the President's Office - Regional Administration and Local Government (PO-RALG). Rapid prototyping was employed to develop the proposed web-based tool for interactive visualization of the prepared data. Moreover, focus group discussions and questionnaires were used to gather feedback from users, the majority of whom agreed that data visualization is useful for understanding data and providing insights for reporting and decision making. Most users further agreed that the developed tool was easy to use, useful, and recommendable. The various challenges that came up during the course of this study highlight the need for sound data collection practices and for good visualization literacy. The results of this study will benefit decision makers in the education domain by providing a wide range of options to visually compare and observe variations in student dropout trends, thus facilitating informed decisions. Moreover, this study will be a useful source of information for further research in dropout prediction and modelling.
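The kind of trend comparison such a tool visualizes rests on a simple aggregation: dropout counts over enrolment, per region and year. A minimal sketch of that step, with made-up figures purely for illustration (the real data come from the portals named above):

```python
# Compute dropout rates (percent) per region per year from
# (region, year, enrolled, dropouts) records. Figures are invented.

from collections import defaultdict

records = [
    ("Arusha", 2019, 1000, 40),
    ("Arusha", 2020, 1100, 33),
    ("Dodoma", 2019, 900, 45),
    ("Dodoma", 2020, 950, 38),
]

rates = defaultdict(dict)
for region, year, enrolled, dropouts in records:
    rates[region][year] = round(100 * dropouts / enrolled, 1)

print(dict(rates))
```

A charting front end would then plot `rates[region]` as one line per region, which is the side-by-side trend comparison the study's users found useful.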

    Advancing the Implementation of Hydrologic Models as Web-Based Applications

    A deeper understanding of the relationships between flow in rivers and various hydrologic elements such as rainfall, land use, and soil type is imperative to solve water-related problems like droughts and floods. Advanced computer models are becoming essential in helping us understand such relationships. However, preparing such models requires a huge investment of time and resources, much of which is concentrated on the acquisition and curation of data. This work introduces a free and open source web application (web app) that provides researchers with simplified access to hydrological data and modeling functionality. The web app helps in the creation of hydrologic models as well as climatic and geographic data. Free and open source platforms such as Tethys and HydroShare were used in the development of the web app. A physics-based model called TOPographic Kinematic APproximation and Integration (TOPKAPI) was used as the driving use case, for which a complete hydrologic modeling service was developed to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create hydrologic data and run a hydrologic model for a watershed of interest. An additional model, TOPNET, was incorporated to demonstrate the generality of the approach and the capability for adding other models into the framework.

    A Data-driven, High-performance and Intelligent CyberInfrastructure to Advance Spatial Sciences

    In the field of Geographic Information Science (GIScience), we have witnessed the unprecedented data deluge brought about by the rapid advancement of high-resolution data observing technologies. For example, with the advancement of Earth Observation (EO) technologies, a massive amount of EO data, including remote sensing data and other sensor observation data about earthquakes, climate, oceans, hydrology, volcanoes, glaciers, etc., are being collected on a daily basis by a wide range of organizations. In addition to the observation data, human-generated data including microblogs, photos, consumption records, evaluations, unstructured webpages, and other Volunteered Geographic Information (VGI) are incessantly generated and shared on the Internet. Meanwhile, the emerging cyberinfrastructure rapidly increases our capacity for handling such massive data with regard to data collection and management, data integration and interoperability, data transmission and visualization, high-performance computing, etc. Cyberinfrastructure (CI) consists of computing systems, data storage systems, advanced instruments and data repositories, visualization environments, and people, all linked together by software and high-performance networks to improve research productivity and enable breakthroughs that are not otherwise possible. The Geospatial CI (GCI, or CyberGIS), as the synthesis of CI and GIScience, has inherent advantages in enabling computationally intensive spatial analysis and modeling (SAM) and collaborative geospatial problem solving and decision making. This dissertation is dedicated to addressing several critical issues and improving the performance of existing methodologies and systems in the field of CyberGIS. 
The dissertation includes three parts. The first part is focused on developing methodologies to help public researchers find appropriate open geospatial datasets efficiently and effectively from millions of records provided by thousands of organizations scattered around the world; machine learning and semantic search methods are utilized in this research. The second part develops an interoperable and replicable geoprocessing service by synthesizing the high-performance computing (HPC) environment, the core spatial statistics/analysis algorithms from the widely adopted open source Python package, the Python Spatial Analysis Library (PySAL), and rich datasets acquired from the first part. The third part is dedicated to studying optimization strategies for feature data transmission and visualization; this study is intended to solve the performance issues in transmitting large feature data through the Internet and visualizing them on the client (browser) side. Taken together, the three parts constitute an endeavor towards the methodological improvement and implementation practice of the data-driven, high-performance and intelligent CI to advance spatial sciences.
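A core statistic that PySAL-backed geoprocessing services of the kind described above expose is Moran's I, a measure of spatial autocorrelation. The following is a from-scratch sketch of the statistic itself, not the PySAL API, so the formula being computed is explicit:

```python
# Moran's I: spatial autocorrelation of values under a spatial weights
# matrix. A from-scratch sketch for illustration, not the PySAL interface.

def morans_i(values, weights):
    """values: observations; weights[i][j]: spatial weight from i to j."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_total = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_total) * (num / den)

# Two neighboring sites with opposite deviations from the mean give
# perfect negative autocorrelation, I = -1.
print(morans_i([1.0, 2.0], [[0, 1], [1, 0]]))  # -1.0
```

Wrapping such an algorithm in an HPC-hosted service, as the second part proposes, lets clients run it on large datasets without installing the analysis stack locally.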

    Modelo de proceso para la evaluación continua de la accesibilidad de sitios web [Process model for the continuous evaluation of website accessibility]

    Get PDF
    Educational institutions' identity and corporate image are presented to the world through their websites. On their websites, educational institutions publish their academic offerings, mission, vision, academic goals, achievements, regulations, news, and all the work the institutions need to make known to society. Therefore, educational websites must meet accessibility standards that allow people with and without disabilities to use the Web. Although people with disabilities are no strangers to educational websites, in many cases they face new barriers instead of experiencing benefits; hence the importance of the accessibility of educational websites. Web accessibility means that people with disabilities can use the Web under the same conditions as people without disabilities. When we talk about web accessibility, we mean web design and development that enable people with disabilities to perceive, understand, navigate, and interact with the Web. Web accessibility also benefits others, including older people whose abilities have diminished with age. In addition, the Web has become an essential resource for human activity: education, employment, government, commerce, health, entertainment, and much else benefit from the Web as a platform for communication and interaction. Considering the above, the objective of this thesis is to propose a process model for organizations to continuously evaluate the accessibility of their websites in order to make them more accessible. This thesis was developed in three stages: characterization of the problem, a systematic literature review, and the proposal of the process model. 
    In the problem characterization, websites were evaluated with automatic online tools using the Web Content Accessibility Guidelines (WCAG) 2.0 at conformance levels A and AA and WCAG 2.1 at conformance level AAA. The results showed that most accessibility errors fall under the robust and perceivable principles, and that all 1,353 websites and 463 electronic documents analyzed had accessibility barriers. Seven scientific articles were published in this stage. In the study of the state of the art, it was determined that the websites evaluated in the analyzed articles were not accessible. In this stage, three systematic literature review articles were published, whose main results are as follows: 1. In 20 of the 25 papers, website accessibility was evaluated with automatic tools; in 2 papers, with real users; and in the other 3 papers, with automatic tools, real users, and experts. All the educational websites analyzed in the papers need to correct errors. 2. The results present the analysis and synthesis of evaluations of 9,140 universities from 67 countries. The evaluated resources are 38,416 web pages, 91,421 YouTube videos, and 28,395 PDF documents. The assessments use manual methods, automatic tools, and combinations of both; most of the websites were evaluated against the ISO/IEC 40500:2012 and Section 508 standards. 3. The results present narrated experiments of projects or individuals seeking to improve collaborative learning in education; the proposed software architectures do not contemplate laws or quality standards for universal access. 
    In the third and final stage, a process model is developed for the continuous evaluation of website accessibility. The process model consists of four phases. The first phase (Plan) defines the accessibility problem, its importance, and the WCAG version against which accessibility will be evaluated; it also determines the current situation of the websites and the possible causes of accessibility problems, and classifies the success criteria by principle, guideline, and conformance level to develop the solution plan and the action plan. The second phase (Do) executes the action plan to correct the accessibility problems; in this phase, continuous testing with automatic evaluation tools, end users, and experts corroborates that the changes have taken effect. The third phase (Check) measures compliance and non-compliance with the defined Key Performance Indicators (KPIs), and the reasons for non-compliance are determined. The fourth and final phase (Act) documents the solutions learned for inclusion in future developments. This research results in a process model for the continuous evaluation of web accessibility, tested through a case study to corroborate its functionality and applicability.
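The automatic evaluation tools central to this thesis work by running many mechanical checks against WCAG success criteria. One such check in miniature, assuming nothing about any particular tool: WCAG success criterion 1.1.1 requires a text alternative for non-text content, so a checker can flag `<img>` tags without an `alt` attribute.

```python
# Minimal sketch of one automated accessibility check (WCAG 1.1.1:
# images need a text alternative). Real evaluation tools run hundreds
# of such checks across all four WCAG principles.

from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = 0
    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for this tag
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

page = '<p><img src="logo.png" alt="University logo"><img src="banner.png"></p>'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # 1
```

In the Plan/Do/Check/Act cycle described above, counts like this feed the Check phase's KPIs, and the Do phase reruns the checker after each fix to confirm the barrier is gone.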

    Integrated High-Resolution Modeling for Operational Hydrologic Forecasting

    Current advances in Earth-sensing technologies, physically-based modeling, and computational processing offer the promise of a major revolution in hydrologic forecasting—with profound implications for the management of water resources and protection from related disasters. However, access to the necessary capabilities for managing information from heterogeneous sources, and for its deployment in robust-enough modeling engines, remains the province of large governmental agencies. Moreover, even within this type of centralized operations, success is still challenged by the sheer computational complexity associated with overcoming uncertainty in the estimation of parameters and initial conditions in large-scale or high-resolution models. In this dissertation we seek to facilitate access to hydrometeorological data products from various U.S. agencies and to advanced watershed modeling tools through the implementation of a lightweight GIS-based software package. Accessible data products currently include gauge, radar, and satellite precipitation; stream discharge; distributed soil moisture and snow cover; and multi-resolution weather forecasts. Additionally, we introduce a suite of open-source methods aimed at the efficient parameterization and initialization of complex geophysical models in contexts of high uncertainty, scarce information, and limited computational resources. The developed products in this suite include: 1) model calibration based on state-of-the-art ensemble evolutionary Pareto optimization, 2) automatic parameter estimation boosted through the incorporation of expert criteria, 3) data assimilation that hybridizes particle smoothing and variational strategies, 4) model state compression by means of optimized clustering, 5) high-dimensional stochastic approximation of watershed conditions through a novel lightweight Gaussian graphical model, and 6) simultaneous estimation of model parameters and states for hydrologic forecasting applications. 
    Each of these methods was tested using established distributed, physically-based hydrologic modeling engines (VIC and the DHSVM) applied to U.S. watersheds of different sizes—from a small, highly instrumented catchment in Pennsylvania to the basin of the Blue River in Oklahoma. A series of experiments demonstrated statistically significant improvements in the predictive accuracy of the proposed methods in contrast with traditional approaches. Taken together, these accessible and efficient tools can therefore be integrated within various model-based workflows for complex operational applications in water resources and beyond.
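The data-assimilation idea in the suite above can be illustrated in miniature: an ensemble of candidate model states is re-weighted by how well each member matches an observation, which is the core update step of particle-based methods. This is a generic sketch under a Gaussian observation-error assumption, not the dissertation's hybrid particle-smoothing/variational scheme, and the numbers are invented.

```python
# One particle-filter-style update: weight ensemble members by their
# Gaussian likelihood given an observation. Illustrative numbers only.

import math

def reweight(states, observation, obs_std):
    """Normalized Gaussian likelihood weights for each ensemble member."""
    w = [math.exp(-0.5 * ((s - observation) / obs_std) ** 2) for s in states]
    total = sum(w)
    return [x / total for x in w]

states = [8.0, 10.0, 12.0, 15.0]   # e.g. candidate streamflow states
weights = reweight(states, observation=10.5, obs_std=1.0)
best = states[weights.index(max(weights))]
print(best)  # the member closest to the observation carries the most weight
```

Repeating this update as observations arrive, and resampling when weights degenerate, keeps the ensemble concentrated on states consistent with the data, which is what makes assimilation improve forecast initial conditions.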

    Evaluating Sediment Accumulation Behind Dams In The Great Lakes Watershed From Past To Present

    Reservoir sedimentation and the consequent long-term loss of storage capacity pose a serious threat to the natural environmental system. However, there is little information and there are few physical measurements regarding the sediment accumulation rate within reservoirs. The average age of dams in the country is more than 50 years, and as dams age, the number of high-hazard dams continues to increase. There are serious risks associated with aging dams: dam removal or dam failure can release a considerable sediment load to downstream reaches and eventually deteriorate water quality and fish habitat. The present dissertation investigates the historical function of Great Lakes dams as sediment storage points and provides insight into the remaining capacity of dams in the Great Lakes watershed. To better understand the historical and current sediment yield, the Soil and Water Assessment Tool (SWAT) has been used, and regression analysis has been performed on SWAT output to predict the sediment yield in unmodeled watersheds. The overall objectives of this research are to: (1) determine the historical and current sediment yield within the Great Lakes watershed; (2) estimate the sediment accumulation rate within the reservoirs and forecast the remaining capacity of reservoirs in the Great Lakes; and (3) evaluate the net effect that humans have had on sediment delivery to the Great Lakes, where the difference between pre-European settlement and present-day sediment delivery rates represents the anthropogenic effect. This investigation includes field studies and modeling for eleven reservoirs in the Great Lakes watershed.
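The regression step described above—fitting a relationship on SWAT-modeled basins and extrapolating to unmodeled ones—can be sketched with a minimal ordinary least squares fit. The predictor (drainage area) and every number below are made up for illustration; the actual study's regression variables are not specified in this abstract.

```python
# Sketch of the extrapolation step: fit sediment yield against a watershed
# attribute from modeled basins, then predict an unmodeled basin's yield.
# All values are invented; drainage area is an assumed predictor.

def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

areas = [50.0, 100.0, 150.0, 200.0]   # km^2, SWAT-modeled watersheds
yields = [12.0, 22.0, 32.0, 42.0]     # t/km^2/yr from the model runs
a, b = ols_fit(areas, yields)
print(a + b * 120.0)  # predicted yield for an unmodeled 120 km^2 watershed
```

Summing such predicted yields over the years a dam has been in place, against its surveyed storage volume, is the kind of calculation that yields the remaining-capacity estimates the dissertation targets.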