12 research outputs found

    Land Use And Land Cover Classification And Change Detection Using Naip Imagery From 2009 To 2014: Table Rock Lake Region, Missouri

    Land use and land cover (LULC) of the Table Rock Lake (TRL) region has changed over the last half century since the construction of Table Rock Dam in 1959. This study uses one-meter spatial resolution imagery to classify and detect LULC change in three typical waterside TRL regions. The main objectives are to provide an efficient and reliable classification workflow for regional-level NAIP aerial imagery and to identify the dynamic patterns of the study areas. Seven class types are extracted from the optimal classification results for 2009, 2010, 2012, and 2014 for Table Rock Village, Kimberling City, and Indian Point. Pixel-based post-classification comparison generates "from-to" confusion matrices showing the detailed change patterns. I conclude that object-based random trees achieves the highest overall accuracy and kappa value compared with the other six classification approaches and is an efficient way to produce a LULC classification map. The major change patterns are that vegetation, including trees and grass, increased during the five-year period, while residential extension and urbanization were not pronounced enough to indicate high economic development in the TRL region. By adding auxiliary spatial information and object-based post-classification techniques, an improved classification procedure can be utilized for LULC change detection projects at the regional level.
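The accuracy comparison described above rests on two standard summary statistics computed from a confusion matrix: overall accuracy and Cohen's kappa. As a minimal sketch (using a toy three-class matrix, not the study's data), the two values can be computed as:

```python
def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.

    cm[i][j] = number of pixels of reference class i assigned to class j.
    """
    k = len(cm)
    n = sum(sum(row) for row in cm)
    observed = sum(cm[i][i] for i in range(k)) / n            # overall accuracy
    row_sums = [sum(cm[i]) for i in range(k)]
    col_sums = [sum(cm[i][j] for i in range(k)) for j in range(k)]
    expected = sum(r * c for r, c in zip(row_sums, col_sums)) / n ** 2  # chance agreement
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Toy 3-class confusion matrix (illustrative numbers only)
cm = [[50, 2, 3],
      [4, 40, 6],
      [2, 3, 45]]
acc, kappa = accuracy_and_kappa(cm)
```

Kappa discounts the agreement expected by chance, which is why the study reports it alongside overall accuracy when ranking the seven classifiers.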

    An Open-Source Workflow for Spatiotemporal Studies with COVID-19 as an Example

    Many previous studies have shown that open-source technologies help democratize information and foster the collaborations needed to address global physical and societal challenges. The outbreak of the novel coronavirus imposed unprecedented challenges on human society, affecting every aspect of livelihood, including health, environment, transportation, and the economy. Open-source technologies provide a new ray of hope for collaboratively tackling the pandemic. The role of open source is not limited to sharing source code; rather, open-source projects can be adopted as a software development approach that encourages collaboration among researchers. Open collaboration creates a positive impact on society and helps combat the pandemic effectively. Open-source technology integrated with geospatial information allows decision-makers to make strategic, informed decisions and assists them in determining the type of intervention needed. The novelty of this paper is to standardize the open-source workflow for spatiotemporal research. The highlights of the workflow include sharing data, analytical tools, spatiotemporal applications, and results, and formalizing open-source software development. The workflow includes (i) developing open-source spatiotemporal applications, (ii) opening and sharing the spatiotemporal resources, and (iii) replicating the research in a plug-and-play fashion. Open data, open analytical tools and source code, and publicly accessible results form the foundation of this workflow. The paper also presents a case study of open-source spatiotemporal application development for air quality analysis in California, USA. In addition to the application development, we share the spatiotemporal data, source code, and research findings through a GitHub repository.


    Enhanced Mapping of Supraglacial Lakes Through Dual-attention Deep Neural Network

    Supraglacial lakes in the Arctic undergo seasonal and glacial-activity-induced changes, providing profound insights into ice dynamics and climate change in these sensitive regions. However, the morphological complexity of these lakes, compounded by environmental obstructions such as clouds and slush fields, poses significant challenges to accurate lake detection. The 31st ACM SIGSPATIAL conference (2023) initiated a competition, GISCUP 2023, focusing on supraglacial lake detection from multi-part, multi-temporal satellite imagery. This paper, the third-place winner, introduces a dual-attention U-Net algorithm. The approach synergizes deep learning with spectral and spatial knowledge, ensuring a streamlined pipeline that upholds methodological soundness and yields satisfying results.
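The paper's exact architecture is not reproduced here, but the generic dual-attention pattern it builds on — a channel gate followed by a spatial gate applied to a feature map — can be sketched in plain NumPy. Real U-Net implementations learn these gates with convolutions; the fixed pooling-based gates below are only a simplified stand-in:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W); weight each channel by a squashed global-average response
    gap = feat.mean(axis=(1, 2))          # (C,)
    gate = sigmoid(gap)                   # per-channel gate in (0, 1)
    return feat * gate[:, None, None]

def spatial_attention(feat):
    # gate each pixel by pooled cross-channel statistics
    avg = feat.mean(axis=0)               # (H, W)
    mx = feat.max(axis=0)                 # (H, W)
    gate = sigmoid(avg + mx)              # per-pixel gate in (0, 1)
    return feat * gate[None, :, :]

def dual_attention(feat):
    # apply the channel gate, then the spatial gate
    return spatial_attention(channel_attention(feat))

feat = np.random.default_rng(0).normal(size=(4, 8, 8))
out = dual_attention(feat)
```

Because both gates are sigmoids, the output preserves the feature map's shape while attenuating less informative channels and pixels — the intuition behind suppressing clouds and slush while keeping lake pixels.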

    Spatiotemporal analysis of medical resource deficiencies in the U.S. under COVID-19 pandemic.

    Coronavirus disease 2019 (COVID-19) was first identified in December 2019 in Wuhan, China as an infectious disease and quickly resulted in an ongoing pandemic. A data-driven approach was developed to estimate medical resource deficiencies due to medical burdens at the county level during the COVID-19 pandemic. The study period was mainly from February 15, 2020 to May 1, 2020 in the U.S. Multiple data sources were used to extract the local population, hospital beds, critical care staff, confirmed COVID-19 case numbers, and hospitalization data at the county level. We estimated the average length of stay from hospitalization data at the state level and calculated the hospitalization rate at both the state and county levels. We then developed two medical resource deficiency indices that measure the local medical burden: the number of accumulated active confirmed cases normalized by the local maximum potential medical resources, and the number of hospitalized patients that can be supported per ICU bed per critical care staff member, respectively. Data on medical resources and the two deficiency indices are illustrated in a dynamic spatiotemporal visualization platform based on ArcGIS Pro Dashboards. Our results provide new insights into U.S. pandemic preparedness and the local dynamics of medical burdens in response to the COVID-19 pandemic.
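The first index described above is a normalized ratio of accumulated active cases to maximum potential medical resources. A minimal sketch with hypothetical county figures (the numbers and the choice of staffed ICU beds as the resource denominator are illustrative, not the study's data):

```python
def deficiency_index(active_cases, icu_beds):
    """Toy ratio: accumulated active confirmed cases per unit of
    maximum potential medical resources (here, staffed ICU beds).
    Higher values mean a heavier relative medical burden."""
    return active_cases / icu_beds if icu_beds else float("inf")

# Hypothetical (active cases, ICU beds) for two counties
counties = {
    "A": (1200, 40),
    "B": (300, 60),
}

# Rank counties from most to least burdened
ranked = sorted(counties, key=lambda c: deficiency_index(*counties[c]), reverse=True)
```

Normalizing by local capacity is what lets counties of very different sizes be compared on one dashboard: a small county with few ICU beds can score worse than a large county with more cases.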

    Big Earth data analytics: a survey

    Big Earth data are produced from satellite observations, Internet-of-Things sensors, model simulations, and other sources. The data embed unprecedented insights and spatiotemporal stamps of relevant Earth phenomena for improving our understanding of, response to, and handling of challenges in Earth sciences and applications. In recent years, new technologies such as cloud computing, big data, and artificial intelligence have gained momentum in addressing challenges of using big Earth data for scientific studies and geospatial applications that were historically intractable. This paper reviews big Earth data analytics from several aspects to capture the latest advancements in this fast-growing domain. We first introduce the concepts of big Earth data. The architecture, various functionalities, and supporting modules are then reviewed from a generic methodology perspective. Analytical methods supporting the functionalities are surveyed and analyzed in the context of different tools. The driving questions are exemplified through cutting-edge Earth science research and applications. A list of challenges and opportunities is proposed for different stakeholders to collaboratively advance big Earth data analytics in the near future.

    COVID-Scraper: An Open-Source Toolset for Automatically Scraping and Processing Global Multi-Scale Spatiotemporal COVID-19 Records

    In 2019, COVID-19 began to spread across the world, infecting millions of people and disrupting the normal lives of citizens in every country. Governments, organizations, and research institutions all over the world are dedicating vast resources to researching effective strategies to fight this rapidly propagating virus. With virus testing, most countries routinely publish the numbers of confirmed, death, and recovered cases, along with locations, through various channels and forms. This important data source has enabled researchers worldwide to perform different COVID-19 scientific studies, such as modeling the virus's spreading patterns, developing prevention strategies, and studying the impact of COVID-19 on other aspects of society. However, one major challenge is that there is no standardized, updated, and high-quality data product that covers COVID-19 case data internationally. This is because different countries may publish their data in unique channels, formats, and time intervals, which hinders researchers from fetching the necessary COVID-19 datasets effectively, especially for fine-scale studies. Although existing solutions such as the Johns Hopkins COVID-19 Dashboard and the 1point3acres COVID-19 tracker are widely used, it is difficult for users to access their original datasets and customize the data to meet specific requirements in categories, data structure, and data source selection. To address this challenge, we developed a toolset that uses cloud-based web scraping to automatically extract, refine, unify, and store COVID-19 case data at multiple scales for all available countries around the world. The toolset then publishes the data for public access, offering users a real-time, dynamic COVID-19 dataset with a global view. Two case studies are presented on how to utilize the datasets. With its open-source nature, the toolset can also be easily extended to fulfill other purposes.
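The "unify" step such a toolset depends on is mapping heterogeneous per-country feeds onto one standard schema. A minimal sketch with two hypothetical feed formats (the field names, region names, and sample records are invented for illustration; the real toolset scrapes live sources):

```python
import csv
import io
import json

# Two hypothetical source formats publishing the same kind of record
CSV_FEED = "region,confirmed,deaths\nAlpha,120,3\nBeta,45,1\n"
JSON_FEED = '[{"name": "Gamma", "cases": 80, "fatalities": 2}]'

def unify(csv_text, json_text):
    """Normalize heterogeneous feeds into one schema:
    {region, confirmed, deaths}."""
    records = []
    # Source 1: CSV with column headers matching the target schema
    for row in csv.DictReader(io.StringIO(csv_text)):
        records.append({"region": row["region"],
                        "confirmed": int(row["confirmed"]),
                        "deaths": int(row["deaths"])})
    # Source 2: JSON with different field names that must be remapped
    for item in json.loads(json_text):
        records.append({"region": item["name"],
                        "confirmed": int(item["cases"]),
                        "deaths": int(item["fatalities"])})
    return records

records = unify(CSV_FEED, JSON_FEED)
```

Each additional country feed becomes one more remapping branch writing into the same schema, which is what makes the published dataset uniform regardless of how each source formats its releases.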