
    An Evolutionary Approach to Adaptive Image Analysis for Retrieving and Long-term Monitoring Historical Land Use from Spatiotemporally Heterogeneous Map Sources

    Land use changes have become a major contributor to anthropogenic global change. The ongoing dispersion and concentration of the human species, unprecedented in their magnitude, have indisputably altered Earth’s surface and atmosphere. The effects are so salient and irreversible that a new geological epoch, following the interglacial Holocene, has been announced: the Anthropocene. While some scholars date its onset back to the Neolithic revolution, it is commonly placed in the late 18th century. The rapid development since the industrial revolution and its implications gave rise to an increasing awareness of extensive anthropogenic land change and led to an urgent need for sustainable strategies for land use and land management. By preserving landscape and settlement patterns at discrete points in time, archival geospatial data sources such as remote sensing imagery and, in particular, historical geotopographic maps can give evidence of the dynamic land use change during this crucial period. In this context, this thesis set out to explore the potential of retrospective geoinformation for monitoring, communicating, modeling and eventually understanding the complex and gradually evolving processes of land cover and land use change. Currently, large amounts of geospatial data sources such as archival maps are being made accessible online worldwide by libraries and national mapping agencies. Despite their abundance and relevance, the use of historical land use and land cover information in research is still often hindered by laborious visual interpretation, limiting the temporal and spatial coverage of studies. Thus, the core of the thesis is dedicated to the computational acquisition of geoinformation from archival map sources by means of digital image analysis. Based on a comprehensive review of the literature as well as the data and proposed algorithms, two major challenges for long-term retrospective information acquisition and change detection were identified: first, the diversity of geographical entity representations over space and time, and second, the uncertainty inherent to both the data source itself and its utilization for land change detection. To address the former challenge, image segmentation is considered a global non-linear optimization problem, and the segmentation methods and parameters are adjusted using a metaheuristic, evolutionary approach. For preserving adaptability in high-level image analysis, a hybrid model- and data-driven strategy, combining a knowledge-based and a neural net classifier, is recommended. To address the second challenge, a probabilistic object- and field-based change detection approach is developed for modeling the positional, thematic, and temporal uncertainty inherent to both data and processing. Experimental results indicate the suitability of the methodology in support of land change monitoring. In conclusion, potentials of application and directions for further research are given.
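
    The thesis frames segmentation as a global optimization problem whose methods and parameters are tuned by an evolutionary metaheuristic. The sketch below illustrates that general idea only: a toy (mu + lambda) evolution strategy over two made-up segmentation parameters with an invented agreement-based fitness; it is not the thesis's actual operators, parameters, or quality criteria.

```python
# Minimal sketch: tuning segmentation parameters with a (mu + lambda)
# evolution strategy. The image, segment() routine, and fitness measure are
# placeholders chosen only to make the example self-contained.
import random

def segment(image, threshold, offset):
    """Toy segmentation: label pixels above a shifted threshold."""
    return [[1 if px > threshold + offset else 0 for px in row] for row in image]

def fitness(image, reference, threshold, offset):
    """Fraction of pixels whose label agrees with a reference labelling."""
    seg = segment(image, threshold, offset)
    hits = sum(s == r for srow, rrow in zip(seg, reference) for s, r in zip(srow, rrow))
    return hits / (len(image) * len(image[0]))

def evolve(image, reference, generations=30, mu=5, lam=20):
    # Each individual is a (threshold, offset) parameter pair.
    population = [(random.uniform(0, 1), random.uniform(-0.2, 0.2)) for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            t, o = random.choice(population)
            # Gaussian mutation of both parameters.
            offspring.append((t + random.gauss(0, 0.05), o + random.gauss(0, 0.02)))
        # Keep the mu best individuals of parents and offspring.
        population = sorted(population + offspring,
                            key=lambda p: fitness(image, reference, *p),
                            reverse=True)[:mu]
    return population[0]

if __name__ == "__main__":
    img = [[random.random() for _ in range(16)] for _ in range(16)]
    ref = [[1 if px > 0.5 else 0 for px in row] for row in img]
    print("best parameters:", evolve(img, ref))
```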

    Electronic Warfare Receiver Resource Management and Optimization

    Optimization of electronic warfare (EW) receiver scan strategies is critical to improving the probability of surviving military missions in hostile environments. The problem is the limited understanding of how dynamic variations in radar and EW receiver characteristics influence the response time to detect enemy threats. The dependent variable was the EW receiver response time, and the four independent variables were EW receiver revisit interval, EW receiver dwell time, radar scan time, and radar illumination time. Previous researchers have not explained how dynamic variations of the independent variables affect response time. The purpose of this experimental study was to develop a model to understand how dynamic variations of the independent variables influence response time. Queuing theory provided the theoretical foundation for the study, with Little's formula used to determine the ideal EW receiver revisit interval, as it states the mathematical relationship among the variables. Findings from a simulation that produced 17,000 data points indicated that Little's formula is valid for use in EW receivers. Findings also demonstrated that variation of the independent variables had a small but statistically significant effect on the average response time. The most significant finding was the sensitivity of the variance of response time to minor differences in the test conditions, which can lead to unexpectedly long response times. Military users and designers of EW systems benefit most from this study by optimizing system response time, thus improving survivability. Additionally, this research demonstrated a method that may improve EW product development times and reduce the cost to taxpayers through more efficient test and evaluation techniques.
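
    For reference, Little's formula from queuing theory, which the study uses to relate the variables; the reading of the symbols in terms of EW receiver quantities in the comments below is an interpretive assumption, not the study's own notation.

```latex
% Little's law: the long-run average number of items L in a stationary
% queueing system equals the arrival rate \lambda multiplied by the average
% time W an item spends in the system.
\[
  L = \lambda \, W
\]
% Interpretive reading (assumption): if threat illuminations arrive at rate
% \lambda and each must be serviced within an average response time W, then
% L bounds how many concurrent detections the receiver's scan strategy
% (revisit interval and dwell time) must accommodate.
```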

    Application of Geographic Information Systems

    The importance of Geographic Information Systems (GIS) can hardly be overemphasized in today’s academic and professional arena. More professionals and academics are using GIS than ever before – urban and regional planners, civil engineers, geographers, spatial economists, sociologists, environmental scientists, criminal justice professionals, political scientists, and the like. As such, it is extremely important to understand the theories and applications of GIS in our teaching, professional work, and research. “The Application of Geographic Information Systems” presents research findings that explain GIS applications in different subfields of the social sciences. With several case studies conducted in different parts of the world, the book blends the theories of GIS with their practical implementation under different conditions. It deals with GIS applications across the broad spectrum of geospatial analysis and modeling, water resources analysis, land use analysis, and infrastructure network analysis, such as transportation and water distribution networks. The book is expected to be a useful source of knowledge for users of GIS who envision its application in their teaching and research. This easy-to-understand book is surely not an end in itself, but a small contribution toward our understanding of the rich and wonderful subject of GIS.

    The Realism of Algorithmic Human Figures: A Study of Selected Examples, 1964 to 2001

    It is more than forty years since the first wireframe images of the Boeing Man revealed a stylized human pilot in a simulated pilot's cabin. Since then, it has almost become standard to include scenes in Hollywood movies that incorporate virtual human actors. A trait particularly recognizable in the games industry worldwide is the eagerness to render athletic muscular young men, and young women with hourglass body shapes, traversing dangerous cyberworlds as invincible heroic figures. Tremendous efforts in algorithmic modeling, animation and rendering are spent to produce a realistic and believable appearance for these algorithmic humans. This thesis develops two main strands of research by interpreting a selection of examples. Firstly, in the computer graphics context, it documents the development over those forty years of the creation of naturalistic image appearance (usually called photorealism). In particular, it describes and reviews the impact of key algorithms in the course of the journey of algorithmic human figures towards realism. Secondly, taking a historical perspective, this work provides an analysis of computer graphics in relation to the concept of realism. A comparison of realistic images of human figures throughout history with their algorithmically generated counterparts shows that computer graphics has learned from previous and contemporary art movements such as photorealism, but has also taken elements, symbols and properties out of context from these art movements with a questionable naivety. Therefore, this work also offers a critique of the justification for their typical conceptualization in computer graphics. Although the astounding technical achievements in the field of algorithmically generated human figures are paralleled by an equally astounding disregard for the history of visual culture, from the beginnings in 1964 until the breakthrough in 2001, in the period of the digital information processing machine, a new approach has emerged to meet the apparently incessant desire of humans to create artificial counterparts of themselves. Conversely, the theories of traditional realism have to be extended to include the new problems that these active algorithmic human figures present.

    Segmentation Based Classification of Airborne Laser Scanner Data


    Machine Learning Approaches for Natural Resource Data

    Real-life applications involving the efficient management of natural resources depend on accurate geographical information. This information is usually obtained by manual on-site data collection, by automatic remote sensing methods, or by a mixture of the two. Besides accurate data collection, natural resource management also requires detailed analysis of this data, which in the era of data floods can be a cumbersome process. With the rising trend in both computational power and storage capacity, together with falling hardware prices, data-driven decision analysis plays an ever greater role. In this thesis, we examine the predictability of terrain trafficability conditions and forest attributes using a machine learning approach with geographic information system data. Quantitative measures of the prediction performance for terrain conditions using natural resource data sets are given for five distinct research areas located around Finland. Furthermore, the estimation capability for key forest attributes is inspected with a multitude of modeling and feature selection techniques. The research results provide empirical evidence on whether the natural resource data used is sufficiently accurate for practical applications, or whether further refinement of the data is needed. The results are important especially to the forest industry, since even slight improvements to the natural resource data sets utilized in practice can result in large savings in operation time and costs. Model evaluation is also addressed in this thesis by proposing a novel method for estimating the prediction performance of spatial models. Classical goodness-of-fit measures usually rely on the assumption of independently and identically distributed data samples, a characteristic which normally does not hold for spatial data sets. Spatio-temporal data sets contain an intrinsic property called spatial autocorrelation, which is partly responsible for breaking these assumptions. The proposed cross-validation-based evaluation method provides a model performance estimate in which the optimistic bias due to spatial autocorrelation is decreased by partitioning the data sets in a suitable way. Keywords: Open natural resource data, machine learning, model evaluation.
    Tiivistelmä (Finnish abstract): Practical applications that involve the management of natural resources depend on accurate geographic data. This data is often collected manually on site, with automatic remote sensing methods, or with a combination of the two. In addition to collecting accurate data, natural resource management also requires its detailed analysis, which in the era of the data flood can be a demanding process. With increasing computational power and storage capacity and falling hardware prices, data-driven decision making plays an ever larger role. This dissertation studies the predictability of terrain trafficability and forest attributes using machine learning methods together with geographic data sets. The prediction of terrain trafficability is measured quantitatively using remote sensing data from five different research areas around Finland. We also examine the predictability of the most important forest attributes with a variety of modeling techniques and feature selection. The results of the dissertation provide empirical evidence on whether the natural resource data used is of sufficient quality for practical applications or not. The results are important especially for the forest industry, because even small improvements to the natural resource data used in practice can lead to large savings in both operation time and costs. This work also addresses model evaluation by presenting a new method for estimating the prediction performance of spatial models. Classical model selection criteria generally rely on the assumption of independently and identically distributed data samples, which most often does not hold for spatial data sets. Spatio-temporal data sets contain an intrinsic property called spatial autocorrelation, which is partly responsible for violating these assumptions. The proposed cross-validation-based evaluation method provides a measure of model prediction performance in which the effect of spatial autocorrelation is reduced by partitioning the data sets in a suitable way. Keywords: open natural resource data, machine learning, model evaluation.
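
    The proposed evaluation method partitions spatial data so that autocorrelated neighbours do not end up split between training and test sets. Below is a minimal, hypothetical sketch of that general idea (spatially blocked cross-validation with an invented grid size and a trivial mean predictor); it is not the dissertation's actual partitioning scheme or models.

```python
# Minimal sketch of spatially blocked cross-validation: samples are assigned
# to folds by the grid cell their coordinates fall in, so training and test
# points are not immediate neighbours. Cell size and the mean predictor are
# illustrative assumptions only.
import random
from collections import defaultdict

def spatial_folds(coords, cell_size, n_folds):
    """Assign each sample index to a fold based on its spatial grid cell."""
    cells = defaultdict(list)
    for i, (x, y) in enumerate(coords):
        cells[(int(x // cell_size), int(y // cell_size))].append(i)
    folds = [[] for _ in range(n_folds)]
    for k, cell in enumerate(cells.values()):  # whole cells go to one fold
        folds[k % n_folds].extend(cell)
    return folds

def cross_validate(coords, targets, cell_size=10.0, n_folds=5):
    folds = spatial_folds(coords, cell_size, n_folds)
    errors = []
    for test in folds:
        train = [i for fold in folds if fold is not test for i in fold]
        mean = sum(targets[i] for i in train) / len(train)  # trivial model
        errors.extend(abs(targets[i] - mean) for i in test)
    return sum(errors) / len(errors)

if __name__ == "__main__":
    pts = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(500)]
    vals = [x + y + random.gauss(0, 5) for x, y in pts]
    print("blocked-CV mean absolute error:", round(cross_validate(pts, vals), 2))
```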

    Spectral-Spatial Analysis of Remote Sensing Data: An Image Model and A Procedural Design

    The distinguishing property of remotely sensed data is the multivariate information coupled with a two-dimensional pictorial representation amenable to visual interpretation. The contribution of this work is the design and implementation of various schemes that exploit this property. This dissertation comprises two distinct parts. The essence of Part One is the algebraic solution for the partition function of a high-order lattice model of a two-dimensional binary particle system. The contribution of Part Two is the development of a procedural framework to guide multispectral image analysis. The characterization of binary (black and white) images with little semantic content is discussed in Part One. Measures of certain observable properties of binary images are proposed. A lattice model is introduced, the solution to which yields functional mappings from the model parameters to the measurements on the image. Simulation of the model is explained, as is its usage in the design of Bayesian priors to bias classification analysis of spectral data. The implication of such a bias is that spatially adjacent remote sensing data are identified as belonging to the same class with high likelihood. Experiments illustrating the benefit of using the model in multispectral image analysis are also discussed. The second part of this dissertation presents a procedural schema for remote sensing data analysis. It is believed that the data crucial to a successful analysis is provided by the human, as an interpretation of the image representation of the remote sensing spectral data. Subsequently, emphasis is laid on the design of an intelligent implementation of existing algorithms, rather than the development of new algorithms for analysis. The development introduces hyperspectral analysis as a problem requiring multi-source data fusion and presents a process model to guide the design of a solution. Part Two concludes with an illustration of the schema as used in the classification analysis of a given hyperspectral data set.
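
    Part One uses a lattice model as a Bayesian prior that biases classification towards spatially coherent labels. The sketch below shows only the generic form of such a bias, an ICM-style update with a simple pairwise neighbourhood term; the high-order lattice model, its partition function, and the parameter values from the dissertation are not reproduced, and the likelihoods here are synthetic.

```python
# Minimal sketch of a spatial smoothness prior in classification: each
# pixel's label balances its spectral evidence against agreement with its
# 4-neighbours (an iterated-conditional-modes style update). beta and the
# likelihoods are illustrative assumptions.
import random

def icm(likelihood, labels, beta=1.5, sweeps=5):
    """likelihood[r][c][k]: evidence for class k at pixel (r, c)."""
    rows, cols, k = len(likelihood), len(likelihood[0]), len(likelihood[0][0])
    for _ in range(sweeps):
        for r in range(rows):
            for c in range(cols):
                neigh = [labels[rr][cc]
                         for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                         if 0 <= rr < rows and 0 <= cc < cols]
                # Evidence term plus a reward for agreeing with neighbours.
                scores = [likelihood[r][c][cls] + beta * neigh.count(cls)
                          for cls in range(k)]
                labels[r][c] = scores.index(max(scores))
    return labels

if __name__ == "__main__":
    # Two classes; noisy evidence favours class 1 on the right half of a 20x20 grid.
    lik = [[[(1.0 if c < 10 else 0.0) + random.gauss(0, 0.8),
             (0.0 if c < 10 else 1.0) + random.gauss(0, 0.8)]
            for c in range(20)] for r in range(20)]
    init = [[max(range(2), key=lambda cls: lik[r][c][cls]) for c in range(20)]
            for r in range(20)]
    smoothed = icm(lik, init)
    print(sum(row.count(1) for row in smoothed), "pixels labelled class 1")
```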

    Carbon-Dioxide Pipeline Infrastructure Route Optimization And Network Modeling For Carbon Capture Storage And Utilization

    Carbon capture, utilization, and storage (CCUS) is a technology value chain which can help reduce CO2 emissions while ensuring sustainable development of the energy and industrial sectors. However, CCUS requires large-scale deployment of infrastructure for capturing feasible amounts of CO2, which can be capital intensive for stakeholders. In addition, CCUS deployment leads to the development of extensive pipeline corridors, which can be inconsistent with the requirements for future CCUS infrastructure expansion. With the implementation and growth of CCUS technology in the states of North Dakota, Montana, Wyoming, Colorado and Utah in mind, this dissertation has two major goals: (a) to identify feasible corridors for CO2 pipelines; and (b) to develop a CCUS infrastructure network which minimizes project cost. To address these goals, the dissertation introduces the CCSHawk methodology, which develops pipeline routes and CCUS infrastructure networks using a variety of techniques such as multi-criteria decision analysis (MCDA), graph network algorithms, natural language processing and linear network optimization. The pipeline route and CCUS network model are designed using open-source data, specifically geo-information, emission quantities and reservoir properties. The MCDA of the study area reveals that North Dakota, central Wyoming and eastern Colorado have the largest amount of land suitable for CO2 pipeline corridors. The optimized graph network routing algorithm reduces the overall length of pipeline routes by an average of 4.23% compared to traditional routing algorithms while maintaining low environmental impact. The linear optimization of the CCUS infrastructure shows that the cost of implementing the technology in the study area can vary between $24.05/tCO2 and $42/tCO2 for capturing 20 to 90 Mt of CO2. The analysis also reveals a declining economic impact of existing pipeline infrastructure on the future growth of CCUS networks, ranging between $0.01 and $1.62/tCO2 with increasing CO2 capture targets. This research is significant, as it establishes a technique for pipeline route modeling and CCUS economic analysis that is highly adaptable to various geographic regions. To the best of the author's knowledge, it is also the first economic analysis that considers the effect of pre-existing infrastructure on the growth of CCUS technology for the region. Furthermore, the pipeline route model establishes a schema for considering not only environmental factors but also ecological factors for the study area.
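
    One building block of corridor identification is turning a suitability or cost surface into a graph and extracting a least-cost path. The sketch below shows only that generic step, Dijkstra's algorithm on an invented cost raster; it is not the CCSHawk routing model, which additionally incorporates MCDA criteria and network-level optimization.

```python
# Minimal sketch of least-cost pipeline corridor routing: a suitability/cost
# raster is treated as a weighted grid graph and Dijkstra's algorithm returns
# the cheapest source-to-sink path. The cost surface and 4-connected moves
# are illustrative assumptions.
import heapq
import random

def least_cost_path(cost, start, goal):
    rows, cols = len(cost), len(cost[0])
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= rr < rows and 0 <= cc < cols:
                nd = d + cost[rr][cc]  # cost of entering the neighbouring cell
                if nd < dist.get((rr, cc), float("inf")):
                    dist[(rr, cc)] = nd
                    prev[(rr, cc)] = node
                    heapq.heappush(heap, (nd, (rr, cc)))
    path, node = [], goal  # walk predecessors back to the start
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[goal]

if __name__ == "__main__":
    grid = [[random.uniform(1, 10) for _ in range(30)] for _ in range(30)]
    route, total = least_cost_path(grid, (0, 0), (29, 29))
    print(len(route), "cells, total corridor cost", round(total, 1))
```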

    Report from the MPP Working Group to the NASA Associate Administrator for Space Science and Applications

    NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the relationship of the work to theory, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for the space station, EOS, and the Great Observatories era.

    Considerations for a design and operations knowledge support system for Space Station Freedom

    Engineering and operations of modern engineered systems depend critically upon detailed design and operations knowledge that is accurate and authoritative. A design and operations knowledge support system (DOKSS) is a modern computer-based information system that provides knowledge about the creation, evolution, and growth of an engineered system. The purpose of a DOKSS is to provide convenient and effective access to this multifaceted information. The complexity of Space Station Freedom's (SSF's) systems, elements, interfaces, and organizations makes convenient access to design knowledge especially important when compared to simpler systems. The life cycle length of 30 or more years adds a new dimension to space operations, maintenance, and evolution. Provided here is a review and discussion of design knowledge support systems to be delivered and operated as a critical part of the engineered system. A concept of a DOKSS for Space Station Freedom (SSF) is presented. This is followed by a detailed discussion of a DOKSS for the Lyndon B. Johnson Space Center and the Work Package-2 portions of SSF.