    Coastal management and adaptation: an integrated data-driven approach

    Coastal regions are some of the most exposed to environmental hazards, yet the coast is the preferred settlement site for a high percentage of the global population, and most major global cities are located on or near the coast. This research adopts a predominantly anthropocentric approach to the analysis of coastal risk and resilience, centred on the pervasive hazards of coastal flooding and erosion. Coastal management decision-making practices are shown to be reliant on access to current and accurate information. However, constraints have been imposed on information flows between scientists, policy makers and practitioners, due to a lack of awareness and utilisation of available data sources. This research seeks to tackle this issue by evaluating how innovations in the use of data and analytics can further the application of science within decision-making processes related to coastal risk adaptation. In achieving this aim, a range of research methodologies has been employed, and the progression of topics covered marks a shift from themes of risk to resilience. The work focuses on a case study region of East Anglia, UK, benefiting from the input of a partner organisation responsible for the region's coasts: Coastal Partnership East. An initial review revealed how data can be utilised effectively within coastal decision-making practices, highlighting scope for the application of advanced Big Data techniques to the analysis of coastal datasets. The process of risk evaluation has been examined in detail, and the range of possibilities afforded by open-source coastal datasets was revealed. Subsequently, open-source coastal terrain and bathymetric point cloud datasets were identified for 14 sites within the case study area. These were then utilised within a practical application of a geomorphological change detection (GCD) method. This revealed how analysis of high spatial and temporal resolution point cloud data can accurately reveal and quantify physical coastal impacts. Additionally, the research reveals how data innovations can facilitate adaptation through insurance; more specifically, how the use of empirical evidence in the pricing of coastal flood insurance can result in both communication and distribution of risk. The various strands of knowledge generated throughout this study reveal how an extensive range of data types, sources, and advanced forms of analysis can together allow coastal resilience assessments to be founded on empirical evidence. This research serves to demonstrate how the application of advanced data-driven analytical processes can reduce the levels of uncertainty and subjectivity inherent within current coastal environmental management practices. Adoption of the methods presented within this research could further the possibilities for sustainable and resilient management of the incredibly valuable environmental resource which is the coast.
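
    The geomorphological change detection (GCD) step described above amounts, at its core, to differencing successive elevation surfaces and discarding changes that fall below a limit of detection. The Python sketch below illustrates that idea under stated assumptions: the point clouds are taken to have already been gridded into aligned elevation arrays, the 0.2 m limit of detection and 1 m cell size are illustrative values, and the synthetic surfaces merely stand in for real survey data; it is not the workflow implemented in the thesis.

        # Minimal DEM-of-difference sketch of a GCD analysis (illustrative values only).
        import numpy as np

        def dem_of_difference(dem_t0, dem_t1, lod=0.2):
            """Return elevation change (t1 - t0), masking changes below the limit of detection."""
            dod = dem_t1 - dem_t0                                     # raw elevation change per cell (m)
            return np.where(np.abs(dod) >= lod, dod, np.nan)

        def erosion_deposition_volumes(dod, cell_area):
            """Summarise eroded and deposited volumes (m^3) from a thresholded DoD grid."""
            change = np.nan_to_num(dod, nan=0.0)                      # masked (sub-LoD) cells count as no change
            erosion = -change[change < 0].sum() * cell_area
            deposition = change[change > 0].sum() * cell_area
            return erosion, deposition

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            earlier = rng.normal(5.0, 0.5, size=(100, 100))           # stand-in for an earlier survey grid
            later = earlier + rng.normal(-0.1, 0.3, size=(100, 100))  # stand-in for a later survey grid
            dod = dem_of_difference(earlier, later, lod=0.2)
            eroded, deposited = erosion_deposition_volumes(dod, cell_area=1.0)  # 1 m grid cells assumed
            print(f"eroded: {eroded:.1f} m^3, deposited: {deposited:.1f} m^3")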

    Automatic control program creation using concurrent Evolutionary Computing

    Over the past decade, Genetic Programming (GP) has been the subject of a significant amount of research, but this has resulted in the solution of few complex real-world problems. In this work, I propose that, for some relatively simple, non-safety-critical embedded control applications, GP can be used as a practical alternative to software developed by humans. Embedded control software has become a branch of software engineering with distinct temporal, interface and resource constraints and requirements. This results in a characteristic software structure, and by examining this, the effective decomposition of an overall problem into a number of smaller, simpler problems is performed. It is this type of problem amelioration that is suggested as a method whereby certain real-world problems may be rendered into a soluble form suitable for GP. In the course of this research, the body of published GP literature was examined and the most important changes to the original GP technique of Koza are noted; particular focus is placed upon GP techniques involving an element of concurrency, which is central to this work. This search highlighted few applications of GP for the creation of software for complex, real-world problems; this was especially true in the case of multi-thread, multi-output solutions. To demonstrate this idea, a concurrent Linear GP (LGP) system was built that creates a multiple-input, multiple-output solution using a custom low-level evolutionary language set, combining both continuous and Boolean data types. The system uses a multi-tasking model to evolve and execute the required LGP code for each system output using separate populations. Two example problems, a simple fridge controller and a more complex washing machine controller, are described, and the problems encountered and overcome during the successful solution of these problems are detailed. The operation of the complete, evolved washing machine controller is simulated using a graphical LabVIEW application. The aim of this research is to propose a general-purpose system for the automatic creation of control software for use in a range of problems from the target problem class, without requiring any system tuning. In order to assess the system's search performance sensitivity, experiments were performed using various population and LGP string sizes; the experimental data collected were also used to examine the utility of abandoning stalled searches and restarting. This work is significant because it identifies a realistic application of GP that can ease the burden of finite human software design resources, whilst capitalising on accelerating computing potential.
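
    The system described above rests on a linear GP representation (register-machine programs) with a separate evolving population for each system output. The sketch below illustrates that structure under stated assumptions: the instruction set, population and program sizes, mutation-only variation, thread-pool concurrency as a stand-in for the multi-tasking model, and the toy two-output controller targets are all illustrative choices rather than the evolutionary language set or problems used in this work.

        # Minimal register-machine linear GP, one population per output (illustrative only).
        import random
        from concurrent.futures import ThreadPoolExecutor

        OPS = ("add", "sub", "mul", "gt")   # mixed continuous/Boolean-style operators
        N_REG, PROG_LEN, POP, GENS = 4, 8, 60, 40

        def random_instruction():
            return (random.choice(OPS), random.randrange(N_REG),
                    random.randrange(N_REG), random.randrange(N_REG))

        def random_program():
            return [random_instruction() for _ in range(PROG_LEN)]

        def run(program, inputs):
            reg = list(inputs) + [0.0] * (N_REG - len(inputs))
            for op, d, a, b in program:
                if op == "add":
                    reg[d] = reg[a] + reg[b]
                elif op == "sub":
                    reg[d] = reg[a] - reg[b]
                elif op == "mul":
                    reg[d] = reg[a] * reg[b]
                else:                                      # "gt": Boolean comparison encoded as 0.0/1.0
                    reg[d] = 1.0 if reg[a] > reg[b] else 0.0
            return reg[0]                                  # register 0 holds the program's output

        def evolve_output(target, cases):
            """Evolve one LGP program whose output matches `target` over the fitness cases."""
            def fitness(p):
                return sum((run(p, c) - target(c)) ** 2 for c in cases)
            pop = [random_program() for _ in range(POP)]
            for _ in range(GENS):
                pop.sort(key=fitness)                      # truncation selection; mutation-only for brevity
                parents = pop[: POP // 2]
                children = []
                for parent in parents:
                    child = list(parent)
                    child[random.randrange(PROG_LEN)] = random_instruction()   # point mutation
                    children.append(child)
                pop = parents + children
            return min(pop, key=fitness)

        if __name__ == "__main__":
            # Toy two-output "controller": both outputs are driven by (temperature, setpoint) inputs.
            cases = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]
            targets = {"compressor": lambda c: 1.0 if c[0] > c[1] else 0.0,       # cool if too warm
                       "alarm": lambda c: 1.0 if c[0] - c[1] > 3.0 else 0.0}      # far above setpoint
            with ThreadPoolExecutor() as pool:             # one evolving population per system output
                best = dict(zip(targets, pool.map(lambda t: evolve_output(t, cases), targets.values())))
            for name, prog in best.items():
                err = sum((run(prog, c) - targets[name](c)) ** 2 for c in cases)
                print(f"{name}: evolved program error = {err:.2f}")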

    Negotiation is Changing

    Many changes – those we notice, and those that escape our attention until we are quite a way down a new path – are only the tip of the iceberg of the change that individuals and society are experiencing as a result of the technological developments of the past couple of decades. Introducing technology into every area of our lives, every aspect of our work, and every pocket of our clothes has far-reaching effects, which researchers are only just now uncovering. We are not only changing our behaviors; we are being changed by our new behaviors: We now conduct our banking and shopping online; at the same time, we have changed in the degree of trust we have in technologically-mediated handling of our financial resources. We are not only interacting in new ways; we have created new communicative paths for supporting such interaction: While this may have been dismissed in the past as informal forms of slang used by younger people, many of us are, by now, familiar with a substantial dictionary of internet-age abbreviations; similarly, emoticons have emerged from a smiley and a frowning face into a highly nuanced set of emoji mini-images, capable of supporting entire messages, full conversations, and even literature. We are not only putting our bodies and our brains to work in new ways; our bodies, and especially our brains, are physiologically changing to adapt to these uses: Our brains are mapping out new neurological networks, developing some areas of the brain at the expense of others.

    A Field Guide to Genetic Programming

    xiv, 233 p. : il. ; 23 cm. Electronic book. A Field Guide to Genetic Programming (ISBN 978-1-4092-0073-4) is an introduction to genetic programming (GP). GP is a systematic, domain-independent method for getting computers to solve problems automatically, starting from a high-level statement of what needs to be done. Using ideas from natural evolution, GP starts from an ooze of random computer programs and progressively refines them through processes of mutation and sexual recombination, until solutions emerge, all without the user having to know or specify the form or structure of solutions in advance. GP has generated a plethora of human-competitive results and applications, including novel scientific discoveries and patentable inventions. Contents: 1. Introduction. Part I, Basics: 2. Representation, initialisation and operators in tree-based GP; 3. Getting ready to run genetic programming; 4. Example genetic programming run. Part II, Advanced genetic programming: 5. Alternative initialisations and operators in tree-based GP; 6. Modular, grammatical and developmental tree-based GP; 7. Linear and graph genetic programming; 8. Probabilistic genetic programming; 9. Multi-objective genetic programming; 10. Fast and distributed genetic programming; 11. GP theory and its applications. Part III, Practical genetic programming: 12. Applications; 13. Troubleshooting GP; 14. Conclusions: tricks of the trade. Appendices: A. Resources; B. TinyGP. Bibliography. Index.
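
    The GP loop summarised above, in which a population of random program trees is refined by fitness-based selection, subtree crossover and subtree mutation, can be made concrete in a few dozen lines. The sketch below is a minimal tree-based GP for a toy symbolic regression target (x^2 + x + 1); the function and terminal sets, parameter values and crude bloat control are illustrative assumptions, and it is not the book's TinyGP implementation.

        # Minimal tree-based GP for symbolic regression of x^2 + x + 1 (illustrative only).
        import copy
        import operator
        import random

        FUNCS = {"add": operator.add, "sub": operator.sub, "mul": operator.mul}
        TERMS = ["x", 1.0]

        def random_tree(depth=3):
            """Grow a random program tree from the function and terminal sets."""
            if depth == 0 or random.random() < 0.3:
                return random.choice(TERMS)
            return [random.choice(list(FUNCS)), random_tree(depth - 1), random_tree(depth - 1)]

        def evaluate(tree, x):
            if tree == "x":
                return x
            if isinstance(tree, float):
                return tree
            op, left, right = tree
            return FUNCS[op](evaluate(left, x), evaluate(right, x))

        def nodes(tree, path=()):
            """Yield the path (tuple of child indices) of every node in the tree."""
            yield path
            if isinstance(tree, list):
                for i, child in enumerate(tree[1:], start=1):
                    yield from nodes(child, path + (i,))

        def subtree_at(tree, path):
            for i in path:
                tree = tree[i]
            return tree

        def replace_at(tree, path, new):
            if not path:
                return new
            tree = copy.deepcopy(tree)
            node = tree
            for i in path[:-1]:
                node = node[i]
            node[path[-1]] = new
            return tree

        def crossover(a, b, max_nodes=50):
            """Subtree crossover: graft a random subtree of b into a random point of a."""
            donor = copy.deepcopy(subtree_at(b, random.choice(list(nodes(b)))))
            child = replace_at(a, random.choice(list(nodes(a))), donor)
            return child if sum(1 for _ in nodes(child)) <= max_nodes else copy.deepcopy(a)  # crude bloat control

        def mutate(tree):
            """Subtree mutation: replace a random node with a freshly grown subtree."""
            return replace_at(tree, random.choice(list(nodes(tree))), random_tree(depth=2))

        def fitness(tree):
            """Sum of squared errors against the toy target x^2 + x + 1 (lower is better)."""
            xs = [i / 10.0 for i in range(-10, 11)]
            return sum((evaluate(tree, x) - (x * x + x + 1.0)) ** 2 for x in xs)

        def tournament(pop, k=3):
            return min(random.sample(pop, k), key=fitness)

        if __name__ == "__main__":
            random.seed(1)
            pop = [random_tree() for _ in range(100)]
            for _ in range(20):
                new_pop = []
                for _ in range(len(pop)):
                    child = crossover(tournament(pop), tournament(pop))
                    if random.random() < 0.2:              # occasional subtree mutation
                        child = mutate(child)
                    new_pop.append(child)
                pop = new_pop
            best = min(pop, key=fitness)
            print("best error:", round(fitness(best), 3))
            print("best tree:", best)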

    Seeing the City Digitally

    This book explores what's happening to ways of seeing urban spaces in the contemporary moment, when so many of the technologies through which cities are visualised are digital. Cities have always been pictured, in many media and for many different purposes. This edited collection explores how that picturing is changing in an era of digital visual culture. Analogue visual technologies like film cameras were understood as creating some sort of a trace of the real city. Digital visual technologies, in contrast, harvest and process digital data to create images that are constantly refreshed, modified and circulated. Each of the chapters in this volume examines a different example of how this processual visuality is reconfiguring the spatial and temporal organisation of urban life.

    Intelligent Transportation Related Complex Systems and Sensors

    Building around innovative services related to different modes of transport and traffic management, intelligent transport systems (ITS) are being widely adopted worldwide to improve the efficiency and safety of the transportation system. They enable users to be better informed and to make safer, more coordinated, and smarter decisions on the use of transport networks. Current ITSs are complex systems, made up of several components/sub-systems characterized by time-dependent interactions among themselves. Some examples of these transportation-related complex systems include: road traffic sensors, autonomous/automated cars, smart cities, smart sensors, virtual sensors, traffic control systems, smart roads, logistics systems, smart mobility systems, and many others that are emerging from niche areas. The efficient operation of these complex systems requires: i) efficient solutions to the issues of sensors/actuators used to capture and control the physical parameters of these systems, as well as the quality of data collected from these systems; ii) tackling complexities using simulations and analytical modelling techniques; and iii) applying optimization techniques to improve the performance of these systems. This book includes twenty-four papers, which cover scientific concepts, frameworks, architectures and various other ideas on analytics, trends and applications of transportation-related data.

    Proceedings of the 2004 ONR Decision-Support Workshop Series: Interoperability

    In August of 1998 the Collaborative Agent Design Research Center (CADRC) of the California Polytechnic State University in San Luis Obispo (Cal Poly) approached Dr. Phillip Abraham of the Office of Naval Research (ONR) with the proposal for an annual workshop focusing on emerging concepts in decision-support systems for military applications. The proposal was considered timely by the ONR Logistics Program Office for at least two reasons. First, rapid advances in information systems technology over the past decade had produced distributed collaborative computer-assistance capabilities with profound potential for providing meaningful support to military decision makers. Indeed, some systems based on these new capabilities, such as the Integrated Marine Multi-Agent Command and Control System (IMMACCS) and the Integrated Computerized Deployment System (ICODES), had already reached the field-testing and final product stages, respectively. Second, over the past two decades the US Navy and Marine Corps had been increasingly challenged by missions demanding the rapid deployment of forces into hostile or devastated territories with minimal or non-existent indigenous support capabilities. Under these conditions Marine Corps forces had to rely mostly, if not entirely, on sea-based support and sustainment operations. Particularly today, operational strategies such as Operational Maneuver From The Sea (OMFTS) and Sea To Objective Maneuver (STOM) are very much in need of intelligent, near real-time and adaptive decision-support tools to assist military commanders and their staff under conditions of rapid change and overwhelming data loads. In the light of these developments the Logistics Program Office of ONR considered it timely to provide an annual forum for the interchange of ideas, needs and concepts that would address the decision-support requirements and opportunities in combined Navy and Marine Corps sea-based warfare and humanitarian relief operations. The first ONR Workshop was held April 20-22, 1999 at the Embassy Suites Hotel in San Luis Obispo, California. It focused on advances in technology with particular emphasis on an emerging family of powerful computer-based tools, and concluded that the most able members of this family of tools appear to be computer-based agents that are capable of communicating within a virtual environment of the real world. From 2001 onward the venue of the Workshop moved from the West Coast to Washington, and in 2003 the sponsorship was taken over by ONR's Littoral Combat/Power Projection (FNC) Program Office (Program Manager: Mr. Barry Blumenthal). Themes and keynote speakers of past Workshops have included: 1999, 'Collaborative Decision Making Tools': Vadm Jerry Tuttle (USN Ret.); LtGen Paul Van Riper (USMC Ret.); Radm Leland Kollmorgen (USN Ret.); and Dr. Gary Klein (Klein Associates). 2000, 'The Human-Computer Partnership in Decision-Support': Dr. Ronald DeMarco (Associate Technical Director, ONR); Radm Charles Munns; Col Robert Schmidle; and Col Ray Cole (USMC Ret.). 2001, 'Continuing the Revolution in Military Affairs': Mr. Andrew Marshall (Director, Office of Net Assessment, OSD); and Radm Jay M. Cohen (Chief of Naval Research, ONR). 2002, 'Transformation ...': Vadm Jerry Tuttle (USN Ret.); and Steve Cooper (CIO, Office of Homeland Security). 2003, 'Developing the New Infostructure': Richard P. Lee (Assistant Deputy Under Secretary, OSD); and Michael O'Neil (Boeing). 2004, 'Interoperability': MajGen Bradley M. Lott (USMC), Deputy Commanding General, Marine Corps Combat Development Command; Donald Diggs, Director, C2 Policy, OASD (NII)

    Innovation in Ship Design

    What is innovation in ship design? Is it a capability that is inherent in all naval architects? Is it the result of the application of a certain set of tools, or of operation within a certain organizational structure? Can innovation be taught? Innovation is a creative act that results in a new and game-changing product. The emergence of an innovative product creates an asymmetric market. The emergence of an innovative weapon creates an asymmetric battlefield. It is clearly in the economic and military interest of the United States to be able to develop and deploy innovative products, including innovative ships. But the process of ship design is usually one of incremental development and slow evolution. Engineers are taught to develop their product by paying close attention to previous developments. This approach is viewed by some people as anti-innovative. And yet the author has made a career of innovation in ship design. How has this been possible? This dissertation will answer the four questions posed above. It will show what innovation in ship design is, and where innovative naval architecture lies in the taxonomy of human creative endeavor. It will then describe those human attributes which have been found to be essential to successful innovation. It will also describe some of the many tools that innovators use. Some of those tools are used unconsciously. Some of those tools are formal products supported by research institutes and teaching academies. Finally, given that innovation in ship design is a component of engineering – which is a subject taught in universities – and that it is facilitated by the use of tools – and tool use can be taught – the author will conclude that innovation itself can be taught. Whether it can be mastered will depend upon the individual, just as with most other creative skills.