
    Quantifying fisher responses to environmental and regulatory dynamics in marine systems

    Thesis (Ph.D.), University of Alaska Fairbanks, 2017

    Commercial fisheries are part of an inherently complicated cycle. As fishers have adopted new technologies and larger vessels to compete for resources, fisheries managers have adapted regulatory structures to sustain stocks and to mitigate unintended impacts of fishing (e.g., bycatch). Meanwhile, the ecosystems targeted by fishers are affected by a changing climate, which in turn forces fishers to adapt further and, subsequently, requires regulations to be updated. From the management side, one of the great limitations for understanding how changes in fishery environments or regulations impact fishers has been a lack of sufficient data for resolving their behaviors. In some fisheries, observer programs have provided sufficient data for monitoring the dynamics of fishing fleets, but these programs are expensive and often do not cover every trip or vessel. In the last two decades, however, vessel monitoring systems (VMS) have begun to provide vessel location data at regular intervals such that fishing effort and behavioral decisions can be resolved across time and space for many fisheries. I demonstrate the utility of such data by examining the responses of two disparate fishing fleets to environmental and regulatory changes. This was a "big data" study and required the development of nuanced approaches to process and model millions of records from multiple datasets. I thus present the work in three components: (1) How can we extract the information that we need? I present a detailed characterization of the types of data and an algorithm used to derive relevant behavioral aspects of fishing, such as the duration of and distances traveled during fishing trips. (2) How do fishers' spatial behaviors in the Bering Sea pollock fishery change in response to environmental variability? (3) How were fisher behaviors and economic performance affected by a series of regulatory changes in the Gulf of Mexico grouper-tilefish longline fishery? I found a high degree of heterogeneity among vessel behaviors within the pollock fishery, underscoring the role that markets and processor-level decisions play in facilitating fisher responses to environmental change. In the Gulf of Mexico, my VMS-based approach estimated unobserved fishing effort with a high degree of accuracy and confirmed that the regulatory shifts (i.e., the longline endorsement program and the catch share program) yielded the intended impacts of reducing effort and improving both the economic performance and the overall harvest efficiency of the fleet. Overall, this work provides broadly applicable approaches for testing hypotheses regarding the dynamics of spatial behaviors in response to regulatory and environmental changes in a diversity of fisheries around the world.

    Contents: General introduction -- Chapter 1 Using vessel monitoring system data to identify and characterize trips made by fishing vessels in the United States North Pacific -- Chapter 2 Paths to resilience: Alaska pollock fleet uses multiple fishing strategies to buffer against environmental change in the Bering Sea -- Chapter 3 Vessel monitoring systems (VMS) reveal increased fishing efficiency following regulatory change in a bottom longline fishery -- General conclusions
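    The trip-identification step described in the abstract can be sketched as a simple segmentation of time-ordered VMS pings. This is an illustrative reconstruction only, not the thesis's actual algorithm: the ping format, the port-speed threshold, and the gap rule are all assumptions made for the sketch.

```python
from datetime import timedelta

def split_trips(pings, port_speed=0.5, gap=timedelta(hours=4)):
    """Group time-ordered VMS pings into trips.

    Each ping is a hypothetical (timestamp, lat, lon, speed_knots) tuple.
    A trip accumulates pings while the vessel is moving (speed above
    `port_speed`); it is closed when the vessel stops or when pings
    cease for longer than `gap`.
    """
    trips, current = [], []
    prev_time = None
    for t, lat, lon, speed in pings:
        moving = speed > port_speed
        stale = prev_time is not None and (t - prev_time) > gap
        if (not moving or stale) and current:
            trips.append(current)   # close the trip in progress
            current = []
        if moving:
            current.append((t, lat, lon, speed))
        prev_time = t
    if current:
        trips.append(current)       # trailing trip with no return ping
    return trips
```

    From segments like these, trip duration and distance traveled (e.g., summed great-circle distances between consecutive pings) follow directly.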

    Combining hardware and software instrumentation to classify program executions

    Several research efforts have studied ways to infer properties of software systems from program spectra gathered from the running systems, usually with software-level instrumentation. While these efforts appear to produce accurate classifications, detailed understanding of their costs and potential cost-benefit tradeoffs is lacking. In this work we present a hybrid instrumentation approach which uses hardware performance counters to gather program spectra at very low cost. This underlying data is further augmented with data captured by minimal amounts of software-level instrumentation. We also evaluate this hybrid approach by comparing it to other existing approaches. We conclude that these hybrid spectra can reliably distinguish failed executions from successful executions at a fraction of the runtime overhead cost of using software-based execution data.
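    As a toy illustration of classifying executions from spectra (the abstract does not specify the authors' actual classifier, so the model and feature layout here are assumptions), a nearest-centroid model over fixed-length vectors of normalized counter readings might look like:

```python
import math

def train_centroids(labeled):
    """labeled: list of (spectrum, label) pairs, label in {'pass', 'fail'}.

    Each spectrum is a fixed-length vector of normalized hardware-counter
    readings (e.g., branch misses, cache misses) for one execution.
    Returns a per-label mean vector: a minimal nearest-centroid model.
    """
    sums, counts = {}, {}
    for vec, label in labeled:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [s / counts[lab] for s in acc] for lab, acc in sums.items()}

def classify(model, vec):
    """Label a new execution by its nearest centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda lab: dist(model[lab], vec))
```

    The point of the hybrid approach is that the feature vectors come cheaply from hardware counters; the classifier on top can remain simple.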

    21st Century Simulation: Exploiting High Performance Computing and Data Analysis

    This paper identifies, defines, and analyzes the limitations imposed on Modeling and Simulation by outmoded paradigms in computer utilization and data analysis. The authors then discuss two emerging capabilities to overcome these limitations: High Performance Parallel Computing and Advanced Data Analysis. First, parallel computing, in supercomputers and Linux clusters, has proven effective by providing users an advantage in computing power. This has been characterized as a ten-year lead over the use of single-processor computers. Second, advanced data analysis techniques are both necessitated and enabled by this leap in computing power. JFCOM's JESPP project is one of the few simulation initiatives to effectively embrace these concepts. The challenges facing the defense analyst today have grown to include the need to consider operations among non-combatant populations, to focus on impacts to civilian infrastructure, to differentiate combatants from non-combatants, and to understand non-linear, asymmetric warfare. These requirements stretch both current computational techniques and data analysis methodologies. In this paper, documented examples and potential solutions will be advanced. The authors discuss the paths to successful implementation based on their experience. Reviewed technologies include parallel computing, cluster computing, grid computing, data logging, OpsResearch, database advances, data mining, evolutionary computing, genetic algorithms, and Monte Carlo sensitivity analyses. The modeling and simulation community has significant potential to provide more opportunities for training and analysis. Simulations must include increasingly sophisticated environments, better emulations of foes, and more realistic civilian populations. Overcoming the implementation challenges will produce dramatically better insights, for trainees and analysts. 
High Performance Parallel Computing and Advanced Data Analysis promise increased understanding of future vulnerabilities to help avoid unneeded mission failures and unacceptable personnel losses. The authors set forth road maps for rapid prototyping and adoption of advanced capabilities. They discuss the beneficial impact of embracing these technologies, as well as the risk mitigation required to ensure success.

    Research and technology highlights of the Lewis Research Center

    Highlights of research accomplishments of the Lewis Research Center for fiscal year 1984 are presented. The report is divided into four major sections covering aeronautics, space communications, space technology, and materials and structures. Six articles on energy are included in the space technology section.

    Bayesian inference for queueing networks and modeling of internet services

    Modern Internet services, such as those at Google, Yahoo!, and Amazon, handle billions of requests per day on clusters of thousands of computers. Because these services operate under strict performance requirements, a statistical understanding of their performance is of great practical interest. Such services are modeled by networks of queues, where each queue models one of the computers in the system. A key challenge is that the data are incomplete, because recording detailed information about every request to a heavily used system can require unacceptable overhead. In this paper we develop a Bayesian perspective on queueing models in which the arrival and departure times that are not observed are treated as latent variables. Underlying this viewpoint is the observation that a queueing model defines a deterministic transformation between the data and a set of independent variables called the service times. With this viewpoint in hand, we sample from the posterior distribution over missing data and model parameters using Markov chain Monte Carlo. We evaluate our framework on data from a benchmark Web application. We also present a simple technique for selection among nested queueing models. We are unaware of any previous work that considers inference in networks of queues in the presence of missing data.

    Comment: Published at http://dx.doi.org/10.1214/10-AOAS392 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org)
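    The deterministic transformation the abstract mentions can be illustrated in the simplest case, a single-server FCFS queue; the paper's setting (networks of queues, with latent times sampled via MCMC) is much richer than this sketch, which only shows how complete arrival and departure times pin down service times.

```python
def service_times(arrivals, departures):
    """Invert a single-server FCFS queue: given arrival times a_i and
    departure times d_i (both sorted by customer), the service times are

        s_i = d_i - max(a_i, d_{i-1})

    since customer i starts service when they arrive or when the previous
    customer departs, whichever is later. This deterministic map is what
    lets a sampler propose latent arrival/departure times and score the
    implied service times under a prior (e.g., exponential).
    """
    svc, prev_dep = [], float('-inf')
    for a, d in zip(arrivals, departures):
        start = max(a, prev_dep)
        svc.append(d - start)
        prev_dep = d
    return svc
```

    With missing entries in `arrivals` or `departures`, an MCMC step would propose values for them and accept or reject based on the likelihood of the resulting service times.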

    NASA SBIR abstracts of 1991 phase 1 projects

    The objectives of 301 projects placed under contract by the Small Business Innovation Research (SBIR) program of the National Aeronautics and Space Administration (NASA) are described. These projects were selected competitively from among proposals submitted to NASA in response to the 1991 SBIR Program Solicitation. The basic document consists of edited, non-proprietary abstracts of the winning proposals submitted by small businesses. The abstracts are presented under the 15 technical topics within which Phase 1 proposals were solicited. Each project was assigned a sequential identifying number from 001 to 301, in order of its appearance in the body of the report. Appendixes are included that provide additional information about the SBIR program and permit cross-referencing of the 1991 Phase 1 projects by company name, state, principal investigator, responsible NASA Field Center, and NASA contract number.

    Index to 1981 NASA Tech Briefs, volume 6, numbers 1-4

    Short announcements of new technology derived from the R&D activities of NASA are presented. These briefs emphasize information considered likely to be transferable across industrial, regional, or disciplinary lines and are issued to encourage commercial application. This index for 1981 Tech Briefs contains abstracts and four indexes: subject, personal author, originating center, and Tech Brief number. The following areas are covered: electronic components and circuits, electronic systems, physical sciences, materials, life sciences, mechanics, machinery, fabrication technology, and mathematics and information sciences.