430 research outputs found

    Detection, classification, and density estimation of marine mammals: final report

    Detection, classification, and localization (DCL) research on marine mammal vocalizations has been in development for decades, and methods for marine mammal population density estimation using acoustic data have been in development since at least 2007. These efforts have been supported by MobySound, an archive of annotated cetacean sounds used for studying call detection and localization, curated to facilitate research in DCL. This project aimed to begin development of high‐performing automatic detection methods for the sounds of beaked whales and other odontocetes. Specifically, this report (1) details the newly collected odontocete recordings that have been added to the MobySound archive; (2) documents continuing development of methods for detection and classification, including improvements to the Energy Ratio Mapping Algorithm (ERMA) for use on gliders and its extension to new species and populations; (3) reports on the application of a newly developed method for population density estimation to field recordings; and (4) reports on the production of datasets focused on odontocete whistles and clicks and baleen whale calls for the Fifth Workshop on Detection, Classification, Localization, and Density Estimation of Marine Mammals using Passive Acoustics. Prepared for the Chief of Naval Operations, Energy and Environmental Readiness Division, Washington DC. The report was prepared by Oregon State University and supported under NPS Grant N00244-10-1-0047. Approved for public release; distribution is unlimited
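    The core idea behind an energy-ratio detector such as ERMA is to compare acoustic energy in a frequency band where the target species' clicks concentrate against a reference band dominated by ambient noise, flagging frames where the ratio is high. The following is a minimal pure-Python sketch of that idea only, not the published ERMA implementation; the band edges, threshold, and function names are illustrative assumptions.

```python
def band_energy(spectrum, freqs, f_lo, f_hi):
    """Sum power-spectrum bins whose centre frequency falls in [f_lo, f_hi)."""
    return sum(p for f, p in zip(freqs, spectrum) if f_lo <= f < f_hi)

def energy_ratio(spectrum, freqs, signal_band, noise_band, eps=1e-12):
    """Ratio of energy in a signal band to energy in a noise (reference) band."""
    e_sig = band_energy(spectrum, freqs, *signal_band)
    e_noise = band_energy(spectrum, freqs, *noise_band)
    return e_sig / (e_noise + eps)

def detect_clicks(spectrogram, freqs, signal_band, noise_band, threshold=4.0):
    """Return indices of time frames whose band-energy ratio exceeds the threshold."""
    return [t for t, spec in enumerate(spectrogram)
            if energy_ratio(spec, freqs, signal_band, noise_band) > threshold]
```

    In practice a glider-deployed detector would also smooth the ratio over time and adapt the noise estimate, but the band-ratio comparison above is the decision at the heart of the method.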

    Everest: Towards a Verified, Drop-in Replacement of HTTPS

    The HTTPS ecosystem is the foundation on which Internet security is built. At the heart of this ecosystem is the Transport Layer Security (TLS) protocol, which in turn uses the X.509 public-key infrastructure and numerous cryptographic constructions and algorithms. Unfortunately, this ecosystem is extremely brittle, with headline-grabbing attacks and emergency patches many times a year. We describe our ongoing efforts in Everest (The Everest VERified End-to-end Secure Transport), a project that aims to build and deploy a verified version of TLS and other components of HTTPS, replacing the current infrastructure with proven, secure software. Aiming both at full verification and usability, we conduct high-level code-based, game-playing proofs of security on cryptographic implementations that yield efficient, deployable code, at the level of C and assembly. Concretely, we use F*, a dependently typed language for programming, meta-programming, and proving at a high level, while relying on low-level DSLs embedded within F* for programming low-level components when necessary for performance and, sometimes, side-channel resistance. To compose the pieces, we compile all our code to source-like C and assembly, suitable for deployment and integration with existing code bases, as well as audit by independent security experts. Our main results so far include (1) the design of Low*, a subset of F* designed for C-like imperative programming but with high-level verification support, and KreMLin, a compiler that extracts Low* programs to C; (2) an implementation of the TLS 1.3 record layer in Low*, together with a proof of its concrete cryptographic security; (3) Vale, a new DSL for verified assembly language, and several optimized cryptographic primitives proven functionally correct and side-channel resistant. In an early deployment, all our verified software is integrated and deployed within libcurl, a widely used library of networking protocols

    Documentation Driven Software Development

    The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision, unless so designated by other documentation. Our objective is to develop an integrated, systematic, documentation-centric approach to software development, known as Documentation Driven Software Development (DDD). The research issues for DDD are the creation and application of three key documenting technologies that will drive the development process and a Document Management System (DMS) that will support them. These technologies address (1) representations for active documents; (2) representations for repositories; and (3) methods for analysis, transformation, and presentation of this information. In addition, we explored new possibilities for computer-aided interfaces that help humans with routine tasks. In doing so we applied cognitive science and machine learning methods to design user interfaces that can learn and assist users. We also expanded our work on the integration of ontologies from heterogeneous sources. Specifically, we studied the Knowledge System Integration Ontology (KSIO), which aligns data and information systems with the current situational context for efficient knowledge collection, integration, and transfer. The role of the ontology is to organize and structure knowledge (e.g. by standardized terminology) so that semantic queries and associations become more efficient. We assessed the degree to which natural language processing can be usefully applied to the analysis of requirement changes and their impact on system structure and implementation

    Real-time human ambulation, activity, and physiological monitoring: taxonomy of issues, techniques, applications, challenges and limitations

    Automated methods of real-time, unobtrusive, human ambulation, activity, and wellness monitoring and data analysis using various algorithmic techniques have been subjects of intense research. The general aim is to devise effective means of addressing the demands of assisted living, rehabilitation, and clinical observation and assessment through sensor-based monitoring. These research studies have produced a large body of literature. This paper presents a holistic articulation of the research studies and offers comprehensive insights along four main axes: distribution of existing studies; monitoring device framework and sensor types; data collection, processing and analysis; and applications, limitations and challenges. The aim is to present a systematic and comprehensive survey of the literature in the area in order to identify research gaps and prioritize future research directions

    Self-Adaptive Role-Based Access Control for Business Processes

    © 2017 IEEE. We present an approach for dynamically reconfiguring the role-based access control (RBAC) of information systems running business processes, to protect them against insider threats. The new approach uses business process execution traces and stochastic model checking to establish confidence intervals for key measurable attributes of user behaviour, and thus to identify and adaptively demote users who misuse their access permissions maliciously or accidentally. We implemented and evaluated the approach and its policy specification formalism for a real IT support business process, showing their ability to express and apply a broad range of self-adaptive RBAC policies
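    The abstract describes establishing confidence intervals for measurable attributes of user behaviour and demoting users when those intervals indicate misuse. As a hedged illustration of that decision rule only (the paper itself uses business process execution traces and stochastic model checking, not the code below), one could bound a user's observed misuse rate with a Wilson score interval and demote once the interval's lower bound clears a policy threshold; all names and thresholds here are hypothetical.

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """Wilson score confidence interval for a binomial proportion (95% by default)."""
    if trials == 0:
        return (0.0, 1.0)  # no observations: no confidence either way
    p = successes / trials
    denom = 1 + z * z / trials
    centre = (p + z * z / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z * z / (4 * trials ** 2))
    return (max(0.0, centre - half), min(1.0, centre + half))

def should_demote(misuse_events, total_actions, max_misuse_rate=0.05):
    """Demote a user only when even the interval's lower bound exceeds the
    misuse rate the policy tolerates, i.e. misuse is established with confidence."""
    lower, _ = wilson_interval(misuse_events, total_actions)
    return lower > max_misuse_rate
```

    Acting on the lower bound rather than the raw rate is what makes the adaptation conservative: a user with one suspicious action in a hundred is not demoted, while a sustained pattern of misuse is.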

    Cetacean Acoustic Welfare in Wild and Managed-Care Settings: Gaps and Opportunities

    Cetaceans are potentially at risk of poor welfare due to the animals’ natural reliance on sound and the persistent nature of anthropogenic noise, especially in the wild. Industrial, commercial, and recreational human activity has expanded across the seas, resulting in a propagation of sound with varying frequency characteristics. In many countries, current regulations are based on the potential to induce hearing loss; however, a more nuanced approach is needed when shaping regulations, due to other non-hearing-loss effects including activation of the stress response, acoustic masking, frequency shifts, alterations in behavior, and decreased foraging. Cetaceans in managed-care settings share the same acoustic characteristics as their wild counterparts, but face different environmental parameters. There have been steps to integrate work on welfare in the wild and in managed-care contexts, and the domain of acoustics offers the opportunity to inform and connect information from both managed-care settings and the wild. Studies of subjects in managed care provide controls not available in wild studies, yet because of the conservation implications, wild studies on welfare impacts of the acoustic environment on cetaceans have largely been the focus, rather than those in captive settings. A deep integration of wild and managed-care-based acoustic welfare research can complement discovery in both domains, as captive studies can provide greater experimental control, while the more comprehensive domain of wild noise studies can help determine the gaps in managed-care-based acoustic welfare science. We advocate for a new paradigm in anthropogenic noise research, recognizing the value that both wild and managed-care research play in illustrating how noise pollution affects welfare, including physiology, behavior, and cognition

    A standardisation framework for bio‐logging data to advance ecological research and conservation

    Bio‐logging data obtained by tagging animals are key to addressing global conservation challenges. However, the many thousands of existing bio‐logging datasets are not easily discoverable, universally comparable, nor readily accessible through existing repositories and across platforms, slowing down ecological research and effective management. A set of universal standards is needed to ensure discoverability, interoperability and effective translation of bio‐logging data into research and management recommendations. We propose a standardisation framework adhering to existing data principles (FAIR: Findable, Accessible, Interoperable and Reusable; and TRUST: Transparency, Responsibility, User focus, Sustainability and Technology) and involving the use of simple templates to create a data flow from manufacturers and researchers to compliant repositories, where automated procedures should be in place to prepare data availability into four standardised levels: (a) decoded raw data, (b) curated data, (c) interpolated data and (d) gridded data. Our framework allows for integration of simple tabular arrays (e.g. csv files) and creation of sharable and interoperable network Common Data Form (netCDF) files containing all the needed information for accuracy‐of‐use, rightful attribution (ensuring data providers keep ownership through the entire process) and data preservation security. We show the standardisation benefits for all stakeholders involved, and illustrate the application of our framework by focusing on marine animals and by providing examples of the workflow across all data levels, including filled templates and code to process data between levels, as well as templates to prepare netCDF files ready for sharing. Adoption of our framework will facilitate collection of Essential Ocean Variables (EOVs) in support of the Global Ocean Observing System (GOOS) and inter‐governmental assessments (e.g. the World Ocean Assessment), and will provide a starting point for broader efforts to establish interoperable bio‐logging data formats across all fields in animal ecology
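    Of the four standardised levels, the step from (b) curated data to (c) interpolated data is a resampling of irregular tag fixes onto a regular time grid. The sketch below illustrates that step in plain Python under simple assumptions (linear interpolation, a single variable, monotonically increasing timestamps); it is not the framework's published processing code, and the function name is hypothetical.

```python
def interpolate_track(times, values, step):
    """Resample an irregularly sampled tag variable (level b, curated) onto a
    regular time grid (level c, interpolated) by linear interpolation between
    the nearest curated fixes. `times` must be strictly increasing."""
    if len(times) < 2:
        raise ValueError("need at least two curated fixes to interpolate")
    grid, out = [], []
    t, i = times[0], 0
    while t <= times[-1]:
        while times[i + 1] < t:      # advance to the segment bracketing t
            i += 1
        frac = (t - times[i]) / (times[i + 1] - times[i])
        grid.append(t)
        out.append(values[i] + frac * (values[i + 1] - values[i]))
        t += step
    return grid, out
```

    Level (d), gridded data, would then aggregate such regularised series onto shared spatial or temporal bins so that datasets from different tags and manufacturers become directly comparable.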

    Una Legua Cuadrada: Exploring the History of Swanton Pacific Ranch and Environs

    Swanton Pacific Ranch is an educational and research facility owned by the Cal Poly Corporation and managed by the Cal Poly State University (Cal Poly) College of Agriculture, Food and Environmental Sciences. Located about 180 miles north of campus and just 14 miles north of Santa Cruz, California on Highway 1, the property was first leased to and then donated to Cal Poly by the late Albert E. Smith in 1993. The rancho’s original inhabitants included Native Americans, Spaniards, Mexicans, as well as various European immigrants and their descendants; currently, the staff, faculty, and students of Cal Poly occupy the land. Each of these groups used the land’s rich environment for a variety of purposes from subsistence to financial and intellectual pursuits. Over time, researchers and local historians have discussed specific aspects of the Swanton Pacific Ranch and its environs, particularly concerning its occupants, land use (e.g. businesses, farming, research), and land features (e.g. geology, botany). The following work offers a more cohesive, descriptive narrative of the land and its people organized chronologically from prehistory to the present

    Design space exploration of a jet engine component using a combined object model for function and geometry

    The design of aircraft and engine components hinges on the use of computer aided design (CAD) models and the subsequent geometry-based analyses for evaluation of the quality of a concept. However, the generation (and variation) of CAD models to include radical or novel design solutions is a resource-intense modelling effort. While approaches to automate the generation and variation of CAD models exist, they neglect the capture and representation of the product’s design rationale—what the product is supposed to do. The design space exploration approach Function and Geometry Exploration (FGE) aims to support the exploration of more functionally and geometrically different product concepts under consideration of not only geometrical, but also teleological aspects. The FGE approach has been presented and verified in earlier work. However, in order to contribute to engineering design practice, a design method needs to be validated through application in industrial practice. Hence, this publication reports on a study where the FGE approach was applied by a design team at a Swedish aerospace manufacturer in a conceptual product development project. Conceptually different alternatives were identified in order to meet the expected functionality of a guide vane (GV). The FGE was introduced and applied in a series of workshops. Data were collected by the researchers through participatory observation in the design teams, as well as interviews and questionnaires. The results reveal the potential of the FGE approach as a design support to: (1) represent and capture the design rationale and the design space; (2) capture, integrate and model novel solutions; and (3) provide support for the embodiment of novel concepts that would otherwise remain unexplored. In conclusion, the FGE method supports designers to articulate and link the design rationale, including functional requirements and alternative solutions, to geometrical features of the product concepts. The method supports the exploration of alternative solutions as well as functions. However, scalability and robustness of the generated CAD models remain subject to further research
