379,872 research outputs found
Damage quantification in an aluminium-CFRP composite structure using guided wave wavenumber mapping: Comparison of instantaneous and local wavenumber analyses
Composite-overwrapped pressure vessels (COPVs) are increasingly used in the transportation industry due to their high strength-to-mass ratio. Over the years, various designs have been developed and found applications. Currently there are five designs, which fall into two main categories: those with a load-sharing metal liner and those with a non-load-sharing plastic liner. The main damage mechanism limiting the lifetime of the first type is fatigue of the metal liner, whereas for the second type it is fatigue of the composite overwrap. Nevertheless, one damage type that may drastically reduce the lifetime of a COPV is impact-induced damage. This barely visible damage therefore needs to be assessed non-destructively to decide whether the pressure vessel can remain in service or has to be withdrawn. One possible method is based on ultrasonic waves. In this contribution, both conventional ultrasonic testing (UT) with high-frequency bulk waves and wavenumber mapping with low-frequency guided waves are used to evaluate impact damage. The wavenumber mapping techniques are first benchmarked on a simulated aluminium panel and then applied to experimental measurements acquired on a delaminated aluminium-CFRP composite plate, which corresponds to the structure of a COPV with a load-sharing metal liner. When applied to the numerical data, all approaches show similar performance in quantifying damage size and depth. Applied to the experimental data, they deliver an accurate estimate of the in-plane size of the large delamination at the aluminium-CFRP interface but only a rough estimate of its depth.
Moreover, none of the wavenumber mapping techniques used in this study can quantify every delamination between CFRP plies caused by the impact, whereas conventional UT can. This may be addressed by using higher frequencies (shorter wavelengths) or more advanced signal processing techniques. All in all, it can be concluded that imaging of complex impact damage in fibre-reinforced composites based on wavenumber mapping is not straightforward and remains a challenging task.
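The local wavenumber analysis mentioned in the abstract is commonly realized as a sliding-window (short-space) 2-D FFT over a wavefield snapshot, with the dominant wavenumber magnitude in each window taken as the local estimate. The sketch below is a minimal toy illustration of that general idea, not the authors' implementation; the function name, window size, and synthetic wavefield are illustrative assumptions.

```python
import numpy as np

def local_wavenumber_map(wavefield, dx, window=32, step=8):
    """Estimate the dominant local wavenumber over a 2-D wavefield
    snapshot via a sliding-window 2-D FFT (toy sketch).

    wavefield : 2-D array (e.g. out-of-plane velocity at one frequency)
    dx        : spatial sampling interval in metres
    """
    ny, nx = wavefield.shape
    # Angular wavenumber axis for one window, in rad/m.
    k = 2.0 * np.pi * np.fft.fftfreq(window, d=dx)
    kmag = np.hypot(*np.meshgrid(k, k, indexing="ij"))
    # 2-D Hann taper to reduce spectral leakage at window edges.
    taper = np.outer(np.hanning(window), np.hanning(window))
    rows = list(range(0, ny - window + 1, step))
    cols = list(range(0, nx - window + 1, step))
    kmap = np.zeros((len(rows), len(cols)))
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            patch = wavefield[r:r + window, c:c + window] * taper
            spectrum = np.abs(np.fft.fft2(patch))
            # Dominant wavenumber magnitude in this window.
            kmap[i, j] = kmag.flat[np.argmax(spectrum)]
    return kmap
```

A delamination locally thins the waveguide and shifts the dominant wavenumber, so regions where `kmap` deviates from the pristine-plate value indicate damage; the window size trades spatial resolution against wavenumber resolution.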
Arcadia, a software development environment research project
The research objectives of the Arcadia project are two-fold: discovery and development of environment architecture principles, and creation of novel software development tools, particularly powerful analysis tools, which will function within an environment built upon these architectural principles. Work in the architecture area is concerned with providing the framework to support integration while also supporting the often conflicting goal of extensibility. Thus, this area of research is directed toward achieving external integration by providing a consistent, uniform user interface, while still admitting customization and the addition of new tools and interface functions. In an effort to also attain internal integration, research is aimed at developing mechanisms for structuring and managing the tools and data objects that populate a software development environment, while facilitating the insertion of new kinds of tools and new classes of objects. The unifying theme of work in the tools area is support for effective analysis at every stage of a software development project. Research is directed toward tools suitable for analyzing pre-implementation descriptions of software and software itself, and toward the production of testing and debugging tools. In many cases, these tools are specifically tailored for applicability to concurrent, distributed, or real-time software systems. The initial focus of Arcadia research is on creating a prototype environment, embodying the architectural principles, which supports Ada software development. This prototype environment is itself being developed in Ada. Arcadia is being developed by a consortium of researchers from the University of California at Irvine, the University of Colorado at Boulder, the University of Massachusetts at Amherst, TRW, Incremental Systems Corporation, and The Aerospace Corporation.
This paper delineates the research objectives and describes the approaches being taken, the organization of the research endeavor, and the current status of the work.
Collaborative e-science architecture for Reaction Kinetics research community
This paper presents a novel collaborative e-science architecture (CeSA) to address two challenging issues in e-science that arise from the management of heterogeneous distributed environments: (i) how to provide individual scientists an integrated environment to collaborate with each other in distributed, loosely coupled research communities where each member might be using a disparate range of tools; and (ii) how to provide easy access to a range of computationally intensive resources from a desktop. The Reaction Kinetics research community was used to capture requirements and to evaluate the proposed architecture. The results demonstrated the feasibility of the approach and the potential benefits of the CeSA.
Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach.
Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain-computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a "containerized" approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining, and (meta-)analysis. The EEG Study Schema (ESS) comprises three data "Levels," each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. The ESS schema and tools are freely available at www.eegstudy.org, and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org).
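The containerized idea above rests on self-describing XML study descriptors that let any pipeline enumerate a study's recordings without a central database. The sketch below illustrates that pattern only; the element and attribute names are hypothetical and do not reproduce the real ESS schemas.

```python
import xml.etree.ElementTree as ET

# Hypothetical study descriptor in the spirit of a Level-2 container.
# Element/attribute names here are illustrative assumptions, NOT the
# actual ESS XML schema published at www.eegstudy.org.
DESCRIPTOR = """\
<study level="2">
  <title>Example EEG study</title>
  <recordings>
    <recording file="s01_eeg.set" subject="01" samplingRate="256"/>
    <recording file="s02_eeg.set" subject="02" samplingRate="256"/>
  </recordings>
</study>
"""

def list_recordings(xml_text):
    """Enumerate (file, subject, sampling rate) triples from a
    self-describing study descriptor, so a processing pipeline can
    locate raw data files without querying a central database."""
    root = ET.fromstring(xml_text)
    return [(rec.get("file"), rec.get("subject"),
             int(rec.get("samplingRate")))
            for rec in root.iter("recording")]
```

Because the metadata travels inside the study container, the same traversal works whether the study sits on a laptop, a cluster, or an archive, which is the portability property the abstract emphasizes.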
Collaborative Development and Evaluation of Text-processing Workflows in a UIMA-supported Web-based Workbench
Challenges in creating comprehensive text-processing workflows include a lack of interoperability between individual components coming from different providers and/or a requirement imposed on end users to know programming techniques to compose such workflows. In this paper we demonstrate Argo, a web-based system that addresses these issues in several ways. It supports the widely adopted Unstructured Information Management Architecture (UIMA), which handles the problem of interoperability; it provides a web browser-based interface for developing workflows by drawing diagrams composed of a selection of available processing components; and it provides novel user-interactive analytics such as the annotation editor, which constitutes a bridge between automatic processing and manual correction. These features extend the target audience of Argo to users with limited or no technical background. Here, we focus specifically on the construction of advanced workflows, involving multiple branching and merging points, to facilitate various comparative evaluations. Together with the use of user-collaboration capabilities supported in Argo, we demonstrate several use cases including visual inspections, comparisons of multiple processing segments or complete solutions against a reference standard, inter-annotator agreement, and shared-task mass evaluations. Ultimately, Argo emerges as a one-stop workbench for defining, processing, editing, and evaluating text-processing tasks.
Specifications and Development of Interoperability Solution dedicated to Multiple Expertise Collaboration in a Design Framework
This paper describes the specifications of an interoperability platform based on the PPO (Product Process Organization) model developed by the French community IPPOP in the context of collaborative and innovative design. Using the PPO model as a reference, this work aims to connect heterogeneous tools used by experts, thereby easing data and information exchange. After underlining the growing needs of the collaborative design process, this paper focuses on the concept of interoperability by describing current solutions and their limits. A solution based on the flexibility of the PPO model, adapted to the philosophy of interoperability, is then proposed. To illustrate these concepts, several examples are described in detail (robustness analysis, connections to CAD and Product Lifecycle Management systems).
Entry and access: how shareability comes about
Shareability is a design principle that refers to how a system, interface, or device engages a group of collocated, co-present users in shared interactions around the same content (or the same object). It is broken down into a set of components that facilitate or constrain the way an interface (or product) is made shareable. Central are the notions of access points and entry points. Entry points invite and entice people into engagement, providing an advance overview, minimal barriers, and a honeypot effect that draws observers into the activity. Access points enable users to join a group's activity, allowing perceptual and manipulative access and fluidity of sharing. We show how these terms can be useful for informing analysis and empirical research.