
    Discrete Signal Processing on Graphs: Frequency Analysis

    Signals and datasets that arise in physical and engineering applications, as well as in social, genetic, biomolecular, and many other domains, are becoming increasingly large and complex. In contrast to traditional time and image signals, data in these domains are supported by arbitrary graphs. Signal processing on graphs extends concepts and techniques from traditional signal processing to data indexed by generic graphs. This paper studies the concepts of low and high frequencies on graphs, and of low-, high-, and band-pass graph filters. In traditional signal processing, these concepts are easily defined because of a natural frequency ordering that has a physical interpretation. For signals residing on graphs, in general, there is no obvious frequency ordering. We propose a definition of total variation for graph signals that naturally leads to a frequency ordering on graphs and defines low-, high-, and band-pass graph signals and filters. We study the design of graph filters with specified frequency response, and illustrate our approach with applications to sensor malfunction detection and data classification.
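    The frequency-ordering idea above can be sketched in a few lines. This is a hedged illustration, not the authors' code: it assumes the common adjacency-based definition TV(s) = || s - A s / |lambda_max| ||_1, where A is the graph's adjacency (shift) matrix; the function name is hypothetical.

```python
import numpy as np

# Hedged sketch: total variation of a graph signal s, assuming the
# definition TV(s) = || s - A s / |lambda_max| ||_1, where A is the
# adjacency (shift) matrix and lambda_max its largest-magnitude eigenvalue.
# Slowly varying signals (low frequency) give small TV; rapidly varying
# signals (high frequency) give large TV, which induces a frequency ordering.
def graph_total_variation(A, s):
    lam_max = np.max(np.abs(np.linalg.eigvals(A)))  # spectral radius of A
    return float(np.sum(np.abs(s - A @ s / lam_max)))

# Example: a 4-node directed cycle, where node i points to node (i+1) mod 4.
A = np.roll(np.eye(4), 1, axis=1)
tv_low = graph_total_variation(A, np.ones(4))                     # constant signal
tv_high = graph_total_variation(A, np.array([1., -1., 1., -1.]))  # alternating signal
```

    On the directed cycle the constant signal has zero total variation while the alternating one does not, mirroring the DC-versus-Nyquist ordering of classical discrete-time signals.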

    A new data reduction scheme to obtain the mode II fracture properties of Pinus Pinaster wood

    In this work a numerical study of the End Notched Flexure (ENF) specimen was performed in order to obtain the mode II critical strain energy release rate (GIIc) of Pinus pinaster wood in the RL crack propagation system. The analysis included interface finite elements and a progressive damage model based on the indirect use of Fracture Mechanics. The difficulties of monitoring the crack length during an experimental ENF test, and the inconvenience of performing separate tests to obtain the elastic properties, are well known. To avoid these problems, a new data reduction scheme based on the equivalent crack concept was proposed and validated. This new data reduction scheme, the Compliance-Based Beam Method (CBBM), requires neither crack length measurements during the ENF test nor additional tests to obtain the elastic properties. FCT - POCTI/EME/45573/200
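    The equivalent-crack idea behind the CBBM can be sketched numerically. This is a hedged illustration only: it uses simple beam theory and neglects shear, and the formulas, function names, and dimensions below are assumptions for illustration, not the paper's exact expressions.

```python
# Hedged sketch of the equivalent-crack concept for the ENF test, assuming
# simple beam theory (shear neglected). Symbols (all assumed, mm/MPa units):
# a: crack length, L: half-span, E: flexural modulus, b: width,
# h: half-thickness, C: specimen compliance, P: applied load.

def enf_compliance(a, L, E, b, h):
    # Beam-theory compliance of the ENF specimen: C = (3a^3 + 2L^3) / (8 E b h^3)
    return (3 * a**3 + 2 * L**3) / (8 * E * b * h**3)

def equivalent_crack(C, L, E, b, h):
    # Invert the compliance expression for the equivalent crack length a_eq,
    # so no crack length monitoring is needed during the test.
    return ((8 * E * b * h**3 * C - 2 * L**3) / 3) ** (1.0 / 3.0)

def mode_II_energy_release_rate(P, a_eq, E, b, h):
    # Irwin-Kies relation G_II = (P^2 / 2b) dC/da evaluated at a_eq:
    # G_II = 9 P^2 a_eq^2 / (16 b^2 E h^3)
    return 9 * P**2 * a_eq**2 / (16 * b**2 * E * h**3)

# Round trip: the compliance computed for a = 30 mm should give a_eq = 30 mm.
C = enf_compliance(a=30.0, L=100.0, E=10e3, b=20.0, h=10.0)
a_eq = equivalent_crack(C, L=100.0, E=10e3, b=20.0, h=10.0)
```

    In an actual test the compliance C would come from the measured load-displacement curve at each point, so GIIc is obtained from the load history alone.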

    Finite element analysis of the ECT test on mode III interlaminar fracture of carbon-epoxy composite laminates

    In this work a parametric study of the Edge Crack Torsion (ECT) specimen was performed in order to maximize the mode III component (GIII) of the strain energy release rate for carbon-epoxy laminates. A three-dimensional finite element analysis of the ECT test was conducted considering a [90/0/(+45/-45)2/(-45/+45)2/0/90]S lay-up. The main objective was to define an adequate geometry to obtain an almost pure mode III state at the crack front. The geometrical parameters studied were the specimen dimensions, the distance between pins, and the size of the initial crack. The numerical results demonstrated that the ratio between the specimen length and the initial crack length had a significant effect on the strain energy release rate distributions. In almost all of the tested configurations, a mode II component occurred near the edges, but it did not interfere significantly with the dominant mode III state. FCT - POCTI/EME/45573/200

    Correlation-Strength Driven Anderson Metal-Insulator Transition

    The possibility of driving an Anderson metal-insulator transition in the presence of scale-free disorder by changing the correlation exponent is numerically investigated. We calculate the localization length for quasi-one-dimensional systems at fixed energy and fixed disorder strength using a standard transfer matrix method. From a finite-size scaling analysis we extract the critical correlation exponent and the critical exponent characterizing the phase transition. Comment: 3 pages; 2 figures
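    The transfer-matrix method mentioned above can be sketched for the simplest case. This is a hedged illustration, not the paper's setup: it treats the plain 1D Anderson model with uncorrelated box disorder, whereas the paper studies quasi-1D bars with scale-free correlated disorder.

```python
import numpy as np

# Hedged sketch: localization length of the 1D Anderson model with
# uncorrelated disorder (the simplest variant of the transfer-matrix method;
# the paper's quasi-1D, correlated-disorder case is more involved).
# The tight-binding recursion psi_{n+1} = (E - eps_n) psi_n - psi_{n-1}
# is iterated with renormalization; the growth rate of the vector norm is
# the Lyapunov exponent gamma, and the localization length is xi = 1/gamma.
def localization_length(E, W, n_steps=100_000, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.uniform(-W / 2, W / 2, n_steps)  # box-distributed site energies
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for e in eps:
        v = np.array([(E - e) * v[0] - v[1], v[0]])  # apply transfer matrix
        nv = np.linalg.norm(v)
        log_norm += np.log(nv)   # accumulate the exponential growth rate
        v /= nv                  # renormalize to avoid overflow
    return n_steps / log_norm    # xi = 1 / gamma

# Stronger disorder localizes more strongly: xi shrinks as W grows.
xi_weak = localization_length(E=0.0, W=1.0)
xi_strong = localization_length(E=0.0, W=3.0)
```

    In a quasi-1D finite-size scaling study, this localization length would be computed for bars of increasing cross-section and collapsed onto a scaling function to locate the transition.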

    Use of SOC in the analysis of multivariate linear models.

    bitstream/item/76201/1/CNPTIA-COM.TEC.-8805-88.pd

    TaXEm: a tool for aiding the evaluation of domain topic.

    The remarkable growth of stored textual information demands fast and efficient tools to organize, retrieve, and browse this information, as well as tools for knowledge extraction. A very interesting way to organize domain-specific information is to build a topic taxonomy. A great challenge in this research area, however, is the evaluation and validation of the results. This evaluation can be carried out through objective measures or through a subjective analysis based on the judgment of domain specialists. The measures CQM and SQM [1] are used to evaluate a generated taxonomy against a reference taxonomy. A reference taxonomy is constructed by humans and consolidated through use by its community over the years. The CQM is used to evaluate the generated taxonomy with respect to the descriptors selected for each taxonomy node; the SQM, on the other hand, is used to evaluate the taxonomy structure. As these objective measures do not encompass the specialists' knowledge, the specialist evaluation remains very important. However, human evaluation is expensive, because this task demands availability, time, and dedication from the specialists. The TaXEm tool therefore aims to reduce the cost of subjective evaluation. TaXEm (Taxonomia em XML da Embrapa, i.e., Embrapa's taxonomy in XML) supports a (semi)automatic taxonomy evaluation, allowing the user to run automatic evaluations before proceeding to a subjective one. Propor 2010. Demonstration
