Statistical correlation analysis for comparing vibration data from test and analysis
A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The treatment of dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated on small classical structures.
Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation; a user's guide for generating a statistical correlation report; a programmer's guide describing the organization and functions of the individual programs leading to a statistical correlation report; and a set of examples, including complete listings of programs and input and output data.
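The abstracts above do not reproduce the correlation formula, but a standard measure for comparing test and analytical mode shapes is the modal assurance criterion (MAC). The sketch below illustrates that idea in Python; it is not the NASTRAN utility described, and the function name and toy mode shapes are ours.

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal Assurance Criterion between two sets of mode shapes.

    phi_test: (n_dof, n_test) measured mode shapes, one per column.
    phi_fem:  (n_dof, n_fem) analytical (e.g. NASTRAN) mode shapes.
    Returns an (n_test, n_fem) matrix; entries near 1 flag correlated modes.
    """
    num = np.abs(phi_test.T @ phi_fem) ** 2
    den = np.outer((phi_test**2).sum(axis=0), (phi_fem**2).sum(axis=0))
    return num / den

# Toy example: comparing a shape set with itself gives 1.0 on the diagonal,
# while the off-diagonal entries measure cross-correlation between modes.
phi_a = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, -1.0]])
M = mac(phi_a, phi_a)
```

In practice the analytical shapes would first be reduced (or the test shapes expanded) to a common set of degrees of freedom, as the abstract's expansion-function treatment describes.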
A Parameterized Centrality Metric for Network Analysis
A variety of metrics have been proposed to measure the relative importance of
nodes in a network. One of these, alpha-centrality [Bonacich, 2001], measures
the number of attenuated paths that exist between nodes. We introduce a
normalized version of this metric and use it to study network structure,
specifically, to rank nodes and to find community structure of the network.
To this end, we extend the modularity-maximization method [Newman and Girvan,
2004] for community detection to use this metric as the measure of node
connectivity. Normalized alpha-centrality is a powerful tool for network
analysis, since it contains a tunable parameter that sets the length scale of
interactions. Studying how rankings and discovered communities change as this
parameter is varied allows us to identify locally and globally important
nodes and structures. We apply the proposed method to several benchmark
networks and show that it leads to better insight into network structure than
alternative methods.
Comment: 11 pages, submitted to Physical Review
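As a rough sketch of the metric itself: Bonacich's alpha-centrality counts paths between nodes attenuated by a factor alpha per step, which reduces to a single linear solve. The unit-sum normalization below is our reading of the paper's "normalized" variant, not a quotation of its exact definition.

```python
import numpy as np

def alpha_centrality(A, alpha):
    """Normalized alpha-centrality (after Bonacich, 2001).

    Sums walks of every length k between nodes, each weighted by alpha**k,
    via c = (I - alpha * A^T)^{-1} e; this requires alpha < 1 / lambda_max(A)
    for the underlying geometric series to converge.  The final unit-sum
    normalization is an assumption about the paper's normalized variant.
    """
    n = A.shape[0]
    c = np.linalg.solve(np.eye(n) - alpha * A.T, np.ones(n))
    return c / c.sum()

# Star graph on 4 nodes: the hub (node 0) should rank highest.
A = np.zeros((4, 4))
A[0, 1:] = A[1:, 0] = 1.0
c = alpha_centrality(A, alpha=0.2)  # lambda_max = sqrt(3), so alpha < 0.577
```

Varying `alpha` sets the interaction length scale the abstract mentions: small values weight short paths (local importance), values near the convergence limit weight long paths (global importance).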
Bounds on Quantum Correlations in Bell Inequality Experiments
Bell inequality violation is one of the most widely known manifestations of
entanglement in quantum mechanics, indicating that experiments on physically
separated quantum mechanical systems cannot be given a local realistic
description. However, despite the importance of Bell inequalities, it is not
known in general how to determine whether a given entangled state will violate
a Bell inequality. This is because one can choose to make many different
measurements on a quantum system to test any given Bell inequality and the
optimization over measurements is a high-dimensional variational problem. In
order to better understand this problem we present algorithms that provide, for
a given quantum state, both a lower bound and an upper bound on the maximal
expectation value of a Bell operator. Both bounds apply techniques from convex
optimization and the methodology for creating upper bounds allows them to be
systematically improved. In many cases these bounds determine measurements that
would demonstrate violation of the Bell inequality or provide a bound that
rules out the possibility of a violation. Examples are given to illustrate how
these algorithms can be used to conclude definitively whether some quantum
states violate a given Bell inequality.
Comment: 13 pages, 1 table, 2 figures. Updated version as published in PR
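The paper's bounding algorithms rely on convex optimization, which is not reproduced here. The minimal sketch below only illustrates the underlying object: the expectation value of a Bell operator for one fixed choice of measurements, which is by itself a lower bound on the maximal expectation value. It uses the CHSH inequality and a maximally entangled two-qubit state.

```python
import numpy as np

# Pauli matrices and the maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def E(a_op, b_op, state):
    """Correlation term <state| a_op (x) b_op |state>."""
    return np.real(state.conj() @ np.kron(a_op, b_op) @ state)

# Measurement settings known to be optimal for CHSH on |Phi+>
A1, A2 = Z, X
B1, B2 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

# CHSH combination: any local realistic model obeys |chsh| <= 2
chsh = E(A1, B1, phi) + E(A1, B2, phi) + E(A2, B1, phi) - E(A2, B2, phi)
# chsh equals the Tsirelson bound 2*sqrt(2) > 2, certifying a violation
```

A fixed-measurement evaluation like this is the "lower bound" side of the problem; the abstract's harder contribution is the systematically improvable upper bound that can rule violations out.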
On the application of optimal wavelet filter banks for ECG signal classification
This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as training inputs to a neural network classifier.
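As a minimal illustration of a perfect reconstruction two-channel filter bank, the sketch below uses the fixed Haar filter pair rather than the paper's optimized filters: it splits a signal into lowpass and highpass subbands, recurses on the lowpass band, and reconstructs the input exactly. The flattened coefficients would play the role of the classifier inputs; the random signal is a stand-in for a real ECG beat.

```python
import numpy as np

def analysis(x):
    """One level of an orthogonal two-channel filter bank (Haar case).

    Splits x into a lowpass (approximation) and a highpass (detail) subband,
    each downsampled by 2 -- the building block of a wavelet decomposition.
    """
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)
    detail = (even - odd) / np.sqrt(2)
    return approx, detail

def synthesis(approx, detail):
    """Inverse of `analysis`: perfect reconstruction up to rounding."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

rng = np.random.default_rng(0)
ecg_like = rng.standard_normal(256)      # stand-in for an ECG beat
a, d1 = analysis(ecg_like)               # level 1
a, d2 = analysis(a)                      # level 2 (recurse on lowpass band)
features = np.concatenate([a, d2, d1])   # parsimonious classifier input
rec = synthesis(synthesis(a, d2), d1)    # rec matches ecg_like exactly
```

The paper's contribution is to replace the fixed Haar/Daubechies coefficients with filter parameters tuned by numerical optimization while preserving this perfect reconstruction property.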
An economic evaluation of contingency management for completion of hepatitis B vaccination in those on treatment for opiate dependence
Aims: To determine whether the provision of contingency management using financial incentives to improve hepatitis B vaccine completion in people who inject drugs entering community treatment represents a cost-effective use of healthcare resources.
Design: A probabilistic cost-effectiveness analysis was conducted, using a decision-tree to estimate the short-term clinical and healthcare cost impact of the vaccination strategies, followed by a Markov process to evaluate the long-term clinical consequences and costs associated with hepatitis B infection.
Settings and participants: Data on attendance for vaccination were taken from a UK cluster randomised trial.
Intervention: Two contingency management options were examined in the trial: fixed vs. escalating schedule financial incentives.
Measurement: Lifetime healthcare costs and quality-adjusted life years discounted at 3.5% annually; incremental cost-effectiveness ratios.
Findings: The estimated incremental lifetime healthcare cost of the contingency management strategy versus usual care was £22 (95% CI: -£12 to £40) per person offered the incentive. For 1,000 people offered the incentive, the incremental number of hepatitis B infections avoided over their lifetime was estimated at 19 (95% CI: 8 to 30). The probabilistic incremental cost per quality-adjusted life year gained of the contingency management programme was estimated to be £6,738 (95% CI: £6,297 to £7,172), with an 89% probability of being considered cost-effective at a threshold of £20,000 per quality-adjusted life year gained (98% at £30,000).
Conclusions: Using financial incentives to increase hepatitis B vaccination completion in people who inject drugs could be a cost-effective use of healthcare resources in the UK, as long as the incidence remains above 1.2%.
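The cost-effectiveness arithmetic implied by the reported figures can be sketched as follows; the per-person QALY gain is back-calculated from the abstract's point estimates and is illustrative only, not a value reported by the study.

```python
# Deterministic ICER arithmetic behind the cost-effectiveness finding.
# incremental_cost and icer are point estimates quoted in the abstract;
# incremental_qaly is back-calculated from them (illustrative assumption).
incremental_cost = 22.0                     # GBP per person offered the incentive
icer = 6738.0                               # GBP per QALY gained (reported ICER)
incremental_qaly = incremental_cost / icer  # implied QALY gain per person

threshold = 20000.0                         # GBP per QALY (NICE-style threshold)
# The intervention is deemed cost-effective when the ICER is below the
# threshold, equivalently when the net monetary benefit is positive:
net_monetary_benefit = incremental_qaly * threshold - incremental_cost
cost_effective = icer < threshold
```

The abstract's probabilistic analysis goes further than this point calculation: it propagates parameter uncertainty through the decision tree and Markov model to obtain the 89% probability of cost-effectiveness at the £20,000 threshold.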
Three Dimensional Numerical General Relativistic Hydrodynamics I: Formulations, Methods, and Code Tests
This is the first in a series of papers on the construction and validation of
a three-dimensional code for general relativistic hydrodynamics, and its
application to general relativistic astrophysics. This paper studies the
consistency and convergence of our general relativistic hydrodynamic treatment
and its coupling to the spacetime evolutions described by the full set of
Einstein equations with a perfect fluid source. The numerical treatment of the
general relativistic hydrodynamic equations is based on high resolution shock
capturing schemes. These schemes rely on the characteristic information of the
system. A spectral decomposition for general relativistic hydrodynamics
suitable for a general spacetime metric is presented. Evolutions based on three
different approximate Riemann solvers coupled to four different discretizations
of the Einstein equations are studied and compared. The coupling between the
hydrodynamics and the spacetime (the right and left hand side of the Einstein
equations) is carried out in a treatment which is second order accurate in
both space and time. Convergence tests for all twelve combinations with a
variety of test beds are studied, showing consistency with the differential
equations and correct convergence properties. The test beds examined include
shocktubes, Friedmann-Robertson-Walker cosmology tests, evolutions of
self-gravitating compact (TOV) stars, and evolutions of relativistically
boosted TOV stars. Special attention is paid to the numerical evolution of
strongly gravitating objects, e.g., neutron stars, in the full theory of
general relativity, including a simple, yet effective treatment for the surface
region of the star (where the rest mass density abruptly drops to zero).
Comment: 45 pages RevTeX, 34 figures
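High resolution shock capturing schemes of the kind described are built from approximate Riemann solvers evaluated at cell interfaces. The sketch below shows that idea on the scalar Burgers equation with an HLL flux, far simpler than the paper's general relativistic system; the wave-speed estimates and grid parameters are chosen for this scalar toy problem.

```python
import numpy as np

def hll_flux(uL, uR):
    """HLL approximate Riemann solver flux for Burgers' equation f(u) = u^2/2."""
    fL, fR = 0.5 * uL**2, 0.5 * uR**2
    sL = np.minimum(uL, uR)            # lower wave-speed estimate (f'(u) = u)
    sR = np.maximum(uL, uR)            # upper wave-speed estimate
    denom = np.maximum(sR - sL, 1e-12) # avoid 0/0 where uL == uR
    hll = (sR * fL - sL * fR + sL * sR * (uR - uL)) / denom
    return np.where(sL >= 0, fL, np.where(sR <= 0, fR, hll))

# Riemann problem: right-moving shock from u = 1 into u = 0.
N = 200
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
u = np.where(x < 0.5, 1.0, 0.0)

t_end, n_steps = 0.25, 200
dt = t_end / n_steps                   # CFL number dt*max|u|/dx ~ 0.25, stable
for _ in range(n_steps):
    F = hll_flux(u[:-1], u[1:])            # flux at each cell interface
    u[1:-1] -= dt / dx * (F[1:] - F[:-1])  # conservative finite-volume update
# Rankine-Hugoniot shock speed is 1/2, so the front sits near x = 0.625,
# captured over a few cells with no spurious oscillations.
```

The paper's schemes work the same way in spirit but on the coupled system of general relativistic hydrodynamic equations, where the wave speeds come from the spectral decomposition of the flux Jacobian described in the abstract.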