EffiTest: Efficient Delay Test and Statistical Prediction for Configuring Post-silicon Tunable Buffers
At nanometer manufacturing technology nodes, process variations significantly affect circuit performance. To combat them, post-silicon clock tuning buffers can be deployed to balance timing budgets of critical paths for each individual chip after manufacturing. The challenge of this method is that path delays should be measured for each chip to configure the tuning buffers properly. Current methods for this delay measurement rely on path-wise frequency stepping. This strategy, however, requires too much time from expensive testers. In this paper, we propose an efficient delay test framework (EffiTest) to solve the post-silicon testing problem by aligning path delays using the already-existing tuning buffers in the circuit. In addition, we only test representative paths, and the delays of other paths are estimated by statistical delay prediction. Experimental results demonstrate that the proposed method can reduce the number of frequency stepping iterations by more than 94% with only a slight yield loss. Comment: ACM/IEEE Design Automation Conference (DAC), June 201
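The statistical prediction step can be illustrated generically: when path delays are correlated through shared process parameters, delays measured on a few representative paths can be regressed against the delays of the remaining paths. The sketch below shows that idea on synthetic data only; the correlation model, the linear estimator, and all names are my own assumptions, not EffiTest's actual prediction method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic chip population: path delays share a global process component
# plus small independent local variation (illustrative model only).
n_chips, n_paths = 500, 40
global_var = rng.normal(0.0, 1.0, size=(n_chips, 1))
nominal = rng.uniform(8.0, 12.0, size=n_paths)        # ns, arbitrary
sensitivity = rng.uniform(0.2, 0.6, size=n_paths)
delays = nominal + global_var * sensitivity + rng.normal(0.0, 0.05, (n_chips, n_paths))

# A few "representative" paths are actually measured on the tester.
rep = [0, 1, 2, 3]
others = [p for p in range(n_paths) if p not in rep]

# Fit a least-squares predictor from representative-path delays to every
# other path, using a training population of fully characterized chips.
train, test = slice(0, 400), slice(400, None)
X_train = np.c_[np.ones(400), delays[train][:, rep]]
coef, *_ = np.linalg.lstsq(X_train, delays[train][:, others], rcond=None)

# Predict unmeasured path delays on new chips from the few measured paths.
X_test = np.c_[np.ones(100), delays[test][:, rep]]
pred = X_test @ coef
rmse = np.sqrt(np.mean((pred - delays[test][:, others]) ** 2))
print(f"prediction RMSE: {rmse:.3f} ns")
```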
Principal component analysis - an efficient tool for variable stars diagnostics
We present two diagnostic methods based on ideas of Principal Component
Analysis and demonstrate their efficiency for sophisticated processing of
multicolour photometric observations of variable objects. Comment: 8 pages, 4 figures. Published already
Principal Component Analysis with Noisy and/or Missing Data
We present a method for performing Principal Component Analysis (PCA) on
noisy datasets with missing values. Estimates of the measurement error are used
to weight the input data such that compared to classic PCA, the resulting
eigenvectors are more sensitive to the true underlying signal variations rather
than being pulled by heteroskedastic measurement noise. Missing data is simply
the limiting case of weight=0. The underlying algorithm is a noise weighted
Expectation Maximization (EM) PCA, which has additional benefits of
implementation speed and flexibility for smoothing eigenvectors to reduce the
noise contribution. We present applications of this method on simulated data
and QSO spectra from the Sloan Digital Sky Survey. Comment: Accepted for publication in PASP; v2 with minor updates, mostly to bibliography
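The weighting idea extends standard EM PCA directly: each matrix element enters with its inverse-variance weight, and a weight of zero reproduces the missing-data case. Below is a minimal rank-1 sketch of that iteration on mean-subtracted data; the function name and convergence settings are my own, and the published algorithm's handling of multiple eigenvectors and eigenvector smoothing is omitted.

```python
import numpy as np

def weighted_empca_rank1(X, W, n_iter=100):
    """Rank-1 sketch of noise-weighted EM PCA.

    X : (n_obs, n_var) mean-subtracted data
    W : (n_obs, n_var) inverse-variance weights; W = 0 marks missing values
    Returns the leading eigenvector phi and per-observation coefficients c.
    """
    rng = np.random.default_rng(0)
    phi = rng.normal(size=X.shape[1])
    phi /= np.linalg.norm(phi)
    c = np.zeros(X.shape[0])
    for _ in range(n_iter):
        # E-step: weighted least-squares coefficient for each observation
        c = (W * X) @ phi / np.maximum((W * phi**2).sum(axis=1), 1e-12)
        # M-step: weighted least-squares update of the eigenvector
        phi = (W * X).T @ c / np.maximum(W.T @ c**2, 1e-12)
        phi /= np.linalg.norm(phi)
    return phi, c
```

Further components would follow by deflating the fitted rank-1 term from X and repeating, or by solving the coupled weighted least-squares problems jointly.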
Significance analysis and statistical mechanics: an application to clustering
This paper addresses the statistical significance of structures in random data: Given a set of vectors and a measure of mutual similarity, how likely is it that a subset of these vectors forms a cluster with enhanced similarity among its elements? The computation of this cluster p-value for randomly distributed
vectors is mapped onto a well-defined problem of statistical mechanics. We
solve this problem analytically, establishing a connection between the physics
of quenched disorder and multiple testing statistics in clustering and related
problems. In an application to gene expression data, we find a remarkable link
between the statistical significance of a cluster and the functional
relationships between its genes. Comment: to appear in Phys. Rev. Lett.
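For small examples, the null question posed above can also be answered by brute force: draw random vectors, find the best-scoring subset of a given size, and count how often noise alone reaches an observed similarity. The sketch below does exactly that; the dot-product similarity, the exhaustive subset search, and all names are illustrative assumptions, and the paper's analytic statistical-mechanics treatment replaces this Monte Carlo search.

```python
import numpy as np
from itertools import combinations

def best_cluster_score(X, k):
    """Highest mean pairwise similarity (dot product) over all size-k subsets."""
    sims = X @ X.T
    best = -np.inf
    for idx in combinations(range(len(X)), k):
        sub = sims[np.ix_(idx, idx)]
        score = (sub.sum() - np.trace(sub)) / (k * (k - 1))
        best = max(best, score)
    return best

def cluster_pvalue(observed, n, d, k, n_trials=200, seed=0):
    """Monte Carlo p-value: chance that n random unit vectors in d dimensions
    contain a size-k subset with mean similarity at least `observed`."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_trials):
        X = rng.normal(size=(n, d))
        X /= np.linalg.norm(X, axis=1, keepdims=True)
        hits += best_cluster_score(X, k) >= observed
    return hits / n_trials

# Example: how often do 12 random unit vectors in 10 dimensions contain
# a 4-element subset with mean pairwise similarity of at least 0.5?
print(cluster_pvalue(observed=0.5, n=12, d=10, k=4))
```

The exhaustive search limits this to small n and k, which is precisely the multiple-testing burden the analytic mapping is designed to handle.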
Elastodynamics of radially inhomogeneous spherically anisotropic elastic materials in the Stroh formalism
A method is presented for solving elastodynamic problems in radially inhomogeneous elastic materials with spherical anisotropy, i.e. materials whose elastic moduli in a spherical coordinate system depend only on the radial coordinate, c_ijkl = c_ijkl(r). The time-harmonic displacement field is expanded in a separation of variables form, with the dependence on the angular coordinates described by vector spherical harmonics with r-dependent amplitudes. It is proved that such a separation of variables solution is generally possible only if the spherical anisotropy is restricted to transverse isotropy with the principal axis in the radial direction, in which case the amplitudes are determined by a first-order ordinary differential system. Restricted forms of the displacement field admit this type of separation of variables solution for certain lower material symmetries. These results extend the Stroh formalism of elastodynamics in rectangular and cylindrical systems to spherical coordinates. Comment: 15 pages
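As a schematic of the construction described above (the symbols P, B, C for the vector spherical harmonics, U, V, W for the radial amplitudes, and eta, G, T for the state vector, system matrix, and traction amplitudes are illustrative choices, not taken from the paper), the ansatz and the resulting radial system have the form:

```latex
% Time-harmonic separation-of-variables ansatz with r-dependent amplitudes
\mathbf{u}(r,\theta,\phi,t) = \sum_{n,m}
  \bigl[\, U_{nm}(r)\,\mathbf{P}_{nm}(\theta,\phi)
        + V_{nm}(r)\,\mathbf{B}_{nm}(\theta,\phi)
        + W_{nm}(r)\,\mathbf{C}_{nm}(\theta,\phi) \,\bigr]\, e^{-\mathrm{i}\omega t}

% Stroh-like first-order system for the state vector of displacement
% amplitudes and scaled radial traction amplitudes
\frac{\mathrm{d}\boldsymbol{\eta}}{\mathrm{d}r}
  = \frac{1}{r}\,\mathbf{G}(r)\,\boldsymbol{\eta}(r),
\qquad
\boldsymbol{\eta}(r) =
  \begin{pmatrix} \mathbf{U}(r) \\ r\,\mathbf{T}(r) \end{pmatrix}.
```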
Equitability revisited: why the “equitable threat score” is not equitable
In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
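The quoted behaviour of ETS is easy to check numerically: compute the score from the 2x2 contingency table and average it over many random, unbiased forecasts at a given base rate and sample size. A minimal Monte Carlo sketch follows; the variable names and simulation setup are mine, and the rescaling construction proposed by the authors is not reproduced.

```python
import numpy as np

def ets(a, b, c, d):
    """Equitable threat score from 2x2 contingency table counts:
    a hits, b false alarms, c misses, d correct rejections."""
    n = a + b + c + d
    a_r = (a + b) * (a + c) / n          # hits expected by chance
    return (a - a_r) / (a + b + c - a_r)

def expected_random_ets(base_rate, n_samples, n_trials=20000, seed=0):
    """Mean ETS of a random, unbiased forecasting system (forecast frequency
    equal to the observed base rate), estimated by Monte Carlo."""
    rng = np.random.default_rng(seed)
    obs = rng.random((n_trials, n_samples)) < base_rate
    fcst = rng.random((n_trials, n_samples)) < base_rate
    a = (fcst & obs).sum(axis=1)
    b = (fcst & ~obs).sum(axis=1)
    c = (~fcst & obs).sum(axis=1)
    d = (~fcst & ~obs).sum(axis=1)
    with np.errstate(invalid="ignore", divide="ignore"):
        scores = ets(a, b, c, d)
    return np.nanmean(scores)

# Small samples give a clearly positive expected score for random forecasts,
# illustrating that ETS is only asymptotically equitable.
print(expected_random_ets(base_rate=0.5, n_samples=10))
print(expected_random_ets(base_rate=0.5, n_samples=1000))
```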
High-Dimensional Inference with the generalized Hopfield Model: Principal Component Analysis and Corrections
We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space, and is of rank much smaller than N. We show that Maximum Likelihood inference is deeply related to Principal Component Analysis when the amplitude of the pattern components, xi, is negligible compared to N^(1/2). Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in xi/N^(1/2). We stress that it is important to generalize the Hopfield model and include both attractive and repulsive patterns, to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude xi. The inference approach is illustrated on synthetic and biological data. Comment: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics (2011), to appear
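The PCA connection can be illustrated very roughly: diagonalize the empirical correlation matrix and treat eigenvectors whose eigenvalues fall outside the pure-sampling-noise band as candidate attractive (top of the spectrum) or repulsive (bottom of the spectrum) patterns. The sketch below shows only this selection step, with a Marchenko-Pastur band as an assumed noise criterion; the paper's corrections in xi/N^(1/2) and its precise geometrical criterion are not reproduced.

```python
import numpy as np

def candidate_patterns(samples):
    """Crude PCA-based pattern selection from binary samples (B x N array).

    Eigenvectors of the correlation matrix whose eigenvalues lie outside an
    assumed pure-noise band are kept as candidate attractive (large eigenvalue)
    or repulsive (small eigenvalue) patterns."""
    B, N = samples.shape
    C = np.corrcoef(samples, rowvar=False)           # N x N correlation matrix
    evals, evecs = np.linalg.eigh(C)                  # ascending eigenvalues
    # Marchenko-Pastur edges for B independent samples of N variables (assumption).
    upper = (1 + np.sqrt(N / B)) ** 2
    lower = (1 - np.sqrt(N / B)) ** 2
    attractive = evecs[:, evals > upper]
    repulsive = evecs[:, evals < lower]
    return attractive, repulsive

# Example: independent random spins should yield (almost) no patterns
# outside the noise band.
samples = (np.random.default_rng(1).random((2000, 50)) < 0.5).astype(float)
print([p.shape[1] for p in candidate_patterns(samples)])
```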
Mesoscopic Model for Free Energy Landscape Analysis of DNA sequences
A mesoscopic model which allows us to identify and quantify the strength of binding sites in DNA sequences is proposed. The model is based on the Peyrard-Bishop-Dauxois model for the DNA chain, coupled to a Brownian particle which explores the sequence and interacts more strongly with open base pairs of the DNA chain. We apply the model to promoter sequences of different organisms. The free energy landscape obtained for these promoters shows a complex structure that is strongly connected to their biological behavior. The analysis method used is able to quantify free energy differences of sites within genome sequences. Comment: 7 pages, 5 figures, 1 table
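For reference, the sequence-dependent on-site Morse plus anharmonic stacking energy of the underlying Peyrard-Bishop-Dauxois chain can be written down in a few lines. The sketch below shows only that part; the parameter values are typical literature-style numbers chosen for illustration, and the coupling to the Brownian particle used in the paper is not included.

```python
import numpy as np

# Standard Peyrard-Bishop-Dauxois on-site + stacking potential; parameter
# values are illustrative assumptions, not the paper's fitted values.
MORSE = {"AT": (0.05, 4.2), "GC": (0.075, 6.9)}   # (D [eV], a [1/Angstrom])
K, RHO, ALPHA = 0.025, 2.0, 0.35                  # stacking parameters

def pbd_energy(y, sequence):
    """Potential energy of base-pair openings y (Angstrom) for a DNA sequence
    given as a string over A/T/G/C."""
    D, a = zip(*(MORSE["AT" if s in "AT" else "GC"] for s in sequence))
    D, a = np.array(D), np.array(a)
    morse = D * (np.exp(-a * y) - 1.0) ** 2                     # on-site opening
    dy = y[1:] - y[:-1]
    stack = 0.5 * K * (1.0 + RHO * np.exp(-ALPHA * (y[1:] + y[:-1]))) * dy**2
    return morse.sum() + stack.sum()

print(pbd_energy(np.zeros(8), "ATGCGCTA"))   # fully closed chain has zero energy
```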
Spatially Resolved Mapping of Local Polarization Dynamics in an Ergodic Phase of Ferroelectric Relaxor
Spatial variability of polarization relaxation kinetics in relaxor
ferroelectric 0.9Pb(Mg1/3Nb2/3)O3-0.1PbTiO3 is studied using time-resolved
Piezoresponse Force Microscopy. Local relaxation attributed to the
reorientation of polar nanoregions is shown to follow stretched exponential
dependence, exp(-(t/tau)^beta), with beta~~0.4, much larger than the
macroscopic value determined from dielectric spectra (beta~~0.09). The spatial
inhomogeneity of relaxation time distributions with the presence of 100-200 nm
"fast" and "slow" regions is observed. The results are analyzed to map the
Vogel-Fulcher temperatures on the nanoscale. Comment: 23 pages, 4 figures, supplementary materials attached; to be submitted to Phys. Rev. Lett.
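Extracting tau and beta from a local relaxation curve of the stretched-exponential form exp(-(t/tau)^beta) quoted above is a small nonlinear least-squares problem; a minimal sketch on synthetic data follows (all values are illustrative, not measured PFM data).

```python
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(t, amp, tau, beta):
    """Kohlrausch-Williams-Watts relaxation: amp * exp(-(t/tau)**beta)."""
    return amp * np.exp(-(t / tau) ** beta)

# Synthetic relaxation curve, roughly in the regime quoted in the abstract
# (beta around 0.4); values are illustrative only.
t = np.logspace(-3, 2, 200)                      # seconds
rng = np.random.default_rng(1)
signal = stretched_exp(t, 1.0, 0.5, 0.4) + rng.normal(0.0, 0.01, t.size)

popt, pcov = curve_fit(stretched_exp, t, signal, p0=(1.0, 1.0, 0.5),
                       bounds=([0, 1e-6, 0.05], [10, 1e3, 1.0]))
amp, tau, beta = popt
print(f"tau = {tau:.3g} s, beta = {beta:.2f}")
```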