2,264 research outputs found
EffiTest: Efficient Delay Test and Statistical Prediction for Configuring Post-silicon Tunable Buffers
At nanometer manufacturing technology nodes, process variations significantly affect circuit performance. To combat them, post-silicon clock tuning buffers can be deployed to balance timing budgets of critical paths for each individual chip after manufacturing. The challenge of this method is that path delays must be measured for each chip to configure the tuning buffers properly. Current methods for this delay measurement rely on path-wise frequency stepping. This strategy, however, requires too much time from expensive testers. In this paper, we propose an efficient delay test framework (EffiTest) to solve the post-silicon testing problem by aligning path delays using the already-existing tuning buffers in the circuit. In addition, we test only representative paths, and the delays of other paths are estimated by statistical delay prediction. Experimental results demonstrate that the proposed method can reduce the number of frequency stepping iterations by more than 94% with only a slight yield loss.
Comment: ACM/IEEE Design Automation Conference (DAC), June 201
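The representative-path idea can be illustrated with a toy sketch (hypothetical delay values and a single shared process-variation factor per chip, not the paper's actual EffiTest model): measure only two representative paths, fit the chip's variation factor by least squares, and predict the remaining path delays statistically instead of frequency-stepping each one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: nominal delays (ns) for 6 paths on one chip; a shared
# process-variation factor scales all paths, plus small path-local noise.
nominal = np.array([1.00, 1.10, 0.95, 1.20, 1.05, 0.90])
chip_factor = 1.08                      # this chip runs 8% slow
true_delays = nominal * chip_factor + rng.normal(0, 0.005, size=6)

# Test only two representative paths ...
measured_idx = [0, 3]
measured = true_delays[measured_idx]

# ... and estimate the shared factor by least squares on the measured pair.
est_factor = np.sum(measured * nominal[measured_idx]) / np.sum(nominal[measured_idx] ** 2)

# Predict the remaining path delays statistically instead of stepping frequency.
predicted = nominal * est_factor
err = np.abs(predicted - true_delays)
print(est_factor, err.max())
```

Under this (assumed) single-factor variation model, two measurements suffice to predict all six delays to within the measurement noise.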
Childhood predictors of successful self-reported delinquents
The main aim of this research is to investigate the childhood predictors of successful self-reported delinquents, defined as those who were not convicted. In the Cambridge Study in Delinquent Development (CSDD), 411 London males have been followed up from age 8 to age 61. Self-reported offending was measured for the whole sample for ages 10-14, 15-18, 27-32, and 42-47, for five crimes: burglary, theft of a vehicle, theft from a vehicle, shoplifting, and vandalism. The prevalence of self-reported offending was 64% at ages 10-18 and 11% at ages 27-47, compared with the prevalence of convictions of 23% and 8%, respectively. Successful self-reported delinquents were defined as those who offended between ages 10 and 18 but were not convicted up to age 26; 126 successful delinquents were compared with 120 convicted delinquents. Sixteen childhood factors, including attainment, self-control, socioeconomic, parental, family and behavioural factors, predicted successful self-reported delinquents. The most important independent predictors were committing less serious and fewer offences as well as high school attainment, unconvicted parents, low risk-taking, and unseparated families.
How Algorithmic Confounding in Recommendation Systems Increases Homogeneity and Decreases Utility
Recommendation systems are ubiquitous and impact many domains; they have the
potential to influence product consumption, individuals' perceptions of the
world, and life-altering decisions. These systems are often evaluated or
trained with data from users already exposed to algorithmic recommendations;
this creates a pernicious feedback loop. Using simulations, we demonstrate how
using data confounded in this way homogenizes user behavior without increasing
utility
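A minimal simulation in the spirit of this abstract (the follow rule, item counts, and concentration measure are illustrative assumptions, not the paper's model) shows how a popularity-driven recommender trained on its own exposed users concentrates consumption:

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, n_rounds = 200, 50, 30

# Hypothetical sketch: each user has private utilities; the recommender
# always pushes the currently most-consumed item (popularity feedback).
utility = rng.random((n_users, n_items))
counts = np.ones(n_items)

def herfindahl(c):
    """Concentration of consumption; higher means more homogeneous behavior."""
    p = c / c.sum()
    return float((p ** 2).sum())

start = herfindahl(counts)
for _ in range(n_rounds):
    recommended = int(np.argmax(counts))
    # Users follow the recommendation unless their own best item is much better.
    follow = utility[:, recommended] > utility.max(axis=1) - 0.3
    chosen = np.where(follow, recommended, utility.argmax(axis=1))
    np.add.at(counts, chosen, 1)
end = herfindahl(counts)
print(start, end)
```

Because future consumption data are confounded by past recommendations, concentration rises even though no user's underlying preferences changed.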
Principal component analysis - an efficient tool for variable stars diagnostics
We present two diagnostic methods based on ideas of Principal Component
Analysis and demonstrate their efficiency for sophisticated processing of
multicolour photometric observations of variable objects.
Comment: 8 pages, 4 figures. Published already
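As a toy illustration of PCA on multicolour photometry (synthetic light curves, not the paper's data), the first principal component of the mean-centred magnitude matrix captures a single shared variability mode across bands:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sketch: 100 observations of a variable star in 4 photometric
# bands; one physical mode (e.g. pulsation) drives correlated variations.
t = np.linspace(0, 10, 100)
mode = np.sin(2 * np.pi * t / 2.5)                 # shared variability signal
loadings = np.array([1.0, 0.8, 0.6, 0.4])          # band-dependent amplitudes
mags = np.outer(mode, loadings) + rng.normal(0, 0.05, (100, 4))

# PCA via SVD of the mean-centred data matrix.
X = mags - mags.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(explained[0])   # first component should capture most of the variance
```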
Using audit and feedback to increase clinician adherence to clinical practice guidelines in brain injury rehabilitation: v
Objective: This study evaluated whether frequent (fortnightly) audit and feedback cycles over a sustained period of time (>12 months) increased clinician adherence to recommended guidelines in acquired brain injury rehabilitation. Design: A before-and-after study design. Setting: A metropolitan inpatient brain injury rehabilitation unit. Participants: Clinicians; medical, nursing and allied health staff. Interventions: Fortnightly cycles of audit and feedback for 14 months. Each fortnight, medical file and observational audits were completed against 114 clinical indicators. Main outcome measure: Adherence to guideline indicators before and after intervention, calculated by proportions, Mann-Whitney U and chi-square analysis. Results: Clinically and statistically significant improvements in median clinical indicator adherence were found immediately following the audit and feedback program, from 38.8% (95% CI 34.3 to 44.4) to 83.6% (95% CI 81.8 to 88.5). Three months after cessation of the intervention, median adherence had decreased from 82.3% to 76.6% (95% CI 72.7 to 83.3, p …). Conclusion: A fortnightly audit and feedback program increased clinicians' adherence to guideline recommendations in an inpatient acquired brain injury rehabilitation setting. We propose that future studies build on the evidence-based method used in the present study to determine effectiveness and develop an implementation toolkit for scale-up.
A model-based multithreshold method for subgroup identification
The thresholding variable plays a crucial role in subgroup identification for personalized medicine. Most existing partitioning methods split the sample based on one predictor variable. In this paper, we consider setting the splitting rule from a combination of multivariate predictors, such as latent factors, principal components, and weighted sums of predictors. Such a subgrouping method may lead to more meaningful partitioning of the population than using a single variable. In addition, our method is based on a change point regression model and thus yields straightforward model-based prediction results. After choosing a particular thresholding variable form, we apply a two-stage multiple change point detection method to determine the subgroups and estimate the regression parameters. We show that our approach can produce two or more subgroups from the multiple change points and identify the true grouping with high probability. In addition, our estimation results enjoy oracle properties. We design a simulation study to compare the performance of our proposed and existing methods and apply them to analyze data sets from a scleroderma trial and a breast cancer study.
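A minimal single-change-point sketch (synthetic data, hypothetical weights for the thresholding variable, and a plain grid search rather than the paper's two-stage procedure) illustrates splitting on a weighted sum of predictors:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# Hypothetical sketch: the thresholding variable is a weighted sum of two
# predictors; the regression slope changes at the (unknown) change point 0.5.
z1, z2 = rng.random(n), rng.random(n)
w = np.array([0.7, 0.3])                   # assumed combination weights
thresh_var = w[0] * z1 + w[1] * z2
x = rng.random(n)
y = np.where(thresh_var < 0.5, 1.0 * x, 3.0 * x) + rng.normal(0, 0.1, n)

def rss_split(c):
    """Total residual sum of squares when splitting the sample at point c."""
    total = 0.0
    for mask in (thresh_var < c, thresh_var >= c):
        if mask.sum() < 10:
            return np.inf
        slope = np.sum(x[mask] * y[mask]) / np.sum(x[mask] ** 2)
        total += np.sum((y[mask] - slope * x[mask]) ** 2)
    return total

grid = np.linspace(0.1, 0.9, 81)
best = grid[np.argmin([rss_split(c) for c in grid])]
print(best)   # estimated change point, near the true value 0.5
```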
High-Dimensional Inference with the generalized Hopfield Model: Principal Component Analysis and Corrections
We consider the problem of inferring the interactions between a set of N
binary variables from the knowledge of their frequencies and pairwise
correlations. The inference framework is based on the Hopfield model, a special
case of the Ising model where the interaction matrix is defined through a set
of patterns in the variable space, and is of rank much smaller than N. We show
that Maximum Likelihood inference is deeply related to Principal Component Analysis when the amplitude of the pattern components, xi, is negligible compared to N^1/2. Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in xi/N^1/2. We stress that
it is important to generalize the Hopfield model and include both attractive
and repulsive patterns, to correctly infer networks with sparse and strong
interactions. We present a simple geometrical criterion to decide how many
attractive and repulsive patterns should be considered as a function of the
sampling noise. We moreover discuss how many sampled configurations are
required for a good inference, as a function of the system size, N and of the
amplitude, xi. The inference approach is illustrated on synthetic and
biological data.
Comment: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics (2011), to appear
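A toy version of the PCA connection (synthetic spins tied to a single assumed pattern through a latent field, not the paper's inference machinery): when the data have approximately rank-one pairwise correlations, the top eigenvector of the correlation matrix recovers the pattern direction.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M = 20, 5000

# Hypothetical sketch: one attractive pattern xi; samples are binary spins
# tied to xi through a shared latent field, so the pairwise correlation
# matrix has (approximately) rank-one off-diagonal structure.
xi = rng.choice([-1.0, 1.0], size=N)
z = rng.normal(size=(M, 1))                       # latent field per sample
S = np.sign(0.8 * z * xi[None, :] + rng.normal(size=(M, N)))

# PCA on the centred correlation matrix recovers the pattern direction.
X = S - S.mean(axis=0)
C = X.T @ X / M
eigvals, eigvecs = np.linalg.eigh(C)
top = eigvecs[:, -1]                              # principal component
overlap = abs(top @ xi) / np.sqrt(N)
print(overlap)   # near 1 when the top component aligns with the pattern
```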
Mesoscopic Model for Free Energy Landscape Analysis of DNA sequences
A mesoscopic model which allows us to identify and quantify the strength of
binding sites in DNA sequences is proposed. The model is based on the
Peyrard-Bishop-Dauxois model for the DNA chain coupled to a Brownian particle
which explores the sequence, interacting preferentially with open base pairs
of the DNA chain. We apply the model to promoter sequences of different
organisms. The free energy landscape obtained for these promoters shows a
complex structure that is strongly connected to their biological behavior. The
analysis method used is able to quantify free energy differences of sites
within genome sequences.
Comment: 7 pages, 5 figures, 1 table
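The on-site part of the Peyrard-Bishop-Dauxois model is a Morse potential, V(y) = D(e^{-ay} - 1)^2, where y is the base-pair opening; a short sketch (the parameter values below are illustrative, not fitted constants) shows the deeper, stiffer well commonly assigned to GC pairs:

```python
import numpy as np

# Morse on-site potential of the Peyrard-Bishop-Dauxois model:
# V(y) = D * (exp(-a*y) - 1)^2, y = base-pair opening.
def morse(y, D, a):
    return D * np.expm1(-a * y) ** 2   # expm1(x) = exp(x) - 1

y = np.linspace(-0.2, 1.0, 200)        # opening distance (arbitrary units)
V_AT = morse(y, D=0.05, a=4.0)         # weaker pair (2 hydrogen bonds)
V_GC = morse(y, D=0.075, a=6.9)        # stronger pair (3 hydrogen bonds)

# A deeper/stiffer well for GC means more energy is needed to open the pair.
print(morse(0.2, 0.05, 4.0), morse(0.2, 0.075, 6.9))
```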
Neuronal assembly dynamics in supervised and unsupervised learning scenarios
The dynamic formation of groups of neurons (neuronal assemblies) is believed to mediate cognitive phenomena at many levels, but their detailed operation and mechanisms of interaction are still to be uncovered. One hypothesis suggests that synchronized oscillations underpin their formation and functioning, with a focus on the temporal structure of neuronal signals. In this context, we investigate neuronal assembly dynamics in two complementary scenarios: the first, a supervised spike pattern classification task, in which noisy variations of a collection of spikes have to be correctly labeled; the second, an unsupervised, minimally cognitive evolutionary robotics task, in which an evolved agent has to cope with multiple, possibly conflicting, objectives. In both cases, the more traditional dynamical analysis of the system's variables is paired with information-theoretic techniques in order to get a broader picture of the ongoing interactions with and within the network. The neural network model is inspired by the Kuramoto model of coupled phase oscillators and allows one to fine-tune the network synchronization dynamics and assembly configuration. The experiments explore the computational power, redundancy, and generalization capability of neuronal circuits, demonstrating that performance depends nonlinearly on the number of assemblies and neurons in the network and showing that the framework can be exploited to generate minimally cognitive behaviors, with dynamic assembly formation accounting for varying degrees of stimuli modulation of the sensorimotor interactions.
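A minimal Kuramoto-style sketch (hypothetical size, frequencies, and coupling, not the paper's evolved network) of phase oscillators synchronizing, with synchrony measured by the order parameter r = |mean(e^{i theta})|:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20                                     # number of oscillators (assumed)

# Kuramoto dynamics: dtheta_i/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i)
omega = rng.normal(0, 0.5, n)              # natural frequencies
theta = rng.uniform(0, 2 * np.pi, n)       # random initial phases
K, dt = 4.0, 0.01                          # coupling well above threshold

def order_parameter(th):
    """Degree of synchrony r in [0, 1]; 1 means fully phase-locked."""
    return abs(np.exp(1j * th).mean())

r_start = order_parameter(theta)
for _ in range(2000):                      # forward-Euler integration
    coupling = (K / n) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta = theta + dt * (omega + coupling)
r_end = order_parameter(theta)
print(r_start, r_end)
```

With coupling this strong the population phase-locks, so r climbs from its initial random value toward 1; tuning K is the knob the abstract alludes to for controlling synchronization dynamics.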