Model-based asymptotically optimal dispersion measure correction for pulsar timing
In order to reach the sensitivity required to detect gravitational waves,
pulsar timing array experiments need to mitigate as much noise as possible in
timing data. A dominant component of this noise is likely due to variations in the
dispersion measure. To correct for such variations, we develop a statistical
method inspired by the maximum likelihood estimator and optimal filtering. Our
method consists of two major steps. First, the spectral index and amplitude of
dispersion measure variations are measured via a time-domain spectral analysis.
Second, the linear optimal filter is constructed based on the model parameters
found in the first step, and is used to extract the dispersion measure
variation waveforms. Compared to existing methods, this method has
better time resolution for the study of short timescale dispersion variations,
and generally produces smaller errors in waveform estimations. This method can
process irregularly sampled data without any interpolation because of its
time-domain nature. Furthermore, it offers the possibility to interpolate or
extrapolate the waveform estimation to regions where no data is available.
Examples using simulated data sets are included for demonstration.
Comment: 15 pages, 15 figures, submitted 15th Sept. 2013, accepted 2nd April 2014 by MNRAS.
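The two-step scheme above (fit a model of the dispersion measure variations, then apply a linear optimal filter) can be sketched as a conditional-mean estimator. This is a minimal illustration, not the authors' implementation: the covariance function, noise level and epochs below are placeholder assumptions, and a real analysis would derive the covariance from the spectral index and amplitude fitted in step one.

```python
import numpy as np

def optimal_filter_estimate(t_obs, y, t_pred, cov_func, noise_var):
    """Linear optimal (Wiener-like) filter in the time domain: the
    conditional mean of the signal at epochs t_pred, given irregularly
    sampled noisy data (t_obs, y).  Because everything is done in the
    time domain, no interpolation of the data is needed, and t_pred may
    fall inside gaps or outside the observed span."""
    C_ss = cov_func(t_obs[:, None], t_obs[None, :])   # signal covariance at the data epochs
    C_ps = cov_func(t_pred[:, None], t_obs[None, :])  # cross-covariance, prediction vs. data
    K = C_ss + noise_var * np.eye(len(t_obs))         # add white measurement noise
    return C_ps @ np.linalg.solve(K, y)               # s_hat = C_ps (C_ss + N)^(-1) y
```

With a covariance built from the measured spectral model, the same expression interpolates or extrapolates the waveform to epochs where no data exist, as described above.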
A study into urban roadworks with shuttle-lane operation
In urban areas where roadworks are required, single-lane shuttle operation is applied, especially where road space is limited. There are operational problems relating to the site, such as site geometry, visibility, length of the roadworks zone, position of signs and other traffic control devices, and signal timing. Other problems are mainly related to drivers’ behaviour and their compliance with traffic controls on site.
The reduced road width caused by the works will interrupt the free flow of traffic and it can also add to the risks to road users. In addition, shuttle operation may introduce long queues and increase delays especially during peak periods.
There is a need to identify those parameters and behaviours which might influence traffic performance in terms of safety and capacity. An investigation of four urban roadworks sites within the Greater Manchester area was undertaken for this purpose. Parameters included in the examination were the position of the STOP sign, signal timing, weather conditions, time headway, vehicle speed and the percentage of heavy goods vehicles (HGV) in the traffic stream. Statistical analysis and comparisons between sites were conducted. Other factors related to the operation of the shuttle-lane were identified from site observations.
The effect of board structure on stock picking and market timing abilities of the Egyptian mutual fund managers: Evidence from financial crisis
This paper seeks to examine the effect of mutual fund governance on stock selection and market timing abilities.
This paper applies a Structural Equation Modelling technique to solve the potential endogeneity problem
between internal governance measures and stock selection and market timing. The main conclusion of this paper
is to provide evidence, through robust statistical analysis, on the usefulness of governance attributes for Egyptian
mutual funds' stock selection and market timing abilities. Accordingly, the financial crisis demonstrates a need to
modify some recommendations contained in the OECD methodology for evaluating the implementation of the
OECD Principles of Corporate Governance. This paper finds that board size and the proportion of independent
directors are negatively associated with stock selection, and that the proportion of directors holding zero shares is
positively associated with stock selection.
Keywords: Corporate Governance, Mutual Fund, Endogeneity
Integrating Phase 2 into Phase 3 based on an Intermediate Endpoint While Accounting for a Cure Proportion -- with an Application to the Design of a Clinical Trial in Acute Myeloid Leukemia
For a trial with primary endpoint overall survival for a molecule with
curative potential, statistical methods that rely on the proportional hazards
assumption may underestimate the power and the time to final analysis. We show
how a cure proportion model can be used to get the necessary number of events
and appropriate timing via simulation. If Phase 1 results for the new drug are
exceptional and/or the medical need in the target population is high, a Phase 3
trial might be initiated after Phase 1. Building in a futility interim analysis
into such a pivotal trial may mitigate the uncertainty of moving directly to
Phase 3. However, if cure is possible, overall survival might not be mature
enough at the interim to support a futility decision. We propose to base this
decision on an intermediate endpoint that is sufficiently associated with
survival. Planning for such an interim can be interpreted as making a
randomized Phase 2 trial a part of the pivotal trial: if stopped at the
interim, the trial data would be analyzed and a decision on a subsequent Phase
3 trial would be made. If the trial continues at the interim then the Phase 3
trial is already underway. To select a futility boundary, a mechanistic
simulation model that connects the intermediate endpoint and survival is
proposed. We illustrate how this approach was used to design a pivotal
randomized trial in acute myeloid leukemia, discuss historical data that
informed the simulation model, and operational challenges when implementing it.
Comment: 23 pages, 3 figures, 3 tables. All code is available on github:
https://github.com/numbersman77/integratePhase2.gi
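A mixture cure model of the kind described can be simulated in a few lines to obtain the number of events over calendar time. This is an illustrative sketch with invented parameters (cure proportion, median survival, accrual duration), not the trial's actual design code:

```python
import numpy as np

def time_to_target_events(n_patients, cure_prop, median_noncured,
                          accrual_months, target_events, rng):
    """One trial realization under a mixture cure model: a fraction
    `cure_prop` of patients never has an event; the rest have
    exponential survival with the given median.  Returns the calendar
    time (months) at which `target_events` events have occurred."""
    entry = rng.uniform(0.0, accrual_months, n_patients)       # uniform accrual
    cured = rng.random(n_patients) < cure_prop
    lam = np.log(2.0) / median_noncured                        # exponential rate
    calendar = entry + rng.exponential(1.0 / lam, n_patients)  # entry + event time
    calendar[cured] = np.inf                                   # cured patients never fail
    calendar.sort()
    return calendar[target_events - 1]

rng = np.random.default_rng(0)
# distribution of the calendar time to 100 events (illustrative numbers only)
times = np.array([time_to_target_events(300, 0.3, 12.0, 18.0, 100, rng)
                  for _ in range(200)])
```

Averaging `times` over many simulations gives the expected timing of the analysis, which a proportional-hazards calculation would tend to underestimate when a cure proportion is present, as noted above.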
Efficient Monte Carlo Based Methods for Variability Aware Analysis and Optimization of Digital Circuits.
Process variability is of increasing concern in modern nanometer-scale CMOS. The
suitability of Monte Carlo based algorithms for efficient analysis and optimization of
digital circuits under variability is explored in this work. Random sampling based Monte
Carlo techniques incur high cost of computation, due to the large sample size required to
achieve target accuracy. This motivates the need for intelligent sample selection
techniques to reduce the number of samples. As these techniques depend on information
about the system under analysis, there is a need to tailor the techniques to fit the specific
application context. We propose efficient smart sampling based techniques for timing and
leakage power consumption analysis of digital circuits. For the case of timing analysis, we
show that the proposed method requires 23.8X fewer samples on average to achieve
comparable accuracy as a random sampling approach, for benchmark circuits studied. It is
further illustrated that the parallelism available in such techniques can be exploited using
parallel machines, especially Graphics Processing Units. Here, we show that SH-QMC
implemented on a Multi GPU is twice as fast as a single STA on a CPU for benchmark
circuits considered. Next we study the possibility of using such information from
statistical analysis to optimize digital circuits under variability, for example to achieve
minimum area on silicon through gate sizing while meeting a timing constraint. Though
several techniques to optimize circuits have been proposed in the literature, it is not clear how
much gain is obtained in these approaches specifically through the utilization of statistical
information. Therefore, an effective lower bound computation technique is proposed to
enable efficient comparison of statistical design optimization techniques. It is shown that
even techniques which use only limited statistical information can achieve results to
within 10% of the proposed lower bound. We conclude that future optimization research
should shift focus from use of more statistical information to achieving more efficiency
and parallelism to obtain speed-ups.
Ph.D. dissertation, Electrical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/78936/1/tvvin_1.pd
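As an illustration of why intelligent sample selection matters, the sketch below compares plain random Monte Carlo with a simple stratified (Latin hypercube) scheme on a toy delay model. The model, sample sizes and gains are invented for the example; the dissertation's SH-QMC approach is more sophisticated:

```python
import numpy as np

def toy_path_delay(x):
    """Toy delay model: near-additive in the process parameters x
    (shape (n, d), each parameter scaled to [0, 1])."""
    return x.sum(axis=1) + 0.5 * x[:, 0] * x[:, 1]

def random_samples(n, d, rng):
    return rng.random((n, d))                       # plain Monte Carlo

def latin_hypercube(n, d, rng):
    """One sample per equal-width stratum in each dimension,
    with the strata independently permuted across dimensions."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = u[rng.permutation(n), j]
    return u

rng = np.random.default_rng(1)
n, d, reps = 64, 4, 200
est_mc  = [toy_path_delay(random_samples(n, d, rng)).mean() for _ in range(reps)]
est_lhs = [toy_path_delay(latin_hypercube(n, d, rng)).mean() for _ in range(reps)]
# the stratified estimates of mean delay scatter far less than plain MC,
# i.e. fewer samples are needed for the same target accuracy
```

The variance reduction is largest when the response is close to additive in the parameters, which is why such techniques must be tailored to the application context, as argued above.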
Max Operation in Statistical Static Timing Analysis on the Non-Gaussian Variation Sources for VLSI Circuits
As CMOS technology continues to scale down, process variation introduces significant uncertainty in the power and performance of VLSI circuits and significantly affects their reliability. If this uncertainty is not properly handled, it may become the bottleneck of CMOS technology improvement; deterministic analysis is no longer conservative and may either overestimate or underestimate the circuit delay. Static Timing Analysis (STA) is a deterministic way of computing the delay imposed by the circuit's design and layout, based on a predetermined set of process-variation scenarios, also called corners of the circuit. Although it is an excellent tool, current trends in process scaling have imposed significant difficulties on STA. Therefore, there is a need for another tool that can resolve the aforementioned problems, and Statistical Static Timing Analysis (SSTA) has become a frontier research topic in recent years for combating such variation effects.
There are two types of SSTA methods: path-based SSTA and block-based SSTA. The goal of SSTA is to parameterize the timing characteristics of the timing graph as a function of the underlying process parameters, which are modeled as random variables. By performing SSTA, designers can obtain the timing distribution (yield) and its sensitivity to various process parameters. Such information is of tremendous value for both timing sign-off and design optimization for robustness and high profit margins. Block-based SSTA has been the most efficient SSTA method in recent years. In block-based SSTA there are two major atomic operations: max and add. The add operation is simple; the max operation, however, is much more complex.
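The complexity of max comes from the fact that the maximum of two Gaussian arrival times is not Gaussian. A standard way to keep a closed form is Clark's moment matching, sketched below (a textbook approximation, not necessarily the variant used in this dissertation):

```python
import math

def clark_max(mu1, var1, mu2, var2, rho):
    """Mean and variance of max(A1, A2) for jointly Gaussian arrival
    times A1 ~ N(mu1, var1), A2 ~ N(mu2, var2) with correlation rho,
    via Clark's moment-matching approximation."""
    a = math.sqrt(max(var1 + var2 - 2.0 * rho * math.sqrt(var1 * var2), 1e-30))
    alpha = (mu1 - mu2) / a
    phi = math.exp(-0.5 * alpha * alpha) / math.sqrt(2.0 * math.pi)  # normal pdf
    Phi = 0.5 * (1.0 + math.erf(alpha / math.sqrt(2.0)))             # normal cdf
    mean = mu1 * Phi + mu2 * (1.0 - Phi) + a * phi
    second = ((var1 + mu1 * mu1) * Phi + (var2 + mu2 * mu2) * (1.0 - Phi)
              + (mu1 + mu2) * a * phi)
    return mean, second - mean * mean                # (mean, variance) of the max
```

The add operation, by contrast, is exact for Gaussians: means and (co)variances simply sum, which is why max dominates the difficulty of block-based SSTA.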
There are two main challenges in SSTA. The first is topological correlation, which emerges from reconvergent paths: paths that originate from a common node and then converge again at another node (the reconvergent node). Such correlation complicates the max operation. The second challenge is spatial correlation, which arises due to device proximity on the die and complicates the modeling of delay and arrival time.
This dissertation presents a statistical nonlinear and non-normal canonical form of the timing delay model considering process variation. It focuses on four aspects: (1) statistical timing modeling and analysis; (2) high-level circuit synthesis with system-level statistical static timing analysis; (3) architectural implementations of the atomic operations (max and add); and (4) design methodology.
To perform statistical timing modeling and analysis, we first present an efficient and accurate SSTA flow for a non-linear cell delay model with non-Gaussian variation sources.
To achieve system-level SSTA, we apply statistical timing analysis to a high-level synthesis flow and develop a yield-driven synthesis framework, so that the impact of process variations is taken into account during high-level synthesis.
To accomplish the architectural implementation, we present a vector thread architecture for the max operator to minimize delay and variation, and we present a comparison analysis using the ISCAS benchmark circuit suites.
In the last part of this dissertation, an SSTA design methodology is presented.
Study of the Reliability of Statistical Timing Analysis for Real-Time Systems
Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 Nov 2015, Lille, France.
Probabilistic and statistical temporal analyses have been developed as a means of determining the worst-case execution and response times of real-time software for decades. A number of such methods have been proposed in the literature, of which the majority claim to be able to provide worst-case timing scenarios with respect to a given likelihood of a certain value being exceeded. Further, such claims are based on either some estimates associated with a probability, or probability distributions with a certain level of confidence. However, the validity of the claims is very much dependent on a number of factors, such as the achieved samples and the adopted distributions for analysis. In this paper, we investigate whether the claims made are in fact true, as well as establishing an understanding of the factors that affect the validity of these claims. The results are of importance for two reasons: to allow researchers to examine whether there are important issues that mean their techniques need to be refined; and so that practitioners, including industrialists who are currently using commercial timing analysis tools based on these types of techniques, understand how the techniques should be used to ensure the results are fit for their purposes.