A statistical framework for testing functional categories in microarray data
Ready access to emerging databases of gene annotation and functional pathways
has shifted assessments of differential expression in DNA microarray studies
from single genes to groups of genes with shared biological function. This
paper takes a critical look at existing methods for assessing the differential
expression of a group of genes (functional category), and provides some
suggestions for improved performance. We begin by presenting a general
framework, in which the set of genes in a functional category is compared to
the complementary set of genes on the array. The framework includes tests for
overrepresentation of a category within a list of significant genes, and
methods that consider continuous measures of differential expression. Existing
tests are divided into two classes. Class 1 tests assume gene-specific measures
of differential expression are independent, despite overwhelming evidence of
positive correlation. Analytic and simulated results are presented that
demonstrate Class 1 tests are strongly anti-conservative in practice. Class 2
tests account for gene correlation, typically through array permutation, which
by construction has proper Type I error control for the induced null. However,
both Class 1 and Class 2 tests use a null hypothesis that all genes have the
same degree of differential expression. We introduce a more sensible and
general (Class 3) null under which the profile of differential expression is
the same within the category and complement. Under this broader null, Class 2
tests are shown to be conservative. We propose standard bootstrap methods for
testing against the Class 3 null and demonstrate they provide valid Type I
error control and more power than array permutation in simulated datasets and
real microarray experiments.
Comment: Published at http://dx.doi.org/10.1214/07-AOAS146 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
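
The bootstrap test against the Class 3 null can be sketched roughly as follows. Everything here (the data dimensions, the mean-|t| category statistic, resampling arrays within groups, and the centring step) is an illustrative simplification, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 genes x 20 arrays, two conditions of 10 arrays each.
n_genes, n_per_group = 200, 10
X = rng.normal(size=(n_genes, 2 * n_per_group))
category = np.arange(20)             # indices of genes in the functional category
complement = np.arange(20, n_genes)  # all remaining genes on the array

def gene_t_stats(X):
    """Per-gene two-sample t-statistics between the two array groups."""
    a, b = X[:, :n_per_group], X[:, n_per_group:]
    se = np.sqrt(a.var(axis=1, ddof=1) / n_per_group +
                 b.var(axis=1, ddof=1) / n_per_group)
    return (a.mean(axis=1) - b.mean(axis=1)) / se

def category_stat(X):
    """Difference in mean |t| between the category and its complement."""
    t = np.abs(gene_t_stats(X))
    return t[category].mean() - t[complement].mean()

obs = category_stat(X)

# Bootstrap arrays (columns) within each condition to approximate the
# sampling distribution of the statistic, then centre it: under the
# Class 3 null the profile of differential expression is the same in
# the category and the complement, so the centred statistic is the null.
B = 1000
boot = np.empty(B)
for b in range(B):
    ia = rng.integers(0, n_per_group, n_per_group)
    ib = rng.integers(0, n_per_group, n_per_group) + n_per_group
    boot[b] = category_stat(X[:, np.concatenate([ia, ib])])

p_value = np.mean(np.abs(boot - boot.mean()) >= np.abs(obs))
print(f"observed = {obs:.3f}, bootstrap p = {p_value:.3f}")
```

Resampling whole arrays (rather than individual genes) preserves the between-gene correlation that makes the independence assumption of Class 1 tests fail.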
Assessing the similarity of dose response and target doses in two non-overlapping subgroups
We consider two problems that are attracting increasing attention in clinical
dose finding studies. First, we assess the similarity of two non-linear
regression models for two non-overlapping subgroups of patients over a
restricted covariate space. To this end, we derive a confidence interval for
the maximum difference between the two given models. If this confidence
interval excludes the equivalence margins, similarity of dose response can be
claimed. Second, we address the problem of demonstrating the similarity of two
target doses for two non-overlapping subgroups, again using a
confidence-interval-based approach. We illustrate the proposed methods with a real case
study and investigate their operating characteristics (coverage probabilities,
Type I error rates, power) via simulation.
Comment: Keywords and Phrases: equivalence testing, multiregional trial,
target dose estimation, subgroup analysis
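
The first test can be sketched with a parametric bootstrap. All numbers below (Emax parameters, standard errors, dose range, and the equivalence margin) are made-up placeholders, and independent normal parameter estimates are assumed for simplicity; the paper's own confidence interval construction may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

def emax(dose, e0, emax_, ed50):
    """Emax dose-response model: e0 + Emax * d / (ED50 + d)."""
    return e0 + emax_ * dose / (ed50 + dose)

# Hypothetical fitted parameters (e0, Emax, ED50) and standard errors
# for the two non-overlapping subgroups.
theta1, se1 = np.array([0.20, 0.70, 25.0]), np.array([0.05, 0.08, 4.0])
theta2, se2 = np.array([0.25, 0.65, 30.0]), np.array([0.05, 0.08, 4.0])
doses = np.linspace(0, 150, 301)   # restricted covariate (dose) space
margin = 0.3                       # assumed equivalence margin

# Parametric bootstrap: draw parameters, evaluate the maximum absolute
# difference between the two fitted curves over the dose range.
B = 2000
max_diff = np.empty(B)
for b in range(B):
    t1 = rng.normal(theta1, se1)
    t2 = rng.normal(theta2, se2)
    max_diff[b] = np.max(np.abs(emax(doses, *t1) - emax(doses, *t2)))

upper = np.quantile(max_diff, 0.95)  # one-sided 95% confidence bound
print(f"95% upper bound on max difference: {upper:.3f}")
print("similarity claimed" if upper < margin else "similarity not shown")
```

Similarity is claimed only when the whole confidence bound on the maximum difference falls inside the equivalence margin, mirroring the logic described in the abstract.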
Techno-Economic Analysis and Optimal Control of Battery Storage for Frequency Control Services, Applied to the German Market
Optimal investment in battery energy storage systems, taking into account
degradation, sizing and control, is crucial for the deployment of battery
storage, for which providing frequency control is one of the major applications.
In this paper, we present a holistic, data-driven framework to determine the
optimal investment, size and controller of a battery storage system providing
frequency control. We optimised the controller towards minimum degradation and
electricity costs over its lifetime, while ensuring the delivery of frequency
control services compliant with regulatory requirements. We adopted a detailed
battery model, considering the dynamics and degradation when exposed to actual
frequency data. Further, we used a stochastic optimisation objective while
constraining the probability of being unavailable to deliver the frequency
control service. Through a thorough analysis, we were able to decrease the amount of
data needed and thereby decrease the execution time while keeping the
approximation error within limits. Using the proposed framework, we performed a
techno-economic analysis of a battery providing 1 MW capacity in the German
primary frequency control market. Results showed that a battery rated at 1.6
MW, 1.6 MWh has the highest net present value, yet this configuration is only
profitable if costs are low enough or if future frequency control prices do
not decline too much. It transpires that calendar ageing drives battery
degradation, whereas cycle ageing has less impact.
Comment: Submitted to Applied Energy
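
The net-present-value comparison behind such a sizing decision can be sketched as below. The capex, revenue, cost, lifetime, and discount-rate figures are invented for illustration and are not the paper's data:

```python
# Hypothetical NPV comparison for battery sizes bidding 1 MW of
# frequency control capacity; all figures are placeholders.

def npv(capex, annual_revenue, annual_cost, lifetime_years, rate):
    """Net present value of a project with constant yearly cash flows."""
    cash = annual_revenue - annual_cost
    return -capex + sum(cash / (1 + rate) ** t
                        for t in range(1, lifetime_years + 1))

# Compare two candidate configurations (capex and revenue assumed).
candidates = [("1.0 MW / 1.0 MWh", 500_000, 110_000),
              ("1.6 MW / 1.6 MWh", 800_000, 150_000)]
for label, capex, revenue in candidates:
    v = npv(capex, revenue, annual_cost=10_000,
            lifetime_years=10, rate=0.05)
    print(f"{label}: NPV = {v:,.0f} EUR")
```

In the paper's full framework, the cash flows would themselves come out of the lifetime simulation of the controller, degradation model, and frequency-control prices rather than being constants.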
Design and Development of an Affordable Haptic Robot with Force-Feedback and Compliant Actuation to Improve Therapy for Patients with Severe Hemiparesis
The study describes the design and development of a single degree-of-freedom haptic robot, Haptic Theradrive, for post-stroke arm rehabilitation for in-home and clinical use. The robot overcomes many of the weaknesses of its predecessor, the TheraDrive system, that used a Logitech steering wheel as the haptic interface for rehabilitation. Although the original TheraDrive system showed success in a pilot study, its wheel was not able to withstand the rigors of use. A new haptic robot was developed that functions as a drop-in replacement for the Logitech wheel. The new robot can apply larger forces in interacting with the patient, thereby extending the functionality of the system to accommodate low-functioning patients. A new software suite offers appreciably more options for tailored and tuned rehabilitation therapies. In addition to describing the design of the hardware and software, the paper presents the results of simulation and experimental case studies examining the system's performance and usability.
Concepts for on-board satellite image registration. Volume 2: IAS prototype performance evaluation standard definition
Problems encountered in testing onboard signal processing hardware designed to achieve radiometric and geometric correction of satellite imaging data are considered. These include obtaining representative image and ancillary data for simulation and the transfer and storage of a large quantity of image data at very high speed. The high resolution, high speed preprocessing of LANDSAT-D imagery is considered.
Variance Reduction Techniques in Monte Carlo Methods
Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed even though computer speed has been increasing dramatically ever since the introduction of computers. This increased computer power has stimulated simulation analysts to develop ever more realistic models, so that the net result has not been faster execution of simulation experiments; e.g., some modern simulation models need hours or days for a single 'run' (one replication of one scenario, or combination of simulation input values). Moreover, some simulation models represent rare events (which have extremely small probabilities of occurrence), so even modern computers would take 'forever' (centuries) to execute a single run, were it not that special VRT can reduce these excessively long runtimes to practical magnitudes.
Keywords: common random numbers; antithetic random numbers; importance sampling; control variates; conditioning; stratified sampling; splitting; quasi Monte Carlo
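
One of the listed techniques, antithetic random numbers, can be shown in a few lines. The example below is a minimal illustration (estimating E[exp(U)] for U uniform on (0, 1), true value e - 1), not drawn from the abstract itself:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Crude Monte Carlo estimate of E[exp(U)], U ~ Uniform(0, 1).
u = rng.random(n)
crude = np.exp(u)

# Antithetic variates: pair each U with 1 - U. Because exp is monotone,
# exp(U) and exp(1 - U) are negatively correlated, so averaging each
# pair reduces the variance of the estimator at the same sample budget.
u2 = rng.random(n // 2)
anti = (np.exp(u2) + np.exp(1 - u2)) / 2

print(f"crude:      mean={crude.mean():.5f}, "
      f"var of mean={crude.var() / n:.2e}")
print(f"antithetic: mean={anti.mean():.5f}, "
      f"var of mean={anti.var() / (n // 2):.2e}")
```

Both estimators are unbiased for e - 1 ≈ 1.71828, but the antithetic version achieves a markedly smaller variance per random number consumed, which is exactly the point of VRT.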