A model for ranking and selecting integrity tests in a distributed database
Checking the consistency of a database state generally involves the execution of integrity tests on the database,
which verify whether the database is satisfying its constraints or not. This paper presents the various types
of integrity tests as reported in previous works and discusses how these tests can significantly improve the
performance of constraint checking mechanisms without being limited to a single type of test. Given these
alternatives, selecting the most suitable test is an issue that needs to be tackled. In this regard, the authors
propose a model to rank and select the suitable test to be evaluated given several alternative tests. The model
uses the amount of data transferred across the network, the number of sites involved, and the amount of data
accessed as the parameters in deciding the suitable test. Several analyses have been performed to evaluate
the proposed model, and the results show that it achieves a higher percentage of local processing than the
previously selected strategies.
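One way to picture such a selection model is as a cost-based ranking over the three parameters the abstract names. The struct, the field names, and the lexicographic comparison below are illustrative assumptions, not the paper's actual model, which may weight or combine the parameters differently:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <tuple>
#include <vector>

// Hypothetical candidate integrity test, characterized by the three
// parameters the abstract names: data transferred across the network,
// number of sites involved, and amount of data accessed.
struct IntegrityTest {
    std::string name;
    double dataTransferred;  // e.g. bytes shipped between sites
    int sitesInvolved;       // sites that must participate in evaluation
    double dataAccessed;     // data read at the evaluating sites
};

// One plausible ranking: prefer tests that touch fewer remote resources,
// comparing the three parameters lexicographically (network transfer
// first, then number of sites, then local data access).
IntegrityTest selectTest(std::vector<IntegrityTest> tests) {
    return *std::min_element(tests.begin(), tests.end(),
        [](const IntegrityTest& a, const IntegrityTest& b) {
            return std::tie(a.dataTransferred, a.sitesInvolved, a.dataAccessed)
                 < std::tie(b.dataTransferred, b.sitesInvolved, b.dataAccessed);
        });
}
```

Under this ordering, a local test that ships no data and touches one site would be preferred over a global test, which matches the abstract's finding that the model favors local processing.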
Cox-type model validation with recurrent event data
Recurrent event data occur in many disciplines, such as actuarial science, biomedical studies, sociology, and environmental studies, to name a few. It is therefore important to develop models that describe the dynamic evolution of event occurrences. One major problem of interest with these types of data is modeling the distribution function of the time between event occurrences, especially in the presence of covariates, which play a major role in reaching a better understanding of time to events.
This work pertains to statistical inference for the regression parameter and the baseline hazard function in a Cox-type model for recurrent events that accounts for the effective age and time-varying covariates. Estimators of the regression parameters as well as the baseline hazard function are obtained using counting process and martingale machinery. Asymptotic properties of the proposed estimators, and how they can be used to construct confidence intervals, are investigated. Results of simulation studies assessing the performance of the estimators and an application to a biomedical dataset illustrating the models are presented. The impact of the unit's effective age is also assessed.
To check the validity of the models used, decision rules are developed for checking the validity of the various components of the Cox-type model. Specifically, using martingale residuals, we propose test statistics for checking the link function and the functional form of the covariates. Asymptotic properties of the test statistics and simulation studies are presented as well --Abstract, page iii
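A Cox-type intensity with effective age and time-varying covariates, as described above, can be sketched as follows. The notation here ($\lambda_0$, $\mathcal{E}(t)$, $X(t)$, $\beta$) is my own rendering of the model's ingredients, not necessarily the thesis's:

```latex
\lambda(t) \;=\; \lambda_0\!\bigl(\mathcal{E}(t)\bigr)\,
\exp\!\bigl\{\beta^{\top} X(t)\bigr\},
```

where $\lambda_0$ is the baseline hazard evaluated at the effective age $\mathcal{E}(t)$ rather than at calendar time, and $X(t)$ collects the time-varying covariates whose effect enters through the regression parameter $\beta$.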
An Object-Oriented Framework for Explicit-State Model Checking
This paper presents a conceptual architecture for an object-oriented framework to support the development of formal verification tools (i.e., model checkers). The objective of the architecture is to support the reuse of algorithms and to encourage a modular design of tools. The conceptual framework is accompanied by a C++ implementation which provides reusable algorithms for the simulation and verification of explicit-state models, as well as a model representation for simple models based on guard-based process descriptions. The framework has been successfully used to develop a model checker for a subset of PROMELA.
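The core reusable algorithm such a framework packages is explicit-state reachability: breadth-first exploration of the states generated by guarded transitions. The interfaces below (integer states, `Transition`, `reachesBad`) are illustrative stand-ins, not the paper's actual class design:

```cpp
#include <cassert>
#include <functional>
#include <queue>
#include <set>
#include <vector>

// Minimal sketch of explicit-state model checking: a guard-based process
// description is a set of transitions, each with an enabling guard and
// an effect producing the successor state.
struct Transition {
    std::function<bool(int)> guard;   // is the transition enabled here?
    std::function<int(int)> effect;   // successor state when fired
};

// Explore the state space breadth-first from `init`, remembering visited
// states; return true iff some reachable state satisfies `bad`
// (i.e. the safety property is violated).
bool reachesBad(int init,
                const std::vector<Transition>& transitions,
                const std::function<bool(int)>& bad) {
    std::set<int> seen{init};
    std::queue<int> frontier;
    frontier.push(init);
    while (!frontier.empty()) {
        int s = frontier.front();
        frontier.pop();
        if (bad(s)) return true;
        for (const auto& t : transitions) {
            if (!t.guard(s)) continue;
            int next = t.effect(s);
            if (seen.insert(next).second) frontier.push(next);
        }
    }
    return false;
}
```

Separating the state representation, the transition relation, and the search loop in this way is what lets one search algorithm be reused across model representations, which is the modularity the architecture aims for.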
The C++0x "Concepts" Effort
C++0x is the working title for the revision of the ISO standard of the C++
programming language that was originally planned for release in 2009 but that
was delayed to 2011. The largest language extension in C++0x was "concepts",
that is, a collection of features for constraining template parameters. In
September of 2008, the C++ standards committee voted the concepts extension
into C++0x, but then in July of 2009, the committee voted the concepts
extension back out of C++0x.
This article is my account of the technical challenges and debates within the
"concepts" effort in the years 2003 to 2009. To provide some background, the
article also describes the design space for constrained parametric
polymorphism, or what is colloquially known as constrained generics. While this
article is meant to be generally accessible, the writing is aimed toward
readers with background in functional programming and programming language
theory. This article grew out of a lecture at the Spring School on Generic and
Indexed Programming at the University of Oxford, March 2010.
A study of systems implementation languages for the POCCNET system
The results of a study of systems implementation languages for the Payload Operations Control Center Network (POCCNET) are presented. Criteria are developed for evaluating the languages, and fifteen existing languages are evaluated on the basis of these criteria.
Polynomial Size Analysis of First-Order Shapely Functions
We present a size-aware type system for first-order shapely function
definitions. Here, a function definition is called shapely when the size of the
result is determined exactly by a polynomial in the sizes of the arguments.
Examples of shapely function definitions include implementations of matrix
multiplication and the Cartesian product of two lists. The type system is
proved to be sound w.r.t. the operational semantics of the language. The type
checking problem is shown to be undecidable in general. We define a natural
syntactic restriction such that the type checking becomes decidable, even
though size polynomials are not necessarily linear or monotonic. Furthermore,
we have shown that the type-inference problem is at least semi-decidable (under
this restriction). We have implemented a procedure that combines run-time
testing and type-checking to automatically obtain size dependencies. It
terminates on total typable function definitions. Comment: 35 pages, 1 figure
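The Cartesian product mentioned above illustrates shapeliness concretely: the output size is exactly the polynomial n * m in the input sizes, independent of the input values. The sketch below is my C++ rendering of the idea; the paper itself works in a first-order functional language:

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Cartesian product of two lists: a classic shapely function. For inputs
// of sizes n and m, the result always has exactly n * m elements, so the
// size polynomial n * m types this definition in the paper's sense.
template <typename A, typename B>
std::vector<std::pair<A, B>> cartesian(const std::vector<A>& xs,
                                       const std::vector<B>& ys) {
    std::vector<std::pair<A, B>> out;
    out.reserve(xs.size() * ys.size());
    for (const A& x : xs)
        for (const B& y : ys)
            out.emplace_back(x, y);
    return out;
}
```

Note that the polynomial n * m is neither linear nor monotonic-in-isolation-free in the way simple size systems require, which is why a type system handling arbitrary polynomials, as here, is the interesting case.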