An improvement of the Berry--Esseen inequality with applications to Poisson and mixed Poisson random sums
By a modification of the method that was applied in (Korolev and Shevtsova, 2009), here the inequalities
\[ \rho(F_n,\Phi) \le \frac{0.335789\,(\beta_3+0.425)}{\sqrt{n}} \]
and
\[ \rho(F_n,\Phi) \le \frac{0.3051\,(\beta_3+1)}{\sqrt{n}} \]
are proved for the uniform distance $\rho(F_n,\Phi)$ between the standard normal distribution function $\Phi$ and the distribution function $F_n$ of the normalized sum of an arbitrary number $n\ge 1$ of independent identically distributed random variables with zero mean, unit variance and finite third absolute moment $\beta_3$. The first of these inequalities sharpens the best known version of the classical Berry--Esseen inequality since $0.335789\,(\beta_3+0.425)\le 0.4785\,\beta_3$ by virtue of the condition $\beta_3\ge 1$, and 0.4785 is the best known upper estimate of the absolute constant in the classical Berry--Esseen inequality. The second inequality is applied to lowering the upper estimate of the absolute constant in the analog of the Berry--Esseen inequality for Poisson random sums to 0.3051, which is strictly less than the least possible value of the absolute constant in the classical Berry--Esseen inequality. As a corollary, the estimates of the rate of convergence in limit theorems for compound mixed Poisson distributions are refined.
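For orientation, the classical Berry--Esseen bound against which these refinements are measured has the standard form below; the notation follows the abstract, and 0.4785 is the upper estimate of the absolute constant cited there.

% Classical Berry--Esseen inequality for the normalized sum
% S_n = (X_1 + ... + X_n)/\sqrt{n} of i.i.d. X_i with
% E X_1 = 0, E X_1^2 = 1 and \beta_3 = E|X_1|^3 < \infty:
\[
  \rho(F_n,\Phi) \;=\; \sup_{x}\,\bigl|F_n(x)-\Phi(x)\bigr|
  \;\le\; \frac{C_0\,\beta_3}{\sqrt{n}}, \qquad C_0 \le 0.4785 .
\]
% By Lyapunov's inequality \beta_3 \ge (E X_1^2)^{3/2} = 1, which is the
% condition used above to compare 0.335789(\beta_3 + 0.425) with 0.4785\,\beta_3.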
Proof-theoretic semantics, a problem with negation and prospects for modality
This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of the rules of inference governing them. I concentrate on Michael Dummett’s and Dag Prawitz’ philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem with defining the meaning of negation in this framework and discuss prospects for an account of the meanings of modal operators in terms of rules of inference.
Assessment-schedule matching in unanchored indirect treatment comparisons of progression-free survival in cancer studies
Background
The times of efficacy-related clinical events recorded at scheduled study visits in clinical trials are interval censored, with the interval duration pre-determined by the study protocol. Events may happen at any time during that interval but can only be detected at a planned or unplanned visit. Disease progression in oncology is a notable example where the time to an event is affected by the schedule of visits within a study. This can become a source of bias when studies with differing assessment schedules are compared in unanchored analyses using methods such as matching-adjusted indirect comparison.
Objective
We illustrate assessment-time bias (ATB) in a simulation study based on data from a recent study in second-line treatment for locally advanced or metastatic urothelial carcinoma, and present a method to adjust for differences in assessment schedule when comparing progression-free survival (PFS) against a competing treatment.
Methods
A multi-state model for death and progression was used to generate simulated death and progression times, from which PFS times were derived. PFS data were also generated for a hypothetical comparator treatment by applying a constant hazard ratio (HR) to the baseline treatment. Simulated PFS times for the two treatments were then aligned to different assessment schedules so that progression events were only observed at set visit times, and the data were analysed to assess the bias and standard error of estimates of HRs between two treatments with and without assessment-schedule matching (ASM).
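A minimal Python sketch of this kind of simulation, under simplified assumptions: exponential latent times stand in for the multi-state model, and the hazard ratio of 0.7 and the two-month versus three-month visit schedules are illustrative choices, not the values used in the study.

import numpy as np

rng = np.random.default_rng(0)
n = 5000              # patients per arm (illustrative)
true_hr = 0.7         # assumed constant hazard ratio for the comparator

# Stand-in for the multi-state model: latent exponential times (in months)
# to progression and to death; the comparator arm is generated via the HR.
t_prog_ref = rng.exponential(scale=12.0, size=n)
t_death_ref = rng.exponential(scale=24.0, size=n)
t_prog_cmp = rng.exponential(scale=12.0 / true_hr, size=n)
t_death_cmp = rng.exponential(scale=24.0 / true_hr, size=n)

def observed_pfs(t_prog, t_death, visit_interval):
    # Progression is only detected at the first scheduled visit after it occurs;
    # death is observed in continuous time, so the observed PFS event is the
    # earlier of detected progression and death.
    detected = np.where(t_prog < t_death,
                        np.ceil(t_prog / visit_interval) * visit_interval,
                        np.inf)
    return np.minimum(detected, t_death)

# Different visit schedules for the two arms induce assessment-time bias.
pfs_ref = observed_pfs(t_prog_ref, t_death_ref, visit_interval=2.0)  # 2-monthly
pfs_cmp = observed_pfs(t_prog_cmp, t_death_cmp, visit_interval=3.0)  # 3-monthly

A Cox model fitted to the pooled pfs_ref and pfs_cmp data would then recover a hazard ratio that drifts away from the true 0.7, which is the assessment-time bias being quantified.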
Results
ATB is highly affected by the rate of the event at the first assessment time; in our examples, the bias ranged from 3 to 11% as the event rate increased. The proposed method relies on individual-level data from a study and attempts to adjust the timing of progression events to the comparator’s schedule by shifting them forward or backward without altering the patients’ actual follow-up time. The method removed the bias almost completely in all scenarios without affecting the precision of estimates of comparative effectiveness.
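One simple reading of the adjustment described above, sketched in Python: detected progression times are shifted forward or backward to the nearest visit on the comparator's assessment grid, while deaths and censorings, and hence each patient's follow-up, are left untouched. The function name and the nearest-visit rule are illustrative assumptions, not the authors' exact algorithm.

import numpy as np

def match_assessment_schedule(pfs_obs, is_progression, target_interval):
    # Snap detected progression times to the nearest visit on the comparator's
    # assessment grid (shifting forward or backward, but never before the first
    # visit); non-progression events are returned unchanged.
    pfs_adj = np.asarray(pfs_obs, dtype=float).copy()
    prog = np.asarray(is_progression, dtype=bool)
    snapped = np.rint(pfs_adj[prog] / target_interval) * target_interval
    pfs_adj[prog] = np.maximum(snapped, target_interval)
    return pfs_adj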
Conclusions
Considering the increasing use of unanchored comparative analyses of novel cancer treatments based on single-arm studies, the proposed method offers a relatively simple means of improving the accuracy of estimates of the relative benefit of treatments on progression times.
General-elimination stability
General-elimination harmony articulates Gentzen's idea that the elimination-rules are justified if they infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies' test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett's notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
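As a standard textbook illustration of the pattern discussed here (not the paper's own formulation): the introduction rule for conjunction infers A ∧ B from A and B, and the corresponding general-elimination rule licenses exactly what already follows from those grounds.

% Introduction rule and general-elimination rule for conjunction:
\[
  \frac{A \qquad B}{A \wedge B}\;\wedge\mathrm{I}
  \qquad\qquad
  \frac{A \wedge B \qquad
        \begin{array}{c}[A]\;[B]\\ \vdots\\ C\end{array}}
       {C}\;\wedge\mathrm{E}
\]
% Local completeness in Pfenning and Davies' sense: take C := A \wedge B and close
% the discharged assumptions A and B with \wedge I; the E-rule then recovers the
% original judgement A \wedge B.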
The open future, bivalence and assertion
It is highly intuitive that the future is open and the past is closed—whereas it is unsettled whether there will be a fourth world war, it is settled that there was a first. Recently, it has become increasingly popular to claim that the intuitive openness of the future implies that contingent statements about the future, such as ‘there will be a sea battle tomorrow,’ are non-bivalent (neither true nor false). In this paper, we argue that the non-bivalence of future contingents is at odds with our pre-theoretic intuitions about the openness of the future. These are revealed by our pragmatic judgments concerning the correctness and incorrectness of assertions of future contingents. We argue that the pragmatic data, together with a plausible account of assertion, show that in many cases we take future contingents to be true (or to be false), though we take the future to be open in relevant respects. It follows that appeals to intuition to support the non-bivalence of future contingents are untenable. Intuition favours bivalence.
Models of HoTT and the Constructive View of Theories
Homotopy Type Theory and its model theory provide a novel formal semantic framework for representing scientific theories. This framework supports a constructive view of theories, according to which a theory is essentially characterised by its methods.
The constructive view of theories was defended earlier by Ernest Nagel and a number of other philosophers, but the logical means available to them did not allow these authors to build formal representational frameworks implementing this view.
A Bell Inequality Analog in Quantum Measure Theory
One obtains Bell's inequalities if one posits a hypothetical joint
probability distribution, or {\it measure}, whose marginals yield the
probabilities produced by the spin measurements in question. The existence of a
joint measure is in turn equivalent to a certain causality condition known as
``screening off''. We show that if one assumes, more generally, a joint {\it
quantal measure}, or ``decoherence functional'', one obtains instead an
analogous inequality weaker by a factor of $\sqrt{2}$. The proof of this
``Tsirel'son inequality'' is geometrical and rests on the possibility of
associating a Hilbert space to any strongly positive quantal measure. These
results lead both to a {\it question}: ``Does a joint measure follow from some
quantal analog of `screening off'?'', and to the {\it observation} that
non-contextual hidden variables are viable in histories-based quantum
mechanics, even if they are excluded classically.
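For reference, the factor in question is the familiar gap between the classical CHSH bound and Tsirel'son's bound; the correlator notation below is the standard one, supplied here for orientation rather than taken from the paper.

% CHSH combination of the spin correlators E(a,b) for settings a, a' and b, b':
%   S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
\[
  |S| \;\le\; 2 \quad\text{(joint classical measure / screening off)},
  \qquad
  |S| \;\le\; 2\sqrt{2} \quad\text{(joint strongly positive quantal measure)},
\]
% so the quantal analogue of the Bell inequality is weaker by the factor \sqrt{2}.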
News in the prevention, diagnosis and treatment of infectious diseases
Risk factors for long-term mortality from bacteraemia caused by Staphylococcus aureus
Zika virus infection in pregnant women in Rio de Janeiro – a preliminary report
The faecal microbiome at the onset of juvenile idiopathic arthritis
The Third International Consensus Definitions for Sepsis and Septic Shock (Sepsis-3)
Diagnostic and prognostic utility of procalcitonin in patients presenting to the emergency department with dyspnoea
Control of COVID-19 Outbreaks under Stochastic Community Dynamics, Bimodality, or Limited Vaccination
Reaching population immunity against COVID-19 is proving difficult even in countries with high vaccination levels. Thus, it is critical to identify limits of control and effective measures against future outbreaks. The effects of nonpharmaceutical interventions (NPIs) and vaccination strategies are analyzed with a detailed community-specific agent-based model (ABM). The authors demonstrate that the threshold for population immunity is not a unique number, but depends on the vaccination strategy. Prioritizing highly interactive people diminishes the risk of an infection wave, while prioritizing the elderly minimizes fatalities when vaccination coverage is low. Control over COVID-19 outbreaks requires an adaptive combination of NPIs and targeted vaccination, exemplified for Germany for January–September 2021. Bimodality emerges from the heterogeneity and stochasticity of community-specific human–human interactions and infection networks, which can render the effects of limited NPIs uncertain. The authors' simulation platform can process and analyze dynamic COVID-19 epidemiological situations in diverse communities worldwide to predict pathways to population immunity even with limited vaccination.
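A toy Python sketch of the kind of comparison summarized above, under heavily simplified assumptions: a heavy-tailed per-agent contact rate stands in for the community-specific interaction network, vaccination confers full immunity, and the age-dependent fatality rates are invented for illustration; none of this reproduces the authors' calibrated agent-based model.

import numpy as np

rng = np.random.default_rng(1)
N = 20_000                                  # agents (illustrative)
age = rng.integers(0, 100, size=N)          # invented age structure
contacts = rng.pareto(2.0, size=N) + 1.0    # heavy-tailed per-agent contact rate

def run_outbreak(vaccinated, beta=0.05, gamma=0.1, steps=300, seed=7):
    # Minimal stochastic SIR with contact-rate-weighted mixing; vaccinated
    # agents are treated as fully immune (a simplifying assumption).
    r = np.random.default_rng(seed)
    immune = np.zeros(N, dtype=bool)
    immune[vaccinated] = True
    susceptible = ~immune
    infectious = np.zeros(N, dtype=bool)
    seeds = r.choice(np.flatnonzero(susceptible), size=20, replace=False)
    infectious[seeds] = True
    susceptible[seeds] = False
    ever_infected = infectious.copy()
    for _ in range(steps):
        if not infectious.any():
            break
        # Per-agent infection hazard, proportional to own contact rate and to
        # the share of total contact weight that is currently infectious.
        lam = beta * contacts * contacts[infectious].sum() / contacts.sum()
        new_inf = susceptible & (r.random(N) < 1.0 - np.exp(-lam))
        recovered = infectious & (r.random(N) < gamma)
        infectious = (infectious | new_inf) & ~recovered
        susceptible &= ~new_inf
        ever_infected |= new_inf
    ifr = np.where(age >= 70, 0.05, 0.001)   # crude, invented fatality rates
    deaths = int((ever_infected & (r.random(N) < ifr)).sum())
    return int(ever_infected.sum()), deaths

n_vacc = 4_000
by_degree = np.argsort(-contacts)[:n_vacc]  # prioritize highly interactive agents
by_age = np.argsort(-age)[:n_vacc]          # prioritize the elderly
print("degree-prioritized (infections, deaths):", run_outbreak(by_degree))
print("age-prioritized    (infections, deaths):", run_outbreak(by_age))

The two print lines allow attack size and deaths to be compared under the two vaccination orderings, mirroring the trade-off the abstract describes between suppressing the infection wave and protecting the most vulnerable.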