
    Elective Open Suprarenal Aneurysm Repair in England from 2000 to 2010: An Observational Study of Hospital Episode Statistics

    Background: Open surgery is widely used as a benchmark for the results of fenestrated endovascular repair of complex abdominal aortic aneurysms (AAA). However, the existing evidence stems from single-centre experiences and may not be reproducible in wider practice. National outcomes provide valuable information regarding the safety of suprarenal aneurysm repair. Methods: Demographic and clinical data were extracted from English Hospital Episode Statistics for patients undergoing elective suprarenal aneurysm repair from 1 April 2000 to 31 March 2010. Thirty-day mortality and five-year survival were analysed by logistic regression and Cox proportional hazards modelling. Results: 793 patients underwent surgery, with 14% overall 30-day mortality, which did not improve over the study period. Independent predictors of 30-day mortality included age, renal disease and previous myocardial infarction. Five-year survival was independently reduced by age, renal disease, liver disease, chronic pulmonary disease, and known metastatic solid tumour. There was significant regional variation in both 30-day mortality and five-year survival after risk adjustment. Regional differences in outcome were eliminated in a sensitivity analysis for perioperative outcome, conducted by restricting analysis to survivors of the first 30 days after surgery. Conclusions: Elective suprarenal aneurysm repair was associated with considerable mortality and significant regional variation across England. These data provide a benchmark to assess the efficacy of complex endovascular repair of suprarenal aneurysms, though cautious interpretation is required due to the lack of information regarding aneurysm morphology. More detailed study is required, ideally through the mandatory submission of data to a national registry of suprarenal aneurysm repair.
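
    The analysis strategy named in the Methods can be sketched briefly. Below is a minimal, hypothetical Python example (synthetic data; the column names age, renal_disease, prev_mi, dead_30d, time_years and event are invented for illustration, not taken from the study) of the two models used: logistic regression for 30-day mortality and a Cox proportional hazards model for five-year survival.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 793
        df = pd.DataFrame({
            "age": rng.normal(74, 7, n),
            "renal_disease": rng.integers(0, 2, n),
            "prev_mi": rng.integers(0, 2, n),
        })
        # Synthetic outcomes, for illustration only.
        logit = -12 + 0.12 * df["age"] + 0.8 * df["renal_disease"] + 0.6 * df["prev_mi"]
        df["dead_30d"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)
        df["time_years"] = rng.exponential(6, n).clip(upper=5.0)
        df["event"] = (df["time_years"] < 5.0).astype(int)

        # 30-day mortality: logistic regression.
        X = sm.add_constant(df[["age", "renal_disease", "prev_mi"]])
        print(sm.Logit(df["dead_30d"], X).fit(disp=0).summary())

        # Five-year survival: Cox proportional hazards model.
        cph = CoxPHFitter()
        cph.fit(df[["time_years", "event", "age", "renal_disease", "prev_mi"]],
                duration_col="time_years", event_col="event")
        cph.print_summary()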

    Bayesian Conditioning, the Reflection Principle, and Quantum Decoherence

    The probabilities a Bayesian agent assigns to a set of events typically change with time, for instance when the agent updates them in the light of new data. In this paper we address the question of how an agent's probabilities at different times are constrained by Dutch-book coherence. We review and attempt to clarify the argument that, although an agent is not forced by coherence to use the usual Bayesian conditioning rule to update his probabilities, coherence does require the agent's probabilities to satisfy van Fraassen's [1984] reflection principle (which entails a related constraint pointed out by Goldstein [1983]). We then exhibit the specialized assumption needed to recover Bayesian conditioning from an analogous reflection-style consideration. Bringing the argument to the context of quantum measurement theory, we show that "quantum decoherence" can be understood in purely personalist terms: quantum decoherence (as supposed in a von Neumann chain) is not a physical process at all, but an application of the reflection principle. From this point of view, the decoherence theory of Zeh, Zurek, and others as a story of quantum measurement has the plot turned exactly backward. Comment: 14 pages, written in memory of Itamar Pitowsky.
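
    For orientation, the two constraints contrasted in the abstract can be written as follows (standard notation, assumed rather than quoted from the paper). Bayesian conditioning sets the updated probability equal to the current conditional probability on the data D, while the reflection principle constrains the agent's current probability given what he anticipates his later probability will be:

        P_{\mathrm{new}}(A) = P(A \mid D)                          % Bayesian conditioning on data D
        P_{t}\bigl(A \mid P_{t'}(A) = q\bigr) = q, \qquad t' > t   % van Fraassen's reflection principle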

    Constructing Dirac linear fermions in terms of non-linear Heisenberg spinors

    We show that massive (or massless) neutrinos can be described as special states of Heisenberg nonlinear spinors. As a by-product of this decomposition a particularly attractive consequence appears: the possibility of relating the existence of only three species of massless neutrinos to such internal nonlinear structure. At the same time, it allows the possibility that neutrino oscillation can occur even for massless neutrinos.
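
    For orientation (textbook forms, not taken from the paper, and stated only up to sign and coupling conventions): the linear Dirac equation and the commonly quoted form of Heisenberg's nonlinear spinor equation, with \ell a fundamental length, read

        i\gamma^{\mu}\partial_{\mu}\psi - m\,\psi = 0                                                         % linear Dirac equation
        i\gamma^{\mu}\partial_{\mu}\psi - \ell^{2}\,(\bar{\psi}\gamma^{\nu}\gamma_{5}\psi)\,\gamma_{\nu}\gamma_{5}\,\psi = 0   % Heisenberg's nonlinear equation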

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision.
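
    A minimal concrete instance of a confidence measure (the standard Gaussian example, assumed here rather than quoted from the paper): for a single observation x of X ~ N(\theta, \sigma^2) with \sigma known, the confidence measure for \theta has cumulative distribution function

        C(\theta;\, x) = \Phi\!\left(\frac{\theta - x}{\sigma}\right)

    Under the true \theta, C(\theta;\, X) is Uniform(0, 1), so the probability the measure assigns to \{\theta \le \theta_0\} equals the coverage rate of the corresponding one-sided confidence interval; this is the coverage-matching property invoked in the betting argument.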

    Reciprocity as a foundation of financial economics

    This paper argues that the fundamental theorem of contemporary financial mathematics rests on the ethical concept of 'reciprocity'. The argument is based on identifying an equivalence between the contemporary, and ostensibly 'value-neutral', Fundamental Theorem of Asset Pricing and theories of mathematical probability that emerged in the seventeenth century in the context of the ethical assessment of commercial contracts in a framework of Aristotelian ethics. This observation, the main claim of the paper, is justified on the basis of results from the Ultimatum Game and is analysed within a framework of Pragmatic philosophy. The analysis leads to the explanatory hypothesis that markets are centres of communicative action with reciprocity as a rule of discourse. The purpose of the paper is to reorientate financial economics to emphasise the objectives of cooperation and social cohesion, and to this end we offer specific policy advice.
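
    For reference, the theorem in question (its standard informal statement, not specific to this paper) asserts that a market model admits no arbitrage if and only if there exists a probability measure \mathbb{Q}, equivalent to the real-world measure \mathbb{P}, under which discounted asset prices \tilde{S}_t are martingales; a claim paying X at time T is then priced by expectation:

        \text{no arbitrage} \iff \exists\, \mathbb{Q} \sim \mathbb{P}:\ \tilde{S}_{t} = \mathbb{E}_{\mathbb{Q}}[\tilde{S}_{T} \mid \mathcal{F}_{t}], \qquad V_{0} = \mathbb{E}_{\mathbb{Q}}\!\left[e^{-rT} X\right]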

    The Class of All Natural Implicative Expansions of Kleene’s Strong Logic Functionally Equivalent to Łukasiewicz’s 3-Valued Logic Ł3

    We consider the logics determined by the set of all natural implicative expansions of Kleene's strong 3-valued matrix (with both only one and two designated values) and select the class of all logics functionally equivalent to Łukasiewicz's 3-valued logic Ł3. The concept of a "natural implicative matrix" is based upon the notion of a "natural conditional" defined in Tomova (Rep Math Log 47:173–182, 2012).
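
    To make the matrices concrete, here is a small sketch (my own encoding with truth values {0, 1/2, 1}; not code from the paper) of Kleene's strong connectives together with the Łukasiewicz conditional that characterizes Ł3:

        from fractions import Fraction

        VALS = [Fraction(0), Fraction(1, 2), Fraction(1)]   # F, U, T

        neg  = lambda x: 1 - x                              # negation
        conj = lambda x, y: min(x, y)                       # Kleene strong conjunction
        disj = lambda x, y: max(x, y)                       # Kleene strong disjunction
        luka = lambda x, y: min(Fraction(1), 1 - x + y)     # Lukasiewicz conditional

        # Truth table of the Lukasiewicz conditional.
        for x in VALS:
            print("  ".join(str(luka(x, y)) for y in VALS))

        # A natural conditional agrees with the classical one on {0, 1} ...
        for x in (Fraction(0), Fraction(1)):
            for y in (Fraction(0), Fraction(1)):
                assert luka(x, y) == max(1 - x, y)
        # ... and satisfies modus ponens with 1 as the sole designated value.
        assert all(y == 1 for x in VALS for y in VALS
                   if x == 1 and luka(x, y) == 1)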

    Logical inference for inverse problems

    Estimating a single deterministic value for model parameters when reconstructing the system response has limited meaning once one considers that the model used to predict the system's behaviour is merely an idealization of reality, and that measurement errors exist. To provide a reliable answer, probabilistic rather than deterministic values should be given, carrying information about the degree of uncertainty or plausibility of the model parameters given one or more observations of the system response. This is widely known as the Bayesian inverse problem, which has been covered in the literature from different perspectives, depending on the interpretation or meaning assigned to probability. In this paper, we review two main approaches: one that uses probability as logic, and an alternative one that interprets it as information content. The contribution of this paper is a unifying formulation from which both approaches stem as interpretations, one that is more general in the sense of requiring fewer axioms, while the formulation and computation are simplified by dropping some constants. An extension to the problem of model class selection is derived, which is particularly simple under the proposed framework. A numerical example is finally given to illustrate the utility and effectiveness of the method.
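
    In the standard notation (a generic statement of the Bayesian inverse problem, not the paper's more general formulation): given observations d of the system response and a model class M with parameters \theta, Bayes' theorem gives the posterior, and its normalizing constant, the evidence, is the quantity that ranks competing model classes:

        p(\theta \mid d, M) = \frac{p(d \mid \theta, M)\, p(\theta \mid M)}{p(d \mid M)}, \qquad p(d \mid M) = \int p(d \mid \theta, M)\, p(\theta \mid M)\, d\theta

        P(M_{j} \mid d) \propto p(d \mid M_{j})\, P(M_{j})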

    Wall shear stress as measured in vivo: consequences for the design of the arterial system

    Based upon theory, wall shear stress (WSS), an important determinant of endothelial function and gene expression, has been assumed to be constant along the arterial tree and the same in a particular artery across species. In vivo measurements of WSS, however, have shown that these assumptions are far from valid. In this survey we discuss the assessment of WSS in the arterial system in vivo and present the results obtained in large arteries and arterioles. In vivo, WSS can be estimated from wall shear rate, as derived from non-invasively recorded velocity profiles, and whole blood viscosity in large arteries and plasma viscosity in arterioles, avoiding theoretical assumptions. In large arteries velocity profiles can be recorded by means of a specially designed ultrasound system, and in arterioles via optical techniques using fluorescent flow velocity tracers. In humans, mean WSS is substantially higher in the carotid artery (1.1–1.3 Pa) than in the brachial (0.4–0.5 Pa) and femoral (0.3–0.5 Pa) arteries. In animals, too, mean WSS varies substantially along the arterial tree. Mean WSS in arterioles varies between about 1.0 and 5.0 Pa in the various studies and depends on the site of measurement in these vessels. Across species, mean WSS in a particular artery decreases linearly with body mass, e.g., in the infra-renal aorta from 8.8 Pa in mice to 0.5 Pa in humans. The observation that mean WSS is far from constant along the arterial tree implies that Murray's cube law on flow-diameter relations cannot be applied to the whole arterial system. Because blood flow velocity is not constant along the arterial tree either, a square law also does not hold. The exponent in the power law likely varies along the arterial system, probably from 2 in large arteries near the heart to 3 in arterioles. The in vivo findings also imply that in in vitro studies no single average shear stress value can be used to study effects on endothelial cells derived from different vascular areas or from the same artery in different species: the cells have to be studied under the shear stress conditions they are exposed to in real life.
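
    The estimate described here is simply viscosity times wall shear rate. A minimal sketch (the viscosity and shear rate values below are assumptions for illustration, not data from the survey):

        def wall_shear_stress(viscosity_pa_s: float, wall_shear_rate_per_s: float) -> float:
            """WSS (Pa) = dynamic viscosity (Pa*s) x wall shear rate (1/s)."""
            return viscosity_pa_s * wall_shear_rate_per_s

        # Assumed whole-blood viscosity of ~3.5 mPa*s and wall shear rates of the
        # order one might derive from measured velocity profiles.
        print(wall_shear_stress(3.5e-3, 350))   # ~1.2 Pa, carotid-like
        print(wall_shear_stress(3.5e-3, 130))   # ~0.46 Pa, brachial-like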

    Precaution or Integrated Responsibility Approach to Nanovaccines in Fish Farming? A Critical Appraisal of the UNESCO Precautionary Principle

    Nanoparticles have multifaceted advantages in drug administration, such as vaccine delivery, and hence hold promise for improving the protection of farmed fish against diseases caused by pathogens. However, there are concerns that the benefits associated with the distribution of nanoparticles may also be accompanied by risks to the environment and health. The complexity of the natural and social systems involved implies that the information acquired in quantified risk assessments may be inadequate for evidence-based decisions. One controversial strategy for dealing with this kind of uncertainty is the precautionary principle. A few years ago, a UNESCO expert group suggested a new approach for implementation of the principle. Here we compare the UNESCO principle with earlier versions and explore the advantages and disadvantages by applying the UNESCO version to the use of PLGA nanoparticles for delivery of vaccines in aquaculture. Finally, we discuss whether a combined scientific and ethical analysis involving the concept of responsibility will enable approaches that can supplement the precautionary principle as a basis for decision-making in areas of scientific uncertainty, such as the application of nanoparticles in the vaccination of farmed fish.