129,846 research outputs found

    Resolution enhancement of images taken by mobile phone camera

    Get PDF
    Carey and co-researchers have proposed a super-resolution technique, DASR (Demirel-Anbarjafari Super Resolution), based on interpolation of the high-frequency sub-band images obtained by the discrete wavelet transform (DWT). Their estimation investigates the evolution of wavelet transform extrema within subbands of the same type: edges identified by an edge detection algorithm in the lower-frequency subbands are used to build a model for estimating edges in the higher-frequency subbands, and only the coefficients with significant values are estimated as the evolution of the wavelet coefficients. Finally, the interpolated high-frequency sub-band images and the interpolated input image are combined using the inverse DWT (IDWT) to produce a high-resolution output image. The technique has been implemented in Java so that it can be installed on mobile phones, and has been tested on well-known benchmark images.
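    A minimal sketch of the DWT-interpolation idea described above, assuming PyWavelets and SciPy (the authors' implementation is in Java, and the edge-evolution step for estimating significant coefficients is omitted here):

```python
# Sketch of DWT-based super-resolution, assuming PyWavelets and SciPy.
# The wavelet choice, bilinear interpolation, and the omission of the
# edge-evolution coefficient-estimation step are simplifications.
import numpy as np
import pywt
from scipy.ndimage import zoom

def dwt_super_resolve(img, wavelet="db1"):
    img = np.asarray(img, dtype=float)  # assumes even image dimensions
    # One-level 2-D DWT: one low-frequency (LL) and three high-frequency
    # (LH, HL, HH) subbands, each at half the input resolution.
    ll, (lh, hl, hh) = pywt.dwt2(img, wavelet)
    # Interpolate each high-frequency subband back up to the input size.
    lh_i, hl_i, hh_i = (zoom(b, 2, order=1) for b in (lh, hl, hh))
    # Use the input image itself in place of the interpolated LL subband:
    # it carries more low-frequency information than LL does.
    return pywt.idwt2((img, (lh_i, hl_i, hh_i)), wavelet)  # ~2x resolution
```

    The key design point is that the original input image, rather than an interpolated LL subband, supplies the low-frequency content at reconstruction.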

    Locality and measurements within the SR model for an objective interpretation of quantum mechanics

    Full text link
    One of the authors has recently propounded an SR (semantic realism) model which shows, circumventing known no-go theorems, that an objective (noncontextual, hence local) interpretation of quantum mechanics (QM) is possible. We consider here compound physical systems and show why the proofs of nonlocality of QM do not hold within the SR model. We also discuss quantum measurement theory within this model, note that the objectification problem disappears since the measurement of any property simply reveals its unknown value, and show that the projection postulate can be considered an approximate law, valid FAPP (for all practical purposes). Finally, we provide an intuitive justification for some unusual features of the SR model. (Comment: 29 pages, minor corrections)

    Computation in Finitary Stochastic and Quantum Processes

    Full text link
    We introduce stochastic and quantum finite-state transducers as computation-theoretic models of classical stochastic and quantum finitary processes. Formal process languages, representing the distribution over a process's behaviors, are recognized and generated by suitable specializations. We characterize and compare deterministic and nondeterministic versions, summarizing their relative computational power in a hierarchy of finitary process languages. Quantum finite-state transducers and generators are a first step toward a computation-theoretic analysis of individual, repeatedly measured quantum dynamical systems. They are explored via several physical systems, including an iterated beam splitter, an atom in a magnetic field, and atoms in an ion trap--a special case of which implements the Deutsch quantum algorithm. We show that these systems' behaviors, and so their information-processing capacity, depend sensitively on the measurement protocol. (Comment: 25 pages, 16 figures, 1 table; http://cse.ucdavis.edu/~cmg; numerous corrections and updates)
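    The sensitivity to the measurement protocol can be illustrated with a toy simulation of the iterated beam splitter; the 50/50 unitary and the two protocols below are illustrative assumptions, not the paper's exact setup:

```python
# Toy iterated beam splitter: the observed process language depends on
# how often the photon's path is measured between passes.
import numpy as np

rng = np.random.default_rng(0)
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # 50/50 beam-splitter unitary

def run(steps, measure_every):
    psi = np.array([1.0, 0.0], dtype=complex)  # photon enters in path 0
    symbols = []
    for t in range(1, steps + 1):
        psi = U @ psi
        if t % measure_every == 0:
            p0 = abs(psi[0]) ** 2              # Born rule for path 0
            k = 0 if rng.random() < p0 else 1
            symbols.append(k)
            psi = np.zeros(2, dtype=complex)
            psi[k] = 1.0                       # projective collapse
    return symbols

print(run(20, measure_every=1))  # measured each pass: fair-coin process
print(run(20, measure_every=2))  # every other pass: U @ U = I, deterministic
```

    Measuring after every pass yields a fair-coin process, while measuring after every second pass yields a deterministic one, because the beam-splitter unitary squares to the identity.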

    Languages cool as they expand: Allometric scaling and the decreasing need for new words

    Get PDF
    We analyze the occurrence frequencies of over 15 million words recorded in millions of books published during the past two centuries in seven different languages. For all languages and chronological subsets of the data we confirm that two scaling regimes characterize the word frequency distributions, with only the more common words obeying the classic Zipf law. Using corpora of unprecedented size, we test the allometric scaling relation between the corpus size and the vocabulary size of growing languages and demonstrate a decreasing marginal need for new words, a feature that is likely related to the underlying correlations between words. We calculate the annual growth fluctuations of word use, which show a decreasing trend as the corpus size increases, indicating a slowdown in linguistic evolution following language expansion. This “cooling pattern” forms the basis of a third statistical regularity, which, unlike the Zipf and Heaps laws, is dynamical in nature.
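    The allometric scaling relation in question is Heaps' law, V(N) ~ K N^beta with beta < 1. A minimal sketch of how the exponent can be estimated, run here on a toy Zipf-distributed corpus rather than the paper's data:

```python
# Estimate the Heaps-law exponent beta from a token stream:
# vocabulary size V(N) ~ K * N**beta, with beta < 1 meaning a
# decreasing marginal need for new words.
import numpy as np

def heaps_exponent(tokens):
    seen, sizes = set(), []
    for tok in tokens:
        seen.add(tok)
        sizes.append(len(seen))          # V(N) after N tokens
    n = np.arange(1, len(tokens) + 1)
    v = np.array(sizes, dtype=float)
    # Fit log V = beta * log N + log K by least squares.
    beta, log_k = np.polyfit(np.log(n), np.log(v), 1)
    return beta

# Toy corpus: word ranks drawn with Zipfian probability p(r) ~ 1/r.
ranks = np.arange(1, 50_001)
p = 1.0 / ranks
p /= p.sum()
corpus = np.random.default_rng(0).choice(ranks, size=200_000, p=p)
print(heaps_exponent(corpus))  # typically well below 1
```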

    Extensions of Simple Conceptual Graphs: the Complexity of Rules and Constraints

    Full text link
    Simple conceptual graphs are considered as the kernel of most knowledge representation formalisms built upon Sowa's model. Reasoning in this model can be expressed by a graph homomorphism called projection, whose semantics is usually given in terms of positive, conjunctive, existential FOL. We present here a family of extensions of this model, based on rules and constraints, keeping graph homomorphism as the basic operation. We focus on the formal definitions of the different models obtained, including their operational semantics and relationships with FOL, and we analyze the decidability and complexity of the associated problems (consistency and deduction). As soon as rules are involved in reasoning, these problems become undecidable, but we exhibit a condition under which they fall into the polynomial hierarchy. These results extend and complete those already published by the authors. Moreover, we systematically study the complexity of particular cases obtained by restricting the form of constraints and/or rules.
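    Projection is, at its core, a labeled-graph homomorphism. A hedged sketch of such a homomorphism test by backtracking; the data model (nodes carrying concept types, directed relation edges) and the specialization order `leq` are illustrative, not the paper's exact formalism:

```python
# Projection between simple conceptual graphs, reduced to a labeled-graph
# homomorphism found by exhaustive backtracking.

def projects(query, target, leq):
    """Return a homomorphism h: query -> target such that the type of h(u)
    specializes the type of u and every edge is preserved, or None.
    A graph is a pair (labels: dict node -> type, edges: set of (src, dst))."""
    q_labels, q_edges = query
    t_labels, t_edges = target
    q_nodes = list(q_labels)

    def extend(h):
        if len(h) == len(q_nodes):
            return h
        u = q_nodes[len(h)]
        for v in t_labels:
            if not leq(t_labels[v], q_labels[u]):
                continue  # h(u)'s type must specialize u's type
            h2 = {**h, u: v}
            # every edge with both endpoints already mapped must be preserved
            if all((h2[a], h2[b]) in t_edges
                   for (a, b) in q_edges if a in h2 and b in h2):
                found = extend(h2)
                if found is not None:
                    return found
        return None

    return extend({})

# Tiny demo with a two-element type order: Cat specializes Animal.
leq = lambda sub, sup: sub == sup or (sub, sup) in {("Cat", "Animal")}
fact = ({"f1": "Cat"}, {("f1", "f1")})
query = ({"q1": "Animal"}, {("q1", "q1")})
print(projects(query, fact, leq))  # {'q1': 'f1'}
```

    The exhaustive search is consistent with the known complexity of the basic problem: deciding whether a projection exists is NP-complete.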

    Scientific Realism, Adaptationism and the Problem of the Criterion

    Get PDF
    Scientific Realism (SR) has three crucial aspects: 1) the centrality of the concept of truth, 2) the idea that success is a reliable indicator of truth, and 3) the idea that Inference to the Best Explanation is a reliable inference rule. We outline how some realists try to overcome the difficulties that arise in justifying these crucial aspects by relying on an adaptationist view of evolutionism, and why such attempts are inadequate. Finally, we briefly sketch some of the main difficulties the realist has to face in defending those crucial aspects, and how such difficulties are deeply related: they derive from the inability of SR to satisfactorily meet the sceptical challenge of the criterion of truth. Indeed, SR seems unable to fill the so-called ‘epistemic gap’ (Sankey 2008). In fact, the epistemic gap cannot be filled other than by obtaining a criterion of truth, but such a criterion cannot be obtained while the epistemic gap obtains.

    Pseudo-Hermitian approach to energy-dependent Klein-Gordon models

    Full text link
    The relativistic Klein-Gordon system is studied as an illustration of quantum mechanics using non-Hermitian operators as observables. A version of the model is considered containing a generic coordinate- and energy-dependent phenomenological mass term $m^2(E,x)$. We show how such systems may be assigned a pair of linear, energy-independent left- and right-acting Hamiltonians with the quasi-Hermiticity property and, hence, with the standard probabilistic interpretation. (Comment: 2nd Int. Workshop "Pseudo-Hermitian Hamiltonians in Quantum Physics", http://gemma.ujf.cas.cz/~znojil/conf
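    For background, the quasi-Hermiticity property invoked above is standardly expressed via a positive-definite metric operator (a general textbook statement, not the paper's specific construction):

```latex
% A standard statement of quasi-Hermiticity: a positive-definite metric
% operator \Theta intertwines the Hamiltonian with its adjoint,
\[
  H^{\dagger}\,\Theta = \Theta\,H ,
  \qquad
  \Theta = \Theta^{\dagger} > 0 ,
\]
% so that H is self-adjoint with respect to the modified inner product
\[
  \langle \psi \mid \varphi \rangle_{\Theta}
  \;:=\;
  \langle \psi \mid \Theta\,\varphi \rangle ,
\]
% which yields a real spectrum and restores the probabilistic interpretation.
```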

    On the Foundations of the Theory of Evolution

    Full text link
    Darwinism conceives evolution as a consequence of random variation and natural selection; hence it is based on a materialistic, i.e. matter-based, view of science inspired by classical physics. But matter in itself is considered a very complex notion in modern physics. More specifically, at the microscopic level, matter and energy no longer retain their simple forms, and quantum mechanical models are proposed wherein potential form is considered in addition to actual form. In this paper we propose an alternative to standard Neodarwinian evolution theory. We suggest that the starting point of evolution theory cannot be limited to actual variation upon which selection acts, but must include variation in the potential of entities according to their context. We therefore develop a formalism, referred to as Context-driven Actualization of Potential (CAP), which handles potentiality and describes the evolution of entities as an actualization of potential through reiterated interaction with the context. As in quantum mechanics, lack of knowledge of the entity, its context, or the interaction between context and entity leads to different forms of indeterminism in relation to the state of the entity. This indeterminism generates a non-Kolmogorovian distribution of probabilities, different from the classical distribution of chance described by Darwinian evolution theory, which stems from an 'actuality-focused', i.e. materialistic, view of nature. We also present a quantum evolution game that highlights the main differences arising from our new perspective and shows that it is more fundamental to consider evolution in general, and biological evolution in particular, as a process of actualization of potential induced by context, of which its material reduction is only a special case. (Comment: 11 pages, no figures)
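    A toy numerical illustration of context-driven actualization in the quantum-probability spirit of CAP; the two-state entity and the two contexts are my own minimal example, not the authors' formalism:

```python
# Context-driven actualization of potential, in miniature: an entity in a
# superposed ("potential") state actualizes an outcome with probabilities
# that depend on the interrogating context (Born rule).
import numpy as np

psi = np.array([1.0, 1.0]) / np.sqrt(2)  # entity in a potential state

def actualize_probs(state, context):
    # A context is an orthonormal basis; each row is one possible outcome.
    return np.abs(context @ state) ** 2

ctx_a = np.eye(2)                                 # one interrogating context
ctx_b = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # an incompatible context

print(actualize_probs(psi, ctx_a))  # [0.5, 0.5]: both outcomes potential
print(actualize_probs(psi, ctx_b))  # [1.0, 0.0]: this context fixes the outcome
```

    The point of the toy example is only that the probabilities of actualization depend on the chosen context; the non-Kolmogorovian structure the abstract refers to arises when such incompatible contexts are combined.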

    Computation with narrow CTCs

    Full text link
    We examine some variants of computation with closed timelike curves (CTCs), where various restrictions are imposed on the memory of the computer and on the information-carrying capacity and range of the CTC. We give full characterizations of the classes of languages recognized by polynomial-time probabilistic and quantum computers that can send a single classical bit to their own past. Such narrow CTCs are demonstrated to add the power of limited nondeterminism to deterministic computers, and to lead to exponential speedups in constant-space probabilistic and quantum computation. We show that, given a time machine with constant negative delay, one can implement CTC-based computations without needing to know the runtime beforehand. (Comment: 16 pages; a few typos were corrected)
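    The consistency requirement behind a narrow CTC can be made concrete with a Deutsch-style fixed point for a single classical bit sent to the past; the example map below is mine, not the paper's:

```python
# Self-consistency for a one-classical-bit CTC: the computation induces a
# stochastic map M on the bit, and the bit's distribution p must be a fixed
# point, M p = p (a Deutsch-style consistency condition).
import numpy as np

def ctc_fixed_point(M):
    """M: 2x2 column-stochastic map on the CTC bit (column j = distribution
    of the bit sent back, given it arrived as j). Return p with M p = p."""
    vals, vecs = np.linalg.eig(M)
    p = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return p / p.sum()

# Example: the computation flips the CTC bit with probability 0.3.
M = np.array([[0.7, 0.3],
              [0.3, 0.7]])
print(ctc_fixed_point(M))  # [0.5, 0.5]: the self-consistent history
```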