
    On Musical Self-Similarity: Intersemiosis as Synecdoche and Analogy

    Self-similarity, a concept borrowed from mathematics, is gradually becoming a keyword in musicology. Although a polysemic term, self-similarity usually refers to multi-scalar repetition of features within a set of relationships, and it is commonly valued as an indication of musical ‘coherence’ and ‘consistency’. In this study, Gabriel Pareyon presents a theory of musical meaning formation in the context of intersemiosis, that is, the translation of meaning from one cognitive domain to another (e.g. from mathematics to music, or to speech or graphic forms). From this perspective, the degree of coherence of a musical system relies on a synecdochic intersemiosis: a system of related signs within other comparable and correlated systems. The author analyzes the modalities of such correlations, exploring their general and particular traits and their operational bounds. Accordingly, the notion of analogy is used as a rich concept through its two definitions in the Classical literature, proportion and paradigm, which are enormously valuable in establishing criteria of measurement, likeness and affinity. At the same time, original arguments by Benoît B. Mandelbrot (1924–2010) are revisited, alongside a systematic critique of the literature on the subject. Indeed, connecting Charles S. Peirce’s ‘synechism’ with Mandelbrot’s ‘fractality’ is one of the main developments of the present study.

    Medical image enhancement

    Each image acquired from a medical imaging system is often part of a two-dimensional (2-D) image set that together presents a three-dimensional (3-D) object for diagnosis. Unfortunately, these images are sometimes of poor quality, and the resulting distortions cause an inadequate presentation of the object of interest, which can lead to inaccurate image analysis. Blurring is considered a particularly serious problem, so “deblurring” an image to obtain better quality is an important issue in medical image processing. In our research, the image is first decomposed. Contrast improvement is then achieved by modifying the coefficients obtained from the decomposition: small coefficient values represent subtle details and are amplified to improve the visibility of the corresponding details, while the stronger image density variations, which make a major contribution to the overall dynamic range, have large coefficient values that can be reduced without much loss of information.
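The coefficient-modification scheme described in this abstract can be illustrated with a minimal numpy sketch. This is not the authors' actual method; it assumes a one-level Haar decomposition and illustrative parameter names (`gain`, `threshold`, `attenuation`): detail coefficients below a threshold are amplified, and large ones are attenuated, before reconstruction.

```python
import numpy as np

def enhance(image, gain=2.0, threshold=0.1, attenuation=0.5):
    """Amplify small (subtle-detail) Haar coefficients, attenuate large
    ones, then invert the transform. Image sides are assumed even."""
    a = image[0::2, 0::2]; b = image[0::2, 1::2]
    c = image[1::2, 0::2]; d = image[1::2, 1::2]
    approx = (a + b + c + d) / 4.0   # low-pass band (left unchanged)
    h = (a - b + c - d) / 4.0        # horizontal details
    v = (a + b - c - d) / 4.0        # vertical details
    dg = (a - b - c + d) / 4.0       # diagonal details

    def modify(coef):
        # small coefficients -> subtle detail -> boost; large -> reduce
        return np.where(np.abs(coef) < threshold, coef * gain,
                        coef * attenuation)

    h, v, dg = modify(h), modify(v), modify(dg)
    # inverse one-level Haar step
    out = np.empty_like(image, dtype=float)
    out[0::2, 0::2] = approx + h + v + dg
    out[0::2, 1::2] = approx - h + v - dg
    out[1::2, 0::2] = approx + h - v - dg
    out[1::2, 1::2] = approx - h - v + dg
    return out

img = np.arange(16.0).reshape(4, 4)
enhanced = enhance(img)
```

With `gain=1` and `attenuation=1` the function reduces to the identity, which is a convenient sanity check on the transform pair.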

    New Foundations in the Sciences: Physics without sweeping infinities under the rug

    It is widely known at the frontiers of physics that the practice of “sweeping under the rug” has been the norm rather than the exception; in other words, the leading paradigms have a strong tendency to be hailed as the only game in town. For example, renormalization group theory was hailed as the cure for the infinity problem in QED. A quote about Richard Feynman goes as follows: “What the three Nobel Prize winners did, in the words of Feynman, was to get rid of the infinities in the calculations. The infinities are still there, but now they can be skirted around . . . We have designed a method for sweeping them under the rug.” [1] Paul Dirac wrote in a similar tone: “Hence most physicists are very satisfied with the situation. They say: Quantum electrodynamics is a good theory, and we do not have to worry about it any more. I must say that I am very dissatisfied with the situation, because this so-called good theory does involve neglecting infinities which appear in its equations, neglecting them in an arbitrary way. This is just not sensible mathematics. Sensible mathematics involves neglecting a quantity when it turns out to be small—not neglecting it just because it is infinitely great and you do not want it!” [2] Similarly, dark matter and dark energy have been elevated as plausible ways to solve the crisis in the prevalent Big Bang cosmology. That is why we chose the theme New Foundations in the Sciences here: to emphasize the necessity of introducing a new set of approaches in the sciences, be it physics, cosmology, consciousness, etc.

    Statistical modelling of financial crashes

    As the stock market came to the attention of increasing numbers of physicists, the idea emerged that it might be possible to develop a mathematical theory of stock market crashes. This thesis is primarily concerned with the statistical aspects of such a theory. Chapters 1–5 discuss simple models for bubbles. Chapter 1 is an introduction. Chapter 2 describes a skeleton exploratory analysis before discussing some economic interpretations of crashes and a rational expectations model of financial crashes, a slightly simplified version of that in Johansen et al. (2000). This model assumes that economic variables undergo a phase transition prior to a crash, and we give some empirical support for this idea in Chapters 4 and 5. Chapter 3 discusses SDE models for bubbles. We describe maximum likelihood estimation of the Sornette and Andersen (2002) model and refine the previous estimation of this model in Andersen and Sornette (2004). Further, we extend this model using a heavy-tailed hyperbolic process, Eberlein and Keller (1995), to provide a robust statistical test for bubbles. In Chapter 4 we examine a range of volatility and liquidity precursors. We have some evidence that crashes occur on volatile, illiquid markets, and the economic interpretation of our results appears interesting. Chapter 5 synthesises Chapters 2–4. In Chapter 6 we develop calculations in Johansen and Sornette (2001) to derive a generalised Pareto distribution for drawdowns. In addition, we review the Sornette et al. (2004) method of using power laws to distinguish between endogenous and exogenous origins of crises. Despite some evidence to support the original approach, it appears that a better model is a stochastic volatility model where the log volatility is fractional Gaussian noise. Arias (2003) makes a distinction between insurance-crisis and illiquidity-crisis models. In Chapter 7, focusing upon illiquidity crises, we apply the method of Malevergne and Sornette (2005) to evaluate contagion in economics. Chapter 8 summarises the main findings and gives suggestions for further work.
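The drawdown-distribution idea above can be sketched briefly. The following is not the thesis's code; it assumes the simple "pure drawdown" definition from the Johansen–Sornette literature (cumulative loss over a run of consecutive negative log-returns) and fits a generalized Pareto distribution with `scipy.stats.genpareto`.

```python
import numpy as np
from scipy.stats import genpareto

def drawdowns(prices):
    """Cumulative loss over each run of consecutive negative
    log-returns (the 'pure drawdown' definition, epsilon = 0)."""
    r = np.diff(np.log(prices))
    out, run = [], 0.0
    for x in r:
        if x < 0:
            run -= x            # accumulate the loss
        elif run > 0:
            out.append(run)     # a positive return closes the run
            run = 0.0
    if run > 0:
        out.append(run)
    return np.array(out)

# demo on a synthetic random-walk price path
rng = np.random.default_rng(1)
prices = np.exp(np.cumsum(rng.normal(0.0, 0.01, 2000)))
dd = drawdowns(prices)
shape, loc, scale = genpareto.fit(dd, floc=0.0)  # GPD with location pinned at 0
```

A heavier-than-GPD tail in the few largest drawdowns is what the "outlier" crash diagnostic in this literature looks for.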

    Quantum Anthropology: Man, Cultures, and Groups in a Quantum Perspective

    This philosophical anthropology explores the basic categories of man’s being in the world using a special quantum meta-ontology introduced in the book. A quantum understanding of space and time, consciousness, or empirical/nonempirical reality elicits new questions relating to philosophical concerns such as subjectivity, free will, mind, perception, experience, dialectic, or agency. The authors have developed an inspiring theoretical framework transcending the boundaries of particular disciplines, e.g. quantum philosophy, metaphysics of consciousness, philosophy of mind, phenomenology of space and time, and ontological relativity.

    French Roadmap for Complex Systems 2008–2009

    This second issue of the French Complex Systems Roadmap is the outcome of the Entretiens de Cargese 2008, an interdisciplinary brainstorming session organized over one week in 2008 jointly by RNSC, ISC-PIF and IXXI. It builds on the first roadmap and gathers contributions from more than 70 scientists at major French institutions. The aim of this roadmap is to foster the coordination of the complex systems community on focused topics and questions, as well as to present the contributions and challenges of the complex systems sciences to the public, political and industrial spheres.

    Simulations and statistical inferences

    For a long time, finance theory held on to the idea of efficient markets that convert every piece of new information into the price of an asset so as to reflect its true value. The emergence of empirical facts that contradict this view gave rise to new theoretical models. One of the most important contributions was so-called behavioural finance theory, which describes people, and traders in particular, on a more psychological basis. Based on this approach, other ideas competing with efficient market theory developed in the following years. A relatively new branch of finance theory that borrows its main ideas from the natural sciences is known as econophysics. This approach comprises both the statistical features of financial time series and simulations that try to picture real asset markets as complex systems. These systems are characterised by many different traders who interact and influence each other, and who thereby create time series of the kind encountered in many other fields, such as earthquakes, mass extinctions or solar flares. This work applies the theory of complex systems in order to understand the mechanics of real financial markets, in particular stock markets. First it reviews the existing literature on econophysics. Then it provides new statistical work confirming the earlier result that the main characteristics of financial time series are the non-normality of the distribution of price changes, multifractality and volatility clustering. This is followed by two new simulation models. The first is an Ising model in which neighbour influence plays the crucial part. The second is a more economically based simulation in which the traders follow explicit strategies that determine how they act. As it turns out, both models are able to produce time series that possess all the characteristics of real time series.
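An Ising-style market of the kind this abstract describes can be sketched in a few lines. This is a generic toy model, not the thesis's simulation: traders are spins on a ring (+1 = buy, -1 = sell) updated by heat-bath (Glauber) dynamics, and the mean spin per sweep serves as an excess-demand proxy for the return; all parameter names are illustrative.

```python
import numpy as np

def ising_market(n_traders=200, n_steps=300, coupling=1.0,
                 temperature=2.0, seed=0):
    """Toy Ising market: neighbour influence via a local field,
    randomness via a 'temperature' noise level."""
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=n_traders)
    returns = np.empty(n_steps)
    for t in range(n_steps):
        for _ in range(n_traders):          # one full sweep per time step
            i = rng.integers(n_traders)
            # influence of the two ring neighbours on trader i
            field = coupling * (spins[i - 1] + spins[(i + 1) % n_traders])
            # heat-bath rule: probability of choosing 'buy'
            p_up = 1.0 / (1.0 + np.exp(-2.0 * field / temperature))
            spins[i] = 1 if rng.random() < p_up else -1
        returns[t] = spins.mean()           # excess demand ~ return proxy
    return returns

series = ising_market()
```

Near the critical temperature such models produce the clustered, heavy-tailed fluctuations that motivate their use as stylized stock-market simulations.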

    Communicating the Unspeakable: Linguistic Phenomena in the Psychedelic Sphere

    Psychedelics can enable a broad and paradoxical spectrum of linguistic phenomena, from the unspeakability of mystical experience to the eloquence of the songs of the shaman or curandera. Interior dialogues with the Other, whether framed as the voice of the Logos, an alien download, or communion with ancestors and spirits, are relatively common. Sentient visual languages are encountered, their forms unrelated to the representation of speech in natural language writing systems. This thesis constructs a theoretical model of linguistic phenomena encountered in the psychedelic sphere for the field of altered states of consciousness research (ASCR). The model is developed from a neurophenomenological perspective, especially the work of Francisco Varela, and Michael Winkelman’s work on shamanistic ASC, which in turn builds on the biogenetic structuralism of Charles Laughlin, John McManus, and Eugene d’Aquili. Neurophenomenology relates the physical and functional organization of the brain to subjective reports of lived experience in altered states as mutually informative, without reducing consciousness to one or the other. Consciousness is seen as a dynamic multistate process of the recursive interaction of biology and culture, thereby navigating the traditional dichotomies of objective/subjective, body/mind, and inner/outer realities that problematically characterize much of the discourse in consciousness studies. The theoretical work of Renaissance scholar Stephen Farmer on the evolution of syncretic and correlative systems and their relation to neurobiological structures provides a further framework for the exegesis of descriptions of linguistic phenomena in first-person texts of long-term psychedelic self-exploration. Since the classification of most psychedelics as Schedule I drugs, legal research has come to a halt; self-experimentation as research has not.
    Scientists such as Timothy Leary and John Lilly became outlaw scientists, a social aspect of the “unspeakability” of these experiences. Academic ASCR has largely side-stepped examination of the extensive literature of psychedelic self-exploration. This thesis examines aspects of both form and content from these works, focusing on those that treat linguistic phenomena, and asks what these linguistic experiences can tell us about how the psychedelic landscape is constructed; how it can be navigated, interpreted, and communicated within its own experiential field; and how it can be communicated about so as to make the data accessible to inter-subjective comparison and validation. The methodological core of this practice-based research is a technoetic practice as defined by artist and theoretician Roy Ascott: the exploration of consciousness through interactive, artistic, and psychoactive technologies. The iterative process of psychedelic self-exploration and the creation of interactive software defines my own technoetic practice and is the means by which I examine my states of consciousness employing the multidimensional visual language Glide.

    Nonlinear Systems

    The editors of this book have incorporated contributions from a diverse group of leading researchers in the field of nonlinear systems. To enrich the scope of the content, the book contains a valuable selection of works on fractional differential equations. It aims to provide an overview of the current knowledge on nonlinear systems and some aspects of fractional calculus, with the main subject areas divided into theoretical and applied sections. The book is useful for researchers in mathematics, applied mathematics, and physics, as well as for graduate students who are studying nonlinear systems with reference to their theory and application. It is also an ideal complement to the specific literature on engineering, biology, health science, and other applied science areas. The opportunity given by IntechOpen to offer this book under the open access system contributes to disseminating the field of nonlinear systems to a wide range of researchers.