11 research outputs found

    CICT: A Novel Framework for Biomedical and Bioengineering Application

    In 2013, Computational Information Conservation Theory (CICT) confirmed Newman, Lachmann and Moore's 2004 result, generating an analogous example for a 2-D signal (an image), to show that even the most sophisticated current instrumentation systems are completely unable to reliably discriminate so-called "random noise" from a combinatorially optimized encoded message, which CICT calls "deterministic noise". To grasp a more reliable representation of experimental reality and to obtain stronger physical and biological system correlates, researchers and scientists need two intelligently articulated hands: stochastic and combinatorial approaches synergistically articulated by natural coupling. The CICT approach brings classical and quantum information theory together in a single framework by considering information not only on the statistical manifold of model states but also on the combinatorial manifold of low-level discrete, phased generators and empirical measures of noise sources, related to experimental high-level overall perturbation. As an example of a complex system (a hierarchical, heterogeneous, multi-scale system) with important implications, we consider classical relativistic electrodynamics applied to biological system modeling (e.g. full-wave electromagnetic modeling of brain waves). The CICT approach can offer an effective and convenient "Science 2.0" universal framework for developing innovative applications and beyond, towards a more sustainable economy and wellbeing in a global competition scenario.

    Stronger Nanoscale EM and BEM Solutions by CICT Phased Generators

    The addiction to IC (infinitesimal calculus) in the mathematical treatment of EM (electromagnetic) and BEM (bioelectromagnetic) modeling problems is such that, since the digital computer requires an algebraic formulation of physical laws, it is preferred to discretize the differential equations rather than to consider other, more convenient tools for the mathematical description of the problem, such as FDC (finite difference calculus) or more sophisticated algebraic methods. Unfortunately, even traditional FDC, FDTD, etc., approaches are unable to conserve the overall system information description. As a matter of fact, current number theory and modern numerical analysis still use a mono-directional interpretation for numeric group generators and relations, so information entropy generation cannot be avoided in current computational algorithms and applications. Furthermore, traditional digital computational resources are unable to capture and manage the full information content of a single real number R; even rational numbers Q are managed with information dissipation (e.g. finite-precision machines, truncation, rounding, etc.). The CICT PG approach can offer an effective and convenient "Science 2.0" universal framework by considering information not only on the statistical manifold of model states but also on the combinatorial manifold of low-level discrete, phased generators and empirical measures of noise sources, related to experimental high-level overall perturbation. We present an effective example: how to unfold the full information content hardwired into the Rational OpeRational (OR) representation (a nano-microscale discrete representation) and to relate it to a continuum framework (meso-macroscale) with no information dissipation.
    This paper is a relevant contribution towards arbitrary multi-scale computer science and systems biology modeling, showing how the CICT PG approach can offer a powerful, effective and convenient "Science 2.0" universal framework for developing innovative, antifragile applications and beyond.
    Fiorini, Rodolfo
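    The information dissipation that finite-precision machines introduce even into rational numbers Q can be illustrated with a minimal Python sketch. This is a generic illustration of the truncation/rounding point above, not the paper's OR representation:

```python
from fractions import Fraction

# Binary floating point cannot represent 1/10 exactly, so repeated
# operations dissipate information through rounding:
x = 0.1 + 0.1 + 0.1
print(x == 0.3)   # False: a rounding error has already crept in

# Exact rational (Q) arithmetic conserves the full information content:
y = Fraction(1, 10) + Fraction(1, 10) + Fraction(1, 10)
print(y == Fraction(3, 10))   # True: no truncation, no rounding
```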

    From Quantum Sensing to SWEME Interaction Modeling

    Under the influence of super weak electromagnetic emission (SWEME), water changes its physical properties and becomes able to exert the same effect on a biological object as the substance from which the SWEME was obtained. To date, there is no understanding of the mechanisms of these phenomena, and there is no theoretical basis for the observed results. The core problem concerns the precision with which measurements on quantum resonant systems can be used to estimate quantities that appear as classical parameters in the theory, for example time, displacement, rotation, external fields, etc. The CICT EPG-IPG approach can minimize the opacity of the traditional multiscale statistical modeling veil. It brings classical and quantum information theory together in a single, effective pre-spatial geometro-arithmetic framework.

    A natural framework for arbitrary multi-scale computer science and systems biology efficient computational modeling

    The aim of the present paper is to provide the first concise overview of a natural framework for arbitrary multi-scale computer science and systems biology computational modeling. To grasp a more reliable representation of reality and to obtain more effective modeling techniques, researchers and scientists need two intelligently articulated hands: stochastic and combinatorial approaches synergistically articulated by natural coupling. After a brief introduction on traditional modeling vs. the fresh QFT approach, we go directly to the root of the problem. We present key solutions to arbitrary multi-scale modeling problems. The first attempt to identify basic principles for stronger modeling solutions in scientific applications has been under development at Politecnico di Milano University since the 1990s. The fundamental principles of computational information conservation theory (CICT), for arbitrary multi-scale system modeling from basic generators and relations through discrete paths denser and denser to one another, towards a never-ending 'blending quantum continuum,' are recalled. A computational example is presented and discussed. This paper is a relevant contribution towards arbitrary multi-scale computer science and systems biology modeling, showing how a computational information conservation approach can offer stronger and more effective system modeling algorithms for more reliable simulation.

    Exploiting Numerical Features in Computer Tomography Data Processing and Management by CICT

    The impact of high-resolution computer tomography (HRCT) technology is to generate new challenges associated with the formation, acquisition, compression, transmission, and analysis of enormous amounts of data. In the past, computational information conservation theory (CICT) has shown the potential to provide new techniques to face this challenge conveniently. CICT explores, at an elementary level, the basic properties and relationships of Q arithmetic to achieve full numeric algorithmic information conservation, with strong connections to modular group theory and combinatorial optimization. The traditional rational number system can be regarded as a highly sophisticated open-logic, powerful and flexible LTR and RTL formal language of languages, with self-defining consistent words and rules, starting from elementary generators and relations. CICT supplies us with optimized exponential cyclic sequences (OECS), which inherently capture the hidden symmetries and asymmetries of the hyperbolic space encoded by rational numbers. Symmetry and asymmetry relations can be seen as the operational manifestation of a universal categorical irreducible arithmetic dichotomy ("correspondence" and "incidence") at the innermost logical data-structure level. These two components are inseparable from each other and in continuous reciprocal interaction. According to Pierre Curie, symmetry breaking has the following role: for a phenomenon to occur in a medium, the original symmetry group of the medium must be lowered (broken, in today's terminology) to the symmetry group of the phenomenon (or to a subgroup of the phenomenon's symmetry group) by the action of some cause. In this sense symmetry breaking, or asymmetry, is what creates a phenomenon. The same dichotomy generates "pairing" and "fixed point" properties for digit groups of the same word length, in word combinatorics.
    Correspondence and incidence manifest themselves even in the fundamental properties of a single digit (i.e. "evenness" and "oddness"), down to the binary elementary symbols ("0", "1"). This new awareness can be exploited in the development of competitive optimized algorithms and applications. Practical examples will be presented.
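    The cyclic structure and "pairing" property of digit groups in the repetend of a rational number can be made concrete with a short Python sketch. This is a generic illustration (the classic cyclic number of 1/7 and Midy's nine-sum pairing), not the paper's OECS algorithm:

```python
def repetend(n: int, d: int) -> str:
    """Repeating decimal digits of n/d via long division (d coprime to 10)."""
    digits, seen, r = [], {}, n % d
    while r not in seen:
        seen[r] = len(digits)       # remember where this remainder appeared
        r *= 10
        digits.append(str(r // d))  # next decimal digit
        r %= d
    return "".join(digits[seen[r]:])

p = repetend(1, 7)
print(p)                   # 142857 -- 2/7, 3/7, ... are cyclic rotations of it
half = len(p) // 2
# Midy's theorem ("pairing"): the two halves of the repetend sum to all nines.
print(int(p[:half]) + int(p[half:]))   # 142 + 857 = 999
```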

    The Entropy Conundrum: A Solution Proposal

    In 2004, physicist Mark Newman, along with biologist Michael Lachmann and computer scientist Cristopher Moore, showed that if electromagnetic radiation is used as a transmission medium, the most information-efficient format for a given 1-D signal is indistinguishable from blackbody radiation. Since many natural processes maximize the Gibbs-Boltzmann entropy, they should give rise to spectra indistinguishable from an optimally efficient transmission. In 2008, computer scientist C.S. Calude and physicist K. Svozil proved that "quantum randomness" is not Turing computable. In 2013, academic scientist R.A. Fiorini confirmed Newman, Lachmann and Moore's result, creating an analogous example for a 2-D signal (an image) as an application of CICT in pattern recognition and image analysis. Paradoxically, if you do not know the code used for a message, you cannot tell the difference between an information-rich message and a random jumble of letters. This is the entropy conundrum to solve. Even the most sophisticated instrumentation system is completely unable to reliably discriminate so-called "random noise" from a combinatorially optimized encoded message, which CICT calls "deterministic noise". The fundamental concept of entropy crosses many scientific and research areas but, unfortunately, even across so many different disciplines, scientists have not yet worked out a definitive solution to the fundamental problem of the logical relationship between human experience and knowledge extraction. So both the classic concept of entropy and system random noise should be revisited deeply at the theoretical and operational level. A convenient CICT solution proposal will be presented.
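    The conundrum can be demonstrated in a few lines of Python: once a structured message is optimally encoded (here approximated with zlib compression), a byte-level entropy probe can no longer tell it apart from noise. This is a toy illustration of the general principle, not the Newman-Lachmann-Moore construction:

```python
import math
import random
import zlib

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte stream, in bits per byte (max 8.0)."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

random.seed(0)
# A highly structured "message": a long text over a 4-letter alphabet,
# so its intrinsic information content is only ~2 bits per byte.
message = bytes(random.choice(b"ACGT") for _ in range(100_000))
compressed = zlib.compress(message, 9)

print(byte_entropy(message))     # ~2.0 bits/byte: visibly structured
print(byte_entropy(compressed))  # close to 8 bits/byte: looks like pure noise
```

Without the codebook (here, the deflate decoder), the compressed stream is statistically indistinguishable from random bytes, even though it carries the whole message.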

    Entropy, Decoherence and Spacetime Splitting

    Objects in the classical world model are in an "either/or" kind of state. A compass needle cannot point both north and south at the same time. The quantum world, by contrast, is "both/and": a magnetic atom model has no trouble pointing in both directions at once. When that is the case, physicists say that a quantum object is in a "superposition" of states. In a previous paper, we already discussed the major intrinsic limitations of "Science 1.0" arbitrary multi-scale (AMS) modeling and strategies to obtain better simulation results with a "Science 2.0" approach. In 2014, computational information conservation theory (CICT) showed that even the most sophisticated instrumentation system is completely unable to reliably discriminate so-called "random noise" (RN) from a combinatorially optimized encoded message (an OECS, optimized exponential cyclic sequence), called "deterministic noise" (DN) by CICT. Unfortunately, the "probabilistic veil" can be quite opaque computationally, and misplaced precision leads to confusion. The "Science 2.0" paradigm has not yet been completely grasped by many contemporary scientific disciplines and current researchers, so not all the implications of this big change, much less their related vital applications, have been realized hitherto. Thus, one of the key questions in understanding the quantum-classical transition is what happens to the superposition as you go up the atoms-to-apple scale. Exactly when and how does "both/and" become "either/or"? As an example, we present and discuss the observer space-time splitting case. In other words, we show spacetime mapping to a classical system additive representation with entropy generation. It is exactly at this point that "both/and" becomes "either/or" in the usual Science 1.0 representation.
    CICT's new awareness of a discrete HG (hyperbolic geometry) subspace (reciprocal space) of coded heterogeneous hyperbolic structures, underlying the familiar Q Euclidean (direct space) surface representation, can open the way to holographic information geometry (HIG), to recover a system's lost coherence and to reach an overall minimum-entropy system representation.

    A Cybernetics Update for Competitive Deep Learning System

    A number of recent reports in the peer-reviewed literature have discussed the irreproducibility of results in biomedical research. Some of these articles suggest that the inability of independent research laboratories to replicate published results has a negative impact on the development of, and confidence in, the biomedical research enterprise. To obtain more resilient data and more reproducible results, we present an adaptive and learning system reference architecture for a smart learning system interface. For deeper inspiration, we focus our attention on mammalian brain neurophysiology. In fact, from a neurophysiological point of view, neuroscientist LeDoux identified two preferential amygdala pathways in the brain of the laboratory mouse. The low road is a pathway that transmits a signal from a stimulus to the thalamus and then to the amygdala, which then activates a fast response in the body. The high road is activated simultaneously; it is a slower road that also includes the cortical parts of the brain, thus creating a conscious impression of what the stimulus is (to develop a rational mechanism of defense, for instance). To mimic this biological reality, our main idea is to use a new input node able to bind known information coherently to unknown information. Then, unknown "environmental noise" and/or local "signal input" information can be aggregated with known "system internal control status" information to provide a landscape of attractor points, from which either a fast or a slower, deeper system response can be computed. In this way, ideal cybernetic system interaction levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and minimal information loss. The present paper is a relevant contribution to updating classic cybernetics towards a new General Theory of Systems, a post-Bertalanffy Systemics.

    How Random is Your Tomographic Noise? A Number Theoretic Transform (NTT) Approach

    Discrete tomography (DT), differently from GT and CT, focuses on the case where only a few specimen projections are known and the images contain a small number of different colours (e.g. black-and-white). A concise review of the main contemporary physical and mathematical CT system problems is offered. Stochastic vs. combinatorially optimized noise generation is compared and presented through two visual examples, to emphasise a major double-bind problem at the core of the most advanced contemporary instrumentation systems. Automatically tailoring denoising procedures to real dynamic system characteristics and performance can get closer to an ideal self-registering and self-linearizing system, generating a virtually uniform and robust probing field during its whole designed service life-cycle. The first attempt to develop basic principles for automatic characterization, profiling and identification of system background low-level noise sources by CICT, from discrete system parameters, is presented. As a matter of fact, CICT can supply us with cyclic numeric sequences perfectly tuned to their low-level multiplicative source generators, related to experimental high-level overall perturbation (according to the high-level classic perturbation computational model, under either an additive or a multiplicative perturbation hypothesis). Numeric examples are presented. Furthermore, a practical NTT example is given. Specifically, advanced CT systems, HRO and Mission Critical Projects (MCP) for very low Technological Risk (TR), and Crisis Management (CM) systems will benefit greatly from the CICT infocentric worldview. The presented framework, concepts and techniques can be used to boost the development of next-generation algorithms and advanced applications quite conveniently.
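    For readers unfamiliar with the NTT named in the title: it is a discrete Fourier transform computed over the integers modulo a prime instead of over the complex numbers, so the round trip is exact, with no floating-point noise. A minimal naive Python sketch (a generic textbook NTT with assumed parameters n = 8, p = 17, root of unity 9, not the paper's example):

```python
def ntt(a, omega, p):
    """Naive number theoretic transform: a DFT over Z/pZ instead of C."""
    n = len(a)
    return [sum(a[j] * pow(omega, i * j, p) for j in range(n)) % p
            for i in range(n)]

def intt(A, omega, p):
    """Inverse NTT: use omega^-1 and scale by n^-1 (both mod p)."""
    n = len(A)
    inv_n = pow(n, -1, p)            # modular inverse (Python 3.8+)
    inv_w = pow(omega, -1, p)
    return [x * inv_n % p for x in ntt(A, inv_w, p)]

# omega = 9 has multiplicative order 8 mod 17, so it is a valid
# 8th root of unity for a length-8 transform.
a = [1, 2, 3, 4, 0, 0, 0, 0]
A = ntt(a, 9, 17)
print(intt(A, 9, 17))   # recovers [1, 2, 3, 4, 0, 0, 0, 0] exactly
```

Because every step is exact integer arithmetic, the NTT is a natural probe for separating genuinely stochastic noise from deterministic cyclic structure.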