    High Resolution 3D Ultrasonic Breast Imaging by Time-Domain Full Waveform Inversion

    Ultrasound tomography (UST) scanners allow quantitative images of the human breast's acoustic properties to be derived, with potential applications in screening, diagnosis and therapy planning. Time-domain full waveform inversion (TD-FWI) is a promising UST image formation technique that fits the parameter fields of a wave physics model by gradient-based optimization. For high-resolution 3D UST, it poses three key challenges: Firstly, its central building block, the computation of the gradient for a single US measurement, has a prohibitively large memory footprint. Secondly, this building block needs to be computed for each of the 10^3-10^4 measurements, resulting in a massively parallel computation usually performed on large computational clusters for days. Lastly, the structure of the underlying optimization problem may result in slow progression of the solver and convergence to a local minimum. In this work, we design and evaluate a comprehensive computational strategy to overcome these challenges: Firstly, we introduce a novel gradient computation based on time reversal that dramatically reduces the memory footprint at the expense of one additional wave simulation per source. Secondly, we break the dependence on the number of measurements by using source encoding (SE) to compute stochastic gradient estimates. We also describe a more accurate, TD-specific SE technique with finer variance control and use a state-of-the-art stochastic L-BFGS method. Lastly, we design an efficient TD multi-grid scheme together with preconditioning to speed up convergence while avoiding local minima. All components are evaluated in extensive numerical proof-of-concept studies simulating a bowl-shaped 3D UST breast scanner prototype. Finally, we demonstrate that their combination allows us to obtain an accurate 442x442x222 voxel image with a resolution of 0.5 mm using Matlab on a single GPU within 24 h.
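
    The abstract gives no implementation, so the following is a minimal numpy sketch of the source-encoding step it describes; simulate and adjoint are hypothetical placeholders for the forward and adjoint wave solvers, and the Rademacher weights are one standard choice of encoding. In expectation the cross-terms cancel, so a single simulation yields an unbiased estimate of the full gradient:

        import numpy as np

        def encoded_gradient(sources, observed, simulate, adjoint, rng):
            # sources  : (n_src, ...) array, one source term per US measurement
            # observed : (n_src, ...) array, the matching measured data
            # simulate : placeholder forward wave solver, encoded source -> data
            # adjoint  : placeholder adjoint solver, residual -> model gradient
            w = rng.choice([-1.0, 1.0], size=len(sources))  # Rademacher weights
            supershot = np.tensordot(w, sources, axes=1)    # one encoded source
            superdata = np.tensordot(w, observed, axes=1)   # one encoded datum
            residual = simulate(supershot) - superdata      # one forward solve
            # E[w_i * w_j] = 0 for i != j, so cross-talk terms vanish in
            # expectation and this estimates the sum of per-source gradients.
            return adjoint(residual)                        # one adjoint solve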

    Stronger Nanoscale EM and BEM Solutions by CICT Phased Generators

    The addiction to IC (Infinitesimal Calculus) in the mathematical treatment of EM (electromagnetic) and BEM (bioelectromagnetic) modeling problems is such that, since the digital computer requires an algebraic formulation of physical laws, it is preferred to discretize the differential equations rather than to consider other, more convenient tools for the mathematical description of the problem, such as FDC (Finite Differences Calculus) or more sophisticated algebraic methods. Unfortunately, even traditional FDC, FDTD, etc. approaches are unable to conserve the overall system information description. As a matter of fact, current Number Theory and modern Numeric Analysis still use a mono-directional interpretation of numeric group generators and relations, so information entropy generation cannot be avoided in current computational algorithms and applications. Furthermore, traditional digital computational resources are unable to capture and manage the full information content of a single Real Number R; even a Rational Number Q is handled with information dissipation (e.g. finite-precision machine arithmetic, truncating, rounding, etc.). The CICT PG approach can offer an effective and convenient "Science 2.0" universal framework by considering information not only on the statistical manifold of model states but also on the combinatorial manifold of low-level discrete, phased generators and empirical measures of noise sources, related to experimental high-level overall perturbation. We present an effective example of how to unfold the full information content hardwired into the Rational OpeRational (OR) representation (nano-microscale discrete representation) and relate it to a continuum framework (meso-macroscale) with no information dissipation. This paper is a relevant contribution towards arbitrary multi-scale computer science and systems biology modeling, showing how the CICT PG approach can offer a powerful, effective and convenient "Science 2.0" universal framework to develop innovative, antifragile applications and beyond.
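
    As a concrete illustration of the dissipation claim only (not of the CICT PG machinery itself), this short Python sketch contrasts a rational number pushed through finite-precision binary floating point, where truncation and rounding lose information on entry, with the same number handled by exact rational arithmetic:

        from fractions import Fraction

        # 1/10 has no finite base-2 expansion, so information is lost the
        # moment it enters a finite-precision float register.
        print(f"{0.1:.20f}")           # 0.10000000000000000555...
        print(sum([0.1] * 10) == 1.0)  # False: rounding errors accumulate

        # The same rational handled exactly: no dissipation at any step.
        q = Fraction(1, 10)
        print(sum([q] * 10) == 1)      # True: full information content kept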

    CICT: A Novel Framework for Biomedical and Bioengineering Application

    In 2013, Computational Information Conservation Theory (CICT) confirmed Newman, Lachmann and Moore's 2004 result, generating an analogous example for a 2-D signal (an image), to show that even the current, most sophisticated instrumentation system is completely unable to reliably discriminate so-called "random noise" from any combinatorially optimized encoded message, which CICT called "deterministic noise". To grasp a more reliable representation of experimental reality and to get stronger physical and biological system correlates, researchers and scientists need two intelligently articulated hands: both stochastic and combinatorial approaches, synergistically articulated by natural coupling. The CICT approach brings classical and quantum information theory together in a single framework by considering information not only on the statistical manifold of model states but also on the combinatorial manifold of low-level discrete, phased generators and empirical measures of noise sources, related to experimental high-level overall perturbation. As an example of a complex system (a hierarchical, heterogeneous multi-scale system) with important implications, we consider classical relativistic electrodynamics applied to biological system modeling (e.g. full-wave electromagnetic modeling of brain waves). The CICT approach can offer an effective and convenient "Science 2.0" universal framework to develop innovative applications and beyond, towards a more sustainable economy and wellbeing in a global competition scenario.

    Exploiting Numerical Features in Computer Tomography Data Processing and Management by CICT

    The impact of high-resolution Computer Tomography (HRCT) technology is to generate new challenges associated with the problems of formation, acquisition, compression, transmission, and analysis of enormous amounts of data. In the past, computational information conservation theory (CICT) has shown the potential to provide us with new techniques to face this challenge conveniently. CICT explores, at an elementary level, the basic properties and relationships of Q arithmetic to achieve full numeric algorithmic information conservation, with strong connections to modular group theory and combinatorial optimization. The traditional rational number system can be regarded as a highly sophisticated open logic: a powerful and flexible LTR and RTL formal language of languages, with self-defining consistent words and rules, starting from elementary generators and relations. CICT supplies us with optimized exponential cyclic sequences (OECS), which inherently capture the hidden symmetries and asymmetries of the hyperbolic space encoded by rational numbers. Symmetry and asymmetry relations can be seen as the operational manifestation of the universal categorical irreducible arithmetic dichotomy ("correspondence" and "incidence") at the innermost logical data structure level. These two components are inseparable from each other and in continuous reciprocal interaction. According to Pierre Curie, symmetry breaking has the following role: for the occurrence of a phenomenon in a medium, the original symmetry group of the medium must be lowered (broken, in today's terminology) to the symmetry group of the phenomenon (or to a subgroup of the phenomenon's symmetry group) by the action of some cause. In this sense symmetry breaking, or asymmetry, is what creates a phenomenon. The same dichotomy generates "pairing" and "fixed point" properties for digit groups of the same word length in word combinatorics. Correspondence and incidence manifest themselves even in the fundamental properties of single digits (i.e. "evenness" and "oddness"), down to the elementary binary symbols ("0", "1"). This new awareness can be exploited in the development of competitive optimized algorithms and applications. Practical examples will be presented.
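
    The OECS construction itself is not spelled out in the abstract; purely as an illustration of the cyclic structure hidden in rational-number expansions that it alludes to, the Python sketch below extracts the repeating block of a fraction by long division and shows the well-known cyclic behavior of the repetend of 1/7:

        def repetend(p, q):
            # Repeating block of p/q in base 10 via long division; the
            # remainder sequence is eventually periodic, so we stop at the
            # first repeated remainder.
            seen, digits, r = {}, [], p % q
            while r and r not in seen:
                seen[r] = len(digits)
                digits.append(r * 10 // q)
                r = r * 10 % q
            return digits[seen[r]:] if r else []

        block = repetend(1, 7)                   # 1/7 = 0.(142857)
        print(block)                             # [1, 4, 2, 8, 5, 7]
        n = int("".join(map(str, block)))
        print([n * k for k in range(1, 7)])      # each multiple is a rotation:
                                                 # 142857, 285714, ..., 857142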

    A Cybernetics Update for Competitive Deep Learning System

    A number of recent reports in the peer-reviewed literature have discussed the irreproducibility of results in biomedical research. Some of these articles suggest that the inability of independent research laboratories to replicate published results has a negative impact on the development of, and confidence in, the biomedical research enterprise. To get more resilient data and to achieve more reproducible results, we present an adaptive and learning system reference architecture for a smart learning system interface. For deeper inspiration, we focus our attention on mammalian brain neurophysiology. In fact, from a neurophysiological point of view, neuroscientist LeDoux found two preferential amygdala pathways in the brain of the laboratory mouse. The low road is a pathway which is able to transmit a signal from a stimulus to the thalamus, and then to the amygdala, which then activates a fast response in the body. The high road is activated simultaneously; this is a slower road which also includes the cortical parts of the brain, thus creating a conscious impression of what the stimulus is (to develop a rational mechanism of defense, for instance). To mimic this biological reality, our main idea is to use a new input node able to bind known information coherently to the unknown. Then, unknown "environmental noise" and/or local "signal input" information can be aggregated with known "system internal control status" information to provide a landscape of attractor points from which either a fast or a slower and deeper system response can be computed. In this way, ideal cybernetic system interaction levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and minimal information loss. The present paper is a relevant contribution to updating classic cybernetics towards a new General Theory of Systems, a post-Bertalanffy Systemics.
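
    The abstract stops short of an architecture specification; the following is a minimal numpy sketch, under assumed layer sizes, of the dual-pathway idea it describes: one input node binds the unknown stimulus to the known internal control status, a shallow path returns a fast coarse response, and a deeper path refines it:

        import numpy as np

        rng = np.random.default_rng(0)
        relu = lambda x: np.maximum(x, 0.0)

        class DualPathway:
            # "Low road": a shallow linear map for a fast coarse response.
            # "High road": a deeper nonlinear path that refines the answer.
            def __init__(self, n_in, n_state, n_out, n_hidden=32):
                n_bound = n_in + n_state  # width of the binding input node
                self.W_fast = rng.normal(0, 0.1, (n_bound, n_out))
                self.W1 = rng.normal(0, 0.1, (n_bound, n_hidden))
                self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))

            def forward(self, stimulus, internal_state):
                x = np.concatenate([stimulus, internal_state])  # bind inputs
                fast = x @ self.W_fast              # immediate response
                slow = relu(x @ self.W1) @ self.W2  # slower, deeper response
                return fast, fast + slow            # coarse and refined outputs

        net = DualPathway(n_in=8, n_state=4, n_out=2)
        fast, refined = net.forward(rng.normal(size=8), rng.normal(size=4))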

    From Quantum Sensing to SWEME Interaction Modeling

    Under the influence of super weak electromagnetic emission (SWEME), water changes its physical properties and becomes able to have the same effect on a biological object as the substance whose SWEME was used. To date, there is no understanding of the mechanisms of these phenomena, and there is no theoretical basis for the observed results. The core problem concerns the precision with which measurements on quantum resonant systems can be used to estimate quantities that appear as classical parameters in the theory, for example time, displacement, rotation, external fields, etc. The CICT EPG-IPG approach can minimize the veil opacity of traditional multi-scale statistical modeling. It brings classical and quantum information theory together in a single, effective pre-spatial geometro-arithmetic framework.

    Material Profile Influences in Bulk-Heterojunctions

    The morphology of mixed bulk-heterojunction films is compared using three different quantitative measurement techniques. We compare the vertical composition changes using high-angle annular dark-field scanning transmission electron microscopy with electron tomography, and neutron and x-ray reflectometry. The three measurement techniques yield qualitatively comparable vertical concentration measurements. The presence of a metal cathode during thermal annealing is observed to alter the fullerene concentration throughout the thickness of the film in all measurements. However, the absolute vertical concentration of fullerene is quantitatively different across the three measurements. The origin of these quantitative measurement differences is discussed.

    The Entropy Conundrum: A Solution Proposal

    In 2004, physicist Mark Newman, along with biologist Michael Lachmann and computer scientist Cristopher Moore, showed that if electromagnetic radiation is used as a transmission medium, the most information-efficient format for a given 1-D signal is indistinguishable from blackbody radiation. Since many natural processes maximize the Gibbs-Boltzmann entropy, they should give rise to spectra indistinguishable from optimally efficient transmission. In 2008, computer scientist C.S. Calude and physicist K. Svozil proved that "Quantum Randomness" is not Turing computable. In 2013, academic scientist R.A. Fiorini confirmed Newman, Lachmann and Moore's result, creating an analogous example for a 2-D signal (an image) as an application of CICT in pattern recognition and image analysis. Paradoxically, if you don't know the code used for the message, you can't tell the difference between an information-rich message and a random jumble of letters. This is the entropy conundrum to solve: even the most sophisticated instrumentation system is completely unable to reliably discriminate so-called "random noise" from any combinatorially optimized encoded message, which CICT called "deterministic noise". The fundamental concept of entropy crosses many scientific and research areas but, unfortunately, even across so many different disciplines, scientists have not yet worked out a definitive solution to the fundamental problem of the logical relationship between human experience and knowledge extraction. So both the classic concept of entropy and system random noise should be revisited deeply at the theoretical and operational levels. A convenient CICT solution proposal will be presented.
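
    A minimal, generic illustration of the conundrum (not of the CICT proposal): a first-order byte-entropy measure separates plain text from noise easily, yet scores a near-optimally compressed message and genuinely random bytes almost identically:

        import math, os, zlib
        from collections import Counter

        def bits_per_byte(data):
            # First-order Shannon entropy of the byte histogram.
            n = len(data)
            return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

        message = ("An information-rich message, compressed near the entropy "
                   "limit, looks like noise to a codebook-blind observer. ") * 200
        compressed = zlib.compress(message.encode(), level=9)
        noise = os.urandom(len(compressed))

        print(f"plain text : {bits_per_byte(message.encode()):.2f} bits/byte")
        print(f"compressed : {bits_per_byte(compressed):.2f} bits/byte")
        print(f"random     : {bits_per_byte(noise):.2f} bits/byte")
        # The last two values are nearly identical: without the codebook, the
        # optimally encoded message is statistically indistinguishable from noise.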