    Holistic debugging - enabling instruction set simulation for software quality assurance

    We present holistic debugging, a novel method for observing the execution of complex and distributed software. It builds on an instruction set simulator, which provides reproducible experiments and non-intrusive probing of state in a distributed system. Instruction set simulators, however, only provide low-level information, so a holistic debugger contains a translation framework that lifts this information to the higher abstraction levels required by observation tools such as source-code debuggers. We have created Nornir, a proof-of-concept holistic debugger, built on the simulator Simics. For each observed process in the simulated system, Nornir creates an abstraction translation stack, with virtual machine translators that map machine-level storage contents (e.g. physical memory, registers) provided by Simics to application-level data (e.g. virtual memory contents) by parsing the data structures of operating systems and virtual machines. Nornir includes a modified version of the GNU debugger (GDB), which supports non-intrusive symbolic debugging of distributed applications. Nornir's main interface is a debugger shepherd, a programmable interface that controls multiple debuggers and allows users to coherently inspect the entire state of heterogeneous, distributed applications. It provides a robust observation platform for the construction of new observation tools.
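    The abstraction translation stack can be pictured as a chain of translators, each lifting simulator state one level. The following Python sketch is purely illustrative: the class and method names are invented for this example and are not Nornir's or Simics' actual API.

        # Hypothetical sketch of an abstraction translation stack in the spirit
        # of holistic debugging. Names are illustrative, not Nornir's real API.

        PAGE_SIZE = 4096

        class PhysicalMemory:
            """Stand-in for machine-level state exposed by an instruction set simulator."""
            def __init__(self):
                self.mem = {}  # physical address -> byte value

            def read(self, paddr, n):
                return bytes(self.mem.get(paddr + i, 0) for i in range(n))

        class VirtualMemoryTranslator:
            """Maps a process's virtual addresses to physical ones via a (toy)
            page table, as if parsed from the guest OS's data structures."""
            def __init__(self, phys, page_table):
                self.phys = phys
                self.page_table = page_table  # virtual page number -> physical frame

            def read(self, vaddr, n):
                out = bytearray()
                while n > 0:
                    vpn, off = divmod(vaddr, PAGE_SIZE)
                    frame = self.page_table[vpn]  # raises KeyError if unmapped
                    chunk = min(n, PAGE_SIZE - off)
                    out += self.phys.read(frame * PAGE_SIZE + off, chunk)
                    vaddr += chunk
                    n -= chunk
                return bytes(out)

        # A symbolic debugger would sit on top, resolving variable names to
        # virtual addresses via debug info and reading through the translator:
        phys = PhysicalMemory()
        phys.mem.update({0x2000 + i: b for i, b in enumerate(b"hello")})
        vm = VirtualMemoryTranslator(phys, page_table={0x40: 0x2})
        print(vm.read(0x40 * PAGE_SIZE, 5))  # b'hello'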

    Fuzzy investment decision support for brownfield redevelopment

    This dissertation focuses on decision making, investing, and brownfield redevelopment, in particular on the analysis, evaluation, and selection of previously used real estate suitable for commercial redevelopment (revitalization). The objective is to design a universal method that facilitates the decision-making process, which in practice is complicated by the many possible alternatives and the large number of relevant parameters influencing the final decision. The proposed method is based on fuzzy logic, modeling, statistical analysis, cluster analysis, graph theory, and sophisticated methods of information collection and processing. The new method allows decision makers to process a much larger amount of information and to evaluate possible investment alternatives more efficiently; as a result, the set of the most suitable alternative investments is narrowed down according to a hierarchy of parameters specified by the investor.
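    The abstract does not give the model's formulas. As a rough illustration of how fuzzy evaluation of investment alternatives can work in general, here is a minimal Python sketch using triangular membership functions and a weighted aggregate; the criteria, weights, and membership parameters are invented for this example and do not reflect the dissertation's actual model.

        # Minimal, hypothetical sketch of fuzzy multi-criteria scoring of
        # brownfield redevelopment alternatives. Criteria and weights are
        # invented for illustration.

        def triangular(x, a, b, c):
            """Triangular fuzzy membership: 1 at the peak b, 0 outside [a, c]."""
            if x == b:
                return 1.0
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        # Degree to which each raw criterion value counts as "attractive".
        criteria = {
            # name: (membership params, weight from the investor's hierarchy)
            "site_access":   ((0, 10, 10), 0.4),  # score 0..10, higher is better
            "contamination": ((0, 0, 10),  0.3),  # score 0..10, lower is better
            "market_demand": ((0, 10, 10), 0.3),
        }

        def fuzzy_score(alternative):
            """Weighted aggregate of membership degrees for one alternative."""
            return sum(w * triangular(alternative[name], a, b, c)
                       for name, ((a, b, c), w) in criteria.items())

        sites = {
            "Site A": {"site_access": 8, "contamination": 3, "market_demand": 6},
            "Site B": {"site_access": 5, "contamination": 7, "market_demand": 9},
        }
        # Ranking by score narrows the set of candidate investments.
        for name, attrs in sorted(sites.items(), key=lambda kv: -fuzzy_score(kv[1])):
            print(name, round(fuzzy_score(attrs), 2))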

    N=4 Super-Yang-Mills Theory, QCD and Collider Physics

    We review how (dimensionally regulated) scattering amplitudes in N=4 super-Yang-Mills theory provide a useful testing ground for perturbative QCD calculations relevant to collider physics, as well as another avenue for investigating the AdS/CFT correspondence. We describe the iterative relation for two-loop scattering amplitudes in N=4 super-Yang-Mills theory found in C. Anastasiou et al., Phys. Rev. Lett. 91:251602 (2003), and discuss recent progress toward extending it to three loops.
    Comment: 17 pages, 9 figures. Talk presented by LD at Strings 2004
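    The iterative relation in question is the ABDK relation of Anastasiou, Bern, Dixon, and Kosower. Writing M_n^{(L)} for the L-loop n-point amplitude divided by the tree amplitude, it is usually quoted as follows (reproduced from the literature for context, not taken from this abstract):

        \[
          M_n^{(2)}(\epsilon) = \frac{1}{2}\left[M_n^{(1)}(\epsilon)\right]^2
            + f^{(2)}(\epsilon)\, M_n^{(1)}(2\epsilon) + C^{(2)} + \mathcal{O}(\epsilon),
        \]
        \[
          f^{(2)}(\epsilon) = -\zeta_2 - \zeta_3\,\epsilon - \zeta_4\,\epsilon^2,
          \qquad
          C^{(2)} = -\tfrac{1}{2}\zeta_2^2.
        \]

    The progress the talk reports on concerns extending this iterative structure to three loops.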

    Efficient transfer entropy analysis of non-stationary neural time series

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. The measure of information transfer in particular, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires observing multiple realizations of these processes to estimate the associated probability density functions. To obtain these observations, available estimators assume stationarity of the processes, allowing observations to be pooled over time. This assumption, however, is a major obstacle to applying these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues showed theoretically that the stationarity assumption can be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble is often readily available in neuroscience experiments in the form of experimental trials. In this work we therefore combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that deals with the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation on a graphics processing unit to handle the most computationally demanding aspects of the ensemble method. We test the performance and robustness of our implementation on data from simulated stochastic processes and demonstrate the method's applicability to magnetoencephalographic data. While we mainly evaluate the proposed method on neuroscientific data, we expect it to be applicable in a variety of fields concerned with the analysis of information transfer in complex biological, social, and artificial systems.
    Comment: 27 pages, 7 figures, submitted to PLOS ONE
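    For context, transfer entropy from a source process X to a target process Y is commonly defined following Schreiber (the standard definition, not quoted from this paper):

        \[
          TE_{X \to Y} = \sum p\left(y_{t+1}, \mathbf{y}_t^{(k)}, \mathbf{x}_t^{(l)}\right)
            \log \frac{p\left(y_{t+1} \mid \mathbf{y}_t^{(k)}, \mathbf{x}_t^{(l)}\right)}
                      {p\left(y_{t+1} \mid \mathbf{y}_t^{(k)}\right)},
        \]

    where \(\mathbf{y}_t^{(k)}\) and \(\mathbf{x}_t^{(l)}\) are length-k and length-l embedding vectors of each process's past. The ensemble method estimates the required densities by pooling observations across trials at a fixed time point rather than across time within a single realization, which is what removes the stationarity requirement.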

    PRISE: An Integrated Platform for Research and Teaching of Critical Embedded Systems

    In this paper, we present PRISE, an integrated workbench for research and teaching of critical embedded systems at ISAE, the French Institute for Space and Aeronautics Engineering. PRISE is built around state-of-the-art technologies for the engineering of systems in the space and avionics domains. It aims at demonstrating key aspects of critical, real-time, embedded systems used in the transport industry, but also at validating new scientific contributions for the engineering of software functions. PRISE combines embedded and simulation platforms with modeling tools, and is available for both research and teaching. Being built around widely used commercial and open-source software, PRISE aims to be a reference platform for our teaching and research activities at ISAE.

    QED challenges at FCC-ee precision measurements

    The expected experimental precision of the rates and asymmetries at the Future Circular Collider with electron-positron beams (FCC-ee), considered for construction at CERN, in the centre-of-mass energy range 88-365 GeV, will be better by a factor of 5-200 compared to the LEP era. This will be thanks to a very high luminosity, up to a factor of 10^5 higher than in the past LEP experiments. This poses the extraordinary challenge of improving the precision of the Standard Model predictions by a comparable factor. In particular, the perturbative calculations of the trivial QED effects, which have to be removed from the experimental data, are considered a major challenge for almost all quantities to be measured at FCC-ee. The task of this paper is to summarize the "state of the art" in this class of calculations left from the LEP era, and to examine what needs to be done to match the precision of the FCC-ee experiments -- what kind of technical advancements are necessary. The above analysis will be done for the most important observables of FCC-ee, such as the total cross sections near the Z and WW thresholds, charge asymmetries, the invisible width of the Z boson, the spin asymmetry from τ lepton decay, and the luminosity measurement.
    Comment: Corrected author's name in ref. [106]
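    A canonical example of the QED effects to be removed from the data is initial-state radiation, conventionally treated by convoluting the underlying hard cross section with a QED radiator (structure) function; in the standard textbook form (given here for context, not quoted from this paper):

        \[
          \sigma_{\mathrm{obs}}(s) = \int_0^1 \mathrm{d}z\; \rho(z; s)\, \hat\sigma(z s),
        \]

    where z is the fraction of the squared centre-of-mass energy remaining after photon emission. The precision of \(\rho(z; s)\), including its exponentiated higher-order corrections, is among the ingredients that must improve to match FCC-ee statistics.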