
    Divide and Measure: CFG Segmentation for the Measurement-Based Analysis of Resource Consumption

    A computer system is a good computer system if it correctly performs the task it was intended to perform. Yet this is not even half of the truth: non-functional requirements are abundant in the world of software and system engineering, even if they are not always stated explicitly. In our work we are concerned with the measurement-based analysis of resource consumption. Examples of resources are time, energy, or memory space. In the context of our measurement-based approach to software analysis, we face the problem of breaking the software under examination into smaller parts of manageable size, a process dubbed CFG Segmentation.
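
    The abstract does not describe the segmentation procedure itself; the following is a minimal sketch of one plausible way to cut a control-flow graph into segments of bounded size, assuming a simple dict-based CFG representation and a hypothetical size limit max_nodes. It illustrates the general idea only and is not the authors' algorithm.

        from collections import deque

        def segment_cfg(cfg, entry, max_nodes=10):
            """Greedily partition a CFG into segments of at most `max_nodes`
            basic blocks. `cfg` maps a block id to the list of its successor
            block ids; `entry` is the entry block. Illustrative sketch only."""
            segments, current, seen = [], [], set()
            worklist = deque([entry])
            while worklist:
                block = worklist.popleft()
                if block in seen:
                    continue
                seen.add(block)
                current.append(block)
                if len(current) == max_nodes:   # segment is full: cut here
                    segments.append(current)
                    current = []
                worklist.extend(cfg.get(block, []))
            if current:
                segments.append(current)
            return segments

        # Example: a small diamond-shaped CFG
        cfg = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
        print(segment_cfg(cfg, "A", max_nodes=2))   # [['A', 'B'], ['C', 'D']]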

    INFER: Interactive Timing Profiles based on Bayesian Networks

    We propose an approach to the timing analysis of software-based embedded computer systems that builds on the established probabilistic framework of Bayesian networks. We envision an approach in which we take (1) an abstract description of the control flow within a piece of software and (2) a set of run-time traces, which are combined into a Bayesian network that can be seen as an interactive timing profile. The obtained profile can be used by the embedded systems engineer not only to obtain a probabilistic estimate of the worst-case execution time (WCET), but also to run interactive timing simulations, or to automatically identify software configurations that are likely to evoke noteworthy timing behaviour, such as high variances of execution times, and which are therefore candidates for further inspection.
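
    As an illustration of what such a profile might contain, the sketch below derives per-path execution-time distributions from timed traces and queries them for a probabilistic WCET estimate. The trace format, the path abstraction, and the 95th-percentile query are assumptions made here for brevity; the actual INFER approach builds a full Bayesian network rather than this flat conditional-probability table.

        from collections import defaultdict

        def build_timing_profile(traces):
            """Estimate P(execution time | path) from timed traces. Each trace
            is a (path, time) pair, where `path` is a tuple of taken branches,
            a strong simplification of the paper's control-flow abstraction."""
            counts = defaultdict(lambda: defaultdict(int))
            for path, time in traces:
                counts[path][time] += 1
            profile = {}
            for path, hist in counts.items():
                total = sum(hist.values())
                profile[path] = {t: n / total for t, n in hist.items()}
            return profile

        def probabilistic_wcet(profile, quantile=0.95):
            """Smallest time bound t such that, for every observed path,
            P(time <= t | path) >= quantile."""
            bounds = []
            for dist in profile.values():
                acc = 0.0
                for t in sorted(dist):
                    acc += dist[t]
                    if acc >= quantile:
                        bounds.append(t)
                        break
            return max(bounds)

        traces = [(("b1",), 10), (("b1",), 12), (("b2",), 30), (("b2",), 31)]
        profile = build_timing_profile(traces)
        print(probabilistic_wcet(profile, quantile=0.95))   # 31 for this toy data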

    A Power-Aware Framework for Executing Streaming Programs on Networks-on-Chip

    Nilesh Karavadara, Simon Folie, Michael Zolda, Vu Thien Nga Nguyen, Raimund Kirner, 'A Power-Aware Framework for Executing Streaming Programs on Networks-on-Chip'. Paper presented at the Int'l Workshop on Performance, Power and Predictability of Many-Core Embedded Systems (3PMCES'14), Dresden, Germany, 24-28 March 2014.

    Software developers are discovering that practices which have successfully served single-core platforms for decades no longer work for multi-cores. Stream processing is a parallel execution model that is well suited for architectures with multiple computational elements connected by a network. We propose a power-aware streaming execution layer for network-on-chip architectures that addresses the energy constraints of embedded devices. Our proof-of-concept implementation targets the Intel SCC processor, which connects 48 cores via a network-on-chip. We motivate our design decisions and describe the status of our implementation.

    Die Flügelmalereien des Sterzinger Altarretabels im stilkritischen und lokalhistorischen Kontext

    This dissertation focuses on the eight panel paintings of the Sterzing altar, which is regarded as one of the best documented altarpieces of the 15th century. Research on the paintings has so far concentrated on the artist Hans Multscher and his Ulm workshop (Ulmer Werkstätte); Multscher was commissioned by the citizens of Sterzing to produce the altar. The first part examines the motif history and iconography of the four Passion scenes, in which strong borrowings from German painting are documented, whereas the four Marian scenes also draw on Netherlandish models. The dissertation further establishes a connection between the panel paintings and the Sterzing Passion plays, which are documented from the mid-15th century onwards. The section on stylistic criticism shows that the paintings of the Sterzing altar are shaped by the form of the winged altarpiece itself and by their function as a cult image on a high altar. It also examines how the production and sale of winged altarpieces influenced the character of the paintings. A concluding analysis of the artist known as the 'Sterzing Master' identifies influences of national constants and of his artistic origin, and finally confirms other works attributed to him.

    Calculating WCET Estimates from Timed Traces

    © The Author(s) 2015. This article is published with open access at Springerlink.com.

    Real-time systems engineers face a daunting duty: they must ensure that each task in their system can always meet its deadline. To analyse schedulability they must know the worst-case execution time (WCET) of each task. However, determining exact WCETs is practically infeasible in cost-constrained industrial settings involving real-life code and COTS hardware. Static analysis tools that could yield sufficiently tight WCET bounds are often unavailable. As a result, interest in portable analysis approaches like measurement-based timing analysis (MBTA) is growing. We present an approach based on integer linear programming (ILP) for calculating a WCET estimate from a given database of timed execution traces. Unlike previous work, our method specifically aims at reducing overestimation by means of an automatic classification of code executions into scenarios with differing worst-case behaviour. To ease integration into existing analysis tool chains, our method is based on the implicit path enumeration technique (IPET). It can thus reuse flow facts from other analysis tools and produces ILP problems that can be solved by off-the-shelf solvers.
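
    The abstract refers to the implicit path enumeration technique; the sketch below shows a classical IPET-style ILP of the kind such a method produces, with per-block execution-count variables, worst observed per-block costs, and structural flow constraints, emitted in LP format for an off-the-shelf solver. The example CFG, the cost values, and the loop bound are hypothetical, and the paper's scenario classification is not modelled here.

        def ipet_lp(block_costs, flow_constraints):
            """Emit a classical IPET ILP in CPLEX LP format: maximise
            sum(c_i * x_i) subject to linear flow constraints.
            `block_costs` maps block name to worst observed cost (cycles);
            `flow_constraints` is a list of constraints over the x_ variables."""
            objective = " + ".join(f"{c} x_{b}" for b, c in block_costs.items())
            lines = ["Maximize", f" obj: {objective}", "Subject To"]
            lines += [f" c{i}: {con}" for i, con in enumerate(flow_constraints)]
            lines += ["General"] + [f" x_{b}" for b in block_costs] + ["End"]
            return "\n".join(lines)

        # Toy CFG: entry A, a loop body B bounded to 10 iterations, exit C.
        costs = {"A": 5, "B": 12, "C": 3}
        constraints = [
            "x_A = 1",            # the task is entered exactly once
            "x_C = 1",            # ... and leaves exactly once
            "x_B - 10 x_A <= 0",  # loop bound: B runs at most 10 times per entry
        ]
        print(ipet_lp(costs, constraints))  # feed to an off-the-shelf ILP solver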

    Dynamic Power Management for Reactive Stream Processing on the SCC Tiled Architecture

    This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

    Dynamic voltage and frequency scaling (DVFS) is a means to adjust the computing capacity and power consumption of computing systems to the application demands. DVFS is generally useful to provide a compromise between computing demands and power consumption, especially in resource-constrained computing systems. Many modern processors support some form of DVFS. In this article we focus on the development of an execution framework that provides light-weight DVFS support for reactive stream-processing systems (RSPS). RSPS are a common form of embedded control systems, operating in direct response to inputs from their environment. Within the execution framework we focus on many-core scheduling for the parallel execution of concurrent programs. We provide a DVFS strategy for RSPS that is simple and lightweight and can be used for dynamic adaptation of the power consumption at runtime. The simplicity of the DVFS strategy is made possible by the sole focus on the application domain of RSPS. The presented DVFS strategy does not require specific assumptions about the message arrival rate or the underlying scheduling method. While DVFS is a very active field, in contrast to most existing research our approach also works for platforms like many-core processors, where the power settings typically cannot be controlled individually for each computational unit. We also support dynamic scheduling with variable workload. While many research results are obtained with simulators, we present a parallel execution framework with experiments conducted on real hardware, using the SCC many-core processor. The results of our experimental evaluation confirm that our simple DVFS strategy provides potential for significant energy savings on RSPS.
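
    For illustration only, the sketch below shows one simple queue-driven DVFS policy of the kind hinted at above: raise the chip-wide frequency level when input queues fill up, lower it when they drain. The threshold values, the frequency table, and the set_frequency hook are hypothetical placeholders, not the framework's actual interface.

        # Hypothetical frequency levels (MHz) selectable chip-wide, e.g. on an
        # SCC-like platform where per-core scaling is not available.
        FREQ_LEVELS = [100, 200, 400, 800]

        def choose_level(queue_fill, current, high=0.75, low=0.25):
            """Raise the DVFS level when the input queue is getting full,
            lower it when the queue is nearly empty, otherwise keep it."""
            if queue_fill > high and current < len(FREQ_LEVELS) - 1:
                return current + 1
            if queue_fill < low and current > 0:
                return current - 1
            return current

        def control_loop(samples, set_frequency, level=0):
            """`samples` yields observed queue fill ratios (0.0 .. 1.0);
            `set_frequency` is a placeholder for the platform-specific call."""
            for fill in samples:
                new_level = choose_level(fill, level)
                if new_level != level:
                    level = new_level
                    set_frequency(FREQ_LEVELS[level])
            return level

        # Example run with a synthetic load profile.
        control_loop([0.1, 0.8, 0.9, 0.9, 0.2, 0.1],
                     set_frequency=lambda mhz: print("set", mhz, "MHz"))

    The policy deliberately needs no knowledge of message arrival rates or the scheduler, mirroring the abstract's claim that the strategy makes no such assumptions.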

    SOLUS: An innovative multimodal imaging system to improve breast cancer diagnosis through diffuse optics and ultrasounds

    To improve non-invasively the specificity of breast cancer diagnosis after a positive screening mammography or a doubtful/suspicious ultrasound examination, the SOLUS project developed a multimodal imaging system that combines: B-mode ultrasound (US) scans (to assess morphology), Color Doppler (to visualize vascularization), shear-wave elastography (to measure stiffness), and time-domain multi-wavelength diffuse optical tomography (to estimate tissue composition in terms of oxy- and deoxy-hemoglobin, lipid, water, and collagen concentrations). The multimodal probe arranges 8 innovative photonic modules (optodes) around the US transducer, providing the capability for optical tomographic reconstruction. For a more accurate estimate of lesion composition, US-assessed morphological priors can be used to guide the optical reconstructions. Each optode comprises: i) 8 picosecond pulsed laser diodes with different wavelengths, covering a wide spectral range (635-1064 nm) for good probing of the different tissue constituents; ii) a large-area (variable, up to 8.6 mm²) fast-gated digital Silicon Photomultiplier; iii) the acquisition electronics to record the distribution of time-of-flight of the re-emitted photons. The optode is the basic element of the optical part of the system, but it is also a stand-alone, ultra-compact (about 4 cm³) device for time-domain multi-wavelength diffuse optics, with potential applications in various fields.
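
    As a rough illustration of how multi-wavelength optical measurements can yield tissue composition, the sketch below solves a linear spectral-unmixing problem: given absorption coefficients at several wavelengths and the specific absorption spectra of the constituents, it recovers concentrations by least squares. All numerical values are arbitrary placeholders rather than measured spectra, and the actual SOLUS reconstruction (time-domain tomography guided by US priors) is considerably more involved.

        import numpy as np

        def unmix(mu_a, spectra):
            """Least-squares estimate of constituent concentrations from
            absorption coefficients measured at several wavelengths.
            mu_a:    (n_wavelengths,) measured absorption
            spectra: (n_wavelengths, n_constituents) specific absorption of
                     each constituent (e.g. HbO2, Hb, lipid, water, collagen)."""
            conc, *_ = np.linalg.lstsq(spectra, mu_a, rcond=None)
            return conc

        # Placeholder numbers for a 4-wavelength, 2-constituent toy problem.
        spectra = np.array([[0.10, 0.30],
                            [0.20, 0.25],
                            [0.35, 0.15],
                            [0.50, 0.05]])
        true_conc = np.array([0.8, 1.5])
        mu_a = spectra @ true_conc
        print(unmix(mu_a, spectra))   # recovers approximately [0.8, 1.5]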

    SOLUS: a novel multimodal approach to ultrasound and diffuse optics imaging of breast cancer

    A multimodal instrument for breast imaging was developed, combining ultrasound (morphology), shear-wave elastography (stiffness), and time-domain multi-wavelength diffuse optical tomography (blood, water, lipid, collagen) to improve the non-invasive diagnosis of breast cancer.

    The WCET Tool Challenge 2011

    Following the successful WCET Tool Challenges in 2006 and 2008, the third event in this series was organized in 2011, again with support from the ARTIST DESIGN Network of Excellence. Following the practice established in the previous Challenges, the WCET Tool Challenge 2011 (WCC'11) defined two kinds of problems to be solved by the Challenge participants with their tools: WCET problems, which ask for bounds on the execution time, and flow-analysis problems, which ask for bounds on the number of times certain parts of the code can be executed. The benchmarks used in WCC'11 were debie1, PapaBench, and an industrial-strength application from the automotive domain provided by Daimler AG. Two default execution platforms were suggested to the participants, the ARM7 as "simple target" and the MPC5553/5554 as "complex target", but participants were free to use other platforms as well. Ten tools participated in WCC'11: aiT, Astrée, Bound-T, FORTAS, METAMOC, OTAWA, SWEET, TimeWeaver, TuBound, and WCA.

    Strategic research agenda for biomedical imaging

    This Strategic Research Agenda identifies current challenges and needs in healthcare, illustrates how biomedical imaging and derived data can help to address these, and aims to stimulate dedicated research funding efforts. Medicine is currently moving towards a more tailored, patient-centric approach by providing personalised solutions for the individual patient. Innovation in biomedical imaging plays a key role in this process as it addresses the current needs for individualised prevention, treatment, therapy response monitoring, and image-guided surgery. The use of non-invasive biomarkers facilitates better therapy prediction and monitoring, leading to improved patient outcomes. Innovative diagnostic imaging technologies provide information about disease characteristics which, coupled with biological, genetic and -omics data, will contribute to an individualised diagnosis and therapy approach. In the emerging field of theranostics, imaging tools together with therapeutic agents enable the selection of the best treatments and allow tailored therapeutic interventions. For prenatal monitoring, the use of innovative imaging technologies can ensure early detection of malfunctions or disease. The application of biomedical imaging for the diagnosis and management of lifestyle-induced diseases will help to avoid disease development through lifestyle changes. Artificial intelligence and machine learning in imaging will facilitate the improvement of image interpretation and lead to better disease prediction and therapy planning. As biomedical imaging technologies and the analysis of existing imaging data provide solutions to current challenges and needs in healthcare, appropriate funding for dedicated research is needed to implement these innovative approaches for the wellbeing of citizens and patients.