
    Timing Analysis of Embedded Software Updates

    We present RETA (Relative Timing Analysis), a differential timing analysis technique to verify the impact of an update on the execution time of embedded software. Timing analysis is computationally expensive and labor-intensive. Software updates render repeating the analysis from scratch a waste of resources and time, because their impact is inherently confined. To determine this boundary, RETA applies a slicing procedure that identifies all relevant code segments and a statement categorization that determines how to analyze each such line of code. We adapt a subset of RETA for integration into aiT, an industrial timing analysis tool, and also develop a complete implementation in a tool called DELTA. Based on staple benchmarks and realistic code updates from official repositories, we test the accuracy by analyzing the worst-case execution time (WCET) before and after an update, comparing the results against both the unmodified aiT and real executions on embedded hardware. DELTA returns WCET information that ranges from exactly the WCET measured on real hardware to 148% of the new version's measured WCET. With the same benchmarks, the unmodified aiT estimates are 112% and 149% of the actual executions; therefore, even when DELTA is pessimistic, an industry-strength tool such as aiT cannot do better. Crucially, we also show that RETA decreases aiT's analysis time by 45% and its memory consumption by 8.9%, whereas removing RETA from DELTA, effectively rendering it a regular timing analysis tool, increases its analysis time by 27%.
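    As a rough illustration of the idea behind differential timing analysis only (not the RETA/DELTA implementation, whose slicing and categorization rules are defined in the paper), the Python sketch below re-analyzes just the basic blocks an update can affect and reuses cached bounds elsewhere; all names and the simplified aggregation at the end are assumptions.

    # Hypothetical sketch of differential WCET analysis: only basic blocks
    # reachable from changed statements are re-analyzed; untouched blocks
    # reuse the WCET bound computed for the previous software version.

    def diff_wcet(old_cfg, new_cfg, cached_wcet, analyze_block):
        """old_cfg / new_cfg: block id -> (statements, successor ids).
        cached_wcet: per-block WCET bounds from the previous analysis run.
        analyze_block: expensive low-level timing analysis for one block."""
        changed = {b for b in new_cfg
                   if b not in old_cfg or new_cfg[b][0] != old_cfg[b][0]}

        # Slice: blocks forward-reachable from a change may execute under a
        # different hardware state, so they are re-analyzed as well.
        affected, stack = set(changed), list(changed)
        while stack:
            for succ in new_cfg[stack.pop()][1]:
                if succ not in affected:
                    affected.add(succ)
                    stack.append(succ)

        wcet = {}
        for block, (stmts, _succs) in new_cfg.items():
            wcet[block] = (analyze_block(stmts) if block in affected
                           else cached_wcet[block])
        # Real tools combine per-block bounds via path analysis (e.g. IPET);
        # summing over all blocks is a deliberate simplification here.
        return sum(wcet.values())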

    Scheduling Dynamic Software Updates in Mobile Robots

    We present NeRTA (Next Release Time Analysis), a technique to enable dynamic software updates for low-level control software of mobile robots. Dynamic software updates enable software correction and evolution during system operation. In mobile robotics, they are crucial to resolve software defects without interrupting system operation or to enable on-the-fly extensions. Low-level control software for mobile robots, however, is time-sensitive and runs on resource-constrained hardware with no operating system support. To minimize the impact of the update process, NeRTA safely schedules updates during times when the computing unit would otherwise be idle. It does so by utilizing information from the existing scheduling algorithm without impacting its operation. As such, NeRTA works orthogonally to the existing scheduler, retains the existing platform-specific optimizations and fine-tuning, and may simply operate as a plug-in component. To enable larger dynamic updates, we further conceive an additional mechanism called bounded reactive control and apply mixed-criticality concepts. The former cautiously reduces the overall control frequency, whereas the latter excludes less critical tasks from NeRTA processing. Their use increases the available idle times. We combine real-world experiments on embedded hardware with simulations to evaluate NeRTA. Our experimental evaluation shows that the difference between NeRTA's estimated idle times and the measured idle times is less than 15% in more than three-quarters of the samples. The combined effect of bounded reactive control and mixed-criticality concepts results in a 150+% increase in available idle times. We also show that the processing overhead of NeRTA and of the additional mechanisms is essentially negligible.
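    As a minimal sketch of the underlying scheduling idea only (not NeRTA itself), the check below admits an update when it fits inside the gap between the current time and the earliest next task release; the function name, parameters, and safety margin are hypothetical.

    # Hypothetical idle-window check: an update may run only if it completes
    # before the earliest next release of any control task.

    def can_apply_update(now, next_release_times, update_duration, margin=0.0):
        """next_release_times: next activation instant of each periodic task;
        margin: slack absorbing estimation error in the idle-time bound."""
        idle_window = min(next_release_times) - now
        return update_duration + margin <= idle_window

    # Example: tasks release again at t=12 ms and t=20 ms, it is now t=4 ms,
    # so an update needing 7.5 ms fits with 0.5 ms of slack to spare.
    print(can_apply_update(4.0, [12.0, 20.0], 7.5, margin=0.5))  # True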

    Dynamical-Friction Galaxy-Gas Coupling and Cluster Cooling Flows

    We revisit the notion that galaxy motions can efficiently heat intergalactic gas in the central regions of clusters through dynamical friction. For plausible values of the galaxy mass-to-light ratio, the heating rate is comparable to the cooling rate due to X-ray emission. Heating occurs only for supersonic galaxy motions, so the mechanism is self-regulating: it becomes efficient only when the gas sound speed is smaller than the galaxy velocity dispersion. We illustrate with the Perseus cluster, assuming a stellar mass-to-light ratio for galaxies in the very central region, with the dark-matter contribution becoming comparable to this at some radius $r_s$. For $r_s \la 400\,{\rm kpc} \sim 3\,r_{\rm cool}$--corresponding to an average mass-to-light ratio of $\sim 10$ inside that radius--the dynamical-friction coupling is strong enough to provide the required rate of gas heating. The measured sound speed is smaller than the galaxy velocity dispersion, as required by this mechanism. With this smaller gas temperature and the observed distribution of galaxies and gas, the energy reservoir in galactic motions is sufficient to sustain the required heating rate for the lifetime of the cluster. The galaxies also lose a smaller amount of energy through dynamical friction to the dark matter, implying that non--cooling-flow clusters should have flat-cored dark-matter density distributions. Comment: Six pages, 4 figures, Monthly Notices style.
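    For orientation only, the self-regulation argument can be summarized by setting a Chandrasekhar-type dynamical-friction heating rate against optically thin X-ray cooling; the expressions below are schematic and are not taken from the paper.

    % Schematic balance: dynamical-friction heating by galaxies of mass M
    % moving supersonically (v_gal > c_s) through gas of density rho_gas,
    % versus optically thin X-ray cooling.
    \[
      \mathcal{H} \sim n_{\rm gal}\,
        \frac{4\pi G^{2} M^{2} \rho_{\rm gas}}{v_{\rm gal}}\,\ln\Lambda ,
      \qquad
      \mathcal{C} \sim n_e^{2}\,\Lambda(T) .
    \]
    % Heating operates only while c_s < sigma_gal, which is what makes the
    % mechanism self-regulating.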

    2-D Magnetohydrodynamic Simulations of Induced Plasma Dynamics in the Near-Core Region of a Galaxy Cluster

    We present results from numerical simulations of the cooling-core cluster A2199 produced by the two-dimensional (2-D) resistive magnetohydrodynamics (MHD) code MACH2. In our simulations we explore the effect of anisotropic thermal conduction on the energy balance of the system. The results from idealized cases in 2-D axisymmetric geometry underscore the importance of the initial plasma density in ICM simulations, especially the near-core values, since the radiation cooling rate is proportional to $n_e^2$. Heat conduction is found to be non-effective in preventing catastrophic cooling in this cluster. In addition, we performed 2-D planar MHD simulations starting from initial conditions deliberately violating both thermal balance and hydrostatic equilibrium in the ICM, to assess contributions of the convective terms in the energy balance of the system against anisotropic thermal conduction. We find that in this case work done by the pressure on the plasma can dominate the early evolution of the internal energy over anisotropic thermal conduction in the presence of subsonic flows, thereby reducing the impact of the magnetic field. Deviations from hydrostatic equilibrium near the cluster core may be associated with transient activity of a central active galactic nucleus and/or remnant dynamical activity in the ICM and warrant further study in three dimensions. Comment: 16 pages, 13 figures, accepted for publication in MNRAS.
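    To make the density dependence explicit (a generic estimate, not a formula quoted from the paper), the optically thin cooling rate and the cooling time it implies scale as follows.

    % Optically thin radiative losses scale as the square of the electron
    % density, so the near-core density largely sets the cooling time.
    \[
      \mathcal{L}_{\rm cool} = n_e^{2}\,\Lambda(T),
      \qquad
      t_{\rm cool} \sim \frac{3\,n\,k_B T}{2\,n_e^{2}\,\Lambda(T)} .
    \]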

    Targeting the latent cytomegalovirus reservoir with an antiviral fusion toxin protein.

    Reactivation of human cytomegalovirus (HCMV) in transplant recipients can cause life-threatening disease. Consequently, for transplant recipients, killing latently infected cells could have far-reaching clinical benefits. In vivo, myeloid cells and their progenitors are an important site of HCMV latency, and one viral gene expressed by latently infected myeloid cells is US28. This viral gene encodes a cell-surface G protein-coupled receptor (GPCR) that binds chemokines, triggering its endocytosis. We show that the expression of US28 on the surface of latently infected cells allows monocytes and their progenitor CD34+ cells to be targeted and killed by F49A-FTP, a highly specific fusion toxin protein that binds this viral GPCR. As expected, this specific targeting of latently infected cells by F49A-FTP also robustly reduces virus reactivation in vitro. Consequently, such specific fusion toxin proteins could form the basis of a therapeutic strategy for eliminating latently infected cells before haematopoietic stem cell transplantation.

    The slope of the mass profile and the tilt of the fundamental plane in early-type galaxies

    We present a survey, using the Chandra X-ray Observatory, of the central gravitating mass profiles in a sample of 10 galaxies, groups and clusters, spanning ~2 orders of magnitude in virial mass. We find the total mass distributions from ~0.2--10 Re, where Re is the optical effective radius of the central galaxy, are remarkably similar to power-law density profiles. The negative logarithmic slope of the mass density profiles, alpha, varies systematically with Re, from alpha=2 for systems with Re~4 kpc to alpha=1.2 for systems with Re>30 kpc. Departures from hydrostatic equilibrium are likely to be small and cannot easily explain this trend. We show that the conspiracy between the baryonic (Sersic) and dark matter (NFW/Einasto) components required to maintain a power-law total mass distribution naturally predicts an anti-correlation between alpha and Re that is very close to what is observed. The systematic variation of alpha with Re implies a dark matter fraction within Re that varies systematically with the properties of the galaxy in such a manner as to reproduce, without fine-tuning, the observed tilt of the fundamental plane. We speculate that establishing a nearly power-law total mass distribution is therefore a fundamental feature of galaxy formation and the primary factor which determines the tilt of the fundamental plane. Comment: 10 pages, 5 figures, 2 tables. Accepted for publication in MNRAS. Minor revisions to match accepted version.
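    For reference, a power-law total mass density with logarithmic slope alpha (the quantity whose trend with Re the paper reports) implies the enclosed mass below; the normalization is schematic.

    % Power-law total density profile and the enclosed mass it implies
    % (valid for alpha < 3):
    \[
      \rho_{\rm tot}(r) = \rho_0 \left(\frac{r}{R_e}\right)^{-\alpha},
      \qquad
      M(<r) = \frac{4\pi \rho_0 R_e^{\alpha}}{3-\alpha}\, r^{\,3-\alpha} .
    \]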

    Galaxy Motions, Turbulence and Conduction in Clusters of Galaxies

    Unopposed radiative cooling in clusters of galaxies results in excessive mass deposition rates. However, the cool cores of galaxy clusters are continuously heated by thermal conduction and by turbulent heat diffusion driven by minor mergers or by the galaxies orbiting the cluster center. These processes can either reduce the energy requirements for AGN heating of cool cores, or they can prevent overcooling altogether. We perform 3D MHD simulations including field-aligned thermal conduction and self-gravitating particles to model this in detail. Turbulence is not confined to the wakes of galaxies but is instead volume-filling, due to the excitation of large-scale g-modes. We systematically probe the parameter space of galaxy masses and numbers. For a wide range of observationally motivated galaxy parameters, the magnetic field is randomized by stirring motions, restoring the conductive heat flow to the core. The cooling catastrophe either does not occur or is sufficiently delayed to allow the cluster to experience a major merger that could reset conditions in the intracluster medium. Whilst dissipation of turbulent motions is negligible as a heat source, turbulent heat diffusion is extremely important; it predominates in the cluster center. However, thermal conduction becomes important at larger radii, and simulations without thermal conduction suffer a cooling catastrophe. Conduction is important both as a heat source and as a means of reducing the stabilizing buoyancy forces, enabling more efficient diffusion. Turbulence enables conduction, and conduction enables turbulence. In these simulations, the gas vorticity--which is a good indicator of trapped g-modes--increases with time. The vorticity growth is approximately mirrored by the growth of the magnetic field, which is amplified by turbulence. Comment: Submitted to MNRAS.
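    For reference, field-aligned (anisotropic) thermal conduction of the kind included in these simulations transports heat only along the local magnetic field direction; a standard schematic form of the heat flux is shown below (Spitzer parallel conductivity assumed, notation not taken from the paper).

    % Heat flows only along the unit vector \hat{b} = \vec{B}/|\vec{B}|, so a
    % tangled field suppresses conduction until turbulence re-randomizes it.
    \[
      \vec{q} = -\kappa_{\parallel}\,\hat{b}\,(\hat{b}\cdot\nabla T),
      \qquad
      \kappa_{\parallel} \approx \kappa_{\rm Sp} \propto T^{5/2} .
    \]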