50 research outputs found
Sprobes: Enforcing Kernel Code Integrity on the TrustZone Architecture
Many smartphones now deploy conventional operating systems, so the rootkit
attacks so prevalent on desktop and server systems are now a threat to
smartphones. While researchers have advocated using virtualization to detect
and prevent attacks on operating systems (e.g., VM introspection and trusted
virtual domains), virtualization is not practical on smartphone systems due to
the lack of virtualization support and/or the expense of virtualization.
Current smartphone processors do have hardware support for running a protected
environment, such as the ARM TrustZone extensions, but such hardware does not
control the operating system operations sufficiently to enable VM
introspection. In particular, a conventional operating system running with
TrustZone still retains full control of memory management, which a rootkit can
use to prevent traps on sensitive instructions or memory accesses necessary for
effective introspection. In this paper, we present SPROBES, a novel primitive
that enables introspection of operating systems running on ARM TrustZone
hardware. Using SPROBES, an introspection mechanism protected by TrustZone can
instrument individual operating system instructions of its choice, receiving an
unforgeable trap whenever any SPROBE is executed. The key challenge in
designing SPROBES is preventing the rootkit from removing them, but we identify
a set of five invariants whose enforcement is sufficient to restrict rootkits
to execute only approved, SPROBE-injected kernel code. We implemented a
proof-of-concept version of SPROBES for the ARM Fast Models emulator,
demonstrating that in Linux kernel 2.6.38, only 12 SPROBES are sufficient to
enforce all five of these invariants. With SPROBES we show that it is possible
to leverage the limited TrustZone extensions to limit conventional kernel
execution to approved code comprehensively.
Comment: In Proceedings of the Third Workshop on Mobile Security Technologies (MoST) 2014 (http://arxiv.org/abs/1410.6674)
High-precision interferometric measurement of slow and fast temperature changes in static fluid and convective flow
We explore the strengths and limitations of using a standard Michelson
interferometer to sample line-of-sight-averaged temperature in water via two
experimental setups: slow-varying temperature in static fluid and fast
temperature variations in convective flow. The high precision of our
measurements (a few mK) is enabled by the fast response time and high
sensitivity of the interferometer to minute changes in the refractive index of
water caused by temperature variations. These features allow us to detect the
signature of fine fluid dynamical patterns in convective flow in a fully
non-intrusive manner. For example, we are able to observe an asymmetry in the
rising thermal plume (i.e. an asynchronous arrival of two counter-rotating
vortices at the measurement location), which is not possible to resolve with
more traditional (and invasive) techniques, such as RTD (Resistance Temperature
Detector) sensors. These findings, and the overall reliability of our method,
are further corroborated by means of Particle Image Velocimetry and Large Eddy
Simulations. While this method presents inherent limitations (mainly stemming
from the line-of-sight-averaged nature of its results), its non-intrusiveness
and robustness, along with the ability to readily yield real-time, highly
accurate measurements, render this technique very attractive for a wide range
of applications in experimental fluid dynamics.
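The phase-to-temperature conversion underlying this sensitivity is straightforward to sketch. In a Michelson interferometer the beam traverses the sample arm twice, so a temperature change ΔT produces a phase shift Δφ = (4π/λ)·L·(dn/dT)·ΔT. A minimal sketch inverting this relation, using illustrative values (the wavelength, path length, and dn/dT below are assumptions, not figures from this work):

```python
import math

# Illustrative values -- assumptions for the sketch, not figures from the paper.
DN_DT = -1.0e-4       # dn/dT of water near room temperature, 1/K (approximate)
WAVELENGTH = 633e-9   # m, HeNe laser line (assumed)
PATH_LENGTH = 0.05    # m, water path sampled by the measurement arm (assumed)

def temperature_change(delta_phi):
    """Temperature change (K) inferred from a measured phase shift (rad).

    The beam crosses the sample twice in a Michelson arm, so
    delta_phi = (4 * pi / wavelength) * L * (dn/dT) * delta_T.
    """
    return delta_phi * WAVELENGTH / (4 * math.pi * PATH_LENGTH * DN_DT)
```

With these assumed values one full fringe (Δφ = 2π) corresponds to roughly 63 mK, so resolving a small fraction of a fringe is consistent with the few-mK precision quoted above.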
An Evil Copy: How the Loader Betrays You
Dynamic loading is a core feature used on current systems to (i) enable modularity and reuse, (ii) reduce memory footprint by sharing code pages of libraries and executables among processes, and (iii) simplify update procedures by eliminating the need to recompile executables when a library is updated. The Executable and Linkable Format (ELF) is a generic specification that describes how object files produced from source code are stitched together into libraries and executables. Programming languages allow fine-grained control over variables, including access and memory protections, so programmers may write defense mechanisms assuming that the permissions specified at the source and/or compiler level will hold at runtime. Unfortunately, information about memory protection is lost during compilation. We identify one case that has significant security implications: when instantiating a process, constant external variables that are referenced in executables are forcefully relocated to a writable memory segment without warning. The loader trades security for compatibility due to the lack of memory protection information on the relocated external variables. We call this new attack vector COREV, for Copy Relocation Violation. An adversary may use a memory corruption vulnerability to modify such "read-only" constant variables like vtables, function pointers, format strings, and file names to bypass defenses (like FORTIFY_SOURCE or CFI) and to escalate privileges. We have studied all Ubuntu 16.04 LTS packages and found that out of 54,045 packages, 4,570 packages have unexpected copy relocations that change read-only permissions to read-write, presenting new avenues for attack. The attack surface is broad, with 29,817 libraries exporting relocatable read-only variables. The set of 6,399 programs with actual copy relocation violations includes ftp servers, apt-get, and gettext.
We discuss the cause, effects, and a set of possible mitigation strategies for the COREV attack vector.
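The loader behavior described above can be audited directly: copy relocations appear as R_X86_64_COPY entries in a binary's dynamic relocation tables (the same information `readelf -r` prints). A minimal pure-Python checker for 64-bit x86-64 ELF files, with offsets per the ELF64 layout (an illustrative sketch, not the authors' analysis tooling):

```python
import struct

SHT_RELA = 4        # section type: relocation entries with addends
R_X86_64_COPY = 5   # relocation type the loader resolves by copying the variable

def count_copy_relocations(path):
    """Count R_X86_64_COPY entries in a 64-bit x86-64 ELF file."""
    with open(path, "rb") as f:
        data = f.read()
    if data[:4] != b"\x7fELF" or data[4] != 2:
        raise ValueError("not a 64-bit ELF file")
    e_shoff, = struct.unpack_from("<Q", data, 0x28)        # section header table offset
    e_shentsize, e_shnum = struct.unpack_from("<HH", data, 0x3A)
    copies = 0
    for i in range(e_shnum):
        base = e_shoff + i * e_shentsize
        sh_type, = struct.unpack_from("<I", data, base + 4)
        if sh_type != SHT_RELA:                            # x86-64 uses RELA sections
            continue
        sh_offset, sh_size = struct.unpack_from("<QQ", data, base + 24)
        for off in range(sh_offset, sh_offset + sh_size, 24):  # Elf64_Rela is 24 bytes
            r_info, = struct.unpack_from("<Q", data, off + 8)
            if r_info & 0xFFFFFFFF == R_X86_64_COPY:       # low 32 bits hold the type
                copies += 1
    return copies
```

A nonzero count means the loader will place writable copies of nominally read-only variables in the executable's data segment at process start, which is exactly the COREV condition.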
Comparative validation of machine learning algorithms for surgical workflow and skill analysis with the HeiChole benchmark
Purpose: Surgical workflow and skill analysis are key technologies for the next generation of cognitive surgical assistance systems. These systems could increase the safety of the operation through context-sensitive warnings and semi-autonomous robotic assistance or improve training of surgeons via data-driven feedback. In surgical workflow analysis up to 91% average precision has been reported for phase recognition on an open data single-center video dataset. In this work we investigated the generalizability of phase recognition algorithms in a multicenter setting including more difficult recognition tasks such as surgical action and surgical skill.
Methods: To achieve this goal, a dataset with 33 laparoscopic cholecystectomy videos from three surgical centers with a total operation time of 22 h was created. Labels included framewise annotation of seven surgical phases with 250 phase transitions, 5514 occurrences of four surgical actions, 6980 occurrences of 21 surgical instruments from seven instrument categories and 495 skill classifications in five skill dimensions. The dataset was used in the 2019 international Endoscopic Vision challenge, sub-challenge for surgical workflow and skill analysis. Here, 12 research teams trained and submitted their machine learning algorithms for recognition of phase, action, instrument and/or skill assessment.
Results: F1-scores for phase recognition ranged between 23.9% and 67.7% (n = 9 teams) and for instrument presence detection between 38.5% and 63.8% (n = 8 teams), but for action recognition only between 21.8% and 23.3% (n = 5 teams). The average absolute error for skill assessment was 0.78 (n = 1 team).
Conclusion: Surgical workflow and skill analysis are promising technologies to support the surgical team, but there is still room for improvement, as shown by our comparison of machine learning algorithms. This novel HeiChole benchmark can be used for comparable evaluation and validation of future work. In future studies, it is of utmost importance to create more open, high-quality datasets in order to allow the development of artificial intelligence and cognitive robotics in surgery.
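For context on the ranges reported above, framewise F1 combines per-class precision and recall over all annotated frames. A minimal sketch of per-class framewise F1 (an illustration of the metric, not the challenge's evaluation code):

```python
def f1_per_class(y_true, y_pred, labels):
    """Framewise F1-score for each class label."""
    scores = {}
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores[c] = (2 * precision * recall / (precision + recall)
                     if precision + recall else 0.0)
    return scores
```

Averaging the per-class scores yields a single figure of the kind compared across teams in the Results.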
Open X-Embodiment: Robotic learning datasets and RT-X models
Large, high-capacity models trained on diverse datasets have shown remarkable success in efficiently tackling downstream applications. In domains from NLP to Computer Vision, this has led to a consolidation of pretrained models, with general pretrained backbones serving as a starting point for many applications. Can such a consolidation happen in robotics? Conventionally, robotic learning methods train a separate model for every application, every robot, and even every environment. Can we instead train a "generalist" X-robot policy that can be adapted efficiently to new robots, tasks, and environments? In this paper, we provide datasets in standardized data formats and models to make it possible to explore this possibility in the context of robotic manipulation, alongside experimental results that provide an example of effective X-robot policies. We assemble a dataset from 22 different robots collected through a collaboration between 21 institutions, demonstrating 527 skills (160266 tasks). We show that a high-capacity model trained on this data, which we call RT-X, exhibits positive transfer and improves the capabilities of multiple robots by leveraging experience from other platforms. The project website is robotics-transformer-x.github.io.
Impact of Arrhythmia on Myocardial Perfusion: A Computational Model-Based Study
(1) Background: Arrhythmia, which is an umbrella term for various types of abnormal rhythms of heartbeat, has a high prevalence in both the general population and patients with coronary artery disease. So far, it remains unclear how different types of arrhythmia would affect myocardial perfusion and the risk/severity of myocardial ischemia. (2) Methods: A computational model of the coronary circulation coupled to the global cardiovascular system was employed to quantify the impacts of arrhythmia and its combination with coronary artery disease on myocardial perfusion. Furthermore, a myocardial supply–demand balance index (MSDBx) was proposed to quantitatively evaluate the severity of myocardial ischemia under various arrhythmic conditions. (3) Results: Tachycardia and severe irregularity of heart rates (HRs) depressed myocardial perfusion and increased the risk of subendocardial ischemia (evaluated by MSDBx), whereas lowering HR improved myocardial perfusion. The presence of a moderate to severe coronary artery stenosis considerably augmented the sensitivity of MSDBx to arrhythmia. Further data analyses revealed that arrhythmia induced myocardial ischemia mainly via reducing the amount of coronary artery blood flow in each individual cardiac cycle rather than increasing the metabolic demand of the myocardium (measured by the left ventricular pressure-volume area). (4) Conclusions: Both tachycardia and irregular heartbeat tend to increase the risk of myocardial ischemia, especially in the subendocardium, and the effects can be further enhanced by concomitant existence of coronary artery disease. In contrast, properly lowering HR using drugs like β-blockers may improve myocardial perfusion, thereby preventing or relieving myocardial ischemia in patients with coronary artery disease
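The abstract does not define MSDBx, but a classical index of subendocardial supply-demand balance in the same spirit is the subendocardial viability ratio (SEVR, the Buckberg index): the diastolic pressure-time integral (coronary supply) divided by the systolic pressure-time integral (myocardial demand). A minimal sketch with synthetic waveforms (an illustrative analogue, not the paper's MSDBx):

```python
def sevr(times, aortic_p, lv_p, systole_end):
    """Subendocardial viability ratio (Buckberg index) = DPTI / SPTI.

    DPTI integrates the aortic-minus-LV pressure gradient over diastole
    (coronary supply); SPTI integrates aortic pressure over systole (demand).
    """
    dpti = spti = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        mid_t = 0.5 * (times[i] + times[i - 1])
        if mid_t < systole_end:   # systole: accumulate demand
            spti += 0.5 * (aortic_p[i] + aortic_p[i - 1]) * dt
        else:                     # diastole: accumulate perfusion gradient
            grad = (aortic_p[i] - lv_p[i]) + (aortic_p[i - 1] - lv_p[i - 1])
            dpti += 0.5 * grad * dt
    return dpti / spti
```

Shortening diastole, as tachycardia does, shrinks DPTI while per-beat SPTI changes little, so SEVR falls. That is the same direction of effect the model predicts for subendocardial ischemia risk.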
Exploration of interferometry for measuring temperature in experimental fluid mechanics
This thesis explores the strengths and limitations of using standard interferometry to sample line-of-sight-averaged temperature in water via two experimental setups: slow-varying temperature in static fluid and fast temperature variations in convective flow. The fast response time and high precision of the measurements obtained (a few mK) is enabled by the high sensitivity of the interferometer to minute changes in the refractive index of water caused by temperature variations. These features enable the detection of fine signatures imprinted by fluid dynamical patterns in convective flow in a fully non-intrusive manner. For example, an asymmetry in the rising thermal plume (i.e. an asynchronous arrival of two counter-rotating vortices at the measurement location) is observed, which is not possible to resolve with more traditional (and intrusive) techniques, such as RTD (Resistance Temperature Detector) sensors. Furthermore, an exploratory model is proposed, elucidating the potential feasibility of extracting localized (line-of-sight-averaged) temperature gradients and fluid velocity by interpreting the drop in the interferometer’s visibility; by doing so, useful information may be extracted from a phenomenon (visibility drop) that is commonly an important source of error in other interferometric techniques. These findings, and the overall reliability of the proposed method, are further corroborated by means of Particle Image Velocimetry and Large Eddy Simulations. While the proposed technique presents inherent limitations (mainly stemming from the line-of-sight-averaged nature of its results and high sensitivity to mechanical disturbances), its non-intrusiveness and robustness, along with the ability to readily yield real-time, highly accurate measurements, render this technique very attractive for a wide range of applications in experimental fluid dynamics