
    Clonal Hematopoiesis after Autologous Stem Cell Transplantation Does Not Confer Adverse Prognosis in Patients with AML.

    INTRODUCTION Despite a 50% cure rate, relapse remains the main cause of death in patients with acute myeloid leukemia (AML) consolidated with autologous stem cell transplantation (ASCT) in first remission (CR1). Clonal hematopoiesis of indeterminate potential (CH) increases the risk of hematological and cardiovascular disorders and death. The impact of CH persisting after ASCT in AML patients is unclear. MATERIALS AND METHODS We retrospectively investigated the prognostic value of DNMT3A, TET2, or ASXL1 (DTA) mutations persisting after ASCT. Patients were stratified by the presence or absence of DTA mutations. RESULTS We investigated 110 consecutive AML patients who received ASCT in CR1 after two induction cycles at our center between 2007 and 2020. CH-related mutations were present in 31 patients (28.2%) after ASCT. Baseline characteristics were similar between patients with and without persisting DTA mutations after ASCT. Median progression-free survival was 26.9 months in patients without DTA mutations and 16.7 months in patients with DTA mutations (HR 0.75 (0.42-1.33), p = 0.287), and median overall survival was 80.9 and 54.4 months (HR 0.79 (0.41-1.51), p = 0.440), respectively. CONCLUSION We suggest that DTA-CH after ASCT is not associated with an increased risk of relapse or death. The persistence of DTA mutations after induction should therefore not preclude ASCT consolidation for AML patients in CR1. Independent studies should confirm these data.
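    The hazard ratios quoted above come from a standard time-to-event comparison. As a rough illustration of how such numbers are obtained, the hedged Python sketch below fits a Kaplan-Meier estimator and a Cox proportional-hazards model to made-up follow-up data using the lifelines library; the data frame, column names and values are purely hypothetical and are not taken from the study.

```python
# Hypothetical sketch: how a hazard ratio for progression-free survival can be
# estimated from follow-up data with a Cox proportional-hazards model.
# All numbers below are made up; they are not the study's data.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.DataFrame({
    # follow-up time in months and whether progression/death was observed
    "months": [26.9, 14.2, 31.0, 16.7, 40.5, 8.3, 22.1, 55.0, 12.9, 60.2],
    "event":  [1,    1,    0,    1,    1,    1,   1,    0,    1,    1],
    # 1 = persisting DNMT3A/TET2/ASXL1 (DTA) mutation after ASCT, 0 = none
    "dta":    [1,    1,    0,    1,    0,    1,   0,    0,    1,    0],
})

# Kaplan-Meier estimate of median progression-free survival per group
for group, label in [(0, "no DTA"), (1, "DTA")]:
    sub = df[df["dta"] == group]
    kmf = KaplanMeierFitter().fit(sub["months"], sub["event"], label=label)
    print(label, "median PFS (months):", kmf.median_survival_time_)

# Cox model: the exponentiated coefficient of 'dta' is the hazard ratio
cph = CoxPHFitter().fit(df, duration_col="months", event_col="event")
cph.print_summary()  # reports HR (exp(coef)), confidence interval, p-value
```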

    Immersive Neural Graphics Primitives

    Neural radiance field (NeRF), in particular its extension by instant neural graphics primitives, is a novel rendering method for view synthesis that uses real-world images to build photo-realistic immersive virtual scenes. Despite its potential, research on the combination of NeRF and virtual reality (VR) remains sparse. Currently, no integration into typical VR systems is available, and the performance and suitability of NeRF implementations for VR have not been evaluated, for instance, for different scene complexities or screen resolutions. In this paper, we present and evaluate a NeRF-based framework that is capable of rendering scenes in immersive VR, allowing users to freely move their heads to explore complex real-world scenes. We evaluate our framework by benchmarking the rendering performance of three different NeRF scenes at different scene complexities and resolutions. Using super-resolution, our approach can achieve a frame rate of 30 frames per second at a resolution of 1280x720 pixels per eye. We discuss potential applications of our framework and provide an open-source implementation online. Comment: Submitted to IEEE VR, currently under review.
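    To put the reported figures in perspective, the short sketch below works out the per-frame pixel budget implied by 30 frames per second at 1280x720 pixels per eye, and how a super-resolution pass reduces the number of natively rendered pixels; the 2x upscale factor is an assumption for illustration, not a value stated in the abstract.

```python
# Back-of-the-envelope sketch of the rendering budget implied by the paper's
# numbers: 30 frames per second at 1280x720 pixels per eye (two eyes).
# The super-resolution upscale factor below is an assumption for illustration,
# not a value taken from the paper.
TARGET_FPS = 30
WIDTH, HEIGHT, EYES = 1280, 720, 2

frame_budget_ms = 1000.0 / TARGET_FPS                  # ~33.3 ms per frame
output_pixels = WIDTH * HEIGHT * EYES                  # pixels shown per frame

UPSCALE = 2  # hypothetical 2x super-resolution per axis
native_pixels = output_pixels // (UPSCALE * UPSCALE)   # pixels the NeRF must render

print(f"frame budget:        {frame_budget_ms:.1f} ms")
print(f"displayed pixels:    {output_pixels:,} per frame")
print(f"natively rendered:   {native_pixels:,} per frame with {UPSCALE}x upscaling")
print(f"required throughput: {native_pixels * TARGET_FPS / 1e6:.1f} Mpixels/s")
```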

    The Nürnberger Kinderpanel: Objectives, Theoretical Starting Model, Methodological Approach, and Scientific and Practical Relevance

    The objective of the study is to obtain a panel data set for analyzing (1) children's health, well-being, and health-conscious behavior and knowledge, (2) the factors influencing these, and (3) the effects of health and well-being on children's living situation and development. To this end, the authors use an integrated model that links socialization-theoretical, developmental-psychological and childhood-oriented perspectives and conceives of "children as becoming and being" ("Kinder als Werdende und Seiende"). In this model, children are conceptualized as subjects who productively process reality, in the sense of actor-centered childhood research. Overall, the Nürnberger Kinderpanel aims to contribute to health promotion and planning, since the study makes statements about the emergence of health-oriented lifestyles and the factors that enable and hinder them. (ICA)

    Improving Web2cHMI Gesture Recognition Using Machine Learning

    Web2cHMI is a multi-modal human-machine interface which seamlessly incorporates actions based on various interface modalities in a single API, including finger, hand and head gestures as well as spoken commands. The set of native gestures provided by off-the-shelf 2D or 3D interface devices such as the Myo gesture control armband can be enriched or extended with additional custom gestures. This paper discusses a particular method, and its implementation, for recognizing different finger, hand and head movements using supervised machine learning algorithms: a non-linear regression for extracting features of the movement and a k-nearest-neighbor method for classifying movements against memorized training data. The method is capable of distinguishing between fast and slow, short and long, up and down, or right and left linear movements as well as clockwise and counterclockwise circular movements, which can then be associated with specific user interactions.
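    As a rough illustration of the described pipeline (non-linear regression as feature extractor, k-nearest neighbor as classifier over memorized training data), the hedged sketch below uses a cubic polynomial fit and scikit-learn's KNeighborsClassifier on synthetic movement traces; the feature set, class labels and data are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code) of the described pipeline:
# fit a non-linear (here: cubic polynomial) regression to each recorded motion
# trace to get a compact feature vector, then classify new movements with a
# k-nearest-neighbor model over memorized training examples.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def movement_features(samples: np.ndarray, degree: int = 3) -> np.ndarray:
    """Fit a polynomial to the (time, amplitude) trace; use its coefficients
    plus duration and amplitude range as features."""
    t = np.linspace(0.0, 1.0, len(samples))
    coeffs = np.polyfit(t, samples, degree)
    return np.concatenate([coeffs, [len(samples), samples.max() - samples.min()]])

# Hypothetical training data: short/fast vs. long/slow swipe traces.
rng = np.random.default_rng(0)
fast = [np.cumsum(rng.normal(1.0, 0.1, 20)) for _ in range(10)]   # short, steep
slow = [np.cumsum(rng.normal(0.2, 0.1, 80)) for _ in range(10)]   # long, shallow

X = np.array([movement_features(s) for s in fast + slow])
y = ["fast_swipe"] * len(fast) + ["slow_swipe"] * len(slow)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Classify a new, unseen movement trace.
new_trace = np.cumsum(rng.normal(0.9, 0.1, 22))
print(clf.predict([movement_features(new_trace)]))  # -> likely 'fast_swipe'
```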

    The Accelerator Control System in the BKR

    At present, almost all accelerators at DESY (the sources, Linac 2 and 3, PIA, DESY 2 and 3, DORIS, PETRA, HERA and the respective transfer lines) are operated from the BKR. Operation of TTF is also already possible with some restrictions, and it is planned to operate TTF entirely from the BKR at a later date. Within the BKR there are a number of different control systems. They are the link between the operators and the accelerators and their components. This article describes the various control-system approaches in the BKR and outlines a path by which several control systems can become, through integration, one control system. Integration does not mean leveling out differences, but rather the ability to exchange information, commands, programs etc. between and within the control systems of the different accelerators without having to give up specific, often well-founded differences. This integration takes place in the course of the operating-system migration from Windows 3.11 to Windows NT for the pre-accelerators and in the course of the further development of the HERA control system. The article ends with a brief outlook on new programs after the HERA luminosity upgrade.

    A Multi-Modal Human-Machine-Interface for Accelerator Operation and Maintenance Applications

    The advent of advanced mobile, gaming and augmented reality devices provides users with novel interaction modalities. Today's accelerator control applications do not provide features like speech, finger and hand gesture recognition or even gaze detection. Their look-and-feel and handling are typically optimized for mouse-based interaction and are not well suited to the specific requirements of more complex interaction modalities. This paper describes the conceptual design and implementation of an intuitive single-user, multi-modal human-machine interface for accelerator operation and maintenance applications. The interface seamlessly combines standard actions (mouse), actions associated with 2D single- and multi-finger gestures (touch-sensitive display) and 3D single- and multi-finger and hand gestures (motion controller), and spoken commands (speech recognition system). It will be an integral part of the web-based, platform-neutral Web2cToGo framework belonging to the Web2cToolkit suite and will be applicable to desktop and notebook computers, tablet computers and smartphones, and even see-through augmented reality glasses.
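    The abstract does not show the interface's actual API, but a minimal sketch of the underlying idea, a single dispatcher that maps events from different modalities onto the same application commands, is given below; all class, event and command names are hypothetical and not taken from Web2cToGo or the Web2cToolkit suite.

```python
# Hypothetical sketch of a single API that maps events from different input
# modalities onto common application commands; the class and command names are
# illustrative and not taken from the Web2cToGo/Web2cToolkit implementation.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict, Tuple

class Modality(Enum):
    MOUSE = auto()
    TOUCH_2D = auto()       # single/multi-finger gestures on a touch display
    GESTURE_3D = auto()     # hand gestures from a motion controller
    SPEECH = auto()         # spoken commands

@dataclass(frozen=True)
class InputEvent:
    modality: Modality
    name: str               # e.g. "double_click", "swipe_left", "open display"

class CommandDispatcher:
    """Routes (modality, event name) pairs to a single set of application commands."""
    def __init__(self) -> None:
        self._bindings: Dict[Tuple[Modality, str], Callable[[], None]] = {}

    def bind(self, modality: Modality, name: str, command: Callable[[], None]) -> None:
        self._bindings[(modality, name)] = command

    def dispatch(self, event: InputEvent) -> None:
        command = self._bindings.get((event.modality, event.name))
        if command:
            command()

# Usage: the same application command can be reached via mouse, gesture or speech.
dispatcher = CommandDispatcher()
open_panel = lambda: print("opening synoptic display")
dispatcher.bind(Modality.MOUSE, "double_click", open_panel)
dispatcher.bind(Modality.TOUCH_2D, "two_finger_tap", open_panel)
dispatcher.bind(Modality.SPEECH, "open display", open_panel)

dispatcher.dispatch(InputEvent(Modality.SPEECH, "open display"))
```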

    Renovating and Upgrading the Web2cToolkit Suite: A Status Report

    Web2cToolkit is a collection of Web services. It enables scientists, operators or service technicians to supervise and operate accelerators and beam lines through the World Wide Web. In addition, it provides users with a platform for communication and for logging data and actions. Recently, a novel service especially designed for mobile devices has been added; besides the standard mouse-based interaction, it provides a touch- and voice-based user interface. Web2cToolkit is currently undergoing an extensive renovation and upgrading process. Real WYSIWYG editors are now available to generate and configure synoptic and history displays, and an interface based on 3D motion and gesture recognition has been implemented. The multi-language support and the security of the communication between Web client and server have also been improved substantially. The paper reports the complete status of this work and outlines upcoming developments.

    Beyond Mouse-Based User Interaction

    The advent of advanced mobile, gaming and augmented reality devices provides users with novel interaction modalities. Speech, finger and hand gesture recognition or even gaze detection are commonly used technologies, often enriched with data from embedded motion sensors. This paper describes a common human-machine interface which seamlessly combines actions based on various modalities. It discusses potential use cases, benefits and limitations of these technologies in the field of accelerator operations and maintenance.

    Augmented User Interaction

    The advent of advanced mobile, gaming and augmented reality devices provides users with novel interaction modalities. Speech, finger and hand gesture recognition or even gaze detection are commonly used technologies, often enriched with data from embedded gyroscope-like motion sensors. This paper discusses potential use cases of these technologies in the field of accelerator controls and maintenance. It describes the conceptual design of an intuitive, single-user, multi-modal human-machine interface which seamlessly incorporates actions based on various modalities in a single API, and it discusses the present implementation status of this interface (Web2cHMI) within the Web2cToolkit framework. Finally, an outlook on future developments and ideas is presented.