Influence of Friction Stir Weld Parameters on the Corrosion Susceptibility of EN AW‐7075 Weld Seam and Heat‐Affected Zone
Friction stir welding enables the joining of high-strength, lightweight aluminum alloys such as EN AW-7075 below the melting point through induced plastic deformation. Heat transfer into the regions adjacent to the weld seam is therefore significantly reduced compared with fusion welding processes such as laser beam welding. However, specific zones along the weld seam remain susceptible to localized corrosion due to grain growth and the precipitation of intermetallic phases. Thus, several approaches for lowering the corrosion susceptibility of the heat-affected zone are presented. Special attention is given to increasing the plastic deformation through novel multipin welding tools, which reduce the heat input during welding by allowing substantially lower tool rotational speeds. The corrosion behavior is tested by means of full-immersion tests and electrochemical measurements, which provide insight into the corrosion kinetics. Pre- and postmortem microstructural analyses identify the mechanisms that influence corrosion initiation. Supported by in operando temperature measurements, the relationships between the varied welding parameters and corrosion resistance are derived, and recommendations for optimal welding parameters that yield enhanced corrosion resistance are deduced.
Carbon Ions for Hypoxic Tumors: Are We Making the Most of Them?
Hypoxia, which is associated with abnormal vessel growth, is a characteristic feature of many solid tumors that increases their metastatic potential and resistance to radiotherapy. Carbon-ion radiation therapy, either alone or in combination with other treatments, is one of the most promising options for hypoxic tumors because the oxygen enhancement ratio (OER) decreases with increasing particle linear energy transfer (LET). Nevertheless, current clinical practice does not yet fully exploit carbon ions to tackle hypoxia. Here, we provide an overview of the existing experimental and clinical evidence supporting the efficacy of C-ion radiotherapy in overcoming hypoxia-induced radioresistance, followed by a discussion of the strategies proposed to enhance it, including different approaches to maximizing LET in the tumor.
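For reference, the oxygen enhancement ratio invoked above is conventionally defined as the ratio of doses required for the same biological effect; the definition and the typical figures below are standard radiobiology, not results from this work.

```latex
\mathrm{OER} \;=\; \left.\frac{D_{\text{hypoxic}}}{D_{\text{oxic}}}\right|_{\text{isoeffect}},
\qquad \mathrm{OER} \approx 2.5\text{--}3 \ \text{(low-LET photons)},
\qquad \mathrm{OER} \to 1 \ \text{for LET} \gtrsim 100\ \mathrm{keV/\mu m}
```

The drop of the OER toward unity at high LET is the physical rationale for directing carbon ions at hypoxic tumor subvolumes.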
Mechanical softening of CuX alloys at elevated temperatures studied via high temperature scanning indentation
The thermal stability and temperature-dependent hardness of the ultrafine-grained Cu alloys CuSn5 and CuZn5 after high-pressure torsion are investigated using the high temperature scanning indentation (HTSI) method. Fast indentations are carried out during thermal cycling of the samples (heating–holding–cooling) to measure hardness and strain rate sensitivity as a function of temperature and time. The microstructures after each thermal cycle are examined to characterize the coarsening behaviour of both alloys.
The results show that the thermal stability of the tested alloys can be described in terms of several temperature regimes: a fully stable regime, a transient regime in which individual grains grow, and finally a regime in which the microstructure is fully coarsened. The onset of grain growth is accompanied by a high strain rate sensitivity on the order of 0.2–0.3. Furthermore, the obtained hardness and strain rate sensitivity values are in good agreement with continuous stiffness measurement (CSM) and strain rate jump (SRJ) experiments. This highlights the applicability of the HTSI method to characterizing the thermomechanical properties of ultrafine-grained alloys.
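As context for the reported values, the strain rate sensitivity is conventionally obtained from indentation data as the logarithmic derivative of flow stress (or, as a proxy, hardness) with respect to strain rate; this is the standard definition rather than a formula specific to this paper.

```latex
m \;=\; \left.\frac{\partial \ln \sigma}{\partial \ln \dot{\varepsilon}}\right|_{T}
\;\approx\; \left.\frac{\partial \ln H}{\partial \ln \dot{\varepsilon}_{i}}\right|_{T}
```

Here H is the indentation hardness and \dot{\varepsilon}_{i} the indentation strain rate; values of m around 0.2–0.3, as found at the onset of grain growth, indicate strongly rate-dependent deformation.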
Covalent Attachment of Enzymes to Paper Fibers for Paper-Based Analytical Devices
Owing to its unique material properties, paper offers many practical advantages as a platform for sensing devices. For paper-based microfluidic biosensing applications, the covalent immobilization of enzymes with preserved functional activity is highly desirable yet challenging. In the present manuscript, we report an efficient approach to covalently attaching enzymes to paper fibers via a surface-bound network of hydrophilic polymers bearing protein-modifiable sites. This tailor-made macromolecular system, consisting of polar, highly swellable copolymers, is anchored to the paper surface upon light-induced crosslinking of engineered benzophenone motifs. In addition, the framework contains active esters that can be efficiently modified by the nucleophilic groups of biomolecules. This strategy allowed the covalent immobilization of glucose oxidase and horseradish peroxidase onto cotton linters without sacrificing their bioactivity upon surface binding. As a proof-of-concept application, a microfluidic chromatic paper-based glucose sensor was developed that achieved successful glucose detection via a simple yet efficient cascade reaction.
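The cascade referred to above is, in all likelihood, the classical glucose oxidase/peroxidase sequence; the chromogenic substrate is an assumption here (TMB and ABTS are common choices), since the abstract does not name it.

```latex
\text{glucose} + \mathrm{O_2} \;\xrightarrow{\;\text{GOx}\;}\; \text{glucono-}\delta\text{-lactone} + \mathrm{H_2O_2} \\
\mathrm{H_2O_2} + \text{chromogen (reduced)} \;\xrightarrow{\;\text{HRP}\;}\; \text{chromogen (oxidized, colored)} + \mathrm{H_2O}
```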
Visual Insights into Memory Behavior of GPU Ray Tracers
Ray tracing is a fundamental rendering technique that typically projects three-dimensional representations of a scene onto a two-dimensional display. This is achieved by sampling a set of rays from the camera's perspective into the scene and computing intersections against the relevant geometry. Secondary rays may be spawned from these intersection points, enabling physically correct global illumination by following light paths in the reverse photon direction.
Real-time rendering has historically relied on classical rasterization pipelines, which map well to hardware because they form a data-parallel problem: the whole scene is projected into the coordinate system of the image. Task-parallel ray tracing, in contrast, suffers from incoherence between rays. Recent advances have nevertheless led to more efficient algorithms and, in turn, to dedicated hardware implementations. While these approaches already render realistic images, further run-time improvements can be invested in higher framerates, display resolutions, and ray-tracing recursion depths, or in reducing the energy footprint of ray-tracing data centers.
A fundamental technique for improving ray-tracing performance is the use of bounding-volume hierarchies (BVHs), which spare rays from being tested against the entire scene, particularly against occluded or distant regions. Beyond the structural efficiency of a BVH, the primary bottlenecks of GPU ray tracing are memory latency and work distribution. Mitigating these factors largely amounts to producing more coherent memory accesses, which makes caching more effective.
Writing programs that achieve higher cache hit rates typically requires increased programming effort and a deep understanding of the hardware, because an additional abstraction layer makes the memory pipeline less transparent. General-purpose profilers aim to support this implementation process, but they typically report caching rates aggregated per kernel call: the values are measured with basic hardware counters that cannot distinguish the context of a memory access. In many cases, a more detailed representation of memory-related profiling metrics would be useful, such as per-memory-allocation counts or projections into other domains such as the framebuffer or the scene geometry.
This thesis presents a new method for accurately simulating the GPU memory pipeline. The method uses memory traces exported by dynamic binary instrumentation, which, as with standard profilers, can be applied to any compiled GPU binary. The exported memory profiles can be used for performance visualization in individual domains as well as for traditional memory profiling metrics displayed at a finer granularity than usual. A method for mapping memory metrics onto the original scene is included, allowing users to explore profiling results within the scene domain and making the profiling process more intuitive. In addition, this thesis presents a novel compressed ray-tracing implementation that optimizes its memory footprint by exploiting assumptions about the topological properties of the scene to be rendered. The findings can be used to evaluate and optimize a wide range of ray-tracing and ray-marching applications in a user-friendly manner.
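To make the per-allocation idea concrete, here is a minimal sketch of the kind of trace-driven cache model the thesis describes, written in Python; the trace format, allocation names, and cache geometry are hypothetical, and a real GPU memory pipeline is considerably more complex.

```python
from collections import OrderedDict, defaultdict

class LRUCacheSim:
    """Minimal set-associative cache model with LRU replacement."""
    def __init__(self, num_sets=64, ways=4, line_bytes=128):
        self.num_sets, self.ways, self.line_bytes = num_sets, ways, line_bytes
        # one LRU-ordered dict of resident tags per cache set
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def access(self, address):
        """Return True on hit, False on miss; updates LRU state."""
        line = address // self.line_bytes
        s, tag = line % self.num_sets, line // self.num_sets
        resident = self.sets[s]
        if tag in resident:
            resident.move_to_end(tag)     # refresh LRU position
            return True
        if len(resident) >= self.ways:
            resident.popitem(last=False)  # evict least recently used
        resident[tag] = None
        return False

def hit_rates_per_allocation(trace, cache):
    """trace: iterable of (allocation_id, address) pairs, e.g. exported
    by dynamic binary instrumentation (hypothetical format)."""
    hits, total = defaultdict(int), defaultdict(int)
    for alloc_id, addr in trace:
        total[alloc_id] += 1
        hits[alloc_id] += cache.access(addr)
    return {a: hits[a] / total[a] for a in total}

# toy trace: BVH nodes streamed coherently, triangles scattered
trace = [("bvh_nodes", 0x1000 + 128 * (i % 16)) for i in range(256)]
trace += [("triangles", 0x80000 + 128 * (i * 37 % 4096)) for i in range(256)]
print(hit_rates_per_allocation(trace, LRUCacheSim()))
```

Aggregating hits per allocation rather than per kernel call is what enables projecting such metrics back onto domains like the framebuffer or the scene geometry.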
Symmetry analysis and invariant solutions of the multipoint infinite systems describing turbulence
The present work concerns the multipoint description of turbulence in terms of probability density functions (PDFs) and the characteristic Hopf functional. Lie symmetries of the infinite system of equations for the PDFs and of the Hopf functional equation are discussed. Based on these symmetries, invariant solutions for turbulence statistics are calculated.
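For context, the characteristic Hopf functional of the velocity field u(x, t) is conventionally defined as the following expectation (Hopf's classical definition, restated here rather than taken from this paper):

```latex
\Phi([\mathbf{y}], t) \;=\; \left\langle \exp\!\left( i \int_{\Omega} y_k(\mathbf{x})\, u_k(\mathbf{x}, t)\, \mathrm{d}\mathbf{x} \right) \right\rangle
```

Functional derivatives of \Phi with respect to y recover the multipoint moments, which is why symmetries of the Hopf functional equation carry over to the infinite hierarchy of PDF equations.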
A Literature Review on the Development and Creation of Digital Twins, Cyber-Physical Systems, and Product-Service Systems
Digital Twins offer vast potential, yet many companies, particularly small and medium-sized enterprises, hesitate to implement them. This hesitation stems partly from the challenges posed by the interdisciplinary nature of creating Digital Twins. To address these challenges, this paper explores systematic approaches to the development and creation of Digital Twins, drawing on relevant methods and approaches presented in the literature. Conducting a systematic literature review, we examine the development of Digital Twins while also considering analogous concepts such as Cyber-Physical Systems and Product-Service Systems. The compiled literature is categorised into three main sections: holistic approaches, architecture, and models. Each category encompasses several subcategories, all of which are detailed in this paper. Through this comprehensive review, we discuss the findings and identify research gaps, shedding light on the current state of knowledge in the field of Digital Twin development. This paper aims to provide valuable insights for practitioners and researchers alike, guiding them in navigating the complexities associated with the implementation of Digital Twins.
Tailored Metadata Profiles for Validated and Sustainable Research Data
Modern research data management (RDM) increasingly involves the integration of rich, machine-actionable metadata, generally to ensure scientific quality and especially in the context of reproducibility and reuse. Existing standards mostly cover only descriptive metadata, and the information contained in more comprehensive metadata schemas is usually neither standardized nor machine-actionable. Discipline-specific metadata are, however, necessary to document research precisely and richly, yet the workflows and tools for this are not widely available. A promising approach is the use of metadata profiles, which make it possible to translate highly specific terminologies, building on existing community standards, into flexible and interoperable metadata descriptions. Based on established technologies, metadata profiles offer a solution for designing and processing complex, machine-actionable, and ultimately FAIR metadata.
Using an example from the engineering sciences, data validation by means of metadata profiles based on a controlled vocabulary is demonstrated; this process can be applied at almost any point in the research data lifecycle. A use case further demonstrates the resulting possibilities for data analysis and archiving. Only the combination of practical integration into the research landscape and implementation across various RDM projects, initiatives, and tools creates the synergies necessary for standardization. Research thus benefits from higher data quality, and more specific documentation adds value for the sustainable preservation of research content.
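As a sketch of what profile-based validation against a controlled vocabulary can look like in practice, the following Python example uses the jsonschema package; the field names, vocabulary terms, and material pattern are invented for illustration and do not come from the paper.

```python
from jsonschema import Draft202012Validator

# Illustrative metadata profile: the schema fields and the controlled
# vocabulary below are hypothetical, not a published standard.
profile = {
    "type": "object",
    "required": ["title", "method", "material"],
    "properties": {
        "title": {"type": "string", "minLength": 1},
        # controlled vocabulary enforced via an enum
        "method": {"enum": ["tensile_test", "hardness_test", "fatigue_test"]},
        "material": {"type": "string", "pattern": "^EN[- ]AW[- ]\\d{4}$"},
    },
}

record = {"title": "Hardness series A", "method": "hardness_test",
          "material": "EN AW-7075"}

validator = Draft202012Validator(profile)
errors = sorted(validator.iter_errors(record), key=lambda e: list(e.path))
for err in errors:
    print(f"{list(err.path)}: {err.message}")
print("valid" if not errors else "invalid")
```

Encoding controlled vocabularies as enums keeps the terminology machine-checkable while the profile itself remains an interoperable JSON Schema document that can be applied at any point in the data lifecycle.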
On the excitation of the 2₁⁺ state in ¹²C in the (e, e'γ) reaction
The excitation of the 2₁⁺ state of ¹²C at 4.439 MeV by 70–150 MeV electron impact and its subsequent decay to the ground state by photon emission is described within the distorted-wave Born approximation. The transition densities are obtained from the nuclear quasiparticle–phonon model. The photon angular distributions are compared with earlier results and with experiment, including the influence of bremsstrahlung. Predictions for spin asymmetries in the case of polarized electron impact are also made.
Improving 3D convolutional neural network comprehensibility via interactive visualization of relevance maps: evaluation in Alzheimer’s disease
Background: Although convolutional neural networks (CNNs) achieve high diagnostic accuracy for detecting Alzheimer’s disease (AD) dementia based on magnetic resonance imaging (MRI) scans, they are not yet applied in clinical routine. One important reason for this is a lack of model comprehensibility. Recently developed visualization methods for deriving CNN relevance maps may help to fill this gap as they allow the visualization of key input image features that drive the decision of the model. We investigated whether models with higher accuracy also rely more on discriminative brain regions predefined by prior knowledge.
Methods: We trained a CNN for the detection of AD in N = 663 T1-weighted MRI scans of patients with dementia and amnestic mild cognitive impairment (MCI) and verified the accuracy of the models via cross-validation and in three independent samples comprising a total of N = 1655 cases. We evaluated the association of relevance scores and hippocampus volume to validate the clinical utility of this approach. To improve model comprehensibility, we implemented an interactive visualization of 3D CNN relevance maps, thereby allowing intuitive model inspection.
Results: Across the three independent datasets, group separation showed high accuracy for AD dementia versus controls (AUC ≥ 0.91) and moderate accuracy for amnestic MCI versus controls (AUC ≈ 0.74). Relevance maps indicated that hippocampal atrophy was considered the most informative factor for AD detection, with additional contributions from atrophy in other cortical and subcortical regions. Relevance scores within the hippocampus were highly correlated with hippocampal volumes (Pearson’s r ≈ −0.86, p < 0.001).
Conclusion: The relevance maps highlighted atrophy in regions that we had hypothesized a priori. This strengthens the comprehensibility of the CNN models, which were trained in a purely data-driven manner based on the scans and diagnosis labels. The high hippocampus relevance scores, as well as the high performance achieved in independent samples, support the validity of the CNN models in the detection of AD-related MRI abnormalities. The presented data-driven, hypothesis-free CNN modeling approach might provide a useful tool for automatically deriving discriminative features for complex diagnostic tasks where clear clinical criteria are still missing, for instance the differential diagnosis between various types of dementia.
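As an illustration of the validation step described in the Results, the following Python sketch correlates mask-restricted relevance scores with regional volumes; all data here are synthetic, and the function and variable names are hypothetical rather than taken from the paper's code.

```python
import numpy as np
from scipy.stats import pearsonr

def hippocampal_relevance(relevance_map, region_mask):
    """Sum of relevance inside a binary region mask.
    relevance_map, region_mask: 3D numpy arrays of equal shape."""
    return float(relevance_map[region_mask.astype(bool)].sum())

# hypothetical cohort: per-subject relevance maps and hippocampal volumes
rng = np.random.default_rng(0)
shape = (32, 32, 32)
mask = np.zeros(shape, bool)
mask[10:16, 10:16, 10:16] = True
volumes = rng.uniform(2.0, 4.5, size=50)   # synthetic volumes, cm^3
# synthetic inverse relationship for illustration: relevance falls
# with volume, mimicking the negative correlation reported above
scores = np.array([
    hippocampal_relevance(rng.random(shape) * (5.0 - v), mask)
    for v in volumes
])
r, p = pearsonr(scores, volumes)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```

Restricting the aggregation to an a priori anatomical mask is what turns a voxelwise relevance map into a scalar that can be checked against an established clinical marker such as hippocampal volume.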