
    Direct no-carrier-added 18F-labelling of arenes via nucleophilic substitution on aryl(2-thienyl)iodonium salts

    For in vivo imaging of molecular processes via positron emission tomography (PET), radiotracers of high specific activity are required. In the case of fluorine-18, the most commonly used positron emitter, this is only achievable with no-carrier-added [18F]fluoride, which implies nucleophilic methods of 18F substitution. Whereas electron-deficient aromatic groups can be labelled in one step using no-carrier-added [18F]fluoride, electron-rich 18F-labelled aromatic molecules are only accessible by multi-step radiosyntheses or carrier-added electrophilic reactions. Diaryliodonium salts offer an alternative here, since they have proven to be potent precursors for a direct nucleophilic 18F-introduction into aromatic molecules. Furthermore, as known from non-radioactive studies, the highly electron-rich 2-thienyliodonium leaving group leads to high regioselectivity in nucleophilic substitution reactions. Consequently, a direct nucleophilic no-carrier-added 18F-labelling of electron-rich arenes via aryl(2-thienyl)iodonium precursors was developed in this work. The applicability of direct nucleophilic 18F-labelling was examined in a systematic study of eighteen aryl(2-thienyl)iodonium salts. As electron-rich precursors, the ortho-, meta- and para-methoxyphenyl(2-thienyl)iodonium bromides, iodides, tosylates and triflates were synthesised. In addition, para-substituted (R = BnO, CH3, H, Cl, Br, I) aryl(2-thienyl)iodonium bromides were prepared as precursors with systematically varying electron density. As a first step, the general reaction conditions of the nucleophilic 18F-substitution procedure were optimised. The best conditions for direct nucleophilic no-carrier-added 18F-labelling via aryl(2-thienyl)iodonium salts were found to be dimethylformamide as solvent, a reaction temperature of 130 ± 3 °C and a precursor concentration of 25 mmol/l.
For the effect of bromide, iodide, tosylate and triflate as counter anions on the radiochemical yield (RCY), the following order was obtained: tosylate < iodide < triflate < bromide. However, the kinetics showed a different order for the initial reaction rates: tosylate < bromide < iodide < triflate. The influence of the substitution pattern in ortho-, meta- and para-methoxyphenyl(2-thienyl)iodonium bromide showed the expected strong ortho-effect, which gave the highest RCY of 60 % for 2-[18F]fluoroanisole. Also under no-carrier-added conditions, the 2-thienyl group directed the radiofluorination regiospecifically: in all 18F-substitutions no 2-[18F]fluorothiophene, but only the desired [18F]fluoroarenes were formed. For a systematic examination of electronic factors, the kinetics of the 18F-substitution on substituted aryl(2-thienyl)iodonium bromides were investigated and the determined relative reactivities were compared with the Hammett constants of the corresponding substituents. A good linear Hammett correlation and a reaction parameter of ρ = +1.16 ± 0.2 were obtained. This confirms the SNAr mechanism and a consistent mechanism over the whole range of investigated substituents. To demonstrate the applicability of this method, the 18F-labelling of two pharmacologically relevant molecules was carried out. First, n.c.a. 4-[18F]fluorophenol was synthesised via 4-benzyloxyphenyl(2-thienyl)iodonium bromide within 40 min with an overall RCY of 34 to 36 %. Second, a complex AMPA receptor antagonist was 18F-labelled via iodonium precursors, albeit with a low RCY of 1.2 to 3.6 %. For comparison, the radioiodine analogue was labelled via a trimethyltin precursor and [131I]iodide with a very high RCY of 97 ± 2 % within 2 min and was thus available for preliminary pharmacological evaluation studies.
In conclusion, the 2-thienyliodonium leaving group proved highly effective for direct nucleophilic no-carrier-added 18F-labelling even of nucleophilically non-activated arenes. For the 18F-labelling of complex molecules, however, this method requires further optimisation.
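The Hammett analysis summarised above fits the logarithm of relative rates against substituent constants, with the slope giving the reaction parameter ρ. A minimal sketch of such a fit, using standard σ values and synthetic relative rates generated for illustration (not the thesis's measured data):

```python
# Hammett correlation: log10(k_X / k_H) = rho * sigma
# Sigma values are standard Hammett constants (p-OMe, p-Me, H, p-Cl, p-Br, p-I);
# the relative rates are synthetic, generated with rho = 1.16 for illustration.
sigmas = [-0.27, -0.17, 0.0, 0.23, 0.23, 0.18]
rho_true = 1.16
log_rates = [rho_true * s for s in sigmas]  # synthetic, perfectly linear data

# Ordinary least-squares slope through the points (simple linear regression)
n = len(sigmas)
mean_x = sum(sigmas) / n
mean_y = sum(log_rates) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(sigmas, log_rates)) \
        / sum((x - mean_x) ** 2 for x in sigmas)
print(f"fitted reaction parameter rho = {slope:.2f}")  # -> 1.16
```

A fit on real kinetic data would additionally report the correlation coefficient and the uncertainty of the slope (here ±0.2 in the thesis).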

    Generating large labeled data sets for laparoscopic image processing tasks using unpaired image-to-image translation

    In the medical domain, the lack of large training data sets and benchmarks is often a limiting factor for training deep neural networks. In contrast to expensive manual labeling, computer simulations can generate large and fully labeled data sets with a minimum of manual effort. However, models that are trained on simulated data usually do not translate well to real scenarios. To bridge the domain gap between simulated and real laparoscopic images, we exploit recent advances in unpaired image-to-image translation. We extend an image-to-image translation method to generate a diverse multitude of realistic-looking synthetic images based on images from a simple laparoscopy simulation. By incorporating means to ensure that the image content is preserved during the translation process, we ensure that the labels given for the simulated images remain valid for their realistic-looking translations. In this way, we are able to generate a large, fully labeled synthetic data set of laparoscopic images with realistic appearance. We show that this data set can be used to train models for the task of liver segmentation of laparoscopic images. We achieve average Dice scores of up to 0.89 in some patients without manually labeling a single laparoscopic image and show that using our synthetic data to pre-train models can greatly improve their performance. The synthetic data set will be made publicly available, fully labeled with segmentation maps, depth maps, normal maps, and positions of tools and camera (http://opencas.dkfz.de/image2image). Comment: Accepted at MICCAI 201
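The Dice score used in the evaluation above measures the overlap between a predicted and a reference segmentation mask, 2|A∩B| / (|A|+|B|). A minimal sketch on flat binary masks (the masks here are invented toy data, not the paper's images):

```python
def dice_score(pred, target):
    """Dice coefficient 2*|A∩B| / (|A|+|B|) for flat binary masks."""
    intersection = sum(p and t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    return 2.0 * intersection / total if total else 1.0  # both empty: define as 1

# Illustrative 1-D masks; a real evaluation would flatten 2-D image masks.
pred   = [0, 1, 1, 1, 0, 0, 1, 0]
target = [0, 1, 1, 0, 0, 1, 1, 0]
print(round(dice_score(pred, target), 3))  # -> 0.75
```

A score of 1.0 means perfect overlap, 0.0 means none; the paper's reported 0.89 average would be computed per patient over full-resolution masks.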

    Effectiveness and cost-effectiveness of four different strategies for SARS-CoV-2 surveillance in the general population (CoV-Surv Study): study protocol for a two-factorial randomized controlled multi-arm trial with cluster sampling

    Background: To achieve higher effectiveness in population-based SARS-CoV-2 surveillance and to reliably predict the course of an outbreak, screening and monitoring of infected individuals without major symptoms (about 40% of the population) will be necessary. While current testing capacities are also used to identify such asymptomatic cases, this rather passive approach is not suitable for generating reliable population-based estimates of the prevalence of asymptomatic carriers that would allow dependable predictions on the course of the pandemic. Methods: This trial implements a two-factorial, randomized, controlled, multi-arm, prospective, interventional, single-blinded design with cluster sampling and four study arms, each representing a different SARS-CoV-2 testing and surveillance strategy based on individuals' self-collection of saliva samples, which are then sent to and analyzed by a laboratory. The targeted sample size for the trial is 10,000 saliva samples, equally allocated to the four study arms (2,500 participants per arm). The strategies differ with respect to the tested population groups (individuals vs. all household members) and the testing approach (without vs. with pre-screening survey). The trial is complemented by an economic evaluation and a qualitative assessment of user experiences. Primary outcomes include costs per completely screened person, costs per positive case, positive detection rate, and precision of the positive detection rate. Discussion: Systems for active surveillance of the general population will gain importance in the context of pandemics and related disease prevention efforts. The pandemic parameters derived from such active surveillance with routine population monitoring not only enable a prospective assessment of the short-term course of a pandemic, but also a more targeted and thus more effective use of local and short-term countermeasures. Trial registration: DRKS00023271. Registered November 30, 2020, with the German Clinical Trials Register (Deutsches Register Klinischer Studien).
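Two of the primary outcomes above reduce to simple ratios, and the precision of the detection rate can be approximated by a confidence half-width. A minimal sketch using invented example numbers (not trial results; a cluster-sampled design would further inflate the interval by a design effect):

```python
import math

# Invented example figures -- NOT results of the CoV-Surv trial
n_screened = 2500        # participants in one study arm (matches arm size only)
n_positive = 25          # hypothetical positives detected in that arm
total_cost = 50_000.0    # hypothetical total screening cost (EUR)

cost_per_screened = total_cost / n_screened
cost_per_positive = total_cost / n_positive
rate = n_positive / n_screened  # positive detection rate

# Simple Wald 95% half-width as a precision measure; cluster sampling
# would multiply the variance by a design effect before taking the root.
half_width = 1.96 * math.sqrt(rate * (1 - rate) / n_screened)

print(f"cost per screened person: {cost_per_screened:.2f} EUR")
print(f"cost per positive case:   {cost_per_positive:.2f} EUR")
print(f"detection rate: {rate:.3f} +/- {half_width:.3f}")
```

The actual economic evaluation would of course use observed trial costs and counts per arm rather than these placeholders.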

    Vapor phase deposition of perovskite photovoltaics:Short track to commercialization?

    While perovskite-based photovoltaics (PV) is progressing toward commercialization, it remains an open question which fabrication technology - solution-based, vapor-based, or combinations thereof - will pave the way to faster economic breakthrough. The vast majority of research studies make use of solution-processed perovskite thin films, which benefit from rapid optimization feedback and inexpensive-to-procure tools in modern research laboratories, but vapor phase deposition processes dominate today's established thin-film manufacturing. As research and development of vapor phase processed perovskite thin films is still strongly underrepresented in the literature, its full potential is yet to be identified. In this collaborative perspective of academic views complemented by industrial ones, we convey a balanced viewpoint on the prospects of vapor-based processing of perovskite PV at an industrial scale. Our perspective highlights the conceptual advantages of vapor phase deposition, discusses the most crucial process parameters in a technology assessment, gives an overview of relevant global industry clusters, and provides an outlook on the commercialization prospects of the perovskite technology in general.

    Surgical data science in endoscopic surgery

    Surgical data science (SDS) is a research field that aims to improve the quality of interventional healthcare by observing all aspects of the patient treatment process in order to provide the right assistance at the right time. To date, most SDS applications are based on deep learning, which has shown great potential for solving challenging tasks in a complex surgical environment. However, such algorithms depend on a large amount of training data, which must contain not only the data itself but also labels (e.g., the localization of an instrument in the image) so that it can be used for training. To date, however, such a mass of training data is not available, primarily because creating it would often require medical experts as well as significant time and money. This data scarcity motivates the two major challenges of surgical data science, namely, (1) how algorithms based on machine learning methods can be trained despite the limited availability of such data and (2) how more training data could be provided. In this work, the segmentation of medical instruments in images of laparoscopic videos was used as a concrete example of a surgical data science application. This thesis investigated several means of alleviating this data scarcity in the context of laparoscopic instrument segmentation, resulting in the following main contributions. First, it was examined how unlabeled data can be integrated into the training of a machine learning algorithm to reduce the amount of annotated data required. Although this method significantly increased the performance of a deep learning model trained on only a few labeled data, the achieved performance was not high enough to make up for the lack of training data. For this reason, as the second contribution, the largest dataset to date for the segmentation of multiple instruments in images of laparoscopic videos was created. 
The dataset was generated following a strict annotation protocol and was quality controlled. The created data was then published as part of an international challenge to test the submitted methods and identify unresolved problems. The third contribution was the identification of image characteristics that negatively affect segmentation quality. In order to identify and quantify the influence of such characteristics, a statistical method was developed. This analysis then flowed into the last contribution, the targeted development of an algorithm designed to address the identified difficult characteristics, which achieved the best performance on the challenge dataset. As a result, this work provides a new tool for dealing with data scarcity by revealing the great potential of unlabeled data and the performance gain that can be achieved when generating high-quality datasets. Further, it shows that an in-depth statistical analysis of challenge results can be used to identify open issues of state-of-the-art methods and to develop algorithms that are specifically designed to address those issues. This problem-driven approach even led to a new best score on the task of multi-instance segmentation. Based on this thesis's results, one can confidently assume that the combination of data generation and problem-driven algorithm development has the potential to bridge the gap between research and the transition into clinical practice.

    Racing Against Time: The Amendment of the BVerfSchG Put to the Test

    In its coalition agreement, the governing "Ampel" coalition committed itself to a comprehensive reform of security law by 2025. As so often in this legislative period, however, something got in its way, this time the Federal Constitutional Court. In autumn 2022, Karlsruhe declared certain information-transfer provisions of the Bundesverfassungsschutzgesetz (Federal Protection of the Constitution Act) unconstitutional and set the legislature tight deadlines for repair by the end of 2023. Not much time to readjust a core area of security law and, at the same time, to find appropriate rules in a highly volatile political environment. A draft bill is now on the table, but in several places it is evidently incompatible with the requirements set by Karlsruhe.

    First Principle Design Load Estimation for LH2 Fuel Tanks by Means of 0D Approach

    Within the framework of the Sustainable Development Goals, the United Nations (UN) General Assembly has declared its firm intention to combat climate change and the associated changes in the environment. Shipping is an important factor, since its exhaust gases account for just over two percent of global greenhouse gas (GHG) emissions. This is reflected in the GHG Strategy of the International Maritime Organization (IMO), which forces the maritime industry to move away from fossil fuels towards zero-carbon alternatives. Liquid hydrogen is a promising candidate to enable this transition. As of now, class approval for the required technology is based on an alternative design approach, which entails the development of operational scenarios. In this work, a simulation-based approach for the estimation of design loads for liquid hydrogen fuel tanks is presented. MATLAB is employed to implement a 0-dimensional approach for the calculation of the bulk thermodynamic behavior. The necessary thermodynamic quantities are obtained via the internal energy by means of the CoolProp package. Vapor and liquid phases are treated separately under the assumption of a saturated liquid state, while the vapor is allowed to superheat. The simulation shows good agreement with validation data. In order to provide a realistic load assessment during a ship voyage, an exemplary cruise ship and corresponding power-demand profiles are utilized. The results are then analyzed with regard to occurring loads and operational efficiency.
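The 0D idea above tracks the bulk thermodynamic state of the tank from an energy balance. A heavily simplified sketch of the principle, reduced to steady boil-off of saturated liquid under a constant heat leak with a constant latent heat (the paper's model instead uses CoolProp property data and separate vapor/liquid phases; all numbers here are illustrative assumptions):

```python
# 0D boil-off estimate for a liquid hydrogen tank under a constant heat leak.
# Simplification: saturated liquid at constant pressure, constant latent heat.
Q_LEAK = 100.0        # W, assumed heat leak through the insulation
H_FG = 446_000.0      # J/kg, approximate latent heat of LH2 near 1 bar
M_LIQUID0 = 5_000.0   # kg, assumed initial liquid inventory

def liquid_mass_after(hours):
    """Liquid mass remaining after `hours` at constant heat leak."""
    mdot = Q_LEAK / H_FG                      # kg/s evaporated (energy balance)
    return max(M_LIQUID0 - mdot * hours * 3600.0, 0.0)

boil_off_per_day = Q_LEAK / H_FG * 86400.0    # kg/day
print(f"boil-off: {boil_off_per_day:.1f} kg/day")
print(f"liquid after 10 days: {liquid_mass_after(240):.1f} kg")
```

A full 0D model would additionally update pressure and vapor temperature each time step from real-fluid property calls rather than holding them fixed.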

    The Institute of Plasma and Laser Technology at the Technische Fachhochschule Wildau

    On 31 December 1996, the Wissenschaftler-Integrations-Programm (WIP, Scientist Integration Programme) in the new federal states came to an end. It had been set up to fund a transitional phase for some of the (repeatedly positively evaluated) scientists working at the Academy of Sciences of the GDR, during which they could continue their work while being integrated into a university, since the GDR research landscape had to be adapted to West German structures. The programme, initially planned for two years and started in 1992, could be extended until 1996 if a university could be found that declared its interest in integrating these scientists. For the group founded by Prof. Dr. sc. H. Gündel, which continued, and still continues, its long-standing research on non-thermal (low-temperature) plasmas, in particular for the generation of laser light, the target university was the TFH Wildau. For a Berlin-based group this was unusual on the one hand, yet fitting on the other: during the founding phase of the TFH Wildau, particular emphasis was placed on modern technologies, which was also reflected in a Department of Physical Engineering (Physikalische Technik) that continues today as a course of study within the Department of Engineering and Industrial Engineering. Since an ever broader, also engineering-oriented application of plasma and laser technology is emerging for the future, and since this field (especially plasma technology) is hardly represented in teaching at German universities of applied sciences, both sides, the TFH Wildau on the one hand and the Berlin group, then still called the WIP AG Niedertemperaturplasmen, on the other, sought integration in an Institute of Plasma and Laser Technology (IPLT) of the TFH Wildau within the Department of Physical Engineering.