67 research outputs found

    Size-advantage of monovalent nanobodies against the macrophage mannose receptor for deep tumor penetration and tumor-associated macrophage targeting

    Rationale: Nanobodies (Nbs) have emerged as an elegant alternative to conventional monoclonal antibodies in cancer therapy, but a detailed microscopic insight into the in vivo pharmacokinetics of different Nb formats in tumor-bearing animals is lacking. This is especially relevant for the recognition and targeting of pro-tumoral tumor-associated macrophages (TAMs), which may be located in less penetrable tumor regions.
    Methods: We employed anti-Macrophage Mannose Receptor (MMR) Nbs, in a monovalent (m) or bivalent (biv) format, to assess in vivo TAM targeting. Intravital and confocal microscopy were used to analyse the blood clearance rate and targeting kinetics of anti-MMR Nbs in tumor tissue, healthy muscle tissue and liver. Fluorescence Molecular Tomography was applied to confirm anti-MMR Nb accumulation in the primary tumor and in metastatic lesions.
    Results: Intravital microscopy demonstrated significant differences in the blood clearance rate and macrophage targeting kinetics of (m) and (biv)anti-MMR Nbs, both in tumoral and extra-tumoral tissue. Importantly, (m)anti-MMR Nbs are superior in reaching tissue macrophages, an advantage that is especially prominent in tumor tissue. The administration of a molar excess of unlabelled (biv)anti-MMR Nbs increased (m)anti-MMR Nb bioavailability and affected its macrophage targeting kinetics, preventing accumulation in extra-tumoral tissue (especially the liver) while only partially influencing the interaction with TAMs. Finally, anti-MMR Nb administration allowed the visualization of TAMs not only in primary tumors but also at a distant metastatic site.
    Conclusions: These data describe, for the first time, a microscopic analysis of (m) and (biv)anti-MMR Nb pharmacokinetics in tumor and healthy tissues. The concepts proposed in this study provide important knowledge for the future use of Nbs as diagnostic and therapeutic agents, especially for the targeting of tumor-infiltrating immune cells.
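
The format-dependent clearance behaviour described above is often summarized with a one-compartment, mono-exponential blood-clearance model. The sketch below contrasts a fast-clearing and a slow-clearing tracer; the half-lives and initial concentrations are illustrative placeholders, not values measured in the study.

```python
import numpy as np

def blood_concentration(t, c0, half_life):
    """One-compartment mono-exponential clearance: C(t) = C0 * 2^(-t / t_half)."""
    return c0 * 2.0 ** (-np.asarray(t, dtype=float) / half_life)

# Hypothetical half-lives (minutes); real values would be fitted to intravital data.
t = np.linspace(0, 120, 121)                 # minutes post-injection
c_mono = blood_concentration(t, 1.0, 10.0)   # monovalent: fast renal clearance
c_biv = blood_concentration(t, 1.0, 40.0)    # bivalent: slower clearance

# Fraction of injected tracer remaining in blood after 60 min
frac_mono_60 = float(blood_concentration(60, 1.0, 10.0))
frac_biv_60 = float(blood_concentration(60, 1.0, 40.0))
```

In practice, the half-lives would be fitted to the measured time-intensity curves rather than assumed.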

    What scans we will read: imaging instrumentation trends in clinical oncology

    Oncological diseases account for a significant portion of the burden on public healthcare systems, with associated costs driven primarily by complex and long-lasting therapies. Through the visualization of patient-specific morphology and functional-molecular pathways, cancerous tissue can be detected and characterized non-invasively, so as to provide referring oncologists with essential information to support therapy management decisions. Following the advent of stand-alone anatomical and functional imaging, we witness a push towards integrating molecular image information through various methods, including anato-metabolic imaging (e.g., PET/CT), advanced MRI, and optical or ultrasound imaging. This perspective paper highlights a number of key technological and methodological advances in imaging instrumentation related to anatomical, functional, molecular and hybrid imaging, the latter understood as the hardware-based combination of complementary anatomical and molecular imaging. These include novel detector technologies for the ionizing radiation used in CT and nuclear medicine imaging, and novel system developments in MRI as well as optical and opto-acoustic imaging. We also highlight new data processing methods for improved non-invasive tissue characterization. Following a general introduction to the role of imaging in oncology patient management, we introduce imaging methods with well-defined clinical applications and potential for clinical translation. For each modality, we first report on the status quo and then point to perceived technological and methodological advances in a subsequent "status go" section. Considering the breadth and dynamics of these developments, this perspective ends with a critical reflection on where the authors, the majority of them imaging experts with a background in physics and engineering, believe imaging methods will be a few years from now.
Overall, methodological and technological medical imaging advances are geared towards increased image contrast, the derivation of reproducible quantitative parameters, an increase in volume sensitivity and a reduction in overall examination time. To ensure full translation to the clinic, this progress in technologies and instrumentation is complemented by progress in relevant acquisition and image-processing protocols and improved data analysis. To this end, we should accept diagnostic images as “data”, and – through the wider adoption of advanced analysis, including machine learning approaches and a “big data” concept – move to the next stage of non-invasive tumor phenotyping. The scans we will be reading 10 years from now will likely be composed of highly diverse multi-dimensional data from multiple sources, which mandate the use of advanced and interactive visualization and analysis platforms powered by Artificial Intelligence (AI) for real-time data handling by cross-specialty clinical experts with a domain knowledge that will need to go beyond that of plain imaging.
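
Treating diagnostic images as "data" means, in practice, deriving reproducible quantitative parameters from voxel intensities. The sketch below extracts a few first-order features from a synthetic intensity patch; the feature set and bin count are illustrative choices, not taken from the paper.

```python
import numpy as np

def first_order_features(image, bins=16):
    """Simple first-order, radiomics-style descriptors of an intensity image."""
    vals = np.asarray(image, dtype=float).ravel()
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                  # drop empty bins before log
    return {
        "mean": float(vals.mean()),
        "std": float(vals.std()),
        "entropy": float(-(p * np.log2(p)).sum()),  # histogram entropy in bits
    }

rng = np.random.default_rng(0)
patch = rng.normal(100.0, 20.0, size=(32, 32))    # synthetic "lesion" patch
feats = first_order_features(patch)
```

Reproducibility of such parameters across scanners and protocols is exactly what the standardization efforts mentioned above aim to secure.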

    Historical operational data analysis for defence preparedness planning

    No full text
    Preparedness is an important function of defence planning that involves developing defence capabilities to deal with emergent situations relating to national defence and security. Preparedness planning relies on a number of inputs, including requirement analysis, to identify critical capability gaps. Modern data analysis can play an important role in identifying such future requirements. To this end, this paper presents an analytical study, consisting of both descriptive and predictive analysis, of historical defence operational data. The descriptive analysis component of the methodology focuses on identifying useful features in the collected data for building a predictive model. The predictive analysis investigates existing patterns in the data, including spatial and temporal trends. An artificial neural network-based time series forecasting model is developed to predict future operations based on the identified features. The proposed methodology is applied to a defence operational data set built from a number of unclassified sources relating to the historical operational deployments of the Australian Defence Force between 1885 and 2012. Implications are also discussed.
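
The sliding-window forecasting setup described above can be sketched as follows. For brevity, a plain linear autoregressive model stands in for the paper's neural network, and a synthetic trend-plus-cycle series stands in for the operational data.

```python
import numpy as np

def make_windows(series, lag):
    """Sliding-window design matrix: predict y[t] from the previous `lag` values."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

# Synthetic "operational tempo" series: linear trend plus a 12-step cycle.
t = np.arange(128)
series = 0.05 * t + np.sin(2 * np.pi * t / 12)

X, y = make_windows(series, lag=12)

# Linear autoregressive stand-in for the neural network forecaster:
# least-squares fit of y[t] on an intercept plus the 12 lagged values.
coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(X)), X], y, rcond=None)

last_window = series[-12:]
forecast = float(coef[0] + last_window @ coef[1:])
```

A neural network replaces the least-squares step when the lag-to-target mapping is nonlinear, but the windowing itself is identical.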

    Federated TON_IoT Windows Datasets for Evaluating AI-based Security Applications

    © 2020 IEEE. Existing cyber security solutions have largely been developed using knowledge-based models that often cannot detect new cyber-attack families. With the boom of Artificial Intelligence (AI), and especially Deep Learning (DL) algorithms, these security solutions have been augmented with AI models to discover, trace, mitigate or respond to incidents involving new security events. Such algorithms demand a large number of heterogeneous data sources for training and validating new security systems. This paper presents the description of new datasets, the so-called ToN_IoT datasets, which involve federated data sources collected from telemetry datasets of IoT services, operating system datasets of Windows and Linux, and datasets of network traffic. The paper introduces the testbed and the description of the ToN_IoT datasets for Windows operating systems. The testbed was implemented in three layers: edge, fog and cloud. The edge layer involves IoT and network devices, the fog layer contains virtual machines and gateways, and the cloud layer involves cloud services, such as data analytics, linked to the other two layers. These layers were dynamically managed using Software-Defined Networking (SDN) and Network Function Virtualization (NFV), implemented with the VMware NSX and vCloud NFV platforms. The Windows datasets were collected from audit traces of memory, processors, networks, processes and hard disks. The datasets can be used to evaluate various AI-based cyber security solutions, including intrusion detection, threat intelligence and hunting, privacy preservation and digital forensics, because they contain a wide range of recent normal and attack features and observations, as well as authentic ground-truth events. The datasets can be publicly accessed from the link in [1]
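
A typical evaluation workflow on such datasets trains a supervised detector on labelled telemetry features. The sketch below is a minimal stand-in: synthetic features replace the published ToN_IoT CSV files, and a hand-rolled logistic regression replaces the DL models the paper has in mind.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-ins for Windows telemetry features (CPU, memory, disk, network);
# a real experiment would load the published ToN_IoT Windows CSV files instead.
normal = rng.normal(0.3, 0.1, size=(200, 4))   # label 0: benign activity
attack = rng.normal(0.7, 0.1, size=(200, 4))   # label 1: attack activity
X = np.vstack([normal, attack])
y = np.r_[np.zeros(200), np.ones(200)]

# Minimal logistic-regression intrusion detector trained by gradient descent.
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))      # predicted attack probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = float((pred == y).mean())
```

The ground-truth labels shipped with the datasets make exactly this kind of supervised evaluation, and the comparison of detectors by accuracy or F1, straightforward.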

    Material recycling of TSE rubber scrap in the ultra-fine particle range. Subproject 4: Development of applications for ultra-fine rubber powders in sponge-rubber sealing profiles. Final report

    No full text
    Currently, the German rubber industry produces about 750,000 t/a of crosslinked rubber waste. Only 50,000 t/a of this waste are shredded, remilled or ground into rubber particles (sizes ranging from 0.2 to 2 mm) and recirculated into downgraded secondary articles such as doormats, shock-absorbing elements and the like. Such recirculation does not meet the requirements of recycling standards aiming at high-grade applications. Milling and grinding of scrap rubber have long been known, as have the terms and circumstances of incorporating powdered rubber into compounds for moulded items. Thorough knowledge with regard to extruded goods, and especially cellular items, did not exist until now. The objectives of the project 'Rubber Recycling for Substitution of Raw Materials' therefore consisted in gaining basic knowledge of methods for manufacturing rubber powders (with particle sizes <0.1 mm) from cellular rubber profiles, as well as methods for recirculating such powders into primary applications, i.e. continuously extruded and crosslinked cellular rubber profiles. In addition, the resulting consequences for material, processing and product quality characteristics were to be evaluated. Laboratory and extensive production testing demonstrated cryogenic grinding of cellular profiles to be an efficient method for obtaining rubber particles <0.1 mm in high yields. Such powders can be incorporated into compounds, substituting up to 30% of the original raw materials without major modification of the mixing procedure, provided the composition of the scrap is close to that of the compound. The use of rubber powder affects the material characteristics in various respects. Compound viscosity rises with the amount of ground rubber and influences die swell and injection behaviour, which can require die modification to overcome extrusion problems.
Other properties, such as compression set, compression deflection and tensile strength, are adversely affected as well, although slight adjustments of the processing parameters keep them within tolerances in many cases. The most apparent drawback, however, is the worsening of the profile's surface, which can be mitigated by changing the die layout, but only within a limited range. Further improvements are expected from optimizing the particle size distribution and from additional die modification. In principle, the use of rubber powders in manufacturing extruded cellular rubber profiles appears possible under quite normal conditions, provided the surface appearance does not require special attention. Other applications, in which rough surfaces are covered using coextrusion techniques, have also been explored and tested. (orig.)
Available from TIB Hannover: F02B1462 / FIZ - Fachinformationszentrum Karlsruhe / TIB - Technische Informationsbibliothek. Funded by Bundesministerium fuer Bildung und Forschung, Berlin (Germany).
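
The viscosity rise with powder content reported above can be illustrated with the classical Guth-Gold relation for particle-filled elastomers. The report itself gives no explicit viscosity model, and the 30% substitution level from the abstract (a mass fraction) is treated here as a volume fraction purely for illustration.

```python
def guth_gold_viscosity_ratio(phi):
    """Classical Guth-Gold estimate for the relative viscosity of a
    particle-filled elastomer: eta/eta0 = 1 + 2.5*phi + 14.1*phi**2,
    where phi is the filler volume fraction. Illustrative only."""
    return 1.0 + 2.5 * phi + 14.1 * phi ** 2

# 30% powder substitution, the upper bound reported in the abstract.
ratio_30 = guth_gold_viscosity_ratio(0.30)
```

Even this rough estimate suggests a roughly threefold viscosity increase at the 30% level, consistent with the abstract's observation that die swell and injection behaviour change enough to require die modification.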

    A framework for multi-stage ML-based electricity demand forecasting

    No full text
    This paper proposes a novel framework for energy utility companies to anticipate their customers' energy usage based on historical consumption data. The proposed framework comprises three major stages: (i) it detects and removes anomalies in consumers' energy consumption data by employing the isolation forest (iForest); (ii) it forms clusters of distinct consumer groups based on similarities in their consumption behavior via the k-means clustering algorithm; and (iii) it predicts electricity consumption using deep learning algorithms. To this end, two different deep learning models are designed: a long short-term memory (LSTM) network and a combination of a convolutional neural network (CNN) and LSTM (referred to as CNN-LSTM) with multiple inputs. For the latter, a 2-D discrete wavelet transform (DWT) based feature extraction is applied to the Gramian angular field (GAF) transformation of the time series to improve prediction accuracy. Various evaluation metrics are utilized for 1-hour- and 24-hour-ahead predictions with two different sliding-window sizes, i.e., 24 hours and 36 hours. The results demonstrate that the CNN-LSTM performs significantly better in predicting 24-hour-ahead electricity consumption
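
The first two stages of the framework can be sketched with simplified stand-ins: a z-score filter in place of the isolation forest, and a minimal 2-means loop in place of a library k-means, applied here to synthetic daily load profiles.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily load profiles for 60 consumers (24 hourly readings each):
# 30 "morning-light" and 30 "evening-light" consumption shapes plus noise.
morning = np.tile(np.r_[np.ones(12), 2 * np.ones(12)], (30, 1))
evening = np.tile(np.r_[2 * np.ones(12), np.ones(12)], (30, 1))
profiles = np.vstack([morning, evening]) + rng.normal(0, 0.05, (60, 24))
profiles[0] += 10.0  # inject one grossly anomalous consumer

# Stage (i): anomaly removal -- z-score filter as a simple stand-in for iForest.
total = profiles.sum(axis=1)
z = (total - total.mean()) / total.std()
clean = profiles[np.abs(z) < 3]
n_removed = 60 - len(clean)

# Stage (ii): group consumers by consumption shape -- minimal 2-means loop.
centers = clean[[0, -1]].copy()              # one seed from each end
for _ in range(10):
    d = np.linalg.norm(clean[:, None] - centers[None], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([clean[labels == k].mean(axis=0) for k in (0, 1)])
```

With real smart-meter data, stages (i) and (ii) would use scikit-learn's IsolationForest and KMeans, and stage (iii) would feed each cluster's aggregate series to the LSTM or CNN-LSTM forecaster.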